WorldWideScience

Sample records for analysis improves col4a5

  1. MLPA and cDNA analysis improves COL4A5 mutation detection in X-linked Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, JM; Juncker, I; Marcussen, N

    2008-01-01

    for 10-15% of mutations. We have established a method for mutation analysis of COL4A5 based on reverse transcriptase-polymerase chain reaction analysis of mRNA from cultured skin fibroblasts and multiplex ligation-dependent probe amplification (MLPA) on genomic DNA. One advantage of using skin biopsies...

  2. Detection of mutations in the COL4A5 gene by SSCP in X-linked Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Juncker, I; Persson, U

    2001-01-01

    of type IV-collagen. We performed mutation analysis of the COL4A5 gene by PCR-SSCP analysis of each of the 51 exons with flanking intronic sequences in 81 patients suspected of X-linked Alport syndrome including 29 clear X-linked cases, 37 cases from families with a pedigree compatible with X...

  3. High mutation detection rate in the COL4A5 collagen gene in suspected Alport syndrome using PCR and direct DNA sequencing

    DEFF Research Database (Denmark)

    Martin, P; Heiskari, N; Zhou, J

    1998-01-01

    Approximately 85% of patients with Alport syndrome (hereditary nephritis) have been estimated to have mutations in the X chromosomal COL4A5 collagen gene; the remaining cases are autosomal with mutations in the COL4A3 or COL4A4 genes located on chromosome 2. In the present work, the promoter sequence and previously unknown intron sequences flanking exons 2 and 37 of COL4A5 were determined. Furthermore, intron sequences flanking the other 49 exons were expanded from 35 to 190 to facilitate mutation analysis of the gene. Using this information, all 51 exons and the promoter region were PCR-amplified and sequenced from DNA of 50 randomly chosen patients with suspected Alport syndrome. Mutations were found in 41 patients, giving a mutation detection rate of 82%. Retrospective analysis of clinical data revealed that two of the cases might be autosomal. Although it could not be determined whether the remaining...

  4. High mutation detection rate in the COL4A5 collagen gene in suspected Alport syndrome using PCR and direct DNA sequencing

    DEFF Research Database (Denmark)

    Martin, P; Heiskari, N; Zhou, J

    1998-01-01

    Approximately 85% of patients with Alport syndrome (hereditary nephritis) have been estimated to have mutations in the X chromosomal COL4A5 collagen gene; the remaining cases are autosomal with mutations in the COL4A3 or COL4A4 genes located on chromosome 2. In the present work, the promoter region and all 51 exons were PCR-amplified and sequenced from DNA of 50 randomly chosen patients with suspected Alport syndrome. Mutations were found in 41 patients, giving a mutation detection rate of 82%. Retrospective analysis of clinical data revealed that two of the cases might be autosomal. Although it could not be determined whether the remaining seven cases (14%) were autosomal or X chromosome-linked, it is likely that some of them were autosomal. It is concluded that PCR amplification and direct DNA sequencing of the promoter and exons is currently the best procedure to detect mutations in COL4A5 in Alport syndrome.

  5. A nonsense mutation in the COL4A5 collagen gene in a family with X-linked juvenile Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Heiskari, N; Zhou, J

    1995-01-01

    The X-linked form of Alport syndrome is associated with mutations in the COL4A5 gene encoding the alpha 5-chain of type IV collagen. By using PCR amplification and direct sequencing, we identified a novel mutation involving a deletion of the last two bases in the codon GGA for glycine-1479 in exon 47 of the COL4A5 gene in a patient with a juvenile form of X-linked Alport syndrome with deafness. This two-base deletion caused a shift in the reading frame and introduced a premature stop codon, which resulted in an alpha 5(IV) chain shortened by 202 residues and lacking almost the entire NC1 domain. The mutation was found to co-segregate with the disease in the family. The information on the sequence variation in this family was used to perform carrier detection and prenatal diagnosis by allele-specific oligonucleotide hybridization analysis and direct sequencing of PCR-amplified exon 47. Prenatal...
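The frameshift mechanism described above (a two-base deletion shifting the reading frame and creating a premature stop codon) can be illustrated with a minimal translation sketch. The DNA sequence and the tiny codon table below are hypothetical teaching props, not the actual COL4A5 exon 47 sequence:

```python
# Minimal sketch of how a two-base deletion causes a frameshift and a
# premature stop codon. The sequence is a made-up toy example, NOT the
# real COL4A5 sequence; only the codons actually used are tabled.
CODON_TABLE = {
    "ATG": "M", "GGA": "G", "CCT": "P", "GAG": "E", "CTT": "L",
    "GCC": "A", "TAA": "*", "TGA": "*", "TAG": "*",  # '*' marks stop
}

def translate(dna):
    """Translate codon by codon until a stop codon (or end of sequence)."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i + 3], "X")  # 'X' = codon outside toy table
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

wild_type = "ATGGGACCTGAGCTTTAA"        # ATG GGA CCT GAG CTT TAA -> MGPEL, stop
# Delete the last two bases ("GA") of the GGA glycine codon (positions 4-5):
mutant = wild_type[:4] + wild_type[6:]  # ATG GCC TGA ... frame shifted

print(translate(wild_type))  # MGPEL
print(translate(mutant))     # MA  (early in-frame TGA stop appears)
```

The shifted frame brings a downstream TGA into frame, truncating the protein, which is the same mechanism the abstract describes for the 202-residue truncation.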

  6. Negative Staining for COL4A5 Correlates With Worse Prognosis and More Severe Ultrastructural Alterations in Males With Alport Syndrome

    Directory of Open Access Journals (Sweden)

    Samar M. Said

    2017-01-01

    Discussion: Thus, the COL4A5 expression pattern has an important prognostic value and it correlates with the severity of ultrastructural glomerular basement membrane alterations in males with AS. Loss of COL4A5 staining is uncommon in patients with AS diagnosed in their adulthood.

  7. Alport syndrome caused by inversion of a 21 Mb fragment of the long arm of the X-chromosome comprising exon 9 through 51 of the COL4A5 gene

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Persson, Ulf; juncker, Inger

    2005-01-01

    ...exon-by-exon mutation detection strategy, such as SSCP analysis or direct sequencing. We have previously reported the results of SSCP analysis of 81 patients suspected of X-linked AS. Genomic DNA from these 81 patients was also analyzed for larger genomic rearrangements, using Southern blotting analysis. Abnormal band...

  8. Improving Loop Dependence Analysis

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven

    2017-01-01

    Programmers can no longer depend on new processors to have significantly improved single-thread performance. Instead, gains have to come from other sources such as the compiler and its optimization passes. Advanced passes make use of information on the dependencies related to loops. We improve th...

  9. Analysis and Improvement of Fireworks Algorithm

    OpenAIRE

    Xi-Guang Li; Shou-Fei Han; Chang-Qing Gong

    2017-01-01

    The Fireworks Algorithm (FWA) is a recently developed swarm intelligence algorithm that simulates the explosion process of fireworks. Based on an analysis of each operator of the FWA, this paper improves the algorithm and proves that the improved version converges to the global optimal solution with probability 1. The proposed algorithm aims to further boost performance and achieve global optimization, mainly through the following strategies. Firstly, using the opp...

  10. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  11. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasoor

    2011-01-01

    We present some improved analytical results as part of the ongoing work on the analysis of the Fugue-256 hash function, a second-round candidate in NIST's SHA-3 competition. First, we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the final transformation of Fugue-256 to 16...

  12. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  13. Improvement of electrophoresis performance by spectral analysis ...

    African Journals Online (AJOL)

    This paper describes a new design of standard agarose gel electrophoresis procedure for nucleic acids analysis. The electrophoresis was improved by using the real-time spectral analysis of the samples to increase its performance. A laser beam illuminated the analysed sample at wavelength with the highest absorption of ...

  14. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasoor

    2011-01-01

    We present some improved analytical results as part of the ongoing work on the analysis of the Fugue-256 hash function, a second-round candidate in NIST's SHA-3 competition. First, we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the final transformation of Fugue-256 to 16.5 rounds. Next, we improve the designers' meet-in-the-middle preimage attack on Fugue-256 from 2^480 time and memory to 2^416. Finally, we comment on possible methods to obtain free-start distinguishers and free-start collisions for Fugue-256. © 2011 Springer-Verlag.

  15. Economic Analysis of Improved Alkaline Water Electrolysis

    International Nuclear Information System (INIS)

    Kuckshinrichs, Wilhelm; Ketelaer, Thomas; Koj, Jan Christian

    2017-01-01

    Alkaline water electrolysis (AWE) is a mature hydrogen production technology and there exists a range of economic assessments for available technologies. For advanced AWEs, which may be based on novel polymer-based membrane concepts, it is of prime importance that development comes along with new configurations and technical and economic key process parameters for AWE that might be of interest for further economic assessments. This paper presents an advanced AWE technology referring to three different sites in Europe (Germany, Austria, and Spain). The focus is on financial metrics, the projection of key performance parameters of advanced AWEs, and further financial and tax parameters. For financial analysis from an investor's (business) perspective, a comprehensive assessment of a technology not only comprises cost analysis but also further financial analysis quantifying attractiveness and supply/market flexibility. Therefore, based on cash flow (CF) analysis, a comprehensible set of metrics may comprise levelized cost of energy or, respectively, levelized cost of hydrogen (LCH) for cost assessment, net present value (NPV) for attractiveness analysis, and variable cost (VC) for analysis of market flexibility. The German AWE site turns out to perform best in all three financial metrics (LCH, NPV, and VC). Though there are slight differences in investment cost and operation and maintenance cost projections for the three sites, the major cost impact is due to the electricity cost. Although investment cost is slightly lower and labor cost is significantly lower in Spain, the difference cannot outweigh the higher electricity cost compared to Germany. Given the assumption that the electrolysis operators are customers directly and actively participating in power markets, and based on the regulatory framework in the three countries, in this special case electricity cost in Germany is lowest. However, as electricity cost is profoundly influenced by political decisions as...
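The financial metrics named in this abstract (NPV and levelized cost of hydrogen) follow standard discounted-cash-flow definitions. A minimal sketch with invented illustrative numbers, not the study's site data:

```python
# Hedged sketch of two standard metrics named in the abstract: net present
# value (NPV) of a cash-flow series and a levelized cost of hydrogen (LCH).
# All figures below are illustrative placeholders, not the study's data.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at t=0 (e.g. the investment)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def levelized_cost(rate, annual_costs, annual_output):
    """Discounted total cost divided by discounted total output
    (e.g. EUR per kg H2 when output is kg H2 per year)."""
    disc_cost = sum(c / (1.0 + rate) ** t for t, c in enumerate(annual_costs, 1))
    disc_out = sum(q / (1.0 + rate) ** t for t, q in enumerate(annual_output, 1))
    return disc_cost / disc_out

print(round(npv(0.10, [-100.0, 60.0, 60.0]), 4))  # 4.1322
# A flat cost and output stream levelizes to cost/output:
print(round(levelized_cost(0.05, [500.0] * 10, [100.0] * 10), 6))  # 5.0
```

The abstract's observation that electricity cost dominates corresponds to the `annual_costs` stream outweighing the discounted investment term in the LCH numerator.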

  16. Analysis and Improvement of Fireworks Algorithm

    Directory of Open Access Journals (Sweden)

    Xi-Guang Li

    2017-02-01

    The Fireworks Algorithm (FWA) is a recently developed swarm intelligence algorithm that simulates the explosion process of fireworks. Based on an analysis of each operator of the FWA, this paper improves the algorithm and proves that the improved version converges to the global optimal solution with probability 1. The proposed algorithm aims to further boost performance and achieve global optimization, mainly through the following strategies. Firstly, opposition-based learning is used to initialize the population. Secondly, a new explosion amplitude mechanism for the optimal firework is proposed. In addition, adaptive t-distribution mutation for non-optimal individuals and elite opposition-based learning for the optimal individual are used. Finally, a new selection strategy, namely Disruptive Selection, is proposed to reduce the running time of the algorithm compared with the FWA. In our simulation, we apply the CEC2013 standard functions and compare the proposed algorithm (IFWA) with SPSO2011, FWA, EFWA, and dynFWA. The results show that the proposed algorithm has better overall performance on the test functions.
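The first strategy mentioned, opposition-based learning (OBL) initialization, evaluates each random candidate together with its "opposite" (lb + ub − x per coordinate) and keeps the better of the pair. A generic minimal sketch minimizing a sphere function, not the authors' implementation:

```python
import random

def sphere(x):
    """Toy objective to minimize (global optimum at the origin)."""
    return sum(v * v for v in x)

def obl_init(n, dim, lb, ub, objective, rng):
    """Opposition-based learning initialization: sample each candidate,
    form its opposite (lb + ub - x per coordinate), keep the better one."""
    population = []
    for _ in range(n):
        x = [rng.uniform(lb, ub) for _ in range(dim)]
        opposite = [lb + ub - v for v in x]
        population.append(min(x, opposite, key=objective))
    return population

rng = random.Random(42)  # fixed seed so the sketch is reproducible
pop = obl_init(n=5, dim=3, lb=-5.0, ub=10.0, objective=sphere, rng=rng)
# Every kept individual is at least as good as its discarded opposite,
# so OBL tends to start the search closer to the optimum than plain sampling.
print(len(pop), all(-5.0 <= v <= 10.0 for ind in pop for v in ind))
```

Note the asymmetric bounds: with a symmetric box around the optimum, a candidate and its opposite would score identically on the sphere function and OBL would gain nothing.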

  17. Methodological considerations for improving Western blot analysis.

    Science.gov (United States)

    MacPhee, Daniel J

    2010-01-01

    The need for a technique that could allow the determination of antigen specificity of antisera led to the development of a method that allowed the production of a replica of proteins, which had been separated electrophoretically on polyacrylamide gels, on to a nitrocellulose membrane. This method was coined Western blotting and is very useful to study the presence, relative abundance, relative molecular mass, post-translational modification, and interaction of specific proteins. As a result it is utilized routinely in many fields of scientific research such as chemistry, biology and biomedical sciences. This review serves to touch on some of the methodological conditions that should be considered to improve Western blot analysis, particularly as a guide for graduate students but also scientists who wish to continue adapting this now fundamental research tool. Copyright 2009 Elsevier Inc. All rights reserved.

  18. Continuous improvement projects: an authorship bibliometric analysis.

    Science.gov (United States)

    Gonzalez Aleu, Fernando; Van Aken, Eileen M

    2017-06-12

    Purpose The purpose of this paper is to describe the current research on hospital continuous improvement projects (CIPs) from an author characteristics' perspective. This work addresses the following questions: who are the predominant research authors in hospital CIPs? To what extent are the research communities collaborating in distinct research groups? How internationalized has hospital CIPs research become with respect to author location? Design/methodology/approach A systematic literature review was conducted, identifying 302 academic publications related to hospital CIPs. Publications were analyzed using: author, quantity, diversity, collaboration, and impact. Findings Hospital CIPs are increasingly attracting new scholars each year. Based on the authors' analysis, authors publishing in this area can be described as a relatively new international community given the countries represented. Originality/value This paper describes the current hospital CIP research by assessing author characteristics. Future work should examine additional attributes to characterize maturity such as how new knowledge is being created and to what extent new knowledge is being disseminated to practitioners.

  19. SPORTS ORGANIZATIONS MANAGEMENT IMPROVEMENT: A SURVEY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Alin Molcut

    2015-07-01

    Sport organizations exist to perform tasks that can only be executed through cooperative effort, and sport management is responsible for the performance and success of these organizations. The main aim of the paper is to analyze several issues in the management of sports organizations in order to assess the quality of their management. In this respect, a questionnaire was designed for performing a survey analysis through a statistical approach. The investigation was conducted over a period of three months, during which a number of football managers and coaches, all active at the children's and junior training level in football clubs in the counties of Timis and Arad, were questioned. The results suggest that there is significant interest in improving management across children's teams and under-21 clubs, with emphasis on players' participation and rewarding performance. Furthermore, we can state that the sports clubs have established a vision and a mission, as well as general objectives that refer to both sporting and financial performance.

  20. Improved security analysis of Fugue-256

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Bagheri, Nasour; Knudsen, Lars Ramkilde

    2011-01-01

    ...transform is mapped with a transform to a 256-bit digest. In this paper, we present some improved as well as new analytical results on Fugue-256 (with length padding). First, we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the G transform to 16.5 rounds, thus showing weak diffusion in the G transform. Next, we improve the designers' meet-in-the-middle preimage attack on Fugue-256 from 2^480 time and memory to 2^416. Next, we study the security of Fugue-256 against free-start distinguishers and free-start collisions. In this direction, we use an improved variant of the differential...

  1. Occupational Analysis: A Continuous Improvement Approach

    National Research Council Canada - National Science Library

    Duffy, Tom

    1998-01-01

    ...In doing so, the Air Force has implemented "Quality Air Force (QAF)" (AF Handbook 90-502). QAF is a leadership commitment that inspires trust, teamwork, and continuous improvement everywhere in the Air Force...

  2. Improving Public Perception of Behavior Analysis.

    Science.gov (United States)

    Freedman, David H

    2016-05-01

    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.

  3. Helping agencies improve their planning analysis techniques.

    Science.gov (United States)

    2011-11-18

    This report summarizes the results of a peer review of the AZTDM. The peer review was : supported by the Travel Model Improvement Program (TMIP), which is sponsored by FHWA. : The peer review of a travel model can serve multiple purposes, including i...

  4. Analysis of improvement ways of creative accounting

    Directory of Open Access Journals (Sweden)

    I.A. Yuhimenko-Nazaruk

    2017-02-01

    The necessity of research aimed at finding ways to improve creative accounting is grounded. The existing approaches of researchers to eliminating the negative consequences of creative accounting are analyzed. Four main groups of researchers' approaches to the improvement of creative accounting are singled out and analyzed. The general and distinctive features of the researchers' proposals on the improvement of creative accounting are examined. The reasons for the impossibility of using an ethical approach to the improvement of creative accounting in Ukraine under modern conditions are grounded. The necessity of perfecting the procedural aspects of creative accounting on the basis of the concept of a true and fair view is proved. A classification of approaches to the construction of accounting methodology in the context of the use of creative accounting is developed. The main regulations of the concept of a true and fair view are studied; their use provides an adequate reflection of the company's economic reality in financial reporting.

  5. International Space Station Future Correlation Analysis Improvements

    Science.gov (United States)

    Laible, Michael R.; Pinnamaneni, Murthy; Sugavanam, Sujatha; Grygier, Michael

    2018-01-01

    Ongoing modal analyses and model correlation are performed on different configurations of the International Space Station (ISS). These analyses utilize on-orbit dynamic measurements collected using four main ISS instrumentation systems: External Wireless Instrumentation System (EWIS), Internal Wireless Instrumentation System (IWIS), Space Acceleration Measurement System (SAMS), and Structural Dynamic Measurement System (SDMS). Remote Sensor Units (RSUs) are network relay stations that acquire flight data from sensors. Measured data is stored in the Remote Sensor Unit (RSU) until it receives a command to download data via RF to the Network Control Unit (NCU). Since each RSU has its own clock, it is necessary to synchronize measurements before analysis. Imprecise synchronization impacts analysis results. A study was performed to evaluate three different synchronization techniques: (i) measurements visually aligned to analytical time-response data using model comparison, (ii) Frequency Domain Decomposition (FDD), and (iii) lag from cross-correlation to align measurements. This paper presents the results of this study.
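The third synchronization technique mentioned, aligning measurements by the lag that maximizes their cross-correlation, can be sketched as below. This is a generic illustration with a synthetic pulse, not the ISS data pipeline:

```python
def estimate_lag(ref, sig, max_lag):
    """Return the lag k (in samples) maximizing sum(ref[n] * sig[n + k]),
    i.e. how many samples `sig` is delayed relative to `ref`."""
    best_k, best_score = 0, float("-inf")
    n = len(ref)
    for k in range(-max_lag, max_lag + 1):
        score = sum(ref[i] * sig[i + k]
                    for i in range(max(0, -k), min(n, n - k)))
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Synthetic pulse recorded by two units whose clocks differ by 3 samples:
a = [0.0] * 16
a[4], a[5], a[6] = 1.0, 2.0, 1.0
b = [0.0] * 3 + a[:-3]                # same pulse, delayed by 3 samples
print(estimate_lag(a, b, max_lag=5))  # 3
```

In practice the estimated lag would be applied as an offset to one unit's timestamps before modal analysis; imprecise alignment smears phase information, which is the concern the abstract raises.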

  6. Planning, Conducting, and Documenting Data Analysis for Program Improvement

    Science.gov (United States)

    Winer, Abby; Taylor, Cornelia; Derrington, Taletha; Lucas, Anne

    2015-01-01

    This 2015 document was developed to help technical assistance (TA) providers and state staff define and limit the scope of data analysis for program improvement efforts, including the State Systemic Improvement Plan (SSIP); develop a plan for data analysis; document alternative hypotheses and additional analyses as they are generated; and…

  7. Recent improvements in Thomson scattering data analysis

    International Nuclear Information System (INIS)

    Tillack, M.S.; Lazarus, E.A.

    1980-04-01

    A new profile analysis package for use with the Thomson scattering data on ISX-B has recently been implemented. The primary feature of this package is a weighted least squares fitting of temperature and density data to generate a representative curve, as opposed to the previous hand-fitting technique. The changes will automate the manner in which data are transmitted and manipulated, without affecting the calculational techniques previously used. The computer programs have also been used to estimate the sensitivity of various plasma quantities to the accuracy of the Thomson scattering data
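The weighted least-squares fitting mentioned above can be sketched for the simplest case, a straight line y = a + b·x with per-point weights (typically inverse variances of the measurements). This generic closed-form sketch is not the ISX-B package itself:

```python
def wls_line_fit(x, y, w):
    """Closed-form weighted least squares for y = a + b*x.
    Weights w are typically 1/sigma_i**2 for measurement errors sigma_i."""
    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    delta = S * Sxx - Sx * Sx
    b = (S * Sxy - Sx * Sy) / delta       # slope
    a = (Sxx * Sy - Sx * Sxy) / delta     # intercept
    return a, b

# Noise-free data on y = 1 + 2x; unequal weights must not bias the fit:
x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]
w = [4.0, 1.0, 1.0, 0.25]   # e.g. inverse variances of each point
a_fit, b_fit = wls_line_fit(x, y, w)
print(round(a_fit, 6), round(b_fit, 6))  # 1.0 2.0
```

Real temperature and density profiles need a nonlinear or spline model rather than a line, but the weighting principle, down-weighting noisy scattering channels, is the same.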

  8. Landsat analysis for improving bamboo forest mapping

    Science.gov (United States)

    Yamamoto, Y.; Suzuoki, Y.; Tsuboi, T.; Iinuma, T.; Iwashita, K.; Nishikawa, H.

    Using satellite data and field data collected periodically over the past years, the vegetation area and the underlying substrate have been mapped. Numerous methods for detecting vegetation changes with the aid of digital satellite data have been utilized. Among those methods, vegetation indices such as RVI, NDVI, or SVI are the most suitable for estimating the change. Vegetation indices are mathematical transformations designed to assess the spectral contribution of vegetation to multispectral observations. Bamboo grove, the primary subject of this study, is well known as a rapidly growing plant, and the expansion of bamboo groves has been discussed as a regional environmental issue. Change detection of bamboo-covered areas would help inform preventive countermeasures against bamboo expansion in unwanted areas. To detect the bamboo-covered area accurately, the optimal vegetation indices and band combinations were established through statistical analysis based on continuous Landsat data.
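The vegetation indices named above are simple band transformations; NDVI, for example, is the normalized difference of near-infrared and red reflectance. A minimal per-pixel sketch with made-up reflectance values, not real Landsat measurements:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Ranges from -1 to 1; dense green vegetation gives high values."""
    denom = nir + red
    return (nir - red) / denom if denom != 0.0 else 0.0

def rvi(nir, red):
    """Ratio Vegetation Index: NIR / Red."""
    return nir / red

# Illustrative surface reflectances (not actual Landsat data):
print(round(ndvi(nir=0.50, red=0.10), 3))  # 0.667 -> healthy vegetation
print(round(ndvi(nir=0.30, red=0.25), 3))  # 0.091 -> sparse/stressed cover
print(round(rvi(nir=0.50, red=0.10), 3))   # 5.0
```

Change detection then amounts to differencing such index maps between acquisition dates and thresholding the result.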

  9. Improvement of product design process by knowledge value analysis

    OpenAIRE

    XU, Yang; BERNARD, Alain; PERRY, Nicolas; LAROCHE, Florent

    2013-01-01

    Nowadays, design activities remain the core issue for global product development. As knowledge is more and more integrated, effective analysis of knowledge value becomes very useful for the improvement of product design processes. This paper aims at proposing a framework of knowledge value analysis in the context of product design process. By theoretical analysis and case study, the paper illustrates how knowledge value can be calculated and how the results can help the improvement of product...

  10. Analysis of event data recorder data for vehicle safety improvement

    Science.gov (United States)

    2008-04-01

    The Volpe Center performed a comprehensive engineering analysis of Event Data Recorder (EDR) data supplied by the National Highway Traffic Safety Administration (NHTSA) to assess its accuracy and usefulness in crash reconstruction and improvement of ...

  11. Improving Department of Defense Global Distribution Performance Through Network Analysis

    Science.gov (United States)

    2016-06-01

    ...maximum time allowed by SDDB business rules, 365 days, and run the improvement algorithm again. Using the budget of 20 improvement days for both the... (Master's thesis by Justin A. Thompson, June 2016; thesis advisor: Samuel E. Buttrey.)

  12. Improving Quality Using Architecture Fault Analysis with Confidence Arguments

    Science.gov (United States)

    2015-03-01

    Feiler, Peter H.; Weinstock, Charles B.; Goodenough, John B. ...arguments are represented explicitly. As reasons for doubt, called defeaters, are removed, confidence in system claims increases [Goodenough 2013, Weinstock...]. Cited: ...Peter; Goodenough, John; Gurfinkel, Arie; Weinstock, Charles; & Wrage, Lutz. Reliability Improvement and Validation Framework (CMU/SEI-2012-SR-013...

  13. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  14. Improved time complexity analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2015-01-01

    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented, proving that the algorithm with population size μ ≤ n^(1/8−ε) requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations...

  15. an improved structural model for seismic analysis of tall frames

    African Journals Online (AJOL)

    Dr Obe

    ABSTRACT. This paper proposes and examines an improved structural model that overcomes the deficiencies of the shear frame model by considering the effects of flexible horizontal members and column axial loads in the seismic analysis of multi-storey frames. The matrix displacement method of analysis is used on the basis of ...

  16. Improvement of software for analysis of visual meteor data

    Science.gov (United States)

    Veljković, K.; Ivanović, I.

    2015-01-01

    In this paper, we present improvements made on our software for the analysis of visual meteor data. R package MetFns received major updates. Selection filters and algorithms for calculation of zenithal hourly rate and population index, as well as accompanying graphics, are corrected and their performance is improved. Web application MetRApp contains a completely remade user interface and some new features. Also, calculation performances are optimized.

  17. Improving Automatic Text Classification by Integrated Feature Analysis

    Science.gov (United States)

    Busagala, Lazaro S. P.; Ohyama, Wataru; Wakabayashi, Tetsushi; Kimura, Fumitaka

    Feature transformation in automatic text classification (ATC) can lead to better classification performance. Furthermore dimensionality reduction is important in ATC. Hence, feature transformation and dimensionality reduction are performed to obtain lower computational costs with improved classification performance. However, feature transformation and dimension reduction techniques have been conventionally considered in isolation. In such cases classification performance can be lower than when integrated. Therefore, we propose an integrated feature analysis approach which improves the classification performance at lower dimensionality. Moreover, we propose a multiple feature integration technique which also improves classification effectiveness.

  18. Improvements in analysis techniques for segmented mirror arrays

    Science.gov (United States)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, which allows surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement-vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0 g release, and thermal stability of operation. The performance predicted by racking has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  19. Can an understanding of transactional analysis improve postgraduate clinical supervision?

    Science.gov (United States)

    Sivan, Manoj; McKimm, Judy; Held, Sam

    2011-01-01

    Clinical supervision in postgraduate medical training is vital in producing competent and safe health-care practitioners. Effective communication between supervisors and trainees at an interpersonal and professional level determines the quality of the supervision process. Transactional analysis, a theory of personality, can be used to enhance understanding of interpersonal interactions and improve the outcomes of clinical training.

  20. Costs and returns analysis of improved and alternative cassava ...

    African Journals Online (AJOL)

    The specific objective of the study was an analysis of the costs and returns of improved and alternative technologies available to farmers in the study area, and of their level of adoption of the new technologies. Data were collected from a random sample of 250 farmers and 30 extension staff in the three (3) agricultural zones ...

  1. Sensitivity analysis and its application for dynamic improvement

    Indian Academy of Sciences (India)

Keywords. Sensitivity analysis; dynamic improvement; structural modification; laser beam printer; motorbike; disc drive; mechatronics; automobile engine. Abstract. In order to determine appropriate points where the natural frequency or mode shape under consideration can be effectively modified by structural modification, the ...

  2. Improved Runtime Analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2013-01-01

A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of our previous one. Firstly, the new result holds for population sizes up to mu = n^(1/4 - epsilon), which is an improvement up to a power of 2 larger. Secondly, we present a technique to bound the diversity of the population that does not require a bound on its bandwidth. Apart from allowing a stronger result, we believe this is a major improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement, although the results hold for both algorithmic versions. Experiments are presented to explore the limits ...
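The abstract's result is theoretical, but the algorithm it analyzes is concrete: a non-elitist SGA with fitness-proportional selection, uniform crossover, and standard bit-flip mutation, applied to OneMax (count of ones). The following is a minimal sketch of that algorithm structure; all parameter values are illustrative, and per the paper's result one should not expect this non-elitist variant to reach the optimum efficiently.

```python
import random

def onemax(bits):
    """OneMax fitness: the number of ones in the bit string."""
    return sum(bits)

def sga_onemax(n=20, mu=16, generations=200, seed=1):
    """Simple Genetic Algorithm: fitness-proportional selection,
    uniform crossover, and bit-flip mutation with rate 1/n."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    for _ in range(generations):
        fits = [onemax(ind) for ind in pop]
        total = sum(fits) or 1

        def select():
            # roulette-wheel (fitness-proportional) selection
            r = rng.uniform(0, total)
            acc = 0.0
            for ind, f in zip(pop, fits):
                acc += f
                if acc >= r:
                    return ind
            return pop[-1]

        nxt = []
        for _ in range(mu):
            p1, p2 = select(), select()
            # uniform crossover: each bit taken from either parent
            child = [p1[i] if rng.random() < 0.5 else p2[i] for i in range(n)]
            # bit-flip mutation with probability 1/n per position
            child = [b ^ (rng.random() < 1.0 / n) for b in child]
            nxt.append(child)
        pop = nxt
    return max(onemax(ind) for ind in pop)

best = sga_onemax()
print(best)
```

Note the absence of elitism: the best individual can be lost between generations, which is central to why the analyzed algorithm struggles on OneMax despite the problem's simplicity.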

  3. Thermal hydraulic analysis of the JMTR improved LEU-core

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, Toshio; Nagao, Yoshiharu; Komukai, Bunsaku; Naka, Michihiro; Fujiki, Kazuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Takeda, Takashi [Radioactive Waste Management and Nuclear Facility Decommissioning Technology Center, Tokai, Ibaraki (Japan)

    2003-01-01

After the investigation of the new core arrangement for the JMTR reactor in order to enhance the fuel burn-up and consequently extend the operation period, the "improved LEU core", which utilizes 2 additional fuel elements instead of the formerly installed reflector elements, was adopted. This report describes the results of the thermal-hydraulic analysis of the improved LEU core as a part of the safety analysis for licensing. The analysis covers steady state, abnormal operational transients and accidents, which were described in the annexes of the licensing documents as design basis events. Calculation conditions for the computer codes were conservatively determined based on the neutronic analysis results and others. The results of the analysis, which revealed that the safety criteria were satisfied for the fuel temperature, DNBR and primary coolant temperature, were used in the licensing. The operation license of the JMTR with the improved LEU core was granted in March 2001, and reactor operation with the new core started in November 2001 as the 142nd operation cycle. (author)

  4. Improving preconception health and care: a situation analysis.

    Science.gov (United States)

    Goodfellow, Ashley; Frank, John; McAteer, John; Rankin, Jean

    2017-08-23

The purpose of this situation analysis was to explore the views of health and non-health professionals working with women of childbearing age on current and future delivery of preconception care in one National Health Service (NHS) Board area in Scotland. The situation analysis was undertaken using a mixed methods approach. Six focus groups were conducted, organised by profession: general practitioners (GPs), practice nurses, health visitors, family nurses, guidance teachers and youth workers. Existing evidence of effective preconception care interventions informed the focus group guides. A survey was undertaken with community pharmacists, which provided qualitative data for analysis. Focus group transcripts were analysed by two researchers using a thematic analysis approach. There was a lack of awareness of preconception health and its importance amongst the target group. Levels of unplanned pregnancy hampered efforts to deliver interventions. Professional knowledge, capacity and consistency of practice were viewed as challenges, as was individual compliance with preconception care advice. Improvement requires multifaceted action, including ensuring the school curriculum adequately prepares adolescents for future parenthood, increasing awareness through communication and marketing, supporting professional knowledge and practice, capitalising on existing opportunities for preconception care, and ensuring services are equitable and targeted to need. Delivery of preconception care needs to be improved both before and between pregnancies to improve outcomes for women and infants. Action is required at individual, organisational and community levels to ensure this important issue is at the forefront of preventative care and preventative spending.

  5. Domain analysis and modeling to improve comparability of health statistics.

    Science.gov (United States)

    Okada, M; Hashimoto, H; Ohida, T

    2001-01-01

Health statistics is an essential element to improve the ability of managers of health institutions, healthcare researchers, policy makers, and health professionals to formulate appropriate courses of action and to make decisions based on evidence. To ensure adequate health statistics, standards are of critical importance. A study on healthcare statistics domain analysis is underway in an effort to improve usability and comparability of health statistics. The ongoing study focuses on structuring the domain knowledge and making the knowledge explicit, with a data element dictionary being the core. Supplemental to the dictionary are a domain term list, a terminology dictionary, and a data model to help organize the concepts constituting the health statistics domain.

  6. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  7. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
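The core computation behind such a correlation analysis can be sketched simply: given assessment scores of process elements across projects, compute pairwise correlations and flag strongly correlated pairs. This is a minimal illustration only; the practice names and score values below are hypothetical and not taken from the paper's industrial data.

```python
import numpy as np

def correlated_practices(scores, labels, threshold=0.7):
    """Flag pairs of process elements whose assessment scores are
    strongly correlated across projects.

    scores: (n_projects, n_practices) array of assessment ratings.
    """
    corr = np.corrcoef(scores, rowvar=False)
    pairs = []
    n = corr.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            if abs(corr[i, j]) >= threshold:
                pairs.append((labels[i], labels[j], round(float(corr[i, j]), 2)))
    return pairs

# Hypothetical assessment data: ratings of four CMMI-style practices
# over six projects (values are illustrative only).
scores = np.array([
    [3, 3, 1, 2],
    [4, 4, 2, 1],
    [2, 2, 3, 3],
    [5, 5, 1, 2],
    [3, 4, 2, 2],
    [4, 4, 3, 1],
])
labels = ["REQM", "PP", "PMC", "CM"]
pairs = correlated_practices(scores, labels)
print(pairs)
```

In this toy data, requirements management (REQM) and project planning (PP) move together across projects, so the analysis would suggest treating them jointly in an improvement plan.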

  8. Training needs analysis for MSMEs: how to improve training effectiveness

    Science.gov (United States)

    Rohayati, Y.; Wulandari, S.

    2017-12-01

The study aims to analyze training needs of MSMEs in the Kabupaten Bandung area, selecting the case of MSMEs joined in the Association for Agricultural Product Processing and focusing on marketing as the main training topic. The needs analysis was required to improve training participation and effectiveness. Both aspects matter, since persuading MSMEs to participate in training is not easy, and a training programme that does not meet participants' needs is likely to be judged unbeneficial or ineffective even though it is intended to help MSMEs improve the marketing knowledge that leads to their success. This research involved 100 MSMEs with business ages ranging from less than five years to more than 15 years, dominated by MSMEs targeting local markets. The data were collected by survey using a judgmental sampling technique. The descriptive analysis concludes that marketing training for these SMEs should focus on improving marketing skills such as product development, sales, and use of marketing media, as well as legal aspects such as the need for certification and product branding. The study also concludes that training should be supplemented by visits to more successful SMEs and by practice using on-the-job training methods.

  9. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA, which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how degraded plant conditions affect the operator cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the HRA method developed. HEPs of the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results of the two methods are compared to depict their differences and the issues to be solved. Important conclusions are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Factors to be considered in the evaluation of human errors were clarified, degraded plant safety conditions were incorporated into the HRA, and HEPs affected by the contents of operator tasks were investigated, improving the HRA method so that it can integrate an operator cognitive action model into the ATHENA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using the THERP method, and the results of the two methods were compared.

  10. Requirements Analysis for Future Satellite Gravity Mission Improved-GRACE

    Science.gov (United States)

    Zheng, Wei; Hsu, Houtse; Zhong, Min; Yun, Meijuan

    2015-01-01

    The Earth's gravitational field from the Next-Generation Gravimetry Mission (NGGM) and the Improved-Gravity Recovery and Climate Experiment (Improved-GRACE) complete up to degree and order 120 is recovered by a closed-loop numerical simulation using different orbital altitudes of 325 and 300 km, different orbital inclinations of 96.78° and 89° and different inter-satellite ranges of 10 and 50 km. The preferred orbit parameters of the future twin Improved-GRACE satellites are proposed based on the results of the simulations in this study. The research results show: (1) In order to achieve the scientific objectives, which require that the accuracy of the next-generation Earth gravity field models is at least one order of magnitude better than that of the current gravity models, the orbit design at an altitude of 300 ± 50 km is recommended for the future Improved-GRACE mission. This altitude is determined by a trade-off analysis between the recovery accuracy of the gravity field and the operational lifetime of the satellite system. (2) Because the accuracy of the Earth's gravitational field from NGGM with an orbital inclination of 96.78° will be decreased due to a lack of the observation data in the polar areas, we propose that a near-polar orbit (inclination of 89° ± 2°) is a preferable selection for the future twin Improved-GRACE satellites. (3) The future Improved-GRACE mission has to adopt an inter-satellite range of 50 ± 10 km, because the common signals of the Earth's gravitational field between the twin NGGM satellites will be substantially eliminated with a shorter inter-satellite range of 10 km. With these orbit design parameters, the Earth's gravitational field from the Improved-GRACE mission is precisely recovered complete up to degree and order 120 with a cumulative geoid height error of about 0.7 mm.

  11. Spiral analysis-improved clinical utility with center detection.

    Science.gov (United States)

    Wang, Hongzhi; Yu, Qiping; Kurtis, Mónica M; Floyd, Alicia G; Smith, Whitney A; Pullman, Seth L

    2008-06-30

    Spiral analysis is a computerized method that measures human motor performance from handwritten Archimedean spirals. It quantifies normal motor activity, and detects early disease as well as dysfunction in patients with movement disorders. The clinical utility of spiral analysis is based on kinematic and dynamic indices derived from the original spiral trace, which must be detected and transformed into mathematical expressions with great precision. Accurately determining the center of the spiral and reducing spurious low frequency noise caused by center selection error is important to the analysis. Handwritten spirals do not all start at the same point, even when marked on paper, and drawing artifacts are not easily filtered without distortion of the spiral data and corruption of the performance indices. In this report, we describe a method for detecting the optimal spiral center and reducing the unwanted drawing artifacts. To demonstrate overall improvement to spiral analysis, we study the impact of the optimal spiral center detection in different frequency domains separately and find that it notably improves the clinical spiral measurement accuracy in low frequency domains.
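The key idea of center detection can be illustrated with a toy computation: for an Archimedean spiral, the radius grows linearly with the unwrapped angle only when measured from the true center, so the candidate center with the smallest residual from a linear r-versus-theta fit is the best estimate. This is a simplified sketch (a small candidate grid and noise-free synthetic data), not the paper's actual optimization procedure.

```python
import numpy as np

def spiral_fit_error(x, y, cx, cy):
    """RMS residual of a linear fit r = a + b*theta for an Archimedean
    spiral, measured relative to a candidate center (cx, cy)."""
    r = np.hypot(x - cx, y - cy)
    theta = np.unwrap(np.arctan2(y - cy, x - cx))
    A = np.vstack([np.ones_like(theta), theta]).T
    coef, *_ = np.linalg.lstsq(A, r, rcond=None)
    return float(np.sqrt(np.mean((A @ coef - r) ** 2)))

def detect_center(x, y, candidates):
    """Pick the candidate center with the smallest spiral-fit residual."""
    errs = [spiral_fit_error(x, y, cx, cy) for cx, cy in candidates]
    return candidates[int(np.argmin(errs))]

# Synthetic Archimedean spiral drawn around (1.0, -0.5)
t = np.linspace(0.5, 6 * np.pi, 400)
x = 1.0 + 0.3 * t * np.cos(t)
y = -0.5 + 0.3 * t * np.sin(t)
candidates = [(0.0, 0.0), (1.0, -0.5), (2.0, 1.0)]
center = detect_center(x, y, candidates)
print(center)
```

Choosing a wrong center makes r a nonlinear function of theta, which shows up as spurious low-frequency content in the residual, matching the abstract's observation about low-frequency noise from center selection error.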

  12. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was used to model correlation between 19 register-based indicators for the quality of non-steroidal anti-inflammatory drug (NSAID) prescribing. RESULTS: The correlation between indicators ranged widely from 0 to 0.93. Factor analysis revealed three dimensions of quality: (1) "Coxib preference", comprising ... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns.
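The mechanics of such an analysis can be sketched with a principal-factor extraction from the correlation matrix of indicator data. The data below are synthetic (two latent dimensions driving four indicators) and only illustrate how correlated indicators collapse onto a few factors; the study's 19 NSAID indicators and three quality dimensions are not reproduced here.

```python
import numpy as np

def principal_factors(X, n_factors=2):
    """Extract principal-factor loadings from the correlation matrix of
    indicator data X (rows = practices, columns = indicators)."""
    R = np.corrcoef(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(R)
    order = np.argsort(eigval)[::-1]          # sort eigenpairs, largest first
    eigval, eigvec = eigval[order], eigvec[:, order]
    loadings = eigvec[:, :n_factors] * np.sqrt(eigval[:n_factors])
    return loadings, eigval

rng = np.random.default_rng(0)
n = 200                       # hypothetical number of practices
base1 = rng.normal(size=n)    # latent dimension 1 (e.g. drug preference)
base2 = rng.normal(size=n)    # latent dimension 2 (e.g. prescribed volume)
X = np.column_stack([
    base1 + 0.2 * rng.normal(size=n),
    base1 + 0.2 * rng.normal(size=n),
    base2 + 0.2 * rng.normal(size=n),
    base2 + 0.2 * rng.normal(size=n),
])
loadings, eigval = principal_factors(X)
ratio = eigval[:2].sum() / eigval.sum()
print(round(float(ratio), 3))   # variance explained by the two leading factors
```

Indicators loading heavily on the same factor measure the same underlying quality dimension, which is exactly the redundancy the paper exploits to prune an indicator set.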

  13. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

The objective of this paper is to improve the recognition of captured QR code images blurred by defocus, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality, and focus is an important factor affecting image quality. This study discusses out-of-focus QR code images and aims to improve recognition of their contents. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image, and this method is also used in this investigation. A blurred QR code image is separated into nine levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image, and the nine reconstructed QR code images are then compared. The final experimental results indicate improvements in identification.
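The pillbox (circular averaging) filter mentioned here is straightforward to construct: a disc of equal weights normalized to sum to one, convolved with the image. The following is a minimal, naive sketch of that defocus simulation on a toy binary pattern; real implementations would use an FFT-based or library convolution.

```python
import numpy as np

def pillbox_kernel(radius):
    """Circular averaging (pillbox) kernel of the given pixel radius."""
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    mask = (xx ** 2 + yy ** 2) <= radius ** 2
    kernel = mask.astype(float)
    return kernel / kernel.sum()

def blur(image, radius):
    """Simulate an out-of-focus image by convolving with a pillbox kernel
    (naive sliding-window convolution with edge replication)."""
    k = pillbox_kernel(radius)
    pad = radius
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros_like(image, dtype=float)
    h, w = image.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 2 * pad + 1, j:j + 2 * pad + 1] * k)
    return out

# Toy binary "QR-like" checkerboard; blurring smooths its sharp edges.
img = (np.indices((16, 16)).sum(axis=0) % 2).astype(float)
b = blur(img, 2)
print(img.std(), b.std())   # the blurred image has lower contrast
```

Increasing the kernel radius models progressively worse defocus, which is how a blurred QR code image can be graded into levels as in the study.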

  14. Security analysis and improvements to the PsychoPass method.

    Science.gov (United States)

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

In a recent paper, Pietro Cipresso et al. proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Our aim was to perform a security analysis of the PsychoPass method and outline its limitations and possible improvements. We used brute-force and dictionary-attack analyses of the PsychoPass method to outline its weaknesses. The first issue with the PsychoPass method is that it requires password reproduction on the same keyboard layout as was used to generate the password. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.
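Why a 24-character password can still be weak follows from keyspace arithmetic: a keyboard-path password is not 24 independent choices but a starting key followed by moves among a handful of adjacent keys. The branching factors below are purely illustrative assumptions, not figures from the paper; the point is only that modifiers and longer moves enlarge the per-step choice set and hence the entropy.

```python
import math

def adjacency_keyspace(start_keys, branching, steps):
    """Keyspace of a keyboard-path password: one starting key followed by
    `steps` moves, each choosing among `branching` reachable next keys."""
    return start_keys * branching ** steps

# Original method (illustrative numbers): 24 keys, each next key one of
# ~4 physical neighbours -- far fewer possibilities than 30**24.
orig = adjacency_keyspace(30, 4, 23)
# Improved variant: only 10 keys, but SHIFT/ALT-GR modifiers and
# 1-2-distance moves multiply the per-step choices (assume ~48).
improved = adjacency_keyspace(30, 48, 9)
print(f"{math.log2(orig):.1f} bits vs {math.log2(improved):.1f} bits")
```

Under these assumed branching factors, the 10-key variant already exceeds the entropy of the 24-key original, illustrating how the improved method can match or beat the original strength with far fewer keystrokes.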

  15. Micromechanical analysis of polyacrylamide-modified concrete for improving strengths

    International Nuclear Information System (INIS)

    Sun Zengzhi; Xu Qinwu

    2008-01-01

This paper studies how polyacrylamide (PAM) alters the physicochemical and mechanical properties of concrete. The microstructure of PAM-modified concrete and the physicochemical reaction between PAM and concrete were studied through scanning electron microscope (SEM), differential thermal analysis (DTA), thermal gravimetric analysis (TGA), and infrared spectrum analysis. Meanwhile, the workability and strengths of cement paste and concrete were tested. PAM's modification mechanism was also discussed. Results indicate that PAM reacts with the Ca²⁺ and Al³⁺ cations produced by concrete hydration to form ionic compounds and reduce the crystallization of Ca(OH)₂, acting as a flexible filler and reinforcement in the porosity of concrete and, therefore, improving concrete's engineering properties. PAM also significantly alters the microstructure at the aggregate-cement interfacial transition zone. Mechanical testing results indicate that the fluidity of cement paste decreases initially, then increases, and decreases again with increasing PAM content. PAM can effectively improve the flexural strength, bonding strength, dynamic impact resistance, and fatigue life of concrete, though it reduces the compressive strength to some extent.

  16. Modified paraffin wax for improvement of histological analysis efficiency.

    Science.gov (United States)

    Lim, Jin Ik; Lim, Kook-Jin; Choi, Jin-Young; Lee, Yong-Keun

    2010-08-01

Paraffin wax is usually used as an embedding medium for histological analysis of natural tissue. However, it is not easy to obtain enough numbers of satisfactory sectioned slices because of the difference in mechanical properties between the paraffin and embedded tissue. We describe a modified paraffin wax that can improve the histological analysis efficiency of natural tissue, composed of paraffin and ethylene vinyl acetate (EVA) resin (0, 3, 5, and 10 wt %). Softening temperature of the paraffin/EVA media was similar to that of paraffin (50-60 degrees C). The paraffin/EVA media dissolved completely in xylene after 30 min at 50 degrees C. Physical properties such as the amount of load under the same compressive displacement, elastic recovery, and crystal intensity increased with increased EVA content. EVA medium (5 wt %) was regarded as an optimal composition, based on the sectioning efficiency measured by the numbers of unimpaired sectioned slices, amount of load under the same compressive displacement, and elastic recovery test. Based on the staining test of sectioned slices embedded in a 5 wt % EVA medium by hematoxylin and eosin (H&E), Masson trichrome (MT), and other staining tests, it was concluded that the modified paraffin wax can improve the histological analysis efficiency with various natural tissues.

  17. Toward improved analysis of concentration data: Embracing nondetects.

    Science.gov (United States)

    Shoari, Niloofar; Dubé, Jean-Sébastien

    2018-03-01

Various statistical tests on concentration data serve to support decision-making regarding characterization and monitoring of contaminated media, assessing exposure to a chemical, and quantifying the associated risks. However, the routine statistical protocols cannot be directly applied because of challenges arising from nondetects or left-censored observations, which are concentration measurements below the detection limit of measuring instruments. Despite the existence of techniques based on survival analysis that can adjust for nondetects, these are seldom taken into account properly. A comprehensive review of the literature showed that managing policies regarding analysis of censored data do not always agree and that guidance from regulatory agencies may be outdated. Therefore, researchers and practitioners commonly resort to the most convenient way of tackling the censored data problem by substituting nondetects with arbitrary constants prior to data analysis, although this is generally regarded as a bias-prone approach. Hoping to improve the interpretation of concentration data, the present article aims to familiarize researchers in different disciplines with the significance of left-censored observations and provides theoretical and computational recommendations (under both frequentist and Bayesian frameworks) for adequate analysis of censored data. In particular, the present article synthesizes key findings from previous research with respect to 3 noteworthy aspects of inferential statistics: estimation of descriptive statistics, hypothesis testing, and regression analysis. Environ Toxicol Chem 2018;37:643-656. © 2017 SETAC.
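A concrete alternative to substituting nondetects with constants is maximum-likelihood estimation on the censored likelihood: detected values contribute density terms and each nondetect contributes the probability mass below the detection limit. The sketch below fits a lognormal by a crude grid search on synthetic data; it is for illustration only, and dedicated survival-analysis tooling would be used in practice.

```python
import math
import numpy as np

def norm_logcdf(z):
    # log of the standard normal CDF, via erfc for numerical stability
    return math.log(max(0.5 * math.erfc(-z / math.sqrt(2)), 1e-300))

def censored_lognormal_mle(detects, n_censored, detection_limit):
    """Grid-search ML estimate of (mu, sigma) of log-concentrations when
    n_censored observations fell below the detection limit."""
    logs = np.log(detects)
    logdl = math.log(detection_limit)
    best, best_ll = None, -math.inf
    for mu in np.linspace(logs.mean() - 2, logs.mean() + 1, 120):
        for sigma in np.linspace(0.05, 3.0, 120):
            # detected values: normal density terms on the log scale;
            # nondetects: probability mass below the detection limit
            ll = (-len(logs) * math.log(sigma)
                  - float(np.sum((logs - mu) ** 2)) / (2 * sigma ** 2)
                  + n_censored * norm_logcdf((logdl - mu) / sigma))
            if ll > best_ll:
                best, best_ll = (mu, sigma), ll
    return best

rng = np.random.default_rng(42)
true_mu, true_sigma, dl = 1.0, 0.8, 2.0
sample = rng.lognormal(true_mu, true_sigma, 500)
detects = sample[sample >= dl]
n_cens = int((sample < dl).sum())
mu_hat, sigma_hat = censored_lognormal_mle(detects, n_cens, dl)
print(round(mu_hat, 2), round(sigma_hat, 2))
```

Even with roughly a third of the observations censored, the censored-likelihood fit recovers the underlying parameters, whereas statistics computed after constant substitution depend on the arbitrary substitution value.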

  18. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs). Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.
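The gap-analysis computation itself is simple: for each critical success factor, subtract the rated actual status from the rated importance and rank by the difference. The ratings below are illustrative placeholders, not the study's survey values.

```python
def gap_analysis(factors):
    """Rank critical success factors by the gap between their rated
    importance and their actual implementation status (1-5 scales)."""
    gaps = [(name, importance - status) for name, importance, status in factors]
    return sorted(gaps, key=lambda g: g[1], reverse=True)

# Illustrative ratings only (the study's real values are in the paper).
csfs = [
    ("Management support", 4.8, 3.1),
    ("Appropriate supervision", 4.6, 3.2),
    ("Sufficient resource allocation", 4.5, 3.3),
    ("Safety training", 4.2, 3.9),
    ("Clear policy", 4.0, 3.8),
]
ranked = gap_analysis(csfs)
for name, gap in ranked:
    print(f"{name}: gap {gap:.1f}")
```

The factors at the top of the ranking are the ones the study would flag as priorities, since they are rated highly important but poorly implemented.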

  19. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.
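The underlying vulnerability notion can be demonstrated without the paper's ant colony heuristic: score each link by how much travel cost increases when it is removed. The sketch below uses exhaustive link removal with Dijkstra shortest paths (a brute-force stand-in for the paper's method, feasible only on small networks) over a hypothetical toy network.

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest travel cost from src to dst; graph: node -> [(nbr, cost)]."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, c in graph.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

def link_vulnerability(edges, od_pairs):
    """Score each undirected link by the total travel-cost increase its
    removal causes over the origin-destination pairs."""
    def build(excluded):
        g = {}
        for u, v, c in edges:
            if (u, v) == excluded:
                continue
            g.setdefault(u, []).append((v, c))
            g.setdefault(v, []).append((u, c))
        return g
    base = {od: dijkstra(build(None), *od) for od in od_pairs}
    return {(u, v): sum(dijkstra(build((u, v)), *od) - base[od]
                        for od in od_pairs)
            for u, v, c in edges}

# Toy network: the only direct access from A is the link A-B, so its
# removal forces the long detour through E and should rank highest.
edges = [("A", "B", 1), ("B", "C", 1), ("C", "D", 1),
         ("B", "D", 3), ("A", "E", 2), ("E", "B", 2)]
scores = link_vulnerability(edges, [("A", "D")])
print(max(scores, key=scores.get))
```

The ant colony algorithm in the paper serves to make this kind of assessment tractable on large networks, where recomputing shortest paths for every removed link is too expensive.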

  20. An improved convergence analysis of smoothed aggregation algebraic multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Brezina, Marian [Univ. of Colorado, Boulder, CO (United States). Dept. of Applied Mathematics; Vaněk, Petr [University of West Bohemia (Czech Republic). Dept. of Mathematics; Vassilevski, Panayot S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing

    2011-03-02

We present an improved analysis of the smoothed aggregation (SA) algebraic multigrid method (AMG) extending the original proof in [SA] and its modification in [Va08]. The new result imposes fewer restrictions on the aggregates that makes it easier to verify in practice. Also, we extend a result in [Van] that allows us to use aggressive coarsening at all levels due to the special properties of the polynomial smoother, that we use and analyze, and thus provide a multilevel convergence estimate with bounds independent of the coarsening ratio.
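For readers unfamiliar with SA-AMG, the construction being analyzed can be illustrated in its simplest form: a tentative piecewise-constant prolongator over aggregates, smoothed by one damped-Jacobi step. The sketch below runs a two-grid cycle on a 1D Laplacian; it uses plain size-2 aggregates and an exact coarse solve, not the aggressive coarsening or polynomial smoother the paper analyzes.

```python
import numpy as np

def laplacian_1d(n):
    """1D Poisson matrix (Dirichlet boundaries): tridiag(-1, 2, -1)."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def sa_prolongator(A, agg_size=2, omega=2.0 / 3.0):
    """Tentative piecewise-constant prolongator over fixed aggregates,
    smoothed by one damped-Jacobi step: P = (I - w D^-1 A) P_tent."""
    n = A.shape[0]
    nc = n // agg_size
    P_tent = np.zeros((n, nc))
    for j in range(nc):
        P_tent[j * agg_size:(j + 1) * agg_size, j] = 1.0 / np.sqrt(agg_size)
    Dinv = np.diag(1.0 / np.diag(A))
    return (np.eye(n) - omega * Dinv @ A) @ P_tent

def two_grid_solve(A, b, iters=60, omega=2.0 / 3.0):
    """Two-grid SA cycle: damped-Jacobi smoothing around an exact
    Galerkin coarse-grid correction."""
    P = sa_prolongator(A)
    Ac = P.T @ A @ P                              # Galerkin coarse operator
    Dinv = np.diag(1.0 / np.diag(A))
    x = np.zeros_like(b)
    for _ in range(iters):
        x = x + omega * Dinv @ (b - A @ x)        # pre-smoothing
        r = b - A @ x
        x = x + P @ np.linalg.solve(Ac, P.T @ r)  # coarse-grid correction
        x = x + omega * Dinv @ (b - A @ x)        # post-smoothing
    return x

n = 32
A = laplacian_1d(n)
b = np.ones(n)
x = two_grid_solve(A, b)
res = float(np.linalg.norm(b - A @ x))
print(res)
```

Smoothing the tentative prolongator is what distinguishes SA from plain aggregation AMG: it spreads the coarse basis functions beyond aggregate boundaries, which is essential for mesh-independent convergence.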

  1. Improved method and apparatus for chromatographic quantitative analysis

    Science.gov (United States)

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, utilizing a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  2. MITG post-test analysis and design improvements

    International Nuclear Information System (INIS)

    Schock, A.

    1983-01-01

    The design, performance analysis, and key attributes of the Modular Isotopic Thermoelectric Generator (MITG) were described in a 1981 IECEC paper; and the design, fabrication, and testing of prototypical MITG test assemblies were described in preceding papers in these proceedings. Each test assembly simulated a typical modular slice of the flight generator. The present paper describes a detailed thermal-stress analysis, which identified the causes of stress-related problems observed during the tests. It then describes how additional analyses were used to evaluate design changes to alleviate those problems. Additional design improvements are discussed in the next paper in these proceedings, which also describes revised fabrication procedures and updated performance estimates for the generator

  3. Missile placement analysis based on improved SURF feature matching algorithm

    Science.gov (United States)

    Yang, Kaida; Zhao, Wenjie; Li, Dejun; Gong, Xiran; Sheng, Qian

    2015-03-01

Precise battle damage assessment based on video-image analysis of missile placement is a new study area. The article proposes an improved speeded-up robust features algorithm, named restricted speeded-up robust features (RSURF), which combines the combat application of TV-command-guided missiles with the characteristics of video imagery. The restrictions are reflected in two aspects: the first restricts the extraction area of feature points, and the second restricts the number of feature points. The process of missile placement analysis based on video images was designed, and a video splicing process and random sample consensus purification were implemented. The RSURF algorithm is shown to have good real-time performance while maintaining accuracy.

  4. Improvements and experience in the analysis of reprocessing samples

    International Nuclear Information System (INIS)

    Koch, L.; Cricchio, A.; Meester, R. de; Romkowski, M.; Wilhelmi, M.; Arenz, H.J.; Stijl, E. van der; Baeckmann, A. von

    1976-01-01

Improvements in the analysis of input samples for reprocessing were obtained. To cope with the decomposition of reprocessing input solutions owing to the high radioactivity, an aluminium capsule technique was developed. A known amount of the dissolver solution was weighed into an aluminium can, dried, and the capsule was sealed. In this form, the sample could be stored over a long period and could be redissolved later for the analysis. The isotope correlation technique offers an attractive alternative for measuring the plutonium isotopic content in the dissolver solution. Moreover, this technique allows for consistency checks of analytical results. For this purpose, a data bank of correlated isotopic data is in use. To improve the efficiency of analytical work, four automatic instruments have been developed. The conditioning of samples for the U-Pu isotopic measurement was achieved by an automatic ion exchanger. A mass spectrometer, to which a high vacuum lock is connected, allows the automatic measurement of U-Pu samples. A process computer controls the heating, focusing and scanning processes during the measurement and evaluates the data. To ease the data handling, alpha-spectrometry as well as a balance have been automated. (author)

  5. Multispectral fingerprinting for improved in vivo cell dynamics analysis

    Directory of Open Access Journals (Sweden)

    Cooper Cameron HJ

    2010-09-01

Background: Tracing cell dynamics in the embryo becomes tremendously difficult when cell trajectories cross in space and time and tissue density obscures individual cell borders. Here, we used the chick neural crest (NC) as a model to test multicolor cell labeling and multispectral confocal imaging strategies to overcome these roadblocks. Results: We found that multicolor nuclear cell labeling and multispectral imaging led to improved resolution of in vivo NC cell identification by providing a unique spectral identity for each cell. NC cell spectral identity allowed for more accurate cell tracking and was consistent during short-term time-lapse imaging sessions. Computer model simulations predicted significantly better object counting for increasing cell densities in 3-color compared to 1-color nuclear cell labeling. To better resolve cell contacts, we show that a combination of 2-color membrane and 1-color nuclear cell labeling dramatically improved the semi-automated analysis of NC cell interactions, yet preserved the ability to track cell movements. We also found that channel versus lambda scanning of multicolor labeled embryos significantly reduced the time and effort of image acquisition and analysis of large 3D volume data sets. Conclusions: Our results reveal that multicolor cell labeling and multispectral imaging provide a cellular fingerprint that may uniquely determine a cell's position within the embryo. Together, these methods offer a spectral toolbox to resolve in vivo cell dynamics in unprecedented detail.

  6. Group learning improves case analysis in veterinary medicine.

    Science.gov (United States)

    Pickrell, John A; Boyer, John; Oehme, Frederick W; Clegg, Victoria L; Sells, Nikki

    2002-01-01

    Group learning has become important to professional students in the healing sciences. Groups share factual and procedural resources to enhance their performances. We investigated the extent to which students analyzing case-based evaluations as teams acquired an immediate performance advantage relative to those analyzing them as individuals and the extent to which group work on one problem led to better performance by individual students on related problems. We blinded written evaluations by randomly assigning numbers to groups of students and using removable tracers. Differences between groups and individuals were evaluated using Student's t statistic. Similar comparisons were evaluated by meta-analysis to determine overall trends. Students who analyzed evaluations as a group had an 8.5% performance advantage over those who analyzed them as individuals. When evaluations were divided into those asking questions related to treatment, differential diagnosis, and prognosis, specific performance advantages for groups relative to individuals were 8.9%, 5.9%, and 6.1% respectively. Students who had previously been trained by group evaluations had a 1.5% advantage relative to those who received their training as individuals. Answers by students analyzing evaluations as groups suggested a deeper understanding, in large part because of their improved ability to explain treatment and to conduct differential diagnosis. These improvements suggested limited abilities to use previous experience to improve present performance.

  7. Skill analysis part 3: improving a practice skill.

    Science.gov (United States)

    Price, Bob

    In this, the third and final article in a series on practice skill analysis, attention is given to imaginative ways of improving a practice skill. Having analysed and evaluated a chosen skill in the previous two articles, it is time to look at new ways to proceed. Creative people are able to be analytical and imaginative. The process of careful reasoning involved in analysing and evaluating a skill will not necessarily be used to improve it. To advance a skill, there is a need to engage in more imaginative, free-thinking processes that allow the nurse to think afresh about his or her chosen skill. Suggestions shared in this article are not exhaustive, but the material presented does illustrate measures that in the author's experience seem to have potential. Consideration is given to how the improved skill might be envisaged (an ideal skill in use). The article is illustrated using the case study of empathetic listening, which has been used throughout this series.

  8. ECONOMIC AND ENERGY ANALYSIS OF AN IMPROVED PLASMA WASTE UTILIZATION TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Serghei VAMBOL

    2015-07-01

    Full Text Available Purpose. Energy and economic evaluation of an improved plasma waste utilization process, and substantiation of the expediency of the improved plasma technology by comparing its energy consumption with that of other thermal utilization methods. Methodology. Analysis of existing and advanced waste management methods and their impact on environmental safety; consideration of the energy and monetary costs of implementing two different waste utilization technologies. Results. The studies showed that plasma gasification ensures a greater heating value of the product gas than regular gasification, whose gas is diluted by a significant amount of nitrogen. From the point of view of minimizing energy and monetary costs and of environmental safety, the proposed improved plasma technology is the more promising. To assess the energy expediency of the considered technologies, a comparative calculation was carried out at standard conditions. Processing the waste yields useful products, such as liquefied methane, synthetic gas (94% methane) and a fuel gas for heating, all suitable for sale, which ensures the cost-effectiveness of this technology. Originality. The ecological and economic efficiency of the proposed improved plasma waste utilization technology is shown and evaluated in comparison with other thermal techniques. Practical value. The energy and monetary costs of implementing two different waste utilization technologies, namely regular gasification and gasification using plasma generators, are considered and grounded. The proposed plasma waste utilization technology yields useful, saleable products, and a plant implementing the improved process can compensate for daily and seasonal fluctuations in electricity and heat consumption by storing the obtained fuel products.

  9. Does plyometric training improve strength performance? A meta-analysis.

    Science.gov (United States)

    Sáez-Sáez de Villarreal, Eduardo; Requena, Bernardo; Newton, Robert U

    2010-09-01

    The majority of research suggests that plyometric training (PT) improves maximal strength performance as measured by 1RM, isometric MVC or slow-velocity isokinetic testing. However, the effectiveness of PT depends upon various factors. A meta-analysis of 15 studies with a total of 31 effect sizes (ES) was carried out to analyse the role of various factors on the effects of PT on strength performance. The inclusion criteria for the analysis were: (a) studies using PT programs for lower limb muscles; (b) studies employing true experimental design and valid and reliable measurements; (c) studies including sufficient data to calculate ES. When subjects can adequately follow plyometric exercises, the training gains are independent of fitness level. Subjects in either good or poor physical condition benefit equally from plyometric work, and men obtain similar strength results to women following PT. In relation to the variables of program design, a training duration of less than 10 weeks with more than 15 sessions, as well as the implementation of high-intensity programs with more than 40 jumps per session, were the strategies that seem to maximize the probability of obtaining significantly greater improvements in performance (p<0.05). In order to optimise strength enhancement, the combination of different types of plyometrics with weight-training would be recommended, rather than utilizing only one form (p<0.05). The responses identified in this analysis are essential and should be considered by the strength and conditioning professional with regard to the most appropriate dose-response trends for PT to optimise strength gains.
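    The pooling step behind a meta-analysis like this one combines standardized effect sizes across studies. A minimal sketch of that step, using made-up per-study data (not the paper's dataset) and sample-size weights as a crude inverse-variance proxy:

```python
def cohens_d(mean_exp, mean_ctrl, sd_pooled):
    """Standardized mean difference (Cohen's d) between experimental and control groups."""
    return (mean_exp - mean_ctrl) / sd_pooled

def weighted_mean_es(effect_sizes, sample_sizes):
    """Fixed-effect pooled estimate: weight each study's ES by its sample size,
    a rough stand-in for inverse-variance weighting."""
    total = sum(sample_sizes)
    return sum(es * n for es, n in zip(effect_sizes, sample_sizes)) / total

# Hypothetical 1RM results (kg) for three studies -- illustrative only.
es = [cohens_d(105.0, 100.0, 10.0),   # d = 0.5
      cohens_d(112.0, 100.0, 12.0),   # d = 1.0
      cohens_d(103.0, 100.0, 15.0)]   # d = 0.2
pooled = weighted_mean_es(es, [20, 30, 50])
print(round(pooled, 3))
```

    A full meta-analysis would additionally apply small-sample corrections (Hedges' g) and test heterogeneity, which this sketch omits.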

  10. Improving patient safety in radiotherapy through error reporting and analysis

    International Nuclear Information System (INIS)

    Findlay, Ú.; Best, H.; Ottrey, M.

    2016-01-01

    Aim: To improve patient safety in radiotherapy (RT) through the analysis and publication of radiotherapy errors and near misses (RTE). Materials and methods: RTE are submitted on a voluntary basis by NHS RT departments throughout the UK to the National Reporting and Learning System (NRLS) or directly to Public Health England (PHE). RTE are analysed by PHE staff using frequency trend analysis based on the classification and pathway coding from Towards Safer Radiotherapy (TSRT). PHE, in conjunction with the Patient Safety in Radiotherapy Steering Group, publish learning from these events three times a year, with a biennial summary, so that their occurrence might be mitigated. Results: Since the introduction of this initiative in 2010, over 30,000 RTE reports have been submitted. The number of RTE reported in each biennial cycle has grown, ranging from 680 (2010) to 12,691 (2016). The vast majority of the RTE reported are lower-level events, thus not affecting the outcome of patient care. Of the level 1 and 2 incidents reported, it is known that the majority affected only one fraction of a course of treatment. This means that corrective action could be taken over the remaining treatment fractions, so the incident did not have a significant impact on the patient or the outcome of their treatment. Analysis of the RTE reports demonstrates that generation of error is not confined to one professional group or to any particular point in the pathway. It also indicates that the pattern of errors is replicated across service providers in the UK. Conclusion: Use of the terminology, classification and coding of TSRT, together with implementation of the national voluntary reporting system described within this report, allows clinical departments to compare their local analysis to the national picture. Further opportunities to improve learning from this dataset must be exploited through further development of the analysis and of proactive risk management strategies

  11. Response surface analysis to improve dispersed crude oil biodegradation

    Energy Technology Data Exchange (ETDEWEB)

    Zahed, Mohammad A.; Aziz, Hamidi A.; Mohajeri, Leila [School of Civil Engineering, Universiti Sains Malaysia, Nibong Tebal, Penang (Malaysia); Isa, Mohamed H. [Civil Engineering Department, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia)

    2012-03-15

    In this research, the bioremediation of dispersed crude oil, based on the amount of nitrogen and phosphorus supplementation in a closed system, was optimized by the application of response surface methodology and central composite design. Correlation analysis of the mathematical regression model demonstrated that a quadratic polynomial model could be used to optimize the hydrocarbon bioremediation (R{sup 2} = 0.9256). Statistical significance was checked by analysis of variance and residual analysis. Natural attenuation removed 22.1% of the crude oil in 28 days. The highest removal under un-optimized conditions, 68.1%, was observed using 20.00 mg/L nitrogen and 2.00 mg/L phosphorus in 28 days, while the optimization process exhibited a crude oil removal of 69.5% with 16.05 mg/L nitrogen and 1.34 mg/L phosphorus in 27 days; optimization can therefore improve biodegradation in a shorter time with less nutrient consumption. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
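    The quadratic response-surface fit described above can be sketched as ordinary least squares on a central-composite-style design. The design points, coefficients and removal values below are invented for illustration and are not taken from the study:

```python
import numpy as np

# Hypothetical CCD-style design points (nitrogen, phosphorus in mg/L):
# 2^2 factorial corners, axial points, and a center point.
N = np.array([10., 10., 20., 20., 8., 22., 15., 15., 15.])
P = np.array([1.0, 2.0, 1.0, 2.0, 1.5, 1.5, 0.8, 2.2, 1.5])

def design_matrix(N, P):
    """Full second-order model: intercept, linear, squared and interaction terms."""
    return np.column_stack([np.ones_like(N), N, P, N**2, P**2, N * P])

# Synthetic removal (%) generated from a known quadratic surface, for demonstration.
true_beta = np.array([20.0, 4.0, 15.0, -0.12, -5.0, 0.3])
X = design_matrix(N, P)
y = X @ true_beta

# Least-squares fit and coefficient of determination.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(beta.round(3), round(r2, 4))
```

    With noise-free synthetic data the fit recovers the generating coefficients exactly; on real experimental data the R{sup 2} reported in the abstract plays the role computed here.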

  12. TRENDS IN IMPROVING THE ANALYSIS OF VENTURE ACTIVITY FOR MANAGEMENT DECISIONS

    Directory of Open Access Journals (Sweden)

    G.Yu. Iakovets

    2015-03-01

    Full Text Available The article considers the definition of current trends and prospects of venture financing of new innovative enterprises as one of the most effective, though high-risk, alternative financing sources for an entity. The features of venture financing that distinguish it from other sources of business financing are examined: income from venture capital investments can greatly exceed the volume invested, but at the same time the risks of such financing are significant, all of which makes it necessary to build an effective system of venture capital investment in the enterprise. The study also revealed problems in the analysis and minimization of risks arising in the venture financing of innovative enterprises. Defining the characteristics of analysis and risk assessment of venture financing helps to find ways to minimize, systematize, avoid and prevent risks in the use of venture capital. The study also identified the major areas for improving the analysis of venture activity for management decisions.

  13. Full-motion video analysis for improved gender classification

    Science.gov (United States)

    Flora, Jeffrey B.; Lochtefeld, Darrell F.; Iftekharuddin, Khan M.

    2014-06-01

    The ability of computer systems to perform gender classification using the dynamic motion of the human subject has important applications in medicine, human factors, and human-computer interface systems. Previous works in motion analysis have used data from sensors (including gyroscopes, accelerometers, and force plates), radar signatures, and video. However, full-motion video motion-capture data provide a dataset with higher temporal and spatial resolution for the analysis of dynamic motion. Works using motion capture data have been limited by small datasets in a controlled environment. In this paper, we apply machine learning techniques to a new dataset that has a larger number of subjects. Additionally, these subjects move unrestricted through a capture volume, representing a more realistic, less controlled environment. We conclude that existing linear classification methods are insufficient for gender classification on the larger dataset captured in a relatively uncontrolled environment. A method based on a nonlinear support vector machine classifier is proposed to obtain gender classification for the larger dataset. In experimental testing with a dataset consisting of 98 trials (49 subjects, 2 trials per subject), classification rates using leave-one-out cross-validation are improved from 73% using linear discriminant analysis to 88% using the nonlinear support vector machine classifier.
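    The abstract's central claim, that a linear discriminant fails where a nonlinear kernel classifier succeeds, can be illustrated with leave-one-out cross-validation on a toy XOR-like dataset. The centroid rule and RBF-weighted vote below are deliberately simplified stand-ins for the LDA and SVM used in the paper, and the data are invented:

```python
import math

# Toy 2-D dataset with a nonlinear (XOR-like) class boundary -- illustrative
# only; the study used gait features extracted from motion-capture data.
data = [((0.1, 0.1), 0), ((0.2, 0.0), 0), ((0.9, 1.0), 0), ((1.0, 0.8), 0),
        ((0.0, 1.0), 1), ((0.1, 0.9), 1), ((1.0, 0.1), 1), ((0.9, 0.0), 1)]

def linear_centroid(train, x):
    """Linear rule: assign to the class with the nearer mean (a stand-in for LDA)."""
    best, label = None, None
    for c in (0, 1):
        pts = [p for p, y in train if y == c]
        mu = tuple(sum(v) / len(pts) for v in zip(*pts))
        d = sum((a - b) ** 2 for a, b in zip(x, mu))
        if best is None or d < best:
            best, label = d, c
    return label

def rbf_vote(train, x, gamma=10.0):
    """Nonlinear rule: RBF-kernel-weighted vote (a stand-in for a kernel SVM)."""
    score = {0: 0.0, 1: 0.0}
    for p, y in train:
        score[y] += math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, p)))
    return max(score, key=score.get)

def loocv(classifier):
    """Leave-one-out cross-validation accuracy."""
    hits = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]
        hits += classifier(train, x) == y
    return hits / len(data)

print(loocv(linear_centroid), loocv(rbf_vote))
```

    On this layout the class means nearly coincide, so the linear rule performs poorly while the kernel rule classifies every held-out point correctly, mirroring the 73% vs 88% gap reported above.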

  14. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate at 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm based on a subset of the above fields, combined with satellite products.
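    Of the convective fields listed, the LCL is the simplest to estimate from surface data. A well-known rule of thumb puts the LCL height at roughly 125 m per degree Celsius of surface dew-point depression; operational systems such as INCA derive it from full thermodynamic profiles, so the sketch below is only a first-order illustration:

```python
def lcl_height_m(t_surface_c, dewpoint_c):
    """Rule-of-thumb lifted condensation level height above ground (metres):
    ~125 m per degree Celsius of dew-point depression. A crude approximation,
    reasonable only for moderate humidity near the surface."""
    return 125.0 * (t_surface_c - dewpoint_c)

print(lcl_height_m(25.0, 17.0))   # 8 degC depression
```

    CAPE and CIN, by contrast, require integrating parcel buoyancy over the full vertical profile and cannot be reduced to a surface-only formula.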

  15. Improving knowledge management systems with latent semantic analysis

    International Nuclear Information System (INIS)

    Sebok, A.; Plott, C.; LaVoie, N.

    2006-01-01

    Latent Semantic Analysis (LSA) offers a technique for improving lessons learned and knowledge management systems. These systems are expected to become more widely used in the nuclear industry, as experienced personnel leave and are replaced by younger, less-experienced workers. LSA is a machine learning technology that allows searching of text based on meaning rather than predefined keywords or categories. Users can enter and retrieve data using their own words, rather than relying on constrained language lists or navigating an artificially structured database. LSA-based tools can greatly enhance the usability and usefulness of knowledge management systems and thus provide a valuable tool to assist nuclear industry personnel in gathering and transferring worker expertise. (authors)
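    The core of LSA as described above is a truncated SVD of a term-document matrix, with queries folded into the same latent space so that retrieval works by meaning rather than exact keyword match. The sketch below uses a made-up five-term vocabulary and toy counts; production systems add TF-IDF weighting and far larger corpora:

```python
import numpy as np

# Tiny term-document matrix (rows: terms, columns: lessons-learned documents).
# Vocabulary and counts are invented for illustration.
terms = ["pump", "seal", "leak", "valve", "calibration"]
A = np.array([[2., 0., 1., 0.],    # pump
              [1., 0., 2., 0.],    # seal
              [1., 0., 1., 0.],    # leak
              [0., 2., 0., 1.],    # valve
              [0., 1., 0., 2.]])   # calibration

# Truncated SVD: keep k latent dimensions; documents live in the row space of Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T          # one k-dim vector per document

def query_vec(word_counts):
    """Fold a query (term-count vector) into the latent space."""
    return np.array(word_counts, dtype=float) @ U[:, :k]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

q = query_vec([1, 1, 0, 0, 0])                  # query: "pump seal"
sims = [cosine(q, d) for d in doc_vecs]
best = int(np.argmax(sims))
print(best, [round(x, 3) for x in sims])
```

    The query matches the pump/seal/leak documents almost perfectly and the valve/calibration documents not at all, even though no single document need contain both query words, which is the meaning-based retrieval behaviour the abstract describes.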

  16. An improved method for the analysis of alpha spectra

    International Nuclear Information System (INIS)

    Equillor, Hugo E.

    2004-01-01

    In this work we describe a methodology, developed over recent years, for the analysis of alpha-emitter spectra obtained with ion-implanted detectors, which tends to solve some of the problems that this type of spectrum presents. It is an improved methodology with respect to that described in a previous publication. The method is based on the application of a mathematical function that models the tail of an alpha peak, in order to evaluate the part of the peak that is not seen in cases of partial superposition with another peak. A calculation program that works semiautomatically, with the possibility of interactive intervention by the analyst, has been developed simultaneously and is described in detail. (author)

  17. Improving assessment of personality disorder traits through social network analysis.

    Science.gov (United States)

    Clifton, Allan; Turkheimer, Eric; Oltmanns, Thomas F

    2007-10-01

    When assessing personality disorder traits, not all judges make equally valid judgments of all targets. The present study uses social network analysis to investigate factors associated with reliability and validity in peer assessment. Participants were groups of military recruits (N=809) who acted as both targets and judges in a round-robin design. Participants completed self- and informant versions of the Multisource Assessment of Personality Pathology. Social network matrices were constructed based on reported acquaintance, and cohesive subgroups were identified. Judges who shared a mutual subgroup were more reliable and had higher self-peer agreement than those who did not. Partitioning networks into two subgroups achieved more consistent improvements than multiple subgroups. We discuss implications for multiple informant assessments.
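    The analysis above presupposes a way to partition an acquaintance network into cohesive subgroups before comparing within-group and cross-group judges. A minimal sketch, using a triangle-based notion of "strong" ties followed by connected components as a crude stand-in for the cohesive-subgroup methods of social network analysis (the edge list is hypothetical, not study data):

```python
from collections import defaultdict, deque

# Hypothetical acquaintance edges among seven judges (ids 0-6).
edges = [(0, 1), (1, 2), (0, 2),      # one tightly knit cluster
         (3, 4), (4, 5), (3, 5),      # another cluster
         (2, 3)]                      # weak bridge; judge 6 is isolated

def adjacency(edge_list):
    adj = defaultdict(set)
    for a, b in edge_list:
        adj[a].add(b)
        adj[b].add(a)
    return adj

def cohesive_subgroups(edge_list, n, min_shared=1):
    """Keep only 'strong' edges whose endpoints share at least `min_shared`
    common neighbours (i.e. sit in a triangle), then return the connected
    components of the remaining graph as the cohesive subgroups."""
    adj = adjacency(edge_list)
    strong = [(a, b) for a, b in edge_list if len(adj[a] & adj[b]) >= min_shared]
    sadj = adjacency(strong)
    seen, groups = set(), []
    for start in range(n):
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            v = queue.popleft()
            if v in comp:
                continue
            comp.add(v)
            seen.add(v)
            queue.extend(sadj[v] - comp)
        groups.append(sorted(comp))
    return groups

print(cohesive_subgroups(edges, 7))
```

    The bridge edge (2, 3) sits in no triangle and is dropped, splitting the network into the two clusters plus the isolate, the kind of partition within which the study found higher reliability and self-peer agreement.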

  18. Plant improvements through the use of benchmarking analysis

    International Nuclear Information System (INIS)

    Messmer, J.R.

    1993-01-01

    As utilities approach the turn of the century, customer and shareholder satisfaction is threatened by rising costs. Environmental compliance expenditures, coupled with low load growth and aging plant assets, are forcing utilities to operate existing resources in a more efficient and productive manner. PSI Energy set out in the spring of 1992 on a benchmarking mission to compare four major coal-fired plants against others of similar size and makeup, with the goal of finding the best operations in the country. Following extensive analysis of the 'Best in Class' operation, detailed goals and objectives were established for each plant in seven critical areas. Three critical processes requiring rework were identified and required an integrated effort from all plants. The Plant Improvement process has already resulted in higher operational productivity, increased emphasis on planning, and lower costs due to effective material management. While every company seeks improvement, goals are often set in an ambiguous manner. Benchmarking aids in setting realistic goals based on others' actual accomplishments. This paper describes how the utility's short-term goals will move them toward being a lower-cost producer

  19. Generalization in the XCSF classifier system: analysis, improvement, and extension.

    Science.gov (United States)

    Lanzi, Pier Luca; Loiacono, Daniele; Wilson, Stewart W; Goldberg, David E

    2007-01-01

    We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the spread of the eigenvalues of the autocorrelation matrix associated with each classifier is large. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase its generalization capabilities: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that while all three approaches significantly improve XCSF, the least squares approaches appear to be the best performing and most robust. Finally, we show how XCSF can be extended to include polynomial approximations.
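    The Widrow-Hoff (delta rule) update discussed above fits a few lines. The demo below trains the weights of a single linear predictor on a noiseless target; it is illustrative only, since XCSF wraps this update inside its classifier and niching machinery. The convergence caveat in the abstract corresponds to the eigenvalue spread of the input autocorrelation matrix E[xxᵀ] sampled here:

```python
import random

def widrow_hoff(w, x, y, eta=0.2):
    """Delta-rule update: nudge the weight vector against the gradient of the
    squared prediction error. x[0] is a constant 1 for the intercept term."""
    pred = sum(wi * xi for wi, xi in zip(w, x))
    err = y - pred
    return [wi + eta * err * xi for wi, xi in zip(w, x)]

# Target: y = 3 + 2*x1, i.e. one linear segment of a piecewise-linear function.
random.seed(0)
w = [0.0, 0.0]
for _ in range(2000):
    x1 = random.uniform(0.0, 1.0)
    w = widrow_hoff(w, [1.0, x1], 3.0 + 2.0 * x1)

print([round(wi, 2) for wi in w])
```

    With inputs drawn from [0, 1], the smallest eigenvalue of E[xxᵀ] is small relative to the largest, so the slowest weight mode decays at roughly (1 − η·λ_min) per step; rescaling or normalizing inputs, as one of the proposed improvements does, tightens that spread and speeds convergence.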

  20. An efficiency improvement in warehouse operation using simulation analysis

    Science.gov (United States)

    Samattapapong, N.

    2017-11-01

    In general, industry requires an efficient system for warehouse operation. Many important factors must be considered when designing an efficient warehouse system; the most important is an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. For these reasons, the researchers were interested in studying the work system and warehouse distribution. We started by collecting the data important for storage, such as information on products, on size and location, on data collection and on production, and used all this information to build a simulation model in the Flexsim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, several scenarios for alleviating that problem were generated and tested through simulation analysis. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the ability to transport products increased from 10.2% to 50.9%. Thus, it can be stated that this is an effective method for increasing efficiency in the warehouse operation.

  1. Sensitivity analysis for improving nanomechanical photonic transducers biosensors

    International Nuclear Information System (INIS)

    Fariña, D; Álvarez, M; Márquez, S; Lechuga, L M; Dominguez, C

    2015-01-01

    The achievement of high sensitivity and highly integrated transducers is one of the main challenges in the development of high-throughput biosensors. The aim of this study is to improve the final sensitivity of an opto-mechanical device to be used as a reliable biosensor. We report the analysis of the mechanical and optical properties of optical waveguide microcantilever transducers, and their dependency on device design and dimensions. The selected layout (geometry) based on two butt-coupled misaligned waveguides displays better sensitivities than an aligned one. With this configuration, we find that an optimal microcantilever thickness range between 150 nm and 400 nm would both increase microcantilever bending during the biorecognition process and raise the optical sensitivity to 4.8 × 10⁻² nm⁻¹, an order of magnitude higher than that of other similar opto-mechanical devices. Moreover, the analysis shows that single-mode behaviour of the propagating radiation is required to avoid modal interference that could lead to misinterpretation of the readout signal. (paper)

  2. Improvements in biamperometric method for remote analysis of uranium

    International Nuclear Information System (INIS)

    Palamalai, A.; Thankachan, T.S.; Balasubramanian, G.R.

    1979-01-01

    One of the titrimetric methods most suitable for remote operations with master-slave manipulators inside hot cells is the biamperometric method. The biamperometric method for the analysis of uranium reported in the literature is found to give rise to a significant bias, especially with low aliquots of uranium, and the waste volume is also considerable, which is not desirable from the point of view of radioactive waste disposal. In the present method, both the bias and the waste volume are reduced. The addition of vanadyl sulphate is also found necessary to provide a sharp end point in the titration curve. The role of vanadyl sulphate in improving the titration method has been investigated by spectrophotometry and electrometry. A new mechanism for the role of vanadyl sulphate, which is in conformity with the observations made in coulometric titration of uranium, is proposed. Interference from deliberate additions of high concentrations of stable species of fission product elements is found to be negligible. Hence this method is considered highly suitable for remote analysis of uranium in intensely radioactive reprocessing solutions for control purposes, provided radioactivity does not pose new problems. (auth.)

  3. Improved inhomogeneous finite elements for fabric reinforced composite mechanics analysis

    Science.gov (United States)

    Foye, R. L.

    1992-01-01

    There is a need to do routine stress/failure analysis of fabric reinforced composite microstructures to provide additional confidence in critical applications and guide materials development. Conventional methods of 3-D stress analysis are time consuming to set up, run and interpret. A need exists for simpler methods of modeling these structures and analyzing the models. The principal difficulty is the discrete element mesh generation problem. Inhomogeneous finite elements are worth investigating for application to these problems because they eliminate the mesh generation problem. However, there are penalties associated with these elements. Their convergence rates can be slow compared to homogeneous elements. Also, there is no accepted method for obtaining detailed stresses in the constituent materials of each element. This paper shows that the convergence rate can be significantly improved by a simple device which substitutes homogeneous elements for the inhomogeneous ones. The device is shown to work well in simple one and two dimensional problems. However, demonstration of the application to more complex two and three dimensional problems remains to be done. Work is also progressing toward more realistic fabric microstructural geometries.

  4. Receiver operating characteristic analysis improves diagnosis by radionuclide ventriculography

    International Nuclear Information System (INIS)

    Dickinson, C.Z.; Forman, M.B.; Vaugh, W.K.; Sandler, M.P.; Kronenberg, M.W.

    1985-01-01

    Receiver operating characteristic analysis (ROC) evaluates continuous variables to define diagnostic criteria for the optimal sensitivity (SENS) and specificity (SPEC) of a test. The authors studied exercise-induced chest pain (CP), ST-changes on electrocardiography (ECG) and rest-exercise gated radionuclide ventriculography (RVG) using ROC to clarify the optimal criteria for detecting myocardial ischemia due to coronary atherosclerosis (CAD). The data of 95 consecutive patients studied with coronary angiography, rest-exercise RVG and ECG were reviewed; 77 patients had 'significant' CAD (≥50% lesions). Exercise-induced CP, ECG abnormalities (ST-T shifts) and RVG abnormalities (change in ejection fraction, 2-view regional wall motion change and relative end-systolic volume) were evaluated to define the optimal SENS/SPEC of each and of the combined data. ROC curves were constructed by multiple logistic regression (MLR). By MLR, RVG alone was superior to ECG and CP. The combination of all three produced the best ROC curve for the entire group and for clinical subsets based on the number of diseased vessels and the presence or absence of prior myocardial infarction. When CP, ECG and RVG were combined, the optimal SENS/SPEC for detection of single-vessel disease was 88/86; the SENS/SPEC for 3-vessel disease was 93/95. Thus, the application of RVG for the diagnosis of myocardial ischemia is improved with the inclusion of ECG and CP data by the use of a multiple logistic regression model. ROC analysis allows clinical application of multiple data for diagnosing CAD at desired SENS/SPEC rather than by arbitrary single-standard criteria
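    An ROC operating point is obtained by sweeping a threshold over a continuous score and computing SENS/SPEC at each cut. One common rule for picking the "optimal" point is Youden's J = SENS + SPEC − 1; the abstract does not state which criterion the authors used, so the sketch below, with invented scores standing in for a combined logistic-model output, is purely illustrative:

```python
def roc_points(scores, labels):
    """Sensitivity/specificity at each candidate threshold (score >= t -> positive)."""
    pts = []
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        pts.append((t, tp / (tp + fn), tn / (tn + fp)))
    return pts

def best_threshold(scores, labels):
    """Operating point maximizing Youden's J = sensitivity + specificity - 1."""
    return max(roc_points(scores, labels), key=lambda p: p[1] + p[2] - 1)

# Hypothetical combined scores (e.g. from a logistic model of CP + ECG + RVG);
# label 1 = CAD present, 0 = absent. Not the study's data.
scores = [0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   1,    0,   1,    0,   0,   0]
t, sens, spec = best_threshold(scores, labels)
print(t, sens, spec)
```

    Sweeping the threshold like this, rather than fixing a single arbitrary cut, is what lets ROC analysis report SENS/SPEC pairs such as the 88/86 and 93/95 quoted above.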

  5. X-linked Alport syndrome associated with a synonymous p.Gly292Gly mutation alters the splicing donor site of the type IV collagen alpha chain 5 gene.

    Science.gov (United States)

    Fu, Xue Jun; Nozu, Kandai; Eguchi, Aya; Nozu, Yoshimi; Morisada, Naoya; Shono, Akemi; Taniguchi-Ikeda, Mariko; Shima, Yuko; Nakanishi, Koichi; Vorechovsky, Igor; Iijima, Kazumoto

    2016-10-01

    X-linked Alport syndrome (XLAS) is a progressive hereditary nephropathy caused by mutations in the type IV collagen alpha chain 5 gene (COL4A5). Although many COL4A5 mutations have previously been identified, pathogenic synonymous mutations have not yet been described. A family with XLAS underwent mutational analyses of COL4A5 by PCR and direct sequencing, as well as transcript analysis of potential splice site mutations. In silico analysis was also conducted to predict the disruption of splicing factor binding sites. Immunohistochemistry (IHC) of kidney biopsies was used to detect α2 and α5 chain expression. We identified a hemizygous point mutation, c.876A>T, in exon 15 of COL4A5 in the proband and his brother, which is predicted to result in a synonymous amino acid change, p.(Gly292Gly). Transcript analysis showed that this mutation potentially altered splicing because it disrupted the splicing factor binding site. The kidney biopsy of the proband showed lamellation of the glomerular basement membrane (GBM), while IHC revealed negative α5(IV) staining in the GBM and Bowman's capsule, which is typical of XLAS. This is the first report of a synonymous COL4A5 substitution being responsible for XLAS. Our findings suggest that transcript analysis should be conducted for the future correct assessment of silent mutations.

  6. An integrated sampling and analysis approach for improved biodiversity monitoring.

    Science.gov (United States)

    DeWan, Amielle A; Zipkin, Elise F

    2010-05-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  7. Improving CMD Areal Density Analysis: Algorithms and Strategies

    Directory of Open Access Journals (Sweden)

    R. E. Wilson

    2014-06-01

    Full Text Available Essential ideas, successes, and difficulties of Areal Density Analysis (ADA) for color-magnitude diagrams (CMDs) of resolved stellar populations are examined, with explanation of various algorithms and strategies for optimal performance. A CMD generation program computes theoretical datasets with simulated observational error and a solution program inverts the problem by the method of Differential Corrections (DC) so as to compute parameter values from observed magnitudes and colors, with standard error estimates and correlation coefficients. ADA promises not only impersonal results, but also significant saving of labor, especially where a given dataset is analyzed with several evolution models. Observational errors and multiple star systems, along with various single star characteristics and phenomena, are modeled directly via the Functional Statistics Algorithm (FSA). Unlike Monte Carlo, FSA is not dependent on a random number generator. Discussions include difficulties and overall requirements, such as need for fast evolutionary computation and realization of goals within machine memory limits. Degradation of results due to influence of pixelization on derivatives, Initial Mass Function (IMF) quantization, IMF steepness, low Areal Densities (A), and large variation in A are reduced or eliminated through a variety of schemes that are explained sufficiently for general application. The Levenberg-Marquardt and MMS algorithms for improvement of solution convergence are contained within the DC program. An example of convergence, which typically is very good, is shown in tabular form. A number of theoretical and practical solution issues are discussed, as are prospects for further development.

  8. Improving CMD Areal Density Analysis: Algorithms and Strategies

    Science.gov (United States)

    Wilson, R. E.

    2014-06-01

Essential ideas, successes, and difficulties of Areal Density Analysis (ADA) for color-magnitude diagrams (CMD's) of resolved stellar populations are examined, with explanation of various algorithms and strategies for optimal performance. A CMD generation program computes theoretical datasets with simulated observational error and a solution program inverts the problem by the method of Differential Corrections (DC) so as to compute parameter values from observed magnitudes and colors, with standard error estimates and correlation coefficients. ADA promises not only impersonal results, but also significant saving of labor, especially where a given dataset is analyzed with several evolution models. Observational errors and multiple star systems, along with various single star characteristics and phenomena, are modeled directly via the Functional Statistics Algorithm (FSA). Unlike Monte Carlo, FSA is not dependent on a random number generator. Discussions include difficulties and overall requirements, such as need for fast evolutionary computation and realization of goals within machine memory limits. Degradation of results due to influence of pixelization on derivatives, Initial Mass Function (IMF) quantization, IMF steepness, low Areal Densities (A), and large variation in A are reduced or eliminated through a variety of schemes that are explained sufficiently for general application. The Levenberg-Marquardt and MMS algorithms for improvement of solution convergence are contained within the DC program. An example of convergence, which typically is very good, is shown in tabular form. A number of theoretical and practical solution issues are discussed, as are prospects for further development.
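The Differential Corrections machinery with Levenberg-Marquardt damping that the abstract describes can be illustrated with a minimal damped least-squares fit. This is only a sketch of the general technique, not Wilson's ADA/DC code: the model y = a·exp(b·x), the synthetic data, and the damping schedule are all assumptions chosen for the example.

```python
import math

def lm_fit(xs, ys, a, b, n_iter=100, lam=1e-3):
    """Levenberg-Marquardt: damped Gauss-Newton corrections for y = a*exp(b*x)."""
    for _ in range(n_iter):
        # residuals and Jacobian columns with respect to (a, b)
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        ja = [math.exp(b * x) for x in xs]           # d(model)/da
        jb = [a * x * math.exp(b * x) for x in xs]   # d(model)/db
        # damped normal equations: (J^T J + lam*diag(J^T J)) * delta = J^T r
        g11 = sum(j * j for j in ja) * (1 + lam)
        g22 = sum(j * j for j in jb) * (1 + lam)
        g12 = sum(p * q for p, q in zip(ja, jb))
        h1 = sum(j * e for j, e in zip(ja, r))
        h2 = sum(j * e for j, e in zip(jb, r))
        det = g11 * g22 - g12 * g12
        da = (h1 * g22 - h2 * g12) / det
        db = (g11 * h2 - g12 * h1) / det
        new_a, new_b = a + da, b + db
        old_sse = sum(e * e for e in r)
        new_sse = sum((y - new_a * math.exp(new_b * x)) ** 2 for x, y in zip(xs, ys))
        if new_sse < old_sse:   # accept the correction, relax the damping
            a, b, lam = new_a, new_b, lam / 3
        else:                   # reject it, damp harder (smaller, safer step)
            lam *= 3
    return a, b

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
ys = [2.0 * math.exp(0.7 * x) for x in xs]   # noiseless data with a=2.0, b=0.7
a_fit, b_fit = lm_fit(xs, ys, a=1.0, b=0.1)
```

With noisy data the same loop applies; standard errors and correlation coefficients of the kind the abstract mentions then come from the inverse of J^T J at the converged solution.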

  9. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.
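As a toy illustration of why the community-level (multi-species) approach stabilizes estimates for rare species, the sketch below shrinks naive per-species detection proportions toward the community mean ("partial pooling"). The species names, counts, and pooling strength are invented, and the paper's actual framework is a hierarchical occupancy model that also separates detection from occupancy.

```python
def pooled_occupancy(detections, n_sites, strength=5.0):
    """detections: dict species -> number of survey sites where detected.
    Shrinks each species' naive occupancy proportion toward the community
    mean; `strength` plays the role of the community prior's weight."""
    raw = {sp: d / n_sites for sp, d in detections.items()}
    community_mean = sum(raw.values()) / len(raw)
    return {sp: (n_sites * p + strength * community_mean) / (n_sites + strength)
            for sp, p in raw.items()}

# invented detections at 50 sites: two common species and one rare one
detections = {"wood_thrush": 30, "ovenbird": 22, "cerulean_warbler": 1}
est = pooled_occupancy(detections, n_sites=50)
```

The rare species' estimate is pulled up toward the community mean, while the common species are pulled slightly down, which is the "borrowing strength" behavior the abstract credits for improved estimates of rare-species occurrence.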

  10. Improvements of Defect Analysis by Space-Charge Wave Spectroscopy

    Science.gov (United States)

    Voit, Kay-Michael; Hilling, Burkhard; Schmidt, Heinz-Jürgen; Imlau, Mirco

    2011-03-01

We report on our recent advancements in space-charge wave spectroscopy, which can be used to investigate defect structures in classical high-resistive semiconductors and insulators. It permits estimation of the effective trap concentrations as well as the effective donor density Neff and the product μτ of electron mobility and lifetime in the conduction band. We present a novel method of space-charge wave excitation, using a superposition of a running and a static sinusoidal illumination pattern. Thus we acquire -- in contrast to the former oscillating pattern -- a distinct direction of movement. The proposed new technique can be regarded as an effective improvement, as the theoretical analysis is no longer limited by numerous assumptions such as low modulation depth or small oscillation amplitudes. It not only overcomes these limits of the experimental configuration, improving the accuracy of SCW spectroscopy, but also provides additional information, such as the sign of the charge carriers. Financial support by the DFG within the graduate college 695 ``Nonlinearities of Optical Materials'' and the project IM 37/5-1 is gratefully acknowledged.

  11. Improved sampling and analysis of images in corneal confocal microscopy.

    Science.gov (United States)

    Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R

    2017-10-01

Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is in constant progress, there are methodological limitations in terms of sampling of images and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method for the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. Twenty-three idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined in both a manual and an automatic manner. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was no statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling method, indicating the importance of clear guidelines for the image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. This study underlines the importance of improving the analysis of the

  12. Comparative analysis of metagenomes of Italian top soil improvers

    International Nuclear Information System (INIS)

    Gigliucci, Federica; Brambilla, Gianfranco; Tozzoli, Rosangela; Michelacci, Valeria; Morabito, Stefano

    2017-01-01

Biosolids originating from Municipal Waste Water Treatment Plants are proposed as top soil improvers (TSI) for their beneficial input of organic carbon on agriculture lands. Their use to amend soil is controversial, as it may lead to the presence of emerging hazards of anthropogenic or animal origin in the environment devoted to food production. In this study, we used shotgun metagenomic sequencing as a tool to characterize the hazards related to the TSIs. The samples showed the presence of many virulence genes associated with different diarrheagenic E. coli pathotypes as well as different antimicrobial resistance-associated genes. The genes conferring resistance to Fluoroquinolones were the most relevant class of antimicrobial resistance genes observed in all the samples tested. To a lesser extent, traits associated with resistance to Methicillin in Staphylococci and genes conferring resistance to Streptothricin, Fosfomycin and Vancomycin were also identified. The most represented metal resistance genes were cobalt-zinc-cadmium related, accounting for 15–50% of the sequence reads mapping to resistance-to-compounds determinants in the different metagenomes. Moreover, the taxonomic analysis, performed by comparing compost-based samples and biosolids derived from municipal sewage-sludge treatments, divided the samples into separate populations based on the microbiota composition. The results confirm that metagenomics is efficient at detecting genomic traits associated with pathogens and antimicrobial resistance in complex matrices, and this approach can be efficiently used for the traceability of TSI samples using the microorganisms' profiles as indicators of their origin. - Highlights: • Sludge- and green-based biosolids analysed by metagenomics. • Biosolids may introduce microbial hazards in the food chain. • Metagenomics enables tracking biosolids' sources.

  13. Ethical analysis to improve decision-making on health technologies.

    Science.gov (United States)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian; Lühmann, Dagmar; Mäkelä, Marjukka; Velasco-Garrido, Marcial; Autti-Rämö, Ilona

    2008-08-01

    Health technology assessment (HTA) is the multidisciplinary study of the implications of the development, diffusion and use of health technologies. It supports health-policy decisions by providing a joint knowledge base for decision-makers. To increase its policy relevance, HTA tries to extend beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs, and their implementation may also have significant impact on people other than the patient. These are essential considerations for health policy. The ethics model is structured around key ethical questions rather than philosophical theories, to be applicable to different cultures and usable by non-philosophers. Integrating ethical considerations into HTA can improve the relevance of technology assessments for health care and health policy in both developed and developing countries.

  14. Comparative analysis of metagenomes of Italian top soil improvers.

    Science.gov (United States)

    Gigliucci, Federica; Brambilla, Gianfranco; Tozzoli, Rosangela; Michelacci, Valeria; Morabito, Stefano

    2017-05-01

Biosolids originating from Municipal Waste Water Treatment Plants are proposed as top soil improvers (TSI) for their beneficial input of organic carbon on agriculture lands. Their use to amend soil is controversial, as it may lead to the presence of emerging hazards of anthropogenic or animal origin in the environment devoted to food production. In this study, we used shotgun metagenomic sequencing as a tool to characterize the hazards related to the TSIs. The samples showed the presence of many virulence genes associated with different diarrheagenic E. coli pathotypes as well as different antimicrobial resistance-associated genes. The genes conferring resistance to Fluoroquinolones were the most relevant class of antimicrobial resistance genes observed in all the samples tested. To a lesser extent, traits associated with resistance to Methicillin in Staphylococci and genes conferring resistance to Streptothricin, Fosfomycin and Vancomycin were also identified. The most represented metal resistance genes were cobalt-zinc-cadmium related, accounting for 15-50% of the sequence reads mapping to resistance-to-compounds determinants in the different metagenomes. Moreover, the taxonomic analysis, performed by comparing compost-based samples and biosolids derived from municipal sewage-sludge treatments, divided the samples into separate populations based on the microbiota composition. The results confirm that metagenomics is efficient at detecting genomic traits associated with pathogens and antimicrobial resistance in complex matrices, and this approach can be efficiently used for the traceability of TSI samples using the microorganisms' profiles as indicators of their origin. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software which are integrated with relevant analysis parameters specific to SCaN assets and SCaN supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, 2) will provide an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
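Among the listed improvements is an augmented Brent's method of root finding. A simplified bracketed secant/bisection hybrid in the spirit of Brent's method is sketched below; real Brent's method also uses inverse quadratic interpolation and stricter step-acceptance tests, and the test equation here is an arbitrary stand-in, not an ITACA computation.

```python
import math

def hybrid_root(f, a, b, tol=1e-12, max_iter=200):
    """Bracketed root finder in the spirit of Brent's method: try a fast
    secant step and fall back to bisection whenever the step would leave
    the bracket, so convergence stays guaranteed."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("root is not bracketed by [a, b]")
    for _ in range(max_iter):
        x = b - fb * (b - a) / (fb - fa) if fb != fa else 0.5 * (a + b)
        if not (min(a, b) < x < max(a, b)):   # interpolation left the bracket
            x = 0.5 * (a + b)                 # -> take a safe bisection step
        fx = f(x)
        if fa * fx <= 0:                      # keep the half that still brackets
            b, fb = x, fx
        else:
            a, fa = x, fx
        if abs(fx) < tol or abs(b - a) < tol:
            return x
    return 0.5 * (a + b)

# arbitrary stand-in equation: cos(x) = x has its root near 0.739085
root = hybrid_root(lambda x: math.cos(x) - x, 0.0, 1.0)
```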

  16. A multivariate analysis of factors affecting adoption of improved ...

    African Journals Online (AJOL)

This paper analyzes the synergies/tradeoffs involved in the adoption of improved varieties of multiple crops in the mixed crop-livestock production systems of the highlands of Ethiopia. A multivariate probit (MVP) model involving a system of four equations for the adoption decision of improved varieties of barley, potatoes, ...

  17. A Multivariate Analysis of Factors Affecting Adoption of Improved ...

    African Journals Online (AJOL)

and significant effects on the likelihood of adopting improved varieties of barley and wheat and vice versa - indicating synergistic effects among the adoption decisions of the two groups of crops. On the other hand, the area share of improved varieties of wheat negatively and significantly affects the chances of using ...

  18. A novel joint analysis framework improves identification of differentially expressed genes in cross disease transcriptomic analysis

    Directory of Open Access Journals (Sweden)

    Wenyi Qin

    2018-02-01

Full Text Available Motivation: Detecting differentially expressed (DE) genes between disease and normal control groups is one of the most common analyses of genome-wide transcriptomic data. Since most studies have few samples, researchers have used meta-analysis to group different datasets for the same disease. Even then, in many cases the statistical power is still insufficient. Taking into account the fact that many diseases share the same disease genes, it is desirable to design a statistical framework that can identify diseases' common and specific DE genes simultaneously to improve the identification power. Results: We developed a novel empirical Bayes based mixture model to identify DE genes in a specific study by leveraging the shared information across multiple different disease expression data sets. The effectiveness of joint analysis was demonstrated through comprehensive simulation studies and two real data applications. The simulation results showed that our method consistently outperformed single data set analysis and two other meta-analysis methods in identification power. In real data analysis, our method overall demonstrated better identification power in detecting DE genes and prioritized more disease-related genes and disease-related pathways than single data set analysis. Over 150% more disease-related genes are identified by our method in application to Huntington's disease. We expect that our method will provide researchers a new way of utilizing available data sets from different diseases when the sample size of the focused disease is limited.
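The idea of borrowing strength across related disease data sets can be illustrated with a much simpler precision-weighted shrinkage of per-study effect estimates toward their common mean. This James-Stein-flavored sketch is not the paper's empirical Bayes mixture model, and all numbers are invented.

```python
def eb_shrink(estimates, variances):
    """Precision-weighted shrinkage of per-study effect estimates toward their
    common mean ("borrowing strength"). The between-study variance tau^2 is
    estimated by a method-of-moments step and floored at zero."""
    m = len(estimates)
    mean = sum(estimates) / m
    tau2 = max(0.0, sum((e - mean) ** 2 for e in estimates) / m
                    - sum(variances) / m)
    shrunk = []
    for e, v in zip(estimates, variances):
        w = tau2 / (tau2 + v)        # weight on the study's own estimate
        shrunk.append(w * e + (1 - w) * mean)
    return shrunk

# invented effect estimates of one gene in three related diseases,
# each with (invented) sampling variance 0.1
shrunk = eb_shrink([2.0, 0.5, 1.1], [0.1, 0.1, 0.1])
```

Noisy extreme estimates move toward the shared mean while the overall average is preserved, which is the stabilizing effect that joint analysis exploits when one disease's sample size is small.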

  19. Analysis and improvement of security of energy smart grids

    International Nuclear Information System (INIS)

    Halimi, Halim

    2014-01-01

The Smart grid is the next generation power grid: a new self-healing, self-activating form of electricity network that integrates power-flow control, increased quality of electricity, and energy reliability, energy efficiency and energy security using information and communication technologies. Communication networks play a critical role in the smart grid, as its intelligence is built on information exchange across the power grid. Its two-way communication and electricity flow enable monitoring, prediction and management of energy usage. Upgrading an existing power grid into a smart grid requires an intelligent and secure communication infrastructure. The main goal of this dissertation is therefore to propose a new architecture and implementation of algorithms for analysis and improvement of the security and reliability of the smart grid. In power transmission segments of the smart grid, wired communications are usually adopted to ensure robustness of the backbone power network. In contrast, for a power distribution grid, wireless communications provide many benefits such as low cost high speed links, easy setup of connections among different devices/appliances, and so on. Wireless communications are usually more vulnerable to security attacks than wired ones. Developing appropriate wireless communication architecture and its security measures is extremely important for a smart grid system. This research addresses physical layer security in a wireless smart grid. Hence a defensive Quorum-based algorithm is proposed to ensure physical security in wireless communication. A new security architecture for the smart grid that supports privacy preservation, data aggregation and access control is defined. This architecture consists of two parts. In the first part we propose to use an efficient and privacy-preserving aggregation scheme (EPPA), which aggregates real-time data of consumers by a Local Gateway. During aggregation the privacy of consumers is

  20. Improved Extreme Learning Machine based on the Sensitivity Analysis

    Science.gov (United States)

    Cui, Licheng; Zhai, Huawei; Wang, Benchao; Qu, Zengtang

    2018-03-01

Extreme learning machines (ELM), including improved variants, remain weak on some points, such as computational complexity and learning error. After deep analysis, and drawing on the importance of hidden nodes in SVM, a novel sensitivity analysis method is proposed that matches people's cognitive habits. Based on this, an improved ELM is proposed that can remove hidden nodes before the learning error criterion is met, and can efficiently manage the number of hidden nodes so as to improve performance. Comparative tests show that it performs better in learning time, accuracy and other respects.

  1. Does ownership of improved dairy cow breeds improve child nutrition? A pathway analysis for Uganda.

    Science.gov (United States)

    Kabunga, Nassul S; Ghosh, Shibani; Webb, Patrick

    2017-01-01

The promotion of livestock production is widely believed to support enhanced diet quality and child nutrition, but the empirical evidence for this causal linkage remains narrow and ambiguous. This study examines whether adoption of improved dairy cow breeds is linked to farm-level outcomes that translate into household-level benefits, including improved child nutrition outcomes, in Uganda. Using nationwide data from Uganda's National Panel Survey, propensity score matching is used to create an unbiased counterfactual, based on observed characteristics, to assess the net impacts of improved dairy cow adoption. All estimates were tested for robustness and sensitivity to variations in observable and unobservable confounders. Results based on the matched samples showed that households adopting improved dairy cows significantly increased milk yield, by over 200% on average. This resulted in higher milk sales and milk intakes, demonstrating the potential of this agricultural technology both to integrate households into modern value chains and to increase households' access to animal source foods. Use of improved dairy cows increased household food expenditures by about 16%. Although undernutrition was widely prevalent in the study sample and in matched households, the adoption of improved dairy cows was associated with lower child stunting in adopter households. In terms of scale, results also showed that holding larger farms tends to support adoption, and that it also stimulates the household's ability to achieve gains from adoption, which can translate into enhanced nutrition.
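The propensity score matching step can be sketched as nearest-neighbor matching on the estimated scores, averaging matched outcome differences to get the average treatment effect on the treated (ATT). In practice the scores come from a model such as a logit, matching is followed by balance diagnostics, and robustness is checked as the authors describe; the scores, outcomes, and household labels below are invented.

```python
def att_nearest_neighbor(treated, control):
    """treated/control: lists of (propensity_score, outcome) tuples.
    Match each treated unit to the control unit with the closest score
    (with replacement) and average the outcome differences (ATT)."""
    diffs = []
    for ps_t, y_t in treated:
        ps_c, y_c = min(control, key=lambda c: abs(c[0] - ps_t))
        diffs.append(y_t - y_c)
    return sum(diffs) / len(diffs)

# hypothetical adopter / non-adopter households: (propensity score, milk yield)
adopters = [(0.8, 9.0), (0.6, 7.5), (0.7, 8.0)]
non_adopters = [(0.79, 4.0), (0.62, 3.5), (0.71, 3.8), (0.2, 2.0)]
att = att_nearest_neighbor(adopters, non_adopters)
```

Matching on the score rather than on raw covariates is what makes each adopter comparable to a similar non-adopter, so the averaged difference estimates the net impact of adoption rather than pre-existing differences between households.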

  2. Does ownership of improved dairy cow breeds improve child nutrition? A pathway analysis for Uganda.

    Directory of Open Access Journals (Sweden)

    Nassul S Kabunga

Full Text Available The promotion of livestock production is widely believed to support enhanced diet quality and child nutrition, but the empirical evidence for this causal linkage remains narrow and ambiguous. This study examines whether adoption of improved dairy cow breeds is linked to farm-level outcomes that translate into household-level benefits, including improved child nutrition outcomes, in Uganda. Using nationwide data from Uganda's National Panel Survey, propensity score matching is used to create an unbiased counterfactual, based on observed characteristics, to assess the net impacts of improved dairy cow adoption. All estimates were tested for robustness and sensitivity to variations in observable and unobservable confounders. Results based on the matched samples showed that households adopting improved dairy cows significantly increased milk yield, by over 200% on average. This resulted in higher milk sales and milk intakes, demonstrating the potential of this agricultural technology both to integrate households into modern value chains and to increase households' access to animal source foods. Use of improved dairy cows increased household food expenditures by about 16%. Although undernutrition was widely prevalent in the study sample and in matched households, the adoption of improved dairy cows was associated with lower child stunting in adopter households. In terms of scale, results also showed that holding larger farms tends to support adoption, and that it also stimulates the household's ability to achieve gains from adoption, which can translate into enhanced nutrition.

  3. Alternative Frameworks for Improving Government Organizational Performance: A Comparative Analysis

    National Research Council Canada - National Science Library

    Simon, Cary

    1997-01-01

    .... Six major frameworks emerging in the U.S. since 1980, applicable to the public sector, and designed to enhance organizational change toward improved performance are reviewed and analyzed: Total Quality; 'Excellence...

  4. Using external data sources to improve audit trail analysis.

    OpenAIRE

    Herting, R. L.; Asaro, P. V.; Roth, A. C.; Barnes, M. R.

    1999-01-01

    Audit trail analysis is the primary means of detection of inappropriate use of the medical record. While audit logs contain large amounts of information, the information required to determine useful user-patient relationships is often not present. Adequate information isn't present because most audit trail analysis systems rely on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system wh...

  5. Improved Methods for Pitch Synchronous Linear Prediction Analysis of Speech

    OpenAIRE

    劉, 麗清

    2015-01-01

Linear prediction (LP) analysis has been applied to speech systems over the last few decades. The LP technique is well suited for speech analysis due to its ability to approximately model the speech production process. Hence LP analysis has been widely used for speech enhancement, low-bit-rate speech coding in cellular telephony, speech recognition, and characteristic parameter extraction (vocal tract resonance frequencies, and the fundamental frequency, called pitch). However, the performance of the co...

  6. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they cause high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allow improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  7. Why economic analysis of health system improvement interventions matters

    Directory of Open Access Journals (Sweden)

    Edward Ivor Broughton

    2016-10-01

Full Text Available There is little evidence to direct health systems toward providing efficient interventions to address medical errors, defined as an unintended act of omission or commission, or one not executed as intended, that may or may not cause harm to the patient but does not achieve its intended outcome. We believe that lack of guidance on the most efficient way to reduce adverse events and improve the quality of health care limits the scale-up of health system improvement interventions. Challenges to economic evaluation of these interventions include defining and implementing improvement interventions in different settings with high fidelity, capturing all of the positive and negative effects of the intervention, using process measures of effectiveness rather than health outcomes, and determining the full cost of the intervention and all economic consequences of its effects. However, health system improvement interventions should be treated similarly to individual medical interventions and undergo rigorous economic evaluation to provide actionable evidence to guide policy-makers in decisions of resource allocation for improvement activities among other competing demands for health care resources.

  8. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis.

    Science.gov (United States)

    Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha

    2017-07-01

Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), beginning March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Score (STQ score). An interrupted time series linear regression model compared the STQ score during the 14 months before the CQIP implementation to the first 14 months after. During the 29-month study period 3,822 patients met study criteria, and 1,028 of them needed one or more of the five studied interventions. All five endpoints had a significant increase between the pre-CQI and post-CQI periods (p < 0.05), with an immediate improvement of +6.1% (p=0.017) and sustained monthly improvements in care delivery, improving at a rate of 0.7% per month (p=0.028). The SAMU experience demonstrates the utility of a responsive, data-driven quality improvement programme to yield significant immediate and sustained improvements in pre-hospital care for trauma in Rwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the
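The interrupted time series model described here can be written as a segmented regression with a level-change dummy and a slope-change interaction term, fit by ordinary least squares. The sketch below deliberately echoes the abstract's +6.1% level change and 0.7%/month slope change, but the data, month count, and score scale are synthetic inventions, not SAMU's records.

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (Gaussian elimination)."""
    k = len(X[0])
    A = [[sum(row[p] * row[q] for row in X) for q in range(k)] for p in range(k)]
    b = [sum(row[p] * yi for row, yi in zip(X, y)) for p in range(k)]
    for col in range(k):                      # forward elimination, partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

t0 = 14                                       # month the CQI programme begins
months = list(range(28))                      # 14 months before, 14 after
# synthetic quality score: level 70, flat baseline trend, then a +6.1 jump
# and a +0.7/month slope change after the intervention (echoing the abstract)
y = [70 + (6.1 + 0.7 * (t - t0) if t >= t0 else 0.0) for t in months]
# design: intercept, baseline trend, level-change dummy, slope-change term
X = [[1.0, float(t), 1.0 if t >= t0 else 0.0, float(t - t0) if t >= t0 else 0.0]
     for t in months]
beta = ols(X, y)
level_change, slope_change = beta[2], beta[3]
```

The third and fourth coefficients are exactly the "immediate improvement" and "monthly improvement" quantities the abstract reports; separating them from the baseline trend is what distinguishes the intervention's effect from pre-existing drift.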

  9. Method for improving accuracy in full evaporation headspace analysis.

    Science.gov (United States)

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. In conventional full evaporation headspace analysis, the pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction technique is also automated and practical, which could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
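
Quantitation in multiple headspace extraction rests on the geometric decay of successive extraction peaks: the total analyte signal is the sum of a geometric series estimated from the first two peaks. A minimal sketch of that arithmetic (the peak areas below are invented for illustration, not data from the paper):

```python
# Multiple headspace extraction (MHE): successive extraction peaks decay
# geometrically, A_i = A_1 * k**(i-1), so the total analyte signal is the
# geometric series sum A_1 / (1 - k), with k estimated as A_2 / A_1.
a1, a2 = 1000.0, 400.0   # assumed peak areas of extractions 1 and 2
k = a2 / a1              # per-extraction decay ratio
total = a1 / (1 - k)     # extrapolated total signal
```

In practice k is fitted by regression over several extractions rather than from two peaks, but the extrapolation principle is the same.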

  10. A Note on Improved Homotopy Analysis Method for Solving the Jeffery-Hamel Flow

    OpenAIRE

    Motsa, Sandile Sydney; Sibanda, Precious; Marewo, Gerald T.; Shateyi, Stanford

    2010-01-01

    This paper presents the solution of the nonlinear equation that governs the flow of a viscous, incompressible fluid between two converging-diverging rigid walls using an improved homotopy analysis method. The results obtained by this new technique show that the improved homotopy analysis method converges much faster than both the homotopy analysis method and the optimal homotopy asymptotic method. This improved technique is observed to be much more accurate than these traditional ...

  11. Error analysis to improve the speech recognition accuracy on ...

    Indian Academy of Sciences (India)

    Telugu language is one of the most widely spoken south Indian languages. In the proposed Telugu speech recognition system, errors obtained from decoder are analysed to improve the performance of the speech recognition system. Static pronunciation dictionary plays a key role in the speech recognition accuracy.

  12. An Analysis of Appropriateness of Improved Rice Technology for ...

    African Journals Online (AJOL)

    The study was carried out in selected states of South Western Nigeria in order to investigate the appropriateness of improved rice technology for women farmers. A purposive sampling technique was used to select 320 women farmers from the study area. Data collection was carried out using structured interview schedule.

  13. Improved omit set displacement recoveries in dynamic analysis

    Science.gov (United States)

    Allen, Tom; Cook, Greg; Walls, Bill

    1993-09-01

    Two related methods for improving the dependent (OMIT set) displacements after performing a Guyan reduction are presented. The theoretical bases for the methods are derived. The NASTRAN DMAP ALTERs used to implement the methods in a NASTRAN execution are described. Data are presented that verify the methods and the NASTRAN DMAP ALTERs.

  14. Does Competition Improve Public School Efficiency? A Spatial Analysis

    Science.gov (United States)

    Misra, Kaustav; Grimes, Paul W.; Rogers, Kevin E.

    2012-01-01

    Advocates for educational reform frequently call for policies to increase competition between schools because it is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. Researchers examining this issue are confronted with difficulties in defining reasonable measures…

  15. Improving Capabilities for Irregular Warfare. Volume 2. Capabilities Analysis

    Science.gov (United States)

    2007-08-01

    and Iraq were limited to six orbits at any one time. Only six operator stations were available at Nellis AFB in Nevada, where Predators were...improve this system will need to hit the ground running with a well thought-out plan. The cost of “learn as you go” will be high: hunger and patience

  16. A Multivariate Analysis of Factors Affecting Adoption of Improved ...

    African Journals Online (AJOL)

    Internal factors that affect the adoption and use of agricultural technologies include farmers' attitude ... Single probit and logit models are often employed to model discrete choices such as adoption of improved ..... suggests that there are unobservable factors affecting both choices and reveals an association after controlling ...

  17. Rapid economic analysis of northern hardwood stand improvement options

    Science.gov (United States)

    William B. Leak

    1980-01-01

    Data and methodology are provided for projecting basal area, diameter, volumes, and values by product for northern hardwood stands, and for determining the rate of return on stand improvement investments. The method is rapid, requires a minimum amount of information, and should prove useful for on-the-ground economic analyses.

  18. Error analysis to improve the speech recognition accuracy on ...

    Indian Academy of Sciences (India)

    measures, error-rate and Word Error Rate (WER) by application of the proposed method. Keywords. Speech recognition; pronunciation dictionary modification method; error analysis; F-measure. 1. Introduction. Speech is one of the easiest modes of ...

  19. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    National Research Council Canada - National Science Library

    Hill, Raymond

    2001-01-01

    ... Laboratory, Logistics Research Division, Logistics Readiness Branch to propose a research agenda entitled, "Models, Web-based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance...

  20. Optimizing Bus Passenger Complaint Service through Big Data Analysis: Systematized Analysis for Improved Public Sector Management

    Directory of Open Access Journals (Sweden)

    Weng-Kun Liu

    2016-12-01

    Full Text Available With the advances in industry and commerce, passengers have become more accepting of environmental sustainability issues; thus, more people now choose to travel by bus. Government administration constitutes an important part of bus transportation services, as the government gives the right-of-way to transportation companies, allowing them to provide services. When these services are of poor quality, passengers may lodge complaints. The increase in consumer awareness and developments in wireless communication technologies have made it possible for passengers to easily and immediately submit complaints about transportation companies to government institutions, which has brought drastic changes to the supply–demand chain comprising the public sector, transportation companies, and passengers. This study proposed the use of big data analysis technology, including systematized case assignment and data visualization, to improve management processes in the public sector and optimize customer complaint services. Taichung City, Taiwan, was selected as the research area. There, the customer complaint management process in the public sector was improved, effectively solving such issues as station-skipping, allowing the public sector to fully grasp the service level of transportation companies, improving the sustainability of bus operations, and supporting the sustainable development of the public sector–transportation company–passenger supply chain.

  1. Analysis and improvement of face detection based on surf cascade

    Science.gov (United States)

    Hu, Siquan; Zhang, Caihong; Liu, Lei

    2017-08-01

    This paper aims to study limitations of the commonly employed boosting cascade framework, focusing on factors such as data, features, weak classifiers, and stages. A set of novel experiments was carried out to show the relationships. The model contains three key points: SURF features, weak classifiers based on logistic regression, and an AUC-based cascade learning algorithm. This paper creatively adds cross-validation to the logistic regression, which improves accuracy and greatly speeds up convergence. Eventually only five stages and about 100 weak classifiers are needed. The frontal face detector improves the rejection rate to 99% for the first three stages, greatly decreases the number of false positives, and achieves comparable performance among non-CNN techniques on the FDDB dataset.
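
Cross-validated training of a logistic-regression weak classifier, as the abstract describes, can be sketched in plain NumPy. This is a generic k-fold scheme on synthetic data, not the authors' SURF-cascade implementation (the regularisation grid, learning rate, and data are assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lam, iters=500, lr=0.1):
    """L2-regularised logistic regression via batch gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) / len(y) + lam * w
        w -= lr * grad
    return w

def cv_select_lambda(X, y, lambdas, k=5):
    """Pick the regulariser with the best mean held-out accuracy."""
    folds = np.array_split(np.arange(len(y)), k)
    scores = []
    for lam in lambdas:
        accs = []
        for f in folds:
            mask = np.ones(len(y), bool)
            mask[f] = False  # hold fold f out
            w = train_logreg(X[mask], y[mask], lam)
            accs.append(np.mean((sigmoid(X[f] @ w) > 0.5) == y[f]))
        scores.append(np.mean(accs))
    return lambdas[int(np.argmax(scores))]

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                          # stand-in features
y = (X @ np.array([2.0, -1.0, 0.5]) > 0).astype(float)  # separable labels
best_lam = cv_select_lambda(X, y, [0.0, 0.01, 0.1, 1.0])
w = train_logreg(X, y, best_lam)
acc = np.mean((sigmoid(X @ w) > 0.5) == y)
```

In a real cascade the held-out score would be the AUC used by the stage-learning algorithm rather than plain accuracy.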

  2. Mass Spectrometry Vapor Analysis for Improving Explosives Detection Canine Proficiency

    Science.gov (United States)

    2017-02-10

    ionization (SESI), dielectric barrier discharge ionization (DBDI), selected-ion-flow-tube (SIFT), and proton transfer reaction...sensors in addition to as service teams, will improve training efficiency and operational performance. In support of canine training, the Department of...for both materials, but transfer efficiency into the source may be poor. Instrument response also saturated when analyzing cyclohexanone. These

  3. Improving the usefulness of accounting data in financial analysis

    Directory of Open Access Journals (Sweden)

    A Saville

    2004-04-01

    Full Text Available Accounting practices are flawed.  As a consequence, the accounting data generated by firms are generally open to interpretation, often misleading and sometimes patently false.  Yet, financial analysts place tremendous confidence in accounting data when appraising investments and investment strategies.  The implications of financial analysis based on questionable information are numerous, and range from inexact analysis to acute investment error.  To rectify this situation, this paper identifies a set of simple, yet highly effective corrective measures, which have the capacity to move accounting practice into a realm wherein accounting starts to ‘count what counts’.  The net result would be delivery of accounting data that more accurately reflect firms’ economic realities and, as such, are more useful in the task of financial analysis.

  4. Using external data sources to improve audit trail analysis.

    Science.gov (United States)

    Herting, R L; Asaro, P V; Roth, A C; Barnes, M R

    1999-01-01

    Audit trail analysis is the primary means of detecting inappropriate use of the medical record. While audit logs contain large amounts of information, the information required to determine useful user-patient relationships is often not present, because most audit trail analysis systems rely on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system in which the information available in the medical record is augmented with external information sources such as database sources, Lightweight Directory Access Protocol (LDAP) server sources, and World Wide Web (WWW) database sources. We discuss several issues that arise when combining the information from each of these disparate information sources. Furthermore, we explain how the enhanced person-specific information obtained can be used to determine user-patient relationships that might signify a motive for inappropriately accessing a patient's medical record.

  5. The Subjectivity Problem: Improving Triangulation Approaches in Metaphor Analysis Studies

    Directory of Open Access Journals (Sweden)

    Sonya L. Armstrong

    2011-06-01

    Full Text Available Metaphor analysis procedures for uncovering participant conceptualizations have been well-established in qualitative research settings since the early 1980s; however, one common criticism of metaphor analysis is the trustworthiness of the findings. Namely, accurate determination of the conceptual metaphors held by participants based on the investigation of linguistic metaphors has been identified as a methodological issue because of the subjectivity involved in the interpretation; that is, because they are necessarily situated in specific social and cultural milieus, meanings of particular metaphors are not universally constructed nor understood. In light of these critiques, this article provides examples of two different triangulation methods that can be employed to supplement the trustworthiness of the findings when metaphor analysis methodologies are used.

  6. A quality improvement study using fishbone analysis and an electronic medical records intervention to improve care for children with asthma.

    Science.gov (United States)

    Gold, Jonathan; Reyes-Gastelum, David; Turner, Jane; Davies, H Dele

    2014-01-01

    Despite expert guidelines, gaps persist in the quality of care for children with asthma. This study sought to identify barriers and potential interventions to improve compliance with national asthma prevention guidelines at a single academic pediatric primary care clinic. Using the plan-do-check-act (PDCA) quality improvement framework and fishbone analysis, a group of key stakeholders identified several barriers to consistent asthma processes and possible interventions. Two interventions were implemented using the electronic medical record (EMR). Physician documentation of asthma quality measures was analyzed before the intervention and at 2 subsequent time points over 16 months. Documentation of asthma action plans improved significantly in the core group. Use of PDCA and fishbone analysis in conjunction with embedded EMR tools can improve asthma care in a pediatric primary care setting.

  7. Rasch Analysis for Psychometric Improvement of Science Attitude Rating Scales

    Science.gov (United States)

    Oon, Pey-Tee; Fan, Xitao

    2017-01-01

    Students' attitude towards science (SAS) is often a subject of investigation in science education research. Survey of rating scale is commonly used in the study of SAS. The present study illustrates how Rasch analysis can be used to provide psychometric information of SAS rating scales. The analyses were conducted on a 20-item SAS scale used in an…

  8. Improving the Computational Morphological Analysis of a Swahili ...

    African Journals Online (AJOL)

    approach to the morphological analysis of Swahili. We particularly focus our discussion on its ability to retrieve lemmas for word forms and evaluate it as a tool for corpus-based dictionary compilation. Keywords: LEXICOGRAPHY, MORPHOLOGY, CORPUS ANNOTATION, LEMMATIZATION, MACHINE LEARNING, SWAHILI ...

  9. Temporal Land Cover Analysis for Net Ecosystem Improvement

    Energy Technology Data Exchange (ETDEWEB)

    Ke, Yinghai; Coleman, Andre M.; Diefenderfer, Heida L.

    2013-04-09

    We delineated 8 watersheds contributing to previously defined river reaches within the 1,468-km2 historical floodplain of the tidally influenced lower Columbia River and estuary. We assessed land-cover change at the watershed, reach, and restoration site scales by reclassifying remote-sensing data from the National Oceanic and Atmospheric Administration Coastal Change Analysis Program’s land cover/land change product into forest, wetland, and urban categories. The analysis showed a 198.3 km2 loss of forest cover during the first 6 years of the Columbia Estuary Ecosystem Restoration Program, 2001–2006. Total measured urbanization in the contributing watersheds of the estuary during the full 1996–2006 change analysis period was 48.4 km2. Trends in forest gain/loss and urbanization differed between watersheds. Wetland gains and losses were within the margin of error of the satellite imagery analysis. No significant land-cover change was measured at restoration sites, although it was visible in aerial imagery; the 30-m land-cover product may therefore not be appropriate for assessment of early-stage wetland restoration. These findings suggest that floodplain restoration sites in reaches downstream of watersheds with decreasing forest cover will be subject to increased sediment loads, and those downstream of urbanization will experience effects of increased impervious surfaces on hydrologic processes.
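
Per-class area change between two reclassified land-cover rasters, the core computation in this record, reduces to counting pixels per class and date. A toy sketch with an invented 3x3 raster pair (real C-CAP grids are 30-m pixels, hence the cell area used):

```python
import numpy as np

# Toy classified rasters (1 = forest, 2 = wetland, 3 = urban).
CELL_AREA_KM2 = (30 * 30) / 1e6  # one 30-m pixel expressed in km^2

lc_1996 = np.array([[1, 1, 2],
                    [1, 3, 2],
                    [1, 1, 2]])
lc_2006 = np.array([[1, 3, 2],
                    [3, 3, 2],
                    [1, 1, 2]])

def class_area(raster, cls):
    """Total area (km^2) covered by one class in a classified raster."""
    return np.count_nonzero(raster == cls) * CELL_AREA_KM2

forest_loss = class_area(lc_1996, 1) - class_area(lc_2006, 1)
urban_gain = class_area(lc_2006, 3) - class_area(lc_1996, 3)
```

A full change analysis would also cross-tabulate the two dates to see which class each lost forest pixel became, but the per-class totals above are what the watershed-scale figures in the abstract summarize.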

  10. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied to preprocess the original microtubule image, and the Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were obvious. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
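
A grey-level co-occurrence matrix and the texture statistics derived from it can be sketched directly. This toy version uses a tiny 4-level image rather than a real microtubule image and omits the BEMD preprocessing step:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Normalised grey-level co-occurrence matrix for one pixel offset."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def texture_features(p):
    """Contrast, energy, and homogeneity of a normalised GLCM."""
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    energy = np.sum(p ** 2)
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    return contrast, energy, homogeneity

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
p = glcm(img)
contrast, energy, homogeneity = texture_features(p)
```

Real analyses typically average such features over several offsets and directions; `skimage.feature.graycomatrix` offers a vectorised equivalent.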

  11. Improving Family Forest Knowledge Transfer through Social Network Analysis

    Science.gov (United States)

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  12. Stiffness Analysis and Improvement of Bolt-Plate Contact Assemblies

    DEFF Research Database (Denmark)

    Pedersen, Niels Leergaard; Pedersen, Pauli

    2008-01-01

    and the plate is extended by the possibility of designing a gap, that is, a nonuniform distance between the bolt and plate before prestressing. Designing the gap function generates the possibility for a better stress field by which the stiffness of the bolt is lowered, and at the same time the stiffness...... of the members is increased. Both of these changes have a positive influence on the lifetime of the connections. From designing a varying gap size distribution, it is found that the stiffness become a function of the loading. It is shown that similar improvements in the stiffness ratio between the bolt...

  13. Analysis and improvement of vehicle information sharing networks

    Science.gov (United States)

    Gong, Hang; He, Kun; Qu, Yingchun; Wang, Pu

    2016-06-01

    Based on large-scale mobile phone data, mobility demand was estimated and locations of vehicles were inferred in the Boston area. Using the spatial distribution of vehicles, we analyze the vehicle information sharing network generated by the vehicle-to-vehicle (V2V) communications. Although a giant vehicle cluster is observed, the coverage and the efficiency of the information sharing network remain limited. Consequently, we propose a method to extend the information sharing network's coverage by adding long-range connections between targeted vehicle clusters. Furthermore, we employ the optimal design strategy discovered in square lattice to improve the efficiency of the vehicle information sharing network.
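
Measuring the coverage gain from one added long-range connection, as proposed in this record, needs only connected-component analysis. A minimal BFS sketch on a toy six-vehicle graph (the clusters and the linking rule are invented for illustration, not taken from the Boston data):

```python
from collections import deque

def components(n, edges):
    """Connected components of an undirected graph via BFS."""
    adj = {i: [] for i in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    seen, comps = set(), []
    for s in range(n):
        if s in seen:
            continue
        q, comp = deque([s]), []
        seen.add(s)
        while q:
            u = q.popleft()
            comp.append(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    q.append(v)
        comps.append(comp)
    return comps

# Two V2V clusters out of mutual radio range.
edges = [(0, 1), (1, 2), (3, 4), (4, 5)]
before = components(6, edges)
# Add one long-range connection between the two largest clusters.
big = sorted(before, key=len, reverse=True)[:2]
edges.append((big[0][0], big[1][0]))
after = components(6, edges)
```

The paper's optimisation additionally chooses *which* cluster pairs to bridge so as to maximise coverage per added link; the sketch shows only the coverage bookkeeping.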

  14. An improvement analysis on video compression using file segmentation

    Science.gov (United States)

    Sharma, Shubhankar; Singh, K. John; Priya, M.

    2017-11-01

    From the past two decades the extreme evolution of the Internet has lead a massive rise in video technology and significantly video consumption over the Internet which inhabits the bulk of data traffic in general. Clearly, video consumes that so much data size on the World Wide Web, to reduce the burden on the Internet and deduction of bandwidth consume by video so that the user can easily access the video data.For this, many video codecs are developed such as HEVC/H.265 and V9. Although after seeing codec like this one gets a dilemma of which would be improved technology in the manner of rate distortion and the coding standard.This paper gives a solution about the difficulty for getting low delay in video compression and video application e.g. ad-hoc video conferencing/streaming or observation by surveillance. Also this paper describes the benchmark of HEVC and V9 technique of video compression on subjective oral estimations of High Definition video content, playback on web browsers. Moreover, this gives the experimental ideology of dividing the video file into several segments for compression and putting back together to improve the efficiency of video compression on the web as well as on the offline mode.

  15. SIFT Based Vein Recognition Models: Analysis and Improvement

    Directory of Open Access Journals (Sweden)

    Guoqing Wang

    2017-01-01

    Full Text Available Scale-Invariant Feature Transform (SIFT) is being investigated more and more as a way to realize less-constrained hand vein recognition systems. Contrast enhancement (CE), which compensates for deficient dynamic range, is a must in SIFT-based frameworks to improve performance. However, our experiments reveal a negative influence of CE on SIFT matching. We show that the number of keypoints extracted by gradient-based detectors increases greatly under different CE methods, while the matching of the extracted invariant descriptors is negatively influenced in terms of Precision-Recall (PR) and Equal Error Rate (EER). Rigorous experiments with state-of-the-art and other CE methods adopted in published SIFT-based hand vein recognition systems demonstrate this influence. Furthermore, an improved SIFT model that imports the RootSIFT kernel and a Mirror Match Strategy into a unified framework is proposed to exploit the positive change in keypoints and make up for the negative influence brought by CE.
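
The RootSIFT kernel imported by the proposed model is a simple descriptor transform: L1-normalise each SIFT descriptor, then take element-wise square roots, so that Euclidean matching between the results approximates the Hellinger kernel. A sketch on random stand-in descriptors (not real SIFT output):

```python
import numpy as np

def root_sift(desc, eps=1e-12):
    """Map SIFT descriptors to RootSIFT: L1-normalise each row, then
    take the element-wise square root. Euclidean distance between the
    resulting vectors corresponds to the Hellinger kernel on the
    original histograms."""
    desc = desc / (np.abs(desc).sum(axis=1, keepdims=True) + eps)
    return np.sqrt(desc)

rng = np.random.default_rng(0)
sift = rng.random((5, 128))        # 5 fake non-negative 128-D descriptors
rs = root_sift(sift)
norms = np.linalg.norm(rs, axis=1)  # each RootSIFT vector is unit L2
```

Because SIFT descriptors are non-negative histograms, the transformed vectors automatically have unit L2 norm, so existing Euclidean matchers can be reused unchanged.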

  16. Analysis of radial electric field in LHD towards improved confinement

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, M.; Ida, K.; Sanuki, H.; Itoh, K.; Narihara, K.; Tanaka, K.; Kawahata, K.; Ohyabu, N.

    2001-05-01

    The radial electric field (E_r) properties in LHD have been investigated to provide guidance towards improved confinement with possible E_r transition and bifurcation. The ambipolar E_r is obtained from the neoclassical flux based on analytical formulae. This approach is appropriate for clarifying ambipolar E_r properties over a wide range of temperature and density in a more transparent way. Comparison between the calculated E_r and experimental measurements has shown qualitatively good agreement, such as in the threshold density for the transition from ion root to electron root. The calculations also reproduce well the experimentally observed tendencies that the electron root becomes possible with increasing temperatures even at higher density and that the ion root is enhanced at higher density. Based on the usefulness of this approach for analyzing E_r in LHD, calculations over a wide range have been performed to clarify the parameter region of interest where multiple solutions of E_r can exist. This is the region where E_r transition and bifurcation may be realized, as already experimentally confirmed in CHS. The systematic calculations give a comprehensive understanding of the experimentally observed E_r properties, which indicates an optimum path towards improved confinement. (author)

  18. Improving E-Business Design through Business Model Analysis

    OpenAIRE

    Ilayperuma, Tharaka

    2010-01-01

    To a rapidly increasing degree, traditional organizational structures evolve in large parts of the world towards online business using modern Information and Communication Technology (ICT) capabilities. For efficient applications of inter-organizational information systems, the alignment between business and ICT is a key factor. In this context, business analysis using business modelling can be regarded as a first step in designing economically sustainable e-business solutions. This thesis ex...

  19. In House HSV PCR, Process Improvement and Cost Effectiveness Analysis

    Science.gov (United States)

    2017-09-15

    Poster, 09/15/2017. Title: Cost-Analysis: In-house HSV PCR capabilities.

  20. Improved methods for dependent failure analysis in PSA

    International Nuclear Information System (INIS)

    Ballard, G.M.; Games, A.M.

    1988-01-01

    The basic design principle used in ensuring the safe operation of nuclear power plant is defence in depth. This normally takes the form of redundant equipment and systems which provide protection even if a number of equipment failures occur. Such redundancy is particularly effective in ensuring that multiple, independent equipment failures with the potential for jeopardising reactor safety will be rare events. However, the achievement of high reliability has served to highlight the potentially dominant role of multiple, dependent failures of equipment and systems. Analysis of reactor operating experience has shown that dependent failure events are the major contributors to safety system failures and reactor incidents and accidents. In parallel, PSA studies have shown that the results of a safety analysis are sensitive to assumptions made about the dependent failure (CCF) probability for safety systems. Thus a Westinghouse analysis showed that increasing system dependent failure probabilities by a factor of 5 led to a factor of 4 increase in core. This paper particularly refers to the engineering concepts underlying dependent failure assessment, touching briefly on aspects of data. It is specifically not the intent of our work to develop a new mathematical model of CCF but to aid the use of existing models.
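
The flavour of the "existing models" referred to can be illustrated with the beta-factor model, in which a fraction beta of each train's unavailability is treated as a common cause that defeats all redundant trains at once. The numbers below are assumed order-of-magnitude values, not data from the paper:

```python
# Beta-factor common-cause sketch for a two-train redundant system.
Q_TRAIN = 1e-3   # single-train unavailability (assumed)
BETA = 0.1       # common-cause fraction (assumed, typical order)

# Both trains fail from independent causes...
q_independent = ((1 - BETA) * Q_TRAIN) ** 2
# ...or one shared cause fails both at once.
q_ccf = BETA * Q_TRAIN
q_system = q_independent + q_ccf
```

Even with this modest beta, the common-cause term dominates the system unavailability by orders of magnitude, which is exactly why PSA results are so sensitive to the assumed CCF probability.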

  1. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
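
The equivalent hydraulic diameter at the heart of this method is the classic D_h = 4A / P_w. A sketch for a square-pitch rod subchannel (the pitch and rod diameter below are assumed, PWR-like values, not the paper's geometry):

```python
import math

def hydraulic_diameter(flow_area, wetted_perimeter):
    """Classic hydraulic diameter D_h = 4A / P_w."""
    return 4.0 * flow_area / wetted_perimeter

# Interior subchannel of a square rod array: pitch p, rod diameter d (m).
p, d = 0.0126, 0.0095
area = p**2 - math.pi * d**2 / 4.0  # coolant flow area between four rods
perimeter = math.pi * d             # wetted perimeter (rod surfaces only)
d_h = hydraulic_diameter(area, perimeter)
```

For a circular pipe the formula collapses to the pipe diameter itself, a useful sanity check; the paper's contribution is assigning *cell-specific* equivalent diameters from an input velocity profile rather than from the geometric wetted perimeter.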

  2. Optimization to improve precision in neutron activation analysis

    International Nuclear Information System (INIS)

    Yustina Tri Handayani

    2010-01-01

    The level of precision or accuracy achieved in an analysis should satisfy both general requirements and customer needs. In presenting the results of the analysis, the level of precision is expressed as uncertainty; the general requirement is the Horwitz prediction. Factors affecting the uncertainty in Neutron Activation Analysis (NAA) include the mass of the sample, the mass of the standard, the concentration in the standard, the counts of the sample, the counts of the standard, and the counting geometry. Therefore, to achieve the expected level of precision, these parameters need to be optimized. A standard concentration of similar material is applied as the basis of calculation; in the calculation, NIST SRM 2704 is applied for sediment samples. The mass of the sample, irradiation time, and cooling time can be modified to obtain the expected uncertainty. The predictions show that the levels of precision for Al, V, Mg, Mn, K, Na, As, Cr, Co, Fe, and Zn satisfy the Horwitz requirement. The predicted counts and standard deviations for Mg-27 and Zn-65 were higher than the actual values because of overlap of the Mg-27 and Mn-54 peaks and of the Zn-65 and Fe-59 peaks. The precision level of Ca is greater than the Horwitz prediction, since the microscopic cross section, the probability of radiation emission of Ca-49, and the gamma spectrometer efficiency at 3084 keV are relatively small. Increased precision can only be achieved by extending the counting time and increasing the number of samples, because the other values are fixed. The prediction results are in accordance with experimental results. (author)
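
Combining the listed uncertainty sources in quadrature and comparing the result against the Horwitz prediction can be sketched as follows; the individual uncertainty values are invented for illustration and the counting terms use Poisson statistics (relative uncertainty 1/sqrt(N)):

```python
import math

def combined_relative_uncertainty(*rel_u):
    """Combine independent relative uncertainties in quadrature."""
    return math.sqrt(sum(u * u for u in rel_u))

def horwitz_rsd(mass_fraction):
    """Horwitz predicted relative standard deviation, as a fraction:
    RSD% = 2^(1 - 0.5*log10(C)), i.e. 0.02 * C**(-0.1505),
    with C the dimensionless mass fraction."""
    return 0.02 * mass_fraction ** (-0.1505)

u_total = combined_relative_uncertainty(
    0.001,                # sample mass (assumed)
    0.002,                # standard mass / concentration (assumed)
    1 / math.sqrt(1e4),   # sample counts: 10,000 net counts
    1 / math.sqrt(4e4),   # standard counts: 40,000 net counts
)
```

For a 1-ppm analyte the Horwitz prediction is about 16% RSD, so the budget above (dominated by the sample counting term) would comfortably satisfy it; extending the counting time shrinks only the two Poisson terms, matching the abstract's conclusion.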

  3. [Predictors of symptomatic improvement in eating disorders. Preliminary analysis].

    Science.gov (United States)

    Cygankiewicz, Patrycja; Solecka, Dorota; Pilecki, Maciej Wojciech; Józefik, Barbara

    2012-01-01

    The article discusses the preliminary results of a follow-up study carried out in 2009-2010 on former patients with a diagnosis of anorexia nervosa and bulimia nervosa, first seen in 2001-2004 at the Department of Child and Adolescent Psychiatry, the Jagiellonian University Medical College in Krakow. At that time they had been taking part in a research project whose aim was to define the relationships among the psychopathological picture of eating disorders, self-image and family relations, and also the influence of socio-cultural factors. The aim of the current study is to attempt to define factors influencing the course and prognosis of eating disorders in the studied group. Results from the Eating Disorder Inventory Questionnaire (EDI) and the Polish version of the Family Assessment Measure (KOR) in the first study were juxtaposed with the clinical state and parameters of psychosocial functioning of the studied women assessed on the basis of the follow-up study. In the studied group, 13 girls suffered from anorexia nervosa restricting type, 6 from anorexia nervosa binge-eating/purging type, and 6 from bulimia. There was complete symptomatic improvement in 12 persons (48%), subclinical symptoms continued to be observed in 9 persons (36%), and 4 persons (16%) met full diagnostic criteria for eating disorders. The most favourable course was observed in the group with a diagnosis of anorexia nervosa restricting type; the least favourable was observed in the group with a diagnosis of bulimia.

  4. Improvement of testing and maintenance based on fault tree analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2000-01-01

    Testing and maintenance of safety equipment is an important issue, which significantly contributes to safe and efficient operation of a nuclear power plant. In this paper a method, which extends the classical fault tree with time, is presented. Its mathematical model is represented by a set of equations, which include time requirements defined in the house event matrix. The house event matrix is a representation of house events switched on and off through the discrete points of time. It includes house events, which switch parts of the fault tree on and off in accordance with the status of the plant configuration. The time-dependent top event probability is calculated by fault tree evaluation. The arrangement of component outages is determined on the basis of minimizing mean system unavailability. The results show that application of the method may improve the time placement of testing and maintenance activities of safety equipment. (author)
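    The house event matrix mechanism can be sketched on a toy fault tree (the tree structure, failure probabilities, and matrix below are hypothetical, not the paper's model):

```python
# Toy fault tree: TOP = (A AND H1) OR (B AND H2), where H1 and H2 are
# house events switched on/off at discrete time points by the house event matrix.
house_matrix = {
    "H1": [1, 1, 0, 0],  # subtree with basic event A active at t0, t1
    "H2": [0, 0, 1, 1],  # subtree with basic event B active at t2, t3
}

p_fail = {"A": 0.01, "B": 0.02}  # basic-event failure probabilities

def top_probability(t):
    """Time-dependent top event probability at discrete time t."""
    pa = p_fail["A"] * house_matrix["H1"][t]
    pb = p_fail["B"] * house_matrix["H2"][t]
    return 1 - (1 - pa) * (1 - pb)  # OR gate of independent events

profile = [top_probability(t) for t in range(4)]
mean_unavailability = sum(profile) / len(profile)
```

    Arranging component outages then amounts to searching over admissible house event matrices for the one that minimizes the mean system unavailability.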

  5. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    Directory of Open Access Journals (Sweden)

    Shao Jie

    2014-01-01

    Full Text Available A modeling based on the improved Elman neural network (IENN) is proposed to analyze the nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of the hidden layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA) with two-tone signal and broadband signals as input have shown that the proposed behavioral modeling can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with Volterra-Laguerre (VL) model, Chebyshev neural network (CNN) model, and basic Elman neural network (BENN) model, the proposed model has better performance.

  6. Numerical analysis of modeling based on improved Elman neural network.

    Science.gov (United States)

    Jie, Shao; Li, Wang; WeiSong, Zhao; YaQin, Zhong; Malekian, Reza

    2014-01-01

    A modeling based on the improved Elman neural network (IENN) is proposed to analyze the nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of the hidden layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA) with two-tone signal and broadband signals as input have shown that the proposed behavioral modeling can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with Volterra-Laguerre (VL) model, Chebyshev neural network (CNN) model, and basic Elman neural network (BENN) model, the proposed model has better performance.
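    The Chebyshev activation can be illustrated with the standard three-term recurrence; this is a generic sketch of such a hidden layer, not the authors' exact IENN implementation:

```python
import numpy as np

def chebyshev_basis(x, n_neurons):
    """Evaluate Chebyshev polynomials T_0..T_{n-1} at x via the recurrence
    T_0(x) = 1, T_1(x) = x, T_{k+1}(x) = 2x*T_k(x) - T_{k-1}(x)."""
    T = [np.ones_like(x), x]
    for _ in range(2, n_neurons):
        T.append(2 * x * T[-1] - T[-2])
    return np.stack(T[:n_neurons])

# Hidden layer output for inputs in [-1, 1]: each neuron applies one basis
# function to the (scaled) input instead of a sigmoid.
x = np.linspace(-1, 1, 5)
H = chebyshev_basis(x, 4)  # shape: (4 neurons, 5 samples)
```

    The network output is then a trainable linear combination of these basis responses, which is what makes the SSE-vs-neuron-count curves mentioned above easy to sweep.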

  7. Improving Between-Shot Fusion Data Analysis with Parallel Structures

    Energy Technology Data Exchange (ETDEWEB)

    CHET NIETER

    2005-07-27

    In the Phase I project we concentrated on three technical objectives to demonstrate the feasibility of the Phase II project: (1) the development of a parallel MDSplus data handler, (2) the parallelization of existing fusion data analysis packages, and (3) the development of techniques to automatically generate parallelized code using pre-compiler directives. We summarize the results of the Phase I research for each of these objectives below. We also describe below additional accomplishments related to the development of the TaskDL and mpiDL parallelization packages.

  8. Recent improvements in plutonium gamma-ray analysis using MGA

    International Nuclear Information System (INIS)

    Ruhter, W.D.; Gunnink, R.

    1992-06-01

    MGA is a gamma-ray spectrum analysis program for determining relative plutonium isotopic abundances. It can determine plutonium isotopic abundances to better than 1% using a high-resolution, low-energy, planar germanium detector and measurement times of ten minutes or less. We have modified MGA to allow determination of absolute plutonium isotopic abundances in solutions. With calibration of a detector using a known solution concentration in a well-defined sample geometry, plutonium solution concentrations can be determined. MGA can also analyze a second, high-energy spectrum to determine fission product abundances relative to total plutonium. For the high-energy gamma-ray measurements we have devised a new hardware configuration in which both the low- and high-energy gamma-ray detectors are mounted in a single cryostat, thereby reducing the weight and volume of the detector systems. We describe the detector configuration and the performance of the MGA program for determining plutonium concentrations in solutions and fission product abundances.

  9. Analysis and Measures to Improve Waste Management in Schools

    Directory of Open Access Journals (Sweden)

    Elena Cristina Rada

    2016-08-01

    Full Text Available Assessing waste production in schools highlights the contribution of school children and school staff to the total amount of waste generated in a region, as well as any poor practices of recycling (the so-called separate collection of waste) in schools by the students, which could be improved through educational activities. Educating young people regarding the importance of environmental issues is essential, since instilling the right behavior in school children is also beneficial to the behavior of their families. The way waste management was carried out in different schools in Trento (northern Italy) was analyzed: a primary school, a secondary school, and three high schools were taken as case studies. The possible influence of the age of the students and of the various activities carried out within the schools on the different behaviors in separating waste was also evaluated. The results showed that the production of waste did not only depend on the size of the institutes and on the number of occupants, but, especially, on the type of activities carried out in addition to the ordinary classes and on the habits of both pupils and staff. In the light of the results obtained, some corrective measures were proposed to schools, aimed at increasing the awareness of the importance of the right behavior in waste management by students and the application of good practices of recycling.

  10. Systematic wavelength selection for improved multivariate spectral analysis

    Science.gov (United States)

    Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.

    1995-01-01

    Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g. the concentration of an analyte such as glucose in blood or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the model's fitness of the determination for the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
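    The genetic-algorithm wavelength selection can be sketched with a toy search over binary wavelength masks; the fitness F = f(cost, performance) below is entirely synthetic (the "informative" wavelengths and the weights are invented for the example):

```python
import random

random.seed(0)

# Synthetic setup: a subset's performance is how many of the (hypothetical)
# informative wavelengths {2, 5, 7} it includes; its cost is the subset size.
N_WAVELENGTHS = 10
INFORMATIVE = {2, 5, 7}

def fitness(mask):
    """F = f(cost, performance): reward informative wavelengths, penalize size."""
    performance = len(INFORMATIVE & {i for i, b in enumerate(mask) if b})
    cost = sum(mask)
    return performance - 0.1 * cost

def evolve(pop_size=30, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_WAVELENGTHS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_WAVELENGTHS)
            child = a[:cut] + b[cut:]           # one-point crossover
            i = random.randrange(N_WAVELENGTHS)
            child[i] ^= 1                       # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

    In the patented method the surviving masks are accumulated into a count spectrum and thresholded; here the single best mask stands in for that step.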

  11. Performance analysis of PV plants: Optimization for improving profitability

    International Nuclear Information System (INIS)

    Díez-Mediavilla, M.; Alonso-Tristán, C.; Rodríguez-Amigo, M.C.; García-Calderón, T.; Dieste-Velasco, M.I.

    2012-01-01

    Highlights: ► Real PV production from two 100 kW p grid-connected installations is conducted. ► Data sets on production were collected over an entire year. ► Economic results highlight the importance of properly selecting the system components. ► Performance of PV plants is directly related to improvements of all components. - Abstract: A study is conducted of real PV production from two 100 kW p grid-connected installations located in the same area, both of which experience the same fluctuations in temperature and radiation. Data sets on production were collected over an entire year and both installations were compared under various levels of radiation. The installations were assembled with mono-Si panels, mounted on the same support system, and the power supply was equal for the inverter and the measurement system; the same parameters were also employed for the wiring, and electrical losses were calculated in both cases. The results, in economic terms, highlight the importance of properly selecting the system components and the design parameters for maximum profitability.

  12. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...... to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs...

  13. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...... to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs...... beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...

  14. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets, and is efficient in that it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. Its computational efficiency is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate system inequalities, which is useful in the reliability estimation of capacitated networks.
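    The minimal-cutset notion can be illustrated by brute force on a small hypothetical network (the paper's algorithm is far more efficient; this sketch only shows what a minimal cutset is):

```python
from itertools import combinations

# Toy distribution network (hypothetical): edges between source 's' and sink 't'.
edges = [("s", "a"), ("s", "b"), ("a", "t"), ("b", "t"), ("a", "b")]

def connected(active_edges, src="s", dst="t"):
    """Depth-first search over the surviving edges."""
    adj = {}
    for u, v in active_edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    seen, frontier = {src}, [src]
    while frontier:
        node = frontier.pop()
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return dst in seen

def minimal_cutsets():
    """All minimal edge sets whose removal disconnects src from dst."""
    cutsets = []
    for k in range(1, len(edges) + 1):  # ascending size => subsets found first
        for cand in combinations(edges, k):
            remaining = [e for e in edges if e not in cand]
            if not connected(remaining):
                # keep only if no proper subset is already a cutset
                if not any(set(c) < set(cand) for c in cutsets):
                    cutsets.append(cand)
    return cutsets
```

    For this network the search yields four minimal cutsets, e.g. both source edges or both sink edges; system failure occurs exactly when some minimal cutset fails entirely.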

  15. Pooled calibrations and retainment of outliers improve chemical analysis

    DEFF Research Database (Denmark)

    Andersen, Jens; Alfaloje, Haedar S.H.

    2012-01-01

    Analytical chemistry has a large responsibility in society, and credibility and reliability are important concepts associated with chemical analysis. Metrology and Quality Assurance (QA) are key areas of interest in contemporary research. Quality in measurements is illustrated by a series...... of experiments with several analytical technologies comprising of ICP-MS, GC-MS and AAS. The scientific methodology relies on the concept of reproducibility that depends on type of analyte and type of apparatus. By applying the principle of pooled calibrations it is shown that the performance of the apparatus...... indicate that the procedures outlined in the Eurachem/CITAC Guide are of tremendous value to analytical sciences because they direct researcher's attention towards the concept of consensus values rather than towards true values. Introduction of certified reference materials (CRM’s) in metrology has...

  16. An improved method for thin layer chromatographic analysis of saponins.

    Science.gov (United States)

    Sharma, Om P; Kumar, Neeraj; Singh, Bikram; Bhat, Tej K

    2012-05-01

    Analysis of saponins by thin layer chromatography (TLC) is reported. The solvent system was n-butanol:water:acetic acid (84:14:7). Detection of saponins on the TLC plates after development and air-drying was done by immersion in a suspension of sheep erythrocytes, followed by washing off the excess blood on the plate surface. Saponins appeared as white spots against a pink background. The protocol provided specific detection of saponins in the saponin-enriched extracts from Aesculus indica (Wall. ex Camb.) Hook.f., Lonicera japonica Thunb., Silene inflata Sm., Sapindus mukorossi Gaertn., Chlorophytum borivilianum Santapau & Fernandes, Asparagus adscendens Roxb., Asparagus racemosus Willd., Agave americana L., Camellia sinensis [L.] O. Kuntze. The protocol is convenient, inexpensive, does not require any corrosive chemicals and provides specific detection of saponins. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    , and their implementation may also have significant impact on people other than the patient. These are essential considerations for health policy. The ethics model is structured around key ethical questions rather than philosophical theories, to be applicable to different cultures and usable by non-philosophers...... beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...

  18. Database improvements for motor vehicle/bicycle crash analysis

    Science.gov (United States)

    Lusk, Anne C; Asgarzadeh, Morteza; Farvid, Maryam S

    2015-01-01

    Background Bicycling is healthy but needs to be safer for more people to bike. Police crash templates are designed for reporting crashes between motor vehicles, not between vehicles and bicycles. Where written/drawn bicycle-crash-scene details exist, they are not entered into spreadsheets. Objective To assess which bicycle-crash-scene data might be added to spreadsheets for analysis. Methods Police crash templates from 50 states were analysed. Reports for 3350 motor vehicle/bicycle crashes (2011) were obtained for the New York City area and 300 cases selected (with drawings and on roads with sharrows, bike lanes, cycle tracks and no bike provisions). Crashes were redrawn and new bicycle-crash-scene details were coded and entered into the existing spreadsheet. The association between severity of injuries and bicycle-crash-scene codes was evaluated using multiple logistic regression. Results Police templates only consistently include pedal-cyclist and helmet. Bicycle-crash-scene coded variables for templates could include: 4 bicycle environments, 18 vehicle impact-points (opened-doors and mirrors), 4 bicycle impact-points, motor vehicle/bicycle crash patterns, in/out of the bicycle environment and bike/relevant motor vehicle categories. A test of including these variables suggested that, with bicyclists who had minor injuries as the control group, bicyclists on roads with bike lanes riding outside the lane had a lower likelihood of severe injuries (OR 0.40, 95% CI 0.16 to 0.98) compared with bicyclists riding on roads without bicycle facilities. Conclusions Police templates should include additional bicycle-crash-scene codes for entry into spreadsheets. Crash analysis, including with big data, could then be conducted on bicycle environments, motor vehicle potential impact points/doors/mirrors, bicycle potential impact points, motor vehicle characteristics, location and injury. PMID:25835304
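    The reported odds ratios can be reproduced in form (not in data) with the standard 2x2-table computation and Wald confidence interval; the counts below are hypothetical:

```python
import math

# Hypothetical 2x2 table (not the paper's data): severe vs minor injuries,
# for riders outside a bike lane vs riders on roads with no bicycle facility.
a, b = 8, 100   # severe, minor injuries (outside bike lane)
c, d = 20, 100  # severe, minor injuries (no facility)

odds_ratio = (a / b) / (c / d)

# Wald 95% CI on the log odds ratio.
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

    A multiple logistic regression, as used in the study, additionally adjusts such ratios for the other coded variables.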

  19. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal

    Science.gov (United States)

    2014-05-01

    Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal, May 14-15, 2014. Dr. Ying Zhao, Dr. Douglas J. MacKinnon. The report applies System Self-awareness (SSA) and Lexical Link Analysis (LLA) methods to extract relations among PE, MDAP, and ACATII programs and to extract costs.

  20. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Directory of Open Access Journals (Sweden)

    Gafurov Andrey

    2018-01-01

    Full Text Available The specific nature of high-rise investment projects entailing long-term construction, high risks, etc. implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the “Project analysis scenario” flowchart, improving quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping for better cost-benefit project analysis provided the broad range of risks in high-rise construction; analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in implementation of high-rise projects.

  1. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Science.gov (United States)

    Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir

    2018-03-01

    The specific nature of high-rise investment projects entailing long-term construction, high risks, etc. implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, improving quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping for better cost-benefit project analysis provided the broad range of risks in high-rise construction; analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in implementation of high-rise projects.
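    Two of the listed ingredients, the weighted average cost of capital and dynamic (discounted) cost-benefit values, can be sketched as follows (the figures are illustrative, not from the article):

```python
def wacc(equity, debt, cost_equity, cost_debt, tax_rate):
    """Weighted average cost of capital: E/V*Re + D/V*Rd*(1 - Tc)."""
    v = equity + debt
    return equity / v * cost_equity + debt / v * cost_debt * (1 - tax_rate)

def npv(cash_flows, rate):
    """Discount a stream of yearly cash flows (year 0 first) at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical high-rise project: 60/40 capital structure, long payback.
r = wacc(equity=60.0, debt=40.0, cost_equity=0.15, cost_debt=0.08, tax_rate=0.20)
project_npv = npv([-100.0, 40.0, 45.0, 50.0], r)
```

    Scenario and sensitivity analysis then amount to re-running `npv` while perturbing the cash flows and the discount rate derived from the risk map.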

  2. Delamination Modeling of Composites for Improved Crash Analysis

    Science.gov (United States)

    Fleming, David C.

    1999-01-01

    Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated, including a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for accurate solution are great and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structures in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.

  3. Improving Power System Stability Using Transfer Function: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    G. Shahgholian

    2017-10-01

    Full Text Available In this paper, a small-signal dynamic model of a single-machine infinite-bus (SMIB) power system that includes an IEEE type-ST1 excitation system and a PSS based on a transfer function structure is presented. The effects of changes in the operating condition of a power system on dynamic performance have been examined. The dynamic performance of the closed-loop system is analyzed based on its eigenvalues. The effectiveness of the parameter changes on dynamic stability is verified by simulation results. Three types of PSS have been considered for analysis: (a) the derivative PSS, (b) the lead-lag PSS or conventional PSS, and (c) the proportional-integral-derivative PSS. The objective function is formulated to increase the damping ratio of the electromechanical mode eigenvalues. Simulation results show that the PID-PSS performs better, with less overshoot and less settling time compared with the CPSS and DPSS, under different load operation and significant system parameter variation conditions.
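    The damping ratio used as the objective can be computed directly from a mode eigenvalue; the eigenvalues below are hypothetical, chosen only to illustrate the effect of adding a PSS:

```python
def damping_ratio(eig):
    """zeta = -Re(lambda) / |lambda| for an oscillatory mode eigenvalue."""
    return -eig.real / abs(eig)

# Hypothetical electromechanical swing mode before and after adding a PSS.
mode_no_pss = complex(-0.2, 7.0)    # lightly damped, ~1.1 Hz
mode_with_pss = complex(-1.5, 6.8)  # same mode, shifted left by the PSS

z1 = damping_ratio(mode_no_pss)
z2 = damping_ratio(mode_with_pss)
```

    Maximizing this ratio over the PSS parameters is what moves the electromechanical eigenvalues away from the imaginary axis.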

  4. Improving Human/Autonomous System Teaming Through Linguistic Analysis

    Science.gov (United States)

    Meszaros, Erica L.

    2016-01-01

    An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogenous teams. Procedures exist for licensing pilots to operate in the national airspace system and current work is being done to define methods for validating the function of autonomous systems, however there is no method in place for assessing the interaction of these two disparate systems. Moreover, currently these systems are operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can lead to increased usability and allow for operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  5. System Engineering Analysis For Improved Scout Business Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    Van Slyke, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-01-30

    monitoring of content that is accessible. The study examines risks associated with information security, technological change and continued popularity of Scouting. Mitigation is based on system functions that are defined. The approach to developing an improved system for facilitating Boy Scout leader functions was iterative, with insights into capabilities coming in the course of working through the use cases and sequence diagrams.

  6. Using robust statistics to improve neutron activation analysis results

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Ticianelli, Regina B.; Figueiredo, Ana Maria G.

    2011-01-01

    Neutron activation analysis (NAA) is an analytical technique where an unknown sample is submitted to a neutron flux in a nuclear reactor, and its elemental composition is calculated by measuring the induced activity produced. By using the relative NAA method, one or more well-characterized samples (usually certified reference materials - CRMs) are irradiated together with the unknown ones, and the concentration of each element is then calculated by comparing the areas of the gamma ray peaks related to that element. When two or more CRMs are used as reference, the concentration of each element can be determined by several different ways, either using more than one gamma ray peak for that element (when available), or using the results obtained in the comparison with each CRM. Therefore, determining the best estimate for the concentration of each element in the sample can be a delicate issue. In this work, samples from three CRMs were irradiated together and the elemental concentration in one of them was calculated using the other two as reference. Two sets of peaks were analyzed for each element: a smaller set containing only the literature-recommended gamma-ray peaks and a larger one containing all peaks related to that element that could be quantified in the gamma-ray spectra; the most recommended transition was also used as a benchmark. The resulting data for each element was then reduced using up to five different statistical approaches: the usual (and not robust) unweighted and weighted means, together with three robust means: the Limitation of Relative Statistical Weight, Normalized Residuals and Rajeval. The resulting concentration values were then compared to the certified value for each element, allowing for discussion on both the performance of each statistical tool and on the best choice of peaks for each element. (author)
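    A simplified sketch in the spirit of the robust means discussed above, inflating the uncertainty of discrepant results rather than rejecting them outright (this is an illustration, not the exact Normalized Residuals or Rajeval algorithm):

```python
def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean."""
    weights = [1 / s ** 2 for s in sigmas]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

def robust_mean(values, sigmas, r_max=2.0, max_iter=20):
    """Inflate the uncertainty of the most discrepant result until every
    normalized residual is below r_max, then take the weighted mean."""
    sigmas = list(sigmas)  # work on a copy
    for _ in range(max_iter):
        m = weighted_mean(values, sigmas)
        residuals = [(v - m) / s for v, s in zip(values, sigmas)]
        worst = max(range(len(values)), key=lambda i: abs(residuals[i]))
        if abs(residuals[worst]) <= r_max:
            break
        sigmas[worst] *= abs(residuals[worst]) / r_max  # inflate, don't reject
    return weighted_mean(values, sigmas)

# Five concentration results for one element (different peaks/CRMs); the last
# value is an outlier that would drag a plain mean upward.
vals = [10.1, 10.2, 9.9, 10.0, 14.0]
sigs = [0.2, 0.2, 0.2, 0.2, 0.2]
m_plain = weighted_mean(vals, sigs)
m_robust = robust_mean(vals, sigs)
```

    The robust estimate stays near the consensus of the four consistent results while still letting the discrepant peak contribute with a reduced weight.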

  7. IMPROVING FINANCIAL ANALYSIS OF ORGANIZATIONS IN ORDER TO PREVENT THEIR INSOLVENCY

    Directory of Open Access Journals (Sweden)

    V. N. Alferov

    2012-01-01

    Full Text Available The current regulatory analysis of the financial condition of insolvent organizations has some disadvantages and does not account for the features of analysis based on consolidated financial statements under IFRS and GAAP. In this work, on the basis of a comparative analysis of the financial condition of a number of large Russian companies, calculated from their accounting statements prepared under Russian accounting standards, IFRS and GAAP, proposals are developed to improve the analysis of the financial condition of insolvent institutions.

  8. IMPROVING FINANCIAL ANALYSIS OF ORGANIZATIONS IN ORDER TO PREVENT THEIR INSOLVENCY

    Directory of Open Access Journals (Sweden)

    V. N. Alferov

    2013-01-01

    Full Text Available The current regulatory analysis of the financial condition of insolvent organizations has some disadvantages and does not account for the features of analysis based on consolidated financial statements under IFRS and GAAP. In this work, on the basis of a comparative analysis of the financial condition of a number of large Russian companies, calculated from their accounting statements prepared under Russian accounting standards, IFRS and GAAP, proposals are developed to improve the analysis of the financial condition of insolvent institutions.

  9. Dr. Dahl publishes article on improving intelligence analysis through the use of case studies

    OpenAIRE

    2017-01-01

    NSA’s Dr. Erik Dahl has recently published an article in Intelligence and National Security entitled “Getting beyond Analysis by Anecdote: Improving Intelligence Analysis through the Use of Case Studies.” Dahl argues that although American intelligence officials have been trying since the 9/11 attacks to improve the quality of their analysis, they have so far failed to make much use of one of the most common methods used in the social sciences: case study analysis. Through better u...

  10. Logistics analysis to Improve Deployability (LOG-AID): Field Experiment/Results

    National Research Council Canada - National Science Library

    Evers, Kenneth

    2000-01-01

    .... Under sponsorship of the Air Force Research Laboratory Logistics Readiness Branch (AFRL/HESR), the Synergy team analyzed the current wing-level deployment process as part of the Logistics Analysis to Improve Deployability (LOG-AID) program...

  11. Functional improvement after carotid endarterectomy: demonstrated by gait analysis and acetazolamide stress brain perfusion SPECT

    International Nuclear Information System (INIS)

    Kim, J. S.; Kim, G. E.; Yoo, J. Y.; Kim, D. G.; Moon, D. H.

    2005-01-01

    Scientific documentation of neurologic improvement following carotid endarterectomy (CEA) has not been established. The purpose of this prospective study was to investigate whether CEA performed for internal carotid artery flow lesions improves gait and cerebrovascular hemodynamic status in patients with gait disturbance. We prospectively performed pre- and post-CEA gait analysis and acetazolamide stress brain perfusion SPECT (Acz-SPECT) with Tc-99m ECD in 91 patients (M/F: 81/10, mean age: 64.1 y) who had gait disturbance before undergoing CEA. Gait performance was assessed using a Vicon 370 motion analyzer. Gait improvement after CEA was correlated with cerebrovascular hemodynamic change as well as symptom duration. Twelve hemiparetic stroke patients (M/F: 9/3, mean age: 51 y) who did not undergo CEA served as controls and underwent gait analysis twice at a one-week interval to evaluate whether repeat testing of gait performance shows a learning effect. Of the 91 patients, 73 (80%) showed gait improvement (change of gait speed > 10%) and 42 (46%) showed marked improvement (change of gait speed > 20%), whereas no improvement was observed in the control group on repeat testing. Postoperative cerebrovascular hemodynamic improvement was noted in 49 (54%) of the 91 patients. Gait improvement was markedly greater in the group with cerebrovascular hemodynamic improvement than in the no-change group (p<0.05). Marked gait improvement and cerebrovascular hemodynamic improvement were noted in 53% and 61%, respectively, of the patients with a symptom history of less than 3 months, compared with 31% and 24% of the patients with a history longer than 3 months (p<0.05). Marked gait improvement was obtained in patients whose cerebrovascular hemodynamic status improved on Acz-SPECT after CEA. These results suggest that functional improvement such as gait can result from improved perfusion of the misery perfusion area, which remains viable for a longer period than previously reported in the literature

  12. On-treatment analysis of the Improved Reduction of Outcomes: Vytorin Efficacy International Trial (IMPROVE-IT).

    Science.gov (United States)

    Blazing, Michael A; Giugliano, Robert P; de Lemos, James A; Cannon, Christopher P; Tonkin, Andrew; Ballantyne, Christie M; Lewis, Basil S; Musliner, Thomas A; Tershakovec, Andrew M; Lokhnygina, Yuliya; White, Jennifer A; Reist, Craig; McCagg, Amy; Braunwald, Eugene

    2016-12-01

    We aimed to determine the efficacy and safety of adding ezetimibe (Ez) to simvastatin (S) in a post-acute coronary syndrome (ACS) population in a prespecified on-treatment analysis. We evaluated 17,706 post-ACS patients from the IMPROVE-IT trial who had low-density lipoprotein cholesterol values between 50 and 125 mg/dL and who received Ez 10 mg/d with S 40 mg/d (Ez/S) or placebo with simvastatin 40 mg/d (P/S). The primary composite end point was cardiovascular death, myocardial infarction, unstable angina, coronary revascularization ≥30 days postrandomization, or stroke. The on-treatment analysis included patients who received study drug for the duration of the trial or experienced a primary end point or noncardiovascular death within 30 days of drug discontinuation. Mean low-density lipoprotein cholesterol values at 1 year were 71 mg/dL for P/S and 54 mg/dL for Ez/S (absolute difference -17 mg/dL = -24%; P < .001). The 7-year Kaplan-Meier estimate of the primary end point occurred in 32.4% in the P/S arm and 29.8% in the Ez/S arm (absolute difference 2.6%; HR adj 0.92 [95% CI 0.87-0.98]; P = .01). The absolute treatment effect favoring Ez/S was 30% greater than in the intention-to-treat analysis of IMPROVE-IT. This analysis provides additional support for the efficacy and safety of adding Ez to S in this high-risk, post-ACS population. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Cause-Effect Analysis: Improvement of a First Year Engineering Students' Calculus Teaching Model

    Science.gov (United States)

    van der Hoff, Quay; Harding, Ansie

    2017-01-01

    This study focuses on the mathematics department at a South African university and in particular on teaching of calculus to first year engineering students. The paper reports on a cause-effect analysis, often used for business improvement. The cause-effect analysis indicates that there are many factors that impact on secondary school teaching of…

  14. IMPROVED EFFECT AND FEM ANALYSIS OF VACUUM CONSOLIDATION METHOD USES PRELOAD EMBANKMENT

    Science.gov (United States)

    Hirata, Masafumi; Fukuda, Jun; Nobuta, Junichi; Nishikawa, Kouji; Yamada, Kouichi; Kawaida, Minoru

    In the vacuum consolidation method, the vacuum pressure acts toward the inside of the improvement region. When an embankment is used together with it, lateral deformation is reduced and rapid construction of the embankment becomes possible. However, FEM analysis is necessary to forecast such an improvement effect. In this paper, the improvement effect and the deformation characteristics of the vacuum consolidation method executed at the Wakasa construction site were verified. Moreover, a soil-water coupled FEM analysis was executed, and its use for execution management was examined. At the Wakasa site, it was confirmed that the vacuum consolidation method was highly effective in reducing deformation of the surrounding soil, accelerating consolidation, and shortening the term of works. The executed FEM analysis reproduced the improvement effect of the vacuum consolidation with high accuracy. At the Wakasa site, stable construction of the embankment was achieved by using the analytical results for execution management.

  15. Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE)

    Science.gov (United States)

    2015-05-01

    Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE), May 13-14, 2015. ...Methods: Lexical Link Analysis (LLA) Core, LLA reports and visualizations; Collaborative Learning Agents (CLA) for...

  16. Improvement of the computing speed of the FBR fuel pin bundle deformation analysis code 'BAMBOO'

    International Nuclear Information System (INIS)

    Ito, Masahiro; Uwaba, Tomoyuki

    2005-04-01

    JNC has developed a coupled analysis system consisting of the fuel pin bundle deformation analysis code 'BAMBOO' and the thermal hydraulics analysis code 'ASFRE-IV' for the purpose of evaluating the integrity of a subassembly under the BDI condition. This coupled analysis took much computation time because it needs convergent calculations to obtain numerically stationary solutions for thermal and mechanical behaviors. We improved the computation time of the BAMBOO code analysis to make the coupled analysis practicable. 'BAMBOO' is an FEM code, and as such its matrix calculations consume a large memory area to temporarily store intermediate results in the solution of simultaneous linear equations. The code used the hard disk drive (HDD) as virtual memory to save random access memory (RAM). However, the use of the HDD increased the computation time because input/output (I/O) processing with the HDD took much time in data accesses. We improved the code so that it conducts I/O processing only with the RAM in matrix calculations and runs on high-performance computers. This improvement considerably increased the CPU occupation rate during the simulation and reduced the total simulation time of the BAMBOO code to about one-seventh of that before the improvement. (author)

  17. Analysis of Human Errors in Industrial Incidents and Accidents for Improvement of Work Safety

    DEFF Research Database (Denmark)

    Leplat, J.; Rasmussen, Jens

    1984-01-01

    Methods for the analysis of work accidents are discussed, and a description is given of the use of a causal situation analysis in terms of a 'variation tree' in order to explain the course of events of the individual cases and to identify possible improvements. The difficulties in identifying 'causes' of accidents are discussed, and it is proposed to analyze accident reports with the specific aim of identifying the potential for future improvements rather than causes of past events. In contrast to traditional statistical analysis of work accident data, which typically gives very general recommendations, the method proposed identifies very explicit countermeasures. Improvements require a change in human decisions during equipment design, work planning, or the execution itself. The use of a model of human behavior drawing a distinction between automated skill-based behavior, rule-based 'know...

  19. Maintaining and improving of the training program on the analysis software in CMS

    International Nuclear Information System (INIS)

    Malik, S; Hoehle, F; Lassila-Perini, K; Hinzmann, A; Wolf, R; Shipsey, I

    2012-01-01

    Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.

  20. An improved multiple linear regression and data analysis computer program package

    Science.gov (United States)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
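The kind of regression computation described can be sketched in a few lines of modern code; the data, variable names, and fit below are illustrative and unrelated to NEWRAP itself. Double-precision arithmetic, the program's major improvement, is the default for NumPy floats:

```python
import numpy as np

# Illustrative multiple linear regression in double precision (NumPy's
# default), sketching the kind of computation NEWRAP performs; data are
# made up to lie near y = x1 + 2*x2.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0], [5.0, 6.0]])
y = np.array([5.1, 4.0, 11.2, 10.1, 17.0])

# Add an intercept column and solve by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, residual_ss, rank, _ = np.linalg.lstsq(A, y, rcond=None)

fitted = A @ coef
residuals = y - fitted  # these would be plotted against each variable
r2 = 1.0 - residuals @ residuals / ((y - y.mean()) @ (y - y.mean()))
print(coef.round(3), round(r2, 4))
```

A full package like NEWRAP would add the cross plots, t-statistics, variable rejection, and canonical reduction on top of this core solve.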

  1. Improvement in Student Data Analysis Skills after Out-of-Class Assignments

    Directory of Open Access Journals (Sweden)

    Kristen Lee Williams Walton

    2016-12-01

    Full Text Available The ability to understand and interpret data is a critical aspect of scientific thinking.  However, although data analysis is often a focus in biology majors classes, many textbooks for allied health majors classes are primarily content-driven and do not include substantial amounts of experimental data in the form of graphs and figures.  In a lower-division allied health majors microbiology class, students were exposed to data from primary journal articles as take-home assignments and their data analysis skills were assessed in a pre-/posttest format.  Students were given 3 assignments that included data analysis questions.  Assignments ranged from case studies that included a figure from a journal article to reading a short journal article and answering questions about multiple figures or tables.  Data were represented as line or bar graphs, gel photographs, and flow charts.  The pre- and posttest was designed incorporating the same types of figures to assess whether the assignments resulted in any improvement in data analysis skills.  The mean class score showed a small but significant improvement from the pretest to the posttest across three semesters of testing.  Scores on individual questions testing accurate conclusions and predictions improved the most.  This supports the conclusion that a relatively small number of out-of-class assignments through the semester resulted in a significant improvement in data analysis abilities in this population of students.

  2. Improvement on reaction model for sodium-water reaction jet code and application analysis

    International Nuclear Information System (INIS)

    Itooka, Satoshi; Saito, Yoshinori; Okabe, Ayao; Fujimata, Kazuhiro; Murata, Shuuichi

    2000-03-01

    In selecting the reasonable DBL for a steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. Improvements to the sodium-water reaction (SWR) jet code (LEAP-JET ver. 1.30) and an application analysis of the water injection tests for confirmation of the code's propriety were performed. In improving the code, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on the calculation mesh size. Test calculations using the improved code (LEAP-JET ver. 1.40) were carried out under the conditions of the SWAT-3 Run-19 test and an actual-scale SG. It is confirmed that the SWR jet behavior in the results and the influence of the model on the analysis results are reasonable. For the application analysis of the water injection tests, water injection behavior and SWR jet behavior analyses for the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were performed using the LEAP-BLOW code and the LEAP-JET code. In the application analysis with the LEAP-BLOW code, a parameter survey study was performed; as a result, the injection nozzle diameter needed to simulate the water leak rate was confirmed. In the application analysis with the LEAP-JET code, the temperature behavior of the SWR jet was investigated. (author)

  3. Improvement of precision method of spectrophotometry with inner standardization and its use in plutonium solutions analysis

    International Nuclear Information System (INIS)

    Stepanov, A.V.; Stepanov, D.A.; Nikitina, S.A.; Gogoleva, T.D.; Grigor'eva, M.G.; Bulyanitsa, L.S.; Panteleev, Yu.A.; Pevtsova, E.V.; Domkin, V.D.; Pen'kin, M.V.

    2006-01-01

    A precision spectrophotometry method with internal standardization is used for the analysis of pure Pu solutions. The spectrophotometer and the spectrophotometric method of analysis were improved to decrease the random component of the method's relative error. The influence of U and Np impurities and corrosion products on the systematic component of the method's error, and the effect of fluoride ion on the completeness of Pu oxidation during sample preparation, are studied. (in Russian)

  4. An improved and explicit surrogate variable analysis procedure by coefficient adjustment.

    Science.gov (United States)

    Lee, Seunggeun; Sun, Wei; Wright, Fred A; Zou, Fei

    2017-06-01

    Unobserved environmental, demographic, and technical factors can negatively affect the estimation and testing of the effects of primary variables. Surrogate variable analysis, proposed to tackle this problem, has been widely used in genomic studies. To estimate hidden factors that are correlated with the primary variables, surrogate variable analysis performs principal component analysis either on a subset of features or on all features, but weighting each differently. However, existing approaches may fail to identify hidden factors that are strongly correlated with the primary variables, and the extra step of feature selection and weight calculation makes the theoretical investigation of surrogate variable analysis challenging. In this paper, we propose an improved surrogate variable analysis using all measured features that has a natural connection with restricted least squares, which allows us to study its theoretical properties. Simulation studies and real data analysis show that the method is competitive to state-of-the-art methods.
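The core estimation idea, recovering a hidden factor from the residual structure of the features via principal components, can be sketched as follows. The simulation, dimensions, and seed are made up, and this is only the underlying idea, not the authors' exact procedure:

```python
import numpy as np

# Minimal sketch of the idea behind surrogate variable analysis: regress
# the features on the primary variable, then take the top left singular
# vector of the residual matrix as a surrogate for the hidden factor.
# All data here are simulated for illustration.
rng = np.random.default_rng(0)
n, p = 100, 50
primary = rng.normal(size=n)   # observed primary variable
hidden = rng.normal(size=n)    # unobserved factor we hope to recover

# Features affected by both the primary variable and the hidden factor.
Y = (np.outer(primary, rng.normal(size=p)) * 0.5
     + np.outer(hidden, rng.normal(size=p))
     + rng.normal(size=(n, p)) * 0.1)

# Regress out the primary variable, then PCA (via SVD) on the residuals.
X = np.column_stack([np.ones(n), primary])
resid = Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]
surrogate = np.linalg.svd(resid, full_matrices=False)[0][:, 0]

# In this strong-signal setup the surrogate tracks the hidden factor closely.
print(abs(np.corrcoef(surrogate, hidden)[0, 1]).round(2))
```

The paper's improvement concerns exactly this step: using all features with a restricted-least-squares connection rather than feature selection or reweighting before the PCA.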

  5. U.S. Forest Service Region 1 Lake Chemistry, NADP, and IMPROVE air quality data analysis

    Science.gov (United States)

    Jill Grenon; Mark Story

    2009-01-01

    This report was developed to address the need for comprehensive analysis of U.S. Forest Service (USFS) Region 1 air quality monitoring data. The monitoring data includes Phase 3 (long-term data) lakes, National Atmospheric Deposition Program (NADP), and Interagency Monitoring of Protected Visual Environments (IMPROVE). Annual and seasonal data for the periods of record...

  6. ULg Spectra: An Interactive Software Tool to Improve Undergraduate Students' Structural Analysis Skills

    Science.gov (United States)

    Agnello, Armelinda; Carre, Cyril; Billen, Roland; Leyh, Bernard; De Pauw, Edwin; Damblon, Christian

    2018-01-01

    The analysis of spectroscopic data to solve chemical structures requires practical skills and drills. In this context, we have developed ULg Spectra, a computer-based tool designed to improve the ability of learners to perform complex reasoning. The identification of organic chemical compounds involves gathering and interpreting complementary…

  7. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…

  8. The improved Apriori algorithm based on matrix pruning and weight analysis

    Science.gov (United States)

    Lang, Zhenhong

    2018-04-01

    This paper draws on matrix compression and weight analysis algorithms and proposes an improved Apriori algorithm based on matrix pruning and weight analysis. After the transactional database is scanned only once, the algorithm constructs a boolean transaction matrix. By counting the 1s in the rows and columns of the matrix, the infrequent item sets are pruned and a new candidate item set is formed. Then the item weights, the transaction weights, and the weighted support for items are calculated, and thus the frequent item sets are obtained. The experimental results show that the improved Apriori algorithm not only reduces the number of repeated scans of the database but also improves the efficiency of data-correlation mining.
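The matrix construction and pruning step described above can be sketched as follows; the transactions and support threshold are illustrative, and only the single-scan 1-itemset pruning is shown, without the weighting stage:

```python
import numpy as np

# Sketch of the boolean-matrix pruning step: scan the database once, build
# the transaction matrix, and prune infrequent 1-itemsets by counting the
# 1s in each column. Transactions and threshold are made-up examples.
def frequent_one_itemsets(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    # Boolean transaction matrix: one row per transaction, one column per item.
    m = np.array([[item in t for item in items] for t in transactions])
    # Column sums count how many transactions contain each item.
    support = m.sum(axis=0) / len(transactions)
    return {item: float(s) for item, s in zip(items, support) if s >= min_support}

txns = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b"}]
print(frequent_one_itemsets(txns, 0.6))  # {'a': 0.75, 'b': 0.75}; 'c' is pruned
```

Candidate 2-itemsets would then be generated only from the surviving columns, avoiding any further database scan.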

  9. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was then applied to evaluate the error factors. Results show that (1) adverse physiological states, (2) physical/mental limitations, and (3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses gaps in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
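For orientation, the ranking machinery underlying the method can be sketched with crisp (non-fuzzy) TOPSIS; the scores, weights, and factor labels below are made up, and the study itself uses fuzzy numbers rather than these point values:

```python
import numpy as np

# Crisp TOPSIS sketch: rank alternatives by closeness to the ideal solution.
# Rows = hypothetical error factors, columns = four evaluation criteria;
# all numbers are illustrative, not from the paper.
scores = np.array([
    [7.0, 8.0, 6.0, 7.0],   # factor A
    [5.0, 6.0, 5.0, 4.0],   # factor B
    [8.0, 9.0, 8.0, 9.0],   # factor C
])
weights = np.array([0.3, 0.3, 0.2, 0.2])

# 1) Vector-normalize each criterion column, then apply the weights.
v = scores / np.linalg.norm(scores, axis=0) * weights
# 2) Ideal best/worst per criterion (all criteria treated as benefits here).
best, worst = v.max(axis=0), v.min(axis=0)
# 3) Closeness coefficient: distance to worst over total distance.
d_best = np.linalg.norm(v - best, axis=1)
d_worst = np.linalg.norm(v - worst, axis=1)
closeness = d_worst / (d_best + d_worst)
print(closeness.round(3))  # higher = closer to the ideal; factor C ranks first
```

A fuzzy variant replaces each score with a fuzzy number (for example triangular) and uses fuzzy distances, but the ranking logic is the same.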

  10. Powerplant productivity improvement study: policy analysis and incentive assessment. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    Policy options that the Illinois Commerce Commission might adopt in order to promote improved power plant productivity for existing units in Illinois are identified and analyzed. These policy options would generally involve either removing existing disincentives and/or adding direct incentives through the regulatory process. The following activities are reported: in-depth review of existing theoretical and empirical literature in the areas of power plant reliability, regulatory utility efficiency and performance incentives, and impacts of various regulatory mechanisms such as the Fuel Adjustment Clauses on productivity; contacts with other state public utility commissions known to be investigating or implementing productivity improvement incentive mechanisms; documentation and analysis of incentive mechanisms adopted or under consideration in other states; analysis of current regulatory practice in Illinois as it relates to power plant productivity incentives and disincentives; identification of candidate incentive mechanisms for consideration by the Illinois Commerce Commission; and analysis and evaluation of these candidates. 72 references, 8 figures.

  11. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III

    Science.gov (United States)

    2015-04-30

    Acquisition Research Symposium, Thursday Sessions, Volume II: Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal. ...processes. Lexical Link Analysis (LLA) can help, by applying automation to reveal and depict, to decision-makers, the correlations, associations, and

  12. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase II

    Science.gov (United States)

    2014-04-30

    Eleventh Annual Acquisition Research Symposium, Thursday Sessions, Volume II: Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal. ...vocabulary or lexicon, to describe the attributes and surrounding environment of the system. Lexical Link Analysis (LLA) is a form of text mining in which

  13. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for the equipment environmental qualification (EEQ) on loss-of-coolant accident (LOCA) has recently been developed and adopted for small-break LOCA EEQ. The new M/E release analysis methodology is extended to the M/E release analysis for containment design for large-break LOCA and the main steam line break (MSLB) accident, and is named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, and CONTEMPT4/MOD5. The KIMERA methodology is applied to the MSLB M/E release analysis to evaluate its validity for MSLB containment design. The results are compared with the OPR1000 FSAR

  14. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Christopher Carling

    2006-03-01

    Full Text Available DESCRIPTION This book addresses and clearly explains soccer match analysis, looks at the very latest in match analysis research, and at the innovative technologies used by professional clubs. This handbook also bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a practical manual on soccer match analysis in general for coaches and sport scientists, so that professionals in this field can gather objective data on the players and the team, which in turn can be used by coaches and players to learn more about performance as a whole and gain a competitive advantage as a result. The book efficiently meets these objectives. AUDIENCE The book is targeted at the athlete, the coach, the sports science professional, or any sport-conscious person who wishes to analyze relevant soccer performance. The editors and the contributors are authorities in their respective fields, and this handbook draws on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive, as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  15. Improving educational environment in medical colleges through transactional analysis practice of teachers.

    Science.gov (United States)

    Rajan, Marina; Chacko, Thomas

    2012-01-01

    A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps develop 'awareness' of intrapersonal and interpersonal processes. The objectives were to improve self-awareness among medical educators, to bring about self-directed change in their practices, and to assess the usefulness of TA tools for the same. An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and the passivity and script theories of adult functional styles were taught experientially with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after, at three months, and one year after training. The mean improvement in self-'awareness' was 13.3% (95% CI 9.3-17.2) among nineteen participants, and it persisted one year after training. Changes in practices within a year included collecting feedback, new teaching styles, and better relationships with students. These findings demonstrate sustainable and measurable improvement in self-awareness through practice of TA tools. Improvement in the self-'awareness' of faculty resulted in self-directed changes in teaching practices. Medical faculty have judged the TA tools effective for improving self-awareness, leading to self-directed changes.

  16. Transition towards improved regional wood flows by integrating material flux analysis and agent analysis. The case of Appenzell Ausserrhoden, Switzerland

    International Nuclear Information System (INIS)

    Binder, Claudia R.; Hofer, Christoph; Wiek, Arnim; Scholz, Roland W.

    2004-01-01

    This paper discusses the integration of material flux analysis and agent analysis as the basis for a transition towards improved regional wood management in Appenzell Ausserrhoden (AR), a small Swiss canton located in the Pre-Alps of Switzerland. We present a wood flow analysis for forests, wood processing industries and consumption in AR, accounting for different wood products. We find that the forest is currently significantly underutilized although there are sizeable imports of wood and fuel to this small region. The underutilization of the forest contributes to a skewed age distribution, jeopardizing long-term sustainable development of the forest, as the fulfillment of its protective and production function are likely to be at risk. The wood resources, however, are capable of satisfying current wood demand among the population of AR and wood could even be exported. Underutilization has two main causes: first, wood prices are so low that harvesting trees is a money-losing proposition; second, consumer wood demand and the current supply from forest owners are not aligned. Furthermore, cultural values, lifestyle trends and traditions make an alignment of supply and demand difficult. Consensus and strategy building with the relevant stakeholders on the basis of the results obtained from the wood flow analysis and agent analysis is a reasonable next step to take. We conclude that wood flow analysis combined with agent analysis provide a useful and straightforward tool to be used as the basis of a transition process towards improved regional wood flows, which in turn should contribute to sustainable forest management

  17. An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks

    Science.gov (United States)

    Zhao, Peng-yuan; Huang, Xiao-ping

    2018-03-01

    Errors arise in calculating the fatigue damage of details in liquid cargo tanks with the traditional spectral analysis method, which is based on a linear system, because of the nonlinear relationship between the dynamic stress and the ship acceleration. An improved spectral analysis method for the assessment of fatigue damage in a detail of a liquid cargo tank is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This analysis method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the PSD of the stress. Taking an independent tank in an LNG carrier as an example, the accuracy of the improved spectral analysis method is shown to be much better than that of the traditional spectral analysis method by comparing the calculated damage results with those of the time-domain method. The proposed spectral analysis method is more accurate in calculating the fatigue damage of details of ship liquid cargo tanks.
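The amplitude-summation step can be sketched numerically. The sampling rate, component stress histories, and frequencies below are synthetic stand-ins, not data from the paper, and the Fourier expansion is done with an FFT:

```python
import numpy as np

# Sketch of the amplitude-summation step described above: expand each
# component-induced stress process into a Fourier series (here via FFT)
# and add the harmonic amplitudes that share a frequency before forming
# the one-sided PSD. All signals are synthetic.
fs = 100.0                     # sampling frequency, Hz (assumed)
t = np.arange(0, 20, 1 / fs)   # 20 s record

# Stress induced by two sinusoidal wave components (illustrative units).
stress_components = [
    3.0 * np.sin(2 * np.pi * 0.5 * t),            # response to a 0.5 Hz wave
    1.5 * np.sin(2 * np.pi * 0.5 * t + 0.3)       # extra response at 0.5 Hz
    + 2.0 * np.sin(2 * np.pi * 1.2 * t),          # response to a 1.2 Hz wave
]

freqs = np.fft.rfftfreq(t.size, 1 / fs)
# Sum harmonic amplitudes per frequency bin across all component responses.
amp = sum(np.abs(np.fft.rfft(s)) * 2 / t.size for s in stress_components)
# One-sided PSD estimate from the summed amplitudes.
df = freqs[1] - freqs[0]
psd = amp**2 / (2 * df)
print(freqs[np.argmax(psd)])  # dominant stress frequency
```

Fatigue damage would then follow from this PSD through a spectral damage model (for example a narrow-band or Dirlik-type formula) combined with the detail's S-N curve.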

  18. Improving the PSA quality in the human reliability analysis of pre-accident human errors

    International Nuclear Information System (INIS)

    Kang, D.-I.; Jung, W.-D.; Yang, J.-E.

    2004-01-01

    This paper describes the activities for improving the Probabilistic Safety Assessment (PSA) quality in the human reliability analysis (HRA) of the pre-accident human errors for the Korea Standard Nuclear Power Plant (KSNP). We evaluate the HRA results of the PSA for the KSNP and identify the items to be improved using the ASME PRA Standard. Evaluation results show that the ratio of items to be improved for pre-accident human errors is relatively high when compared with the ratio of those for post-accident human errors. They also show that more than 50% of the items to be improved for pre-accident human errors are related to the identification and screening analysis for them. In this paper, we develop the modeling guidelines for pre-accident human errors and apply them to the auxiliary feedwater system of the KSNP. Application results show that more than 50% of the items to be improved for the pre-accident human errors of the auxiliary feedwater system are resolved. (author)

  19. Root Cause Analysis and Productivity Improvement Of An Apparel Industry In Bangladesh Through Kaizen Implementation

    Directory of Open Access Journals (Sweden)

    Taposh Kumar Kapuria

    2017-12-01

    Full Text Available The garment industry plays a pioneering role in improving Bangladesh's economic condition. It started in the late 1970s and is now the leading foreign-currency earner for Bangladesh. It is no exaggeration to say that the Bangladesh garment industry is improving its service quality and innovative design features to survive in the global competitive market. Global competition in the garment market is changing day by day, and leading garment manufacturers from all over the world are adopting new innovative features and techniques to sustain themselves in this fiercely competitive market. Bangladeshi garment manufacturers are not lagging behind: they also emphasize better service quality by adding the latest design features and using the latest technologies. The sole purpose of this paper is to identify the root causes of sewing defects in an apparel industry in Bangladesh and to reduce those defects through the Kaizen (continuous improvement) system, in short, to improve the productivity of the apparel industry. The garment manufacturing company studied is “ABONTI Color Tex. Ltd.”. Pareto analysis is used to identify the top defect items, cause-effect analysis helps to identify the root causes of the sewing defects, and Kaizen is then used for continuous improvement in minimizing them.
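
    The Pareto analysis used here can be sketched in a few lines: sort defect categories by count and keep the "vital few" that account for roughly 80% of all defects. The defect names and counts below are hypothetical; the abstract does not reproduce the study's figures.

```python
# Hypothetical sewing-defect tally (categories and counts are illustrative).
defects = {
    "broken stitch": 120,
    "skipped stitch": 95,
    "open seam": 60,
    "oil stain": 25,
    "wrong label": 12,
    "others": 8,
}

total = sum(defects.values())
cumulative = 0.0
top_defects = []            # the "vital few" covering ~80% of all defects
for name, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100.0 * count / total
    top_defects.append(name)
    if cumulative >= 80.0:
        break

print(top_defects)          # the items a Kaizen effort should target first
```

    Cause-effect (fishbone) analysis is then applied only to these top categories, which keeps the improvement effort focused.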

  20. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    Science.gov (United States)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool that lets the business leader of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit, a problem that has triggered many complaints from patients. After deploying this tool, the team found that, in processing the medicine, the pharmacy unit lacks storage and a capsule-packing tool, a condition that causes much wasted time in the process. The team therefore proposed to the business leader to procure the required tools in order to shorten the process. This research shortened the lead time from 45 minutes to the 30 minutes required by the government through the Indonesian health ministry, and increased the %VA (value-added activity), or Process Cycle Efficiency (PCE), from 66% to 68% (considered lean because it is higher than the required 30%). These results show that process effectiveness has been increased by the improvement.
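
    The %VA / Process Cycle Efficiency figure quoted above is simply value-added time divided by total lead time. A minimal sketch; the 20.4-minute value-added figure is inferred from the reported 68% PCE at a 30-minute lead time, not stated in the abstract.

```python
def process_cycle_efficiency(value_added_min, lead_time_min):
    # PCE (%VA) = value-added time / total lead time, in percent.
    return 100.0 * value_added_min / lead_time_min

# 68% PCE at a 30-minute lead time implies about 20.4 value-added minutes.
pce_after = process_cycle_efficiency(0.68 * 30, 30)
```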

  2. The effects of aromatherapy on sleep improvement: a systematic literature review and meta-analysis.

    Science.gov (United States)

    Hwang, Eunhee; Shin, Sujin

    2015-02-01

    To evaluate the existing data on aromatherapy interventions for improvement of sleep quality. Systematic literature review and meta-analysis on the effects of aromatherapy. Study Sources: Electronic databases, including the Korea Education and Research Information Service (KERIS), Korean studies Information Service System (KISS), National Assembly Library, and eight academies within the Korean Society of Nursing Science, were searched to identify studies published between 2000 and August 2013. Randomized controlled and quasi-experimental trials that included aromatherapy for the improvement of sleep quality. Of the 245 publications identified, 13 studies met the inclusion and exclusion criteria, and 12 studies were used in the meta-analysis. Meta-analysis of the 12 studies using a random-effects model revealed that the use of aromatherapy was effective in improving sleep quality (95% confidence interval [CI], 0.540-1.745; Z=3.716). Subgroup analysis revealed that inhalation aromatherapy (95% CI, 0.792-1.541; Z=6.107) was more effective than massage therapy (95% CI, 0.128-2.166; Z=2.205) in unhealthy (95% CI, 0.248-1.100; Z=3.100) and healthy (95% CI, 0.393-5.104; Z=2.287) participants, respectively. Readily available aromatherapy treatments appear to be effective and promote sleep. Thus, it is essential to develop specific guidelines for the efficient use of aromatherapy.
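
    A random-effects pooling of study effects, as used in this meta-analysis, can be sketched with the DerSimonian-Laird estimator (a common choice; the review does not state which estimator it used). The four effect sizes and variances below are hypothetical.

```python
import math

def random_effects_pool(effects, variances):
    # DerSimonian-Laird between-study variance tau^2, then inverse-variance
    # pooling with the random-effects weights 1 / (v_i + tau^2).
    w = [1.0 / v for v in variances]
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
    k = len(effects)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    wstar = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(wstar, effects)) / sum(wstar)
    se = math.sqrt(1.0 / sum(wstar))
    # Returns the pooled effect, its 95% CI, and the Z statistic.
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, pooled / se

# Hypothetical standardized mean differences and their variances:
pooled, lo, hi, z = random_effects_pool([1.2, 0.8, 1.5, 0.6],
                                        [0.20, 0.15, 0.30, 0.10])
```

    The CI and Z values reported in the abstract (e.g. Z=3.716 for the overall effect) come from exactly this kind of pooled estimate and standard error.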

  3. Improvement of burnup analysis for pebble bed reactors with an accumulative fuel loading scheme

    International Nuclear Information System (INIS)

    Simanullang, Irwan Liapto; Obara, Toru

    2015-01-01

    Given the limitations of natural uranium resources, innovative nuclear power plant concepts that increase the efficiency of nuclear fuel utilization are needed. The Pebble Bed Reactor (PBR) shows some potential to achieve high efficiency in natural uranium utilization. To simplify the PBR concept, a PBR with an accumulative fuel loading scheme was introduced and the Fuel Handling System (FHS) removed. In this concept, pebble balls are added little by little into the reactor core until they reach the top of the core, and all pebble balls are discharged from the core at the end of the operation period. A code based on the MVP/MVP-BURN method has been developed to analyze a PBR with the accumulative fuel loading scheme, and the fuel composition giving the best burnup performance was found using the code. This work improves on previous efforts in two ways. First, some errors in the input code were corrected; this correction, together with an overall simplification of the input code, makes analysis of a PBR with the accumulative fuel loading scheme easier. Second, whereas the optimum fuel design had previously been obtained in infinite geometry, the fuel composition was refined here through a parametric survey varying the amount of heavy metal (HM) uranium per pebble and the degree of uranium enrichment, with the entire parametric survey carried out in finite geometry. The results show that the improvements in the fuel composition lead to more accurate analysis with the code. (author)

  4. Prediction of improvement in skin fibrosis in diffuse cutaneous systemic sclerosis: a EUSTAR analysis.

    Science.gov (United States)

    Dobrota, Rucsandra; Maurer, Britta; Graf, Nicole; Jordan, Suzana; Mihai, Carina; Kowal-Bielecka, Otylia; Allanore, Yannick; Distler, Oliver

    2016-10-01

    Improvement of skin fibrosis is part of the natural course of diffuse cutaneous systemic sclerosis (dcSSc). Recognising those patients most likely to improve could help tailor clinical management and cohort enrichment for clinical trials. In this study, we aimed to identify predictors for improvement of skin fibrosis in patients with dcSSc. We performed a longitudinal analysis of the European Scleroderma Trials And Research (EUSTAR) registry including patients with dcSSc, fulfilling American College of Rheumatology criteria, baseline modified Rodnan skin score (mRSS) ≥7 and follow-up mRSS at 12±2 months. The primary outcome was skin improvement (decrease in mRSS of >5 points and ≥25%) at 1 year follow-up. A respective increase in mRSS was considered progression. Candidate predictors for skin improvement were selected by expert opinion, and logistic regression with bootstrap validation was applied. Of the 919 patients included, 218 (24%) improved and 95 (10%) progressed. Eleven candidate predictors for skin improvement were analysed. The final model identified high baseline mRSS and absence of tendon friction rubs as independent predictors of skin improvement. The baseline mRSS was the strongest predictor of skin improvement, independent of disease duration. An upper threshold between 18 and 25 performed best in enriching for progressors over regressors. Patients with advanced skin fibrosis at baseline and absence of tendon friction rubs are more likely to regress in the next year than patients with milder skin fibrosis. These evidence-based data can be implemented in clinical trial design to minimise the inclusion of patients who would regress under standard of care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  5. Analysis of transient heat conduction in a PWR fuel rod by an improved lumped parameter approach

    Energy Technology Data Exchange (ETDEWEB)

    Dourado, Eneida Regina G. [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Cotta, Renato M. [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Mecanica; Jian, Su, E-mail: eneidadourado@gmail.com, E-mail: sujian@nuclear.ufrj.br, E-mail: cotta@mecanica.ufrj.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2017-07-01

    This paper analyzes transient heat conduction in a nuclear fuel rod by an improved lumped parameter approach. One-dimensional transient heat conduction is considered, with circumferential symmetry assumed and axial conduction neglected. The thermal conductivity and specific heat of the fuel pellet are considered temperature dependent, while the thermophysical properties of the cladding are considered constant. Hermite approximation of the integrals is used to obtain the average temperature and heat flux in the radial direction. Significant improvement over the classical lumped parameter formulation has been achieved. The proposed model can also be used in dynamic analysis of PWRs and in nuclear power plant simulators. (author)
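
    The Hermite approximation referred to above replaces plain averaging with two-sided integration formulas; the H(1,1) member, commonly used in improved lumped parameter analysis, augments the trapezoidal rule with endpoint derivatives and is exact for cubic profiles. A minimal sketch:

```python
def hermite_h11(y0, yh, dy0, dyh, h):
    # H(1,1): integral of y over [0, h] from endpoint values and slopes;
    # trapezoid plus a derivative correction, exact for cubic polynomials.
    return h / 2.0 * (y0 + yh) + h ** 2 / 12.0 * (dy0 - dyh)

# Check on y(x) = 1 + 2x + 3x^2 + 4x^3 over [0, 1]; exact integral is 4.
y = lambda x: 1 + 2 * x + 3 * x ** 2 + 4 * x ** 3
dy = lambda x: 2 + 6 * x + 12 * x ** 2
approx = hermite_h11(y(0), y(1), dy(0), dy(1), 1.0)   # → 4.0
```

    Applying such formulas to the radial integrals of temperature and heat flux is what tightens the closure between average and surface temperatures relative to the classical lumped formulation.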

  6. Cost-benefit analysis of improved air quality in an office building

    DEFF Research Database (Denmark)

    Djukanovic, R.; Wargocki, Pawel; Fanger, Povl Ole

    2002-01-01

    A cost-benefit analysis of measures to improve air quality in an existing air-conditoned office building (11581 m2, 864 employees) was carried out for hot, temperate and cold climates and for two operating modes: Variable Air Volume (VAV) with economizer; and Constant Air Volume (CAV) with heat...... productivity for every 10% reduction in the proportion of occupants entering a space who are dissatisfied with the air quality. With this assumption, the annual benefit due to improved air quality was always at least 10 times higher than the increase in annual energy and maintenance costs. The payback time...

  7. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; et al.

    2015-01-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used a 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is a factor of two better than achievable with digital optimal filters.
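
    The heart of the PCA approach can be sketched as an SVD of the matrix of recorded pulses: the leading right-singular vectors are the orthogonal components of largest variance, and each pulse's projections onto them are the descriptive parameters. The exponential-decay pulses below are synthetic stand-ins for real TES records.

```python
import numpy as np

def pca_pulse_components(pulses, n_components=3):
    # Rows of `pulses` are digitized records. After mean subtraction,
    # the leading right-singular vectors of the data matrix are the
    # orthogonal components with the largest variance.
    mean = pulses.mean(axis=0)
    centered = pulses - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    comps = vt[:n_components]
    scores = centered @ comps.T       # per-pulse descriptive parameters
    return mean, comps, scores

# Synthetic stand-ins for TES records: scaled exponential decays plus noise.
rng = np.random.default_rng(0)
t = np.arange(512)
heights = rng.uniform(0.8, 1.2, size=200)
pulses = heights[:, None] * np.exp(-t / 50.0) + 0.01 * rng.normal(size=(200, 512))
mean, comps, scores = pca_pulse_components(pulses)
# scores[:, 0] tracks pulse height, the dominant source of variance here.
```

    In the real analysis, combinations of these scores (rather than a single filtered amplitude) are calibrated into an energy estimate.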

  8. Application of numerical grid generation for improved CFD analysis of multiphase screw machines

    Science.gov (United States)

    Rane, S.; Kovačević, A.

    2017-08-01

    Algebraic grid generation is widely used for discretization of the working domain of twin screw machines. It is fast and offers good control over the placement of grid nodes, but the grid quality required to handle multiphase flows, such as oil injection, may at times be difficult to achieve. In order to obtain fast solutions for multiphase screw machines, it is important to further improve the quality and robustness of the computational grid. In this paper, a deforming grid of a twin screw machine is generated using algebraic transfinite interpolation to produce an initial mesh, upon which an elliptic partial differential equation (PDE) of Poisson type is solved numerically to produce a smooth final computational mesh. The quality of the numerical cells and their distribution obtained by the differential method is greatly improved. In addition, a similar procedure was introduced to fully smoothen the transition of the partitioning rack curve between the rotors, improving the continuous movement of grid nodes and in turn the robustness and speed of the Computational Fluid Dynamics (CFD) solver. An analysis of an oil-injected twin screw compressor is presented to compare the improvements in grid quality factors in the regions of importance, such as the interlobe space, the radial tip and the core of the rotor. The proposed method, which combines algebraic and differential grid generation, offers significant improvements in grid quality and in the robustness of the numerical solution.
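
    The two-stage procedure, algebraic transfinite interpolation followed by elliptic smoothing, can be sketched as below. For brevity the elliptic step is reduced to index-space Laplacian smoothing rather than the full Poisson system with control functions used in the paper, and the unit-square boundaries are illustrative only.

```python
import numpy as np

def tfi_grid(bottom, top, left, right):
    # Algebraic transfinite interpolation from four boundary curves;
    # bottom/top are (m, 2) arrays along r, left/right are (n, 2) along s.
    n, m = left.shape[0], bottom.shape[0]
    s = np.linspace(0, 1, n)[:, None, None]
    r = np.linspace(0, 1, m)[None, :, None]
    return ((1 - s) * bottom[None, :, :] + s * top[None, :, :]
            + (1 - r) * left[:, None, :] + r * right[:, None, :]
            - (1 - s) * (1 - r) * bottom[0] - (1 - s) * r * bottom[-1]
            - s * (1 - r) * top[0] - s * r * top[-1])

def laplace_smooth(grid, iters=200):
    # Jacobi sweeps of index-space Laplacian smoothing of interior nodes;
    # boundary nodes stay fixed. (A simplified stand-in for the elliptic
    # Poisson-type solve with control functions.)
    g = grid.copy()
    for _ in range(iters):
        g[1:-1, 1:-1] = 0.25 * (g[2:, 1:-1] + g[:-2, 1:-1]
                                + g[1:-1, 2:] + g[1:-1, :-2])
    return g

# Illustrative unit-square boundaries (11 nodes per edge):
u = np.linspace(0, 1, 11)
bottom = np.stack([u, np.zeros(11)], axis=1)   # s = 0 edge
top = np.stack([u, np.ones(11)], axis=1)       # s = 1 edge
left = np.stack([np.zeros(11), u], axis=1)     # r = 0 edge
right = np.stack([np.ones(11), u], axis=1)     # r = 1 edge
grid = tfi_grid(bottom, top, left, right)
smooth = laplace_smooth(grid)
```

    On a distorted boundary the smoothing step redistributes interior nodes toward a more uniform, less skewed cell layout, which is the quality gain the paper reports for the interlobe and radial-tip regions.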

  9. Construction Delay Analysis Techniques—A Review of Application Issues and Improvement Needs

    Directory of Open Access Journals (Sweden)

    Nuhu Braimah

    2013-07-01

    Full Text Available The time for performance of a project is usually of the essence to the employer and the contractor. This has made it quite imperative for contracting parties to analyse project delays for purposes of making right decisions on potential time and/or cost compensation claims. Over the years, existing delay analysis techniques (DATs for aiding this decision-making have been helpful but have not succeeded in curbing the high incidence of disputes associated with delay claims resolutions. A major source of the disputes lies with the limitations and capabilities of the techniques in their practical use. Developing a good knowledge of these aspects of the techniques is of paramount importance in understanding the real problematic issues involved and their improvement needs. This paper seeks to develop such knowledge and understanding (as part of a wider research work via: an evaluation of the most common DATs based on a case study, a review of the key relevant issues often not addressed by the techniques, and the necessary improvements needs. The evaluation confirmed that the various techniques yield different analysis results for the same delay claims scenario, mainly due to their unique application procedures. The issues that are often ignored in the analysis but would also affect delay analysis results are: functionality of the programming software employed for the analysis, resource loading and levelling requirements, resolving concurrent delays, and delay-pacing strategy. Improvement needs by way of incorporating these issues in the analysis and focusing on them in future research work are the key recommendations of the study.

  10. Effectiveness of Cognitive and Transactional Analysis Group Therapy on Improving Conflict-Solving Skill

    Directory of Open Access Journals (Sweden)

    Bahram A. Ghanbari-Hashemabadi

    2012-03-01

    Full Text Available Background: Today, learning communication skills such as conflict solving is very important. The purpose of the present study was to investigate the efficacy of cognitive and transactional analysis group therapy in improving conflict-solving skill. Materials and Method: This is an experimental study with pretest-posttest and control group. Forty-five clients referring to the counseling and psychological services center of Ferdowsi University of Mashhad were chosen based on a screening method and randomly divided into three equal groups: a control group (15 participants), a cognitive experimental group (15 participants) and a transactional analysis group (15 participants). A conflict-solving questionnaire was used to collect data, and the interventions were cognitive and transactional analysis group therapy administered during 8 weekly two-hour sessions. Mean and standard deviation were used for data analysis at the descriptive level and one-way ANOVA at the inference level. Results: The results of the study suggest that conflict-solving skills in the two experimental groups were significantly increased. Conclusion: The findings of this research indicate that both cognitive and transactional analysis group therapy can be effective interventions for improving conflict-solving skills.

  11. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    International Nuclear Information System (INIS)

    Park, Dae Woong

    2015-01-01

    A centrifuge works on the principle that particles with different densities separate at a rate proportional to the centrifugal force during high-speed rotation. Dense particles are quickly precipitated, while particles with relatively smaller densities precipitate more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process, and its sludge conveyance efficiency is a core measure of this technology. In this study, a smoothed particle hydrodynamics analysis was performed for a decanter centrifuge used to convey sludge, in order to evaluate the efficiency improvement. The analysis was applied both to the original centrifuge model and to the design-change model, a ball-plate rail model, to evaluate the sludge transfer efficiency.

  12. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis.

    Science.gov (United States)

    Motooka, Masanobu; Uno, Shigeyasu

    2018-02-02

    Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced the background current by improving the fabrication process of the sensors utilizing paper. Finally, we evaluated the limit of detection of the sensor with the sample vapor of ethanol gas. The experiment showed about a 50% reduction of the limit of detection compared to previously-reported sensor. This result presents the possibility of the sensor being applied in diagnosis, such as for diabetes, by further lowering the limit of detection.

  14. Analysis of means of improving the uncontrolled lateral motions of personal airplanes

    Science.gov (United States)

    McKinney, Marion O., Jr.

    1951-01-01

    A theoretical analysis has been made of means of improving the uncontrolled motions of personal airplanes. The purpose of this investigation was to determine whether such airplanes could be made to fly uncontrolled for an indefinite period of time without getting into dangerous attitudes and for a reasonable period of time (1 to 3 min) without deviating excessively from their original course. The results of this analysis indicated that the uncontrolled motions of a personal airplane could be made safe as regards spiral tendencies and could be greatly improved as regards maintenance of course without resort to an autopilot. The only way to make the uncontrolled motions completely satisfactory as regards continuous maintenance of course, however, is to use a conventional type of autopilot.

  15. Improvements of Physical Models in TRITGO code for Tritium Behavior Analysis in VHTR

    International Nuclear Information System (INIS)

    Yoo, Jun Soo; Tak, Nam Il; Lim, Hong Sik

    2010-01-01

    Since tritium is a radioactive material with a half-life of 12.32 years and is generated in a Very High Temperature gas-cooled Reactor (VHTR) core by ternary fission in the fuel as well as by neutron absorption reactions of impurities, accurate prediction of tritium behavior and of its concentration in the product hydrogen is clearly important for public safety. To this end, the TRITGO code was developed by General Atomics (GA) for estimating tritium production and distribution in high temperature gas-cooled reactors. However, some models in it are hard-wired to a specific reactor type or are too simplified, which makes the analysis results less applicable, so major improvements need to be considered for better predictions. In this study, several model improvements are suggested and their effects are evaluated through an analysis of the PMR600 design concept.

  16. Design improvement and dynamic finite element analysis of novel ITI dental implant under dynamic chewing loads.

    Science.gov (United States)

    Cheng, Yung-Chang; Lin, Deng-Huei; Jiang, Cho-Pei; Lee, Shyh-Yuan

    2015-01-01

    The main aim of this article is to introduce the application of uniform design of experiments to reduce the micromotion of a novel ITI dental implant model under dynamic loads. Combining the characteristics of the traditional ITI and Nano-Tite implants, a new implant with concave holes has been constructed. Compared to the traditional ITI dental implant model, the micromotion of the new model was significantly reduced in explicit dynamic finite element analysis. Following the uniform design of experiments, dynamic finite element analysis was applied to calculate the maximum micromotion of the full model, and the design producing the minimum micromotion across all simulated experiments was selected as the improved model. Relative to the original design, which was associated with a micromotion of 45.11 μm, the micromotion of the improved version was 31.37 μm, an improvement rate of 30.5%.

  17. Using digital notifications to improve attendance in clinic: systematic review and meta-analysis

    Science.gov (United States)

    Robotham, Dan; Satkunanathan, Safarina; Reynolds, John; Stahl, Daniel; Wykes, Til

    2016-01-01

    Objectives Assess the impact of text-based electronic notifications on improving clinic attendance, in relation to study quality (according to risk of bias), and to assess simple ways in which notifications can be optimised (ie, impact of multiple notifications). Design Systematic review, study quality appraisal assessing risk of bias, data synthesised in meta-analyses. Data sources MEDLINE, EMBASE, PsycINFO, Web of Science and Cochrane Database of Systematic Reviews (01.01.05 until 25.4.15). A systematic search to discover all studies containing quantitative data for synthesis into meta-analyses. Eligibility criteria Studies examining the effect of text-based electronic notifications on prescheduled appointment attendance in healthcare settings. Primary analysis included experimental studies where randomisation was used to define allocation to intervention and where a control group consisting of ‘no reminders’ was used. Secondary meta-analysis included studies comparing text reminders with voice reminders. Studies lacking sufficient information for inclusion (after attempting to contact study authors) were excluded. Outcome measures Primary outcomes were rate of attendance/non-attendance at healthcare appointments. Secondary outcome was rate of rescheduled and cancelled appointments. Results 26 articles were included. 21 included in the primary meta-analysis (8345 patients receiving electronic text notifications, 7731 patients receiving no notifications). Studies were included from Europe (9), Asia (7), Africa (2), Australia (2) and America (1). Patients who received notifications were 23% more likely to attend clinic than those who received no notification (risk ratio=1.23, 67% vs 54%). Those receiving notifications were 25% less likely to ‘no show’ for appointments (risk ratio=.75, 15% vs 21%). Results were similar when accounting for risk of bias, region and publication year. Multiple notifications were significantly more effective at improving
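
    The attendance risk ratios reported above are computed from 2×2 tables of attendance counts; a minimal sketch with a log-scale 95% CI follows. The counts below are hypothetical and merely reproduce proportions similar to the 67% vs 54% reported.

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    # Risk ratio (group A vs group B) with a 95% CI computed on the
    # log scale, the standard approach for pooling binary outcomes.
    p_a, p_b = events_a / n_a, events_b / n_b
    rr = p_a / p_b
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts with attendance proportions like those reported:
rr, lo, hi = risk_ratio(670, 1000, 540, 1000)   # 67% vs 54% attendance
```

    A meta-analysis pools many such per-study ratios (on the log scale) into the single summary figure quoted in the abstract.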

  18. Nonlinear analysis of an improved continuum model considering headway change with memory

    Science.gov (United States)

    Cheng, Rongjun; Wang, Jufeng; Ge, Hongxia; Li, Zhipeng

    2018-01-01

    Considering the effect of headway changes with memory, an improved continuum model of traffic flow is proposed in this paper. By means of linear stability theory, the new model's linear stability condition including the effect of headway changes with memory is obtained. Through nonlinear analysis, the KdV-Burgers equation is derived to describe the propagating behavior of the traffic density wave near the neutral stability line. Numerical simulation is carried out to study the improved traffic flow model, exploring how headway changes with memory affect each car's velocity, density and energy consumption. Numerical results show that when the effect of headway changes with memory is considered, traffic jams can be suppressed efficiently. Furthermore, the results demonstrate that accounting for headway changes with memory avoids the disadvantage of relying on purely historical information, which improves the stability of the traffic flow and minimizes car energy consumption.

  19. An improved spectral homotopy analysis method for solving boundary layer problems

    Directory of Open Access Journals (Sweden)

    Sibanda Precious

    2011-01-01

    Full Text Available Abstract This article presents an improved spectral-homotopy analysis method (ISHAM for solving nonlinear differential equations. The implementation of this new technique is shown by solving the Falkner-Skan and magnetohydrodynamic boundary layer problems. The results obtained are compared to numerical solutions in the literature and MATLAB's bvp4c solver. The results show that the ISHAM converges faster and gives accurate results.

  20. Analysis of After-Sales Service in an Effort to Improve Customer Satisfaction at PT. Nusantara Motor in Balikpapan

    OpenAIRE

    Mursidah

    2014-01-01

    The purpose of this research is to determine the effect of after-sales service on customer satisfaction at PT. Nusantara Motor, while the usefulness of this study is as input for the company in planning and improving its marketing strategy, in particular after-sales service in accordance with customer needs and desires. The analysis tool used is the simple linear regression equation Y = a + bX. The regression coefficient (b) = 0.9874, which means that there is an increase in customer satisfaction influenced by the...
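
    The simple linear regression used in the study fits Y = a + bX by least squares; a minimal sketch with hypothetical service/satisfaction scores (the study's own data are not given in the abstract):

```python
def simple_linear_regression(x, y):
    # Ordinary least squares for y = a + b*x.
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sxy / sxx
    a = ybar - b * xbar
    return a, b

# Hypothetical after-sales-service scores (x) vs. satisfaction scores (y):
a, b = simple_linear_regression([1, 2, 3, 4, 5], [2.0, 2.9, 4.1, 5.0, 6.0])
```

    A positive slope b, as reported in the study, indicates that better after-sales service is associated with higher customer satisfaction.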

  1. Central aortic reservoir-wave analysis improves prediction of cardiovascular events in elderly hypertensives.

    Science.gov (United States)

    Narayan, Om; Davies, Justin E; Hughes, Alun D; Dart, Anthony M; Parker, Kim H; Reid, Christopher; Cameron, James D

    2015-03-01

    Several morphological parameters based on the central aortic pressure waveform are proposed as cardiovascular risk markers, yet no study has definitively demonstrated the incremental value of any waveform parameter in addition to currently accepted biomarkers in elderly, hypertensive patients. The reservoir-wave concept combines elements of wave transmission and Windkessel models of arterial pressure generation, defining an excess pressure superimposed on a background reservoir pressure. The utility of pressure rate constants derived from reservoir-wave analysis in prediction of cardiovascular events is unknown. Carotid blood pressure waveforms were measured prerandomization in a subset of 838 patients in the Second Australian National Blood Pressure Study. Reservoir-wave analysis was performed and indices of arterial function, including the systolic and diastolic rate constants, were derived. Survival analysis was performed to determine the association between reservoir-wave parameters and cardiovascular events. The incremental utility of reservoir-wave parameters in addition to the Framingham Risk Score was assessed. Baseline values of the systolic rate constant were independently predictive of clinical outcome (hazard ratio, 0.33; 95% confidence interval, 0.13-0.82; P=0.016 for fatal and nonfatal stroke and myocardial infarction and hazard ratio, 0.38; 95% confidence interval, 0.20-0.74; P=0.004 for the composite end point, including all cardiovascular events). Addition of this parameter to the Framingham Risk Score was associated with an improvement in predictive accuracy for cardiovascular events as assessed by the integrated discrimination improvement and net reclassification improvement indices. This analysis demonstrates that baseline values of the systolic rate constant predict clinical outcomes in elderly patients with hypertension and incrementally improve prognostication of cardiovascular events. © 2014 American Heart Association, Inc.

  2. Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.

    Science.gov (United States)

    Echinaka, Yuki; Ozeki, Yukiyasu

    2016-10-01

    The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. In applying this method, the bootstrap method is introduced and a numerical criterion for discriminating the transition type is proposed.
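    The core idea of fitting data to a smooth function without assuming a parametric model can be sketched with a simple Nadaraya-Watson kernel smoother (an illustrative stand-in; the paper's Bayesian kernel formulation is more elaborate, and the data below are synthetic):

```python
import math

def kernel_smooth(xs, ys, x, bandwidth=0.3):
    """Nadaraya-Watson kernel regression: estimate f(x) as a
    Gaussian-weighted average of observed (xs, ys) pairs,
    with no parametric model assumed for f."""
    num = den = 0.0
    for xi, yi in zip(xs, ys):
        w = math.exp(-0.5 * ((x - xi) / bandwidth) ** 2)
        num += w * yi
        den += w
    return num / den

# Noisy samples of an unknown smooth function
xs = [i * 0.1 for i in range(60)]
ys = [math.sin(x) + 0.05 * math.cos(17 * x) for x in xs]
print(f"estimate at x=2.0: {kernel_smooth(xs, ys, 2.0):.2f}")
```

The bandwidth controls the bias-variance trade-off; in the Bayesian treatment it is inferred from the data rather than fixed by hand.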

  3. Information Operations Versus Civilian Marketing and Advertising: A Comparative Analysis to Improve IO Planning and Strategy

    Science.gov (United States)

    2008-03-01

    Successful marketing strategy includes the basic “4Ps of marketing”, concepts relevant to IO that reflect known successful marketing practices. (Master's Thesis: Information Operations Versus Civilian Marketing and Advertising: A Comparative Analysis to Improve IO Planning and Strategy, by Dan Chilton, March 2008.)

  4. Use of peers to improve adherence to antiretroviral therapy: a global network meta-analysis.

    Science.gov (United States)

    Kanters, Steve; Park, Jay Jh; Chan, Keith; Ford, Nathan; Forrest, Jamie; Thorlund, Kristian; Nachega, Jean B; Mills, Edward J

    2016-01-01

    It is unclear whether using peers can improve adherence to antiretroviral therapy (ART). To construct the World Health Organization's global guidance on adherence interventions, we conducted a systematic review and network meta-analysis to determine the effectiveness of using peers for achieving adequate adherence and viral suppression. We searched for randomized clinical trials of peer-based interventions to promote adherence to ART in HIV populations. We searched six electronic databases from inception to July 2015 and major conference abstracts within the last three years. We examined the outcomes of adherence and viral suppression among trials done worldwide and those specific to low- and middle-income countries (LMIC) using pairwise and network meta-analyses. Twenty-two trials met the inclusion criteria. We found similar results between pairwise and network meta-analyses, and between the global and LMIC settings. Peer supporter+Telephone was superior to standard-of-care in improving adherence in both the global network (odds ratio [OR]=4.79, 95% credible interval [CrI]: 1.02, 23.57) and the LMIC settings (OR=4.83, 95% CrI: 1.88, 13.55). Peer support alone, however, did not lead to improvement in ART adherence in either setting. For viral suppression, we found no difference in effects among interventions, owing to the limited number of trials. Our analysis showed that peer support leads to modest improvement in adherence. These modest effects may be due to the fact that in many settings, particularly in LMICs, programmes already include peer supporters, adherence clubs and family disclosures for treatment support. Rather than introducing new interventions, a focus on improving the quality of delivery of existing services may be a more practical and effective way to improve adherence to ART.
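    The odds ratios above come from pooling trial-level effects. A minimal sketch of the inverse-variance, fixed-effect pooling that underlies a pairwise meta-analysis of odds ratios (the trial counts below are hypothetical, not from the review):

```python
import math

def pool_log_odds_ratios(trials):
    """Fixed-effect inverse-variance pooling of per-trial odds ratios.
    Each trial is (events_tx, n_tx, events_ctrl, n_ctrl)."""
    num = den = 0.0
    for a, n1, c, n2 in trials:
        b, d = n1 - a, n2 - c
        log_or = math.log((a * d) / (b * c))
        var = 1/a + 1/b + 1/c + 1/d        # variance of the log-OR
        w = 1 / var                        # inverse-variance weight
        num += w * log_or
        den += w
    pooled, se = num / den, math.sqrt(1 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# Hypothetical (adherent, total) counts per arm in three trials
trials = [(45, 60, 30, 60), (80, 100, 60, 100), (50, 70, 40, 70)]
or_, lo, hi = pool_log_odds_ratios(trials)
print(f"Pooled OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A network meta-analysis generalizes this by combining direct and indirect comparisons across a network of interventions, typically in a Bayesian framework that yields credible rather than confidence intervals.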

  5. Transcriptome Analysis of Maize Immature Embryos Reveals the Roles of Cysteine in Improving Agrobacterium Infection Efficiency

    Science.gov (United States)

    Liu, Yan; Zhang, Zhiqiang; Fu, Junjie; Wang, Guoying; Wang, Jianhua; Liu, Yunjun

    2017-01-01

    Maize Agrobacterium-mediated transformation efficiency has been greatly improved in recent years. Antioxidants such as cysteine can significantly improve maize transformation frequency by improving the Agrobacterium infection efficiency. However, the mechanism underlying the transformation improvement after cysteine exposure has not been elucidated. In this study, we showed that the addition of cysteine to the co-cultivation medium significantly increased the Agrobacterium infection efficiency of hybrid HiII and inbred line Z31 maize embryos. Reactive oxygen species contents were higher in embryos treated with cysteine than in those without cysteine. We further investigated the mechanism behind the cysteine-related increase in infection efficiency using transcriptome analysis. The results showed that the cysteine treatment up-regulated 939 genes and down-regulated 549 genes in both Z31 and HiII. Additionally, more differentially expressed genes were found in HiII embryos than in Z31 embryos, suggesting that HiII was more sensitive to the cysteine treatment than Z31. GO analysis showed that the up-regulated genes were mainly involved in the oxidation-reduction process. The up-regulation of these genes could help maize embryos cope with the oxidative stress stimulated by Agrobacterium infection. The down-regulated genes were mainly involved in cell wall and membrane metabolism, such as aquaporin and expansin genes. Decreased expression of these cell wall integrity genes could loosen the cell wall, thereby improving the entry of Agrobacterium into plant cells. This study offers insight into the role of cysteine in improving Agrobacterium-mediated transformation of maize immature embryos. PMID:29089955

  6. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been improved in the MARS 1.3 code since the release of MARS 1.3 in July 1998. The new features include: - implementation of point kinetics model into the 3D module - unification of the heat structure model - extension of the control function to the 3D module variables - improvement of the 3D module input check function. Each of the items has been implemented in the developmental version of the MARS 1.3.1 code and, then, independently verified and assessed. The effectiveness of the new features is well verified and it is shown that these improvements greatly extend the code capability and enhance the user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code developmental activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs

  7. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been improved in the MARS 1.3 code since the release of MARS 1.3 in July 1998. The new features include: - implementation of point kinetics model into the 3D module - unification of the heat structure model - extension of the control function to the 3D module variables - improvement of the 3D module input check function. Each of the items has been implemented in the developmental version of the MARS 1.3.1 code and, then, independently verified and assessed. The effectiveness of the new features is well verified and it is shown that these improvements greatly extend the code capability and enhance the user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code developmental activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs.

  8. Improving SFR Economics through Innovations from Thermal Design and Analysis Aspects

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Hongbin Zhang; Vincent Mousseau; Per F. Peterson

    2008-06-01

    Achieving economic competitiveness as compared to LWRs and other Generation IV (Gen-IV) reactors is one of the major requirements for large-scale investment in commercial sodium cooled fast reactor (SFR) power plants. Advances in R&D for advanced SFR fuel and structural materials provide key long-term opportunities to improve SFR economics. In addition, other new opportunities are emerging to further improve SFR economics. This paper provides an overview of potential ideas from the perspective of thermal hydraulics to improve SFR economics. These include a new hybrid loop-pool reactor design to further optimize the economics, safety, and reliability of SFRs with more flexibility, a multiple-reheat and intercooling helium Brayton cycle to improve plant thermal efficiency and reduce safety-related overnight and operating costs, and modern multi-physics thermal analysis methods to reduce analysis uncertainties and the associated requirements for over-conservatism in reactor design. This paper reviews advances in all three of these areas and their potential beneficial impacts on SFR economics.

  9. Improved Proteomic Analysis Following Trichloroacetic Acid Extraction of Bacillus anthracis Spore Proteins

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Brooke LD; Wunschel, David S.; Sydor, Michael A.; Warner, Marvin G.; Wahl, Karen L.; Hutchison, Janine R.

    2015-08-07

    Proteomic analysis of bacterial samples provides valuable information about cellular responses and functions under different environmental pressures. Proteomic analysis is dependent upon efficient extraction of proteins from bacterial samples without introducing bias toward extraction of particular protein classes. While no single method can recover 100% of the bacterial proteins, selected protocols can improve overall protein isolation, peptide recovery, or enrich for certain classes of proteins. The method presented here is technically simple and does not require specialized equipment such as a mechanical disrupter. Our data reveal that for particularly challenging samples, such as B. anthracis Sterne spores, trichloroacetic acid extraction improved the number of proteins identified within a sample compared to bead beating (714 vs 660, respectively). Further, TCA extraction enriched for 103 known spore-specific proteins, whereas bead beating resulted in 49 unique proteins. Analysis of C. botulinum samples grown for 5 days, composed of vegetative biomass and spores, showed a similar trend, with improved protein yields and identification using our method compared to bead beating. Interestingly, easily lysed samples, such as B. anthracis vegetative cells, were processed equally effectively via TCA and bead beating, but TCA extraction remains the easiest and most cost-effective option. As with all assays, supplemental methods such as implementation of an alternative preparation method may provide additional insight into the protein biology of the bacteria being studied.

  10. A Lean Six Sigma approach to the improvement of the selenium analysis method

    Directory of Open Access Journals (Sweden)

    Bronwyn C. Cloete

    2012-11-01

    Full Text Available Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive and systematic, and that relies on data and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any

  11. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    Science.gov (United States)

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive and systematic, and that relies on data and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and

  12. A Lean Six Sigma approach to the improvement of the selenium analysis method

    Directory of Open Access Journals (Sweden)

    Bronwyn C. Cloete

    2012-02-01

    Full Text Available Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive and systematic, and that relies on data and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any

  13. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement.

    Science.gov (United States)

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna; Nyström, Monica Elisabeth

    2016-07-29

    Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices in order to understand when and how kaizen is used in healthcare. We analysed 186 structured kaizen documents containing improvement suggestions that were produced by 165 employees at a Swedish hospital. Directed content analysis was used to categorise the suggestions into the following categories: type of situation (proactive or reactive) triggering an action; type of process addressed (technical/administrative, support and clinical); complexity level (simple or complex); and type of outcomes aimed for (operational or sociotechnical). Compliance with the kaizen template was calculated. 72% of the improvement suggestions were reactions to a perceived problem. Support, technical and administrative, and primary clinical processes were involved in 47%, 38% and 16% of the suggestions, respectively. The majority of the kaizen documents addressed simple situations and focused on operational outcomes. The degree of compliance with the kaizen template was high for several items concerning the identification of problems and the proposed solutions, and low for items related to the testing and implementation of solutions. There is a need to combine kaizen practices with improvement and innovation practices that help staff and managers to address complex issues, such as the improvement of clinical care processes. The limited focus on sociotechnical aspects and the partial compliance with kaizen templates may indicate a limited understanding of the entire kaizen process and of how it relates to the overall organisational goals. This in turn can hamper the sustainability of kaizen practices and results. Published by the BMJ Publishing Group Limited. For

  14. Application of exergy analysis for improving energy efficiency of natural gas liquids recovery processes

    International Nuclear Information System (INIS)

    Shin, Jihoon; Yoon, Sekwang; Kim, Jin-Kuk

    2015-01-01

    A thermodynamic analysis and optimization method is applied to provide design guidelines for improving the energy efficiency and cost-effectiveness of natural gas liquids recovery processes. Exergy analysis is adopted in this study as a thermodynamic tool to evaluate the loss of exergy associated with irreversibility in natural gas liquids recovery processes, from which a conceptual understanding of inefficient design features or equipment can be obtained. Natural gas liquids processes are modeled and simulated within the UniSim® simulator, from which detailed thermodynamic information is obtained for calculating exergy loss. The optimization framework is developed by minimizing overall exergy loss, as an objective function, subject to product specifications and engineering constraints. The optimization is carried out within MATLAB® with the aid of a stochastic solver based on genetic algorithms. The process simulator is linked and interacted with the optimization solver, through which optimal operating conditions can be determined. A case study is presented to illustrate the benefit of using exergy analysis for the design and optimization of natural gas liquids processes and to demonstrate the applicability of the design method proposed in this paper. - Highlights: • Application of exergy analysis for natural gas liquids (NGL) recovery processes. • Minimization of exergy loss for improving energy efficiency. • A systematic optimization framework for the design of NGL recovery processes
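    The exergy loss minimized above is computed from stream enthalpies and entropies relative to a dead state. A minimal sketch of the specific flow-exergy bookkeeping across a single process unit (the property values below are hypothetical, not from the paper's UniSim models):

```python
def flow_exergy(h, s, h0, s0, t0):
    """Specific flow exergy relative to the dead state (h0, s0, t0):
    ex = (h - h0) - T0*(s - s0), in kJ/kg for h in kJ/kg and
    s in kJ/(kg*K)."""
    return (h - h0) - t0 * (s - s0)

# Hypothetical stream properties against a 298.15 K dead state
ex_in  = flow_exergy(h=900.0, s=2.50, h0=300.0, s0=1.00, t0=298.15)
ex_out = flow_exergy(h=850.0, s=2.55, h0=300.0, s0=1.00, t0=298.15)
print(f"exergy destroyed across unit = {ex_in - ex_out:.1f} kJ/kg")
```

Summing such losses over every unit identifies where irreversibility concentrates, which is the conceptual basis of the design guidance described in the abstract.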

  15. Does Flywheel Paradigm Training Improve Muscle Volume and Force? A Meta-Analysis.

    Science.gov (United States)

    Nuñez Sanchez, Francisco J; Sáez de Villarreal, Eduardo

    2017-11-01

    Núñez Sanchez, FJ and Sáez de Villarreal, E. Does flywheel paradigm training improve muscle volume and force? A meta-analysis. J Strength Cond Res 31(11): 3177-3186, 2017. Several studies have confirmed the efficacy of flywheel paradigm training for improving muscle volume and force. A meta-analysis of 13 studies with a total of 18 effect sizes was performed to analyse the role of various factors in the effectiveness of flywheel paradigm training. The following inclusion criteria were employed for the analysis: (a) randomized studies; (b) high validity and reliability of instruments; (c) publication in a high-quality peer-reviewed journal; (d) healthy participants; (e) studies in which the eccentric programme was described; and (f) studies in which increases in muscle volume and force were measured before and after training. Increases in muscle volume and force were noted with the use of flywheel systems over short periods of training. The increase in muscle mass appears not to have been influenced by the presence of eccentric overload during the exercise, whereas the increase in force was significantly higher when eccentric overload was present. These findings are essential and should be considered by strength and conditioning professionals when selecting the most appropriate dose-response trends for flywheel paradigm systems to optimize the increase in muscle volume and force.
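    The "effect sizes" pooled in training meta-analyses like this one are typically standardized mean differences. A sketch of Hedges' g with its small-sample correction (the group means, SDs, and sample sizes below are hypothetical, not values from the included studies):

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Hedges' g): Cohen's d computed
    with the pooled SD, multiplied by a small-sample correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd        # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)        # small-sample correction
    return d * j

# Hypothetical strength gains (kg) in flywheel vs control groups
g = hedges_g(mean1=12.0, sd1=4.0, n1=15, mean2=8.0, sd2=4.5, n2=15)
print(f"Hedges' g = {g:.2f}")
```

By the usual conventions, g around 0.2 is a small effect, 0.5 medium, and 0.8 or more large.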

  16. Assessment and improvement of the Plasmodium yoelii yoelii genome annotation through comparative analysis.

    Science.gov (United States)

    Vaughan, Ashley; Chiu, Sum-Ying; Ramasamy, Gowthaman; Li, Ling; Gardner, Malcolm J; Tarun, Alice S; Kappe, Stefan H I; Peng, Xinxia

    2008-07-01

    The sequencing of the Plasmodium yoelii genome, a model rodent malaria parasite, has greatly facilitated research for the development of new drug and vaccine candidates against malaria. Unfortunately, only preliminary gene models were annotated on the partially sequenced genome, mostly by in silico gene prediction, and there has been no major improvement of the annotation since 2002. Here we report on a systematic assessment of the accuracy of the genome annotation based on a detailed analysis of a comprehensive set of cDNA sequences and proteomics data. We found that the coverage of the current annotation tends to be biased toward genes expressed in the blood stages of the parasite life cycle. Based on our proteomic analysis, we estimate that about 15% of the liver stage proteome data we have generated is absent from the current annotation. Through comparative analysis we identified and manually curated a further 510 P. yoelii genes which have clear orthologs in the P. falciparum genome, but were not present or incorrectly annotated in the current annotation. This study suggests that improvements of the current P. yoelii genome annotation should focus on genes expressed in stages other than blood stages. Comparative analysis will be critically helpful for this re-annotation. The addition of newly annotated genes will facilitate the use of P. yoelii as a model system for studying human malaria. Supplementary data are available at Bioinformatics online.

  17. Improved method for minimizing sulfur loss in analysis of particulate organic sulfur.

    Science.gov (United States)

    Park, Ki-Tae; Lee, Kitack; Shin, Kyoungsoon; Jeong, Hae Jin; Kim, Kwang Young

    2014-02-04

    The global sulfur cycle depends primarily on the metabolism of marine microorganisms, which release sulfur gas into the atmosphere and thus affect the redistribution of sulfur globally as well as the earth's climate system. To better quantify sulfur release from the ocean, analysis of the production and distribution of organic sulfur in the ocean is necessary. This report describes a wet-based method for accurate analysis of particulate organic sulfur (POS) in the marine environment. The proposed method overcomes the considerable loss of sulfur (up to 80%) that occurs during analysis using conventional methods involving drying. Use of the wet-based POS extraction procedure in conjunction with a sensitive sulfur analyzer enabled accurate measurements of cellular POS. Data obtained using this method will enable accurate assessment of how rapidly sulfur can transfer among pools. Such information will improve understanding of the role of POS in the oceanic sulfur cycle.

  18. Applying transactional analysis and personality assessment to improve patient counseling and communication skills.

    Science.gov (United States)

    Lawrence, Lesa

    2007-08-15

    To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients.

  19. Improving the effectiveness of FMEA analysis in automotive – a case study

    Directory of Open Access Journals (Sweden)

    Ványi Gábor

    2016-06-01

    Full Text Available Many industries, for example automotive, have well-defined product development process definitions and risk evaluation methods. The FMEA (Failure Mode and Effects Analysis) is a first-line risk analysis method in design, which has been implemented in development and production for decades. Although the first applications focused on mechanical and electrical design and functionalities, today software components are implemented in many modern vehicle systems. However, standards and industry-specific associations do not specify any “best practice” for designing the interactions of multiple entities in one model. This case study focuses on modelling interconnections and on the improvement of the FMEA modelling process in the automotive industry. Selecting and grouping software components for the analysis is discussed, but software architecture design patterns are excluded from the study.

  20. Efficient Fault Localization and Failure Analysis Techniques for Improving IC Yield

    Directory of Open Access Journals (Sweden)

    Ankush Oberai

    2018-02-01

    Full Text Available With the increase in the complexity of semiconductor device processes and the challenge of satisfying high market demand, yield enhancement has become a crucial factor. Discovering and reacting to yield problems that emerge at the end of the production line may cause unbearable yield loss, leading to longer time to market. The time and cost involved in fault isolation may therefore be significantly shortened by effectively utilizing fault diagnosis technology to support yield improvement. Hence, for yield analysis, a highly integrated data network with software analysis tools has been established to reduce fault analysis time. Synopsys Avalon, a product used for fault localization that aids in achieving better integrated circuit yields, is described in this paper. The paper also illustrates various fault localization techniques for faster problem identification and discusses analytical tools such as the photon emission microscope and the transmission electron microscope for faster determination of device failures.

  1. Combined analysis of cortical (EEG) and nerve stump signals improves robotic hand control.

    Science.gov (United States)

    Tombini, Mario; Rigosa, Jacopo; Zappasodi, Filippo; Porcaro, Camillo; Citi, Luca; Carpaneto, Jacopo; Rossini, Paolo Maria; Micera, Silvestro

    2012-01-01

    Interfacing an amputee's upper-extremity stump nerves to control a robotic hand requires training of the individual and algorithms to process interactions between cortical and peripheral signals. The aim was to evaluate, for the first time, whether EEG-driven analysis of peripheral neural signals recorded as an amputee practices could improve the classification of motor commands. Four thin-film longitudinal intrafascicular electrodes (tf-LIFEs-4) were implanted in the median and ulnar nerves of the stump in the distal upper arm for 4 weeks. Artificial intelligence classifiers were implemented to analyze LIFE signals recorded while the participant tried to perform 3 different hand and finger movements as pictures representing these tasks were randomly presented on a screen. In the final week, the participant was trained to perform the same movements with a robotic hand prosthesis through modulation of tf-LIFE-4 signals. To improve the classification performance, an event-related desynchronization/synchronization (ERD/ERS) procedure was applied to EEG data to identify the exact timing of each motor command. Real-time control of neural (motor) output was achieved by the participant. By focusing electroneurographic (ENG) signal analysis in an EEG-driven time window, movement classification performance improved. After training, the participant regained normal modulation of background rhythms for movement preparation (α/β band desynchronization) in the sensorimotor area contralateral to the missing limb. Moreover, coherence analysis found a restored α band synchronization of the Rolandic area with frontal and parietal ipsilateral regions, similar to that observed in the opposite hemisphere for movement of the intact hand. Of note, phantom limb pain (PLP) resolved for several months. Combining information from both cortical (EEG) and stump nerve (ENG) signals improved the classification performance compared with tf-LIFE signal processing alone; training led to cortical reorganization and
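    The ERD/ERS measure used to time each motor command is conventionally expressed as the percent change in band power relative to a pre-movement baseline. A minimal sketch (the power values are hypothetical, not measurements from the study):

```python
def erd_percent(p_baseline, p_event):
    """Event-related desynchronization as the percent drop in band
    power relative to baseline; negative values indicate
    event-related synchronization (ERS)."""
    return 100.0 * (p_baseline - p_event) / p_baseline

# Hypothetical alpha-band power before vs during a movement attempt
print(f"ERD = {erd_percent(12.0, 7.5):.1f}%")
```

A sustained positive ERD in the α/β band over sensorimotor cortex marks movement preparation, which is what defined the EEG-driven analysis window for the ENG classifier.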

  2. CAT 2 - An improved version of Cryogenic Analysis Tools for online and offline monitoring and analysis of large size cryostats

    Science.gov (United States)

    Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.

    2017-02-01

CAT, Cryogenic Analysis Tools, is a software package developed in the LabVIEW and ROOT environments to analyze the performance of large size cryostats, where many parameters, inputs, and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New Graphical User Interfaces have been developed to make the use of the full package more user-friendly, and a process of resource optimization has been carried out. The offline analysis of the full cryostat performance is available both through the ROOT command-line interface and through the new graphical interfaces.

  3. Gene set analysis: limitations in popular existing methods and proposed improvements.

    Science.gov (United States)

    Mishra, Pashupati; Törönen, Petri; Leino, Yrjö; Holm, Liisa

    2014-10-01

Gene set analysis is the analysis of a set of genes that collectively contribute to a biological process. Most popular gene set analysis methods are based on empirical P-values, which require a large number of permutations. Despite the numerous gene set analysis methods developed in the past decade, the most popular methods still suffer from serious limitations. We present a gene set analysis method (mGSZ) based on the Gene Set Z-scoring function (GSZ) and asymptotic P-values. Asymptotic P-value calculation requires fewer permutations, and thus speeds up the gene set analysis process. We compare the GSZ scoring function with seven popular gene set scoring functions and show that GSZ stands out as the best scoring function. In addition, we show improved performance of the GSA method when the max-mean statistic is replaced by the GSZ scoring function. We demonstrate the importance of both gene and sample permutations by showing the consequences of the absence of one or the other. A comparison of asymptotic and empirical methods of P-value estimation demonstrates a clear advantage of asymptotic P-values over empirical P-values. We show that mGSZ outperforms the state-of-the-art methods based on two different evaluations. We compared mGSZ results with permutation and rotation tests and show that rotation does not improve our asymptotic P-values. We also propose well-known asymptotic distribution models for three of the compared methods. mGSZ is available as an R package from cran.r-project.org.
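
The speed argument for asymptotic P-values can be illustrated with a toy statistic: a closed-form tail probability replaces thousands of label shuffles. The sketch below is not the GSZ function itself; `mean_diff_z` is a hypothetical stand-in and the data are synthetic:

```python
import math
import numpy as np

def asymptotic_p(z):
    """Two-sided p-value from the standard normal tail: no permutations needed."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def empirical_p(stat, x, labels, n_perm=1000, seed=0):
    """Permutation p-value: re-evaluates the statistic under shuffled labels."""
    rng = np.random.default_rng(seed)
    obs = abs(stat(x, labels))
    hits = sum(abs(stat(x, rng.permutation(labels))) >= obs for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

def mean_diff_z(x, labels):
    """Standardized two-group mean difference (a stand-in for the GSZ score)."""
    a, b = x[labels == 0], x[labels == 1]
    se = math.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (a.mean() - b.mean()) / se

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.0, 1.0, 50)])
labels = np.array([0] * 50 + [1] * 50)
z = mean_diff_z(x, labels)
print(asymptotic_p(z), empirical_p(mean_diff_z, x, labels))
```

Note that the empirical estimate can never go below 1/(n_perm + 1), whereas the asymptotic value has no such floor; this is one reason asymptotic P-values help when many gene sets are tested.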

  4. Improving Production of Treated and Untreated Verbs in Aphasia: A Meta-Analysis.

    Science.gov (United States)

    de Aguiar, Vânia; Bastiaanse, Roelien; Miceli, Gabriele

    2016-01-01

Background: Demographic and clinical predictors of aphasia recovery have been identified in the literature. However, little attention has been devoted to identifying and distinguishing predictors of improvement for different outcomes, e.g., production of treated vs. untreated materials. These outcomes may rely on different mechanisms, and therefore be predicted by different variables. Furthermore, treatment features are not typically accounted for when studying predictors of aphasia recovery. This is partly due to the small numbers of cases reported in studies, but also to limitations of the data analysis techniques usually employed. Method: We reviewed the literature on predictors of aphasia recovery, and conducted a meta-analysis of single-case studies designed to assess the efficacy of treatments for verb production. The contribution of demographic, clinical, and treatment-related variables was assessed by means of Random Forests (a machine-learning technique used in classification and regression). Two outcomes were investigated: production of treated (for 142 patients) and untreated verbs (for 166 patients). Results: Improved production of treated verbs was predicted by a three-way interaction of pre-treatment scores on tests for verb comprehension and word repetition, and the frequency of treatment sessions. Improvement in production of untreated verbs was predicted by an interaction including the use of morphological cues, presence of grammatical impairment, pre-treatment scores on a test for noun comprehension, and frequency of treatment sessions. Conclusion: Improvement in the production of treated verbs occurs frequently. It may depend on restoring access to and/or knowledge of lexeme representations, and requires relative sparing of semantic knowledge (as measured by verb comprehension) and phonological output abilities (including working memory, as measured by word repetition). Improvement in the production of untreated verbs has not been reported very often.

  5. Improving production of treated and untreated verbs in aphasia: A meta-analysis

    Directory of Open Access Journals (Sweden)

    Vânia de Aguiar

    2016-09-01

Full Text Available BACKGROUND. Demographic and clinical predictors of aphasia recovery have been identified in the literature. However, little attention has been devoted to identifying and distinguishing predictors of improvement for different outcomes, e.g., production of treated vs. untreated materials. These outcomes may rely on different mechanisms, and therefore be predicted by different variables. Furthermore, treatment features are not typically accounted for when studying predictors of aphasia recovery. This is partly due to the small numbers of cases reported in studies, but also to limitations of the data analysis techniques usually employed. METHOD. We reviewed the literature on predictors of aphasia recovery, and conducted a meta-analysis of single-case studies designed to assess the efficacy of treatments for verb production. The contribution of demographic, clinical, and treatment-related variables was assessed by means of Random Forests (a machine-learning technique used in classification and regression). Two outcomes were investigated: production of treated (for 142 patients) and untreated verbs (for 166 patients). RESULTS. Improved production of treated verbs was predicted by a three-way interaction of pre-treatment scores on tests for verb comprehension and word repetition, and the frequency of treatment sessions. Improvement in production of untreated verbs was predicted by an interaction including the use of morphological cues, presence of grammatical impairment, pre-treatment scores on a test for noun comprehension, and frequency of treatment sessions. CONCLUSION. Improvement in the production of treated verbs occurs frequently. It may depend on restoring access to and/or knowledge of lexeme representations, and requires relative sparing of semantic knowledge (as measured by verb comprehension) and phonological output abilities (including working memory, as measured by word repetition). Improvement in the production of untreated verbs has not been

  6. Aerobic Exercise Improves Cognitive Functioning in People With Schizophrenia: A Systematic Review and Meta-Analysis

    Science.gov (United States)

    Stubbs, Brendon; Rosenbaum, Simon; Vancampfort, Davy; Malchow, Berend; Schuch, Felipe; Elliott, Rebecca; Nuechterlein, Keith H.; Yung, Alison R.

    2017-01-01

Abstract Cognitive deficits are pervasive among people with schizophrenia and treatment options are limited. There has been an increased interest in the neurocognitive benefits of exercise, but a comprehensive evaluation of studies to date is lacking. We therefore conducted a meta-analysis of all controlled trials investigating the cognitive outcomes of exercise interventions in schizophrenia. Studies were identified from a systematic search across major electronic databases from inception to April 2016. Meta-analyses were used to calculate pooled effect sizes (Hedges g) and 95% CIs. We identified 10 eligible trials with cognitive outcome data for 385 patients with schizophrenia. Exercise significantly improved global cognition (g = 0.33, 95% CI = 0.13–0.53, P = .001) with no statistical heterogeneity (I2 = 0%). The effect size in the 7 studies which were randomized controlled trials was g = 0.43 (P < …). Greater dosages of exercise are associated with larger improvements in global cognition (β = .005, P = .065). Interventions which were supervised by physical activity professionals were also more effective (g = 0.47, P < …). Exercise significantly improved the cognitive domains of working memory (g = 0.39, P = .024, N = 7, n = 282), social cognition (g = 0.71, P = .002, N = 3, n = 81), and attention/vigilance (g = 0.66, P = .005, N = 3, n = 104). Effects on processing speed, verbal memory, visual memory and reasoning and problem solving were not significant. This meta-analysis provides evidence that exercise can improve cognitive functioning among people with schizophrenia, particularly from interventions using higher dosages of exercise. Given the challenges in improving cognition, and the wider health benefits of exercise, a greater focus on providing supervised exercise to people with schizophrenia is needed. PMID:27521348
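
The pooled effect sizes quoted above are Hedges g values combined across trials by inverse-variance weighting. A minimal sketch of that arithmetic (the trial summaries are invented, and the paper's exact pooling model is not reproduced here):

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges g: Cohen's d with the small-sample bias correction J."""
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    return (1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)) * d

def pooled_effect(gs, ns):
    """Fixed-effect pooled g with an approximate inverse-variance weighting."""
    ws = []
    for g, (n1, n2) in zip(gs, ns):
        var = (n1 + n2) / (n1 * n2) + g ** 2 / (2.0 * (n1 + n2))
        ws.append(1.0 / var)
    g_bar = sum(w * g for w, g in zip(ws, gs)) / sum(ws)
    se = math.sqrt(1.0 / sum(ws))
    return g_bar, (g_bar - 1.96 * se, g_bar + 1.96 * se)

# Hypothetical trial summaries (means, SDs, and group sizes are invented):
gs = [hedges_g(1.2, 1.0, 20, 0.8, 1.0, 20), hedges_g(0.9, 0.9, 30, 0.6, 1.1, 30)]
print(pooled_effect(gs, [(20, 20), (30, 30)]))
```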

  7. Recent Improvements at CEA on Trace Analysis of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    Pointurier, F.; Hubert, A.; Faure, A.L.; Pottin, A.C.; Mourier, W.; Marie, O.

    2010-01-01

In this paper, we present some results of R and D work conducted at CEA to improve, on the one hand, the performance of the techniques already in use for detection of undeclared activities, and on the other hand to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring information complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences including mercury, lead or iridium atoms are in some cases necessary. Efforts must be put into improving the purification procedure. Micro-Raman spectrometry allows the chemical composition of uranium compounds to be determined at the scale of the microscopic object, using a pre-location of the particles by SEM and a relocation of these particles by mathematical calculations. However, particles below 5 μm are hardly relocated, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although precision of isotopic ratio measurements is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high, and more developments are in progress to validate and improve this method. (author)

  8. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    VINCENT, ANDREW

    2005-01-01

Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues concerning the use of software in Department of Energy (DOE) facilities for analyzing hazards, and for designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments and, in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve the resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan for Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analyses by precluding inappropriate software applications and by utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standards-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  9. Improving defect visibility in square pulse thermography of metallic components using correlation analysis

    Science.gov (United States)

    Xu, Changhang; Xie, Jing; Huang, Weiping; Chen, Guoming; Gong, Xumei

    2018-03-01

Infrared (IR) thermography has gained wide applications as an important non-destructive testing (NDT) technique. Improving defect visibility is critical to achieving an accurate detection result through IR thermography. In this study, we propose a novel approach to improving defect visibility in square pulse thermography (SPT) of metallic components. In the proposed approach, the correlation function of contrast (CFC) is defined for the first time. Based on the theories of heat conduction and of correlation analysis, the differences of CFC between defects and sound regions are determined. We found that the peak lag time of the CFC is an effective feature for discriminating defects and sound regions in SPT. A new image is then constructed using the peak lag time of the CFC to improve defect visibility. To verify the efficiency of the proposed approach, an experiment was conducted on a steel specimen, and principal component analysis (PCA) and the presented approach were compared. The results show that through the proposed approach, defects in metallic components can be indicated more clearly and detected more accurately.
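
The peak lag time of a correlation function can be extracted with a plain cross-correlation. The sketch below uses synthetic contrast curves and only illustrates the idea of a lag feature; it is not the paper's CFC definition:

```python
import numpy as np

def peak_lag(contrast, reference, dt):
    """Lag (seconds) at which the cross-correlation between a pixel's thermal
    contrast curve and a reference curve reaches its maximum."""
    c = contrast - contrast.mean()
    r = reference - reference.mean()
    xc = np.correlate(c, r, mode="full")
    return (np.argmax(xc) - (len(r) - 1)) * dt

# Toy contrast curves: the defect pixel's response peaks 0.5 s later.
dt = 0.05
t = np.arange(0, 10, dt)
reference = np.exp(-((t - 2.5) / 0.8) ** 2)   # sound-region contrast
defect = np.exp(-((t - 3.0) / 0.8) ** 2)      # same shape, delayed by 0.5 s
print(peak_lag(defect, reference, dt))
```

Mapping this lag pixel by pixel yields an image in which delayed (defect) regions stand out from sound regions.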

10. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)

    2012-01-01

    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA generated particles in an ICP is studied using a high-speed, high frame rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work for ICP torch developments specifically tailored for the improvement of LA sample introduction are presented. An abnormal scarcity of metal-argon polyatomic ions (MAr{sup +}) is observed during ICP-MS analysis. Evidence shows that MAr{sup +} ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  11. Cross-platform analysis of cancer microarray data improves gene expression based classification of phenotypes

    Directory of Open Access Journals (Sweden)

    Eils Roland

    2005-11-01

Full Text Available Abstract Background The extensive use of DNA microarray technology in the characterization of the cell transcriptome is leading to an ever-increasing amount of microarray data from cancer studies. Although similar questions for the same type of cancer are addressed in these different studies, a comparative analysis of their results is hampered by the use of heterogeneous microarray platforms and analysis methods. Results In contrast to a meta-analysis approach where results of different studies are combined on an interpretative level, we investigate here how to directly integrate raw microarray data from different studies for the purpose of supervised classification analysis. We use median rank scores and quantile discretization to derive numerically comparable measures of gene expression from different platforms. These transformed data are then used for training of classifiers based on support vector machines. We apply this approach to six publicly available cancer microarray gene expression data sets, which consist of three pairs of studies, each examining the same type of cancer, i.e. breast cancer, prostate cancer or acute myeloid leukemia. For each pair, one study was performed by means of cDNA microarrays and the other by means of oligonucleotide microarrays. In each pair, high classification accuracies (> 85%) were achieved with training and testing on data instances randomly chosen from both data sets in a cross-validation analysis. To exemplify the potential of this cross-platform classification analysis, we use two leukemia microarray data sets to show that important genes with regard to the biology of leukemia are selected in an integrated analysis, which are missed in either single-set analysis. Conclusion Cross-platform classification of multiple cancer microarray data sets yields discriminative gene expression signatures that are found and validated on a large number of microarray samples, generated by different laboratories and
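
The two transformations named above, median rank scores and quantile discretization, can be sketched as follows (toy matrices; details of the published procedures may differ):

```python
import numpy as np

def median_rank_scores(study, reference):
    """Replace each sample's values by the reference study's sorted median
    profile, indexed by within-sample rank (rows = genes, columns = samples;
    assumes both matrices are row-aligned on the same genes)."""
    ref_sorted = np.sort(np.median(reference, axis=1))
    out = np.empty(study.shape, dtype=float)
    for j in range(study.shape[1]):
        ranks = np.argsort(np.argsort(study[:, j]))
        out[:, j] = ref_sorted[ranks]
    return out

def quantile_discretize(x, n_bins=8):
    """Bin each sample's values by its own quantiles (a platform-free scale)."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1], axis=0)
    return np.stack([np.digitize(x[:, j], edges[:, j])
                     for j in range(x.shape[1])], axis=1)

rng = np.random.default_rng(0)
cdna = rng.normal(size=(100, 5))             # study A, one platform
oligo = rng.gamma(2.0, 1.0, size=(100, 8))   # study B, different platform/scale
mapped = median_rank_scores(oligo, cdna)     # now numerically comparable to cdna
print(mapped.shape, quantile_discretize(cdna, 4).max())
```

After either transformation, samples from both platforms live on a common scale and can be pooled to train a single classifier.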

  12. Improving energy productivity in paddy production through benchmarking-An application of data envelopment analysis

    International Nuclear Information System (INIS)

    Chauhan, Narvendra Singh; Mohapatra, Pratap K.J.; Pandey, Keshaw Prasad

    2006-01-01

In this study, a data envelopment analysis approach has been used to determine the efficiencies of farmers with regard to energy use in rice production activities in the alluvial zone of the state of West Bengal in India. The study has helped to segregate efficient farmers from inefficient ones, identify wasteful uses of energy from different sources by inefficient farmers, and suggest reasonable savings in energy use from different sources. The methods of the cross-efficiency matrix and the distribution of virtual inputs are used to gain insights into the performance of individual farmers, rank efficient farmers, and identify the improved operating practices followed by a group of truly efficient farmers. The results reveal that, on average, about 11.6% of the total input energy could be saved if the farmers follow the input package recommended by the study. The study also suggests that better use of power tillers and the introduction of improved machinery would improve the efficiency of energy use and thereby the energy productivity of the rice production system in the zone.
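
A full DEA model solves a linear program per farm over all inputs; in the special single-input, single-output case it collapses to a normalized productivity ratio, which is enough to illustrate the efficiency scores and energy-saving figures discussed above. A toy sketch with invented farm data:

```python
import numpy as np

def dea_efficiency_1in_1out(energy_in, output):
    """CCR (constant returns to scale) DEA efficiency for the special case of a
    single input and a single output, where it reduces to each unit's
    productivity divided by the best observed productivity."""
    productivity = output / energy_in
    return productivity / productivity.max()

# Hypothetical farms: total input energy (MJ/ha) and paddy yield (kg/ha).
energy = np.array([14000.0, 16500.0, 15200.0, 18000.0])
yield_ = np.array([5200.0, 5400.0, 5600.0, 5300.0])
eff = dea_efficiency_1in_1out(energy, yield_)
savings = (1.0 - eff) * energy  # energy each farm could shed at its output level
print(eff.round(3), savings.round(0))
```

With several energy sources (diesel, electricity, fertilizer, labour), each becomes a separate input and the efficiency of each farm is found by linear programming instead of a simple ratio.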

  13. [Failure mode and effects analysis to improve quality in clinical trials].

    Science.gov (United States)

    Mañes-Sevilla, M; Marzal-Alfaro, M B; Romero Jiménez, R; Herranz-Alonso, A; Sanchez Fresneda, M N; Benedi Gonzalez, J; Sanjurjo-Sáez, M

    2018-02-15

The failure mode and effects analysis (FMEA) has been used as a tool in risk management and quality improvement. The objective of this study is to identify the weaknesses in processes in the clinical trials area of a Pharmacy Department (PD) with great research activity, in order to improve the safety of the usual procedures. A multidisciplinary team was created to analyse each of the critical points, identified as possible failure modes, in the development of clinical trials in the PD. For each failure mode, the possible cause and effect were identified, criticality was calculated using the risk priority number, and possible corrective actions were discussed. Six sub-processes were defined in the development of clinical trials in the PD. The FMEA identified 67 failure modes, with the dispensing and prescription/validation sub-processes being the most likely to generate errors. All the improvement actions established in the FMEA were implemented in the clinical trials area. The FMEA is a useful tool in proactive risk management because it allows us to identify where we are making mistakes, analyze the causes that originate them, prioritize, and adopt solutions for risk reduction. The FMEA improves process safety and quality in the PD.
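
The risk priority number used to rank failure modes is simply the product of severity, occurrence, and detection scores. A toy sketch (the failure modes and scores below are invented for illustration):

```python
def rpn(severity, occurrence, detection):
    """Risk priority number: product of three 1-10 scores (higher = worse)."""
    return severity * occurrence * detection

# Invented failure modes for a clinical-trials dispensing workflow:
failure_modes = [
    ("wrong kit dispensed",        8, 4, 3),
    ("expired IMP not detected",   9, 2, 2),
    ("prescription not validated", 7, 5, 4),
]
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {rpn(s, o, d)}")  # highest RPN is addressed first
```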

  14. Analysis and Improvement of Attitude Output Accuracy in Rotation Inertial Navigation System

    Directory of Open Access Journals (Sweden)

    Kui Li

    2015-01-01

Full Text Available An inertial navigation system (INS) measures a vehicle's angular rate and acceleration by orthogonally mounted tri-axis gyroscopes and accelerometers and then calculates the vehicle's real-time attitude, velocity, and position. Gyroscope drifts and accelerometer biases are the key factors that affect navigation accuracy. Theoretical analysis and experimental results show that the influence of gyroscope drifts and accelerometer biases can be restrained greatly in a rotation INS (RINS) by driving the inertial measurement unit (IMU) to rotate regularly, thus improving navigation accuracy significantly. High accuracy in position and velocity should theoretically be matched by high accuracy in attitude, since INS is based on dead reckoning. However, marine and vehicle experiments show that the short-term attitude output accuracy of RINS is even worse than that of a non-rotating INS. The loss of attitude accuracy has serious impacts on many task systems where high attitude accuracy is required. This paper investigates the cause of attitude output accuracy loss in RINS and then proposes a new attitude output accuracy improvement algorithm for RINS. Experimental results show that the proposed attitude compensation method can improve short-term pitch and roll output accuracy from 20~30 arc seconds to less than 5 arc seconds, and azimuth output accuracy from 2~3 arc minutes to less than 0.5 arc minutes in RINS.

  15. Economic analysis of the health impacts of housing improvement studies: a systematic review

    Science.gov (United States)

    Fenwick, Elisabeth; Macdonald, Catriona; Thomson, Hilary

    2013-01-01

    Background Economic evaluation of public policies has been advocated but rarely performed. Studies from a systematic review of the health impacts of housing improvement included data on costs and some economic analysis. Examination of these data provides an opportunity to explore the difficulties and the potential for economic evaluation of housing. Methods Data were extracted from all studies included in the systematic review of housing improvement which had reported costs and economic analysis (n=29/45). The reported data were assessed for their suitability to economic evaluation. Where an economic analysis was reported the analysis was described according to pre-set definitions of various types of economic analysis used in the field of health economics. Results 25 studies reported cost data on the intervention and/or benefits to the recipients. Of these, 11 studies reported data which was considered amenable to economic evaluation. A further four studies reported conducting an economic evaluation. Three of these studies presented a hybrid ‘balance sheet’ approach and indicated a net economic benefit associated with the intervention. One cost-effectiveness evaluation was identified but the data were unclearly reported; the cost-effectiveness plane suggested that the intervention was more costly and less effective than the status quo. Conclusions Future studies planning an economic evaluation need to (i) make best use of available data and (ii) ensure that all relevant data are collected. To facilitate this, economic evaluations should be planned alongside the intervention with input from health economists from the outset of the study. When undertaken appropriately, economic evaluation provides the potential to make significant contributions to housing policy. PMID:23929616

  16. Bladed wheels damage detection through Non-Harmonic Fourier Analysis improved algorithm

    Science.gov (United States)

    Neri, P.

    2017-05-01

Recent papers introduced the Non-Harmonic Fourier Analysis for bladed wheel damage detection. This technique showed its potential in estimating the frequency of sinusoidal signals even when the acquisition time is short with respect to the vibration period, provided that some hypotheses are fulfilled. However, previously proposed algorithms showed severe limitations in detecting cracks at their early stage. The present paper proposes an improved algorithm which allows the detection of a blade vibration frequency shift due to a crack whose size is very small compared to the blade width. Such a technique could be implemented for condition-based maintenance, allowing the use of non-contact methods for vibration measurements. A stator-fixed laser sensor could monitor all the blades as they pass in front of the spot, giving precious information about the wheel's health. This configuration determines an acquisition time for each blade which becomes shorter as the machine rotational speed increases. In this situation, traditional Discrete Fourier Transform analysis results in poor frequency resolution, making it unsuitable for small frequency shift detection. Non-Harmonic Fourier Analysis instead showed high reliability in vibration frequency estimation even with data samples collected in a short time range. A description of the improved algorithm is provided in the paper, along with a comparison with the previous one. Finally, a validation of the method is presented, based on finite element simulation results.
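
The core idea, estimating a sinusoid's frequency off the DFT grid from a record much shorter than the period spacing would normally allow, can be sketched by least-squares projection onto a fine frequency grid. This illustrates the principle only and is not the paper's algorithm:

```python
import numpy as np

def grid_frequency_estimate(x, t, f_grid):
    """Estimate a sinusoid's frequency from a short record by least-squares
    projection onto sin/cos pairs over a fine, non-harmonic frequency grid
    (a plain DFT would be restricted to multiples of 1/T)."""
    best_f, best_res = None, np.inf
    for f in f_grid:
        A = np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        res = np.sum((x - A @ coef) ** 2)
        if res < best_res:
            best_f, best_res = f, res
    return best_f

# Short record: ~1.3 vibration periods, so DFT bins are 1/T ≈ 7.7 Hz apart.
fs, T, f_true = 1000.0, 0.13, 10.4
t = np.arange(0.0, T, 1.0 / fs)
x = np.sin(2 * np.pi * f_true * t + 0.7)
f_hat = grid_frequency_estimate(x, t, np.arange(5.0, 15.0, 0.05))
print(f_hat)  # recovered far below the 7.7 Hz DFT resolution limit
```

A crack-induced shift of a fraction of a hertz would thus remain visible even though each blade is observed for only a fraction of its vibration period.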

  17. Aggregate analysis of regulatory authority assessors' comments to improve the quality of periodic safety update reports.

    Science.gov (United States)

    Jullian, Sandra; Jaskiewicz, Lukasz; Pfannkuche, Hans-Jürgen; Parker, Jeremy; Lalande-Luesink, Isabelle; Lewis, David J; Close, Philippe

    2015-09-01

Marketing authorization holders (MAHs) are expected to provide high-quality periodic safety update reports (PSURs) on their pharmaceutical products to health authorities (HAs). We present a novel instrument aimed at improving the quality of PSURs, based on a standardized analysis of PSUR assessment reports (ARs) received from European Union HAs across products and therapeutic areas. All HA comments were classified into one of three categories: "Request for regulatory actions," "Request for medical and scientific information," or "Data deficiencies." The comments were graded according to their impact on patients' safety, the drug's benefit-risk profile, and the MAH's pharmacovigilance system. A total of 476 comments were identified through the analysis of 63 PSUR HA ARs received in 2013 and 2014; 47 (10%) were classified as "Requests for regulatory actions," 309 (65%) as "Requests for medical and scientific information," and 118 (25%) were related to "Data deficiencies." The most frequent comments were requests for labeling changes (35 HA comments in 19 ARs). The aggregate analysis revealed commonly raised issues and prompted changes to the MAH's procedures for the preparation of PSURs. The authors believe that this novel instrument based on the evaluation of PSUR HA ARs serves as a valuable mechanism to enhance the quality of PSURs and of decisions about optimizing the use of the products, and therefore contributes to further improving the MAH's pharmacovigilance system and patient safety.

  18. Improved liquid chromatography combined with pulsed electrochemical detection for the analysis of etimicin sulfate.

    Science.gov (United States)

    Wu, Yuning; Zhao, Wei; Zhu, Xiaoyue; Wang, Fang; Zhang, Mei; Fan, Xialei; Yuan, Yaozuo; Hu, Changqin; Deng, Xiaolan; Adams, Erwin

    2016-04-01

This paper describes an improved liquid chromatography method combined with pulsed electrochemical detection for the analysis of etimicin sulfate. In total, 22 impurities could be separated. A TSK-GEL C18 column (250 mm × 4.6 mm i.d., 5 μm) is used, and the mobile phase is composed of 40 mL of acetonitrile and 960 mL of an aqueous solution containing trifluoroacetic acid (15 mL/L), pentafluoropropionic acid (500 μL/L), 50% sodium hydroxide (8 mL/L) and sodium sulfate (1.5 g/L). The pH of the aqueous solution is adjusted to 3.5 with 0.8 M sodium hydroxide. The influence of the different chromatographic parameters on the separation was investigated. A quadruple potential-time waveform was applied to the electrodes of the detection cell, and 0.8 M sodium hydroxide was added post-column to raise the pH to at least 12 before detection. A central composite experimental design was used to describe the relationship between factors and response values and to establish a factorial analysis. Compared to previously published investigations, this improved method shows higher sensitivity, better separation ability and robustness, and it has been incorporated into the Chinese Pharmacopoeia 2015 for the analysis of etimicin sulfate. A number of commercial samples of etimicin sulfate were also analyzed using this method.

  19. Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity.

    Science.gov (United States)

    Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey

    2016-02-07

Real-time Raman spectroscopy can be used to assist in assessing skin lesions suspicious for cancer. Most diagnostic algorithms are based on the full band of the Raman spectra, either in the fingerprint region or the high wavenumber region. In this paper we explored wavenumber selection based analysis in Raman spectroscopy for skin cancer diagnosis. Wavenumber selection was implemented using windows of wavenumbers and leave-one-out cross-validated stepwise regression or the least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected windows of wavenumbers using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). In total, a combined cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions was included, divided into a training cohort (n = 518) and a testing cohort (n = 127) according to measurement time. It was found that the area under the receiver operating characteristic (ROC) curve was improved from 0.861-0.891 to 0.891-0.911, and the diagnostic specificity for fixed sensitivities of 0.99-0.90 was improved from 0.17-0.65 to 0.20-0.75 with wavenumber selection based analysis.
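
Window-based wavenumber selection can be illustrated with a simple per-window separability score; this is a stand-in for the paper's stepwise/LASSO selection, and the spectra below are synthetic:

```python
import numpy as np

def window_scores(spectra, labels, n_windows):
    """Separability score per wavenumber window: |mean difference| of the
    window-averaged intensity between the two classes, over the pooled std."""
    scores = []
    for idx in np.array_split(np.arange(spectra.shape[1]), n_windows):
        feat = spectra[:, idx].mean(axis=1)
        a, b = feat[labels == 0], feat[labels == 1]
        pooled = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
        scores.append(abs(a.mean() - b.mean()) / pooled)
    return np.array(scores)

# Synthetic two-class spectra where only one band carries diagnostic signal.
rng = np.random.default_rng(0)
spectra = rng.normal(0.0, 1.0, (60, 400))
labels = np.array([0] * 30 + [1] * 30)
spectra[labels == 1, 150:200] += 1.5      # informative band = window 3 of 8
scores = window_scores(spectra, labels, 8)
print(np.argmax(scores))  # 3
```

The highest-scoring windows would then feed a classifier such as PC-GDA or PLS, discarding the uninformative regions that only add noise.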

  20. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    Science.gov (United States)

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost-intensive, and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied payment by insurance companies due to lack of pre-approval or inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in the radiology outpatient test scheduling process to reduce financial losses associated with process errors. This method of analysis is also applicable to other departments in the hospital.
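
The financial side of such a "what-if" analysis can be framed as a simple Monte Carlo comparison of denial losses before and after a process change. All volumes, error rates, and reimbursement amounts below are invented, and this is far simpler than the information-flow simulation the paper describes:

```python
import random

def expected_denial_loss(n_tests, p_error, avg_reimbursement, n_sim=200, seed=0):
    """Monte Carlo estimate of annual revenue lost to insurance denials caused
    by scheduling/pre-approval errors (all rates and amounts are invented)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        denied = sum(1 for _ in range(n_tests) if rng.random() < p_error)
        total += denied * avg_reimbursement
    return total / n_sim

baseline = expected_denial_loss(2000, 0.03, 900.0)
improved = expected_denial_loss(2000, 0.01, 900.0)  # "what-if": fewer errors
print(baseline - improved)  # expected annual savings from the process change
```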

  1. PWR (pressurized water reactor) water treatment improvements: Cost-benefit analysis: Final report

    International Nuclear Information System (INIS)

    Siegwarth, D.P.; Bickerstaff, J.A.; Chakravorti, R.

    1988-05-01

    Pressurized water reactor steam generators and turbines have experienced a variety of corrosion problems as a result of ionic, corrosion product and oxidizing species transport into the steam generators. This project considered the design, cost and benefit of equipment modifications and additions which would decrease secondary cycle impurity transport. Improving condenser integrity, adding full-flow condensate polishers, providing low dissolved oxygen in makeup water and installation of all-ferrous heat exchangers are four changes that can significantly improve secondary water quality. Conceptual designs and costs of these four concepts at a 1160 MWe pressurized water reactor are summarized. The expected chemistry and operational benefits are discussed, and a cost-benefit analysis is given.

  2. Assessment of modern spectral analysis methods to improve wavenumber resolution of F-K spectra

    International Nuclear Information System (INIS)

    Shirley, T.E.; Laster, S.J.; Meek, R.A.

    1987-01-01

    The improvement in wavenumber spectra obtained by using high resolution spectral estimators is examined. Three modern spectral estimators were tested, namely the Autoregressive/Maximum Entropy (AR/ME) method, the Extended Prony method, and an eigenstructure method. They were combined with the conventional Fourier method by first transforming each trace with a Fast Fourier Transform (FFT). A high resolution spectral estimator was then applied to the resulting complex spatial sequence for each frequency. The collection of wavenumber spectra thus computed comprises a hybrid f-k spectrum with high wavenumber resolution and less spectral ringing. Synthetic and real data records containing 25 traces were analyzed by using the hybrid f-k method. The results show that an FFT-AR/ME f-k spectrum has noticeably better wavenumber resolution and more spectral dynamic range than conventional spectra when the number of channels is small. The observed improvement suggests the hybrid technique is potentially valuable in seismic data analysis.

  3. Development of an improved commercial sector energy model for national policy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, D.B.

    1992-12-01

    Pacific Northwest Laboratory provided support to the Office of Conservation and Renewable Energy (CE), under the Office of Planning and Assessment, to develop improved energy and environmental analysis tools. Commercial building sector energy models from the past decade were analyzed in order to provoke comment and stimulate discussion between potential model users and developers as to the appropriate structure and capability of a commercial sector energy model supported by CE. Three specific areas were examined during this review: (1) recent suggestions and guidance as to what constitutes a minimal set of requirements and capabilities for a commercial buildings energy model for CE; (2) several existing models, in terms of their general structure and how they match up with those requirements; and (3) an overview of a proposed improved commercial sector energy model.

  4. Surgical videos for accident analysis, performance improvement, and complication prevention: time for a surgical black box?

    Science.gov (United States)

    Gambadauro, Pietro; Magos, Adam

    2012-03-01

    Conventional audit of surgical records through review of surgical results provides useful knowledge but hardly helps identify the technical reasons lying behind specific outcomes or complications. Surgical teams not only need to know that a complication might happen but also how and when it is most likely to happen. Functional awareness is therefore needed to prevent complications, know how to deal with them, and improve overall surgical performance. The authors wish to argue that the systematic recording and reviewing of surgical videos, a "surgical black box," might improve surgical care, help prevent complications, and allow accident analysis. A possible strategy to test this hypothesis is presented and discussed. Recording and reviewing surgical interventions, apart from helping us achieve functional awareness and increasing the safety profile of our performance, allows us also to effectively share our experience with colleagues. The authors believe that those potential implications make this hypothesis worth testing.

  5. An improved principal component analysis based region matching method for fringe direction estimation

    Science.gov (United States)

    He, A.; Quan, C.

    2018-04-01

    The principal component analysis (PCA) and region matching combined method is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and its algorithm for conversion of orientation to direction in mask areas is computationally heavy and non-optimized. We propose an improved PCA based region matching method for fringe direction estimation, which includes an improved and robust mask construction scheme and a fast, optimized orientation-to-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, the fringe pattern filtered by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used for Hilbert spiral transform (HST) to demodulate the phase. Subsequently, the windowed Fourier ridge (WFR) method is used for refinement of the phase. The robustness and effectiveness of the proposed method are demonstrated by both simulated and experimental fringe patterns.
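The PCA idea behind fringe-orientation estimation can be illustrated with a structure-tensor-style sketch (an assumption for illustration, not the paper's algorithm): the dominant eigenvector of the local gradient covariance points across the fringes, and for a 2x2 covariance the principal axis has a closed form.

```python
# PCA of local intensity gradients: the principal axis of the 2x2
# gradient covariance gives the fringe orientation (modulo pi).
import math

def fringe_orientation(patch):
    """Orientation (radians) of straight fringes in a 2D intensity patch."""
    gxx = gyy = gxy = 0.0
    for y in range(1, len(patch) - 1):
        for x in range(1, len(patch[0]) - 1):
            gx = (patch[y][x + 1] - patch[y][x - 1]) / 2.0  # central differences
            gy = (patch[y + 1][x] - patch[y - 1][x]) / 2.0
            gxx += gx * gx
            gyy += gy * gy
            gxy += gx * gy
    # closed-form principal axis of the 2x2 covariance matrix
    return 0.5 * math.atan2(2.0 * gxy, gxx - gyy)
```

Orientation only resolves the direction modulo pi; converting orientation to a signed direction map is exactly the step the paper optimizes.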

  6. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    Science.gov (United States)

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery. PMID:26181208

  7. ANALYSIS AND IMPROVEMENT OF PRODUCTION EFFICIENCY IN A CONSTRUCTION MACHINE ASSEMBLY LINE

    Directory of Open Access Journals (Sweden)

    Alidiane Xavier

    2016-07-01

    Full Text Available The increased competitiveness in the market encourages the ongoing development of systems and production processes. The aim is to increase production efficiency so that production costs and waste are reduced as far as possible, increasing product competitiveness. The objective of this study was to analyze the overall results of implementing the Kaizen philosophy at an automaker of construction machinery, using action research methodology to study in situ the macro production process, from receipt of parts to the end of the assembly line, prioritizing analysis of shipping and handling times. The results show that the continuous improvement activities directly impact the elimination of waste from the assembly process, mainly related to shipping and handling, improving production efficiency by 30% in the studied processes.

  8. Regional fringe analysis for improving depth measurement in phase-shifting fringe projection profilometry

    Science.gov (United States)

    Chien, Kuang-Che Chang; Tu, Han-Yen; Hsieh, Ching-Huang; Cheng, Chau-Jern; Chang, Chun-Yen

    2018-01-01

    This study proposes a regional fringe analysis (RFA) method to detect the regions of a target object in captured shifted images to improve depth measurement in phase-shifting fringe projection profilometry (PS-FPP). In the RFA method, region-based segmentation is exploited to segment the de-fringed image of a target object, and a multi-level fuzzy-based classification with five presented features is used to discriminate the regions of the object from the segmented regions associated with explicit fringe information. In the experiments, the performance of the proposed method was tested and evaluated on 26 test cases made of five types of materials. The qualitative and quantitative results demonstrate that the proposed RFA method can effectively detect the desired regions of an object to improve depth measurement in the PS-FPP system.

  9. Silica Fume and Fly Ash Admixture Can Help to Improve RPC Durability: Combined Microscopic Analysis

    Directory of Open Access Journals (Sweden)

    Xiao Li-guang

    2016-01-01

    Full Text Available Silica fume/fly ash RPC can greatly improve durability. When silica fume replaced an equal amount (8%) of the cement and 10% of mechanically activated fly ash (remixed for 15 min) was added, chloride ion flux measurements showed that the impermeability of the complex-doped RPC improved significantly over the reference RPC. In addition, pore structure determination by the static nitrogen adsorption method showed that the integral pore volume of the admixed RPC was significantly lower than that of the reference RPC. Combined with SEM microscopic analysis of the internal structure and formation mechanism of the mixed RPC, the results show that the SF/FA complex fully embodies the synergistic principle of co-doped composites.

  10. Analysis of the nutritional management practices in intensive care: Identification of needs for improvement.

    Science.gov (United States)

    Lázaro-Martín, N I; Catalán-González, M; García-Fuentes, C; Terceros-Almanza, L; Montejo-González, J C

    2015-12-01

    To analyze the nutritional management practices in Intensive Care (ICU) to detect the need for improvement actions, and to re-evaluate the process after implementation of the improvement actions. Prospective observational study in 3 phases: 1) observation; 2) analysis, proposal development and dissemination; 3) analysis of the implementation. ICU of a high-complexity hospital. Adult ICU patients expected to receive more than 48h of artificial nutrition. Parenteral nutrition (PN), enteral nutrition (EN) (type, average effective volume, complications) and average nutritional ratio. A total of 229 patients (phase 1: 110, phase 3: 119). After analyzing the initial results, the following were proposed: increased and earlier use of EN, increased protein intake, monitoring of nutritional effectiveness, and increased indication of supplementary PN. The measures were disseminated at specific meetings. During phase 3 more patients received EN (55.5 vs. 78.2%, P=.001), with no significant difference in the start time (1.66 vs. 2.33 days), duration (6.82 vs. 10.12 days) or complications (37.7 vs. 47.3%). Use of hyperproteic diets was higher in phase 3 (0 vs. 13.01%, P<.05). The use of PN was similar (48.2 vs. 48.7%) with a tendency to a later onset in phase 3 (1.25±1.25 vs. 2.45±3.22 days). There were no significant differences in the average nutritional ratio (0.56±0.28 vs. 0.61±0.27, P=.56). The use of EN and the protein intake increased, without appreciable effects on the other improvement measures. Other methods appear to be necessary for the proper implementation of improvement measures. Copyright © 2015 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.

  11. Primary health care contribution to improve health outcomes in Bogota-Colombia: a longitudinal ecological analysis

    Directory of Open Access Journals (Sweden)

    Mosquera Paola A

    2012-08-01

    Full Text Available Abstract Background Colombia has a highly segmented and fragmented national health system that contributes to inequitable health outcomes. In 2004 the district government of Bogota initiated a Primary Health Care (PHC) strategy to improve health care access and population health status. This study aims to analyse the contribution of the PHC strategy to the improvement of health outcomes controlling for socioeconomic variables. Methods A longitudinal ecological analysis using data from secondary sources was carried out. The analysis used data from 2003 and 2007 (one year before and 3 years after the PHC implementation). A Primary Health Care Index (PHCI) of coverage intensity was constructed. According to the PHCI, localities were classified into two groups: high and low coverage. A multivariate analysis using a Poisson regression model for each year separately and a panel Poisson regression model to assess changes between the groups over the years was developed. Dependent variables were infant mortality rate, under-5 mortality rate, infant mortality rate due to acute diarrheal disease and pneumonia, prevalence of acute malnutrition, vaccination coverage for diphtheria, pertussis, tetanus (DPT), and prevalence of exclusive breastfeeding. The independent variable was the PHCI. Control variables were sewerage coverage, health system insurance coverage and quality of life index. Results The high PHCI localities as compared with the low PHCI localities showed significant risk reductions of under-5 mortality (13.8% and infant mortality due to pneumonia (37.5% between 2003 and 2007. The probability of being vaccinated for DPT also showed a significant increase of 4.9%. The risk of infant mortality and of acute malnutrition in children under-5 years was lower in the high coverage group than in the low one; however relative changes were not statistically significant. Conclusions Despite the adverse contextual conditions and the limitations imposed by the
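A hedged illustration of the kind of rate comparison the Poisson models above formalize: a mortality rate ratio between high- and low-coverage groups with a Wald confidence interval on the log scale. The counts and person-time below are hypothetical, not the study's data.

```python
# Rate ratio between two groups with a Wald CI on log(RR).
# events/persontime values here are hypothetical placeholders.
import math

def rate_ratio(events_a, persontime_a, events_b, persontime_b, z=1.96):
    rate_a = events_a / persontime_a
    rate_b = events_b / persontime_b
    rr = rate_a / rate_b
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, (lo, hi)
```

A full Poisson regression additionally adjusts such ratios for the control variables (sewerage coverage, insurance coverage, quality of life index) named in the abstract.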

  12. Drug supply indicators: Pitfalls and possibilities for improvements to assist comparative analysis.

    Science.gov (United States)

    Singleton, Nicola; Cunningham, Andrew; Groshkova, Teodora; Royuela, Luis; Sedefov, Roumen

    2018-03-03

    Interventions to tackle the supply of drugs are seen as standard components of illicit drug policies. Therefore drug market-related administrative data, such as seizures, price, purity and drug-related offending, are used in most countries for policy monitoring and assessment of the drug situation. International agencies, such as the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) and the UN Office of Drugs and Crime, also monitor and report on the drug situation cross-nationally and therefore seek to collect and make available key data in a uniform manner from the countries they cover. However, these data are not primarily collected for this purpose, which makes interpretation and comparative analysis difficult. Examples of limitations of these data sources include: the extent to which they reflect operational priorities rather than market changes; question marks over the robustness of and consistency in data collection methods, and issues around the timeliness of data availability. Such problems are compounded by cultural, social and contextual differences between countries. Making sense of such data is therefore challenging and extreme care needs to be taken using it. Nevertheless, these data provide an important window on a hidden area, so improving the quality of the data collected and expanding its scope should be a priority for those seeking to understand or monitor drug markets and supply reduction. In addition to highlighting some of the potential pitfalls in using supply indicators for comparative analysis, this paper presents a selection of options for improvements based on the current EMCDDA programme of work to improve their supply-related monitoring and analysis. The conceptual framework developed to steer this work may have wider application. Adopting this approach has the potential to provide a richer picture of drug markets, at both national and international levels, and make it easier to compare data between countries. 

  13. Rational improvement of the engineered isobutanol-producing Bacillus subtilis by elementary mode analysis

    Directory of Open Access Journals (Sweden)

    Li Shanshan

    2012-08-01

    Full Text Available Abstract Background Isobutanol is considered a leading candidate for the replacement of current fossil fuels, and is expected to be produced biotechnologically. Owing to its valuable features, Bacillus subtilis has been engineered as an isobutanol producer, but it needs to be further optimized for more efficient production. Since elementary mode analysis (EMA) is a powerful tool for systematic analysis of metabolic network structures and cell metabolism, it might be of great importance in rational strain improvement. Results The metabolic network of the isobutanol-producing B. subtilis BSUL03 was first constructed for EMA. Considering the actual cellular physiological state, 239 elementary modes (EMs) were screened from the total of 11,342 EMs for potential target prediction. On this basis, lactate dehydrogenase (LDH) and the pyruvate dehydrogenase complex (PDHC) were predicted as the most promising inactivation candidates according to flux flexibility analysis and intracellular flux distribution simulation. Then, the in silico designed mutants were experimentally constructed. The maximal isobutanol yield of the LDH- and PDHC-deficient strain BSUL05 reached 61% of the theoretical value, at 0.36 ± 0.02 C-mol isobutanol/C-mol glucose, which was 2.3-fold that of BSUL03. Moreover, this mutant produced approximately 70% more isobutanol, to a maximal titer of 5.5 ± 0.3 g/L, in fed-batch fermentations. Conclusions EMA was employed as a guiding tool to direct rational improvement of the engineered isobutanol-producing B. subtilis. The consistency between model prediction and experimental results demonstrates the rationality and accuracy of this EMA-based approach for target identification. This network-based rational strain improvement strategy could serve as a promising concept to engineer efficient B. subtilis hosts for isobutanol, as well as other valuable products.

  14. Using digital notifications to improve attendance in clinic: systematic review and meta-analysis.

    Science.gov (United States)

    Robotham, Dan; Satkunanathan, Safarina; Reynolds, John; Stahl, Daniel; Wykes, Til

    2016-10-24

    Assess the impact of text-based electronic notifications on improving clinic attendance, in relation to study quality (according to risk of bias), and assess simple ways in which notifications can be optimised (ie, the impact of multiple notifications). Systematic review, study quality appraisal assessing risk of bias, data synthesised in meta-analyses. MEDLINE, EMBASE, PsycINFO, Web of Science and Cochrane Database of Systematic Reviews (01.01.05 until 25.4.15). A systematic search to discover all studies containing quantitative data for synthesis into meta-analyses. Studies examining the effect of text-based electronic notifications on prescheduled appointment attendance in healthcare settings. Primary analysis included experimental studies where randomisation was used to define allocation to intervention and where a control group consisting of 'no reminders' was used. Secondary meta-analysis included studies comparing text reminders with voice reminders. Studies lacking sufficient information for inclusion (after attempting to contact study authors) were excluded. Primary outcomes were rates of attendance/non-attendance at healthcare appointments. Secondary outcome was the rate of rescheduled and cancelled appointments. 26 articles were included, 21 in the primary meta-analysis (8345 patients receiving electronic text notifications, 7731 patients receiving no notifications). Studies were included from Europe (9), Asia (7), Africa (2), Australia (2) and America (1). Patients who received notifications were 23% more likely to attend clinic than those who received no notification (risk ratio=1.23, 67% vs 54%). Those receiving notifications were 25% less likely to 'no show' for appointments (risk ratio=0.75, 15% vs 21%). Results were similar when accounting for risk of bias, region and publication year. Multiple notifications were significantly more effective at improving attendance than single notifications. 
Voice notifications appeared more effective than text
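The pooled effects reported above come from weighting per-study effects; a minimal fixed-effect inverse-variance pooling of risk ratios (with illustrative counts, not the review's extracted data) can be sketched as:

```python
# Fixed-effect inverse-variance pooling of per-study risk ratios.
# Each study is (events_tx, n_tx, events_ctrl, n_ctrl); counts are illustrative.
import math

def pooled_risk_ratio(studies):
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2  # variance of log RR
        w = 1.0 / var                          # inverse-variance weight
        num += w * log_rr
        den += w
    return math.exp(num / den)
```

A random-effects model, often preferred when studies differ by region and setting as here, would add a between-study variance term to each weight.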

  15. Ranking agricultural practices on soil water improvements: a meta-analysis

    Science.gov (United States)

    Basche, A.; DeLonge, M. S.; Gonzalez, J.

    2016-12-01

    Increased rainfall variability is well documented in the historic record and predicted to intensify with future climate change. Managing excess water in periods of heavy rain and a lack of water in periods of inadequate precipitation will continue to be a challenge. Improving soil resiliency through increased water storage is a promising strategy to combat the effects of both rainfall extremes. The goal of this research is to quantify to what extent various conservation and ecological practices can improve soil hydrology. We are conducting a global meta-analysis focused on studies where conservation and ecological practices are compared to more conventional management. To date we have analyzed 100 studies with more than 450 paired comparisons to understand the effect of management on water infiltration rates, a critical process that ensures water enters the soil profile for crop use, water storage and runoff prevention. The database will be expanded to include studies measuring soil porosity and the water retained at field capacity. Statistical analysis was done with both a bootstrap method and a mixed model that weights studies based on precision while accounting for between-study variation. We find that conservation and ecological practices, ranging from no-till, cover crops, crop rotation, perennial crops and agroforestry, on average significantly increased water infiltration rates relative to more conventional practice controls (mean of 75%, standard error 25%). There were significant differences between practices, where perennial and agroforestry systems show the greatest potential for improving water infiltration rates (> 100% increase). Cover crops also led to a significant increase in water infiltration rates (> 60%) while crop rotations and no-till systems did not consistently demonstrate increases. We also found that studies needed to include alternative management for more than two years to detect a significant increase. 
Overall this global meta-analysis
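The bootstrap step can be sketched as follows, using synthetic percent-change effects rather than the study database: resample the paired comparisons with replacement and take percentiles of the resampled means as a confidence interval.

```python
# Percentile bootstrap of a mean effect size across paired comparisons.
# The effect values fed in here would be per-comparison percent changes.
import random

def bootstrap_mean_ci(effects, n_boot=5000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        sample = [rng.choice(effects) for _ in effects]  # resample with replacement
        means.append(sum(sample) / len(sample))
    means.sort()
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return sum(effects) / len(effects), (lo, hi)
```

The mixed model mentioned in the abstract goes further by weighting each study by its precision; the bootstrap makes no such weighting assumption.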

  16. Improving torque per kilogram magnet of permanent magnet couplings using finite element analysis

    DEFF Research Database (Denmark)

    Högberg, Stig; Jensen, Bogi Bech; Bendixen, Flemming Buus

    2013-01-01

    This paper presents the methodology and subsequent findings of a performance-improvement routine that employs automated finite element (FE) analysis to increase the torque-per-kilogram-magnet (TPKM) of a permanent magnet coupling (PMC). The routine is applied to a commercially available cylindrical...... PMC with rectangular permanent magnets (PM), and a new design is discovered which increases TPKM by 15.6%. Furthermore, the study is repeated using concave/convex-shaped PMs, which results in an increase of TPKM of 57.6%. The FE models are validated against experimental measurements of the static...

  17. Improving IT project governance: A reflective analysis based on critical systems heuristics

    Directory of Open Access Journals (Sweden)

    David Johnstone

    2017-05-01

    Full Text Available IT project governance involves establishing authority structures, policies and mechanisms for IT projects. However, the way governance arrangements are implemented can sometimes exclude or marginalise important stakeholders. In this paper, we use critical systems thinking and the notions of boundary critique and entrenched structural conflict to inform a critical re-analysis of a case study where the governance proved relatively ineffective. We use the ‘twelve questions’ from the critical systems heuristics (CSH) approach to diagnose problems with governance arrangements and suggest solutions. Based on this, we suggest the CSH approach has theoretical and practical efficacy for improving IT project governance in general.

  18. DAMBE7: New and improved tools for data analysis in molecular biology and evolution.

    Science.gov (United States)

    Xia, Xuhua

    2018-04-14

    DAMBE is a comprehensive software package for genomic and phylogenetic data analysis on Windows, Linux and Macintosh computers. New functions include imputing missing distances and phylogeny simultaneously (paving the way to building large phage and transposon trees), new bootstrapping/jackknifing methods for PhyPA (phylogenetics from pairwise alignments), and an improved function for fast and accurate estimation of the shape parameter of the gamma distribution for fitting rate heterogeneity over sites. The previous method corrects multiple hits for each site independently; DAMBE's new method uses all sites simultaneously for correction. DAMBE, featuring a user-friendly graphic interface, is freely available from http://dambe.bio.uottawa.ca.

  19. An improved method for reactor coolant pump abnormality monitoring using power line signal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jae Cheon [Korea Power Engineering Company, Korea Advanced Institute of Science and Technology, 150 deokjin-dong, Yuseong-ku, Daejeon (Korea, Republic of)]. E-mail jcjung@kopec.co.kr; Seong, Poong Hyun [Korea Power Engineering Company, Korea Advanced Institute of Science and Technology, 150 deokjin-dong, Yuseong-ku, Daejeon (Korea, Republic of)

    2006-01-15

    An improved method to detect the reactor coolant pump (RCP) abnormality is suggested in this work. The monitoring parameters that are acquired from power line signal analysis are motor torque, motor speed and characteristic harmonic frequencies. The combination of Wigner-Ville Distribution (WVD) and feature area matrix comparison method is used for abnormality diagnosis. For validation of the proposed method, the test was performed during cool-down phase and heat-up phase in nuclear power plant (NPP) by cross-comparison with RCP vibration monitoring system (VMS). Using pump internal inspection results, the diagnosis prediction is verified.

  20. Thrombocytopenia and craniotomy for tumor: A National Surgical Quality Improvement Program analysis.

    Science.gov (United States)

    Dasenbrock, Hormuzdiyar H; Devine, Christopher A; Liu, Kevin X; Gormley, William B; Claus, Elizabeth B; Smith, Timothy R; Dunn, Ian F

    2016-06-01

    To the authors' knowledge, the current study is the first national analysis of the association between preoperative platelet count and outcomes after craniotomy. Patients who underwent craniotomy for tumor were extracted from the prospective National Surgical Quality Improvement Program registry (2007-2014) and stratified by preoperative thrombocytopenia, defined as mild (125,000-149,000/μL), moderate (100,000-124,000/μL), severe (75,000-99,000/μL), or very severe (<75,000/μL) thrombocytopenia prior to craniotomy for tumor. Cancer 2016;122:1708-17. © 2016 American Cancer Society.

  1. Human Factors Analysis to Improve the Processing of Ares-1 Launch Vehicle

    Science.gov (United States)

    Stambolian, Damon B.; Dippolito, Gregory M.; Nyugen, Bao; Dischinger, Charles; Tran, Donald; Henderson, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of Human Factors analysis in improving the ground processing procedures for the Ares-1 launch vehicle. The flight vehicle engineering designers for the Ares-1 launch vehicle had to design the flight vehicle for effective, efficient and safe ground operations within the cramped dimensions of a rocket design. The use of a mockup of the area where the technician would be required to work proved to be a very effective method to promote collaboration between the Ares-1 designers and the ground operations personnel.

  2. Improving power output of inertial energy harvesters by employing principal component analysis of input acceleration

    Science.gov (United States)

    Smilek, Jan; Hadas, Zdenek

    2017-02-01

    In this paper we propose the use of principal component analysis to process the measured acceleration data in order to determine the direction of acceleration with the highest variance at a given frequency of interest. This method can be used to improve the power generated by inertial energy harvesters. Their power output is highly dependent on the excitation acceleration magnitude and frequency, but the axes of acceleration measurements might not always be perfectly aligned with the directions of movement, and therefore the generated power output might be severely underestimated in simulations, possibly leading to false conclusions about the feasibility of using an inertial energy harvester for the examined application.
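The proposed preprocessing can be sketched as follows: PCA of 3-axis acceleration samples, finding the direction of maximum variance via power iteration on the 3x3 covariance matrix. The data are hypothetical; in practice the samples would first be band-pass filtered around the frequency of interest.

```python
# PCA of 3-axis acceleration: power iteration on the covariance matrix
# returns the unit vector along which acceleration variance is largest,
# i.e. the best alignment axis for an inertial harvester.
import math

def principal_direction(samples, n_iter=100):
    """samples: list of (ax, ay, az) tuples; returns a unit 3-vector."""
    n = len(samples)
    mean = [sum(s[i] for s in samples) / n for i in range(3)]
    centered = [[s[i] - mean[i] for i in range(3)] for s in samples]
    cov = [[sum(c[i] * c[j] for c in centered) / n for j in range(3)]
           for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(n_iter):  # power iteration toward the top eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v
```

The sign of the returned vector is arbitrary (both orientations describe the same axis), which is fine for aligning a harvester.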

  3. Analysis and Alternate Selection of Nanopowder Modifiers to Improve a Special Protective Coating System

    Directory of Open Access Journals (Sweden)

    S. P. Bardakhanov

    2017-01-01

    Full Text Available This paper presents a practical approach for the rational choice of silica nanopowders as modifiers to control and improve the performance of protective coating systems operating in harsh environmental conditions. The approach is based on multiparameter analysis of the nanoparticle reactivity of similar silicas synthesized by chemical and physical methods. The analysis indicates distinct adsorption centers due to differences in particle formation; the features of the formation and adsorption mechanisms lead to a higher diffusion capacity of the nanoparticles synthesized by physical methods into a paint material, and finally result in stronger chemical bonds between the system elements. The approach allows the consumption of paint materials to be reduced by 30% or more and increases coating adhesion, and hence system life, at least 2-3 times. Validity of the approach is illustrated through data obtained from comparative modeling, factory testing, and practical use of modified systems.

  4. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods to the improvement of the parameters of a lead molding process. For this reason, Failure Mode and Effect Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) regardless of the previously defined potential problem and its preventive action. This contributed to the recognition of root causes, corrective actions and changes to production parameters, and showed how these methods, the level of their organization, and systematic and rigorous study affect molding process parameters.

  5. Improved gap filling method based on singular spectrum analysis and its application in space environment

    Science.gov (United States)

    Li, Xiangzhen; Liu, Shuai; Li, Zhi; Gong, Jiancun

    2017-11-01

    Data missing is a common phenomenon in the space environment measurements, which impacts or even blocks the following model-building procedures, predictions and posterior analysis. To fill these data gaps, an improved filling method based on iterative singular spectrum analysis is proposed. It first extracts a distribution array of the gaps and then fills the gaps with all known data. The distribution array is utilized to generate the test sets for cross validation. The embedding window length and principal components are determined by the discrete particle swarm optimization algorithm in a noncontinuous fashion. The effectiveness and adaptability of the filling method are proved by some tests done on solar wind data and geomagnetic indices from different solar activity years.

  6. Sentiment Analysis in Spanish for Improvement of Products and Services: A Deep Learning Approach

    Directory of Open Access Journals (Sweden)

    Mario Andrés Paredes-Valverde

    2017-01-01

    Full Text Available Sentiment analysis is an important area that makes it possible to learn users' public opinion about many aspects of products and services. This information helps organizations gauge customer satisfaction. Social networks such as Twitter are important information channels because real-time information can be obtained and processed from them. In this sense, we propose a deep-learning-based approach that allows companies and organizations to detect opportunities for improving the quality of their products or services through sentiment analysis. The approach is based on a convolutional neural network (CNN) and word2vec. To determine its effectiveness for classifying tweets, we conducted experiments with different sizes of a Twitter corpus composed of 100,000 tweets. We obtained encouraging results, with a precision of 88.7%, a recall of 88.7%, and an F-measure of 88.7% on the complete dataset.
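Incidentally, identical precision, recall and F-measure figures (here all 88.7%) are exactly what micro-averaging yields on single-label multi-class data, since pooled false positives equal pooled false negatives. A minimal sketch with hypothetical labels:

```python
def micro_prf(y_true, y_pred):
    """Micro-averaged precision, recall and F1 for single-label multi-class data."""
    tp = sum(t == p for t, p in zip(y_true, y_pred))
    # Micro-averaging pools all classes: total FP equals total FN, so P = R = F1.
    precision = recall = tp / len(y_true)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

y_true = ["pos", "neg", "neu", "pos", "neg"]
y_pred = ["pos", "neg", "pos", "pos", "neg"]
print(micro_prf(y_true, y_pred))
```

With 4 of 5 labels correct, all three metrics come out as 0.8, mirroring the three identical scores reported above.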

  7. Improving the Design of Capacitive Micromachined Ultrasonic Transducers Aided with Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    A Martowicz

    2016-09-01

    Full Text Available The paper presents the results of an analysis performed to search for feasible design improvements for a capacitive micromachined ultrasonic transducer. The search was aided by sensitivity analysis and the application of the Response Surface Method. A multiphysics approach was taken in the elaborated finite element model of one cell of the transducer in order to include the significant physical phenomena present in the modelled microdevice. The set of twelve uncertain input and design parameters comprises geometric, material and control properties. The amplitude of the dynamic membrane deformation of the transducer was chosen as the studied output. The objective of the study was to find robust design configurations of the transducer, i.e. configurations characterized by a maximal deformation amplitude with minimal variation.

  8. Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft

    Science.gov (United States)

    Stambolian, Damon B.; Schlierf, Roland; Miller, Darcy; Posada, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderon, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of human factors and timeline analysis to achieve a more efficient and effective processing flow. The solution involved developing a written timeline of events that included each activity within each functional flow block. Each activity had computer animation videos and pictures of the people and hardware involved. The Human Factors Engineering Analysis Tool (HFEAT) was improved by modifying it to include the timeline of events. The HFEAT was used to define the human factors requirements, and design solutions were developed for these requirements. An example of a functional flow block diagram is shown, and a view from one of the animations (i.e., the short stack pallet) is shown and explained.

  9. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  10. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability is most beneficial when the system changes: if high-level requirements change, the low-level requirements traced to them must be modified as well. Traceability ensures that requirements are appropriately and efficiently verified at the various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.
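A traceability matrix of the kind described can be kept as a simple mapping from high-level to derived low-level requirements. The sketch below uses hypothetical requirement IDs; it flags untraced requirements and lists what must be re-verified when a high-level requirement changes:

```python
# Hypothetical requirement IDs; a real matrix would come from a requirements tool.
trace = {
    "HLR-1": ["LLR-1.1", "LLR-1.2"],   # high-level -> derived low-level requirements
    "HLR-2": ["LLR-2.1"],
    "HLR-3": [],                        # untraced: a verification coverage gap
}

def untraced(matrix):
    """Return high-level requirements with no low-level trace (flag for review)."""
    return [hlr for hlr, llrs in matrix.items() if not llrs]

def impacted(matrix, changed_hlr):
    """Low-level requirements to re-verify when a high-level requirement changes."""
    return matrix.get(changed_hlr, [])

print(untraced(trace))           # the coverage gap
print(impacted(trace, "HLR-1"))  # change-impact set
```

The two queries correspond to the two benefits named in the abstract: verifying coverage across levels, and scoping re-verification when the system changes.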

  11. Structure of CPV17 polyhedrin determined by the improved analysis of serial femtosecond crystallographic data.

    Science.gov (United States)

    Ginn, Helen M; Messerschmidt, Marc; Ji, Xiaoyun; Zhang, Hanwen; Axford, Danny; Gildea, Richard J; Winter, Graeme; Brewster, Aaron S; Hattne, Johan; Wagner, Armin; Grimes, Jonathan M; Evans, Gwyndaf; Sauter, Nicholas K; Sutton, Geoff; Stuart, David I

    2015-03-09

    The X-ray free-electron laser (XFEL) allows the analysis of small weakly diffracting protein crystals, but has required very many crystals to obtain good data. Here we use an XFEL to determine the room temperature atomic structure for the smallest cytoplasmic polyhedrosis virus polyhedra yet characterized, which we failed to solve at a synchrotron. These protein microcrystals, roughly a micron across, accrue within infected cells. We use a new physical model for XFEL diffraction, which better estimates the experimental signal, delivering a high-resolution XFEL structure (1.75 Å), using fewer crystals than previously required for this resolution. The crystal lattice and protein core are conserved compared with a polyhedrin with less than 10% sequence identity. We explain how the conserved biological phenotype, the crystal lattice, is maintained in the face of extreme environmental challenge and massive evolutionary divergence. Our improved methods should open up more challenging biological samples to XFEL analysis.

  12. Predictors of psychological improvement on non-professional suicide message boards: content analysis.

    Science.gov (United States)

    Niederkrotenthaler, T; Gould, M; Sonneck, G; Stack, S; Till, B

    2016-12-01

    Suicide message boards have been at the core of debates about negative influences of the Internet on suicidality. Nothing is currently known about communication styles that may help users to improve psychologically in these settings. In all, 1182 archival threads with 20,499 individual postings from seven non-professional suicide message boards supporting an 'against-suicide', 'neutral' or 'pro-suicide' attitude were randomly selected and subjected to content analysis. The initial needs of primary posters (i.e. individuals who open a thread), their psychological improvement by the end of the thread, the responses they received and indicators of suicidality were coded. Differences between 'pro-suicide', 'neutral' and 'against-suicide' boards, and correlations between primary posters and respondents in terms of suicidality, were assessed. Logistic regression was used to test associations with psychological improvement. 'Pro-suicide' boards (n = 4) differed from 'neutral' (n = 1) and 'against-suicide' (n = 2) boards in the contents communicated. Indicators of suicidality correlated moderately to strongly between primary posters and respondents on 'pro-suicide' message boards, but less so on other boards. Several communicative strategies were associated with psychological improvement in primary posters, including the provision of constructive advice [adjusted odds ratio (aOR) 4.10, 95% confidence interval (CI) 2.40-7.03], active listening (aOR 1.60, 95% CI 1.12-2.27), sympathy towards the poster (aOR 2.22, 95% CI 1.68-2.95) and provision of alternatives to suicide (aOR 2.30, 95% CI 1.67-3.18). Respondents resemble primary posters with regard to suicidality on 'pro-suicide' boards, which may hinder psychological improvement. Still, opportunities to intervene in these settings using simple communication techniques exist and need to be taken and evaluated.

  13. Continuous quality improvement in a Maltese hospital using logical framework analysis.

    Science.gov (United States)

    Buttigieg, Sandra C; Gauci, Dorothy; Dey, Prasanta

    2016-10-10

    Purpose The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, an intensive care unit, a surgical ward, and an acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, a project plan template using the logical framework (LOGFRAME) is created for each setting. Findings This study shows substantial improvement in quality across the three settings. LFA proved to be effective in analysing quality issues and suggesting improvement measures objectively. Research limitations/implications This paper applies LFA in specific, albeit diverse, settings in one hospital. For validation purposes, it would be ideal to apply it in other settings within the same hospital, as well as in several hospitals. The study also adopts a bottom-up approach, which should be triangulated with other sources of data. Practical implications LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators, and facilitates the development of a business case for improvement. Originality/value LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.

  14. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, has become increasingly popular in recent years. An attractive feature of multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require knowledge of the within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of the individual pooled estimates themselves, the standard variance estimator and the robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50).
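The contrast between a model-based variance and a robust (sandwich-type) variance for a pooled estimate can be shown in a univariate analogue. This is only an illustrative sketch, not the paper's bivariate estimator or Riley's working model; the simulated data and variances are assumptions:

```python
import numpy as np

def pooled_with_robust_se(y, v, tau2=0.0):
    """Inverse-variance pooled estimate with model-based and robust (sandwich) SEs.
    Univariate analogue of the robust-variance idea, for illustration only."""
    y = np.asarray(y, dtype=float)
    w = 1.0 / (np.asarray(v, dtype=float) + tau2)
    mu = np.sum(w * y) / np.sum(w)
    se_model = np.sqrt(1.0 / np.sum(w))            # correct only if the model holds
    # Sandwich form: stays asymptotically valid even if the weights/model are wrong.
    se_robust = np.sqrt(np.sum(w**2 * (y - mu) ** 2)) / np.sum(w)
    return mu, se_model, se_robust

rng = np.random.default_rng(0)
y = rng.normal(0.5, 0.4, size=60)   # 60 studies with unmodeled extra heterogeneity
v = np.full(60, 0.05)               # reported within-study variances (too small)
print(pooled_with_robust_se(y, v))
```

Because the data carry more between-study variation than the working variances admit, the robust standard error exceeds the model-based one, which is precisely the situation where model-based intervals undercover.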

  15. Improved Persistent Scatterer analysis using Amplitude Dispersion Index optimization of dual polarimetry data

    Science.gov (United States)

    Esmaeili, Mostafa; Motagh, Mahdi

    2016-07-01

    Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual/quad polarized data provides an additional source of information to further improve the capability of InSAR time-series analysis. In this paper we use dual-polarized data and combine Amplitude Dispersion Index (ADI) optimization of pixels with a phase stability criterion for PSInSAR analysis. ADI optimization is performed using a Simulated Annealing algorithm to increase the number of Persistent Scatterer Candidates (PSCs). The phase stability of the PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) TerraSAR-X acquisitions from July 2013 to January 2014 over a subsidence area in Iran, and compare the effectiveness of the method for both agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in urban and non-urban regions. Compared to single-pol data, the use of optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with a high rate of deformation, which suffer from losing phase stability over time. The classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of the PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly the single-bounce mechanism.
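The ADI criterion used for PS candidate selection is simply the temporal standard deviation of a pixel's amplitude divided by its mean over the image stack. A minimal sketch with synthetic numbers (the two simulated pixels and the 0.25 threshold are illustrative assumptions):

```python
import numpy as np

def amplitude_dispersion_index(amp_stack):
    """ADI = sigma_A / mean_A per pixel over a stack of coregistered SAR amplitudes.
    Low ADI (commonly below ~0.25) is the usual proxy for phase-stable PS candidates."""
    return amp_stack.std(axis=0) / amp_stack.mean(axis=0)

rng = np.random.default_rng(1)
stable = rng.normal(100, 5, size=(17, 1))           # bright, steady point scatterer
noisy = np.abs(rng.normal(20, 12, size=(17, 1)))    # dim, fluctuating clutter
adi = amplitude_dispersion_index(np.hstack([stable, noisy]))
print(adi)
```

The steady scatterer yields a small ADI and would survive PSC selection; the clutter pixel's ADI is far larger, which is what the polarimetric optimization in the paper tries to reduce by choosing a better scattering mechanism per pixel.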

  16. Improving estimation of kinetic parameters in dynamic force spectroscopy using cluster analysis

    Science.gov (United States)

    Yen, Chi-Fu; Sivasankar, Sanjeevi

    2018-03-01

    Dynamic Force Spectroscopy (DFS) is a widely used technique to characterize the dissociation kinetics and interaction energy landscape of receptor-ligand complexes with single-molecule resolution. In an Atomic Force Microscope (AFM)-based DFS experiment, receptor-ligand complexes, sandwiched between an AFM tip and substrate, are ruptured at different stress rates by varying the speed at which the AFM tip and substrate are pulled away from each other. The rupture events are grouped according to their pulling speeds, and the mean force and loading rate of each group are calculated. These data are subsequently fit to established models, and energy landscape parameters such as the intrinsic off-rate (koff) and the width of the potential energy barrier (xβ) are extracted. However, due to large uncertainties in determining the mean forces and loading rates of the groups, errors in the estimated koff and xβ can be substantial. Here, we demonstrate that the accuracy of fitted parameters in a DFS experiment can be dramatically improved by sorting rupture events into groups using cluster analysis instead of sorting them according to their pulling speeds. We test different clustering algorithms, including Gaussian mixture, logistic regression, and K-means clustering, under conditions that closely mimic DFS experiments. Using Monte Carlo simulations, we benchmark the performance of these clustering algorithms over a wide range of koff and xβ, under different levels of thermal noise, and as a function of both the number of unbinding events and the number of pulling speeds. Our results demonstrate that cluster analysis, particularly K-means clustering, is very effective in improving the accuracy of parameter estimation, especially when the number of unbinding events is limited and the events are not well separated into distinct groups. Cluster analysis is easy to implement, and our performance benchmarks serve as a guide in choosing an appropriate method for DFS data analysis.
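As a sketch of the group-then-fit pipeline, the following clusters idealized, noise-free rupture data by log loading rate with a plain 1-D k-means and recovers koff and xβ from the standard Bell-Evans relation F* = (kBT/xβ)·ln(r·xβ/(koff·kBT)). All numbers are hypothetical, and real data would carry stochastic rupture-force scatter that this sketch omits:

```python
import numpy as np

kBT = 4.11e-21                       # thermal energy in J at ~298 K
koff_true, xb_true = 1.0, 0.3e-9     # assumed "true" landscape parameters

rng = np.random.default_rng(2)
rates = 10 ** rng.uniform(2, 5, 300)                     # mixed loading rates (N/s)
forces = (kBT / xb_true) * np.log(rates * xb_true / (koff_true * kBT))

def kmeans_1d(x, k, n_iter=100):
    """Plain 1-D k-means (numpy only) to group rupture events by log loading rate."""
    c = np.quantile(x, np.linspace(0.1, 0.9, k))         # spread initial centers
    for _ in range(n_iter):
        lab = np.argmin(np.abs(x[:, None] - c[None, :]), axis=1)
        c = np.array([x[lab == j].mean() for j in range(k)])
    return lab

lab = kmeans_1d(np.log(rates), k=5)
# Mean force vs mean log loading rate per cluster -> linear Bell-Evans fit.
mf = np.array([forces[lab == j].mean() for j in range(5)])
mr = np.array([np.log(rates[lab == j]).mean() for j in range(5)])
slope, intercept = np.polyfit(mr, mf, 1)
xb_est = kBT / slope                                     # slope = kBT / x_beta
koff_est = (xb_est / kBT) * np.exp(-intercept / slope)   # from the intercept
print(xb_est, koff_est)
```

Since the synthetic forces lie exactly on the Bell-Evans line, the cluster means are collinear and the fit returns the assumed parameters; with thermal noise added, the benefit of clustering over fixed pulling-speed groups becomes the interesting question the paper studies.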

  17. A social work study on the effect of transactional analysis on the improvement of intimacy attitude

    Directory of Open Access Journals (Sweden)

    Parvin Gol

    2013-04-01

    Full Text Available The purpose of this paper is to investigate the impact of group counseling using transactional analysis on the improvement of intimacy attitude in depressed patients in the city of Esfahan, Iran. A quasi-experimental pretest-posttest control-group design was conducted with 30 patients. The sample was selected through convenience sampling among depressed patients referred to psychiatric centers. Patients were randomly assigned to experimental and control groups. The measurement instruments were the intimacy attitude scale (IAS) questionnaire by Amidon et al. (1983) [Amidon, E., Kumar, V. K., & Treadwell, T. (1983). Measurement of intimacy attitudes: The intimacy attitude scale-revisited. Journal of Personality Assessment, 47(6), 635-639.] and the Beck Depression Inventory (BDI). The pretest and posttest scores on the intimacy attitude scale were analyzed in both the experimental and control groups. For statistical analysis of the data, repeated-measures analysis of variance was carried out. The findings indicated that group counseling using transactional analysis increases the level of intimacy attitude in depressed individuals. It also increases emotional intimacy, but it does not increase mental intimacy.

  18. Computerized lung sound analysis following clinical improvement of pulmonary edema due to congestive heart failure exacerbations.

    Science.gov (United States)

    Wang, Zhen; Xiong, Ying-xia

    2010-05-05

    Although acute congestive heart failure (CHF) patients typically present with abnormal auscultatory findings on lung examination, lung sounds are not normally subjected to rigorous analysis. The goals of this study were to use a computerized analytic acoustic tool to evaluate lung sound patterns in CHF patients during acute exacerbation and after clinical improvement, and to compare CHF profiles with those of normal individuals. Lung sounds throughout the respiratory cycle were captured using a computerized acoustic-based imaging technique. Thirty-two consecutive CHF patients were imaged at the time of presentation to the emergency department and after clinical improvement. Digital images were created, and the geographical area of the images and the lung sound patterns were quantitatively analyzed. The geographical areas of the vibration energy image of acute CHF patients without and with radiographically evident pulmonary edema were (67.9 +/- 4.7) and (60.3 +/- 3.5) kilo-pixels, respectively. After clinical improvement, the geographical areas of the vibration energy image increased to (74.5 +/- 4.4) and (73.9 +/- 3.9) kilo-pixels, respectively. Computerized lung sound analysis may be useful to track clinical improvement in acute CHF exacerbations.

  19. Development of thermodynamic optimum searching (TOS) to improve the prediction accuracy of flux balance analysis.

    Science.gov (United States)

    Zhu, Yan; Song, Jiangning; Xu, Zixiang; Sun, Jibin; Zhang, Yanping; Li, Yin; Ma, Yanhe

    2013-03-01

    Flux balance analysis (FBA) has been widely used to calculate steady-state flux distributions that provide important information for metabolic engineering. Several thermodynamics-based methods, for example, quantitative assignment of reaction directionality and energy balance analysis, have been developed to improve the prediction accuracy of FBA. However, these methods can only generate a thermodynamically feasible range, rather than the most thermodynamically favorable solution. We therefore developed a novel optimization method, termed thermodynamic optimum searching (TOS), to calculate the thermodynamically optimal solution, based on the second law of thermodynamics, the minimum magnitude of the Gibbs free energy change and the maximum entropy production principle (MEPP). TOS was then applied to five physiological conditions of Escherichia coli to evaluate its effectiveness. The prediction accuracy was significantly improved (by 10.7-48.5%) in comparison with (13)C-fluxome data, indicating that TOS can be considered an advanced calculation and prediction tool in metabolic engineering. Copyright © 2012 Wiley Periodicals, Inc.
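The baseline FBA calculation that TOS builds on is a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy sketch with a hypothetical three-reaction network (assuming scipy is available; the thermodynamic constraints of TOS itself are not reproduced):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network: R1 uptake -> A, R2: A -> B, R3: B -> biomass
S = np.array([[1, -1,  0],     # metabolite A balance
              [0,  1, -1]])    # metabolite B balance
c = [0, 0, -1]                 # maximize v3 (linprog minimizes, hence the sign)
bounds = [(0, 10), (0, 100), (0, 100)]   # uptake flux capped at 10

res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds, method="highs")
print(res.x)
```

At steady state every flux must equal the capped uptake, so the optimum is v = (10, 10, 10); thermodynamics-based refinements such as TOS then select among (or re-rank) such feasible optima.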

  20. Diesel engine noise source identification based on EEMD, coherent power spectrum analysis and improved AHP

    Science.gov (United States)

    Zhang, Junhong; Wang, Jian; Lin, Jiewei; Bi, Fengrong; Guo, Qian; Chen, Kongwu; Ma, Liang

    2015-09-01

    As the essential foundation of noise reduction, many noise source identification methods have been developed and applied in engineering practice. To identify the noise sources of different engine parts over a broad frequency band at various typical speeds, this paper presents an integrated noise source identification method based on ensemble empirical mode decomposition (EEMD), coherent power spectrum analysis, and an improved analytic hierarchy process (AHP). The measured noise is decomposed into several IMFs with physical meaning, which ensures that the coherence analysis between the IMFs and the vibration signals is meaningful. An improved AHP is developed by introducing an objective weighting function to replace the traditional subjective evaluation, which makes the results independent of subjective judgment and provides better consistency. The proposed identification model is applied to the surface-radiated noise of a diesel engine. As a result, the frequency-dependent contributions of different engine parts to different test points at different speeds are obtained, and an overall weight order is obtained as oil pan  >  left body  >  valve chamber cover  >  gear chamber casing  >  right body  >  flywheel housing, which provides effective guidance for noise reduction.
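The AHP weighting step underlying such a ranking can be sketched with the classic principal-eigenvector computation on a pairwise comparison matrix. The 3x3 matrix below is hypothetical, not the paper's data, and the paper replaces the subjective comparisons with an objective weighting function:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three engine parts:
# entry [i, j] says how much more part i contributes than part j.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue (lambda_max)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized AHP priority weights
CI = (eigvals.real[k] - 3) / (3 - 1)   # consistency index for a 3x3 matrix
print(w, CI)
```

The weight vector orders the three hypothetical parts by contribution, and a small consistency index (well below the usual 0.1 rule of thumb) confirms the comparisons are nearly consistent.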

  1. Improved RNA analysis for immediate autopsy of temporal bone soft tissues.

    Science.gov (United States)

    Lin, J; Kawano, H; Paparella, M M; Ho, S B

    1999-01-01

    RNA analysis is essential for understanding biological activities of a cell or tissue. Unfortunately, retrieval of RNA from existing archives of human temporal bones has proven extremely difficult due to degradation of RNA molecules. The major factors that contribute to degradation of RNA in specimens from autopsied temporal bones are tissue autolysis due to time elapsed before autopsy, and technical problems in processing the bones after harvest. We therefore focused on improving the survival of RNA in human temporal bones by shortening the time to autopsy and through modification of the processing technique by removing targeted tissues directly from the temporal bones and by avoiding time-consuming decalcification and celloidin-embedding. Eight temporal bones collected at immediate autopsies were used in this study. Representative mRNAs, ranging from high (MUC5B, physically unstable) to low (beta-actin, physically stable) molecular weights, and from abundant (MUC5B) to non-abundant (MUC1) RNA, were studied by in situ hybridization, Northern blot technique, or both. Using this modified protocol in autopsies performed up to 6 h after death, the existence of mRNAs was demonstrated in all bones studied. This improved method demonstrates the feasibility of the use of autopsied temporal bone tissues for RNA analysis.

  2. Analysis of the dynamic response improvement of a turbocharged diesel engine driven alternating current generating set

    International Nuclear Information System (INIS)

    Katrasnik, Tomaz; Medica, Vladimir; Trenc, Ferdinand

    2005-01-01

    Reliability of electric supply systems is among the most important necessities of modern society. Turbocharged diesel engine driven alternating current generating sets are often used to prevent electrical blackouts and/or as prime electric energy suppliers. It is well known that turbocharged diesel engines suffer from an inadequate response to a sudden load increase, this being a consequence of the nature of the energy exchange between the engine and the turbocharger. The dynamic response of turbocharged diesel engines can be improved by electric assisting systems, either by direct energy supply with an integrated starter-generator-booster (ISG) mounted on the engine flywheel, or by indirect energy supply with an electrically assisted turbocharger. An experimentally verified zero-dimensional computer simulation method was used for the analysis of both types of electrical assistance. The paper offers an analysis of the interaction between a turbocharged diesel engine and different electric assisting systems, as well as of the requirements for the supporting electric motors that could improve the dynamic response of a diesel engine driving an AC generating set. When performance class compliance is a concern, it is evident that an integrated starter-generator-booster outperforms an electrically assisted turbocharger for the investigated generating set. However, the electric energy consumption and frequency recovery times are smaller when an electrically assisted turbocharger is applied.

  3. Image preprocessing improves Fourier-based texture analysis of nuclear chromatin.

    Science.gov (United States)

    Adam, Randall L; Leite, Neucimar J; Metze, Konradin

    2008-06-01

    To investigate whether preprocessing of digitized images can improve the analysis of chromatin in cytologic preparations using the fast Fourier transform (FFT), in a preprocessing step the borders of the segmented nuclei were smoothed, thus avoiding the Airy ring artifact. We tested this method by comparing the inertia values of digitized cardiomyocyte nuclei of rats of different ages. Furthermore, we created in silico nuclear images with chromatin alterations at or near the nuclear edge in order to investigate the robustness of our method. After preprocessing, the FFT-derived variable inertia discriminated the chromatin structure of the nuclei at different ages significantly better in every frequency range. The investigation of simulated nuclei revealed that within the frequency range from 1.8 microm to 0.72 microm, smoothing of the borders does not interfere with the detection of chromatin changes at the nuclear border. Smoothing of borders in segmented images can improve the analysis of Fourier-derived variables of the chromatin texture.
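A Fourier "inertia" texture feature of the kind discussed can be sketched as a radially weighted power sum over a frequency band. This is an illustrative reconstruction, not the authors' exact definition, and the two synthetic textures stand in for coarse versus fine chromatin:

```python
import numpy as np

def fft_inertia(img, rmin, rmax):
    """'Inertia' of the 2-D FFT power spectrum in a radial frequency band:
    power weighted by squared distance from the zero-frequency centre."""
    f = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(f) ** 2
    y, x = np.indices(img.shape)
    cy, cx = np.array(img.shape) // 2
    r = np.hypot(y - cy, x - cx)
    band = (r >= rmin) & (r < rmax)
    return np.sum(power[band] * r[band] ** 2) / np.sum(power)

rng = np.random.default_rng(3)
coarse = rng.normal(size=(8, 8)).repeat(8, axis=0).repeat(8, axis=1)  # coarse blocks
fine = rng.normal(size=(64, 64))                                      # fine-grained texture
print(fft_inertia(coarse, 1, 32), fft_inertia(fine, 1, 32))
```

Fine-grained texture puts more power at high spatial frequencies (large r), so its inertia exceeds that of the coarse texture, which is the kind of discrimination the paper's border smoothing protects from the Airy ring artifact.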

  4. Security analysis and improvements of authentication and access control in the Internet of Things.

    Science.gov (United States)

    Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon

    2014-08-13

    The Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices, together with the ability to continuously generate data and transmit them over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network, as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18-21 June 2012, pp. 588-592). According to our analysis, Jing et al.'s protocol is costly in message exchange, and its security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements provide many services to the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks and achieves better efficiency at low communication cost.

  5. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The new BAT® software improves the correlation between subjective and objective BSI values, and may become a new standard for trials evaluating breast symmetry.

  6. Improvement of reflood model in RELAP5 code based on sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dong; Liu, Xiaojing; Yang, Yanhua, E-mail: yanhuay@sjtu.edu.cn

    2016-07-15

    Highlights: • Sensitivity analysis is performed on the reflood model of RELAP5. • The selected influential models are discussed and modified. • The modifications are assessed against the FEBA experiment and better predictions are obtained. - Abstract: Reflooding is an important and complex process for the safety of a nuclear reactor during a loss of coolant accident (LOCA). Accurate prediction of reflooding behavior is one of the challenging tasks for current system code development. RELAP5, a widely used system code, has the capability to simulate this process but with limited accuracy, especially for low inlet flow rate reflooding conditions. Through a preliminary assessment with six FEBA (Flooding Experiments with Blocked Arrays) tests, it is observed that the peak cladding temperature (PCT) is generally underestimated and bundle quench is predicted too early compared to the experimental data. In this paper, the improvement of the constitutive models related to reflooding is carried out based on single-parameter sensitivity analysis. The film boiling heat transfer model and the interfacial friction model of dispersed flow are selected as the models most influential on the results of interest. Studies and discussions are then focused on these sensitive models, and proper modifications are recommended. The proposed improvements are implemented in the RELAP5 code and assessed against the FEBA experiment. Better agreement between calculations and measured data for both cladding temperature and quench time is obtained.

  7. Repeated Quantitative Urine Toxicology Analysis May Improve Chronic Pain Patient Compliance with Opioid Therapy.

    Science.gov (United States)

    Knezevic, Nebojsa Nick; Khan, Omar M; Beiranvand, Afsaneh; Candido, Kenneth D

    2017-02-01

    Even though serious efforts have been undertaken by different medical societies to reduce opioid use for treating chronic benign pain, many Americans continue to seek pain relief through opioid consumption. Assuring the compliance of these patients may be a difficult aspect of proper management, even with regular behavioral monitoring. The purpose of this study was to accurately assess the compliance of chronic opioid-consuming patients in an outpatient setting and to evaluate whether utilizing repeated urine drug testing (UDT) could improve compliance. Retrospective analysis of prospectively collected data. Outpatient pain management clinic. After Institutional Review Board (IRB) approval, a retrospective analysis of data for 500 patients was conducted. We included patients aged 18 years and older who were treated with opioid analgesic medication for chronic pain. Patients were asked to provide supervised urine toxicology specimens during their regular clinic visits, without prior notification. The specimens were sent to an external laboratory for quantitative testing using liquid chromatography-tandem mass spectrometry. Three hundred and eighty-six (77.2%) patients were compliant with prescribed medications and did not use any illicit drugs or undeclared medications. Forty-one (8.2%) patients tested positive for opioid medication(s) that were not prescribed in our clinic; 8 (1.6%) patients were positive for medication that was not prescribed by any physician and was not present in the Illinois Prescription Monitoring Program; 5 (1%) patients tested negative for prescribed opioids; and 60 (12%) patients were positive for illicit drugs (8.6% marijuana, 3.2% cocaine, 0.2% heroin). Repeated UDTs following education and disclosure showed that 49 of the 77 patients (63.6%) had improved compliance. This was a single-site study, and we normalized concentrations of opioids in urine with creatinine levels, while specific gravity normalization was not performed.

  8. Improving student satisfaction of Andalas University Dormitory through Service Quality and Importance Performance Analysis

    Science.gov (United States)

    Putri, Nilda Tri; Anggraini, Larisa

    2018-03-01

    Residential satisfaction with university dormitories is one of the significant aspects of sustainability in higher education. This research investigated the quality of dormitory services in the Andalas University Dormitory based on student satisfaction. According to residential management, residential student enrollment has increased gradually at Andalas University. In 2016, the dormitory had a capacity of 1686 students, but only 1081 students could stay there because some rooms were in bad condition. There are many problems and complaints regarding the dormitory's service quality, e.g., water problems, leaky rooms and bathrooms, cleanliness, and inadequate facilities in the residential college. In addition, 20% of last year's residential students checked out before their contracts ran out. The aims of this research are to understand the gap that exists between the expectations and perceptions of residential students with regard to service quality, and to evaluate improvement priorities for services using Importance Performance Analysis (IPA). This study measures service quality using the Responsiveness, Assurance, Empathy, Reliability, and Tangibles dimensions. A negative gap indicates that the actual services fall short of what was expected, and such a gap highlights an area for improvement. Based on the IPA, management should improve the following service dimensions: responsiveness, tangibles, and assurance.
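    The gap computation described in this record is simple to reproduce. A minimal sketch follows; the dimension names match the abstract, but all scores are invented for illustration (hypothetical 1-5 Likert means), and the quadrant plotting of a full IPA is omitted:

```python
# Hypothetical SERVQUAL-style gap analysis: gap = perception - expectation
# per dimension. Negative gaps mark dimensions needing improvement.

def servqual_gaps(expectations, perceptions):
    """Return {dimension: perception - expectation}, rounded to 2 decimals."""
    return {dim: round(perceptions[dim] - expectations[dim], 2)
            for dim in expectations}

# Invented mean scores on a 1-5 scale (not data from the study):
expect = {"responsiveness": 4.5, "assurance": 4.3, "empathy": 4.1,
          "reliability": 4.4, "tangibles": 4.6}
perceive = {"responsiveness": 3.2, "assurance": 3.4, "empathy": 3.9,
            "reliability": 3.8, "tangibles": 3.1}

gaps = servqual_gaps(expect, perceive)
priority = sorted(gaps, key=gaps.get)  # most negative gap first
print(priority[:3])  # → ['tangibles', 'responsiveness', 'assurance']
```

    With these made-up numbers the three widest gaps happen to coincide with the dimensions the abstract flags for improvement.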

  9. Designing small universal k-mer hitting sets for improved analysis of high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Yaron Orenstein

    2017-10-01

    Full Text Available With the rapidly increasing volume of deep sequencing data, more efficient algorithms and data structures are needed. Minimizers are a central recent paradigm that has improved various sequence analysis tasks, including hashing for faster read overlap detection, sparse suffix arrays for creating smaller indexes, and Bloom filters for speeding up sequence search. Here, we propose an alternative paradigm that can lead to substantial further improvement in these and other tasks. For integers k and L > k, we say that a set of k-mers is a universal hitting set (UHS) if every possible L-long sequence must contain a k-mer from the set. We develop a heuristic called DOCKS to find a compact UHS, which works in two phases: the first phase is solved optimally, and for the second we propose several efficient heuristics, trading set size for speed and memory. The use of heuristics is motivated by showing the NP-hardness of a closely related problem. We show that DOCKS works well in practice and produces UHSs that are very close to a theoretical lower bound. We present results for various values of k and L, and by applying them to real genomes we show that UHSs indeed improve over minimizers. In particular, DOCKS uses less than 30% of the 10-mers needed to span the human genome compared to minimizers. The software and computed UHSs are freely available at github.com/Shamir-Lab/DOCKS/ and acgt.cs.tau.ac.il/docks/, respectively.
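    The UHS definition can be checked mechanically: a k-mer set is universal for (k, L) exactly when no L-long sequence avoids all of its k-mers. A brute-force sketch for tiny k and L (DOCKS itself uses far more sophisticated machinery to *construct* small sets; this only *verifies* the property):

```python
from itertools import product

def is_universal_hitting_set(kmers, k, L, alphabet="ACGT"):
    """Brute-force check: does every L-long sequence contain a k-mer from the set?"""
    kmers = set(kmers)
    for seq in product(alphabet, repeat=L):
        s = "".join(seq)
        if not any(s[i:i + k] in kmers for i in range(L - k + 1)):
            return False  # found an L-long sequence that the set misses
    return True

# The full set of 4^2 = 16 2-mers trivially hits every 4-long sequence...
all_2mers = ["".join(p) for p in product("ACGT", repeat=2)]
print(is_universal_hitting_set(all_2mers, 2, 4))                 # → True
# ...but dropping "AA" leaves the sequence "AAAA" unhit:
print(is_universal_hitting_set(set(all_2mers) - {"AA"}, 2, 4))   # → False
```

    Enumeration over all 4^L sequences is only feasible for toy parameters, which is why heuristics like DOCKS are needed for realistic k and L.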

  10. Improving food safety within the dairy chain: an application of conjoint analysis.

    Science.gov (United States)

    Valeeva, N I; Meuwissen, M P M; Lansink, A G J M Oude; Huirne, R B M

    2005-04-01

    This study determined the relative importance of attributes of food safety improvement in the production chain of fluid pasteurized milk. The chain was divided into 4 blocks: "feed" (compound feed production and its transport), "farm" (dairy farm), "dairy processing" (transport and processing of raw milk, delivery of pasteurized milk), and "consumer" (retailer/catering establishment and pasteurized milk consumption). The concept of food safety improvement focused on 2 main groups of hazards: chemical (antibiotics and dioxin) and microbiological (Salmonella, Escherichia coli, Mycobacterium paratuberculosis, and Staphylococcus aureus). Adaptive conjoint analysis was used to investigate food safety experts' perceptions of the attributes' importance. Preference data from individual experts (n = 24) on 101 attributes along the chain were collected in a computer-interactive mode. Experts perceived the attributes from the "feed" and "farm" blocks as being more vital for controlling the chemical hazards; whereas the attributes from the "farm" and "dairy processing" were considered more vital for controlling the microbiological hazards. For the chemical hazards, "identification of treated cows" and "quality assurance system of compound feed manufacturers" were considered the most important attributes. For the microbiological hazards, these were "manure supply source" and "action in salmonellosis and M. paratuberculosis cases". The rather high importance of attributes relating to quality assurance and traceability systems of the chain participants indicates that participants look for food safety assurance from the preceding participants. This information has substantial decision-making implications for private businesses along the chain and for the government regarding the food safety improvement of fluid pasteurized milk.

  11. Analysis of Technological Innovation and Environmental Performance Improvement in Aviation Sector

    Science.gov (United States)

    Lee, Joosung; Mo, Jeonghoon

    2011-01-01

    The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector—aircraft manufacturers and airlines—has also made significant efforts to improve the fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s while the high oil prices in the 1970s and on did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be the feasible option with a meaningful reduction in aviation’s lifecycle environmental impact if they can achieve sufficient economies of scale. PMID:22016716

  12. Analysis of technological innovation and environmental performance improvement in aviation sector.

    Science.gov (United States)

    Lee, Joosung; Mo, Jeonghoon

    2011-09-01

    The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector-aircraft manufacturers and airlines-has also made significant efforts to improve the fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s while the high oil prices in the 1970s and on did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be the feasible option with a meaningful reduction in aviation's lifecycle environmental impact if they can achieve sufficient economies of scale.

  13. Genome shuffling of Propionibacterium shermanii for improving vitamin B12 production and comparative proteome analysis.

    Science.gov (United States)

    Zhang, Ying; Liu, Jian-Zhong; Huang, Jun-Sheng; Mao, Zong-Wan

    2010-07-20

    Genome shuffling is an efficient approach for the rapid improvement of microbial phenotypes. Here we improved the vitamin B12 production of Propionibacterium shermanii by genome shuffling based on inactivated protoplast fusion. A genome-shuffled strain with a vitamin B12 titer of 2.85 mg l(-1), named Propionibacterium shermanii-F2-3, was obtained. The genome-shuffled strain produced about a 61% improvement in vitamin B12 over the parent strain after 96 h. A comparative analysis of the proteome profiles of Propionibacterium shermanii 17 and F2-3 was conducted. The expression levels of 38 proteins varied significantly in the genome-shuffled strain compared with the parent strain. Of these proteins, 22 were up-regulated and 16 were down-regulated. Of the up-regulated proteins, 6 (glutaminyl-tRNA synthetase (GlnS), delta-aminolevulinic acid dehydratase (HemB), methionine synthase (MetH), riboflavin synthase (RibE), phosphofructokinase (PfkA), and isocitrate dehydrogenase (Icd)) are involved in the vitamin B12 biosynthesis pathway. They may be the key enzymes of vitamin B12 biosynthesis. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  14. Energy spectrum analysis of blast waves based on an improved Hilbert-Huang transform

    Science.gov (United States)

    Li, L.; Wang, F.; Shang, F.; Jia, Y.; Zhao, C.; Kong, D.

    2017-05-01

    Using the improved Hilbert-Huang transform (HHT), this paper investigates the problems of analysis and interpretation of the energy spectrum of a blast wave. It has been previously established that the energy spectrum is an effective feature by which to characterize a blast wave. In fact, the higher the energy spectra in a frequency band of a blast wave, the greater the damage to a target in the same frequency band. However, most current research focuses on analyzing wave signals in the time domain or frequency domain rather than considering the energy spectrum. We propose here an improved HHT method combined with a wavelet packet to extract the energy spectrum feature of a blast wave. When applying the HHT, the signal is first roughly decomposed into a series of intrinsic mode functions (IMFs) by empirical mode decomposition. The wavelet packet method is then performed on each IMF to eliminate noise on the energy spectrum. Second, a coefficient is introduced to remove unrelated IMFs. The energy of each instantaneous frequency can be derived through the Hilbert transform. The energy spectrum can then be obtained by adding up all the components after the wavelet packet filters and screens them through a coefficient to obtain the effective IMFs. The effectiveness of the proposed method is demonstrated by 12 groups of experimental data, and an energy attenuation model is established based on the experimental data. The improved HHT is a precise method for blast wave signal analysis. For other shock wave signals from blasting experiments, an energy-frequency-time distribution and energy spectrum can also be obtained through this method, allowing for more practical applications.
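    The Hilbert-transform step of the HHT, which turns each IMF into instantaneous amplitude and frequency, can be sketched via the FFT construction of the analytic signal. The following is a minimal NumPy-only illustration on a toy single-tone signal standing in for one decomposed component; the EMD and wavelet-packet stages of the method above are omitted:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (the Hilbert-transform step of the HHT).
    Assumes an even-length real input."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = h[N // 2] = 1.0     # keep DC and Nyquist once
    h[1:N // 2] = 2.0          # double positive frequencies
    return np.fft.ifft(X * h)  # negative frequencies zeroed

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = 3.0 * np.cos(2 * np.pi * 50 * t)  # toy stand-in for one IMF, 50 Hz

z = analytic_signal(x)
amplitude = np.abs(z)                            # instantaneous amplitude
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)    # instantaneous frequency (Hz)

print(round(float(amplitude.mean()), 2))   # → 3.0
print(round(float(np.median(inst_freq)), 1))  # → 50.0
```

    Summing amplitude-squared contributions per instantaneous-frequency bin over all retained IMFs would then yield the energy spectrum the abstract describes.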

  15. Life cycle assessment of Italian citrus-based products. Sensitivity analysis and improvement scenarios.

    Science.gov (United States)

    Beccali, Marco; Cellura, Maurizio; Iudicello, Maria; Mistretta, Marina

    2010-07-01

    Though many studies concern the agro-food sector in the EU and Italy and its environmental impacts, the literature is quite lacking in works regarding LCA application to citrus products. This paper represents one of the first studies of the environmental impacts of citrus products, aiming to suggest feasible strategies and actions to improve their environmental performance. In particular, it is part of a research effort to estimate the environmental burdens associated with the production of the following citrus-based products: essential oil, natural juice and concentrated juice from oranges and lemons. The life cycle assessment of these products, published in a previous paper, had highlighted significant environmental issues in terms of energy consumption, associated CO(2) emissions, and water consumption. Starting from those results, the authors carry out an improvement analysis of the assessed production system, whereby sustainable scenarios for saving water and energy are proposed to reduce the environmental burdens of the examined production system. In addition, a sensitivity analysis was performed to estimate the effects of the chosen methods on the outcome of the study. Uncertainty related to allocation methods, secondary data sources, and initial assumptions on cultivation, transport modes, and waste management is analysed. The results of the performed analyses show that every assessed eco-profile is influenced differently by the uncertainty study. Different assumptions on initial data and methods produced very noticeable variations in the energy and environmental performances of the final products. Besides, the results show energy and environmental benefits that clearly indicate improvement of the products' eco-profiles, achieved by reusing purified water for irrigation, using rail for the delivery of final products where possible, and adopting efficient technologies, such as mechanical vapour recompression, in the pasteurisation and

  16. Developing person-centred analysis of harm in a paediatric hospital: a quality improvement report.

    Science.gov (United States)

    Lachman, Peter; Linkson, Lynette; Evans, Trish; Clausen, Henning; Hothi, Daljit

    2015-05-01

    The provision of safe care is complex and difficult to achieve. Awareness of what happens in real time is one of the ways to develop a safe system within a culture of safety. At Great Ormond Street Hospital, we developed and tested a tool specifically designed for patients and families to report harm, with the aim of raising awareness and opportunities for staff to continually improve and provide safe care. Over a 10-month period, we developed processes to report harm. We used the Model for Improvement and multiple Plan, Do, Study, Act cycles for testing. We measured changes using culture surveys as well as analysis of the reports. The tool was tested in different formats and moved from a provider centric to a person-centred tool analysed in real time. An independent person working with the families was best placed to support reporting. Immediate feedback to families was managed by senior staff, and provided the opportunity for clarification, transparency and apologies. Feedback to staff provided learning opportunities. Improvements in culture climate and staff reporting were noted in the short term. The integration of patient involvement in safety monitoring systems is essential to achieve safety. The high number of newly identified 'near-misses' and 'critical incidents' by families demonstrated an underestimation of potentially harmful events. This testing and introduction of a self-reporting, real-time bedside tool has led to active engagement with families and patients and raised situation awareness. We believe that this will lead to improved and safer care in the longer term. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  17. Crossing the Barriers: An Analysis of Land Access Barriers to Geothermal Development and Potential Improvement Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Katherine R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-10-04

    Developers have identified many non-technical barriers to geothermal power development, including access to land. Activities required for accessing land, such as environmental review and private and public leasing, can take a considerable amount of time and can delay or prevent project development. This paper discusses the impacts on available geothermal resources and deployment caused by land access challenges, including tribal and cultural resources, environmentally sensitive areas, biological resources, land ownership, federal and state lease queues, and proximity to military installations. In this analysis, we identified challenges that have the potential to prevent development of identified and undiscovered hydrothermal geothermal resources. We found that an estimated 400 MW of identified geothermal resource potential and 4,000 MW of undiscovered geothermal resource potential were either closed to development or subject to one or more significant barriers that could prevent development at the site. Potential improvement scenarios that could be employed to overcome these barriers include (1) providing continuous funding to the U.S. Forest Service (USFS) for processing geothermal leases and permit applications and (2) the creation of advanced environmental mitigation measures. The model results forecast that continuous funding to the USFS could result in deployment of an additional 80 MW of geothermal capacity by 2030 and 124 MW of geothermal capacity by 2050 when compared to the business-as-usual scenario. The creation of advanced environmental mitigation measures coupled with continuous funding to the USFS could result in deployment of an additional 97 MW of geothermal capacity by 2030 and 152 MW of geothermal capacity by 2050 when compared to the business-as-usual scenario. The small impact on potential deployment in these improvement scenarios suggests that these 4,400 MW have other barriers to development in addition to land access.
In other words, simply

  18. Music-assisted relaxation to improve sleep quality: meta-analysis.

    Science.gov (United States)

    de Niet, Gerrit; Tiemens, Bea; Lendemeijer, Bert; Hutschemaekers, Giel

    2009-07-01

    This paper is a report of a meta-analysis conducted to evaluate the efficacy of music-assisted relaxation for sleep quality in adults and elders with sleep complaints with or without a co-morbid medical condition. Clinical studies have shown that music can influence treatment outcome in a positive and beneficial way. Music holds the promise of counteracting psychological presleep arousal and thus improving the preconditions for sleep. We conducted a search in the Embase (1997 - July 2008), Medline (1950 - July 2008), Cochrane (2000 - July 2008), Psychinfo (1987 - July 2008) and Cinahl (1982 - July 2008) databases for randomized controlled trials reported in English, German, French and Dutch. The outcome measure of interest was sleep quality. Data were extracted from the included studies using predefined data fields. The researchers independently assessed the quality of the trials using the Delphi list. Only studies with a score of 5 points or higher were included. A pooled analysis was performed based on a fixed effect model. Five randomized controlled trials with six treatment conditions and a total of 170 participants in intervention groups and 138 controls met our inclusion criteria. Music-assisted relaxation had a moderate effect on the sleep quality of patients with sleep complaints (standardized mean difference, -0.74; 95% CI: -0.96, -0.46). Subgroup analysis revealed no statistically significant contribution of accompanying measures. Music-assisted relaxation can be used without intensive investment in training and materials and is therefore cheap, easily available and can be used by nurses to promote music-assisted relaxation to improve sleep quality.
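    The fixed-effect pooling used in this review is standard inverse-variance weighting of per-study effect sizes. A minimal sketch follows; the per-study SMDs and variances below are invented for illustration and are not the five trials analysed here:

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance fixed-effect pooling of standardized mean differences.
    Returns the pooled SMD and its 95% confidence interval."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study SMDs (negative = better sleep quality) and variances:
smd = [-0.9, -0.6, -0.8, -0.5, -0.7]
var = [0.04, 0.05, 0.05, 0.05, 0.04]

pooled, ci = fixed_effect_pool(smd, var)
print(f"{pooled:.3f}")  # → -0.709
```

    Precise studies (small variance) get larger weight, so the pooled estimate sits closer to their effects than a plain average would.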

  19. An improved method for Multipath Hemispherical Map (MHM) based on Trend Surface Analysis

    Science.gov (United States)

    Wang, Zhiren; Chen, Wen; Dong, Danan; Yu, Chao

    2017-04-01

    Among the various approaches developed for detecting the multipath effect in high-accuracy GNSS positioning, only MHM (Multipath Hemispherical Map) and SF (Sidereal Filtering) can be implemented in real-time GNSS data processing. SF is based on the time repeatability of the satellites, which makes it suitable only for static environments, while the spatiotemporal-repeatability-based MHM is applicable not only to static environments but also to dynamic carriers in a static multipath environment, such as ships and airplanes, and uses a much smaller number of parameters than ASF. However, the MHM method also has certain defects. Since the MHM takes the mean of the residuals in each grid cell as the filter value, it is most suitable when the multipath regime is medium- to low-frequency. Existing research indicates that, by contrast, the newly advanced Sidereal Filtering (ASF) method performs better than MHM at high-frequency multipath reduction. To solve the above problem and improve MHM's performance on high-frequency multipath, we combined binary trend surface analysis with the original MHM model to effectively capture the particular spatial distribution and variation trends of the multipath effect. We computed trend surfaces of the residuals within each grid cell by least-squares procedures, and chose the best results through the moderate successive test. The enhanced MHM grid was constructed from a set of coefficients of the fitted equation instead of a mean value. According to the analysis of actual observations, the improved MHM model shows a positive effect on high-frequency multipath reduction, and significantly reduces the root mean square (RMS) value of the carrier residuals. Keywords: Trend Surface Analysis; Multipath Hemispherical Map; high-frequency multipath effect
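    The core idea of replacing a grid cell's mean residual with a fitted trend surface can be sketched with an ordinary least-squares fit of a second-order surface. The sketch below uses synthetic residuals over normalized cell coordinates; the coordinates, trend coefficients, and noise level are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multipath residuals over one hemispherical grid cell:
# a known quadratic trend in normalized (azimuth, elevation) plus noise.
az = rng.uniform(-1, 1, 200)
el = rng.uniform(-1, 1, 200)
true = np.array([2.0, -0.3, 0.1, 0.05, -0.02, 0.01])
A = np.column_stack([np.ones_like(az), az, el, az**2, az * el, el**2])
z = A @ true + rng.normal(0, 0.01, az.size)

# Second-order trend surface fitted by least squares; the grid cell then
# stores these six coefficients instead of a single mean residual.
coef, *_ = np.linalg.lstsq(A, z, rcond=None)
print(np.allclose(coef, true, atol=0.05))  # → True
```

    Evaluating the fitted surface at the satellite's current azimuth and elevation then gives a correction that tracks spatial variation within the cell, which a single mean cannot.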

  20. Texture analysis improves level set segmentation of the anterior abdominal wall

    International Nuclear Information System (INIS)

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-01-01

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans of postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors' approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful for identifying anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2-5 mm) for methods that did not use the texture. Conclusions: The authors' approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful for tissue classification, and texture

  1. Improving primary health care facility performance in Ghana: efficiency analysis and fiscal space implications.

    Science.gov (United States)

    Novignon, Jacob; Nonvignon, Justice

    2017-06-12

    Health centers in Ghana play an important role in health care delivery, especially in deprived communities. They usually serve as the first line of service and meet basic health care needs. Unfortunately, these facilities are faced with inadequate resources. While health policy makers seek to increase the resources committed to primary healthcare, it is important to understand the nature of the inefficiencies that exist in these facilities. Therefore, the objectives of this study are threefold: (i) estimate efficiency among primary health facilities (health centers), (ii) examine the potential fiscal space from improved efficiency, and (iii) investigate the efficiency disparities between public and private facilities. Data were from the 2015 Access Bottlenecks, Cost and Equity (ABCE) project conducted by the Institute for Health Metrics and Evaluation. Stochastic Frontier Analysis (SFA) was used to estimate the efficiency of health facilities. Efficiency scores were then used to compute potential savings from improved efficiency. Outpatient visits were used as the output, while the number of personnel, hospital beds, and expenditure on other capital items and administration were used as inputs. Disparities in efficiency between public and private facilities were estimated using the Nopo matching decomposition procedure. The average efficiency score across all health centers included in the sample was estimated to be 0.51. Average efficiency was estimated to be about 0.65 and 0.50 for private and public facilities, respectively. Significant disparities in efficiency were identified across the various administrative regions. With regard to potential fiscal space, we found that, on average, facilities could save about GH₵11,450.70 (US$7633.80) if efficiency were improved. We also found that the fiscal space from efficiency gains varies across rural/urban as well as private/public facilities, if best practices are followed. The matching decomposition showed an efficiency gap of 0.29 between private and public facilities.

  2. An improved model for whole genome phylogenetic analysis by Fourier transform.

    Science.gov (United States)

    Yin, Changchuan; Yau, Stephen S-T

    2015-10-07

    The results demonstrate that the improved DFT dissimilarity measure is an efficient and effective similarity measure for DNA sequences. Due to its high efficiency and accuracy, the proposed DFT similarity measure was successfully applied to phylogenetic analysis of individual genes and large whole bacterial genomes. Copyright © 2015 Elsevier Ltd. All rights reserved.
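    The DFT-based comparison of DNA sequences can be sketched as follows. This is a simplified illustration of the general approach (binary indicator sequence per base, DFT power spectrum, Euclidean distance between normalized spectra); the paper's improved measure additionally handles sequences of unequal length via even scaling, which is omitted here, and the example sequences are invented:

```python
import numpy as np

def base_spectrum(seq):
    """Normalized DFT power spectra of the four base-indicator sequences,
    concatenated into one feature vector."""
    spectra = []
    for base in "ACGT":
        u = np.array([1.0 if ch == base else 0.0 for ch in seq])
        spectra.append(np.abs(np.fft.fft(u)) ** 2)
    ps = np.concatenate(spectra)
    return ps / ps.sum()

def dft_distance(s1, s2):
    """Euclidean distance between power spectra (equal lengths only;
    the even-scaling step for unequal lengths is omitted)."""
    assert len(s1) == len(s2)
    return float(np.linalg.norm(base_spectrum(s1) - base_spectrum(s2)))

a = "ACGTACGTACGTACGT"
b = "ACGTACGTACGTACGA"   # one substitution relative to a
c = "AAAAAAAACCCCCCCC"   # very different composition

print(dft_distance(a, a) == 0.0)                # → True
print(dft_distance(a, b) < dft_distance(a, c))  # → True
```

    The distance is zero for identical sequences and grows with compositional and periodic differences, which is what makes it usable as an alignment-free dissimilarity for phylogenetics.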

  3. Effects of improved modeling on best estimate BWR severe accident analysis

    International Nuclear Information System (INIS)

    Hyman, C.R.; Ott, L.J.

    1984-01-01

    Since 1981, ORNL has completed best estimate studies analyzing several dominant BWR accident scenarios. These scenarios were identified by early Probabilistic Risk Assessment (PRA) studies and detailed ORNL analysis complements such studies. In performing these studies, ORNL has used the MARCH code extensively. ORNL investigators have identified several deficiencies in early versions of MARCH with regard to BWR modeling. Some of these deficiencies appear to have been remedied by the most recent release of the code. It is the purpose of this paper to identify several of these deficiencies. All the information presented concerns the degraded core thermal/hydraulic analysis associated with each of the ORNL studies. This includes calculations of the containment response. The period of interest is from the time of permanent core uncovery to the end of the transient. Specific objectives include the determination of the extent of core damage and timing of major events (i.e., onset of Zr/H2O reaction, initial clad/fuel melting, loss of control blade structure, etc.). As mentioned previously the major analysis tool used thus far was derived from an early version of MARCH. BWRs have unique features which must be modeled for best estimate severe accident analysis. ORNL has developed and incorporated into its version of MARCH several improved models. These include (1) channel boxes and control blades, (2) SRV actuations, (3) vessel water level, (4) multi-node analysis of in-vessel water inventory, (5) comprehensive hydrogen and water properties package, (6) first order correction to the ideal gas law, and (7) separation of fuel and cladding. Ongoing and future modeling efforts are required. These include (1) detailed modeling for the pressure suppression pool, (2) incorporation of B4C/steam reaction models, (3) phenomenological model of corium mass transport, and (4) advanced corium/concrete interaction modeling. 10 references, 17 figures, 1 table

  4. Sensitivity analysis of thermodynamic properties of liquid water: a general approach to improve empirical potentials.

    Science.gov (United States)

    Iordanov, Tzvetelin D; Schenter, Gregory K; Garrett, Bruce C

    2006-01-19

    A sensitivity analysis of bulk water thermodynamics is presented in an effort to understand the relation between qualitative features of molecular potentials and the properties that they predict. The analysis is incorporated in molecular dynamics simulations and investigates the sensitivity of the Helmholtz free energy, internal energy, entropy, heat capacity, pressure, thermal pressure coefficient, and static dielectric constant to components of the potential rather than the parameters of a given functional form. The sensitivities of the properties are calculated with respect to the van der Waals repulsive and attractive parts, plus the short- and long-range Coulomb parts, of three four-site empirical water potentials: TIP4P, Dang-Chang and TTM2R. The polarization sensitivity is calculated for the polarizable Dang-Chang and TTM2R potentials. This new type of analysis allows direct comparisons of the sensitivities for different potentials that use different functional forms. The analysis indicates that all investigated properties are most sensitive to the van der Waals repulsive, short-range Coulomb, and polarization components of the potentials. When polarization is included in the potentials, the magnitude of the sensitivity of the Helmholtz free energy, internal energy, and entropy with respect to this part of the potential is comparable in magnitude to the other electrostatic components. In addition, similarities in the trends of the observed sensitivities for nonpolarizable and polarizable potentials lead to the conclusion that the complexity of the model is not of critical importance for the calculation of these thermodynamic properties for bulk water. The van der Waals attractive and long-range Coulomb sensitivities are relatively small for the entropy, heat capacity, thermal pressure coefficient and static dielectric constant, while small changes in any of the potential contributions will significantly affect the pressure. 
The analysis suggests a procedure
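    The component-wise sensitivity idea can be illustrated with a toy sketch (not the authors' code, and using a plain Lennard-Jones pair potential as a stand-in for the four-site water models): scale one component of the potential by a parameter λ and differentiate the total energy with respect to λ. Because the energy is linear in each λ, the sensitivity to the repulsive component is just the summed repulsive energy, which a central finite difference recovers:

    ```python
    def lj_components(r, eps=1.0, sigma=1.0):
        """Split a Lennard-Jones pair interaction into repulsive and attractive parts."""
        rep = 4.0 * eps * (sigma / r) ** 12
        att = -4.0 * eps * (sigma / r) ** 6
        return rep, att

    def energy(rs, lam_rep=1.0, lam_att=1.0):
        """Total pair energy with each component scaled by its own lambda."""
        return sum(lam_rep * rep + lam_att * att for rep, att in map(lj_components, rs))

    def sensitivity_rep(rs, h=1e-6):
        """Central finite-difference estimate of dU/d(lambda_rep) at lambda = 1."""
        return (energy(rs, lam_rep=1.0 + h) - energy(rs, lam_rep=1.0 - h)) / (2.0 * h)
    ```

    At the LJ minimum distance 2^(1/6)σ the total energy is -ε, while the repulsive sensitivity stays positive, mirroring how the paper separates component contributions from the net property value.
    
    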

  5. Flipped classroom improves student learning in health professions education: a meta-analysis.

    Science.gov (United States)

    Hew, Khe Foon; Lo, Chung Kwan

    2018-03-15

    The use of the flipped classroom approach has become increasingly popular in health professions education. However, no meta-analysis has been published that specifically examines the effect of the flipped classroom versus the traditional classroom on student learning. This study examined the findings of comparative articles through a meta-analysis in order to summarize the overall effects of teaching with the flipped classroom approach. We focused specifically on flipped classroom studies in which pre-recorded videos were provided before face-to-face class meetings. These comparative articles focused on health care professionals including medical students, residents, doctors, nurses, or learners in other health care professions and disciplines (e.g., dental, pharmacy, environmental or occupational health). Using predefined study eligibility criteria, seven electronic databases were searched in mid-April 2017 for relevant articles. Methodological quality was graded using the Medical Education Research Study Quality Instrument (MERSQI). Effect sizes, heterogeneity estimates, analysis of possible moderators, and publication bias were computed using the Comprehensive Meta-Analysis software. A meta-analysis of 28 eligible comparative studies (between-subject design) showed an overall significant effect in favor of flipped classrooms over traditional classrooms for health professions education (standardized mean difference, SMD = 0.33; 95% confidence interval, CI = 0.21-0.46; p < …). The flipped classroom approach was more effective when instructors used quizzes at the start of each in-class session. More respondents reported that they preferred flipped to traditional classrooms. Current evidence suggests that the flipped classroom approach in health professions education yields a significant improvement in student learning compared with traditional teaching methods.
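    To make the reported SMD concrete, here is a minimal sketch of Cohen's d with an approximate 95% confidence interval, using the standard pooled-standard-deviation and standard-error formulas (the input numbers in the test are illustrative, not data from this meta-analysis):

    ```python
    import math

    def smd_with_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
        """Cohen's d standardized mean difference with an approximate 95% CI.

        Uses the pooled SD across both groups and the common large-sample
        variance approximation for d (Borenstein-style formulation).
        """
        pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
        d = (m1 - m2) / pooled_sd
        se = math.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
        return d, d - z * se, d + z * se
    ```

    A random-effects meta-analysis then inverse-variance-weights such per-study estimates; the CI above is the per-study building block.
    
    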

  6. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are serious concerns. Errors can occur at any step of transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used to identify the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through a review of records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event; it also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and to introduce preventive measures, in order to find areas where resources need to be allocated to improve patient safety.

  7. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, so most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (the MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.
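    The primary outcome measure, rank correlation between volume change and a clinical variable, can be sketched in plain Python as a generic Spearman implementation with average ranks for ties (an illustration only, not the authors' pipeline):

    ```python
    import math

    def spearman_rho(x, y):
        """Spearman rank correlation: Pearson correlation of the rank vectors."""
        def ranks(v):
            order = sorted(range(len(v)), key=lambda i: v[i])
            r = [0.0] * len(v)
            i = 0
            while i < len(order):
                j = i
                # extend j over a run of tied values
                while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                    j += 1
                avg = (i + j) / 2 + 1  # average 1-based rank for the tied run
                for k in range(i, j + 1):
                    r[order[k]] = avg
                i = j + 1
            return r

        rx, ry = ranks(x), ranks(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        den = math.sqrt(sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry))
        return num / den
    ```

    Negative rho values like those reported (-0.148 to -0.303) indicate that larger black hole volume change accompanies lower clinical scores.
    
    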

  8. Application of computational fluid dynamics methods to improve thermal hydraulic code analysis

    Science.gov (United States)

    Sentell, Dennis Shannon, Jr.

    A computational fluid dynamics code is used to model the primary natural circulation loop of a proposed small modular reactor for comparison to experimental data and best-estimate thermal-hydraulic code results. Recent advances in computational fluid dynamics code modeling capabilities make them attractive alternatives to the current conservative approach of coupled best-estimate thermal hydraulic codes and uncertainty evaluations. The results from a computational fluid dynamics analysis are benchmarked against the experimental test results of a 1:3 length, 1:254 volume, full pressure and full temperature scale small modular reactor during steady-state power operations and during a depressurization transient. A comparative evaluation of the experimental data, the thermal hydraulic code results and the computational fluid dynamics code results provides an opportunity to validate the best-estimate thermal hydraulic code's treatment of a natural circulation loop and provide insights into expanded use of the computational fluid dynamics code in future designs and operations. Additionally, a sensitivity analysis is conducted to determine those physical phenomena most impactful on operations of the proposed reactor's natural circulation loop. The combination of the comparative evaluation and sensitivity analysis provides the resources for increased confidence in model developments for natural circulation loops and provides for reliability improvements of the thermal hydraulic code.

  9. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Heui Dong; Cho, Sung Won; Park, Jong Hwa; Hong, Sung Wan; Yoo, Dong Han; Hwang, Moon Kyoo; Noh, Kee Man; Song, Yong Man [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    For the prevention and mitigation of containment failure during a severe accident, the study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation equation for the flame velocity of premixed H{sub 2}/air/steam gas has been suggested, and combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued in order to review and examine the limitations and deficiencies of the existing models. Pre-test calculations were performed to support the severe accident experiment for the molten corium concrete interaction study, and studies of the crust formation process and the heat transfer characteristics of the crust have been carried out. A stress analysis code based on the finite element method was developed for the reactor vessel lower head failure analysis. Through the international PHEBUS-FP program and participation in software development, research on the core degradation process and fission product release and transport is underway. The CONTAIN and MELCOR codes were continuously updated under cooperation with the USNRC, and French-developed computer codes such as ICARE2, ESCADRE and SOPHAEROS were also installed on a SUN workstation. 204 figs, 61 tabs, 87 refs. (Author).

  10. Analysis and improvement of digital control stability for master-slave manipulator system

    International Nuclear Information System (INIS)

    Yoshida, Koichi; Yabuta, Tetsuro

    1992-01-01

    Some bilateral controls of master-slave systems have been designed that can realize high-fidelity telemanipulation, as if the operator were manipulating the object directly. While most robot systems are controlled by a software-servo system using a digital computer, little work has been published on the design and analysis of digital control for these systems, which must consider the time delay of sensor signals and the zero-order-hold effect of command signals on actuators. This paper presents a digital control analysis for a single-degree-of-freedom master-slave system, including impedance models of both the human operator and the task object, which clarifies an index for stability. The stability result suggests a virtual master-slave system concept, which improves digital control stability. We first analyze a dynamic control method of the master-slave system in the discrete-time domain for the stability problem, which can realize high-fidelity telemanipulation in continuous time. Secondly, using the results of the stability analysis, a robust control scheme for the master-slave system is proposed, and the validity of this scheme is finally confirmed by simulation. Consequently, any combination of master and slave modules that incorporates the dynamic models of these manipulators can construct a stable master-slave system. (author)

  11. Improving the Measurement of Shared Cultural Schemas with Correlational Class Analysis: Theory and Method

    Directory of Open Access Journals (Sweden)

    Andrei Boutyline

    2017-05-01

    Full Text Available Measurement of shared cultural schemas is a central methodological challenge for the sociology of culture. Relational Class Analysis (RCA) is a recently developed technique for identifying such schemas in survey data. However, existing work lacks a clear definition of such schemas, which leaves RCA's accuracy largely unknown. Here, I build on the theoretical intuitions behind RCA to arrive at this definition. I demonstrate that shared schemas should result in linear dependencies between survey rows, the relationship usually measured with Pearson's correlation. I thus modify RCA into a "Correlational Class Analysis" (CCA). When I compare the methods using a broad set of simulations, results show that CCA is reliably more accurate at detecting shared schemas than RCA, even in scenarios that substantially violate CCA's assumptions. I find no evidence of theoretical settings where RCA is more accurate. I then revisit a previous RCA analysis of the 1993 General Social Survey musical tastes module. Whereas RCA partitioned these data into three schematic classes, CCA partitions them into four. I compare these results with a multiple-groups analysis in structural equation modeling and find that CCA's partition yields greatly improved model fit over RCA. I conclude with a parsimonious framework for future work.
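    The core computation behind CCA, pairwise Pearson correlations between respondents' rows of survey answers, can be sketched as follows (a schematic illustration, not Boutyline's implementation; CCA additionally partitions respondents by the magnitude of these correlations, since both r = 1 and r = -1 indicate the same organizing schema):

    ```python
    import math

    def pearson(u, v):
        """Pearson correlation between two equal-length answer rows."""
        n = len(u)
        mu, mv = sum(u) / n, sum(v) / n
        num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
        den = math.sqrt(sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v))
        return num / den if den else 0.0

    def correlation_matrix(rows):
        """Respondent-by-respondent correlation matrix: high |r| between two
        rows suggests the respondents organize the survey items by a shared schema."""
        n = len(rows)
        return [[pearson(rows[i], rows[j]) for j in range(n)] for i in range(n)]
    ```
    
    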

  12. Reliability of multiresolution deconvolution for improving depth resolution in SIMS analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boulakroune, M’Hamed, E-mail: Boulakroune.mhamed@univ-ouargla.dz

    2016-11-15

    Highlights: • Recovery of SIMS profiles by enhancement of depth resolution using multiresolution deconvolution. • The multiresolution deconvolution is based on Tikhonov-Miller regularization and wavelet analysis. • Local application of the regularization parameter at each resolution level leads to smoothed signals without noise-related artifacts. • The aim is to show the ability of multiresolution deconvolution to restore two extremely different structures, one large and one thin. • On the thin structure, deconvolution by zone was successfully applied. - Abstract: This paper addresses the effectiveness and reliability of a multiresolution deconvolution algorithm for recovering Secondary Ion Mass Spectrometry (SIMS) profiles altered by the measurement. This new algorithm is characterized as a regularized wavelet transform. It combines ideas from Tikhonov-Miller regularization, wavelet analysis and deconvolution algorithms in order to benefit from the advantages of each. The SIMS profiles were obtained by analysis of two structures of boron in a silicon matrix using a Cameca IMS-6f instrument at oblique incidence. The first structure is large, consisting of two distant wide boxes; the second is a thin structure containing ten delta-layers, to which deconvolution by zone was applied. It is shown that this new multiresolution algorithm gives the best results. In particular, local application of the regularization parameter to the blurred and estimated solutions at each resolution level led to smoothed signals without creating artifacts related to the noise content of the profile. This led to a significant improvement in depth resolution and in the peaks' maxima.
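    The Tikhonov part of the algorithm can be illustrated with a minimal frequency-domain sketch (zeroth-order regularization only; the paper's multiresolution scheme additionally applies the regularization parameter per wavelet level, which is not shown here):

    ```python
    import numpy as np

    def tikhonov_deconvolve(measured, kernel, lam=1e-3):
        """Frequency-domain Tikhonov deconvolution for a circular blur:
        x_hat(f) = conj(H) Y / (|H|^2 + lam), balancing data fidelity against
        noise amplification where the kernel response |H| is small."""
        n = len(measured)
        H = np.fft.fft(kernel, n)
        Y = np.fft.fft(measured, n)
        X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
        return np.real(np.fft.ifft(X))
    ```

    Larger lam smooths more (suppressing noise) at the cost of depth resolution; the multiresolution variant described in the abstract chooses this trade-off locally per resolution level instead of globally.
    
    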

  13. Preparation of Improved Turkish DataSet for Sentiment Analysis in Social Media

    Directory of Open Access Journals (Sweden)

    Makinist Semiha

    2017-01-01

    Full Text Available A public dataset with a variety of properties suitable for sentiment analysis [1], event prediction, trend detection and other text mining applications is needed in order to perform analysis studies successfully. The vast majority of data on social media is text-based, and it is not possible to apply machine learning processes directly to these raw data, since several different processes are required to prepare the data before the algorithms are run. For example, different misspellings of the same word enlarge the word vector space unnecessarily, which reduces the success of the algorithm and increases the computational power required. This paper presents an improved Turkish dataset built with an effective spelling correction algorithm based on Hadoop [2]. The collected data are recorded on the Hadoop Distributed File System, and the text-based data are processed with the MapReduce programming model. This method is suitable for the storage and processing of large text-based social media datasets. In this study, movie reviews were automatically collected with Apache ManifoldCF (MCF) [3] and data clusters were created. Various methods, such as Levenshtein distance and fuzzy string matching, were compared for creating a public dataset from the collected data. Experimental results show that the proposed algorithm, whose output can be used as an open-source dataset in sentiment analysis studies, successfully detects and corrects spelling errors.
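    A spelling-correction step of the kind described can be sketched with the classic Levenshtein edit distance (a generic illustration; the paper's Hadoop/MapReduce pipeline and its exact matching strategy are not shown):

    ```python
    def levenshtein(a, b):
        """Minimum number of single-character insertions, deletions and
        substitutions to turn string a into string b (row-by-row DP)."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,          # deletion
                               cur[j - 1] + 1,       # insertion
                               prev[j - 1] + (ca != cb)))  # substitution
            prev = cur
        return prev[-1]

    def correct(word, vocabulary):
        """Pick the vocabulary entry with the smallest edit distance to word."""
        return min(vocabulary, key=lambda w: levenshtein(word, w))
    ```

    At social-media scale, comparing every token against the whole vocabulary is the expensive part, which is why the paper distributes this work with MapReduce.
    
    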

  14. In vivo dynamics of skeletal muscle Dystrophin in zebrafish embryos revealed by improved FRAP analysis.

    Science.gov (United States)

    Bajanca, Fernanda; Gonzalez-Perez, Vinicio; Gillespie, Sean J; Beley, Cyriaque; Garcia, Luis; Theveneau, Eric; Sear, Richard P; Hughes, Simon M

    2015-10-13

    Dystrophin forms an essential link between sarcolemma and cytoskeleton, perturbation of which causes muscular dystrophy. We analysed Dystrophin binding dynamics in vivo for the first time. Within maturing fibres of host zebrafish embryos, our analysis reveals a pool of diffusible Dystrophin and complexes bound at the fibre membrane. Combining modelling, an improved FRAP methodology and direct semi-quantitative analysis of bleaching suggests the existence of two membrane-bound Dystrophin populations with widely differing bound lifetimes: a stable, tightly bound pool, and a dynamic bound pool with high turnover rate that exchanges with the cytoplasmic pool. The three populations were found consistently in human and zebrafish Dystrophins overexpressed in wild-type or dmd(ta222a/ta222a) zebrafish embryos, which lack Dystrophin, and in Gt(dmd-Citrine)(ct90a) that express endogenously-driven tagged zebrafish Dystrophin. These results lead to a new model for Dystrophin membrane association in developing muscle, and highlight our methodology as a valuable strategy for in vivo analysis of complex protein dynamics.
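    The two membrane-bound populations with widely differing lifetimes correspond, in a standard reaction-dominant FRAP description, to a double-exponential recovery curve. The following is a generic sketch of such a model, with illustrative parameter names, not the authors' fitting code:

    ```python
    import math

    def frap_recovery(t, mobile_frac, f_fast, k_fast, k_slow):
        """Fraction of pre-bleach fluorescence recovered at time t for a
        two-population reaction-dominant model: a fast-exchanging bound pool
        (off-rate k_fast, weight f_fast) and a stable bound pool (off-rate
        k_slow); (1 - mobile_frac) is immobile on the experiment's timescale."""
        fast = f_fast * (1 - math.exp(-k_fast * t))
        slow = (1 - f_fast) * (1 - math.exp(-k_slow * t))
        return mobile_frac * (fast + slow)
    ```

    Fitting this form to bleach-recovery data yields the bound lifetimes (1/k) of the two pools; widely separated k_fast and k_slow, as reported for Dystrophin, produce a visibly biphasic curve.
    
    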

  15. SWOT analysis of a pediatric rehabilitation programme: a participatory evaluation fostering quality improvement.

    Science.gov (United States)

    Camden, Chantal; Swaine, Bonnie; Tétreault, Sylvie; Bergeron, Sophie

    2009-01-01

    To present the results of a strengths, weaknesses, opportunities and threats (SWOT) analysis used as part of a process aimed at reorganising services provided within a pediatric rehabilitation programme (PRP) in Quebec, Canada and to report the perceptions of the planning committee members regarding the usefulness of the SWOT in this process. Thirty-six service providers working in the PRP completed a SWOT questionnaire and reported what they felt worked and what did not work in the existing model of care. Their responses were used by a planning committee over a 12-month period to assist in the development of a new service delivery model. Committee members shared their thoughts about the usefulness of the SWOT. Current programme strengths included favourable organisational climate and interdisciplinary work whereas weaknesses included lack of psychosocial support to families and long waiting times for children. Opportunities included working with community partners, whereas fear of losing professional autonomy with the new service model was a threat. The SWOT results helped the planning committee redefine the programme goals and make decisions to improve service coordination. SWOT analysis was deemed as a very useful tool to help guide service reorganisation. SWOT analysis appears to be an interesting evaluation tool to promote awareness among service providers regarding the current functioning of a rehabilitation programme. It fosters their active participation in the reorganisation of a new service delivery model for pediatric rehabilitation.

  16. Partnership capacity for community health improvement plan implementation: findings from a social network analysis.

    Science.gov (United States)

    McCullough, J Mac; Eisen-Cohen, Eileen; Salas, S Bianca

    2016-07-13

    Many health departments collaborate with community organizations on community health improvement processes. While a number of resources exist to plan and implement a community health improvement plan (CHIP), little empirical evidence exists on how to leverage and expand partnerships when implementing a CHIP. The purpose of this study was to identify characteristics of the network involved in implementing the CHIP in one large community. The aims of this analysis are to: 1) identify essential network partners (and thereby highlight potential network gaps), 2) gauge current levels of partner involvement, 3) understand and effectively leverage network resources, and 4) enable a data-driven approach for future collaborative network improvements. We collected primary data via survey from n = 41 organizations involved in the Health Improvement Partnership of Maricopa County (HIPMC), in Arizona. Using the previously validated Program to Analyze, Record, and Track Networks to Enhance Relationships (PARTNER) tool, organizations provided information on existing ties with other coalition members, including frequency and depth of partnership and eight categories of perceived value/trust of each current partner organization. The coalition's overall network had a density score of 30 %, degree centralization score of 73 %, and trust score of 81 %. Network maps are presented to identify existing relationships between HIPMC members according to partnership frequency and intensity, duration of involvement in the coalition, and self-reported contributions to the coalition. Overall, number of ties and other partnership measures were positively correlated with an organization's perceived value and trustworthiness as rated by other coalition members. Our study presents a novel use of social network analysis methods to evaluate the coalition of organizations involved in implementing a CHIP in an urban community. The large coalition had relatively low network density but high

  17. Partnership capacity for community health improvement plan implementation: findings from a social network analysis

    Directory of Open Access Journals (Sweden)

    J. Mac McCullough

    2016-07-01

    Full Text Available Abstract Background Many health departments collaborate with community organizations on community health improvement processes. While a number of resources exist to plan and implement a community health improvement plan (CHIP), little empirical evidence exists on how to leverage and expand partnerships when implementing a CHIP. The purpose of this study was to identify characteristics of the network involved in implementing the CHIP in one large community. The aims of this analysis are to: 1) identify essential network partners (and thereby highlight potential network gaps), 2) gauge current levels of partner involvement, 3) understand and effectively leverage network resources, and 4) enable a data-driven approach for future collaborative network improvements. Methods We collected primary data via survey from n = 41 organizations involved in the Health Improvement Partnership of Maricopa County (HIPMC) in Arizona. Using the previously validated Program to Analyze, Record, and Track Networks to Enhance Relationships (PARTNER) tool, organizations provided information on existing ties with other coalition members, including the frequency and depth of each partnership and eight categories of perceived value/trust of each current partner organization. Results The coalition's overall network had a density score of 30 %, a degree centralization score of 73 %, and a trust score of 81 %. Network maps are presented to identify existing relationships between HIPMC members according to partnership frequency and intensity, duration of involvement in the coalition, and self-reported contributions to the coalition. Overall, the number of ties and other partnership measures were positively correlated with an organization's perceived value and trustworthiness as rated by other coalition members. Conclusions Our study presents a novel use of social network analysis methods to evaluate the coalition of organizations involved in implementing a CHIP in an urban community. The
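    The network statistics reported above have simple closed forms for an undirected network. A minimal sketch (density and Freeman degree centralization; the study itself used the PARTNER tool, not this code):

    ```python
    def density(n, edges):
        """Fraction of possible undirected ties that are present: 2m / (n(n-1))."""
        return 2 * len(edges) / (n * (n - 1))

    def degree_centralization(n, edges):
        """Freeman degree centralization: how concentrated ties are on one node,
        normalized so a star graph scores 1 and a regular graph scores 0."""
        deg = [0] * n
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        dmax = max(deg)
        return sum(dmax - d for d in deg) / ((n - 1) * (n - 2))
    ```

    A coalition with low density but high centralization, like the one described, has relatively few ties overall, concentrated on a small number of hub organizations.
    
    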

  18. An Improved, Automated Whole-Air Sampler and VOC Analysis System: Results from SONGNEX 2015

    Science.gov (United States)

    Lerner, B. M.; Gilman, J.; Tokarek, T. W.; Peischl, J.; Koss, A.; Yuan, B.; Warneke, C.; Isaacman-VanWertz, G. A.; Sueper, D.; De Gouw, J. A.; Aikin, K. C.

    2015-12-01

    Accurate measurement of volatile organic compounds (VOCs) in the troposphere is critical for the understanding of emissions and of the physical and chemical processes that can impact both air quality and climate. Airborne VOC measurements have proven challenging due to the requirements of short sample collection times (≤10 s), to maximize spatial resolution and sampling frequency, and of high sensitivity (pptv) to chemically diverse hydrocarbons, halocarbons, and oxygen- and nitrogen-containing VOCs. NOAA ESRL CSD has built an improved whole air sampler (iWAS), which collects compressed ambient air samples in electropolished stainless steel canisters, based on the NCAR HAIS Advanced Whole Air Sampler [Atlas and Blake]. Post-flight chemical analysis is performed with a custom-built gas chromatograph-mass spectrometer system that pre-concentrates analyte cryostatically via a Stirling cooler, an electromechanical chiller that precludes the need for liquid nitrogen to reach trapping temperatures. For the 2015 Shale Oil and Natural Gas Nexus Study (SONGNEX), CSD conducted iWAS measurements on 19 flights aboard the NOAA WP-3D aircraft between March 19th and April 27th. Nine oil and natural gas production regions were surveyed during SONGNEX, and more than 1500 air samples were collected and analyzed. For the first time, we employed real-time mapping of sample collection combined with live data from fast time-response measurements (e.g. ethane) for more uniform surveying and improved target plume sampling. Automated sample handling allowed more than 90% of iWAS canisters to be analyzed within 96 hours of collection; for the second half of the campaign, improved efficiencies reduced the median sample age at analysis to 36 hours. A new chromatography peak-fitting software package was developed to reduce data reduction time by an order of magnitude without loss of precision or accuracy. Here we report mixing ratios for aliphatic and aromatic hydrocarbons (C2-C8) along with select

  19. Improvement of microtome cutting process of carbon nanotube composite sample preparation for TEM analysis

    Science.gov (United States)

    Trayner, Sarah

    As research progresses towards nanoscale materials, there has become a need for a more efficient and effective way to obtain ultra-thin samples for imaging under the transmission electron microscope (TEM) for atomic resolution analysis. There are various methods used to obtain thin samples. This research is a continuous effort to study and improve the ultra-microtome cutting technique in order to provide an effective and reliable approach for obtaining an ultra-thin (25-50 nm) cross section of a CNT/polymer composite for high resolution TEM analysis. Improvements were achieved by studying the relationships between the chosen cutting parameters, sample characteristics and TEM image quality. From this information, a cutting protocol was established so that ultra-thin sample slices can be achieved by different microtome operators for high resolution TEM analysis. In addition, a custom tool was created to aid in the sample collection process. In this research, three composite samples were studied for both microtome cutting and TEM analysis: 1) unidirectional (UD) IM7/BMI composite; 2) single-layer CNT buckypaper (BP)/epoxy nanocomposite; 3) 3-layer CNT BP/BMI nanocomposite. The resultant TEM images revealed a clear microstructure consisting of amorphous resin and graphitic crystalline packing. The UD IM7/BMI composite TEM results did not reveal an interfacial region, indicating a need for even thinner sliced cross sections. TEM results for the single-layer CNT BP/epoxy nanocomposite revealed the alignment direction of the nanotubes and numerous stacks of CNT bundles. In addition, there was visible flattening of CNT packing into dumbbell shapes, similar to results obtained by Alan Windle. TEM results for the 3-layer CNT BP/BMI nanocomposite revealed uniformly cut resin. However, when the diamond knife reached graphite crystalline regions, the nanotubes either became deformed into a cone-like structure, were cut at a greater thickness than the resin, or folded over onto themselves. This is most likely

  20. Term AnalysisImproving the Quality of Learning and Application Documents in Engineering Design

    Directory of Open Access Journals (Sweden)

    S. Weiss

    2006-01-01

    Full Text Available Conceptual homogeneity is one determinant of the quality of text documents. A concept remains the same even if the words used (termini) change [1, 2]. In other words, termini can vary while the concept retains the same meaning. Human beings are able to handle concepts and termini because of their semantic network, which connects termini to the actual context and thus identifies the adequate meaning of each terminus. Problems can arise when humans have to learn new content and, correspondingly, new concepts. Since the content is essentially imparted by text via particular termini, it is a challenge to establish the right concept from the text with those termini. A term might be known, but have a different meaning [3, 4]. Therefore, it is very important to build up the correct understanding of concepts within a text. This is only possible when concepts are explained by the right termini, within an adequate context, and, above all, homogeneously. So, when setting up or using text documents for teaching or application, it is essential to provide conceptual homogeneity. Understandably, the quality of documents is, ceteris paribus, inversely proportional to variations of termini. Therefore, an analysis of variations of termini can form a basis for the specific improvement of conceptual homogeneity. Consequently, an exposition of variations of termini as control and improvement parameters is carried out in this investigation. This paper describes the functionality and the benefit of a tool called TermAnalysis. TermAnalysis is a software tool developed

  1. Thermodynamic analysis questions claims of improved cardiac efficiency by dietary fish oil

    Science.gov (United States)

    Goo, Eden; Chapman, Brian; Hickey, Anthony J.R.

    2016-01-01

    Studies in the literature describe the ability of dietary supplementation by omega-3 fish oil to increase the pumping efficiency of the left ventricle. Here we attempt to reconcile such studies with our own null results. We undertake a quantitative analysis of the improvement that could be expected theoretically, subject to physiological constraints, by posing the following question: By how much could efficiency be expected to increase if inefficiencies could be eliminated? Our approach utilizes thermodynamic analyses to investigate the contributions, both singly and collectively, of the major components of cardiac energetics to total cardiac efficiency. We conclude that it is unlikely that fish oils could achieve the required diminution of inefficiencies without greatly compromising cardiac performance. PMID:27574288

  2. Use of video-feedback, reflection, and interactive analysis to improve nurse leadership practices.

    Science.gov (United States)

    Crenshaw, Jeannette T

    2012-01-01

    The chronic shortage of registered nurses (RNs) affects patient safety and health care quality. Many factors affect the RN shortage in the workforce, including negative work environments, exacerbated by ineffective leadership approaches. Improvements in the use of relationship-based leadership approaches lead to healthier work environments that foster RN satisfaction and reduce RN turnover and vacancy rates in acute care settings. In this article, an innovative approach to reduce nurse turnover and decrease vacancy rates in acute care settings is described. Video feedback with reflection and interactive analysis is an untapped resource for nurse leaders and aspiring nurse leaders in their development of effective leadership skills. This unique method may be an effective leadership strategy for addressing recruitment and retention issues in a diverse workforce.

  3. Cross-sectional atom probe tomography sample preparation for improved analysis of fins on SOI.

    Science.gov (United States)

    Martin, Andrew J; Weng, Weihao; Zhu, Zhengmao; Loesing, Rainer; Shaffer, James; Katnani, Ahmad

    2016-02-01

    Sample preparation for atom probe tomography of 3D semiconductor devices has proven to significantly affect field evaporation and the reliability of reconstructed data. A cross-sectional preparation method is applied to state-of-the-art Si finFET technology on SOI. This preparation approach advantageously provides a conductive path for voltage and heat, offers analysis of many fins within a single tip, and improves resolution across interfaces of particular interest. Measured B and Ge profiles exhibit good correlation with SIMS and EDX and show no signs of B clustering or pile-up near the Si/SiGe interface of the fin. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Improved analysis of bacterial CGH data beyond the log-ratio paradigm

    Directory of Open Access Journals (Sweden)

    Aakra Ågot

    2009-03-01

    Background: Existing methods for analyzing bacterial CGH data from two-color arrays are based on log-ratios only, a paradigm inherited from expression studies. We propose an alternative approach, in which the microarray signals are used differently and sequence identity is predicted using a supervised learning approach. Results: A data set containing 32 hybridizations of sequenced versus sequenced genomes has been used to test and compare methods. A ROC analysis has been performed to illustrate the ability to rank probes with respect to Present/Absent calls, and classification into Present and Absent is compared with that of a Gaussian mixture model. Conclusion: The results indicate that the proposed method improves on existing methods with respect to both ranking and classification of probes, especially for multi-genome arrays.
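The ROC-based probe ranking described above can be sketched in a few lines; the signal values and Present/Absent labels below are hypothetical, not data from the study:

```python
import numpy as np

def roc_auc(scores, labels):
    """Rank-based AUC: probability a Present probe scores above an Absent one."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    # Count concordant Present/Absent pairs; ties count one half.
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Hypothetical probe signals: higher signal should mean "Present".
signals = [0.9, 0.8, 0.75, 0.4, 0.35, 0.1]
present = [1, 1, 0, 1, 0, 0]
print(round(roc_auc(signals, present), 3))
```

An AUC of 1.0 would mean the signal ranks every Present probe above every Absent one.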

  5. Analysis of the Efficacy of an Intervention to Improve Parent-Adolescent Problem Solving.

    Science.gov (United States)

    Semeniuk, Yulia Yuriyivna; Brown, Roger L; Riesch, Susan K

    2016-07-01

    We conducted a two-group longitudinal partially nested randomized controlled trial to examine whether young adolescent youth-parent dyads participating in Mission Possible: Parents and Kids Who Listen, in contrast to a comparison group, would demonstrate improved problem-solving skill. The intervention is based on the Circumplex Model and Social Problem-Solving Theory. The Circumplex Model posits that families who are balanced, that is characterized by high cohesion and flexibility and open communication, function best. Social Problem-Solving Theory informs the process and skills of problem solving. The Conditional Latent Growth Modeling analysis revealed no statistically significant differences in problem solving among the final sample of 127 dyads in the intervention and comparison groups. Analyses of effect sizes indicated large magnitude group effects for selected scales for youth and dyads portraying a potential for efficacy and identifying for whom the intervention may be efficacious if study limitations and lessons learned were addressed. © The Author(s) 2016.

  6. Analysis of the characteristics of taxi services as a prerequisite for their improvement

    Directory of Open Access Journals (Sweden)

    Vujić Nenad

    2014-01-01

    The expansion of the services sector is characteristic of modern, developed societies and influences the national economy, so the analysis of services, as a concept and as a part of marketing, is highly significant. In this sense, the paper investigates a particular service: taxi services in the capital of Serbia. Through this research, the authors try to define the groups of customers of taxi services and their preferences and attitudes. The research was performed from May to July 2014, by direct contact with customers of taxi services. The results confirmed the initial hypothesis and provide further insight into the way taxi services are used and the general circumstances that characterize them in the region studied. On this basis, proposals are made for improving taxi services and reaching target groups more easily.

  7. Analysis from reviews in Social Media to improve hotel´s online reputation

    Directory of Open Access Journals (Sweden)

    Daissy Hatblathy Moya Sánchez

    2017-07-01

    Today, hoteliers struggle to manage their online reputation because of bad reviews received on social networks. The aim of this research is to identify the key factors to consider in the operation of each hotel in order to avoid negative comments and increase online reputation. The ratings received through virtual channels by 57 Latin American hotels belonging to the GHL Hotel Chain from March 31st, 2015 to March 31st, 2016 were collected; using the software Revinate, the reviews were analyzed by department and then classified to develop a manual of good practices. From the analysis of those comments, recommendations were made for six areas of the hotels (Rooms, Food and Beverage, Front Desk, Business Center, Security, and Management) to optimize quality and thus improve online reputation.

  8. Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue

    Science.gov (United States)

    Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.

    1989-01-01

    The high performance liquid chromatographic (HPLC) method of Flores and Galston (1982 Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. In addition, two polyamines, putrescine and diaminopropane, are not well resolved by it. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.

  9. Improved Label-Free LC-MS Analysis by Wavelet-Based Noise Rejection

    Directory of Open Access Journals (Sweden)

    Salvatore Cappadona

    2010-01-01

    Label-free LC-MS analysis allows the differential expression levels of proteins in multiple samples to be determined without the use of stable isotopes. The technique is based on the direct comparison of multiple runs obtained by continuous detection in MS mode: only differentially expressed peptides are selected for further fragmentation, thus avoiding the bias toward abundant peptides typical of data-dependent tandem MS. The computational framework includes detection, alignment, normalization and matching of peaks across multiple sets, and several software packages are available to address these processing steps. Yet more care should be taken to improve the quality of the LC-MS maps entering the pipeline, as this severely affects the results of all downstream analyses. In this paper we show how including a preprocessing step of background subtraction in a common laboratory pipeline can lead to an enhanced inclusion list of peptides selected for fragmentation and, consequently, to better protein identification.
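The paper's preprocessing is a wavelet-based background rejection; as a rough illustration of the principle only (not the authors' implementation), a single-level Haar transform with hard thresholding of the detail band looks like this:

```python
import numpy as np

def haar_denoise(x, threshold):
    """Single-level Haar DWT, hard-threshold the detail band, inverse transform."""
    x = np.asarray(x, dtype=float)
    assert len(x) % 2 == 0, "even-length signal assumed"
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-pass) coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-pass) coefficients
    d[np.abs(d) < threshold] = 0.0         # reject small, noise-like details
    even = (a + d) / np.sqrt(2)            # inverse Haar transform
    odd = (a - d) / np.sqrt(2)
    out = np.empty_like(x)
    out[0::2], out[1::2] = even, odd
    return out

# A smooth ramp with one small noisy pair: its detail coefficient is removed.
noisy = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.1, 4.0, 4.0])
clean = haar_denoise(noisy, threshold=0.5)
```

With the threshold set to zero the transform reconstructs the input exactly; raising it trades fidelity for noise rejection, the same balance real wavelet denoisers tune.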

  10. Environmental impact assessment in Colombia: Critical analysis and proposals for improvement

    International Nuclear Information System (INIS)

    Toro, Javier; Requena, Ignacio; Zamorano, Montserrat

    2010-01-01

    The evaluation of Environmental Impact Assessment (EIA) systems is a highly recommended strategy for enhancing their effectiveness and quality. This paper describes an evaluation of EIA in Colombia, using the model and the control mechanisms proposed and applied in other countries by Christopher Wood and Ortolano. The evaluation criteria used are based on Principles of Environmental Impact Assessment Best Practice, such as effectiveness and control features, and they were contrasted with the opinions of a panel of Colombian EIA experts as a means of validating the results of the study. The results showed that EIA regulation in Colombia is ineffective because of its limited scope, inadequate administrative support and the absence of effective control mechanisms and public participation. This analysis resulted in a series of recommendations regarding the further development of the EIA system in Colombia with a view to improving its quality and effectiveness.

  11. A critical analysis of energy efficiency improvement potentials in Taiwan's cement industry

    International Nuclear Information System (INIS)

    Huang, Yun-Hsun; Chang, Yi-Lin; Fleiter, Tobias

    2016-01-01

    The cement industry is the second most energy-intensive sector in Taiwan, which underlines the need to understand its potential for energy efficiency improvement. A bottom-up model-based assessment is utilized to conduct a scenario analysis of energy saving opportunities up to the year 2035. The analysis is supported by detailed expert interviews in all cement plants of Taiwan. The simulation results reveal that by 2035, eighteen energy efficient technologies could result in 25% savings for electricity and 9% savings for fuels under the technical diffusion scenario. This potential totals about 5000 TJ/year, of which 91% can be implemented cost-effectively assuming a discount rate of 10%. Policy makers should support a fast diffusion of these technologies. Additionally, policy makers can tap further saving potentials: first, by decreasing the clinker share, which is currently regulated to a minimum of 95%; second, by relaxing the prohibition on building new cement plants to allow replacement of existing capacity with innovative new plants in the coming years; and third, by supporting the use of alternative fuels, which is currently still a niche in Taiwan. - Highlights: •We analyze energy efficiency improvement potentials in Taiwan's cement industry. •Eighteen process-specific technologies are analyzed using a bottom-up model. •Our model systematically reflects the diffusion of technologies over time. •We find energy-saving potentials of 25% for electricity and 9% for fuels in 2035. •91% of the energy-saving potentials can be realized cost-effectively.
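The "cost-effective assuming a discount rate of 10%" criterion is in essence a net-present-value test applied to each efficiency measure; a minimal sketch with entirely hypothetical figures:

```python
def npv(investment, annual_saving, lifetime_years, discount_rate):
    """NPV of an efficiency measure: discounted annual savings minus upfront cost."""
    discounted = sum(annual_saving / (1 + discount_rate) ** t
                     for t in range(1, lifetime_years + 1))
    return discounted - investment

# Hypothetical measure: 500k upfront, 120k/year saved over a 10-year lifetime.
value = npv(500_000, 120_000, 10, 0.10)
print(value > 0)  # cost-effective at a 10% discount rate if NPV is positive
```

The 91% figure in the abstract corresponds to the share of the saving potential whose measures pass this kind of test.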

  12. Improving time-frequency domain sleep EEG classification via singular spectrum analysis.

    Science.gov (United States)

    Mahvash Mohammadi, Sara; Kouchaki, Samaneh; Ghavami, Mohammad; Sanei, Saeid

    2016-11-01

    Manual sleep scoring is tedious and time-consuming, and even automatic methods based on time-frequency (T-F) representations leave room for improvement. To optimise the efficiency of T-F domain analysis of sleep electroencephalography (EEG), a novel approach for automatically identifying brain waves, sleep spindles, and K-complexes from sleep EEG signals is proposed. The method is based on singular spectrum analysis (SSA). The single-channel EEG signal (C3-A2) is initially decomposed and the desired components are then automatically separated; in addition, the noise is removed to enhance the discriminative power of the features. The T-F features obtained after the preprocessing stage are classified using multi-class support vector machines (SVMs) and used for the identification of four sleep stages over three sleep types. Furthermore, to emphasise the usefulness of the proposed method, the automatically determined spindles are parameterised to discriminate the three sleep types. The four sleep stages are classified through SVM twice: with and without the preprocessing stage. The mean accuracy, sensitivity, and specificity before the preprocessing stage are 71.5±0.11%, 56.1±0.09% and 86.8±0.04% respectively; these values increase significantly to 83.6±0.07%, 70.6±0.14% and 90.8±0.03% after applying SSA. The new T-F representation has been compared with existing benchmarks, and the results show that the proposed method outperforms previous methods in the identification and representation of sleep stages, both in classification rate and in the resulting T-F domain. Copyright © 2016 Elsevier B.V. All rights reserved.
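The core SSA step, embedding the signal in a trajectory matrix, taking an SVD, and diagonal-averaging selected components back into time series, can be sketched as follows (a generic toy example, not the paper's EEG pipeline):

```python
import numpy as np

def ssa_components(x, window, rank):
    """Decompose x into `rank` SSA components via SVD of the trajectory matrix."""
    x = np.asarray(x, dtype=float)
    n, k = len(x), len(x) - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the signal as columns.
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for r in range(rank):
        elem = s[r] * np.outer(u[:, r], vt[r])  # rank-1 elementary matrix
        # Diagonal averaging (Hankelisation) back to a 1-D series.
        comp = np.array([np.mean(elem[::-1].diagonal(i - window + 1))
                         for i in range(n)])
        comps.append(comp)
    return comps

t = np.linspace(0, 1, 200)
signal = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.default_rng(0).standard_normal(200)
parts = ssa_components(signal, window=40, rank=2)
denoised = sum(parts)  # the leading component pair carries the oscillation
```

Keeping only the leading components separates the dominant oscillation from noise, which is the separation the paper exploits before feature extraction.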

  13. Observations of Tunable Resistive Pulse Sensing for Exosome Analysis: Improving System Sensitivity and Stability.

    Science.gov (United States)

    Anderson, Will; Lane, Rebecca; Korbie, Darren; Trau, Matt

    2015-06-16

    Size distribution and concentration measurements of exosomes are essential when investigating their cellular function and uptake. Recently, a particle size distribution and concentration measurement platform known as tunable resistive pulse sensing (TRPS) has seen increased use for the characterization of exosome samples. TRPS measures the brief increase in electrical resistance (a resistive pulse) produced by individual submicrometer/nanoscale particles as they translocate through a size-tunable submicrometer/micrometer-sized pore, embedded in an elastic membrane. Unfortunately, TRPS measurements are susceptible to issues surrounding system stability, where the pore can become blocked by particles, and sensitivity issues, where particles are too small to be detected against the background noise of the system. Herein, we provide a comprehensive analysis of the parameters involved in TRPS exosome measurements and demonstrate the ability to improve system sensitivity and stability by the optimization of system parameters. We also provide the first analysis of system noise, sensitivity cutoff limits, and accuracy with respect to exosome measurements and offer an explicit definition of system sensitivity that indicates the smallest particle diameter that can be detected within the noise of the trans-membrane current. A comparison of exosome size measurements from both TRPS and cryo-electron microscopy is also provided, finding that a significant number of smaller exosomes fell below the detection limit of the TRPS platform and offering one potential insight as to why there is such large variability in the exosome size distribution reported in the literature. We believe the observations reported here may assist others in improving TRPS measurements for exosome samples and other submicrometer biological and nonbiological particles.

  14. Analysis of an Online Match Discussion Board: Improving the Otolaryngology—Head and Neck Surgery Match

    Science.gov (United States)

    Kozin, Elliott D.; Sethi, Rosh; Lehmann, Ashton; Remenschneider, Aaron K.; Golub, Justin S.; Reyes, Samuel A.; Emerick, Kevin; Lee, Daniel J.; Gray, Stacey T.

    2015-01-01

    Introduction “The Match” has become the accepted selection process for graduate medical education. Otomatch.com has provided an online forum for Otolaryngology-Head and Neck Surgery (OHNS) Match-related questions for over a decade. Herein, we aim to 1) delineate the types of posts on Otomatch to better understand the perspective of medical students applying for residency and 2) provide recommendations to potentially improve the Match process. Methods Discussion forum posts on Otomatch between December 2001 and April 2014 were reviewed. The title of each thread and total number of views were recorded for quantitative analysis. Each thread was assigned to one of six major categories and one of eighteen subcategories, based on chronology within the application cycle and topic. National Resident Matching Program (NRMP) data were utilized for comparison. Results We identified 1,921 threads corresponding to over 2 million page views. Over 40% of threads related to questions about specific programs, and 27% were discussions about interviews. Views, a surrogate measure for popularity, reflected different trends: the majority of individuals viewed posts on interviews (42%), program-specific questions (20%) and how to rank programs (11%). The increase in viewership tracked the rise in applicant numbers reported in NRMP data. Conclusions Our study provides an in-depth analysis of a popular discussion forum for medical students interested in the OHNS Match. The most viewed posts concern interview dates and questions about specific programs. We provide suggestions to address unmet needs for medical students and potentially improve the Match process. PMID:25550223

  15. Recommendations to improve imaging and analysis of brain lesion load and atrophy in longitudinal studies of multiple sclerosis

    NARCIS (Netherlands)

    Vrenken, H.; Jenkinson, M.; Horsfield, M.A.; Battaglini, M.; van Schijndel, R.A.; Rostrup, E.; Geurts, J.J.G.; Fisher, E.; Zijdenbos, A.; Ashburner, J.; Miller, D. H.; Filippi, M.; Fazekas, F.; Rovaris, M.; Rovira, A.; Barkhof, F.; De Stefano, N.

    2013-01-01

    Focal lesions and brain atrophy are the most extensively studied aspects of multiple sclerosis (MS), but the image acquisition and analysis techniques used can be further improved, especially those for studying within-patient changes of lesion load and atrophy longitudinally. Improved accuracy and

  16. How to Use Value-Added Analysis to Improve Student Learning: A Field Guide for School and District Leaders

    Science.gov (United States)

    Kennedy, Kate; Peters, Mary; Thomas, Mike

    2012-01-01

    Value-added analysis is the most robust, statistically significant method available for helping educators quantify student progress over time. This powerful tool also reveals tangible strategies for improving instruction. Built around the work of Battelle for Kids, this book provides a field-tested continuous improvement model for using…

  17. Failure mode and effect analysis: improving intensive care unit risk management processes.

    Science.gov (United States)

    Askari, Roohollah; Shafii, Milad; Rafiei, Sima; Abolhassani, Mohammad Sadegh; Salarikhah, Elaheh

    2017-04-18

    Purpose Failure modes and effects analysis (FMEA) is a practical tool to evaluate risks, discover failures in a proactive manner and propose corrective actions to reduce or eliminate potential risks. The purpose of this paper is to apply the FMEA technique to examine the hazards associated with the process of service delivery in the intensive care unit (ICU) of a tertiary hospital in Yazd, Iran. Design/methodology/approach This was a before-after study conducted between March 2013 and December 2014. A FMEA team was formed, and all potential hazards associated with ICU services - their frequency and severity - were identified. A risk priority number was then calculated for each activity as an indicator of high-priority areas needing special attention and resource allocation. Findings Eight failure modes with the highest priority scores, including endotracheal tube defect, wrong placement of endotracheal tube, EVD interface, aspiration failure during suctioning, chest tube failure, tissue injury and deep vein thrombosis, were selected for improvement. The findings affirmed that the improvement strategies were generally satisfactory and significantly decreased total failures. Practical implications Application of FMEA in ICUs proved effective in proactively decreasing the risk of failures and brought the control measures up to acceptable levels in all eight areas of function. Originality/value Using a prospective risk assessment approach such as FMEA can be beneficial in dealing with potential failures by proposing preventive actions in a proactive manner. The method can serve as a tool for continuous quality improvement in healthcare, as it identifies both systemic and human errors and offers practical advice for dealing effectively with them.
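In standard FMEA, the risk priority number is the product of severity, occurrence and detection scores, and the highest-RPN failure modes are addressed first; a minimal sketch with hypothetical ICU scores (not the study's actual ratings):

```python
# Each failure mode scored 1-10 for severity (S), occurrence (O), detection (D);
# RPN = S * O * D ranks where to act first. All scores below are made up.
failure_modes = {
    "endotracheal tube defect":     (9, 4, 6),
    "wrong tube placement":         (10, 3, 5),
    "aspiration during suctioning": (8, 5, 4),
    "deep vein thrombosis":         (7, 4, 3),
}

ranked = sorted(failure_modes.items(),
                key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                reverse=True)
for name, (s, o, d) in ranked:
    print(f"{name}: RPN={s * o * d}")
```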

  18. Instrumented gait analysis: a measure of gait improvement by a wheeled walker in hospitalized geriatric patients.

    Science.gov (United States)

    Schülein, Samuel; Barth, Jens; Rampp, Alexander; Rupprecht, Roland; Eskofier, Björn M; Winkler, Jürgen; Gaßmann, Karl-Günter; Klucken, Jochen

    2017-02-27

    In an increasingly aging society, reduced mobility is one of the most important factors limiting activities of daily living and overall quality of life. The ability to walk independently contributes to mobility, but is increasingly restricted by numerous diseases that impair gait and balance. The aim of this cross-sectional observational study was to examine whether spatio-temporal gait parameters derived from mobile instrumented gait analysis can be used to measure the gait-stabilizing effects of a wheeled walker (WW), and whether these gait parameters may serve as surrogate markers in hospitalized patients with multifactorial gait and balance impairment. One hundred six patients (ages 68-95) wearing inertial-sensor-equipped shoes passed an instrumented walkway with and without gait support from a WW. The walkway assessed the fall-risk-associated gait parameters velocity, swing time, stride length, stride time variability and double support time variability; the inertial-sensor-equipped shoes measured heel strike and toe off angles and foot clearance. Use of a WW improved the spatio-temporal parameters velocity, swing time and stride length, and the sagittal-plane parameters heel strike and toe off angles, in all patients. First-time users (FTUs) showed gait parameter improvement patterns similar to those of frequent WW users (FUs); however, FUs with higher levels of gait impairment improved more in velocity, stride length and toe off angle than FTUs. The impact of a WW can thus be quantified objectively by instrumented gait assessment, and objective gait parameters may serve as surrogate markers for the use of walking aids in patients with gait and balance impairments.
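Stride-time variability of the kind reported by such walkways is commonly expressed as a coefficient of variation over consecutive strides; a sketch with hypothetical stride times (not the study's data):

```python
import statistics

def stride_time_cv(stride_times_s):
    """Stride-time variability as coefficient of variation (%) over one walk."""
    mean = statistics.fmean(stride_times_s)
    sd = statistics.stdev(stride_times_s)
    return 100.0 * sd / mean

# Hypothetical strides without and with a wheeled walker: lower CV = steadier gait.
free_walk = [1.10, 1.25, 1.02, 1.30, 1.08]
with_ww   = [1.12, 1.15, 1.10, 1.14, 1.11]
print(stride_time_cv(free_walk) > stride_time_cv(with_ww))  # True
```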

  19. [Security of hospital infusion practices: From an a priori risk analysis to an improvement action plan].

    Science.gov (United States)

    Pignard, J; Cosserant, S; Traore, O; Souweine, B; Sautou, V

    2016-03-01

    Infusion in care units, and all the more in intensive care units, is a complex process that can expose the patient to many risks. As part of an institutional approach to improving the quality and safety of patient care, a risk mapping of infusion practices was performed. The analysis focused on intravenous infusion situations in adults; the a priori risk assessment methodology was applied and a multidisciplinary working group established. Forty-three risks were identified across the infusion process (prescription, preparation and administration). The assessment of these risks and of the existing means of control showed that 48% of them would have a highly critical impact on patient safety. Recommendations were developed for the 20 risks considered most critical, to limit their occurrence and severity and to improve their level of control. An institutional action plan was developed and validated by the Drug and Sterile Medical Devices Commission. This mapping provided an exhaustive inventory of the potential risks associated with infusion. Following this work, multidisciplinary groups were set up to work on different themes, with regular quarterly meetings established to follow the progress of the various projects. Risk mapping will also be performed in pediatric and oncology units, where the risks associated with the handling of toxic products are omnipresent. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  20. Improvements and validation of the transient analysis code MOREL for molten salt reactors

    International Nuclear Information System (INIS)

    Zhuang Kun; Zheng Youqi; Cao Liangzhi; Hu Tianliang; Wu Hongchun

    2017-01-01

    The liquid fuel salt used in molten salt reactors (MSRs) serves as fuel and coolant simultaneously. On the one hand, the delayed neutron precursors circulate in the whole primary loop and part of them decay outside the core; on the other hand, the fission heat is carried off directly by the fuel flow. These two features require a new analysis method coupling fluid flow, heat transfer and neutronics. In this paper, a recent update of the MOREL code is presented. The update includes: (1) an improved quasi-static method for the kinetics equation with convection term; (2) a multi-channel thermal-hydraulic model developed for the geometric features of MSRs; and (3) the Variational Nodal Method, used to solve the neutron diffusion equation in place of the original analytic-basis-function-expansion nodal method. The update brings a significant improvement in the efficiency of the MOREL code and extends its capability to real core simulation with feedback. Numerical results and experimental data from the Molten Salt Reactor Experiment (MSRE) are used to verify and validate the updated code. The results agree well with the experimental data, which shows that the new development of the MOREL code is correct and effective. (author)

  1. Use of Selection Indices Based on Multivariate Analysis for Improving Grain Yield in Rice

    Directory of Open Access Journals (Sweden)

    Hossein SABOURI

    2008-12-01

    In order to study selection indices for improving rice grain yield, a cross was made in 2006 between a traditional Iranian rice (Oryza sativa L.) variety, Tarommahalli, and an improved indica rice variety, Khazar. The traits of the parents (30 plants), the F1 (30 plants) and the F2 generation (492 individuals) were evaluated at the Rice Research Institute of Iran (RRII) during 2007. Heritabilities of the number of panicles per plant, plant height, days to heading and panicle exsertion were greater than that of grain yield. The selection indices were developed using the results of multivariate analysis. To evaluate selection strategies for maximizing grain yield, 14 selection indices were calculated based on two methods (optimum and base) and combinations of 12 traits with various economic weights. The results showed that selection for grain weight, number of panicles per plant and panicle length, using their phenotypic and/or genotypic direct effects (path coefficients) as economic weights, should serve as an effective selection criterion under either the optimum or the base index.
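An optimum (Smith-Hazel) selection index of the kind described weights the traits as b = P^-1 G a, where P and G are the phenotypic and genotypic (co)variance matrices and a the vector of economic weights; a toy sketch with made-up matrices (purely illustrative, not the study's estimates):

```python
import numpy as np

# Hypothetical 3-trait example: grain weight, panicles per plant, panicle length.
P = np.array([[4.0, 1.0, 0.5],    # phenotypic (co)variance matrix
              [1.0, 3.0, 0.4],
              [0.5, 0.4, 2.0]])
G = np.array([[2.5, 0.8, 0.3],    # genotypic (co)variance matrix
              [0.8, 1.8, 0.2],
              [0.3, 0.2, 1.1]])
a = np.array([1.0, 0.6, 0.4])     # economic weights (e.g. path coefficients)

b = np.linalg.solve(P, G @ a)     # optimum index coefficients: b = P^-1 G a

def index_score(phenotypes):
    """Score a candidate plant's trait vector; select the highest-scoring plants."""
    return phenotypes @ b
```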

  2. Systematic review and meta-analysis of behavioral interventions to improve child pedestrian safety.

    Science.gov (United States)

    Schwebel, David C; Barton, Benjamin K; Shen, Jiabin; Wells, Hayley L; Bogar, Ashley; Heath, Gretchen; McCullough, David

    2014-09-01

    Pedestrian injuries represent a pediatric public health challenge. This systematic review/meta-analysis evaluated behavioral interventions to teach children pedestrian safety. Multiple search strategies identified eligible manuscripts (randomized design, published before April 1, 2013, evaluating behavioral child pedestrian safety interventions). Screening of 1,951 abstracts yielded 125 full-text retrievals; 25 articles were retained for data extraction, and 6 were later omitted due to insufficient data. In all, 19 articles reporting 25 studies were included. Risk of bias and quality of evidence were assessed. Behavioral interventions generally improve children's pedestrian safety, both immediately after training and at follow-up several months later. Quality of the evidence was low to moderate. Available evidence suggested interventions targeting dash-out prevention, crossing at parked cars, and selecting safe routes across intersections were effective. Individualized/small-group training for children was the most effective training strategy based on available evidence. Behaviorally based interventions improve children's pedestrian safety; efforts should continue to develop creative, cost-efficient, and effective interventions. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. Improving Students’ Argumentation Style Ability in Writing Essay through Discourse Analysis Model Critical Thinking Map Oriented

    Directory of Open Access Journals (Sweden)

    R. Panca Pertiwi Hidayati

    2017-03-01

    Students’ ability to write an essay, one of the language skills through which creativity in language can develop, is a serious problem that deserves in-depth research. Besides showing that language is a thinking tool, writing competence can be seen as a product, from the perspective of measuring critical thinking ability, and as a process, from the perspective of individual development. One indicator for measuring students’ critical thinking is their ability to deliver an argumentation style in their essays. The goal of this research is to obtain an objective discourse analysis model, oriented by a critical thinking map, for improving students’ argumentation style skills in essay writing, which can then serve as a model for developing students’ creativity across a variety of writing skills. Based on a paired-samples t test, the pre-test/post-test difference is significant: the significance (2-tailed) is less than 0.05 at the 95% level. This means there is a real difference between ability before and after the treatment, with the sample performing better after the treatment than before. Compared with the control class, the experimental class shows significant superiority on all aspects of the essay writing assessment.

  4. Effectiveness of goal management training® in improving executive functions: A meta-analysis.

    Science.gov (United States)

    Stamenova, Vessela; Levine, Brian

    2018-03-14

    Our objective was to review the literature and quantitatively summarise the effectiveness of Goal Management Training® (GMT) (alone or in combination with other training approaches) in improving executive functions in adult populations. Ovid, Scopus, Web of Science, and ProQuest Dissertations & Theses Global were searched for articles citing "goal management training". Any group trials (n > 3) in adults that used multiple-session GMT programmes were included in the analyses. Outcome variables were extracted and classified into one of nine cognitive measures domains: executive functioning tasks, everyday executive functioning tasks, subjective executive tasks rated by the patient, subjective executive tasks rated by proxy, working memory, speed of processing, long-term memory, instrumental activities of daily living and general mental health status questionnaires. A total of 21 publications, containing 19 separate treatment group samples were included in the final analyses. Significantly positive small to moderate effect sizes were observed in all cognitive measure domains (except speed of processing) with effects maintained at follow-up assessments for all followed-up outcome measures, except for subjective ratings by patients and proxy. The analysis suggests that GMT is an effective intervention, leading to moderate improvements in executive functions that are usually maintained at follow-up.
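Pooling effect sizes across studies, as in this meta-analysis, conventionally uses inverse-variance weighting; a fixed-effect sketch with hypothetical study effects and variances (the review itself does not report these numbers):

```python
def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of study effect sizes."""
    weights = [1.0 / v for v in variances]                       # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5                             # standard error of pooled effect
    return pooled, se

# Hypothetical Hedges' g values from three studies with their variances.
g, se = pooled_effect([0.45, 0.30, 0.60], [0.04, 0.09, 0.16])
print(f"pooled g = {g:.3f} +/- {1.96 * se:.3f}")
```

A random-effects model, more common when studies are heterogeneous, adds a between-study variance component to each weight.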

  5. Improvement of the fringe analysis algorithm for wavelength scanning interferometry based on filter parameter optimization.

    Science.gov (United States)

    Zhang, Tao; Gao, Feng; Muhamedsalih, Hussam; Lou, Shan; Martin, Haydn; Jiang, Xiangqian

    2018-03-20

    The phase slope method, which estimates height from the fringe pattern frequency, and the phase method, which estimates height from the fringe phase, are the fringe analysis algorithms most widely used in interferometry. Both generally extract the phase information by filtering the signal in the frequency domain after a Fourier transform. Among the numerous papers on these algorithms, the design of the filter, which plays an important role, has never been discussed in detail. This paper focuses on the filter design in these algorithms for wavelength scanning interferometry (WSI), optimizing the parameters to acquire the best results. The spectral characteristics of the interference signal are analyzed first. The effective signal is found to be narrow-band (near single frequency), and the central frequency is calculated theoretically, which determines the position of the filter pass-band. The width of the filter window is then optimized by simulation to balance noise elimination against filter ringing. Experimental validation of the approach is provided, and the results agree very well with the simulation. The experiments show that accuracy can be improved by optimizing the filter design, especially when the signal quality, i.e., the signal-to-noise ratio (SNR), is low. The proposed method also shows the potential to improve immunity to environmental noise by adapting the filter to the signal once the signal SNR can be estimated accurately.
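
The frequency-domain filtering step the paper optimizes can be sketched as follows. The fringe frequency, noise level, and pass-band half-width here are invented for illustration, not the paper's values:

```python
# Sketch of narrow-band filtering of a near-single-frequency fringe signal.
# All parameters (k0, noise level, half_width) are illustrative assumptions.
import numpy as np

n = 1024
k0 = 50                       # assumed central fringe frequency, in FFT bins
t = np.arange(n) / n
rng = np.random.default_rng(0)
true_phase = 0.7
signal = np.cos(2 * np.pi * k0 * t + true_phase) + 0.5 * rng.standard_normal(n)

# Fourier transform, then keep only a narrow pass-band centred on k0.
spectrum = np.fft.rfft(signal)
half_width = 5                # pass-band half-width: wider admits more noise,
                              # narrower increases ringing (the trade-off studied)
window = np.zeros(spectrum.size)
window[k0 - half_width : k0 + half_width + 1] = 1.0
filtered = spectrum * window

# The phase of the retained peak estimates the fringe phase.
phase = np.angle(filtered[k0])
print(f"recovered phase: {phase:.3f} rad (true value {true_phase})")
```

The `half_width` choice is exactly the knob the paper optimizes: the recovered phase degrades as the window either admits more noise or clips the signal's spectral leakage.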

  6. Analysis of drought characteristics for improved understanding of a water resource system

    Directory of Open Access Journals (Sweden)

    A. T. Lennard

    2014-09-01

    Full Text Available Droughts are a recurring feature of the UK climate; recent drought events (2004–2006 and 2010–2012) have highlighted the UK’s continued vulnerability to this hazard. There is a need for further understanding of extreme events, particularly from a water resource perspective. A number of drought indices are available, which can help to improve our understanding of drought characteristics such as frequency, severity and duration. However, at present little of this is applied to water resource management in the water supply sector. Improved understanding of drought characteristics using indices can inform water resource management plans and enhance future drought resilience. This study applies the standardised precipitation index (SPI) to a series of rainfall records (1962–2012) across the water supply region of a single utility provider. Key droughts within this period are analysed to develop an understanding of the meteorological characteristics that lead to, exist during and terminate drought events. The results of this analysis highlight how drought severity and duration can vary across a small-scale water supply region, indicating that the spatial coherence of drought events cannot be assumed.
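
The SPI applied in the study follows a standard recipe: fit a gamma distribution to the precipitation series, then map each value's cumulative probability to a standard normal quantile. A sketch with synthetic rainfall (the gamma parameters and record length are assumptions, not the study's data):

```python
# Simplified standardised precipitation index (SPI) computation on
# synthetic monthly rainfall totals (mm). Not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rainfall = rng.gamma(shape=4.0, scale=20.0, size=600)

# Fit a gamma distribution (location fixed at zero, as is conventional).
shape, loc, scale = stats.gamma.fit(rainfall, floc=0)

# Map each month's cumulative probability to a standard normal quantile.
cdf = stats.gamma.cdf(rainfall, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)

# SPI < -1 is commonly read as moderate drought, SPI < -2 as extreme.
print(f"mean SPI: {spi.mean():.2f} (near 0 by construction)")
print("moderate-drought months:", int((spi < -1).sum()))
```

Because the index is standardised against the fitted distribution, SPI values are comparable across rain gauges with very different climatologies, which is what allows the spatial comparison across the supply region.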

  7. A fuzzy MICMAC analysis for improving supply chain performance of basic vaccines in developing countries.

    Science.gov (United States)

    Chandra, Dheeraj; Kumar, Dinesh

    2018-03-01

    In recent years, the demand to improve child immunization coverage globally, together with the development of the latest vaccines and technologies, has made the vaccine market very complex. The rise in such complexities often gives birth to numerous issues in the vaccine supply chain, which are the primary cause of its poor performance; identifying the causes of a performance problem helps decide how to address it. The goal of the present study is to identify and analyze important issues in the supply chain of the basic vaccines required for child immunization in developing countries. Twenty-five key issues, as various factors of the vaccine supply chain, are presented in this paper. A fuzzy MICMAC analysis has been carried out to classify the factors based on their driving and dependence power and to develop a hierarchy-based model. Further, the findings have been discussed with field experts to identify the critical factors. Three factors have been identified as critical to the vaccine supply chain: better demand forecasting, communication between supply chain members, and proper planning and scheduling. These factors should be given special care to improve vaccine supply chain performance.
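
The MICMAC classification step described above can be illustrated in its crisp (non-fuzzy) form: a factor's driving power is the row sum of the reachability matrix and its dependence power the column sum, and the two sums place it in one of four quadrants. The four factors and the influence matrix below are hypothetical:

```python
# Crisp MICMAC sketch: driving power = row sum, dependence power = column
# sum of a reachability matrix. Factors and matrix values are invented.
import numpy as np

factors = ["demand forecast", "communication", "planning", "cold chain"]
# reach[i][j] = 1 if factor i influences factor j (hypothetical values)
reach = np.array([
    [1, 1, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
])

driving = reach.sum(axis=1)
dependence = reach.sum(axis=0)
mid = len(factors) / 2

for name, drv, dep in zip(factors, driving, dependence):
    quadrant = ("driving" if drv > mid and dep <= mid else
                "linkage" if drv > mid else
                "dependent" if dep > mid else
                "autonomous")
    print(f"{name:16s} driving={drv} dependence={dep} -> {quadrant}")
```

The fuzzy variant used in the paper replaces the 0/1 entries with graded influence values before summing, which separates factors that a crisp matrix would tie.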

  8. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    Science.gov (United States)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    In order to obtain cassava starch films with mechanical properties improved relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as a filler in the biofilm, and glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested against the experimental data through a Pareto chart analysis. The modified clay was the factor of greatest statistical significance on the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both the modified clay and the cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
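
A complete 2³ factorial design of this kind is commonly analyzed by regressing the response on coded ±1 factors and their interactions. The sketch below uses invented tensile-strength responses, chosen so that the clay effect dominates to mirror the reported finding; they are not the paper's measurements:

```python
# Fitting main effects and two-factor interactions for a full 2^3
# factorial design (factors coded -1/+1). Responses are invented.
import numpy as np
from itertools import product

runs = np.array(list(product([-1, 1], repeat=3)))       # the 8 runs
A, B, C = runs.T                                        # starch, glycerol, clay
y = np.array([4.5, 7.3, 3.7, 6.6, 5.4, 8.5, 4.6, 7.5])  # tensile strength, MPa

# Model matrix: intercept, main effects, and two-factor interactions.
X = np.column_stack([np.ones(8), A, B, C, A * B, A * C, B * C])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# With these invented responses, the clay effect (C) is the largest,
# as a Pareto chart of |effect| would show.
for name, c in zip(["mean", "A", "B", "C", "AB", "AC", "BC"], coef):
    print(f"{name:4s} {c:+.4f}")
```

Because the design columns are mutually orthogonal, each coefficient is simply the contrast of the response between the factor's high and low runs divided by the number of runs, which is what makes the Pareto ranking of effects straightforward.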

  9. Analysis of the Convention on Nuclear Safety and Suggestions for Improvement

    International Nuclear Information System (INIS)

    Choi, K. S.; Viet, Phuong Nguyen

    2013-01-01

    The innovative approach of the Convention, which is based on incentives rather than legally binding obligations, has been considered successful in strengthening nuclear safety worldwide. However, the nuclear accident at the Fukushima Dai-ichi Nuclear Power Plant (Japan) in March 2011 exposed a number of weaknesses of the Convention. Given that context, this paper analyses the characteristics of the CNS in order to understand the advantages and disadvantages of the Convention, and finally suggests some possible improvements. The analysis in this paper shows that the incentive approach of the CNS has succeeded in facilitating the active role of its Contracting Parties in preparing National Reports and participating in the peer review of these reports. However, the uneven quality of the National Reports, the differing levels of participation in the peer review process by Contracting Parties, and the lack of transparency of the peer review have undermined the effectiveness of the Convention in strengthening the international safety regime and in preventing the serious regulatory errors that occurred in Japan before the Fukushima accident. Therefore, the peer review process should be reformed in a more transparent and independent direction, while an advisory group of regulators within the CNS might also be useful in improving the effectiveness of the Convention, as already proven by good practice in the European Union. Only with such effective change can the CNS maintain its pivotal role in the international safety regime.

  10. Risk analysis of urban gas pipeline network based on improved bow-tie model

    Science.gov (United States)

    Hao, M. J.; You, Q. J.; Yue, Z.

    2017-11-01

    Gas pipeline networks are a major hazard source in urban areas; in the event of an accident, the consequences can be grave. In order to understand more clearly the causes and consequences of gas pipeline network accidents, and to develop prevention and mitigation measures, the authors apply an improved bow-tie model to analyze the risks of urban gas pipeline networks. The improved bow-tie model analyzes accident causes from four aspects (human, materials, environment and management) and consequences from four aspects (casualties, property loss, environment and society), and then quantifies both. Risk identification, risk analysis, risk assessment, risk control, and risk management are clearly shown in the model figures, which can suggest prevention and mitigation measures to help reduce the accident rate of gas pipeline networks. The results show that the whole process of an accident can be visually investigated using the bow-tie model, which can also identify the reasons for, and predict the consequences of, an unfortunate event. This is of great significance for analyzing leakage failures of gas pipeline networks.

  11. Improvement and error analysis of quantitative information extraction in diffraction-enhanced imaging

    International Nuclear Information System (INIS)

    Yang Hao; Xuan Rui-Jiao; Hu Chun-Hong; Duan Jing-Hao

    2014-01-01

    Diffraction-enhanced imaging (DEI) is a powerful phase-sensitive technique that provides higher spatial resolution and superior contrast of weakly absorbing objects than conventional radiography. It derives contrast from the X-ray absorption, refraction, and ultra-small-angle X-ray scattering (USAXS) properties of an object. The separation of the different contrast contributions from images is an important issue for the potential application of DEI. In this paper, an improved DEI (IDEI) method is proposed based on Gaussian curve fitting of the rocking curve (RC). Utilizing only three input images, the IDEI method can accurately separate the absorption, refraction, and USAXS contrasts produced by the object, and can therefore be viewed as an improvement on the extended DEI (EDEI) method: it circumvents the limitations of the EDEI method because it does not impose a Taylor approximation on the RC. Additionally, an analysis of the IDEI model errors is performed to further investigate the factors that lead to image artifacts, and finally validation studies are conducted using computer simulation and synchrotron experimental data. (interdisciplinary physics and related areas of science and technology)
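
The Gaussian fit to the rocking curve at the heart of the IDEI method can be sketched as follows, with a simulated RC rather than synchrotron data. The fitted centre and width are the quantities from which refraction (peak shift) and USAXS (broadening) contrasts would be derived:

```python
# Gaussian fit to a simulated rocking curve (RC). The amplitude, centre
# and width values here are illustrative, not experimental data.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(theta, amp, centre, sigma):
    return amp * np.exp(-((theta - centre) ** 2) / (2 * sigma ** 2))

theta = np.linspace(-10, 10, 81)          # analyser angle, arbitrary units
rng = np.random.default_rng(2)
rc = gaussian(theta, 1.0, 1.2, 2.5) + 0.01 * rng.standard_normal(theta.size)

popt, _ = curve_fit(gaussian, theta, rc, p0=[0.8, 0.0, 2.0])
amp, centre, sigma = popt
print(f"fit: amp={amp:.3f}, centre={centre:.3f}, sigma={sigma:.3f}")
```

Fitting the full Gaussian rather than a Taylor expansion of the RC is what lets the method stay accurate for shifts far from the RC flank, which is the limitation of the EDEI approach noted above.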

  12. SEISMIC FRAGILITY ANALYSIS OF IMPROVED RC FRAMES USING DIFFERENT TYPES OF BRACING

    Directory of Open Access Journals (Sweden)

    HAMED HAMIDI JAMNANI

    2017-04-01

    Full Text Available The application of bracing to increase the lateral stiffness of building structures is a seismic improvement technique that engineers frequently have recourse to. Accordingly, investigating the role of bracing in concrete structures, along with the development of seismic fragility curves, is of overriding concern to civil engineers. In this research, an ordinary RC building, designed according to the 1st edition of the Iranian seismic code, was selected for examination. According to the FEMA 356 code, this building is considered to be vulnerable. To improve its seismic performance, three different types of bracing (Concentrically Braced Frames, Eccentrically Braced Frames and Buckling Restrained Frames) were employed, and each bracing element was distributed in three different locations in the building. The researchers developed fragility curves and utilized 30 earthquake records on the Peak Ground Acceleration seismic intensity scale to carry out a time history analysis. Two damage scales, Inter-Story Drift and Plastic Axial Deformation, were also used. The numerical results obtained from this investigation confirm that Plastic Axial Deformation is more reliable than conventional approaches in developing fragility curves for retrofitted frames. Accordingly, the researchers selected the suitable damage scale and developed and compared log-normal distributions of fragility curves, first for the original and then for the retrofitted building.
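
Log-normal fragility curves of the kind developed in such studies take a standard form: the probability of exceeding a damage state at a given PGA is Phi(ln(PGA/theta)/beta), where theta is the median capacity and beta the log-standard deviation. A sketch with hypothetical parameters, not the study's fitted values:

```python
# Log-normal fragility curve sketch. theta (median capacity, g) and
# beta (dispersion) below are hypothetical illustration values.
import numpy as np
from scipy.stats import norm

def fragility(pga, theta, beta):
    """P(damage state exceeded | PGA) under the log-normal model."""
    return norm.cdf(np.log(pga / theta) / beta)

theta, beta = 0.45, 0.5
pga = np.array([0.10, 0.30, 0.45, 0.60, 0.90])
probs = fragility(pga, theta, beta)

for g, p in zip(pga, probs):
    print(f"PGA {g:.2f} g -> exceedance probability {p:.2f}")
```

By construction the curve passes through 0.5 at PGA = theta, so fitting theta and beta from time-history results (e.g., drift or plastic-deformation exceedances over the 30 records) fully determines each curve.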

  13. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies by providing the global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor quality GNSS observation data is also increasing and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric product. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters when used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.

  14. Using containment analysis to improve component cooling water heat exchanger limits

    International Nuclear Information System (INIS)

    Da Silva, H.C.; Tajbakhsh, A.

    1995-01-01

    The Comanche Peak Steam Electric Station design requires that exit temperatures from the Component Cooling Water Heat Exchanger remain below 330.37 K during the Emergency Core Cooling System recirculation stage following a hypothetical Loss of Coolant Accident (LOCA). Measurements indicated a higher than expected combination of (a) a high fouling factor in the Component Cooling Water Heat Exchanger and (b) high ultimate heat sink temperatures, which might lead to temperatures in excess of the 330.37 K limit if a LOCA were to occur, so TUElectric adjusted key flow rates in the Component Cooling Water network. This solution could only be implemented with improvements to the containment analysis methodology of record. The new method builds upon the CONTEMPT-LT/028 code by (a) coupling the long-term post-LOCA thermohydraulics with a more detailed analytical model of the complex Component Cooling Water Heat Exchanger network and (b) changing the way mass and energy releases are calculated after core reflood and after steam generator energy is dumped to the containment. In addition, a simple code to calculate normal cooldowns was developed to confirm that RHR design bases were met with the improved limits.

  15. IMPROVEMENT OF THE LOCA PSA MODEL USING A BEST-ESTIMATE THERMAL-HYDRAULIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    DONG HYUN LEE

    2014-08-01

    Full Text Available Probabilistic Safety Assessment (PSA has been widely used to estimate the overall safety of nuclear power plants (NPP and it provides base information for risk informed application (RIA and risk informed regulation (RIR. For the effective and correct use of PSA in RIA/RIR related decision making, the risk estimated by a PSA model should be as realistic as possible. In this work, a best-estimate thermal-hydraulic analysis of loss-of-coolant accidents (LOCAs for the Hanul Nuclear Units 3&4 is first carried out in a systematic way. That is, the behaviors of peak cladding temperature (PCT were analyzed with various combinations of break sizes, the operating conditions of safety systems, and the operator's action time for aggressive secondary cooling. Thereafter, the results of the thermal-hydraulic analysis have been reflected in the improvement of the PSA model by changing both accident sequences and success criteria of the event trees for the LOCA scenarios.

  16. Using Critical Discourse Analysis Based Instruction to Improve EFL Learners’ Writing Complexity, Accuracy and Fluency

    Directory of Open Access Journals (Sweden)

    Hamid Marashi

    2016-11-01

    Full Text Available The literature of ELT is perhaps overwhelmed by attempts to enhance learners’ writing through the application of different methodologies. One such methodology is critical discourse analysis, which stresses not only the decoding of the propositional meaning of a text but also its ideological assumptions. Accordingly, this study investigated the impact of critical discourse analysis-based (CDA) instruction on EFL learners’ writing complexity, accuracy, and fluency (CAF). To fulfill the purpose of this study, 60 female intermediate EFL learners were selected from among a total of 100 through their performance on a piloted sample PET. Based on the results, the students were randomly assigned to a control and an experimental group with 30 participants in each. Both groups underwent the same amount of teaching time during 17 sessions, which included a treatment of CDA instruction for the experimental group. A writing posttest was administered at the end of the instruction to both groups and their mean scores on the test were compared through a MANOVA. The results led to the rejection of the three null hypotheses, demonstrating that the learners in the experimental group benefited significantly more than those in the control group in terms of improving their writing CAF. It is therefore recommended that CDA instruction be incorporated more frequently in writing classes, following, of course, adequate syllabus design and materials development.

  17. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    International Nuclear Information System (INIS)

    Sandusky, Peter; Appiah-Amponsah, Emmanuel; Raftery, Daniel

    2011-01-01

    One dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY-based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate taurine concentrations and distort their variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY-determined taurine concentrations produce better scores plot subpopulation cluster resolution.

  19. Drought Characteristic Analysis Based on an Improved PDSI in the Wei River Basin of China

    Directory of Open Access Journals (Sweden)

    Lei Zou

    2017-03-01

    Full Text Available In this study, to improve on the original Palmer Drought Severity Index (PDSI_original), we coupled the Soil and Water Assessment Tool (SWAT) with the PDSI_original to construct a drought index called PDSI_SWAT. The constructed PDSI_SWAT is applied to the Wei River Basin (WRB) of China during 1960–2012. Comparison of the PDSI_SWAT with four other commonly used drought indices reveals its effectiveness in describing the drought propagation processes in the WRB. The whole WRB exhibits a drying trend, with more significant trends in the northern, southeastern and western WRB than in the remaining regions. Furthermore, the drought frequencies show that droughts are more likely to occur in the northern part than in the southern part of the WRB. A principal component analysis based on the PDSI_SWAT reveals that the whole basin can be further divided into three distinct sub-regions with different drought variability, i.e., the northern, southeastern and western parts; these three sub-regions are also consistent with the spatial pattern of drought shown by the drought frequency. A wavelet transform analysis indicates that El Niño-Southern Oscillation (ENSO) events have a strong influence in inducing droughts in the WRB. The results of this study could be beneficial for scientific water resources management and drought assessment in the study area and also provide a valuable reference for other areas with similar climatic characteristics.

  20. Cross-sectional atom probe tomography sample preparation for improved analysis of fins on SOI

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Andrew J., E-mail: andy.martin@globalfoundries.com; Weng, Weihao; Zhu, Zhengmao; Loesing, Rainer; Shaffer, James; Katnani, Ahmad

    2016-02-15

    Sample preparation for atom probe tomography of 3D semiconductor devices has proven to significantly affect field evaporation and the reliability of reconstructed data. A cross-sectional preparation method is applied to state-of-the-art Si finFET technology on SOI. This preparation approach advantageously provides a conductive path for voltage and heat, offers analysis of many fins within a single tip, and improves resolution across interfaces of particular interest. Measured B and Ge profiles exhibit good correlation with SIMS and EDX and show no signs of B clustering or pile-up near the Si/SiGe interface of the fin. - Highlights: • Cross-section atom probe tomography sample preparation of fins on SOI. • >5 fins captured in single atom probe tip via cross-section method. • Oxides affect collection efficiency, reconstruction accuracy, and data reliability. • Sample orientation affects field evaporation of dissimilar materials. • Data is well-matched to SIMS and EDX analysis.

  1. An improved quadratic inference function for parameter estimation in the analysis of correlated data.

    Science.gov (United States)

    Westgate, Philip M; Braun, Thomas M

    2013-08-30

    Generalized estimating equations (GEE) are commonly employed for the analysis of correlated data. However, the quadratic inference function (QIF) method is increasing in popularity because of its multiple theoretical advantages over GEE; we focus on the fact that the QIF method is more efficient than GEE when the working covariance structure for the data is misspecified. It has been shown that, because of the use of an empirical weighting covariance matrix inside its estimating equations, the QIF method's realized estimation performance can be inferior to GEE's when the number of independent clusters is not large. We therefore propose an alternative weighting matrix for the QIF, derived by minimizing its expected quadratic loss, which is asymptotically an optimally weighted combination of the empirical covariance matrix and its model-based version. Use of the proposed weighting matrix maintains the large-sample advantages the QIF approach has over GEE and, as shown via simulation, improves small-sample parameter estimation. We also illustrate the proposed method in the analysis of a longitudinal study. Copyright © 2012 John Wiley & Sons, Ltd.

  2. PROJECT MANAGEMENT OF CULTURAL EVENTS IN A PUBLIC UNIVERSITY: ANALYSIS OF PRACTICES AND IMPROVEMENT PROPOSITIONS

    Directory of Open Access Journals (Sweden)

    Cláudia Fabiana Gohr,

    2013-09-01

    Full Text Available This article describes the management practices used for conducting cultural events adopted by the Department of Culture of a public university, identifying critical points and providing the basis for developing a methodology specific to the organization. The research adopted a qualitative approach, both to identify practices with those responsible for project management and to conduct a case study of a project undertaken by the institution. For data collection, we used semi-structured interviews, participant observation and document analysis. It was found that the project management of cultural events at the university has certain peculiarities, which require the development of specific management processes. Moreover, when compared with the best practices suggested by the literature, the university's management of event projects is still at an incipient stage of maturity. Based on the results, we present proposals for improving the project management of cultural events in the Department of Culture, especially regarding the adoption of management tools for the projects and activities that must be performed at each stage of the life cycle of a cultural event.

  3. Improved process analytical technology for Protein A chromatography using predictive principal component analysis tools.

    Science.gov (United States)

    Hou, Ying; Jiang, Canping; Shukla, Abhinav A; Cramer, Steven M

    2011-01-01

    Protein A chromatography is widely employed for the capture and purification of antibodies and Fc-fusion proteins. Due to the high cost of protein A resins, there is a significant economic driving force for using these chromatographic materials for a large number of cycles. The maintenance of column performance over the resin lifetime is also a significant concern in large-scale manufacturing. In this work, several statistical methods are employed to develop a novel principal component analysis (PCA)-based tool for predicting protein A chromatographic column performance over time. A method is developed to carry out detection of column integrity failures before their occurrence without the need for a separate integrity test. In addition, analysis of various transitions in the chromatograms was also employed to develop PCA-based models to predict both subtle and general trends in real-time protein A column yield decay. The developed approach has significant potential for facilitating timely and improved decisions in large-scale chromatographic operations in line with the process analytical technology (PAT) guidance from the Food and Drug Administration (FDA). © 2010 Wiley Periodicals, Inc.

  4. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    Science.gov (United States)

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  5. Is Recreational Soccer Effective for Improving VO2max? A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Milanović, Zoran; Pantelić, Saša; Čović, Nedim; Sporiš, Goran; Krustrup, Peter

    2015-09-01

    Soccer is the most popular sport worldwide, with a long history and currently more than 500 million active participants, of whom 300 million are registered football club members. On the basis of scientific findings showing positive fitness and health effects of recreational soccer, FIFA (Fédération Internationale de Football Association) introduced the slogan "Playing football for 45 min twice a week-best prevention of non-communicable diseases" in 2010. The objective of this paper was to perform a systematic review and meta-analysis of the literature to determine the effects of recreational soccer on maximal oxygen uptake (VO2max). Six electronic databases (MEDLINE, PubMed, SPORTDiscus, Web of Science, CINAHL and Google Scholar) were searched for original research articles. A manual search was performed to cover the areas of recreational soccer, recreational physical activity, recreational small-sided games and VO2max using the following key terms, either singly or in combination: recreational small-sided games, recreational football, recreational soccer, street football, street soccer, effect, maximal oxygen uptake, peak oxygen uptake, cardiorespiratory fitness, VO2max. The inclusion criteria were divided into four sections: type of study, type of participants, type of interventions and type of outcome measures. Probabilistic magnitude-based inferences for meta-analysed effects were based on standardised thresholds for small, moderate and large changes (0.2, 0.6 and 1.2, respectively) derived from between-subject standard deviations for baseline fitness. Seventeen studies met the inclusion criteria and were included in the systematic review and meta-analysis. Mean differences showed that VO2max increased by 3.51 mL/kg/min (95 % CI 3.07-4.15) over a recreational soccer training programme in comparison with other training models. The meta-analysed effects of recreational soccer on VO2max compared with the controls of no exercise, continuous running and strength

  6. Crossing the Barriers: An Analysis of Permitting Barriers to Geothermal Development and Potential Improvement Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Katherine R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-10-04

    Developers have identified many non-technical barriers to geothermal power development, including permitting. Activities required for permitting, such as the associated environmental reviews, can take a considerable amount of time and delay project development. This paper discusses the impacts of permitting challenges on geothermal development timelines, including the regulatory framework, the environmental review process, and ancillary permits. We identified barriers that have the potential to prevent geothermal development or delay timelines, and we defined improvement scenarios that could expedite geothermal development and permitting timelines and lead to the deployment of additional geothermal resources by 2030 and 2050: (1) the creation of a centralized federal geothermal permitting office and utilization of state permit coordination offices, and (2) an expansion of existing categorical exclusions applicable to geothermal development on Bureau of Land Management public lands to include the oil and gas categorical exclusions passed as part of the Energy Policy Act of 2005. We utilized the Regional Energy Deployment System (ReEDS) and the Geothermal Electricity Technology Evaluation Model (GETEM) to forecast baseline geothermal deployment based on previous analysis of geothermal project development and permitting timelines. The model results forecast that reductions in geothermal project timelines can have a significant impact on geothermal deployment. For example, using the ReEDS model, we estimated that reducing timelines by two years, perhaps through the creation of a centralized federal geothermal permitting office and utilization of state permit coordination offices, could result in deployment of an additional 204 MW by 2030 and 768 MW by 2050 - a 13% improvement compared to the business-as-usual scenario. The model results forecast that a timeline improvement of four years - for example with an expansion of existing categorical

  7. Systematic Review and Meta-Analysis of Interventions to Improve Access and Coverage of Adolescent Immunizations.

    Science.gov (United States)

    Das, Jai K; Salam, Rehana A; Arshad, Ahmed; Lassi, Zohra S; Bhutta, Zulfiqar A

    2016-10-01

    Vaccination strategies are among the most successful and cost-effective public health strategies for preventing disease and death. Until recently, most of the existing immunization programs targeted infants and children younger than 5 years, which has successfully reduced global infant and child mortality. Adolescent immunization has been relatively neglected, leaving a quarter of the world's population underimmunized and hence vulnerable to a number of preventable diseases. In recent years, a large number of programs have been launched to increase the uptake of different vaccines in adolescents; however, the recommended vaccination coverage among the adolescent population overall remains very low, especially in low- and middle-income countries. Adolescent vaccination has received significantly more attention since the advent of the human papillomavirus (HPV) vaccine in 2006. However, only half of the adolescent girls in the United States received a single dose of HPV vaccine, while merely 43% and 33% received two and three doses, respectively. We systematically reviewed literature published up to December 2014 and included 23 studies on the effectiveness of interventions to improve immunization coverage among adolescents. Moderate-quality evidence suggested an overall increase in vaccination coverage by 78% (relative risk: 1.78; 95% confidence interval: 1.41-2.23). Review findings suggest that interventions including implementing vaccination requirements in schools, sending reminders, and national permissive recommendations for adolescent vaccination have the potential to improve immunization uptake. Strategies to improve coverage for HPV vaccines resulted in a significant decrease in the prevalence of HPV by 44% and genital warts by 33%; however, the quality of evidence was low. Analysis from single studies with low- or very low-quality evidence suggested significant decreases in varicella deaths, measles incidence, rubella susceptibility, and incidence of

  8. Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marten, Alex; Kopp, Robert E.; Shouse, Kate C.; Griffiths, Charles; Hodson, Elke L.; Kopits, Elizabeth; Mignone, Bryan K.; Moore, Chris; Newbold, Steve; Waldhoff, Stephanie T.; Wolverton, Ann

    2013-04-01

    to updating the estimates regularly as modeling capabilities and scientific and economic knowledge improve. To help foster further improvements in estimating the SCC, the U.S. Environmental Protection Agency and the U.S. Department of Energy hosted a pair of workshops on “Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis.” The first focused on conceptual and methodological issues related to integrated assessment modeling, and the second brought together natural and social scientists to explore methods for improving damage assessment for multiple sectors. These two workshops provide the basis for the 13 papers in this special issue.

  9. Interventions to improve gross motor performance in children with neurodevelopmental disorders: a meta-analysis.

    Science.gov (United States)

    Lucas, Barbara R; Elliott, Elizabeth J; Coggan, Sarah; Pinto, Rafael Z; Jirikowic, Tracy; McCoy, Sarah Westcott; Latimer, Jane

    2016-11-29

    Gross motor skills are fundamental to childhood development. The effectiveness of current physical therapy options for children with mild to moderate gross motor disorders is unknown. The aim of this study was to systematically review the literature to investigate the effectiveness of conservative interventions to improve gross motor performance in children with a range of neurodevelopmental disorders. A systematic review with meta-analysis was conducted. MEDLINE, EMBASE, AMED, CINAHL, PsycINFO, PEDro, Cochrane Collaboration and Google Scholar databases and clinical trial registries were searched. Included were published randomised controlled trials of children aged 3 to ≤18 years (i) with Developmental Coordination Disorder (DCD), Cerebral Palsy (CP) (Gross Motor Function Classification System Level 1), Developmental Delay, Minimal Acquired Brain Injury or Prematurity; (ii) receiving non-pharmacological or non-surgical interventions from a health professional; and (iii) with gross motor outcomes obtained using a standardised assessment tool. Meta-analysis was performed to determine the pooled effect of intervention on gross motor function. Methodological quality and strength of meta-analysis recommendations were evaluated using PEDro and the GRADE approach, respectively. Of 2513 papers, 9 met the inclusion criteria, including children with CP (n = 2) or DCD (n = 7) receiving 11 different interventions. Only two of the 9 trials showed an effect for treatment. Using the least conservative trial outcomes, a large beneficial effect of intervention was shown (SMD: -0.8; 95% CI: -1.1 to -0.5) with "very low quality" GRADE ratings. Using the most conservative trial outcomes, there is no treatment effect (SMD: -0.1; 95% CI: -0.3 to 0.2) with "low quality" GRADE ratings. Study limitations included the small number and poor quality of the available trials. Although we found that some interventions with a task-orientated framework can improve gross motor outcomes in children with

  10. IMPROVEMENT OF EXPERT ANALYSIS FOR ROAD TRAFFIC ACCIDENTS USING COMPUTER SIMULATION PROGRAMS

    Directory of Open Access Journals (Sweden)

    S. A. Azemsha

    2015-01-01

    The existing methods for auto-technical expertise presuppose selection of some parameters on the basis of the expert's intuition and experience. The type of vehicle, its loading rate and the road conditions are also not taken into account when deceleration is to be determined. The analysis has established that the application of special software makes it possible to significantly improve the efficiency of the executed work, to speed up calculation processes, to qualitatively decrease the probability of arithmetic errors, and to visualize the results of the conducted investigations. The paper establishes the possibility of using various models for dynamic simulation of vehicle motion and collisions (in the form of 3D models), taking into account the specific features of a vehicle's technical condition, its loading rate and the condition of the roadway surface. The approach also permits a dynamic display of the reconstructed accident mechanism in axonometric projection and the filming of video clips with the camera positioned at any spatial point: road, roadside, raised position, moving vehicle, or the driver's seat. The paper contains an analysis of the capabilities of road traffic accident simulation programs and a statistical analysis showing significant differences between the simulation results of the various programs. It presents initial data and results of vehicle speed calculation on the basis of braking track length, obtained both with road traffic accident express analysis (a classical approach) and with PC-Crash when additional influencing factors are taken into account. A number of shortcomings that should be removed from the analyzed software products were revealed while analyzing the simulation results. On the basis of the executed analysis in

  11. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. The tools in these two categories employ several techniques, e.g., read mapping, k-mer alignment, and composition analysis, and they also differ in how their reference sequence databases are constructed. In addition, different tools perform well on different datasets and configurations. All this variation makes it difficult for researchers to decide which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available on the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at https://gitlab.com/rki_bioinformatics.
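    The co-occurrence idea behind the integration step can be illustrated with a toy sketch (this is not MetaMeta's actual code): keep only taxa reported by a minimum number of tools, average their reported abundances, and renormalise. All taxon names and abundances below are invented.

```python
# Illustrative co-occurrence merge of per-tool taxonomic profiles.
# Not MetaMeta's implementation; a minimal sketch of the general idea.
from collections import defaultdict

def merge_profiles(profiles, min_support=2):
    """profiles: list of {taxon: relative_abundance} dicts, one per tool."""
    hits = defaultdict(list)
    for profile in profiles:
        for taxon, abundance in profile.items():
            hits[taxon].append(abundance)
    # keep taxa supported by at least `min_support` tools, average abundances
    merged = {
        taxon: sum(vals) / len(vals)
        for taxon, vals in hits.items()
        if len(vals) >= min_support
    }
    total = sum(merged.values())
    return {taxon: a / total for taxon, a in merged.items()}  # renormalise

tool_a = {"E. coli": 0.6, "B. subtilis": 0.4}
tool_b = {"E. coli": 0.5, "S. aureus": 0.5}
tool_c = {"E. coli": 0.7, "B. subtilis": 0.3}
print(merge_profiles([tool_a, tool_b, tool_c]))  # "S. aureus" is filtered out
```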

  12. Improvement of the analysis of the biochemical oxygen demand (BOD) of Mediterranean seawater by seeding control.

    Science.gov (United States)

    Simon, F Xavier; Penru, Ywann; Guastalli, Andrea R; Llorens, Joan; Baig, Sylvie

    2011-07-15

    Biochemical oxygen demand (BOD) is a useful parameter for assessing the biodegradability of dissolved organic matter in water. At the same time, this parameter is used to evaluate the efficiency with which certain processes remove biodegradable natural organic matter (NOM). However, BOD values in seawater are very low (around 2 mg O2/L) and the methods used for its analysis are poorly developed. The increasing attention given to seawater desalination in the Mediterranean environment, and related phenomena such as reverse osmosis membrane biofouling, have stimulated interest in seawater BOD close to the Spanish coast. In this study the BOD analysis protocol was refined by introducing a new step in which a critical quantity of autochthonous microorganisms, measured as adenosine triphosphate, is added. For the samples analyzed, this improvement allowed reliable and replicable BOD measurements, standardized with solutions of glucose-glutamic acid and acetate. After 7 days of analysis, more than 80% of the ultimate BOD is achieved, which in the case of easily biodegradable compounds represents nearly 60% of the theoretical oxygen demand. The BOD7 of Mediterranean seawater was found to be 2.0±0.3 mg O2/L, but this value decreased with seawater storage time due to the rapid consumption of labile compounds. No significant differences were found between two sampling points located on the Spanish coast, since their organic matter content was similar. Finally, determining seawater BOD without an inoculum may lead to an underestimation of BOD. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Improvements in quantification of low z element analysis for Sr- and conventional TXRF

    International Nuclear Information System (INIS)

    Baur, K.; Brennan, S.; Pianetta, P.; Kerner, J.; Zhu, Q.; Burrow, B.

    2000-01-01

    As the dimensions of integrated circuits continue to shrink, the amount of tolerable contamination on Si wafer surfaces decreases accordingly. Contaminants of primary concern are transition metals and light elements such as Al. Total reflection x-ray fluorescence (TXRF) spectroscopy using synchrotron radiation from the Stanford synchrotron radiation laboratory (SSRL) is one of the most powerful techniques for trace impurity analysis on Si wafer surfaces. It is also among the most sensitive techniques and the only one that is non-destructive. Having established a detection sensitivity for transition elements better than that required by the semiconductor industry, the current effort focuses on improving the sensitivity of the detection and data analysis of light elements. Due to the presence of the neighboring Si signal from the substrate, this can only be achieved by tuning the excitation energy below the Si-K absorption edge. In conventional TXRF systems this can be done by using a W-M fluorescence line (1.78 keV) for excitation or by employing the tunability of synchrotron radiation. However, this results in a substantial increase in background due to resonant x-ray Raman scattering. This scattering dominates the background behavior of the Al K fluorescence line and consequently limits the achievable sensitivity for the detection of Al surface contaminants. In particular, we find that for a precise determination of the achievable sensitivity, the specific shape of the continuous Raman background must be used in the deconvolution. This data analysis opens a new perspective for conventional TXRF systems to overcome background problems in quantification, and first results will be presented. (author)

  14. Using the failure mode and effects analysis model to improve parathyroid hormone and adrenocorticotropic hormone testing

    Directory of Open Access Journals (Sweden)

    Magnezi R

    2016-12-01

    Racheli Magnezi,1 Asaf Hemi,1 Rina Hemi2 1Department of Management, Public Health and Health Systems Management Program, Bar Ilan University, Ramat Gan, 2Endocrine Service Unit, Sheba Medical Center, Tel Aviv, Israel Background: Risk management in health care systems applies to all hospital employees and directors, as they deal with human life and emergency routines. There is a constant need to decrease risk and increase patient safety in the hospital environment. The purpose of this article is to review the laboratory testing procedures for parathyroid hormone and adrenocorticotropic hormone (which are characterized by short half-lives), to track failure modes and risks, and to offer solutions to prevent them. During a routine quality improvement review at the Endocrine Laboratory in Tel Hashomer Hospital, we discovered these tests are frequently repeated unnecessarily due to multiple failures. The repetition of the tests inconveniences patients and leads to extra work for the laboratory and logistics personnel as well as the nurses and doctors who have to perform many tasks with limited resources. Methods: A team of eight staff members, accompanied by the Head of the Endocrine Laboratory, performed the analysis. The failure mode and effects analysis (FMEA) model was used to analyze the laboratory testing procedure; it was designed to simplify the process steps and to identify and rank possible failures. Results: A total of 23 failure modes were found within the process, 19 of which were ranked by level of severity. The FMEA model prioritizes failures by their risk priority number (RPN). For example, the most serious failure was the delay after the samples were collected from the department (RPN = 226.1). Conclusion: This model helped us to visualize the process in a simple way. After analyzing the information, solutions were proposed to prevent failures, and a method to completely avoid the top four problems was also developed.
Keywords: failure mode
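    The FMEA ranking step described in this record is conventionally computed as RPN = severity × occurrence × detection, with each factor scored on a 1-10 scale, and failure modes sorted by descending RPN. The sketch below illustrates that convention; the failure-mode names and scores are invented, not taken from the study (whose scale evidently allows non-integer RPNs such as 226.1).

```python
# Hedged sketch of FMEA risk prioritisation; all names/scores are invented.

def rpn(severity, occurrence, detection):
    """Risk priority number; each factor conventionally scored 1-10."""
    return severity * occurrence * detection

failure_modes = [
    ("sample transport delayed", 9, 7, 4),
    ("tube mislabelled",         8, 3, 5),
    ("centrifuge unavailable",   6, 4, 3),
]

# Rank failure modes from highest to lowest RPN.
ranked = sorted(
    ((name, rpn(s, o, d)) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, score in ranked:
    print(f"{name}: RPN={score}")
```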

  15. An improved Agrobacterium-mediated transformation system for the functional genetic analysis of Penicillium marneffei.

    Science.gov (United States)

    Kummasook, Aksarakorn; Cooper, Chester R; Vanittanakom, Nongnuch

    2010-12-01

    We have developed an improved Agrobacterium-mediated transformation (AMT) system for the functional genetic analysis of Penicillium marneffei, a thermally dimorphic, human pathogenic fungus. Our AMT protocol included the use of conidia or pre-germinated conidia of P. marneffei as the host recipient for T-DNA from Agrobacterium tumefaciens and co-cultivation at 28°C for 36 hours. Bleomycin-resistant transformants were selected as yeast-like colonies following incubation at 37°C. The efficiency of transformation was approximately 123 ± 3.27 and 239 ± 13.12 transformants per plate when using 5 × 10^4 conidia and pre-germinated conidia as starting materials, respectively. Southern blot analysis demonstrated that 95% of transformants contained single copies of T-DNA. Inverse PCR was employed to identify flanking sequences at the T-DNA insertion sites. Analysis of these sequences indicated that integration occurred as random recombination events. Among the mutants isolated were previously described stuA and gasC defective strains. These AMT-derived mutants possessed single T-DNA integrations within their particular coding sequences. In addition, other morphological and pigmentation mutants possessing a variety of gene-specific defects were isolated, including two mutants having T-DNA integrations within putative promoter regions. One of the latter integration events was accompanied by the deletion of the entire corresponding gene. Collectively, these results indicated that AMT could be used for large-scale, functional genetic analyses in P. marneffei. Such analyses can potentially facilitate the identification of genetic elements related to morphogenesis, as well as pathogenesis, in this medically important fungus.

  16. Improving configuration management of thermalhydraulic analysis by automating the linkage between pipe geometry and plant idealization

    International Nuclear Information System (INIS)

    Gibb, R.; Girard, R.; Thompson, W.

    1997-01-01

    All safety analysis codes require some representation of actual plant data as part of their input. Such representations, referred to at Point Lepreau Generating Station (PLGS) as plant idealizations, may include piping layout; orifice, pump or valve opening characteristics; boundary conditions of various sorts; reactor physics parameters; etc. As computing power increases, the numerical capabilities of thermalhydraulic analysis tools become more sophisticated, requiring more detailed assessments and consequently more complex and complicated idealizations of the system models. Thus, a need has emerged to create a precise plant model layout in electronic form which ensures a realistic representation of the plant systems, and from which analytical approximations of any chosen degree of accuracy may be created. The benefits of this process are twofold. Firstly, the job of developing a plant idealization is made simpler, and therefore cheaper for the utility. More important, however, are the improvements in documentation and reproducibility that this process imparts to the resultant idealization. Just as the software that performs the numerical operations on the input data must be subject to verification/validation, equally robust measures must be taken to ensure that these software operations are applied to valid, formally documented idealizations. Since the CATHENA code is one of the most important thermalhydraulic codes used for safety analysis at PLGS, the main effort was directed towards the system plant models for this code. This paper reports the results of the work carried out at PLGS and ANSL to link the existing piping database to the actual CATHENA plant idealization. An introduction to the concept is given first, followed by a description of the databases, the supervisory tool which manages the data, and associated software.
An intermediate code, which applied some thermalhydraulic rules to the data, and translated the resultant data

  17. An Improved Rigid Multibody Model for the Dynamic Analysis of the Planetary Gearbox in a Wind Turbine

    Directory of Open Access Journals (Sweden)

    Wenguang Yang

    2016-01-01

    This paper proposes an improved rigid multibody model for the dynamic analysis of the planetary gearbox in a wind turbine. The improvements mainly include choosing the inertia frame as the reference frame of the carrier, the ring, and the sun, and adding a new degree of freedom for each planet. An element assembly method is introduced to build the model, and a time-varying mesh stiffness model is presented. A planetary gear study case is employed to verify the validity of the improved model. Comparisons between the improved model and the traditional model show that the natural characteristics are very close; the improved model obtains the correct equivalent moment of inertia of the planetary gear in the transient simulation, and all the rotation speeds satisfy the transmission relationships well; harmonic resonance and resonance modulation phenomena can be found in their vibration signals. The improved model is applied in a multistage gearbox dynamics analysis to demonstrate the prospects of the model. Modal analysis and transient analysis with and without time-varying mesh stiffness are conducted. The rotation speeds from the transient analysis are consistent with the theory, and resonance modulation can be found in the vibration signals.
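    A time-varying mesh stiffness of the kind this record refers to is often idealised as a rectangular wave alternating between double- and single-tooth contact over each mesh cycle. The sketch below uses that common idealisation with invented parameter values; it is not the paper's actual stiffness model.

```python
# Rectangular-wave idealisation of time-varying gear mesh stiffness.
# All parameter values (frequency, stiffnesses, contact ratio) are assumed.

def mesh_stiffness(t, f_mesh=500.0, k_single=4e8, k_double=6e8, duty=0.6):
    """Stiffness (N/m) at time t (s): double-tooth contact for the first
    `duty` fraction of each mesh cycle, single-tooth contact otherwise."""
    phase = (t * f_mesh) % 1.0
    return k_double if phase < duty else k_single

# Within one 2 ms mesh cycle the stiffness switches between the two levels:
print(mesh_stiffness(0.0))     # early in the cycle: double-tooth contact
print(mesh_stiffness(0.0015))  # late in the cycle: single-tooth contact
```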

  18. A Bayesian technique for improving the sensitivity of the atmospheric neutrino L/E analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, A. S. T. [Univ. of Cambridge (United Kingdom); Chapman, J. D. [Univ. of Cambridge (United Kingdom); Thomson, M. A. [Univ. of Cambridge (United Kingdom)

    2013-04-01

    This paper outlines a method for improving the precision of atmospheric neutrino oscillation measurements. One experimental signature for these oscillations is an observed deficit in the rate of νμ charged-current interactions with an oscillatory dependence on Lν/Eν, where Lν is the neutrino propagation distance and Eν is the neutrino energy. For contained-vertex atmospheric neutrino interactions, the Lν/Eν resolution varies significantly from event to event. The precision of the oscillation measurement can be improved by incorporating information on Lν/Eν resolution into the oscillation analysis. In the analysis presented

  19. Gynecologic Oncology Group quality assurance audits: analysis and initiatives for improvement.

    Science.gov (United States)

    Blessing, John A; Bialy, Sally A; Whitney, Charles W; Stonebraker, Bette L; Stehman, Frederick B

    2010-08-01

    The Gynecologic Oncology Group (GOG) is a multi-institution, multi-discipline Cooperative Group funded by the National Cancer Institute (NCI) to conduct clinical trials which investigate the treatment, prevention, control, quality of survivorship, and translational science of gynecologic malignancies. In 1982, the NCI initiated a program of on-site quality assurance audits of participating institutions. Each is required to be audited at least once every 3 years. In GOG, the audit mandate is the responsibility of the GOG Quality Assurance Audit Committee and is centralized in the Statistical and Data Center (SDC). Each component (Regulatory, Investigational Drug Pharmacy, Patient Case Review) is classified as Acceptable; Acceptable, follow-up required; or Unacceptable. The objective of this analysis was to determine frequently occurring deviations and develop focused, innovative solutions to address them. A database was created to examine the deviations noted at the most recent audit conducted at 57 GOG parent institutions during 2004-2007. Cumulatively, this involved 687 patients and 306 protocols. The results documented commendable performance: Regulatory (39 Acceptable; 17 Acceptable, follow-up; 1 Unacceptable); Pharmacy (41 Acceptable; 3 Acceptable, follow-up; 1 Unacceptable; 12 N/A); Patient Case Review (31 Acceptable; 22 Acceptable, follow-up; 4 Unacceptable). The nature of major and lesser deviations was analyzed to create and enhance initiatives for improving the quality of clinical research. As a result, Group-wide proactive initiatives were undertaken, audit training sessions have emphasized recurring issues, and GOG Data Management Subcommittee agendas have provided targeted instruction and training. The analysis was based upon parent institutions only; affiliate institutions and Community Clinical Oncology Program participants were not included, although it is assumed their areas of difficulty are similar.
The coordination of the GOG Quality Assurance Audit program in the SDC has

  20. Trabeculectomy Improves Vessel Response Measured by Dynamic Vessel Analysis (DVA) in Glaucoma Patients.

    Science.gov (United States)

    Selbach, J Michael; Schallenberg, Maurice; Kramer, Sebastian; Anastassiou, Gerasimos; Steuhl, Klaus-Peter; Vilser, Walthard; Kremmer, Stephan

    2014-01-01

    To determine the effects of surgical IOP reduction (trabeculectomy) on retinal blood flow parameters in glaucoma patients using Dynamic Vessel Analysis (DVA). 26 eyes of 26 patients with progressive primary open-angle glaucoma (POAG) despite maximal topical therapy were examined before and after trabeculectomy. The responses of the retinal vessels to flickering-light provocation were measured with DVA the day before surgery and 4 to 6 weeks after trabeculectomy. Between 3 and 4 weeks before surgery, all local therapies were stopped and systemic therapy with acetazolamide and preservative-free topical steroid eye drops was started. In 19 patients (73%), an inadequate response to the flicker stimulation was measured preoperatively. In these patients, the maximum dilation of arteries and veins was significantly reduced compared to healthy eyes. In this group, the maximum dilation of the arteries following flicker provocation improved from 1.4% before to 3.8% following trabeculectomy (p<0.01). In retinal veins, this parameter increased from 3.1% to 4.6% (p<0.05). In the 7 patients whose arterial and venous reactions to flickering-light provocation did not differ preoperatively from healthy eyes, there was no significant change after surgery. The initial baseline values of arteries and veins (MU) did not deviate significantly in either group. POAG patients with progressive disease and impaired vascular regulation benefit from IOP-lowering trabeculectomy in terms of vascular reactivity and dilative reserve, indicating a possible improvement of retinal perfusion following effective IOP control. Future studies with long-term follow-up must determine the clinical importance of these findings for the treatment of glaucoma patients.

  1. FDG uptake heterogeneity evaluated by fractal analysis improves the differential diagnosis of pulmonary nodules

    International Nuclear Information System (INIS)

    Miwa, Kenta; Inubushi, Masayuki; Wagatsuma, Kei; Nagao, Michinobu; Murata, Taisuke; Koyama, Masamichi; Koizumi, Mitsuru; Sasaki, Masayuki

    2014-01-01

    Purpose: The present study aimed to determine whether fractal analysis of morphological complexity and intratumoral heterogeneity of FDG uptake can help to differentiate malignant from benign pulmonary nodules. Materials and methods: We retrospectively analyzed data from 54 patients with suspected non-small cell lung cancer (NSCLC) who were examined by FDG PET/CT. Pathological assessments of biopsy specimens confirmed 35 and 19 nodules as NSCLC and inflammatory lesions, respectively. The morphological fractal dimension (m-FD), maximum standardized uptake value (SUVmax) and density fractal dimension (d-FD) of target nodules were calculated from CT and PET images. Fractal dimension is a quantitative index of morphological complexity and tracer uptake heterogeneity; higher values indicate increased complexity and heterogeneity. Results: The m-FD, SUVmax and d-FD significantly differed between malignant and benign pulmonary nodules (p < 0.05). Although the diagnostic ability was better for d-FD than m-FD and SUVmax, the difference did not reach statistical significance. Tumor size correlated significantly with SUVmax (r = 0.51, p < 0.05), but not with either m-FD or d-FD. Furthermore, m-FD combined with either SUVmax or d-FD improved diagnostic accuracy to 92.6% and 94.4%, respectively. Conclusion: The d-FD of intratumoral heterogeneity of FDG uptake can help to differentially diagnose malignant and benign pulmonary nodules. The SUVmax and d-FD obtained from FDG-PET images provide different types of information that are equally useful for differential diagnoses. Furthermore, the morphological complexity determined by CT combined with heterogeneous FDG uptake determined by PET improved diagnostic accuracy
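    Fractal dimensions of the kind this record uses (m-FD, d-FD) are commonly estimated by box counting: cover the structure with boxes of side s and fit the slope of log N(s) against log(1/s). The sketch below shows the generic technique on a binary image; it is not the paper's exact implementation, and the test image is illustrative.

```python
# Generic box-counting estimate of a fractal dimension (illustrative only).
import math

def box_counting_dimension(mask, sizes=(1, 2, 4, 8)):
    """mask: 2-D list of bools; returns the slope of log N(s) vs log(1/s)."""
    xs, ys = [], []
    h, w = len(mask), len(mask[0])
    for s in sizes:
        # count boxes of side s containing at least one occupied pixel
        n = sum(
            any(mask[y][x]
                for y in range(i, min(i + s, h))
                for x in range(j, min(j + s, w)))
            for i in range(0, h, s)
            for j in range(0, w, s)
        )
        xs.append(math.log(1.0 / s))
        ys.append(math.log(n))
    # least-squares slope of log N(s) against log(1/s)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# A completely filled square has dimension 2:
filled = [[True] * 32 for _ in range(32)]
print(round(box_counting_dimension(filled), 2))  # 2.0
```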

  2. Improving distillation method and device of tritiated water analysis for ultra high decontamination efficiency.

    Science.gov (United States)

    Fang, Hsin-Fa; Wang, Chu-Fang; Lin, Chien-Kung

    2015-12-01

    Monitoring environmental tritiated water is important for understanding contamination dispersion from nuclear facilities. Tritium is a pure beta radionuclide which is usually measured by Liquid Scintillation Counting (LSC). The average energy of the tritium beta is only 5.658 keV, which makes LSC counting of tritium easily interfered with by betas emitted by other radionuclides. Environmental tritiated water samples therefore usually need to be decontaminated by distillation to reduce the interference. After the Fukushima Nuclear Accident, the highest gross beta concentration of groundwater samples obtained around the Fukushima Daiichi Nuclear Power Station was over 1,000,000 Bq/l. There is thus a need for a distillation method with ultra-high decontamination efficiency for environmental tritiated water analysis. This study improved the heating temperature control for better sub-boiling distillation and modified the height of the container of the air-cooling distillation device for a better fractional distillation effect. The decontamination factor (DF) for Cs-137 of the distillation may reach 450,000, which is far better than in the prior study. The average loss rate of the improved method and device is about 2.6%, which is better than the bias value listed in ASTM D4107-08. The modified air-cooling distillation device is shown to provide an easy-to-handle, water-saving, low-cost and effective way of purifying water samples for beta-contaminated samples that need ultra-high decontamination treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
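    As context for BOD-style measurements of the kind discussed two records above and decay-corrected counting in general, note that this record's headline numbers are simple ratios; for example, the standard first-order BOD model BOD_t = BOD_u·(1 − e^(−k·t)) reproduces the "more than 80% of ultimate BOD by day 7" behaviour reported in record 12. The rate constant below is an assumed textbook-style value, not one reported by either study.

```python
# First-order BOD kinetics: BOD_t = BOD_u * (1 - exp(-k * t)).
# k = 0.23 per day is an assumed illustrative rate constant.
import math

def bod_fraction(t_days, k=0.23):
    """Fraction of the ultimate BOD exerted after t days (k in 1/day)."""
    return 1.0 - math.exp(-k * t_days)

print(round(bod_fraction(7), 2))  # roughly 0.8: ~80% of ultimate BOD by day 7
```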

  3. [Analysis of interventions designed to improve clinical supervision of student nurses in Benin].

    Science.gov (United States)

    Otti, André; Pirson, Magali; Piette, Danielle; Coppieters T Wallant, Yves

    2017-12-05

    The absence of an explicit and coherent conception of the articulation between theory and practice in the reform of nursing training in Benin has resulted in poor-quality clinical supervision of student nurses. The objective of this article is to analyze two interventions designed to improve the quality of supervision. A student welcome booklet developed by means of a consultative and provocative participatory approach was tested with twelve student nurses versus a control group. Content analysis of the data collected by individual semi-directed interviews and during two focus groups demonstrated the value of this tool. Student nurses were also taught to use training diaries inspired by the "experiential learning" approach. Training diaries were analysed using a grid based on the descriptive elements of the five types of training diaries of Scheepers (2008). According to the student nurses, the welcome booklet provided them with structured information to be used as a reference during their training and a better understanding of their teachers, and allowed them to situate the resources of the training course with a lower level of stress. Fifty-eight per cent of the training diaries were mosaics, reflecting the reflective practice and self-regulated learning of student nurses. This activity also promoted metacognitive dialogue with their supervisors. The student welcome booklet appeared to facilitate integration of student nurses into the clinical setting and promoted professional and organizational socialization. The training diary improved the quality of clinical learning by repeated reflective observation of student nurses and helped to maintain permanent communication with the supervisors.

  4. Improving the position resolution of highly segmented HPGe detectors using pulse shape analysis methods

    International Nuclear Information System (INIS)

    Descovich, Martina

    2002-01-01

    This work presents an approach for determining the interaction position of γ rays in highly segmented HPGe detectors. A precise knowledge of the interaction position enables the effective granularity of the detector to be substantially improved and a calibration of the detector response as a function of position to be performed. An improved position resolution is fundamental for the development of arrays of γ ray tracking detectors. The performance of a highly segmented germanium detector (TIGRE) has been characterised. TIGRE consists of a large volume coaxial high-purity n-type germanium crystal with a 24-fold segmented outer contact. Due to its high granularity and its fast electronics, TIGRE represents a unique example of a tracking detector, having low noise output signals, fast rise time and good energy resolution. In order to calibrate the response of the detector as a function of the interaction position, a dedicated scanning apparatus has been developed and the front surface of the detector has been scanned. The method developed for position determination is based on the digital analysis of the preamplifier signal, whose features are position dependent. A two-dimensional position resolution is accomplished by combining the radial position information, contained in the rise time of the pulse shape leading edge, with the azimuthal position information, carried by the magnitude of the transient charge signals induced in the spectator segments. Utilising this method, a position resolution of 0.6 mm, both radially and along the azimuthal direction, can be achieved in the most sensitive part of the detector. (author)

  5. Spatial analysis of ecosystem service relationships to improve targeting of payments for hydrological services.

    Science.gov (United States)

    Mokondoko, Pierre; Manson, Robert H; Ricketts, Taylor H; Geissert, Daniel

    2018-01-01

    Payments for hydrological services (PHS) are popular tools for conserving ecosystems and their water-related services. However, improving the spatial targeting and impacts of PHS, as well as their ability to foster synergies with other ecosystem services (ES), remains challenging. We used spatial analyses to evaluate the targeting performance of Mexico's National PHS program in central Veracruz, quantifying how effectively areas targeted for PHS covered areas of high HS provision and social priority during 2003-2013. First, we quantified provisioning and spatial distributions of two target ES (water yield and soil retention) and one non-target ES (carbon storage) using InVEST. Subsequently, pairwise relationships among ES were quantified using spatial correlation and overlap analyses. Finally, we evaluated targeting by: (i) prioritizing areas of individual and overlapping ES; (ii) quantifying spatial co-occurrences of these priority areas with those targeted by PHS; (iii) evaluating the extent to which PHS directly contribute to HS delivery; and (iv) testing whether PHS-targeted areas disproportionately covered areas with high ecological and social priority. We found that modelled priority areas exhibited non-random distributions and distinct spatial patterns. Our results show significant pairwise correlations between all ES, suggesting synergistic relationships. However, our analysis showed a significantly lower overlap than expected and thus significant mismatches between PHS-targeted areas and all types of priority areas. These findings suggest that the targeting of areas with high HS provisioning and social priority by Mexico's PHS program could be improved significantly.
This study underscores: (1) the importance of using maps of HS provisioning as main targeting criteria in PHS design to channel payments towards areas that require future conservation, and (2) the need for future research that helps balance ecological and socioeconomic

  6. Identification of quality improvement areas in pediatric MRI from analysis of patient safety reports

    International Nuclear Information System (INIS)

    Jaimes, Camilo; Murcia, Diana J.; Miguel, Karen; DeFuria, Cathryn; Sagar, Pallavi; Gee, Michael S.

    2018-01-01

    Analysis of safety reports has been utilized to guide practice improvement efforts in adult magnetic resonance imaging (MRI). Data specific to pediatric MRI could help target areas of improvement in this population. To estimate the incidence of safety reports in pediatric MRI and to determine associated risk factors. In a retrospective HIPAA-compliant, institutional review board-approved study, a single-institution Radiology Information System was queried to identify MRI studies performed in pediatric patients (0-18 years old) from 1/1/2010 to 12/31/2015. The safety report database was queried for events matching the same demographic and dates. Data on patient age, gender, location (inpatient, outpatient, emergency room [ER]), and the use of sedation/general anesthesia were recorded. Safety reports were grouped into categories based on the cause and their severity. Descriptive statistics were used to summarize continuous variables. Chi-square analyses were performed for univariate determination of statistical significance of variables associated with safety report rates. A multivariate logistic regression was used to control for possible confounding effects. A total of 16,749 pediatric MRI studies and 88 safety reports were analyzed, yielding a rate of 0.52%. There were significant differences in the rate of safety reports between patients younger than 6 years (0.89%) and those older (0.41%) (P<0.01), sedated (0.8%) and awake children (0.45%) (P<0.01), and inpatients (1.1%) and outpatients (0.4%) (P<0.01). The use of sedation/general anesthesia is an independent risk factor for a safety report (P=0.02). The most common causes for safety reports were service coordination (34%), drug reactions (19%), and diagnostic test and ordering errors (11%). The overall rate of safety reports in pediatric MRI is 0.52%. Interventions should focus on vulnerable populations, such as younger patients, those requiring sedation, and those in need of acute medical attention. (orig.)
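
The univariate comparisons above (e.g. 0.89% vs. 0.41% of exams generating a safety report by age group) are chi-square tests on 2x2 tables. A minimal stdlib sketch with hypothetical cell counts consistent with the reported rates (the raw counts are not given in the abstract; for 1 degree of freedom the p-value can be computed as erfc(sqrt(chi2/2)) without SciPy):

```python
import math

def chi2_2x2(a, b, c, d, yates=True):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]] with optional
    Yates continuity correction. Returns (chi2, p), using the 1-dof
    survival function P(X > x) = erfc(sqrt(x/2))."""
    n = a + b + c + d
    row1, row2, col1, col2 = a + b, c + d, a + c, b + d
    chi2 = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2),
                        (c, row2, col1), (d, row2, col2)):
        exp = r * col / n
        dev = abs(obs - exp) - (0.5 if yates else 0.0)
        chi2 += dev * dev / exp
    return chi2, math.erfc(math.sqrt(chi2 / 2.0))

# Hypothetical counts: ~0.89% of ~4,500 exams under age 6 vs. ~0.41% of
# ~12,200 exams age 6+ (reports, uneventful exams per row).
chi2, p = chi2_2x2(40, 4460, 50, 12150)
print(p < 0.01)  # the age difference is significant, as in the paper
```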

  7. Improving the Efficiency and Ease of Healthcare Analysis Through Use of Data Visualization Dashboards.

    Science.gov (United States)

    Stadler, Jennifer G; Donlon, Kipp; Siewert, Jordan D; Franken, Tessa; Lewis, Nathaniel E

    2016-06-01

    The digitization of a patient's health record has profoundly impacted medicine and healthcare. The compilation and accessibility of medical history has provided clinicians an unprecedented, holistic account of a patient's conditions, procedures, medications, family history, and social situation. In addition to the bedside benefits, this level of information has opened the door for population-level monitoring and research, the results of which can be used to guide initiatives that are aimed at improving quality of care. Cerner Corporation partners with health systems to help guide population management and quality improvement projects. With such an enormous and diverse client base, varying in geography, size, organizational structure, and analytic needs, discerning meaning in the data and how they fit with that particular hospital's goals is a slow, difficult task that requires clinical, statistical, and technical literacy. This article describes the development of dashboards for efficient data visualization at the healthcare facility level. Focusing on two areas with broad clinical importance, sepsis patient outcomes and 30-day hospital readmissions, dashboards were developed with the goal of aggregating data and providing meaningful summary statistics, highlighting critical performance metrics, and providing easily digestible visuals that can be understood by a wide range of personnel with varying levels of skill and areas of expertise. These internal-use dashboards have allowed associates in multiple roles to perform a quick and thorough assessment on a hospital of interest by providing the data to answer necessary questions and to identify important trends or opportunities. This automation of a previously manual process has greatly increased efficiency, saving hours of work time per hospital analyzed. Additionally, the dashboards have standardized the analysis process, ensuring use of the same metrics and processes so that overall themes can be compared across

  8. Analysis of Stakeholder's Behaviours for an Improved Management of an Agricultural Coastal Region in Oman

    Science.gov (United States)

    Al Khatri, Ayisha; Grundmann, Jens; van der Weth, Rüdiger; Schütze, Niels

    2015-04-01

    differences exist between groups on how to achieve this improvement, since farmers prefer management interventions operating more on the water resources side, while decision makers support measures for better management on the water demand side. Furthermore, the opinions within single groups are sometimes contradictory for several management interventions. The use of more advanced statistical methods like discriminant analysis or Bayesian networks allows for identifying factors and drivers to explain these differences. Both approaches will help to understand stakeholders' behaviours and to evaluate the implementation potential of several management interventions. Keywords: IWRM, stakeholder participation, field survey, statistical analysis, Oman

  9. Paediatric ED BiPAP continuous quality improvement programme with patient analysis: 2005-2013.

    Science.gov (United States)

    Abramo, Thomas; Williams, Abby; Mushtaq, Samaiya; Meredith, Mark; Sepaule, Rawle; Crossman, Kristen; Burney Jones, Cheryl; Godbold, Suzanne; Hu, Zhuopei; Nick, Todd

    2017-01-16

    In paediatric moderate-to-severe asthmatics, there is significant bronchospasm, airway obstruction and air trapping, causing severe hyperinflation with more positive intrapleural pressure preventing passive air movement. These effects cause an increased respiratory rate (RR), less airflow and shortened inspiratory breath time. In certain asthmatics, aerosols are ineffective due to their inadequate ventilation. Bilevel positive airway pressure (BiPAP) in acute paediatric asthmatics can be an effective treatment. BiPAP works by unloading fatigued inspiratory muscles, a direct bronchodilation effect, offsetting intrinsic PEEP and recruiting collapsed alveoli, which reduces the patient's work of breathing and achieves total lung capacity quicker. Unfortunately, paediatric emergency department (PED) BiPAP is underused and quality analysis is non-existent. A PED BiPAP Continuous Quality Improvement Program (CQIP) from 2005 to 2013 was evaluated using descriptive analytics for the primary outcomes of usage, safety, BiPAP settings, therapeutics and patient disposition. PED BiPAP CQIP descriptive analytics. Academic PED. 1157 patients. A PED BiPAP CQIP from 2005 to 2013 for the usage, safety, BiPAP settings, therapeutic response parameters and patient disposition was evaluated using descriptive analytics. Safety, usage, compliance, therapeutic response parameters, BiPAP settings and patient disposition. 1157 patients had excellent compliance without complications. Only 6 (0.5%) BiPAP patients were intubated. Median BiPAP settings: IPAP 18 (16, 20) cm H2O, range 12-28; EPAP 8 (8, 8) cm H2O, range 6-10; inspiratory-to-expiratory time (I:E) ratio 1.75 (1.5, 1.75). Pediatric Asthma Severity score and RR decreased significantly, with improved therapeutic times, very low intubation rates and decreased PICU admissions. CQIP analysis demonstrated that using a higher IPAP and low EPAP with a longer I:E optimises the patient's BiPAP settings and showed a significant improvement in PAS, RR and tidal volume.

  10. Do quality improvement initiatives for diabetes care address social inequities? Secondary analysis of a systematic review.

    Science.gov (United States)

    Lu, Jacquie Boyang; Danko, Kristin J; Elfassy, Michael D; Welch, Vivian; Grimshaw, Jeremy M; Ivers, Noah M

    2018-02-14

    Socially disadvantaged populations carry a disproportionate burden of diabetes-related morbidity and mortality. There is an emerging interest in quality improvement (QI) strategies in the care of patients with diabetes, however, the effect of these interventions on disadvantaged groups remains unclear. This is a secondary analysis of a systematic review that seeks to examine the extent of equity considerations in diabetes QI studies, specifically quantifying the proportion of studies that target interventions toward disadvantaged populations and conduct analyses on the impact of interventions on disadvantaged groups. Studies were identified using Medline, HealthStar and the Cochrane Effective Practice and Organisation of Care database. Randomised controlled trials assessing 12 QI strategies targeting health systems, healthcare professionals and/or patients for the management of adult outpatients with diabetes were eligible. The place of residence, race/ethnicity/culture/language, occupational status, gender/sexual identity, religious affiliations, education level, socioeconomic status, social capital, plus age, disability, sexual preferences and relationships (PROGRESS-Plus) framework was used to identify trials that focused on disadvantaged patient populations, to examine the types of equity-relevant factors that are being considered and to explore temporal trends in equity-relevant diabetes QI trials. Of the 278 trials that met the inclusion criteria, 95 trials had equity-relevant considerations. These include 64 targeted trials that focused on a disadvantaged population with the aim to improve the health status of that population and 31 general trials that undertook subgroup analyses to assess the extent to which their interventions may have had differential impacts on disadvantaged subgroups. Trials predominantly focused on race/ethnicity, socioeconomic status and place of residence as potential factors for disadvantage in patients receiving diabetes care. 

  11. Improved methodologies for continuous-flow analysis of stable water isotopes in ice cores

    Science.gov (United States)

    Jones, Tyler R.; White, James W. C.; Steig, Eric J.; Vaughn, Bruce H.; Morris, Valerie; Gkinis, Vasileios; Markle, Bradley R.; Schoenemann, Spruce W.

    2017-02-01

    Water isotopes in ice cores are used as a climate proxy for local temperature and regional atmospheric circulation as well as evaporative conditions in moisture source regions. Traditional measurements of water isotopes have been achieved using magnetic sector isotope ratio mass spectrometry (IRMS). However, a number of recent studies have shown that laser absorption spectrometry (LAS) performs as well as or better than IRMS. The new LAS technology has been combined with continuous-flow analysis (CFA) to improve data density and sample throughput in numerous prior ice coring projects. Here, we present a comparable semi-automated LAS-CFA system for measuring high-resolution water isotopes of ice cores. We outline new methods for partitioning both system precision and mixing length into liquid and vapor components - useful measures for defining and improving the overall performance of the system. Critically, these methods take into account the uncertainty of depth registration that is not present in IRMS nor fully accounted for in other CFA studies. These analyses are achieved using samples from a South Pole firn core, a Greenland ice core, and the West Antarctic Ice Sheet (WAIS) Divide ice core. The measurement system utilizes a 16-position carousel contained in a freezer to consecutively deliver ~1 m × 1.3 cm² ice sticks to a temperature-controlled melt head, where the ice is converted to a continuous liquid stream and eventually vaporized using a concentric nebulizer for isotopic analysis. An integrated delivery system for water isotope standards is used for calibration to the Vienna Standard Mean Ocean Water (VSMOW) scale, and depth registration is achieved using a precise overhead laser distance device with an uncertainty of ±0.2 mm. As an added check on the system, we perform inter-lab LAS comparisons using WAIS Divide ice samples, a corroboratory step not taken in prior CFA studies. The overall results are important for substantiating data obtained from LAS

  12. Improved Bean Varieties Make a difference in Western Kenya: An Impact Analysis

    International Nuclear Information System (INIS)

    Odendo, M.

    2002-01-01

    A survey was conducted in Kakamega and Vihiga Districts of Western Kenya during the 2001 cropping seasons to analyze the impacts of root-rot-resistant bean varieties introduced by KARI and CIAT between 1989 and 1998. The objectives of the study were to: assess adoption of the introduced root-rot-resistant bush and climbing bean varieties; determine technological and socio-economic factors that enhance or lessen adoption and attainment of the impacts; and examine the types and magnitude of the impacts of adoption of the improved bean varieties at farm level. To form a composite picture of the spread and impact of the varieties, a systematic approach was developed to collect and combine data from several sources, mainly literature reviews and household surveys. Data were collected through interviews of 233 randomly selected households, using a structured questionnaire. Descriptive statistics and a Tobit model were used to analyse the data. The study showed that although several improved bean varieties were disseminated, most farmers did not adopt all the varieties; 35-80% of the farmers had mainly adopted three bush beans, KK22, KK15 and KK8. A relatively small proportion of farmers (8-19%) adopted climbers. Disease (root rot) resistance was only one of the criteria, and not the most important one, that farmers used in selecting the varieties they adopted. Results of the Tobit analysis showed that farmers' perceptions of the traits of the bean varieties as well as farm and farmer characteristics were important determinants of decisions to adopt and of the intensity of adoption. The main impact of the new beans was their contribution to equity by improving food self-sufficiency for about 97-99% of the farmers. Most farmers used their bean harvest mainly for subsistence; only a few farmers sold beans in any given season.
The 63% of households that sold beans earned cash income to purchase other types of food as well as to invest in capital assets, especially household

  13. Experimental study and mechanism analysis of modified limestone by red mud for improving desulfurization

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongtao; Han, Kuihua; Niu, Shengli; Lu, Chunmei; Liu, Mengqi; Li, Hui [Shandong Univ., Jinan (China). School of Energy and Power Engineering

    2013-07-01

    Red mud is a type of solid waste generated during alumina production from bauxite, and how to dispose of and utilize red mud on a large scale remains an open question. This paper attempts to use red mud as an additive to modify limestone. The enhancement of the sulfation reaction of limestone by red mud (two kinds of Bayer-process red mud and one kind of sintering-process red mud) is studied in a tube furnace reactor. The calcination and sulfation process and kinetics are investigated in a thermogravimetric (TG) analyzer. The results show that red mud can effectively improve the desulfurization performance of limestone over the whole temperature range (1,073-1,373 K). The sulfur capacity of limestone (the quantity of SO2 that can be retained by 100 mg of limestone) can be increased by 25.73, 7.17 and 15.31%, while the utilization of calcium can be increased from 39.68 to 64.13, 60.61 and 61.16% after modification by the three kinds of red mud, at a calcium/metallic-element ratio of 15 (metallic elements here being all those that can catalyse the sulfation process, including Na, K, Fe and Ti) and a temperature of 1,173 K. The structure of limestone modified by red mud is interlaced and tridimensional, which is conducive to the sulfation reaction. XRD phase-composition analysis of modified limestone sulfated at high temperature shows correspondingly more sulphates for the silicate and aluminate complexes of calcium existing in the products. Temperature, calcium/metallic-element ratio and particle diameter are important factors in the sulfation reaction. Optimum results are obtained at a calcium/metallic-element ratio of 15. The calcination characteristic of limestone modified by red mud shifts toward lower temperatures.
The enhancement of sulfation by doping red mud is more pronounced once the product layer has been formed and consequently the promoting

  14. Linkage analysis and physical mapping near the gene for x-linked agammaglobulinemia at Xq22

    Energy Technology Data Exchange (ETDEWEB)

    Parolini, O.; Lassiter, G.L.; Henry, M.J.; Conley, M.E. (Univ. of Tennessee College of Medicine, Memphis (United States); St. Jude Children's Research Hospital, Memphis, TN (United States)); Hejtmancik, J.F. (National Inst. of Health, Bethesda, MD (United States)); Allen, R.C.; Belmont, J.W. (Baylor College of Medicine, Houston, TX (United States)); Barker, D.F. (Univ. of Utah, Salt Lake City (United States))

    1993-02-01

    The gene for X-linked agammaglobulinemia (XLA) has been mapped to Xq22. No recombinations have been reported between the gene and the probe p212 at DXS178; however, this probe is informative in only 30-40% of women, and the reported flanking markers, DXS3 and DXS94, are 10-15 cM apart. To identify additional probes that might be useful in genetic counseling, we examined 11 polymorphisms that have been mapped to the Xq21.3-q22 region in 13 families with XLA. In addition, pulsed-field gel electrophoresis and yeast artificial chromosomes (YACs) were used to further characterize the segment of DNA within which the gene for XLA must lie. The results demonstrated that DXS366 and DXS442, which share a 430-kb pulsed-field fragment, could replace DXS3 as proximal flanking markers. Probes at DXS178 and DXS265 identified the same 145-kb pulsed-field fragment, and both loci were contained within a 200-kb YAC identified with the probe p212. A highly polymorphic CA repeat (DXS178CA) was isolated from one end of this YAC and used in linkage analysis. Probes at DXS101 and DXS328 shared several pulsed-field fragments, the smallest of which was 250 kb. No recombinations were seen between XLA and the DXS178-DXS265-DXS178CA complex, DXS101, DXS328, DXS87, or the gene for proteolipid protein (PLP). Key crossovers, when combined with the linkage data from families with Alport syndrome, suggested the following order of loci: cen-DXS3-DXS366-DXS442-(PLP, DXS101, DXS328, DXS178-DXS265-DXS178CA complex, XLA)-(DXS87, DXS94)-DXS327-(DXS350, DXS362)-tel. Our studies also limit the segment of DNA within which the XLA gene must lie to the 3- to 4-cM distance between DXS442 and DXS94, and they identify and orient polymorphisms that can be used in genetic counseling not only for XLA but also for Pelizaeus-Merzbacher disease (PLP deficiency), Alport syndrome (COL4A5 deficiency), and Fabry disease (α-galactosidase A deficiency). 31 refs., 5 figs., 2 tabs.

  15. Genre Analysis and Writing Skill: Improving Iranian EFL Learners Writing Performance through the Tenets of Genre Analysis

    Directory of Open Access Journals (Sweden)

    Nazanin Naderi Kalali

    2015-12-01

    The main thrust of this study was to determine whether genre-based instruction improves the writing proficiency of Iranian EFL learners. To this end, 30 homogeneous Iranian BA learners studying English at Islamic Azad University, Bandar Abbas Branch, were selected as the participants of the study through a version of the TOEFL test as the proficiency test. The selected participants were 15 females and 15 males, who were randomly divided into two groups, experimental and control. Both the experimental and control groups were asked to write on a topic determined by the researcher, which was considered the pre-test. The students' writing was scored using a holistic scoring procedure. The subjects then received sixteen hours of instruction, the experimental group using a genre-based pedagogy and the control group the traditional methodology, followed by a post-test in which the subjects were asked to write on the same topic on which they had written before instruction. Their post-test writings were also scored through the holistic scoring procedure. In analyzing the data, the t-test statistic was utilized for comparing the performances of the two groups. It was found that there is a statistically significant difference between the writing ability of participants who undergo genre-based instruction and those who do not. The study, however, did not find any significant role for gender. Keywords: genre analysis, writing skill, holistic scoring procedure, pre-test, post-test, t-test
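
The group comparison described above is an independent-samples t-test on holistic writing scores. A stdlib sketch with hypothetical scores for the two groups of 15 (the study's actual scores are not reported in the abstract):

```python
import math
from statistics import mean, stdev

def independent_t(sample1, sample2):
    """Pooled-variance two-sample t statistic."""
    n1, n2 = len(sample1), len(sample2)
    sp2 = ((n1 - 1) * stdev(sample1) ** 2 +
           (n2 - 1) * stdev(sample2) ** 2) / (n1 + n2 - 2)
    return (mean(sample1) - mean(sample2)) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothetical post-test holistic scores, 15 learners per group.
genre_group = [78, 82, 85, 74, 80, 88, 79, 83, 77, 86, 81, 84, 76, 90, 82]
control     = [70, 75, 68, 72, 74, 69, 77, 71, 73, 66, 76, 70, 72, 68, 74]
t = independent_t(genre_group, control)
# Compare against the two-tailed critical value for df = 28, alpha = .05.
print(t > 2.048)
```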

  16. Improving Crop Yield and Nutrient Use Efficiency via Biofertilization—A Global Meta-analysis

    Science.gov (United States)

    Schütz, Lukas; Gattinger, Andreas; Meier, Matthias; Müller, Adrian; Boller, Thomas; Mäder, Paul; Mathimaran, Natarajan

    2018-01-01

    The application of microbial inoculants (biofertilizers) is a promising technology for future sustainable farming systems in view of rapidly decreasing phosphorus stocks and the need to more efficiently use available nitrogen (N). Various microbial taxa are currently used as biofertilizers, based on their capacity to access nutrients from fertilizers and soil stocks, to fix atmospheric nitrogen, to improve water uptake or to act as biocontrol agents. Despite the existence of a considerable knowledge on effects of specific taxa of biofertilizers, a comprehensive quantitative assessment of the performance of biofertilizers with different traits such as phosphorus solubilization and N fixation applied to various crops at a global scale is missing. We conducted a meta-analysis to quantify benefits of biofertilizers in terms of yield increase, nitrogen and phosphorus use efficiency, based on 171 peer reviewed publications that met eligibility criteria. Major findings are: (i) the superiority of biofertilizer performance in dry climates over other climatic regions (yield response: dry climate +20.0 ± 1.7%, tropical climate +14.9 ± 1.2%, oceanic climate +10.0 ± 3.7%, continental climate +8.5 ± 2.4%); (ii) meta-regression analyses revealed that yield response due to biofertilizer application was generally small at low soil P levels; efficacy increased along higher soil P levels in the order arbuscular mycorrhizal fungi (AMF), P solubilizers, and N fixers; (iii) meta-regressions showed that the success of inoculation with AMF was greater at low organic matter content and at neutral pH. Our comprehensive analysis provides a basis and guidance for proper choice and application of biofertilizers. PMID:29375594

  18. Improved Ancestry Estimation for both Genotyping and Sequencing Data using Projection Procrustes Analysis and Genotype Imputation

    Science.gov (United States)

    Wang, Chaolong; Zhan, Xiaowei; Liang, Liming; Abecasis, Gonçalo R.; Lin, Xihong

    2015-01-01

    Accurate estimation of individual ancestry is important in genetic association studies, especially when a large number of samples are collected from multiple sources. However, existing approaches developed for genome-wide SNP data do not work well with modest amounts of genetic data, such as in targeted sequencing or exome chip genotyping experiments. We propose a statistical framework to estimate individual ancestry in a principal component ancestry map generated by a reference set of individuals. This framework extends and improves upon our previous method for estimating ancestry using low-coverage sequence reads (LASER 1.0) to analyze either genotyping or sequencing data. In particular, we introduce a projection Procrustes analysis approach that uses high-dimensional principal components to estimate ancestry in a low-dimensional reference space. Using extensive simulations and empirical data examples, we show that our new method (LASER 2.0), combined with genotype imputation on the reference individuals, can substantially outperform LASER 1.0 in estimating fine-scale genetic ancestry. Specifically, LASER 2.0 can accurately estimate fine-scale ancestry within Europe using either exome chip genotypes or targeted sequencing data with off-target coverage as low as 0.05×. Under the framework of LASER 2.0, we can estimate individual ancestry in a shared reference space for samples assayed at different loci or by different techniques. Therefore, our ancestry estimation method will accelerate discovery in disease association studies not only by helping model ancestry within individual studies but also by facilitating combined analysis of genetic data from multiple sources. PMID:26027497
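
The core of the projection Procrustes step can be sketched as follows: given high-dimensional PC scores X (n x k) and reference coordinates Y (n x d, d < k), find the orthonormal map A minimizing ||XA - Y||, which is the SVD solution to the orthogonal Procrustes problem. This is a toy reconstruction under stated assumptions, not the LASER 2.0 implementation (which additionally handles centering, scaling, genotype imputation and the PCA of the reference panel):

```python
import numpy as np

def projection_procrustes(X, Y):
    """Orthonormal map A (k x d) minimizing ||X @ A - Y||_F,
    obtained from the SVD of X^T Y."""
    U, _, Vt = np.linalg.svd(X.T @ Y, full_matrices=False)
    return U @ Vt

# Toy check: embed 2-D reference coordinates in 5-D via a random
# orthonormal matrix plus noise, then recover them.
rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 2))                 # reference ancestry space
Q, _ = np.linalg.qr(rng.normal(size=(5, 2)))  # hidden 5-D embedding
X = Y @ Q.T + 0.01 * rng.normal(size=(100, 5))
A = projection_procrustes(X, Y)
print(np.allclose(X @ A, Y, atol=0.1))
```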

  19. Rehabilitation Interventions for Improving Social Participation After Stroke: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Obembe, Adebimpe O; Eng, Janice J

    2016-05-01

    Despite the fact that social participation is considered a pivotal outcome of a successful recovery after stroke, there has been little attention on the impact of activities and services on this important domain. To present a systematic review and meta-analysis from randomized controlled trials (RCTs) on the effects of rehabilitation interventions on social participation after stroke. A total of 8 electronic databases were searched for relevant RCTs that evaluated the effects of an intervention on the outcome of social participation after stroke. Reference lists of selected articles were hand searched to identify further relevant studies. The methodological quality of the studies was assessed using the Physiotherapy Evidence Database Scale. Standardized mean differences (SMDs) and confidence intervals (CIs) were estimated using fixed- and random-effect models. In all, 24 RCTs involving 2042 stroke survivors were identified and reviewed, and 21 were included in the meta-analysis. There was a small beneficial effect of interventions that utilized exercise on social participation (10 studies; SMD = 0.43; 95% CI = 0.09, 0.78; P = .01) immediately after the program ended. Exercise in combination with other interventions (13 studies; SMD = 0.34; 95% CI = 0.10, 0.58; P = .006) also resulted in beneficial effects. No significant effect was observed for interventions that involved support services over 9 studies (SMD = 0.09 [95% CI = -0.04, 0.21]; I² = 0%; P = .16). The included studies provide evidence that rehabilitation interventions may be effective in improving social participation after stroke, especially if exercise is one of the components. © The Author(s) 2015.
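Pooled SMDs like those reported above come from inverse-variance weighting of per-study effect sizes. A minimal fixed-effect sketch (illustrative numbers only; a random-effects model would add a between-study variance term to each weight's denominator):

```python
def pool_fixed_effect(smds, variances):
    """Inverse-variance fixed-effect pooling of standardized mean differences.
    Returns the pooled SMD and its 95% confidence interval."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5  # standard error of the pooled effect
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

With three hypothetical studies of equal variance, the pooled estimate reduces to the simple mean, and the CI narrows as studies accumulate.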

  20. Gaining improved chemical composition by exploitation of Compton-to-Rayleigh intensity ratio in XRF analysis.

    Science.gov (United States)

    Hodoroaba, Vasile-Dan; Rackwitz, Vanessa

    2014-07-15

    The high specificity of both coherent (Rayleigh) and incoherent (Compton) X-ray scattering to the mean atomic number of a specimen analyzed by X-ray fluorescence (XRF) is exploited to gain more information on the chemical composition. Concretely, evaluating the Compton-to-Rayleigh intensity ratio from XRF spectra and relating it to the average atomic number of reference materials via a calibration curve can reveal valuable information on the elemental composition, complementary to that obtained from reference-free XRF analysis. Particularly for matrices of lower mean atomic number, the sensitivity of the approach is so high that specimens whose mean atomic numbers differ by only 0.1 can easily be distinguished. Hence, the content of light elements that are "invisible" to XRF, particularly hydrogen, or of heavier impurities/additives in light materials can be calculated "by difference" from the scattering calibration curve. The excellent agreement between such an experimental, empirical calibration curve and a synthetically generated one, based on a reliable physical model for X-ray scattering, is also demonstrated. Thus, the feasibility of the approach for given experimental conditions and particular analytical questions can be tested prior to experiments with reference materials. For the present work, a microfocus X-ray source attached to an SEM/EDX (scanning electron microscopy/energy dispersive X-ray spectroscopy) system was used, so that the Compton-to-Rayleigh intensity ratio could be acquired with EDX spectral data for improved analysis of the elemental composition.
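The approach hinges on a calibration curve mapping the measured Compton-to-Rayleigh ratio to the mean atomic number Z of reference materials. A hedged sketch with made-up, deliberately linear reference data (real calibration curves need not be linear and would be fit with an appropriate functional form):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical reference materials: known mean atomic number Z and the
# measured Compton-to-Rayleigh intensity ratio (illustrative values only;
# physically, the ratio falls as Z rises).
z_refs = [6.0, 8.0, 13.0, 20.0]
ratios = [7.6, 6.8, 4.8, 2.0]

a, b = fit_line(ratios, z_refs)

def mean_z(ratio):
    """Read the mean atomic number of an unknown off the calibration line."""
    return a * ratio + b
```

An unknown specimen's ratio is then converted to a mean Z, from which light-element content can be estimated "by difference" against the Z expected from the XRF-visible elements.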

  1. New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)

    Science.gov (United States)

    Dyrda, J.; Soppera, N.; Hill, I.; Bossant, M.; Gulliford, J.

    2017-09-01

    Following the release and initial testing period of the NEA's Nuclear Data Sensitivity Tool [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing, that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for neutron scattering angular distribution [2] and constrained Chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison of the broad range of results it is possible to obtain. The IRPhE database of reactor physics measurements is now also accessible within the tool in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite is now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data. The impact of such changes is verified through calculations which are compared to a `direct' measurement found by

  2. Opportunity for verbalization does not improve visual change detection performance: A state-trace analysis.

    Science.gov (United States)

    Sense, Florian; Morey, Candice C; Prince, Melissa; Heathcote, Andrew; Morey, Richard D

    2017-06-01

    Evidence suggests that there is a tendency to verbally recode visually-presented information, and that in some cases verbal recoding can boost memory performance. According to multi-component models of working memory, memory performance is increased because task-relevant information is simultaneously maintained in two codes. The possibility of dual encoding is problematic if the goal is to measure capacity for visual information exclusively. To counteract this possibility, articulatory suppression is frequently used with visual change detection tasks specifically to prevent verbalization of visual stimuli. But is this precaution always necessary? There is little reason to believe that concurrent articulation affects performance in typical visual change detection tasks, suggesting that verbal recoding might not be likely to occur in this paradigm, and if not, precautionary articulatory suppression would not always be necessary. We present evidence confirming that articulatory suppression has no discernible effect on performance in a typical visual change-detection task in which abstract patterns are briefly presented. A comprehensive analysis using both descriptive statistics and Bayesian state-trace analysis revealed no evidence for any complex relationship between articulatory suppression and performance that would be consistent with a verbal recoding explanation. Instead, the evidence favors the simpler explanation that verbal strategies were either not deployed in the task or, if they were, were not effective in improving performance, and thus have no influence on visual working memory as measured during visual change detection. We conclude that in visual change detection experiments in which abstract visual stimuli are briefly presented, precautionary articulatory suppression is unnecessary.

  3. Stable Isotope Labeling for Improved Comparative Analysis of RNA Digests by Mass Spectrometry

    Science.gov (United States)

    Paulines, Mellie June; Limbach, Patrick A.

    2017-03-01

    Even with the advent of high throughput methods to detect modified ribonucleic acids (RNAs), mass spectrometry remains a reliable method to detect, characterize, and place post-transcriptional modifications within an RNA sequence. Here we have developed a stable isotope labeling comparative analysis of RNA digests (SIL-CARD) approach, which improves upon the original 18O/16O labeling CARD method. Like the original, SIL-CARD allows sequence or modification information from a previously uncharacterized in vivo RNA sample to be obtained by direct comparison with a reference RNA, the sequence of which is known. This reference is in vitro transcribed using a 13C/15N isotopically enriched nucleoside triphosphate (NTP). The two RNAs are digested with an endonuclease, the specificity of which matches the labeled NTP used for transcription. As proof of concept, several transfer RNAs (tRNAs) were characterized by SIL-CARD, where labeled guanosine triphosphate was used for the reference in vitro transcription. RNase T1 digestion products from the in vitro transcript will be 15 Da higher in mass than the same digestion products from the in vivo tRNA that are unmodified, leading to a doublet in the mass spectrum. Singlets, rather than doublets, arise if a sequence variation or a post-transcriptional modification is present that results in a relative mass shift different from 15 Da. Moreover, the use of the in vitro synthesized tRNA transcript allows for quantitative measurement of RNA abundance. Overall, SIL-CARD simplifies data analysis and enhances quantitative RNA modification mapping by mass spectrometry.
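The doublet/singlet logic of SIL-CARD amounts to a mass-pairing step: each labelled reference digestion product should sit 15 Da above its unmodified in vivo counterpart, and reference peaks without a partner flag a modification or sequence variant. A hypothetical illustration (masses invented, matching tolerance arbitrary):

```python
def classify_products(in_vivo, reference, shift=15.0, tol=0.01):
    """Pair each 13C/15N-labelled reference product mass with an in vivo mass
    offset by `shift` Da. Matched pairs appear as doublets in the spectrum;
    unmatched reference peaks (singlets) suggest a modification or variant.
    Hypothetical sketch of the SIL-CARD pairing idea, not the published tool."""
    doublets, singlets = [], []
    for m_ref in reference:
        match = next((m for m in in_vivo if abs(m_ref - m - shift) <= tol), None)
        if match is None:
            singlets.append(m_ref)
        else:
            doublets.append((match, m_ref))
    return doublets, singlets
```

A modified in vivo product whose mass shift differs from 15 Da leaves its reference counterpart unpaired, which is exactly the signal used to localize modifications.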

  4. New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)

    Directory of Open Access Journals (Sweden)

    Dyrda J.

    2017-01-01

    Full Text Available Following the release and initial testing period of the NEA’s Nuclear Data Sensitivity Tool [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing, that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for neutron scattering angular distribution [2] and constrained Chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison of the broad range of results it is possible to obtain. The IRPhE database of reactor physics measurements is now also accessible within the tool in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite is now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data. The impact of such changes is verified through calculations which are compared to a

  5. Improved metabolites of pharmaceutical ingredient grade Ginkgo biloba and the correlated proteomics analysis.

    Science.gov (United States)

    Zheng, Wen; Li, Ximin; Zhang, Lin; Zhang, Yanzhen; Lu, Xiaoping; Tian, Jingkui

    2015-06-01

    Ginkgo biloba is an attractive and traditional medicinal plant, and has been widely used as a phytomedicine in the prevention and treatment of cardiovascular and cerebrovascular diseases. Flavonoids and terpene lactones are the major bioactive components of Ginkgo, whereas the ginkgolic acids (GAs) with strong allergenic properties are strictly controlled. In this study, we tested the content of flavonoids and GAs under ultraviolet-B (UV-B) treatment and performed comparative proteomic analyses to determine the differential proteins that occur upon UV-B radiation and that might play a crucial role in producing flavonoids and GAs. Our phytochemical analyses demonstrated that UV-B irradiation significantly increased the content of active flavonoids, and decreased the content of toxic GAs. We conducted comparative proteomic analysis of both whole-leaf and chloroplast proteins. In total, 27 differential proteins in the whole leaf and 43 differential proteins in the chloroplast were positively identified and functionally annotated. The proteomic data suggested that enhanced UV-B radiation exposure activated antioxidants and stress-responsive proteins as well as reduced the rate of photosynthesis. We demonstrate that UV-B irradiation pharmaceutically improved the metabolic ingredients of Ginkgo, particularly in terms of reducing GAs. With high UV absorption properties and antioxidant activities, the flavonoids were likely highly induced as protective molecules following UV-B irradiation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Enhancing e-waste estimates: improving data quality by multivariate Input-Output Analysis.

    Science.gov (United States)

    Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter

    2013-11-01

    Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data related to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input-Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
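The link between the three IOA pillars (sales, stock, lifespan) can be sketched as a discrete convolution: units discarded in year t are past sales weighted by the probability of reaching end of life after the elapsed time. A minimal sketch with invented numbers, not the authors' multivariate model:

```python
def ewaste_generated(sales, lifespan_pmf):
    """Estimate units discarded per year by convolving historical sales with a
    discrete lifespan distribution, where lifespan_pmf[k] is the probability
    that a unit is discarded k years after sale."""
    horizon = len(sales)
    waste = [0.0] * horizon
    for t, sold in enumerate(sales):
        for k, p in enumerate(lifespan_pmf):
            if t + k < horizon:
                waste[t + k] += sold * p
    return waste
```

Stock then follows by mass balance: units in use at year t equal cumulative sales minus cumulative waste, which is how the three data series constrain one another during consolidation.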

  7. An Improved Variable Structure Adaptive Filter Design and Analysis for Acoustic Echo Cancellation

    Directory of Open Access Journals (Sweden)

    A. Kar

    2015-04-01

    Full Text Available In this research, an advanced variable-structure adaptive Multiple Sub-Filters (MSF) based algorithm for single-channel Acoustic Echo Cancellation (AEC) is proposed and analyzed. This work suggests a new and improved way to find the optimum tap-length of the adaptive filter employed for AEC. The structure adaptation, supported by a tap-length-based weight update approach, helps the designed echo canceller maintain a trade-off between the Mean Square Error (MSE) and the time taken to attain the steady-state MSE. The work focuses on replacing the fixed-length sub-filters in existing MSF-based AEC algorithms, which brings refinements in terms of convergence, steady-state error and tracking over the single-long-filter, different-error and common-error algorithms. A dynamic structure selective coefficient update approach to reduce the structural and computational cost of the adaptive design is discussed in the context of the proposed algorithm. Simulated results reveal a comparative performance analysis of the proposed variable-structure multiple sub-filter designs and existing fixed-tap-length sub-filter based acoustic echo cancellers.

  8. Chicken Essence for Cognitive Function Improvement: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Siew Li Teoh

    2016-01-01

    Full Text Available Chicken essence (CE) is a popular traditional remedy in Asia, which is believed to improve cognitive functions, and CE manufacturers claim that these health benefits are supported by research studies. A systematic review was conducted to determine the cognitive-enhancing effects of CE. We systematically searched a number of databases for randomized controlled trials in which human subjects consumed CE and cognitive tests were involved. Cochrane's Risk of Bias (ROB) tool was used to assess the quality of trials, and meta-analysis was performed. Seven trials were included: six recruited healthy subjects and one recruited subjects with poorer cognitive function. One trial had unclear ROB while the rest had high ROB. For executive function tests, there was a significant difference favoring CE (pooled standardized mean difference (SMD) of −0.55 (−1.04, −0.06)) and another with no significant difference (pooled SMD of 0.70 (−0.001, 1.40)). For short-term memory tests, no significant difference was found (pooled SMD of 0.63 (−0.16, 1.42)). Currently, there is a lack of convincing evidence for a cognitive-enhancing effect of CE.

  9. Improved high order free vibration analysis of thick double curved sandwich panels with transversely flexible cores

    Directory of Open Access Journals (Sweden)

    K. Malekzadeh Fard

    Full Text Available This paper dealt with free vibration analysis of thick double curved composite sandwich panels with simply supported or fully clamped boundary conditions based on a new improved higher-order sandwich panel theory. The formulation used the first-order shear deformation theory for the composite face sheets and a polynomial description for the displacement field in the core layer, which was based on the displacement field of Frostig's second model. The fully dynamic effects of the core layer and face sheets were also considered in this study. Using Hamilton's principle, the governing equations were derived. Moreover, the effects of some important parameters, such as boundary conditions, thickness ratio of the core to panel, radii of curvature and composite lay-up sequences, were investigated on the free vibration response of the panel. The results were validated against those published in the literature and against FE results obtained with ABAQUS software. It was shown that thicker panels with a thicker core provided greater resistance to resonant vibrations. Also, increasing the core thickness generally decreased the fundamental natural frequency values significantly.

  10. The performance of one belt and one road exchange rate: Based on improved singular spectrum analysis

    Science.gov (United States)

    Lai, Lin; Guo, Kun

    2017-10-01

    China's "One Belt and One Road" strategy promotes foreign trade openness in the northwest, southwest and northeast, absorbs excess capacity and provides new support for economic growth. However, fluctuations in the RMB exchange rate against the currencies of the countries along the road are unstable, so the Chinese enterprises involved face high exchange-rate risk. Precise explanation and prediction of exchange rates has long been a challenging topic in international finance. This paper decomposed the One Belt One Road Exchange Rate Index (OBORR) and the RMB Effective Exchange Rate Index (CNYX) into a trend term, a market fluctuation term and a noise term using improved singular spectrum analysis (SSA). It turns out that the growth rate of OBORR is greater than that of CNYX in the long term, and there is a dynamic lead-lag structure in the medium term. In the short term, the fluctuation range and frequency of OBORR are greater than those of CNYX, which means there will be more exchange rate risks in One Belt and One Road countries.

  11. Improving energy productivity of sunflower production using data envelopment analysis (DEA) approach.

    Science.gov (United States)

    Avval, Seyed Hashem Mousavi; Rafiee, Shahin; Jafari, Ali; Mohammadi, Ali

    2011-08-15

    Efficient use of energy in agriculture is one of the conditions for sustainable agricultural production. This study applies the data envelopment analysis (DEA) approach to the data of 95 randomly selected farms to investigate the technical and scale efficiencies of farmers with respect to energy use for sunflower production in Golestan province, Iran. The study also helps to identify the wasteful usage and the optimum level of energy from different inputs. According to the results of DEA models, about 36% of farmers were found to be technically efficient and the mean efficiency of sunflower producers was found to be 0.87 and 0.96 under the constant and variable returns to scale assumptions respectively. The optimum energy requirement was calculated as 8448.3 MJ ha⁻¹; accordingly, a potential reduction of 10.8% (1020.3 MJ ha⁻¹) in total energy input could be achieved by raising the performance of farmers to the highest level. Applying a better machinery management technique and conservation tillage methods, application of fertilisers by performance monitoring and utilisation of alternative sources of energy such as compost and chopped residues may be the pathways for improving energy productivity and reducing the environmental footprint. Copyright © 2011 Society of Chemical Industry.

  12. A New Method for Improving the Discrimination Power and Weights Dispersion in the Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    S. Kordrostami

    2013-06-01

    Full Text Available The appropriate choice of input-output weights is necessary for a successful DEA model. Generally, if the number of DMUs, n, is less than the number of inputs and outputs, m+s, then many DMUs are classified as efficient and discrimination between DMUs is not possible. Besides, DEA models are free to choose the best weights. To resolve the problems that result from this freedom of weights, constraints are set on the input-output weights. Symmetric weight constraints are one kind of weight constraint. In this paper, we present a new model based on multi-criterion data envelopment analysis (MCDEA), developed to moderate the homogeneity of the weight distribution by using symmetric weight constraints. Consequently, we show that our suggested models improve the dispersal of unrealistic input-output weights and increase discrimination power. Finally, as an application of the new model, we use it to evaluate and rank selected hospitals in Guilan.

  13. Combination of principal component analysis and optical-flow motion compensation for improved cardiac MR thermometry.

    Science.gov (United States)

    Toupin, S; de Senneville, B Denis; Ozenne, V; Bour, P; Lepetit-Coiffe, M; Boissenin, M; Jais, P; Quesson, B

    2017-02-21

    The use of magnetic resonance (MR) thermometry for the monitoring of thermal ablation is rapidly expanding. However, this technique remains challenging for monitoring the treatment of cardiac arrhythmia by radiofrequency ablation, due to heart displacement from respiration and contraction. Recent studies have addressed this problem by compensating in-plane motion in real time with an optical-flow-based tracking technique. However, these algorithms are sensitive to local variations of signal intensity on magnitude images associated with tissue heating. In this study, an optical-flow algorithm was combined with a principal component analysis method to reduce the impact of such effects. The proposed method was integrated into a fully automatic cardiac MR thermometry pipeline, compatible with a future clinical workflow. It was evaluated on nine healthy volunteers under free-breathing conditions, on a phantom, and in vivo on the left ventricle of a sheep. The results showed that local intensity changes in magnitude images had a lower impact on motion estimation with the proposed method. Using this strategy, the temperature mapping accuracy was significantly improved.

  14. Improved machine learning method for analysis of gas phase chemistry of peptides

    Directory of Open Access Journals (Sweden)

    Ahn Natalie

    2008-12-01

    Full Text Available Background: Accurate peptide identification is important to high-throughput proteomics analyses that use mass spectrometry. Search programs compare fragmentation spectra (MS/MS) of peptides from complex digests with theoretically derived spectra from a database of protein sequences. Improved discrimination is achieved with theoretical spectra that are based on simulating gas phase chemistry of the peptides, but the limited understanding of those processes affects the accuracy of predictions from theoretical spectra. Results: We employed a robust data mining strategy using new feature annotation functions of MAE software, which revealed under-prediction of the frequency of occurrence in fragmentation of the second peptide bond. We applied methods of exploratory data analysis to pre-process the information in the MS/MS spectra, including data normalization and attribute selection, to reduce the attributes to a smaller, less correlated set for machine learning studies. We then compared our rule building machine learning program, DataSqueezer, with commonly used association rules and decision tree algorithms. All used machine learning algorithms produced similar results that were consistent with expected properties for a second gas phase mechanism at the second peptide bond. Conclusion: The results provide compelling evidence that we have identified underlying chemical properties in the data that suggest the existence of an additional gas phase mechanism for the second peptide bond. Thus, the methods described in this study provide a valuable approach for analyses of this kind in the future.

  15. Pulmonary complications after major abdominal surgery: National Surgical Quality Improvement Program analysis.

    Science.gov (United States)

    Yang, Chun Kevin; Teng, Annabelle; Lee, David Y; Rose, Keith

    2015-10-01

    Postoperative pulmonary complications (PPCs) after major abdominal surgery are common and associated with significant morbidity and high cost of care. The objective of this study was to identify the risk factors for PPCs after major abdominal surgery. The American College of Surgeons' National Surgical Quality Improvement Program database from 2005-2012 was queried for patients who underwent major abdominal surgery (esophagectomy, gastrectomy, pancreatectomy, enterectomy, hepatectomy, colectomy, and proctectomy). Predictors of PPCs were identified using multivariate logistic regression. Of 165,196 patients who underwent major abdominal surgery, 9595 (5.8%) suffered PPCs (pneumonia 3.2%, prolonged ventilator support ≥48 h 3.0%, and unplanned intubation 2.8%). On multivariate analysis, significant predictors of overall and individual PPCs included esophagectomy, advanced American Society of Anesthesiology Classification System, dependent functional status, prolonged operative time, age ≥80 y, severe chronic obstructive pulmonary disease, preoperative shock, ascites, and smoking. Obesity was not a risk factor. Female gender was overall protective for PPCs. PPCs after abdominal procedures are associated with a number of clinical variables. Esophageal operations and American Society of Anesthesiology Classification System were the strongest predictors. These results provide a framework for identifying patients at risk for developing pulmonary complications after major abdominal surgery. Copyright © 2015 Elsevier Inc. All rights reserved.
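Predictor screening of this kind is typically done with multivariate logistic regression: each covariate receives a coefficient whose sign and magnitude indicate its association with the outcome. A self-contained gradient-descent sketch on toy data (illustrative only; the study itself used standard statistical software on NSQIP data):

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain batch gradient-descent logistic regression. X is a list of
    feature vectors, y a list of 0/1 outcomes; returns weights and intercept."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            # Predicted probability via the logistic (sigmoid) function.
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi  # gradient of the cross-entropy loss w.r.t. z
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gwj / n for wj, gwj in zip(w, gw)]
        b -= lr * gb / n
    return w, b
```

A positive fitted weight on a covariate (e.g. a hypothetical "esophagectomy" indicator) corresponds to increased odds of the complication, which is how the predictors above would be read off the model.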

  16. Analysis of microbiota on abalone (Haliotis discus hannai) in South Korea for improved product management.

    Science.gov (United States)

    Lee, Min-Jung; Lee, Jin-Jae; Chung, Han Young; Choi, Sang Ho; Kim, Bong-Soo

    2016-10-03

    Abalone is a popular seafood in South Korea; however, because it contains various microorganisms, its ingestion can cause food poisoning. Therefore, analysis of the microbiota on abalone can improve understanding of outbreaks and causes of food poisoning and help to better manage seafood products. In this study, we collected a total of 40 abalones from four different regions in March and July, which are known as the maximum abalone production areas in Korea. The microbiota were analyzed using high-throughput sequencing, and bacterial loads on abalone were quantified by real-time PCR. Over 2700 species were detected in the samples, and Alpha- and Gammaproteobacteria were the predominant classes. The differences in microbiota among regions and at each sampling time were also investigated. Although Psychrobacter was the dominant genus detected on abalone in both March and July, the species compositions were different between the two sampling times. Five potential pathogens (Lactococcus garvieae, Yersinia kristensenii, Staphylococcus saprophyticus, Staphylococcus warneri, and Staphylococcus epidermidis) were detected among the abalone microbiota. In addition, we analyzed the influence of Vibrio parahaemolyticus infection on shifts in abalone microbiota during storage at different temperatures. Although the proportion of Vibrio increased over time in infected and non-infected abalone, the shifts of microbiota were more dynamic in infected abalone. These results can be used to better understand the potential of food poisoning caused by abalone consumption and manage abalone products according to the microbiota composition. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Vicus: Exploiting local structures to improve network-based analysis of biological data.

    Directory of Open Access Journals (Sweden)

    Bo Wang

    2017-10-01

    Full Text Available Biological networks entail important topological features and patterns critical to understanding interactions within complicated biological systems. Despite great progress in understanding their structure, much more can be done to improve our inference and network analysis. Spectral methods play a key role in many network-based applications. Fundamental to spectral methods is the Laplacian, a matrix that captures the global structure of the network. Unfortunately, the Laplacian does not take into account intricacies of the network's local structure and is sensitive to noise in the network. These two properties are fundamental to biological networks and cannot be ignored. We propose an alternative matrix, Vicus. The Vicus matrix captures the local neighborhood structure of the network and thus is more effective at modeling biological interactions. We demonstrate the advantages of Vicus in the context of spectral methods by extensive empirical benchmarking on tasks such as single cell dimensionality reduction, protein module discovery and ranking genes for cancer subtyping. Our experiments show that using Vicus, spectral methods result in more accurate and robust performance in all of these tasks.
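The Laplacian that Vicus is positioned against is the standard unnormalised graph Laplacian L = D - A (degree matrix minus adjacency matrix). A minimal construction, whose defining property is that the quadratic form x·Lx sums squared differences across edges:

```python
def laplacian(n, edges):
    """Build the unnormalised graph Laplacian L = D - A of an undirected
    graph with `n` nodes from an edge list of (i, j) pairs."""
    L = [[0.0] * n for _ in range(n)]
    for i, j in edges:
        L[i][j] -= 1.0  # off-diagonal: negative adjacency
        L[j][i] -= 1.0
        L[i][i] += 1.0  # diagonal: node degree
        L[j][j] += 1.0
    return L
```

Spectral methods then work with the eigenvectors of L (or a normalised variant); the abstract's point is that this matrix encodes only global structure, which motivates the locally-aware Vicus alternative.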

  18. Vicus: Exploiting local structures to improve network-based analysis of biological data.

    Science.gov (United States)

    Wang, Bo; Huang, Lin; Zhu, Yuke; Kundaje, Anshul; Batzoglou, Serafim; Goldenberg, Anna

    2017-10-01

    Biological networks entail important topological features and patterns critical to understanding interactions within complicated biological systems. Despite great progress in understanding their structure, much more can be done to improve inference and network analysis. Spectral methods play a key role in many network-based applications. Fundamental to spectral methods is the Laplacian, a matrix that captures the global structure of the network. Unfortunately, the Laplacian does not take into account intricacies of the network's local structure and is sensitive to noise in the network. These two properties are fundamental to biological networks and cannot be ignored. We propose an alternative matrix, Vicus. The Vicus matrix captures the local neighborhood structure of the network and thus is more effective at modeling biological interactions. We demonstrate the advantages of Vicus in the context of spectral methods by extensive empirical benchmarking on tasks such as single cell dimensionality reduction, protein module discovery and ranking genes for cancer subtyping. Our experiments show that using Vicus, spectral methods result in more accurate and robust performance in all of these tasks.

  19. Particle Morphology Analysis of Biomass Material Based on Improved Image Processing Method.

    Science.gov (United States)

    Lu, Zhaolin; Hu, Xiaojuan; Lu, Yao

    2017-01-01

    Particle morphology, including size and shape, is an important factor that significantly influences the physical and chemical properties of biomass material. Based on image processing technology, a method was developed to process sample images, measure particle dimensions, and analyse the particle size and shape distributions of knife-milled wheat straw, which had been preclassified into five nominal size groups using a mechanical sieving approach. Considering the great variation of particle size from the micrometer to the millimeter scale, the powders greater than 250 μm were photographed by a flatbed scanner without zoom function, and the others were photographed using a scanning electron microscope (SEM) with high image resolution. Actual imaging tests confirmed the excellent effect of the backscattered electron (BSE) imaging mode of the SEM. Particle aggregation is an important factor that affects the recognition accuracy of the image processing method. In sample preparation, the singulated arrangement and ultrasonic dispersion methods were used to separate powders into particles that were larger and smaller than the nominal size of 250 μm. In addition, an image segmentation algorithm based on particle geometrical information was proposed to recognise the finer clustered powders. Experimental results demonstrated that the improved image processing method was suitable for analysing the particle size and shape distributions of ground biomass materials and for resolving the size inconsistencies found in sieving analysis.

  20. Bootstrap analysis of designed experiments for reliability improvement with a non-constant scale parameter

    International Nuclear Information System (INIS)

    Wang, Guodong; He, Zhen; Xue, Li; Cui, Qingan; Lv, Shanshan; Zhou, Panpan

    2017-01-01

    Factors which significantly affect product reliability are of great interest to reliability practitioners. This paper proposes a bootstrap-based methodology for identifying significant factors when both the location and scale parameters of the smallest extreme value distribution vary over experimental factors. An industrial thermostat experiment is presented, analyzed, and discussed as an illustrative example. The analysis results show that 1) the misspecification of a constant scale parameter may lead to misidentifying spurious effects; 2) the important factors identified by different bootstrap methods (i.e., percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping) are different; 3) the number of factors significantly affecting the 10th percentile lifetime is smaller than the number of important factors identified at the 63.21st percentile. - Highlights: • Product reliability is improved by design of experiments when both the scale and location parameters of the smallest extreme value distribution vary with experimental factors. • A bootstrap-based methodology is proposed to identify important factors which significantly affect the 100pth percentile lifetime. • Bootstrap confidence intervals for experimental factors are obtained using three bootstrap methods (percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping). • The important factors identified by different bootstrap methods are different. • The number of factors significantly affecting the 10th percentile is smaller than the number of important factors identified at the 63.21st percentile.
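
    Percentile bootstrapping, the first of the three methods compared above, can be sketched in a few lines. The sketch below is a minimal, generic illustration (not the paper's regression-based procedure): the data values and the choice of the sample mean as the statistic are hypothetical.

```python
import random

def percentile_bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for an arbitrary statistic:
    resample with replacement, recompute the statistic, take quantiles."""
    rng = random.Random(seed)
    n = len(data)
    boot_stats = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    lo = boot_stats[int((alpha / 2) * n_boot)]
    hi = boot_stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical lifetime sample; statistic = sample mean
lifetimes = [102, 115, 98, 130, 120, 95, 110, 125, 105, 118]
lo, hi = percentile_bootstrap_ci(lifetimes, lambda xs: sum(xs) / len(xs))
```

    A factor would be flagged as significant when the bootstrap interval for its effect excludes zero; the bias-corrected variants adjust the quantile positions rather than the resampling itself.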

  1. Improving breast cancer classification with mammography, supported on an appropriate variable selection analysis

    Science.gov (United States)

    Pérez, Noel; Guevara, Miguel A.; Silva, Augusto

    2013-02-01

    This work addresses the issue of variable selection within the context of breast cancer classification with mammography. A comprehensive repository of feature vectors was used, including a hybrid subset gathering image-based and clinical features. It aimed to gather experimental evidence of variable selection in terms of cardinality and type, and to find a classification scheme that provides the best performance in terms of Area Under the Receiver Operating Characteristic Curve (AUC) scores using the ranked feature subsets. We evaluated and classified a total of 300 subsets of features formed by the application of Chi-Square Discretization, Information-Gain, One-Rule and RELIEF methods in association with Feed-Forward Backpropagation Neural Network (FFBP), Support Vector Machine (SVM) and Decision Tree J48 (DTJ48) Machine Learning Algorithms (MLA) for a comparative performance evaluation based on AUC scores. A variable selection analysis was performed for Single-View Ranking and Multi-View Ranking groups of features. Feature subsets representing Microcalcifications (MCs), Masses, and both MCs and Masses lesions achieved AUC scores of 0.91, 0.954 and 0.934 respectively. Experimental evidence demonstrated that classification performance was improved by combining image-based and clinical features. The most important clinical and image-based features were StromaDistortion and Circularity respectively. Other features, less important but worth using due to their consistency, were Contrast, Perimeter, Microcalcification, Correlation and Elongation.
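
    Information-Gain, one of the ranking methods named above, scores a feature by how much it reduces label entropy. A minimal stdlib sketch (the toy feature/label vectors are hypothetical, not from the mammography repository):

```python
import math

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in (labels.count(v) for v in set(labels)))

def information_gain(feature, labels):
    """IG(feature) = H(labels) - sum_v p(v) * H(labels | feature == v)."""
    n = len(labels)
    gain = entropy(labels)
    for v in set(feature):
        subset = [lab for f, lab in zip(feature, labels) if f == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Toy example: a binary feature that perfectly separates the two classes
ig = information_gain([1, 1, 0, 0], ["malignant", "malignant", "benign", "benign"])
```

    Ranking features by this score and keeping the top-k is one way to build the ranked subsets that the classifiers are then evaluated on.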

  2. Improvement of Smoking Abstinence Rates With Increased Varenicline Dosage: A Propensity Score-Matched Analysis.

    Science.gov (United States)

    Karam-Hage, Maher; Kypriotakis, George; Robinson, Jason D; Green, Charles E; Mann, Gurtej; Rabius, Vance; Wippold, Rosario; Blalock, Janice A; Mouhayar, Elie; Tayar, Jean; Chaftari, Patrick; Cinciripini, Paul M

    2018-02-01

    It is unclear whether increasing the dose of varenicline beyond the standard dose of 2 mg/d would improve smoking abstinence. We examined the effect of 3 mg/d of varenicline on smoking abstinence among smokers who had reduced their smoking by 50% or more in response to 2 mg/d for at least 6 weeks but had not quit smoking. Of 2833 patients treated with varenicline, the dosage of a subset of 73 smokers was increased to 3 mg/d after 6 weeks. We used a propensity score analysis involving multiple baseline covariates to create a comparative sample of 356 smokers who remained on 2 mg/d. All smokers received concurrent and similar smoking-cessation counseling. At 3 months, we found a higher 7-day point-prevalence smoking-abstinence rate in the 3-mg group (26%) than in the 2-mg group (11.5%, χ² = 10.60, P < 0.001; risk ratio [RR], 2.3; 95% confidence interval [CI], 1.4-3.6). The difference in abstinence rates remained significant at the 6-month (P < 0.001; RR, 2.6; 95% CI, 1.6-3.9) and 9-month follow-up (P < 0.001; RR, 2.2; 95% CI, 1.4-3.3). A relatively small increase in the daily dose of varenicline seems to offer a benefit for those who are not able to achieve total abstinence after approximately 6 weeks of 2 mg/d.
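
    The comparison group here was built by propensity score matching. One common concrete realization, sketched below, is greedy nearest-neighbour matching on already-estimated scores with a caliper; this is an illustration of the technique only, not the authors' exact procedure, and the scores, ratio and caliper values are hypothetical.

```python
def greedy_match(treated_scores, control_scores, ratio=1, caliper=0.05):
    """Greedy nearest-neighbour propensity-score matching (sketch).
    Each treated unit is matched without replacement to up to `ratio`
    controls whose score differs by at most `caliper`."""
    available = dict(enumerate(control_scores))  # control index -> score
    matches = {}
    for t_idx, t_score in enumerate(treated_scores):
        chosen = []
        for _ in range(ratio):
            best = min(
                ((c_idx, abs(c_score - t_score))
                 for c_idx, c_score in available.items()),
                key=lambda pair: pair[1], default=None)
            if best is None or best[1] > caliper:
                break  # no admissible control left within the caliper
            chosen.append(best[0])
            del available[best[0]]  # matching without replacement
        if chosen:
            matches[t_idx] = chosen
    return matches

# Hypothetical estimated propensity scores
treated = [0.32, 0.58]
controls = [0.30, 0.33, 0.57, 0.90]
m = greedy_match(treated, controls)
```

    Outcomes (here, abstinence rates) are then compared only between matched units, so the groups are balanced on the baseline covariates that entered the propensity model.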

  3. Improving Australia's renewable energy project policy and planning: A multiple stakeholder analysis

    International Nuclear Information System (INIS)

    Martin, Nigel; Rice, John

    2015-01-01

    Renewable Energy (RE) is part of Australia's and the world's energy supply matrix, with over A$100 billion spent annually on RE projects since 2007. Businesses seeking to invest in RE projects, particularly in the wind and solar energy sectors, may face an onerous collection of planning approvals and permitting processes that impede investment and implementation. In this study, we draw on international and domestic stakeholder inputs to a governmental inquiry in Australia to show how RE projects might be approved in shortened timeframes with reduced associated costs. The process mapping and stakeholder analysis demonstrate that RE supply projects can benefit from standardized approval processes and documentation, a 360° deep engagement with stakeholders, and expanded electricity grid access in resource areas, augmented through supportive public policy and planning frameworks. In addition, stakeholder objections to project approval and implementation streamlining were used to contrast the efficacy of the proposed changes in policy. -- Highlights: •Notes the more than A$200 billion spent annually on global RE projects. •Describes a typical two-stage, multi-layered governance RE project approval process. •Exposes approval processes for RE projects lasting up to 3 years and costing millions of dollars. •Identifies multi-million-dollar remote grid connections as an RE project impediment. •Outlines shortcomings in RE project policy and guidelines, and proposed improvements

  4. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics which are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe the method in terms of C++, as the Qt platform is currently being used. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation using both the McCabe and Halstead methods to the BCI framework, which consists of two important types of BCI, namely SSVEP and P300, we found that there are two classes in the framework which are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC is less than 20; average complexity is around 5; and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
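
    The McCabe metric used above is essentially 1 plus the number of decision points in a routine. The paper applies it to C++/Qt code; purely as an illustration of the metric itself, here is a minimal sketch over Python source using the stdlib ast module (the node set counted as "decisions" is a simplifying assumption).

```python
import ast

# Node types treated as decision points (a simplified choice)
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source):
    """McCabe cyclomatic complexity ~= 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "neg"
    elif x == 0:
        return "zero"
    for _ in range(3):
        pass
    return "pos"
"""
cc = cyclomatic_complexity(code)  # if + elif + for -> 1 + 3
```

    A threshold (e.g. flagging routines with complexity above ~10) then turns the metric into the kind of quality gate the paper applies to the SSVEP and P300 classes.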

  5. Rise time analysis of pulsed klystron-modulator for efficiency improvement of linear colliders

    International Nuclear Information System (INIS)

    Oh, J.S.; Cho, M.H.; Namkung, W.; Chung, K.H.; Shintake, T.; Matsumoto, H.

    2000-01-01

    In linear accelerators, the periods during the rise and fall of a klystron-modulator pulse cannot be used to generate RF power. Thus, these periods need to be minimized to get high efficiency, especially in large-scale machines. In this paper, we present a simplified and generalized voltage rise time function of a pulsed modulator with a high-power klystron load using the equivalent circuit analysis method. The optimum pulse waveform is generated when this pulsed power system is tuned with a damping factor of ∼0.85. The normalized rise time chart presented in this paper allows one to predict the rise time and pulse shape of the pulsed power system in general. The results can be summarized as follows: The large distributed capacitance in the pulse tank and the operating parameters, V_s × T_p, where V_s is the load voltage and T_p is the pulse width, are the main factors determining the pulse rise time in the high-power RF system. With an RF pulse compression scheme, up to ±3% ripple of the modulator voltage is allowed without serious loss of compressor efficiency, which allows the modulator efficiency to be improved as well. The wiring inductance should be minimized to get the fastest rise time
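
    The trade-off behind the ∼0.85 damping factor can be illustrated with a standard underdamped second-order step response (a textbook stand-in, not the paper's generalized modulator model; the unit natural frequency is an assumption). Lighter damping rises faster but overshoots, while ζ ≈ 0.85 nearly eliminates overshoot at the cost of a longer 10–90% rise time.

```python
import math

def step_response(zeta, wn, t):
    """Unit-step response of a standard underdamped (zeta < 1)
    second-order system."""
    wd = wn * math.sqrt(1 - zeta ** 2)       # damped natural frequency
    phi = math.acos(zeta)
    return 1 - math.exp(-zeta * wn * t) / math.sqrt(1 - zeta ** 2) \
             * math.sin(wd * t + phi)

def rise_time_10_90(zeta, wn, t_max=20.0, steps=20000):
    """Numerically locate the 10%-90% rise time of the step response."""
    t10 = t90 = None
    for i in range(steps):
        t = t_max * i / steps
        y = step_response(zeta, wn, t)
        if t10 is None and y >= 0.1:
            t10 = t
        if y >= 0.9:
            t90 = t
            break
    return t90 - t10

r_85 = rise_time_10_90(0.85, 1.0)  # near-optimal damping
r_40 = rise_time_10_90(0.40, 1.0)  # lighter damping: faster but ringing
```

    In the modulator context the waveform must also settle within the usable flat-top, which is why the tuned ζ ≈ 0.85 case, not the fastest-rising one, gives the optimum pulse.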

  6. Perioperative blood transfusion in gynecologic oncology surgery: analysis of the National Surgical Quality Improvement Program Database.

    Science.gov (United States)

    Prescott, Lauren S; Aloia, Thomas A; Brown, Alaina J; Taylor, Jolyn S; Munsell, Mark F; Sun, Charlotte C; Schmeler, Kathleen M; Levenback, Charles F; Bodurka, Diane C

    2015-01-01

    To use a large-scale multi-institutional dataset to quantify the prevalence of packed red blood cell transfusions and examine the associations between transfusion and perioperative outcomes in gynecologic cancer surgery. The American College of Surgeons National Surgical Quality Improvement Program (NSQIP) participant use file was queried for all gynecologic cancer cases between 2010 and 2012. Demographic, preoperative and intraoperative variables were compared between transfusion and non-transfusion groups using chi-squared, Fisher's exact and Wilcoxon rank-sum tests. The primary endpoint was 30-day composite morbidity. Secondary endpoints included composite surgical site infections, mortality and length of stay. A total of 8519 patients were analyzed, and 13.8% received a packed red blood cell transfusion. In the multivariate analysis, after adjusting for key clinical and perioperative factors, including preoperative anemia and case magnitude, transfusion was associated with higher composite morbidity (OR = 1.85, 95% CI 1.5-2.24), surgical site infections (OR 1.80, 95% CI 1.39-2.35), mortality (OR 3.38, 95% CI 1.80-6.36) and length of hospital stay (3.02 days vs. 7.17 days). Perioperative blood transfusion practice in gynecologic cancer surgery should be scrutinized. Examination of institutional practices and creation of transfusion guidelines for gynecologic malignancies could potentially result in better utilization of blood bank resources and clinical outcomes among patients. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Improved Dynamic Modeling of the Cascade Distillation Subsystem and Analysis of Factors Affecting Its Performance

    Science.gov (United States)

    Perry, Bruce A.; Anderson, Molly S.

    2015-01-01

    The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station Water Processor Assembly to form a complete water recovery system for future missions. A preliminary chemical process simulation was previously developed using Aspen Custom Modeler® (ACM), but it could not simulate thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. This paper describes modifications to the ACM simulation of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version can be used to model thermal startup and predicts the total energy consumption of the CDS. The simulation has been validated for both NaCl solution and pretreated urine feeds and no longer requires retuning when operating parameters change. The simulation was also used to predict how internal processes and operating conditions of the CDS affect its performance. In particular, it is shown that the coefficient of performance of the thermoelectric heat pump used to provide heating and cooling for the CDS is the largest factor in determining CDS efficiency. Intrastage heat transfer affects CDS performance indirectly through effects on the coefficient of performance.

  8. Vibration Characteristics Analysis of Cylindrical Shell-Plate Coupled Structure Using an Improved Fourier Series Method

    Directory of Open Access Journals (Sweden)

    Yipeng Cao

    2018-01-01

    Full Text Available A simple yet accurate solution procedure based on the improved Fourier series method (IFSM is applied to the vibration characteristics analysis of a cylindrical shell-circular plate (S-P coupled structure subjected to various boundary conditions. By applying four types of coupling springs with arbitrary stiffness at the junction of the coupled structure, the mechanical coupling effects are completely considered. Each of the plate and shell displacement functions is expressed as the superposition of a two-dimensional Fourier series and several supplementary functions. The unknown series-expansion coefficients are treated as the generalized coordinates and determined using the familiar Rayleigh-Ritz procedure. Using the IFSM, a unified solution for the S-P coupled structure with symmetrical and asymmetrical boundary conditions can be derived directly without the need to change either the equations of motion or the expressions of the displacements. This solution can be verified by comparing the current results with those calculated by the finite-element method (FEM. The effects of several significant factors, including the restraint stiffness, the coupling stiffness, and the situation of coupling, are presented. The forced vibration behaviors of the S-P coupled structure are also illustrated.

  9. Chicken Essence for Cognitive Function Improvement: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Teoh, Siew Li; Sudfangsai, Suthinee; Lumbiganon, Pisake; Laopaiboon, Malinee; Lai, Nai Ming; Chaiyakunapruk, Nathorn

    2016-01-20

    Chicken essence (CE) is a popular traditional remedy in Asia, which is believed to improve cognitive functions. CE companies have claimed that these health benefits are proven by research studies. A systematic review was conducted to determine the cognitive-enhancing effects of CE. We systematically searched a number of databases for randomized controlled trials involving human subjects consuming CE and undergoing cognitive tests. Cochrane's Risk of Bias (ROB) tool was used to assess the quality of trials, and meta-analysis was performed. Seven trials were included; six recruited healthy subjects and one recruited subjects with poorer cognitive function. One trial had unclear ROB, while the rest had high ROB. For executive function tests, there was a significant difference favoring CE (pooled standardized mean difference (SMD) of -0.55 (-1.04, -0.06)) and another with no significant difference (pooled SMD of 0.70 (-0.001, 1.40)). For short-term memory tests, no significant difference was found (pooled SMD of 0.63 (-0.16, 1.42)). Currently, there is a lack of convincing evidence to show a cognitive-enhancing effect of CE.
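
    The pooled SMDs with confidence intervals reported above come from inverse-variance weighting of per-trial effects. A minimal fixed-effect sketch (the review's exact pooling model is not stated here, and the per-trial numbers below are hypothetical):

```python
def pooled_smd(effects):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences. `effects` is a list of (smd, standard_error) pairs."""
    weights = [1.0 / se ** 2 for _, se in effects]
    pooled = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Two hypothetical trials: (SMD, standard error)
trials = [(-0.4, 0.2), (-0.7, 0.3)]
smd, ci = pooled_smd(trials)
```

    An effect is read as significant when the 95% interval excludes zero, which is exactly how the executive-function result favoring CE (upper bound -0.06) differs from the short-term-memory result (interval spanning zero).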

  10. Are standards effective in improving automobile fuel economy? An international panel analysis

    International Nuclear Information System (INIS)

    Clerides, Sofronis; Zachariadis, Theodoros

    2007-01-01

    Although the adoption of fuel economy standards has induced fuel savings in new motor vehicles, there are arguments against standards and in favour of fuel tax increases because the latter may have lower welfare costs. We therefore attempted to analyze the impact of standards and fuel prices on the fuel consumption of new cars with the aid of cross-section time-series analysis of data from 18 countries. To our knowledge, this study is the first that attempts to explore this issue econometrically at an international level. We built an unbalanced panel comprising 384 observations from the US, Canada, Australia, Japan, Switzerland and 13 EU countries, spanning the period between 1975 and 2003. We specified a dynamic panel model of fuel economy and estimated the model for the whole sample and also for North America and Europe separately. Based on these estimates, we derived three important policy conclusions. Firstly, it seems that if there were no FE standards or voluntary targets in force, transportation energy use would increase more rapidly. Secondly, if CO2 targets are not to be tightened in Europe, retail fuel prices might have to double in order to attain the currently discussed target of 120 g CO2/km in the future. Thirdly, without higher fuel prices and/or tighter FE standards, one should not expect any marked improvements in fuel economy under 'business as usual' conditions. European policy makers might need to consider this issue carefully because some recent European studies tend to be optimistic in this respect

  11. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems.

    Science.gov (United States)

    Wu, Fan; Xu, Lili

    2013-08-01

    Nowadays, patients can obtain many kinds of medical service online via Telecare Medical Information Systems (TMIS), owing to the fast development of computer technology. The security of communication over the network between users and the server is therefore very significant. Authentication plays an important part in protecting information from malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we have shown that Jiang et al.'s scheme has the weakness of ID uselessness and is vulnerable to off-line password guessing attacks and user impersonation attacks if an attacker compromises the legal user's smart card. Also, it cannot resist DoS attacks in two cases: after a successful impersonation attack, and after wrong password input in the password-change phase. We then propose an improved mutual authentication scheme for a telecare medical information system. Remote monitoring, checking patients' past medical history records, and medical consultation can be applied in the system, where information is transmitted via the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS.
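
    The off-line password guessing attack mentioned above exploits the fact that once a card's stored verifier is extracted, candidate passwords can be tested locally. The sketch below illustrates the attack class only; the verifier construction, identity, salt and dictionary are all hypothetical, not Jiang et al.'s actual scheme.

```python
import hashlib

def make_card_verifier(identity, password, salt=b"card-parameter"):
    """Hypothetical verifier a weak scheme might store on the smart card."""
    return hashlib.sha256(salt + identity.encode() + password.encode()).hexdigest()

def offline_guess(identity, stolen_verifier, dictionary, salt=b"card-parameter"):
    """With the card's stored value in hand, an attacker tests candidate
    passwords locally -- no interaction with the server is needed, so
    rate limiting and lockouts never trigger."""
    for guess in dictionary:
        if make_card_verifier(identity, guess, salt) == stolen_verifier:
            return guess
    return None

verifier = make_card_verifier("alice", "sunshine")   # value read off the card
found = offline_guess("alice", verifier, ["123456", "password", "sunshine"])
```

    Sound schemes avoid this by ensuring that nothing recoverable from the card alone lets a guess be verified without a fresh, server-assisted protocol run.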

  12. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin

    2014-06-01

    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim of improving application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, by an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on dealing with the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (http://psico.fcep.urv.es/utilitats/factor/; Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques using an accompanying dataset, based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.

  13. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia

    Science.gov (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.

    2016-09-01

    To support the implementation of the IWRM-Indonesia process in a water-scarce and sanitation-poor region of Central Java (Indonesia), sustainability assessments of several technology options for water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz association. In the case of water supply, the assessment was based on the life-cycle analysis and life-cycle-costing approach. In the sanitation sector, the focus was set on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied in general to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences which influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.

  14. Analysis and Improvement on a Unimodal Haptic PIN-Entry Method

    Directory of Open Access Journals (Sweden)

    Mun-Kyu Lee

    2017-01-01

    Full Text Available User authentication is a process in which a user of a system proves his/her identity to acquire access permission. An effective user authentication method should be both secure and usable. In an attempt to achieve these two objectives, Bianchi et al. recently proposed novel unimodal PIN-entry methods that use either audio or vibration cues. This paper analyzes the security of their method, in particular, the vibration version of one of their proposals, Timelock. A probabilistic analysis and real attack experiment reveal that the security level guaranteed by Timelock is lower than that claimed in Bianchi et al.’s paper. As countermeasures to this problem, three PIN-entry methods are proposed and a usability study is performed. According to the result of this study, a simple modification may improve the security significantly while retaining the design philosophy of unimodal systems. In addition, the proposed methods address the PIN compatibility issue of Timelock and they can be used to enter a legacy numerical PIN without any change in the PIN.

  15. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin

    2016-01-01

    Full Text Available This article considers one of the important directions of development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. Today the market of event management in Russia is still taking shape; consequently, its impact on the socio-economic development of the regions and of Russia as a whole is minimal, and no analysis of this influence is performed. This problem comes to the fore in the regions of Russia that specialize in creating event-oriented tourist-recreational clusters. The article analyzes the existing event-management market and the functions of event tourism, provides ways to improve the efficiency of event management, and offers recommendations for event organizers in the regions. The article shows the specific role of event tourism in national tourism and provides directions for the development of organizational and methodical recommendations on its formation in the regions of Russia and the creation of an effective management system at the regional level. The purpose of this article is to analyze the emerging event tourism market in Russia and its specifics. On the basis of these studies, the formative patterns of the new market are considered and its impact on the modern national tourism industry is assessed. Methodology. Comparative and economic-statistical analysis methods are used. Conclusions/significance. The practical importance of this article lies in eliminating contradictions existing in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation, and tourist trips to these events take place, yet event tourism as such does not yet exist. In all regions there is domestic and inbound tourism, but it has nothing to do with event tourism. The article's practical conclusions demonstrate the need to adapt the

  16. Improving Air Force Active Network Defense Systems through an Analysis of Intrusion Detection Techniques

    National Research Council Canada - National Science Library

    Dunklee, David R

    2007-01-01

    .... The research then presents four recommendations to improve DCC operations. These include: Transition or improve the current signature-based IDS systems to include the capability to query and visualize network flows to detect malicious traffic...

  17. Improving Error Resilience Analysis Methodology of Iterative Workloads for Approximate Computing

    NARCIS (Netherlands)

    Gillani, G.A.; Kokkeler, Andre B.J.

    2017-01-01

    Assessing the error resilience inherent to digital processing workloads provides application-specific insights into approximate computing strategies for improving power efficiency and/or performance. With the case study of radio astronomy calibration, our contributions for improving the error

  18. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement

    OpenAIRE

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna; Nyström, Monica Elisabeth

    2016-01-01

    OBJECTIVES: Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices in order to understand when and how kaizen is used in healthcare. METHODS: We analysed 186 structured kaizen documents containing improvement suggest...

  19. Recommendations to improve imaging and analysis of brain lesion load and atrophy in longitudinal studies of multiple sclerosis

    DEFF Research Database (Denmark)

    Vrenken, H; Jenkinson, M; Horsfield, M A

    2013-01-01

    Focal lesions and brain atrophy are the most extensively studied aspects of multiple sclerosis (MS), but the image acquisition and analysis techniques used can be further improved, especially those for studying within-patient changes of lesion load and atrophy longitudinally. Improved accuracy...... resonance image analysis methods for assessing brain lesion load and atrophy, this paper makes recommendations to improve these measures for longitudinal studies of MS. Briefly, they are (1) images should be acquired using 3D pulse sequences, with near-isotropic spatial resolution and multiple image...... contrasts to allow more comprehensive analyses of lesion load and atrophy, across timepoints. Image artifacts need special attention given their effects on image analysis results. (2) Automated image segmentation methods integrating the assessment of lesion load and atrophy are desirable. (3) A standard...

  20. Improving Cross-Day EEG-Based Emotion Classification Using Robust Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Yuan-Pin Lin

    2017-07-01

    Full Text Available Constructing a robust emotion-aware analytical framework from non-invasively recorded electroencephalogram (EEG) signals has attracted intensive attention in recent years. However, in moving laboratory-oriented proof-of-concept studies toward real-world applications, researchers face an ecological challenge: EEG patterns recorded in real life change substantially across days (i.e., day-to-day variability), arguably making a pre-defined predictive model vulnerable to EEG signals from a different day. The present work addressed how to mitigate the inter-day EEG variability of emotional responses in order to facilitate cross-day emotion classification, an issue that has received little attention in the literature. This study proposed a robust principal component analysis (RPCA)-based signal filtering strategy and validated its neurophysiological validity and machine-learning practicability on a binary emotion classification task (happiness vs. sadness) using a five-day EEG dataset of 12 subjects who participated in a music-listening task. The empirical results showed that the RPCA-decomposed sparse signals (RPCA-S) filtered off the background EEG activity that contributed most to the inter-day variability, and predominantly captured the EEG oscillations of emotional responses that behaved relatively consistently across days. Applying a realistic add-day-in classification validation scheme, the RPCA-S progressively exploited more informative features (from 12.67 ± 5.99 to 20.83 ± 7.18) and improved the cross-day binary emotion-classification accuracy (from 58.31 ± 12.33% to 64.03 ± 8.40%) as the EEG signals from one to four recording days were used for training and tested against one unseen subsequent day. The original EEG features (prior to RPCA processing) neither achieved cross-day classification (accuracy was around chance level) nor replicated this encouraging improvement, owing to the inter-day EEG variability. This result
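
    The RPCA filtering strategy above splits a multichannel EEG matrix into a low-rank part (shared background activity) and a sparse part (event-related deviations). Below is a minimal sketch of principal component pursuit via an inexact augmented Lagrange multiplier scheme; it is a generic implementation rather than the authors' exact pipeline, and matrix sizes and parameters are illustrative.

```python
import numpy as np

def rpca(M, lam=None, tol=1e-7, max_iter=500):
    """Principal component pursuit: M ~ L + S, with L low-rank (background
    activity) and S sparse (deviations), solved by an inexact
    augmented-Lagrange-multiplier iteration."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(M, 2)   # common heuristic initial penalty
    mu_bar, rho = mu * 1e7, 1.5        # penalty cap and growth factor
    Y = np.zeros_like(M)
    S = np.zeros_like(M)
    norm_M = np.linalg.norm(M, 'fro')
    for _ in range(max_iter):
        # low-rank step: singular-value thresholding
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # sparse step: elementwise soft thresholding
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Z = M - L - S
        Y += mu * Z
        mu = min(mu * rho, mu_bar)
        if np.linalg.norm(Z, 'fro') / norm_M < tol:
            break
    return L, S
```

    In the cross-day setting described above, `S` would correspond to the RPCA-S signals kept for feature extraction, while `L` carries the day-specific background that is filtered off.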

  1. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyoung Tae; Moon, Young Min; Choi, Sung Won; Hwang, Do Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2000-03-15

    The direct-contact condensation heat transfer coefficients are experimentally obtained under the following conditions: pure steam/steam in the presence of noncondensible gas, horizontal/slightly inclined pipe, and cocurrent/countercurrent stratified flow with water. An empirical correlation for the liquid Nusselt number is developed for the slightly inclined pipe and cocurrent stratified flow. Several models in the RELAP5/MOD3.2 code are modified: the wall friction coefficient, the interfacial friction coefficient, the correlation of direct-contact condensation with noncondensible gases, and the correlation of wall film condensation. As a result, RELAP5/MOD3.2 is improved. The present experimental data are used to evaluate the improved code. The standard RELAP5/MOD3.2 code is modified using non-iterative modeling, which is mechanistic and does not require any interfacial information such as the interfacial temperature. The modified RELAP5/MOD3.2 code is used to simulate the horizontally stratified in-tube condensation experiment, which represents the direct-contact condensation phenomena in a hot leg of a nuclear reactor. The modeling capabilities of the modified code as well as the standard code are assessed using several hot-leg condensation experiments. The modified code gives better predictions of local experimental data of liquid void fraction and interfacial heat transfer coefficient than the standard code. For the separate effect test of the thermal-hydraulic phenomena in the pressurizer, a scaling analysis is performed to obtain a similarity of the phenomena between the Korea Standard Nuclear Power Plant (KSNPP) and the present experimental facility. The diameters and lengths of the hot leg, the surge line and the pressurizer are scaled down with the similitude of CCFL and velocity. The ratio of gas flow rate is 1/25. 
The experimental facility is composed of the air-water supply tank, the horizontal pipe, the surge line and the

  2. Improving Child Oral Health: Cost Analysis of a National Nursery Toothbrushing Programme.

    Science.gov (United States)

    Anopa, Yulia; McMahon, Alex D; Conway, David I; Ball, Graham E; McIntosh, Emma; Macpherson, Lorna M D

    2015-01-01

    Dental caries is one of the most common diseases of childhood. The aim of this study was to compare the cost of providing the Scotland-wide nursery toothbrushing programme with associated National Health Service (NHS) cost savings from improvements in the dental health of five-year-old children: through avoided dental extractions, fillings and potential treatments for decay. Estimated costs of the nursery toothbrushing programme in 2011/12 were requested from all Scottish Health Boards. Unit costs of a filled, extracted and decayed primary tooth were calculated using verifiable sources of information. Total costs associated with dental treatments were estimated for the period from 1999/00 to 2009/10. These costs were based on the unit costs above and using the data of the National Dental Inspection Programme and then extrapolated to the population level. Expected cost savings were calculated for each of the subsequent years in comparison with the 2001/02 dental treatment costs. Population standardised analysis of hypothetical cohorts of 1000 children per deprivation category was performed. The estimated cost of the nursery toothbrushing programme in Scotland was £1,762,621 per year. The estimated cost of dental treatments in the baseline year 2001/02 was £8,766,297, while in 2009/10 it was £4,035,200. In 2002/03 the costs of dental treatments increased by £213,380 (2.4%). In the following years the costs decreased dramatically with the estimated annual savings ranging from £1,217,255 in 2003/04 (13.9% of costs in 2001/02) to £4,731,097 in 2009/10 (54.0%). Population standardised analysis by deprivation groups showed that the largest decrease in modelled costs was for the most deprived cohort of children. The NHS costs associated with the dental treatments for five-year-old children decreased over time. In the eighth year of the toothbrushing programme the expected savings were more than two and a half times the costs of the programme implementation.
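
    The headline figures in this record can be reproduced with simple arithmetic. The sketch below recombines the costs quoted above as a check; it is not the authors' costing model.

```python
programme_cost = 1_762_621   # annual cost of the nursery toothbrushing programme
cost_2001_02 = 8_766_297     # baseline dental treatment costs
cost_2009_10 = 4_035_200     # treatment costs in the eighth year

saving = cost_2001_02 - cost_2009_10           # expected annual saving
saving_share = saving / cost_2001_02           # saving as a share of baseline costs
savings_per_programme_pound = saving / programme_cost

print(saving, round(saving_share * 100, 1), round(savings_per_programme_pound, 1))
# → 4731097 54.0 2.7
```

    This matches the record's reported £4,731,097 saving (54.0% of baseline costs) and its statement that savings were more than two and a half times the programme cost.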

  3. Metal Foam Analysis: Improving Sandwich Structure Technology for Engine Fan and Propeller Blades

    Science.gov (United States)

    Fedor, Jessica L.

    2004-01-01

    The Life Prediction Branch of the NASA Glenn Research Center is searching for ways to construct aircraft and rotorcraft engine fan and propeller blades that are lighter and less costly. One possible design is to create a sandwich structure composed of two metal face sheets and a metal foam core. The face sheets would carry the bending loads and the foam core would have to resist the transverse shear loads. Metal foam is ideal because of its low density and energy absorption capabilities, making the structure lighter, yet still stiff. The material chosen for the face sheets and core was 17-4PH stainless steel, which is easy to make and has appealing mechanical properties. This material can be made inexpensively compared to titanium and polymer matrix composites, the two current fan blade alternatives. Initial tests were performed on design models, including vibration and stress analysis. These tests revealed that the design is competitive with existing designs; however, some problems were apparent that must be addressed before it can be implemented in new technology. The foam did not hold up as well as expected under stress. This could be due to a number of issues, but was most likely a result of a large number of pores within the steel that weakened the structure. The brazing between the face sheets and the foam was also identified as a concern. The braze did not hold up well under shear stress, causing the foam to break away from the face sheets. My role in this project was to analyze different options for improving the design. I primarily spent my time examining various foam samples, created with different sintering conditions, to see which exhibited the most favorable characteristics for our purpose. Methods of analysis that I employed included examining strut integrity under a microscope, counting the number of cells per inch, measuring the density, testing the microhardness, and testing the strength under compression. 
Shear testing will also be done to examine

  4. Numerical Analysis of the Unsteady Propeller Performance in the Ship Wake Modified By Different Wake Improvement Devices

    Directory of Open Access Journals (Sweden)

    Bugalski Tomasz

    2014-10-01

    Full Text Available The paper presents a summary of results of the numerical analysis of the unsteady propeller performance in the non-uniform ship wake modified by different wake improvement devices. The analysis is performed using the lifting surface program DUNCAN for unsteady propeller analysis. The object of the analysis is a 7000 ton chemical tanker, for which four different types of wake improvement devices have been designed: two vortex generators, a pre-swirl stator, and a boundary layer alignment device. These produced five different cases of the ship wake structure: the original hull and the hull equipped alternatively with the four wake improvement devices. Two different propellers were analyzed in these five wake fields, one being the original reference propeller P0 and the other a specially designed, optimized propeller P3. The analyzed parameters were the pictures of unsteady cavitation on the propeller blades, the harmonics of pressure pulses generated by the cavitating propellers at selected points, and the fluctuating bearing forces on the propeller shaft. Some of the calculated cavitation phenomena were confronted with the experimental results. The objective of the calculations was to demonstrate the differences in the calculated unsteady propeller performance resulting from the application of different wake improvement devices. The analysis and discussion of the results, together with the appropriate conclusions, are included in the paper.

  5. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  6. [Analysis of patient complaints in Primary Care: An opportunity to improve clinical safety].

    Science.gov (United States)

    Añel-Rodríguez, R M; Cambero-Serrano, M I; Irurzun-Zuazabal, E

    2015-01-01

    To determine the prevalence and type of the clinical safety problems contained in the complaints made by patients and users in Primary Care. An observational, descriptive, cross-sectional study was conducted by analysing both the complaint forms and the responses given to them over a period of one year. At least 4.6% of all complaints analysed in this study contained clinical safety problems. The family physician was the professional who received the majority of the complaints (53.6%), and the main reason was problems related to diagnosis (43%), mainly delayed diagnosis. Other variables analysed were the severity of the adverse events experienced by patients (in 68% of cases the patient suffered some harm), the subsequent impact on patient care, which was affected in 39% of cases (7% of cases even requiring hospital admission), and the preventability of the adverse events (96% avoidable) described in the complaints. Finally, the type of response issued to each complaint was analysed, this being purely bureaucratic in 64% of all cases. Complaints are a valuable source of information about the deficiencies identified by patients and healthcare users. There is considerable scope for improvement in the analysis and management of complaints in general, and of those containing clinical safety issues in particular. To date, in our area, there has been a lack of appropriate procedures for processing these complaints. Likewise, we believe that other pathways or channels should be opened to enable communication by patients and healthcare users. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.

  7. An analysis of limits for part load efficiency improvement with VVA devices

    International Nuclear Information System (INIS)

    Knop, Vincent; Mattioli, Leonardo

    2015-01-01

    Highlights: • Variable valve actuation aims at reducing pumping losses for spark-ignition engines. • Fully unthrottled operation is never reached because of combustion degradation. • Present paper quantifies the combustion degradation origins for various strategies. • Fully unthrottled CAI combustion mode is a non-combustion-limited alternative. • Combustion limitation is, however, replaced by a heat loss limitation. - Abstract: The implementation of Variable Valve Actuation (VVA) in Spark-Ignition (SI) engines generally aims at increasing part-load efficiency by reducing pumping losses. However, any innovative valve strategy has effects on the combustion process itself, introducing new limitations and mitigating the fuel consumption benefits. The experimental analysis of such valve strategies identifies the optimum settings but does not explain the origin of benefits and the sources of unexpected drawbacks. In the present study, the experimentally-optimised operating conditions for different valve strategies were numerically compared with 3D CFD to gain knowledge about causes for efficiency benefits and consequences of valve strategy on combustion progress. We compared standard SI operation in a single-cylinder port-fuel injection gasoline engine to mixture leaning, early intake valve closure (Miller cycle), late intake valve closure (Atkinson cycle), as well as Controlled Auto-Ignition (CAI). All alternative methods reduced pumping work and improved fuel consumption. However, all alternative methods also altered combustion progress and thermodynamic state within the combustion chamber, so that the observed fuel consumption benefits never reached the expected values. An energy balance provided the additional losses induced by each strategy while in-cylinder turbulence and temperature quantification helped explain the trends in combustion speed.

  8. Improved scatter correction with factor analysis for planar and SPECT imaging

    Science.gov (United States)

    Knoll, Peter; Rahmim, Arman; Gültekin, Selma; Šámal, Martin; Ljungberg, Michael; Mirzaei, Siroos; Segars, Paul; Szczupak, Boguslaw

    2017-09-01

    Quantitative nuclear medicine imaging is an increasingly important frontier. In order to achieve quantitative imaging, various interactions of photons with matter have to be modeled and compensated. Although correction for photon attenuation has been addressed by including x-ray CT scans (accurate), correction for Compton scatter remains an open issue. The inclusion of scattered photons within the energy window used for planar or SPECT data acquisition decreases the contrast of the image. While a number of methods for scatter correction have been proposed in the past, in this work, we propose and assess a novel, user-independent framework applying factor analysis (FA). Extensive Monte Carlo simulations for planar and tomographic imaging were performed using the SIMIND software. Furthermore, planar acquisition of two Petri dishes filled with 99mTc solutions and a Jaszczak phantom study (Data Spectrum Corporation, Durham, NC, USA) using a dual head gamma camera were performed. In order to use FA for scatter correction, we subdivided the applied energy window into a number of sub-windows, serving as input data. FA results in two factor images (photo-peak, scatter) and two corresponding factor curves (energy spectra). Planar and tomographic Jaszczak phantom gamma camera measurements were recorded. The tomographic data (simulations and measurements) were processed for each angular position resulting in a photo-peak and a scatter data set. The reconstructed transaxial slices of the Jaszczak phantom were quantified using an ImageJ plugin. The data obtained by FA showed good agreement with the energy spectra, photo-peak, and scatter images obtained in all Monte Carlo simulated data sets. For comparison, the standard dual-energy window (DEW) approach was additionally applied for scatter correction. FA in comparison with the DEW method results in significant improvements in image accuracy for both planar and tomographic data sets. FA can be used as a user
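
    The factor-analysis step, which decomposes per-pixel counts across energy sub-windows into a photo-peak and a scatter component, can be approximated by a rank-2 nonnegative factorization. The sketch below uses multiplicative-update NMF, not the authors' exact FA algorithm, and the matrix sizes and data are illustrative.

```python
import numpy as np

def two_factor_nmf(X, n_iter=2000, seed=0):
    """Rank-2 nonnegative factorization X ~ W @ H (multiplicative updates).
    Rows of H play the role of the two factor curves (photo-peak and scatter
    energy spectra); columns of W are the corresponding factor images."""
    rng = np.random.default_rng(seed)
    n_pix, n_win = X.shape
    W = rng.random((n_pix, 2)) + 1e-3   # factor images (pixels x 2)
    H = rng.random((2, n_win)) + 1e-3   # factor curves (2 x sub-windows)
    eps = 1e-12
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

    As in the paper's framework, the input `X` would hold one column per energy sub-window; the column of `W` associated with the photo-peak spectrum is the scatter-corrected image.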

  9. Molecular structure based property modeling: Development/ improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan

    2013-01-01

    The objective of this work is to develop a method for performing property-data-model analysis so that efficient use of knowledge of properties can be made in the development/improvement of property prediction models. The method includes: (i) analysis of property data and its consistency check; (ii) selection of the most appropriate form of the property model; (iii) selection of the data-set for performing parameter regression and uncertainty analysis; and (iv) analysis of model prediction errors to take necessary corrective steps to improve the accuracy and the reliability of property models. In addition, a method for selecting a minimum data-set for the parameter regression is discussed for cases where it is preferred to retain some data-points from the total data-set to test the reliability of predictions for validation purposes.

  10. Quality improvement in health care – A fundamental analysis and South African perspective

    Directory of Open Access Journals (Sweden)

    M. E. Muller

    1996-03-01

    Full Text Available This article is based on the opening address delivered at a national conference marking the launch of the South African Society for Quality in Health Care (SASQuaH). The quality improvement process and principles are analysed, and an exposition of the South African situation is given. The concepts of quality, quality improvement and quality assurance are clarified, the different stakeholders in formalised quality improvement are identified, and the motives for and principles of quality improvement in health services are explained. The different external and internal mechanisms of quality improvement are analysed, including the principle of community ownership and international collaboration. Continuous development and capacity building in quality improvement is recommended at all levels of health service delivery.

  11. Use of a pre-analysis osmolality normalisation method to correct for variable urine concentrations and for improved metabolomic analyses.

    Science.gov (United States)

    Chetwynd, Andrew J; Abdul-Sada, Alaa; Holt, Stephen G; Hill, Elizabeth M

    2016-01-29

    Metabolomics analyses of urine have the potential to provide new information on the detection and progression of many disease processes. However, urine samples can vary significantly in total solute concentration and this presents a challenge to achieve high quality metabolomic datasets and the detection of biomarkers of disease or environmental exposures. This study investigated the efficacy of pre- and post-analysis normalisation methods to analyse metabolomic datasets obtained from neat and diluted urine samples from five individuals. Urine samples were extracted by solid phase extraction (SPE) prior to metabolomic analyses using a sensitive nanoflow/nanospray LC-MS technique and the data analysed by principal component analyses (PCA). Post-analysis normalisation of the datasets to either creatinine or osmolality concentration, or to mass spectrum total signal (MSTS), revealed that sample discrimination was driven by the dilution factor of urine rather than the individual providing the sample. Normalisation of urine samples to equal osmolality concentration prior to LC-MS analysis resulted in clustering of the PCA scores plot according to sample source and significant improvements in the number of peaks common to samples of all three dilutions from each individual. In addition, the ability to identify discriminating markers, using orthogonal partial least squared-discriminant analysis (OPLS-DA), was greatly improved when pre-analysis normalisation to osmolality was compared with post-analysis normalisation to osmolality and non-normalised datasets. Further improvements for peak area repeatability were observed in some samples when the pre-analysis normalisation to osmolality was combined with a post-analysis mass spectrum total useful signal (MSTUS) or MSTS normalisation. 
Future adoption of such normalisation methods may reduce the variability in metabolomics analyses due to differing urine concentrations and improve the discovery of discriminating metabolites
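
    The two normalisation routes compared above can be sketched as follows; the function names and osmolality values are illustrative, not from the paper.

```python
def dilution_factors(osmolalities, target=None):
    """Pre-analysis normalisation: the fold-dilution needed to bring every
    urine sample to a common (by default, the lowest) osmolality before
    SPE and LC-MS analysis."""
    target = min(osmolalities) if target is None else target
    return [osm / target for osm in osmolalities]

def post_normalise(peak_areas, sample_osmolality, reference_osmolality):
    """Post-analysis alternative: rescale measured peak areas by the ratio
    of a reference osmolality to the sample's osmolality."""
    scale = reference_osmolality / sample_osmolality
    return [area * scale for area in peak_areas]

# e.g. three samples at 300, 600 and 900 mOsm/kg
print(dilution_factors([300, 600, 900]))       # → [1.0, 2.0, 3.0]
print(post_normalise([10.0, 20.0], 600, 300))  # → [5.0, 10.0]
```

    The study's finding is that applying the first step before acquisition, rather than only rescaling afterwards, gives tighter clustering by individual and better biomarker discovery.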

  12. Analysis of Drinking Water Supply System Improvement Using Fuzzy AHP (Case Study: Subang Local Water Company)

    Directory of Open Access Journals (Sweden)

    Budi Prakoso Satrio

    2018-01-01

    Full Text Available A study of infrastructure is required to improve the drinking water supply system, in order to achieve the Long-Term Plan Targets on Water Supply (RPJMN 2015-2019) concerning 100% universal and equitable access to drinking water services, in accordance with the Sustainable Development Goals (SDGs) for 2019. Subang district was selected as a case study of water supply improvement because of the growing development in the area and its population growth rate, both of which influence the need for drinking water. The purpose of this study is to determine the problems of the drinking water supply system and to analyze alternatives for improving it, by weighting and ranking the factors in drinking water supply improvement with the AHP method and analyzing them with fuzzy logic. Improvement of the water supply system can be addressed by examining the influential factors using a fuzzy AHP approach to obtain appropriate alternatives. Technical and operational factors were considered the most influential in the improvement of the water supply system, followed by management capacity, environment, and financial aspects. The fuzzy method with the rule-base input approach used in this analysis influenced the output, which became the policy alternative for improving drinking water supply.
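
    The weighting-and-ranking step of classical (crisp) AHP that fuzzy AHP extends can be sketched as below. The pairwise comparison matrix is hypothetical, ordered as technical/operational, management capacity, environment, financial; fuzzy AHP would replace these crisp judgments with triangular fuzzy numbers.

```python
import numpy as np

def ahp_weights(P):
    """Priority weights from a pairwise comparison matrix P (Saaty scale)
    via the geometric-mean method, plus the consistency ratio (CR)."""
    n = P.shape[0]
    g = np.prod(P, axis=1) ** (1.0 / n)     # row geometric means
    w = g / g.sum()                         # normalised priority weights
    lam_max = float(np.mean((P @ w) / w))   # principal-eigenvalue estimate
    ci = (lam_max - n) / (n - 1)            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index
    return w, ci / ri

# hypothetical judgments: technical/operational > management > environment > financial
P = np.array([[1.0, 2.0, 3.0, 4.0],
              [1/2, 1.0, 2.0, 3.0],
              [1/3, 1/2, 1.0, 2.0],
              [1/4, 1/3, 1/2, 1.0]])
w, cr = ahp_weights(P)
```

    With these example judgments the technical/operational factor receives the largest weight, mirroring the ranking reported in the study, and CR < 0.1 indicates acceptable consistency.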

  13. Does Methylphenidate Improve Academic Performance? A Systematic Review and Meta‑analysis

    NARCIS (Netherlands)

    Kortekaas-Rijlaarsdam, Anne Fleur; Luman, Marjolein; Sonuga-Barke, Edmund; Oosterlaan, J.

    2018-01-01

    Academic improvement is amongst the most common treatment targets when prescribing stimulants to children with ADHD. Previous reviews on stimulant-related academic improvements are inconclusive and focus on task engagement. Recent literature suggests outcome-domain-specific medication effects that are

  14. Improving performance of high risk organizations: the Spanish nuclear sector from the analysis of organizational culture factors

    International Nuclear Information System (INIS)

    La Salabarnada, E.; German, S.; Silla, I.; Navajas, J.

    2012-01-01

    This paper presents the research project funded by UNESA and conducted by the CISOT-CIEMAT that aims to contribute to improving the operating performance of the Spanish nuclear power plants. This paper aims to identify the factors and key organizational processes to improve efficiency, in order to advance knowledge about the influence of organizational culture on the safety of high reliability organizations.

  15. West Coast tree improvement programs: a break-even, cost-benefit analysis

    Science.gov (United States)

    F. Thomas Ledig; Richard L Porterfield

    1981-01-01

    Three tree improvement programs were analyzed using a break-even cost-benefit technique: one for ponderosa pine in the Pacific Northwest, and two for Douglas-fir in the Pacific Northwest, one of low intensity and the other of high intensity. A return of 8 percent on investment appears feasible by using short rotations or by accompanying tree improvement with thinning....

  16. Osteoarthritis patients with pain improvement are highly likely to also have improved quality of life and functioning. A post hoc analysis of a clinical trial.

    Science.gov (United States)

    Peloso, Paul M; Moore, R Andrew; Chen, Wen-Jer; Lin, Hsiao-Yi; Gates, Davis F; Straus, Walter L; Popmihajlov, Zoran

    2016-10-01

    This analysis evaluated whether osteoarthritis patients achieving the greatest pain control and lowest pain states also have the greatest improvement in functioning and quality of life. Patients (n=419) who failed prior therapies and who were switched to etoricoxib 60mg were categorized as pain responders or non-responders at 4 weeks based on responder definitions established by the Initiative on Methods, Measurement, and Pain (IMMPACT) criteria, including changes from baseline of ≥15%, ≥30%, ≥50%, ≥70% and a final pain status of ≤3/10 (no worse than mild pain). Pain was assessed at baseline and 4 weeks using 4 questions from the Brief Pain Inventory (BPI) (worst pain, least pain, average pain, and pain right now), and also using the Western Ontario and McMaster Universities Arthritis Index (WOMAC) pain subscale. We examined the relationship between pain responses with changes from baseline in two functional measures (the BPI Pain Interference questions and the WOMAC Function Subscale) as well as changes from baseline in quality of life (assessed on the SF-36 Physical and Mental Component Summaries). We also sought to understand whether these relationships were influenced by the choice of the pain instrument used to assess response. We contrast the mean difference in improvements in the functional and quality of life instruments based on pain responder status (responder versus non-responder) and the associated 95% confidence limits around this difference. Patients with better pain responses were much more likely to have improved functional responses and improved quality of life, with higher mean changes in these outcomes versus pain non-responders, regardless of the choice of IMMPACT pain response definition (e.g., using any of 15%, 30%, 50%, 70% change from baseline) or the final pain state of ≤3/10. There was an evident gradient, where higher levels of pain response were associated with greater mean improvements in function and quality of life. The

  17. Transcriptome analysis of a phenol-producing Pseudomonas putida S12 construct: Genetic and physiological basis for improved production

    NARCIS (Netherlands)

    Wierckx, N.J.P.; Ballerstedt, H.; Bont, J.A.M.de; Winde, J.H.de; Ruijssenaars, H.J.; Wery, J.

    2008-01-01

    The unknown genetic basis for improved phenol production by a recombinant Pseudomonas putida S12 derivative bearing the tpl (tyrosine-phenol lyase) gene was investigated via comparative transcriptomics, nucleotide sequence analysis, and targeted gene disruption. We show upregulation of tyrosine

  18. Improved analysis of long-term monitoring data demonstrates marked regional declines of bat populations in the eastern United States

    Science.gov (United States)

    Thomas E. Ingersoll; Brent J. Sewall; Sybill K. Amelon

    2013-01-01

    Bats are diverse and ecologically important, but are also subject to a suite of severe threats. Evidence for localized bat mortality from these threats is well-documented in some cases, but long-term changes in regional populations of bats remain poorly understood. Bat hibernation surveys provide an opportunity to improve understanding, but analysis is complicated by...

  19. Improving failure analysis efficiency by combining FTA and FMEA in a recursive manner

    NARCIS (Netherlands)

    Peeters, J.F.W.; Basten, R.J.I.; Tinga, Tiedo

    2018-01-01

    When designing a maintenance programme for a capital good, especially a new one, it is of key importance to accurately understand its failure behaviour. Failure mode and effects analysis (FMEA) and fault tree analysis (FTA) are two commonly used methods for failure analysis. FMEA is a bottom-up

  20. Improved Analysis of Earth System Models and Observations using Simple Climate Models

    Science.gov (United States)

    Nadiga, B. T.; Urban, N. M.

    2016-12-01

    Earth system models (ESMs) are the most comprehensive tools we have to study climate change and develop climate projections. However, the computational infrastructure required and the cost incurred in running such ESMs preclude their direct use in conjunction with a wide variety of tools that can further our understanding of climate. Here we are referring to tools that range from dynamical systems tools, which give insight into underlying flow structure and topology, to tools from applied mathematics and statistics that are central to quantifying stability, sensitivity, uncertainty and predictability, to machine learning tools that are now being rapidly developed or improved. Our approach to facilitating the use of such models is to analyze the output of ESM experiments (cf. CMIP) using a range of simpler models that consider integral balances of important quantities such as mass and/or energy, in a Bayesian framework. We highlight the use of this approach in the context of the uptake of heat by the world oceans during the ongoing global warming. Indeed, since in excess of 90% of the anomalous radiative forcing due to greenhouse gas emissions is sequestered in the world oceans, the nature of ocean heat uptake crucially determines the surface warming that is realized (cf. climate sensitivity). Nevertheless, ESMs themselves are never run long enough to directly assess climate sensitivity. So, we consider a range of models based on integral balances, balances that have to be realized in all first-principles based models of the climate system, including the most detailed state-of-the-art climate simulations. The models range from simple models of energy balance to those that consider dynamically important ocean processes such as the conveyor-belt circulation (Meridional Overturning Circulation, MOC), North Atlantic Deep Water (NADW) formation, the Antarctic Circumpolar Current (ACC) and eddy mixing. Results from Bayesian analysis of such models using
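
    The simple-model idea, fitting an integral energy balance to ESM or observational output in a Bayesian framework, can be sketched with a zero-dimensional model C dT/dt = F(t) - lambda*T and a grid posterior over the feedback parameter lambda. Everything below (the forcing ramp, heat capacity, noise level) is an illustrative assumption, not taken from the abstract.

```python
import numpy as np

def ebm(lam, forcing, heat_cap=8.0, dt=1.0):
    """Zero-dimensional energy balance model: C dT/dt = F(t) - lam * T,
    integrated with a simple forward-Euler step (dt in years)."""
    T = np.zeros(len(forcing))
    for t in range(1, len(forcing)):
        T[t] = T[t - 1] + dt / heat_cap * (forcing[t - 1] - lam * T[t - 1])
    return T

rng = np.random.default_rng(1)
years = np.arange(150)
forcing = 3.7 * np.log2(1.0 + years / 150.0)  # hypothetical ramp toward ~2xCO2
sigma = 0.05                                  # assumed observation noise (K)
obs = ebm(1.2, forcing) + sigma * rng.standard_normal(years.size)

# flat prior on lambda, Gaussian likelihood -> grid posterior
lams = np.linspace(0.5, 2.5, 201)
log_post = np.array([-0.5 * np.sum((obs - ebm(l, forcing)) ** 2) / sigma ** 2
                     for l in lams])
lam_map = lams[np.argmax(log_post)]
```

    Under this toy setup the equilibrium warming for doubled CO2 follows as 3.7 / lam_map, illustrating how a calibrated integral-balance model can address a question (climate sensitivity) the ESM run itself is too short to answer directly.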

  1. Cost savings associated with improving appropriate and reducing inappropriate preventive care: cost-consequences analysis

    Directory of Open Access Journals (Sweden)

    Baskerville Neill

    2005-03-01

    Full Text Available Abstract Background Outreach facilitation has been proven successful in improving the adoption of clinical preventive care guidelines in primary care practice. The net costs and savings of delivering such an intensive intervention need to be understood. We wanted to estimate the proportion of a facilitation intervention cost that is offset, and the potential for savings, by reducing inappropriate screening tests and increasing appropriate screening tests in 22 intervention primary care practices affecting a population of 90,283 patients. Methods A cost-consequences analysis of one successful outreach facilitation intervention was done, taking into account the estimated cost savings to the health system of reducing five inappropriate tests and increasing seven appropriate tests. Multiple data sources were used to calculate costs and cost savings to the government. The cost of the intervention and the costs of performing appropriate testing were calculated. Costs averted were calculated by multiplying the number of tests not performed as a result of the intervention by their unit costs. Further downstream cost savings were determined by calculating the direct costs associated with the number of false-positive test follow-ups avoided. Treatment costs averted as a result of increasing appropriate testing were similarly calculated. Results The total cost of the intervention over 12 months was $238,388 and the cost of increasing the delivery of appropriate care was $192,912, for a total cost of $431,300. The savings from the reduction in inappropriate testing were $148,568 and from avoiding treatment costs as a result of appropriate testing were $455,464, for total savings of $604,032. On a yearly basis the net cost saving to the government is $191,733 per year (2003 $Can), equating to $3,687 per physician or $63,911 per facilitator, an estimated return on intervention investment and delivery of appropriate preventive care of 40%.
Conclusion Outreach facilitation is more expensive
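
    The cost arithmetic quoted in the abstract can be checked directly. Note that subtracting total costs from total savings gives $172,732 rather than the $191,733 quoted (the published figure presumably reflects adjustments not shown in the abstract), while the stated ~40% return does follow from the raw totals:

```python
# Cost-consequences arithmetic, using the figures quoted in the abstract
intervention_cost = 238_388
appropriate_care_cost = 192_912
total_cost = intervention_cost + appropriate_care_cost          # $431,300

savings_inappropriate = 148_568
savings_treatment_averted = 455_464
total_savings = savings_inappropriate + savings_treatment_averted  # $604,032

net_saving = total_savings - total_cost   # raw subtraction: $172,732
roi = net_saving / total_cost             # return on the total outlay
print(f"Net saving: ${net_saving:,} (ROI {roi:.0%})")
```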

  2. Similarities between pinch analysis and classical blast furnace analysis methods. Possible improvement by synthesis. Paper no. IGEC-1-004

    International Nuclear Information System (INIS)

    Ryman, C.; Grip, C.-E.; Franck, P.-A.; Wikstrom, J.-O.

    2005-01-01

    Pinch analysis originated at UMIST in the 1970s. It has since been used as a method for energy analysis and optimisation of industrial systems. The blast furnace process for reducing iron oxide to molten iron is a very important process unit in the metallurgical industry. It is a counter-current shaft process with a wide temperature range and gaseous, solid and liquid phases present in different zones. Because of this, the blast furnace acts as a system of different sub-processes rather than a single process. The analysis tools developed to describe the process are in some respects similar to the tools of pinch analysis. So far, however, the exchange between the two fields of knowledge has been negligible. In this paper the methods are described and compared. Problems, possibilities and advantages of an exchange and synthesis of knowledge are discussed. (author)

  3. Fault feature extraction method based on local mean decomposition Shannon entropy and improved kernel principal component analysis model

    Directory of Open Access Journals (Sweden)

    Jinlu Sheng

    2016-07-01

    Full Text Available To effectively extract the typical features of a bearing, a new method relating local mean decomposition (LMD) Shannon entropy and an improved kernel principal component analysis (KPCA) model was proposed. First, the features are extracted by a time–frequency domain method, local mean decomposition, using the Shannon entropy to process the original separated product functions, so as to obtain the original features. However, the extracted features still contain superfluous information, so a nonlinear multi-feature processing technique, kernel principal component analysis, is introduced to fuse the features. The kernel principal component analysis is improved by a weight factor. The extracted characteristic features were input into a Morlet wavelet kernel support vector machine to obtain a bearing running-state classification model, and the bearing running state was thereby identified. Both test and actual cases were analyzed.
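
    A minimal sketch of the feature-fusion stage described above, assuming simple band-limited components as stand-ins for genuine LMD product functions and a per-feature weight in place of the paper's specific weight factor:

```python
import numpy as np

def shannon_entropy(x, bins=32):
    """Shannon entropy (bits) of a signal's amplitude distribution."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def kernel_pca(features, n_components=2, gamma=1.0, weights=None):
    """RBF kernel PCA with an optional per-feature weight (stand-in 'weight factor')."""
    x = np.asarray(features, dtype=float)
    if weights is not None:
        x = x * np.asarray(weights)
    sq = np.sum(x**2, axis=1)
    k = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * x @ x.T))
    n = k.shape[0]
    one = np.full((n, n), 1.0 / n)
    kc = k - one @ k - k @ one + one @ k @ one       # centre in feature space
    vals, vecs = np.linalg.eigh(kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    return kc @ (vecs / np.sqrt(np.maximum(vals, 1e-12)))

# Stand-in "product functions": harmonic components of two simulated bearing states
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
samples, labels = [], []
for state, freq in [(0, 50.0), (1, 120.0)]:
    for _ in range(20):
        pfs = [np.sin(2*np.pi*freq*(k+1)*t) + 0.3*rng.normal(size=t.size)
               for k in range(3)]
        samples.append([shannon_entropy(pf) for pf in pfs])  # entropy feature vector
        labels.append(state)

fused = kernel_pca(samples, n_components=2, gamma=0.5)  # fused features for the SVM stage
```

    The `fused` matrix would then feed the wavelet-kernel SVM classifier mentioned in the abstract; that stage is omitted here.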

  4. Analysis of advanced sodium-cooled fast reactor core designs with improved safety characteristics

    International Nuclear Information System (INIS)

    Sun, K.

    2012-09-01

improvements address both neutronics and thermal-hydraulics aspects. Furthermore, emphasis has been placed on not only the beginning-of-life (BOL) state of the core, but also on the beginning of closed equilibrium fuel cycle (BEC) state. An important context for the current thesis is the 7th European Framework Program's Collaborative Project for a European Sodium Fast Reactor (CP-ESFR), the reference 3600 MWth ESFR core being the starting point for the conducted research. The principally employed computational tools belong to the so-called FAST code system, viz. the fast-reactor neutronics code ERANOS, the fuel cycle simulating procedure EQL3D, the spatial kinetics code PARCS and the system thermal-hydraulics code TRACE. The research has been carried out in essentially three successive phases. The first phase has involved achieving a clearer understanding of the principal phenomena contributing to the SFR void effect. Decomposition and analysis of sodium void reactivity have been carried out, while considering different fuel cycle states for the core. Furthermore, the spatial distribution of void reactivity importance, in both axial and radial directions, is investigated. For the reactivity decomposition, two methods, based respectively on neutron balance considerations and on perturbation theory, have been applied. The sodium void reactivity of the reference ESFR core has been, accordingly, decomposed reaction-wise, cross-section-wise, isotope-wise and energy-group-wise. Effectively, the neutron balance based method allows an in-depth understanding of the ‘consequences’ of sodium voidage, while the perturbation theory based method provides a complementary understanding of the ‘causes’. The second phase of the research has addressed optimization of the reference ESFR core design from the neutronics viewpoint. Four options oriented towards either the leakage component or the spectral effect have been considered in detail, viz. introducing an upper sodium plenum

  5. An Analysis of Department of Defense Business Systems Modernization Efforts and Recommendations for Improving the Process

    National Research Council Canada - National Science Library

    Guerra, Elizabeth A; Nicholas, Joseph E; Scott, Corey L

    2005-01-01

    .... A large portion of military transformation is focused on DoD business practices. It has been estimated that up to 5 percent of the annual DoD budget could be saved by improving and streamlining defense business operations...

  6. Analysis on nuclear power plant control room system design and improvement based on human factor engineering

    International Nuclear Information System (INIS)

    Gao Feng; Liu Yanzi; Sun Yongbin

    2014-01-01

    The design of the nuclear power plant control room system is a process of improvement guided by the implementation of human factors engineering theory. The method of implementing human factors engineering principles in nuclear power plant control room system design and improvement was discussed in this paper. It is recommended that the control room system function, human-machine interface, digital procedures, control room layout and environment design be addressed comprehensively, based on human factors engineering theory and experience. The main issues that should receive particular attention during control room system design and improvement were also addressed in this paper, and advice and guidance for the design and improvement of the nuclear power plant control room system were provided. (authors)

  7. Improving compliance with hospital antibiotic guidelines : a time-series intervention analysis

    NARCIS (Netherlands)

    Mol, Peter G. M.; Wieringa, Jaap E.; NannanPanday, PV; Gans, Rijk O. B.; Degener, John E.; Laseur, M; Haaijer-Ruskamp, Flora M.

    Objectives: This study investigated the impact of a combined intervention strategy to improve antimicrobial prescribing at University Hospital Groningen. For the intervention, the antimicrobial treatment guidelines were updated and disseminated in paperback and electronic format. The credibility of
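
    The time-series intervention analysis named in the title is commonly implemented as a segmented (interrupted time series) regression; a sketch on synthetic prescribing data, where all numbers are illustrative assumptions rather than the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
months = np.arange(48)                       # 24 months pre-, 24 months post-intervention
post = (months >= 24).astype(float)
time_after = np.clip(months - 24, 0, None)   # months elapsed since the intervention

# Synthetic monthly guideline-adherence series (% of prescriptions), with a
# built-in step of +8 points and a slope change of +0.3/month at month 24
adherence = 55 + 0.1*months + 8*post + 0.3*time_after + rng.normal(0, 1.0, 48)

# Segmented regression: baseline level and trend, plus step and slope change
X = np.column_stack([np.ones(48), months, post, time_after])
coef, *_ = np.linalg.lstsq(X, adherence, rcond=None)
intercept, trend, level_change, slope_change = coef
print(f"estimated step change at intervention: {level_change:.1f} points")
```

    The step (`level_change`) and slope (`slope_change`) terms are what distinguish an intervention effect from the pre-existing trend.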

  8. The enterprise constant improvement by using medium ratings differentiation analysis on the second Toyota principle

    Directory of Open Access Journals (Sweden)

    Stanisław Borkowski

    2014-07-01

    Full Text Available The continuous improvement of an organization is possible using a step-by-step method. A self-learning organization and continuous improvement of the organization's culture guarantee good market results and rising competitiveness. This paper aims to identify and analyze the importance-level ratings of the production process factors in order to improve the manufacturing process in the chosen construction company. The study used an innovative research method, BOST, which refers to the Toyota management principles in production and service organizations. As a result of the research, differentiated importance-level ratings for the production process factors were obtained, which provide a basis for improving the analyzed company's process.

  9. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    The purpose of this study was to assess the throughput improvement afforded by the various TCP optimization techniques, with respect to a simulated geosynchronous satellite system, to provide a cost...
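
    One reason TCP needs optimization over geosynchronous links is the large bandwidth-delay product (BDP); a back-of-envelope sketch with assumed link parameters, not figures from the study:

```python
# Bandwidth-delay product for a geosynchronous satellite link (illustrative numbers)
link_rate_bps = 10e6          # assumed 10 Mbit/s channel
rtt_s = 0.6                   # ~600 ms round trip via GEO (propagation + processing)

bdp_bytes = link_rate_bps / 8 * rtt_s        # bytes "in flight" to fill the pipe
default_window = 65_535                      # classic TCP window without window scaling
utilisation = min(1.0, default_window / bdp_bytes)
print(f"BDP: {bdp_bytes/1e3:.0f} kB; utilisation without window scaling: {utilisation:.0%}")
```

    With a 750 kB pipe and a 64 kB window, an unoptimized sender can keep only a small fraction of the link busy, which is exactly the gap that the TCP optimization techniques under study target.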

  10. Information Operations Versus Civilian Marketing and Advertising: A Comparative Analysis to Improve IO Planning and Strategy

    National Research Council Canada - National Science Library

    Chilton, Dan

    2008-01-01

    .... must re-focus its efforts on Information Operations to achieve GWOT objectives. In an environment where rapidly advancing technology is drastically improving the individual's ability to conduct IO against the U.S...

  11. Nuclear power plant safety and reliability improvements derived from operational experience analysis

    International Nuclear Information System (INIS)

    Zebroski, E.L.; Rosen, S.L.

    1983-01-01

    The frequency and distribution of significant events at operating nuclear power plants can be used as one of several indices to obtain a preliminary indication of unit and/or utility performance in the development and implementation of improvements aimed at reducing the probability or consequences of troublesome events. Initial data of this type are presented along with qualifications on the validity of such indicators. Planned further steps to improve this as a performance indicator are noted

  12. Improved StrongARM latch comparator: Design, analysis and performance evaluation

    KAUST Repository

    Almansouri, Abdullah Saud Mohammed

    2017-07-13

    This paper presents an improved StrongARM latch comparator, designed and simulated in 90 nm and 32 nm CMOS technologies. The proposed design provides an improvement of 7% in energy efficiency, 14% in speed, and an average reduction of 41% in clock feedthrough compared to the conventional design. The new architecture also minimizes area by reducing the number of transistors needed for the enhanced performance.

  13. APPLICATION OF FAILURE MODE & EFFECT ANALYSIS (FMEA) FOR CONTINUOUS QUALITY IMPROVEMENT - MULTIPLE CASE STUDIES IN AUTOMOBILE SMES

    Directory of Open Access Journals (Sweden)

    Jigar Doshi

    2017-06-01

    Full Text Available Failure Mode and Effects Analysis (FMEA) is a quality tool used to identify potential failures and their effects on processes and products, so that continuous improvement in quality can be achieved by reducing them. The purpose of this research paper is to showcase the contribution of FMEA to achieving Continuous Quality Improvement (CQI) through multiple-case-study research. The outcome of research conducted by implementing FMEA, one of the Auto Core Tools (ACTs), in automobile Small and Medium Enterprises (SMEs) in Gujarat, India is presented in this paper, depicting various means of continuous quality improvement. The case-study-based research was carried out in four automobile SMEs, all of which supply to an automotive Original Equipment Manufacturer (OEM). The FMEA was implemented with the help of a Cross Functional Team (CFT) to identify the potential failure modes and effects and their overall effect on Continuous Quality Improvement. The outcome of FMEA at the four companies reveals that scope for improvement exists in the manufacturing process. Implementation of those improvement points showed definite signs of continuous improvement in the quality of the process and the product as well. The FMEA and subsequent implementations reduced quality rejections by around 3% to 4% in the case companies.
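
    The core FMEA calculation is the Risk Priority Number, RPN = severity x occurrence x detection, each rated 1-10; a minimal sketch with hypothetical failure modes, not the case companies' actual data:

```python
# Risk Priority Number: severity x occurrence x detection, each rated 1-10
failure_modes = [
    {"mode": "porosity in casting", "sev": 8, "occ": 5, "det": 6},
    {"mode": "thread misalignment", "sev": 6, "occ": 3, "det": 4},
    {"mode": "surface scratch",     "sev": 3, "occ": 7, "det": 2},
]
for fm in failure_modes:
    fm["rpn"] = fm["sev"] * fm["occ"] * fm["det"]

# Corrective actions are prioritised on the highest RPN first
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
for fm in ranked:
    print(f'{fm["mode"]:22s} RPN={fm["rpn"]}')
```

    After corrective action, the cross-functional team re-scores occurrence and detection and recomputes the RPN, which is the loop that drives the continuous improvement reported above.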

  14. How do Housing Subsidies Improve Quality of Life Among Homeless Adults? A Mediation Analysis.

    Science.gov (United States)

    O'Connell, Maria; Sint, Kyaw; Rosenheck, Robert

    2018-03-01

    Supported housing, combining rent subsidies with intensive case management, is associated with improvements in quality of life of homeless adults, but factors mediating their impact on quality of life have not been studied. Twelve-month outcome data from a randomized trial of the Housing and Urban Development-Veterans Affairs Supported Housing program (HUD-VASH) showed that access to a housing rent subsidy plus intensive case management (ICM) was associated with greater improvement in subjective quality of life than ICM alone. Multiple mediation analyses were applied to identify variables that significantly mediated the relationship between receipt of a housing voucher and improvements in quality of life. Significant mediating covariates were those whose 95% bias-corrected confidence intervals, when added to the model predicting improvement in quality of life, did not overlap zero. Increases in the number of days housed, size of social network, and availability of emotional support appear to mediate improvement in quality of life and account for 71% of the benefit attributable to having a rent subsidy. Improvement in subjective quality of life through housing subsidies is mediated by gains in both material and psychosocial factors. Mediating factors deserve special attention in supported housing services. © Society for Community Research and Action 2018.
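
    The mediation test described above, with bootstrap confidence intervals on the indirect effect, can be sketched for a single mediator using a plain percentile bootstrap (a simplification of the bias-corrected method the study used) on synthetic data; all variable names and numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
voucher = rng.integers(0, 2, n).astype(float)            # treatment: rent subsidy or not
days_housed = 100 + 60*voucher + rng.normal(0, 20, n)    # mediator (a-path built in)
qol = 3.0 + 0.01*days_housed + 0.1*voucher + rng.normal(0, 0.5, n)  # outcome

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                           # treatment -> mediator slope
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]     # mediator -> outcome, x held fixed
    return a * b                                         # indirect (mediated) effect

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                          # resample cases with replacement
    boot.append(indirect_effect(voucher[idx], days_housed[idx], qol[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.2f}, {hi:.2f}]")   # CI excluding zero -> mediation
```

    The study's criterion is the same in spirit: a mediator is significant when the bootstrap CI of its indirect effect does not overlap zero.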

  15. Pre-medication to improve esophagogastroduodenoscopic visibility: a meta-analysis and systemic review.

    Science.gov (United States)

    Chen, Hsuan-Wei; Hsu, Hsuang-Chun; Hsieh, Tsai-Yuan; Yeh, Ming-Kung; Chang, Wei-Kuo

    2014-09-01

    The use of pre-medication to improve visibility in gastrointestinal endoscopy has not been well addressed and remains controversial. The aim was to evaluate the effects of current pre-medication on endoscopic visualization. We devised overall strategies to search the different databases and assessed the quality of the included studies according to the inclusion and exclusion criteria. Ten prospective studies involving 1541 patients treated with pre-medication were included. There was improved visibility in patients treated with Simethicone (weighted mean difference -4.3; 95% confidence interval (CI), -4.94 to -3.67) compared to those who did not use Simethicone. In the Simethicone-based regimens, administration of Pronase was associated with significantly improved visibility in the antrum and fundus, compared to those who did not use it; however, administration of N-acetyl-L-cysteine did not lead to significantly improved visibility. Simethicone offered better visibility than N-acetyl-L-cysteine and Pronase alone. There is improved visibility with pre-medication using Simethicone before esophagogastroduodenoscopy. In a Simethicone-based regimen, administration of Pronase or N-acetyl-L-cysteine may be of little use in improving visibility. Based on the literature review, Simethicone dissolved in water at a ratio as low as 0.7 can still offer good visibility, but 30 mL of water should be avoided.
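
    The weighted mean difference (WMD) reported above comes from inverse-variance pooling of per-study mean differences; a sketch with hypothetical study estimates, not the meta-analysis's actual data:

```python
import numpy as np

# Hypothetical per-study mean differences in visibility score and their standard errors
mean_diff = np.array([-4.8, -3.9, -4.5, -4.1])
se = np.array([0.6, 0.5, 0.8, 0.4])

# Fixed-effect inverse-variance pooling: weight each study by 1/SE^2
w = 1.0 / se**2
pooled = np.sum(w * mean_diff) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96*pooled_se, pooled + 1.96*pooled_se)
print(f"pooled WMD: {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```

    A CI entirely below zero, as in the abstract's -4.94 to -3.67, is what supports the conclusion that Simethicone improves visibility.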

  16. Does methylphenidate improve academic performance? A systematic review and meta-analysis.

    Science.gov (United States)

    Kortekaas-Rijlaarsdam, Anne Fleur; Luman, Marjolein; Sonuga-Barke, Edmund; Oosterlaan, Jaap

    2018-01-20

    Academic improvement is amongst the most common treatment targets when prescribing stimulants to children with ADHD. Previous reviews on stimulant-related academic improvements are inconclusive and focus on task engagement. Recent literature suggests outcome-domain-specific medication effects that are larger for productivity than for accuracy. The aims of this study were to quantify methylphenidate effects on academic productivity and accuracy for math, reading and spelling, and to explore the mediating or moderating effects of symptom improvements and demographic-, design- and disorder-related variables. PubMed, EMBASE, ERIC and PsycINFO were searched for articles reporting methylphenidate effects on academic productivity and accuracy. Thirty-four studies met entry criteria. Methylphenidate improved math productivity (7.8% increase, p < .001) and math accuracy (3.0% increase, p = .001), and increased reading speed (SMD .47, p < .001) but not reading accuracy. None of the mediators or moderators tested affected methylphenidate efficacy. Academic improvements were small compared to symptom improvements, and qualitative changes were limited to math. Clinicians should take this discrepancy into account when prescribing medication for ADHD.
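
    The standardised mean difference (SMD) reported for reading speed is typically computed as Cohen's d with a small-sample (Hedges) correction; a sketch with hypothetical group summaries, not the review's data:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardised mean difference with the small-sample (Hedges) correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1)*sd1**2 + (n2 - 1)*sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                     # Cohen's d
    j = 1 - 3 / (4*(n1 + n2) - 9)          # small-sample correction factor
    return j * d

# Hypothetical reading-speed scores: on-medication vs placebo groups
g = hedges_g(m1=105.0, sd1=14.0, n1=30, m2=98.0, sd2=15.0, n2=30)
print(f"Hedges g: {g:.2f}")
```

    Per-study SMDs computed this way are then pooled with inverse-variance weights to give the review-level estimate.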