Sample records for analysis improves COL4A5

  1. A Novel COL4A5 Mutation Identified in a Chinese Han Family Using Exome Sequencing

    Directory of Open Access Journals (Sweden)

    Xiaofei Xiu


    Alport syndrome (AS) is a monogenic disease of the basement membrane (BM), resulting in progressive renal failure due to glomerulonephropathy, variable sensorineural hearing loss, and ocular anomalies. It is caused by mutations in the collagen type IV alpha-3 gene (COL4A3), the collagen type IV alpha-4 gene (COL4A4), and the collagen type IV alpha-5 gene (COL4A5), which encode the type IV collagen α3, α4, and α5 chains, respectively. To explore the disease-related gene in a four-generation Chinese Han pedigree with AS, exome sequencing was conducted on the proband, and a novel deletion mutation, c.499delC (p.Pro167Glnfs*36), in the COL4A5 gene was identified. This mutation, absent from the 1000 Genomes Project, HapMap, dbSNP132, and YH1 databases and from 100 normal controls, cosegregated with the disease in the family. Neither sensorineural hearing loss nor typical COL4A5-related ocular abnormalities (dot-and-fleck retinopathy, anterior lenticonus, and the rare posterior polymorphous corneal dystrophy) were present in affected members of this family. The phenotype in this AS family was characterized by early onset and rapid progression to end-stage renal disease (ESRD). Our discovery broadens the mutation spectrum of the COL4A5 gene associated with AS and may also shed new light on genetic counseling for AS.

  2. A COL4A5 mutation with glomerular disease and signs of chronic thrombotic microangiopathy. (United States)

    Wuttke, Matthias; Seidl, Maximilian; Malinoc, Angelica; Prischl, Friedrich C; Kuehn, E Wolfgang; Walz, Gerd; Köttgen, Anna


    COL4A5 mutations are a known cause of Alport syndrome, which typically manifests with haematuria, hearing loss and ocular symptoms. Here we report on a 16-year-old male patient with a negative family history who presented with proteinuria, progressive renal failure and haemolysis, but without overt haematuria or hearing loss. A renal biopsy revealed features of atypical IgA nephropathy, while a second biopsy a year later showed features of focal segmental glomerulosclerosis, but was finally diagnosed as chronic thrombotic microangiopathy. Targeted sequencing of candidate genes for steroid-resistant nephrotic syndrome and congenital thrombotic microangiopathy was negative. Despite all therapeutic efforts, including angiotensin-converting enzyme inhibition, immunosuppressive therapy, plasma exchanges and rituximab, the patient progressed to end-stage renal disease. When a male cousin presented with nephrotic syndrome years later, whole-exome sequencing identified a shared disruptive COL4A5 mutation (p.F222C) that showed X-linked segregation. Thus, mutations in COL4A5 give rise to a broader spectrum of clinical presentation than commonly suspected, highlighting the benefits of comprehensive rather than candidate genetic testing in young patients with otherwise unexplained glomerular disease. Our results are in line with an increasing number of atypical presentations of single-gene disorders identified through genome-wide sequencing.

  3. Mutations in the codon for a conserved arginine-1563 in the COL4A5 collagen gene in Alport syndrome

    DEFF Research Database (Denmark)

    Zhou, J; Gregory, M C; Hertz, Jens Michael


    We have screened 110 unrelated Alport syndrome kindreds for mutations in the exon 48 region of the COL4A5 collagen gene. Denaturing gradient gel electrophoresis (DGGE) of the PCR-amplified region of exon 48 revealed sequence variants in DNA from affected males and carriers of three unrelated kind...

  4. High mutation detection rate in the COL4A5 collagen gene in suspected Alport syndrome using PCR and direct DNA sequencing

    DEFF Research Database (Denmark)

    Martin, P; Heiskari, N; Zhou, J


    Approximately 85% of patients with Alport syndrome (hereditary nephritis) have been estimated to have mutations in the X chromosomal COL4A5 collagen gene; the remaining cases are autosomal with mutations in the COL4A3 or COL4A4 genes located on chromosome 2. In the present work, the promoter...

  5. A nonsense mutation in the COL4A5 collagen gene in a family with X-linked juvenile Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Heiskari, N; Zhou, J;


    . The mutation was found to co-segregate with the disease in the family. The information of the sequence variation in this family was used to perform carrier detection and prenatal diagnosis by allele-specific oligonucleotide hybridization analysis and direct sequencing of PCR amplified exon 47. Prenatal...... diagnosis on chorionic villi tissue, obtained from one of the female carriers in the family, revealed a male fetus hemizygous for the mutated allele. A subsequent prenatal test in her next pregnancy revealed a normal male fetus. Prenatal diagnosis of Alport syndrome has not previously been reported....

  6. Detection of mutations in the COL4A5 gene by SSCP in X-linked Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Juncker, I; Persson, U


    , three in-frame deletions, four nonsense mutations, and six splice site mutations. Twenty-two of the mutations have not previously been reported. Furthermore, we found one non-pathogenic amino acid substitution, one rare variant in a non-coding region, and one polymorphism with a heterozygosity of 28...

  7. Improved Intermittency Analysis of Single Event Data


    Janik, R. A.; Ziaja, B.


    The intermittency analysis of single event data (particle moments) in multiparticle production is improved, taking into account corrections due to the reconstruction of the history of a particle cascade. This approach is tested within the framework of the α-model.
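Intermittency studies of this kind rest on scaled factorial moments computed over phase-space bins. As a rough illustration only (a plain horizontally averaged moment for one event, not the authors' corrected cascade-reconstruction method):

```python
import numpy as np

def scaled_factorial_moment(positions, q, n_bins):
    """Horizontally averaged scaled factorial moment F_q for one event.

    positions: 1-D array of particle coordinates, rescaled to [0, 1).
    q: order of the moment.
    n_bins: number of equal phase-space bins M.
    """
    counts, _ = np.histogram(positions, bins=n_bins, range=(0.0, 1.0))
    # Falling factorial n (n-1) ... (n-q+1) per bin
    fact = np.ones_like(counts, dtype=float)
    for k in range(q):
        fact *= np.clip(counts - k, 0, None)
    n_mean = counts.mean()
    if n_mean == 0:
        return 0.0
    return fact.mean() / n_mean**q

# A uniformly distributed event shows no intermittency: F_2 stays near 1.
rng = np.random.default_rng(0)
event = rng.random(10_000)
f2 = scaled_factorial_moment(event, q=2, n_bins=10)
```

Clustered (intermittent) events drive F_q well above 1, which is what the bin-by-bin analysis detects.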

  8. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasoor;


    We present some improved analytical results as part of the ongoing work on the analysis of the Fugue-256 hash function, a second-round candidate in NIST's SHA-3 competition. First we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the final transformation of Fugue-256 to 16...

  9. Conducting a SWOT Analysis for Program Improvement (United States)

    Orr, Betsy


    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  10. Improving transient analysis technology for aircraft structures (United States)

    Melosh, R. J.; Chargin, Mladen


    Aircraft dynamic analyses are demanding of computer simulation capabilities. The modeling complexities of semi-monocoque construction, irregular geometry, high-performance materials, and high-accuracy analysis are present. At issue are the safety of the passengers and the integrity of the structure for a wide variety of flight-operating and emergency conditions. The technology which supports engineering of aircraft structures using computer simulation is examined. Available computer support is briefly described and improvement of accuracy and efficiency are recommended. Improved accuracy of simulation will lead to a more economical structure. Improved efficiency will result in lowering development time and expense.


    Institute of Scientific and Technical Information of China (English)

    WU Long-hua


    In a Digital Particle Image Velocimetry (DPIV) system, the correlation of digital images is normally used to acquire the displacement information of particles and give estimates of the flow field. The accuracy and robustness of the correlation algorithm directly affect the validity of the analysis result. In this article, an improved algorithm for the correlation analysis is proposed that optimizes the selection of the correlation window, analysis area, and search path. This algorithm not only greatly reduces the amount of calculation, but also effectively improves the accuracy and reliability of the correlation analysis. The algorithm was demonstrated to be accurate and efficient in the measurement of the velocity field in a flocculation pool.
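The correlation step this abstract refers to is conventionally an FFT-based cross-correlation of interrogation windows, with the correlation peak giving the particle displacement. A minimal sketch of that standard step (not the paper's improved window and search-path selection):

```python
import numpy as np

def window_displacement(win_a, win_b):
    """Estimate the integer-pixel displacement between two interrogation
    windows by locating the peak of their FFT-based cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Circular cross-correlation via the Fourier transform
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)

# Synthetic check: shift a random particle image by (3, 5) pixels.
rng = np.random.default_rng(1)
frame = rng.random((64, 64))
shifted = np.roll(frame, shift=(3, 5), axis=(0, 1))
dx, dy = window_displacement(frame, shifted)
```

Real DPIV codes add sub-pixel peak interpolation and window overlap; this sketch shows only the core correlation estimate.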

  12. Improving information retrieval in functional analysis. (United States)

    Rodriguez, Juan C; González, Germán A; Fresno, Cristóbal; Llera, Andrea S; Fernández, Elmer A


    Transcriptome analysis is essential to understand the mechanisms regulating key biological processes and functions. The first step usually consists of identifying candidate genes; to find out which pathways are affected by those genes, however, functional analysis (FA) is mandatory. The most frequently used strategies for this purpose are Gene Set and Singular Enrichment Analysis (GSEA and SEA) over Gene Ontology. Several statistical methods have been developed and compared in terms of computational efficiency and/or statistical appropriateness. However, whether their results are similar or complementary, the sensitivity to parameter settings, or possible bias in the analyzed terms has not been addressed so far. Here, two GSEA and four SEA methods and their parameter combinations were evaluated in six datasets by comparing two breast cancer subtypes with well-known differences in genetic background and patient outcomes. We show that GSEA and SEA lead to different results depending on the chosen statistic, model and/or parameters. Both approaches provide complementary results from a biological perspective. Hence, an Integrative Functional Analysis (IFA) tool is proposed to improve information retrieval in FA. It provides a common gene expression analytic framework that grants a comprehensive and coherent analysis. Only a minimal user parameter setting is required, since the best SEA/GSEA alternatives are integrated. IFA utility was demonstrated by evaluating four prostate cancer and the TCGA breast cancer microarray datasets, which showed its biological generalization capabilities.
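The SEA side of such pipelines usually reduces to an over-representation test: given a candidate gene list and a term's gene set, a one-sided hypergeometric p-value. A generic sketch (the counts below are hypothetical, not from the paper):

```python
from scipy.stats import hypergeom

def enrichment_pvalue(n_universe, n_term, n_selected, n_overlap):
    """One-sided hypergeometric p-value that a selected gene list overlaps
    a functional term's gene set at least n_overlap times by chance."""
    # P(X >= n_overlap), X ~ Hypergeom(M=n_universe, n=n_term, N=n_selected)
    return hypergeom.sf(n_overlap - 1, n_universe, n_term, n_selected)

# Hypothetical term: 40 of 20,000 genes; 15 of our 100 candidates hit it.
p = enrichment_pvalue(20_000, 40, 100, 15)
```

GSEA instead ranks the whole expression profile rather than thresholding a candidate list, which is one reason the two families of methods can disagree, as the study reports.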

  13. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski


    Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture model". Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big-picture model" improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  14. Improved Tiled Bitmap Forensic Analysis Algorithm

    Directory of Open Access Journals (Sweden)

    C. D. Badgujar, G. N. Dhanokar


    In the computer network world, the need for security and proper systems of control is obvious, as is the need to find out which intruders have modified data. Nowadays, frauds that occur in companies are committed not only by outsiders but also by insiders. An insider may perform illegal activity and try to hide it. Companies would like to be assured that such illegal activity, i.e. tampering, has not occurred, or that if it does, it is quickly discovered. Mechanisms now exist that detect tampering of a database through the use of cryptographically strong hash functions. This paper contains a survey which explores various approaches to database forensics through different methodologies using forensic algorithms and tools for investigations. Forensic analysis algorithms are used to determine who tampered with data, when, and what data had been tampered with. The Tiled Bitmap Algorithm introduces the notion of a candidate set (all possible locations of detected tampering(s)) and provides a complete characterization of the candidate set and its cardinality. The improved tiled bitmap algorithm overcomes the drawbacks of the existing tiled bitmap algorithm.
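The tamper-detection substrate these algorithms build on is cryptographic hashing of audit data. A minimal hash-chain sketch of the idea (illustrative only; the Tiled Bitmap Algorithm itself hashes tiled groups of transactions so it can localize where and when tampering occurred):

```python
import hashlib

def chain_hashes(records):
    """Cumulative hash chain: each link commits to the record and all
    previous links, so a later modification changes every later hash."""
    h = b""
    links = []
    for rec in records:
        h = hashlib.sha256(h + rec.encode()).digest()
        links.append(h)
    return links

def first_tampered(records, trusted_links):
    """Recompute the chain and report the index of the first mismatch,
    or None if the records still match the trusted links."""
    for i, (link, trusted) in enumerate(zip(chain_hashes(records), trusted_links)):
        if link != trusted:
            return i
    return None

log = ["alice:100", "bob:250", "carol:75"]
trusted = chain_hashes(log)   # notarized hashes, stored out of reach
log[1] = "bob:999"            # insider tampering
```

A plain chain only bounds the earliest tampered position; the candidate-set machinery described above exists precisely to characterize all possible tampering locations more tightly.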


    Directory of Open Access Journals (Sweden)

    Alin Molcut


    Sport organizations exist to perform tasks that can only be executed through cooperative effort, and sport management is responsible for the performance and success of these organizations. The main aim of the paper is to analyze several issues in the management of sports organizations in order to assess their quality of management. In this respect, a questionnaire was designed for performing a survey analysis through a statistical approach. The investigation was conducted over a period of 3 months, questioning a number of managers and coaches of football clubs in the counties of Timis and Arad active at the level of training for children and juniors. The results suggest that there is significant interest in the improvement of management across children's teams and under-21 clubs, with emphasis on players' participation and rewarding performance. Furthermore, we can state that the sports clubs have established a vision and a mission, as well as general objectives that refer to both sporting performance and financial performance.

  16. Improved security analysis of Fugue-256

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Bagheri, Nasour; Knudsen, Lars Ramkilde;


    in the G transform. Next we improve the designers' meet-in-the-middle preimage attack on Fugue-256 from 2^480 time and memory to 2^416. Next we study the security of Fugue-256 against free-start distinguishers and free-start collisions. In this direction, we use an improved variant of the differential...


  17. Systems Improved Numerical Fluids Analysis Code (SINFAC) (United States)
    Costello, F. A.


    The Systems Improved Numerical Fluids Analysis Code, SINFAC, consists of additional routines added to the April 1983 revision of SINDA, a general thermal analyzer program. The purpose of the additional routines is to allow for the modeling of active heat transfer loops. The modeler can simulate the steady-state and pseudo-transient operations of 16 different heat transfer loop components including radiators, evaporators, condensers, mechanical pumps, reservoirs and many types of valves and fittings. In addition, the program contains a property analysis routine that can be used to compute the thermodynamic properties of 20 different refrigerants. SINFAC can simulate the response to transient boundary conditions. SINFAC was first developed as a method for computing the steady-state performance of two-phase systems. It was then modified using CNFRWD, SINDA's explicit time-integration scheme, to accommodate transient thermal models. However, SINFAC cannot simulate pressure drops due to time-dependent fluid acceleration, transient boil-out, or transient fill-up, except in the accumulator. SINFAC also requires the user to be familiar with SINDA. The solution procedure used by SINFAC is similar to that which an engineer would use to solve a system manually. The solution to a system requires the determination of all of the outlet conditions of each component such as the flow rate, pressure, and enthalpy. To obtain these values, the user first estimates the inlet conditions to the first component of the system, then computes the outlet conditions from the data supplied by the manufacturer of the first component. The user then estimates the temperature at the outlet of the third component and computes the corresponding flow resistance of the second component. With the flow resistance of the second component, the user computes the conditions downstream, namely the inlet conditions of the third component. The computations follow for the rest of the system, back to the first component.

  18. Novel analysis and improvement of Yahalom protocol

    Institute of Scientific and Technical Information of China (English)

    CHEN Chun-ling; YU Han; LÜ Heng-shan; WANG Ru-chuan


    The modified version of the Yahalom protocol improved by Burrows, Abadi, and Needham (BAN) still has security drawbacks. This study analyzed such flaws in detail from the viewpoint of strand spaces, a novel method of analyzing a protocol's security. First, a mathematical model of the BAN-Yahalom protocol is constructed. Second, penetrators' abilities are restricted with a rigorous and formalized definition. Moreover, to increase the security of this protocol against potential attackers in practice, a further improvement is made to the protocol. Future applications of this re-improved protocol are also discussed.

  19. Improving Public Perception of Behavior Analysis. (United States)

    Freedman, David H


    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.

  20. Improved Spectrum Analysis Noise Radar Systems. (United States)

    and evaluated. A new spectrum analysis system designed to detect moving targets is presented. Comparison is made of the detection capabilities of all four noise radar systems in the presence of extraneous noise. (Author)

  1. Liquefaction mathematical analysis for improvement structures stability

    Directory of Open Access Journals (Sweden)

    Azam Khodashenas Pelko


    The stability of any structure is possible only if its foundation is appropriately designed. Bandar Abbas is the largest and most important port of Iran; given the high seismicity of this territory and the strong earthquakes that occur there, the soil mechanical properties of different parts of the city were selected as the subject of the current research. Data relating to the design of foundations for the improvement of structures at different layers of subsoil were collected and, accordingly, soil mechanical properties were evaluated. The results of laboratory experiments can be used for evaluation of the geotechnical characteristics of the urban area for developing a region with a high level of structural stability. Ultimately, a new method for calculation of liquefaction force is suggested. It is applicable to improving geotechnical and structural codes and also to the reanalysis of the structural stability of previously constructed buildings.

  2. The Use of Item Analysis for Improvement of Biochemical Teaching (United States)

    Nagata, Ryoichi


    Item analysis was used to find out which biochemical explanations need to be improved in biochemical teaching, not which items are to be discarded, improved, or reused in biochemical examinations. The analysis revealed the basic facts about which less able students had more misunderstandings than able students. Identifying these basic facts helps…

  3. Burning analysis on the improved confinement mode

    Energy Technology Data Exchange (ETDEWEB)

    Tateishi, Gonta [Interdisciplinary Graduate School of Engineering Science, Kyushu Univ., Kasuga, Fukuoka (Japan); Yagi, Masatoshi; Itoh, Sanae-I.


    A 1-D transport code is used to examine the ignition of plasma in the improved confinement mode and the impact of profile effects on burning performance. Energy transport, He-ash particle transport, and poloidal magnetic field transport equations are solved with a thermal diffusivity from the current-diffusive ballooning mode model. The ratio of the thermal diffusivity to the He-ash diffusivity is introduced as a parameter and assumed to be constant. For a fixed current profile, the existence of the ignited state is shown. An internal transport barrier is formed autonomously even if parameters lie in the L-mode boundary condition. It is found that the sensitivity of the ignition condition to the density is strong, and there is no margin of ignition at the density limit when the density profile is flat. However, if a peaked density profile is chosen, solutions which satisfy the density limit exist. The long-time sustainment of ignition is also shown by solving the poloidal magnetic field transport simultaneously. It is shown that the ignition is sustained within the time scale of the burn time; however, MHD stability should be considered on the time scale of current diffusion. (author)

  4. Adapting Job Analysis Methodology to Improve Evaluation Practice (United States)

    Jenkins, Susan M.; Curtin, Patrick


    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  5. Improved time complexity analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten


    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm with population size μ≤n1/8−ε requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations...
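For readers unfamiliar with the algorithm under analysis: the Simple GA combines fitness-proportional selection, one-point crossover, and standard bit mutation, and OneMax simply counts one-bits. A minimal sketch (parameter values are illustrative, not those used in the proof):

```python
import random

def onemax(bits):
    """OneMax fitness: number of one-bits."""
    return sum(bits)

def sga_generation(pop, pc=0.9, pm=None):
    """One generation of the Simple GA: fitness-proportional selection,
    one-point crossover with probability pc, bit-flip mutation rate 1/n."""
    n = len(pop[0])
    pm = pm if pm is not None else 1.0 / n
    weights = [onemax(ind) + 1e-9 for ind in pop]  # avoid all-zero weights
    nxt = []
    while len(nxt) < len(pop):
        p1, p2 = random.choices(pop, weights=weights, k=2)
        if random.random() < pc:
            cut = random.randrange(1, n)
            c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
        else:
            c1, c2 = p1[:], p2[:]
        for child in (c1, c2):
            nxt.append([b ^ (random.random() < pm) for b in child])
    return nxt[:len(pop)]

random.seed(42)
n, mu = 20, 30
pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
for _ in range(50):
    pop = sga_generation(pop)
best = max(onemax(ind) for ind in pop)
```

The runtime analyses discussed here concern how the required number of such generations scales with n and the population size μ, not the behavior of any single toy run.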

  6. Improvements to the CONTINUE feature in transient analysis (United States)

    Pamidi, P. R.


    The CONTINUE feature in transient analysis as implemented in the standard release of COSMIC/NASTRAN has inherent errors associated with it. As a consequence, the results obtained by a CONTINUEd restart run do not, in general, match the results that would be obtained in a single run without the CONTINUE feature. These inherent errors were eliminated by improvements to the restart logic that were developed by RPK Corporation and that are available on all RPK-supported versions of COSMIC/NASTRAN. These improvements ensure that the results of a CONTINUEd transient analysis run are the same as those of a non-CONTINUEd run. In addition, the CONTINUE feature was extended to transient analysis involving uncoupled modal equations. The improvements and enhancement were illustrated by examples.

  7. Improved Runtime Analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten


    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of our previous one. Firstly...... improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented to explore the limits...

  8. Improved time complexity analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten


    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm with population size μ≤n1/8−ε requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations...... this is a major improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented to explore...

  9. Improving Cluster Analysis with Automatic Variable Selection Based on Trees (United States)


    Master's thesis by Anton D. Orr, December 2014; Thesis Advisor: Samuel E. Buttrey. …2006 based on classification and regression trees to address problems with determining dissimilarity. Current algorithms do not simultaneously address

  10. The National Treatment Improvement Evaluation Study: Retention Analysis. (United States)

    Orwin, Rob; Williams, Valerie

    This study focuses on programmatic factors that predict retention for individuals in drug and alcohol treatment programs through secondary analysis of data from the National Treatment Improvement Evaluation Study (NTIES). It addresses the relationships between completion rates, lengths of stay, and treatment modality. It examines the effect of…

  11. Thermal hydraulic analysis of the JMTR improved LEU-core

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, Toshio; Nagao, Yoshiharu; Komukai, Bunsaku; Naka, Michihiro; Fujiki, Kazuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Takeda, Takashi [Radioactive Waste Management and Nuclear Facility Decommissioning Technology Center, Tokai, Ibaraki (Japan)


    After investigation of a new core arrangement for the JMTR reactor intended to enhance fuel burn-up and consequently extend the operation period, the "improved LEU core", which utilizes 2 additional fuel elements in place of formerly installed reflector elements, was adopted. This report describes the results of the thermal-hydraulic analysis of the improved LEU core as a part of the safety analysis for licensing. The analysis covers the steady state, abnormal operational transients, and accidents, which were described in the annexes of the licensing documents as design basis events. Calculation conditions for the computer codes were conservatively determined based on the neutronic analysis results and other data. The results of the analysis, which showed that the safety criteria on fuel temperature, DNBR, and primary coolant temperature were satisfied, were used in the licensing. The operation license of the JMTR with the improved LEU core was granted in March 2001, and reactor operation with the new core started in November 2001 as the 142nd operation cycle. (author)

  12. Using Operational Analysis to Improve Access to Pulmonary Function Testing

    Directory of Open Access Journals (Sweden)

    Ada Ip


    Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilization of testing resources. Results. Qualitative analysis demonstrated that stakeholder groups had discrepant views on access and capacity in the laboratory. Mean daily resource utilization was 0.64 (SD 0.15), with monthly average utilization consistently less than 0.75. Reserved testing slots for subspecialty clinics were poorly utilized, leaving many testing slots unfilled. When subspecialty demand exceeded number of reserved slots, there was sufficient capacity in the pulmonary function schedule to accommodate added demand. Findings were shared with stakeholders and influenced scheduling process improvements. Conclusion. This study highlights the importance of operational data to identify causes of poor access, guide system decision-making, and determine effects of improvement initiatives in a variety of healthcare settings. Importantly, simple operational analysis can help to improve efficiency of health systems with little or no added financial investment.

  13. Using Operational Analysis to Improve Access to Pulmonary Function Testing. (United States)

    Ip, Ada; Asamoah-Barnieh, Raymond; Bischak, Diane P; Davidson, Warren J; Flemons, W Ward; Pendharkar, Sachin R


    Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilization of testing resources. Results. Qualitative analysis demonstrated that stakeholder groups had discrepant views on access and capacity in the laboratory. Mean daily resource utilization was 0.64 (SD 0.15), with monthly average utilization consistently less than 0.75. Reserved testing slots for subspecialty clinics were poorly utilized, leaving many testing slots unfilled. When subspecialty demand exceeded number of reserved slots, there was sufficient capacity in the pulmonary function schedule to accommodate added demand. Findings were shared with stakeholders and influenced scheduling process improvements. Conclusion. This study highlights the importance of operational data to identify causes of poor access, guide system decision-making, and determine effects of improvement initiatives in a variety of healthcare settings. Importantly, simple operational analysis can help to improve efficiency of health systems with little or no added financial investment.
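The utilization figures reported here are plain demand-to-capacity ratios. A toy calculation with hypothetical numbers (16 slots per day is an assumption for illustration, not a figure from the study):

```python
def daily_utilization(booked_slots, capacity_slots):
    """Utilization of a testing resource for one day."""
    return booked_slots / capacity_slots

def mean_utilization(daily):
    """Average utilization over a reporting period, the kind of summary
    behind the study's mean daily utilization of 0.64."""
    return sum(daily) / len(daily)

# Hypothetical week: 16 testing slots per day, variable bookings.
booked = [10, 12, 9, 11, 10]
util = [daily_utilization(b, 16) for b in booked]
mean = mean_utilization(util)
```

Sustained means well below 1.0, as in the study, indicate that perceived access problems stem from scheduling rules (e.g. reserved but unused slots) rather than raw capacity.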

  14. Improvement of QR Code Recognition Based on Pillbox Filter Analysis


    Jia-Shing Sheu; Kai-Chung Teng


    The objective of this paper is to improve the recognition of captured QR code images blurred by defocus, through Pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality. Focus is an important factor that affects the quality of the image. This study discusses the out-of-focus QR code image and aims to improve the recognition of the conte...
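Out-of-focus blur is conventionally modeled as convolution with a pillbox (uniform disk) kernel, which is presumably the filter the title refers to. A sketch of constructing such a kernel (a generic model, not the paper's recognition-improvement method):

```python
import numpy as np

def pillbox_kernel(radius):
    """Normalized pillbox (uniform disk) kernel of the given pixel radius,
    the standard model of out-of-focus blur."""
    size = 2 * radius + 1
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disk = (x**2 + y**2 <= radius**2).astype(float)
    return disk / disk.sum()  # weights sum to 1, preserving brightness

k = pillbox_kernel(3)  # 7x7 kernel for a defocus radius of 3 pixels
```

Convolving a sharp QR code image with this kernel simulates defocus, which is how deblurring or recognition methods are typically evaluated against known blur radii.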

  15. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi


    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  16. Process correlation analysis model for process improvement identification. (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong


    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and, based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in these process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  17. Spiral analysis-improved clinical utility with center detection. (United States)

    Wang, Hongzhi; Yu, Qiping; Kurtis, Mónica M; Floyd, Alicia G; Smith, Whitney A; Pullman, Seth L


    Spiral analysis is a computerized method that measures human motor performance from handwritten Archimedean spirals. It quantifies normal motor activity, and detects early disease as well as dysfunction in patients with movement disorders. The clinical utility of spiral analysis is based on kinematic and dynamic indices derived from the original spiral trace, which must be detected and transformed into mathematical expressions with great precision. Accurately determining the center of the spiral and reducing spurious low frequency noise caused by center selection error is important to the analysis. Handwritten spirals do not all start at the same point, even when marked on paper, and drawing artifacts are not easily filtered without distortion of the spiral data and corruption of the performance indices. In this report, we describe a method for detecting the optimal spiral center and reducing the unwanted drawing artifacts. To demonstrate overall improvement to spiral analysis, we study the impact of the optimal spiral center detection in different frequency domains separately and find that it notably improves the clinical spiral measurement accuracy in low frequency domains.
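The idea of choosing a spiral center that minimizes spurious low-frequency error can be sketched by unwrapping the trace about each candidate center and scoring how well r(θ) fits an ideal Archimedean line r = a + bθ. The grid-search stand-in below is illustrative, not the authors' algorithm:

```python
import math

def spiral_residual(points, cx, cy):
    # Unwrap the drawn trace about candidate center (cx, cy) and measure
    # how far r(theta) deviates from the ideal Archimedean spiral
    # r = a + b*theta (sum of squared residuals of a least-squares line).
    rs, ts = [], []
    prev, turns = None, 0
    for x, y in points:
        t = math.atan2(y - cy, x - cx)
        if prev is not None and t < prev - math.pi:
            turns += 1  # crossed the pi/-pi branch cut counterclockwise
        prev = t
        ts.append(t + 2 * math.pi * turns)
        rs.append(math.hypot(x - cx, y - cy))
    n = len(ts)
    tm, rm = sum(ts) / n, sum(rs) / n
    b = sum((t - tm) * (r - rm) for t, r in zip(ts, rs)) / \
        sum((t - tm) ** 2 for t in ts)
    a = rm - b * tm
    return sum((r - (a + b * t)) ** 2 for t, r in zip(ts, rs))

def best_center(points, candidates):
    # Optimal center = candidate whose unwrapped trace is most spiral-like.
    return min(candidates, key=lambda c: spiral_residual(points, c[0], c[1]))
```

A perfect spiral drawn about (1, 2) yields a near-zero residual at the true center and larger residuals at displaced candidates, so `best_center` recovers (1, 2).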

  18. Improved Methods for the Enrichment and Analysis of Glycated Peptides

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Qibin; Schepmoes, Athena A; Brock, Jonathan W; Wu, Si; Moore, Ronald J; Purvine, Samuel O; Baynes, John; Smith, Richard D; Metz, Thomas O


    Non-enzymatic glycation of tissue proteins has important implications in the development of complications of diabetes mellitus. Herein we report improved methods for the enrichment and analysis of glycated peptides using boronate affinity chromatography and electron transfer dissociation mass spectrometry, respectively. The enrichment of glycated peptides was improved by replacing an off-line desalting step with an on-line wash of column-bound glycated peptides using 50 mM ammonium acetate. The analysis of glycated peptides by MS/MS was improved by considering only higher charged (≥3) precursor-ions during data-dependent acquisition, which increased the number of glycated peptide identifications. Similarly, the use of supplemental collisional activation after electron transfer (ETcaD) resulted in more glycated peptide identifications when the MS survey scan was acquired with enhanced resolution. In general, acquiring ETD-MS/MS data at a normal MS survey scan rate, in conjunction with the rejection of both 1+ and 2+ precursor-ions, increased the number of identified glycated peptides relative to ETcaD or the enhanced MS survey scan rate. Finally, an evaluation of trypsin, Arg-C, and Lys-C showed that tryptic digestion of glycated proteins was comparable to digestion with Lys-C and that both were better than Arg-C in terms of the number of glycated peptides identified by LC-MS/MS.
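The charge-state rejection described above amounts to a simple filter in the data-dependent acquisition logic; a schematic sketch (function and field names are hypothetical, not any vendor's API):

```python
def select_precursors(precursors, min_charge=3):
    # Keep only precursor ions with charge >= min_charge (i.e. reject 1+
    # and 2+ species, which yield fewer useful ETD identifications here),
    # then order the survivors by intensity for data-dependent MS/MS.
    kept = [p for p in precursors if p["charge"] >= min_charge]
    return sorted(kept, key=lambda p: p["intensity"], reverse=True)
```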

  19. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu


    Full Text Available The objective of this paper is to perform the innovation design for improving the recognition of a captured QR code image with blur through the Pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as the low quality of the image. Focus is an important factor that affects the quality of the image. This study discusses the out-of-focus QR code image and aims to improve the recognition of the contents in the QR code image. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image. This method is also used in this investigation to improve the recognition of a captured QR code image. A blurred QR code image is separated into nine levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image. The QR code images reconstructed using these methods are then compared. The final experimental results indicate improvements in identification.
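A pillbox (circular averaging) filter is a disc-shaped kernel of uniform weights normalized to sum to one; a minimal sketch of building such a kernel and blurring an image with it (a naive stand-in for the paper's out-of-focus simulation, not its reconstruction method):

```python
def pillbox_kernel(radius):
    # Uniform weight inside a disc of the given radius, zero outside,
    # normalized so the kernel sums to 1 (preserves average brightness).
    size = 2 * radius + 1
    mask = [[1.0 if (i - radius) ** 2 + (j - radius) ** 2 <= radius ** 2
             else 0.0 for j in range(size)] for i in range(size)]
    total = sum(map(sum, mask))
    return [[v / total for v in row] for row in mask]

def blur(image, kernel):
    # Naive 2-D convolution with zero padding: simulates defocus blur.
    r = len(kernel) // 2
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx] * kernel[dy + r][dx + r]
            out[y][x] = acc
    return out
```

Increasing the kernel radius corresponds to the deeper out-of-focus "levels" the paper grades the blurred images into.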

  20. Improved spectrum simulation for validating SEM-EDS analysis (United States)

    Statham, P.; Penman, C.; Duncumb, P.


    X-ray microanalysis by SEM-EDS requires corrections for the many physical processes that affect emitted intensity for elements present in the material. These corrections will only be accurate provided a number of conditions are satisfied and it is essential that the correct elements are identified. As analysis is pushed to achieve results on smaller features and more challenging samples it becomes increasingly difficult to determine if all conditions are upheld and whether the analysis results are valid. If a theoretical simulated spectrum based on the measured analysis result is compared with the measured spectrum, any marked differences will indicate problems with the analysis and can prevent serious mistakes in interpretation. To achieve the necessary accuracy a previous theoretical model has been enhanced to incorporate new line intensity measurements, differential absorption and excitation of emission lines, including the effect of Coster-Kronig transitions and an improved treatment of bremsstrahlung for compounds. The efficiency characteristic has been measured for a large area SDD detector and data acquired from an extensive set of standard materials at both 5 kV and 20 kV. The parameterized model has been adjusted to fit measured characteristic intensities and both background shape and intensity at the same beam current. Examples are given to demonstrate how an overlay of an accurate theoretical simulation can expose some non-obvious mistakes and provide some expert guidance towards a valid analysis result. A new formula for calculating the effective mean atomic number for compounds has also been derived that is appropriate and should help improve accuracy in techniques that calculate the bremsstrahlung or use a bremsstrahlung measurement for calibration.

  1. Improved generalized cell mapping for global analysis of dynamical systems

    Institute of Scientific and Technical Information of China (English)


    Three main parts of generalized cell mapping are improved for global analysis. A simple method, which is not based on the theory of digraphs, is presented to locate complete self-cycling sets that correspond to attractors and unstable invariant sets involving saddle, unstable periodic orbit and chaotic saddle. Refinement for complete self-cycling sets is developed to locate attractors and unstable invariant sets with high degree of accuracy, which can start with a coarse cell structure. A nonuniformly interior-and-boundary sampling technique is used to make the refinement robust. For homeomorphic dissipative dynamical systems, a controlled boundary sampling technique is presented to make generalized cell mapping method with refinement extremely accurate to obtain invariant sets. Recursive laws of group absorption probability and expected absorption time are introduced into generalized cell mapping, and then an optimal order for quantitative analysis of transient cells is established, which leads to the minimal computational work. The improved method is applied to four examples to show its effectiveness in global analysis of dynamical systems.

  2. Improved generalized cell mapping for global analysis of dynamical systems

    Institute of Scientific and Technical Information of China (English)

    ZOU HaiLin; XU JianXue


    Three main parts of generalized cell mapping are improved for global analysis. A simple method, which is not based on the theory of digraphs, is presented to locate complete self-cycling sets that correspond to attractors and unstable invariant sets involving saddle, unstable periodic orbit and chaotic saddle. Refinement for complete self-cycling sets is developed to locate attractors and unstable invariant sets with high degree of accuracy, which can start with a coarse cell structure. A nonuniformly interior-and-boundary sampling technique is used to make the refinement robust. For homeomorphic dissipative dynamical systems, a controlled boundary sampling technique is presented to make generalized cell mapping method with refinement extremely accurate to obtain invariant sets. Recursive laws of group absorption probability and expected absorption time are introduced into generalized cell mapping, and then an optimal order for quantitative analysis of transient cells is established, which leads to the minimal computational work. The improved method is applied to four examples to show its effectiveness in global analysis of dynamical systems.
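The group absorption probabilities mentioned above satisfy a simple recursion over the cell-to-cell transition matrix: for a transient cell, the probability equals the transition-weighted average of its successors' probabilities. An illustrative fixed-point sketch (not the paper's optimized ordering):

```python
def absorption_probabilities(P, absorbing, iters=500):
    # P: row-stochastic cell-to-cell transition matrix (list of lists).
    # absorbing: set of cell indices forming the target self-cycling set.
    # Returns, per cell, the probability of eventual absorption into that
    # set, by fixed-point iteration of a_i = sum_j P[i][j] * a_j.
    n = len(P)
    a = [1.0 if i in absorbing else 0.0 for i in range(n)]
    for _ in range(iters):
        a = [1.0 if i in absorbing else
             sum(P[i][j] * a[j] for j in range(n)) for i in range(n)]
    return a
```

For a transient cell that moves to itself with probability 0.5 and to each of two absorbing cells with probability 0.25, the recursion converges to an absorption probability of 0.5 into either one.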

  3. Modified paraffin wax for improvement of histological analysis efficiency. (United States)

    Lim, Jin Ik; Lim, Kook-Jin; Choi, Jin-Young; Lee, Yong-Keun


    Paraffin wax is usually used as an embedding medium for histological analysis of natural tissue. However, it is not easy to obtain a sufficient number of satisfactory sectioned slices because of the difference in mechanical properties between the paraffin and the embedded tissue. We describe a modified paraffin wax, composed of paraffin and ethylene vinyl acetate (EVA) resin (0, 3, 5, and 10 wt %), that can improve the histological analysis efficiency of natural tissue. The softening temperature of the paraffin/EVA media was similar to that of paraffin (50-60 degrees C). The paraffin/EVA media dissolved completely in xylene after 30 min at 50 degrees C. Physical properties such as the load under the same compressive displacement, elastic recovery, and crystal intensity increased with increasing EVA content. The 5 wt % EVA medium was regarded as the optimal composition, based on the sectioning efficiency measured by the number of unimpaired sectioned slices, the load under the same compressive displacement, and the elastic recovery test. Based on hematoxylin and eosin (H&E), Masson trichrome (MT), and other staining tests of sectioned slices embedded in the 5 wt % EVA medium, it was concluded that the modified paraffin wax can improve histological analysis efficiency with various natural tissues.

  4. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization (United States)

    Gern, Frank H.


    This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are being presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel was used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.

  5. New Framework for Improving Big Data Analysis Using Mobile Agent

    Directory of Open Access Journals (Sweden)

    Youssef M. ESSA


    Full Text Available The rising number of applications serving millions of users and dealing with terabytes of data creates the need for faster processing paradigms. Recently, there is growing enthusiasm for the notion of big data analysis. Big data analysis has become a very important aspect of growing productivity, reliability and quality of services (QoS). Processing big data using a single powerful machine is not an efficient solution. So, companies have focused on using Hadoop software for big data analysis. This is because Hadoop is designed to support parallel and distributed data processing. Hadoop provides a distributed file processing system that stores and processes a large scale of data. It enables fault tolerance by replicating data on three or more machines to avoid data loss. Hadoop is based on the client-server model and uses a single master machine called the NameNode. However, Hadoop has several drawbacks affecting its performance and reliability against big data analysis. In this paper, a new framework is proposed to improve big data analysis and overcome the specified drawbacks of Hadoop. These drawbacks are replication tasks, the centralized node, and node failure. The proposed framework is called MapReduce Agent Mobility (MRAM). MRAM is developed by using the mobile agent and MapReduce paradigms under the Java Agent Development Framework (JADE).

  6. Analysis and improvement of the MAC shaping mechanism in RPR (United States)

    Li, Jing; Xu, Zhanqi; Yu, Shaohua


    Resilient Packet Ring (RPR), specified by IEEE 802.17, is a new standard for Metropolitan Area Networks (MANs). One of RPR's characteristics is that it can support three traffic priorities in a single datapath, i.e., class A, class B and class C, ranging from high to low priority, respectively. Different entities such as shaping, scheduling, fairness, topology and protection coordinate to guarantee the Quality of Service (QoS) for different services. The various pieces of the datapath in RPR are tied together through logical queues, so we investigate the datapath from the viewpoint of logical queues in this paper. With a detailed analysis of the MAC shaping mechanism in RPR, we propose some improvements to achieve better transport performance for RPR's three traffic priorities. Simulation results show that our improvements are efficient.

  7. Crystal quality analysis and improvement using x-ray topography.

    Energy Technology Data Exchange (ETDEWEB)

    Maj, J.; Goetze, K.; Macrander, A.; Zhong, Y.; Huang, X.; Maj, L.; Univ. of Chicago


    The Topography X-ray Laboratory of the Advanced Photon Source (APS) at Argonne National Laboratory operates as a collaborative effort with APS users to produce high-performance crystals for APS X-ray beamline experiments. For many years the topography laboratory has worked closely with an on-site optics shop to help ensure the production of crystals with the highest quality, most stress-free surface finish possible. It has been instrumental in evaluating and refining the methods used to produce high-quality crystals. Topographical analysis has been shown to be an effective method to quantify and determine the distribution of stresses, to help identify methods that would mitigate the stresses and improve the rocking curve, and to create CCD images of the crystal. This paper describes the topography process and offers methods for reducing crystal stresses in order to substantially improve the crystal optics.

  8. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang


    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify its critical infrastructures. This approach improves computational efficiency and allows for application to large-scale road networks. This research involves defining the concept of vulnerability, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  9. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn


    Full Text Available To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs). Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.
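Computationally, this gap analysis reduces to subtracting each factor's actual status from its rated importance and ranking by the difference; a minimal sketch (the ratings below are made up for illustration, not the survey's data):

```python
def gap_analysis(factors):
    # factors: {name: (importance, actual_status)} on a common rating scale.
    # Gap = importance - actual status; the largest gaps are the highest
    # priorities for improvement.
    gaps = {name: imp - act for name, (imp, act) in factors.items()}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)
```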

  10. Skill Gap Analysis for Improved Skills and Quality Deliverables

    Directory of Open Access Journals (Sweden)

    Mallikarjun Koripadu


    Full Text Available With growing pressure to identify skilled resources in the Clinical Data Management (CDM) world of clinical research organizations and to provide quality deliverables, most CDM organizations are planning to improve the skills within the organization. In the changing CDM landscape, the ability to build, manage and leverage the skills of clinical data managers is very critical and important. Within CDM, there is a need to proactively identify, analyze and address skill gaps for all the roles involved. In addition to domain skills, the evolving role of a clinical data manager demands diverse skill sets such as project management, six sigma, analytical, decision making and communication skills. This article proposes a methodology of skill gap analysis (SGA) management as one of the potential solutions to the big skill challenge that CDM is gearing up for, bridging the gap in skills. This would in turn strengthen CDM capability, scalability and consistency across geographies, along with improving the productivity and quality of deliverables.

  11. An improved convergence analysis of smoothed aggregation algebraic multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Brezina, Marian [Univ. of Colorado, Boulder, CO (United States). Dept. of Applied Mathematics; Vaněk, Petr [University of West Bohemia (Czech Republic). Dept. of Mathematics; Vassilevski, Panayot S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing


    We present an improved analysis of the smoothed aggregation (SA) algebraic multigrid method (AMG) extending the original proof in [SA] and its modification in [Va08]. The new result imposes fewer restrictions on the aggregates that makes it easier to verify in practice. Also, we extend a result in [Van] that allows us to use aggressive coarsening at all levels due to the special properties of the polynomial smoother, that we use and analyze, and thus provide a multilevel convergence estimate with bounds independent of the coarsening ratio.

  12. An Efficient and Configurable Preprocessing Algorithm to Improve Stability Analysis. (United States)

    Sesia, Ilaria; Cantoni, Elena; Cernigliaro, Alice; Signorile, Giovanna; Fantino, Gianluca; Tavella, Patrizia


    The Allan variance (AVAR) is widely used to measure the stability of experimental time series. Specifically, AVAR is commonly used in space applications such as monitoring the clocks of the global navigation satellite systems (GNSSs). In these applications, the experimental data present some peculiar aspects which are not generally encountered when the measurements are carried out in a laboratory. Space clocks' data can in fact present outliers, jumps, and missing values, which corrupt the clock characterization. Therefore, an efficient preprocessing is fundamental to ensure a proper data analysis and improve the stability estimation performed with the AVAR or other similar variances. In this work, we propose a preprocessing algorithm and its implementation in a robust software code (in MATLAB language) able to deal with time series of experimental data affected by nonstationarities and missing data; our method is properly detecting and removing anomalous behaviors, hence making the subsequent stability analysis more reliable.
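As a reference for the statistic being stabilized, the sketch below computes a plain non-overlapping Allan variance over block-averaged frequency data, preceded by a crude median-based outlier screen (a stand-in for the paper's preprocessing, which also handles jumps and missing values):

```python
def remove_outliers(y, k=5.0):
    # Crude robust screen: drop points more than k median-absolute-
    # deviations from the median. Illustrative only; the paper's algorithm
    # also detects jumps and fills missing data.
    med = sorted(y)[len(y) // 2]
    mad = sorted(abs(v - med) for v in y)[len(y) // 2]
    return list(y) if mad == 0 else [v for v in y if abs(v - med) <= k * mad]

def allan_variance(y, tau=1):
    # Non-overlapping Allan variance of fractional-frequency data y at an
    # averaging time of tau samples:
    #   AVAR = (1 / (2 * (M - 1))) * sum_i (ybar[i+1] - ybar[i])^2
    # where ybar are block averages of length tau.
    m = len(y) // tau
    ybar = [sum(y[i * tau:(i + 1) * tau]) / tau for i in range(m)]
    return sum((ybar[i + 1] - ybar[i]) ** 2
               for i in range(m - 1)) / (2 * (m - 1))
```

A single uncleaned outlier inflates the squared first differences and thus the AVAR, which is why the preprocessing step matters.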

  13. Multispectral fingerprinting for improved in vivo cell dynamics analysis

    Directory of Open Access Journals (Sweden)

    Cooper Cameron HJ


    Full Text Available Abstract Background Tracing cell dynamics in the embryo becomes tremendously difficult when cell trajectories cross in space and time and tissue density obscure individual cell borders. Here, we used the chick neural crest (NC as a model to test multicolor cell labeling and multispectral confocal imaging strategies to overcome these roadblocks. Results We found that multicolor nuclear cell labeling and multispectral imaging led to improved resolution of in vivo NC cell identification by providing a unique spectral identity for each cell. NC cell spectral identity allowed for more accurate cell tracking and was consistent during short term time-lapse imaging sessions. Computer model simulations predicted significantly better object counting for increasing cell densities in 3-color compared to 1-color nuclear cell labeling. To better resolve cell contacts, we show that a combination of 2-color membrane and 1-color nuclear cell labeling dramatically improved the semi-automated analysis of NC cell interactions, yet preserved the ability to track cell movements. We also found channel versus lambda scanning of multicolor labeled embryos significantly reduced the time and effort of image acquisition and analysis of large 3D volume data sets. Conclusions Our results reveal that multicolor cell labeling and multispectral imaging provide a cellular fingerprint that may uniquely determine a cell's position within the embryo. Together, these methods offer a spectral toolbox to resolve in vivo cell dynamics in unprecedented detail.

  14. Improved environmental multimedia modeling and its sensitivity analysis. (United States)

    Yuan, Jing; Elektorowicz, Maria; Chen, Zhi


    Modeling of multimedia environmental issues is extremely complex due to the intricacy of the systems and the many factors to be considered. In this study, an improved environmental multimedia model is developed, and a number of testing problems related to it are examined and compared with each other using standard numerical and analytical methodologies. The results indicate that the flux output of the new model is lower in the unsaturated zone and groundwater zone compared with the traditional environmental multimedia model. Furthermore, about 90% of the total benzene flux was distributed to the air zone from the landfill sources and only 10% of the total flux was emitted into the unsaturated and groundwater zones under non-uniform conditions. This paper also includes a model sensitivity analysis to optimize model parameters such as the Peclet number (Pe). The analysis results show that Pe can be considered a deterministic input variable for transport output. Oscillatory behavior is eliminated as Pe decreases. In addition, the numerical methods are more accurate than the analytical methods as Pe increases. In conclusion, the improved environmental multimedia model system and its sensitivity analysis can be used to address the complex fate and transport of pollutants in multimedia environments and thus help to manage environmental impacts.
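The Peclet number referred to above is the ratio of advective to dispersive transport; as a worked definition (symbols are generic, not tied to the paper's grid or zones):

```python
def peclet(velocity, length, dispersion):
    # Pe = v * L / D: advection dominates for Pe >> 1, dispersion for
    # Pe << 1. Consistent with the sensitivity results described above,
    # numerical schemes tend to oscillate less as Pe decreases.
    return velocity * length / dispersion
```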

  15. Skill analysis part 3: improving a practice skill. (United States)

    Price, Bob

    In this, the third and final article in a series on practice skill analysis, attention is given to imaginative ways of improving a practice skill. Having analysed and evaluated a chosen skill in the previous two articles, it is time to look at new ways to proceed. Creative people are able to be analytical and imaginative. The process of careful reasoning involved in analysing and evaluating a skill will not necessarily be used to improve it. To advance a skill, there is a need to engage in more imaginative, free-thinking processes that allow the nurse to think afresh about his or her chosen skill. Suggestions shared in this article are not exhaustive, but the material presented does illustrate measures that in the author's experience seem to have potential. Consideration is given to how the improved skill might be envisaged (an ideal skill in use). The article is illustrated using the case study of empathetic listening, which has been used throughout this series.


    Directory of Open Access Journals (Sweden)

    Serghei VAMBOL


    Full Text Available Purpose. Energy and economic evaluation of the improved plasma waste utilization technological process, as well as substantiation of the expediency of the improved plasma technology by comparing its energy consumption with other thermal methods of utilization. Methodology. Analysis of existing modern and advanced methods of waste management and their impact on environmental safety. Consideration of the energy and monetary costs of implementing two different waste management technologies. Results. Studies have shown that the gas produced by ordinary gasification differs in heating value from that of plasma gasification, owing to its significant nitrogen content. From the point of view of minimizing energy and monetary costs and ensuring environmental safety, the proposed improved plasma waste technology is more promising. To assess the energy appropriateness of the considered technologies, a comparative calculation was carried out under standard conditions. This is because the processing of waste produces useful products, such as liquefied methane, synthetic gas (94% methane) and a fuel gas for heating, suitable for sale, which provides the cost-effectiveness of this technology. Originality. The ecological and economic efficiency of the proposed improved plasma waste utilization technology is shown and evaluated in comparison with other thermal techniques. Practical value. The energy and monetary costs of implementing two different waste management technologies, namely ordinary gasification and gasification using plasma generators, are considered and substantiated. The proposed plasma waste utilization technology makes it possible to obtain useful products, such as liquefied methane, synthetic gas and a fuel gas for heating, which are suitable for sale. A plant using the improved plasma waste utilization technological process can compensate for daily and seasonal fluctuations in electricity and heat consumption by allowing the storage of the obtained fuel products.

  17. Response surface analysis to improve dispersed crude oil biodegradation

    Energy Technology Data Exchange (ETDEWEB)

    Zahed, Mohammad A.; Aziz, Hamidi A.; Mohajeri, Leila [School of Civil Engineering, Universiti Sains Malaysia, Nibong Tebal, Penang (Malaysia); Isa, Mohamed H. [Civil Engineering Department, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia)


    In this research, the bioremediation of dispersed crude oil, based on the amount of nitrogen and phosphorus supplementation in the closed system, was optimized by the application of response surface methodology and central composite design. Correlation analysis of the mathematical-regression model demonstrated that a quadratic polynomial model could be used to optimize the hydrocarbon bioremediation (R{sup 2} = 0.9256). Statistical significance was checked by analysis of variance and residual analysis. Natural attenuation removed 22.1% of the crude oil in 28 days. The highest removal under un-optimized conditions, 68.1%, was observed using 20.00 mg/L of nitrogen and 2.00 mg/L of phosphorus in 28 days, while the optimization process achieved a crude oil removal of 69.5% with 16.05 mg/L of nitrogen and 1.34 mg/L of phosphorus in 27 days; therefore, optimization can improve biodegradation in a shorter time with less nutrient consumption. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
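A central composite design like this one fits a two-factor quadratic response surface; a sketch of evaluating such a surface and locating its optimum by grid search (the coefficients below are invented so the optimum falls at roughly the reported nutrient levels; they are not the paper's fitted model):

```python
def quadratic_response(coef, n, p):
    # Two-factor quadratic response surface (n = nitrogen, p = phosphorus):
    #   y = b0 + b1*n + b2*p + b11*n^2 + b22*p^2 + b12*n*p
    b0, b1, b2, b11, b22, b12 = coef
    return b0 + b1 * n + b2 * p + b11 * n * n + b22 * p * p + b12 * n * p

def grid_optimum(coef, n_range, p_range, steps=100):
    # Brute-force search for the response maximum over the design region.
    best = None
    for i in range(steps + 1):
        n = n_range[0] + (n_range[1] - n_range[0]) * i / steps
        for j in range(steps + 1):
            p = p_range[0] + (p_range[1] - p_range[0]) * j / steps
            y = quadratic_response(coef, n, p)
            if best is None or y > best[0]:
                best = (y, n, p)
    return best
```

With coefficients expanding y = 70 - (n - 16)^2 - (p - 1.3)^2, the search recovers the optimum near 16 mg/L nitrogen and 1.3 mg/L phosphorus.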

  18. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden


    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.


    Directory of Open Access Journals (Sweden)

    G.Yu. Iakovets


    Full Text Available The article considers the definition of current trends and prospects of venture financing of new innovative enterprises, one of the most effective alternative sources of financing for an entity, albeit one with a high degree of risk. The features of venture financing that distinguish it from other sources of business financing are examined: income from venture capital investments can greatly exceed the volume of investment, but at the same time the risks of such financing are significant, which makes it necessary to build an effective system of venture capital investment. The study also reveals problems in the analysis and minimization of risks arising in the venture financing of innovative enterprises. Defining the characteristics of analysis and risk assessment of venture financing helps to find ways to minimize, systematize, avoid and prevent risks in the deployment of venture capital. The study also identifies the major areas for improving the analysis of venture capital for management decision-making.

  20. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan


    Full Text Available In credit card scoring and loan management, the prediction of an applicant’s future behavior is an important decision support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for credit scoring. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scorecards through employing textual data analysis. This study uses a sample of loan application forms from a financial institution providing loan services in Yemen, which represents a real-world credit scoring and loan management situation. The sample contains a set of Arabic textual data attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding textual attribute analysis achieves higher classification effectiveness and outperforms the traditional numerical data analysis techniques.
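The core idea — concatenating text-derived features with the form's numeric attributes before fitting a scorecard classifier — can be sketched as follows. The toy records, feature names, and values are invented for illustration; they are not the study's Yemeni loan data:

```python
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical application-form records (illustrative only)
texts = ["stable salary government employee", "irregular income no guarantor",
         "owns shop steady revenue", "unemployed large existing debt"]
numeric = np.array([[5000.0, 2], [800.0, 0], [4000.0, 5], [300.0, 1]])  # income, years banked
repaid = np.array([1, 0, 1, 0])  # 1 = repaid, 0 = default

# Text mining pre-processing: TF-IDF turns free text into numeric features
X_text = TfidfVectorizer().fit_transform(texts)
# Combine textual and numeric attributes into one design matrix
X = hstack([X_text, csr_matrix(numeric)])

scorecard = LogisticRegression(max_iter=1000).fit(X, repaid)
scores = scorecard.predict_proba(X)[:, 1]  # estimated probability of repayment
```

The comparison in the paper amounts to fitting the same classifier on `csr_matrix(numeric)` alone and measuring the drop in classification effectiveness.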

  1. Improved Analysis for Graphic TSP Approximation via Matchings

    CERN Document Server

    Mucha, Marcin


    The Travelling Salesman Problem is one of the most fundamental and most studied problems in approximation algorithms. For more than 30 years, the best algorithm known for general metrics has been Christofides's algorithm, with an approximation factor of 3/2, even though the so-called Held-Karp LP relaxation of the problem is conjectured to have an integrality gap of only 4/3. Very recently, significant progress has been made for the important special case of graphic metrics, first by Oveis Gharan et al., and then by Momke and Svensson. In this paper, we provide an improved analysis of the approach introduced by Momke and Svensson, yielding a bound of 35/24 on the approximation factor, as well as a bound of 19/12+epsilon for any epsilon>0 for the more general Travelling Salesman Path Problem in graphic metrics.

  2. Improving knowledge management systems with latent semantic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sebok, A.; Plott, C. [Alion Science and Technology, MA and D Operation, 4949 Pearl East Circle, Boulder, CO 80301 (United States); LaVoie, N. [Pearson Knowledge Technologies, 4940 Pearl East Circle, Boulder, CO 80301 (United States)


    Latent Semantic Analysis (LSA) offers a technique for improving lessons learned and knowledge management systems. These systems are expected to become more widely used in the nuclear industry, as experienced personnel leave and are replaced by younger, less-experienced workers. LSA is a machine learning technology that allows searching of text based on meaning rather than predefined keywords or categories. Users can enter and retrieve data using their own words, rather than relying on constrained language lists or navigating an artificially structured database. LSA-based tools can greatly enhance the usability and usefulness of knowledge management systems and thus provide a valuable tool to assist nuclear industry personnel in gathering and transferring worker expertise. (authors)
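The mechanics of LSA — a truncated SVD of the term-document matrix, with retrieval by cosine similarity in the reduced "concept" space — can be sketched with a toy matrix. The terms and counts below are invented for illustration, not drawn from any nuclear lessons-learned corpus:

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = lessons-learned documents
terms = ["pump", "valve", "leak", "training", "staff"]
A = np.array([
    [2, 0, 1, 0],   # pump
    [1, 1, 0, 0],   # valve
    [0, 2, 1, 0],   # leak
    [0, 0, 0, 2],   # training
    [0, 0, 1, 1],   # staff
], dtype=float)

# LSA: truncated SVD projects documents into a low-rank "concept" space,
# so retrieval matches on meaning rather than exact keyword overlap
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # documents in concept space

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Documents 0 and 2 share maintenance vocabulary and score as related,
# even though their exact term overlap is small
sim_related = cosine(doc_vecs[0], doc_vecs[2])
sim_unrelated = cosine(doc_vecs[0], doc_vecs[3])
```

A query would be folded into the same space (`q_vec = q @ U[:, :k]`) and compared against `doc_vecs` the same way.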

  3. [The analysis of tincture for improvement of blood circulation]. (United States)

    Bernatoniene, Rūta; Bernatoniene, Jurga; Ramanauskiene, Kristina


    The article describes qualitative and quantitative analysis of a tincture for improvement of blood circulation. Flavonoids were qualitatively determined by thin-layer chromatography, high-pressure liquid chromatography, and a color reaction with magnesium powder and concentrated hydrochloric acid. Ferments were identified with ferric ammonium sulphate solution; glycosides--with dimethylaminobenzaldehyde solution in sulphuric acid; saponins--with lead subacetate solution; reductive materials--with ammoniacal silver nitrate solution; albumen--with ninhydrin solution; and caffeic acid--by thin-layer chromatography. A spectrophotometric method was applied to determine the quantity of quercetin. The refractive index, relative density, loss on drying and ethanol concentration were determined according to European Pharmacopoeia requirements.

  4. Improving Semantic Search in Digital Libraries Using Multimedia Analysis

    Directory of Open Access Journals (Sweden)

    Ilianna Kollia


    Full Text Available Semantic search of cultural content is of major importance in current digital libraries, such as in Europeana. Content metadata constitute the main features of cultural items that are analysed, mapped and used to interpret users' queries, so that the most appropriate content is selected and presented to the users. Multimedia, especially visual, analysis, has not been a main component in these developments. This paper presents a new semantic search methodology, including a query answering mechanism which meets the semantics of users' queries and enriches the answers by exploiting appropriate visual features, both local and MPEG-7, through an interweaved knowledge and machine learning based approach. An experimental study is presented, using content from the Europeana digital library, and involving both thematic knowledge and extracted visual features from Europeana images, illustrating the improved performance of the proposed semantic search approach.

  5. Benchmarking Of Improved DPAC Transient Deflagration Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Laurinat, James E.; Hensel, Steve J.


    The transient deflagration code DPAC (Deflagration Pressure Analysis Code) has been upgraded for use in modeling hydrogen deflagration transients. The upgraded code is benchmarked using data from vented hydrogen deflagration tests conducted at the HYDRO-SC Test Facility at the University of Pisa. DPAC originally was written to calculate peak deflagration pressures for deflagrations in radioactive waste storage tanks and process facilities at the Savannah River Site. Upgrades include the addition of a laminar flame speed correlation for hydrogen deflagrations and a mechanistic model for turbulent flame propagation, incorporation of inertial effects during venting, and inclusion of the effect of water vapor condensation on vessel walls. In addition, DPAC has been coupled with CEA, a NASA combustion chemistry code. The deflagration tests are modeled as end-to-end deflagrations. The improved DPAC code successfully predicts both the peak pressures during the deflagration tests and the times at which the pressure peaks.

  6. Analysis and improvement measures of flight delay in China (United States)

    Zang, Yuhang


    Firstly, this paper establishes a principal component regression model to analyze the data quantitatively, using principal component analysis to extract three principal component factors of flight delays. The least squares method is then used to analyze the factors, and the regression equation is obtained by substitution; the results show that the main cause of flight delays is the airlines themselves, followed by weather and traffic. To address these problems, the paper focuses on the controllable aspects of traffic flow control. For the runway terminal area, an adaptive genetic queuing model is established. Taking Beijing Capital International Airport as a case, an optimization method is built for scheduling fifteen aircraft landing on three runways; comparing the results with the existing FCFS (first-come, first-served) algorithm demonstrates the superiority of the model.
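The principal-component-plus-least-squares pipeline described above can be sketched on synthetic data. The factor names and coefficients are invented assumptions; the point is that back-transforming the regression coefficients to factor space ranks the factors' influence:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical standardized delay factors: airline operations, weather, traffic
X = rng.normal(size=(200, 3))
delay = 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(scale=0.1, size=200)

# Principal component analysis of the factor matrix
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]      # principal axes, highest variance first
scores = Xc @ components            # data projected onto the components

# Least-squares regression of (centered) delay on the component scores
coef, *_ = np.linalg.lstsq(scores, delay - delay.mean(), rcond=None)

# Back-transform to factor space: the largest coefficient flags the main cause
beta = components @ coef
```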

  7. The NASA aircraft noise prediction program improved propeller analysis system (United States)

    Nguyen, L. Cathy


    The improvements and the modifications of the NASA Aircraft Noise Prediction Program (ANOPP) and the Propeller Analysis System (PAS) are described. Comparisons of the predictions and the test data are included in the case studies for the flat plate model in the Boundary Layer Module, for the effects of applying compressibility corrections to the lift and pressure coefficients, for the use of different weight factors in the Propeller Performance Module, for the use of the improved retarded time equation solution, and for the effect of the number of grids in the Transonic Propeller Noise Module. The DNW tunnel test data of a propeller at different angles of attack and the Dowty Rotol data are compared with ANOPP predictions. The effect of the number of grids on the Transonic Propeller Noise Module predictions and the comparison of ANOPP TPN and DFP-ATP codes are studied. In addition to the above impact studies, the transonic propeller noise predictions for the SR-7, the UDF front rotor, and the support of the enroute noise test program are included.

  8. Voxel model in BNCT treatment planning: performance analysis and improvements (United States)

    González, Sara J.; Carando, Daniel G.; Santa Cruz, Gustavo A.; Zamenhof, Robert G.


    In recent years, many efforts have been made to study the performance of treatment planning systems in deriving an accurate dosimetry of the complex radiation fields involved in boron neutron capture therapy (BNCT). The computational model of the patient's anatomy is one of the main factors involved in this subject. This work presents a detailed analysis of the performance of the 1 cm based voxel reconstruction approach. First, a new and improved material assignment algorithm implemented in NCTPlan treatment planning system for BNCT is described. Based on previous works, the performances of the 1 cm based voxel methods used in the MacNCTPlan and NCTPlan treatment planning systems are compared by standard simulation tests. In addition, the NCTPlan voxel model is benchmarked against in-phantom physical dosimetry of the RA-6 reactor of Argentina. This investigation shows the 1 cm resolution to be accurate enough for all reported tests, even in the extreme cases such as a parallelepiped phantom irradiated through one of its sharp edges. This accuracy can be degraded at very shallow depths in which, to improve the estimates, the anatomy images need to be positioned in a suitable way. Rules for this positioning are presented. The skin is considered one of the organs at risk in all BNCT treatments and, in the particular case of cutaneous melanoma of extremities, limits the delivered dose to the patient. Therefore, the performance of the voxel technique is deeply analysed in these shallow regions. A theoretical analysis is carried out to assess the distortion caused by homogenization and material percentage rounding processes. Then, a new strategy for the treatment of surface voxels is proposed and tested using two different irradiation problems. For a parallelepiped phantom perpendicularly irradiated with a 5 keV neutron source, the large thermal neutron fluence deviation present at shallow depths (from 54% at 0 mm depth to 5% at 4 mm depth) is reduced to 2% on average

  9. Collagen alpha5 and alpha2(IV) chain coexpression: analysis of skin biopsies of Alport patients. (United States)

    Patey-Mariaud de Serre, N; Garfa, M; Bessiéres, B; Noël, L H; Knebelmann, B


    Alport syndrome is a collagen type IV disease caused by mutations in the COL4A5 gene, with the X-linked form being most prevalent. The resultant alpha5(IV) collagen chain is a component of the glomerular and skin basement membranes (SBMs). Immunofluorescent determination of the alpha5(IV) chain in skin biopsies is the procedure of choice to identify patients. In 30% of patients, however, the mutant protein is still found in the SBM, resulting in a normal staining pattern. In order to minimize or eliminate false results, we compared the distribution of the alpha2(IV) chain (another SBM component) and the alpha5(IV) chain by standard double-label immunofluorescence (IF) and by confocal laser scanning microscopy. The study was performed on 55 skin biopsies of patients suspected of Alport syndrome and five normal control specimens. In normal skin, IF showed the classical linear pattern for both collagens along the basement membrane. Additionally, decreased alpha5(IV) was found at the bottom of the dermal papillary basement membrane. Confocal analysis confirmed these results and showed focal alpha5(IV) interruptions. In suspected patients, both techniques showed the same rate of abnormal alpha5(IV) expression: segmental in women and absent in men. Our results show a physiological variation of alpha5(IV) location, with focal interruptions and decreased expression at the bottom of the dermal basement membrane. Comparison of alpha5(IV) with alpha2(IV) expression is simple and eliminates technical artifacts.

  10. Improved Smoothed Analysis of the k-Means Method

    CERN Document Server

    Manthey, Bodo


    The k-means method is a widely used clustering algorithm. One of its distinguishing features is its speed in practice. Its worst-case running time, however, is exponential, leaving a gap between practical and theoretical performance. Arthur and Vassilvitskii (FOCS 2006) aimed at closing this gap, and they proved a bound of $\poly(n^k, \sigma^{-1})$ on the smoothed running time of the k-means method, where n is the number of data points and $\sigma$ is the standard deviation of the Gaussian perturbation. This bound, though better than the worst-case bound, is still much larger than the running time observed in practice. We improve the smoothed analysis of the k-means method by showing two upper bounds on the expected running time of k-means. First, we prove that the expected running time is bounded by a polynomial in $n^{\sqrt k}$ and $\sigma^{-1}$. Second, we prove an upper bound of $k^{kd} \cdot \poly(n, \sigma^{-1})$, where d is the dimension of the data space. The polynomial is independent of k and d, and w...
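The smoothed-analysis setting — arbitrary input points plus a Gaussian perturbation of standard deviation σ — is easy to reproduce empirically. A sketch that counts Lloyd iterations on a perturbed instance (illustrative of the model only, not the paper's proof machinery):

```python
import numpy as np

def kmeans_iterations(points, k, seed=0, max_iter=1000):
    """Run Lloyd's k-means and count iterations until assignments stabilize."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    assign = None
    for it in range(1, max_iter + 1):
        dists = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        new_assign = dists.argmin(axis=1)
        if assign is not None and np.array_equal(new_assign, assign):
            return it  # converged: assignments unchanged
        assign = new_assign
        for j in range(k):
            members = points[assign == j]
            if len(members) > 0:
                centers[j] = members.mean(axis=0)
    return max_iter

# Smoothed instance: base points plus Gaussian noise of standard deviation sigma
rng = np.random.default_rng(1)
base = rng.uniform(size=(100, 2))
sigma = 0.05
perturbed = base + rng.normal(scale=sigma, size=base.shape)
iters = kmeans_iterations(perturbed, k=3)
```

The bounds in the abstract concern the expectation of `iters` over the Gaussian noise, for a worst-case choice of `base`.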

  11. Improving diagnostic criteria for Propionibacterium acnes osteomyelitis: a retrospective analysis. (United States)

    Asseray, Nathalie; Papin, Christophe; Touchais, Sophie; Bemer, Pascale; Lambert, Chantal; Boutoille, David; Tequi, Brigitte; Gouin, François; Raffi, François; Passuti, Norbert; Potel, Gilles


    The identification of Propionibacterium acnes in cultures of bone and joint samples is always difficult to interpret because of the ubiquity of this microorganism. The aim of this study was to propose a diagnostic strategy to distinguish infections from contaminations. This was a retrospective analysis of the charts of all patients with ≥1 deep sample culture-positive for P. acnes. Every criterion was tested for sensitivity, specificity, and positive likelihood ratio, and then the diagnostic probability of combinations of criteria was calculated. Among 65 patients, 52 (80%) were considered truly infected with P. acnes, a diagnosis based on a multidisciplinary process. The most valuable diagnostic criteria were: ≥2 positive deep samples, peri-operative findings (necrosis, hardware loosening, etc.), and ≥2 surgical procedures. However, no single criterion was sufficient to ascertain the diagnosis. The following combinations of criteria had a diagnostic probability of >90%: ≥2 positive cultures + 1 criterion among: peri-operative findings, local signs of infection, ≥2 previous operations, orthopaedic devices; 1 positive culture + 3 criteria among: peri-operative findings, local signs of infection, ≥2 previous surgical operations, orthopaedic devices, inflammatory syndrome. The diagnosis of P. acnes osteomyelitis was greatly improved by combining different criteria, allowing differentiation between infection and contamination.
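The per-criterion statistics the authors tabulate — sensitivity, specificity, and positive likelihood ratio — follow directly from a 2×2 table. A sketch with invented counts (not the study's actual data):

```python
def diagnostic_stats(tp, fn, fp, tn):
    """Sensitivity, specificity, and positive likelihood ratio for one criterion.

    tp/fn: truly infected patients with a positive/negative criterion;
    fp/tn: contaminated (non-infected) patients with a positive/negative criterion.
    """
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)  # how much a positive result raises the odds of infection
    return sens, spec, lr_pos

# Hypothetical counts for one criterion, e.g. ">=2 positive deep samples"
sens, spec, lr = diagnostic_stats(tp=45, fn=7, fp=2, tn=11)
```

A likelihood ratio well above 1 (here 5.6) is what lets a criterion, alone or in combination, push the post-test probability of infection past the 90% threshold used in the study.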

  12. Improved analysis of bias in Monte Carlo criticality safety (United States)

    Haley, Thomas C.


    Criticality safety, the prevention of nuclear chain reactions, depends on Monte Carlo computer codes for most commercial applications. One major shortcoming of these codes is the limited accuracy of the atomic and nuclear data files they depend on. In order to apply a code and its data files to a given criticality safety problem, the code must first be benchmarked against similar problems for which the answer is known. The difference between a code prediction and the known solution is termed the "bias" of the code. Traditional calculations of the bias for application to commercial criticality problems are generally full of assumptions and lead to large uncertainties which must be conservatively factored into the bias as statistical tolerances. Recent trends in storing commercial nuclear fuel---narrowed regulatory margins of safety, degradation of neutron absorbers, the desire to use higher enrichment fuel, etc.---push the envelope of criticality safety. They make it desirable to minimize uncertainty in the bias to accommodate these changes, and they make it vital to understand what assumptions are safe to make under what conditions. A set of improved procedures is proposed for (1) developing multivariate regression bias models, and (2) applying multivariate regression bias models. These improved procedures lead to more accurate estimates of the bias and much smaller uncertainties about this estimate, while also generally providing more conservative results. The drawback is that the procedures are not trivial and are highly labor intensive to implement. The payback in savings in margin to criticality and conservatism for calculations near regulatory and safety limits may be worth this cost. To develop these procedures, a bias model using the statistical technique of weighted least squares multivariate regression is developed in detail. Problems that can occur from a weak statistical analysis are highlighted, and a solid statistical method for developing the bias

  13. Analysis and Improvement of Authenticatable Ring Signcryption Scheme

    Institute of Scientific and Technical Information of China (English)

    LI Fa-gen; Shirase Masaaki; Takagi Tsuyoshi


    We show that the Zhang-Yang-Zhu-Zhang identity-based authenticatable ring signcryption scheme is not secure against chosen plaintext attacks. Furthermore, we propose an improved scheme that remedies the weakness of the Zhang-Yang-Zhu-Zhang scheme. The improved scheme has a shorter ciphertext size than the Zhang-Yang-Zhu-Zhang scheme. We then prove that the improved scheme satisfies confidentiality, unforgeability, anonymity and authenticatability.

  14. Improving the flash flood frequency analysis applying dendrogeomorphological evidences (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.


    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In such cases, no systematic data allow improvement of the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes in the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  15. Discrete Packet Analysis for Improved Atmospheric Rejection on Modulated Laser Signals

    Energy Technology Data Exchange (ETDEWEB)

    O'Neill, M.; McKenna, I.; DiBenedetto, J.; Capelle, G.; Trainham, R.


    This slide-show discusses how the method of discrete packet analysis improves atmospheric compensation for quasi-CW fluorescence detection methods. This is key to improving remote sensing capabilities.

  16. Analysis of the Improvement Methods for Equipment Maintenance Support

    Institute of Scientific and Technical Information of China (English)

    ZHANG Rui-chang; ZHAO Song-zheng


    Based on military requirements and on the problems of equipment maintenance support methods in high-tech battles, each element of equipment maintenance support is analyzed, and methods for improving equipment maintenance are proposed.

  17. 10 Inventions on improving keyboard efficiency: A TRIZ based analysis


    Mishra, Umakant


    A keyboard is the most important input device for a computer. As technology has developed, the keyboard has not remained confined to its basic functionalities but has gone well beyond them. Several inventions attempt to improve the efficiency of a conventional keyboard. This article illustrates 10 inventions from the US Patent database, all of which propose very interesting methods for improving the efficiency of a computer keyboard. Some in...

  18. Improving Student Understanding of Geological Rates via Chronotopographic Analysis (United States)

    Linneman, S. R.; Clark, D. H.; Buly, P.


    We are investigating the value of incorporating chronotopographic analysis into undergraduate geology courses using terrestrial laser scanning (TLS) to improve student understanding of the rates and styles of geomorphic processes. Repeat high-resolution TLS surveys can track the evolution of active landscapes, including sites of active faulting, glaciation, landslides, fluvial systems and coastal dynamics. We hypothesize that geology students who collect and analyze such positional data for local active landscapes will develop a better sense of the critical (and non-steady) geomorphic processes affecting landscape change and develop a greater interest in pursuing opportunities for geology field work. We have collected baseline TLS scans of actively evolving landscapes identified in cooperation with land-use agencies. The project team is developing inquiry activities for each site and assessing their impact. For example, our faculty partners at 2-year colleges are interested in the rapid retreat of coastal bluffs near their campuses. In this situation, TLS will be part of a laboratory activity in which students compare historic air photos to predict areas of the most active long-term bluff retreat; join their instructor to collect TLS data at the site (replicating the baseline scan); sketch outcrops in the field and suggest areas of the site for higher-resolution scanning; and in the following class compare their predictions to the deformation maps that are the output of the repeated TLS scans. A brief two-question assessment instrument was developed to address both the content and attitudinal targets. It was given to WWU Geomorphology classes in three sequential quarters of the 2009/2010 academic year, two of which did not work with the TLS technology (pre-treatment) and one that participated in the redesigned activities (post-treatment). Additionally, focus group interviews were conducted with the post-treatment students so they could verbalize their experience with the TLS. 
The content

  19. Why economic analysis of health system improvement interventions matters

    Directory of Open Access Journals (Sweden)

    Edward Ivor Broughton


    Full Text Available There is little evidence to direct health systems toward providing efficient interventions to address medical errors, defined as an unintended act of omission or commission or one not executed as intended that may or may not cause harm to the patient but does not achieve its intended outcome. We believe that lack of guidance on what is the most efficient way to reduce adverse events and improve the quality of health care limits the scale-up of health system improvement interventions. Challenges to economic evaluation of these interventions include defining and implementing improvement interventions in different settings with high fidelity, capturing all of the positive and negative effects of the intervention, using process measures of effectiveness rather than health outcomes, and determining the full cost of the intervention and all economic consequences of its effects. However, health system improvement interventions should be treated similarly to individual medical interventions and undergo rigorous economic evaluation to provide actionable evidence to guide policy-makers in decisions of resource allocation for improvement activities among other competing demands for health care resources.

  20. Why Economic Analysis of Health System Improvement Interventions Matters (United States)

    Broughton, Edward Ivor; Marquez, Lani


    There is little evidence to direct health systems toward providing efficient interventions to address medical errors, defined as an unintended act of omission or commission or one not executed as intended that may or may not cause harm to the patient but does not achieve its intended outcome. We believe that lack of guidance on what is the most efficient way to reduce medical errors and improve the quality of health-care limits the scale-up of health system improvement interventions. Challenges to economic evaluation of these interventions include defining and implementing improvement interventions in different settings with high fidelity, capturing all of the positive and negative effects of the intervention, using process measures of effectiveness rather than health outcomes, and determining the full cost of the intervention and all economic consequences of its effects. However, health system improvement interventions should be treated similarly to individual medical interventions and undergo rigorous economic evaluation to provide actionable evidence to guide policy-makers in decisions of resource allocation for improvement activities among other competing demands for health-care resources.

  1. Analysis and Improvement of Low Rank Representation for Subspace segmentation

    CERN Document Server

    Siming, Wei


    We analyze and improve low rank representation (LRR), the state-of-the-art algorithm for subspace segmentation of data. We prove that for the noiseless case, the optimization model of LRR has a unique solution, which is the shape interaction matrix (SIM) of the data matrix. So in essence LRR is equivalent to factorization methods. We also prove that the minimum value of the optimization model of LRR is equal to the rank of the data matrix. For the noisy case, we show that LRR can be approximated as a factorization method that combines noise removal by column sparse robust PCA. We further propose an improved version of LRR, called Robust Shape Interaction (RSI), which uses the corrected data as the dictionary instead of the noisy data. RSI is more robust than LRR when the corruption in data is heavy. Experiments on both synthetic and real data testify to the improved robustness of RSI.
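The shape interaction matrix at the heart of this equivalence is computable directly from the skinny SVD of the data. A toy sketch with noiseless data drawn from two independent subspaces (dimensions and seeds are arbitrary choices for illustration); the block-diagonal pattern of the SIM is what reveals the segmentation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Data matrix whose columns come from a union of two 1-D subspaces (rank 2 overall)
basis = rng.normal(size=(5, 2))
coeffs = np.zeros((2, 8))
coeffs[0, :4] = rng.normal(size=4)   # first 4 columns lie in subspace 1
coeffs[1, 4:] = rng.normal(size=4)   # last 4 columns lie in subspace 2
X = basis @ coeffs

# Shape interaction matrix: SIM = V_r V_r^T from the skinny SVD of X,
# which is exactly the (noiseless) LRR solution
U, s, Vt = np.linalg.svd(X)
r = int((s > 1e-10).sum())           # numerical rank of the data
sim = Vt[:r].T @ Vt[:r]
```

For independent subspaces the SIM is block-diagonal under this column ordering: entries linking columns from different subspaces vanish, so thresholding `|sim|` segments the columns.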

  2. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    Directory of Open Access Journals (Sweden)

    ZHANG Shutuan


    Full Text Available In order to achieve accurate measurement of distortion parameters in aircraft power supply systems and satisfy the requirements of corresponding onboard equipment, a novel power parameter test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system is portable and supports high-speed data acquisition and processing; the software uses LabWindows/CVI as the development environment and adopts pre-processing techniques together with the added filtering algorithm. Compared with the traditional filtering algorithm, the improved filtering algorithm increases test accuracy. Application shows that the test system with the improved filtering algorithm produces accurate results and meets the design requirements.

  3. Improved Analysis of Kannan's Shortest Lattice Vector Algorithm

    CERN Document Server

    Hanrot, Guillaume


    The security of lattice-based cryptosystems such as NTRU, GGH and Ajtai-Dwork essentially relies upon the intractability of computing a shortest non-zero lattice vector and a closest lattice vector to a given target vector in high dimensions. The best algorithms for these tasks are due to Kannan, and, though remarkably simple, their complexity estimates had not been improved in more than twenty years. Kannan's algorithm for solving the shortest vector problem is in particular crucial in Schnorr's celebrated block reduction algorithm, on which the best known attacks against the lattice-based encryption schemes mentioned above are based. Understanding Kannan's algorithm precisely is of prime importance for providing meaningful key sizes. In this paper we improve the complexity analyses of Kannan's algorithms and discuss the possibility of improving the underlying enumeration strategy.

  4. Method for improving accuracy in full evaporation headspace analysis. (United States)

    Xie, Wei-Qi; Chai, Xin-Sheng


    We report a new headspace analytical method in which multiple headspace extraction is combined with the full evaporation technique. In conventional full evaporation headspace analysis, the pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy. The results (using an ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis.
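The quantitative backbone of multiple headspace extraction is the geometric decay of successive peak areas. A minimal sketch of the standard MHE total-area calculation (the generic textbook relation, not the authors' specific instrument procedure; the peak areas are synthetic):

```python
import math

def mhe_total_area(areas):
    """Total analyte signal from multiple headspace extraction.

    Successive extractions decay geometrically, A_i = A_1 * k**(i-1);
    a linear fit of ln(A_i) versus extraction number gives k, and the
    exhaustive-extraction total is the geometric series sum A_1 / (1 - k).
    """
    n = len(areas)
    xs = list(range(n))
    ys = [math.log(a) for a in areas]
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    k = math.exp(slope)
    return areas[0] / (1 - k)

# Synthetic peak areas decaying with k = 0.5: total is 100 / (1 - 0.5) = 200
total = mhe_total_area([100.0, 50.0, 25.0, 12.5])
```

Because the total is obtained from the decay constant rather than from a single extraction, it is insensitive to the vial-pressure uncertainty that troubles the one-shot full evaporation measurement.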

  5. Does Competition Improve Public School Efficiency? A Spatial Analysis (United States)

    Misra, Kaustav; Grimes, Paul W.; Rogers, Kevin E.


    Advocates for educational reform frequently call for policies to increase competition between schools because it is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. Researchers examining this issue are confronted with difficulties in defining reasonable measures…

  6. Oceanic Area System Improvement Study (OASIS). Volume I. Executive Summary and Improvement Alternatives Development and Analysis. (United States)



  7. Optimizing Bus Passenger Complaint Service through Big Data Analysis: Systematized Analysis for Improved Public Sector Management

    Directory of Open Access Journals (Sweden)

    Weng-Kun Liu


    Full Text Available With the advances in industry and commerce, passengers have become more accepting of environmental sustainability issues; thus, more people now choose to travel by bus. Government administration constitutes an important part of bus transportation services, as the government gives the right-of-way to transportation companies, allowing them to provide services. When these services are of poor quality, passengers may lodge complaints. The increase in consumer awareness and developments in wireless communication technologies have made it possible for passengers to easily and immediately submit complaints about transportation companies to government institutions, which has brought drastic changes to the supply–demand chain comprising the public sector, transportation companies, and passengers. This study proposed the use of big data analysis technology, including systematized case assignment and data visualization, to improve management processes in the public sector and optimize customer complaint services. Taichung City, Taiwan, was selected as the research area. There, the customer complaint management process in the public sector was improved, effectively solving such issues as station-skipping, allowing the public sector to fully grasp the service level of transportation companies, improving the sustainability of bus operations, and supporting the sustainable development of the public sector–transportation company–passenger supply chain.

  8. Analysis of Strategies to Improve Heliostat Tracking at Solar Two

    Energy Technology Data Exchange (ETDEWEB)

    Jones, S.A.; Stone, K.W.


    This paper investigates different strategies that can be used to improve the tracking accuracy of heliostats at Solar Two. The different strategies are analyzed using a geometrical error model to determine their performance over the course of a day. By using the performance of heliostats in representative locations of the field and on representative days of the year, an estimate of the annual performance of each strategy is presented.

  9. Analysis of strategies to improve heliostat tracking at Solar Two

    Energy Technology Data Exchange (ETDEWEB)

    Jones, S.A.; Stone, K.W.


    This paper investigates different strategies that can be used to improve the tracking accuracy of heliostats at Solar Two. The different strategies are analyzed using a geometrical error model to determine their performance over the course of a day. By using the performance of heliostats in representative locations of the field and on representative days of the year, an estimate of the annual performance of each strategy is presented.

  10. Analysis and improvement of SNR using time slicing (United States)

    Karanam, Srikrishna; Singh, Amarjot; Kumar, Devinder; Choubey, Akash; Bacchuwar, Ketan


    Noise plays an antagonistic role in most image processing applications, so it must be studied in depth in order to improve image quality. The amount of signal in an image corrupted by noise is generally described by the signal-to-noise ratio (SNR). Capturing multiple photos at different focus settings is a powerful approach for improving SNR. This paper analyses a framework for optimally balancing the trade-offs between defocus and sensor noise by experimenting on synthetic as well as real video sequences. The method is first applied to a synthetic image, where the improvement in SNR is studied through the ability of the Hough transform to extract lines as the SNR varies. The paper then experiments on real video sequences, where the improvement in SNR is analyzed using different edge operators: Sobel, Canny, Prewitt, Roberts and Laplacian. The main aim is to detect edges at different values of SNR, which is a prominent measure of signal strength as well as image clarity. The paper also explains the modeling of noise in depth, leading to a better understanding of SNR. The results obtained from both the synthetic image and the real video sequences show that SNR increases with the number of time slices in a fixed budget, leading to clearer pictures. The technique can be applied very effectively to capture high-quality images from long distances.
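    The core SNR benefit of combining multiple captures can be sketched in a few lines of NumPy. This is a synthetic illustration, not the authors' pipeline: averaging N independent noisy captures of the same scene raises SNR by roughly a factor of N in power (about 10·log10(N) dB).

```python
import numpy as np

rng = np.random.default_rng(0)

signal = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # synthetic clean image
sigma = 0.2                                            # sensor noise level

def snr_db(clean, noisy):
    noise = noisy - clean
    return 10.0 * np.log10(np.mean(clean ** 2) / np.mean(noise ** 2))

# Sixteen noisy captures of the same scene.
frames = [signal + rng.normal(0.0, sigma, signal.shape) for _ in range(16)]

snr_1 = snr_db(signal, frames[0])             # single capture
snr_16 = snr_db(signal, np.mean(frames, 0))   # average of 16 captures
# Expected gain is roughly 10*log10(16), i.e. about 12 dB.
```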

  11. Senior Project: Analysis and Improvement of Existing Apparel Technology (United States)


    and improve the quality of the seam, the major cost centers - work in process, operator handling, and production control were left far behind. Was it...skilled operation when performed manually; the curve must be stitched without puckering or irregular stitches and the line must remain parallel to the...equipped with side seam expanders. In this plant the Ajax Presses had been modified with a spring steel clamp on the back of the buck to secure the box

  12. Joint regression analysis and AMMI model applied to oat improvement (United States)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.


    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data of the Portuguese Plant Breeding Board for a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R software for the AMMI model analysis.
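    The additive-main-effects-plus-SVD structure behind AMMI can be sketched as follows. This is illustrative NumPy code on simulated yields, not the Portuguese trial data, and it stands in for the Zigzag/agricolae tooling the authors actually used: fit genotype and environment main effects, then decompose the residual genotype-by-environment interaction with an SVD.

```python
import numpy as np

rng = np.random.default_rng(1)
yields = rng.normal(5.0, 1.0, size=(22, 6))   # 22 genotypes x 6 locations (simulated)

grand = yields.mean()
g_eff = yields.mean(axis=1, keepdims=True) - grand   # genotype main effects
e_eff = yields.mean(axis=0, keepdims=True) - grand   # environment main effects

# Residual interaction (GxE) matrix after removing additive effects.
resid = yields - grand - g_eff - e_eff
u, s, vt = np.linalg.svd(resid, full_matrices=False)

# Share of the GxE sum of squares captured by the first multiplicative
# term (the "IPCA1" axis of an AMMI biplot).
ipca1_share = s[0] ** 2 / np.sum(s ** 2)
```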

  13. Improving Family Forest Knowledge Transfer through Social Network Analysis (United States)

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.


    To better engage Maine's family forest landowners our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  14. An improved quantitative analysis method for plant cortical microtubules. (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng


    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. Then the Intrinsic Mode Function 1 (IMF1) image obtained from the decomposition was selected for texture analysis based on the Grey-Level Cooccurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies.
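    The GLCM texture step can be illustrated with a minimal NumPy sketch. The BEMD preprocessing is omitted here, and `glcm_features`, the quantisation level count, and the single (0, 1) offset are simplifying assumptions, not the paper's exact settings.

```python
import numpy as np

# Build a grey-level co-occurrence matrix for one pixel offset and derive
# three classic texture parameters from it.
def glcm_features(img, levels=8, offset=(0, 1)):
    q = (img * levels / (img.max() + 1e-9)).astype(int).clip(0, levels - 1)
    dy, dx = offset
    glcm = np.zeros((levels, levels))
    src = q[: q.shape[0] - dy, : q.shape[1] - dx]
    dst = q[dy:, dx:]
    for a, b in zip(src.ravel(), dst.ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()                 # joint probability of level pairs
    i, j = np.indices(p.shape)
    return {
        "contrast": np.sum(p * (i - j) ** 2),
        "energy": np.sum(p ** 2),
        "homogeneity": np.sum(p / (1.0 + np.abs(i - j))),
    }

rng = np.random.default_rng(2)
smooth = np.tile(np.linspace(0, 1, 32), (32, 1))   # ordered "texture"
noisy = rng.random((32, 32))                       # disordered one

f_smooth = glcm_features(smooth)
f_noisy = glcm_features(noisy)    # expect higher contrast, lower homogeneity
```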

  15. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    Directory of Open Access Journals (Sweden)

    Yi Lu


    Full Text Available The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the image preprocessing of the original microtubule image. And then Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was selected to do the texture analysis based on Grey-Level Cooccurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the effect of BEMD algorithm on edge preserving accompanied with noise reduction was positive, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies.

  16. EEG Signal Decomposition and Improved Spectral Analysis Using Wavelet Transform (United States)


    research and medical applications. Wavelet transform (WT) is a new multi-resolution time-frequency analysis method. WT possesses localization feature both... wavelet transform, the EEG signals are successfully decomposed and denoised. In this paper we also use a 'quasi-detrending' method for classification of EEG

  17. Improved Conjunction Analysis via Collaborative Space Situational Awareness (United States)

    Kelso, T.; Vallado, D.; Chan, J.; Buckwalter, B.

    With recent events such as the Chinese ASAT test in 2007 and the USA 193 intercept in 2008, many satellite operators are becoming increasingly aware of the potential threat to their satellites as the result of orbital debris or even other satellites. However, to be successful at conjunction monitoring and collision avoidance requires accurate orbital information for as many space objects (payloads, dead satellites, rocket bodies, and debris) as possible. Given the current capabilities of the US Space Surveillance Network (SSN), approximately 18,500 objects are now being tracked and orbital data (in the form of two-line element sets) is available to satellite operators for 11,750 of them (as of 2008 September 1). The capability to automatically process this orbital data to look for close conjunctions and provide that information to satellite operators via the Internet has been continuously available on CelesTrak, in the form of Satellite Orbital Conjunction Reports Assessing Threatening Encounters in Space (SOCRATES), since May 2004. Those reports are used by many operators as one way to keep apprised of these potential threats. However, the two-line element sets (TLEs) are generated using non-cooperative tracking via the SSN's network of radar and optical sensors. As a result, the relatively low accuracy of the data results in a large number of false alarms that satellite operators must routinely deal with. Yet, satellite operators typically perform orbit maintenance for their own satellites, using active ranging and GPS systems. These data are often an order of magnitude more accurate than those available using TLEs. When combined (in the form of ephemerides) with maneuver planning information, the ability to maintain predictive awareness increases significantly. And when satellite operators share this data, the improved space situational awareness, particularly in the crowded geosynchronous belt, can be dramatic and the number of false alarms can be reduced

  18. Improved wavelet analysis for induction motors mixed-fault diagnosis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hanlei; ZHOU Jiemin; LI Gang


    Eccentricity is one of the frequent faults of induction motors, and it may cause rub between the rotor and the stator. Early detection of significant rub from pure eccentricity can prolong the lifespan of induction motors. This paper is devoted to such mixed-fault diagnosis: eccentricity plus rub fault. The continuous wavelet transform (CWT) is employed to analyze vibration signals obtained from the motor body. An improved continuous wavelet transform was proposed to alleviate the frequency aliasing. Experimental results show that the proposed method can effectively distinguish two types of faults: single-fault of eccentricity and mixed-fault of eccentricity plus rub.

  19. Analysis of radial electric field in LHD towards improved confinement

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, M.; Ida, K.; Sanuki, H.; Itoh, K.; Narihara, K.; Tanaka, K.; Kawahata, K.; Ohyabu, N.


    The radial electric field (E{sub r}) properties in LHD have been investigated to provide guidance towards improved confinement with possible E{sub r} transition and bifurcation. The ambipolar E{sub r} is obtained from the neoclassical flux based on the analytical formulae. This approach is appropriate to clarify ambipolar E{sub r} properties in a wide range of temperature and density in a more transparent way. The comparison between the calculated E{sub r} and the experimentally measured one has shown qualitatively good agreement, such as the threshold density for the transition from ion root to electron root. The calculations also reproduce well the experimentally observed tendency that the electron root is possible by increasing temperatures even for higher density and that the ion root is enhanced for higher density. Based on the usefulness of this approach for analyzing E{sub r} in LHD, calculations over a wide range have been performed to clarify the parameter region of interest where multiple solutions of E{sub r} can exist. This is the region where E{sub r} transition and bifurcation may be realized, as already experimentally confirmed in CHS. The systematic calculations give a comprehensive understanding of experimentally observed E{sub r} properties, which indicates an optimum path towards improved confinement. (author)

  20. Modeling Analysis and Improvement of Power Loss in Microgrid

    Directory of Open Access Journals (Sweden)

    H. Lan


    Full Text Available The consumption of conventional energy sources and environmental concerns have resulted in rapid growth in the amount of renewable energy introduced to power systems. With the help of distributed generation (DG), improvements in power loss and voltage profile are among the salient benefits. However, studies show that improper placement and sizing of an energy storage system (ESS) lead to undesired power loss and the risk of voltage instability, especially in the case of high renewable energy penetration. To solve the problem, this paper sets up a microgrid based on the IEEE 34-bus distribution system which consists of a wind power generation system, a photovoltaic generation system, a diesel generation system, and an energy storage system associated with various types of load. Furthermore, the particle swarm optimization (PSO) algorithm is proposed in the paper to minimize the power loss and improve the system voltage profiles by optimally managing the different sorts of distributed generation under consideration of the worst condition of renewable energy production. The established IEEE 34-bus system is adopted to perform case studies. The detailed simulation results for each case clearly demonstrate the necessity of optimal management of the system operation and the effectiveness of the proposed method.
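    A bare-bones PSO loop of the kind the paper relies on can be sketched as follows. This is illustrative only: the toy quadratic stands in for the IEEE 34-bus power-loss objective, and the swarm parameters are conventional textbook choices, not the paper's settings.

```python
import numpy as np

# Minimal particle swarm optimization: each particle tracks its personal
# best, and the swarm is pulled toward the global best.
def pso(loss, dim, n=30, iters=200, seed=3):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))            # positions (e.g. DG/ESS sizes)
    v = np.zeros((n, dim))                      # velocities
    pbest = x.copy()
    pval = np.apply_along_axis(loss, 1, x)
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        val = np.apply_along_axis(loss, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, pval.min()

# Toy objective standing in for the network power loss: minimum at (1,1,1,1).
best, best_loss = pso(lambda z: np.sum((z - 1.0) ** 2), dim=4)
```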

  1. Analysis Approach to Improve Star Rating Of Water Heater

    Directory of Open Access Journals (Sweden)

    Sujata Dabhade


    Full Text Available Electric water heaters, widely used all over the world, fall into two categories: instant water heaters and storage-type water heaters. For 6-litre units, energy consumption is much higher in the storage type. As energy is an important factor in the economic development of a country, there is a need to save energy, which motivates improving storage-type water heaters. In this work, an existing 6-litre model is upgraded from a 4-star to a 5-star rating through thermal analysis and improved insulation. After the theoretical calculation of the required glass-wool thickness, the product was tested against BEE norms and achieved the 5-star result. Finally, thermal analysis was carried out for theoretical and practical verification of the product.

  2. Improved statistics for genome-wide interaction analysis. (United States)

    Ueki, Masao; Cordell, Heather J


    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new "joint effects" statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al
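    One family of statistics the abstract mentions, those based on correlation coefficients, can be illustrated generically. This is not Wu et al.'s exact formula: it is the standard construction of comparing the SNP-SNP correlation in cases against that in controls via Fisher's z-transform, with all data simulated.

```python
import numpy as np

def fisher_z(r, n):
    # Fisher z-transform of a correlation and its approximate variance.
    return np.arctanh(r), 1.0 / (n - 3)

def interaction_z(g1_cases, g2_cases, g1_ctrls, g2_ctrls):
    r_ca = np.corrcoef(g1_cases, g2_cases)[0, 1]   # SNP-SNP correlation, cases
    r_co = np.corrcoef(g1_ctrls, g2_ctrls)[0, 1]   # same, controls
    z_ca, v_ca = fisher_z(r_ca, len(g1_cases))
    z_co, v_co = fisher_z(r_co, len(g1_ctrls))
    return (z_ca - z_co) / np.sqrt(v_ca + v_co)    # ~ N(0,1) under the null

rng = np.random.default_rng(5)
n = 2000
# Null scenario: genotype allele counts (0/1/2) independent in both groups,
# so the statistic should look like a standard normal draw.
g = lambda: rng.binomial(2, 0.3, n)
z_null = interaction_z(g(), g(), g(), g())
```

As the abstract notes, such correlation-based statistics can pick up main effects under linkage disequilibrium, which is exactly the caveat motivating the authors' "joint effects" alternatives.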

  3. An Effective Analysis of Weblog Files to Improve Website Performance

    Directory of Open Access Journals (Sweden)



    Full Text Available As there is enormous growth in the web in terms of web sites, the size of web usage data is also increasing gradually. This web usage data plays a vital role in the effective management of web sites. It is stored by the web server in a file called the weblog. In order to discover the knowledge required for improving the performance of websites, we need to apply the best preprocessing methodology to the server weblog file. Data preprocessing is a phase which automatically identifies meaningful patterns and user behavior. So far, analyzing weblog data has been a challenging task in the area of web usage mining. In this paper we propose an effective and enhanced data preprocessing methodology which produces efficient usage patterns and reduces the size of the weblog to 75-80% of its initial size. Experimental results are also shown in the following sections.
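    A minimal weblog-preprocessing sketch for Common Log Format entries is shown below. The field names, filter list, and cleaning rules are hypothetical illustrations, not the paper's methodology: parse each entry, drop malformed lines and requests for embedded resources, and keep only successful page hits.

```python
import re

# Apache/NCSA Common Log Format: host ident user [timestamp] "request" status bytes
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" (?P<status>\d{3}) \S+'
)
SKIP = ('.gif', '.jpg', '.png', '.css', '.js', '.ico')   # embedded resources

def clean(lines):
    out = []
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue                       # drop malformed entries
        url, status = m['url'], int(m['status'])
        if status != 200 or url.lower().endswith(SKIP):
            continue                       # drop failed hits and resources
        out.append((m['host'], m['ts'], url))
    return out

raw = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /logo.png HTTP/1.1" 200 512',
    '5.6.7.8 - - [10/Oct/2023:13:56:01 +0000] "GET /missing HTTP/1.1" 404 209',
]
records = clean(raw)   # only the /index.html page hit survives
```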

  4. Politics during Software Process Improvement: A Metatriangulation Analysis

    DEFF Research Database (Denmark)

    Mûller, Sune Dueholm; Mathiassen, Lars; Kræmmergaard, Pernille


    Software Process Improvement (SPI) has played a dominant role in systems development innovation research and practice for more than 20 years. However, while extant theory acknowledges the political nature of SPI initiatives, researchers have yet to empirically investigate and theorize about how organizational politics might impact outcomes. Against this backdrop, we apply metatriangulation to build new theory based on rich data from an SPI project in four business units at a high-tech firm. Reflecting the diverse ways in which politics manifests, we first analyze behaviors and outcomes in each unit...-the-talk, and keeping-up-appearances. Finally, we combine the patterns we observed with insights from the literature to build a metaparadigm theory of SPI politics. In addition to contributing to Information Systems (IS) research in the key area of systems development innovation, the study furthers understanding of how...

  5. Demodulation improvement analysis of FEC quasi-coherent CPM (United States)

    Norris, James A.; Nieto, John W.


    Continuous Phase Modulation (CPM) schemes are advantageous for low-power radios. The constant-envelope transmit signal is more efficient for both linear and non-linear amplifier architectures. A standard coherent CPM receiver can take advantage of modulation memory and is more complex than a coherent Phase Shift Keyed receiver. But the CPM signal can be demodulated non-coherently and still take advantage of the trellis structure inherent in the modulation. Prior analyses of several different non-coherent CPM schemes have been published, many achieving coherent or near-coherent performance. In this paper we discuss a new, reduced-complexity decoder that improves upon the non-coherent performance. In addition, this new algorithm generates soft decision metrics that allow the addition of a forward error correction scheme (an outer code) with coherent-equivalent performance gains.

  6. Improving Between-Shot Fusion Data Analysis with Parallel Structures

    Energy Technology Data Exchange (ETDEWEB)



    In the Phase I project we concentrated on three technical objectives to demonstrate the feasibility of the Phase II project: (1) the development of a parallel MDSplus data handler, (2) the parallelization of existing fusion data analysis packages, and (3) the development of techniques to automatically generate parallelized code using pre-compiler directives. We summarize the results of the Phase I research for each of these objectives below. We also describe below additional accomplishments related to the development of the TaskDL and mpiDL parallelization packages.

  7. Knickpoint finder: A software tool that improves neotectonic analysis (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.


    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
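    The knickpoint idea, flagging abrupt slope changes along a drainage profile, can be sketched as follows. This is a simplified 1-D illustration, not the ArcGIS Knickpoint Finder code; the `factor` threshold and the synthetic two-segment profile are assumptions.

```python
import numpy as np

# Flag profile nodes where the downstream gradient changes abruptly
# relative to the typical slope change along the profile.
def knickpoints(distance, elevation, factor=3.0):
    slope = np.gradient(elevation, distance)    # local channel slope
    dslope = np.abs(np.diff(slope))             # slope change between nodes
    threshold = factor * np.median(dslope + 1e-12)
    return np.where(dslope > threshold)[0] + 1  # indices of sharp breaks

# Synthetic profile: gentle reach upstream of 500 m, steeper reach below.
d = np.linspace(0.0, 1000.0, 101)
z = np.where(d < 500.0, 100.0 - 0.01 * d, 95.0 - 0.05 * (d - 500.0))
kp = knickpoints(d, z)    # expect a break flagged near d = 500 m
```

A DEM-based tool does the same thing along every extracted drainage line and then maps the flagged nodes for comparison with lineaments and epicenters.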

  8. Functional Virtual Prototyping in Vehicle Chassis Reform Analysis and Improvement Design

    Institute of Scientific and Technical Information of China (English)


    The contribution of functional virtual prototyping to vehicle chassis development is presented. The topics taken into consideration were reform analysis and improvement design during vehicle chassis development. A frame of coordinates based on the digital model was established, and the main CAE analysis methods, multi-body system dynamics and finite element analysis, were applied to the digital model built by CAD/CAM software. The method was applied in the vehicle chassis reform analysis and improvement design; all the analysis and design projects were implemented on the uniform digital model, and the development was carried through effectively.

  9. Performance Analysis of an Improved MUSIC DoA Estimator (United States)

    Vallet, Pascal; Mestre, Xavier; Loubaton, Philippe


    This paper addresses the statistical performance of subspace DoA estimation using a sensor array, in the asymptotic regime where the number of samples and sensors both converge to infinity at the same rate. Improved subspace DoA estimators were derived (termed G-MUSIC) in previous works, and were shown to be consistent and asymptotically Gaussian distributed in the case where the number of sources and their DoA remain fixed. In this case, which models widely spaced DoA scenarios, it is proved in the present paper that the traditional MUSIC method also provides consistent DoA estimates having the same asymptotic variances as the G-MUSIC estimates. The case of DoA that are spaced on the order of a beamwidth, which models closely spaced sources, is also considered. It is shown that G-MUSIC estimates are still able to consistently separate the sources, while this is no longer the case for the MUSIC ones. The asymptotic variances of G-MUSIC estimates are also evaluated.
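    A textbook MUSIC estimator for a half-wavelength uniform linear array can be sketched as follows. The paper's G-MUSIC finite-sample correction is not reproduced here, and the array size, SNR, source angles and search grid are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 10, 500                                   # sensors, snapshots
doas = np.deg2rad([-20.0, 15.0])                 # two widely spaced sources

def steering(theta, m):
    # ULA steering vectors at half-wavelength spacing.
    return np.exp(1j * np.pi * np.arange(m)[:, None] * np.sin(theta))

A = steering(np.asarray(doas), m)
S = (rng.normal(size=(2, n)) + 1j * rng.normal(size=(2, n))) / np.sqrt(2)
X = A @ S + 0.1 * (rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n)))

R = X @ X.conj().T / n                           # sample covariance
w, v = np.linalg.eigh(R)                         # ascending eigenvalues
En = v[:, : m - 2]                               # noise subspace (m - 2 vectors)

grid = np.deg2rad(np.linspace(-90.0, 90.0, 721))
a = steering(grid, m)
spectrum = 1.0 / np.linalg.norm(En.conj().T @ a, axis=0) ** 2

# Take the two largest local maxima of the pseudo-spectrum as DoA estimates.
peaks = [i for i in range(1, len(grid) - 1)
         if spectrum[i] > spectrum[i - 1] and spectrum[i] > spectrum[i + 1]]
top2 = sorted(sorted(peaks, key=lambda i: spectrum[i])[-2:])
est = np.rad2deg(grid[top2])                     # expect approx. [-20, 15]
```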

  10. Analysis and Measures to Improve Waste Management in Schools

    Directory of Open Access Journals (Sweden)

    Elena Cristina Rada


    Full Text Available Assessing waste production in schools highlights the contribution of school children and school staff to the total amount of waste generated in a region, as well as any poor practices of recycling (the so-called separate collection of waste) in schools by the students, which could be improved through educational activities. Educating young people regarding the importance of environmental issues is essential, since instilling the right behavior in school children is also beneficial to the behavior of their families. The way waste management was carried out in different schools in Trento (northern Italy) was analyzed: a primary school, a secondary school, and three high schools were taken as cases of study. The possible influence of the age of the students and of the various activities carried out within the schools on the different behaviors in separating waste was also evaluated. The results showed that the production of waste did not only depend on the size of the institutes and on the number of occupants, but, especially, on the type of activities carried out in addition to the ordinary classes and on the habits of both pupils and staff. In the light of the results obtained, some corrective measures were proposed to schools, aimed at increasing the awareness of the importance of the right behavior in waste management by students and the application of good practices of recycling.

  11. Analysis and improvement of cyclotron thallium target room shield. (United States)

    Hajiloo, N; Raisali, G; Aslani, G


    Because of the high neutron and gamma-ray intensities generated during bombardment of a thallium-203 target, a thallium target-room shield and different ways of improving it have been investigated. Leakage neutron and gamma-ray dose rates at various points behind the shield are calculated by simulating the transport of neutrons and photons using the Monte Carlo N Particle transport computer code. By considering the target-room geometry, its associated shield, and the neutron and gamma-ray source strengths and spectra, three designs for enhancing shield performance have been analysed: a shielding door at the maze entrance, covering the maze walls with layers of some effective materials, and adding a shadow-shield in the target room in front of the radiation source. Dose calculations were carried out separately for different materials and dimensions for all the shielding scenarios considered. The shadow-shield proved to be a suitable option for reducing both the neutron and gamma dose equivalent. A 7.5-cm thick polyethylene shadow-shield reduces both the dose equivalent rate at the maze entrance door and the leakage from the shield by a factor of 3.

  12. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Polly, B.


    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.

  13. Stiffness Analysis and Improvement of Bolt-Plate Contact Assemblies

    DEFF Research Database (Denmark)

    Pedersen, Niels Leergaard; Pedersen, Pauli


    In a previous study it was shown that a simplified expression for the stiffness of the plate member in a bolt-plate assembly can be found. The stiffnesses of the bolt and the connected plates are the primary quantities that control the lifetime of a dynamically loaded connection. The present study of stiffnesses is extended to include different material parameters by including the influence of Poisson's ratio. Two simple practical formulas are suggested and their accuracies are documented for different bolts and different materials (Poisson's ratio). Secondly, the contact analysis between the bolt head and the plate is extended by the possibility of designing a gap, that is, a nonuniform distance between the bolt and plate before prestressing. Designing the gap function generates the possibility for a better stress field by which the stiffness of the bolt is lowered, and at the same time the stiffness...

  14. An improved method for thin layer chromatographic analysis of saponins. (United States)

    Sharma, Om P; Kumar, Neeraj; Singh, Bikram; Bhat, Tej K


    Analysis of saponins by thin layer chromatography (TLC) is reported. The solvent system was n-butanol:water:acetic acid (84:14:7). Detection of saponins on the TLC plates after development and air-drying was done by immersion in a suspension of sheep erythrocytes, followed by washing off the excess blood on the plate surface. Saponins appeared as white spots against a pink background. The protocol provided specific detection of saponins in the saponin-enriched extracts from Aesculus indica (Wall. ex Camb.) Hook.f., Lonicera japonica Thunb., Silene inflata Sm., Sapindus mukorossi Gaertn., Chlorophytum borivilianum Santapau & Fernandes, Asparagus adscendens Roxb., Asparagus racemosus Willd., Agave americana L., and Camellia sinensis [L.] O. Kuntze. The protocol is convenient, inexpensive, does not require any corrosive chemicals and provides specific detection of saponins.

  15. Improving Human/Autonomous System Teaming Through Linguistic Analysis (United States)

    Meszaros, Erica L.


An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogeneous teams. Procedures exist for licensing pilots to operate in the national airspace system, and current work is being done to define methods for validating the function of autonomous systems; however, there is no method in place for assessing the interaction of these two disparate systems. Moreover, these systems are currently operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can lead to increased usability and allow for operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  16. Wheat signature modeling and analysis for improved training statistics (United States)

    Nalepka, R. F. (Principal Investigator); Malila, W. A.; Cicone, R. C.; Gleason, J. M.


The author has identified the following significant results. The spectral, spatial, and temporal characteristics of wheat and other signatures in LANDSAT multispectral scanner data were examined through empirical analysis and simulation. Irrigation patterns varied widely within Kansas; 88 percent of wheat acreage in Finney County was irrigated and 24 percent in Morton County, as opposed to less than 3 percent for the western two-thirds of the state. Irrigation practice was definitely correlated with the observed spectral response; wheat variety differences produced observable spectral differences due to leaf coloration and different dates of maturation. Between-field differences were generally greater than within-field differences, and boundary pixels produced spectral features distinct from those within field centers. Multiclass boundary pixels contributed much of the observed bias in proportion estimates. The variability between signatures obtained by different draws of training data decreased as the sample size became larger; the resulting signatures also became more robust, and the particular decision threshold value became less important.

  17. System Engineering Analysis For Improved Scout Business Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    Van Slyke, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


monitoring of content that is accessible. The study examines risks associated with information security, technological change, and the continued popularity of Scouting. Mitigation is based on system functions that are defined. The approach to developing an improved system for facilitating Boy Scout leader functions was iterative, with insights into capabilities coming in the course of working through the use cases and sequence diagrams.


    Directory of Open Access Journals (Sweden)

    V. N. Alferov


Full Text Available The current regulatory analysis of the financial condition of insolvent organizations has several shortcomings and does not account for the features of analysis based on consolidated financial statements under IFRS and GAAP. In this work, on the basis of a comparative analysis of the financial condition of a number of large Russian companies, calculated from their accounting statements prepared under Russian accounting standards, IFRS, and GAAP, proposals are developed to improve the analysis of the financial condition of insolvent organizations.


  20. A quality improvement study using fishbone analysis and an electronic medical records intervention to improve care for children with asthma. (United States)

    Gold, Jonathan; Reyes-Gastelum, David; Turner, Jane; Davies, H Dele


Despite expert guidelines, gaps persist in the quality of care for children with asthma. This study sought to identify barriers and potential interventions to improve compliance with national asthma prevention guidelines at a single academic pediatric primary care clinic. Using the plan-do-check-act (PDCA) quality improvement framework and fishbone analysis, a group of key stakeholders identified several barriers to consistent asthma processes and possible interventions. Two interventions were implemented using the electronic medical record (EMR). Physician documentation of asthma quality measures was analyzed before the intervention and at 2 subsequent time points over 16 months. Documentation of asthma action plans improved in the core group, supporting the use of EMR-based interventions to improve asthma care in a pediatric primary care setting.

  1. Air Force Training: Further Analysis and Planning Needed to Improve Effectiveness (United States)


AIR FORCE TRAINING: Further Analysis and Planning Needed to Improve Effectiveness. Report to congressional committees, GAO-16-864, September 2016. Why GAO Did This Study: For more than a decade, the Air Force focused its training on supporting operations in the Middle East. The Air Force

  2. Analysis of Solidification of High Manganese Steels Using Improved Differential Thermal Analysis Method

    Institute of Scientific and Technical Information of China (English)

    Chang-ling ZHUANG; Jian-hua LIU; Christian BERNHARD; Peter PRESOLY


High manganese steels can damage the differential thermal analysis (DTA) instrument due to manganese evaporation during high temperature experiments. After analyzing the relationship between residual oxygen and manganese evaporation, tantalum metal was employed to modify the crucible of the DTA, and a zirconium getter together with strict gas purification measures was applied to control the volatilization of manganese. By these modifications, problems of thermocouple damage and DTA instrument contamination were successfully resolved. Cobalt samples were adopted to calibrate the accuracy of the DTA instrument under the same trial conditions as the high manganese steel samples, and the detection error was confirmed to be less than 1 °C. Liquidus and solidus temperatures of high Mn steels were measured by the improved DTA method. It was found that the liquidus temperatures of the tested samples increased linearly with the heating rate. To eliminate the effect of the heating rate, the equilibrium liquidus temperature was determined by fitting the liquidus temperatures at different heating rates, and is referred to as the real liquidus temperature. No clear relationship between solidus temperature and heating rate was found, so the solidus temperature was finally set as the average of several experimental values.
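
The fitting step described above can be sketched as a linear fit of liquidus temperature against heating rate, with the zero-rate intercept taken as the equilibrium liquidus (a plausible reading of the fitting step; the temperatures and rates below are invented for illustration, not the paper's measurements):

```python
import numpy as np

# Hypothetical DTA measurements: liquidus temperature (°C) rises
# roughly linearly with heating rate (K/min), as reported in the study.
heating_rates = np.array([5.0, 10.0, 15.0, 20.0])            # K/min
liquidus_temps = np.array([1395.2, 1398.1, 1401.3, 1404.0])  # °C

# Fit T_liq = slope * rate + intercept; the intercept is the
# extrapolated "equilibrium" liquidus at zero heating rate.
slope, intercept = np.polyfit(heating_rates, liquidus_temps, 1)
print(f"equilibrium liquidus ~ {intercept:.1f} °C ({slope:.3f} °C per K/min)")

# The solidus, showing no clear rate dependence, is a plain average.
solidus_temps = np.array([1310.5, 1311.2, 1310.1, 1311.0])
print(f"solidus ~ {solidus_temps.mean():.1f} °C")
```
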

  3. Functional improvement after carotid endarterectomy: demonstrated by gait analysis and acetazolamide stress brain perfusion SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. S.; Kim, G. E.; Yoo, J. Y.; Kim, D. G.; Moon, D. H. [Asan Medical Center, Seoul (Korea, Republic of)


Scientific documentation of neurologic improvement following carotid endarterectomy (CEA) has not been established. The purpose of this prospective study was to investigate whether CEA performed for internal carotid artery flow lesions improves gait and cerebrovascular hemodynamic status in patients with gait disturbance. We prospectively performed pre- and post-CEA gait analysis and acetazolamide stress brain perfusion SPECT (Acz-SPECT) with Tc-99m ECD in 91 patients (M/F: 81/10, mean age: 64.1 y) who had gait disturbance before receiving CEA. Gait performance was assessed using a Vicon 370 motion analyzer, and gait improvement after CEA was correlated with cerebrovascular hemodynamic change as well as symptom duration. Twelve hemiparetic stroke patients (M/F: 9/3, mean age: 51 y) who did not receive CEA served as controls and underwent gait analysis twice at a one-week interval to evaluate whether repeat testing of gait performance shows a learning effect. Of the 91 patients, 73 (80%) showed gait improvement (change of gait speed > 10%) and 42 (46%) showed marked improvement (change of gait speed > 20%), whereas no improvement was observed in the control group on repeat testing. Post-operative cerebrovascular hemodynamic improvement was noted in 49 (54%) of the 91 patients. Gait improvement was markedly greater in the group with cerebrovascular hemodynamic improvement than in the no-change group (p<0.05). Marked gait improvement and cerebrovascular hemodynamic improvement were noted in 53% and 61%, respectively, of the patients who had less than a 3-month history of symptoms, compared with 31% and 24% of the patients whose symptoms had lasted longer than 3 months (p<0.05). Marked gait improvement was obtained in patients whose cerebrovascular hemodynamic status improved on Acz-SPECT after CEA. These results suggest that functional improvement such as gait can result from improved perfusion of the misery perfusion area, which remains viable for a longer period than previously reported in the literature.

  4. Cause-Effect Analysis: Improvement of a First Year Engineering Students' Calculus Teaching Model (United States)

    van der Hoff, Quay; Harding, Ansie


This study focuses on the mathematics department at a South African university, and in particular on the teaching of calculus to first year engineering students. The paper reports on a cause-effect analysis, a technique often used for business improvement. The cause-effect analysis indicates that there are many factors that impact on secondary school teaching of…

  5. A meshless based method for solution of integral equations: Improving the error analysis


    Mirzaei, Davoud


    This draft concerns the error analysis of a collocation method based on the moving least squares (MLS) approximation for integral equations, which improves the results of [2] in the analysis part. This is mainly a translation from Persian of some parts of Chapter 2 of the author's PhD thesis in 2011.

  6. Evaluating and Improving the SAMA (Segmentation Analysis and Market Assessment) Recruiting Model (United States)


Master's Thesis. Evaluating and Improving the SAMA (Segmentation Analysis and Market Assessment) Recruiting Model. This tool calculates the recruitment potential of recruiting centers using a four-year weighted performance average within customized Army market segments. An analysis of the current SAMA model shows it

  7. An improvement of window factor analysis for resolution of noisy HPLC-DAD data

    Institute of Scientific and Technical Information of China (English)

    邵学广; 林祥钦; 邵利民; 李梅青


Window factor analysis (WFA) is a powerful tool for analyzing evolutionary processes. However, window factor analysis was found to be highly sensitive to the noise in the original data matrix. An error analysis showed that the concentration profiles resolved by conventional window factor analysis are easily distorted by the noise retained by abstract factor analysis (AFA), and a modified algorithm for window factor analysis was proposed. Both simulated and experimental HPLC-DAD data were investigated by the conventional and the improved methods. Results show that the improved method yields less noise-distorted concentration profiles than the conventional method, and that its ability to resolve noisy data sets is greatly enhanced.

  8. Use-related risk analysis for medical devices based on improved FMEA. (United States)

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping


In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
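
As a sketch of the grey-relational side of such an improved FMEA (the failure modes and ratings below are hypothetical, and the fuzzy defuzzification step is omitted for brevity), failure modes can be prioritized by their grey relational grade against a worst-case reference series:

```python
import numpy as np

# Hypothetical ratings of four failure modes on severity (S),
# occurrence (O) and detectability (D), each on a 1-10 scale.
modes = ["wrong dose entry", "cable wear", "display freeze", "misread label"]
ratings = np.array([
    [8, 3, 6],
    [5, 6, 4],
    [7, 2, 8],
    [6, 5, 5],
], dtype=float)

# Grey relational analysis against a worst-case reference series:
# modes closest to the worst observed ratings get highest priority.
reference = ratings.max(axis=0)          # worst observed rating per factor
delta = np.abs(ratings - reference)      # deviation sequences
rho = 0.5                                # distinguishing coefficient
coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grade = coeff.mean(axis=1)               # grey relational grade per mode

for name, g in sorted(zip(modes, grade), key=lambda p: -p[1]):
    print(f"{name:>16s}: {g:.3f}")
```

Higher grades flag the failure modes to address first; the distinguishing coefficient rho = 0.5 is the customary default.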

  9. A potential target gene for the host-directed therapy of mycobacterial infection in murine macrophages (United States)

    Bao, Zhang; Chen, Ran; Zhang, Pei; Lu, Shan; Chen, Xing; Yao, Yake; Jin, Xiaozheng; Sun, Yilan; Zhou, Jianying


Mycobacterium tuberculosis (MTB), one of the major bacterial pathogens for lethal infectious diseases, is capable of surviving within the phagosomes of host alveolar macrophages; therefore, host genetic variations may alter the susceptibility to MTB. In this study, to identify host genes exploited by MTB during infection, genes were non-selectively inactivated using lentivirus-based antisense RNA methods in RAW264.7 macrophages, and the cells that survived virulent MTB infection were then screened. Following DNA sequencing of the surviving cell clones, 26 host genes affecting susceptibility to MTB were identified and their pathways were analyzed by bioinformatics analysis. In total, 9 of these genes were confirmed as positive regulators of collagen α-5(IV) chain (Col4a5) expression, a gene encoding a type IV collagen subunit present on the cell surface. The knockdown of Col4a5 consistently suppressed intracellular mycobacterial viability, promoting the survival of RAW264.7 macrophages following mycobacterial infection. Furthermore, Col4a5 deficiency lowered the pH levels of intracellular vesicles, including endosomes, lysosomes and phagosomes in the RAW264.7 cells. Finally, the knockdown of Col4a5 post-translationally increased microsomal vacuolar-type H+-ATPase activity in macrophages, leading to the acidification of intracellular vesicles. Our findings reveal a novel role for Col4a5 in the regulation of macrophage responses to mycobacterial infection and identify Col4a5 as a potential target for host-directed anti-mycobacterial therapy. PMID:27432120

  10. Utilizing Collaborative Analysis of Student Learning in Educator Preparation Programs for Continuous Improvement

    Directory of Open Access Journals (Sweden)

    Susan Colby


Full Text Available In this results-oriented era of accountability, educator preparation programs are called upon to provide comprehensive data on student and program outcomes while also providing evidence of continuous improvement. Collaborative Analysis of Student Learning (CASL) is one approach for fostering critical inquiry about student learning. Graduate educator preparation programs at our university used collaborative analysis as the basis for continuous improvement during an accreditation cycle. As authors of this study, we sought to better understand how graduate program directors and faculty used collaborative analysis to inform practice and improve programs. Our findings suggest that CASL has the potential to foster collective responsibility for student learning, but only with a strong commitment from administrators and faculty, purposefully designed protocols and processes, fidelity to the CASL method, and a focus on professional development. Through CASL, programs can produce meaningful data on student and program outcomes and meet the requirements for accreditation.

  11. Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE) (United States)


Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE). May 13-14, 2015. Dr. Ying... The project installs the LLA/CLA/SSA system as a web service in the Defense Acquisition Visibility Environment (DAVE) test bed via the AT&L eBusiness Center.

  12. Interprofessional service improvement learning and patient safety: a content analysis of pre-registration students' assessments. (United States)

    Machin, Alison I; Jones, Diana


    A culture of continuous service improvement underpins safe, efficient and cost-effective health and social care. This paper reports a qualitative research study of assessment material from one cohort of final year pre-registration health and social care students' interprofessional service improvement learning experience. Initially introduced to the theory of service improvement, students were linked with an interprofessional buddy group, and subsequently planned and implemented, if possible, a small scale service improvement project within a practice placement setting. Assessment was by oral project presentation and written reflection on learning. Summative assessment materials from 150 students were subjected to content analysis to identify: service user triggers for service improvement; ideas to address the identified area for improvement; and perceptions of service improvement learning. Triggers for service improvements included service user disempowerment, poor communication, gaps in service provision, poor transitions, lack of information, lack of role clarity and role duplication, and differed between professions. Ideas for improvement included both the implementation of evidence based best practice protocols in a local context and also innovative approaches to problem solving. Students described both intrapersonal and interprofessional learning as a result of engaging with service improvement theory and practice. Service improvement learning in an interprofessional context has positive learning outcomes for health and social care students. Students can identify improvement opportunities that may otherwise go undetected. Engaging positively in interprofessional service improvement learning as a student is an important rehearsal for life as a qualified practitioner. It can help students to develop an ability to challenge unsafe practice elegantly, thereby acting as advocates for the people in their care. Universities can play a key support role by working

  13. An improved multiple linear regression and data analysis computer program package (United States)

    Sidik, S. M.


NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, together with CREDUC and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum-seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
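
A minimal double-precision regression sketch in the spirit of the capabilities listed (coefficients, t-statistics, residuals); the data are synthetic and the variable names are illustrative, not NEWRAP's actual interface:

```python
import numpy as np

# Synthetic data: two independent variables, one response.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(30, 2))
y = 3.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 0.5, 30)

# Design matrix with intercept; all arithmetic is in double precision,
# the improvement NEWRAP made over RAPIER.
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Residual variance, standard errors, and t-statistics per coefficient.
resid = y - A @ beta
dof = len(y) - A.shape[1]
s2 = resid @ resid / dof
se = np.sqrt(s2 * np.diag(np.linalg.inv(A.T @ A)))
t_stats = beta / se

for name, b, t in zip(["intercept", "x1", "x2"], beta, t_stats):
    print(f"{name}: coef={b:+.3f}, t={t:.1f}")
```
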

  14. Waste Minimization Improvements Achieved Through Six Sigma Analysis Result In Significant Cost Savings

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Jeffrey, D.; Jansen, John, R.; Janke, David, H.; Plowman, Catherine, M.


Improved waste minimization practices at the Department of Energy's (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) are leading to a 15% reduction in the generation of hazardous and radioactive waste. Bechtel BWXT Idaho, LLC (BBWI), the prime management and operations contractor at the INEEL, applied the Six Sigma improvement process to the INEEL Waste Minimization Program to review existing processes and define opportunities for improvement. Our Six Sigma analysis team, composed of an executive champion, a process owner, a black belt and a yellow belt, and technical and business team members, used this statistics-based process approach to analyze work processes and produced ten recommendations for improvement. Recommendations ranged from waste generator financial accountability for newly generated waste to enhanced employee recognition programs for waste minimization efforts. These improvements have now been implemented to reduce waste generation rates and are producing positive results.

  15. Improvement in Student Data Analysis Skills after Out-of-Class Assignments

    Directory of Open Access Journals (Sweden)

    Kristen Lee Williams Walton


    Full Text Available The ability to understand and interpret data is a critical aspect of scientific thinking.  However, although data analysis is often a focus in biology majors classes, many textbooks for allied health majors classes are primarily content-driven and do not include substantial amounts of experimental data in the form of graphs and figures.  In a lower-division allied health majors microbiology class, students were exposed to data from primary journal articles as take-home assignments and their data analysis skills were assessed in a pre-/posttest format.  Students were given 3 assignments that included data analysis questions.  Assignments ranged from case studies that included a figure from a journal article to reading a short journal article and answering questions about multiple figures or tables.  Data were represented as line or bar graphs, gel photographs, and flow charts.  The pre- and posttest was designed incorporating the same types of figures to assess whether the assignments resulted in any improvement in data analysis skills.  The mean class score showed a small but significant improvement from the pretest to the posttest across three semesters of testing.  Scores on individual questions testing accurate conclusions and predictions improved the most.  This supports the conclusion that a relatively small number of out-of-class assignments through the semester resulted in a significant improvement in data analysis abilities in this population of students.
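
The pre-/posttest comparison described above amounts to a paired analysis of the same students' scores; a minimal sketch with invented scores (not the study's data) looks like this:

```python
import math

# Invented pre- and posttest data-analysis scores for ten students
# (same student at the same index in both lists).
pre  = [6, 5, 7, 4, 6, 5, 8, 6, 5, 7]
post = [7, 6, 7, 6, 7, 6, 9, 7, 6, 8]

# Paired t-test: work with per-student differences.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
t = mean_d / math.sqrt(var_d / n)
print(f"mean improvement = {mean_d:.2f} points, paired t = {t:.2f} (df = {n - 1})")
```
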

  16. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development (United States)


Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development. BAE Systems, February... Contract FA8750-13-C-0240. General adoption of these techniques has had limited penetration in the software development community. Two interrelated causes may account for

  17. Identification of Energy Efficiency Opportunities through Building Data Analysis and Achieving Energy Savings through Improved Controls

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Taasevigen, Danny J.; Koran, Bill


This chapter highlights analysis techniques to identify energy efficiency opportunities and improve operations and controls. A free tool, Energy Charting and Metrics (ECAM), is used to assist in the analysis of whole-building data, sub-metered data, and/or data from the building automation system (BAS). Appendix A describes the features of ECAM in more depth and also provides instructions for downloading ECAM and all resources pertaining to its use.

  18. Improved Detection of Time Windows of Brain Responses in Fmri Using Modified Temporal Clustering Analysis

    Institute of Scientific and Technical Information of China (English)


Temporal clustering analysis (TCA) has been proposed recently as a method to detect time windows of brain responses in functional MRI (fMRI) studies when the timing and location of the activation are completely unknown. Modifications to the TCA technique are introduced in this report to further improve the sensitivity in detecting brain activation.

  19. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara


    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…
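
The core computation of such an analysis is the cost-effectiveness ratio, cost divided by effect; the interventions, costs, and effects below are invented placeholders, not figures from the article:

```python
# Hypothetical per-student costs and effects (additional high school
# completers per 100 students) for three illustrative interventions.
interventions = {
    "tutoring":      {"cost": 4000.0, "effect": 8.0},
    "mentoring":     {"cost": 1500.0, "effect": 2.5},
    "early warning": {"cost":  600.0, "effect": 1.5},
}

# CE ratio = cost per additional completer; lower is better.
for name, d in sorted(interventions.items(),
                      key=lambda kv: kv[1]["cost"] / kv[1]["effect"]):
    ratio = d["cost"] / d["effect"]
    print(f"{name:>14s}: ${ratio:,.0f} per additional completer")
```

The wide spread between such ratios is exactly the variation the article documents across programs.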

  20. Improving torque per kilogram magnet of permanent magnet couplings using finite element analysis

    DEFF Research Database (Denmark)

    Högberg, Stig; Jensen, Bogi Bech; Bendixen, Flemming Buus


    This paper presents the methodology and subsequent findings of a performance-improvement routine that employs automated finite element (FE) analysis to increase the torque-per-kilogram-magnet (TPKM) of a permanent magnet coupling (PMC). The routine is applied to a commercially available cylindrical...

  1. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis (United States)

    Kumar, Ranjan; Ghosh, Achyuta Krishna


Mine systems such as ventilation systems, strata support systems, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is preferably done during the planning and design stage. However, existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine system design. This approach combines ET and FT modeling with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system at either the component or system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal accident scenarios and improve the safety of complex mine systems simultaneously.
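
A minimal sketch of the ET/FT probability arithmetic, assuming independent basic events and an invented two-gate tree (not the paper's mine model), including the effect of allocating redundancy at the component level:

```python
# Minimal fault-tree sketch: top event "methane explosion" =
# AND(ignition source, methane accumulation), where methane
# accumulation = OR(ventilation failure, sensor failure).

def p_or(*ps):
    """Probability that at least one independent event occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """Probability that all independent events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Invented basic-event probabilities.
p_vent_fail, p_sensor_fail, p_ignition = 0.02, 0.05, 0.10
top = p_and(p_ignition, p_or(p_vent_fail, p_sensor_fail))
print(f"top hazard probability: {top:.6f}")

# Redundancy allocation: duplicate the sensor so both must fail.
top_redundant = p_and(p_ignition, p_or(p_vent_fail, p_sensor_fail ** 2))
print(f"with redundant sensor:  {top_redundant:.6f}")
```
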

  2. An improved modal pushover analysis procedure for estimating seismic demands of structures

    Institute of Scientific and Technical Information of China (English)

    Mao Jianmeng; Zhai Changhai; Xie Lili


The pushover analysis (POA) procedure is difficult to apply to high-rise buildings, as it cannot account for the contributions of higher modes. To overcome this limitation, a modal pushover analysis (MPA) procedure was proposed by Chopra et al. (2001). However, invariable lateral force distributions are still adopted in the MPA. In this paper, an improved MPA procedure is presented to estimate the seismic demands of structures, considering the redistribution of inertia forces after the structure yields. The improved procedure is verified with numerical examples of 5-, 9- and 22-story buildings. It is concluded that the improved MPA procedure is more accurate than either the POA or the MPA procedure. In addition, the proposed procedure avoids a large computational effort by adopting a two-phase lateral force distribution.

  3. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks. (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih


The purposes of this study were to develop a latent human error analysis process, to explore the factors behind latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, HFACS and RCA were used to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was then applied to evaluate the error factors. Results show that (1) adverse physiological states, (2) physical/mental limitations, and (3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. The analysis process addresses shortcomings of existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology.
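
A crisp TOPSIS sketch of the ranking step (the study uses fuzzy TOPSIS; here defuzzified scores are assumed, and all factor names and numbers are illustrative, not the study's data):

```python
import numpy as np

# Rows: candidate error factors; columns: four hypothetical criteria,
# all treated as benefit criteria (higher score = higher priority).
factors = ["physiological states", "physical/mental limits",
           "communication/planning", "equipment design"]
scores = np.array([
    [7.0, 6.5, 8.0, 6.0],
    [6.5, 7.0, 7.5, 5.5],
    [8.0, 7.5, 7.0, 6.5],
    [5.0, 4.5, 6.0, 7.0],
])

norm = scores / np.linalg.norm(scores, axis=0)    # vector normalization
ideal, anti = norm.max(axis=0), norm.min(axis=0)  # ideal / anti-ideal
d_pos = np.linalg.norm(norm - ideal, axis=1)      # distance to ideal
d_neg = np.linalg.norm(norm - anti, axis=1)       # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)               # relative closeness

for f, c in sorted(zip(factors, closeness), key=lambda p: -p[1]):
    print(f"{f:>24s}: {c:.3f}")
```

Factors with the highest relative closeness sit nearest the ideal solution and are prioritized for improvement.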

  4. Menu Analysis for Improved Customer Demand and Profitability in Hospital Cafeterias. (United States)

    Mann, Linda L.; MacInnis, Donna; Gardiner, Nicole


Several sophisticated menu analysis methods have been compared in studies using theoretical restaurant menus. Institutional, and especially hospital, cafeterias differ from commercial restaurants in ways that may influence the effectiveness of these menu analysis methods. In this study, we compared three menu analysis methods (menu engineering, goal value analysis, and marginal analysis) in an institutional setting to evaluate their relative effectiveness for menu management decision-making. The three methods were used to analyze menu cost and sales data for a representative cafeteria in a large metropolitan hospital. The results were compared with informal analyses by the manager and an employee to determine their accuracy and the value of the information for decision-making. Results suggested that all three methods would improve menu planning and pricing, which in turn would enhance customer demand (revenue) and profitability. However, menu engineering was ranked the easiest of the three methods to interpret.
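
Of the three methods, menu engineering classifies items by popularity and contribution margin relative to menu averages; a minimal sketch with invented cafeteria data (not the hospital's figures):

```python
# Hypothetical cafeteria items: (units sold, selling price, food cost).
items = {
    "soup":      (420, 3.50, 1.10),
    "sandwich":  (310, 6.00, 2.80),
    "salad":     (150, 5.50, 2.90),
    "daily hot": (260, 7.50, 3.20),
}

total_sold = sum(sold for sold, _, _ in items.values())
margins = {name: price - cost for name, (sold, price, cost) in items.items()}
# Sales-weighted average contribution margin across the menu.
avg_margin = sum(margins[n] * items[n][0] for n in items) / total_sold
# Conventional popularity benchmark: 70% of an equal menu share.
pop_threshold = 0.7 * (1.0 / len(items))

labels = {(True, True): "star", (True, False): "plowhorse",
          (False, True): "puzzle", (False, False): "dog"}
classification = {}
for name, (sold, price, cost) in items.items():
    popular = sold / total_sold >= pop_threshold
    profitable = margins[name] >= avg_margin
    classification[name] = labels[(popular, profitable)]
    print(f"{name:>9s}: {classification[name]}")
```

Stars are kept and promoted, plowhorses repriced, puzzles repositioned, and dogs are candidates for removal.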

  5. The Effectiveness of Transactional Analysis Group-counseling on the Improvement of Couples’ Family Functioning

    Directory of Open Access Journals (Sweden)

    Ghorban Ali Yahyaee


Full Text Available Background & Aims of the Study: Family functioning is among the most important factors ensuring the mental health of family members. Disorder or disturbance in family functioning causes many psychological problems for family members. The current study examined the effectiveness of transactional analysis group counseling on the improvement of couples' family functioning. Materials & Methods: The study used a semi-experimental pretest-posttest design with follow-up and a control group. The statistical population consisted of all couples referred to the psychological and counseling centers of Rasht city in 2012. Participants were first selected by convenience sampling; after they completed the Family Assessment Device and met the inclusion score, they were randomly assigned to experimental and control groups (n = 8 couples per group). The experimental group participated in 12 sessions of group counseling based on transactional analysis, and the control group received no intervention. The gathered data were analyzed using analysis of covariance. Results: The results show significant differences between the pretest and posttest scores of the experimental group at the 0.05 level, indicating that transactional analysis group therapy improved the dimensions of family functioning in couples. Conclusions: Transactional analysis group counseling can improve family functioning, and using this approach in work with couples is recommended.

  6. Improved thermodynamic analysis of gas reactions for compound semiconductor growth by vapor-phase epitaxy (United States)

    Inatomi, Yuya; Kangawa, Yoshihiro; Kakimoto, Koichi; Koukitu, Akinori


    An improved thermodynamic analysis method for vapor-phase epitaxy is proposed. In the conventional method, the mass-balance constraint equations are expressed in terms of variations in partial pressure. Although the conventional method is appropriate for gas–solid reactions occurring near the growth surface, it is not suitable for gas reactions that involve changes in the number of gas molecules. We reconsider the constraint equations in order to predict the effect of gas reactions on semiconductor growth processes. To demonstrate the feasibility of the improved method, the growth process of group-III nitrides by metalorganic vapor-phase epitaxy has been investigated.

  7. Research on artificial neural network intrusion detection based on the improved wavelet analysis and transformation (United States)

    Li, Hong; Ding, Xue


    This paper combines wavelet analysis and wavelet transform theory with an artificial neural network, pre-processing point feature attributes before intrusion detection so that they are suitable for a wavelet neural network. The resulting intrusion classification model gains better adaptability and self-learning ability, greatly strengthens the wavelet neural network's ability to solve field intrusion detection problems, reduces storage space, improves the performance of the constructed neural network, and reduces training time. Simulation experiments on the KDDCup99 data set show that this method not only reduces the complexity of constructing the wavelet neural network but also preserves the accuracy of intrusion classification.

  8. Improving the precipitation accumulation analysis using lightning measurements and different integration periods (United States)

    Gregow, Erik; Pessi, Antti; Mäkelä, Antti; Saltikoff, Elena


    This article focuses on improving the precipitation accumulation analysis, particularly for intense precipitation events. Two main objectives are addressed: (i) the assimilation of lightning observations together with radar and gauge measurements, and (ii) the analysis of the impact of different integration periods in the radar-gauge correction method. The article is a continuation of previous work by Gregow et al. (2013) in the same research field. A new lightning data assimilation method has been implemented and validated within the Finnish Meteorological Institute - Local Analysis and Prediction System. Lightning data improve the analysis when no radars are available, and even with radar data, lightning data have a positive impact on the results. The radar-gauge assimilation method is highly dependent on statistical relationships between radar and gauges when correcting the precipitation accumulation field. Here, we investigate the use of different time integration intervals: 1, 6, 12, 24 h and 7 days. This changes the amount of data used and affects the statistical calculation of the radar-gauge relations. Verification shows that the real-time analysis using the 1 h integration period gives the best results.
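As an illustration of the kind of radar-gauge correction that the integration period affects, here is a toy mean-field bias sketch (not the FMI LAPS implementation): a gauge-to-radar ratio is computed from pairs accumulated over a chosen period and applied to the radar accumulation. All values are invented; a longer period simply feeds more pairs into the statistic.

```python
def mean_field_bias(gauge_mm, radar_mm):
    """Bias factor = sum(gauge) / sum(radar) over the integration period."""
    return sum(gauge_mm) / sum(radar_mm)

# 1-hour window: few gauge/radar pairs, so the statistic is noisy.
gauge_1h, radar_1h = [2.0, 1.5], [1.6, 1.4]
# 24-hour window: more pairs accumulate, smoothing the bias estimate.
gauge_24h, radar_24h = [12.0, 9.5, 14.1, 8.4], [10.0, 8.0, 12.2, 7.8]

for label, g, r in [("1 h", gauge_1h, radar_1h), ("24 h", gauge_24h, radar_24h)]:
    b = mean_field_bias(g, r)
    corrected = [x * b for x in r]  # apply the bias to the radar field
    print(label, round(b, 3))
```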

  9. Effectiveness of Cognitive and Transactional Analysis Group Therapy on Improving Conflict-Solving Skill

    Directory of Open Access Journals (Sweden)

    Bahram A. Ghanbari-Hashemabadi


    Full Text Available Background: Today, learning communication skills such as conflict solving is very important. The purpose of the present study was to investigate the efficiency of cognitive and transactional analysis group therapy in improving conflict-solving skills. Materials and Method: This was an experimental study with pretest-posttest and a control group. Forty-five clients referred to the counseling and psychological services center of Ferdowsi University of Mashhad were chosen by a screening method and randomly divided into three equal groups: a control group (15 participants), a cognitive experimental group (15 participants) and a transactional analysis group (15 participants). A conflict-solving questionnaire was used to collect data, and the interventions, cognitive and transactional analysis group therapy, were administered during 8 weekly two-hour sessions. Mean and standard deviation were used for data analysis at the descriptive level, and one-way ANOVA at the inferential level. Results: Conflict-solving skills in the two experimental groups increased significantly. Conclusion: The findings indicate that both cognitive and transactional analysis group therapy can be effective interventions for improving conflict-solving skills.

  10. Factorial kriging analysis - a geostatistical approach to improve reservoir characterization with seismic data

    Energy Technology Data Exchange (ETDEWEB)

    Mundim, Evaldo Cesario; Johann, Paulo R. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Remacre, Armando Zaupa [Universidade Estadual de Campinas, SP (Brazil)


    In this work, Factorial Kriging analysis for the filtering of seismic attributes applied to reservoir characterization is considered. Factorial Kriging works in the spatial domain in a way analogous to Spectral Analysis in the frequency domain. The incorporation of filtered attributes as a secondary variable in the Kriging system is discussed. Results show that Factorial Kriging is an efficient technique for filtering seismic attribute images, in which geologic features are enhanced. The attribute filtering improves the correlation between the attributes and the well data, and thus the estimates of the reservoir properties. The differences between the estimates obtained by External Drift Kriging and Collocated Cokriging are also reduced. (author)

  11. Improvement and analysis of ID3 algorithm in decision-making tree (United States)

    Xie, Xiao-Lan; Long, Zhen; Liao, Wen-Qi


    The cooperative system under development needs spatial analysis and related data-mining technology to detect subject conflicts and redundancy, and the ID3 algorithm is an important data-mining algorithm for this purpose. Because the logarithmic part of the traditional ID3 decision-tree algorithm is computationally complex, this paper derives a new formula for information gain by optimizing that part of the algorithm. Experimental comparison and theoretical analysis show that the IID3 (Improved ID3) algorithm achieves higher computational efficiency and accuracy, and is therefore worth popularizing.
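The abstract does not give the paper's optimized log-free formula, but the quantity it approximates is the standard Shannon information gain at the heart of ID3. A minimal sketch on toy data:

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy H(S) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Gain(S, A) = H(S) - sum_v (|S_v|/|S|) * H(S_v)."""
    base = entropy(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(ys) / len(labels) * entropy(ys) for ys in by_value.values())
    return base - remainder

# Toy data: attribute 0 perfectly separates the classes, attribute 1 does not.
rows = [("sunny", "hot"), ("sunny", "cool"), ("rainy", "hot"), ("rainy", "cool")]
labels = ["no", "no", "yes", "yes"]
print(information_gain(rows, labels, 0))  # → 1.0
print(information_gain(rows, labels, 1))  # → 0.0
```

ID3 splits on the attribute with the largest gain; the paper's contribution is a cheaper way to evaluate this same ranking.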

  12. Transient Voltage Stability Analysis and Improvement of A Network with different HVDC Systems

    DEFF Research Database (Denmark)

    Liu, Yan; Chen, Zhe


    This paper presents transient voltage stability analysis of an AC system with multi-infeed HVDC links, including a traditional LCC HVDC link and a VSC HVDC link. It is found that the voltage supporting capability of the VSC-HVDC link is significantly influenced by the tie-line distance between the two links and the size of the loads. In order to improve the transient voltage stability, a voltage adjusting method is proposed in this paper: a voltage increment component is introduced into the outer voltage control loop under emergency situations caused by severe grid faults. To verify the theoretical analysis and the improved control method, a real-time simulation model of a hybrid multi-infeed HVDC system based on the western Danish power system is established in RTDS™. Simulation results show that enhanced transient voltage stability can be achieved.

  13. Improved reporting of statistical design and analysis: guidelines, education, and editorial policies. (United States)

    Mazumdar, Madhu; Banerjee, Samprit; Van Epps, Heather L


    A majority of original articles published in biomedical journals include some form of statistical analysis. Unfortunately, many of these articles contain errors in statistical design and/or analysis. These errors are worrisome, as the misuse of statistics jeopardizes the process of scientific discovery and the accumulation of scientific knowledge. To help avoid these errors and improve statistical reporting, four approaches are suggested: (1) development of guidelines for statistical reporting that could be adopted by all journals, (2) improvement of statistics curricula in biomedical research programs, with an emphasis on hands-on teaching by biostatisticians, (3) expansion and enhancement of biomedical science curricula in statistics programs, and (4) increased participation of biostatisticians in the peer review process, along with the adoption of more rigorous journal editorial policies regarding statistics. In this chapter, we provide an overview of these issues with emphasis on the field of molecular biology and highlight the need for continuing efforts on all fronts.

  14. Thermal Analysis in Gas Insulated Transmission Lines Using an Improved Finite-Element Model

    Directory of Open Access Journals (Sweden)

    Ling LI


    Full Text Available In this paper, an improved finite element (FE) model is proposed to investigate the temperature distribution in gas insulated transmission lines (GILs). The solution for joule losses from the eddy-current field analysis is indirectly coupled into the fluid and thermal fields. Unlike traditional methods, the air surrounding the GIL is included in the model to avoid assuming a constant convective heat transfer coefficient, and a multiple-species transport technique is employed to handle the two fluid types in a single model. In addition, the temperature-dependent electrical and thermal properties of the materials are considered. Steady-state and transient thermal analyses of the GIL are performed separately with the improved model, and the corresponding temperature distributions are compared with experimental results reported in the literature.

  15. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III (United States)


    Only fragmentary description text survives for this record: it notes the Distributed Information and Systems Experimentation (DISE) research group, where the author leads multidisciplinary studies (including the Analyst Capability Working Group for the U.S. Air Force), and that Dr. Gallup has a multidisciplinary science, engineering, and analysis background, including microbiology.

  16. Comparative analysis for performance of brown coal combustion in a vortex furnace with improved design (United States)

    Krasinsky, D. V.


    A comparative 3D numerical simulation study of fluid flow and coal-firing processes was carried out for flame combustion of Kansk-Achinsk brown coal in a vortex furnace of improved design with bottom injection of secondary air. The engineering performance of this furnace was analyzed for several operational modes as a function of coal grinding fineness and coal input rate, and the preferable operational regime was identified.


    Directory of Open Access Journals (Sweden)

    Lilian Oliveira de Oliveira


    Full Text Available The main purpose of this research is to analyze the occupational risks in a renal clinic located in central-RS. Based on observational analysis of risk maps and a data-collection instrument, improvements were implemented on site. The results show that the implemented changes have been significant and that further changes are needed to reduce occupational disorders, promoting a better quality of life for the clinic's professionals.

  18. Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation. (United States)

    Echinaka, Yuki; Ozeki, Yukiyasu


    The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. Within this scheme, the bootstrap method is introduced and a numerical criterion for discriminating the transition type is proposed.
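The authors' Bayesian kernel estimator is not spelled out in the abstract; as a generic illustration of fitting data "without any parametric model function", here is plain Gaussian-kernel ridge regression in NumPy, which likewise recovers a smooth curve from samples without assuming a functional form:

```python
import numpy as np

def kernel_ridge_fit(x, y, sigma=0.5, lam=1e-6):
    """Fit y(x) non-parametrically with a Gaussian kernel and ridge penalty lam."""
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    def predict(x_new):
        k = np.exp(-(x_new[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
        return k @ alpha
    return predict

# Toy "scaling data": smooth samples with no parametric model assumed.
x = np.linspace(0, 2 * np.pi, 40)
y = np.sin(x)
f = kernel_ridge_fit(x, y)
print(round(float(f(np.array([np.pi / 2]))[0]), 3))  # close to 1.0
```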

  19. Spectral Graph Theory Analysis of Software-Defined Networks to Improve Performance and Security (United States)


    Only fragmentary description text survives for this record (a September 2015 dissertation by Thomas C. Parker): background on network theory, spectral graph theory and closed-loop control is provided in Chapter II, including the dual basis and its role in defining the state of the network; community finding, a significant research area within graph theory, has applications ranging from finding groups within social networks to finding clusters.

  20. Quantitative Analysis of a Hybrid Electric HMMWV for Fuel Economy Improvement (United States)


    Only fragmentary description text survives for this record: it compares the fuel economy of the M1113 and XM1124 based on analysis of the HEVEA test data set, evaluates the fuel economy benefits of repowering the M1113 HMMWV's 6.5 L V8 turbo engine, and summarizes computer simulations using both the current engine and the recommended repower option; the reported fuel economy improvements do not account for additional cooling requirements.

  1. Information Operations Versus Civilian Marketing and Advertising: A Comparative Analysis to Improve IO Planning and Strategy (United States)


    Only fragmentary description text survives for this record (a March 2008 thesis by Dan Chilton): it argues that IO objectives may be better achieved through the use of civilian marketing and advertising fundamentals to organize, plan, and execute IO strategy, and its objective is to analyze and develop the concept of utilizing civilian advertising and marketing fundamentals for information operations.

  2. Morphometric MRI analysis improves detection of focal cortical dysplasia type II. (United States)

    Wagner, Jan; Weber, Bernd; Urbach, Horst; Elger, Christian E; Huppertz, Hans-Jürgen


    …analysis alone. Since detection of FCDs on MRI during the presurgical evaluation markedly improves the chance of becoming seizure-free postoperatively, we apply morphometric analysis in all patients who are MRI-negative after conventional visual analysis at our centre.

  3. Use of peers to improve adherence to antiretroviral therapy: a global network meta-analysis (United States)

    Kanters, Steve; Park, Jay JH; Chan, Keith; Ford, Nathan; Forrest, Jamie; Thorlund, Kristian; Nachega, Jean B; Mills, Edward J


    Introduction It is unclear whether using peers can improve adherence to antiretroviral therapy (ART). To construct the World Health Organization's global guidance on adherence interventions, we conducted a systematic review and network meta-analysis to determine the effectiveness of using peers for achieving adequate adherence and viral suppression. Methods We searched for randomized clinical trials of peer-based interventions to promote adherence to ART in HIV populations. We searched six electronic databases from inception to July 2015 and major conference abstracts within the last three years. We examined the outcomes of adherence and viral suppression among trials done worldwide and those specific to low- and middle-income countries (LMIC) using pairwise and network meta-analyses. Results and discussion Twenty-two trials met the inclusion criteria. We found similar results between pairwise and network meta-analyses, and between the global and LMIC settings. Peer supporter+Telephone was superior to standard-of-care in improving adherence in both the global network (odds-ratio [OR]=4.79, 95% credible intervals [CrI]: 1.02, 23.57) and the LMIC settings (OR=4.83, 95% CrI: 1.88, 13.55). Peer support alone, however, did not lead to improvement in ART adherence in either setting. For viral suppression, we found no difference of effects among interventions due to limited trials. Conclusions Our analysis showed that peer support leads to modest improvement in adherence. These modest effects may be due to the fact that in many settings, particularly in LMICs, programmes already include peer supporters, adherence clubs and family disclosures for treatment support. Rather than introducing new interventions, a focus on improving the quality in the delivery of existing services may be a more practical and effective way to improve adherence to ART. PMID:27914185
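The pooled effects above are odds ratios with 95% credible intervals. As a minimal illustration of the underlying quantity (a single-trial calculation, not the network meta-analysis itself), this computes an odds ratio and a frequentist 95% CI from one hypothetical trial's 2x2 adherence table; all counts are invented.

```python
from math import log, exp, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: adherent/non-adherent (intervention); c/d: same for control.
    CI uses the standard log-OR normal approximation."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 60/100 adherent with peer support, 45/100 with standard care.
or_, lo, hi = odds_ratio_ci(60, 40, 45, 55)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```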

  4. Reliability and Sensitivity Analysis of Transonic Flutter Using Improved Line Sampling Technique

    Institute of Scientific and Technical Information of China (English)

    Song Shufang; Lu Zhenzhou; Zhang Weiwei; Ye Zhengyin


    The improved line sampling (LS) technique, an effective numerical simulation method, is employed to analyze the probabilistic characteristics and reliability sensitivity of flutter with random structural parameters in transonic flow. The improved LS technique is a novel methodology for reliability and sensitivity analysis of high-dimensionality, low-probability problems with implicit limit state functions, and it does not require any approximating surrogate of the implicit limit state equation. The improved LS is used to estimate the flutter reliability and sensitivity of a two-dimensional wing, in which some structural properties, such as frequency, gravity-center parameters and mass ratio, are considered as random variables. A computational fluid dynamics (CFD) based unsteady aerodynamic reduced order model (ROM) method is used to construct the aerodynamic state equations. Coupling the structural state equations with the aerodynamic state equations, the safety margin of flutter is formulated using the critical flutter velocity. The results show that the improved LS technique can effectively decrease the computational cost of the random uncertainty analysis of flutter. The reliability sensitivity, defined by the partial derivative of the failure probability with respect to the distribution parameter of a random variable, can help to identify the important parameters and guide structural optimization design.
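Line sampling is designed to beat the crude Monte Carlo baseline on small failure probabilities. For orientation only, here is that baseline on a toy flutter limit state g = V_crit - V (failure when the flutter speed V exceeds the critical value); the distribution parameters are invented and this is not the paper's CFD/ROM model.

```python
import random

random.seed(1)  # fixed seed for reproducibility

def crude_mc_failure_prob(n, mean=250.0, std=20.0, critical=300.0):
    """Estimate P(g < 0) with g = critical - V, V ~ Normal(mean, std)."""
    fails = sum(1 for _ in range(n) if random.gauss(mean, std) > critical)
    return fails / n

p = crude_mc_failure_prob(100_000)
# True failure probability is Phi(-2.5) ≈ 0.0062; crude MC needs many
# samples to pin it down, which is exactly what line sampling improves.
print(round(p, 4))
```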

  5. Improving SFR Economics through Innovations from Thermal Design and Analysis Aspects

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Hongbin Zhang; Vincent Mousseau; Per F. Peterson


    Achieving economic competitiveness as compared to LWRs and other Generation IV (Gen-IV) reactors is one of the major requirements for large-scale investment in commercial sodium cooled fast reactor (SFR) power plants. Advances in R&D for advanced SFR fuel and structural materials provide key long-term opportunities to improve SFR economics. In addition, other new opportunities are emerging to further improve SFR economics. This paper provides an overview on potential ideas from the perspective of thermal hydraulics to improve SFR economics. These include a new hybrid loop-pool reactor design to further optimize economics, safety, and reliability of SFRs with more flexibility, a multiple reheat and intercooling helium Brayton cycle to improve plant thermal efficiency and reduce safety related overnight and operation costs, and modern multi-physics thermal analysis methods to reduce analysis uncertainties and associated requirements for over-conservatism in reactor design. This paper reviews advances in all three of these areas and their potential beneficial impacts on SFR economics.

  6. Improved enteral tolerance following step procedure: systematic literature review and meta-analysis. (United States)

    Fernandes, Melissa A; Usatin, Danielle; Allen, Isabel E; Rhee, Sue; Vu, Lan


    Surgical management of children with short bowel syndrome (SBS) changed with the introduction of the serial transverse enteroplasty procedure (STEP). We conducted a systematic review and meta-analysis using MEDLINE and SCOPUS to determine if children with SBS had improved enteral tolerance following STEP. Studies were included if information about a child's pre- and post-STEP enteral tolerance was provided. A random effects meta-analysis provided a summary estimate of the proportion of children with enteral tolerance increase following STEP. From 766 abstracts, seven case series involving 86 children were included. Mean percent tolerance of enteral nutrition improved from 35.1 to 69.5. Sixteen children had no enteral improvement following STEP. A summary estimate showed that 87 % (95 % CI 77-95 %) of children who underwent STEP had an increase in enteral tolerance. Compilation of the literature supports the belief that SBS subjects' enteral tolerance improves following STEP. Enteral nutritional tolerance is a measure of efficacy of STEP and should be presented as a primary or secondary outcome. By standardizing data collection on children undergoing STEP procedure, better determination of nutritional benefit from STEP can be ascertained.
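The summary estimate above (87%, 95% CI 77-95%) comes from a random-effects meta-analysis of proportions. As a much simpler illustration of pooling, this computes a crude overall proportion with a Wilson 95% CI from invented per-study counts of children with improved enteral tolerance; it is not the random-effects model used in the paper.

```python
from math import sqrt

# Invented (improved, total) counts for five hypothetical case series.
studies = [(10, 12), (14, 15), (8, 10), (20, 24), (23, 25)]

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

s = sum(a for a, _ in studies)
n = sum(b for _, b in studies)
lo, hi = wilson_ci(s, n)
print(round(s / n, 3), round(lo, 3), round(hi, 3))
```

A real meta-analysis would weight studies and model between-study heterogeneity rather than simply summing counts.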

  7. Method of sensitivity improving in the non-dispersive infrared gas analysis system

    Institute of Scientific and Technical Information of China (English)

    Youwen Sun; Wenqing Liu; Shimei Wang; Shuhua Huang; Xiaoman Yu


    A method of interference correction for improving the sensitivity of a non-dispersive infrared (NDIR) gas analysis system is demonstrated. Based on the proposed method, the interference due to water vapor and carbon dioxide in the NDIR NO analyzer is corrected. After interference correction, the absorbance signal at the NO filter channel is controlled only by the absorption of NO, and the sensitivity of the analyzer is greatly improved. In a field experiment for pollution source emission monitoring, the NO concentration trend measured by the NDIR analyzer is in good agreement with that of a differential optical absorption spectroscopy NO analyzer. Small variations of NO concentration can also be resolved, and the measurement correlation coefficient of the two analyzers is 94.28%.

  8. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology (United States)

    Jonny; Nasution, Januar


    Value stream mapping is a tool the business leader of XYZ Hospital needed in order to see what was actually happening in the business process that caused long lead times for self-produced medicines in its pharmacy unit, a problem that had triggered many patient complaints. After deploying this tool, the team found that the pharmacy unit had no storage and capsule-packing tools for processing medicines, a condition that caused much wasted time in the process. The team therefore proposed that the business leader procure the required tools in order to shorten the process. This research shortened the lead time from 45 minutes to the 30 minutes required by the government through the Indonesian health ministry, and increased the %VA (value-added activity), or Process Cycle Efficiency (PCE), from 66% to 68% (considered lean because it exceeds the required 30%). This result shows that process effectiveness was increased by the improvement.
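With PCE defined as value-added time divided by total lead time, the abstract's figures imply the value-added minutes before and after the improvement. A quick sanity check of those numbers:

```python
def value_added_minutes(pce_fraction, lead_time_min):
    """PCE = value-added time / lead time, so VA = PCE * lead time."""
    return pce_fraction * lead_time_min

print(round(value_added_minutes(0.66, 45), 1))  # → 29.7
print(round(value_added_minutes(0.68, 30), 1))  # → 20.4
```

So both lead time (45 to 30 min) and value-added time (about 29.7 to 20.4 min) fell, with the PCE ratio improving from 66% to 68%.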

  9. Improving resolution of gravity data with wavelet analysis and spectral method

    Institute of Scientific and Technical Information of China (English)

    QIU Ning; HE Zhanxiang; CHANG Yanjun


    Gravity data are the result of the gravity force field interaction of all underground sources. The objects of detection are always submerged in the background field, so one of the crucial problems in gravity data interpretation is how to improve the resolution of the observed information. The wavelet transform operator has recently been introduced into this domain both as a filter and as a powerful source analysis tool. This paper studied the effects of improving the resolution of gravity data with wavelet analysis and a spectral method, and revealed the geometric characteristics of density heterogeneities described by simple shaped sources. First, the basic theory of multiscale wavelet analysis, its lifting scheme, and the spectral method were introduced. With an experimental study on the forward simulation of anomalies given by the superposition of six objects and measured data from the Songliao plain, Northeast China, the shape, size and depth of the buried objects were estimated. The results were compared with those obtained by conventional techniques, demonstrating that this method greatly improves the resolution of gravity anomalies.
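A minimal sketch of the spectral-method side of this idea: separating a long-wavelength (regional, deep-source) component from a short-wavelength (residual, shallow-source) one by filtering in the Fourier domain. The profile below is synthetic toy data, not the Songliao measurements.

```python
import numpy as np

n = 256
x = np.arange(n)
regional = np.sin(2 * np.pi * x / 256)        # long-wavelength (deep) source
residual = 0.3 * np.sin(2 * np.pi * x / 16)   # short-wavelength (shallow) source
signal = regional + residual                   # observed superposition

# Low-pass in the Fourier domain: keep wavelengths longer than 64 samples.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(n)
spectrum[freqs > 1 / 64] = 0
recovered = np.fft.irfft(spectrum, n)

# Both components sit on exact FFT bins, so the separation is essentially exact.
print(round(float(np.max(np.abs(recovered - regional))), 3))  # → 0.0
```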

  10. Improved Proteomic Analysis Following Trichloroacetic Acid Extraction of Bacillus anthracis Spore Proteins

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Brooke LD; Wunschel, David S.; Sydor, Michael A.; Warner, Marvin G.; Wahl, Karen L.; Hutchison, Janine R.


    Proteomic analysis of bacterial samples provides valuable information about cellular responses and functions under different environmental pressures. Proteomic analysis is dependent upon efficient extraction of proteins from bacterial samples without introducing bias toward extraction of particular protein classes. While no single method can recover 100% of the bacterial proteins, selected protocols can improve overall protein isolation, peptide recovery, or enrich for certain classes of proteins. The method presented here is technically simple and does not require specialized equipment such as a mechanical disrupter. Our data reveal that for particularly challenging samples, such as B. anthracis Sterne spores, trichloroacetic acid extraction improved the number of proteins identified within a sample compared to bead beating (714 vs 660, respectively). Further, TCA extraction enriched for 103 known spore specific proteins whereas bead beating resulted in 49 unique proteins. Analysis of C. botulinum samples grown to 5 days, composed of vegetative biomass and spores, showed a similar trend with improved protein yields and identification using our method compared to bead beating. Interestingly, easily lysed samples, such as B. anthracis vegetative cells, were equally as effectively processed via TCA and bead beating, but TCA extraction remains the easiest and most cost effective option. As with all assays, supplemental methods such as implementation of an alternative preparation method may provide additional insight to the protein biology of the bacteria being studied.

  11. Simulations study of neutrino oscillation parameters with the Iron Calorimeter Detector (ICAL): an improved analysis

    CERN Document Server

    Mohan, Lakshmi S


    We present an updated and improved simulations analysis of precision measurements of neutrino oscillation parameters from the study of charged-current interactions of atmospheric neutrinos in the Iron Calorimeter (ICAL) detector at the proposed India-based Neutrino Observatory (INO). The present analysis is done in the extended muon energy range of 0.5--25 GeV, as compared to the previous analyses, which were limited to the 1--11 GeV range of muon energy. A substantial improvement in the precision measurement of the oscillation parameters in the 2--3 sector is observed, including the magnitude and sign of the 2--3 mass-squared difference $\Delta{m^2_{32}}$ and especially $\theta_{23}$. The sensitivities are further improved by the inclusion of an additional systematic which constrains the ratio of neutrino to anti-neutrino fluxes. The best $1\sigma$ precision on $\sin^2 \theta_{23}$ and $|\Delta{m^2_{32}}|$ achievable with the new analysis for a 500 kTon yr exposure of ICAL is $\sim9\%$ and $\sim2.5\%$, respectively.

  12. A Lean Six Sigma approach to the improvement of the selenium analysis method

    Directory of Open Access Journals (Sweden)

    Bronwyn C. Cloete


    Full Text Available Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive and systematic, relies on data, and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed and Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory.
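The Control phase of DMAIC typically ends with the improved method under statistical process control. A minimal Shewhart individuals-chart sketch on invented selenium recoveries (percent of certified value), flagging any run outside mean ± 3 standard deviations:

```python
import statistics

# Invented recoveries (% of CRM certified value) from replicated runs.
recoveries = [98.2, 101.1, 99.5, 100.4, 97.8, 99.9, 100.7, 98.9]

mean = statistics.mean(recoveries)
sd = statistics.stdev(recoveries)
ucl, lcl = mean + 3 * sd, mean - 3 * sd  # upper/lower control limits

out_of_control = [v for v in recoveries if not lcl <= v <= ucl]
print(round(mean, 2), out_of_control)
```

An empty `out_of_control` list indicates the process is stable; a real control chart would also apply run rules (e.g. trends and shifts), not just the 3-sigma limits.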

  13. A Lean Six Sigma approach to the improvement of the selenium analysis method. (United States)

    Cloete, Bronwyn C; Bester, André


    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive and systematic, relies on data, and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed and Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory.

  14. Exergy Analysis of a Subcritical Refrigeration Cycle with an Improved Impulse Turbo Expander

    Directory of Open Access Journals (Sweden)

    Zhenying Zhang


    Full Text Available The impulse turbo expander (ITE is employed to replace the throttling valve in the vapor compression refrigeration cycle to improve the system performance. An improved ITE and the corresponding cycle are presented. In the new cycle, the ITE not only acts as an expansion device with work extraction, but also serves as an economizer with vapor injection. An increase of 20% in the isentropic efficiency can be attained for the improved ITE compared with the conventional ITE owing to the reduction of the friction losses of the rotor. The performance of the novel cycle is investigated based on energy and exergy analysis. A correlation of the optimum intermediate pressure in terms of ITE efficiency is developed. The improved ITE cycle increases the exergy efficiency by 1.4%–6.1% over the conventional ITE cycle, 4.6%–8.3% over the economizer cycle and 7.2%–21.6% over the base cycle. Furthermore, the improved ITE cycle is also preferred due to its lower exergy loss.
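The exergy comparison above can be sketched with the common second-law definition of exergy efficiency for refrigeration, the actual COP divided by the Carnot COP between the same reservoir temperatures; the temperatures and COP values below are illustrative assumptions, not the paper's cycle data.

```python
# Hedged sketch: second-law (exergy) efficiency of a refrigeration cycle,
# defined here as COP / COP_Carnot. All numbers are illustrative.

def carnot_cop(t_cold_k: float, t_hot_k: float) -> float:
    """Reversible COP of a refrigerator between two reservoirs (Kelvin)."""
    return t_cold_k / (t_hot_k - t_cold_k)

def exergy_efficiency(cop_actual: float, t_cold_k: float, t_hot_k: float) -> float:
    return cop_actual / carnot_cop(t_cold_k, t_hot_k)

base = exergy_efficiency(2.5, 258.15, 308.15)      # base cycle (assumed COP)
improved = exergy_efficiency(2.9, 258.15, 308.15)  # cycle with improved ITE (assumed COP)
gain_pct = 100.0 * (improved - base) / base        # relative improvement, %
```

Because both cycles are compared between the same temperature levels, the relative gain in exergy efficiency equals the relative gain in COP.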


    Directory of Open Access Journals (Sweden)

    V. R. Balaji


    Full Text Available Speech enhancement has become an essential issue in the field of speech and signal processing, because of the need to improve the performance of voice communication systems in noisy environments. A number of research works have been carried out in speech processing, but there is always room for improvement. The main aim is to enhance the apparent quality of the speech and to improve intelligibility. Signal representation and enhancement in the cosine domain is observed to provide significant results, and the Discrete Cosine Transform (DCT) has been widely used for speech enhancement. In this work, instead of the DCT, an Advanced DCT (ADCT) is used, which simultaneously offers energy compaction along with critical sampling and flexible window switching. To deal with frame-to-frame deviations of the cosine transforms, the ADCT is integrated with Pitch Synchronous Analysis (PSA). Moreover, to improve the noise-minimization performance of the system, an improved iterative Wiener filtering approach called Constrained Iterative Wiener Filtering (CIWF) is employed. Thus, this approach combines a novel ADCT-based speech enhancement with improved iterative filtering and PSA.
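The energy-compaction property that motivates cosine-domain processing can be illustrated with a textbook orthonormal DCT-II (not the paper's ADCT): a smooth frame concentrates almost all of its energy in a few low-order coefficients, which is what makes thresholding and Wiener-style attenuation effective there.

```python
import math

# Naive orthonormal DCT-II, O(N^2) -- fine for a short demonstration.
def dct2(x):
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n) for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

# A smooth two-component frame: its energy lands in just two coefficients.
n = 64
frame = [math.cos(math.pi * 6 * (i + 0.5) / n)
         + 0.5 * math.cos(math.pi * 10 * (i + 0.5) / n)
         for i in range(n)]
coeffs = dct2(frame)
# Fraction of total energy captured by the first 12 of 64 coefficients:
compaction = sum(c * c for c in coeffs[:12]) / sum(c * c for c in coeffs)
```

The orthonormal transform preserves total energy (Parseval), so `compaction` close to 1 means nearly all signal energy sits in the low-order coefficients.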

  16. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; Moseley, S. H.; Porst, J.-P.; Porter, F. S.; Sadleir, J. E.; Smith, S. J.


    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an ^{55}Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
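The PCA idea can be shown with a toy pure-Python power-iteration sketch (not the authors' TES pipeline): when records are scaled copies of a pulse template, the leading principal component recovers the template shape, and the projection scores track relative pulse height.

```python
# Toy illustration of PCA on pulse records: leading component via power
# iteration on the (implicit) covariance matrix, pure Python.

def pca_first_component(records, iters=200):
    m, n = len(records), len(records[0])
    mean = [sum(r[i] for r in records) / m for i in range(n)]
    centered = [[r[i] - mean[i] for i in range(n)] for r in records]
    v = [1.0] * n
    for _ in range(iters):
        # One multiply by X^T X without forming the matrix explicitly.
        proj = [sum(c[i] * v[i] for i in range(n)) for c in centered]
        w = [sum(p * c[i] for p, c in zip(proj, centered)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(c[i] * v[i] for i in range(n)) for c in centered]
    return v, scores

template = [0.0, 1.0, 4.0, 9.0, 6.0, 3.0, 1.0, 0.0]  # idealized pulse shape
heights = [0.8, 1.0, 1.2, 1.5]                        # relative pulse heights
records = [[h * t for t in template] for h in heights]
pc, scores = pca_first_component(records)             # scores track height
```

In real data further components pick up arrival-time and temperature-drift variations, which is why combining several components outperforms a single optimal-filter number.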

  17. Improving Markov Chain Monte Carlo algorithms in LISA Pathfinder Data Analysis (United States)

    Karnesis, N.; Nofrarias, M.; Sopuerta, C. F.; Lobo, A.


    The LISA Pathfinder mission (LPF) aims to test key technologies for the future LISA mission. The LISA Technology Package (LTP) on board LPF will consist of an exhaustive suite of experiments, and its outcome will be crucial for the future detection of gravitational waves. In order to achieve maximum sensitivity, we need to understand every instrument on board and parametrize the properties of the underlying noise models. The data analysis team has developed algorithms for parameter estimation of the system; a very promising one implemented for LISA Pathfinder data analysis is Markov Chain Monte Carlo (MCMC). A series of experiments will take place during flight operations, and each experiment will provide essential information for the next in the sequence. It is therefore a priority to optimize and improve the tools available for data analysis during the mission. A Bayesian framework allows us to apply prior knowledge for each experiment, which means that we can efficiently use our prior estimates for the parameters, making the method more accurate and significantly faster. This, together with other algorithm improvements, will lead us to our main goal: a robust and reliable tool for parameter estimation during the LPF mission.
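The prior-plus-likelihood mechanics can be sketched with a minimal random-walk Metropolis sampler on a toy one-parameter problem (illustrative, not the LTP noise model): the prior from a previous experiment multiplies the new likelihood, and the chain samples the resulting posterior.

```python
import math
import random

# Minimal random-walk Metropolis sketch. Toy Gaussian prior x Gaussian
# likelihood, so the exact posterior is known: N(0.8, 1/sqrt(5)).

def metropolis(log_post, x0, n_steps, step=0.5, seed=42):
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random() + 1e-300) < lp_prop - lp:  # accept/reject
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

def log_posterior(theta):
    log_prior = -0.5 * theta ** 2                 # N(0, 1) prior knowledge
    log_like = -0.5 * ((theta - 1.0) / 0.5) ** 2  # data favour theta near 1
    return log_prior + log_like

chain = metropolis(log_posterior, 0.0, 20000)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])  # close to 0.8
```

A tighter prior (from an earlier experiment in the sequence) shrinks the posterior and lets the chain converge with far fewer steps, which is the speed-up the abstract refers to.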

  18. Improved wavelet analysis in enhancing Electromagnetic Compatibility of underground monitoring system in coal mine

    Institute of Scientific and Technical Information of China (English)

    SUN Ji-ping; MA Feng-ying; WU Dong-xu; LIU Xiao-yang


    Underground electromagnetic interference (EMI) has become so serious that it has caused false alarms in the monitoring system, threatening safe production in coal mines. To overcome the difficulties caused by the explosion-proof enclosures of the equipment and by the multiple startups and stops in the transient process during EMI measurement, a novel technique was proposed to measure the underground EMI distribution indirectly and to enhance the Electromagnetic Compatibility (EMC) of the monitoring system. Wavelet time-frequency analysis was introduced into the underground monitoring system, so that the sources, startup time, duration and waveform of EMI could be ascertained correctly from the running records of underground electric equipment. The electrical fast transient/burst (EFT/B) was studied to verify the validity of the wavelet analysis. The EMI filter was then improved in accordance with the EMI distribution obtained from the wavelet analysis, and power-port immunity improved markedly. In addition, the method of setting wavelet thresholds in the wavelet filter design was amended relative to conventional thresholds, so that the EFT/B at the data port was restrained markedly by wavelet filtering. The combined effect of the EMI power filter and the wavelet filter reduces false alarms of the monitoring system considerably. It is concluded that wavelet analysis and the improved EMI filter have clearly enhanced the EMC of the monitoring system.

  19. Multi-factor Analysis Model for Improving Profit Management Using Excel in Shellfish Farming Projects

    Institute of Scientific and Technical Information of China (English)

    Zhuming ZHAO; Changlin LIU; Xiujuan SHAN; Jin YU


    Using data from a farm in Yantai City, Cost-Volume-Profit analysis theory and financial management methods, this paper constructs a multi-factor analysis model for improving profit management in shellfish farming projects using Excel 2007 and describes the procedure for building such a model. The model can quickly calculate profit, improve the level of profit management, find the breakeven point and enhance the decision-making efficiency of businesses. As a simple analysis tool, it may also inform government decisions and corporate economic decisions. While effort has been made to construct a four-variable model, some equally important variables could not be discussed sufficiently owing to limits on the paper's length and the authors' knowledge. All variables can be listed in Excel 2007 and associated in a logical way to manage the profit of shellfish farming projects more efficiently and more practically.
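The core Cost-Volume-Profit relations behind such a model are simple enough to sketch directly (illustrative numbers, not the Yantai farm's data): breakeven volume is fixed cost divided by the per-unit contribution margin.

```python
# Bare-bones CVP sketch: breakeven point and profit as functions of volume.

def breakeven_volume(fixed_cost, price, unit_variable_cost):
    """Volume at which profit is zero (contribution margin covers fixed cost)."""
    return fixed_cost / (price - unit_variable_cost)

def profit(volume, fixed_cost, price, unit_variable_cost):
    return volume * (price - unit_variable_cost) - fixed_cost

bev = breakeven_volume(120000.0, 10.0, 4.0)  # units needed to break even
p = profit(25000, 120000.0, 10.0, 4.0)       # profit at 25000 units
```

In a spreadsheet the same relations sit in cells, which is what makes multi-factor what-if analysis (varying price, volume, and costs) immediate.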

  20. ASTM clustering for improving coal analysis by near-infrared spectroscopy. (United States)

    Andrés, J M; Bona, M T


    Multivariate analysis techniques have been applied to near-infrared (NIR) spectra of coals to investigate the relationship between nine coal properties (moisture (%), ash (%), volatile matter (%), fixed carbon (%), heating value (kcal/kg), carbon (%), hydrogen (%), nitrogen (%) and sulphur (%)) and the corresponding predictor variables. In this work, the whole set of coal samples was grouped into six more homogeneous clusters following the ASTM reference method for classification before calibration methods were applied to each coal set. The results showed a considerable improvement in determination error compared with calibration on the whole sample set, and for some groups the established calibrations approached the quality required by the ASTM/ISO norms for laboratory analysis. Predicting property values for a new coal sample requires assigning that sample to its respective group, so the ability of Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS) in the NIR range to discriminate and classify coal samples was also studied by applying Soft Independent Modelling of Class Analogy (SIMCA) and Linear Discriminant Analysis (LDA). Modelling of the groups by SIMCA led to overlapping models that cannot discriminate for unique classification. The application of Linear Discriminant Analysis, on the other hand, improved the classification of the samples, but not enough to be satisfactory for every group considered.

  1. UiLog: Improving Log-Based Fault Diagnosis by Log Analysis

    Institute of Scientific and Technical Information of China (English)

    De-Qing Zou; Hao Qin; Hai Jin


    In modern computer systems, system event logs have always been the primary source for checking system status. As computer systems become more and more complex, software and hardware interact ever more frequently, and the components generate enormous amounts of log information, including running reports and fault information. The sheer quantity of data makes manual analysis impractical. In this paper, we implement a management and analysis system for log information that can assist system administrators in understanding the real-time status of the entire system, classify logs into different fault types, and determine the root cause of faults. In addition, we improve the existing fault correlation analysis method based on the results of system log classification. We evaluate the system in a cloud computing environment; the results show that it can classify fault logs automatically and effectively, and that with the proposed system administrators can easily detect the root cause of faults.
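The classification step can be illustrated with a deliberately simple keyword-rule sketch; UiLog's actual classifier is more sophisticated, and the fault types and keywords below are hypothetical examples, not taken from the paper.

```python
# Hedged sketch of log-to-fault-type classification via keyword rules.
# Fault categories and keywords are illustrative assumptions.

FAULT_KEYWORDS = {
    "memory": ["out of memory", "oom", "page allocation failure"],
    "network": ["connection refused", "timeout", "unreachable"],
    "disk": ["i/o error", "read-only file system", "bad sector"],
}

def classify_log(line):
    text = line.lower()
    for fault_type, keywords in FAULT_KEYWORDS.items():
        if any(k in text for k in keywords):
            return fault_type
    return "unknown"

labels = [classify_log(l) for l in [
    "kernel: Out of memory: Kill process 1234",
    "sshd: connection refused from 10.0.0.5",
    "EXT4-fs error: I/O error while writing superblock",
]]
```

Grouping classified logs by fault type and time window is then the natural input to the correlation analysis the abstract describes.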

  2. Improving the effectiveness of FMEA analysis in automotive – a case study

    Directory of Open Access Journals (Sweden)

    Ványi Gábor


    Full Text Available Many industries, automotive among them, have well-defined product development process definitions and risk evaluation methods. FMEA (Failure Mode and Effects Analysis) is a first-line risk analysis method in design, implemented in development and production for decades. Although the first applications focused on mechanical and electrical design and functionality, software components are now implemented in many modern vehicle systems. However, standards and industry-specific associations do not specify any “best practice” for modelling the interactions of multiple entities in one model. This case study focuses on modelling interconnections and on improving the FMEA modelling process in the automotive industry. Selecting and grouping software components for the analysis is discussed, but software architecture design patterns are excluded from the study.

  3. Modal test and analysis: Multiple tests concept for improved validation of large space structure mathematical models (United States)

    Wada, B. K.; Kuo, C-P.; Glaser, R. J.


    For the structural dynamic analysis of large space structures, advances in structural synthesis and in structural analysis software have increased the capability to predict the dynamic characteristics of the structural system. The various subsystems which comprise the system are represented by displacement functions, which are then combined to represent the total structure. Experience has shown that even when subsystem mathematical models are verified by test, the mathematical representation of the total system is often in error, because the mathematical models of the structural elements that are significant when loads are applied at the interconnection points are not adequately verified by test. A multiple test concept, based upon the Multiple Boundary Condition Test (MBCT), is presented which increases the accuracy of the system mathematical model by improving the subsystem test and the test/analysis correlation procedure.

  4. Quantitative Transcript Analysis in Plants: Improved First-strand cDNA Synthesis

    Institute of Scientific and Technical Information of China (English)

    Nai-Zhong XIAO; Lei BA; Preben Bach HOLM; Xing-Zhi WANG; Steve BOWRA


    The quantity and quality of first-strand cDNA directly influence the accuracy of transcriptional analysis and quantification. Using a plant-derived α-tubulin as a model system, the effect of oligo sequence and DTT on the quality and quantity of first-strand cDNA synthesis was assessed via a combination of semi-quantitative PCR and real-time PCR. The results indicated that an anchored oligo dT significantly improved the quantity and quality of α-tubulin cDNA compared with the conventional oligo dT. Similarly, omitting DTT from first-strand cDNA synthesis also enhanced transcript levels. This is the first time such a comparative analysis has been undertaken for a plant system, and it shows conclusively that small changes to current protocols can have a very significant impact on transcript analysis.

  5. Improvement of Epicentral Direction Estimation by P-wave Polarization Analysis (United States)

    Oshima, Mitsutaka


    Polarization analysis is used to analyze the polarization characteristics of waves and has been developed in various fields, for example electromagnetics, optics, and seismology. In seismology, polarization analysis is used to discriminate seismic phases or to enhance a specific phase (e.g., Flinn, 1965) [1], taking advantage of differences in the polarization characteristics of seismic phases. In earthquake early warning, polarization analysis is used to estimate the epicentral direction from a single station, based on the polarization direction of the P-wave portion of seismic records (e.g., Smart and Sproules (1981) [2], Noda et al. (2012) [3]). Improving the Estimation of Epicentral Direction by Polarization Analysis (EEDPA) therefore directly enhances the accuracy and promptness of earthquake early warning. In this study, the author tried to improve EEDPA using seismic records of events that occurred around Japan from 2003 to 2013. Selected events satisfied the following conditions: MJMA larger than 6.5 (JMA: Japan Meteorological Agency) and seismic records available at no fewer than 3 stations within 300 km epicentral distance. Records from stations with no information on seismometer orientation were excluded, so that a precise and quantitative evaluation of the accuracy of EEDPA was possible. In the analysis, polarization was calculated by the method of Vidale (1986) [4], which extended the method proposed by Montalbetti and Kanasewich (1970) [5] to use the analytic signal. As a result, the author found that, contrary to expectation, the accuracy of EEDPA improves by about 15% if velocity records rather than displacement records are used. Use of velocity records also reduces the CPU time spent integrating seismic records and improves the promptness of EEDPA, although this analysis is still rough and further scrutiny is essential. 
At this moment, the author used seismic records obtained by simply integrating acceleration
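The core of azimuth estimation from P-wave polarization can be sketched without the analytic-signal machinery of Vidale (1986): for rectilinear particle motion, the principal axis of the 2x2 covariance of the horizontal components points along the epicentral direction (with a 180-degree ambiguity, resolved elsewhere, e.g. using the vertical component).

```python
import math

# Simplified polarization-azimuth sketch (not the Vidale/EEDPA method):
# principal-axis orientation of the horizontal covariance matrix.

def polarization_azimuth(north, east):
    n0 = sum(north) / len(north)
    e0 = sum(east) / len(east)
    cnn = sum((n - n0) ** 2 for n in north)
    cee = sum((e - e0) ** 2 for e in east)
    cne = sum((n - n0) * (e - e0) for n, e in zip(north, east))
    # Closed-form principal axis of a 2x2 symmetric matrix:
    theta = 0.5 * math.atan2(2.0 * cne, cnn - cee)
    return math.degrees(theta) % 180.0

# Synthetic rectilinear P-wave arriving from azimuth 30 degrees:
amp = [math.sin(2 * math.pi * 5 * i / 100.0) * math.exp(-3 * i / 100.0)
       for i in range(200)]
north = [a * math.cos(math.radians(30.0)) for a in amp]
east = [a * math.sin(math.radians(30.0)) for a in amp]
azimuth = polarization_azimuth(north, east)   # close to 30.0 degrees
```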

  6. Economic analysis of interventions to improve village chicken production in Myanmar. (United States)

    Henning, J; Morton, J; Pym, R; Hla, T; Sunn, K; Meers, J


    A cost-benefit analysis using deterministic and stochastic modelling was conducted to identify the net benefits for households that adopt (1) vaccination of individual birds against Newcastle disease (ND) or (2) improved management of chick rearing by providing coops for the protection of chicks from predation and chick starter feed inside a creep feeder to support chicks' nutrition in village chicken flocks in Myanmar. Partial budgeting was used to assess the additional costs and benefits associated with each of the two interventions tested relative to neither strategy. In the deterministic model, over the first 3 years after the introduction of the interventions, the cumulative sum of the net differences from neither strategy was 13,189Kyat for ND vaccination and 77,645Kyat for improved chick management (effective exchange rate in 2005: 1000Kyat=1$US). Both interventions were also profitable after discounting over a 10-year period; Net Present Values for ND vaccination and improved chick management were 30,791 and 167,825Kyat, respectively. The Benefit-Cost Ratio for ND vaccination was very high (28.8). This was lower for improved chick management, due to greater costs of the intervention, but still favourable at 4.7. Using both interventions concurrently yielded a Net Present Value of 470,543Kyat and a Benefit-Cost Ratio of 11.2 over the 10-year period in the deterministic model. Using the stochastic model, for the first 3 years following the introduction of the interventions, the mean cumulative sums of the net difference were similar to those values obtained from the deterministic model. Sensitivity analysis indicated that the cumulative net differences were strongly influenced by grower bird sale income, particularly under improved chick management. The effects of the strategies on odds of households selling and consuming birds after 7 months, and numbers of birds being sold or consumed after this period also influenced profitability. Cost variations for
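The partial-budget arithmetic behind figures like these can be sketched as follows; the cash flows below are illustrative assumptions, not the Myanmar study's data. Net Present Value discounts the yearly net differences, and the Benefit-Cost Ratio compares discounted benefits with discounted costs.

```python
# Hedged partial-budget sketch: NPV and Benefit-Cost Ratio.

def npv(rate, cash_flows):
    """cash_flows[t] is the net flow in year t (t = 0, 1, 2, ...)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(rate, benefits, costs):
    return npv(rate, benefits) / npv(rate, costs)

# Illustrative yearly extra benefits and costs of an intervention (Kyat):
benefits = [0.0, 6000.0, 6500.0, 7000.0]
costs = [1000.0, 200.0, 200.0, 200.0]
value = npv(0.10, [b - c for b, c in zip(benefits, costs)])
ratio = benefit_cost_ratio(0.10, benefits, costs)
```

As in the abstract, a low-cost intervention with sustained benefits yields a ratio well above 1 even at a 10% discount rate.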

  7. Computerized lung sound analysis following clinical improvement of pulmonary edema due to congestive heart failure exacerbations

    Institute of Scientific and Technical Information of China (English)

    WANG Zhen; XIONG Ying-xia


    Background Although acute congestive heart failure (CHF) patients typically present with abnormal auscultatory findings on lung examination, lung sounds are not normally subjected to rigorous analysis. The goals of this study were to use a computerized analytic acoustic tool to evaluate lung sound patterns in CHF patients during acute exacerbation and after clinical improvement, and to compare CHF profiles with those of normal individuals. Methods Lung sounds throughout the respiratory cycle were captured using a computerized acoustic-based imaging technique. Thirty-two consecutive CHF patients were imaged at the time of presentation to the emergency department and after clinical improvement. Digital images were created, and the geographical area of the images and the lung sound patterns were quantitatively analyzed. Results The geographical areas of the vibration energy image of acute CHF patients without and with radiographically evident pulmonary edema (REPE) were (67.9±4.7) and (60.3±3.5) kilo-pixels, respectively (P <0.05). After clinical improvement, the geographical area of the vibration energy image of lung sound in CHF patients without and with REPE increased to (74.5±4.4) and (73.9±3.9) kilo-pixels (P <0.05), respectively. Vibration energy decreased in CHF patients with REPE following clinical improvement by an average of (85±19)% (P <0.01). Conclusions With clinical improvement of acute CHF exacerbations there was a more homogeneous distribution of lung vibration energy, as demonstrated by the increased geographical area of the vibration energy image. Lung sound analysis may be a useful tool for tracking the course of acute CHF exacerbations.

  8. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)


    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA generated particles in an ICP is studied using a high-speed, high frame rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work for ICP torch developments specifically tailored for the improvement of LA sample introduction are presented. An abnormal scarcity of metal-argon polyatomic ions (MAr{sup +}) is observed during ICP-MS analysis. Evidence shows that MAr{sup +} ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  9. Improving land cover classification using input variables derived from a geographically weighted principal components analysis (United States)

    Comber, Alexis J.; Harris, Paul; Tsutsumida, Narumasa


    This study demonstrates the use of a geographically weighted principal components analysis (GWPCA) of remote sensing imagery to improve land cover classification accuracy. A principal components analysis (PCA) is commonly applied in remote sensing but generates global, spatially-invariant results. GWPCA is a local adaptation of PCA that locally transforms the image data and, in doing so, can describe spatial change in the structure of the multi-band imagery, directly reflecting the fact that many landscape processes are spatially heterogeneous. In this research the GWPCA localised loadings of MODIS data are used as textural inputs, along with GWPCA localised ranked scores and the image bands themselves, to three supervised classification algorithms. Using a reference data set for land cover to the west of Jakarta, Indonesia, the classification procedure was assessed via training and validation data splits of 80/20, repeated 100 times. For each classification algorithm, the inclusion of the GWPCA loadings data was found to significantly improve classification accuracy. Further, more moderate improvements in accuracy were found by additionally including the GWPCA ranked scores as textural inputs, data that provide information on spatial anomalies in the imagery. The critical importance of considering both the spatial structure and the spatial anomalies of the imagery in the classification is discussed, together with the transferability of the new method to other studies. Research topics for method refinement are also suggested.

  10. A de-noising algorithm to improve SNR of segmented gamma scanner for spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Li, Huailiang, E-mail: [Fundamental Science on Nuclear Wastes and Environmental Safety Laboratory, Southwest University of Science and Technology, Mianyang 621010 (China); Tuo, Xianguo [Fundamental Science on Nuclear Wastes and Environmental Safety Laboratory, Southwest University of Science and Technology, Mianyang 621010 (China); State Key Laboratory of Geohazard Prevention & Geoenvironmental Protection, Chengdu University of Technology, Chengdu 610059 (China); Shi, Rui [State Key Laboratory of Geohazard Prevention & Geoenvironmental Protection, Chengdu University of Technology, Chengdu 610059 (China); Zhang, Jinzhao; Henderson, Mark Julian [Fundamental Science on Nuclear Wastes and Environmental Safety Laboratory, Southwest University of Science and Technology, Mianyang 621010 (China); Courtois, Jérémie; Yan, Minhao [State Key Laboratory Cultivation Base for Nonmetal Composites and Functional Materials, Southwest University of Science and Technology, Mianyang 621010 (China)


    An improved threshold shift-invariant wavelet transform de-noising algorithm for high-resolution gamma-ray spectroscopy is proposed, which optimizes the threshold function of the wavelet transform and suppresses pseudo-Gibbs artifacts. The algorithm was applied to a segmented gamma scanning system for large samples, in which high continuum levels caused by Compton scattering are routinely encountered. De-noising of gamma-ray spectra measured by the segmented gamma scanning system was evaluated for the improved, shift-invariant and traditional wavelet transform algorithms. The improved wavelet transform method yielded significantly better figures of merit, root mean square error, peak area, and sample attenuation correction in the segmented gamma scanning assays. Spectrum analysis also showed that the gamma energy spectrum can be viewed as a superposition of a low-frequency signal and high-frequency noise, and that a smoothed spectrum is suitable for straightforward automated quantitative analysis.
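The thresholding at the heart of wavelet de-noising can be sketched with the two standard rules; the paper's improved threshold function is a refinement between these extremes, which we do not reproduce here. Small (noise-dominated) detail coefficients are zeroed, while large (peak-carrying) ones survive.

```python
# Standard hard and soft thresholding of wavelet detail coefficients.

def hard_threshold(coeffs, t):
    """Keep coefficients above the threshold unchanged, zero the rest."""
    return [c if abs(c) > t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Shrink surviving coefficients toward zero by t (reduces pseudo-Gibbs)."""
    out = []
    for c in coeffs:
        if c > t:
            out.append(c - t)
        elif c < -t:
            out.append(c + t)
        else:
            out.append(0.0)
    return out

details = [8.0, 0.3, -0.4, 5.5, -0.2, 0.1, -6.1]  # illustrative coefficients
kept_hard = hard_threshold(details, 1.0)
kept_soft = soft_threshold(details, 1.0)
```

Soft thresholding trades a small bias in peak amplitude for smoother reconstructions, which is why threshold functions interpolating between the two rules are popular in spectroscopy.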

  11. A model for improving energy efficiency in industrial motor system using multicriteria analysis

    Energy Technology Data Exchange (ETDEWEB)

    Herrero Sola, Antonio Vanderley, E-mail: [Federal University of Technology, Parana, Brazil (UTFPR)-Campus Ponta Grossa, Av. Monteiro Lobato, Km 4, CEP: 84016-210 (Brazil); Mota, Caroline Maria de Miranda, E-mail: [Federal University of Pernambuco, Cx. Postal 7462, CEP 50630-970, Recife (Brazil); Kovaleski, Joao Luiz [Federal University of Technology, Parana, Brazil (UTFPR)-Campus Ponta Grossa, Av. Monteiro Lobato, Km 4, CEP: 84016-210 (Brazil)


    In recent years, several policies have been proposed by governments and global institutions to improve the efficient use of energy in industries worldwide. However, projects in industrial motor systems require a new approach, mainly in the decision-making area, given the organizational barriers to energy efficiency. Despite their wide application elsewhere, multicriteria methods have so far remained unexplored in industrial motor systems. This paper proposes a multicriteria model using the PROMETHEE II method, with the aim of ranking alternatives for induction motor replacement. A comparative analysis of the model, applied to a Brazilian industry, showed that multicriteria analysis performs better in terms of energy savings and return on investment than a single criterion. The paper strongly recommends the dissemination of multicriteria decision aiding as a policy to support decision makers in industry and to improve energy efficiency in electric motor systems. - Highlights: > Lack of a decision model in industrial motor systems is the main motivation of the research. > A multicriteria model based on the PROMETHEE method is proposed to support decision makers in industry. > The model can help overcome some barriers within industries, improving the energy efficiency of industrial motor systems.
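PROMETHEE II itself is compact enough to sketch with the "usual" preference function (strict preference for any positive difference); the alternatives, criteria and weights below are illustrative assumptions, not the paper's case-study data.

```python
# Minimal PROMETHEE II sketch: pairwise preferences aggregated into net
# outranking flows, which induce a complete ranking of alternatives.

def promethee_ii(scores, weights, maximize):
    """scores[a][j] = performance of alternative a on criterion j."""
    n = len(scores)

    def pref(a, b):
        # "Usual" criterion: full preference for any improvement.
        total = 0.0
        for j, w in enumerate(weights):
            d = scores[a][j] - scores[b][j]
            if not maximize[j]:
                d = -d
            if d > 0:
                total += w
        return total

    net = []
    for a in range(n):
        plus = sum(pref(a, b) for b in range(n) if b != a) / (n - 1)
        minus = sum(pref(b, a) for b in range(n) if b != a) / (n - 1)
        net.append(plus - minus)
    return net

# Hypothetical motor-replacement alternatives; criteria: efficiency (max),
# purchase cost (min), payback years (min).
alts = [
    [0.95, 12000.0, 2.0],  # premium-efficiency motor
    [0.92, 9000.0, 2.5],   # high-efficiency motor
    [0.89, 6000.0, 4.0],   # rewind the existing motor
]
flows = promethee_ii(alts, [0.5, 0.3, 0.2], [True, False, False])
ranking = sorted(range(len(alts)), key=lambda a: flows[a], reverse=True)
```

Net flows always sum to zero; the ranking follows directly from sorting alternatives by net flow.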

  12. Cross-platform analysis of cancer microarray data improves gene expression based classification of phenotypes

    Directory of Open Access Journals (Sweden)

    Eils Roland


    Full Text Available Abstract Background The extensive use of DNA microarray technology in the characterization of the cell transcriptome is leading to an ever increasing amount of microarray data from cancer studies. Although similar questions for the same type of cancer are addressed in these different studies, a comparative analysis of their results is hampered by the use of heterogeneous microarray platforms and analysis methods. Results In contrast to a meta-analysis approach, where results of different studies are combined on an interpretative level, we investigate here how to directly integrate raw microarray data from different studies for the purpose of supervised classification analysis. We use median rank scores and quantile discretization to derive numerically comparable measures of gene expression from different platforms. These transformed data are then used to train classifiers based on support vector machines. We apply this approach to six publicly available cancer microarray gene expression data sets, which consist of three pairs of studies, each examining the same type of cancer, i.e. breast cancer, prostate cancer or acute myeloid leukemia. For each pair, one study was performed by means of cDNA microarrays and the other by means of oligonucleotide microarrays. In each pair, high classification accuracies (>85%) were achieved with training and testing on data instances randomly chosen from both data sets in a cross-validation analysis. To exemplify the potential of this cross-platform classification analysis, we use two leukemia microarray data sets to show that important genes with regard to the biology of leukemia are selected in an integrated analysis, which are missed in either single-set analysis. Conclusion Cross-platform classification of multiple cancer microarray data sets yields discriminative gene expression signatures that are found and validated on a large number of microarray samples, generated by different laboratories and
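The idea behind quantile discretization can be sketched as follows (our simplification, with made-up values): within each sample, raw intensities are replaced by the quantile bin of their rank, so a cDNA ratio profile and an oligonucleotide intensity profile with the same gene ordering map to identical codes, making the platforms directly comparable.

```python
# Hedged sketch of quantile discretization: rank within the sample, then
# bin ranks into quantiles. Values below are illustrative, not real data.

def quantile_discretize(values, n_bins=4):
    order = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    for rank, i in enumerate(order):
        bins[i] = min(n_bins - 1, rank * n_bins // len(values))
    return bins

cdna = [0.12, 5.3, 1.1, 0.7, 9.8, 2.2, 0.05, 3.3]       # cDNA-style ratios
oligo = [210, 15400, 3200, 980, 22100, 5100, 95, 7600]  # oligo intensities
```

Because the two hypothetical profiles rank the genes identically, their discretized codes coincide even though the raw scales differ by orders of magnitude.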

  13. Flow analysis techniques as effective tools for the improved environmental analysis of organic compounds expressed as total indices. (United States)

    Maya, Fernando; Estela, José Manuel; Cerdà, Víctor


    The scope of this work is to provide an overview of the current state of the art in flow analysis techniques applied to the environmental determination of organic compounds expressed as total indices. Flow analysis techniques are proposed as effective tools for quickly obtaining preliminary chemical information on the occurrence of organic compounds in the environment prior to the use of more complex, time-consuming and expensive instrumental techniques. Recently improved flow-based methodologies for the determination of chemical oxygen demand, halogenated organic compounds and phenols are presented and discussed in detail. The aim of the present work is to highlight flow-based techniques as vanguard tools for the determination of organic compounds in environmental water samples.

  14. CAT 2 - An improved version of Cryogenic Analysis Tools for online and offline monitoring and analysis of large size cryostats (United States)

    Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.


    CAT, Cryogenic Analysis Tools, is a software package developed in the LabVIEW and ROOT environments to analyze the performance of large cryostats, where many parameters, inputs and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New graphical user interfaces have been developed to make the full package more user-friendly, and a process of resource optimization has been carried out. Offline analysis of the full cryostat performance is available both through the ROOT command-line interface and through the new graphical interfaces.

  15. An Improved Distance and Mass Estimate for Sgr A* from a Multistar Orbit Analysis

    CERN Document Server

    Boehle, A; Schödel, R; Meyer, L; Yelda, S; Albers, S; Martinez, G D; Becklin, E E; Do, T; Lu, J R; Matthews, K; Morris, M R; Sitarski, B; Witzel, G


    We present new, more precise measurements of the mass and distance of our Galaxy's central supermassive black hole, Sgr A*. These results stem from a new analysis that more than doubles the time baseline for astrometry of faint stars orbiting Sgr A*, combining two decades of speckle imaging and adaptive optics data. Specifically, we improve our analysis of the speckle images by using information about a star's orbit from the deep adaptive optics data (2005 - 2013) to inform the search for the star in the speckle years (1995 - 2005). When this new analysis technique is combined with the first complete re-reduction of Keck Galactic Center speckle images using speckle holography, we are able to track the short-period star S0-38 (K-band magnitude = 17, orbital period = 19 years) through the speckle years. We use the kinematic measurements from speckle holography and adaptive optics to estimate the orbits of S0-38 and S0-2 and thereby improve our constraints of the mass ($M_{bh}$) and distance ($R_o$) of Sgr A*: $...
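The core of such an orbit-based estimate is Kepler's third law: in units of AU, years and solar masses, the enclosed mass is M = a³/P². A toy sketch with round, approximate orbital elements for the star S0-2 (illustrative values only, not the paper's fitted results):

```python
# Kepler's third law in solar-system units: M [M_sun] = a^3 [AU] / P^2 [yr],
# where a is the orbit's semi-major axis and P its period. The physical
# semi-major axis itself depends on the distance R_o, which is why mass and
# distance are fitted jointly in the actual multi-star orbit analysis.
def enclosed_mass_msun(a_au, period_yr):
    return a_au ** 3 / period_yr ** 2

# Approximate values for S0-2: a ~ 1000 AU, P ~ 16 yr (assumed, illustrative)
m_bh = enclosed_mass_msun(1000.0, 16.0)   # ~4 million solar masses
```

Even this crude estimate lands near the accepted few-million-solar-mass scale for Sgr A*, which is why precise astrometry of short-period stars constrains the mass so directly.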

  16. Are migrants health policies aimed at improving access to quality healthcare? An analysis of Spanish policies. (United States)

    Vázquez, María Luisa; Terraza-Núñez, Rebeca; S-Hernández, Silvia; Vargas, Ingrid; Bosch, Lola; González, Andrea; Pequeño, Sandra; Cantos, Raquel; Martínez, Juan Ignacio; López, Luís Andrés


    Although until April 2012, all Spanish citizens regardless of their origin, residence status and work situation were entitled to health care, available evidence suggested inadequate access for immigrants. Following the Aday and Andersen model, we conducted an analysis of policy elements that affect immigrants' access to health care in Spain, based on documentary analysis of national policies and selected regional policies related to migrant health care. Selected documents were (a) laws and plans in force at the time containing migrant health policies and (b) evaluations. The analysis included policy principles, objectives, strategies and evaluations. Results show that the national and regional policies analyzed are based on the principle that health care is a right granted to immigrants by law. These policies include strategies to facilitate access to health care, reducing barriers for entry to the system, for example simplifying requirements and raising awareness, but mostly they address the necessary qualities for services to be able to attend to a more diverse population, such as the adaptation of resources and programs, or improved communication and training. However, limited planning was identified in terms of their implementation, necessary resources and evaluation. In conclusion, the policies address relevant barriers of access for migrants and signal improvements in the health system's responsiveness, but reinforcement is required in order for them to be effectively implemented.

  17. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital. (United States)

    Lu, Lingbo; Li, Jingshan; Gisler, Paula


    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.

  18. An Improved Analysis of the Sevoflurane-Benzene Structure by Chirped Pulse Ftmw Spectroscopy (United States)

    Seifert, Nathan A.; Perez, Cristobal; Zaleski, Daniel P.; Neill, Justin L.; Pate, Brooks H.; Lesarri, Alberto; Vallejo, Montserrat; Cocinero, Emilio J.; Castano, Fernando; Kleiner, Isabelle


    Recent improvements to the 2-8 GHz CP-FTMW spectrometer at the University of Virginia have improved the structural and spectroscopic analysis of the sevoflurane-benzene cluster. Previously reported results, although robust, were limited to a fit of the a-type transitions of the normal species in the determination of the six-fold barrier to benzene internal rotation. Structural analysis was limited to the benzene hydrogen atom positions using benzene-d_{1}. The increased sensitivity of the new 2-8 GHz setup allows for a full internal rotation analysis of the a- and c-type transitions of the normal species, which was performed with BELGI. A fit value for V_{6} of 32.868(11) cm^{-1} is determined. Additionally, a full substitution structure of the benzene carbon atom positions was determined in natural abundance. Also, new measurements of a sevoflurane/benzene-d_{1} mixture enabled detection of 33 of the 60 possible ^{2}D / ^{13}C double isotopologues. This abundance of isotopic data, a total of 45 isotopologues, enabled a full heavy atom least-squares r_{0} structure fit for the complex, including positions for all seven fluorines in sevoflurane. N. A. Seifert, D. P. Zaleski, J. L. Neill, B. H. Pate, A. Lesarri, M. Vallejo, E. J. Cocinero, F. Castaño. 67th OSU Int. Symp. on Mol. Spectrosc., Columbus, OH, 2012, MH13.

  19. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics. (United States)

    Yin, Jian; Fenley, Andrew T; Henriksen, Niel M; Gilson, Michael K


    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by nonoptimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery.

  20. Silica Fume and Fly Ash Admixtures Can Help to Improve RPC Durability: Combined Microscopic Analysis

    Directory of Open Access Journals (Sweden)

    Xiao Li-guang


    Full Text Available Silica fume/fly ash RPC can greatly improve durability. When silica fume replaced 8% of the cement by equal mass, and 10% of fly ash that had been mechanically activated for 15 min was also added, measurements with a chloride-ion flux detector showed that the impermeability of the doubly admixed RPC improved significantly over the reference RPC. In addition, determination of the internal pore structure of the RPC by the static nitrogen adsorption method showed that the integral pore volume of the admixed RPC was significantly lower than that of the reference RPC. Combined with SEM microscopic analysis of the internal structure and formation mechanism of the RPC, these results show that the silica fume/fly ash combination fully embodies the "synergistic" principle of compositely doped materials.

  1. Improving analytic hierarchy process applied to fire risk analysis of public building

    Institute of Scientific and Technical Information of China (English)

    SHI Long; ZHANG RuiFang; XIE QiYuan; FU LiHua


    The structure importance in Fault Tree Analysis (FTA) reflects how important Basic Events are to the Top Event. Attributes at the alternative level in the Analytic Hierarchy Process (AHP) likewise reflect their importance to the general goal. Based on the coherence of these two methods, an improved AHP is put forward. Using this improved method, the importance of each attribute to the fire safety of a public building can be analyzed more credibly because subjective judgment is reduced. Olympic venues are very important public buildings in China, and their fire safety evaluation is a major issue for engineers. The improved AHP is a useful tool for the safety evaluation of these Olympic venues, and it will guide evaluation in other areas.
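The standard AHP step underlying this work, deriving attribute weights from a pairwise-comparison matrix as its normalized principal eigenvector, can be sketched as follows. The 3×3 comparison matrix is a hypothetical example on the usual 1-9 scale, not data from the paper.

```python
import numpy as np

def ahp_weights(A):
    """Priority weights of an AHP pairwise-comparison matrix: the normalized
    eigenvector belonging to the largest eigenvalue."""
    vals, vecs = np.linalg.eig(np.asarray(A, dtype=float))
    k = int(np.argmax(vals.real))
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

# Hypothetical reciprocal matrix comparing three fire-safety attributes:
# A[i][j] states how much more important attribute i is than attribute j.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(A)   # weights sum to 1; the first attribute dominates
```

The paper's improvement replaces part of this subjective comparison matrix with FTA structure-importance values, but the eigenvector step itself is unchanged.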

  2. Cost-benefit analysis of improved air quality in an office building

    DEFF Research Database (Denmark)

    Djukanovic, R.; Wargocki, Pawel; Fanger, Povl Ole


    A cost-benefit analysis of measures to improve air quality in an existing air-conditioned office building (11581 m2, 864 employees) was carried out for hot, temperate and cold climates and for two operating modes: Variable Air Volume (VAV) with economizer, and Constant Air Volume (CAV) with heat recovery. The annual energy cost and first cost of the HVAC system were calculated using DOE 2.1E for different levels of air quality (10-50% dissatisfied). This was achieved by changing the outdoor air supply rate and the pollution loads. Previous studies have documented a 1.1% increase in office productivity for every 10% reduction in the proportion of occupants entering a space who are dissatisfied with the air quality. With this assumption, the annual benefit due to improved air quality was always at least 10 times higher than the increase in annual energy and maintenance costs. The payback time...


    Directory of Open Access Journals (Sweden)

    Alidiane Xavier


    Full Text Available Increased competitiveness in the market encourages the ongoing development of systems and production processes. The aim is to increase production efficiency so that production costs and waste are reduced as far as possible, making the product more competitive. The objective of this study was to analyze the overall results of implementing the Kaizen philosophy at a manufacturer of construction machinery, using the methodology of action research: the macro production process was studied in situ, from the receipt of parts to the end of the assembly line, prioritizing the analysis of shipping and handling times. The results show that continuous improvement activities directly impact the elimination of waste from the assembly process, mainly waste related to shipping and handling, improving production efficiency by 30% in the processes studied.

  4. Performance analysis and improvement of the use of wind tower in hot dry climate

    Energy Technology Data Exchange (ETDEWEB)

    Bouchahm, Yasmina; Bourbia, Fatiha [Universite Mentouri, Departement d' Architecture, Laboratoire d' architecture bioclimatique et d' environnement, Constantine (Algeria); Belhamri, Azeddine [Universite Mentouri, Departement de Genie Climatique, Constantine (Algeria)


    Wind towers for passive evaporative cooling offer a real opportunity for improving ambient comfort conditions in buildings whilst reducing the energy consumption of air-conditioning systems. This study aims at assessing the thermal performance of bioclimatic housing using wind towers built in a hot dry region of Algeria. Performance monitoring and site measurements of the system provide data which assist model validation. The analysis and site measurements are encouraging, and they confirm the advantage of applying this passive cooling strategy in a hot dry climate. A mathematical model is developed using heat and mass transfer balances. For more effective evaporative cooling, a number of improvements to the wind tower configuration are proposed. (author)

  5. Analysis of Improvement on Human Resource Management within Chinese Enterprises in Economic Globalization

    Directory of Open Access Journals (Sweden)

    Lihui Xie


    Full Text Available In this study, we analyze the improvement of human resource management within Chinese enterprises under economic globalization. China's entry into the WTO has accelerated the economic globalization of Chinese enterprises, and the Chinese economy is further integrated with the global economy on a global scale. Human resources are what the economic globalization of Chinese enterprises relies on: the first resource for China to participate in international competition and the key to making effective use of other resources. Nevertheless, against the background of economic globalization, human resource management in Chinese enterprises still faces many challenges and problems. In order to establish a globalized concept of human resource management and set up a human resource management mechanism that responds to economic globalization, this study offers a discussion and proposes management methods and improvement measures for reference.

  6. Analysis of walking improvement with dynamic shoe insoles, using two accelerometers (United States)

    Tsuruoka, Yuriko; Tamura, Yoshiyasu; Shibasaki, Ryosuke; Tsuruoka, Masako


    Orthopedists at the rehabilitation hospital found that disorders caused by sports injuries to the feet or by lower-back problems are improved by wearing dynamic shoe insoles, which improve walking balance and stability. However, the relationship between the lower back and the knees and the rate of increase in stability had not been quantitatively analyzed. In this study, using two accelerometers, we quantitatively analyzed the reciprocal spatiotemporal contributions between the lower back and knee of patients with left lower-back pain by means of Relative Power Contribution Analysis. When the insoles were worn, the contribution of the left and right knee relative to the left lower-back pain was up to 26% (p<0.05), and the corresponding analysis of the left and right knee decreased by up to 67% (p<0.05). This shows an increase in stability.

  7. Error analysis and algorithm implementation for an improved optical-electric tracking device based on MEMS (United States)

    Sun, Hong; Wu, Qian-zhong


    In order to improve the precision of an optical-electric tracking device, an improved MEMS-based device is proposed. To address the tracking error and random drift of the gyroscope sensor, an AR model of the gyro random error is established according to the principles of time-series analysis of random sequences, and the gyro output signals are repeatedly filtered with a Kalman filter. An ARM microcontroller drives the servo motor using a fuzzy PID full closed-loop control algorithm, with lead-correction and feed-forward links added to reduce the response lag to angle inputs: feed-forward makes the output follow the input closely, while the lead-compensation link shortens the response to input signals and thereby reduces errors. A wireless video monitoring module and remote monitoring software (Visual Basic 6.0) observe the servo motor state in real time; the module gathers video signals and sends them to an upper computer, which displays the motor's running state in a Visual Basic 6.0 window. The main error sources are also analysed in detail: quantitative analysis of the errors contributed by bandwidth and by the gyro sensor makes the proportion of each error in the total more intuitive and consequently helps decrease the system error. Simulation and experimental results show that the system has good following characteristics and is very valuable for engineering applications.
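The gyro-drift treatment described above, an autoregressive model of the random error combined with Kalman filtering, can be sketched for the simplest AR(1) case. All coefficients and noise levels below are assumed for illustration; a real device would identify them from recorded gyro data.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, q, r = 0.95, 0.01, 0.5   # assumed AR(1) coefficient, process & measurement noise
n = 500

# Simulate a slowly wandering gyro drift: the AR(1) "random error" model
drift = np.zeros(n)
for k in range(1, n):
    drift[k] = phi * drift[k - 1] + rng.normal(0.0, np.sqrt(q))
z = drift + rng.normal(0.0, np.sqrt(r), n)    # noisy gyro output

# Scalar Kalman filter for the model x_k = phi*x_{k-1} + w_k,  z_k = x_k + v_k
x, P = 0.0, 1.0
est = np.empty(n)
for k in range(n):
    x, P = phi * x, phi * phi * P + q           # predict
    K = P / (P + r)                             # Kalman gain
    x, P = x + K * (z[k] - x), (1.0 - K) * P    # update with measurement z[k]
    est[k] = x
# 'est' tracks the drift with much smaller error than the raw measurements
```

Subtracting the filtered drift estimate from the gyro output is what reduces the random-error contribution to the tracking loop.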

  8. Rational improvement of the engineered isobutanol-producing Bacillus subtilis by elementary mode analysis

    Directory of Open Access Journals (Sweden)

    Li Shanshan


    Full Text Available Abstract Background Isobutanol is considered a leading candidate for the replacement of current fossil fuels, and is expected to be produced biotechnologically. Owing to its valuable features, Bacillus subtilis has been engineered as an isobutanol producer, but it needs to be further optimized for more efficient production. Since elementary mode analysis (EMA) is a powerful tool for the systematic analysis of metabolic network structures and cell metabolism, it may be of great importance in rational strain improvement. Results The metabolic network of the isobutanol-producing B. subtilis BSUL03 was first constructed for EMA. Considering the actual cellular physiological state, 239 elementary modes (EMs) were screened from the total of 11,342 EMs for potential target prediction. On this basis, lactate dehydrogenase (LDH) and the pyruvate dehydrogenase complex (PDHC) were predicted as the most promising inactivation candidates, according to flux flexibility analysis and intracellular flux distribution simulation. The in silico designed mutants were then constructed experimentally. The maximal isobutanol yield of the LDH- and PDHC-deficient strain BSUL05 reached 61% of the theoretical value, at 0.36 ± 0.02 C-mol isobutanol/C-mol glucose, 2.3-fold that of BSUL03. Moreover, this mutant produced approximately 70% more isobutanol, to a maximal titer of 5.5 ± 0.3 g/L, in fed-batch fermentations. Conclusions EMA was employed as a guiding tool to direct rational improvement of the engineered isobutanol-producing B. subtilis. The consistency between model predictions and experimental results demonstrates the rationality and accuracy of this EMA-based approach to target identification. This network-based rational strain improvement strategy could serve as a promising concept for engineering efficient B. subtilis hosts for isobutanol, as well as other valuable products.

  9. Polymer-modified Concrete with Improved Flexural Toughness and Mechanism Analysis

    Institute of Scientific and Technical Information of China (English)

    CAO Qingyu; SUN Wei; GUO Liping; ZHANG Guorong


    By mixing different types of polymer into concrete, the toughness of the concrete was investigated, and the results indicate that polymer clearly improves the toughness of concrete. The microstructure of polymer-modified concrete was studied with an environmental scanning electron microscope and a digital micro-hardness tester; the results show that the polymer acts as a flexible filler and reinforcement in the concrete and alters the microstructure of the mortar and the ITZ. Crack-path prediction and energy-consumption analysis show that the crack path of polymer-modified concrete is more tortuous and consumes more energy than that of ordinary concrete.

  10. Improving power output of inertial energy harvesters by employing principal component analysis of input acceleration (United States)

    Smilek, Jan; Hadas, Zdenek


    In this paper we propose the use of principal component analysis to process measured acceleration data in order to determine the direction of acceleration with the highest variance at a given frequency of interest. This method can be used to improve the power generated by inertial energy harvesters. Their power output is highly dependent on the excitation acceleration magnitude and frequency, but the axes of the acceleration measurements might not always be perfectly aligned with the directions of movement; the generated power output might therefore be severely underestimated in simulations, possibly leading to false conclusions about the feasibility of using an inertial energy harvester for the examined application.
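The proposed use of PCA amounts to an eigendecomposition of the covariance matrix of the three-axis acceleration record: the eigenvector with the largest eigenvalue is the direction of maximum excitation variance. A minimal sketch with a synthetic, deliberately misaligned excitation (frequency, tilt and noise level all assumed):

```python
import numpy as np

def dominant_direction(acc):
    """Return the unit direction of largest acceleration variance and that
    variance: the principal eigenvector/eigenvalue of the covariance matrix."""
    acc = acc - acc.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(acc.T))
    return vecs[:, -1], vals[-1]      # eigh sorts eigenvalues ascending

# Synthetic record: a 17 Hz excitation along a direction tilted 45 degrees
# from the sensor axes, plus measurement noise (illustrative values).
rng = np.random.default_rng(1)
axis = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
t = np.linspace(0.0, 1.0, 1000)
acc = np.sin(2 * np.pi * 17 * t)[:, None] * axis + rng.normal(0.0, 0.05, (1000, 3))
d, var = dominant_direction(acc)
# projecting onto d recovers nearly the full excitation variance, which a
# single misaligned sensor axis would underestimate (by a factor ~2 here)
```

In practice the record would first be band-pass filtered around the harvester's resonance so that the principal axis reflects the frequency of interest rather than broadband motion.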

  11. Analysis and Improvement of TCP Congestion Control Mechanism Based on Global Optimization Model

    Institute of Scientific and Technical Information of China (English)


    Network flow control is formulated as a global optimization problem of user profit, and a general global optimization flow-control model is established. This model, combined with the stochastic model of TCP, is used to study the global rate allocation characteristics of TCP. Analysis shows that when active queue management is used in the network, TCP rates tend to be allocated so as to maximize the aggregate of a user utility function Us (called Us fairness). The TCP throughput formula is derived, and an improved TCP congestion control mechanism is proposed. Simulations show that its throughput is TCP-friendly when competing with existing TCP and that its rate changes are smoother. It is therefore suitable for carrying multimedia applications.
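The utility-maximization view of flow control used here can be illustrated with the classic primal-dual sketch: each source picks the rate that maximizes its utility minus the bandwidth price, and the link adjusts its price until total demand matches capacity. The logarithmic (proportionally fair) utility, the weights and the capacity below are illustrative stand-ins, not the paper's Us function.

```python
import numpy as np

# Two flows share one link of capacity C. Source i chooses the rate maximizing
# w_i*log(x_i) - price*x_i, giving x_i = w_i/price; the link raises or lowers
# its "price" (dual variable) until aggregate demand equals capacity.
C = 10.0                          # link capacity (assumed)
w = np.array([1.0, 2.0])          # per-flow utility weights (assumed)
price, step = 1.0, 0.05
for _ in range(2000):
    x = w / price                                       # sources' optimal rates
    price = max(price + step * (x.sum() - C), 1e-6)     # dual (price) update
x = w / price                                           # rates at convergence
# proportional fairness splits the capacity in proportion to the weights
```

At the optimum the link is fully used and each flow's share is C·wᵢ/Σw, which is the kind of global allocation property the abstract attributes to TCP under active queue management.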

  12. A Novel Mutation in a Kazakh Family with X-Linked Alport Syndrome.

    Directory of Open Access Journals (Sweden)

    Barshagul T Baikara

    Full Text Available Alport syndrome is a genetic condition that results in hematuria, progressive renal impairment, hearing loss, and occasionally lenticonus and retinopathy. Approximately 80% of Alport syndrome cases are caused by X-linked mutations in the COL4A5 gene encoding type IV collagen. The objective of this study was to define the SNP profiles for COL4A5 in patients with hereditary nephritis and hematuria. For this, we examined four subjects from one Kazakh family clinically affected with X-linked Alport syndrome due to COL4A5 gene mutations. All 51 exons of the COL4A5 gene were screened by linkage analysis and direct DNA sequencing, resulting in the identification of a novel mutation (G641E in exon 25. The mutation was found only in two affected family individuals but was not present in healthy family members or 200 unrelated healthy controls. This result demonstrates that this novel mutation is pathogenic and has meaningful implications for the diagnosis of patients with Alport syndrome.

  13. Analysis and improvement of work postures in the building industry: application of the computerised OWAS method. (United States)

    Kivi, P; Mattila, M


    Awkward work posture is associated with the development of musculo-skeletal disorders. Previous workplace investigations in new building construction have shown that physical work affects workers' health in 46% of jobs. There is, however, a need for detailed analysis of jobs having physical workload and ergonomics problems. OWAS (Ovako Working Posture Analysing System) is a simple observation method for postural analysis, but there has been no study of its use in the building construction industry. The work described here examined (a) the use of the OWAS method to analyse work postures in building construction, (b) the development of a portable computer system for the OWAS method, (c) improvement of work postures identified as poor, and (d) use of the results as part of the ergonomics training programme of the company. Suggestions for work redesign measures are given.

  14. RIPOSTE: a framework for improving the design and analysis of laboratory-based research. (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn


    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  15. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.H.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [John Wreathall & Co., Dublin, OH (United States)] [and others


    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission, those errors associated with inappropriate interventions by operators in operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  16. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś


    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods for the improvement of the parameters of a lead molding process. To this end, Failure Mode and Effects Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) despite the previously defined potential problem and its preventive action. This contributed to the recognition of root causes, corrective actions and changes to production parameters. It showed how these methods, the level of their organization, and systematic and rigorous study affect molding process parameters.

  17. Linear analysis of the vertical shear instability: outstanding issues and improved solutions (Research Note)

    CERN Document Server

    Umurhan, O M; Gressel, O


    The Vertical Shear Instability is one of two known mechanisms potentially active in the so-called dead zones of protoplanetary accretion disks. A recent analysis indicates that a subset of unstable modes shows unbounded growth - both as resolution is increased and when the nominal lid of the atmosphere is extended, possibly indicating ill-posedness in previous attempts of linear analysis. The reduced equations governing the instability are revisited and the generated solutions are examined using both the previously assumed separable forms and an improved non-separable solution form that is herewith introduced. Analyzing the reduced equations using the separable form shows that, while the low-order body modes have converged eigenvalues and eigenfunctions (as both the vertical boundaries of the atmosphere are extended and with increased radial resolution), it is also confirmed that the corresponding high-order body modes and the surface modes do indeed show unbounded growth rates. However, the energy contained ...

  18. Improving resolution and depth of astronomical observations via modern mathematical methods for image analysis

    CERN Document Server

    Castellano, Marco; Fontana, Adriano; Merlin, Emiliano; Pilo, Stefano; Falcone, Maurizio


    In the past years, modern mathematical methods for image analysis have led to a revolution in many fields, from computer vision to scientific imaging. However, some recently developed image processing techniques successfully exploited by other sectors have been rarely, if ever, tried on astronomical observations. We present here tests of two classes of variational image enhancement techniques, "structure-texture decomposition" and "super-resolution", showing that they are effective in improving the quality of observations. Structure-texture decomposition allows the recovery of faint sources previously hidden by the background noise, effectively increasing the depth of available observations. Super-resolution yields a higher-resolution and better-sampled image out of a set of low-resolution frames, thus mitigating problems in data analysis arising from the difference in resolution/sampling between different instruments, as in the case of the EUCLID VIS and NIR imagers.

  19. Improved Persistent Scatterer analysis using Amplitude Dispersion Index optimization of dual polarimetry data (United States)

    Esmaeili, Mostafa; Motagh, Mahdi


    Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual/quad-polarized data provides an additional source of information to further improve the capability of InSAR time-series analysis. In this paper we use dual-polarized data and combine Amplitude Dispersion Index (ADI) optimization of pixels with a phase-stability criterion for PSInSAR analysis. ADI optimization is performed using a Simulated Annealing algorithm to increase the number of Persistent Scatterer Candidates (PSCs). The phase stability of the PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) SAR images acquired by TerraSAR-X from July 2013 to January 2014 over a subsidence area in Iran, and compare its effectiveness for agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in both urban and non-urban regions. Compared to single-pol data, the use of optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with high rates of deformation which suffer from losing phase stability over time. The classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of the PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly the single-bounce mechanism.
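The Amplitude Dispersion Index at the heart of persistent-scatterer candidate selection is simply the ratio of the temporal standard deviation to the temporal mean of each pixel's amplitude time series. A minimal sketch on synthetic stacks (the stack size of 17 matches the dataset described; the amplitude statistics and the common ~0.25 threshold are illustrative assumptions):

```python
import numpy as np

def amplitude_dispersion_index(amp):
    """ADI = sigma_A / mu_A of each pixel's SAR amplitude time series; pixels
    with low ADI (commonly below ~0.25) are persistent-scatterer candidates."""
    amp = np.asarray(amp, dtype=float)
    return amp.std(axis=0) / amp.mean(axis=0)

# Synthetic stack of 17 acquisitions x 200 pixels per class: bright stable
# point scatterers vs. distributed (Rayleigh-fading) scatterers.
rng = np.random.default_rng(2)
stable = rng.normal(100.0, 5.0, size=(17, 200))
distributed = rng.rayleigh(10.0, size=(17, 200))
adi_stable = amplitude_dispersion_index(stable)
adi_distributed = amplitude_dispersion_index(distributed)
```

The optimization step described in the abstract would, for each pixel, search over combinations of the HH/VV channels (e.g. by simulated annealing) for the scattering mechanism minimizing this ADI before applying the threshold.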

  20. Improvements of instrumental proximate and ultimate analysis of coals and coal conversion products

    Energy Technology Data Exchange (ETDEWEB)

    Selucky, M.L.; Iacchelli, A.; Murray, C.; Lieshout, T. van.


    Comparison of proximate analyses obtained using ASTM (American Society for Testing and Materials) methods with those from the Fisher coal analyzer shows that the analyzer gives consistently low moisture and ash values, and high volatile matter values. While the accuracy of moisture and ash determinations can be improved by introducing various instrument and crucible modifications, volatile matter values are less accurate, mainly because of differences in heating rates. However, reproducibility of results is very good and, with modifications, the instrument can be used to advantage for internal purposes, chiefly because of its large sample capacity. In ultimate analysis of coals using the Perkin-Elmer element analyzer, the main problem is that the initial purge-gas flushing period after sample introduction partially removes water from the sample. Trials of various methods of sample drying have shown that the best approach is to dry the sample directly in the instrument at the temperature used for moisture determination; with this modification of the analytical cycle, excellent reproducibility and correlation with the ASTM method have been achieved. The proximate and ultimate analyses of samples of extracts and extract residues are impaired by the presence of residual solvent. The samples can contain up to 10% residual solvent, which appears as moisture in the proximate analysis. The report describes several ways of removing the solvent so that accurate analyses can be obtained. The foregoing modifications to procedures and equipment have considerably improved both the accuracy and the reliability of results obtained by instrumental methods. In consequence, considerably more samples can be handled than by using ASTM standard procedures. 4 refs., 1 fig., 19 tabs.

  1. Next generation sequencing as a useful tool in the diagnostics of mosaicism in Alport syndrome. (United States)

    Beicht, Sonja; Strobl-Wildemann, Gertrud; Rath, Sabine; Wachter, Oliver; Alberer, Martin; Kaminsky, Elke; Weber, Lutz T; Hinrichsen, Tanja; Klein, Hanns-Georg; Hoefele, Julia


    Alport syndrome (ATS) is a progressive hereditary nephropathy characterized by hematuria and/or proteinuria with structural defects of the glomerular basement membrane. It can be associated with extrarenal manifestations (high-tone sensorineural hearing loss and ocular abnormalities). Mutations in the COL4A5 gene (X-linked) or in the COL4A3 and COL4A4 genes (both autosomal recessive and autosomal dominant) cause Alport syndrome. Somatic mosaicism in Alport patients is very rare, which may be due to the difficulty of its detection. We report the case of a boy and his mother who presented with Alport syndrome. Mutational analysis showed the novel hemizygous pathogenic mutation c.2396-1G>A (IVS29-1G>A) at the splice acceptor site of the intron 29/exon 30 boundary of the COL4A5 gene in the boy. The mutation in the mother would not have been detected by Sanger sequencing without knowledge of the mutational analysis result of her son. Further investigation of the mother using next generation sequencing showed somatic mosaicism and implied potential germ-cell mosaicism. The mutation in the mother most likely occurred during early embryogenesis. Analysis of tissues of different embryonic origin in the mother confirmed mosaicism in both mesoderm and ectoderm. Low-grade mosaicism is very difficult to detect by Sanger sequencing. Next generation sequencing is increasingly used in diagnostics and might improve the detection of mosaicism. In cases with definite clinical symptoms of ATS but no mutation detected by Sanger sequencing, mutational analysis should be performed by next generation sequencing.
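    The advantage of NGS here is quantitative: each read is an independent observation of the locus, so a low-grade mosaic allele shows up as a reduced variant allele fraction (VAF). A minimal sketch with hypothetical read counts (illustrative numbers only, not data from this study):

```python
def variant_allele_fraction(ref_reads, alt_reads):
    """Fraction of NGS reads supporting the variant at a position."""
    return alt_reads / (ref_reads + alt_reads)

# Hypothetical deep-sequencing counts:
son_vaf = variant_allele_fraction(ref_reads=5, alt_reads=995)      # hemizygous male
mother_vaf = variant_allele_fraction(ref_reads=940, alt_reads=60)  # low-grade mosaic

# A constitutional heterozygous carrier would sit near VAF ~0.5; a VAF of
# ~0.06 lies below the commonly cited ~0.15-0.20 detection limit of Sanger
# sequencing but is readily measurable at high NGS read depth.
sanger_limit = 0.15
```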

  2. How process analysis could improve the implementation of spinal cord stimulation treatment for chronic pain. (United States)

    Williams, Kayode A; McLeod, Julia C; Reinhardt, Gilles


    Spinal cord stimulation has been in clinical use for the treatment of chronic pain for over four decades. Since the initial use by Norman Shealy, the indications for its use have steadily broadened to include neuropathic pain owing to failed back surgery syndrome, complex regional pain syndrome, and painful diabetic peripheral neuropathies. To date, the precise mechanism of action of spinal cord stimulation remains unclear, yet it is still one of the most expensive interventional treatment modalities available in pain medicine, with increasing application across the world. Given the worldwide focus on cost-effective care, there is an opportunity to use process analysis as a mechanism for optimizing the operations within and between all specialties engaged in the provision of care in pain medicine. Here, we propose a process analysis approach to model, measure and improve the delivery of disease-based care to enhance effective treatment with a costly modality. Systems-based process analysis is not widely utilized in pain medicine, and there is a limited body of evidence for its application. The purpose of this article is to generate interest in the discipline of process analysis in pain medicine, as it has found value in other healthcare settings and industries. We discuss its applicability across countries and specialties in the hope of increasing awareness of this concept and encouraging further examination by investigators, leading to the development of highly efficient and effective healthcare delivery processes and systems across the globe.

  3. Improvement of reflood model in RELAP5 code based on sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dong; Liu, Xiaojing; Yang, Yanhua


    Highlights: • Sensitivity analysis is performed on the reflood model of RELAP5. • The selected influential models are discussed and modified. • The modifications are assessed against the FEBA experiment and better predictions are obtained. - Abstract: Reflooding is an important and complex process for the safety of a nuclear reactor during a loss of coolant accident (LOCA). Accurate prediction of reflooding behavior is one of the challenging tasks in current system-code development. RELAP5, a widely used system code, can simulate this process but with limited accuracy, especially for low-inlet-flow-rate reflooding conditions. Through a preliminary assessment with six FEBA (Flooding Experiments with Blocked Arrays) tests, it is observed that the peak cladding temperature (PCT) is generally underestimated and bundle quench is predicted too early compared to the experimental data. In this paper, the improvement of constitutive models related to reflooding is carried out based on single-parameter sensitivity analysis. The film boiling heat transfer model and the interfacial friction model of dispersed flow are identified as the models most influential on the results of interest. Studies and discussions then focus on these sensitive models, and proper modifications are recommended. The proposed improvements are implemented in the RELAP5 code and assessed against the FEBA experiment. Better agreement between calculations and measured data for both cladding temperature and quench time is obtained.
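    Single-parameter (one-at-a-time) sensitivity analysis of the kind described can be sketched as follows. The "model" here is a made-up algebraic surrogate with hypothetical parameters (h_film, f_interf, q), standing in for a full RELAP5 run, which is far too expensive to inline:

```python
def peak_cladding_temp(params):
    """Toy surrogate for PCT in kelvin; NOT a RELAP5 model."""
    return 600.0 + params["q"] / params["h_film"] - 50.0 * params["f_interf"]

def oat_sensitivity(model, base, rel_step=0.1):
    """Normalized one-at-a-time sensitivities: relative output change
    per relative input change, perturbing one parameter at a time."""
    base_out = model(base)
    sens = {}
    for name in base:
        pert = dict(base)
        pert[name] = base[name] * (1 + rel_step)
        sens[name] = (model(pert) - base_out) / (base_out * rel_step)
    return sens

base = {"h_film": 0.8, "f_interf": 1.0, "q": 400.0}
sens = oat_sensitivity(peak_cladding_temp, base)
# Ranking |sens| identifies the models/parameters worth modifying first.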

  4. An improved prompt gamma neutron activation analysis facility using a focused diffracted neutron beam (United States)

    Riley, Kent J.; Harling, Otto K.


    The performance of the prompt gamma neutron activation analysis (PGNAA) facility at the MIT Research Reactor has been improved by a series of modifications. These modifications have increased the flux at the sample position by a factor of three, to 1.7 × 10⁷ n/cm²·s, and have increased the sensitivity, on average, by a factor of 2.5. The background for many samples of interest is dominated by unavoidable neutron interactions that occur in or near the sample. Other background components comprise only 20% of the total background count rate. The implementation of fast electronics has helped to keep dead time reasonable, in spite of the increased count rates. The PGNAA facility at the MIT Research Reactor continues to serve as a major analytical tool for quantifying ¹⁰B in biological samples for Boron Neutron Capture Therapy (BNCT) research. The sensitivity for boron-10 in water is 18,750 cps/mg. Sensitivities for pure elements suitable for PGNAA are reported. Possible further improvements are discussed.

  5. Security Analysis and Improvements of Authentication and Access Control in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Bruce Ndibanje


    Full Text Available Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. According to our analysis, Jing et al.’s protocol is costly in message exchange and its security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements provide many services to users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol withstands popular attacks and achieves better efficiency at low communication cost.

  6. Security analysis and improvements of authentication and access control in the Internet of Things. (United States)

    Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon


    Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18-21 June 2012, pp. 588-592). According to our analysis, Jing et al.'s protocol is costly in message exchange and its security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements provide many services to users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol withstands popular attacks and achieves better efficiency at low communication cost.
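    The mutual-authentication and session-key services mentioned above can be illustrated with a simple HMAC challenge-response exchange over a pre-shared key. This is a generic sketch using Python's standard library, not the protocol of Jing et al. nor the authors' improved protocol; a production design would use a vetted KDF (e.g. HKDF) and protect against replay:

```python
import hmac, hashlib, secrets

def respond(key, challenge):
    """Prove possession of the shared key for a given challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Long-term key, provisioned out of band between gateway and sensor node.
key = secrets.token_bytes(32)

# Gateway -> node: fresh nonce; node's HMAC response authenticates it.
c1 = secrets.token_bytes(16)
node_resp = respond(key, c1)

# Node -> gateway: reverse direction gives *mutual* authentication.
c2 = secrets.token_bytes(16)
gw_resp = respond(key, c2)

# Both sides derive the same per-session key from the two fresh nonces,
# so no long-term secret crosses the network.
session_key = hmac.new(key, c1 + c2, hashlib.sha256).digest()
```

    Constant-time comparison (`hmac.compare_digest`) should be used when each side verifies the other's response.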

  7. Improved compositional analysis of block copolymers using diffusion ordered NMR spectroscopy. (United States)

    Viel, Stéphane; Mazarin, Michaël; Giordanengo, Rémi; Phan, Trang N T; Charles, Laurence; Caldarelli, Stefano; Bertin, Denis


    Block copolymers constitute a fascinating class of polymeric materials that are used in a broad range of applications. The performance of these materials is tightly coupled to the physical and chemical properties of the constituent block copolymers. Traditionally, the composition of block copolymers is obtained by 1H NMR spectroscopy on purified copolymer fractions. Specifically, the integrals of a properly selected set of 1H resonances are compared and used to infer the number average molecular weight (M(n)) of one of the blocks from the (typically known) M(n) value of the other. As a corollary, compositional determinations performed on imperfectly purified samples lead to serious errors, especially when isolation of the block copolymer from the initial macroinitiator is tedious. This investigation shows that Diffusion Ordered NMR Spectroscopy (DOSY) can be used to assess the degree of advancement of the copolymerization reaction and its purification, in order to optimize them and hence contribute to an improved compositional analysis of the resulting copolymer. To this purpose, a series of amphiphilic polystyrene-b-poly(ethylene oxide) block copolymers, obtained by controlled free-radical nitroxide-mediated polymerization, were analyzed, and it is shown that, under proper experimental conditions, DOSY allows for an improved compositional analysis of these block copolymers.

  8. Diesel engine noise source identification based on EEMD, coherent power spectrum analysis and improved AHP (United States)

    Zhang, Junhong; Wang, Jian; Lin, Jiewei; Bi, Fengrong; Guo, Qian; Chen, Kongwu; Ma, Liang


    As the essential foundation of noise reduction, many noise source identification methods have been developed and applied in engineering practice. To identify the noise sources of different engine parts over a broad frequency band at various typical speeds, this paper presents an integrated noise source identification method based on ensemble empirical mode decomposition (EEMD), coherent power spectrum analysis, and an improved analytic hierarchy process (AHP). The measured noise is decomposed into several IMFs with physical meaning, which ensures that the coherence analysis between the IMFs and the vibration signals is meaningful. An improved AHP is developed by introducing an objective weighting function to replace the traditional subjective evaluation, which makes the results independent of subjective judgment and provides better consistency. The proposed noise identification model is applied to identify the surface-radiated noise of a diesel engine. As a result, the frequency-dependent contributions of different engine parts to different test points at different speeds are obtained, and an overall weight order is obtained: oil pan > left body > valve chamber cover > gear chamber casing > right body > flywheel housing. This provides effective guidance for noise reduction.
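    For background, classical AHP derives priority weights from a pairwise-comparison matrix via its principal eigenvector, with a consistency ratio to sanity-check the judgments; the paper's improvement replaces the subjective matrix entries with an objective weighting function. A minimal sketch with a hypothetical three-source matrix on Saaty's 1-9 scale:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix: entry [i, j] is the judged
# importance of source i relative to source j (reciprocal matrix).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)           # Perron (largest) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # priority weights, sum to 1

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)  # consistency index
cr = ci / 0.58                        # random index RI = 0.58 for n = 3
# Judgments are conventionally accepted when cr < 0.1.
```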

  9. Improved proteomic analysis following trichloroacetic acid extraction of Bacillus anthracis spore proteins. (United States)

    Deatherage Kaiser, Brooke L; Wunschel, David S; Sydor, Michael A; Warner, Marvin G; Wahl, Karen L; Hutchison, Janine R


    Proteomic analysis of bacterial samples provides valuable information about cellular responses and functions under different environmental pressures. Analysis of cellular proteins depends on efficient extraction from bacterial samples, which can be challenging for increasingly complex and refractory sample types. While no single method can recover 100% of the bacterial proteins, selected protocols can improve overall protein isolation, peptide recovery, or enrichment for certain classes of proteins. The method presented here is technically simple, does not require specialized equipment such as a mechanical disrupter, and is effective for protein extraction from the particularly challenging sample type of Bacillus anthracis Sterne spores. The ability of trichloroacetic acid (TCA) extraction to isolate proteins from spores and enrich for spore-specific proteins was compared to the traditional mechanical disruption method of bead beating. TCA extraction improved the total average number of proteins identified within a sample as compared to bead beating (547 vs 495, respectively). Further, TCA extraction enriched for 270 spore proteins, including those typically identified by first isolating the spore coat and exosporium layers. Bead beating enriched for 156 spore proteins more typically identified from whole spore proteome analyses. The total average number of proteins identified was equal using TCA or bead beating for easily lysed samples, such as B. anthracis vegetative cells. As with all assays, supplementing with an alternative preparation method may simplify sample preparation and provide additional insight into the protein biology of the organism being studied.

  10. An Improved Time-Frequency Analysis Method in Interference Detection for GNSS Receivers. (United States)

    Sun, Kewen; Jin, Tian; Yang, Dongkai


    In this paper, an improved joint time-frequency (TF) analysis method based on a reassigned smoothed pseudo Wigner-Ville distribution (RSPWVD) is proposed for interference detection in Global Navigation Satellite System (GNSS) receivers. In the RSPWVD, a two-dimensional low-pass smoothing function is introduced to eliminate the cross-terms present in the quadratic TF distribution, and at the same time the reassignment method is adopted to improve the TF concentration of the auto-terms of the signal components. The proposed interference detection method is evaluated by experiments on GPS L1 signals under interference scenarios and compared to state-of-the-art interference detection approaches. The analysis results show that the proposed technique effectively overcomes the cross-term problem while preserving good TF localization, and that it enhances the interference detection performance of GNSS receivers, particularly in jamming environments.
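    As background, the quadratic TF distribution underlying the method is the Wigner-Ville distribution. A minimal numpy sketch of a discrete WVD, without the smoothing and reassignment steps that make up the paper's RSPWVD:

```python
import numpy as np

def wvd(x):
    """Discrete Wigner-Ville distribution of an analytic signal x.

    Returns an (N, N) frequency-by-time matrix. Note the classic WVD
    convention that a tone at normalized frequency f peaks at frequency
    bin 2*f*N, because the kernel x[n+tau]*conj(x[n-tau]) oscillates at
    twice the signal frequency.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        tau_max = min(n, N - 1 - n, N // 2 - 1)
        r = np.zeros(N, dtype=complex)
        for tau in range(-tau_max, tau_max + 1):
            r[tau % N] = x[n + tau] * np.conj(x[n - tau])
        # Kernel is conjugate-symmetric in tau, so the FFT is real.
        W[:, n] = np.fft.fft(r).real
    return W

# Analytic test tone at normalized frequency 16/128.
x = np.exp(2j * np.pi * (16 / 128) * np.arange(128))
W = wvd(x)
```

    For multi-component signals the quadratic kernel produces the cross-terms mentioned in the abstract; the smoothed pseudo variant suppresses them with separable time and frequency windows, at the cost of the concentration that reassignment then restores.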

  11. Structured hydrological analysis for targeting fallow evaporation to improve water productivity at the irrigation system level

    Directory of Open Access Journals (Sweden)

    S. Khan


    Full Text Available This paper provides results of an application of a holistic, systematic approach of water accounting using remote sensing and GIS coupled with groundwater modeling to evaluate water saving options by tracking non-beneficial evaporation in the Liuyuankou Irrigation System (LIS) of China. Groundwater rise is a major issue in the LIS, where groundwater levels have risen alarmingly close to the ground surface (within 1 m) near the Yellow River. The lumped water balance analysis showed high fallow evaporation losses, which need to be reduced to improve water productivity.

    The seasonal actual evapotranspiration (ETs) was estimated by applying the SEBAL algorithm to eighteen NOAA AVHRR-12 images over the year 1990–1991. This analysis was aided by an unsupervised land use classification applied to two Landsat 5 TM images of the study area. The SEBAL results confirmed that a significant amount (116.7 MCM) of water can be saved by reducing ETs from fallow land, which will result in improved water productivity at the irrigation system level. The water accounting indicator (for the analysis period) shows that the process fraction per unit of depleted water (PFdepleted) is 0.52 for LIS, meaning that 52% of the depleted water is consumed by agricultural crops and 48% is lost through non-process depletion.

    Finally, groundwater modeling was applied to simulate three land use and water management interventions to assess their effectiveness for water savings and their impact on the groundwater in LIS. MODFLOW's Zone Budget code calculates the groundwater budget of user-specified subregions and the exchange of flows between subregions, and also calculates a volumetric water budget for the entire model at the end of each time step. The simulation results showed that fallow evaporation could be reduced by between 14.2% (25.51 MCM) and 45.3% (81.36 MCM) through interventions such as canal lining and ground
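    As a quick arithmetic check, the reported intervention figures are internally consistent: each (percentage, volume) pair implies the same total fallow evaporation in the simulation, and the process fraction implies the non-process share directly:

```python
# Total fallow evaporation implied by each reported (percentage, volume) pair.
total_from_first = 25.51 / 0.142    # MCM, from the 14.2% = 25.51 MCM figure
total_from_second = 81.36 / 0.453   # MCM, from the 45.3% = 81.36 MCM figure

# Water-accounting arithmetic: PFdepleted = 0.52, so non-process
# depletion is the complementary 48% of total depletion.
non_process_fraction = 1 - 0.52
```

    Both pairs imply a total of roughly 180 MCM of simulated fallow evaporation.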

  12. Topological-based bottleneck analysis and improvement strategies for traffic networks

    Institute of Scientific and Technical Information of China (English)


    A method is proposed to find the key components of traffic networks with homogeneous and heterogeneous topologies, in which heavier traffic flow is transported. One component, called the skeleton, is the minimum spanning tree (MST) based on the zero-flow cost (ZCMST). The other component is the incipient infinite percolation cluster (IIC), which represents the spine of the traffic network. A new method is then given to analyze the properties of bottlenecks in a large-scale traffic network from a macroscopic, statistical viewpoint. Moreover, three effective strategies are proposed to alleviate traffic congestion. The significance of the findings is that one can significantly improve global transport by enhancing the capacity of a few links in the ZCMST, while for improving local traffic properties, upgrading a tiny fraction of the traffic network in the IIC is effective. The results can help traffic managers prevent and alleviate traffic congestion in time, guard against the formation of congestion bottlenecks, and make appropriate policies for traffic demand management. The method also has important theoretical significance and practical worth in optimizing traffic organization, traffic control, and the handling of emergencies.
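    The skeleton component is a minimum spanning tree over edge costs. A minimal Kruskal sketch on a toy weighted graph, with the weights standing in for the zero-flow cost (illustrative data, not a real traffic network):

```python
def kruskal_mst(n_nodes, edges):
    """Kruskal's algorithm with union-find; edges are (weight, u, v) tuples.
    Returns the MST as a list of (u, v, weight)."""
    parent = list(range(n_nodes))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):          # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                       # keep edge only if it joins components
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

# Toy 4-node network with hypothetical zero-flow costs.
edges = [(4, 0, 1), (1, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
mst = kruskal_mst(4, edges)
```

    In the paper's setting, upgrading the capacity of the few links selected this way is what improves global transport.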

  13. Repetitive transcranial magnetic stimulation improves consciousness disturbance in stroke patients: a quantitative electroencephalography spectral power analysis

    Institute of Scientific and Technical Information of China (English)

    Ying Xie; Tong Zhang


    Repetitive transcranial magnetic stimulation is a noninvasive treatment technique that can directly alter cortical excitability and improve cerebral functional activity in unconscious patients. To investigate the effects and the electrophysiological changes of repetitive transcranial magnetic stimulation treatment, 10 stroke patients with non-severe brainstem lesions and disturbance of consciousness were treated with repetitive transcranial magnetic stimulation, and a quantitative electroencephalography spectral power analysis was performed. The absolute power in the alpha band increased immediately after the first repetitive transcranial magnetic stimulation treatment, while power in the delta band was reduced. The alpha band relative power values decreased slightly at 1 day post-treatment, then increased and reached a stable level at 2 weeks post-treatment. Glasgow Coma Scale and JFK Coma Recovery Scale-Revised scores were improved. The relative power value in the alpha band was positively related to the Glasgow Coma Scale and JFK Coma Recovery Scale-Revised scores. These data suggest that repetitive transcranial magnetic stimulation is a noninvasive, safe, and effective treatment for improving brain functional activity and promoting awakening in unconscious stroke patients.
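    The alpha-band relative power reported here is the ratio of alpha-band spectral power to total power. A minimal sketch on a synthetic EEG-like signal using a plain FFT periodogram (a clinical pipeline would typically use Welch averaging over artifact-free epochs; band edges are common conventions, not necessarily the study's):

```python
import numpy as np

def band_relative_power(x, fs, band, total=(1.0, 30.0)):
    """Relative spectral power of x in `band`, as a fraction of the power
    in the `total` frequency range (Hz)."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    def band_power(lo, hi):
        return psd[(freqs >= lo) & (freqs < hi)].sum()
    return band_power(*band) / band_power(*total)

fs = 250
t = np.arange(0, 10, 1 / fs)
# Synthetic trace: 10 Hz alpha rhythm plus a weaker 2 Hz delta component.
x = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 2 * t)
alpha_rel = band_relative_power(x, fs, band=(8.0, 13.0))
```

    For this toy signal the alpha component carries 0.5 of variance and the delta component 0.125, so the alpha relative power is 0.8.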

  14. Analysis of technological innovation and environmental performance improvement in aviation sector. (United States)

    Lee, Joosung; Mo, Jeonghoon


    The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector-aircraft manufacturers and airlines-has also made significant efforts to improve the fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s, while the high oil prices of the 1970s and onward did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be the feasible option with a meaningful reduction in aviation's lifecycle environmental impact if they can achieve sufficient economies of scale.

  15. Analysis of Technological Innovation and Environmental Performance Improvement in Aviation Sector

    Directory of Open Access Journals (Sweden)

    Jeonghoon Mo


    Full Text Available The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector—aircraft manufacturers and airlines—has also made significant efforts to improve the fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s, while the high oil prices of the 1970s and onward did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be the feasible option with a meaningful reduction in aviation’s lifecycle environmental impact if they can achieve sufficient economies of scale.

  16. Energy spectrum analysis of blast waves based on an improved Hilbert-Huang transform (United States)

    Li, L.; Wang, F.; Shang, F.; Jia, Y.; Zhao, C.; Kong, D.


    Using the improved Hilbert-Huang transform (HHT), this paper investigates the analysis and interpretation of the energy spectrum of a blast wave. It has been previously established that the energy spectrum is an effective feature by which to characterize a blast wave: the higher the energy spectrum of a blast wave in a frequency band, the greater the damage to a target in that frequency band. However, most current research focuses on analyzing wave signals in the time domain or frequency domain rather than on the energy spectrum. We propose an improved HHT method combined with a wavelet packet to extract the energy spectrum feature of a blast wave. When applying the HHT, the signal is first roughly decomposed into a series of intrinsic mode functions (IMFs) by empirical mode decomposition. The wavelet packet method is then applied to each IMF to eliminate noise in the energy spectrum. Second, a coefficient is introduced to remove unrelated IMFs. The energy of each instantaneous frequency can be derived through the Hilbert transform. The energy spectrum is then obtained by summing all the components after the wavelet packet filters them and the coefficient screens out the ineffective IMFs. The effectiveness of the proposed method is demonstrated on 12 groups of experimental data, and an energy attenuation model is established based on the experimental data. The improved HHT is a precise method for blast wave signal analysis. For other shock wave signals from blasting experiments, an energy-frequency-time distribution and energy spectrum can also be obtained through this method, allowing for more practical applications.
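    The Hilbert step of the HHT derives instantaneous amplitude, frequency, and energy from the analytic signal of each IMF. A minimal numpy sketch on a pure tone (this FFT construction of the analytic signal is the standard one, equivalent to scipy.signal.hilbert):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT: zero out negative frequencies and
    double positive ones."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000
t = np.arange(0, 1, 1 / fs)
x = np.cos(2 * np.pi * 5 * t)                 # stand-in for one IMF
z = analytic_signal(x)
envelope = np.abs(z)                          # instantaneous amplitude
inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
energy = envelope ** 2                        # instantaneous energy
```

    Binning `energy` by `inst_freq` over time is what yields the energy-frequency-time distribution the abstract refers to.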

  17. Texture analysis improves level set segmentation of the anterior abdominal wall

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhoubing [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Allen, Wade M. [Institute of Imaging Science, Vanderbilt University, Nashville, Tennessee 37235 (United States); Baucom, Rebeccah B.; Poulose, Benjamin K. [General Surgery, Vanderbilt University Medical Center, Nashville, Tennessee 37235 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 and Institute of Imaging Science, Vanderbilt University, Nashville, Tennessee 37235 (United States)


    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis help to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture
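    The fuzzy c-means step that produces the voxelwise membership probabilities can be sketched generically as follows; this is a textbook numpy implementation applied to toy 1-D feature vectors, not the authors' Gabor-feature pipeline (fuzzifier m = 2 and the iteration count are common defaults):

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means. X: (n_samples, n_features).
    Returns (centers, U) where U is (n_samples, c) and each row of U
    is a soft membership distribution summing to 1."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]       # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2 / (m - 1))                         # membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated toy "texture feature" groups.
X = np.array([[0.0], [0.1], [5.0], [5.1]])
centers, U = fuzzy_cmeans(X, c=2)
```

    In the paper these soft memberships, computed over eight clusters of Gabor feature vectors, both initialize and guide the level set evolution.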

  18. Novel Computational Analysis of Left Atrial Anatomy Improves Prediction of Atrial Fibrillation Recurrence after Ablation (United States)

    Varela, Marta; Bisbal, Felipe; Zacur, Ernesto; Berruezo, Antonio; Aslanidi, Oleg V.; Mont, Lluis; Lamata, Pablo


    The left atrium (LA) can change in size and shape due to atrial fibrillation (AF)-induced remodeling. These alterations can be linked to poorer outcomes of AF ablation. In this study, we propose a novel comprehensive computational analysis of LA anatomy to identify which features of LA shape can optimally predict post-ablation AF recurrence. To this end, we construct smooth 3D geometrical models from the segmentation of the LA blood pool captured in pre-procedural MR images. We first apply this methodology to characterize the LA anatomy of 144 AF patients and build a statistical shape model that includes the most salient variations in shape across this cohort. We then perform a discriminant analysis to optimally distinguish between recurrent and non-recurrent patients. From this analysis, we propose a new shape metric called vertical asymmetry, which measures the imbalance of size along the anterior to posterior direction between the superior and inferior left atrial hemispheres. Vertical asymmetry was found, in combination with LA sphericity, to be the best predictor of post-ablation recurrence at both 12 and 24 months (area under the ROC curve: 0.71 and 0.68, respectively), outperforming other shape markers and any of their combinations. We also found that model-derived shape metrics, such as the anterior-posterior radius, were better predictors than equivalent metrics taken directly from MRI or echocardiography, suggesting that the proposed approach reduces the impact of data artifacts and noise. This novel methodology contributes to an improved characterization of LA organ remodeling, and the reported findings have the potential to improve patient selection and risk stratification for catheter ablation in AF.

  19. An improved model for whole genome phylogenetic analysis by Fourier transform. (United States)

    Yin, Changchuan; Yau, Stephen S-T


    and demonstrates that the improved DFT dissimilarity measure is an efficient and effective similarity measure for DNA sequences. Due to its high efficiency and accuracy, the proposed DFT similarity measure is successfully applied to phylogenetic analysis of individual genes and large whole bacterial genomes.
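    DFT dissimilarity measures of this family map each DNA sequence to nucleotide indicator sequences, take their power spectra, rescale the spectra to a common length so sequences of different lengths become comparable, and compare the results by Euclidean distance. A minimal numpy sketch (the target length m and the toy sequences are illustrative; this is the generic scheme, not necessarily the paper's exact improved variant):

```python
import numpy as np

def indicator_spectra(seq):
    """Power spectra of the four binary indicator sequences of a DNA
    string (DC term dropped so raw base counts do not dominate)."""
    seq = seq.upper()
    return [np.abs(np.fft.fft(
                np.array([1.0 if ch == base else 0.0 for ch in seq])))[1:] ** 2
            for base in "ACGT"]

def even_scale(ps, m):
    """Linearly interpolate a spectrum onto m points (even scaling)."""
    return np.interp(np.linspace(0, len(ps) - 1, m), np.arange(len(ps)), ps)

def dft_distance(s1, s2, m=64):
    v1 = np.concatenate([even_scale(ps, m) for ps in indicator_spectra(s1)])
    v2 = np.concatenate([even_scale(ps, m) for ps in indicator_spectra(s2)])
    return np.linalg.norm(v1 - v2)
```

    Because no alignment is required, the cost scales essentially with FFT length, which is what makes whole-genome phylogeny tractable with this approach.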

  20. Characterization of local complex structures in a recurrence plot to improve nonlinear dynamic discriminant analysis. (United States)

    Ding, Hang


    Structures in recurrence plots (RPs), which preserve the rich information of nonlinear invariants and trajectory characteristics, have been increasingly analyzed in dynamic discrimination studies. The conventional analysis of RPs mainly quantifies the overall diagonal and vertical line structures through a method called recurrence quantification analysis (RQA). This study explores the information in RPs more extensively by quantifying local complex RP structures. To do this, an approach was developed to analyze the combination of three major RQA variables: determinism, laminarity, and recurrence rate (DLR) in a metawindow moving over an RP. It was then evaluated in two experiments discriminating (1) ideal nonlinear dynamic series emulated from the Lorenz system with different control parameters and (2) data sets of human heart rate regulation with normal sinus rhythm (n = 18) and congestive heart failure (n = 29). Finally, the DLR was compared with seven major RQA variables in terms of discriminatory power, measured by the standardized mean difference (DSMD). In the two experiments, DLR yielded the highest discriminatory power, with DSMD = 2.53 and 0.98, respectively, which were 7.41 and 2.09 times the best performance from RQA. The study also revealed that the optimal RP structures for the discriminations were neither typical diagonal structures nor vertical structures. These findings indicate that local complex RP structures contain rich information unexploited by RQA. Future research that extensively analyzes complex RP structures could therefore improve the effectiveness of RP analysis in dynamic discrimination studies.
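    A recurrence plot and two of the RQA quantities the DLR approach combines (recurrence rate and determinism) can be sketched as follows. This is a simplified scalar-series version without phase-space embedding, and it keeps the line of identity; it illustrates the quantities, not the authors' metawindow analysis:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot of a scalar series (no embedding)."""
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def recurrence_rate(R):
    """Density of recurrent points in the plot."""
    return R.mean()

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length
    >= lmin (line of identity included in this simplified version)."""
    N = len(R)
    on_lines = 0
    for k in range(-(N - 1), N):
        run = 0
        for v in np.append(np.diagonal(R, k), 0):   # sentinel ends last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / R.sum()

rng = np.random.default_rng(0)
periodic = np.sin(np.linspace(0, 8 * np.pi, 200))   # deterministic signal
noisy = rng.random(200)                             # uncorrelated noise
det_periodic = determinism(recurrence_matrix(periodic, 0.1))
det_noisy = determinism(recurrence_matrix(noisy, 0.1))
```

    Laminarity, the third DLR ingredient, is the analogous fraction computed over vertical line structures.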

  1. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Heui Dong; Cho, Sung Won; Park, Jong Hwa; Hong, Sung Wan; Yoo, Dong Han; Hwang, Moon Kyoo; Noh, Kee Man; Song, Yong Man [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)


    For prevention and mitigation of containment failure during a severe accident, the study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation for the flame velocity of premixed H2/air/steam gas has been suggested, and the combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued to review and examine the limitations and deficiencies of the existing models. Pre-test calculations were performed to support the severe accident experiment on molten corium-concrete interaction, and studies of the crust formation process and the heat transfer characteristics of the crust have been carried out. A stress analysis code was developed using the finite element method for reactor vessel lower head failure analysis. Through the international PHEBUS-FP program and participation in its software development, research on the core degradation process and fission product release and transport is ongoing. The CONTAIN and MELCOR codes were continuously updated in cooperation with the USNRC, and French computer codes such as ICARE2, ESCADRE and SOPHAEROS were also installed on the SUN workstation. 204 figs, 61 tabs, 87 refs. (Author).

  2. Spinning Reserve Requirements Optimization Based on an Improved Multiscenario Risk Analysis Method

    Directory of Open Access Journals (Sweden)

    Liudong Zhang


    Full Text Available This paper proposes a novel security-constrained unit commitment model to calculate the optimal spinning reserve (SR) amount. The model combines cost-benefit analysis with an improved multiscenario risk analysis method capable of considering various uncertainties, including load and wind power forecast errors as well as forced outages of generators. In this model, cost-benefit analysis is utilized to simultaneously minimize the operation cost of conventional generators, the expected cost of load shedding, the penalty cost of wind power spillage, and the carbon emission cost. It remedies the defects of the deterministic and probabilistic methods of SR calculation. In cases where load and wind power generation are negatively correlated, this model, based on multistep modeling of net demand, can consider wind power curtailment to maximize the overall economic efficiency of system operation, so that the optimal economic values of wind power and SR are achieved. In addition, the impact of non-normal probability distributions of the wind power forecast error on SR optimization can be taken into account. Using a mixed integer linear programming method, simulation studies on a modified IEEE 26-generator reliability test system connected to a wind farm are performed to confirm the effectiveness and advantages of the proposed model.
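
The core cost-benefit trade-off (more spinning reserve raises the reserve cost but lowers the expected cost of load shedding over the uncertainty scenarios) can be reduced to a toy calculation. All prices, probabilities and forecast errors below are invented; the actual model is a full security-constrained unit commitment solved by MILP:

```python
# Invented scenario set: (probability, net-demand forecast error in MW).
scenarios = [
    (0.70, 0.0),
    (0.20, 50.0),
    (0.08, 120.0),
    (0.02, 300.0),
]
VOLL = 4000.0          # value of lost load, $/MWh (assumed)
reserve_price = 30.0   # $/MW of spinning reserve held for the hour (assumed)

def expected_total_cost(sr_mw):
    """Reserve cost plus probability-weighted cost of unserved energy."""
    shed = sum(p * max(err - sr_mw, 0.0) * VOLL for p, err in scenarios)
    return reserve_price * sr_mw + shed

# Sweep candidate reserve levels and keep the cheapest.
best_sr = min(range(0, 401, 10), key=expected_total_cost)
```

The optimum sits where the marginal reserve cost equals the marginal reduction in expected shedding cost, which is exactly the balance the paper's MILP formulation finds jointly with the unit commitment.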

  3. Application of computational fluid dynamics methods to improve thermal hydraulic code analysis (United States)

    Sentell, Dennis Shannon, Jr.

    A computational fluid dynamics code is used to model the primary natural circulation loop of a proposed small modular reactor for comparison to experimental data and best-estimate thermal-hydraulic code results. Recent advances in computational fluid dynamics code modeling capabilities make them attractive alternatives to the current conservative approach of coupled best-estimate thermal hydraulic codes and uncertainty evaluations. The results from a computational fluid dynamics analysis are benchmarked against the experimental test results of a 1:3 length, 1:254 volume, full pressure and full temperature scale small modular reactor during steady-state power operations and during a depressurization transient. A comparative evaluation of the experimental data, the thermal hydraulic code results and the computational fluid dynamics code results provides an opportunity to validate the best-estimate thermal hydraulic code's treatment of a natural circulation loop and provide insights into expanded use of the computational fluid dynamics code in future designs and operations. Additionally, a sensitivity analysis is conducted to determine those physical phenomena most impactful on operations of the proposed reactor's natural circulation loop. The combination of the comparative evaluation and sensitivity analysis provides the resources for increased confidence in model developments for natural circulation loops and provides for reliability improvements of the thermal hydraulic code.

  4. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis. (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B


    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.
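
The primary outcome measure, rank correlation between volume change and a clinical score, is a plain Spearman coefficient, sketched here with average ranks for ties. The per-patient values are invented; the study's correlations of -0.148 and -0.303 come from its own data:

```python
import numpy as np

def rank(a):
    """Ranks with ties replaced by their average rank."""
    a = np.asarray(a, float)
    order = np.argsort(a)
    ranks = np.empty(len(a))
    sorted_a = a[order]
    i = 0
    while i < len(a):
        j = i
        while j + 1 < len(a) and sorted_a[j + 1] == sorted_a[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2.0 + 1.0
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))

# Invented values: larger black hole volume change going with a worse
# (lower) clinical score gives a negative correlation.
volume_change = [1.2, 0.5, 2.1, 0.0, 1.7, 0.9]
clinical_score = [-0.8, -0.1, -1.5, 0.2, -1.1, -0.4]
rho = spearman(volume_change, clinical_score)
```

Rank correlation is a sensible outcome measure here because it is insensitive to the unknown, likely nonlinear scaling between lesion volume and clinical disability.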

  6. [Improvement in the efficiency of an ambulatory service case load by applying a computerized method (patient flow analysis)]. (United States)

    Benussi, G; Canciani, G P; de Luyk, S; Parco, S; Visconti, P; Grandolfo, M; Mangiarotti, M A


    The authors describe the application of a technique called Patient Flow Analysis, aimed at improving clinic personnel efficiency and reducing patient waiting time. Results were satisfactory and encourage further studies.

  7. An Improved, Automated Whole-Air Sampler and VOC Analysis System: Results from SONGNEX 2015 (United States)

    Lerner, B. M.; Gilman, J.; Tokarek, T. W.; Peischl, J.; Koss, A.; Yuan, B.; Warneke, C.; Isaacman-VanWertz, G. A.; Sueper, D.; De Gouw, J. A.; Aikin, K. C.


    Accurate measurement of volatile organic compounds (VOCs) in the troposphere is critical for the understanding of emissions and of the physical and chemical processes that can impact both air quality and climate. Airborne VOC measurements have proven challenging due to the requirements of short sample collection times (≤10 s), to maximize spatial resolution and sampling frequency, and high sensitivity (pptv) to chemically diverse hydrocarbons, halocarbons, and oxygen- and nitrogen-containing VOCs. NOAA ESRL CSD has built an improved whole air sampler (iWAS) which collects compressed ambient air samples in electropolished stainless steel canisters, based on the NCAR HAIS Advanced Whole Air Sampler [Atlas and Blake]. Post-flight chemical analysis is performed with a custom-built gas chromatograph-mass spectrometer system that pre-concentrates analyte cryogenically via a Stirling cooler, an electromechanical chiller which precludes the need for liquid nitrogen to reach trapping temperatures. For the 2015 Shale Oil and Natural Gas Nexus Study (SONGNEX), CSD conducted iWAS measurements on 19 flights aboard the NOAA WP-3D aircraft between March 19th and April 27th. Nine oil and natural gas production regions were surveyed during SONGNEX, and more than 1500 air samples were collected and analyzed. For the first time, we employed real-time mapping of sample collection combined with live data from fast time-response measurements (e.g. ethane) for more uniform surveying and improved target plume sampling. Automated sample handling allowed more than 90% of iWAS canisters to be analyzed within 96 hours of collection; for the second half of the campaign, improved efficiencies reduced the median sample age at analysis to 36 hours. A new chromatography peak-fitting software package was developed to reduce data reduction time by an order of magnitude without a loss of precision or accuracy. Here we report mixing ratios for aliphatic and aromatic hydrocarbons (C2-C8) along with select

  8. Deformation of Japan as measured by improved analysis of GEONET data (United States)

    Owen, S. E.; Dong, D.; Webb, F. H.; Newport, B. J.; Simons, M.


    The Japan subduction zone represents a complex set of plate interfaces with significant trench-parallel variability in great earthquakes and transient deep slip events. Within the Japan arc, the Nankai segment of the Eurasian-Philippine plate boundary is one of the classic subduction zone segments that last produced a set of temporally linked great earthquakes in the 1940s. Recently, down-dip of the Nankai seismogenic portion of the plate interface, transient slip events and seismic tremor events were observed. Through analysis of the GEONET GPS data, the spatial and higher frequency temporal characteristics of transient slip events can be captured. We describe our analysis methods, the spatial filtering technique that has been developed for use on large networks, a periodic signal filtering method that improves on commonly-used sinusoidal function models, and the resultant velocities and time series. Our newly developed analysis method, the GPS Network Processor, gives us the ability to process large volumes of data extremely fast. The basis of the GPS Network Processor is the JPL-developed GIPSY-OASIS GPS analysis software and the JPL-developed precise point positioning technique. The Network Processor was designed and developed to efficiently implement precise point positioning and bias fixing on a 1000-node (2000 cpu) Beowulf cluster. The entire 10-year ~1000-station GEONET data set can be reanalyzed using the Network Processor in a matter of days. This permits us to test different processing strategies, each with potentially large influence on our ability to detect strain transients from the subduction zones. For example, we can test different ocean loading models, which can affect the diurnal positions of coastal GPS sites by up to 2 cm. We can also test other potentially important factors such as using reprocessed satellite orbits and clocks, the parameterization of the tropospheric delay, or the implementation of refined solid body tide estimates. 
We will
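
The paper's improved periodic filtering is beyond a short sketch, but the baseline it improves on, least-squares estimation of a station velocity together with annual and semiannual sinusoids, fits in a few lines. All numbers below are synthetic, not GEONET data:

```python
import numpy as np

# Synthetic daily position series: linear trend plus annual and semiannual
# terms plus noise (units are illustrative, say mm and years).
rng = np.random.default_rng(0)
t = np.arange(3650) / 365.25                       # ~10 years of daily samples
truth = 5.0 * t + 3.0 * np.sin(2 * np.pi * t) + 1.0 * np.cos(4 * np.pi * t)
y = truth + rng.normal(0.0, 0.5, t.size)

# Design matrix: offset, velocity, annual sin/cos, semiannual sin/cos.
A = np.column_stack([
    np.ones_like(t), t,
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
    np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),
])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
velocity = coef[1]   # estimated secular velocity
```

A fixed sinusoidal model like this leaks any non-sinusoidal seasonal signal into the residuals, which is precisely what motivates the improved periodic filtering described above.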

  9. Improving air pollution control policy in China--A perspective based on cost-benefit analysis. (United States)

    Gao, Jinglei; Yuan, Zengwei; Liu, Xuewei; Xia, Xiaoming; Huang, Xianjin; Dong, Zhanfeng


    To mitigate serious air pollution, the State Council of China promulgated the Air Pollution Prevention and Control Action Plan in 2013. To verify the feasibility and validity of the industrial energy-saving and emission-reduction policies in the action plan, we conducted a cost-benefit analysis of implementing these policies in 31 provinces for the period of 2013 to 2017. We also completed a scenario analysis in this study to assess the cost-effectiveness of the measures within the energy-saving and emission-reduction policies individually. The data were derived from field surveys, statistical yearbooks, government documents, and the published literature. The results show that total cost and total benefit are 118.39 and 748.15 billion Yuan, respectively, and the estimated benefit-cost ratio is 6.32 in the S3 scenario. For all the scenarios, these policies are cost-effective, and the eastern region has higher satisfactory values. Furthermore, the end-of-pipe scenario has greater emission reduction potential than the energy-saving scenario. We also found that gross domestic product and population are significantly correlated with the benefit-cost ratio value through the regression analysis of selected possible influencing factors. The sensitivity analysis demonstrates that the benefit-cost ratio value is more sensitive to unit emission-reduction cost, unit subsidy, growth rate of gross domestic product, and discount rate than to the other parameters. Compared with other provinces, the benefit-cost ratios of Beijing and Tianjin are more sensitive to changes in unit subsidy than to unit emission-reduction cost. These findings may have significant implications for improving China's air pollution prevention policy.
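
The headline figure is a simple ratio (748.15/118.39 ≈ 6.32); when the cost and benefit streams are spread over 2013-2017, the discount-rate sensitivity the authors report can be reproduced qualitatively. The yearly figures below are invented for illustration; only the aggregate numbers come from the study:

```python
def discounted_bcr(costs, benefits, rate):
    """Benefit-cost ratio with both yearly streams discounted to year 0."""
    pv_c = sum(c / (1.0 + rate) ** i for i, c in enumerate(costs))
    pv_b = sum(b / (1.0 + rate) ** i for i, b in enumerate(benefits))
    return pv_b / pv_c

# Hypothetical yearly split (billion Yuan, 2013-2017): benefits are
# back-loaded relative to costs, as abatement capacity ramps up.
costs = [30.0, 25.0, 22.0, 21.0, 20.0]
benefits = [100.0, 130.0, 160.0, 170.0, 188.0]

base = discounted_bcr(costs, benefits, 0.05)
high = discounted_bcr(costs, benefits, 0.10)   # higher discount rate
```

With back-loaded benefits, raising the discount rate shrinks the present value of benefits faster than that of costs, so the ratio falls: the direction of the discount-rate sensitivity noted in the abstract.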

  10. An improved continuous flow analysis system for high-resolution field measurements on ice cores. (United States)

    Kaufmann, Patrik R; Federer, Urs; Hutterli, Manuel A; Bigler, Matthias; Schüpbach, Simon; Ruth, Urs; Schmitt, Jochen; Stocker, Thomas F


    Continuous flow analysis (CFA) is a well-established method to obtain information about impurity contents in ice cores as indicators of past changes in the climate system. A section of an ice core is continuously melted on a melter head, supplying a sample water flow which is analyzed online. This provides high depth and time resolution of the ice core records and very efficient sample decontamination, as only the inner part of the ice sample is analyzed. Here we present an improved CFA system which has been completely redesigned for significantly enhanced overall efficiency and flexibility, signal quality, compactness, and ease of use. These are critical requirements especially for operation of CFA during field campaigns, e.g., in Antarctica or Greenland. Furthermore, a novel device to measure the total air content in the ice was developed. Air bubbles are now extracted continuously from the sample water flow for subsequent gas measurements.

  11. Using frequency analysis to improve the precision of human body posture algorithms based on Kalman filters. (United States)

    Olivares, Alberto; Górriz, J M; Ramírez, J; Olivares, G


    With the advent of miniaturized inertial sensors, many systems have been developed within the last decade to study and analyze human motion and posture, especially in the medical field. Data measured by the sensors are usually processed by algorithms based on Kalman filters in order to estimate the orientation of the body parts under study. These filters traditionally include fixed parameters, such as the process and observation noise variances, whose values have a large influence on the overall performance. It has been demonstrated that the optimal values of these parameters differ considerably for different motion intensities. Therefore, in this work we show that, by applying frequency analysis to determine motion intensity and varying the formerly fixed parameters accordingly, the overall precision of orientation estimation algorithms can be improved, providing physicians with reliable objective data they can use in their daily practice.
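
A minimal sketch of the idea, using a scalar random-walk filter and a crude spectral measure of motion intensity. The 1 Hz cut-off, the 0.2 threshold and all noise values are invented; a real implementation would tune them per sensor and adapt the parameters per window rather than once:

```python
import numpy as np

def motion_intensity(signal, fs):
    """Fraction of spectral power above 1 Hz: a crude frequency-domain
    proxy for motion intensity (the cut-off is an assumption)."""
    spec = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    total = spec.sum()
    return float(spec[freqs > 1.0].sum() / total) if total > 0 else 0.0

def kalman_1d(measurements, q, r):
    """Scalar random-walk Kalman filter (state = angle estimate)."""
    x, p = float(measurements[0]), 1.0
    out = []
    for z in measurements:
        p = p + q                    # predict: process noise inflates variance
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # measurement update
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

fs = 100.0
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(1)
angle = np.where(t < 2, 0.0, 30.0 * np.sin(2 * np.pi * 3 * (t - 2)))  # rest, then motion
z = angle + rng.normal(0.0, 2.0, t.size)

# Choose the process noise from the measured intensity: small q (trust the
# model) when still, large q (track the measurements) when moving.
q = 0.5 if motion_intensity(z, fs) > 0.2 else 1e-3
est = kalman_1d(z, q=q, r=4.0)
```

With a small q the filter smooths aggressively, which is right at rest but lags behind fast motion; switching q by measured intensity captures the paper's central point that no single fixed setting is optimal.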

  12. Using pulse shape analysis to improve the position resolution of a resistive anode microchannel plate detector

    CERN Document Server

    Siwal, Davinder; deSouza, R T


    Digital signal processing techniques were employed to investigate the joint use of charge division and risetime analyses for a resistive anode (RA) coupled to a microchannel plate detector (MCP). In contrast to the typical approach of using only the relative charge at each corner of the RA, this joint approach results in a significantly improved position resolution. A conventional charge division analysis utilizing analog signal processing provides a measured position resolution of 170 $\mu$m (FWHM). By using the correlation between risetime and position we were able to obtain a measured resolution of 92 $\mu$m (FWHM), corresponding to an intrinsic resolution of 64 $\mu$m (FWHM) for a single Z-stack MCP detector.
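
Conventional charge division, the baseline that the risetime analysis improves on, computes the position directly from the four corner charges. A sketch, with the caveat that the corner labelling below is an assumption (conventions differ between detectors):

```python
def charge_division_position(qa, qb, qc, qd):
    """Standard corner-charge position estimate for a resistive anode.
    Assumed corners: a = lower-left, b = lower-right, c = upper-right,
    d = upper-left. Returns (x, y) in normalized units, each in [-1, 1]."""
    total = qa + qb + qc + qd
    x = ((qb + qc) - (qa + qd)) / total   # right-side minus left-side charge
    y = ((qc + qd) - (qa + qb)) / total   # top minus bottom charge
    return x, y
```

The paper's improvement keeps this estimate but additionally exploits the position dependence of the signal risetime across the resistive sheet, correlating the two to sharpen the result.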

  13. Improved analysis of bacterial CGH data beyond the log-ratio paradigm

    Directory of Open Access Journals (Sweden)

    Aakra Ågot


    Full Text Available Abstract Background Existing methods for analyzing bacterial CGH data from two-color arrays are based on log-ratios only, a paradigm inherited from expression studies. We propose an alternative approach, where the microarray signals are used in a different way and sequence identity is predicted using a supervised learning approach. Results A data set containing 32 hybridizations of sequenced versus sequenced genomes has been used to test and compare methods. A ROC analysis has been performed to illustrate the ability to rank probes with respect to Present/Absent calls. Classification into Present and Absent is compared with that of a Gaussian mixture model. Conclusion The results indicate that our proposed method improves on existing methods with respect to the ranking and classification of probes, especially for multi-genome arrays.
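
The ROC analysis used to compare probe rankings can be sketched with the rank-sum identity for the area under the curve; the probe scores and labels below are invented:

```python
def roc_auc(scores, labels):
    """AUC via the rank-sum identity: the probability that a randomly
    chosen Present probe scores higher than a randomly chosen Absent
    probe, counting ties as 1/2."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy probe scores; label 1 = Present, 0 = Absent.
scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.35, 0.2, 0.1]
labels = [1, 1, 0, 1, 0, 1, 0, 0]
auc = roc_auc(scores, labels)
```

An AUC of 1.0 would mean the classifier ranks every Present probe above every Absent one, which is the ranking quality the paper's comparison measures.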

  14. The CMB temperature power spectrum from an improved analysis of the Archeops data

    CERN Document Server

    Tristram, M; Macias-Perez, J F; Ade, P; Amblard, A; Ansari, R; Aubourg, E; Benoît, A; Bernard, J P; Blanchard, A; Bock, J J; Bouchet, F R; Bourrachot, A; Camus, P; Cardoso, J F; Couchot, F; De Bernardis, P; Delabrouille, J; Désert, F X; Douspis, M; Dumoulin, L; Filliatre, P; Fosalba, P; Giard, M; Giraud-Héraud, Yannick; Gispert, R; Guglielmi, L; Hamilton, J C; Hanany, S; Henrot-Versillé, S; Kaplan, J; Lagache, G; Lange, A E; Madet, K; Maffei, B; Masi, S; Mayet, F; Nati, F; Perdereau, O; Plaszczynski, S; Piat, M; Ponthieu, N; Prunet, S; Renault, C; Rosset, C; Santos, D; Vibert, D; Yvon, D; Filliatre, Ph.


    We present improved results on the measurement of the angular power spectrum of the Cosmic Microwave Background (CMB) temperature anisotropies using the data from the last Archeops flight. This refined analysis is obtained by using the 6 most sensitive photometric pixels in the CMB bands centered at 143 and 217 GHz and 20% of the sky, mostly clear of foregrounds. Using two different cross-correlation methods, we obtain very similar results for the angular power spectrum. Consistency checks are performed to test the robustness of these results paying particular attention to the foreground contamination level which remains well below the statistical uncertainties. The multipole range from l=10 to l=700 is covered with 25 bins, confirming strong evidence for a plateau at large angular scales (the Sachs-Wolfe plateau) followed by two acoustic peaks centered around l=220 and l=550 respectively. These data provide an independent confirmation, obtained at different frequencies, of the WMAP first year results.

  15. Term Analysis: Improving the Quality of Learning and Application Documents in Engineering Design

    Directory of Open Access Journals (Sweden)

    S. Weiss


    Full Text Available Conceptual homogeneity is one determinant of the quality of text documents. A concept remains the same even if the words used (termini) change [1, 2]. In other words, termini can vary while the concept retains the same meaning. Human beings are able to handle concepts and termini because of their semantic network, which connects termini to the actual context and thus identifies the adequate meaning of the termini. Problems can arise when humans have to learn new content and correspondingly new concepts. Since the content is basically imparted by text via particular termini, it is a challenge to establish the right concept from the text with the termini. A term might be known, but have a different meaning [3, 4]. Therefore, it is very important to build up the correct understanding of concepts within a text. This is only possible when concepts are explained by the right termini, within an adequate context, and above all, homogeneously. So, when setting up or using text documents for teaching or application, it is essential to provide concept homogeneity. Understandably, the quality of documents is, ceteris paribus, reciprocally proportional to variations of termini. Therefore, an analysis of variations of termini could form a basis for specific improvement of conceptual homogeneity. Consequently, an exposition of variations of termini as control and improvement parameters is carried out in this investigation. This paper describes the functionality and the benefits of a tool called TermAnalysis. TermAnalysis is a software tool developed

  16. Improved method for the analysis of the composition of polysaccharides by total acid hydrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Mochtar, M.; Delavier, H.J.; Oei Ban Liang


    The analysis of the composition of polysaccharides, i.e. dextran, by total acid hydrolysis, in the presence or absence of oxygen, and with different methods of neutralization of the hydrolysate, is presented. It was found that hydrolysis of polysaccharides under a nitrogen atmosphere, in the absence of oxygen, diminishes the possibility of decomposition of the monosaccharides formed during hydrolysis. Neutralization of the acid hydrolysate by passing it through a column of weak-base ion exchange resin, Amberlite IRA-94, instead of neutralizing the hydrolysate with Ba(OH)2, diminishes the possibility of epimerization of glucose to other saccharides. This improved method gives more reliable results, even in the presence of readily decomposed polysaccharides.

  17. Blood transfusions in critical care: improving safety through technology & process analysis. (United States)

    Aulbach, Rebecca K; Brient, Kathy; Clark, Marie; Custard, Kristi; Davis, Carolyn; Gecomo, Jonathan; Ho, Judy Ong


    A multidisciplinary safety initiative transformed blood transfusion practices at St. Luke's Episcopal Hospital in Houston, Texas. An intense analysis of a mistransfusion using the principles of a Just Culture and the process of Cause Mapping identified system and human performance factors that led to the transfusion error. Multiple initiatives were implemented, including technology, education and human behaviour change. The wireless technology of Pyxis Transfusion Verification by CareFusion is effective, and the rapid infusion module is efficient for use in critical care. Improvements in blood transfusion safety were accomplished by thoroughly evaluating the process of transfusions and by implementing wireless electronic transfusion verification technology. During the 27 months following implementation of the CareFusion Transfusion Verification, there have been zero cases of transfusing mismatched blood.

  18. Analysis of the Efficacy of an Intervention to Improve Parent-Adolescent Problem Solving. (United States)

    Semeniuk, Yulia Yuriyivna; Brown, Roger L; Riesch, Susan K


    We conducted a two-group longitudinal partially nested randomized controlled trial to examine whether young adolescent youth-parent dyads participating in Mission Possible: Parents and Kids Who Listen, in contrast to a comparison group, would demonstrate improved problem-solving skill. The intervention is based on the Circumplex Model and Social Problem-Solving Theory. The Circumplex Model posits that families who are balanced, that is characterized by high cohesion and flexibility and open communication, function best. Social Problem-Solving Theory informs the process and skills of problem solving. The Conditional Latent Growth Modeling analysis revealed no statistically significant differences in problem solving among the final sample of 127 dyads in the intervention and comparison groups. Analyses of effect sizes indicated large magnitude group effects for selected scales for youth and dyads portraying a potential for efficacy and identifying for whom the intervention may be efficacious if study limitations and lessons learned were addressed.

  19. Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue (United States)

    Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.


    The high performance liquid chromatographic (HPLC) method of Flores and Galston (1982 Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. In addition, two polyamines, putrescine and diaminopropane, are not well resolved using this method. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.

  20. The process analysis and improvement of the slender and flat shaft of track recorder

    Institute of Scientific and Technical Information of China (English)

    DAI Tong-yan; WANG Wei; LIU Ying


    A slender and flat shaft is a key part of the track recorder in marine vessels. However, the axial straightness of the shaft often exceeds the standard tolerance after it is machined. It has also been found that its precision does not last a long time. After thorough analysis of these problems, the main reasons that affect machining quality were identified, and a process modification plan that meets the design requirements of the shaft was put forward. Production practice indicates that the precision of the shaft is stable over a long period and the quality of the product improved substantially after the new measures were employed, securing the accuracy of the track recording of the marine vessel.

  1. Analysis of Entropy Generation for the Performance Improvement of a Tubular Solid Oxide Fuel Cell Stack

    Directory of Open Access Journals (Sweden)

    Vittorio Verda


    Full Text Available The aim of the paper is to investigate possible improvements in the design and operation of a tubular solid oxide fuel cell. To achieve this purpose, a CFD model of the cell is introduced. The model includes thermo-fluid dynamics, chemical reactions and electrochemistry. The fluid composition and mass flow rates at the inlet sections are obtained through a finite difference model of the whole stack. This model also provides boundary conditions for the radiation heat transfer. All of these conditions account for the position of each cell within the stack. The analysis of the cell performances is conducted on the basis of the entropy generation. The use of this technique makes it possible to identify the phenomena provoking the main irreversibilities, understand their causes and propose changes in the system design and operation.

  2. Structural Analysis and Improved Design of the Gearbox Casing of a Certain Type of Tracked Vehicle

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xue-sheng; JIA Xiao-ping; CHEN Ya-ning; YU Kui-long


    Loads on the gearbox casing of a certain type of tracked vehicle were calculated according to the engine's full-load characteristic curve and the worst load condition under which the gearbox operated while the tracked vehicle was running, and then the stiffness and strength of the casing were analyzed by means of Patran/Nastran software. After analysis, it was found that the casing satisfied the Mises yield condition; however, the stress distribution was heterogeneous, and stresses near the bearing saddle bores of the casing were higher while those in other regions were much less than the allowable stress. For this reason, the wall thicknesses of the casing at the bearing assembling holes needed to be increased, while those in other places could be decreased. After several rounds of structural improvement and re-analysis, the optimal casing design was found, and its weight decreased by 5%; the casing still satisfied the Mises yield criterion and the stress distribution was more homogeneous.

  3. On-line Batch Process Monitoring with Improved Multi-way Independent Component Analysis

    Institute of Scientific and Technical Information of China (English)

    GUO Hui; LI Hongguang


    In the past decades, on-line monitoring of batch processes using multi-way independent component analysis (MICA) has received considerable attention in both academia and industry. This paper focuses on two troublesome issues: selecting dominant independent components without a standard criterion, and determining the control limits of monitoring statistics in the presence of non-Gaussian distributions. To optimize the number of key independent components, we introduce a novel concept of system deviation, which is able to evaluate the reconstructed observations with different independent components. The monitored statistics are transformed to Gaussian-distributed data by means of the Box-Cox transformation, which helps readily determine the control limits. The proposed method is applied to on-line monitoring of a fed-batch penicillin fermentation simulator, and the experimental results indicate the advantages of the improved MICA monitoring compared to conventional methods.
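
The Box-Cox step can be sketched as follows: transform the positive, skewed monitoring statistic toward normality, set a mean-plus-3-sigma limit there, and map the limit back to the original scale. The profile-likelihood grid search below is the textbook construction, not necessarily the paper's exact procedure:

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform; the log branch is the lam -> 0 limit."""
    x = np.asarray(x, float)
    return np.log(x) if abs(lam) < 1e-8 else (x ** lam - 1.0) / lam

def boxcox_mle_lambda(x, grid=np.linspace(-2.0, 2.0, 81)):
    """Pick lambda by maximizing the Box-Cox profile log-likelihood."""
    x = np.asarray(x, float)
    logsum = np.log(x).sum()
    def llf(lam):
        y = boxcox(x, lam)
        return -0.5 * len(x) * np.log(y.var()) + (lam - 1.0) * logsum
    return max(grid, key=llf)

def control_limit(stat_history, nsigma=3.0):
    """Transform the statistic toward normality, take mean + nsigma*std
    there, and invert the transform to get a limit on the original scale."""
    lam = boxcox_mle_lambda(stat_history)
    y = boxcox(stat_history, lam)
    y_lim = y.mean() + nsigma * y.std()
    if abs(lam) < 1e-8:
        return float(np.exp(y_lim))
    return float((lam * y_lim + 1.0) ** (1.0 / lam))

# Skewed, positive "monitoring statistic" history; the lognormal shape is
# purely illustrative, standing in for a non-Gaussian SPE-type statistic.
rng = np.random.default_rng(2)
stats_hist = rng.lognormal(mean=0.0, sigma=0.5, size=2000)
limit = control_limit(stats_hist)
false_alarm = float((stats_hist > limit).mean())
```

Setting the limit in the transformed space is what makes the familiar 3-sigma rule defensible; applied directly to the skewed statistic it would give a badly miscalibrated false alarm rate.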

  4. Operational Modal Analysis Based on Subspace Algorithm with an Improved Stabilization Diagram Method

    Directory of Open Access Journals (Sweden)

    Shiqiang Qin


    Full Text Available Subspace-based algorithms for operational modal analysis have been extensively studied in the past decades. In the postprocessing of subspace-based algorithms, the stabilization diagram is often used to determine modal parameters. In this paper, an improved stabilization diagram is proposed for stochastic subspace identification. Specifically, first, a model order selection method based on singular entropy theory is proposed. The singular entropy increment is calculated from the nonzero singular values of the output covariance matrix, and the model order is selected where the variation of the singular entropy increment approaches zero. Then, a stabilization diagram with confidence intervals, established from the uncertainties of the modal parameters, is presented. Finally, a simulation example of a four-story structure and a full-scale cable-stayed footbridge application are employed to illustrate the improved stabilization diagram method. The study demonstrates that the model order can be reasonably determined by the proposed method, and that the stabilization diagram with confidence intervals can effectively remove spurious modes.
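
    The singular-entropy order selection can be illustrated roughly as follows; the increment definition and tolerance here are assumptions in the spirit of the abstract, not the authors' implementation:

    ```python
    import numpy as np

    def model_order_by_singular_entropy(cov_matrix, tol=1e-3):
        """Pick a model order from where the singular-entropy increment
        of the output covariance matrix levels off (illustrative criterion)."""
        s = np.linalg.svd(cov_matrix, compute_uv=False)  # sorted descending
        p = s / s.sum()
        # Singular entropy increment contributed by each singular value.
        dE = -p * np.log(np.clip(p, 1e-300, None))
        # Choose the first order after which every increment stays below tol.
        for k in range(1, len(dE)):
            if np.all(dE[k:] < tol):
                return k
        return len(dE)

    # Synthetic covariance with a rank-4 signal subspace plus a weak noise floor.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 4))
    cov = A @ A.T + 1e-6 * np.eye(50)
    order = model_order_by_singular_entropy(cov)
    ```

    For the synthetic matrix the four signal singular values carry almost all of the entropy, so the selected order is 4.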

  5. Improving distillation method and device of tritiated water analysis for ultra high decontamination efficiency. (United States)

    Fang, Hsin-Fa; Wang, Chu-Fang; Lin, Chien-Kung


    Monitoring environmental tritiated water is important for understanding the contamination dispersion of nuclear facilities. Tritium is a pure beta emitter that is usually measured by Liquid Scintillation Counting (LSC). The average energy of the tritium beta is only 5.658 keV, so LSC counting of tritium is easily interfered with by betas emitted by other radionuclides. Environmental tritiated water samples therefore usually need to be decontaminated by distillation to reduce the interference. After the Fukushima nuclear accident, the highest gross beta concentration of groundwater samples obtained around the Fukushima Daiichi Nuclear Power Station was over 1,000,000 Bq/L. There is thus a need for a distillation method with ultra-high decontamination efficiency for environmental tritiated water analysis. This study improves the heating temperature control for better sub-boiling distillation and modifies the height of the container of the air-cooled distillation device for a better fractional distillation effect. The decontamination factor (DF) for Cs-137 of the distillation may reach 450,000, which is far better than in the prior study. The average loss rate of the improved method and device is about 2.6%, which is better than the bias value listed in ASTM D4107-08. The modified air-cooled distillation device is shown to provide an easy-to-handle, water-saving, low-cost and effective way of purifying water samples for beta-contaminated samples that need ultra-high decontamination treatment.
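
    The two figures of merit quoted above reduce to simple ratios; a minimal sketch, with hypothetical activities chosen only to reproduce the quoted values:

    ```python
    def decontamination_factor(activity_before_bq_l, activity_after_bq_l):
        """DF of a purification step: ratio of contaminant activity
        before distillation to after (dimensionless)."""
        return activity_before_bq_l / activity_after_bq_l

    def loss_rate(known_tritium_bq_l, recovered_tritium_bq_l):
        """Fraction of the tritium activity lost by the distillation itself."""
        return 1.0 - recovered_tritium_bq_l / known_tritium_bq_l

    # Illustrative numbers in the spirit of the abstract: a Cs-137 DF of
    # 450,000 and a ~2.6% tritium loss (activities here are invented).
    df = decontamination_factor(1_350_000.0, 3.0)
    loss = loss_rate(100.0, 97.4)
    ```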

  6. Wavelet analysis to decompose a vibration simulation signal to improve pre-distribution testing of packaging (United States)

    Griffiths, K. R.; Hicks, B. J.; Keogh, P. S.; Shires, D.


    In general, vehicle vibration is non-stationary and has a non-Gaussian probability distribution; yet existing testing methods for packaging design employ Gaussian distributions to represent vibration induced by road profiles. This frequently results in over-testing and/or over-design of the packaging to meet a specification, and correspondingly leads to wasteful packaging and product waste, which represent $15bn per year in the USA and €3bn per year in the EU. The purpose of the paper is to enable a measured non-stationary acceleration signal to be replaced by a constructed signal that includes, as far as possible, any non-stationary characteristics of the original signal. The constructed signal consists of a concatenation of decomposed shorter-duration signals, each having its own kurtosis level. Wavelet analysis is used for the decomposition into inner and outlier signal components. The constructed signal has a similar PSD to the original signal without incurring excessive acceleration levels. This allows an improved and more representative simulated input signal to be generated that can be used on the current generation of shaker tables. The wavelet decomposition method is also demonstrated experimentally through two correlation studies. It is shown that significant improvements over current international standards for packaging testing are achievable; hence more efficient packaging system design is possible.
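
    A rough sketch of the inner/outlier split on a synthetic non-stationary record follows; a moving-RMS threshold is used here as a simplified stand-in for the wavelet decomposition, and all amplitudes and thresholds are assumptions:

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    rng = np.random.default_rng(2)

    # Synthetic road-vibration record: stationary Gaussian background with
    # occasional high-amplitude shock bursts (non-Gaussian, leptokurtic).
    n = 20_000
    signal = rng.standard_normal(n)
    for start in rng.integers(0, n - 200, size=10):
        signal[start:start + 200] += 4.0 * rng.standard_normal(200)

    # Simplified stand-in for the wavelet inner/outlier split: samples whose
    # local RMS envelope exceeds a threshold are "outliers", the rest "inner".
    window = 256
    envelope = np.sqrt(np.convolve(signal**2, np.ones(window) / window, mode="same"))
    outlier_mask = envelope > 2.0

    inner = signal[~outlier_mask]
    outlier = signal[outlier_mask]

    # The full record is leptokurtic; the inner part is close to Gaussian
    # (excess kurtosis near 0), while the outlier part carries the bursts.
    k_all = kurtosis(signal)
    k_inner = kurtosis(inner)
    ```

    In the paper's scheme each concatenated segment would then be re-synthesized with its own kurtosis level rather than simply masked.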

  7. Numerical Analysis and Improved Algorithms for Lyapunov-Exponent Calculation of Discrete-Time Chaotic Systems (United States)

    He, Jianbin; Yu, Simin; Cai, Jianping


    Lyapunov exponent is an important index for describing the behavior of chaotic systems, and the largest Lyapunov exponent can be used to determine whether a system is chaotic or not. For discrete-time dynamical systems, the Lyapunov exponents are calculated by an eigenvalue method. In theory, the eigenvalue method gives more accurate Lyapunov exponents as the number of iterations increases, and the limits exist. In practice, however, due to the finite precision of computers and other reasons, the results may numerically overflow, be unrepresentable, or be inaccurate, which can be stated as follows: (1) the number of iterations cannot be too large, otherwise the simulation will return an error message of NaN or Inf; (2) even if no NaN or Inf appears, with increasing iterations all Lyapunov exponents approach the largest Lyapunov exponent, which leads to inaccurate results; (3) from the viewpoint of numerical calculation, if the number of iterations is too small, the results are obviously also inaccurate. Based on this analysis of Lyapunov-exponent calculation in discrete-time systems, this paper investigates two improved algorithms, via QR orthogonal decomposition and SVD orthogonal decomposition, to solve the above-mentioned problems. Finally, some examples are given to illustrate the feasibility and effectiveness of the improved algorithms.
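
    The QR-based approach can be sketched for the classic Hénon map as follows; this is an illustrative implementation, not the paper's code, with the standard literature parameter values:

    ```python
    import numpy as np

    def henon_lyapunov(a=1.4, b=0.3, n_iter=20_000, n_transient=1_000):
        """Lyapunov exponents of the Henon map via QR re-orthonormalization:
        accumulate log|diag(R)| of the QR factors of the Jacobian products,
        which avoids the overflow of multiplying Jacobians directly."""
        x, y = 0.1, 0.1
        for _ in range(n_transient):          # discard the transient
            x, y = 1.0 - a * x * x + y, b * x
        Q = np.eye(2)
        lyap_sums = np.zeros(2)
        for _ in range(n_iter):
            J = np.array([[-2.0 * a * x, 1.0],
                          [b,            0.0]])   # Jacobian at current point
            Q, R = np.linalg.qr(J @ Q)            # re-orthonormalize each step
            lyap_sums += np.log(np.abs(np.diag(R)))
            x, y = 1.0 - a * x * x + y, b * x
        return lyap_sums / n_iter

    l1, l2 = henon_lyapunov()
    # Literature values for a=1.4, b=0.3 are roughly l1 = 0.42, l2 = -1.62;
    # their sum must equal ln|det J| = ln(0.3) at every step.
    ```

    The per-step QR factorization keeps the tangent vectors orthonormal, so neither exponent collapses onto the largest one as the iteration count grows.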

  8. Identifying Interventions for Improving Letter Formation: A Brief Experimental Analysis of Students with Intellectual Disabilities

    Directory of Open Access Journals (Sweden)

    E. Rüya ÖZMEN


    Full Text Available As a group, students with intellectual disabilities display difficulties in a wide range of academic skills, including the acquisition of basic academic skills such as literacy. Early writing and reading skills must be supported to prepare students with intellectual disabilities to learn to read and write. The goal of this study was to replicate and extend the current research on Brief Experimental Analysis (BEA) of letter formation. Three students with intellectual disabilities participated in the study. A brief multi-element design was used to test the effectiveness of four interventions on letter formation: goal setting plus contingent reinforcement, graphical feedback, error correction, and modeling. For one student, modeling was effective; for the two remaining students, goal setting plus contingent reinforcement was effective. The results extend the BEA literature by investigating the effects of interventions for improving letter formation in students with intellectual disabilities. The findings suggest that using BEA to assess the relative contribution of each intervention can identify the most effective interventions for improving letter formation in students with intellectual disabilities.

  9. Resequencing of the common marmoset genome improves genome assemblies and gene-coding sequence analysis. (United States)

    Sato, Kengo; Kuroki, Yoko; Kumita, Wakako; Fujiyama, Asao; Toyoda, Atsushi; Kawai, Jun; Iriki, Atsushi; Sasaki, Erika; Okano, Hideyuki; Sakakibara, Yasubumi


    The first draft of the common marmoset (Callithrix jacchus) genome was published by the Marmoset Genome Sequencing and Analysis Consortium (MGSAC). The draft was based on whole-genome shotgun sequencing, and the current assembly version is Callithrix_jacchus-3.2.1, but 187,214 undetermined gap regions remain, along with supercontigs and relatively short contigs that are unmapped to chromosomes. We performed resequencing and assembly of the common marmoset genome by deep sequencing with high-throughput sequencing technology. Several sequence runs on Illumina sequencing platforms were executed, yielding 181 Gbp of high-quality bases, including mate pairs with long insert lengths of 3, 8, 20 and 40 Kbp, corresponding to approximately 60× coverage. The resequencing significantly improved the MGSAC draft genome sequence: the N50 of the contigs, a statistical measure used to evaluate assembly quality, doubled. As a result, 51% of the contigs (total length: 299 Mbp) that were unmapped to chromosomes in the MGSAC draft were merged with chromosomal contigs, and the improved genome sequence helped to detect 5,288 new genes homologous to human cDNAs and to completely fill the gaps in 5,187 transcripts of the Ensembl gene annotations.
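
    The N50 statistic mentioned above has a simple definition; a minimal sketch with toy contig lengths:

    ```python
    def n50(contig_lengths):
        """N50: the length L such that contigs of length >= L together
        cover at least half of the total assembly length."""
        total = sum(contig_lengths)
        running = 0
        for length in sorted(contig_lengths, reverse=True):
            running += length
            if running * 2 >= total:
                return length
        return 0

    # Toy assembly: half the total (500 of 1000 bp) is first reached inside
    # the 300 bp contig, so N50 = 300.
    lengths = [100, 200, 300, 400]
    ```

    Doubling the N50, as reported above, means half the assembly is contained in contigs twice as long as before.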

  10. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis. (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun


    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand and improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters that affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of an individual station persists over an extended time period. When compared with conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.

  11. Improving the precision of fMRI BOLD signal deconvolution with implications for connectivity analysis. (United States)

    Bush, Keith; Cisler, Josh; Bian, Jiang; Hazaroglu, Gokce; Hazaroglu, Onder; Kilts, Clint


    An important, open problem in neuroimaging analyses is developing analytical methods that ensure precise inferences about the neural activity underlying the fMRI BOLD signal despite the known presence of confounds. Here, we develop and test a new meta-algorithm for conducting semi-blind (i.e., no knowledge of stimulus timings) deconvolution of the BOLD signal that estimates, via bootstrapping, both the underlying neural events driving BOLD and the confidence of these estimates. Our approach includes two improvements over the current best-performing deconvolution approach: (1) we optimize the parametric form of the deconvolution feature space; and (2) we pre-classify neural event estimates into two subgroups, either known or unknown, based on the confidence of the estimates prior to conducting neural event classification. This knows-what-it-knows approach significantly improves neural event classification over the current best-performing algorithm, as tested in a detailed computer simulation of highly confounded fMRI BOLD signal. We then implemented a massively parallelized version of the bootstrapping-based deconvolution algorithm and executed it on a high-performance computer to conduct large-scale (i.e., voxelwise) estimation of the neural events for a group of 17 human subjects. We show that by restricting the computation of inter-regional correlation to only those neural events estimated with high confidence, the method has higher sensitivity for identifying the default mode network than a standard BOLD signal correlation analysis when compared across subjects.
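
    The confidence-based known/unknown pre-classification can be illustrated with a generic bootstrap confidence interval; the event amplitudes and thresholds below are hypothetical, and this greatly simplifies the actual deconvolution pipeline:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def bootstrap_event_confidence(estimates, n_boot=2000, alpha=0.05):
        """Bootstrap a confidence interval for a neural-event amplitude from
        repeated noisy estimates; label it 'known' only when the interval
        excludes zero (the knows-what-it-knows idea, greatly simplified)."""
        boot_means = np.array([
            rng.choice(estimates, size=len(estimates), replace=True).mean()
            for _ in range(n_boot)
        ])
        lo, hi = np.quantile(boot_means, [alpha / 2, 1 - alpha / 2])
        known = (lo > 0) or (hi < 0)
        return known, (lo, hi)

    # Hypothetical per-estimate amplitudes for two candidate events:
    strong_event = 0.8 + 0.3 * rng.standard_normal(60)   # clearly nonzero
    weak_event = 0.05 + 0.6 * rng.standard_normal(60)    # near the noise floor

    known_strong, ci_strong = bootstrap_event_confidence(strong_event)
    known_weak, ci_weak = bootstrap_event_confidence(weak_event)
    ```

    Only events flagged as known would then enter the inter-regional correlation step.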

  12. Use of Selection Indices Based on Multivariate Analysis for Improving Grain Yield in Rice

    Institute of Scientific and Technical Information of China (English)



    In order to study selection indices for improving rice grain yield, a cross was made in 2006 between an Iranian traditional rice (Oryza sativa L.) variety, Tarommahalli, and an improved indica rice variety, Khazar. The traits of the parents (30 plants), the F1 (30 plants) and the F2 generation (492 individuals) were evaluated at the Rice Research Institute of Iran (RRII) during 2007. Heritabilities of the number of panicles per plant, plant height, days to heading and panicle exsertion were greater than that of grain yield. The selection indices were developed using the results of multivariate analysis. To evaluate selection strategies to maximize grain yield, 14 selection indices were calculated based on two methods (optimum and base index) and combinations of 12 traits with various economic weights. The results showed that selection for grain weight, number of panicles per plant and panicle length, using their phenotypic and/or genotypic direct effects (path coefficients) as economic weights, should serve as an effective selection criterion under either the optimum or the base index.
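
    The optimum (Smith-Hazel) index referred to above has a standard closed form, b = P⁻¹Ga; the covariance matrices and weights below are invented for illustration, not values from the study:

    ```python
    import numpy as np

    def smith_hazel_index(P, G, econ_weights):
        """Smith-Hazel optimum selection index coefficients b = P^-1 G a,
        where P and G are the phenotypic and genotypic (co)variance matrices
        and a holds the economic weights of the traits."""
        return np.linalg.solve(P, G @ np.asarray(econ_weights, dtype=float))

    # Illustrative 3-trait example (e.g. grain weight, panicles per plant,
    # panicle length) with made-up covariances; in the study the economic
    # weights were taken from phenotypic/genotypic path coefficients.
    P = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.4],
                  [0.5, 0.4, 2.0]])
    G = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.5, 0.2],
                  [0.3, 0.2, 1.0]])
    a = [1.0, 0.8, 0.5]
    b = smith_hazel_index(P, G, a)
    # Candidates are then ranked by their index score I = b . x, where x
    # holds the observed trait values of each plant.
    ```

    A sanity check: when G equals P (heritability of one for every trait), the index coefficients reduce to the economic weights themselves.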

  13. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Moon, Young Min; Lee, Dong Won; Lee, Sang Ik; Kim, Eung Soo; Yeom, Keum Soo [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)


    The objective of the present research is to perform separate effect tests and to assess the RELAP5/MOD3.2 code for the analysis of thermal-hydraulic behavior in the reactor coolant system, improving the auditing technology of safety analysis. The three Separate Effect Tests (SETs) are reflux condensation in the U-tube, direct contact condensation in the hot leg and mixture level buildup in the pressurizer. Experimental data and empirical correlations are obtained through the SETs. On the basis of the three SET works, models in RELAP5 are modified and improved, and the results are compared with the data. The Korea Standard Nuclear Power Plant (KSNP) is assessed using the modified RELAP5. In the reflux condensation test, heat transfer coefficient and flooding data are obtained and the condensation models are modified using the non-iterative model; as a result, the modified code better predicts the data. In the direct contact condensation test, heat transfer coefficients are obtained for cocurrent and countercurrent flow between the mixture gas and the water under horizontal stratified flow conditions. Several condensation and friction models are modified, and they predict the present data well. In the mixture level test, data for the mixture level and the onset of water draining into the surge line are obtained. The standard RELAP5 over-predicts the mixture level and the void fraction in the pressurizer, and a simple modification of the model related to the pool void fraction is suggested. The KSNP is assessed using the standard and the modified RELAP5 resulting from the experimental and code works for the SETs. In the case of a pressurizer manway opening with the secondary side of the steam generators available, the modified code predicts that little collapsed level accumulates in the pressurizer. The presence and location of the opening and the secondary condition of the steam generators have an effect on the coolant inventory.

  14. Paediatric ED BiPAP continuous quality improvement programme with patient analysis: 2005–2013 (United States)

    Abramo, Thomas; Williams, Abby; Mushtaq, Samaiya; Meredith, Mark; Sepaule, Rawle; Crossman, Kristen; Burney Jones, Cheryl; Godbold, Suzanne; Hu, Zhuopei; Nick, Todd


    Objective In paediatric moderate-to-severe asthmatics, there is significant bronchospasm, airway obstruction and air trapping, causing severe hyperinflation with more positive intrapleural pressure preventing passive air movement. These effects cause an increased respiratory rate (RR), less airflow and shortened inspiratory breath time. In certain asthmatics, aerosols are ineffective due to inadequate ventilation. Bilevel positive airway pressure (BiPAP) can be an effective treatment in acute paediatric asthmatics. BiPAP works by unloading fatigued inspiratory muscles, a direct bronchodilation effect, offsetting intrinsic PEEP and recruiting collapsed alveoli, which reduces the patient's work of breathing and achieves total lung capacity quicker. Unfortunately, paediatric emergency department (PED) BiPAP is underused and quality analysis is non-existent. Setting Academic PED. Participants 1157 patients. Interventions A PED BiPAP Continuous Quality Improvement Programme (CQIP) from 2005 to 2013 was evaluated using descriptive analytics for usage, safety, BiPAP settings, therapeutics and patient disposition. Primary and secondary outcomes Safety, usage, compliance, therapeutic response parameters, BiPAP settings and patient disposition. Results 1157 patients had excellent compliance without complications. Only 6 (0.5%) BiPAP patients were intubated. BiPAP median settings: IPAP 18 (16, 20) cm H2O, range 12–28; EPAP 8 (8, 8) cm H2O, range 6–10; inspiratory-to-expiratory time (I:E) ratio 1.75 (1.5, 1.75). Pediatric Asthma Severity score and RR decreased. Patients were dispositioned to paediatric intensive care units (PICU) or wards (832), with 52 of these PED ward patients discharged home after only 2 hours of PED BiPAP and none returning to the PED within 72 hours.

  15. Exercise interventions to improve sleep in cancer patients: A systematic review and meta-analysis. (United States)

    Mercier, Joanie; Savard, Josée; Bernard, Paquito


    Exercise leads to several positive outcomes in oncology. However, the question as to whether exercise is a valuable option for improving patients' sleep, which is frequently disturbed in cancer patients, remains unanswered. The aims of this study were to conduct a systematic review and meta-analysis of randomized and non-randomized clinical trials that have investigated the effect of exercise on sleep outcomes, assessed subjectively and objectively. Relevant studies, published before May 2016, were traced through a systematic search of the PubMed, Embase, PsycINFO, SPORTDiscus and Cochrane Library databases. The review covered twenty-one trials, including 17 randomized controlled trials (RCTs). Most interventions were home-based aerobic walking programs, and breast cancer patients were the most represented subgroup. Sleep variables were most commonly used as secondary outcomes in the reviewed studies, which were highly heterogeneous in terms of methodology. The qualitative review of available evidence suggested a beneficial effect of exercise interventions on sleep in several studies (48%). However, the meta-analysis conducted on RCTs revealed no significant effect on either subjective or objective sleep measures. This lack of significant effect could be due, at least in part, to a floor effect. More rigorous studies are needed to assess the effect of exercise interventions in cancer patients, in particular randomized controlled trials conducted in patients with clinically significant sleep disturbances at baseline.

  16. Secondary Data Analysis of National Surveys in Japan Toward Improving Population Health. (United States)

    Ikeda, Nayu


    Secondary data analysis of national health surveys of the general population is a standard methodology for health metrics and evaluation; it is used to monitor trends in population health over time and benchmark the performance of health systems. In Japan, the government has established electronic databases of individual records from national surveys of the population's health. However, the number of publications based on these datasets is small considering the scale and coverage of the surveys. There appear to be two major obstacles to the secondary use of Japanese national health survey data: strict data access control under the Statistics Act and an inadequate interdisciplinary research environment for resolving methodological difficulties encountered when dealing with secondary data. The usefulness of secondary analysis of survey data is evident with examples from the author's previous studies based on vital records and the National Health and Nutrition Surveys, which showed that (i) tobacco smoking and high blood pressure are the major risk factors for adult mortality from non-communicable diseases in Japan; (ii) the decrease in mean blood pressure in Japan from the late 1980s to the early 2000s was partly attributable to the increased use of antihypertensive medication and reduced dietary salt intake; and (iii) progress in treatment coverage and control of high blood pressure is slower in Japan than in the United States and Britain. National health surveys in Japan are an invaluable asset, and findings from secondary analyses of these surveys would provide important suggestions for improving health in people around the world.

  17. An improved high-throughput lipid extraction method for the analysis of human brain lipids. (United States)

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett


    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  18. Improvements to PATRIC, the all-bacterial Bioinformatics Database and Analysis Resource Center (United States)

    Wattam, Alice R.; Davis, James J.; Assaf, Rida; Boisvert, Sébastien; Brettin, Thomas; Bun, Christopher; Conrad, Neal; Dietrich, Emily M.; Disz, Terry; Gabbard, Joseph L.; Gerdes, Svetlana; Henry, Christopher S.; Kenyon, Ronald W.; Machi, Dustin; Mao, Chunhong; Nordberg, Eric K.; Olsen, Gary J.; Murphy-Olson, Daniel E.; Olson, Robert; Overbeek, Ross; Parrello, Bruce; Pusch, Gordon D.; Shukla, Maulik; Vonstein, Veronika; Warren, Andrew; Xia, Fangfang; Yoo, Hyunseung; Stevens, Rick L.


    The Pathosystems Resource Integration Center (PATRIC) is the bacterial Bioinformatics Resource Center. Recent changes to PATRIC include a redesign of the web interface and new services that take users from raw reads to an integrated analysis experience. The redesigned interface allows researchers direct access to tools and data, and the emphasis has changed to user-created genome groups, with detailed summaries and views of the data that researchers have selected. Perhaps the biggest change has been the enhanced capability for researchers to analyze their private data and compare it to the available public data. Researchers can assemble their raw sequence reads and annotate the contigs using RASTtk. PATRIC also provides services for RNA-Seq, variation, model reconstruction and differential expression analysis, all delivered through an updated private workspace. Private data can be compared by 'virtual integration' to any of PATRIC's public data. The number of genomes available for comparison in PATRIC has expanded to over 80,000, with a special emphasis on genomes with antimicrobial resistance data. PATRIC uses these data to improve both subsystem annotation and k-mer classification, and tags new genomes as having signatures that indicate susceptibility or resistance to specific antibiotics. PMID:27899627

  19. Improved MALDI imaging MS analysis of phospholipids using graphene oxide as new matrix (United States)

    Wang, Zhongjie; Cai, Yan; Wang, Yi; Zhou, Xinwen; Zhang, Ying; Lu, Haojie


    Matrix-assisted laser desorption/ionization (MALDI) imaging mass spectrometry (IMS) is an increasingly important technique for the detection and spatial localization of phospholipids in tissue. Because phosphatidylcholine (PC) is highly abundant and easy to ionize, the choice of a matrix that yields signals from other lipids has become the most crucial factor for a successful MALDI-IMS analysis of phospholipids. Herein, graphene oxide (GO) was proposed as a new matrix to selectively enhance the detection of other types of phospholipids that are frequently suppressed by the presence of PC in positive mode. Compared with the commonly used matrix DHB, the GO matrix significantly improved the signal-to-noise ratios of phospholipids as a result of its high desorption/ionization efficiency for nonpolar compounds. GO also afforded homogeneous crystallization with analytes due to its monolayer structure and good dispersion, resulting in better shot-to-shot (CV < 13%) and spot-to-spot (CV < 14%) reproducibility. Finally, the GO matrix was successfully applied to the simultaneous imaging of PC, PE, PS and glycosphingolipids in the mouse brain, with a total of 65 phospholipids identified. PMID:28294158

  20. Reliability of multiresolution deconvolution for improving depth resolution in SIMS analysis (United States)

    Boulakroune, M.'Hamed


    This paper addresses the effectiveness and reliability of a multiresolution deconvolution algorithm for recovering secondary ion mass spectrometry (SIMS) profiles altered by the measurement. The new algorithm is characterized as a regularized wavelet transform: it combines ideas from Tikhonov-Miller regularization, wavelet analysis and deconvolution algorithms in order to benefit from the advantages of each. The SIMS profiles were obtained by analysis of two structures of boron in a silicon matrix using a Cameca IMS-6f instrument at oblique incidence. The first structure is large, consisting of two distant wide boxes; the second is a thin structure containing ten delta-layers, to which deconvolution by zone was applied. It is shown that this new multiresolution algorithm gives the best results. In particular, local application of the regularization parameter to blurred and estimated solutions at each resolution level yielded smoothed signals without creating artifacts related to the noise content of the profile. This led to a significant improvement in depth resolution and in the peak maxima.
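
    The Tikhonov-Miller ingredient can be illustrated with a plain Fourier-domain deconvolution of a synthetic delta-layer profile; this is a simplified stand-in for the paper's wavelet-regularized scheme, with all parameters invented:

    ```python
    import numpy as np

    def tikhonov_deconvolve(measured, psf, lam=1e-2):
        """One-dimensional Tikhonov-regularized deconvolution in the Fourier
        domain: X = conj(H) Y / (|H|^2 + lambda). The lambda term damps the
        noise-dominated frequencies that naive inverse filtering amplifies."""
        n = len(measured)
        H = np.fft.fft(psf, n)
        Y = np.fft.fft(measured, n)
        X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
        return np.real(np.fft.ifft(X))

    # Synthetic SIMS-like depth profile: a single delta layer broadened by a
    # Gaussian instrument response function.
    n = 256
    depth = np.arange(n)
    true_profile = np.zeros(n)
    true_profile[100] = 1.0
    psf = np.exp(-0.5 * ((depth - 20) / 6.0) ** 2)
    psf /= psf.sum()
    measured = np.real(np.fft.ifft(np.fft.fft(true_profile) * np.fft.fft(psf, n)))

    restored = tikhonov_deconvolve(measured, psf, lam=1e-4)
    ```

    The restored profile peaks at the true layer depth and is both narrower and taller than the measured one, which is the depth-resolution improvement the abstract refers to; the wavelet scheme additionally adapts lambda per resolution level.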

  1. An improved global analysis of nuclear parton distribution functions including RHIC data (United States)

    Eskola, Kari J.; Paukkunen, Hannu; Salgado, Carlos A.


    We present an improved leading-order global DGLAP analysis of nuclear parton distribution functions (nPDFs), supplementing the traditionally used data from deep inelastic lepton-nucleus scattering and Drell-Yan dilepton production in proton-nucleus collisions with inclusive high-pT hadron production data measured at RHIC in d+Au collisions. With the help of an extended definition of the χ2 function, we can now more efficiently exploit the constraints the different data sets offer, for gluon shadowing in particular, and account for the overall data normalization uncertainties during the automated χ2 minimization. The very good simultaneous fit to the nuclear hard-process data used demonstrates the feasibility of a universal set of nPDFs, but limitations also become visible. The high-pT forward-rapidity hadron data of BRAHMS add a crucial new constraint to the analysis by offering a direct probe of the nuclear gluon distributions, a sector of the nPDFs which has traditionally been very badly constrained. We obtain a strikingly stronger gluon shadowing than estimated in previous global analyses. The obtained nPDFs are released as a parametrization called EPS08.
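
    A χ² with a fitted data-normalization factor can be sketched in a simplified one-data-set form; the exact definition used in the analysis may differ, so treat this as illustrative:

    ```python
    import numpy as np

    def chi2_with_normalization(data, theory, sigma, sigma_norm):
        """Chi-square for one data set with an overall normalization
        uncertainty: chi2 = ((1 - f)/sigma_norm)^2 + sum((f*d_i - t_i)/sigma_i)^2,
        minimized analytically over the normalization factor f (a common
        construction in global PDF fits; details here are illustrative)."""
        w = 1.0 / sigma ** 2
        # Setting d(chi2)/df = 0 gives a closed form for the optimal f.
        f = ((1.0 / sigma_norm ** 2 + np.sum(w * data * theory))
             / (1.0 / sigma_norm ** 2 + np.sum(w * data ** 2)))
        chi2 = ((1.0 - f) / sigma_norm) ** 2 + np.sum(w * (f * data - theory) ** 2)
        return chi2, f

    # Toy example: data that are a uniformly 5%-shifted copy of the theory
    # are largely absorbed by f, leaving mainly the normalization penalty.
    theory = np.array([1.0, 2.0, 3.0, 4.0])
    data = 1.05 * theory
    sigma = 0.1 * np.ones(4)
    chi2, f = chi2_with_normalization(data, theory, sigma, sigma_norm=0.05)
    ```

    Without the normalization factor the same shifted data would contribute a far larger χ², so a compatible overall shift no longer distorts the fit.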

  2. An improved global analysis of nuclear parton distribution functions including RHIC data

    CERN Document Server

    Eskola, K J; Salgado, C A


    We present an improved leading-order global DGLAP analysis of nuclear parton distribution functions (nPDFs), supplementing the traditionally used data from deep inelastic lepton-nucleus scattering and Drell-Yan dilepton production in proton-nucleus collisions, with inclusive high-$p_T$ hadron production data measured at RHIC in d+Au collisions. With the help of an extended definition of the $\\chi^2$ function, we now can more efficiently exploit the constraints the different data sets offer, for gluon shadowing in particular, and account for the overall data normalization uncertainties during the automated $\\chi^2$ minimization. The very good simultaneous fit to the nuclear hard process data used demonstrates the feasibility of a universal set of nPDFs, but also limitations become visible. The high-$p_T$ forward-rapidity hadron data of BRAHMS add a new crucial constraint into the analysis by offering a direct probe for the nuclear gluon distributions -- a sector in the nPDFs which has traditionally been very b...

  3. Large Improvements in MS/MS Based Peptide Identification Rates using a Hybrid Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cannon, William R.; Rawlins, Mitchell M.; Baxter, Douglas J.; Callister, Stephen J.; Lipton, Mary S.; Bryant, Donald A.


    We have developed a hybrid method for identifying peptides from global proteomics studies that significantly increases sensitivity and specificity in matching peptides to tandem mass spectra using database searches. The method increased the number of spectra that can be assigned to a peptide in a global proteomics study by 57-147% at an estimated false discovery rate of 5%, with clear room for even greater improvements. The approach combines the general utility of using consensus model spectra typical of database search methods [1-3] with the accuracy of the intensity information contained in spectral libraries [4-6]. This hybrid approach is made possible by recent developments that elucidated the statistical framework common to both data analysis and statistical thermodynamics, resulting in a chemically inspired approach to incorporating fragment intensity information into both database searches and spectral library searches. We applied this approach to proteomics analysis of Synechococcus sp. PCC 7002, a cyanobacterium that is a model organism for studies of photosynthetic carbon fixation and biofuels development. The increased specificity and sensitivity of this approach allowed us to identify many more peptides involved in the processes important for photoautotrophic growth.
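
    The 5% false discovery rate quoted above is conventionally estimated by target-decoy searching; a minimal sketch of the idea (a simple concatenated-search variant, with invented score lists):

    ```python
    def target_decoy_fdr(scores_target, scores_decoy, threshold):
        """Estimate the false discovery rate at a score threshold with the
        target-decoy strategy: decoy hits above threshold approximate the
        number of false target hits."""
        n_target = sum(s >= threshold for s in scores_target)
        n_decoy = sum(s >= threshold for s in scores_decoy)
        return (n_decoy / n_target) if n_target else 0.0

    # Hypothetical score lists: thresholding at 50 keeps 200 target matches
    # and 10 decoy matches, i.e. an estimated FDR of 5%.
    targets = [60] * 200 + [40] * 300
    decoys = [55] * 10 + [35] * 490
    fdr = target_decoy_fdr(targets, decoys, threshold=50)
    ```

    In practice the threshold is swept until the estimated FDR reaches the desired level (here, 5%).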

  4. [The analysis for improving the SNR of blood components noninvasive measurement with DS method]. (United States)

    Li, Gang; Wang, Hui-quan; Zhao, Zhe; Lin, Ling; Zhang, Bao-ju; Wu, Xiao-rong


    In order to increase the accuracy of blood component measurement and enhance the stability of the prediction model, a quantitative signal-to-noise ratio (SNR) analysis of measuring instruments based on dynamic spectrum (DS) and of the preprocessing method was conducted. According to the experimental results, the SNR of the DS is increased by adding a boxcar integrator, decreasing the wavelength resolution, balancing the DS's SNR and excluding gross errors in preprocessing. Two volunteers were tested continuously many times using the DS data acquisition system. The correlation coefficients of each volunteer's DS data increased from 0.934 and 0.953 to 0.991 and 0.987, respectively. Moreover, the gap between the correlation coefficients of the same volunteer's DS and different volunteers' DS also increased, which shows that the SNR can be improved by these methods. Quantitative SNR analysis can efficiently guide the choice of preprocessing method, creating the conditions for clinical application of noninvasive blood component measurement.
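The boxcar-integration step described in this record relies on the fact that averaging N repeated acquisitions suppresses uncorrelated noise by roughly √N. A minimal sketch in Python; the signal level, noise figure and frame count below are hypothetical, not values from the study:

```python
import math
import random

def snr(samples, true_value):
    """SNR as the true signal level over the RMS error of the samples."""
    err = [s - true_value for s in samples]
    rms = math.sqrt(sum(e * e for e in err) / len(err))
    return true_value / rms

def boxcar_average(frames):
    """Average repeated acquisitions point-by-point (boxcar integration)."""
    n = len(frames)
    return [sum(col) / n for col in zip(*frames)]

random.seed(1)
TRUE = 100.0   # hypothetical detector signal level
NOISE = 5.0    # additive white noise, sigma

def acquire(points=256):
    """One simulated spectral frame: true signal plus white noise."""
    return [TRUE + random.gauss(0, NOISE) for _ in range(points)]

single = acquire()
averaged = boxcar_average([acquire() for _ in range(64)])

# Averaging N = 64 frames should improve the SNR by about sqrt(64) = 8x.
print(round(snr(averaged, TRUE) / snr(single, TRUE), 1))
```

With 64 frames the gain is close to √64 = 8, which illustrates why repeated continuous acquisitions stabilize the prediction model.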

  5. Improvement of tissue analysis and classification using optical coherence tomography combined with Raman spectroscopy (United States)

    Liu, Chih-Hao; Qi, Ji; Lu, Jing; Wang, Shang; Wu, Chen; Shih, Wei-Chuan; Larin, Kirill V.


    Optical coherence tomography (OCT) is an optical imaging technique capable of high-resolution (approaching the histopathology level), real-time imaging of tissues without the use of contrast agents. Owing to these advantages, the pathological features of tumors can be identified on the microscale during resection surgery. However, the accuracy of tumor margin prediction still needs to be enhanced to assist surgeons' judgment. In this regard, we present a two-dimensional computational method for advanced tissue analysis and characterization based on OCT and Raman spectroscopy (RS). The method combines the slope of the OCT intensity signal with the principal components (PCs) of the RS data, relying on tissue optical attenuation and chemical composition for the classification of tissue types. Our pilot experiments were performed on mouse kidney, liver and small intestine. The results demonstrate improved tissue differentiation compared with analysis based on OCT detection alone. This combined OCT/RS method is potentially useful as a novel optical biopsy technique for cancer detection.

  6. Ultrasound guidance improves the success rate of axillary plexus block: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Qin Qin


    Full Text Available ABSTRACT OBJECTIVE: To evaluate the value of real-time ultrasound (US) guidance for axillary brachial plexus block (AXB) through the success rate and the onset time. METHODS: The meta-analysis was carried out in the Anesthesiology Department of the Second Affiliated Hospital of Soochow University, Suzhou, Jiangsu Province, China. A literature search of the Medline, EMBASE and Cochrane databases from 2004 to 2014 was performed, using the medical subject headings and free-text words "axilla", "axillary", "brachial plexus", "ultrasonography", "ultrasound" and "ultrasonics". Two reviewers carried out the search and evaluated studies independently. RESULTS: Seven randomized controlled trials, one cohort study and three retrospective studies were included, comprising a total of 2042 patients: 1157 underwent AXB using US guidance (US group), and the control group included 885 patients (246 using a traditional approach (TRAD) and 639 using nerve stimulation (NS)). Our analysis showed that the success rate was higher in the US group than in the control group (90.64% vs. 82.21%, p < 0.00001). The average time to perform the block and the onset of sensory time were shorter in the US group than in the control group. CONCLUSION: Real-time ultrasound guidance for axillary brachial plexus block improves the success rate and reduces both the mean time to onset of anesthesia and the time of block performance.
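The pooled success-rate comparison reported in this record (90.64% of 1157 vs. 82.21% of 885 patients) can be checked with a standard two-proportion z-test. This is a sketch of the arithmetic only, not the meta-analytic model the authors used; the success counts are reconstructed from the reported percentages:

```python
import math

def two_proportion_z(success1, n1, success2, n2):
    """Two-proportion z-test for comparing two success rates."""
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal distribution.
    p = math.erfc(abs(z) / math.sqrt(2))
    return z, p

# Counts reconstructed from the reported rates (90.64% of 1157, 82.21% of 885).
z, p = two_proportion_z(round(0.9064 * 1157), 1157, round(0.8221 * 885), 885)
print(round(z, 2), p < 0.00001)
```

The z statistic comes out well above conventional thresholds, consistent with the reported p < 0.00001.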

  7. Score-moment combined linear discrimination analysis (SMC-LDA) as an improved discrimination method. (United States)

    Han, Jintae; Chung, Hoeil; Han, Sung-Hwan; Yoon, Moon-Young


    A new discrimination method called score-moment combined linear discrimination analysis (SMC-LDA) has been developed and its performance evaluated using three practical spectroscopic datasets. The key concept of SMC-LDA is to use not only the score from principal component analysis (PCA) but also the moment of the spectrum as inputs for LDA, in order to improve discrimination. Moment, along with the conventional score, is used in spectroscopic fields as an effective alternative representation of spectral features. Three different approaches were considered. Initially, the score generated from PCA was projected onto a two-dimensional feature space by maximizing Fisher's criterion function (conventional PCA-LDA). Next, the same procedure was performed using only moment. Finally, both score and moment were utilized simultaneously for LDA. To evaluate discrimination performance, three spectroscopic datasets were employed: (1) infrared (IR) spectra of normal and malignant stomach tissue, (2) near-infrared (NIR) spectra of diesel and light gas oil (LGO) and (3) Raman spectra of Chinese and Korean ginseng. In each case, the best discrimination was achieved when both score and moment were used for LDA (SMC-LDA). Since the spectral representation character of moment differs from that of score, including both for LDA provided more diversified and descriptive information.
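The "moment of the spectrum" used alongside PCA scores in SMC-LDA treats the spectrum as a distribution over wavelength channels and summarizes band position, width and shape. A minimal sketch with two hypothetical band shapes; the PCA and LDA steps of the actual method are not reproduced here:

```python
def spectral_moments(intensities, order=4):
    """Mean plus central moments of a spectrum treated as a distribution
    over channel index; these summarize band position, width and shape."""
    total = sum(intensities)
    probs = [i / total for i in intensities]
    mean = sum(k * p for k, p in enumerate(probs))
    moments = [mean]
    for n in range(2, order + 1):
        moments.append(sum(((k - mean) ** n) * p for k, p in enumerate(probs)))
    return moments

# Two hypothetical band shapes over 7 channels, one narrow and one broad:
narrow = [0, 1, 8, 1, 0, 0, 0]
broad = [1, 3, 5, 3, 2, 1, 1]

m_narrow = spectral_moments(narrow)
m_broad = spectral_moments(broad)

# The 2nd central moment (variance) separates the two shapes even when
# the mean (band position) is similar, so it carries information a peak
# position alone would miss.
print(m_narrow[1] < m_broad[1])
```

Feature vectors built from such moments can then be concatenated with PCA scores before the LDA projection, which is the combination the record describes.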

  8. Improvement of DGGE analysis by modifications of PCR protocols for analysis of microbial community members with low abundance. (United States)

    Wang, Yong-Feng; Zhang, Fang-Qiu; Gu, Ji-Dong


    Denaturing gradient gel electrophoresis (DGGE) is a powerful technique for revealing the community structure and composition of microorganisms in complex natural environments and samples. However, positive and reproducible polymerase chain reaction (PCR) products, which are difficult to acquire for some samples due to low abundance of the target microorganisms, significantly limit the effective application of DGGE. Nested PCR is therefore often introduced to generate positive PCR products from complex samples, but it introduces a problem of its own: the total number of thermocycles in nested PCR is usually unacceptably high, which skews community structures through the generation of random or mismatched PCR products on the DGGE gel, as demonstrated in this study. Furthermore, nested PCR cannot resolve the uneven representation of PCR products from complex samples with unequal richness of microbial populations. To solve these two problems, the general nested PCR protocol was modified and improved in this study. First, a general PCR procedure was used to amplify the target genes with PCR primers lacking any guanine-cytosine (GC) clamp; the resulting PCR products were then purified and diluted to 0.01 μg ml⁻¹. Subsequently, the diluted PCR products were used as templates for re-amplification with the same PCR primers carrying the GC clamp for 17 cycles, and the products were finally subjected to DGGE analysis. We demonstrated that this is a much more reliable approach to obtaining a high-quality DGGE profile with high reproducibility. We therefore recommend this improved protocol for analyzing microorganisms of low abundance in complex samples with the DGGE fingerprinting technique, to avoid biased results.

  9. Threat to the point: improving the value of comparative extinction risk analysis for conservation action. (United States)

    Murray, Kris A; Verde Arregoitia, Luis D; Davidson, Ana; Di Marco, Moreno; Di Fonzo, Martina M I


    Comparative extinction risk analysis is a common approach for assessing the relative plight of biodiversity and making conservation recommendations. However, the usefulness of such analyses for conservation practice has been questioned. One reason for underperformance may be that threats arising from global environmental changes (e.g., habitat loss, invasive species, climate change) are often overlooked, despite being widely regarded as proximal drivers of species' endangerment. We explore this problem by (i) reviewing the use of threats in this field and (ii) quantitatively investigating the effects of threat exclusion on the interpretation and potential application of extinction risk model results. We show that threat variables are routinely (59%) identified as significant predictors of extinction risk, yet while most studies (78%) include extrinsic factors of some kind (e.g., geographic or bioclimatic information), the majority (63%) do not include threats. Despite low overall usage, studies are increasingly employing threats to explain patterns of extinction risk. However, most continue to employ methods developed for the analysis of heritable traits (e.g., body size, fecundity), which may be poorly suited to the treatment of nonheritable predictors including threats. In our global mammal and continental amphibian extinction risk case studies, omitting threats reduced model predictive performance, but more importantly (i) reduced mechanistic information relevant to management; (ii) resulted in considerable disagreement in species classifications (12% and 5% for amphibians and mammals, respectively, translating to dozens and hundreds of species); and (iii) caused even greater disagreement (20-60%) in a downstream conservation application (species ranking). We conclude that the use of threats in comparative extinction risk analysis is important and increasing but currently in the early stages of development. Priorities for future studies include improving uptake

  10. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feng [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands)]; Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands)]; Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands)]; Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)]


    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams and encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars of IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between the various data points. With this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which in turn increases the reliability of e-waste estimates compared to an approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions, underlining the importance of a multivariate approach and multiple data sources to improve data quality for modelling, specifically the use of appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
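The sales-stock-lifespan linkage at the core of the IOA method can be sketched as a convolution of historical sales with a product lifespan distribution: units sold in year i appear as waste after a delay drawn from that distribution. The Weibull parameters and sales figures below are hypothetical placeholders, not data from the Dutch case study:

```python
import math

def weibull_pmf(years, shape, scale):
    """Discretized Weibull lifespan distribution over whole years."""
    cdf = lambda t: 1 - math.exp(-((t / scale) ** shape))
    return [cdf(t + 1) - cdf(t) for t in range(years)]

def waste_from_sales(sales, lifespan_pmf):
    """Sales-lifespan convolution: units sold in year i fail d years later
    with probability lifespan_pmf[d] (the 'sales' pillar of IOA)."""
    horizon = len(sales) + len(lifespan_pmf)
    waste = [0.0] * horizon
    for i, sold in enumerate(sales):
        for d, p in enumerate(lifespan_pmf):
            waste[i + d] += sold * p
    return waste

# Hypothetical numbers: flat sales of 100 units/yr for 10 years,
# with a Weibull lifespan of roughly 8 years on average.
sales = [100] * 10
waste = waste_from_sales(sales, weibull_pmf(25, shape=2.0, scale=9.0))

# Mass balance: everything sold eventually shows up as waste.
print(round(sum(waste)))
```

Time-varying lifespan parameters, which the highlights call critical, would correspond to using a different `lifespan_pmf` per sales year.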

  11. Using the failure mode and effects analysis model to improve parathyroid hormone and adrenocorticotropic hormone testing

    Directory of Open Access Journals (Sweden)

    Magnezi R


    Full Text Available Racheli Magnezi,1 Asaf Hemi,1 Rina Hemi2 1Department of Management, Public Health and Health Systems Management Program, Bar Ilan University, Ramat Gan, 2Endocrine Service Unit, Sheba Medical Center, Tel Aviv, Israel Background: Risk management in health care systems applies to all hospital employees and directors, as they deal with human life and emergency routines. There is a constant need to decrease risk and increase patient safety in the hospital environment. The purpose of this article is to review the laboratory testing procedures for parathyroid hormone and adrenocorticotropic hormone (which are characterized by short half-lives), to track failure modes and risks, and to offer solutions to prevent them. During a routine quality improvement review at the Endocrine Laboratory in Tel Hashomer Hospital, we discovered that these tests are frequently repeated unnecessarily due to multiple failures. The repetition of the tests inconveniences patients and leads to extra work for the laboratory and logistics personnel, as well as for the nurses and doctors who have to perform many tasks with limited resources. Methods: A team of eight staff members, accompanied by the Head of the Endocrine Laboratory, performed the analysis. The failure mode and effects analysis (FMEA) model was used to analyze the laboratory testing procedure; it was designed to simplify the process steps and to identify and rank possible failures. Results: A total of 23 failure modes were found within the process, 19 of which were ranked by level of severity. The FMEA model prioritizes failures by their risk priority number (RPN). For example, the most serious failure was the delay after the samples were collected from the department (RPN = 226.1). Conclusion: This model helped us to visualize the process in a simple way. After analyzing the information, solutions were proposed to prevent failures, and a method to completely avoid the top four problems was also developed.
Keywords: failure mode
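The RPN ranking that FMEA uses multiplies severity, occurrence and detection scores, each conventionally on a 1-10 scale. A minimal sketch with invented failure modes loosely inspired by the workflow above; none of the names or scores come from the study:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: each factor is conventionally scored 1-10."""
    return severity * occurrence * detection

# Hypothetical failure modes for a short-half-life hormone assay workflow:
failure_modes = [
    ("sample transport delayed", rpn(9, 7, 4)),
    ("tube mislabeled", rpn(8, 3, 5)),
    ("centrifuge queue backlog", rpn(6, 5, 3)),
]

# Rank failures from highest to lowest RPN to prioritize interventions.
ranked = sorted(failure_modes, key=lambda fm: fm[1], reverse=True)
print([name for name, _ in ranked])
```

Sorting by RPN is exactly the prioritization step the abstract describes; interventions then target the top-ranked modes first.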

  12. Improvement of the analysis of the biochemical oxygen demand (BOD) of Mediterranean seawater by seeding control. (United States)

    Simon, F Xavier; Penru, Ywann; Guastalli, Andrea R; Llorens, Joan; Baig, Sylvie


    Biochemical oxygen demand (BOD) is a useful parameter for assessing the biodegradability of dissolved organic matter in water. At the same time, this parameter is used to evaluate the efficiency with which certain processes remove biodegradable natural organic matter (NOM). However, BOD values in seawater are very low (around 2 mg O₂ L⁻¹) and the methods used for its analysis are poorly developed. The increasing attention given to seawater desalination in the Mediterranean environment, and related phenomena such as reverse osmosis membrane biofouling, have stimulated interest in seawater BOD close to the Spanish coast. In this study the BOD analysis protocol was refined by introducing a new step in which a critical quantity of autochthonous microorganisms, measured as adenosine triphosphate, is added. For the samples analyzed, this improvement allowed us to obtain reliable and replicable BOD measurements, standardized with solutions of glucose-glutamic acid and acetate. After 7 days of analysis, more than 80% of the ultimate BOD is achieved, which in the case of easily biodegradable compounds represents nearly 60% of the theoretical oxygen demand. The BOD₇ of Mediterranean seawater was found to be 2.0 ± 0.3 mg O₂ L⁻¹, but this value decreased with seawater storage time due to the rapid consumption of labile compounds. No significant differences were found between two sampling points located on the Spanish coast, since their organic matter content was similar. Finally, determination of seawater BOD without the use of an inoculum may lead to an underestimation of BOD.
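The relation between 7-day and ultimate BOD quoted above is commonly described by first-order kinetics, BOD_t = BOD_u(1 - e^(-kt)). A sketch with an assumed rate constant chosen so that roughly 80% of the ultimate BOD is exerted by day 7; k is illustrative, not a value reported in the study:

```python
import math

def bod_fraction(t_days, k=0.23):
    """First-order BOD model: fraction of ultimate BOD exerted by day t.
    k (per day) is an assumed rate constant, not a value from the study."""
    return 1 - math.exp(-k * t_days)

ultimate = 2.0  # mg O2 / L, the order of magnitude reported for seawater
bod7 = ultimate * bod_fraction(7)
print(round(bod_fraction(7), 2), round(bod7, 2))
```

With k = 0.23 d⁻¹ the 7-day fraction is just above 80%, matching the "more than 80% of ultimate BOD" figure in the abstract.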

  13. An improved Agrobacterium-mediated transformation system for the functional genetic analysis of Penicillium marneffei. (United States)

    Kummasook, Aksarakorn; Cooper, Chester R; Vanittanakom, Nongnuch


    We have developed an improved Agrobacterium-mediated transformation (AMT) system for the functional genetic analysis of Penicillium marneffei, a thermally dimorphic, human pathogenic fungus. Our AMT protocol included the use of conidia or pre-germinated conidia of P. marneffei as the host recipient for T-DNA from Agrobacterium tumefaciens and co-cultivation at 28°C for 36 hours. Bleomycin-resistant transformants were selected as yeast-like colonies following incubation at 37°C. The efficiency of transformation was approximately 123 ± 3.27 and 239 ± 13.12 transformants per plate when using 5 × 10⁴ conidia and pre-germinated conidia as starting materials, respectively. Southern blot analysis demonstrated that 95% of transformants contained single copies of T-DNA. Inverse PCR was employed for identifying flanking sequences at the T-DNA insertion sites. Analysis of these sequences indicated that integration occurred as random recombination events. Among the mutants isolated were previously described stuA and gasC defective strains. These AMT-derived mutants possessed single T-DNA integrations within their particular coding sequences. In addition, other morphological and pigmentation mutants possessing a variety of gene-specific defects were isolated, including two mutants having T-DNA integrations within putative promoter regions. One of the latter integration events was accompanied by the deletion of the entire corresponding gene. Collectively, these results indicated that AMT could be used for large-scale, functional genetic analyses in P. marneffei. Such analyses can potentially facilitate the identification of those genetic elements related to morphogenesis, as well as pathogenesis in this medically important fungus.

  14. Liposome bupivacaine for improvement in economic outcomes and opioid burden in GI surgery: IMPROVE Study pooled analysis

    Directory of Open Access Journals (Sweden)

    Cohen SM


    Full Text Available Stephen M Cohen,1 Jon D Vogel,2 Jorge E Marcet,3 Keith A Candiotti4 1Atlanta Colon and Rectal Surgery, PA, Atlanta, GA, USA; 2General Surgery Clinic, University of Colorado, Aurora, CO, USA; 3Department of Surgery, Morsani College of Medicine, University of South Florida, Tampa, FL, USA; 4Department of Anesthesiology, University of Miami Leonard Miller School of Medicine, Miami, FL, USA Abstract: Postsurgical pain management remains a significant challenge. Liposome bupivacaine, as part of a multimodal analgesic regimen, has been shown to significantly reduce postsurgical opioid consumption, hospital length of stay (LOS) and hospitalization costs in gastrointestinal (GI) surgery, compared with intravenous (IV) opioid-based patient-controlled analgesia (PCA). Pooled results from open-label studies comparing a liposome bupivacaine-based multimodal analgesic regimen with IV opioid PCA were analyzed. Patients (n=191) who underwent planned surgery and received study drug (IV opioid PCA, n=105; multimodal analgesia, n=86) were included. Liposome bupivacaine-based multimodal analgesia, compared with IV opioid PCA, significantly reduced mean (standard deviation [SD]) postsurgical opioid consumption (38 [55] mg versus [vs] 96 [85] mg; P<0.0001), postsurgical LOS (median 2.9 vs 4.3 days; P<0.0001) and mean hospitalization costs (US$8,271 vs US$10,726; P=0.0109). The multimodal analgesia group had significantly fewer patients with opioid-related adverse events (AEs) than the IV opioid PCA group (P=0.0027); there were no significant between-group differences in patient satisfaction scores at 30 days. A liposome bupivacaine-based multimodal analgesic regimen was associated with significantly lower opioid consumption, fewer opioid-related AEs, and better health economic outcomes compared with an IV opioid PCA-based regimen in patients undergoing GI surgery. Study registration: This pooled analysis is based on data from Phase IV clinical trials registered on the US National

  15. Training State and Community Instructors in Use of NHTSA Curriculum Packages: Driver Improvement Analysis, Driver License Examiner-Supervisor and Traffic Record Analysis. (United States)

    Burgener, V. E.; Tiryakioglu, Dona

    A series of five national instructor training institutes were planned for each of three emerging highway safety technician areas for which curriculum packages have been prepared (Driver Improvement Analysis, Driver License Examiner-Supervisor, and Traffic Record Analysis). Technical Education Research Centers and Dunlap & Associates…

  16. Technical improvements in the DNA analysis of the myotonic dystrophy (DM) mutation

    Energy Technology Data Exchange (ETDEWEB)

    Leblond, S.; Lehev, D.; Barcelo, J. [Children's Hospital of Eastern Ontario, Ottawa (Canada)] [and others]


    It has become increasingly clear that widespread clinical application of routine DNA diagnosis requires robust and easily replicated methodologies. Mutation analysis in DM involves detection of a CTG expansion which may increase in size between generations within a family. DNA testing has required two distinct methods: genomic and PCR DNA Southern blotting. Genomic Southerns visualize from E1 (hundreds of repeats) to the very largest E4 (thousands of repeats in congenital DM). PCR Southerns permit detection of the smallest mutations (E0, protomutations associated with minimal if any clinical signs) to E3, but E4 is not uniformly visualized. In order to improve DM testing such that even the largest expansions are visualized by a single PCR test, we altered the PCR conditions. Since the PCR conditions do not substantially affect the normal allele of less than 200 bp, the CTG expansion must be directly monitored by hybridization with a labelled (CTG)₁₀ oligonucleotide. Unlike PCR of the CGG expansion in fragile X, addition of deazaGTP reduced visualization of the DM expansion. Addition of single-stranded DNA-binding protein and DMSO improved PCR up to ten-fold, such that E4s were visualized. The CTG expansion was very sensitive to the denaturing cycle temperature (which does not affect the intensity of the normal allele). Thus, 96°C on the Perkin Elmer 480 was optimal in our hands, with 98°C and 94°C actually causing loss of even the intermediate-sized E1 and E2 expansions. This has implications when setting up the DM test on different thermocyclers, where digital readings may not reflect actual block temperature. These PCR 'tune-ups' will support more reliable and streamlined analyses as more expansion mutations are recognized and routinely offered for clinical use.

  17. FDG uptake heterogeneity evaluated by fractal analysis improves the differential diagnosis of pulmonary nodules

    Energy Technology Data Exchange (ETDEWEB)

    Miwa, Kenta [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Division of Medical Quantum Science, Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan)]; Inubushi, Masayuki [Department of Nuclear Medicine, Kawasaki Medical School, 577 Matsushima Kurashiki, Okayama 701-0192 (Japan)]; Wagatsuma, Kei [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan)]; Nagao, Michinobu [Department of Molecular Imaging and Diagnosis, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan)]; Murata, Taisuke [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan)]; Koyama, Masamichi [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan)]; Koizumi, Mitsuru [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan)]; Sasaki, Masayuki [Division of Medical Quantum Science, Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan)]


    Purpose: The present study aimed to determine whether fractal analysis of morphological complexity and intratumoral heterogeneity of FDG uptake can help to differentiate malignant from benign pulmonary nodules. Materials and methods: We retrospectively analyzed data from 54 patients with suspected non-small cell lung cancer (NSCLC) who were examined by FDG PET/CT. Pathological assessments of biopsy specimens confirmed 35 and 19 nodules as NSCLC and inflammatory lesions, respectively. The morphological fractal dimension (m-FD), maximum standardized uptake value (SUVmax) and density fractal dimension (d-FD) of target nodules were calculated from CT and PET images. Fractal dimension is a quantitative index of morphological complexity and tracer uptake heterogeneity; higher values indicate increased complexity and heterogeneity. Results: The m-FD, SUVmax and d-FD significantly differed between malignant and benign pulmonary nodules (p < 0.05). Although the diagnostic ability was better for d-FD than for m-FD and SUVmax, the difference did not reach statistical significance. Tumor size correlated significantly with SUVmax (r = 0.51, p < 0.05), but not with either m-FD or d-FD. Furthermore, m-FD combined with either SUVmax or d-FD improved diagnostic accuracy to 92.6% and 94.4%, respectively. Conclusion: The d-FD of intratumoral heterogeneity of FDG uptake can help to differentially diagnose malignant and benign pulmonary nodules. The SUVmax and d-FD obtained from FDG-PET images provide different types of information that are equally useful for differential diagnoses. Furthermore, the morphological complexity determined by CT combined with heterogeneous FDG uptake determined by PET improved diagnostic accuracy.
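A fractal dimension such as the m-FD or d-FD above is typically estimated by box counting: cover the image with boxes of decreasing size s and fit the slope of log N(s) against log(1/s). A self-contained sketch on a synthetic Sierpinski carpet, whose dimension log 8/log 3 ≈ 1.893 is known exactly; the clinical images and the study's exact estimator are not reproduced here:

```python
import math

def sierpinski_carpet(level):
    """Set of filled (x, y) cells of a Sierpinski carpet of side 3**level."""
    size = 3 ** level
    def filled(x, y):
        # A cell is removed if any base-3 digit pair is (1, 1) (the center).
        while x or y:
            if x % 3 == 1 and y % 3 == 1:
                return False
            x, y = x // 3, y // 3
        return True
    return {(x, y) for x in range(size) for y in range(size) if filled(x, y)}

def box_count_dimension(cells, size):
    """Least-squares slope of log N(s) vs log(1/s) over box sizes 1, 3, 9, ..."""
    xs, ys = [], []
    s = 1
    while s < size:
        boxes = {(x // s, y // s) for x, y in cells}
        xs.append(math.log(1 / s))
        ys.append(math.log(len(boxes)))
        s *= 3
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

d = box_count_dimension(sierpinski_carpet(3), 3 ** 3)
print(round(d, 3))  # close to log(8)/log(3)
```

Applied to a binary tumor mask (m-FD) or a thresholded uptake map (d-FD), the same slope-fitting idea yields the dimensions the record compares.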

  18. Improved methodologies for continuous-flow analysis of stable water isotopes in ice cores (United States)

    Jones, Tyler R.; White, James W. C.; Steig, Eric J.; Vaughn, Bruce H.; Morris, Valerie; Gkinis, Vasileios; Markle, Bradley R.; Schoenemann, Spruce W.


    Water isotopes in ice cores are used as a climate proxy for local temperature and regional atmospheric circulation as well as evaporative conditions in moisture source regions. Traditional measurements of water isotopes have been achieved using magnetic sector isotope ratio mass spectrometry (IRMS). However, a number of recent studies have shown that laser absorption spectrometry (LAS) performs as well as or better than IRMS. The new LAS technology has been combined with continuous-flow analysis (CFA) to improve data density and sample throughput in numerous prior ice coring projects. Here, we present a comparable semi-automated LAS-CFA system for measuring high-resolution water isotopes of ice cores. We outline new methods for partitioning both system precision and mixing length into liquid and vapor components - useful measures for defining and improving the overall performance of the system. Critically, these methods take into account the uncertainty of depth registration that is not present in IRMS and not fully accounted for in other CFA studies. These analyses are achieved using samples from a South Pole firn core, a Greenland ice core, and the West Antarctic Ice Sheet (WAIS) Divide ice core. The measurement system utilizes a 16-position carousel contained in a freezer to consecutively deliver ~1 m × 1.3 cm² ice sticks to a temperature-controlled melt head, where the ice is converted to a continuous liquid stream and eventually vaporized using a concentric nebulizer for isotopic analysis. An integrated delivery system for water isotope standards is used for calibration to the Vienna Standard Mean Ocean Water (VSMOW) scale, and depth registration is achieved using a precise overhead laser distance device with an uncertainty of ±0.2 mm. As an added check on the system, we perform inter-lab LAS comparisons using WAIS Divide ice samples, a corroboratory step not taken in prior CFA studies. The overall results are important for substantiating data obtained from LAS

  19. Analysis of Stakeholder's Behaviours for an Improved Management of an Agricultural Coastal Region in Oman (United States)

    Al Khatri, Ayisha; Grundmann, Jens; van der Weth, Rüdiger; Schütze, Niels


    differences exist between groups on how to achieve this improvement, since farmers prefer management interventions operating more on the water resources side, while decision makers support measures for better management on the water demand side. Furthermore, the opinions within single groups are sometimes contradictory for several management interventions. The use of more advanced statistical methods, such as discriminant analysis or Bayesian networks, allows factors and drivers to be identified that explain these differences. Both approaches will help in understanding stakeholders' behaviours and evaluating the implementation potential of several management interventions. Keywords: IWRM, stakeholder participation, field survey, statistical analysis, Oman

  20. Experimental study and mechanism analysis of modified limestone by red mud for improving desulfurization

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongtao; Han, Kuihua; Niu, Shengli; Lu, Chunmei; Liu, Mengqi; Li, Hui [Shandong Univ., Jinan (China). School of Energy and Power Engineering]


    Red mud is a type of solid waste generated during alumina production from bauxite, and how to dispose of and utilize red mud on a large scale is still an open question. This paper attempts to use red mud as an additive to modify limestone. The enhancement of the sulfation reaction of limestone by red mud (two kinds of Bayer-process red mud and one kind of sintering-process red mud) is studied in a tube furnace reactor. The calcination and sulfation processes and kinetics are investigated in a thermogravimetric (TG) analyzer. The results show that red mud can effectively improve the desulfurization performance of limestone over the whole temperature range (1,073-1,373 K). The sulfur capacity of limestone (the mass of SO₂ that can be retained by 100 mg of limestone) can be increased by 25.73, 7.17 and 15.31%, while the utilization of calcium can be increased from 39.68 to 64.13, 60.61 and 61.16% after modification by the three kinds of red mud, at a calcium/metallic element ratio of 15 (metallic element here means all metallic elements that can catalyze the sulfation process, including Na, K, Fe and Ti) and a temperature of 1,173 K. The structure of limestone modified by red mud is interlaced and tridimensional, which is conducive to the sulfation reaction. XRD phase composition analysis of modified limestone sulfated at high temperature shows correspondingly more sulphates for silicate and aluminate complexes of calcium in the products. Temperature, calcium/metallic element ratio and particle diameter are important factors for the sulfation reaction. The optimum results are obtained at a calcium/metallic element ratio of 15. The calcination characteristic of limestone modified by red mud shifts toward lower temperatures. The enhancement of sulfation by doping red mud is more pronounced once the product layer has been formed and consequently the promoting

  1. Stable Isotope Labeling for Improved Comparative Analysis of RNA Digests by Mass Spectrometry (United States)

    Paulines, Mellie June; Limbach, Patrick A.


    Even with the advent of high throughput methods to detect modified ribonucleic acids (RNAs), mass spectrometry remains a reliable method to detect, characterize, and place post-transcriptional modifications within an RNA sequence. Here we have developed a stable isotope labeling comparative analysis of RNA digests (SIL-CARD) approach, which improves upon the original 18O/16O labeling CARD method. Like the original, SIL-CARD allows sequence or modification information from a previously uncharacterized in vivo RNA sample to be obtained by direct comparison with a reference RNA, the sequence of which is known. This reference is in vitro transcribed using a 13C/15N isotopically enriched nucleoside triphosphate (NTP). The two RNAs are digested with an endonuclease, the specificity of which matches the labeled NTP used for transcription. As proof of concept, several transfer RNAs (tRNAs) were characterized by SIL-CARD, where labeled guanosine triphosphate was used for the reference in vitro transcription. RNase T1 digestion products from the in vitro transcript will be 15 Da higher in mass than the same digestion products from the in vivo tRNA that are unmodified, leading to a doublet in the mass spectrum. Singlets, rather than doublets, arise if a sequence variation or a post-transcriptional modification is present that results in a relative mass shift different from 15 Da. Moreover, the use of the in vitro synthesized tRNA transcript allows for quantitative measurement of RNA abundance. Overall, SIL-CARD simplifies data analysis and enhances quantitative RNA modification mapping by mass spectrometry.
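The doublet/singlet logic described above can be sketched in a few lines: each unmodified RNase T1 product should appear 15 Da below its labeled reference partner, and any unpaired sample mass flags a variant or modification. The masses below are hypothetical, and a real pipeline must additionally handle isotope patterns and charge states.

```python
def pair_doublets(ref_masses, sample_masses, shift=15.0, tol=0.01):
    """Pair labeled-reference digestion product masses with sample masses
    expected `shift` Da lighter; unpaired sample masses are singlets that
    flag sequence variants or post-transcriptional modifications."""
    doublets, singlets = [], []
    for m in sample_masses:
        match = [r for r in ref_masses if abs(r - m - shift) <= tol]
        if match:
            doublets.append((m, match[0]))
        else:
            singlets.append(m)
    return doublets, singlets

# Hypothetical masses (Da): two unmodified products, one modified product
ref = [1044.16, 1690.25, 2020.30]       # 13C/15N-labeled in vitro transcript
sample = [1029.16, 1675.25, 2019.32]    # in vivo tRNA digest
doublets, singlets = pair_doublets(ref, sample)
```

The third sample mass does not sit 15 Da below any reference product, so it surfaces as a singlet, mirroring how SIL-CARD exposes modified residues.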

  2. Secondary Data Analysis of National Surveys in Japan Toward Improving Population Health

    Directory of Open Access Journals (Sweden)

    Nayu Ikeda


    Full Text Available Secondary data analysis of national health surveys of the general population is a standard methodology for health metrics and evaluation; it is used to monitor trends in population health over time and benchmark the performance of health systems. In Japan, the government has established electronic databases of individual records from national surveys of the population’s health. However, the number of publications based on these datasets is small considering the scale and coverage of the surveys. There appear to be two major obstacles to the secondary use of Japanese national health survey data: strict data access control under the Statistics Act and an inadequate interdisciplinary research environment for resolving methodological difficulties encountered when dealing with secondary data. The usefulness of secondary analysis of survey data is evident with examples from the author’s previous studies based on vital records and the National Health and Nutrition Surveys, which showed that (i) tobacco smoking and high blood pressure are the major risk factors for adult mortality from non-communicable diseases in Japan; (ii) the decrease in mean blood pressure in Japan from the late 1980s to the early 2000s was partly attributable to the increased use of antihypertensive medication and reduced dietary salt intake; and (iii) progress in treatment coverage and control of high blood pressure is slower in Japan than in the United States and Britain. National health surveys in Japan are an invaluable asset, and findings from secondary analyses of these surveys would provide important suggestions for improving health in people around the world.

  3. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin


    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has highlighted problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim of improving application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, by an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques to an accompanying dataset based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.
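As a rough illustration of what an EFA extraction does (FACTOR itself offers polychoric correlations, parallel analysis, and rotations that this sketch omits), a minimal principal-component extraction on synthetic Likert-style items might look like:

```python
import numpy as np

def efa_loadings(data, n_factors):
    """Minimal factor extraction: eigendecomposition of the correlation
    matrix (principal component method, no rotation). Illustrative only;
    for ordinal data FACTOR would start from polychoric correlations."""
    R = np.corrcoef(data, rowvar=False)
    vals, vecs = np.linalg.eigh(R)              # eigenvalues, ascending
    idx = np.argsort(vals)[::-1][:n_factors]    # keep the largest ones
    return vecs[:, idx] * np.sqrt(vals[idx])    # scale to loadings

# Synthetic data: two latent factors, three items each, on a 1-5 scale
rng = np.random.default_rng(0)
f = rng.normal(size=(500, 2))
items = np.column_stack([f[:, 0]] * 3 + [f[:, 1]] * 3)
items = items + 0.5 * rng.normal(size=(500, 6))
items = np.clip(np.round(items + 3), 1, 5)      # discretize like Likert items
L = efa_loadings(items, 2)
```

With two clean three-item clusters, the two retained components capture most of the common variance, which is what the scree-type retention decisions discussed in the article are judging.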

  4. Improved machine learning method for analysis of gas phase chemistry of peptides

    Directory of Open Access Journals (Sweden)

    Ahn Natalie


    Full Text Available Abstract Background Accurate peptide identification is important to high-throughput proteomics analyses that use mass spectrometry. Search programs compare fragmentation spectra (MS/MS) of peptides from complex digests with theoretically derived spectra from a database of protein sequences. Improved discrimination is achieved with theoretical spectra that are based on simulating the gas phase chemistry of the peptides, but the limited understanding of those processes affects the accuracy of predictions from theoretical spectra. Results We employed a robust data mining strategy using new feature annotation functions of the MAE software, which revealed under-prediction of the frequency of fragmentation at the second peptide bond. We applied methods of exploratory data analysis to pre-process the information in the MS/MS spectra, including data normalization and attribute selection, to reduce the attributes to a smaller, less correlated set for machine learning studies. We then compared our rule-building machine learning program, DataSqueezer, with commonly used association rule and decision tree algorithms. All of the machine learning algorithms produced similar results that were consistent with the expected properties of a second gas phase mechanism at the second peptide bond. Conclusion The results provide compelling evidence for underlying chemical properties in the data that suggest the existence of an additional gas phase mechanism for the second peptide bond. The methods described in this study therefore provide a valuable approach for analyses of this kind in the future.

  5. Chicken Essence for Cognitive Function Improvement: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Siew Li Teoh


    Full Text Available Chicken essence (CE) is a popular traditional remedy in Asia that is believed to improve cognitive functions, and CE companies claim these health benefits have been proven in research studies. A systematic review was conducted to determine the cognitive-enhancing effects of CE. We systematically searched a number of databases for randomized controlled trials in which human subjects consumed CE and underwent cognitive tests. Cochrane's Risk of Bias (ROB) tool was used to assess the quality of trials, and meta-analysis was performed. Seven trials were included: six recruited healthy subjects and one recruited subjects with poorer cognitive function. One trial had unclear ROB while the rest had high ROB. For executive function tests, one pooled comparison showed a significant difference favoring CE (pooled standardized mean difference (SMD) of −0.55 (−1.04, −0.06)) and another showed no significant difference (pooled SMD of 0.70 (−0.001, 1.40)). For short-term memory tests, no significant difference was found (pooled SMD of 0.63 (−0.16, 1.42)). Currently, there is a lack of convincing evidence of a cognitive-enhancing effect of CE.
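A pooled SMD of the kind reported above comes from inverse-variance weighting of the per-trial effects. The sketch below uses hypothetical trial values and a fixed-effect model; a random-effects model would additionally estimate between-trial heterogeneity.

```python
import math

def pooled_smd(effects):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences; `effects` is a list of (smd, variance) tuples.
    Returns the pooled estimate and a 95% confidence interval."""
    w = [1.0 / v for _, v in effects]
    est = sum(wi * d for wi, (d, _) in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Hypothetical per-trial SMDs and variances for an executive-function test
est, (lo, hi) = pooled_smd([(-0.70, 0.08), (-0.40, 0.05)])
```

A confidence interval lying entirely below zero is what "a significant difference favoring CE" means for these inverted-scale outcomes.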

  6. Combination of principal component analysis and optical-flow motion compensation for improved cardiac MR thermometry (United States)

    Toupin, S.; de Senneville, B. Denis; Ozenne, V.; Bour, P.; Lepetit-Coiffe, M.; Boissenin, M.; Jais, P.; Quesson, B.


    The use of magnetic resonance (MR) thermometry for the monitoring of thermal ablation is rapidly expanding. However, this technique remains challenging for monitoring the treatment of cardiac arrhythmia by radiofrequency ablation, owing to heart displacement with respiration and contraction. Recent studies have addressed this problem by compensating in-plane motion in real time with optical-flow-based tracking techniques. However, these algorithms are sensitive to local variations of signal intensity on magnitude images associated with tissue heating. In this study, an optical-flow algorithm was combined with a principal component analysis method to reduce the impact of such effects. The proposed method was integrated into a fully automatic cardiac MR thermometry pipeline, compatible with a future clinical workflow. It was evaluated on nine healthy volunteers under free-breathing conditions, on a phantom, and in vivo on the left ventricle of a sheep. The results showed that local intensity changes in magnitude images had a lower impact on motion estimation with the proposed method. Using this strategy, the temperature mapping accuracy was significantly improved.
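One simple reading of the PCA idea, suppressing heating-related intensity changes by projecting each frame onto a basis learned from heating-free images before optical-flow tracking, can be sketched as follows. The tiny synthetic "frames" and single-mode motion model are assumptions for illustration; the actual pipeline is considerably more involved.

```python
import numpy as np

def pca_basis(training_frames, n_components):
    """Learn a PCA basis from heating-free magnitude images
    (frames flattened to vectors)."""
    X = training_frames.reshape(len(training_frames), -1).astype(float)
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def pca_filter(frame, mean, basis):
    """Project a frame onto the basis: the reconstruction keeps the
    motion-related variation seen in training and largely discards
    local heating-induced intensity changes."""
    x = frame.ravel().astype(float) - mean
    recon = mean + basis.T @ (basis @ x)
    return recon.reshape(frame.shape)

rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))                 # static anatomy
B = rng.normal(size=(8, 8))                 # motion-related variation mode
train = np.stack([A + c * B for c in np.linspace(-1, 1, 10)])
mean, basis = pca_basis(train, 1)
clean = A + 0.5 * B
frame = clean.copy()
frame[4, 4] += 5.0                          # heating-like local change
filtered = pca_filter(frame, mean, basis)
```

The localized intensity spike barely projects onto the learned motion mode, so the filtered frame handed to optical flow stays close to the clean image.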


    Directory of Open Access Journals (Sweden)

    Marek Ďurica


    Full Text Available Nowadays, the Internet plays a major role in people's lives. It is used for entertainment, as a source of information, and for electronic commerce. Electronic commerce (e-commerce) has been gradually replacing traditional shopping, especially in recent years. It is a quick and easy form of marketing that provides convenience for customers, and, therefore, more and more users are shopping on the Internet. E-commerce also provides new opportunities for companies, which compels them to engage with the Internet. Many customers who shop on the Internet look for the best product or service close to their home, and most of the space in Google search results is occupied by local results. If a company offers goods or services but does not show up in the local search results, it may be losing profits from these potential customers. That is why companies have to focus on ranking as highly as possible in the local search results. In this article, we try to determine experimentally which factors affect ranking in Google search and to quantify their impact. To select these factors and determine their impact, we use exact methods of mathematical statistics: hypothesis testing, correlation, and regression analysis. Confirmation and quantification of the impact of qualitative and quantitative characteristics of a company can be used to formulate recommendations for improving corporate strategy in acquiring new customers.
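The statistical machinery mentioned, correlating ranking position with candidate factors and regressing on them, can be illustrated on hypothetical data. The factor names (review count, backlinks) and values are invented for the sketch, not the factors tested in the article.

```python
import numpy as np

# Hypothetical data: Google local ranking position vs. two candidate
# factors for 8 businesses (position 1 is best).
reviews = np.array([120., 80., 60., 45., 30., 20., 10., 5.])
backlinks = np.array([300., 260., 150., 140., 90., 60., 40., 10.])
position = np.array([1., 2., 3., 4., 5., 6., 7., 8.])

# Pearson correlation: do more reviews go with a better (lower) position?
r = np.corrcoef(reviews, position)[0, 1]

# OLS regression quantifying the joint impact of both factors on position
X = np.column_stack([np.ones_like(reviews), reviews, backlinks])
beta, *_ = np.linalg.lstsq(X, position, rcond=None)
```

A strongly negative correlation here would support the factor's influence; the regression coefficients then quantify it, which is the kind of evidence the article uses to rank factors.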

  8. Improved retrieval of SO2 from Ozone Monitoring Instrument: residual analysis and data noise correction

    Directory of Open Access Journals (Sweden)

    D. Han


    Full Text Available In this study, based on Ozone Monitoring Instrument (OMI) observation data and considering the shortcomings of the current Band Residual Difference (BRD) algorithm in data noise correction since late 2008, we analyze the main noise sources in OMI SO2 data in detail and determine the best residual adjustment area by comparing the effects of different residual corrections. After this modification, the noise in OMI SO2 planetary boundary layer (PBL) results retrieved with the BRD algorithm is largely reduced, the precision of the SO2 results is improved, and the BRD algorithm's data noise correction is optimized. We select China as our study area and compare the optimized results with the OMI SO2 PBL products. The results are consistent with each other for January 2008; for January 2009, however, our modified algorithm yields higher precision and a more reliable SO2 spatial distribution. Finally, other current retrieval error sources are discussed; further research is needed in these areas.

  9. Improved metabolites of pharmaceutical ingredient grade Ginkgo biloba and the correlated proteomics analysis. (United States)

    Zheng, Wen; Li, Ximin; Zhang, Lin; Zhang, Yanzhen; Lu, Xiaoping; Tian, Jingkui


    Ginkgo biloba is an attractive and traditional medicinal plant that has been widely used as a phytomedicine in the prevention and treatment of cardiovascular and cerebrovascular diseases. Flavonoids and terpene lactones are the major bioactive components of Ginkgo, whereas the ginkgolic acids (GAs), with strong allergenic properties, are strictly controlled. In this study, we measured the content of flavonoids and GAs under ultraviolet-B (UV-B) treatment and performed comparative proteomic analyses to identify the differential proteins that occur upon UV-B radiation and might play a crucial role in producing flavonoids and GAs. Our phytochemical analyses demonstrated that UV-B irradiation significantly increased the content of active flavonoids and decreased the content of toxic GAs. We conducted comparative proteomic analysis of both whole-leaf and chloroplast proteins. In total, 27 differential proteins in the whole leaf and 43 differential proteins in the chloroplast were positively identified and functionally annotated. The proteomic data suggested that enhanced UV-B radiation exposure activated antioxidants and stress-responsive proteins and reduced the rate of photosynthesis. We demonstrate that UV-B irradiation improved the pharmaceutical quality of Ginkgo metabolites, particularly by reducing GAs. Given their high UV absorption and antioxidant activities, the flavonoids were likely highly induced as protective molecules following UV-B irradiation.

  10. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems. (United States)

    Wu, Fan; Xu, Lili


    Nowadays, patients can obtain many kinds of medical services online via Telecare Medical Information Systems (TMIS), thanks to the fast development of computer technology, so the security of communication over the network between users and the server is very significant. Authentication plays an important part in protecting information from malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we show that Jiang et al.'s scheme suffers from the weakness of ID uselessness and is vulnerable to off-line password guessing and user impersonation attacks if an attacker compromises the legal user's smart card. It also cannot resist DoS attacks in two cases: after a successful impersonation attack, and upon wrong password input in the password change phase. We therefore propose an improved mutual authentication scheme for a telecare medical information system, in which remote monitoring, review of patients' medical history, and medical consultation can be offered over the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS.
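As a generic illustration of the building blocks such schemes rely on (not the actual Wu-Xu protocol, whose messages also involve smart-card secrets and nonces bound to identities), a minimal HMAC-based mutual challenge-response might look like:

```python
import hashlib
import hmac
import secrets

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    """Keyed response proving possession of the shared key without
    sending the key itself over the network."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def mutual_auth(user_key: bytes, server_key: bytes) -> bool:
    """Both sides challenge each other; authentication succeeds only if
    both hold the same registered key (sketch of the mutuality idea)."""
    c1 = secrets.token_bytes(16)            # server challenges user
    if not hmac.compare_digest(respond(user_key, c1),
                               respond(server_key, c1)):
        return False
    c2 = secrets.token_bytes(16)            # user challenges server
    return hmac.compare_digest(respond(server_key, c2),
                               respond(user_key, c2))
```

Fresh random challenges are what blocks simple replay; the attacks discussed in the abstract target the richer key-derivation and password-change logic that this sketch deliberately leaves out.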

  11. Doing supplements to improve performance in club cycling: a life-course analysis. (United States)

    Stewart, B; Outram, S; Smith, A C T


    Using qualitative life-course and pathway analysis, this article explores the beliefs that serious club cyclists hold about performance improvement, and what they think are appropriate and inappropriate ways of achieving it. We interviewed 11 cyclists from suburban clubs in Melbourne, Australia, and invited them to discuss their approach to training, racing, and supplementation. We found that each of the 11 cyclists was not only committed to the sport, but also took a keen interest in bike technology and training regimes. In addition, they believed that supplement use was integral to meeting the physical and mental demands of their sport, even at club level. They also understood that supplement use, like training regimes, followed a sequential pathway in which the accumulation of capacity, know-how, and knowledge allowed progression to the next level of performance. Like similar studies of club cycling in Europe, this cohort of cyclists balked at using banned substances, but also believed that in order to transition effectively to the elite - that is, professional - level, some additional supplement and drug use was essential.

  12. A New Method for Improving the Discrimination Power and Weights Dispersion in the Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    S. Kordrostami


    Full Text Available The appropriate choice of input-output weights is necessary for a successful DEA model. Generally, if the number of DMUs, n, is less than the number of inputs and outputs, m+s, then many DMUs are rated as efficient and discrimination between DMUs is not possible. Moreover, DEA models are free to choose the weights that suit each DMU best. To resolve the problems that result from this freedom of weights, constraints are imposed on the input-output weights; symmetric weight constraints are one such kind of constraint. In this paper, we present a new model based on multi-criterion data envelopment analysis (MCDEA), developed to moderate the homogeneity of the weight distribution using symmetric weight constraints. We show that the suggested models improve the dispersion of unrealistic input-output weights and increase the discrimination power. Finally, as an application of the new model, we use it to evaluate and rank selected hospitals in Guilan.
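For context, the classic input-oriented CCR multiplier model that such weight constraints are added to is a small linear program per DMU: maximize the weighted outputs of the unit under evaluation, with its weighted inputs normalized to one and no unit allowed to exceed efficiency one. The sketch below solves that baseline model only; the paper's symmetric weight constraints and MCDEA criteria are omitted.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """CCR multiplier-form efficiency of DMU j0.
    X: (n, m) input matrix, Y: (n, s) output matrix."""
    n, m = X.shape
    s = Y.shape[1]
    # decision variables: [v (input weights, m), u (output weights, s)]
    c = np.concatenate([np.zeros(m), -Y[j0]])      # maximize u . y0
    A_ub = np.hstack([-X, Y])                      # u . yj - v . xj <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([X[j0], np.zeros(s)])[None, :]
    b_eq = [1.0]                                   # v . x0 = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + s))
    return -res.fun

# Two DMUs, one input, one output: DMU 0 makes the same output with
# half the input, so it is efficient and DMU 1 scores 0.5.
X = np.array([[2.0], [4.0]])
Y = np.array([[2.0], [2.0]])
```

Because each DMU picks its own most favorable weights here, small samples yield many ties at efficiency 1.0, which is exactly the discrimination problem the symmetric weight constraints address.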

  13. Structure analysis of interstellar clouds: I. Improving the Delta-variance method

    CERN Document Server

    Ossenkopf, V; Stutzki, J


    The Delta-variance analysis has proven to be an efficient and accurate method of characterising the power spectrum of interstellar turbulence. The implementation presently in use, however, has several shortcomings. We propose and test an improved Delta-variance algorithm for two-dimensional data sets, which is applicable to maps with variable error bars and which can be quickly computed in Fourier space. We calibrate the spatial resolution of the Delta-variance spectra. The new Delta-variance algorithm is based on an appropriate filtering of the data in Fourier space. It allows us to distinguish the influence of variable noise from the actual small-scale structure in the maps and it helps in dealing with the boundary problem in non-periodic and/or irregularly bounded maps. We try several wavelets and test their spatial sensitivity using artificial maps with well-known structure sizes. It turns out that different wavelets show different strengths with respect to detecting characteristic structures and spectr...
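The core of a Delta-variance computation, convolving the map with a zero-mean filter of scale L in Fourier space and taking the variance of the result, can be sketched as follows. Periodic boundaries and a simple Mexican-hat filter are assumed; the paper's refinements for variable noise and irregular map edges are omitted.

```python
import numpy as np

def delta_variance(image, lag):
    """Delta-variance at scale `lag`: variance of the map after FFT
    convolution with a zero-mean Mexican-hat filter of that width."""
    ny, nx = image.shape
    y, x = np.mgrid[:ny, :nx]
    r2 = (x - nx // 2) ** 2 + (y - ny // 2) ** 2
    s2 = (lag / 2.0) ** 2
    kernel = (1 - r2 / (2 * s2)) * np.exp(-r2 / (2 * s2))
    kernel -= kernel.mean()                  # enforce zero total weight
    filt = np.real(np.fft.ifft2(np.fft.fft2(image) *
                                np.fft.fft2(np.fft.ifftshift(kernel))))
    return np.var(filt)
```

Evaluating this over a range of lags and fitting a power law to the resulting spectrum is how the method recovers the power-spectrum index of the turbulence.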

  14. A methodology for the analysis and improvement of a firm's competitiveness

    Directory of Open Access Journals (Sweden)

    Jose Celso Contador


    Full Text Available This paper presents a new methodology for the analysis of a group of companies, aiming at explaining and increasing a firm's competitiveness. Based on the model of the fields and weapons of the competition, the methodology distinguishes between business and operational competitive strategies. The former consists of some of the 15 fields of the competition, and the latter consists of the weapons of the competition. Competitiveness is explained through the application of several mathematical variables. The influence of the competitive strategies is statistically evaluated using the Wilcoxon-Mann-Whitney non-parametric test, the t-test, and Pearson's correlation. The methodology was applied to companies belonging to the textile pole of Americana; one of the conclusions reached is that competitiveness is explained by the operational strategy rather than the business strategy. Therefore, to improve competitiveness, a company must intensify its focus on the weapons that are relevant to the fields in which it has decided to compete.

  15. Improved Dynamic Modeling of the Cascade Distillation Subsystem and Analysis of Factors Affecting Its Performance (United States)

    Perry, Bruce A.; Anderson, Molly S.


    The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station Water Processor Assembly to form a complete water recovery system for future missions. A preliminary chemical process simulation was previously developed using Aspen Custom Modeler® (ACM), but it could not simulate thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. This paper describes modifications to the ACM simulation of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version can be used to model thermal startup and predicts the total energy consumption of the CDS. The simulation has been validated for both NaCl solution and pretreated urine feeds and no longer requires retuning when operating parameters change. The simulation was also used to predict how internal processes and operating conditions of the CDS affect its performance. In particular, it is shown that the coefficient of performance of the thermoelectric heat pump used to provide heating and cooling for the CDS is the largest factor in determining CDS efficiency. Intrastage heat transfer affects CDS performance indirectly through effects on the coefficient of performance.

  16. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.


    To support the implementation of the IWRM-Indonesia process in a water-scarce and sanitation-poor region of Central Java (Indonesia), sustainability assessments of several technology options for water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz Association. In the case of water supply, the assessment was based on the life-cycle analysis and life-cycle costing approach. In the sanitation sector, the focus was on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied generally to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences that influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.

  17. Analysis of microbiota on abalone (Haliotis discus hannai) in South Korea for improved product management. (United States)

    Lee, Min-Jung; Lee, Jin-Jae; Chung, Han Young; Choi, Sang Ho; Kim, Bong-Soo


    Abalone is a popular seafood in South Korea; however, because it contains various microorganisms, its ingestion can cause food poisoning. Analysis of the microbiota on abalone can therefore improve understanding of outbreaks and causes of food poisoning and help to better manage seafood products. In this study, we collected a total of 40 abalones in March and July from four different regions known as the largest abalone-producing areas in Korea. The microbiota were analyzed using high-throughput sequencing, and bacterial loads on abalone were quantified by real-time PCR. Over 2700 species were detected in the samples, and Alpha- and Gammaproteobacteria were the predominant classes. The differences in microbiota among regions and at each sampling time were also investigated. Although Psychrobacter was the dominant genus detected on abalone in both March and July, the species compositions differed between the two sampling times. Five potential pathogens (Lactococcus garvieae, Yersinia kristensenii, Staphylococcus saprophyticus, Staphylococcus warneri, and Staphylococcus epidermidis) were detected among the abalone microbiota. In addition, we analyzed the influence of Vibrio parahaemolyticus infection on shifts in abalone microbiota during storage at different temperatures. Although the proportion of Vibrio increased over time in both infected and non-infected abalone, the shifts in microbiota were more dynamic in infected abalone. These results can be used to better understand the potential for food poisoning caused by abalone consumption and to manage abalone products according to the microbiota composition.


    Directory of Open Access Journals (Sweden)

    Indar Sugiarto


    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics that are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++, with the Qt platform currently being used. One of the most important metrics is the so-called software complexity. Applying software complexity calculations using both the McCabe and Halstead methods to the BCI framework, which supports two important types of BCI, namely SSVEP and P300, we found two classes in the framework that are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC less than 20, average complexity around 5, and maximum depth below 10 blocks. Such variables are considered very important for further development of the BCI framework in the future.
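McCabe cyclomatic complexity is essentially one plus the number of independent branch points in a routine. A rough version for Python source (the paper applies the metric to C++/Qt code with dedicated tools) can be written with the standard ast module:

```python
import ast

# Node types that open an extra path through the code
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                ast.ExceptHandler, ast.IfExp)

def mccabe(source: str) -> int:
    """Rough McCabe cyclomatic complexity: 1 + number of branch points.
    Illustrative approximation, not a full control-flow-graph analysis."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return 'neg'
    for i in range(3):
        if i == x:
            return 'small'
    return 'other'
"""
```

Here the two `if` statements and the `for` loop give a complexity of 4; flagging routines whose score exceeds a threshold is the same screening idea the paper applies to its two overly complex classes.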

  19. An Improved Variable Structure Adaptive Filter Design and Analysis for Acoustic Echo Cancellation

    Directory of Open Access Journals (Sweden)

    A. Kar


    Full Text Available In this research, an advanced variable-structure adaptive Multiple Sub-Filter (MSF) based algorithm for single-channel Acoustic Echo Cancellation (AEC) is proposed and analyzed. This work suggests a new and improved way to find the optimum tap-length of the adaptive filter employed for AEC. The structure adaptation, supported by a tap-length-based weight update approach, helps the designed echo canceller maintain a trade-off between the mean square error (MSE) and the time taken to attain the steady-state MSE. The work focuses on replacing the fixed-length sub-filters in existing MSF-based AEC algorithms, which brings refinements in terms of convergence, steady-state error, and tracking over the single-long-filter, different-error, and common-error algorithms. A dynamic structure-selective coefficient update approach to reduce the structural and computational cost of the adaptive design is discussed in the context of the proposed algorithm. Simulated results provide a comparative performance analysis of the proposed variable-structure multiple sub-filter designs and existing fixed tap-length sub-filter based acoustic echo cancellers.
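For reference, the single-long-filter baseline that MSF designs improve upon is typically an NLMS adaptive filter identifying the echo path from the far-end signal. A minimal sketch with a hypothetical three-tap echo path and no near-end talk:

```python
import numpy as np

def nlms_echo_canceller(far_end, mic, taps=64, mu=0.5, eps=1e-8):
    """Baseline fixed-length NLMS echo canceller; the MSF designs in the
    paper split this single long filter into adaptive sub-filters."""
    w = np.zeros(taps)
    out = np.zeros(len(mic))
    for n in range(taps, len(mic)):
        x = far_end[n - taps + 1:n + 1][::-1]   # newest sample first
        e = mic[n] - w @ x                      # residual echo (error)
        w += mu * e * x / (x @ x + eps)         # normalized LMS update
        out[n] = e
    return out, w

rng = np.random.default_rng(0)
far = rng.normal(size=4000)                     # far-end speech stand-in
echo_path = np.array([0.6, 0.3, -0.1])          # hypothetical room response
mic = np.convolve(far, echo_path)[:4000]        # echo-only microphone signal
res, w = nlms_echo_canceller(far, mic)
```

The residual decays toward the steady-state MSE and the leading filter taps converge to the echo path; choosing how many taps to spend doing this is exactly the tap-length optimization the paper targets.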

  20. Particle Morphology Analysis of Biomass Material Based on Improved Image Processing Method. (United States)

    Lu, Zhaolin; Hu, Xiaojuan; Lu, Yao


    Particle morphology, including size and shape, is an important factor that significantly influences the physical and chemical properties of biomass material. Based on image processing technology, a method was developed to process sample images, measure particle dimensions, and analyse the particle size and shape distributions of knife-milled wheat straw, which had been preclassified into five nominal size groups using a mechanical sieving approach. Considering the great variation of particle size, from micrometers to millimeters, powders greater than 250 μm were photographed with a flatbed scanner without zoom function, and the others were photographed using a scanning electron microscope (SEM) with high image resolution. Actual imaging tests confirmed the excellent effect of the backscattered electron (BSE) imaging mode of the SEM. Particle aggregation is an important factor that affects the recognition accuracy of the image processing method, so in sample preparation the singulated-arrangement and ultrasonic-dispersion methods were used to separate powders larger and smaller than the nominal size of 250 μm, respectively, into individual particles. In addition, an image segmentation algorithm based on particle geometrical information was proposed to recognise the finer clustered powders. Experimental results demonstrated that the improved image processing method is suitable for analysing the particle size and shape distributions of ground biomass materials and resolving the size inconsistencies of sieving analysis.
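The measurement step, labeling connected particles in a binarized image and deriving size descriptors such as equivalent diameter, can be sketched with scipy.ndimage. This is a simplified stand-in; the paper's segmentation additionally uses particle geometry to split clustered powders.

```python
import numpy as np
from scipy import ndimage

def particle_stats(binary_image, pixel_size_um=1.0):
    """Label connected particles and return their count and per-particle
    equivalent circular diameters (in micrometers)."""
    labels, n = ndimage.label(binary_image)
    areas = ndimage.sum(binary_image, labels, index=range(1, n + 1))
    diameters = 2.0 * np.sqrt(np.asarray(areas) / np.pi) * pixel_size_um
    return n, diameters

# Synthetic binarized image with two well-separated particles
img = np.zeros((20, 20), dtype=int)
img[2:6, 2:6] = 1        # 16-pixel particle
img[10:12, 10:14] = 1    # 8-pixel particle
n, d = particle_stats(img)
```

Histogramming the diameters over many images yields the size distribution that the paper compares against mechanical sieving.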

  1. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyougn Tae; Moon, Young Min; Choi, Sung Won; Heo, Sun [Korea Advanced Institute Science and Technology, Taejon (Korea, Republic of)


    The loss-of-RHR accident during midloop operation has been shown to be important by probabilistic safety analysis, but the condensation models in RELAP5/MOD3 are not adequate for analyzing midloop operation. To audit and improve the models in RELAP5/MOD3.2, several separate-effect tests have been performed. Twenty-nine sets of reflux condensation data were obtained, and a correlation was developed from these heat transfer coefficient data. For the experiment on direct contact condensation in the hot leg, the apparatus setup is finished and initial experimental data have been obtained. A non-iterative model is used in place of the model in RELAP5/MOD3.2 for the reflux condensation results and evaluates better than the present model, while the results for direct contact condensation in a hot leg are similar to those of the present model. The study of counter-current flow (CCF) and liquid entrainment in a surge line and pressurizer has been selected as the third separate-effect experiment and is in progress.

  2. An Improved Adaptive Multi-way Principal Component Analysis for Monitoring Streptomycin Fermentation Process

    Institute of Scientific and Technical Information of China (English)

    何宁; 王树青; 谢磊


    Multi-way principal component analysis (MPCA) has been successfully applied to monitoring batch and semi-batch processes in much of the chemical industry. An improved MPCA approach, step-by-step adaptive MPCA (SAMPCA), which uses the process variable trajectories to monitor the batch process, is presented in this paper. It does not need to estimate or fill in the unknown part of the process variable trajectory deviation from the current time until the end. The approach is based on an MPCA method that processes the data in a sequential and adaptive manner; the adaptation rate is easily controlled through a forgetting factor that controls the weight of past data in a summation. The algorithm is used to evaluate industrial streptomycin fermentation process data and is compared with traditional MPCA. The results show that the method is more advantageous than MPCA, especially when monitoring a multi-stage batch process in which the latent vector structure can change at several points during the batch.
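The adaptive ingredient, down-weighting past data with a forgetting factor before extracting principal components, can be sketched as follows. The multi-way batch unfolding and control limits of MPCA are omitted, and the data are synthetic.

```python
import numpy as np

def adaptive_pca_monitor(samples, lam=0.95, n_pc=2):
    """Recursively update a covariance estimate with forgetting factor
    `lam` and return the leading eigenvalues/principal directions, so the
    latent structure can track stage changes during a batch."""
    d = samples.shape[1]
    C = np.eye(d)
    for x in samples:
        C = lam * C + (1 - lam) * np.outer(x, x)   # old data fades out
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1]
    return vals[order][:n_pc], vecs[:, order[:n_pc]]

rng = np.random.default_rng(0)
# variation dominated by one direction, mimicking a dominant latent vector
scores = rng.normal(size=(300, 1)) * 3.0
data = scores @ np.array([[1.0, 1.0, 0.0]]) + 0.1 * rng.normal(size=(300, 3))
vals, vecs = adaptive_pca_monitor(data)
```

Because the covariance update geometrically discounts old samples, the extracted latent vectors follow a stage change within roughly 1/(1 - lam) samples, which is the behavior SAMPCA exploits for multi-stage batches.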


    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin


    Full Text Available This article considers one important direction of development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. The market for event management in Russia is still taking shape, so its impact on the socio-economic development of the regions and of Russia as a whole is minimal, and that influence has not been analysed. The problem is most pressing in the regions of Russia that specialise in creating event-oriented tourist-recreational clusters. The article analyses the existing event management market and the functions of event tourism, proposes ways to improve the efficiency of event management, and offers recommendations for event organizers in the regions. It shows the specific role of event tourism within national tourism and outlines organizational and methodological recommendations for its formation in the regions of Russia and for the creation of an effective management system at the regional level. The purpose of this article is to analyze the emerging Russian event tourism market and its specifics. On the basis of these studies, the patterns of the new market are considered and its impact on the modern national tourism industry is assessed. Methodology. Comparative and economic-statistical analysis methods are used. Conclusions/significance. The practical importance of this article lies in resolving a contradiction in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation, and tourist trips to these events take place, yet event tourism as such does not exist. In all regions there is domestic and inbound tourism, but it has nothing to do with event tourism. The article's practical conclusions demonstrate the need to adapt the

  4. Genre Analysis and Writing Skill: Improving Iranian EFL Learners Writing Performance through the Tenets of Genre Analysis

    Directory of Open Access Journals (Sweden)

    Nazanin Naderi Kalali


    Full Text Available The main thrust of this study was to determine whether genre-based instruction improves the writing proficiency of Iranian EFL learners. To this end, 30 homogeneous Iranian BA learners studying English at Islamic Azad University, Bandar Abbas Branch, were selected as participants by means of a version of the TOEFL test used as a proficiency test. The selected participants, 15 females and 15 males, were randomly divided into experimental and control groups. Both groups were asked to write on a topic set by the researcher, which served as the pre-test. The students' writing was scored using a holistic scoring procedure. The subjects then received sixteen hours of instruction, the experimental group through a genre-based pedagogy and the control group through traditional methodology, followed by a post-test in which the subjects were asked to write on the same topic as before instruction. The post-test writings were also scored using the holistic scoring procedure. In analyzing the data, a t-test was used to compare the performances of the two groups. A statistically significant difference was found between the writing ability of participants who received genre-based instruction and those who did not. The study, however, did not find any significant role for gender. Keywords: genre analysis, writing skill, holistic scoring procedure, pre-test, post-test, t-test
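The group comparison described above is a standard independent-samples t-test. A minimal sketch with SciPy; the holistic scores below are invented for illustration, since the study's data are not given:

```python
import numpy as np
from scipy import stats

# Hypothetical post-test holistic writing scores (0-100) for the
# two groups of 15 learners each; numbers are illustrative only.
experimental = np.array([78, 85, 80, 90, 74, 88, 82, 79, 86, 91,
                         77, 84, 81, 89, 83])
control      = np.array([70, 72, 68, 75, 71, 66, 74, 69, 73, 67,
                         76, 70, 72, 65, 71])

# Independent two-sample t-test on the group means
t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
if p_value < 0.05:
    print("Reject H0: the group means differ significantly")
```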

  5. Hyperspectral Analysis of Paleoseismic Trench Stratigraphy: Toward Improving the Recognition and Documentation of Past Earthquakes (United States)

    Ragona, D. E.; Minster, B.; Fialko, Y.; Rockwell, T.


    We are conducting a pilot project to use hyperspectral imagery to assist in the recognition and documentation of paleoseismic events in trench exposures. Recent advances in hyperspectral imagery suggest that stratigraphy can be analyzed in much the same way as AVIRIS imagery of Earth's surface. In principle, hyperspectral images may be able to elucidate and record otherwise-poor stratigraphy in some exposures, thereby improving the information that can be gleaned from a paleoseismic site. This technique may also eliminate some problems in interpretation of the earthquake history of a site by illuminating details of the stratigraphy and structure that are not apparent to the human eye, such as unique unit correlations across complicated fault ruptures. The trench site chosen for this study is located at Hog Lake in the Anza seismic gap along the San Jacinto Fault in southern California. The site was selected because of its detailed, well-defined stratigraphy. The method adopted was to obtain a 50 cm square matrix of samples that could be used to generate a low-resolution image of the sampled area, in the sense that each sample represents a single pixel. The samples were collected 2.5 cm apart in a square matrix of 20x20 samples. Each of the 400 samples collected was stored in a PVC or metallic cylinder of 3/4" or 1/2" diameter. All samples were spectrally analyzed at JPL using a FieldSpec Pro instrument that measures radiation in the 350-2,500 nm wavelength window. Five measurements of each sample were performed, along with measurements of the radiation reflected by a reference surface (Spectralon), under natural light and clear-sky conditions. The data obtained were then processed to obtain reflectance spectra for all samples. Principal Component Analysis was used to create a pixelated image from the three dominant components. That image shows promising similarity with the standard digital picture of the sampled trench wall. However, large random measurement
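The imaging step described, building a pixelated false-colour image from the three dominant principal components of the sampled spectra, can be sketched as follows. The spectra here are synthetic stand-ins for the FieldSpec reflectance data, and the band count is an assumption:

```python
import numpy as np

# A 20x20 grid of reflectance spectra (one per sampled "pixel"),
# reduced to its three dominant principal components and rescaled
# into an RGB false-colour image. Synthetic data for illustration.
rng = np.random.default_rng(1)
n_rows, n_cols, n_bands = 20, 20, 215
spectra = rng.random((n_rows * n_cols, n_bands))

centered = spectra - spectra.mean(axis=0)
# PCA via SVD: rows of Vt are the principal directions
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:3].T            # project onto the top 3 PCs

# rescale each component to [0, 1] and reshape to an image
lo, hi = scores.min(axis=0), scores.max(axis=0)
rgb = (scores - lo) / (hi - lo)
image = rgb.reshape(n_rows, n_cols, 3)
```

Each sample cylinder becomes one pixel, so the resulting 20x20 RGB array can be displayed directly next to a photograph of the trench wall for comparison.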

  6. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyoung Tae; Moon, Young Min; Choi, Sung Won; Hwang, Do Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)


    The direct-contact condensation heat transfer coefficients are obtained experimentally under the following conditions: pure steam and steam in the presence of noncondensable gas; horizontal and slightly inclined pipes; and cocurrent and countercurrent stratified flow with water. An empirical correlation for the liquid Nusselt number is developed for the slightly inclined pipe with cocurrent stratified flow. Several models in the RELAP5/MOD3.2 code are modified: the wall friction coefficient, the interfacial friction coefficient, the correlation for direct-contact condensation with noncondensable gases, and the correlation for wall film condensation. As a result, RELAP5/MOD3.2 is improved, and the present experimental data are used to evaluate the improved code. The standard RELAP5/MOD3.2 code is also modified using non-iterative modeling, a mechanistic model that does not require interfacial information such as the interfacial temperature. The modified RELAP5/MOD3.2 code is used to simulate the horizontally stratified in-tube condensation experiment, which represents the direct-contact condensation phenomena in the hot leg of a nuclear reactor. The modeling capabilities of the modified code and of the standard code are assessed using several hot-leg condensation experiments. The modified code gives better predictions of the local experimental data for liquid void fraction and interfacial heat transfer coefficient than the standard code. For the separate effect test of the thermal-hydraulic phenomena in the pressurizer, a scaling analysis is performed to establish similarity between the Korea Standard Nuclear Power Plant (KSNPP) and the present experimental facility. The diameters and lengths of the hot leg, the surge line and the pressurizer are scaled down with similitude of CCFL and velocity; the ratio of gas flow rate is 1/25.
The experimental facility is composed of the air-water supply tank, the horizontal pipe, the surge line and the

  7. Improvement of the quality of work in a biochemistry laboratory via measurement system analysis. (United States)

    Chen, Ming-Shu; Liao, Chen-Mao; Wu, Ming-Hsun; Lin, Chih-Ming


    An adequate and continuous monitoring of operational variations can effectively reduce the uncertainty and enhance the quality of laboratory reports. This study applied the evaluation rule of the measurement system analysis (MSA) method to estimate the quality of work conducted in a biochemistry laboratory. Using the gauge repeatability & reproducibility (GR&R) approach, variations in quality control (QC) data among medical technicians in conducting measurements of five biochemical items, namely, serum glucose (GLU), aspartate aminotransferase (AST), uric acid (UA), sodium (Na) and chloride (Cl), were evaluated. The measurements of the five biochemical items showed different levels of variance among the different technicians, with the variances in GLU measurements being higher than those for the other four items. The ratios of precision-to-tolerance (P/T) for Na, Cl and GLU were all above 0.5, implying inadequate gauge capability. The product variation contribution of Na was large (75.45% and 31.24% in normal and abnormal QC levels, respectively), which showed that the impact of insufficient usage of reagents could not be excluded. With regard to reproducibility, high contributions (of more than 30%) of variation for the selected items were found. These high operator variation levels implied that the possibility of inadequate gauge capacity could not be excluded. The ANOVA of GR&R showed that the operator variations in GLU measurements were significant (F=5.296, P=0.001 in the normal level and F=3.399, P=0.015 in the abnormal level, respectively). In addition to operator variations, product variations of Na were also significant for both QC levels. The heterogeneity of variance for the five technicians showed significant differences for the Na and Cl measurements in the normal QC level. The accuracy of QC for five technicians was identified for further operational improvement. 
This study revealed that MSA can be used to evaluate product and personnel errors and to
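The precision-to-tolerance ratio used above to flag inadequate gauge capability can be sketched numerically. The P/T formula (k·sigma_GRR over the specification width, with k = 6) is standard GR&R practice; the numbers below are illustrative, not from the study:

```python
import numpy as np

def pt_ratio(sigma_repeatability, sigma_reproducibility,
             lower_spec, upper_spec, k=6.0):
    """Precision-to-tolerance ratio from a gauge R&R study:
    P/T = k * sigma_GRR / (USL - LSL). The authors treat values
    above 0.5 as indicating inadequate gauge capability."""
    sigma_grr = np.sqrt(sigma_repeatability**2 + sigma_reproducibility**2)
    return k * sigma_grr / (upper_spec - lower_spec)

# Illustrative sodium-like numbers (mmol/L); not the study's data.
ratio = pt_ratio(sigma_repeatability=1.2, sigma_reproducibility=0.9,
                 lower_spec=130.0, upper_spec=146.0)
print(f"P/T = {ratio:.3f}", "-> inadequate" if ratio > 0.5 else "-> adequate")
```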

  8. Use of failure mode effect analysis (FMEA) to improve medication management process. (United States)

    Jain, Khushboo


    Purpose Medication management is a complex process at high risk of error, with life-threatening consequences. The focus should be on devising strategies to avoid errors and make the process self-reliable by ensuring prevention of errors and/or error detection at subsequent stages. The purpose of this paper is to use failure mode effect analysis (FMEA), a systematic proactive tool, to identify the likelihood of and the causes for the process to fail at various steps, and to prioritise them in order to devise risk reduction strategies that improve patient safety. Design/methodology/approach The study was designed as an observational analytical study of the medication management process in the inpatient area of a multi-speciality hospital in Gurgaon, Haryana, India. A team was formed to study the complex process of medication management in the hospital, and the FMEA tool was used. Corrective actions were developed based on the prioritised failure modes, then implemented and monitored. Findings The percentage distribution of medication errors observed by the team was highest for transcription errors (37 per cent), followed by administration errors (29 per cent), indicating the need to identify the causes and effects of their occurrence. In all, 11 failure modes were identified, of which the top five were prioritised based on the risk priority number (RPN). The process was repeated after corrective actions were taken, which resulted in reductions of about 40 per cent on average, and up to around 60 per cent, in the RPN of the prioritised failure modes. Research limitations/implications FMEA is a time-consuming process and requires a multidisciplinary team with a good understanding of the process being analysed. FMEA only helps in identifying the possibilities of a process to fail; it does not eliminate them, and additional effort is required to develop action plans and implement them.
Frank discussion and agreement among the team members is required not only for successfully conducting
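The RPN prioritisation at the heart of FMEA is simply severity x occurrence x detection, each typically scored 1-10, with failure modes ranked by the product. A minimal sketch with invented failure modes and scores (not the study's):

```python
# FMEA prioritisation sketch: RPN = severity * occurrence * detection.
# Failure modes and scores below are hypothetical illustrations.
failure_modes = [
    ("transcription error",    8, 7, 6),
    ("administration error",   9, 6, 5),
    ("dispensing error",       7, 4, 4),
    ("illegible prescription", 6, 5, 3),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda item: item[1], reverse=True,
)
for name, rpn in ranked:
    print(f"{name:22s} RPN = {rpn}")
```

After corrective actions, the scoring is repeated and the percentage drop in each prioritised mode's RPN quantifies the improvement, as reported above.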

  9. Numerical Analysis of the Unsteady Propeller Performance in the Ship Wake Modified By Different Wake Improvement Devices

    Directory of Open Access Journals (Sweden)

    Bugalski Tomasz


    Full Text Available The paper presents a summary of the results of the numerical analysis of unsteady propeller performance in the non-uniform ship wake modified by different wake improvement devices. The analysis is performed using the lifting surface program DUNCAN for unsteady propeller analysis. The object of the analysis is a 7000-ton chemical tanker, for which four different types of wake improvement devices have been designed: two vortex generators, a pre-swirl stator, and a boundary layer alignment device. These produced five different cases of the ship wake structure: the original hull, and the hull equipped alternatively with the four wake improvement devices. Two different propellers were analyzed in these five wake fields, one being the original reference propeller P0 and the other a specially designed, optimized propeller P3. The analyzed parameters were the patterns of unsteady cavitation on the propeller blades, the harmonics of pressure pulses generated by the cavitating propellers at selected points, and the fluctuating bearing forces on the propeller shaft. Some of the calculated cavitation phenomena were compared with experimental results. The objective of the calculations was to demonstrate the differences in the calculated unsteady propeller performance resulting from the application of the different wake improvement devices. The analysis and discussion of the results, together with the appropriate conclusions, are included in the paper.

  10. Molecular structure based property modeling: Development/improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan;


    The objective of this work is to develop a method for performing property-data-model analysis so that knowledge of properties can be used efficiently in the development/improvement of property prediction models. The method includes: (i) analysis of property data and checks of its consistency; (ii) selection of the most appropriate form of the property model; (iii) selection of the data set for performing parameter regression and uncertainty analysis; and (iv) analysis of model prediction errors so that the necessary corrective steps can be taken to improve the accuracy and the reliability of the property models. The method can be applied to a wide range of properties of pure compounds. In this work, its application is illustrated for the property modeling of normal melting point, enthalpy of fusion, enthalpy of formation, and critical temperature. For all the properties listed above, it has been possible to achieve...

  11. Stream gradient Hotspot and Cluster Analysis (SL-HCA) for improving the longitudinal profiles metrics (United States)

    Troiani, Francesco; Piacentini, Daniela; Della Seta, Marta


    The analysis conducted on 52 clusters of high and very high Gi* values indicates that mass movement of slope material is the dominant process producing over-steepened long profiles along connected streams, whereas litho-structure accounts for the main anomalies along disconnected streams. Tectonic structures generally give rise to the largest clusters. Our results demonstrate that SL-HCA maps have the same potential as lithologically-filtered SL maps for detecting knickzones due to hillslope processes and/or tectonic structures. The reduced-complexity model derived from the SL-HCA approach greatly improves the readability of the morphometric outcomes, and thus the interpretation at a regional scale of the geological-geomorphological meaning of over-steepened segments of long profiles. SL-HCA maps are useful for investigating and better interpreting knickzones in regions poorly covered by geological data and where field surveys are difficult to perform.
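The hot-spot statistic underlying SL-HCA is the Getis-Ord Gi*. A minimal sketch of its computation on a toy one-dimensional stream profile; the weights scheme and data are illustrative, not taken from the study:

```python
import numpy as np

def getis_ord_gi_star(values, weights):
    """Getis-Ord Gi* hot-spot statistic for each observation.
    `weights` is an (n, n) spatial-weights matrix that includes
    the focal observation itself (w_ii = 1 for the starred form)."""
    values = np.asarray(values, dtype=float)
    n = values.size
    mean = values.mean()
    s = np.sqrt((values**2).mean() - mean**2)
    wx = weights @ values                 # local weighted sums
    w_sum = weights.sum(axis=1)
    w_sq = (weights**2).sum(axis=1)
    num = wx - mean * w_sum
    den = s * np.sqrt((n * w_sq - w_sum**2) / (n - 1))
    return num / den

# Toy SL-index profile with one obvious high anomaly; the
# neighbourhood is the point itself plus immediate neighbours.
sl = np.array([1.0, 1.1, 0.9, 5.0, 5.2, 4.8, 1.0, 0.9, 1.1, 1.0])
n = sl.size
W = np.eye(n)
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
z = getis_ord_gi_star(sl, W)
print(np.argmax(z))   # prints 4: the centre of the high cluster
```

High positive Gi* values mark clusters of anomalously steep gradient (hot spots); in the study these clusters are then interpreted against lithology and tectonics.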

  12. Metal Foam Analysis: Improving Sandwich Structure Technology for Engine Fan and Propeller Blades (United States)

    Fedor, Jessica L.


    The Life Prediction Branch of the NASA Glenn Research Center is searching for ways to construct aircraft and rotorcraft engine fan and propeller blades that are lighter and less costly. One possible design is a sandwich structure composed of two metal face sheets and a metal foam core. The face sheets would carry the bending loads and the foam core would resist the transverse shear loads. Metal foam is ideal because of its low density and energy absorption capabilities, making the structure lighter yet still stiff. The material chosen for the face sheets and core was 17-4PH stainless steel, which is easy to make and has appealing mechanical properties. This material can be made inexpensively compared to titanium and polymer matrix composites, the two current fan blade alternatives. Initial tests were performed on design models, including vibration and stress analysis. These tests revealed that the design is competitive with existing designs; however, some problems were apparent that must be addressed before it can be implemented in new technology. The foam did not hold up as well as expected under stress. This could be due to a number of issues, but was most likely a result of a large number of pores within the steel that weakened the structure. The brazing between the face sheets and the foam was also identified as a concern. The braze did not hold up well under shear stress, causing the foam to break away from the face sheets. My role in this project was to analyze different options for improving the design. I primarily spent my time examining various foam samples, created with different sintering conditions, to see which exhibited the most favorable characteristics for our purpose. Methods of analysis that I employed included examining strut integrity under a microscope, counting the number of cells per inch, measuring the density, testing the microhardness, and testing the strength under compression.
Shear testing will also be done to examine

  13. Structural analysis of char by Raman spectroscopy: Improving band assignments through computational calculations from first principles

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Matthew W.; Dallmeyer, Ian; Johnson, Timothy J.; Brauer, Carolyn S.; McEwen, Jean-Sabin; Espinal, Juan F.; Garcia-Perez, Manuel


    Raman spectroscopy is a powerful tool for the characterization of many carbon species. The complex heterogeneous nature of chars and activated carbons has confounded complete analysis due to the additional shoulders observed on the D band and the high-intensity valley between the D and G bands. In this paper the effects of various vacancy and substitution defects have been systematically analyzed via molecular modeling using density functional theory (DFT), together with how these defects are manifested in the calculated gas-phase Raman spectra. The accuracy of these calculations was validated by comparison with (solid-phase) experimental spectra, with a small correction factor applied to improve the accuracy of the frequency predictions. The spectroscopic effects on the char species are best understood in terms of a reduced symmetry compared to a “parent” coronene molecule. Based upon the simulation results, the shoulder observed in chars near 1200 cm-1 has been assigned to the totally symmetric A1g vibrations of various small polyaromatic hydrocarbons (PAHs), as well as those containing rings of seven or more carbons. Intensity between 1400 cm-1 and 1450 cm-1 is assigned to A1g-type vibrations present in small PAHs, especially those containing cyclopentane rings. Finally, band intensity between 1500 cm-1 and 1550 cm-1 is ascribed to predominantly E2g vibrational modes in strained PAH systems. A total of ten potential bands have been assigned between 1000 cm-1 and 1800 cm-1. These fitting parameters have been used to deconvolute a thermoseries of cellulose chars produced by pyrolysis at 300-700 °C. The results of the deconvolution show consistent growth of PAH clusters with temperature, development of non-benzyl rings as temperature increases, and loss of oxygenated features between 400 °C and 600 °C
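The band deconvolution described above amounts to a least-squares fit of overlapping bands to the measured spectrum. This minimal sketch fits only the two classic D and G bands to synthetic data, whereas the paper assigns ten bands; band shapes, centres and widths here are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, width):
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

def two_bands(x, a1, c1, w1, a2, c2, w2):
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

# Synthetic char-like spectrum: D band near 1350 cm-1 and G band
# near 1590 cm-1 plus a little noise.
x = np.linspace(1000, 1800, 400)
rng = np.random.default_rng(2)
y = two_bands(x, 1.0, 1350.0, 60.0, 0.8, 1590.0, 40.0) \
    + rng.normal(0, 0.01, x.size)

p0 = [1.0, 1340.0, 50.0, 1.0, 1580.0, 50.0]   # initial guesses
popt, _ = curve_fit(two_bands, x, y, p0=p0)
print(f"D centre: {popt[1]:.0f} cm-1, G centre: {popt[4]:.0f} cm-1")
```

In a real deconvolution each assigned band would get its own component with constrained centre, and the fitted areas would be tracked across the 300-700 °C thermoseries.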

  14. Improving reproducibility of VEP recording in rats: electrodes, stimulus source and peak analysis. (United States)

    You, Yuyi; Klistorner, Alexander; Thie, Johnson; Graham, Stuart L


    The aims of this study were to evaluate and improve the reproducibility of visual evoked potential (VEP) measurement in rats and to develop a mini-Ganzfeld stimulator for rat VEP recording. VEPs of Sprague-Dawley rats were recorded from one randomly selected eye on three separate days within a week, and the recordings were repeated three times on the first day to evaluate intrasession repeatability and intersession reproducibility. The VEPs were recorded with subdermal needle and implanted skull screw electrodes, respectively, to evaluate the effect of electrode configuration on VEP reproducibility. We also designed a mini-Ganzfeld stimulator for rats, which provided better eye isolation than conventional visual stimuli such as flash strobes and large Ganzfeld systems. The VEP responses from the mini-Ganzfeld were compared with a PS33-PLUS photic strobe and a single light-emitting diode (LED). The latencies of P1, N1, P2, N2 and P3 and the amplitude of each component were measured and analysed. Intrasession and intersession within-subject standard deviations (Sw), coefficient of variation, repeatability (R95) and intraclass correlation coefficient (ICC) were calculated. The VEPs recorded using the implanted skull electrodes showed significantly larger amplitude and higher reproducibility compared to the needle electrodes for the early waves. The mean intrasession and intersession ICCs were 0.96 and 0.86 for the early peaks. Using a combination of skull screw electrodes, the mini-Ganzfeld stimulator and early-peak analysis, we achieved high reproducibility in rat VEP measurement. The latencies of the early peaks of rat VEPs were more consistent, which may be due to their generation in the primary visual cortex via the retino-geniculate fibres.

  15. Improved soil carbonate determination by FT-IR and X-ray analysis (United States)

    Bruckman, Viktor; Wriessnig, Karin


    In forest soils on calcareous parent material, carbonate is a key component which influences both chemical and physical soil properties and thus fertility and productivity. At low organic carbon contents it is difficult to distinguish between organic and inorganic carbon (carbonate) in soils. The common gas-volumetric method to determine carbonate has a number of disadvantages. We hypothesize that a combination of two spectroscopic methods, which account for different forms of carbonate, can be used to model soil carbonate in our study region. Fourier Transform Mid-Infrared Spectroscopy (FT-MIR) was combined with X-ray diffraction (XRD) to develop a model based on partial least squares regression (PLSR). Results of the gas-volumetric Scheibler method were corrected for the calcite/dolomite ratio. The best model performance was achieved when we combined the two analytical methods using four principal components. The root mean squared error of prediction decreased from 13.07 to 11.57, while full cross-validation explained 94.5% of the variance of the carbonate content. This is the first time that a combination of the proposed methods has been used to predict carbonate in forest soils, offering a simple and cheap method to precisely estimate soil carbonate contents while increasing accuracy in comparison to spectroscopic approaches proposed earlier. This approach has the potential to complement or substitute gas-volumetric methods, specifically in study areas with low soil heterogeneity and similar parent material or in long-term monitoring by consecutive sampling. Reference: Bruckman, V. and Wriessnig, K., Improved soil carbonate determination by FT-IR and X-ray analysis. Environmental Chemistry Letters, 2012, p. 1-6. DOI: 10.1007/s10311-012-0380-4

  16. T4 RNA Ligase 2 truncated active site mutants: improved tools for RNA analysis

    Directory of Open Access Journals (Sweden)

    Zhuang Fanglei


    Full Text Available Abstract Background T4 RNA ligases 1 and 2 are useful tools for RNA analysis. Their use upstream of RNA analyses such as high-throughput RNA sequencing and microarrays has recently increased their importance. The truncated form of T4 RNA ligase 2, comprising amino acids 1-249 (T4 Rnl2tr), is an attractive tool for attachment of adapters or labels to RNA 3'-ends. Compared to T4 RNA ligase 1, T4 Rnl2tr has a decreased ability to ligate 5'-PO4 ends in single-stranded RNA ligations, and compared to the full-length T4 Rnl2, T4 Rnl2tr has an increased activity for joining 5'-adenylated adapters to RNA 3'-ends. The combination of these properties allows adapter attachment to RNA 3'-ends with reduced circularization and concatemerization of substrate RNA. Results With the aim of further reducing unwanted side ligation products, we substituted active site residues, known to be important for the adenylyltransferase steps of the ligation reaction, in the context of T4 Rnl2tr. We characterized the variant ligases for the formation of unwanted ligation side products and for activity in the strand-joining reaction. Conclusions Our data demonstrate that lysine 227 is a key residue facilitating adenylyl transfer from adenylated ligation donor substrates to the ligase. This reversal of the second step of the ligation reaction correlates with the formation of unwanted ligation products. Thus, T4 Rnl2tr mutants containing the K227Q mutation are useful for reducing undesired ligation products. We furthermore report optimal conditions for the use of these improved T4 Rnl2tr variants.

  17. Analysis of first flush to improve the water quality in rainwater tanks. (United States)

    Kus, B; Kandasamy, J; Vigneswaran, S; Shon, H K


    Although most Australians receive their domestic supply from reticulated mains or town water, there are vast areas with very low population densities and few reticulated supplies. In many of these areas rainwater collected in tanks is the primary source of drinking water. Heavy metals have recently become a concern, as their concentrations in rainwater tanks were found to exceed the recommended levels for human consumption. Rainwater storage tanks also accumulate contaminants and sediments that settle to the bottom. Although not widely acknowledged, small amounts of contaminants such as lead in rainwater used for drinking may have a cumulative, toxic effect on human health over a lifetime; such exposures are among the factors underlying many of the chronic illnesses that are becoming increasingly common in contemporary society. The paper reports on a study which is part of a project that aims to develop a cost-effective in-line filtration system to improve water quality in rainwater tanks. To enable this, the characteristics of rainwater need to be known; one component of this characterization is to observe the effects of the first flush on a rainwater tank. Samples of roof runoff from the first few millimetres of rain, collected from an urban residential roof in the Sydney Metropolitan Area, were analysed. The results show that bypassing the first 2 mm of rainfall gives water compliant with the Australian Drinking Water Guidelines (ADWG) standards for most water quality parameters. The parameters that did not comply were lead and turbidity, which required bypassing approximately the first 5 mm of rainfall to meet ADWG standards. Molecular weight distribution (MWD) analysis showed that the concentration of rainwater organic matter (RWOM) decreased with increasing amount of roof runoff.

  18. Improved neutron kinetics for coupled three-dimensional boiling water reactor analysis (United States)

    Akdeniz, Bedirhan

    The need for a more accurate method of modelling cross section variations for off-nominal core conditions is becoming an important issue with the increased use of coupled three-dimensional (3-D) thermal-hydraulics/neutronics simulations. In traditional reactor core analysis, thermal reactor core calculations are customarily performed with 3-D two-group nodal diffusion methods. Steady-state multi-group transport theory calculations on heterogeneous single assembly domains subject to reflective boundary conditions are normally used to prepare the equivalent two-group spatially homogenized nodal parameters. For steady-state applications, the equivalent nodal parameters are theoretically well-defined; but, for transient applications, the definition of the nodal kinetics parameters, in particular, delayed neutron precursor data is somewhat unclear. The fact that delayed neutrons are emitted at considerably lower energies than prompt neutrons and that this difference cannot be accounted for in a two-group representation is of particular concern. To compensate for this inherent deficiency of the two-group model a correction is applied to the nodal values of the delayed neutron fractions; however, the adequacy of this correction has never been tested thoroughly for Boiling Water Reactor (BWR) applications, especially where the instantaneous thermal-hydraulic conditions play an important role on the core neutron kinetics calculations. This thesis proposes a systematic approach to improve the 3-D neutron kinetics modelling in coupled BWR transient calculations by developing, implementing and validating methods for consistent generation of neutron kinetics and delayed neutron data for such coupled thermal-hydraulics/neutronics simulations.

  19. Synthetic analysis of Tc and Hc2 of NbTi/Ti multilayers based on improved proximity effect theory (United States)

    Obi, Y.; Ikebe, M.; Takanaka, K.; Fujimori, H.


    Tc and Hc2 of NbTi/Ti multilayers have been calculated based on an improved approximation for the proximity effect. The analysis reveals that the agreement between calculation and experiment is satisfactory, apart from some inevitable scatter due to sample quality.

  20. Systems Thinking Tools for Improving Evidence-Based Practice: A Cross-Case Analysis of Two High School Leadership Teams (United States)

    Kensler, Lisa A. W.; Reames, Ellen; Murray, John; Patrick, Lynne


    Teachers and administrators have access to large volumes of data but research suggests that they lack the skills to use data effectively for continuous school improvement. This study involved a cross-case analysis of two high school leadership teams' early stages of evidence-based practice development; differing forms of external support were…

  1. Recommendations to improve imaging and analysis of brain lesion load and atrophy in longitudinal studies of multiple sclerosis

    DEFF Research Database (Denmark)

    Vrenken, H; Jenkinson, M; Horsfield, M A;


    Focal lesions and brain atrophy are the most extensively studied aspects of multiple sclerosis (MS), but the image acquisition and analysis techniques used can be further improved, especially those for studying within-patient changes of lesion load and atrophy longitudinally. Improved accuracy and sensitivity will reduce the number of patients required to detect a given treatment effect in a trial and, ultimately, will allow reliable characterization of individual patients for personalized treatment. Based on open issues in the field of MS research and the current state of the art in magnetic resonance image analysis methods for assessing brain lesion load and atrophy, this paper makes recommendations to improve these measures for longitudinal studies of MS. Briefly, they are: (1) images should be acquired using 3D pulse sequences, with near-isotropic spatial resolution and multiple image...

  2. Tailoring quality improvement interventions to identified barriers: a multiple case analysis.

    NARCIS (Netherlands)

    Bosch, M.; Weijden, T. van der; Wensing, M.J.P.; Grol, R.P.T.M.


    RATIONALE, AIMS AND OBJECTIVES: The prevailing view on implementation interventions to improve the organization and management of health care is that the interventions should be tailored to potential barriers. Ideally, possible barriers are analysed before the quality improvement interventions are d

  3. Cost savings associated with improving appropriate and reducing inappropriate preventive care: cost-consequences analysis

    Directory of Open Access Journals (Sweden)

    Baskerville Neill


    Full Text Available Abstract Background Outreach facilitation has been proven successful in improving the adoption of clinical preventive care guidelines in primary care practice. The net costs and savings of delivering such an intensive intervention need to be understood. We wanted to estimate the proportion of a facilitation intervention cost that is offset and the potential for savings by reducing inappropriate screening tests and increasing appropriate screening tests in 22 intervention primary care practices affecting a population of 90,283 patients. Methods A cost-consequences analysis of one successful outreach facilitation intervention was done, taking into account the estimated cost savings to the health system of reducing five inappropriate tests and increasing seven appropriate tests. Multiple data sources were used to calculate costs and cost savings to the government. The cost of the intervention and costs of performing appropriate testing were calculated. Costs averted were calculated by multiplying the number of tests not performed as a result of the intervention by their unit costs. Further downstream cost savings were determined by calculating the direct costs associated with the number of false positive test follow-ups avoided. Treatment costs averted as a result of increasing appropriate testing were similarly calculated. Results The total cost of the intervention over 12 months was $238,388 and the cost of increasing the delivery of appropriate care was $192,912 for a total cost of $431,300. The savings from reduction in inappropriate testing were $148,568 and from avoiding treatment costs as a result of appropriate testing were $455,464 for a total savings of $604,032. On a yearly basis the net cost saving to the government is $191,733 per year (2003 $Can), equating to $3,687 per physician or $63,911 per facilitator, an estimated 40% return on the investment in the intervention and the delivery of appropriate preventive care. 
Conclusion Outreach facilitation is more expensive

  4. Turbulence Analysis Upstream of a Wind Turbine: a LES Approach to Improve Wind LIDAR Technology (United States)

    Calaf, M.


    upstream, much can be learned about the incoming turbulence, hence allowing improved wind turbine readjustments. Time correlations with the upstream incoming turbulence have been computed through an entire diurnal cycle, and a non-dimensional analysis shows the existence of different behaviors throughout the day.

  5. [Improvement of transrectal ultrasound. Artificial neural network analysis (ANNA) in detection and staging of prostatic carcinoma]. (United States)

    Loch, T; Leuschner, I; Genberg, C; Weichert-Jacobsen, K; Küppers, F; Retz, M; Lehmann, J; Yfantis, E; Evans, M; Tsarev, V; Stöckle, M


    As a result of the enhanced clinical application of prostate specific antigen (PSA), an increasing number of men are becoming candidates for prostate cancer work-up. A high PSA value over 20 ng/ml is a good indicator of the presence of prostate cancer, but within the range of 4-10 ng/ml, it is rather unreliable. Even more alarming is the fact that prostate cancer has been found in 12-37% of patients with a "normal" PSA value of under 4 ng/ml (Hybritech). While PSA is capable of indicating a statistical risk of prostate cancer in a defined patient population, it is not able to localize cancer within the prostate gland or guide a biopsy needle to a suspicious area. This necessitates an additional effective diagnostic technique that is able to localize or rule out a malignant growth within the prostate. The methods available for the detection of these prostate cancers are digital rectal examination (DRE) and Transrectal ultrasound (TRUS). DRE is not suitable for early detection, as about 70% of the palpable malignancies have already spread beyond the prostate. The classic problem of visual interpretation of TRUS images is that hypoechoic areas suspicious for cancer may be either normal or cancerous histologically. Moreover, about 25% of all cancers have been found to be isoechoic and therefore not distinguishable from normal-appearing areas. None of the current biopsy or imaging techniques are able to cope with this dilemma. Artificial neural networks (ANN) are complex nonlinear computational models, designed much like the neuronal organization of a brain. These networks are able to model complicated biologic relationships without making assumptions based on conventional statistical distributions. Applications in Medicine and Urology have been promising. 
One example of such an application will be discussed in detail: A new method of Artificial Neural Network Analysis (ANNA) was employed in an attempt to obtain existing subvisual information, other than the gray scale

  6. Performance analysis of improved iterated cubature Kalman filter and its application to GNSS/INS. (United States)

    Cui, Bingbo; Chen, Xiyuan; Xu, Yuan; Huang, Haoqian; Liu, Xiao


    In order to improve the accuracy and robustness of GNSS/INS navigation systems, an improved iterated cubature Kalman filter (IICKF) is proposed by considering state-dependent noise and system uncertainty. First, a simplified framework of the iterated Gaussian filter is derived by using a damped Newton-Raphson algorithm and an online noise estimator. Then the effect of the state-dependent noise arising from the iterated update is analyzed theoretically, and an augmented form of the CKF algorithm is applied to improve the estimation accuracy. The performance of IICKF is verified by field test and numerical simulation, and results reveal that, compared with the non-iterated filter, the iterated filter is less sensitive to system uncertainty, and IICKF improves the accuracy of yaw, roll and pitch by 48.9%, 73.1% and 83.3%, respectively, compared with the traditional iterated KF.
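
    The damped, re-linearized measurement update at the heart of such iterated filters can be sketched for a scalar state. This is a generic iterated-EKF-style update with a damping factor, not the authors' IICKF; the measurement model and all numbers are illustrative.

```python
import numpy as np

def iterated_update(x_prior, P, z, R, h, H_jac, n_iter=10, alpha=0.5):
    """Iterated Gaussian measurement update via damped re-linearization.

    Re-linearizes the measurement model h() around the current estimate,
    with a damping factor alpha to stabilize the iteration (a simplified
    stand-in for the damped Newton-Raphson step the abstract describes).
    """
    x = x_prior
    for _ in range(n_iter):
        H = H_jac(x)                      # local linearization
        S = H * P * H + R                 # innovation covariance
        K = P * H / S                     # Kalman gain
        # iterated-EKF correction, re-referenced to the prior mean
        x_new = x_prior + K * (z - h(x) - H * (x_prior - x))
        x = x + alpha * (x_new - x)       # damped step
    P_post = (1.0 - K * H) * P
    return x, P_post

# toy nonlinear measurement: z = x**2 + noise
x_est, P_est = iterated_update(x_prior=1.0, P=1.0, z=4.41, R=0.01,
                               h=lambda x: x**2, H_jac=lambda x: 2 * x)
print(round(x_est, 2))  # converges near sqrt(4.41) = 2.1
```

    The damping keeps the re-linearization from overshooting when the measurement function is strongly nonlinear, which is the sensitivity issue the abstract attributes to non-damped iterated updates.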

  7. Vibration analysis of rotating turbomachinery blades by an improved finite difference method (United States)

    Subrahmanyam, K. B.; Kaza, K. R. V.


    The problem of calculating the natural frequencies and mode shapes of rotating blades is solved by an improved finite difference procedure based on second-order central differences. Lead-lag, flapping and coupled bending-torsional vibration cases of untwisted blades are considered. Results obtained by using the present improved theory have been observed to be close lower-bound solutions. The convergence has been found to be rapid in comparison with the classical first-order finite difference method. While the computational space and time required by the present approach are observed to be almost the same as those required by the first-order theory for a given mesh size, accuracies of practical interest can be obtained by using the improved finite difference procedure with a relatively smaller matrix size, in contrast to the classical finite difference procedure, which requires either a larger matrix or an extrapolation procedure for improvement in accuracy.
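
    The flavor of the second-order central-difference approach can be shown on a toy eigenvalue problem (a fixed-fixed string rather than a rotating blade). Note that the discrete eigenvalues approach the exact ones from below, consistent with the lower-bound behavior the abstract reports.

```python
import numpy as np

def string_eigenvalues(n_nodes=200, L=1.0, n_modes=3):
    """Approximate eigenvalues of -u'' = lam * u, u(0) = u(L) = 0,
    using second-order central differences (the same discretization
    family the abstract builds on, applied here to a toy problem)."""
    h = L / (n_nodes + 1)
    main = 2.0 * np.ones(n_nodes) / h**2
    off = -1.0 * np.ones(n_nodes - 1) / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    lam = np.sort(np.linalg.eigvalsh(A))[:n_modes]
    exact = np.array([(k * np.pi / L)**2 for k in range(1, n_modes + 1)])
    return lam, exact

lam, exact = string_eigenvalues()
print(np.max(np.abs(lam - exact) / exact))  # small relative error
```

    For this discretization the discrete eigenvalues are (2 - 2 cos(k*pi*h))/h^2, which always under-estimate (k*pi)^2, hence the lower-bound character.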

  8. Quantitative Analysis of Impact of Education on Improving Farmers' Net Income and Yield Per Capita

    Institute of Scientific and Technical Information of China (English)

    DING Jing-zhi


    In this paper, we analyze the relation between farmers' schooling and their net income and yield per capita by systematic and scientific methods, concluding that improving farmers' educational level may increase their net income.

  9. Can Latent Class Analysis Be Used to Improve the Diagnostic Process in Pediatric Patients with Chronic Ataxia? (United States)

    Klassen, Samantha; Dufault, Brenden; Salman, Michael S


    Chronic ataxia is a relatively common symptom in children. There are numerous causes of chronic ataxia, making it difficult to derive a diagnosis in a timely manner. We hypothesized that the efficiency of the diagnostic process can be improved with systematic analysis of clinical features in pediatric patients with chronic ataxia. Our aim was to improve the efficiency of the diagnostic process in pediatric patients with chronic ataxia. A cohort of 184 patients, aged 0-16 years with chronic ataxia who received medical care at Winnipeg Children's Hospital during 1991-2008, was ascertained retrospectively from several hospital databases. Clinical details were extracted from hospital charts. The data were compared among the more common diseases using univariate analysis to identify pertinent clinical features that could potentially improve the efficiency of the diagnostic process. Latent class analysis was then conducted to detect unique patterns of clinical features and to determine whether these patterns could be associated with chronic ataxia diagnoses. Two models each with three classes were chosen based on statistical criteria and clinical knowledge for best fit. Each class represented a specific pattern of presenting symptoms or other clinical features. The three classes corresponded to a plausible and shorter list of possible diagnoses. For example, developmental delay and hypotonia correlated best with Angelman syndrome. Specific patterns of presenting symptoms or other clinical features can potentially aid in the initial assessment and diagnosis of pediatric patients with chronic ataxia. This will likely improve the efficiency of the diagnostic process.

  10. An analysis of the aircraft engine Component Improvement Program (CIP) : a life cycle cost approach


    Borer, Chris Joseph


    Approved for public release; distribution unlimited. Increasing budgetary constraints have prompted actions to reduce the maintenance cost of current naval aircraft. This thesis examines the Aircraft Engine Component Improvement Program (CIP), its impact on these costs at the organizational and intermediate levels of maintenance, and savings from these improvements. The objectives of the research were to identify current life cycle cost (LCC) models used by the Navy and/or the other services...

  11. Unravelling evolutionary strategies of yeast for improving galactose utilization through integrated systems level analysis

    DEFF Research Database (Denmark)

    Hong, Kuk-Ki; Vongsangnak, Wanwipa; Vemuri, Goutham N


    Identification of the underlying molecular mechanisms for a derived phenotype by adaptive evolution is difficult. Here, we performed a systems-level inquiry into the metabolic changes occurring in the yeast Saccharomyces cerevisiae as a result of its adaptive evolution to increase its specific...... design in bioengineering of improved strains and, that through systems biology, it is possible to identify mutations in evolved strain that can serve as unforeseen metabolic engineering targets for improving microbial strains for production of biofuels and chemicals....

  12. Impact of Improved Maize Adoption on Welfare of Farm Households in Malawi: A Panel Data Analysis


    Bezu, Sosina; Kassie, Girma; Shiferaw, Bekele; Ricker-Gilbert, Jacob


    This paper assesses improved maize adoption in Malawi and examines the link between adoption and household welfare using a three-year household panel data. The distributional effect of maize technology adoption is also investigated by looking at impacts across wealth and gender groups. We applied control function approach and IV regression to control for endogeneity of input subsidy and improved maize adoption. We found that modern maize variety adoption is positively correlated with the hous...

  13. Using a Simple Binomial Model to Assess Improvement in Predictive Capability: Sequential Bayesian Inference, Hypothesis Testing, and Power Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory


    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2 and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
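
    The binomial model is conjugate to a Beta prior, so the posterior for θ has a closed form. The sketch below evaluates it numerically under an assumed uniform prior (the report does not state its prior), returning both the decision confidence P(θ > 1/2 | data) and the posterior standard deviation used above as the 'plan B' metric.

```python
import math

def improvement_posterior(n_trials, n_better, prior_a=1.0, prior_b=1.0, grid=20000):
    """Posterior for theta = P(new code predicts better than old) under
    the binomial model, with a Beta(prior_a, prior_b) prior (the uniform
    default is our assumption). Evaluated by midpoint-rule integration
    so the sketch needs only the standard library."""
    a = prior_a + n_better
    b = prior_b + n_trials - n_better
    xs = [(i + 0.5) / grid for i in range(grid)]
    dens = [x ** (a - 1) * (1 - x) ** (b - 1) for x in xs]
    total = sum(dens)
    p_improved = sum(d for x, d in zip(xs, dens) if x > 0.5) / total
    mean = sum(x * d for x, d in zip(xs, dens)) / total
    var = sum((x - mean) ** 2 * d for x, d in zip(xs, dens)) / total
    return p_improved, math.sqrt(var)  # decision confidence, 'plan B' metric

# hypothetical data: new code beat old in 16 of 20 paired experiments
p, sd = improvement_posterior(n_trials=20, n_better=16)
print(round(p, 3), round(sd, 3))
```

    As the abstract argues, when the observed fraction sits near 1/2 the confidence p stays low for any affordable sample size, and the posterior standard deviation becomes the more informative summary.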

  14. What is the best dose of nature and green exercise for improving mental health? A multi-study analysis. (United States)

    Barton, Jo; Pretty, Jules


    Green exercise is activity in the presence of nature. Evidence shows it leads to positive short and long-term health outcomes. This multistudy analysis assessed the best regime of dose(s) of acute exposure to green exercise required to improve self-esteem and mood (indicators of mental health). The research used meta-analysis methodology to analyze 10 UK studies involving 1252 participants. Outcomes were identified through a priori subgroup analyses, and dose-responses were assessed for exercise intensity and exposure duration. Other subgroup analyses included gender, age group, starting health status, and type of habitat. The overall effect size for improved self-esteem was d = 0.46 (CI 0.34-0.59, p improved both self-esteem and mood; the presence of water generated greater effects. Both men and women had similar improvements in self-esteem after green exercise, though men showed a difference for mood. Age groups: for self-esteem, the greatest change was in the youngest, with diminishing effects with age; for mood, the least change was in the young and old. The mentally ill had one of the greatest self-esteem improvements. This study confirms that the environment provides an important health service.
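
    For reference, the effect size d reported above is a standardized mean difference; a minimal pooled-SD version (one common estimator, not necessarily the exact one used in this multi-study analysis) looks like:

```python
import math

def cohens_d(pre, post):
    """Standardized mean difference with a pooled standard deviation.
    This is one common definition of Cohen's d; the meta-analysis may
    have used a different estimator or correction."""
    n1, n2 = len(pre), len(post)
    m1 = sum(pre) / n1
    m2 = sum(post) / n2
    v1 = sum((x - m1) ** 2 for x in pre) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in post) / (n2 - 1)
    sp = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / sp

# illustrative self-esteem scores before and after green exercise
print(round(cohens_d([1, 2, 3, 4], [2, 3, 4, 5]), 2))  # 0.77
```

    On this scale, the reported d = 0.46 is a small-to-medium effect.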

  15. Linkage analysis and physical mapping near the gene for X-linked agammaglobulinemia at Xq22

    Energy Technology Data Exchange (ETDEWEB)

    Parolini, O.; Lassiter, G.L.; Henry, M.J.; Conley, M.E. (Univ. of Tennessee College of Medicine, Memphis (United States) St. Jude Children' s Research Hospital, Memphis, TN (United States)); Hejtmancik, J.F. (National Inst. of Health, Bethesda, MD (United States)); Allen, R.C.; Belmont, J.W. (Baylor College of Medicine, Houston, TX (United States)); Barker, D.F. (Univ. of Utah, Salt Lake City (United States))


    The gene for X-linked agammaglobulinemia (XLA) has been mapped to Xq22. No recombinations have been reported between the gene and the probe p212 at DXS178; however, this probe is informative in only 30-40% of women, and the reported flanking markers, DXS3 and DXS94, are 10-15 cM apart. To identify additional probes that might be useful in genetic counseling, we examined 11 polymorphisms that have been mapped to the Xq21.3-q22 region in 13 families with XLA. In addition, pulsed-field gel electrophoresis and yeast artificial chromosomes (YACs) were used to further characterize the segment of DNA within which the gene for XLA must lie. The results demonstrated that DXS366 and DXS442, which share a 430-kb pulsed-field fragment, could replace DXS3 as proximal flanking markers. Probes at DXS178 and DXS265 identified the same 145-kb pulsed-field fragment, and both loci were contained within a 200-kb YAC identified with the probe p212. A highly polymorphic CA repeat (DXS178CA) was isolated from one end of this YAC and used in linkage analysis. Probes at DXS101 and DXS328 shared several pulsed-field fragments, the smallest of which was 250 kb. No recombinations were seen between XLA and the DXS178-DXS265-DXS178CA complex, DXS101, DXS328, DXS87, or the gene for proteolipid protein (PLP). Key crossovers, when combined with the linkage data from families with Alport syndrome, suggested the following order of loci: cen-DXS3-DXS366-DXS442-(PLP, DXS101, DXS328, DXS178-DXS265-DXS178CA complex, XLA)-(DXS87, DXS94)-DXS327-(DXS350, DXS362)-tel. Our studies also limit the segment of DNA within which the XLA gene must lie to the 3- to 4-cM distance between DXS442 and DXS94, and they identify and orient polymorphisms that can be used in genetic counseling not only for XLA but also for Pelizaeus-Merzbacher disease (PLP deficiency), Alport syndrome (COL4A5 deficiency), and Fabry disease (α-galactosidase A deficiency). 31 refs., 5 figs., 2 tabs.

  16. Theoretical analysis and an improvement method of the bias effect on the linearity of RF linear power amplifiers

    Institute of Scientific and Technical Information of China (English)

    Wu Tuo; Chen Hongyi; Qian Dahong


    Based on the Gummel-Poon model of the BJT, the change of the DC bias as a function of the AC input signal in RF linear power amplifiers is theoretically derived, so that the linearity of different DC bias circuits can be interpreted and compared. According to the analysis results, a quantitative adaptive DC bias circuit is proposed, which can improve both linearity and efficiency. From the simulation and test results, we draw conclusions on how to improve the design of linear power amplifiers.

  17. Improvement in the Plutonium Parameter Files of the FRAM Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    D. T. Vo; T. E. Sampson


    The isotopic analysis code Fixed-energy Response-function Analysis with Multiple efficiency (FRAM) employs user-editable parameter sets to analyze a broad range of sample types. This report presents new parameter files, based upon a new set of plutonium branching ratios, which give more accurate isotopic results than the parameter files currently used in FRAM.

  18. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.


    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  19. An Improved Flame Test for Qualitative Analysis Using a Multichannel UV-Visible Spectrophotometer (United States)

    Blitz, Jonathan P.; Sheeran, Daniel J.; Becker, Thomas L.


    Qualitative analysis schemes are used in undergraduate laboratory settings as a way to introduce equilibrium concepts and logical thinking. The main component of all qualitative analysis schemes is a flame test, as the color of light emitted from certain elements is distinctive and a flame photometer or spectrophotometer in each laboratory is…

  20. Comparative analysis of maize (Zea mays) crop performance: natural variation, incremental improvements and economic impacts. (United States)

    Leibman, Mark; Shryock, Jereme J; Clements, Michael J; Hall, Michael A; Loida, Paul J; McClerren, Amanda L; McKiness, Zoe P; Phillips, Jonathan R; Rice, Elena A; Stark, Steven B


    Grain yield from maize hybrids continues to improve through advances in breeding and biotechnology. Despite genetic improvements to hybrid maize, grain yield from distinct maize hybrids is expected to vary across growing locations due to numerous environmental factors. In this study, we examine across-location variation in grain yield among maize hybrids in three case studies. The three case studies examine hybrid improvement through breeding, introduction of an insect protection trait or introduction of a transcription factor trait associated with increased yield. In all cases, grain yield from each hybrid population had a Gaussian distribution. Across-location distributions of grain yield from each hybrid partially overlapped. The hybrid with a higher mean grain yield typically outperformed its comparator at most, but not all, of the growing locations (a 'win rate'). These results suggest that a broad set of environmental factors similarly impacts grain yields from both conventional- and biotechnology-derived maize hybrids and that grain yields among two or more hybrids should be compared with consideration given to both mean yield performance and the frequency of locations at which each hybrid 'wins' against its comparators. From an economic standpoint, growers recognize the value of genetically improved maize hybrids that outperform comparators in the majority of locations. Grower adoption of improved maize hybrids drives increases in average U.S. maize grain yields and contributes significant value to the economy.
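
    Under the Gaussian across-location yield distributions described above, the 'win rate' of one hybrid over another has a closed form. The means and standard deviations below are hypothetical illustrations, not values from the study.

```python
import math

def win_rate(mean_a, sd_a, mean_b, sd_b):
    """Probability that hybrid A out-yields hybrid B at a randomly
    chosen location, assuming independent Gaussian across-location
    yield distributions as described in the abstract."""
    z = (mean_a - mean_b) / math.sqrt(sd_a ** 2 + sd_b ** 2)
    # standard normal CDF evaluated at z
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# hypothetical: A averages 11.0 t/ha, B 10.5 t/ha, both sd 1.2 t/ha
print(round(win_rate(11.0, 1.2, 10.5, 1.2), 2))  # A wins at ~62% of locations
```

    This is why a hybrid with a clearly higher mean yield still loses at a substantial minority of locations: the overlap of the two distributions, not the mean gap alone, sets the win rate.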

  1. The job analysis of Korean nurses as a strategy to improve the Korean Nursing Licensing Examination

    Directory of Open Access Journals (Sweden)

    In Sook Park


    Full Text Available Purpose: This study aimed at characterizing Korean nurses’ occupational responsibilities and applying the results to improve the Korean Nursing Licensing Examination. Methods: First, the contents of the nursing job were defined based on a focus group interview of 15 nurses. The Developing a Curriculum (DACUM) method was used by 13 experts to examine those results and produce a questionnaire. The questionnaire was then administered to 5,065 hospital nurses. Results: The occupational responsibilities of nurses were characterized as involving 8 duties, 49 tasks, and 303 task elements. Those 8 duties are nursing management and professional development, safety and infection control, the management of potential risk factors, basic nursing and caring, the maintenance of physiological integrity, medication and parenteral treatments, socio-psychological integrity, and the maintenance and improvement of health. Conclusion: The content of the Korean Nursing Licensing Examination should be improved based on the 8 duties and 49 tasks of the occupational responsibilities of Korean nurses.

  2. Design and Analysis of a Differential Waveguide Structure to Improve Magnetostrictive Linear Position Sensors

    Directory of Open Access Journals (Sweden)

    Hui Zhao


    Full Text Available Magnetostrictive linear position sensors (MLPS) are high-precision industrial sensors that determine position by measuring the propagation time of ultrasonic signals in a waveguide. To date, MLPS have attracted widespread attention for their accuracy, reliability, and cost-efficiency in performing non-contact, multiple measurements. However, the sensor with its traditional structure is susceptible to electromagnetic interference, which affects accuracy. In the present study, we propose a novel MLPS structure that relies on two differential waveguides to improve the signal-to-noise ratio, common-mode rejection ratio, and accuracy of MLPS. The proposed sensor model can depict sensor performance and the relationship of sensor parameters. Experimental results with the new sensor indicate that the new structure improves accuracy to ±0.1 mm, compared with ±0.2 mm for the traditional structure. In addition, the proposed sensor shows a considerable improvement in temperature characteristics.

  3. Improvement Analysis in a Municipal Pumping System Aiming the Energy Efficiency

    Directory of Open Access Journals (Sweden)

    Rafael Fernando Dutra


    Full Text Available With the rapid and disorderly growth that occurred in the city of Caxias do Sul - RS in the last two decades, many water supply problems are observed in specific areas, especially at peak times and on days of high consumption. In order to solve this problem, and focusing on energy efficiency, this study proposed two improvements in the pumping system of Santa Fe, which is responsible for supplying the northern part of Caxias do Sul: replacing the pumps and using a frequency converter to control their speed. From the measurements made and from simulations in spreadsheets and in the Epanet software, it was found that the two improvements are technically and economically viable, providing an estimated monthly savings of 37.7%.


    Directory of Open Access Journals (Sweden)

    A. M. Kozuberda


    Full Text Available The article deals with proposals for improving accounting in signaling and communication and the reasons for such decisions. It is proposed to use new methods of calculating depreciation, to change the criteria for classifying low-value items, and to simplify the procedure for writing off fixed assets. These changes should reduce costs at the enterprise, simplify the accounting work, improve the overall performance of the maintenance section, and better support its main function, the traffic safety of trains.

  5. Waste reduction and process improvements in the analysis of plutonium by x-ray fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Worley, Christopher G [Los Alamos National Laboratory; Sodweberg, Constance B [Los Alamos National Laboratory; Townsend, Lisa E [Los Alamos National Laboratory


    Significant modifications were made to a sample preparation process for quantifying gallium in plutonium metal by wavelength-dispersive X-ray fluorescence. These changes were made to minimize waste and improve process safety and efficiency. Sample sizes were reduced, cheaper sample preparation acids were used, and safety improvements were implemented. Using this modified process, results from analyzing a batch of test samples indicated that relative precision and accuracy were ≈0.2% and ≈0.1%, respectively, which is comparable to that obtained using the older, established sample preparation method.

  6. Improved diagnosis of MV paper-insulated cables using signal analysis

    DEFF Research Database (Denmark)

    Villefrance, Rasmus; Holbøll, Joachim T.; Sørensen, John Aasted;


    With the purpose of improving the PD estimation accuracy and the degree of automation of the measurements, the following study is carried out. Initially, a library of different discharge pulses and actual background noise from a selection of cables is established. The library is then used...... for the estimation of PD-signals from a parametric model leading to reduction of the noise superimposed on the PD-signals and thus to improved PD-detection. The applicability of these methods is discussed in relation to mobile systems for the assessment of cable insulation condition....


    Institute of Scientific and Technical Information of China (English)

    Y.Wang; J.Chen; H.B.Li


    An improved interface cohesive zone model is developed for the simulation of interface contact under mixed-mode loading. A new debonding initiation criterion and a debonding propagation law, taking into account the influence of pressure stress on contact shear strength, are proposed. The model is implemented in a finite-element program using the subroutine VUINTER of ABAQUS/Explicit. An edge-notch four-point bending process and a laminated vibration-damping steel sheet punch forming test are simulated with the improved model in ABAQUS/Explicit. The numerical predictions agree satisfactorily with the corresponding experimental results.

  8. Improvement of the Kruk-Jaroniec-Sayari method for pore size analysis of ordered silicas with cylindrical mesopores. (United States)

    Jaroniec, Mietek; Solovyov, Leonid A


    In this work, X-ray diffraction structure modeling was employed for the analysis of hexagonally ordered large-pore silicas, SBA-15, to determine their pore width independently of adsorption measurements. Nitrogen adsorption isotherms were used to evaluate the relative pressure of capillary condensation in the cylindrical mesopores of these materials. This approach allowed us to extend the original Kruk-Jaroniec-Sayari (KJS) relation (Langmuir 1997, 13, 6267) between pore width and capillary condensation pressure up to 10 nm, instead of the range from 2 to 6.5 nm previously established for a series of MCM-41 materials, and to improve the KJS pore size analysis of large-pore silicas.
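
    The KJS approach corrects the Kelvin equation with a statistical film thickness and a small empirical offset. The sketch below uses standard literature constants for N2 at 77 K and the Harkins-Jura thickness equation; these are assumptions on our part, not values quoted from this paper.

```python
import math

def kjs_pore_width(rel_pressure):
    """Pore diameter (nm) from the capillary-condensation relative
    pressure of N2 at 77 K, in the spirit of the KJS-corrected Kelvin
    equation: Kelvin radius + Harkins-Jura film thickness + empirical
    0.3 nm correction. Constants are standard literature values and
    the sketch is illustrative, not the paper's calibration."""
    p = rel_pressure
    kelvin_r = 0.9577 / math.log(1.0 / p)  # 2*gamma*Vm/(R*T) for N2, in nm
    t_film = 0.1 * (60.65 / (0.03071 - math.log10(p))) ** 0.3968  # nm
    return 2.0 * (kelvin_r + t_film + 0.3)

# condensation near p/p0 = 0.70 corresponds to a pore of several nm
print(round(kjs_pore_width(0.70), 2))
```

    Higher condensation pressure maps monotonically to wider pores, which is why calibrating the relation against independently (XRD) determined pore widths extends its validity range.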

  9. Simulation System of Car Crash Test in C-NCAP Analysis Based on an Improved Apriori Algorithm (United States)

    Xiang, LI

    In order to analyze car crash tests in C-NCAP, an improved algorithm based on the Apriori algorithm is given in this paper. The new algorithm is implemented with a vertical data layout, breadth-first searching, and intersecting. It takes advantage of the efficiency of the vertical data layout and intersecting, and prunes candidate frequent item sets as Apriori does. Finally, the new algorithm is applied in a simulation system for car crash test analysis. The result shows that the discovered relations affect the C-NCAP test results, and it can provide a reference for automotive design.
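
    A vertical-layout miner of the kind the abstract describes stores, for each item, the set of transaction ids containing it, and computes support by tidset intersection. The following minimal sketch (Eclat-style, with an Apriori-like candidate join) illustrates the idea on toy data.

```python
from itertools import combinations

def vertical_frequent_itemsets(transactions, min_support):
    """Frequent-itemset mining with a vertical data layout: each item
    maps to the set of transaction ids (tidset) containing it, and the
    support of an itemset is the size of the intersection of its
    members' tidsets."""
    tids = {}
    for tid, items in enumerate(transactions):
        for it in items:
            tids.setdefault(it, set()).add(tid)
    # frequent 1-itemsets
    freq = {frozenset([it]): t for it, t in tids.items() if len(t) >= min_support}
    result = dict(freq)
    k = 1
    while freq:
        nxt = {}
        for a, b in combinations(freq, 2):
            union = a | b
            if len(union) == k + 1:        # Apriori-style candidate join
                t = freq[a] & freq[b]      # tidset intersection
                if len(t) >= min_support and union not in nxt:
                    nxt[union] = t
        result.update(nxt)
        freq = nxt
        k += 1
    return {tuple(sorted(s)): len(t) for s, t in result.items()}

data = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
out = vertical_frequent_itemsets(data, min_support=3)
print(out[("a", "b")])  # 3
```

    Because each support count is a single set intersection, no re-scan of the transaction database is needed at higher levels, which is the efficiency advantage the abstract attributes to the vertical layout.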

  10. Time-frequency analysis of non-stationary fusion plasma signals using an improved Hilbert-Huang transform (United States)

    Liu, Yangqing; Tan, Yi; Xie, Huiqiao; Wang, Wenhao; Gao, Zhe


    An improved Hilbert-Huang transform method is developed for the time-frequency analysis of non-stationary signals in tokamak plasmas. Maximal overlap discrete wavelet packet transform, rather than wavelet packet transform, is proposed as a preprocessor to decompose a signal into various narrow-band components. Then, a correlation-coefficient-based selection method is utilized to eliminate the irrelevant intrinsic mode functions obtained from empirical mode decomposition of those narrow-band components. Subsequently, a time-varying vector autoregressive moving average model, instead of Hilbert spectral analysis, is employed to compute the Hilbert spectrum, i.e., a three-dimensional time-frequency distribution of the signal. The feasibility and effectiveness of the improved Hilbert-Huang transform method are demonstrated by analyzing a non-stationary simulated signal and actual experimental signals in fusion plasmas.
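
    The correlation-coefficient selection step can be illustrated in isolation: given candidate components (in the paper, IMFs from EMD of the MODWPT sub-bands; here, hand-made components), keep only those whose correlation with the raw signal exceeds a threshold. The threshold value is our choice, for illustration.

```python
import numpy as np

def select_components(signal, components, threshold=0.2):
    """Correlation-coefficient screening of decomposed components:
    components that correlate weakly with the raw signal are treated
    as irrelevant and dropped, mirroring the selection step in the
    abstract (simplified: no EMD/MODWPT is performed here)."""
    keep = []
    for c in components:
        r = np.corrcoef(signal, c)[0, 1]
        if abs(r) >= threshold:
            keep.append(c)
    return keep

t = np.linspace(0.0, 1.0, 1000)
sig = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
comps = [np.sin(2 * np.pi * 5 * t),                    # genuine component
         0.5 * np.sin(2 * np.pi * 40 * t),             # genuine component
         np.random.default_rng(0).normal(0, 1, 1000)]  # spurious component
print(len(select_components(sig, comps)))  # 2
```

    Screening out spurious modes before spectral estimation is what keeps mode-mixing artifacts out of the final time-frequency distribution.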

  11. Testing a four-dimensional variational data assimilation method using an improved intermediate coupled model for ENSO analysis and prediction (United States)

    Gao, Chuan; Wu, Xinrong; Zhang, Rong-Hua


    A four-dimensional variational (4D-Var) data assimilation method is implemented in an improved intermediate coupled model (ICM) of the tropical Pacific. A twin experiment is designed to evaluate the impact of the 4D-Var data assimilation algorithm on ENSO analysis and prediction based on the ICM. The model error is assumed to arise only from the parameter uncertainty. The "observation" of the SST anomaly, which is sampled from a "truth" model simulation that takes default parameter values and has Gaussian noise added, is directly assimilated into the assimilation model with its parameters set erroneously. Results show that 4D-Var effectively reduces the error of ENSO analysis and therefore improves the prediction skill of ENSO events compared with the non-assimilation case. These results provide a promising way for the ICM to achieve better real-time ENSO prediction.
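
    A twin experiment of this shape is easy to sketch in miniature: generate "observations" from a truth run with the default parameter plus Gaussian noise, then recover the parameter by minimizing the misfit over the assimilation window. Brute-force search stands in for the adjoint-based 4D-Var minimization, and the logistic-map "model" is purely illustrative, not the ICM.

```python
import numpy as np

def run_model(a, x0, n):
    """Toy 'forecast model': a logistic map with parameter a."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(a * xs[-1] * (1.0 - xs[-1]))
    return np.array(xs)

def variational_parameter_estimate(obs, x0, a_grid):
    """Twin-experiment parameter recovery: model error lives only in
    one parameter, and we minimize the observation misfit over the
    window (grid search replaces the adjoint-based minimization)."""
    costs = [np.sum((run_model(a, x0, len(obs)) - obs) ** 2) for a in a_grid]
    return a_grid[int(np.argmin(costs))]

rng = np.random.default_rng(1)
truth_a, x0 = 3.6, 0.4
obs = run_model(truth_a, x0, 30) + rng.normal(0.0, 0.01, 30)  # noisy 'truth'
a_est = variational_parameter_estimate(obs, x0, np.arange(3.0, 4.0, 0.001))
print(round(float(a_est), 2))  # close to the 'truth' value 3.6
```

    As in the abstract, assimilating the noisy observations pulls the erroneously parameterized model back toward the truth run, which is what improves the subsequent forecast.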

  12. Time-frequency analysis of non-stationary fusion plasma signals using an improved Hilbert-Huang transform

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yangqing, E-mail:; Tan, Yi; Xie, Huiqiao; Wang, Wenhao; Gao, Zhe [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China)]


    An improved Hilbert-Huang transform method is developed for the time-frequency analysis of non-stationary signals in tokamak plasmas. The maximal overlap discrete wavelet packet transform, rather than the wavelet packet transform, is proposed as a preprocessor to decompose a signal into various narrow-band components. Then, a correlation-coefficient-based selection method is utilized to eliminate the irrelevant intrinsic mode functions obtained from empirical mode decomposition of those narrow-band components. Subsequently, a time-varying vector autoregressive moving average model, instead of Hilbert spectral analysis, is used to compute the Hilbert spectrum, i.e., a three-dimensional time-frequency distribution of the signal. The feasibility and effectiveness of the improved Hilbert-Huang transform method are demonstrated by analyzing a non-stationary simulated signal and actual experimental signals in fusion plasmas.
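The correlation-coefficient selection step alone can be sketched as follows. The decomposition itself is assumed to have been done elsewhere (e.g. by an EMD package such as PyEMD), so the "IMFs" below are hand-built stand-ins: components whose correlation with the original signal falls below a threshold are treated as irrelevant and dropped.

```python
import numpy as np

# Synthetic two-tone signal and three candidate "IMFs" (illustrative only).
t = np.linspace(0.0, 1.0, 1000)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

rng = np.random.default_rng(1)
imfs = [
    np.sin(2 * np.pi * 10 * t),          # genuine component
    0.5 * np.sin(2 * np.pi * 40 * t),    # genuine component
    0.05 * rng.normal(size=t.size),      # noise-only artifact
]

# Keep only components that correlate meaningfully with the original signal.
threshold = 0.2
kept = [c for c in imfs if abs(np.corrcoef(signal, c)[0, 1]) > threshold]
print(f"kept {len(kept)} of {len(imfs)} components")
```

The threshold value is a tuning choice; the paper's criterion may differ in detail, but the principle (discard low-correlation modes before spectral estimation) is as shown.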

  13. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail:; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States)]; Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy)]; Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)]


    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
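The core idea of combining two independent measurements of the same quantity can be illustrated with the simplest Gaussian case. With a flat prior, the posterior mean is the inverse-variance-weighted average and the posterior uncertainty is smaller than either input. The temperature values and uncertainties below are assumed for illustration, not taken from the MST diagnostics.

```python
# Two hypothetical electron-temperature measurements of the same plasma,
# standing in for "SXR" and "TS" (values in eV, chosen for illustration).
te_sxr, sigma_sxr = 950.0, 80.0
te_ts, sigma_ts = 1010.0, 50.0

# Gaussian posterior with flat prior: inverse-variance-weighted combination.
w_sxr, w_ts = 1.0 / sigma_sxr**2, 1.0 / sigma_ts**2
te_combined = (w_sxr * te_sxr + w_ts * te_ts) / (w_sxr + w_ts)
sigma_combined = (w_sxr + w_ts) ** -0.5

print(f"combined Te = {te_combined:.0f} +/- {sigma_combined:.0f} eV")
```

The combined uncertainty (about 42 eV here) is below both inputs, mirroring the "more precise than either diagnostic" result described above; the actual tool uses a full Bayesian forward model rather than this two-number reduction.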

  14. Detection of ULF electromagnetic emissions as a precursor to an earthquake in China with an improved polarization analysis

    Directory of Open Access Journals (Sweden)

    Y. Ida


    Full Text Available An improved polarization analysis (the ratio of the vertical magnetic field component to the horizontal one) has been developed and applied to approximately four years of data (from 1 March 2003 to 31 December 2006) observed at Kashi station in China. It is concluded that the polarization ratio exhibited an apparent increase only just before the earthquake of 1 September 2003 (magnitude = 6.1, epicentral distance 116 km).
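A minimal sketch of the quantity involved, on synthetic data: the ratio of vertical (Z) to horizontal (H) magnetic-field spectral power in a ULF band. The component names, band edges, and signal are illustrative assumptions, not the station's actual processing chain.

```python
import numpy as np

# Synthetic records: the vertical channel carries an extra ULF tone.
fs = 1.0                              # sampling frequency, Hz (assumed)
n = 4096
t = np.arange(n) / fs
rng = np.random.default_rng(2)
bz = rng.normal(size=n) + 0.5 * np.sin(2 * np.pi * 0.01 * t)  # vertical
bh = rng.normal(size=n)                                        # horizontal

# Band-limited spectral power in a ULF band of interest.
freqs = np.fft.rfftfreq(n, d=1 / fs)
band = (freqs >= 0.005) & (freqs <= 0.02)
pz = np.sum(np.abs(np.fft.rfft(bz))[band] ** 2)
ph = np.sum(np.abs(np.fft.rfft(bh))[band] ** 2)
print(f"Z/H polarization ratio in band: {pz / ph:.2f}")
```

A precursor search would track this ratio in sliding windows over years of data; here the in-band tone on the vertical channel simply drives the ratio above unity.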

  15. An Analysis of a Plan to Improve Graduation Rates in Johnston County Schools (United States)

    Renfrow, David Ross


    There have been limited qualitative case studies exploring effective strategies designed to improve graduation rates in rural school districts. Specifically, few studies have presented information based solely upon the voices of practitioners themselves in solving the graduation crisis in America's public schools. This study will add to the…

  16. Analysis of Employee Engagement to Improve the Performance of Retail Risk Group PT Bank Mandiri (United States)

    Wiseto, Artody; Hubeis, Aida Vitayala; Sukandar, Dadang


    Nowadays, every company wants its employees to feel bound to the company; this is called engagement. PT Bank Mandiri (Persero) Tbk, the bank with the largest assets in Indonesia, shares that expectation: it expects employee engagement to improve performance in areas such as finance, service, and production…

  17. A Meta-Analysis of Educational Data Mining on Improvements in Learning Outcomes (United States)

    AlShammari, Iqbal A.; Aldhafiri, Mohammed D.; Al-Shammari, Zaid


    A meta-synthesis study was conducted of 60 research studies on educational data mining (EDM) and their impacts on and outcomes for improving learning outcomes. After an overview, an examination of these outcomes is provided (Romero, Ventura, Espejo, & Hervas, 2008; Romero, "et al.", 2011). Then, a review of other EDM-related research…


    Institute of Scientific and Technical Information of China (English)

    Wei Gong; Ruo Li; Ningning Yan; Weibo Zhao


    This paper is concerned with an ill-posed problem which arises in the area of molecular imaging and is known as the bioluminescence tomography (BLT) problem. Using the Tikhonov regularization technique, a quadratic optimization problem can be formulated. We provide an improved error estimate for the finite element approximation of the regularized optimization problem. Some numerical examples are presented to demonstrate our theoretical results.
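The Tikhonov step described above can be sketched on a generic ill-conditioned system: the ill-posed problem A x = b is replaced by the well-posed regularized normal equations (AᵀA + λI) x = Aᵀb. The matrix, sizes, and λ below are illustrative stand-ins, not the BLT forward operator.

```python
import numpy as np

# Build a generic ill-conditioned symmetric matrix (singular values 1..1e-7).
rng = np.random.default_rng(3)
n = 8
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
A = U @ np.diag(10.0 ** -np.arange(n)) @ U.T      # condition number ~1e7
x_true = rng.normal(size=n)
b = A @ x_true + 1e-4 * rng.normal(size=n)        # slightly noisy data

x_naive = np.linalg.solve(A, b)                   # noise wildly amplified

# Tikhonov-regularized normal equations: (A^T A + lam I) x = A^T b.
lam = 1e-8
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

print("naive error:      ", np.linalg.norm(x_naive - x_true))
print("regularized error:", np.linalg.norm(x_reg - x_true))
```

The regularized solution trades a small bias for stability against data noise; the paper's contribution concerns the error estimate for the finite element discretization of this kind of regularized problem, not the algebra itself.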

  19. Reporting Data with "Over-the-Counter" Data Analysis Supports Improves Educators' Data Analyses (United States)

    Rankin, Jenny Grant


    The benefits of making data-informed decisions to improve learning rely on educators correctly interpreting given data. Many educators routinely misinterpret data, even at districts with proactive support for data use. The tool most educators use for data analyses, which is an information technology data system or its reports, typically reports…

  20. Improving the design and analysis of superconducting magnets for particle accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, Ramesh Chandra [Univ. of Rajasthan (India)]


    The field quality in superconducting magnets has been improved to a level at which it does not appear to be a limiting factor on the performance of RHIC. The many methods developed, improved and adopted during the course of this work have contributed significantly to that performance. One can not only design and construct magnets with better field quality than any made before, but can also improve on that quality after construction. The relative field error (ΔB/B) can now be made as low as a few parts in 10^5 at 2/3 of the coil radius. This is about an order of magnitude better than what is generally expected for superconducting magnets. This extra-high field quality is crucial to the luminosity performance of RHIC. The research work described here covers a number of areas which must all be addressed to build production magnets with high field quality. The work has been limited to the magnetic design of the cross section, which in most cases essentially determines the field quality performance of the whole magnet, since these magnets are generally long. Though the conclusions presented here have been discussed at the end of each chapter, a summary of them is useful to present a complete picture. The lessons learned from these experiences may be useful in the design of new magnets. The possibilities of future improvements are also presented.

  1. Improvement in DMSA imaging using adaptive noise reduction: an ROC analysis. (United States)

    Lorimer, Lisa; Gemmell, Howard G; Sharp, Peter F; McKiddie, Fergus I; Staff, Roger T


    Dimercaptosuccinic acid imaging is the 'gold standard' for the detection of cortical defects and diagnosis of scarring of the kidneys. The Siemens planar processing package, which implements adaptive noise reduction using the Pixon algorithm, is designed to allow a reduction in image noise, enabling improved image quality and reduced acquisition time/injected activity. This study aimed to establish the level of improvement in image quality achievable using this algorithm. Images were acquired of a phantom simulating a single kidney with a range of defects of varying sizes, positions and contrasts. These images were processed using the Pixon processing software and shown to 12 observers (six experienced and six novices) who were asked to rate the images on a six-point scale depending on their confidence that a defect was present. The data were analysed using a receiver operating characteristic approach. Results showed that processed images significantly improved the performance of the experienced observers in terms of their sensitivity and specificity. Although novice observers showed significant increase in sensitivity when using the software, a significant decrease in specificity was also seen. This study concludes that the Pixon software can be used to improve the assessment of cortical defects in dimercaptosuccinic acid imaging by suitably trained observers.
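The rating-based ROC evaluation described above can be sketched as follows: observers rate each image on a six-point confidence scale, and the area under the ROC curve (AUC) summarizes detection performance. The ratings below are illustrative values, not the study's data; the AUC is computed via the Mann-Whitney statistic.

```python
import numpy as np

def roc_auc(ratings_present, ratings_absent):
    """AUC via the Mann-Whitney statistic: P(present > absent) + 0.5 P(tie)."""
    pos = np.asarray(ratings_present, float)[:, None]
    neg = np.asarray(ratings_absent, float)[None, :]
    n_pairs = pos.size * neg.size
    return ((pos > neg).sum() + 0.5 * (pos == neg).sum()) / n_pairs

defect_present = [6, 5, 5, 4, 6, 3]  # confidence ratings, defect images
defect_absent = [2, 1, 3, 2, 4, 1]   # confidence ratings, defect-free images
print(f"AUC = {roc_auc(defect_present, defect_absent):.2f}")
```

An AUC of 0.5 corresponds to guessing and 1.0 to perfect discrimination; comparing AUCs for raw versus Pixon-processed images is the study's quantitative endpoint.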

  2. Interspecific Chromosome Substitution Lines as Genetic Resources for Improvement,Trait Analysis and Genomic Inference

    Institute of Scientific and Technical Information of China (English)

    RASKA Dwaine A; SAHA Sukumar; JENKINS Johnie N; MCCARTY Jack C; WU Ji-xiang; STELLY David M


    The genetic base that cotton breeders commonly use to improve Upland cultivars is very narrow. The AD-genome species Gossypium barbadense, G. tomentosum, and G. mustelinum are part of the primary germplasm pool, too, and constitute genetic reservoirs of genes for resistance to abiotic stress, pests, and pathogens, as well as agronomic and fiber traits.

  3. How to Improve Pupils' Literacy? A Cost-Effectiveness Analysis of a French Educational Project (United States)

    Massoni, Sebastien; Vergnaud, Jean-Christophe


    The "Action Lecture" program is an innovative teaching method run in some nursery and primary schools in Paris and designed to improve pupils' literacy. We report the results of an evaluation of this program. We describe the experimental protocol that was built to estimate the program's impact on several types of indicators. Data were processed…

  4. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)


    The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.
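For a simple compressible system, and assuming the "work temperature quotient" refers to reversible expansion work p dV, the improved definition can presumably be sketched as:

```latex
% Improved (state-function) form of entropy: internal-energy temperature
% quotient plus work temperature quotient, evaluated along any path.
\Delta S \;=\; \int \frac{\mathrm{d}U}{T} \;+\; \int \frac{p\,\mathrm{d}V}{T}
```

This coincides with the Gibbs relation T dS = dU + p dV and is therefore path-independent, which is the state-function property the abstract emphasizes.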

  5. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    WU Jing; GUO ZengYuan


    The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.

  6. An improved method for seed-bank analysis : Seedling emergence after removing the soil by sieving

    NARCIS (Netherlands)

    ter Heerdt, G.N.J.; Bekker, R.M.; Bakker, J.P.; Verweij, G.L.


    1. The seedling emergence method for assessing the size of the seed bank is improved by washing soil samples on a fine sieve and spreading the thus concentrated samples in a 3-5 mm thick layer on sterilized potting compost. 2. The method largely increases the number of seedlings that emerge as compa

  7. Improvement of Brain Function through Combined Yogic Intervention, Meditation and Pranayama: A Critical Analysis

    Directory of Open Access Journals (Sweden)

    Anup De


    Full Text Available Background: The practice of yoga includes static and dynamic postures (asanas), breathing manipulations (pranayama) and meditation (dhyana). Yoga is a tool which works from the gross body level to the subtle mind level. Yoga is a simple and inexpensive health regimen that can be incorporated as an effective adjuvant therapy for the improvement of brain and mental activity. Aim: To review the scientific literature related to yoga practice and brain function. Method: Researchers collected scientific evidence through electronic databases (PubMed, Embase, Medline, Google Scholar, Google Advance Search, PsycINFO, ROAJ, DOAJR, Web of Science) and critically analyzed all relevant articles according to the nature of this study. Findings: Combined yogic practices improve memory, which can influence the academic performance of students. Meditation practices improve concentration and consciousness, which may reduce psychic disorders. Pranayama practice may be applied as an alternative therapy for reducing stress-related diseases. Conclusions: Regular yogic practices may improve brain and other neurocognitive functions.

  8. The Practical Relevance of Accountability Systems for School Improvement: A Descriptive Analysis of California Schools (United States)

    Mintrop, Heinrich; Trujillo, Tina


    In search for the practical relevance of accountability systems for school improvement, the authors ask whether practitioners traveling between the worlds of system-designated high- and low-performing schools would detect tangible differences in educational quality and organizational effectiveness. In comparing nine exceptionally high and low…

  9. The value of clinical judgement analysis for improving the quality of doctors' prescribing decisions

    NARCIS (Netherlands)

    Denig, P; Wahlstrom, R; de Saintonge, MC; Haaijer-Ruskamp, F; Wahlström, R.; de Saintonge, Mark Chaput


    Background Many initiatives are taken to improve prescribing decisions. Educational strategies for doctors have been effective in at least 50% of cases. Some reflection on one's own performance seems to be a common feature of the most effective strategies. So far, such reflections have mainly focuse

  10. Treatment of periodontitis improves the atherosclerotic profile: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Teeuw, W.J.; Slot, D.E.; Susanto, H.; Gerdes, V.E.A.; Abbas, F.; D'Aiuto, F.; Kastelein, J.J.P.; Loos, B.G.


    Aim Systematic review and meta-analyses to study the robustness of observations that treatment of periodontitis improves the atherosclerotic profile. Material and Methods Literature was searched in Medline-PubMed, Cochrane CENTRAL and EMBASE, based on controlled periodontal intervention trials, incl

  11. Improving Iranian High School Students' Reading Comprehension Using the Tenets of Genre Analysis (United States)

    Adelnia, Rezvan; Salehi, Hadi


    This study is an attempt to investigate impact of using a technique, namely, genre-based approach on improving reading ability on Iranian EFL learners' achievement. Therefore, an attempt was made to compare genre-based approach to teaching reading with traditional approaches. For achieving this purpose, by administering the Oxford Quick Placement…

  12. A Novel Approach to Improve the Detectability of CO2 by GC Analysis

    Institute of Scientific and Technical Information of China (English)


    A novel stochastic resonance algorithm was employed to enhance the signal-to-noise ratio (SNR) of analytical chemistry signals. Using a gas chromatographic data set, it was shown that the SNR was greatly improved while the quantitative relationship between concentrations and chromatographic responses was preserved. The linear range was extended beyond the instrumental detection limit.
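A generic illustration of the stochastic-resonance phenomenon (not the paper's algorithm): a subthreshold sinusoid passed through a hard threshold produces no output on its own, but adding a moderate amount of noise makes the threshold crossings track the signal. All parameters below are assumed for the sketch.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 5000)
signal = 0.5 * np.sin(2 * np.pi * t)   # peak 0.5, below the threshold of 1.0
threshold = 1.0

def detector_output(noise_sigma):
    """Hard-threshold detector applied to the signal plus Gaussian noise."""
    noisy = signal + rng.normal(0.0, noise_sigma, t.size)
    return (noisy > threshold).astype(float)

quiet = detector_output(0.0)      # no noise: the detector never fires
assisted = detector_output(0.5)   # moderate noise: crossings cluster at peaks
corr = np.corrcoef(assisted, signal)[0, 1]
print(f"detector/signal correlation with added noise: {corr:.2f}")
```

Too little noise and nothing crosses the threshold; too much and the output decorrelates again, which is the characteristic resonance curve exploited to pull weak chromatographic peaks above the detection limit.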

  13. Treatment of periodontitis improves the atherosclerotic profile : a systematic review and meta-analysis

    NARCIS (Netherlands)

    Teeuw, Wijnand J.; Slot, Dagmar E.; Susanto, Hendri; Gerdes, Victor E. A.; Abbas, Frank; D'Aiuto, Francesco; Kastelein, John J. P.; Loos, Bruno G.


    Aim: Systematic review and meta-analyses to study the robustness of observations that treatment of periodontitis improves the atherosclerotic profile. Material and Methods: Literature was searched in Medline-PubMed, Cochrane CENTRAL and EMBASE, based on controlled periodontal intervention trials, includ

  14. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing. (United States)

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L


    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
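A toy version of the database-centred workflow described above: time-series values live in a relational table, and an analysis step becomes a SQL query rather than ad hoc file parsing. The schema and column names are illustrative, not those of the actual system.

```python
import sqlite3

# In-memory database with a generic (subject, region, time, value) layout.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE timeseries (
                   subject TEXT, region TEXT, t INTEGER, value REAL)""")
rows = [("s01", "STG", t, 0.1 * t) for t in range(10)] + \
       [("s02", "STG", t, 0.2 * t) for t in range(10)]
con.executemany("INSERT INTO timeseries VALUES (?, ?, ?, ?)", rows)

# One analysis step expressed as a query: per-subject mean signal in a region.
means = con.execute("""SELECT subject, AVG(value) FROM timeseries
                       WHERE region = 'STG'
                       GROUP BY subject ORDER BY subject""").fetchall()
print(means)
```

The paper's point is that once the data are relational, the same query interface scales to complex selections (conditions, runs, regions) and can be fronted by cluster or Grid workers, with the database handling concurrency and sharing.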

  15. Toward improving the proteomic analysis of formalin-fixed, paraffin-embedded tissue. (United States)

    Fowler, Carol B; O'Leary, Timothy J; Mason, Jeffrey T


    Archival formalin-fixed, paraffin-embedded (FFPE) tissue and their associated diagnostic records represent an invaluable source of retrospective proteomic information on diseases for which the clinical outcome and response to treatment are known. However, analysis of archival FFPE tissues by high-throughput proteomic methods has been hindered by the adverse effects of formaldehyde fixation and subsequent tissue histology. This review examines recent methodological advances for extracting proteins from FFPE tissue suitable for proteomic analysis. These methods, based largely upon heat-induced antigen retrieval techniques borrowed from immunohistochemistry, allow at least a qualitative analysis of the proteome of FFPE archival tissues. The authors also discuss recent advances in the proteomic analysis of FFPE tissue; including liquid-chromatography tandem mass spectrometry, reverse phase protein microarrays and imaging mass spectrometry.

  16. Improving Recruiting of the 6th Recruiting Brigade Through Statistical Analysis and Efficiency Measures (United States)



  17. Analysis of the evaluated data discrepancies for minor actinides and development of improved evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ignatyuk, A. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)]


    The work is directed at a compilation of the experimental and evaluated data available for neutron-induced reaction cross sections on the ²³⁷Np, ²⁴¹Am, ²⁴²ᵐAm and ²⁴³Am isotopes, at the analysis of the old data and renormalizations connected with changes of standards, and at the comparison of experimental data with theoretical calculations. The main results of the analysis performed so far are presented in this report. (J.P.N.)

  18. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    Directory of Open Access Journals (Sweden)

    Feofanova Iryna V.


    Full Text Available The goal of the article is the identification of directions for improving the accounting system at an enterprise so that strategic analysis procedures are supplied with trustworthy information. Historical methods (for the study of the conditions of the appearance and development of strategic analysis) and logical methods (for the identification of directions for improving accounting) were used during the study. The article establishes that modern conditions require a system of indicators based on both financial and non-financial information. In order to conduct strategic analysis it is necessary to expand the volume of information that characterises such resources of an enterprise as research and development, personnel and the quality of products (services). Among the indicators that provide such information, the article selects innovation activity costs and personnel training costs, the accounting of which is not sufficiently regulated. To meet the information requirements of analysts, it proposes improving accounting in the following directions: identification of the nature and volume of information required by enterprise managers; formation of a system of accounting by places of origin of expenses and responsibility centres; and identification and accounting of income or other results received by the enterprise due to personnel advanced training, research and development, and the introduction of innovations. The article offers a form for calculating savings from cost reductions obtained due to governmental privileges provided to enterprises that introduce innovations and train personnel.

  19. Improving sensitivity and linear dynamic range of intact protein analysis using a robust and easy to use microfluidic device. (United States)

    Roman, Gregory T; Murphy, James P


    We demonstrate an integrated microfluidic LC device coupled to a QTOF capable of improving sensitivity and linearity for intact protein analysis while also tuning the charge state distributions (CSD) of whole antibodies. The mechanism for sensitivity improvement using microflow ESI is demonstrated by shifting of the CSD to a higher charge state and narrowing of the overall CSD. Both of these aspects serve to improve the ion current of the most abundant charge state of antibodies and lead to an improvement in sensitivity over high-flow ESI by a factor of 15×. Current limits of detection are 0.1 ng (on-column) (n = 100, %RSD = 17.5) using IgG glycosylated antibody, as compared to 5 ng (on-column) (n = 10, %RSD = 15) for high-flow LC-ESI-MS. In addition to improvements in sensitivity, we also observe improvements in linear dynamic range for microflow ESI that result from a combination of lower limits of detection and narrower CSD. An improvement of linear dynamic range of 1.5 orders of magnitude was observed over conventional high-flow LC-MS. In cases where the complexity of the antibody limited both sensitivity and spectral charge state resolution, we employed supercharging and decharging mechanisms to further improve sensitivity and charge-state spacing resolution. We demonstrate an 89% increase in sensitivity using glycerol added post-column, with retention of the glycoform resolution. Since large proteins reside in a relatively low-noise region of the mass spectra, it is possible to realize effects of supercharging for intact proteins, specifically antibodies of 150 kDa, that are less pronounced for peptide supercharging. We also demonstrate a 51% increase in charge state resolution when imidazole was used to generate lower charge states for high-mass ions. The increase in charge state resolution enables more complex antibodies, or antibody mixtures that coelute in the LC, to be deconvoluted more efficiently. In summary, we demonstrate an analytical technique that
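The charge-state arithmetic behind the deconvolution discussed above can be sketched in a few lines: an ESI peak at mass-to-charge m/z with charge z implies a neutral mass M = z·(m/z) − z·m_proton. The peak list below is synthetic, for a hypothetical 150 kDa antibody, not measured data.

```python
# Proton mass in daltons; each charge is carried by one proton.
M_PROTON = 1.00728

# Synthetic charge-state series for a hypothetical 150 kDa protein.
true_mass = 150000.0
peaks = [(z, (true_mass + z * M_PROTON) / z) for z in range(40, 61)]  # (z, m/z)

# Deconvolution: each peak independently implies the same neutral mass.
masses = [z * mz - z * M_PROTON for z, mz in peaks]
mean_mass = sum(masses) / len(masses)
print(f"deconvoluted mass: {mean_mass:.2f} Da")
```

Real deconvolution must first assign charges to peaks (e.g. from the spacing of adjacent charge states), which is exactly why the wider charge-state spacing obtained with decharging agents such as imidazole makes coeluting antibody mixtures easier to resolve.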

  20. The value of improved wind power forecasting: Grid flexibility quantification, ramp capability analysis, and impacts of electricity market operation timescales

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin; Wu, Hongyu; Florita, Anthony R.; Brancucci Martinez-Anido, Carlo; Hodge, Bri-Mathias


    The value of improving wind power forecasting accuracy at different electricity market operation timescales was analyzed by simulating the IEEE 118-bus test system as modified to emulate the generation mixes of the Midcontinent, California, and New England independent system operator balancing authority areas. The wind power forecasting improvement methodology and error analysis for the data set were elaborated. Production cost simulation was conducted on the three emulated systems with a total of 480 scenarios, considering the impacts of different generation technologies, wind penetration levels, and wind power forecasting improvement timescales. The static operational flexibility of the three systems was compared through the diversity of generation mix, the percentage of must-run baseload generators, as well as the available ramp rate and the minimum generation levels. The dynamic operational flexibility was evaluated by the real-time upward and downward ramp capacity. Simulation results show that the generation resource mix plays a crucial role in evaluating the value of improved wind power forecasting at different timescales. In addition, the changes in annual operational electricity generation costs were mostly influenced by the dominant resource in the system. Finally, the impacts of pumped-storage resources, generation ramp rates, and system minimum generation level requirements on the value of improved wind power forecasting were also analyzed.

  1. Analysis of policy towards improvement of perinatal mortality in the Netherlands (2004-2011). (United States)

    Vos, Amber A; van Voorst, Sabine F; Steegers, Eric A P; Denktaş, Semiha


    Relatively high perinatal mortality and morbidity rates in the Netherlands resulted in a process which induced policy changes regarding the Dutch perinatal healthcare system. The aims of this policy analysis are (1) to identify actors, context and process factors that promoted or impeded agenda setting and formulation of policy regarding perinatal health care reform, and (2) to present an overview of the renewed perinatal health policy. The policy triangle framework for policy analysis by Walt and Gilson was applied. Contents of policy, actors, context factors and process factors were identified by triangulation of data from three sources: a document analysis, a stakeholder analysis and semi-structured interviews with key stakeholders. The analysis enabled us to chronologically reconstruct the policy process in response to the perinatal mortality rates. The quantification of the perinatal mortality problem, the openness of the debate and the nature of the topic were important process factors. The main theme of the policy was that change was required in the entire spectrum of perinatal healthcare, ranging from care in the preconception phase through to the puerperium. Furthermore, emphasis was placed on the importance of preventive measures and the socio-environmental determinants of health. This required involvement of the preventive setting, including municipalities. The Dutch tiered perinatal healthcare system and divergent views amongst curative perinatal health care providers were important context factors. This study provides lessons which are applicable to health care professionals and policy makers in perinatal care or other multidisciplinary fields.

  2. Performance Efficiency Improvement of Parabolic Solar Concentrating Collector (An Experimental Evaluation and Analysis)

    Directory of Open Access Journals (Sweden)

    Durai Kalyana Kumar


    Full Text Available 'Energy conserved is energy generated.' The energy crisis is one of the crucial problems faced by all countries due to the rapid depletion of natural resources. A viable and immediate solution at this juncture is the use of renewable energy sources like solar energy, wind energy, etc. A focusing-type solar energy concentrator was fabricated and tested to evaluate its performance and to improve its operating efficiency. The experimental evaluations were carried out during the solar window (between 9:00 am and 3:00 pm) using statistical solar irradiation data and real-time measurements made with a pyranometer. Efficiency improvement was attempted through different reflecting surfaces, the greenhouse effect and selective coating. The energy conservation, preservation of fossil fuel and carbon footprint were estimated along with the cost economics and are presented in this article in a simplified style.

  3. Improved Distance Discriminant Analysis Method (改进的距离判别分析法)

    Institute of Scientific and Technical Information of China (English)



    Without additional processing of the discriminant variables, this paper improves the traditional distance discriminant method and proposes a new discriminant method that attempts to solve the discriminant problem for complex spherical data and to raise the discriminant accuracy. An example shows that the method has a good discriminant effect and can effectively handle discriminant problems involving complex spherical data.
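The traditional distance discriminant rule that this paper improves on can be sketched as follows: assign a sample to the class whose Mahalanobis distance is smallest. The two-class Gaussian data below are synthetic, for illustration only.

```python
import numpy as np

# Synthetic two-class training data in two dimensions.
rng = np.random.default_rng(5)
class_a = rng.normal([0.0, 0.0], 1.0, size=(200, 2))
class_b = rng.normal([4.0, 4.0], 1.0, size=(200, 2))

def mahalanobis2(x, sample):
    """Squared Mahalanobis distance from x to the sample's distribution."""
    mu = sample.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    d = x - mu
    return d @ cov_inv @ d

# Classify a new point by the nearest class in Mahalanobis distance.
x_new = np.array([0.5, -0.2])
label = "A" if mahalanobis2(x_new, class_a) < mahalanobis2(x_new, class_b) else "B"
print(f"assigned to class {label}")  # point near the origin -> class A
```

The paper's improvement targets cases (complex spherically distributed data) where this classical rule misclassifies; the baseline above is what the accuracy gains are measured against.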

  4. Analysis of visual quality improvements provided by known tools for HDR content (United States)

    Kim, Jaehwan; Alshina, Elena; Lee, JongSeok; Park, Youngo; Choi, Kwang Pyo


    In this paper, the visual quality of different solutions for high dynamic range (HDR) compression is analyzed using MPEG test content. We also simulate a method for efficient HDR compression which is based on the statistical properties of the signal. The method is compliant with the HEVC specification and is also easily compatible with alternative methods that would require HEVC specification changes. It was subjectively tested on commercial TVs and compared with alternative solutions for HDR coding. Subjective visual quality tests were performed on a SUHD TV (SAMSUNG JS9500) with maximum luminance up to 1,000 nits. The solution based on statistical properties shows improvement not only in objective performance but also in visual quality compared to other HDR solutions, while remaining compatible with the HEVC specification.

  5. Improved Sinusoid Analysis and Post-Processing in Parametric Audio Coding

    Institute of Scientific and Technical Information of China (English)

    周宏; 陈健


    This paper proposes improvements to a low bit rate parametric audio coder with a sinusoid model as its kernel. Firstly, we propose a new method to effectively order and select the perceptually most important sinusoids: the sinusoid that contributes most to the reduction of the overall noise-to-mask ratio (NMR) is chosen. Combined with our improved parametric psychoacoustic model and advanced peak riddling techniques, the number of sinusoids required can be greatly reduced and the coding efficiency greatly enhanced. A lightweight version is also given to reduce the amount of computation with only a small sacrifice in performance. Secondly, we propose two enhancement techniques for sinusoid synthesis: bandwidth enhancement and line enhancement. With little overhead, the effective bandwidth can be extended by one more octave; the timbre tends to sound much brighter, thicker and more pleasant.

  6. Security Analysis and Improvement of User Authentication Framework for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Nan Chen


    Full Text Available Cloud Computing, as an emerging, virtual, large-scale distributed computing model, has gained increasing attention in recent years. Meanwhile, it also faces many security challenges, one of which is authentication. In this paper, we first analyze a user authentication framework for cloud computing proposed by Amlan Jyoti Choudhury et al. and point out the security attacks possible against the protocol. Then we propose an improved user authentication scheme. Our improved protocol ensures user legitimacy before entry into the cloud. The confidentiality and mutual authentication of our protocol are formally proved by strand space model theory and the authentication test method. The simulation illustrates that the communication performance of our scheme is efficient.
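Mutual authentication of the kind the abstract describes is typically built from nonce challenges answered with a keyed MAC, so each side proves knowledge of a shared secret. A rough illustrative sketch only (the key name and message flow are invented; this is not the paper's protocol):

```python
import hmac
import hashlib
import os

# Hypothetical secret shared at registration (illustration only)
SECRET = b"shared-user-cloud-key"

def prove(nonce: bytes, secret: bytes = SECRET) -> bytes:
    """Answer a challenge nonce with an HMAC-SHA256 tag over it."""
    return hmac.new(secret, nonce, hashlib.sha256).digest()

def verify(nonce: bytes, tag: bytes, secret: bytes = SECRET) -> bool:
    """Constant-time check that the tag matches the expected HMAC."""
    return hmac.compare_digest(prove(nonce, secret), tag)

# Mutual authentication: each side challenges the other with a fresh nonce
server_nonce = os.urandom(16)          # server -> user challenge
user_tag = prove(server_nonce)         # user proves knowledge of the secret
assert verify(server_nonce, user_tag)  # server checks the user

user_nonce = os.urandom(16)            # user -> server challenge
server_tag = prove(user_nonce)
assert verify(user_nonce, server_tag)  # user checks the server
print("mutual authentication succeeded")
```

Fresh random nonces on both sides are what prevent simple replay of an old response, one of the attack classes such analyses look for.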

  7. Improving PWR core simulations by Monte Carlo uncertainty analysis and Bayesian inference

    CERN Document Server

    Castro, Emilio; Buss, Oliver; Garcia-Herranz, Nuria; Hoefer, Axel; Porsch, Dieter


    A Monte Carlo-based Bayesian inference model is applied to the prediction of reactor operation parameters of a PWR nuclear power plant. In this non-perturbative framework, high-dimensional covariance information describing the uncertainty of microscopic nuclear data is combined with measured reactor operation data in order to provide statistically sound, well founded uncertainty estimates of integral parameters, such as the boron letdown curve and the burnup-dependent reactor power distribution. The performance of this methodology is assessed in a blind test approach, where we use measurements of a given reactor cycle to improve the prediction of the subsequent cycle. As it turns out, the resulting improvement of the prediction quality is impressive. In particular, the prediction uncertainty of the boron letdown curve, which is of utmost importance for the planning of the reactor cycle length, can be reduced by one order of magnitude by including the boron concentration measurement information of the previous...
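The core idea of combining a prior prediction with measured operation data can be reduced to a one-dimensional conjugate-Gaussian update; this toy sketch is not the paper's high-dimensional Monte Carlo model, and all numbers are illustrative:

```python
def normal_update(prior_mean, prior_var, meas, meas_var):
    """Conjugate Gaussian update: blend a prior prediction with a measurement.

    The weight w is the fraction of the prior uncertainty resolved by the data."""
    w = prior_var / (prior_var + meas_var)
    post_mean = prior_mean + w * (meas - prior_mean)
    post_var = (1.0 - w) * prior_var
    return post_mean, post_var

# Toy boron-concentration prediction (all numbers illustrative, ppm)
m, v = normal_update(prior_mean=1200.0, prior_var=100.0**2,
                     meas=1150.0, meas_var=30.0**2)
print(m, v ** 0.5)  # posterior mean and its reduced standard deviation
```

The posterior variance is always smaller than both the prior and measurement variances, which is the mechanism behind the reduction in prediction uncertainty the abstract reports.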

  8. A Comparative Analysis of an Improved 6T SRAM Cell with Different SRAM Cells

    Directory of Open Access Journals (Sweden)

    Aastha Singh


    Full Text Available High speed and low power consumption have been the primary issues in the design of Static Random Access Memory (SRAM), but we are facing new challenges with the scaling of technology. The stability and speed of SRAM are important issues for improving the efficiency and performance of a system. The stability of SRAM depends on the static noise margin (SNM), so the noise margin is also an important parameter for memory design, because a higher noise margin confirms the high speed of the SRAM cell. In this paper, the improved 6T SRAM cell shows a maximum reduction in power consumption of 88%, a maximum reduction in delay of 64%, and a maximum SNM increase of 17% compared with the 7T SRAM cell.

  9. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama


    Full Text Available We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for the automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conduct an experiment in exactly the same settings and find that the method, which is based on a Web mining technique, has lost over 30 percentage points of its performance since being proposed in 2013. Thus, we hypothesize on the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We find that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.

  10. Genetic Diversity Analysis of Iranian Improved Rice Cultivars through RAPD Markers

    Directory of Open Access Journals (Sweden)

    Ghaffar KIANI


    Full Text Available The aim of this study was to evaluate the genetic diversity of Iranian improved rice varieties. Sixteen rice varieties of particular interest to breeding programs were evaluated by means of the random amplified polymorphic DNA (RAPD) technique. The number of amplification products generated by each primer varied from 4 (OPB-04) to 11 (OPD-11), with an average of 8.2 bands per primer. Out of 49 bands, 33 (67.35%) were found to be polymorphic for one or more cultivars, ranging from 4 to 9 fragments per primer. The size of the amplified fragments ranged from 350 to 1800 bp. Pair-wise Nei and Li's (1979) similarity estimates between rice cultivars ranged from 0.59 to 0.98. The results illustrate the potential of RAPD markers to distinguish improved cultivars at the DNA level. The information will facilitate the selection of genotypes to serve as parents for effective rice breeding programs in Iran.
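The Nei and Li (1979) similarity used in such RAPD studies is twice the number of bands shared by two profiles divided by the total number of bands in both. A minimal sketch with invented band sizes:

```python
def nei_li_similarity(bands_a, bands_b):
    """Nei & Li (1979) similarity: 2 * shared bands / (bands in A + bands in B)."""
    shared = len(set(bands_a) & set(bands_b))
    return 2.0 * shared / (len(set(bands_a)) + len(set(bands_b)))

# Toy RAPD band profiles for two cultivars (band sizes in bp, illustrative only)
cv1 = [350, 500, 700, 1100]
cv2 = [350, 700, 900, 1100, 1400]
print(nei_li_similarity(cv1, cv2))  # 3 shared bands out of 4 + 5
```

A value of 1.0 means identical band profiles; values such as the 0.59-0.98 range reported above indicate varying degrees of band sharing between cultivar pairs.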

  11. An analysis of "eco-centric management" in China and methods to improve it

    Institute of Scientific and Technical Information of China (English)

    Li Zhengfeng; Zhang Fan


    This essay focuses on the question "what does the evidence on environmental regulation and its implementation tell us about the extent of eco-centric management in China and how to improve it". The first part introduces ecocentrism, eco-centric management, and one major way to achieve eco-centric management in practice. Second, the environmental regulations of the United Nations (UN) and China are analyzed and compared to find out whether they are eco-centric. Moreover, the implementation of environmental regulation in China is analyzed, because regulation cannot exist without proper implementation. Three suggestions are given to improve eco-centric management in China: natural science research and public administration, environmental education, and international cooperation.

  12. Improved neutron-gamma discrimination for a 6Li-glass neutron detector using digital signal analysis methods (United States)

    Wang, C. L.; Riedel, R. A.


    A 6Li-glass scintillator (GS20) based neutron Anger camera was developed for time-of-flight single-crystal diffraction instruments at Spallation Neutron Source. Traditional Pulse-Height Analysis (PHA) for Neutron-Gamma Discrimination (NGD) resulted in the neutron-gamma efficiency ratio (defined as NGD ratio) on the order of 10^4. The NGD ratios of Anger cameras need to be improved for broader applications including neutron reflectometers. For this purpose, six digital signal analysis methods of individual waveforms acquired from photomultiplier tubes were proposed using (i) charge integration, (ii) pulse-amplitude histograms, (iii) power spectrum analysis combined with the maximum pulse-amplitude, (iv) two event parameters (a1, b0) obtained from a Wiener filter, (v) an effective amplitude (m) obtained from an adaptive least-mean-square filter, and (vi) a cross-correlation coefficient between individual and reference waveforms. The NGD ratios are about 70 times those from the traditional PHA method. Our results indicate the NGD capabilities of neutron Anger cameras based on GS20 scintillators can be significantly improved with digital signal analysis methods.
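Method (vi), the cross-correlation coefficient between an individual waveform and a reference waveform, can be sketched as a normalized inner product of mean-subtracted pulses; the pulse shapes below are invented for illustration only:

```python
import numpy as np

def correlation_coefficient(waveform, reference):
    """Normalized cross-correlation between an event waveform and a reference pulse."""
    w = waveform - waveform.mean()
    r = reference - reference.mean()
    return float(np.dot(w, r) / (np.linalg.norm(w) * np.linalg.norm(r)))

t = np.linspace(0.0, 1.0, 200)
reference = np.exp(-t / 0.1) - np.exp(-t / 0.02)        # template pulse shape
neutron_like = 0.8 * reference + 0.01 * np.sin(40 * t)  # similar shape, scaled
gamma_like = np.exp(-t / 0.3) - np.exp(-t / 0.15)       # different decay shape

print(correlation_coefficient(neutron_like, reference) >
      correlation_coefficient(gamma_like, reference))   # → True
```

Events whose coefficient falls below a chosen threshold would be rejected as gamma-like, which is the discrimination principle behind this class of waveform methods.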

  14. Does downsizing improve organizational performance? An analysis of Spanish manufacturing firms


    Sánchez-Bueno, María José; Muñoz-Bullón, Fernando


    The objective of this study is to examine the effect of downsizing on corporate performance, considering a sample of manufacturing firms drawn from the Spanish Survey of Business Strategies during the 1993-2005 period. No significant differences in post-downsizing performance arise between companies that downsize and those that do not. Likewise, we find that substantial workforce reductions through collective dismissals do not lead to improved performance levels either. Downsizing, therefor...

  15. The job analysis of Korean nurses as a strategy to improve the Korean Nursing Licensing Examination


    In Sook Park; Yeon Ok Suh; Hae Sook Park; Soo Yeon Ahn; So Young Kang; Il Sun Ko


    Purpose: This study aimed at characterizing Korean nurses' occupational responsibilities and applying the results to the improvement of the Korean Nursing Licensing Examination. Methods: First, the contents of the nursing job were defined based on a focus group interview with 15 nurses. The Developing a Curriculum (DACUM) method was used by 13 experts to examine those results and produce the questionnaire. A questionnaire survey of 5,065 hospital nurses was then conducted. Results: The occupational respon...

  16. An Impact Analysis of Logistics Accessibility Improvements on the Productivity of Manufacturing Sectors


    ITOH, Hidekazu


    This study constructs a theoretical production function that incorporates logistics accessibility and analyzes the economic impacts of improvements in freight transport for a regional economy. Using panel data between 1995 and 2010 for Japan, we evaluate the impacts of interregional logistics accessibility, or inbound (outbound) shipping of intermediate (final) goods, on production activity. The results show that the production function has increasing returns to scale,...

  17. Improving Scaling and root planing over the past 40 years: A meta-analysis


    Zaugg, Balthasar; Sahrmann, Philipp; Roos, Malgorzata; Attin, Thomas; Schmidlin, Patrick R.


    Aim: To assess whether or not classical non-surgical periodontal therapy has improved over the last four decades, and how adjunctive local or systemic measures have influenced its clinical outcome. Methodology: Starting from the year 1970, the entire annual sets of publications of every 5th year of the "Journal of Clinical Periodontology" and the "Journal of Periodontology" were hand searched for articles dealing with non-surgical periodontal therapy, i.e. scaling and root planing either alone (SRP) ...

  18. Analysis of Productivity Improvement Act for Clinical Staff Working in the Health System: A Qualitative Study


    Vali, Leila; Tabatabaee, Seyed Saeed; Kalhor, Rohollah; Amini, Saeed; Kiaei, Mohammad Zakaria


    Introduction: The productivity of healthcare staff is one of the main issues for health managers. This study explores the concept of the executive regulation of the Productivity Improvement Act for clinical staff in the health system. Methods: In this study, a phenomenological methodology was employed. The data were collected through semi-structured interviews and a focus group composed of 10 hospital experts and experts from the human resources department working in the headquarters of Mashhad University of Medical Scienc...

  19. Analysis of explosion in enclosure based on improved method of images (United States)

    Wu, Z.; Guo, J.; Yao, X.; Chen, G.; Zhu, X.


    The aim of this paper is to present an improved method to calculate the pressure loading on walls during a confined explosion. When an explosion occurs inside an enclosure, reflected shock waves produce multiple pressure peaks at a given wall location, especially at the corners. The effects of confined blast loading may bring about more serious damage to the structure due to multiple shock reflections. An approach first proposed by Chan, which uses the method of images (MOI) to describe the tracks of shock waves based on mirror-reflection theory, is adopted to simplify internal explosion loading calculations. An improved method of images is proposed that takes into account wall openings and oblique reflections, which cannot be considered with the standard MOI. The approach, validated using experimental data, provides a simplified and quick way to calculate the loading of a confined explosion. The results show that the peak overpressure tends to decline as the measurement point moves away from the center, and increases sharply as it approaches the enclosure corners. The specific impulse increases from the center to the corners. The improved method is capable of predicting pressure-time history and impulse with an accuracy comparable to that of three-dimensional AUTODYN code predictions.
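The core of the method of images is to replace each wall reflection by a mirror-image source in an unbounded domain. A minimal one-dimensional sketch for a charge between two parallel rigid walls (a generic illustration of the standard MOI, not the paper's improved variant with openings and oblique reflections):

```python
def image_sources_1d(x0, L, max_order):
    """Image-source positions for a charge at x0 between rigid walls at x=0 and x=L.

    Repeated mirroring across the two walls places images at 2*n*L + x0
    (even reflection count) and 2*n*L - x0 (odd reflection count)."""
    images = []
    for n in range(-max_order, max_order + 1):
        images.append(2 * n * L + x0)   # even number of reflections
        images.append(2 * n * L - x0)   # odd number of reflections
    return sorted(set(images))

# Charge 2 m from one wall of a 10 m wide enclosure, reflections up to 2nd order
srcs = image_sources_1d(2.0, 10.0, 2)
print(srcs)
```

Summing the free-field pulse of every image source at a wall point reproduces the multiple pressure peaks that direct and reflected shocks produce there; higher `max_order` captures later, weaker reflections.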

  20. Subspace Leakage Analysis and Improved DOA Estimation With Small Sample Size (United States)

    Shaghaghi, Mahdi; Vorobyov, Sergiy A.


    Classical methods of DOA estimation such as the MUSIC algorithm are based on estimating the signal and noise subspaces from the sample covariance matrix. For a small number of samples, such methods are exposed to performance breakdown, as the sample covariance matrix can largely deviate from the true covariance matrix. In this paper, the problem of DOA estimation performance breakdown is investigated. We consider the structure of the sample covariance matrix and the dynamics of the root-MUSIC algorithm. The performance breakdown in the threshold region is associated with subspace leakage, where some portion of the true signal subspace resides in the estimated noise subspace. In this paper, the subspace leakage is theoretically derived. We also propose a two-step method which improves the performance by modifying the sample covariance matrix such that the amount of subspace leakage is reduced. Furthermore, we introduce a phenomenon termed root-swap, which occurs in the root-MUSIC algorithm in the low sample size region and degrades the performance of the DOA estimation. A new method is then proposed to alleviate this problem. Numerical examples and simulation results are given for uncorrelated and correlated sources to illustrate the improvement achieved by the proposed methods. Moreover, the proposed algorithms are combined with the pseudo-noise resampling method to further improve the performance.
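The pipeline the paper starts from (sample covariance, eigendecomposition into signal and noise subspaces, then a MUSIC-style spectrum search) can be sketched for a uniform linear array. This is plain spectral MUSIC on simulated data, not the proposed two-step correction, and all scenario parameters are toy values:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 100                 # sensors, snapshots
theta_true = 20.0             # source direction of arrival, degrees

def steering(theta_deg, M):
    """ULA steering vector for half-wavelength element spacing."""
    k = np.pi * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(M))

# Simulate snapshots: one narrowband source plus white noise
a = steering(theta_true, M)
s = rng.standard_normal(N) + 1j * rng.standard_normal(N)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = np.outer(a, s) + noise

R = X @ X.conj().T / N                 # sample covariance matrix
_, eigvecs = np.linalg.eigh(R)         # eigenvalues in ascending order
En = eigvecs[:, :-1]                   # noise subspace (one source assumed)

grid = np.arange(-90.0, 90.0, 0.1)
spectrum = [1.0 / np.linalg.norm(En.conj().T @ steering(t, M)) ** 2 for t in grid]
est = grid[int(np.argmax(spectrum))]
print(est)                             # estimated DOA, near 20 degrees
```

With few snapshots (small `N`), the estimated noise subspace `En` absorbs part of the true signal subspace, which is exactly the subspace-leakage regime the paper analyzes.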

  1. Performance Analysis Of An Improved Graded Precision Localization Algorithm For Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sanat Sarangi


    Full Text Available In this paper an improved version of the graded precision localization algorithm GRADELOC, called IGRADELOC, is proposed. The performance of GRADELOC depends on the regions formed by the overlapping radio ranges of the nodes of the underlying sensor network. A different region pattern could significantly alter the nature and precision of localization. In IGRADELOC, two improvements are suggested. Firstly, modifications are proposed in the radio range of the fixed-grid nodes, keeping in mind the actual radio range of commonly available nodes, to allow for routing through them. Routing is not addressed by GRADELOC, but is of prime importance to the deployment of any ad hoc network, especially sensor networks. A theoretical model expressing the radio range in terms of the cell dimensions of the grid infrastructure is proposed, to help in carrying out a deployment plan that achieves the desirable precision of coarse-grained localization. Secondly, in GRADELOC it is observed that fine-grained localization does not achieve significant performance benefits over coarse-grained localization. In IGRADELOC, this factor is addressed with the introduction of a parameter that could be used to improve and fine-tune the precision of fine-grained localization.

  2. The Fuel Efficiency of Maritime Transport. Potential for improvement and analysis of barriers

    Energy Technology Data Exchange (ETDEWEB)

    Faber, J.; Nelissen, D.; Smit, M. [CE Delft, Delft (Netherlands); Behrends, B. [Marena Ltd., s.l. (United Kingdom); Lee, D.S. [Manchester Metropolitan University, Machester (United Kingdom)


    There is significant potential to improve the fuel efficiency of ships and thus contribute to reducing greenhouse gas emissions from maritime transport. It has long been recognised that this potential is not being fully exploited, owing to the existence of non-market barriers. This report analyses the barriers to implementing fuel efficiency improvements, and concludes that the most important of these are the split incentive between ship owners and operators, a lack of trusted data on new technologies, and transaction costs associated with evaluating measures. As a result, in practice about a quarter of the cost-effective abatement potential is unavailable. There are several ways to overcome these barriers. The split incentive can - to some extent - be overcome by providing more detailed information on the fuel efficiency of vessels, making due allowance for operational profiles. This would allow fuel consumption to be more accurately projected and a larger share of efficiency benefits to accrue to ship owners, thus increasing the return on investment in fuel-saving technologies. This would also require changes to standard charter parties. The credibility of information on new technologies can be improved through intensive collaboration between suppliers of new technologies and shipping companies. In order to overcome risk, government subsidies could provide an incentive. This could have the additional benefit that governments could require publication of results.

  3. Improved simultaneous gas-chromatographic analysis for homovanillic acid and vanillylmandelic acid in urine. (United States)

    Leiendecker-Foster, C; Freier, E F


    We describe an improved gas-chromatographic method for the simultaneous quantitation of the catecholamine metabolites, homovanillic acid (3-methoxy-4-hydroxyphenylacetic acid) and vanillylmandelic acid (3-methoxy-4-hydroxymandelic acid). Our improvements in the method of Muskiet et al. (Clin. Chem. 23: 863, 1977) include a shorter program time and a longer silylation interval. Recovery and precision data obtained by this improved technique are similar to those of Muskiet et al. Vanillylmandelic acid results (y) were compared with those by the method of Pisano et al. (Clin. Chim. Acta 7: 285, 1962). The relation is expressed by the equation y = 0.52 + 1.05x (Sy·x = 2.33 mg/24 h and r = 0.997). Results for homovanillic acid (y) were compared with those by the method of Knight and Haymond (Clin. Chem. 23: 2007, 1977); the equation was y = 0.84 + 0.90x (Sy·x = 2.04 and r = 0.97). Retention times are also reported for several phenolic acids and other related compounds found in urine.
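The method-comparison statistics quoted above (intercept, slope, the standard error of the estimate Sy·x, and the correlation r) can be computed for any paired data set by ordinary least squares; a sketch with invented paired values, not the study's data:

```python
import numpy as np

def compare_methods(x, y):
    """Least-squares comparison line y = a + b*x, plus Sy·x and r."""
    b, a = np.polyfit(x, y, 1)                         # slope, intercept
    resid = y - (a + b * x)
    s_yx = np.sqrt(np.sum(resid ** 2) / (len(x) - 2))  # standard error of estimate
    r = np.corrcoef(x, y)[0, 1]
    return a, b, s_yx, r

# Illustrative paired vanillylmandelic acid results (mg/24 h) from two methods
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
y = 0.5 + 1.05 * x + np.array([0.1, -0.1, 0.05, -0.05, 0.1, -0.1])
a, b, s_yx, r = compare_methods(x, y)
print(round(a, 2), round(b, 2))
```

A slope near 1, an intercept near 0, a small Sy·x and an r near 1 together indicate close agreement between the two methods, which is how the equations in the abstract are read.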

  4. Improved sample preparation for CE-LIF analysis of plant N-glycans. (United States)

    Nagels, Bieke; Santens, Francis; Weterings, Koen; Van Damme, Els J M; Callewaert, Nico


    In view of glycomics studies in plants, it is important to have sensitive tools to analyze and characterize the N-glycans present on plant proteins in different species. Earlier methods combined plant-based sample preparation with CE-LIF N-glycan analysis but suffered from background contamination, often resulting in non-reproducible results. This publication describes a reproducible and sensitive protocol for the preparation and analysis of plant N-glycans, based on a combination of the 'in-gel release method' and N-glycan analysis on a multicapillary DNA sequencer. Our protocol makes it possible to analyze plant N-glycans starting from low amounts of plant material, with highly reproducible results. The developed protocol was validated for different plant species and plant cells.

  5. Foreign object detection and removal to improve automated analysis of chest radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Hogeweg, Laurens; Sanchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van [Diagnostic Image Analysis Group, Radboud University Nijmegen Medical Centre, Nijmegen 6525 GA (Netherlands); Story, Alistair; Hayward, Andrew [University College London, Centre for Infectious Disease Epidemiology, London NW3 2PF (United Kingdom)


    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel to belong to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free response operator characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
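The detection step, supervised pixel classification with a kNN classifier that yields a per-pixel object probability, can be sketched in a few lines; the two-dimensional features and training points below are invented toy values, not radiograph data:

```python
import numpy as np

def knn_probability(train_feats, train_labels, pixel_feats, k=3):
    """kNN probability that each pixel belongs to a foreign object (label 1)."""
    probs = []
    for f in pixel_feats:
        d = np.linalg.norm(train_feats - f, axis=1)   # distances to training pixels
        nearest = train_labels[np.argsort(d)[:k]]     # labels of k nearest neighbours
        probs.append(nearest.mean())                  # fraction labelled 'object'
    return np.array(probs)

# Toy features: (intensity, local contrast); label 1 = foreign object
train = np.array([[0.9, 0.8], [0.95, 0.9], [0.85, 0.7],
                  [0.2, 0.1], [0.3, 0.2], [0.25, 0.15]])
labels = np.array([1, 1, 1, 0, 0, 0])

p = knn_probability(train, labels, np.array([[0.9, 0.85], [0.22, 0.12]]))
mask = p > 0.5   # segmentation by thresholding the probability map
print(mask)
```

Thresholding the probability map and grouping the surviving pixels corresponds to the segmentation step described above; the segmented regions would then be handed to the inpainting stage.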

  6. Time series analysis of improved quality of life in Canada: social change, collective consciousness, and the TM-Sidhi program. (United States)

    Assimakis, P D; Dillbeck, M C


    Two replication studies test, in Canada, a field theory of the effect of consciousness on social change. The exogenous variable is the number of participants in the largest North American group practice of the Transcendental Meditation and TM-Sidhi program, in Iowa. The first study indicated a significant reduction in violent deaths (homicide, suicide, and motor vehicle fatalities), using both time series intervention analysis and transfer function analysis methods, in weeks following change in the exogenous variable during the period 1983 to 1985. The second study, using time series intervention analysis, showed, during and after the intervention periods, a significant improvement in quality of life on an index composed of the behavioral variables available on a monthly basis for Canada from 1972 to 1986: homicide, suicide, motor vehicle fatalities, cigarette consumption, and workers' days lost due to strikes. Implications of the findings for theory and social policy are noted briefly.

  7. Improvement of Information and Methodical Provision of Macro-economic Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Tiurina Dina M.


    Full Text Available The article generalises and analyses the main shortcomings of the modern system of macro-statistical analysis based on the use of the system of national accounts and the balance of the national economy. On the basis of a historical analysis of the formation of the indicators of the system of national accounts, the article shows that the problems with its practical use have both regional and global causes. To address the system's inability to account for quality of life, the article offers a system of quality indicators based on the general perception of wellbeing as the population's confidence in its own solvency, drawn from a representative sample of economic subjects.

  8. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.


    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  9. Improved elucidation of biological processes linked to diabetic nephropathy by single probe-based microarray data analysis.

    Directory of Open Access Journals (Sweden)

    Clemens D Cohen

    Full Text Available BACKGROUND: Diabetic nephropathy (DN) is a complex and chronic metabolic disease that evolves into a progressive fibrosing renal disorder. Effective transcriptomic profiling of slowly evolving disease processes such as DN can be problematic. The changes that occur are often subtle and can escape detection by conventional oligonucleotide DNA array analyses. METHODOLOGY/PRINCIPAL FINDINGS: We examined microdissected human renal tissue with or without DN using Affymetrix oligonucleotide microarrays (HG-U133A) by standard Robust Multi-array Analysis (RMA). Subsequent gene ontology analysis by the Database for Annotation, Visualization and Integrated Discovery (DAVID) showed limited detection of biological processes previously identified as central mechanisms in the development of DN (e.g. inflammation and angiogenesis). This apparent lack of sensitivity may be associated with the gene-oriented averaging of oligonucleotide probe signals, as this includes signals from cross-hybridizing probes and gene annotation that is based on out-of-date genomic data. We then examined the same CEL file data using a different methodology to determine how well it could correlate transcriptomic data with observed biology. ChipInspector (CI) is based on single probe analysis and de novo gene annotation that bypasses probe set definitions. Both methods, RMA and CI, used at default settings yielded comparable numbers of differentially regulated genes. However, when verified by RT-PCR, the single probe based analysis demonstrated reduced background noise with enhanced sensitivity and fewer false positives. CONCLUSIONS/SIGNIFICANCE: Using a single probe based analysis approach with de novo gene annotation allowed an improved representation of the biological processes linked to the development and progression of DN. The improved analysis was exemplified by the detection of Wnt signaling pathway activation in DN, a process not previously reported to be involved in this disease.

  10. MATT: Monitoring, Analysis and Toxicity of Toxaphene: improvement of analytical methods


    De Boer, J.; Klungsøyr, J; Nesje, G.; Meier, S; McHugh, B.; Nixon, E.; Rimkus, G.G.


    The European Research Project MATT (Investigation into the Monitoring, Analysis and Toxicity of Toxaphene) started in 1997 and aimed to provide information on the toxicological risks to consumers of toxaphene residues in fish from European waters. This report covers the analytical block of the project, which comprised three studies.

  11. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase II (United States)


    Distributed Information and Systems Experimentation (DISE). Dr. Gallup has a multi-disciplinary science, engineering, and analysis background including...Experimentation (DISE) research group, since 2007. In 2009, he became involved with data mining research and its effect on Knowledge Management and defense

  12. Improving the quality of team task analysis: experimental validation of guidelines

    NARCIS (Netherlands)

    Berlo, M.P.W. van


    Traditionally, task analysis is conducted at the level of individual task performers. Instructional designers for team training find it hard to conduct task analyses at the level of a team. It is difficult to capture the interactivity and interdependency between the various team members, and to form

  13. Improved cosmological constraints from a joint analysis of the SDSS-II and SNLS supernova samples

    CERN Document Server

    Betoule, M; Guy, J; Mosher, J; Hardin, D; Biswas, R; Astier, P; El-Hage, P; Konig, M; Kuhlmann, S; Marriner, J; Pain, R; Regnault, N; Balland, C; Bassett, B A; Brown, P J; Campbell, H; Carlberg, R G; Cellier-Holzem, F; Cinabro, D; Conley, A; D'Andrea, C B; DePoy, D L; Doi, M; Ellis, R S; Fabbro, S; Filippenko, A V; Foley, R J; Frieman, J A; Fouchez, D; Galbany, L; Goobar, A; Gupta, R R; Hill, G J; Hlozek, R; Hogan, C J; Hook, I M; Howell, D A; Jha, S W; Guillou, L Le; Leloudas, G; Lidman, C; Marshall, J L; Möller, A; Mourão, A M; Neveu, J; Nichol, R; Olmstead, M D; Palanque-Delabrouille, N; Perlmutter, S; Prieto, J L; Pritchet, C J; Richmond, M; Riess, A G; Ruhlmann-Kleider, V; Sako, M; Schahmaneche, K; Schneider, D P; Smith, M; Sollerman, J; Sullivan, M; Walton, N A; Wheeler, C J


    We present cosmological constraints from a joint analysis of type Ia supernova (SN Ia) observations obtained by the SDSS-II and SNLS collaborations. The data set includes several low-redshift samples (z<0.1), all 3 seasons from the SDSS-II (0.05 < z < 0.4), and 3 years from SNLS (0.2

  14. Improving production of β-lactam antibiotics by Penicillium chrysogenum: Metabolic engineering based on transcriptome analysis

    NARCIS (Netherlands)

    Veiga, T.


    In Chapters 2-5 of this thesis, the applicability of transcriptome analysis to guide metabolic engineering strategies in P. chrysogenum is explored by investigating four cellular processes that are of potential relevance for industrial production of β-lactam antibiotics: - Regulation of secondary me

  15. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey (United States)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.


The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of 2015 of the Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results satisfied |Z-score| ≤ 3.
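The Wepal assessment described above rests on the standard proficiency-test Z-score, z = (x_lab − x_assigned)/σ_target, with |Z| ≤ 2 conventionally read as satisfactory and |Z| ≤ 3 as questionable. A minimal sketch (the numeric values are hypothetical, not Wepal data):

```python
def z_score(lab_value, assigned_value, target_sd):
    """Proficiency-test Z-score: deviation of the lab result from the
    assigned value, in units of the target standard deviation."""
    return (lab_value - assigned_value) / target_sd

def performance(z):
    """Common interpretation bands for |Z| in proficiency testing."""
    z = abs(z)
    if z <= 2:
        return "satisfactory"
    if z <= 3:
        return "questionable"
    return "unsatisfactory"

# Hypothetical result: lab reports 41.2 mg/kg Zn against an assigned
# value of 39.5 mg/kg with a target SD of 2.0 mg/kg.
z = z_score(41.2, 39.5, 2.0)
```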

  16. The Efficacy of Written Corrective Feedback in Improving L2 Written Accuracy: A Meta-Analysis (United States)

    Kang, EunYoung; Han, Zhaohong


    Written corrective feedback has been subject to increasing attention in recent years, in part because of the conceptual controversy surrounding it and in part because of its ubiquitous practice. This study takes a meta-analytic approach to synthesizing extant empirical research, including 21 primary studies. Guiding the analysis are two questions:…

  17. Improving Student Critical Thinking and Perceptions of Critical Thinking through Direct Instruction in Rhetorical Analysis (United States)

    McGuire, Lauren A.


    This study investigated the effect of direct instruction in rhetorical analysis on students' critical thinking abilities, including knowledge, skills, and dispositions. The researcher investigated student perceptions of the effectiveness of argument mapping; Thinker's Guides, based on Paul's model of critical thinking; and Socratic questioning.…

  18. Improved method for fibre content and quality analysis and their application to flax genetic diversity investigations

    NARCIS (Netherlands)

    Oever, van den M.J.A.; Bas, N.; Soest, van L.J.M.; Melis, C.; Dam, van J.E.G.


    Evaluation for fibre content and quality in a breeding selection program is time consuming and costly. Therefore, this study aims to develop a method for fast and reproducible fibre content analysis on small flax straw samples. A protocol has been developed and verified with fibre screening methods

  19. An Improvement of the Harman-Fukuda-Method for the Minres Solution in Factor Analysis. (United States)

    Hafner, Robert


    The method proposed by Harman and Fukuda to treat the so-called Heywood case in the minres method in factor analysis (i.e., the case where the resulting communalities are greater than one) involves the frequent solution of eigenvalue problems. A simple method to treat this problem is presented. (Author/JKS)

  20. Economic viewpoints in educational effectiveness : Cost-effectiveness analysis of an educational improvement project

    NARCIS (Netherlands)

    Creemers, B; van der Werf, G


    Cost-effectiveness analysis is not only important for decision making in educational policy and practice. Also within educational effectiveness research it is important to establish the costs of educational processes in relationship to their effects. The integrated multilevel educational effectivene

  1. Meta-Analysis as a Choice to Improve Research in Career and Technical Education (United States)

    Gordon, Howard R. D.; McClain, Clifford R.; Kim, Yeonsoo; Maldonado, Cecilia


A search of the ERIC and Academic Search Premier databases, and a comprehensive review of the literature, suggests that meta-analysis is ignored by career and technical education (CTE) researchers, a situation that is regrettable but remediable. The purpose of this theoretical paper is to provide CTE researchers and consumers with procedures for…

  2. Improving Treatment Plan Implementation in Schools: A Meta-Analysis of Single Subject Design Studies (United States)

    Noell, George H.; Gansle, Kristin A.; Mevers, Joanna Lomas; Knox, R. Maria; Mintz, Joslyn Cynkus; Dahir, Amanda


    Twenty-nine peer-reviewed journal articles that analyzed intervention implementation in schools using single-case experimental designs were meta-analyzed. These studies reported 171 separate data paths and provided 3,991 data points. The meta-analysis was accomplished by fitting data extracted from graphs in mixed linear growth models. This…

  3. Costs and benefits of automotive fuel economy improvement: A partial analysis

    Energy Technology Data Exchange (ETDEWEB)

    Greene, D.L. (Oak Ridge National Lab., TN (United States)); Duleep, K.G. (Energy and Environmental Analysis, Inc., Arlington, VA (United States))


This paper is an exercise in estimating the costs and benefits of technology-based fuel economy improvements for automobiles and light trucks. Benefits quantified include vehicle costs, fuel savings, consumer's surplus effects, the effect of reduced weight on vehicle safety, impacts on emissions of CO{sub 2} and criteria pollutants, world oil market and energy security benefits, and the transfer of wealth from US consumers to oil producers. A vehicle stock model is used to capture sales, scrappage, and vehicle use effects under three fuel price scenarios. Three alternative fuel economy levels for 2001 are considered, ranging from 32.9 to 36.5 MPG for cars and 24.2 to 27.5 MPG for light trucks. Fuel economy improvements of this size are probably cost-effective. The size of the benefit, and whether there is a benefit at all, depends strongly on the financial costs of fuel economy improvement and on judgments about the values of energy security, emissions, safety, etc. Three sets of values for eight parameters are used to define the sensitivity of costs and benefits to key assumptions. The net present social value (1989$) of costs and benefits ranges from a cost of $11 billion to a benefit of $286 billion, the critical parameters being the discount rate (10% vs. 3%) and the values attached to externalities. The two largest components are always the direct vehicle costs and fuel savings, but these tend to counterbalance each other for the fuel economy levels examined here. Other components are the wealth transfer, oil cost savings, CO{sub 2} emissions reductions, and energy security benefits. Safety impacts, emissions of criteria pollutants, and consumer's surplus effects are relatively minor components. The critical issues for automotive fuel economy are therefore: (1) the value of present versus future costs and benefits, (2) the values of external costs and benefits, and (3) the financially cost-effective level of MPG achievable by available technology. 53 refs.
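The sensitivity to the discount rate (10% vs. 3%) noted above is a plain net-present-value effect: a lower rate weights future fuel savings more heavily against the up-front vehicle cost. A minimal sketch with an invented cash-flow stream (the figures are illustrative, not the paper's data):

```python
def npv(cash_flows, rate):
    """Net present value of annual cash flows, cash_flows[0] at year 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Invented stream: an up-front vehicle-cost increase followed by ten
# years of fuel savings (illustrative magnitudes only).
flows = [-40.0] + [8.0] * 10
npv_low  = npv(flows, 0.03)  # low discount rate favours future savings
npv_high = npv(flows, 0.10)  # high rate weights the up-front cost more
```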


  5. Stability analysis and design of the improved droop controller on a voltage source inverter

    DEFF Research Database (Denmark)

    Calabria, Mauro; Schumacher, Walter; Guerrero, Josep M.;


This paper studies the dynamics of a droop-controlled voltage source inverter connected to a stiff grid and addresses the use of the improved droop controller in order to enhance the dynamic behavior of the system. The small-signal stability of the inverter is studied in depth considering variations of its droop parameters, providing a general understanding of the functioning of such a system. Finally, three methods to tune the dynamic droop gain are introduced and compared, with analytical, simulation, and experimental results.

  6. Analysis and Improvement of Aerodynamic Performance of Straight Bladed Vertical Axis Wind Turbines (United States)

    Ahmadi-Baloutaki, Mojtaba

    Vertical axis wind turbines (VAWTs) with straight blades are attractive for their relatively simple structure and aerodynamic performance. Their commercialization, however, still encounters many challenges. A series of studies were conducted in the current research to improve the VAWTs design and enhance their aerodynamic performance. First, an efficient design methodology built on an existing analytical approach is presented to formulate the design parameters influencing a straight bladed-VAWT (SB-VAWT) aerodynamic performance and determine the optimal range of these parameters for prototype construction. This work was followed by a series of studies to collectively investigate the role of external turbulence on the SB-VAWTs operation. The external free-stream turbulence is known as one of the most important factors influencing VAWTs since this type of turbines is mainly considered for urban applications where the wind turbulence is of great significance. Initially, two sets of wind tunnel testing were conducted to study the variation of aerodynamic performance of a SB-VAWT's blade under turbulent flows, in two major stationary configurations, namely two- and three-dimensional flows. Turbulent flows generated in the wind tunnel were quasi-isotropic having uniform mean flow profiles, free of any wind shear effects. Aerodynamic force measurements demonstrated that the free-stream turbulence improves the blade aerodynamic performance in stall and post-stall regions by delaying the stall and increasing the lift-to-drag ratio. After these studies, a SB-VAWT model was tested in the wind tunnel under the same type of turbulent flows. The turbine power output was substantially increased in the presence of the grid turbulence at the same wind speeds, while the increase in turbine power coefficient due to the effect of grid turbulence was small at the same tip speed ratios. The final section presents an experimental study on the aerodynamic interaction of VAWTs in arrays

  7. Improved quality control of silicon wafers using novel off-line air pocket image analysis (United States)

    Valley, John F.; Sanna, M. Cristina


    Air pockets (APK) occur randomly in Czochralski (Cz) grown silicon (Si) crystals and may become included in wafers after slicing and polishing. Previously the only APK of interest were those that intersected the front surface of the wafer and therefore directly impacted device yield. However mobile and other electronics have placed new demands on wafers to be internally APK-free for reasons of thermal management and packaging yield. We present a novel, recently patented, APK image processing technique and demonstrate the use of that technique, off-line, to improve quality control during wafer manufacturing.

  8. Enhancement on "Security analysis and improvements of arbitrated quantum signature schemes"

    CERN Document Server

    Hwang, Tzonelih; Chong, Song-Kong


    Recently, Zou et al. [Phys. Rev. A 82, 042325 (2010)] demonstrated that two arbitrated quantum signature (AQS) schemes are not secure, because an arbitrator cannot arbitrate the dispute between two users when a receiver repudiates the integrity of a signature. By using a public board, Zou et al. proposed two AQS schemes to solve the problem. This work shows that the same security problem may exist in Zou et al.'s schemes and also that a malicious party can reveal the other party's secret key without being detected by using Trojan-horse attacks. Accordingly, an improved scheme is proposed to resolve the problems.

  9. Analysis of Server Log by Web Usage Mining for Website Improvement

    Directory of Open Access Journals (Sweden)

    Navin Kumar Tyagi


Full Text Available Web server logs store clickstream data which can be useful for mining purposes. The data are stored as a result of users' accesses to a website. Web usage mining, an application of data mining, can be used to discover user access patterns from weblog data. The obtained results are used in different applications such as site modification, business intelligence, system improvement and personalization. In this study, we have analyzed the log files of the smart sync software web server to get information about visitors and top errors, which can be utilized by system administrators and web designers to increase the effectiveness of the web site.
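The "top errors" analysis described above can be sketched as a pass over a Common Log Format file, counting requests whose status code is 4xx/5xx. A minimal example (the log format is the standard CLF assumption; the sample lines are invented):

```python
import re
from collections import Counter

# Minimal Common Log Format parser (illustrative; real servers often use
# the Combined format with extra referer and user-agent fields).
LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

def top_errors(lines, n=3):
    """Return the n most frequent (status, URL) pairs with status >= 400."""
    errors = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed lines
        host, ts, method, url, status, size = m.groups()
        if int(status) >= 400:
            errors[(status, url)] += 1
    return errors.most_common(n)

sample = [
    '10.0.0.1 - - [01/Jan/2015:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1043',
    '10.0.0.2 - - [01/Jan/2015:10:00:05 +0000] "GET /missing.png HTTP/1.1" 404 209',
    '10.0.0.3 - - [01/Jan/2015:10:00:09 +0000] "GET /missing.png HTTP/1.1" 404 209',
]
```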

  10. An Improved Method for Discriminating ECG Signals using Typical Nonlinear Dynamic Parameters and Recurrence Quantification Analysis in Cardiac Disease Therapy. (United States)

    Tang, M; Chang, C Q; Fung, P C W; Chau, K T; Chan, F H Y


The discrimination of ECG signals using nonlinear dynamic parameters is of crucial importance in cardiac disease therapy and chaos control for arrhythmia defibrillation in the cardiac system. However, the discrimination results of previous studies using features such as the maximal Lyapunov exponent (λmax) and correlation dimension (D2) alone are somewhat limited in recognition rate. In this paper, improved methods for computing λmax and D2 are proposed. Another parameter from recurrence quantification analysis is incorporated into the new multi-feature Bayesian classifier with λmax and D2 so as to improve the discrimination power. Experimental results have verified the prediction, using the Fisher discriminant, that the maximal vertical line length (Vmax) from recurrence quantification analysis is the best feature for distinguishing different ECG classes. Experimental results using the MIT-BIH Arrhythmia Database show improved and excellent overall accuracy (96.3%), average sensitivity (96.3%) and average specificity (98.15%) for discriminating sinus, premature ventricular contraction and ventricular flutter signals.
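The RQA feature highlighted above, the maximal vertical line length Vmax, is the longest vertical run of recurrent points in the recurrence plot. A minimal sketch on a 1-D series (a real analysis would use delay-embedded ECG states; the threshold eps is a free parameter):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when states i and j are
    within eps of each other (scalar states, no embedding)."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def v_max(R):
    """Maximal vertical line length Vmax: the longest vertical run of
    recurrent points in any column of the recurrence plot."""
    best = 0
    for col in R.T:
        run = 0
        for v in col:
            run = run + 1 if v else 0
            best = max(best, run)
    return best
```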

  11. Analysis of Power Quality Based on Real Data and Quality Improvement at Campus Distribution System (United States)

    Kawasaki, Shoji; Matsuki, Junya; Hayashi, Yasuhiro; Ito, Akitoshi

In recent years, with the development of power electronics technology, many types of equipment, from home electric appliances to office automation apparatuses and industrial machinery, have been built on inverter technology. The voltage distortion of distribution systems has increased due to the harmonic currents generated by these apparatuses, and a further increase in harmonics is expected. In addition, the distribution system forms a circuit of harmonic distortion expansion through the prevalence of static capacitors without series reactors (L) for power factor improvement. Moreover, voltage imbalance occurs due to the diversification of loads or the imbalanced connection of single-phase loads. The deterioration of power quality in the distribution system causes various problems such as the overheating of equipment and the malfunction of rotating machines. Since power quality changes with air temperature and date, it is desirable to measure the voltages and currents continuously over a long time. In this study, the authors focus attention on the distribution system of the University of Fukui campus, and have measured the voltages and currents in the distribution system for a long period with WAMS (Wide Area Measurement System) using NCT (Network Computing Terminal). Based on the obtained data, the authors analyzed the power quality of the campus distribution system from the viewpoints of voltage imbalance, current imbalance, voltage THD (Total Harmonic Distortion), and current THD. Furthermore, the improvement of the power quality of the campus distribution system achieved by exchanging single-phase load connections is described.
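The THD figure used in the analysis above is the RMS of the harmonic components relative to the fundamental, and can be computed from sampled waveforms via the FFT. A minimal sketch on a synthetic 50 Hz signal (not the campus measurement data; it assumes an integer number of cycles in the record so the harmonics fall on exact FFT bins):

```python
import numpy as np

def thd(samples, fs, f0, n_harmonics=20):
    """Total harmonic distortion: RMS of harmonics 2*f0..n*f0 divided
    by the fundamental's magnitude, both read off the FFT spectrum."""
    spectrum = np.abs(np.fft.rfft(samples)) / len(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    def mag(f):
        # magnitude at the FFT bin nearest to frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = mag(f0)
    harmonics = np.sqrt(sum(mag(k * f0) ** 2
                            for k in range(2, n_harmonics + 1)))
    return harmonics / fundamental

# Synthetic 50 Hz waveform with a 5 % third harmonic, 10 full cycles:
fs, f0 = 10_000, 50
t = np.arange(fs // f0 * 10) / fs
v = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)
```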

  12. Some drastic improvements found in the analysis of routing protocol for the Bluetooth technology using scatternet

    CERN Document Server

    Perwej, Yusuf; Jaleel, Uruj; Saxena, Sharad


Bluetooth is a promising wireless technology that enables portable devices to form short-range wireless ad hoc networks. Unlike a wireless LAN, the communication of Bluetooth devices follows a strict master-slave relationship; that is, it is not possible for a slave device to communicate directly with another slave device even when they are within radio coverage of each other. For inter-piconet communication, a scatternet has to be formed, in which some Bluetooth devices act as bridge nodes between piconets. The scatternets formed have the following properties: they are connected, i.e. every Bluetooth device can be reached from every other device, and piconet size is limited to eight nodes [1]. The authors of this research paper have studied different types of routing protocols and have made efforts to improve throughput, reduce packet loss due to failures in the routing loop and increased mobility, improve the cohesive network structure, and resolve changed-topology conflicts [2], and a successful &...

  13. An improved quantitative mass spectrometry analysis of tumor specific mutant proteins at high sensitivity. (United States)

    Ruppen-Cañás, Isabel; López-Casas, Pedro P; García, Fernando; Ximénez-Embún, Pilar; Muñoz, Manuel; Morelli, M Pia; Real, Francisco X; Serna, Antonio; Hidalgo, Manuel; Ashman, Keith


New disease-specific biomarkers, especially for cancer, are urgently needed to improve individual diagnosis, prognosis, and treatment selection, that is, for personalized medicine. Genetic mutations that affect protein function drive cancer. Therefore, the detection of such mutations represents a source of cancer-specific biomarkers. Here we confirm the implementation of the mutant-protein-specific immuno-SRM (where SRM is selected reaction monitoring) mass spectrometry method for RAS proteins reported by Wang et al. [Proc. Natl. Acad. Sci. USA 2011, 108, 2444-2449], which exploits an antibody to simultaneously capture the different forms of the target protein together with the resolving power and sensitivity of LC-MS/MS, and we improve the technique by using a more sensitive mass spectrometer. The mutant form G12D was quantified by SRM on a QTRAP 5500 mass spectrometer, and the MIDAS workflow was used to confirm the sequence of the targeted peptides. This assay has been applied to quantify wild-type and mutant RAS proteins in patient tumors, xenografted human tissue, and benign human epidermal tumors at high sensitivity. The limit of detection for the target proteins was as low as 12 amol (0.25 pg). The assay requires low starting amounts of tissue (ca. 15 mg) that could be obtained from a needle aspiration biopsy. The described strategy could find application in the clinical arena and be applied to the study of expression of protein variants in disease.

  14. Shock reliability analysis and improvement of MEMS electret-based vibration energy harvesters (United States)

    Renaud, M.; Fujita, T.; Goedbloed, M.; de Nooijer, C.; van Schaijk, R.


    Vibration energy harvesters can serve as a replacement solution to batteries for powering tire pressure monitoring systems (TPMS). Autonomous wireless TPMS powered by microelectromechanical system (MEMS) electret-based vibration energy harvester have been demonstrated. The mechanical reliability of the MEMS harvester still has to be assessed in order to bring the harvester to the requirements of the consumer market. It should survive the mechanical shocks occurring in the tire environment. A testing procedure to quantify the shock resilience of harvesters is described in this article. Our first generation of harvesters has a shock resilience of 400 g, which is far from being sufficient for the targeted application. In order to improve this aspect, the first important aspect is to understand the failure mechanism. Failure is found to occur in the form of fracture of the device’s springs. It results from impacts between the anchors of the springs when the harvester undergoes a shock. The shock resilience of the harvesters can be improved by redirecting these impacts to nonvital parts of the device. With this philosophy in mind, we design three types of shock absorbing structures and test their effect on the shock resilience of our MEMS harvesters. The solution leading to the best results consists of rigid silicon stoppers covered by a layer of Parylene. The shock resilience of the harvesters is brought above 2500 g. Results in the same range are also obtained with flexible silicon bumpers, which are simpler to manufacture.

  15. An Effective Analysis of Weblog Files to Improve Website Personalization for E-Business

    Directory of Open Access Journals (Sweden)

    Bhavyesh Gandhi


Full Text Available The World Wide Web is a perennial repository of immense information. The web provides too many options, and web users are overloaded with information. As there is enormous growth in the web in terms of web sites, the size of web usage data is also increasing steadily. This web usage data plays a vital role in the effective management of web sites; it is stored by the web server in a file called the weblog. In order to discover the knowledge required for improving the performance of websites and for web personalization, we need to apply the best preprocessing methodology to the server weblog files. Data preprocessing is a phase which automatically identifies meaningful patterns and user behavior. Analyzing weblog data has so far been a challenging task in the area of web usage mining. Web personalization is the most effective approach to overcoming the problem of information overload. In this paper we propose an effective and enhanced data preprocessing methodology which produces efficient usage patterns, reduces the size of the weblog, and yields a personalized website that helps to improve e-business. The experimental results are shown in the following chapters.
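A typical preprocessing step of the kind described above is sessionization: grouping a visitor's requests into sessions separated by an inactivity timeout. A minimal sketch (the 30-minute timeout is a common convention in web usage mining, not a value taken from this paper):

```python
from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)  # conventional inactivity threshold

def sessionize(records):
    """records: iterable of (ip, datetime, url), assumed time-sorted.
    Returns a list of sessions, each a list of (ip, datetime, url);
    a new session starts after TIMEOUT of inactivity for that ip."""
    sessions, last_seen = [], {}
    for ip, ts, url in records:
        prev = last_seen.get(ip)
        if prev is None or ts - prev[1] > TIMEOUT:
            sessions.append([])           # open a new session
            last_seen[ip] = (len(sessions) - 1, ts)
        idx, _ = last_seen[ip]
        sessions[idx].append((ip, ts, url))
        last_seen[ip] = (idx, ts)         # update last activity time
    return sessions
```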

  16. Improved response surface method and its application in stability reliability degree analysis of tunnel surrounding rock

    Institute of Scientific and Technical Information of China (English)


An approach to formulating the limit state equation for surrounding rock was put forward based on a deformation criterion. The symmetrical sampling of basic random variables adopted by the classical response surface method was modified, with the peak value and the skewness of the distribution curves of the basic random variables taken into account in the modified sampling method. A way of calculating probability moments, based on a modified Rosenblueth method and suitable for non-explicit performance functions, was put forward. The first, second, third and fourth order moments of the performance function value were calculated by the modified Rosenblueth method from the first, second, third and fourth order moments of the basic random variables. A probability density function (PDF) of the performance function was deduced from its first four moments; in the new method this PDF takes the place of the quadratic polynomial used to approximate the real performance function, and the reliability probability is calculated by integrating this PDF over the performance function value. The results show that the improved response surface method can accommodate various statistical distribution types of basic random variables, and that its calculation process is transparent and requires no iterative loop. In addition, the stability probability of the surrounding rock of a tunnel was calculated by the improved method, whose workload is only 30% of that of the classical method while its accuracy is comparable.
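The Rosenblueth point-estimate idea underlying the method above can be illustrated in its basic two-point form: evaluate the performance function at mu ± sigma for every input and form moments from the equally weighted results (the paper's modified version also accounts for skewed distributions). A minimal sketch for independent, symmetrically distributed inputs:

```python
import itertools

def rosenblueth_moments(g, means, sds):
    """Rosenblueth's two-point estimate: evaluate g at the 2**n points
    mu_i +/- sigma_i, weight each point equally, and form the first two
    moments (mean, variance) of g(X). Exact for linear g."""
    n = len(means)
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, sds)]
        vals.append(g(x))
    w = 1.0 / len(vals)
    m1 = sum(w * v for v in vals)          # E[g]
    m2 = sum(w * v * v for v in vals)      # E[g^2]
    return m1, m2 - m1 * m1                # mean, variance
```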

  17. Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology (United States)

    Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.


Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components dimensioning the lifetime of the system; cooler reliability is thus one of its most important parameters, and it has to increase to answer market needs. To do this, the data for identifying the weakest elements determining cooler reliability have to be collected. Yet data collected in the field are hardly usable due to a lack of information. A method for identifying reliability improvements therefore has to be set up that can be used even without field returns. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expert analyses of RM2 failures occurring in accelerated ageing. Failure modes were then identified and corrective actions carried out. Besides this, the functions of the cooler were organized hierarchically with regard to their potential to increase its efficiency, and specific changes were introduced in the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on the two axes - weak spots for cooler reliability, and efficiency - permitted us to drastically increase the MTTF of the RM2 cooler. Large improvements in RM2 reliability are now proven by both field returns and reliability monitoring. These figures are discussed in the paper.

  18. Outlier analysis in orthopaedics: use of CUSUM: the Scottish Arthroplasty Project: shouldering the burden of improvement. (United States)

    Macpherson, Gavin J; Brenkel, Ivan J; Smith, Rik; Howie, Colin R


    National joint registries have become well established across the world. Most registries track implant survival so that poorly performing implants can be removed from the market. The Scottish Arthroplasty Project was established in 1999 with the aim of encouraging continual improvement in the quality of care provided to joint replacement patients in Scotland. This aim has been achieved by using statistics to engage surgeons in the process of audit. We monitor easily identifiable end points of public concern and inform surgeons if they breach our statistical limits and become "outliers." Outlier status is often associated with poor implants, and our methods are therefore applicable for indirect implant surveillance. The present report describes the evolution of our statistical methodology, the processes that we use to promote positive changes in practice, and the improvements in patient outcomes that we have achieved. Failure need not be fatal, but failure to change almost always is. We describe the journey of both the Scottish Arthroplasty Project and the orthopaedic surgeons of Scotland to this realization.
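The CUSUM monitoring named in the title accumulates small excesses of an outcome measure over its target until a decision limit is crossed, at which point the surgeon or implant is flagged as an outlier. A generic one-sided sketch (the parameters and rates are illustrative, not the Scottish Arthroplasty Project's actual scheme):

```python
def cusum(observations, target, k, h):
    """One-sided upper CUSUM: accumulate the excess of each observation
    over (target + k), clamped at zero; signal when the cumulative sum
    exceeds the decision limit h. Returns the index of the first
    signal, or None if the limit is never crossed."""
    s = 0.0
    for i, x in enumerate(observations):
        s = max(0.0, s + (x - target - k))
        if s > h:
            return i
    return None
```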

  19. Walk This Way: Improving Pedestrian Agent-Based Models through Scene Activity Analysis

    Directory of Open Access Journals (Sweden)

    Andrew Crooks


Full Text Available Pedestrian movement is woven into the fabric of urban regions. With more people living in cities than ever before, there is an increased need to understand and model how pedestrians utilize and move through space for a variety of applications, ranging from urban planning and architecture to security. Pedestrian modeling has traditionally faced the challenge of collecting data to calibrate and validate such models of pedestrian movement. With the increased availability of mobility datasets from video surveillance and enhanced geolocation capabilities in consumer mobile devices, we are now presented with the opportunity to change the way we build pedestrian models. Within this paper we explore the potential that such information offers for the improvement of agent-based pedestrian models. We introduce a Scene- and Activity-Aware Agent-Based Model (SA2-ABM), a method for harvesting scene activity information in the form of spatiotemporal trajectories, and incorporate this information into our models. In order to assess and evaluate the improvement offered by such information, we carry out a range of experiments using real-world datasets. We demonstrate that the use of real scene information allows us to better inform our model and enhance its predictive capabilities.

  20. Design and Analysis of Multi Level D-STATCOM to Improve the Power Quality

    Directory of Open Access Journals (Sweden)

    Dinesh. Badavath,


Full Text Available In the last decade, the electrical power quality issue has been the main concern of power companies. Power quality is defined as an index of how both the delivery and the consumption of electric power affect the performance of electrical apparatus. From a customer's point of view, a power quality problem can be defined as any problem manifested in voltage, current, or frequency deviations that results in power failure. Progress in power electronics, especially in flexible alternating-current transmission systems (FACTS) and custom power devices, supports power quality improvement. This paper presents an investigation of a seven-level cascaded H-bridge (CHB) inverter as a distribution static compensator (DSTATCOM) in a power system (PS) for compensation of reactive power and harmonics. The advantages of the CHB inverter are low harmonic distortion, a reduced number of switches, and suppression of switching losses. The DSTATCOM helps to improve the power factor and eliminate the total harmonic distortion (THD) drawn from a non-linear diode rectifier load (NLDRL). The D-Q reference frame theory is used to generate the reference compensating currents for the DSTATCOM, while proportional-integral (PI) control is used for capacitor DC voltage regulation. A CHB inverter is considered for shunt compensation of an 11 kV distribution system. Finally, level-shifted PWM (LSPWM) and phase-shifted PWM (PSPWM) techniques are adopted to investigate the performance of the CHB inverter. The results are obtained through the Matlab/Simulink software package.
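The D-Q reference frame theory mentioned above rests on the Park transform, which maps three-phase quantities onto a rotating d-q frame where balanced sinusoids become DC-like signals that PI regulators can act on. A minimal amplitude-invariant sketch (a standard textbook formulation, not code from the paper):

```python
import math

def abc_to_dq0(ia, ib, ic, theta):
    """Amplitude-invariant Park transform: project three-phase
    quantities onto a frame rotating at angle theta (radians).
    For a balanced set of amplitude I in phase with theta: d = I, q = 0."""
    two_thirds = 2.0 / 3.0
    d = two_thirds * (ia * math.cos(theta)
                      + ib * math.cos(theta - 2 * math.pi / 3)
                      + ic * math.cos(theta + 2 * math.pi / 3))
    q = -two_thirds * (ia * math.sin(theta)
                       + ib * math.sin(theta - 2 * math.pi / 3)
                       + ic * math.sin(theta + 2 * math.pi / 3))
    z = (ia + ib + ic) / 3.0          # zero-sequence component
    return d, q, z
```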

  1. Improved pulse transit time estimation by system identification analysis of proximal and distal arterial waveforms. (United States)

    Xu, Da; Ryan, Kathy L; Rickards, Caroline A; Zhang, Guanqun; Convertino, Victor A; Mukkamala, Ramakrishna


We investigated the system identification approach for potentially improved estimation of pulse transit time (PTT), a popular arterial stiffness marker. In this approach, proximal and distal arterial waveforms are measured and respectively regarded as the input and output of a system. Next, the system impulse response is identified from all samples of the measured input and output. Finally, the time delay of the impulse response is detected as the PTT estimate. Unlike conventional foot-to-foot detection techniques, this approach is designed to provide an artifact-robust estimate of the true PTT in the absence of wave reflection. The approach is also applicable to arbitrary types of arterial waveforms. We specifically applied a parametric system identification technique to noninvasive impedance cardiography (ICG) and peripheral arterial blood pressure waveforms from 15 humans subjected to lower-body negative pressure. We assessed the technique through the correlation coefficient (r) between its 1/PTT estimates and measured diastolic pressure (DP) per subject and the root mean squared error (RMSE) between the DP predicted from these estimates and measured DP. The technique achieved average r and RMSE values of 0.81 ± 0.16 and 4.3 ± 1.3 mmHg. For comparison, the corresponding r value for conventional detection was significantly lower at 0.59 ± 0.37, indicating that the system identification approach can indeed improve PTT estimation.
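The identification step described above — fit an impulse response from all input/output samples, then read the PTT off its delay — can be sketched with a least-squares FIR fit. The waveforms below are synthetic stand-ins for the ICG and blood pressure signals, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic proximal (input) and distal (output) waveforms: the distal wave
# is the proximal wave delayed by 12 samples, plus independent noise.
n, true_delay = 2000, 12
x = np.sin(2 * np.pi * 1.2 * np.arange(n) / 100) + 0.1 * rng.standard_normal(n)
y = np.roll(x, true_delay) + 0.05 * rng.standard_normal(n)

# Identify an FIR impulse response h (order p) from all samples by least
# squares, then take the lag of its peak as the PTT estimate in samples.
p = 40
X = np.column_stack([np.roll(x, k) for k in range(p)])[p:]  # lagged inputs
h, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
ptt_samples = int(np.argmax(np.abs(h)))
```

Because the delay is estimated from every sample rather than from a single fiducial point (the waveform "foot"), localized artifacts have less influence on the estimate.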

  2. Statistical analysis of textural features for improved classification of oral histopathological images. (United States)

    Muthu Rama Krishnan, M; Shah, Pratik; Chakraborty, Chandan; Ray, Ajoy K


The objective of this paper is to provide an improved technique which can assist oncopathologists in the correct screening of oral precancerous conditions, especially oral submucous fibrosis (OSF), with significant accuracy on the basis of collagen fibres in the sub-epithelial connective tissue. The proposed scheme is composed of collagen fibre segmentation, textural feature extraction and selection, screening performance enhancement under Gaussian transformation, and finally classification. In this study, collagen fibres are segmented on R, G, B color channels using a back-propagation neural network from 60 normal and 59 OSF histological images, followed by histogram specification for reducing the stain intensity variation. Textural features of the collagen area are then extracted using fractal approaches, viz. differential box counting and the Brownian motion curve. Feature selection is done using the Kullback-Leibler (KL) divergence criterion, and the screening performance is evaluated based on various statistical tests to confirm Gaussian nature. Here, the screening performance is enhanced under Gaussian transformation of the non-Gaussian features using a hybrid distribution. Moreover, the routine screening is designed based on two statistical classifiers, viz. Bayesian classification and support vector machines (SVM), to classify normal and OSF. It is observed that SVM with a linear kernel function provides better classification accuracy (91.64%) as compared to the Bayesian classifier. The addition of fractal features of collagen under Gaussian transformation improves the Bayesian classifier's performance from 80.69% to 90.75%. Results are studied and discussed.
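One of the fractal descriptors named above, differential box counting, admits a compact sketch. This is our own simplified textbook variant, not the authors' code; the box sizes and test images are illustrative:

```python
import math
import numpy as np

def fractal_dimension_dbc(img, sizes=(2, 4, 8, 16)):
    """Differential box-counting (DBC) estimate of the fractal dimension of
    a gray-level image (values in 0..255, square image of side M)."""
    M = img.shape[0]
    G = 256
    log_nr, log_inv_r = [], []
    for s in sizes:
        h = max(1.0, s * G / M)          # box height in gray levels
        n_r = 0
        for i in range(0, M, s):
            for j in range(0, M, s):
                block = img[i:i + s, j:j + s]
                l = math.ceil((block.max() + 1) / h)
                k = math.ceil((block.min() + 1) / h)
                n_r += l - k + 1         # boxes needed to cover the relief
        log_nr.append(math.log(n_r))
        log_inv_r.append(math.log(M / s))
    # FD is the slope of log N_r against log(1/r)
    slope, _ = np.polyfit(log_inv_r, log_nr, 1)
    return slope

flat = np.full((64, 64), 128.0)          # smooth surface: FD close to 2
rough = np.random.default_rng(1).integers(0, 256, (64, 64)).astype(float)
fd_flat = fractal_dimension_dbc(flat)
fd_rough = fractal_dimension_dbc(rough)  # rough surface: FD well above 2
```

Features such as these, computed over the segmented collagen area, would then feed the Bayesian or SVM classifier.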

  3. Improving breast cancer survival analysis through competition-based multidimensional modeling.

    Directory of Open Access Journals (Sweden)

    Erhan Bilal

    Full Text Available Breast cancer is the most common malignancy in women and is responsible for hundreds of thousands of deaths annually. As with most cancers, it is a heterogeneous disease and different breast cancer subtypes are treated differently. Understanding the difference in prognosis for breast cancer based on its molecular and phenotypic features is one avenue for improving treatment by matching the proper treatment with molecular subtypes of the disease. In this work, we employed a competition-based approach to modeling breast cancer prognosis using large datasets containing genomic and clinical information and an online real-time leaderboard program used to speed feedback to the modeling team and to encourage each modeler to work towards achieving a higher ranked submission. We find that machine learning methods combined with molecular features selected based on expert prior knowledge can improve survival predictions compared to current best-in-class methodologies and that ensemble models trained across multiple user submissions systematically outperform individual models within the ensemble. We also find that model scores are highly consistent across multiple independent evaluations. This study serves as the pilot phase of a much larger competition open to the whole research community, with the goal of understanding general strategies for model optimization using clinical and molecular profiling data and providing an objective, transparent system for assessing prognostic models.

  4. Benchmark Report on Key Outage Attributes: An Analysis of Outage Improvement Opportunities and Priorities

    Energy Technology Data Exchange (ETDEWEB)

    Germain, Shawn St. [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Farris, Ronald [Idaho National Laboratory (INL), Idaho Falls, ID (United States)


The Advanced Outage Control Center (AOCC) is a multi-year pilot project targeted at Nuclear Power Plant (NPP) outage improvement. The purpose of this pilot project is to improve management of NPP outages through the development of an AOCC that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report documents the results of a benchmarking effort to evaluate the transferability of technologies demonstrated at Idaho National Laboratory and the primary pilot project partner, Palo Verde Nuclear Generating Station. The initial assumption for this pilot project was that NPPs generally do not take advantage of advanced technology to support outage management activities. Several researchers involved in this pilot project have commercial NPP experience and believed that very little technology has been applied towards outage communication and collaboration. To verify that the technology options researched and demonstrated through this pilot project would in fact have broad application for the US commercial nuclear fleet, and to look for additional outage management best practices, LWRS program researchers visited several additional nuclear facilities.

  5. Error analysis to improve the speech recognition accuracy on Telugu language

    Indian Academy of Sciences (India)

    N Usha Rani; P N Girija


Speech is one of the most important communication channels among people, and speech recognition occupies a prominent place in communication between humans and machines. Several factors affect the accuracy of a speech recognition system, and despite much effort to increase accuracy, current systems still generate erroneous output. Telugu is one of the most widely spoken south Indian languages. In the proposed Telugu speech recognition system, errors obtained from the decoder are analysed to improve the performance of the system. The static pronunciation dictionary plays a key role in speech recognition accuracy, so modifications are performed on the dictionary used in the decoder. This modification reduces the number of confusion pairs, which improves the performance of the speech recognition system; language model scores also vary with this modification. The hit rate increases considerably and false alarms change as the pronunciation dictionary is modified. Variations are observed in different error measures, such as the F-measure, error rate and Word Error Rate (WER), on application of the proposed method.
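The Word Error Rate cited above is the word-level Levenshtein distance (substitutions + deletions + insertions) divided by the reference length. A minimal sketch with a hypothetical reference/hypothesis pair:

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + deletions + insertions) / reference length,
    computed by dynamic-programming edit distance over words."""
    r, h = reference.split(), hypothesis.split()
    # dp[i][j]: edit distance between r[:i] and h[:j]
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[len(r)][len(h)] / len(r)

# One substitution ("sat" -> "sit") and one deletion ("the") over 6 words.
wer = word_error_rate("the cat sat on the mat", "the cat sit on mat")
```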

  6. How to improve mental health competency in general practice training?--a SWOT analysis. (United States)

    van Marwijk, Harm


    It is quite evident there is room for improvement in the primary care management of common mental health problems. Patients respond positively when GPs adopt a more proactive role in this respect. The Dutch general practice curriculum is currently being renewed. The topics discussed here include the Strengths, Weaknesses, Opportunities and Threats (SWOT) of present primary mental healthcare teaching. What works well and what needs improving? Integrated teaching packages are needed to help general practice trainees manage various presentations of psychological distress. Such packages comprise training videotapes, in which models such as problem-solving treatment (PST) are demonstrated, as well as roleplaying material for new skills, self-report questionnaires for patients, and small-group video feedback of consultations. While GP trainees can effectively master such skills, it is important to query the level of proficiency required by registrars. Are these skills of use only to connoisseur GPs, or to all? More room for specialisation and differentiation among trainees may be the way forward. We have just developed a new curriculum for the obligatory three-month psychiatry housemanship. It is competency oriented, self-directed and assignment driven. This new curriculum will be evaluated in due course.

  7. Improved gene prediction by principal component analysis based autoregressive Yule-Walker method. (United States)

    Roy, Manidipa; Barman, Soma


Spectral analysis using Fourier techniques is popular in gene prediction because of its simplicity. Model-based autoregressive (AR) spectral estimation gives better resolution even for small DNA segments, but selection of an appropriate model order is a critical issue. In this article a technique is proposed in which the Yule-Walker autoregressive (YW-AR) process is combined with principal component analysis (PCA) for dimensionality reduction. The spectral peaks of the DNA signal are used to detect protein-coding regions based on the 1/3 frequency component. Optimal model order selection is no longer critical, as noise is removed by PCA prior to power spectral density (PSD) estimation. The eigenvalue ratio is used to find the threshold between signal and noise subspaces for data reduction. Superiority of the proposed method over the fast Fourier transform (FFT) method and an autoregressive method combined with the wavelet packet transform (WPT) is established with the help of receiver operating characteristics (ROC) and the discrimination measure (DM), respectively.
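The Yule-Walker fit and the 1/3-frequency peak search can be sketched as follows. The PCA denoising step is omitted, and the "DNA signal" is a synthetic period-3 cosine in noise standing in for a numerically mapped coding region; the model order is an assumption:

```python
import numpy as np

def yule_walker(x, order):
    """AR coefficients and noise variance from the Yule-Walker equations."""
    x = x - x.mean()
    n = len(x)
    r = np.array([x[:n - k] @ x[k:] for k in range(order + 1)]) / n
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])        # x[t] = sum_k a[k] x[t-k] + e[t]
    sigma2 = r[0] - a @ r[1:]
    return a, sigma2

def ar_psd(a, sigma2, freqs):
    """AR(p) power spectral density at normalized frequencies in [0, 0.5]."""
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return sigma2 / denom

# Stand-in for a coding-region signal: strong period-3 component in noise.
rng = np.random.default_rng(2)
t = np.arange(600)
signal = np.cos(2 * np.pi * t / 3) + 0.5 * rng.standard_normal(600)

a, s2 = yule_walker(signal, order=8)
freqs = np.linspace(0.01, 0.5, 250)
peak_freq = freqs[np.argmax(ar_psd(a, s2, freqs))]   # expected near 1/3
```

A pronounced PSD peak near normalized frequency 1/3 is the classic signature used to flag protein-coding regions.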

  8. A Data Matrix Method for Improving the Quantification of Element Percentages of SEM/EDX Analysis (United States)

    Lane, John


A simple 2D M × N matrix approach to sample preparation enables the microanalyst to peer below the noise floor of element percentages reported by SEM/EDX (scanning electron microscopy/energy-dispersive X-ray) analysis, thus yielding more meaningful data. Using the example of a 2 × 3 sample set, there are M = 2 concentration levels of the original mix under test: 10 percent ilmenite (90 percent silica) and 20 percent ilmenite (80 percent silica). For each of these M samples, N = 3 separate SEM/EDX measurements were drawn. In this test, ilmenite is the material of interest. By plotting the linear trend of each M sample's known concentration versus the average of its N measurements, a much higher resolution of elemental analysis can be performed. The resulting trend also shows how the noise is affecting the data, and at what point (at smaller concentrations) it becomes impractical to try to extract any further useful data.
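The trend-fitting idea can be made concrete: fit a least-squares line through the known concentrations versus the mean of the N readings, then invert it to correct raw percentages. The numbers below are illustrative, not taken from the report:

```python
# Known prepared concentrations (M = 2 levels) with N = 3 SEM/EDX readings
# each -- hypothetical values for illustration.
known = [10.0, 20.0]                        # percent ilmenite in the mix
readings = [[8.9, 9.4, 9.1], [18.2, 18.8, 18.5]]

means = [sum(r) / len(r) for r in readings]

# Least-squares line through (known, mean measured): measured = a*known + b
n = len(known)
sx, sy = sum(known), sum(means)
sxx = sum(x * x for x in known)
sxy = sum(x * y for x, y in zip(known, means))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

def corrected_percent(measured):
    """Invert the calibration line to recover the prepared concentration."""
    return (measured - b) / a
```

The residual scatter of the N readings about the fitted line is what reveals where the noise floor makes further extraction impractical.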

  9. A novel LIDAR-based Atmospheric Calibration Method for Improving the Data Analysis of MAGIC

    CERN Document Server

    Fruck, Christian; Zanin, Roberta; Dorner, Daniela; Garrido, Daniel; Mirzoyan, Razmik; Font, Lluis


A new method for analyzing the returns of the custom-made 'micro'-LIDAR system, which is operated alongside the two MAGIC telescopes, allows atmospheric corrections to be applied in the MAGIC data analysis chain. Such corrections make it possible to extend the effective observation time of MAGIC under adverse atmospheric conditions and to reduce the systematic errors of energy and flux in the data analysis. The LIDAR provides a range-resolved atmospheric backscatter profile from which the extinction of Cherenkov light from air shower events can be estimated. Knowledge of the extinction allows the true image parameters, including energy and flux, to be reconstructed. Our final goal is to recover the source-intrinsic energy spectrum also for data affected by atmospheric extinction from aerosol layers, such as clouds.

  10. Improving the Computational Morphological Analysis of a Swahili Corpus for Lexicographic Purposes

    Directory of Open Access Journals (Sweden)

    Guy De Pauw


    Full Text Available

    Abstract: Computational morphological analysis is an important first step in the automatic treatment of natural language and a useful lexicographic tool. This article describes a corpus-based approach to the morphological analysis of Swahili. We particularly focus our discussion on its ability to retrieve lemmas for word forms and evaluate it as a tool for corpus-based dictionary compilation.


Summary: More accurate computational morphological analysis of a Swahili corpus for lexicographic purposes. Computational morphological analysis is an important first step in the automatic processing of natural language and a useful lexicographic tool. This article describes a corpus-based approach to the morphological analysis of Swahili. We concentrate in particular on the lemmatization properties of the developed system and evaluate it as a tool for the corpus-based compilation of dictionaries.


  11. MANGO – Modal Analysis for Grid Operation: A Method for Damping Improvement through Operating Point Adjustment

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhenyu; Zhou, Ning; Tuffner, Francis K.; Chen, Yousu; Trudnowski, Daniel J.; Diao, Ruisheng; Fuller, Jason C.; Mittelstadt, William A.; Hauer, John F.; Dagle, Jeffery E.


Small signal stability problems are one of the major threats to grid stability and reliability in the U.S. power grid. An undamped mode can cause large-amplitude oscillations and may result in system breakups and large-scale blackouts. There have been several incidents of system-wide oscillations; of those, the most notable is the August 10, 1996 western system breakup, a result of undamped system-wide oscillations. Significant efforts have been devoted over the past 20 years to monitoring system oscillatory behaviors from measurements. The deployment of phasor measurement units (PMUs) provides the high-precision, time-synchronized data needed for detecting oscillation modes. Measurement-based modal analysis, also known as ModeMeter, uses real-time phasor measurements to identify system oscillation modes and their damping; low damping indicates potential system stability issues. Modal analysis has been demonstrated with phasor measurements to be capable of estimating system modes from both oscillation signals and ambient data. With more and more phasor measurements available and ModeMeter techniques maturing, there remains a need for methods that bring modal analysis from monitoring to action. Such methods should be able to associate low damping with grid operating conditions, so operators or automated operation schemes can respond when low damping is observed. The work presented in this report aims to develop such a method and establish a Modal Analysis for Grid Operation (MANGO) procedure to aid grid operation decision making to increase inter-area modal damping. The procedure can provide operation suggestions (such as increasing generation or decreasing load) for mitigating inter-area oscillations.

  12. Psychological treatments to improve quality of life in cancer contexts: A meta-analysis


    Alejandro de la Torre-Luque; Hilda Gambara; Escarlata López; Juan Antonio Cruzado


This study aimed to analyze the effects of psychological treatments on quality of life among cancer patients and survivors. Additionally, the moderating influence of some medical- and treatment-related features on these effects was explored. Scientific studies published between 1970 and 2012 were analyzed, and seventy-eight studies were included in a meta-analysis. Concerns related to samples, interventions, and standard of methodological evidence were explored across the studies. A signifi...

  13. An improved silver staining procedure for schizodeme analysis in polyacrylamide gradient gels

    Directory of Open Access Journals (Sweden)

    Antonio M. Gonçalves


Full Text Available A simple protocol is described for the silver staining of polyacrylamide gradient gels used for the separation of restriction fragments of kinetoplast DNA [schizodeme analysis of trypanosomatids (Morel et al., 1980)]. The method overcomes the problems of non-uniform staining and strong background color which are frequently encountered when conventional protocols for silver staining of linear gels are used. The method described has proven to be of general applicability for DNA, RNA and protein separations in gradient gels.

  14. Improved methods for enrichment of organic ultra trace components for analysis by gas chromatography


    Kloskowski, Adam


This thesis describes some new methods for the analysis of organic trace components in air and water by gas chromatography. The work is particularly focused on the development of new technologies for analyte enrichment, using sorbent-based concepts. Short lengths of open tubular columns were examined for their potential use as denuders. Polydimethylsiloxane-based stationary phases as well as an adsorbent-based column were evaluated in an equilibrium mode of trapping. For the analytes selected, detectio...

  15. Improved Stability Analysis of Nonlinear Networked Control Systems over Multiple Communication Links


    Delavar, Rahim; Tavassoli, Babak; Beheshti, Mohammad Taghi Hamidi


In this paper, we consider a nonlinear networked control system (NCS) in which controllers, sensors and actuators are connected via several communication links. In each link, networking effects such as transmission delay, packet loss, sampling jitter and data packet miss-ordering are captured by time-varying delays. Stability analysis is carried out based on the Lyapunov-Krasovskii method to obtain a condition for stability of the nonlinear NCS in the form of a linear matrix inequality (LMI...

  16. Improved Duct Systems Task Report with StageGate 2 Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moyer, Neil [Florida Solar Energy Center, Cocoa, FL (United States); Stroer, Dennis [Calcs-Plus, Venice, FL (United States)


This report describes the Building America Industrialized Housing Partnership's work with two industry partners, Davalier Homes and Southern Energy Homes, in constructing and evaluating prototype interior duct systems. Issues of energy performance, comfort, DAPIA approval, manufacturability and cost are addressed. A Stage Gate 2 analysis addresses the current status of the project, showing that refinements are still needed in the process of incorporating all of the ducts within the air and thermal boundaries of the envelope.

  17. Tourism Development Strategies, SWOT analysis and improvement of Albania’s image.

    Directory of Open Access Journals (Sweden)

    Eriketa Vladi


Full Text Available Albania has a range of historical, natural and cultural potentials. The marketing strategies prepared with the aim of creating and developing Albania's tourism, and the current state of Albania's image, are the subject of this paper. I also considered it necessary to conduct a SWOT analysis of the tourism development strategies and of the communication of Albania as a tourist destination. Keywords: Albania, Tourism Communication, Image Management, Destination Branding, Marketing Strategies.

  18. A Lean Six Sigma approach to the improvement of the selenium analysis method



Reliable results are the pinnacle of quality for an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality meth...

  19. Analysis of the Important Mobile Devices Features to Improve Mobile Web Applications


    Omari, R.; M. Feisst; A. Christ


The mobile device related industries are subject to rapid change, driven by technological advances and dynamic consumer behaviour. Hence, an understanding of the mobile device markets is an important step in the analysis phase of mobile application development. In this paper, a brief description of the different markets is introduced, followed by an analysis of the main features of the market leaders' devices which are important in the development process of mobile web applications. Finally, appro...

  20. Problems in wavelet analysis of hydrologic series and some suggestions on improvement

    Institute of Scientific and Technical Information of China (English)

    WANG Hongrui; YE Letian; LIU Changming; YANG Chi; LIU Peng


Applying wavelet theory and methods to investigate hydrologic processes such as precipitation and runoff is an active field of research. However, several aspects are usually ignored: the effect of the admissibility condition of wavelet functions and the disturbance of noise on the detection of periods, the effect of the length of a hydrologic time series on the final result, and the choice between the anomaly and the original time series for wavelet analysis. In this paper, these issues are fully discussed. Precipitation data from Lanzhou Precipitation Station are taken as a case study. The results indicate that in the wavelet analysis of hydrologic series, denoising methods should be used to eliminate the influence of noise. The MexHat wavelet function satisfies the admissibility condition, which ensures that the periodic properties of hydrologic processes can be well represented by using the MexHat wavelet for decomposition. The affected range of the hydrologic series, which should be discarded before analysis, is given. It is also suggested that the anomaly series be used to highlight the actual undulation of the hydrologic series.
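The role of the MexHat (Ricker) wavelet and of the anomaly series can be illustrated with a toy continuous wavelet transform; the synthetic 12-step "precipitation" series and the scale range are our own choices, not from the paper:

```python
import numpy as np

def mexican_hat(t, scale):
    """MexHat (Ricker) wavelet; its zero mean satisfies the admissibility
    condition discussed in the abstract."""
    u = t / scale
    return (2 / (np.sqrt(3 * scale) * np.pi ** 0.25)) \
        * (1 - u ** 2) * np.exp(-u ** 2 / 2)

def cwt(series, scales):
    """Continuous wavelet transform by direct convolution."""
    n = len(series)
    t = np.arange(-(n // 2), n // 2)
    return np.array([np.convolve(series, mexican_hat(t, s), mode="same")
                     for s in scales])

# Synthetic anomaly series with a 12-step period; for the Ricker wavelet a
# Fourier period of 12 corresponds to a scale near 12*sqrt(2.5)/(2*pi) ~ 3.
x = np.sin(2 * np.pi * np.arange(240) / 12)
x = x - x.mean()                 # analyze the anomaly, not the raw series
scales = np.arange(1, 17)
power = (cwt(x, scales) ** 2).mean(axis=1)   # wavelet variance per scale
dominant_scale = int(scales[np.argmax(power)])
```

In practice the series would be denoised first, as the paper recommends, so that spurious small-scale power does not contaminate the period detection.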

  1. Statistically improved Analysis of Neutrino Oscillation Data with the latest KamLAND result

    CERN Document Server

    Aliani, P; Torrente-Lujan, E


We present an updated analysis of all available solar and reactor neutrino data, emphasizing in particular the totality of the KamLAND (314 d live time) results and including for the first time the solar SNO (391 d live time, phase II NaCl-enhanced) spectrum data. As a novelty of the statistical analysis, we study the variability of the KamLAND results with respect to the use of diverse statistics, and a new statistic, not used before, is proposed. Moreover, in the analysis of the SNO spectrum a novel technique is used in order to include fully correlated errors among bins. Combining all data, we determine the individual neutrino mixing parameters and their errors: $\Delta m^2 = 8.2 \pm 0.08 \times 10^{-5}\ \mathrm{eV}^2, \quad \tan^2\theta = 0.50^{+0.12}_{-0.07}$. The impact of these results is discussed. We also estimate the individual elements of the neutrino mass matrix. In the framework of three neutrino oscillations we obtain the mass matrix: \begin{eqnarray}M&=& eV \pmatrix{1....

  2. Improving Accuracy and Temporal Resolution of Learning Curve Estimation for within- and across-Session Analysis.

    Directory of Open Access Journals (Sweden)

    Matthias Deliano

Full Text Available Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explore the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and thus allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning.
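The conventional estimator criticized above — proportion correct in a moving trial window — plus a trial-wise confidence interval can be sketched as follows. The Wilson score interval here stands in for the paper's statistical machinery, and the toy response sequence is our own:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

def learning_curve(responses, window=20):
    """Proportion correct in a moving trial window, with a trial-wise CI --
    the estimator whose constant-performance assumption the paper questions."""
    curve = []
    for i in range(len(responses) - window + 1):
        k = sum(responses[i:i + window])
        curve.append((k / window, wilson_interval(k, window)))
    return curve

# 100 trials: chance-level responding first, then near-perfect performance.
responses = [0, 1] * 25 + [1] * 50
curve = learning_curve(responses)
early_p, late_p = curve[0][0], curve[-1][0]
```

When performance changes inside a window (as it does around trial 50 here), the windowed proportion smears the transition — exactly the systematic error the paper analyzes.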

  3. Improved spatial regression analysis of diffusion tensor imaging for lesion detection during longitudinal progression of multiple sclerosis in individual subjects (United States)

    Liu, Bilan; Qiu, Xing; Zhu, Tong; Tian, Wei; Hu, Rui; Ekholm, Sven; Schifitto, Giovanni; Zhong, Jianhui


Subject-specific longitudinal DTI study is vital for the investigation of pathological changes of lesions and disease evolution. Spatial Regression Analysis of Diffusion tensor imaging (SPREAD) is a non-parametric permutation-based statistical framework that combines spatial regression and resampling techniques to achieve effective detection of localized longitudinal diffusion changes within the whole brain at the individual level without a priori hypotheses. However, boundary blurring and dislocation limit its sensitivity, especially towards detecting lesions of irregular shapes. In the present study, we propose an improved SPREAD method (iSPREAD) by incorporating a three-dimensional (3D) nonlinear anisotropic diffusion filtering method, which provides edge-preserving image smoothing through a nonlinear scale space approach. The statistical inference based on iSPREAD was evaluated and compared with the original SPREAD method using both simulated and in vivo human brain data. Results demonstrated that the sensitivity and accuracy of the SPREAD method have been improved substantially by adopting nonlinear anisotropic filtering. iSPREAD identifies subject-specific longitudinal changes in the brain with improved sensitivity, accuracy, and enhanced statistical power, especially when the spatial correlation is heterogeneous among neighboring image pixels in DTI.
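The edge-preserving filter at the heart of iSPREAD is nonlinear anisotropic diffusion. Below is a 2-D Perona-Malik sketch (the paper uses a 3-D variant; the conduction function, parameters, and test image are illustrative assumptions):

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, step=0.2):
    """2-D Perona-Malik edge-preserving smoothing. Diffusion is damped
    across strong gradients, so edges between regions survive while noise
    inside homogeneous regions is smoothed away."""
    u = img.astype(float).copy()

    def g(d):                     # conduction: small where the gradient is big
        return np.exp(-(d / kappa) ** 2)

    for _ in range(n_iter):
        dn = np.roll(u, 1, axis=0) - u     # finite differences toward the
        ds = np.roll(u, -1, axis=0) - u    # four neighbours
        de = np.roll(u, 1, axis=1) - u
        dw = np.roll(u, -1, axis=1) - u
        u += step * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# A noisy step edge: the filter should cut the noise but keep the step.
rng = np.random.default_rng(3)
truth = np.zeros((32, 32))
truth[:, 16:] = 100.0
noisy = truth + 5.0 * rng.standard_normal(truth.shape)
smooth = anisotropic_diffusion(noisy)

noise_drop = noisy[:, :14].std() - smooth[:, :14].std()      # noise removed
edge_height = smooth[:, 20:].mean() - smooth[:, :12].mean()  # edge preserved
```

This is precisely the property that reduces the boundary blurring limiting the original SPREAD.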

  4. The Improvement Effect of Dispersant in Fluorite Flotation: Determination by the Analysis of XRD and FESEM-EDX

    Directory of Open Access Journals (Sweden)

    Y. J. Li


Full Text Available Different dispersants were added in the dispersion process to improve the efficiency of fluorite flotation. The types and dosage of dispersant for the improvement of fluorite flotation were investigated; when sodium polyacrylate (SPA) was used as the dispersant at a dosage of 0.5%, the concentrate grade of CaF2 increased from 90% to 98% and the fluorite recovery increased from 81% to 85%. X-ray powder diffraction (XRD), field emission scanning electron microscopy (FESEM), and energy-dispersive X-ray spectrometry (EDX) were used to characterize the sample. According to the analysis of the results, the optimal sample consisted of CaF2 and very little CaCO3 in the size range of 0–5 μm. It could be concluded that the mechanism of improvement of the concentrate grade and recovery of CaF2 was the change of the potential energy barrier, which caused the separation of particles with different charge. All results indicate that SPA has great potential as an efficient and cost-effective dispersant for the improvement of fluorite flotation.

  5. Fluorescence colocalization microscopy analysis can be improved by combining object‐recognition with pixel‐intensity‐correlation (United States)

    Moser, Bernhard; Hochreiter, Bernhard; Herbst, Ruth


    Abstract The question whether two proteins interact with each other or whether a protein localizes to a certain region of the cell is often addressed with fluorescence microscopy and analysis of a potential colocalization of fluorescence markers. Since a mere visual estimation does not allow quantification of the degree of colocalization, different statistical methods of pixel‐intensity correlation are commonly used to score it. We observed that these correlation coefficients are prone to false positive results and tend to show high values even for molecules that reside in different organelles. Our aim was to improve this type of analysis and we developed a novel method combining object‐recognition based colocalization analysis with pixel‐intensity correlation to calculate an object‐corrected Pearson coefficient. We designed a macro for the Fiji‐version of the software ImageJ and tested the performance systematically with various organelle markers revealing an improved robustness of our approach over classical methods. In order to prove that colocalization does not necessarily mean a physical interaction, we performed FRET (fluorescence resonance energy transfer) microscopy. This confirmed that non‐interacting molecules can exhibit a nearly complete colocalization, but that they do not show any significant FRET signal in contrast to proteins that are bound to each other. PMID:27420480
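The object-corrected coefficient described above can be approximated in a few lines: compute the Pearson correlation only over pixels inside detected objects. Simple intensity thresholding stands in for the macro's object recognition, and the synthetic two-channel images are our own illustration:

```python
import numpy as np

def pearson(a, b):
    """Pixel-intensity (Pearson) correlation of two flattened channels."""
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def object_corrected_pearson(ch1, ch2, thr1, thr2):
    """Pearson correlation restricted to pixels inside detected objects.
    Thresholding stands in for proper object recognition."""
    mask = (ch1 > thr1) | (ch2 > thr2)
    return pearson(ch1[mask], ch2[mask])

# Synthetic cell: both channels are bright inside the cell but their signals
# are independent, so they do NOT truly colocalize.
rng = np.random.default_rng(4)
shape = (64, 64)
cell = np.zeros(shape, bool)
cell[8:56, 8:56] = True
ch1 = np.where(cell, rng.uniform(0.5, 1.0, shape), rng.uniform(0.0, 0.05, shape))
ch2 = np.where(cell, rng.uniform(0.5, 1.0, shape), rng.uniform(0.0, 0.05, shape))

naive = pearson(ch1, ch2)            # inflated by the shared cell outline
corrected = object_corrected_pearson(ch1, ch2, 0.4, 0.4)   # near zero
```

The whole-image coefficient is driven almost entirely by the jointly dark background, reproducing the false-positive behaviour the authors report; restricting to object pixels removes that bias.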

  6. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era. (United States)

    Ferreira, Ana P; Tobyn, Mike


    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets thus contributing to increase product and process understanding which is at the core of the Food and Drug Administration's Process Analytical Tools (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls and requirements for their effective use. That is followed with an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
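Of the two workhorse methods named above, PCA admits a very compact sketch via the singular value decomposition of the mean-centred data matrix; the toy "process data" below (three variables driven by one latent factor) is our own illustration, not from the review:

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the mean-centred data matrix: returns scores,
    loadings, and the fraction of variance captured per component."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    loadings = Vt[:n_components].T
    explained = s[:n_components] ** 2 / (s ** 2).sum()
    return scores, loadings, explained

# Toy batch record: three correlated process variables, one latent factor.
rng = np.random.default_rng(5)
factor = rng.standard_normal(100)
X = np.column_stack([factor + 0.05 * rng.standard_normal(100)
                     for _ in range(3)])
scores, loadings, explained = pca(X, n_components=1)
```

In a PAT setting the scores would be monitored batch-to-batch, with the loadings identifying which process variables drive any excursion.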

  7. Constraint analysis to improve integrated dairy production systems in developing countries: the importance of participatory rural appraisal. (United States)

    Devendra, C


    The paper describes the rationale and importance of the approaches and methodologies of Participatory Rural Appraisal (PRA) in enabling constraint analysis, understanding the complexities of farming systems, and improving integrated dairy productivity. Implicit in this objective is Farming Systems Research (FSR), which focused on cropping systems in the 1970s, with the subsequent addition of animal components. The methodology for FSR involves the following sequential components: site selection, site description and characterization (diagnosis), planning of on-farm research, on-farm testing and validation of alternatives, diffusion of results, and impact assessment. PRA is a development of FSR that involves the active participation of farmers to identify constraints and plan appropriate solutions. In the Coordinated Research Project (CRP), the approach was adapted to 10 different country situations and led to Economic Opportunity Surveys (EOS) and Diagnostic Surveillance Studies (DSS), allowing the planning and implementation of integrated interventions to improve dairy productivity.

  8. Fuzzy approach to analysis of flood risk based on variable fuzzy sets and improved information diffusion methods

    Directory of Open Access Journals (Sweden)

    Q. Li


    Full Text Available The predictive analysis of natural disasters and their consequences is challenging because of uncertainties and incomplete data. The present article studies the use of variable fuzzy sets (VFS) and an improved information diffusion method (IIDM) to construct a composite method. The proposed method aims to integrate multiple factors and the quantification of uncertainties within a consistent system for catastrophic risk assessment. The fuzzy methodology is applied to flood disaster risk assessment to improve probability estimation. The purpose of the current study is to establish a fuzzy model to evaluate flood risk with incomplete data sets. The results of the example indicate that the methodology is effective and practical, and thus has potential for use in flood risk management.
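
    The underlying information diffusion idea, estimating a probability distribution from a small sample by spreading each observation over discrete monitoring points with a Gaussian kernel, can be sketched as follows (the discharge figures, grid, and bandwidth are hypothetical, and this is the classical normal-diffusion estimate rather than the paper's improved variant):

```python
import numpy as np

def diffusion_estimate(samples, u, h):
    """Normal information diffusion: spread each observation over the
    monitoring points u with a Gaussian kernel of width h, normalise the
    information carried by each observation to one, and average."""
    samples = np.asarray(samples, dtype=float)
    # f[i, j]: information observation i diffuses to monitoring point u[j]
    f = np.exp(-(samples[:, None] - u[None, :])**2 / (2 * h**2))
    f /= f.sum(axis=1, keepdims=True)    # each observation carries unit mass
    return f.sum(axis=0) / len(samples)  # estimated probability at each u[j]

# Hypothetical annual peak discharges (m^3/s): a deliberately small sample.
flows = [320, 410, 455, 500, 610, 745]
u = np.linspace(200, 900, 71)            # discretised discharge levels
p = diffusion_estimate(flows, u, h=60.0)
# Estimated exceedance probability of a 600 m^3/s flood:
p_exceed = p[u >= 600].sum()
```

Compared with the raw empirical frequency of 2/6, the diffused estimate borrows strength from neighbouring observations, which is the appeal of the method for incomplete data sets.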

  9. An Improved Fourier Series Method for the Free Vibration Analysis of the Three-Dimensional Coupled Beams

    Directory of Open Access Journals (Sweden)

    Runze Zhang


    Full Text Available This paper presents a free vibration analysis of three-dimensional coupled beams with an arbitrary coupling angle using an improved Fourier method. The displacement and rotation of the coupled beams are represented by an improved Fourier series, which consists of a Fourier cosine series and closed-form auxiliary functions. The coupling and boundary conditions are imposed by setting coupling and boundary springs and assigning corresponding stiffness values to the springs. Modal parameters are determined through the application of the Rayleigh-Ritz procedure to the system energy formulation. The accuracy and convergence of the present method are demonstrated by comparison with finite element method (FEM) results. An investigation of the vibration of a propulsion shafting structure shows the broad applicability of the present method. Studies on vibration suppression devices are also reported.
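
    Once a trial basis is chosen, the Rayleigh-Ritz step reduces to a generalized eigenvalue problem. A minimal single-beam sketch, using a simple polynomial basis for a clamped-free (cantilever) Euler-Bernoulli beam rather than the paper's improved Fourier series, with EI = rho*A = L = 1:

```python
import numpy as np
from scipy.linalg import eigh

# Polynomial trial functions phi_p(x) = x^p; each satisfies the clamped
# conditions phi(0) = phi'(0) = 0 at the fixed end.
powers = [2, 3, 4, 5]
n = len(powers)
K = np.empty((n, n))  # stiffness matrix: integral of phi_i'' * phi_j''
M = np.empty((n, n))  # mass matrix:      integral of phi_i  * phi_j
for a, p in enumerate(powers):
    for b, q in enumerate(powers):
        K[a, b] = p * (p - 1) * q * (q - 1) / (p + q - 3)
        M[a, b] = 1.0 / (p + q + 1)

# Rayleigh-Ritz yields the generalized eigenproblem K c = omega^2 M c.
omega2 = eigh(K, M, eigvals_only=True)
omega1 = np.sqrt(omega2[0])   # fundamental frequency; exact value ~3.5160
```

Even four polynomial terms reproduce the exact cantilever coefficient (beta*L)^2 = 3.5160 to several digits, which is why Ritz-type series methods converge so quickly when the basis satisfies the boundary conditions.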

  10. Vacuum structure revealed by over-improved stout-link smearing compared with the overlap analysis for quenched QCD

    Energy Technology Data Exchange (ETDEWEB)

    Ilgenfritz, E.M.; Leinweber, D.; Moran, P. [Adelaide Univ., SA (AU). Special Research Centre for the Subatomic Structure of Matter (CSSM); Koller, K. [Muenchen Univ. (Germany). Sektion Physik; Schierholz, G. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Weinberg, V. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Freie Univ. Berlin (Germany). Inst. fuer Theoretische Physik


    A detailed comparison is made between the topological structure of quenched QCD as revealed by the recently proposed over-improved stout-link smearing, in conjunction with an improved gluonic definition of the topological density, on the one hand, and a similar analysis made possible by the overlap-fermionic topological charge density, both with and without a variable ultraviolet cutoff Lambda_cut, on the other. The matching is twofold, provided by fitting the density-density two-point functions and by a point-by-point fitting of the topological densities according to the two methods. We point out the similar cluster structure of the topological density for moderate smearing and 200 MeV < Lambda_cut < 600 MeV, respectively. We demonstrate the relation of the gluonic topological density for extensive smearing to the location of the overlap zero modes and the lowest overlap non-zero mode as found for the unsmeared configurations. (orig.)

  11. Improved PMAC and its security analysis

    Institute of Scientific and Technical Information of China (English)

    Chao Shide; Zhang Shaolan; Tian Hua; Yang Yixian


    Based on the forgery attack on PMAC proposed by Lee Changhoon et al., the weakness of PMAC that enables the attack is identified. A modified way of processing the last message block is proposed to prevent the forgery attack, which exploits the fact that a block cipher produces the same output for the same input under a given key. A security analysis of the improved scheme is given.

  12. Improved angular resolution in electron backscatter diffraction analysis by use of image correlation techniques

    Institute of Scientific and Technical Information of China (English)



    In this paper we describe a method for improving the angular resolution of the electron backscatter diffraction (EBSD) technique based on a correlative matching of EBSD patterns. Standard image interpolation methods are used to detect shifts between selected regions of the EBSD patterns to an accuracy of one tenth of a pixel. Simulated data sets are used to show that such accuracy, combined with a small-angle approximation in the calculation of the rotation angle, allows determination of the misorientation between patterns to an accuracy of 0.01 degrees. The method is tested on samples of both single-crystal aluminum and recrystallized nickel. The results demonstrate the accuracy and stability of the new method compared to the conventional method.
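
    Sub-pixel shift detection of this general kind can be sketched with an FFT cross-correlation whose peak is refined by parabolic interpolation (a stand-in for the image interpolation actually used in the paper; the test pattern below is a synthetic Gaussian blob, not an EBSD pattern):

```python
import numpy as np

def parabolic_refine(left, centre, right):
    """Sub-pixel offset of a peak from three neighbouring samples."""
    return 0.5 * (left - right) / (left - 2.0 * centre + right)

def subpixel_shift(ref, img):
    """Shift (dy, dx) such that img[y, x] ~ ref[y - dy, x - dx], estimated
    from the FFT cross-correlation peak refined by parabolic interpolation."""
    c = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    c = np.fft.fftshift(c)
    ny, nx = c.shape
    py, px = np.unravel_index(np.argmax(c), c.shape)
    dy = py + parabolic_refine(c[py - 1, px], c[py, px], c[py + 1, px]) - ny // 2
    dx = px + parabolic_refine(c[py, px - 1], c[py, px], c[py, px + 1]) - nx // 2
    return -dy, -dx

yy, xx = np.mgrid[0:64, 0:64].astype(float)

def blob(cy, cx):
    """Smooth synthetic test pattern centred at (cy, cx)."""
    return np.exp(-((yy - cy)**2 + (xx - cx)**2) / (2 * 5.0**2))

ref = blob(30.0, 34.0)
img = blob(30.3, 33.3)            # true shift: (dy, dx) = (0.3, -0.7)
dy, dx = subpixel_shift(ref, img)
```

On this smooth pattern the fractional shift is recovered well within the one-tenth-of-a-pixel accuracy quoted above.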

  13. Performance Analysis of Binary Exponential Backoff and Improved Backoff for WPAN

    Directory of Open Access Journals (Sweden)

    S. Mehta


    Full Text Available The IEEE 802.15.3 medium access control (MAC) protocol is designed specifically for short-range, high-data-rate wireless personal area network (WPAN) applications, to coordinate access to the wireless medium among competing devices. The concept of a geometrically increasing probability distribution for the contention process was introduced in the work of Tay et al. (2004). In this paper, we adopt this idea as an improved backoff (IB) scheme for the contention process of IEEE 802.15.3, where binary exponential backoff (BEB) is originally used. We propose an analytical model for IB and compare BEB and IB under both saturated and nonsaturated traffic conditions. Our results demonstrate that IB provides an edge over BEB in terms of channel efficiency, channel access delay, and energy efficiency.
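
    For intuition about what such backoff models capture, a toy slotted-contention simulation of plain BEB in a saturated network (every device always has a frame) might look like the following; this is a deliberately simplified sketch with arbitrary parameters, not the paper's analytical model or the 802.15.3 MAC:

```python
import random

def simulate_beb(n_dev=10, slots=20000, cw_min=8, cw_max=1024, seed=7):
    """Toy slotted simulation of binary exponential backoff: every device
    always has a frame; colliding devices double their contention window,
    a successful device resets to the minimum window."""
    rng = random.Random(seed)
    cw = [cw_min] * n_dev
    backoff = [rng.randrange(w) for w in cw]
    successes = 0
    for _ in range(slots):
        tx = [i for i, b in enumerate(backoff) if b == 0]
        if len(tx) == 1:                      # exactly one sender: success
            successes += 1
            cw[tx[0]] = cw_min
            backoff[tx[0]] = rng.randrange(cw[tx[0]])
        else:
            for i in tx:                      # collision: windows double
                cw[i] = min(2 * cw[i], cw_max)
                backoff[i] = rng.randrange(cw[i])
        # remaining devices count down one slot (a simplification: freshly
        # redrawn counters also tick in the same slot)
        backoff = [b - 1 if b > 0 else b for b in backoff]
    return successes / slots                  # fraction of slots carrying a success

efficiency = simulate_beb()
```

Replacing the uniform redraw with a geometrically distributed one would give an IB-style variant, which is the comparison the paper carries out analytically.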

  14. Metagenomic Analysis of Chicken Gut Microbiota for Improving Metabolism and Health of Chickens - A Review. (United States)

    Choi, Ki Young; Lee, Tae Kwon; Sul, Woo Jun


    Chicken is a major food source for humans; hence, it is important to understand the mechanisms involved in nutrient absorption in chickens. In the gastrointestinal tract (GIT), the microbiota plays a central role in enhancing nutrient absorption and strengthening the immune system, thereby affecting both the growth and health of chickens. There is little information on the diversity and functions of the chicken GIT microbiota, its impact on the host, and the interactions between the microbiota and host. Here, we review recent metagenomic strategies for analyzing the composition of the chicken GIT microbiota and its functions related to improving metabolism and health. We summarize metagenomic methodology for obtaining bacterial taxonomy and functional inferences of the GIT microbiota and suggest a set of indicator genes for monitoring and manipulating the microbiota to promote host health in the future.

  15. Analysis of elastoplasticity problems using an improved complex variable element-free Galerkin method

    Institute of Scientific and Technical Information of China (English)

    程玉民; 刘超; 白福浓; 彭妙娟


    In this paper, based on the conjugate of the complex basis function, a new complex variable moving least-squares approximation is discussed. Then, using the new approximation to obtain the shape function, an improved complex variable element-free Galerkin (ICVEFG) method is presented for two-dimensional (2D) elastoplasticity problems. Compared with the previous complex variable moving least-squares approximation, the new approximation has greater computational precision and efficiency. Using the penalty method to apply the essential boundary conditions, and using the constrained Galerkin weak form of 2D elastoplasticity to obtain the system equations, we obtain the corresponding formulae of the ICVEFG method for 2D elastoplasticity. Three selected numerical examples are presented to show that the ICVEFG method has advantages, such as greater precision and computational efficiency, over conventional meshless methods.
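
    The moving least-squares idea underlying such shape functions can be sketched in its simplest real-variable 1-D form (not the complex-variable version of the paper; the weight function, support size, and node layout are arbitrary choices). With a linear basis, the approximation reproduces any linear field exactly:

```python
import numpy as np

def mls_value(x, nodes, u, support=0.35):
    """1-D moving least-squares approximation with linear basis p = [1, x]
    and a compactly supported cubic weight function."""
    r = np.abs(nodes - x) / support
    w = np.where(r < 1.0, 1.0 - 3.0 * r**2 + 2.0 * r**3, 0.0)  # weights
    P = np.vstack([np.ones_like(nodes), nodes]).T              # basis at nodes
    A = (P * w[:, None]).T @ P                                 # moment matrix
    b = (P * w[:, None]).T @ u
    coeff = np.linalg.solve(A, b)          # local weighted least-squares fit
    return coeff[0] + coeff[1] * x

nodes = np.linspace(0.0, 1.0, 11)
u = 2.0 * nodes + 1.0                      # nodal values of a linear field
val = mls_value(0.437, nodes, u)           # should reproduce 2*0.437 + 1
```

This linear-consistency property is what element-free Galerkin methods rely on when they build shape functions from scattered nodes instead of a mesh.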

  16. Activity cost analysis: a tool to cost medical services and improve quality of care. (United States)

    Udpa, S


    This paper suggests an activity-based cost (ABC) system as the appropriate cost accounting system to measure and control costs under the microstatistical episode of care (EOC) paradigm suggested by D. W. Emery (1999). ABC systems work well in such an environment because they focus on the activities performed to provide services in the delivery of care. Thus, under an ABC system it is not only possible to accurately cost episodes of care, but also to more effectively monitor and improve the quality of care. Under the ABC system, costs are first traced to activities and then from the activities to units of episodic care, using cost drivers based on the consumption of activity resources.
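
    The two-stage tracing described above, from cost pools to activity rates and from activity rates to an episode of care, can be sketched in a few lines (all activities, figures, and driver volumes are hypothetical):

```python
# Stage 1: trace costs to activity pools and compute a rate per driver unit.
activities = {
    # activity: (total pool cost, total driver volume, driver unit)
    "triage":    (40_000.0,  2_000, "visits"),
    "lab_tests": (90_000.0,  6_000, "tests"),
    "nursing":   (150_000.0, 10_000, "nursing hours"),
}
rates = {name: cost / volume for name, (cost, volume, _) in activities.items()}

# Stage 2: an episode of care consumes so many driver units of each activity;
# its cost is the sum of consumed units times the activity rates.
episode = {"triage": 1, "lab_tests": 4, "nursing": 6}
episode_cost = sum(rates[name] * qty for name, qty in episode.items())
```

With these figures the episode costs 1*20 + 4*15 + 6*15 = 170, and changing the mix of activities immediately changes the episode cost, which is the monitoring lever the paper highlights.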

  17. Analysis of Operation and Possibilities of Improving E-Learning System in Traffic Engineering

    Directory of Open Access Journals (Sweden)

    Dragan Peraković


    Full Text Available Since the Faculty of Transport and Traffic Sciences was connected to the CARNet academic network, new information and communication technologies have been constantly introduced and the old ones updated with the aim of improving the quality of studying, from the introduction of the WebCT application to the complete design, development, and implementation of the faculty's own e-learning solution. A complete e-learning system has been developed, named e-Student, which consists of several program modules called SAN, DMS, SMSCentar, etc. Since the introduction of the system, students and teaching staff have shown great interest in it, because it eases the monitoring of student activities through seminar papers and tasks, exercises, and the solving of various knowledge tests. The work provides graphical illustrations and statistical data that analyze the operation of the system and its exploitation characteristics. The obtained results indicate a growing interest of the teaching staff and students, which points to a further need to upgrade the system in order to increase the safety and speed of information transfer.

  18. Analysis of Urban Heat Island Effect Using an Improved CTTC and STTC Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yufeng; WANG Zhigang; SUN Yuexia


    An improved cluster thermal time constant (CTTC) and surface thermal time constant (STTC) numerical model was introduced, which took into account the effect of vegetation coverage and modified the expression of the net longwave radiation of the canyon layer. In the case study, the model was used to calculate the air temperature variation in downtown Tianjin City. The relative error between the calculated and measured air temperatures was less than 3%. The tendency of the air temperature variation was predicted when the building aspect ratio, vegetation rate, and wind speed were changed respectively. It is demonstrated that when the aspect ratio of a building with south-north orientation increased, the heat island intensity during the daytime was mitigated; however, it became worse after sunset. The vegetation coverage rate and wind speed both had a negative relationship with the urban heat island intensity.
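
    The notion of a thermal time constant at the heart of the CTTC/STTC approach can be illustrated with a first-order lag response (a generic sketch, not the paper's model; all values are arbitrary):

```python
def thermal_response(t_env, tau, t0, dt, steps):
    """Explicit integration of T' = (T_env - T) / tau: the thermal time
    constant tau sets how fast a surface approaches the driving temperature."""
    T, out = t0, []
    for _ in range(steps):
        T += dt * (t_env - T) / tau
        out.append(T)
    return out

# A surface at 20 C exposed to 30 C air, with tau = 2 h and dt = 0.01 h.
traj = thermal_response(30.0, 2.0, 20.0, 0.01, 400)   # simulate 4 h = 2*tau
# After two time constants, the initial 10 C gap has decayed by roughly exp(-2).
```

A larger tau (more thermal mass, as with dense urban fabric) delays the response, which is how time-constant models represent the lagged evening heat release behind the heat island effect.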

  19. Analysis of the Difficulties and Improvement Method on Introduction of PBL Approach in Developing Country (United States)

    Okano, Takasei; Sessa, Salvatore

    In the field of international cooperation, it is increasingly common to introduce the Japanese engineering educational model in developing countries to improve the quality of education and research activity. A naive implementation of such a model in a different culture and educational system may lead to several problems. In this paper, we evaluate the Project Based Learning (PBL) class developed at Waseda University in Japan and employed in the Egyptian educational context at the Egypt-Japan University of Science and Technology (E-JUST). We found difficulties such as non-homogeneous student backgrounds, disconnection from the students' research, weak adaptation of learning styles, and irregular course conduction. To solve these difficulties at E-JUST, we propose the introduction of groupware, the choice of project themes based on student motivation, and curriculum modification.

  20. Harmonic analysis and field quality improvement of an HTS quadrupole magnet for a heavy ion accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Zhan; Wei, Shaoqing; Lee, Sang Jin [Uiduk University, Gyeongju (Korea, Republic of); Jo, Hyun Chul; Kim, Do Gyun; Kim, Jong Won [Rare Isotope Science Project, Institute for Basic Science, Daejeon (Korea, Republic of)


    In recent years, iron-dominated high-temperature superconductor (HTS) quadrupole magnets have been developed for heavy ion accelerators. Early in the development of accelerators, field analyses for iron-dominated quadrupole magnets were based on normal-conducting (NC) quadrupole magnets, and some conclusions from that work are still in use today. However, the magnetic field of iron-dominated HTS quadrupole magnets cannot fully follow these conclusions. This study established an HTS quadrupole magnet model and an NC quadrupole magnet model, and the harmonic characteristics of the two magnets were analyzed and compared. According to the comparison, conventional iron-dominated quadrupole magnets can be designed for the maximum field gradient; the HTS quadrupole magnet, however, should be designed with the varying field gradient in mind. Finally, the HTS quadrupole magnet was designed for the changing field gradient. The field quality of the design was improved compared with the result of the previous study, and a new design for the HTS quadrupole magnet has been suggested.
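
    Harmonic (multipole) analysis of a quadrupole field is commonly done by sampling the field on a reference circle and taking a Fourier transform. A minimal sketch for an ideal quadrupole follows; the complex-field convention B_y + iB_x and all parameter values are assumptions for illustration, not the paper's design calculation:

```python
import numpy as np

# Sample the complex field B_y + i*B_x of an ideal quadrupole of gradient g
# on a reference circle of radius r0, then extract harmonics by FFT.
g, r0, N = 10.0, 0.03, 64                 # T/m, m, number of sample points
theta = 2 * np.pi * np.arange(N) / N
z = r0 * np.exp(1j * theta)               # points on the reference circle
B = g * z                                 # ideal quadrupole: B_y + i*B_x = g*(x + i*y)
c = np.fft.fft(B) / N                     # c[n-1] is the harmonic of order n
# c[1] is the quadrupole term g*r0; for an ideal magnet every other harmonic
# vanishes, and any nonzero c[k] in a real magnet quantifies its field error.
```

Normalising the higher harmonics by c[1] gives the usual "units" of field quality against which designs like the one above are judged.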