WorldWideScience

Sample records for analysis improves col4a5

  1. Novel X-linked glomerulopathy associated with a COL4A5 missense mutation in a noncollagenous interruption

    Science.gov (United States)

    Becknell, Brian; Zender, Gloria; Houston, Ronald; Baker, Peter; McBride, Kim L.; Luo, Wentian; Hains, David; Borza, Dorin-Bogdan; Schwaderer, Andrew L.

    2011-01-01

We report a novel COL4A5 mutation causing rapid progression to end-stage renal disease in males despite the absence of the clinical and biopsy findings associated with Alport syndrome. Affected males had proteinuria, variable hematuria, and early progression to end-stage renal disease; renal biopsy findings included global and segmental glomerulosclerosis, mesangial hypercellularity, and basement membrane immune complex deposition. Exon sequencing of the COL4A5 locus identified a thymine-to-guanine transversion at nucleotide 665, resulting in a phenylalanine-to-cysteine missense mutation at codon 222. This mutation was confirmed in 4 affected males and 4 female obligate carriers, but was absent in 6 asymptomatic male family members and 198 unrelated individuals. α5(IV) collagen staining in renal biopsies from affected males was normal. The phenylalanine at position 222 is 100% conserved among vertebrates. This is the first description of a mutation in a non-collagenous interruption associated with severe renal disease, providing evidence for the importance of this structural motif. The range of phenotypes associated with COL4A5 mutations is more diverse than previously realized. COL4A5 mutation analysis should be considered when glomerulonephritis presents in an X-linked inheritance pattern, even with a presentation distinct from Alport syndrome. PMID:20881942
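The reported coordinates can be cross-checked with a little arithmetic: under the usual coding-sequence numbering, nucleotide 665 is the middle base of codon 222, and changing T to G there turns a phenylalanine codon into a cysteine codon. A minimal sketch (the reference codon TTT is assumed for illustration; TTC behaves the same way):

```python
# Standard genetic code, restricted to the codons needed here.
CODON = {"TTT": "Phe", "TTC": "Phe", "TGT": "Cys", "TGC": "Cys"}

nt = 665                          # mutated position (c.665T>G)
codon_index = (nt - 1) // 3 + 1   # codon containing nucleotide 665
offset = (nt - 1) % 3             # position of the base within that codon

ref_codon = "TTT"                 # assumed Phe codon for illustration
mut_codon = ref_codon[:offset] + "G" + ref_codon[offset + 1:]
print(codon_index, CODON[ref_codon], "->", CODON[mut_codon])  # 222 Phe -> Cys
```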

  2. A Novel COL4A5 Mutation Identified in a Chinese Han Family Using Exome Sequencing

    Directory of Open Access Journals (Sweden)

    Xiaofei Xiu

    2014-01-01

Alport syndrome (AS) is a monogenic disease of the basement membrane (BM), resulting in progressive renal failure due to glomerulonephropathy, variable sensorineural hearing loss, and ocular anomalies. It is caused by mutations in the collagen type IV alpha-3 gene (COL4A3), the collagen type IV alpha-4 gene (COL4A4), and the collagen type IV alpha-5 gene (COL4A5), which encode the type IV collagen α3, α4, and α5 chains, respectively. To explore the disease-related gene in a four-generation Chinese Han pedigree with AS, exome sequencing was conducted on the proband, and a novel deletion mutation, c.499delC (p.Pro167Glnfs*36), in the COL4A5 gene was identified. This mutation, absent from the 1000 Genomes Project, HapMap, dbSNP132, and YH1 databases and from 100 normal controls, cosegregated with the disease in the family. Neither sensorineural hearing loss nor typical COL4A5-related ocular abnormalities (dot-and-fleck retinopathy, anterior lenticonus, and the rare posterior polymorphous corneal dystrophy) were present in patients of this family. The phenotype of affected family members was characterized by early onset and rapid progression to end-stage renal disease (ESRD). Our discovery broadens the mutation spectrum of the COL4A5 gene associated with AS, which may also shed new light on genetic counseling for AS.
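The mutation nomenclature is internally consistent: deleted nucleotide 499 falls in codon 167, matching the protein-level name p.Pro167Glnfs*36 (proline 167 replaced by glutamine, with a stop codon 36 codons into the shifted reading frame). A quick check of the codon arithmetic:

```python
nt = 499                          # deleted nucleotide (c.499delC)
codon_index = (nt - 1) // 3 + 1   # codon containing nucleotide 499
print(codon_index)                # 167, as in p.Pro167Glnfs*36
```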

  3. Novel X-linked glomerulopathy is associated with a COL4A5 missense mutation in a non-collagenous interruption.

    Science.gov (United States)

    Becknell, Brian; Zender, Gloria A; Houston, Ronald; Baker, Peter B; McBride, Kim L; Luo, Wentian; Hains, David S; Borza, Dorin-Bogdan; Schwaderer, Andrew L

    2011-01-01

    A novel COL4A5 mutation causes rapid progression to end-stage renal disease in males, despite the absence of clinical and biopsy findings associated with Alport syndrome. Affected males have proteinuria, variable hematuria, and an early progression to end-stage renal disease. Renal biopsy findings include global and segmental glomerulosclerosis, mesangial hypercellularity and basement membrane immune complex deposition. Exon sequencing of the COL4A5 locus identified a thymine to guanine transversion at nucleotide 665, resulting in a phenylalanine to cysteine missense mutation at codon 222. The phenylalanine at position 222 is absolutely conserved among vertebrates. This mutation was confirmed in 4 affected males and 4 female obligate carriers, but was absent in 6 asymptomatic male family members and 198 unrelated individuals. Immunostaining for α5(IV) collagen in renal biopsies from affected males was normal. This mutation, in a non-collagenous interruption associated with severe renal disease, provides evidence for the importance of this structural motif and suggests the range of phenotypes associated with COL4A5 mutations is more diverse than previously realized. Hence, COL4A5 mutation analysis should be considered when glomerulonephritis presents in an X-linked inheritance pattern, even with a presentation distinct from Alport syndrome. PMID:20881942

  4. A nonsense mutation in the COL4A5 collagen gene in a family with X-linked juvenile Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Heiskari, N; Zhou, J; Jensen, U B; Tryggvason, K

    1995-01-01

    47 of the COL4A5 gene in a patient with a juvenile form of X-linked Alport syndrome with deafness. This two base deletion caused a shift in the reading frame and introduced a premature stop codon which resulted in an alpha 5(IV)-chain shortened by 202 residues and lacking almost the entire NC1 domain....... Prenatal diagnosis on chorionic villi tissue, obtained from one of the female carriers in the family, revealed a male fetus hemizygous for the mutated allele. A subsequent prenatal test in her next pregnancy revealed a normal male fetus. Prenatal diagnosis of Alport syndrome has not previously been...

  5. Mutations in the codon for a conserved arginine-1563 in the COL4A5 collagen gene in Alport syndrome

    DEFF Research Database (Denmark)

    Zhou, J; Gregory, M C; Hertz, Jens Michael; Barker, D F; Atkin, C; Spencer, E S; Tryggvason, K

    1993-01-01

    arginine to the translation stop codon TGA. In Utah kindred 2123 and in the Danish kindred A13, there was a C-->T mutation in the noncoding strand changing the same codon to CAA for glutamine. Both mutations were confirmed by allele-specific hybridization on PCR-amplified DNA from other family members....

  6. Detection of mutations in the COL4A5 gene by SSCP in X-linked Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Juncker, I; Persson, U;

    2001-01-01

    , three in-frame deletions, four nonsense mutations, and six splice site mutations. Twenty-two of the mutations have not previously been reported. Furthermore, we found one non-pathogenic amino acid substitution, one rare variant in a non-coding region, and one polymorphism with a heterozygosity of 28......%. Three de novo mutations were found, two of which were paternal and one of maternal origin....

  7. Failure Analysis for Improved Reliability

    Science.gov (United States)

    Sood, Bhanu

    2016-01-01

    Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Non destructive analysis techniques, 2. Destructive Analysis, 3. Materials Characterization). Section 4 - Summary and Closure

  8. Analysis and Improvement for SPINS

    Directory of Open Access Journals (Sweden)

    Yuan Wang

    2013-01-01

Wireless sensor networks are a new class of application network with broad prospects, regarded as one of the top ten world-changing technologies of the future. In recent years, wireless sensor networks have received much attention and wide application, and their security has become ever more prominent. SPINS considers only a simple master-key sharing scheme for secure bootstrapping, is vulnerable to DoS attacks, lacks a key-update mechanism, and does not support network expansion. We propose a secure communication protocol for wireless sensor networks based on a hierarchical topology, with reference to the widely accepted SPINS. Our scheme builds on the LEACH algorithm and remedies its security shortcomings. During topology establishment, cluster heads are authenticated through the base station to ensure their validity; in key management, we add a one-way hash function and a shared key, making the encryption key and authentication key change dynamically to enhance the security of network communications. The improved scheme takes into account the characteristics and limitations of wireless sensor networks and seeks to meet the security needs of network communications: confidentiality, integrity, data freshness, key update, and authentication. Although the improved protocol consumes slightly more energy than SPINS, its security performance is much improved.
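The dynamic-key idea described above, deriving each new key from the previous one with a one-way hash so that keys change over time, can be sketched as follows. This is an illustrative sketch only, not the paper's exact construction; the `ratchet` function and the initial key are made-up names:

```python
import hashlib

def ratchet(key: bytes, counter: int) -> bytes:
    # One-way derivation: the next epoch key is a hash of the current key
    # and a counter, so keys change dynamically and an old key cannot be
    # recovered from a newer one.
    return hashlib.sha256(key + counter.to_bytes(4, "big")).digest()

k0 = b"initial-shared-key"   # hypothetical pre-shared master key
k1 = ratchet(k0, 1)
k2 = ratchet(k1, 2)
print(k1 != k2)              # each epoch gets a fresh key
```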

  9. Improved Intermittency Analysis of Single Event Data

    OpenAIRE

    Janik, R. A.; Ziaja, B.

    1998-01-01

The intermittency analysis of single event data (particle moments) in multiparticle production is improved, taking into account corrections due to the reconstruction of the history of a particle cascade. This approach is tested within the framework of the $\alpha$-model.
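For context, intermittency analyses of this kind are conventionally phrased in terms of scaled factorial moments; in the standard definition (not spelled out in the abstract), for a bin of size $\delta$ with multiplicity $n$,

```latex
F_q(\delta) \;=\; \frac{\langle n(n-1)\cdots(n-q+1)\rangle}{\langle n\rangle^{q}},
\qquad
F_q(\delta) \;\propto\; \delta^{-\varphi_q} \quad (\delta \to 0),
```

where power-law growth of $F_q$ with decreasing bin size (nonzero intermittency indices $\varphi_q$) signals intermittent fluctuations.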

  10. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasoor;

    2011-01-01

We present some improved analytical results as part of the ongoing work on the analysis of the Fugue-256 hash function, a second-round candidate in NIST's SHA-3 competition. First we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the final transformation of Fugue-256 to 16...

  11. Improved Intermittency Analysis of Individual Events

    OpenAIRE

    Janik, R. A.; Ziaja, B.

    1998-01-01

Recent progress on the event-by-event analysis of intermittent data by R. A. Janik and myself is reported. The intermittency analysis of single event data (particle moments) in multiparticle production is improved, taking into account corrections due to the reconstruction of the history of a particle cascade. This approach is tested within the framework of the $\alpha$-model.

  12. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  13. AN IMPROVED ALGORITHM FOR DPIV CORRELATION ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    WU Long-hua

    2007-01-01

In a Digital Particle Image Velocimetry (DPIV) system, the correlation of digital images is normally used to acquire particle displacement information and give estimates of the flow field. The accuracy and robustness of the correlation algorithm directly affect the validity of the analysis result. In this article, an improved correlation-analysis algorithm is proposed that optimizes the selection of the correlation window, analysis area, and search path. This algorithm not only greatly reduces the amount of calculation, but also effectively improves the accuracy and reliability of the correlation analysis. The algorithm was demonstrated to be accurate and efficient in the measurement of the velocity field in a flocculation pool.
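As background, the core of DPIV correlation analysis is locating the cross-correlation peak between two interrogation windows; the peak position gives the particle displacement. A minimal FFT-based sketch on synthetic data (not the article's optimized algorithm):

```python
import numpy as np

# Synthetic "image" pair: the second frame is the first, shifted by (5, 3).
rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))
frame_b = np.roll(frame_a, (5, 3), axis=(0, 1))

# Circular cross-correlation via FFT; its peak sits at the displacement.
corr = np.fft.ifft2(np.fft.fft2(frame_a).conj() * np.fft.fft2(frame_b)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
print(dy, dx)  # 5 3
```

Production DPIV codes refine this with window overlap, sub-pixel peak fitting, and outlier validation, which is where the article's window/area/search-path optimizations come in.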

  14. Systems Improved Numerical Fluids Analysis Code

    Science.gov (United States)

    Costello, F. A.

    1990-01-01

Systems Improved Numerical Fluids Analysis Code, SINFAC, consists of additional routines added to the April 1983 version of SINDA. Additional routines provide for mathematical modeling of active heat-transfer loops. Simulates steady-state and pseudo-transient operations of 16 different components of heat-transfer loops, including radiators, evaporators, condensers, mechanical pumps, reservoirs, and many types of valves and fittings. Program contains property-analysis routine used to compute thermodynamic properties of 20 different refrigerants. Source code written in FORTRAN 77.

  15. Analysis of Questionnaire using Multivariate Analysis for Improving Lectures

    Science.gov (United States)

    Abe, Takehiko; Tajima, Takuya; Kimura, Haruhiko

Recently, universities have been sending questionnaires to students, and the results are used to improve lectures. However, students' subjective views dominate the questionnaire results, so those results need careful analysis. This paper uses regression analysis and quantification theory type III. In conclusion, it explains the relation between student satisfaction and grades through analysis of the questionnaire results.
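A minimal version of the regression step, relating grade to reported satisfaction with a least-squares line, might look like this (the data are invented for illustration; the paper's actual variables and coefficients are not given in the abstract):

```python
import numpy as np

# Hypothetical questionnaire data: course grade vs. satisfaction (1-5 scale).
grade = np.array([55.0, 62.0, 70.0, 78.0, 85.0, 91.0])
satisfaction = np.array([2.0, 2.5, 3.0, 3.6, 4.2, 4.5])

# Fit satisfaction = slope * grade + intercept by least squares.
slope, intercept = np.polyfit(grade, satisfaction, 1)
print(slope > 0)  # True here: higher grades go with higher satisfaction
```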

  16. Improving Intelligence Analysis With Decision Science.

    Science.gov (United States)

    Dhami, Mandeep K; Mandel, David R; Mellers, Barbara A; Tetlock, Philip E

    2015-11-01

    Intelligence analysis plays a vital role in policy decision making. Key functions of intelligence analysis include accurately forecasting significant events, appropriately characterizing the uncertainties inherent in such forecasts, and effectively communicating those probabilistic forecasts to stakeholders. We review decision research on probabilistic forecasting and uncertainty communication, drawing attention to findings that could be used to reform intelligence processes and contribute to more effective intelligence oversight. We recommend that the intelligence community (IC) regularly and quantitatively monitor its forecasting accuracy to better understand how well it is achieving its functions. We also recommend that the IC use decision science to improve these functions (namely, forecasting and communication of intelligence estimates made under conditions of uncertainty). In the case of forecasting, decision research offers suggestions for improvement that involve interventions on data (e.g., transforming forecasts to debias them) and behavior (e.g., via selection, training, and effective team structuring). In the case of uncertainty communication, the literature suggests that current intelligence procedures, which emphasize the use of verbal probabilities, are ineffective. The IC should, therefore, leverage research that points to ways in which verbal probability use may be improved as well as exploring the use of numerical probabilities wherever feasible. PMID:26581731
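One standard way to "quantitatively monitor forecasting accuracy", as recommended above, is the Brier score, the mean squared difference between probabilistic forecasts and binary outcomes (the abstract does not prescribe a specific metric; this is one common choice):

```python
def brier(forecasts, outcomes):
    # Lower is better: 0.0 for perfectly confident correct forecasts,
    # 0.25 for always answering 0.5.
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

print(brier([0.9, 0.2, 0.5], [1, 0, 1]))  # about 0.1
```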

  17. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of the system that meets mission critical requirements can be a real challenge. The change in the system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports parallel work of groups of system analysts and software developers. Deployment of formal rules to the requirements written in natural language enables using formal analysis of artifacts being a bridge between software and system requirements. Formalism and textual form of requirements allowed the automatic generation of message flow graph for the (sub system, called the “big-picture-model”. Flow diagram analysis helped to avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big picture model” improves the control of the quality parameters of the software architecture. The article also tries to explain why the commercial platform based on UML modeling language may not be sufficient in projects of this complexity.

  18. Improved Tiled Bitmap Forensic Analysis Algorithm

    Directory of Open Access Journals (Sweden)

    C. D. Badgujar, G. N. Dhanokar

    2012-12-01

In the computer networking world, the need for security and proper systems of control is obvious, as is the need to find intruders who modify data. Nowadays, fraud in companies is committed not only by outsiders but also by insiders, who may perform illegal activity and try to hide it. Companies would like to be assured that such illegal activity, i.e. tampering, has not occurred, or that if it does, it is quickly discovered. Mechanisms now exist that detect tampering of a database through the use of cryptographically strong hash functions. This paper contains a survey exploring various approaches to database forensics through different methodologies, using forensic algorithms and tools for investigations. Forensic analysis algorithms are used to determine who, when, and what data has been tampered with. The Tiled Bitmap Algorithm introduces the notion of a candidate set (all possible locations of detected tamperings) and provides a complete characterization of the candidate set and its cardinality. The improved tiled bitmap algorithm overcomes the drawbacks of the existing tiled bitmap algorithm.
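The tamper-detection mechanism the survey refers to, cryptographically strong hash functions over database records, can be illustrated with a simple hash chain: altering any record changes every subsequent link, which localizes the earliest tampering. This is a sketch of the general idea, not the Tiled Bitmap Algorithm itself:

```python
import hashlib

def hash_chain(records):
    # Each link commits to all prior records: h_i = SHA-256(h_{i-1} || r_i).
    h = b"\x00" * 32
    links = []
    for r in records:
        h = hashlib.sha256(h + r).digest()
        links.append(h)
    return links

good = hash_chain([b"row1", b"row2", b"row3"])
tampered = hash_chain([b"row1", b"rowX", b"row3"])
first_bad = next(i for i, (a, b) in enumerate(zip(good, tampered)) if a != b)
print(first_bad)  # 1: the chain first diverges at the modified record
```

The Tiled Bitmap approach refines this by hashing tiles of records so that the candidate set of tampered locations can be characterized precisely.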

  19. Analysis and Improvement of a User Authentication Improved Protocol

    Directory of Open Access Journals (Sweden)

    Zuowen Tan

    2010-05-01

Remote user authentication typically relies on passwords to log in to servers over insecure networks. Recently, Peyravian and Jeffries proposed a practical authentication scheme based on one-way collision-resistant hash functions. However, Shim and Munilla independently showed that the scheme is vulnerable to off-line guessing attacks. To remove this weakness, Hölbl, Welzer, and Brumen presented improved secure password-based protocols for remote user authentication, password change, and session key establishment. Unfortunately, the remedies of their improved scheme do not work: the improved scheme still suffers from off-line attacks, and the password change protocol is insecure against denial-of-service attacks. A scheme is proposed that overcomes these weaknesses. Detailed cryptanalysis shows that the proposed password-based protocols for remote user authentication, password change, and session key establishment are immune to man-in-the-middle attacks, replay attacks, password guessing attacks, outsider attacks, denial-of-service attacks, and impersonation attacks.
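One standard ingredient for resisting the replay attacks discussed above is a nonce-based challenge-response over a shared secret; an illustrative sketch only (not the authors' protocol, and the key name is hypothetical):

```python
import hashlib
import hmac
import secrets

secret = b"password-derived-shared-key"  # assumed established out of band

# Server issues a fresh random challenge; the client answers with an HMAC
# over it, proving knowledge of the secret without transmitting it.
nonce = secrets.token_bytes(16)
response = hmac.new(secret, nonce, hashlib.sha256).digest()

# The server recomputes and compares in constant time; an old response
# replayed against a new nonce will not verify.
ok = hmac.compare_digest(response, hmac.new(secret, nonce, hashlib.sha256).digest())
print(ok)  # True
```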

  20. Productivity improvement through cycle time analysis

    Science.gov (United States)

    Bonal, Javier; Rios, Luis; Ortega, Carlos; Aparicio, Santiago; Fernandez, Manuel; Rosendo, Maria; Sanchez, Alejandro; Malvar, Sergio

    1996-09-01

A cycle time (CT) reduction methodology has been developed at the Lucent Technologies facility (formerly AT&T) in Madrid, Spain. It is based on a comparison of the contribution of each process step in each technology with a target generated by a cycle time model. These targeted cycle times are obtained using capacity data for the machines processing those steps, queuing theory, and theory of constraints (TOC) principles (buffers to protect the bottleneck and low cycle time/inventory everywhere else). Overall equipment effectiveness (OEE)-like analysis is done on the machine groups with major differences between their target cycle times and real values. Comparisons between the current values of the parameters that govern their capacity (process times, availability, idles, reworks, etc.) and the engineering standards are made to detect the causes of excess contribution to the cycle time. Several friendly graphical tools have been developed to track and analyze those capacity parameters. Two tools have proved especially important: ASAP (analysis of scheduling, arrivals and performance) and Performer, which analyzes interrelation problems among machines, procedures, and direct labor. Performer is designed for a detailed, daily analysis of an isolated machine. The extensive use of this tool by the whole labor force has demonstrated impressive results in the elimination of multiple small inefficiencies, with direct positive implications for OEE. As for ASAP, it shows the lots in process/queue for different machines at the same time. ASAP is a powerful tool for analyzing product flow management and assigned capacity for interdependent operations like cleaning and oxidation/diffusion. Additional tools have been developed to track, analyze, and improve process times and availability.
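For reference, OEE-style analysis multiplies three loss factors, availability, performance, and quality, so a machine group's shortfall can be attributed to a specific factor (the numbers below are illustrative, not from the paper):

```python
def oee(availability, performance, quality):
    # Overall equipment effectiveness is the product of the three factors,
    # each expressed as a fraction of the ideal.
    return availability * performance * quality

print(oee(0.90, 0.80, 0.50))  # about 0.36: quality losses dominate here
```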

  1. SINFAC - SYSTEMS IMPROVED NUMERICAL FLUIDS ANALYSIS CODE

    Science.gov (United States)

    Costello, F. A.

    1994-01-01

The Systems Improved Numerical Fluids Analysis Code, SINFAC, consists of additional routines added to the April 1983 revision of SINDA, a general thermal analyzer program. The purpose of the additional routines is to allow for the modeling of active heat transfer loops. The modeler can simulate the steady-state and pseudo-transient operations of 16 different heat transfer loop components including radiators, evaporators, condensers, mechanical pumps, reservoirs and many types of valves and fittings. In addition, the program contains a property analysis routine that can be used to compute the thermodynamic properties of 20 different refrigerants. SINFAC can simulate the response to transient boundary conditions. SINFAC was first developed as a method for computing the steady-state performance of two-phase systems. It was then modified using CNFRWD, SINDA's explicit time-integration scheme, to accommodate transient thermal models. However, SINFAC cannot simulate pressure drops due to time-dependent fluid acceleration, transient boil-out, or transient fill-up, except in the accumulator. SINFAC also requires the user to be familiar with SINDA. The solution procedure used by SINFAC is similar to that which an engineer would use to solve a system manually. The solution to a system requires the determination of all of the outlet conditions of each component, such as the flow rate, pressure, and enthalpy. To obtain these values, the user first estimates the inlet conditions to the first component of the system, then computes the outlet conditions from the data supplied by the manufacturer of the first component. The user then estimates the temperature at the outlet of the third component and computes the corresponding flow resistance of the second component. With the flow resistance of the second component, the user computes the conditions downstream, namely the inlet conditions of the third. The computations follow for the rest of the system, back to the first component.
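The guess-and-correct marching described above amounts to solving a small nonlinear system; for a single loop it reduces to balancing the pump curve against the loop's flow resistance. A toy sketch with made-up coefficients (not SINFAC's actual procedure or component data):

```python
# Balance an assumed pump curve dP(Q) = a - b*Q^2 against a loop
# resistance R*Q^2 by bisection on the volumetric flow rate Q.
a, b, R = 100.0, 2.0, 8.0   # hypothetical pump and loop coefficients

def residual(q):
    # Positive while the pump delivers more head than the loop absorbs.
    return (a - b * q * q) - R * q * q

lo, hi = 0.0, 50.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if residual(mid) > 0:
        lo = mid
    else:
        hi = mid

q = 0.5 * (lo + hi)
print(round(q, 3))  # about 3.162, i.e. sqrt(a / (b + R))
```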

  2. Novel analysis and improvement of Yahalom protocol

    Institute of Scientific and Technical Information of China (English)

    CHEN Chun-ling; YU Han; L(U) Heng-shan; WANG Ru-chuan

    2009-01-01

The modified version of the Yahalom protocol improved by Burrows, Abadi, and Needham (BAN) still has security drawbacks. This study analyzed such flaws in detail from the viewpoint of strand spaces, a novel method of analyzing protocol security. First, a mathematical model of the BAN-Yahalom protocol is constructed. Second, penetrators' abilities are restricted with a rigorous and formalized definition. Moreover, to increase the security of this protocol against potential attackers in practice, a further improvement is made to the protocol. Future applications of this re-improved protocol are also discussed.

  3. Improving Public Perception of Behavior Analysis.

    Science.gov (United States)

    Freedman, David H

    2016-05-01

    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis. PMID:27606184

  4. Improving Software Quality through Program Analysis

    International Nuclear Information System (INIS)

In this paper, we present the Program Analysis Framework (PAF) to analyze the software architecture and modularity of large software packages using techniques from aspect mining. The basic idea of PAF is to first record the call-relationship information among the important elements and then use different analysis algorithms to find, from this recorded information, the crosscutting concerns which could destroy the modularity of the software. We evaluate our framework by analyzing DATE, the ALICE Data-Acquisition (DAQ) software, which handles the data flow from the detector electronics to the permanent storage archiving. The analysis results prove the effectiveness and efficiency of our framework. PAF has pinpointed a number of possible optimizations which could be applied and help maximize the software quality. PAF could also be used for the analysis of other projects written in the C language.
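A crude version of the idea, mining recorded call relationships for crosscutting concerns, is fan-in analysis: a routine invoked from many distinct modules is a candidate concern. A toy sketch (the module and function names are invented, and this is only one of many aspect-mining heuristics):

```python
from collections import defaultdict

# Recorded call relationships: module -> functions it calls.
calls = {
    "reader":  ["log", "open_stream"],
    "writer":  ["log", "flush"],
    "monitor": ["log", "poll"],
}

# Count, for each function, how many distinct modules call it.
fan_in = defaultdict(set)
for module, callees in calls.items():
    for f in callees:
        fan_in[f].add(module)

# Functions called from >= 3 distinct modules cut across the design.
crosscutting = [f for f, mods in fan_in.items() if len(mods) >= 3]
print(crosscutting)  # ['log']
```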

  5. An Economic Analysis of Improved Water Quality

    OpenAIRE

    Alam, Khorshed; Rolfe, John; Donaghy, Peter

    2006-01-01

    The research reported in this paper is focused on the cost-effectiveness of intervention strategies to reduce pollution loads and improve water quality in South-east Queensland. Strategies considered include point and non-point source interventions. Predicted reductions in pollution levels were calculated for each action based on the expected population growth. The costs of the interventions included the full investment and annual running costs as well as planned public investment by the stat...

  6. Improved WKB analysis of cosmological perturbations

    International Nuclear Information System (INIS)

    Improved Wentzel-Kramers-Brillouin (WKB)-type approximations are presented in order to study cosmological perturbations beyond the lowest order. Our methods are based on functions which approximate the true perturbation modes over the complete range of the independent (Langer) variable, from subhorizon to superhorizon scales, and include the region near the turning point. We employ both a perturbative Green's function technique and an adiabatic (or semiclassical) expansion (for a linear turning point) in order to compute higher order corrections. Improved general expressions for the WKB scalar and tensor power spectra are derived for both techniques. We test our methods on the benchmark of power-law inflation, which allows comparison with exact expressions for the perturbations, and find that the next-to-leading order adiabatic expansion yields the amplitude of the power spectra with excellent accuracy, whereas the next-to-leading order with the perturbative Green's function method does not improve the leading order result significantly. However, in more general cases, either or both methods may be useful
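For orientation, the leading-order WKB form that these higher-order methods improve upon is, for a mode equation $\chi_k'' + \omega_k^2(\eta)\,\chi_k = 0$,

```latex
\chi_k(\eta) \;\approx\; \frac{1}{\sqrt{2\,\omega_k(\eta)}}
\exp\!\left(\pm i \int^{\eta} \omega_k(\eta')\, d\eta'\right),
\qquad \left|\frac{\omega_k'}{\omega_k^{2}}\right| \ll 1,
```

which breaks down near the turning point $\omega_k = 0$, hence the Langer variable and the perturbative Green's function and adiabatic expansions discussed in the abstract.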

  7. Microfracturing and new tools improve formation analysis

    Energy Technology Data Exchange (ETDEWEB)

    McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))

    1992-12-07

This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wireline logs, and temperature logging in air-filled holes: new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are usually created at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open-hole parameters for designing the main fracture treatment.

  8. Recent improvements in Thomson scattering data analysis

    International Nuclear Information System (INIS)

    A new profile analysis package for use with the Thomson scattering data on ISX-B has recently been implemented. The primary feature of this package is a weighted least squares fitting of temperature and density data to generate a representative curve, as opposed to the previous hand-fitting technique. The changes will automate the manner in which data are transmitted and manipulated, without affecting the calculational techniques previously used. The computer programs have also been used to estimate the sensitivity of various plasma quantities to the accuracy of the Thomson scattering data

  9. Video analysis applied to volleyball didactics to improve sport skills

    OpenAIRE

    Raiola, Gaetano; Parisi, Fabio; Giugno, Ylenia; Di Tore, Pio Alfredo

    2013-01-01

The feedback method is increasingly used in learning new skills and improving performance. "Recent research, however, showed that the more objective and quantitative the feedback is, the greater its effect on performance". Video analysis, the analysis of sports performance by watching video, is used primarily to quantify athletes' performance through notational analysis. It may be useful to combine the quantitative and qualitative analysis of the single ges...

  10. An improved evaluation method for fault tree kinetic analysis

    International Nuclear Information System (INIS)

By means of the exclusive sum of products of a fault tree, the improved method uses the basic event parameters directly in the synthetic evaluation, which makes the kinetic analysis of the fault tree simpler. This paper also provides a reasonable evaluation method for the kinetic analysis of basic events whose parameters follow a synthetic distribution.
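The key simplification of an exclusive sum of products is that the disjoint terms' probabilities add directly, so the top-event probability comes straight from the basic-event parameters. A toy sketch with an assumed disjoint form TOP = A·B + ¬A·C and independent basic events (illustrative numbers, not from the paper):

```python
# Assumed basic-event probabilities (illustrative only).
p = {"A": 0.1, "B": 0.2, "C": 0.3}

# The two products A*B and (not A)*C are mutually exclusive,
# so the top-event probability is a plain sum of the two terms.
top = p["A"] * p["B"] + (1 - p["A"]) * p["C"]
print(round(top, 3))  # 0.29
```

Without disjointness one would need inclusion-exclusion over the minimal cut sets, which is exactly the bookkeeping the exclusive form avoids.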

  11. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  12. Improved time complexity analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2015-01-01

A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm with population size μ ≤ n^(1/8−ε) requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of...... believe this is a major improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented...

  13. Improved Runtime Analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2013-01-01

    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of our previous one. Firstly, the...... improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented to explore the limits of the...

  14. Improved Runtime Analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of our previous one. Firstly, the...... improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented to explore the limits of the...

  15. An Analysis of an Improved Bus-Based Multiprocessor Architecture

    Science.gov (United States)

    Ricks, Kenneth G.; Wells, B. Earl

    1998-01-01

    This paper analyses the effectiveness of a hybrid multiprocessing/multicomputing architecture that is based upon a single-board-computer multiprocessor (SBCM) architecture. Based upon empirical analysis using discrete event simulations and Monte Carlo techniques, this hybrid architecture, called the enhanced single-board-computer multiprocessor (ESBCM), is shown to have improved performance and scalability characteristics over current SBCM designs.

  16. Improved reliability analysis method based on the failure assessment diagram

    Science.gov (United States)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
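The density-visualization step can be illustrated with a plain product-Gaussian 2D KDE over assessment points in the (Lr, Kr) plane of the FAD. The points, bandwidths, and kernel choice below are hypothetical stand-ins; the paper's actual KDE implementation and pipe-assessment data are not reproduced:

```python
import math
import random

def kde2d(points, x, y, hx=0.05, hy=0.05):
    """Product Gaussian kernel density estimate at (x, y).

    A simple stand-in for the 2D KDE used to visualize the density
    of probabilistic assessment points on the FAD; bandwidths hx, hy
    are illustrative, not data-driven.
    """
    n = len(points)
    s = 0.0
    for px, py in points:
        s += (math.exp(-0.5 * ((x - px) / hx) ** 2)
              * math.exp(-0.5 * ((y - py) / hy) ** 2))
    return s / (n * 2.0 * math.pi * hx * hy)

rng = random.Random(0)
# Hypothetical assessment points: (Lr = load ratio, Kr = toughness ratio)
pts = [(rng.gauss(0.6, 0.05), rng.gauss(0.5, 0.05)) for _ in range(500)]

d_center = kde2d(pts, 0.6, 0.5)  # where the point cloud concentrates
d_tail = kde2d(pts, 0.9, 0.9)    # sparse region near the FAD boundary
```

Evaluating the estimate on a grid and contouring it gives the point-density plot described in the abstract.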

  17. Thermal hydraulic analysis of the JMTR improved LEU-core

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, Toshio; Nagao, Yoshiharu; Komukai, Bunsaku; Naka, Michihiro; Fujiki, Kazuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Takeda, Takashi [Radioactive Waste Management and Nuclear Facility Decommissioning Technology Center, Tokai, Ibaraki (Japan)

    2003-01-01

    After investigation of a new core arrangement for the JMTR reactor to enhance fuel burn-up and consequently extend the operation period, the ''improved LEU core'', which utilizes 2 additional fuel elements in place of formerly installed reflector elements, was adopted. This report describes the results of the thermal-hydraulic analysis of the improved LEU core as part of the safety analysis for the licensing. The analysis covers steady state, abnormal operational transients and accidents, which were described in the annexes of the licensing documents as design basis events. Calculation conditions for the computer codes were conservatively determined based on the neutronic analysis results and others. The results of the analysis, which revealed that the safety criteria for fuel temperature, DNBR and primary coolant temperature were satisfied, were used in the licensing. The operation license of the JMTR with the improved LEU core was granted in March 2001, and reactor operation with the new core started in November 2001 as the 142nd operation cycle. (author)

  18. Using Operational Analysis to Improve Access to Pulmonary Function Testing

    Directory of Open Access Journals (Sweden)

    Ada Ip

    2016-01-01

    Full Text Available Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilization of testing resources. Results. Qualitative analysis demonstrated that stakeholder groups had discrepant views on access and capacity in the laboratory. Mean daily resource utilization was 0.64 (SD 0.15, with monthly average utilization consistently less than 0.75. Reserved testing slots for subspecialty clinics were poorly utilized, leaving many testing slots unfilled. When subspecialty demand exceeded number of reserved slots, there was sufficient capacity in the pulmonary function schedule to accommodate added demand. Findings were shared with stakeholders and influenced scheduling process improvements. Conclusion. This study highlights the importance of operational data to identify causes of poor access, guide system decision-making, and determine effects of improvement initiatives in a variety of healthcare settings. Importantly, simple operational analysis can help to improve efficiency of health systems with little or no added financial investment.
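The utilization metric reported above is a simple ratio of booked to available testing slots, averaged over days. A minimal sketch with hypothetical slot counts (the study's actual scheduling data are not reproduced):

```python
def daily_utilization(booked_slots, available_slots):
    """Resource utilization for one day: fraction of capacity used."""
    return booked_slots / available_slots

# Hypothetical (booked, available) testing-slot counts for four days
days = [(16, 25), (18, 25), (14, 25), (20, 25)]

utils = [daily_utilization(b, a) for b, a in days]
mean_util = sum(utils) / len(utils)  # compare against a 0.75 target
```

Consistently low mean utilization alongside perceived poor access is exactly the signature the study found: capacity exists, but reserved slots go unfilled.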

  19. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    OpenAIRE

    Jia-Shing Sheu; Kai-Chung Teng

    2013-01-01

    The objective of this paper is to perform the innovation design for improving the recognition of a captured QR code image with blur through the Pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as the low quality of the image. Focus is an important factor that affects the quality of the image. This study discusses the out-of-focus QR code image and aims to improve the recognition of the conte...

  20. Core analysis and CT imaging improve shale completions

    International Nuclear Information System (INIS)

    To improve hydraulic fracturing efficiency in Devonian shales, core analysis and computerized tomography (CT) can provide data for orienting perforations, determining fracture direction, and selecting deviated well trajectories. This article reports on technology tested in a West Virginia well for improving the economics of developing Devonian shale and other low permeability gas reservoirs. With slight production increase per well, Columbia Natural Resources Inc. (CNR) has determined that marginal gas well payout time can be shortened enough to encourage additional drilling. For eight wells completed by CNR in 1992, the absolute open flow (AOF) averaged 116 Mcfd before stimulation. After stimulation using long-standing fracture stimulation procedures, the AOF averaged 500 Mcfd

  1. Development and improvement of safety analysis code for geological disposal

    International Nuclear Information System (INIS)

    In order to confirm long-term safety concerning geological disposal, a probabilistic safety assessment code and other analysis codes, which can evaluate the probability of each event and its influence on the engineered and natural barriers, were introduced. We confirmed the basic functions of those codes and studied the relation between those functions and the FEP/PID that should be taken into consideration in safety assessment. We are planning to develop a 'Nuclide Migration Assessment System' for the purpose of improving the efficiency of assessment work, preventing human error in analysis, and assuring the quality of the analysis environment and analysis work for safety assessment. As the first step, we defined the system requirements and decided the system composition and the functions to be implemented in it based on those requirements. (author)

  2. Improvements in antenna coupling path algorithms for aircraft EMC analysis

    Science.gov (United States)

    Bogusz, Michael; Kibina, Stanley J.

    The algorithms to calculate and display the path of maximum electromagnetic interference coupling along the perfectly conducting surface of a frustum cone model of an aircraft nose are developed and revised for the Aircraft Inter-Antenna Propagation with Graphics (AAPG) electromagnetic compatibility analysis code. Analysis of the coupling problem geometry on the frustum cone model and representative numerical test cases reveal how the revised algorithms are more accurate than their predecessors. These improvements in accuracy and their impact on realistic aircraft electromagnetic compatibility problems are outlined.

  3. Using Operational Analysis to Improve Access to Pulmonary Function Testing

    OpenAIRE

    Ada Ip; Raymond Asamoah-Barnieh; Diane P. Bischak; Warren J Davidson; W. Ward Flemons; Pendharkar, Sachin R.

    2016-01-01

    Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilizati...

  4. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    It is necessary to refine the human reliability analysis (HRA) method by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator's cognitive process. This report summarizes the outcomes of the improvement of the HRA method, in which enhancements were made to evaluate how degraded plant conditions affect the operator's cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences that investigate the applicability of the HRA method developed. HEPs of the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results obtained with these two methods are compared to depict their differences and the issues to be solved. Important conclusions are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Factors to be considered in the evaluation of human errors were clarified, degraded plant safety conditions were incorporated into the HRA, and HEPs affected by the contents of operator tasks were investigated in order to improve the HRA method, which integrates an operator cognitive action model into the ATHENA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  5. Improved hydrogen combustion model for multi-compartment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ogino, Masao; Hashimoto, Takashi [Systems Safety Department, Nuclear Power Engineering Corp., Tokyo (Japan)

    2000-11-01

    NUPEC has been improving the hydrogen combustion model in the MELCOR code for severe accident analysis. In the proposed combustion model, the flame velocity in a node is predicted using six different flame front shapes: fireball, prism, bubble, spherical jet, plane jet, and parallelepiped. A verification study of the proposed model was carried out using the NUPEC large-scale combustion test results, following previous work in which the GRS/Battelle multi-compartment combustion test results had been used. The test cases selected for the study were the premixed test and the scenario-oriented test, which simulated the severe accident sequences of an actual plant. The improved MELCOR code, with the combustion model replaced by the proposed model, sufficiently predicted the results of both the premixed test and the scenario-oriented test of the NUPEC large-scale tests. The improved MELCOR code was confirmed to simulate the combustion behavior in a multi-compartment containment vessel during a severe accident with an acceptable degree of accuracy. Application of the new model to LWR severe accident analysis will be continued. (author)

  6. Improved hydrogen combustion model for multi-compartment analysis

    International Nuclear Information System (INIS)

    NUPEC has been improving the hydrogen combustion model in the MELCOR code for severe accident analysis. In the proposed combustion model, the flame velocity in a node is predicted using six different flame front shapes: fireball, prism, bubble, spherical jet, plane jet, and parallelepiped. A verification study of the proposed model was carried out using the NUPEC large-scale combustion test results, following previous work in which the GRS/Battelle multi-compartment combustion test results had been used. The test cases selected for the study were the premixed test and the scenario-oriented test, which simulated the severe accident sequences of an actual plant. The improved MELCOR code, with the combustion model replaced by the proposed model, sufficiently predicted the results of both the premixed test and the scenario-oriented test of the NUPEC large-scale tests. The improved MELCOR code was confirmed to simulate the combustion behavior in a multi-compartment containment vessel during a severe accident with an acceptable degree of accuracy. Application of the new model to LWR severe accident analysis will be continued. (author)

  7. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

    Full Text Available The objective of this paper is to perform an innovative design for improving the recognition of captured QR code images blurred by defocus, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality, and focus is an important factor affecting image quality. This study discusses out-of-focus QR code images and aims to improve the recognition of their contents. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image, and this method is also used in this investigation. A blurred QR code image is separated into nine blur levels. In the experiment, four different quantitative approaches are used to reconstruct and decode the out-of-focus QR code images, and the reconstructed images are then compared. The final experimental results indicate improvements in identification.
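The pillbox (circular averaging) filter used above to simulate defocus is easy to sketch directly: a disk-shaped kernel of equal weights, normalized to sum to one, convolved over the image. Kernel radius and the toy image below are illustrative; the paper's four reconstruction approaches are not reproduced:

```python
def pillbox_kernel(radius):
    """Circular averaging (pillbox) kernel of the given integer radius."""
    size = 2 * radius + 1
    mask = [[1.0 if (i - radius) ** 2 + (j - radius) ** 2 <= radius ** 2 else 0.0
             for j in range(size)] for i in range(size)]
    total = sum(map(sum, mask))
    return [[v / total for v in row] for row in mask]

def convolve(img, kernel):
    """Blur a grayscale image (list of lists) with zero padding at borders."""
    r = len(kernel) // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx] * kernel[dy + r][dx + r]
            out[y][x] = acc
    return out

# A sharp black/white edge (like a QR module boundary) loses contrast
# after pillbox blur, simulating an out-of-focus capture
img = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
blurred = convolve(img, pillbox_kernel(2))
```

Increasing the kernel radius step by step gives the graded blur levels the experiment works through.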

  8. Micromechanical analysis of polyacrylamide-modified concrete for improving strengths

    Energy Technology Data Exchange (ETDEWEB)

    Sun Zengzhi [School of Materials Science and Engineering, Chang' an University, Xi' an 710064 (China)], E-mail: zz-sun@126.com; Xu Qinwu [Pavement research, Transtec Group Inc., Austin 78731 (United States)], E-mail: qinwu_xu@yahoo.com

    2008-08-25

    This paper studies how polyacrylamide (PAM) alters the physicochemical and mechanical properties of concrete. The microstructure of PAM-modified concrete and the physicochemical reaction between PAM and concrete were studied through scanning electron microscope (SEM), differential thermal analysis (DTA), thermal gravimetric analysis (TGA), and infrared spectrum analysis. Meanwhile, the workability and strengths of cement paste and concrete were tested. PAM's modification mechanism was also discussed. Results indicate that PAM reacts with the Ca{sup 2+} and Al{sup 3+} cations produced by concrete hydration to form the ionic compounds and reduce the crystallization of Ca(OH){sub 2}, acting as a flexible filler and reinforcement in the porosity of concrete and, therefore, improving concrete's engineering properties. PAM also significantly alters the microstructure at the aggregate-cement interfacial transition zone. Mechanical testing results indicate that the fluidity of cement paste decreases initially, then increases, and decreases again with increasing PAM content. PAM can effectively improve the flexural strength, bonding strength, dynamic impact resistance, and fatigue life of concrete, though it reduces the compressive strength to some extent.

  9. New Framework for Improving Big Data Analysis Using Mobile Agent

    Directory of Open Access Journals (Sweden)

    Youssef M. ESSA

    2014-01-01

    Full Text Available The rising number of applications serving millions of users and dealing with terabytes of data needs faster processing paradigms. Recently, there is growing enthusiasm for the notion of big data analysis, which has become a very important aspect of growing productivity, reliability and quality of services (QoS). Processing big data on a single powerful machine is not an efficient solution, so companies have focused on using Hadoop software for big data analysis. This is because Hadoop is designed to support parallel and distributed data processing. Hadoop provides a distributed file processing system that stores and processes a large scale of data. It enables fault tolerance by replicating data on three or more machines to avoid data loss. Hadoop is based on a client-server model and uses a single master machine called the NameNode. However, Hadoop has several drawbacks affecting its performance and reliability in big data analysis. In this paper, a new framework is proposed to improve big data analysis and overcome the specified drawbacks of Hadoop: task replication, the centralized node and node failure. The proposed framework is called MapReduce Agent Mobility (MRAM). MRAM is developed using mobile agents and the MapReduce paradigm under the Java Agent Development Framework (JADE).

  10. Skill Gap Analysis for Improved Skills and Quality Deliverables

    Directory of Open Access Journals (Sweden)

    Mallikarjun Koripadu

    2014-10-01

    Full Text Available With growing pressure to identify skilled resources in the Clinical Data Management (CDM) world of clinical research organizations and to provide quality deliverables, most CDM organizations are planning to improve skills within the organization. In the changing CDM landscape, the ability to build, manage and leverage the skills of clinical data managers is critical and important, and CDM needs to proactively identify, analyze and address skill gaps for all the roles involved. In addition to domain skills, the evolving role of a clinical data manager demands diverse skill sets such as project management, six sigma, analytical, decision-making and communication skills. This article proposes a methodology of skill gap analysis (SGA) management as one potential solution to the big skill challenge that CDM is gearing up for, bridging the gap in skills. This would in turn strengthen CDM capability, scalability and consistency across geographies, along with improved productivity and quality of deliverables

  11. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.
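The vulnerability notion itself — how much network-wide travel cost rises when a link fails — can be illustrated without the ant colony machinery. The sketch below ranks edges by the increase in total shortest-path cost when each edge is removed, using exhaustive Dijkstra on a hypothetical four-node network rather than the paper's improved ACO (which exists precisely to avoid this brute-force cost on large networks):

```python
import heapq

def dijkstra(adj, src):
    """Shortest travel times from src over a weighted adjacency dict."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def network_cost(adj, nodes):
    """Total all-pairs shortest-path cost (a simple traffic utility index)."""
    total = 0.0
    for s in nodes:
        dist = dijkstra(adj, s)
        total += sum(dist.get(t, 1e6) for t in nodes if t != s)  # 1e6 = disconnection penalty
    return total

def edge_vulnerability(adj, nodes):
    """Rank undirected edges by the cost increase when each edge is removed."""
    base = network_cost(adj, nodes)
    edges = {tuple(sorted((u, v))) for u in adj for v in adj[u]}
    ranks = []
    for u, v in edges:
        cut = {a: {b: w for b, w in nbrs.items() if {a, b} != {u, v}}
               for a, nbrs in adj.items()}
        ranks.append(((u, v), network_cost(cut, nodes) - base))
    return sorted(ranks, key=lambda kv: kv[1], reverse=True)

# Hypothetical network: C-D is the only bridge; all other edges have detours
adj = {
    "A": {"B": 1.0, "C": 2.0},
    "B": {"A": 1.0, "C": 1.0},
    "C": {"A": 2.0, "B": 1.0, "D": 1.0},
    "D": {"C": 1.0},
}
ranked = edge_vulnerability(adj, list(adj))
```

The bridge edge dominates the ranking, matching the abstract's observation that criticality concentrates where demand has no alternative route.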

  12. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

    Full Text Available To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs. Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.
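The gap computation behind this approach is simply importance minus actual status per factor, ranked by the difference. A sketch with hypothetical ratings on a 5-point scale (the study's survey data are not reproduced):

```python
def gap_analysis(factors):
    """Rank critical success factors by (importance - actual status) gap.

    `factors` maps factor name -> (mean importance, mean actual status),
    both on the same rating scale. Scores below are hypothetical.
    """
    gaps = {name: imp - status for name, (imp, status) in factors.items()}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

csfs = {
    "Management support": (4.8, 3.1),
    "Appropriate supervision": (4.6, 3.3),
    "Teamwork": (4.4, 3.6),
    "Safety training": (4.2, 3.9),
}
ranked = gap_analysis(csfs)  # largest gap = highest-priority problem
```

The factors at the top of the ranking are the ones the study recommends raising to satisfactory levels first.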

  13. Analysis and implementation of an improved recycling folded cascode amplifier

    Institute of Scientific and Technical Information of China (English)

    李一雷; 韩科峰; 闫娜; 谈熙; 闵昊

    2012-01-01

    A generally improved recycling folded cascode (IRFC) is analyzed and implemented. Analysis and comparisons among the IRFC, the original recycling folded cascode (RFC) and the conventional folded cascode (FC) show that, with the flexible structure of the IRFC, significant enhancements in transconductance, slew rate and noise can be achieved. Prototype amplifiers were fabricated in 0.13 μm technology. Measurement shows that the IRFC has a 3× enhancement in gain-bandwidth and slew rate over the conventional FC, and a 1.5× enhancement compared with the RFC.

  14. Quality Improvement of Multispectral Images for Ancient Document Analysis

    Czech Academy of Sciences Publication Activity Database

    Bianco, G.; Bruno, F.; Salerno, E.; Tonazzini, A.; Zitová, Barbara; Šroubek, Filip

    Budapest : ARCHAEOLINGUA, 2010 - (Ioannides, M.; Fellner, D.; Georgopoulos, A.; Hadjimitsis, D.), s. 29-34 ISBN 978-963-9911-16-1. [3rd International Conference dedicated on Digital Heritage. Limasol (CY), 08.11.2010-13.11.2010] R&D Projects: GA MŠk 1M0572; GA ČR GA102/08/1593 Institutional research plan: CEZ:AV0Z10750506 Keywords : document analysis * deblurring * image registration * multispectral imaging * blind source deconvolution Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2010/ZOI/zitova-quality%20improvement%20of%20multispectral%20images%20for%20ancient%20document%20analysis.pdf

  15. An Efficient and Configurable Preprocessing Algorithm to Improve Stability Analysis.

    Science.gov (United States)

    Sesia, Ilaria; Cantoni, Elena; Cernigliaro, Alice; Signorile, Giovanna; Fantino, Gianluca; Tavella, Patrizia

    2016-04-01

    The Allan variance (AVAR) is widely used to measure the stability of experimental time series. Specifically, AVAR is commonly used in space applications such as monitoring the clocks of the global navigation satellite systems (GNSSs). In these applications, the experimental data present some peculiar aspects which are not generally encountered when the measurements are carried out in a laboratory. Space clocks' data can in fact present outliers, jumps, and missing values, which corrupt the clock characterization. Therefore, an efficient preprocessing is fundamental to ensure a proper data analysis and improve the stability estimation performed with the AVAR or other similar variances. In this work, we propose a preprocessing algorithm and its implementation in a robust software code (in MATLAB language) able to deal with time series of experimental data affected by nonstationarities and missing data; our method is properly detecting and removing anomalous behaviors, hence making the subsequent stability analysis more reliable. PMID:26540679
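The preprocessing-then-AVAR pipeline can be sketched as follows. Gap filling by linear interpolation, MAD-based outlier replacement, and a plain non-overlapping Allan deviation are simplified stand-ins for the paper's MATLAB implementation, and the clock data below are synthetic:

```python
import math
import random

def fill_missing(y):
    """Fill missing samples (None) by linear interpolation between neighbors."""
    y = list(y)
    for i, v in enumerate(y):
        if v is None:
            lo = max(j for j in range(i) if y[j] is not None)
            hi = min(j for j in range(i + 1, len(y)) if y[j] is not None)
            y[i] = y[lo] + (y[hi] - y[lo]) * (i - lo) / (hi - lo)
    return y

def clip_outliers(y, k=5.0):
    """Replace samples further than k MADs from the median by the median."""
    med = sorted(y)[len(y) // 2]
    mad = sorted(abs(v - med) for v in y)[len(y) // 2]
    return [med if abs(v - med) > k * mad else v for v in y]

def allan_deviation(y):
    """Non-overlapping Allan deviation at tau = tau0 for fractional frequency data."""
    d = [(y[i + 1] - y[i]) ** 2 for i in range(len(y) - 1)]
    return math.sqrt(0.5 * sum(d) / len(d))

rng = random.Random(42)
clock = [rng.gauss(0.0, 1e-12) for _ in range(1000)]  # white frequency noise
clock[100] = None      # missing measurement
clock[500] = 5e-10     # anomalous jump (outlier)

filled = fill_missing(clock)
adev_raw = allan_deviation(filled)                 # corrupted by the outlier
adev_clean = allan_deviation(clip_outliers(filled))  # after preprocessing
```

A single uncorrected outlier inflates the stability estimate by more than an order of magnitude here, which is exactly why the preprocessing step matters for GNSS clock monitoring.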

  16. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    Science.gov (United States)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  17. An improved radial basis function network for structural reliability analysis

    International Nuclear Information System (INIS)

    Approximation methods such as the response surface method and the artificial neural network (ANN) method are widely used to alleviate the computation costs in structural reliability analysis. However, most of the ANN methods proposed in the literature suffer from various drawbacks such as poor choice of parameter settings, poor generalization and local minima. In this study, a support vector machine-based radial basis function (RBF) network method is proposed, in which the improved RBF model is used to approximate the limit state function and is then connected to a reliability method to estimate the failure probability. Since the learning algorithm of the RBF network is replaced by the support vector algorithm, the advantages of the latter, such as good generalization ability and global optimization, are propagated to the former, and thus the inherent drawbacks of the RBF network can be overcome. Numerical examples are given to demonstrate the applicability of the improved RBF network method in structural reliability analysis, as well as to illustrate the validity and effectiveness of the proposed method
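The surrogate-plus-sampling workflow can be sketched in plain Python. The Gaussian-RBF interpolant below is fit by ordinary Gaussian elimination rather than the support-vector training the paper proposes, and the limit state function, input distributions, and sample sizes are all hypothetical:

```python
import math
import random

def g_true(x1, x2):
    """Hypothetical limit state function: failure when g < 0."""
    return 3.0 - x1 - x2

def rbf_fit(pts, vals, h=0.6):
    """Interpolate (pts, vals) with Gaussian RBF centers at the training
    points, solving the linear system by Gaussian elimination (a sketch;
    the paper uses support-vector training instead)."""
    n = len(pts)
    phi = lambda a, b: math.exp(-((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2)
                                / (2.0 * h * h))
    A = [[phi(pts[i], pts[j]) for j in range(n)] + [vals[i]] for i in range(n)]
    for col in range(n):                       # forward elimination w/ pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):             # back substitution
        w[r] = (A[r][n] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return lambda x: sum(wi * phi(p, x) for wi, p in zip(w, pts))

# Train the surrogate on a coarse grid covering the sampling region
train = [(0.3 + 0.6 * i, 0.3 + 0.6 * j) for i in range(5) for j in range(5)]
g_hat = rbf_fit(train, [g_true(*p) for p in train])

# Monte Carlo failure probability through the surrogate vs. the true model
rng = random.Random(0)
samples = [(rng.gauss(1.0, 0.3), rng.gauss(1.0, 0.3)) for _ in range(20000)]
pf_surrogate = sum(g_hat(x) < 0.0 for x in samples) / len(samples)
pf_direct = sum(g_true(*x) < 0.0 for x in samples) / len(samples)
```

The point of the surrogate is that, once fitted from a handful of expensive limit-state evaluations, the Monte Carlo loop only ever queries the cheap approximation.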

  18. Improvements and experience in the analysis of reprocessing samples

    International Nuclear Information System (INIS)

    Improvements in the analysis of input samples for reprocessing were obtained. To cope with the decomposition of reprocessing input solutions owing to the high radioactivity, an aluminium capsule technique was developed. A known amount of the dissolver solution was weighed into an aluminium can and dried, and the capsule was sealed. In this form, the sample could be stored over a long period and could be redissolved later for analysis. The isotope correlation technique offers an attractive alternative for measuring the plutonium isotopic content of the dissolver solution; moreover, this technique allows for consistency checks of analytical results. For this purpose, a data bank of correlated isotopic data is in use. To improve the efficiency of analytical work, four automatic instruments have been developed. The conditioning of samples for the U-Pu isotopic measurement was achieved by an automatic ion exchanger. A mass spectrometer, to which a high-vacuum lock is connected, allows the automatic measurement of U-Pu samples. A process computer controls the heating, focusing and scanning processes during the measurement and evaluates the data. To ease data handling, alpha-spectrometry as well as a balance have been automated. (author)

  19. Multispectral fingerprinting for improved in vivo cell dynamics analysis

    Directory of Open Access Journals (Sweden)

    Cooper Cameron HJ

    2010-09-01

    Full Text Available Abstract Background Tracing cell dynamics in the embryo becomes tremendously difficult when cell trajectories cross in space and time and tissue density obscures individual cell borders. Here, we used the chick neural crest (NC as a model to test multicolor cell labeling and multispectral confocal imaging strategies to overcome these roadblocks. Results We found that multicolor nuclear cell labeling and multispectral imaging led to improved resolution of in vivo NC cell identification by providing a unique spectral identity for each cell. NC cell spectral identity allowed for more accurate cell tracking and was consistent during short term time-lapse imaging sessions. Computer model simulations predicted significantly better object counting for increasing cell densities in 3-color compared to 1-color nuclear cell labeling. To better resolve cell contacts, we show that a combination of 2-color membrane and 1-color nuclear cell labeling dramatically improved the semi-automated analysis of NC cell interactions, yet preserved the ability to track cell movements. We also found channel versus lambda scanning of multicolor labeled embryos significantly reduced the time and effort of image acquisition and analysis of large 3D volume data sets. Conclusions Our results reveal that multicolor cell labeling and multispectral imaging provide a cellular fingerprint that may uniquely determine a cell's position within the embryo. Together, these methods offer a spectral toolbox to resolve in vivo cell dynamics in unprecedented detail.

  20. Skill analysis part 3: improving a practice skill.

    Science.gov (United States)

    Price, Bob

    In this, the third and final article in a series on practice skill analysis, attention is given to imaginative ways of improving a practice skill. Having analysed and evaluated a chosen skill in the previous two articles, it is time to look at new ways to proceed. Creative people are able to be analytical and imaginative. The process of careful reasoning involved in analysing and evaluating a skill will not necessarily be used to improve it. To advance a skill, there is a need to engage in more imaginative, free-thinking processes that allow the nurse to think afresh about his or her chosen skill. Suggestions shared in this article are not exhaustive, but the material presented does illustrate measures that in the author's experience seem to have potential. Consideration is given to how the improved skill might be envisaged (an ideal skill in use). The article is illustrated using the case study of empathetic listening, which has been used throughout this series. PMID:22356066

  1. Potential Improvements in Human Reliability Analysis for Fire Risk Assessments

    International Nuclear Information System (INIS)

    The results of numerous fire risk assessments (FRA) and the experience gained from actual fire events have shown that fire can be a significant contributor to nuclear power plant (NPP) risk. However, on the basis of reviews of the FRAs performed for the Individual Plant External Events Examination (IPEEE) program in the U.S. and on recent research performed by U.S. Nuclear Regulatory Commission (NRC) to support increased use of risk information in regulatory decision making [e.g., Ref. 1, 2], it has become clear that improved modelling and quantification of human performance during fire events requires a better treatment of the special environment and response context produced by fires. This paper describes fire-related factors that have been identified as potentially impacting human performance, discusses to what extent such factors were modelled in the IPEEE FRAs, discusses prioritization of the factors likely to be most important to a realistic assessment of plant safety, and discusses which factors are likely to need additional research and development in order to allow adequate modelling in the human reliability analysis (HRA) portions of FRAs. The determination of which factors need to be modelled and the improvement of HRA related approaches for modelling such factors are critical aspects of the NRC's plan to improve FRA methods, tools, and data and to update a number of existing FRAs. (authors)

  2. ECONOMIC AND ENERGETICAL ANALYSIS OF IMPROVED WASTE UTILIZATION PLASMA TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Serghei VAMBOL

    2015-07-01

    Full Text Available Purpose. Energy and economic evaluation of the improved plasma waste utilization process, and substantiation of the expediency of the improved plasma technology by comparing its energy consumption with that of other thermal utilization methods. Methodology. Analysis of existing and advanced waste management methods and their impact on environmental safety; consideration of the energy and monetary costs of implementing two different waste management technologies. Results. The studies showed that the product gas of ordinary gasification contains a significant amount of nitrogen and therefore has a lower heating value than that of plasma gasification. From the point of view of minimizing energy and monetary costs and ensuring environmental safety, the proposed improved plasma technology is the more promising. To assess the energy appropriateness of the considered technologies, a comparative calculation was carried out at standard conditions. Waste processing yields useful products, such as liquefied methane, synthetic gas (94% methane) and a fuel gas for heating, which are suitable for sale and ensure the cost-effectiveness of this technology. Originality. The ecological and economic efficiency of the proposed improved plasma waste utilization technology is shown and evaluated in comparison with other thermal techniques. Practical value. The energy and monetary costs of implementing two different waste management technologies, namely ordinary gasification and gasification using plasma generators, are considered and substantiated. The proposed plasma waste utilization technology yields useful, saleable products such as liquefied methane, synthetic gas and a fuel gas for heating. A plant implementing the improved plasma process can compensate for daily and seasonal fluctuations in electricity and heat consumption by storing the obtained fuel products.

  3. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    Full Text Available In credit card scoring and loan management, the prediction of the applicant’s future behavior is an important decision support tool and a key factor in reducing the risk of loan default. A lot of data mining and classification approaches have been developed for the credit scoring purpose. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scorecard modeling through employing textual data analysis. This study uses a sample of loan application forms of a financial institution providing loan services in Yemen, which represents a real-world situation of credit scoring and loan management. The sample contains a set of Arabic textual data attributes defining the applicants. The credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the textual attributes analysis achieves higher classification effectiveness and outperforms the other traditional numerical data analysis techniques.
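
As a hedged illustration of the idea (invented toy data and plain NumPy instead of a full text-mining pipeline), the sketch below concatenates a numeric attribute with a bag-of-words encoding of the free-text field and fits a logistic regression on the combined features:

```python
import numpy as np

# Toy loan applications: (normalized income, free-text note, 1 = defaulted).
# Purely invented examples to show the mechanics -- not the Yemeni data set
# used in the paper.
apps = [
    (0.9, "stable salary government employee", 0),
    (0.8, "long term contract stable income", 0),
    (0.3, "irregular income seasonal work", 1),
    (0.2, "no guarantor irregular income", 1),
]
y = np.array([label for _, _, label in apps], float)
vocab = sorted({w for _, text, _ in apps for w in text.split()})

def featurize(income, text):
    """Numeric attribute plus a bag-of-words encoding of the text field."""
    bow = [text.split().count(w) for w in vocab]
    return np.array([1.0, income] + bow)          # leading 1.0 = bias term

X = np.array([featurize(inc, txt) for inc, txt, _ in apps])

# Plain gradient-descent logistic regression, standing in for the paper's
# logistic regression stage.
w = np.zeros(X.shape[1])
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / len(y)

def default_probability(income, text):
    return float(1.0 / (1.0 + np.exp(-featurize(income, text) @ w)))
```

After training, notes containing default-associated vocabulary push the score up even when the numeric attribute alone is ambiguous, which is the effect the paper measures on real application forms.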

  4. TENDENCY OF IMPROVEMENT ANALYSIS OF VENTURE ACTIVITY FOR MANAGEMENT DECISIONS

    Directory of Open Access Journals (Sweden)

    G.Yu. Iakovetс

    2015-03-01

    Full Text Available The article considers the current trends and prospects of venture financing of new innovative enterprises, one of the most effective alternative financing sources for an entity, albeit one with a high degree of risk. Venture financing differs from other sources of business financing: income from venture capital investments can greatly exceed the invested amounts, but the associated risks are equally significant, which makes it necessary to build an effective system of venture capital investment management. The study also revealed problems of risk analysis and minimization in the venture financing of innovative enterprises. Defining the characteristics of analysis and risk assessment of venture financing helps to find ways of systematizing, minimizing, avoiding and preventing risks. The study also identified the major areas for improving the analysis of venture activity for management decision-making.

  5. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses, a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.
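
For a flavor of the convective analysis fields involved, the snippet below computes a first-order estimate of the LCL height from the surface dewpoint depression, the classical rule of thumb of roughly 125 m per degree Celsius. The INCA system derives its fields from full three-dimensional analyses, so this is only a sketch of the concept, not the system's algorithm.

```python
def lcl_height_m(temp_c, dewpoint_c):
    """Espy-type first-order estimate of the lifted condensation level
    height above ground: roughly 125 m per degree Celsius of dewpoint
    depression. Valid only as a rule of thumb for near-surface parcels."""
    return 125.0 * (temp_c - dewpoint_c)

# A surface parcel at 25 degC with a 15 degC dewpoint condenses near 1250 m.
```

Fields like CAPE and CIN require integrating parcel buoyancy over the full sounding, which is why a 3D analysis on a 100-200 m vertical grid is needed in practice.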

  6. Response surface analysis to improve dispersed crude oil biodegradation

    Energy Technology Data Exchange (ETDEWEB)

    Zahed, Mohammad A.; Aziz, Hamidi A.; Mohajeri, Leila [School of Civil Engineering, Universiti Sains Malaysia, Nibong Tebal, Penang (Malaysia); Isa, Mohamed H. [Civil Engineering Department, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia)

    2012-03-15

    In this research, the bioremediation of dispersed crude oil, based on the amount of nitrogen and phosphorus supplementation in a closed system, was optimized by the application of response surface methodology and central composite design. Correlation analysis of the mathematical regression model demonstrated that a quadratic polynomial model could be used to optimize the hydrocarbon bioremediation (R{sup 2} = 0.9256). Statistical significance was checked by analysis of variance and residual analysis. Natural attenuation removed 22.1% of the crude oil in 28 days. The highest removal under un-optimized conditions, 68.1%, was observed using 20.00 mg/L nitrogen and 2.00 mg/L phosphorus in 28 days, while the optimized process achieved 69.5% crude oil removal with 16.05 mg/L nitrogen and 1.34 mg/L phosphorus in 27 days; optimization can therefore improve biodegradation in a shorter time with less nutrient consumption. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
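
The response-surface step can be sketched as follows: fit a quadratic polynomial to removal measurements over a nitrogen/phosphorus grid and solve for the stationary point of the fitted surface. The data below are synthetic, generated around an optimum placed near the abstract's reported values; the actual study fitted its central-composite-design measurements, not simulated ones.

```python
import numpy as np

# Synthetic removal-efficiency surface with its optimum placed near
# N = 16 mg/L, P = 1.3 mg/L (invented numbers echoing the abstract).
rng = np.random.default_rng(0)
N, P = np.meshgrid(np.linspace(5, 30, 8), np.linspace(0.5, 3.0, 8))
N, P = N.ravel(), P.ravel()
z = 70 - 0.05 * (N - 16) ** 2 - 8.0 * (P - 1.3) ** 2 + rng.normal(0, 0.1, N.size)

# Quadratic polynomial model: z ~ b0 + b1*N + b2*P + b3*N^2 + b4*P^2 + b5*N*P
A = np.column_stack([np.ones_like(N), N, P, N ** 2, P ** 2, N * P])
b, *_ = np.linalg.lstsq(A, z, rcond=None)

# Stationary point of the fitted surface: solve the 2x2 linear system
# obtained from grad z = 0.
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
n_opt, p_opt = np.linalg.solve(H, -b[1:3])
```

In a real RSM study the stationary point is then checked (via the Hessian sign and ANOVA) to confirm it is a maximum inside the design region before being adopted as the optimized condition.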

  7. Benchmarking Of Improved DPAC Transient Deflagration Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Laurinat, James E.; Hensel, Steve J.

    2013-03-21

    The transient deflagration code DPAC (Deflagration Pressure Analysis Code) has been upgraded for use in modeling hydrogen deflagration transients. The upgraded code is benchmarked using data from vented hydrogen deflagration tests conducted at the HYDRO-SC Test Facility at the University of Pisa. DPAC originally was written to calculate peak deflagration pressures for deflagrations in radioactive waste storage tanks and process facilities at the Savannah River Site. Upgrades include the addition of a laminar flame speed correlation for hydrogen deflagrations and a mechanistic model for turbulent flame propagation, incorporation of inertial effects during venting, and inclusion of the effect of water vapor condensation on vessel walls. In addition, DPAC has been coupled with CEA, a NASA combustion chemistry code. The deflagration tests are modeled as end-to-end deflagrations. The improved DPAC code successfully predicts both the peak pressures during the deflagration tests and the times at which the pressure peaks.

  8. Improved Analysis for Graphic TSP Approximation via Matchings

    CERN Document Server

    Mucha, Marcin

    2011-01-01

    The Travelling Salesman Problem is one of the most fundamental and most studied problems in approximation algorithms. For more than 30 years, the best algorithm known for general metrics has been Christofides's algorithm with an approximation factor of 3/2, even though the so-called Held-Karp LP relaxation of the problem is conjectured to have an integrality gap of only 4/3. Very recently, significant progress has been made for the important special case of graphic metrics, first by Oveis Gharan et al., and then by Momke and Svensson. In this paper, we provide an improved analysis for the approach introduced by Momke and Svensson yielding a bound of 35/24 on the approximation factor, as well as a bound of 19/12+epsilon for any epsilon>0 for the more general Travelling Salesman Path Problem in graphic metrics.
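
Christofides's algorithm and the Momke-Svensson analysis are beyond a short sketch, but the flavor of tree-based TSP approximation can be shown with the classical double-tree 2-approximation: build a minimum spanning tree and shortcut a preorder walk of it. Christofides improves the factor to 3/2 by adding a minimum-weight matching on the odd-degree tree vertices instead of doubling every edge.

```python
def mst_prim(dist):
    """Prim's algorithm on a complete graph given a symmetric distance
    matrix; returns the minimum spanning tree as adjacency lists."""
    n = len(dist)
    adj = {i: [] for i in range(n)}
    best = {j: (dist[0][j], 0) for j in range(1, n)}
    while best:
        j = min(best, key=lambda k: best[k][0])
        _, parent = best.pop(j)
        adj[parent].append(j)
        adj[j].append(parent)
        for k in best:
            if dist[j][k] < best[k][0]:
                best[k] = (dist[j][k], j)
    return adj

def tsp_double_tree(dist):
    """Classical double-tree 2-approximation for metric TSP: shortcut a
    preorder walk of the MST (triangle inequality makes shortcuts safe)."""
    adj = mst_prim(dist)
    tour, seen, stack = [], set(), [0]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            tour.append(v)
            stack.extend(reversed(adj[v]))
    return tour

def tour_length(dist, tour):
    return sum(dist[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))
```

Since the MST weight lower-bounds the optimal tour and the shortcut walk costs at most twice the MST, the returned tour is within a factor 2 of optimal on any metric.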

  9. Improving knowledge management systems with latent semantic analysis

    International Nuclear Information System (INIS)

    Latent Semantic Analysis (LSA) offers a technique for improving lessons learned and knowledge management systems. These systems are expected to become more widely used in the nuclear industry, as experienced personnel leave and are replaced by younger, less-experienced workers. LSA is a machine learning technology that allows searching of text based on meaning rather than predefined keywords or categories. Users can enter and retrieve data using their own words, rather than relying on constrained language lists or navigating an artificially structured database. LSA-based tools can greatly enhance the usability and usefulness of knowledge management systems and thus provide a valuable tool to assist nuclear industry personnel in gathering and transferring worker expertise. (authors)
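
A minimal sketch of the LSA mechanics (raw term counts and a plain truncated SVD; a production system would use TF-IDF weighting and a much larger corpus): documents that share meaning-bearing vocabulary end up close together in the latent space, which is what lets users retrieve lessons learned without knowing the exact keywords. The corpus below is invented for illustration.

```python
import numpy as np

# Tiny corpus of maintenance / lessons-learned texts (invented examples).
docs = [
    "pump seal leak detected during inspection",
    "coolant pump bearing failure report",
    "operator training lessons learned database",
    "operator knowledge transfer and training records",
]
vocab = sorted({w for d in docs for w in d.split()})

# Term-document matrix of raw counts.
A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# Truncated SVD: keep k latent dimensions, so each document becomes a
# short dense vector and similarity is judged by co-occurrence structure
# rather than exact keyword overlap.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T          # one row per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

With k = 2 on this toy corpus, the two pump-related reports land close together in the latent space while staying essentially orthogonal to the training/knowledge-transfer documents.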

  10. An improved method for the analysis of alpha spectra

    International Nuclear Information System (INIS)

    In this work we describe a methodology, developed over the last years, for the analysis of alpha-emitter spectra obtained with ion-implanted detectors, which tends to solve some of the problems shown by this type of spectrum. It is an improved methodology with respect to that described in a previous publication. The method is based on the application of a mathematical function that models the tail of an alpha peak, to evaluate the part of the peak that is not seen in cases of partial superposition with another peak. A calculation program that works semiautomatically, with the possibility of interactive intervention by the analyst, has been developed in parallel and is described in detail. (author)
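
One common analytic choice for such a tail function is a Gaussian convolved with a one-sided exponential toward low energies. The sketch below illustrates that modelling idea, not the specific function developed by the authors; with it, the area hidden under a neighboring peak can be evaluated by integrating the fitted shape.

```python
import math

def alpha_peak(e, area, mu, sigma, tau):
    """Gaussian of width sigma convolved with a one-sided exponential tail
    (decay constant tau) toward low energies -- a frequently used line
    shape for alpha peaks from silicon detectors. Illustrative only: the
    cited method uses its own tail function."""
    arg = (e - mu) / sigma + sigma / tau
    return (area / (2.0 * tau)) \
        * math.exp((e - mu) / tau + sigma ** 2 / (2.0 * tau ** 2)) \
        * math.erfc(arg / math.sqrt(2.0))
```

The function integrates to `area`, is strongly asymmetric (higher on the low-energy side, as alpha peaks are), and its parameters (mu, sigma, tau) can be fitted to the visible part of a peak to reconstruct the part buried under a neighbor.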

  11. Improved iterative error analysis for endmember extraction from hyperspectral imagery

    Science.gov (United States)

    Sun, Lixin; Zhang, Ying; Guindon, Bert

    2008-08-01

    Automated image endmember extraction from hyperspectral imagery is a challenge and a critical step in spectral mixture analysis (SMA). Over the past years, great efforts have been made and a large number of algorithms have been proposed to address this issue. Iterative error analysis (IEA) is one of the well-known existing endmember extraction methods. IEA identifies pixel spectra as a number of image endmembers by an iterative process. In each iteration, a fully constrained (abundance nonnegativity and abundance sum-to-one constraints) spectral unmixing based on previously identified endmembers is performed to model all image pixels. The pixel spectrum with the largest residual error is then selected as a new image endmember. This paper proposes an updated version of IEA by making improvements on three aspects of the method. First, fully constrained spectral unmixing is replaced by a weakly constrained (abundance nonnegativity and abundance sum-less-or-equal-to-one constraints) alternative. This is necessary because only a subset of the endmembers present in a hyperspectral image has been extracted by any intermediate iteration, so the abundance sum-to-one constraint is invalid at that stage. Second, the search strategy for achieving an optimal set of image endmembers is changed from sequential forward selection (SFS) to sequential forward floating selection (SFFS) to reduce the so-called "nesting effect" in the resultant set of endmembers. Third, a pixel spectrum is identified as a new image endmember depending on both its spectral extremity in the feature hyperspace of a dataset and its capacity to characterize other mixed pixels. This is achieved by evaluating a set of extracted endmembers using a criterion function, which consists of the mean and standard deviation of the residual error image. Preliminary comparisons between the image endmembers extracted using the improved and original IEA are conducted based on an airborne visible infrared imaging
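
The core IEA loop can be sketched in a few lines. For simplicity this version uses unconstrained least-squares unmixing with abundances clipped at zero, a stand-in for the weakly constrained unmixing the paper argues for; the endmember-promotion logic (the worst-modelled pixel wins) is the same.

```python
import numpy as np

def iea(pixels, n_end):
    """Minimal sketch of Iterative Error Analysis: repeatedly unmix every
    pixel against the endmembers found so far and promote the pixel with
    the largest residual error to a new endmember."""
    # Seed with the pixel farthest from the mean spectrum.
    mean = pixels.mean(axis=0)
    first = int(np.argmax(np.linalg.norm(pixels - mean, axis=1)))
    ends = [pixels[first]]
    for _ in range(n_end - 1):
        E = np.array(ends).T                       # bands x endmembers
        ab, *_ = np.linalg.lstsq(E, pixels.T, rcond=None)
        ab = np.clip(ab, 0.0, None)                # abundance nonnegativity
        resid = np.linalg.norm(pixels.T - E @ ab, axis=0)
        ends.append(pixels[int(np.argmax(resid))])
    return np.array(ends)

# Three pure spectra plus two mixtures; the loop should promote the pure
# pixels, since mixtures are well modelled once their components are found.
pixels = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [0.5, 0.5, 0.0],
                   [0.2, 0.3, 0.5]])
endmembers = iea(pixels, 3)
```

The paper's SFFS improvement would additionally allow a previously chosen endmember to be dropped if a later candidate models the scene better, which this greedy sketch cannot do.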

  12. Voxel model in BNCT treatment planning: performance analysis and improvements

    Science.gov (United States)

    González, Sara J.; Carando, Daniel G.; Santa Cruz, Gustavo A.; Zamenhof, Robert G.

    2005-02-01

    In recent years, many efforts have been made to study the performance of treatment planning systems in deriving an accurate dosimetry of the complex radiation fields involved in boron neutron capture therapy (BNCT). The computational model of the patient's anatomy is one of the main factors involved in this subject. This work presents a detailed analysis of the performance of the 1 cm based voxel reconstruction approach. First, a new and improved material assignment algorithm implemented in NCTPlan treatment planning system for BNCT is described. Based on previous works, the performances of the 1 cm based voxel methods used in the MacNCTPlan and NCTPlan treatment planning systems are compared by standard simulation tests. In addition, the NCTPlan voxel model is benchmarked against in-phantom physical dosimetry of the RA-6 reactor of Argentina. This investigation shows the 1 cm resolution to be accurate enough for all reported tests, even in the extreme cases such as a parallelepiped phantom irradiated through one of its sharp edges. This accuracy can be degraded at very shallow depths in which, to improve the estimates, the anatomy images need to be positioned in a suitable way. Rules for this positioning are presented. The skin is considered one of the organs at risk in all BNCT treatments and, in the particular case of cutaneous melanoma of extremities, limits the delivered dose to the patient. Therefore, the performance of the voxel technique is deeply analysed in these shallow regions. A theoretical analysis is carried out to assess the distortion caused by homogenization and material percentage rounding processes. Then, a new strategy for the treatment of surface voxels is proposed and tested using two different irradiation problems. For a parallelepiped phantom perpendicularly irradiated with a 5 keV neutron source, the large thermal neutron fluence deviation present at shallow depths (from 54% at 0 mm depth to 5% at 4 mm depth) is reduced to 2% on average

  14. Life Cycle Exergy Analysis of Wind Energy Systems : Assessing and improving life cycle analysis methodology

    OpenAIRE

    Davidsson, Simon

    2011-01-01

    Wind power capacity is currently growing fast around the world. At the same time, different forms of life cycle analysis are becoming common for measuring the environmental impact of wind energy systems. This thesis identifies several problems with current methods for assessing the environmental impact of wind energy and suggests improvements that will make these assessments more robust. The use of the exergy concept combined with life cycle analysis has been proposed by several researchers ov...

  15. Improvements in biamperometric method for remote analysis of uranium

    International Nuclear Information System (INIS)

    One of the titrimetric methods most suitable for remote operations with Master Slave Manipulators inside hot cells is the biamperometric method. The biamperometric method for the analysis of uranium reported in the literature is found to give rise to a significant bias, especially with small aliquots of uranium, and the waste volume is also considerable, which is not desirable from the point of view of radioactive waste disposal. In the present method, the bias as well as the waste volume are reduced. Also, addition of vanadyl sulphate is found necessary to provide a sharp end point in the titration curve. The role of vanadyl sulphate in improving the titration method has been investigated by spectrophotometry and electrometry. A new mechanism for the role of vanadyl sulphate, which is in conformity with the observations made in coulometric titration of uranium, is proposed. Interference from deliberate additions of high concentrations of stable species of fission product elements is found to be negligible. Hence this method is considered highly suitable for remote analysis of uranium in intensely radioactive reprocessing solutions for control purposes, provided radioactivity does not pose new problems. (auth.)

  16. Receiver operating characteristic analysis improves diagnosis by radionuclide ventriculography

    International Nuclear Information System (INIS)

    Receiver operating characteristic analysis (ROC) evaluates continuous variables to define diagnostic criteria for the optimal sensitivity (SENS) and specificity (SPEC) of a test. The authors studied exercise-induced chest pain (CP), ST-changes on electrocardiography (ECG) and rest-exercise gated radionuclide ventriculography (RVG) using ROC to clarify the optimal criteria for detecting myocardial ischemia due to coronary atherosclerosis (CAD). The data of 95 consecutive patients studied with coronary angiography, rest-exercise RVG and ECG were reviewed. 77 patients had ''significant'' CAD (≥50% lesions). Exercise-induced CP, ECG abnormalities (ST-T shifts) and RVG abnormalities (change in ejection fraction, 2-view regional wall motion change and relative end-systolic volume) were evaluated to define the optimal SENS/SPEC of each and of the combined data. ROC curves were constructed by multiple logistic regression (MLR). By MLR, RVG alone was superior to ECG and CP. The combination of all three produced the best ROC curve for the entire group and for clinical subsets based on the number of diseased vessels and the presence or absence of prior myocardial infarction. When CP, ECG and RVG were combined, the optimal SENS/SPEC for detection of single vessel disease was 88/86. The SENS/SPEC for 3 vessel disease was 93/95. Thus, the application of RVG for the diagnosis of myocardial ischemia is improved with the inclusion of ECG and CP data by the use of a multiple logistic regression model. ROC analysis allows clinical application of multiple data for diagnosing CAD at the desired SENS/SPEC rather than by arbitrary single-standard criteria
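
The ROC mechanics described here, sweeping a decision threshold over a continuous test variable and picking the operating point with the best sensitivity/specificity trade-off, can be sketched as below. Youden's J is used as one standard optimality criterion, and the scores are invented toy values rather than the study's RVG data.

```python
import numpy as np

def roc_curve(scores, truth):
    """(FPR, TPR) operating points obtained by sweeping the decision
    threshold down through the observed scores (score >= threshold is
    classified as diseased)."""
    order = np.argsort(-scores)
    t = truth[order]
    tpr = np.cumsum(t) / t.sum()
    fpr = np.cumsum(1 - t) / (1 - t).sum()
    return fpr, tpr, scores[order]

def best_operating_point(scores, truth):
    """Threshold maximizing Youden's J = SENS + SPEC - 1, one standard
    way of reading an 'optimal' criterion off an ROC curve."""
    fpr, tpr, thr = roc_curve(scores, truth)
    i = int(np.argmax(tpr - fpr))
    return float(thr[i]), float(tpr[i]), float(1.0 - fpr[i])

# Toy continuous test variable (e.g. a combined RVG/ECG/CP score).
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1])
truth = np.array([1, 1, 1, 0, 1, 0, 0, 0])
threshold, sens, spec = best_operating_point(scores, truth)
```

In a clinical setting one would instead pick the point matching the desired SENS/SPEC balance for the population, which is exactly the flexibility the abstract credits to ROC analysis over single-standard criteria.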

  17. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  18. Improvement of powertrain efficiency through energy breakdown analysis

    International Nuclear Information System (INIS)

    Highlights: • Energy breakdown analysis for the vehicular powertrain. • Model for road vehicle simulation in different missions. • Implemented powertrain management strategies: intelligent gearbox, Stop and Start, free wheel. • Innovative hybrid powertrain aimed at minimizing engine thermodynamic cycles. • Evaluation of the fuel savings associated with each management strategy. - Abstract: A vehicular powertrain can be thought of as an energy conversion chain, each component being characterized by its efficiency. Significant global efficiency improvements can be achieved once the system energy breakdown is identified, individuating the losses connected to each powertrain component; it is then possible to carry out the most appropriate interventions. This paper presents a simulation study of a diesel-fuelled commercial vehicle powertrain based on the above summarized point of view. The work aims at individuating the energy flows involved in the system during different missions, proposing an intelligent combination of technical solutions which minimize fuel consumption. Through a validated Matlab–Simulink model, able to indicate the powertrain energy breakdown, simulations are carried out to evaluate the fuel saving associated with a series of powertrain management logics which lead to the minimization of engine losses, the recovery of reverse power in deceleration and braking, and the elimination of useless engine cycles. Tests were performed for different real missions (urban, extra-urban and highway). The results obtained show a 23% reduction in fuel consumption (average value for urban, extra-urban and highway missions) compared to the traditional powertrain. Clearly, such a result also has a positive effect on CO2 emissions
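
The energy-breakdown idea can be illustrated with a back-of-envelope longitudinal model: integrate aerodynamic, rolling and inertial demand over a speed trace. All parameter values below are invented placeholders, not those of the paper's commercial vehicle, and braking energy is simply discarded (no recuperation).

```python
# Illustrative vehicle parameters (placeholders, not the paper's vehicle).
RHO, CD, AREA = 1.2, 0.6, 7.0        # air density, drag coeff., frontal area
MASS, CRR, G = 7500.0, 0.008, 9.81   # mass [kg], rolling resistance, gravity

def cycle_energy_kwh(speeds_ms, dt=1.0):
    """Integrate aerodynamic, rolling and inertial demand over a speed
    trace sampled every dt seconds. Deceleration phases count as zero
    inertial demand (kinetic energy is lost to the brakes)."""
    e_aero = e_roll = e_inert = 0.0
    for v0, v1 in zip(speeds_ms, speeds_ms[1:]):
        v = 0.5 * (v0 + v1)                        # mean speed in the step
        d = v * dt                                 # distance covered
        e_aero += 0.5 * RHO * CD * AREA * v ** 2 * d
        e_roll += MASS * G * CRR * d
        e_inert += max(0.0, 0.5 * MASS * (v1 ** 2 - v0 ** 2))
    to_kwh = 1.0 / 3.6e6
    return e_aero * to_kwh, e_roll * to_kwh, e_inert * to_kwh

# A crude urban-like trace: accelerate to 14 m/s, cruise, brake to rest.
trace = [0, 3, 6, 9, 12, 14, 14, 14, 14, 10, 5, 0]
```

Splitting the demand this way shows directly which strategies attack which term: free-wheeling and Stop and Start reduce engine losses during the zero-demand phases, while recovering the inertial term is exactly what the paper's reverse-power recuperation targets.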

  19. Improved analysis of solar signals for differential reflectivity monitoring

    Science.gov (United States)

    Huuskonen, Asko; Kurri, Mikko; Holleman, Iwan

    2016-07-01

    The method for the daily monitoring of the differential reflectivity bias for polarimetric weather radars is developed further. Improved quality control is applied to the solar signals detected during the operational scanning of the radar, which efficiently removes rain and clutter-contaminated gates occurring in the solar hits. The simultaneous reflectivity data are used as a proxy to determine which data points are to be removed. A number of analysis methods to determine the differential reflectivity bias are compared, and methods based on surface fitting are found superior to simple averaging. A separate fit to the reflectivity of the horizontal and vertical polarization channels is recommended because of stability. Separate fitting also provides, in addition to the differential reflectivity bias, the pointing difference of the polarization channels. Data from the Finnish weather radar network show that the pointing difference is less than 0.02° and that the differential reflectivity bias is stable and determined to better than 0.04 dB. The results are compared to those from measurements at vertical incidence, which allows us to determine the total differential reflectivity bias including the differential receiver bias and the transmitter bias.

  20. Process Correlation Analysis Model for Process Improvement Identification

    OpenAIRE

    Su-jin Choi; Dae-Kyoo Kim; Sooyong Park

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practice...

  1. Analysis of the Improvement Methods for Equipment Maintenance Support

    Institute of Scientific and Technical Information of China (English)

    ZHANG Rui-chang; ZHAO Song-zheng

    2005-01-01

    According to military requirements, and based on the problems of equipment maintenance support methods in high-tech battles, each element supporting equipment maintenance is analyzed, and methods for improving equipment maintenance are proposed.

  2. Gap Analysis Approach for Construction Safety Program Improvement

    OpenAIRE

    Thanet Aksorn; B.H.W. Hadikusumo

    2007-01-01

    To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual...

  3. Improve the Method for Requirements Analysis on Commercial Information System

    OpenAIRE

    Peng, Chen

    2005-01-01

    This thesis states the tasks of the analyst: communicating with the commercial customer to establish their requirements, and reframing those requirements by negotiation so that programmers can understand them and write the code efficiently. Soft Systems Methodology (SSM) is an effective approach to identifying the problem situation. In my thesis, I will improve a new business-oriented method called Process Improvement for Strategic Objectives (PISO) with SSM to make PISO have more ef...

  4. Aberration analysis and efficiency improvement of a bidirectional optical subassembly

    Science.gov (United States)

    Wu, Hao; Huang, Zhangdi; Yu, Ziyan; Qian, Xiaoshi; Xu, Fei; Chen, Beckham; Lu, Yanqing

    2009-10-01

    An approach to improve the coupling efficiency of bidirectional optical subassembly (BOSA) modules is proposed and experimentally demonstrated. We analyzed the wavefront aberration coefficients of a typical BOSA. It was found that the 45-deg wavelength filter induces coma and astigmatism, and then it further deteriorates the laser diode to fiber coupling. We measured the BOSA efficiencies based on a series of different filters. For a typical 0.5-mm filter, 25% coupling efficiency improvement was achieved by optimizing the filter parameters.

  5. New Framework for Improving Big Data Analysis Using Mobile Agent

    OpenAIRE

    Youssef M. ESSA; Gamal ATTIYA; El-Sayed, Ayman

    2014-01-01

    The rising number of applications serving millions of users and dealing with terabytes of data requires faster processing paradigms. Recently, there is growing enthusiasm for the notion of big data analysis. Big data analysis has become a very important aspect of growth in productivity, reliability and quality of service (QoS). Processing big data on a single powerful machine is not an efficient solution, so companies have focused on using Hadoop software for big data analysis. This is because Hadoop de...

  6. Customer Satisfaction Analysis and Proposals for Its Improvement

    OpenAIRE

    Slobodníková, Eva

    2014-01-01

    The diploma thesis deals with the analysis of customer satisfaction with a product. The work defines the theoretical background and issues related to customer satisfaction, along with the basic methods and processes for analysing the marketing environment and conducting a questionnaire survey. Theoretical solutions are applied to a specific product, running shoes of the company adidas. The questionnaire survey, together with internal and external analysis, suggests the factors with which customers are or are ...

  7. Using external data sources to improve audit trail analysis.

    OpenAIRE

    Herting, R. L.; Asaro, P. V.; Roth, A. C.; Barnes, M. R.

    1999-01-01

    Audit trail analysis is the primary means of detection of inappropriate use of the medical record. While audit logs contain large amounts of information, the information required to determine useful user-patient relationships is often not present. Adequate information isn't present because most audit trail analysis systems rely on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system wh...

  8. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are among the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes in the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  9. Severe accident analysis code SAMPSON improvement for IMPACT project

    International Nuclear Information System (INIS)

    SAMPSON is an integral code for detailed severe accident analysis with a modular structure, developed in the IMPACT project. Each module can run independently, and communication among multiple analysis modules, supervised by the analysis control module, makes an integral analysis possible. At the end of Phase 1 (1994-1997), demonstration simulations combining up to 11 analysis modules had been performed, and the physical models in the code had been verified by separate-effect tests and validated by integral tests. Multi-dimensional mechanistic models and theoretically based conservation equations were applied during Phase 2 (1998-2000). New models for accident management evaluation have also been developed. Verification and validation have been performed by analysing separate-effect tests and integral tests, while actual plant analyses are also in progress. (author)

  10. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    Directory of Open Access Journals (Sweden)

    ZHANG Shutuan

    2013-10-01

    Full Text Available In order to realize accurate testing of the distortion parameters of an aircraft power supply system, and to satisfy the requirements of the corresponding equipment in the aircraft, a novel power-parameter test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system is portable and supports high-speed data acquisition and processing; the software uses LabWindows/CVI as the development environment and adopts a pre-processing technique together with the added filtering algorithm. Compared with the traditional filtering algorithm, the improved filtering algorithm increases test accuracy. Application shows that the test system with the improved filtering algorithm achieves accurate test results and meets the design requirements.
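    The abstract does not give the improved filtering algorithm itself, so as a minimal, hedged illustration of a typical distortion parameter of a 400 Hz aircraft supply, the sketch below estimates total harmonic distortion (THD) from an FFT over an integer number of periods; the sample rate and harmonic level are made-up values.

```python
import numpy as np

# Illustrative sketch only (not the paper's algorithm): estimate THD of a
# 400 Hz aircraft mains signal from its spectrum. Sampling an integer number
# of periods puts the fundamental and harmonics exactly on FFT bins.
fs = 51200.0                  # sample rate (assumption)
f0 = 400.0                    # aircraft mains fundamental frequency
n = int(fs / f0) * 8          # eight full periods of the fundamental
t = np.arange(n) / fs
# Test signal: unit fundamental plus a 5% third harmonic (made-up distortion).
x = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)

spec = np.abs(np.fft.rfft(x)) / (n / 2)   # scale bins to sinusoid amplitudes
bin0 = int(round(f0 * n / fs))            # bin index of the fundamental
fund = spec[bin0]
harmonics = [spec[k * bin0] for k in range(2, 10)]
thd = np.sqrt(sum(h * h for h in harmonics)) / fund
print(round(thd, 4))  # recovers the injected 5% third-harmonic level
```

A real test system would precede this with the anti-aliasing and filtering stages the paper alludes to; the FFT step above only shows how the distortion figure itself is formed.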

  11. Analysis and Improvement of Low Rank Representation for Subspace segmentation

    CERN Document Server

    Siming, Wei

    2011-01-01

    We analyze and improve low rank representation (LRR), the state-of-the-art algorithm for subspace segmentation of data. We prove that for the noiseless case, the optimization model of LRR has a unique solution, which is the shape interaction matrix (SIM) of the data matrix. So in essence LRR is equivalent to factorization methods. We also prove that the minimum value of the optimization model of LRR is equal to the rank of the data matrix. For the noisy case, we show that LRR can be approximated as a factorization method that combines noise removal by column sparse robust PCA. We further propose an improved version of LRR, called Robust Shape Interaction (RSI), which uses the corrected data as the dictionary instead of the noisy data. RSI is more robust than LRR when the corruption in data is heavy. Experiments on both synthetic and real data testify to the improved robustness of RSI.
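    The noiseless result stated above, that the LRR minimizer is the shape interaction matrix VV^T built from the right singular vectors of the data, can be checked numerically. This is a sketch on synthetic data (the subspace dimensions and sizes are made up), not the authors' code.

```python
import numpy as np

# Noiseless LRR check: for data X drawn from a union of independent subspaces,
# the minimizer of  min ||Z||_*  s.t.  X = XZ  is the shape interaction matrix
# Z = V V^T, where X = U S V^T is the skinny SVD of X.
rng = np.random.default_rng(0)

# Synthetic data: 20 points from each of two independent 2-D subspaces in R^6.
B1, B2 = rng.standard_normal((6, 2)), rng.standard_normal((6, 2))
X = np.hstack([B1 @ rng.standard_normal((2, 20)),
               B2 @ rng.standard_normal((2, 20))])

U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-10))     # numerical rank of X (4 here: 2 + 2)
V = Vt[:r].T
Z = V @ V.T                    # shape interaction matrix = LRR solution

# Z reconstructs X exactly, and its nuclear norm equals rank(X),
# matching the paper's claim about the minimum value of the LRR model.
assert np.allclose(X @ Z, X)
assert np.isclose(np.linalg.svd(Z, compute_uv=False).sum(), r)
```

Because Z is an orthogonal projection onto the row space of X, its singular values are 0 or 1, which is why its nuclear norm equals the rank.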

  12. Does Competition Improve Public School Efficiency? A Spatial Analysis

    Science.gov (United States)

    Misra, Kaustav; Grimes, Paul W.; Rogers, Kevin E.

    2012-01-01

    Advocates for educational reform frequently call for policies to increase competition between schools because it is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. Researchers examining this issue are confronted with difficulties in defining reasonable measures…

  13. Breeding ratio analysis for the improved Flower-SCWFR core

    International Nuclear Information System (INIS)

    The Supercritical Water-cooled Fast Reactor (SCWFR) combines fast reactor and light water reactor characteristics in one design. The coolant mass flow rate is only 1/8 of that in a BWR, and the neutron spectrum is harder than in a PWR, so the reactor could have breeding capability. In this paper, using different models of the improved Flower-SCWFR core, the void reactivity effect, power distribution, and breeding ratio are analyzed by means of a core zoning scheme, axial coolant density zoning, seed and blanket assemblies with a suitable P/D value, MOX fuel with different designs and enrichment zoning, and a solid uranium matrix cooled by internally clad channels in the blanket assembly. Finally, an optimized model of the improved SCWFR core, the 'Flower type', is obtained. (authors)

  14. Analysis of Strategies to Improve Heliostat Tracking at Solar Two

    Energy Technology Data Exchange (ETDEWEB)

    Jones, S.A.; Stone, K.W.

    1999-01-14

    This paper investigates different strategies that can be used to improve the tracking accuracy of heliostats at Solar Two. The different strategies are analyzed using a geometrical error model to determine their performance over the course of a day. By using the performance of heliostats in representative locations of the field and on representative days of the year, an estimate of the annual performance of each strategy is presented.

  15. An analysis of potential improvements within Lithuanian sawlog supply

    OpenAIRE

    Puodžiūnas, Mindaugas

    2013-01-01

    The wood supply from the forest to the industry is often characterized by high variability caused by the divergent structure of forest products, seasonality and uneven geographical distribution of forest resources and forest products industries. Based on previous knowledge on wood supply this thesis aimed to identify existing wood supply patterns and strategies in the Lithuanian state forest sector, to evaluate supply chain performance for sawmills and to examine potential improvements for ro...

  16. Point-Based POMDP Algorithms: Improved Analysis and Implementation

    OpenAIRE

    Smith, Trey; Simmons, Reid

    2012-01-01

    Existing complexity bounds for point-based POMDP value iteration algorithms focus either on the curse of dimensionality or the curse of history. We derive a new bound that relies on both and uses the concept of discounted reachability; our conclusions may help guide future algorithm design. We also discuss recent improvements to our (point-based) heuristic search value iteration algorithm. Our new implementation calculates tighter initial bounds, avoids solving linear programs, and makes more...

  17. Improvements in longwall downtime analysis and fault identification

    Energy Technology Data Exchange (ETDEWEB)

    Daniel Bongers [CRCMining (Australia)

    2006-12-15

    In this project we have developed a computer program for recording detailed information relating to face equipment downtime in longwall mining operations. This software is intended to replace the current manual recording of delay information, which has been proven to be inaccurate. The software developed is intended to be operated from the maingate computer. Users are provided with a simple user interface requesting the nature of each delay in production, which is time-stamped in alignment with the SCADA system, removing the need for operators to estimate the start time and duration of each delay. Each instance of non-production is recorded to a database, which may be accessed by surface computers, removing the need for transcribing the deputy's report into the delay database. An additional suggestive element has been developed, based on sophisticated fault detection technology, which reduces the data input required by operators and provides a basis for the implementation of real-time fault detection. Both the basic recording software and the suggestive element offer improvements in efficiency and accuracy to longwall operations. More accurate data allow improved maintenance planning and improved measures of operational KPIs. The suggestive element offers the potential for rapid fault diagnosis, and potentially delay forecasting, which may be used to reduce lost time associated with machine downtime.

  18. Some improvements in air particulate matter analysis by INAA

    Science.gov (United States)

    Farinha, M. M.; Freitas, M. C.; Almeida, S. M.; Reis, M. A.

    2001-06-01

    At ITN, analysis of air particulate matter has been made since 1999, stimulated by a contract for air quality monitoring of an urban waste incinerator. Samples are analysed by Instrumental Neutron Activation Analysis (INAA) and Proton Induced X-ray Emission (PIXE). Heavy metals and other elements are determined. The procedures for filter analysis have recently been changed, leading to the present comparison between the old and the new procedures. For INAA, in this new procedure we look for the 336.2 keV gamma line of 115mIn in addition to the gamma-ray line of 527.9 keV used for the detection of 115Cd. Cd evaluations obtained by both gamma lines are compared and detection limits for Cd are presented. Preliminary results for Cd, As, Ni, and Hg are shown for a region in the north of Lisbon.

  19. Some improvements in air particulate matter analysis by INAA

    Energy Technology Data Exchange (ETDEWEB)

    Farinha, M.M. E-mail: mmanuelf@itn1.itn.pt; Freitas, M.C.; Almeida, S.M.; Reis, M.A

    2001-06-01

    At ITN, analysis of air particulate matter has been made since 1999, stimulated by a contract for air quality monitoring of an urban waste incinerator. Samples are analysed by Instrumental Neutron Activation Analysis (INAA) and Proton Induced X-ray Emission (PIXE). Heavy metals and other elements are determined. The procedures for filter analysis have recently been changed, leading to the present comparison between the old and the new procedures. For INAA, in this new procedure we look for the 336.2 keV gamma line of {sup 115m}In in addition to the gamma-ray line of 527.9 keV used for the detection of {sup 115}Cd. Cd evaluations obtained by both gamma lines are compared and detection limits for Cd are presented. Preliminary results for Cd, As, Ni, and Hg are shown for a region in the north of Lisbon.

  20. Estimating the Benefits of Water Quality Improvements Using Meta-Analysis and Benefits Transfer

    OpenAIRE

    Alvarez, Sergio; Asci, Serhat

    2014-01-01

    In this paper we conduct a meta-analysis of the non-market valuation literature dealing with water quality improvements in the United States. We use this meta-analysis to estimate benefit transfer functions, which allow us to estimate the benefits of water quality improvements in the state of Florida resulting from the adoption and implementation of agricultural BMPs.

  1. Improvement of DYANA. The dynamic analysis program for event transition

    International Nuclear Information System (INIS)

    In probabilistic safety assessment (PSA), the fault tree/event tree technique has been widely used to evaluate accident sequence frequencies. However, the event transitions that operators actually face cannot be treated dynamically by the conventional technique. Therefore, we have developed the dynamic analysis program DYANA for event transitions in a liquid metal cooled fast breeder reactor. In the previous development we built the basic analysis model, but calculation times were too long. In the current term, we parallelized DYANA using MPI and obtained good performance on a workstation cluster, close to the ideal. (author)

  2. Preintervention Analysis and Improvement of Customer Greeting in a Restaurant

    Science.gov (United States)

    Therrien, Kelly; Wilder, David A.; Rodriguez, Manuel; Wine, Byron

    2005-01-01

    We examined customer greeting by employees at one location of a sandwich restaurant chain. First, a preintervention analysis was conducted to determine the conditions under which greeting a customer within 3 s of his or her entry into the restaurant did and did not occur. Results suggested that an appropriate customer greeting was most likely to…

  3. Improving Family Forest Knowledge Transfer through Social Network Analysis

    Science.gov (United States)

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  4. Some Ideas to Improve Pyroclast Density and Vesicularity Data Analysis

    Science.gov (United States)

    Bernard, B.; Kueppers, U.; Ortiz, H. D.

    2014-12-01

    Pyroclast density and vesicularity are critical parameters in physical volcanology used to reconstruct eruptive dynamics and feed numerical models. Pyroclastic deposits typically present a wide range of density and vesicularity values, so measurements must be repeated tens of times. These data are generally treated using classical statistical analysis including averages and frequency histograms. One issue with this approach is that density and vesicularity are intensive properties and therefore cannot be added or averaged directly. We encourage the use of weighted density and vesicularity averages and histograms, which has, until now, been done in only a few studies. In order to ensure an adequate and efficient use of the weighting equations, we introduce an open-source R code to calculate the most common statistical parameters, such as range and weighted averages, and to produce abundance histograms. An important question when working with statistics is whether or not the sample size is large enough. To address this matter we also included a stability analysis based on a Monte Carlo approach, which makes it possible to quantify the reliability of the results. To illustrate this methodology we chose two large datasets from Chachimbiro (Ecuador) and Unzen (Japan) volcanoes. Our first results indicate that the use of weighted analysis instead of frequency analysis can change the density and vesicularity averages by up to 4% and the shape of the abundance histogram, leading to different interpretations. The stability analysis reveals that the number of measurements required for reliable results depends greatly on the distribution of density and vesicularity values. The number of measurements must therefore be fixed on an ipso facto basis, using a large sample size at the beginning and reducing it to achieve time efficiency.
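    The authors provide an R code; as a language-agnostic sketch of the same two ideas, the snippet below computes a volume-weighted density average (density is intensive, so weighting each clast by its volume yields the bulk density of the sample) and a Monte Carlo stability check on synthetic data. The weighting variable and data ranges are assumptions for illustration.

```python
import numpy as np

# Hedged sketch, not the authors' R code: weighted average of an intensive
# property and a Monte Carlo check of how the estimate stabilises with
# sample size. All data here are synthetic.
rng = np.random.default_rng(1)
density = rng.uniform(500, 2500, size=300)   # kg/m^3, one value per clast
volume = rng.uniform(1e-6, 5e-5, size=300)   # m^3, weighting variable (assumed)

plain_mean = density.mean()                      # naive, unweighted
weighted_mean = np.average(density, weights=volume)  # bulk density of sample

def subsample_spread(n, trials=1000):
    """Std. dev. of the weighted mean over random subsamples of size n."""
    means = [np.average(density[idx], weights=volume[idx])
             for idx in (rng.choice(len(density), n, replace=False)
                         for _ in range(trials))]
    return float(np.std(means))

# The spread shrinks as n grows; pick n where it is acceptably small.
for n in (20, 50, 150):
    print(n, round(subsample_spread(n), 1))
```

The same loop applied to real clast data would show how many measurements are needed before the weighted average stops moving, which is the stability question the abstract raises.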

  5. Taylor's swimming sheet: Analysis and improvement of the perturbation series

    CERN Document Server

    Sauzade, Martin; Lauga, Eric; 10.1016/j.physd.2011.06.023

    2013-01-01

    In G.I. Taylor's historic paper on swimming microorganisms, a two dimensional sheet was proposed as a model for flagellated cells passing traveling waves as a means of locomotion. Using a perturbation series, Taylor computed swimming speeds up to fourth order in amplitude. Here we systematize that expansion so that it can be carried out formally to arbitrarily high order. The resultant series diverges for an order one value of the wave amplitude, but may be transformed into series with much improved convergence properties and which yield results comparing favorably to those obtained numerically via a boundary integral method for moderate and large values of the wave amplitudes.
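    The abstract does not name the specific series transformation used; as an illustration of the general idea of transforming a slowly converging or divergent series into one with better convergence, the sketch below applies the classic Shanks transformation to the alternating series for ln 2. This is a stand-in example, not the paper's method.

```python
import math
from fractions import Fraction

# Shanks transformation: accelerates a sequence of partial sums A_n via
#   S(A_n) = (A_{n+1} A_{n-1} - A_n^2) / (A_{n+1} + A_{n-1} - 2 A_n).
# Exact rational arithmetic avoids round-off in the difference quotients.
def shanks(seq):
    return [(seq[i + 1] * seq[i - 1] - seq[i] ** 2)
            / (seq[i + 1] + seq[i - 1] - 2 * seq[i])
            for i in range(1, len(seq) - 1)]

# Partial sums of ln(2) = 1 - 1/2 + 1/3 - ... (slowly converging).
partial, s = [], Fraction(0)
for k in range(1, 12):
    s += Fraction((-1) ** (k + 1), k)
    partial.append(s)

once = shanks(partial)
twice = shanks(once)          # iterating the transform accelerates further
raw_err = abs(float(partial[-1]) - math.log(2))
acc_err = abs(float(twice[-1]) - math.log(2))
print(raw_err, acc_err)      # the transformed tail is far more accurate
```

Padé approximants play a similar role for power series in a small parameter, which is the setting of the swimming-sheet expansion.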

  6. Analysis and improvement of vehicle information sharing networks

    Science.gov (United States)

    Gong, Hang; He, Kun; Qu, Yingchun; Wang, Pu

    2016-06-01

    Based on large-scale mobile phone data, mobility demand was estimated and locations of vehicles were inferred in the Boston area. Using the spatial distribution of vehicles, we analyze the vehicle information sharing network generated by the vehicle-to-vehicle (V2V) communications. Although a giant vehicle cluster is observed, the coverage and the efficiency of the information sharing network remain limited. Consequently, we propose a method to extend the information sharing network's coverage by adding long-range connections between targeted vehicle clusters. Furthermore, we employ the optimal design strategy discovered in square lattice to improve the efficiency of the vehicle information sharing network.

  7. Non-destructive infrared spectroscopic analysis of IMPROVE aerosol samples

    Science.gov (United States)

    Ruthenburg, T. C.; Dillner, A. M.

    2011-12-01

    The use of mid-infrared (MIR) spectroscopy is of increasing interest for determining the organic functional group composition of aerosols. The organic fraction of aerosols is thought to affect visibility, climate and toxicity. Organic functional group composition can provide insights into aerosol sources and aging. The Interagency Monitoring of Protected Visual Environments (IMPROVE) program, established in 1985, operates a long-term particulate matter monitoring network primarily in National Parks and Wilderness Areas. IMPROVE samples collected on polytetrafluoroethylene (PTFE) filters are analyzed via IR spectroscopy to determine organic functional group composition. Organic carbon (OC) mass determined by MIR spectroscopy is compared to OC derived from a thermal-optical method.

  8. Improved Conjunction Analysis via Collaborative Space Situational Awareness

    Science.gov (United States)

    Kelso, T.; Vallado, D.; Chan, J.; Buckwalter, B.

    With recent events such as the Chinese ASAT test in 2007 and the USA 193 intercept in 2008, many satellite operators are becoming increasingly aware of the potential threat to their satellites as the result of orbital debris or even other satellites. However, to be successful at conjunction monitoring and collision avoidance requires accurate orbital information for as many space objects (payloads, dead satellites, rocket bodies, and debris) as possible. Given the current capabilities of the US Space Surveillance Network (SSN), approximately 18,500 objects are now being tracked and orbital data (in the form of two-line element sets) is available to satellite operators for 11,750 of them (as of 2008 September 1). The capability to automatically process this orbital data to look for close conjunctions and provide that information to satellite operators via the Internet has been continuously available on CelesTrak, in the form of Satellite Orbital Conjunction Reports Assessing Threatening Encounters in Space (SOCRATES), since May 2004. Those reports are used by many operators as one way to keep apprised of these potential threats. However, the two-line element sets (TLEs) are generated using non-cooperative tracking via the SSN's network of radar and optical sensors. As a result, the relatively low accuracy of the data results in a large number of false alarms that satellite operators must routinely deal with. Yet, satellite operators typically perform orbit maintenance for their own satellites, using active ranging and GPS systems. These data are often an order of magnitude more accurate than those available using TLEs. When combined (in the form of ephemerides) with maneuver planning information, the ability to maintain predictive awareness increases significantly. And when satellite operators share this data, the improved space situational awareness, particularly in the crowded geosynchronous belt, can be dramatic and the number of false alarms can be reduced

  9. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    OpenAIRE

    Shao Jie; Wang Li; Zhao WeiSong; Zhong YaQin; Reza Malekian

    2014-01-01

    A modeling based on the improved Elman neural network (IENN) is proposed to analyze the nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of the hidden layer neurons. Simulation results of the half-bridge class-D power...
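    The defining ingredient above, hidden units activated by Chebyshev orthogonal basis functions rather than sigmoids, can be sketched briefly. This shows only the basis generation via the standard three-term recurrence, not the authors' complete IENN model or its training procedure.

```python
import numpy as np

# Chebyshev basis sketch (illustrative, not the paper's full network):
# T_0(x) = 1, T_1(x) = x, T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x).
# Each hidden "neuron" k would output T_k applied to its input.
def chebyshev_basis(x, order):
    x = np.asarray(x, dtype=float)
    T = [np.ones_like(x), x]
    for _ in range(2, order + 1):
        T.append(2 * x * T[-1] - T[-2])
    return np.stack(T[:order + 1])   # shape: (order + 1, len(x))

x = np.linspace(-1.0, 1.0, 5)        # Chebyshev polynomials live on [-1, 1]
print(chebyshev_basis(x, 3))
```

A model would then take a weighted sum of these basis outputs, with the number of hidden neurons (the order) chosen from the SSE curves the abstract describes.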

  10. Modeling Analysis and Improvement of Power Loss in Microgrid

    Directory of Open Access Journals (Sweden)

    H. Lan

    2015-01-01

    Full Text Available The consumption of conventional energy sources and environmental concerns have resulted in rapid growth in the amount of renewable energy introduced into power systems. With the help of distributed generation (DG), improved power loss and voltage profiles can be salient benefits. However, studies show that improper placement and sizing of an energy storage system (ESS) lead to undesired power loss and risks to voltage stability, especially in the case of high renewable energy penetration. To address this problem, this paper sets up a microgrid based on the IEEE 34-bus distribution system, which consists of a wind power generation system, a photovoltaic generation system, a diesel generation system, and an energy storage system, together with various types of load. Furthermore, a particle swarm optimization (PSO) algorithm is proposed to minimize power loss and improve the system voltage profiles by optimally managing the different sorts of distributed generation under the worst-case conditions of renewable energy production. The established IEEE 34-bus system is adopted to perform case studies. The detailed simulation results for each case clearly demonstrate the necessity of optimal management of the system operation and the effectiveness of the proposed method.
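    The paper couples PSO with a load-flow model of the IEEE 34-bus microgrid; that model is beyond the scope of an abstract, so the sketch below shows only a minimal, generic PSO with a stand-in objective in place of the network power-loss calculation. All parameter values are conventional defaults, not the paper's settings.

```python
import random

# Minimal particle swarm optimization (illustrative; the paper's objective is
# network power loss from a load-flow solve, replaced here by a placeholder).
def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: sphere function as a placeholder for power loss.
best, loss = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))
print(best, loss)  # loss is driven close to zero
```

In the paper's setting, the decision vector would hold DG and ESS sizes/placements, and `objective` would run a power-flow solve and return the resulting loss.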

  11. Analysis of metals and alloys for improved material compatibility

    International Nuclear Information System (INIS)

    Various metals and alloys are used in boilers and heat exchangers. Chemical and physical reactions occurring in the boiler may lead to destruction of materials of construction or to the formation of scales and sludge. Many of the problems associated with boilers can be minimised by suitable material selection. Analytical techniques play a vital role in this task. The use of conventional wet chemical methods are well established and yield accurate results for the assay of major constituents. The use of atomic absorption spectrophotometry has led to the development of elegant procedures for a convenient and rapid estimation of minor constituents without any need for separation of matrix elements. The various procedures developed at Analytical Chemistry Division for trace analysis metals and alloys are described in this paper with special reference to the analysis of steel and other nuclear materials. (author)

  12. A simple way to improved formulation of FE^2 analysis

    Science.gov (United States)

    Šolinc, Urša; Korelc, Jože

    2015-11-01

    A new formulation of two-scale FE^2 analysis introduces the symmetric stretch tensor as the strain measure on the macro level, instead of the asymmetric deformation gradient, to determine boundary conditions on the embedded microstructure. This significantly reduces the computational cost of the boundary-condition-related sensitivity analysis of the microstructure, and with it the evaluation of local macroscopic stress tensors and tangent matrices. Various FE^2 formulations with isogeometric and standard finite element microanalysis are tested for consistency, accuracy and numerical efficiency on numerical homogenisation examples. Objective performance comparison of the different FE^2 formulations is enabled by the automation of all procedures in the symbolic code generation system AceGen. The results obtained in the numerical examples show the reduced computational cost of the new FE^2 formulation without loss of accuracy, and comparable numerical efficiency of the higher-order isogeometric and standard FE^2 formulations.
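    The kinematic step behind passing a symmetric stretch tensor to the microscale can be sketched with the polar decomposition F = RU: the rotation R is split off and only the symmetric right stretch U is kept. This is a generic illustration via the SVD (the example F is arbitrary), not the AceGen implementation.

```python
import numpy as np

# Polar decomposition sketch: F = R U, with R a rotation and U the symmetric
# right stretch tensor. With the SVD F = W S Vt, one gets U = V S Vt and
# R = W Vt. The macro level would hand U (6 independent components) to the
# microstructure instead of the full asymmetric F (9 components).
F = np.array([[1.10, 0.20, 0.00],
              [0.00, 0.90, 0.10],
              [0.05, 0.00, 1.00]])   # arbitrary example deformation gradient

W, S, Vt = np.linalg.svd(F)
U = Vt.T @ np.diag(S) @ Vt           # symmetric stretch (what the microscale sees)
R = W @ Vt                           # rotation, carried separately

assert np.allclose(R @ U, F)                     # F = R U holds
assert np.allclose(U, U.T)                       # U is symmetric
assert np.isclose(abs(np.linalg.det(R)), 1.0)    # R is orthogonal
print(U)
```

Dropping the rotation is what shrinks the boundary-condition sensitivity analysis: the microscale response is evaluated against the six components of U rather than the nine of F.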

  13. Database improvements for motor vehicle/bicycle crash analysis

    OpenAIRE

    Lusk, Anne C; Asgarzadeh, Morteza; Farvid, Maryam S

    2015-01-01

    Background: Bicycling is healthy but needs to be safer for more to bike. Police crash templates are designed for reporting crashes between motor vehicles, but not between vehicles/bicycles. If written/drawn bicycle-crash-scene details exist, these are not entered into spreadsheets. Objective: To assess which bicycle-crash-scene data might be added to spreadsheets for analysis. Methods: Police crash templates from 50 states were analysed. Reports for 3350 motor vehicle/bicycle crashes (2011) w...

  14. Analysis and improvement of the marketing mix. Case Company X.

    OpenAIRE

    Tarasova, Maria

    2014-01-01

    The aim of the thesis was to analyse the importance of the marketing mix, consider its crucial elements, discover their correlation and effect on overall performance. The work considers the analysis of the marketing mix and ways of enhancement in order to achieve marketing objectives. The research was done for Case Company X, the new outlet apparel store in Lappeenranta. The company’s main goal was to increase awareness of the shop and increase sales by attracting more customers. Th...

  15. Improving E-Business Design through Business Model Analysis

    OpenAIRE

    Ilayperuma, Tharaka

    2010-01-01

    To a rapidly increasing degree, traditional organizational structures evolve in large parts of the world towards online business using modern Information and Communication Technology (ICT) capabilities. For efficient applications of inter-organizational information systems, the alignment between business and ICT is a key factor. In this context, business analysis using business modelling can be regarded as a first step in designing economically sustainable e-business solutions. This thesis ex...

  16. Metadata-based analysis to improve clinical trial exchange

    OpenAIRE

    Luzi, Daniela; Ricci, Fabrizio L. (CNR-IRPPS); Serbanati, Luca D.; GreyNet, Grey Literature Network Service

    2006-01-01

    There are various, important information sources devoted to the diffusion of clinical trials, but they fail to achieve a complete coverage of clinical research. The demand for a mandatory public registration of clinical trials is emerging from different institutions, which are making efforts to develop common metadata schemas to both increase information exchange and make this information publicly available. The paper describes a metadata analysis of the various solutions of CT data represent...

  17. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)

  18. Improved corner detection by ultrasonic testing using phase analysis.

    Science.gov (United States)

    Broberg, Patrik; Runnemalm, Anna; Sjödahl, Mikael

    2013-02-01

    In ultrasonic testing, corners are used for sensitivity calibration in the form of notches, for measuring the sound velocity in the material, and as known reference points during testing. A 90° corner always reflects incoming waves back in the opposite direction, owing to a double reflection, and therefore gives a strong echo. This article presents a method for separating a corner echo from other echoes and finding the position of the corner more accurately. The method is based on analysing the phase of the reflected signal. The proposed method was tested on a steel calibration block, and the width of the indication was reduced by up to 50% compared to the amplitude signal, resulting in more accurate positioning of the corner. Using the phase instead of the amplitude also improves reliability, since reflections other than those from corners disappear. PMID:23164172
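    The paper's corner discriminator is not given in the abstract; the sketch below (numpy-only, with an FFT-based Hilbert transform) merely illustrates the underlying idea that the instantaneous phase of an A-scan locates an echo more sharply than its amplitude envelope. The synthetic signal and all parameters are illustrative assumptions:

    ```python
    import numpy as np

    def analytic_signal(x):
        """Analytic signal via FFT (a numpy-only Hilbert transform)."""
        n = len(x)
        X = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1.0
        h[1:n // 2] = 2.0
        if n % 2 == 0:
            h[n // 2] = 1.0
        return np.fft.ifft(X * h)

    # Synthetic A-scan: a Gaussian-windowed tone burst arriving at t0.
    fs, f0, t0 = 50e6, 5e6, 4e-6           # sample rate, centre frequency, arrival
    t = np.arange(0, 8e-6, 1 / fs)
    echo = np.cos(2 * np.pi * f0 * (t - t0)) * np.exp(-((t - t0) / 0.3e-6) ** 2)

    z = analytic_signal(echo)
    envelope = np.abs(z)
    phase = np.angle(z)                     # instantaneous phase

    # Coarse estimate: envelope peak. Phase refinement: the zero crossing of the
    # instantaneous phase nearest the peak (the cosine's phase is 0 at t0).
    i_peak = int(np.argmax(envelope))
    window = slice(max(i_peak - 10, 0), i_peak + 10)
    i_phase = window.start + int(np.argmin(np.abs(phase[window])))
    print(abs(t[i_phase] - t0) < 1 / (2 * f0))   # within half a carrier period
    ```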

  19. An Effective Analysis of Weblog Files to Improve Website Performance

    Directory of Open Access Journals (Sweden)

    T.Revathi

    2012-02-01

    Full Text Available As there is enormous growth in the web in terms of web sites, the size of web usage data is also increasing steadily. This web usage data plays a vital role in the effective management of web sites, and is stored by the web server in a file called the weblog. In order to discover the knowledge required for improving the performance of websites, we need to apply the best preprocessing methodology to the server weblog file. Data preprocessing is a phase which automatically identifies meaningful patterns and user behavior. So far, analyzing weblog data has been a challenging task in the area of web usage mining. In this paper we propose an effective and enhanced data preprocessing methodology which produces efficient usage patterns and reduces the weblog to 75-80% of its initial size. The experimental results are shown in the following sections.
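    A typical weblog cleaning pass of the kind described can be sketched as follows; the field patterns, bot names, and log lines below are illustrative assumptions, not the paper's actual rules:

    ```python
    import re

    # Drop requests for embedded resources (images, CSS, JS) and known robots,
    # keeping only genuine page views.
    RESOURCE = re.compile(r"\.(gif|jpe?g|png|css|js|ico)(\?|\s|$)", re.IGNORECASE)
    BOTS = ("Googlebot", "bingbot", "crawler")

    def clean_weblog(lines):
        kept = []
        for line in lines:
            if RESOURCE.search(line):
                continue
            if any(bot in line for bot in BOTS):
                continue
            kept.append(line)
        return kept

    log = [
        '10.0.0.1 - - [12/Feb/2012] "GET /index.html HTTP/1.1" 200 1043 "-" "Mozilla/5.0"',
        '10.0.0.1 - - [12/Feb/2012] "GET /logo.png HTTP/1.1" 200 2332 "-" "Mozilla/5.0"',
        '10.0.0.2 - - [12/Feb/2012] "GET /style.css HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
        '66.249.66.1 - - [12/Feb/2012] "GET /index.html HTTP/1.1" 200 1043 "-" "Googlebot/2.1"',
        '10.0.0.3 - - [12/Feb/2012] "GET /about.html HTTP/1.1" 200 871 "-" "Mozilla/5.0"',
    ]
    print(len(clean_weblog(log)))  # 2 page views survive out of 5 raw entries
    ```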

  20. An Improvement on STEM Method in Multi-Criteria Analysis

    Directory of Open Access Journals (Sweden)

    M. Izadikhah

    2012-06-01

    Full Text Available Multi-criteria decision making (MCDM) refers to making decisions in the presence of multiple, conflicting criteria. Multiobjective programming methods such as multiple objective linear programming (MOLP) are techniques used to solve such MCDM problems. One of the first interactive procedures for solving MOLP is the step method (STEM). In this paper we improve the STEM method by introducing a weight vector over the objectives, which ensures that more important objectives end up closer to the ideal. The presented method thereby tries to increase the satisfaction level of the obtained solution. Finally, a numerical example is given to illustrate the new method and clarify the main results developed in this paper.
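    The weighting idea can be illustrated with a weighted Chebyshev distance to the ideal point; this is a sketch of the general principle only, not the paper's full interactive STEM procedure, and all numbers are made up:

    ```python
    # Among candidate solutions, prefer the one whose worst weighted deviation
    # from the ideal is smallest, so heavier-weighted (more important)
    # objectives are pulled closer to their ideal values.
    def weighted_chebyshev(z, ideal, weights):
        return max(w * abs(i - v) for v, i, w in zip(z, ideal, weights))

    ideal = [10.0, 8.0]            # per-objective ideal values (maximization)
    candidates = [[9.0, 5.0],      # strong on objective 1, weak on objective 2
                  [7.0, 7.5]]      # balanced
    weights = [0.8, 0.2]           # objective 1 matters most

    best = min(range(len(candidates)),
               key=lambda k: weighted_chebyshev(candidates[k], ideal, weights))
    print(best)  # 0: the solution keeping the important objective near its ideal
    ```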

  1. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    Directory of Open Access Journals (Sweden)

    Shao Jie

    2014-01-01

    Full Text Available A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with memory effects. In this model, the hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions. The error curves of the sum of squared error (SSE), varying with the number of hidden neurons and the iteration step, are studied to determine the number of hidden layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA), with two-tone and broadband signals as input, show that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL), Chebyshev neural network (CNN), and basic Elman neural network (BENN) models, the proposed model has better performance.

  2. Numerical analysis of modeling based on improved Elman neural network.

    Science.gov (United States)

    Jie, Shao; Li, Wang; WeiSong, Zhao; YaQin, Zhong; Malekian, Reza

    2014-01-01

    A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with memory effects. In this model, the hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions. The error curves of the sum of squared error (SSE), varying with the number of hidden neurons and the iteration step, are studied to determine the number of hidden layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA), with two-tone and broadband signals as input, show that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL), Chebyshev neural network (CNN), and basic Elman neural network (BENN) models, the proposed model has better performance. PMID:25054172
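    The Chebyshev activations the two records above describe can be sketched as follows; only the orthogonal basis evaluation is shown, not the full IENN with its recurrent context layer or training procedure:

    ```python
    import numpy as np

    def chebyshev_basis(x, order):
        """T_0..T_order evaluated at x (assumed in [-1, 1]) via the
        recurrence T_{k+1}(x) = 2*x*T_k(x) - T_{k-1}(x)."""
        T = [np.ones_like(x), x]
        for _ in range(2, order + 1):
            T.append(2 * x * T[-1] - T[-2])
        return np.stack(T[: order + 1])

    # Hidden-layer activations for one (normalized) pre-activation vector:
    x = np.array([0.5, -0.25])
    H = chebyshev_basis(x, 3)
    print(H[2])  # T_2(x) = 2x^2 - 1 -> [-0.5, -0.875]
    ```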

  3. Preintervention Analysis and Improvement of Customer Greeting in A Restaurant

    OpenAIRE

    Therrien, Kelly; Wilder, David A; Rodriguez, Manuel; Wine, Byron

    2005-01-01

    We examined customer greeting by employees at one location of a sandwich restaurant chain. First, a preintervention analysis was conducted to determine the conditions under which greeting a customer within 3 s of his or her entry into the restaurant did and did not occur. Results suggested that an appropriate customer greeting was most likely to occur when a door chime was used to indicate that a customer had entered the store and when the store manager was present behind the service counter....

  4. Techniques for Improving Filters in Power Grid Contingency Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Adolf, Robert D.; Haglin, David J.; Halappanavar, Mahantesh; Chen, Yousu; Huang, Zhenyu

    2011-12-31

    In large-scale power transmission systems, predicting faults and preemptively taking corrective action to avoid them is essential to preventing rolling blackouts. The computational study of the constantly-shifting state of the power grid and its weaknesses is called contingency analysis. Multiple-contingency planning in the electrical grid is one example of a complex monitoring system where a full computational solution is operationally infeasible. We present a general framework for building and evaluating resource-aware models of filtering techniques for this type of monitoring.

  5. Analysis of oocyte physiology to improve cryopreservation procedures.

    Science.gov (United States)

    Gardner, David K; Sheehan, Courtney B; Rienzi, Laura; Katz-Jaffe, Mandy; Larman, Mark G

    2007-01-01

    In contrast to the preimplantation mammalian embryo, the metaphase II oocyte has been notoriously difficult to cryopreserve. The ability to store oocytes successfully at -196 degrees C has numerous practical and financial advantages, together with ethical considerations, and will positively impact animal breeding programs and assisted conception in humans. Differences in membrane permeability and in physiology are two main reasons why successful oocyte cryopreservation has remained elusive. It is proposed, therefore, that rather than relying on technologies already established for the preimplantation embryo, the development of cryopreservation techniques suitable for the mammalian oocyte needs to take into account the idiosyncratic physiology of this cell. Analysis of intracellular calcium, for example, has revealed that exposure to conventional permeating cryoprotectants, such as propanediol, ethylene glycol and DMSO, each independently results in an increase in calcium, which in turn has the potential to initiate oocyte activation, culminating in zona hardening. Quantification of the metabolome and proteome of the oocyte has revealed that whereas slow freezing has a dramatic effect on cell physiology, vitrification appears to have a limited effect, plausibly because of the limited exposure to cryoprotectants. Analysis of meiotic spindle dynamics and embryo development following IVF also indicates that vitrification is less traumatic than slow freezing, and therefore has the greatest potential for successful oocyte cryopreservation. PMID:17049589

  6. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for the morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004), applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program, called Knickpoint Finder, was coded in Python for use on the ArcGIS platform. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA); this is an area of constant intraplate seismicity and non-orogenic active tectonics, exhibiting a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, the epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of knickpoint-dense areas with active lineaments and the speed of identification of deformed areas. Therefore, this software tool may be considered useful for neotectonic analyses of large areas and may be applied to any area with DEM coverage.
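    The core idea of a knickpoint detector can be sketched with a hypothetical slope-break test along a stream's longitudinal profile; the thresholding rule and the synthetic profile below are assumptions for illustration, not Knickpoint Finder's actual algorithm:

    ```python
    # Flag points where the channel gradient steepens abruptly relative to the
    # upstream segment (downstream slope / upstream slope >= ratio).
    def find_knickpoints(distance, elevation, ratio=2.0):
        slopes = [(elevation[i] - elevation[i + 1]) / (distance[i + 1] - distance[i])
                  for i in range(len(distance) - 1)]
        return [i for i in range(1, len(slopes))
                if slopes[i - 1] > 0 and slopes[i] / slopes[i - 1] >= ratio]

    # Synthetic profile (km, m): a gentle reach, then a sharp break in slope.
    dist = [0, 1, 2, 3, 4, 5]
    elev = [100, 98, 96, 94, 80, 66]   # 2 m/km upstream, 14 m/km downstream
    print(find_knickpoints(dist, elev))  # [3]: the lip of the steep reach at 3 km
    ```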

  7. Improved evaporative light scattering detection for carbohydrate analysis.

    Science.gov (United States)

    Condezo-Hoyos, Luis; Pérez-López, Elena; Rupérez, Pilar

    2015-08-01

    Optimization and validation of evaporative light scattering detector (ELSD), aided by response surface methodology (RSM), has been developed for the liquid chromatography analysis of a wide molecular weight (MW) range of carbohydrates, including polysaccharides and oligosaccharides. Optimal experimental parameters for the ELSD detection were: 88.8°C evaporator temperature, 77.9°C nebulizer temperature and 1.1 standard litres per minute nitrogen flow rate. Optimal ELSD detection, used together with high performance size exclusion chromatography (HPSEC) of carbohydrates, gave a linear range from 250 to 1000 mg L(-1) (R(2)>0.998), with limits of detection and quantitation of 4.83-11.67 and 16.11-38.91 mg L(-1), respectively. Relative standard deviation was lower than 1.8% for intra-day and inter-day repeatability for apple pectin, inulin, verbascose, stachyose and raffinose. Recovery ranged from 103.7% to 118.3% for fructo-oligosaccharides, α-galacto-oligosaccharides and disaccharides. Optimized and validated ELSD detection is proposed for the analysis of high- to low-MW carbohydrates with high sensitivity, precision and accuracy. PMID:25766827
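    Detection and quantitation limits like those quoted are typically derived from a linear calibration; the abstract does not state the paper's exact procedure, so the sketch below uses the common ICH-style estimates (LOD = 3.3·s/slope, LOQ = 10·s/slope) with entirely hypothetical calibration data:

    ```python
    import numpy as np

    # Hypothetical calibration data: concentration (mg/L) vs. ELSD peak area.
    conc = np.array([250.0, 500.0, 750.0, 1000.0])
    area = np.array([51.0, 99.0, 152.0, 198.0])

    slope, intercept = np.polyfit(conc, area, 1)
    resid = area - (slope * conc + intercept)
    s = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))  # residual std deviation

    lod = 3.3 * s / slope    # limit of detection
    loq = 10.0 * s / slope   # limit of quantitation
    print(round(lod, 1), round(loq, 1))
    ```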

  8. Recent improvements in plutonium gamma-ray analysis using MGA

    International Nuclear Information System (INIS)

    MGA is a gamma-ray spectrum analysis program for determining relative plutonium isotopic abundances. It can determine plutonium isotopic abundances to better than 1% using a high-resolution, low-energy, planar germanium detector and measurement times of ten minutes or less. We have modified MGA to allow determination of absolute plutonium isotopic abundances in solutions. By calibrating a detector with a known solution concentration in a well-defined sample geometry, plutonium solution concentrations can be determined. MGA can also analyze a second, high-energy spectrum to determine fission product abundances relative to total plutonium. For the high-energy gamma-ray measurements we have devised a new hardware configuration in which both the low- and high-energy gamma-ray detectors are mounted in a single cryostat, thereby reducing the weight and volume of the detector systems. We describe the detector configuration and the performance of the MGA program for determining plutonium concentrations in solutions and fission product abundances

  9. Functional Virtual Prototyping in Vehicle Chassis Reform Analysis and Improvement Design

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The contribution of functional virtual prototyping to vehicle chassis development is presented. The topics taken into consideration were reform analysis and improvement design during vehicle chassis development. A coordinate frame based on the digital model was established, and the main CAE analysis methods, multi-body system dynamics and finite element analysis, were applied to the digital model built with CAD/CAM software. The method was applied in vehicle chassis reform analysis and improvement design; all analysis and design projects were implemented on the uniform digital model, and the development was carried through effectively.

  10. An improved rank assessment method for weibull analysis of reliability data

    International Nuclear Information System (INIS)

    Weibull analysis has been applied widely in reliability data analysis. Rank assessment is one of the key steps in Weibull analysis, and is also a source of the original errors. An improved median rank function obtained by genetic algorithms is presented to reduce the errors of rank assessment. (authors)
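    The GA-optimized rank function itself is not given in the abstract; as context, here is a sketch of the conventional baseline it improves on, Benard's median rank approximation feeding a least-squares Weibull fit:

    ```python
    import math

    def median_rank(i, n):
        """Benard's approximation to the median rank of the i-th ordered failure."""
        return (i - 0.3) / (n + 0.4)

    def fit_weibull(times):
        """Least-squares fit of ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)."""
        n = len(times)
        xs = [math.log(t) for t in sorted(times)]
        ys = [math.log(-math.log(1.0 - median_rank(i, n))) for i in range(1, n + 1)]
        mx, my = sum(xs) / n, sum(ys) / n
        beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
               sum((x - mx) ** 2 for x in xs)
        eta = math.exp(mx - my / beta)
        return beta, eta

    # Self-consistency check: failure times generated from Weibull(beta=2, eta=100)
    # at exactly the median-rank probabilities are recovered (near) exactly.
    beta0, eta0, n = 2.0, 100.0, 10
    times = [eta0 * (-math.log(1 - median_rank(i, n))) ** (1 / beta0)
             for i in range(1, n + 1)]
    beta, eta = fit_weibull(times)
    print(round(beta, 3), round(eta, 2))  # ~2.0, ~100.0
    ```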

  11. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.

  12. Improved data analysis for verifying quantum nonlocality and entanglement

    Science.gov (United States)

    Zhang, Yanbao; Glancy, Scott; Knill, Emanuel

    2012-06-01

    Given a finite number of experimental results originating from local measurements on two separated quantum systems in an unknown state, are these systems nonlocally correlated or entangled with each other? These properties can be verified by violating a Bell inequality or satisfying an entanglement witness. However, the violation or satisfaction could be due to statistical fluctuations in finite measurements. Rigorous upper bounds are therefore required on the maximum probability (i.e., the p-value), under local realistic or separable states, of a violation or satisfaction as high as the one observed. Here, we propose a rigorous upper bound that improves the known bound from large deviation theory [R. Gill, arXiv:quant-ph/0110137]. The proposed bound is robust against experimental instability and the memory loophole [J. Barrett et al., Phys. Rev. A 66, 042111 (2002)]. Compared with our previous method [Phys. Rev. A 84, 062118 (2011)], the proposed method takes advantage of the particular Bell inequality or entanglement witness tested in an experiment, so the computational complexity is reduced. The method can also be easily extended to test a set of independent Bell inequalities or entanglement witnesses simultaneously.

  13. Improved pressurized water reactor radial reflector modeling in nodal analysis

    International Nuclear Information System (INIS)

    A one-dimensional method based on a combination of the nodal equivalence theory and response matrix homogenization methods was previously described for determining environment-insensitive equivalent few-group diffusion theory parameters for homogenized radial reflector nodes of a pressurized water reactor. This reflector model, called the NGET-RM model, yields equivalent nodal parameters that do not account for the two-dimensional structure of the baffle at core corners; this can lead to significant errors in computed two-dimensional core power distributions. A semi-empirical correction procedure is proposed for reducing the two-dimensional effects associated with this particular one-dimensional reflector model. Numerical two-group experiments are performed for a given reflector configuration (and soluble boron concentration) to determine optimal values of the two empirical factors defined by this model. In this paper it is shown that the resultant factors are rather insensitive to core configuration or core conditions and that their application yields improved two-group NGET-RM reflector parameters with which accurate nodal power distributions can be obtained. The results are also compared with those obtained with another one-dimensional environment-insensitive model that has an extra degree of freedom utilized here to reduce two-dimensional effects. Some practical aspects related to the application of the proposed correction procedure are briefly discussed

  14. Energy response improvement for photon dosimetry using pulse analysis

    Science.gov (United States)

    Zaki, Dizaji H.

    2016-02-01

    During the last few years, active personal dosimeters, frequently based on silicon diode detectors, have been developed and have replaced passive personal dosimeters in some external monitoring systems. Incident photons interact with the constituents of the diode detector and produce electrons; these photon-induced electrons deposit energy in the detector's sensitive region and contribute to the detector response. To achieve an appropriate photon dosimetry response, the detectors are usually covered by a metallic layer of optimum thickness, which acts as an energy-compensating shield. In this paper, energy compensation is instead performed in software: selective data sampling based on pulse height is used to determine the photon dose equivalent, improving the energy response in photon dosimetry. The detector design is optimized for the response function and the determination of the photon dose equivalent. The photon personal dose equivalent is determined in the energy range of 0.3-6 MeV. Over this wide energy range, the errors of the calculated data and of the data measured for 133Ba, 137Cs, 60Co and 241Am-Be sources are up to 20% and 15%, respectively. Fairly good agreement is seen between simulation and the dose values obtained from our process and from the specifications of several photon sources.

  15. Analysis and Measures to Improve Waste Management in Schools

    Directory of Open Access Journals (Sweden)

    Elena Cristina Rada

    2016-08-01

    Full Text Available Assessing waste production in schools highlights the contribution of school children and school staff to the total amount of waste generated in a region, as well as any poor recycling practices (the so-called separate collection of waste) by the students, which could be improved through educational activities. Educating young people about the importance of environmental issues is essential, since instilling the right behavior in school children also benefits the behavior of their families. The way waste management was carried out in different schools in Trento (northern Italy) was analyzed: a primary school, a secondary school, and three high schools were taken as cases of study. The possible influence of the age of the students, and of the various activities carried out within the schools, on waste-separation behavior was also evaluated. The results showed that the production of waste did not depend only on the size of the institutes and the number of occupants but, especially, on the type of activities carried out in addition to ordinary classes and on the habits of both pupils and staff. In the light of the results obtained, some corrective measures were proposed to the schools, aimed at increasing students' awareness of the importance of correct waste management behavior and at applying good recycling practices.

  16. Improving the channeler ant model for lung CT analysis

    Science.gov (United States)

    Cerello, Piergiorgio; Lopez Torres, Ernesto; Fiorina, Elisa; Oppedisano, Chiara; Peroni, Cristiana; Arteche Diaz, Raul; Bellotti, Roberto; Bosco, Paolo; Camarlinghi, Niccolo; Massafra, Andrea

    2011-03-01

    The Channeler Ant Model (CAM) is an algorithm based on virtual ant colonies, conceived for the segmentation of complex structures with different shapes and intensity in a 3D environment. It exploits the natural capabilities of virtual ant colonies to modify the environment and communicate with each other by pheromone deposition. When applied to lung CTs, the CAM can be turned into a Computer Aided Detection (CAD) method for the identification of pulmonary nodules and the support to radiologists in the identification of early-stage pathological objects. The CAM has been validated with the segmentation of 3D artificial objects and it has already been successfully applied to the lung nodules detection in Computed Tomography images within the ANODE09 challenge. The model improvements for the segmentation of nodules attached to the pleura and to the vessel tree are discussed, as well as a method to enhance the detection of low-intensity nodules. The results on five datasets annotated with different criteria show that the analytical modules (i.e. up to the filtering stage) provide a sensitivity in the 80 - 90% range with a number of FP/scan of the order of 20. The classification module, although not yet optimised, keeps the sensitivity in the 70 - 85% range at about 10 FP/scan, in spite of the fact that the annotation criteria for the training and the validation samples are different.

  17. Performance analysis of PV plants: Optimization for improving profitability

    International Nuclear Information System (INIS)

    Highlights: ► Real PV production from two 100 kWp grid-connected installations is conducted. ► Data sets on production were collected over an entire year. ► Economic results highlight the importance of properly selecting the system components. ► Performance of PV plants is directly related to improvements of all components. - Abstract: A study is conducted of real PV production from two 100 kWp grid-connected installations located in the same area, both of which experience the same fluctuations in temperature and radiation. Data sets on production were collected over an entire year and both installations were compared under various levels of radiation. The installations were assembled with mono-Si panels, mounted on the same support system, and the power supply was equal for the inverter and the measurement system; the same parameters were also employed for the wiring, and electrical losses were calculated in both cases. The results, in economic terms, highlight the importance of properly selecting the system components and the design parameters for maximum profitability.

  18. Performance Analysis of an Improved MUSIC DoA Estimator

    Science.gov (United States)

    Vallet, Pascal; Mestre, Xavier; Loubaton, Philippe

    2015-12-01

    This paper addresses the statistical performance of subspace DoA estimation using a sensor array, in the asymptotic regime where the number of samples and the number of sensors both converge to infinity at the same rate. Improved subspace DoA estimators (termed G-MUSIC) were derived in previous works and were shown to be consistent and asymptotically Gaussian distributed in the case where the number of sources and their DoAs remain fixed. In this case, which models widely spaced DoA scenarios, it is proved in the present paper that the traditional MUSIC method also provides consistent DoA estimates having the same asymptotic variances as the G-MUSIC estimates. The case of DoAs spaced on the order of a beamwidth, which models closely spaced sources, is also considered. It is shown that G-MUSIC estimates are still able to consistently separate the sources, while this is no longer the case for the MUSIC estimates. The asymptotic variances of G-MUSIC estimates are also evaluated.
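    As background for the comparison above, here is a minimal numpy sketch of the classical MUSIC estimator on a uniform linear array; the G-MUSIC correction itself requires random-matrix weights not given here, and the scenario (8 sensors, two unit-power sources, exact covariance) is an illustrative assumption:

    ```python
    import numpy as np

    # Classical MUSIC on a ULA with half-wavelength spacing, built from an
    # analytically exact covariance so the result is deterministic.
    M, doas_true, sigma2 = 8, np.array([-10.0, 15.0]), 0.01

    def steering(theta_deg, m):
        phase = np.pi * np.sin(np.deg2rad(theta_deg))   # d = lambda/2
        return np.exp(1j * phase * np.arange(m))

    A = np.column_stack([steering(t, M) for t in doas_true])
    R = A @ A.conj().T + sigma2 * np.eye(M)   # unit-power, uncorrelated sources

    eigval, eigvec = np.linalg.eigh(R)        # eigenvalues in ascending order
    En = eigvec[:, : M - len(doas_true)]      # noise subspace

    grid = np.arange(-90.0, 90.0, 0.1)
    spectrum = np.array([1.0 / (np.linalg.norm(En.conj().T @ steering(g, M)) ** 2
                                + 1e-12)      # regularized pseudospectrum
                         for g in grid])
    top = np.argsort(spectrum)[-len(doas_true):]
    est = sorted(round(float(grid[i]), 1) for i in top)
    print(est)  # [-10.0, 15.0]: the spectral peaks fall at the true DoAs
    ```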

  19. Analysis and improvement of cyclotron thallium target room shield

    International Nuclear Information System (INIS)

    Because of the high neutron and gamma-ray intensities during thallium-203 target bombardment, the thallium target room shield and its improvement have been investigated. Leakage neutron and gamma-ray dose rates at various points behind the shield were calculated by simulating the transport of neutrons and photons with the Monte Carlo MCNP4C computer code, taking into account the target room geometry, its associated shield, and the neutron and gamma-ray source strengths and spectra. Three designs for enhancing the shield performance were analyzed: a door as a shield at the maze entrance, covering the maze walls with layers of effective materials, and adding a shadow shield in the target room, in front of the radiation source and parallel to the maze. Dose calculations were carried out for each suggested shield separately, for different materials and dimensions; the best-performing shield was then constructed. It was found that the deviation between calculated and measured dose values after the upgrade is less than 20%

  20. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta;

    2006-01-01

    indicators directly quantifying choice of coxibs, indicators measuring expenditure per Defined Daily Dose, and indicators taking risk aspects into account, (2) "Frequent NSAID prescribing", comprising indicators quantifying prevalence or amount of NSAID prescribing, and (3) "Diverse NSAID choice", comprising...... indicators focusing on the width of GPs' formularies. The number of indicators for measuring the important aspects of quality in prescribing of NSAIDs could be reduced substantially by selecting the indicator in each dimension with the highest factor loading. A high preference for coxibs indicated both...... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns....

  1. Improvements in high purity radioxenon sample preparation and analysis

    International Nuclear Information System (INIS)

    This paper describes the production and analysis of high-purity radioxenon samples. The University of Texas' 1.1 MW TRIGA research reactor is used for radioactive sample production via neutron activation. The reactor's facilities include a pneumatic system for precise irradiation of samples. In order to use the pneumatic facilities, gaseous samples have been encapsulated in quartz to fit into the polyethylene vials designed for the system. Enriched, stable, isotopically pure xenon gas is irradiated with neutrons in order to activate it to radioxenon isotopes, which are then measured with a β-γ coincidence system. The system enhancements and the procedures used to produce the radioxenon samples are described, and examples of first-of-their-kind measurements are shown for 125Xe, 127Xe, 129mXe, and 137Xe. (author)

  2. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian;

    2008-01-01

    Health technology assessment (HTA) is the multidisciplinary study of the implications of the development, diffusion and use of health technologies. It supports health-policy decisions by providing a joint knowledge base for decision-makers. To increase its policy relevance, HTA tries to extend...... only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs, and their...... beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology that is...

  3. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  4. Stiffness Analysis and Improvement of Bolt-Plate Contact Assemblies

    DEFF Research Database (Denmark)

    Pedersen, Niels Leergaard; Pedersen, Pauli

    2008-01-01

    In a previous study it was shown that, a simplified expression for the stiffness of the plate member in a bolt-plate assembly can be found. The stiffnesses of the bolt and the connected plates are the primary quantities that control the lifetime of a dynamically loaded connection. The present study...... of stiffnesses is extended to include different material parameters by including the influence of Poisson's ratio. Two simple practical formulas are suggested and their accuracies are documented for different bolts and different material (Poisson's ratio). Secondly, the contact analysis between the...... bolt head and the plate is extended by the possibility of designing a gap, that is, a nonuniform distance between the bolt and plate before prestressing. Designing the gap function generates the possibility for a better stress field by which the stiffness of the bolt is lowered, and at the same time...

  5. Some improvements on air particulate matter analysis by INAA

    International Nuclear Information System (INIS)

    At ITN, analysis of air particulate matter has been carried out since 1994. Use is made of PM10 Gent samplers with separation into two fractions: E.A.D. (equivalent aerodynamic diameter) < 2.5 μm and 2.5 μm < E.A.D. < 10 μm. Costar-Nuclepore polycarbonate filters are used and are routinely analysed by neutron activation analysis (INAA) and proton induced X-ray emission (PIXE) to determine heavy metals and other elements. The procedure consists in cutting the filter in three parts: one half for INAA, one quarter for PIXE and one quarter left for other eventual uses. For INAA, the half filter was rolled up, irradiated in a pure polyethylene container, and the gamma measurement made including the irradiated polyethylene container. Blanks consisting of a polyethylene container plus a clean half filter were also irradiated for impurity content correction. For some elements the correction was quite relevant; therefore it was decided to irradiate the rolled filter within a tin foil which, after irradiation, was removed, the half filter then being put into a non-irradiated polyethylene container. In this work a comparison is made between the two situations, showing the advantages and disadvantages of both procedures. For INAA, Cd-115 was used for Cd determination, and very seldom was its 527.9 keV gamma line visible; now we also look for the 336.2 keV gamma line of In-115m. Cd results obtained from both gamma lines are compared, and detection limits for Cd are presented. Taking into account the EU directive 96/62/CE, which will soon demand determination of Cd, As, Ni, and Hg, some results for these elements in air particulate matter collected in the neighbourhood of Lisbon are shown. (author)

  6. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subjected to harsh conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in harsh conditions, in which, in addition to corrosive and thermal attack, long-term irradiation interferes, with implicit consequences for the evolution of material properties. This leads inevitably to scattering of the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, alongside deterministic evaluations with computer codes, probabilistic and statistical methods for predicting the structural component response. This work initiates the extension of the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting from deterministic analyses performed with the CANTUP computer code, which was developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose the structure of the deterministic CANTUP computer code was reviewed. The code was ported from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. To perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subjected to probabilistic evaluation. All the values of these properties obtained for all the values for
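
The sampling step described above can be sketched in a few lines. The nominal modulus below is an illustrative value, not one taken from the CANTUP analysis, and NumPy's generator stands in for the IMSL subroutine:

```python
import numpy as np

rng = np.random.default_rng(42)

E_nominal = 97e9          # deterministic Young's modulus [Pa], illustrative value
sigma = 0.05 * E_nominal  # 5% standard deviation, as in the abstract

# Draw pseudo-random samples around the deterministic value; each sample
# would drive one deterministic run of the structural code.
samples = rng.normal(E_nominal, sigma, size=10_000)

# Check the statistics of the generated input distribution
print(abs(samples.mean() - E_nominal) / E_nominal < 0.01)   # mean close to nominal
print(abs(samples.std() / E_nominal - 0.05) < 0.005)        # spread close to 5%
```

The tube deflection and stress outputs of the repeated deterministic runs would then be collected into empirical distributions rather than single values.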

  7. Delamination Modeling of Composites for Improved Crash Analysis

    Science.gov (United States)

    Fleming, David C.

    1999-01-01

    Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated: a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for an accurate solution are great, and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structure in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.
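
Of the three methods, the virtual crack closure technique reduces to a simple nodal formula; a minimal mode-I sketch is below. The numbers are illustrative, not data from the paper:

```python
def vcct_mode_I(F, du, b, da):
    """Mode-I energy release rate from the virtual crack closure technique:
    G_I = F * du / (2 * b * da), with F the nodal force at the crack tip,
    du the relative opening displacement one element behind the tip,
    b the specimen width, and da the element length along the crack front."""
    return F * du / (2.0 * b * da)

# Illustrative numbers: 50 N tip force, 20 um opening, 25 mm width, 1 mm elements
G = vcct_mode_I(F=50.0, du=2e-5, b=0.025, da=0.001)
print(G)  # 20.0 (J/m^2)
```

In a crash analysis, G would be compared against the critical energy release rate G_Ic each step to decide whether the interface node is released.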

  8. Improving Human/Autonomous System Teaming Through Linguistic Analysis

    Science.gov (United States)

    Meszaros, Erica L.

    2016-01-01

    An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogeneous teams. Procedures exist for licensing pilots to operate in the national airspace system, and work is underway to define methods for validating the function of autonomous systems; however, there is no method in place for assessing the interaction of these two disparate systems. Moreover, these systems are currently operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can lead to increased usability and allow for operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  9. Methodology Improvement of Reactor Physics Codes for CANDU Channels Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Hyun; Choi, Geun Suk; Win, Naing; Aung, Tharndaing; Baek, Min Ho; Lim, Jae Yong [Kyunghee University, Seoul (Korea, Republic of)

    2010-04-15

    As operational time increases, pressure tubes and calandria tubes in a CANDU core inevitably encounter geometrical deformation along the tube length. A pressure tube may sag downward within a calandria tube owing to irradiation-induced creep. This can raise a serious problem for the integrity of the pressure tube, so measurement of the deflection state of an in-service pressure tube is very important for the safety of a CANDU reactor. In this paper, the impacts of fuel channel deformation on nuclear characteristics were evaluated in order to improve nuclear design tools with respect to the local effects of abnormal deformations. It is known that a sagged pressure tube can cause an eccentric configuration of the fuel bundles in the pressure tube, by 0.6 cm at maximum. In this case, adverse pin power distribution and reactivity balance can affect reactor safety under normal and accident conditions. Thermal and radiation-induced creep in the pressure tube also expands the tube; the maximum expansion may be 5% in volume. In this case, more coolant provides more moderation in the deformed channel, resulting in an increase of reactivity. Sagging of the pressure tube did not cause a considerable change in k-inf values; expansion of the pressure tube, however, made a relatively large change in k-inf. Modeling the eccentric and enlarged configuration is not easy in the preparation of input geometry for both HELIOS and MCNP, and there is no way to consider this deformation in a one-dimensional homogenization tool such as the WIMS code. A way of handling this deformation was suggested: a correction method for the expansion effect that adjusts the number density of the coolant. The number density of the heavy-water coolant is increased as the rate of expansion increases. This correction was applied in the intact channel without changing the geometry, and it was found to be very effective in the prediction of k-inf values. In this study, further
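
The suggested correction reduces to a one-line scaling of the coolant number density; the nominal D2O density below is illustrative, not a value from the study:

```python
def corrected_coolant_density(n0, expansion_fraction):
    """Scale the heavy-water coolant number density to mimic a volume-expanded
    pressure tube while keeping the intact channel geometry: more coolant
    volume in the real channel is represented as a proportionally higher
    effective number density in the unchanged geometry."""
    return n0 * (1.0 + expansion_fraction)

# 5% volume expansion applied to a nominal D2O molecule number density
n0 = 3.32e22  # molecules/cm^3, illustrative
print(round(corrected_coolant_density(n0, 0.05) / n0, 4))  # 1.05
```

The corrected density is then used in an otherwise unmodified lattice input, so codes like WIMS that cannot represent the deformed geometry still capture the added moderation.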

  10. Using robust statistics to improve neutron activation analysis results

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A.; Ticianelli, Regina B.; Figueiredo, Ana Maria G., E-mail: gzahn@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2011-07-01

    Neutron activation analysis (NAA) is an analytical technique in which an unknown sample is submitted to a neutron flux in a nuclear reactor and its elemental composition is calculated by measuring the induced activity. In the relative NAA method, one or more well-characterized samples (usually certified reference materials - CRMs) are irradiated together with the unknown ones, and the concentration of each element is then calculated by comparing the areas of the gamma-ray peaks related to that element. When two or more CRMs are used as reference, the concentration of each element can be determined in several different ways, either using more than one gamma-ray peak for that element (when available), or using the results obtained in the comparison with each CRM. Therefore, determining the best estimate for the concentration of each element in the sample can be a delicate issue. In this work, samples from three CRMs were irradiated together and the elemental concentrations in one of them were calculated using the other two as reference. Two sets of peaks were analyzed for each element: a smaller set containing only the literature-recommended gamma-ray peaks and a larger one containing all peaks related to that element that could be quantified in the gamma-ray spectra; the most recommended transition was also used as a benchmark. The resulting data for each element were then reduced using up to five different statistical approaches: the usual (and not robust) unweighted and weighted means, together with three robust means: the Limitation of Relative Statistical Weight, Normalized Residuals and Rajeval. The resulting concentration values were then compared to the certified value for each element, allowing for discussion on both the performance of each statistical tool and the best choice of peaks for each element. (author)
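
The contrast between a plain weighted mean and a robust estimator can be sketched as follows. This is a simplified rendering of the Normalized Residuals idea (inflating the uncertainty of discrepant points until their residuals are acceptable), not the exact algorithm used in the paper:

```python
import numpy as np

def weighted_mean(values, errors):
    """Inverse-variance weighted mean and its standard uncertainty."""
    w = 1.0 / np.asarray(errors, dtype=float) ** 2
    v = np.asarray(values, dtype=float)
    return np.sum(w * v) / np.sum(w), np.sqrt(1.0 / np.sum(w))

def normalized_residuals_mean(values, errors, limit=2.0):
    """Simplified Normalized Residuals procedure: inflate the uncertainty of
    the point whose normalized residual against the weighted mean is worst
    (and above `limit`), then recompute, until all residuals are acceptable."""
    v = np.asarray(values, dtype=float)
    e = np.asarray(errors, dtype=float).copy()
    for _ in range(50):
        mean, _ = weighted_mean(v, e)
        r = (v - mean) / e
        worst = np.argmax(np.abs(r))
        if abs(r[worst]) <= limit:
            break
        e[worst] *= abs(r[worst]) / limit   # inflate the outlier's error
    return weighted_mean(v, e)

# Five concentration estimates (mg/kg) for one element, one clear outlier
vals = [10.1, 10.3, 9.9, 10.2, 14.0]
errs = [0.2, 0.2, 0.2, 0.2, 0.2]
plain, _ = weighted_mean(vals, errs)
robust, _ = normalized_residuals_mean(vals, errs)
print(round(plain, 2), round(robust, 2))
```

The plain weighted mean is pulled toward the outlier, while the robust estimate stays near the consistent cluster, which is the behavior the paper exploits when combining peaks and CRMs.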

  11. Using robust statistics to improve neutron activation analysis results

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is an analytical technique in which an unknown sample is submitted to a neutron flux in a nuclear reactor and its elemental composition is calculated by measuring the induced activity. In the relative NAA method, one or more well-characterized samples (usually certified reference materials - CRMs) are irradiated together with the unknown ones, and the concentration of each element is then calculated by comparing the areas of the gamma-ray peaks related to that element. When two or more CRMs are used as reference, the concentration of each element can be determined in several different ways, either using more than one gamma-ray peak for that element (when available), or using the results obtained in the comparison with each CRM. Therefore, determining the best estimate for the concentration of each element in the sample can be a delicate issue. In this work, samples from three CRMs were irradiated together and the elemental concentrations in one of them were calculated using the other two as reference. Two sets of peaks were analyzed for each element: a smaller set containing only the literature-recommended gamma-ray peaks and a larger one containing all peaks related to that element that could be quantified in the gamma-ray spectra; the most recommended transition was also used as a benchmark. The resulting data for each element were then reduced using up to five different statistical approaches: the usual (and not robust) unweighted and weighted means, together with three robust means: the Limitation of Relative Statistical Weight, Normalized Residuals and Rajeval. The resulting concentration values were then compared to the certified value for each element, allowing for discussion on both the performance of each statistical tool and the best choice of peaks for each element. (author)

  12. A potential target gene for the host-directed therapy of mycobacterial infection in murine macrophages.

    Science.gov (United States)

    Bao, Zhang; Chen, Ran; Zhang, Pei; Lu, Shan; Chen, Xing; Yao, Yake; Jin, Xiaozheng; Sun, Yilan; Zhou, Jianying

    2016-09-01

    Mycobacterium tuberculosis (MTB), one of the major bacterial pathogens for lethal infectious diseases, is capable of surviving within the phagosomes of host alveolar macrophages; therefore, host genetic variations may alter the susceptibility to MTB. In this study, to identify host genes exploited by MTB during infection, genes were non-selectively inactivated using lentivirus-based antisense RNA methods in Raw264.7 macrophages, and the cells that survived virulent MTB infection were then screened. Following DNA sequencing of the surviving cell clones, 26 host genes affecting susceptibility to MTB were identified and their pathways were analyzed by bioinformatics analysis. In total, 9 of these genes were confirmed as positive regulators of collagen α-5(IV) chain (Col4a5) expression, a gene encoding a type IV collagen subunit present on the cell surface. The knockdown of Col4a5 consistently suppressed intracellular mycobacterial viability, promoting the survival of Raw264.7 macrophages following mycobacterial infection. Furthermore, Col4a5 deficiency lowered the pH levels of intracellular vesicles, including endosomes, lysosomes and phagosomes in the Raw264.7 cells. Finally, the knockdown of Col4a5 post-translationally increased microsomal vacuolar-type H+-ATPase activity in macrophages, leading to the acidification of intracellular vesicles. Our findings reveal a novel role for Col4a5 in the regulation of macrophage responses to mycobacterial infection and identify Col4a5 as a potential target for the host-directed anti-mycobacterial therapy. PMID:27432120

  13. The improvement gap in energy intensity: Analysis of China's thirty provincial regions using the improved DEA (data envelopment analysis) model

    International Nuclear Information System (INIS)

    Enacting a reduction target for energy intensity in the provinces has become an important issue for the central and local governments in China, but the energy intensity index provides little information about the potential for energy efficiency improvement. This study re-estimates the TFEE (total-factor energy efficiency) using an improved DEA (data envelopment analysis) model, which combines the super-efficiency and sequential DEA models to avoid the "discriminating power problem" and "technical regress", and then uses it to calculate the TEI (target for energy intensity). The REI (improvement potential in energy intensity) is calculated as the difference between the TEI and the actual level of energy intensity. In application, we calculate the REIs for different provinces under the metafrontier and group-frontier respectively, and their ratios give the technology gaps for energy use. The main result shows that China's REIs fluctuate around 21%, 7.5% and 12% for Eastern, Central and Western China respectively, and that Eastern China has the highest level of energy technology. These findings reveal that the energy intensities of China's provinces do not converge to the optimal level. Therefore, the target of energy-saving policy for regions should be enhancing the energy efficiency of the inefficient ones, thereby reducing the improvement gap in energy intensity across regions. - Highlights: • We present an improved DEA model to calculate the TFEE (total-factor energy efficiency). • The improved TFEE combines with a meta-frontier analysis. • We establish a new indicator for the improvement gap in energy intensity. • Improvement in energy intensity of regions in China is analysed
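
Once the frontier-implied target intensity (TEI) is known, the improvement potential is a simple gap calculation. The sketch below normalizes the gap by the actual intensity so it reads as a percentage, which is an assumption for illustration (the paper defines the REI as the difference itself); the numbers are invented:

```python
def improvement_potential(actual_intensity, target_intensity):
    """Gap between actual energy intensity and the frontier-implied target
    (TEI), expressed as a fraction of the actual level (one plausible
    normalization, not necessarily the paper's)."""
    return (actual_intensity - target_intensity) / actual_intensity

# Illustrative: actual intensity 1.2, DEA-derived target 0.95 (same units)
print(round(improvement_potential(1.2, 0.95), 3))  # 0.208
```

A province on the frontier would have target equal to actual and an improvement potential of zero; the regional averages around 21%, 7.5% and 12% reported above are aggregates of such per-province gaps.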

  14. An improved algorithm for model-based analysis of evoked skin conductance responses ☆

    OpenAIRE

    Bach, D R; Friston, K.J.; Dolan, R. J.

    2013-01-01

    Model-based analysis of psychophysiological signals is more robust to noise - compared to standard approaches - and may furnish better predictors of psychological state, given a physiological signal. We have previously established the improved predictive validity of model-based analysis of evoked skin conductance responses to brief stimuli, relative to standard approaches. Here, we consider some technical aspects of the underlying generative model and demonstrate further improvements. Most im...

  15. Vulnerability of assessing water resources by the improved set pair analysis

    Directory of Open Access Journals (Sweden)

    Yang Xiao-Hua

    2014-01-01

    Climate change under global warming has tremendously altered hydrological processes, and there are many uncertainties in assessing water resources vulnerability. To assess water resources vulnerability rationally under climate change, an improved set pair analysis model is established, in which set pair analysis theory is introduced and the weights are determined by the analytic hierarchy process (AHP) method. The index systems and criteria of water resources vulnerability assessment in terms of the water cycle, socio-economy and ecological environment are established based on an analysis of sensitivity and adaptability. The improved set pair analysis model is used to assess water resources vulnerability in Ningxia with twelve indexes under four future climate scenarios. The certain and uncertain information content of water resources vulnerability is calculated by connection numbers in the improved set pair analysis model. Results show that Ningxia has higher vulnerability under the climate change scenarios. Compared with fuzzy assessment models and artificial neural network models, the improved set pair analysis model can take full advantage of certain and uncertain knowledge and of subjective and objective information. The improved set pair analysis extends the vulnerability assessment models for water resources systems.
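
The aggregation of a connection number from AHP-weighted indexes can be sketched as follows; the grade assignments and weights are illustrative, not taken from the Ningxia case study:

```python
def connection_number(weights, grades):
    """Aggregate a connection number mu = a + b*i + c*j for set pair analysis.
    `grades` gives, per index, which component that index contributes to:
    0 = identity (a), 1 = discrepancy (b), 2 = contrary (c).
    This sketches only the aggregation step, with AHP-style weights."""
    comps = [0.0, 0.0, 0.0]
    for w, g in zip(weights, grades):
        comps[g] += w
    return tuple(comps)  # (a, b, c); a + b + c == 1 for normalized weights

# Three indexes with AHP weights 0.5/0.3/0.2, assessed against the criteria
a, b, c = connection_number([0.5, 0.3, 0.2], [0, 1, 2])
print(a, b, c)  # 0.5 0.3 0.2
```

Here `a` quantifies the certain "identical" information, `c` the certain "contrary" information, and `b` the uncertain part carried by the coefficient i.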

  16. An improvement of window factor analysis for resolution of noisy HPLC-DAD data

    Institute of Scientific and Technical Information of China (English)

    邵学广; 邵利民; 李梅青; 林祥钦

    2002-01-01

    Window factor analysis (WFA) is a powerful tool for analyzing evolutionary processes. However, it was found that window factor analysis is very sensitive to the noise in the original data matrix. An error analysis showed that the concentration profiles resolved by conventional window factor analysis are easily distorted by the noise retained by abstract factor analysis (AFA), and a modified algorithm for window factor analysis is proposed. Both simulated and experimental HPLC-DAD data were investigated by the conventional and the improved methods. Results show that the improved method yields less noise-distorted concentration profiles than the conventional method, and that the ability to resolve noisy data sets is greatly enhanced.
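
The AFA reconstruction step, whose residual noise motivates the improved algorithm, can be sketched as a truncated SVD on simulated two-component HPLC-DAD data (the simulation parameters are invented for illustration):

```python
import numpy as np

def afa_reconstruct(D, n_components):
    """Reproduce the data matrix from its first `n_components` abstract
    factors (truncated SVD): the AFA step whose retained noise the
    improved WFA algorithm is designed to handle."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    return U[:, :n_components] @ np.diag(s[:n_components]) @ Vt[:n_components]

# Simulated two-component elution data plus measurement noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
C = np.column_stack([np.exp(-((t - 0.4) / 0.1) ** 2),
                     np.exp(-((t - 0.6) / 0.1) ** 2)])   # concentration profiles
S = rng.random((2, 30))                                   # component spectra
D = C @ S + 0.01 * rng.standard_normal((50, 30))          # noisy data matrix
D2 = afa_reconstruct(D, 2)
print(np.linalg.norm(D2 - C @ S) < np.linalg.norm(D - C @ S))
```

The rank-2 reconstruction is closer to the noiseless signal than the raw data, but the noise projected onto the retained factors survives, and it is this residual that distorts the concentration profiles in conventional WFA.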

  17. Analysis and improvements of fringe jump corrections by electronics on the JET tokamak far infrared interferometer

    International Nuclear Information System (INIS)

    For the Tore Supra interferometer phase measurements, electronics had been developed using field programmable gate array processors. The embedded algorithm can correct fringe jumps. For comparison, the electronics ran at JET during the 2009 campaign. The first analysis concluded that the electronics was not correcting all the fringe jumps. An analysis of the failures led to improvements in the algorithm, which were tested during the rest of the campaign. In this article, we evaluate the increase in performance. From the analysis of the remaining faults, further improvements are discussed for the design of future boards foreseen for JET, using the second wavelength and the Cotton-Mouton effect information.

  18. Heterogeneous Multi core processors for improving the efficiency of Market basket analysis algorithm in data mining

    OpenAIRE

    L, Aashiha Priyadarshni.

    2014-01-01

    Heterogeneous multi-core processors can offer diverse computing capabilities, and they can improve the efficiency of the Market Basket Analysis algorithm. Market basket analysis utilises the Apriori algorithm and is one of the popular data mining algorithms; it can use the Map/Reduce framework to perform the analysis. The algorithm generates association rules based on transactional data, and Map/Reduce motivates to redesign and convert the existing sequentia...
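
The support-counting core of Apriori-style market basket analysis can be sketched in a few lines (one candidate level only, for item pairs; the Map/Reduce parallelization and rule generation are omitted, and the baskets are invented):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """One level of Apriori-style support counting for item pairs: count how
    often each 2-item combination occurs across the transactions, then keep
    those whose support (fraction of transactions) meets min_support."""
    counts = {}
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] = counts.get(pair, 0) + 1
    n = len(transactions)
    return {p: c / n for p, c in counts.items() if c / n >= min_support}

baskets = [["bread", "milk"], ["bread", "butter", "milk"],
           ["beer", "bread"], ["milk"]]
print(frequent_itemsets(baskets, 0.5))  # {('bread', 'milk'): 0.5}
```

In the Map/Reduce setting, the inner counting loop is the map phase over partitioned transactions and the aggregation of `counts` is the reduce phase, which is what makes the algorithm attractive for heterogeneous multi-core hardware.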

  19. Improvement of the computing speed of the FBR fuel pin bundle deformation analysis code 'BAMBOO'

    International Nuclear Information System (INIS)

    JNC has developed a coupled analysis system comprising a fuel pin bundle deformation analysis code, 'BAMBOO', and a thermal hydraulics analysis code, 'ASFRE-IV', for the purpose of evaluating the integrity of a subassembly under the BDI condition. This coupled analysis took much computation time because it needs convergent calculations to obtain numerically stationary solutions for thermal and mechanical behaviors. We improved the computation time of the BAMBOO code analysis to make the coupled analysis practicable. 'BAMBOO' is a FEM code, and as such its matrix calculations consume a large memory area to temporarily store intermediate results in the solution of simultaneous linear equations. The code used the hard disk drive (HDD) as virtual memory area to save random access memory (RAM) of the computer. However, the use of the HDD increased the computation time because input/output (I/O) processing with the HDD took much time in data accesses. We improved the code so that it conducts I/O processing only with the RAM in matrix calculations and runs in high-performance computers. This improvement considerably increased the CPU occupation rate during the simulation and reduced the total simulation time of the BAMBOO code to about one-seventh of that before the improvement. (author)

  20. Improvements in the gaseous hydrogen-water equilibration technique for hydrogen isotope ratio analysis

    Science.gov (United States)

    Coplen, T.B.; Wildman, J.D.; Chen, J.

    1991-01-01

    Improved precision in the H2-H2O equilibration method for δD analysis has been achieved in an automated system. Reduction of the 1-σ standard deviation of a single mass-spectrometer analysis to 1.3‰ is achieved by (1) bonding catalyst to glass rods and assigning use to specific equilibration chambers to monitor performance of catalyst, (2) improving the apparatus design, and (3) reducing the H3+ contribution of the mass-spectrometer ion source. For replicate analysis of a water sample, the standard deviation improved to 0.8‰. H2S-bearing samples and samples as small as 0.1 mL can be analyzed routinely with this method.
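
For reference, δD values are reported in per mil relative to a standard isotope ratio; a minimal sketch of the delta notation is below. The VSMOW D/H ratio shown is the commonly cited value and the sample ratio is invented, both for illustration only:

```python
def delta_D(ratio_sample, ratio_reference):
    """delta-D in per mil relative to a reference D/H ratio (e.g. VSMOW):
    deltaD = (R_sample / R_reference - 1) * 1000."""
    return (ratio_sample / ratio_reference - 1.0) * 1000.0

R_VSMOW = 155.76e-6        # commonly cited VSMOW D/H ratio
print(round(delta_D(150.0e-6, R_VSMOW), 1))  # about -37.0 per mil
```

The 1.3‰ figure quoted above is the 1-σ reproducibility of such a δD value for a single analysis.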

  1. Maintaining and improving of the training program on the analysis software in CMS

    International Nuclear Information System (INIS)

    Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.

  2. Preliminary analysis of the J-52 aircraft engine Component Improvement Program

    OpenAIRE

    Butler, Randall Scott

    1992-01-01

    Approved for public release; distribution is unlimited Increasing budgetary constraints have required program managers within the Naval Air Systems Command to justify their programs as never before. This thesis presents a preliminary analysis of the J-52 aircraft engine Component Improvement Program (CIP). The objectives of the research were to scrutinize the association of the CIP with promised improvements and benefits pertaining to the J-52 engine and to determine the obstacles that e...

  3. Daya Bay Nuclear Power Station outdoors electrical equipment pollution status analysis and improving

    International Nuclear Information System (INIS)

    Based on operating experience with the outdoor electrical equipment at the Guangdong Daya Bay Nuclear Power Station, and following the engineering technical standards applied in China, an analysis and assessment of pollution classes concluded that pollution class four is reasonable and that the creepage distance should be more than 3.5 cm/kV. Some improvements have already been implemented with good effect, and further improvement suggestions are made.

  4. An improved multiple linear regression and data analysis computer program package

    Science.gov (United States)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
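
The regression core that such a package wraps can be sketched with a double-precision least-squares solve; this is a generic illustration with invented data, not NEWRAP's actual implementation:

```python
import numpy as np

# Ordinary least squares in double precision: design matrix with an
# intercept column, response generated exactly by y = 1 + 2x.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

coef, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # [1. 2.]

# Residuals against the fitted values, one of the diagnostics the
# abstract lists (plots of residuals, t-statistics, ANOVA tables)
resid = y - X @ coef
print(float(resid @ resid) < 1e-12)  # True: exact fit
```

The double-precision arithmetic highlighted in the abstract matters because single-precision normal-equation solves of this kind lose accuracy quickly as the columns of X become correlated.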

  5. Waste Minimization Improvements Achieved Through Six Sigma Analysis Result In Significant Cost Savings

    International Nuclear Information System (INIS)

    Improved waste minimization practices at the Department of Energy's (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) are leading to a 15% reduction in the generation of hazardous and radioactive waste. Bechtel BWXT Idaho, LLC (BBWI), the prime management and operations contractor at the INEEL, applied the Six Sigma improvement process to the INEEL Waste Minimization Program to review existing processes and define opportunities for improvement. Our Six Sigma analysis team, composed of an executive champion, a process owner, a black belt and a yellow belt, and technical and business team members, used this statistics-based process approach to analyze work processes and produced ten recommendations for improvement. Recommendations ranged from waste generator financial accountability for newly generated waste to enhanced employee recognition programs for waste minimization efforts. These improvements have now been implemented to reduce waste generation rates and are producing positive results

  6. Improvement on reaction model for sodium-water reaction jet code and application analysis

    International Nuclear Information System (INIS)

    In selecting a reasonable DBL for the steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. An improvement of the sodium-water reaction (SWR) jet code (LEAP-JET ver. 1.30) and an application analysis against water injection tests, for confirmation of the code's propriety, were performed. In the improvement of the code, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on the calculation mesh size. Test calculations using the improved code (LEAP-JET ver. 1.40) were carried out under the conditions of the SWAT-3·Run-19 test and an actual-scale SG. It was confirmed that the computed SWR jet behavior and the influence of the model on the analysis results are reasonable. For the application analysis against the water injection tests, analyses of water injection behavior and SWR jet behavior in the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were performed using the LEAP-BLOW code and the LEAP-JET code. In the application analysis with the LEAP-BLOW code, a parameter survey study was performed; as a result, the injection nozzle diameter needed to simulate the water leak rate was confirmed. In the application analysis with the LEAP-JET code, the temperature behavior of the SWR jet was investigated. (author)

  7. Improvement on reaction model for sodium-water reaction jet code and application analysis

    Energy Technology Data Exchange (ETDEWEB)

    Itooka, Satoshi; Saito, Yoshinori [Hitachi Ltd., Nuclear Systems Division, Hitachi, Ibaraki (Japan); Okabe, Ayao; Fujimata, Kazuhiro; Murata, Shuuichi [Hitachi Engineering Co., Ltd., Nuclear Power Plant Engineering No.2 Dept., Hitachi, Ibaraki (Japan)

    2000-03-01

    In selecting a reasonable DBL for the steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. An improvement of the sodium-water reaction (SWR) jet code (LEAP-JET ver. 1.30) and an application analysis against water injection tests, for confirmation of the code's propriety, were performed. In the improvement of the code, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on the calculation mesh size. Test calculations using the improved code (LEAP-JET ver. 1.40) were carried out under the conditions of the SWAT-3·Run-19 test and an actual-scale SG. It was confirmed that the computed SWR jet behavior and the influence of the model on the analysis results are reasonable. For the application analysis against the water injection tests, analyses of water injection behavior and SWR jet behavior in the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were performed using the LEAP-BLOW code and the LEAP-JET code. In the application analysis with the LEAP-BLOW code, a parameter survey study was performed; as a result, the injection nozzle diameter needed to simulate the water leak rate was confirmed. In the application analysis with the LEAP-JET code, the temperature behavior of the SWR jet was investigated. (author)

  8. Improving the problem analysis in cost-benefit analysis for transport projects : an explorative study

    NARCIS (Netherlands)

    Annema, J.A.; Mouter, N.

    2013-01-01

    Key actors (consultants, scientists and policy makers) in the Netherlands transport policy cost-benefit analysis (CBA) practice consider the problem analysis to be one of the important substantive problems in CBA. Their idea is that a good-quality problem analysis can help to identify proper solutions, a

  9. Computer analysis of weight evolution – a method to improve the children population health

    OpenAIRE

    Corina Muşuroi; Doina Mot

    2006-01-01

    The improvement in the assessment of children's growth and development has been driven by the introduction of new methods for monitoring growth parameters and by the computer analysis of these data. The necessity of these new technologies arose because clinicians noticed that there are new types of data, in large amounts, which are difficult to analyze using traditional methods. So, in order to obtain the maximum information, the implementation of computer analysis and of ...

  10. Improved Principal Component Analysis and its Application in the Evaluation of the Industrial Structure

    OpenAIRE

    LI Fen-Hong

    2013-01-01

    In this study, an improved principal component analysis method is put forward to avoid the shortcomings of comprehensive evaluation based on the standard principal component analysis method. When the contribution rate of the first principal component falls short of what is required, we can rotate the factor loading matrix, select multiple principal components, and combine the variation coefficient and the variance contribution as weight coefficients to set up a comprehensive evaluation model. As an example of th...
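
One plausible reading of the weighting scheme (principal component scores weighted by their variance contributions, on standardized data) can be sketched as follows; the exact combination with the variation coefficient described in the abstract is not reproduced, and the data are random placeholders:

```python
import numpy as np

def composite_scores(X, n_components=2):
    """Composite evaluation sketch: compute principal component scores of
    standardized data and combine them with weights proportional to each
    retained component's variance contribution."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize indicators
    vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(vals)[::-1]                    # sort eigenpairs descending
    vals, vecs = vals[order], vecs[:, order]
    w = vals[:n_components] / vals[:n_components].sum()  # variance-contribution weights
    scores = Z @ vecs[:, :n_components]               # PC scores
    return scores @ w                                 # weighted composite score

rng = np.random.default_rng(1)
X = rng.random((8, 4))       # 8 evaluated units, 4 indicators (placeholder data)
s = composite_scores(X)
print(s.shape)  # (8,)
```

Units can then be ranked by the composite score; because the scores come from centered data, they average to zero across the evaluated units.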

  11. Improved Analysis of Co-Channel Interference in Cellular Communications Systems

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zu-fan; DU Hui-ping; ZHU Wei-le

    2005-01-01

    The performance of co-channel interference in cellular communications systems is studied in terms of the carrier-to-interference ratio. The approach is based on an improved analysis which, using exact geometrical analysis, accounts for the fact that some areas in the desired sector may not be interfered with by particular co-channel sectors, instead of assuming that the entire sector is interfered with by every co-channel sector. Other features, such as power control and the number of interferers, are also included.
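
For context, the standard first-tier hexagonal-geometry approximation of C/I, which the improved analysis refines with exact sector geometry, can be sketched as:

```python
import math

def ci_ratio_dB(cluster_size, path_loss_exponent=4.0, n_interferers=6):
    """First-tier co-channel C/I estimate for a hexagonal cellular layout:
    reuse ratio D/R = sqrt(3*N), and C/I ~ (D/R)^gamma / i0. This is the
    standard textbook approximation, not the paper's exact geometrical
    analysis."""
    q = math.sqrt(3.0 * cluster_size)        # co-channel reuse ratio D/R
    ci = (q ** path_loss_exponent) / n_interferers
    return 10.0 * math.log10(ci)

print(round(ci_ratio_dB(7), 1))  # 18.7 dB for N = 7, gamma = 4
```

The improved analysis effectively reduces the number of interferers seen by parts of the desired sector, so its C/I predictions are higher than this uniform six-interferer bound.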

  12. APPLICATION OF DYNAMIC SIMULATIONS IN THE ANALYSIS OF MEASURES FOR IMPROVING ENERGY EFFICIENCY OF BUILDINGS

    OpenAIRE

    DRAGICEVIC SNEZANA M.

    2016-01-01

One of the most commonly used methods for improving the energy performance of buildings is reducing heating energy consumption. This paper presents a comparative analysis of building energy demand for space heating, based on case studies in which the building envelope was modified with insulating materials and with different window types. For the analysis, a six-floor public building located in Belgrade was selected. For a dynamic simulation and evaluation of the applied ...

  13. Handbook of Soccer Match Analysis: A Systematic Approach to Improving Performance

    OpenAIRE

    Christopher Carling; Mark Williams, A; Thomas Reilly

    2006-01-01

DESCRIPTION This book explains soccer match analysis, looks at the very latest match analysis research, and surveys the innovative technologies used by professional clubs. This handbook also bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training ro...

  14. Vulnerability of assessing water resources by the improved set pair analysis

    OpenAIRE

    Yang Xiao-Hua; He Jun; Di Cong-Li; Li Jian-Qiang

    2014-01-01

Climate change and global warming have profoundly altered hydrological processes, and there are many uncertainties in assessing water resources vulnerability. To assess water resources vulnerability rationally under climate change, an improved set pair analysis model is established, in which set pair analysis theory is introduced and the weights are determined by the analytic hierarchy process method. The index systems and criteria of water resources ...

  15. Financial Statement Analysis in a Company and Proposals for Improvements of Financial Health

    OpenAIRE

    Mešťan, Marek

    2014-01-01

The aim of this thesis is to evaluate the financial situation of the company LUX, s. r. o. in the years 2009–2013 using selected methods of financial analysis, and to formulate proposals for solving problem areas. The thesis applies financial and strategic analysis, and the results are compared with recommended values and averages for the same sector. The final part of the thesis offers suggestions and recommendations for possible improvements of the company's financial health.

  16. Unifying Geometric Features and Facial Action Units for Improved Performance of Facial Expression Analysis

    OpenAIRE

    Ghayoumi, Mehdi; Bansal, Arvind K.

    2016-01-01

Previous approaches to modelling and analyzing facial expressions use three different techniques: facial action units, geometric features, and graph-based modelling. However, these techniques have been treated separately, even though they are interrelated. Facial expression analysis is significantly improved by exploiting the mappings between the major geometric features involved in facial expressions and the subset of facial action units whose presence or...

  17. Teaching Classroom Videorecording Analysis to Graduate Students: Strategies for Observation and Improvement

    Science.gov (United States)

    Cahalan, James M.

    2013-01-01

    Videorecording analysis can help improve the teaching of college literature and other subjects. Here, I concentrate on specific analytical strategies that I have been teaching my graduate students since 1994, and I cite my students (including their graphical charts) to illustrate what important lessons they have learned through careful study of…

  18. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…

  19. Commentaries to "The Vital Role of Operations Analysis in Improving Healthcare Delivery"

    OpenAIRE

    n/a

    2012-01-01

    This series of discussions presents commentaries on where the field of healthcare operations management is now and possible future research directions, expanding upon the key points raised by Green [Green LV (2012) The vital role of operations analysis in improving healthcare delivery. Manufacturing Service Oper. Management 14(4):488-494].

  20. Improved Detection of Time Windows of Brain Responses in Fmri Using Modified Temporal Clustering Analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

Temporal clustering analysis (TCA) has been proposed recently as a method to detect time windows of brain responses in functional MRI (fMRI) studies when the timing and location of the activation are completely unknown. Modifications to the TCA technique are introduced in this report to further improve the sensitivity in detecting brain activation.

  1. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    Science.gov (United States)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2016-06-01

Mine systems such as the ventilation system, strata support system, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is preferably done during the planning and design stage. However, existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach couples ET and FT modeling with a redundancy allocation technique. A concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system at either the component or system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal accident scenarios and improve the safety of complex mine systems simultaneously.
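The integrated ET/FT idea, computing a top hazard probability from a fault tree and propagating it through an event-tree accident sequence, can be sketched for independent basic events. All probabilities and subsystem names below are illustrative, not taken from the paper.

```python
# Fault tree gate arithmetic for independent basic events.
def or_gate(ps):
    """P(at least one input fails)."""
    p = 1.0
    for q in ps:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(ps):
    """P(all inputs fail) -- models redundancy."""
    p = 1.0
    for q in ps:
        p *= q
    return p

# Hypothetical FT for a "ventilation failure" top event:
# redundant fans (both must fail) OR power supply failure.
p_fan = and_gate([1e-2, 1e-2])
p_vent_fail = or_gate([p_fan, 5e-4])

# Event tree: initiating event "methane release", branching on safety systems.
p_initiating = 1e-3       # frequency of the initiating event (illustrative)
p_detector_fail = 2e-3    # methane detector fails on demand (illustrative)

# Accident sequence: release AND ventilation fails AND detection fails.
p_explosion_sequence = p_initiating * p_vent_fail * p_detector_fail
print(f"{p_explosion_sequence:.3e}")  # overall sequence probability
```

Allocating redundancy (e.g. a third fan in `and_gate`) and recomputing the top hazard probability reproduces, in miniature, the design-improvement loop the abstract describes.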

  2. Model extension and improvement for simulator-based software safety analysis

    International Nuclear Information System (INIS)

One of the major concerns when employing digital I&C systems in nuclear power plants is that digital systems may introduce new failure modes, which differ from those of previous analog I&C systems. Various techniques are under development to analyze hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most widely used techniques. However, these are static analysis methods that cannot capture dynamic behavior or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in (IEEE Std 7-4.3.2-2003, 2003. IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I&C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. Benchmarking against the ABWR SAR shows that this modified model is capable of dynamic, system-level software safety analysis and performs better than static methods. The improved plant simulation can be further applied to hazard analysis of operator/digital I&C interface interaction failures and to hardware-in-the-loop fault injection studies

  3. A study on the system improvement and systematization for the field analysis of ground water

    Energy Technology Data Exchange (ETDEWEB)

    Song, Duk Young; Park, Jin Tae; Kim, Sang Yeon; Choi, Byung In [Korea Inst. of Geology Mining and Materials, Taejon (Korea, Republic of)

    1995-12-01

Systematization and improvement of the field water analysis method have been performed. Pretreatment and accurate analytical methods have been established for volatile and unstable compounds in ground water, hot spring water, and mineral water. The accuracy of the analytical results was investigated by comparison among titrimetry, gravimetry, spectrometry, and other analytical tools used in field water analysis. For S{sup 2-}, Fe{sup +2} and NH{sub 3}-N, which are easily gasified and oxidized, the error and accuracy were examined as a function of the time elapsed since sampling. Standard deviations of spectrophotometric results were estimated using standard solutions, which improved the confidence level of the analytical results. The developed system is thus proven to be reliable and useful for systematic field analysis of waters. (author). 18 refs., 9 figs., 13 tabs.

  4. Powerplant productivity improvement study: policy analysis and incentive assessment. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    Policy options that the Illinois Commerce Commission might adopt in order to promote improved power plant productivity for existing units in Illinois are identified and analyzed. These policy options would generally involve either removing existing disincentives and/or adding direct incentives through the regulatory process. The following activities are reported: in-depth review of existing theoretical and empirical literature in the areas of power plant reliability, regulatory utility efficiency and performance incentives, and impacts of various regulatory mechanisms such as the Fuel Adjustment Clauses on productivity; contacts with other state public utility commissions known to be investigating or implementing productivity improvement incentive mechanisms; documentation and analysis of incentive mechanisms adopted or under consideration in other states; analysis of current regulatory practice in Illinois as it relates to power plant productivity incentives and disincentives; identification of candidate incentive mechanisms for consideration by the Illinois Commerce Commission; and analysis and evaluation of these candidates. 72 references, 8 figures.

  5. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses shortcomings in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. PMID:26851473
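For readers unfamiliar with TOPSIS, the core ranking step can be sketched in its crisp form. Note the paper uses a fuzzy variant over HFACS-derived error factors; the decision matrix, weights, and criteria below are invented for illustration.

```python
import numpy as np

def topsis_rank(D, weights, benefit):
    """Crisp TOPSIS: closeness of each alternative to the ideal solution.
    D: (m alternatives x n criteria); benefit[j] is True if larger is better."""
    N = D / np.linalg.norm(D, axis=0)           # vector-normalize each criterion
    V = N * weights                              # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)    # distance to ideal solution
    d_neg = np.linalg.norm(V - anti,  axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)               # closeness coefficient in [0, 1]

# four hypothetical error factors scored on three criteria (all benefit-type)
D = np.array([[7.0, 9.0, 9.0],
              [8.0, 7.0, 8.0],
              [9.0, 6.0, 8.0],
              [6.0, 7.0, 8.0]])
w = np.array([0.4, 0.3, 0.3])
cc = topsis_rank(D, w, benefit=np.array([True, True, True]))
print(np.argsort(cc)[::-1])  # factor indices ordered best to worst
```

In the fuzzy variant, crisp scores are replaced by fuzzy numbers (e.g. triangular) and distances by fuzzy distance measures, but the ideal/anti-ideal ranking logic is the same.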

  6. Menu Analysis for Improved Customer Demand and Profitability in Hospital Cafeterias.

    Science.gov (United States)

    Mann, Linda L.; MacInnis, Donna; Gardiner, Nicole

    1999-01-01

    Several sophisticated menu analysis methods have been compared in studies using theoretical restaurant menus. Institutional and especially hospital cafeterias differ from commercial restaurants in ways that may influence the effectiveness of these menu analysis methods. In this study, we compared three different menu analysis methods - menu engineering, goal value analysis, and marginal analysis in an institutional setting, to evaluate their relative effectiveness for menu management decision-making. The three methods were used to analyze menu cost and sales data for a representative cafeteria in a large metropolitan hospital. The results were compared with informal analyses by the manager and an employee to determine accuracy and value of information for decision-making. Results suggested that all three methods would improve menu planning and pricing, which in turn would enhance customer demand (revenue) and profitability. However, menu engineering was ranked the easiest of the three methods to interpret. PMID:11844400

  7. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Christopher Carling

    2006-03-01

    Full Text Available DESCRIPTION This book addresses and appropriately explains the soccer match analysis, looks at the very latest in match analysis research, and at the innovative technologies used by professional clubs. This handbook is also bridging the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a prepared manual on soccer match analysis in general for coaches and sport scientists. Thus, the professionals in this field would gather objective data on the players and the team, which in turn could be used by coaches and players to learn more about performance as a whole and gain a competitive advantage as a result. The book efficiently meets these objectives. AUDIENCE The book is targeted the athlete, the coach, the sports scientist professional or any sport conscious person who wishes to analyze relevant soccer performance. The editors and the contributors are authorities in their respective fields and this handbook depend on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  8. Reliability improvement of robotics systems: Analysis, design and real time supervision

    International Nuclear Information System (INIS)

Reliability improvement of robotic systems is a key issue for automation and autonomy in maintenance and intervention tasks in hostile environments. Constraints in hostile environments require a different way of using and programming robots compared with industrial applications. To take maximum benefit of robot technology, the level of confidence in the robotic tool must be much higher than in the classical production world. Applying reliability engineering in combination with strong knowledge of robot technology leads to such confidence. In this paper, three aspects are considered and developed as tools to be used at different stages of this improvement. The first is the analysis of the reliability of robotic and remote handling systems in general, to identify failure modes, their effects on the system, sensitive components, and redundancy needs; tools such as Failure Modes, Effects and Criticality Analysis and Fault Tree Analysis are presented. The second deals with design criteria for new robot systems, or improvement of existing ones, using reliability- and safety-driven design concepts; such concepts apply to mechanical, electrical, and electronic design, including the robot's computer controller. The last aspect is the real-time monitoring of the availability of functions and safety levels, as well as failure detection in the various subsystems composing a robot device; techniques of supervision by safety-check subroutines are considered. Experience with this improvement process for robotics used in the maintenance of fusion machines is discussed. (author). Figs

  9. Improved methodology for integral analysis of advanced reactors employing passive safety

    Science.gov (United States)

    Muftuoglu, A. Kursad

After four decades of experience with pressurized water reactors, a new generation of nuclear plants is emerging. These advanced designs employ passive safety, which relies on natural forces such as gravity and natural circulation. The new concept of passive safety also necessitates improvement in the computational tools available for best-estimate analyses. The system codes originally designed for high-pressure conditions in the presence of strong momentum sources such as pumps are challenged in many ways, and increased interaction of the primary system with the containment necessitates a tool for integral analysis. This study addresses some of these concerns and presents an improved tool for integral analysis coupling the primary system with the containment calculation. The code package is based on the RELAP5 and CONTAIN programs: a best-estimate thermal-hydraulics code for primary system analysis and a containment code, respectively. Its suitability is demonstrated with a postulated small-break loss-of-coolant accident analysis of the Westinghouse AP600 plant. The thesis explains the details of the analysis, including the coupling model.

  10. An improved path flux analysis with multi generations method for mechanism reduction

    Science.gov (United States)

    Wang, Wei; Gou, Xiaolong

    2016-03-01

    An improved path flux analysis with a multi generations (IMPFA) method is proposed to eliminate unimportant species and reactions, and to generate skeletal mechanisms. The production and consumption path fluxes of each species at multiple reaction paths are calculated and analysed to identify the importance of the species and of the elementary reactions. On the basis of the indexes of each reaction path of the first, second, and third generations, the improved path flux analysis with two generations (IMPFA2) and improved path flux analysis with three generations (IMPFA3) are used to generate skeletal mechanisms that contain different numbers of species. The skeletal mechanisms are validated in the case of homogeneous autoignition and perfectly stirred reactor of methane and n-decane/air mixtures. Simulation results of the skeletal mechanisms generated by IMPFA2 and IMPFA3 are compared with those obtained by path flux analysis (PFA) with two and three generations, respectively. The comparisons of ignition delay times, final temperatures, and temperature dependence on flow residence time show that the skeletal mechanisms generated by the present IMPFA method are more accurate than those obtained by the PFA method, with almost the same number of species under a range of initial conditions. By considering the accuracy and computational efficiency, when using the IMPFA (or PFA) method, three generations may be the best choice for the reduction of large-scale detailed chemistry.
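The production/consumption flux bookkeeping that PFA-type reduction methods build on can be sketched for a single first-generation coefficient. This is a simplified toy version: the reaction data are invented, and the paper's exact index definitions for the second and third generations are not reproduced.

```python
def pfa_first_generation(reactions, species_of_interest):
    """First-generation PFA-style interaction coefficients (production side).
    reactions: list of (net_rate, reactants, products) with positive net rates.
    Returns r[B] = P_AB / max(P_A, C_A): the share of A's total flux that
    flows through reactions also involving species B."""
    P = {}     # total production flux of each species
    C = {}     # total consumption flux of each species
    P_AB = {}  # production flux of A via reactions that also involve B
    for rate, reactants, products in reactions:
        for a in products:
            P[a] = P.get(a, 0.0) + rate
            for b in reactants:
                P_AB[(a, b)] = P_AB.get((a, b), 0.0) + rate
        for a in reactants:
            C[a] = C.get(a, 0.0) + rate
    A = species_of_interest
    denom = max(P.get(A, 0.0), C.get(A, 0.0))
    return {b: flux / denom
            for (a, b), flux in P_AB.items() if a == A and denom > 0}

# toy mechanism: F -> A (fast), G -> A (slow), A -> H
rxns = [(0.9, ["F"], ["A"]), (0.1, ["G"], ["A"]), (1.0, ["A"], ["H"])]
print(pfa_first_generation(rxns, "A"))  # {'F': 0.9, 'G': 0.1}
```

Species whose coefficients fall below a chosen threshold are candidates for elimination from the skeletal mechanism; higher generations extend the same idea along multi-step reaction paths.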

  11. Sensitivity analysis to improve the gap conductance uncertainty for KINS-REM

    International Nuclear Information System (INIS)

KINS has been using the Best Estimate Plus Uncertainty (BEPU) methodology to analyze the LBLOCA, the design basis accident of the emergency core cooling system (ECCS). KINS-REM (Realistic Evaluation Methodology) is the methodology currently used for LBLOCA analysis and has been improved continuously. One important improvement concerns the uncertainty parameters related to fuel rod behavior during a LBLOCA. The effect of thermal conductivity degradation (TCD) of the fuel rod has been studied for inclusion in KINS-REM. For this purpose, a sensitivity analysis was performed in this study to improve the gap conductance uncertainty parameter. The OPR1000 plant, Hanul units 3 and 4, was selected as the reference plant, and LBLOCA transient calculations were performed with MARS-KS. As the method of uncertainty quantification for gap conductance, control of the cladding roughness parameter (B) was replaced by control of a global variable, the effective gap conductance (hg), which is a physically more reasonable approach. The sensitivity analysis was performed on the uncertainty multiplication coefficient of hg corresponding to the previous uncertainty range of the cladding roughness parameter B in PCT calculations of the LBLOCA. Comparison and analysis of the PCT values and behavior trends for blowdown and reflood show that an uncertainty range of 0.66 to 2.34, with a mean value of 1.5, for the multiplication coefficient of the global variable hg is reasonable to replace the local cladding roughness variable

  12. An Improved Biclustering Algorithm and Its Application to Gene Expression Spectrum Analysis

    Institute of Scientific and Technical Information of China (English)

    Hua Qu; Liu-Pu Wang; Yan-Chun Liang; Chun-Guo Wu

    2005-01-01

The Cheng and Church algorithm is an important biclustering approach. In this paper, the space-extension process in the second stage of the Cheng and Church algorithm is improved, and the selection of two important parameters is discussed. Results from applying the improved algorithm to gene expression profile analysis show that, compared with the original Cheng and Church algorithm, the quality of the clustering results is enhanced obviously, the mined expression models are better, and the data show strong consistency with fluctuation across conditions, while the computational time does not increase significantly.
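The quality measure at the heart of the Cheng and Church algorithm is the mean squared residue of a candidate bicluster, which both stages of the algorithm try to keep below a threshold. A minimal sketch of that measure (the improved space-extension step itself is not reproduced):

```python
import numpy as np

def mean_squared_residue(A):
    """Cheng-Church mean squared residue H of a submatrix A:
    H = mean over (i, j) of (a_ij - row_mean_i - col_mean_j + overall_mean)^2.
    A perfectly coherent (additive-pattern) bicluster has H = 0."""
    row = A.mean(axis=1, keepdims=True)   # per-gene means
    col = A.mean(axis=0, keepdims=True)   # per-condition means
    resid = A - row - col + A.mean()
    return float((resid ** 2).mean())

# additive pattern: each row is a shifted copy of the other -> residue 0
B = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
print(mean_squared_residue(B))  # 0.0
```

Node deletion removes rows/columns that raise H; node addition (the extended-space stage) adds rows/columns whose inclusion keeps H below the threshold.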

  13. Transition towards improved regional wood flows by integrating material flux analysis and agent analysis. The case of Appenzell Ausserrhoden, Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Claudia R.; Hofer, Christoph; Wiek, Arnim; Scholz, Roland W. [Environmental Sciences, Natural and Social Science Interface, Swiss Federal Institute of Technology, ETH Zentrum, HAD, Haldenbachstr. 44, CH-8092 Zurich (Switzerland)

    2004-05-10

    This paper discusses the integration of material flux analysis and agent analysis as the basis for a transition towards improved regional wood management in Appenzell Ausserrhoden (AR), a small Swiss canton located in the Pre-Alps of Switzerland. We present a wood flow analysis for forests, wood processing industries and consumption in AR, accounting for different wood products. We find that the forest is currently significantly underutilized although there are sizeable imports of wood and fuel to this small region. The underutilization of the forest contributes to a skewed age distribution, jeopardizing long-term sustainable development of the forest, as the fulfillment of its protective and production function are likely to be at risk. The wood resources, however, are capable of satisfying current wood demand among the population of AR and wood could even be exported. Underutilization has two main causes: first, wood prices are so low that harvesting trees is a money-losing proposition; second, consumer wood demand and the current supply from forest owners are not aligned. Furthermore, cultural values, lifestyle trends and traditions make an alignment of supply and demand difficult. Consensus and strategy building with the relevant stakeholders on the basis of the results obtained from the wood flow analysis and agent analysis is a reasonable next step to take. We conclude that wood flow analysis combined with agent analysis provide a useful and straightforward tool to be used as the basis of a transition process towards improved regional wood flows, which in turn should contribute to sustainable forest management.

  14. Transition towards improved regional wood flows by integrating material flux analysis and agent analysis. The case of Appenzell Ausserrhoden, Switzerland

    International Nuclear Information System (INIS)

    This paper discusses the integration of material flux analysis and agent analysis as the basis for a transition towards improved regional wood management in Appenzell Ausserrhoden (AR), a small Swiss canton located in the Pre-Alps of Switzerland. We present a wood flow analysis for forests, wood processing industries and consumption in AR, accounting for different wood products. We find that the forest is currently significantly underutilized although there are sizeable imports of wood and fuel to this small region. The underutilization of the forest contributes to a skewed age distribution, jeopardizing long-term sustainable development of the forest, as the fulfillment of its protective and production function are likely to be at risk. The wood resources, however, are capable of satisfying current wood demand among the population of AR and wood could even be exported. Underutilization has two main causes: first, wood prices are so low that harvesting trees is a money-losing proposition; second, consumer wood demand and the current supply from forest owners are not aligned. Furthermore, cultural values, lifestyle trends and traditions make an alignment of supply and demand difficult. Consensus and strategy building with the relevant stakeholders on the basis of the results obtained from the wood flow analysis and agent analysis is a reasonable next step to take. We conclude that wood flow analysis combined with agent analysis provide a useful and straightforward tool to be used as the basis of a transition process towards improved regional wood flows, which in turn should contribute to sustainable forest management

  15. Construction Delay Analysis Techniques—A Review of Application Issues and Improvement Needs

    Directory of Open Access Journals (Sweden)

    Nuhu Braimah

    2013-07-01

    Full Text Available The time for performance of a project is usually of the essence to the employer and the contractor. This has made it quite imperative for contracting parties to analyse project delays for purposes of making right decisions on potential time and/or cost compensation claims. Over the years, existing delay analysis techniques (DATs for aiding this decision-making have been helpful but have not succeeded in curbing the high incidence of disputes associated with delay claims resolutions. A major source of the disputes lies with the limitations and capabilities of the techniques in their practical use. Developing a good knowledge of these aspects of the techniques is of paramount importance in understanding the real problematic issues involved and their improvement needs. This paper seeks to develop such knowledge and understanding (as part of a wider research work via: an evaluation of the most common DATs based on a case study, a review of the key relevant issues often not addressed by the techniques, and the necessary improvements needs. The evaluation confirmed that the various techniques yield different analysis results for the same delay claims scenario, mainly due to their unique application procedures. The issues that are often ignored in the analysis but would also affect delay analysis results are: functionality of the programming software employed for the analysis, resource loading and levelling requirements, resolving concurrent delays, and delay-pacing strategy. Improvement needs by way of incorporating these issues in the analysis and focusing on them in future research work are the key recommendations of the study.

  16. Decaying dark matter: a stacking analysis of galaxy clusters to improve on current limits

    OpenAIRE

    Combet, C.; Maurin, D.; Nezri, E.; Pointecouteau, E.; Hinton, J. A.; R White

    2012-01-01

We show that a stacking approach to galaxy clusters can improve current limits on decaying dark matter by a factor $\gtrsim 5$–$100$, with respect to a single source analysis, for all-sky instruments such as Fermi-LAT. Based on the largest sample of X-ray-selected galaxy clusters available to date (the MCXC meta-catalogue), we provide all the astrophysical information, in particular the astrophysical term for decaying dark matter, required to perform an analysis with current instruments.

  17. Next generation sequencing: Improved resolution for paternal/maternal duos analysis.

    Science.gov (United States)

    Ma, Yan; Kuang, Jin-Zhi; Nie, Tong-Gang; Zhu, Wei; Yang, Zhi

    2016-09-01

When two mismatches are observed in alleged parent-offspring pairs, there is doubt as to whether the putative parent is excluded or two mutations have occurred. Here, we report on four cases with two mismatches in paternal/maternal duos based on capillary electrophoresis (CE) results. The next generation sequencing (NGS) results were compared with 20 autosomal STRs derived from previous CE-based analysis. In summary, the NGS samples offered comprehensive information on different types of markers that can improve the resolution of paternal/maternal duo analysis. PMID:27347656

  18. Improvement and analysis of ID3 algorithm in decision-making tree

    Science.gov (United States)

    Xie, Xiao-Lan; Long, Zhen; Liao, Wen-Qi

    2015-12-01

The cooperative system under development needs spatial analysis and related data mining technology to detect subject conflict and redundancy, and ID3 is an important data mining algorithm. Because the logarithmic part of the traditional ID3 decision-tree algorithm is rather complicated, this paper derives a new computational formula for information gain by optimizing that logarithmic part. Experimental comparison and theoretical analysis show that the IID3 (Improved ID3) algorithm achieves higher computational efficiency and accuracy and is thus worth popularizing.
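The quantity whose logarithmic computation IID3 optimizes is the standard ID3 information gain; the paper's approximation of the log part is not reproduced here, only the baseline formula Gain(S, A) = H(S) − Σ_v |S_v|/|S| · H(S_v):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(S) of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr_index, labels):
    """ID3 gain of splitting on attribute attr_index."""
    n = len(labels)
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(part) / n * entropy(part)
                    for part in partitions.values())
    return entropy(labels) - remainder

# toy data: attribute 0 perfectly predicts the label, attribute 1 does not
rows = [("sunny", "hot"), ("sunny", "cool"), ("rain", "hot"), ("rain", "cool")]
labels = ["no", "no", "yes", "yes"]
print(information_gain(rows, 0, labels))  # 1.0
print(information_gain(rows, 1, labels))  # 0.0
```

ID3 grows the tree by splitting on the attribute with the highest gain at each node; it is these repeated `log2` evaluations that the IID3 optimization targets.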

  19. Methodological aspects of development of new instrumental methods of analysis and improvement of known ones

    International Nuclear Information System (INIS)

Consideration is given to the capabilities of instrumental methods of analysis, such as precision measurement of natural isotope ratios of light elements from the gaseous phase; piezoquartz microweighing; probe methods of analysis in spark mass spectroscopy; and extraction atomic emission spectroscopy with inductively coupled plasma. Predictions are given for the further development of these methods and the improvement of their analytical characteristics: increased sensitivity, accuracy, and speed. Extension of their fields of application is forecast as well. 20 refs.; 7 figs.; 2 tabs

  20. Computed tomography as a core analysis tool: Applications, instrument evaluation, and image improvement techniques

    International Nuclear Information System (INIS)

    In recent years, the use of computerized tomography (CT) to characterize two-phase fluid flow through porous media has become increasingly popular. This paper describes a different application of CT: it use as a core analysis tool. The advantages and disadvantages of the different technological generations of commercial medical CT scanners available as core analysis instruments are also discussed. Additionally, methods are presented for improving images and reducing CT-number errors inherent in the scanning of high-density rock samples on instruments whose software was designed for the scanning of the low-density human patients

  1. Improved GPS travelling wave fault locator for power cables by using wavelet analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, W.; Song, Y.H.; Chen, W.R. [Brunel Univ., Dept. of Electronics and Computer Engineering, Uxbridge (United Kingdom)

    2001-06-01

    The paper proposes an improved approach to cable-fault location, essentially based on a synchronised sampling technique, wavelet analysis and the travelling-wave principle. After an outline of the new scheme and a brief introduction to the three major techniques, wavelet analysis of fault transient waveforms is conducted in detail to determine the best wavelet levels for this particular application. A 400 kV underground cable system simulated with the Alternative Transients Program (ATP) under various system and fault conditions is then used to evaluate the approach fully. Numerical results show that the scheme is reliable and accurate, with errors of less than 2% of the length of the cable line. (Author)
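As a rough illustration of the underlying idea, not the authors' scheme, even a single-level Haar detail transform localizes the abrupt arrival of a travelling-wave front in a sampled record (the signal and all names here are invented):

```python
import numpy as np

def haar_detail(signal):
    """One level of the Haar wavelet transform; the detail coefficients
    highlight abrupt changes such as a travelling-wave front."""
    s = np.asarray(signal, dtype=float)
    s = s[: len(s) // 2 * 2]          # truncate to even length
    return (s[0::2] - s[1::2]) / np.sqrt(2)

def arrival_index(signal):
    """Sample index (in the original record) of the largest detail
    coefficient, taken as the wavefront arrival instant."""
    d = haar_detail(signal)
    return 2 * int(np.argmax(np.abs(d)))

# Synthetic record: flat signal with a step 'wavefront' at sample 301
x = np.zeros(1024)
x[301:] = 1.0
print(arrival_index(x))  # → 300 (start of the pair containing the front)
```

In the actual scheme, arrival times found this way at both cable ends, GPS-synchronised, are converted to a distance via the wave propagation velocity.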

  2. Thermal Analysis in Gas Insulated Transmission Lines Using an Improved Finite-Element Model

    Directory of Open Access Journals (Sweden)

    Ling LI

    2013-01-01

    Full Text Available In this paper, an improved finite-element (FE) model is proposed to investigate the temperature distribution in gas-insulated transmission lines (GILs). The Joule losses obtained from an eddy-current field analysis are indirectly coupled into the fluid and thermal fields. Unlike traditional methods, the air surrounding the GIL is included in the model to avoid assuming a constant convective heat-transfer coefficient; a multiple-species transport technique is therefore employed to handle two fluid types in a single model. In addition, the temperature-dependent electrical and thermal properties of the materials are considered. Steady-state and transient thermal analyses of the GIL are performed separately with the improved model, and the resulting temperature distributions are compared with experimental results reported in the literature.

  3. Improvements in in-bay irradiated fuel inspection planning and analysis at Bruce Power

    International Nuclear Information System (INIS)

    This paper describes improvements in irradiated fuel inspection planning and analysis implemented at Bruce Power since 2012. A review of inspection plans and fuel performance reports since 2001 identified significant variations in how irradiated fuel bundles were selected for inspection from year-to-year. A series of inspection tasks was established in an inspection logic and technical basis document. Inspection objectives and bundle selection criteria were defined for each task. These requirements, along with resource availability are now used to prepare a fuel inspection plan each year. The inspection results are then considered in the context of the analysis objectives for each task. The inspection results are presented in brief monthly updates and in-depth semi-annual reports in addition to the Annual Fuel Performance Reports. These changes have improved the effectiveness, consistency and efficiency of Bruce Power’s fuel performance monitoring. (author)

  4. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    Energy Technology Data Exchange (ETDEWEB)

    Park, Dae Woong [Korea Testing and Research Institute, Kwachun (Korea, Republic of)

    2015-03-15

    A centrifuge works on the principle that particles with different densities separate at a rate proportional to the centrifugal force during high-speed rotation. Dense particles are precipitated quickly, and particles with relatively smaller densities are precipitated more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process, and improving its sludge conveyance efficiency is a core technology. In this study, a smoothed-particle hydrodynamic analysis was performed for a decanter centrifuge used to convey sludge in order to evaluate the efficiency improvement. The analysis was applied both to the original centrifuge model and to the design-change model, a ball-plate rail model, to evaluate the sludge transfer efficiency.

  5. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    International Nuclear Information System (INIS)

    A centrifuge works on the principle that particles with different densities separate at a rate proportional to the centrifugal force during high-speed rotation. Dense particles are precipitated quickly, and particles with relatively smaller densities are precipitated more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process, and improving its sludge conveyance efficiency is a core technology. In this study, a smoothed-particle hydrodynamic analysis was performed for a decanter centrifuge used to convey sludge in order to evaluate the efficiency improvement. The analysis was applied both to the original centrifuge model and to the design-change model, a ball-plate rail model, to evaluate the sludge transfer efficiency.

  6. Improved analysis of differential rotation parameters of active longitudes of solar x-ray flares

    International Nuclear Information System (INIS)

    Complete text of publication follows. There is increasing evidence that various manifestations of solar activity are non-axisymmetric and mainly occur in two preferred longitude ranges, so called active longitudes. We have earlier analyzed the longitudinal occurrence of solar X-ray flares observed by GOES satellites using a specially developed dynamic, differentially rotating coordinate system. In this frame, the longitude distribution shows two persistent preferred longitudes separated by about 180 degrees whose strength alternates in time, similarly to the so called flip-flop phenomenon. Here we make an improved statistical analysis to find the globally best fitting values for the parameters describing the differential rotation of active longitudes. We find that the revised analysis gives a more consistent set of parameters, e.g., for the different classes of X-ray flares. Also, the improved parameters yield a higher level of non-axisymmetry for the longitudinal distribution, thus increasing evidence for the existence of active longitudes.

  7. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    Science.gov (United States)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool that lets the business leader of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit, a problem that has triggered many complaints from patients. After deploying this tool, the team found that the pharmacy unit lacks storage and a capsule-packing tool for processing medicine, a condition that has caused much wasted time in the process. The team therefore proposed that the business leader procure the required tools in order to shorten the process. This research shortened the lead time from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, and increased the %VA (value-added activity), or Process Cycle Efficiency (PCE), from 66% to 68% (considered lean because it is higher than the required 30%). This result shows that process effectiveness has been increased by the improvement.

  8. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    International Nuclear Information System (INIS)

    Value stream mapping is a tool that lets the business leader of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit, a problem that has triggered many complaints from patients. After deploying this tool, the team found that the pharmacy unit lacks storage and a capsule-packing tool for processing medicine, a condition that has caused much wasted time in the process. The team therefore proposed that the business leader procure the required tools in order to shorten the process. This research shortened the lead time from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, and increased the %VA (value-added activity), or Process Cycle Efficiency (PCE), from 66% to 68% (considered lean because it is higher than the required 30%). This result shows that process effectiveness has been increased by the improvement.

  9. Combined Analysis of Cortical (EEG) and Nerve Stump Signals Improves Robotic Hand Control

    OpenAIRE

    Tombini, Mario; Rigosa, Jacopo; Zappasodi, Filippo; Porcaro, Camillo; Citi, Luca; Carpaneto, Jacopo; Rossini, Paolo Maria; Micera, Silvestro

    2012-01-01

    Background. Interfacing an amputee's upper-extremity stump nerves to control a robotic hand requires training of the individual and algorithms to process interactions between cortical and peripheral signals. Objective. To evaluate for the first time whether EEG-driven analysis of peripheral neural signals as an amputee practices could improve the classification of motor commands. Methods. Four thin-film longitudinal intrafascicular electrodes (tf-LIFEs-4) were implanted in the median and ulna...

  10. Effectiveness of Cognitive and Transactional Analysis Group Therapy on Improving Conflict-Solving Skill

    OpenAIRE

    Bahram A. Ghanbari-Hashemabadi; Raheleh Maddah-Shoorcheh; ZahraVafaei-Jahan; Mostafa Bolghanabadi

    2012-01-01

    Background: Today, learning the communication skills such as conflict solving is very important. The purpose of the present study was to investigate the efficiency of cognitive and transactional analysis group therapy on improving the conflict-solving skill.Materials and Method: This study is an experimental study with pretest-posttest and control group. Forty-five clients who were referring to the counseling and psychological services center of Ferdowsi University of Mashhad were chosen base...

  11. Analysis of Improvement on Human Resource Management within Chinese Enterprises in Economic Globalization

    OpenAIRE

    Lihui Xie; Dasong Deng; Xifa Liu

    2013-01-01

    In this study, we analyze the improvement of human resource management within Chinese enterprises under economic globalization. China's entry into the WTO has accelerated the pace of economic globalization for Chinese enterprises, and the Chinese economy is further integrated with the global economy on a global scale. Human resources are what the economic globalization of Chinese enterprises relies on, the first resource for China to participate in international competition, and also the key to make effectiv...

  12. Conference Report: Improving Skills: Evidence from Secondary Analysis of International Surveys

    OpenAIRE

    WEBER ANKE; MOUTHAAN MELISSA

    2013-01-01

    The Improving Skills conference, which took place November 15-16, 2012 in Cyprus, was organised by the European Commission, DG Education and Culture (DG EAC), in close cooperation with the Cypriot Presidency and with the input from CRELL. The aim of the conference was to generate and disseminate knowledge derived from recent secondary analysis of large scale international surveys and assessments such as PISA, TIMSS, ICCS, ESLC and PIRLS. Participants of the conference discussed policy...

  13. A social work study on the effect of transactional analysis on the improvement of intimacy attitude

    OpenAIRE

    Parvin Gol; Kiiumars Farahbakhsh; Fatemeh Rezaei

    2013-01-01

    The purpose of this paper is to investigate the impact of group counseling using transactional analysis on the improvement of intimacy attitude in some depressed patients in city of Esfahan, Iran. In this paper, semi-experimental design with pretest posttest control groups was conducted among 30 patients. The sample was selected through available sampling method among the depressed patients referred to psychiatric centers. They were randomly assigned into experimental and control groups. The ...

  14. Ion beam analysis with external beams: Recent set-up improvements

    International Nuclear Information System (INIS)

    Accelerator-based analytical techniques using external beams are ideally fitted to the study of works of art because of their fully non-destructive character. However, accurate quantitative analysis is not straightforward, due in particular to difficult beam monitoring. Significant improvements have been progressively made on the external beam line of the IBA facility of the Louvre museum in order to increase the accuracy and to conduct combined analyses with different IBA techniques

  15. Economic Analysis of Cost-Effectiveness of Community Engagement to Improve Health

    OpenAIRE

    Andrew Street; Roy Carr-Hill

    2008-01-01

    Liberty of association is one of the building blocks of a democratic society, and presumes that community engagement in a democratic society is universally a good thing. This presumption is not subject to economic analysis, but the issue considered here is whether community engagement is a better vehicle for improving the community’s health than another approach. The problems of applying the standard framework of economic evaluation to consider this issue include: multiple perspectives and ti...

  16. Further development of the ultrasonic testing method for improving the detection and analysis of corrosion cracks

    International Nuclear Information System (INIS)

    Defect detection and analysis can be improved by applying the ultrasonic multifrequency technique, which is based on the principle that each defect type reveals two characteristic domains during ultrasonic testing, i.e. scattering and reflection. The transition frequency range ft, in which the response changes from scattering to reflection, is a major characteristic value identifying the defect size, which in the case of corrosion cracks is directly proportional to the crack depth. (orig.)

  17. Protein cleavage strategies for an improved analysis of the membrane proteome

    OpenAIRE

    Poetsch Ansgar; Fischer Frank

    2006-01-01

    Abstract Background Membrane proteins still remain elusive in proteomic studies. This is in part due to the distribution of the amino acids lysine and arginine, which are less frequent in integral membrane proteins and almost absent in transmembrane helices. As these amino acids are cleavage targets for the commonly used protease trypsin, alternative cleavage conditions, which should improve membrane protein analysis, were tested by in silico digestion for the three organisms Saccharomyces ce...

  18. Improved breath alcohol analysis with use of carbon dioxide as the tracer gas

    OpenAIRE

    Kaisdotter Andersson, Annika

    2010-01-01

    State-of-the-art breath analysers require a prolonged expiration into a mouthpiece to obtain the accuracy required for evidential testing and screening of the alcohol concentration. This requirement is unsuitable for breath analysers used as alcolock owing to their frequent use and the fact that the majority of users are sober drivers; as well as for breath testing in uncooperative persons. This thesis presents a method by which breath alcohol analysis can be improved, using carbon dioxide (C...

  19. Improvement of safety by analysis of costs and benefits of the system

    OpenAIRE

    Karkoszka, T.; M. Andraczke

    2011-01-01

    Purpose: of the paper has been the assessment of the dependence between improvement of the implemented occupational health and safety management system and both minimization of costs connected with occupational health and safety assurance and optimization of real work conditions. Design/methodology/approach: used for the analysis has included definition of the occupational health and safety system with regard to the rules and tools allowing for occupational safety assurance in the organisationa...

  20. Improving SFR Economics through Innovations from Thermal Design and Analysis Aspects

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Hongbin Zhang; Vincent Mousseau; Per F. Peterson

    2008-06-01

    Achieving economic competitiveness as compared to LWRs and other Generation IV (Gen-IV) reactors is one of the major requirements for large-scale investment in commercial sodium cooled fast reactor (SFR) power plants. Advances in R&D for advanced SFR fuel and structural materials provide key long-term opportunities to improve SFR economics. In addition, other new opportunities are emerging to further improve SFR economics. This paper provides an overview on potential ideas from the perspective of thermal hydraulics to improve SFR economics. These include a new hybrid loop-pool reactor design to further optimize economics, safety, and reliability of SFRs with more flexibility, a multiple reheat and intercooling helium Brayton cycle to improve plant thermal efficiency and reduce safety related overnight and operation costs, and modern multi-physics thermal analysis methods to reduce analysis uncertainties and associated requirements for over-conservatism in reactor design. This paper reviews advances in all three of these areas and their potential beneficial impacts on SFR economics.

  1. Improved enteral tolerance following step procedure: systematic literature review and meta-analysis.

    Science.gov (United States)

    Fernandes, Melissa A; Usatin, Danielle; Allen, Isabel E; Rhee, Sue; Vu, Lan

    2016-10-01

    Surgical management of children with short bowel syndrome (SBS) changed with the introduction of the serial transverse enteroplasty procedure (STEP). We conducted a systematic review and meta-analysis using MEDLINE and SCOPUS to determine if children with SBS had improved enteral tolerance following STEP. Studies were included if information about a child's pre- and post-STEP enteral tolerance was provided. A random effects meta-analysis provided a summary estimate of the proportion of children with enteral tolerance increase following STEP. From 766 abstracts, seven case series involving 86 children were included. Mean percent tolerance of enteral nutrition improved from 35.1 to 69.5. Sixteen children had no enteral improvement following STEP. A summary estimate showed that 87 % (95 % CI 77-95 %) of children who underwent STEP had an increase in enteral tolerance. Compilation of the literature supports the belief that SBS subjects' enteral tolerance improves following STEP. Enteral nutritional tolerance is a measure of efficacy of STEP and should be presented as a primary or secondary outcome. By standardizing data collection on children undergoing STEP procedure, better determination of nutritional benefit from STEP can be ascertained. PMID:27461428
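The summary estimate quoted above is the kind of figure a random-effects meta-analysis of proportions produces. A minimal DerSimonian-Laird sketch on the logit scale, with invented per-series counts (not the study's data), might look like this:

```python
import math

def pooled_proportion(events, totals):
    """Random-effects (DerSimonian-Laird) pooled proportion on the logit scale."""
    # Per-study logit proportions and within-study variances
    y, v = [], []
    for e, n in zip(events, totals):
        p = (e + 0.5) / (n + 1.0)              # continuity correction
        y.append(math.log(p / (1 - p)))
        v.append(1 / (e + 0.5) + 1 / (n - e + 0.5))
    # Fixed-effect weights and the Q heterogeneity statistic
    w = [1 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - df) / c)              # between-study variance
    # Random-effects weights and the pooled logit
    wr = [1 / (vi + tau2) for vi in v]
    pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    return 1 / (1 + math.exp(-pooled))         # back-transform to a proportion

# Hypothetical counts: children with improved enteral tolerance per series
print(round(pooled_proportion([10, 12, 8], [12, 14, 10]), 2))  # → 0.81
```

Real meta-analysis packages additionally report the confidence interval and heterogeneity measures (I², tau²) alongside the point estimate.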

  2. Improvement of safety by analysis of costs and benefits of the system

    Directory of Open Access Journals (Sweden)

    T. Karkoszka

    2011-11-01

    Full Text Available Purpose: of the paper has been the assessment of the dependence between improvement of the implemented occupational health and safety management system and both minimization of costs connected with occupational health and safety assurance and optimization of real work conditions. Design/methodology/approach: used for the analysis has included definition of the occupational health and safety system with regard to the rules and tools allowing for occupational safety assurance in the organisational and technical way, analyses of costs and benefits of the system maintenance as well as a study on the tools for potential improvement of processes. Findings: of the analysis are as follows: a continuously improving occupational safety management system guarantees the advancement of work conditions, the decrease of the rate of occupational illnesses as well as the lowering of the number of occupational accidents. Research limitations/implications: can apply in the case of any organisation which uses both organisational and technical rules, methods and tools to assure the optimal level of occupational health and safety conditions. Originality/value: of the presented paper has been constituted by the specification of the continuous improvement tools and methods in the system implemented on the basis of the quality criterion.

  3. Simulations study of neutrino oscillation parameters with the Iron Calorimeter Detector (ICAL): an improved analysis

    CERN Document Server

    Mohan, Lakshmi S

    2016-01-01

    We present an updated and improved simulations analysis of precision measurements of neutrino oscillation parameters from the study of charged-current interactions of atmospheric neutrinos in the Iron Calorimeter (ICAL) detector at the proposed India-based Neutrino Observatory (INO). The present analysis is done in the extended muon energy range of 0.5--25 GeV, as compared to the previous analyses, which were limited to the range 1--11 GeV of muon energy. A substantial improvement is observed in the precision measurement of the oscillation parameters in the 2--3 sector, including the magnitude and sign of the 2--3 mass-squared difference $\Delta m^2_{32}$ and especially $\theta_{23}$. The sensitivities are further improved by the inclusion of an additional systematic which constrains the ratio of neutrino to anti-neutrino fluxes. The best $1\sigma$ precision on $\sin^2 \theta_{23}$ and $|\Delta m^2_{32}|$ achievable with the new analysis for a 500 kton yr exposure of ICAL are $\sim 9\%$ and $\sim 2.5\%$ respective...

  4. Improving resolution of gravity data with wavelet analysis and spectral method

    Institute of Scientific and Technical Information of China (English)

    QIU Ning; HE Zhanxiang; CHANG Yanjun

    2007-01-01

    Gravity data are the result of the gravity force field interaction of all underground sources. The objects of detection are always submerged in the background field, and thus one of the crucial problems in gravity data interpretation is how to improve the resolution of the observed information. The wavelet transform operator has recently been introduced into this domain both as a filter and as a powerful source-analysis tool. This paper studied the effect of improving the resolution of gravity data with wavelet analysis and a spectral method, and revealed the geometric characteristics of density heterogeneities described by simply shaped sources. First, the basic theory of multiscale wavelet analysis, its lifting scheme and the spectral method were introduced. Through an experimental study on the forward simulation of anomalies given by the superposition of six objects, and on measured data from the Songliao plain, Northeast China, the shape, size and depth of the buried objects were estimated. The results were also compared with those obtained by conventional techniques, which demonstrated that this method greatly improves the resolution of gravity anomalies.
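One classic spectral method of the kind referred to above estimates source depth from the decay rate of the anomaly's power spectrum. A 1-D sketch with a synthetic line-source anomaly (the depth, grid and band limits are invented, and this is only an illustration of the principle, not the paper's procedure):

```python
import numpy as np

# Synthetic 1-D gravity profile over a buried line source at depth h:
# g(x) ~ h / (x^2 + h^2), whose spectrum decays as exp(-h*|k|).
h_true = 3.0
x = np.linspace(-500, 500, 8192)
g = h_true / (x ** 2 + h_true ** 2)

# Power spectrum over positive wavenumbers
G = np.fft.rfft(g)
k = 2 * np.pi * np.fft.rfftfreq(len(x), d=x[1] - x[0])
P = np.abs(G) ** 2

# Depth from the slope of ln P vs k:  ln P = const - 2*h*k
band = (k > 0.05) & (k < 0.5)
slope = np.polyfit(k[band], np.log(P[band]), 1)[0]
print(round(-slope / 2, 1))  # → 3.0
```

In 2-D field work the same idea is applied to the radially averaged power spectrum, with different slope segments attributed to sources at different depths.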

  5. Improved Proteomic Analysis Following Trichloroacetic Acid Extraction of Bacillus anthracis Spore Proteins

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Brooke LD; Wunschel, David S.; Sydor, Michael A.; Warner, Marvin G.; Wahl, Karen L.; Hutchison, Janine R.

    2015-08-07

    Proteomic analysis of bacterial samples provides valuable information about cellular responses and functions under different environmental pressures. Proteomic analysis depends upon efficient extraction of proteins from bacterial samples without introducing bias toward particular protein classes. While no single method can recover 100% of the bacterial proteins, selected protocols can improve overall protein isolation, peptide recovery, or enrich for certain classes of proteins. The method presented here is technically simple and does not require specialized equipment such as a mechanical disrupter. Our data reveal that for particularly challenging samples, such as B. anthracis Sterne spores, trichloroacetic acid (TCA) extraction improved the number of proteins identified within a sample compared to bead beating (714 vs 660, respectively). Further, TCA extraction enriched for 103 known spore-specific proteins, whereas bead beating yielded 49 unique proteins. Analysis of C. botulinum samples grown for 5 days, composed of vegetative biomass and spores, showed a similar trend, with improved protein yields and identification using our method compared to bead beating. Interestingly, easily lysed samples, such as B. anthracis vegetative cells, were processed equally effectively via TCA extraction and bead beating, but TCA extraction remains the easier and more cost-effective option. As with all assays, supplemental methods such as implementation of an alternative preparation method may provide additional insight into the protein biology of the bacteria being studied.

  6. A Lean Six Sigma approach to the improvement of the selenium analysis method

    Directory of Open Access Journals (Sweden)

    Bronwyn C. Cloete

    2012-11-01

    Full Text Available Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method: empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. 
Lean Six Sigma may therefore be regarded as a valuable tool in any

  7. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    Science.gov (United States)

    Cloete, Bronwyn C; Bester, André

    2012-01-01

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method: empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. 
Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and

  8. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; Moseley, S. H.; Porst, J.-P.; Porter, F. S.; Sadleir, J. E.; Smith, S. J.

    2016-07-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an ^{55}Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
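The PCA decomposition described above can be sketched with a toy pulse ensemble (the pulse template, amplitudes and noise level are invented, and NumPy's SVD stands in for whatever pipeline the authors use):

```python
import numpy as np

# Toy ensemble of detector 'pulses': rows are records, columns are samples.
rng = np.random.default_rng(0)
t = np.arange(256)
template = np.exp(-t / 40.0) - np.exp(-t / 5.0)    # idealized pulse shape
amplitudes = rng.uniform(0.5, 1.5, size=200)        # stand-in for photon energy
pulses = np.outer(amplitudes, template) + rng.normal(0, 0.01, (200, 256))

# PCA via SVD of the mean-subtracted data matrix
mean = pulses.mean(axis=0)
U, S, Vt = np.linalg.svd(pulses - mean, full_matrices=False)

# Scores on the leading principal component track pulse height (~ energy)
scores = (pulses - mean) @ Vt[0]
corr = np.corrcoef(scores, amplitudes)[0, 1]
print(abs(corr) > 0.99)  # → True
```

In the real analysis the lower-order components additionally capture arrival time, pulse-shape changes and temperature drift, and several component scores are combined into the final energy estimate.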

  9. Model-free functional MRI analysis using improved fuzzy cluster analysis techniques

    Science.gov (United States)

    Lange, Oliver; Meyer-Baese, Anke; Wismueller, Axel; Hurdal, Monica; Sumners, DeWitt; Auer, Dorothee

    2004-04-01

    Conventional model-based or statistical analysis methods for functional MRI (fMRI) are easy to implement, and are effective in analyzing data with simple paradigms. However, they are not applicable in situations in which patterns of neural response are complicated and when fMRI response is unknown. In this paper the Gath-Geva algorithm is adapted and rigorously studied for analyzing fMRI data. The algorithm supports spatial connectivity aiding in the identification of activation sites in functional brain imaging. A comparison of this new method with the fuzzy n-means algorithm, Kohonen's self-organizing map, fuzzy n-means algorithm with unsupervised initialization, minimal free energy vector quantizer and the "neural gas" network is done in a systematic fMRI study showing comparative quantitative evaluations. The most important findings in the paper are: (1) the Gath-Geva algorithm outperforms all other clustering methods for a large number of codebook vectors in terms of detecting small activation areas, and (2) for a smaller number of codebook vectors the fuzzy n-means with unsupervised initialization outperforms all other techniques. The applicability of the new algorithm is demonstrated on experimental data.
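As a rough illustration of the clustering family discussed above, here is plain fuzzy c-means, a simpler relative of the Gath-Geva algorithm (which additionally estimates cluster covariances and priors); the two-blob data are an invented stand-in for fMRI time-course features.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: soft memberships u (c x N) and cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(X)))
    u /= u.sum(axis=0)                       # memberships sum to 1 per point
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ X) / um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))   # inverse-distance memberships
        u /= u.sum(axis=0)
    return centers, u

# Two well-separated blobs (hypothetical feature vectors, not fMRI data).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
centers, u = fuzzy_c_means(X, c=2)
```

Gath-Geva replaces the Euclidean distance here with an exponential distance derived from per-cluster covariance estimates, which is what makes it better at small, elongated activation areas.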

  10. A NOVEL SPEECH ENHANCEMENT APPROACH BASED ON MODIFIED DCT AND IMPROVED PITCH SYNCHRONOUS ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. R. Balaji

    2014-01-01

    Full Text Available Speech enhancement has become an essential issue within the field of speech and signal processing, because of the necessity to enhance the performance of voice communication systems in noisy environments. A number of research works have been carried out in speech processing, but there is still room for improvement. The main aim is to enhance the apparent quality of the speech and to improve the intelligibility. Signal representation and enhancement in the cosine transform domain is observed to provide significant results, and the Discrete Cosine Transform (DCT) has been widely used for speech enhancement. In this research work, instead of DCT, an Advanced DCT (ADCT) is used, which simultaneously offers energy compaction along with critical sampling and flexible window switching. In order to deal with the issue of frame-to-frame deviations of the cosine transforms, ADCT is integrated with Pitch Synchronous Analysis (PSA). Moreover, in order to improve the noise minimization performance of the system, an improved iterative Wiener filtering approach called Constrained Iterative Wiener Filtering (CIWF) is used. Thus, a novel ADCT-based speech enhancement method using improved iterative filtering integrated with PSA is proposed.
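The core DCT idea, transform a frame, exploit energy compaction by suppressing small coefficients, and invert, can be sketched as follows; this is a generic DCT-domain denoiser, not the paper's ADCT/PSA/CIWF pipeline, and the test signal and threshold are invented.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
n = 256
t = np.arange(n)
clean = np.sin(2 * np.pi * 5 * t / n)        # stand-in for a voiced frame
noisy = clean + 0.3 * rng.standard_normal(n)

# Transform the frame, keep only large DCT coefficients (energy compaction
# concentrates the speech energy in few coefficients), then invert.
coeffs = dct(noisy, norm='ortho')
thresh = 3 * 0.3                             # ~3 sigma of the (known) noise
coeffs[np.abs(coeffs) < thresh] = 0.0
denoised = idct(coeffs, norm='ortho')

snr_in = 10 * np.log10(np.sum(clean**2) / np.sum((noisy - clean)**2))
snr_out = 10 * np.log10(np.sum(clean**2) / np.sum((denoised - clean)**2))
```

Pitch-synchronous framing would additionally align frame boundaries to pitch periods so that the kept coefficients stay stable from frame to frame.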

  11. Exergy Analysis of a Subcritical Refrigeration Cycle with an Improved Impulse Turbo Expander

    Directory of Open Access Journals (Sweden)

    Zhenying Zhang

    2014-08-01

    Full Text Available The impulse turbo expander (ITE is employed to replace the throttling valve in the vapor compression refrigeration cycle to improve the system performance. An improved ITE and the corresponding cycle are presented. In the new cycle, the ITE not only acts as an expansion device with work extraction, but also serves as an economizer with vapor injection. An increase of 20% in the isentropic efficiency can be attained for the improved ITE compared with the conventional ITE owing to the reduction of the friction losses of the rotor. The performance of the novel cycle is investigated based on energy and exergy analysis. A correlation of the optimum intermediate pressure in terms of ITE efficiency is developed. The improved ITE cycle increases the exergy efficiency by 1.4%–6.1% over the conventional ITE cycle, 4.6%–8.3% over the economizer cycle and 7.2%–21.6% over the base cycle. Furthermore, the improved ITE cycle is also preferred due to its lower exergy loss.
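A quick way to see what exergy efficiency means for such a cycle is to compare the measured COP with the reversible (Carnot) limit between the same temperatures; all numbers below are illustrative, not taken from the paper.

```python
# Illustrative operating temperatures and a hypothetical measured COP.
T_evap = 258.15   # K (-15 degC), heat absorbed here
T_cond = 308.15   # K (+35 degC), heat rejected here
COP = 2.6         # hypothetical measured coefficient of performance

COP_carnot = T_evap / (T_cond - T_evap)   # reversible-cycle limit
eta_exergy = COP / COP_carnot             # second-law (exergy) efficiency
```

Replacing the throttling valve with a work-extracting expander raises COP toward the Carnot limit, which is exactly the exergy-efficiency gain the paper quantifies.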

  12. Improved wavelet analysis in enhancing Electromagnetic Compatibility of underground monitoring system in coal mine

    Institute of Scientific and Technical Information of China (English)

    SUN Ji-ping; MA Feng-ying; WU Dong-xu; LIU Xiao-yang

    2008-01-01

    Underground Electromagnetic Interference (EMI) has become so serious that false alarms occur in the monitoring system, causing safety problems in coal mine production. In order to overcome the difficulties caused by the explosion-proof enclosures of the equipment and the limitation of multiple start and stop events in the transient process during EMI measurement, a novel technique was proposed to measure the underground EMI distribution indirectly and enhance the Electromagnetic Compatibility (EMC) of the monitoring system. Wavelet time-frequency analysis was introduced to the underground monitoring system, so that the sources, start-up time, duration and waveform of EMI could be ascertained correctly based on the running records of underground electric equipment. The electrical fast transient/burst (EFT/B) was studied to verify the validity of the wavelet analysis. The EMI filter was improved in accordance with the EMI distribution obtained from the wavelet analysis, and power-port immunity improved markedly. In addition, the method of setting wavelet thresholds was amended relative to the conventional thresholds in the wavelet filter design, so the EFT/B at the data port was restrained markedly by the wavelet filtering. The combined effect of the EMI power filter and the wavelet filter reduces false alarms of the monitoring system evidently. It is concluded that wavelet analysis and the improved EMI filter have clearly enhanced the EMC of the monitoring system.
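The wavelet filtering idea, keep the slow components and soft-threshold the fast detail coefficients where EFT/B-like transients live, can be sketched with a single-level Haar transform; the signal, noise level and threshold are invented for illustration.

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar DWT: approximation and detail."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ihaar_step(a, d):
    """Inverse of haar_step."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(x, t):
    """Soft thresholding: shrink toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# A slow trend (the useful monitoring signal) plus fast noise standing in
# for EFT/B-like interference.
n = 256
s = np.linspace(0, 1, n)
signal = np.sin(2 * np.pi * s)
rng = np.random.default_rng(0)
noisy = signal + 0.2 * rng.standard_normal(n)

a, d = haar_step(noisy)
denoised = ihaar_step(a, soft(d, 0.25))      # suppress fast transients only

err_noisy = np.mean((noisy - signal) ** 2)
err_denoised = np.mean((denoised - signal) ** 2)
```

Multi-level decompositions and level-dependent thresholds refine this, which is the threshold-setting amendment the abstract refers to.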

  13. Improved disparity map analysis through the fusion of monocular image segmentations

    Science.gov (United States)

    Perlant, Frederic P.; Mckeown, David M.

    1991-01-01

    The focus is to examine how estimates of three-dimensional scene structure, as encoded in a scene disparity map, can be improved by analysis of the original monocular imagery. Surface illumination information is exploited by segmenting the monocular image into fine surface patches of nearly homogeneous intensity, which are used to remove mismatches generated during stereo matching. These patches guide a statistical analysis of the disparity map, based on the assumption that such patches correspond closely to physical surfaces in the scene. The technique is quite independent of whether the initial disparity map was generated by automated area-based or feature-based stereo matching. Stereo analysis results are presented for a complex urban scene containing various man-made and natural features. This scene poses a variety of problems, including low building height with respect to the stereo baseline, buildings and roads in complex terrain, and highly textured buildings and terrain. Improvements due to monocular fusion with a set of different region-based image segmentations are demonstrated. The generality of this approach to stereo analysis and its utility in the development of general three-dimensional scene interpretation systems are also discussed.
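The patch-guided mismatch removal described above can be sketched as a median test per segment; the labels, disparities and deviation threshold below are toy values, not the paper's data.

```python
import numpy as np

def filter_disparity(disp, labels, max_dev=2.0):
    """Flag disparity mismatches: within each homogeneous image patch,
    pixels far from the patch median disparity are treated as mismatches."""
    good = np.ones_like(disp, dtype=bool)
    for lab in np.unique(labels):
        mask = labels == lab
        med = np.median(disp[mask])
        good &= ~mask | (np.abs(disp - med) <= max_dev)
    return good

# Toy scene: two segments, each at roughly constant disparity, with one
# spurious stereo match planted inside the first segment.
labels = np.array([[0, 0, 1, 1]] * 4)
disp = np.where(labels == 0, 10.0, 20.0)
disp[1, 1] = 30.0                        # mismatch inside segment 0
good = filter_disparity(disp, labels)
```

Rejected pixels would then be re-estimated from their segment's statistics, which is the fusion step the abstract describes.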

  14. Multiple breath washout analysis in infants: quality assessment and recommendations for improvement.

    Science.gov (United States)

    Anagnostopoulou, Pinelopi; Egger, Barbara; Lurà, Marco; Usemann, Jakob; Schmidt, Anne; Gorlanova, Olga; Korten, Insa; Roos, Markus; Frey, Urs; Latzin, Philipp

    2016-03-01

    Infant multiple breath washout (MBW) testing serves as a primary outcome in clinical studies. However, it is still unknown whether current software algorithms allow between-centre comparisons. In this study of healthy infants, we quantified MBW measurement errors and tried to improve data quality by simply changing software settings. We analyzed best-quality MBW measurements performed with an ultrasonic flowmeter in 24 infants from two centres in Switzerland with the current software settings. To challenge the robustness of these settings, we also used alternative analysis approaches. Using the current analysis software, the coefficient of variation (CV) for functional residual capacity (FRC) differed significantly between centres (mean ± SD (%): 9.8 ± 5.6 and 5.8 ± 2.9, respectively, p = 0.039). In addition, FRC values calculated during the washout differed by between -25% and +30% from those of the washin of the same tracing. Results were mainly influenced by analysis settings and temperature recordings. Changing a few algorithm settings resulted in significantly more robust analysis. Non-systematic inter-centre differences can be reduced by using correctly recorded environmental data and simple changes in the software algorithms. These findings can greatly improve the quality of infant MBW outcomes and can be applied when multicentre trials are conducted. PMID:26849570

  15. Multi-factor Analysis Model for Improving Profit Management Using Excel in Shellfish Farming Projects

    Institute of Scientific and Technical Information of China (English)

    Zhuming; ZHAO; Changlin; LIU; Xiujuan; SHAN; Jin; YU

    2013-01-01

    By using data from a farm in Yantai City, the theory of Cost-Volume-Profit analysis and financial management methods, this paper constructs a multi-factor analysis model for improving profit management in shellfish farming projects using Excel 2007, and describes the procedures for constructing such a model. The model can quickly calculate profit, improve the level of profit management, find the breakeven point and enhance the decision-making efficiency of businesses. As a simple analysis tool, it can also offer suggestions for government decisions and economic decisions by corporations. While effort has been exerted to construct a four-variable model, some equally important variables may not be discussed sufficiently due to limitations of the paper's space and the authors' knowledge. All variables can be listed in Excel 2007 and associated in a logical way to manage the profit of shellfish farming projects more efficiently and more practically.
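The core Cost-Volume-Profit relations behind such a model are simple enough to state directly; the shellfish-farm figures below are hypothetical, not the Yantai farm's data.

```python
def breakeven_units(fixed_cost, price, variable_cost):
    """Break-even sales volume: fixed cost over contribution margin."""
    return fixed_cost / (price - variable_cost)

def profit(units, fixed_cost, price, variable_cost):
    """Profit at a given sales volume under the CVP model."""
    return units * (price - variable_cost) - fixed_cost

# Hypothetical figures (currency units per kg of shellfish).
fixed, p, v = 120_000.0, 8.0, 3.0
be = breakeven_units(fixed, p, v)   # volume at which profit is exactly zero
```

A spreadsheet version simply tabulates `profit` over ranges of the four variables, which is what the Excel model automates.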

  16. Application of exergy analysis for improving energy efficiency of natural gas liquids recovery processes

    International Nuclear Information System (INIS)

    A thermodynamic analysis and optimization method is applied to provide design guidelines for improving the energy efficiency and cost-effectiveness of natural gas liquids recovery processes. Exergy analysis is adopted in this study as a thermodynamic tool to evaluate the loss of exergy associated with irreversibility in natural gas liquids recovery processes, from which a conceptual understanding of inefficient design features or equipment can be obtained. Natural gas liquids processes are modeled and simulated within the UniSim® simulator, from which detailed thermodynamic information is obtained for calculating exergy loss. The optimization framework is developed by minimizing overall exergy loss, as an objective function, subject to product specifications and engineering constraints. The optimization is carried out within MATLAB® with the aid of a stochastic solver based on genetic algorithms. The process simulator is linked to and interacts with the optimization solver, through which optimal operating conditions can be determined. A case study is presented to illustrate the benefit of using exergy analysis for the design and optimization of natural gas liquids processes and to demonstrate the applicability of the design method proposed in this paper. - Highlights: • Application of exergy analysis for natural gas liquids (NGL) recovery processes. • Minimization of exergy loss for improving energy efficiency. • A systematic optimization framework for the design of NGL recovery processes
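The optimization loop described above, a stochastic solver minimizing exergy loss subject to bounds, can be mimicked with SciPy's differential evolution (a close relative of a genetic algorithm) on a toy objective; the quadratic "exergy loss" and its bounds are invented stand-ins for the simulator-backed objective.

```python
from scipy.optimize import differential_evolution

# Toy stand-in for an exergy-loss objective: loss is minimised at some
# interior operating point (say, a column pressure and a split ratio).
def exergy_loss(x):
    p, r = x
    return (p - 30.0) ** 2 + 5.0 * (r - 0.6) ** 2 + 100.0

bounds = [(10.0, 60.0), (0.1, 0.9)]   # hypothetical engineering constraints
res = differential_evolution(exergy_loss, bounds, seed=1, tol=1e-8)
```

In the paper's framework, each objective evaluation would instead call the process simulator for the candidate operating point and return the computed overall exergy loss.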

  17. Improving Markov Chain Monte Carlo algorithms in LISA Pathfinder Data Analysis

    International Nuclear Information System (INIS)

    The LISA Pathfinder mission (LPF) aims to test key technologies for the future LISA mission. The LISA Technology Package (LTP) on-board LPF will consist of an exhaustive suite of experiments, and its outcome will be crucial for the future detection of gravitational waves. In order to achieve maximum sensitivity, we need to have an understanding of every instrument on-board and parametrize the properties of the underlying noise models. The Data Analysis team has developed algorithms for parameter estimation of the system. A very promising one implemented for LISA Pathfinder data analysis is Markov Chain Monte Carlo. A series of experiments are going to take place during flight operations, and each experiment is going to provide essential information for the next in the sequence. Therefore, it is a priority to optimize and improve the tools available for data analysis during the mission. Using a Bayesian framework allows us to apply prior knowledge for each experiment, which means that we can efficiently use our prior estimates for the parameters, making the method more accurate and significantly faster. This, together with other algorithm improvements, will lead us to our main goal, which is none other than creating a robust and reliable tool for parameter estimation during the LPF mission.
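A minimal sketch of the Bayesian parameter estimation described above: random-walk Metropolis-Hastings with a Gaussian prior standing in for knowledge carried over from a previous experiment. The data, prior and noise level are synthetic, not LPF quantities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic experiment: noisy measurements of one system parameter.
true_theta = 2.0
data = true_theta + 0.5 * rng.standard_normal(100)

def log_post(theta):
    # Gaussian prior (e.g. the estimate from a previous experiment)
    # plus a Gaussian likelihood for the current data.
    log_prior = -0.5 * ((theta - 1.5) / 1.0) ** 2
    log_like = -0.5 * np.sum(((data - theta) / 0.5) ** 2)
    return log_prior + log_like

# Random-walk Metropolis-Hastings.
theta, chain = 0.0, []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:    # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta)

posterior_mean = np.mean(chain[5000:])          # discard burn-in
```

Feeding each experiment's posterior in as the next experiment's prior is what makes the sequential LPF analysis faster and more accurate.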

  18. Economic analysis of interventions to improve village chicken production in Myanmar.

    Science.gov (United States)

    Henning, J; Morton, J; Pym, R; Hla, T; Sunn, K; Meers, J

    2013-07-01

    A cost-benefit analysis using deterministic and stochastic modelling was conducted to identify the net benefits for households that adopt (1) vaccination of individual birds against Newcastle disease (ND) or (2) improved management of chick rearing by providing coops for the protection of chicks from predation and chick starter feed inside a creep feeder to support chicks' nutrition in village chicken flocks in Myanmar. Partial budgeting was used to assess the additional costs and benefits associated with each of the two interventions tested relative to neither strategy. In the deterministic model, over the first 3 years after the introduction of the interventions, the cumulative sum of the net differences from neither strategy was 13,189Kyat for ND vaccination and 77,645Kyat for improved chick management (effective exchange rate in 2005: 1000Kyat=1$US). Both interventions were also profitable after discounting over a 10-year period; Net Present Values for ND vaccination and improved chick management were 30,791 and 167,825Kyat, respectively. The Benefit-Cost Ratio for ND vaccination was very high (28.8). This was lower for improved chick management, due to greater costs of the intervention, but still favourable at 4.7. Using both interventions concurrently yielded a Net Present Value of 470,543Kyat and a Benefit-Cost Ratio of 11.2 over the 10-year period in the deterministic model. Using the stochastic model, for the first 3 years following the introduction of the interventions, the mean cumulative sums of the net difference were similar to those values obtained from the deterministic model. Sensitivity analysis indicated that the cumulative net differences were strongly influenced by grower bird sale income, particularly under improved chick management. The effects of the strategies on odds of households selling and consuming birds after 7 months, and numbers of birds being sold or consumed after this period also influenced profitability. Cost variations for
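The partial-budgeting quantities used above, Net Present Value and Benefit-Cost Ratio of discounted cash flows, can be computed directly; the cash flows and discount rate below are illustrative, not the study's Kyat figures.

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(rate, benefits, costs):
    """Discounted benefits over discounted costs."""
    return npv(rate, benefits) / npv(rate, costs)

# Hypothetical intervention: an up-front cost, then yearly running costs
# and yearly benefits over a 10-year horizon (years 0-9).
costs = [1000.0] + [100.0] * 9
benefits = [0.0] + [600.0] * 9
rate = 0.10

value = npv(rate, [b - c for b, c in zip(benefits, costs)])
bcr = benefit_cost_ratio(rate, benefits, costs)
```

A stochastic version, as in the paper, would draw the yearly benefits and costs from distributions and report the resulting distribution of `value` and `bcr`.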

  19. Quantitative Transcript Analysis in Plants: Improved First-strand cDNA Synthesis

    Institute of Scientific and Technical Information of China (English)

    Nai-Zhong XIAO; Lei BA; Preben Bach HOLM; Xing-Zhi WANG; Steve BOWRA

    2005-01-01

    The quantity and quality of first-strand cDNA directly influence the accuracy of transcriptional analysis and quantification. Using a plant-derived α-tubulin as a model system, the effect of oligo sequence and DTT on the quality and quantity of first-strand cDNA synthesis was assessed via a combination of semi-quantitative PCR and real-time PCR. The results indicated that anchored oligo dT significantly improved the quantity and quality of α-tubulin cDNA compared to the conventional oligo dT. Similarly, omitting DTT from the first-strand cDNA synthesis also enhanced transcript levels. This is the first time that such a comparative analysis has been undertaken for a plant system, and it shows conclusively that small changes to current protocols can have a very significant impact on transcript analysis.

  20. Improvement of Epicentral Direction Estimation by P-wave Polarization Analysis

    Science.gov (United States)

    Oshima, Mitsutaka

    2016-04-01

    Polarization analysis has been used to analyze the polarization characteristics of waves and has been developed in various spheres, for example, electromagnetics, optics, and seismology. In seismology, polarization analysis is used to discriminate seismic phases or to enhance specific phases (e.g., Flinn, 1965)[1], by taking advantage of the difference in polarization characteristics of seismic phases. In earthquake early warning, polarization analysis is used to estimate the epicentral direction from a single station, based on the polarization direction of the P-wave portion of seismic records (e.g., Smart and Sproules (1981)[2], Noda et al. (2012)[3]). Therefore, improvement of the Estimation of Epicentral Direction by Polarization Analysis (EEDPA) directly enhances the accuracy and promptness of earthquake early warning. In this study, the author tried to improve EEDPA by using seismic records of events that occurred around Japan from 2003 to 2013. The author selected events that satisfy the following conditions: MJMA larger than 6.5 (JMA: Japan Meteorological Agency), and seismic records available at no fewer than 3 stations within 300 km epicentral distance. Seismic records obtained at stations with no information on seismometer orientation were excluded, so that a precise and quantitative evaluation of the accuracy of EEDPA becomes possible. In the analysis, polarization was calculated by the method of Vidale (1986)[4], which extended the method proposed by Montalbetti and Kanasewich (1970)[5] to use the analytic signal. As a result of the analysis, the author found that the accuracy of EEDPA improves by about 15% if velocity records, not displacement records, are used, contrary to the author's expectation. Use of velocity records enables a reduction of CPU time in the integration of seismic records and an improvement in the promptness of EEDPA, although this analysis is still rough and further scrutiny is essential. At this moment, the author used seismic records obtained by simply integrating acceleration
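The covariance-matrix polarization analysis underlying EEDPA can be sketched on a synthetic rectilinear P-wave; the back-azimuth, pulse shape and noise level are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rectilinear P-wave: particle motion along a known back-azimuth
# (clockwise from north), recorded on north, east and vertical components.
baz = np.deg2rad(30.0)
s = np.linspace(0, 1, 200)
pulse = np.sin(2 * np.pi * s) * np.exp(-5 * s)
north = np.cos(baz) * pulse + 0.01 * rng.standard_normal(200)
east = np.sin(baz) * pulse + 0.01 * rng.standard_normal(200)
vert = 0.5 * pulse + 0.01 * rng.standard_normal(200)

# Covariance-matrix polarization analysis in the Montalbetti-Kanasewich
# style: the eigenvector of the largest eigenvalue of the 3x3 covariance
# matrix gives the dominant polarization direction.
X = np.vstack([north, east, vert])
w, v = np.linalg.eigh(np.cov(X))
p = v[:, -1]                                   # principal eigenvector

# Horizontal azimuth of the polarization vector; the 180-degree ambiguity
# from the eigenvector sign is folded out with a modulo.
est_baz = np.rad2deg(np.arctan2(p[1], p[0])) % 180.0
```

Vidale's extension computes the same eigen-decomposition on the complex analytic signal, which makes the estimate robust to phase shifts within the P-wave window.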

  1. Computerized lung sound analysis following clinical improvement of pulmonary edema due to congestive heart failure exacerbations

    Institute of Scientific and Technical Information of China (English)

    WANG Zhen; XIONG Ying-xia

    2010-01-01

    Background Although acute congestive heart failure (CHF) patients typically present with abnormal auscultatory findings on lung examination, lung sounds are not normally subjected to rigorous analysis. The goals of this study were to use a computerized analytic acoustic tool to evaluate lung sound patterns in CHF patients during acute exacerbation and after clinical improvement, and to compare CHF profiles with those of normal individuals. Methods Lung sounds throughout the respiratory cycle were captured using a computerized acoustic-based imaging technique. Thirty-two consecutive CHF patients were imaged at the time of presentation to the emergency department and after clinical improvement. Digital images were created, and the geographical area of the images and lung sound patterns were quantitatively analyzed. Results The geographical areas of the vibration energy image of acute CHF patients without and with radiographically evident pulmonary edema (REPE) were (67.9±4.7) and (60.3±3.5) kilo-pixels, respectively (P <0.05). In CHF patients without and with REPE, after clinical improvement the geographical area of the vibration energy image of lung sound increased to (74.5±4.4) and (73.9±3.9) kilo-pixels (P <0.05), respectively. Vibration energy decreased in CHF patients with REPE following clinical improvement by an average of (85±19)% (P <0.01). Conclusions With clinical improvement of acute CHF exacerbations, there was a more homogeneous distribution of lung vibration energy, as demonstrated by the increased geographical area of the vibration energy image. Lung sound analysis may be useful for tracking acute CHF exacerbations.

  2. Recent Improvements at CEA on Trace Analysis of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    In this paper, we present some results of R and D work conducted at CEA to improve, on the one side, the performance of the techniques already in use for detection of undeclared activities, and on the other side to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring new information complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences including mercury, lead or iridium atoms are in some cases necessary. Efforts must be put into improving the purification procedure. Micro-Raman spectrometry allows determining the chemical composition of uranium compounds at the scale of the microscopic object, using a pre-location of the particles by SEM and a relocation of these particles by mathematical calculations. However, particles below 5 μm are hardly relocated, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although precision of isotopic ratio measurements is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high, and more developments are in progress to validate and improve this method. (author)

  3. Recent Improvements of Actinides Trace Analysis in Environmental Samples for Nuclear Activities Detection

    International Nuclear Information System (INIS)

    In this paper, we present some results of R and D work conducted at CEA to improve, on the one side, the performance of the techniques already in use for detection of undeclared activities, and on the other side to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring new information complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences including mercury, lead or iridium atoms are in some cases necessary. Efforts must be put into improving the purification procedure. Micro-Raman spectrometry allows determining the chemical composition of uranium compounds at the scale of the microscopic object, using a pre-location of the particles by SEM and a relocation of these particles by mathematical calculations. However, particles below 5 μm are hardly relocated, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although precision of isotopic ratio measurements is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high, and more developments are in progress to validate and improve this method. (author)

  4. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    Energy Technology Data Exchange (ETDEWEB)

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning toward establishing stronger, standards-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  5. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning toward establishing stronger, standards-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  6. A model for improving energy efficiency in industrial motor system using multicriteria analysis

    Energy Technology Data Exchange (ETDEWEB)

    Herrero Sola, Antonio Vanderley, E-mail: sola@utfpr.edu.br [Federal University of Technology, Parana, Brazil (UTFPR)-Campus Ponta Grossa, Av. Monteiro Lobato, Km 4, CEP: 84016-210 (Brazil); Mota, Caroline Maria de Miranda, E-mail: carolmm@ufpe.br [Federal University of Pernambuco, Cx. Postal 7462, CEP 50630-970, Recife (Brazil); Kovaleski, Joao Luiz [Federal University of Technology, Parana, Brazil (UTFPR)-Campus Ponta Grossa, Av. Monteiro Lobato, Km 4, CEP: 84016-210 (Brazil)

    2011-06-15

    In the last years, several policies have been proposed by governments and global institutions in order to improve the efficient use of energy in industries worldwide. However, projects in industrial motor systems require a new approach, mainly in the decision-making area, considering the organizational barriers to energy efficiency. Despite their wide application elsewhere, multicriteria methods have remained unexplored in industrial motor systems until now. This paper proposes a multicriteria model using the PROMETHEE II method, with the aim of ranking alternatives for induction motor replacement. A comparative analysis of the model, applied to a Brazilian industry, has shown that multicriteria analysis presents better performance on energy saving as well as return on investment than a single criterion. The paper strongly recommends the dissemination of multicriteria decision aiding as a policy to support decision makers in industries and to improve energy efficiency in electric motor systems. - Highlights: > Lack of a decision model in industrial motor systems is the main motivation of the research. > A multicriteria model based on the PROMETHEE method is proposed with the aim of supporting decision makers in industries. > The model can contribute to transposing some barriers within industries, improving energy efficiency in industrial motor systems.
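A compact sketch of PROMETHEE II with the simple "usual" preference function (the full method also supports other preference functions and indifference/preference thresholds); the motor alternatives, criteria and weights below are hypothetical.

```python
import numpy as np

def promethee_ii(A, weights, maximize):
    """PROMETHEE II with the 'usual' preference function: rank
    alternatives by net outranking flow (phi+ minus phi-)."""
    A = np.where(maximize, A, -A)      # make every criterion "more is better"
    n = len(A)
    pref = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Weighted count of criteria on which i strictly beats j.
            pref[i, j] = np.sum(weights * (A[i] > A[j]))
    phi_plus = pref.sum(axis=1) / (n - 1)    # how strongly i outranks others
    phi_minus = pref.sum(axis=0) / (n - 1)   # how strongly i is outranked
    return phi_plus - phi_minus

# Hypothetical motor-replacement alternatives scored on efficiency (%),
# purchase cost and payback time (years); only efficiency is maximised.
A = np.array([[95.0, 12000.0, 2.5],
              [92.0,  9000.0, 3.5],
              [90.0,  7000.0, 4.0]])
weights = np.array([0.5, 0.3, 0.2])
net_flow = promethee_ii(A, weights, maximize=np.array([True, False, False]))
ranking = np.argsort(-net_flow)            # best alternative first
```

The net flows provide the complete ranking the paper uses to compare alternatives against a single-criterion choice.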

  7. A model for improving energy efficiency in industrial motor system using multicriteria analysis

    International Nuclear Information System (INIS)

    In the last years, several policies have been proposed by governments and global institutions in order to improve the efficient use of energy in industries worldwide. However, projects in industrial motor systems require a new approach, mainly in the decision-making area, considering the organizational barriers to energy efficiency. Despite their wide application elsewhere, multicriteria methods have remained unexplored in industrial motor systems until now. This paper proposes a multicriteria model using the PROMETHEE II method, with the aim of ranking alternatives for induction motor replacement. A comparative analysis of the model, applied to a Brazilian industry, has shown that multicriteria analysis presents better performance on energy saving as well as return on investment than a single criterion. The paper strongly recommends the dissemination of multicriteria decision aiding as a policy to support decision makers in industries and to improve energy efficiency in electric motor systems. - Highlights: → Lack of a decision model in industrial motor systems is the main motivation of the research. → A multicriteria model based on the PROMETHEE method is proposed with the aim of supporting decision makers in industries. → The model can contribute to transposing some barriers within industries, improving energy efficiency in industrial motor systems.

  8. Improvements in the MGA Code Provide Flexibility and Better Error Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruhter, W D; Kerr, J

    2005-05-26

    The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.

  9. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)

    2012-01-01

    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA generated particles in an ICP is studied using a high-speed, high frame rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work on ICP torch developments specifically tailored to the improvement of LA sample introduction is presented. An abnormal scarcity of metal-argon polyatomic ions (MAr{sup +}) is observed during ICP-MS analysis. Evidence shows that MAr{sup +} ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  10. Protein cleavage strategies for an improved analysis of the membrane proteome

    Directory of Open Access Journals (Sweden)

    Poetsch Ansgar

    2006-03-01

    Full Text Available Abstract Background Membrane proteins still remain elusive in proteomic studies. This is in part due to the distribution of the amino acids lysine and arginine, which are less frequent in integral membrane proteins and almost absent in transmembrane helices. As these amino acids are cleavage targets for the commonly used protease trypsin, alternative cleavage conditions, which should improve membrane protein analysis, were tested by in silico digestion for the three organisms Saccharomyces cerevisiae, Halobacterium sp. NRC-1, and Corynebacterium glutamicum as representatives of eukaryotes, archaea and eubacteria. Results For the membrane proteomes from all three analyzed organisms, we identified cleavage conditions that achieve better sequence and proteome coverage than trypsin. Greater improvement was obtained for bacteria than for yeast, which was attributed to differences in protein size and GRAVY. It was demonstrated for bacteriorhodopsin that the in silico predictions agree well with the experimental observations. Conclusion For all three examined organisms, it was found that a combination of chymotrypsin and staphylococcal peptidase I gave significantly better results than trypsin. As some of the improved cleavage conditions are not more elaborate than trypsin digestion and have been proven useful in practice, we suppose that the cleavage at both hydrophilic and hydrophobic amino acids should facilitate in general the analysis of membrane proteins for all organisms.
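
The in silico digestion idea can be illustrated with the standard textbook cleavage specificities (trypsin after Lys/Arg, chymotrypsin after Phe/Trp/Tyr, neither before proline); these rules are common simplifications, the test sequence is hypothetical, and staphylococcal peptidase I is not modeled here.

```python
def cleave(seq, residues, no_cut_before="P"):
    """Return peptides from cleavage C-terminal to the given residues,
    skipping sites followed by proline (a common simplification)."""
    peptides, start = [], 0
    for i, aa in enumerate(seq[:-1]):
        if aa in residues and seq[i + 1] != no_cut_before:
            peptides.append(seq[start:i + 1])
            start = i + 1
    peptides.append(seq[start:])
    return peptides

# Hypothetical transmembrane-like stretch: few K/R, many hydrophobic residues.
seq = "MALWFIVKAGLLFYPTRGGFSYV"
tryptic = cleave(seq, "KR")    # trypsin: cleave after Lys/Arg
chymo = cleave(seq, "FWY")     # chymotrypsin: cleave after Phe/Trp/Tyr
```

On a hydrophobic stretch like this one, chymotrypsin yields roughly twice as many (shorter) peptides as trypsin, which is the effect the paper exploits for transmembrane helices.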

  11. Improving land cover classification using input variables derived from a geographically weighted principal components analysis

    Science.gov (United States)

    Comber, Alexis J.; Harris, Paul; Tsutsumida, Narumasa

    2016-09-01

    This study demonstrates the use of a geographically weighted principal components analysis (GWPCA) of remote sensing imagery to improve land cover classification accuracy. A principal components analysis (PCA) is commonly applied in remote sensing but generates global, spatially-invariant results. GWPCA is a local adaptation of PCA that locally transforms the image data, and in doing so, can describe spatial change in the structure of the multi-band imagery, thus directly reflecting that many landscape processes are spatially heterogeneous. In this research the GWPCA localised loadings of MODIS data are used as textural inputs, along with GWPCA localised ranked scores and the image bands themselves to three supervised classification algorithms. Using a reference data set for land cover to the west of Jakarta, Indonesia the classification procedure was assessed via training and validation data splits of 80/20, repeated 100 times. For each classification algorithm, the inclusion of the GWPCA loadings data was found to significantly improve classification accuracy. Further, but more moderate improvements in accuracy were found by additionally including GWPCA ranked scores as textural inputs, data that provide information on spatial anomalies in the imagery. The critical importance of considering both spatial structure and spatial anomalies of the imagery in the classification is discussed, together with the transferability of the new method to other studies. Research topics for method refinement are also suggested.
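
The core GWPCA step, a PCA of a geographically weighted covariance matrix at one target location, can be sketched as below. The Gaussian kernel, fixed bandwidth, and synthetic four-band data are assumptions for illustration, not the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multi-band image sample: 200 pixels with (x, y) coordinates
# and 4 spectral bands. All values are synthetic.
coords = rng.uniform(0, 10, size=(200, 2))
bands = rng.normal(size=(200, 4))

def gwpca_loadings(target_xy, coords, X, bandwidth=2.0):
    """Loadings of a locally weighted PCA at one location: eigenvectors of
    the Gaussian-kernel-weighted covariance matrix (a minimal GWPCA sketch)."""
    d2 = ((coords - target_xy) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    w /= w.sum()
    mu = w @ X                               # weighted mean spectrum
    Xc = X - mu
    cov = (Xc * w[:, None]).T @ Xc           # weighted covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]        # principal components first
    return eigvals[order], eigvecs[:, order]

eigvals, loadings = gwpca_loadings(np.array([5.0, 5.0]), coords, bands)
```

Repeating this at every pixel gives the spatially varying loadings that the study feeds to the classifiers as textural inputs.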

  12. Cross-platform analysis of cancer microarray data improves gene expression based classification of phenotypes

    Directory of Open Access Journals (Sweden)

    Eils Roland

    2005-11-01

    Full Text Available Abstract Background The extensive use of DNA microarray technology in the characterization of the cell transcriptome is leading to an ever increasing amount of microarray data from cancer studies. Although similar questions for the same type of cancer are addressed in these different studies, a comparative analysis of their results is hampered by the use of heterogeneous microarray platforms and analysis methods. Results In contrast to a meta-analysis approach where results of different studies are combined on an interpretative level, we investigate here how to directly integrate raw microarray data from different studies for the purpose of supervised classification analysis. We use median rank scores and quantile discretization to derive numerically comparable measures of gene expression from different platforms. These transformed data are then used for training of classifiers based on support vector machines. We apply this approach to six publicly available cancer microarray gene expression data sets, which consist of three pairs of studies, each examining the same type of cancer, i.e. breast cancer, prostate cancer or acute myeloid leukemia. For each pair, one study was performed by means of cDNA microarrays and the other by means of oligonucleotide microarrays. In each pair, high classification accuracies (> 85%) were achieved with training and testing on data instances randomly chosen from both data sets in a cross-validation analysis. To exemplify the potential of this cross-platform classification analysis, we use two leukemia microarray data sets to show that important genes with regard to the biology of leukemia are selected in an integrated analysis, which are missed in either single-set analysis. Conclusion Cross-platform classification of multiple cancer microarray data sets yields discriminative gene expression signatures that are found and validated on a large number of microarray samples, generated by different laboratories and
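
The quantile-discretization step can be sketched as follows: each gene's values are replaced by rank-based bin indices, so two platforms measuring on different scales become directly comparable. The eight-sample vectors and four bins below are synthetic illustrations, not the paper's data.

```python
import numpy as np

def quantile_discretize(expr, n_bins=8):
    """Map each value of an expression vector to a quantile bin index so
    that measurements from different microarray platforms become
    comparable (a minimal sketch of the quantile-discretization idea)."""
    ranks = expr.argsort().argsort()        # rank 0 .. n-1 of each value
    return (ranks * n_bins) // len(expr)    # bin index in 0 .. n_bins-1

# Hypothetical: the same gene measured on two platforms with very
# different numeric scales but the same ordering across samples.
cdna = np.array([0.1, 0.5, 0.2, 0.9, 0.4, 0.8, 0.3, 0.7])
oligo = np.array([120, 510, 260, 990, 400, 870, 300, 750])

cdna_bins = quantile_discretize(cdna, 4)
oligo_bins = quantile_discretize(oligo, 4)  # identical despite the scales
```

Because only ranks survive the transformation, a classifier trained on the binned cDNA data can be applied to the binned oligonucleotide data.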

  13. Improved Regression Analysis of Temperature-Dependent Strain-Gage Balance Calibration Data

    Science.gov (United States)

    Ulbrich, N.

    2015-01-01

    An improved approach is discussed that may be used to directly include first and second order temperature effects in the load prediction algorithm of a wind tunnel strain-gage balance. The improved approach was designed for the Iterative Method that fits strain-gage outputs as a function of calibration loads and uses a load iteration scheme during the wind tunnel test to predict loads from measured gage outputs. The improved approach assumes that the strain-gage balance is at a constant uniform temperature when it is calibrated and used. First, the method introduces a new independent variable for the regression analysis of the balance calibration data. The new variable is designed as the difference between the uniform temperature of the balance and a global reference temperature. This reference temperature should be the primary calibration temperature of the balance so that, if needed, a tare load iteration can be performed. Then, two temperature-dependent terms are included in the regression models of the gage outputs. They are the temperature difference itself and the square of the temperature difference. Simulated temperature-dependent data obtained from Triumph Aerospace's 2013 calibration of NASA's ARC-30K five component semi-span balance is used to illustrate the application of the improved approach.
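
The regression structure described (gage output modeled on load plus the temperature difference and its square) can be sketched with ordinary least squares. All coefficients and data below are synthetic assumptions for illustration; they are not the ARC-30K calibration values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration data: gage output as a function of load with
# first- and second-order temperature effects (coefficients assumed).
load = rng.uniform(0, 100, 50)
temp = rng.uniform(10, 40, 50)
t_ref = 25.0                   # global reference (calibration) temperature
dT = temp - t_ref              # the new independent variable
output = 2.0 * load + 0.5 * dT + 0.02 * dT**2 + rng.normal(0, 0.01, 50)

# Regression model including the two temperature-dependent terms,
# dT and dT^2, alongside the load term and an intercept.
A = np.column_stack([load, dT, dT**2, np.ones_like(load)])
coef, *_ = np.linalg.lstsq(A, output, rcond=None)
```

With the temperature terms in the design matrix, the fit recovers the load sensitivity without the bias a temperature-blind model would absorb into it.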

  14. Using Mobile Phones to Improve Educational Outcomes: An Analysis of Evidence from Asia

    Directory of Open Access Journals (Sweden)

    John-Harmen Valk

    2010-03-01

    Full Text Available Despite improvements in educational indicators, such as enrolment, significant challenges remain with regard to the delivery of quality education in developing countries, particularly in rural and remote regions. In the attempt to find viable solutions to these challenges, much hope has been placed in new information and communication technologies (ICTs), mobile phones being one example. This article reviews the evidence of the role of mobile phone-facilitated mLearning in contributing to improved educational outcomes in the developing countries of Asia by exploring the results of six mLearning pilot projects that took place in the Philippines, Mongolia, Thailand, India, and Bangladesh. In particular, this article examines the extent to which the use of mobile phones helped to improve educational outcomes in two specific ways: 1) in improving access to education, and 2) in promoting new learning. Analysis of the projects indicates that while there is important evidence of mobile phones facilitating increased access, much less evidence exists as to how mobiles promote new learning.

  15. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    Science.gov (United States)

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital. PMID:20703560

  16. Heterogeneous Multi Core Processors for Improving the Efficiency of Market Basket Analysis Algorithm in Data Mining

    Directory of Open Access Journals (Sweden)

    Aashiha Priyadarshni .L

    2014-09-01

    Full Text Available Heterogeneous multi core processors can offer diverse computing capabilities, and the efficiency of the market basket analysis algorithm can be improved with them. The market basket analysis algorithm builds on the Apriori algorithm, one of the popular data mining algorithms, and can use the Map/Reduce framework to perform the analysis. The algorithm generates association rules from transactional data, and Map/Reduce motivates redesigning and converting the existing sequential algorithms for efficiency. Hadoop is the parallel programming platform built on the Hadoop Distributed File System (HDFS) for Map/Reduce computation that processes data as (key, value) pairs. In Hadoop Map/Reduce, the sequential jobs are parallelised and the Job Tracker assigns parallel tasks to the Task Tracker. Based on single-threaded or multithreaded parallel tasks in the Task Tracker, execution is carried out on the appropriate cores. For this, a new scheduler called the MB Scheduler can be developed. Switching between the cores can be made static or dynamic. The use of heterogeneous multi core processors optimizes the processing capabilities and power requirements of a processor and improves the performance of the system.
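
The Apriori core that such a system parallelises can be sketched on a single machine. This shows only the candidate-generation and support-counting logic; the Hadoop Map/Reduce distribution, the MB Scheduler, and the basket data below are outside this sketch or assumed.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal single-machine Apriori: return all frequent itemsets with
    their support counts. Candidates of size k+1 are built only from
    frequent k-itemsets (the Apriori pruning property)."""
    items = {i for t in transactions for i in t}
    frequent = {}
    k = 1
    current = [frozenset([i]) for i in sorted(items)]
    while current:
        counts = {c: sum(c <= t for t in transactions) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        keys = list(level)
        current = list({a | b for a, b in combinations(keys, 2)
                        if len(a | b) == k + 1})
        k += 1
    return frequent

# Hypothetical transaction data.
baskets = [frozenset(t) for t in
           [{"milk", "bread"}, {"milk", "bread", "eggs"},
            {"bread", "eggs"}, {"milk", "eggs"}]]
freq = apriori(baskets, min_support=2)
```

In the Map/Reduce version, support counting (the `counts` dictionary) is exactly the step that becomes a map over transactions followed by a reduce that sums per-itemset counts.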

  17. Stress analysis and evaluation of improved second-generation nuclear power plant piping systems

    International Nuclear Information System (INIS)

    Background: Piping is an important part in nuclear power plants. Purpose: In order to make piping meet the specification requirements, it is necessary to analyze and evaluate piping stress. Methods: This paper deals with the stress analysis and evaluation of the class 1 and class 2 piping of improved second-generation nuclear power plants, and discusses the calculation methods of piping stress due to various loads. Also, the effects of changing the RCC-M code edition on the calculation of piping stress are summarized. Results: Taking the LingAo Nuclear Power Plant Phase Ⅱ engineering safety injection system as an example, the piping stress analysis and evaluation are performed, and the results meet the RCC-M code requirements. Besides, the interface parameters including the support loads, nozzle loads, displacements of the piping and so on are obtained. Conclusions: This paper will provide technical support for stress analysis and evaluation of improved second-generation nuclear power plant piping systems. (authors)

  18. An Improved Distance and Mass Estimate for Sgr A* from a Multistar Orbit Analysis

    CERN Document Server

    Boehle, A; Schödel, R; Meyer, L; Yelda, S; Albers, S; Martinez, G D; Becklin, E E; Do, T; Lu, J R; Matthews, K; Morris, M R; Sitarski, B; Witzel, G

    2016-01-01

    We present new, more precise measurements of the mass and distance of our Galaxy's central supermassive black hole, Sgr A*. These results stem from a new analysis that more than doubles the time baseline for astrometry of faint stars orbiting Sgr A*, combining two decades of speckle imaging and adaptive optics data. Specifically, we improve our analysis of the speckle images by using information about a star's orbit from the deep adaptive optics data (2005 - 2013) to inform the search for the star in the speckle years (1995 - 2005). When this new analysis technique is combined with the first complete re-reduction of Keck Galactic Center speckle images using speckle holography, we are able to track the short-period star S0-38 (K-band magnitude = 17, orbital period = 19 years) through the speckle years. We use the kinematic measurements from speckle holography and adaptive optics to estimate the orbits of S0-38 and S0-2 and thereby improve our constraints of the mass ($M_{bh}$) and distance ($R_o$) of Sgr A*: $...
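
The mass constraint behind such orbit fits comes from Kepler's third law: an orbit's angular semimajor axis converts to a physical one via the distance, and the period then fixes the enclosed mass. The values below are only rough, assumed approximations of published S0-2 parameters, used to show that the arithmetic lands near the well-known few-million-solar-mass result.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # kg
YEAR = 3.156e7       # s
PC = 3.086e16        # m

# Illustrative values close to published S0-2 parameters (assumed here,
# not taken from the paper): angular semimajor axis ~0.125 arcsec,
# orbital period ~16 yr, Galactic Center distance R0 ~8 kpc.
theta_arcsec = 0.125
period_yr = 16.0
R0_pc = 8000.0

a_m = theta_arcsec / 206265.0 * (R0_pc * PC)   # physical semimajor axis
P_s = period_yr * YEAR
M_bh = 4 * math.pi**2 * a_m**3 / (G * P_s**2)  # Kepler's third law
print(f"M_bh ~ {M_bh / M_SUN:.2e} solar masses")
```

Note the cubic dependence on the angular size and distance: this is why a joint fit of mass and distance, as in the paper, is needed rather than two independent estimates.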

  19. Development of an improved commercial sector energy model for national policy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, D.B.

    1992-12-01

    Pacific Northwest Laboratory provided support to the Office of Conservation and Renewable Energy (CE), under the Office of Planning and Assessment, to develop improved energy and environmental analysis tools. Commercial building sector energy models from the past decade were analyzed in order to provoke comment and stimulate discussion between potential model users and developers as to the appropriate structure and capability of a commercial sector energy model supported by CE. Three specific areas were examined during this review. These areas provide (1) a look at recent suggestions and guidance as to what constitutes a minimal set of requirements and capabilities for a commercial buildings energy model for CE, (2) a review of several existing models in terms of their general structure and how they match up with the requirements listed previously, and (3) an overview of a proposed improved commercial sector energy model.

  20. Improvements in data analysis obtained by large-area silicon ΔE - E detector telescopes

    International Nuclear Information System (INIS)

    The paper describes a few practical methods for the analysis of data obtained by standard thin-thick silicon detector telescopes used in nuclear reaction measurements. The addressed issues are: (1) improvement in double-sided silicon strip detector (DSSSD) calibration based on the fact that each event is registered twice, both in horizontal and vertical strips, (2) improvements in particle identification and (3) simplified mapping of the non-uniformity of the thin detector, without a dedicated measurement of the thickness. The proposed procedures are applied on experimental data obtained for 30MeV 7Li beam induced reactions on LiF and C targets, studied with a detection setup consisting of four telescopes placed at different angles and distances. The proposed methods aim at quicker and more reliable calibration and particle identification. (orig.)

  1. Silica Fume and Fly Ash Admixed Can Help to Improve the PRC Durability Combine Microscopic Analysis

    Directory of Open Access Journals (Sweden)

    Xiao Li-guang

    2016-01-01

    Full Text Available Silica fume/fly ash RPC can greatly improve durability. When 8% of the cement was replaced by an equal amount of silica fume, together with 10% of fly ash that had been mechanically activated for 15 min, chloride ion flux measurements showed that the impermeability of the co-doped RPC improved significantly over the reference RPC. In addition, determination of the internal pore structure by static nitrogen adsorption showed that the integral pore volume of the co-doped RPC was significantly lower than that of the reference RPC. Combined with SEM microscopic analysis of the internal structure and formation mechanism of the RPC, the results show that the SF/FA combination fully embodies the "synergistic" principle of composite admixtures.

  2. State of the art review of sodium fire analysis and current notions for improvements

    International Nuclear Information System (INIS)

    Sodium releases from postulated pipe ruptures, as well as failures of sodium handling equipment in liquid metal fast breeder reactors, may lead to substantial pressure-temperature transients in the sodium system cells, as well as in the reactor containment building. Sodium fire analyses are currently performed with analytical tools, such as the SPRAY, SOMIX, SPOOL-FIRE and SOFIRE-II codes. A review and evaluation of the state-of-the-art in sodium fire analysis is presented, and suggestions for further improvements are made. This work is based, in part, on studies made at Brookhaven National Laboratory during the past several years in the areas of model development and improvement associated with the accident analyses of LMFBRs. (author)

  3. Improved modelling of power transformer winding using bacterial swarming algorithm and frequency response analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shintemirov, A.; Tang, W.J.; Tang, W.H.; Wu, Q.H. [Department of Electrical Engineering and Electronics The University of Liverpool, Brownlow Hill, Liverpool L69 3GJ (United Kingdom)

    2010-09-15

    The paper discusses an improved modelling of transformer windings based on bacterial swarming algorithm (BSA) and frequency response analysis (FRA). With the purpose to accurately identify transformer windings parameters a model-based identification approach is introduced using a well-known lumped parameter model. It includes search space estimation using analytical calculations, which is used for the subsequent model parameters identification with a novel BSA. The newly introduced BSA, being developed upon a bacterial foraging behavior, is described in detail. Simulations and discussions are presented to explore the potential of the proposed approach using simulated and experimentally measured FRA responses taken from two transformers. The BSA identification results are compared with those using genetic algorithm. It is shown that the proposed BSA delivers satisfactory parameter identification and improved modelling can be used for FRA results interpretation. (author)

  4. ANALYSIS AND IMPROVEMENT OF PRODUCTION EFFICIENCY IN A CONSTRUCTION MACHINE ASSEMBLY LINE

    Directory of Open Access Journals (Sweden)

    Alidiane Xavier

    2016-07-01

    Full Text Available The increased competitiveness in the market encourages the ongoing development of systems and production processes. The aim is to increase production efficiency so that production costs and waste are reduced to the extreme, allowing increased product competitiveness. The objective of this study was to analyze the overall results of implementing a Kaizen philosophy at a manufacturer of construction machinery, using the methodology of action research, in which the macro production process was studied in situ from receipt of parts to the end of the assembly line, prioritizing the analysis of shipping and handling time. The results show that the continuous improvement activities directly impact the elimination of waste from the assembly process, mainly related to shipping and handling, improving production efficiency by 30% in the studied processes.

  5. State-of-the-art review of sodium fire analysis and current notions for improvements

    International Nuclear Information System (INIS)

    Sodium releases from postulated pipe ruptures, as well as failures of sodium handling equipment in liquid metal fast breeder reactors, may lead to substantial pressure-temperature transients in the sodium system cells, as well as in the reactor containment building. Sodium fire analyses are currently performed with analytical tools, such as the SPRAY, SOMIX, SPOOL-FIRE and SOFIRE-II codes. A review and evaluation of the state-of-the-art in sodium fire analysis is presented, and suggestions for further improvements are made. This work is based, in part, on studies made at Brookhaven National Laboratory during the past several years in the areas of model development and improvement associated with the accident analyses of LMFBRs

  6. Cost-benefit analysis of improved air quality in an office building

    DEFF Research Database (Denmark)

    Djukanovic, R.; Wargocki, Pawel; Fanger, Povl Ole

    2002-01-01

    A cost-benefit analysis of measures to improve air quality in an existing air-conditioned office building (11581 m2, 864 employees) was carried out for hot, temperate and cold climates and for two operating modes: Variable Air Volume (VAV) with economizer; and Constant Air Volume (CAV) with heat recovery. The annual energy cost and first cost of the HVAC system were calculated using DOE 2.1E for different levels of air quality (10-50% dissatisfied). This was achieved by changing the outdoor air supply rate and the pollution loads. Previous studies have documented a 1.1% increase in office productivity for every 10% reduction in the proportion of occupants entering a space who are dissatisfied with the air quality. With this assumption, the annual benefit due to improved air quality was always at least 10 times higher than the increase in annual energy and maintenance costs. The payback time of...

  7. Improvements in data analysis obtained by large-area silicon ΔE - E detector telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Uroic, M.; Miljanic, D.; Prepolec, L.; Soic, N. [Ruder Boskovic Institute, Zagreb (Croatia); Milin, M. [University of Zagreb, Department of Physics, Faculty of Science, Zagreb (Croatia); Di Pietro, A.; Figuera, P.; Fisichella, M.; Pellegriti, M.G.; Scuderi, V. [Laboratori Nazionali del Sud, INFN, Catania (Italy); Lattuada, M. [Laboratori Nazionali del Sud, INFN, Catania (Italy); Universtita' di Catania, Dipartimento di Fisica e Astronomia, Catania (Italy); Martel, I. [Universidad de Huelva, Departamento de Fisica Aplicada, Huelva (Spain); Sanchez Benitez, A.M. [Universidad de Huelva, Departamento de Fisica Aplicada, Huelva (Spain); Universidade de Lisboa, Centro de Fisica Nuclear da, Lisboa (Portugal); Strano, E.; Torresi, D. [Laboratori Nazionali del Sud, INFN, Catania (Italy); Laboratori Nazionali di Legnaro, INFN, Legnaro (Italy)

    2015-08-15

    The paper describes a few practical methods for the analysis of data obtained by standard thin-thick silicon detector telescopes used in nuclear reaction measurements. The addressed issues are: (1) improvement in double-sided silicon strip detector (DSSSD) calibration based on the fact that each event is registered twice, both in horizontal and vertical strips, (2) improvements in particle identification and (3) simplified mapping of the non-uniformity of the thin detector, without a dedicated measurement of the thickness. The proposed procedures are applied on experimental data obtained for 30 MeV {sup 7}Li beam induced reactions on LiF and C targets, studied with a detection setup consisting of four telescopes placed at different angles and distances. The proposed methods aim at quicker and more reliable calibration and particle identification. (orig.)
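
The physical basis of ΔE - E particle identification can be sketched with the crude Bethe-type approximation dE/dx ∝ MZ²/E: the product of the thin-detector energy loss and the total energy then scales roughly with MZ², separating particle species into bands. This toy model, with its arbitrary constant k, is only an illustration of the principle, not the calibration or identification procedure of the paper.

```python
# Crude analytic sketch: in a thin detector, dE/dx ~ k * M * Z^2 / E,
# so dE * E is roughly proportional to M * Z^2 and acts as a particle
# identification (PID) variable. Constants are arbitrary illustrations.
def delta_e(e_total_mev, m, z, thickness=1.0, k=0.1):
    """Approximate energy deposited in a thin transmission detector."""
    return k * thickness * m * z**2 / e_total_mev

# Mass number and charge for a few light species seen in such reactions.
species = {"p": (1, 1), "d": (2, 1), "4He": (4, 2), "7Li": (7, 3)}

# PID variable dE * E at a fixed total energy of 30 MeV.
pid = {name: delta_e(30.0, m, z) * 30.0 for name, (m, z) in species.items()}
```

Because the PID variable is (approximately) independent of the total energy, each species traces a distinct band in the ΔE versus E plane, which is what the improved identification procedures in the paper exploit.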

  8. Improving analytic hierarchy process applied to fire risk analysis of public building

    Institute of Scientific and Technical Information of China (English)

    SHI Long; ZHANG RuiFang; XIE QiYuan; FU LiHua

    2009-01-01

    The structure importance in Fault Tree Analysis (FTA) reflects how important Basic Events are to the Top Event. Attributes at the alternative level in the Analytic Hierarchy Process (AHP) likewise reflect their importance to the general goal. Based on the coherence of these two methods, an improved AHP is put forward. Using this improved method, how important an attribute is to the fire safety of a public building can be analyzed more credibly because of the reduction of subjective judgment. Olympic venues are very important public buildings in China, and their fire safety evaluation is a major issue for engineers. The improved AHP is a useful tool for the safety evaluation of these Olympic venues, and it will guide evaluation in other areas.
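
The standard AHP machinery underlying such an analysis, attribute weights from the principal eigenvector of a pairwise comparison matrix plus Saaty's consistency check, can be sketched as below. The comparison matrix is purely illustrative; the random index RI = 0.58 is the standard Saaty value for a 3x3 matrix.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three fire-safety attributes
# on the Saaty 1-9 scale; values are illustrative only.
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

# Attribute weights = normalized principal (Perron) eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty consistency check: CR < 0.1 means acceptable judgments.
n = len(A)
ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
cr = ci / 0.58                        # RI = 0.58 for n = 3
```

The improved AHP in the paper constrains these subjective comparisons with FTA structure importance; the eigenvector and consistency computation itself is unchanged.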

  9. Analysis of Improvement on Human Resource Management within Chinese Enterprises in Economic Globalization

    Directory of Open Access Journals (Sweden)

    Lihui Xie

    2013-04-01

    Full Text Available In this study, we analyze the improvement of human resource management within Chinese enterprises under economic globalization. China's entry into the WTO has accelerated the pace of economic globalization of Chinese enterprises, and the Chinese economy is further integrated with the global economy on a global scale. Human resources are what the economic globalization of Chinese enterprises relies on, the first resource for China to participate in international competition, and also the key to making effective use of other resources. Nevertheless, against the background of economic globalization, human resource management in Chinese enterprises still faces many challenges and problems. In order to establish a globalized concept of human resource management and set up a human resource management mechanism that responds to economic globalization, this study discusses and proposes management methods and improvement measures for reference.

  10. Business analysis for Wal-Mart, a grocery retail chain, and improvement proposals

    OpenAIRE

    BARBERÁ MARCILLA, LAURA

    2014-01-01

    This study consists of the analysis of a very large grocery retail chain and a proposal of a series of improvements that I consider can help the company grow in the future. Wal-Mart Stores, Inc. is a multinational retail corporation that runs large discount superstores and warehouses. It was founded less than fifty years ago by Sam Walton and his brother Bud in Bentonville, Arkansas (USA). With sales over $300 billion a year, Wal-Mart is considered one of the world's most valuable companies...

  11. Analysis and Improvement of TCP Congestion Control Mechanism Based on Global Optimization Model

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Network flow control is formulated as a global optimization problem of user profit. A general global optimization flow control model is established. This model, combined with the stochastic model of TCP, is used to study the global rate allocation characteristic of TCP. Analysis shows that when active queue management is used in the network, TCP rates tend to be allocated to maximize the aggregate of a user utility function Us (called Us fairness). The TCP throughput formula is derived. An improved TCP congestion control mechanism is proposed. Simulations show its throughput is TCP-friendly when competing with existing TCP and its rate change is smoother. Therefore, it is suitable for carrying multimedia applications.
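
The kind of throughput formula referred to can be illustrated with the classic "square-root" model of steady-state TCP throughput, rate = MSS / (RTT * sqrt(2p/3)) for loss probability p. This is the standard textbook result, used here only as an illustration; it is not necessarily the exact formula derived in the paper.

```python
import math

def tcp_throughput(rtt_s, loss_prob, mss_bytes=1460):
    """Classic square-root model of steady-state TCP throughput in
    bytes/s: rate = MSS / (RTT * sqrt(2p/3)). Standard result, not
    necessarily the paper's exact derived formula."""
    return mss_bytes / (rtt_s * math.sqrt(2 * loss_prob / 3))

# Halving the loss probability raises throughput by a factor of sqrt(2):
r1 = tcp_throughput(0.1, 0.02)
r2 = tcp_throughput(0.1, 0.01)
```

A TCP-friendly mechanism, like the improved one proposed, keeps its long-term rate at or below what this formula gives for the same RTT and loss rate.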

  12. Improved Methodology Application for 12-Rad Analysis in a Shielded Facility at SRS

    International Nuclear Information System (INIS)

    The DOE Order 420.1 requires establishing 12-rad evacuation zone boundaries and installing Criticality Accident Alarm System (CAAS) per ANS-8.3 standard for facilities having a probability of criticality greater than 10^-6 per year. The H-Canyon at the Savannah River Site (SRS) is one of the reprocessing facilities where SRS reactor fuels, research reactor fuels, and other fissile materials are processed and purified using a modified Purex process called H-Modified or HM Process. This paper discusses an improved methodology for 12-rad zone analysis and its implementation within this large shielded facility that has a large variety of criticality sources and scenarios

  13. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized as needing further improvement: (1) subjective judgement in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and use of the same numerical grids for both scaled experiments and real plant applications; (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, the above issues still exist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the differential of the physical solution with respect to any constant parameter). When the parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) Quantifying numerical errors: new codes which are totally implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA. (2) Quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: treat numerical errors as special sensitivities alongside other physical uncertainties; only parameters having large uncertainty effects on design criteria are considered. (3) Greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information

  14. Human Factors Analysis to Improve the Processing of Ares-1 Launch Vehicle

    Science.gov (United States)

    Stambolian, Damon B.; Dippolito, Gregory M.; Nyugen, Bao; Dischinger, Charles; Tran, Donald; Henderson, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of human factors analysis in improving the ground processing procedures for the Ares-1 launch vehicle. The flight vehicle engineering designers for the Ares-1 launch vehicle had to design the flight vehicle for effective, efficient, and safe ground operations within the cramped dimensions of a rocket design. The use of a mockup of the area where the technician would be required to work proved to be a very effective method of promoting collaboration between the Ares-1 designers and the ground operations personnel.

  15. Analysis and improvement of a chaos-based image encryption algorithm

    International Nuclear Information System (INIS)

    The security of digital images has attracted much attention recently. In Guan et al. [Guan Z, Huang F, Guan W. Chaos-based image encryption algorithm. Phys Lett A 2005; 346: 153-7.], a chaos-based image encryption algorithm was proposed. In this paper, the cause of the potential flaws in the original algorithm is analyzed in detail, and the corresponding enhancement measures are proposed. Both theoretical analysis and computer simulation indicate that the improved algorithm can overcome these flaws and maintain all the merits of the original one.
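To illustrate the general class of algorithm under discussion (this is not Guan et al.'s scheme, whose details are in the cited paper), a minimal chaotic-keystream cipher can be sketched as follows. Real chaos-based image ciphers add permutation and diffusion stages on top of such a keystream, and weaknesses in exactly those components are what cryptanalyses of this kind target. The logistic-map key values below are arbitrary assumptions.

```python
def logistic_keystream(x0, r, n, burn_in=100):
    """Generate n keystream bytes by iterating the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn_in):            # discard the transient iterates
        x = r * x * (1.0 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)
    return stream

def xor_cipher(data, key):
    """Encrypt or decrypt bytes by XOR with the chaotic keystream (symmetric)."""
    x0, r = key
    stream = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, stream))

key = (0.3579, 3.99)                    # (initial condition, map parameter) - assumed
plain = bytes(range(16))                # stand-in for a row of image pixels
cipher = xor_cipher(plain, key)
recovered = xor_cipher(cipher, key)     # XOR with the same keystream inverts it
```

Because decryption is just re-encryption, any statistical weakness in the keystream (e.g. a poorly chosen map parameter) is directly exposed to an attacker, which is why flaw analyses of such ciphers focus on the chaotic source.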

  16. Improved Method for the Flow Injection Analysis of Chemical Oxygen Demand Using Silver Nitrate

    OpenAIRE

    Korenaga, Takashi; Ikatsu, Hisayoshi; Moriwake, Tosio; Takahashi, Teruo

    1980-01-01

    For the flow injection analysis (FIA) of chemical oxygen demand (COD), a silver salt was added as an oxidation catalyst for COD substances and as a masking agent for halide, to improve the operating conditions of the FIA apparatus. A potassium permanganate solution of appropriate concentration and a 6.0 % sulfuric acid solution containing 0.1 % silver nitrate are each pumped at a flow rate of 0.51 ml min(-1) and merged into a carrier stream. A 20 μl volume of sample solution is injected ...

  17.  The Assembly of Lean Production : An Analysis of Doing Production Improvements

    OpenAIRE

    Andersson, Gunnar

    2011-01-01

    This thesis is an analysis of the assembly of the zero-defects project at Glomma Papp AS, a manufacturer of paper, corrugated board, solid board and displays in Sarpsborg, Norway. The zero-defects project was a local production improvement project based on approaches, tools and methods known as Lean. The project is seen as an actor-network, which means that its reality, and the understandings and practices of it, are effects of the web of people, structures, technologies and others w...

  18. Improvements in the vapor-time profile analysis of explosive odorants using solid-phase microextraction.

    Science.gov (United States)

    Young, Mimy; Schantz, Michele; MacCrehan, William

    2016-07-15

    A modified approach for characterization of the vapor-time profile of the headspace odors of explosives was developed using solid-phase microextraction (SPME), incorporating the introduction of an externally-sampled internal standard (ESIS) followed by gas chromatography/mass spectrometry (GC/MS) analysis. With this new method, the reproducibility of the measurements of 2-ethyl-1-hexanol and cyclohexanone was improved compared to previous work (Hoffman et al., 2009; Arthur and Pawliszyn, 1990) through the use of stable-isotope-labeled internal standards. Exposing the SPME fiber to the ESIS after sampling the target analyte proved to be advantageous, while still correcting for fiber variability and detector drift. For the analysis of high-volatility compounds, incorporation of the ESIS using the SPME fiber in the retracted position minimized the subsequent competitive loss of the target analyte, allowing for much longer sampling times. PMID:27286650

  19. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability is most beneficial when the system changes: if changes are made to high-level requirements, the corresponding low-level requirements need to be modified as well. Traceability ensures that requirements are appropriately and efficiently verified at various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.
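The mechanics of a traceability matrix can be sketched minimally: representing it as a mapping makes gaps in either direction, and the impact set of a change, simple set queries. The requirement IDs below are hypothetical, not taken from the paper.

```python
# hypothetical traceability matrix: high-level requirement -> low-level requirements
trace = {
    "HLR-001": ["LLR-001", "LLR-002"],
    "HLR-002": ["LLR-003"],
    "HLR-003": [],                      # gap: not yet decomposed
}
all_low_level = {"LLR-001", "LLR-002", "LLR-003", "LLR-004"}

# high-level requirements with no downward trace (cannot yet be verified)
untraced_high = sorted(h for h, lows in trace.items() if not lows)

# low-level requirements tracing to no parent (possible gold-plating)
orphan_low = sorted(all_low_level - {l for lows in trace.values() for l in lows})

# impact analysis: which low-level requirements a change to HLR-001 touches
impacted = trace["HLR-001"]
```

Running such checks on every change is what makes traceability cheap to maintain once the matrix exists early in the cycle.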

  20. Structure of CPV17 polyhedrin determined by the improved analysis of serial femtosecond crystallographic data

    International Nuclear Information System (INIS)

    The X-ray free-electron laser (XFEL) allows the analysis of small weakly diffracting protein crystals, but has required very many crystals to obtain good data. Here we use an XFEL to determine the room temperature atomic structure for the smallest cytoplasmic polyhedrosis virus polyhedra yet characterized, which we failed to solve at a synchrotron. These protein microcrystals, roughly a micron across, accrue within infected cells. We use a new physical model for XFEL diffraction, which better estimates the experimental signal, delivering a high-resolution XFEL structure (1.75 Å), using fewer crystals than previously required for this resolution. The crystal lattice and protein core are conserved compared with a polyhedrin with less than 10% sequence identity. We explain how the conserved biological phenotype, the crystal lattice, is maintained in the face of extreme environmental challenge and massive evolutionary divergence. Our improved methods should open up more challenging biological samples to XFEL analysis

  1. Improved soil particle-size analysis by gamma-ray attenuation

    International Nuclear Information System (INIS)

    The size distribution of particles is useful for physical characterization of soil. This study was conducted to determine whether a new method of soil particle-size analysis by gamma-ray attenuation could be further improved by changing the depth and time of measurement of the suspended particle concentration during sedimentation. In addition to the advantage of nondestructive, undisturbed measurement by gamma-ray attenuation, as compared with conventional pipette or hydrometer methods, the modifications here suggested and employed do substantially decrease the total time for analysis, and will also facilitate total automation and generalize the method for other sedimentation studies. Experimental results are presented for three different Brazilian soil materials, and illustrate the nature of the fine detail provided in the cumulative particle-size distribution as given by measurements obtained during the relatively short time period of 28 min
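The sedimentation measurement the abstract relies on combines two standard relations: Stokes' law gives the largest particle diameter still in suspension at a given depth and time, and Beer-Lambert attenuation of the gamma beam gives the suspended concentration there. The sketch below uses assumed constants for quartz particles settling in water; it is an illustration of the physics, not the paper's procedure.

```python
import math

def stokes_diameter(h, t, mu=1.0e-3, rho_s=2650.0, rho_f=1000.0, g=9.81):
    """Largest particle diameter (m) still in suspension at depth h (m) after
    settling time t (s), from Stokes' law: v = d^2 * g * (rho_s - rho_f) / (18*mu)."""
    return math.sqrt(18.0 * mu * h / (g * (rho_s - rho_f) * t))

def suspension_concentration(I, I0, mass_atten, path):
    """Suspended-solids concentration (kg/m^3) from Beer-Lambert attenuation
    of the gamma beam: I = I0 * exp(-mass_atten * c * path)."""
    return math.log(I0 / I) / (mass_atten * path)

# largest quartz particle still suspended 0.05 m deep after the 28 min
# measurement period mentioned in the abstract (assumed depth)
d = stokes_diameter(h=0.05, t=28 * 60)
print(f"{d * 1e6:.1f} um")   # → 5.8 um, i.e. fine-silt/clay range
```

Measuring the attenuation at several depths and times, as the improved method does, traces out the cumulative particle-size distribution without disturbing the suspension.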

  2. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.H.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [John Wreathall & Co., Dublin, OH (United States)] [and others

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  3. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  4. Improving student critical thinking skills through a root cause analysis pilot project.

    Science.gov (United States)

    Tschannen, Dana; Aebersold, Michelle

    2010-08-01

    The Essentials of Baccalaureate Education for Professional Nursing Practice provides a framework for building the baccalaureate education for the twenty-first century. One of the exemplars included in the essentials toolkit includes student participation in an actual root cause analysis (RCA) or failure mode effects analysis. To align with this exemplar, faculty at the University of Michigan School of Nursing developed a pilot RCA project for the senior-level Leadership and Management course. While working collaboratively with faculty and unit liaisons at the University Health System, students completed an RCA on a nursing sensitive indicator (pain assessment or plan of care compliance). An overview of the pilot project, including the implementation process, is described. Each team of students identified root causes and recommendations for improvement on clinical and documentation practice within the context of the unit. Feedback from both the unit liaisons and the students confirmed the pilot's success. PMID:20509590

  5. Improving Resolution and Depth of Astronomical Observations via Modern Mathematical Methods for Image Analysis

    Science.gov (United States)

    Castellano, M.; Ottaviani, D.; Fontana, A.; Merlin, E.; Pilo, S.; Falcone, M.

    2015-09-01

    In the past years, modern mathematical methods for image analysis have led to a revolution in many fields, from computer vision to scientific imaging. However, some recently developed image processing techniques successfully exploited by other sectors have rarely, if ever, been applied to astronomical observations. We present here tests of two classes of variational image enhancement techniques, "structure-texture decomposition" and "super-resolution", showing that they are effective in improving the quality of observations. Structure-texture decomposition makes it possible to recover faint sources previously hidden by the background noise, effectively increasing the depth of available observations. Super-resolution yields a higher-resolution and better-sampled image out of a set of low-resolution frames, thus mitigating problems in data analysis arising from the difference in resolution/sampling between different instruments, as in the case of the EUCLID VIS and NIR imagers.

  6. Improving resolution and depth of astronomical observations via modern mathematical methods for image analysis

    CERN Document Server

    Castellano, Marco; Fontana, Adriano; Merlin, Emiliano; Pilo, Stefano; Falcone, Maurizio

    2015-01-01

    In the past years, modern mathematical methods for image analysis have led to a revolution in many fields, from computer vision to scientific imaging. However, some recently developed image processing techniques successfully exploited by other sectors have rarely, if ever, been applied to astronomical observations. We present here tests of two classes of variational image enhancement techniques, "structure-texture decomposition" and "super-resolution", showing that they are effective in improving the quality of observations. Structure-texture decomposition makes it possible to recover faint sources previously hidden by the background noise, effectively increasing the depth of available observations. Super-resolution yields a higher-resolution and better-sampled image out of a set of low-resolution frames, thus mitigating problems in data analysis arising from the difference in resolution/sampling between different instruments, as in the case of the EUCLID VIS and NIR imagers.

  7. Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft

    Science.gov (United States)

    Stambolian, Damon B.; Schlierf, Roland; Miller, Darcy; Posada, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderon, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of human factors and timeline analysis to achieve a more efficient and effective processing flow. The solution involved developing a written timeline of events that included each activity within each functional flow block. Each activity had computer animation videos and pictures of the people involved and the hardware. The Human Factors Engineering Analysis Tool (HFEAT) was improved by modifying it to include the timeline of events. The HFEAT was used to define the human factors requirements, and design solutions were developed for these requirements. An example of a functional flow block diagram is shown, and a view from one of the animations (i.e., the short stack pallet) is shown and explained.

  8. Linear analysis of the vertical shear instability: outstanding issues and improved solutions (Research Note)

    CERN Document Server

    Umurhan, O M; Gressel, O

    2015-01-01

    The Vertical Shear Instability is one of two known mechanisms potentially active in the so-called dead zones of protoplanetary accretion disks. A recent analysis indicates that a subset of unstable modes shows unbounded growth - both as resolution is increased and when the nominal lid of the atmosphere is extended, possibly indicating ill-posedness in previous attempts of linear analysis. The reduced equations governing the instability are revisited and the generated solutions are examined using both the previously assumed separable forms and an improved non-separable solution form that is herewith introduced. Analyzing the reduced equations using the separable form shows that, while the low-order body modes have converged eigenvalues and eigenfunctions (as both the vertical boundaries of the atmosphere are extended and with increased radial resolution), it is also confirmed that the corresponding high-order body modes and the surface modes do indeed show unbounded growth rates. However, the energy contained ...

  9. Improved Persistent Scatterer analysis using Amplitude Dispersion Index optimization of dual polarimetry data

    Science.gov (United States)

    Esmaeili, Mostafa; Motagh, Mahdi

    2016-07-01

    Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual/quad-polarized data provides an additional source of information to further improve the capability of InSAR time-series analysis. In this paper we use dual-polarized data and combine Amplitude Dispersion Index (ADI) optimization of pixels with a phase stability criterion for PSInSAR analysis. ADI optimization is performed using a Simulated Annealing algorithm to increase the number of Persistent Scatterer Candidates (PSC). The phase stability of PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) SAR images acquired by TerraSAR-X from July 2013 to January 2014 over a subsidence area in Iran and compare the effectiveness of the method for both agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in urban and non-urban regions. Compared to single-pol data, the use of optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with a high rate of deformation, which suffer from loss of phase stability over time. The classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of the PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly the single-bounce mechanism.
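The Amplitude Dispersion Index at the heart of the method is simply the temporal standard deviation of a pixel's amplitude divided by its temporal mean, with low-ADI pixels taken as Persistent Scatterer Candidates. The sketch below computes it on a synthetic 17-image stack (matching the dataset size in the abstract); the noise model, scene size, and the ~0.25 threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic amplitude stack: 17 acquisitions of a 50 x 50 scene (speckle-like noise)
n_imgs, ny, nx = 17, 50, 50
stack = rng.gamma(shape=4.0, scale=1.0, size=(n_imgs, ny, nx))
stack[:, 10, 10] = 20.0 + rng.normal(0.0, 0.5, n_imgs)  # one bright, stable scatterer

# Amplitude Dispersion Index: temporal std / temporal mean, per pixel
adi = stack.std(axis=0) / stack.mean(axis=0)

# pixels below a commonly used ~0.25 threshold become PS candidates
psc_mask = adi < 0.25
```

Optimizing the polarimetric combination per pixel, as the paper does with Simulated Annealing, amounts to searching for the scattering mechanism that minimizes this ADI, which is why it enlarges the candidate set.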

  10. A social work study on the effect of transactional analysis on the improvement of intimacy attitude

    Directory of Open Access Journals (Sweden)

    Parvin Gol

    2013-04-01

    Full Text Available The purpose of this paper is to investigate the impact of group counseling using transactional analysis on the improvement of intimacy attitude in depressed patients in the city of Esfahan, Iran. In this paper, a semi-experimental design with pretest-posttest control groups was conducted among 30 patients. The sample was selected through an availability sampling method among the depressed patients referred to psychiatric centers. They were randomly assigned to experimental and control groups. The measurement instruments were the intimacy attitude scale (IAS) questionnaire by Amidon et al. (1983) [Amidon, E., Kumar, V. K., & Treadwell, T. (1983). Measurement of intimacy attitudes: The intimacy attitude scale-revisited. Journal of Personality Assessment, 47(6), 635-639.] and the Beck Depression Inventory (BDI). The pretest and posttest scores on the intimacy attitude scale questionnaire were analyzed in both the experimental and control groups. For statistical analysis of the data, a repeated-measures analysis of variance was carried out. The research findings indicated that group counseling using transactional analysis increases the level of intimacy attitude in depressed individuals. It also increases emotional intimacy, but it does not increase mental intimacy.

  11. The 3rd ATLAS Domestic Standard Problem for Improvement of Safety Analysis Technology

    International Nuclear Information System (INIS)

    The third ATLAS DSP (domestic standard problem exercise) was launched at the end of 2012 in response to the strong need for continuation of the ATLAS DSP. A guillotine break of a main steam line without LOOP at a zero-power condition was selected as the target scenario, and the exercise was successfully completed at the beginning of 2014. In the 3rd ATLAS DSP, comprehensive utilization of the integral effect test data was made by dividing the analysis into three topics: (1) scale-up, where extrapolation of the ATLAS IET data was investigated; (2) 3D analysis, where the improvement obtainable with 3D modeling was studied; and (3) 1D sensitivity analysis, where the key phenomena affecting the SLB simulation were identified and a best-modeling guideline was established. Through such DSP exercises, it has been possible to effectively utilize the high-quality ATLAS experimental data to enhance thermal-hydraulic understanding and to validate the safety analysis codes. A strong human network and technical expertise sharing among the various nuclear experts are also important outcomes of this program.

  12. Security Analysis and Improvements of Authentication and Access Control in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Bruce Ndibanje

    2014-08-01

    Full Text Available The Internet of Things is a ubiquitous concept in which physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices, as well as the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data, and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. According to our analysis, Jing et al.'s protocol is costly in message exchange, and its security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements facilitate many services for the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks and achieves better efficiency at low communication cost.

  13. Security analysis and improvements of authentication and access control in the Internet of Things.

    Science.gov (United States)

    Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon

    2014-01-01

    The Internet of Things is a ubiquitous concept in which physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices, as well as the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data, and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18-21 June 2012, pp. 588-592). According to our analysis, Jing et al.'s protocol is costly in message exchange, and its security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements facilitate many services for the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks and achieves better efficiency at low communication cost. PMID:25123464

  14. Analysis of the dynamic response improvement of a turbocharged diesel engine driven alternating current generating set

    International Nuclear Information System (INIS)

    Reliability of electric supply systems is among the most required necessities of modern society. Turbocharged diesel engine driven alternating current generating sets are often used to prevent electric black outs and/or as prime electric energy suppliers. It is well known that turbocharged diesel engines suffer from an inadequate response to a sudden load increase, this being a consequence of the nature of the energy exchange between the engine and the turbocharger. The dynamic response of turbocharged diesel engines could be improved by electric assisting systems, either by direct energy supply with an integrated starter-generator-booster (ISG) mounted on the engine flywheel, or by an indirect energy supply with an electrically assisted turbocharger. An experimentally verified zero dimensional computer simulation method was used for the analysis of both types of electrical assistance. The paper offers an analysis of the interaction between a turbocharged diesel engine and different electric assisting systems, as well as the requirements for the supporting electric motors that could improve the dynamic response of a diesel engine while driving an AC generating set. When performance class compliance is a concern, it is evident that an integrated starter-generator-booster outperforms an electrically assisted turbocharger for the investigated generating set. However, the electric energy consumption and frequency recovery times are smaller when an electrically assisted turbocharger is applied

  15. Efficacy of e-technologies in improving breastfeeding outcomes among perinatal women: a meta-analysis.

    Science.gov (United States)

    Lau, Ying; Htun, Tha P; Tam, Wai S W; Klainin-Yobas, Piyanee

    2016-07-01

    A growing line of research has highlighted that e-technologies may play a promising role in improving breastfeeding outcomes. The objective of this review was to synthesise the best available evidence by conducting a meta-analysis to evaluate whether e-technologies have had any effect in improving breastfeeding outcomes among perinatal women. The review was conducted using nine electronic databases to search for English-language research studies from 2007 to 2014. A 'risk of bias' table was used to assess methodological quality. Meta-analysis was performed with the RevMan software. The Q test and I(2) test were used to assess heterogeneity, and the test of overall effect was assessed using z-statistics. Subgroup analyses showed significant improvements in breastfeeding attitude (z = 3.01, P = 0.003) and breastfeeding knowledge (z = 4.54, P < 0.00001). This review provides support for the development of web-based, text-messaging, compact disc read-only memory, electronic prompt and interactive computer agent interventions for promoting and supporting breastfeeding. PMID:26194599
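The z-statistics reported above come from pooling study effects; a minimal sketch of one common approach, inverse-variance fixed-effect pooling, is shown below. The three effect sizes and standard errors are hypothetical, not the review's data, and RevMan also supports random-effects models that this sketch omits.

```python
import math

def fixed_effect_meta(effects, ses):
    """Inverse-variance fixed-effect pooled estimate, its SE, and z statistic."""
    weights = [1.0 / se ** 2 for se in ses]              # weight = 1 / variance
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))            # SE of the pooled effect
    return pooled, se_pooled, pooled / se_pooled         # z = pooled / SE

# hypothetical standardized mean differences and standard errors from three trials
effects = [0.30, 0.45, 0.25]
ses = [0.12, 0.15, 0.10]
pooled, se_pooled, z = fixed_effect_meta(effects, ses)
```

A z above about 1.96 corresponds to P < 0.05 for the overall effect, which is how summary statements like those in the abstract are read.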

  16. Improving the sensory quality of flavored liquid milk by engaging sensory analysis and consumer preference.

    Science.gov (United States)

    Zhi, Ruicong; Zhao, Lei; Shi, Jingye

    2016-07-01

    Developing innovative products that satisfy various groups of consumers helps a company maintain a leading market share. The hedonic scale and just-about-right (JAR) scale are 2 popular methods for hedonic assessment and product diagnostics. In this paper, we chose to study flavored liquid milk because it is one of the most necessary nutrient sources in China. The hedonic scale and JAR scale methods were combined to provide directional information for flavored liquid milk optimization. Two methods of analysis (penalty analysis and partial least squares regression on dummy variables) were used and the results were compared. This paper had 2 aims: (1) to investigate consumer preferences of basic flavor attributes of milk from various cities in China; and (2) to determine the improvement direction for specific products and the ideal overall liking for consumers in various cities. The results showed that consumers in China have local-specific requirements for characteristics of flavored liquid milk. Furthermore, we provide a consumer-oriented product design method to improve sensory quality according to the preference of particular consumers. PMID:27108179

  17. An Improved Plasticity-Based Distortion Analysis Method for Large Welded Structures

    Science.gov (United States)

    Yang, Yu-Ping; Athreya, Badrinarayan P.

    2013-05-01

    The plasticity-based distortion prediction method was improved to address the computationally intensive nature of welding simulations. Plastic strains, which are typically first computed using either two-dimensional (2D) or three-dimensional (3D) thermo-elastic-plastic analysis (EPA) on finite element models of simple weld geometry, are mapped to the full structure finite element model to predict distortion by conducting a linear elastic analysis. To optimize welding sequence to control distortion, a new theory was developed to consider the effect of weld interactions on plastic strains. This improved method was validated with experimental work on a Tee joint and tested on two large-scale welded structures—a light fabrication and a heavy fabrication—by comparing against full-blown distortion predictions using thermo-EPA. 3D solid and shell models were used for the heavy and light fabrications, respectively, to compute plastic strains due to each weld. Quantitative comparisons between this method and thermo-EPA indicate that this method can predict distortions fairly accurately—even for different welding sequences—and is roughly 1-2 orders of magnitude faster. It was concluded from these findings that, with further technical development, this method can be an ideal solver for optimizing welding sequences.

  18. Design improvement and dynamic finite element analysis of novel ITI dental implant under dynamic chewing loads.

    Science.gov (United States)

    Cheng, Yung-Chang; Lin, Deng-Huei; Jiang, Cho-Pei; Lee, Shyh-Yuan

    2015-01-01

    The main aim of this article was to introduce the application of a uniform design of experiments method to reduce the micromotion of a novel ITI dental implant model under dynamic loads. Combining the characteristics of the traditional ITI and Nano-Tite implants, a new implant with concave holes was constructed. Compared to the traditional ITI dental implant model, the micromotion of the new dental implant model was significantly reduced, as shown by explicit dynamic finite element analysis. Following the uniform design of experiments, the dynamic finite element analysis method was applied to calculate the maximum micromotion of the full model. Finally, the design that produced the minimum micromotion across all experimental simulations was selected as the improved model. Compared with the original design, which was associated with a micromotion of 45.11 μm, the micromotion of the improved version was 31.37 μm, an improvement of 30.5%. PMID:26406049

  19. Vulnerability Identification and Design-Improvement-Feedback using Failure Analysis of Digital Control System Designs

    International Nuclear Information System (INIS)

    Fault tree analyses let analysts establish the failure sequences of components as a logical model and confirm the result at the plant level. These two analyses provide insights regarding what improvements are needed to increase availability, because they express the quantified design attributes of the system as minimal cut sets and availability values interfaced with component reliability data in the fault trees. This combined failure analysis method helps system users understand system characteristics, including weaknesses and strengths in relation to faults, in the design stage before system operation. This study explained why a digital system could have weaknesses in the methods used to transfer control signals or data, and how those vulnerabilities could cause unexpected outputs. In particular, the result of the analysis confirmed that complex optical communication is not recommended for digital data transmission in the critical systems of nuclear power plants. Regarding the loop controllers in Design A, the logic configuration should be changed to prevent spurious actuation due to a single failure, using hardware or software improvements such as cross-checking between redundant modules or diagnosis of output signal integrity. Unavailability calculations support these insights from the failure analyses of the systems. In the near future, KHNP will perform failure mode and effect analyses in the design stage before purchasing non-safety-related digital system packages. In addition, the design requirements of the system will be confirmed based on evaluation of overall system availability or unavailability.
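The link between minimal cut sets and unavailability mentioned above can be sketched with the rare-event approximation: system unavailability is roughly the sum, over minimal cut sets, of the product of the member components' unavailabilities. The component names and numbers below are hypothetical, chosen only to mirror the abstract's point that a shared optical link can be a dominant single-point contributor.

```python
def cutset_unavailability(cut_sets, q):
    """Rare-event approximation of system unavailability:
    Q ~= sum over minimal cut sets of the product of component unavailabilities."""
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for comp in cs:
            prod *= q[comp]
        total += prod
    return total

# hypothetical components of a redundant loop-controller design
q = {"cpu_a": 1e-3, "cpu_b": 1e-3, "optical_link": 5e-3, "power": 1e-4}
cut_sets = [
    {"optical_link"},        # single-point failure: shared optical channel
    {"cpu_a", "cpu_b"},      # both redundant processors must fail together
    {"power"},               # single-point failure: common power supply
]
Q = cutset_unavailability(cut_sets, q)
```

With these assumed numbers the single-element cut sets dominate Q, which is exactly the kind of quantitative signal that drives design-improvement feedback such as removing the shared link or adding cross-checking.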

  20. Second-Law Analysis to Improve the Energy Efficiency of Screw Liquid Chillers

    Directory of Open Access Journals (Sweden)

    Tzong-Shing Lee

    2010-03-01

    Full Text Available This work applies a second-law thermodynamic analysis to quantify the exergy destruction in the components of a screw liquid chiller and to identify the potential of each component to improve the overall energy efficiency of the system. Three screw liquid chiller units were built to demonstrate the feasibility of the model presented herein. Unit A was a 100 RT water-cooled screw liquid chiller; Unit B was modified from Unit A by replacing the old condenser with one of greater heat transfer capacity; and Unit C was modified from Unit B by exchanging the compressor for a more efficient one. The results indicate that the compressor has the largest potential to improve energy efficiency, followed by the condenser and then the evaporator. The second-law analysis may help engineers focus on the components with the highest exergy destruction and quantify the extent to which modifying such components can influence, favorably or unfavorably, the performance of the other components of the screw liquid chiller.
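    The component ranking described above can be sketched as follows. The dead-state temperature and per-component entropy-generation rates are hypothetical placeholders; only the resulting order (compressor, then condenser, then evaporator) comes from the abstract:

```python
# Rank chiller components by exergy destruction, E_d = T0 * S_gen.
# All numeric values here are hypothetical illustrations.
T0 = 298.15  # K, assumed dead-state (ambient) temperature

entropy_generation_kW_per_K = {
    "compressor": 0.060,
    "condenser": 0.035,
    "evaporator": 0.020,
}

# Exergy destruction per component (kW), then sort from largest to smallest.
exergy_destruction_kW = {c: T0 * s for c, s in entropy_generation_kW_per_K.items()}
ranking = sorted(exergy_destruction_kW, key=exergy_destruction_kW.get, reverse=True)
print(ranking)  # -> ['compressor', 'condenser', 'evaporator']
```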

  1. Diesel engine noise source identification based on EEMD, coherent power spectrum analysis and improved AHP

    International Nuclear Information System (INIS)

    As the essential foundation of noise reduction, many noise source identification methods have been developed and applied in engineering practice. To identify the noise sources of different engine parts across a broad frequency band at various typical speeds, this paper presents an integrated noise source identification method based on ensemble empirical mode decomposition (EEMD), coherent power spectrum analysis, and an improved analytic hierarchy process (AHP). The measured noise is decomposed into several IMFs with physical meaning, which ensures that the coherence analysis between the IMFs and the vibration signals is meaningful. An improved AHP is developed by introducing an objective weighting function to replace the traditional subjective evaluation, which makes the results independent of subjective judgment and provides better consistency. The proposed noise identification model is applied to the surface-radiated noise of a diesel engine. As a result, the frequency-dependent contributions of different engine parts to different test points at different speeds are obtained, and an overall weight order is obtained as oil pan > left body > valve chamber cover > gear chamber casing > right body > flywheel housing, which provides effective guidance for noise reduction. (paper)
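    The objective weighting step can be illustrated with a minimal sketch: measured contributions are normalized into weights and sorted, in place of subjective pairwise scoring. The contribution values below are hypothetical; only the resulting order is taken from the abstract:

```python
# Hypothetical coherent-power contributions of engine parts to the
# radiated noise; normalizing them yields objective AHP-style weights.
contributions = {
    "oil pan": 0.28,
    "left body": 0.22,
    "valve chamber cover": 0.18,
    "gear chamber casing": 0.14,
    "right body": 0.10,
    "flywheel housing": 0.08,
}

total = sum(contributions.values())
weights = {part: c / total for part, c in contributions.items()}

# Overall weight order, largest contribution first.
order = sorted(weights, key=weights.get, reverse=True)
print(" > ".join(order))
```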

  2. Structured hydrological analysis for targeting fallow evaporation to improve water productivity at the irrigation system level

    Directory of Open Access Journals (Sweden)

    S. Khan

    2007-02-01

    Full Text Available This paper presents the results of applying a holistic, systematic water accounting approach using remote sensing and GIS coupled with groundwater modeling to evaluate water-saving options by tracking non-beneficial evaporation in the Liuyuankou Irrigation System (LIS) of China. Groundwater rise is a major issue in the LIS, where groundwater levels have risen alarmingly close to the ground surface (within 1 m near the Yellow River). The lumped water balance analysis showed high fallow evaporation losses, which need to be reduced to improve water productivity.

    The seasonal actual evapotranspiration (ETs) was estimated by applying the SEBAL algorithm to eighteen NOAA AVHRR-12 images spanning 1990–1991. This analysis was aided by an unsupervised land use classification applied to two Landsat 5 TM images of the study area. SEBAL results confirmed that a significant amount (116.7 MCM) of water can be saved by reducing ETs from fallow land, which would improve water productivity at the irrigation system level. The water accounting indicator for the analysis period shows that the process fraction per unit of depleted water (PFdepleted) is 0.52 for LIS, meaning that 52% of the depleted water is consumed by agricultural crops and 48% is lost through non-process depletion.

    Finally, groundwater modeling was applied to simulate three land use and water management interventions to assess their effectiveness for water savings and their impact on groundwater in the LIS. MODFLOW's Zone Budget code calculates the groundwater budget of user-specified subregions and the exchange of flows between subregions, as well as a volumetric water budget for the entire model at the end of each time step. The simulation results showed that fallow evaporation could be reduced by between 14.2% (25.51 MCM) and 45.3% (81.36 MCM) by interventions such as canal lining and ground
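    The water-accounting indicator quoted in this record (PFdepleted) is a simple ratio. A minimal sketch follows, with hypothetical depletion volumes chosen only to reproduce the 52/48 split reported for LIS:

```python
# Process fraction per unit of depleted water: the share of total
# depletion consumed by crops. Volumes below are hypothetical (MCM).
process_depletion = 520.0      # water consumed by agricultural crops
nonprocess_depletion = 480.0   # non-process losses, e.g. fallow evaporation

pf_depleted = process_depletion / (process_depletion + nonprocess_depletion)
print(round(pf_depleted, 2))  # -> 0.52
```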

  3. Improving fluid intelligence with training on working memory: a meta-analysis.

    Science.gov (United States)

    Au, Jacky; Sheehan, Ellen; Tsai, Nancy; Duncan, Greg J; Buschkuehl, Martin; Jaeggi, Susanne M

    2015-04-01

    Working memory (WM), the ability to store and manipulate information for short periods of time, is an important predictor of scholastic aptitude and a critical bottleneck underlying higher-order cognitive processes, including controlled attention and reasoning. Recent interventions targeting WM have suggested plasticity of the WM system by demonstrating improvements in both trained and untrained WM tasks. However, evidence on transfer of improved WM into more general cognitive domains such as fluid intelligence (Gf) has been more equivocal. Therefore, we conducted a meta-analysis focusing on one specific training program, n-back. We searched PubMed and Google Scholar for all n-back training studies with Gf outcome measures, a control group, and healthy participants between 18 and 50 years of age. In total, we included 20 studies in our analyses that met our criteria and found a small but significant positive effect of n-back training on improving Gf. Several factors that moderate this transfer are identified and discussed. We conclude that short-term cognitive training on the order of weeks can result in beneficial effects in important cognitive functions as measured by laboratory tests. PMID:25102926
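    The pooling step of such a meta-analysis can be sketched with a standard fixed-effect, inverse-variance model. The effect sizes and variances below are hypothetical and are not the values from this study:

```python
# Fixed-effect meta-analysis: pool per-study standardized effect sizes
# (e.g. Hedges' g) with inverse-variance weights. Values are hypothetical.
effects = [0.30, 0.15, 0.25, 0.10]     # per-study effect sizes
variances = [0.02, 0.04, 0.03, 0.05]   # per-study sampling variances

weights = [1.0 / v for v in variances]
pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
se_pooled = (1.0 / sum(weights)) ** 0.5
print(f"pooled effect = {pooled:.3f} (SE {se_pooled:.3f})")
```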

  4. Rice Improvement Through Genome-Based Functional Analysis and Molecular Breeding in India.

    Science.gov (United States)

    Agarwal, Pinky; Parida, Swarup K; Raghuvanshi, Saurabh; Kapoor, Sanjay; Khurana, Paramjit; Khurana, Jitendra P; Tyagi, Akhilesh K

    2016-12-01

    Rice is one of the main pillars of food security in India. Its improvement for higher yield in a sustainable agriculture system is also vital to meet the energy and nutritional needs of a growing world population, expected to exceed 9 billion by 2050. The high-quality genome sequence of rice has provided a rich resource for mining information about the diversity of genes and alleles that can contribute to the improvement of useful agronomic traits. Defining the function of each gene and regulatory element of rice remains a challenge for the rice community in the coming years. Subsequent to its participation in the IRGSP, India has continued to contribute in the areas of diversity analysis, transcriptomics, functional genomics, marker development, QTL mapping, and molecular breeding through national and multi-national research programs. These efforts have helped generate resources for rice improvement, some of which have already been deployed to mitigate losses due to environmental stress and pathogens. With renewed efforts, Indian researchers are making new strides, along with the international scientific community, in both basic research and the realization of its translational impact. PMID:26743769

  5. Topological-based bottleneck analysis and improvement strategies for traffic networks

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    A method is proposed to find the key components of traffic networks with homogeneous and heterogeneous topologies, over which heavier traffic flow is transported. One component, called the skeleton, is the minimum spanning tree (MST) based on zero flow cost (ZCMST). The other component is the incipient infinite percolation cluster (IIC), which represents the spine of the traffic network. A new method is then given to analyze the properties of bottlenecks in a large-scale traffic network from a macroscopic and statistical viewpoint. Moreover, three effective strategies are proposed to alleviate traffic congestion. The significance of these findings is that global transport can be significantly improved by enhancing the capacity of a few links in the ZCMST, while for improving local traffic properties, upgrading a tiny fraction of the traffic network in the IIC is effective. The results can help traffic managers prevent and alleviate traffic congestion in time, guard against the formation of congestion bottlenecks, and make appropriate policies for traffic demand management. The method also has important theoretical significance and practical value for optimizing traffic organization, traffic control, and emergency response.

  6. Analysis of technological innovation and environmental performance improvement in aviation sector.

    Science.gov (United States)

    Lee, Joosung; Mo, Jeonghoon

    2011-09-01

    The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector (aircraft manufacturers and airlines) has also made significant efforts to improve fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s, while the high oil prices of the 1970s and later did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and the prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be a feasible option with a meaningful reduction in aviation's lifecycle environmental impact if they can achieve sufficient economies of scale. PMID:22016716

  7. Repetitive transcranial magnetic stimulation improves consciousness disturbance in stroke patients: a quantitative electroencephalography spectral power analysis

    Institute of Scientific and Technical Information of China (English)

    Ying Xie; Tong Zhang

    2012-01-01

    Repetitive transcranial magnetic stimulation is a noninvasive treatment technique that can directly alter cortical excitability and improve cerebral functional activity in unconscious patients. To investigate the effects and electrophysiological changes of repetitive transcranial magnetic stimulation over the cortex, 10 stroke patients with non-severe brainstem lesions and disturbance of consciousness were treated with repetitive transcranial magnetic stimulation, and a quantitative electroencephalography spectral power analysis was performed. The absolute power in the alpha band increased immediately after the first repetitive transcranial magnetic stimulation treatment, while power in the delta band was reduced. The relative power in the alpha band slightly decreased at 1 day post-treatment, then increased and reached a stable level at 2 weeks post-treatment. Glasgow Coma Scale and JFK Coma Recovery Scale-Revised scores improved. The relative power in the alpha band was positively correlated with the Glasgow Coma Scale and JFK Coma Recovery Scale-Revised scores. These data suggest that repetitive transcranial magnetic stimulation is a noninvasive, safe, and effective treatment for improving brain functional activity and promoting awakening in unconscious stroke patients.

  8. Lack of efficacy of music to improve sleep: a polysomnographic and quantitative EEG analysis.

    Science.gov (United States)

    Lazic, Stanley E; Ogilvie, Robert D

    2007-03-01

    An increasing number of studies have examined non-pharmacological methods to improve the quality of sleep, including the use of music and other types of auditory stimulation. While many of these studies have found significant results, they suffer from some combination of limitations: subjective self-report measures as the primary outcome, a lack of proper controls, combining music with some type of relaxation therapy, or failing to randomise subjects to control and treatment conditions. It is therefore difficult to assess the efficacy of music to induce or improve sleep. The present study examined the effects of music using standard polysomnographic measures and quantitative analysis of the electroencephalogram, along with subjective ratings of sleep quality. In addition, a tones condition was used to compare any effects of music with the effects of general auditory stimulation. Using a counter-balanced within-subjects design, music was not significantly better than the tones or control conditions in improving sleep onset latency, sleep efficiency, wake time after sleep onset, or percent slow wave sleep, as determined by objective physiological criteria. PMID:17123654

  9. Analysis of Technological Innovation and Environmental Performance Improvement in Aviation Sector

    Directory of Open Access Journals (Sweden)

    Jeonghoon Mo

    2011-09-01

    Full Text Available The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector (aircraft manufacturers and airlines) has also made significant efforts to improve fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s, while the high oil prices of the 1970s and later did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and the prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be a feasible option with a meaningful reduction in aviation's lifecycle environmental impact if they can achieve sufficient economies of scale.

  10. Increasing the number of thyroid lesions classes in microarray analysis improves the relevance of diagnostic markers.

    Directory of Open Access Journals (Sweden)

    Jean-Fred Fontaine

    Full Text Available BACKGROUND: Genetic markers for thyroid cancers identified by microarray analysis have offered limited predictive accuracy so far because of the few classes of thyroid lesions usually taken into account. To improve diagnostic relevance, we have simultaneously analyzed microarray data from six public datasets covering a total of 347 thyroid tissue samples representing 12 histological classes of follicular lesions and normal thyroid tissue. Our own dataset, containing about half the thyroid tissue samples, included all categories of thyroid lesions. METHODOLOGY/PRINCIPAL FINDINGS: Classifier predictions were strongly affected by similarities between classes and by the number of classes in the training sets. In each dataset, sample prediction was improved by separating the samples into three groups according to class similarities. The cross-validation of differential genes revealed four clusters with functional enrichments. The analysis of six of these genes (APOD, APOE, CLGN, CRABP1, SDHA and TIMP1) in 49 new samples showed gene and protein profiles consistent with the class similarities observed. Focusing on four subclasses of follicular tumor, we explored the diagnostic potential of 12 selected markers (CASP10, CDH16, CLGN, CRABP1, HMGB2, ALPL2, ADAMTS2, CABIN1, ALDH1A3, USP13, NR2F2, KRTHB5) by real-time quantitative RT-PCR on 32 other new samples. The gene expression profiles of follicular tumors were examined with reference to the mutational status of the Pax8-PPARgamma, TSHR, GNAS and NRAS genes. CONCLUSION/SIGNIFICANCE: We show that diagnostic tools defined on the basis of microarray data are more relevant when a large number of samples and tissue classes are used. Taking into account the relationships between the thyroid tumor pathologies, together with the main biological functions and pathways involved, improved the diagnostic accuracy of the samples. Our approach was particularly relevant for the classification of microfollicular adenomas.

  11. Effects of improved modeling on best estimate BWR severe accident analysis

    International Nuclear Information System (INIS)

    Since 1981, ORNL has completed best estimate studies analyzing several dominant BWR accident scenarios. These scenarios were identified by early Probabilistic Risk Assessment (PRA) studies and detailed ORNL analysis complements such studies. In performing these studies, ORNL has used the MARCH code extensively. ORNL investigators have identified several deficiencies in early versions of MARCH with regard to BWR modeling. Some of these deficiencies appear to have been remedied by the most recent release of the code. It is the purpose of this paper to identify several of these deficiencies. All the information presented concerns the degraded core thermal/hydraulic analysis associated with each of the ORNL studies. This includes calculations of the containment response. The period of interest is from the time of permanent core uncovery to the end of the transient. Specific objectives include the determination of the extent of core damage and timing of major events (i.e., onset of Zr/H2O reaction, initial clad/fuel melting, loss of control blade structure, etc.). As mentioned previously the major analysis tool used thus far was derived from an early version of MARCH. BWRs have unique features which must be modeled for best estimate severe accident analysis. ORNL has developed and incorporated into its version of MARCH several improved models. These include (1) channel boxes and control blades, (2) SRV actuations, (3) vessel water level, (4) multi-node analysis of in-vessel water inventory, (5) comprehensive hydrogen and water properties package, (6) first order correction to the ideal gas law, and (7) separation of fuel and cladding. Ongoing and future modeling efforts are required. These include (1) detailed modeling for the pressure suppression pool, (2) incorporation of B4C/steam reaction models, (3) phenomenological model of corium mass transport, and (4) advanced corium/concrete interaction modeling. 10 references, 17 figures, 1 table

  12. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, so most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation with a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different intensity levels, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes. PMID:24179734

  13. Analysis and improvement of digital control stability for master-slave manipulator system

    International Nuclear Information System (INIS)

    Some bilateral controls for master-slave systems have been designed that can realize high-fidelity telemanipulation, as if the operator were manipulating the object directly. While typical robot systems are controlled by software servo systems using digital computers, little work has been published on the design and analysis of digital control for these systems, which must consider the time delay of sensor signals and the zero-order-hold effect of command signals on actuators. This paper presents a digital control analysis for a single-degree-of-freedom master-slave system, including impedance models of both the human operator and the task object, which clarifies an index for stability. The stability result suggests a virtual master-slave system concept, which improves digital control stability. We first analyze a dynamic control method for the master-slave system in discrete time with respect to the stability problem; this method can realize high-fidelity telemanipulation in continuous time. Secondly, using the results of the stability analysis, a robust control scheme for the master-slave system is proposed, and its validity is confirmed by simulation. Consequently, any combination of master and slave modules incorporating the dynamic models of these manipulators can be used to construct a stable master-slave system. (author)

  14. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    International Nuclear Information System (INIS)

    For prevention and mitigation of containment failure during a severe accident, this study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation equation for the flame velocity of premixed H2/air/steam gas has been suggested, and combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued to review and examine the limitations and deficiencies of the existing models. A pre-test calculation was performed to support the severe accident experiment for the molten corium-concrete interaction study, and analyses of the crust formation process and the heat transfer characteristics of the crust have been carried out. A stress analysis code was developed using the finite element method for reactor vessel lower head failure analysis. Through the international PHEBUS-FP program and participation in software development, research on the core degradation process and fission product release and transportation is ongoing. The CONTAIN and MELCOR codes were continuously updated in cooperation with the USNRC, and French-developed computer codes such as ICARE2, ESCADRE and SOPHAEROS were also installed on the SUN workstation. 204 figs, 61 tabs, 87 refs. (Author)

  15. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Heui Dong; Cho, Sung Won; Park, Jong Hwa; Hong, Sung Wan; Yoo, Dong Han; Hwang, Moon Kyoo; Noh, Kee Man; Song, Yong Man [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    For prevention and mitigation of containment failure during a severe accident, this study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation equation for the flame velocity of premixed H{sub 2}/air/steam gas has been suggested, and combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued to review and examine the limitations and deficiencies of the existing models. A pre-test calculation was performed to support the severe accident experiment for the molten corium-concrete interaction study, and analyses of the crust formation process and the heat transfer characteristics of the crust have been carried out. A stress analysis code was developed using the finite element method for reactor vessel lower head failure analysis. Through the international PHEBUS-FP program and participation in software development, research on the core degradation process and fission product release and transportation is ongoing. The CONTAIN and MELCOR codes were continuously updated in cooperation with the USNRC, and French-developed computer codes such as ICARE2, ESCADRE and SOPHAEROS were also installed on the SUN workstation. 204 figs, 61 tabs, 87 refs. (Author).

  16. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are serious concerns. Errors can occur at any step in transfusion, and evaluation of their root causes can inform preventive measures. Root cause analysis, as a structured and systematic approach, can be used to identify the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through record review and interviews with the responsible personnel. An expert panel meeting was then held to define the event timeline, identify the care and service delivery problems, and discuss their underlying causes, safeguards, and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also points out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety.

  17. An Improved, Automated Whole-Air Sampler and VOC Analysis System: Results from SONGNEX 2015

    Science.gov (United States)

    Lerner, B. M.; Gilman, J.; Tokarek, T. W.; Peischl, J.; Koss, A.; Yuan, B.; Warneke, C.; Isaacman-VanWertz, G. A.; Sueper, D.; De Gouw, J. A.; Aikin, K. C.

    2015-12-01

    Accurate measurement of volatile organic compounds (VOCs) in the troposphere is critical for understanding emissions and the physical and chemical processes that impact both air quality and climate. Airborne VOC measurements have proven challenging due to the requirements of short sample collection times (≤10 s), to maximize spatial resolution and sampling frequency, and high sensitivity (pptv level) to chemically diverse hydrocarbons, halocarbons, and oxygen- and nitrogen-containing VOCs. NOAA ESRL CSD has built an improved whole air sampler (iWAS), based on the NCAR HAIS Advanced Whole Air Sampler [Atlas and Blake], which collects compressed ambient air samples in electropolished stainless steel canisters. Post-flight chemical analysis is performed with a custom-built gas chromatograph-mass spectrometer system that pre-concentrates analytes cryogenically via a Stirling cooler, an electromechanical chiller that precludes the need for liquid nitrogen to reach trapping temperatures. For the 2015 Shale Oil and Natural Gas Nexus Study (SONGNEX), CSD conducted iWAS measurements on 19 flights aboard the NOAA WP-3D aircraft between March 19th and April 27th. Nine oil and natural gas production regions were surveyed during SONGNEX, and more than 1500 air samples were collected and analyzed. For the first time, we employed real-time mapping of sample collection combined with live data from fast time-response measurements (e.g. ethane) for more uniform surveying and improved target plume sampling. Automated sample handling allowed more than 90% of iWAS canisters to be analyzed within 96 hours of collection; for the second half of the campaign, improved efficiencies reduced the median sample age at analysis to 36 hours. A new chromatography peak-fitting software package was developed to reduce data reduction time by an order of magnitude without loss of precision or accuracy. Here we report mixing ratios for aliphatic and aromatic hydrocarbons (C2-C8) along with select

  18. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    OpenAIRE

    Feofanova Iryna V.; Feofanov Lev K.

    2013-01-01

    The goal of the article is to identify directions for improving the accounting system at an enterprise so that strategic analysis procedures are supported with trustworthy information. Historical methods (to study the conditions under which strategic analysis appeared and developed) and logical methods (to identify directions for improving accounting) were used during the study. The article establishes that modern conditions require a system of indicators that is based both ...

  19. Improvements of 3D finite element method for eddy current analysis and its application to fusion technology

    International Nuclear Information System (INIS)

    The 3D finite element method is improved so that both computer storage and CPU time can be reduced by examining the boundary conditions. The improved method is applied to the analysis of the Fusion Electromagnetic Induction Experiment (FELIX) facilities, and the characteristics of 3D eddy current distributions are investigated. (orig.)

  20. Improving air pollution control policy in China--A perspective based on cost-benefit analysis.

    Science.gov (United States)

    Gao, Jinglei; Yuan, Zengwei; Liu, Xuewei; Xia, Xiaoming; Huang, Xianjin; Dong, Zhanfeng

    2016-02-01

    To mitigate serious air pollution, the State Council of China promulgated the Air Pollution Prevention and Control Action Plan in 2013. To verify the feasibility and validity of the industrial energy-saving and emission-reduction policies in the action plan, we conducted a cost-benefit analysis of implementing these policies in 31 provinces for the period 2013 to 2017. We also completed a scenario analysis to assess the cost-effectiveness of individual measures within the energy-saving and emission-reduction policies. The data were derived from field surveys, statistical yearbooks, government documents, and published literature. The results show that the total cost and total benefit are 118.39 and 748.15 billion Yuan, respectively, and the estimated benefit-cost ratio is 6.32 in the S3 scenario. For all scenarios, these policies are cost-effective, and the eastern region has higher satisfactory values. Furthermore, the end-of-pipe scenario has greater emission reduction potential than the energy-saving scenario. Regression analysis of possible influencing factors also showed that gross domestic product and population are significantly correlated with the benefit-cost ratio. The sensitivity analysis demonstrates that the benefit-cost ratio is more sensitive to unit emission-reduction cost, unit subsidy, growth rate of gross domestic product, and discount rate than to the other parameters. Compared with other provinces, the benefit-cost ratios of Beijing and Tianjin are more sensitive to changes in unit subsidy than to unit emission-reduction cost. These findings may have significant implications for improving China's air pollution prevention policy. PMID:26595398
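    The headline benefit-cost ratio can be reproduced directly from the totals reported in the abstract (a minimal arithmetic check; both figures are in billion Yuan):

```python
# Benefit-cost ratio for the S3 scenario, from the abstract's totals.
total_benefit = 748.15  # billion Yuan
total_cost = 118.39     # billion Yuan

bcr = total_benefit / total_cost
print(round(bcr, 2))  # -> 6.32
```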

  1. Improved protocol for rapid identification of certain spa types using high resolution melting curve analysis.

    Directory of Open Access Journals (Sweden)

    Benjamin Mayerhofer

    Full Text Available Methicillin-resistant Staphylococcus aureus is one of the most significant pathogens associated with health care. For efficient surveillance, control and outbreak investigation, S. aureus typing is essential. A high resolution melting curve analysis was developed and evaluated for rapid identification of the most frequent spa types found in an Austrian hospital consortium covering 2,435 beds. Among 557 methicillin-resistant Staphylococcus aureus isolates 38 different spa types were identified by sequence analysis of the hypervariable region X of the protein A gene (spa). Identification of spa types through their characteristic high resolution melting curve profiles was considerably improved by double spiking with genomic DNA from spa type t030 and spa type t003 and allowed unambiguous and fast identification of the ten most frequent spa types t001 (58%), t003 (12%), t190 (9%), t041 (5%), t022 (2%), t032 (2%), t008 (2%), t002 (1%), t5712 (1%) and t2203 (1%), representing 93% of all isolates within this hospital consortium. The performance of the assay was evaluated by testing samples with unknown spa types from the daily routine and by testing three different high resolution melting curve analysis real-time PCR instruments. The ten most frequent spa types were identified from all samples and on all instruments with 100% specificity and 100% sensitivity. Compared to classical spa typing by sequence analysis, this gene scanning assay is faster, cheaper and can be performed in a single closed tube assay format. Therefore it is an optimal screening tool to detect the most frequent endemic spa types and to exclude non-endemic spa types within a hospital.

  2. Improved protocol for rapid identification of certain spa types using high resolution melting curve analysis.

    Science.gov (United States)

    Mayerhofer, Benjamin; Stöger, Anna; Pietzka, Ariane T; Fernandez, Haizpea Lasa; Prewein, Bernhard; Sorschag, Sieglinde; Kunert, Renate; Allerberger, Franz; Ruppitsch, Werner

    2015-01-01

    Methicillin-resistant Staphylococcus aureus is one of the most significant pathogens associated with health care. For efficient surveillance, control and outbreak investigation, S. aureus typing is essential. A high resolution melting curve analysis was developed and evaluated for rapid identification of the most frequent spa types found in an Austrian hospital consortium covering 2,435 beds. Among 557 methicillin-resistant Staphylococcus aureus isolates 38 different spa types were identified by sequence analysis of the hypervariable region X of the protein A gene (spa). Identification of spa types through their characteristic high resolution melting curve profiles was considerably improved by double spiking with genomic DNA from spa type t030 and spa type t003 and allowed unambiguous and fast identification of the ten most frequent spa types t001 (58%), t003 (12%), t190 (9%), t041 (5%), t022 (2%), t032 (2%), t008 (2%), t002 (1%), t5712 (1%) and t2203 (1%), representing 93% of all isolates within this hospital consortium. The performance of the assay was evaluated by testing samples with unknown spa types from the daily routine and by testing three different high resolution melting curve analysis real-time PCR instruments. The ten most frequent spa types were identified from all samples and on all instruments with 100% specificity and 100% sensitivity. Compared to classical spa typing by sequence analysis, this gene scanning assay is faster, cheaper and can be performed in a single closed tube assay format. Therefore it is an optimal screening tool to detect the most frequent endemic spa types and to exclude non-endemic spa types within a hospital. PMID:25768007

  3. Term Analysis - Improving the Quality of Learning and Application Documents in Engineering Design

    Directory of Open Access Journals (Sweden)

    S. Weiss

    2006-01-01

    Full Text Available Conceptual homogeneity is one determinant of the quality of text documents. A concept remains the same even if the words used (termini) change [1, 2]. In other words, termini can vary while the concept retains the same meaning. Human beings are able to handle concepts and termini because of their semantic network, which connects termini to the actual context and thus identifies the adequate meaning of the termini. Problems can arise when humans have to learn new content and, correspondingly, new concepts. Since the content is basically imparted by text via particular termini, it is a challenge to establish the right concept from the text with the termini. A term might be known, but have a different meaning [3, 4]. Therefore, it is very important to build up the correct understanding of concepts within a text. This is only possible when concepts are explained by the right termini, within an adequate context, and above all, homogeneously. So, when setting up or using text documents for teaching or application, it is essential to provide concept homogeneity. Understandably, the quality of documents is, ceteris paribus, reciprocally proportional to variations of termini. Therefore, an analysis of variations of termini could form a basis for specific improvement of conceptual homogeneity. Consequently, an exposition of variations of termini as control and improvement parameters is carried out in this investigation. This paper describes the functionality and the benefits of a tool called TermAnalysis. TermAnalysis is a software tool developed

  4. Metabolites production improvement by identifying minimal genomes and essential genes using flux balance analysis.

    Science.gov (United States)

    Salleh, Abdul Hakim Mohamed; Mohamad, Mohd Saberi; Deris, Safaai; Illias, Rosli Md

    2015-01-01

    With advances in metabolic engineering technologies, the genome of a host organism can be reconstructed to achieve desired phenotypes. However, due to the complexity and size of the genome-scale metabolic network, significant components tend to remain hidden. We propose a two-step approach to improve metabolite production: first, find the essential genes and identify the minimal genome through a single-gene deletion process using Flux Balance Analysis (FBA); second, identify the significant pathway for metabolite production using gene expression data. A genome-scale model of Saccharomyces cerevisiae for the production of vanillin and acetate is used to test this approach. The results show the reliability of this approach for finding essential genes, reducing genome size, and identifying production pathways that can further optimise the production yield. The identified genes and pathways are extendable to other applications, especially strain optimisation. PMID:26489144
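Both steps hinge on FBA, which is a linear program: maximize a biomass flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy sketch of FBA plus the single-gene (here, single-reaction) deletion screen, using an invented four-reaction network rather than the authors' S. cerevisiae model:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions).
# R1: -> A (uptake)   R2: A -> B   R3: A -> B (isoenzyme)   R4: B -> (biomass)
S = np.array([[1, -1, -1,  0],
              [0,  1,  1, -1]])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]  # uptake capped at 10

def max_growth(knockout=None):
    """FBA: maximize the biomass flux v4 subject to S v = 0 (steady state)."""
    b = list(bounds)
    if knockout is not None:
        b[knockout] = (0, 0)           # simulate a gene/reaction deletion
    res = linprog(c=[0, 0, 0, -1],     # linprog minimizes, so negate v4
                  A_eq=S, b_eq=np.zeros(2), bounds=b)
    return -res.fun

wild_type = max_growth()
# A reaction is "essential" if deleting it abolishes growth.
essential = [j for j in range(4) if max_growth(knockout=j) < 1e-6 * wild_type]
```

Reactions whose deletion abolishes growth (the uptake R1 and biomass drain R4 here) are flagged essential, while R2 and R3 survive deletion because they back each other up, which is exactly the redundancy a minimal-genome screen removes.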

  5. Analysis of the Efficacy of an Intervention to Improve Parent-Adolescent Problem Solving.

    Science.gov (United States)

    Semeniuk, Yulia Yuriyivna; Brown, Roger L; Riesch, Susan K

    2016-07-01

    We conducted a two-group longitudinal, partially nested, randomized controlled trial to examine whether young adolescent youth-parent dyads participating in Mission Possible: Parents and Kids Who Listen would demonstrate improved problem-solving skill relative to a comparison group. The intervention is based on the Circumplex Model and Social Problem-Solving Theory. The Circumplex Model posits that balanced families, that is, those characterized by high cohesion, high flexibility, and open communication, function best. Social Problem-Solving Theory informs the process and skills of problem solving. Conditional latent growth modeling revealed no statistically significant differences in problem solving among the final sample of 127 dyads in the intervention and comparison groups. Analyses of effect sizes indicated large-magnitude group effects for selected scales for youth and dyads, suggesting potential efficacy and identifying for whom the intervention may be efficacious if study limitations and lessons learned are addressed. PMID:26936844

  6. Application of neural networks and sensitivity analysis to improved prediction of trauma survival.

    Science.gov (United States)

    Hunter, A; Kennedy, L; Henry, J; Ferguson, I

    2000-05-01

    The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models. PMID:10699681
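The general idea, ranking each input by how much perturbing it alone changes the model's output, can be sketched with a permutation-style sensitivity measure (the toy logistic model and data below are illustrative, not the TRISS variables; shuffling a column works for nominal inputs as well as numeric ones):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" survival model: a logistic score that depends only on x0.
def predict(X):
    return 1.0 / (1.0 + np.exp(-4.0 * X[:, 0]))

X = rng.normal(size=(500, 3))          # columns x1, x2 are uninformative
baseline = predict(X)

def sensitivity(X, col):
    """Mean absolute change in output when one input column is shuffled."""
    Xp = X.copy()
    Xp[:, col] = rng.permutation(Xp[:, col])
    return float(np.mean(np.abs(predict(Xp) - baseline)))

scores = [sensitivity(X, j) for j in range(3)]
# x0 dominates; the uninformative columns score (near) zero.
```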

  7. Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue

    Science.gov (United States)

    Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.

    1989-01-01

    The high performance liquid chromatographic (HPLC) method of Flores and Galston (1982, Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. In addition, two polyamines, putrescine and diaminopropane, are not well resolved. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.

  8. State Space Analysis and Improvement for Stereo Matching Based on Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Yang Zhizhong

    2014-01-01

    Full Text Available Stereo matching is a key technology in machine vision and stereo vision. Three factors influence the performance of stereo matching: the disparity search range, the size of the clustering window, and the number of control points. In this study, a state space model was built to analyze the effect of these factors. Based on the analysis results, two improvements are proposed. First, control points are computed on a low-resolution level of the Gaussian image pyramid. Second, the shape of the window is adjusted according to the position of the control points on either side. Experimental results show that stereo matching can be carried out quickly based on dynamic programming while its matching accuracy is maintained.
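For reference, the dynamic programming at the core of such matchers can be sketched for a single scanline; the disparity range, smoothness weight, and synthetic images below are illustrative choices, not the paper's algorithm:

```python
import numpy as np

def dp_scanline(left, right, d_max=4, lam=1.0, big=1e6):
    """Stereo matching on one scanline by dynamic programming: data cost
    |L[x]-R[x-d]| plus a smoothness penalty lam*|d-d'| between neighbors."""
    n = len(left)
    cost = np.full((n, d_max + 1), big)
    back = np.zeros((n, d_max + 1), dtype=int)
    cost[0, 0] = abs(left[0] - right[0])        # only d=0 is valid at x=0
    for x in range(1, n):
        for d in range(d_max + 1):
            data = abs(left[x] - right[x - d]) if x - d >= 0 else big
            prev = cost[x - 1] + lam * np.abs(np.arange(d_max + 1) - d)
            back[x, d] = int(np.argmin(prev))
            cost[x, d] = data + prev[back[x, d]]
    disp = np.zeros(n, dtype=int)               # backtrack the best path
    disp[-1] = int(np.argmin(cost[-1]))
    for x in range(n - 1, 0, -1):
        disp[x - 1] = back[x, disp[x]]
    return disp

right = np.arange(20.0)                         # synthetic right scanline
left = np.concatenate([right[:2], right[:-2]])  # left shifted by disparity 2
disp = dp_scanline(left, right)
```

On this synthetic pair the recovered disparity is 2 everywhere the shifted signal is defined; control points and adaptive windows, as proposed above, would act as extra constraints on `cost`.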

  9. Using frequency analysis to improve the precision of human body posture algorithms based on Kalman filters.

    Science.gov (United States)

    Olivares, Alberto; Górriz, J M; Ramírez, J; Olivares, G

    2016-05-01

    With the advent of miniaturized inertial sensors, many systems have been developed within the last decade to study and analyze human motion and posture, especially in the medical field. Data measured by the sensors are usually processed by algorithms based on Kalman filters in order to estimate the orientation of the body parts under study. These filters traditionally include fixed parameters, such as the process and observation noise variances, whose values have a large influence on overall performance. It has been demonstrated that the optimal value of these parameters differs considerably for different motion intensities. In this work we therefore show that, by applying frequency analysis to determine motion intensity and varying the formerly fixed parameters accordingly, the overall precision of orientation estimation algorithms can be improved, providing physicians with reliable objective data they can use in their daily practice. PMID:26337122
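A minimal one-state sketch of the idea: estimate motion intensity from the spectral content of a measurement window, then switch the process noise variance Q of a Kalman filter accordingly. The frequency band, block length, threshold, and noise levels are illustrative assumptions, not the authors' tuning:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic orientation angle: a calm half, then an active (oscillating) half.
n = 1000
truth = np.zeros(n)
truth[500:] = np.sin(2 * np.pi * 0.02 * np.arange(500))
meas = truth + rng.normal(scale=0.3, size=n)

R = 0.3 ** 2                         # observation noise variance
Q_low, Q_high = 1e-5, 0.3            # process noise for calm vs. intense motion

def motion_intensity(window, lo=0.01, hi=0.1):
    """RMS power in an assumed body-motion frequency band (cycles/sample)."""
    spec = np.abs(np.fft.rfft(window - window.mean())) ** 2
    freqs = np.fft.rfftfreq(len(window))
    return np.sqrt(spec[(freqs >= lo) & (freqs <= hi)].sum()) / len(window)

# Random-walk Kalman filter whose Q is switched per 100-sample block
# according to the measured motion intensity.
x, P = 0.0, 1.0
est = np.empty(n)
for start in range(0, n, 100):
    block = meas[start:start + 100]
    Q = Q_high if motion_intensity(block) > 0.3 else Q_low
    for i, z in enumerate(block):
        P += Q                       # predict
        K = P / (P + R)              # gain
        x += K * (z - x)             # update
        P *= 1 - K
        est[start + i] = x

rmse_raw = np.sqrt(np.mean((meas - truth) ** 2))
rmse_filt = np.sqrt(np.mean((est - truth) ** 2))
```

On this synthetic signal the adaptive filter smooths aggressively during the calm half and loosens during the active half, so `rmse_filt` comes out below `rmse_raw`.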

  10. Comprehensive Numerical Analysis of Finite Difference Time Domain Methods for Improving Optical Waveguide Sensor Accuracy

    Science.gov (United States)

    Samak, M. Mosleh E. Abu; Bakar, A. Ashrif A.; Kashif, Muhammad; Zan, Mohd Saiful Dzulkifly

    2016-01-01

    This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time savings in each single run. The local one-dimensional (LOD)-FDTD method has similar numerical equation properties, which should be calculated as in the previous method. Generally, a small number of arithmetic processes, which result in a shorter simulation time, are desired. The alternating direction implicit technique can be considered a significant step forward for improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was generated by both methods.

  11. An improved X-ray diffraction analysis method to characterize dislocation density in lath martensitic structures

    International Nuclear Information System (INIS)

    An improved X-ray diffraction line profile analysis method is developed to determine the dislocation density of lath martensitic steels. The method combines the modified Warren–Averbach (MWA) and the modified Williamson–Hall (MWH) methods. It is robust and leads to unique values for the dislocation density, the effective outer cut-off radius of the dislocations (Re) and the dislocation distribution parameter (M). Dislocation structures of lath martensite in a steel, in the as-quenched as well as tempered conditions, are characterized using the proposed method. The calculated dislocation density is compared with the values obtained from the MWH method with a constant value assumed for M. Both methods were found to provide dislocation densities in the range of the values calculated from the dislocation strengthening component of the yield strength.
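The MWH step can be sketched as a round trip. In the commonly used form (Ungár's), the peak breadth in reciprocal-space units follows ΔK ≈ 0.9/D + (π M² b² / 2)^(1/2) · ρ^(1/2) · (K·C^(1/2)), so a linear fit of ΔK against K·C^(1/2) recovers the dislocation density from the slope. All numbers below are synthetic, not the paper's measurements:

```python
import numpy as np

b = 2.48e-10        # Burgers vector for bcc Fe (m), illustrative
M = 2.0             # Wilkens dislocation parameter M (assumed)
D = 50e-9           # crystallite size (m), synthetic
rho_true = 5e15     # dislocation density (1/m^2), synthetic

# Synthetic MWH data: Delta_K = 0.9/D + slope * (K * C^0.5)
KC = np.linspace(1e9, 6e9, 8)             # K * C^0.5 values (1/m)
slope_true = np.sqrt(np.pi * M**2 * b**2 / 2 * rho_true)
dK = 0.9 / D + slope_true * KC

# Fit the MWH line and invert the slope for the dislocation density.
slope, intercept = np.polyfit(KC, dK, 1)
rho_est = 2 * slope**2 / (np.pi * M**2 * b**2)
```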

  12. Improving our understanding of papillary renal cell carcinoma with integrative genomic analysis.

    Science.gov (United States)

    Modi, Parth K; Singer, Eric A

    2016-04-01

    Papillary renal cell carcinoma (pRCC) is a heterogeneous and incompletely understood histologic subtype of kidney cancer. Recently, authors from The Cancer Genome Atlas Research Network performed a comprehensive molecular characterization of pRCC. Using multiple analytic methods, they identified 4 subgroups of pRCC with varied genotypic anomalies and probabilities of overall survival. This analysis elucidated the differences between type 1 and type 2 pRCC. Furthermore, type 2 pRCC was found to be heterogeneous itself, with at least 3 subtypes with distinct molecular features. This improved characterization and insight about potential driver mutations and altered pathways may lead to the development of more targeted agents and better patient stratification in clinical trials for pRCC. PMID:27162793

  13. Environmental impact assessment in Colombia: Critical analysis and proposals for improvement

    International Nuclear Information System (INIS)

    The evaluation of Environmental Impact Assessment (EIA) systems is a highly recommended strategy for enhancing their effectiveness and quality. This paper describes an evaluation of EIA in Colombia, using the model and the control mechanisms proposed and applied in other countries by Christopher Wood and Ortolano. The evaluation criteria used are based on Principles of Environmental Impact Assessment Best Practice, such as effectiveness and control features, and they were contrasted with the opinions of a panel of Colombian EIA experts as a means of validating the results of the study. The evaluation found that EIA regulations in Colombia were ineffective because of limited scope, inadequate administrative support and the absence of effective control mechanisms and public participation. This analysis resulted in a series of recommendations regarding the further development of the EIA system in Colombia with a view to improving its quality and effectiveness.

  14. Analysis of production data to improve characterization of in-situ megascopic reservoir permeability

    Energy Technology Data Exchange (ETDEWEB)

    Chugh, S.; Herweijer, J.; Kuppe, F.

    2000-07-01

    The process described in this paper demonstrates how permeabilities derived from production data complement core data and relatively short duration pressure transient analysis. Specifically, the study demonstrates how to interpret core and well test kh (permeability-thickness) relative to production kh; generate effective kh values for a megascopic scale in fluvial environments; and take into account hydraulic fractures and/or naturally occurring high permeability streaks. It also facilitates upscaling the core plug permeability to the megascopic simulation scale, allowing the reservoir simulation model to be preconditioned, thereby reducing the time required to achieve a history match. The process also demonstrates how a comparison of effective/average permeability values at various scales reveals information about reservoir quality and connectivity, and how interwell megascopic representation of permeability can be improved by using inverted decline curve (IDC) and reciprocal productivity index (RPI) techniques. 11 refs., 12 figs.

  15. Improved Facial-Feature Detection for AVSP via Unsupervised Clustering and Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    Lucey Simon

    2003-01-01

    Full Text Available An integral part of any audio-visual speech processing (AVSP) system is the front-end visual system that detects facial-features (e.g., eyes and mouth) pertinent to the task of visual speech processing. The ability of this front-end system to not only locate, but also give a confidence measure that the facial-feature is present in the image, directly affects the ability of any subsequent post-processing task such as speech or speaker recognition. With these issues in mind, this paper presents a framework for a facial-feature detection system suitable for use in an AVSP system, but whose basic framework is useful for any application requiring frontal facial-feature detection. A novel approach for facial-feature detection is presented, based on an appearance paradigm. This approach, based on intraclass unsupervised clustering and discriminant analysis, displays improved detection performance over conventional techniques.

  16. Improved Facial-Feature Detection for AVSP via Unsupervised Clustering and Discriminant Analysis

    Science.gov (United States)

    Lucey, Simon; Sridharan, Sridha; Chandran, Vinod

    2003-12-01

    An integral part of any audio-visual speech processing (AVSP) system is the front-end visual system that detects facial-features (e.g., eyes and mouth) pertinent to the task of visual speech processing. The ability of this front-end system to not only locate, but also give a confidence measure that the facial-feature is present in the image, directly affects the ability of any subsequent post-processing task such as speech or speaker recognition. With these issues in mind, this paper presents a framework for a facial-feature detection system suitable for use in an AVSP system, but whose basic framework is useful for any application requiring frontal facial-feature detection. A novel approach for facial-feature detection is presented, based on an appearance paradigm. This approach, based on intraclass unsupervised clustering and discriminant analysis, displays improved detection performance over conventional techniques.
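The discriminant-analysis half of the approach can be sketched with a two-class Fisher linear discriminant on synthetic patch descriptors (the clustering stage and real image features are omitted; the data and class geometry below are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic 2-D descriptors for "eye" vs. "non-eye" image patches.
eye = rng.normal(loc=[3.0, 0.0], size=(200, 2))
non_eye = rng.normal(loc=[0.0, 0.0], size=(200, 2))

# Fisher linear discriminant: w = Sw^{-1} (mu1 - mu0), threshold at the midpoint.
mu1, mu0 = eye.mean(axis=0), non_eye.mean(axis=0)
Sw = (len(eye) - 1) * np.cov(eye.T) + (len(non_eye) - 1) * np.cov(non_eye.T)
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu1 + mu0) / 2

# The signed distance from the threshold doubles as a confidence measure
# that a candidate region really contains the facial feature.
scores = np.vstack([eye, non_eye]) @ w - threshold
labels = np.concatenate([np.ones(200), np.zeros(200)])
accuracy = float(np.mean((scores > 0) == (labels == 1)))
```

The signed distance `scores` plays the role of the confidence measure discussed above: samples far from the threshold correspond to confident detections.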

  17. Analysis of Entropy Generation for the Performance Improvement of a Tubular Solid Oxide Fuel Cell Stack

    Directory of Open Access Journals (Sweden)

    Vittorio Verda

    2009-03-01

    Full Text Available The aim of this paper is to investigate possible improvements in the design and operation of a tubular solid oxide fuel cell. To achieve this purpose, a CFD model of the cell is introduced. The model includes thermo-fluid dynamics, chemical reactions and electrochemistry. The fluid composition and mass flow rates at the inlet sections are obtained through a finite difference model of the whole stack. This model also provides boundary conditions for the radiation heat transfer. All of these conditions account for the position of each cell within the stack. The analysis of cell performance is conducted on the basis of entropy generation. This technique makes it possible to identify the phenomena provoking the main irreversibilities, understand their causes, and propose changes in the system design and operation.

  18. Structural Analysis and Improved Design of the Gearbox Casing of a Certain Type of Tracked Vehicle

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xue-sheng; JIA Xiao-ping; CHEN Ya-ning; YU Kui-long

    2011-01-01

    Loads on the gearbox casing of a certain type of tracked vehicle were calculated according to the engine's full-load characteristic curve and the worst load condition in which the gearbox operated while the tracked vehicle was running; the stiffness and strength of the casing were then analyzed by means of Patran/Nastran software. The analysis found that the casing satisfied the Mises yield condition; however, the stress distribution was heterogeneous: stresses near the bearing saddle bores of the casing were high, while those in other regions were much less than the allowable stress. For this reason, the wall thickness of the casing at the bearing assembly holes needed increasing, while that in other places could decrease. After several rounds of structural improvement and re-analysis, the optimal casing design was found, and its weight decreased by 5%; the casing still satisfied the Mises yield criterion and the stress distribution was more homogeneous.

  19. Quality assurance testing of an explosives trace analysis laboratory--further improvements to include peroxide explosives.

    Science.gov (United States)

    Crowson, Andrew; Cawthorne, Richard

    2012-12-01

    The Forensic Explosives Laboratory (FEL) operates within the Defence Science and Technology Laboratory (DSTL), which is part of the UK Government Ministry of Defence (MOD). The FEL provides support and advice to the Home Office and UK police forces on matters relating to the criminal misuse of explosives. During 1989 the FEL established a weekly quality assurance testing regime in its explosives trace analysis laboratory. The purpose of the regime is to prevent the accumulation of explosives traces within the laboratory at levels that could, if other precautions failed, result in the contamination of samples and controls. Designated areas within the laboratory are swabbed using cotton wool swabs moistened with an ethanol:water mixture in equal amounts. The swabs are then extracted, cleaned up and analysed using gas chromatography with thermal energy analyser detectors or liquid chromatography with triple quadrupole mass spectrometry. This paper follows on from two previously published papers which described the regime and summarised results from approximately 14 years of tests. This paper presents results from the subsequent 7 years, setting them within the context of previous results. It also discusses further improvements made to the systems and procedures and the inclusion of quality assurance sampling for the peroxide explosives TATP and HMTD. Monitoring samples taken from surfaces within the trace laboratories and trace vehicle examination bay have, with few exceptions, revealed only low levels of contamination, predominantly of RDX. Analysis of the control swabs, processed alongside the monitoring swabs, has demonstrated that in this environment the risk of forensic sample contamination, assuming all the relevant anti-contamination procedures have been followed, is so small that it is considered to be negligible.
The monitoring regime has also been valuable in assessing the process of continuous improvement, allowing sources of contamination transfer into the trace

  20. Development of an improved optical transmission technique for black carbon (BC) analysis

    Science.gov (United States)

    Ballach, J.; Hitzenberger, R.; Schultz, E.; Jaeschke, W.

    A new optical transmission technique for black carbon (BC) analysis was developed to minimize interferences due to scattering effects in filter samples. A standard thermal analysis method (VDI, 1999) is used to link light attenuation by the filter samples to elemental carbon (EC) concentration. Scattering effects are minimized by immersing the filters in oil of a similar refractive index, as is often done for microscopy purposes. Light attenuation was measured using both a white light source and a red LED at 650 nm. The usual overestimation of BC concentrations, which grows as the BC amount in the filter sample decreases, was considerably reduced. Some effects of BC properties (e.g. fractal dimension, microstructure and size distribution) on the specific attenuation coefficient BATN, however, are still present for the treated samples. BATN was found to be close to 1 m² g⁻¹ for dry-dispersed industrial BC and 7 m² g⁻¹ for nebulized BC. Good agreement was found between the oil immersion technique, an integrating sphere, a polar photometer technique and Mie calculations. The average specific attenuation coefficient of ambient samples in oil varied between 7 and 11 m² g⁻¹ for white light and between 6 and 9 m² g⁻¹ for red light (LED). BATN showed much less site-to-site variation for the treated than for the untreated samples. The oil immersion technique also improved the correlation with thermally analyzed EC. This new immersion technique therefore presents a considerable improvement over conventional optical transmission techniques and may serve as a simple, fast and cost-effective alternative to thermal methods.

  1. Using uterine activity to improve fetal heart rate variability analysis for detection of asphyxia during labor.

    Science.gov (United States)

    Warmerdam, G J J; Vullings, R; Van Laar, J O E H; Van der Hout-Van der Jagt, M B; Bergmans, J W M; Schmitt, L; Oei, S G

    2016-03-01

    During labor, uterine contractions can cause temporary oxygen deficiency for the fetus. In case of severe and prolonged oxygen deficiency this can lead to asphyxia. The currently used technique for detection of asphyxia, cardiotocography (CTG), suffers from a low specificity. Recent studies suggest that analysis of fetal heart rate variability (HRV) in addition to CTG can provide information on fetal distress. However, interpretation of fetal HRV during labor is difficult due to the influence of uterine contractions on fetal HRV. The aim of this study is therefore to investigate whether HRV features differ during contraction and rest periods, and whether these differences can improve the detection of asphyxia. To this end, a case-control study was performed, using 14 cases with asphyxia that were matched with 14 healthy fetuses. We did not find significant differences for individual HRV features when calculated over the fetal heart rate without separating contractions and rest periods (p > 0.30 for all HRV features). Separating contractions from rest periods did result in a significant difference. In particular the ratio between HRV features calculated during and outside contractions can improve discrimination between fetuses with and without asphyxia (p < 0.04 for three out of four ratio HRV features that were studied in this paper). PMID:26862891
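Computationally, the proposed separation just means evaluating each HRV feature twice, inside and outside contractions, and forming their ratio. A sketch with SDNN as the example feature; the RR series, segment labels, and variability levels are synthetic stand-ins, not study data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic RR-interval series (ms): variability is suppressed during
# contractions relative to rest, as a stand-in for a distressed fetus.
rest = 430 + rng.normal(scale=25, size=300)
contraction = 430 + rng.normal(scale=10, size=300)
rr = np.concatenate([rest, contraction])
in_contraction = np.concatenate([np.zeros(300, bool), np.ones(300, bool)])

def sdnn(x):
    """SDNN: standard deviation of the RR intervals."""
    return x.std(ddof=1)

ratio = sdnn(rr[in_contraction]) / sdnn(rr[~in_contraction])
# A ratio well below 1 here reflects the suppressed variability during
# contractions that was built into the synthetic series.
```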

  2. Use of Selection Indices Based on Multivariate Analysis for Improving Grain Yield in Rice

    Institute of Scientific and Technical Information of China (English)

    Hossein SABOURI; Babak RABIEI; Maryam FAZLALIPOUR

    2008-01-01

    In order to study selection indices for improving rice grain yield, a cross was made in 2006 between an Iranian traditional rice (Oryza sativa L.) variety, Tarommahalli, and an improved indica rice variety, Khazar. The traits of the parents (30 plants), F1 (30 plants) and F2 generations (492 individuals) were evaluated at the Rice Research Institute of Iran (RRII) during 2007. Heritabilities of the number of panicles per plant, plant height, days to heading and panicle exsertion were greater than that of grain yield. The selection indices were developed using the results of multivariate analysis. To evaluate selection strategies for maximizing grain yield, 14 selection indices were calculated based on two methods (optimum and base) and combinations of 12 traits with various economic weights. The results showed that selection for grain weight, number of panicles per plant and panicle length, using their phenotypic and/or genotypic direct effects (path coefficients) as economic weights, should serve as an effective selection criterion with either the optimum or the base index.
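Optimum selection indices of this kind are commonly computed in the Smith-Hazel form b = P⁻¹ G a, where P and G are the phenotypic and genotypic (co)variance matrices and a holds the economic weights (e.g., path coefficients as above). A sketch with invented three-trait matrices, not the Tarommahalli × Khazar estimates:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented (co)variance matrices for 3 traits (e.g. panicle number,
# plant height, grain yield).
G = np.array([[1.0, 0.2, 0.6],
              [0.2, 1.5, 0.3],
              [0.6, 0.3, 0.8]])    # genotypic
E = np.eye(3) * 0.8               # environmental
P = G + E                         # phenotypic
a = np.array([0.5, 0.1, 1.0])     # economic weights

b = np.linalg.solve(P, G @ a)     # Smith-Hazel index coefficients

# Quick check by simulation: selecting on the index raises the aggregate
# genotype H = a.g of the selected fraction above the population mean (0).
g = rng.multivariate_normal(np.zeros(3), G, size=2000)
e = rng.multivariate_normal(np.zeros(3), E, size=2000)
phen = g + e
index = phen @ b
selected = index >= np.quantile(index, 0.9)      # top 10%
response = float((g[selected] @ a).mean())
```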

  3. Wavelet analysis to decompose a vibration simulation signal to improve pre-distribution testing of packaging

    Science.gov (United States)

    Griffiths, K. R.; Hicks, B. J.; Keogh, P. S.; Shires, D.

    2016-08-01

    In general, vehicle vibration is non-stationary and has a non-Gaussian probability distribution; yet existing testing methods for packaging design employ Gaussian distributions to represent vibration induced by road profiles. This frequently results in over-testing and/or over-design of the packaging to meet a specification, and correspondingly leads to wasteful packaging and product waste, which represent $15bn per year in the USA and €3bn per year in the EU. The purpose of the paper is to enable a measured non-stationary acceleration signal to be replaced by a constructed signal that includes as far as possible any non-stationary characteristics from the original signal. The constructed signal consists of a concatenation of decomposed shorter-duration signals, each having its own kurtosis level. Wavelet analysis is used to decompose the signal into inner and outlier components. The constructed signal has a PSD similar to that of the original signal, without incurring excessive acceleration levels. This allows an improved and more representative simulated input signal to be generated that can be used on the current generation of shaker tables. The wavelet decomposition method is also demonstrated experimentally through two correlation studies. It is shown that significant improvements over current international standards for packaging testing are achievable; hence there is potential for more efficient packaging system design.
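The inner/outlier split can be illustrated with a one-level orthonormal Haar transform: coefficients beyond a robust threshold are assigned to the bursty "outlier" component, and the remainder reconstructs a near-Gaussian "inner" component. The signal, threshold rule, and single decomposition level are illustrative simplifications, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(4)

def haar(x):
    """One-level orthonormal Haar DWT: approximation and detail coefficients."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def ihaar(a, d):
    x = np.empty(2 * len(a))
    x[0::2], x[1::2] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return x

def kurtosis(x):
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 4))           # Gaussian reference value: 3

# Stationary Gaussian "road" vibration plus sparse large shocks.
x = rng.normal(size=4096)
x[::512] += rng.choice([-8.0, 8.0], size=8)

a, d = haar(x)
sigma = np.median(np.abs(d)) / 0.6745       # robust noise-level estimate
keep = lambda c: np.where(np.abs(c) <= 3 * sigma, c, 0.0)
inner = ihaar(keep(a), keep(d))             # Gaussian-like "inner" component
outlier = x - inner                         # bursty, high-kurtosis remainder
```

Haar reconstruction is exact, so the two components sum back to the original signal while the inner part sheds most of the excess kurtosis contributed by the shocks.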

  4. Inverse transient radiation analysis in one-dimensional participating slab using improved Ant Colony Optimization algorithms

    Science.gov (United States)

    Zhang, B.; Qi, H.; Ren, Y. T.; Sun, S. C.; Ruan, L. M.

    2014-01-01

    As a heuristic intelligent optimization algorithm, the Ant Colony Optimization (ACO) algorithm was applied to the inverse problem of one-dimensional (1-D) transient radiative transfer in the present study. To illustrate the performance of the algorithm, the optical thickness and scattering albedo of the 1-D participating slab medium were retrieved simultaneously. The radiative reflectance simulated by the Monte Carlo Method (MCM) and the Finite Volume Method (FVM) were used as the measured and estimated values, respectively, for the inverse analysis. To improve the accuracy and efficiency of the Basic Ant Colony Optimization (BACO) algorithm, three improved ACO algorithms were developed: the Region Ant Colony Optimization (RACO), Stochastic Ant Colony Optimization (SACO) and Homogeneous Ant Colony Optimization (HACO) algorithms. With the proposed HACO algorithm, the radiative parameters could be estimated accurately, even from noisy data. In conclusion, the HACO algorithm is shown to be effective and robust, with the potential to be applied to a wide range of inverse radiation problems.

  5. Improving distillation method and device of tritiated water analysis for ultra high decontamination efficiency.

    Science.gov (United States)

    Fang, Hsin-Fa; Wang, Chu-Fang; Lin, Chien-Kung

    2015-12-01

    Monitoring environmental tritiated water is important for understanding contamination dispersion from nuclear facilities. Tritium is a pure beta emitter that is usually measured by Liquid Scintillation Counting (LSC). The average energy of the tritium beta particle is only 5.658 keV, so LSC counting of tritium is easily interfered with by betas emitted by other radionuclides; environmental tritiated water samples therefore usually need to be decontaminated by distillation to reduce the interference. After the Fukushima Nuclear Accident, the highest gross beta concentration of groundwater samples obtained around the Fukushima Daiichi Nuclear Power Station exceeded 1,000,000 Bq/l, so a distillation method with ultra-high decontamination efficiency is needed for environmental tritiated water analysis. This study improves the heating temperature control for better sub-boiling distillation and modifies the height of the container of the air-cooling distillation device for a better fractional distillation effect. The decontamination factor (DF) for Cs-137 of the distillation may reach 450,000, far better than in the prior study. The average loss rate of the improved method and device is about 2.6%, which is better than the bias value listed in ASTM D4107-08. The modified air-cooling distillation device is shown to provide an easy-to-handle, water-saving, low-cost and effective way of purifying water samples for beta-contaminated samples that need ultra-high decontamination treatment. PMID:26295438

  6. Prone positioning improves survival in severe ARDS: a pathophysiologic review and individual patient meta-analysis.

    Science.gov (United States)

    Gattinoni, L; Carlesso, E; Taccone, P; Polli, F; Guérin, C; Mancebo, J

    2010-06-01

    Prone positioning has been used for over 30 years in the management of patients with acute respiratory distress syndrome (ARDS). This maneuver has consistently proven capable of improving oxygenation in patients with acute respiratory failure. Several mechanisms can explain this observation, including possible intervening net recruitment and more homogeneously distributed alveolar inflation. It is also progressively becoming clear that prone positioning may reduce the nonphysiological stress and strain associated with mechanical ventilation, thus decreasing the risk of ventilator-induced lung injury, which is known to adversely impact patient survival. The available randomized clinical trials, however, have failed to demonstrate that prone positioning improves the outcomes of patients with ARDS overall. In contrast, the individual patient meta-analysis of the four major clinical trials available clearly shows that with prone positioning, the absolute mortality of severely hypoxemic ARDS patients may be reduced by approximately 10%. On the other hand, all data suggest that long-term prone positioning may expose patients with less severe ARDS to unnecessary complications. PMID:20473258

  7. An improved genetic system for detection and analysis of protein nuclear import signals

    Directory of Open Access Journals (Sweden)

    Derbyshire Stephanie

    2007-01-01

    Background: Nuclear import of proteins is typically mediated by their physical interaction with soluble cytosolic receptor proteins via a nuclear localization signal (NLS). A simple genetic assay to detect active NLSs based on their function in the yeast Saccharomyces cerevisiae has previously been described. In that system, a chimera consisting of a modified bacterial LexA DNA-binding domain and the transcriptional activation domain of the yeast Gal4 protein is fused to a candidate NLS. A functional NLS redirects the chimeric fusion to the yeast cell nucleus and activates transcription of a reporter gene. Results: We have re-engineered this nuclear import system to expand its utility and tested it using known NLS sequences from adenovirus E1A. First, the vector was reconstructed to reduce the level of chimera expression. Second, an irrelevant "stuffer" sequence from the E. coli maltose binding protein was used to increase the size of the chimera above the passive diffusion limit of the nuclear pore complex. The improved vector also contains an expanded multiple cloning site and a hemagglutinin epitope tag to allow confirmation of expression. Conclusion: The alterations in the expression level and composition of the fusions used in this nuclear import system greatly reduce background activity in β-galactosidase assays, improving sensitivity and allowing more quantitative analysis of NLS-bearing sequences.

  8. Inverse transient radiation analysis in one-dimensional participating slab using improved Ant Colony Optimization algorithms

    International Nuclear Information System (INIS)

    As a heuristic intelligent optimization algorithm, the Ant Colony Optimization (ACO) algorithm was applied to the inverse problem of one-dimensional (1-D) transient radiative transfer in the present study. To illustrate the performance of the algorithm, the optical thickness and scattering albedo of the 1-D participating slab medium were retrieved simultaneously. The radiative reflectance simulated by the Monte Carlo Method (MCM) and the Finite Volume Method (FVM) were used as the measured and estimated values, respectively, for the inverse analysis. To improve the accuracy and efficiency of the Basic Ant Colony Optimization (BACO) algorithm, three improved ACO algorithms were developed: the Region Ant Colony Optimization (RACO), Stochastic Ant Colony Optimization (SACO) and Homogeneous Ant Colony Optimization (HACO) algorithms. With the proposed HACO algorithm, the radiative parameters could be estimated accurately, even from noisy data. In conclusion, the HACO algorithm is shown to be effective and robust, with the potential to be applied to a wide range of inverse radiation problems. -- Highlights: • ACO-based algorithms were first applied to the inverse transient radiation problem. • Three ACO-based algorithms were developed from the BACO algorithm for continuous-domain problems. • HACO shows robust performance for simultaneous estimation of the radiative properties

  9. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-01-01

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute the key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely depending on the station, and the data quality of individual stations persists over extended time periods. Compared with conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of the ionospheric data. The results suggest that a set of data quality parameters used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis. PMID:25196005
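Two simple quality parameters of the kind such screening uses, data completeness and a crude cycle-slip count, can be sketched as follows. These are generic illustrations; the paper's actual parameter definitions and thresholds are not reproduced here, and the 0.5-cycle threshold is an assumed value.

```python
import numpy as np

def completeness(observed_epochs, expected_epochs):
    """Fraction of expected observation epochs actually recorded."""
    return len(observed_epochs) / expected_epochs

def cycle_slip_rate(phase, threshold=0.5):
    """Crude cycle-slip count from large jumps in the third difference
    of the carrier phase (in cycles); smooth geometry/clock trends up to
    quadratic cancel in the triple difference. Threshold is illustrative."""
    triple_diff = np.diff(np.asarray(phase, dtype=float), n=3)
    return int(np.sum(np.abs(triple_diff) > threshold))
```

Stations can then be ranked (or rejected) by combining such parameters, as the abstract suggests.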

  10. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    International Nuclear Information System (INIS)

    The objective of the present research is to perform separate effect tests and to assess the RELAP5/MOD3.2 code for the analysis of thermal-hydraulic behavior in the reactor coolant system, improving the auditing technology of safety analysis. The three Separate Effect Tests (SETs) are reflux condensation in the U-tube, direct contact condensation in the hot-leg, and mixture level buildup in the pressurizer. Experimental data and empirical correlations were obtained through the SETs. On the basis of the three SET works, models in RELAP5 were modified and improved, and compared with the data. The Korea Standard Nuclear Power Plant (KSNP) was assessed using the modified RELAP5. In the reflux condensation test, heat transfer coefficient and flooding data were obtained and the condensation models were modified using the non-iterative model; as a result, the modified code better predicts the data. In the direct contact condensation test, heat transfer coefficient data were obtained for cocurrent and countercurrent flow between the mixture gas and the water under horizontal stratified flow conditions. Several condensation and friction models were modified, and they predict the present data well. In the mixture level test, data for the mixture level and the onset of water draining into the surge line were obtained. The standard RELAP5 over-predicts the mixture level and the void fraction in the pressurizer; a simple modification of the model related to the pool void fraction is suggested. The KSNP was assessed using the standard and the modified RELAP5 resulting from the experimental and code works for the SETs. In the case of the pressurizer manway opening with the secondary side of the steam generators available, the modified code predicts that the collapsed level in the pressurizer accumulates little. The presence and location of the opening and the secondary condition of the steam generators affect the coolant inventory.

  11. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Moon, Young Min; Lee, Dong Won; Lee, Sang Ik; Kim, Eung Soo; Yeom, Keum Soo [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2002-03-15

    The objective of the present research is to perform separate effect tests and to assess the RELAP5/MOD3.2 code for the analysis of thermal-hydraulic behavior in the reactor coolant system, improving the auditing technology of safety analysis. The three Separate Effect Tests (SETs) are reflux condensation in the U-tube, direct contact condensation in the hot-leg, and mixture level buildup in the pressurizer. Experimental data and empirical correlations were obtained through the SETs. On the basis of the three SET works, models in RELAP5 were modified and improved, and compared with the data. The Korea Standard Nuclear Power Plant (KSNP) was assessed using the modified RELAP5. In the reflux condensation test, heat transfer coefficient and flooding data were obtained and the condensation models were modified using the non-iterative model; as a result, the modified code better predicts the data. In the direct contact condensation test, heat transfer coefficient data were obtained for cocurrent and countercurrent flow between the mixture gas and the water under horizontal stratified flow conditions. Several condensation and friction models were modified, and they predict the present data well. In the mixture level test, data for the mixture level and the onset of water draining into the surge line were obtained. The standard RELAP5 over-predicts the mixture level and the void fraction in the pressurizer; a simple modification of the model related to the pool void fraction is suggested. The KSNP was assessed using the standard and the modified RELAP5 resulting from the experimental and code works for the SETs. In the case of the pressurizer manway opening with the secondary side of the steam generators available, the modified code predicts that the collapsed level in the pressurizer accumulates little. The presence and location of the opening and the secondary condition of the steam generators affect the coolant inventory.

  12. Measurement of keff with an improved neutron source multiplication method based on numerical analysis

    International Nuclear Information System (INIS)

    In this work, we developed a numerical-analysis-associated experimental method to determine the effective multiplication factor keff, which is difficult to obtain directly from the conventional neutron source multiplication (NSM) method. The method is based on the relationship between keff, the subcritical multiplication factor ks and the external neutron source efficiency Φ* in a subcritical system. On the basis of the theoretical analysis, the dependence of ks and Φ* on subcriticality and source position was investigated at the Chinese Fast Burst Reactor-II (CFBR-II). A series of ks values was measured by NSM experiments at four subcritical states (keff = 0.996, 0.994, 0.991 and 0.986) with the 252Cf neutron source located at different positions (from the system center to outside) at each subcritical state. Φ* was obtained by Monte Carlo simulation for each condition. With the measured ks and calculated Φ*, the keff of the subcritical system was evaluated, with a relative difference of <1% between values obtained by the improved method and by the positive period method, and in particular of <0.18% with the source located at the system center. (authors)

  13. Scanner Uniformity improvements for radiochromic film analysis with matt reflectance backing

    International Nuclear Information System (INIS)

    Full text: A simple and reproducible method for increasing desktop scanner uniformity for the analysis of radiochromic films is presented. Scanner uniformity for transmission scanning, especially in the non-scan direction, is well known to be problematic for radiochromic film analysis, and corrections normally need to be applied. These corrections depend on scanner coordinates and the dose level applied, which complicates dosimetry procedures. This study has highlighted that by using reflectance scanning in combination with a matt, white backing material instead of the conventional gloss scanner finish, substantial increases in scanner uniformity can be achieved within 90% of the scanning area. Uniformity within ±1% over the scanning area was found for the Epson V700 scanner tested, compared with ±3% for reflection scanning with the gloss backing material and ±4% for transmission scanning. The matt backing material used was simply 5 layers of standard-quality white printing paper (80 g/m²). It was found that 5 layers was optimal for the backing material; however, most of the improvement was seen with a minimum of 3 layers, and above 5 layers no extra benefit was seen. This may eliminate the need to perform positional scanner corrections on desktop scanners for radiochromic film dosimetry. (author)

  14. Improvement of technological processes by the use of technological efficiency analysis

    Directory of Open Access Journals (Sweden)

    L.A. Dobrzański

    2007-12-01

    Purpose: The technological process is a basic determinant of the correct functioning of an industrial company on the market. All activities connected with technology, technology management and controlling should therefore be treated as priorities, that is, continuously improved. Design/methodology/approach: The basis for the process analysis model are indicators of fragmentary and technological efficiency, as well as standardized parameters of the technological process depending on the treatment applied. Findings: With appropriate indicators it is possible to identify operations which need to be verified. Although interdisciplinary process control is very complex, it offers objective assessment. The assessment should include the influence of individual parameters on the process and enable a good choice of the optimisation type. Practical implications: The process analysis, using immaterial parameters based on different types of processing and the design of the technological process, involved assessment of technological process efficiency with indicators of operational efficiency. Originality/value: Computer applications were created for calculating individual indicators, as well as for the final efficiency assessment used for planning the optimisation of individual operations.

  15. Improvement of the LOCA PSA model using a best-estimate thermal-hydraulic analysis

    International Nuclear Information System (INIS)

    Probabilistic Safety Assessment (PSA) has been widely used to estimate the overall safety of nuclear power plants (NPPs), and it provides the base information for risk-informed application (RIA) and risk-informed regulation (RIR). For the effective and correct use of PSA in RIA/RIR-related decision making, the risk estimated by a PSA model should be as realistic as possible. In this work, a best-estimate thermal-hydraulic analysis of loss-of-coolant accidents (LOCAs) for Hanul Nuclear Units 3 and 4 is first carried out in a systematic way: the behavior of the peak cladding temperature (PCT) is analyzed for various combinations of break sizes, operating conditions of the safety systems, and operator action times for aggressive secondary cooling. The results of the thermal-hydraulic analysis are then reflected in an improved PSA model by changing both the accident sequences and the success criteria of the event trees for the LOCA scenarios.

  16. An improved global analysis of nuclear parton distribution functions including RHIC data

    CERN Document Server

    Eskola, K J; Salgado, C A

    2008-01-01

    We present an improved leading-order global DGLAP analysis of nuclear parton distribution functions (nPDFs), supplementing the traditionally used data from deep inelastic lepton-nucleus scattering and Drell-Yan dilepton production in proton-nucleus collisions with inclusive high-$p_T$ hadron production data measured at RHIC in d+Au collisions. With the help of an extended definition of the $\chi^2$ function, we can now more efficiently exploit the constraints the different data sets offer, for gluon shadowing in particular, and account for the overall data normalization uncertainties during the automated $\chi^2$ minimization. The very good simultaneous fit to the nuclear hard-process data used demonstrates the feasibility of a universal set of nPDFs, but limitations also become visible. The high-$p_T$ forward-rapidity hadron data of BRAHMS add a crucial new constraint to the analysis by offering a direct probe of the nuclear gluon distributions -- a sector in the nPDFs which has traditionally been very b...

  17. Ultrasound guidance improves the success rate of axillary plexus block: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Qin Qin

    2016-04-01

    OBJECTIVE: To evaluate the value of real-time ultrasound (US) guidance for axillary brachial plexus block (AXB) through the success rate and the onset time. METHODS: The meta-analysis was carried out in the Anesthesiology Department of the Second Affiliated Hospital of Soochow University, Suzhou, Jiangsu Province, China. A literature search of the Medline, EMBASE and Cochrane databases from 2004 to 2014 was performed using the medical subject headings and free-text words "axilla", "axillary", "brachial plexus", "ultrasonography", "ultrasound", "ultrasonics". Two reviewers carried out the search and evaluated studies independently. RESULTS: Seven randomized controlled trials, one cohort study and three retrospective studies were included, identifying a total of 2042 patients: 1157 underwent AXB with US guidance (US group), and the control group included 885 patients (246 using the traditional approach (TRAD) and 639 using nerve stimulation (NS)). Our analysis showed that the success rate was higher in the US group than in the control group (90.64% vs. 82.21%, p < 0.00001). The average time to perform the block and the sensory onset time were also shorter in the US group. CONCLUSION: The present study demonstrates that real-time ultrasound guidance for axillary brachial plexus block improves the success rate and reduces both the mean time to onset of anesthesia and the block performance time.

  18. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    Energy Technology Data Exchange (ETDEWEB)

    Sandusky, Peter [Eckerd College, Department of Chemistry (United States); Appiah-Amponsah, Emmanuel; Raftery, Daniel, E-mail: raftery@purdue.edu [Purdue University, Department of Chemistry (United States)

    2011-04-15

    One-dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY-based filtering approach. This report re-examines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks; even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate taurine concentrations and distort their variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY-determined taurine concentrations produce better scores-plot subpopulation cluster resolution.
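The bucket integration the abstract compares against can be sketched as follows: the spectrum is divided into fixed-width chemical-shift bins and the intensity in each bin is summed, so any overlapping background signal in a bin inflates its integral. This is a generic illustration with an assumed 0.04 ppm bucket width, not the study's processing pipeline.

```python
import numpy as np

def bucket_integrate(ppm, intensity, bucket_width=0.04):
    """Integrate a 1D NMR spectrum into fixed-width chemical-shift
    buckets; returns (bucket centers, bucket integrals)."""
    ppm = np.asarray(ppm, dtype=float)
    intensity = np.asarray(intensity, dtype=float)
    edges = np.arange(ppm.min(), ppm.max() + bucket_width, bucket_width)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # assign each point to a bucket and accumulate its intensity
    idx = np.clip(np.digitize(ppm, edges) - 1, 0, len(centers) - 1)
    integrals = np.zeros(len(centers))
    np.add.at(integrals, idx, intensity)
    return centers, integrals
```

If two peaks from different metabolites fall in one bucket, that bucket's integral reflects their sum, which is exactly the overestimation effect described for taurine above.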

  19. Improved micro x-ray fluorescence spectrometer for light element analysis

    International Nuclear Information System (INIS)

    Since most available micro x-ray fluorescence (micro-XRF) spectrometers operate in air, which does not allow the analysis of low-Z elements (Z≤14), a special micro-XRF spectrometer has been designed to extend the analytical range down to light elements (Z≥6). It offers the improved excitation and detection conditions necessary for light element analysis. To eliminate absorption of the exciting and fluorescent radiation, the system operates under vacuum. Sample mapping is automated and controlled by specialized computer software developed for this spectrometer. Several different samples were measured to test and characterize the spectrometer. The spot size was determined by scans across a 10 μm Cu wire, which resulted in a full width at half maximum of 31 μm for the Mo Kα line (17.44 keV), an effective beam size of 44 μm for the Cu K edge and 71 μm for the Cu L edge. Lower limits of detection in the picogram range per spot (or μg/cm2) were obtained by measuring various thin metal foils under different conditions. Furthermore, detection limits in the parts-per-million range were found when measuring NIST621 standard reference material. Area scans of a microscopic laser print and a NaF droplet were performed to demonstrate the mapping capabilities.

  20. Methylphenidate on Cognitive Improvement in Patients with Traumatic Brain Injury: A Meta-Analysis.

    Science.gov (United States)

    Huang, Chi-Hsien; Huang, Chia-Chen; Sun, Cheuk-Kwan; Lin, Gong-Hong; Hou, Wen-Hsuan

    2016-01-01

    Although methylphenidate has been used as a neurostimulant to treat patients with attention deficit hyperactivity disorder, its therapeutic role in the psychomotor or cognitive recovery of patients with traumatic brain injuries (TBIs) in both intensive care and rehabilitation settings has not been adequately explored. To address this issue, this meta-analysis searched the available electronic databases using the key words "methylphenidate", "brain injuries", "head injuries", and "traumatic brain injury". Analysis of the ten double-blind randomized controlled trials (RCTs) demonstrated a significant benefit of methylphenidate for enhancing vigilance-associated attention (i.e., selective, sustained, and divided attention) in patients with TBIs (standardized mean difference: 0.45, 95% CI: 0.10 to 0.79), especially sustained attention (standardized mean difference: 0.66, 95% CI: 0.22 to 1.10). However, no significant positive impact was noted on the facilitation of memory or processing speed. More studies on the efficacy and safety of methylphenidate for the cognitive improvement of patients with TBIs are warranted. PMID:26951094

  1. Error analysis and system improvements in phase-stepping methods for photoelasticity

    International Nuclear Information System (INIS)

    In the past, automated photoelasticity has been demonstrated to be one of the most efficient techniques for determining the complete state of stress in a 3-D component. However, the measurement accuracy, which depends on many aspects of both the theoretical foundations and the experimental procedures, has not been studied properly. The objective of this thesis is to reveal the intrinsic properties of the errors, provide methods for reducing them and finally improve the system accuracy. A general formulation for a polariscope with all the optical elements in arbitrary orientations was derived using the method of Mueller matrices. The derivation of this formulation indicates an inherent connectivity among the optical elements and gives insight into the errors. In addition, this formulation shows a common foundation among the photoelastic techniques; consequently, these techniques share many common error sources. The phase-stepping system proposed by Patterson and Wang was used as an exemplar to analyse the errors and propose improvements. This system can be divided into four parts according to function: the optical system, the light source, the image acquisition equipment and the image analysis software. All the possible error sources were investigated separately, and methods for reducing the influence of the errors and improving the system accuracy are presented. To identify the contribution of each possible error to the final system output, a model was used to simulate the errors and analyse their consequences, so that the contribution from different error sources can be estimated quantitatively and the accuracy of the system improved. For a conventional polariscope, the system accuracy can be as high as 99.23% for the fringe order, with an error of less than 5 degrees for the isoclinic angle. The PSIOS system is limited to low fringe orders; for a fringe order of less than 1.5, the accuracy is 94.60% for fringe
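The Mueller-matrix formulation mentioned above can be illustrated numerically. The sketch below (a generic textbook construction, not the thesis's formulation) builds the standard Mueller matrices for a linear polarizer and a linear retarder, propagates an unpolarized Stokes vector through a dark-field circular polariscope, and recovers the familiar intensity law I ∝ sin²(δ/2), independent of the isoclinic angle.

```python
import numpy as np

def polarizer(theta):
    """Mueller matrix of an ideal linear polarizer, axis at angle theta."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return 0.5 * np.array([[1, c, s, 0],
                           [c, c * c, c * s, 0],
                           [s, c * s, s * s, 0],
                           [0, 0, 0, 0]])

def retarder(delta, theta):
    """Mueller matrix of a linear retarder with retardation delta and
    fast axis at angle theta."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    cd, sd = np.cos(delta), np.sin(delta)
    return np.array([[1, 0, 0, 0],
                     [0, c * c + s * s * cd, c * s * (1 - cd), -s * sd],
                     [0, c * s * (1 - cd), s * s + c * c * cd, c * sd],
                     [0, s * sd, -c * sd, cd]])

def dark_field_intensity(delta, phi):
    """Output intensity of a dark-field circular polariscope for a sample
    with retardation delta and isoclinic angle phi (unit input light)."""
    q = np.pi / 4
    S = np.array([1.0, 0.0, 0.0, 0.0])      # unpolarized input light
    chain = (polarizer(np.pi / 2)           # analyzer at 90 degrees
             @ retarder(np.pi / 2, -q)      # output quarter-wave plate
             @ retarder(delta, phi)         # stressed model (sample)
             @ retarder(np.pi / 2, q)       # input quarter-wave plate
             @ polarizer(0.0))              # polarizer at 0 degrees
    return float((chain @ S)[0])
```

Evaluating `dark_field_intensity` gives 0.5·sin²(δ/2) for any isoclinic angle φ, which is why the circular arrangement isolates the isochromatics; phase-stepping systems then rotate the output elements to recover φ and δ from several such intensity readings.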

  2. Study on improvement of reactor physics analysis method for FBRs with various core concept (2)

    International Nuclear Information System (INIS)

    Investigation was made on the following three themes as part of the improvement of reactor physics analysis methods for FBRs with various core concepts. Part 1: Investigations on Improvement of Neutron Spectrum Evaluation by the Use of Covariance Matrices and Bias Corrections. In order to improve the neutron spectrum unfolding method used in the experimental fast reactor JOYO, investigation was made on bias corrections to the initial neutron spectrum and on the error evaluation of nuclear data with covariance matrices. The error estimation was done by accumulating each bias correction factor and the covariance matrix. It was concluded that the accumulated error for the initial neutron spectrum is relatively small, and a considerable improvement was achieved by the use of bias corrections. Part 2: Evaluation of Neutron Streaming in Gas Cooled Fast Reactors by the Use of the Monte Carlo Method. As part of investigations on the evaluation of the anisotropic diffusion coefficients for gas cooled fast reactors, a new tally function was added to a Monte Carlo code so that neutron streaming can be calculated with heterogeneous core configurations. It was found that the neutron streaming becomes larger when the heterogeneous model is used, and that the tendency is more distinct in the lower energy range. The same type of comparison was also made for the difference of core calculation models and for transport versus diffusion theory. The final result shows that the transport/diffusion error has positive values in the higher energy range, and the heterogeneous/homogeneous error has negative values in the lower energy range. Part 3: Investigation on the Calculation Method for Nuclear Converters with Neutron Moderators. A new calculation system which can deal with target assemblies containing neutron moderators was proposed. This concept has been investigated as a device to achieve a high conversion rate for long-lived fission products. It was concluded that the characteristics method is ideal

  3. AB032. A systematic review (meta-analysis): low energy shock wave improves erectile function

    Science.gov (United States)

    Lue, Tom F.

    2016-01-01

    Background As a novel therapeutic method for erectile dysfunction (ED), Low Energy Shock Wave (LESW) therapy has recently been applied in the clinical setting. A summary of the current literature and a systematic review evaluating the therapeutic efficacy of LESW for ED would be helpful for physicians interested in using this modality to treat patients. Methods A comprehensive search of the PubMed and EMBASE databases up to November 2015 was performed. Studies reporting on patients with ED treated with LESW were included. The International Index of Erectile Function (IIEF) score and the Erection Hardness Score (EHS) were the most commonly used tools to evaluate therapeutic efficacy. Results There were 14 studies including 806 patients from 2005 to 2015; seven of the 14 were randomized controlled trials (RCTs). Across these studies, the LESW setup parameters and treatment protocols varied. The meta-analysis revealed that LESW significantly improved the IIEF score (MD =2.00; 95% CI, 0.99–3.00; P<0.0001) and the EHS score (RD =0.16; 95% CI, 0.03–0.28; P=0.01). The therapeutic efficacy lasted at least three months. Patients with mild-to-moderate ED or without comorbidities had better outcomes after treatment than patients with more severe ED or comorbidities. The energy density of the LESW treatment was closely related to the clinical outcome, especially for IIEF improvement; more frequent treatment or a longer treatment course did not improve the IIEF score. Conclusions Studies of LESW for ED patients have increased sharply in recent years. Most present encouraging results, regardless of variation in LESW setup parameters or treatment protocols, and suggest that LESW can significantly improve the IIEF and EHS scores of ED patients. The publication of robust evidence from additional randomized controlled trials and longer-term follow-up would provide us more confidence regarding utilization

  4. Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marten, Alex; Kopp, Robert E.; Shouse, Kate C.; Griffiths, Charles; Hodson, Elke L.; Kopits, Elizabeth; Mignone, Bryan K.; Moore, Chris; Newbold, Steve; Waldhoff, Stephanie T.; Wolverton, Ann

    2013-04-01

    to updating the estimates regularly as modeling capabilities and scientific and economic knowledge improves. To help foster further improvements in estimating the SCC, the U.S. Environmental Protection Agency and the U.S. Department of Energy hosted a pair of workshops on “Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis.” The first focused on conceptual and methodological issues related to integrated assessment modeling and the second brought together natural and social scientists to explore methods for improving damage assessment for multiple sectors. These two workshops provide the basis for the 13 papers in this special issue.

  5. Improvement of core effective thermal conductivity model of GAMMA+ code based on CFD analysis

    International Nuclear Information System (INIS)

    Highlights: • We assessed the core effective thermal conductivity (ETC) model of GAMMA+ code. • The analytical model of GAMMA+ code was compared with the result of CFD analysis. • Effects of material property of composite and geometric configuration were studied. • The GAMMA+ model agreed with the CFD result when the fuel gap is ignored. • The GAMMA+ model was improved by the ETC model of fuel compact including fuel gap. - Abstract: The GAMMA+ code has been developed for the thermo-fluid and safety analyses of a high temperature gas-cooled reactor (HTGR). In order to calculate the core effective thermal conductivity, this code adopts a heterogeneous model derived from Maxwell's theory that accounts for three distinct materials in a fuel block of the reactor core. In this model, the fuel gap is neglected since the gap thickness is quite small. In addition, the configuration of the fuel block is assumed to be homogeneous, and the volume fraction and material properties of each component are taken into account. Under accident conditions, conduction and radiation are the major heat transfer mechanisms. Therefore, the core effective thermal conductivity model should be validated in order to estimate the heat transfer in the core appropriately. In this regard, the objective of this study is to validate the core effective thermal conductivity model of the GAMMA+ code by a computational fluid dynamics (CFD) analysis using a commercial CFD code, CFX-13. The effects of the temperature condition, material properties and geometric modeling on the core effective thermal conductivity were investigated. When the fuel gap is not modeled in the CFD analysis, the result of the GAMMA+ code shows a good agreement with the CFD result. However, when the fuel gap is modeled, the GAMMA+ model overestimates the core effective thermal conductivity considerably for all cases. This is because of the increased thermal resistance by the fuel gap which is not taken into account in
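The Maxwell-type homogenization the abstract refers to can be illustrated with a two-phase sketch. The GAMMA+ model extends this to three materials; the conductivity values below are illustrative, not taken from the paper:

```python
def maxwell_eucken(k_cont, k_disp, phi_disp):
    """Effective thermal conductivity of a composite: spheres of
    conductivity k_disp (volume fraction phi_disp) dispersed in a
    continuous matrix of conductivity k_cont (Maxwell-Eucken model)."""
    num = k_disp + 2.0 * k_cont + 2.0 * phi_disp * (k_disp - k_cont)
    den = k_disp + 2.0 * k_cont - phi_disp * (k_disp - k_cont)
    return k_cont * num / den

# Illustrative values: a graphite matrix with 10% low-conductivity inclusions
k_eff = maxwell_eucken(k_cont=30.0, k_disp=3.0, phi_disp=0.10)
```

The effective value always falls between the constituent conductivities; omitting an additional low-conductivity gap phase raises the prediction, which is consistent with the overestimation the study reports when the gap is modeled in CFD.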

  6. Improved analysis of picomole quantities of lithium, sodium, and potassium in biological fluids.

    Science.gov (United States)

    Shalmi, M; Kibble, J D; Day, J P; Christensen, P; Atherton, J C

    1994-10-01

    The analysis of picomole quantities of lithium, sodium, and potassium by electrothermal atomic absorption spectrophotometry was studied using a Perkin-Elmer Zeeman 3030 spectrophotometer. With ordinary pyrolytically coated graphite tubes, a number of interference effects associated with the sample matrix were observed. In particular, the lithium and potassium absorbance signals were depressed by chloride, an effect shown to be dependent on the preatomization heating. When an in situ tantalum-coated atomization surface was used, the matrix interferences observed in lithium and potassium analyses were abolished, and the linear range of the potassium assay was extended. Technical difficulties encountered during sodium analysis at the primary wavelength were effectively circumvented by analysis at a less sensitive wavelength (303.3 nm), at which tantalum coating also prevented significant chloride interference. The improved microanalyses were employed to reevaluate the handling of lithium, sodium, and potassium along the proximal convoluted tubule (PCT) of the anesthetized rat. The average tubular fluid-to-plasma concentration ratios for lithium [(TF/P)Li] and sodium [(TF/P)Na] were 1.13 +/- 0.08 (n = 26) and 0.99 +/- 0.07 (n = 26), respectively. The tubular fluid-to-plasma ultrafiltrate concentration ratio for potassium [(TF/UF)K] was 1.09 +/- 0.05 (n = 13). Ratios did not change significantly with puncture site along the PCT for any of the ions. (TF/P)Li and (TF/UF)K were significantly greater than (TF/P)Na, indicating that lithium and potassium reabsorption do not directly parallel sodium reabsorption in the PCT. PMID:7943365

  7. Improving configuration management of thermalhydraulic analysis by automating the linkage between pipe geometry and plant idealization

    International Nuclear Information System (INIS)

    All safety analysis codes require some representation of actual plant data as a part of their input. Such representations, referred to at Point Lepreau Generating Station (PLGS) as plant idealizations, may include piping layout, orifice, pump or valve opening characteristics, boundary conditions of various sorts, reactor physics parameters, etc. As computing power increases, the numerical capabilities of thermalhydraulic analysis tools become more sophisticated, requiring more detailed assessments, and consequently more complex and complicated idealizations of the system models. Thus, a need has emerged to create a precise plant model layout in electronic form which ensures a realistic representation of the plant systems, and from which analytical approximations of any chosen degree of accuracy may be created. The benefits of this process are twofold. First, the job of developing a plant idealization is made simpler, and therefore cheaper for the utility. More important, however, are the improvements in documentation and reproducibility that this process imparts to the resultant idealization. Just as the software that performs the numerical operations on the input data must be subject to verification/validation, equally robust measures must be taken to ensure that these software operations are being applied to valid, formally documented idealizations. Since CATHENA is one of the most important thermalhydraulic codes used for safety analysis at PLGS, the main effort was directed towards the system plant models for this code. This paper reports the results of the work carried out at PLGS and ANSL to link the existing piping database to the actual CATHENA plant idealization. An introduction to the concept is given first, followed by a description of the databases, and the supervisory tool which manages the data, and associated software. 
An intermediate code, which applied some thermalhydraulic rules to the data, and translated the resultant data

  8. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    International Nuclear Information System (INIS)

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data related to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
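The three IOA pillars are linked by convolving historical sales with a discard-time (lifespan) distribution. A minimal sketch, assuming a discretized Weibull lifespan; the shape, scale, and sales figures are illustrative, not from the Dutch case study:

```python
import math

def weibull_pmf(age, shape, scale):
    """Probability that a unit is discarded at integer age
    (discretized Weibull lifespan distribution)."""
    cdf = lambda t: 1.0 - math.exp(-((t / scale) ** shape)) if t > 0 else 0.0
    return cdf(age + 1) - cdf(age)

def waste_generated(sales, year, shape=2.0, scale=8.0):
    """E-waste arising in `year`: past sales (dict year -> units)
    convolved with the lifespan distribution."""
    return sum(s * weibull_pmf(year - y, shape, scale)
               for y, s in sales.items() if y <= year)

sales = {2000: 100, 2001: 120, 2002: 150}
w_2005 = waste_generated(sales, 2005)
```

Mass balance is a useful check on such models: summed over a long enough horizon, generated waste must equal cumulative sales, since every unit sold is eventually discarded.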

  9. COLD-PCR enhanced melting curve analysis improves diagnostic accuracy for KRAS mutations in colorectal carcinoma

    Directory of Open Access Journals (Sweden)

    Joseph Loren

    2010-11-01

    -based melting curve analysis. This assay significantly improved diagnostic accuracy compared to traditional PCR and direct sequencing.

  10. Analysis and improvement of data-set level file distribution in Disk Pool Manager

    International Nuclear Information System (INIS)

    Of the three most widely used implementations of the WLCG Storage Element specification, Disk Pool Manager[1, 2] (DPM) has the simplest implementation of file placement balancing (StoRM doesn't attempt this, leaving it up to the underlying filesystem, which can be very sophisticated in itself). DPM uses a round-robin algorithm (with optional filesystem weighting) for placing files across filesystems and servers. This does a reasonable job of evenly distributing files across the storage array provided to it. However, it does not offer any guarantees of the evenness of distribution of that subset of files associated with a given 'dataset' (which often maps onto a 'directory' in the DPM namespace (DPNS)). It is useful to consider a concept of 'balance', where an optimally balanced set of files indicates that the files are distributed evenly across all of the pool nodes. At best, the round-robin algorithm maintains balance; it has no mechanism to improve it. In the past year or more, larger DPM sites have noticed load spikes on individual disk servers, and suspected that these were exacerbated by excesses of files from popular datasets on those servers. We present here a software tool which analyses file distribution for all datasets in a DPM SE, providing a measure of how poorly files are located in this context. Further, the tool provides a list of file movement actions which will improve dataset-level file distribution, and can action those file movements itself. We present results of such an analysis on the UKI-SCOTGRID-GLASGOW Production DPM.
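The dataset-level notion of 'balance' described above can be captured by a simple per-dataset metric; a sketch of one such measure (not the tool's actual implementation):

```python
from collections import Counter

def dataset_imbalance(file_locations, n_servers):
    """Imbalance of one dataset's files across pool nodes.

    file_locations: list of server names, one entry per file replica.
    Returns max-count over ideal-count: 1.0 means perfectly balanced,
    larger values mean files pile up on a few servers."""
    counts = Counter(file_locations)
    ideal = len(file_locations) / n_servers
    return max(counts.values()) / ideal
```

A popular dataset concentrated on one node scores n_servers (the worst case), flagging it as a candidate for the corrective file movements the tool proposes.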

  11. Improving the Efficiency and Ease of Healthcare Analysis Through Use of Data Visualization Dashboards.

    Science.gov (United States)

    Stadler, Jennifer G; Donlon, Kipp; Siewert, Jordan D; Franken, Tessa; Lewis, Nathaniel E

    2016-06-01

    The digitization of a patient's health record has profoundly impacted medicine and healthcare. The compilation and accessibility of medical history has provided clinicians an unprecedented, holistic account of a patient's conditions, procedures, medications, family history, and social situation. In addition to the bedside benefits, this level of information has opened the door for population-level monitoring and research, the results of which can be used to guide initiatives that are aimed at improving quality of care. Cerner Corporation partners with health systems to help guide population management and quality improvement projects. With such an enormous and diverse client base, varying in geography, size, organizational structure, and analytic needs, discerning meaning in the data and how they fit with that particular hospital's goals is a slow, difficult task that requires clinical, statistical, and technical literacy. This article describes the development of dashboards for efficient data visualization at the healthcare facility level. Focusing on two areas with broad clinical importance, sepsis patient outcomes and 30-day hospital readmissions, dashboards were developed with the goal of aggregating data and providing meaningful summary statistics, highlighting critical performance metrics, and providing easily digestible visuals that can be understood by a wide range of personnel with varying levels of skill and areas of expertise. These internal-use dashboards have allowed associates in multiple roles to perform a quick and thorough assessment of a hospital of interest by providing the data to answer necessary questions and to identify important trends or opportunities. This automation of a previously manual process has greatly increased efficiency, saving hours of work time per hospital analyzed. Additionally, the dashboards have standardized the analysis process, ensuring use of the same metrics and processes so that overall themes can be compared across

  12. FDG uptake heterogeneity evaluated by fractal analysis improves the differential diagnosis of pulmonary nodules

    Energy Technology Data Exchange (ETDEWEB)

    Miwa, Kenta, E-mail: kenta5710@gmail.com [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Division of Medical Quantum Science, Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Inubushi, Masayuki, E-mail: inubushi@med.kawasaki-m.ac.jp [Department of Nuclear Medicine, Kawasaki Medical School, 577 Matsushima Kurashiki, Okayama 701-0192 (Japan); Wagatsuma, Kei, E-mail: kei1192@hotmail.co.jp [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Nagao, Michinobu, E-mail: minagao@radiol.med.kyushu-u.ac.jp [Department of Molecular Imaging and Diagnosis, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Murata, Taisuke, E-mail: taisuke113@gmail.com [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Koyama, Masamichi, E-mail: masamichi.koyama@jfcr.or.jp [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Koizumi, Mitsuru, E-mail: mitsuru@jfcr.or.jp [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Sasaki, Masayuki, E-mail: msasaki@hs.med.kyushu-u.ac.jp [Division of Medical Quantum Science, Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan)

    2014-04-15

    Purpose: The present study aimed to determine whether fractal analysis of morphological complexity and intratumoral heterogeneity of FDG uptake can help to differentiate malignant from benign pulmonary nodules. Materials and methods: We retrospectively analyzed data from 54 patients with suspected non-small cell lung cancer (NSCLC) who were examined by FDG PET/CT. Pathological assessments of biopsy specimens confirmed 35 and 19 nodules as NSCLC and inflammatory lesions, respectively. The morphological fractal dimension (m-FD), maximum standardized uptake value (SUV{sub max}) and density fractal dimension (d-FD) of target nodules were calculated from CT and PET images. Fractal dimension is a quantitative index of morphological complexity and tracer uptake heterogeneity; higher values indicate increased complexity and heterogeneity. Results: The m-FD, SUV{sub max} and d-FD significantly differed between malignant and benign pulmonary nodules (p < 0.05). Although the diagnostic ability was better for d-FD than m-FD and SUV{sub max}, the difference did not reach statistical significance. Tumor size correlated significantly with SUV{sub max} (r = 0.51, p < 0.05), but not with either m-FD or d-FD. Furthermore, m-FD combined with either SUV{sub max} or d-FD improved diagnostic accuracy to 92.6% and 94.4%, respectively. Conclusion: The d-FD of intratumoral heterogeneity of FDG uptake can help to differentially diagnose malignant and benign pulmonary nodules. The SUV{sub max} and d-FD obtained from FDG-PET images provide different types of information that are equally useful for differential diagnoses. Furthermore, the morphological complexity determined by CT combined with heterogeneous FDG uptake determined by PET improved diagnostic accuracy.
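Box counting is the standard way to estimate a fractal dimension: count the boxes of side s needed to cover the structure and fit the slope of log N(s) against log(1/s). The study's d-FD is computed on gray-level PET intensities; the sketch below is the simplified binary-mask version of the same idea:

```python
import numpy as np

def box_count_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Box-counting fractal dimension of a 2D binary mask:
    slope of log N(s) versus log(1/s)."""
    ns = []
    h, w = mask.shape
    for s in sizes:
        count = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if mask[i:i + s, j:j + s].any():
                    count += 1        # box of side s touches the set
        ns.append(count)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(ns), 1)
    return slope

# Sanity check: a completely filled region is 2-dimensional
d_filled = box_count_dimension(np.ones((32, 32), dtype=bool))
```

Higher dimensions indicate more complex boundaries (m-FD) or more heterogeneous uptake (d-FD), matching the interpretation given in the abstract.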

  13. FDG uptake heterogeneity evaluated by fractal analysis improves the differential diagnosis of pulmonary nodules

    International Nuclear Information System (INIS)

    Purpose: The present study aimed to determine whether fractal analysis of morphological complexity and intratumoral heterogeneity of FDG uptake can help to differentiate malignant from benign pulmonary nodules. Materials and methods: We retrospectively analyzed data from 54 patients with suspected non-small cell lung cancer (NSCLC) who were examined by FDG PET/CT. Pathological assessments of biopsy specimens confirmed 35 and 19 nodules as NSCLC and inflammatory lesions, respectively. The morphological fractal dimension (m-FD), maximum standardized uptake value (SUVmax) and density fractal dimension (d-FD) of target nodules were calculated from CT and PET images. Fractal dimension is a quantitative index of morphological complexity and tracer uptake heterogeneity; higher values indicate increased complexity and heterogeneity. Results: The m-FD, SUVmax and d-FD significantly differed between malignant and benign pulmonary nodules (p < 0.05). Although the diagnostic ability was better for d-FD than m-FD and SUVmax, the difference did not reach statistical significance. Tumor size correlated significantly with SUVmax (r = 0.51, p < 0.05), but not with either m-FD or d-FD. Furthermore, m-FD combined with either SUVmax or d-FD improved diagnostic accuracy to 92.6% and 94.4%, respectively. Conclusion: The d-FD of intratumoral heterogeneity of FDG uptake can help to differentially diagnose malignant and benign pulmonary nodules. The SUVmax and d-FD obtained from FDG-PET images provide different types of information that are equally useful for differential diagnoses. Furthermore, the morphological complexity determined by CT combined with heterogeneous FDG uptake determined by PET improved diagnostic accuracy

  14. Analysis of Stakeholder's Behaviours for an Improved Management of an Agricultural Coastal Region in Oman

    Science.gov (United States)

    Al Khatri, Ayisha; Grundmann, Jens; van der Weth, Rüdiger; Schütze, Niels

    2015-04-01

    differences exist between groups on how to achieve this improvement, since farmers prefer management interventions operating more on the water resources side while decision makers support measures for a better management of the water demand side. Furthermore, the opinions within single groups are sometimes contradictory for several management interventions. The use of more advanced statistical methods like discriminant analysis or Bayesian networks allows for identifying factors and drivers to explain these differences. Both approaches will help to understand stakeholders' behaviours and to evaluate the implementation potential of several management interventions. Keywords: IWRM, stakeholder participation, field survey, statistical analysis, Oman

  15. An improved state-parameter analysis of ecosystem models using data assimilation

    Science.gov (United States)

    Chen, M.; Liu, S.; Tieszen, L.L.; Hollinger, D.Y.

    2008-01-01

    Much of the effort spent in developing data assimilation methods for carbon dynamics analysis has focused on estimating optimal values for either model parameters or state variables. The main weakness of estimating parameter values alone (i.e., without considering state variables) is that all errors from input, output, and model structure are attributed to model parameter uncertainties. On the other hand, the accuracy of estimating state variables may be lowered if the temporal evolution of parameter values is not incorporated. This research develops a smoothed ensemble Kalman filter (SEnKF) by combining the ensemble Kalman filter with a kernel smoothing technique. The SEnKF has the following characteristics: (1) it estimates model states and parameters simultaneously by concatenating unknown parameters and state variables into a joint state vector; (2) it mitigates dramatic, sudden changes of parameter values in the parameter sampling and evolution process, and controls the narrowing of parameter variance (which results in filter divergence) by adjusting the smoothing factor in the kernel smoothing algorithm; (3) it assimilates data into the model recursively and thus detects possible time variation of parameters; and (4) it properly addresses various sources of uncertainty stemming from input, output and parameter uncertainties. The SEnKF is tested by assimilating observed fluxes of carbon dioxide and environmental driving factor data from an AmeriFlux forest station located near Howland, Maine, USA, into a partition eddy flux model. Our analysis demonstrates that model parameters, such as light use efficiency, respiration coefficients, minimum and optimum temperatures for photosynthetic activity, and others, are highly constrained by eddy flux data at daily-to-seasonal time scales. The SEnKF stabilizes parameter values quickly regardless of the initial values of the parameters. Potential ecosystem light use efficiency demonstrates a strong seasonality. 
Results show that the
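The joint state-parameter update at the core of such a filter can be sketched with a scalar toy model. This is an illustrative AR(1) example, not the paper's partition eddy flux model; the shrinkage factor h plays the role of the kernel smoothing that controls parameter-variance collapse:

```python
import numpy as np

rng = np.random.default_rng(42)

def senkf_toy(obs, n_ens=300, h=0.98, obs_var=0.01, q=0.3):
    """Joint state-parameter EnKF for x_{t+1} = a * x_t + noise.
    State x and parameter a form an augmented vector z = [x, a];
    parameters evolve by kernel smoothing (shrink toward the ensemble
    mean by factor h) plus jitter, which limits variance collapse."""
    x = rng.normal(0.0, 1.0, n_ens)        # state ensemble
    a = rng.uniform(0.2, 1.2, n_ens)       # parameter ensemble
    for y in obs:
        x = a * x + rng.normal(0.0, q, n_ens)                 # forecast
        a = h * a + (1.0 - h) * a.mean() + rng.normal(0.0, 0.02, n_ens)
        cov = np.cov(np.vstack([x, a]))                       # 2x2 covariance
        gain = cov[:, 0] / (cov[0, 0] + obs_var)              # H = [1, 0]
        innov = y + rng.normal(0.0, np.sqrt(obs_var), n_ens) - x
        x, a = x + gain[0] * innov, a + gain[1] * innov       # analysis
    return a.mean()

# Synthetic observations from a truth run with a = 0.8
xt, obs = 1.0, []
for _ in range(300):
    xt = 0.8 * xt + rng.normal(0.0, 0.3)
    obs.append(xt + rng.normal(0.0, 0.1))
a_hat = senkf_toy(obs)
```

Because the parameter is carried in the augmented vector, its cross-covariance with the observed state drives the update, so no explicit parameter-observation operator is needed.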

  16. Oxygen isotope analysis of phosphate: improved precision using TC/EA CF-IRMS.

    Science.gov (United States)

    LaPorte, D F; Holmden, C; Patterson, W P; Prokopiuk, T; Eglington, B M

    2009-06-01

    Oxygen isotope values of biogenic apatite have long demonstrated considerable promise for paleothermometry because of the abundance of material in the fossil record and the greater resistance of apatite to diagenesis compared to carbonate. Unfortunately, this promise has not been fully realized because of the relatively poor precision of isotopic measurements and the exceedingly small size of some substrates for analysis. Building on previous work, we demonstrate that it is possible to improve the precision of delta18O(PO4) measurements using a 'reverse-plumbed' thermal conversion elemental analyzer (TC/EA) coupled to a continuous flow isotope ratio mass spectrometer (CF-IRMS) via a helium stream. This modification to the flow of helium through the TC/EA, and careful location of the packing of glassy carbon fragments relative to the hot spot in the reactor, leads to narrower, more symmetrically distributed CO elution peaks with diminished tailing. In addition, we describe our apatite purification chemistry, which uses nitric acid and cation exchange resin. The purification chemistry is optimized for processing small samples, minimizing isotopic fractionation of PO4(3-) and permitting Ca, Sr and Nd to be eluted and purified further for the measurement of delta44Ca and 87Sr/86Sr in modern biogenic apatite and 143Nd/144Nd in fossil apatite. Our methodology yields an external precision of +/- 0.15 per thousand (1 sigma) for delta18O(PO4). The uncertainty is related to the preparation of the Ag3PO4 salt, conversion to CO gas in a reverse-plumbed TC/EA, analysis of oxygen isotopes using a CF-IRMS, and uncertainty in constructing calibration lines that convert raw delta18O data to the VSMOW scale. Matrix matching of samples and standards for the purpose of calibration to the VSMOW scale was determined to be unnecessary. Our method requires only slightly modified equipment that is widely available. This fact, and the

  17. Experimental study and mechanism analysis of modified limestone by red mud for improving desulfurization

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongtao; Han, Kuihua; Niu, Shengli; Lu, Chunmei; Liu, Mengqi; Li, Hui [Shandong Univ., Jinan (China). School of Energy and Power Engineering

    2013-07-01

    Red mud is a solid waste generated during alumina production from bauxite, and how to dispose of and utilize red mud on a large scale remains an open question. This paper attempts to use red mud as an additive to modify limestone. The enhancement of the sulfation reaction of limestone by red mud (two kinds of Bayer-process red mud and one kind of sintering-process red mud) is studied in a tube furnace reactor. The calcination and sulfation processes and kinetics are investigated in a thermogravimetric (TG) analyzer. The results show that red mud can effectively improve the desulfurization performance of limestone over the whole temperature range (1,073-1,373 K). The sulfur capacity of limestone (the mass of SO{sub 2} that can be retained by 100 mg of limestone) can be increased by 25.73, 7.17 and 15.31%, while the utilization of calcium can be increased from 39.68 to 64.13%, 60.61% and 61.16%, after modification by the three kinds of red mud at a calcium/metallic element ratio of 15 (metallic element here means all metallic elements that can catalyze the sulfation process, including Na, K, Fe and Ti) and a temperature of 1,173 K. The structure of limestone modified by red mud is interlaced and tridimensional, which is conducive to the sulfation reaction. The phase composition of modified limestone sulfated at high temperature, measured by XRD, shows correspondingly more sulphates for the silicate and aluminate complexes of calcium in the products. Temperature, calcium/metallic element ratio and particle diameter are important factors for the sulfation reaction. The optimum results are obtained at a calcium/metallic element ratio of 15. The calcination characteristic of limestone modified by red mud shifts toward lower temperatures. 
The enhancement of sulfation by doping red mud is more pronounced once the product layer has been formed and consequently the promoting

  18. Rehabilitation Interventions for Improving Social Participation After Stroke: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Obembe, Adebimpe O; Eng, Janice J

    2016-05-01

    Background Despite the fact that social participation is considered a pivotal outcome of a successful recovery after stroke, there has been little attention on the impact of activities and services on this important domain. Objective To present a systematic review and meta-analysis from randomized controlled trials (RCTs) on the effects of rehabilitation interventions on social participation after stroke. Methods A total of 8 electronic databases were searched for relevant RCTs that evaluated the effects of an intervention on the outcome of social participation after stroke. Reference lists of selected articles were hand searched to identify further relevant studies. The methodological quality of the studies was assessed using the Physiotherapy Evidence Database Scale. Standardized mean differences (SMDs) and confidence intervals (CIs) were estimated using fixed- and random-effect models. Results In all, 24 RCTs involving 2042 stroke survivors were identified and reviewed, and 21 were included in the meta-analysis. There was a small beneficial effect of interventions that utilized exercise on social participation (10 studies; SMD = 0.43; 95% CI = 0.09, 0.78; P = .01) immediately after the program ended. Exercise in combination with other interventions (13 studies; SMD = 0.34; 95% CI = 0.10, 0.58; P = .006) also resulted in beneficial effects. No significant effect was observed for interventions that involved support services over 9 studies (SMD = 0.09 [95% CI = -0.04, 0.21]; I(2) = 0%; P = .16). Conclusions The included studies provide evidence that rehabilitation interventions may be effective in improving social participation after stroke, especially if exercise is one of the components. PMID:26223681
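Fixed-effect inverse-variance pooling of SMDs, the basic step in meta-analyses like this one, reduces to a weighted mean; a minimal sketch with made-up study values:

```python
import math

def pool_fixed_effect(smds, variances):
    """Fixed-effect inverse-variance pooled SMD with a 95% CI:
    each study is weighted by the reciprocal of its variance."""
    weights = [1.0 / v for v in variances]
    est = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Two hypothetical studies with equal precision
est, ci = pool_fixed_effect([0.5, 0.3], [0.04, 0.04])
```

Random-effects models, also used in the review, add a between-study variance term to each study's weight, widening the interval when the studies disagree.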

  19. Improving practices in nanomedicine through near real-time pharmacokinetic analysis

    Science.gov (United States)

    Magaña, Isidro B.

    More than a decade into the development of gold nanoparticles, with multiple clinical trials underway, ongoing pre-clinical research continues towards better understanding in vivo interactions. The goal is treatment optimization through improved best practices. In an effort to collect information for healthcare providers enabling informed decisions in a relevant time frame, instrumentation for real-time plasma concentration (multi-wavelength photoplethysmography) and protocols for rapid elemental analysis (energy dispersive X-Ray fluorescence) of biopsied tumor tissue have been developed in a murine model. An initial analysis, designed to demonstrate the robust nature and utility of the techniques, revealed that area under the bioavailability curve (AUC) alone does not currently inform tumor accumulation with a high degree of accuracy (R2=0.56), marginally better than injected dose (R2=0.46). This finding suggests that the control of additional experimental and physiological variables (chosen through modeling efforts) may yield more predictable tumor accumulation. Subject core temperature, blood pressure, and tumor perfusion are evaluated relative to particle uptake in a murine tumor model. New research efforts are also focused on adjuvant therapies that are employed to modify circulation parameters, including the AUC, of nanorods and gold nanoshells. Preliminary studies demonstrated a greater than 300% increase in average AUC using a reticuloendothelial blockade agent versus control groups. Given a better understanding of the relative importance of the physiological factors that influence rates of tumor accumulation, a set of experimental best practices is presented. This dissertation outlines the experimental protocols conducted, and discusses the real-world needs discovered and how these needs became specifications of developed protocols.

  20. Cofactor modification analysis: a computational framework to identify cofactor specificity engineering targets for strain improvement.

    Science.gov (United States)

    Lakshmanan, Meiyappan; Chung, Bevan Kai-Sheng; Liu, Chengcheng; Kim, Seon-Won; Lee, Dong-Yup

    2013-12-01

    Cofactors, such as NAD(H) and NADP(H), play important roles in energy transfer within the cells by providing the necessary redox carriers for a myriad of metabolic reactions, both anabolic and catabolic. Thus, it is crucial to establish the overall cellular redox balance for achieving the desired cellular physiology. Of several methods to manipulate the intracellular cofactor regeneration rates, altering the cofactor specificity of a particular enzyme is a promising one. However, the identification of relevant enzyme targets for such cofactor specificity engineering (CSE) is often very difficult and labor intensive. Therefore, it is necessary to develop more systematic approaches to find the cofactor engineering targets for strain improvement. Presented herein is a novel mathematical framework, cofactor modification analysis (CMA), developed based on the well-established constraints-based flux analysis, for the systematic identification of suitable CSE targets while exploring the global metabolic effects. The CMA algorithm was applied to E. coli using its genome-scale metabolic model, iJO1366, thereby identifying the growth-coupled cofactor engineering targets for overproducing four of its native products: acetate, formate, ethanol, and lactate, and three non-native products: 1-butanol, 1,4-butanediol, and 1,3-propanediol. Notably, among several target candidates for cofactor engineering, glyceraldehyde-3-phosphate dehydrogenase (GAPD) is the most promising enzyme; its cofactor modification enhanced both the desired product and biomass yields significantly. Finally, given the identified target, we further discussed potential mutational strategies for modifying cofactor specificity of GAPD in E. coli as suggested by in silico protein docking experiments. PMID:24372035
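CMA builds on constraints-based flux analysis: the core step, maximizing a target flux subject to steady-state stoichiometry and flux bounds, is a linear program. A toy three-reaction sketch of that underlying flux balance step (not the iJO1366 model, and without the cofactor-swap scan that CMA adds on top):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: metabolites A, B; reactions
#   v1: A_ext -> A   (uptake, capped at 10)
#   v2: A -> B
#   v3: B -> biomass (objective)
S = np.array([[1.0, -1.0,  0.0],   # balance on A
              [0.0,  1.0, -1.0]])  # balance on B
bounds = [(0, 10), (0, None), (0, None)]

# Maximize v3 <=> minimize -v3, subject to S v = 0 (steady state)
res = linprog(c=[0.0, 0.0, -1.0], A_eq=S, b_eq=[0.0, 0.0],
              bounds=bounds, method="highs")
fluxes = res.x  # optimal flux distribution; biomass flux is fluxes[2]
```

CMA-style analyses rerun such LPs for each candidate enzyme with its cofactor specificity swapped (e.g. NADH vs NADPH stoichiometry) and compare the resulting product and biomass yields.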

  1. Gaining improved chemical composition by exploitation of Compton-to-Rayleigh intensity ratio in XRF analysis.

    Science.gov (United States)

    Hodoroaba, Vasile-Dan; Rackwitz, Vanessa

    2014-07-15

    The high specificity of the coherent (Rayleigh) and incoherent (Compton) X-ray scattering to the mean atomic number of a specimen analyzed by X-ray fluorescence (XRF) can be exploited to gain more information on the chemical composition. Concretely, the evaluation of the Compton-to-Rayleigh intensity ratio from XRF spectra and its relation to the average atomic number of reference materials via a calibration curve can reveal valuable information on the elemental composition, complementary to that obtained from reference-free XRF analysis. Particularly for matrices of lower mean atomic number, the sensitivity of the approach is so high that specimens whose mean atomic numbers differ by only 0.1 can be easily distinguished. Hence, the content of light elements which are "invisible" to XRF, particularly hydrogen, or of heavier impurities/additives in light materials can be calculated "by difference" from the scattering calibration curve. The excellent agreement between such an experimental, empirical calibration curve and a synthetically generated one, based on a reliable physical model for X-ray scattering, is also demonstrated. Thus, the feasibility of the approach for given experimental conditions and particular analytical questions can be tested prior to experiments with reference materials. For the present work, a microfocus X-ray source attached to an SEM/EDX (scanning electron microscopy/energy dispersive X-ray spectroscopy) system was used, so that the Compton-to-Rayleigh intensity ratio could be acquired with EDX spectral data for improved analysis of the elemental composition. PMID:24950635
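
    The calibration-curve step can be sketched as follows; the reference values below are invented for illustration, not the paper's data. A log-linear fit of the Compton-to-Rayleigh ratio against mean atomic number is inverted to estimate the mean atomic number of an unknown specimen.

```python
import numpy as np

# Hypothetical reference materials: mean atomic number Z and measured
# Compton/Rayleigh intensity ratio (values are illustrative only).
z_ref = np.array([6.0, 6.6, 7.2, 8.1, 9.0, 10.4])
ratio_ref = np.array([9.8, 8.1, 6.9, 5.4, 4.3, 3.2])

# Calibration curve: the ratio falls monotonically with Z, so fit
# log(ratio) against Z with a straight line.
slope, intercept = np.polyfit(z_ref, np.log(ratio_ref), 1)

def mean_z_from_ratio(ratio):
    """Invert the calibration curve for an unknown specimen."""
    return (np.log(ratio) - intercept) / slope

# A measured ratio between two references maps to an intermediate Z,
# from which "invisible" light-element content can be judged by difference.
z_est = mean_z_from_ratio(6.0)
```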

  2. Improved machine learning method for analysis of gas phase chemistry of peptides

    Directory of Open Access Journals (Sweden)

    Ahn Natalie

    2008-12-01

    Full Text Available Abstract Background Accurate peptide identification is important to high-throughput proteomics analyses that use mass spectrometry. Search programs compare fragmentation spectra (MS/MS) of peptides from complex digests with theoretically derived spectra from a database of protein sequences. Improved discrimination is achieved with theoretical spectra that are based on simulating gas phase chemistry of the peptides, but the limited understanding of those processes affects the accuracy of predictions from theoretical spectra. Results We employed a robust data mining strategy using new feature annotation functions of MAE software, which revealed under-prediction of the frequency of occurrence in fragmentation of the second peptide bond. We applied methods of exploratory data analysis to pre-process the information in the MS/MS spectra, including data normalization and attribute selection, to reduce the attributes to a smaller, less correlated set for machine learning studies. We then compared our rule building machine learning program, DataSqueezer, with commonly used association rules and decision tree algorithms. All used machine learning algorithms produced similar results that were consistent with expected properties for a second gas phase mechanism at the second peptide bond. Conclusion The results provide compelling evidence that we have identified underlying chemical properties in the data that suggest the existence of an additional gas phase mechanism for the second peptide bond. Thus, the methods described in this study provide a valuable approach for analyses of this kind in the future.
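
    The pre-processing described above (normalization, then reduction to a smaller, less correlated attribute set) can be sketched as a simple correlation filter; the data and the 0.9 threshold are illustrative assumptions, not the paper's actual MAE/DataSqueezer pipeline.

```python
import numpy as np

def select_low_correlation(X, threshold=0.9):
    """Return column indices of a reduced, less correlated attribute set:
    normalize the attributes, then greedily drop any attribute that is
    highly correlated with one already kept."""
    Xn = (X - X.mean(axis=0)) / X.std(axis=0)  # normalization step
    corr = np.abs(np.corrcoef(Xn, rowvar=False))
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept

# Illustrative attributes: b is a near-duplicate of a, c is independent.
rng = np.random.default_rng(7)
a = rng.normal(size=200)
b = a * 2.0 + rng.normal(scale=0.01, size=200)
c = rng.normal(size=200)
kept = select_low_correlation(np.column_stack([a, b, c]))
```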

  3. Improved Dynamic Modeling of the Cascade Distillation Subsystem and Analysis of Factors Affecting Its Performance

    Science.gov (United States)

    Perry, Bruce A.; Anderson, Molly S.

    2015-01-01

    The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station Water Processor Assembly to form a complete water recovery system for future missions. A preliminary chemical process simulation was previously developed using Aspen Custom Modeler® (ACM), but it could not simulate thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. This paper describes modifications to the ACM simulation of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version can be used to model thermal startup and predicts the total energy consumption of the CDS. The simulation has been validated for both NaC1 solution and pretreated urine feeds and no longer requires retuning when operating parameters change. The simulation was also used to predict how internal processes and operating conditions of the CDS affect its performance. In particular, it is shown that the coefficient of performance of the thermoelectric heat pump used to provide heating and cooling for the CDS is the largest factor in determining CDS efficiency. Intrastage heat transfer affects CDS performance indirectly through effects on the coefficient of performance.
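
    The claim that the heat pump's coefficient of performance dominates efficiency can be made concrete with a back-of-envelope energy balance (our own toy sketch, not the ACM simulation): the electrical energy needed to move the latent heat of vaporization scales inversely with COP.

```python
# Toy energy balance (an illustrative assumption, not the ACM model):
# a heat-pump distiller spends roughly h_fg / COP of electrical energy
# per kilogram distilled, so doubling the thermoelectric heat pump's
# COP halves the specific energy consumption.
H_FG = 2260e3  # latent heat of vaporization of water, J/kg

def specific_energy_wh_per_l(cop, recovery_loss=0.1):
    """Approximate electrical energy per litre of distillate (Wh/L)."""
    joules_per_kg = H_FG * (1 + recovery_loss) / cop
    return joules_per_kg / 3600.0  # ~1 kg of water per litre

low_cop_energy = specific_energy_wh_per_l(cop=1.5)
high_cop_energy = specific_energy_wh_per_l(cop=3.0)
```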

  4. Network Analysis of Force Concept Inventory Responses to Improve Diagnostic Utility

    Science.gov (United States)

    Brewe, Eric; Bruun, Jesper

    2015-04-01

    The Force Concept Inventory (FCI) is a diagnostic instrument designed to investigate students' understanding of Newtonian Mechanics and is widely used in Physics Education Research. One of the strengths of the FCI is that the distractors are drawn from student conceptions based in their experiences. The distractors chosen are often more informative about students' understanding, as they identify the particular nature of students' alternative conceptions. We propose a network-based analysis of the FCI which will enhance its utility as a diagnostic tool for identifying student conceptions. In this approach, student responses are treated as a bipartite network which is then projected into two networks - students and responses. The response network includes all responses that are shared among students. We use the LANS backbone extraction algorithm to identify patterns in student responses. We use community detection algorithms on the backbone networks to identify clusters of common responses which map to models held by students, for example, ``force is needed for movement'' and ``the active agent uses the most force.'' This method has utility across a variety of instruments and could be used to improve instruction by providing in-depth knowledge of student conceptions. Supported in part by NSF Grant #PHY 134424.
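
    The bipartite treatment described above can be sketched with networkx; the student-response data below are invented, and a plain weight threshold stands in for the LANS backbone algorithm.

```python
import networkx as nx
from networkx.algorithms import bipartite, community

# Toy student-response data (hypothetical): each student is linked to the
# FCI answer options they chose, e.g. "Q1A" = option A on question 1.
choices = {
    "s1": ["Q1A", "Q2C"], "s2": ["Q1A", "Q2C"], "s3": ["Q1A", "Q2D"],
    "s4": ["Q1B", "Q2E"], "s5": ["Q1B", "Q2E"], "s6": ["Q1B", "Q2E"],
}
B = nx.Graph()
B.add_nodes_from(choices, bipartite="students")
for student, answers in choices.items():
    B.add_edges_from((student, a) for a in answers)

# Project the bipartite network onto the response nodes; edge weights
# count how many students share a pair of responses.
responses = {n for n in B if n.startswith("Q")}
R = bipartite.weighted_projected_graph(B, responses)

# Simple weight-threshold backbone (a stand-in for LANS), then
# community detection on what remains: clusters of co-chosen responses.
R.remove_edges_from([(u, v) for u, v, w in R.edges(data="weight") if w < 2])
clusters = community.greedy_modularity_communities(R)
```

    The surviving clusters group responses chosen together by many students, which is the structure the paper maps onto student models.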

  5. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++, with the Qt platform also currently being used. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation, using both the McCabe and Halstead methods, to the BCI framework, which consists of two important types of BCI, SSVEP and P300, we found two classes in the framework which are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC is less than 20; average complexity is around 5; and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
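
    A McCabe-style cyclomatic complexity count is easy to sketch. The paper applies it to C++/Qt code, so this Python-AST version only illustrates the metric itself: complexity = 1 + the number of decision points.

```python
import ast

# Node types counted as decision points for a rough McCabe metric.
DECISIONS = (ast.If, ast.For, ast.While, ast.And, ast.Or,
             ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source):
    """1 + number of decision points in the parsed source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

src = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(3):
        if i == x:
            return "small"
    return "other"
"""
```

    Here `src` has three decision points (two `if`s and one `for`), giving a complexity of 4; a framework audit would flag functions far above a chosen threshold.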

  6. Structure analysis of interstellar clouds: I. Improving the Delta-variance method

    CERN Document Server

    Ossenkopf, V; Stutzki, J

    2008-01-01

    The Delta-variance analysis has proven to be an efficient and accurate method of characterising the power spectrum of interstellar turbulence. The implementation presently in use, however, has several shortcomings. We propose and test an improved Delta-variance algorithm for two-dimensional data sets, which is applicable to maps with variable error bars and which can be quickly computed in Fourier space. We calibrate the spatial resolution of the Delta-variance spectra. The new Delta-variance algorithm is based on an appropriate filtering of the data in Fourier space. It allows us to distinguish the influence of variable noise from the actual small-scale structure in the maps and it helps for dealing with the boundary problem in non-periodic and/or irregularly bounded maps. We try several wavelets and test their spatial sensitivity using artificial maps with well known structure sizes. It turns out that different wavelets show different strengths with respect to detecting characteristic structures and spectr...
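
    The Fourier-space filtering idea can be sketched as follows: convolve the map with a Mexican-hat filter of characteristic size equal to the lag, via FFT, and take the variance of the result. The filter shape and the 1.5 annulus ratio are simplifying assumptions of this sketch; the paper's algorithm adds variable-noise weighting and boundary treatment.

```python
import numpy as np

def delta_variance(image, lag):
    """Delta-variance at one lag: variance of the map after convolution
    with a Mexican-hat filter, evaluated in Fourier space."""
    ny, nx = image.shape
    ky, kx = np.meshgrid(np.fft.fftfreq(ny), np.fft.fftfreq(nx),
                         indexing="ij")
    k = np.hypot(kx, ky)
    # Fourier transform of a Mexican-hat of characteristic size `lag`:
    # Gaussian core minus a wider Gaussian annulus (zero response at k=0).
    core = np.exp(-(np.pi * k * lag) ** 2)
    annulus = np.exp(-(np.pi * k * lag * 1.5) ** 2)
    filt = core - annulus
    filtered = np.fft.ifft2(np.fft.fft2(image - image.mean()) * filt).real
    return filtered.var()

# For white noise the Delta-variance falls with lag, reflecting its
# flat power spectrum concentrated in small-scale structure.
rng = np.random.default_rng(0)
noise = rng.normal(size=(64, 64))
small, large = delta_variance(noise, 2), delta_variance(noise, 8)
```

    Scanning the lag then traces out a Delta-variance spectrum whose slope encodes the power-spectrum index of the turbulence.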

  7. A methodology for the analysis and improvement of a firm's competitiveness

    Directory of Open Access Journals (Sweden)

    Jose Celso Contador

    2006-01-01

    Full Text Available This paper presents a new methodology for the analysis of a group of companies, aiming at explaining and increasing a firm's competitiveness. Based on the model of the fields and weapons of the competition, the methodology distinguishes between business and operational competitive strategies. The first consists of some of the 15 fields of the competition, and the latter consists of the weapons of the competition. Competitiveness is explained through the application of several mathematical variables. The influence of the competitive strategies is statistically evaluated using the Wilcoxon-Mann-Whitney non-parametric test, the t-test, and Pearson's correlation. The methodology was applied to companies belonging to the textile pole of Americana; one of the conclusions reached is that what explains competitiveness is the operational strategy rather than the business strategy. Therefore, to improve competitiveness, a company must intensify its focus on weapons that are relevant to the fields where it decided to compete.
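
    The statistical toolkit named above maps directly onto `scipy.stats`; the firm data below are invented for illustration only.

```python
import numpy as np
from scipy import stats

# Hypothetical competitiveness scores for two groups of firms, split by
# whether their operational strategy focuses on the relevant "weapons".
focused = np.array([72, 81, 77, 85, 90, 79, 83])
unfocused = np.array([55, 61, 58, 64, 70, 60, 52])

# Wilcoxon-Mann-Whitney non-parametric test and Welch's t-test on the
# group difference.
u_stat, p_mw = stats.mannwhitneyu(focused, unfocused,
                                  alternative="two-sided")
t_stat, p_t = stats.ttest_ind(focused, unfocused, equal_var=False)

# Pearson correlation between a weapons-alignment index and growth.
alignment = np.array([0.2, 0.35, 0.5, 0.55, 0.7, 0.8])
growth = np.array([1.1, 1.9, 2.4, 2.2, 3.1, 3.6])
r, p_r = stats.pearsonr(alignment, growth)
```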

  8. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    Science.gov (United States)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data, and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe some improvements for higher performance and better data usability. DAS now automatically calculates molecular number density profiles from the GPS radiosonde upper-air meteorological data and the U.S. Standard Atmosphere model. Predicted ozone density profile images above Saga city are also calculated using the Meteorological Research Institute (MRI) chemistry-climate model version 2 for comparison with actual ozone DIAL data.

  9. Improved target detection and bearing estimation utilizing fast orthogonal search for real-time spectral analysis

    International Nuclear Information System (INIS)

    The problem of target detection and tracking in the ocean environment has attracted considerable attention due to its importance in military and civilian applications. Sonobuoys are one of the capable passive sonar systems used in underwater target detection. Target detection and bearing estimation are mainly obtained through spectral analysis of received signals. The frequency resolution introduced by current techniques is limited which affects the accuracy of target detection and bearing estimation at a relatively low signal-to-noise ratio (SNR). This research investigates the development of a bearing estimation method using fast orthogonal search (FOS) for enhanced spectral estimation. FOS is employed in this research in order to improve both target detection and bearing estimation in the case of low SNR inputs. The proposed methods were tested using simulated data developed for two different scenarios under different underwater environmental conditions. The results show that the proposed method is capable of enhancing the accuracy for target detection as well as bearing estimation especially in cases of a very low SNR

  10. An improved system for atmospheric analysis of volatile organic compounds including monoterpenes

    Science.gov (United States)

    Hopkins, J. R.; Jones, C. E.; Lewis, A. C.

    2009-04-01

    A dual channel gas chromatograph system with flame ionisation detectors has been used extensively for the analysis of volatile organic compounds (VOCs) in the atmosphere. The instrument was first used during the North Atlantic Marine Boundary Layer Experiment (NAMBLEX) at Mace Head, Ireland in 2002 and has since been involved in many field campaigns including the ACCENT OVOC intercomparison at the SAPHIR atmospheric simulation chamber in Juelich, Germany in 2006. The system has continued to be adapted and improved to include measurements of selected monoterpenes (a potentially important class of biogenic VOCs which are emitted from vegetation) without any significant loss of resolution of the other VOCs measured. Here we present the first ambient air monoterpene measurements from this instrument which were made during the Oxidant and Particle Photochemical Processes above a South-East Asian tropical rainforest (OP3) campaign in Danum Valley, Borneo in 2008. The monoterpenes measured were alpha-pinene, camphene, 3-carene, gamma-terpinene and limonene. We compare the relative concentrations and diurnal profiles of the different monoterpene species and other biogenic VOCs including isoprene, in order to gain insight into factors which affect their emission rates and their potential impact on photochemical processes within the boundary layer.

  11. Analysis and improvement for a linearized seafloor elastic parameter inversion method

    Science.gov (United States)

    Liu, Yangting; Liu, Xuewei; Ning, Hongxiao

    2016-05-01

    AVO inversion is an effective seismic exploration method for predicting elastic parameters. In this paper, we review and analyze a previously published linearized AVO inversion method for seafloor elastic parameters, and present a modification strategy. Before the linearized inversion is performed, a proper near-angle range must be provided in which the relationship between the reflection coefficient and the squared sine of the incidence angle is linear. However, the near-angle range is determined by the very elastic parameters that are to be estimated by the inversion, so only an approximate near-angle range can be provided. Model tests show that too large a near-angle range may cause the inversion to fail, while too small a range may yield unreliable estimates. Further analysis shows that estimation stability can be improved even when the linearized inversion is performed with an exact near-angle range. To mitigate the strong dependence on the near-angle range, we use the seafloor elastic parameters estimated by the linearized method as the initial model for an unconstrained optimization method. Compared with the previously published method, the modified method is more robust to noisy data and shows less dependence on the near-angle range.
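
    The two-step scheme (a linear fit over a near-angle range, then unconstrained optimization seeded by it) can be sketched on synthetic data. A three-term Shuey-style approximation stands in for the exact reflection coefficient here, which is an assumption of this sketch, not the paper's forward model.

```python
import numpy as np
from scipy.optimize import minimize

def reflect(theta, A, B, C):
    """Shuey-style reflection coefficient: linear in sin^2(theta) at
    near angles, with a higher-order term that matters at far angles."""
    s2 = np.sin(theta) ** 2
    return A + B * s2 + C * s2 * np.tan(theta) ** 2

theta = np.radians(np.arange(1, 41))
true_params = (0.30, -0.20, 0.05)
data = reflect(theta, *true_params)

# Step 1: linearized estimate over a near-angle range, where the
# reflection coefficient is approximately linear in sin^2(theta).
near = theta <= np.radians(20)
B0, A0 = np.polyfit(np.sin(theta[near]) ** 2, data[near], 1)

# Step 2: seed an unconstrained least-squares fit of the full model
# with the linear estimate, reducing dependence on the chosen range.
cost = lambda p: np.sum((reflect(theta, *p) - data) ** 2)
res = minimize(cost, x0=[A0, B0, 0.01], method="Nelder-Mead")
A, B, C = res.x
```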

  12. Fluid Analysis and Improved Structure of an ATEG Heat Exchanger Based on Computational Fluid Dynamics

    Science.gov (United States)

    Tang, Z. B.; Deng, Y. D.; Su, C. Q.; Yuan, X. H.

    2015-06-01

    In this study, a numerical model has been employed to analyze the internal flow field distribution in a heat exchanger applied to an automotive thermoelectric generator, based on computational fluid dynamics. The model simulates the influence of factors relevant to the heat exchanger, including the automotive waste heat mass flow velocity, temperature, internal fins, and back pressure. The results are in good agreement with experimental test data. Sensitivity analysis of the inlet parameters shows that increasing the exhaust velocity, compared with the inlet temperature, makes little contribution to the heat transfer (0.1 versus 0.19) but results in a detrimental back-pressure increase (0.69 versus 0.21). A configuration equipped with internal fins proves to offer better thermal performance than one without fins. Finally, based on an attempt to improve the internal flow field, a more rational structure is obtained, offering a more homogeneous temperature distribution, a higher average heat transfer coefficient, and lower back pressure.

  13. An Improved Adaptive Multi-way Principal Component Analysis for Monitoring Streptomycin Fermentation Process

    Institute of Scientific and Technical Information of China (English)

    何宁; 王树青; 谢磊

    2004-01-01

    Multi-way principal component analysis (MPCA) has been successfully applied to monitoring batch and semi-batch processes across much of the chemical industry. An improved MPCA approach, step-by-step adaptive MPCA (SAMPCA), which uses the process variable trajectories to monitor the batch process, is presented in this paper. It does not need to estimate or fill in the unknown part of the process variable trajectory deviation from the current time until the end. The approach is based on an MPCA method that processes the data in a sequential and adaptive manner. The adaptation rate is easily controlled through a forgetting factor that controls the weight of past data in a summation. The algorithm is used to evaluate industrial streptomycin fermentation process data and is compared with traditional MPCA. The results show that the method is more advantageous than MPCA, especially when monitoring a multi-stage batch process where the latent vector structure can change at several points during the batch.
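
    The forgetting-factor idea can be sketched with a covariance estimate updated sample by sample, so the extracted principal directions track a change in the latent structure; this is a minimal adaptive-PCA illustration of our own, not the paper's full step-by-step MPCA.

```python
import numpy as np

class AdaptivePCA:
    """Covariance updated with a forgetting factor: recent samples
    outweigh old ones, so the principal directions adapt between
    batch stages."""

    def __init__(self, n_vars, forgetting=0.9):
        self.lam = forgetting
        self.cov = np.eye(n_vars)

    def update(self, x):
        x = np.asarray(x, dtype=float)
        self.cov = self.lam * self.cov + (1 - self.lam) * np.outer(x, x)

    def leading_direction(self):
        w, v = np.linalg.eigh(self.cov)
        return v[:, -1]  # eigenvector of the largest eigenvalue

rng = np.random.default_rng(1)
mon = AdaptivePCA(2, forgetting=0.9)
# Early stage: variation mainly along variable 1; later stage: along
# variable 2, mimicking a latent-structure change mid-batch.
for _ in range(300):
    mon.update([rng.normal(0, 3.0), rng.normal(0, 0.3)])
for _ in range(300):
    mon.update([rng.normal(0, 0.3), rng.normal(0, 3.0)])
direction = np.abs(mon.leading_direction())
```

    After the stage change, the leading direction has swung to the new dominant variable; a fixed-window PCA monitor would lag far behind.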

  14. Improved target detection and bearing estimation utilizing fast orthogonal search for real-time spectral analysis

    Science.gov (United States)

    Osman, Abdalla; Nourledin, Aboelamgd; El-Sheimy, Naser; Theriault, Jim; Campbell, Scott

    2009-06-01

    The problem of target detection and tracking in the ocean environment has attracted considerable attention due to its importance in military and civilian applications. Sonobuoys are one of the capable passive sonar systems used in underwater target detection. Target detection and bearing estimation are mainly obtained through spectral analysis of received signals. The frequency resolution introduced by current techniques is limited which affects the accuracy of target detection and bearing estimation at a relatively low signal-to-noise ratio (SNR). This research investigates the development of a bearing estimation method using fast orthogonal search (FOS) for enhanced spectral estimation. FOS is employed in this research in order to improve both target detection and bearing estimation in the case of low SNR inputs. The proposed methods were tested using simulated data developed for two different scenarios under different underwater environmental conditions. The results show that the proposed method is capable of enhancing the accuracy for target detection as well as bearing estimation especially in cases of a very low SNR.
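
    The core of FOS, greedy selection of candidate sinusoids by residual-energy reduction, can be sketched as below. The fast recursive orthogonalization that gives FOS its name is omitted; this naive version simply refits every candidate at each step.

```python
import numpy as np

def fos_spectrum(signal, fs, candidate_freqs, n_terms=2):
    """Greedy orthogonal-search sketch: repeatedly pick the candidate
    (sin, cos) pair that most reduces the residual energy."""
    t = np.arange(len(signal)) / fs
    residual = signal.astype(float).copy()
    chosen = []
    for _ in range(n_terms):
        best = None
        for f in candidate_freqs:
            if f in chosen:
                continue
            basis = np.column_stack([np.sin(2 * np.pi * f * t),
                                     np.cos(2 * np.pi * f * t)])
            coef, *_ = np.linalg.lstsq(basis, residual, rcond=None)
            err = np.sum((residual - basis @ coef) ** 2)
            if best is None or err < best[0]:
                best = (err, f, basis @ coef)
        _, f, fit = best
        chosen.append(f)
        residual = residual - fit  # deflate before the next pick
    return chosen

# Two tones standing in for target lines in a sonobuoy spectrum.
fs = 100.0
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 19 * t)
peaks = fos_spectrum(sig, fs, candidate_freqs=np.arange(1.0, 40.0))
```

    Because each pick maximizes the energy it explains, weak lines survive even when they sit near strong ones, which is the property the paper exploits at low SNR.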

  15. CUSTOMER VALUE NETWORK ANALYSIS FOR IMPROVEMENT OF CUSTOMER LIFE-TIME VALUE COMPUTATION

    Directory of Open Access Journals (Sweden)

    Monireh Hosseini

    2010-06-01

    Full Text Available The constant changes in the world have exposed companies to a situation of tough competition. This situation, especially in e-commerce, complicates the decision-making process about target customers and the recommendation of products to them. On the one hand, understanding and measuring the customer lifetime value (CLV is a critical factor for long-term success. On the other hand, the value network is a new concept that considers both tangible and intangible complex dynamic value exchanges between two or more enterprises, customers, suppliers, etc. In this paper we introduce a new definition of value networks that has focused on customer relationship management (CRM concepts called business customers' value network. Then, we suggest the value network analysis (VNA approach as a powerful tool for modeling and analyzing tangible and intangible relationships between a company and its business customers, and propose VNA to improve networking potential of CLV. This study provides a conceptual framework for mapping a newly proposed value network consisting of three schemas (star, community and compound schemas with an illustrated example. Development of a new networked measure of CLV called network customer lifetime value (NCLV is our future aim.
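
    As background for the proposed network extension, the conventional retention-based CLV baseline is easy to state; the NCLV measure itself is not specified in the abstract, so only the standard discounted-margin formula is sketched here.

```python
def clv(margin_per_period, retention, discount, periods):
    """Customer lifetime value: discounted sum of expected margins,
    where retention**t is the probability the customer is still
    active in period t."""
    return sum(
        margin_per_period * retention ** t / (1 + discount) ** t
        for t in range(1, periods + 1)
    )

# Illustrative customer: $100 margin/period, 80% retention,
# 10% discount rate, five-period horizon.
value = clv(margin_per_period=100.0, retention=0.8, discount=0.1,
            periods=5)
```

    A network extension along the paper's lines would adjust such per-customer values for the tangible and intangible value flows a customer channels through the network.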

  16. IMPROVEMENT OF COMPANY MARKETING STRATEGY BASED ON ANALYSIS OF GOOGLE SEARCH RESULTS

    Directory of Open Access Journals (Sweden)

    Marek Ďurica

    2015-09-01

    Full Text Available Nowadays, Internet plays a major role in people's lives. It is usually used for entertainment, as a source of information, and also for electronic commerce. Electronic commerce (e-commerce is gradually replacing traditional shopping, especially in the past years. It is a quick and easy form of marketing, which provides convenience for the customers, and, therefore, more and more users are using this form of shopping on the Internet. E-commerce also provides new opportunities for companies, which force them to begin dealing with the Internet. Many customers who are shopping on the Internet look for the best product or service close to their home. Most of the space in the search results in Google is occupied by local results. If a company offers some goods or services and they do not show up on the local search results, the company may be losing a lot of profits from these potential customers. That is why companies have to focus on best ranking in the local search results. In this article, we try to experimentally determine which factors affect ranking in Google search. Of course, it is necessary to quantify the impact of these factors. To select these factors and to determine their impact, we use exact methods of mathematical statistics, hypothesis testing, correlation, and regression analysis. Confirmation and quantification of the impact of some qualitative and quantitative characteristics of the company can be used to formulate recommendations for improving corporate strategy in acquiring new customers.

  17. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyougn Tae; Moon, Young Min; Choi, Sung Won; Heo, Sun [Korea Advanced Institute Science and Technology, Taejon (Korea, Republic of)

    1999-04-15

    The loss-of-RHR accident during midloop operation has become important as a result of probabilistic safety analysis. The condensation models in RELAP5/MOD3 are not adequate for analyzing midloop operation. To audit and improve the models in RELAP5/MOD3.2, several separate effect tests have been performed. Twenty-nine sets of reflux condensation data were obtained, and a correlation was developed from these heat transfer coefficient data. For the experiment on direct contact condensation in the hot leg, the apparatus setup is finished and some experimental data have been obtained. A non-iterative model is used in place of the model in RELAP5/MOD3.2 for the reflux condensation results and evaluates better than the present model. The results for direct contact condensation in the hot leg are similar to those of the present model. The study of CCF and liquid entrainment in the surge line and pressurizer has been selected as the third separate effect test and is in progress.

  18. Improving Australia's renewable energy project policy and planning: A multiple stakeholder analysis

    International Nuclear Information System (INIS)

    Renewable Energy (RE) is part of Australia's and the world's energy supply matrix with over A$100 billion spent annually on RE projects since 2007. Businesses seeking to invest in RE projects, particularly in the wind and solar energy sectors, may face an onerous collection of planning approvals and permitting processes that impede investment and implementation. In this study, we draw on international and domestic stakeholder inputs to a governmental inquiry in Australia to show how RE projects might be approved in shortened timeframes with reduced associated costs. The process mapping and stakeholder analysis demonstrates that RE supply projects can benefit from standardized approval processes and documentation, a 360° deep engagement with stakeholders, and expanded electricity grid access in resource areas, augmented through supportive public policy and planning frameworks. In addition, stakeholder objections to project approval and implementation streamlining were used to contrast the efficacy of the proposed changes in policy. -- Highlights: •Highlights the over A$200 billion spent annually on global RE projects. •Describes a typical two stage, multi-layered governance RE project approval process. •Exposes long 3 year and multi-million dollar cost approvals for RE projects. •Identifies multi-million dollar remote grid connections as an RE project impediment. •Outlines RE project policy and guidelines shortcomings and proposed improvements

  19. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia

    Science.gov (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.

    2016-06-01

    To support the implementation of the IWRM-Indonesia process in a water-scarce and sanitation-poor region of Central Java (Indonesia), sustainability assessments of several technology options for water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz Association. In the case of water supply, the assessment was based on the life-cycle analysis and life-cycle costing approach. In the sanitation sector, the focus was on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied in general to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences which influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.

  20. Analysis of microbiota on abalone (Haliotis discus hannai) in South Korea for improved product management.

    Science.gov (United States)

    Lee, Min-Jung; Lee, Jin-Jae; Chung, Han Young; Choi, Sang Ho; Kim, Bong-Soo

    2016-10-01

    Abalone is a popular seafood in South Korea; however, because it contains various microorganisms, its ingestion can cause food poisoning. Therefore, analysis of the microbiota on abalone can improve understanding of outbreaks and causes of food poisoning and help to better manage seafood products. In this study, we collected a total of 40 abalones in March and July from four different regions, which are known as the major abalone-producing areas in Korea. The microbiota were analyzed using high-throughput sequencing, and bacterial loads on abalone were quantified by real-time PCR. Over 2700 species were detected in the samples, and Alpha- and Gammaproteobacteria were the predominant classes. The differences in microbiota among regions and at each sampling time were also investigated. Although Psychrobacter was the dominant genus detected on abalone in both March and July, the species compositions were different between the two sampling times. Five potential pathogens (Lactococcus garvieae, Yersinia kristensenii, Staphylococcus saprophyticus, Staphylococcus warneri, and Staphylococcus epidermidis) were detected among the abalone microbiota. In addition, we analyzed the influence of Vibrio parahaemolyticus infection on shifts in the abalone microbiota during storage at different temperatures. Although the proportion of Vibrio increased over time in both infected and non-infected abalone, the shifts in microbiota were more dynamic in infected abalone. These results can be used to better understand the potential for food poisoning caused by abalone consumption and to manage abalone products according to microbiota composition. PMID:27371902

  1. Improving breast cancer classification with mammography, supported on an appropriate variable selection analysis

    Science.gov (United States)

    Pérez, Noel; Guevara, Miguel A.; Silva, Augusto

    2013-02-01

    This work addresses the issue of variable selection within the context of breast cancer classification with mammography. A comprehensive repository of feature vectors was used, including a hybrid subset gathering image-based and clinical features. The aim was to gather experimental evidence on variable selection in terms of cardinality and type, and to find a classification scheme that provides the best performance over Area Under the Receiver Operating Characteristic Curve (AUC) scores using the ranked feature subsets. We evaluated and classified a total of 300 subsets of features formed by the application of Chi-Square Discretization, Information-Gain, One-Rule and RELIEF methods in association with Feed-Forward Backpropagation Neural Network (FFBP), Support Vector Machine (SVM) and Decision Tree J48 (DTJ48) Machine Learning Algorithms (MLA) for a comparative performance evaluation based on AUC scores. A variable selection analysis was performed for Single-View Ranking and Multi-View Ranking groups of features. Feature subsets representing microcalcifications (MCs), masses, and both MCs and masses achieved AUC scores of 0.91, 0.954 and 0.934, respectively. Experimental evidence demonstrated that classification performance was improved by combining image-based and clinical features. The most important clinical and image-based features were StromaDistortion and Circularity, respectively. Other features, less important but worth using due to their consistency, were Contrast, Perimeter, Microcalcification, Correlation and Elongation.
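
    AUC, the evaluation measure used throughout, follows directly from the rank-sum identity, and ranking features by AUC takes only a few lines; the feature values below are invented for illustration.

```python
import numpy as np

def auc_score(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney)
    identity: the probability that a random positive case outranks a
    random negative case, counting ties as half."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical cases: one informative feature, one noise feature.
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
circularity = np.array([0.1, 0.2, 0.3, 0.35, 0.6, 0.7, 0.8, 0.9])
noise = np.array([0.5, 0.9, 0.1, 0.7, 0.4, 0.8, 0.2, 0.6])

features = {"circularity": circularity, "noise": noise}
ranked = sorted(features,
                key=lambda n: auc_score(labels, features[n]),
                reverse=True)
```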

  2. An Improved Variable Structure Adaptive Filter Design and Analysis for Acoustic Echo Cancellation

    Directory of Open Access Journals (Sweden)

    A. Kar

    2015-04-01

    Full Text Available In this research, an advanced variable-structure adaptive Multiple Sub-Filters (MSF) based algorithm for single-channel Acoustic Echo Cancellation (AEC) is proposed and analyzed. This work suggests a new and improved way to find the optimum tap-length of the adaptive filter employed for AEC. The structure adaptation, supported by a tap-length based weight update approach, helps the designed echo canceller maintain a trade-off between the Mean Square Error (MSE) and the time taken to attain the steady-state MSE. The work in this paper focuses on replacing the fixed-length sub-filters in existing MSF-based AEC algorithms, which brings refinements in terms of convergence, steady-state error and tracking over the single long filter, different-error and common-error algorithms. A dynamic structure selective coefficient update approach to reduce the structural and computational cost of the adaptive design is discussed in the context of the proposed algorithm. Simulated results reveal a comparative performance analysis of the proposed variable-structure multiple sub-filter designs and existing fixed tap-length sub-filter based acoustic echo cancellers.

  3. Analysis of Scattering Components from Fully Polarimetric SAR Images for Improving Accuracies of Urban Density Estimation

    Science.gov (United States)

    Susaki, J.

    2016-06-01

    In this paper, we analyze probability density functions (PDFs) of scatterings derived from fully polarimetric synthetic aperture radar (SAR) images for improving the accuracies of estimated urban density. We have reported a method for estimating urban density that uses an index Tv+c obtained by normalizing the sum of volume and helix scatterings Pv+c. Validation results showed that estimated urban densities have a high correlation with building-to-land ratios (Kajimoto and Susaki, 2013b; Susaki et al., 2014). While the method is found to be effective for estimating urban density, it is not clear why Tv+c is more effective than indices derived from other scatterings, such as surface or double-bounce scatterings, observed in urban areas. In this research, we focus on PDFs of scatterings derived from fully polarimetric SAR images in terms of scattering normalization. First, we introduce a theoretical PDF that assumes that image pixels have scatterers showing random backscattering. We then generate PDFs of scatterings derived from observations of concrete blocks with different orientation angles, and from a satellite-based fully polarimetric SAR image. The analysis of the PDFs and the derived statistics reveals that the curves of the PDFs of Pv+c are the most similar to the normal distribution among all the scatterings derived from fully polarimetric SAR images. It was found that Tv+c works most effectively because of its similarity to the normal distribution.
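
The paper's criterion, similarity of a scattering's PDF to the normal distribution, can be sketched with simple moment statistics: sample skewness and excess kurtosis are both zero for a Gaussian. The samples below are synthetic stand-ins, not SAR scattering powers.

```python
import random

def normality_score(samples):
    """Distance from Gaussianity: |skewness| + |excess kurtosis| (both 0 for a normal)."""
    n = len(samples)
    mean = sum(samples) / n
    m2 = sum((s - mean) ** 2 for s in samples) / n
    m3 = sum((s - mean) ** 3 for s in samples) / n
    m4 = sum((s - mean) ** 4 for s in samples) / n
    return abs(m3 / m2 ** 1.5) + abs(m4 / m2 ** 2 - 3.0)

# synthetic stand-ins: a near-Gaussian sample and a strongly skewed one
random.seed(1)
gaussian_like = [random.gauss(0.0, 1.0) for _ in range(20000)]
skewed = [random.expovariate(1.0) for _ in range(20000)]
print(normality_score(gaussian_like) < normality_score(skewed))  # True
```

Under this kind of measure, the scattering whose empirical PDF gives the lowest score is the closest to normal, which is the property the paper attributes to Pv+c.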

  4. Improved survival following surgery and radiation therapy for olfactory neuroblastoma: analysis of the SEER database

    International Nuclear Information System (INIS)

    Olfactory neuroblastoma is a rare malignant tumor of the olfactory tract. Reports in the literature comparing treatment modalities for this tumor are limited. The SEER database (1973-2006) was queried by diagnosis code to identify patients with olfactory neuroblastoma. Kaplan-Meier analysis was used to estimate survival distributions based on treatment modality. Differences in survival distributions were determined by the log-rank test. A Cox multiple regression analysis was then performed using treatment, race, SEER historic stage, sex, age at diagnosis, year of diagnosis and SEER geographic registry. A total of 511 olfactory neuroblastoma cases were reported. Five-year overall survival, stratified by treatment modality, was: 73% for surgery with radiotherapy, 68% for surgery only, 35% for radiotherapy only, and 26% for neither surgery nor radiotherapy. There was a significant difference in overall survival between the four treatment groups (p < 0.01). At ten years, with overall survival stratified by treatment modality and stage, there was no significant improvement in survival from the addition of radiation to surgery. The best survival results were obtained for surgery with radiotherapy.
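
The Kaplan-Meier product-limit estimator behind the survival curves can be sketched in a few lines; the follow-up times below are hypothetical, not SEER records.

```python
def kaplan_meier(times, events):
    """Product-limit estimator. times: follow-up; events: 1 = death, 0 = censored.
    Returns the survival curve as (time, S(t)) steps at each death time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = ties = 0
        while i < len(data) and data[i][0] == t:  # group tied event times
            deaths += data[i][1]
            ties += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk  # multiply by conditional survival at t
            curve.append((t, s))
        at_risk -= ties
    return curve

# hypothetical follow-up months and vital status for one treatment group (not SEER data)
curve = kaplan_meier([12, 30, 30, 45, 60, 72], [1, 1, 0, 1, 0, 1])
print(curve)
```

One such curve per treatment group gives the stratified survival distributions that the log-rank test then compares.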

  5. Analysis of Improved Cyclostationary Detector with Multiple Antennas over Fading Channels

    Directory of Open Access Journals (Sweden)

    Ying Zhu

    2013-11-01

    Full Text Available A comprehensive performance analysis of multi-cycle cyclostationary (MC) detection-based spectrum sensing over fading channels with multiple independent and correlated antennas is developed. We first propose an improved MC detector that reduces the computational complexity of the conventional one. Compared with the conventional MC detector, the proposed method has lower computational complexity and higher sensing accuracy. Based on the proposed MC detector, for the case of multiple independent antennas, the average detection probability with square-law combining (SLC) is derived for several fading channels such as Nakagami, Rayleigh and Rician using the moment generating function (MGF) approach. For the case of multiple correlated antennas, with Nakagami fading and the SLC scheme, expressions for the detection probability are derived by the same approach as in the independent-antennas case. Special cases of a linear array of 2 and 4 arbitrarily correlated antennas are treated. Finally, illustrative and analytical results show the reliability of our proposed MC detector and the degradation of sensing performance caused by antenna correlation and fading.
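
Detection probabilities derived analytically via the MGF approach are commonly cross-checked by Monte Carlo simulation. The sketch below simulates a square-law-combining energy detector over i.i.d. Rayleigh-fading antennas, a simpler stand-in for the paper's multi-cycle detector and correlated-antenna cases; the SNRs, threshold and sample count are illustrative assumptions.

```python
import math
import random

def detection_prob_slc(snr_db, antennas, threshold, trials=3000, samples=20):
    """Monte Carlo detection probability of a square-law combining (SLC) energy
    detector over i.i.d. Rayleigh-fading antennas (illustrative stand-in for the
    MGF-based analysis, not the multi-cycle detector itself)."""
    amp = math.sqrt(10 ** (snr_db / 10))  # signal amplitude for the target SNR
    detections = 0
    for _ in range(trials):
        energy = 0.0
        for _ in range(antennas):
            # one Rayleigh fading coefficient per antenna, constant over the slot
            h = complex(random.gauss(0, math.sqrt(0.5)), random.gauss(0, math.sqrt(0.5)))
            for _ in range(samples):
                n = complex(random.gauss(0, math.sqrt(0.5)), random.gauss(0, math.sqrt(0.5)))
                energy += abs(h * amp + n) ** 2  # square-law, summed across antennas
        if energy > threshold:
            detections += 1
    return detections / trials

random.seed(7)
pd_high = detection_prob_slc(10, antennas=2, threshold=100)   # strong signal
pd_low = detection_prob_slc(-10, antennas=2, threshold=100)   # weak signal
```

Running the same experiment with correlated fading coefficients would reproduce the performance degradation the paper attributes to antenna correlation.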

  6. Chicken Essence for Cognitive Function Improvement: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Teoh, Siew Li; Sudfangsai, Suthinee; Lumbiganon, Pisake; Laopaiboon, Malinee; Lai, Nai Ming; Chaiyakunapruk, Nathorn

    2016-01-01

    Chicken essence (CE) is a popular traditional remedy in Asia that is believed to improve cognitive functions. CE companies claim that the health benefits are proven by research studies. A systematic review was conducted to determine the cognitive-enhancing effects of CE. We systematically searched a number of databases for randomized controlled trials in which human subjects consumed CE and underwent cognitive tests. Cochrane's Risk of Bias (ROB) tool was used to assess the quality of trials, and meta-analysis was performed. Seven trials were included; six recruited healthy subjects and one recruited subjects with poorer cognitive function. One trial had unclear ROB while the rest had high ROB. For executive function tests, one comparison showed a significant difference favoring CE (pooled standardized mean difference (SMD) of -0.55 (-1.04, -0.06)) and another showed no significant difference (pooled SMD of 0.70 (-0.001, 1.40)). For short-term memory tests, no significant difference was found (pooled SMD of 0.63 (-0.16, 1.42)). Currently, there is a lack of convincing evidence of a cognitive-enhancing effect of CE. PMID:26805876
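
The pooled SMDs quoted above come from inverse-variance weighting of per-trial effect sizes; a minimal fixed-effect version is sketched below (the effect sizes and variances are illustrative, not the review's data).

```python
import math

def pool_smd(studies):
    """Fixed-effect inverse-variance pooling of standardized mean differences.
    studies: list of (smd, variance) pairs, one per trial."""
    weights = [1.0 / v for _, v in studies]             # weight = 1 / variance
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                  # SE of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# illustrative per-trial effect sizes, not the review's data
pooled, ci = pool_smd([(-0.5, 0.04), (-0.6, 0.09)])
```

A pooled effect is "significant" when its 95% CI excludes zero, as with the executive-function comparison of -0.55 (-1.04, -0.06) reported above.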

  7. Chicken Essence for Cognitive Function Improvement: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Siew Li Teoh

    2016-01-01

    Full Text Available Chicken essence (CE) is a popular traditional remedy in Asia that is believed to improve cognitive functions. CE companies claim that the health benefits are proven by research studies. A systematic review was conducted to determine the cognitive-enhancing effects of CE. We systematically searched a number of databases for randomized controlled trials in which human subjects consumed CE and underwent cognitive tests. Cochrane’s Risk of Bias (ROB) tool was used to assess the quality of trials, and meta-analysis was performed. Seven trials were included; six recruited healthy subjects and one recruited subjects with poorer cognitive function. One trial had unclear ROB while the rest had high ROB. For executive function tests, one comparison showed a significant difference favoring CE (pooled standardized mean difference (SMD) of −0.55 (−1.04, −0.06)) and another showed no significant difference (pooled SMD of 0.70 (−0.001, 1.40)). For short-term memory tests, no significant difference was found (pooled SMD of 0.63 (−0.16, 1.42)). Currently, there is a lack of convincing evidence of a cognitive-enhancing effect of CE.

  8. Linear analysis of the vertical shear instability: outstanding issues and improved solutions

    Science.gov (United States)

    Umurhan, O. M.; Nelson, R. P.; Gressel, O.

    2016-02-01

    Context. The vertical shear instability is one of several known mechanisms that are potentially active in the so-called dead zones of protoplanetary accretion disks. A recent analysis of the instability mechanism indicates that a subset of unstable modes shows unbounded growth - both as resolution is increased and when the nominal lid of the atmosphere is extended. This trend suggests that the model system may be ill-posed. Aims: This research note both examines the energy content of these modes and questions the legitimacy of assuming separable solutions for a problem whose linear operator is fundamentally inseparable. Methods: The reduced equations governing the instability are revisited, and the generated solutions are examined using both the previously assumed separable forms and an improved non-separable solution form that is introduced in this paper. Results: Reconsidering the solutions of the reduced equations using the separable form shows that the low-order body modes have converged eigenvalues and eigenfunctions (for both variations in the model atmosphere's vertical boundaries and radial numerical resolution), while the corresponding high-order body modes and the surface modes indeed show unbounded growth rates. The energy contained in both the higher-order body modes and surface modes diminishes precipitously due to the disk's Gaussian density profile; most of the energy of the instability is contained in the low-order modes. An inseparable solution form is introduced to filter out the inconsequential surface modes, leaving only body modes (both low- and high-order ones). The analysis predicts a fastest-growing mode with a specific radial length scale. The growth rates associated with the fundamental corrugation and breathing modes match the growth and length scales observed in previous nonlinear studies of the instability. Conclusions: Linear stability analysis of the vertical shear instability should be done

  9. Genre Analysis and Writing Skill: Improving Iranian EFL Learners Writing Performance through the Tenets of Genre Analysis

    Directory of Open Access Journals (Sweden)

    Nazanin Naderi Kalali

    2015-12-01

    Full Text Available The main thrust of this study was to determine whether genre-based instruction improves the writing proficiency of Iranian EFL learners. To this end, 30 homogeneous Iranian BA learners studying English at Islamic Azad University, Bandar Abbas Branch were selected as the participants of the study through a version of the TOEFL test used as the proficiency test. The selected participants were 15 females and 15 males who were randomly divided into two groups, experimental and control. Both the experimental and control groups were asked to write on a topic determined by the researcher, which was considered the pre-test. The students' writing was scored using a holistic scoring procedure. The subjects received sixteen hours of instruction, the experimental group using a genre-based pedagogy and the control group the traditional methodology, followed by a post-test in which the subjects were asked to write on the same topic as before instruction. Their post-writings were also scored through the holistic scoring procedure. In analyzing the data, the t-test statistic was utilized for comparing the performances of the two groups. It was found that there is a statistically significant difference between the writing ability of participants who receive genre-based instruction and those who do not. The study, however, did not find any significant role for gender. Keywords: genre analysis, writing skill, holistic scoring procedure, pre-test, post-test, t-test
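
The two-group comparison by t-test can be sketched as follows; the holistic scores below are hypothetical, not the study's data (Welch's variant is used here, which does not assume equal group variances).

```python
import math

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # unbiased sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# hypothetical post-test holistic scores (not the study's data)
genre_group = [78, 82, 75, 88, 80]
control_group = [70, 74, 68, 72, 71]
t, df = welch_t(genre_group, control_group)
```

A t statistic this large at roughly 5-6 degrees of freedom corresponds to p < 0.05, i.e. a significant difference between the groups, which is the form of conclusion the study reports.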

  10. Linkage analysis and physical mapping near the gene for x-linked agammaglobulinemia at Xq22

    Energy Technology Data Exchange (ETDEWEB)

    Parolini, O.; Lassiter, G.L.; Henry, M.J.; Conley, M.E. (Univ. of Tennessee College of Medicine, Memphis (United States) St. Jude Children' s Research Hospital, Memphis, TN (United States)); Hejtmancik, J.F. (National Inst. of Health, Bethesda, MD (United States)); Allen, R.C.; Belmont, J.W. (Baylor College of Medicine, Houston, TX (United States)); Barker, D.F. (Univ. of Utah, Salt Lake City (United States))

    1993-02-01

    The gene for X-linked agammaglobulinemia (XLA) has been mapped to Xq22. No recombinations have been reported between the gene and the probe p212 at DXS178; however, this probe is informative in only 30-40% of women, and the reported flanking markers, DXS3 and DXS94, are 10-15 cM apart. To identify additional probes that might be useful in genetic counseling, we examined 11 polymorphisms that have been mapped to the Xq21.3-q22 region in 13 families with XLA. In addition, pulsed-field gel electrophoresis and yeast artificial chromosomes (YACs) were used to further characterize the segment of DNA within which the gene for XLA must lie. The results demonstrated that DXS366 and DXS442, which share a 430-kb pulsed-field fragment, could replace DXS3 as proximal flanking markers. Probes at DXS178 and DXS265 identified the same 145-kb pulsed-field fragment, and both loci were contained within a 200-kb YAC identified with the probe p212. A highly polymorphic CA repeat (DXS178CA) was isolated from one end of this YAC and used in linkage analysis. Probes at DXS101 and DXS328 shared several pulsed-field fragments, the smallest of which was 250 kb. No recombinations were seen between XLA and the DXS178-DXS265-DXS178CA complex, DXS101, DXS328, DXS87, or the gene for proteolipid protein (PLP). Key crossovers, when combined with the linkage data from families with Alport syndrome, suggested the following order of loci: cen-DXS3-DXS366-DXS442-(PLP, DXS101, DXS328, DXS178-DXS265-DXS178CA complex, XLA)-(DXS87, DXS94)-DXS327-(DXS350, DXS362)-tel. Our studies also limit the segment of DNA within which the XLA gene must lie to the 3- to 4-cM distance between DXS442 and DXS94, and they identify and orient polymorphisms that can be used in genetic counseling not only for XLA but also for Pelizaeus-Merzbacher disease (PLP deficiency), Alport syndrome (COL4A5 deficiency), and Fabry disease ([alpha]-galactosidase A deficiency). 31 refs., 5 figs., 2 tabs.

  11. Numerical Analysis of the Unsteady Propeller Performance in the Ship Wake Modified By Different Wake Improvement Devices

    Directory of Open Access Journals (Sweden)

    Bugalski Tomasz

    2014-10-01

    Full Text Available The paper presents a summary of results of the numerical analysis of the unsteady propeller performance in the non-uniform ship wake modified by different wake improvement devices. The analysis is performed using the lifting surface program DUNCAN for unsteady propeller analysis. The object of the analysis is a 7000-ton chemical tanker, for which four different types of wake improvement devices have been designed: two vortex generators, a pre-swirl stator, and a boundary layer alignment device. These produced five different cases of the ship wake structure: the original hull, and the hull equipped alternatively with the four wake improvement devices. Two different propellers were analyzed in these five wake fields, one being the original reference propeller P0 and the other a specially designed, optimized propeller P3. The analyzed parameters were the pictures of unsteady cavitation on the propeller blades, the harmonics of pressure pulses generated by the cavitating propellers at selected points, and the fluctuating bearing forces on the propeller shaft. Some of the calculated cavitation phenomena were confronted with the experimental results. The objective of the calculations was to demonstrate the differences in the calculated unsteady propeller performance resulting from the application of the different wake improvement devices. The analysis and discussion of the results, together with the appropriate conclusions, are included in the paper.

  12. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    International Nuclear Information System (INIS)

    The direct-contact condensation heat transfer coefficients are experimentally obtained under the following conditions: pure steam/steam in the presence of noncondensible gas, horizontal/slightly inclined pipe, and cocurrent/countercurrent stratified flow with water. The empirical correlation for the liquid Nusselt number is developed for the conditions of the slightly inclined pipe and the cocurrent stratified flow. Several models in the RELAP5/MOD3.2 code - the wall friction coefficient, the interfacial friction coefficient, the correlation of direct-contact condensation with noncondensible gases, and the correlation of wall film condensation - are modified; as a result, RELAP5/MOD3.2 is improved. The present experimental data are used for evaluating the improved code. The standard RELAP5/MOD3.2 code is modified using non-iterative modeling, which is a mechanistic model and does not require any interfacial information such as the interfacial temperature. The modified RELAP5/MOD3.2 code is used to simulate the horizontally stratified in-tube condensation experiment, which represents the direct-contact condensation phenomena in a hot leg of a nuclear reactor. The modeling capabilities of the modified code as well as the standard code are assessed using several hot-leg condensation experiments. The modified code gives better predictions of local experimental data on liquid void fraction and interfacial heat transfer coefficient than the standard code. For the separate effect test of the thermal-hydraulic phenomena in the pressurizer, a scaling analysis is performed to obtain a similarity of the phenomena between the Korea Standard Nuclear Power Plant (KSNPP) and the present experimental facility. The diameters and lengths of the hot leg, the surge line and the pressurizer are scaled down with the similitude of CCFL and velocity. The ratio of gas flow rate is 1/25. 
The experimental facility is composed of the air-water supply tank, the horizontal pipe, the surge line and the

  13. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyoung Tae; Moon, Young Min; Choi, Sung Won; Hwang, Do Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2000-03-15

    The direct-contact condensation heat transfer coefficients are experimentally obtained under the following conditions: pure steam/steam in the presence of noncondensible gas, horizontal/slightly inclined pipe, and cocurrent/countercurrent stratified flow with water. The empirical correlation for the liquid Nusselt number is developed for the conditions of the slightly inclined pipe and the cocurrent stratified flow. Several models in the RELAP5/MOD3.2 code - the wall friction coefficient, the interfacial friction coefficient, the correlation of direct-contact condensation with noncondensible gases, and the correlation of wall film condensation - are modified; as a result, RELAP5/MOD3.2 is improved. The present experimental data are used for evaluating the improved code. The standard RELAP5/MOD3.2 code is modified using non-iterative modeling, which is a mechanistic model and does not require any interfacial information such as the interfacial temperature. The modified RELAP5/MOD3.2 code is used to simulate the horizontally stratified in-tube condensation experiment, which represents the direct-contact condensation phenomena in a hot leg of a nuclear reactor. The modeling capabilities of the modified code as well as the standard code are assessed using several hot-leg condensation experiments. The modified code gives better predictions of local experimental data on liquid void fraction and interfacial heat transfer coefficient than the standard code. For the separate effect test of the thermal-hydraulic phenomena in the pressurizer, a scaling analysis is performed to obtain a similarity of the phenomena between the Korea Standard Nuclear Power Plant (KSNPP) and the present experimental facility. The diameters and lengths of the hot leg, the surge line and the pressurizer are scaled down with the similitude of CCFL and velocity. The ratio of gas flow rate is 1/25. 
The experimental facility is composed of the air-water supply tank, the horizontal pipe, the surge line and the

  14. Pursuit of improvement in uranium bulk analysis at the clear facility for safeguards environmental samples

    International Nuclear Information System (INIS)

    with flexible tube (FIG. 1). By scanning the surface of Texwipe-304 with the top of the tip, the particles are sucked up through the tip and collected onto the filter. Figure 2 shows micrographs of the Texwipe-304 surface: (a) a new one, (b) a particle-swiped one, and (c) a particle-recovered one after the Vacuum Suction Method. As seen in the figure, most of the particles were removed from Texwipe-304. Preliminary examination suggested that the particle recovery yield is at an acceptable level and that the process blank is low. The method will provide a useful means to improve the reliability of bulk analysis of ultra-trace amounts of uranium, even in cases where Texwipe-304 is not replaceable. At the presentation, further results and other activities related to minimizing the blank values at the CLEAR facility will be reported. (author)

  15. Combination of an Improved FRF-Based Substructure Synthesis and Power Flow Method with Application to Vehicle Axle Noise Analysis

    OpenAIRE

    Liu, C Q

    2008-01-01

    In this paper, an improved FRF-based substructure synthesis method combined with power flow analysis is presented and is used for performing a vehicle axle noise analysis. The major transfer paths of axle noise transmitted from chassis to vehicle body are identified and ranked based on power flows transmitted through bushings between the chassis and body. To calculate the power flows, it is necessary to know the reaction forces and the vibrations at the bushing locations on the body side. To ...

  16. Metal Foam Analysis: Improving Sandwich Structure Technology for Engine Fan and Propeller Blades

    Science.gov (United States)

    Fedor, Jessica L.

    2004-01-01

    The Life Prediction Branch of the NASA Glenn Research Center is searching for ways to construct aircraft and rotorcraft engine fan and propeller blades that are lighter and less costly. One possible design is to create a sandwich structure composed of two metal face sheets and a metal foam core. The face sheets would carry the bending loads and the foam core would have to resist the transverse shear loads. Metal foam is ideal because of its low density and energy absorption capabilities, making the structure lighter, yet still stiff. The material chosen for the face sheets and core was 17-4PH stainless steel, which is easy to make and has appealing mechanical properties. This material can be made inexpensively compared to titanium and polymer matrix composites, the two current fan blade alternatives. Initial tests were performed on design models, including vibration and stress analysis. These tests revealed that the design is competitive with existing designs; however, some problems were apparent that must be addressed before it can be implemented in new technology. The foam did not hold up as well as expected under stress. This could be due to a number of issues, but was most likely a result of a large number of pores within the steel that weakened the structure. The brazing between the face sheets and the foam was also identified as a concern. The braze did not hold up well under shear stress, causing the foam to break away from the face sheets. My role in this project was to analyze different options for improving the design. I primarily spent my time examining various foam samples, created with different sintering conditions, to see which exhibited the most favorable characteristics for our purpose. Methods of analysis that I employed included examining strut integrity under a microscope, counting the number of cells per inch, measuring the density, testing the microhardness, and testing the strength under compression. 
Shear testing will also be done to examine

  17. Improvement in intraperitoneal intraoperative cisplatin exposure based on pharmacokinetic analysis in patients with ovarian cancer.

    Science.gov (United States)

    Royer, Bernard; Delroeux, Delphine; Guardiola, Emmanuel; Combe, Marielle; Hoizey, Guillaume; Montange, Damien; Kantelip, Jean-Pierre; Chauffert, Bruno; Heyd, Bruno; Pivot, Xavier

    2008-03-01

    Ovarian cancer is the leading cause of gynecological cancer-related death in Western countries. The present treatment standards for ovarian cancer are based on the association of debulking surgery with platinum-based chemotherapy. Another strategy that could be further investigated is intraperitoneal (IP) chemotherapy. We previously described that a 2-h administration of intraoperative IP cisplatin did not reach satisfactory concentrations. In the present study, we present the results of a pharmacokinetic analysis performed after two consecutive 1-h IP 30 mg/l cisplatin administrations. Twenty-seven patients with advanced epithelial cancer classified as FIGO stage IIIC were included in the study. Blood and IP samples were taken over a 24-h period, during and after IP treatment. Both total and ultrafiltered (Uf) platinum (Pt) concentrations were analyzed. Biological and clinical toxicities were also recorded. With this strategy, IP Pt concentrations stayed above the target concentration (10 mg/l) for a satisfactory length of time. The serum Pt concentrations were higher than those observed with the "one-bath" protocol and induced recoverable renal toxicities (3 grade 1, 7 grade 2 and 4 grade 3). The best predictive parameter for renal failure was the total Pt 24-h Area Under the Curve (AUC), with a threshold value of 25 mg h/l (RR = 0.31, 95% CI 0.13-0.49). Administration of this amount of cisplatin is feasible and a satisfactory level of IP Pt concentrations is obtained. However, this improvement is associated with an increase in serum Pt levels and resulting renal toxicities. An attractive solution would be to decrease Pt transfer from the peritoneum to the bloodstream. A phase 1 study using intraoperative IP epinephrine to decrease this transfer is presently being carried out. PMID:17503047
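
The 24-h AUC used as the renal-toxicity predictor is typically computed with the linear trapezoidal rule over the sampled concentration-time points; the samples below are hypothetical, not patient data.

```python
def auc_trapezoid(times, concentrations):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    pts = list(zip(times, concentrations))
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(pts, pts[1:]))

# hypothetical serum total-Pt samples over 24 h (not patient data)
times = [0, 1, 2, 4, 8, 24]             # hours
conc = [0.0, 2.5, 3.0, 2.0, 1.2, 0.4]   # mg/l
auc24 = auc_trapezoid(times, conc)
print(auc24 > 25)  # compare against the reported 25 mg h/l renal-toxicity threshold
```

In the study's terms, a patient whose total Pt 24-h AUC exceeds the 25 mg h/l threshold would be at elevated risk of renal failure.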

  18. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider the regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology, used in conjunction with GIS, to determine whether there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
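
Separating uncertainty from variability is often done with a nested (two-loop) Monte Carlo: the outer loop samples the epistemic uncertainty, the inner loop samples real-world variability. The emission-factor model and all parameter values below are a toy assumption, not the study's LCA model.

```python
import random
import statistics

def emission_factor(plant_efficiency):
    """Toy LCA model (hypothetical): emissions per kWh fall as efficiency rises."""
    return 0.9 / plant_efficiency

random.seed(42)
outer = []  # one fleet-average emission factor per realization of the uncertain bias
for _ in range(200):
    bias = random.gauss(0.0, 0.02)  # outer loop: epistemic measurement uncertainty
    fleet = [emission_factor(random.uniform(0.33, 0.45) + bias)  # inner: variability
             for _ in range(100)]
    outer.append(statistics.mean(fleet))
print(statistics.stdev(outer))  # spread due to uncertainty, with variability averaged out
```

The spread of the inner samples reflects genuine plant-to-plant variability, while the spread of the outer means isolates the epistemic uncertainty, which is the separation the case study calls for.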

  19. Structural Analysis of Char by Raman Spectroscopy: Improving Band Assignments through Computational Calculations from First Principles

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Matthew W.; Dallmeyer, Ian; Johnson, Timothy J.; Brauer, Carolyn S.; McEwen, Jean-Sabin; Espinal, Juan F.; Garcia-Perez, Manuel

    2016-04-01

    Raman spectroscopy is a powerful tool for the characterization of many carbon species. The complex heterogeneous nature of chars and activated carbons has confounded complete analysis due to the additional shoulders observed on the D-band and high intensity valley between the D and G-bands. In this paper the effects of various vacancy and substitution defects have been systematically analyzed via molecular modeling using density functional theory (DFT) and how this is manifested in the calculated gas-phase Raman spectra. The accuracy of these calculations was validated by comparison with (solid-phase) experimental spectra, with a small correction factor being applied to improve the accuracy of frequency predictions. The spectroscopic effects on the char species are best understood in terms of a reduced symmetry as compared to a “parent” coronene molecule. Based upon the simulation results, the shoulder observed in chars near 1200 cm-1 has been assigned to the totally symmetric A1g vibrations of various small polyaromatic hydrocarbons (PAH) as well as those containing rings of seven or more carbons. Intensity between 1400 cm-1 and 1450 cm-1 is assigned to A1g type vibrations present in small PAHs and especially those containing cyclopentane rings. Finally, band intensity between 1500 cm-1 and 1550 cm-1 is ascribed to predominately E2g vibrational modes in strained PAH systems. A total of ten potential bands have been assigned between 1000 cm-1 and 1800 cm-1. These fitting parameters have been used to deconvolute a thermoseries of cellulose chars produced by pyrolysis at 300-700 °C. The results of the deconvolution show consistent growth of PAH clusters with temperature, development of non-benzyl rings as temperature increases and loss of oxygenated features between 400 °C and 600 °C
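
Band-fitting deconvolution of this kind models the measured spectrum as a superposition of component bands. A sketch of the Gaussian forward model is below; the band centers echo the assignments discussed above, but the heights and widths are purely illustrative, not fitted values from the paper.

```python
import math

def gaussian(x, center, height, width):
    """One Gaussian band component."""
    return height * math.exp(-((x - center) / width) ** 2)

# Band centers in cm-1 (following the assignments above); heights and widths
# are illustrative assumptions, not the paper's fitting parameters.
bands = [(1200, 0.4, 40), (1350, 1.0, 45), (1430, 0.3, 35),
         (1520, 0.25, 30), (1590, 0.9, 35)]

def spectrum(x):
    """Forward model: the char spectrum as a superposition of Gaussian bands."""
    return sum(gaussian(x, c, h, w) for c, h, w in bands)

# Integrated intensity of each band (analytic area: height * width * sqrt(pi)),
# the quantity tracked across a thermoseries to follow PAH cluster growth.
areas = [h * w * math.sqrt(math.pi) for _, h, w in bands]
```

In an actual deconvolution, the heights and widths would be fitted to the measured spectrum by least squares, and the resulting band areas compared across pyrolysis temperatures.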

  20. Improving food safety with accurate analysis by laboratories participating in proficiency testing programmes

    International Nuclear Information System (INIS)

    Full text: The certification of food products, whether for export or internal consumption, requires an analysis that is as accurate as possible, together with proof that the results have a solid basis. International food trade is a very sensitive area of commerce, and the sanitary barriers, including those for potentially toxic metals, are extremely strict. Countries with mutual recognition agreements (MRA) accept the certification of the exporter. Where an MRA does not exist, the recipient country analyses the goods using its own sampling and analytical procedures. In some cases the results agree, but in others they do not. In the latter situation the products are rejected and not allowed into the buying country, with consequent losses. Chile has a large international market for its seafood products. It has to comply, however, with regulations established by each importing country. One such requirement refers to the maximum admissible level of cadmium in molluscs, set by many countries at 1 mg/kg of Cd. Discrepancies between the Chilean laboratories and laboratories abroad arose in the past, resulting in many rejections of the products. Under these circumstances, the Chilean National Fisheries Service (SERNAPESCA) and the Chilean Nuclear Energy Commission (CCHEN) set up a proficiency test programme mandatory for all laboratories authorized to certify seafood for export. CCHEN has the responsibility for all technical aspects of the programme, including the preparation and distribution of the materials and the evaluation of the results submitted by the participants. After the first proficiency test, several laboratories had their authorization rescinded and, as an additional consequence, all the laboratories had to review and re-validate their analytical procedures. So far, three proficiency tests have been carried out and the response of the laboratories has noticeably improved, with a direct consequence in the decrease of rejections of the goods by the importers. 
This paper presents the details of the

  1. Exergy Analysis of a Subcritical Refrigeration Cycle with an Improved Impulse Turbo Expander

    OpenAIRE

    Zhenying Zhang; Lili Tian

    2014-01-01

    The impulse turbo expander (ITE) is employed to replace the throttling valve in the vapor compression refrigeration cycle to improve the system performance. An improved ITE and the corresponding cycle are presented. In the new cycle, the ITE not only acts as an expansion device with work extraction, but also serves as an economizer with vapor injection. An increase of 20% in the isentropic efficiency can be attained for the improved ITE compared with the conventional ITE owing to the reductio...

  2. Improving the performance of high-risk organizations: the Spanish nuclear sector from the analysis of organizational culture factors

    International Nuclear Information System (INIS)

    This paper presents the research project funded by UNESA and conducted by CISOT-CIEMAT that aims to contribute to improving the operating performance of the Spanish nuclear power plants. It seeks to identify the factors and key organizational processes that improve efficiency, in order to advance knowledge about the influence of organizational culture on the safety of high-reliability organizations.

  3. School Board Improvement Plans in Relation to the AIP Model of Educational Accountability: A Content Analysis

    Science.gov (United States)

    van Barneveld, Christina; Stienstra, Wendy; Stewart, Sandra

    2006-01-01

    For this study we analyzed the content of school board improvement plans in relation to the Achievement-Indicators-Policy (AIP) model of educational accountability (Nagy, Demeris, & van Barneveld, 2000). We identified areas of congruence and incongruence between the plans and the model. Results suggested that the content of the improvement plans,…

  4. Improving Alpine Flood Prediction through Hydrological Process Characterization and Uncertainty Analysis

    OpenAIRE

    Tobin, Cara Christine

    2012-01-01

    Among the many challenges of Alpine flood prediction is describing complex, meteo-hydrological processes in a simplified, robust manner that can be easily integrated into operational forecasting. In this dissertation, improved methods to characterize these processes are developed and integrated into the hydrological modeling component of an operational flood forecasting system used in the Swiss Alps. Detailed studies are conducted to improve hydrologi...

  5. Quality Improvement and Infrastructure Activity Costs in Software Development: A Longitudinal Analysis

    OpenAIRE

    Donald E. Harter; Slaughter, Sandra A.

    2003-01-01

    This study draws upon theories of task interdependence and organizational inertia to analyze the effect of quality improvement on infrastructure activity costs in software development. Although increasing evidence indicates that quality improvement reduces software development costs, the impact on infrastructure activities is not known. Infrastructure activities include services like computer operations, data integration, and configuration management that support software development. Because...

  6. An improved model for predicting coolant activity behaviour for fuel-failure monitoring analysis

    Energy Technology Data Exchange (ETDEWEB)

    El-Jaby, A.; Lewis, B.J.; Thompson, W.T. [Department of Chemistry and Chemical Engineering, Royal Military College of Canada, Kingston, Ontario, K7K 7B4 (Canada); Iglesias, F.C. [Candesco Corporation, 230 Richmond Street West, 10th Floor, Toronto, Ontario, M5V 1V6 (Canada); Ip, Monique [Bruce Power, 123 Front Street West, 4th Floor Toronto, Ontario, M5J 2M2 (Canada)

    2009-06-15

    A Candu fuel element becomes defective when the Zircaloy-4 sheath is breached, allowing high pressure D2O coolant to enter the fuel-to-sheath gap, thereby creating a direct path for fission products (mainly volatile species of iodine and noble gases) and fuel debris to escape into the primary heat transport system (PHTS). In addition, the entry of high-pressure D2O coolant into the fuel-to-sheath gap may cause the UO2 fuel to oxidize, which in turn can augment the rate of fission product release into the PHTS. The release of fission products and fuel debris into the PHTS will elevate circuit contamination levels, consequently increasing radiation exposure to station personnel during maintenance tasks. Moreover, the continued operation of a defective fuel element may result in a diminished thermal performance if the thermal conductivity and the incipient melting temperature of the UO2 fuel are reduced due to fuel oxidation effects. It is therefore desirable to discharge defective fuel as soon as possible. Hence, a better understanding of defective fuel behaviour is required in order to develop an improved methodology for fuel-failure monitoring and PHTS coolant activity prediction. Several codes have been previously developed for fuel-failure monitoring in Candu, LWR (PWR and BWR), and WWER reactors. Most tools use a steady-state coolant activity analysis, where a Booth diffusion-type model is used to describe the fission product release from the UO2 fuel matrix into the fuel-to-sheath gap, and a first order kinetic model to consider the transport, hold-up, and release of volatile fission products from the fuel-to-sheath gap into the PHTS coolant. It is therefore necessary to use an empirical diffusion coefficient D' to account for the fission product diffusion in the UO2 fuel matrix and an escape rate coefficient ν for the release from the fuel-to-sheath gap into the PHTS coolant. However, these parameters are not
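The two-stage model described here (Booth-type diffusional release into the gap, followed by first-order escape into the coolant) can be illustrated as a steady-state balance. The sketch below is not the code the record describes; the function name, the example parameter values, and the small-D'/λ Booth approximation (R/B ≈ 3·sqrt(D'/λ)) are assumptions for illustration only.

```python
import math

def steady_state_coolant_activity(fission_rate, fission_yield, lam, d_prime, nu, beta_purif):
    """Steady-state coolant activity (Bq) of one volatile fission product,
    using a Booth-type release into the gap and first-order kinetics
    (escape rate coefficient nu) from the gap into the coolant."""
    birth_rate = fission_rate * fission_yield          # atoms/s born in the fuel matrix
    # Booth steady-state release-to-birth ratio, small-D'/lambda approximation
    r_to_b = min(1.0, 3.0 * math.sqrt(d_prime / lam))
    release_to_gap = birth_rate * r_to_b
    # Gap balance: release in = radioactive decay + escape to coolant
    n_gap = release_to_gap / (lam + nu)
    # Coolant balance: escape in = decay + purification removal (beta_purif, 1/s)
    n_coolant = nu * n_gap / (lam + beta_purif)
    return lam * n_coolant                             # activity = lambda * N
```

As a sanity check, setting the purification rate to zero raises the steady-state coolant activity, as the physical picture requires.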

  7. Improvement of the PSA model using a best-estimate thermal-hydraulic analysis of LOCA scenarios

    International Nuclear Information System (INIS)

    This study was performed to propose both a new success criterion and a new event-tree heading, using best-estimate analysis of each LOCA scenario, with the aim of improving the PSA models. The MARS code was used for the thermal-hydraulic analysis of LOCA, and Ulchin units 3 and 4 were selected as the reference plant. LOCA calculations with various configurations of the safety systems and break sizes were performed. Using the results, we proposed both a new success criterion and a new heading for the small- and middle-break LOCA scenarios. The small-break LOCA will later be analyzed in terms of operator actions to depressurize the RCS. The results of this analysis may contribute to improving the PSA model of LOCA. In the probabilistic safety analysis (PSA) of the Korean Standard Nuclear Power Plant (KSNP), loss-of-coolant accidents (LOCA) are classified into three scenarios by break size: large-, middle-, and small-break LOCA. Specific break sizes were adopted to identify the boundaries of the three groups in the previous PSA model, and the success criteria have been conservatively applied to each state of the safety systems in the event tree

  8. Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning

    Science.gov (United States)

    Corry, Michael; Ianacone, Robert; Stella, Julie

    2014-01-01

    The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research based best practices of K-12…

  9. Feed-forward active contour analysis for improved brachial artery reactivity testing.

    Science.gov (United States)

    Pugliese, Daniel N; Sehgal, Chandra M; Sultan, Laith R; Reamer, Courtney B; Mohler, Emile R

    2016-08-01

    The object of this study was to utilize a novel feed-forward active contour (FFAC) algorithm to find a reproducible technique for analysis of brachial artery reactivity. Flow-mediated dilation (FMD) is an important marker of vascular endothelial function but has not been adopted for widespread clinical use given its technical limitations, including inter-observer variability and differences in technique across clinical sites. We developed a novel FFAC algorithm with the goal of validating a more reliable standard. Forty-six healthy volunteers underwent FMD measurement according to the standard technique. Ultrasound videos lasting 5-10 seconds each were obtained pre-cuff inflation and at minutes 1 through 5 post-cuff deflation in longitudinal and transverse views. Automated segmentation using the FFAC algorithm with initial boundary definition from three different observers was used to analyze the images to measure diameter/cross-sectional area over the cardiac cycle. The %FMD was calculated for average, minimum, and maximum diameters/areas. Using the FFAC algorithm, the population-specific coefficient of variation (CV) at end-diastole was 3.24% for transverse compared to 9.96% for longitudinal measurements; the subject-specific CV was 15.03% compared to 57.41%, respectively. For longitudinal measurements made via the conventional method, the population-specific CV was 4.77% and subject-specific CV was 117.79%. The intraclass correlation coefficient (ICC) for transverse measurements was 0.97 (95% CI: 0.95-0.98) compared to 0.90 (95% CI: 0.84-0.94) for longitudinal measurements with FFAC and 0.72 (95% CI: 0.51-0.84) for conventional measurements. In conclusion, transverse views using the novel FFAC method provide less inter-observer variability than traditional longitudinal views. Improved reproducibility may allow adoption of FMD testing in a clinical setting. The FFAC algorithm is a robust technique that should be evaluated further for its ability to replace the
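The population- and subject-specific coefficients of variation reported above are the sample standard deviation expressed as a percentage of the mean. A minimal sketch (assuming the usual n−1 sample formula; the example data are made up):

```python
import math

def coefficient_of_variation(values):
    """Sample coefficient of variation, in percent: 100 * sd / mean."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100.0 * sd / mean

# e.g. three observers' %FMD readings for the same subject
cv = coefficient_of_variation([9.0, 10.0, 11.0])  # 10.0
```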

  10. Cost savings associated with improving appropriate and reducing inappropriate preventive care: cost-consequences analysis

    Directory of Open Access Journals (Sweden)

    Baskerville Neill

    2005-03-01

    Background: Outreach facilitation has been proven successful in improving the adoption of clinical preventive care guidelines in primary care practice. The net costs and savings of delivering such an intensive intervention need to be understood. We wanted to estimate the proportion of a facilitation intervention cost that is offset and the potential for savings by reducing inappropriate screening tests and increasing appropriate screening tests in 22 intervention primary care practices affecting a population of 90,283 patients. Methods: A cost-consequences analysis of one successful outreach facilitation intervention was done, taking into account the estimated cost savings to the health system of reducing five inappropriate tests and increasing seven appropriate tests. Multiple data sources were used to calculate costs and cost savings to the government. The cost of the intervention and costs of performing appropriate testing were calculated. Costs averted were calculated by multiplying the number of tests not performed as a result of the intervention. Further downstream cost savings were determined by calculating the direct costs associated with the number of false positive test follow-ups avoided. Treatment costs averted as a result of increasing appropriate testing were similarly calculated. Results: The total cost of the intervention over 12 months was $238,388 and the cost of increasing the delivery of appropriate care was $192,912, for a total cost of $431,300. The savings from the reduction in inappropriate testing were $148,568 and from avoiding treatment costs as a result of appropriate testing were $455,464, for a total savings of $604,032. On a yearly basis the net cost saving to the government is $191,733 per year (2003 $Can), equating to $3,687 per physician or $63,911 per facilitator, an estimated 40% return on the investment in the intervention and the delivery of appropriate preventive care. 
Conclusion Outreach facilitation is more expensive

  11. Turbulence Analysis Upstream of a Wind Turbine: a LES Approach to Improve Wind LIDAR Technology

    Science.gov (United States)

    Calaf, M.

    2015-12-01

    upstream, much can be learned about the incoming turbulence, hence allowing improved wind turbine readjustments. Time correlations with the upstream incoming turbulence have been computed through an entire diurnal cycle, and a non-dimensional analysis shows the existence of different behaviors throughout the day.

  12. Geoelectrical time-lapse analysis for improved interpretation of data in a contaminated area

    Science.gov (United States)

    Chitea, Florina; Serban, Adrian; Ioane, Dumitru; Georgescu, Paul

    2014-05-01

    Non-invasive geoelectrical studies are useful in the preliminary assessment of areas suspected to be contaminated, but also in the investigation stage. Correctly adapted to the site-specific situation, they are used to detect and investigate buried sources of pollution, to characterize the geology of the area, to detect the contaminant plume, or to study the attenuation of pollution when site-specific remediation techniques are applied. Despite improved acquisition technology and optimized data-inversion algorithms, interpretation of geoelectrical data is still a challenging task, especially in a contaminated hydrogeological context. Besides the physical properties of the soil (composition, porosity, texture, etc.), the moisture content and the chemical composition of the pollutant also influence the measured parameter. The apparent electrical resistivity method was used in an area located near an oil refinery. Electrical measurements performed on profiles (transverse and along the direction of water flow, according to hydrological data) revealed the presence of contaminants by means of high-resistivity anomalies. Using the same acquisition technique (Schlumberger array, same VES points, same injection (AB) and voltage (MN) line extensions), measurements were repeated over time along the same profiles. On the resulting electrical sections from 2006 to 2013, a dynamic situation regarding the pollution plume was observed. Time-lapse analysis, based on the calculation of resistivity differences between sets of data acquired along the same profile, was applied, and data interpretation was made using the resulting sections. Significant variations between data sets (> 17% normalized apparent-resistivity differences) observed along the main profile mainly ranged from the near surface (1.5 m) to an approximate depth (AB/2) of 10 m. Using the time-lapse method, changes in the lateral and in-depth extension of polluted areas could be observed and
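The time-lapse comparison rests on pointwise differences between repeated surveys at the same measurement positions. The record does not give the exact normalization; the percent-change convention below is an assumption for illustration:

```python
def timelapse_percent_change(rho_t0, rho_t1):
    """Normalized apparent-resistivity change (%) at each measurement point,
    relative to the baseline survey rho_t0 (same VES points in both surveys)."""
    return [100.0 * (r1 - r0) / r0 for r0, r1 in zip(rho_t0, rho_t1)]

# flag points whose change exceeds a 17% significance threshold
changes = timelapse_percent_change([100.0, 80.0, 60.0], [120.0, 81.0, 60.0])
anomalous = [abs(c) > 17.0 for c in changes]
```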

  13. Analysis of advanced sodium-cooled fast reactor core designs with improved safety characteristics

    International Nuclear Information System (INIS)

    improvements address both neutronics and thermal-hydraulics aspects. Furthermore, emphasis has been placed on not only the beginning-of-life (BOL) state of the core, but also on the beginning of closed equilibrium fuel cycle (BEC) state. An important context for the current thesis is the 7th European Framework Program's Collaborative Project for a European Sodium Fast Reactor (CP-ESFR), the reference 3600 MWth ESFR core being the starting point for the conducted research. The principally employed computational tools belong to the so-called FAST code system, viz. the fast-reactor neutronics code ERANOS, the fuel cycle simulating procedure EQL3D, the spatial kinetics code PARCS and the system thermal-hydraulics code TRACE. The research has been carried out in essentially three successive phases. The first phase has involved achieving a clearer understanding of the principal phenomena contributing to the SFR void effect. Decomposition and analysis of sodium void reactivity have been carried out, while considering different fuel cycle states for the core. Furthermore, the spatial distribution of void reactivity importance, in both axial and radial directions, is investigated. For the reactivity decomposition, two methods, based respectively on neutron balance considerations and on perturbation theory, have been applied. The sodium void reactivity of the reference ESFR core has been, accordingly, decomposed reaction-wise, cross-section-wise, isotope-wise and energy-group-wise. Effectively, the neutron balance based method allows an in-depth understanding of the ‘consequences’ of sodium voidage, while the perturbation theory based method provides a complementary understanding of the ‘causes’. The second phase of the research has addressed optimization of the reference ESFR core design from the neutronics viewpoint. Four options oriented towards either the leakage component or the spectral effect have been considered in detail, viz. 
introducing an upper sodium plenum and

  14. Quantitative Analysis of Impact of Education on Improving Farmers' Net Income and Yield Per Capita

    Institute of Scientific and Technical Information of China (English)

    DING Jing-zhi

    2002-01-01

    In this paper, we analyze the relation between farmers' schooling and their net income and yield per capita using systematic and scientific methods, concluding that improving farmers' educational level may increase their net income.

  15. Using a Simple Binomial Model to Assess Improvement in Predictive Capability: Sequential Bayesian Inference, Hypothesis Testing, and Power Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory

    2012-09-11

    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2 and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
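Under a uniform prior, the binomial model yields a Beta(wins + 1, losses + 1) posterior for θ, and the hypothesis test reduces to computing P(θ > 1/2 | data). The sketch below is an illustration of that calculation, not the paper's code; the uniform prior and the midpoint-rule integration are assumptions.

```python
import math

def beta_pdf(theta, a, b):
    """Density of Beta(a, b) at theta, via log-gamma for numerical stability."""
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_norm + (a - 1.0) * math.log(theta) + (b - 1.0) * math.log(1.0 - theta))

def prob_improvement(wins, trials, grid=20000):
    """Posterior P(theta > 1/2 | data) under a uniform prior:
    midpoint-rule integration of Beta(wins + 1, trials - wins + 1) over (1/2, 1)."""
    a, b = wins + 1.0, trials - wins + 1.0
    width = 0.5 / grid
    return sum(beta_pdf(0.5 + (i + 0.5) * width, a, b) for i in range(grid)) * width

# e.g. the new code predicted better in 14 of 20 paired experiments:
# the posterior probability of improvement is roughly 0.96
confidence = prob_improvement(14, 20)
```

An improvement would then be declared when this posterior probability exceeds a pre-chosen confidence level; for a 50/50 split the probability is exactly 1/2, reflecting the hard-to-resolve region around θ = 1/2 described above.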

  16. Analysis of Participatory Research Projects in the International Maize and Wheat Improvement Center

    OpenAIRE

    Lilja, Nina K.; Bellon, Mauricio R.

    2006-01-01

    Through a survey of scientists from the International Maize and Wheat Improvement Center (CIMMYT) in 2004, this study assessed the extent to which participatory methods had been used by the center, how they were perceived by the scientists, and how participatory research could be applied more effectively by CIMMYT and partners. Results for 19 CIMMYT projects suggest among other things that participatory approaches at the center were largely “functional”—that is, aimed at improving the efficie...

  17. A meta-analysis of clinical improvements of general well-being by a standardized Lycium barbarum.

    Science.gov (United States)

    Paul Hsu, Chiu-Hsieh; Nance, Dwight M; Amagase, Harunobu

    2012-11-01

    Four randomized, blind, placebo-controlled clinical trials were pooled to study the general effects of oral consumption of Lycium barbarum at 120 mL/day, as a standardized juice, GoChi(®) (FreeLife International, Phoenix, AZ, USA). A questionnaire consisting of symptoms graded 0-5 was given to the participants. For each question, the score changes in the questionnaire between pre- and postintervention were summarized by the standardized mean difference and associated SE to perform the meta-analysis. The change was also characterized as a binary outcome, improved or not, to derive an odds ratio (OR) and associated SE using the Mantel-Haenszel method. The meta-analysis and heterogeneity were evaluated with the R program using the rmeta package. Statistical significance was set at 5%. In total, 161 participants (18-72 years old) were included in the meta-analysis. Compared with the placebo group (n=80), the active group (n=81) showed significant improvements in weakness, stress, mental acuity, ease of awakening, shortness of breath, focus on activity, sleep quality, daydreaming, and overall feelings of health and well-being under a random effects model. A fixed effects model showed additional improvements in fatigue, depression, circulation, and calmness. The OR indicated a significantly higher chance of improvement in fatigue, dizziness, and sleep quality. Three studies had statistically significant heterogeneity in procrastination, shoulder stiffness, energy, and calmness. The present meta-analysis confirmed the various health effects of polysaccharide-standardized L. barbarum intake found in the previous randomized, double-blind, placebo-controlled human clinical trials and revealed statistically significant improvements in neurological/psychological performance and overall feelings of health and well-being compared with the placebo group under both the fixed and the random effects models of the R program. PMID
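The record derives the pooled OR with R's rmeta package; purely as an illustration of the Mantel-Haenszel pooling itself, here is a minimal Python sketch. The 2x2 table layout (improved/not-improved counts per study) is an assumption, not taken from the record:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio over per-study 2x2 tables
    (a, b, c, d) = (active improved, active not, placebo improved, placebo not)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# with a single study the pooled OR equals that study's OR: (10*15)/(10*5) = 3.0
pooled = mantel_haenszel_or([(10, 10, 5, 15)])
```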

  18. Financial analysis of selected company by application of chosen solvency models and suggestions for improvement

    OpenAIRE

    König, Robert

    2014-01-01

    The subject of this bachelor thesis is the analysis and evaluation of the financial situation of VAKABRNO s.r.o. during the investigated period from 2008 to 2012. The thesis is divided into three parts. The first, theoretical part explains the basics of financial analysis: its users, sources, and methods. The practical part analyses the company using the methods from the theoretical part and explains its financial situation based on the financial statements. The last part of this thesis includes a summary of the f...

  19. An improved model for predicting coolant activity behaviour for fuel-failure monitoring analysis

    International Nuclear Information System (INIS)

    A Candu fuel element becomes defective when the Zircaloy-4 sheath is breached, allowing high pressure D2O coolant to enter the fuel-to-sheath gap, thereby creating a direct path for fission products (mainly volatile species of iodine and noble gases) and fuel debris to escape into the primary heat transport system (PHTS). In addition, the entry of high-pressure D2O coolant into the fuel-to-sheath gap may cause the UO2 fuel to oxidize, which in turn can augment the rate of fission product release into the PHTS. The release of fission products and fuel debris into the PHTS will elevate circuit contamination levels, consequently increasing radiation exposure to station personnel during maintenance tasks. Moreover, the continued operation of a defective fuel element may result in a diminished thermal performance if the thermal conductivity and the incipient melting temperature of the UO2 fuel are reduced due to fuel oxidation effects. It is therefore desirable to discharge defective fuel as soon as possible. Hence, a better understanding of defective fuel behaviour is required in order to develop an improved methodology for fuel-failure monitoring and PHTS coolant activity prediction. Several codes have been previously developed for fuel-failure monitoring in Candu, LWR (PWR and BWR), and WWER reactors. Most tools use a steady-state coolant activity analysis, where a Booth diffusion-type model is used to describe the fission product release from the UO2 fuel matrix into the fuel-to-sheath gap, and a first order kinetic model to consider the transport, hold-up, and release of volatile fission products from the fuel-to-sheath gap into the PHTS coolant. It is therefore necessary to use an empirical diffusion coefficient D' to account for the fission product diffusion in the UO2 fuel matrix and an escape rate coefficient ν for the release from the fuel-to-sheath gap into the PHTS coolant. However, these parameters are not constant in time as they are influenced by the

  20. Comparative analysis of maize (Zea mays) crop performance: natural variation, incremental improvements and economic impacts.

    Science.gov (United States)

    Leibman, Mark; Shryock, Jereme J; Clements, Michael J; Hall, Michael A; Loida, Paul J; McClerren, Amanda L; McKiness, Zoe P; Phillips, Jonathan R; Rice, Elena A; Stark, Steven B

    2014-09-01

    Grain yield from maize hybrids continues to improve through advances in breeding and biotechnology. Despite genetic improvements to hybrid maize, grain yield from distinct maize hybrids is expected to vary across growing locations due to numerous environmental factors. In this study, we examine across-location variation in grain yield among maize hybrids in three case studies. The three case studies examine hybrid improvement through breeding, introduction of an insect protection trait or introduction of a transcription factor trait associated with increased yield. In all cases, grain yield from each hybrid population had a Gaussian distribution. Across-location distributions of grain yield from each hybrid partially overlapped. The hybrid with a higher mean grain yield typically outperformed its comparator at most, but not all, of the growing locations (a 'win rate'). These results suggest that a broad set of environmental factors similarly impacts grain yields from both conventional- and biotechnology-derived maize hybrids and that grain yields among two or more hybrids should be compared with consideration given to both mean yield performance and the frequency of locations at which each hybrid 'wins' against its comparators. From an economic standpoint, growers recognize the value of genetically improved maize hybrids that outperform comparators in the majority of locations. Grower adoption of improved maize hybrids drives increases in average U.S. maize grain yields and contributes significant value to the economy. PMID:24851925
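The 'win rate' comparison described here, the fraction of growing locations at which one hybrid out-yields its comparator, is straightforward to compute from paired per-location yields. A minimal sketch (the example yields are made up):

```python
def win_rate(yields_a, yields_b):
    """Fraction of locations where hybrid A out-yields hybrid B,
    given yields paired by growing location."""
    wins = sum(1 for a, b in zip(yields_a, yields_b) if a > b)
    return wins / len(yields_a)

# hybrid A wins at 3 of 4 locations despite losing at one
rate = win_rate([10.1, 9.8, 11.2, 10.5], [9.9, 10.0, 10.8, 10.2])  # 0.75
```

This captures the point made above: a hybrid with the higher mean yield typically, but not always, wins at any given location, so mean yield and win rate should be reported together.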

  1. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  2. Improvement in the Plutonium Parameter Files of the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    The isotopic analysis code Fixed-energy Response-function Analysis with Multiple efficiency (FRAM) employs user-editable parameter sets to analyze a broad range of sample types. This report presents new parameter files, based upon a new set of plutonium branching ratios, which give more accurate isotopic results than the current parameter files used by FRAM.

  3. Improvement in the Plutonium Parameter Files of the FRAM Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    D. T. Vo; T. E. Sampson

    2000-09-01

    The isotopic analysis code Fixed-energy Response-function Analysis with Multiple efficiency (FRAM) employs user-editable parameter sets to analyze a broad range of sample types. This report presents new parameter files, based upon a new set of plutonium branching ratios, which give more accurate isotopic results than the current parameter files used by FRAM.

  4. An Improved Flame Test for Qualitative Analysis Using a Multichannel UV-Visible Spectrophotometer

    Science.gov (United States)

    Blitz, Jonathan P.; Sheeran, Daniel J.; Becker, Thomas L.

    2006-01-01

    Qualitative analysis schemes are used in undergraduate laboratory settings as a way to introduce equilibrium concepts and logical thinking. The main component of all qualitative analysis schemes is a flame test, as the color of light emitted from certain elements is distinctive and a flame photometer or spectrophotometer in each laboratory is…

  5. Genre Analysis and Writing Skill: Improving Iranian EFL Learners Writing Performance through the Tenets of Genre Analysis

    OpenAIRE

    Nazanin Naderi Kalali; Kian Pishkar

    2015-01-01

    The main thrust of this study was to determine whether genre-based instruction improves the writing proficiency of Iranian EFL learners. To this end, 30 homogeneous Iranian BA learners studying English at Islamic Azad University, Bandar Abbas Branch, were selected as the participants of the study through a version of the TOEFL test as the proficiency test. The selected participants were 15 females and 15 males who were randomly divided into two groups, experimental and control. Both experime...

  6. The job analysis of Korean nurses as a strategy to improve the Korean Nursing Licensing Examination

    Directory of Open Access Journals (Sweden)

    In Sook Park

    2016-06-01

    Purpose: This study aimed at characterizing Korean nurses’ occupational responsibilities and applying the results to improve the Korean Nursing Licensing Examination. Methods: First, the content of the nursing job was defined based on a focus group interview of 15 nurses. The Developing A Curriculum (DACUM) method was then used by 13 experts to examine those results and produce a questionnaire. After that, a questionnaire survey of 5,065 hospital nurses was conducted. Results: The occupational responsibilities of nurses were characterized as involving 8 duties, 49 tasks, and 303 task elements. Those 8 duties are nursing management and professional development, safety and infection control, the management of potential risk factors, basic nursing and caring, the maintenance of physiological integrity, medication and parenteral treatments, socio-psychological integrity, and the maintenance and improvement of health. Conclusion: The content of the Korean Nursing Licensing Examination should be improved based on the 8 duties and 49 tasks of the occupational responsibilities of Korean nurses.

  7. Design and Analysis of a Differential Waveguide Structure to Improve Magnetostrictive Linear Position Sensors

    Directory of Open Access Journals (Sweden)

    Hui Zhao

    2011-05-01

    Magnetostrictive linear position sensors (MLPS) are high-precision sensors used in industry that measure the propagation time of ultrasonic signals in a waveguide. To date, MLPS have attracted widespread attention for their accuracy, reliability, and cost-efficiency in performing non-contact, multiple measurements. However, the sensor in its traditional structure is susceptible to electromagnetic interference, which affects accuracy. In the present study, we propose a novel MLPS structure that relies on two differential waveguides to improve the signal-to-noise ratio, common-mode rejection ratio, and accuracy of MLPS. The proposed sensor model can depict sensor performance and the relationships among sensor parameters. Experimental results with the new sensor indicate that the new structure can improve accuracy from ±0.2 mm with the traditional structure to ±0.1 mm. In addition, the proposed sensor shows a considerable improvement in temperature characteristics.

  8. Improvement Analysis in a Municipal Pumping System Aiming the Energy Efficiency

    Directory of Open Access Journals (Sweden)

    Rafael Fernando Dutra

    2014-08-01

    Full Text Available With the rapid and disorderly growth that occurred in the city of Caxias do Sul - RS in the last two decades, many water supply problems are observed in specific areas, especially at peak times and on days of high consumption. In order to solve this problem while focusing on energy efficiency, this study proposed two improvements in the Santa Fe pumping system, which is responsible for supplying the northern part of Caxias do Sul. The improvements were the replacement of the pump set and the use of a frequency converter to control its speed. From the measurements made and simulations in spreadsheets and in the Epanet software, it was found that the two improvements are technically and economically viable, providing estimated monthly savings of 37.7%.
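
    The energy saving from speed control follows from the pump affinity laws; a back-of-the-envelope sketch, where the rated power and speed ratio are invented for illustration and are not the Santa Fe figures:

```python
# Affinity-law sketch behind variable-speed pumping savings:
# flow ~ N, head ~ N^2, shaft power ~ N^3 (ideal scaling; real savings
# depend on the system curve and efficiencies). Numbers are illustrative.

def scaled_power(p_rated_kw, speed_ratio):
    """Ideal shaft power when the pump runs at speed_ratio of rated speed."""
    return p_rated_kw * speed_ratio ** 3

p85 = scaled_power(75.0, 0.85)   # 75 kW pump at 85 % speed -> ~46.1 kW
saving = 1.0 - 0.85 ** 3         # ~38.6 % ideal power reduction
```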

  9. Simulation System of Car Crash Test in C-NCAP Analysis Based on an Improved Apriori Algorithm*

    Science.gov (United States)

    Xiang, LI

    In order to analyze car crash tests in C-NCAP, an improved algorithm based on the Apriori algorithm is presented in this paper. The new algorithm is implemented with a vertical data layout, breadth-first searching, and intersecting. It takes advantage of the efficiency of the vertical data layout and intersecting, and prunes candidate frequent item sets as Apriori does. Finally, the new algorithm is applied in a simulation system for car crash test analysis. The results show that the discovered relations affect the C-NCAP test results and can provide a reference for automotive design.
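
    A minimal sketch of the vertical-layout idea the abstract describes: each item is mapped to its tidset, support is computed by tidset intersection, and candidates are pruned Apriori-style. This is an illustrative reconstruction, not the paper's code.

```python
# Vertical data layout: item -> set of transaction ids (tidset).
# Support of an itemset = size of the intersection of its items' tidsets.
# Breadth-first candidate generation with Apriori-style subset pruning.
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    # Build the vertical layout.
    tids = {}
    for tid, items in enumerate(transactions):
        for item in items:
            tids.setdefault(item, set()).add(tid)
    # Level 1: frequent single items.
    level = {frozenset([i]): t for i, t in tids.items()
             if len(t) >= min_support}
    result = dict(level)
    while level:
        nxt = {}
        for a, b in combinations(list(level), 2):
            cand = a | b
            if len(cand) != len(a) + 1:
                continue                      # only join (k-1)-overlapping sets
            t = level[a] & level[b]           # tidset intersection
            if len(t) >= min_support and cand not in nxt:
                # Apriori pruning: every (k-1)-subset must be frequent.
                if all(cand - {x} in result for x in cand):
                    nxt[cand] = t
        result.update(nxt)
        level = nxt
    return {s: len(t) for s, t in result.items()}

data = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
freq = frequent_itemsets(data, 2)
```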

  10. The contribution of ergonomic analysis to the improvement of working conditions, patient reception, and quality in radiotherapy

    International Nuclear Information System (INIS)

    The authors report an ergonomic analysis of the activity in the radiotherapy department of a hospital. This study aimed at understanding physiological fatigue and pathologies among technical personnel, and at analyzing the factors which impact reception quality as well as the professional satisfaction associated with this profession. The authors have developed a guide for studying a radiotherapy technician workstation. They report and comment on the activity analysis (time organisation, working time in different positions, movements and handling, handled weights), and also outline the anxiety associated with the risk of mistakes. They identify various improvement possibilities.

  11. Microstructure analysis of sintered Nd-Fe-B magnets improved by Tb-vapor sorption

    International Nuclear Information System (INIS)

    The surface treatment with Tb-vapor sorption improves the magnetic properties of small-sized magnets. The microstructure of the Tb-treated magnets was investigated by transmission electron microscopy (TEM). By applying the Tb treatment, the diffusion of Tb through grain boundaries occurred even inside the magnets, and a thin and continuous wetting-layer phase was formed at the boundaries between the Nd2Fe14B grains. The results suggest that the formation of this thin and continuous wetting-layer phase leads to the improvement in magnetic properties. (author)

  12. Improved diagnosis of MV paper-insulated cables using signal analysis

    DEFF Research Database (Denmark)

    Villefrance, Rasmus; Holbøll, Joachim T.; Sørensen, John Aasted;

    1999-01-01

    With the purpose of improving the PD estimation accuracy and the degree of automation of the measurements, the following study is carried out. Initially, a library of different discharge pulses and actual background noise from a selection of cables is established. The library is then used for the...... estimation of PD-signals from a parametric model leading to reduction of the noise superimposed on the PD-signals and thus to improved PD-detection. The applicability of these methods is discussed in relation to mobile systems for the assessment of cable insulation condition....

  13. THE ACCOUNT AND ANALYSIS IMPROVE FOR USING MAIN ITEMS IN DIVISION, SIGNALIZATION AND CONNECTION

    Directory of Open Access Journals (Sweden)

    A. M. Kozuberda

    2011-05-01

    Full Text Available The article deals with proposals for the improvement of accounting in signaling and communication and the reasons for such decisions. It is proposed to use new methods of calculating depreciation, to change the criteria for classifying low-value items, and to simplify the procedure for writing off fixed assets. These changes should reduce costs at the enterprise, simplify the accounting work, and improve the overall performance of the maintenance section, allowing it to better fulfil its main function – the traffic safety of trains.

  14. Investigating data envelopment analysis model with potential improvement for integer output values

    Science.gov (United States)

    Hussain, Mushtaq Taleb; Ramli, Razamin; Khalid, Ruzelan

    2015-12-01

    The decrement of input proportions in a DEA model is associated with input reduction. This reduction is apparently good for the economy since it can cut unnecessary resource costs. However, in some situations the reduction of relevant inputs such as labour could create social problems. Such inputs should thus be maintained or increased. This paper develops an advanced radial DEA model, using mixed integer linear programming, to improve integer output values through the combination of inputs. The model can deal with real input values and integer output values. It is valuable for situations in which inputs must be combined to improve integer output values, as faced by most organizations.

  15. Testing a four-dimensional variational data assimilation method using an improved intermediate coupled model for ENSO analysis and prediction

    Science.gov (United States)

    Gao, Chuan; Wu, Xinrong; Zhang, Rong-Hua

    2016-07-01

    A four-dimensional variational (4D-Var) data assimilation method is implemented in an improved intermediate coupled model (ICM) of the tropical Pacific. A twin experiment is designed to evaluate the impact of the 4D-Var data assimilation algorithm on ENSO analysis and prediction based on the ICM. The model error is assumed to arise only from the parameter uncertainty. The "observation" of the SST anomaly, which is sampled from a "truth" model simulation that takes default parameter values and has Gaussian noise added, is directly assimilated into the assimilation model with its parameters set erroneously. Results show that 4D-Var effectively reduces the error of ENSO analysis and therefore improves the prediction skill of ENSO events compared with the non-assimilation case. These results provide a promising way for the ICM to achieve better real-time ENSO prediction.

  16. Time-frequency analysis of non-stationary fusion plasma signals using an improved Hilbert-Huang transform

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yangqing, E-mail: liuyq05@gmail.com; Tan, Yi; Xie, Huiqiao; Wang, Wenhao; Gao, Zhe [Department of Engineering Physics, Tsinghua University, Beijing 100084 (China)

    2014-07-15

    An improved Hilbert-Huang transform method is developed for the time-frequency analysis of non-stationary signals in tokamak plasmas. Maximal overlap discrete wavelet packet transform rather than wavelet packet transform is proposed as a preprocessor to decompose a signal into various narrow-band components. Then, a correlation-coefficient-based selection method is utilized to eliminate the irrelevant intrinsic mode functions obtained from empirical mode decomposition of those narrow-band components. Subsequently, a time-varying vector autoregressive moving average model instead of Hilbert spectral analysis is used to compute the Hilbert spectrum, i.e., a three-dimensional time-frequency distribution of the signal. The feasibility and effectiveness of the improved Hilbert-Huang transform method are demonstrated by analyzing a non-stationary simulated signal and actual experimental signals in fusion plasmas.
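
    The correlation-based selection step alone can be sketched as follows; the decomposition itself (MODWPT plus EMD) is omitted, and the components and threshold below are illustrative assumptions rather than values from the paper.

```python
# Keep only the decomposed components whose correlation with the raw
# signal exceeds a threshold; weakly correlated components are treated
# as irrelevant and discarded. Components here are hand-built stand-ins
# for intrinsic mode functions.
import numpy as np

def select_components(signal, components, threshold=0.2):
    keep = []
    for c in components:
        r = np.corrcoef(signal, c)[0, 1]   # Pearson correlation
        if abs(r) >= threshold:
            keep.append(c)
    return keep

t = np.arange(500) / 500.0
sig = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)
comps = [np.sin(2 * np.pi * 5 * t),    # dominant mode, r ~ 0.96
         np.sin(2 * np.pi * 40 * t),   # weak but relevant mode, r ~ 0.29
         np.cos(2 * np.pi * 90 * t)]   # unrelated component, r ~ 0
kept = select_components(sig, comps)   # the unrelated mode is discarded
```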

  17. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
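
    For Gaussian likelihoods, the core of such an integrated analysis reduces to inverse-variance weighting of the two diagnostics; a minimal sketch with invented numbers, not MST data:

```python
# Gaussian fusion sketch: with normal likelihoods and a flat prior, the
# Bayesian combination of two independent temperature estimates is
# inverse-variance weighting, and the combined uncertainty is smaller
# than either input. Values below are illustrative.

def combine(t1, sigma1, t2, sigma2):
    """Posterior mean and standard deviation for a flat prior."""
    w1, w2 = 1.0 / sigma1 ** 2, 1.0 / sigma2 ** 2
    t = (w1 * t1 + w2 * t2) / (w1 + w2)
    return t, (w1 + w2) ** -0.5

te, err = combine(820.0, 60.0, 780.0, 40.0)  # e.g. SXR-like and TS-like
# err (~33 eV here) is below both input uncertainties
```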

  18. An integrated data analysis tool for improving measurements on the MST RFP

    International Nuclear Information System (INIS)

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method

  19. MS1 Peptide Ion Intensity Chromatograms in MS2 (SWATH) Data Independent Acquisitions. Improving Post Acquisition Analysis of Proteomic Experiments*

    OpenAIRE

    Rardin, Matthew J.; Schilling, Birgit; Cheng, Lin-Yang; MacLean, Brendan X.; Sorensen, Dylan J.; Sahu, Alexandria K.; MacCoss, Michael J; Vitek, Olga; Gibson, Bradford W.

    2015-01-01

    Quantitative analysis of discovery-based proteomic workflows now relies on high-throughput large-scale methods for identification and quantitation of proteins and post-translational modifications. Advancements in label-free quantitative techniques, using either data-dependent or data-independent mass spectrometric acquisitions, have coincided with improved instrumentation featuring greater precision, increased mass accuracy, and faster scan speeds. We recently reported on a new quantitative m...

  20. The analysis and improvement for the problem of 2KRT007MA in Daya Bay nuclear power plant

    International Nuclear Information System (INIS)

    In the Daya Bay nuclear power plant, KRT007MA provides continuous monitoring of the β activity level of the condenser system to detect leakage of the generator. Due to the abnormal working status of the condenser, the sampling gas of KRT007MA contained water droplets, which made the channel alarm frequently. Through investigation and analysis, the root cause was found and an improvement was adopted, which solved the problem. (authors)

  1. Analysis of Farmers’ Willingness to Adopt Improved Peanut Varieties in Northern Ghana with the use of Baseline Survey Data.

    OpenAIRE

    Ibrahim, Mohammed; Florkowski, Wojciech

    2015-01-01

    This study employed a probit model to identify the factors that influence the willingness of farmers in northern Ghana to adopt improved peanut varieties. A cross-sectional data of 206 peanut farmers from the Tamale Metropolitan, Tolon-Kumbungu and Savelugu-Nanton districts in the northern region of Ghana were used in the analysis. The estimated results indicate that Tolon-Kumbungu district (location), early maturity, farm size, ownership of a radio and membership in a farm organization signi...

  2. Vibrational Analysis of Brucite Surfaces and the Development of an Improved Force Field for Molecular Simulation of Interfaces

    OpenAIRE

    Zeitler, Todd R.; Greathouse, Jeffery A.; Gale, Julian D.; Cygan, Randall T.

    2014-01-01

    We introduce a nonbonded three-body harmonic potential energy term for Mg–O–H interactions for improved edge surface stability in molecular simulations. The new potential term is compatible with the Clayff force field and is applied here to brucite, a layered magnesium hydroxide mineral. Comparisons of normal mode frequencies from classical and density functional theory calculations are used to verify a suitable spring constant (k parameter) for the Mg–O–H bending motion. Vibrational analysis...

  3. Detection of ULF electromagnetic emissions as a precursor to an earthquake in China with an improved polarization analysis

    OpenAIRE

    Y. Ida; D. Yang; Li, Q.; Sun, H; Hayakawa, M.

    2008-01-01

    An improved analysis of polarization (as the ratio of vertical magnetic field component to the horizontal one) has been developed, and applied to the approximately four years data (from 1 March 2003 to 31 December 2006) observed at Kashi station in China. It is concluded that the polarization ratio has exhibited an apparent increase only just before the earthquake on 1 September 2003 (magnitude = 6.1 and epicentral distance of 116 km).

  4. Using Process Definition and Analysis Techniques to Reduce Errors and Improve Efficiency in the Delivery of Healthcare

    OpenAIRE

    Clarke, Lori; Osterweil, Leon

    2013-01-01

    As has been widely reported in the news lately, heathcare errors are a major cause of death and suffering, and healthcare inefficiencies result in escalating costs. In the University of Massachusetts Medical Safety Project, we are investigating if process definition and analysis technologies can be used to help reduce heathcare errors and improve heathcare efficiency. Specifically, we are modeling healthcare processes using a process definition language and then analyzing these processes usin...

  5. Knowledge management for improving business processes: an analysis of the transport management process for indivisible exceptional cargo

    OpenAIRE

    André Cristiano Silva Melo; Maria Aparecida Cavalcanti Netto; Virgílio José Martins Ferreira Filho; Elton Fernandes

    2010-01-01

    This paper presents an organizational analysis methodology aimed at knowledge capitalization with a view to improving business processes. Based on a real problem in a large electric sector firm, this methodology is applied to managing the transport of indivisible exceptional cargo (IEC). In the firm in question, intellectual capital is a critical asset for service performance and is fundamental for achieving business excellence. Applied to the firm's transport management process, the approach...

  6. Detection of ULF electromagnetic emissions as a precursor to an earthquake in China with an improved polarization analysis

    Directory of Open Access Journals (Sweden)

    Y. Ida

    2008-07-01

    Full Text Available An improved analysis of polarization (as the ratio of the vertical magnetic field component to the horizontal one) has been developed, and applied to approximately four years of data (from 1 March 2003 to 31 December 2006) observed at Kashi station in China. It is concluded that the polarization ratio exhibited an apparent increase only just before the earthquake on 1 September 2003 (magnitude = 6.1, epicentral distance of 116 km).
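
    A minimal sketch of the polarization-ratio computation described above, estimating vertical-to-horizontal spectral power over a ULF band; the band limits, sampling rate, and synthetic signals are illustrative assumptions.

```python
# Band-averaged polarization ratio: sum of vertical-component power
# |Z(f)|^2 over the band divided by horizontal-component power |H(f)|^2.
import numpy as np

def polarization_ratio(z, h, fs, band=(0.01, 0.05)):
    freqs = np.fft.rfftfreq(len(z), d=1.0 / fs)
    sz = np.abs(np.fft.rfft(z)) ** 2
    sh = np.abs(np.fft.rfft(h)) ** 2
    m = (freqs >= band[0]) & (freqs <= band[1])
    return sz[m].sum() / sh[m].sum()

fs = 1.0                              # 1 Hz sampling
t = np.arange(4096) / fs
h = np.sin(2 * np.pi * 0.02 * t)      # horizontal ULF component
z = 0.5 * h                           # vertical component, half amplitude
ratio = polarization_ratio(z, h, fs)  # 0.25: power goes as amplitude^2
```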

  7. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials

    OpenAIRE

    Bang Yeon Lee; Su-Tae Kang; Hae-Bum Yun; Yun Yong Kim

    2016-01-01

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed tech...

  8. Fuzzy approach to analysis of flood risk based on variable fuzzy sets and improved information diffusion methods

    OpenAIRE

    Li, Q.

    2013-01-01

    The predictive analysis of natural disasters and their consequences is challenging because of uncertainties and incomplete data. The present article studies the use of variable fuzzy sets (VFS) and improved information diffusion method (IIDM) to construct a composite method. The proposed method aims to integrate multiple factors and quantification of uncertainties within a consistent system for catastrophic risk assessment. The fuzzy methodology is proposed in the area of flood disaster risk ...

  9. Improvement of X-Ray Fluorescence Sensitivity by Dry Ashing Method for Elemental Analysis of Bee Honey

    International Nuclear Information System (INIS)

    The elements K, Ca, Ti, Cr, Mn, Fe, Ni, Cu, Zn, Rb, and Sr in bee honey samples were determined using an improved dry ashing (DA) method for XRF with a Mo secondary target (Mo-XRF). The sensitivity of the DA method was significantly improved in comparison to the wet ashing (WA) and direct (D) methods. The limits of detection (LODs) obtained by the DA method (3.4-0.007 μg/g) were better by an order of magnitude than those obtained by the WA (34.0-0.120 μg/g) and D (61.2-0.270 μg/g) methods. Further improvements in the sensitivity of DA-XRF were achieved by using a Cu secondary target for the excitation of the elements K, Ca, Ti, Cr, and Mn. In this instance, the LODs were in the range of 0.220-0.024 μg/g. The results of the DA-XRF analysis revealed very good accuracy, with errors less than 7.1%, and a precision with a relative standard deviation (RSD) better than ±8.8%. The improved DA-XRF analysis was applied to the determination of the above-mentioned elements in several Syrian bee honey samples. The results were comparable to those obtained by the atomic spectrometry method, with correlation coefficients better than 0.9927. (author)
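
    The sensitivity gain can be related to detection limits through the usual 3-sigma estimate, LOD = 3·sigma_bg/m; this formula and the numbers below are a standard textbook sketch, not values taken from the paper.

```python
# Standard 3-sigma detection-limit sketch: LOD = 3 * sqrt(background) / m,
# with m the sensitivity in counts per (ug/g). Pre-concentrating the sample
# (e.g. by dry ashing) raises m, which drives the LOD down. Values invented.
import math

def lod(bg_counts, sensitivity):
    """3-sigma limit of detection assuming Poisson background noise."""
    return 3.0 * math.sqrt(bg_counts) / sensitivity

lod_direct = lod(400.0, 100.0)    # weak sensitivity  -> LOD 0.6 ug/g
lod_ashed = lod(400.0, 1000.0)    # 10x sensitivity   -> LOD 0.06 ug/g
```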

  10. A Meta-Analysis of Educational Data Mining on Improvements in Learning Outcomes

    Science.gov (United States)

    AlShammari, Iqbal A.; Aldhafiri, Mohammed D.; Al-Shammari, Zaid

    2013-01-01

    A meta-synthesis study was conducted of 60 research studies on educational data mining (EDM) and their impacts on and outcomes for improving learning outcomes. After an overview, an examination of these outcomes is provided (Romero, Ventura, Espejo, & Hervas, 2008; Romero, "et al.", 2011). Then, a review of other EDM-related research…

  11. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    WU Jing; GUO ZengYuan

    2008-01-01

    The defects of Clausius entropy, which include a premise of reversible process and a process quantity of heat in its definition, are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.

  12. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.
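
    For a simple compressible system whose only work mode is expansion (δW = p dV, an assumption not spelled out in the abstract), the improved definition described above amounts to:

```latex
% Improved form of Clausius entropy: the sum of the internal-energy
% temperature quotient and the work temperature quotient. For a simple
% compressible system with reversible work mode \delta W = p\,dV:
dS \,=\, \frac{dU}{T} + \frac{p\,dV}{T}
% dU and dV are differentials of state variables, so S defined this way is
% a state function, unlike the process quantity (\delta Q/T)_{\mathrm{rev}}.
```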

  13. How Can Bulgaria Improve Its Education System? An Analysis of PISA 2012 and Past Results

    OpenAIRE

    World Bank

    2012-01-01

    Bulgaria's performance on all three disciplines of the program for international student assessment (PISA) 2012 was slightly better than its PISA 2000 performance, after having dropped between 2000 and 2006. The improvements in performance between 2006 and 2012 promoted shared prosperity, but equality of opportunities is still a major challenge. In fact, disaggregating students' PISA score...

  14. Reporting Data with "Over-the-Counter" Data Analysis Supports Improves Educators' Data Analyses

    Science.gov (United States)

    Rankin, Jenny Grant

    2014-01-01

    The benefits of making data-informed decisions to improve learning rely on educators correctly interpreting given data. Many educators routinely misinterpret data, even at districts with proactive support for data use. The tool most educators use for data analyses, which is an information technology data system or its reports, typically reports…

  15. Improving the design and analysis of superconducting magnets for particle accelerators

    International Nuclear Information System (INIS)

    The field quality in superconducting magnets has been improved to a level where it does not appear to be a limiting factor on the performance of RHIC. The many methods developed, improved, and adopted during the course of this work have contributed significantly to that performance. One can not only design and construct magnets with better field quality than any made before, but can also improve on that quality after construction. The relative field error (ΔB/B) can now be made as low as a few parts in 10^-5 at 2/3 of the coil radius. This is about an order of magnitude better than what is generally expected for superconducting magnets. This extra-high field quality is crucial to the luminosity performance of RHIC. The research work described here covers a number of areas which must all be addressed to build production magnets with high field quality. The work has been limited to the magnetic design of the cross section, which in most cases essentially determines the field quality performance of the whole magnet, since these magnets are generally long. Though the conclusions presented in this chapter have been discussed at the end of each chapter, a summary of them might be useful to present a complete picture. The lessons learned from these experiences may be useful in the design of new magnets. The possibilities of future improvements are also presented.

  16. Transient Voltage Stability Analysis and Improvement of A Network with different HVDC Systems

    DEFF Research Database (Denmark)

    Liu, Yan; Chen, Zhe

    2011-01-01

    the two links and the size of loads. In order to improve the transient voltage stability, a voltage adjusting method is proposed in this paper. A voltage increment component has been introduced into the outer voltage control loop under emergency situation caused by severe grid faults. In order to verify...

  17. Improvements in the Reliability of X-Ray Photoelectron Spectroscopy for Surface Analysis

    Science.gov (United States)

    Powell, Cedric J.

    2004-01-01

    A survey of progress made over the past ~30 years (1970 - 2004) to improve the reliability of surface analyses by XPS is presented. The basic principles of XPS are very simple yet there is considerable complexity in designing experiments, interpreting data, and analyzing the results for the many different modes of operation and many different…

  18. Analysis and Improvement of the Lightweight Mutual Authentication Protocol under EPC C-1 G-2 Standard

    Directory of Open Access Journals (Sweden)

    Masoud Mohammadi

    Full Text Available Radio Frequency Identification (RFID technology is a promising technology. It uses radio waves to identify objects. Through automatic and real-time data acquisition, this technology can give a great benefit to various industries by improving the efficien ...

  19. Does Agency Competition Improve the Quality of Policy Analysis? Evidence from OMB and CBO Fiscal Projections

    Science.gov (United States)

    Krause, George A.; Douglas, James W.

    2006-01-01

    Public management scholars often claim that agency competition provides an effective institutional check on monopoly authority, and hence, leads to improvement of administrative performance in public sector agencies. This logic was central for creating the Congressional Budget Office (CBO) in 1975 to challenge the policy information provided by…

  20. AN IMPROVED ERROR ANALYSIS FOR FINITE ELEMENT APPROXIMATION OF BIOLUMINESCENCE TOMOGRAPHY

    Institute of Scientific and Technical Information of China (English)

    Wei Gong; Ruo Li; Ningning Yan; Weibo Zhao

    2008-01-01

    This paper is concerned with an ill-posed problem which arises in the area of molecular imaging and is known as the BLT problem. Using the Tikhonov regularization technique, a quadratic optimization problem can be formulated. We provide an improved error estimate for the finite element approximation of the regularized optimization problem. Some numerical examples are presented to demonstrate our theoretical results.
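
    The Tikhonov step can be sketched in dense linear algebra; the matrix and regularization parameter below are illustrative, and the paper itself works with a finite element discretization rather than this toy system.

```python
# Tikhonov sketch for an ill-posed linear problem A x = b:
# minimize ||A x - b||^2 + alpha ||x||^2, i.e. solve the regularized
# normal equations (A^T A + alpha I) x = A^T b.
import numpy as np

def tikhonov(A, b, alpha):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# Nearly rank-deficient system: unregularized least squares is unstable,
# while a small alpha keeps the solution bounded and the residual small.
A = np.array([[1.0, 1.0],
              [1.0, 1.0000001],
              [1.0, 0.9999999]])
b = np.array([2.0, 2.0, 2.0])
x = tikhonov(A, b, 1e-6)   # close to the minimum-norm solution [1, 1]
```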

  1. Improving Financial Literacy of College Students: A Cross-Sectional Analysis

    Science.gov (United States)

    Seyedian, Mojtaba; Yi, Taihyeup David

    2011-01-01

    Financial literacy has become more important than ever as an increasing number of college students are relying on credit cards to finance their education. We examine whether college students are knowledgeable about finance, whether they improve upon that knowledge, and whether their demographic profile, financial backgrounds, and…

  2. A Novel Approach to Improve the Detectability of CO2 by GC Analysis

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A novel stochastic resonance algorithm was employed to enhance the signal-to-noise ratio (SNR) of analytical chemistry signals. Using a gas chromatographic data set, it was shown that the SNR was greatly improved while the quantitative relationship between concentrations and chromatographic responses was simultaneously preserved. The linear range was extended beyond the instrumental detection limit.

  3. Using Mobile Phones to Improve Educational Outcomes: An Analysis of Evidence from Asia

    Science.gov (United States)

    Valk, John-Harmen; Rashid, Ahmed T.; Elder, Laurent

    2010-01-01

    Despite improvements in educational indicators, such as enrolment, significant challenges remain with regard to the delivery of quality education in developing countries, particularly in rural and remote regions. In the attempt to find viable solutions to these challenges, much hope has been placed in new information and communication technologies…

  4. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    Directory of Open Access Journals (Sweden)

    Feofanova Iryna V.

    2013-11-01

    Full Text Available The goal of the article is the identification of directions for improving the accounting system at an enterprise in order to supply strategic analysis procedures with trustworthy information. Historical (for the study of the conditions of appearance and development of strategic analysis) and logical (for the identification of directions of improvement of accounting methods) approaches were used during the study. The article establishes that modern conditions require a system of indicators based on both financial and non-financial information. In order to conduct strategic analysis it is necessary to expand the volume of information characterising such resources of an enterprise as scientific research and development, personnel, and the quality of products (services). Among the indicators that provide such information, the article selects innovation activity costs and personnel training costs, the accounting of which is not sufficiently regulated. To meet the information requirements of analysts, it offers to improve accounting in the following directions: identification of the nature and volume of information required by enterprise managers; formation of a system of accounting by places of origin of expenses and centres of responsibility; and identification and accounting of income or other results received by the enterprise due to personnel advanced training, research and development, and the introduction of innovations. The article offers a form for calculating the savings resulting from cost reductions obtained due to governmental privileges provided to enterprises that introduce innovations and invest in personnel training.

  5. Transportation routing analysis geographic information system - tragis, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental United States. This paper outlines some of the features available in this model. (authors)

  6. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model.
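
    As background only: route calculation over a network is, at its core, a shortest-path search. Below is a minimal Dijkstra sketch over an invented graph; TRAGIS itself operates on GIS road and rail networks with its own routing criteria.

```python
# Minimal Dijkstra shortest-path search over an adjacency-list graph.
# The graph, nodes, and costs are illustrative.
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, cost), ...]}; returns (cost, path)."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + c, nxt, path + [nxt]))
    return float("inf"), []

roads = {"A": [("B", 4.0), ("C", 2.0)],
         "C": [("B", 1.0), ("D", 7.0)],
         "B": [("D", 3.0)]}
cost, path = dijkstra(roads, "A", "D")   # 6.0 via A-C-B-D
```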

  7. Improving the accuracy of likelihood-based inference in meta-analysis and meta-regression

    OpenAIRE

    Kosmidis, Ioannis; Guolo, Annamaria; Varin, Cristiano

    2015-01-01

    Random-effects models are frequently used to synthesise information from different studies in meta-analysis. While likelihood-based inference is attractive both in terms of limiting properties and in terms of implementation, its application in random-effects meta-analysis may result in misleading conclusions, especially when the number of studies is small to moderate. The current paper shows how methodology that reduces the asymptotic bias of the maximum likelihood estimator of the variance c...
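As background for the likelihood-based methods the abstract discusses, the following is a minimal sketch of the classic DerSimonian-Laird moment estimator for the between-study variance in a random-effects meta-analysis — the standard estimator that bias-reduced likelihood approaches aim to improve on. The study effects and variances are made-up illustrative inputs, not data from the paper.

```python
# Sketch of the DerSimonian-Laird estimator of the between-study
# variance tau^2 and the resulting random-effects pooled estimate.
def dersimonian_laird(effects, variances):
    """Return (tau2, pooled_effect) given per-study effect estimates
    and their within-study variances."""
    k = len(effects)
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    sw = sum(w)
    y_bar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - y_bar) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                   # truncated at zero
    # re-weight including tau^2 to get the random-effects pooled estimate
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return tau2, pooled

# Hypothetical four-study example
tau2, pooled = dersimonian_laird([0.1, 0.8, 0.3, 0.9], [0.04, 0.04, 0.04, 0.04])
```

With few studies, as the abstract notes, such moment and maximum-likelihood estimates of the variance component can be badly biased, which motivates the bias-reduction methodology of the paper.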

  8. Improving Real Analysis in Coq: a User-Friendly Approach to Integrals and Derivatives

    OpenAIRE

    Boldo, Sylvie; Lelay, Catherine; Melquiond, Guillaume

    2012-01-01

    Verification of numerical analysis programs requires dealing with derivatives and integrals. High confidence in this process can be achieved using a formal proof checker, such as Coq. Its standard library provides an axiomatization of real numbers and various lemmas about real analysis, which may be used for this purpose. Unfortunately, its definitions of derivative and integral are unpractical as they are partial functions that demand a proof term. This proof term m...

  9. Analysis of marketing communications in the selected services firm and proposal of possible improvements

    OpenAIRE

    Černá, Jana

    2009-01-01

    This graduation thesis is concerned with the analysis of marketing communication in services, namely in the Wellness hotel Rezidence Nové Hrady. The thesis was written during the temporary opening of the hotel, before the finishing and opening of the new spaces. The analytical part recaps the history of the hotel and carries out a detailed situation analysis. Advantages and disadvantages were identified that the hotel should use to persuade customers. The synthetic part follows from the anal...

  10. Principles and methods of neutron activation analysis (NAA) in improved water resources development

    International Nuclear Information System (INIS)

    The methods of neutron activation analysis (NAA), as applied to water resources exploration, exploitation and management, are reviewed and their capabilities demonstrated. NAA has been found to be superior to, and to offer higher sensitivity than, many other analytical techniques in the analysis of water. The implications of the chemical and element concentrations determined in water (water pollution and quality) for environmental impact assessment, aquatic life and human health are briefly highlighted.

  11. Analysis of the evaluated data discrepancies for minor actinides and development of improved evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ignatyuk, A. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)

    1997-03-01

    The work is directed at the compilation of the experimental and evaluated data available for neutron-induced reaction cross sections of the {sup 237}Np, {sup 241}Am, {sup 242m}Am and {sup 243}Am isotopes, at the analysis of the older data and the renormalizations connected with changes of standards, and at the comparison of experimental data with theoretical calculations. The main results of the analysis performed so far are presented in this report. (J.P.N.)

  12. Improving online source analysis in history education: Trialling the Ethos model

    OpenAIRE

    James Goulding

    2015-01-01

    This paper reports on the findings of a study that compared models of online source analysis. It is argued that there is a disconnect between print-based (classical) approaches to teaching online source analysis in history and how students are informally analysing online information. It will be argued that this disconnect makes it difficult for students to effectively analyse online sources containing false and misleading information. In order to address this issue, formal web-based approaches...

  13. An improved method for statistical analysis of raw accelerator mass spectrometry data

    International Nuclear Information System (INIS)

    Hierarchical statistical analysis is an appropriate method for statistical treatment of raw accelerator mass spectrometry (AMS) data. Using Monte Carlo simulations we show that this method yields more accurate estimates of isotope ratios and analytical uncertainty than the generally used propagation of errors approach. The hierarchical analysis is also useful in design of experiments because it can be used to identify sources of variability. 8 refs., 2 figs
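The contrast the abstract draws can be illustrated with a small simulation: when AMS measurement cycles are grouped in blocks that drift relative to one another, an uncertainty computed from the spread of block means reflects both variance levels, while pooling all cycles as if independent understates it. This is a hedged sketch of the general idea, with simulated numbers rather than real AMS data.

```python
# Sketch: hierarchical (block-based) vs pooled uncertainty for an isotope
# ratio measured in repeated cycles nested within blocks. All values simulated.
import random
import statistics

random.seed(1)
true_ratio = 1.2e-12
blocks = []
for _ in range(10):                        # 10 measurement blocks
    drift = random.gauss(0, 0.05e-12)      # between-block variability (e.g. source drift)
    cycles = [true_ratio + drift + random.gauss(0, 0.02e-12) for _ in range(20)]
    blocks.append(cycles)

block_means = [statistics.mean(b) for b in blocks]
ratio_hat = statistics.mean(block_means)                        # hierarchical estimate
sem_hier = statistics.stdev(block_means) / len(blocks) ** 0.5   # honest uncertainty

pooled = [c for b in blocks for c in b]
sem_pooled = statistics.stdev(pooled) / len(pooled) ** 0.5      # ignores block structure

# sem_pooled treats all 200 cycles as independent draws, so it is too small
# whenever blocks drift; sem_hier captures the between-block component.
```

This mirrors the abstract's point that the hierarchical treatment both yields more realistic uncertainties and, by separating the variance components, helps identify sources of variability.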

  14. Do improvements in outreach, clinical, and family and community-based services predict improvements in child survival? An analysis of serial cross-sectional national surveys

    Directory of Open Access Journals (Sweden)

    Simen-Kapeu Aline

    2011-06-01

    Full Text Available Abstract Background There are three main service delivery channels: clinical services, outreach, and family and community. To determine which delivery channels are associated with the greatest reductions in under-5 mortality rates (U5MR), we used data from sequential population-based surveys to examine the correlation between changes in coverage of clinical, outreach, and family and community services and in U5MR for 27 high-burden countries. Methods Household survey data were abstracted from serial surveys in 27 countries. Average annual changes (AAC) between the most recent and penultimate survey were calculated for under-five mortality rates and for 22 variables in the domains of clinical, outreach, and family- and community-based services. For all 27 countries and a subset of 19 African countries, we conducted principal component analysis to reduce the variables into a few components in each domain and applied linear regression to assess the correlation between changes in the principal components and changes in under-five mortality rates after controlling for multiple potential confounding factors. Results AAC in under-5 mortality varied from 6.6% in Nepal to -0.9% in Kenya, with six of the 19 African countries experiencing less than a 1% decline in mortality. The strongest correlation with reductions in U5MR was observed for access to clinical services (all countries: p = 0.02, r2 = 0.58; 19 African countries: r2 = 0.67. For outreach activities, AAC U5MR was significantly correlated with antenatal care and family planning services, while AAC in immunization services showed no association. In the family- and community-services domain, improvements in breastfeeding were associated with significant changes in mortality in the 30 countries but not in the African subset, while in the African countries nutritional status improvements were associated with a significant decline in mortality. Conclusions Our findings support the importance of
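The Methods section describes a standard two-step pipeline: principal component analysis to compress the 22 coverage-change variables, followed by linear regression of mortality change on the component scores. The sketch below reproduces that pipeline on random stand-in data (not the survey data from the study) using a plain SVD-based PCA.

```python
# Sketch of the PCA-then-regression pipeline on synthetic stand-in data.
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_vars = 27, 22
X = rng.normal(size=(n_countries, n_vars))   # stand-in AAC in 22 coverage variables
y = rng.normal(size=n_countries)             # stand-in AAC in under-5 mortality

# PCA via SVD of the centred matrix; keep the first 3 components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                       # principal-component scores per country

# OLS of mortality change on the component scores (with intercept)
A = np.column_stack([np.ones(n_countries), scores])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
fitted = A @ beta
r2 = 1 - ((y - fitted) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The r2 values quoted in the Results play the role computed on the last line here; the study additionally controlled for confounding factors, which this minimal sketch omits.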

  15. Research To Improve The Ability Of Neutron Activation Analysis And X- Ray Fluorescence Analysis Methods For Crude Oil Samples

    International Nuclear Information System (INIS)

    Some improvements to the INAA procedure on a nuclear reactor for determining trace element concentrations in crude oil samples have been carried out, including (i) the procedure of packing samples for irradiation and investigation of neutron attenuation in the container during long irradiation by MCNP; (ii) study and correction of sample evaporation after irradiation; (iii) treatment of samples before irradiation by low-temperature vaporization and solvent; and (iv) investigation of gamma self-absorption in the sample. An XRFA procedure for analyzing V, Ni, Cu, Zn, Sr and Pb in crude oil samples has also been established. In the INAA procedure for crude oil, it is necessary to let the sample vaporize naturally before irradiation (except for easily vaporized elements such as Br, Cl, S, Se, As and Hg). The sample is then packed by the improved procedure. Neutron shielding (during irradiation), gamma self-absorption for energies < 400 keV (during measurement) and sample vaporization (after irradiation) are corrected for. XRFA is suitable for analyzing elements such as K, Ca, Ga, Ge, Y, Zr and Pb in crude oil samples, which are difficult to characterize by INAA. Through this study, determination of trace element concentrations in crude oil samples has been carried out by a combination of INAA and XRFA at NRI. (author)
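The abstract does not give its gamma self-absorption correction explicitly; a common first-order form for a uniform slab-like sample is f = (1 - exp(-mu*t)) / (mu*t), the depth-averaged transmission for attenuation coefficient mu and thickness t. The sketch below uses that textbook formula with illustrative numbers; the procedure actually applied at NRI may differ.

```python
# Sketch: first-order gamma self-absorption correction for a uniform slab.
import math

def self_absorption_factor(mu_cm, thickness_cm):
    """Depth-averaged transmission (1 - exp(-mu*t)) / (mu*t) for photons
    emitted uniformly through a slab of thickness t [cm] with linear
    attenuation coefficient mu [1/cm]."""
    x = mu_cm * thickness_cm
    if x == 0:
        return 1.0
    return (1.0 - math.exp(-x)) / x

# A thin or weakly absorbing sample needs almost no correction ...
f_thin = self_absorption_factor(0.2, 0.1)
# ... while a thick sample at low photon energy (< 400 keV, where the abstract
# says the correction is applied) is noticeably attenuated.
f_thick = self_absorption_factor(0.2, 5.0)
```

Dividing a measured peak area by this factor recovers the attenuation-free count rate, which is the point of the correction mentioned in item (iv).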

  16. Novel approaches to the analysis of nuclear and other radioactive materials - Improving detection capability through alpha-gamma coincidence, alpha-induced optical fluorescence and advanced spectrum analysis

    OpenAIRE

    Ihantola, Sakari

    2013-01-01

    Nuclear and other radioactive materials pose a special concern in the proliferation of nuclear weapons, reactor accidents or through criminal acts. To prevent the adverse effects of the use of these materials, novel approaches for their detection and analysis are required. The objective of the research in this thesis was to improve the detection and characterisation of nuclear and other radioactive materials with radiometric methods. Radioactive sources can be detected and identified base...

  17. Mechanism analysis and evaluation methodology of regenerative braking contribution to energy efficiency improvement of electrified vehicles

    International Nuclear Information System (INIS)

    Highlights: • The energy flow of an electric vehicle with regenerative brake is analyzed. • Methodology for measuring the regen brake contribution is discussed. • Evaluation parameters of regen brake contribution are proposed. • Vehicle tests are carried out on chassis dynamometer. • Test results verify the evaluation method and parameters proposed. - Abstract: This article discusses the mechanism and evaluation methods of the contribution brought by regenerative braking to an electric vehicle's energy efficiency improvement. The energy flow of an electric vehicle considering braking energy regeneration was analyzed. Then, methodologies for measuring the contribution made by the regenerative brake to vehicle energy efficiency improvement were introduced. Based on the energy flow analyzed, two different evaluation parameters were proposed. Vehicle tests were carried out on a chassis dynamometer under typical driving cycles with three different control strategies. The experimental results revealed the difference between the two proposed evaluation parameters, and demonstrated the feasibility and effectiveness of the evaluation methodologies proposed.
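The abstract proposes two evaluation parameters without defining them. As a hedged illustration only, two plausible ways of expressing the regenerative-braking contribution are sketched below; both names, formulas, and numbers are hypothetical examples, not the parameters from the article.

```python
# Hypothetical examples of regen-contribution metrics (not from the article).
def regen_energy_share(e_regenerated_kwh, e_consumed_kwh):
    """Recovered braking energy as a fraction of total traction energy drawn."""
    return e_regenerated_kwh / e_consumed_kwh

def range_extension_ratio(range_with_regen_km, range_without_regen_km):
    """Relative driving-range gain attributable to regeneration."""
    return range_with_regen_km / range_without_regen_km - 1.0

share = regen_energy_share(2.1, 12.0)        # 17.5% of drawn energy recovered
gain = range_extension_ratio(118.0, 100.0)   # 18% more range with regen enabled
```

The article's point that two such parameters can disagree is visible even here: an energy-based and a range-based measure weight drive-cycle phases differently, so chassis-dynamometer tests are needed to compare them.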

  18. Genetic Diversity Analysis of Iranian Improved Rice Cultivars through RAPD Markers

    Directory of Open Access Journals (Sweden)

    Ghaffar KIANI

    2011-08-01

    Full Text Available The aim of this study was to evaluate the genetic diversity of Iranian improved rice varieties. Sixteen rice varieties of particular interest to breeding programs were evaluated by means of the random amplified polymorphic DNA (RAPD) technique. The number of amplification products generated by each primer varied from 4 (OPB-04) to 11 (OPD-11), with an average of 8.2 bands per primer. Out of 49 bands, 33 (67.35%) were found to be polymorphic for one or more cultivars, ranging from 4 to 9 fragments per primer. The size of amplified fragments ranged between 350 and 1800 bp. Pair-wise Nei and Li's (1979) similarity estimates ranged from 0.59 to 0.98 between rice cultivars. The results illustrate the potential of RAPD markers to distinguish improved cultivars at the DNA level. The information will facilitate selection of genotypes to serve as parents for effective rice breeding programs in Iran.
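The pair-wise similarity cited above is the Nei & Li (1979) coefficient for band data, S = 2·N_ab / (N_a + N_b), where N_ab is the number of bands shared by two cultivars and N_a, N_b are the band counts of each. A minimal sketch, with made-up presence/absence vectors standing in for scored RAPD bands:

```python
# Sketch: Nei & Li (1979) similarity from 0/1 band-presence vectors.
def nei_li_similarity(bands_a, bands_b):
    """S = 2 * shared_bands / (bands_in_a + bands_in_b)."""
    shared = sum(1 for a, b in zip(bands_a, bands_b) if a == 1 and b == 1)
    return 2.0 * shared / (sum(bands_a) + sum(bands_b))

# Hypothetical band profiles for two cultivars over 8 scored bands
cultivar1 = [1, 1, 0, 1, 0, 1, 1, 0]
cultivar2 = [1, 0, 0, 1, 1, 1, 0, 0]
s = nei_li_similarity(cultivar1, cultivar2)   # 2*3 / (5 + 4)
```

Computing this for every cultivar pair yields the similarity matrix (0.59 to 0.98 in the study) from which breeding parents with complementary profiles can be chosen.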

  19. Improving PWR core simulations by Monte Carlo uncertainty analysis and Bayesian inference

    CERN Document Server

    Castro, Emilio; Buss, Oliver; Garcia-Herranz, Nuria; Hoefer, Axel; Porsch, Dieter

    2016-01-01

    A Monte Carlo-based Bayesian inference model is applied to the prediction of reactor operation parameters of a PWR nuclear power plant. In this non-perturbative framework, high-dimensional covariance information describing the uncertainty of microscopic nuclear data is combined with measured reactor operation data in order to provide statistically sound, well founded uncertainty estimates of integral parameters, such as the boron letdown curve and the burnup-dependent reactor power distribution. The performance of this methodology is assessed in a blind test approach, where we use measurements of a given reactor cycle to improve the prediction of the subsequent cycle. As it turns out, the resulting improvement of the prediction quality is impressive. In particular, the prediction uncertainty of the boron letdown curve, which is of utmost importance for the planning of the reactor cycle length, can be reduced by one order of magnitude by including the boron concentration measurement information of the previous...
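The general mechanism the abstract describes — combining Monte Carlo samples of uncertain nuclear data with a reactor measurement to sharpen the prediction of the next cycle — can be sketched as a generic importance-weighted Bayesian update. Everything below is a toy stand-in (a scalar data perturbation and linear "reactor models"), not the authors' methodology or code.

```python
# Sketch: Monte Carlo Bayesian update via likelihood weighting.
import math
import random

random.seed(0)
# Prior samples of a nuclear-data perturbation (standardized, toy scalar)
prior_samples = [random.gauss(0.0, 1.0) for _ in range(5000)]

def model_cycle1(x):   # toy response of cycle 1 to the shared perturbation
    return 10.0 + 2.0 * x

def model_cycle2(x):   # toy response of cycle 2 to the same perturbation
    return 20.0 + 2.0 * x

# Weight each sample by a Gaussian likelihood of the cycle-1 measurement
measured_c1, sigma_c1 = 11.0, 0.5
weights = [math.exp(-0.5 * ((model_cycle1(x) - measured_c1) / sigma_c1) ** 2)
           for x in prior_samples]
wsum = sum(weights)

# Posterior-weighted prediction for cycle 2: shifted toward the value the
# cycle-1 measurement implies, with correspondingly reduced spread.
pred_c2 = sum(w * model_cycle2(x) for w, x in zip(weights, prior_samples)) / wsum
```

Because both cycles depend on the same underlying nuclear data, conditioning on the cycle-1 measurement tightens the cycle-2 prediction — the blind-test effect the abstract reports for the boron letdown curve.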

  20. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama

    2016-04-01

    Full Text Available We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for the automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conduct an experiment in exactly the same settings and find that the method, based on a Web mining technique, has lost over 30 percentage points of its performance since being proposed in 2013. Thus, we hypothesize on the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We found that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.
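One simple way to collect and filter candidate seed words, in the spirit of the abstract's conclusion that *how* seeds are filtered matters more than how many there are, is a log-odds score of how strongly each word associates with documents labelled harmful. This is a hedged illustration with a toy corpus; the original method of Nitta et al. uses a Web-mining relevance score instead.

```python
# Sketch: log-odds scoring of candidate seed words against labelled documents.
import math
from collections import Counter

harmful_docs = ["you are stupid and ugly", "stupid idiot go away", "so ugly"]
clean_docs = ["have a nice day", "see you at school", "nice game today"]

def word_scores(harmful, clean, smoothing=1.0):
    """Smoothed log-ratio of document frequencies in harmful vs clean text."""
    h, c = Counter(), Counter()
    for d in harmful:
        h.update(set(d.split()))      # document frequency, not raw token counts
    for d in clean:
        c.update(set(d.split()))
    vocab = set(h) | set(c)
    return {w: math.log((h[w] + smoothing) / (c[w] + smoothing)) for w in vocab}

scores = word_scores(harmful_docs, clean_docs)
# Keep only words that lean toward the harmful class as seed candidates
seeds = [w for w, s in sorted(scores.items(), key=lambda kv: -kv[1]) if s > 0]
```

Filtering on the score (rather than just taking the top-k most frequent words) is what keeps neutral high-frequency words out of the seed set, which matches the paper's finding about collection and filtering.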