WorldWideScience

Sample records for analysis improves col4a5

  1. A Novel COL4A5 Mutation Identified in a Chinese Han Family Using Exome Sequencing

    Directory of Open Access Journals (Sweden)

    Xiaofei Xiu

    2014-01-01

    Alport syndrome (AS) is a monogenic disease of the basement membrane (BM), resulting in progressive renal failure due to glomerulonephropathy, variable sensorineural hearing loss, and ocular anomalies. It is caused by mutations in the collagen type IV alpha-3 gene (COL4A3), the collagen type IV alpha-4 gene (COL4A4), and the collagen type IV alpha-5 gene (COL4A5), which encode the type IV collagen α3, α4, and α5 chains, respectively. To explore the disease-related gene in a four-generation Chinese Han pedigree of AS, exome sequencing was conducted on the proband, and a novel deletion mutation c.499delC (p.Pro167Glnfs*36) in the COL4A5 gene was identified. This mutation, absent in the 1000 Genomes Project, HapMap, dbSNP132, and YH1 databases and in 100 normal controls, cosegregated with the affected individuals in the family. Neither sensorineural hearing loss nor typical COL4A5-related ocular abnormalities (dot-and-fleck retinopathy, anterior lenticonus, and the rare posterior polymorphous corneal dystrophy) were present in patients of this family. The phenotypes of patients in this AS family were characterized by an early age of onset and rapid progression to end-stage renal disease (ESRD). Our discovery broadens the mutation spectrum of the COL4A5 gene associated with AS, which may also shed new light on genetic counseling for AS.
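
    An aside on the p.Pro167Glnfs*36 notation above: a single-base deletion shifts the reading frame, so every downstream codon changes and translation usually terminates early. A minimal sketch of this effect, assuming Biopython is available; the sequence is a made-up toy ORF, not the actual COL4A5 cDNA:

    ```python
    from Bio.Seq import Seq  # assumes Biopython is installed

    wild_type = Seq("ATGCCTGGACCTCCAGGAAAACCTGGATAG")  # toy ORF: ATG CCT GGA ... TAG
    mutant = wild_type[:6] + wild_type[7:]             # delete one base inside codon 3

    print(wild_type.translate())    # MPGPPGKPG*  (normal frame, ends at the stop codon)
    print(mutant[:27].translate())  # MPDLQENLD   (every codon after the deletion differs)
    ```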

  2. A COL4A5 mutation with glomerular disease and signs of chronic thrombotic microangiopathy.

    Science.gov (United States)

    Wuttke, Matthias; Seidl, Maximilian; Malinoc, Angelica; Prischl, Friedrich C; Kuehn, E Wolfgang; Walz, Gerd; Köttgen, Anna

    2015-12-01

    COL4A5 mutations are a known cause of Alport syndrome, which typically manifests with haematuria, hearing loss and ocular symptoms. Here we report on a 16-year-old male patient with a negative family history who presented with proteinuria, progressive renal failure and haemolysis, but without overt haematuria or hearing loss. A renal biopsy revealed features of atypical IgA nephropathy, while a second biopsy a year later showed features of focal segmental glomerulosclerosis, but was finally diagnosed as chronic thrombotic microangiopathy. Targeted sequencing of candidate genes for steroid-resistant nephrotic syndrome and congenital thrombotic microangiopathy was negative. Despite all therapeutic efforts, including angiotensin-converting enzyme inhibition, immunosuppressive therapy, plasma exchanges and rituximab, the patient progressed to end-stage renal disease. When a male cousin presented with nephrotic syndrome years later, whole-exome sequencing identified a shared disruptive COL4A5 mutation (p.F222C) that showed X-linked segregation. Thus, mutations in COL4A5 give rise to a broader spectrum of clinical presentation than commonly suspected, highlighting the benefits of comprehensive rather than candidate genetic testing in young patients with otherwise unexplained glomerular disease. Our results are in line with an increasing number of atypical presentations of single-gene disorders identified through genome-wide sequencing.

  3. High mutation detection rate in the COL4A5 collagen gene in suspected Alport syndrome using PCR and direct DNA sequencing

    DEFF Research Database (Denmark)

    Martin, P; Heiskari, N; Zhou, J;

    1998-01-01

    Approximately 85% of patients with Alport syndrome (hereditary nephritis) have been estimated to have mutations in the X chromosomal COL4A5 collagen gene; the remaining cases are autosomal with mutations in the COL4A3 or COL4A4 genes located on chromosome 2. In the present work, the promoter sequ...

  4. A nonsense mutation in the COL4A5 collagen gene in a family with X-linked juvenile Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Heiskari, N; Zhou, J;

    1995-01-01

    The X-linked form of Alport syndrome is associated with mutations in the COL4A5 gene encoding the alpha 5-chain of type IV collagen. By using PCR-amplification and direct sequencing we identified a novel mutation involving a deletion of the last two bases in the codon GGA for Glycine-1479 in exon 47 of the COL4A5 gene in a patient with a juvenile form of X-linked Alport syndrome with deafness. This two base deletion caused a shift in the reading frame and introduced a premature stop codon which resulted in an alpha 5(IV)-chain shortened by 202 residues and lacking almost the entire NC1 domain...

  5. Mutations in the codon for a conserved arginine-1563 in the COL4A5 collagen gene in Alport syndrome

    DEFF Research Database (Denmark)

    Zhou, J; Gregory, M C; Hertz, Jens Michael;

    1993-01-01

    ...kindreds. All three kindreds have classical Alport syndrome of the juvenile type. DNA-sequencing analyses demonstrated two different single base changes in the codon for arginine-1563 located in exon 48. In Utah kindred 2103, there was a substitution of C by T resulting in the change of the CGA codon for arginine to the translation stop codon TGA. In Utah kindred 2123 and in the Danish kindred A13, there was a C-->T mutation in the noncoding strand changing the same codon to CAA for glutamine. Both mutations were confirmed by allele-specific hybridization on PCR-amplified DNA from other family members...

  6. Detection of mutations in the COL4A5 gene by SSCP in X-linked Alport syndrome

    DEFF Research Database (Denmark)

    Hertz, Jens Michael; Juncker, I; Persson, U;

    2001-01-01

    ...X-linked inheritance, and 15 isolated cases. We found a mutation detection rate of 52% (42/81) (58% in males and 21% in females), and 69% (20/29) in families who clearly demonstrated X-linked inheritance. Thirty-six different mutations were found in 42 patients, comprising 16 missense mutations, seven frameshifts...

  7. Failure Analysis for Improved Reliability

    Science.gov (United States)

    Sood, Bhanu

    2016-01-01

    Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Non destructive analysis techniques, 2. Destructive Analysis, 3. Materials Characterization). Section 4 - Summary and Closure

  8. Improved Intermittency Analysis of Single Event Data

    OpenAIRE

    Janik, R. A.; Ziaja, B.

    1998-01-01

    The intermittency analysis of single event data (particle moments) in multiparticle production is improved, taking into account corrections due to the reconstruction of the history of a particle cascade. This approach is tested within the framework of the $\alpha$-model.

  9. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasour;

    2011-01-01

    We present some improved analytical results as part of the ongoing work on the analysis of the Fugue-256 hash function, a second round candidate in NIST's SHA-3 competition. First we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the final transformation of Fugue-256 to 16...

  10. Improved Intermittency Analysis of Individual Events

    OpenAIRE

    Janik, R. A.; Ziaja, B.

    1998-01-01

    Recent progress on the event-by-event analysis of intermittent data by R. A. Janik and myself is reported. The intermittency analysis of single event data (particle moments) in multiparticle production is improved, taking into account corrections due to the reconstruction of the history of a particle cascade. This approach is tested within the framework of the $\alpha$-model.
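
    For context, the quantity at the heart of intermittency analysis is the scaled factorial moment, F_q = <n(n-1)...(n-q+1)> / <n>^q, averaged over bins. A sketch of the standard, uncorrected bin-averaged moment, with toy Poisson multiplicities standing in for real events (the single-event cascade corrections from the paper are not reproduced):

    ```python
    import numpy as np

    def factorial_moment(counts, q):
        """Scaled factorial moment F_q = <n(n-1)...(n-q+1)> / <n>^q,
        averaged over events and bins; counts has shape (n_events, n_bins)."""
        n = counts.astype(float)
        prod = np.ones_like(n)
        for k in range(q):
            prod *= n - k                    # n(n-1)...(n-q+1)
        return prod.mean() / n.mean() ** q

    rng = np.random.default_rng(0)
    counts = rng.poisson(2.0, size=(1000, 32))   # toy multiplicities per bin
    print(factorial_moment(counts, 2))           # ~1 for a pure Poisson source;
                                                 # intermittent data rise with resolution
    ```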

  11. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  12. AN IMPROVED ALGORITHM FOR DPIV CORRELATION ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    WU Long-hua

    2007-01-01

    In a Digital Particle Image Velocimetry (DPIV) system, the correlation of digital images is normally used to acquire the displacement information of particles and give estimates of the flow field. The accuracy and robustness of the correlation algorithm directly affect the validity of the analysis result. In this article, an improved algorithm for the correlation analysis was proposed which could be used to optimize the selection/determination of the correlation window, analysis area and search path. This algorithm not only greatly reduces the amount of calculation, but also effectively improves the accuracy and reliability of the correlation analysis. The algorithm was demonstrated to be accurate and efficient in the measurement of the velocity field in a flocculation pool.
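
    The core of DPIV correlation analysis is locating the cross-correlation peak between two interrogation windows. A minimal FFT-based sketch; a toy random texture stands in for particle images, and the article's window/search-path optimizations are not reproduced:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def displacement(win_a, win_b):
        """Mean particle shift between two interrogation windows,
        estimated as the location of the cross-correlation peak."""
        a = win_a - win_a.mean()
        b = win_b - win_b.mean()
        corr = fftconvolve(b, a[::-1, ::-1], mode="full")   # cross-correlation via FFT
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        return peak[0] - (win_a.shape[0] - 1), peak[1] - (win_a.shape[1] - 1)

    rng = np.random.default_rng(1)
    frame = rng.random((64, 64))
    a = frame[16:48, 16:48]
    b = frame[13:45, 11:43]        # same texture, moved 3 px down and 5 px right
    print(displacement(a, b))      # -> (3, 5)
    ```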

  13. Analysis of Questionnaire using Multivariate Analysis for Improving Lectures

    Science.gov (United States)

    Abe, Takehiko; Tajima, Takuya; Kimura, Haruhiko

    Recently, universities have been sending out questionnaires to students, and the results are used for improving lectures. However, the results are shaped by the students' subjective views, so they require careful analysis. This paper uses regression analysis and quantification theory type III, and explains the relation between student satisfaction and grades through analysis of the questionnaire results.

  14. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture-model". Flow diagram analysis helped to avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big picture model" improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  15. Improved Tiled Bitmap Forensic Analysis Algorithm

    Directory of Open Access Journals (Sweden)

    C. D. Badgujar, G. N. Dhanokar

    2012-12-01

    In the computer network world, the needs for security and proper systems of control are obvious, as is the need to find the intruders who modify data. Nowadays, frauds that occur in companies are committed not only by outsiders but also by insiders. An insider may perform illegal activity and try to hide it. Companies would like to be assured that such illegal activity, i.e. tampering, has not occurred, or, if it does, that it is quickly discovered. Mechanisms now exist that detect tampering of a database through the use of cryptographically strong hash functions. This paper contains a survey which explores various approaches to database forensics through different methodologies, using forensic algorithms and tools for investigations. Forensic analysis algorithms are used to determine who tampered with the data, when, and what was tampered. The Tiled Bitmap Algorithm introduces the notion of a candidate set (all possible locations of detected tampering(s)) and provides a complete characterization of the candidate set and its cardinality. The improved tiled bitmap algorithm overcomes the drawbacks of the existing tiled bitmap algorithm.
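
    The underlying idea of such forensic algorithms can be sketched with per-tile cryptographic hashes over an append-only log: a validator recomputes the digests and the mismatching tiles form the candidate set of tampered locations. This is only an illustration of candidate-set localization, with hypothetical names and data, not the published Tiled Bitmap Algorithm itself:

    ```python
    import hashlib

    def tile_digests(rows, tile_size=4):
        """Per-tile SHA-256 digests over an append-only audit table."""
        tiles = [rows[i:i + tile_size] for i in range(0, len(rows), tile_size)]
        return [hashlib.sha256("\n".join(t).encode()).hexdigest() for t in tiles]

    rows = [f"txn {i},amount={100 + i}" for i in range(12)]
    notarized = tile_digests(rows)              # digests stored with a trusted notary

    rows[6] = "txn 6,amount=999999"             # an insider tampers with one row
    candidate_set = [i for i, (d, n) in enumerate(zip(tile_digests(rows), notarized))
                     if d != n]
    print(candidate_set)                        # -> [1]: tampering localized to tile 1
    ```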

  16. Analysis and Improvement of a User Authentication Improved Protocol

    Directory of Open Access Journals (Sweden)

    Zuowen Tan

    2010-05-01

    Remote user authentication commonly relies on passwords to log in to servers within insecure network environments. Recently, Peyravin and Jeffries proposed a practical authentication scheme based on one-way collision-resistant hash functions. However, Shim and Munilla independently showed that the scheme is vulnerable to off-line guessing attacks. In order to remove the weakness, Hölbl, Welzer and Brumenn presented improved secure password-based protocols for remote user authentication, password change and session key establishment. Unfortunately, the remedies of their improved scheme do not work: the improved scheme still suffers from off-line attacks, and the password change protocol is insecure against denial-of-service attacks. A scheme is proposed which overcomes these weaknesses. Detailed cryptanalysis shows that the proposed password-based protocols for remote user authentication, password change and session key establishment are immune against man-in-the-middle attacks, replay attacks, password guessing attacks, outsider attacks, denial-of-service attacks and impersonation attacks.
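
    Why off-line guessing is so damaging for hash-based password protocols: once an attacker records a single transcript value derived from a low-entropy password, candidates can be tested at will without contacting the server. A toy sketch with a hypothetical transcript format, not the actual protocol analyzed in the paper:

    ```python
    import hashlib

    # Value an eavesdropper might capture from one protocol run: H(password || nonce)
    nonce = "a1b2c3"
    observed = hashlib.sha256(("hunter2" + nonce).encode()).hexdigest()

    # Off-line guessing: no further server interaction is needed
    dictionary = ["123456", "password", "qwerty", "hunter2", "letmein"]
    for guess in dictionary:
        if hashlib.sha256((guess + nonce).encode()).hexdigest() == observed:
            print("password recovered:", guess)
            break
    ```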

  17. Productivity improvement through cycle time analysis

    Science.gov (United States)

    Bonal, Javier; Rios, Luis; Ortega, Carlos; Aparicio, Santiago; Fernandez, Manuel; Rosendo, Maria; Sanchez, Alejandro; Malvar, Sergio

    1996-09-01

    A cycle time (CT) reduction methodology has been developed at the Lucent Technologies facility (former AT&T) in Madrid, Spain. It is based on a comparison of the contribution of each process step in each technology with a target generated by a cycle time model. These targeted cycle times are obtained using capacity data for the machines processing those steps, queuing theory, and theory of constraints (TOC) principles (buffers to protect the bottleneck and low cycle time/inventory everywhere else). Overall equipment effectiveness (OEE)-like analysis is done on the machine groups with major differences between their target cycle times and real values. Comparisons between the current values of the parameters that govern their capacity (process times, availability, idles, reworks, etc.) and the engineering standards are made to detect the causes of excess contribution to the cycle time. Several friendly, graphical tools have been developed to track and analyze those capacity parameters. Two tools have proved especially important: ASAP (analysis of scheduling, arrivals and performance) and Performer, which analyzes interrelation problems among machines, procedures, and direct labor. Performer is designed for a detailed, daily analysis of an isolated machine. Extensive use of this tool by the whole labor force has demonstrated impressive results in the elimination of multiple small inefficiencies, with direct positive implications for OEE. As for ASAP, it shows the lots in process/queue for different machines at the same time. ASAP is a powerful tool to analyze product flow management and the assigned capacity for interdependent operations like cleaning and oxidation/diffusion. Additional tools have been developed to track, analyze and improve process times and availability.
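
    The queuing-theory link between utilization and cycle time that motivates such targets can be sketched with the M/M/1 mean queue time, W_q = ρ/(μ − λ). The paper's actual cycle time model is not given here, and the numbers below are invented:

    ```python
    def queue_time(arrival_rate, service_rate):
        """Mean time in queue for an M/M/1 station: W_q = rho / (mu - lambda)."""
        rho = arrival_rate / service_rate            # utilization
        return rho / (service_rate - arrival_rate)

    service_rate = 10.0  # lots/hour a machine group can process (toy value)
    for arrivals in (5.0, 8.0, 9.0, 9.5):
        print(f"utilization {arrivals / service_rate:.0%}: "
              f"queue time {queue_time(arrivals, service_rate):.2f} h")
    # Queue time explodes as utilization approaches 100%, which is why TOC keeps
    # a buffer only at the bottleneck and low inventory everywhere else.
    ```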

  18. SPORTS ORGANIZATIONS MANAGEMENT IMPROVEMENT: A SURVEY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Alin Molcut

    2015-07-01

    Sport organizations exist to perform tasks that can only be executed through cooperative effort, and sport management is responsible for the performance and success of these organizations. The main aim of the paper is to analyze several issues of managing sports organizations in order to assess their quality of management. In this respect, a questionnaire was designed for performing a survey analysis through a statistical approach. The investigation was conducted over a period of 3 months, questioning a number of football managers and coaches working at the children's and junior training level in football clubs in the counties of Timis and Arad. The results suggest that there is a significant interest in the improvement of management across children's teams and under-21 clubs, with emphasis on players' participation and rewarding performance. Furthermore, we can state that the sports clubs have established a vision and a mission, as well as general club objectives referring to both sporting and financial performance.

  19. Improved security analysis of Fugue-256

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Bagheri, Nasour; Knudsen, Lars Ramkilde;

    2011-01-01

    ...in the G transform. Next we improve the designers' meet-in-the-middle preimage attack on Fugue-256 from 2^480 time and memory to 2^416. Next we study the security of Fugue-256 against free-start distinguishers and free-start collisions. In this direction, we use an improved variant of the differential...

  20. SINFAC - SYSTEMS IMPROVED NUMERICAL FLUIDS ANALYSIS CODE

    Science.gov (United States)

    Costello, F. A.

    1994-01-01

    The Systems Improved Numerical Fluids Analysis Code, SINFAC, consists of additional routines added to the April 1983 revision of SINDA, a general thermal analyzer program. The purpose of the additional routines is to allow for the modeling of active heat transfer loops. The modeler can simulate the steady-state and pseudo-transient operations of 16 different heat transfer loop components including radiators, evaporators, condensers, mechanical pumps, reservoirs and many types of valves and fittings. In addition, the program contains a property analysis routine that can be used to compute the thermodynamic properties of 20 different refrigerants. SINFAC can simulate the response to transient boundary conditions. SINFAC was first developed as a method for computing the steady-state performance of two phase systems. It was then modified using CNFRWD, SINDA's explicit time-integration scheme, to accommodate transient thermal models. However, SINFAC cannot simulate pressure drops due to time-dependent fluid acceleration, transient boil-out, or transient fill-up, except in the accumulator. SINFAC also requires the user to be familiar with SINDA. The solution procedure used by SINFAC is similar to that which an engineer would use to solve a system manually. The solution to a system requires the determination of all of the outlet conditions of each component such as the flow rate, pressure, and enthalpy. To obtain these values, the user first estimates the inlet conditions to the first component of the system, then computes the outlet conditions from the data supplied by the manufacturer of the first component. The user then estimates the temperature at the outlet of the third component and computes the corresponding flow resistance of the second component. With the flow resistance of the second component, the user computes the conditions downstream, namely the inlet conditions of the third component. The computations follow for the rest of the system, back to the first component.

  1. Novel analysis and improvement of Yahalom protocol

    Institute of Scientific and Technical Information of China (English)

    CHEN Chun-ling; YU Han; LÜ Heng-shan; WANG Ru-chuan

    2009-01-01

    The modified version of the Yahalom protocol improved by Burrows, Abadi, and Needham (BAN) still has security drawbacks. This study analyzed such flaws in a detailed way from the viewpoint of strand spaces, a novel method of analyzing protocol security. First, a mathematical model of the BAN-Yahalom protocol is constructed. Second, penetrators' abilities are restricted with a rigorous and formalized definition. Moreover, to increase the security of this protocol against potential attackers in practice, a further improvement is made to the protocol. Future applications of this re-improved protocol are also discussed.

  2. Improving Public Perception of Behavior Analysis.

    Science.gov (United States)

    Freedman, David H

    2016-05-01

    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis. PMID:27606184

  3. Liquefaction mathematical analysis for improvement structures stability

    Directory of Open Access Journals (Sweden)

    Azam Khodashenas Pelko

    2010-10-01

    The stability of any structure is possible only if its foundation is appropriately designed. Bandar Abbas is the largest and most important port of Iran; given the high seismicity and the strong earthquakes occurring in this territory, the soil mechanical properties of different parts of the city were selected as the subject of the current research. Data relating to the design of foundations for the improvement of structures at different layers of subsoil were collected and, accordingly, soil mechanical properties were evaluated. The results of laboratory experiments can be used for evaluation of the geotechnical characteristics of the urban area, for developing a region with a high level of structural stability. Ultimately, a new method for calculation of the liquefaction force is suggested. It is applicable for improving geotechnical and structural codes and also for reanalysis of the structural stability of previously constructed buildings.

  4. An Economic Analysis of Improved Water Quality

    OpenAIRE

    Alam, Khorshed; Rolfe, John; Donaghy, Peter

    2006-01-01

    The research reported in this paper is focused on the cost-effectiveness of intervention strategies to reduce pollution loads and improve water quality in South-east Queensland. Strategies considered include point and non-point source interventions. Predicted reductions in pollution levels were calculated for each action based on the expected population growth. The costs of the interventions included the full investment and annual running costs as well as planned public investment by the stat...

  5. Video analysis applied to volleyball didactics to improve sport skills

    OpenAIRE

    Raiola, Gaetano; Parisi, Fabio; Giugno, Ylenia; Di Tore, Pio Alfredo

    2013-01-01

    The feedback method is increasingly used in learning new skills and improving performance. Recent research, however, showed that the more objective and quantitative the feedback is, the greater its effect on performance. Video analysis, which is the analysis of sports performance by watching the video, is used primarily to quantify the performance of athletes through notational analysis. It may be useful to combine the quantitative and qualitative analysis of the single ges...

  6. An improved evaluation method for fault tree kinetic analysis

    International Nuclear Information System (INIS)

    By means of the exclusive sum of products of a fault tree, the improved method uses the basic event parameters directly in the synthetic evaluation and makes fault tree kinetic analysis simpler. The paper also provides a reasonable evaluation method for the kinetic analysis of basic events whose parameters follow a synthetic distribution.
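
    The record does not spell out the exclusive sum-of-products method, but the flavor of fault tree quantification can be shown with a generic inclusion-exclusion evaluation over minimal cut sets, assuming independent basic events and toy probabilities (this is a different, standard evaluation technique, not the paper's):

    ```python
    from itertools import combinations

    basic = {"A": 1e-3, "B": 5e-4, "C": 2e-3}   # toy basic-event probabilities
    cut_sets = [{"A", "B"}, {"C"}]              # top event = (A AND B) OR C

    def cut_prob(events):
        p = 1.0
        for e in events:
            p *= basic[e]                        # independence assumption
        return p

    # P(top) by inclusion-exclusion over the cut-set events
    p_top = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            p_top += (-1) ** (k + 1) * cut_prob(set().union(*combo))
    print(p_top)                                 # ~2.0005e-3
    ```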

  7. Improved Runtime Analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2013-01-01

    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented, proving that the algorithm requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of our previous one. Firstly ... improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement, although the results hold for both algorithmic versions. Experiments are presented to explore the limits...

  8. Improved time complexity analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2015-01-01

    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented, proving that the algorithm with population size μ ≤ n^(1/8−ε) requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations ... this is a major improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement, although the results hold for both algorithmic versions. Experiments are presented to explore...

  9. Improving the Individual Work Performance Questionnaire using Rasch analysis.

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Buuren, S. van; Beek, A.J. van der; Vet, H.C.W. de

    2014-01-01

    Recently, the Individual Work Performance Questionnaire (IWPQ) version 0.2 was developed using Rasch analysis. The goal of the current study was to improve targeting of the IWPQ scales by including additional items. The IWPQ 0.2 (original) and 0.3 (including additional items) were examined using Rasch...

  10. Using Operational Analysis to Improve Access to Pulmonary Function Testing.

    Science.gov (United States)

    Ip, Ada; Asamoah-Barnieh, Raymond; Bischak, Diane P; Davidson, Warren J; Flemons, W Ward; Pendharkar, Sachin R

    2016-01-01

    Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilization of testing resources. Results. Qualitative analysis demonstrated that stakeholder groups had discrepant views on access and capacity in the laboratory. Mean daily resource utilization was 0.64 (SD 0.15), with monthly average utilization consistently less than 0.75. Reserved testing slots for subspecialty clinics were poorly utilized, leaving many testing slots unfilled. When subspecialty demand exceeded the number of reserved slots, there was sufficient capacity in the pulmonary function schedule to accommodate the added demand. Findings were shared with stakeholders and influenced scheduling process improvements. Conclusion. This study highlights the importance of operational data to identify causes of poor access, guide system decision-making, and determine effects of improvement initiatives in a variety of healthcare settings. Importantly, simple operational analysis can help to improve efficiency of health systems with little or no added financial investment.
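
    The utilization arithmetic in such an operational analysis is simple: daily utilization is booked slots over available slots, summarized over time. A sketch with hypothetical scheduling data (the 0.64 mean above comes from the study's own data, not from this toy):

    ```python
    import statistics

    available_per_day = 20                               # assumed slot capacity
    booked = [14, 12, 11, 16, 13, 12, 15, 11, 13, 12]    # hypothetical daily bookings

    util = [b / available_per_day for b in booked]
    print(f"mean {statistics.mean(util):.2f}, sd {statistics.stdev(util):.2f}")
    # A mean well below 0.75 suggests unfilled capacity: reserved subspecialty
    # slots going unused can coexist with perceived poor access.
    ```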

  11. Using Operational Analysis to Improve Access to Pulmonary Function Testing

    Directory of Open Access Journals (Sweden)

    Ada Ip

    2016-01-01

    Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilization of testing resources. Results. Qualitative analysis demonstrated that stakeholder groups had discrepant views on access and capacity in the laboratory. Mean daily resource utilization was 0.64 (SD 0.15), with monthly average utilization consistently less than 0.75. Reserved testing slots for subspecialty clinics were poorly utilized, leaving many testing slots unfilled. When subspecialty demand exceeded the number of reserved slots, there was sufficient capacity in the pulmonary function schedule to accommodate the added demand. Findings were shared with stakeholders and influenced scheduling process improvements. Conclusion. This study highlights the importance of operational data to identify causes of poor access, guide system decision-making, and determine effects of improvement initiatives in a variety of healthcare settings. Importantly, simple operational analysis can help to improve efficiency of health systems with little or no added financial investment.

  12. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    OpenAIRE

    Jia-Shing Sheu; Kai-Chung Teng

    2013-01-01

    The objective of this paper is to perform the innovation design for improving the recognition of a captured QR code image with blur through the Pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as the low quality of the image. Focus is an important factor that affects the quality of the image. This study discusses the out-of-focus QR code image and aims to improve the recognition of the conte...

  13. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    ...software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  14. Development and improvement of safety analysis code for geological disposal

    International Nuclear Information System (INIS)

    In order to confirm long-term safety for geological disposal, a probabilistic safety assessment code and other analysis codes, which can evaluate the possibility of each event and its influence on the engineered barrier and the natural barrier, were introduced. We confirmed the basic functions of those codes and studied the relation between those functions and the FEP/PID which should be taken into consideration in safety assessment. We are planning to develop a 'Nuclide Migration Assessment System' for the purpose of improving the efficiency of assessment work, preventing human error in analysis, and assuring the quality of the analysis environment and analysis work for safety assessment. As the first step, we defined the system requirements and decided the system composition and the functions which should be implemented, based on those requirements. (author)

  15. Improvements in antenna coupling path algorithms for aircraft EMC analysis

    Science.gov (United States)

    Bogusz, Michael; Kibina, Stanley J.

    The algorithms to calculate and display the path of maximum electromagnetic interference coupling along the perfectly conducting surface of a frustum cone model of an aircraft nose are developed and revised for the Aircraft Inter-Antenna Propagation with Graphics (AAPG) electromagnetic compatibility analysis code. Analysis of the coupling problem geometry on the frustum cone model and representative numerical test cases reveals how the revised algorithms are more accurate than their predecessors. These improvements in accuracy and their impact on realistic aircraft electromagnetic compatibility problems are outlined.

  16. Using Operational Analysis to Improve Access to Pulmonary Function Testing

    OpenAIRE

    Ada Ip; Raymond Asamoah-Barnieh; Diane P. Bischak; Warren J Davidson; W. Ward Flemons; Pendharkar, Sachin R.

    2016-01-01

    Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilizati...

  17. Spiral analysis-improved clinical utility with center detection.

    Science.gov (United States)

    Wang, Hongzhi; Yu, Qiping; Kurtis, Mónica M; Floyd, Alicia G; Smith, Whitney A; Pullman, Seth L

    2008-06-30

    Spiral analysis is a computerized method that measures human motor performance from handwritten Archimedean spirals. It quantifies normal motor activity, and detects early disease as well as dysfunction in patients with movement disorders. The clinical utility of spiral analysis is based on kinematic and dynamic indices derived from the original spiral trace, which must be detected and transformed into mathematical expressions with great precision. Accurately determining the center of the spiral and reducing spurious low frequency noise caused by center selection error is important to the analysis. Handwritten spirals do not all start at the same point, even when marked on paper, and drawing artifacts are not easily filtered without distortion of the spiral data and corruption of the performance indices. In this report, we describe a method for detecting the optimal spiral center and reducing the unwanted drawing artifacts. To demonstrate overall improvement to spiral analysis, we study the impact of the optimal spiral center detection in different frequency domains separately and find that it notably improves the clinical spiral measurement accuracy in low frequency domains.
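
    One way to implement center detection of this kind, offered only as a sketch under stated assumptions rather than the authors' algorithm: for an Archimedean spiral r = a + bθ, the radius grows linearly with the unwrapped angle, so a candidate center can be scored by the residual of a linear fit and then optimized:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def spiral_residual(center, x, y):
        """Score a candidate center: for the true center of r = a + b*theta,
        radius is a linear function of unwrapped angle (residual ~ 0)."""
        dx, dy = x - center[0], y - center[1]
        theta = np.unwrap(np.arctan2(dy, dx))    # points assumed ordered along the trace
        r = np.hypot(dx, dy)
        coeffs = np.polyfit(theta, r, 1)
        return np.sum((r - np.polyval(coeffs, theta)) ** 2)

    theta = np.linspace(0, 6 * np.pi, 400)
    x = 2.0 + 0.5 * theta * np.cos(theta)        # synthetic spiral centered at (2, -1)
    y = -1.0 + 0.5 * theta * np.sin(theta)

    fit = minimize(spiral_residual, x0=[0.0, 0.0], args=(x, y), method="Nelder-Mead")
    print(fit.x)                                 # -> close to [2, -1]
    ```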

  18. Improved Methods for the Enrichment and Analysis of Glycated Peptides

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Qibin; Schepmoes, Athena A; Brock, Jonathan W; Wu, Si; Moore, Ronald J; Purvine, Samuel O; Baynes, John; Smith, Richard D; Metz, Thomas O

    2008-12-15

    Non-enzymatic glycation of tissue proteins has important implications in the development of complications of diabetes mellitus. Herein we report improved methods for the enrichment and analysis of glycated peptides using boronate affinity chromatography and electron transfer dissociation mass spectrometry, respectively. The enrichment of glycated peptides was improved by replacing an off-line desalting step with an on-line wash of column-bound glycated peptides using 50 mM ammonium acetate. The analysis of glycated peptides by MS/MS was improved by considering only higher charged (≥3) precursor-ions during data-dependent acquisition, which increased the number of glycated peptide identifications. Similarly, the use of supplemental collisional activation after electron transfer (ETcaD) resulted in more glycated peptide identifications when the MS survey scan was acquired with enhanced resolution. In general, acquiring ETD-MS/MS data at a normal MS survey scan rate, in conjunction with the rejection of both 1+ and 2+ precursor-ions, increased the number of identified glycated peptides relative to ETcaD or the enhanced MS survey scan rate. Finally, an evaluation of trypsin, Arg-C, and Lys-C showed that tryptic digestion of glycated proteins was comparable to digestion with Lys-C and that both were better than Arg-C in terms of the number of glycated peptides identified by LC-MS/MS.

  19. Improved spectrum simulation for validating SEM-EDS analysis

    Science.gov (United States)

    Statham, P.; Penman, C.; Duncumb, P.

    2016-02-01

    X-ray microanalysis by SEM-EDS requires corrections for the many physical processes that affect emitted intensity for elements present in the material. These corrections will only be accurate provided a number of conditions are satisfied and it is essential that the correct elements are identified. As analysis is pushed to achieve results on smaller features and more challenging samples it becomes increasingly difficult to determine if all conditions are upheld and whether the analysis results are valid. If a theoretical simulated spectrum based on the measured analysis result is compared with the measured spectrum, any marked differences will indicate problems with the analysis and can prevent serious mistakes in interpretation. To achieve the necessary accuracy a previous theoretical model has been enhanced to incorporate new line intensity measurements, differential absorption and excitation of emission lines, including the effect of Coster-Kronig transitions and an improved treatment of bremsstrahlung for compounds. The efficiency characteristic has been measured for a large area SDD detector and data acquired from an extensive set of standard materials at both 5 kV and 20 kV. The parameterized model has been adjusted to fit measured characteristic intensities and both background shape and intensity at the same beam current. Examples are given to demonstrate how an overlay of an accurate theoretical simulation can expose some non-obvious mistakes and provide some expert guidance towards a valid analysis result. A new formula for calculating the effective mean atomic number for compounds has also been derived that is appropriate and should help improve accuracy in techniques that calculate the bremsstrahlung or use a bremsstrahlung measurement for calibration.

  20. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

    The objective of this paper is to perform an innovative design for improving the recognition of a captured QR code image with blur, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality. Focus is an important factor that affects the quality of the image. This study discusses out-of-focus QR code images and aims to improve the recognition of the contents of the QR code image. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image. This method is also used in this investigation to improve the recognition of a captured QR code image. A blurred QR code image is separated into nine levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image. The QR code images reconstructed by these approaches at the nine levels are then compared. The final experimental results indicate improvements in identification.
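
    A sketch of the pillbox (circular averaging) kernel used to simulate defocus, applied here to a toy binary pattern; the paper's nine blur levels and its four reconstruction approaches are not reproduced:

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def pillbox_kernel(radius):
        """Circular averaging (pillbox) kernel: uniform weight inside a disc."""
        y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        k = (x ** 2 + y ** 2 <= radius ** 2).astype(float)
        return k / k.sum()

    image = (np.random.default_rng(2).random((64, 64)) > 0.5).astype(float)  # toy pattern
    for radius in (1, 3, 5):            # crude stand-ins for increasing defocus levels
        blurred = convolve(image, pillbox_kernel(radius), mode="nearest")
        print(radius, round(float(blurred.std()), 3))   # contrast falls as defocus grows
    ```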

  1. Micromechanical analysis of polyacrylamide-modified concrete for improving strengths

    Energy Technology Data Exchange (ETDEWEB)

    Sun Zengzhi [School of Materials Science and Engineering, Chang'an University, Xi'an 710064 (China)], E-mail: zz-sun@126.com; Xu Qinwu [Pavement research, Transtec Group Inc., Austin 78731 (United States)], E-mail: qinwu_xu@yahoo.com

    2008-08-25

    This paper studies how polyacrylamide (PAM) alters the physicochemical and mechanical properties of concrete. The microstructure of PAM-modified concrete and the physicochemical reaction between PAM and concrete were studied through scanning electron microscopy (SEM), differential thermal analysis (DTA), thermal gravimetric analysis (TGA), and infrared spectrum analysis. Meanwhile, the workability and strengths of cement paste and concrete were tested. PAM's modification mechanism was also discussed. Results indicate that PAM reacts with the Ca²⁺ and Al³⁺ cations produced by concrete hydration to form ionic compounds and reduce the crystallization of Ca(OH)₂, acting as a flexible filler and reinforcement in the porosity of concrete and, therefore, improving concrete's engineering properties. PAM also significantly alters the microstructure at the aggregate-cement interfacial transition zone. Mechanical testing results indicate that the fluidity of cement paste decreases initially, then increases, and decreases again with increasing PAM content. PAM can effectively improve the flexural strength, bonding strength, dynamic impact resistance, and fatigue life of concrete, though it reduces the compressive strength to some extent.

  2. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization

    Science.gov (United States)

    Gern, Frank H.

    2012-01-01

    This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are being presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel was used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.

  3. Improved generalized cell mapping for global analysis of dynamical systems

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Three main parts of generalized cell mapping are improved for global analysis. A simple method, which is not based on the theory of digraphs, is presented to locate complete self-cycling sets that correspond to attractors and unstable invariant sets involving saddle, unstable periodic orbit and chaotic saddle. Refinement for complete self-cycling sets is developed to locate attractors and unstable invariant sets with a high degree of accuracy, which can start with a coarse cell structure. A nonuniformly interior-and-boundary sampling technique is used to make the refinement robust. For homeomorphic dissipative dynamical systems, a controlled boundary sampling technique is presented to make the generalized cell mapping method with refinement extremely accurate in obtaining invariant sets. Recursive laws of group absorption probability and expected absorption time are introduced into generalized cell mapping, and an optimal order for quantitative analysis of transient cells is established, which leads to minimal computational work. The improved method is applied to four examples to show its effectiveness in the global analysis of dynamical systems.

  4. New Framework for Improving Big Data Analysis Using Mobile Agent

    Directory of Open Access Journals (Sweden)

    Youssef M. ESSA

    2014-01-01

    The rising number of applications serving millions of users and dealing with terabytes of data creates a need for faster processing paradigms. Recently, there has been growing enthusiasm for the notion of big data analysis. Big data analysis has become a very important aspect of improving productivity, reliability and quality of service (QoS). Processing big data using a single powerful machine is not an efficient solution, so companies have focused on using Hadoop software for big data analysis. This is because Hadoop is designed to support parallel and distributed data processing. Hadoop provides a distributed file processing system that stores and processes data at large scale. It provides fault tolerance by replicating data on three or more machines to avoid data loss. Hadoop is based on a client-server model and uses a single master machine called the NameNode. However, Hadoop has several drawbacks affecting its performance and reliability in big data analysis. In this paper, a new framework is proposed to improve big data analysis and overcome the specified drawbacks of Hadoop: task replication, the centralized master node, and node failures. The proposed framework is called MapReduce Agent Mobility (MRAM). MRAM is developed using mobile agents and the MapReduce paradigm under the Java Agent Development Framework (JADE).

  5. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  6. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

    To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs). Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.

  7. Skill Gap Analysis for Improved Skills and Quality Deliverables

    Directory of Open Access Journals (Sweden)

    Mallikarjun Koripadu

    2014-10-01

    With growing pressure to identify skilled resources in the Clinical Data Management (CDM) world of clinical research organizations and to provide quality deliverables, most CDM organizations are planning to improve skills within the organization. In the changing CDM landscape, the ability to build, manage and leverage the skills of clinical data managers is critical, and CDM needs to proactively identify, analyze and address skill gaps for all the roles involved. In addition to domain skills, the evolving role of a clinical data manager demands diverse skill sets such as project management, six sigma, analytical skills, decision making, and communication. This article proposes a methodology of skill gap analysis (SGA) management as one potential solution to the big skill challenge that CDM is gearing up for, namely bridging the gap in skills. This would in turn strengthen CDM capability, scalability and consistency across geographies, along with improving productivity and quality of deliverables.

  8. Crystal quality analysis and improvement using x-ray topography.

    Energy Technology Data Exchange (ETDEWEB)

    Maj, J.; Goetze, K.; Macrander, A.; Zhong, Y.; Huang, X.; Maj, L.; Univ. of Chicago

    2008-01-01

    The Topography X-ray Laboratory of the Advanced Photon Source (APS) at Argonne National Laboratory operates as a collaborative effort with APS users to produce high performance crystals for APS X-ray beamline experiments. For many years the topography laboratory has worked closely with an on-site optics shop to help ensure the production of crystals with the highest quality, most stress-free surface finish possible. It has been instrumental in evaluating and refining methods used to produce high quality crystals. Topographical analysis has been shown to be an effective method to quantify and determine the distribution of stresses, to help identify methods that would mitigate the stresses and improve the rocking curve, and to create CCD images of the crystal. This paper describes the topography process and offers methods for reducing crystal stresses in order to substantially improve the crystal optics.

  9. Analysis and implementation of an improved recycling folded cascode amplifier

    Institute of Scientific and Technical Information of China (English)

    李一雷; 韩科峰; 闫娜; 谈熙; 闵昊

    2012-01-01

    A generally improved recycling folded cascode (IRFC) is analyzed and implemented. Analysis and comparisons among the IRFC, the original recycling folded cascode (RFC) and the conventional folded cascode (FC) are made, and it is shown that with the flexible structure of the IRFC, significant enhancement in transconductance, slew rate and noise can be achieved. Prototype amplifiers were fabricated in 0.13 μm technology. Measurement shows that the IRFC has 3× enhancement in gain-bandwidth and slew rate over the conventional FC, and the enhancement is 1.5× when compared with the RFC.

  10. An improved convergence analysis of smoothed aggregation algebraic multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Brezina, Marian [Univ. of Colorado, Boulder, CO (United States). Dept. of Applied Mathematics; Vaněk, Petr [University of West Bohemia (Czech Republic). Dept. of Mathematics; Vassilevski, Panayot S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing

    2011-03-02

    We present an improved analysis of the smoothed aggregation (SA) algebraic multigrid method (AMG), extending the original proof in [SA] and its modification in [Va08]. The new result imposes fewer restrictions on the aggregates, which makes it easier to verify in practice. Also, we extend a result in [Van] that allows us to use aggressive coarsening at all levels, due to the special properties of the polynomial smoother that we use and analyze, and thus provide a multilevel convergence estimate with bounds independent of the coarsening ratio.

  11. An Efficient and Configurable Preprocessing Algorithm to Improve Stability Analysis.

    Science.gov (United States)

    Sesia, Ilaria; Cantoni, Elena; Cernigliaro, Alice; Signorile, Giovanna; Fantino, Gianluca; Tavella, Patrizia

    2016-04-01

    The Allan variance (AVAR) is widely used to measure the stability of experimental time series. Specifically, AVAR is commonly used in space applications such as monitoring the clocks of the global navigation satellite systems (GNSSs). In these applications, the experimental data present some peculiar aspects which are not generally encountered when the measurements are carried out in a laboratory. Space clocks' data can in fact present outliers, jumps, and missing values, which corrupt the clock characterization. Therefore, an efficient preprocessing is fundamental to ensure a proper data analysis and improve the stability estimation performed with the AVAR or other similar variances. In this work, we propose a preprocessing algorithm and its implementation in a robust software code (in MATLAB language) able to deal with time series of experimental data affected by nonstationarities and missing data; our method is properly detecting and removing anomalous behaviors, hence making the subsequent stability analysis more reliable.
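
    For reference, the statistic whose estimation the preprocessing protects is straightforward to compute once the data are clean. A sketch of the overlapping Allan variance from phase (time-error) samples, using toy random-walk data rather than real GNSS clock measurements; the paper's preprocessing algorithm itself is not reproduced:

    ```python
    import numpy as np

    def avar(phase, tau0, m):
        """Overlapping Allan variance at averaging time tau = m*tau0:
        AVAR(tau) = < (x_{i+2m} - 2*x_{i+m} + x_i)^2 > / (2*tau^2)."""
        x = np.asarray(phase, float)
        d2 = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]   # second differences at lag m
        return np.mean(d2 ** 2) / (2.0 * (m * tau0) ** 2)

    rng = np.random.default_rng(3)
    x = np.cumsum(rng.normal(0, 1e-9, 10000))       # toy random-walk phase data
    for m in (1, 10, 100):
        print(m, avar(x, tau0=1.0, m=m))
    # Outliers, jumps, or gaps in x would corrupt these estimates, which is
    # why the preprocessing step described above matters.
    ```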

  12. Multispectral fingerprinting for improved in vivo cell dynamics analysis

    Directory of Open Access Journals (Sweden)

    Cooper Cameron HJ

    2010-09-01

    Background. Tracing cell dynamics in the embryo becomes tremendously difficult when cell trajectories cross in space and time and tissue density obscures individual cell borders. Here, we used the chick neural crest (NC) as a model to test multicolor cell labeling and multispectral confocal imaging strategies to overcome these roadblocks. Results. We found that multicolor nuclear cell labeling and multispectral imaging led to improved resolution of in vivo NC cell identification by providing a unique spectral identity for each cell. NC cell spectral identity allowed for more accurate cell tracking and was consistent during short term time-lapse imaging sessions. Computer model simulations predicted significantly better object counting for increasing cell densities in 3-color compared to 1-color nuclear cell labeling. To better resolve cell contacts, we show that a combination of 2-color membrane and 1-color nuclear cell labeling dramatically improved the semi-automated analysis of NC cell interactions, yet preserved the ability to track cell movements. We also found channel versus lambda scanning of multicolor labeled embryos significantly reduced the time and effort of image acquisition and analysis of large 3D volume data sets. Conclusions. Our results reveal that multicolor cell labeling and multispectral imaging provide a cellular fingerprint that may uniquely determine a cell's position within the embryo. Together, these methods offer a spectral toolbox to resolve in vivo cell dynamics in unprecedented detail.

  13. ECONOMIC AND ENERGETICAL ANALYSIS OF IMPROVED WASTE UTILIZATION PLASMA TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Serghei VAMBOL

    2015-07-01

    Purpose. To carry out an energy and economic evaluation of the improved plasma waste utilization process, and to substantiate the expediency of using the improved plasma technology by comparing its energy consumption with other thermal utilization methods. Methodology. Analysis of existing modern and advanced methods of waste management and their impact on environmental safety; consideration of the energy and monetary costs of implementing two different waste management technologies. Results. Studies have shown that the product gas from regular gasification contains a significant amount of nitrogen, unlike that from plasma gasification, which affects its heating value. From the point of view of minimizing energy and monetary costs and ensuring environmental safety, the proposed improved plasma waste technology is more promising. To assess the energy appropriateness of the considered technologies, a comparative calculation was carried out at standard conditions. The processing of waste produces useful products suitable for sale, such as liquefied methane, synthetic gas (94% methane) and a fuel gas for heating, which ensures the cost-effectiveness of this technology. Originality. The ecological and economic efficiency of the proposed improved plasma waste utilization technology is shown and evaluated in comparison with other thermal techniques. Practical value. The energy and monetary costs of implementing two different waste management technologies, ordinary gasification and gasification using plasma generators, are considered and substantiated. The proposed plasma waste utilization technology makes it possible to obtain useful products, such as liquefied methane, synthetic gas and a fuel gas for heating, which are suitable for sale. A plant implementing the improved plasma waste utilization process can also compensate for daily and seasonal fluctuations in electricity and heat consumption by allowing storage of the obtained fuel products.

  14. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses, a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.
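    As a flavor of the convective diagnostics listed above, the sketch below computes one of them using the standard Espy approximation for the lifted condensation level; the record does not give INCA's actual formulation, so this is an assumed stand-in.

```python
# Espy approximation: LCL height grows ~125 m per kelvin of dew-point
# depression near the surface.
def lcl_height_m(t2m_c: float, dewpoint_c: float) -> float:
    """Approximate LCL height above ground (m) from 2 m temperature and
    dew point in deg C: z_LCL ~ 125 * (T - Td)."""
    return 125.0 * (t2m_c - dewpoint_c)

print(lcl_height_m(25.0, 15.0))  # ~1250 m for a 10 K dew-point depression
```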

  15. TENDENCY OF IMPROVEMENT ANALYSIS OF VENTURE ACTIVITY FOR MANAGEMENT DECISIONS

    Directory of Open Access Journals (Sweden)

    G.Yu. Iakovetс

    2015-03-01

    Full Text Available This paper addresses the current trends and prospects of venture financing for new innovative enterprises, one of the most effective alternative funding sources for an entity, albeit one with a high degree of risk. The features that distinguish venture financing from other sources of business financing are examined: returns on venture capital investments can greatly exceed the invested amounts, but at the same time the risks of such financing are significant, which makes it necessary to build an effective system for managing venture capital investment. The study also revealed problems in the analysis and minimization of risks in venture financing of innovative enterprises. Characterizing the analysis and risk assessment of venture financing helps to find ways to systematize, minimize, avoid, and prevent risks in venture capital investment. The study also identified the major areas in which the analysis of venture activity can be improved to support management decisions.

  16. Response surface analysis to improve dispersed crude oil biodegradation

    Energy Technology Data Exchange (ETDEWEB)

    Zahed, Mohammad A.; Aziz, Hamidi A.; Mohajeri, Leila [School of Civil Engineering, Universiti Sains Malaysia, Nibong Tebal, Penang (Malaysia); Isa, Mohamed H. [Civil Engineering Department, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia)

    2012-03-15

    In this research, the bioremediation of dispersed crude oil, based on the amount of nitrogen and phosphorus supplementation in a closed system, was optimized by the application of response surface methodology and central composite design. Correlation analysis of the mathematical-regression model demonstrated that a quadratic polynomial model could be used to optimize the hydrocarbon bioremediation (R{sup 2} = 0.9256). Statistical significance was checked by analysis of variance and residual analysis. Natural attenuation removed 22.1% of the crude oil in 28 days. The highest removal under un-optimized conditions, 68.1%, was observed using 20.00 mg/L nitrogen and 2.00 mg/L phosphorus in 28 days, while the optimized process achieved a crude oil removal of 69.5% with 16.05 mg/L nitrogen and 1.34 mg/L phosphorus in 27 days; optimization can therefore improve biodegradation in a shorter time with less nutrient consumption. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
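    The response-surface step can be sketched as follows: fit the quadratic polynomial used in central composite designs by least squares and solve for the stationary point. The design points below are synthetic placeholders (the paper's data are not reproduced), chosen only so the fitted optimum lands near the reported ~16 mg/L nitrogen and ~1.3 mg/L phosphorus.

```python
# Quadratic response surface z = b0 + b1*N + b2*P + b3*N^2 + b4*P^2 + b5*N*P,
# fitted by least squares; the optimum is the stationary point of the surface.
import numpy as np

# hypothetical CCD-style design: (nitrogen mg/L, phosphorus mg/L, removal %)
data = np.array([
    [5, 0.5, 40], [5, 2.0, 48], [20, 0.5, 55], [20, 2.0, 62],
    [12.5, 1.25, 68], [12.5, 1.25, 67], [25, 1.25, 60], [2, 1.25, 38],
    [12.5, 2.5, 58], [12.5, 0.2, 52],
])
N, P, y = data[:, 0], data[:, 1], data[:, 2]
X = np.column_stack([np.ones_like(N), N, P, N**2, P**2, N * P])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# stationary point: solve grad = 0, i.e. [2*b3, b5; b5, 2*b4] @ [N, P] = -[b1, b2]
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(H, -b[1:3])
print("optimum (N, P):", opt)
```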

  17. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    Full Text Available In credit card scoring and loan management, the prediction of an applicant's future behavior is an important decision-support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for credit scoring. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scorecards through employing textual data analysis. This study uses a sample of loan application forms from a financial institution providing loan services in Yemen, which represents a real-world situation of credit scoring and loan management. The sample contains a set of Arabic textual data attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the analysis of textual attributes achieves higher classification effectiveness and outperforms the traditional techniques based on numerical data alone.
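    A minimal sketch of the paper's idea: combine free-text fields with numeric attributes in one logistic-regression scorecard via TF-IDF features. The toy English records and column names below are hypothetical stand-ins for the Arabic application-form data.

```python
# Text + numeric scorecard: TF-IDF on the free-text field, passthrough for the
# numeric field, logistic regression on the combined features.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

df = pd.DataFrame({  # toy stand-in for loan application forms
    "purpose_text": ["buy new taxi car", "expand grocery shop",
                     "pay old debts", "wedding expenses", "new sewing machines"],
    "income": [900, 1500, 400, 600, 1100],
    "default": [0, 0, 1, 1, 0],
})
pre = ColumnTransformer([
    ("text", TfidfVectorizer(), "purpose_text"),   # textual attributes
    ("num", "passthrough", ["income"]),            # numeric attributes
])
model = Pipeline([("features", pre), ("clf", LogisticRegression())])
model.fit(df[["purpose_text", "income"]], df["default"])
print(model.predict_proba(df[["purpose_text", "income"]])[:, 1])
```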

  18. An Improved Analysis of Forest Carbon Dynamics using Data Assimilation

    Science.gov (United States)

    Williams, Mathew; Schwarz, Paul A.; Law, Beverly E.; Kurpius, Meredith R.

    2005-01-01

    There are two broad approaches to quantifying landscape C dynamics - by measuring changes in C stocks over time, or by measuring fluxes of C directly. However, these data may be patchy, and have gaps or biases. An alternative approach to generating C budgets has been to use process-based models, constructed to simulate the key processes involved in C exchange. However, the process of model building is arguably subjective, and parameters may be poorly defined. This paper demonstrates why data assimilation (DA) techniques - which combine stock and flux observations with a dynamic model - improve estimates of, and provide insights into, ecosystem carbon (C) exchanges. We use an ensemble Kalman filter (EnKF) to link a series of measurements with a simple box model of C transformations. Measurements were collected at a young ponderosa pine stand in central Oregon over a 3-year period, and include eddy flux and soil CO2 efflux data, litterfall collections, stem surveys, root and soil cores, and leaf area index data. The simple C model is a mass balance model with nine unknown parameters, tracking changes in C storage among five pools; foliar, wood and fine root pools in vegetation, and also fresh litter and soil organic matter (SOM) plus coarse woody debris pools. We nested the EnKF within an optimization routine to generate estimates from the data of the unknown parameters and the five initial conditions for the pools. The efficacy of the DA process can be judged by comparing the probability distributions of estimates produced with the EnKF analysis vs. those produced with reduced data or model alone. Using the model alone, estimated net ecosystem exchange of C (NEE) = -251 ± 197 g C m-2 over the 3 years, compared with an estimate of -419 ± 29 g C m-2 when all observations were assimilated into the model. The uncertainty on daily measurements of NEE via eddy fluxes was estimated at 0.5 g C m-2 day-1, but the uncertainty on assimilated estimates averaged 0.47 g C m-2 day-1, and
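    For readers unfamiliar with the EnKF, here is a minimal analysis step of the kind the paper nests inside its optimization routine (a generic sketch, not the authors' nine-parameter carbon model; the five-pool state and total-C observation below are illustrative):

```python
# Ensemble Kalman filter update: each member is nudged toward (perturbed)
# observations in proportion to the ensemble covariance.
import numpy as np

def enkf_update(ensemble, H, obs, obs_var, rng):
    """ensemble: (n_state, n_members); H: (n_obs, n_state) observation
    operator; obs: (n_obs,) observations with variance obs_var."""
    n_obs, n_mem = len(obs), ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # anomalies
    P = A @ A.T / (n_mem - 1)                             # sample covariance
    R = obs_var * np.eye(n_obs)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)          # Kalman gain
    # perturbed observations keep the analysis ensemble spread consistent
    obs_pert = obs[:, None] + rng.normal(0, np.sqrt(obs_var), (n_obs, n_mem))
    return ensemble + K @ (obs_pert - H @ ensemble)

rng = np.random.default_rng(1)
ens = rng.normal(10, 2, (5, 50))       # e.g. five C pools, 50 members
H = np.array([[1, 1, 1, 1, 1.0]])      # observe total C (illustrative)
print(enkf_update(ens, H, np.array([48.0]), 4.0, rng).mean(axis=1))
```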

  19. A improved method for the analysis of alpha spectra

    International Nuclear Information System (INIS)

    In this work we describe a methodology, developed over the last few years, for the analysis of spectra of alpha emitters obtained with ion-implanted detectors, which addresses some of the problems that this type of spectrum presents. The methodology is improved with respect to that described in a previous publication. The method is based on the application of a mathematical function that models the tail of an alpha peak, in order to evaluate the part of a peak that is hidden in cases of partial superposition with another peak. A calculation program that works semiautomatically, with the possibility of interactive intervention by the analyst, has been developed alongside it and is described in detail. (author)
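    The record does not give the tail function itself, so the sketch below uses a common assumed choice for alpha peaks, a Gaussian convolved with a low-energy exponential tail (a mirrored exponentially modified Gaussian), fitted with scipy to a synthetic spectrum:

```python
# Left-tailed EMG peak shape: Gaussian of width sigma with an exponential
# tail (decay constant tau) toward low energies, fitted by least squares.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

def alpha_peak(E, area, mu, sigma, tau):
    z = (E - mu) / (np.sqrt(2) * sigma) + sigma / (np.sqrt(2) * tau)
    return (area / (2 * tau)) * np.exp((E - mu) / tau
                                       + sigma**2 / (2 * tau**2)) * erfc(z)

E = np.linspace(5100, 5300, 400)                    # keV, synthetic spectrum
true = alpha_peak(E, 1e4, 5245, 8, 25)
counts = np.random.default_rng(0).poisson(true + 1)
popt, _ = curve_fit(alpha_peak, E, counts, p0=(1e4, 5240, 10, 20))
print("fitted mu, sigma, tau:", popt[1:])
```

    Once the tail parameters are fitted on the visible side of a peak, the model can be extrapolated under a neighboring peak to recover the hidden counts.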

  20. Improved Analysis for Graphic TSP Approximation via Matchings

    CERN Document Server

    Mucha, Marcin

    2011-01-01

    The Travelling Salesman Problem is one of the most fundamental and most studied problems in approximation algorithms. For more than 30 years, the best algorithm known for general metrics has been Christofides's algorithm with an approximation factor of 3/2, even though the so-called Held-Karp LP relaxation of the problem is conjectured to have an integrality gap of only 4/3. Very recently, significant progress has been made for the important special case of graphic metrics, first by Oveis Gharan et al., and then by Momke and Svensson. In this paper, we provide an improved analysis for the approach introduced by Momke and Svensson yielding a bound of 35/24 on the approximation factor, as well as a bound of 19/12+epsilon for any epsilon>0 for the more general Travelling Salesman Path Problem in graphic metrics.

  1. Improving Semantic Search in Digital Libraries Using Multimedia Analysis

    Directory of Open Access Journals (Sweden)

    Ilianna Kollia

    2012-04-01

    Full Text Available Semantic search of cultural content is of major importance in current digital libraries, such as Europeana. Content metadata constitute the main features of cultural items that are analysed, mapped and used to interpret users' queries, so that the most appropriate content is selected and presented to the users. Multimedia analysis, especially visual analysis, has not been a main component in these developments. This paper presents a new semantic search methodology, including a query answering mechanism which meets the semantics of users' queries and enriches the answers by exploiting appropriate visual features, both local and MPEG-7, through an interwoven knowledge- and machine-learning-based approach. An experimental study is presented, using content from the Europeana digital library and involving both thematic knowledge and visual features extracted from Europeana images, illustrating the improved performance of the proposed semantic search approach.

  2. Improving knowledge management systems with latent semantic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sebok, A.; Plott, C. [Alion Science and Technology, MA and D Operation, 4949 Pearl East Circle, Boulder, CO 80301 (United States); LaVoie, N. [Pearson Knowledge Technologies, 4940 Pearl East Circle, Boulder, CO 80301 (United States)

    2006-07-01

    Latent Semantic Analysis (LSA) offers a technique for improving lessons learned and knowledge management systems. These systems are expected to become more widely used in the nuclear industry, as experienced personnel leave and are replaced by younger, less-experienced workers. LSA is a machine learning technology that allows searching of text based on meaning rather than predefined keywords or categories. Users can enter and retrieve data using their own words, rather than relying on constrained language lists or navigating an artificially structured database. LSA-based tools can greatly enhance the usability and usefulness of knowledge management systems and thus provide a valuable tool to assist nuclear industry personnel in gathering and transferring worker expertise. (authors)
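    A generic sketch of the LSA technique the record describes (not the vendors' product): an SVD of the TF-IDF matrix lets a query match documents by meaning rather than by exact keywords. The toy lessons-learned entries are invented.

```python
# LSA: TF-IDF followed by truncated SVD; queries and documents are compared
# in the reduced "concept" space, so shared vocabulary is not required.
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.pipeline import make_pipeline

docs = [  # toy lessons-learned entries
    "pump seal failure caused coolant leak during maintenance",
    "operator error while replacing the coolant pump seal",
    "fuel handling procedure revised after crane inspection",
]
lsa = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2))
doc_vecs = lsa.fit_transform(docs)
query_vec = lsa.transform(["leaking pump"])
print(cosine_similarity(query_vec, doc_vecs))  # ranks related lessons first
```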

  3. Benchmarking Of Improved DPAC Transient Deflagration Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Laurinat, James E.; Hensel, Steve J.

    2013-03-21

    The transient deflagration code DPAC (Deflagration Pressure Analysis Code) has been upgraded for use in modeling hydrogen deflagration transients. The upgraded code is benchmarked using data from vented hydrogen deflagration tests conducted at the HYDRO-SC Test Facility at the University of Pisa. DPAC originally was written to calculate peak deflagration pressures for deflagrations in radioactive waste storage tanks and process facilities at the Savannah River Site. Upgrades include the addition of a laminar flame speed correlation for hydrogen deflagrations and a mechanistic model for turbulent flame propagation, incorporation of inertial effects during venting, and inclusion of the effect of water vapor condensation on vessel walls. In addition, DPAC has been coupled with CEA, a NASA combustion chemistry code. The deflagration tests are modeled as end-to-end deflagrations. The improved DPAC code successfully predicts both the peak pressures during the deflagration tests and the times at which the pressure peaks.

  4. Improving knowledge management systems with latent semantic analysis

    International Nuclear Information System (INIS)

    Latent Semantic Analysis (LSA) offers a technique for improving lessons learned and knowledge management systems. These systems are expected to become more widely used in the nuclear industry, as experienced personnel leave and are replaced by younger, less-experienced workers. LSA is a machine learning technology that allows searching of text based on meaning rather than predefined keywords or categories. Users can enter and retrieve data using their own words, rather than relying on constrained language lists or navigating an artificially structured database. LSA-based tools can greatly enhance the usability and usefulness of knowledge management systems and thus provide a valuable tool to assist nuclear industry personnel in gathering and transferring worker expertise. (authors)

  5. Improved iterative error analysis for endmember extraction from hyperspectral imagery

    Science.gov (United States)

    Sun, Lixin; Zhang, Ying; Guindon, Bert

    2008-08-01

    Automated image endmember extraction from hyperspectral imagery is a challenge and a critical step in spectral mixture analysis (SMA). Over the past years, great efforts have been made and a large number of algorithms have been proposed to address this issue. Iterative error analysis (IEA) is one of the well-known existing endmember extraction methods. IEA identifies pixel spectra as image endmembers through an iterative process. In each iteration, a fully constrained (abundance nonnegativity and abundance sum-to-one constraints) spectral unmixing based on the previously identified endmembers is performed to model all image pixels. The pixel spectrum with the largest residual error is then selected as a new image endmember. This paper proposes an updated version of IEA by making improvements on three aspects of the method. First, fully constrained spectral unmixing is replaced by a weakly constrained (abundance nonnegativity and abundance sum-less-or-equal-to-one constraints) alternative. This is necessary because only a subset of the endmembers present in a hyperspectral image has been extracted by any intermediate iteration, so the abundance sum-to-one constraint is not yet valid. Second, the search strategy for achieving an optimal set of image endmembers is changed from sequential forward selection (SFS) to sequential forward floating selection (SFFS) to reduce the so-called "nesting effect" in the resultant set of endmembers. Third, a pixel spectrum is identified as a new image endmember depending on both its spectral extremity in the feature hyperspace of the dataset and its capacity to characterize other mixed pixels. This is achieved by evaluating the set of extracted endmembers using a criterion function, which consists of the mean and standard deviation of the residual error image. A preliminary comparison between the image endmembers extracted using the improved and original IEA is conducted based on an airborne visible infrared imaging
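    A sketch of the IEA loop under stated simplifications: the sum-less-or-equal-to-one constraint is imposed through a slack column plus a heavily weighted NNLS row, pixels are scanned by brute force, and the paper's SFFS search and endmember-evaluation criterion are omitted.

```python
# Weakly constrained unmixing (a >= 0, sum(a) <= 1) via NNLS with a slack
# abundance and a penalty row enforcing sum(a) + slack = 1, inside the basic
# "add the worst-modelled pixel" IEA iteration.
import numpy as np
from scipy.optimize import nnls

def unmix_leq1(E, x, delta=100.0):
    k = E.shape[1]
    A = np.vstack([np.hstack([E, np.zeros((E.shape[0], 1))]),
                   delta * np.ones((1, k + 1))])
    b = np.concatenate([x, [delta]])
    a, _ = nnls(A, b)
    return a[:k]

def iea(pixels, n_endmembers):
    """pixels: (n_pixels, n_bands). Start from the pixel farthest from the
    mean, then repeatedly add the pixel with the largest residual error."""
    resid = pixels - pixels.mean(axis=0)
    members = [int(np.argmax(np.linalg.norm(resid, axis=1)))]
    for _ in range(n_endmembers - 1):
        E = pixels[members].T
        errors = [np.linalg.norm(x - E @ unmix_leq1(E, x)) for x in pixels]
        members.append(int(np.argmax(errors)))
    return members

data = np.random.default_rng(2).random((200, 10))  # synthetic image pixels
print(iea(data, 3))
```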

  6. Voxel model in BNCT treatment planning: performance analysis and improvements

    Science.gov (United States)

    González, Sara J.; Carando, Daniel G.; Santa Cruz, Gustavo A.; Zamenhof, Robert G.

    2005-02-01

    In recent years, many efforts have been made to study the performance of treatment planning systems in deriving an accurate dosimetry of the complex radiation fields involved in boron neutron capture therapy (BNCT). The computational model of the patient's anatomy is one of the main factors involved in this subject. This work presents a detailed analysis of the performance of the 1 cm based voxel reconstruction approach. First, a new and improved material assignment algorithm implemented in NCTPlan treatment planning system for BNCT is described. Based on previous works, the performances of the 1 cm based voxel methods used in the MacNCTPlan and NCTPlan treatment planning systems are compared by standard simulation tests. In addition, the NCTPlan voxel model is benchmarked against in-phantom physical dosimetry of the RA-6 reactor of Argentina. This investigation shows the 1 cm resolution to be accurate enough for all reported tests, even in the extreme cases such as a parallelepiped phantom irradiated through one of its sharp edges. This accuracy can be degraded at very shallow depths in which, to improve the estimates, the anatomy images need to be positioned in a suitable way. Rules for this positioning are presented. The skin is considered one of the organs at risk in all BNCT treatments and, in the particular case of cutaneous melanoma of extremities, limits the delivered dose to the patient. Therefore, the performance of the voxel technique is deeply analysed in these shallow regions. A theoretical analysis is carried out to assess the distortion caused by homogenization and material percentage rounding processes. Then, a new strategy for the treatment of surface voxels is proposed and tested using two different irradiation problems. For a parallelepiped phantom perpendicularly irradiated with a 5 keV neutron source, the large thermal neutron fluence deviation present at shallow depths (from 54% at 0 mm depth to 5% at 4 mm depth) is reduced to 2% on average

  7. Life Cycle Exergy Analysis of Wind Energy Systems : Assessing and improving life cycle analysis methodology

    OpenAIRE

    Davidsson, Simon

    2011-01-01

    Wind power capacity is currently growing fast around the world. At the same time, different forms of life cycle analysis are becoming common for measuring the environmental impact of wind energy systems. This thesis identifies several problems with current methods for assessing the environmental impact of wind energy and suggests improvements that will make these assessments more robust. The use of the exergy concept combined with life cycle analysis has been proposed by several researchers ov...

  8. Improved analysis of solar signals for differential reflectivity monitoring

    Science.gov (United States)

    Huuskonen, Asko; Kurri, Mikko; Holleman, Iwan

    2016-07-01

    The method for the daily monitoring of the differential reflectivity bias for polarimetric weather radars is developed further. Improved quality control is applied to the solar signals detected during the operational scanning of the radar, which efficiently removes rain and clutter-contaminated gates occurring in the solar hits. The simultaneous reflectivity data are used as a proxy to determine which data points are to be removed. A number of analysis methods to determine the differential reflectivity bias are compared, and methods based on surface fitting are found superior to simple averaging. A separate fit to the reflectivity of the horizontal and vertical polarization channels is recommended because of stability. Separate fitting also provides, in addition to the differential reflectivity bias, the pointing difference of the polarization channels. Data from the Finnish weather radar network show that the pointing difference is less than 0.02° and that the differential reflectivity bias is stable and determined to better than 0.04 dB. The results are compared to those from measurements at vertical incidence, which allows us to determine the total differential reflectivity bias including the differential receiver bias and the transmitter bias.

  9. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  10. Improving diagnostic criteria for Propionibacterium acnes osteomyelitis: a retrospective analysis.

    Science.gov (United States)

    Asseray, Nathalie; Papin, Christophe; Touchais, Sophie; Bemer, Pascale; Lambert, Chantal; Boutoille, David; Tequi, Brigitte; Gouin, François; Raffi, François; Passuti, Norbert; Potel, Gilles

    2010-07-01

    The identification of Propionibacterium acnes in cultures of bone and joint samples is always difficult to interpret because of the ubiquity of this microorganism. The aim of this study was to propose a diagnostic strategy to distinguish infections from contaminations. This was a retrospective analysis of the charts of all patients with ≥1 deep sample culture-positive for P. acnes. Every criterion was tested for sensitivity, specificity, and positive likelihood ratio, and then the diagnostic probability of combinations of criteria was calculated. Among 65 patients, 52 (80%) were considered truly infected with P. acnes, a diagnosis based on a multidisciplinary process. The most valuable diagnostic criteria were: ≥2 positive deep samples, peri-operative findings (necrosis, hardware loosening, etc.), and ≥2 surgical procedures. However, no single criterion was sufficient to ascertain the diagnosis. The following combinations of criteria had a diagnostic probability of >90%: ≥2 positive cultures + 1 criterion among: peri-operative findings, local signs of infection, ≥2 previous operations, orthopaedic devices; 1 positive culture + 3 criteria among: peri-operative findings, local signs of infection, ≥2 previous surgical operations, orthopaedic devices, inflammatory syndrome. The diagnosis of P. acnes osteomyelitis was greatly improved by combining different criteria, allowing differentiation between infection and contamination.
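    The per-criterion statistics reported above follow from a standard 2x2 table; the helper below shows the computation. The cell counts are illustrative, not the paper's data (only the 52/13 infected/contaminated split comes from the abstract).

```python
# Sensitivity, specificity and positive likelihood ratio from a 2x2 table.
def criterion_stats(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)  # positive likelihood ratio
    return sens, spec, lr_pos

# hypothetical counts for one criterion among 52 infected / 13 contaminated
print(criterion_stats(tp=40, fp=2, fn=12, tn=11))
```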

  11. Process Correlation Analysis Model for Process Improvement Identification

    OpenAIRE

    Su-jin Choi; Dae-Kyoo Kim; Sooyong Park

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practice...

  12. Analysis of the Improvement Methods for Equipment Maintenance Support

    Institute of Scientific and Technical Information of China (English)

    ZHANG Rui-chang; ZHAO Song-zheng

    2005-01-01

    According to military requirements, and based on the problems of equipment maintenance support methods in high-tech battles, each element supporting equipment maintenance is analyzed, and methods for improving equipment maintenance are proposed.

  13. Improve the Method for Requirements Analysis on Commercial Information System

    OpenAIRE

    Peng, Chen

    2005-01-01

    This thesis states the tasks of the analyst: communicating with commercial customers to establish their requirements, and reframing those requirements through negotiation so that programmers can understand them and write code efficiently. Soft System Methodology (SSM) is an effective approach to identifying the problem situation. In my thesis, I will improve a business-oriented method called Process Improvement for Strategic Objectives (PISO) with SSM to make PISO have more ef...

  14. Gap Analysis Approach for Construction Safety Program Improvement

    OpenAIRE

    Thanet Aksorn; B.H.W. Hadikusumo

    2007-01-01

    To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual...

  15. New Framework for Improving Big Data Analysis Using Mobile Agent

    OpenAIRE

    Youssef M. ESSA; Gamal ATTIYA; El-Sayed, Ayman

    2014-01-01

    The rising number of applications serving millions of users and dealing with terabytes of data creates the need for faster processing paradigms. Recently, there has been growing enthusiasm for the notion of big data analysis. Big data analysis has become a very important aspect of growth in productivity, reliability and quality of service (QoS). Processing big data using a single powerful machine is not an efficient solution. Therefore, companies have focused on using Hadoop software for big data analysis. This is because Hadoop de...

  16. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In such cases, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt changes of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  17. Why Economic Analysis of Health System Improvement Interventions Matters

    Science.gov (United States)

    Broughton, Edward Ivor; Marquez, Lani

    2016-01-01

    There is little evidence to direct health systems toward providing efficient interventions to address medical errors, defined as an unintended act of omission or commission or one not executed as intended that may or may not cause harm to the patient but does not achieve its intended outcome. We believe that lack of guidance on what is the most efficient way to reduce medical errors and improve the quality of health-care limits the scale-up of health system improvement interventions. Challenges to economic evaluation of these interventions include defining and implementing improvement interventions in different settings with high fidelity, capturing all of the positive and negative effects of the intervention, using process measures of effectiveness rather than health outcomes, and determining the full cost of the intervention and all economic consequences of its effects. However, health system improvement interventions should be treated similarly to individual medical interventions and undergo rigorous economic evaluation to provide actionable evidence to guide policy-makers in decisions of resource allocation for improvement activities among other competing demands for health-care resources.

  18. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    Directory of Open Access Journals (Sweden)

    ZHANG Shutuan

    2013-10-01

    Full Text Available In order to realize accurate testing of the distortion parameters of an aircraft power supply system, and to satisfy the requirements of the corresponding equipment in the aircraft, a novel power-parameter test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system is portable and performs high-speed data acquisition and processing; the software is developed in LabWindows/CVI and adopts a pre-processing technique together with the improved filtering algorithm. Compared with the traditional filtering algorithm, the improved filtering algorithm increases the test accuracy. Application shows that the test system with the improved filtering algorithm achieves accurate test results and meets the design requirements.

  19. Analysis and Improvement of Low Rank Representation for Subspace segmentation

    CERN Document Server

    Siming, Wei

    2011-01-01

    We analyze and improve low rank representation (LRR), the state-of-the-art algorithm for subspace segmentation of data. We prove that for the noiseless case, the optimization model of LRR has a unique solution, which is the shape interaction matrix (SIM) of the data matrix. So in essence LRR is equivalent to factorization methods. We also prove that the minimum value of the optimization model of LRR is equal to the rank of the data matrix. For the noisy case, we show that LRR can be approximated as a factorization method that combines noise removal by column sparse robust PCA. We further propose an improved version of LRR, called Robust Shape Interaction (RSI), which uses the corrected data as the dictionary instead of the noisy data. RSI is more robust than LRR when the corruption in data is heavy. Experiments on both synthetic and real data testify to the improved robustness of RSI.

  20. Improved Analysis of Kannan's Shortest Lattice Vector Algorithm

    CERN Document Server

    Hanrot, Guillaume

    2007-01-01

    The security of lattice-based cryptosystems such as NTRU, GGH and Ajtai-Dwork essentially relies upon the intractability of computing a shortest non-zero lattice vector and a closest lattice vector to a given target vector in high dimensions. The best algorithms for these tasks are due to Kannan, and, though remarkably simple, their complexity estimates have not been improved for more than twenty years. Kannan's algorithm for solving the shortest vector problem is in particular crucial in Schnorr's celebrated block reduction algorithm, on which are based the best known attacks against the lattice-based encryption schemes mentioned above. Understanding Kannan's algorithm precisely is of prime importance for providing meaningful key sizes. In this paper we improve the complexity analyses of Kannan's algorithms and discuss the possibility of improving the underlying enumeration strategy.

  1. Does Competition Improve Public School Efficiency? A Spatial Analysis

    Science.gov (United States)

    Misra, Kaustav; Grimes, Paul W.; Rogers, Kevin E.

    2012-01-01

    Advocates for educational reform frequently call for policies to increase competition between schools because it is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. Researchers examining this issue are confronted with difficulties in defining reasonable measures…

  2. An Improved Adaptive Routing Algorithm Based on Link Analysis

    Directory of Open Access Journals (Sweden)

    Jian Wang

    2012-01-01

    Full Text Available DHT (Distributed Hash Table) technology has been applied to structured P2P systems to achieve efficient information retrieval and positioning. KAD is a large-scale network protocol deployed in practice, based on the XOR metric, which uses DHT technology to improve network scalability without a central server. However, increasing numbers of malicious nodes pollute routing tables and seriously reduce query performance. Thus, an improved adaptive algorithm based on social network analysis is proposed to improve the routing table update procedure. Firstly, the data structure of the routing table is adjusted to store values of centrality and prestige. Secondly, requesting nodes can adaptively select nodes to send messages to. Then, when a find process terminates, the node calculates the two values for all participating nodes using the corresponding centrality and prestige algorithms based on the XOR metric. Finally, the node updates the routing table depending on the above results. The algorithm was implemented in an open-source project named LibTorrent to test its effectiveness. An experiment lasting a month verified the change in the search success ratio in a KAD network with about 30% malicious nodes. The results show that the optimized adaptive routing algorithm can effectively resist attacks on the routing table and improve the search success ratio of the node. Moreover, this lightweight algorithm is suitable for deployment in practice without extra network burden.
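    For orientation, here is the XOR distance that underlies KAD routing, plus a toy contact score mixing it with the stored centrality and prestige values; the weighting scheme is a hypothetical illustration, not the paper's algorithm.

```python
# Kademlia-style XOR metric plus a made-up reputation-aware contact score.
def xor_distance(node_a: int, node_b: int) -> int:
    """KAD/Kademlia distance between two node IDs."""
    return node_a ^ node_b

def contact_score(my_id, contact_id, centrality, prestige, alpha=0.5):
    # smaller XOR distance and larger reputation -> better contact
    d = xor_distance(my_id, contact_id).bit_length()  # ~log2 of distance
    return alpha * (centrality + prestige) - (1 - alpha) * d

contacts = {0b1011_0010: (0.7, 0.9), 0b1010_1111: (0.2, 0.3)}
me = 0b1011_0110
best = max(contacts, key=lambda c: contact_score(me, c, *contacts[c]))
print(bin(best))
```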

  3. Analysis of Strategies to Improve Heliostat Tracking at Solar Two

    Energy Technology Data Exchange (ETDEWEB)

    Jones, S.A.; Stone, K.W.

    1999-01-14

    This paper investigates different strategies that can be used to improve the tracking accuracy of heliostats at Solar Two. The different strategies are analyzed using a geometrical error model to determine their performance over the course of a day. By using the performance of heliostats in representative locations of the field and on representative days of the year, an estimate of the annual performance of each strategy is presented.

  4. Analysis of strategies to improve heliostat tracking at Solar Two

    Energy Technology Data Exchange (ETDEWEB)

    Jones, S.A.; Stone, K.W.

    1999-07-01

    This paper investigates different strategies that can be used to improve the tracking accuracy of heliostats at Solar Two. The different strategies are analyzed using a geometrical error model to determine their performance over the course of a day. By using the performance of heliostats in representative locations of the field and on representative days of the year, an estimate of the annual performance of each strategy is presented.

  5. Joint regression analysis and AMMI model applied to oat improvement

    Science.gov (United States)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board: a sample of 22 different genotypes grown in six locations during the years 2002, 2003 and 2004. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in the R software for the AMMI model analysis.
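    The AMMI decomposition itself is compact: additive main effects plus an SVD of the genotype-by-environment interaction residuals. The sketch below uses a random yield table in place of the oat data, which are not reproduced here.

```python
# AMMI: grand mean + genotype and environment main effects + the first k
# multiplicative (IPCA) terms of the interaction residual matrix.
import numpy as np

rng = np.random.default_rng(3)
Y = rng.normal(50, 5, (22, 6))            # 22 genotypes x 6 locations

grand = Y.mean()
g_eff = Y.mean(axis=1) - grand            # genotype main effects
e_eff = Y.mean(axis=0) - grand            # environment main effects
resid = Y - grand - g_eff[:, None] - e_eff[None, :]

U, s, Vt = np.linalg.svd(resid, full_matrices=False)
k = 2                                     # retain first two IPCA axes
ammi = (grand + g_eff[:, None] + e_eff[None, :]
        + (U[:, :k] * s[:k]) @ Vt[:k, :])
print("interaction variance explained:", (s[:k]**2).sum() / (s**2).sum())
```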

  6. Improvement of DYANA. The dynamic analysis program for event transition

    International Nuclear Information System (INIS)

    In probabilistic safety assessment (PSA), the fault tree/event tree technique has been widely used to evaluate accident sequence frequencies. However, the event transitions that operators actually face cannot be treated dynamically by the conventional technique. Therefore, we have developed the dynamic analysis program for event transition (DYANA) for a liquid metal cooled fast breeder reactor. In the previous development phase, we built the basic analysis model, but calculation times were too long. In the current phase, we parallelized DYANA using MPI and obtained good performance on a workstation cluster, close to the ideal speed-up. (author)

  7. Improving Family Forest Knowledge Transfer through Social Network Analysis

    Science.gov (United States)

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  8. Stiffness Analysis and Improvement of Bolt-Plate Contact Assemblies

    DEFF Research Database (Denmark)

    Pedersen, Niels Leergaard; Pedersen, Pauli

    2008-01-01

    of stiffnesses is extended to include different material parameters by including the influence of Poisson's ratio. Two simple practical formulas are suggested and their accuracies are documented for different bolts and different material (Poisson's ratio). Secondly, the contact analysis between the bolt head...

  9. Some Ideas to Improve Pyroclast Density and Vesicularity Data Analysis

    Science.gov (United States)

    Bernard, B.; Kueppers, U.; Ortiz, H. D.

    2014-12-01

    Pyroclast density and vesicularity are critical parameters in physical volcanology used to reconstruct eruptive dynamics and feed numerical models. Pyroclastic deposits typically present a wide range of density and vesicularity values, so measurements must be repeated tens of times. These data are generally treated using classical statistical analysis including averages and frequency histograms. One issue in this approach is that density and vesicularity are intensive properties and therefore they cannot be added or averaged directly. We encourage the use of weighted density and vesicularity averages and histograms, which is, until now, done only in few studies. In order to insure an adequate and efficient use of the weighting equations, we introduce an open-source R code to calculate the most common statistical parameters such as range and weighted averages, and produce abundance histograms. An important question when working with statistics is whether or not the sample size is large enough. To address this matter we also included a stability analysis based on a Monte Carlo approach which enables to quantify the reliability of the results. To illustrate this methodology we chose two large datasets from Chachimbiro (Ecuador) and Unzen (Japan) volcanoes. Our first results indicate that the use of weighted analysis instead of frequency analysis can change the density and vesicularity averages up to 4% and the shape of the abundance histogram leading to different interpretations. The stability analysis reveals that the number of measurements required for reliable results depends greatly on the distribution of density and vesicularity values. Therefore the number of measurements must be fixed on an ipso facto basis using a large sample size at the beginning and reducing it to achieve time efficiency.

  10. Analysis and improvement of vehicle information sharing networks

    Science.gov (United States)

    Gong, Hang; He, Kun; Qu, Yingchun; Wang, Pu

    2016-06-01

    Based on large-scale mobile phone data, mobility demand was estimated and the locations of vehicles were inferred for the Boston area. Using the spatial distribution of vehicles, we analyze the vehicle information sharing network generated by vehicle-to-vehicle (V2V) communications. Although a giant vehicle cluster is observed, the coverage and the efficiency of the information sharing network remain limited. Consequently, we propose a method to extend the information sharing network's coverage by adding long-range connections between targeted vehicle clusters. Furthermore, we employ the optimal design strategy discovered for the square lattice to improve the efficiency of the vehicle information sharing network.
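    A sketch of the kind of analysis described, using networkx on synthetic vehicle positions: build the V2V contact graph from a communication radius, measure the giant cluster, then bridge clusters with an added long-range connection. Radius and positions are illustrative assumptions.

```python
# V2V contact graph from pairwise proximity; coverage = giant-cluster share.
import itertools
import networkx as nx
import numpy as np

rng = np.random.default_rng(5)
pos = rng.random((300, 2))                       # vehicle locations
G = nx.Graph()
G.add_nodes_from(range(len(pos)))
for i, j in itertools.combinations(range(len(pos)), 2):
    if np.linalg.norm(pos[i] - pos[j]) < 0.05:   # V2V radio range
        G.add_edge(i, j)

clusters = sorted(nx.connected_components(G), key=len, reverse=True)
print("giant cluster coverage:", len(clusters[0]) / len(pos))

# add a long-range connection between the two largest clusters
if len(clusters) > 1:
    G.add_edge(next(iter(clusters[0])), next(iter(clusters[1])))
    print("coverage after bridging:",
          len(max(nx.connected_components(G), key=len)) / len(pos))
```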

  11. Improved Conjunction Analysis via Collaborative Space Situational Awareness

    Science.gov (United States)

    Kelso, T.; Vallado, D.; Chan, J.; Buckwalter, B.

    With recent events such as the Chinese ASAT test in 2007 and the USA 193 intercept in 2008, many satellite operators are becoming increasingly aware of the potential threat to their satellites as the result of orbital debris or even other satellites. However, to be successful at conjunction monitoring and collision avoidance requires accurate orbital information for as many space objects (payloads, dead satellites, rocket bodies, and debris) as possible. Given the current capabilities of the US Space Surveillance Network (SSN), approximately 18,500 objects are now being tracked and orbital data (in the form of two-line element sets) is available to satellite operators for 11,750 of them (as of 2008 September 1). The capability to automatically process this orbital data to look for close conjunctions and provide that information to satellite operators via the Internet has been continuously available on CelesTrak, in the form of Satellite Orbital Conjunction Reports Assessing Threatening Encounters in Space (SOCRATES), since May 2004. Those reports are used by many operators as one way to keep apprised of these potential threats. However, the two-line element sets (TLEs) are generated using non-cooperative tracking via the SSN's network of radar and optical sensors. As a result, the relatively low accuracy of the data results in a large number of false alarms that satellite operators must routinely deal with. Yet, satellite operators typically perform orbit maintenance for their own satellites, using active ranging and GPS systems. These data are often an order of magnitude more accurate than those available using TLEs. When combined (in the form of ephemerides) with maneuver planning information, the ability to maintain predictive awareness increases significantly. And when satellite operators share this data, the improved space situational awareness, particularly in the crowded geosynchronous belt, can be dramatic and the number of false alarms can be reduced

  12. Modeling Analysis and Improvement of Power Loss in Microgrid

    Directory of Open Access Journals (Sweden)

    H. Lan

    2015-01-01

    Full Text Available The consumption of conventional energy sources and environmental concerns have resulted in rapid growth in the amount of renewable energy introduced to power systems. With the help of distributed generation (DG), improvements in power loss and voltage profile can be salient benefits. However, studies show that improper placement and sizing of the energy storage system (ESS) lead to undesired power loss and a risk to voltage stability, especially in the case of high renewable energy penetration. To address this problem, this paper sets up a microgrid based on the IEEE 34-bus distribution system which consists of a wind power generation system, a photovoltaic generation system, a diesel generation system, and an energy storage system, associated with various types of load. Furthermore, the particle swarm optimization (PSO) algorithm is proposed to minimize the power loss and improve the system voltage profiles by optimally managing the different sorts of distributed generation under consideration of the worst condition of renewable energy production. The established IEEE 34-bus system is adopted to perform case studies. The detailed simulation results for each case clearly demonstrate the necessity of optimal management of the system operation and the effectiveness of the proposed method.
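    A minimal PSO loop in the spirit of the paper; the IEEE 34-bus power-flow evaluation is replaced by a stand-in loss function of hypothetical ESS placement/size variables, so only the optimizer structure carries over.

```python
# Standard global-best PSO with inertia and cognitive/social terms.
import numpy as np

def loss(x):  # hypothetical power-loss surrogate; minimum near (12, 0.8)
    return (x[..., 0] - 12) ** 2 / 50 + (x[..., 1] - 0.8) ** 2

rng = np.random.default_rng(6)
n, dim, iters = 30, 2, 100
lo, hi = np.array([1, 0.0]), np.array([34, 2.0])   # bus index, ESS MW
x = rng.uniform(lo, hi, (n, dim))
v = np.zeros((n, dim))
pbest, pbest_f = x.copy(), loss(x)
gbest = pbest[pbest_f.argmin()]

for _ in range(iters):
    r1, r2 = rng.random((2, n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = loss(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()]

# a real study would round the bus index to an integer bus number
print("best (bus, MW):", gbest, "loss:", pbest_f.min())
```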

  13. Analysis Approach to Improve Star Rating Of Water Heater

    Directory of Open Access Journals (Sweden)

    Sujata Dabhade

    2014-07-01

    Full Text Available Electric water heaters, widely used all over the world, can be categorized into two types: instant water heaters and storage-type water heaters. For 6-litre water heaters, energy consumption is much higher for the storage type. As energy is an important factor in the economic development of a country, there is a need to save energy, which puts the focus on the efficiency of storage-type water heaters. In this work, an existing 6-litre model is converted from a 4-star to a 5-star rating through thermal analysis and improved insulation. After the theoretical calculation of the required glass wool thickness, the product was tested practically against BEE norms and achieved the results required for the 5-star rating. Finally, thermal analysis is performed for theoretical and practical verification of the product.

  14. Improved statistics for genome-wide interaction analysis.

    Science.gov (United States)

    Ueki, Masao; Cordell, Heather J

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new "joint effects" statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al

  15. Improving E-Business Design through Business Model Analysis

    OpenAIRE

    Ilayperuma, Tharaka

    2010-01-01

    To a rapidly increasing degree, traditional organizational structures evolve in large parts of the world towards online business using modern Information and Communication Technology (ICT) capabilities. For efficient applications of inter-organizational information systems, the alignment between business and ICT is a key factor. In this context, business analysis using business modelling can be regarded as a first step in designing economically sustainable e-business solutions. This thesis ex...

  16. Metadata-based analysis to improve clinical trial exchange

    OpenAIRE

    Luzi, Daniela; Ricci, Fabrizio L. (CNR-IRPPS); Serbanati, Luca D.; GreyNet, Grey Literature Network Service

    2006-01-01

    There are various, important information sources devoted to the diffusion of clinical trials, but they fail to achieve a complete coverage of clinical research. The demand for a mandatory public registration of clinical trials is emerging from different institutions, which are making efforts to develop common metadata schemas to both increase information exchange and make this information publicly available. The paper describes a metadata analysis of the various solutions of CT data represent...

  17. Improved corner detection by ultrasonic testing using phase analysis.

    Science.gov (United States)

    Broberg, Patrik; Runnemalm, Anna; Sjödahl, Mikael

    2013-02-01

    In ultrasonic testing, corners are used for sensitivity calibration in the form of notches, for measuring the sound velocity in the material, and as known reference points during testing. A 90° corner will always reflect incoming waves back in the opposite direction due to a double reflection and therefore gives a strong echo. This article presents a method for separating the echo of a corner from other echoes and finding the position of the corner more accurately. The method is based on analysing the phase of the reflected signal. The proposed method was tested on a steel calibration block, and the width of the indication was reduced by up to 50% compared to the amplitude signal. This results in a more accurate positioning of the corner. Using the phase instead of the amplitude will also improve the reliability, since reflections other than those from corners will disappear.
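    A generic illustration of phase-based processing of an ultrasonic A-scan (the article's exact criterion is not reproduced): the analytic signal from a Hilbert transform yields amplitude and instantaneous phase, and the phase can then be examined where the echo amplitude peaks.

```python
# Analytic signal of a synthetic echo: envelope via |hilbert|, instantaneous
# phase via the complex angle.
import numpy as np
from scipy.signal import hilbert

fs, f0 = 100e6, 5e6                      # sample rate, probe frequency (Hz)
t = np.arange(0, 20e-6, 1 / fs)
echo = np.exp(-((t - 8e-6) / 0.5e-6) ** 2) * np.cos(2 * np.pi * f0 * t)

analytic = hilbert(echo)
amplitude = np.abs(analytic)
phase = np.angle(analytic)               # instantaneous phase in radians

i = amplitude.argmax()
print("echo at", t[i] * 1e6, "us, phase", phase[i], "rad")
```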

  18. Analysis of constituents for phenotyping drought tolerance in crop improvement

    Directory of Open Access Journals (Sweden)

    Tim L Setter

    2012-06-01

    Full Text Available Investigators now have a wide range of analytical tools to use in measuring metabolites, proteins and transcripts in plant tissues. These tools have the potential to assist genetic studies that seek to phenotype genetic lines for heritable traits that contribute to drought tolerance. To be useful for crop breeding, hundreds or thousands of genetic lines must be assessed. This review considers the utility of assaying certain constituents with roles in drought tolerance for phenotyping genotypes. Abscisic acid (ABA), organic and inorganic osmolytes, compatible solutes, and LEA proteins are considered. Confounding effects that require appropriate tissue and timing specificity, and the need for high throughput and analytical cost efficiency, are discussed. With future advances in analytical methods and the value of analyzing constituents that provide information on the underlying mechanisms of drought tolerance, these approaches are expected to contribute to the development of crops with improved drought tolerance.

  19. An Improvement on STEM Method in Multi-Criteria Analysis

    Directory of Open Access Journals (Sweden)

    M. Izadikhah

    2012-06-01

    Full Text Available Multi-criteria decision making (MCDM) refers to making decisions in the presence of multiple and conflicting criteria. Multiobjective programming methods such as multiple objective linear programming (MOLP) are techniques used to solve such MCDM problems. One of the first interactive procedures to solve MOLP is the STEP method (STEM). In this paper we try to improve the STEM method by introducing a weight vector over the objectives, which emphasizes that more important objectives should end up closer to their ideal values. The presented method thereby tries to increase the satisfactoriness of the obtained solution. Finally, a numerical example is given to illustrate the new method and clarify the main results developed in this paper.

  20. An Effective Analysis of Weblog Files to Improve Website Performance

    Directory of Open Access Journals (Sweden)

    T.Revathi

    2012-02-01

    Full Text Available With the enormous growth of the web in terms of websites, the size of web usage data is also increasing gradually. This web usage data plays a vital role in the effective management of web sites. Web usage data is stored by the web server in a file called the weblog. In order to discover the knowledge required for improving the performance of websites, we need to apply the best preprocessing methodology to the server weblog file. Data preprocessing is a phase which automatically identifies meaningful patterns and user behavior. So far, analyzing weblog data has been a challenging task in the area of web usage mining. In this paper we propose an effective and enhanced data preprocessing methodology which produces efficient usage patterns and reduces the size of the weblog to 75-80% of its initial size. The experimental results are also shown in the following chapters.
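    Typical weblog preprocessing of the kind named above can be sketched in a few lines: parse the common log format and drop static resources, failed requests, and robot traffic. The regex and filtering rules are the usual ones, assumed rather than taken from the paper.

```python
# Parse common/combined log format and apply standard cleaning filters.
import re

LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                  r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3})')

def preprocess(lines):
    records = []
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue                                   # malformed entry
        r = m.groupdict()
        if r["url"].endswith((".gif", ".jpg", ".css", ".js")):
            continue                                   # static resources
        if r["status"] != "200" or r["url"] == "/robots.txt":
            continue                                   # failures and robots
        records.append(r)
    return records

log = ['1.2.3.4 - - [10/Oct/2011:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
       '1.2.3.4 - - [10/Oct/2011:13:55:37 -0700] "GET /logo.gif HTTP/1.0" 200 512']
print(preprocess(log))
```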

  1. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    Directory of Open Access Journals (Sweden)

    Shao Jie

    2014-01-01

    Full Text Available A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared errors (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of hidden layer neurons. Simulation results for the half-bridge class-D power amplifier (CDPA) with a two-tone signal and broadband signals as input have shown that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL) model, the Chebyshev neural network (CNN) model, and the basic Elman neural network (BENN) model, the proposed model has better performance.

  2. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for the morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodetic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.
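    Since the tool builds on Hack's (1973) method, the core computation can be sketched with the stream-gradient index SL = (dH/dL) * L along a longitudinal profile, flagging anomalously high values as candidate knickpoints; the profile below is synthetic, whereas the real tool reads a DEM on the ArcGIS platform.

```python
# Stream-gradient (SL) index along a synthetic longitudinal profile; values
# more than two standard deviations above the mean flag candidate knickpoints.
import numpy as np

L = np.linspace(100, 10000, 200)                 # distance from source (m)
H = 500 * np.exp(-L / 4000)                      # smooth concave profile
H[120:130] -= np.linspace(0, 15, 10)             # a local steepened reach

sl = -np.gradient(H, L) * L                      # SL = (dH/dL) * L, slope > 0
threshold = sl.mean() + 2 * sl.std()
knickpoints = L[sl > threshold]
print("candidate knickpoints at distances (m):", knickpoints)
```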

  3. Functional Virtual Prototyping in Vehicle Chassis Reform Analysis and Improvement Design

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The contribution of functional virtual prototyping to vehicle chassis development is presented, covering reform analysis and improvement design during chassis development. A coordinate frame based on the digital model was established, and the main CAE analysis methods - multi-body system dynamics and finite element analysis - were applied to the digital model built with CAD/CAM software. The method was applied to vehicle chassis reform analysis and improvement design; all analysis and design projects were implemented on the same unified digital model, and the development was carried through effectively.

  4. Energy response improvement for photon dosimetry using pulse analysis

    Science.gov (United States)

    Zaki, Dizaji H.

    2016-02-01

    During the last few years, active personal dosimeters, frequently using silicon diode detectors, have been developed and have replaced passive personal dosimeters in some external monitoring systems. Incident photons interact with the constituents of the diode detector and produce electrons; these photon-induced electrons deposit energy in the detector's sensitive region and contribute to the response of the diode detector. To achieve an appropriate photon dosimetry response, the detectors are usually covered by a metallic layer of optimum thickness, which acts as an energy-compensating shield. In this paper, energy compensation is instead performed in software: selective data sampling based on pulse height is used to determine the photon dose equivalent. This method is applied to improve the energy response in photon dosimetry, and the detector design is optimized for the response function and the determination of the photon dose equivalent. The personal dose equivalent is determined in the energy range of 0.3-6 MeV. The errors are up to 20% for the calculated data over this wide energy range and up to 15% for data measured with 133Ba, 137Cs, 60Co and 241Am-Be sources. Fairly good agreement is seen between simulation and the dose values obtained from our process and from the specifications of several photon sources.
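    The "software energy compensation" described above amounts to weighting pulse-height channels instead of filtering photons in hardware. A schematic sketch, in which the noise cut and the per-channel conversion weights are placeholders that would in practice be fitted to reference fields:

```python
import numpy as np

def dose_equivalent(spectrum, weights, threshold_channel=0):
    """Photon dose equivalent from a diode pulse-height spectrum.

    spectrum          : counts per pulse-height channel
    weights           : per-channel dose conversion coefficients (Sv/count),
                        fitted so that the energy response is flat
    threshold_channel : channels below this index are discarded (noise cut)
    """
    s = np.array(spectrum, dtype=float)          # copy, don't mutate caller
    s[:threshold_channel] = 0.0                  # selective sampling by pulse height
    return float(np.sum(s * np.asarray(weights, dtype=float)))
```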

  5. Analysis and Measures to Improve Waste Management in Schools

    Directory of Open Access Journals (Sweden)

    Elena Cristina Rada

    2016-08-01

    Full Text Available Assessing waste production in schools highlights the contribution of school children and school staff to the total amount of waste generated in a region, as well as any poor practices of recycling (the so-called separate collection of waste in schools by the students, which could be improved through educational activities. Educating young people regarding the importance of environmental issues is essential, since instilling the right behavior in school children is also beneficial to the behavior of their families. The way waste management was carried out in different schools in Trento (northern Italy was analyzed: a primary school, a secondary school, and three high schools were taken as cases of study. The possible influence of the age of the students and of the various activities carried out within the schools on the different behaviors in separating waste was also evaluated. The results showed that the production of waste did not only depend on the size of the institutes and on the number of occupants, but, especially, on the type of activities carried out in addition to the ordinary classes and on the habits of both pupils and staff. In the light of the results obtained, some corrective measures were proposed to schools, aimed at increasing the awareness of the importance of the right behavior in waste management by students and the application of good practices of recycling.

  6. Improved data analysis for verifying quantum nonlocality and entanglement

    Science.gov (United States)

    Zhang, Yanbao; Glancy, Scott; Knill, Emanuel

    2012-06-01

    Given a finite number of experimental results originating from local measurements on two separated quantum systems in an unknown state, are these systems nonlocally correlated or entangled with each other? These properties can be verified by violating a Bell inequality or satisfying an entanglement witness. However, violation or satisfaction could be due to statistical fluctuations in finite measurements. Rigorous upper bounds, on the maximum probability (i.e., the p-value) according to local realistic or separable states of a violation or satisfaction as high as the observed, are required. Here, we propose a rigorous upper bound that improves the known bound from large deviation theory [R. Gill, arXiv:quant-ph/0110137]. The proposed bound is robust against experimental instability and the memory loophole [J. Barrett et al., Phys. Rev. A 66, 042111 (2002)]. Compared with our previous method [Phys. Rev. A 84, 062118 (2011)], the proposed method takes advantage of the particular Bell inequality or entanglement witness tested in an experiment, so the computation complexity is reduced. Also, this method can be easily extended to test a set of independent Bell inequalities or entanglement witnesses simultaneously.

  7. Improving the channeler ant model for lung CT analysis

    Science.gov (United States)

    Cerello, Piergiorgio; Lopez Torres, Ernesto; Fiorina, Elisa; Oppedisano, Chiara; Peroni, Cristiana; Arteche Diaz, Raul; Bellotti, Roberto; Bosco, Paolo; Camarlinghi, Niccolo; Massafra, Andrea

    2011-03-01

    The Channeler Ant Model (CAM) is an algorithm based on virtual ant colonies, conceived for the segmentation of complex structures with different shapes and intensities in a 3D environment. It exploits the natural capabilities of virtual ant colonies to modify the environment and communicate with each other by pheromone deposition. When applied to lung CTs, the CAM can be turned into a Computer Aided Detection (CAD) method for identifying pulmonary nodules and supporting radiologists in the identification of early-stage pathological objects. The CAM has been validated with the segmentation of 3D artificial objects and has already been successfully applied to lung nodule detection in Computed Tomography images within the ANODE09 challenge. The model improvements for the segmentation of nodules attached to the pleura and to the vessel tree are discussed, as well as a method to enhance the detection of low-intensity nodules. The results on five datasets annotated with different criteria show that the analytical modules (i.e., up to the filtering stage) provide a sensitivity in the 80-90% range with about 20 FP/scan. The classification module, although not yet optimised, keeps the sensitivity in the 70-85% range at about 10 FP/scan, despite the fact that the annotation criteria for the training and validation samples differ.

  8. Performance Analysis of an Improved MUSIC DoA Estimator

    Science.gov (United States)

    Vallet, Pascal; Mestre, Xavier; Loubaton, Philippe

    2015-12-01

    This paper addresses the statistical performance of subspace DoA estimation using a sensor array in the asymptotic regime where the number of samples and the number of sensors both converge to infinity at the same rate. Improved subspace DoA estimators (termed G-MUSIC) were derived in previous works and were shown to be consistent and asymptotically Gaussian distributed when the number of sources and their DoAs remain fixed. In this case, which models widely spaced DoA scenarios, the present paper proves that the traditional MUSIC method also provides consistent DoA estimates with the same asymptotic variances as the G-MUSIC estimates. The case of DoAs spaced on the order of a beamwidth, which models closely spaced sources, is also considered. It is shown that G-MUSIC estimates are still able to consistently separate the sources, while this is no longer the case for the MUSIC estimates. The asymptotic variances of the G-MUSIC estimates are also evaluated.
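    For reference, the traditional MUSIC pseudospectrum against which G-MUSIC is compared can be sketched as follows for a uniform linear array; the G-MUSIC refinement, which corrects the noise-subspace projection in the large-array regime, is omitted here:

```python
import numpy as np

def music_spectrum(X, n_sources, angles_deg, d=0.5):
    """Classical MUSIC pseudospectrum for a uniform linear array.

    X          : (n_sensors, n_snapshots) complex baseband snapshots
    n_sources  : assumed number of sources
    angles_deg : candidate DoAs to scan (degrees)
    d          : sensor spacing in wavelengths
    """
    n, m = X.shape
    R = X @ X.conj().T / m                          # sample covariance matrix
    _, eigvec = np.linalg.eigh(R)                   # eigenvalues in ascending order
    En = eigvec[:, : n - n_sources]                 # noise-subspace eigenvectors
    k = np.arange(n)
    out = []
    for theta in np.deg2rad(angles_deg):
        a = np.exp(2j * np.pi * d * k * np.sin(theta))   # steering vector
        out.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(out)                            # peaks indicate DoAs
```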

  9. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta;

    2006-01-01

    indicators directly quantifying choice of coxibs, indicators measuring expenditure per Defined Daily Dose, and indicators taking risk aspects into account, (2) "Frequent NSAID prescribing", comprising indicators quantifying prevalence or amount of NSAID prescribing, and (3) "Diverse NSAID choice", comprising...... indicators focusing on the width of GPs' formularies. The number of indicators for measuring the important aspects of quality in prescribing of NSAIDs could be reduced substantially by selecting the indicator in each dimension with the highest factor loading. A high preference for coxibs indicated both...... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns....

  10. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian;

    2008-01-01

    beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...... to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs...

  11. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.

  12. Delamination Modeling of Composites for Improved Crash Analysis

    Science.gov (United States)

    Fleming, David C.

    1999-01-01

    Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated, including a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for accurate solution are great and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structures in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.

  13. Improving Human/Autonomous System Teaming Through Linguistic Analysis

    Science.gov (United States)

    Meszaros, Erica L.

    2016-01-01

    An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogeneous teams. Procedures exist for licensing pilots to operate in the national airspace system, and work is underway to define methods for validating the function of autonomous systems; however, there is no method in place for assessing the interaction of these two disparate systems. Moreover, these systems are currently operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can increase usability and allow operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  14. Methodology Improvement of Reactor Physics Codes for CANDU Channels Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Hyun; Choi, Geun Suk; Win, Naing; Aung, Tharndaing; Baek, Min Ho; Lim, Jae Yong [Kyunghee University, Seoul (Korea, Republic of)

    2010-04-15

    As operational time increases, the pressure tubes and calandria tubes in a CANDU core inevitably undergo geometric deformation along the tube length. A pressure tube may sag downward within its calandria tube owing to irradiation-induced creep. This can seriously compromise the integrity of the pressure tube, so measuring the deflection state of in-service pressure tubes is very important for the safety of CANDU reactors. In this paper, the impacts of fuel channel deformation on nuclear characteristics were evaluated in order to improve nuclear design tools with respect to the local effects of abnormal deformations. A sagged pressure tube can make the fuel bundle configuration eccentric within the pressure tube by up to 0.6 cm. In this case, adverse pin power distribution and reactivity balance can affect reactor safety under normal and accident conditions. Thermal and radiation-induced creep in the pressure tube also expands the tube, by up to 5% in volume; the additional coolant then provides more moderation in the deformed channel, increasing reactivity. Sagging of the pressure tube did not cause a considerable change in K-inf values, but expansion of the pressure tube produced a relatively large change in K-inf. Modeling the eccentric and enlarged configuration is not easy when preparing input geometry for both HELIOS and MCNP, and there is no way to represent this deformation in a one-dimensional homogenization tool such as the WIMS code. A correction method was therefore suggested for the expansion effect: adjusting the number density of the coolant. The number density of the heavy water coolant was increased with the rate of expansion, without changing the geometry of the intact channel. This correction was found to be very effective in predicting K-inf values. In this study, further

  15. Using robust statistics to improve neutron activation analysis results

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A.; Ticianelli, Regina B.; Figueiredo, Ana Maria G., E-mail: gzahn@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2011-07-01

    Neutron activation analysis (NAA) is an analytical technique where an unknown sample is submitted to a neutron flux in a nuclear reactor, and its elemental composition is calculated by measuring the induced activity produced. By using the relative NAA method, one or more well-characterized samples (usually certified reference materials - CRMs) are irradiated together with the unknown ones, and the concentration of each element is then calculated by comparing the areas of the gamma ray peaks related to that element. When two or more CRMs are used as reference, the concentration of each element can be determined by several different ways, either using more than one gamma ray peak for that element (when available), or using the results obtained in the comparison with each CRM. Therefore, determining the best estimate for the concentration of each element in the sample can be a delicate issue. In this work, samples from three CRMs were irradiated together and the elemental concentration in one of them was calculated using the other two as reference. Two sets of peaks were analyzed for each element: a smaller set containing only the literature-recommended gamma-ray peaks and a larger one containing all peaks related to that element that could be quantified in the gamma-ray spectra; the most recommended transition was also used as a benchmark. The resulting data for each element was then reduced using up to five different statistical approaches: the usual (and not robust) unweighted and weighted means, together with three robust means: the Limitation of Relative Statistical Weight, Normalized Residuals and Rajeval. The resulting concentration values were then compared to the certified value for each element, allowing for discussion on both the performance of each statistical tool and on the best choice of peaks for each element. (author)
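    A sketch of two of the estimators compared here: the inverse-variance weighted mean and a Normalized-Residuals-style robust mean that iteratively de-weights the worst outlier. The de-weighting rule is a simplified stand-in for the exact NR prescription:

```python
import numpy as np

def weighted_mean(x, u):
    """Inverse-variance weighted mean and its standard uncertainty."""
    w = 1.0 / np.asarray(u, dtype=float) ** 2
    return np.sum(w * x) / np.sum(w), np.sqrt(1.0 / np.sum(w))

def normalized_residuals_mean(x, u, limit=2.0, max_iter=50):
    """Robust mean: inflate the uncertainty of the point whose normalized
    residual |x_i - m| / u_i is largest, until all residuals fall below
    `limit`, then return the re-weighted mean."""
    x = np.asarray(x, dtype=float)
    u = np.array(u, dtype=float)               # copy; will be adjusted
    for _ in range(max_iter):
        m, _ = weighted_mean(x, u)
        r = np.abs(x - m) / u
        worst = int(np.argmax(r))
        if r[worst] <= limit:
            break
        u[worst] *= r[worst] / limit           # de-weight the outlier
    return weighted_mean(x, u)
```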

  16. A potential target gene for the host-directed therapy of mycobacterial infection in murine macrophages.

    Science.gov (United States)

    Bao, Zhang; Chen, Ran; Zhang, Pei; Lu, Shan; Chen, Xing; Yao, Yake; Jin, Xiaozheng; Sun, Yilan; Zhou, Jianying

    2016-09-01

    Mycobacterium tuberculosis (MTB), one of the major bacterial pathogens for lethal infectious diseases, is capable of surviving within the phagosomes of host alveolar macrophages; therefore, host genetic variations may alter the susceptibility to MTB. In this study, to identify host genes exploited by MTB during infection, genes were non-selectively inactivated using lentivirus-based antisense RNA methods in Raw264.7 macrophages, and the cells that survived virulent MTB infection were then screened. Following DNA sequencing of the surviving cell clones, 26 host genes affecting susceptibility to MTB were identified and their pathways were analyzed by bioinformatics analysis. In total, 9 of these genes were confirmed as positive regulators of collagen α-5(IV) chain (Col4a5) expression, a gene encoding a type IV collagen subunit present on the cell surface. The knockdown of Col4a5 consistently suppressed intracellular mycobacterial viability, promoting the survival of Raw264.7 macrophages following mycobacterial infection. Furthermore, Col4a5 deficiency lowered the pH levels of intracellular vesicles, including endosomes, lysosomes and phagosomes in the Raw264.7 cells. Finally, the knockdown of Col4a5 post-translationally increased microsomal vacuolar-type H+-ATPase activity in macrophages, leading to the acidification of intracellular vesicles. Our findings reveal a novel role for Col4a5 in the regulation of macrophage responses to mycobacterial infection and identify Col4a5 as a potential target for the host-directed anti-mycobacterial therapy. PMID:27432120

  17. A potential target gene for the host-directed therapy of mycobacterial infection in murine macrophages

    Science.gov (United States)

    Bao, Zhang; Chen, Ran; Zhang, Pei; Lu, Shan; Chen, Xing; Yao, Yake; Jin, Xiaozheng; Sun, Yilan; Zhou, Jianying

    2016-01-01

    Mycobacterium tuberculosis (MTB), one of the major bacterial pathogens for lethal infectious diseases, is capable of surviving within the phagosomes of host alveolar macrophages; therefore, host genetic variations may alter the susceptibility to MTB. In this study, to identify host genes exploited by MTB during infection, genes were non-selectively inactivated using lentivirus-based antisense RNA methods in RAW264.7 macrophages, and the cells that survived virulent MTB infection were then screened. Following DNA sequencing of the surviving cell clones, 26 host genes affecting susceptibility to MTB were identified and their pathways were analyzed by bioinformatics analysis. In total, 9 of these genes were confirmed as positive regulators of collagen α-5(IV) chain (Col4a5) expression, a gene encoding a type IV collagen subunit present on the cell surface. The knockdown of Col4a5 consistently suppressed intracellular mycobacterial viability, promoting the survival of RAW264.7 macrophages following mycobacterial infection. Furthermore, Col4a5 deficiency lowered the pH levels of intracellular vesicles, including endosomes, lysosomes and phagosomes in the RAW264.7 cells. Finally, the knockdown of Col4a5 post-translationally increased microsomal vacuolar-type H+-ATPase activity in macrophages, leading to the acidification of intracellular vesicles. Our findings reveal a novel role for Col4a5 in the regulation of macrophage responses to mycobacterial infection and identify Col4a5 as a potential target for the host-directed anti-mycobacterial therapy. PMID:27432120

  18. An improved algorithm for model-based analysis of evoked skin conductance responses ☆

    OpenAIRE

    Bach, D R; Friston, K.J.; Dolan, R. J.

    2013-01-01

    Model-based analysis of psychophysiological signals is more robust to noise - compared to standard approaches - and may furnish better predictors of psychological state, given a physiological signal. We have previously established the improved predictive validity of model-based analysis of evoked skin conductance responses to brief stimuli, relative to standard approaches. Here, we consider some technical aspects of the underlying generative model and demonstrate further improvements. Most im...

  19. Functional improvement after carotid endarterectomy: demonstrated by gait analysis and acetazolamide stress brain perfusion SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. S.; Kim, G. E.; Yoo, J. Y.; Kim, D. G.; Moon, D. H. [Asan Medical Center, Seoul (Korea, Republic of)

    2005-07-01

    Scientific documentation of neurologic improvement following carotid endarterectomy (CEA) has not been established. The purpose of this prospective study was to investigate whether CEA performed for internal carotid artery flow lesions improves gait and cerebrovascular hemodynamic status in patients with gait disturbance. We prospectively performed pre- and post-CEA gait analysis and acetazolamide stress brain perfusion SPECT (Acz-SPECT) with Tc-99m ECD in 91 patients (M/F: 81/10, mean age: 64.1 y) who had gait disturbance before receiving CEA. Gait performance was assessed using a Vicon 370 motion analyzer, and gait improvement after CEA was correlated with cerebrovascular hemodynamic change as well as symptom duration. As a control, 12 hemiparetic stroke patients (M/F: 9/3, mean age: 51 y) who did not receive CEA underwent gait analysis twice at a one-week interval to evaluate whether repeat testing of gait performance shows a learning effect. Of the 91 patients, 73 (80%) showed gait improvement (change in gait speed > 10%) and 42 (46%) showed marked improvement (change in gait speed > 20%), whereas no improvement was observed in the control group on repeat testing. Post-operative cerebrovascular hemodynamic improvement was noted in 49 (54%) of the 91 patients. Gait improvement was markedly greater in the group with cerebrovascular hemodynamic improvement than in the no-change group (p<0.05). Marked gait improvement and cerebrovascular hemodynamic improvement were noted in 53% and 61%, respectively, of patients with less than 3 months of symptoms, compared to 31% and 24% of patients with symptoms longer than 3 months (p<0.05). Marked gait improvement was obtained in patients whose cerebrovascular hemodynamic status on Acz-SPECT improved after CEA. These results suggest that functional improvement such as gait can result from improved perfusion of the misery perfusion area, which remains viable for a longer period than previously reported in the literature.

  20. An improvement of window factor analysis for resolution of noisy HPLC-DAD data

    Institute of Scientific and Technical Information of China (English)

    邵学广; 林祥钦; 邵利民; 李梅青

    2002-01-01

    Window factor analysis (WFA) is a powerful tool for analyzing evolutionary processes. However, WFA was found to be very sensitive to the noise in the original data matrix. An error analysis showed that the concentration profiles resolved by conventional WFA are easily distorted by the noise retained by abstract factor analysis (AFA), and a modified WFA algorithm was proposed. Both simulated and experimental HPLC-DAD data were investigated by the conventional and improved methods. Results show that the improved method yields concentration profiles that are less distorted by noise than those of the conventional method, and that its ability to resolve noisy data sets is greatly enhanced.
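    The window-rank computation at the heart of WFA can be sketched with a moving-window SVD; resolving the actual concentration profiles, and the paper's noise-robust modification, go beyond this fragment. The noise level is an assumed input:

```python
import numpy as np

def local_rank(D, width, noise_level):
    """Number of singular values above the noise level in each window of
    rows of the data matrix D (time x wavelength) - the window-rank map
    used by window factor analysis to locate overlapping components."""
    ranks = []
    for start in range(D.shape[0] - width + 1):
        s = np.linalg.svd(D[start:start + width], compute_uv=False)
        ranks.append(int(np.sum(s > noise_level)))
    return np.array(ranks)
```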

  1. Heterogeneous Multi core processors for improving the efficiency of Market basket analysis algorithm in data mining

    OpenAIRE

    L, Aashiha Priyadarshni.

    2014-01-01

    Heterogeneous multi core processors can offer diverse computing capabilities. The efficiency of Market Basket Analysis Algorithm can be improved with heterogeneous multi core processors. Market basket analysis algorithm utilises apriori algorithm and is one of the popular data mining algorithms which can utilise Map/Reduce framework to perform analysis. The algorithm generates association rules based on transactional data and Map/Reduce motivates to redesign and convert the existing sequentia...

  2. Maintaining and improving of the training program on the analysis software in CMS

    International Nuclear Information System (INIS)

    Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.

  3. Preliminary analysis of the J-52 aircraft engine Component Improvement Program

    OpenAIRE

    Butler, Randall Scott

    1992-01-01

    Approved for public release; distribution is unlimited Increasing budgetary constraints have required program managers within the Naval Air Systems Command to justify their programs as never before. This thesis presents a preliminary analysis of the J-52 aircraft engine Component Improvement Program (CIP). The objectives of the research were to scrutinize the association of the CIP with promised improvements and benefits pertaining to the J-52 engine and to determine the obstacles that e...

  4. Daya Bay Nuclear Power Station outdoors electrical equipment pollution status analysis and improving

    International Nuclear Information System (INIS)

    Based on the operating experience of the outdoor electrical equipment at the Guangdong Daya Bay Nuclear Power Station, and following the engineering technical standards applied in China, an analysis and assessment of pollution classes concluded that class four is reasonable and that the creepage distance should be more than 3.5 cm/kV. Some improvements have been implemented with good effect, and further suggestions for improvement are given.

  5. Waste Minimization Improvements Achieved Through Six Sigma Analysis Result In Significant Cost Savings

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Jeffrey, D.; Jansen, John, R.; Janke, David, H.; Plowman, Catherine, M.

    2003-02-26

    Improved waste minimization practices at the Department of Energy's (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) are leading to a 15% reduction in the generation of hazardous and radioactive waste. Bechtel BWXT Idaho, LLC (BBWI), the prime management and operations contractor at the INEEL, applied the Six Sigma improvement process to the INEEL Waste Minimization Program to review existing processes and define opportunities for improvement. Our Six Sigma analysis team, composed of an executive champion, a process owner, a black belt, a yellow belt, and technical and business team members, used this statistics-based process approach to analyze work processes and produced ten recommendations for improvement. Recommendations ranged from waste generator financial accountability for newly generated waste to enhanced employee recognition programs for waste minimization efforts. These improvements have now been implemented to reduce waste generation rates and are producing positive results.

  6. Waste Minimization Improvements Achieved Through Six Sigma Analysis Result In Significant Cost Savings

    International Nuclear Information System (INIS)

    Improved waste minimization practices at the Department of Energy's (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) are leading to a 15% reduction in the generation of hazardous and radioactive waste. Bechtel BWXT Idaho, LLC (BBWI), the prime management and operations contractor at the INEEL, applied the Six Sigma improvement process to the INEEL Waste Minimization Program to review existing processes and define opportunities for improvement. Our Six Sigma analysis team, composed of an executive champion, a process owner, a black belt, a yellow belt, and technical and business team members, used this statistics-based process approach to analyze work processes and produced ten recommendations for improvement. Recommendations ranged from waste generator financial accountability for newly generated waste to enhanced employee recognition programs for waste minimization efforts. These improvements have now been implemented to reduce waste generation rates and are producing positive results.

  7. Improvement on reaction model for sodium-water reaction jet code and application analysis

    International Nuclear Information System (INIS)

    In selecting a reasonable design basis leak (DBL) for a steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. Improvements to the sodium-water reaction (SWR) jet code (LEAP-JET ver. 1.30) and an application analysis of water injection tests to confirm the code's validity were performed. In the code improvement, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on calculation mesh size. Test calculations using the improved code (LEAP-JET ver. 1.40) were carried out under the conditions of the SWAT-3·Run-19 test and of an actual-scale SG, and the computed SWR jet behavior and the influence of the model on the analysis results were confirmed to be reasonable. For the application analysis of the water injection tests, water injection behavior and SWR jet behavior in the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were analyzed using the LEAP-BLOW and LEAP-JET codes. In the LEAP-BLOW analysis, a parameter survey study was performed; as a result, the injection nozzle diameter needed to simulate the water leak rate was confirmed. In the LEAP-JET analysis, the temperature behavior of the SWR jet was investigated. (author)

  8. Improvement on reaction model for sodium-water reaction jet code and application analysis

    Energy Technology Data Exchange (ETDEWEB)

    Itooka, Satoshi; Saito, Yoshinori [Hitachi Ltd., Nuclear Systems Division, Hitachi, Ibaraki (Japan); Okabe, Ayao; Fujimata, Kazuhiro; Murata, Shuuichi [Hitachi Engineering Co., Ltd., Nuclear Power Plant Engineering No.2 Dept., Hitachi, Ibaraki (Japan)

    2000-03-01

    In selecting a reasonable design basis leak (DBL) for a steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. Improvements to the sodium-water reaction (SWR) jet code (LEAP-JET ver. 1.30) and an application analysis of water injection tests to confirm the code's validity were performed. In the code improvement, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on calculation mesh size. Test calculations using the improved code (LEAP-JET ver. 1.40) were carried out under the conditions of the SWAT-3·Run-19 test and of an actual-scale SG, and the computed SWR jet behavior and the influence of the model on the analysis results were confirmed to be reasonable. For the application analysis of the water injection tests, water injection behavior and SWR jet behavior in the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were analyzed using the LEAP-BLOW and LEAP-JET codes. In the LEAP-BLOW analysis, a parameter survey study was performed; as a result, the injection nozzle diameter needed to simulate the water leak rate was confirmed. In the LEAP-JET analysis, the temperature behavior of the SWR jet was investigated. (author)

  9. Improving the problem analysis in cost-benefit analysis for transport projects : an explorative study

    NARCIS (Netherlands)

    Annema, J.A.; Mouter, N.

    2013-01-01

    Key actors (consultants, scientists and policy makers) in the Netherlands transport policy cost-benefit analysis (CBA) practice consider ‘problem analysis’ to be one of the important CBA substantive problems. Their idea is that a good-quality problem analysis can help to identify proper solutions, a

  10. Improved Analysis of Co-Channel Interference in Cellular Communications Systems

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zu-fan; DU Hui-ping; ZHU Wei-le

    2005-01-01

    In terms of the carrier-to-interference ratio, the performance of co-channel interference in cellular communications systems is studied. The approach is based on an improved analysis that uses exact geometrical analysis to account for the fact that some areas of the desired sector may not be interfered with by some co-channel sectors, instead of treating the entire sector as interfered with. Other features, such as power control and the number of interferers, are also included.

  11. APPLICATION OF DYNAMIC SIMULATIONS IN THE ANALYSIS OF MEASURES FOR IMPROVING ENERGY EFFICIENCY OF BUILDINGS

    OpenAIRE

    DRAGICEVIC SNEZANA M.

    2016-01-01

    One of the most commonly used methods for improving the energy performance of buildings is reducing heating energy consumption. This paper shows a comparative analysis of building energy demand for space heating, based on case studies in which the building envelope was modified with insulating materials and different window types. A public six-floor building located in Belgrade was selected for the analysis. For a dynamic simulation and evaluation of the applied ...

  12. Handbook of Soccer Match Analysis: A Systematic Approach to Improving Performance

    OpenAIRE

    Christopher Carling; Mark Williams, A; Thomas Reilly

    2006-01-01

    DESCRIPTION This book addresses and appropriately explains soccer match analysis, looks at the very latest in match analysis research, and at the innovative technologies used by professional clubs. This handbook also bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training ro...

  13. Financial Statement Analysis in a Company and Proposals for Improvements of Financial Health

    OpenAIRE

    Mešťan, Marek

    2014-01-01

    The aim of this thesis is to evaluate the financial situation of company LUX, s. r. o. in the years 2009–2013 based on selected methods of financial analysis and formulate proposals to solve problem areas. In thesis is used financial and strategic analysis. The results are compared with the recommended results and averages in the same area. In the final part of thesis are suggestions and recommendations for possible improvements of financial health in the company.

  14. Unifying Geometric Features and Facial Action Units for Improved Performance of Facial Expression Analysis

    OpenAIRE

    Ghayoumi, Mehdi; Bansal, Arvind K.

    2016-01-01

    Previous approaches to facial expression analysis use three different techniques: facial action units, geometric features and graph-based modelling. However, previous approaches have treated these techniques separately, even though they are interrelated. Facial expression analysis is significantly improved by utilizing the mappings between the major geometric features involved in facial expressions and the subset of facial action units whose presence or...

  15. Identification of Energy Efficiency Opportunities through Building Data Analysis and Achieving Energy Savings through Improved Controls

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Taasevigen, Danny J.; Koran, Bill

    2014-09-04

    This chapter will highlight analysis techniques to identify energy efficiency opportunities to improve operations and controls. A free tool, Energy Charting and Metrics (ECAM), will be used to assist in the analysis of whole-building, sub-metered, and/or building automation system (BAS) data. Appendix A describes the features of ECAM in more depth and also provides instructions for downloading ECAM and all resources pertaining to its use.

  16. Improved Detection of Time Windows of Brain Responses in Fmri Using Modified Temporal Clustering Analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Temporal clustering analysis (TCA) has been proposed recently as a method to detect time windows of brain responses in functional MRI (fMRI) studies when the timing and location of the activation are completely unknown. Modifications to the TCA technique are introduced in this report to further improve the sensitivity in detecting brain activation.

  17. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…

  18. Improving torque per kilogram magnet of permanent magnet couplings using finite element analysis

    DEFF Research Database (Denmark)

    Högberg, Stig; Jensen, Bogi Bech; Bendixen, Flemming Buus

    2013-01-01

    This paper presents the methodology and subsequent findings of a performance-improvement routine that employs automated finite element (FE) analysis to increase the torque-per-kilogram-magnet (TPKM) of a permanent magnet coupling (PMC). The routine is applied to a commercially available cylindrical...

  19. Commentaries to "The Vital Role of Operations Analysis in Improving Healthcare Delivery"

    OpenAIRE

    n/a

    2012-01-01

    This series of discussions presents commentaries on where the field of healthcare operations management is now and possible future research directions, expanding upon the key points raised by Green [Green LV (2012) The vital role of operations analysis in improving healthcare delivery. Manufacturing Service Oper. Management 14(4):488-494].

  20. An improved modal pushover analysis procedure for estimating seismic demands of structures

    Institute of Scientific and Technical Information of China (English)

    Mao Jianmeng; Zhai Changhai; Xie Lili

    2008-01-01

    The pushover analysis (POA) procedure is difficult to apply to high-rise buildings, as it cannot account for the contributions of higher modes. To overcome this limitation, a modal pushover analysis (MPA) procedure was proposed by Chopra et al. (2001). However, invariant lateral force distributions are still adopted in the MPA. In this paper, an improved MPA procedure is presented to estimate the seismic demands of structures, considering the redistribution of inertia forces after the structure yields. The improved procedure is verified with numerical examples of 5-, 9- and 22-story buildings. It is concluded that the improved MPA procedure is more accurate than either the POA or the MPA procedure. In addition, the proposed procedure avoids a large computational effort by adopting a two-phase lateral force distribution.

  1. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    Science.gov (United States)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2016-06-01

    Mine systems such as the ventilation system, strata support system, and flame-proof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is best done during the planning and design stage. However, existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine system design. The approach couples ET and FT modeling with a redundancy allocation technique. A concept of top hazard probability is introduced for identifying the system failure probability, and redundancy is allocated to the system at either the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
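    For independent basic events, the fault-tree half of such an analysis reduces to AND/OR gate probability algebra, which also makes the effect of redundancy allocation easy to check. The sketch below uses illustrative failure probabilities, not data from the paper:

```python
from functools import reduce

def or_gate(*p):
    """Top probability of an OR gate over independent basic events."""
    return 1.0 - reduce(lambda acc, q: acc * (1.0 - q), p, 1.0)

def and_gate(*p):
    """Top probability of an AND gate over independent basic events."""
    return reduce(lambda acc, q: acc * q, p, 1.0)

# Toy methane-explosion tree: an explosion needs gas accumulation AND an
# ignition source; accumulation occurs if the main OR auxiliary fan fails.
p_accum = or_gate(2e-3, 5e-3)
print(f"top event:          {and_gate(p_accum, 1e-2):.2e}")

# Redundancy allocation: duplicate the main fan (two units in parallel).
p_accum_red = or_gate(and_gate(2e-3, 2e-3), 5e-3)
print(f"with redundant fan: {and_gate(p_accum_red, 1e-2):.2e}")
```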

  2. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Christopher Carling

    2006-03-01

    Full Text Available DESCRIPTION This book addresses and appropriately explains soccer match analysis, looks at the very latest in match analysis research, and at the innovative technologies used by professional clubs. This handbook also bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a practical manual on soccer match analysis for coaches and sport scientists, so that professionals in this field can gather objective data on the players and the team, which in turn can be used by coaches and players to learn more about performance as a whole and gain a competitive advantage. The book efficiently meets these objectives. AUDIENCE The book is targeted at the athlete, the coach, the sports science professional, or any sport-conscious person who wishes to analyze relevant soccer performance. The editors and contributors are authorities in their respective fields, and this handbook depends on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive, as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  3. Reliability improvement of robotics systems: Analysis, design and real time supervision

    International Nuclear Information System (INIS)

    Reliability improvement of robotics systems is a key issue for automation and autonomy in maintenance and intervention tasks in hostile environments. Constraints in hostile environments require a different way of using and programming robots compared with industrial applications. To take maximum benefit of robot technology, the level of confidence in the robotic tool must be much higher than in the classical production world, and this level of confidence can be reached by applying reliability engineering in combination with strong knowledge of robot technology. In this paper, three aspects are considered and developed as tools to be used at different stages of this improvement. The first is the reliability analysis of robotic and remote handling systems in general, to identify failure modes, their effects on the system, sensitive components, and redundancy needs; tools such as Failure Modes, Effects and Criticality Analysis and Fault Tree Analysis are presented. The second deals with design criteria for new robot systems, or the improvement of existing ones, using reliability- and safety-driven design concepts; such concepts are applicable to mechanical, electrical and electronic design, including the robot's computer controller. The last aspect is the real-time monitoring of function availability and safety levels, as well as failure detection in the various subsystems composing a robot device; techniques of supervision using safety-check subroutines are considered. Experience with this improvement process for robotics used in the maintenance of fusion machines is discussed. (author). Figs

  4. The Effectiveness of Transactional Analysis Group-counseling on the Improvement of Couples’ Family Functioning

    Directory of Open Access Journals (Sweden)

    Ghorban Ali Yahyaee

    2015-06-01

    Full Text Available Background & Aims of the Study: Family functioning is among the most important factors ensuring the mental health of family members, and disorder or disturbance in family functioning causes many psychological problems for them. The current study examined the effectiveness of transactional analysis group counseling on the improvement of couples' family functioning. Materials & Methods: The design is a semi-experimental study with pretest, posttest, follow-up and a control group. The statistical population consisted of all couples referred to the psychological and counseling centers of Rasht city in 2012. Samples were first selected by convenience sampling; after completing the family assessment device and obtaining the score required for inclusion, they were randomly assigned to experimental and control groups (N = 8 couples per group). The experimental group participated in 12 sessions of group counseling based on transactional analysis, while the control group received no intervention. The gathered data were analyzed using analysis of covariance. Results: There are significant differences between the pretest and posttest scores of the experimental group (significant at the 0.05 level), so transactional analysis group therapy appears to improve the dimensions of family functioning in couples. Conclusions: Transactional analysis group counseling can improve family functioning, and using this approach in work with couples is recommended.

  5. An Improved Biclustering Algorithm and Its Application to Gene Expression Spectrum Analysis

    Institute of Scientific and Technical Information of China (English)

    Hua Qu; Liu-Pu Wang; Yan-Chun Liang; Chun-Guo Wu

    2005-01-01

    The Cheng and Church algorithm is an important biclustering algorithm. In this paper, the process of extending the space in the second stage of the Cheng and Church algorithm is improved, and the selection of two important parameters is discussed. Results of applying the improved algorithm to gene expression spectrum analysis show that, compared with the original Cheng and Church algorithm, the quality of the clustering results is clearly enhanced, the mined expression models are better, and the data show strong consistency with fluctuation across conditions, while the computational time does not increase significantly.
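    The score that Cheng and Church-style biclustering minimizes is the mean squared residue of a submatrix; rows and columns are added or removed to keep this score below a threshold. A direct sketch of the score itself:

```python
import numpy as np

def mean_squared_residue(A, rows, cols):
    """Cheng-Church mean squared residue H(I, J) of the bicluster defined by
    `rows` and `cols` in data matrix A; lower H means a more coherent bicluster."""
    sub = A[np.ix_(rows, cols)]
    row_mean = sub.mean(axis=1, keepdims=True)   # a_iJ
    col_mean = sub.mean(axis=0, keepdims=True)   # a_Ij
    all_mean = sub.mean()                        # a_IJ
    residue = sub - row_mean - col_mean + all_mean
    return float((residue ** 2).mean())
```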

  6. Transition towards improved regional wood flows by integrating material flux analysis and agent analysis. The case of Appenzell Ausserrhoden, Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Claudia R.; Hofer, Christoph; Wiek, Arnim; Scholz, Roland W. [Environmental Sciences, Natural and Social Science Interface, Swiss Federal Institute of Technology, ETH Zentrum, HAD, Haldenbachstr. 44, CH-8092 Zurich (Switzerland)

    2004-05-10

    This paper discusses the integration of material flux analysis and agent analysis as the basis for a transition towards improved regional wood management in Appenzell Ausserrhoden (AR), a small Swiss canton located in the Pre-Alps of Switzerland. We present a wood flow analysis for forests, wood processing industries and consumption in AR, accounting for different wood products. We find that the forest is currently significantly underutilized although there are sizeable imports of wood and fuel to this small region. The underutilization of the forest contributes to a skewed age distribution, jeopardizing long-term sustainable development of the forest, as the fulfillment of its protective and production function are likely to be at risk. The wood resources, however, are capable of satisfying current wood demand among the population of AR and wood could even be exported. Underutilization has two main causes: first, wood prices are so low that harvesting trees is a money-losing proposition; second, consumer wood demand and the current supply from forest owners are not aligned. Furthermore, cultural values, lifestyle trends and traditions make an alignment of supply and demand difficult. Consensus and strategy building with the relevant stakeholders on the basis of the results obtained from the wood flow analysis and agent analysis is a reasonable next step to take. We conclude that wood flow analysis combined with agent analysis provide a useful and straightforward tool to be used as the basis of a transition process towards improved regional wood flows, which in turn should contribute to sustainable forest management.

  7. Effectiveness of Cognitive and Transactional Analysis Group Therapy on Improving Conflict-Solving Skill

    Directory of Open Access Journals (Sweden)

    Bahram A. Ghanbari-Hashemabadi

    2012-03-01

    Full Text Available Background: Today, learning communication skills such as conflict solving is very important. The purpose of the present study was to investigate the efficiency of cognitive and transactional analysis group therapy in improving conflict-solving skills. Materials & Methods: This is an experimental study with pretest, posttest and a control group. Forty-five clients referred to the counseling and psychological services center of Ferdowsi University of Mashhad were chosen by a screening method and randomly divided into three equal groups: a control group (15 participants), a cognitive experimental group (15 participants) and a transactional analysis group (15 participants). A conflict-solving questionnaire was used to collect data, and the interventions were cognitive and transactional analysis group therapy administered in 8 weekly two-hour sessions. Mean and standard deviation were used for data analysis at the descriptive level, and one-way ANOVA at the inferential level. Results: The conflict-solving skills of the two experimental groups increased significantly. Conclusion: This finding indicates that both cognitive and transactional analysis group therapy can be effective interventions for improving conflict-solving skills.

  8. Next generation sequencing: Improved resolution for paternal/maternal duos analysis.

    Science.gov (United States)

    Ma, Yan; Kuang, Jin-Zhi; Nie, Tong-Gang; Zhu, Wei; Yang, Zhi

    2016-09-01

    When two mismatches are observed in an alleged parent-offspring pair, there is doubt as to whether the putative parent is excluded or whether two mutations have occurred. Here, we report on four cases with two mismatches in paternal/maternal duos based on capillary electrophoresis (CE) results. The next generation sequencing (NGS) results were compared with 20 autosomal STRs derived from the previous CE-based analysis. In summary, the NGS samples offered comprehensive information on different types of markers that can improve the resolution of paternal/maternal duo analysis. PMID:27347656

  9. Factorial kriging analysis - a geostatistical approach to improve reservoir characterization with seismic data

    Energy Technology Data Exchange (ETDEWEB)

    Mundim, Evaldo Cesario; Johann, Paulo R. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Remacre, Armando Zaupa [Universidade Estadual de Campinas, SP (Brazil)

    1999-07-01

    In this work, Factorial Kriging analysis for the filtering of seismic attributes applied to reservoir characterization is considered. Factorial Kriging works in the spatial domain in a way similar to spectral analysis in the frequency domain. The incorporation of filtered attributes as a secondary variable in a kriging system is discussed. Results prove that Factorial Kriging is an efficient technique for filtering seismic attribute images, in which geologic features are enhanced. The attribute filtering improves the correlation between the attributes and the well data, and hence the estimates of the reservoir properties. The differences between the estimates obtained by External Drift Kriging and Collocated Cokriging are also reduced. (author)

  10. Methodological aspects of development of new instrumental methods of analysis and improvement of known ones

    International Nuclear Information System (INIS)

    Consideration is given to the possibilities of instrumental methods of analysis, such as precision registration of the natural isotope ratios of light elements from the gaseous phase, piezoquartz microweighing, probe methods of analysis in spark mass spectrometry, and extraction atomic-emission spectroscopy with inductively coupled plasma. The further development of these methods and the improvement of their analytical characteristics - sensitivity, accuracy and rapidity - are predicted, and the extension of their fields of application is forecast as well. 20 refs.; 7 figs.; 2 tabs

  11. Improvement and analysis of ID3 algorithm in decision-making tree

    Science.gov (United States)

    Xie, Xiao-Lan; Long, Zhen; Liao, Wen-Qi

    2015-12-01

    For the cooperative system under development, spatial analysis and related data mining technology are needed to detect subject conflict and redundancy, and ID3 is an important data mining algorithm. Because the logarithmic part of the traditional ID3 decision-tree algorithm is rather complicated to compute, this paper derives a new computational formula for information gain by optimizing that logarithmic part. Experimental comparison and theoretical analysis show that the IID3 (Improved ID3) algorithm achieves higher computational efficiency and accuracy and is thus worth popularizing.
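
    The abstract does not reproduce the optimized formula itself, but the quantity being optimized is the standard ID3 information gain. Below is a minimal Python sketch of that baseline computation on toy data; the IID3 simplification of the logarithmic part is not shown, since the paper's formula is not given here.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a collection of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr_index, labels):
    """Gain of splitting `rows` on the attribute at `attr_index` (ID3 criterion)."""
    n = len(labels)
    # Partition the labels by the value each row takes on the attribute.
    parts = {}
    for row, label in zip(rows, labels):
        parts.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(part) / n * entropy(part) for part in parts.values())
    return entropy(labels) - remainder

# Toy data: ID3 picks the attribute with the highest information gain.
rows = [("sunny", "hot"), ("sunny", "cool"), ("rain", "cool"), ("rain", "hot")]
labels = ["no", "yes", "yes", "no"]
best = max(range(2), key=lambda i: information_gain(rows, i, labels))
print(best, information_gain(rows, best, labels))  # attribute 1, gain 1.0
```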

  12. Improved reporting of statistical design and analysis: guidelines, education, and editorial policies.

    Science.gov (United States)

    Mazumdar, Madhu; Banerjee, Samprit; Van Epps, Heather L

    2010-01-01

    A majority of original articles published in biomedical journals include some form of statistical analysis. Unfortunately, many of the articles contain errors in statistical design and/or analysis. These errors are worrisome, as the misuse of statistics jeopardizes the process of scientific discovery and the accumulation of scientific knowledge. To help avoid these errors and improve statistical reporting, four approaches are suggested: (1) development of guidelines for statistical reporting that could be adopted by all journals, (2) improvement in statistics curricula in biomedical research programs with an emphasis on hands-on teaching by biostatisticians, (3) expansion and enhancement of biomedical science curricula in statistics programs, and (4) increased participation of biostatisticians in the peer review process along with the adoption of more rigorous journal editorial policies regarding statistics. In this chapter, we provide an overview of these issues with emphasis on the field of molecular biology and highlight the need for continuing efforts on all fronts.

  13. Thermal Analysis in Gas Insulated Transmission Lines Using an Improved Finite-Element Model

    Directory of Open Access Journals (Sweden)

    Ling LI

    2013-01-01

    Full Text Available In this paper, an improved finite element (FE) model is proposed to investigate the temperature distribution in gas insulated transmission lines (GILs). The solution of joule losses from an eddy current field analysis is indirectly coupled into the fluid and thermal fields. Unlike traditional methods, the air surrounding the GIL is included in the model to avoid assuming a constant convective heat transfer coefficient; a multiple species transport technique is therefore employed to handle the two fluid types present in a single model. In addition, the temperature-dependent electrical and thermal properties of the materials are considered. Steady-state and transient thermal analyses of the GIL are performed separately with the improved model. The corresponding temperature distributions are compared with experimental results reported in the literature.

  14. Transient Voltage Stability Analysis and Improvement of A Network with different HVDC Systems

    DEFF Research Database (Denmark)

    Liu, Yan; Chen, Zhe

    2011-01-01

    This paper presents a transient voltage stability analysis of an AC system with multi-infeed HVDC links, including a traditional LCC HVDC link and a VSC HVDC link. It is found that the voltage supporting capability of the VSC-HVDC link is significantly influenced by the tie-line distance between the two links and the size of loads. In order to improve the transient voltage stability, a voltage adjusting method is proposed in this paper. A voltage increment component is introduced into the outer voltage control loop under emergency situations caused by severe grid faults. In order to verify the theoretical analysis and the improved control method, a real time simulation model of a hybrid multi-infeed HVDC system based on the western Danish power system is established in RTDS™. Simulation results show that enhanced transient voltage stability can be achieved.

  15. Analysis of Improvement on Human Resource Management within Chinese Enterprises in Economic Globalization

    OpenAIRE

    Lihui Xie; Dasong Deng; Xifa Liu

    2013-01-01

    In this study, we analyze the improvement of human resource management within Chinese enterprises under economic globalization. China's entry into the WTO has accelerated the economic globalization of Chinese enterprises, and the Chinese economy is further integrated with the global economy on a global scope. Human resource is what the economic globalization of Chinese enterprises relies on, the first resource for China to participate in international competition, and also the key to make effectiv...

  16. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    OpenAIRE

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of ...

  17. Ion beam analysis with external beams: Recent set-up improvements

    International Nuclear Information System (INIS)

    Accelerator-based analytical techniques using external beams are ideally suited to the study of works of art because of their fully non-destructive character. However, accurate quantitative analysis is not straightforward, due in particular to difficult beam monitoring. Significant improvements have been progressively made on the external beam line of the IBA facility of the Louvre museum in order to increase accuracy and to conduct combined analyses with different IBA techniques

  18. Improvement of safety by analysis of costs and benefits of the system

    OpenAIRE

    Karkoszka, T.; M. Andraczke

    2011-01-01

    Purpose: of the paper has been the assessment of the dependence between improvement of the implemented occupational health and safety management system and both minimization of costs connected with occupational health and safety assurance and optimization of real work conditions. Design/methodology/approach: used for the analysis has included definition of the occupational health and safety system with regard to the rules and tool allowing for occupational safety assurance in the organisationa...

  19. Economic Analysis of Cost-Effectiveness of Community Engagement to Improve Health

    OpenAIRE

    Andrew Street; Roy Carr-Hill

    2008-01-01

    Liberty of association is one of the building blocks of a democratic society, and it is presumed that community engagement in a democratic society is universally a good thing. This presumption is not itself subject to economic analysis, but the issue considered here is whether community engagement is a better vehicle for improving the community's health than another approach. The problems of applying the standard framework of economic evaluation to this issue include: multiple perspectives and ti...

  20. RISK ANALYSIS FOR OCCUPATIONAL IMPLEMENTATION OF IMPROVEMENT IN A RENAL CLINIC

    Directory of Open Access Journals (Sweden)

    Lilian Oliveira de Oliveira

    2013-12-01

    Full Text Available The main purpose of this research is to analyze the occupational risks in a Renal Clinic located in central RS. From the observational analysis of risk maps and data collected with the survey instrument, we implemented improvements on site. The results show that the implementations have been significant and that further changes are needed to reduce occupational disorders, promoting a better quality of life for the clinic's professionals.

  1. Conference Report: Improving Skills: Evidence from Secondary Analysis of International Surveys

    OpenAIRE

    WEBER ANKE; MOUTHAAN MELISSA

    2013-01-01

    The Improving Skills conference, which took place on November 15-16, 2012 in Cyprus, was organised by the European Commission, DG Education and Culture (DG EAC), in close cooperation with the Cypriot Presidency and with input from CRELL. The aim of the conference was to generate and disseminate knowledge derived from recent secondary analysis of large-scale international surveys and assessments such as PISA, TIMSS, ICCS, ESLC and PIRLS. Participants of the conference discussed policy...

  2. Improved breath alcohol analysis with use of carbon dioxide as the tracer gas

    OpenAIRE

    Kaisdotter Andersson, Annika

    2010-01-01

    State-of-the-art breath analysers require a prolonged expiration into a mouthpiece to obtain the accuracy required for evidential testing and screening of the alcohol concentration. This requirement is unsuitable for breath analysers used as alcolocks, owing to their frequent use and the fact that the majority of users are sober drivers, as well as for breath testing of uncooperative persons. This thesis presents a method by which breath alcohol analysis can be improved, using carbon dioxide (C...

  3. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    International Nuclear Information System (INIS)

    Value stream mapping is a tool which lets the business leader of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit. This problem has triggered many complaints filed by patients. After deploying this tool, the team found that, in processing the medicine, the pharmacy unit lacks a storage and capsule packing tool, and this condition has caused much wasted time in the process. Therefore, the team proposed to the business leader to procure the required tools in order to shorten the process. This research has resulted in a lead time shortened from 45 minutes to the 30 minutes required by the government through the Indonesian health ministry, with the %VA (value added activity) or Process Cycle Efficiency (PCE) increased from 66% to 68% (considered lean because it is above the required 30%). This result proves that process effectiveness has been increased by the improvement.

  4. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    Science.gov (United States)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool which lets the business leader of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit. This problem has triggered many complaints filed by patients. After deploying this tool, the team found that, in processing the medicine, the pharmacy unit lacks a storage and capsule packing tool, and this condition has caused much wasted time in the process. Therefore, the team proposed to the business leader to procure the required tools in order to shorten the process. This research has resulted in a lead time shortened from 45 minutes to the 30 minutes required by the government through the Indonesian health ministry, with the %VA (value added activity) or Process Cycle Efficiency (PCE) increased from 66% to 68% (considered lean because it is above the required 30%). This result proves that process effectiveness has been increased by the improvement.

  5. Improvement of safety by analysis of costs and benefits of the system

    Directory of Open Access Journals (Sweden)

    T. Karkoszka

    2011-11-01

    Full Text Available Purpose: of the paper has been the assessment of the dependence between improvement of the implemented occupational health and safety management system and both minimization of costs connected with occupational health and safety assurance and optimization of real work conditions. Design/methodology/approach: used for the analysis has included definition of the occupational health and safety system with regard to the rules and tool allowing for occupational safety assurance in the organisational and technical way, analyses of costs and benefits of the system maintenance as well as study on the tools for potential improvement of processes. Findings: of analysis are as follows: a continuously improving occupational safety management system guarantees the advancement of work conditions, a decrease in the rate of occupational illnesses as well as a lowering of the number of occupational accidents. Research limitations/implications: can apply in case of any organisation which uses both organizational and technical rules, methods and tools to assure the optimal level of occupational health and safety conditions. Originality/value: of the presented paper has been constituted by the specification of the continuous improvement tools and methods in the system implemented on the basis of the quality criterion.

  6. Method of sensitivity improving in the non-dispersive infrared gas analysis system

    Institute of Scientific and Technical Information of China (English)

    Youwen Sun; Wenqing Liu; Shimei Wang; Shuhua Huang; Xiaoman Yu

    2011-01-01

    A method of interference correction for improving the sensitivity of non-dispersive infrared (NDIR) gas analysis systems is demonstrated. Based on the proposed method, the interference due to water vapor and carbon dioxide in the NDIR NO analyzer is corrected. After interference correction, the absorbance signal at the NO filter channel is controlled only by the absorption of NO, and the sensitivity of the analyzer is greatly improved. In a field experiment for pollution source emission monitoring, the NO concentration trend monitored by the NDIR analyzer is in good agreement with a differential optical absorption spectroscopy NO analyzer. Small variations of NO concentration can also be resolved, and the measurement correlation coefficient of the two analyzers is 94.28%.
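
    The abstract does not give the correction formula. A common form of such a correction subtracts the cross-sensitivity contributions of the interfering species, measured on reference channels, from the analyte channel. Below is a minimal sketch under that assumption; the coefficients K_H2O and K_CO2 are hypothetical calibration constants, not values from the paper.

```python
import numpy as np

# Hypothetical cross-sensitivity coefficients of the NO filter channel to
# H2O and CO2, determined beforehand from pure-gas calibrations.
K_H2O = 0.042
K_CO2 = 0.017

def corrected_no_absorbance(a_no, a_h2o, a_co2):
    """Remove H2O and CO2 interference from the NO channel absorbance.

    Assumes the interference adds linearly, which holds for small
    absorbances in the Beer-Lambert regime.
    """
    return a_no - K_H2O * a_h2o - K_CO2 * a_co2

# Example: raw NO channel readings inflated by interfering species.
a_no = np.array([0.120, 0.135, 0.128])
a_h2o = np.array([0.50, 0.55, 0.52])
a_co2 = np.array([1.10, 1.12, 1.08])
print(corrected_no_absorbance(a_no, a_h2o, a_co2))
```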

  7. Improved enteral tolerance following step procedure: systematic literature review and meta-analysis.

    Science.gov (United States)

    Fernandes, Melissa A; Usatin, Danielle; Allen, Isabel E; Rhee, Sue; Vu, Lan

    2016-10-01

    Surgical management of children with short bowel syndrome (SBS) changed with the introduction of the serial transverse enteroplasty procedure (STEP). We conducted a systematic review and meta-analysis using MEDLINE and SCOPUS to determine if children with SBS had improved enteral tolerance following STEP. Studies were included if information about a child's pre- and post-STEP enteral tolerance was provided. A random effects meta-analysis provided a summary estimate of the proportion of children with enteral tolerance increase following STEP. From 766 abstracts, seven case series involving 86 children were included. Mean percent tolerance of enteral nutrition improved from 35.1 to 69.5. Sixteen children had no enteral improvement following STEP. A summary estimate showed that 87 % (95 % CI 77-95 %) of children who underwent STEP had an increase in enteral tolerance. Compilation of the literature supports the belief that SBS subjects' enteral tolerance improves following STEP. Enteral nutritional tolerance is a measure of efficacy of STEP and should be presented as a primary or secondary outcome. By standardizing data collection on children undergoing STEP procedure, better determination of nutritional benefit from STEP can be ascertained. PMID:27461428
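
    For readers unfamiliar with how a summary proportion such as the 87% figure is pooled, the following is a generic DerSimonian-Laird random-effects calculation on the logit scale; the per-study counts are hypothetical and the paper's actual data are not reproduced.

```python
import numpy as np

def pooled_proportion(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    p = events / totals
    y = np.log(p / (1 - p))                 # study-level logits
    v = 1 / events + 1 / (totals - events)  # approximate within-study variances
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)      # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c) # between-study variance
    w_star = 1 / (v + tau2)
    y_pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    expit = lambda x: 1 / (1 + np.exp(-x))
    return expit(y_pooled), (expit(y_pooled - 1.96 * se), expit(y_pooled + 1.96 * se))

# Hypothetical per-study counts of children whose enteral tolerance improved.
print(pooled_proportion(events=[10, 14, 8], totals=[12, 16, 10]))
```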

  8. Improving SFR Economics through Innovations from Thermal Design and Analysis Aspects

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Hongbin Zhang; Vincent Mousseau; Per F. Peterson

    2008-06-01

    Achieving economic competitiveness compared to LWRs and other Generation IV (Gen-IV) reactors is one of the major requirements for large-scale investment in commercial sodium cooled fast reactor (SFR) power plants. Advances in R&D for advanced SFR fuel and structural materials provide key long-term opportunities to improve SFR economics. In addition, other new opportunities are emerging to further improve SFR economics. This paper provides an overview of potential ideas from the perspective of thermal hydraulics to improve SFR economics. These include a new hybrid loop-pool reactor design to further optimize economics, safety, and reliability of SFRs with more flexibility, a multiple reheat and intercooling helium Brayton cycle to improve plant thermal efficiency and reduce safety related overnight and operation costs, and modern multi-physics thermal analysis methods to reduce analysis uncertainties and associated requirements for over-conservatism in reactor design. This paper reviews advances in all three of these areas and their potential beneficial impacts on SFR economics.

  9. Improved Proteomic Analysis Following Trichloroacetic Acid Extraction of Bacillus anthracis Spore Proteins

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Brooke LD; Wunschel, David S.; Sydor, Michael A.; Warner, Marvin G.; Wahl, Karen L.; Hutchison, Janine R.

    2015-08-07

    Proteomic analysis of bacterial samples provides valuable information about cellular responses and functions under different environmental pressures. Proteomic analysis is dependent upon efficient extraction of proteins from bacterial samples without introducing bias toward particular protein classes. While no single method can recover 100% of the bacterial proteins, selected protocols can improve overall protein isolation or peptide recovery, or enrich for certain classes of proteins. The method presented here is technically simple and does not require specialized equipment such as a mechanical disrupter. Our data reveal that for particularly challenging samples, such as B. anthracis Sterne spores, trichloroacetic acid extraction increased the number of proteins identified within a sample compared to bead beating (714 vs 660, respectively). Further, TCA extraction enriched for 103 known spore-specific proteins, whereas bead beating yielded 49 unique proteins. Analysis of C. botulinum samples grown for 5 days, composed of vegetative biomass and spores, showed a similar trend, with improved protein yields and identification using our method compared to bead beating. Interestingly, easily lysed samples, such as B. anthracis vegetative cells, were processed equally effectively via TCA and bead beating, but TCA extraction remains the easiest and most cost effective option. As with all assays, supplemental methods such as an alternative preparation method may provide additional insight into the protein biology of the bacteria being studied.

  10. Improving resolution of gravity data with wavelet analysis and spectral method

    Institute of Scientific and Technical Information of China (English)

    QIU Ning; HE Zhanxiang; CHANG Yanjun

    2007-01-01

    Gravity data are the result of gravity force field interactions from all underground sources. The objects of detection are always submerged in the background field, and thus one of the crucial problems in gravity data interpretation is how to improve the resolution of the observed information. The wavelet transform operator has recently been introduced into this domain, both as a filter and as a powerful source analysis tool. This paper studied the effects of improving the resolution of gravity data with wavelet analysis and a spectral method, and revealed the geometric characteristics of density heterogeneities described by simple shaped sources. First, the basic theory of multiscale wavelet analysis, its lifting scheme, and the spectral method are introduced. Through an experimental study on the forward simulation of anomalies given by the superposition of six objects, and on data measured in the Songliao plain, Northeast China, the shape, size and depth of the buried objects were estimated. The results were compared with those obtained by conventional techniques, which demonstrated that this method greatly improves the resolution of gravity anomalies.

  11. Simulations study of neutrino oscillation parameters with the Iron Calorimeter Detector (ICAL): an improved analysis

    CERN Document Server

    Mohan, Lakshmi S

    2016-01-01

    We present an updated and improved simulation analysis of precision measurements of neutrino oscillation parameters from the study of charged-current interactions of atmospheric neutrinos in the Iron Calorimeter (ICAL) detector at the proposed India-based Neutrino Observatory (INO). The present analysis is done in the extended muon energy range of 0.5--25 GeV, as compared to the previous analyses, which were limited to the 1--11 GeV range of muon energy. A substantial improvement in the precision measurement of the oscillation parameters in the 2--3 sector, including the magnitude and sign of the 2--3 mass-squared difference $\Delta m^2_{32}$ and especially $\theta_{23}$, is observed. The sensitivities are further improved by the inclusion of an additional systematic which constrains the ratio of neutrino to anti-neutrino fluxes. The best $1\sigma$ precision on $\sin^2 \theta_{23}$ and $|\Delta m^2_{32}|$ achievable with the new analysis for a 500 kTon yr exposure of ICAL are $\sim9\%$ and $\sim2.5\%$ respective...

  12. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    Science.gov (United States)

    Cloete, Bronwyn C; Bester, André

    2012-01-01

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a scientific method which is empirical, inductive and deductive, systematic, reliant on data, and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and

  13. A Lean Six Sigma approach to the improvement of the selenium analysis method

    Directory of Open Access Journals (Sweden)

    Bronwyn C. Cloete

    2012-02-01

    Full Text Available Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a scientific method which is empirical, inductive and deductive, systematic, reliant on data, and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any

  14. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    Science.gov (United States)

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a scientific method which is empirical, inductive and deductive, systematic, reliant on data, and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and

  15. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; Moseley, S. H.; Porst, J.-P.; Porter, F. S.; Sadleir, J. E.; Smith, S. J.

    2016-07-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or when the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an $^{55}$Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is a factor of two better than that achievable with digital optimal filters.
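
    As a rough illustration of the approach, a principal component decomposition of a set of digitized pulses can be obtained from the SVD of the mean-subtracted data matrix; the leading score then tracks pulse height. A minimal sketch with synthetic pulses (not the Mo/Au TES data):

```python
import numpy as np

def pca_pulse_components(pulses, n_components=4):
    """Decompose digitized pulse records (n_records x n_samples) via PCA.

    Returns the mean pulse, the leading orthogonal pulse-shape components,
    and the per-pulse scores (projections onto those components).
    """
    mean_pulse = pulses.mean(axis=0)
    centered = pulses - mean_pulse
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]
    scores = centered @ components.T
    return mean_pulse, components, scores

# Toy data: 200 noisy exponential pulses whose amplitude varies by ~10%.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
amp = 1 + 0.1 * rng.standard_normal(200)
pulses = amp[:, None] * np.exp(-t / 0.2) + 0.01 * rng.standard_normal((200, 512))
_, comps, scores = pca_pulse_components(pulses)
# The first score tracks pulse height (up to sign) and can be mapped to energy.
print(abs(np.corrcoef(scores[:, 0], amp)[0, 1]))
```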

  16. A NOVEL SPEECH ENHANCEMENT APPROACH BASED ON MODIFIED DCT AND IMPROVED PITCH SYNCHRONOUS ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. R. Balaji

    2014-01-01

    Full Text Available Speech enhancement has become an essential issue in the field of speech and signal processing, because of the necessity to enhance the performance of voice communication systems in noisy environments. A number of research works have been carried out in speech processing, but there is always room for improvement. The main aim is to enhance the apparent quality of the speech and to improve intelligibility. Signal representation and enhancement in the cosine transform domain is observed to provide significant results. The Discrete Cosine Transform (DCT) has been widely used for speech enhancement. In this research work, instead of the DCT, an Advanced DCT (ADCT) is used, which simultaneously offers energy compaction along with critical sampling and flexible window switching. In order to deal with the issue of frame-to-frame deviations of the cosine transform, the ADCT is integrated with Pitch Synchronous Analysis (PSA). Moreover, in order to improve the noise minimization performance of the system, an improved iterative Wiener filtering approach called Constrained Iterative Wiener Filtering (CIWF) is used. Thus, a novel ADCT-based speech enhancement approach using improved iterative filtering integrated with PSA is presented.
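
    As a generic illustration of cosine-transform-domain enhancement, the sketch below applies a Wiener-type gain to DCT coefficients frame by frame; it is a baseline, not the paper's ADCT/PSA/CIWF pipeline, and the noise variance is assumed to be estimated from a noise-only segment.

```python
import numpy as np
from scipy.fft import dct, idct

def dct_wiener_denoise(x, noise_var, frame=256):
    """Frame-wise enhancement with a Wiener-type gain in the DCT domain.

    noise_var is the per-coefficient noise variance, assumed estimated from
    a noise-only segment. Any trailing partial frame is passed through.
    """
    out = x.astype(float).copy()
    for start in range(0, len(x) - frame + 1, frame):
        c = dct(out[start:start + frame], type=2, norm='ortho')
        sig_power = np.maximum(c ** 2 - noise_var, 0.0)
        gain = sig_power / (sig_power + noise_var)   # Wiener gain per coefficient
        out[start:start + frame] = idct(gain * c, type=2, norm='ortho')
    return out

# Toy usage: a 440 Hz tone in white noise, sampled at 8 kHz.
rng = np.random.default_rng(1)
t = np.arange(8192) / 8000.0
noisy = np.sin(2 * np.pi * 440 * t) + 0.3 * rng.standard_normal(t.size)
enhanced = dct_wiener_denoise(noisy, noise_var=0.3 ** 2)
```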

  17. Exergy Analysis of a Subcritical Refrigeration Cycle with an Improved Impulse Turbo Expander

    Directory of Open Access Journals (Sweden)

    Zhenying Zhang

    2014-08-01

    Full Text Available The impulse turbo expander (ITE is employed to replace the throttling valve in the vapor compression refrigeration cycle to improve the system performance. An improved ITE and the corresponding cycle are presented. In the new cycle, the ITE not only acts as an expansion device with work extraction, but also serves as an economizer with vapor injection. An increase of 20% in the isentropic efficiency can be attained for the improved ITE compared with the conventional ITE owing to the reduction of the friction losses of the rotor. The performance of the novel cycle is investigated based on energy and exergy analysis. A correlation of the optimum intermediate pressure in terms of ITE efficiency is developed. The improved ITE cycle increases the exergy efficiency by 1.4%–6.1% over the conventional ITE cycle, 4.6%–8.3% over the economizer cycle and 7.2%–21.6% over the base cycle. Furthermore, the improved ITE cycle is also preferred due to its lower exergy loss.

  18. Multi-factor Analysis Model for Improving Profit Management Using Excel in Shellfish Farming Projects

    Institute of Scientific and Technical Information of China (English)

    Zhuming ZHAO; Changlin LIU; Xiujuan SHAN; Jin YU

    2013-01-01

    By using a farm's data from Yantai City, the theory of Cost-Volume-Profit analysis, and financial management methods, this paper constructs a multi-factor analysis model for improving profit management using Excel 2007 in shellfish farming projects and describes the procedure for constructing such a model. The model can quickly calculate profit, improve the level of profit management, find the breakeven point and enhance the decision-making efficiency of businesses. As a simple analysis tool, it can also offer suggestions for government decisions and economic decisions for corporations. While effort has been exerted to construct a four-variable model, some equally important variables may not be discussed sufficiently due to the limitations of the paper's space and the authors' knowledge. All variables can be listed in Excel 2007 and associated in a logical way to manage the profit of shellfish farming projects more efficiently and practically.
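
    The underlying Cost-Volume-Profit relation is profit = volume x (price - unit variable cost) - fixed cost, with the breakeven point where profit is zero. A Python rendering of the Excel model's core calculation (numbers are illustrative, not the Yantai farm's data):

```python
def cvp_profit(volume, price, unit_variable_cost, fixed_cost):
    """Cost-Volume-Profit relation: profit = volume*(price - uvc) - fixed cost."""
    return volume * (price - unit_variable_cost) - fixed_cost

def breakeven_volume(price, unit_variable_cost, fixed_cost):
    """Volume at which profit is zero; requires a positive contribution margin."""
    margin = price - unit_variable_cost
    if margin <= 0:
        raise ValueError("price must exceed unit variable cost")
    return fixed_cost / margin

# Hypothetical four-variable scenario for a shellfish farm (illustrative numbers).
print(cvp_profit(volume=50_000, price=8.0, unit_variable_cost=5.0, fixed_cost=90_000))
print(breakeven_volume(price=8.0, unit_variable_cost=5.0, fixed_cost=90_000))
```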

  19. Improved wavelet analysis in enhancing Electromagnetic Compatibility of underground monitoring system in coal mine

    Institute of Scientific and Technical Information of China (English)

    SUN Ji-ping; MA Feng-ying; WU Dong-xu; LIU Xiao-yang

    2008-01-01

    Underground electromagnetic interference (EMI) has become so serious that it causes false alarms in the monitoring system, which creates trouble for coal mine production safety. In order to overcome the difficulties caused by the explosion-proof enclosures of the equipment and the limitation of multiple startups and stops in the transient process during EMI measurement, a novel technique was proposed to measure the underground EMI distribution indirectly and enhance the Electromagnetic Compatibility (EMC) of the monitoring system. Wavelet time-frequency analysis was introduced to the underground monitoring system, so the sources, startup time, duration and waveform of EMI could be ascertained correctly from the running records of underground electric equipment. The electrical fast transient/burst (EFT/B) was studied to verify the validity of the wavelet analysis. The EMI filter was improved in accordance with the EMI distribution obtained from the wavelet analysis, and power port immunity was developed obviously. In addition, the method of setting wavelet thresholds was amended based upon conventional thresholds in the wavelet filter design, so the EFT/B at the data port was restrained markedly by wavelet filtering. The coordinated effect of the EMI power filter and the wavelet filter makes false alarms of the monitoring system decrease evidently. It is concluded that wavelet analysis and the improved EMI filter have obviously enhanced the EMC of the monitoring system.

  20. Improved disparity map analysis through the fusion of monocular image segmentations

    Science.gov (United States)

    Perlant, Frederic P.; Mckeown, David M.

    1991-01-01

    The focus is to examine how estimates of three-dimensional scene structure, as encoded in a scene disparity map, can be improved by analysis of the original monocular imagery. Surface illumination information is exploited by segmenting the monocular image into fine surface patches of nearly homogeneous intensity, which are used to remove mismatches generated during stereo matching. These patches guide a statistical analysis of the disparity map based on the assumption that such patches correspond closely to physical surfaces in the scene. The technique is quite independent of whether the initial disparity map was generated by automated area-based or feature-based stereo matching. Stereo analysis results are presented for a complex urban scene containing various man-made and natural features. This scene contains a variety of problems, including low building height with respect to the stereo baseline, buildings and roads in complex terrain, and highly textured buildings and terrain. Improvements due to monocular fusion with a set of different region-based image segmentations are demonstrated. The generality of this approach to stereo analysis and its utility in the development of general three-dimensional scene interpretation systems are also discussed.

  1. Improving Markov Chain Monte Carlo algorithms in LISA Pathfinder Data Analysis

    Science.gov (United States)

    Karnesis, N.; Nofrarias, M.; Sopuerta, C. F.; Lobo, A.

    2012-06-01

    The LISA Pathfinder mission (LPF) aims to test key technologies for the future LISA mission. The LISA Technology Package (LTP) on-board LPF will consist of an exhaustive suite of experiments, and its outcome will be crucial for the future detection of gravitational waves. In order to achieve maximum sensitivity, we need to have an understanding of every instrument on-board and parametrize the properties of the underlying noise models. The Data Analysis team has developed algorithms for parameter estimation of the system. A very promising one implemented for LISA Pathfinder data analysis is the Markov Chain Monte Carlo. A series of experiments are going to take place during flight operations, and each experiment is going to provide us with essential information for the next in the sequence. Therefore, it is a priority to optimize and improve the tools available for data analysis during the mission. Using a Bayesian framework for the analysis allows us to apply prior knowledge for each experiment, which means that we can efficiently use our prior estimates for the parameters, making the method more accurate and significantly faster. This, together with other algorithm improvements, will lead us to our main goal, which is none other than creating a robust and reliable tool for parameter estimation during the LPF mission.
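
    The key idea, carrying prior knowledge from one experiment into the analysis of the next, enters the sampler through the prior term of the log-posterior. Below is a minimal random-walk Metropolis sketch for one parameter with a Gaussian prior; this is a toy model, not the LTP analysis pipeline.

```python
import numpy as np

def log_posterior(theta, data, prior_mean, prior_sd):
    """Gaussian likelihood for the data mean plus a Gaussian prior on theta."""
    log_lik = -0.5 * np.sum((data - theta) ** 2)
    log_prior = -0.5 * ((theta - prior_mean) / prior_sd) ** 2
    return log_lik + log_prior

def metropolis(data, prior_mean, prior_sd, n_steps=20_000, step=0.05, seed=0):
    """Random-walk Metropolis sampler for a single parameter."""
    rng = np.random.default_rng(seed)
    theta = prior_mean                     # start from the prior estimate
    lp = log_posterior(theta, data, prior_mean, prior_sd)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_posterior(prop, data, prior_mean, prior_sd)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# An informative prior from a previous experiment sharpens the next estimate.
data = np.random.default_rng(42).normal(1.3, 1.0, size=200)
chain = metropolis(data, prior_mean=1.25, prior_sd=0.1)
print(chain[5_000:].mean())   # posterior mean after burn-in
```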

  2. Multiple breath washout analysis in infants: quality assessment and recommendations for improvement.

    Science.gov (United States)

    Anagnostopoulou, Pinelopi; Egger, Barbara; Lurà, Marco; Usemann, Jakob; Schmidt, Anne; Gorlanova, Olga; Korten, Insa; Roos, Markus; Frey, Urs; Latzin, Philipp

    2016-03-01

    Infant multiple breath washout (MBW) testing serves as a primary outcome in clinical studies. However, it is still unknown whether current software algorithms allow between-centre comparisons. In this study of healthy infants, we quantified MBW measurement errors and tried to improve data quality by simply changing software settings. We analyzed best-quality MBW measurements performed with an ultrasonic flowmeter in 24 infants from two centres in Switzerland using the current software settings. To challenge the robustness of these settings, we also used alternative analysis approaches. Using the current analysis software, the coefficient of variation (CV) for functional residual capacity (FRC) differed significantly between centres (mean ± SD (%): 9.8 ± 5.6 and 5.8 ± 2.9, respectively, p = 0.039). In addition, FRC values calculated during the washout differed by between -25 and +30% from those of the washin of the same tracing. Results were mainly influenced by analysis settings and temperature recordings. Changing a few algorithms resulted in significantly more robust analysis. Non-systematic inter-centre differences can be reduced by using correctly recorded environmental data and simple changes in the software algorithms. We provide recommendations that greatly improve the quality of infant MBW outcomes and that can be applied when multicentre trials are conducted. PMID:26849570

  3. Quantitative Transcript Analysis in Plants: Improved First-strand cDNA Synthesis

    Institute of Scientific and Technical Information of China (English)

    Nai-Zhong XIAO; Lei BA; Preben Bach HOLM; Xing-Zhi WANG; Steve BOWRA

    2005-01-01

    The quantity and quality of first-strand cDNA directly influence the accuracy of transcriptional analysis and quantification. Using a plant-derived α-tubulin as a model system, the effect of oligo sequence and DTT on the quality and quantity of first-strand cDNA synthesis was assessed via a combination of semi-quantitative PCR and real-time PCR. The results indicated that anchored oligo dT significantly improved the quantity and quality of α-tubulin cDNA compared to the conventional oligo dT. Similarly, omitting DTT from the first-strand cDNA synthesis also enhanced the levels of transcript. This is the first time that a comparative analysis has been undertaken for a plant system, and it shows conclusively that small changes to current protocols can have a very significant impact on transcript analysis.

  4. UiLog:Improving Log-Based Fault Diagnosis by Log Analysis

    Institute of Scientific and Technical Information of China (English)

    De-Qing Zou; Hao Qin; Hai Jin

    2016-01-01

    In modern computer systems, system event logs have always been the primary source for checking system status. As computer systems become more and more complex, the interaction between software and hardware increases in frequency. The components generate enormous amounts of log information, including running reports and fault information. The sheer quantity of data is a great challenge for analysis by manual methods. In this paper, we implement a management and analysis system for log information, which can assist system administrators in understanding the real-time status of the entire system, classify logs into different fault types, and determine the root cause of faults. In addition, we improve the existing fault correlation analysis method based on the results of system log classification. We apply the system in a cloud computing environment for evaluation. The results show that our system can classify fault logs automatically and effectively. With the proposed system, administrators can easily detect the root cause of faults.
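
    The abstract does not describe the classifier itself. As a sketch of the general idea of mapping raw event-log lines to fault types, the rules below are hypothetical keyword patterns, not the paper's categories or method:

```python
import re
from collections import Counter

# Hypothetical keyword rules mapping raw event-log lines to fault types;
# the paper's actual classifier and fault categories are not given here.
FAULT_RULES = [
    (re.compile(r"ecc|memory error", re.I), "memory"),
    (re.compile(r"disk|i/o error|raid", re.I), "storage"),
    (re.compile(r"link down|timeout|unreachable", re.I), "network"),
    (re.compile(r"temperature|fan", re.I), "hardware-env"),
]

def classify(line):
    """Assign a log line to the first matching fault type, else 'other'."""
    for pattern, fault_type in FAULT_RULES:
        if pattern.search(line):
            return fault_type
    return "other"

logs = [
    "kernel: EDAC MC0: 1 CE memory error on DIMM_A1",
    "sd 0:0:0:0: [sda] I/O error, dev sda, sector 123456",
    "eth0: link down",
]
print(Counter(classify(line) for line in logs))
```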

  5. Improvement of Epicentral Direction Estimation by P-wave Polarization Analysis

    Science.gov (United States)

    Oshima, Mitsutaka

    2016-04-01

    Polarization analysis is used to analyze the polarization characteristics of waves and has been developed in various fields, for example, electromagnetics, optics, and seismology. In seismology, polarization analysis is used to discriminate seismic phases or to enhance specific phases (e.g., Flinn, 1965)[1], by taking advantage of the difference in polarization characteristics of seismic phases. In earthquake early warning, polarization analysis is used to estimate the epicentral direction from a single station, based on the polarization direction of the P-wave portion of seismic records (e.g., Smart and Sproules (1981)[2], Noda et al. (2012)[3]). Therefore, improvement of the Estimation of Epicentral Direction by Polarization Analysis (EEDPA) directly enhances the accuracy and promptness of earthquake early warning. In this study, the author tried to improve EEDPA by using seismic records of events that occurred around Japan from 2003 to 2013. The author selected events that satisfy the following conditions: MJMA larger than 6.5 (JMA: Japan Meteorological Agency) and seismic records available at at least 3 stations within 300 km epicentral distance. Seismic records obtained at stations with no information on seismometer orientation were excluded, so that precise and quantitative evaluation of the accuracy of EEDPA becomes possible. In the analysis, polarization was calculated by the method of Vidale (1986)[4], which extended the method proposed by Montalbetti and Kanasewich (1970)[5] to use the analytic signal. As a result of the analysis, the author found that the accuracy of EEDPA improves by about 15% if velocity records, not displacement records, are used, contrary to the author's expectation. Use of velocity records enables a reduction of CPU time in the integration of seismic records and an improvement in the promptness of EEDPA, although this analysis is still rough and further scrutiny is essential. At this moment, the author used seismic records obtained by simply integrating acceleration
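
    The covariance-matrix form of polarization analysis used in this line of work estimates the P-wave particle-motion direction from the dominant eigenvector of the three-component covariance matrix. A minimal sketch on a synthetic P arrival (sign conventions assumed: vertical positive up, compressional first motion up and away from the source):

```python
import numpy as np

def pwave_backazimuth(z, n, e):
    """Back-azimuth from the P-wave window of a three-component record.

    The dominant eigenvector of the 3x3 covariance matrix gives the
    particle-motion direction; for a compressional onset (vertical up,
    horizontal motion away from the source) the negated horizontal
    projection points back toward the epicenter.
    """
    cov = np.cov(np.vstack([z, n, e]))
    _, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    p = vecs[:, -1]                        # principal polarization direction
    if p[0] < 0:                           # fix the sign so vertical is up
        p = -p
    return np.degrees(np.arctan2(-p[2], -p[1])) % 360.0  # from north via east

# Synthetic P arrival from a back-azimuth of 60 degrees.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
pulse = np.exp(-((t - 0.3) / 0.05) ** 2)
baz = np.radians(60.0)
z = pulse + 0.02 * rng.standard_normal(t.size)
n = -np.cos(baz) * pulse + 0.02 * rng.standard_normal(t.size)
e = -np.sin(baz) * pulse + 0.02 * rng.standard_normal(t.size)
print(pwave_backazimuth(z, n, e))   # ~60
```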

  6. Economic analysis of interventions to improve village chicken production in Myanmar.

    Science.gov (United States)

    Henning, J; Morton, J; Pym, R; Hla, T; Sunn, K; Meers, J

    2013-07-01

    A cost-benefit analysis using deterministic and stochastic modelling was conducted to identify the net benefits for households that adopt (1) vaccination of individual birds against Newcastle disease (ND) or (2) improved management of chick rearing, by providing coops to protect chicks from predation and chick starter feed inside a creep feeder to support chicks' nutrition, in village chicken flocks in Myanmar. Partial budgeting was used to assess the additional costs and benefits associated with each of the two interventions relative to neither strategy. In the deterministic model, over the first 3 years after the introduction of the interventions, the cumulative sum of the net differences from neither strategy was 13,189 Kyat for ND vaccination and 77,645 Kyat for improved chick management (effective exchange rate in 2005: 1000 Kyat = 1 $US). Both interventions were also profitable after discounting over a 10-year period; Net Present Values for ND vaccination and improved chick management were 30,791 and 167,825 Kyat, respectively. The Benefit-Cost Ratio for ND vaccination was very high (28.8). It was lower for improved chick management, due to the greater costs of the intervention, but still favourable at 4.7. Using both interventions concurrently yielded a Net Present Value of 470,543 Kyat and a Benefit-Cost Ratio of 11.2 over the 10-year period in the deterministic model. Using the stochastic model, for the first 3 years following the introduction of the interventions, the mean cumulative sums of the net differences were similar to the values obtained from the deterministic model. Sensitivity analysis indicated that the cumulative net differences were strongly influenced by grower bird sale income, particularly under improved chick management. The effects of the strategies on the odds of households selling and consuming birds after 7 months, and the numbers of birds sold or consumed after this period, also influenced profitability. Cost variations for
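
    For reference, the Net Present Value and Benefit-Cost Ratio quoted above are standard discounted-cash-flow quantities. A minimal sketch with hypothetical cash flows (the paper's partial-budget figures are not reproduced):

```python
def npv(cashflows, rate):
    """Net Present Value; cashflows[0] is year 0, later years are discounted."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(benefits, costs, rate):
    """Ratio of discounted benefits to discounted costs."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical 10-year streams (Kyat) for a vaccination programme; the
# paper's actual partial-budget figures are not reproduced here.
benefits = [0] + [12_000] * 10
costs = [3_000] + [400] * 10
rate = 0.10
print(npv([b - c for b, c in zip(benefits, costs)], rate))
print(benefit_cost_ratio(benefits, costs, rate))
```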

  7. Computerized lung sound analysis following clinical improvement of pulmonary edema due to congestive heart failure exacerbations

    Institute of Scientific and Technical Information of China (English)

    WANG Zhen; XIONG Ying-xia

    2010-01-01

    Background: Although acute congestive heart failure (CHF) patients typically present with abnormal auscultatory findings on lung examination, lung sounds are not normally subjected to rigorous analysis. The goals of this study were to use a computerized analytic acoustic tool to evaluate lung sound patterns in CHF patients during acute exacerbation and after clinical improvement, and to compare CHF profiles with those of normal individuals. Methods: Lung sounds throughout the respiratory cycle were captured using a computerized acoustic-based imaging technique. Thirty-two consecutive CHF patients were imaged at the time of presentation to the emergency department and after clinical improvement. Digital images were created, and the geographical area of the images and lung sound patterns were quantitatively analyzed. Results: The geographical areas of the vibration energy image of acute CHF patients without and with radiographically evident pulmonary edema (REPE) were (67.9±4.7) and (60.3±3.5) kilo-pixels, respectively (P<0.05). After clinical improvement, the geographical area of the vibration energy image of lung sounds in CHF patients without and with REPE increased to (74.5±4.4) and (73.9±3.9) kilo-pixels (P<0.05), respectively. Vibration energy decreased in CHF patients with REPE following clinical improvement by an average of (85±19)% (P<0.01). Conclusions: With clinical improvement of acute CHF exacerbations, there was a more homogeneous distribution of lung vibration energy, as demonstrated by the increased geographical area of the vibration energy image. Lung sound analysis may be useful for tracking acute CHF exacerbations.

  8. Recent Improvements at CEA on Trace Analysis of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    In this paper, we present some results of R and D work conducted at CEA to improve, on the one hand, the performance of the techniques already in use for the detection of undeclared activities and, on the other hand, to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring information complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences including mercury, lead or iridium atoms are in some cases necessary, and efforts must be put into improving the purification procedure. Micro-Raman spectrometry allows the chemical composition of uranium compounds to be determined at the scale of the microscopic object, using pre-location of the particles by SEM and relocation of these particles by mathematical calculation. However, particles below 5 μm are hardly relocated, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although the precision of isotopic ratio measurements is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high, and more developments are in progress to validate and improve this method. (author)

  9. Recent Improvements of Actinides Trace Analysis in Environmental Samples for Nuclear Activities Detection

    International Nuclear Information System (INIS)

    In this paper, we present some results of R and D work conducted at CEA to improve, on the one hand, the performance of the techniques already in use for the detection of undeclared activities and, on the other hand, to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring information complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences including mercury, lead or iridium atoms are in some cases necessary, and efforts must be put into improving the purification procedure. Micro-Raman spectrometry allows the chemical composition of uranium compounds to be determined at the scale of the microscopic object, using pre-location of the particles by SEM and relocation of these particles by mathematical calculation. However, particles below 5 μm are hardly relocated, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although the precision of isotopic ratio measurements is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high, and more developments are in progress to validate and improve this method. (author)

  10. A de-noising algorithm to improve SNR of segmented gamma scanner for spectrum analysis

    Science.gov (United States)

    Li, Huailiang; Tuo, Xianguo; Shi, Rui; Zhang, Jinzhao; Henderson, Mark Julian; Courtois, Jérémie; Yan, Minhao

    2016-05-01

    An improved threshold, shift-invariant wavelet transform de-noising algorithm for high-resolution gamma-ray spectroscopy is proposed to optimize the threshold function of the wavelet transform and reduce the pseudo-Gibbs artificial fluctuations in the signal. This algorithm was applied to a segmented gamma scanning system for large samples, in which high continuum levels caused by Compton scattering are routinely encountered. De-noising of gamma-ray spectra measured by the segmented gamma scanning system was evaluated with the improved, shift-invariant and traditional wavelet transform algorithms. The improved wavelet transform method generated significantly enhanced performance in the figure of merit, the root mean square error, the peak area, and the sample attenuation correction in the segmented gamma scanning assays. We also found from the spectrum analysis that the gamma energy spectrum can be viewed as a superposition of a low-frequency signal and high-frequency noise. Moreover, a smoothed spectrum can be appropriate for straightforward automated quantitative analysis.
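
    As a baseline for the kind of algorithm described, standard wavelet threshold de-noising of a spectrum looks as follows (using PyWavelets); the paper's improved threshold function and shift-invariant cycle spinning are not reproduced here.

```python
import numpy as np
import pywt

def wavelet_denoise(spectrum, wavelet="sym8", level=5):
    """Soft-threshold wavelet de-noising of a one-dimensional spectrum.

    Uses the Donoho-Johnstone universal threshold estimated from the
    finest-scale detail coefficients.
    """
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(spectrum)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

# Toy spectrum: a Gaussian peak on a Compton-like continuum with counting noise.
rng = np.random.default_rng(7)
ch = np.arange(2048)
clean = 200 * np.exp(-((ch - 900) / 8.0) ** 2) + 50 * np.exp(-ch / 1500.0)
smoothed = wavelet_denoise(rng.poisson(clean).astype(float))
```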

  11. Improving land cover classification using input variables derived from a geographically weighted principal components analysis

    Science.gov (United States)

    Comber, Alexis J.; Harris, Paul; Tsutsumida, Narumasa

    2016-09-01

    This study demonstrates the use of a geographically weighted principal components analysis (GWPCA) of remote sensing imagery to improve land cover classification accuracy. A principal components analysis (PCA) is commonly applied in remote sensing but generates global, spatially-invariant results. GWPCA is a local adaptation of PCA that locally transforms the image data and, in doing so, can describe spatial change in the structure of the multi-band imagery, directly reflecting the fact that many landscape processes are spatially heterogeneous. In this research the GWPCA localised loadings of MODIS data are used as textural inputs, along with GWPCA localised ranked scores and the image bands themselves, to three supervised classification algorithms. Using a reference data set for land cover to the west of Jakarta, Indonesia, the classification procedure was assessed via training and validation data splits of 80/20, repeated 100 times. For each classification algorithm, the inclusion of the GWPCA loadings data was found to significantly improve classification accuracy. Further, more moderate improvements in accuracy were found by additionally including GWPCA ranked scores as textural inputs, data that provide information on spatial anomalies in the imagery. The critical importance of considering both the spatial structure and the spatial anomalies of the imagery in the classification is discussed, together with the transferability of the new method to other studies. Research topics for method refinement are also suggested.
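
    The step that distinguishes GWPCA from a global PCA is that the covariance matrix is re-estimated at each location with kernel weights, so the loadings vary across space. A minimal sketch of one such local decomposition (toy data; the bandwidth and kernel are assumptions, not the paper's settings):

```python
import numpy as np

def gwpca_at_location(X, coords, loc, bandwidth):
    """Locally weighted PCA at one location: the core step of GWPCA.

    X: (n_pixels, n_bands) image values; coords: (n_pixels, 2) positions.
    A Gaussian kernel down-weights distant pixels, so the returned
    loadings describe the local, not global, band covariance structure.
    """
    d2 = np.sum((coords - loc) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    mu = np.average(X, axis=0, weights=w)
    Xc = (X - mu) * np.sqrt(w)[:, None]
    cov = Xc.T @ Xc / w.sum()              # weighted band covariance
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]
    return evals[order], evecs[:, order]   # local variances and loadings

# Toy six-band image flattened to pixel rows (values illustrative).
rng = np.random.default_rng(11)
coords = rng.uniform(0, 100, size=(500, 2))
X = rng.standard_normal((500, 6))
variances, loadings = gwpca_at_location(X, coords, np.array([50.0, 50.0]), 15.0)
```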

  12. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)

    2012-01-01

    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA-generated particles in an ICP is studied using a high-speed, high frame rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work on ICP torch developments specifically tailored to the improvement of LA sample introduction is presented. An abnormal scarcity of metal-argon polyatomic ions (MAr{sup +}) is observed during ICP-MS analysis. Evidence shows that MAr{sup +} ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  13. Improved Regression Analysis of Temperature-Dependent Strain-Gage Balance Calibration Data

    Science.gov (United States)

    Ulbrich, N.

    2015-01-01

    An improved approach is discussed that may be used to directly include first and second order temperature effects in the load prediction algorithm of a wind tunnel strain-gage balance. The improved approach was designed for the Iterative Method that fits strain-gage outputs as a function of calibration loads and uses a load iteration scheme during the wind tunnel test to predict loads from measured gage outputs. The improved approach assumes that the strain-gage balance is at a constant uniform temperature when it is calibrated and used. First, the method introduces a new independent variable for the regression analysis of the balance calibration data. The new variable is defined as the difference between the uniform temperature of the balance and a global reference temperature. This reference temperature should be the primary calibration temperature of the balance so that, if needed, a tare load iteration can be performed. Then, two temperature-dependent terms are included in the regression models of the gage outputs. They are the temperature difference itself and the square of the temperature difference. Simulated temperature-dependent data obtained from Triumph Aerospace's 2013 calibration of NASA's ARC-30K five component semi-span balance is used to illustrate the application of the improved approach.
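
    As a toy illustration of adding the two temperature terms to a regression model, the following Python/NumPy sketch fits a single simulated gage output against one load component plus the temperature difference dT and its square; the coefficients and data are invented for the example, not balance data.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 200
        load = rng.uniform(-1, 1, n)        # one load component, for brevity
        temp = rng.uniform(15, 45, n)       # uniform balance temperature, deg C
        t_ref = 22.0                        # global reference (calibration) temperature
        dT = temp - t_ref

        # Regressors: intercept, load, dT, dT**2 (first/second order temperature terms)
        X = np.column_stack([np.ones(n), load, dT, dT ** 2])
        output = 1000 * load + 0.8 * dT + 0.02 * dT ** 2 + rng.normal(0, 0.5, n)
        coef, *_ = np.linalg.lstsq(X, output, rcond=None)   # fitted gage-output model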

  14. Using Mobile Phones to Improve Educational Outcomes: An Analysis of Evidence from Asia

    Directory of Open Access Journals (Sweden)

    John-Harmen Valk

    2010-03-01

    Full Text Available Despite improvements in educational indicators, such as enrolment, significant challenges remain with regard to the delivery of quality education in developing countries, particularly in rural and remote regions. In the attempt to find viable solutions to these challenges, much hope has been placed in new information and communication technologies (ICTs), mobile phones being one example. This article reviews the evidence of the role of mobile phone-facilitated mLearning in contributing to improved educational outcomes in the developing countries of Asia by exploring the results of six mLearning pilot projects that took place in the Philippines, Mongolia, Thailand, India, and Bangladesh. In particular, this article examines the extent to which the use of mobile phones helped to improve educational outcomes in two specific ways: (1) in improving access to education, and (2) in promoting new learning. Analysis of the projects indicates that while there is important evidence of mobile phones facilitating increased access, much less evidence exists as to how mobiles promote new learning.

  15. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    Science.gov (United States)

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital. PMID:20703560

  16. An Improved Distance and Mass Estimate for Sgr A* from a Multistar Orbit Analysis

    CERN Document Server

    Boehle, A; Schödel, R; Meyer, L; Yelda, S; Albers, S; Martinez, G D; Becklin, E E; Do, T; Lu, J R; Matthews, K; Morris, M R; Sitarski, B; Witzel, G

    2016-01-01

    We present new, more precise measurements of the mass and distance of our Galaxy's central supermassive black hole, Sgr A*. These results stem from a new analysis that more than doubles the time baseline for astrometry of faint stars orbiting Sgr A*, combining two decades of speckle imaging and adaptive optics data. Specifically, we improve our analysis of the speckle images by using information about a star's orbit from the deep adaptive optics data (2005 - 2013) to inform the search for the star in the speckle years (1995 - 2005). When this new analysis technique is combined with the first complete re-reduction of Keck Galactic Center speckle images using speckle holography, we are able to track the short-period star S0-38 (K-band magnitude = 17, orbital period = 19 years) through the speckle years. We use the kinematic measurements from speckle holography and adaptive optics to estimate the orbits of S0-38 and S0-2 and thereby improve our constraints on the mass ($M_{bh}$) and distance ($R_o$) of Sgr A*: $...

  17. Silica Fume and Fly Ash Admixture Can Help to Improve RPC Durability: Combined Microscopic Analysis

    Directory of Open Access Journals (Sweden)

    Xiao Li-guang

    2016-01-01

    Full Text Available Silica fume/fly ash RPC can greatly improve durability. When silica fume replaced an equal 8% proportion of the cement and 10% of fly ash that had been mechanically activated by 15 min of milling was also incorporated, measurements with a chloride-ion flux detector showed that the impermeability of the doubly-admixed RPC improved significantly over the reference RPC. In addition, determination of the internal pore structure by the static nitrogen adsorption method showed that the integral pore volume of the admixed RPC was significantly lower than that of the reference RPC. Combined with SEM microscopic examination, analysis of the internal structure of the RPC and its formation mechanism shows that the silica fume/fly ash combination fully embodies the "synergistic" principle of composite admixtures.

  18. Improving analytic hierarchy process applied to fire risk analysis of public building

    Institute of Scientific and Technical Information of China (English)

    SHI Long; ZHANG RuiFang; XIE QiYuan; FU LiHua

    2009-01-01

    The structure importance in Fault Tree Analysis (FTA) reflects how important basic events are to the top event. The attribute weights at the alternative level in the Analytic Hierarchy Process (AHP) likewise reflect their importance to the general goal. Based on the coherence of these two methods, an improved AHP is put forward. Using this improved method, the importance of each attribute to the fire safety of a public building can be analyzed more credibly because subjective judgment is reduced. Olympic venues are very important public buildings in China, and their fire safety evaluation is a major issue for engineers. The improved AHP is a useful tool for the safety evaluation of these Olympic venues, and it will guide evaluations in other areas.

  19. Combining data fusion with multiresolution analysis for improving the classification accuracy of uterine EMG signals

    Science.gov (United States)

    Moslem, Bassam; Diab, Mohamad; Khalil, Mohamad; Marque, Catherine

    2012-12-01

    Multisensor data fusion is a powerful solution for solving difficult pattern recognition problems such as the classification of bioelectrical signals. It is the process of combining information from different sensors to provide more stable and more robust classification decisions. Here we combine data fusion with multiresolution analysis based on the wavelet packet transform (WPT) in order to classify real uterine electromyogram (EMG) signals recorded by 16 electrodes. The data fusion is done at the decision level by using a weighted majority voting (WMV) rule, while the WPT is used to achieve a significant enhancement in the classification performance of each channel by improving the discrimination power of the selected feature. We show that the proposed approach, tested on our recorded data, can improve the recognition accuracy in labor prediction and has a competitive and promising performance.
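
    The decision-level fusion step is easy to sketch. Below is a minimal Python/NumPy illustration of a weighted majority voting rule across per-channel classifier decisions; the labels and weights are invented placeholders, and in practice the weights would derive from each channel's validation performance.

        import numpy as np

        def weighted_majority_vote(decisions, weights):
            """decisions: (n_channels,) class labels; weights: (n_channels,)."""
            classes = np.unique(decisions)
            scores = [weights[decisions == c].sum() for c in classes]
            return classes[int(np.argmax(scores))]

        decisions = np.array([1, 0, 1, 1, 0])   # labels from 5 of the 16 electrodes
        weights = np.array([0.9, 0.6, 0.8, 0.7, 0.5])
        print(weighted_majority_vote(decisions, weights))   # -> 1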

  20. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics.

    Science.gov (United States)

    Yin, Jian; Fenley, Andrew T; Henriksen, Niel M; Gilson, Michael K

    2015-08-13

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by nonoptimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery.
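
    To illustrate the kind of computation that sensitivity analysis involves, the sketch below estimates, by central finite differences, how a toy host-guest interaction energy responds to a change in a Lennard-Jones well depth; it is written in Python with NumPy, and the geometry and parameters are placeholders rather than force-field values.

        import numpy as np

        pair_distances = np.array([3.4, 3.9, 4.5, 5.2])   # Angstrom, toy host-guest contacts

        def lj_energy(eps, sigma=3.5):
            """Sum of Lennard-Jones pair energies for a fixed geometry."""
            r = pair_distances
            return np.sum(4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6))

        eps0, d = 0.15, 1e-4                              # placeholder well depth, kcal/mol
        sens = (lj_energy(eps0 + d) - lj_energy(eps0 - d)) / (2 * d)
        print(f"dE/d(eps) = {sens:.3f} per unit change in well depth")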

  1. Development of an improved method to perform single particle analysis by TIMS for nuclear safeguards.

    Science.gov (United States)

    Kraiem, M; Richter, S; Kühn, H; Aregbe, Y

    2011-02-28

    A method is described that allows measurement of the isotopic composition of small uranium oxide particles (less than 1 μm in diameter) for nuclear safeguards purposes. In support of the development of reliable tools for the identification of uranium and plutonium signatures in trace amounts of nuclear materials, improvements in scanning electron microscopy (SEM) and thermal ionization mass spectrometry (TIMS), in combination with filament carburization and multiple ion counting (MIC) detection, were investigated. The method that has been set up enables the analysis of single particles by a combination of analytical tools, thus yielding morphological, elemental and isotopic information. Individual particles of certified reference materials (CRMs) containing uranium at femtogram levels were analysed. The results showed that the combination of techniques proposed in this work is suitable for the accurate determination of uranium isotope ratios in single particles, with improved capabilities for the minor abundant isotopes. PMID:21296200

  2. Analysis of Improvement on Human Resource Management within Chinese Enterprises in Economic Globalization

    Directory of Open Access Journals (Sweden)

    Lihui Xie

    2013-04-01

    Full Text Available In this study, we analyse the improvement of human resource management within Chinese enterprises under economic globalization. China's entry into the WTO has accelerated the pace of economic globalization of Chinese enterprises, and the Chinese economy is further integrated with the global economy on a global scope. Human resources are what the economic globalization of Chinese enterprises relies on, the first resource for China to participate in international competition, and the key to making effective use of other resources. Nevertheless, against the background of economic globalization, human resource management in Chinese enterprises still faces many challenges and problems. In order to establish a globalized concept of human resource management and set up a management mechanism that responds to economic globalization, this study discusses and proposes management methods and improvement measures for reference.

  3. State-of-the-art review of sodium fire analysis and current notions for improvements

    International Nuclear Information System (INIS)

    Sodium releases from postulated pipe ruptures, as well as failures of sodium handling equipment in liquid metal fast breeder reactors, may lead to substantial pressure-temperature transients in the sodium system cells, as well as in the reactor containment building. Sodium fire analyses are currently performed with analytical tools, such as the SPRAY, SOMIX, SPOOL-FIRE and SOFIRE-II codes. A review and evaluation of the state-of-the-art in sodium fire analysis is presented, and suggestions for further improvements are made. This work is based, in part, on studies made at Brookhaven National Laboratory during the past several years in the areas of model development and improvement associated with the accident analyses of LMFBRs

  4. Analysis of walking improvement with dynamic shoe insoles, using two accelerometers

    Science.gov (United States)

    Tsuruoka, Yuriko; Tamura, Yoshiyasu; Shibasaki, Ryosuke; Tsuruoka, Masako

    2005-07-01

    Orthopedists at the rehabilitation hospital have found that disorders caused by sports injuries to the feet or by lower-back pain are improved by wearing dynamic shoe insoles, which improve walking balance and stability. However, the relationship between the lower back and the knees, and the rate of increase in stability, had not been quantitatively analyzed. In this study, using two accelerometers, we quantitatively analyzed the reciprocal spatiotemporal contributions between the lower back and knee of patients with left lower-back pain by means of relative power contribution analysis. When the insoles were worn, the contribution of the left and right knee relative to the left lower-back pain was up to 26% (p<0.05), and the power contribution analysis of the left and right knee showed decreases of up to 67% (p<0.05). This shows an increase in stability.

  5. ANALYSIS AND IMPROVEMENT OF PRODUCTION EFFICIENCY IN A CONSTRUCTION MACHINE ASSEMBLY LINE

    Directory of Open Access Journals (Sweden)

    Alidiane Xavier

    2016-07-01

    Full Text Available Increased competitiveness in the market encourages the ongoing development of systems and production processes. The aim is to increase production efficiency so that production costs and waste are reduced to a minimum, allowing increased product competitiveness. The objective of this study was to analyze the overall results of implementing a Kaizen philosophy at a manufacturer of construction machinery, using action research methodology to study in situ the macro production process, from the receipt of parts to the end of the assembly line, prioritizing the analysis of shipping and handling time. The results show that the continuous improvement activities directly impact the elimination of waste from the assembly process, mainly related to shipping and handling, improving production efficiency by 30% in the studied processes.

  6. Error analysis and algorithm implementation for an improved optical-electric tracking device based on MEMS

    Science.gov (United States)

    Sun, Hong; Wu, Qian-zhong

    2013-09-01

    In order to improve the precision of an optical-electric tracking device, an improved MEMS-based design is proposed that addresses the tracking error and random drift of the gyroscope sensor. Following the principles of time-series analysis of random sequences, an AR model of the gyro random error is established on the basis of a Kalman filter algorithm, and the gyro output signals are filtered repeatedly with the Kalman filter. An ARM microcontroller drives the servo motor under a fuzzy PID full closed-loop control algorithm, with lead-correction and feed-forward links added to reduce the response lag to angle inputs: feed-forward makes the output follow the input closely, while the lead-compensation link shortens the response to input signals and thereby reduces errors. A wireless video module gathers video signals and sends them to an upper computer, where remote monitoring software (Visual Basic 6.0) displays the servo motor's running state in real time. A detailed analysis of the main error sources is also given: quantitative analysis of the errors from the bandwidth and the gyro sensor makes the proportion of each error in the whole error budget more intuitive and consequently helps decrease the error of the system. Simulation and experimental results show that the system has good following characteristics, which is very valuable for engineering applications.

  7. Primary health care contribution to improve health outcomes in Bogota-Colombia: a longitudinal ecological analysis

    Directory of Open Access Journals (Sweden)

    Mosquera Paola A

    2012-08-01

    Full Text Available Background Colombia has a highly segmented and fragmented national health system that contributes to inequitable health outcomes. In 2004 the district government of Bogota initiated a Primary Health Care (PHC) strategy to improve health care access and population health status. This study aims to analyse the contribution of the PHC strategy to the improvement of health outcomes controlling for socioeconomic variables. Methods A longitudinal ecological analysis using data from secondary sources was carried out. The analysis used data from 2003 and 2007 (one year before and 3 years after the PHC implementation). A Primary Health Care Index (PHCI) of coverage intensity was constructed. According to the PHCI, localities were classified into two groups: high and low coverage. A multivariate analysis using a Poisson regression model for each year separately and a panel Poisson regression model to assess changes between the groups over the years was developed. Dependent variables were infant mortality rate, under-5 mortality rate, infant mortality rate due to acute diarrheal disease and pneumonia, prevalence of acute malnutrition, vaccination coverage for diphtheria, pertussis, tetanus (DPT) and prevalence of exclusive breastfeeding. The independent variable was the PHCI. Control variables were sewerage coverage, health system insurance coverage and quality of life index. Results The high PHCI localities as compared with the low PHCI localities showed significant risk reductions of under-5 mortality (13.8%) and infant mortality due to pneumonia (37.5%) between 2003 and 2007. The probability of being vaccinated for DPT also showed a significant increase of 4.9%. The risk of infant mortality and of acute malnutrition in children under-5 years was lesser in the high coverage group than in the low one; however relative changes were not statistically significant. Conclusions Despite the adverse contextual conditions and the limitations imposed by the
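
    The Poisson regression at the heart of the Methods above can be sketched briefly. The following Python example assumes the statsmodels package and entirely synthetic data; the variable names, effect sizes and sample size are invented for illustration, not the study's values.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 40                                       # synthetic locality-years
        phci_high = rng.integers(0, 2, n)            # 1 = high PHC coverage
        sewerage = rng.uniform(0.6, 1.0, n)          # control variable
        population = rng.uniform(2e4, 2e5, n)        # children under 5 (exposure)
        log_rate = -7.0 - 0.15 * phci_high - 0.5 * sewerage
        deaths = rng.poisson(np.exp(log_rate) * population)

        X = sm.add_constant(np.column_stack([phci_high, sewerage]))
        model = sm.GLM(deaths, X, family=sm.families.Poisson(),
                       offset=np.log(population)).fit()
        print(model.params)   # coefficient on phci_high ~ log rate ratio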

  8. Polymer-modified Concrete with Improved Flexural Toughness and Mechanism Analysis

    Institute of Scientific and Technical Information of China (English)

    CAO Qingyu; SUN Wei; GUO Liping; ZHANG Guorong

    2012-01-01

    By mixing different types of polymer into concrete, the toughness of concrete is investigated; the results indicate that polymer has an obvious effect in improving the toughness of concrete. The microstructure of polymer-modified concrete was studied with an environmental scanning electron microscope and a digital micro-hardness tester; the results show that the polymer acts as a flexible filler and reinforcement in concrete and alters the microstructure of the mortar and the interfacial transition zone (ITZ). Crack path prediction and energy consumption analysis show that the crack path of polymer-modified concrete is more tortuous, and consumes more energy, than that of ordinary concrete.

  9. Business analysis for Wal-Mart, a grocery retail chain, and improvement proposals

    OpenAIRE

    BARBERÁ MARCILLA, LAURA

    2014-01-01

    This study consists of the analysis of a very big grocery retail chain and a proposal of a series of improvements that I consider can help the company grow in the future. Wal-Mart Stores, Inc. is a multinational retail corporation that runs large discount superstores and warehouses. It was founded less than fifty years ago by Sam Walton and his brother Bud in Bentonville, Arkansas (USA). With sales over $300 billion a year, Wal-Mart is considered one of the world's most valuable companies...

  10. Analysis and Improvement of TCP Congestion Control Mechanism Based on Global Optimization Model

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Network flow control is formulated as a global optimization problem of user profit, and a general global optimization flow control model is established. This model, combined with the stochastic model of TCP, is used to study the global rate allocation characteristics of TCP. Analysis shows that when active queue management is used in the network, TCP rates tend to be allocated to maximize the aggregate of a user utility function Us (called Us fairness). The TCP throughput formula is derived, and an improved TCP congestion control mechanism is proposed. Simulations show that its throughput is TCP-friendly when competing with existing TCP and that its rate changes are smoother; it is therefore suitable for carrying multimedia applications.
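
    The flavor of such a global optimization model can be shown in a few lines. The sketch below solves a toy network utility maximization problem (log utilities under link capacity constraints, the usual proportional-fairness formulation) with a dual gradient method in Python/NumPy; the topology, step size and price clamp are illustrative assumptions, not the paper's model.

        import numpy as np

        R = np.array([[1, 1, 0],        # routing matrix: link x flow incidence
                      [0, 1, 1]], dtype=float)
        c = np.array([1.0, 2.0])        # link capacities
        w = np.array([1.0, 1.0, 1.0])   # utility weights, U_s(x) = w_s * log(x_s)

        p = np.ones(2)                  # link prices (dual variables)
        for _ in range(5000):
            x = w / (R.T @ p)           # each flow maximizes its own net utility
            # price update driven by congestion; clamp keeps prices positive
            p = np.maximum(1e-3, p + 0.01 * (R @ x - c))

        print(x, R @ x)  # proportionally fair rates; link loads approach capacities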

  11. Improved Method for the Flow Injection Analysis of Chemical Oxygen Demand Using Silver Nitrate

    OpenAIRE

    Korenaga, Takashi; Ikatsu, Hisayoshi; Moriwake, Tosio; Takahashi, Teruo

    1980-01-01

    For the flow injection analysis (FIA) of chemical oxygen demand (COD), a silver salt was added as an oxidation catalyst for COD substances and as a masking agent for halides, to improve the operating conditions of the FIA apparatus. A potassium permanganate solution of suitable concentration and a 6.0 % sulfuric acid solution containing 0.1 % silver nitrate are pumped separately, each at a flow rate of 0.51 ml min(-1), and merged into a carrier stream. A 20 μl sample solution is injected ...

  12. Cost-benefit analysis of improved air quality in an office building

    DEFF Research Database (Denmark)

    Djukanovic, R.; Wargocki, Pawel; Fanger, Povl Ole

    2002-01-01

    A cost-benefit analysis of measures to improve air quality in an existing air-conditioned office building (11581 m2, 864 employees) was carried out for hot, temperate and cold climates and for two operating modes: Variable Air Volume (VAV) with economizer; and Constant Air Volume (CAV) with heat recovery. The annual energy cost and first cost of the HVAC system were calculated using DOE 2.1E for different levels of air quality (10-50% dissatisfied). This was achieved by changing the outdoor air supply rate and the pollution loads. Previous studies have documented a 1.1% increase in office...

  13. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-01-01

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  14. Improving resolution and depth of astronomical observations via modern mathematical methods for image analysis

    CERN Document Server

    Castellano, Marco; Fontana, Adriano; Merlin, Emiliano; Pilo, Stefano; Falcone, Maurizio

    2015-01-01

    In the past years modern mathematical methods for image analysis have led to a revolution in many fields, from computer vision to scientific imaging. However, some recently developed image processing techniques successfully exploited by other sectors have rarely, if ever, been tried on astronomical observations. We present here tests of two classes of variational image enhancement techniques, "structure-texture decomposition" and "super-resolution", showing that they are effective in improving the quality of observations. Structure-texture decomposition allows the recovery of faint sources previously hidden by the background noise, effectively increasing the depth of available observations. Super-resolution yields a higher-resolution, better-sampled image from a set of low-resolution frames, thus mitigating problems in data analysis arising from differences in resolution/sampling between instruments, as in the case of the EUCLID VIS and NIR imagers.

  15. Improvements in the vapor-time profile analysis of explosive odorants using solid-phase microextraction.

    Science.gov (United States)

    Young, Mimy; Schantz, Michele; MacCrehan, William

    2016-07-15

    A modified approach for characterizing the vapor-time profile of the headspace odors of explosives was developed using solid-phase microextraction (SPME) incorporating an externally-sampled internal standard (ESIS), followed by gas chromatography/mass spectrometry (GC/MS) analysis. With this new method, the reproducibility of the measurements of 2-ethyl-1-hexanol and cyclohexanone was improved compared to previous work (Hoffman et al., 2009; Arthur and Pawliszyn, 1990) through the use of stable-isotope-labeled internal standards. Exposing the SPME fiber to the ESIS after sampling the target analyte proved to be advantageous, while still correcting for fiber variability and detector drift. For the analysis of high-volatility compounds, incorporation of the ESIS using the SPME fiber in the retracted position minimized the subsequent competitive loss of the target analyte, allowing for much longer sampling times. PMID:27286650

  16. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  17. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.H.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [John Wreathall & Co., Dublin, OH (United States)] [and others

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  18. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods for the improvement of the parameters of the lead molding process. For this reason, Fault Mode and Effect Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) regardless of the previously defined potential problem and its preventive action. This contributed to the recognition of root causes and corrective actions and to a change of production parameters. It showed how these methods, the level of their organization, and systematic and rigorous study affect the molding process parameters.

  19. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability can be most beneficial when the system changes: if changes are made to high-level requirements, it implies that low-level requirements need to be modified. Traceability ensures that requirements are appropriately and efficiently verified at various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.
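
    A traceability matrix is straightforward to represent in code. The sketch below, in plain Python with invented requirement and test identifiers, flags low-level requirements to which no verification test traces; it is an illustrative toy, not a description of any specific tool.

        # Hypothetical identifiers: high-level requirements map to low-level ones,
        # and low-level requirements map to the tests that verify them.
        high_to_low = {
            "HLR-1": ["LLR-1.1", "LLR-1.2"],
            "HLR-2": ["LLR-2.1"],
        }
        low_to_tests = {
            "LLR-1.1": ["TC-101"],
            "LLR-1.2": [],                 # not yet verified
            "LLR-2.1": ["TC-201", "TC-202"],
        }

        for hlr, llrs in high_to_low.items():
            for llr in llrs:
                if not low_to_tests.get(llr):
                    print(f"{hlr} -> {llr}: no verification test traced")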

  20. Structure of CPV17 polyhedrin determined by the improved analysis of serial femtosecond crystallographic data

    International Nuclear Information System (INIS)

    The X-ray free-electron laser (XFEL) allows the analysis of small weakly diffracting protein crystals, but has required very many crystals to obtain good data. Here we use an XFEL to determine the room temperature atomic structure for the smallest cytoplasmic polyhedrosis virus polyhedra yet characterized, which we failed to solve at a synchrotron. These protein microcrystals, roughly a micron across, accrue within infected cells. We use a new physical model for XFEL diffraction, which better estimates the experimental signal, delivering a high-resolution XFEL structure (1.75 Å), using fewer crystals than previously required for this resolution. The crystal lattice and protein core are conserved compared with a polyhedrin with less than 10% sequence identity. We explain how the conserved biological phenotype, the crystal lattice, is maintained in the face of extreme environmental challenge and massive evolutionary divergence. Our improved methods should open up more challenging biological samples to XFEL analysis

  1. Linear analysis of the vertical shear instability: outstanding issues and improved solutions (Research Note)

    CERN Document Server

    Umurhan, O M; Gressel, O

    2015-01-01

    The Vertical Shear Instability is one of two known mechanisms potentially active in the so-called dead zones of protoplanetary accretion disks. A recent analysis indicates that a subset of unstable modes shows unbounded growth - both as resolution is increased and when the nominal lid of the atmosphere is extended, possibly indicating ill-posedness in previous attempts of linear analysis. The reduced equations governing the instability are revisited and the generated solutions are examined using both the previously assumed separable forms and an improved non-separable solution form that is herewith introduced. Analyzing the reduced equations using the separable form shows that, while the low-order body modes have converged eigenvalues and eigenfunctions (as both the vertical boundaries of the atmosphere are extended and with increased radial resolution), it is also confirmed that the corresponding high-order body modes and the surface modes do indeed show unbounded growth rates. However, the energy contained ...

  2. Improved Persistent Scatterer analysis using Amplitude Dispersion Index optimization of dual polarimetry data

    Science.gov (United States)

    Esmaeili, Mostafa; Motagh, Mahdi

    2016-07-01

    Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual/quad polarized data provides an additional source of information that can further improve the capability of InSAR time-series analysis. In this paper we use dual-polarized data and combine Amplitude Dispersion Index (ADI) optimization of pixels with a phase stability criterion for PSInSAR analysis. ADI optimization is performed using a simulated annealing algorithm to increase the number of Persistent Scatterer Candidates (PSCs). The phase stability of the PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) SAR images acquired by TerraSAR-X from July 2013 to January 2014 over a subsidence area in Iran and compare the effectiveness of the method for both agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in urban and non-urban regions. Compared to single-pol data, the use of the optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with a high rate of deformation, which suffer from losing phase stability over time. The classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of the PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly the single-bounce mechanism.
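
    The ADI-optimization step can be sketched compactly. The Python/NumPy example below minimizes the amplitude dispersion index of a single pixel over a projection angle between the HH and VV channels by simulated annealing; the synthetic stack, the real-valued projection and the cooling schedule are simplifying assumptions, not the paper's exact formulation.

        import numpy as np

        rng = np.random.default_rng(4)
        n_scenes = 17   # matches the stack size above; values are synthetic
        hh = (1.0 + 0.05 * rng.normal(size=n_scenes)) * np.exp(1j * rng.normal(0, 0.1, n_scenes))
        vv = (0.8 + 0.20 * rng.normal(size=n_scenes)) * np.exp(1j * rng.normal(0, 0.3, n_scenes))

        def adi(alpha):
            """Amplitude dispersion (std/mean) of a combined channel."""
            amp = np.abs(np.cos(alpha) * hh + np.sin(alpha) * vv)
            return amp.std() / amp.mean()

        alpha, energy, temp = 0.0, adi(0.0), 1.0
        while temp > 1e-3:
            cand = alpha + rng.normal(0, 0.3)
            e_cand = adi(cand)
            if e_cand < energy or rng.random() < np.exp(-(e_cand - energy) / temp):
                alpha, energy = cand, e_cand   # Metropolis acceptance rule
            temp *= 0.99                       # geometric cooling schedule

        print(alpha, energy)  # optimum scattering mechanism angle and its ADI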

  3. Improvements of instrumental proximate and ultimate analysis of coals and coal conversion products

    Energy Technology Data Exchange (ETDEWEB)

    Selucky, M.L.; Iacchelli, A.; Murray, C.; Lieshout, T. van.

    1982-06-01

    Comparison of proximate analyses obtained using ASTM (American Society for Testing and Materials) methods with those from the Fisher coal analyzer shows that the analyzer gives consistently low moisture and ash values, and high volatile matter values. While the accuracy of moisture and ash determinations can be improved by introducing various instrument and crucible modifications, volatile matter values are less accurate, mainly because of differences in heating rates. However, reproducibility of results is very good and, with modifications, the instrument can be used to advantage for internal purposes, chiefly because of its large sample capacity. In ultimate analysis of coals using the Perkin-Elmer element analyzer, the main problem is that the initial purge-gas flushing period after sample introduction partially removes water from the sample. Trials of various methods of sample drying have shown that the best approach is to dry the sample directly in the instrument at the temperature used for moisture determination; with this modification of the analytical cycle, excellent reproducibility and correlation with the ASTM method have been achieved. The proximate and ultimate analyses of samples of extracts and extract residues are impaired by the presence of residual solvent. The samples can contain up to 10% residual solvent, which appears as moisture in the proximate analysis. The report describes several ways of removing the solvent so that accurate analyses can be obtained. The foregoing modifications to procedures and equipment have considerably improved both the accuracy and the reliability of results obtained by instrumental methods. In consequence, considerably more samples can be handled than by using ASTM standard procedures. 4 refs., 1 figs., 19 tabs.

  4. Diesel engine noise source identification based on EEMD, coherent power spectrum analysis and improved AHP

    International Nuclear Information System (INIS)

    As the essential foundation of noise reduction, many noise source identification methods have been developed and applied to engineering practice. To identify the noise source in the board-band frequency of different engine parts at various typical speeds, this paper presents an integrated noise source identification method based on the ensemble empirical mode decomposition (EEMD), the coherent power spectrum analysis, and the improved analytic hierarchy process (AHP). The measured noise is decomposed into several IMFs with physical meaning, which ensures the coherence analysis of the IMFs and the vibration signals are meaningful. An improved AHP is developed by introducing an objective weighting function to replace the traditional subjective evaluation, which makes the results no longer dependent on the subject performances and provides a better consistency in the meantime. The proposed noise identification model is applied to identifying a diesel engine surface radiated noise. As a result, the frequency-dependent contributions of different engine parts to different test points at different speeds are obtained, and an overall weight order is obtained as oil pan  >  left body  >  valve chamber cover  >  gear chamber casing  >  right body  >  flywheel housing, which provides an effectual guidance for the noise reduction. (paper)
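
    The front end of this identification chain, EEMD followed by coherence scoring, can be sketched briefly. The Python example below assumes the PyEMD package (published on PyPI as EMD-signal) and SciPy, with synthetic stand-ins for the measured noise and one vibration channel; the AHP weighting stage is omitted.

        import numpy as np
        from PyEMD import EEMD
        from scipy.signal import coherence

        fs = 2000
        t = np.arange(0, 2.0, 1 / fs)
        rng = np.random.default_rng(5)
        # Synthetic vibration channel (e.g., oil pan) and measured noise signal.
        vib = np.sin(2 * np.pi * 120 * t) + 0.3 * rng.normal(size=t.size)
        noise = (0.8 * np.sin(2 * np.pi * 120 * t)
                 + 0.5 * np.sin(2 * np.pi * 37 * t)
                 + 0.3 * rng.normal(size=t.size))

        imfs = EEMD(trials=50).eemd(noise, t)   # ensemble empirical mode decomposition
        for k, imf in enumerate(imfs):
            f, coh = coherence(imf, vib, fs=fs, nperseg=512)
            print(f"IMF{k + 1}: mean coherence {coh.mean():.2f}, "
                  f"peak {coh.max():.2f} at {f[np.argmax(coh)]:.0f} Hz")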

  5. Security analysis and improvements of authentication and access control in the Internet of Things.

    Science.gov (United States)

    Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon

    2014-08-13

    Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18-21 June 2012, pp. 588-592). According to our analysis, Jing et al.'s protocol is costly in the message exchange and the security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements facilitate many services to the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks, and achieves better efficiency at low communication cost.

  6. Efficacy of e-technologies in improving breastfeeding outcomes among perinatal women: a meta-analysis.

    Science.gov (United States)

    Lau, Ying; Htun, Tha P; Tam, Wai S W; Klainin-Yobas, Piyanee

    2016-07-01

    A growing line of research has highlighted that e-technologies may play a promising role in improving breastfeeding outcomes. The objective of this review was to synthesise the best available evidence by conducting a meta-analysis to evaluate whether e-technologies have any effect in improving breastfeeding outcomes among perinatal women. The review was conducted using nine electronic databases to search for English-language research studies from 2007 to 2014. A 'risk of bias' table was used to assess methodological quality. Meta-analysis was performed with the RevMan software; the Q test and I² test were used to assess heterogeneity, and the overall effect was assessed using z-statistics. Significant improvements were found in breastfeeding attitude (z = 3.01, P = 0.003) and breastfeeding knowledge (z = 4.54, P < 0.00001) in subgroup analyses. This review provides support for the development of web-based, text-messaging, compact disc read-only memory, electronic prompt and interactive computer agent interventions for promoting and supporting breastfeeding. PMID:26194599

  7. An Improved Time-Frequency Analysis Method in Interference Detection for GNSS Receivers.

    Science.gov (United States)

    Sun, Kewen; Jin, Tian; Yang, Dongkai

    2015-04-21

    In this paper, an improved joint time-frequency (TF) analysis method based on a reassigned smoothed pseudo Wigner-Ville distribution (RSPWVD) is proposed for interference detection in Global Navigation Satellite System (GNSS) receivers. In the RSPWVD, a two-dimensional low-pass smoothing function is introduced to eliminate the cross-terms present in the quadratic TF distribution, and at the same time the reassignment method is adopted to improve the TF concentration properties of the auto-terms of the signal components. The proposed interference detection method is evaluated by experiments on GPS L1 signals in jamming scenarios and compared to state-of-the-art interference detection approaches. The analysis results show that the proposed technique effectively overcomes the cross-term problem while preserving good TF localization properties, which has proven effective in enhancing the interference detection performance of GNSS receivers, particularly in jamming environments.
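
    For readers who want to experiment, the sketch below implements a plain discrete smoothed pseudo Wigner-Ville distribution in Python/NumPy, applied to a synthetic chirp; the reassignment step of the RSPWVD is omitted for brevity, and the window choices are illustrative assumptions.

        import numpy as np

        def spwvd(x, n_fbins=128, h_len=63, g_len=9):
            """Smoothed pseudo Wigner-Ville distribution (no reassignment)."""
            x = np.asarray(x, dtype=complex)
            n = x.size
            h = np.hamming(h_len)                   # lag (frequency-smoothing) window
            g = np.hamming(g_len)
            g /= g.sum()                            # time-smoothing window
            half_h, half_g = h_len // 2, g_len // 2
            tfr = np.zeros((n_fbins, n), dtype=complex)
            for t in range(n):
                tau_max = min(half_h, t, n - 1 - t, n_fbins // 2 - 1)
                for tau in range(-tau_max, tau_max + 1):
                    s_max = min(half_g, t - abs(tau), n - 1 - t - abs(tau))
                    s = np.arange(-s_max, s_max + 1)
                    # time-smoothed instantaneous autocorrelation at lag tau
                    r = np.sum(g[s + half_g] * x[t + s + tau] * np.conj(x[t + s - tau]))
                    tfr[tau % n_fbins, t] = h[tau + half_h] * r
            # DFT over the lag axis; bin k corresponds to k/(2*n_fbins) cycles/sample
            return np.fft.fft(tfr, axis=0).real

        t = np.arange(256)
        sig = np.exp(1j * 2 * np.pi * (0.05 * t + 0.0003 * t ** 2))  # linear chirp
        tfr = spwvd(sig)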

  8. Improving the sensory quality of flavored liquid milk by engaging sensory analysis and consumer preference.

    Science.gov (United States)

    Zhi, Ruicong; Zhao, Lei; Shi, Jingye

    2016-07-01

    Developing innovative products that satisfy various groups of consumers helps a company maintain a leading market share. The hedonic scale and just-about-right (JAR) scale are 2 popular methods for hedonic assessment and product diagnostics. In this paper, we chose to study flavored liquid milk because it is one of the most necessary nutrient sources in China. The hedonic scale and JAR scale methods were combined to provide directional information for flavored liquid milk optimization. Two methods of analysis (penalty analysis and partial least squares regression on dummy variables) were used and the results were compared. This paper had 2 aims: (1) to investigate consumer preferences of basic flavor attributes of milk from various cities in China; and (2) to determine the improvement direction for specific products and the ideal overall liking for consumers in various cities. The results showed that consumers in China have local-specific requirements for characteristics of flavored liquid milk. Furthermore, we provide a consumer-oriented product design method to improve sensory quality according to the preference of particular consumers. PMID:27108179
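
    Penalty analysis itself is a small computation. The Python/pandas sketch below derives mean drops (penalties) in overall liking for the "too little" and "too much" JAR groups relative to the just-about-right group, weighted by segment share; the attribute name and ratings are invented illustrative data.

        import pandas as pd

        df = pd.DataFrame({
            "sweetness_jar": ["too little", "JAR", "too much", "JAR", "too little",
                              "too much", "JAR", "JAR", "too little", "too much"],
            "liking": [5, 8, 4, 7, 6, 5, 9, 8, 5, 4],   # 9-point hedonic scale
        })

        jar_mean = df.loc[df.sweetness_jar == "JAR", "liking"].mean()
        for level, grp in df[df.sweetness_jar != "JAR"].groupby("sweetness_jar"):
            penalty = jar_mean - grp.liking.mean()      # mean drop vs. JAR group
            share = len(grp) / len(df)                  # segment size
            print(f"{level}: penalty={penalty:.2f}, share={share:.0%}, "
                  f"weighted={penalty * share:.2f}")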

  9. Using uncertainty analysis and groundwater measurements to improve the confidence of river water balance estimates

    Science.gov (United States)

    Adams, R.; Costelloe, J. F.; Western, A. W.; George, B.

    2013-10-01

    An improved understanding of water balances of rivers is fundamental in water resource management. Effective use of a water balance approach requires thorough identification of sources of uncertainty around all terms in the analysis and can benefit from additional, independent information that can be used to interpret the accuracy of the residual term of a water balance. We use a Monte Carlo approach to estimate a longitudinal river channel water balance and to identify its sources of uncertainty for a regulated river in south-eastern Australia, assuming that the residual term of this water balance represents fluxes between groundwater and the river. Additional information from short term monitoring of ungauged tributaries and groundwater heads is used to further test our confidence in the estimates of error and variance for the major components of this water balance. We identify the following conclusions from the water balance analysis. First, improved identification of the major sources of error in consecutive reaches of a catchment can be used to support monitoring infrastructure design to best reduce the largest sources of error in a water balance. Second, estimation of ungauged inflow using rainfall-runoff modelling is sensitive to the representativeness of available gauged data in characterising the flow regime of sub-catchments along a perennial to intermittent continuum. Lastly, comparison of temporal variability of stream-groundwater head difference data and a residual water balance term provides an independent means of assessing the assumption that the residual term represents net stream-groundwater fluxes.
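
    The Monte Carlo residual idea can be illustrated in a few lines. The Python/NumPy sketch below propagates assumed percentage errors through a toy longitudinal water balance and reports the percentile range of the residual, interpreted as the net stream-groundwater flux; all magnitudes and error models are invented, not the study's calibrated values.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 10_000
        inflow = 120.0 * (1 + rng.normal(0, 0.05, n))      # upstream gauge, ML/day, ±5%
        tributaries = 15.0 * (1 + rng.normal(0, 0.30, n))  # modelled ungauged inflow, ±30%
        diversions = 20.0 * (1 + rng.normal(0, 0.10, n))
        evap = 5.0 * (1 + rng.normal(0, 0.20, n))
        outflow = 105.0 * (1 + rng.normal(0, 0.05, n))     # downstream gauge

        residual = inflow + tributaries - diversions - evap - outflow
        lo, med, hi = np.percentile(residual, [5, 50, 95])
        print(f"net stream-groundwater flux: {med:.1f} ML/day "
              f"(90% interval {lo:.1f} to {hi:.1f})")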

  10. An Improved Plasticity-Based Distortion Analysis Method for Large Welded Structures

    Science.gov (United States)

    Yang, Yu-Ping; Athreya, Badrinarayan P.

    2013-05-01

    The plasticity-based distortion prediction method was improved to address the computationally intensive nature of welding simulations. Plastic strains, which are typically first computed using either two-dimensional (2D) or three-dimensional (3D) thermo-elastic-plastic analysis (EPA) on finite element models of simple weld geometry, are mapped to the full structure finite element model to predict distortion by conducting a linear elastic analysis. To optimize welding sequence to control distortion, a new theory was developed to consider the effect of weld interactions on plastic strains. This improved method was validated with experimental work on a Tee joint and tested on two large-scale welded structures—a light fabrication and a heavy fabrication—by comparing against full-blown distortion predictions using thermo-EPA. 3D solid and shell models were used for the heavy and light fabrications, respectively, to compute plastic strains due to each weld. Quantitative comparisons between this method and thermo-EPA indicate that this method can predict distortions fairly accurately—even for different welding sequences—and is roughly 1-2 orders of magnitude faster. It was concluded from these findings that, with further technical development, this method can be an ideal solver for optimizing welding sequences.

  11. Analysis of the dynamic response improvement of a turbocharged diesel engine driven alternating current generating set

    International Nuclear Information System (INIS)

    Reliability of electric supply systems is among the most required necessities of modern society. Turbocharged diesel engine driven alternating current generating sets are often used to prevent electric black outs and/or as prime electric energy suppliers. It is well known that turbocharged diesel engines suffer from an inadequate response to a sudden load increase, this being a consequence of the nature of the energy exchange between the engine and the turbocharger. The dynamic response of turbocharged diesel engines could be improved by electric assisting systems, either by direct energy supply with an integrated starter-generator-booster (ISG) mounted on the engine flywheel, or by an indirect energy supply with an electrically assisted turbocharger. An experimentally verified zero dimensional computer simulation method was used for the analysis of both types of electrical assistance. The paper offers an analysis of the interaction between a turbocharged diesel engine and different electric assisting systems, as well as the requirements for the supporting electric motors that could improve the dynamic response of a diesel engine while driving an AC generating set. When performance class compliance is a concern, it is evident that an integrated starter-generator-booster outperforms an electrically assisted turbocharger for the investigated generating set. However, the electric energy consumption and frequency recovery times are smaller when an electrically assisted turbocharger is applied

  12. Diesel engine noise source identification based on EEMD, coherent power spectrum analysis and improved AHP

    Science.gov (United States)

    Zhang, Junhong; Wang, Jian; Lin, Jiewei; Bi, Fengrong; Guo, Qian; Chen, Kongwu; Ma, Liang

    2015-09-01

    As the essential foundation of noise reduction, many noise source identification methods have been developed and applied to engineering practice. To identify the noise source in the board-band frequency of different engine parts at various typical speeds, this paper presents an integrated noise source identification method based on the ensemble empirical mode decomposition (EEMD), the coherent power spectrum analysis, and the improved analytic hierarchy process (AHP). The measured noise is decomposed into several IMFs with physical meaning, which ensures the coherence analysis of the IMFs and the vibration signals are meaningful. An improved AHP is developed by introducing an objective weighting function to replace the traditional subjective evaluation, which makes the results no longer dependent on the subject performances and provides a better consistency in the meantime. The proposed noise identification model is applied to identifying a diesel engine surface radiated noise. As a result, the frequency-dependent contributions of different engine parts to different test points at different speeds are obtained, and an overall weight order is obtained as oil pan  >  left body  >  valve chamber cover  >  gear chamber casing  >  right body  >  flywheel housing, which provides an effectual guidance for the noise reduction.

  13. Security Analysis and Improvements of Authentication and Access Control in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Bruce Ndibanje

    2014-08-01

    Full Text Available Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. According to our analysis, Jing et al.'s protocol is costly in the message exchange and the security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements facilitate many services to the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks, and achieves better efficiency at low communication cost.

  14. Design improvement and dynamic finite element analysis of novel ITI dental implant under dynamic chewing loads.

    Science.gov (United States)

    Cheng, Yung-Chang; Lin, Deng-Huei; Jiang, Cho-Pei; Lee, Shyh-Yuan

    2015-01-01

    The main aim of this article is to introduce the application of uniform design of experiments to reduce the micromotion of a novel ITI dental implant model under dynamic chewing loads. Combining the characteristics of the traditional ITI and Nano-Tite implants, a new implant with concave holes was constructed. Compared to the traditional ITI dental implant model, the micromotion of the new dental implant model was significantly reduced in explicit dynamic finite element analysis. Following the uniform design of experiments, the dynamic finite element analysis method was applied to calculate the maximum micromotion of the full model. Finally, the design that produced the minimum micromotion across all the simulated experiments was selected as the improved model. Relative to the original design, which was associated with a micromotion of 45.11 μm, the micromotion of the improved version was 31.37 μm, an improvement rate of 30.5%. PMID:26406049

  15. Rice Improvement Through Genome-Based Functional Analysis and Molecular Breeding in India.

    Science.gov (United States)

    Agarwal, Pinky; Parida, Swarup K; Raghuvanshi, Saurabh; Kapoor, Sanjay; Khurana, Paramjit; Khurana, Jitendra P; Tyagi, Akhilesh K

    2016-12-01

    Rice is one of the main pillars of food security in India. Its improvement for higher yield in a sustainable agriculture system is also vital to meeting the energy and nutritional needs of a growing world population, expected to reach more than 9 billion by 2050. The high-quality genome sequence of rice has provided a rich resource for mining information about the diversity of genes and alleles that can contribute to the improvement of useful agronomic traits. Defining the function of each gene and regulatory element of rice remains a challenge for the rice community in the coming years. Subsequent to its participation in the IRGSP, India has continued to contribute in the areas of diversity analysis, transcriptomics, functional genomics, marker development, QTL mapping and molecular breeding, through national and multi-national research programs. These efforts have helped generate resources for rice improvement, some of which have already been deployed to mitigate losses due to environmental stress and pathogens. With renewed efforts, Indian researchers are making new strides, along with the international scientific community, in both basic research and the realization of its translational impact. PMID:26743769

  16. Analysis of Technological Innovation and Environmental Performance Improvement in Aviation Sector

    Directory of Open Access Journals (Sweden)

    Jeonghoon Mo

    2011-09-01

    Full Text Available The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector—aircraft manufacturers and airlines—has also made significant efforts to improve fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s, while the high oil prices from the 1970s onward did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and the prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be a feasible option with a meaningful reduction in aviation’s lifecycle environmental impact if they can achieve sufficient economies of scale.

  17. Lack of efficacy of music to improve sleep: a polysomnographic and quantitative EEG analysis.

    Science.gov (United States)

    Lazic, Stanley E; Ogilvie, Robert D

    2007-03-01

    An increasing number of studies have examined non-pharmacological methods of improving the quality of sleep, including the use of music and other types of auditory stimulation. While many of these studies have reported significant results, they suffer from one or more limitations: subjective self-report measures as the primary outcome, a lack of proper controls, combining music with some type of relaxation therapy, or failing to randomise subjects to control and treatment conditions. It is therefore difficult to assess the efficacy of music to induce or improve sleep. The present study therefore examined the effects of music using standard polysomnographic measures and quantitative analysis of the electroencephalogram, along with subjective ratings of sleep quality. In addition, a tones condition was used to compare any effects of music with the effects of general auditory stimulation. Using a counter-balanced within-subjects design, music was not significantly better than the tones or control conditions in improving sleep onset latency, sleep efficiency, wake time after sleep onset, or percent slow wave sleep, as determined by objective physiological criteria. PMID:17123654

  18. Repetitive transcranial magnetic stimulation improves consciousness disturbance in stroke patients: A quantitative electroencephalography spectral power analysis

    Institute of Scientific and Technical Information of China (English)

    Ying Xie; Tong Zhang

    2012-01-01

    Repetitive transcranial magnetic stimulation is a noninvasive treatment technique that can directly alter cortical excitability and improve cerebral functional activity in unconscious patients. To investigate the effects and the electrophysiological changes of repetitive transcranial magnetic stimulation treatment of the cortex, 10 stroke patients with non-severe brainstem lesions and with disturbance of consciousness were treated with repetitive transcranial magnetic stimulation, and a quantitative electroencephalography spectral power analysis was performed. The absolute power in the alpha band was increased immediately after the first repetitive transcranial magnetic stimulation treatment, and the energy in the delta band was reduced. The alpha band relative power values slightly decreased at 1 day post-treatment, then increased and reached a stable level at 2 weeks post-treatment. Glasgow Coma Scale and JFK Coma Recovery Scale-Revised scores improved. Relative power in the alpha band was positively related to the Glasgow Coma Scale and JFK Coma Recovery Scale-Revised scores. These data suggest that repetitive transcranial magnetic stimulation is a noninvasive, safe, and effective treatment for improving brain functional activity and promoting awakening in unconscious stroke patients.
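
    Absolute and relative band power are standard quantitative EEG measures; the sketch below shows one common way to compute them from a single channel (the band limits, Welch segment length and sampling rate are illustrative assumptions, not taken from the study):

        import numpy as np
        from scipy.signal import welch

        def band_powers(eeg, fs):
            """Absolute and relative delta/alpha power of one EEG channel."""
            f, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))   # 4-second segments
            ref = (f >= 1.0) & (f <= 30.0)                    # broadband reference
            total = np.trapz(psd[ref], f[ref])
            out = {}
            for name, (lo, hi) in {"delta": (1.0, 4.0), "alpha": (8.0, 13.0)}.items():
                sel = (f >= lo) & (f <= hi)
                absolute = np.trapz(psd[sel], f[sel])
                out[name] = (absolute, absolute / total)      # (absolute, relative)
            return out

        fs = 256.0                                  # assumed sampling rate
        eeg = np.random.randn(int(60 * fs))         # placeholder for a real recording
        print(band_powers(eeg, fs))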

  19. Topological-based bottleneck analysis and improvement strategies for traffic networks

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    A method is proposed to find the key components of traffic networks with homogeneous and heterogeneous topologies, in which heavier traffic flow is transported. One component, called the skeleton, is the minimum spanning tree (MST) based on the zero-flow cost (ZCMST). The other component is the incipient infinite percolation cluster (IIC), which represents the spine of the traffic network. A new method is then given to analyze the properties of bottlenecks in a large-scale traffic network from a macroscopic and statistical viewpoint. Moreover, three effective strategies are proposed to alleviate traffic congestion. The significance of the findings is that one can significantly improve global transport by enhancing the capacity of a few links in the ZCMST, while for improving local traffic properties, improving a tiny fraction of the traffic network in the IIC is effective. The results can help traffic managers prevent and alleviate traffic congestion in time, guard against the formation of congestion bottlenecks, and make appropriate policies for traffic demand management. The method also has theoretical significance and practical worth in optimizing traffic organization, traffic control, and the handling of emergencies.
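
    The skeleton extraction is a standard minimum spanning tree computation; a minimal sketch on a toy weighted network (edge weights standing in for the zero-flow cost) could look like this:

        import networkx as nx

        # Toy weighted road network; edge weight plays the role of the zero-flow cost.
        G = nx.Graph()
        G.add_weighted_edges_from([
            ("A", "B", 2.0), ("B", "C", 1.5), ("A", "C", 4.0),
            ("C", "D", 1.0), ("B", "D", 3.5), ("D", "E", 2.5),
        ])

        # Skeleton: minimum spanning tree under the zero-flow cost (ZCMST analogue).
        skeleton = nx.minimum_spanning_tree(G, weight="weight")
        print(sorted(skeleton.edges(data="weight")))

    Under the paper's first strategy, the skeleton edges are the natural candidates for capacity upgrades, since enhancing a few ZCMST links is reported to improve global transport.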

  20. Increasing the number of thyroid lesions classes in microarray analysis improves the relevance of diagnostic markers.

    Directory of Open Access Journals (Sweden)

    Jean-Fred Fontaine

    Full Text Available BACKGROUND: Genetic markers for thyroid cancers identified by microarray analysis have offered limited predictive accuracy so far because of the few classes of thyroid lesions usually taken into account. To improve diagnostic relevance, we have simultaneously analyzed microarray data from six public datasets covering a total of 347 thyroid tissue samples representing 12 histological classes of follicular lesions and normal thyroid tissue. Our own dataset, containing about half the thyroid tissue samples, included all categories of thyroid lesions. METHODOLOGY/PRINCIPAL FINDINGS: Classifier predictions were strongly affected by similarities between classes and by the number of classes in the training sets. In each dataset, sample prediction was improved by separating the samples into three groups according to class similarities. The cross-validation of differential genes revealed four clusters with functional enrichments. The analysis of six of these genes (APOD, APOE, CLGN, CRABP1, SDHA and TIMP1) in 49 new samples showed consistent gene and protein profiles with the class similarities observed. Focusing on four subclasses of follicular tumor, we explored the diagnostic potential of 12 selected markers (CASP10, CDH16, CLGN, CRABP1, HMGB2, ALPL2, ADAMTS2, CABIN1, ALDH1A3, USP13, NR2F2, KRTHB5) by real-time quantitative RT-PCR on 32 other new samples. The gene expression profiles of follicular tumors were examined with reference to the mutational status of the Pax8-PPARgamma, TSHR, GNAS and NRAS genes. CONCLUSION/SIGNIFICANCE: We show that diagnostic tools defined on the basis of microarray data are more relevant when a large number of samples and tissue classes are used. Taking into account the relationships between the thyroid tumor pathologies, together with the main biological functions and pathways involved, improved the diagnostic accuracy of the samples. Our approach was particularly relevant for the classification of microfollicular adenomas.

  1. Energy spectrum analysis of blast waves based on an improved Hilbert-Huang transform

    Science.gov (United States)

    Li, L.; Wang, F.; Shang, F.; Jia, Y.; Zhao, C.; Kong, D.

    2016-07-01

    Using the improved Hilbert-Huang transform (HHT), this paper investigates the problems of analysis and interpretation of the energy spectrum of a blast wave. It has been previously established that the energy spectrum is an effective feature by which to characterize a blast wave: the higher the energy spectrum of a blast wave in a frequency band, the greater the damage to a target in the same frequency band. However, most current research focuses on analyzing wave signals in the time domain or frequency domain rather than considering the energy spectrum. We propose here an improved HHT method combined with a wavelet packet to extract the energy spectrum feature of a blast wave. When applying the HHT, the signal is first roughly decomposed into a series of intrinsic mode functions (IMFs) by empirical mode decomposition. The wavelet packet method is then performed on each IMF to eliminate noise from the energy spectrum. Next, a coefficient is introduced to remove unrelated IMFs. The energy of each instantaneous frequency is derived through the Hilbert transform, and the energy spectrum is obtained by summing the contributions of the effective IMFs that remain after the wavelet packet filtering and coefficient-based screening. The effectiveness of the proposed method is demonstrated on 12 groups of experimental data, and an energy attenuation model is established from the experimental data. The improved HHT is a precise method for blast wave signal analysis. For other shock wave signals from blasting experiments, an energy-frequency-time distribution and energy spectrum can also be obtained through this method, allowing for more practical applications.
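
    A minimal sketch of the Hilbert part of this pipeline: empirical mode decomposition into IMFs, Hilbert transform of each IMF, and accumulation of squared instantaneous amplitude over instantaneous-frequency bins. The PyEMD package is an assumed stand-in for the authors' EMD implementation, and the wavelet-packet denoising and coefficient-based IMF screening steps are omitted here:

        import numpy as np
        from scipy.signal import hilbert
        from PyEMD import EMD  # pip install EMD-signal; assumed available

        fs = 10_000.0
        t = np.arange(0, 0.1, 1 / fs)
        sig = np.exp(-40 * t) * np.sin(2 * np.pi * 500 * t)  # toy decaying pulse

        # 1. Empirical mode decomposition into IMFs.
        imfs = EMD().emd(sig)

        # 2. (Wavelet-packet denoising of each IMF would go here.)

        # 3. Hilbert transform of each IMF -> instantaneous amplitude and frequency.
        energy = np.zeros(64)
        edges = np.linspace(0.0, fs / 2, 65)
        for imf in imfs:
            analytic = hilbert(imf)
            amp = np.abs(analytic)
            phase = np.unwrap(np.angle(analytic))
            inst_f = np.diff(phase) * fs / (2 * np.pi)
            # 4. Accumulate squared amplitude into frequency bins -> energy spectrum.
            idx = np.clip(np.digitize(inst_f, edges) - 1, 0, 63)
            np.add.at(energy, idx, amp[1:] ** 2)

        print(edges[np.argmax(energy)], energy.max())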

  2. Texture analysis improves level set segmentation of the anterior abdominal wall

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhoubing [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Allen, Wade M. [Institute of Imaging Science, Vanderbilt University, Nashville, Tennessee 37235 (United States); Baucom, Rebeccah B.; Poulose, Benjamin K. [General Surgery, Vanderbilt University Medical Center, Nashville, Tennessee 37235 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 and Institute of Imaging Science, Vanderbilt University, Nashville, Tennessee 37235 (United States)

    2013-12-15

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses texture analysis based on Gabor filters to extract feature vectors, followed by a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis help to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initialization close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture
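
    A hypothetical sketch of the feature-extraction front end, using a Gabor filter bank from scikit-image and fuzzy c-means from scikit-fuzzy (both assumed available; the frequencies, orientations and cluster count are illustrative, not the authors' settings):

        import numpy as np
        from skimage.filters import gabor
        import skfuzzy  # scikit-fuzzy, assumed available

        img = np.random.rand(128, 128)  # placeholder for one CT slice

        # Gabor filter bank: a few frequencies and orientations -> one feature per filter.
        feats = []
        for freq in (0.1, 0.2, 0.4):
            for theta in np.deg2rad((0, 45, 90, 135)):
                real, imag = gabor(img, frequency=freq, theta=theta)
                feats.append(np.hypot(real, imag))  # magnitude response
        X = np.stack([f.ravel() for f in feats])    # (n_features, n_voxels)

        # Fuzzy c-means -> per-voxel membership in each of 8 clusters.
        cntr, u, *_ = skfuzzy.cmeans(X, c=8, m=2.0, error=1e-4, maxiter=100)
        memberships = u.reshape(8, *img.shape)  # used to initialise/guide the level set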

  3. An improved model for whole genome phylogenetic analysis by Fourier transform.

    Science.gov (United States)

    Yin, Changchuan; Yau, Stephen S-T

    2015-10-01

    and demonstrates that the improved DFT dissimilarity measure is an efficient and effective similarity measure of DNA sequences. Due to its high efficiency and accuracy, the proposed DFT similarity measure is successfully applied on phylogenetic analysis for individual genes and large whole bacterial genomes. PMID:26151589

  5. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes. PMID:24179734

  6. Application of computational fluid dynamics methods to improve thermal hydraulic code analysis

    Science.gov (United States)

    Sentell, Dennis Shannon, Jr.

    A computational fluid dynamics code is used to model the primary natural circulation loop of a proposed small modular reactor for comparison to experimental data and best-estimate thermal-hydraulic code results. Recent advances in computational fluid dynamics code modeling capabilities make them attractive alternatives to the current conservative approach of coupled best-estimate thermal hydraulic codes and uncertainty evaluations. The results from a computational fluid dynamics analysis are benchmarked against the experimental test results of a 1:3 length, 1:254 volume, full pressure and full temperature scale small modular reactor during steady-state power operations and during a depressurization transient. A comparative evaluation of the experimental data, the thermal hydraulic code results and the computational fluid dynamics code results provides an opportunity to validate the best-estimate thermal hydraulic code's treatment of a natural circulation loop and provide insights into expanded use of the computational fluid dynamics code in future designs and operations. Additionally, a sensitivity analysis is conducted to determine those physical phenomena most impactful on operations of the proposed reactor's natural circulation loop. The combination of the comparative evaluation and sensitivity analysis provides the resources for increased confidence in model developments for natural circulation loops and provides for reliability improvements of the thermal hydraulic code.

  7. Analysis and improvement of digital control stability for master-slave manipulator system

    International Nuclear Information System (INIS)

    Some bilateral controls for master-slave systems have been designed that can realize high-fidelity telemanipulation, as if the operator were manipulating the object directly. While most robot systems are controlled by a software-servo system using a digital computer, little work has been published on the design and analysis of digital control for these systems, which must consider the time-delay of sensor signals and the zero-order-hold effect of command signals on actuators. This paper presents a digital control analysis for a single-degree-of-freedom master-slave system, including impedance models of both the human operator and the task object, which clarifies an index for stability. The stability result suggests a virtual master-slave system concept, which improves digital control stability. We first analyze a dynamic control method for the master-slave system in discrete time with respect to the stability problem, a method that can realize high-fidelity telemanipulation in continuous time. Secondly, using the results of the stability analysis, a robust control scheme for the master-slave system is proposed, and the validity of this scheme is finally confirmed by simulation. Consequently, any combination of master and slave modules incorporating the dynamic models of these manipulators can be used to construct a stable master-slave system. (author)
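
    The core of such an analysis is checking that all closed-loop poles of the sampled system stay inside the unit circle once the zero-order hold and a one-sample sensor delay are modelled. A toy sketch for a single-degree-of-freedom plant (plant parameters and gains are invented for illustration, not taken from the paper):

        import numpy as np
        from scipy.signal import cont2discrete

        # Toy 1-DOF plant: mass-damper driven by force, x'' = (u - c x') / m.
        m, c = 1.0, 0.5
        A = np.array([[0.0, 1.0], [0.0, -c / m]])
        B = np.array([[0.0], [1.0 / m]])

        Kp, Kv = 4.0, 2.0                 # position/velocity feedback gains
        K = np.array([[Kp, Kv]])
        for T in (0.001, 0.05, 0.2):      # sample periods; ZOH is part of the model
            Ad, Bd, *_ = cont2discrete((A, B, np.eye(2), np.zeros((2, 1))), T, method="zoh")
            # One-sample delay in the sensed state, modelled by state augmentation:
            # x_{k+1} = Ad x_k - Bd K x_{k-1}
            Acl = np.block([[Ad, -Bd @ K], [np.eye(2), np.zeros((2, 2))]])
            stable = np.all(np.abs(np.linalg.eigvals(Acl)) < 1.0)
            print(f"T = {T:5.3f} s  stable: {stable}")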

  8. Integrated analysis of independent gene expression microarray datasets improves the predictability of breast cancer outcome

    Directory of Open Access Journals (Sweden)

    Fenstermacher David A

    2007-09-01

    Full Text Available Abstract Background Gene expression profiles based on microarray data have been suggested by many studies as potential molecular prognostic indexes of breast cancer. However, due to the confounding effect of clinical background, independent studies have often obtained inconsistent results. The current study investigated the possibility of improving the quality and generality of expression profiles by integrated analysis of multiple datasets. Profiles of recurrence outcome were derived from two independent datasets and validated by a third dataset. Results The clinical background of patients significantly influenced the content and performance of expression profiles when the training samples were unbalanced. The integrated profiling of two independent datasets led to higher classification accuracy (71.11% vs. 70.59%) and larger ROC curve area (0.789 vs. 0.767) for the testing samples. Cell cycle, especially M phase mitosis, was significantly overrepresented by the 60-gene profile obtained from integrated analysis (p Conclusion The current study confirmed that the gene expression profile generated by integrated analysis of multiple datasets achieved better prediction of breast cancer recurrence. However, the content and performance of profiles were confounded by the clinical background of the training patients. In future studies, prognostic profiles applicable to the general population should be derived from more diversified and balanced patient cohorts on a larger scale.

  9. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Heui Dong; Cho, Sung Won; Park, Jong Hwa; Hong, Sung Wan; Yoo, Dong Han; Hwang, Moon Kyoo; Noh, Kee Man; Song, Yong Man [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    For prevention and mitigation of containment failure during a severe accident, the study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation equation for the flame velocity of a premixed H{sub 2}/air/steam gas has been suggested, and combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued, to review and examine the limitations and deficiencies of the existing models. Pre-test calculations were performed to support the severe accident experiment on molten corium-concrete interaction, and studies of the crust formation process and the heat transfer characteristics of the crust have been carried out. A stress analysis code using the finite element method was developed for analysis of reactor vessel lower head failure. Through the international PHEBUS-FP program and participation in software development, research on the core degradation process and fission product release and transport is under way. The CONTAIN and MELCOR codes were continuously updated in cooperation with the USNRC, and French-developed computer codes such as ICARE2, ESCADRE and SOPHAEROS were also installed on a SUN workstation. 204 figs, 61 tabs, 87 refs. (Author).

  11. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used for the identification of underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through a review of records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety.

  12. Root-cause analysis of a potentially sentinel transfusion event: lessons for improvement of patient safety.

    Science.gov (United States)

    Adibi, Hossein; Khalesi, Nader; Ravaghi, Hamid; Jafari, Mahdi; Jeddian, Ali Reza

    2012-01-01

    Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used for the identification of underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through a review of records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety. PMID:23165813

  13. Characterization of local complex structures in a recurrence plot to improve nonlinear dynamic discriminant analysis

    Science.gov (United States)

    Ding, Hang

    2014-01-01

    Structures in recurrence plots (RPs), preserving the rich information of nonlinear invariants and trajectory characteristics, have been increasingly analyzed in dynamic discrimination studies. The conventional analysis of RPs mainly focuses on quantifying the overall diagonal and vertical line structures through a method called recurrence quantification analysis (RQA). This study extensively explores the information in RPs by quantifying local complex RP structures. To do this, an approach was developed to analyze the combination of three major RQA variables: determinism, laminarity, and recurrence rate (DLR) in a metawindow moving over an RP. It was then evaluated in two experiments discriminating (1) ideal nonlinear dynamic series emulated from the Lorenz system with different control parameters and (2) data sets of human heart rate regulation with normal sinus rhythms (n = 18) and congestive heart failure (n = 29). Finally, the DLR was compared with seven major RQA variables in terms of discriminatory power, measured by standardized mean difference (DSMD). In the two experiments, DLR resulted in the highest discriminatory power, with DSMD = 2.53 and 0.98, respectively, which were 7.41 and 2.09 times the best performance from RQA. The study also revealed that the optimal RP structures for the discriminations were neither typical diagonal structures nor vertical structures. These findings indicate that local complex RP structures contain rich information unexploited by RQA. Therefore, future research extensively analyzing complex RP structures could potentially improve the effectiveness of RP analysis in dynamic discrimination studies.
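
    For orientation, a self-contained sketch of the three RQA quantities combined in DLR (recurrence rate, determinism, laminarity), computed here globally rather than in the paper's moving metawindow:

        import numpy as np

        def runs(seq):
            """Lengths of consecutive runs of True values in a sequence."""
            lengths, run = [], 0
            for v in seq:
                if v:
                    run += 1
                elif run:
                    lengths.append(run)
                    run = 0
            if run:
                lengths.append(run)
            return lengths

        def rqa(x, eps, lmin=2):
            """Recurrence rate, determinism, laminarity of a scalar series."""
            R = np.abs(x[:, None] - x[None, :]) < eps   # recurrence matrix
            n = len(x)
            rr = R.sum() / n**2
            diag = np.array([l for k in range(1, n) for l in runs(np.diag(R, k))])
            vert = np.array([l for col in R.T for l in runs(col)])
            det = diag[diag >= lmin].sum() / max(diag.sum(), 1)  # diagonal structure
            lam = vert[vert >= lmin].sum() / max(vert.sum(), 1)  # vertical structure
            return rr, det, lam

        t = np.linspace(0, 20 * np.pi, 500)
        print(rqa(np.sin(t), eps=0.1))                                   # periodic: high DET
        print(rqa(np.random.default_rng(0).normal(size=500), eps=0.1))   # noise: low DET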

  14. An Improved, Automated Whole-Air Sampler and VOC Analysis System: Results from SONGNEX 2015

    Science.gov (United States)

    Lerner, B. M.; Gilman, J.; Tokarek, T. W.; Peischl, J.; Koss, A.; Yuan, B.; Warneke, C.; Isaacman-VanWertz, G. A.; Sueper, D.; De Gouw, J. A.; Aikin, K. C.

    2015-12-01

    Accurate measurement of volatile organic compounds (VOCs) in the troposphere is critical for the understanding of emissions and the physical and chemical processes that can impact both air quality and climate. Airborne VOC measurements have proven challenging due to the requirements of short sample collection times (≤10 s), to maximize spatial resolution and sampling frequency, and high sensitivity (pptv) to chemically diverse hydrocarbons, halocarbons, and oxygen- and nitrogen-containing VOCs. NOAA ESRL CSD has built an improved whole air sampler (iWAS) which collects compressed ambient air samples in electropolished stainless steel canisters, based on the NCAR HAIS Advanced Whole Air Sampler [Atlas and Blake]. Post-flight chemical analysis is performed with a custom-built gas chromatograph-mass spectrometer system that pre-concentrates analyte cryogenically via a Stirling cooler, an electromechanical chiller which precludes the need for liquid nitrogen to reach trapping temperatures. For the 2015 Shale Oil and Natural Gas Nexus Study (SONGNEX), CSD conducted iWAS measurements on 19 flights aboard the NOAA WP-3D aircraft between March 19th and April 27th. Nine oil and natural gas production regions were surveyed during SONGNEX and more than 1500 air samples were collected and analyzed. For the first time, we employed real-time mapping of sample collection combined with live data from fast time-response measurements (e.g. ethane) for more uniform surveying and improved target plume sampling. Automated sample handling allowed more than 90% of iWAS canisters to be analyzed within 96 hours of collection; for the second half of the campaign, improved efficiencies reduced the median sample age at analysis to 36 hours. A new chromatography peak-fitting software package was developed to reduce data reduction time by an order of magnitude without a loss of precision or accuracy. Here we report mixing ratios for aliphatic and aromatic hydrocarbons (C2-C8) along with select

  15. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    OpenAIRE

    Feofanova Iryna V.; Feofanov Lev K.

    2013-01-01

    The goal of the article is to identify directions for improving the accounting system at an enterprise so that strategic analysis procedures are supplied with trustworthy information. Historical methods (to study the conditions under which strategic analysis appeared and developed) and logical methods (to identify directions for improving accounting) were used during the study. The article establishes that the modern conditions require a system of indicators that is based both ...

  16. Deformation of Japan as measured by improved analysis of GEONET data

    Science.gov (United States)

    Owen, S. E.; Dong, D.; Webb, F. H.; Newport, B. J.; Simons, M.

    2006-12-01

    The Japan subduction zone represents a complex set of plate interfaces with significant trench-parallel variability in great earthquakes and transient deep slip events. Within the Japan arc, the Nankai segment of the Eurasian-Philippine plate boundary is one of the classic subduction zone segments, which last produced a set of temporally linked great earthquakes in the 1940s. Recently, down-dip of the Nankai seismogenic portion of the plate interface, transient slip events and seismic tremor events were observed. Through analysis of the GEONET GPS data, the spatial and higher-frequency temporal characteristics of transient slip events can be captured. We describe our analysis methods, the spatial filtering technique that has been developed for use on large networks, a periodic signal filtering method that improves on commonly-used sinusoidal function models, and the resultant velocities and time series. Our newly developed analysis method, the GPS Network Processor, gives us the ability to process large volumes of data extremely fast. The basis of the GPS Network Processor is the JPL-developed GIPSY-OASIS GPS analysis software and the JPL-developed precise point positioning technique. The Network Processor was designed and developed to efficiently implement precise point positioning and bias fixing on a 1000-node (2000 cpu) Beowulf cluster. The entire 10-year, ~1000-station GEONET data set can be reanalyzed using the Network Processor in a matter of days. This permits us to test different processing strategies, each with potentially large influence on our ability to detect strain transients from the subduction zones. For example, we can test different ocean loading models, which can affect the diurnal positions of coastal GPS sites by up to 2 cm. We can also test other potentially important factors such as using reprocessed satellite orbits and clocks, the parameterization of the tropospheric delay, or the implementation of refined solid body tide estimates. We will
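
    The commonly-used sinusoidal baseline that the periodic filtering method improves upon is an ordinary least-squares fit of offset, velocity, and annual plus semi-annual terms; a sketch on synthetic daily positions (all numbers invented):

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.arange(0, 10, 1 / 365.25)          # ten years of daily epochs (years)
        pos = (3.0 * t + 2.5 * np.sin(2 * np.pi * t) + 1.0 * np.cos(4 * np.pi * t)
               + rng.normal(0, 1.0, t.size))      # mm; velocity + seasonal + noise

        # Design matrix: offset, velocity, annual and semi-annual sinusoids.
        G = np.column_stack([
            np.ones_like(t), t,
            np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
            np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),
        ])
        coef, *_ = np.linalg.lstsq(G, pos, rcond=None)
        print(f"velocity = {coef[1]:.2f} mm/yr")  # ~3.0
        residual = pos - G @ coef                 # transients live in the residual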

  17. Improving air pollution control policy in China--A perspective based on cost-benefit analysis.

    Science.gov (United States)

    Gao, Jinglei; Yuan, Zengwei; Liu, Xuewei; Xia, Xiaoming; Huang, Xianjin; Dong, Zhanfeng

    2016-02-01

    To mitigate serious air pollution, the State Council of China promulgated the Air Pollution Prevention and Control Action Plan in 2013. To verify the feasibility and validity of the industrial energy-saving and emission-reduction policies in the action plan, we conducted a cost-benefit analysis of implementing these policies in 31 provinces for the period 2013 to 2017. We also completed a scenario analysis to assess the cost-effectiveness of the individual measures within the energy-saving and emission-reduction policies. The data were derived from field surveys, statistical yearbooks, government documents, and the published literature. The results show that the total cost and total benefit are 118.39 and 748.15 billion Yuan, respectively, and the estimated benefit-cost ratio is 6.32 in the S3 scenario. For all the scenarios, these policies are cost-effective, and the eastern region shows more satisfactory values. Furthermore, the end-of-pipe scenario has greater emission-reduction potential than the energy-saving scenario. We also found, through regression analysis of possible influencing factors, that gross domestic product and population are significantly correlated with the benefit-cost ratio. The sensitivity analysis demonstrates that the benefit-cost ratio is more sensitive to the unit emission-reduction cost, unit subsidy, growth rate of gross domestic product, and discount rate than to the other parameters. Compared with other provinces, the benefit-cost ratios of Beijing and Tianjin are more sensitive to changes in the unit subsidy than in the unit emission-reduction cost. These findings may have significant implications for improving China's air pollution prevention policy.

  18. Improved protocol for rapid identification of certain spa types using high resolution melting curve analysis.

    Directory of Open Access Journals (Sweden)

    Benjamin Mayerhofer

    Full Text Available Methicillin-resistant Staphylococcus aureus is one of the most significant pathogens associated with health care. For efficient surveillance, control and outbreak investigation, S. aureus typing is essential. A high resolution melting curve analysis was developed and evaluated for rapid identification of the most frequent spa types found in an Austrian hospital consortium covering 2,435 beds. Among 557 methicillin-resistant Staphylococcus aureus isolates 38 different spa types were identified by sequence analysis of the hypervariable region X of the protein A gene (spa). Identification of spa types through their characteristic high resolution melting curve profiles was considerably improved by double spiking with genomic DNA from spa type t030 and spa type t003 and allowed unambiguous and fast identification of the ten most frequent spa types t001 (58%), t003 (12%), t190 (9%), t041 (5%), t022 (2%), t032 (2%), t008 (2%), t002 (1%), t5712 (1%) and t2203 (1%), representing 93% of all isolates within this hospital consortium. The performance of the assay was evaluated by testing samples with unknown spa types from the daily routine and by testing three different high resolution melting curve analysis real-time PCR instruments. The ten most frequent spa types were identified from all samples and on all instruments with 100% specificity and 100% sensitivity. Compared to classical spa typing by sequence analysis, this gene scanning assay is faster, cheaper and can be performed in a single closed tube assay format. Therefore it is an optimal screening tool to detect the most frequent endemic spa types and to exclude non-endemic spa types within a hospital.

  19. Improved protocol for rapid identification of certain spa types using high resolution melting curve analysis.

    Science.gov (United States)

    Mayerhofer, Benjamin; Stöger, Anna; Pietzka, Ariane T; Fernandez, Haizpea Lasa; Prewein, Bernhard; Sorschag, Sieglinde; Kunert, Renate; Allerberger, Franz; Ruppitsch, Werner

    2015-01-01

    Methicillin-resistant Staphylococcus aureus is one of the most significant pathogens associated with health care. For efficient surveillance, control and outbreak investigation, S. aureus typing is essential. A high resolution melting curve analysis was developed and evaluated for rapid identification of the most frequent spa types found in an Austrian hospital consortium covering 2,435 beds. Among 557 methicillin-resistant Staphylococcus aureus isolates 38 different spa types were identified by sequence analysis of the hypervariable region X of the protein A gene (spa). Identification of spa types through their characteristic high resolution melting curve profiles was considerably improved by double spiking with genomic DNA from spa type t030 and spa type t003 and allowed unambiguous and fast identification of the ten most frequent spa types t001 (58%), t003 (12%), t190 (9%), t041 (5%), t022 (2%), t032 (2%), t008 (2%), t002 (1%), t5712 (1%) and t2203 (1%), representing 93% of all isolates within this hospital consortium. The performance of the assay was evaluated by testing samples with unknown spa types from the daily routine and by testing three different high resolution melting curve analysis real-time PCR instruments. The ten most frequent spa types were identified from all samples and on all instruments with 100% specificity and 100% sensitivity. Compared to classical spa typing by sequence analysis, this gene scanning assay is faster, cheaper and can be performed in a single closed tube assay format. Therefore it is an optimal screening tool to detect the most frequent endemic spa types and to exclude non-endemic spa types within a hospital. PMID:25768007

  20. Term Analysis: Improving the Quality of Learning and Application Documents in Engineering Design

    Directory of Open Access Journals (Sweden)

    S. Weiss

    2006-01-01

    Full Text Available Conceptual homogeneity is one determinant of the quality of text documents. A concept remains the same even if the words used (termini) change [1, 2]. In other words, termini can vary while the concept retains the same meaning. Human beings are able to handle concepts and termini because of their semantic network, which connects termini to the actual context and thus identifies the adequate meaning of the termini. Problems can arise when humans have to learn new content and correspondingly new concepts. Since the content is basically imparted by text via particular termini, it is a challenge to establish the right concept from the text with the termini. A term might be known, but have a different meaning [3, 4]. Therefore, it is very important to build up the correct understanding of concepts within a text. This is only possible when concepts are explained by the right termini, within an adequate context, and above all, homogeneously. So, when setting up or using text documents for teaching or application, it is essential to provide conceptual homogeneity. Understandably, the quality of documents is, ceteris paribus, reciprocally proportional to variations of termini. Therefore, an analysis of variations of termini can form a basis for specific improvement of conceptual homogeneity. Consequently, an exposition of variations of termini as control and improvement parameters is carried out in this investigation. This paper describes the functionality and the benefits of a tool called TermAnalysis. TermAnalysis is a software tool developed

  1. Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue

    Science.gov (United States)

    Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.

    1989-01-01

    The high performance liquid chromatographic (HPLC) method of Flores and Galston (1982 Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. In addition, two polyamines, putrescine and diaminopropane, are not well resolved by this method. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.

  2. Metabolites production improvement by identifying minimal genomes and essential genes using flux balance analysis.

    Science.gov (United States)

    Salleh, Abdul Hakim Mohamed; Mohamad, Mohd Saberi; Deris, Safaai; Illias, Rosli Md

    2015-01-01

    With the advancement of metabolic engineering technologies, the genome of host organisms can be reconstructed to achieve desired phenotypes. However, due to the complexity and size of genome-scale metabolic networks, significant components tend to be overlooked. We propose an approach to improve metabolite production that consists of two steps: first, finding the essential genes and identifying the minimal genome through a single-gene deletion process using Flux Balance Analysis (FBA), and second, identifying the significant pathway for metabolite production using gene expression data. A genome-scale model of Saccharomyces cerevisiae for the production of vanillin and acetate is used to test this approach. The results show the reliability of this approach for finding essential genes, reducing genome size and identifying production pathways that can further optimise the production yield. The identified genes and pathways can be extended to other applications, especially in strain optimisation. PMID:26489144
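
    A sketch of the first step using COBRApy (assumed available); the bundled E. coli core model stands in for the S. cerevisiae genome-scale model used in the study, and the 1% growth threshold for calling a gene essential is an assumption:

        from cobra.io import load_model
        from cobra.flux_analysis import single_gene_deletion

        # Bundled E. coli core model as a stand-in for the yeast genome-scale model.
        model = load_model("textbook")
        wild_type = model.optimize().objective_value

        # FBA after deleting each gene in turn.
        results = single_gene_deletion(model)
        growth = results["growth"].fillna(0.0)   # infeasible deletions count as no growth

        # A gene is called essential if its deletion removes (nearly) all growth.
        essential = results[growth < 0.01 * wild_type]
        print(f"{len(essential)} essential genes of {len(model.genes)}")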

  3. Using frequency analysis to improve the precision of human body posture algorithms based on Kalman filters.

    Science.gov (United States)

    Olivares, Alberto; Górriz, J M; Ramírez, J; Olivares, G

    2016-05-01

    With the advent of miniaturized inertial sensors, many systems have been developed within the last decade to study and analyze human motion and posture, especially in the medical field. Data measured by the sensors are usually processed by algorithms based on Kalman filters in order to estimate the orientation of the body parts under study. These filters traditionally include fixed parameters, such as the process and observation noise variances, whose values have a large influence on the overall performance. It has been demonstrated that the optimal value of these parameters differs considerably for different motion intensities. Therefore, in this work, we show that by applying frequency analysis to determine motion intensity, and varying the formerly fixed parameters accordingly, the overall precision of orientation estimation algorithms can be improved, providing physicians with reliable objective data they can use in their daily practice. PMID:26337122
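
    A toy one-dimensional sketch of the idea: FFT-based frequency analysis of a sliding window classifies motion intensity, which switches the process noise variance of a scalar Kalman filter (the thresholds and variances are invented for illustration, not the authors' values):

        import numpy as np

        rng = np.random.default_rng(0)
        fs = 100.0
        t = np.arange(0, 20, 1 / fs)
        true = np.where(t < 10, 0.2 * np.sin(2 * np.pi * 0.3 * t),   # slow motion
                                1.5 * np.sin(2 * np.pi * 3.0 * t))   # intense motion
        z = true + rng.normal(0, 0.3, t.size)                        # noisy sensor

        x, p = 0.0, 1.0
        r = 0.3**2                     # observation noise variance
        est = np.empty_like(z)
        win = int(fs)                  # 1 s sliding window for frequency analysis
        for k, zk in enumerate(z):
            if k >= win and k % win == 0:
                seg = z[k - win:k] - z[k - win:k].mean()
                spec = np.abs(np.fft.rfft(seg))
                freqs = np.fft.rfftfreq(win, 1 / fs)
                intense = freqs[np.argmax(spec)] > 1.0   # dominant frequency > 1 Hz?
                q = 1e-1 if intense else 1e-4            # adapt process noise variance
            elif k == 0:
                q = 1e-4
            # Standard scalar Kalman update with the adapted q.
            p += q
            gain = p / (p + r)
            x += gain * (zk - x)
            p *= (1 - gain)
            est[k] = x

        print("rms error:", np.sqrt(np.mean((est - true) ** 2)))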

  4. Analysis of Entropy Generation for the Performance Improvement of a Tubular Solid Oxide Fuel Cell Stack

    Directory of Open Access Journals (Sweden)

    Vittorio Verda

    2009-03-01

    Full Text Available The aim of the paper is to investigate possible improvements in the design and operation of a tubular solid oxide fuel cell. To achieve this purpose, a CFD model of the cell is introduced. The model includes thermo-fluid dynamics, chemical reactions and electrochemistry. The fluid composition and mass flow rates at the inlet sections are obtained through a finite difference model of the whole stack. This model also provides boundary conditions for the radiation heat transfer. All of these conditions account for the position of each cell within the stack. The analysis of cell performance is conducted on the basis of entropy generation. The use of this technique makes it possible to identify the phenomena causing the main irreversibilities, understand their causes and propose changes in the system design and operation.
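
    The abstract does not reproduce the entropy balance it relies on; for orientation, the standard local volumetric entropy generation rate for a conducting, ohmically dissipating flow has the textbook form (not taken from the paper):

        \dot{S}'''_{\mathrm{gen}}
          = \frac{k\,\lVert\nabla T\rVert^{2}}{T^{2}}     % heat conduction
          + \frac{\mu\,\Phi}{T}                           % viscous dissipation
          + \frac{\lVert\mathbf{i}\rVert^{2}}{\sigma\,T}  % ohmic (Joule) heating

    Here k is the thermal conductivity, Φ the viscous dissipation function, i the local current density and σ the electrical conductivity; integrating this rate over the cell volume localizes the dominant irreversibilities.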

  5. Analysis of the Efficacy of an Intervention to Improve Parent-Adolescent Problem Solving.

    Science.gov (United States)

    Semeniuk, Yulia Yuriyivna; Brown, Roger L; Riesch, Susan K

    2016-07-01

    We conducted a two-group longitudinal partially nested randomized controlled trial to examine whether young adolescent youth-parent dyads participating in Mission Possible: Parents and Kids Who Listen, in contrast to a comparison group, would demonstrate improved problem-solving skill. The intervention is based on the Circumplex Model and Social Problem-Solving Theory. The Circumplex Model posits that families who are balanced, that is, characterized by high cohesion and flexibility and open communication, function best. Social Problem-Solving Theory informs the process and skills of problem solving. The Conditional Latent Growth Modeling analysis revealed no statistically significant differences in problem solving among the final sample of 127 dyads in the intervention and comparison groups. Analyses of effect sizes indicated large-magnitude group effects for selected scales for youth and dyads, suggesting a potential for efficacy and indicating for whom the intervention may be efficacious if study limitations and lessons learned were addressed. PMID:26936844

  7. Comprehensive Numerical Analysis of Finite Difference Time Domain Methods for Improving Optical Waveguide Sensor Accuracy

    Directory of Open Access Journals (Sweden)

    M. Mosleh E. Abu Samak

    2016-04-01

    Full Text Available This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI-FDTD) method reduces the number of sub-steps by a factor of two to three, which represents a 33% time saving in each single run. The local one-dimensional (LOD-FDTD) method has similar numerical equation properties and is calculated in a similar manner. Generally, a small number of arithmetic operations, resulting in a shorter simulation time, is desired. The alternating direction implicit technique can be considered a significant step forward in improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was generated by both methods.
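
    For contrast with the unconditionally stable ADI/LOD variants discussed, a minimal explicit 1-D FDTD loop, which must respect the CFL limit on the time step (grid sizes and source are invented for illustration):

        import numpy as np

        # Baseline explicit 1-D FDTD (Yee) update; the ADI/LOD variants in the paper
        # replace this conditionally stable scheme with unconditionally stable sub-steps.
        nz, steps = 400, 600
        c0 = 3e8
        dz = 1e-3
        dt = 0.99 * dz / c0            # explicit scheme must satisfy the CFL limit
        ez = np.zeros(nz)
        hy = np.zeros(nz - 1)

        for n in range(steps):
            hy += dt / (4e-7 * np.pi * dz) * np.diff(ez)      # update H from curl E
            ez[1:-1] += dt / (8.854e-12 * dz) * np.diff(hy)   # update E from curl H
            ez[nz // 4] += np.exp(-((n - 60) / 15.0) ** 2)    # soft Gaussian source

        print(f"peak |Ez| after {steps} steps: {np.abs(ez).max():.3e}")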

  8. Improved analysis of bacterial CGH data beyond the log-ratio paradigm

    Directory of Open Access Journals (Sweden)

    Aakra Ågot

    2009-03-01

    Full Text Available Abstract Background Existing methods for analyzing bacterial CGH data from two-color arrays are based on log-ratios only, a paradigm inherited from expression studies. We propose an alternative approach, in which microarray signals are used in a different way and sequence identity is predicted using a supervised learning approach. Results A dataset containing 32 hybridizations of sequenced versus sequenced genomes was used to test and compare methods. An ROC analysis was performed to illustrate the ability to rank probes with respect to Present/Absent calls. Classification into Present and Absent is compared with that of a Gaussian mixture model. Conclusion The results indicate that our proposed method improves on existing methods with respect to the ranking and classification of probes, especially for multi-genome arrays.

  10. Environmental impact assessment in Colombia: Critical analysis and proposals for improvement

    International Nuclear Information System (INIS)

    The evaluation of Environmental Impact Assessment (EIA) systems is a highly recommended strategy for enhancing their effectiveness and quality. This paper describes an evaluation of EIA in Colombia, using the model and the control mechanisms proposed and applied in other countries by Christopher Wood and Ortolano. The evaluation criteria used are based on Principles of Environmental Impact Assessment Best Practice, such as effectiveness and control features, and they were contrasted with the opinions of a panel of Colombian EIA experts as a means of validating the results of the study. The evaluation found that EIA regulations in Colombia were ineffective because of limited scope, inadequate administrative support and the absence of effective control mechanisms and public participation. This analysis resulted in a series of recommendations regarding the further development of the EIA system in Colombia with a view to improving its quality and effectiveness.

  11. Structural Analysis and Improved Design of the Gearbox Casing of a Certain Type of Tracked Vehicle

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xue-sheng; JIA Xiao-ping; CHEN Ya-ning; YU Kui-long

    2011-01-01

    Loads on the gearbox casing of a certain type of tracked vehicle were calculated according to the engine's full-load characteristic curve and the worst load condition in which the gearbox operated while the tracked vehicle was running, and then the stiffness and strength of the casing were analyzed by means of Patran/Nastran software. After analysis, it was found that the casing satisfied the Mises yield condition; however, the stress distribution was heterogeneous, and stresses near the bearing saddle bores of the casing were higher while those in other regions were much less than the allowable stress. For this reason, the thicknesses of the casing wall at the bearing assembly holes needed increasing, while those in other places could decrease. After several rounds of structural improvement and re-analysis, the optimal casing design was found, and its weight decreased by 5%; the casing still satisfied the Mises yield criterion and the stress distribution was more homogeneous.

  12. On-line Batch Process Monitoring with Improved Multi-way Independent Component Analysis

    Institute of Scientific and Technical Information of China (English)

    GUO Hui; LI Hongguang

    2013-01-01

    In the past decades, on-line monitoring of batch processes using multi-way independent component analysis (MICA) has received considerable attention in both academia and industry. This paper focuses on two troublesome issues: selecting dominant independent components without a standard criterion, and determining the control limits of monitoring statistics in the presence of non-Gaussian distributions. To optimize the number of key independent components, we introduce a novel concept of system deviation, which is able to evaluate the reconstructed observations with different independent components. The monitored statistics are transformed to Gaussian-distributed data by means of the Box-Cox transformation, which helps readily determine the control limits. The proposed method is applied to on-line monitoring of a fed-batch penicillin fermentation simulator, and the experimental results indicate the advantages of the improved MICA monitoring compared to conventional methods.
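
    A sketch of the control-limit step: Box-Cox transform a non-Gaussian monitoring statistic, set a normal-theory limit, and map it back to the original scale (the chi-square stand-in for the statistic and the 99% limit are assumptions, not the paper's settings):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        # Non-Gaussian monitoring statistic from normal operation (e.g. a squared
        # prediction error of the ICA model); right-skewed and strictly positive.
        stat = rng.chisquare(df=4, size=2000)

        # Box-Cox transformation towards Gaussianity, then a normal-theory limit.
        transformed, lam = stats.boxcox(stat)
        limit_t = transformed.mean() + stats.norm.ppf(0.99) * transformed.std()

        # Map the control limit back to the original scale of the statistic.
        limit = (limit_t * lam + 1) ** (1 / lam) if lam != 0 else np.exp(limit_t)
        alarms = (stat > limit).mean()
        print(f"lambda = {lam:.3f}, limit = {limit:.2f}, false alarm rate = {alarms:.3f}")

    Without the transformation, a mean-plus-three-sigma limit on the skewed statistic would give a badly calibrated false alarm rate, which is exactly the issue the paper raises for non-Gaussian distributions.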

  13. Quality assurance testing of an explosives trace analysis laboratory--further improvements to include peroxide explosives.

    Science.gov (United States)

    Crowson, Andrew; Cawthorne, Richard

    2012-12-01

    The Forensic Explosives Laboratory (FEL) operates within the Defence Science and Technology Laboratory (DSTL), which is part of the UK Government Ministry of Defence (MOD). The FEL provides support and advice to the Home Office and UK police forces on matters relating to the criminal misuse of explosives. During 1989 the FEL established a weekly quality assurance testing regime in its explosives trace analysis laboratory. The purpose of the regime is to prevent the accumulation of explosives traces within the laboratory at levels that could, if other precautions failed, result in the contamination of samples and controls. Designated areas within the laboratory are swabbed using cotton wool swabs moistened with an equal-parts ethanol:water mixture. The swabs are then extracted, cleaned up and analysed using Gas Chromatography with Thermal Energy Analyser detectors or Liquid Chromatography with triple quadrupole Mass Spectrometry. This paper follows on from two previously published papers which described the regime and summarised results from approximately 14 years of tests. This paper presents results from the subsequent 7 years, setting them within the context of previous results. It also discusses further improvements made to the systems and procedures and the inclusion of quality assurance sampling for the peroxide explosives TATP and HMTD. Monitoring samples taken from surfaces within the trace laboratories and trace vehicle examination bay have, with few exceptions, revealed only low levels of contamination, predominantly of RDX. Analysis of the control swabs, processed alongside the monitoring swabs, has demonstrated that in this environment the risk of forensic sample contamination, assuming all the relevant anti-contamination procedures have been followed, is so small that it is considered to be negligible. The monitoring regime has also been valuable in assessing the process of continuous improvement, allowing sources of contamination transfer into the trace

  14. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute the key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of an individual station persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.
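
    The paper's own quality parameters are not reproduced here, but the final selection step, screening stations on several parameters jointly, can be sketched as follows (the parameter names and thresholds are illustrative assumptions):

        from dataclasses import dataclass

        @dataclass
        class StationQuality:
            station_id: str
            completeness: float     # fraction of expected epochs actually observed
            cycle_slip_rate: float  # slips per 1000 observations
            multipath_rms: float    # metres, e.g. from TEQC-style MP1/MP2 estimates

        def is_high_quality(q: StationQuality) -> bool:
            # All parameters must pass jointly; a single failure rejects the station.
            return (q.completeness >= 0.95
                    and q.cycle_slip_rate <= 1.0
                    and q.multipath_rms <= 0.5)

        stations = [StationQuality("AAA1", 0.99, 0.2, 0.3), StationQuality("BBB2", 0.80, 5.0, 0.9)]
        print([q.station_id for q in stations if is_high_quality(q)])  # -> ['AAA1']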

  15. Inverse transient radiation analysis in one-dimensional participating slab using improved Ant Colony Optimization algorithms

    International Nuclear Information System (INIS)

    As a heuristic intelligent optimization algorithm, the Ant Colony Optimization (ACO) algorithm was applied to the inverse problem of one-dimensional (1-D) transient radiative transfer in the present study. To illustrate the performance of this algorithm, the optical thickness and scattering albedo of the 1-D participating slab medium were retrieved simultaneously. The radiative reflectances simulated by the Monte Carlo Method (MCM) and the Finite Volume Method (FVM) were used as the measured and estimated values for the inverse analysis, respectively. To improve the accuracy and efficiency of the Basic Ant Colony Optimization (BACO) algorithm, three improved ACO algorithms, i.e., the Region Ant Colony Optimization (RACO), Stochastic Ant Colony Optimization (SACO) and Homogeneous Ant Colony Optimization (HACO) algorithms, were developed. With the proposed HACO algorithm, the radiative parameters could be estimated accurately, even with noisy data. In conclusion, the HACO algorithm is demonstrated to be effective and robust, with the potential to be applied to various inverse radiation problems. -- Highlights: • ACO-based algorithms were applied to the inverse transient radiation problem for the first time. • Three ACO-based algorithms were developed from the BACO algorithm for continuous-domain problems. • HACO shows robust performance for simultaneous estimation of the radiative properties.
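
    The HACO variant itself is not specified in the abstract, so the sketch below shows only a generic continuous-domain ant colony step retrieving two parameters; the misfit function, parameter bounds and tuning constants are placeholders rather than the authors' formulation:

        import numpy as np

        rng = np.random.default_rng(1)

        def misfit(x):
            # Placeholder objective: squared distance to a "true" (optical thickness, albedo).
            return float(np.sum((x - np.array([1.0, 0.9])) ** 2))

        bounds = np.array([[0.1, 5.0], [0.1, 1.0]])  # tau in [0.1, 5], albedo in [0.1, 1]
        n_ants, n_keep = 30, 5
        archive = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_keep, 2))  # elite solutions

        for _ in range(200):
            sigma = archive.std(axis=0) + 1e-6  # sampling spread shrinks as the archive converges
            seeds = archive[rng.integers(n_keep, size=n_ants)]
            ants = np.clip(seeds + rng.normal(0.0, sigma, (n_ants, 2)), bounds[:, 0], bounds[:, 1])
            pool = np.vstack([archive, ants])
            archive = pool[np.argsort([misfit(p) for p in pool])][:n_keep]  # keep the best

        print(archive[0])  # best (tau, albedo) estimate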

  16. Wavelet analysis to decompose a vibration simulation signal to improve pre-distribution testing of packaging

    Science.gov (United States)

    Griffiths, K. R.; Hicks, B. J.; Keogh, P. S.; Shires, D.

    2016-08-01

    In general, vehicle vibration is non-stationary and has a non-Gaussian probability distribution; yet existing testing methods for packaging design employ Gaussian distributions to represent vibration induced by road profiles. This frequently results in over-testing and/or over-design of the packaging to meet a specification, and correspondingly leads to wasteful packaging and product waste, which represent $15bn per year in the USA and €3bn per year in the EU. The purpose of the paper is to enable a measured non-stationary acceleration signal to be replaced by a constructed signal that retains, as far as possible, the non-stationary characteristics of the original. The constructed signal consists of a concatenation of decomposed shorter-duration signals, each having its own kurtosis level. Wavelet analysis is used to decompose the signal into inner and outlier components. The constructed signal has a PSD similar to that of the original signal, without incurring excessive acceleration levels. This allows an improved and more representative simulated input signal to be generated that can be used on the current generation of shaker tables. The wavelet decomposition method is also demonstrated experimentally through two correlation studies. It is shown that significant improvements over current international standards for packaging testing are achievable; hence more efficient packaging system design is possible.
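
    A minimal sketch of the inner/outlier wavelet split, assuming the PyWavelets package; the db4 wavelet, the decomposition level and the robust amplitude threshold are illustrative choices, not the authors' settings:

        import numpy as np
        import pywt

        rng = np.random.default_rng(2)
        signal = rng.normal(0, 1, 4096)
        signal[1000:1020] += rng.normal(0, 8, 20)  # a bursty, kurtosis-raising event

        coeffs = pywt.wavedec(signal, "db4", level=5)
        # Robust threshold from the median absolute detail coefficient.
        thresh = 3.0 * np.median(np.abs(np.concatenate(coeffs[1:]))) / 0.6745

        inner = [coeffs[0]] + [np.where(np.abs(c) <= thresh, c, 0.0) for c in coeffs[1:]]
        outlier = [np.zeros_like(coeffs[0])] + [np.where(np.abs(c) > thresh, c, 0.0) for c in coeffs[1:]]

        inner_sig = pywt.waverec(inner, "db4")[: len(signal)]      # near-Gaussian background
        outlier_sig = pywt.waverec(outlier, "db4")[: len(signal)]  # high-kurtosis bursts
        print(inner_sig.std(), outlier_sig.std())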

  17. Inverse transient radiation analysis in one-dimensional participating slab using improved Ant Colony Optimization algorithms

    Science.gov (United States)

    Zhang, B.; Qi, H.; Ren, Y. T.; Sun, S. C.; Ruan, L. M.

    2014-01-01

    As a heuristic intelligent optimization algorithm, the Ant Colony Optimization (ACO) algorithm was applied to the inverse problem of one-dimensional (1-D) transient radiative transfer in the present study. To illustrate the performance of this algorithm, the optical thickness and scattering albedo of the 1-D participating slab medium were retrieved simultaneously. The radiative reflectances simulated by the Monte Carlo Method (MCM) and the Finite Volume Method (FVM) were used as the measured and estimated values for the inverse analysis, respectively. To improve the accuracy and efficiency of the Basic Ant Colony Optimization (BACO) algorithm, three improved ACO algorithms, i.e., the Region Ant Colony Optimization (RACO), Stochastic Ant Colony Optimization (SACO) and Homogeneous Ant Colony Optimization (HACO) algorithms, were developed. With the proposed HACO algorithm, the radiative parameters could be estimated accurately, even with noisy data. In conclusion, the HACO algorithm is demonstrated to be effective and robust, with the potential to be applied to various inverse radiation problems.

  18. Prone positioning improves survival in severe ARDS: a pathophysiologic review and individual patient meta-analysis.

    Science.gov (United States)

    Gattinoni, L; Carlesso, E; Taccone, P; Polli, F; Guérin, C; Mancebo, J

    2010-06-01

    Prone positioning has been used for over 30 years in the management of patients with acute respiratory distress syndrome (ARDS). This maneuver has consistently proven capable of improving oxygenation in patients with acute respiratory failure. Several mechanisms can explain this observation, including possible intervening net recruitment and more homogeneously distributed alveolar inflation. It is also progressively becoming clear that prone positioning may reduce the nonphysiological stress and strain associated with mechanical ventilation, thus decreasing the risk of ventilator-induced lung injury, which is known to adversely impact patient survival. The available randomized clinical trials, however, have failed to demonstrate that prone positioning improves the outcomes of patients with ARDS overall. In contrast, the individual patient meta-analysis of the four major clinical trials available clearly shows that with prone positioning, the absolute mortality of severely hypoxemic ARDS patients may be reduced by approximately 10%. On the other hand, all data suggest that long-term prone positioning may expose patients with less severe ARDS to unnecessary complications. PMID:20473258

  19. Using uterine activity to improve fetal heart rate variability analysis for detection of asphyxia during labor.

    Science.gov (United States)

    Warmerdam, G J J; Vullings, R; Van Laar, J O E H; Van der Hout-Van der Jagt, M B; Bergmans, J W M; Schmitt, L; Oei, S G

    2016-03-01

    During labor, uterine contractions can cause temporary oxygen deficiency for the fetus. In the case of severe and prolonged oxygen deficiency, this can lead to asphyxia. The currently used technique for detection of asphyxia, cardiotocography (CTG), suffers from low specificity. Recent studies suggest that analysis of fetal heart rate variability (HRV) in addition to CTG can provide information on fetal distress. However, interpretation of fetal HRV during labor is difficult due to the influence of uterine contractions on fetal HRV. The aim of this study is therefore to investigate whether HRV features differ between contraction and rest periods, and whether these differences can improve the detection of asphyxia. To this end, a case-control study was performed, using 14 cases with asphyxia matched with 14 healthy fetuses. We did not find significant differences for individual HRV features when calculated over the fetal heart rate without separating contractions and rest periods (p > 0.30 for all HRV features). Separating contractions from rest periods did result in significant differences. In particular, the ratio between HRV features calculated during and outside contractions can improve discrimination between fetuses with and without asphyxia (p < 0.04 for three of the four ratio HRV features studied in this paper). PMID:26862891

  20. Use of Selection Indices Based on Multivariate Analysis for Improving Grain Yield in Rice

    Institute of Scientific and Technical Information of China (English)

    Hossein SABOURI; Babak RABIEI; Maryam FAZLALIPOUR

    2008-01-01

    In order to study selection indices for improving rice grain yield, a cross was made in 2006 between an Iranian traditional rice (Oryza sativa L.) variety, Tarommahalli, and an improved indica rice variety, Khazar. The traits of the parents (30 plants), the F1 (30 plants) and the F2 generation (492 individuals) were evaluated at the Rice Research Institute of Iran (RRII) during 2007. The heritabilities of the number of panicles per plant, plant height, days to heading and panicle exsertion were greater than that of grain yield. The selection indices were developed using the results of multivariate analysis. To evaluate selection strategies for maximizing grain yield, 14 selection indices were calculated based on two methods (optimum and base) and combinations of 12 traits with various economic weights. The results showed that selection for grain weight, number of panicles per plant and panicle length, using their phenotypic and/or genotypic direct effects (path coefficients) as economic weights, should serve as an effective selection criterion under either the optimum or the base index.

  1. Improving distillation method and device of tritiated water analysis for ultra high decontamination efficiency.

    Science.gov (United States)

    Fang, Hsin-Fa; Wang, Chu-Fang; Lin, Chien-Kung

    2015-12-01

    Monitoring environmental tritiated water is important for understanding the contamination dispersion of nuclear facilities. Tritium is a pure beta emitter that is usually measured by Liquid Scintillation Counting (LSC). The average energy of the tritium beta is only 5.658 keV, so LSC counting of tritium is easily interfered with by betas emitted by other radionuclides. Environmental tritiated water samples therefore usually need to be decontaminated by distillation to reduce the interference. After the Fukushima nuclear accident, the highest gross beta concentration of groundwater samples obtained around the Fukushima Daiichi Nuclear Power Station was over 1,000,000 Bq/l. There is thus a need for distillation with ultra-high decontamination efficiency in environmental tritiated water analysis. This study improves the heating temperature control for better sub-boiling distillation and modifies the height of the container of the air-cooled distillation device for a better fractional distillation effect. The decontamination factor (DF) for Cs-137 of the distillation may reach 450,000, which is far better than in the prior study. The average loss rate of the improved method and device is about 2.6%, which is better than the bias value listed in ASTM D4107-08. It is proven that the modified air-cooled distillation device provides an easy-to-handle, water-saving, low-cost and effective way of purifying water samples contaminated with higher levels of beta radionuclides which need ultra-high decontamination treatment. PMID:26295438

  2. Improving the precision of fMRI BOLD signal deconvolution with implications for connectivity analysis.

    Science.gov (United States)

    Bush, Keith; Cisler, Josh; Bian, Jiang; Hazaroglu, Gokce; Hazaroglu, Onder; Kilts, Clint

    2015-12-01

    An important, open problem in neuroimaging analyses is developing analytical methods that ensure precise inferences about the neural activity underlying the fMRI BOLD signal despite the known presence of confounds. Here, we develop and test a new meta-algorithm for conducting semi-blind (i.e., no knowledge of stimulus timings) deconvolution of the BOLD signal that estimates, via bootstrapping, both the underlying neural events driving BOLD and the confidence of these estimates. Our approach includes two improvements over the current best-performing deconvolution approach: (1) we optimize the parametric form of the deconvolution feature space; and (2) we pre-classify neural event estimates into two subgroups, either known or unknown, based on the confidence of the estimates prior to conducting neural event classification. This knows-what-it-knows approach significantly improves neural event classification over the current best-performing algorithm, as tested in a detailed computer simulation of highly confounded fMRI BOLD signal. We then implemented a massively parallelized version of the bootstrapping-based deconvolution algorithm and executed it on a high-performance computer to conduct large-scale (i.e., voxelwise) estimation of the neural events for a group of 17 human subjects. We show that by restricting the computation of inter-regional correlation to only those neural events estimated with high confidence, the method appeared to have higher sensitivity for identifying the default mode network than a standard BOLD signal correlation analysis when compared across subjects.
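
    A toy sketch of the bootstrapping idea, with the deconvolution itself stubbed out; only time points whose estimates agree across replicates are treated as "known", everything else stays "unknown". All names and thresholds are illustrative:

        import numpy as np

        rng = np.random.default_rng(3)

        def deconvolve(bold):
            # Stub standing in for the real semi-blind deconvolution step.
            return (np.diff(bold, prepend=bold[0]) > 0.05).astype(float)

        bold = np.cumsum(rng.normal(0, 0.1, 200))
        reps = np.array([deconvolve(bold + rng.normal(0, 0.05, bold.size)) for _ in range(100)])

        p_event = reps.mean(axis=0)                # bootstrap event probability per time point
        known = (p_event > 0.9) | (p_event < 0.1)  # confident estimates, kept for analysis
        events = np.where(p_event > 0.9, 1.0, 0.0)
        print(known.mean(), events.sum())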

  3. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-01-01

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute the key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of an individual station persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis. PMID:25196005

  4. An improved genetic system for detection and analysis of protein nuclear import signals

    Directory of Open Access Journals (Sweden)

    Derbyshire Stephanie

    2007-01-01

    Full Text Available Abstract Background Nuclear import of proteins is typically mediated by their physical interaction with soluble cytosolic receptor proteins via a nuclear localization signal (NLS). A simple genetic assay to detect active NLSs based on their function in the yeast Saccharomyces cerevisiae has been previously described. In that system, a chimera consisting of a modified bacterial LexA DNA-binding domain and the transcriptional activation domain of the yeast Gal4 protein is fused to a candidate NLS. A functional NLS will redirect the chimeric fusion to the yeast cell nucleus and activate transcription of a reporter gene. Results We have reengineered this nuclear import system to expand its utility and tested it using known NLS sequences from adenovirus E1A. First, the vector has been reconstructed to reduce the level of chimera expression. Second, an irrelevant "stuffer" sequence from the E. coli maltose binding protein was used to increase the size of the chimera above the passive diffusion limit of the nuclear pore complex. The improved vector also contains an expanded multiple cloning site and a hemagglutinin epitope tag to allow confirmation of expression. Conclusion The alterations in expression level and composition of the fusions used in this nuclear import system greatly reduce background activity in β-galactosidase assays, improving sensitivity and allowing more quantitative analysis of NLS-bearing sequences.

  5. Resequencing of the common marmoset genome improves genome assemblies and gene-coding sequence analysis.

    Science.gov (United States)

    Sato, Kengo; Kuroki, Yoko; Kumita, Wakako; Fujiyama, Asao; Toyoda, Atsushi; Kawai, Jun; Iriki, Atsushi; Sasaki, Erika; Okano, Hideyuki; Sakakibara, Yasubumi

    2015-11-20

    The first draft of the common marmoset (Callithrix jacchus) genome was published by the Marmoset Genome Sequencing and Analysis Consortium (MGSAC). The draft was based on whole-genome shotgun sequencing, and the current assembly version is Callithrix_jacchus-3.2.1, but there still exist 187,214 undetermined gap regions, as well as supercontigs and relatively short contigs that are unmapped to chromosomes in the draft genome. We performed resequencing and assembly of the common marmoset genome by deep sequencing with high-throughput sequencing technology. Several different sequence runs using Illumina sequencing platforms were executed, and 181 Gbp of high-quality bases, including mate-pairs with long insert lengths of 3, 8, 20, and 40 Kbp, were obtained, corresponding to approximately 60× coverage. The resequencing significantly improved the MGSAC draft genome sequence: the N50 of the contigs, a statistical measure used to evaluate assembly quality, doubled. As a result, 51% of the contigs (total length: 299 Mbp) that were unmapped to chromosomes in the MGSAC draft were merged with chromosomal contigs, and the improved genome sequence helped to detect 5,288 new genes that are homologous to human cDNAs and to completely fill the gaps in 5,187 transcripts of the Ensembl gene annotations.
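
    For reference, the N50 statistic quoted above is conventionally the length of the shortest contig in the smallest set of longest contigs covering half the total assembly; a small self-contained sketch:

        def n50(contig_lengths):
            """Smallest length L such that contigs of length >= L cover half the assembly."""
            total = sum(contig_lengths)
            acc = 0
            for length in sorted(contig_lengths, reverse=True):
                acc += length
                if acc * 2 >= total:
                    return length
            return 0

        print(n50([10, 20, 30, 40, 100]))  # -> 100 (the 100 bp contig alone covers half of 200 bp)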

  6. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Moon, Young Min; Lee, Dong Won; Lee, Sang Ik; Kim, Eung Soo; Yeom, Keum Soo [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2002-03-15

    The objective of the present research is to perform separate effect tests and to assess the RELAP5/MOD3.2 code for the analysis of thermal-hydraulic behavior in the reactor coolant system, thereby improving the auditing technology of safety analysis. The three Separate Effect Tests (SETs) are reflux condensation in a U-tube, direct contact condensation in the hot leg, and mixture level buildup in the pressurizer. Experimental data and empirical correlations are obtained through the SETs. On the basis of the three SET programs, models in RELAP5 are modified and improved, and then compared with the data. The Korea Standard Nuclear Power Plant (KSNP) is assessed using the modified RELAP5. In the reflux condensation test, data on heat transfer coefficients and flooding are obtained and the condensation models are modified using the non-iterative model; as a result, the modified code better predicts the data. In the direct contact condensation test, data on heat transfer coefficients are obtained for cocurrent and countercurrent flow between the mixture gas and the water under horizontally stratified flow conditions. Several condensation and friction models are modified, and they predict the present data well. In the mixture level test, data on the mixture level and the onset of water draining into the surge line are obtained. The standard RELAP5 over-predicts the mixture level and the void fraction in the pressurizer; a simple modification of the model related to the pool void fraction is suggested. The KSNP is assessed using the standard and the modified RELAP5 resulting from the experimental and code work on the SETs. In the case of pressurizer manway opening with the secondary side of the steam generators available, the modified code predicts that little collapsed level accumulates in the pressurizer. The presence and location of the opening and the secondary condition of the steam generators have an effect on the coolant inventory. The

  7. An improved global analysis of nuclear parton distribution functions including RHIC data

    Science.gov (United States)

    Eskola, Kari J.; Paukkunen, Hannu; Salgado, Carlos A.

    2008-07-01

    We present an improved leading-order global DGLAP analysis of nuclear parton distribution functions (nPDFs), supplementing the traditionally used data from deep inelastic lepton-nucleus scattering and Drell-Yan dilepton production in proton-nucleus collisions with inclusive high-pT hadron production data measured at RHIC in d+Au collisions. With the help of an extended definition of the χ2 function, we can now more efficiently exploit the constraints the different data sets offer, for gluon shadowing in particular, and account for the overall data normalization uncertainties during the automated χ2 minimization. The very good simultaneous fit to the nuclear hard-process data used demonstrates the feasibility of a universal set of nPDFs, but limitations also become visible. The high-pT forward-rapidity hadron data of BRAHMS add a crucial new constraint to the analysis by offering a direct probe of the nuclear gluon distributions, a sector of the nPDFs which has traditionally been very badly constrained. We obtain a strikingly stronger gluon shadowing than estimated in previous global analyses. The obtained nPDFs are released as a parametrization called EPS08.
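
    One generic way to write such an extended χ² (fitting an overall normalization f_N for each data set against its quoted uncertainty σ_N) is shown below; this is a common construction, and the exact EPS08 definition may differ in detail:

        % Generic chi^2 with a fitted per-data-set normalization f_N;
        % D_i are data, T_i({a}) the theory values for nPDF parameters {a}.
        \chi^2(\{a\}) = \sum_{\mathrm{sets}} \left[
          \left(\frac{1 - f_N}{\sigma_N}\right)^2
          + \sum_{i \in \mathrm{set}}
            \left(\frac{f_N D_i - T_i(\{a\})}{\sigma_i}\right)^2 \right]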

  8. An improved global analysis of nuclear parton distribution functions including RHIC data

    CERN Document Server

    Eskola, K J; Salgado, C A

    2008-01-01

    We present an improved leading-order global DGLAP analysis of nuclear parton distribution functions (nPDFs), supplementing the traditionally used data from deep inelastic lepton-nucleus scattering and Drell-Yan dilepton production in proton-nucleus collisions with inclusive high-$p_T$ hadron production data measured at RHIC in d+Au collisions. With the help of an extended definition of the $\chi^2$ function, we can now more efficiently exploit the constraints the different data sets offer, for gluon shadowing in particular, and account for the overall data normalization uncertainties during the automated $\chi^2$ minimization. The very good simultaneous fit to the nuclear hard-process data used demonstrates the feasibility of a universal set of nPDFs, but limitations also become visible. The high-$p_T$ forward-rapidity hadron data of BRAHMS add a crucial new constraint to the analysis by offering a direct probe of the nuclear gluon distributions -- a sector in the nPDFs which has traditionally been very b...

  9. Measurement of keff with an improved neutron source multiplication method based on numerical analysis

    International Nuclear Information System (INIS)

    In this work, we developed a numerical-analysis-assisted experimental method to determine the effective multiplication factor keff, which is difficult to obtain directly from the conventional neutron source multiplication (NSM) method. The method is based on the relationship between keff, the subcritical multiplication factor ks, and the external neutron source efficiency Φ* in a subcritical system. On the basis of the theoretical analysis, the dependence of ks and Φ* on subcriticality and source position was investigated at the Chinese Fast Burst Reactor-II (CFBR-II). A series of ks values were measured by NSM experiments at four subcritical states (keff = 0.996, 0.994, 0.991 and 0.986) with the 252Cf neutron source located at different positions (from the system center to outside) at each subcritical state. Φ* was obtained by Monte Carlo simulation for each condition. With the measured ks and calculated Φ*, keff of the subcritical system was evaluated with a relative difference of <1% between the values obtained by the improved method and by the positive-period method; in particular, the relative difference was <0.18% with the source located at the system center. (authors)
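
    A relation commonly used to tie these three quantities together is sketched below; sign and normalization conventions for the source efficiency vary between formulations, so this should be read as illustrative rather than the authors' exact expression:

        % Subcritical multiplication factor k_s, source efficiency phi*, and k_eff:
        \frac{1-k_s}{k_s} \;=\; \frac{1}{\varphi^{*}}\,\frac{1-k_{\mathrm{eff}}}{k_{\mathrm{eff}}}
        \qquad\Longrightarrow\qquad
        k_{\mathrm{eff}} \;=\; \frac{k_s}{\varphi^{*}\,(1-k_s) + k_s}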

  10. Reliability of multiresolution deconvolution for improving depth resolution in SIMS analysis

    Science.gov (United States)

    Boulakroune, M.'Hamed

    2016-11-01

    This paper addresses the effectiveness and reliability of a multiresolution deconvolution algorithm for recovering Secondary Ion Mass Spectrometry (SIMS) profiles altered by the measurement. The new algorithm can be characterized as a regularized wavelet transform: it combines ideas from Tikhonov-Miller regularization, wavelet analysis and deconvolution algorithms in order to benefit from the advantages of each. The SIMS profiles were obtained by analysis of two structures of boron in a silicon matrix using a Cameca IMS-6f instrument at oblique incidence. The first structure is large, consisting of two distant wide boxes; the second is a thin structure containing ten delta-layers, to which deconvolution by zone was applied. It is shown that the new multiresolution algorithm gives the best results. In particular, local application of the regularization parameter to the blurred and estimated solutions at each resolution level yields smoothed signals without creating artifacts related to the noise content of the profile. This leads to a significant improvement in depth resolution and in the peaks' maxima.
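
    The wavelet-domain machinery is not reproduced here, but the Tikhonov core of such a scheme can be sketched as an FFT-domain filter; the instrument response, depth grid and regularization weight mu below are illustrative:

        import numpy as np

        def tikhonov_deconvolve(measured, kernel, mu=1e-2):
            """Recover x from measured = kernel * x under a circular-convolution model."""
            H = np.fft.rfft(kernel, n=len(measured))
            Y = np.fft.rfft(measured)
            X = np.conj(H) * Y / (np.abs(H) ** 2 + mu)  # Tikhonov/Wiener-style filter
            return np.fft.irfft(X, n=len(measured))

        z = np.linspace(0.0, 1.0, 512)                   # normalized depth axis
        profile = np.exp(-((z - 0.3) / 0.01) ** 2) + np.exp(-((z - 0.6) / 0.01) ** 2)
        kernel = np.exp(-z / 0.02)
        kernel /= kernel.sum()                           # one-sided, SIMS-like response
        measured = np.fft.irfft(np.fft.rfft(kernel) * np.fft.rfft(profile), n=len(z))
        restored = tikhonov_deconvolve(measured, kernel)
        print(float(np.abs(restored - profile).max()))   # residual reconstruction error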

  11. Improvement of tissue analysis and classification using optical coherence tomography combined with Raman spectroscopy

    Science.gov (United States)

    Liu, Chih-Hao; Qi, Ji; Lu, Jing; Wang, Shang; Wu, Chen; Shih, Wei-Chuan; Larin, Kirill V.

    2014-02-01

    Optical coherence tomography (OCT) is an optical imaging technique that is capable of performing high-resolution (approaching the histopathology level), real-time imaging of tissues without the use of contrast agents. Owing to these advantages, the pathological features of tumors can be identified at the micro scale during resection surgery. However, the accuracy of tumor margin prediction still needs to be enhanced to assist surgeons' judgment. In this regard, we present a two-dimensional computational method for advanced tissue analysis and characterization based on OCT and Raman spectroscopy (RS). The method combines the slope of the OCT intensity signal with the principal components (PCs) of the RS data, relying on tissue optical attenuation and chemical composition for the classification of tissue types. Our pilot experiments were performed on mouse kidney, liver and small intestine. The results demonstrate improved tissue differentiation compared with analysis based on OCT detection alone. This combined OCT/RS method is potentially useful as a novel optical biopsy technique for cancer detection.

  12. Improvement of the LOCA PSA model using a best-estimate thermal-hydraulic analysis

    International Nuclear Information System (INIS)

    Probabilistic Safety Assessment (PSA) has been widely used to estimate the overall safety of nuclear power plants (NPPs), and it provides base information for risk-informed application (RIA) and risk-informed regulation (RIR). For the effective and correct use of PSA in RIA/RIR-related decision making, the risk estimated by a PSA model should be as realistic as possible. In this work, a best-estimate thermal-hydraulic analysis of loss-of-coolant accidents (LOCAs) for Hanul Nuclear Units 3 and 4 is first carried out in a systematic way: the behavior of the peak cladding temperature (PCT) is analyzed for various combinations of break sizes, operating conditions of the safety systems, and operator action times for aggressive secondary cooling. The results of the thermal-hydraulic analysis are then reflected in the improvement of the PSA model by changing both the accident sequences and the success criteria of the event trees for the LOCA scenarios.

  13. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    Energy Technology Data Exchange (ETDEWEB)

    Sandusky, Peter [Eckerd College, Department of Chemistry (United States); Appiah-Amponsah, Emmanuel; Raftery, Daniel, E-mail: raftery@purdue.edu [Purdue University, Department of Chemistry (United States)

    2011-04-15

    One-dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY-based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks with the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate taurine concentrations and distort their variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY-determined taurine concentrations produce better scores-plot subpopulation cluster resolution.

  14. Ultrasound guidance improves the success rate of axillary plexus block: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Qin Qin

    2016-04-01

    Full Text Available ABSTRACT OBJECTIVE: To evaluate the value of real-time ultrasound (US) guidance for axillary brachial plexus block (AXB) through the success rate and the onset time. METHODS: The meta-analysis was carried out in the Anesthesiology Department of the Second Affiliated Hospital of Soochow University, Suzhou, Jiangsu Province, China. A literature search of the Medline, EMBASE and Cochrane databases from 2004 to 2014 was performed, using the medical subject headings and free-text words "axilla", "axillary", "brachial plexus", "ultrasonography", "ultrasound" and "ultrasonics". Two reviewers carried out the search and evaluated the studies independently. RESULTS: Seven randomized controlled trials, one cohort study and three retrospective studies were included, identifying a total of 2042 patients. 1157 patients underwent AXB under US guidance (US group), and the control group included 885 patients (246 using the traditional approach (TRAD) and 639 using nerve stimulation (NS)). Our analysis showed that the success rate was higher in the US group than in the control group (90.64% vs. 82.21%, p < 0.00001). The average time to perform the block and the onset of sensory block time were shorter in the US group than in the control group. CONCLUSION: The present study demonstrated that real-time ultrasound guidance for axillary brachial plexus block improves the success rate and reduces the mean time to onset of anesthesia and the time of block performance.

  15. Score-moment combined linear discrimination analysis (SMC-LDA) as an improved discrimination method.

    Science.gov (United States)

    Han, Jintae; Chung, Hoeil; Han, Sung-Hwan; Yoon, Moon-Young

    2007-01-01

    A new discrimination method called score-moment combined linear discrimination analysis (SMC-LDA) has been developed and its performance evaluated using three practical spectroscopic datasets. The key concept of SMC-LDA is to use not only the scores from principal component analysis (PCA), but also the moments of the spectrum, as inputs for LDA to improve discrimination. Along with the conventional score, the moment is used in spectroscopy as an effective alternative representation of spectral features. Three different approaches were considered. Initially, the scores generated from PCA were projected onto a two-dimensional feature space by maximizing Fisher's criterion function (conventional PCA-LDA). Next, the same procedure was performed using only moments. Finally, both scores and moments were utilized simultaneously for LDA. To evaluate discrimination performance, three spectroscopic datasets were employed: (1) infrared (IR) spectra of normal and malignant stomach tissue, (2) near-infrared (NIR) spectra of diesel and light gas oil (LGO), and (3) Raman spectra of Chinese and Korean ginseng. In each case, the best discrimination results were achieved when both scores and moments were used for LDA (SMC-LDA). Since the spectral representation character of the moment differs from that of the score, including both for LDA provided more diversified and descriptive information.
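
    A minimal sketch of the score-moment combination on synthetic spectra, using scikit-learn; the moment definitions, component count and data are assumptions for illustration, not the paper's exact recipe:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(4)
        spectra = rng.normal(0, 1, (60, 200))
        labels = np.repeat([0, 1], 30)
        spectra[labels == 1] += np.linspace(0, 0.5, 200)  # class-dependent spectral tilt

        axis = np.arange(spectra.shape[1])

        def moments(s):
            # Treat the (absolute) spectrum as a distribution over the wavelength axis.
            s = np.abs(s) / np.abs(s).sum()
            m1 = (axis * s).sum()                # first moment: spectral centroid
            m2 = (((axis - m1) ** 2) * s).sum()  # second central moment: spread
            return [m1, m2]

        scores = PCA(n_components=5).fit_transform(spectra)
        features = np.hstack([scores, np.array([moments(s) for s in spectra])])
        lda = LinearDiscriminantAnalysis().fit(features, labels)
        print(lda.score(features, labels))  # training accuracy of the combined feature set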

  16. Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marten, Alex; Kopp, Robert E.; Shouse, Kate C.; Griffiths, Charles; Hodson, Elke L.; Kopits, Elizabeth; Mignone, Bryan K.; Moore, Chris; Newbold, Steve; Waldhoff, Stephanie T.; Wolverton, Ann

    2013-04-01

    to updating the estimates regularly as modeling capabilities and scientific and economic knowledge improve. To help foster further improvements in estimating the SCC, the U.S. Environmental Protection Agency and the U.S. Department of Energy hosted a pair of workshops on “Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis.” The first focused on conceptual and methodological issues related to integrated assessment modeling, and the second brought together natural and social scientists to explore methods for improving damage assessment for multiple sectors. These two workshops provide the basis for the 13 papers in this special issue.

  17. Can Comprehensive Chromosome Screening Technology Improve IVF/ICSI Outcomes? A Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Minghao Chen

    Full Text Available To examine whether comprehensive chromosome screening (CCS) for preimplantation genetic screening (PGS) has an effect on improving in vitro fertilization/intracytoplasmic sperm injection (IVF/ICSI) outcomes compared to traditional morphological methods. A literature search was conducted in PubMed, EMBASE, CNKI and ClinicalTrials.gov up to May 2015. Two reviewers independently evaluated titles and abstracts, extracted data and assessed quality. We included studies that compared the IVF/ICSI outcomes of CCS-based embryo selection with those of the traditional morphological method. Relative risk (RR) values with corresponding 95% confidence intervals (CIs) were calculated in RevMan 5.3, and subgroup analysis and Begg's test were used to assess heterogeneity and potential publication bias, respectively. Four RCTs and seven cohort studies were included. A meta-analysis of the outcomes showed that, compared to morphological criteria, euploid embryos identified by CCS were more likely to be successfully implanted (RCT RR 1.32, 95% CI 1.18-1.47; cohort study RR 1.74, 95% CI 1.35-2.24). CCS-based PGS was also related to an increased clinical pregnancy rate (RCT RR 1.26, 95% CI 0.83-1.93; cohort study RR 1.48, 95% CI 1.20-1.83), an increased ongoing pregnancy rate (RCT RR 1.31, 95% CI 0.64-2.66; cohort study RR 1.61, 95% CI 1.30-2.00), and an increased live birth rate (RCT RR 1.26, 95% CI 1.05-1.50; cohort study RR 1.35, 95% CI 0.85-2.13), as well as a decreased miscarriage rate (RCT RR 0.53, 95% CI 0.24-1.15; cohort study RR 0.31, 95% CI 0.21-0.46) and a decreased multiple pregnancy rate (RCT RR 0.02, 95% CI 0.00-0.26; cohort study RR 0.19, 95% CI 0.07-0.51). The results of the subgroup analysis also showed a significantly increased implantation rate in the CCS group. The effectiveness of CCS-based PGS is comparable to that of traditional morphological methods, with better outcomes for women receiving IVF/ICSI technology. The transfer of both trophectoderm-biopsied and

  18. Improvement of core effective thermal conductivity model of GAMMA+ code based on CFD analysis

    International Nuclear Information System (INIS)

    Highlights: • We assessed the core effective thermal conductivity (ETC) model of the GAMMA+ code. • The analytical model of the GAMMA+ code was compared with the results of a CFD analysis. • The effects of the material properties of the composite and the geometric configuration were studied. • The GAMMA+ model agreed with the CFD result when the fuel gap is ignored. • The GAMMA+ model was improved by an ETC model of the fuel compact that includes the fuel gap. - Abstract: The GAMMA+ code has been developed for the thermo-fluid and safety analyses of high temperature gas-cooled reactors (HTGRs). In order to calculate the core effective thermal conductivity, the code adopts a heterogeneous model derived from Maxwell's theory that accounts for the three distinct materials in a fuel block of the reactor core. In this model, the fuel gap is neglected since the gap thickness is quite small. In addition, the configuration of the fuel block is assumed to be homogeneous, and the volume fraction and material properties of each component are taken into account. In accident conditions, conduction and radiation are the major heat transfer mechanisms, so the core effective thermal conductivity model must be validated in order to estimate the heat transfer in the core appropriately. In this regard, the objective of this study is to validate the core effective thermal conductivity model of the GAMMA+ code by a computational fluid dynamics (CFD) analysis using a commercial CFD code, CFX-13. The effects of the temperature condition, material properties and geometric modeling on the core effective thermal conductivity were investigated. When the fuel gap is not modeled in the CFD analysis, the result of the GAMMA+ code shows good agreement with the CFD result. However, when the fuel gap is modeled, the GAMMA+ model considerably overestimates the core effective thermal conductivity for all cases. This is because of the increased thermal resistance of the fuel gap, which is not taken into account in
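
    For background, the two-phase Maxwell (Maxwell-Eucken) relation on which such heterogeneous models build is sketched below for a matrix of conductivity k_m containing a dispersed phase k_d at volume fraction v_d; this is the generic two-material form, not the GAMMA+ three-material extension:

        % Maxwell-Eucken effective conductivity of a two-phase composite (generic form):
        k_{\mathrm{eff}} \;=\; k_m\,
          \frac{2k_m + k_d + 2\,v_d\,(k_d - k_m)}{2k_m + k_d - v_d\,(k_d - k_m)}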

  19. IMPROVEMENT OF EXPERT ANALYSIS FOR ROAD TRAFFIC ACCIDENTS USING COMPUTER SIMULATION PROGRAMS

    Directory of Open Access Journals (Sweden)

    S. A. Azemsha

    2015-01-01

    Full Text Available The existing methods of auto-technical expert examination presuppose that some parameters are selected on the basis of the expert's intuition and experience. The type of vehicle, its loading rate and the road conditions are also not taken into account when deceleration is to be determined. The analysis has established that the application of special software makes it possible to significantly improve the efficiency of the work directed at solving the assigned tasks, to speed up calculations, to qualitatively decrease the probability of arithmetic errors, and to visualize the results of the investigations. The possibility of using various models for dynamic simulation of vehicle motion and collision (in the form of 3D models) has been established, taking into account the specific features of a vehicle's technical condition, its loading rate and the condition of the roadway surface. The approach also permits obtaining a dynamic display of the reconstructed accident mechanism in axonometric projection and recording video clips with the camera positioned at any spatial point: the road, the roadside, a raised position, a moving vehicle, or the driver's seat. The paper contains an analysis of the capabilities of road-traffic-accident simulation programs and a statistical analysis showing significant differences between the simulation results of different programs. It presents initial data and results of vehicle speed calculation from braking track length obtained with road traffic accident express analysis (a classical approach) and with PC-Crash when additional influencing factors are taken into account. A number of shortcomings that must be removed were revealed in the analyzed software products. On the basis of the executed analysis in

  20. Improved analysis of picomole quantities of lithium, sodium, and potassium in biological fluids.

    Science.gov (United States)

    Shalmi, M; Kibble, J D; Day, J P; Christensen, P; Atherton, J C

    1994-10-01

    The analysis of picomole quantities of lithium, sodium, and potassium by electrothermal atomic absorption spectrophotometry was studied using a Perkin-Elmer Zeeman 3030 spectrophotometer. With ordinary pyrolytically coated graphite tubes, a number of interference effects associated with the sample matrix were observed. In particular, the lithium and potassium absorbance signals were depressed by chloride, an effect shown to depend on the preatomization heating. When an in situ tantalum-coated atomization surface was used, the matrix interferences observed in lithium and potassium analyses were abolished, and the linear range of the potassium assay was extended. Technical difficulties encountered during sodium analysis at the primary wavelength were effectively circumvented by analysis at a less sensitive wavelength (303.3 nm), at which tantalum coating also prevented significant chloride interference. The improved microanalyses were employed to reevaluate the handling of lithium, sodium, and potassium along the proximal convoluted tubule (PCT) of the anesthetized rat. The average tubular fluid-to-plasma concentration ratios for lithium [(TF/P)Li] and sodium [(TF/P)Na] were 1.13 ± 0.08 (n = 26) and 0.99 ± 0.07 (n = 26), respectively. The tubular fluid-to-plasma ultrafiltrate concentration ratio for potassium [(TF/UF)K] was 1.09 ± 0.05 (n = 13). The ratios did not change significantly with puncture site along the PCT for any of the ions. (TF/P)Li and (TF/UF)K were significantly greater than (TF/P)Na, indicating that lithium and potassium reabsorption do not directly parallel sodium reabsorption in the PCT. PMID:7943365

  1. Improvement of the analysis of the biochemical oxygen demand (BOD) of Mediterranean seawater by seeding control.

    Science.gov (United States)

    Simon, F Xavier; Penru, Ywann; Guastalli, Andrea R; Llorens, Joan; Baig, Sylvie

    2011-07-15

    Biochemical oxygen demand (BOD) is a useful parameter for assessing the biodegradability of dissolved organic matter in water. At the same time, this parameter is used to evaluate the efficiency with which certain processes remove biodegradable natural organic matter (NOM). However, BOD values in seawater are very low (around 2 mg O₂ L⁻¹) and the methods for its analysis are poorly developed. The increasing attention given to seawater desalination in the Mediterranean environment, and related phenomena such as reverse osmosis membrane biofouling, have stimulated interest in seawater BOD close to the Spanish coast. In this study the BOD analysis protocol was refined by introducing a new step in which a critical quantity of autochthonous microorganisms, measured as adenosine triphosphate, is added. For the samples analyzed, this improvement allowed us to obtain reliable and replicable BOD measurements, standardized with solutions of glucose-glutamic acid and acetate. After 7 days of analysis, more than 80% of the ultimate BOD is reached, which in the case of easily biodegradable compounds represents nearly 60% of the theoretical oxygen demand. The BOD₇ obtained for the Mediterranean Sea was found to be 2.0 ± 0.3 mg O₂ L⁻¹, but this value decreased with seawater storage time due to the rapid consumption of labile compounds. No significant differences were found between two sampling points located on the Spanish coast, since their organic matter content was similar. Finally, determination of seawater BOD without the use of an inoculum may lead to an underestimation of BOD. PMID:21645736

  2. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    International Nuclear Information System (INIS)

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for the estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to the lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars of IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between the various data points. With this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which in turn increases the reliability of e-waste estimates compared to an approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions, showing the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
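
    The sales-stock-lifespan link at the heart of such an IOA model can be sketched as past sales convolved with a lifespan distribution; the Weibull parameters and sales series below are illustrative stand-ins, not the Dutch case-study data:

        import numpy as np
        from scipy.stats import weibull_min

        years = np.arange(1990, 2013)
        sales = np.interp(years, [1990, 2012], [0.2e6, 1.5e6])  # units put on the market per year

        def discarded(t, shape=2.0, scale=8.0):
            """Units discarded in year t: past sales weighted by the lifespan distribution."""
            ages = t - years[years <= t]
            # Probability that a unit sold (t - age) years ago fails during year t.
            p_fail = (weibull_min.cdf(ages + 1, shape, scale=scale)
                      - weibull_min.cdf(ages, shape, scale=scale))
            return float((sales[years <= t] * p_fail).sum())

        print(f"{discarded(2012):,.0f} units of e-waste generated in 2012")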

  3. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feng, E-mail: fwang@unu.edu [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for the estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to the lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars of IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between the various data points. With this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which in turn increases the reliability of e-waste estimates compared to an approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions, showing the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e

  4. Threat to the point: improving the value of comparative extinction risk analysis for conservation action.

    Science.gov (United States)

    Murray, Kris A; Verde Arregoitia, Luis D; Davidson, Ana; Di Marco, Moreno; Di Fonzo, Martina M I

    2014-02-01

    Comparative extinction risk analysis is a common approach for assessing the relative plight of biodiversity and making conservation recommendations. However, the usefulness of such analyses for conservation practice has been questioned. One reason for underperformance may be that threats arising from global environmental changes (e.g., habitat loss, invasive species, climate change) are often overlooked, despite being widely regarded as proximal drivers of species' endangerment. We explore this problem by (i) reviewing the use of threats in this field and (ii) quantitatively investigating the effects of threat exclusion on the interpretation and potential application of extinction risk model results. We show that threat variables are routinely (59%) identified as significant predictors of extinction risk, yet while most studies (78%) include extrinsic factors of some kind (e.g., geographic or bioclimatic information), the majority (63%) do not include threats. Despite low overall usage, studies are increasingly employing threats to explain patterns of extinction risk. However, most continue to employ methods developed for the analysis of heritable traits (e.g., body size, fecundity), which may be poorly suited to the treatment of nonheritable predictors including threats. In our global mammal and continental amphibian extinction risk case studies, omitting threats reduced model predictive performance, but more importantly (i) reduced mechanistic information relevant to management; (ii) resulted in considerable disagreement in species classifications (12% and 5% for amphibians and mammals, respectively, translating to dozens and hundreds of species); and (iii) caused even greater disagreement (20-60%) in a downstream conservation application (species ranking). We conclude that the use of threats in comparative extinction risk analysis is important and increasing but currently in the early stages of development. Priorities for future studies include improving uptake

  5. Improvement of the analysis of the biochemical oxygen demand (BOD) of Mediterranean seawater by seeding control.

    Science.gov (United States)

    Simon, F Xavier; Penru, Ywann; Guastalli, Andrea R; Llorens, Joan; Baig, Sylvie

    2011-07-15

    Biochemical oxygen demand (BOD) is a useful parameter for assessing the biodegradability of dissolved organic matter in water. At the same time, this parameter is used to evaluate the efficiency with which certain processes remove biodegradable natural organic matter (NOM). However, BOD values in seawater are very low (around 2 mg O₂ L⁻¹) and the methods for its analysis are poorly developed. The increasing attention given to seawater desalination in the Mediterranean environment, and related phenomena such as reverse osmosis membrane biofouling, have stimulated interest in seawater BOD close to the Spanish coast. In this study the BOD analysis protocol was refined by introducing a new step in which a critical quantity of autochthonous microorganisms, measured as adenosine triphosphate, is added. For the samples analyzed, this improvement allowed us to obtain reliable and replicable BOD measurements, standardized with solutions of glucose-glutamic acid and acetate. After 7 days of analysis, more than 80% of the ultimate BOD is reached, which in the case of easily biodegradable compounds represents nearly 60% of the theoretical oxygen demand. The BOD₇ obtained for the Mediterranean Sea was found to be 2.0 ± 0.3 mg O₂ L⁻¹, but this value decreased with seawater storage time due to the rapid consumption of labile compounds. No significant differences were found between two sampling points located on the Spanish coast, since their organic matter content was similar. Finally, determination of seawater BOD without the use of an inoculum may lead to an underestimation of BOD.

  6. An improved Agrobacterium-mediated transformation system for the functional genetic analysis of Penicillium marneffei.

    Science.gov (United States)

    Kummasook, Aksarakorn; Cooper, Chester R; Vanittanakom, Nongnuch

    2010-12-01

    We have developed an improved Agrobacterium-mediated transformation (AMT) system for the functional genetic analysis of Penicillium marneffei, a thermally dimorphic, human-pathogenic fungus. Our AMT protocol uses conidia or pre-germinated conidia of P. marneffei as the host recipient for T-DNA from Agrobacterium tumefaciens, with co-cultivation at 28°C for 36 hours. Bleomycin-resistant transformants were selected as yeast-like colonies following incubation at 37°C. The transformation efficiency was approximately 123 ± 3.27 and 239 ± 13.12 transformants per plate when using 5 × 10⁴ conidia and pre-germinated conidia as starting materials, respectively. Southern blot analysis demonstrated that 95% of transformants contained single copies of T-DNA. Inverse PCR was employed to identify the flanking sequences at the T-DNA insertion sites; analysis of these sequences indicated that integration occurred as random recombination events. Among the mutants isolated were previously described stuA- and gasC-defective strains. These AMT-derived mutants possessed single T-DNA integrations within the respective coding sequences. In addition, other morphological and pigmentation mutants possessing a variety of gene-specific defects were isolated, including two mutants with T-DNA integrations within putative promoter regions; one of the latter integration events was accompanied by deletion of the entire corresponding gene. Collectively, these results indicate that AMT can be used for large-scale functional genetic analyses in P. marneffei. Such analyses can potentially facilitate the identification of the genetic elements related to morphogenesis, as well as pathogenesis, in this medically important fungus.

  7. Liposome bupivacaine for improvement in economic outcomes and opioid burden in GI surgery: IMPROVE Study pooled analysis

    Directory of Open Access Journals (Sweden)

    Cohen SM

    2014-06-01

    Full Text Available Stephen M Cohen,1 Jon D Vogel,2 Jorge E Marcet,3 Keith A Candiotti4 1Atlanta Colon and Rectal Surgery, PA, Atlanta, GA, USA; 2General Surgery Clinic, University of Colorado, Aurora, CO, USA; 3Department of Surgery, Morsani College of Medicine, University of South Florida, Tampa, FL, USA; 4Department of Anesthesiology, University of Miami Leonard Miller School of Medicine, Miami, FL, USA Abstract: Postsurgical pain management remains a significant challenge. Liposome bupivacaine, as part of a multimodal analgesic regimen, has been shown to significantly reduce postsurgical opioid consumption, hospital length of stay (LOS), and hospitalization costs in gastrointestinal (GI) surgery, compared with intravenous (IV) opioid-based patient-controlled analgesia (PCA). Pooled results from open-label studies comparing a liposome bupivacaine-based multimodal analgesic regimen with IV opioid PCA were analyzed. Patients (n=191) who underwent planned surgery and received study drug (IV opioid PCA, n=105; multimodal analgesia, n=86) were included. Liposome bupivacaine-based multimodal analgesia compared with IV opioid PCA significantly reduced mean (standard deviation [SD]) postsurgical opioid consumption (38 [55] mg versus [vs] 96 [85] mg; P<0.0001), postsurgical LOS (median 2.9 vs 4.3 days; P<0.0001), and mean hospitalization costs (US$8,271 vs US$10,726; P=0.0109). The multimodal analgesia group reported significantly fewer patients with opioid-related adverse events (AEs) than the IV opioid PCA group (P=0.0027); there were no significant between-group differences in patient satisfaction scores at 30 days. A liposome bupivacaine-based multimodal analgesic regimen was associated with significantly less opioid consumption, fewer opioid-related AEs, and better health economic outcomes compared with an IV opioid PCA-based regimen in patients undergoing GI surgery. Study registration: This pooled analysis is based on data from Phase IV clinical trials registered on the US National

  8. Improving the Efficiency and Ease of Healthcare Analysis Through Use of Data Visualization Dashboards.

    Science.gov (United States)

    Stadler, Jennifer G; Donlon, Kipp; Siewert, Jordan D; Franken, Tessa; Lewis, Nathaniel E

    2016-06-01

    The digitization of a patient's health record has profoundly impacted medicine and healthcare. The compilation and accessibility of medical history has provided clinicians an unprecedented, holistic account of a patient's conditions, procedures, medications, family history, and social situation. In addition to the bedside benefits, this level of information has opened the door for population-level monitoring and research, the results of which can be used to guide initiatives that are aimed at improving quality of care. Cerner Corporation partners with health systems to help guide population management and quality improvement projects. With such an enormous and diverse client base - varying in geography, size, organizational structure, and analytic needs - discerning meaning in the data and how they fit with that particular hospital's goals is a slow, difficult task that requires clinical, statistical, and technical literacy. This article describes the development of dashboards for efficient data visualization at the healthcare facility level. Focusing on two areas with broad clinical importance, sepsis patient outcomes and 30-day hospital readmissions, dashboards were developed with the goal of aggregating data and providing meaningful summary statistics, highlighting critical performance metrics, and providing easily digestible visuals that can be understood by a wide range of personnel with varying levels of skill and areas of expertise. These internal-use dashboards have allowed associates in multiple roles to perform a quick and thorough assessment on a hospital of interest by providing the data to answer necessary questions and to identify important trends or opportunities. This automation of a previously manual process has greatly increased efficiency, saving hours of work time per hospital analyzed. Additionally, the dashboards have standardized the analysis process, ensuring use of the same metrics and processes so that overall themes can be compared across

  9. FDG uptake heterogeneity evaluated by fractal analysis improves the differential diagnosis of pulmonary nodules

    Energy Technology Data Exchange (ETDEWEB)

    Miwa, Kenta, E-mail: kenta5710@gmail.com [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Division of Medical Quantum Science, Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Inubushi, Masayuki, E-mail: inubushi@med.kawasaki-m.ac.jp [Department of Nuclear Medicine, Kawasaki Medical School, 577 Matsushima Kurashiki, Okayama 701-0192 (Japan); Wagatsuma, Kei, E-mail: kei1192@hotmail.co.jp [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Nagao, Michinobu, E-mail: minagao@radiol.med.kyushu-u.ac.jp [Department of Molecular Imaging and Diagnosis, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Murata, Taisuke, E-mail: taisuke113@gmail.com [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Koyama, Masamichi, E-mail: masamichi.koyama@jfcr.or.jp [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Koizumi, Mitsuru, E-mail: mitsuru@jfcr.or.jp [Department of Nuclear Medicine, Cancer Institute Hospital of Japanese Foundation for Cancer Research, 3-8-31 Ariake, Koto-ku, Tokyo 135-8550 (Japan); Sasaki, Masayuki, E-mail: msasaki@hs.med.kyushu-u.ac.jp [Division of Medical Quantum Science, Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan)

    2014-04-15

    Purpose: The present study aimed to determine whether fractal analysis of morphological complexity and intratumoral heterogeneity of FDG uptake can help to differentiate malignant from benign pulmonary nodules. Materials and methods: We retrospectively analyzed data from 54 patients with suspected non-small cell lung cancer (NSCLC) who were examined by FDG PET/CT. Pathological assessments of biopsy specimens confirmed 35 and 19 nodules as NSCLC and inflammatory lesions, respectively. The morphological fractal dimension (m-FD), maximum standardized uptake value (SUVmax) and density fractal dimension (d-FD) of target nodules were calculated from CT and PET images. Fractal dimension is a quantitative index of morphological complexity and tracer uptake heterogeneity; higher values indicate increased complexity and heterogeneity. Results: The m-FD, SUVmax and d-FD significantly differed between malignant and benign pulmonary nodules (p < 0.05). Although the diagnostic ability was better for d-FD than m-FD and SUVmax, the difference did not reach statistical significance. Tumor size correlated significantly with SUVmax (r = 0.51, p < 0.05), but not with either m-FD or d-FD. Furthermore, m-FD combined with either SUVmax or d-FD improved diagnostic accuracy to 92.6% and 94.4%, respectively. Conclusion: The d-FD of intratumoral heterogeneity of FDG uptake can help to differentially diagnose malignant and benign pulmonary nodules. The SUVmax and d-FD obtained from FDG-PET images provide different types of information that are equally useful for differential diagnoses. Furthermore, the morphological complexity determined by CT combined with heterogeneous FDG uptake determined by PET improved diagnostic accuracy.
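
    For readers unfamiliar with fractal dimensions, a generic box-counting estimator in the spirit of the m-FD/d-FD indices can be sketched as follows (the paper's exact definitions may differ; this is an illustration of the general technique):

    ```python
    import numpy as np

    def box_counting_dimension(mask, eps_list=(2, 4, 8, 16, 32)):
        """Estimate the fractal (box-counting) dimension of a binary image."""
        counts = []
        for eps in eps_list:
            # Trim so the image divides evenly into eps x eps boxes.
            h, w = (mask.shape[0] // eps) * eps, (mask.shape[1] // eps) * eps
            boxes = mask[:h, :w].reshape(h // eps, eps, w // eps, eps)
            # A box is "occupied" if it contains any foreground pixel.
            counts.append(boxes.any(axis=(1, 3)).sum())
        # Slope of log N(eps) versus log(1/eps) estimates the dimension.
        slope, _ = np.polyfit(np.log(1.0 / np.array(eps_list)), np.log(counts), 1)
        return slope

    demo = np.zeros((256, 256), dtype=bool)
    demo[64:192, 64:192] = True                    # a filled square: dimension ~2
    print(round(box_counting_dimension(demo), 2))
    ```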

  10. An improved state-parameter analysis of ecosystem models using data assimilation

    Science.gov (United States)

    Chen, M.; Liu, S.; Tieszen, L.L.; Hollinger, D.Y.

    2008-01-01

    Much of the effort spent in developing data assimilation methods for carbon dynamics analysis has focused on estimating optimal values for either model parameters or state variables. The main weakness of estimating parameter values alone (i.e., without considering state variables) is that all errors from input, output, and model structure are attributed to model parameter uncertainties. On the other hand, the accuracy of estimating state variables may be lowered if the temporal evolution of parameter values is not incorporated. This research develops a smoothed ensemble Kalman filter (SEnKF) by combining the ensemble Kalman filter with a kernel smoothing technique. The SEnKF has the following characteristics: (1) it estimates model states and parameters simultaneously by concatenating unknown parameters and state variables into a joint state vector; (2) it mitigates dramatic, sudden changes of parameter values in the parameter sampling and evolution process, and controls the narrowing of parameter variance (which results in filter divergence) by adjusting the smoothing factor in the kernel smoothing algorithm; (3) it assimilates data into the model recursively and thus detects possible time variation of parameters; and (4) it properly addresses the various sources of uncertainty stemming from input, output and parameter uncertainties. The SEnKF is tested by assimilating observed fluxes of carbon dioxide and environmental driving factor data from an AmeriFlux forest station located near Howland, Maine, USA, into a partition eddy flux model. Our analysis demonstrates that model parameters, such as light use efficiency, respiration coefficients, minimum and optimum temperatures for photosynthetic activity, and others, are highly constrained by eddy flux data at daily-to-seasonal time scales. The SEnKF stabilizes parameter values quickly regardless of the initial values of the parameters. Potential ecosystem light use efficiency demonstrates a strong seasonality. Results show that the
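
    A minimal sketch of the augmented-state idea behind the SEnKF - concatenating parameters and states into one joint vector, shrinking the parameter ensemble with a kernel-smoothing factor, and applying a perturbed-observation EnKF update - is shown below. Function and variable names are assumptions, and the authors' algorithm includes details omitted here:

    ```python
    import numpy as np

    def senkf_step(X, theta, y_obs, h, R, a=0.98):
        """One smoothed EnKF update on a joint state-parameter ensemble.

        X: (n_state, N) state ensemble; theta: (n_par, N) parameters;
        y_obs: observation vector; h: maps a joint vector to observation space;
        R: observation error covariance; a: kernel-smoothing factor.
        """
        # Kernel smoothing: shrink parameters toward their ensemble mean and
        # re-inflate with jitter so the spread does not collapse (the filter
        # divergence guarded against in point (2) above).
        m = theta.mean(axis=1, keepdims=True)
        jitter = np.sqrt(1 - a**2) * theta.std(axis=1, keepdims=True)
        theta = a * theta + (1 - a) * m + jitter * np.random.randn(*theta.shape)

        Z = np.vstack([X, theta])                    # joint state vector
        Y = np.column_stack([h(z) for z in Z.T])     # predicted observations
        Zc = Z - Z.mean(axis=1, keepdims=True)
        Yc = Y - Y.mean(axis=1, keepdims=True)
        N = Z.shape[1]
        K = (Zc @ Yc.T) @ np.linalg.inv(Yc @ Yc.T + (N - 1) * R)  # Kalman gain
        # Perturbed-observation update.
        D = y_obs[:, None] + np.random.multivariate_normal(
            np.zeros(len(y_obs)), R, size=N).T
        Z = Z + K @ (D - Y)
        return Z[:X.shape[0]], Z[X.shape[0]:]
    ```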

  11. Analysis of Stakeholder's Behaviours for an Improved Management of an Agricultural Coastal Region in Oman

    Science.gov (United States)

    Al Khatri, Ayisha; Grundmann, Jens; van der Weth, Rüdiger; Schütze, Niels

    2015-04-01

    differences exist between groups on how to achieve this improvement, since farmers prefer management interventions operating more on the water resources side while decision makers support measures for a better management of water demand. Furthermore, the opinions within single groups are sometimes contradictory for several management interventions. The use of more advanced statistical methods like discriminant analysis or Bayesian networks allows factors and drivers to be identified that explain these differences. Both approaches will help to understand stakeholders' behaviours and to evaluate the implementation potential of several management interventions. Keywords: IWRM, Stakeholder participation, field survey, statistical analysis, Oman

  12. Oxygen isotope analysis of phosphate: improved precision using TC/EA CF-IRMS.

    Science.gov (United States)

    LaPorte, D F; Holmden, C; Patterson, W P; Prokopiuk, T; Eglington, B M

    2009-06-01

    Oxygen isotope values of biogenic apatite have long demonstrated considerable promise for paleothermometry because of the abundance of material in the fossil record and the greater resistance of apatite to diagenesis compared to carbonate. Unfortunately, this promise has not been fully realized because of the relatively poor precision of isotopic measurements and the exceedingly small size of some substrates for analysis. Building on previous work, we demonstrate that it is possible to improve the precision of δ18O(PO4) measurements using a 'reverse-plumbed' thermal conversion elemental analyzer (TC/EA) coupled to a continuous flow isotope ratio mass spectrometer (CF-IRMS) via a helium stream. This modification to the flow of helium through the TC/EA, and careful location of the packing of glassy carbon fragments relative to the hot spot in the reactor, leads to narrower, more symmetrically distributed CO elution peaks with diminished tailing. In addition, we describe our apatite purification chemistry, which uses nitric acid and cation exchange resin. The purification chemistry is optimized for processing small samples, minimizing isotopic fractionation of PO4(3-) and permitting Ca, Sr and Nd to be eluted and purified further for the measurement of δ44Ca and 87Sr/86Sr in modern biogenic apatite and 143Nd/144Nd in fossil apatite. Our methodology yields an external precision of ±0.15‰ (1σ) for δ18O(PO4). The uncertainty is related to the preparation of the Ag3PO4 salt, conversion to CO gas in a reverse-plumbed TC/EA, analysis of oxygen isotopes using a CF-IRMS, and uncertainty in constructing calibration lines that convert raw δ18O data to the VSMOW scale. Matrix matching of samples and standards for the purpose of calibration to the VSMOW scale was determined to be unnecessary. Our method requires only slightly modified equipment that is widely available. This fact, and the
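
    The delta notation and the VSMOW-scale calibration step mentioned above follow standard definitions. A brief sketch; the calibration data are illustrative assumptions, not the paper's numbers:

    ```python
    import numpy as np

    def delta18o(r_sample, r_vsmow=2005.20e-6):
        """delta18O in per mil: (R_sample / R_standard - 1) * 1000,
        using the commonly cited 18O/16O ratio of VSMOW as the standard."""
        return (r_sample / r_vsmow - 1.0) * 1000.0

    # Calibration line mapping raw instrument deltas to the VSMOW scale,
    # built from standards of known composition (values illustrative).
    raw = np.array([8.1, 20.3])        # measured delta18O of two standards
    accepted = np.array([9.3, 21.7])   # their accepted VSMOW-scale values
    slope, intercept = np.polyfit(raw, accepted, 1)
    print(round(slope * 14.2 + intercept, 2))   # calibrate an unknown's raw value
    ```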

  13. Experimental study and mechanism analysis of modified limestone by red mud for improving desulfurization

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongtao; Han, Kuihua; Niu, Shengli; Lu, Chunmei; Liu, Mengqi; Li, Hui [Shandong Univ., Jinan (China). School of Energy and Power Engineering

    2013-07-01

    Red mud is a type of solid waste generated during alumina production from bauxite, and how to dispose of and utilize red mud at large scale remains an open question. This paper attempts to use red mud as an additive to modify limestone. The enhancement of the sulfation reaction of limestone by red mud (two kinds of Bayer process red mud and one kind of sintering process red mud) is studied in a tube furnace reactor. The calcination and sulfation process and kinetics are investigated in a thermogravimetric (TG) analyzer. The results show that red mud can effectively improve the desulfurization performance of limestone over the whole temperature range (1,073-1,373 K). The sulfur capacity of limestone (the mass of SO2 that can be retained by 100 mg of limestone) can be increased by 25.73, 7.17 and 15.31%, while the utilization of calcium can be increased from 39.68 to 64.13%, 60.61% and 61.16%, after modification by the three kinds of red mud at a calcium/metallic element (metallic element here means all metallic elements that can play a catalytic role in the sulfation process, including Na, K, Fe and Ti) ratio of 15 and a temperature of 1,173 K. The structure of limestone modified by red mud is interlaced and tridimensional, which is conducive to the sulfation reaction. The phase composition analysis of modified limestone sulfated at high temperature, measured by XRD, shows that correspondingly more sulphates of the silicate and aluminate complexes of calcium exist in the products. Temperature, calcium/metallic element ratio and particle diameter are important factors for the sulfation reaction. The optimum results are obtained at a calcium/metallic element ratio of 15. The calcination characteristic of limestone modified by red mud shifts toward lower temperatures. The enhancement of sulfation by doping red mud is more pronounced once the product layer has been formed and consequently the promoting

  14. Improving practices in nanomedicine through near real-time pharmacokinetic analysis

    Science.gov (United States)

    Magaña, Isidro B.

    More than a decade into the development of gold nanoparticles, with multiple clinical trials underway, ongoing pre-clinical research continues towards better understanding in vivo interactions. The goal is treatment optimization through improved best practices. In an effort to collect information for healthcare providers enabling informed decisions in a relevant time frame, instrumentation for real-time plasma concentration (multi-wavelength photoplethysmography) and protocols for rapid elemental analysis (energy dispersive X-Ray fluorescence) of biopsied tumor tissue have been developed in a murine model. An initial analysis, designed to demonstrate the robust nature and utility of the techniques, revealed that area under the bioavailability curve (AUC) alone does not currently inform tumor accumulation with a high degree of accuracy (R2=0.56), marginally better than injected dose (R2=0.46). This finding suggests that the control of additional experimental and physiological variables (chosen through modeling efforts) may yield more predictable tumor accumulation. Subject core temperature, blood pressure, and tumor perfusion are evaluated relative to particle uptake in a murine tumor model. New research efforts are also focused on adjuvant therapies that are employed to modify circulation parameters, including the AUC, of nanorods and gold nanoshells. Preliminary studies demonstrated a greater than 300% increase in average AUC using a reticuloendothelial blockade agent versus control groups. Given a better understanding of the relative importance of the physiological factors that influence rates of tumor accumulation, a set of experimental best practices is presented. This dissertation outlines the experimental protocols conducted, and discusses the real-world needs discovered and how these needs became specifications of developed protocols.
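
    The AUC discussed here is the area under the plasma concentration-time curve, conventionally computed with the trapezoidal rule. A small sketch; the time points and concentrations are illustrative, not data from this work:

    ```python
    import numpy as np

    # Area under a concentration-time curve by the trapezoidal rule.
    t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 24.0])    # hours
    c = np.array([0.0, 18.0, 15.2, 11.0, 6.1, 2.3, 0.4])  # concentration
    auc = np.sum((c[1:] + c[:-1]) / 2.0 * np.diff(t))
    print(round(auc, 1))
    ```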

  15. Gaining improved chemical composition by exploitation of Compton-to-Rayleigh intensity ratio in XRF analysis.

    Science.gov (United States)

    Hodoroaba, Vasile-Dan; Rackwitz, Vanessa

    2014-07-15

    The high specificity of the coherent (Rayleigh) as well as incoherent (Compton) X-ray scattering to the mean atomic number of a specimen to be analyzed by X-ray fluorescence (XRF) is exploited to gain more information on the chemical composition. Concretely, the evaluation of the Compton-to-Rayleigh intensity ratio from XRF spectra and its relation to the average atomic number of reference materials via a calibration curve can reveal valuable information on the elemental composition, complementary to that obtained from the reference-free XRF analysis. Particularly for matrices of lower mean atomic number, the sensitivity of the approach is so high that specimens whose mean atomic numbers differ by only 0.1 can be easily distinguished. Hence, the content of light elements which are "invisible" to XRF, particularly hydrogen, or of heavier impurities/additives in light materials can be calculated "by difference" from the scattering calibration curve. The excellent agreement between such an experimental, empirical calibration curve and a synthetically generated one, based on a reliable physical model for the X-ray scattering, is also demonstrated. Thus, the feasibility of the approach for given experimental conditions and particular analytical questions can be tested prior to experiments with reference materials. For the present work a microfocus X-ray source attached to an SEM/EDX (scanning electron microscopy/energy dispersive X-ray spectroscopy) system was used, so that the Compton-to-Rayleigh intensity ratio could be acquired with EDX spectral data for improved analysis of the elemental composition. PMID:24950635
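
    The calibration-curve logic can be sketched in a few lines: fit the measured Compton-to-Rayleigh ratio of reference materials against their known mean atomic number, then invert the fit for an unknown. All numbers below are illustrative assumptions, not the paper's data:

    ```python
    import numpy as np

    # Compton/Rayleigh intensity ratios of reference materials with known
    # mean atomic number Z (illustrative values; the ratio falls with Z).
    z_ref = np.array([6.6, 7.4, 8.9, 10.8, 13.0])
    ratio = np.array([12.1, 9.8, 6.9, 4.8, 3.2])

    # Fit log(ratio) as a linear function of Z, then invert for an unknown.
    slope, intercept = np.polyfit(z_ref, np.log(ratio), 1)
    z_unknown = (np.log(5.6) - intercept) / slope
    print(round(z_unknown, 2))   # estimated mean atomic number
    ```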

  16. Rehabilitation Interventions for Improving Social Participation After Stroke: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Obembe, Adebimpe O; Eng, Janice J

    2016-05-01

    Background: Despite the fact that social participation is considered a pivotal outcome of a successful recovery after stroke, there has been little attention on the impact of activities and services on this important domain. Objective: To present a systematic review and meta-analysis from randomized controlled trials (RCTs) on the effects of rehabilitation interventions on social participation after stroke. Methods: A total of 8 electronic databases were searched for relevant RCTs that evaluated the effects of an intervention on the outcome of social participation after stroke. Reference lists of selected articles were hand searched to identify further relevant studies. The methodological quality of the studies was assessed using the Physiotherapy Evidence Database Scale. Standardized mean differences (SMDs) and confidence intervals (CIs) were estimated using fixed- and random-effect models. Results: In all, 24 RCTs involving 2042 stroke survivors were identified and reviewed, and 21 were included in the meta-analysis. There was a small beneficial effect of interventions that utilized exercise on social participation (10 studies; SMD = 0.43; 95% CI = 0.09, 0.78; P = .01) immediately after the program ended. Exercise in combination with other interventions (13 studies; SMD = 0.34; 95% CI = 0.10, 0.58; P = .006) also resulted in beneficial effects. No significant effect was observed for interventions that involved support services over 9 studies (SMD = 0.09 [95% CI = -0.04, 0.21]; I(2) = 0%; P = .16). Conclusions: The included studies provide evidence that rehabilitation interventions may be effective in improving social participation after stroke, especially if exercise is one of the components. PMID:26223681
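
    The pooled SMDs reported above come from inverse-variance weighting. A compact sketch of fixed- and random-effects pooling, using the DerSimonian-Laird tau-squared estimator (a common choice, though not necessarily the one used in this review):

    ```python
    import numpy as np

    def pooled_smd(smd, se, random_effects=True):
        """Inverse-variance pooling of standardized mean differences.

        smd, se: per-study effect sizes and standard errors.
        Uses the DerSimonian-Laird estimate of between-study variance.
        """
        smd, se = np.asarray(smd, float), np.asarray(se, float)
        w = 1.0 / se**2                                   # fixed-effect weights
        fixed = np.sum(w * smd) / np.sum(w)
        q = np.sum(w * (smd - fixed)**2)                  # Cochran's Q
        if random_effects:
            df = len(smd) - 1
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - df) / c)                 # between-study variance
            w = 1.0 / (se**2 + tau2)
        est = np.sum(w * smd) / np.sum(w)
        half = 1.96 / np.sqrt(np.sum(w))
        return est, (est - half, est + half)

    # Three invented studies, for illustration only.
    print(pooled_smd([0.5, 0.3, 0.6], [0.2, 0.15, 0.25]))
    ```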

  17. Analysis of Improved Reference Design for a Nuclear-Driven High Temperature Electrolysis Hydrogen Production Plant

    Energy Technology Data Exchange (ETDEWEB)

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2010-06-01

    The use of High Temperature Electrolysis (HTE) for the efficient production of hydrogen without the greenhouse gas emissions associated with conventional fossil-fuel hydrogen production techniques has been under investigation at the Idaho National Engineering Laboratory (INL) for the last several years. The activities at the INL have included the development, testing and analysis of large numbers of solid oxide electrolysis cells, and the analyses of potential plant designs for large scale production of hydrogen using an advanced Very-High Temperature Reactor (VHTR) to provide the process heat and electricity to drive the electrolysis process. The results of these system analyses, using the UniSim process analysis software, have shown that the HTE process, when coupled to a VHTR capable of operating at reactor outlet temperatures of 800 °C to 950 °C, has the potential to produce the large quantities of hydrogen needed to meet future energy and transportation needs with hydrogen production efficiencies in excess of 50%. In addition, economic analyses performed on the INL reference plant design, optimized to maximize the hydrogen production rate for a 600 MWt VHTR, have shown that a large nuclear-driven HTE hydrogen production plant can be economically competitive with conventional hydrogen production processes, particularly when the penalties associated with greenhouse gas emissions are considered. The results of this research led to the selection in 2009 of HTE as the preferred concept in the U.S. Department of Energy (DOE) hydrogen technology down-selection process. However, the down-selection process, along with continued technical assessments at the INL, has resulted in a number of proposed modifications and refinements to improve the original INL reference HTE design. These modifications include changes in plant configuration, operating conditions and individual component designs. This paper describes the resulting new INL reference design and presents

  18. Analysis of Scattering Components from Fully Polarimetric SAR Images for Improving Accuracies of Urban Density Estimation

    Science.gov (United States)

    Susaki, J.

    2016-06-01

    In this paper, we analyze probability density functions (PDFs) of scatterings derived from fully polarimetric synthetic aperture radar (SAR) images for improving the accuracies of estimated urban density. We have reported a method for estimating urban density that uses an index Tv+c obtained by normalizing the sum of volume and helix scatterings Pv+c. Validation results showed that estimated urban densities have a high correlation with building-to-land ratios (Kajimoto and Susaki, 2013b; Susaki et al., 2014). While the method is found to be effective for estimating urban density, it is not clear why Tv+c is more effective than indices derived from other scatterings, such as surface or double-bounce scatterings, observed in urban areas. In this research, we focus on PDFs of scatterings derived from fully polarimetric SAR images in terms of scattering normalization. First, we introduce a theoretical PDF that assumes that image pixels have scatterers showing random backscattering. We then generate PDFs of scatterings derived from observations of concrete blocks with different orientation angles, and from a satellite-based fully polarimetric SAR image. The analysis of the PDFs and the derived statistics reveals that the curves of the PDFs of Pv+c are the most similar to the normal distribution among all the scatterings derived from fully polarimetric SAR images. It was found that Tv+c works most effectively because of its similarity to the normal distribution.

  19. A methodology for the analysis and improvement of a firm's competitiveness

    Directory of Open Access Journals (Sweden)

    Jose Celso Contador

    2006-01-01

    Full Text Available This paper presents a new methodology for the analysis of a group of companies, aiming at explaining and increasing a firm's competitiveness. Based on the model of the fields and weapons of the competition, the methodology distinguishes between business and operational competitive strategies. The first consists of some of the 15 fields of the competition, and the latter consists of the weapons of the competition. Competitiveness is explained through the application of several mathematical variables. The influence of the competitive strategies is statistically evaluated using the Wilcoxon-Mann-Whitney non-parametric test, the t-test, and Pearson's correlation. The methodology was applied to companies belonging to the textile pole of Americana; one of the conclusions reached is that what explains competitiveness is the operational strategy rather than the business strategy. Therefore, to improve competitiveness, a company must intensify its focus on weapons that are relevant to the fields where it decided to compete.
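
    The statistical toolkit named above is standard. A minimal sketch of how such group comparisons might be run with SciPy; the firm groups, scores and variable names are invented for illustration:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical competitiveness scores for two groups of firms.
    more_competitive = np.array([7.1, 6.8, 7.9, 8.2, 6.5])
    less_competitive = np.array([5.2, 4.9, 6.1, 5.5, 4.4])

    u, p_u = stats.mannwhitneyu(more_competitive, less_competitive,
                                alternative="two-sided")
    t, p_t = stats.ttest_ind(more_competitive, less_competitive, equal_var=False)

    # Correlation between a weapon-intensity score and competitiveness.
    weapon_focus = np.array([0.9, 0.7, 1.0, 1.1, 0.6, 0.4, 0.5, 0.6, 0.5, 0.3])
    score = np.concatenate([more_competitive, less_competitive])
    r, p_r = stats.pearsonr(weapon_focus, score)
    print(p_u, p_t, round(r, 2), p_r)
    ```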

  20. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    Science.gov (United States)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe some improvements for higher performance and higher data usability. In DAS, GPS radiosonde upper-air meteorological data and the U.S. standard atmosphere model are used to calculate molecular number density profiles automatically. Predicted ozone density profile images above Saga city are also calculated using the Meteorological Research Institute (MRI) chemistry-climate model version 2 for comparison with actual ozone DIAL data.
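
    The molecular number density profile mentioned above follows from the ideal gas law, n = P/(k_B T), applied to the radiosonde pressure and temperature profiles. A minimal sketch:

    ```python
    import numpy as np

    K_B = 1.380649e-23   # Boltzmann constant, J/K

    def number_density(pressure_hpa, temperature_k):
        """Molecular number density n = P / (k_B * T), in molecules per m^3."""
        return (np.asarray(pressure_hpa) * 100.0) / (K_B * np.asarray(temperature_k))

    # Standard surface conditions give ~2.55e25 molecules/m^3.
    print(number_density(1013.25, 288.15))
    ```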

  1. Chicken Essence for Cognitive Function Improvement: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Teoh, Siew Li; Sudfangsai, Suthinee; Lumbiganon, Pisake; Laopaiboon, Malinee; Lai, Nai Ming; Chaiyakunapruk, Nathorn

    2016-01-01

    Chicken essence (CE) is a popular traditional remedy in Asia, which is believed to improve cognitive functions. CE companies claim that these health benefits have been proven in research studies. A systematic review was conducted to determine the cognitive-enhancing effects of CE. We systematically searched a number of databases for randomized controlled trials in which human subjects consumed CE and cognitive tests were involved. Cochrane's Risk of Bias (ROB) tool was used to assess the quality of trials, and meta-analysis was performed. Seven trials were included; six recruited healthy subjects and one recruited subjects with poorer cognitive function. One trial had unclear ROB while the rest had high ROB. For executive function tests, there was a significant difference favoring CE (pooled standardized mean difference (SMD) of -0.55 (-1.04, -0.06)) and another with no significant difference (pooled SMD of 0.70 (-0.001, 1.40)). For short-term memory tests, no significant difference was found (pooled SMD of 0.63 (-0.16, 1.42)). Currently, there is a lack of convincing evidence to show a cognitive enhancing effect of CE. PMID:26805876

  2. Comparison of DNA polymerases for improved forensic analysis of challenging samples.

    Science.gov (United States)

    Nilsson, Martina; Grånemo, Joakim; Buś, Magdalena M; Havsjö, Mikael; Allen, Marie

    2016-09-01

    Inhibitors of polymerase chain reaction (PCR) amplification often present a challenge in forensic investigations of, e.g., terrorism, missing persons, sexual assaults and other criminal cases. Such inhibitors may be counteracted by dilution of the DNA extract, using different additives, and selecting an inhibitor-resistant DNA polymerase. Additionally, DNA in forensic samples is often present in limited amounts and degraded, requiring special analyses of short nuclear targets or mitochondrial DNA. The present study evaluated the enzymes AmpliTaq Gold, HotStarTaq Plus, KAPA3G Plant, and KAPA2G Robust with regard to their ability to overcome inhibitory effects. Our data showed that diluting the extracts and adding bovine serum albumin may increase the yield of the PCR product. However, the largest impact was observed when alternative enzymes were utilized instead of the commonly used AmpliTaq Gold. KAPA2G Robust presented the highest amplification efficiency in the presence of the inhibitor ammonium nitrate. Moreover, the KAPA3G Plant enzyme had the highest efficiency in amplifying degraded DNA from old buried bone material. KAPA3G Plant and KAPA2G Robust may thus be useful for counteracting inhibitors and improving the analysis of challenging samples. PMID:27299290

  3. Improved Dynamic Modeling of the Cascade Distillation Subsystem and Analysis of Factors Affecting Its Performance

    Science.gov (United States)

    Perry, Bruce A.; Anderson, Molly S.

    2015-01-01

    The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station Water Processor Assembly to form a complete water recovery system for future missions. A preliminary chemical process simulation was previously developed using Aspen Custom Modeler® (ACM), but it could not simulate thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. This paper describes modifications to the ACM simulation of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version can be used to model thermal startup and predicts the total energy consumption of the CDS. The simulation has been validated for both NaCl solution and pretreated urine feeds and no longer requires retuning when operating parameters change. The simulation was also used to predict how internal processes and operating conditions of the CDS affect its performance. In particular, it is shown that the coefficient of performance of the thermoelectric heat pump used to provide heating and cooling for the CDS is the largest factor in determining CDS efficiency. Intrastage heat transfer affects CDS performance indirectly through effects on the coefficient of performance.
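
    To see why the heat pump's coefficient of performance (COP) dominates, consider the simplified energy bound below: a back-of-the-envelope sketch that ignores the stage-to-stage heat recovery the multistage CDS actually exploits, so real performance is much better than this bound:

    ```python
    # Evaporating water takes roughly 2.26 MJ of heat per kg. If a heat pump
    # supplies that heat with coefficient of performance COP, the electrical
    # energy per kg of distillate is about latent_heat / COP.
    LATENT_HEAT = 2.26e6   # J/kg

    for cop in (0.5, 1.0, 1.5, 2.0):
        kwh_per_kg = LATENT_HEAT / cop / 3.6e6
        print(cop, round(kwh_per_kg, 2), "kWh/kg")
    ```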

  4. Improving breast cancer classification with mammography, supported on an appropriate variable selection analysis

    Science.gov (United States)

    Pérez, Noel; Guevara, Miguel A.; Silva, Augusto

    2013-02-01

    This work addresses the issue of variable selection within the context of breast cancer classification with mammography. A comprehensive repository of feature vectors was used, including a hybrid subset gathering image-based and clinical features. The aim was to gather experimental evidence about variable selection in terms of cardinality and type, and to find a classification scheme providing the best performance in terms of Area Under the Receiver Operating Characteristic Curve (AUC) scores using the ranked feature subset. We evaluated and classified a total of 300 subsets of features formed by the application of Chi-Square Discretization, Information-Gain, One-Rule and RELIEF methods in association with Feed-Forward Backpropagation Neural Network (FFBP), Support Vector Machine (SVM) and Decision Tree J48 (DTJ48) Machine Learning Algorithms (MLA) for a comparative performance evaluation based on AUC scores. A variable selection analysis was performed for Single-View Ranking and Multi-View Ranking groups of features. Feature subsets representing Microcalcifications (MCs), Masses, and both MCs and Masses lesions achieved AUC scores of 0.91, 0.954 and 0.934, respectively. Experimental evidence demonstrated that classification performance was improved by combining image-based and clinical features. The most important clinical and image-based features were StromaDistortion and Circularity, respectively. Other features, less important but worth using due to their consistency, were Contrast, Perimeter, Microcalcification, Correlation and Elongation.
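
    A scaled-down analogue of the ranking-plus-classification pipeline, using scikit-learn on synthetic data. Mutual information stands in for the Information-Gain ranking, and the dataset and parameter choices are illustrative, not the study's:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    # Synthetic stand-in for the mammography feature vectors.
    X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                               random_state=0)

    # Rank features by an information-gain-like criterion and keep the top k.
    scores = mutual_info_classif(X, y, random_state=0)
    top = np.argsort(scores)[::-1][:10]

    # Compare AUC with all features versus the ranked subset.
    for name, cols in [("all", slice(None)), ("top-10", top)]:
        auc = cross_val_score(SVC(), X[:, cols], y, cv=5, scoring="roc_auc")
        print(name, auc.mean().round(3))
    ```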

  5. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia

    Science.gov (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.

    2016-06-01

    To support the implementation of the IWRM-Indonesia process in a water scarce and sanitation poor region of Central Java (Indonesia), sustainability assessments of several technology options of water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz association. In the case of water supply, the assessment was based on the life-cycle analysis and life-cycle-costing approach. In the sanitation sector, the focus was set on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied in general to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences which influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.

  6. Improved machine learning method for analysis of gas phase chemistry of peptides

    Directory of Open Access Journals (Sweden)

    Ahn Natalie

    2008-12-01

    Full Text Available Abstract Background Accurate peptide identification is important to high-throughput proteomics analyses that use mass spectrometry. Search programs compare fragmentation spectra (MS/MS) of peptides from complex digests with theoretically derived spectra from a database of protein sequences. Improved discrimination is achieved with theoretical spectra that are based on simulating gas phase chemistry of the peptides, but the limited understanding of those processes affects the accuracy of predictions from theoretical spectra. Results We employed a robust data mining strategy using new feature annotation functions of MAE software, which revealed under-prediction of the frequency of occurrence in fragmentation of the second peptide bond. We applied methods of exploratory data analysis to pre-process the information in the MS/MS spectra, including data normalization and attribute selection, to reduce the attributes to a smaller, less correlated set for machine learning studies. We then compared our rule building machine learning program, DataSqueezer, with commonly used association rules and decision tree algorithms. All used machine learning algorithms produced similar results that were consistent with expected properties for a second gas phase mechanism at the second peptide bond. Conclusion The results provide compelling evidence that we have identified underlying chemical properties in the data that suggest the existence of an additional gas phase mechanism for the second peptide bond. Thus, the methods described in this study provide a valuable approach for analyses of this kind in the future.

  7. Fluid Analysis and Improved Structure of an ATEG Heat Exchanger Based on Computational Fluid Dynamics

    Science.gov (United States)

    Tang, Z. B.; Deng, Y. D.; Su, C. Q.; Yuan, X. H.

    2015-06-01

    In this study, a numerical model has been employed to analyze the internal flow field distribution in a heat exchanger applied for an automotive thermoelectric generator based on computational fluid dynamics. The model simulates the influence of factors relevant to the heat exchanger, including the automotive waste heat mass flow velocity, temperature, internal fins, and back pressure. The result is in good agreement with experimental test data. Sensitivity analysis of the inlet parameters shows that increase of the exhaust velocity, compared with the inlet temperature, makes little contribution (0.1 versus 0.19) to the heat transfer but results in a detrimental back pressure increase (0.69 versus 0.21). A configuration equipped with internal fins is proved to offer better thermal performance compared with that without fins. Finally, based on an attempt to improve the internal flow field, a more rational structure is obtained, offering a more homogeneous temperature distribution, higher average heat transfer coefficient, and lower back pressure.

  8. Improving Australia's renewable energy project policy and planning: A multiple stakeholder analysis

    International Nuclear Information System (INIS)

    Renewable Energy (RE) is part of Australia's and the world's energy supply matrix with over A$100 billion spent annually on RE projects since 2007. Businesses seeking to invest in RE projects, particularly in the wind and solar energy sectors, may face an onerous collection of planning approvals and permitting processes that impede investment and implementation. In this study, we draw on international and domestic stakeholder inputs to a governmental inquiry in Australia to show how RE projects might be approved in shortened timeframes with reduced associated costs. The process mapping and stakeholder analysis demonstrates that RE supply projects can benefit from standardized approval processes and documentation, a 360° deep engagement with stakeholders, and expanded electricity grid access in resource areas, augmented through supportive public policy and planning frameworks. In addition, stakeholder objections to project approval and implementation streamlining were used to contrast the efficacy of the proposed changes in policy. -- Highlights: •Highlights the over A$200 billion spent annually on global RE projects. •Describes a typical two stage, multi-layered governance RE project approval process. •Exposes long 3 year and multi-million dollar cost approvals for RE projects. •Identifies multi-million dollar remote grid connections as an RE project impediment. •Outlines RE project policy and guidelines shortcomings and proposed improvements

  9. Network Analysis of Force Concept Inventory Responses to Improve Diagnostic Utility

    Science.gov (United States)

    Brewe, Eric; Bruun, Jesper

    2015-04-01

    The Force Concept Inventory (FCI) is a diagnostic instrument designed to investigate students' understanding of Newtonian mechanics and is widely used in Physics Education Research. One of the strengths of the FCI is that the distractors are drawn from student conceptions based in their experiences. The distractors chosen are often more informative about students' understanding, as they identify the particular nature of students' alternative conceptions. We propose a network-based analysis of the FCI which will enhance its utility as a diagnostic tool for identifying student conceptions. In this approach, student responses are treated as a bipartite network which is then projected into two networks - students and responses. The response network includes all responses that are shared among students. We use the LANS backbone extraction algorithm to identify patterns in student responses. We use community detection algorithms on the backbone networks to identify clusters of common responses which map to models held by students, for example, ``force is needed for movement'' and ``the active agent uses the most force.'' This method has utility across a variety of instruments and could be used to improve instruction by providing in-depth knowledge of student conceptions. Supported in part by NSF Grant #PHY 134424.
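
    The bipartite projection and clustering steps can be sketched with networkx on toy data. Greedy modularity stands in for the LANS backbone extraction plus community detection pipeline the authors propose, and the student/response labels are invented:

    ```python
    import networkx as nx
    from networkx.algorithms import bipartite, community

    # Toy bipartite network: students connected to the FCI responses they chose.
    edges = [("s1", "q1A"), ("s1", "q2C"), ("s2", "q1A"), ("s2", "q2C"),
             ("s3", "q1B"), ("s3", "q2D"), ("s4", "q1B"), ("s4", "q2D")]
    B = nx.Graph(edges)
    students = {n for n in B if n.startswith("s")}

    # Project onto responses: responses are linked when students co-select them.
    R = bipartite.weighted_projected_graph(B, set(B) - students)

    # Clusters of co-occurring answers approximate shared student models.
    for c in community.greedy_modularity_communities(R, weight="weight"):
        print(sorted(c))
    ```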

  10. Improved target detection and bearing estimation utilizing fast orthogonal search for real-time spectral analysis

    Science.gov (United States)

    Osman, Abdalla; Nourledin, Aboelamgd; El-Sheimy, Naser; Theriault, Jim; Campbell, Scott

    2009-06-01

    The problem of target detection and tracking in the ocean environment has attracted considerable attention due to its importance in military and civilian applications. Sonobuoys are one of the capable passive sonar systems used in underwater target detection. Target detection and bearing estimation are mainly obtained through spectral analysis of received signals. The frequency resolution introduced by current techniques is limited, which affects the accuracy of target detection and bearing estimation at relatively low signal-to-noise ratio (SNR). This research investigates the development of a bearing estimation method using fast orthogonal search (FOS) for enhanced spectral estimation. FOS is employed in this research in order to improve both target detection and bearing estimation in the case of low-SNR inputs. The proposed methods were tested using simulated data developed for two different scenarios under different underwater environmental conditions. The results show that the proposed method is capable of enhancing the accuracy of target detection as well as bearing estimation, especially in cases of very low SNR.
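
    A greedy orthogonal-search sketch of the spectral estimation idea: at each step, pick the candidate frequency whose sine/cosine pair removes the most residual energy, then subtract the fitted component. This is a simplified FOS-style procedure, not Korenberg's exact recursive implementation:

    ```python
    import numpy as np

    def fos_spectrum(x, t, freqs, n_terms=5):
        """Greedy orthogonal-search spectral estimate (FOS-style sketch)."""
        residual = x.astype(float).copy()
        picked = []
        for _ in range(n_terms):
            best = None
            for f in freqs:
                A = np.column_stack([np.cos(2 * np.pi * f * t),
                                     np.sin(2 * np.pi * f * t)])
                coef, _, _, _ = np.linalg.lstsq(A, residual, rcond=None)
                gain = np.sum((A @ coef) ** 2)    # energy explained by f
                if best is None or gain > best[0]:
                    best = (gain, f, A @ coef)
            picked.append((best[1], best[0]))
            residual = residual - best[2]
        return picked   # (frequency, explained energy) pairs

    # The candidate grid is finer than the 1 Hz FFT bin spacing of a 1 s record.
    t = np.arange(0.0, 1.0, 1e-3)
    sig = (np.sin(2 * np.pi * 57.0 * t) + 0.5 * np.sin(2 * np.pi * 123.4 * t)
           + 0.3 * np.random.randn(t.size))
    print(fos_spectrum(sig, t, freqs=np.arange(1.0, 200.0, 0.2), n_terms=2))
    ```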

  11. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics which are based on the principles of object-oriented design. Since such methods depend on the underlying programming language, we describe the method in terms of the C++ language, with the Qt platform also currently being used. One of the most important metrics is so-called software complexity. Applying software complexity calculations using both the McCabe and Halstead methods to the BCI framework, which covers two important types of BCI, namely SSVEP and P300, we found that there are two classes in the framework which are very complex and prone to violating the cohesion principle in OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC less than 20; average complexity around 5; and maximum depth below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
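
    McCabe's cyclomatic complexity is essentially "decision points plus one". A simplified counter is sketched below, in Python for brevity even though the BCI framework itself is C++/Qt; a full implementation would also handle constructs this sketch ignores:

    ```python
    import ast

    DECISION_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                      ast.ExceptHandler, ast.IfExp)

    def cyclomatic_complexity(source):
        """Approximate McCabe complexity: count decision points, plus one."""
        tree = ast.parse(source)
        return 1 + sum(isinstance(n, DECISION_NODES) for n in ast.walk(tree))

    print(cyclomatic_complexity("""
    def classify(x):
        if x > 10 and x < 20:
            return "band"
        for i in range(x):
            if i % 2:
                return "odd"
        return "even"
    """))   # -> 5
    ```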

  12. Chicken Essence for Cognitive Function Improvement: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Siew Li Teoh

    2016-01-01

    Full Text Available Chicken essence (CE) is a popular traditional remedy in Asia, which is believed to improve cognitive functions. CE companies claim that these health benefits have been proven in research studies. A systematic review was conducted to determine the cognitive-enhancing effects of CE. We systematically searched a number of databases for randomized controlled trials in which human subjects consumed CE and cognitive tests were involved. Cochrane's Risk of Bias (ROB) tool was used to assess the quality of trials, and meta-analysis was performed. Seven trials were included; six recruited healthy subjects and one recruited subjects with poorer cognitive function. One trial had unclear ROB while the rest had high ROB. For executive function tests, there was a significant difference favoring CE (pooled standardized mean difference (SMD) of −0.55 (−1.04, −0.06)) and another with no significant difference (pooled SMD of 0.70 (−0.001, 1.40)). For short-term memory tests, no significant difference was found (pooled SMD of 0.63 (−0.16, 1.42)). Currently, there is a lack of convincing evidence to show a cognitive enhancing effect of CE.

  14. IMPROVEMENT OF COMPANY MARKETING STRATEGY BASED ON ANALYSIS OF GOOGLE SEARCH RESULTS

    Directory of Open Access Journals (Sweden)

    Marek Ďurica

    2015-09-01

    Full Text Available Nowadays, the Internet plays a major role in people's lives. It is usually used for entertainment, as a source of information, and for electronic commerce. Electronic commerce (e-commerce) has been gradually replacing traditional shopping, especially in recent years. It is a quick and easy form of marketing which provides convenience for customers, and, therefore, more and more users are shopping on the Internet. E-commerce also provides new opportunities for companies, which force them to engage with the Internet. Many customers who shop on the Internet look for the best product or service close to their home, and most of the space in Google search results is occupied by local results. If a company offers goods or services that do not show up in the local search results, it may be losing substantial profits from these potential customers. That is why companies have to focus on ranking well in local search results. In this article, we try to experimentally determine which factors affect ranking in Google search and to quantify their impact. To select these factors and determine their impact, we use exact methods of mathematical statistics: hypothesis testing, correlation, and regression analysis. Confirming and quantifying the impact of certain qualitative and quantitative characteristics of a company can be used to formulate recommendations for improving corporate strategy in acquiring new customers.
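
    A minimal sketch of the kind of correlation/regression test described; the ranking factor, its values, and the ranks are invented for illustration:

    ```python
    import numpy as np
    from scipy import stats

    # Does the number of Google reviews relate to local search rank?
    reviews = np.array([3, 8, 12, 20, 27, 35, 44, 60])
    rank = np.array([9, 8, 7, 6, 5, 4, 2, 1])   # 1 = top local result

    res = stats.linregress(reviews, rank)
    print(round(res.rvalue, 2), round(res.pvalue, 4))   # r and its p-value
    ```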

  15. Analysis of microbiota on abalone (Haliotis discus hannai) in South Korea for improved product management.

    Science.gov (United States)

    Lee, Min-Jung; Lee, Jin-Jae; Chung, Han Young; Choi, Sang Ho; Kim, Bong-Soo

    2016-10-01

    Abalone is a popular seafood in South Korea; however, because it contains various microorganisms, its ingestion can cause food poisoning. Therefore, analysis of the microbiota on abalone can improve understanding of outbreaks and causes of food poisoning and help to better manage seafood products. In this study, we collected a total of 40 abalones in March and July from four different regions, which are known as the largest abalone-producing areas in Korea. The microbiota were analyzed using high-throughput sequencing, and bacterial loads on abalone were quantified by real-time PCR. Over 2700 species were detected in the samples, and Alpha- and Gammaproteobacteria were the predominant classes. The differences in microbiota among regions and at each sampling time were also investigated. Although Psychrobacter was the dominant genus detected on abalone in both March and July, the species compositions were different between the two sampling times. Five potential pathogens (Lactococcus garvieae, Yersinia kristensenii, Staphylococcus saprophyticus, Staphylococcus warneri, and Staphylococcus epidermidis) were detected among the abalone microbiota. In addition, we analyzed the influence of Vibrio parahaemolyticus infection on shifts in abalone microbiota during storage at different temperatures. Although the proportion of Vibrio increased over time in infected and non-infected abalone, the shifts of microbiota were more dynamic in infected abalone. These results can be used to better understand the potential of food poisoning caused by abalone consumption and to manage abalone products according to the microbiota composition. PMID:27371902

  16. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems.

    Science.gov (United States)

    Wu, Fan; Xu, Lili

    2013-08-01

    Nowadays, patients can obtain many kinds of medical service online via Telecare Medical Information Systems (TMIS), owing to the fast development of computer technology, so the security of communication over the network between the users and the server is very significant. Authentication plays an important part in protecting information from being attacked by malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we have shown that Jiang et al.'s scheme has the weakness of ID uselessness and is vulnerable to off-line password guessing attacks and user impersonation attacks if an attacker compromises the legal user's smart card. Also, it cannot resist denial-of-service (DoS) attacks in two cases: after a successful impersonation attack, and upon wrong password input in the password change phase. We therefore propose an improved mutual authentication scheme for use in a telecare medical information system. Remote monitoring, checking patients' past medical history records and medical consultation can be applied in the system, where information is transmitted via the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS. PMID:23818249
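
    For readers unfamiliar with the building blocks, a toy nonce-based mutual authentication exchange is sketched below. This is a structural illustration only; it is not the scheme proposed in the paper, and it is not production-grade cryptography:

    ```python
    import hashlib, hmac, os

    def h(*parts):
        """Hash helper joining byte strings (illustrative)."""
        return hashlib.sha256(b"|".join(parts)).hexdigest().encode()

    # Shared secret derived from the password and a smart-card value
    # (both values are hypothetical placeholders).
    secret = h(b"password", b"card_secret")

    # User -> server: identity proof bound to a fresh nonce.
    n_u = os.urandom(16)
    proof_u = hmac.new(secret, n_u, "sha256").digest()

    # Server verifies, then proves knowledge of the secret on both nonces.
    n_s = os.urandom(16)
    assert hmac.compare_digest(proof_u, hmac.new(secret, n_u, "sha256").digest())
    proof_s = hmac.new(secret, n_u + n_s, "sha256").digest()

    # User verifies the server's response: mutual authentication achieved.
    assert hmac.compare_digest(proof_s, hmac.new(secret, n_u + n_s, "sha256").digest())
    print("mutual authentication OK")
    ```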

  17. CUSTOMER VALUE NETWORK ANALYSIS FOR IMPROVEMENT OF CUSTOMER LIFE-TIME VALUE COMPUTATION

    Directory of Open Access Journals (Sweden)

    Monireh Hosseini

    2010-06-01

    Full Text Available The constant changes in the world have exposed companies to a situation of tough competition. This situation, especially in e-commerce, complicates the decision-making process about target customers and the recommendation of products to them. On the one hand, understanding and measuring customer lifetime value (CLV) is a critical factor for long-term success. On the other hand, the value network is a new concept that considers both tangible and intangible complex dynamic value exchanges between two or more enterprises, customers, suppliers, etc. In this paper we introduce a new definition of value networks focused on customer relationship management (CRM) concepts, called the business customers' value network. We then suggest the value network analysis (VNA) approach as a powerful tool for modeling and analyzing tangible and intangible relationships between a company and its business customers, and propose VNA to improve the networking potential of CLV. This study provides a conceptual framework for mapping a newly proposed value network consisting of three schemas (star, community and compound) with an illustrated example. The development of a new networked measure of CLV, called network customer lifetime value (NCLV), is our future aim.
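
    As background, the textbook CLV that the proposed NCLV would extend is a retention-weighted, discounted sum of expected margins. A minimal sketch with illustrative parameter values:

    ```python
    def clv(margins, retention=0.85, discount=0.10):
        """Customer lifetime value: expected per-period margins, survival-
        weighted by the retention rate and discounted to present value."""
        return sum(m * retention**t / (1 + discount)**t
                   for t, m in enumerate(margins))

    # Five periods of a 120-unit margin (illustrative numbers).
    print(round(clv([120, 120, 120, 120, 120]), 2))
    ```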

  18. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyoung Tae; Moon, Young Min; Choi, Sung Won; Heo, Sun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1999-04-15

    The loss-of-RHR accident during midloop operation has been shown to be important by probabilistic safety analyses. The condensation models in RELAP5/MOD3 are not adequate for analyzing midloop operation. To audit and improve the models in RELAP5/MOD3.2, several separate effect tests have been performed. Twenty-nine sets of reflux condensation data were obtained, and a correlation was developed from these heat transfer coefficient data. For the experiment on direct contact condensation in the hot leg, the apparatus has been set up and some experimental data have been obtained. A non-iterative model is used to predict the reflux condensation results and evaluates better than the present model in RELAP5/MOD3.2. The results for direct contact condensation in the hot leg are similar to those of the present model. The study of CCF and liquid entrainment in the surge line and pressurizer has been selected as the third separate effect test and is being performed.

  19. Improved metabolites of pharmaceutical ingredient grade Ginkgo biloba and the correlated proteomics analysis.

    Science.gov (United States)

    Zheng, Wen; Li, Ximin; Zhang, Lin; Zhang, Yanzhen; Lu, Xiaoping; Tian, Jingkui

    2015-06-01

    Ginkgo biloba is an attractive and traditional medicinal plant, and has been widely used as a phytomedicine in the prevention and treatment of cardiovascular and cerebrovascular diseases. Flavonoids and terpene lactones are the major bioactive components of Ginkgo, whereas the ginkgolic acids (GAs), with strong allergenic properties, are strictly controlled. In this study, we tested the content of flavonoids and GAs under ultraviolet-B (UV-B) treatment and performed comparative proteomic analyses to determine the differential proteins that occur upon UV-B radiation, which might play a crucial role in producing flavonoids and GAs. Our phytochemical analyses demonstrated that UV-B irradiation significantly increased the content of active flavonoids and decreased the content of toxic GAs. We conducted comparative proteomic analysis of both whole-leaf and chloroplast proteins. In total, 27 differential proteins in the whole leaf and 43 differential proteins in the chloroplast were positively identified and functionally annotated. The proteomic data suggested that enhanced UV-B radiation exposure activated antioxidants and stress-responsive proteins as well as reduced the rate of photosynthesis. We demonstrate that UV-B irradiation pharmaceutically improved the metabolic ingredients of Ginkgo, particularly in terms of reducing GAs. With their high UV absorption properties and antioxidant activities, the flavonoids were likely highly induced as protective molecules following UV-B irradiation.

  20. An Improved Adaptive Multi-way Principal Component Analysis for Monitoring Streptomycin Fermentation Process

    Institute of Scientific and Technical Information of China (English)

    何宁; 王树青; 谢磊

    2004-01-01

    Multi-way principal component analysis (MPCA) has been successfully applied to monitoring batch and semi-batch processes in much of the chemical industry. An improved MPCA approach, step-by-step adaptive MPCA (SAMPCA), which uses the process variable trajectories to monitor the batch process, is presented in this paper. It does not need to estimate or fill in the unknown part of the process variable trajectory deviation from the current time until the end. The approach is based on an MPCA method that processes the data in a sequential and adaptive manner. The adaptation rate is easily controlled through a forgetting factor that sets the weight of past data in a summation. The algorithm is used to evaluate industrial streptomycin fermentation process data and is compared with traditional MPCA. The results show that the method is more advantageous than MPCA, especially when monitoring multi-stage batch processes in which the latent vector structure can change at several points during the batch.
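
    The record above describes monitoring driven by a forgetting factor that discounts past batches. As a rough illustration of that general idea, the following minimal Python sketch (not the authors' SAMPCA algorithm; the function names and the T-squared control logic are our own assumptions) updates mean and covariance estimates recursively and scores each new batch with a Hotelling-type statistic:

        import numpy as np

        def adaptive_pca_monitor(batches, lam=0.95, n_components=2):
            # lam is the forgetting factor: it sets the weight of past data
            # in the running mean and covariance estimates.
            mean, cov = None, None
            for x in batches:
                if mean is None:
                    mean, cov = x.copy(), np.eye(x.size) * 1e-6
                    yield 0.0
                    continue
                mean = lam * mean + (1 - lam) * x
                d = (x - mean).reshape(-1, 1)
                cov = lam * cov + (1 - lam) * (d @ d.T)
                w, v = np.linalg.eigh(cov)            # ascending eigenvalues
                scores = v[:, -n_components:].T @ (x - mean)
                yield float(np.sum(scores ** 2 / w[-n_components:]))  # T^2

        rng = np.random.default_rng(0)
        batches = [rng.normal(size=10) for _ in range(20)]
        for i, t2 in enumerate(adaptive_pca_monitor(batches)):
            print(f"batch {i}: T2 = {t2:.2f}")

    A lam close to 1 gives past batches a long memory; in a real monitoring application the T-squared control limit would come from an F- or chi-squared-based approximation rather than inspection.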

  1. Analysis of Improved Cyclostationary Detector with Multiple Antennas over Fading Channels

    Directory of Open Access Journals (Sweden)

    Ying Zhu

    2013-11-01

    Full Text Available A comprehensive performance analysis of multi-cycle cyclostationary (MC) detection-based spectrum sensing over fading channels with multiple independent and correlated antennas is developed. We first propose an improved MC detector that aims to reduce the computational complexity of the conventional one. Compared with the conventional MC detector, the proposed method has lower computational complexity and higher sensing accuracy. Based on the proposed MC detector, for the case of multiple independent antennas, the average detection probability with square-law combining (SLC) is derived for several fading channels, such as Nakagami, Rayleigh and Rician, using the moment generating function (MGF) approach. For the case of multiple correlated antennas, with Nakagami fading and the SLC scheme, expressions for the detection probability are derived by the same approach as in the independent-antenna case. Special cases of a linear array of 2 and 4 arbitrarily correlated antennas are treated. Finally, illustrative and analytical results show the reliability of our proposed MC detector and the degradation of sensing performance under correlation and fading.
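
    The closed-form detection probabilities in the record rely on MGF manipulations beyond the scope of a short example. As a hedged cross-check of the square-law-combining scenario only, the Python sketch below (all parameter values are hypothetical, not taken from the paper) estimates the detection probability by Monte Carlo, drawing Nakagami-m branch power gains as Gamma variates and thresholding the combined energy:

        import numpy as np

        rng = np.random.default_rng(1)

        def pd_slc_nakagami(snr_db=0.0, m=2.0, L=4, N=50, thresh=260.0,
                            trials=10000):
            # Monte Carlo detection probability: energy detection with
            # square-law combining over L i.i.d. Nakagami-m branches,
            # N complex samples per branch, unit-power complex noise.
            snr = 10.0 ** (snr_db / 10.0)
            hits = 0
            for _ in range(trials):
                g = rng.gamma(m, snr / m, size=L)  # Nakagami-m power gains
                stat = 0.0
                for gl in g:
                    s = np.sqrt(gl / 2) * (rng.normal(size=N)
                                           + 1j * rng.normal(size=N))
                    n = (rng.normal(size=N)
                         + 1j * rng.normal(size=N)) / np.sqrt(2)
                    stat += np.sum(np.abs(s + n) ** 2)  # square-law combine
                hits += stat > thresh
            return hits / trials

        print(pd_slc_nakagami())  # high at 0 dB; falls as snr_db decreases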

  2. Linear analysis of the vertical shear instability: outstanding issues and improved solutions

    Science.gov (United States)

    Umurhan, O. M.; Nelson, R. P.; Gressel, O.

    2016-02-01

    Context. The vertical shear instability is one of several known mechanisms that are potentially active in the so-called dead zones of protoplanetary accretion disks. A recent analysis of the instability mechanism indicates that a subset of unstable modes shows unbounded growth - both as resolution is increased and when the nominal lid of the atmosphere is extended. This trend suggests that the model system may be ill-posed. Aims: This research note both examines the energy content of these modes and questions the legitimacy of assuming separable solutions for a problem whose linear operator is fundamentally inseparable. Methods: The reduced equations governing the instability are revisited and the generated solutions are examined using both the previously assumed separable forms and an improved non-separable solution form that is introduced in this paper. Results: Reconsidering the solutions of the reduced equations using the separable form shows that the low-order body modes have converged eigenvalues and eigenfunctions (for variations both in the model atmosphere's vertical boundaries and in the radial numerical resolution), while the corresponding high-order body modes and the surface modes indeed show unbounded growth rates. The energy contained in both the high-order body modes and the surface modes diminishes precipitously due to the disk's Gaussian density profile; most of the energy of the instability is contained in the low-order modes. An inseparable solution form is introduced to filter out the inconsequential surface modes, leaving only body modes (both low- and high-order ones). The analysis predicts a fastest growing mode with a specific radial length scale. The growth rates associated with the fundamental corrugation and breathing modes match the growth and length scales observed in previous nonlinear studies of the instability. Conclusions: Linear stability analysis of the vertical shear instability should be done

  3. Genre Analysis and Writing Skill: Improving Iranian EFL Learners Writing Performance through the Tenets of Genre Analysis

    Directory of Open Access Journals (Sweden)

    Nazanin Naderi Kalali

    2015-12-01

    Full Text Available The main thrust of this study was to determine whether genre-based instruction improves the writing proficiency of Iranian EFL learners. To this end, 30 homogeneous Iranian BA learners studying English at Islamic Azad University, Bandar Abbas Branch, were selected as the participants of the study through a version of the TOEFL test used as a proficiency test. The selected participants were 15 females and 15 males who were randomly divided into two groups, experimental and control. Both the experimental and control groups were asked to write on a topic determined by the researcher, which was considered the pre-test. The students' writings were scored using a holistic scoring procedure. The subjects then received sixteen hours of instruction - the experimental group using a genre-based pedagogy and the control group through the traditional methodology - which was followed by a post-test in which the subjects were asked to write on the same topic as before the instruction. Their post-writings were also scored through the holistic scoring procedure. In analyzing the data, the t-test statistic was utilized to compare the performances of the two groups. It was found that there is a statistically significant difference between the writing ability of participants who undergo genre-based instruction and those who do not. The study, however, did not find any significant role for gender. Keywords: genre analysis, writing skill, holistic scoring procedure, pre-test, post-test, t-test

  4. Improvement of the quality of work in a biochemistry laboratory via measurement system analysis.

    Science.gov (United States)

    Chen, Ming-Shu; Liao, Chen-Mao; Wu, Ming-Hsun; Lin, Chih-Ming

    2016-10-31

    Adequate and continuous monitoring of operational variations can effectively reduce uncertainty and enhance the quality of laboratory reports. This study applied the evaluation rules of the measurement system analysis (MSA) method to estimate the quality of work conducted in a biochemistry laboratory. Using the gauge repeatability & reproducibility (GR&R) approach, variations in quality control (QC) data among medical technicians conducting measurements of five biochemical items, namely serum glucose (GLU), aspartate aminotransferase (AST), uric acid (UA), sodium (Na) and chloride (Cl), were evaluated. The measurements of the five biochemical items showed different levels of variance among the different technicians, with the variances in GLU measurements being higher than those for the other four items. The precision-to-tolerance (P/T) ratios for Na, Cl and GLU were all above 0.5, implying inadequate gauge capability. The product variation contribution of Na was large (75.45% and 31.24% at the normal and abnormal QC levels, respectively), which showed that the impact of insufficient usage of reagents could not be excluded. With regard to reproducibility, high contributions (of more than 30%) to variation were found for the selected items. These high operator variation levels implied that the possibility of inadequate gauge capability could not be excluded. The ANOVA of GR&R showed that the operator variations in GLU measurements were significant (F=5.296, P=0.001 at the normal level and F=3.399, P=0.015 at the abnormal level, respectively). In addition to operator variations, product variations of Na were also significant at both QC levels. The heterogeneity of variance across the five technicians showed significant differences for the Na and Cl measurements at the normal QC level. The accuracy of QC for the five technicians was identified for further operational improvement. This study revealed that MSA can be used to evaluate product and personnel errors and to
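
    For readers unfamiliar with the GR&R quantities cited above, the following Python sketch computes the classical ANOVA variance components and a precision-to-tolerance ratio for a crossed parts-by-operators design. It uses the common 6-sigma P/T convention and synthetic data; both are our assumptions, not details taken from the study:

        import numpy as np

        def gauge_rr(data, tolerance):
            # data[p, o, r]: measurement of part p by operator o, replicate r
            # (crossed design). Returns (repeatability, reproducibility, P/T).
            p, o, r = data.shape
            grand = data.mean()
            ss_part = o * r * np.sum((data.mean(axis=(1, 2)) - grand) ** 2)
            ss_oper = p * r * np.sum((data.mean(axis=(0, 2)) - grand) ** 2)
            ss_cell = r * np.sum((data.mean(axis=2) - grand) ** 2)
            ss_tot = np.sum((data - grand) ** 2)
            ms_oper = ss_oper / (o - 1)
            ms_int = (ss_cell - ss_part - ss_oper) / ((p - 1) * (o - 1))
            ms_rep = (ss_tot - ss_cell) / (p * o * (r - 1))
            var_rep = ms_rep                                   # repeatability
            var_int = max((ms_int - ms_rep) / r, 0.0)
            var_oper = max((ms_oper - ms_int) / (p * r), 0.0)
            var_repro = var_oper + var_int                     # reproducibility
            pt = 6 * np.sqrt(var_rep + var_repro) / tolerance  # P/T ratio
            return var_rep, var_repro, pt

        rng = np.random.default_rng(4)
        # 10 parts x 5 operators x 3 replicates, with a small operator bias
        data = (rng.normal(100, 2, (10, 1, 1))
                + rng.normal(0, 0.5, (1, 5, 1))
                + rng.normal(0, 0.3, (10, 5, 3)))
        print(gauge_rr(data, tolerance=12.0))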

  5. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyoung Tae; Moon, Young Min; Choi, Sung Won; Hwang, Do Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2000-03-15

    The direct-contact condensation heat transfer coefficients are experimentally obtained under the following conditions: pure steam or steam in the presence of noncondensible gas, a horizontal or slightly inclined pipe, and cocurrent or countercurrent stratified flow with water. An empirical correlation for the liquid Nusselt number is developed for the slightly inclined pipe and cocurrent stratified flow. Several models in the RELAP5/MOD3.2 code - the wall friction coefficient, the interfacial friction coefficient, the correlation for direct-contact condensation with noncondensible gases, and the correlation for wall film condensation - are modified; as a result, RELAP5/MOD3.2 is improved. The present experimental data are used to evaluate the improved code. The standard RELAP5/MOD3.2 code is modified using non-iterative modeling, which is mechanistic and does not require any interfacial information such as the interfacial temperature. The modified RELAP5/MOD3.2 code is used to simulate the horizontally stratified in-tube condensation experiment, which represents the direct-contact condensation phenomena in a hot leg of a nuclear reactor. The modeling capabilities of the modified code as well as the standard code are assessed using several hot-leg condensation experiments. The modified code gives better predictions of the local experimental data on liquid void fraction and interfacial heat transfer coefficient than the standard code. For the separate effect test of the thermal-hydraulic phenomena in the pressurizer, a scaling analysis is performed to obtain similarity of the phenomena between the Korea Standard Nuclear Power Plant (KSNPP) and the present experimental facility. The diameters and lengths of the hot leg, the surge line and the pressurizer are scaled down with the similitude of CCFL and velocity. The ratio of gas flow rate is 1/25. The experimental facility is composed of the air-water supply tank, the horizontal pipe, the surge line and the

  6. Numerical Analysis of the Unsteady Propeller Performance in the Ship Wake Modified By Different Wake Improvement Devices

    Directory of Open Access Journals (Sweden)

    Bugalski Tomasz

    2014-10-01

    Full Text Available The paper presents a summary of the results of the numerical analysis of the unsteady propeller performance in the non-uniform ship wake modified by different wake improvement devices. This analysis is performed using the lifting surface program DUNCAN for unsteady propeller analysis. The object of the analysis is a 7000 ton chemical tanker, for which four different types of wake improvement devices have been designed: two vortex generators, a pre-swirl stator, and a boundary layer alignment device. These produced five different cases of the ship wake structure: the original hull and the hull equipped alternatively with the four wake improvement devices. Two different propellers were analyzed in these five wake fields, one being the original reference propeller P0 and the other a specially designed, optimized propeller P3. The analyzed parameters were the pictures of unsteady cavitation on the propeller blades, the harmonics of pressure pulses generated by the cavitating propellers at selected points, and the fluctuating bearing forces on the propeller shaft. Some of the calculated cavitation phenomena were compared with experimental results. The objective of the calculations was to demonstrate the differences in the calculated unsteady propeller performance resulting from the application of the different wake improvement devices. The analysis and discussion of the results, together with the appropriate conclusions, are included in the paper.

  7. Metal Foam Analysis: Improving Sandwich Structure Technology for Engine Fan and Propeller Blades

    Science.gov (United States)

    Fedor, Jessica L.

    2004-01-01

    The Life Prediction Branch of the NASA Glenn Research Center is searching for ways to construct aircraft and rotorcraft engine fan and propeller blades that are lighter and less costly. One possible design is to create a sandwich structure composed of two metal face sheets and a metal foam core. The face sheets would carry the bending loads and the foam core would have to resist the transverse shear loads. Metal foam is ideal because of its low density and energy absorption capabilities, making the structure lighter, yet still stiff. The material chosen for the face sheets and core was 17-4PH stainless steel, which is easy to make and has appealing mechanical properties. This material can be made inexpensively compared to titanium and polymer matrix composites, the two current fan blade alternatives. Initial tests were performed on design models, including vibration and stress analysis. These tests revealed that the design is competitive with existing designs; however, some problems were apparent that must be addressed before it can be implemented in new technology. The foam did not hold up as well as expected under stress. This could be due to a number of issues, but was most likely a result of a large number of pores within the steel that weakened the structure. The brazing between the face sheets and the foam was also identified as a concern. The braze did not hold up well under shear stress, causing the foam to break away from the face sheets. My role in this project was to analyze different options for improving the design. I primarily spent my time examining various foam samples, created with different sintering conditions, to see which exhibited the most favorable characteristics for our purpose. Methods of analysis that I employed included examining strut integrity under a microscope, counting the number of cells per inch, measuring the density, testing the microhardness, and testing the strength under compression. Shear testing will also be done to examine

  8. Structural Analysis of Char by Raman Spectroscopy: Improving Band Assignments through Computational Calculations from First Principles

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Matthew W.; Dallmeyer, Ian; Johnson, Timothy J.; Brauer, Carolyn S.; McEwen, Jean-Sabin; Espinal, Juan F.; Garcia-Perez, Manuel

    2016-04-01

    Raman spectroscopy is a powerful tool for the characterization of many carbon species. The complex heterogeneous nature of chars and activated carbons has confounded complete analysis due to the additional shoulders observed on the D-band and the high intensity valley between the D and G-bands. In this paper the effects of various vacancy and substitution defects have been systematically analyzed via molecular modeling using density functional theory (DFT), showing how they are manifested in the calculated gas-phase Raman spectra. The accuracy of these calculations was validated by comparison with (solid-phase) experimental spectra, with a small correction factor being applied to improve the accuracy of frequency predictions. The spectroscopic effects on the char species are best understood in terms of a reduced symmetry as compared to a “parent” coronene molecule. Based upon the simulation results, the shoulder observed in chars near 1200 cm-1 has been assigned to the totally symmetric A1g vibrations of various small polyaromatic hydrocarbons (PAH) as well as those containing rings of seven or more carbons. Intensity between 1400 cm-1 and 1450 cm-1 is assigned to A1g type vibrations present in small PAHs and especially those containing cyclopentane rings. Finally, band intensity between 1500 cm-1 and 1550 cm-1 is ascribed to predominantly E2g vibrational modes in strained PAH systems. A total of ten potential bands have been assigned between 1000 cm-1 and 1800 cm-1. These fitting parameters have been used to deconvolute a thermoseries of cellulose chars produced by pyrolysis at 300-700 °C. The results of the deconvolution show consistent growth of PAH clusters with temperature, development of non-benzyl rings as temperature increases, and loss of oxygenated features between 400 °C and 600 °C
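
    The deconvolution step described above amounts to fitting a sum of band profiles to the measured spectrum. A minimal Python sketch of that procedure follows, assuming Gaussian band shapes and synthetic data with centers near some of the assigned bands; the paper's actual fitting parameters and profile shapes are not reproduced here:

        import numpy as np
        from scipy.optimize import curve_fit

        def bands(x, *p):
            # Sum of Gaussian bands; p packs (amplitude, center, width) triplets.
            y = np.zeros_like(x)
            for a, c, w in zip(p[0::3], p[1::3], p[2::3]):
                y += a * np.exp(-0.5 * ((x - c) / w) ** 2)
            return y

        x = np.linspace(1000, 1800, 400)
        rng = np.random.default_rng(2)
        # synthetic spectrum: a D band, a G band and two shoulder bands
        y = bands(x, 1.0, 1200, 40, 1.5, 1350, 50, 0.6, 1430, 30, 2.0, 1590, 35)
        y += rng.normal(0, 0.02, x.size)

        p0 = [1, 1200, 40, 1, 1350, 50, 1, 1430, 30, 1, 1590, 35]
        popt, _ = curve_fit(bands, x, y, p0=p0)
        print(popt.reshape(-1, 3))  # fitted (amplitude, center, width) per band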

  9. Improving food safety with accurate analysis by laboratories participating in proficiency testing programmes

    International Nuclear Information System (INIS)

    Full text: The certification of food products, either for export or for internal consumption, requires an analysis that is as accurate as possible, together with proof that the results have a solid basis. International food trade is a very sensitive area of commerce, and the sanitary barriers, including those for potentially toxic metals, are extremely strict. Countries with mutual recognition agreements (MRA) accept the certification of the exporter. Where an MRA does not exist, the recipient country analyses the goods using its own sampling and analytical procedures. In some cases the results agree, but in others they do not. In the latter situation the products are rejected and not allowed into the buying country, with consequent losses. Chile has a large international market for its seafood products. It has to comply, however, with regulations established by each importing country. One such requirement refers to the maximum admissible level of cadmium in molluscs, set by many countries at 1 mg/kg of Cd. Discrepancies between the Chilean laboratories and laboratories abroad arose in the past, resulting in many rejections of the products. Under these circumstances, the Chilean National Fisheries Service (SERNAPESCA) and the Chilean Nuclear Energy Commission (CCHEN) set up a proficiency test programme mandatory for all laboratories authorized to certify seafood exports. CCHEN has responsibility for all technical aspects of the programme, including the preparation and distribution of the materials and the evaluation of the results submitted by the participants. After the first proficiency test, several laboratories had their authorization rescinded and, as an additional consequence, all the laboratories had to review and re-validate their analytical procedures. So far, three proficiency tests have been carried out and the response of the laboratories has noticeably improved, with a direct consequence in the decrease of rejections of the goods by the importers. This paper presents the details of the

  10. Analysis of first flush to improve the water quality in rainwater tanks.

    Science.gov (United States)

    Kus, B; Kandasamy, J; Vigneswaran, S; Shon, H K

    2010-01-01

    Although most Australians receive their domestic supply from reticulated mains or town water, there are vast areas with very low population densities and few reticulated supplies. In many of these areas rainwater collected in tanks is the primary source of drinking water. Heavy metals have recently become a concern as their concentration in rainwater tanks was found to exceed recommended levels suitable for human consumption. Rainwater storage tanks also accumulate contaminants and sediments that settle to the bottom. Although not widely acknowledged, small amounts of contaminants such as lead found in rainwater (used as drinking water) may have a cumulative and poisonous effect on human health over a lifetime. This is true for certain factors that underlie many of the chronic illnesses that are becoming increasingly common in contemporary society. The paper reports on a study which is part of a project that aims to develop a cost-effective in-line filtration system to improve water quality in rainwater tanks. To enable this, the characteristics of rainwater need to be known. One component of this characterization is to observe the effects of the first flush on a rainwater tank. Samples of the roof runoff collected from an urban residential roof located in the Sydney Metropolitan Area in the initial first few millimetres of rain were analysed. The results show that bypassing the first 2 mm of rainfall gives water with most water quality parameters compliant with the Australian Drinking Water Guidelines (ADWG) standards. The parameters that did not comply were lead and turbidity, which required bypassing approximately the first 5 mm of rainfall to meet ADWG standards. Molecular weight distribution (MWD) analysis showed that the concentration of rainwater organic matter (RWOM) decreased with increasing amount of roof runoff.

  11. An improved classification tree analysis of high cost modules based upon an axiomatic definition of complexity

    Science.gov (United States)

    Tian, Jianhui; Porter, Adam; Zelkowitz, Marvin V.

    1992-01-01

    Identification of high cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their share of problems. A decision tree model has previously been used to identify such modules. In the current paper, a previously developed axiomatic model of program complexity is merged with that decision tree process to improve the ability to identify such modules. The improvement was tested using data from the NASA Software Engineering Laboratory.
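
    As a generic illustration of classifying high cost modules from software metrics (a hedged Python sketch with synthetic data and an invented labeling rule, not the NASA SEL data or the paper's axiomatic complexity model), a decision tree can be fit as follows:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(3)
        # synthetic module metrics: [complexity score, size, change count]
        X = rng.normal(size=(200, 3))
        # assumed ground truth: complexity and churn drive cost (illustrative)
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 200) > 1).astype(int)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(export_text(tree, feature_names=["complexity", "size", "changes"]))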

  12. Exergy Analysis of a Subcritical Refrigeration Cycle with an Improved Impulse Turbo Expander

    OpenAIRE

    Zhenying Zhang; Lili Tian

    2014-01-01

    The impulse turbo expander (ITE) is employed to replace the throttling valve in the vapor compression refrigeration cycle to improve the system performance. An improved ITE and the corresponding cycle are presented. In the new cycle, the ITE not only acts as an expansion device with work extraction, but also serves as an economizer with vapor injection. An increase of 20% in the isentropic efficiency can be attained for the improved ITE compared with the conventional ITE owing to the reductio...

  13. Systems Thinking Tools for Improving Evidence-Based Practice: A Cross-Case Analysis of Two High School Leadership Teams

    Science.gov (United States)

    Kensler, Lisa A. W.; Reames, Ellen; Murray, John; Patrick, Lynne

    2012-01-01

    Teachers and administrators have access to large volumes of data but research suggests that they lack the skills to use data effectively for continuous school improvement. This study involved a cross-case analysis of two high school leadership teams' early stages of evidence-based practice development; differing forms of external support were…

  14. Synthetic analysis of Tc and Hc2 of NbTi/Ti multilayers based on improved proximity effect theory

    Science.gov (United States)

    Obi, Y.; Ikebe, M.; Takanaka, K.; Fujimori, H.

    1994-12-01

    Tc and Hc2 of NbTi/Ti multilayers have been calculated based on an improved approximation for the proximity effect. The analysis reveals that the agreement between calculation and experiment is satisfactory except for some inevitable scatter in sample quality.

  15. Information Retrieval in Domain-Specific Databases: An Analysis To Improve the User Interface of the Alcohol Studies Database.

    Science.gov (United States)

    Jantz, Ronald

    2003-01-01

    Describes the methodology and results of the log analysis for the Alcohol Studies Database (ASDB), a domain-specific database supported by the Center of Alcohol Studies at Rutgers University Libraries. The objectives were to better understand user search behavior, to analyze failure rates, and to develop approaches for improving the user…

  16. Improving Data Analysis in Second Language Acquisition by Utilizing Modern Developments in Applied Statistics

    Science.gov (United States)

    Larson-Hall, Jenifer; Herrington, Richard

    2010-01-01

    In this article we introduce language acquisition researchers to two broad areas of applied statistics that can improve the way data are analyzed. First we argue that visual summaries of information are as vital as numerical ones, and suggest ways to improve them. Specifically, we recommend choosing boxplots over barplots and adding locally…

  17. Quality Improvement and Infrastructure Activity Costs in Software Development: A Longitudinal Analysis

    OpenAIRE

    Donald E. Harter; Slaughter, Sandra A.

    2003-01-01

    This study draws upon theories of task interdependence and organizational inertia to analyze the effect of quality improvement on infrastructure activity costs in software development. Although increasing evidence indicates that quality improvement reduces software development costs, the impact on infrastructure activities is not known. Infrastructure activities include services like computer operations, data integration, and configuration management that support software development. Because...

  18. Improving Alpine Flood Prediction through Hydrological Process Characterization and Uncertainty Analysis

    OpenAIRE

    Tobin, Cara Christine

    2012-01-01

    Among the many challenges of Alpine flood prediction is describing complex, meteo-hydrological processes in a simplified, robust manner that can be easily integrated into operational forecasting. In this dissertation, improved methods to characterize these processes are developed and integrated into the hydrological modeling component of an operational flood forecasting system used in the Swiss Alps. Detailed studies are conducted to improve hydrologi...

  19. School Board Improvement Plans in Relation to the AIP Model of Educational Accountability: A Content Analysis

    Science.gov (United States)

    van Barneveld, Christina; Stienstra, Wendy; Stewart, Sandra

    2006-01-01

    For this study we analyzed the content of school board improvement plans in relation to the Achievement-Indicators-Policy (AIP) model of educational accountability (Nagy, Demeris, & van Barneveld, 2000). We identified areas of congruence and incongruence between the plans and the model. Results suggested that the content of the improvement plans,…

  20. Tailoring quality improvement interventions to identified barriers: a multiple case analysis.

    NARCIS (Netherlands)

    Bosch, M.; Weijden, T. van der; Wensing, M.J.P.; Grol, R.P.T.M.

    2007-01-01

    RATIONALE, AIMS AND OBJECTIVES: The prevailing view on implementation interventions to improve the organization and management of health care is that the interventions should be tailored to potential barriers. Ideally, possible barriers are analysed before the quality improvement interventions are d

  1. Improving performance of high-risk organizations in the Spanish nuclear sector from the analysis of organizational culture factors

    International Nuclear Information System (INIS)

    This paper presents the research project funded by UNESA and conducted by CISOT-CIEMAT that aims to contribute to improving the operating performance of the Spanish nuclear power plants. The project aims to identify the factors and key organizational processes for improving efficiency, in order to advance knowledge about the influence of organizational culture on the safety of high reliability organizations.

  2. An improved model for predicting coolant activity behaviour for fuel-failure monitoring analysis

    Energy Technology Data Exchange (ETDEWEB)

    El-Jaby, A.; Lewis, B.J.; Thompson, W.T. [Department of Chemistry and Chemical Engineering, Royal Military College of Canada, Kingston, Ontario, K7K 7B4 (Canada); Iglesias, F.C. [Candesco Corporation, 230 Richmond Street West, 10th Floor, Toronto, Ontario, M5V 1V6 (Canada); Ip, Monique [Bruce Power, 123 Front Street West, 4th Floor Toronto, Ontario, M5J 2M2 (Canada)

    2009-06-15

    A Candu fuel element becomes defective when the Zircaloy-4 sheath is breached, allowing high pressure D2O coolant to enter the fuel-to-sheath gap, thereby creating a direct path for fission products (mainly volatile species of iodine and noble gases) and fuel debris to escape into the primary heat transport system (PHTS). In addition, the entry of high-pressure D2O coolant into the fuel-to-sheath gap may cause the UO2 fuel to oxidize, which in turn can augment the rate of fission product release into the PHTS. The release of fission products and fuel debris into the PHTS will elevate circuit contamination levels, consequently increasing radiation exposure to station personnel during maintenance tasks. Moreover, the continued operation of a defective fuel element may result in a diminished thermal performance if the thermal conductivity and the incipient melting temperature of the UO2 fuel are reduced due to fuel oxidation effects. It is therefore desirable to discharge defective fuel as soon as possible. Hence, a better understanding of defective fuel behaviour is required in order to develop an improved methodology for fuel-failure monitoring and PHTS coolant activity prediction. Several codes have been previously developed for fuel-failure monitoring in Candu, LWR (PWR and BWR), and WWER reactors. Most tools use a steady-state coolant activity analysis, where a Booth diffusion-type model is used to describe the fission product release from the UO2 fuel matrix into the fuel-to-sheath gap, and a first order kinetic model to consider the transport, hold-up, and release of volatile fission products from the fuel-to-sheath gap into the PHTS coolant. It is therefore necessary to use an empirical diffusion coefficient D' to account for the fission product diffusion in the UO2 fuel matrix and an escape rate coefficient ν for the release from the fuel-to-sheath gap into the PHTS coolant. However, these parameters are not

  3. Improvement of the PSA model using a best-estimate thermal-hydraulic analysis of LOCA scenarios

    International Nuclear Information System (INIS)

    This study was performed to propose both a new success criterion and a new heading of the event tree, using best-estimate analyses of each LOCA scenario, with the aim of improving the PSA models. The MARS code was used for the thermal-hydraulic analysis of LOCA, and Ulchin units 3 and 4 were selected as the reference plant in this study. The study improved the PSA model of three LOCA scenarios by means of best-estimate thermal-hydraulic analysis. LOCA calculations with various configurations of the safety systems and break sizes were performed. Using the results, we proposed both a new success criterion and a new heading for the small- and middle-break LOCA scenarios. The small-break LOCA will be analyzed later in terms of operator actions to depressurize the RCS. The results of this analysis may contribute to improving the PSA model of LOCA. In the probabilistic safety analysis (PSA) of the Korean Standard Nuclear Power Plant (KSNP), loss-of-coolant accidents (LOCA) are classified into three scenarios by break size: large-, middle-, and small-break LOCA. Specific break sizes were adopted to identify the boundaries of the three groups in the previous PSA model, and the success criteria have been conservatively applied to each state of the safety system in the event tree

  4. Turbulence Analysis Upstream of a Wind Turbine: a LES Approach to Improve Wind LIDAR Technology

    Science.gov (United States)

    Calaf, M.

    2015-12-01

    upstream, much can be learned about the incoming turbulence, hence allowing improved wind turbine readjustments. Time correlations with the upstream incoming turbulence have been computed through an entire diurnal cycle, and a non-dimensional analysis shows the existence of different behaviors throughout the day.

  5. Feed-forward active contour analysis for improved brachial artery reactivity testing.

    Science.gov (United States)

    Pugliese, Daniel N; Sehgal, Chandra M; Sultan, Laith R; Reamer, Courtney B; Mohler, Emile R

    2016-08-01

    The object of this study was to utilize a novel feed-forward active contour (FFAC) algorithm to find a reproducible technique for analysis of brachial artery reactivity. Flow-mediated dilation (FMD) is an important marker of vascular endothelial function but has not been adopted for widespread clinical use given its technical limitations, including inter-observer variability and differences in technique across clinical sites. We developed a novel FFAC algorithm with the goal of validating a more reliable standard. Forty-six healthy volunteers underwent FMD measurement according to the standard technique. Ultrasound videos lasting 5-10 seconds each were obtained pre-cuff inflation and at minutes 1 through 5 post-cuff deflation in longitudinal and transverse views. Automated segmentation using the FFAC algorithm with initial boundary definition from three different observers was used to analyze the images to measure diameter/cross-sectional area over the cardiac cycle. The %FMD was calculated for average, minimum, and maximum diameters/areas. Using the FFAC algorithm, the population-specific coefficient of variation (CV) at end-diastole was 3.24% for transverse compared to 9.96% for longitudinal measurements; the subject-specific CV was 15.03% compared to 57.41%, respectively. For longitudinal measurements made via the conventional method, the population-specific CV was 4.77% and subject-specific CV was 117.79%. The intraclass correlation coefficient (ICC) for transverse measurements was 0.97 (95% CI: 0.95-0.98) compared to 0.90 (95% CI: 0.84-0.94) for longitudinal measurements with FFAC and 0.72 (95% CI: 0.51-0.84) for conventional measurements. In conclusion, transverse views using the novel FFAC method provide less inter-observer variability than traditional longitudinal views. Improved reproducibility may allow adoption of FMD testing in a clinical setting. The FFAC algorithm is a robust technique that should be evaluated further for its ability to replace the
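
    The reproducibility statistics quoted above can be computed from a subjects-by-observers table of FMD values. A minimal Python sketch follows, assuming a two-way consistency ICC and synthetic measurements (not the study's data or its exact ICC model):

        import numpy as np

        def within_subject_cv(m):
            # m[s, o]: %FMD for subject s as traced by observer o.
            return float(np.mean(m.std(axis=1, ddof=1) / m.mean(axis=1)) * 100)

        def icc_consistency(m):
            # Two-way ICC(3,1)-style consistency across observers.
            s, o = m.shape
            grand = m.mean()
            ss_rows = o * np.sum((m.mean(axis=1) - grand) ** 2)
            ss_cols = s * np.sum((m.mean(axis=0) - grand) ** 2)
            ss_err = np.sum((m - grand) ** 2) - ss_rows - ss_cols
            ms_rows = ss_rows / (s - 1)
            ms_err = ss_err / ((s - 1) * (o - 1))
            return (ms_rows - ms_err) / (ms_rows + (o - 1) * ms_err)

        rng = np.random.default_rng(5)
        true_fmd = rng.normal(7.0, 2.0, size=(46, 1))       # 46 subjects
        obs = true_fmd + rng.normal(0, 0.4, size=(46, 3))   # 3 observers
        print(within_subject_cv(obs), icc_consistency(obs))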

  7. Cost savings associated with improving appropriate and reducing inappropriate preventive care: cost-consequences analysis

    Directory of Open Access Journals (Sweden)

    Baskerville Neill

    2005-03-01

    Full Text Available Abstract Background Outreach facilitation has been proven successful in improving the adoption of clinical preventive care guidelines in primary care practice. The net costs and savings of delivering such an intensive intervention need to be understood. We wanted to estimate the proportion of a facilitation intervention cost that is offset and the potential for savings by reducing inappropriate screening tests and increasing appropriate screening tests in 22 intervention primary care practices affecting a population of 90,283 patients. Methods A cost-consequences analysis of one successful outreach facilitation intervention was done, taking into account the estimated cost savings to the health system of reducing five inappropriate tests and increasing seven appropriate tests. Multiple data sources were used to calculate costs and cost savings to the government. The cost of the intervention and costs of performing appropriate testing were calculated. Costs averted were calculated by multiplying the number of tests not performed as a result of the intervention. Further downstream cost savings were determined by calculating the direct costs associated with the number of false positive test follow-ups avoided. Treatment costs averted as a result of increasing appropriate testing were similarly calculated. Results The total cost of the intervention over 12 months was $238,388 and the cost of increasing the delivery of appropriate care was $192,912 for a total cost of $431,300. The savings from reduction in inappropriate testing were $148,568 and from avoiding treatment costs as a result of appropriate testing were $455,464 for a total savings of $604,032. On a yearly basis the net cost saving to the government is $191,733 per year (2003 $Can equating to $3,687 per physician or $63,911 per facilitator, an estimated return on intervention investment and delivery of appropriate preventive care of 40%. Conclusion Outreach facilitation is more expensive

  8. [Improvement of transrectal ultrasound. Artificial neural network analysis (ANNA) in detection and staging of prostatic carcinoma].

    Science.gov (United States)

    Loch, T; Leuschner, I; Genberg, C; Weichert-Jacobsen, K; Küppers, F; Retz, M; Lehmann, J; Yfantis, E; Evans, M; Tsarev, V; Stöckle, M

    2000-07-01

    As a result of the enhanced clinical application of prostate specific antigen (PSA), an increasing number of men are becoming candidates for prostate cancer work-up. A high PSA value over 20 ng/ml is a good indicator of the presence of prostate cancer, but within the range of 4-10 ng/ml it is rather unreliable. Even more alarming is the fact that prostate cancer has been found in 12-37% of patients with a "normal" PSA value of under 4 ng/ml (Hybritech). While PSA is capable of indicating a statistical risk of prostate cancer in a defined patient population, it is not able to localize cancer within the prostate gland or guide a biopsy needle to a suspicious area. This necessitates an additional effective diagnostic technique that is able to localize or rule out a malignant growth within the prostate. The methods available for the detection of these prostate cancers are digital rectal examination (DRE) and transrectal ultrasound (TRUS). DRE is not suitable for early detection, as about 70% of the palpable malignancies have already spread beyond the prostate. The classic problem of visual interpretation of TRUS images is that hypoechoic areas suspicious for cancer may be either normal or cancerous histologically. Moreover, about 25% of all cancers have been found to be isoechoic and therefore not distinguishable from normal-appearing areas. None of the current biopsy or imaging techniques is able to cope with this dilemma. Artificial neural networks (ANN) are complex nonlinear computational models, designed much like the neuronal organization of a brain. These networks are able to model complicated biologic relationships without making assumptions based on conventional statistical distributions. Applications in medicine and urology have been promising. One example of such an application will be discussed in detail: a new method of Artificial Neural Network Analysis (ANNA) was employed in an attempt to obtain existing subvisual information, other than the gray scale

  9. Geoelectrical time-lapse analysis for improved interpretation of data in a contaminated area

    Science.gov (United States)

    Chitea, Florina; Serban, Adrian; Ioane, Dumitru; Georgescu, Paul

    2014-05-01

    Non-invasive geoelectrical studies are useful in the preliminary assessment of areas suspected to be contaminated, and also in the investigation stage. Correctly adapted to the site-specific situation, they are used to detect and investigate buried sources of pollution, to characterize the geology of the area, to detect the contaminant plume, and to study the attenuation of pollution if a site-specific remediation technique is applied. Despite improved acquisition techniques and optimized inversion algorithms, interpretation of geoelectrical data is still a challenging task, especially in a contaminated hydrogeological context. Besides the soil's physical properties (composition, porosity, texture, etc.), the moisture content and the chemical composition of the pollutant also influence the measured parameter. The apparent electrical resistivity method was used in an area located near an oil refinery. Electrical measurements performed on profiles (transverse to and along the direction of water flow, according to hydrological data) revealed the presence of contaminants by means of high resistivity anomalies. Using the same acquisition technique (Schlumberger array, same VES points, same injection (AB) and voltage (MN) line extensions), measurements were repeated over time along the same profiles. On the resulting electrical sections from 2006 to 2013, a dynamic situation regarding the pollution plume was observed. Time-lapse analysis, based on the calculation of resistivity differences between sets of data acquired along the same profile, was applied, and data interpretation was made using the resulting sections. Significant variations between data sets (normalized apparent resistivity differences > 17%) observed along the main profile mainly ranged from the near surface (1.5 m) to an approximate depth (AB/2) of 10 m. Using the time-lapse method, changes in the lateral and in-depth extension of the polluted areas could be observed and
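
    One simple way to form the normalized differences used in such time-lapse comparisons is sketched below in Python; the abstract does not specify the authors' exact normalization, so this definition and the values are illustrative assumptions only:

        import numpy as np

        def normalized_difference(rho_t0, rho_t1):
            # Percent change in apparent resistivity between two surveys
            # acquired with identical electrode geometry and VES points.
            rho_t0 = np.asarray(rho_t0, float)
            rho_t1 = np.asarray(rho_t1, float)
            return 100.0 * (rho_t1 - rho_t0) / rho_t0

        rho_2006 = [120.0, 95.0, 80.0, 60.0]  # hypothetical ohm-m along a profile
        rho_2013 = [150.0, 96.0, 70.0, 61.0]
        print(normalized_difference(rho_2006, rho_2013))  # > 17 % flags change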

  10. Analysis of advanced sodium-cooled fast reactor core designs with improved safety characteristics

    International Nuclear Information System (INIS)

    improvements address both neutronics and thermal-hydraulics aspects. Furthermore, emphasis has been placed on not only the beginning-of-life (BOL) state of the core, but also on the beginning of closed equilibrium fuel cycle (BEC) state. An important context for the current thesis is the 7th European Framework Program's Collaborative Project for a European Sodium Fast Reactor (CP-ESFR), the reference 3600 MWth ESFR core being the starting point for the conducted research. The principally employed computational tools belong to the so-called FAST code system, viz. the fast-reactor neutronics code ERANOS, the fuel cycle simulating procedure EQL3D, the spatial kinetics code PARCS and the system thermal-hydraulics code TRACE. The research has been carried out in essentially three successive phases. The first phase has involved achieving a clearer understanding of the principal phenomena contributing to the SFR void effect. Decomposition and analysis of sodium void reactivity have been carried out, while considering different fuel cycle states for the core. Furthermore, the spatial distribution of void reactivity importance, in both axial and radial directions, is investigated. For the reactivity decomposition, two methods, based respectively on neutron balance considerations and on perturbation theory, have been applied. The sodium void reactivity of the reference ESFR core has been, accordingly, decomposed reaction-wise, cross-section-wise, isotope-wise and energy-group-wise. Effectively, the neutron balance based method allows an in-depth understanding of the ‘consequences’ of sodium voidage, while the perturbation theory based method provides a complementary understanding of the ‘causes’. The second phase of the research has addressed optimization of the reference ESFR core design from the neutronics viewpoint. Four options oriented towards either the leakage component or the spectral effect have been considered in detail, viz. introducing an upper sodium plenum and

  11. Quantitative Analysis of Impact of Education on Improving Farmers' Net Income and Yield Per Capita

    Institute of Scientific and Technical Information of China (English)

    DING Jing-zhi

    2002-01-01

    In this paper, we analyze the relation between farmers' schooling and their net income and yield per capita using systematic and scientific methods, concluding that improving farmers' educational level may increase their net income.

  12. Impact of Improved Maize Adoption on Welfare of Farm Households in Malawi: A Panel Data Analysis

    OpenAIRE

    Bezu, Sosina; Kassie, Girma; Shiferaw, Bekele; Ricker-Gilbert, Jacob

    2013-01-01

    This paper assesses improved maize adoption in Malawi and examines the link between adoption and household welfare using three-year household panel data. The distributional effect of maize technology adoption is also investigated by looking at impacts across wealth and gender groups. We applied a control function approach and IV regression to control for the endogeneity of input subsidies and improved maize adoption. We found that modern maize variety adoption is positively correlated with the hous...

  13. Analysis of Participatory Research Projects in the International Maize and Wheat Improvement Center

    OpenAIRE

    Lilja, Nina K.; Bellon, Mauricio R.

    2006-01-01

    Through a survey of scientists from the International Maize and Wheat Improvement Center (CIMMYT) in 2004, this study assessed the extent to which participatory methods had been used by the center, how they were perceived by the scientists, and how participatory research could be applied more effectively by CIMMYT and partners. Results for 19 CIMMYT projects suggest among other things that participatory approaches at the center were largely “functional”—that is, aimed at improving the efficie...

  14. Using a Simple Binomial Model to Assess Improvement in Predictive Capability: Sequential Bayesian Inference, Hypothesis Testing, and Power Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory

    2012-09-11

    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2 and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
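
    The record's methodology maps directly onto a conjugate Beta-binomial analysis. A minimal Python sketch follows, assuming a uniform Beta(1,1) prior and a 95% decision level (both our choices, not necessarily the authors'):

        import numpy as np
        from scipy.stats import beta, binom

        def p_improved(k, n, a=1.0, b=1.0):
            # Posterior P(theta > 1/2) after k "new code better" outcomes
            # in n paired comparisons, under a Beta(a, b) prior.
            return beta.sf(0.5, a + k, b + n - k)

        def predictive_power(theta_true, n, level=0.95):
            # Pre-data probability of reaching the stated confidence level
            # that an improvement occurred, given the true theta.
            ks = np.arange(n + 1)
            decide = np.array([p_improved(k, n) >= level for k in ks])
            return float(binom.pmf(ks, n, theta_true)[decide].sum())

        for n in (10, 30, 100, 300):
            print(n, round(predictive_power(0.6, n), 3))

    As the record notes, when the true θ is near 1/2 this pre-data probability of declaring an improvement stays small until the sample size becomes very large.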

  15. A meta-analysis of clinical improvements of general well-being by a standardized Lycium barbarum.

    Science.gov (United States)

    Paul Hsu, Chiu-Hsieh; Nance, Dwight M; Amagase, Harunobu

    2012-11-01

    Four randomized, blind, placebo-controlled clinical trials were pooled to study the general effects of oral consumption of Lycium barbarum at 120 mL/day, as a standardized juice, GoChi® (FreeLife International, Phoenix, AZ, USA). A questionnaire consisting of symptoms graded 0-5 was given to the participants. For each question, the score changes between pre- and post-intervention were summarized by the standardized mean difference and associated SE to perform the meta-analysis. The change was also characterized as a binary outcome, improved or not, to derive the odds ratio (OR) and associated SE using the Mantel-Haenszel method. The meta-analysis and heterogeneity were evaluated with the R program using the rmeta package. Statistical significance was set at 5%. In total, 161 participants (18-72 years old) were included in the meta-analysis. Compared with the placebo group (n=80), the active group (n=81) showed significant improvements in weakness, stress, mental acuity, ease of awakening, shortness of breath, focus on activity, sleep quality, daydreaming, and overall feelings of health and well-being under a random effects model. A fixed effects model showed additional improvements in fatigue, depression, circulation, and calmness. The OR indicated a significantly higher chance of improvement in fatigue, dizziness, and sleep quality. Three studies had statistically significant heterogeneity in procrastination, shoulder stiffness, energy, and calmness. The present meta-analysis confirmed the various health effects of polysaccharide-standardized L. barbarum intake found in the previous randomized, double-blind, placebo-controlled human clinical trials and revealed that it resulted in statistically significant improvements in neurological/psychological performance and overall feelings of health and well-being compared with the placebo group under both the fixed and the random effects models of the R program.
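
    The pooling described above (standardized mean differences combined under fixed and random effects models) can be reproduced generically. The following Python sketch implements inverse-variance and DerSimonian-Laird pooling with made-up effect sizes; it is not the trial data or the rmeta implementation used in the study:

        import numpy as np

        def pool_smd(smd, se):
            # Inverse-variance fixed-effect and DerSimonian-Laird
            # random-effects pooling of standardized mean differences.
            smd = np.asarray(smd, float)
            v = np.asarray(se, float) ** 2
            w = 1.0 / v
            fixed = np.sum(w * smd) / np.sum(w)
            q = np.sum(w * (smd - fixed) ** 2)         # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max((q - (len(smd) - 1)) / c, 0.0)  # between-study variance
            wr = 1.0 / (v + tau2)
            return fixed, np.sum(wr * smd) / np.sum(wr), tau2

        # hypothetical per-trial effects for one questionnaire item
        print(pool_smd([0.42, 0.30, 0.55, 0.18], [0.15, 0.20, 0.18, 0.22]))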

  16. An improved model for predicting coolant activity behaviour for fuel-failure monitoring analysis

    International Nuclear Information System (INIS)

    A Candu fuel element becomes defective when the Zircaloy-4 sheath is breached, allowing high pressure D2O coolant to enter the fuel-to-sheath gap, thereby creating a direct path for fission products (mainly volatile species of iodine and noble gases) and fuel debris to escape into the primary heat transport system (PHTS). In addition, the entry of high-pressure D2O coolant into the fuel-to-sheath gap may cause the UO2 fuel to oxidize, which in turn can augment the rate of fission product release into the PHTS. The release of fission products and fuel debris into the PHTS will elevate circuit contamination levels, consequently increasing radiation exposure to station personnel during maintenance tasks. Moreover, the continued operation of a defective fuel element may result in a diminished thermal performance if the thermal conductivity and the incipient melting temperature of the UO2 fuel are reduced due to fuel oxidation effects. It is therefore desirable to discharge defective fuel as soon as possible. Hence, a better understanding of defective fuel behaviour is required in order to develop an improved methodology for fuel-failure monitoring and PHTS coolant activity prediction. Several codes have been previously developed for fuel-failure monitoring in Candu, LWR (PWR and BWR), and WWER reactors. Most tools use a steady-state coolant activity analysis, where a Booth diffusion-type model is used to describe the fission product release from the UO2 fuel matrix into the fuel-to-sheath gap, and a first order kinetic model to consider the transport, hold-up, and release of volatile fission products from the fuel-to-sheath gap into the PHTS coolant. It is therefore necessary to use an empirical diffusion coefficient D' to account for the fission product diffusion in the UO2 fuel matrix and an escape rate coefficient ν for the release from the fuel-to-sheath gap into the PHTS coolant. However, these parameters are not constant in time as they are influenced by the

  17. Theoretical analysis and an improvement method of the bias effect on the linearity of RF linear power amplifiers

    Institute of Scientific and Technical Information of China (English)

    Wu Tuo; Chen Hongyi; Qian Dahong

    2009-01-01

    Based on the Gummel-Poon model of the BJT, the change of the DC bias as a function of the AC input signal in RF linear power amplifiers is theoretically derived, so that the linearity of different DC bias circuits can be interpreted and compared. Based on the analysis results, a quantitative adaptive DC bias circuit is proposed, which can improve both linearity and efficiency. From the simulation and test results, we draw conclusions on how to improve the design of linear power amplifiers.

  18. Improvement in the Plutonium Parameter Files of the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    The isotopic analysis code Fixed-energy Response-function Analysis with Multiple efficiency (FRAM) employs user-editable parameter sets to analyze a broad range of sample types. This report presents new parameter files, based upon a new set of plutonium branching ratios, which give more accurate isotope results than the current parameter files that use FRAM.

  19. Improvement in the Plutonium Parameter Files of the FRAM Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    D. T. Vo; T. E. Sampson

    2000-09-01

    The isotopic analysis code Fixed-energy Response-function Analysis with Multiple efficiency (FRAM) employs user-editable parameter sets to analyze a broad range of sample types. This report presents new parameter files, based upon a new set of plutonium branching ratios, which give more accurate isotope results than the current parameter files that use FRAM.

  20. The job analysis of Korean nurses as a strategy to improve the Korean Nursing Licensing Examination

    Directory of Open Access Journals (Sweden)

    In Sook Park

    2016-06-01

    Full Text Available Purpose: This study aimed at characterizing Korean nurses' occupational responsibilities in order to apply the results to the improvement of the Korean Nursing Licensing Examination. Methods: First, the contents of the nursing job were defined based on a focus group interview with 15 nurses. The Developing a Curriculum (DACUM) method was then used by 13 experts to examine those results and produce the questionnaire. After that, a questionnaire survey of 5,065 hospital nurses was conducted. Results: The occupational responsibilities of nurses were characterized as involving 8 duties, 49 tasks, and 303 task elements. Those 8 duties are nursing management and professional development, safety and infection control, the management of potential risk factors, basic nursing and caring, the maintenance of physiological integrity, medication and parenteral treatments, socio-psychological integrity, and the maintenance and improvement of health. Conclusion: The content of the Korean Nursing Licensing Examination should be improved based on the 8 duties and 49 tasks of the occupational responsibilities of Korean nurses.

  1. Improved Identification and Analysis of Small Open Reading Frame Encoded Polypeptides.

    Science.gov (United States)

    Ma, Jiao; Diedrich, Jolene K; Jungreis, Irwin; Donaldson, Cynthia; Vaughan, Joan; Kellis, Manolis; Yates, John R; Saghatelian, Alan

    2016-04-01

    Computational, genomic, and proteomic approaches have been used to discover nonannotated protein-coding small open reading frames (smORFs). Some novel smORFs have crucial biological roles in cells and organisms, which motivates the search for additional smORFs. Proteomic smORF discovery methods are advantageous because they detect smORF-encoded polypeptides (SEPs) to validate smORF translation and SEP stability. Because SEPs are shorter and less abundant than average proteins, SEP detection using proteomics faces unique challenges. Here, we optimize several steps in the SEP discovery workflow to improve SEP isolation and identification. These changes have led to the detection of several new human SEPs (novel human genes), improved confidence in the SEP assignments, and enabled quantification of SEPs under different cellular conditions. These improvements will allow faster detection and characterization of new SEPs and smORFs. PMID:27010111

  2. Investigating data envelopment analysis model with potential improvement for integer output values

    Science.gov (United States)

    Hussain, Mushtaq Taleb; Ramli, Razamin; Khalid, Ruzelan

    2015-12-01

    In a DEA model, a decrement of input proportions corresponds to input reduction. This reduction is generally good for the economy, since it can cut unnecessary resource costs. In some situations, however, the reduction of relevant inputs such as labour could create social problems; such inputs should instead be maintained or increased. This paper develops an advanced radial DEA model, formulated as a mixed integer linear program, to improve integer output values through the combination of inputs. The model can deal with real-valued inputs and integer-valued outputs, and is valuable for situations in which inputs must be combined to improve integer output values, as faced by most organizations.
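
    As a minimal sketch of the kind of formulation involved (not the authors' exact model), the mixed integer program below finds the largest integer output level a decision-making unit (DMU) could reach with a nonnegative, constant-returns combination of observed units whose inputs stay within its own. The data are invented, and scipy's HiGHS-based linprog with the integrality option stands in for a dedicated MILP solver.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 DMUs, 2 inputs, 1 integer-valued output.
X = np.array([[4., 3.], [7., 3.], [8., 1.], [4., 2.], [2., 4.]])  # inputs
y = np.array([5., 7., 8., 4., 3.])                                # outputs

def integer_output_target(k):
    """Largest integer output DMU k could reach with a nonnegative
    combination of observed DMUs, keeping inputs at or below its own."""
    n = len(y)
    # Decision vector: [lambda_1..lambda_n, t]; maximize t.
    c = np.zeros(n + 1); c[-1] = -1.0
    A_ub, b_ub = [], []
    for i in range(X.shape[1]):                      # input envelopment
        A_ub.append(np.r_[X[:, i], 0.0]); b_ub.append(X[k, i])
    A_ub.append(np.r_[-y, 1.0]); b_ub.append(0.0)    # t <= sum(lam * y)
    integrality = np.r_[np.zeros(n), 1]              # t must be integer
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), integrality=integrality)
    return res.x[-1]

for k in range(len(y)):
    print(f"DMU {k}: observed y = {y[k]:.0f}, "
          f"integer target = {integer_output_target(k):.0f}")
```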

  3. JT8D and JT9D jet engine performance improvement program. Task 1: Feasibility analysis

    Science.gov (United States)

    Gaffin, W. O.; Webb, D. E.

    1979-01-01

    JT8D and JT9D component performance improvement concepts which have a high probability of incorporation into production engines were identified and ranked. An evaluation method based on airline payback period was developed for the purpose of identifying the most promising concepts. The method used available test data and analytical models along with conceptual/preliminary designs to predict the performance improvements, weight, installation characteristics, cost for new production and retrofit, maintenance cost, and qualitative characteristics of candidate concepts. These results were used to arrive at the concept payback period, which is the time required for an airline to recover the investment cost of concept implementation.

  4. Improved diagnosis of MV paper-insulated cables using signal analysis

    DEFF Research Database (Denmark)

    Villefrance, Rasmus; Holbøll, Joachim T.; Sørensen, John Aasted;

    1999-01-01

    With the purpose of improving the PD estimation accuracy and the degree of automation of the measurements, the following study is carried out. Initially, a library of different discharge pulses and actual background noise from a selection of cables is established. The library is then used for the estimation of PD-signals from a parametric model, leading to reduction of the noise superimposed on the PD-signals and thus to improved PD-detection. The applicability of these methods is discussed in relation to mobile systems for the assessment of cable insulation condition.

  5. THE ACCOUNT AND ANALYSIS IMPROVE FOR USING MAIN ITEMS IN DIVISION, SIGNALIZATION AND CONNECTION

    Directory of Open Access Journals (Sweden)

    A. M. Kozuberda

    2011-05-01

    Full Text Available The article deals with proposals for the improvement of accounting in the signalization and connection division and the reasons for such decisions. It is offered to use new methods of calculating depreciation, to change the criteria for classifying low-value items, and to simplify the procedure for writing off fixed assets. These changes should reduce costs at the enterprise, simplify the accounting work, and improve the overall performance of the maintenance section, enabling it to better fulfil its main function – the traffic safety of trains.

  6. Molecular structure based property modeling: Development/ improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan;

    2013-01-01

    The objective of this work is to develop a method for performing property-data-model analysis so that efficient use of knowledge of properties could be made in the development/improvement of property prediction models. The method includes: (i) analysis of property data and its consistency check...... to a wide range of properties of pure compounds. In this work, however, the application of the method is illustrated for the property modeling of normal melting point, enthalpy of fusion, enthalpy of formation, and critical temperature. For all the properties listed above, it has been possible to achieve...

  7. The contribution of ergonomic analysis to the improvement of working conditions, patient reception, and quality in radiotherapy

    International Nuclear Information System (INIS)

    The authors report an ergonomic analysis of the activity in a radiotherapy department of a hospital. This study aimed at understanding physiological fatigue and pathologies among technical personnel, and at analyzing the factors which impact reception quality as well as the professional satisfaction associated with this profession. The authors have developed a guide for a radiotherapy technician workstation study. They report and comment on the activity analysis (time organisation, working time in different positions, movements and handling, handled weights), and also outline the anxiety associated with the risk of error. They identify various improvement possibilities

  8. Testing a four-dimensional variational data assimilation method using an improved intermediate coupled model for ENSO analysis and prediction

    Science.gov (United States)

    Gao, Chuan; Wu, Xinrong; Zhang, Rong-Hua

    2016-07-01

    A four-dimensional variational (4D-Var) data assimilation method is implemented in an improved intermediate coupled model (ICM) of the tropical Pacific. A twin experiment is designed to evaluate the impact of the 4D-Var data assimilation algorithm on ENSO analysis and prediction based on the ICM. The model error is assumed to arise only from the parameter uncertainty. The "observation" of the SST anomaly, which is sampled from a "truth" model simulation that takes default parameter values and has Gaussian noise added, is directly assimilated into the assimilation model with its parameters set erroneously. Results show that 4D-Var effectively reduces the error of ENSO analysis and therefore improves the prediction skill of ENSO events compared with the non-assimilation case. These results provide a promising way for the ICM to achieve better real-time ENSO prediction.
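
    A toy twin experiment of the same shape can be written in a few lines: generate a "truth" trajectory with the default parameter, add Gaussian noise to obtain pseudo-observations, and minimize a variational cost over the erroneous model parameter. The scalar decay model and all numbers below are placeholders for the ICM, purely to illustrate the experimental design.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
p_true, x0, dt, nsteps = 1.0, 2.0, 0.1, 50   # "truth" parameter and grid

def trajectory(p):
    """Forward model: Euler integration of the toy anomaly equation
    dx/dt = -p * x, standing in for the coupled model."""
    x, out = x0, []
    for _ in range(nsteps):
        x += dt * (-p * x)
        out.append(x)
    return np.array(out)

sigma = 0.05
obs = trajectory(p_true) + rng.normal(0.0, sigma, nsteps)  # pseudo-obs

def cost(p):
    """Variational cost: observation misfit over the whole window."""
    return 0.5 * np.sum((trajectory(p) - obs) ** 2) / sigma ** 2

best = minimize_scalar(cost, bounds=(0.1, 5.0), method="bounded")
print(f"recovered p = {best.x:.3f} (truth: {p_true})")
```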

  9. Time-frequency analysis of non-stationary fusion plasma signals using an improved Hilbert-Huang transform

    Science.gov (United States)

    Liu, Yangqing; Tan, Yi; Xie, Huiqiao; Wang, Wenhao; Gao, Zhe

    2014-07-01

    An improved Hilbert-Huang transform method is developed for the time-frequency analysis of non-stationary signals in tokamak plasmas. Maximal overlap discrete wavelet packet transform rather than wavelet packet transform is proposed as a preprocessor to decompose a signal into various narrow-band components. Then, a correlation coefficient based selection method is utilized to eliminate the irrelevant intrinsic mode functions obtained from empirical mode decomposition of those narrow-band components. Subsequently, a time varying vector autoregressive moving average model instead of Hilbert spectral analysis is performed to compute the Hilbert spectrum, i.e., a three-dimensional time-frequency distribution of the signal. The feasibility and effectiveness of the improved Hilbert-Huang transform method are demonstrated by analyzing a non-stationary simulated signal and actual experimental signals in fusion plasmas.
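
    The correlation-based screening step is easy to illustrate. The sketch below runs plain empirical mode decomposition (via the PyEMD package, installed as EMD-signal) on a synthetic two-tone signal and keeps only the IMFs whose correlation with the raw signal exceeds a threshold; the MODWPT preprocessing and the vector ARMA spectral stage of the paper are omitted, and the 0.2 threshold is an arbitrary assumption.

```python
import numpy as np
from PyEMD import EMD   # pip install EMD-signal

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
# Non-stationary test signal: 50 Hz in the first half, 120 Hz afterwards.
signal = (np.sin(2 * np.pi * 50 * t) * (t < 0.5)
          + np.sin(2 * np.pi * 120 * t) * (t >= 0.5))
signal += 0.2 * np.random.default_rng(1).normal(size=t.size)

imfs = EMD()(signal)            # empirical mode decomposition

# Keep only IMFs that correlate with the raw signal above a threshold;
# weakly correlated modes are treated as irrelevant/noise components.
threshold = 0.2
kept = [imf for imf in imfs
        if abs(np.corrcoef(imf, signal)[0, 1]) >= threshold]
print(f"{len(imfs)} IMFs extracted, {len(kept)} kept after screening")
reconstructed = np.sum(kept, axis=0) if kept else np.zeros_like(signal)
```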

  10. An integrated data analysis tool for improving measurements on the MST RFP

    Energy Technology Data Exchange (ETDEWEB)

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
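
    The gain from combining two diagnostics is visible even in a one-parameter toy version: with Gaussian likelihoods and a flat prior, the posterior for the electron temperature evaluated on a grid is narrower than either measurement alone. The temperatures and uncertainties below are made up for illustration.

```python
import numpy as np

# Two hypothetical measurements of the same electron temperature [eV].
te_sxr, sig_sxr = 310.0, 25.0     # soft x-ray estimate
te_ts,  sig_ts  = 295.0, 18.0     # Thomson scattering estimate

grid = np.linspace(200.0, 400.0, 4001)   # flat prior over a broad range
log_post = (-(grid - te_sxr) ** 2 / (2 * sig_sxr ** 2)
            - (grid - te_ts) ** 2 / (2 * sig_ts ** 2))
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, grid)             # normalize the posterior

mean = np.trapz(grid * post, grid)
std = np.sqrt(np.trapz((grid - mean) ** 2 * post, grid))
print(f"combined Te = {mean:.1f} +/- {std:.1f} eV")  # < either input sigma
```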

  11. An integrated data analysis tool for improving measurements on the MST RFP

    International Nuclear Information System (INIS)

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method

  12. Discriminant analysis of farmers adoption of improved maize varieties in Wa Municipality, Upper West Region of Ghana.

    Science.gov (United States)

    Alhassan, Abukari; Salifu, Hussein; Adebanji, Atinuke O

    2016-01-01

    This study employed quadratic classification function analysis to examine the influence of farmers' socio-demographic characteristics and the varietal characteristics of maize on the adoption of improved maize varieties (IMVs) in the Wa Municipality of the Upper West region of Ghana. The results showed that farm labour, information availability about the variety, weed resistance, low-yielding variety, early maturity and water stress resistance are the major discriminating variables in classifying farmers in the Municipality. The study further revealed that maize experience, low yield, information availability and cost of variety were the most influential discriminating variables between adopters and non-adopters of IMVs. The study recommended the need to improve the level of farmers' education, ensure steady access to extension services, and improve the varietal characteristics identified in the study. PMID:27652087

  13. Analysis of Farmers’ Willingness to Adopt Improved Peanut Varieties in Northern Ghana with the use of Baseline Survey Data.

    OpenAIRE

    Ibrahim, Mohammed; Florkowski, Wojciech

    2015-01-01

    This study employed a probit model to identify the factors that influence the willingness of farmers in northern Ghana to adopt improved peanut varieties. Cross-sectional data on 206 peanut farmers from the Tamale Metropolitan, Tolon-Kumbungu and Savelugu-Nanton districts in the northern region of Ghana were used in the analysis. The estimated results indicate that Tolon-Kumbungu district (location), early maturity, farm size, ownership of a radio and membership in a farm organization signi...
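
    For readers unfamiliar with the method, a probit adoption model of this general form can be fitted in a few lines with statsmodels; the covariates below are synthetic stand-ins loosely mimicking the survey variables, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 206
farm_size = rng.gamma(2.0, 1.5, n)     # hectares (hypothetical)
radio = rng.integers(0, 2, n)          # owns a radio
member = rng.integers(0, 2, n)         # farm-organization member
# Latent adoption propensity with assumed coefficients plus normal error.
latent = -1.0 + 0.3 * farm_size + 0.8 * radio + 0.6 * member
adopt = (latent + rng.normal(size=n) > 0).astype(int)

X = sm.add_constant(np.column_stack([farm_size, radio, member]))
fit = sm.Probit(adopt, X).fit(disp=False)
print(fit.summary(xname=["const", "farm_size", "radio", "member"]))
```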

  14. Vibrational Analysis of Brucite Surfaces and the Development of an Improved Force Field for Molecular Simulation of Interfaces

    OpenAIRE

    Zeitler, Todd R.; Greathouse, Jeffery A.; Gale, Julian D.; Cygan, Randall T.

    2014-01-01

    We introduce a nonbonded three-body harmonic potential energy term for Mg–O–H interactions for improved edge surface stability in molecular simulations. The new potential term is compatible with the Clayff force field and is applied here to brucite, a layered magnesium hydroxide mineral. Comparisons of normal mode frequencies from classical and density functional theory calculations are used to verify a suitable spring constant (k parameter) for the Mg–O–H bending motion. Vibrational analysis...

  15. Detection of ULF electromagnetic emissions as a precursor to an earthquake in China with an improved polarization analysis

    Directory of Open Access Journals (Sweden)

    Y. Ida

    2008-07-01

    Full Text Available An improved analysis of polarization (defined as the ratio of the vertical magnetic field component to the horizontal one) has been developed and applied to approximately four years of data (from 1 March 2003 to 31 December 2006) observed at Kashi station in China. It is concluded that the polarization ratio exhibited an apparent increase only just before the earthquake of 1 September 2003 (magnitude = 6.1, epicentral distance = 116 km).
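
    The polarization ratio itself is straightforward to compute from two field components: estimate the power spectral densities, average over a ULF band, and take the square root of the ratio. The band, sampling rate and synthetic data below are assumptions for illustration, not the processing actually applied to the Kashi records.

```python
import numpy as np
from scipy.signal import welch

def polarization_ratio(bz, bh, fs, band=(0.01, 0.05)):
    """Ratio of vertical to horizontal ULF magnetic amplitude,
    sqrt(Sz/Sh), averaged over a frequency band [Hz]."""
    f, sz = welch(bz, fs=fs, nperseg=4096)
    _, sh = welch(bh, fs=fs, nperseg=4096)
    sel = (f >= band[0]) & (f <= band[1])
    return np.sqrt(sz[sel].mean() / sh[sel].mean())

# Synthetic example: one day of 1 Hz data with a weak vertical component.
rng = np.random.default_rng(7)
n = 86400
bh = rng.normal(size=n)
bz = 0.5 * rng.normal(size=n)
print(f"Z/H polarization ratio = {polarization_ratio(bz, bh, fs=1.0):.2f}")
```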

  16. Using Process Definition and Analysis Techniques to Reduce Errors and Improve Efficiency in the Delivery of Healthcare

    OpenAIRE

    Clarke, Lori; Osterweil, Leon

    2013-01-01

    As has been widely reported in the news lately, healthcare errors are a major cause of death and suffering, and healthcare inefficiencies result in escalating costs. In the University of Massachusetts Medical Safety Project, we are investigating whether process definition and analysis technologies can be used to help reduce healthcare errors and improve healthcare efficiency. Specifically, we are modeling healthcare processes using a process definition language and then analyzing these processes usin...

  17. AN IMPROVED ERROR ANALYSIS FOR FINITE ELEMENT APPROXIMATION OF BIOLUMINESCENCE TOMOGRAPHY

    Institute of Scientific and Technical Information of China (English)

    Wei Gong; Ruo Li; Ningning Yan; Weibo Zhao

    2008-01-01

    This paper is concerned with an ill-posed problem which arises in the area of molecular imaging and is known as the bioluminescence tomography (BLT) problem. Using the Tikhonov regularization technique, a quadratic optimization problem can be formulated. We provide an improved error estimate for the finite element approximation of the regularized optimization problem. Some numerical examples are presented to demonstrate our theoretical results.
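
    The regularization step (though not the finite element discretization or the error analysis) can be sketched compactly: Tikhonov regularization replaces the ill-posed least-squares problem by the well-posed normal equations (AᵀA + αI)x = Aᵀb. The toy operator below is deliberately near rank-deficient to show the stabilizing effect of α; it is an illustration of the technique, not the paper's setting.

```python
import numpy as np

def tikhonov_solve(A, b, alpha):
    """Solve min_x ||A x - b||^2 + alpha * ||x||^2 via the regularized
    normal equations (A^T A + alpha I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# Ill-posed toy problem: operator with rapidly decaying singular values.
rng = np.random.default_rng(3)
A = (rng.normal(size=(50, 20))
     @ np.diag(np.logspace(0, -8, 20))
     @ rng.normal(size=(20, 20)))
x_true = rng.normal(size=20)
b = A @ x_true + 1e-6 * rng.normal(size=50)   # noisy data

for alpha in (0.0, 1e-8, 1e-4):
    x = (np.linalg.lstsq(A, b, rcond=None)[0] if alpha == 0
         else tikhonov_solve(A, b, alpha))
    print(f"alpha = {alpha:1.0e}  error = {np.linalg.norm(x - x_true):.3g}")
```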

  18. Econometric analysis of improved maize varieties and sustainable agricultural practices (SAPs) in Eastern Zambia

    NARCIS (Netherlands)

    Manda, J.

    2016-01-01

    Maize is the principal food staple in Zambia, providing both food and income for most of the rural populace. It is estimated that over 50% of the daily caloric intake is derived from maize, with an average consumption of over 85 kg per year. Because of the importance of maize, a number of improved ma

  19. Using Mobile Phones to Improve Educational Outcomes: An Analysis of Evidence from Asia

    Science.gov (United States)

    Valk, John-Harmen; Rashid, Ahmed T.; Elder, Laurent

    2010-01-01

    Despite improvements in educational indicators, such as enrolment, significant challenges remain with regard to the delivery of quality education in developing countries, particularly in rural and remote regions. In the attempt to find viable solutions to these challenges, much hope has been placed in new information and communication technologies…

  20. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    WU Jing; GUO ZengYuan

    2008-01-01

    The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.

  1. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.
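
    A concrete instance of the claim that no reversible path needs to be constructed: for an ideal gas, the entropy difference between two equilibrium states is a function of state variables only, so the entropy produced in an irreversible free expansion follows directly. The monatomic-gas numbers below are a worked example, not taken from the paper.

```python
import numpy as np

R = 8.314      # gas constant [J/(mol K)]
cv = 1.5 * R   # molar heat capacity of a monatomic ideal gas

def delta_s(T1, v1, T2, v2):
    """Entropy change of an ideal gas between two equilibrium states,
    evaluated purely from state properties:
    ds = cv*ln(T2/T1) + R*ln(v2/v1)."""
    return cv * np.log(T2 / T1) + R * np.log(v2 / v1)

# Irreversible free expansion into vacuum: T unchanged, volume doubled.
print(f"free expansion: ds = {delta_s(300.0, 1.0, 300.0, 2.0):.3f} J/(mol K)")
print(f"R*ln(2)        =      {R * np.log(2):.3f} J/(mol K)")
```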

  2. A Meta-Analysis of Educational Data Mining on Improvements in Learning Outcomes

    Science.gov (United States)

    AlShammari, Iqbal A.; Aldhafiri, Mohammed D.; Al-Shammari, Zaid

    2013-01-01

    A meta-synthesis study was conducted of 60 research studies on educational data mining (EDM) and their impacts on and outcomes for improving learning outcomes. After an overview, an examination of these outcomes is provided (Romero, Ventura, Espejo, & Hervas, 2008; Romero, et al., 2011). Then, a review of other EDM-related research…

  3. How to Improve Pupils' Literacy? A Cost-Effectiveness Analysis of a French Educational Project

    Science.gov (United States)

    Massoni, Sebastien; Vergnaud, Jean-Christophe

    2012-01-01

    The "Action Lecture" program is an innovative teaching method run in some nursery and primary schools in Paris and designed to improve pupils' literacy. We report the results of an evaluation of this program. We describe the experimental protocol that was built to estimate the program's impact on several types of indicators. Data were processed…

  4. A Novel Approach to Improve the Detectability of CO2 by GC Analysis

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A novel stochastic resonance algorithm was employed to enhance the signal-to-noise ratio (SNR) of analytical chemistry signals. Using a gas chromatographic data set, it was shown that the SNR was greatly improved while the quantitative relationship between concentrations and chromatographic responses was preserved. The linear range was extended beyond the instrumental detection limit.
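
    A minimal bistable stochastic-resonance stage of the general kind referred to can be simulated directly: drive the overdamped double-well system dx/dt = ax − bx³ + k·u(t) with the noisy input and compare signal-bin SNR before and after. All parameters, the synthetic "chromatographic" signal and the median-floor SNR metric are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0, n = 1000.0, 5.0, 20000
t = np.arange(n) / fs
clean = 0.08 * np.sin(2 * np.pi * f0 * t)   # weak periodic response
noisy = clean + 0.5 * rng.normal(size=n)    # buried in noise

a = b = gain = 50.0                         # assumed well parameters
dt = 1.0 / fs
x, out = 0.0, np.empty(n)
for i in range(n):                          # Euler step of the double well
    x += dt * (a * x - b * x ** 3 + gain * noisy[i])
    out[i] = x

def snr_db(sig):
    """Signal-bin power relative to the median broadband floor [dB]."""
    spec = np.abs(np.fft.rfft(sig - sig.mean())) ** 2
    k = int(round(f0 * n / fs))
    return 10 * np.log10(spec[k] / np.median(spec))

print(f"SNR before: {snr_db(noisy):5.1f} dB   after: {snr_db(out):5.1f} dB")
```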

  5. Analysis and Improvement of the Lightweight Mutual Authentication Protocol under EPC C-1 G-2 Standard

    Directory of Open Access Journals (Sweden)

    Masoud Mohammadi

    Full Text Available Radio Frequency Identification (RFID) technology is a promising technology. It uses radio waves to identify objects. Through automatic and real-time data acquisition, this technology can give a great benefit to various industries by improving the efficien ...

  6. How Can Bulgaria Improve Its Education System? An Analysis of PISA 2012 and Past Results

    OpenAIRE

    World Bank

    2012-01-01

    Bulgaria's performance on all three disciplines of the Programme for International Student Assessment (PISA) 2012 was slightly better than its PISA 2000 performance, after having dropped between 2000 and 2006. The improvements in performance between 2006 and 2012 promoted shared prosperity, but equality of opportunities is still a major challenge. In fact, disaggregating students' PISA score...

  7. Improvement in DMSA imaging using adaptive noise reduction: an ROC analysis.

    Science.gov (United States)

    Lorimer, Lisa; Gemmell, Howard G; Sharp, Peter F; McKiddie, Fergus I; Staff, Roger T

    2012-11-01

    Dimercaptosuccinic acid imaging is the 'gold standard' for the detection of cortical defects and diagnosis of scarring of the kidneys. The Siemens planar processing package, which implements adaptive noise reduction using the Pixon algorithm, is designed to allow a reduction in image noise, enabling improved image quality and reduced acquisition time/injected activity. This study aimed to establish the level of improvement in image quality achievable using this algorithm. Images were acquired of a phantom simulating a single kidney with a range of defects of varying sizes, positions and contrasts. These images were processed using the Pixon processing software and shown to 12 observers (six experienced and six novices) who were asked to rate the images on a six-point scale depending on their confidence that a defect was present. The data were analysed using a receiver operating characteristic approach. Results showed that processed images significantly improved the performance of the experienced observers in terms of their sensitivity and specificity. Although novice observers showed significant increase in sensitivity when using the software, a significant decrease in specificity was also seen. This study concludes that the Pixon software can be used to improve the assessment of cortical defects in dimercaptosuccinic acid imaging by suitably trained observers.
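
    The receiver operating characteristic computation behind such an observer study is compact: treat the six-point confidence ratings as a score, sweep the threshold, and report sensitivity against 1 − specificity plus the area under the curve. The ratings below are fabricated purely to show the mechanics.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(5)
truth = rng.integers(0, 2, 120)                        # defect present?
# Six-point confidence ratings: defect images tend to score higher.
ratings = np.clip(np.where(truth == 1,
                           rng.normal(4.3, 1.0, 120),
                           rng.normal(2.8, 1.0, 120)), 1, 6).round()

auc = roc_auc_score(truth, ratings)
fpr, tpr, thr = roc_curve(truth, ratings)
print(f"AUC = {auc:.3f}")
for f, s, th in zip(fpr[1:], tpr[1:], thr[1:]):        # skip the +inf cut
    print(f"rating >= {th:3.1f}: sensitivity {s:.2f}, 1-specificity {f:.2f}")
```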

  8. Improving Financial Literacy of College Students: A Cross-Sectional Analysis

    Science.gov (United States)

    Seyedian, Mojtaba; Yi, Taihyeup David

    2011-01-01

    Financial literacy has become more important than ever as an increasing number of college students are relying on credit cards to finance their education. We examine whether college students are knowledgeable about finance, whether they improve upon that knowledge, and whether their demographic profile, financial backgrounds, and…

  9. IMPROVING THE DESIGN AND ANALYSIS OF SUPERCONDUCTING MAGNETS FOR PARTICLE ACCELERATORS

    Energy Technology Data Exchange (ETDEWEB)

    GUPTA,R.C.

    1996-11-01

    The field quality in superconducting magnets has been improved to a level where it does not appear to be a limiting factor on the performance of RHIC. The many methods developed, improved and adopted during the course of this work have contributed significantly to that performance. One can not only design and construct magnets with better field quality than any made before, but can also improve on that quality after construction. The relative field error (ΔB/B) can now be made as low as a few parts in 10⁻⁵ at 2/3 of the coil radius. This is about an order of magnitude better than what is generally expected for superconducting magnets. This extra-high field quality is crucial to the luminosity performance of RHIC. The research work described here covers a number of areas which must all be addressed to build production magnets with high field quality. The work has been limited to the magnetic design of the cross section, which in most cases essentially determines the field quality performance of the whole magnet, since these magnets are generally long. Though the conclusions presented in this chapter have been discussed at the end of each chapter, a summary of them might be useful to present a complete picture. The lessons learned from these experiences may be useful in the design of new magnets. The possibilities of future improvements will also be presented.

  10. Improving the Memory Sections of the Standardized Assessment of Concussion Using Item Analysis

    Science.gov (United States)

    McElhiney, Danielle; Kang, Minsoo; Starkey, Chad; Ragan, Brian

    2014-01-01

    The purpose of the study was to improve the immediate and delayed memory sections of the Standardized Assessment of Concussion (SAC) by identifying a list of more psychometrically sound items (words). A total of 200 participants with no history of concussion in the previous six months (aged 19.60 ± 2.20 years; N = 93 men, N = 107 women)…

  11. Reporting Data with "Over-the-Counter" Data Analysis Supports Improves Educators' Data Analyses

    Science.gov (United States)

    Rankin, Jenny Grant

    2014-01-01

    The benefits of making data-informed decisions to improve learning rely on educators correctly interpreting given data. Many educators routinely misinterpret data, even at districts with proactive support for data use. The tool most educators use for data analyses, which is an information technology data system or its reports, typically reports…

  12. Stability analysis and design of the improved droop controller on a voltage source inverter

    DEFF Research Database (Denmark)

    Calabria, Mauro; Schumacher, Walter; Guerrero, Josep M.;

    2015-01-01

    This paper studies the dynamics of a droop-controlled voltage source inverter connected to a stiff grid and addresses the use of the improved droop controller in order to enhance the dynamic behavior of the system. The small-signal stability of the inverter is studied in depth considering...

  13. The Practical Relevance of Accountability Systems for School Improvement: A Descriptive Analysis of California Schools

    Science.gov (United States)

    Mintrop, Heinrich; Trujillo, Tina

    2007-01-01

    In search for the practical relevance of accountability systems for school improvement, the authors ask whether practitioners traveling between the worlds of system-designated high- and low-performing schools would detect tangible differences in educational quality and organizational effectiveness. In comparing nine exceptionally high and low…

  14. Does Agency Competition Improve the Quality of Policy Analysis? Evidence from OMB and CBO Fiscal Projections

    Science.gov (United States)

    Krause, George A.; Douglas, James W.

    2006-01-01

    Public management scholars often claim that agency competition provides an effective institutional check on monopoly authority, and hence, leads to improvement of administrative performance in public sector agencies. This logic was central for creating the Congressional Budget Office (CBO) in 1975 to challenge the policy information provided by…

  15. Toward improving the proteomic analysis of formalin-fixed, paraffin-embedded tissue.

    Science.gov (United States)

    Fowler, Carol B; O'Leary, Timothy J; Mason, Jeffrey T

    2013-08-01

    Archival formalin-fixed, paraffin-embedded (FFPE) tissue and their associated diagnostic records represent an invaluable source of retrospective proteomic information on diseases for which the clinical outcome and response to treatment are known. However, analysis of archival FFPE tissues by high-throughput proteomic methods has been hindered by the adverse effects of formaldehyde fixation and subsequent tissue histology. This review examines recent methodological advances for extracting proteins from FFPE tissue suitable for proteomic analysis. These methods, based largely upon heat-induced antigen retrieval techniques borrowed from immunohistochemistry, allow at least a qualitative analysis of the proteome of FFPE archival tissues. The authors also discuss recent advances in the proteomic analysis of FFPE tissue; including liquid-chromatography tandem mass spectrometry, reverse phase protein microarrays and imaging mass spectrometry.

  16. Improving Real Analysis in Coq: a User-Friendly Approach to Integrals and Derivatives

    OpenAIRE

    Boldo, Sylvie; Lelay, Catherine; Melquiond, Guillaume

    2012-01-01

    Verification of numerical analysis programs requires dealing with derivatives and integrals. High confidence in this process can be achieved using a formal proof checker, such as Coq. Its standard library provides an axiomatization of real numbers and various lemmas about real analysis, which may be used for this purpose. Unfortunately, its definitions of derivative and integral are impractical as they are partial functions that demand a proof term. This proof term m...

  17. Analysis of marketing communications in the selected services firm and proposal of possible improvements

    OpenAIRE

    Černá, Jana

    2009-01-01

    This graduation thesis is concerned with the analysis of marketing communication in services – specifically in the Wellness hotel Rezidence Nové Hrady. The thesis was written during the temporary opening of the hotel, before the finishing and opening of its new spaces. The analytical part recaps the history of the hotel and carries out a detailed situation analysis, identifying the advantages and disadvantages the hotel should exploit to persuade customers. The synthetic part builds on the anal...

  18. Analysis of the evaluated data discrepancies for minor actinides and development of improved evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ignatyuk, A. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)

    1997-03-01

    The work is directed at the compilation of experimental and evaluated data available for neutron induced reaction cross sections on 237Np, 241Am, 242mAm and 243Am isotopes, at the analysis of the old data and renormalizations connected with changes of standards, and at the comparison of experimental data with theoretical calculations. Main results of the analysis performed so far are presented in this report. (J.P.N.)

  19. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    Directory of Open Access Journals (Sweden)

    Feofanova Iryna V.

    2013-11-01

    Full Text Available The goal of the article is to identify directions for improving the accounting system at an enterprise so that strategic analysis procedures are supplied with trustworthy information. Historical methods (for the study of the conditions under which strategic analysis appeared and developed) and logical methods (for identification of directions of improvement of accounting methods) were used during the study. The article establishes that modern conditions require a system of indicators based on both financial and non-financial information. In order to conduct strategic analysis it is necessary to expand the volume of information that characterises such resources of an enterprise as research and development, personnel, and the quality of products (services). Among the indicators that provide such information, the article selects innovation activity costs and personnel training costs, the accounting of which is not sufficiently regulated. To meet the information requirements of analysts, it offers to improve accounting in the following directions: identification of the nature and volume of information required by enterprise managers; formation of a system of accounting by places of appearance of expenses and responsibility centres; and identification and accounting of income or other results received by the enterprise due to personnel advanced training, research and development, and innovation introduction costs. The article offers a form for calculating the savings from cost reductions obtained due to governmental privileges provided to enterprises that introduce innovations and invest in personnel training.

  20. Do improvements in outreach, clinical, and family and community-based services predict improvements in child survival? An analysis of serial cross-sectional national surveys

    Directory of Open Access Journals (Sweden)

    Simen-Kapeu Aline

    2011-06-01

    Full Text Available Abstract Background There are three main service delivery channels: clinical services, outreach, and family and community. To determine which delivery channels are associated with the greatest reductions in under-5 mortality rates (U5MR), we used data from sequential population-based surveys to examine the correlation between changes in coverage of clinical, outreach, and family and community services and in U5MR for 27 high-burden countries. Methods Household survey data were abstracted from serial surveys in 27 countries. Average annual changes (AAC) between the most recent and penultimate survey were calculated for under-five mortality rates and for 22 variables in the domains of clinical, outreach, and family- and community-based services. For all 27 countries and a subset of 19 African countries, we conducted principal component analysis to reduce the variables into a few components in each domain and applied linear regression to assess the correlation between changes in the principal components and changes in under-five mortality rates after controlling for multiple potential confounding factors. Results AAC in under-5 mortality varied from 6.6% in Nepal to -0.9% in Kenya, with six of the 19 African countries experiencing less than a 1% decline in mortality. The strongest correlation with reductions in U5MR was observed for access to clinical services (all countries: p = 0.02, r2 = 0.58; 19 African countries: r2 = 0.67). For outreach activities, AAC U5MR was significantly correlated with antenatal care and family planning services, while AAC in immunization services showed no association. In the family- and community-services domain, improvements in breastfeeding were associated with significant changes in mortality in the 27 countries but not in the African subset, while in the African countries nutritional status improvements were associated with a significant decline in mortality. Conclusions Our findings support the importance of
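
    The two-stage statistical design (dimension reduction within a domain, then regression of mortality change on the components) looks like the following in outline; the data are synthetic stand-ins for the survey-derived AACs.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
n_countries = 27
# Hypothetical annual coverage changes for 8 clinical-domain indicators.
clinical = rng.normal(size=(n_countries, 8))
# Synthetic mortality change, partly driven by the first indicators.
aac_u5mr = (0.6 * clinical[:, :3].mean(axis=1)
            + rng.normal(0, 0.3, n_countries))

pcs = PCA(n_components=2).fit_transform(clinical)  # domain components
model = LinearRegression().fit(pcs, aac_u5mr)
r2 = model.score(pcs, aac_u5mr)
print(f"R^2 of AAC U5MR on first two clinical components: {r2:.2f}")
```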

  1. Analysis of policy towards improvement of perinatal mortality in the Netherlands (2004-2011).

    Science.gov (United States)

    Vos, Amber A; van Voorst, Sabine F; Steegers, Eric A P; Denktaş, Semiha

    2016-05-01

    Relatively high perinatal mortality and morbidity rates in the Netherlands resulted in a process which induced policy changes regarding the Dutch perinatal healthcare system. The aims of this policy analysis are (1) to identify actors, context and process factors that promoted or impeded agenda setting and formulation of policy regarding perinatal health care reform and (2) to present an overview of the renewed perinatal health policy. The policy triangle framework for policy analysis by Walt and Gilson was applied. Contents of policy, actors, context factors and process factors were identified by triangulation of data from three sources: a document analysis, a stakeholder analysis and semi-structured interviews with key stakeholders. The analysis enabled us to chronologically reconstruct the policy process in response to the perinatal mortality rates. The quantification of the perinatal mortality problem, the openness of the debate and the nature of the topic were important process factors. The main theme of the policy was that change was required across the entire spectrum of perinatal healthcare, ranging from care in the preconception phase through to the puerperium. Furthermore, emphasis was placed on the importance of preventive measures and the socio-environmental determinants of health. This required involvement of the preventive setting, including municipalities. The Dutch tiered perinatal healthcare system and divergent views amongst curative perinatal health care providers were important context factors. This study provides lessons which are applicable to health care professionals and policy makers in perinatal care and other multidisciplinary fields.

  2. Improving detection of differentially expressed gene sets by applying cluster enrichment analysis to Gene Ontology

    Directory of Open Access Journals (Sweden)

    Gu JianLei

    2009-08-01

    Full Text Available Abstract Background Gene set analysis based on Gene Ontology (GO) can be a promising method for the analysis of differential expression patterns. However, current studies that focus on individual GO terms have limited analytical power, because the complex structure of GO introduces strong dependencies among the terms, and some genes that are annotated to a GO term cannot be found by statistically significant enrichment. Results We proposed a method for enriching clustered GO terms based on semantic similarity, namely cluster enrichment analysis based on GO (CeaGO), to extend the individual term analysis method. Using an Affymetrix HGU95aV2 chip dataset with simulated gene sets, we illustrated that CeaGO was sensitive enough to detect moderate expression changes. When compared to parent-based individual term analysis methods, the results showed that CeaGO may provide more accurate differentiation of gene expression results. When used with two acute leukemia (ALL and ALL/AML) microarray expression datasets, CeaGO correctly identified specifically enriched GO groups that were overlooked by other individual test methods. Conclusion By applying CeaGO to both simulated and real microarray data, we showed that this approach could enhance the interpretation of microarray experiments. CeaGO is currently available at http://chgc.sh.cn/en/software/CeaGO/.
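
    The overall scheme of cluster-based enrichment can be sketched with standard tools: cluster terms on a (precomputed) semantic-similarity matrix, pool each cluster's gene annotations, and apply a hypergeometric test per cluster. The similarities, annotations and the hypergeometric statistic below are illustrative assumptions; CeaGO's own similarity measure and test may differ.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import hypergeom

terms = ["GO:A", "GO:B", "GO:C", "GO:D"]
sim = np.array([[1.0, 0.9, 0.2, 0.1],       # invented semantic similarity
                [0.9, 1.0, 0.3, 0.2],
                [0.2, 0.3, 1.0, 0.8],
                [0.1, 0.2, 0.8, 1.0]])
annot = {"GO:A": {1, 2, 3}, "GO:B": {2, 3, 4},
         "GO:C": {7, 8}, "GO:D": {8, 9}}      # genes annotated per term

# Hierarchical clustering on distance = 1 - similarity (condensed form).
cond = 1.0 - sim[np.triu_indices(4, k=1)]
labels = fcluster(linkage(cond, method="average"),
                  t=0.5, criterion="distance")

N, diff = 1000, {1, 2, 3, 4, 9}               # genome size, DE genes
for c in set(labels):
    genes = set().union(*(annot[t] for t, l in zip(terms, labels) if l == c))
    k = len(genes & diff)
    p = hypergeom.sf(k - 1, N, len(genes), len(diff))
    print(f"cluster {c}: {k}/{len(genes)} DE genes, p = {p:.2e}")
```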

  3. Security Analysis and Improvement of User Authentication Framework for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Nan Chen

    2014-01-01

    Full Text Available Cloud computing, as an emerging, virtual, large-scale distributed computing model, has gained increasing attention in recent years. Meanwhile it also faces many security challenges, one of which is authentication. In this paper, we first analyze a user authentication framework for cloud computing proposed by Amlan Jyoti Choudhury et al. and point out the security attacks existing in the protocol. We then propose an improved user authentication scheme. Our improved protocol ensures user legitimacy before entering the cloud. The confidentiality and the mutual authentication of our protocol are formally proved by the strand space model theory and the authentication test method. The simulation illustrates that the communication performance of our scheme is efficient.

  4. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama

    2016-04-01

    Full Text Available We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conduct an experiment in exactly the same settings and find that the method, which is based on a Web mining technique, has lost over 30 percentage points of its performance since being proposed in 2013. We therefore hypothesize on the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We found that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.

  5. Use of PSA Level 2 analysis for improving containment performance. Report of a technical committee meeting

    International Nuclear Information System (INIS)

    In order to discuss and exchange experience on different aspects of methods associated with Level 2 PSA and its applications for improving containment performance, the IAEA held a Technical Committee meeting in Vienna in December 1996. The meeting, which was attended by 26 participants from 20 Member States, provided a broad forum for discussion. The meeting addressed the issues related to the actual performance of Level 2 PSA studies as well as the insights gained from applications to improve containment performance. Particular attention was given to studies and applications for WWER type reactors, for which Level 2 work is still in its early stages, and for channel type reactors where modelling of accident progression is complex and significantly different from vessel type light water reactors. This TECDOC contains the papers presented at the meeting and the results of extensive discussions which were held in specific working groups

  6. Electromagnetic Analysis of Magnetic-Jack type CRDM for Thrust Force Improvement

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Hyung; Lee, Jae Sun; Kim, Ji Ho; Choi, Suhn; Park, Keun Bae [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2007-07-01

    The magnetic-jack (magjack) type is used for the control rod drive mechanism (CRDM) of the System-integrated Modular Advanced Reactor (SMART). An arrangement of three flat-face plunger electromagnets, when energized in a controlled sequence, will lift, hold, or insert a rod cluster in the reactor core. In particular, the magjack-type electromagnet for SMART requires a greater thrust force than that of the KSNP (Korean Standard Nuclear Power Plant) under the same spatial constraints. In order to achieve improved thrust force, numerical magnetic field calculations for various kinds of magnetic yoke configurations of the electromagnet have been performed. As a result, we present an improved design of the electromagnet of the magjack-type CRDM for SMART.

  7. Electromagnetic Analysis of Magnetic-Jack type CRDM for Thrust Force Improvement

    International Nuclear Information System (INIS)

    The magnetic-jack (magjack) type is used for the control rod drive mechanism (CRDM) of the System-integrated Modular Advanced Reactor (SMART). An arrangement of three flat-face plunger electromagnets, when energized in a controlled sequence, will lift, hold, or insert a rod cluster in the reactor core. In particular, the magjack-type electromagnet for SMART requires a greater thrust force than that of the KSNP (Korean Standard Nuclear Power Plant) under the same spatial constraints. In order to achieve improved thrust force, numerical magnetic field calculations for various kinds of magnetic yoke configurations of the electromagnet have been performed. As a result, we present an improved design of the electromagnet of the magjack-type CRDM for SMART

  8. Performance Efficiency Improvement of Parabolic Solar Concentrating Collector (An Experimental Evaluation and Analysis

    Directory of Open Access Journals (Sweden)

    Durai Kalyana Kumar

    2014-12-01

    Full Text Available 'Energy conserved is energy generated.' The energy crisis is one of the crucial problems faced by all countries due to the rapid depletion of natural resources. A viable and immediate solution at this juncture is the use of renewable energy sources like solar energy, wind energy, etc. A focusing-type solar energy concentrator was fabricated and tested to evaluate its performance and to improve its operating efficiency. The experimental evaluations were carried out during the solar window (between 9:00 am and 3:00 pm) using statistical solar irradiation data and real-time measurements carried out using a pyranometer. Efficiency improvement was attempted through different reflecting surfaces, the greenhouse effect and selective coating. The energy conservation, preservation of fossil fuel and carbon footprint were estimated along with the cost economics and presented in this article in a very simplified style.

  9. Genetic Diversity Analysis of Iranian Improved Rice Cultivars through RAPD Markers

    Directory of Open Access Journals (Sweden)

    Ghaffar KIANI

    2011-08-01

    Full Text Available The aim of this study was to evaluate the genetic diversity of Iranian improved rice varieties. Sixteen rice varieties of particular interest to breeding programs were evaluated by means of the random amplified polymorphic DNA (RAPD) technique. The number of amplification products generated by each primer varied from 4 (OPB-04) to 11 (OPD-11), with an average of 8.2 bands per primer. Out of 49 bands, 33 (67.35%) were found to be polymorphic for one or more cultivars, ranging from 4 to 9 fragments per primer. The size of amplified fragments ranged between 350 and 1800 bp. Pair-wise Nei and Li's (1979) similarity estimates ranged from 0.59 to 0.98 between rice cultivars. The results illustrate the potential of RAPD markers to distinguish improved cultivars at the DNA level. This information will facilitate the selection of genotypes to serve as parents for effective rice breeding programs in Iran.
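
    The similarity measure used here has a one-line definition: for two cultivars with N_a and N_b scored bands of which N_ab are shared, Nei and Li's coefficient is S = 2·N_ab/(N_a + N_b). A small sketch with a hypothetical 0/1 band matrix:

```python
import numpy as np

def nei_li(a, b):
    """Nei & Li (1979) similarity for two binary band profiles:
    S = 2 * N_ab / (N_a + N_b), where N_ab is the number of shared bands."""
    shared = np.sum((a == 1) & (b == 1))
    return 2.0 * shared / (a.sum() + b.sum())

# Hypothetical band matrix: rows = cultivars, columns = RAPD bands.
bands = np.array([[1, 1, 0, 1, 0, 1],
                  [1, 1, 0, 0, 0, 1],
                  [0, 1, 1, 1, 1, 0]])
n = bands.shape[0]
for i in range(n):
    for j in range(i + 1, n):
        print(f"cultivar {i} vs {j}: S = {nei_li(bands[i], bands[j]):.2f}")
```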

  10. Improved Sinusoid Analysis and Post-Processing in Parametric Audio Coding

    Institute of Scientific and Technical Information of China (English)

    周宏; 陈健

    2003-01-01

    This paper proposes improvements to a low bit rate parametric audio coder with a sinusoid model as its kernel. Firstly, we propose a new method to effectively order and select the perceptually most important sinusoids: the sinusoid which contributes most to the reduction of the overall noise-to-mask ratio (NMR) is chosen. Combined with our improved parametric psychoacoustic model and advanced peak riddling techniques, the number of sinusoids required can be greatly reduced and the coding efficiency greatly enhanced. A lightweight version is also given to reduce the amount of computation with only a small sacrifice of performance. Secondly, we propose two enhancement techniques for sinusoid synthesis: bandwidth enhancement and line enhancement. With little overhead, the effective bandwidth can be extended by one more octave; the timbre tends to sound much brighter, thicker and more beautiful.

  11. An analysis of "eco-centric management" in China and methods to improve it

    Institute of Scientific and Technical Information of China (English)

    Li Zhengfeng; Zhang Fan

    2006-01-01

    This essay focuses on the question "what does the evidence on environmental regulation and its implementation tell us about the extent of eco-centric management in China and how to improve it". The first part introduces ecocentrism, eco-centric management, and one major way to achieve eco-centric management in practice. Second, the environmental regulations of the United Nations (UN) and China are analyzed and compared to find out whether they are eco-centric. Moreover, the implementation of environmental regulation in China is analyzed, because regulation cannot exist without proper implementation. Three suggestions are given to improve eco-centric management in China: natural science research and public administration, environmental education, and international cooperation.

  12. Improving PWR core simulations by Monte Carlo uncertainty analysis and Bayesian inference

    CERN Document Server

    Castro, Emilio; Buss, Oliver; Garcia-Herranz, Nuria; Hoefer, Axel; Porsch, Dieter

    2016-01-01

    A Monte Carlo-based Bayesian inference model is applied to the prediction of reactor operation parameters of a PWR nuclear power plant. In this non-perturbative framework, high-dimensional covariance information describing the uncertainty of microscopic nuclear data is combined with measured reactor operation data in order to provide statistically sound, well founded uncertainty estimates of integral parameters, such as the boron letdown curve and the burnup-dependent reactor power distribution. The performance of this methodology is assessed in a blind test approach, where we use measurements of a given reactor cycle to improve the prediction of the subsequent cycle. As it turns out, the resulting improvement of the prediction quality is impressive. In particular, the prediction uncertainty of the boron letdown curve, which is of utmost importance for the planning of the reactor cycle length, can be reduced by one order of magnitude by including the boron concentration measurement information of the previous...
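
    Structurally, such a non-perturbative Bayesian step is a Gaussian update driven by a Monte Carlo prior sample: the covariance between the measured quantity of one cycle and the predicted quantity of the next, both induced by shared nuclear-data perturbations, fixes a Kalman-type gain. The boron-concentration numbers below are invented to show that structure only, not the methodology's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
# Monte Carlo prior: joint samples of (measured cycle-N boron, predicted
# cycle-N+1 boron), correlated through a shared nuclear-data perturbation.
nsamp = 5000
shared = rng.normal(size=nsamp)
y = 1200.0 + 40.0 * shared + rng.normal(0, 10.0, nsamp)   # cycle N [ppm]
x = 1150.0 + 35.0 * shared + rng.normal(0, 10.0, nsamp)   # cycle N+1 [ppm]

y_obs, r_var = 1230.0, 5.0 ** 2          # measurement and its variance
cov_xy = np.cov(x, y)[0, 1]
var_y = np.var(y, ddof=1)
gain = cov_xy / (var_y + r_var)          # scalar Kalman-type gain
x_post = x.mean() + gain * (y_obs - y.mean())
var_post = np.var(x, ddof=1) - gain * cov_xy
print(f"prior:     {x.mean():7.1f} +/- {np.std(x, ddof=1):5.1f} ppm")
print(f"posterior: {x_post:7.1f} +/- {np.sqrt(var_post):5.1f} ppm")
```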

  13. Cooperation Improves Success during Intergroup Competition: An Analysis Using Data from Professional Soccer Tournaments

    OpenAIRE

    David, Gwendolyn Kim; Wilson, Robbie Stuart

    2015-01-01

    The benefit mutually gained by cooperators is considered the ultimate explanation for why cooperation evolved among non-relatives. During intergroup competition, cooperative behaviours within groups that provide a competitive edge over their opposition should be favoured by selection, particularly in lethal human warfare. Aside from forming larger groups, three other ways that individuals within a group can cooperate to improve their chances of gaining a mutual benefit are: (i) greater networ...

  14. Analysis of Server Log by Web Usage Mining for Website Improvement

    OpenAIRE

    Navin Kumar Tyagi; A. K. Solanki; Manoj Wadhwa

    2010-01-01

    Web server logs store clickstream data which can be useful for mining purposes. The data are stored as a result of users' access to a website. Web usage mining, an application of data mining, can be used to discover user access patterns from web log data. The obtained results are used in different applications such as site modification, business intelligence, system improvement and personalization. In this study, we have analyzed the log files of the smart sync software web server to get information...

  15. Genetic analysis, genetic improvement and evaluation of induced semi-dwarf mutants in wheat

    International Nuclear Information System (INIS)

    Recent results from breeding studies in T. aestivum wheats indicate that improved high yielding recombinants that carry the reduced height gene Rht13 from the semi-dwarf mutant Magnif 41 M1 in combination with Rht2 have been isolated. These improved lines should be useful in further breeding. In genetic analyses, additional data have confirmed that the reduced height gene Rht12 from the mutant Karcag 522M7K is strongly dominant, while typical epistatic, partially additive interactions may occur with other Rht genes and recombinations with different Rht or reduced height alleles can produce taller or shorter derivatives. Thus, the degree of dominance or recessiveness of Rht genes appears to be a continuum, with their expression in crosses further modified by epistatic interactions with other Rht alleles. Mutant Burt M860 was found to carry a new mutant gene Rht20 that is partially dominant for reduced height. The reduced height gene Rht11 of Bezostaja dwarf mutant Karlik-1 was largely recessive in the four combinations studied. In T. turgidum durum, the partially dominant Rht14 gene of 'Castelporziano' showed independent inheritance from Rht1. The inheritance of two other partially dominant induced mutant genes, respectively Rht16 of Edmore SD1 and Rht18, of 'Icaro' (from E.N.E.A., Italy) differed from Rht1 and Rht14. The Rht15 locus of 'Durox' showed less dominance than Rht14, and the two genes were independently inherited. Significant new useful genetic variation for breeding improved semi-dwarf bread and durum wheat cultivars has been induced. These mutants offer breeders greater freedom in choosing Rht genes and combinations for cross-breeding to control straw height and lodging and to improve harvest index. (author). 17 refs, 15 figs, 2 tabs

  16. Analysis of modernization of tire recycling machine for improvement of environmental sustainability and feasibility

    OpenAIRE

    Samarskiy, Boris

    2014-01-01

    The main idea of this thesis is twofold: first of all to develop utilization processes for used tires and, second, to study and explain the serious ecological problems in the tire recycling and waste utilization sector in Russia. This thesis was commissioned by a recycling firm called Istra Ecologia Company. The thesis presents improvements in a tire recycling machine owned by this company. The owner of the plant has developed a modernized version of the system, and seems to have solved som...

  17. SECURITY ANALYSIS AND IMPROVEMENT OF A NEW THRESHOLD MULTI-PROXY MULTI-SIGNATURE SCHEME

    Institute of Scientific and Technical Information of China (English)

    Lu Rongbo; He Dake; Wang Changji

    2008-01-01

    Kang et al. [Journal of Electronics (China), 23(2006)4] proposed a threshold multi-proxy multi-signature scheme, and claimed the scheme satisfies the security requirements of threshold multi-proxy multi-signature. In this paper, however, two forgery attacks are proposed to show that their scheme has serious security flaws. To overcome these flaws, an improvement on Kang et al.'s scheme is proposed.

  18. Does Corporate Social Responsibility Improve Financial Performance of Nigerian Firms? Empirical Evidence from Triangulation Analysis

    OpenAIRE

    Fasanya Olaleke Ismail; Onakoya Adegbemi

    2013-01-01

    This paper examines the impact of Corporate Social Responsibility (CSR) on the financial performance of firms in Nigeria. The study utilizes primary data obtained through structured questionnaires. The questions were structured to gather pertinent and specific information on how effectively CSR has improved the financial viability of firms in Nigeria. This paper employs both descriptive and quantitative techniques in which chi...

  19. Improved neutron-gamma discrimination for a 6Li-glass neutron detector using digital signal analysis methods

    International Nuclear Information System (INIS)

    A 6Li-glass scintillator (GS20) based neutron Anger camera was developed for time-of-flight single-crystal diffraction instruments at Spallation Neutron Source. Traditional Pulse-Height Analysis (PHA) for Neutron-Gamma Discrimination (NGD) resulted in the neutron-gamma efficiency ratio (defined as NGD ratio) on the order of 10⁴. The NGD ratios of Anger cameras need to be improved for broader applications including neutron reflectometers. For this purpose, six digital signal analysis methods of individual waveforms acquired from photomultiplier tubes were proposed using (i) charge integration, (ii) pulse-amplitude histograms, (iii) power spectrum analysis combined with the maximum pulse-amplitude, (iv) two event parameters (a1, b0) obtained from a Wiener filter, (v) an effective amplitude (m) obtained from an adaptive least-mean-square filter, and (vi) a cross-correlation coefficient between individual and reference waveforms. The NGD ratios are about 70 times those from the traditional PHA method. Our results indicate the NGD capabilities of neutron Anger cameras based on GS20 scintillators can be significantly improved with digital signal analysis methods

  2. The Fuel Efficiency of Maritime Transport. Potential for improvement and analysis of barriers

    Energy Technology Data Exchange (ETDEWEB)

    Faber, J.; Nelissen, D.; Smit, M. [CE Delft, Delft (Netherlands); Behrends, B. [Marena Ltd., s.l. (United Kingdom); Lee, D.S. [Manchester Metropolitan University, Manchester (United Kingdom)

    2012-02-15

    There is significant potential to improve the fuel efficiency of ships and thus contribute to reducing greenhouse gas emissions from maritime transport. It has long been recognised that this potential is not being fully exploited, owing to the existence of non-market barriers. This report analyses the barriers to implementing fuel efficiency improvements, and concludes that the most important of these are the split incentive between ship owners and operators, a lack of trusted data on new technologies, and transaction costs associated with evaluating measures. As a result, in practice about a quarter of the cost-effective abatement potential is unavailable. There are several ways to overcome these barriers. The split incentive can - to some extent - be overcome by providing more detailed information on the fuel efficiency of vessels, making due allowance for operational profiles. This would allow fuel consumption to be more accurately projected and a larger share of efficiency benefits to accrue to ship owners, thus increasing the return on investment in fuel-saving technologies. This would also require changes to standard charter parties. The credibility of information on new technologies can be improved through intensive collaboration between suppliers of new technologies and shipping companies. In order to overcome risk, government subsidies could provide an incentive. This could have the additional benefit that governments could require publication of results.

  3. Energy analysis and improvement potential of finned double-pass solar collector

    International Nuclear Information System (INIS)

    Highlights: • A steady-state model predicting the thermal performance of double-pass solar collectors is presented. • The main objective of this paper is to analyze the energy and exergy of a finned double-pass solar collector. • A new mathematical model, solution procedure, and test results are presented. • The thermal performance and improvement potential of the double-pass solar collectors are discussed. - Abstract: Steady-state energy balance equations for the finned double-pass solar collector have been developed. These equations were solved using the matrix inversion method. The predicted results were in agreement with the results obtained from the experiments. Predictions and experiments covered mass flow rates between 0.03 kg/s and 0.1 kg/s and solar radiation between 400 W/m² and 800 W/m². The effects of mass flow rate and solar radiation level on energy efficiency, exergy efficiency and the improvement potential have been observed. The optimum energy efficiency is approximately 77%, observed at a mass flow rate of 0.09 kg/s. The optical efficiency of the finned double-pass solar collector is approximately 70–80%. The exergy efficiency is approximately 15–28%, with an improvement potential of 740–1070 W for solar radiation of 425–790 W/m².
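
    The matrix-inversion step mentioned above amounts to solving a small linear system for the unknown component temperatures. A minimal sketch, with entirely hypothetical coefficients standing in for the collector's radiative and convective terms:

    ```python
    import numpy as np

    # Hypothetical 3-node steady-state balance A @ T = b for (glass cover,
    # finned absorber plate, air stream); real coefficients would come from
    # the collector's radiative and convective heat-transfer terms.
    A = np.array([[12.0, -4.0, -3.0],
                  [-4.0, 18.0, -9.0],
                  [-3.0, -9.0, 20.0]])
    b = np.array([3200.0, 5600.0, 2400.0])  # absorbed solar and inlet terms

    T = np.linalg.solve(A, b)  # equivalent to inv(A) @ b, but better conditioned
    print("Component temperatures:", T)
    ```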

  4. Performance Analysis Of An Improved Graded Precision Localization Algorithm For Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sanat Sarangi

    2010-07-01

    Full Text Available In this paper an improved version of the graded precision localization algorithm GRADELOC, calledIGRADELOC is proposed. The performance of GRADELOC is dependent on the regions formed by theoverlapping radio ranges of the nodes of the underlying sensor network. A different region pattern couldsignificantly alter the nature and precision of localization. In IGRADELOC, two improvements aresuggested. Firstly, modifications are proposed in the radio range of the fixed-grid nodes, keeping in mindthe actual radio range of commonly available nodes, to allow for routing through them. Routing is notaddressed by GRADELOC, but is of prime importance to the deployment of any adhoc network,especially sensor networks. A theoretical model expressing the radio range in terms of the celldimensions of the grid infrastructure is proposed, to help in carrying out a deployment plan whichachieves the desirable precision of coarse-grained localization. Secondly, in GRADELOC it is observedthat fine-grained localization does not achieve significant performance benefits over coarse-grainedlocalization. In IGRADELOC, this factor is addressed with the introduction of a parameter that could beused to improve and fine-tune the precision of fine-grained localization..

  5. Analysis and improvement of the return loss performance in HTS filter subsystem

    International Nuclear Information System (INIS)

    Highlights: • Both an 8-pole HTS filter and a cryogenic LNA with high performance are fabricated. • The degradation of the return loss performance in the HTS filter subsystem is addressed. • We discuss and analyze the phase-matching effect between the HTS filter and the LNA. • The phase of the LNA’s S11a can change with its DC operating point. • By applying the “phase matching” process, the return loss has been improved to −17.5 dB easily. -- Abstract: In this article, both an 8-pole High Temperature Superconductor (HTS) filter and a cryogenic low noise amplifier (LNA) with high performance are fabricated. The HTS filter subsystem is realized by installing the HTS filter in front of the cryogenic LNA. Although the HTS filter has been fine tuned specially, the return loss performance of the HTS filter subsystem is often deteriorated due to the mismatch with the LNA. In order to improve the return loss performance of the HTS filter subsystem, we discuss and analyze the phase-matching effect between the HTS filter and the LNA. The article also proposes that tuning of the cryogenic LNA’s phase by adjusting its drain current is a helpful and relatively easy method to improve the HTS filter subsystem’s return loss performance in cryogenic environments
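
    The phase-matching effect can be illustrated with the standard two-port termination formula: for a filter with scattering parameters S11, S12, S21, S22 terminated by an LNA with input reflection Gamma_L, the reflection seen at the subsystem port is Gamma_in = S11 + S12*S21*Gamma_L / (1 - S22*Gamma_L), so rotating the phase of Gamma_L (for instance via the drain current) changes the composite return loss. A sketch with made-up filter numbers:

    ```python
    import numpy as np

    # Hypothetical in-band S-parameters of the tuned HTS filter.
    S11, S12, S21, S22 = 0.10 + 0.05j, 0.99, 0.99, 0.08 - 0.04j
    gamma_mag = 0.15  # assumed |S11a| of the cryogenic LNA

    phases = np.linspace(0.0, 2.0 * np.pi, 361)
    gamma_L = gamma_mag * np.exp(1j * phases)

    # Input reflection of the filter + LNA cascade.
    gamma_in = S11 + S12 * S21 * gamma_L / (1 - S22 * gamma_L)
    return_loss_db = -20.0 * np.log10(np.abs(gamma_in))

    best = np.degrees(phases[np.argmax(return_loss_db)])
    print(f"best return loss {return_loss_db.max():.1f} dB at LNA phase {best:.0f} deg")
    ```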

  6. Analysis of explosion in enclosure based on improved method of images

    Science.gov (United States)

    Wu, Z.; Guo, J.; Yao, X.; Chen, G.; Zhu, X.

    2016-05-01

    The aim of this paper is to present an improved method to calculate the pressure loading on walls during a confined explosion. When an explosion occurs inside an enclosure, reflected shock waves produce multiple pressure peaks at a given wall location, especially at the corners. The effects of confined blast loading may bring about more serious damage to the structure due to multiple shock reflections. An approach using the method of images (MOI), first proposed by Chan to describe the track of shock waves based on mirror-reflection theory, is adopted to simplify internal explosion loading calculations. An improved method of images is proposed that takes into account wall openings and oblique reflections that cannot be considered with the standard MOI. The approach, validated using experimental data, provides a simplified and quick approach for loading calculation of a confined explosion. The results show that the peak overpressure tends to decline as the measurement point moves away from the center, and increases sharply as it approaches the enclosure corners. The specific impulse increases from the center to the corners. The improved method is capable of predicting pressure-time history and impulse with an accuracy comparable to that of three-dimensional AUTODYN code predictions.
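
    For reference, the standard method of images that the improved approach extends can be sketched as follows for a rectangular enclosure: mirror images of the source are placed at reflected positions, and each contributes a delayed, distance-attenuated peak at the measurement point. The propagation speed and the 1/r decay below are placeholders, not the paper's blast model.

    ```python
    import numpy as np

    def image_sources(src, room, order):
        """2-D mirror-image positions of a source in a rectangular room.

        src = (x0, y0), room = (Lx, Ly); standard construction places
        images at (2*n*Lx +/- x0, 2*m*Ly +/- y0) up to the given order.
        """
        x0, y0 = src
        Lx, Ly = room
        return [(2 * n * Lx + sx * x0, 2 * m * Ly + sy * y0)
                for n in range(-order, order + 1)
                for m in range(-order, order + 1)
                for sx in (1, -1) for sy in (1, -1)]

    def arrivals(src, rcv, room, order, c=340.0):
        """(delay, relative peak) for each image path, earliest first."""
        out = []
        for xi, yi in image_sources(src, room, order):
            dist = np.hypot(xi - rcv[0], yi - rcv[1])
            out.append((dist / c, 1.0 / max(dist, 1e-6)))
        return sorted(out)

    for delay, peak in arrivals((1.0, 1.5), (3.0, 2.0), (4.0, 3.0), order=1)[:5]:
        print(f"t = {delay * 1e3:6.2f} ms, relative peak = {peak:.3f}")
    ```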

  7. Novel approaches to the analysis of nuclear and other radioactive materials - Improving detection capability through alpha-gamma coincidence, alpha-induced optical fluorescence and advanced spectrum analysis

    OpenAIRE

    Ihantola, Sakari

    2013-01-01

    Nuclear and other radioactive materials pose a special concern in the proliferation of nuclear weapons, reactor accidents or through criminal acts. To prevent the adverse effects of the use of these materials, novel approaches for their detection and analysis are required. The objective of the research in this thesis was to improve the detection and characterisation of nuclear and other radioactive materials with radiometric methods. Radioactive sources can be detected and identified base...

  8. Improvement of Information and Methodical Provision of Macro-economic Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Tiurina Dina M.

    2014-02-01

    The article generalises and analyses the main shortcomings of the modern system of macro-statistical analysis based on the use of the system of national accounts and the balance of the national economy. On the basis of a historical analysis of how the indicators of the system of national accounts were formed, the article shows that the problems with its practical use have both regional and global causes. To overcome the system's inability to account for quality of life, the article offers a system of quality indicators based on a general perception of wellbeing as the population's confidence in its own solvency, assessed through representative sampling of economic subjects.

  9. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  10. 'As-Built' site response analysis for nuclear power plants - an improvement to the 'State-of-the-Practice'

    International Nuclear Information System (INIS)

    The study presents a methodology to incorporate the results of a Probabilistic Seismic Hazard Analysis (PSHA) into a significantly improved site amplification analysis procedure. This technique is known as the Domain Reduction Method (DRM), a Finite Element Method (FEM) based approach that is capable of preserving the three-dimensional nature of seismic waves that originate from an earthquake. The DRM also furnishes a more accurate representation of the soil conditions and of the seismic input when performing the SSI (Soil Structure Interaction), SSSI (Soil Structure Soil Interaction), and StSI analyses in a single step. As such, the procedure reduces the levels of epistemic uncertainty in the site analysis, eliminates numerous requirements of information exchange between analysts, and eliminates controversies surrounding the approaches to develop foundation input ground motion. This study also shows how the DRM methodology can be incorporated into existing FEM computer codes, making it available for industry use.

  11. Air source absorption heat pump in district heating: Applicability analysis and improvement options

    International Nuclear Information System (INIS)

    Highlights: • Applicability of air source absorption heat pump (ASAHP) district heating is studied. • Return temperature and energy saving rate (ESR) in various conditions are optimized. • ASAHP is more suitable for shorter distance or lower temperature district heating. • Two options can reduce the primary return temperature and improve the applicability. • The maximum ESR is improved from 13.6% to 20.4–25.6% by compression-assisted ASAHP. - Abstract: The low-temperature district heating system based on the air source absorption heat pump (ASAHP) was assessed to have great energy saving potential. However, this system may require smaller temperature drop leading to higher pump consumption for long-distance distribution. Therefore, the applicability of ASAHP-based district heating system is analyzed for different primary return temperatures, pipeline distances, pipeline resistances, supplied water temperatures, application regions, and working fluids. The energy saving rate (ESR) under different conditions are calculated, considering both the ASAHP efficiency and the distribution consumption. Results show that ASAHP system is more suitable for short-distance district heating, while for longer-distance heating, lower supplied hot water temperature is preferred. In addition, the advantages of NH3/H2O are inferior to those of NH3/LiNO3, and the advantages for warmer regions and lower pipeline resistance are more obvious. The primary return temperatures are optimized to obtain maximum ESRs, after which the suitable distances under different acceptable ESRs are summarized. To improve the applicability of ASAHP, the integration of cascaded heat exchanger (CHX) and compression-assisted ASAHP (CASAHP) are proposed, which can reduce the primary return temperature. The integration of CHX can effectively improve the applicability of ASAHP under higher supplied water temperatures. As for the utilization of CASAHP, higher compression ratio (CR) is better in longer

  12. Exergy analysis: An efficient tool for understanding and improving hydrogen production via the steam methane reforming process

    International Nuclear Information System (INIS)

    Exergy analysis has been shown to be an efficient tool for understanding and improving industrial processes. In the present study, exergy analysis has been used to examine the energy consumption of an existing Steam Methane Reforming (SMR) process and then to test for possible savings in primary energy consumption and environmental protection. In the first step, energy and exergy balances of a steam methane reforming process were established to identify the thermodynamic imperfections of the process. Recommendations from this study have contributed to the building of a new and more efficient process. Consequently, a heat exchanger, corresponding to 44.9% of the total required area for the SMR heat exchange, has been incorporated in the SMR for waste heat recovery. The thermal and exergetic efficiencies of the original process are 70% and 65.5%, respectively. For the new process, the thermal and exergetic efficiencies are 74% and 69.1%, respectively. The unused exergy is reduced by 9.3% from 125.9 to 114.2 kJ per mole of H2 produced. One mole of methane produces 2.48 mol of H2 compared to 2.35 mol of H2 produced in the original process. Furthermore, the new SMR process produces lower greenhouse gas emissions. - Highlights: ► Exergy analysis is used for evaluating a steam methane reforming process and for guiding efficiency-improvement efforts. ► The main part of the process's destroyed exergy occurs in the chemical reactors. ► To improve the exergetic efficiency the system components should be improved and/or the exhaust exergy should be decreased. ► Heat recovery not only helps to save energy but also decreases the environmental impact.
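
    The reported figures can be checked directly; for instance, the quoted 9.3% reduction in unused exergy follows from the stated values:

    ```python
    unused_original = 125.9  # kJ per mole of H2, original SMR process
    unused_new = 114.2       # kJ per mole of H2, improved process

    reduction = (unused_original - unused_new) / unused_original
    print(f"unused exergy reduced by {reduction:.1%}")  # -> 9.3%
    ```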

  13. Improved elucidation of biological processes linked to diabetic nephropathy by single probe-based microarray data analysis.

    Directory of Open Access Journals (Sweden)

    Clemens D Cohen

    BACKGROUND: Diabetic nephropathy (DN) is a complex and chronic metabolic disease that evolves into a progressive fibrosing renal disorder. Effective transcriptomic profiling of slowly evolving disease processes such as DN can be problematic. The changes that occur are often subtle and can escape detection by conventional oligonucleotide DNA array analyses. METHODOLOGY/PRINCIPAL FINDINGS: We examined microdissected human renal tissue with or without DN using Affymetrix oligonucleotide microarrays (HG-U133A) by standard Robust Multi-array Analysis (RMA). Subsequent gene ontology analysis by the Database for Annotation, Visualization and Integrated Discovery (DAVID) showed limited detection of biological processes previously identified as central mechanisms in the development of DN (e.g. inflammation and angiogenesis). This apparent lack of sensitivity may be associated with the gene-oriented averaging of oligonucleotide probe signals, as this includes signals from cross-hybridizing probes and gene annotation that is based on out-of-date genomic data. We then examined the same CEL file data using a different methodology to determine how well it could correlate transcriptomic data with observed biology. ChipInspector (CI) is based on single probe analysis and de novo gene annotation that bypasses probe set definitions. Both methods, RMA and CI, used at default settings yielded comparable numbers of differentially regulated genes. However, when verified by RT-PCR, the single probe based analysis demonstrated reduced background noise with enhanced sensitivity and fewer false positives. CONCLUSIONS/SIGNIFICANCE: Using a single probe based analysis approach with de novo gene annotation allowed an improved representation of the biological processes linked to the development and progression of DN. The improved analysis was exemplified by the detection of Wnt signaling pathway activation in DN, a process not previously reported to be involved in this disease.

  14. Economic viewpoints in educational effectiveness : Cost-effectiveness analysis of an educational improvement project

    NARCIS (Netherlands)

    Creemers, B; van der Werf, G

    2000-01-01

    Cost-effectiveness analysis is not only important for decision making in educational policy and practice. Also within educational effectiveness research it is important to establish the costs of educational processes in relationship to their effects. The integrated multilevel educational effectivene

  15. Production yield analysis - a new systematic method for improvement of raw material yield

    NARCIS (Netherlands)

    Somsen, D.J.; Capelle, A.; Tramper, J.

    2004-01-01

    Production Yield Analysis (PYA) is a structured system approach to optimize the production yield of production processes. The paper outlines the developed method and the 10 basic steps of the PYA. The PYA-method makes it possible to calculate the Yield Index of a process. This dimensionless figure c

  16. Improved cosmological constraints from a joint analysis of the SDSS-II and SNLS supernova samples

    CERN Document Server

    Betoule, M; Guy, J; Mosher, J; Hardin, D; Biswas, R; Astier, P; El-Hage, P; Konig, M; Kuhlmann, S; Marriner, J; Pain, R; Regnault, N; Balland, C; Bassett, B A; Brown, P J; Campbell, H; Carlberg, R G; Cellier-Holzem, F; Cinabro, D; Conley, A; D'Andrea, C B; DePoy, D L; Doi, M; Ellis, R S; Fabbro, S; Filippenko, A V; Foley, R J; Frieman, J A; Fouchez, D; Galbany, L; Goobar, A; Gupta, R R; Hill, G J; Hlozek, R; Hogan, C J; Hook, I M; Howell, D A; Jha, S W; Guillou, L Le; Leloudas, G; Lidman, C; Marshall, J L; Möller, A; Mourão, A M; Neveu, J; Nichol, R; Olmstead, M D; Palanque-Delabrouille, N; Perlmutter, S; Prieto, J L; Pritchet, C J; Richmond, M; Riess, A G; Ruhlmann-Kleider, V; Sako, M; Schahmaneche, K; Schneider, D P; Smith, M; Sollerman, J; Sullivan, M; Walton, N A; Wheeler, C J

    2014-01-01

    We present cosmological constraints from a joint analysis of type Ia supernova (SN Ia) observations obtained by the SDSS-II and SNLS collaborations. The data set includes several low-redshift samples (z<0.1), all 3 seasons from the SDSS-II (0.05 < z < 0.4), and 3 years from SNLS (0.2

  17. Improving Student Critical Thinking and Perceptions of Critical Thinking through Direct Instruction in Rhetorical Analysis

    Science.gov (United States)

    McGuire, Lauren A.

    2010-01-01

    This study investigated the effect of direct instruction in rhetorical analysis on students' critical thinking abilities, including knowledge, skills, and dispositions. The researcher investigated student perceptions of the effectiveness of argument mapping; Thinker's Guides, based on Paul's model of critical thinking; and Socratic questioning.…

  18. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    Science.gov (United States)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first 2015 round of the Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal based on Z-score distributions showed that most results obtained |Z-scores| ≤ 3.
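
    A minimal sketch of the Z-score evaluation implied above, assuming the usual proficiency-test definition z = (x_lab - x_assigned) / sigma_p and the conventional cutoffs (|z| <= 2 satisfactory, |z| >= 3 unsatisfactory); all numbers are hypothetical:

    ```python
    def z_score(x_lab, x_assigned, sigma_p):
        """Proficiency-test Z-score of a lab result against the assigned value."""
        return (x_lab - x_assigned) / sigma_p

    # Hypothetical soil results: element -> (lab value, assigned value, sigma_p)
    results = {"As": (12.1, 11.5, 0.9), "Co": (8.7, 9.0, 0.6), "Zn": (101.0, 95.0, 7.0)}

    for element, (x, xa, sp) in results.items():
        z = z_score(x, xa, sp)
        verdict = ("satisfactory" if abs(z) <= 2
                   else "questionable" if abs(z) < 3 else "unsatisfactory")
        print(f"{element}: z = {z:+.2f} ({verdict})")
    ```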

  19. Structure analysis of interstellar clouds - I. Improving the Delta-variance method

    NARCIS (Netherlands)

    Ossenkopf, V.; Krips, M.; Stutzki, J.

    2008-01-01

    Context. The Delta-variance analysis, introduced as a wavelet-based measure for the statistical scaling of structures in astronomical maps, has proven to be an efficient and accurate method of characterising the power spectrum of interstellar turbulence. It has been applied to observed molecular clo

  20. Improved method for fibre content and quality analysis and their application to flax genetic diversity investigations

    NARCIS (Netherlands)

    Oever, van den M.J.A.; Bas, N.; Soest, van L.J.M.; Melis, C.; Dam, van J.E.G.

    2003-01-01

    Evaluation for fibre content and quality in a breeding selection program is time consuming and costly. Therefore, this study aims to develop a method for fast and reproducible fibre content analysis on small flax straw samples. A protocol has been developed and verified with fibre screening methods

  1. Using trajectory sensitivity analysis to find suitable locations of series compensators for improving rotor angle stability

    DEFF Research Database (Denmark)

    Nasri, Amin; Eriksson, Robert; Ghandhar, Mehrdad

    2014-01-01

    This paper proposes an approach based on trajectory sensitivity analysis (TSA) to find most suitable placement of series compensators in the power system. The main objective is to maximize the benefit of these devices in order to enhance the rotor angle stability. This approach is formulated as a......-machine 39-bus test system demonstrate the usefulness of the proposed method....
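
    As a rough illustration of trajectory sensitivity analysis (not the authors' formulation), the sensitivity of a rotor-angle trajectory to a parameter can be approximated by finite differences between two perturbed simulations; here a hypothetical series-compensation level k scales the electrical power term of a one-machine swing equation.

    ```python
    import numpy as np

    def swing_trajectory(k, t_end=5.0, dt=1e-3):
        """Rotor angle delta(t) of a single-machine-infinite-bus swing equation:
        M * delta'' = Pm - (1 + k) * Pmax * sin(delta) - D * delta',
        where k mimics series compensation raising the transfer capability."""
        M, D, Pm, Pmax = 0.2, 0.05, 0.8, 1.2
        delta = np.arcsin(Pm / Pmax)  # start at the uncompensated equilibrium
        omega = 0.0
        out = []
        for _ in range(int(t_end / dt)):
            Pe = (1 + k) * Pmax * np.sin(delta)
            omega += dt * (Pm - Pe - D * omega) / M
            delta += dt * omega
            out.append(delta)
        return np.array(out)

    # Central finite-difference trajectory sensitivity d(delta)/dk around k = 0.1.
    eps = 1e-4
    sens = (swing_trajectory(0.1 + eps) - swing_trajectory(0.1 - eps)) / (2 * eps)
    print("peak |d delta / dk| =", np.abs(sens).max())
    ```

    Placements that yield the largest trajectory sensitivity are the natural candidates for the compensator, which is the intuition behind TSA-based siting.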

  2. Improving Treatment Plan Implementation in Schools: A Meta-Analysis of Single Subject Design Studies

    Science.gov (United States)

    Noell, George H.; Gansle, Kristin A.; Mevers, Joanna Lomas; Knox, R. Maria; Mintz, Joslyn Cynkus; Dahir, Amanda

    2014-01-01

    Twenty-nine peer-reviewed journal articles that analyzed intervention implementation in schools using single-case experimental designs were meta-analyzed. These studies reported 171 separate data paths and provided 3,991 data points. The meta-analysis was accomplished by fitting data extracted from graphs in mixed linear growth models. This…

  3. Meta-Analysis as a Choice to Improve Research in Career and Technical Education

    Science.gov (United States)

    Gordon, Howard R. D.; McClain, Clifford R.; Kim, Yeonsoo; Maldonado, Cecilia

    2010-01-01

    A search of the ERIC and Academic Search Premier data bases, and a comprehensive review of literature suggest that meta-analysis is ignored by career and technical education (CTE) researchers, a situation that is regrettable but remediable. The purpose of this theoretical paper is to provide CTE researchers and consumers with procedures for…

  4. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    Science.gov (United States)

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  5. Costs and benefits of automotive fuel economy improvement: A partial analysis

    Energy Technology Data Exchange (ETDEWEB)

    Greene, D.L. [Oak Ridge National Lab., TN (United States); Duleep, K.G. [Energy and Environmental Analysis, Inc., Arlington, VA (United States)

    1992-03-01

    This paper is an exercise in estimating the costs and benefits of technology-based fuel economy improvements for automobiles and light trucks. Benefits quantified include vehicle costs, fuel savings, consumer's surplus effects, the effect of reduced weight on vehicle safety, impacts on emissions of CO₂ and criteria pollutants, world oil market and energy security benefits, and the transfer of wealth from US consumers to oil producers. A vehicle stock model is used to capture sales, scrappage, and vehicle use effects under three fuel price scenarios. Three alternative fuel economy levels for 2001 are considered, ranging from 32.9 to 36.5 MPG for cars and 24.2 to 27.5 MPG for light trucks. Fuel economy improvements of this size are probably cost-effective. The size of the benefit, and whether there is a benefit, strongly depends on the financial costs of fuel economy improvement and judgments about the values of energy security, emissions, safety, etc. Three sets of values for eight parameters are used to define the sensitivity of costs and benefits to key assumptions. The net present social value (1989$) of costs and benefits ranges from a cost of $11 billion to a benefit of $286 billion, the critical parameters being the discount rate (10% vs. 3%) and the values attached to externalities. The two largest components are always the direct vehicle costs and fuel savings, but these tend to counterbalance each other for the fuel economy levels examined here. Other components are the wealth transfer, oil cost savings, CO₂ emissions reductions, and energy security benefits. Safety impacts, emissions of criteria pollutants, and consumer's surplus effects are relatively minor components. The critical issues for automotive fuel economy are therefore: (1) the value of present versus future costs and benefits, (2) the values of external costs and benefits, and (3) the financially cost-effective level of MPG achievable by available technology. 53 refs.

  7. Enhancement on "Security analysis and improvements of arbitrated quantum signature schemes"

    CERN Document Server

    Hwang, Tzonelih; Chong, Song-Kong

    2011-01-01

    Recently, Zou et al. [Phys. Rev. A 82, 042325 (2010)] demonstrated that two arbitrated quantum signature (AQS) schemes are not secure, because an arbitrator cannot arbitrate the dispute between two users when a receiver repudiates the integrity of a signature. By using a public board, Zou et al. proposed two AQS schemes to solve the problem. This work shows that the same security problem may exist in Zou et al.'s schemes and also that a malicious party can reveal the other party's secret key without being detected by using Trojan-horse attacks. Accordingly, an improved scheme is proposed to resolve the problems.

  8. Analysis and Improvement of Aerodynamic Performance of Straight Bladed Vertical Axis Wind Turbines

    Science.gov (United States)

    Ahmadi-Baloutaki, Mojtaba

    Vertical axis wind turbines (VAWTs) with straight blades are attractive for their relatively simple structure and aerodynamic performance. Their commercialization, however, still encounters many challenges. A series of studies were conducted in the current research to improve VAWT design and enhance aerodynamic performance. First, an efficient design methodology built on an existing analytical approach is presented to formulate the design parameters influencing straight-bladed VAWT (SB-VAWT) aerodynamic performance and determine the optimal range of these parameters for prototype construction. This work was followed by a series of studies to collectively investigate the role of external turbulence in SB-VAWT operation. External free-stream turbulence is known as one of the most important factors influencing VAWTs, since this type of turbine is mainly considered for urban applications where wind turbulence is of great significance. Initially, two sets of wind tunnel tests were conducted to study the variation of aerodynamic performance of an SB-VAWT blade under turbulent flows, in two major stationary configurations, namely two- and three-dimensional flows. Turbulent flows generated in the wind tunnel were quasi-isotropic, having uniform mean flow profiles free of any wind shear effects. Aerodynamic force measurements demonstrated that the free-stream turbulence improves the blade aerodynamic performance in stall and post-stall regions by delaying the stall and increasing the lift-to-drag ratio. After these studies, an SB-VAWT model was tested in the wind tunnel under the same type of turbulent flows. The turbine power output was substantially increased in the presence of the grid turbulence at the same wind speeds, while the increase in turbine power coefficient due to the effect of grid turbulence was small at the same tip speed ratios. The final section presents an experimental study on the aerodynamic interaction of VAWTs in arrays.

  9. Analysis of Server Log by Web Usage Mining for Website Improvement

    Directory of Open Access Journals (Sweden)

    Navin Kumar Tyagi

    2010-07-01

    Web server logs store clickstream data which can be useful for mining purposes. The data are stored as a result of users' accesses to a website. Web usage mining, an application of data mining, can be used to discover user access patterns from weblog data. The obtained results are used in different applications such as site modification, business intelligence, system improvement and personalization. In this study, we have analyzed the log files of the smart sync software web server to get information about visitors and top errors, which can be utilized by system administrators and web designers to increase the effectiveness of the web site.
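
    A minimal sketch of this kind of log analysis, assuming the Apache/NCSA Common Log Format for the weblog; it tallies visitor IPs and the most frequent error responses (status >= 400):

    ```python
    import re
    from collections import Counter

    # Apache/NCSA Common Log Format (assumed layout of the weblog).
    LOG_RE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                        r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)')

    def analyze(lines):
        visitors, errors = Counter(), Counter()
        for line in lines:
            m = LOG_RE.match(line)
            if not m:
                continue  # skip malformed entries
            visitors[m["ip"]] += 1
            if int(m["status"]) >= 400:
                errors[(m["status"], m["request"])] += 1
        return visitors.most_common(5), errors.most_common(5)

    sample = [
        '10.0.0.1 - - [01/Jul/2010:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1024',
        '10.0.0.2 - - [01/Jul/2010:10:00:05 +0000] "GET /missing.png HTTP/1.1" 404 312',
    ]
    print(analyze(sample))
    ```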

  10. How to improve mental health competency in general practice training?--a SWOT analysis.

    Science.gov (United States)

    van Marwijk, Harm

    2004-06-01

    It is quite evident there is room for improvement in the primary care management of common mental health problems. Patients respond positively when GPs adopt a more proactive role in this respect. The Dutch general practice curriculum is currently being renewed. The topics discussed here include the Strengths, Weaknesses, Opportunities and Threats (SWOT) of present primary mental healthcare teaching. What works well and what needs improving? Integrated teaching packages are needed to help general practice trainees manage various presentations of psychological distress. Such packages comprise training videotapes, in which models such as problem-solving treatment (PST) are demonstrated, as well as roleplaying material for new skills, self-report questionnaires for patients, and small-group video feedback of consultations. While GP trainees can effectively master such skills, it is important to query the level of proficiency required by registrars. Are these skills of use only to connoisseur GPs, or to all? More room for specialisation and differentiation among trainees may be the way forward. We have just developed a new curriculum for the obligatory three-month psychiatry housemanship. It is competency oriented, self-directed and assignment driven. This new curriculum will be evaluated in due course.

  11. Towards improved confinement: Analysis of the radial electric field in LHD

    International Nuclear Information System (INIS)

    The radial electric field (Er) properties in LHD have been investigated to give guidance towards improved confinement with a possible Er transition and bifurcation. The ambipolar Er is calculated from the neoclassical flux on the basis of analytical formulas. This approach is appropriate to clarifying the Er properties over a wide parameter range in a more transparent way. A comparison between calculated Er and the experimental values has shown a qualitatively good agreement, for example, for the threshold density for the transition from the ion root to the electron root. The calculations also reproduce well the experimentally observed tendency that the electron root is possible by increasing temperatures even for higher density and the ion root is enhanced for higher density. On the basis of the usefulness of this approach, calculations over a wide range have been performed to clarify the parameter region where the Er transition and bifurcation are possible. This gives a comprehensive understanding of the parameter region of interest, which is valuable for the promotion of experiments towards improved confinement. (author)

  12. Shock reliability analysis and improvement of MEMS electret-based vibration energy harvesters

    Science.gov (United States)

    Renaud, M.; Fujita, T.; Goedbloed, M.; de Nooijer, C.; van Schaijk, R.

    2015-10-01

    Vibration energy harvesters can serve as a replacement solution to batteries for powering tire pressure monitoring systems (TPMS). Autonomous wireless TPMS powered by microelectromechanical system (MEMS) electret-based vibration energy harvester have been demonstrated. The mechanical reliability of the MEMS harvester still has to be assessed in order to bring the harvester to the requirements of the consumer market. It should survive the mechanical shocks occurring in the tire environment. A testing procedure to quantify the shock resilience of harvesters is described in this article. Our first generation of harvesters has a shock resilience of 400 g, which is far from being sufficient for the targeted application. In order to improve this aspect, the first important aspect is to understand the failure mechanism. Failure is found to occur in the form of fracture of the device’s springs. It results from impacts between the anchors of the springs when the harvester undergoes a shock. The shock resilience of the harvesters can be improved by redirecting these impacts to nonvital parts of the device. With this philosophy in mind, we design three types of shock absorbing structures and test their effect on the shock resilience of our MEMS harvesters. The solution leading to the best results consists of rigid silicon stoppers covered by a layer of Parylene. The shock resilience of the harvesters is brought above 2500 g. Results in the same range are also obtained with flexible silicon bumpers, which are simpler to manufacture.

  13. Computer-aided texture analysis combined with experts' knowledge: Improving endoscopic celiac disease diagnosis

    Science.gov (United States)

    Gadermayr, Michael; Kogler, Hubert; Karla, Maximilian; Merhof, Dorit; Uhl, Andreas; Vécsei, Andreas

    2016-01-01

    AIM To further improve the endoscopic detection of intestinal mucosa alterations due to celiac disease (CD). METHODS We assessed a hybrid approach based on the integration of expert knowledge into the computer-based classification pipeline. A total of 2835 endoscopic images from the duodenum were recorded in 290 children using the modified immersion technique (MIT). These children underwent routine upper endoscopy for suspected CD or non-celiac upper abdominal symptoms between August 2008 and December 2014. Blinded to the clinical data and biopsy results, three medical experts visually classified each image as normal mucosa (Marsh-0) or villous atrophy (Marsh-3). The experts’ decisions were further integrated into state-of-the-art texture recognition systems. Using the biopsy results as the reference standard, the classification accuracies of this hybrid approach were compared to the experts’ diagnoses in 27 different settings. RESULTS Compared to the experts’ diagnoses, in 24 of 27 classification settings (consisting of three imaging modalities, three endoscopists and three classification approaches), the best overall classification accuracies were obtained with the new hybrid approach. In 17 of 24 classification settings, the improvements achieved with the hybrid approach were statistically significant (P < 0.05). CONCLUSION Integrating expert knowledge into the computer-based classification pipeline improved the endoscopic diagnosis of celiac disease compared with purely visual and purely computer-aided diagnosis systems. PMID:27610022

  14. Improved response surface method and its application in stability reliability degree analysis of tunnel surrounding rock

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    An approach to the limit state equation for surrounding rock was put forward based on a deformation criterion. The symmetrical sampling of basic random variables used by the classical response surface method was modified so that the peak value and skewness of the distribution curves of the basic random variables are taken into account. A method of calculating probability moments, based on a modified Rosenblueth method and suitable for non-explicit performance functions, was put forward. The first, second, third and fourth order moments of the performance function value are calculated with the modified Rosenblueth method from the first, second, third and fourth order moments of the basic random variables. A probability density function (PDF) of the performance function is deduced from these four moments; in the new method this PDF replaces the quadratic polynomial used to approximate the real performance function, and the reliability probability is calculated by integrating the PDF over the performance function value. The results show that the improved response surface method can accommodate various statistical distribution types of the basic random variables, and its calculation process is transparent and requires no iterative loops. In addition, the stability probability of the surrounding rock of a tunnel was calculated by the improved method, whose workload is only 30% of that of the classical method, with comparable accuracy.
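
    The Rosenblueth point-estimate idea underlying the modified method can be sketched as follows for uncorrelated, symmetrically distributed variables: the performance function g is evaluated at the 2^K combinations of mean +/- standard deviation, and the raw moments are obtained by averaging. The tunnel displacement function below is purely illustrative.

    ```python
    import itertools
    import numpy as np

    def rosenblueth_moments(g, means, stds, orders=(1, 2, 3, 4)):
        """Classic two-point estimate: evaluate g at all 2^K sign combinations
        of mu_i +/- sigma_i, each with equal weight 1 / 2^K (uncorrelated,
        symmetric variables), and average the powers of the results."""
        k = len(means)
        values = np.array([
            g(*[m + s * sign for m, s, sign in zip(means, stds, signs)])
            for signs in itertools.product((1, -1), repeat=k)])
        return {n: float(np.mean(values ** n)) for n in orders}

    # Illustrative implicit performance function: tunnel crown displacement (mm)
    # from (rock elastic modulus in GPa, in-situ stress in MPa).
    g = lambda E, p0: 120.0 * p0 / E

    m = rosenblueth_moments(g, means=[10.0, 5.0], stds=[2.0, 1.0])
    mean, var = m[1], m[2] - m[1] ** 2
    print(f"mean = {mean:.2f} mm, std = {np.sqrt(var):.2f} mm")
    ```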

  15. Improving breast cancer survival analysis through competition-based multidimensional modeling.

    Directory of Open Access Journals (Sweden)

    Erhan Bilal

    Breast cancer is the most common malignancy in women and is responsible for hundreds of thousands of deaths annually. As with most cancers, it is a heterogeneous disease and different breast cancer subtypes are treated differently. Understanding the difference in prognosis for breast cancer based on its molecular and phenotypic features is one avenue for improving treatment by matching the proper treatment with molecular subtypes of the disease. In this work, we employed a competition-based approach to modeling breast cancer prognosis using large datasets containing genomic and clinical information and an online real-time leaderboard program used to speed feedback to the modeling team and to encourage each modeler to work towards achieving a higher ranked submission. We find that machine learning methods combined with molecular features selected based on expert prior knowledge can improve survival predictions compared to current best-in-class methodologies and that ensemble models trained across multiple user submissions systematically outperform individual models within the ensemble. We also find that model scores are highly consistent across multiple independent evaluations. This study serves as the pilot phase of a much larger competition open to the whole research community, with the goal of understanding general strategies for model optimization using clinical and molecular profiling data and providing an objective, transparent system for assessing prognostic models.

  16. Walk This Way: Improving Pedestrian Agent-Based Models through Scene Activity Analysis

    Directory of Open Access Journals (Sweden)

    Andrew Crooks

    2015-09-01

    Pedestrian movement is woven into the fabric of urban regions. With more people living in cities than ever before, there is an increased need to understand and model how pedestrians utilize and move through space for a variety of applications, ranging from urban planning and architecture to security. Pedestrian modeling has been traditionally faced with the challenge of collecting data to calibrate and validate such models of pedestrian movement. With the increased availability of mobility datasets from video surveillance and enhanced geolocation capabilities in consumer mobile devices we are now presented with the opportunity to change the way we build pedestrian models. Within this paper we explore the potential that such information offers for the improvement of agent-based pedestrian models. We introduce a Scene- and Activity-Aware Agent-Based Model (SA2-ABM), a method for harvesting scene activity information in the form of spatiotemporal trajectories, and incorporate this information into our models. In order to assess and evaluate the improvement offered by such information, we carry out a range of experiments using real-world datasets. We demonstrate that the use of real scene information allows us to better inform our model and enhance its predictive capabilities.

  17. Analysis on Electromagnetic Characteristics of Research Reactor Control Rod Drive Mechanism for Thrust Force Improvement

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Hyung; Choi, Myoung Hwan; Yu, Je Yong; Cho, Yeong Garp; Kim, Jong In [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The control rod drive mechanism (CRDM) is a part of the reactor regulating system (RRS) and is located at the top of the reactor pool or in the room below it. The function of the CRDM is to insert, withdraw or maintain the neutron absorbing material (control rod) at any required position within the reactor core, in order to control the reactivity of the core. There are many kinds of CRDMs, such as the magnetic-jack type, hydraulic type, rack-and-pinion type, chain type, and linear or rotary step motor types. As part of a new project, we are investigating the movable coil electromagnetic drive mechanism (MCEDM), a new scheme for driving the reactor control rod adopted by the China Advanced Research Reactor (CARR). To gain a better knowledge of its electromagnetic and magnetic characteristics, numerical models of the MCEDM are proposed. In particular, in order to achieve improved thrust force, numerical magnetic field calculations for various kinds of magnetic and electromagnetic configurations have been performed. As a result, we present an improved MCEDM design for research reactors.

  18. Design and Analysis of Multi Level D-STATCOM to Improve the Power Quality

    Directory of Open Access Journals (Sweden)

    Dinesh. Badavath,

    2014-01-01

    In the last decade, the electrical power quality issue has been a main concern of power companies. Power quality is defined as an index of how both the delivery and consumption of electric power affect the performance of electrical apparatus. From a customer point of view, a power quality problem can be defined as any problem manifested in voltage, current, or frequency deviations that results in power failure. Progress in power electronics, especially in flexible alternating-current transmission systems (FACTS) and custom power devices, has furthered power quality improvement. This paper presents an investigation of a seven-level Cascaded H-bridge (CHB) inverter as a Distribution Static Compensator (DSTATCOM) in a power system (PS) for compensation of reactive power and harmonics. The advantages of the CHB inverter are low harmonic distortion, a reduced number of switches and suppression of switching losses. The DSTATCOM helps to improve the power factor and eliminate the Total Harmonic Distortion (THD) drawn from a Non-Linear Diode Rectifier Load (NLDRL). The D-Q reference frame theory is used to generate the reference compensating currents for the DSTATCOM, while Proportional and Integral (PI) control is used for capacitor DC voltage regulation. A CHB inverter is considered for shunt compensation of an 11 kV distribution system. Finally, level-shifted PWM (LSPWM) and phase-shifted PWM (PSPWM) techniques are adopted to investigate the performance of the CHB inverter. The results are obtained through the Matlab/Simulink software package.
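
    A minimal sketch of the D-Q (synchronous reference frame) extraction of reference compensating currents, under common simplifying assumptions (balanced supply, grid angle theta from an ideal PLL, a crude mean in place of a low-pass filter): the load currents are transformed to the rotating frame, the DC part of the d-axis component (the fundamental active current) is retained, and the remainder is what the DSTATCOM should inject.

    ```python
    import numpy as np

    def abc_to_dq(ia, ib, ic, theta):
        """Amplitude-invariant Park transform of three-phase quantities."""
        d = (2 / 3) * (ia * np.cos(theta)
                       + ib * np.cos(theta - 2 * np.pi / 3)
                       + ic * np.cos(theta + 2 * np.pi / 3))
        q = -(2 / 3) * (ia * np.sin(theta)
                        + ib * np.sin(theta - 2 * np.pi / 3)
                        + ic * np.sin(theta + 2 * np.pi / 3))
        return d, q

    fs, f1 = 10_000, 50  # sampling rate and fundamental frequency (assumed)
    t = np.arange(0.0, 0.1, 1 / fs)
    th = 2 * np.pi * f1 * t
    # Hypothetical distorted load: fundamental + 5th harmonic + reactive part.
    phase = lambda a: (np.cos(th - a) + 0.2 * np.cos(5 * (th - a))
                       + 0.3 * np.sin(th - a))
    ia, ib, ic = phase(0.0), phase(2 * np.pi / 3), phase(-2 * np.pi / 3)

    d, q = abc_to_dq(ia, ib, ic, th)
    i_d_dc = d.mean()  # stands in for a low-pass filter
    i_comp_d, i_comp_q = d - i_d_dc, q  # reference compensating currents
    print(f"extracted active current = {i_d_dc:.3f} (expected ~1.0)")
    ```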

  19. Some drastic improvements found in the analysis of routing protocol for the Bluetooth technology using scatternet

    CERN Document Server

    Perwej, Yusuf; Jaleel, Uruj; Saxena, Sharad

    2012-01-01

    Bluetooth is a promising wireless technology that enables portable devices to form short-range wireless ad hoc networks. Unlike wireless LAN, the communication of Bluetooth devices follows a strict master-slave relationship; that is, it is not possible for a slave device to communicate directly with another slave device even though they are within radio coverage of each other. For inter-piconet communication, a scatternet has to be formed, in which some Bluetooth devices act as bridge nodes between piconets. The scatternet formed has the following properties: it is connected, i.e., every Bluetooth device can be reached from every other device, and piconet size is limited to eight nodes [1]. The authors of this research paper have studied different types of routing protocols and have made efforts to improve throughput, reduce packet loss due to routing-loop failures and increased mobility, improve the cohesive network structure, resolve topology-change conflicts [2], and a successful &...

  20. An Effective Analysis of Weblog Files to Improve Website Personalization for E-Business

    Directory of Open Access Journals (Sweden)

    Bhavyesh Gandhi

    2013-10-01

    The World Wide Web is a perennial repository of immense information. The web has provided too many options, and web users are overloaded with information. As there is enormous growth in the web in terms of web sites, the size of web usage data is also increasing gradually. This web usage data plays a vital role in the effective management of web sites, and is stored in a file called the weblog by the web server. In order to discover the knowledge required for improving the performance of websites and web personalization, we need to apply the best preprocessing methodology to the server weblog files. Data preprocessing is a phase which automatically identifies meaningful patterns and user behavior. So far, analyzing weblog data has been a challenging task in the area of web usage mining. Web personalization is the most effective approach to overcome the problem of information overload. In this paper we propose an effective and enhanced data preprocessing methodology which produces efficient usage patterns, reduces the size of the weblog and produces a personalized website which helps to improve e-business. The experimental results are also shown in the following chapters.

  1. Benchmark Report on Key Outage Attributes: An Analysis of Outage Improvement Opportunities and Priorities

    Energy Technology Data Exchange (ETDEWEB)

    Germain, Shawn St. [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Farris, Ronald [Idaho National Laboratory (INL), Idaho Falls, ID (United States)

    2014-09-01

    The Advanced Outage Control Center (AOCC) is a multi-year pilot project targeted at Nuclear Power Plant (NPP) outage improvement. The purpose of this pilot project is to improve management of NPP outages through the development of an AOCC that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report documents the results of a benchmarking effort to evaluate the transferability of technologies demonstrated at Idaho National Laboratory and the primary pilot project partner, Palo Verde Nuclear Generating Station. The initial assumption for this pilot project was that NPPs generally do not take advantage of advanced technology to support outage management activities. Several researchers involved in this pilot project have commercial NPP experience and believed that very little technology has been applied towards outage communication and collaboration. To verify that the technology options researched and demonstrated through this pilot project would in fact have broad application for the US commercial nuclear fleet, and to look for additional outage management best practices, LWRS program researchers visited several additional nuclear facilities.

  2. Error analysis to improve the speech recognition accuracy on Telugu language

    Indian Academy of Sciences (India)

    N Usha Rani; P N Girija

    2012-12-01

    Speech is one of the most important communication channels among people, and speech recognition occupies a prominent place in communication between humans and machines. Several factors affect the accuracy of a speech recognition system. Much effort has been invested in increasing this accuracy, yet current speech recognition systems still generate erroneous output. Telugu is one of the most widely spoken south Indian languages. In the proposed Telugu speech recognition system, errors obtained from the decoder are analysed to improve the performance of the system. The static pronunciation dictionary used in the decoder plays a key role in recognition accuracy, and modifying it reduces the number of confusion pairs, which improves the performance of the speech recognition system. Language model scores also vary with this modification. The hit rate increases considerably, and false alarms change as the pronunciation dictionary is modified. Variations are observed in different error measures such as F-measure, error rate and Word Error Rate (WER) when the proposed method is applied.
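
    Word Error Rate, one of the measures mentioned, is the word-level Levenshtein distance between reference and hypothesis divided by the number of reference words; a minimal sketch:

    ```python
    def word_error_rate(reference: str, hypothesis: str) -> float:
        """WER = (substitutions + deletions + insertions) / reference words."""
        ref, hyp = reference.split(), hypothesis.split()
        # Dynamic-programming edit distance over words.
        dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            dp[i][0] = i
        for j in range(len(hyp) + 1):
            dp[0][j] = j
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                               dp[i][j - 1] + 1,         # insertion
                               dp[i - 1][j - 1] + cost)  # substitution or match
        return dp[-1][-1] / len(ref)

    print(word_error_rate("the cat sat on the mat",
                          "the cat sat in the hat"))  # 2 substitutions -> 0.333...
    ```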

  3. Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology

    Science.gov (United States)

    Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.

    2016-05-01

    Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components that determines the lifetime of the system. Cooler reliability is thus one of its most important parameters, and it has to increase to meet market needs. To do this, the data identifying the weakest element determining cooler reliability have to be collected. Yet data collected in the field are hardly usable due to a lack of information. A method for identifying reliability improvements therefore has to be set up that can be used even without field returns. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expert analyses of RM2 failures occurring in accelerated ageing. Failure modes were then identified and corrective actions carried out. Besides this, the functions of the cooler were organized hierarchically with regard to their potential to increase its efficiency. Specific changes have been introduced in the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on these two axes, weak spots for cooler reliability and efficiency, permitted us to drastically increase the MTTF of the RM2 cooler. Large improvements in RM2 reliability are now proven by both field returns and reliability monitoring. These figures are discussed in the paper.

  4. A novel LIDAR-based Atmospheric Calibration Method for Improving the Data Analysis of MAGIC

    CERN Document Server

    Fruck, Christian; Zanin, Roberta; Dorner, Daniela; Garrido, Daniel; Mirzoyan, Razmik; Font, Lluis

    2014-01-01

    A new method for analyzing the returns of the custom-made 'micro'-LIDAR system, which is operated alongside the two MAGIC telescopes, makes it possible to apply atmospheric corrections in the MAGIC data analysis chain. Such corrections extend the effective observation time of MAGIC under adverse atmospheric conditions and reduce the systematic errors on energy and flux in the data analysis. The LIDAR provides a range-resolved atmospheric backscatter profile from which the extinction of Cherenkov light from air-shower events can be estimated. Knowledge of the extinction allows the true image parameters, including energy and flux, to be reconstructed. Our final goal is to recover the source-intrinsic energy spectrum also for data affected by atmospheric extinction from aerosol layers, such as clouds.
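
    A minimal sketch of the underlying correction idea, assuming a constant lidar ratio and a synthetic backscatter profile (the actual MAGIC analysis chain is considerably more elaborate):

    ```python
    # Hedged sketch: estimate the atmospheric transmission toward a shower at a
    # given range from a LIDAR backscatter profile, using an assumed constant
    # lidar ratio S (extinction/backscatter). Profile values are synthetic.
    import numpy as np

    r = np.linspace(100.0, 15000.0, 300)        # range gates [m]
    beta_aer = 1e-6 * np.exp(-r / 2000.0)       # aerosol backscatter [1/(m sr)], synthetic
    S = 40.0                                    # assumed lidar ratio [sr]

    alpha = S * beta_aer                        # extinction coefficient [1/m]
    tau = np.cumsum(alpha) * (r[1] - r[0])      # optical depth to each gate
    transmission = np.exp(-tau)                 # one-way transmission

    shower_range = 8000.0                       # Cherenkov emission distance [m], hypothetical
    T = np.interp(shower_range, r, transmission)
    print(f"Estimated one-way transmission to {shower_range/1000:.0f} km: {T:.3f}")
    ```

    Dividing the measured Cherenkov light yield by such a transmission estimate is the basic step that lets the energy and flux reconstruction be corrected for aerosol extinction.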

  5. A Data Matrix Method for Improving the Quantification of Element Percentages of SEM/EDX Analysis

    Science.gov (United States)

    Lane, John

    2009-01-01

    A simple 2D M × N matrix approach involving sample preparation enables the microanalyst to peer below the noise floor of element percentages reported by SEM/EDX (scanning electron microscopy/energy-dispersive X-ray) analysis, thus yielding more meaningful data. Using the example of a 2 × 3 sample set, there are M = 2 concentration levels of the original mix under test: 10 percent ilmenite (90 percent silica) and 20 percent ilmenite (80 percent silica). For each of these M samples, N = 3 separate SEM/EDX measurements were drawn. In this test, ilmenite is the constituent of interest. By plotting the linear trend of each of the M samples' known concentrations versus the average of their N measurements, a much higher resolution of elemental analysis can be performed. The resulting trend also shows how the noise affects the data, and at what point (for smaller concentrations) it becomes impractical to try to extract any further useful data.
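
    A small numeric sketch of the M × N scheme, with fabricated EDX readings standing in for real measurements:

    ```python
    # Hedged sketch of the M x N scheme: average N replicate SEM/EDX readings at
    # each of M known concentrations, then fit the linear trend of measured vs.
    # known concentration. All readings below are fabricated for illustration.
    import numpy as np

    known = np.array([10.0, 20.0])               # M = 2 known ilmenite percentages
    readings = np.array([[8.9, 10.7, 9.8],       # N = 3 EDX replicates at 10 %
                         [19.2, 21.1, 20.3]])    # N = 3 EDX replicates at 20 %

    means = readings.mean(axis=1)
    spread = readings.std(axis=1, ddof=1)        # replicate scatter ~ noise floor

    slope, intercept = np.polyfit(known, means, 1)   # linear calibration trend
    print(f"measured ~ {slope:.3f} * known + {intercept:.3f}")
    print("replicate std dev per level:", spread)
    # Where the replicate scatter approaches the fitted signal, smaller
    # concentrations can no longer be resolved meaningfully.
    ```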

  6. Improving the Computational Morphological Analysis of a Swahili Corpus for Lexicographic Purposes

    Directory of Open Access Journals (Sweden)

    Guy De Pauw

    2011-10-01

    Full Text Available

    Abstract: Computational morphological analysis is an important first step in the automatic treatment of natural language and a useful lexicographic tool. This article describes a corpus-based approach to the morphological analysis of Swahili. We particularly focus our discussion on its ability to retrieve lemmas for word forms and evaluate it as a tool for corpus-based dictionary compilation.

    Keywords: LEXICOGRAPHY, MORPHOLOGY, CORPUS ANNOTATION, LEMMATIZATION, MACHINE LEARNING, SWAHILI (KISWAHILI)

    Summary: More accurate computational morphological analysis of a Swahili corpus for lexicographic purposes. Computational morphological analysis is an important first step in the automatic processing of natural language and a useful lexicographic tool. This article describes a corpus-based approach to the morphological analysis of Swahili. We focus in particular on the lemmatization properties of the developed system and evaluate it as a tool for the corpus-based development of dictionaries.

  7. Improved framework for the maintenance of the JET intershot analysis chain

    International Nuclear Information System (INIS)

    Highlights: ► Continuous integration of the development of analysis codes at JET was achieved. ► The maintenance of this highly available and traceable system was greatly rationalised. ► Historical configurations of the analysis chain can now be retrieved automatically. ► The tools for the new framework are provided with unit tests and documentation. -- Abstract: At the JET experiment, data from routine diagnostics are analysed automatically by a suite of codes within minutes after operation. The maintenance of these interdependent codes and the provision of a consistent state of the physics database over many experimental campaigns, against a backdrop of continuous hardware and software updates, require well-defined maintenance and validation procedures. In this paper, the development of a new generation of maintenance tools using distributed version control and a workflow following the principle of continuous integration [1] is described.

  8. Improved gene prediction by principal component analysis based autoregressive Yule-Walker method.

    Science.gov (United States)

    Roy, Manidipa; Barman, Soma

    2016-01-10

    Spectral analysis using Fourier techniques is popular in gene prediction because of its simplicity. Model-based autoregressive (AR) spectral estimation gives better resolution even for small DNA segments, but the selection of an appropriate model order is a critical issue. In this article, a technique is proposed in which a Yule-Walker autoregressive (YW-AR) process is combined with principal component analysis (PCA) for dimensionality reduction. The spectral peaks of the DNA signal are used to detect protein-coding regions based on the 1/3 frequency component. Optimal model order selection is no longer critical, as noise is removed by PCA prior to power spectral density (PSD) estimation. The eigenvalue ratio is used to find the threshold between the signal and noise subspaces for data reduction. The superiority of the proposed method over the fast Fourier transform (FFT) method and an autoregressive method combined with the wavelet packet transform (WPT) is established with the help of receiver operating characteristics (ROC) and the discrimination measure (DM), respectively.
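
    A hedged sketch of the pipeline described above, combining window-based PCA denoising with Yule-Walker AR spectral estimation and scoring the 1/3-frequency power on a synthetic period-3 signal; the authors' exact DNA-to-signal mapping and parameters may differ:

    ```python
    # Hedged sketch: PCA-style denoising of a numerically mapped DNA sequence,
    # then Yule-Walker AR spectral estimation, scoring power at f = 1/3 (the
    # period-3 signature of coding regions). Sequence and parameters are made up.
    import numpy as np
    from scipy.linalg import solve_toeplitz

    rng = np.random.default_rng(0)
    n = 300
    # Synthetic "exon-like" signal: period-3 structure plus noise.
    x = (np.arange(n) % 3 == 0).astype(float) + 0.5 * rng.standard_normal(n)
    x -= x.mean()

    # --- Denoising: keep k leading principal directions of overlapping windows ---
    w, k = 30, 4
    windows = np.lib.stride_tricks.sliding_window_view(x, w).copy()
    U, s, Vt = np.linalg.svd(windows, full_matrices=False)
    denoised = (U[:, :k] * s[:k]) @ Vt[:k]          # low-rank reconstruction
    y = np.zeros(n); counts = np.zeros(n)
    for i, row in enumerate(denoised):              # overlap-average back to 1-D
        y[i:i + w] += row; counts[i:i + w] += 1
    y /= counts

    # --- Yule-Walker AR fit and PSD evaluation ---
    p = 10                                              # AR model order
    r = np.correlate(y, y, mode="full")[n - 1:n + p] / n   # autocorrelation lags 0..p
    a = solve_toeplitz(r[:p], r[1:p + 1])               # AR coefficients
    sigma2 = r[0] - a @ r[1:p + 1]                      # prediction error variance

    def ar_psd(f):
        e = np.exp(-2j * np.pi * f * np.arange(1, p + 1))
        return sigma2 / np.abs(1 - (a @ e)) ** 2

    print("power at f=1/3:", ar_psd(1 / 3), "background at f=0.1:", ar_psd(0.1))
    ```

    A large ratio between the PSD at f = 1/3 and at a background frequency flags the segment as a candidate coding region.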

  9. Improving Reliability of Spectrum Analysis for Software Quality Requirements Using TCM

    OpenAIRE

    KAIYA, Haruhiko; Tanigawa, Masaaki; Suzuki, Shunichi; Sato, Tomonori; Osada, Akira; Kaijiri, Kenji

    2010-01-01

    Quality requirements are scattered over a requirements specification; it is thus hard to measure and trace such quality requirements to validate the specification against stakeholders' needs. We proposed a technique called "spectrum analysis for quality requirements", which enables analysts to sort a requirements specification to measure and track the quality requirements in it. In the same way as a spectrum in optics, a quality spectrum of a specification shows a quantitative feature...

  10. The analysis and improvement of the humidity problem on air-condition system of SSRF

    International Nuclear Information System (INIS)

    In this work, efforts were made to solve the problem of high relative humidity in some areas of the Shanghai Synchrotron Radiation Facility (SSRF). Based on data analysis, theoretical demonstration and field tests, we found that the high humidity was caused by two factors. The problem was solved by keeping the actual operating load matched to the design load of the air-conditioning systems and by minimizing outdoor air infiltration. (authors)

  11. Improvement of seismic analysis methods for soil-building coupled systems

    International Nuclear Information System (INIS)

    It is important to take soil-foundation interaction into account in the seismic analysis of nuclear plants. In the first part, a numerical method to calculate soil-equivalent springs and dashpots is developed for any soil configuration. In the second part, the behaviour of a PWR plant during an earthquake is studied. Three approaches are used: linear elastic evaluation, an equivalent linear approach, and a nonlinear calculation with a cyclic behaviour law.

  12. IS BITCOIN BUSINESS INCOME OR SPECULATIVE FOOLERY? NEW IDEAS THROUGH AN IMPROVED FREQUENCY DOMAIN ANALYSIS

    OpenAIRE

    JAMAL BOUOIYOUR; REFK SELMI; AVIRAL KUMAR TIWARI

    2015-01-01

    The present study addresses one of the most problematic phenomena: the Bitcoin price. We explore Granger causality for two relationships (Bitcoin price and trade transactions; Bitcoin price and investors' attractiveness) from a frequency-domain perspective, based on unconditional and conditional data analysis. Specifically, this research empirically assesses the causal links between these variables unconditionally on the one hand and conditioning upon relevant control variables (recorded in lite...

  13. An improved allele-specific PCR primer design method for SNP marker analysis and its application

    OpenAIRE

    Liu Jing; Huang Shunmou; Sun Meiyu; Liu Shengyi; Liu Yumei; Wang Wanxing; Zhang Xiurong; Wang Hanzhong; Hua Wei

    2012-01-01

    Abstract Background: Although the Single Nucleotide Polymorphism (SNP) marker is an invaluable tool for positional cloning, association studies and evolutionary analysis, low SNP detection efficiency by Allele-Specific PCR (AS-PCR) still restricts its application as a molecular marker compared with markers such as Simple Sequence Repeats (SSR). To overcome this problem, primers with a single artificial nucleotide mismatch introduced within the three bases closest to the 3' end (SNP site) have been used in ...
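
    A toy sketch of the mismatch-primer idea, using an invented template sequence and a crude complement-the-base mismatch rule; real designs choose the mismatch position and type carefully:

    ```python
    # Hedged sketch: build an allele-specific primer whose 3' base sits on the
    # SNP and introduce one artificial mismatch within the three bases closest
    # to the 3' end, as the cited strategy describes. Sequence and positions
    # below are invented for illustration only.

    COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def as_primer(template, snp_index, allele, mismatch_offset=2):
        """Return a 20-mer allele-specific primer ending on the SNP site.

        mismatch_offset: distance (1-3) from the 3'-terminal SNP base at which
        a deliberate mismatch is introduced to destabilize extension on the
        wrong allele.
        """
        assert 1 <= mismatch_offset <= 3
        primer = list(template[snp_index - 19:snp_index] + allele)
        pos = len(primer) - 1 - mismatch_offset
        primer[pos] = COMPLEMENT[primer[pos]]   # crude mismatch: complement the base
        return "".join(primer)

    template = "ATGGCTAGCTAGGATCCAGTGACCTGAAGTCGT"  # hypothetical genomic context
    snp_index = 25                                  # SNP position (0-based), hypothetical
    print(as_primer(template, snp_index, allele="G"))
    ```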

  14. A new approach for improving coronary plaque component analysis based on intravascular ultrasound images

    OpenAIRE

    Taki, Arash; Hetterich, Holger; Roodaki, Alireza; Setarehdan, S.K.; Ünal, Gözde; Rieber, Johannes; Navab, Nassir; König, Andreas

    2010-01-01

    Virtual histology intravascular ultrasound (VH-IVUS) is a clinically available technique for atherosclerosis plaque characterization. It, however, suffers from a poor longitudinal resolution due to electrocardiogram (ECG)-gated acquisition. This article presents an effective algorithm for IVUS image-based histology to overcome this limitation. After plaque area extraction within an input IVUS image, a textural analysis procedure consisting of feature extraction and classification steps is pro...

  15. An Improvement of Shotgun Proteomics Analysis by Adding Next-Generation Sequencing Transcriptome Data in Orange

    OpenAIRE

    Song, Jiaping; Sun, Renjie; Li, Dazhi; Tan, Fengji; Li, Xin; Jiang, Pingping; Huang, Xinjie; Lin, Liang; Deng, Ziniu; Zhang, Yong

    2012-01-01

    Background: Shotgun proteomics data analysis usually relies on database search. Because the commonly employed protein sequence databases of most species do not contain sufficient protein information, the application of shotgun proteomics to the study of protein sequence profiles remains a big challenge, especially for species whose genomes have not been sequenced yet. Methodology/Principal Findings: In this paper, we present a workflow with an integrated database to partly address this problem. Fir...

  16. Spectral graph theory analysis of software-defined networks to improve performance and security

    OpenAIRE

    Parker, Thomas C.

    2015-01-01

    Software-defined networks are revolutionizing networking by providing unprecedented visibility into and control over data communication networks. The focus of this work is to develop a method to extract network features, develop a closed-loop control framework for a software-defined network, and build a test bed to validate the proposed scheme. The method developed to extract the network features is called the dual-basis analysis, which is based on the eigendecomposition of a weighted graph t...
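
    A short sketch of the general spectral technique, eigendecomposition of a weighted graph Laplacian, applied to a hypothetical five-node topology; the thesis' specific dual-basis construction is not reproduced here:

    ```python
    # Hedged sketch: spectral features of a weighted network graph via
    # eigendecomposition of its Laplacian. This illustrates the general
    # technique, not the thesis' exact dual-basis analysis.
    import numpy as np

    # Hypothetical weighted adjacency matrix for a 5-switch topology
    # (weights could be link capacities or observed traffic volumes).
    W = np.array([[0, 4, 0, 0, 1],
                  [4, 0, 3, 0, 0],
                  [0, 3, 0, 2, 0],
                  [0, 0, 2, 0, 5],
                  [1, 0, 0, 5, 0]], dtype=float)

    L = np.diag(W.sum(axis=1)) - W           # graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)     # eigenvalues in ascending order

    algebraic_connectivity = eigvals[1]      # lambda_2: how well-connected the network is
    fiedler = eigvecs[:, 1]                  # sign pattern suggests a natural bipartition
    print("lambda_2 =", round(algebraic_connectivity, 3))
    print("partition:", ["A" if v < 0 else "B" for v in fiedler])
    ```

    Tracking such spectral quantities over time is one way a controller could detect topology or traffic anomalies in a closed loop.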

  17. An improved silver staining procedure for schizodeme analysis in polyacrylamide gradient gels

    Directory of Open Access Journals (Sweden)

    Antonio M. Gonçalves

    1990-03-01

    Full Text Available A simple protocol is described for the silver staining of polyacrylamide gradient gels used for the separation of restriction fragments of kinetoplast DNA [schizodeme analysis of trypanosomatids (Morel et al., 1980)]. The method overcomes the problems of non-uniform staining and strong background color that are frequently encountered when conventional protocols for the silver staining of linear gels are used. The method described has proven to be of general applicability for DNA, RNA and protein separations in gradient gels.

  18. Multivariate genomic model improves analysis of oil palm (Elaeis guineensis Jacq.) progeny tests

    OpenAIRE

    Marchal, Alexandre; Legarra Albizu, Andres; Tisne, Sebastien; Carasco-Lacombe, Catherine; Manez, Aurore; Suryana, Edyana; Omoré, Alphonse; Nouy, Bruno; Durand-Gasselin, Tristan; Sanchez, Leopoldo; Bouvet, Jean-Marc; Cros, David

    2016-01-01

    Genomic selection is promising for plant breeding, particularly for perennial crops. Multivariate analysis, which considers several traits jointly, takes advantage of the genetic correlations between them to increase accuracy. The aim of this study was to empirically evaluate the potential of univariate and multivariate genomic mixed models (G-BLUP) compared with the traditional univariate pedigree-based BLUP (T-BLUP) when analyzing progeny tests of oil palm, the world's major oil crop. The dataset compris...
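
    For readers unfamiliar with G-BLUP, the sketch below sets up a univariate genomic BLUP on synthetic data via the mixed model equations; the multivariate model adds a trait covariance structure not shown here, and none of the values relate to the oil palm dataset:

    ```python
    # Hedged sketch: univariate G-BLUP on synthetic data. Genomic relationship
    # matrix G from centered markers (VanRaden-style), breeding values from the
    # mixed model equations. All data are simulated placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    n_ind, n_snp = 50, 200
    M = rng.integers(0, 3, size=(n_ind, n_snp)).astype(float)   # genotypes 0/1/2
    p = M.mean(axis=0) / 2
    Z = M - 2 * p                                               # centered markers
    G = Z @ Z.T / (2 * np.sum(p * (1 - p)))                     # genomic relationship

    true_u = np.linalg.cholesky(G + 1e-6 * np.eye(n_ind)) @ rng.standard_normal(n_ind)
    y = 10 + true_u + rng.standard_normal(n_ind)                # phenotypes

    lam = 1.0                                   # sigma_e^2 / sigma_u^2 (assumed known)
    X = np.ones((n_ind, 1))                     # fixed effect: overall mean
    Ginv = np.linalg.inv(G + 1e-6 * np.eye(n_ind))
    # Mixed model equations for y = X*mu + u + e with u ~ N(0, G sigma_u^2):
    lhs = np.block([[X.T @ X, X.T],
                    [X, np.eye(n_ind) + lam * Ginv]])
    rhs = np.concatenate([X.T @ y, y])
    u_hat = np.linalg.solve(lhs, rhs)[1:]
    print("accuracy (corr true vs predicted):", np.corrcoef(true_u, u_hat)[0, 1].round(2))
    ```

    Replacing G with the pedigree relationship matrix recovers T-BLUP; the multivariate extension stacks the traits and Kronecker-multiplies G with the genetic covariance matrix of the traits.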

  19. Gravity Probe B Data Analysis. Status and Potential for Improved Accuracy of Scientific Results

    Science.gov (United States)

    Everitt, C. W. F.; Adams, M.; Bencze, W.; Buchman, S.; Clarke, B.; Conklin, J. W.; Debra, D. B.; Dolphin, M.; Heifetz, M.; Hipkins, D.; Holmes, T.; Keiser, G. M.; Kolodziejczak, J.; Li, J.; Lipa, J.; Lockhart, J. M.; Mester, J. C.; Muhlfelder, B.; Ohshima, Y.; Parkinson, B. W.; Salomon, M.; Silbergleit, A.; Solomonik, V.; Stahl, K.; Taber, M.; Turneaure, J. P.; Wang, S.; Worden, P. W.

    2009-12-01

    This is the first of five connected papers detailing progress on the Gravity Probe B (GP-B) Relativity Mission. GP-B, launched 20 April 2004, is a landmark physics experiment in space to test two fundamental predictions of Einstein’s general relativity theory, the geodetic and frame-dragging effects, by means of cryogenic gyroscopes in Earth orbit. Data collection began 28 August 2004 and science operations were completed 29 September 2005. The data analysis has proven deeper than expected as a result of two mutually reinforcing complications in gyroscope performance: (1) a changing polhode path affecting the calibration of the gyroscope scale factor C_g against the aberration of starlight and (2) two larger than expected manifestations of a Newtonian gyro torque due to patch potentials on the rotor and housing. In earlier papers, we reported two methods, ‘geometric’ and ‘algebraic’, for identifying and removing the first Newtonian effect (‘misalignment torque’), and also a preliminary method of treating the second (‘roll-polhode resonance torque’). Central to the progress in both torque modeling and C_g determination has been an extended effort on “Trapped Flux Mapping” commenced in November 2006. A turning point came in August 2008 when it became possible to include a detailed history of the resonance torques into the computation. The East-West (frame-dragging) effect is now plainly visible in the processed data. The current statistical uncertainty from an analysis of 155 days of data is 5.4 marc-s/yr (~14% of the predicted effect), though it must be emphasized that this is a preliminary result requiring rigorous investigation of systematics by methods discussed in the accompanying paper by Muhlfelder et al. A covariance analysis incorporating models of the patch effect torques indicates that a 3-5% determination of frame-dragging is possible with more complete, computationally intensive data analysis.

  20. Comprehensive Numerical Analysis of Finite Difference Time Domain Methods for Improving Optical Waveguide Sensor Accuracy

    OpenAIRE

    M. Mosleh E. Abu Samak; Bakar, A. Ashrif A.; Muhammad Kashif; Mohd Saiful Dzulkifly Zan

    2016-01-01

    This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time saving in each run. The local one-dimensional (LOD)-FDTD method has similar numerical equation properties, which should be...