WorldWideScience

Sample records for improved phosphopeptide analysis

  1. Phosphoric acid as a matrix additive for MALDI MS analysis of phosphopeptides and phosphoproteins

    DEFF Research Database (Denmark)

    Kjellström, Sven; Jensen, Ole Nørregaard

    2004-01-01

    Phosphopeptides are often detected with low efficiency by MALDI MS analysis of peptide mixtures. In an effort to improve the phosphopeptide ion response in MALDI MS, we investigated the effects of adding low concentrations of organic and inorganic acids during peptide sample preparation in 2,5-di...... acid to 2,5-DHB were also observed in LC-MALDI-MS analysis of tryptic phosphopeptides of B. subtilis PrkC phosphoprotein. Finally, the mass resolution of MALDI mass spectra of intact proteins was significantly improved by using phosphoric acid in 2,5-DHB matrix....

  2. Improved detection of hydrophilic phosphopeptides using graphite powder microcolumns and mass spectrometry: evidence for in vivo doubly phosphorylated dynamin I and dynamin III

    DEFF Research Database (Denmark)

    Larsen, Martin Røssel; Graham, Mark E; Robinson, Phillip J

    2004-01-01

    A common strategy in proteomics to improve the number and quality of peptides detected by mass spectrometry (MS) is to desalt and concentrate proteolytic digests using reversed phase (RP) chromatography prior to analysis. However, this does not allow for detection of small or hydrophilic peptides, or peptides altered in hydrophilicity such as phosphopeptides. We used microcolumns to compare the ability of RP resin or graphite powder to retain phosphopeptides. A number of standard phosphopeptides and a biologically relevant phosphoprotein, dynamin I, were analyzed. MS revealed that some phosphopeptides...... a large improvement in the detection of small amounts of phosphopeptides by MS and the approach has major implications for both small- and large-scale projects in phosphoproteomics.

  3. A Multidimensional System for Phosphopeptide Analysis Using TiO2 Enrichment and Ion-exchange Chromatography with Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Kun; Yoo, Jisun; Kim, Eunmin; Kim, Jin Young; Kim, Young Hwan; Yoo, Jong Shin [Korea Basic Science Institute, Ochang (Korea, Republic of)]; Oh, Han Bin [Sogang Univ., Seoul (Korea, Republic of)]

    2012-10-15

    Although offline enrichment of phosphorylated peptides is widely used, enrichment for phosphopeptides using TiO2 is often performed manually, which is labor-intensive and can lead to irreproducible results. To address the problems associated with offline enrichment and to improve the effectiveness of phosphopeptide detection, we developed an automated online enrichment system for phosphopeptide analysis. A standard protein mixture comprising BSA, fetuin, crystallin, α-casein, β-casein, and ovalbumin was assessed using our new system. Our multidimensional system has four main parts: a sample pump, a 20-mm TiO2-based column, a mixed weak anion-exchange/strong cation-exchange (2:1 WAX:SCX) separation column, and LC/MS. Phosphorylated peptides were successfully detected using the TiO2-based online system with little interference from nonphosphorylated peptides. Our results confirmed that our online enrichment system is a simple and efficient method for detecting phosphorylated peptides.

  4. Enhanced MALDI-TOF MS Analysis of Phosphopeptides Using an Optimized DHAP/DAHC Matrix

    Science.gov (United States)

    Hou, Junjie; Xie, Zhensheng; Xue, Peng; Cui, Ziyou; Chen, Xiulan; Li, Jing; Cai, Tanxi; Wu, Peng; Yang, Fuquan

    2010-01-01

    Selecting an appropriate matrix solution is one of the most effective means of increasing the ionization efficiency of phosphopeptides in matrix-assisted laser-desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). In this study, we systematically assessed matrix combinations of 2,6-dihydroxyacetophenone (DHAP) and diammonium hydrogen citrate (DAHC), and demonstrated that the low-ratio DHAP/DAHC matrix was more effective in enhancing the ionization of phosphopeptides. Low-femtomole levels of phosphopeptides from the tryptic digests of α-casein and β-casein were readily detected by MALDI-TOF-MS in both positive and negative ion modes without desalting or phosphopeptide enrichment. Compared with the DHB/PA matrix, the optimized DHAP/DAHC matrix yielded superior sample homogeneity and higher phosphopeptide measurement sensitivity, particularly when multiply phosphorylated peptides were assessed. Finally, the DHAP/DAHC matrix was applied to identify phosphorylation sites from α-casein and β-casein and to characterize two phosphorylation sites on human histone H1 treated with Cyclin-Dependent Kinase 1 (CDK1) by MALDI-TOF/TOF MS. PMID:20339515
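    Phosphopeptide assignment in spectra like these rests on a simple arithmetic fact: each phosphorylation adds one HPO3 group (79.96633 Da, monoisotopic) to the peptide mass. A minimal sketch of that bookkeeping; the residue masses are standard monoisotopic values, but the sequence "SAEEKF" is a made-up example, not a peptide from the study:

```python
# Sketch: each phosphorylation adds one HPO3 group to a peptide's mass.
# Residue masses are standard monoisotopic values; the sequence "SAEEKF"
# is a hypothetical example, not one from the study.

RESIDUE_MASS = {  # monoisotopic residue masses, Da
    "A": 71.03711, "E": 129.04259, "F": 147.06841,
    "K": 128.09496, "S": 87.03203, "T": 101.04768,
}
WATER = 18.01056    # H2O completing the peptide chain
PROTON = 1.00728    # charge carrier for a singly protonated ion
PHOSPHO = 79.96633  # HPO3 added per phosphosite

def mh_plus(sequence, n_phospho=0):
    """[M+H]+ of a peptide carrying n_phospho phosphate groups."""
    mass = sum(RESIDUE_MASS[aa] for aa in sequence) + WATER
    return mass + n_phospho * PHOSPHO + PROTON

# A singly phosphorylated peptide sits 79.96633 Da above its
# unmodified counterpart in the MALDI-TOF spectrum.
print(mh_plus("SAEEKF", 1) - mh_plus("SAEEKF", 0))
```

    This mass shift is what lets MALDI-TOF/TOF experiments such as the one above flag candidate phosphopeptides before fragmentation confirms the site.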

  5. Enhanced MALDI-TOF MS Analysis of Phosphopeptides Using an Optimized DHAP/DAHC Matrix

    Directory of Open Access Journals (Sweden)

    Junjie Hou

    2010-01-01

    Full Text Available Selecting an appropriate matrix solution is one of the most effective means of increasing the ionization efficiency of phosphopeptides in matrix-assisted laser-desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). In this study, we systematically assessed matrix combinations of 2,6-dihydroxyacetophenone (DHAP) and diammonium hydrogen citrate (DAHC), and demonstrated that the low-ratio DHAP/DAHC matrix was more effective in enhancing the ionization of phosphopeptides. Low-femtomole levels of phosphopeptides from the tryptic digests of α-casein and β-casein were readily detected by MALDI-TOF-MS in both positive and negative ion modes without desalting or phosphopeptide enrichment. Compared with the DHB/PA matrix, the optimized DHAP/DAHC matrix yielded superior sample homogeneity and higher phosphopeptide measurement sensitivity, particularly when multiply phosphorylated peptides were assessed. Finally, the DHAP/DAHC matrix was applied to identify phosphorylation sites from α-casein and β-casein and to characterize two phosphorylation sites on human histone H1 treated with Cyclin-Dependent Kinase 1 (CDK1) by MALDI-TOF/TOF MS.

  6. Evaluation of phosphopeptide enrichment strategies for quantitative TMT analysis of complex network dynamics in cancer-associated cell signalling

    Directory of Open Access Journals (Sweden)

    Benedetta Lombardi

    2015-03-01

    Full Text Available Defining alterations in signalling pathways in normal and malignant cells is becoming a major field in proteomics. A number of different approaches have been established to isolate, identify and quantify phosphorylated proteins and peptides. In the current report, a comparison of SCX prefractionation versus an antibody-based approach, both coupled to TiO2 enrichment and applied to TMT-labelled cellular lysates, is described. The antibody strategy was more comprehensive for enriching phosphopeptides and allowed the identification of a large set of proteins known to be phosphorylated (715 protein groups) with a minimal number of proteins not previously known to be phosphorylated (2).

  7. Computational analysis of phosphopeptide binding to the polo-box domain of the mitotic kinase PLK1 using molecular dynamics simulation.

    Directory of Open Access Journals (Sweden)

    David J Huggins

    2010-08-01

    Full Text Available The Polo-Like Kinase 1 (PLK1) acts as a central regulator of mitosis and is over-expressed in a wide range of human tumours where high levels of expression correlate with a poor prognosis. PLK1 comprises two structural elements, a kinase domain and a polo-box domain (PBD). The PBD binds phosphorylated substrates to control substrate phosphorylation by the kinase domain. Although the PBD preferentially binds to phosphopeptides, it has a relatively broad sequence specificity in comparison with other phosphopeptide-binding domains. We analysed the molecular determinants of recognition by performing molecular dynamics simulations of the PBD with one of its natural substrates, CDC25c. Predicted binding free energies were calculated using a molecular mechanics/Poisson-Boltzmann surface area approach. We calculated the per-residue contributions to the binding free energy change, showing that the phosphothreonine residue and the mainchain account for the vast majority of the interaction energy. This explains the very broad sequence specificity with respect to other sidechain residues. Finally, we considered the key role of bridging water molecules at the binding interface. We employed inhomogeneous fluid solvation theory to consider the free energy of water molecules on the protein surface with respect to bulk water molecules. Such an analysis highlights binding hotspots created by elimination of water molecules from hydrophobic surfaces. It also predicts that a number of water molecules are stabilized by the presence of the charged phosphate group, and that this will have a significant effect on the binding affinity. Our findings suggest a molecular rationale for the promiscuous binding of the PBD and highlight a role for bridging water molecules at the interface. We expect that this method of analysis will be very useful for probing other protein surfaces to identify binding hotspots for natural binding partners and small molecule inhibitors.
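    The binding free energies in such studies are assembled from a few component terms. A minimal sketch of how the standard MM-PBSA decomposition combines them; the function follows the textbook formula, but every numeric value below is a hypothetical illustration, not a result from the paper:

```python
# Minimal sketch of an MM-PBSA estimate: gas-phase molecular-mechanics
# energy plus Poisson-Boltzmann (polar) and surface-area (nonpolar)
# solvation terms, minus an entropy term. All numbers are hypothetical.

def mmpbsa_dg(delta_e_mm, delta_g_pb, delta_g_sa, t_delta_s):
    """DeltaG_bind = DeltaE_MM + DeltaG_PB + DeltaG_SA - T*DeltaS (kcal/mol)."""
    return delta_e_mm + delta_g_pb + delta_g_sa - t_delta_s

# Hypothetical component values for a phosphopeptide-PBD complex
dg_bind = mmpbsa_dg(delta_e_mm=-85.0,  # electrostatics + van der Waals
                    delta_g_pb=70.0,   # polar desolvation penalty
                    delta_g_sa=-5.5,   # hydrophobic burial
                    t_delta_s=-12.0)   # configurational entropy loss
print(dg_bind)  # a modestly favorable net binding free energy
```

    Per-residue decomposition, as used in the study, simply partitions the same total over residues, which is how the dominant phosphothreonine and mainchain contributions were identified.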

  8. Phosphopeptide enrichment by immobilized metal affinity chromatography

    DEFF Research Database (Denmark)

    Thingholm, Tine E.; Larsen, Martin R.

    2016-01-01

    Immobilized metal affinity chromatography (IMAC) has been the method of choice for phosphopeptide enrichment prior to mass spectrometric analysis for many years and it is still used extensively in many laboratories. Using the affinity of negatively charged phosphate groups towards positively charged metal ions such as Fe3+, Ga3+, Al3+, Zr4+, and Ti4+ has made it possible to enrich phosphorylated peptides from peptide samples. However, the selectivity of most of the metal ions is limited when working with highly complex samples, e.g., whole-cell extracts, resulting in contamination from...

  9. Highly efficient enrichment of phosphopeptides from HeLa cells using hollow magnetic macro/mesoporous TiO2 nanoparticles.

    Science.gov (United States)

    Hong, Yayun; Zhan, Qiliang; Pu, Chenlu; Sheng, Qianying; Zhao, Hongli; Lan, Minbo

    2018-09-01

    In this work, hollow magnetic macro/mesoporous TiO2 nanoparticles (denoted as Fe3O4@H-fTiO2) were synthesized by a facile "hydrothermal etching assisted crystallization" route to improve the phosphopeptide enrichment efficiency. The porous nanostructure of the TiO2 shell and large hollow space endowed the Fe3O4@H-fTiO2 with a high surface area (144.71 m² g⁻¹) and a large pore volume (0.52 cm³ g⁻¹), which could provide more affinity sites for phosphopeptide enrichment. Besides, the large pore size of the TiO2 nanosheets and large hollow space could effectively prevent the "shadow effect", thereby facilitating the diffusion and release of phosphopeptides. Compared with the hollow magnetic mesoporous TiO2 with small and deep pores (denoted as Fe3O4@H-mTiO2) and solid magnetic macro/mesoporous TiO2, the Fe3O4@H-fTiO2 nanoparticles showed a better selectivity (molar ratio of α-casein/BSA up to 1:10000) and a higher sensitivity (0.2 fmol/μL α-casein) for phosphopeptide enrichment. Furthermore, 1485 unique phosphopeptides derived from 660 phosphoproteins were identified from HeLa cell extracts after enrichment with Fe3O4@H-fTiO2 nanoparticles, further demonstrating that the Fe3O4@H-fTiO2 nanoparticles had a high-efficiency performance for phosphopeptide enrichment. Taken together, the Fe3O4@H-fTiO2 nanoparticles will have unique advantages in phosphoproteomics analysis.

  10. The use of titanium dioxide micro-columns to selectively isolate phosphopeptides from proteolytic digests

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Larsen, Martin R

    2009-01-01

    Titanium dioxide has very high affinity for phosphopeptides and it has become an efficient alternative to existing methods for phosphopeptide enrichment from complex samples. Peptide loading in a highly acidic environment in the presence of 2,5-dihydroxybenzoic acid (DHB), phthalic acid, or glycolic acid has been shown to improve selectivity significantly by reducing unspecific binding from nonphosphorylated peptides. The enriched phosphopeptides bound to the titanium dioxide are subsequently eluted from the micro-column using an alkaline buffer. Titanium dioxide chromatography is extremely tolerant towards most buffers used in biological experiments. It is highly robust and as such it has become one of the methods of choice in large-scale phospho-proteomics. Here we describe the protocol for phosphopeptide enrichment using titanium dioxide chromatography followed by desalting...

  11. An Internal Standard for Assessing Phosphopeptide Recovery from Metal Ion/Oxide Enrichment Strategies

    Science.gov (United States)

    Paulo, Joao A.; Navarrete-Perea, Jose; Erickson, Alison R.; Knott, Jeffrey; Gygi, Steven P.

    2018-04-01

    Phosphorylation-mediated signaling pathways have major implications in cellular regulation and disease. However, proteins with roles in these pathways are frequently less abundant and phosphorylation is often sub-stoichiometric. As such, the efficient enrichment, and subsequent recovery, of phosphorylated peptides is vital. Mass spectrometry-based proteomics is a well-established approach for quantifying thousands of phosphorylation events in a single experiment. We designed a peptide internal standard-based assay directed toward sample preparation strategies for mass spectrometry analysis to better understand phosphopeptide recovery from enrichment strategies. We coupled mass-differential tandem mass tag (mTMT) reagents (specifically, TMTzero and TMTsuper-heavy), nine mass spectrometry-amenable phosphopeptides (phos9), and peak area measurements from extracted ion chromatograms to determine phosphopeptide recovery. We showcase this mTMT/phos9 recovery assay by evaluating three phosphopeptide enrichment workflows. Our assay provides data on the recovery of phosphopeptides, which complement other metrics, namely the number of identified phosphopeptides and enrichment specificity. Our mTMT/phos9 assay is applicable to any enrichment protocol in a typical experimental workflow, irrespective of sample origin or labeling strategy.
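    The recovery readout in an assay of this kind reduces to a peak-area ratio: the isotopologue spiked before enrichment suffers the workflow's losses, while its mass-differential twin spiked after enrichment does not. A hedged sketch of that calculation, with invented peptide names and areas:

```python
# Sketch of a peak-area recovery readout: one isotopologue of each
# standard is spiked before enrichment (and so suffers workflow losses),
# its mass-differential twin after. Names and XIC areas are hypothetical.
from statistics import mean

def recovery(pre_spike_area, post_spike_area):
    """Fraction of a pre-spiked phosphopeptide surviving enrichment."""
    return pre_spike_area / post_spike_area

xic_areas = {                       # (spiked pre-, spiked post-enrichment)
    "phos_pep_1": (4.2e6, 6.0e6),
    "phos_pep_2": (1.5e6, 5.0e6),
}
per_peptide = {p: recovery(pre, post) for p, (pre, post) in xic_areas.items()}
print(per_peptide, mean(per_peptide.values()))
```

    Averaging the per-peptide values over a panel such as phos9 gives a single recovery figure per enrichment workflow, complementing identification counts and enrichment specificity.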

  12. Functionalized diamond nanopowder for phosphopeptides enrichment from complex biological fluids

    Energy Technology Data Exchange (ETDEWEB)

    Hussain, Dilshad [Division of Analytical Chemistry, Institute of Chemical Sciences, Bahauddin Zakariya University, Multan 60800 (Pakistan)]; Najam-ul-Haq, Muhammad, E-mail: najamulhaq@bzu.edu.pk [Division of Analytical Chemistry, Institute of Chemical Sciences, Bahauddin Zakariya University, Multan 60800 (Pakistan); Institute of Analytical Chemistry and Radiochemistry, Leopold-Franzens University, Innrain 80-82, A-6020 Innsbruck (Austria)]; Jabeen, Fahmida; Ashiq, Muhammad N.; Athar, Muhammad [Division of Analytical Chemistry, Institute of Chemical Sciences, Bahauddin Zakariya University, Multan 60800 (Pakistan)]; Rainer, Matthias; Huck, Christian W.; Bonn, Guenther K. [Institute of Analytical Chemistry and Radiochemistry, Leopold-Franzens University, Innrain 80-82, A-6020 Innsbruck (Austria)]

    2013-05-02

    Highlights: • Derivatization of diamond nanopowder as IMAC and RP material. • Characterization with SEM, EDX and FT-IR. • Phosphopeptide enrichment from standard as well as real samples. • Desalting and human serum profiling with reproducible results. • MALDI-MS analysis with database identification. Abstract: Diamond is known for its high affinity and biocompatibility towards biomolecules and is used extensively in separation sciences and life science research. In the present study, diamond nanopowder is derivatized as an Immobilized Metal Ion Affinity Chromatography (IMAC) material for phosphopeptide enrichment and as a Reversed Phase (C-18) medium for the desalting of complex mixtures and human serum profiling through MALDI-TOF-MS. The functionalized diamond nanopowder is characterized by Fourier transform infrared (FT-IR) spectroscopy, scanning electron microscopy (SEM) and energy dispersive X-ray (EDX) spectroscopy. Diamond-IMAC is applied to a standard protein (β-casein), spiked human serum, egg yolk and non-fat milk for phosphopeptide enrichment. Results show the selectivity of the synthesized IMAC-diamond immobilized with Fe3+ and La3+ ions. To demonstrate its broader use, diamond-IMAC is also applied to serum samples from gall bladder carcinoma patients to search for potential biomarkers. The database search is carried out with the Mascot program (www.matrixscience.com) for the assignment of phosphorylation sites. Diamond nanopowder is thus a separation medium with multifunctional use and can be applied to cancer protein profiling for diagnosis and biomarker identification.

  13. A novel strategy for phosphopeptide enrichment using lanthanide phosphate co-precipitation.

    Science.gov (United States)

    Mirza, Munazza Raza; Rainer, Matthias; Güzel, Yüksel; Choudhary, Iqbal M; Bonn, Günther K

    2012-08-01

    Reversible phosphorylation of proteins is a common theme in the regulation of important cellular functions such as growth, metabolism, and differentiation. A comprehensive understanding of biological processes requires the characterization of protein phosphorylation at the molecular level. Although the number of cellular phosphoproteins is relatively high, the phosphorylated residues themselves are generally of low abundance due to their sub-stoichiometric nature. The low abundance of phosphopeptides and low degree of phosphorylation therefore typically necessitate isolation and concentration of phosphopeptides prior to mass spectrometric analysis. In this study, we used trivalent lanthanide ions (LaCl3, CeCl3, EuCl3, TbCl3, HoCl3, ErCl3, and TmCl3) for phosphopeptide enrichment and clean-up. Due to their low solubility product, lanthanide ions form stable complexes with the phosphate groups of phosphopeptides and precipitate out of solution. In a further step, non-phosphorylated compounds can easily be removed by simple centrifugation and washing before mass spectrometric analysis by matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS. The precipitation method was applied for the isolation of phosphopeptides from standard proteins such as ovalbumin, α-casein, and β-casein. High enrichment of phosphopeptides could also be achieved for real samples such as fresh milk and egg white. The technology presented here represents an excellent and highly selective tool for phosphopeptide recovery; it is easily applicable and shows several advantages compared with standard approaches such as TiO2 or IMAC.

  14. Crystal Structures and Thermodynamic Analysis Reveal Distinct Mechanisms of CD28 Phosphopeptide Binding to the Src Homology 2 (SH2) Domains of Three Adaptor Proteins*

    Science.gov (United States)

    Inaba, Satomi; Numoto, Nobutaka; Ogawa, Shuhei; Morii, Hisayuki; Ikura, Teikichi; Abe, Ryo; Ito, Nobutoshi; Oda, Masayuki

    2017-01-01

    Full activation of T cells and differentiation into effector T cells are essential for many immune responses and require co-stimulatory signaling via the CD28 receptor. Extracellular ligand binding to CD28 recruits protein-tyrosine kinases to its cytoplasmic tail, which contains a YMNM motif. Following phosphorylation of the tyrosine, the proteins growth factor receptor-bound protein 2 (Grb2), Grb2-related adaptor downstream of Shc (Gads), and p85 subunit of phosphoinositide 3-kinase may bind to pYMNM (where pY is phosphotyrosine) via their Src homology 2 (SH2) domains, leading to downstream signaling to distinct immune pathways. These three adaptor proteins bind to the same site on CD28 with variable affinity, and all are important for CD28-mediated co-stimulatory function. However, the mechanism of how these proteins recognize and compete for CD28 is unclear. To visualize their interactions with CD28, we have determined the crystal structures of Gads SH2 and two p85 SH2 domains in complex with a CD28-derived phosphopeptide. The high resolution structures obtained revealed that, whereas the CD28 phosphopeptide bound to Gads SH2 is in a bent conformation similar to that when bound to Grb2 SH2, it adopts a more extended conformation when bound to the N- and C-terminal SH2 domains of p85. These differences observed in the peptide-protein interactions correlated well with the affinity and other thermodynamic parameters for each interaction determined by isothermal titration calorimetry. The detailed insight into these interactions reported here may inform the development of compounds that specifically inhibit the association of CD28 with these adaptor proteins to suppress excessive T cell responses, such as in allergies and autoimmune diseases. PMID:27927989

  15. Crystal Structures and Thermodynamic Analysis Reveal Distinct Mechanisms of CD28 Phosphopeptide Binding to the Src Homology 2 (SH2) Domains of Three Adaptor Proteins.

    Science.gov (United States)

    Inaba, Satomi; Numoto, Nobutaka; Ogawa, Shuhei; Morii, Hisayuki; Ikura, Teikichi; Abe, Ryo; Ito, Nobutoshi; Oda, Masayuki

    2017-01-20

    Full activation of T cells and differentiation into effector T cells are essential for many immune responses and require co-stimulatory signaling via the CD28 receptor. Extracellular ligand binding to CD28 recruits protein-tyrosine kinases to its cytoplasmic tail, which contains a YMNM motif. Following phosphorylation of the tyrosine, the proteins growth factor receptor-bound protein 2 (Grb2), Grb2-related adaptor downstream of Shc (Gads), and p85 subunit of phosphoinositide 3-kinase may bind to pYMNM (where pY is phosphotyrosine) via their Src homology 2 (SH2) domains, leading to downstream signaling to distinct immune pathways. These three adaptor proteins bind to the same site on CD28 with variable affinity, and all are important for CD28-mediated co-stimulatory function. However, the mechanism of how these proteins recognize and compete for CD28 is unclear. To visualize their interactions with CD28, we have determined the crystal structures of Gads SH2 and two p85 SH2 domains in complex with a CD28-derived phosphopeptide. The high resolution structures obtained revealed that, whereas the CD28 phosphopeptide bound to Gads SH2 is in a bent conformation similar to that when bound to Grb2 SH2, it adopts a more extended conformation when bound to the N- and C-terminal SH2 domains of p85. These differences observed in the peptide-protein interactions correlated well with the affinity and other thermodynamic parameters for each interaction determined by isothermal titration calorimetry. The detailed insight into these interactions reported here may inform the development of compounds that specifically inhibit the association of CD28 with these adaptor proteins to suppress excessive T cell responses, such as in allergies and autoimmune diseases.

  16. Casein phosphopeptides drastically increase the secretion of extracellular proteins in Aspergillus awamori. Proteomics studies reveal changes in the secretory pathway.

    Science.gov (United States)

    Kosalková, Katarina; García-Estrada, Carlos; Barreiro, Carlos; Flórez, Martha G; Jami, Mohammad S; Paniagua, Miguel A; Martín, Juan F

    2012-01-10

    The secretion of heterologous animal proteins in filamentous fungi is usually limited by bottlenecks in the vesicle-mediated secretory pathway. Using the secretion of bovine chymosin in Aspergillus awamori as a model, we found a drastic increase (40- to 80-fold) in cells grown with casein or casein phosphopeptides (CPPs). CPPs are rich in phosphoserine, but phosphoserine itself did not increase the secretion of chymosin. The stimulatory effect is reduced about 50% using partially dephosphorylated casein and is not exerted by casamino acids. The phosphopeptide effect was not exerted at the transcriptional level; instead, it was clearly observed on the secretion of chymosin by immunodetection analysis. Proteomics studies revealed very interesting metabolic changes in response to phosphopeptide supplementation. Oxidative metabolism was reduced, since enzymes involved in fermentative processes were overrepresented. An oxygen-binding hemoglobin-like protein was overrepresented in the proteome following phosphopeptide addition. Most interestingly, the intracellular pre-protein enzymes, including pre-prochymosin, were depleted (most of them are underrepresented in the intracellular proteome after the addition of CPPs), whereas the extracellular mature forms of several of these secretable proteins and cell-wall biosynthetic enzymes were greatly overrepresented in the secretome of phosphopeptide-supplemented cells. Another important 'moonlighting' protein (glyceraldehyde-3-phosphate dehydrogenase), which has been described to have vesicle-fusogenic and cytoskeleton-formation-modulating activities, was clearly overrepresented in phosphopeptide-supplemented cells. In summary, CPPs cause a reprogramming of cellular metabolism, which leads to massive secretion of extracellular proteins.

  17. Improving the Phosphoproteome Coverage for Limited Sample Amounts Using TiO2-SIMAC-HILIC (TiSH) Phosphopeptide Enrichment and Fractionation

    DEFF Research Database (Denmark)

    Engholm-Keller, Kasper; Larsen, Martin R

    2016-01-01

    ......spectrometry (LC-MS/MS) analysis. Due to the sample loss resulting from fractionation, this procedure is mainly performed when large quantities of sample are available. To make large-scale phosphoproteomics applicable to smaller amounts of protein we have recently combined highly specific TiO2-based...... protocol we describe the procedure step by step to allow for comprehensive coverage of the phosphoproteome utilizing only a few hundred micrograms of protein.

  18. Parallel reaction monitoring on a Q Exactive mass spectrometer increases reproducibility of phosphopeptide detection in bacterial phosphoproteomics measurements.

    Science.gov (United States)

    Taumer, Christoph; Griesbaum, Lena; Kovacevic, Alen; Soufi, Boumediene; Nalpas, Nicolas C; Macek, Boris

    2018-03-29

    An increasing number of studies report the relevance of protein Ser/Thr/Tyr phosphorylation in bacterial physiology, yet the analysis of this type of modification in bacteria still presents a considerable challenge. Unlike in eukaryotes, where tens of thousands of phosphorylation events likely occupy more than two thirds of the proteome, the abundance of protein phosphorylation is much lower in bacteria. Even state-of-the-art phosphopeptide enrichment protocols fail to remove the high background of abundant unmodified peptides, leading to low signal intensity and undersampling of phosphopeptide precursor ions in consecutive data-dependent MS runs. Consequently, large-scale bacterial phosphoproteomic datasets often suffer from poor reproducibility and a high number of missing values. Here we explore the application of parallel reaction monitoring (PRM) on a Q Exactive mass spectrometer in bacterial phosphoproteome analysis, focusing especially on run-to-run sampling reproducibility. In multiple measurements of identical phosphopeptide-enriched samples, we show that PRM outperforms data-dependent acquisition (DDA) in terms of detection frequency, reaching almost complete sampling efficiency, compared to 20% in DDA. We observe a similar trend over multiple heterogeneous phosphopeptide-enriched samples and conclude that PRM shows great promise in bacterial phosphoproteomics analyses where reproducible detection and quantification of a relatively small set of phosphopeptides is desired. Bacterial phosphorylated peptides occur in low abundance compared to their unmodified counterparts, and are therefore rarely detected reproducibly in shotgun (DDA) proteomics measurements. Here we show that parallel reaction monitoring complements DDA analyses and makes detection of known, targeted phosphopeptides more reproducible. This will be of significance in replicated MS measurements whose goal is to reproducibly detect and quantify phosphopeptides of interest.
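    The sampling-reproducibility comparison above can be expressed as a detection-frequency table over replicate runs: for each targeted phosphopeptide, the fraction of runs in which it was detected. A small sketch under assumed data (the peptide names and run outcomes are hypothetical, not the study's results):

```python
# Sketch of a run-to-run detection-frequency metric: for each targeted
# phosphopeptide, the fraction of replicate runs where it was detected.
# Peptide names and run outcomes are hypothetical.

def detection_frequency(runs, targets):
    """Fraction of runs in which each target phosphopeptide appears."""
    return {t: sum(t in run for run in runs) / len(runs) for t in targets}

targets = ["pepA_pS12", "pepB_pT45", "pepC_pY7"]

# PRM: every target monitored in every run -> near-complete sampling
prm_runs = [set(targets) for _ in range(4)]
# DDA: stochastic precursor selection -> missing values across runs
dda_runs = [{"pepA_pS12"}, set(), {"pepB_pT45"}, set()]

print(detection_frequency(prm_runs, targets))
print(detection_frequency(dda_runs, targets))
```

    In this toy picture PRM reaches a frequency of 1.0 for every target while DDA leaves gaps, which is the qualitative contrast the study quantifies.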

  19. Proteolytic Digestion and TiO2 Phosphopeptide Enrichment Microreactor for Fast MS Identification of Proteins

    Science.gov (United States)

    Deng, Jingren; Lazar, Iulia M.

    2016-04-01

    The characterization of phosphorylation state(s) of a protein is best accomplished by using isolated or enriched phosphoprotein samples or their corresponding phosphopeptides. The process is typically time-consuming, as a combination of analytical approaches must often be used. To facilitate throughput in the study of phosphoproteins, a microreactor that enables a novel strategy for performing fast proteolytic digestion and selective phosphopeptide enrichment was developed. The microreactor was fabricated using 100 μm i.d. fused-silica capillaries packed with 1-2 mm beds of C18 and/or TiO2 particles. Proteolytic digestion-only, phosphopeptide enrichment-only, and sequential proteolytic digestion/phosphopeptide enrichment microreactors were developed and tested with standard protein mixtures. The protein samples were adsorbed on the C18 particles, quickly digested with a proteolytic enzyme infused over the adsorbed proteins, and further eluted onto the TiO2 microreactor for phosphopeptide enrichment. A number of parameters were optimized to speed up the digestion and enrichment processes, including microreactor dimensions, sample concentrations, digestion time, flow rates, buffer compositions, and pH. The effective time for the steps of proteolytic digestion and enrichment was less than 5 min. For simple samples, such as standard protein mixtures, this approach provided equivalent or better results than conventional bench-top methods, in terms of both enzymatic digestion and selectivity. Analysis times and reagent costs were reduced ~10- to 15-fold. Preliminary analysis of cell extracts and recombinant proteins indicated the feasibility of integration of these microreactors in more advanced workflows amenable for handling real-world biological samples.

  20. Development of an enrichment method for endogenous phosphopeptide characterization in human serum.

    Science.gov (United States)

    La Barbera, Giorgia; Capriotti, Anna Laura; Cavaliere, Chiara; Ferraris, Francesca; Laus, Michele; Piovesana, Susy; Sparnacci, Katia; Laganà, Aldo

    2018-01-01

    The work describes the development of an enrichment method for the analysis of endogenous phosphopeptides in serum. Endogenous peptides can play significant biological roles, and some of them could be exploited as future biomarkers. In this context, blood is one of the most useful biofluids for screening, but a systematic investigation of endogenous peptides, especially phosphorylated ones, is still missing, mainly due to the lack of suitable analytical methods. Thus, in this paper, different phosphopeptide enrichment strategies were pursued, based either on metal oxide affinity chromatography (MOAC, in the form of commercial TiO2 spin columns or a magnetic graphitized carbon black-TiO2 composite), or on immobilized metal ion affinity chromatography (IMAC, in the form of Ti4+-IMAC magnetic material or commercial Fe3+-IMAC spin columns). While the MOAC strategies proved completely unsuccessful, probably due to interfering phospholipids displacing phosphopeptides, the IMAC materials performed very well. Different sample preparation strategies were tested, comprising direct dilution with the loading buffer, organic solvent precipitation, and lipid removal from the matrix, as well as the addition of phosphatase inhibitors during sample handling for maximized endogenous phosphopeptide enrichment. All data were acquired by a shotgun peptidomics approach, in which peptide samples were separated by reversed-phase nanoHPLC hyphenated with high-resolution tandem mass spectrometry. The devised method allowed the identification of 176 endogenous phosphopeptides in fresh serum with added inhibitors by the direct dilution protocol and the Ti4+-IMAC magnetic material enrichment, but good results could also be obtained from the commercial Fe3+-IMAC spin column adapted to the batch enrichment protocol.

  1. High-Throughput Quantification of SH2 Domain-Phosphopeptide Interactions with Cellulose-Peptide Conjugate Microarrays.

    Science.gov (United States)

    Engelmann, Brett W

    2017-01-01

The Src Homology 2 (SH2) domain family primarily recognizes phosphorylated tyrosine (pY) containing peptide motifs. The relative affinity preferences among competing SH2 domains for phosphopeptide ligands define "specificity space," and underpin many functional pY-mediated interactions within signaling networks. The degree of promiscuity exhibited and the dynamic range of affinities supported by individual domains or phosphopeptides is best resolved by a carefully executed and controlled quantitative high-throughput experiment. Here, I describe the fabrication and application of a cellulose-peptide conjugate microarray (CPCMA) platform for the quantitative analysis of SH2 domain specificity space. Included herein are instructions for optimal experimental design, with special attention paid to common sources of systematic error, phosphopeptide SPOT synthesis, microarray fabrication, analyte titrations, data capture, and analysis.

  2. Casein phosphopeptides drastically increase the secretion of extracellular proteins in Aspergillus awamori. Proteomics studies reveal changes in the secretory pathway

    Directory of Open Access Journals (Sweden)

    Kosalková Katarina

    2012-01-01

Background: The secretion of heterologous animal proteins in filamentous fungi is usually limited by bottlenecks in the vesicle-mediated secretory pathway. Results: Using the secretion of bovine chymosin in Aspergillus awamori as a model, we found a drastic (40- to 80-fold) increase in secretion in cells grown with casein or casein phosphopeptides (CPPs). CPPs are rich in phosphoserine, but phosphoserine itself did not increase the secretion of chymosin. The stimulatory effect was reduced by about 50% with partially dephosphorylated casein and was not exerted by casamino acids. The phosphopeptide effect was not exerted at the transcriptional level; instead, it was clearly observed in the secretion of chymosin by immunodetection analysis. Proteomics studies revealed very interesting metabolic changes in response to phosphopeptide supplementation. Oxidative metabolism was reduced, as enzymes involved in fermentative processes were overrepresented. An oxygen-binding hemoglobin-like protein was overrepresented in the proteome following phosphopeptide addition. Most interestingly, the intracellular pre-protein enzymes, including pre-prochymosin, were depleted (most of them are underrepresented in the intracellular proteome) after the addition of CPPs, whereas the extracellular mature forms of several of these secretable proteins and cell-wall biosynthetic enzymes were greatly overrepresented in the secretome of phosphopeptide-supplemented cells. Another important 'moonlighting' protein, glyceraldehyde-3-phosphate dehydrogenase, which has been described to have vesicle-fusogenic and cytoskeleton-modulating activities, was also clearly overrepresented in phosphopeptide-supplemented cells. Conclusions: In summary, CPPs cause a reprogramming of cellular metabolism that leads to massive secretion of extracellular proteins.

  3. Characterization of casein phosphopeptides from fermented milk products.

    Science.gov (United States)

    Kawahara, Takeshi; Aruga, Kaori; Otani, Hajime

    2005-10-01

This study examined the potential of fermented milk products as a source of functional casein phosphopeptides (CPPs), using plain yogurts and Camembert cheeses. The CPPs were prepared by tryptic digestion from four commercially available plain yogurts (P1-P4), five Camembert cheeses (C1-C5), and raw milk. From portions containing 1 g of protein, the plain yogurts, the Camembert cheeses, and the raw milk yielded 171 mg, 139 mg, and 146 mg of CPPs, respectively. The Camembert cheeses retained a high amount of organic phosphorus (32 microg per mg CPP) compared with the raw milk (15 microg) and plain yogurts (16 microg). Reversed-phase high-performance liquid chromatographic analysis showed that the elution patterns and retention times of the three major CPP peaks from P1 and C1 were similar to those from raw milk. Moreover, CPPs from P1 and C1 showed a mitogenic effect, while CPPs from C1 showed an IgA-enhancing effect in mouse spleen cell cultures. These results suggest that fermented milk products such as plain yogurts and Camembert cheeses generate functional CPPs in the body and exert beneficial effects on the immune system.

  4. Phosphopeptide enrichment with inorganic nanofibers prepared by forcespinning technology

    Czech Academy of Sciences Publication Activity Database

    Křenková, Jana; Morávková, J.; Buk, J.; Foret, František

    2016-01-01

Roč. 1427, JAN (2016), s. 8-15 ISSN 0021-9673 R&D Projects: GA ČR(CZ) GA14-06319S; GA ČR(CZ) GBP206/12/G014 Institutional support: RVO:68081715 Keywords: nanofibers * enrichment * phosphopeptides Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.981, year: 2016

  6. Optimized IMAC-IMAC protocol for phosphopeptide recovery from complex biological samples

    DEFF Research Database (Denmark)

    Ye, Juanying; Zhang, Xumin; Young, Clifford

    2010-01-01

    using Fe(III)-NTA IMAC resin and it proved to be highly selective in the phosphopeptide enrichment of a highly diluted standard sample (1:1000) prior to MALDI MS analysis. We also observed that a higher iron purity led to an increased IMAC enrichment efficiency. The optimized method was then adapted...... to phosphoproteome analyses of cell lysates of high protein complexity. From either 20 microg of mouse sample or 50 microg of Drosophila melanogaster sample, more than 1000 phosphorylation sites were identified in each study using IMAC-IMAC and LC-MS/MS. We demonstrate efficient separation of multiply phosphorylated...... characterization of phosphoproteins in functional phosphoproteomics research projects....

  7. 3-Aminoquinoline/p-coumaric acid as a MALDI matrix for glycopeptides, carbohydrates, and phosphopeptides.

    Science.gov (United States)

    Fukuyama, Yuko; Funakoshi, Natsumi; Takeyama, Kohei; Hioki, Yusaku; Nishikaze, Takashi; Kaneshiro, Kaoru; Kawabata, Shin-Ichirou; Iwamoto, Shinichi; Tanaka, Koichi

    2014-02-18

Glycosylation and phosphorylation are important post-translational modifications in biological processes and biomarker research. The difficulty in analyzing these modifications lies mainly in their low abundance and the dissociation of labile regions such as sialic acids or phosphate groups. One solution in matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is to improve matrices for glycopeptides, carbohydrates, and phosphopeptides by increasing the sensitivity and suppressing dissociation of the labile regions. Recently, a liquid matrix, 3-aminoquinoline (3-AQ)/α-cyano-4-hydroxycinnamic acid (CHCA) (3-AQ/CHCA), introduced by Kolli et al. in 1996, has been reported to increase sensitivity for carbohydrates or phosphopeptides, but it has not been systematically evaluated for glycopeptides. In addition, 3-AQ/CHCA enhances the dissociation of labile regions. In contrast, a liquid matrix consisting of the 1,1,3,3-tetramethylguanidinium (TMG, G) salt of p-coumaric acid (CA) (G3CA) was reported to suppress dissociation of sulfate groups or sialic acids of carbohydrates. Here we introduce a liquid matrix, 3-AQ/CA, for glycopeptides, carbohydrates, and phosphopeptides. All of the analytes were detected as [M + H]+ or [M - H]- with higher or comparable sensitivity using 3-AQ/CA compared with 3-AQ/CHCA or 2,5-dihydroxybenzoic acid (2,5-DHB). The sensitivity was increased 1- to 1000-fold using 3-AQ/CA. The dissociation of labile regions such as sialic acids or phosphate groups and the fragmentation of neutral carbohydrates were suppressed more using 3-AQ/CA than using 3-AQ/CHCA or 2,5-DHB. 3-AQ/CA was thus determined to be an effective MALDI matrix for high sensitivity and the suppression of dissociation of labile regions in glycosylation and phosphorylation analyses.

  8. A novel tantalum-based sol-gel packed microextraction syringe for highly specific enrichment of phosphopeptides in MALDI-MS applications.

    Science.gov (United States)

    Çelikbıçak, Ömür; Atakay, Mehmet; Güler, Ülkü; Salih, Bekir

    2013-08-07

A new tantalum-based sol-gel material was synthesized using a unique sol-gel synthesis pathway in which PEG was incorporated into the sol-gel structure without a calcination step. This improved its chemical and physical properties for the high-capacity, selective enrichment of phosphopeptides from protein digests in complex biological media. The specificity of the tantalum-based sol-gel material for phosphopeptides was evaluated and compared with tantalum(V) oxide (Ta2O5) in different phosphopeptide enrichment applications. The tantalum-based sol-gel and tantalum(V) oxide were characterized in detail using FT-IR spectroscopy, X-ray diffraction (XRD), scanning electron microscopy (SEM), and a surface area and pore size analyzer. The characterization studies successfully demonstrated the surface morphology, pore volume, and crystallinity of the materials, as well as PEG incorporation into the sol-gel structure to produce a more hydrophilic material. Comparison of the X-ray diffractograms of the two materials showed that the broad signals of the tantalum-based sol-gel clearly reflected its amorphous structure, which was more likely to create sufficient surface area and to provide more accessible tantalum atoms for phosphopeptide adsorption than the neat, more crystalline structure of Ta2O5. Accordingly, the phosphopeptide enrichment performance of the tantalum-based sol-gel was found to be remarkably higher than that of the more crystalline Ta2O5 in our studies. Phosphopeptides at femtomole levels could be selectively enriched using the tantalum-based sol-gel and detected with a higher signal-to-noise ratio by matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS). Moreover, phosphopeptides in a tryptic digest of non-fat bovine milk, a complex real-world biological sample, were retained with higher yield using the tantalum-based sol-gel. Additionally, the sol-gel material

  9. Global proteomic profiling of phosphopeptides using electron transfer dissociation tandem mass spectrometry

    DEFF Research Database (Denmark)

    Molina, Henrik; Horn, David M; Tang, Ning

    2007-01-01

    Electron transfer dissociation (ETD) is a recently introduced mass spectrometric technique that provides a more comprehensive coverage of peptide sequences and posttranslational modifications. Here, we evaluated the use of ETD for a global phosphoproteome analysis. In all, we identified a total...... of 1,435 phosphorylation sites from human embryonic kidney 293T cells, of which 1,141 ( approximately 80%) were not previously described. A detailed comparison of ETD and collision-induced dissociation (CID) modes showed that ETD identified 60% more phosphopeptides than CID, with an average of 40% more...... fragment ions that facilitated localization of phosphorylation sites. Although our data indicate that ETD is superior to CID for phosphorylation analysis, the two methods can be effectively combined in alternating ETD and CID modes for a more comprehensive analysis. Combining ETD and CID, from this single...

  10. On-bead chemical synthesis and display of phosphopeptides for affinity pull-down proteomics

    DEFF Research Database (Denmark)

    Malene, Brandt; Madsen, Jens C.; Bunkenborg, Jakob

    2006-01-01

    We describe a new method for phosphopeptide proteomics based on the solid-phase synthesis of phosphopeptides on beads suitable for affinity pull-down experiments. Peptide sequences containing the Bad Ser112 and Ser136 phosphorylation motifs were used as bait in affinity pull-down experiments...... (aldehyde) at the C terminus for potential activity-based proteomics. The synthetic support-bound Bad phosphopeptides were able to pull down 14-3-3zeta. Furthermore, Bad phosphopeptides bound endogenous 14-3-3 proteins, and all seven members of the 14-3-3 family were identified by mass spectrometry....... In control experiments, none of the unphosphorylated Bad peptides bound transfected 14-3-3zeta or endogenous 14-3-3. We conclude that the combined synthesis and display of phosphopeptides on-bead is a fast and efficient method for affinity pull-down proteomics....

  11. Phosphopeptide derivatization signatures to identify serine and threonine phosphorylated peptides by mass spectrometry.

    Science.gov (United States)

    Molloy, M P; Andrews, P C

    2001-11-15

    The development of rapid, global methods for monitoring states of protein phosphorylation would provide greater insight for understanding many fundamental biological processes. Current best practices use mass spectrometry (MS) to profile digests of purified proteins for evidence of phosphorylation. However, this approach is beset by inherent difficulties in both identifying phosphopeptides from within a complex mixture containing many other unmodified peptides and ionizing phosphopeptides in positive-ion MS. We have modified an approach that uses barium hydroxide to rapidly eliminate the phosphoryl group of serine and threonine modified amino acids, creating dehydroamino acids that are susceptible to nucleophilic derivatization. By derivatizing a protein digest with a mixture of two different alkanethiols, phosphopeptide-specific derivatives were readily distinguished by MS due to their characteristic ion-pair signature. The resulting tagged ion pairs accommodate simple and rapid screening for phosphopeptides in a protein digest, obviating the use of isotopically labeled samples for qualitative phosphopeptide detection. MALDI-MS is used in a first pass manner to detect derivatized phosphopeptides, while the remaining sample is available for tandem MS to reveal the site of derivatization and, thus, phosphorylation. We demonstrated the technique by identifying phosphopeptides from beta-casein and ovalbumin. The approach was further used to examine in vitro phosphorylation of recombinant human HSP22 by protein kinase C, revealing phosphorylation of Thr-63.
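The ion-pair screening described in this record amounts to searching a peak list for pairs of masses separated by the mass difference between the two alkanethiol tags. The following sketch illustrates that logic only; the peak masses, the choice of ethanethiol/propanethiol as the tag pair, and the tolerance are illustrative assumptions, not values taken from the paper.

```python
# Sketch: screen a MALDI peak list for ion pairs whose spacing matches the
# mass difference between two alkanethiol derivatization tags.
# All numeric values below are illustrative, not from the cited study.

def find_ion_pairs(masses, delta, tol=0.02):
    """Return (low, high) mass pairs separated by `delta` within `tol` Da."""
    peaks = sorted(masses)
    pairs = []
    for i, m in enumerate(peaks):
        for n in peaks[i + 1:]:
            gap = n - m
            if gap > delta + tol:
                break  # list is sorted, so no later peak can still match
            if abs(gap - delta) <= tol:
                pairs.append((m, n))
    return pairs

# Hypothetical tag pair: ethanethiol vs. propanethiol adducts differ by one
# CH2 group, i.e. ~14.016 Da between the two derivative forms of a peptide.
DELTA = 14.016
spectrum = [1044.56, 1250.71, 1264.73, 1833.89, 2061.83, 2075.85]
print(find_ion_pairs(spectrum, DELTA))
# the two spaced pairs flag candidate phosphopeptide derivatives
```

A matched pair marks a peptide that carried a dehydroamino acid after barium hydroxide treatment, i.e. a former phosphoserine or phosphothreonine site; unpaired peaks are unmodified peptides.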

  12. Effect of Casein Phosphopeptide-Amorphous Calcium Phosphate and Tricalcium Phosphate on Enamel Microhardness.

    Science.gov (United States)

    Haghgou, En Hr; Haghgoo, Roza; Roholahi, Mohamad R; Ghorbani, Zahra

    2017-07-01

This study investigated the effect of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and tricalcium phosphate (TCP) on the microhardness of human enamel after induction of erosion. A total of 26 healthy impacted human third molars were chosen, and their hardness was measured using a microhardness testing machine. The samples were immersed in Coca Cola (pH = 4.7) for 8 minutes. Microhardness was then measured again, and the samples were randomly divided into four groups (two control groups and two experimental groups): (1) negative control group: artificial saliva was applied for 10 minutes; (2) positive control group: fluoride gel was applied for 10 minutes; (3) β-TCP group: TCP was applied for 10 minutes; (4) CPP-ACP group: CPP-ACP was applied for 10 minutes. The final microhardness of the samples was measured, and the within-group and between-group changes in microhardness were analyzed using paired t-tests and analysis of variance, respectively. Results were considered statistically significant at p < 0.05. Microhardness dropped significantly after soaking in soda. No significant difference in microhardness was observed between the CPP-ACP group and the TCP group (p = 0.368). Both CPP-ACP and TCP increased the microhardness of teeth; the increase in hardness in the TCP group was higher than in the CPP-ACP group, but this difference was not significant (p = 0.36). Casein phosphopeptide-amorphous calcium phosphate and TCP can promote the remineralization of erosive lesions.

  13. Three-dimensional ordered titanium dioxide-zirconium dioxide film-based microfluidic device for efficient on-chip phosphopeptide enrichment.

    Science.gov (United States)

    Zhao, De; He, Zhongyuan; Wang, Gang; Wang, Hongzhi; Zhang, Qinghong; Li, Yaogang

    2016-09-15

Microfluidic technology plays a significant role in separating biomolecules because of its miniaturization, integration, and automation. Introducing micro/nanostructured functional materials can improve the properties of microfluidic devices and extend their applications. Inverse opal has a three-dimensional ordered net-like structure; it possesses a large surface area and exhibits good mass transport, making it a good candidate for bioseparation. This study exploits inverse opal titanium dioxide-zirconium dioxide films for on-chip phosphopeptide enrichment. Titanium dioxide-zirconium dioxide inverse opal film-based microfluidic devices were constructed from templates of 270-, 340-, and 370-nm-diameter poly(methyl methacrylate) spheres. The phosphopeptide enrichment performance of these devices was evaluated by matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry. The device constructed from the 270-nm-diameter sphere template exhibited the best overall phosphopeptide enrichment of the three: because its opal template was the smallest, the resulting inverse opal film had the smallest pore sizes and the largest surface area. Enrichment by this device was also better than that of similar devices based on nanoparticle films and single-component films. The titanium dioxide-zirconium dioxide inverse opal film-based device provides a promising approach for the efficient separation of various biomolecules. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Comparing Multi-Step IMAC and Multi-Step TiO2 Methods for Phosphopeptide Enrichment

    Science.gov (United States)

    Yue, Xiaoshan; Schunter, Alissa; Hummon, Amanda B.

    2016-01-01

Phosphopeptide enrichment from complicated peptide mixtures is an essential step in mass spectrometry-based phosphoproteomic studies to reduce sample complexity and ionization suppression effects. Typical methods for enriching phosphopeptides include immobilized metal affinity chromatography (IMAC) or titanium dioxide (TiO2) beads, which have selective affinity for phosphopeptides. In this study, the IMAC enrichment method was compared with the TiO2 enrichment method, using a multi-step enrichment strategy from whole cell lysate, to evaluate their abilities to enrich different types of phosphopeptides. The peptide-to-bead ratios were optimized for both IMAC and TiO2 beads. Both IMAC and TiO2 enrichments were performed for three rounds to enable the maximum extraction of phosphopeptides from the whole cell lysates. The phosphopeptides unique to IMAC enrichment, unique to TiO2 enrichment, and identified by both methods were analyzed for their characteristics. IMAC and TiO2 enriched similar numbers of phosphopeptides with comparable efficiency. However, phosphopeptides unique to IMAC enrichment showed a higher percentage of multi-phosphopeptides, as well as a higher percentage of longer, basic, and hydrophilic phosphopeptides. Also, the IMAC and TiO2 procedures clearly enriched phosphopeptides with different motifs. Finally, further enrichment with two rounds of TiO2 from the supernatant after IMAC enrichment, or with two rounds of IMAC from the supernatant after TiO2 enrichment, does not fully recover the phosphopeptides that are not identified with the corresponding multi-step enrichment. PMID:26237447
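The unique/shared comparison in this study reduces to set arithmetic over the identified phosphopeptide sequences. A minimal sketch, with invented peptide identifiers standing in for real identifications:

```python
# Sketch: classify phosphopeptide IDs as unique to IMAC, unique to TiO2,
# or shared by both enrichments. The ID sets are invented for illustration.

def compare_enrichments(imac_ids, tio2_ids):
    imac, tio2 = set(imac_ids), set(tio2_ids)
    both = imac & tio2
    return {
        "imac_only": imac - tio2,
        "tio2_only": tio2 - imac,
        "both": both,
        # share of all identifications recovered by both methods
        "overlap_pct": 100.0 * len(both) / len(imac | tio2),
    }

imac = {"pep1", "pep2", "pep3", "pep4"}
tio2 = {"pep3", "pep4", "pep5"}
result = compare_enrichments(imac, tio2)
print(sorted(result["imac_only"]), round(result["overlap_pct"], 1))
# → ['pep1', 'pep2'] 40.0
```

The "imac_only" and "tio2_only" sets are the populations whose length, charge, and hydrophilicity distributions the study compares; the overlap percentage quantifies how complementary the two methods are.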

  15. Casein phosphopeptide-amorphous calcium phosphate and shear bond strength of adhesives to primary teeth enamel.

    Science.gov (United States)

    Farokh Gisovar, Elham; Hedayati, Nassim; Shadman, Niloofar; Shafiee, Leila

    2015-02-01

CPP-ACP (casein phosphopeptide-amorphous calcium phosphate) has an important role in caries prevention in pediatric patients. This study was done because of the widespread use of CPP-ACP, the need to restore teeth treated with CPP-ACP, and the importance of the shear bond strength of adhesives to the success of restorations. It aimed to evaluate the effect of CPP-ACP on the shear bond strength of dental adhesives to the enamel of primary molars. This in vitro study was conducted on 180 extracted primary molars. They were randomly divided into 6 groups, and each group was divided into 2 subgroups (treated with CPP-ACP and untreated). In the CPP-ACP subgroups, enamel was treated with CPP-ACP paste for 1 h/day for 5 days. The adhesives evaluated in this study were Tetric N-Bond, AdheSE, AdheSE One F, Single Bond 2, SE Bond, and Adper Prompt L-Pop. Shear bond strength was tested with a universal testing machine, and the mode of failure was evaluated under a stereomicroscope. Data were analyzed by t-test, two-way analysis of variance (ANOVA), and Tukey and Fisher exact tests using SPSS 18, with p < 0.05 considered significant. The shear bond strength of the adhesive systems to the enamel of primary teeth treated and untreated with CPP-ACP showed no significant difference (p > 0.05). The mode of failure in all groups, regardless of CPP-ACP administration, was mainly adhesive. Our results indicate that CPP-ACP did not affect the shear bond strength of the studied adhesives to primary tooth enamel.

  16. Improving Loop Dependence Analysis

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven

    2017-01-01

    Programmers can no longer depend on new processors to have significantly improved single-thread performance. Instead, gains have to come from other sources such as the compiler and its optimization passes. Advanced passes make use of information on the dependencies related to loops. We improve th...

  17. Cyclic phosphopeptides for interference with Grb2 SH2 domain signal transduction prepared by ring-closing metathesis and phosphorylation

    NARCIS (Netherlands)

    Dekker, Frank J; de Mol, Nico J; Fischer, Marcel J E; Kemmink, Johan; Liskamp, Rob M J; Dekker, Frank

    2003-01-01

    Cyclic phosphopeptides were prepared using ring-closing metathesis followed by phosphorylation. These cyclic phosphopeptides were designed to interact with the SH2 domain of Grb2, which is a signal transduction protein of importance as a target for antiproliferative drug development. Binding of

  18. In vivo Phosphoproteome of Human Skeletal Muscle Revealed by Phosphopeptide Enrichment and HPLC-ESI-MS/MS

    DEFF Research Database (Denmark)

    Højlund, Kurt; Bowen, Benjamin P; Hwang, Hyonson

    2009-01-01

    volunteers. Trypsin digestion of 3-5 mg human skeletal muscle protein was followed by phosphopeptide enrichment using SCX and TiO2. The resulting phosphopeptides were analyzed by HPLC-ESI-MS/MS. Using this unbiased approach, we identified 306 distinct in vivo phosphorylation sites in 127 proteins, including...

  19. Phosphotyrosine-based-phosphoproteomics scaled-down to biopsy level for analysis of individual tumor biology and treatment selection.

    Science.gov (United States)

    Labots, Mariette; van der Mijn, Johannes C; Beekhof, Robin; Piersma, Sander R; de Goeij-de Haas, Richard R; Pham, Thang V; Knol, Jaco C; Dekker, Henk; van Grieken, Nicole C T; Verheul, Henk M W; Jiménez, Connie R

    2017-06-06

Mass spectrometry-based phosphoproteomics of cancer cell and tissue lysates provides insight in aberrantly activated signaling pathways and potential drug targets. For improved understanding of individual patient's tumor biology and to allow selection of tyrosine kinase inhibitors in individual patients, phosphoproteomics of small clinical samples should be feasible and reproducible. We aimed to scale down a pTyr-phosphopeptide enrichment protocol to biopsy-level protein input and assess reproducibility and applicability to tumor needle biopsies. To this end, phosphopeptide immunoprecipitation using anti-phosphotyrosine beads was performed using 10, 5, and 1 mg protein input from lysates of colorectal cancer (CRC) cell line HCT116. Multiple needle biopsies from 7 human CRC resection specimens were analyzed at the 1 mg level. The total number of phosphopeptides captured and detected by LC-MS/MS ranged from 681 at 10 mg input to 471 at 1 mg HCT116 protein. ID-reproducibility ranged from 60.5% at 10 mg to 43.9% at 1 mg. Per 1 mg-level biopsy sample, >200 phosphopeptides were identified with 57% ID-reproducibility between paired tumor biopsies. Unsupervised analysis clustered biopsies from individual patients together and revealed known and potential therapeutic targets. This study demonstrates the feasibility of label-free pTyr-phosphoproteomics at the tumor biopsy level based on reproducible analyses using 1 mg of protein input. The considerable number of identified phosphopeptides at this level is attributed to an effective down-scaled immuno-affinity protocol as well as to the application of ID propagation in the data processing and analysis steps. Unsupervised cluster analysis reveals patient-specific profiles. Together, these findings pave the way for clinical trials in which pTyr-phosphoproteomics will be performed on pre- and on-treatment biopsies. Such studies will improve our understanding of individual tumor biology and may enable future p
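The ID-reproducibility figures quoted above can be computed as the overlap between two replicate identification lists. Definitions vary between studies, so this sketch uses shared IDs divided by the mean list size as one plausible convention, with invented peptide IDs; the paper's exact formula may differ.

```python
# Sketch: ID-reproducibility between two replicate runs, defined here as
# the number of shared identifications divided by the mean number of
# identifications per run. Convention and IDs are assumptions for illustration.

def id_reproducibility(run_a, run_b):
    a, b = set(run_a), set(run_b)
    shared = len(a & b)
    mean_size = (len(a) + len(b)) / 2
    return 100.0 * shared / mean_size

rep1 = {f"pep{i}" for i in range(220)}       # 220 IDs in biopsy replicate 1
rep2 = {f"pep{i}" for i in range(100, 340)}  # 240 IDs in biopsy replicate 2
print(round(id_reproducibility(rep1, rep2), 1))
# shared IDs pep100..pep219 (120) against a mean of 230 -> ~52.2%
```

Under this convention, two runs of ~200-500 IDs with roughly half their identifications in common land in the 44-60% range reported for the scaled-down protocol.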

  20. Treatment of post-orthodontic white spot lesions with casein phosphopeptide-stabilised amorphous calcium phosphate

    DEFF Research Database (Denmark)

    Bröchner, Ann; Christensen, Carsten; Kristensen, Bjarne

    2010-01-01

    This study aims to investigate the effect of topical applications of 10% casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) on white spot lesions (WSL) detected after treatment with fixed orthodontic appliances. Sixty healthy adolescents with >/=1 clinically visible WSL at debonding were...... findings were largely reflected by the clinical scores. No side effects were reported. Topical treatment of white spot lesions after debonding of orthodontic appliances with a casein phosphopeptide-stabilised amorphous calcium phosphate agent resulted in significantly reduced fluorescence and a reduced...

  1. Enrichment and characterization of phosphopeptides by immobilized metal affinity chromatography (IMAC) and mass spectrometry

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N

    2009-01-01

    The combination of immobilized metal affinity chromatography (IMAC) and mass spectrometry is a widely used technique for enrichment and sequencing of phosphopeptides. In the IMAC method, negatively charged phosphate groups interact with positively charged metal ions (Fe3+, Ga3+, and Al3...

  2. Structural insights into the recognition of phosphopeptide by the FHA domain of kanadaptin

    DEFF Research Database (Denmark)

    Xu, Qingping; Deller, Marc C; Nielsen, Tine K

    2014-01-01

    with a phosphopeptide mimic derived from a peptide segment from the N-terminus of a symmetry-related molecule as well as a sulfate bound to the structurally conserved phosphothreonine recognition cleft. This structure provides insights into the molecular recognition features utilized by this family of proteins...

  3. Comparative evaluation of Nano-hydroxyapatite and casein Phosphopeptide-amorphous calcium phosphate on the remineralization potential of early enamel lesions: An in vitro study

    Directory of Open Access Journals (Sweden)

    Anshul Sharma

    2017-01-01

Background: Remineralizing agents in a wide variety of formulations have proved beneficial in caries management. The casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) nanocomplex has been recommended and used as a remineralizing agent. Nano-hydroxyapatite (n-HAp) is one of the most biocompatible and bioactive materials, with a wide range of applications in dentistry, but does it outperform CPP-ACP? Aims: To evaluate and compare the remineralizing efficiency of pastes containing nano-hydroxyapatite and casein phosphopeptide-amorphous calcium phosphate. Settings and Design: The study was an in vitro, single-blinded study with lottery-method randomization, approved by the Institutional Ethics Committee. Materials and Methods: Thirty non-carious premolars were demineralized, divided into 2 groups, and subjected to remineralization. The samples were analyzed for surface hardness and mineral content. Statistical Analysis: Student's t-test and repeated-measures ANOVA were applied. Results: Average hardness in the nano-hydroxyapatite group increased to 340 ± 31.70 SD and 426 ± 50.62 SD after 15 and 30 days, respectively, and in the CPP-ACP group to 355.83 ± 38.55 SD and 372.67 ± 53.63 SD. The change in hardness values was not statistically significant (p = 0.39; p > 0.05). Calcium and phosphorus levels increased in both groups, but not significantly. Conclusion: Both agents are effective in remineralizing enamel. Nano-hydroxyapatite is more effective than casein phosphopeptide-amorphous calcium phosphate in increasing the calcium and phosphorus content of enamel, and this effect is more evident over a longer treatment period. Key Message: Remineralizing agents are a boon for caries management. With the advent of many formulations, it is difficult to clinically select an agent. This study compares the remineralizing potential of Casein

  4. Hierarchically templated beads with tailored pore structure for phosphopeptide capture and phosphoproteomics

    DEFF Research Database (Denmark)

    Wierzbicka, Celina; Torsetnes, Silje B.; Jensen, Ole N.

    2017-01-01

    Two templating approaches to produce imprinted phosphotyrosine capture beads with a controllable pore structure are reported and compared with respect to their ability to enrich phosphopeptides from a tryptic peptide mixture. The beads were prepared by the polymerization of urea-based host monomers...... and crosslinkers inside the pores of macroporous silica beads with both free and immobilized template. In the final step the silica was removed by fluoride etching resulting in mesoporous polymer replicas with narrow pore size distributions, pore diameters ≈ 10 nm and surface area > 260 m2 g-1. The beads displayed...... pronounced phosphotyrosine affinity and selectivity in binding tests using model peptides in acetonitrile rich solutions with a performance surpassing solution polymerized bulk imprinted materials. Tests of the beads for the enrichment of phosphopeptides from tryptic digests of twelve proteins revealed both...

  5. Enzymatic Dissolution of Biocomposite Solids Consisting of Phosphopeptides to Form Supramolecular Hydrogels

    KAUST Repository

    Shi, Junfeng; Yuan, Dan; Haburcak, Richard; Zhang, Qiang; Zhao, Chao; Zhang, Xixiang; Xu, Bing

    2015-01-01

    Enzyme-catalyzed dephosphorylation is essential for biomineralization and bone metabolism. Here we report the exploration of using enzymatic reaction to transform biocomposites of phosphopeptides and calcium (or strontium) ions to supramolecular hydrogels as a mimic of enzymatic dissolution of biominerals. 31P NMR shows that strong affinity between the phosphopeptides and alkaline metal ions (e.g., Ca2+ or Sr2+) induces the formation of biocomposites as precipitates. Electron microscopy reveals that the enzymatic reaction regulates the morphological transition from particles to nanofibers. Rheology confirms the formation of a rigid hydrogel. As the first example of enzyme-instructed dissolution of a solid to form supramolecular nanofibers/hydrogels, this work provides an approach to generate soft materials with desired properties, expands the application of supramolecular hydrogelators, and offers insights to control the demineralization of calcified soft tissues.

  7. Incorporation of casein phosphopeptide-amorphous calcium phosphate into a glass-ionomer cement.

    Science.gov (United States)

    Mazzaoui, S A; Burrow, M F; Tyas, M J; Dashper, S G; Eakins, D; Reynolds, E C

    2003-11-01

    Casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) nanocomplexes have been shown to prevent demineralization and promote remineralization of enamel subsurface lesions in animal and in situ caries models. The aim of this study was to determine the effect of incorporating CPP-ACP into a self-cured glass-ionomer cement (GIC). Incorporation of 1.56% w/w CPP-ACP into the GIC significantly increased microtensile bond strength (33%) and compressive strength (23%) and significantly enhanced the release of calcium, phosphate, and fluoride ions at neutral and acidic pH. MALDI mass spectrometry also showed casein phosphopeptides from the CPP-ACP nanocomplexes to be released. The release of CPP-ACP and fluoride from the CPP-ACP-containing GIC was associated with enhanced protection of the adjacent dentin during acid challenge in vitro.

  8. Iminodiacetic acid-modified magnetic poly(2-hydroxyethyl methacrylate)-based microspheres for phosphopeptide enrichment

    Czech Academy of Sciences Publication Activity Database

    Novotná, L.; Emmerová, T.; Horák, Daniel; Kučerová, Z.; Tichá, M.

    2010-01-01

    Roč. 1217, č. 51 (2010), s. 8032-8040 ISSN 0021-9673 R&D Projects: GA AV ČR(CZ) KAN401220801; GA ČR GA203/09/0857; GA ČR GAP503/10/0664 Institutional research plan: CEZ:AV0Z40500505 Keywords : IMAC phosphopeptide separation * IDA-modified magnetic microspheres * Porcine pepsin A Subject RIV: EE - Microbiology, Virology Impact factor: 4.194, year: 2010

  9. Highly selective manganese-doped zinc sulfide quantum dots based label free phosphorescent sensor for phosphopeptides in presence of zirconium (IV).

    Science.gov (United States)

    Gong, Yan; Fan, Zhefeng

    2015-04-15

    We report a room-temperature phosphorescence (RTP) sensor for phosphopeptides based on zirconium(IV)-modulated mercaptopropionic acid (MPA)-capped Mn-doped ZnS quantum dots (QDs). This sensor combines the advantages of the well-known Zr(4+)-phosphopeptide affinity pair and the RTP properties of doped QDs. The RTP of Mn-doped ZnS QDs capped with MPA can be effectively quenched by Zr(4+). The high affinity of phosphopeptides for Zr(4+) enables dissociation of the ion from the surface of the MPA-capped ZnS QDs, forming a stable complex with phosphopeptides in solution and recovering the RTP of the QDs. The Zr(4+)-induced RTP quenching and subsequent phosphopeptide-induced RTP recovery of MPA-capped ZnS QDs provide a solid basis for the present QD-based RTP sensor for the detection of phosphopeptides. The detection limit for phosphopeptides is 0.9 ng mL(-1), the relative standard deviation is 2.5%, and the recovery from urine and serum samples spiked with phosphopeptides ranges from 96% to 105% under optimal conditions. The proposed method was successfully applied to biological fluids and gave satisfactory results. Copyright © 2014 Elsevier B.V. All rights reserved.
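For context, a spike-recovery figure like the 96-105% quoted above is obtained by comparing the signal increase after a standard addition with the amount added. A minimal sketch with hypothetical concentrations (not the paper's data):

```python
# Spike-recovery calculation (values hypothetical; units e.g. ng/mL):
# recovery (%) = (found in spiked sample - found in blank) / amount added * 100
def spike_recovery(found_spiked, found_blank, added):
    return 100.0 * (found_spiked - found_blank) / added

# e.g. a serum sample reading 5.0 before and 14.9 after adding 10.0
print(round(spike_recovery(14.9, 5.0, 10.0), 1))  # -> 99.0
```

Recoveries near 100% across matrices (urine, serum) indicate that matrix components neither suppress nor inflate the sensor response.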

  10. Clinical evaluation of remineralization potential of casein phosphopeptide amorphous calcium phosphate nanocomplexes for enamel decalcification in orthodontics.

    Science.gov (United States)

    Wang, Jun-xiang; Yan, Yan; Wang, Xiu-jing

    2012-11-01

    Enamel decalcification in orthodontics is a concern for dentists, and methods to remineralize these lesions are the focus of intense research. The aim of this study was to evaluate the remineralizing effect of casein phosphopeptide amorphous calcium phosphate (CPP-ACP) nanocomplexes on enamel decalcification in orthodontics. Twenty orthodontic patients with decalcified enamel lesions arising during fixed orthodontic therapy were recruited to this study as the test group, and twenty orthodontic patients with a similar condition served as the control group. GC Tooth Mousse, whose main component is CPP-ACP, was used by each patient in the test group every night after tooth-brushing for six months. Each patient in the control group was asked to brush twice a day with toothpaste containing 1100 parts per million (ppm) of fluoride. Standardized intraoral images were taken for all patients, and the extent of enamel decalcification was evaluated before and after treatment over the study period. Measurements were statistically compared by t test. After using CPP-ACP for six months, the enamel decalcification index (EDI) of all patients had decreased; the mean EDI before using CPP-ACP was 0.191 ± 0.025 and that after using CPP-ACP was 0.183 ± 0.023, a significant difference (t = 5.169, P < 0.05). CPP-ACP can effectively improve demineralized enamel lesions during orthodontic treatment, so it has some remineralization potential for enamel decalcification in orthodontics.

  11. Molecularly Imprinted Porous Monolithic Materials from Melamine-Formaldehyde for Selective Trapping of Phosphopeptides

    DEFF Research Database (Denmark)

    Liu, Mingquan; Tran, Tri Minh; Abbas Elhaj, Ahmed Awad

    2017-01-01

    monoliths, chosen based on the combination of meso- and macropores providing optimal percolative flow and accessible surface area, was synthesized in the presence of N-Fmoc and O-Et protected phosphoserine and phosphotyrosine to prepare molecularly imprinted monoliths with surface layers selective...... for phosphopeptides. These imprinted monoliths were characterized alongside nonimprinted monoliths by a variety of techniques and finally evaluated by liquid chromatography-mass spectrometry in the capillary format to assess their abilities to trap and release phosphorylated amino acids and peptides from partly...

  12. Phosphopeptide Enrichment by Covalent Chromatography after Derivatization of Protein Digests Immobilized on Reversed-Phase Supports

    Science.gov (United States)

    Nika, Heinz; Nieves, Edward; Hawke, David H.; Angeletti, Ruth Hogue

    2013-01-01

    A rugged sample-preparation method for comprehensive affinity enrichment of phosphopeptides from protein digests has been developed. The method uses a series of chemical reactions to incorporate efficiently and specifically a thiol-functionalized affinity tag into the analyte by barium hydroxide catalyzed β-elimination with Michael addition using 2-aminoethanethiol as nucleophile and subsequent thiolation of the resulting amino group with sulfosuccinimidyl-2-(biotinamido) ethyl-1,3-dithiopropionate. Gentle oxidation of cysteine residues, followed by acetylation of α- and ε-amino groups before these reactions, ensured selectivity of reversible capture of the modified phosphopeptides by covalent chromatography on activated thiol sepharose. The use of C18 reversed-phase supports as a miniaturized reaction bed facilitated optimization of the individual modification steps for throughput and completeness of derivatization. Reagents were exchanged directly on the supports, eliminating sample transfer between the reaction steps and thus, allowing the immobilized analyte to be carried through the multistep reaction scheme with minimal sample loss. The use of this sample-preparation method for phosphopeptide enrichment was demonstrated with low-level amounts of in-gel-digested protein. As applied to tryptic digests of α-S1- and β-casein, the method enabled the enrichment and detection of the phosphorylated peptides contained in the mixture, including the tetraphosphorylated species of β-casein, which has escaped chemical procedures reported previously. The isolates proved highly suitable for mapping the sites of phosphorylation by collisionally induced dissociation. β-Elimination, with consecutive Michael addition, expanded the use of the solid-phase-based enrichment strategy to phosphothreonyl peptides and to phosphoseryl/phosphothreonyl peptides derived from proline-directed kinase substrates and to their O-sulfono- and O-linked β-N-acetylglucosamine (O

  13. Search for phosphopeptides in the feces of axenic rats fed radioactive ovine casein

    International Nuclear Information System (INIS)

    Pelissier, J.P.; Dubos, F.; Daburon, F.

    1981-01-01

    Radioactive ovine casein was obtained by injecting 100 μCi of 14C-Ser into the jugular vein of a ewe. The milk collected 17 and 24 h after this injection contained 12% of the injected radioactivity in protein form. The seryl residues were specifically labelled. This casein was used as the only protein source fed to axenic rats; 0.30% of the ingested tracer was found in the feces of those rats. Since phosphoserine represented 25% of the total casein seryl residues, the phosphopeptides may not be selectively unabsorbable [fr

  14. Neighbor-directed histidine N(τ) alkylation. A route to imidazolium-containing phosphopeptide macrocycles

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Wen-Jian [National Cancer Inst., Frederick, MD (United States); Park, Jung-Eun [National Cancer Inst., Bethesda, MD (United States); Grant, Robert [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Lai, Christopher C. [National Cancer Inst., Frederick, MD (United States); Kelley, James A. [National Cancer Inst., Frederick, MD (United States); Yaffe, Michael B. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Lee, Kyung S. [National Cancer Inst., Bethesda, MD (United States); Burke, Terrence R. [National Cancer Inst., Frederick, MD (United States)

    2015-07-07

    Our recently discovered, selective, on-resin route to N(τ)-alkylated imidazolium-containing histidine residues affords new strategies for peptide mimetic design. Here we demonstrate the use of this chemistry to prepare a series of macrocyclic phosphopeptides in which imidazolium groups serve as ring-forming junctions. These cationic moieties subsequently serve to charge-mask the phosphoamino acid group that directed their formation. Furthermore, neighbor-directed histidine N(τ)-alkylation opens the door to new families of phosphopeptidomimetics for use in a range of chemical biology contexts.

  15. Comparison of Zirconium Phosphonate-Modified Surfaces for Immobilizing Phosphopeptides and Phosphate-Tagged Proteins.

    Science.gov (United States)

    Forato, Florian; Liu, Hao; Benoit, Roland; Fayon, Franck; Charlier, Cathy; Fateh, Amina; Defontaine, Alain; Tellier, Charles; Talham, Daniel R; Queffélec, Clémence; Bujoli, Bruno

    2016-06-07

    Different routes for preparing zirconium phosphonate-modified surfaces for immobilizing biomolecular probes are compared. Two chemical-modification approaches were explored to form self-assembled monolayers on commercially available primary amine-functionalized slides, and the resulting surfaces were compared to well-characterized zirconium phosphonate monolayer-modified supports prepared using Langmuir-Blodgett methods. When using POCl3 as the amine phosphorylating agent followed by treatment with zirconyl chloride, the result was not a zirconium-phosphonate monolayer, as commonly assumed in the literature; rather, the process gives adsorbed zirconium oxide/hydroxide species and, to a lesser extent, adsorbed zirconium phosphate and/or phosphonate. Reactions giving rise to these products were modeled in homogeneous-phase studies. Nevertheless, each of the three modified surfaces effectively immobilized phosphopeptides and phosphopeptide tags fused to an affinity protein. Unexpectedly, the zirconium oxide/hydroxide-modified surface, formed by treating the amine-coated slides with POCl3/Zr(4+), afforded better immobilization of the peptides and proteins and efficient capture of their targets.

  16. Structural and biophysical investigation of the interaction of a mutant Grb2 SH2 domain (W121G) with its cognate phosphopeptide.

    Science.gov (United States)

    Papaioannou, Danai; Geibel, Sebastian; Kunze, Micha B A; Kay, Christopher W M; Waksman, Gabriel

    2016-03-01

    The adaptor protein Grb2 is a key element of mitogenetically important signaling pathways. With its SH2 domain it binds to upstream targets while its SH3 domains bind to downstream proteins thereby relaying signals from the cell membranes to the nucleus. The Grb2 SH2 domain binds to its targets by recognizing a phosphotyrosine (pY) in a pYxNx peptide motif, requiring an Asn at the +2 position C-terminal to the pY with the residue either side of this Asn being hydrophobic. Structural analysis of the Grb2 SH2 domain in complex with its cognate peptide has shown that the peptide adopts a unique β-turn conformation, unlike the extended conformation that phosphopeptides adopt when bound to other SH2 domains. TrpEF1 (W121) is believed to force the peptide into this unusual conformation conferring this unique specificity to the Grb2 SH2 domain. Using X-ray crystallography, electron paramagnetic resonance (EPR) spectroscopy, and isothermal titration calorimetry (ITC), we describe here a series of experiments that explore the role of TrpEF1 in determining the specificity of the Grb2 SH2 domain. Our results demonstrate that the ligand does not adopt a pre-organized structure before binding to the SH2 domain, rather it is the interaction between the two that imposes the hairpin loop to the peptide. Furthermore, we find that the peptide adopts a similar structure when bound to both the wild-type Grb2 SH2 domain and a TrpEF1Gly mutant. This suggests that TrpEF1 is not the determining factor for the conformation of the phosphopeptide. © 2015 The Protein Society.

  17. Specific phosphopeptide binding regulates a conformational change in the PI 3-kinase SH2 domain associated with enzyme activation.

    Science.gov (United States)

    Shoelson, S E; Sivaraja, M; Williams, K P; Hu, P; Schlessinger, J; Weiss, M A

    1993-01-01

    SH2 (src-homology 2) domains define a newly recognized binding motif that mediates the physical association of target phosphotyrosyl proteins with downstream effector enzymes. An example of such phosphoprotein-effector coupling is provided by the association of phosphatidylinositol 3-kinase (PI 3-kinase) with specific phosphorylation sites within the PDGF receptor, the c-Src/polyoma virus middle T antigen complex and the insulin receptor substrate IRS-1. Notably, phosphoprotein association with the SH2 domains of p85 also stimulates an increase in catalytic activity of the PI 3-kinase p110 subunit, which can be mimicked by phosphopeptides corresponding to targeted phosphoprotein phosphorylation sites. To investigate how phosphoprotein binding to the p85 SH2 domain stimulates p110 catalytic activation, we have examined the differential effects of phosphotyrosine and PDGF receptor-, IRS-1- and c-Src-derived phosphopeptides on the conformation of an isolated SH2 domain of PI 3-kinase. Although phosphotyrosine and both activating and non-activating phosphopeptides bind to the SH2 domain, activating phosphopeptides bind with higher affinity and induce a qualitatively distinct conformational change as monitored by CD and NMR spectroscopy. Amide proton exchange and protease protection assays further show that high-affinity, specific phosphopeptide binding induces non-local dynamic SH2 domain stabilization. Based on these findings we propose that specific phosphoprotein binding to the p85 subunit induces a change in SH2 domain structure which is transmitted to the p110 subunit and regulates enzymatic activity by an allosteric mechanism. PMID: 8382612

  18. ACCA phosphopeptide recognition by the BRCT repeats of BRCA1.

    Science.gov (United States)

    Ray, Hind; Moreau, Karen; Dizin, Eva; Callebaut, Isabelle; Venezia, Nicole Dalla

    2006-06-16

    The tumour suppressor gene BRCA1 encodes a 220 kDa protein that participates in multiple cellular processes. The BRCA1 protein contains a tandem of two BRCT repeats at its carboxy-terminal region. The majority of disease-associated BRCA1 mutations affect this region, giving the BRCT repeats a central role in the BRCA1 tumour suppressor function. The BRCT repeats have been shown to mediate phospho-dependent protein-protein interactions. They recognize phosphorylated peptides using a recognition groove that spans both BRCT repeats. We previously identified an interaction between the tandem of BRCA1 BRCT repeats and ACCA, which was disrupted by germ-line BRCA1 mutations that affect the BRCT repeats. We recently showed that BRCA1 modulates ACCA activity through its phospho-dependent binding to ACCA. To delineate the region of ACCA that is crucial for the regulation of its activity by BRCA1, we searched for potential phosphorylation sites in the ACCA sequence that might be recognized by the BRCA1 BRCT repeats. Using sequence analysis and structure modelling, we proposed the Ser1263 residue as the most favourable candidate among six residues for recognition by the BRCA1 BRCT repeats. Using experimental approaches such as a GST pull-down assay with Bosc cells, we clearly showed that phosphorylation of Ser1263 alone was essential for the interaction of ACCA with the BRCT repeats. Finally, we demonstrated by immunoprecipitation of ACCA in cells that the whole BRCA1 protein interacts with ACCA when phosphorylated on Ser1263.

  19. Simultaneous quantification of protein phosphorylation sites using liquid chromatography-tandem mass spectrometry-based targeted proteomics: a linear algebra approach for isobaric phosphopeptides.

    Science.gov (United States)

    Xu, Feifei; Yang, Ting; Sheng, Yuan; Zhong, Ting; Yang, Mi; Chen, Yun

    2014-12-05

    As one of the most studied post-translational modifications (PTM), protein phosphorylation plays an essential role in almost all cellular processes. Current methods are able to predict and determine thousands of phosphorylation sites, whereas stoichiometric quantification of these sites is still challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics is emerging as a promising technique for site-specific quantification of protein phosphorylation using proteolytic peptides as surrogates of proteins. However, several issues may limit its application, one of which relates to the phosphopeptides with different phosphorylation sites and the same mass (i.e., isobaric phosphopeptides). While employment of site-specific product ions allows for these isobaric phosphopeptides to be distinguished and quantified, site-specific product ions are often absent or weak in tandem mass spectra. In this study, linear algebra algorithms were employed as an add-on to targeted proteomics to retrieve information on individual phosphopeptides from their common spectra. To achieve this simultaneous quantification, a LC-MS/MS-based targeted proteomics assay was first developed and validated for each phosphopeptide. Given the slope and intercept of calibration curves of phosphopeptides in each transition, linear algebraic equations were developed. Using a series of mock mixtures prepared with varying concentrations of each phosphopeptide, the reliability of the approach to quantify isobaric phosphopeptides containing multiple phosphorylation sites (≥ 2) was discussed. Finally, we applied this approach to determine the phosphorylation stoichiometry of heat shock protein 27 (HSP27) at Ser78 and Ser82 in breast cancer cells and tissue samples.
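The linear-algebra step can be made concrete. Under the model described, each transition's signal is a linear function of the concentrations of the co-eluting isobaric phosphopeptides, with per-transition slopes and intercepts taken from the individual calibration curves; solving the resulting system recovers the individual concentrations. All numbers below are hypothetical, for illustration only:

```python
# Illustrative sketch (hypothetical calibration values): two isobaric
# phosphopeptides A and B contribute linearly to each transition's signal,
#   y_i = m_iA * cA + m_iB * cB + b_i   for transition i.
# With two site-discriminating transitions, the two concentrations follow
# from the 2x2 linear system (solved here by Cramer's rule).

def solve_isobaric(m, b, y):
    """m: 2x2 slopes, b: intercepts, y: measured signals -> (cA, cB)."""
    r0, r1 = y[0] - b[0], y[1] - b[1]            # subtract intercepts
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]  # must be nonzero
    cA = (r0 * m[1][1] - m[0][1] * r1) / det
    cB = (m[0][0] * r1 - r0 * m[1][0]) / det
    return cA, cB

# Hypothetical calibration: slope per (transition, peptide) and intercepts
m = [[2.0, 0.5],
     [0.4, 1.8]]
b = [0.1, 0.05]

# Simulate a mixture of cA = 3.0, cB = 1.5 and recover it from the signals
y = [m[0][0] * 3.0 + m[0][1] * 1.5 + b[0],
     m[1][0] * 3.0 + m[1][1] * 1.5 + b[1]]
print(tuple(round(c, 6) for c in solve_isobaric(m, b, y)))  # -> (3.0, 1.5)
```

With more than two phosphorylation sites (or transitions), the same idea generalizes to a least-squares solution of an overdetermined system, provided at least some transitions discriminate between the sites.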

  20. Synthesis of Thermally Switchable Chromatographic Materials with Immobilized Ti4+ for Enrichment of Phosphopeptides by Reversible Addition Fragmentation Chain Transfer Polymerization

    Science.gov (United States)

    Wang, Di; Cao, Zhihan; Pang, Xinzhu; Deng, Yulin; Li, Bo; Dai, Rongji

    2018-01-01

    Reversible phosphorylation of proteins is one of the most crucial types of post-translational modifications (PTMs) and plays a significant role in diverse biological processes. However, the separation of phosphorylated peptides remains an analytical challenge in phosphoproteomics, because phosphopeptides are usually present at low stoichiometry. Thus, enrichment of phosphopeptides before detection is indispensable. In this study, a novel temperature-regulated separation protocol was developed. Silica@p(NIPAAm-co-IPPA)-Ti4+, a new Ti(IV)-IMAC (immobilized metal affinity chromatography) material, was synthesized by reversible addition-fragmentation chain transfer (RAFT) polymerization. Owing to the thermally responsive properties of poly(N-isopropylacrylamide) (PNIPAAm), the captured phosphorylated peptides could be released by changing the temperature alone, without applying any eluent that could damage the phosphopeptides. We employed isopropanol phosphonic acid (IPPA) as the IMAC ligand for immobilization of Ti(IV), which increased the specific adsorption of phosphopeptides. The enrichment and release properties were examined with pyridoxal 5'-phosphate (PLP) and casein phosphopeptides (CPP); both phosphorylated compounds showed temperature-stimulated binding to Ti4+. Finally, silica@p(NIPAAm-co-IPPA)-Ti4+ was successfully employed in the pretreatment of phosphopeptides from a tryptic digest of α-casein and human serum albumin (HSA). The results indicate the great potential of this new temperature-responsive material in phosphoproteomics studies.

  1. [Conversation analysis for improving nursing communication].

    Science.gov (United States)

    Yi, Myungsun

    2007-08-01

    Nursing communication has become more important than ever before because the quality of nursing services largely depends on the quality of communication in a very competitive health care environment. This article introduces ways to improve nursing communication using conversation analysis. It is a review study on conversation analysis, critically examining previous studies in nursing communication and interpersonal relationships. The study provides the theoretical background and basic assumptions of conversation analysis, which was influenced by ethnomethodology, phenomenology, and sociolinguistics. In addition, the characteristics and analysis methods of conversation analysis are illustrated in detail. Lastly, the article shows how conversation analysis can help improve communication by examining research that applies conversation analysis not only to ordinary conversations but also to extraordinary or difficult conversations, such as those between patients with dementia and their professional nurses. Conversation analysis can help improve nursing communication by providing various structures, patterns, and prototypes of conversation, and by suggesting specific problems and problem-solving strategies in communication.

  2. An SH2 domain model of STAT5 in complex with phospho-peptides define "STAT5 Binding Signatures"

    Science.gov (United States)

    Gianti, Eleonora; Zauhar, Randy J.

    2015-05-01

    The signal transducer and activator of transcription 5 (STAT5) is a member of the STAT family of proteins, implicated in cell growth and differentiation. STAT activation is regulated by phosphorylation of protein monomers at conserved tyrosine residues, followed by binding to phospho-peptide pockets and subsequent dimerization. STAT5 is implicated in the development of severe pathological conditions, including many forms of cancer. However, only a few STAT5 inhibitors are known to date, and only one crystal structure of the inactive STAT5 dimer is publicly available. With a view to enabling structure-based drug design, we have: (1) analyzed phospho-peptide binding pockets on the SH2 domains of STAT5, STAT1 and STAT3; (2) generated a model of STAT5 bound to phospho-peptides; (3) assessed our model by docking against a class of known STAT5 inhibitors (Müller et al. in ChemBioChem 9:723-727, 2008); (4) used molecular dynamics simulations to optimize the molecular determinants responsible for binding and (5) proposed unique "Binding Signatures" of STAT5. Our results put in place the foundations to address STAT5 as a target for rational drug design, from sequence, structural and functional perspectives.

  3. Phosphopeptide occupancy and photoaffinity cross-linking of the v-Src SH2 domain attenuates tyrosine kinase activity.

    Science.gov (United States)

    Garcia, P; Shoelson, S E; Drew, J S; Miller, W T

    1994-12-02

    Phosphorylation of c-Src at carboxyl-terminal Tyr-527 suppresses tyrosine kinase activity and transforming potential, presumably by facilitating the intramolecular interaction of the C terminus of Src with its SH2 domain. In addition, it has been shown previously that occupancy of the c-Src SH2 domain with a phosphopeptide stimulates c-Src kinase catalytic activity. We have performed analogous studies with v-Src, the transforming protein from Rous sarcoma virus, which has extensive homology with c-Src. v-Src lacks an autoregulatory phosphorylation site, and its kinase domain is constitutively active. Phosphopeptides corresponding to the sequences surrounding c-Src Tyr-527 and a Tyr-Glu-Glu-Ile motif from the hamster polyoma virus middle T antigen inhibit tyrosine kinase activity of baculovirus-expressed v-Src 2- and 4-fold, respectively. To determine the mechanism of this regulation, the Tyr-527 phosphopeptide was substituted with the photoactive amino acid p-benzoylphenylalanine at the adjacent positions (N- and C-terminal) to phosphotyrosine. These peptides photoinactivate the v-Src tyrosine kinase 5-fold in a time- and concentration-dependent manner. Furthermore, the peptides cross-link an isolated Src SH2 domain with similar rates and specificity. These data indicate that occupancy of the v-Src SH2 domain induces a conformational change that is transmitted to the kinase domain and attenuates tyrosine kinase activity.

  5. An Innovative Approach to Treat Incisors Hypomineralization (MIH): A Combined Use of Casein Phosphopeptide-Amorphous Calcium Phosphate and Hydrogen Peroxide-A Case Report.

    Science.gov (United States)

    Mastroberardino, Stefano; Campus, Guglielmo; Strohmenger, Laura; Villa, Alessandro; Cagetti, Maria Grazia

    2012-01-01

    Molar Incisor Hypomineralization (MIH) is characterized by a developmentally derived deficiency in enamel mineral. Affected teeth present demarcated enamel opacities ranging from white to brown; hypoplasia can also be associated. Patients frequently report aesthetic discomfort if anterior teeth are involved, which leads them to request bleaching treatment to improve aesthetics. Nevertheless, hydrogen peroxide can produce serious side effects resulting from further mineral loss. Microabrasion and/or a composite restoration are the treatments of choice in teeth with mild/moderate MIH, but they also entail enamel loss. Recently, a new remineralizing agent based on casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) has been proposed as effective on hypomineralized enamel, also improving aesthetics. This paper presents a case report of a young man with white opacities on the incisors treated with a combined use of CPP-ACP mousse and hydrogen peroxide gel to correct the aesthetic defect. The patient was instructed to use CPP-ACP for two hours per day for three months to obtain enamel remineralization, followed by combined use of CPP-ACP and the bleaching agent for a further two months. At the end of this five-month treatment, a noticeable aesthetic improvement of the opacities was observed.

  7. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  8. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasoor

    2011-01-01

    We present some improved analytical results as part of the ongoing work on the analysis of the Fugue-256 hash function, a second-round candidate in NIST's SHA-3 competition. First we improve Aumasson and Phan's integral distinguisher on 5.5 rounds of the final transformation of Fugue-256 to 16...

  9. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  10. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018. Improvement of Binary Analysis Components in Automated Malware Analysis Framework. Keiji Takeda, Keio University. Final report, covering 26 May 2015 to 25 Nov 2016. The framework analyzes malicious software (malware) with minimum human interaction; the system autonomously analyzes malware samples by analyzing the malware binary program.

  11. Highly Efficient Single-Step Enrichment of Low Abundance Phosphopeptides from Plant Membrane Preparations

    Directory of Open Access Journals (Sweden)

    Xu Na Wu

    2017-09-01

    Full Text Available Mass spectrometry (MS)-based large-scale phosphoproteomics has facilitated the investigation of plant phosphorylation dynamics on a system-wide scale. However, generating large-scale data sets for membrane phosphoproteins usually requires fractionation of samples and extended hands-on laboratory time. To overcome these limitations, we developed “ShortPhos,” an efficient and simple phosphoproteomics protocol optimized for research on plant membrane proteins. The optimized workflow allows fast and efficient identification and quantification of phosphopeptides, even from small amounts of starting plant material. “ShortPhos” can produce label-free datasets with high quantitative reproducibility. In addition, the “ShortPhos” protocol recovered more phosphorylation sites from membrane proteins, especially plasma membrane and vacuolar proteins, when compared to our previous workflow and other membrane-based data in the PhosPhAt 4.0 database. We applied “ShortPhos” to study kinase-substrate relationships in a nitrate-induction experiment on Arabidopsis roots. “ShortPhos” identified significantly more known kinase-substrate relationships than previous phosphoproteomics workflows, producing new insights into nitrate-induced signaling pathways.

  12. The decrease of Streptococcus mutans growth after topical application of casein phosphopeptide-amorphous calcium phosphate paste

    Directory of Open Access Journals (Sweden)

    Tika Faradina Araf

    2011-07-01

    Full Text Available Casein Phosphopeptide-Amorphous Calcium Phosphate (CPP-ACP) paste is a topical agent consisting of a series of phosphorylated milk-derived peptides, and it has antibacterial activity. The objective of this research was to determine the difference in Streptococcus mutans growth before and after CPP-ACP paste was applied topically to children's teeth. The method of the research was a quasi-experiment. The samples were 10 students of MI Al Falah Islamic Boarding School, Jatinangor, West Java, Indonesia, collected with a purposive sampling technique. This research used dental plaque taken from the children's teeth before and after application of CPP-ACP paste. The plaque was cultivated in the selective medium Tryptone Yeast Cysteine Sucrose Bacitracin (TYCSB) in duplicate. Streptococcus mutans colonies on TYCSB were counted with a Stuart colony counter and statistically analyzed with a paired t-test. The results showed that the average Streptococcus mutans count before application of CPP-ACP paste was 57.05 colonies, whereas after application it was 9.4 colonies at 1 day, 2.85 at 3 days, and 1.7 at 14 days. The research concluded that there was a decrease in Streptococcus mutans growth in plaque isolates after CPP-ACP paste was applied topically to children's teeth.
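The paired t-test used in this study compares each sample's colony count before and after treatment. As an illustrative sketch only — the colony counts below are hypothetical placeholders, since the record reports only the group means (57.05 vs. 9.4):

```python
import math

def paired_t(before, after):
    """Paired t statistic for before/after measurements on the same samples (df = n - 1)."""
    d = [b - a for b, a in zip(before, after)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance of the differences
    return mean_d / math.sqrt(var_d / n)

# Hypothetical counts for 10 plaque samples, before and 1 day after CPP-ACP application.
before = [60, 55, 58, 52, 61, 57, 54, 59, 56, 58]
after = [10, 8, 11, 9, 12, 7, 10, 9, 8, 10]
print(paired_t(before, after))  # a large positive t indicates a clear decrease
```

In practice one would compare this statistic against the t distribution with n − 1 degrees of freedom to obtain the p-value.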

  13. Does systemic administration of casein phosphopeptides affect orthodontic movement and root resorption in rats?

    Science.gov (United States)

    Crowther, Lachlan; Shen, Gang; Almuzian, Mohammed; Jones, Allan; Walsh, William; Oliver, Rema; Petocz, Peter; Tarraf, Nour E; Darendeliler, M Ali

    2017-10-01

    To assess the potential effects of casein phosphopeptides (CPPs) on orthodontically induced iatrogenic root resorption (OIIRR) and orthodontic tooth movement. Forty Wistar rats (aged 11 weeks) were randomly divided into an experimental group (EG; n = 20) that received a diet supplemented with CPP and a control group (CG; n = 20) devoid of diet supplement. A 150 g force was applied using a nickel-titanium (NiTi) coil that was bonded to the maxillary incisors and extended unilaterally to a maxillary first molar. At day 28, animals in both groups were euthanized. Volumetric assessment of root resorption craters and linear measurement of maxillary first molar movement were blindly examined using a micro-computed tomography scan. Nine rats were excluded from the experiment due to loss during general anesthesia or appliance failure. Intra-operator reproducibility was high in both volumetric and linear measurements (92.8 per cent and 98.5-97.6 per cent, respectively). The results reveal that dietary CPP has a statistically insignificant effect on overall OIIRR and orthodontic movement. CPP seems to have a statistically insignificant effect on the volume of OIIRR and orthodontic movement in rats. A long-term study with a larger sample size using different concentrations of CPP is required to clarify the dentoalveolar effect of CPP. © The Author 2017. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved.

  14. Effects of conditioners on microshear bond strength to enamel after carbamide peroxide bleaching and/or casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) treatment.

    Science.gov (United States)

    Adebayo, O A; Burrow, M F; Tyas, M J

    2007-11-01

    To evaluate (a) the enamel microshear bond strength (MSBS) of a universal adhesive and (b) the effects of conditioning with a self-etching primer adhesive with/without prior bleaching and/or casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) application. Thirty-five molars were cut into four sections, assigned randomly to four groups (no treatment; 16% carbamide peroxide bleaching; CPP-ACP-containing paste (Tooth Mousse, TM); bleaching and TM) and treated accordingly. Specimens were divided into two for bonding with either a self-etching primer (Clearfil SE Bond, CSE) or a total-etch adhesive (Single Bond, SB). Specimens for CSE bonding were subdivided for one of four preconditioning treatments (no conditioning; 30-40% phosphoric acid (PA); 15% EDTA; 20% polyacrylic acid conditioner (Cavity Conditioner, CC)) and treated. The adhesives were applied and resin composite was bonded to the enamel using microtubes (internal diameter 0.75 mm). Bonds were stressed in shear until failure, mean MSBS was calculated, and data were analysed using ANOVA with Tukey's HSD test (alpha = 0.05). The modes of bond failure were assessed and classified. Two-way ANOVA revealed significant differences between treatments (P < 0.05). Appropriate conditioning of treated enamel before applying the adhesive system may significantly improve bond strengths.

  15. The Effect of Casein Phosphopeptide-amorphous Calcium Fluoride Phosphate on the Remineralization of Artificial Caries Lesions: An In Vitro Study

    Directory of Open Access Journals (Sweden)

    Ngoc Vo Truong Nhu

    2017-08-01

    Full Text Available Studies on the electron microstructure of the effect of products containing casein phosphopeptide-amorphous calcium fluoride phosphate (CPP-ACPF) on enamel remineralization are still needed; electron microscopy is an important method for observing the morphological changes of teeth under different conditions. Objective: To evaluate the remineralization potential of the paste on enamel lesions using scanning electron microscopy (SEM). Methods: Sixty enamel specimens were prepared from extracted human premolars. The specimens were placed in a demineralizing solution for four days to produce artificial carious lesions. The specimens were then randomly assigned to two study groups: group A (control group) and group B. Group B was incubated in remineralizing paste (CPP-ACPF) for 30 minutes per day for 10 days. The control group received no intervention with remineralizing paste. All 60 specimens were stored in artificial saliva at 37°C. After remineralization, the samples were observed using SEM. Results: The statistical analysis showed a decrease in the lesion area between the demineralized and remineralized samples, but no significant difference was observed in the lesion depth for group B. There was a significant increase observed in both the lesion depth and lesion area for group A (p = 0.03). Conclusion: The results showed the capacity of CPP-ACPF to supply calcium and phosphate to the enamel, decreasing the dissolution of the enamel surface and increasing the remineralization of the enamel surface.

  16. Crystal Structure of the Human Symplekin-Ssu72-CTD Phosphopeptide Complex

    Energy Technology Data Exchange (ETDEWEB)

    K Xiang; T Nagaike; S Xiang; T Kilic; M Beh; J Manley; L Tong

    2011-12-31

    Symplekin (Pta1 in yeast) is a scaffold in the large protein complex that is required for 3'-end cleavage and polyadenylation of eukaryotic messenger RNA precursors (pre-mRNAs); it also participates in transcription initiation and termination by RNA polymerase II (Pol II). Symplekin mediates interactions between many different proteins in this machinery, although the molecular basis for its function is not known. Here we report the crystal structure at 2.4 Å resolution of the amino-terminal domain (residues 30-340) of human symplekin in a ternary complex with the Pol II carboxy-terminal domain (CTD) Ser5 phosphatase Ssu72 and a CTD Ser5 phosphopeptide. The N-terminal domain of symplekin has the ARM or HEAT fold, with seven pairs of antiparallel α-helices arranged in the shape of an arc. The structure of Ssu72 has some similarity to that of low-molecular-mass phosphotyrosine protein phosphatase, although Ssu72 has a unique active-site landscape as well as extra structural features at the C terminus that are important for interaction with symplekin. Ssu72 is bound to the concave face of symplekin, and engineered mutations in this interface can abolish interactions between the two proteins. The CTD peptide is bound in the active site of Ssu72, with the pSer5-Pro6 peptide bond in the cis configuration, which contrasts with all other known CTD peptide conformations. Although the active site of Ssu72 is about 25 Å from the interface with symplekin, we found that the symplekin N-terminal domain stimulates Ssu72 CTD phosphatase activity in vitro. Furthermore, the N-terminal domain of symplekin inhibits polyadenylation in vitro, but only when coupled to transcription. Because catalytically active Ssu72 overcomes this inhibition, our results show a role for mammalian Ssu72 in transcription-coupled pre-mRNA 3'-end processing.

  17. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
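The single-variable analysis described here amounts to bucketing submissions by one candidate predictor (e.g. review type) and comparing mean processing times. A minimal sketch, with field names and numbers that are hypothetical and not taken from the study:

```python
from collections import defaultdict

def mean_days_by_factor(protocols, factor):
    """Single-variable analysis: average processing time per level of one factor."""
    buckets = defaultdict(list)
    for p in protocols:
        buckets[p[factor]].append(p["days"])
    return {level: sum(v) / len(v) for level, v in buckets.items()}

# Hypothetical protocol records (illustrative only).
protocols = [
    {"review_type": "full_board", "va": True, "days": 62},
    {"review_type": "full_board", "va": False, "days": 48},
    {"review_type": "expedited", "va": False, "days": 21},
    {"review_type": "expedited", "va": True, "days": 35},
    {"review_type": "exempt", "va": False, "days": 9},
]

print(mean_days_by_factor(protocols, "review_type"))
```

Factors whose levels show large mean differences are candidates to feed into the machine learning model as features.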

  18. Structure of a 14-3-3σ–YAP phosphopeptide complex at 1.15 Å resolution

    International Nuclear Information System (INIS)

    Schumacher, Benjamin; Skwarczynska, Malgorzata; Rose, Rolf; Ottmann, Christian

    2010-01-01

    The first structure of a 14-3-3σ–YAP phosphopeptide complex is reported at 1.15 Å resolution. The YAP 14-3-3-binding motif is revealed for the first time using crystallographic tools. The 14-3-3 proteins are a class of eukaryotic acidic adapter proteins, with seven isoforms in humans. 14-3-3 proteins mediate their biological function by binding to target proteins and influencing their activity. They are involved in pivotal pathways in the cell such as signal transduction, gene expression, enzyme activation, cell division and apoptosis. The Yes-associated protein (YAP) is a WW-domain protein that exists in two transcript variants of 48 and 54 kDa in humans. By transducing signals from the cytoplasm to the nucleus, YAP is important for transcriptional regulation. In both variants, interaction with 14-3-3 proteins after phosphorylation of Ser127 is important for nucleocytoplasmic trafficking, via which the localization of YAP is controlled. In this study, 14-3-3σ has been cloned, purified and crystallized in complex with a phosphopeptide from the YAP 14-3-3-binding domain, which led to a crystal that diffracted to 1.15 Å resolution. The crystals belonged to space group C222₁, with unit-cell parameters a = 82.3, b = 112.1, c = 62.9 Å

  19. Surface remineralization potential of casein phosphopeptide-amorphous calcium phosphate on enamel eroded by cola-drinks: An in-situ model study

    Directory of Open Access Journals (Sweden)

    Navneet Grewal

    2013-01-01

    Full Text Available Aim: The aim of this study was to investigate the remineralization potential of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) on enamel eroded by cola drinks. Subjects and Methods: A total of 30 healthy subjects were selected from a random sample of 1200 children and divided into two groups of 15 each, wherein calcium and phosphorus analyses and scanning electron microscope (SEM) analysis were carried out to investigate the remineralization of the enamel surface. A total of 30 non-carious premolar teeth were selected from the human tooth bank (HTB) to prepare the in-situ appliance. Three enamel slabs were prepared from these teeth. One enamel slab was used to obtain baseline values and the other two were embedded into the upper palatal appliances prepared on the subjects' maxillary working models. The subjects wore the appliance, after which a 30 ml cola-drink exposure was given. After 15 days, the slabs were removed and subjected to the respective analyses. Statistical Analysis Used: Means of all readings of soluble calcium and phosphorus levels at baseline, after cola-drink exposure and after CPP-ACP application were analyzed with SPSS 11.5. Comparisons within and between groups were carried out using ANOVA and F-values at the 1% level of significance. Results: A decrease in the calcium solubility of enamel was seen in the CPP-ACP application group as compared to the post-cola-drink exposure group (P < 0.05). A distinctive change in the surface topography of enamel in the post-CPP-ACP application group as compared to the post-cola-drink exposure group was observed. Conclusion: CPP-ACP significantly promoted remineralization of enamel eroded by cola drinks, as revealed by significant morphological changes seen under SEM magnification and by spectrophotometric analyses.

  20. Economic Analysis of Improved Alkaline Water Electrolysis

    International Nuclear Information System (INIS)

    Kuckshinrichs, Wilhelm; Ketelaer, Thomas; Koj, Jan Christian

    2017-01-01

    Alkaline water electrolysis (AWE) is a mature hydrogen production technology, and a range of economic assessments exists for available technologies. For advanced AWEs, which may be based on novel polymer-based membrane concepts, it is of prime importance that development comes along with new configurations and technical and economic key process parameters for AWE that might be of interest for further economic assessments. This paper presents an advanced AWE technology referring to three different sites in Europe (Germany, Austria, and Spain). The focus is on financial metrics, the projection of key performance parameters of advanced AWEs, and further financial and tax parameters. For financial analysis from an investor's (business) perspective, a comprehensive assessment of a technology comprises not only cost analysis but also further financial analysis quantifying attractiveness and supply/market flexibility. Therefore, based on cash flow (CF) analysis, a comprehensible set of metrics may comprise the levelized cost of energy (here, the levelized cost of hydrogen, LCH) for cost assessment, net present value (NPV) for attractiveness analysis, and variable cost (VC) for analysis of market flexibility. The German AWE site turns out to perform best in all three financial metrics (LCH, NPV, and VC). Though there are slight differences in investment cost and operation and maintenance cost projections for the three sites, the major cost impact is due to the electricity cost. Although investment cost is slightly lower and labor cost is significantly lower in Spain, the difference cannot outweigh the higher electricity cost compared to Germany. Given the assumption that the electrolysis operators are customers directly and actively participating in power markets, and based on the regulatory framework in the three countries, in this special case electricity cost in Germany is lowest.
However, as electricity cost is profoundly influenced by political decisions as
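The cash-flow-based metrics named in this abstract (NPV and a levelized cost) follow standard discounting formulas. A minimal sketch with illustrative numbers, not the paper's site data:

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow series; cashflows[0] is year 0 (e.g. -investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def lcoh(rate, capex, annual_cost, annual_h2_kg, years):
    """Levelized cost of hydrogen: discounted lifetime costs / discounted H2 output."""
    disc_cost = capex + sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))
    disc_h2 = sum(annual_h2_kg / (1 + rate) ** t for t in range(1, years + 1))
    return disc_cost / disc_h2

# Illustrative electrolyser: 1 M EUR capex, 0.4 M EUR/yr electricity plus O&M,
# 150 t H2/yr output, 20-year lifetime, 5% discount rate.
print(lcoh(0.05, 1.0e6, 0.4e6, 150e3, 20))  # EUR per kg of hydrogen
```

Because annual electricity cost enters every year of the discounted cost sum, it dominates LCH for long lifetimes, which is consistent with the paper's finding that electricity cost is the major cost impact.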

  1. Economic Analysis of Improved Alkaline Water Electrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Kuckshinrichs, Wilhelm, E-mail: w.kuckshinrichs@fz-juelich.de; Ketelaer, Thomas; Koj, Jan Christian [Forschungszentrum Juelich, Institute for Energy and Climate Research – Systems Analysis and Technology Evaluation (IEK-STE), Juelich (Germany)

    2017-02-20

    Alkaline water electrolysis (AWE) is a mature hydrogen production technology, and a range of economic assessments exists for available technologies. For advanced AWEs, which may be based on novel polymer-based membrane concepts, it is of prime importance that development comes along with new configurations and technical and economic key process parameters for AWE that might be of interest for further economic assessments. This paper presents an advanced AWE technology referring to three different sites in Europe (Germany, Austria, and Spain). The focus is on financial metrics, the projection of key performance parameters of advanced AWEs, and further financial and tax parameters. For financial analysis from an investor's (business) perspective, a comprehensive assessment of a technology comprises not only cost analysis but also further financial analysis quantifying attractiveness and supply/market flexibility. Therefore, based on cash flow (CF) analysis, a comprehensible set of metrics may comprise the levelized cost of energy (here, the levelized cost of hydrogen, LCH) for cost assessment, net present value (NPV) for attractiveness analysis, and variable cost (VC) for analysis of market flexibility. The German AWE site turns out to perform best in all three financial metrics (LCH, NPV, and VC). Though there are slight differences in investment cost and operation and maintenance cost projections for the three sites, the major cost impact is due to the electricity cost. Although investment cost is slightly lower and labor cost is significantly lower in Spain, the difference cannot outweigh the higher electricity cost compared to Germany. Given the assumption that the electrolysis operators are customers directly and actively participating in power markets, and based on the regulatory framework in the three countries, in this special case electricity cost in Germany is lowest.
However, as electricity cost is profoundly influenced by political decisions as

  2. Analysis and Improvement of Fireworks Algorithm

    Directory of Open Access Journals (Sweden)

    Xi-Guang Li

    2017-02-01

    Full Text Available The Fireworks Algorithm is a recently developed swarm intelligence algorithm that simulates the explosion process of fireworks. Based on an analysis of each operator of the Fireworks Algorithm (FWA), this paper improves the FWA and proves that the improved algorithm converges to the global optimal solution with probability 1. The proposed algorithm aims to further boost performance and achieve global optimization mainly through the following strategies. First, the population is initialized using opposition-based learning. Second, a new explosion-amplitude mechanism for the optimal firework is proposed. In addition, adaptive t-distribution mutation is used for non-optimal individuals and elite opposition-based learning for the optimal individual. Finally, a new selection strategy, Disruptive Selection, is proposed to reduce the running time of the algorithm compared with FWA. In our simulation, we apply the CEC2013 standard functions and compare the proposed algorithm (IFWA) with SPSO2011, FWA, EFWA and dynFWA. The results show that the proposed algorithm has better overall performance on the test functions.
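The first strategy, opposition-based learning initialization, can be sketched generically: sample a random population, add each point's "opposite" (lo + hi − x per coordinate), and keep the fittest half. This is a minimal illustration of the general technique, not the paper's exact IFWA implementation:

```python
import random

def obl_init(pop_size, dim, lo, hi, fitness):
    """Opposition-based learning initialization: generate a random population,
    add each point's opposite (lo + hi - x), and keep the best pop_size points."""
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    opposites = [[lo + hi - x for x in ind] for ind in pop]
    return sorted(pop + opposites, key=fitness)[:pop_size]  # ascending fitness (minimization)

# Toy objective: the sphere function, to be minimized.
sphere = lambda v: sum(x * x for x in v)

random.seed(1)
best = obl_init(10, 2, -5.0, 5.0, sphere)
print(sphere(best[0]) <= sphere(best[-1]))  # True: population is sorted by fitness
```

The idea is that for any point, either it or its opposite tends to lie closer to the optimum, so evaluating both and keeping the better half gives a stronger starting population at the cost of doubling the initial fitness evaluations.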

  3. Casein phosphopeptides and CaCl2 increase penicillin production and cause an increment in microbody/peroxisome proteins in Penicillium chrysogenum.

    Science.gov (United States)

    Domínguez-Santos, Rebeca; Kosalková, Katarina; García-Estrada, Carlos; Barreiro, Carlos; Ibáñez, Ana; Morales, Alejandro; Martín, Juan-Francisco

    2017-03-06

    Transport of penicillin intermediates and penicillin secretion are still poorly characterized in Penicillium chrysogenum (re-identified as Penicillium rubens). Calcium (Ca²⁺) plays an important role in the metabolism of filamentous fungi, and casein phosphopeptides (CPP) are involved in Ca²⁺ internalization. In this study we observe that the effect of CaCl₂ and CPP is additive and promotes an increase in penicillin production of up to 10-12 fold. The combination of CaCl₂ and CPP greatly promotes expression of the three penicillin biosynthetic genes. Comparative proteomic analysis by 2D-DIGE identified 39 proteins differentially represented in P. chrysogenum Wisconsin 54-1255 after CPP/CaCl₂ addition. The most interesting group of overrepresented proteins comprised a peroxisomal catalase, three proteins of the methylcitrate cycle, two aminotransferases and cystathionine β-synthase, which are directly or indirectly related to the formation of penicillin amino acid precursors. Importantly, two of the enzymes of the penicillin pathway (isopenicillin N synthase and isopenicillin N acyltransferase) are clearly induced after CPP/CaCl₂ addition. Most of these overrepresented proteins are either authentic peroxisomal proteins or microbody-associated proteins. This evidence suggests that addition of CPP/CaCl₂ promotes the formation of penicillin precursors and the penicillin biosynthetic enzymes in peroxisomes and vesicles, which may be involved in transport and secretion of penicillin. Penicillin biosynthesis in Penicillium chrysogenum is one of the best characterized secondary metabolism processes. However, the mechanism by which penicillin is secreted still remains to be elucidated. Taking into account the role played by Ca²⁺ and CPP in the secretory pathway, and considering the positive effect that Ca²⁺ exerts on penicillin production, the analysis of global protein changes produced after CPP/CaCl₂ addition is very helpful to decipher the processes related to the

  4. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed automatic generation of a message flow graph for the (sub)system, called the “big-picture model”. Flow-diagram analysis helped avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big-picture model” improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  5. Fully automated synthesis of (phospho)peptide arrays in microtiter plate wells provides efficient access to protein tyrosine kinase characterization

    Directory of Open Access Journals (Sweden)

    Goldstein David J

    2005-01-01

    Full Text Available Abstract Background Synthetic peptides have played a useful role in studies of protein kinase substrates and interaction domains. Synthetic peptide arrays and libraries, in particular, have accelerated the process. Several factors have hindered or limited the applicability of various techniques, such as the need for deconvolution of combinatorial libraries, the inability or impracticality of achieving full automation using two-dimensional or pin solid phases, the lack of convenient interfacing with standard analytical platforms, or the difficulty of compartmentalization of a planar surface when contact between assay components needs to be avoided. This paper describes a process for synthesis of peptides and phosphopeptides in microtiter plate wells that overcomes previous limitations and demonstrates utility in determining the epitope of an autophosphorylation-site phospho-motif antibody and in substrate utilization assays of the protein tyrosine kinase p60c-src. Results The overall reproducibility of phosphopeptide synthesis and multiplexed EGF receptor (EGFR) autophosphorylation site (pY1173) antibody ELISA (9H2) was within 5.5 to 8.0%. Mass spectrometric analyses of the released (phospho)peptides showed homogeneous peaks of the expected molecular weights. An overlapping peptide array of the complete EGFR cytoplasmic sequence revealed a high redundancy of 9H2-reactive sites. The eight reactive phosphopeptides were structurally related and, interestingly, the most conserved antibody-reactive peptide motif coincided with a subset of other known EGFR autophosphorylation and SH2 binding motifs and an EGFR optimal substrate motif. Finally, peptides based on known substrate specificities of c-Src and related enzymes were synthesized in microtiter plate array format and were phosphorylated by c-Src with the predicted specificities. The level of phosphorylation was proportional to c-Src concentration with sensitivities below 0.1 Units of

  6. In vitro evaluation of casein phosphopeptide-amorphous calcium phosphate effect on the shear bond strength of dental adhesives to enamel.

    Science.gov (United States)

    Shadman, Niloofar; Ebrahimi, Shahram Farzin; Shoul, Maryam Azizi; Sattari, Hasti

    2015-01-01

    Casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) is applied for remineralization of early caries lesions or tooth sensitivity conditions and may affect subsequent resin bonding. This in vitro study investigated the effect of CPP-ACP on the shear bond strength of dental adhesives to enamel. Sixty extracted human molar teeth were selected and randomly divided into three groups and six subgroups. Buccal or lingual surfaces of the teeth were prepared to create a flat enamel surface. The adhesives used were Tetric N-Bond, AdheSE and AdheSE One F. In three subgroups, before applying the adhesives, enamel surfaces were treated with Tooth Mousse CPP-ACP for one hour, rinsed, and stored at 37°C and 100% humidity. This procedure was repeated for 5 days, and then the adhesives were applied and Tetric N-Ceram composite was bonded to the enamel. The same procedure was carried out for the other three subgroups without CPP-ACP treatment. After 24 hours of water storage, samples were subjected to a shear bond strength test in a universal testing machine. Failure modes were determined by stereomicroscope. Data were analyzed by t-test and one-way analysis of variance with P < 0.05 as the significance level. CPP-ACP treatment left the shear bond strength to enamel unchanged only in Tetric N-Bond (P > 0.05). In the non-CPP-ACP subgroups, there were statistically significant differences among all subgroups: Tetric N-Bond had the highest and AdheSE One F the lowest shear bond strength. CPP-ACP application reduces the shear bond strength of AdheSE and AdheSE One F to enamel, but not that of Tetric N-Bond.

  7. In vitro evaluation of casein phosphopeptide-amorphous calcium phosphate effect on the shear bond strength of dental adhesives to enamel

    Directory of Open Access Journals (Sweden)

    Niloofar Shadman

    2015-01-01

    Full Text Available Background: Casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) is applied for remineralization of early caries lesions or tooth sensitivity conditions and may affect subsequent resin bonding. This in vitro study investigated the effect of CPP-ACP on the shear bond strength of dental adhesives to enamel. Materials and Methods: Sixty extracted human molar teeth were selected and randomly divided into three groups and six subgroups. Buccal or lingual surfaces of the teeth were prepared to create a flat enamel surface. The adhesives used were Tetric N-Bond, AdheSE and AdheSE One F. In three subgroups, before applying the adhesives, enamel surfaces were treated with Tooth Mousse CPP-ACP for one hour, rinsed, and stored at 37°C and 100% humidity. This procedure was repeated for 5 days, and then the adhesives were applied and Tetric N-Ceram composite was bonded to the enamel. The same procedure was carried out for the other three subgroups without CPP-ACP treatment. After 24 hours of water storage, samples were subjected to a shear bond strength test in a universal testing machine. Failure modes were determined by stereomicroscope. Data were analyzed by t-test and one-way analysis of variance with P < 0.05 as the significance level. In the non-CPP-ACP subgroups, there were statistically significant differences among all subgroups: Tetric N-Bond had the highest and AdheSE One F the lowest shear bond strength. Conclusion: CPP-ACP application reduces the shear bond strength of AdheSE and AdheSE One F to enamel, but not that of Tetric N-Bond.

  8. The molecular basis of FHA domain:phosphopeptide binding specificity and implications for phospho-dependent signaling mechanisms.

    Science.gov (United States)

    Durocher, D; Taylor, I A; Sarbassova, D; Haire, L F; Westcott, S L; Jackson, S P; Smerdon, S J; Yaffe, M B

    2000-11-01

    Forkhead-associated (FHA) domains are a class of ubiquitous signaling modules that appear to function through interactions with phosphorylated target molecules. We have used oriented peptide library screening to determine the optimal phosphopeptide binding motifs recognized by several FHA domains, including those within a number of DNA damage checkpoint kinases, and determined the X-ray structure of Rad53p-FHA1, in complex with a phospho-threonine peptide, at 1.6 Å resolution. The structure reveals a striking similarity to the MH2 domains of Smad tumor suppressor proteins and reveals a mode of peptide binding that differs from SH2, 14-3-3, or PTB domain complexes. These results have important implications for DNA damage signaling and CHK2-dependent tumor suppression, and they indicate that FHA domains play important and unsuspected roles in S/T kinase signaling mechanisms in prokaryotes and eukaryotes.

  9. Comparison of a novel TiO₂/diatomite composite and pure TiO₂ for the purification of phosvitin phosphopeptides.

    Science.gov (United States)

    Zhang, Yang; Li, Junhua; Niu, Fuge; Sun, Jun; Dou, Yuan; Liu, Yuntao; Su, Yujie; Zhou, Bei; Xu, Qinqin; Yang, Yanjun

    2014-06-01

    A novel TiO2/diatomite composite (TD) was prepared and characterized by scanning electron microscopy (SEM) and Fourier transform infrared (FTIR) spectroscopy. The SEM results showed that, after modification, the porous surface of the diatomite was covered with TiO2. Both diatomite and TD had clear disc-shaped structures with average grain diameters of around 25 μm. TD and pure TiO2 were then applied to the purification of phosvitin phosphopeptides (PPPs) from the digest of egg yolk protein, and a comparative study of the adsorption properties of PPPs on TD and TiO2 was performed. In the study of adsorption kinetics, the adsorption equilibrium of PPPs on TD and TiO2 fitted well with the Langmuir model, and the times needed to reach adsorption equilibrium were both around 10 min. The maximum dynamic adsorption capacity of TD (8.15 mg/g) was higher than that of TiO2 (4.96 mg/g). Repeated-use results showed that TD and TiO2 remained very stable after ten adsorption-elution cycles, and TD could easily be separated from aqueous solution by filtration. Moreover, the present synthesis of TD is simple, cost-effective, organic-solvent-free and suitable for large-scale preparation. Thus, this separation method not only brings great advantages to the purification of PPPs from egg yolk protein but also provides a promising material for the enrichment of phosphopeptides in proteomic research. Copyright © 2014 Elsevier B.V. All rights reserved.
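As a rough illustration of the kind of Langmuir-model fit described above (not the authors' actual procedure), the isotherm can be fitted in a few lines via its linearised form. Only the 8.15 mg/g capacity is taken from the record; the concentrations and the affinity constant K are invented for the example.

```python
import numpy as np

def langmuir(c, q_max, k):
    """Langmuir isotherm: adsorbed amount q at equilibrium concentration c."""
    return q_max * k * c / (1.0 + k * c)

# Synthetic equilibrium data (mg/mL vs. mg/g); K = 3.0 is hypothetical.
c = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0])
q = langmuir(c, 8.15, 3.0)

# Linearised fit: c/q = c/q_max + 1/(K*q_max) is a straight line in c.
slope, intercept = np.polyfit(c, c / q, 1)
q_max_est = 1.0 / slope          # maximum adsorption capacity (mg/g)
k_est = slope / intercept        # affinity constant (mL/mg)

print(f"q_max = {q_max_est:.2f} mg/g, K = {k_est:.2f} mL/mg")
# q_max = 8.15 mg/g, K = 3.00 mL/mg
```

In practice one would fit measured (c, q) pairs, and a nonlinear least-squares fit of the original isotherm is usually preferred when the data are noisy.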

  10. SPORTS ORGANIZATIONS MANAGEMENT IMPROVEMENT: A SURVEY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Alin Molcut

    2015-07-01

    Full Text Available Sport organizations exist to perform tasks that can only be executed through cooperative effort, and sport management is responsible for the performance and success of these organizations. The main aim of the paper is to analyze several issues in the management of sports organizations in order to assess their quality of management. For this purpose a questionnaire was designed and a survey analysis was performed using a statistical approach. The investigation was conducted over a period of 3 months, questioning managers and football coaches active in football clubs in the counties of Timis and Arad at the level of training for children and juniors. The results suggest that there is significant interest in improving management across children's and under-21 teams, with emphasis on players' participation and rewarding performance. Furthermore, we can state that the sports clubs have established a vision and a mission, as well as club objectives that refer to both sporting and financial performance.

  11. Continuous improvement projects: an authorship bibliometric analysis.

    Science.gov (United States)

    Gonzalez Aleu, Fernando; Van Aken, Eileen M

    2017-06-12

    Purpose The purpose of this paper is to describe the current research on hospital continuous improvement projects (CIPs) from an author-characteristics perspective. This work addresses the following questions: who are the predominant research authors in hospital CIPs? To what extent are the research communities collaborating in distinct research groups? How internationalized has hospital CIP research become with respect to author location? Design/methodology/approach A systematic literature review was conducted, identifying 302 academic publications related to hospital CIPs. Publications were analyzed in terms of author quantity, diversity, collaboration, and impact. Findings Hospital CIPs are attracting new scholars each year. Based on the analysis, authors publishing in this area can be described as a relatively new international community, given the countries represented. Originality/value This paper describes current hospital CIP research by assessing author characteristics. Future work should examine additional attributes to characterize maturity, such as how new knowledge is being created and to what extent it is being disseminated to practitioners.

  12. Neighbor-Directed Histidine N(s)-Alkylation: A Route to Imidazolium-Containing Phosphopeptide Macrocycles - Biopolymers | Center for Cancer Research

    Science.gov (United States)

    Our recently discovered, selective, on-resin route to N(s)-alkylated imidazolium-containing histidine residues affords new strategies for peptide mimetic design. Here we demonstrate the use of this chemistry to prepare a series of macrocyclic phosphopeptides, in which imidazolium groups serve as ring-forming junctions. Interestingly, these cationic moieties subsequently serve to charge-mask the phosphoamino acid group that directed their formation.

  13. Casein phosphopeptides drastically increase the secretion of extracellular proteins in Aspergillus awamori. Proteomics studies reveal changes in the secretory pathway

    OpenAIRE

    Kosalková Katarina; García-Estrada Carlos; Barreiro Carlos; Flórez Martha G; Jami Mohammad S; Paniagua Miguel A; Martín Juan F

    2012-01-01

    Abstract Background The secretion of heterologous animal proteins in filamentous fungi is usually limited by bottlenecks in the vesicle-mediated secretory pathway. Results Using the secretion of bovine chymosin in Aspergillus awamori as a model, we found a drastic increase (40- to 80-fold) in secretion in cells grown with casein or casein phosphopeptides (CPPs). CPPs are rich in phosphoserine, but phosphoserine itself did not increase the secretion of chymosin. The stimulatory effect is reduced about 50% u...

  14. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes, we need to understand how practitioners communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  15. Facile Synthesis of Mesocrystalline SnO2 Nanorods on Reduced Graphene Oxide Sheets: An Appealing Multifunctional Affinity Probe for Sequential Enrichment of Endogenous Peptides and Phosphopeptides.

    Science.gov (United States)

    Ma, Wen; Zhang, Feng; Li, Liping; Chen, Shuai; Qi, Limin; Liu, Huwei; Bai, Yu

    2016-12-28

    A novel multifunctional composite comprising mesocrystalline SnO2 nanorods (NRs) vertically aligned on reduced graphene oxide (rGO) sheets was synthesized and developed for the sequential capture of endogenous peptides and phosphopeptides. Exploiting the hydrophobicity of rGO and the high affinity of the SnO2 nanorods, sequential enrichment of endogenous peptides and phosphopeptides could easily be achieved by modulating the elution buffer. With this multifunctional nanomaterial, 36 peptides were observed from a diluted bovine serum albumin (BSA) tryptic digest and 4 phosphopeptides could be selectively captured from a β-casein digest. The detection limit for the tryptic digest of β-casein was as low as 4 × 10⁻¹⁰ M, and the selectivity was up to 1:500 (molar ratio of β-casein to BSA digest). The effectiveness and robustness of the rGO-SnO2 NRs in a complex biological system was also confirmed using human serum as a real sample. Our work is promising for small-peptide enrichment and identification, especially in complicated biological sample preparation, and opens a new perspective in the design of multifunctional affinity probes for proteome and peptidome research.

  16. Hydrophilic Nb5+-immobilized magnetic core–shell microsphere – A novel immobilized metal ion affinity chromatography material for highly selective enrichment of phosphopeptides

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xueni; Liu, Xiaodan; Feng, Jianan [Pharmaceutical Analysis Department, School of Pharmacy, Fudan University, Shanghai 201203 (China); Li, Yan, E-mail: yanli@fudan.edu.cn [Pharmaceutical Analysis Department, School of Pharmacy, Fudan University, Shanghai 201203 (China); Deng, Chunhui [Department of Chemistry and Institutes of Biomedical Sciences, Fudan University, Shanghai 200433 (China); Duan, Gengli [Pharmaceutical Analysis Department, School of Pharmacy, Fudan University, Shanghai 201203 (China)

    2015-06-23

    Highlights: • A new IMAC material (Fe3O4@PD-Nb5+) was synthesized. • The strong magnetic behavior of the microspheres ensures fast and easy separation. • The enrichment ability was tested with human serum and nonfat milk. • The results were compared with other IMAC materials, including commercial kits. • All results proved the good enrichment ability, especially for multi-phosphopeptides. - Abstract: Rapid and selective enrichment of phosphopeptides from complex biological samples is essential and challenging in phosphoproteomics. In this work, for the first time, niobium ions were directly immobilized on the surface of polydopamine-coated magnetic microspheres through a facile and effective synthetic route. The Fe3O4@polydopamine-Nb5+ (denoted as Fe3O4@PD-Nb5+) microspheres possess the merits of high hydrophilicity and good biological compatibility, and demonstrated a low limit of detection (2 fmol). The selectivity for capturing phosphopeptides was also satisfactory (β-casein:BSA = 1:500). They were also successfully applied to the enrichment of phosphopeptides from real biological samples such as human serum and nonfat milk. Compared with Fe3O4@PD-Ti4+ microspheres, the Fe3O4@PD-Nb5+ microspheres exhibit superior selectivity for multi-phosphorylated peptides, and thus may be complementary to conventional IMAC materials.

  17. Occupational Analysis: A Continuous Improvement Approach

    National Research Council Canada - National Science Library

    Duffy, Tom

    1998-01-01

    .... In doing so, the Air Force has implemented "Quality Air Force (QAF)" (AF Handbook 90-502). QAF is a leadership commitment that inspires trust, teamwork, and continuous improvement everywhere in the Air Force...

  18. Improving Public Perception of Behavior Analysis.

    Science.gov (United States)

    Freedman, David H

    2016-05-01

    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.

  19. Analysis of improvement ways of creative accounting

    Directory of Open Access Journals (Sweden)

    I.A. Yuhimenko-Nazaruk

    2017-02-01

    Full Text Available The necessity of research into ways of improving creative accounting is substantiated. Existing approaches of researchers to eliminating the negative consequences of creative accounting are analyzed. Four main groups of approaches to the improvement of creative accounting are singled out and analyzed, and the common and distinctive features of the researchers' proposals are examined. The reasons why an ethical approach to the improvement of creative accounting cannot be applied in Ukraine under current conditions are substantiated. The need to refine the procedural aspects of creative accounting on the basis of the true and fair view concept is proved. A classification of approaches to the construction of accounting methodology in the context of the use of creative accounting is developed. The main provisions of the true and fair view concept are studied; their use provides an adequate reflection of the company's economic reality in financial reporting.

  20. Improved Layout of Inverter for EMC Analysis

    OpenAIRE

    Yade, Ousseynou; Martin, Christian; Vollaire, Christian; Bréard, Arnaud; Ali, Marwan; Meuret, Régis; Hervé, Morel

    2017-01-01

    International audience; This paper details an EMC (electromagnetic compatibility) analysis of an inverter application. The work covers the whole power chain (±270 Vdc input voltage to 3-phase 115 Vac output voltage). The inverter is composed of modular parts (power module and EMC filters) that supply motors in more-electric aircraft. Through our analysis, an approach is defined to design a detailed lumped-circuit model of the power module layout using the Q3D Extractor and SABER software. Fr...

  1. Combining casein phosphopeptide-amorphous calcium phosphate with fluoride: synergistic remineralization potential of artificially demineralized enamel or not?

    Science.gov (United States)

    Elsayad, Iman; Sakr, Amal; Badr, Yahia

    2009-07-01

    Recaldent is a product of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP). The remineralizing potential of CPP-ACP per se, or when combined with 0.22% fluoride gel, on artificially demineralized enamel is investigated using laser fluorescence. Mesial surfaces of 15 sound human molars are tested using a He-Cd laser beam at 441.5 nm with 18-mW power as an excitation source, on a setup based on a Spex 750-M monochromator provided with a photomultiplier tube (PMT) for detection of the autofluorescence collected from sound enamel. Mesial surfaces are subjected to demineralization for ten days. The spectra from demineralized enamel are measured. Teeth are divided into three groups according to the remineralizing regimen: group 1, Recaldent per se; group 2, Recaldent combined with fluoride gel and ACP; and group 3, artificial saliva as a positive control. After following these protocols for three weeks, the spectra from the remineralized enamel are measured. The spectra of enamel autofluorescence are recorded and normalized to the peak intensity at about 540 nm to compare spectra from sound, demineralized, and remineralized enamel surfaces. A slight red shift occurred in spectra from demineralized enamel, while a blue shift may occur in remineralized enamel. Group 2 shows the highest remineralizing potential. Combining fluoride and ACP with CPP-ACP can give a synergistic effect on enamel remineralization.

  2. International Space Station Future Correlation Analysis Improvements

    Science.gov (United States)

    Laible, Michael R.; Pinnamaneni, Murthy; Sugavanam, Sujatha; Grygier, Michael

    2018-01-01

    Ongoing modal analyses and model correlation are performed on different configurations of the International Space Station (ISS). These analyses utilize on-orbit dynamic measurements collected using four main ISS instrumentation systems: External Wireless Instrumentation System (EWIS), Internal Wireless Instrumentation System (IWIS), Space Acceleration Measurement System (SAMS), and Structural Dynamic Measurement System (SDMS). Remote Sensor Units (RSUs) are network relay stations that acquire flight data from sensors. Measured data is stored in the RSU until it receives a command to download the data via RF to the Network Control Unit (NCU). Since each RSU has its own clock, it is necessary to synchronize measurements before analysis; imprecise synchronization impacts analysis results. A study was performed to evaluate three different synchronization techniques: (i) measurements visually aligned to analytical time-response data using model comparison, (ii) Frequency Domain Decomposition (FDD), and (iii) lag from cross-correlation to align measurements. This paper presents the results of this study.
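The third technique, estimating the lag between two sensor records from the peak of their cross-correlation, can be sketched as follows. The signal, sample rate, and clock offset below are invented for illustration; this is not ISS flight data or the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 100.0                              # sample rate in Hz (hypothetical)
t = np.arange(0.0, 10.0, 1.0 / fs)
x = np.sin(2 * np.pi * 1.5 * t) + 0.05 * rng.standard_normal(t.size)

true_lag = 37                           # clock offset in samples (hypothetical)
a = x                                   # reference sensor record
b = x[true_lag:]                        # second sensor started 37 samples late

# The index of the peak of the full cross-correlation gives the relative lag.
xcorr = np.correlate(a, b, mode="full")
lag = int(np.argmax(xcorr)) - (len(b) - 1)

# Shift the reference record so both time histories line up.
a_aligned = a[lag:lag + len(b)]
print(lag)  # 37
```

With real measurements the two records contain independent noise, so the correlation peak is broader but the same argmax-of-cross-correlation estimate applies; sub-sample offsets would need interpolation around the peak.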

  3. Planning, Conducting, and Documenting Data Analysis for Program Improvement

    Science.gov (United States)

    Winer, Abby; Taylor, Cornelia; Derrington, Taletha; Lucas, Anne

    2015-01-01

    This 2015 document was developed to help technical assistance (TA) providers and state staff define and limit the scope of data analysis for program improvement efforts, including the State Systemic Improvement Plan (SSIP); develop a plan for data analysis; document alternative hypotheses and additional analyses as they are generated; and…

  4. Dispersion sensitivity analysis & consistency improvement of APFSDS

    Directory of Open Access Journals (Sweden)

    Sangeeta Sharma Panda

    2017-08-01

    In-bore balloting motion simulation shows that a reduction in residual spin of about 5% results in a drastic 56% reduction in first maximum yaw; a correlation between first maximum yaw and residual spin is observed. The results of the data analysis are used in design modification of the existing ammunition. A number of designs were evaluated numerically before five designs were frozen for further study. These designs are critically assessed in terms of their comparative performance during the in-bore travel and external ballistics phases. The results are validated by free-flight trials of the finalised design.

  5. Improvement of product design process by knowledge value analysis

    OpenAIRE

    XU, Yang; BERNARD, Alain; PERRY, Nicolas; LAROCHE, Florent

    2013-01-01

    Nowadays, design activities remain the core issue for global product development. As knowledge is more and more integrated, effective analysis of knowledge value becomes very useful for the improvement of product design processes. This paper aims at proposing a framework of knowledge value analysis in the context of product design process. By theoretical analysis and case study, the paper illustrates how knowledge value can be calculated and how the results can help the improvement of product...

  6. Atomic force microscopic comparison of remineralization with casein-phosphopeptide amorphous calcium phosphate paste, acidulated phosphate fluoride gel and iron supplement in primary and permanent teeth: An in-vitro study

    Directory of Open Access Journals (Sweden)

    Nikita Agrawal

    2014-01-01

    Full Text Available Context: Demineralization of tooth by erosion is caused by frequent contact between the tooth surface and acids present in soft drinks. Aim: The objective of the present study was to evaluate the remineralization potential of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) paste, 1.23% acidulated phosphate fluoride (APF) gel and an iron supplement on dental erosion by soft drinks in human primary and permanent enamel using atomic force microscopy (AFM). Materials and Methods: Specimens were made from 15 extracted primary and 15 permanent teeth, which were randomly divided into three treatment groups: CPP-ACP paste, APF gel and iron supplement. AFM was used for baseline readings, followed by a demineralization and remineralization cycle. Results and Statistics: Almost all groups of samples showed remineralization, i.e. a reduction in surface roughness, which was greatest with CPP-ACP paste. Statistical analysis was performed using one-way ANOVA and the Mann-Whitney U-test with P < 0.05. Conclusions: It can be concluded that the application of CPP-ACP paste is effective in preventing dental erosion from soft drinks.

  7. Recent improvement of the resonance analysis methods

    International Nuclear Information System (INIS)

    Sirakov, I.; Lukyanov, A.

    2000-01-01

    By the use of a two-step method called Combined, the R-matrix Wigner-Eisenbud representation in resonance reaction theory has been converted into other equivalent representations (parameterizations) of the collision matrix with poles in the E domain. Two of them, called the Capture Elimination (CE) and Reaction Elimination (RE) representations respectively, have energy-independent parameters and are both rigorous and applicable. The CE representation is essentially a generalization of the Reich-Moore (RM) formalism. The RE representation, in turn, offers some distinct advantages when analyzing fissile nuclei. The latter does not require any approximation for the capture channels and does not need any assumption about the number of fission channels, in contrast to the RM representation. Unlike the RM parameters, the RE ones are uniquely determined for applications in resonance analysis. When given in the RE representation, neutron cross sections of fissile nuclei in the resolved resonance region are presented through simple scalar expressions without the need of matrix inversion. Various computer codes have been developed to demonstrate the viability of the new method. The RM parameters of the fissile nuclei have been converted into equivalent RE parameters implying the RM assumptions (REFINE code). Conversely, the RE parameters have been converted into corresponding RM parameters when one fission channel is present and the RM parameter set is unique, e.g. Pu-239, J = 1 (REVERSE code). To further enhance the flexibility of the proposed method, the obtained RE parameters have been converted into equivalent Generalized Pole parameters (REFILE code), which are parameters of the rigorous pole expansion of the collision matrix in the √E domain. Equivalent sets of RM, RE and GP parameters of 239 Pu are given as an example. It has been pointed out that all the advantages of the newly proposed representation can be implemented through an independent evaluation of the RE resonance

  8. Casein Phosphopeptide-Amorphous Calcium Phosphate Reduces Streptococcus mutans Biofilm Development on Glass Ionomer Cement and Disrupts Established Biofilms.

    Science.gov (United States)

    Dashper, Stuart G; Catmull, Deanne V; Liu, Sze-Wei; Myroforidis, Helen; Zalizniak, Ilya; Palamara, Joseph E A; Huq, N Laila; Reynolds, Eric C

    2016-01-01

    Glass ionomer cements (GIC) are dental restorative materials that are suitable for modification to help prevent dental plaque (biofilm) formation. The aim of this study was to determine the effects of incorporating casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) into a GIC on the colonisation and establishment of Streptococcus mutans biofilms and the effects of aqueous CPP-ACP on established S. mutans biofilms. S. mutans biofilms were either established in flow cells before a single ten min exposure to 1% w/v CPP-ACP treatment or cultured in static wells or flow cells with either GIC or GIC containing 3% w/w CPP-ACP as the substratum. The biofilms were then visualised using confocal laser scanning microscopy after BacLight LIVE/DEAD staining. A significant decrease in biovolume and average thickness of S. mutans biofilms was observed in both static and flow cell assays when 3% CPP-ACP was incorporated into the GIC substratum. A single ten min treatment with aqueous 1% CPP-ACP resulted in a 58% decrease in biofilm biomass and thickness of established S. mutans biofilms grown in a flow cell. The treatment also significantly altered the structure of these biofilms compared with controls. The incorporation of 3% CPP-ACP into GIC significantly reduced S. mutans biofilm development, indicating another potential anticariogenic mechanism of this material. Additionally, aqueous CPP-ACP disrupted established S. mutans biofilms. The use of CPP-ACP containing GIC combined with regular CPP-ACP treatment may lower the S. mutans challenge.

  9. Casein Phosphopeptide-Amorphous Calcium Phosphate Reduces Streptococcus mutans Biofilm Development on Glass Ionomer Cement and Disrupts Established Biofilms.

    Directory of Open Access Journals (Sweden)

    Stuart G Dashper

    Full Text Available Glass ionomer cements (GIC) are dental restorative materials that are suitable for modification to help prevent dental plaque (biofilm) formation. The aim of this study was to determine the effects of incorporating casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) into a GIC on the colonisation and establishment of Streptococcus mutans biofilms and the effects of aqueous CPP-ACP on established S. mutans biofilms. S. mutans biofilms were either established in flow cells before a single ten min exposure to 1% w/v CPP-ACP treatment or cultured in static wells or flow cells with either GIC or GIC containing 3% w/w CPP-ACP as the substratum. The biofilms were then visualised using confocal laser scanning microscopy after BacLight LIVE/DEAD staining. A significant decrease in biovolume and average thickness of S. mutans biofilms was observed in both static and flow cell assays when 3% CPP-ACP was incorporated into the GIC substratum. A single ten min treatment with aqueous 1% CPP-ACP resulted in a 58% decrease in biofilm biomass and thickness of established S. mutans biofilms grown in a flow cell. The treatment also significantly altered the structure of these biofilms compared with controls. The incorporation of 3% CPP-ACP into GIC significantly reduced S. mutans biofilm development, indicating another potential anticariogenic mechanism of this material. Additionally, aqueous CPP-ACP disrupted established S. mutans biofilms. The use of CPP-ACP containing GIC combined with regular CPP-ACP treatment may lower the S. mutans challenge.

  10. Robust, Sensitive, and Automated Phosphopeptide Enrichment Optimized for Low Sample Amounts Applied to Primary Hippocampal Neurons

    NARCIS (Netherlands)

    Post, Harm; Penning, Renske; Fitzpatrick, Martin; Garrigues, L.B.; Wu, W.; Mac Gillavry, H.D.; Hoogenraad, C.C.; Heck, A.J.R.; Altelaar, A.F.M.

    2017-01-01

    Because of the low stoichiometry of protein phosphorylation, targeted enrichment prior to LC–MS/MS analysis is still essential. The trend in phosphoproteome analysis is shifting toward an increasing number of biological replicates per experiment, ideally starting from very low sample amounts,

  11. Advanced Analysis Cognition: Improving the Cognition of Intelligence Analysis

    Science.gov (United States)

    2013-09-01


  12. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  13. A large synthetic peptide and phosphopeptide reference library for mass spectrometry–based proteomics

    NARCIS (Netherlands)

    Marx, H.; Lemeer, S.; Schliep, J.E.; Matheron, L.I.; Mohammed, S.; Cox, J.; Mann, M.; Heck, A.J.R.; Kuster, B.

    2013-01-01

    We present a peptide library and data resource of >100,000 synthetic, unmodified peptides and their phosphorylated counterparts with known sequences and phosphorylation sites. Analysis of the library by mass spectrometry yielded a data set that we used to evaluate the merits of different search

  14. Improving the Individual Work Performance Questionnaire using Rasch analysis.

    OpenAIRE

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Buuren, S. van; Beek, A.J. van der; Vet, H.C.W. de

    2014-01-01

    Recently, the Individual Work Performance Questionnaire (IWPQ) version 0.2 was developed using Rasch analysis. The goal of the current study was to improve targeting of the IWPQ scales by including additional items. The IWPQ 0.2 (original) and 0.3 (including additional items) were examined using Rasch analysis. Additional items that showed misfit or did not improve targeting were removed from the IWPQ 0.3, resulting in a final IWPQ 1.0. Subsequently, the scales showed good model fit and relia...

  15. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

    OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was us...... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns....
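As a rough illustration of how factor analysis can group prescribing indicators (a sketch, not the study's actual method or data), the example below builds synthetic indicators for 180 hypothetical practices driven by two latent prescribing dimensions, then extracts factor loadings from the eigendecomposition of the correlation matrix in the principal-factor style:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical indicators for 180 practices, driven by two latent
# dimensions (e.g. overall prescribing volume and drug-choice quality).
n = 180
volume = rng.standard_normal(n)
quality = rng.standard_normal(n)
indicators = np.column_stack([
    volume + 0.3 * rng.standard_normal(n),    # e.g. DDD per 1000 patients
    volume + 0.3 * rng.standard_normal(n),    # e.g. prescriptions per patient
    quality + 0.3 * rng.standard_normal(n),   # e.g. share of recommended drugs
    quality + 0.3 * rng.standard_normal(n),   # e.g. share of generics
])

# Eigendecomposition of the correlation matrix, largest eigenvalues first.
corr = np.corrcoef(indicators, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Loadings on the two retained factors; indicators loading on the same
# factor are measuring the same underlying prescribing dimension.
loadings = eigvecs[:, :2] * np.sqrt(eigvals[:2])
```

Indicators that load together on the first factor would be interpreted jointly, which is the kind of insight into prescribing patterns the abstract describes; a production analysis would also apply communality estimation and factor rotation.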

  16. Improvements in analysis techniques for segmented mirror arrays

    Science.gov (United States)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, which allows surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement-vector deformation of collections of points allows analysis of predicted mechanical disturbances of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking has the advantage of being comparable to the measurements used in the assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  17. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.
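The item facility and discrimination indices used to select the 'best' items can be computed along these lines (a minimal sketch with made-up response data; the rest-score point-biserial is one common definition of discrimination):

```python
import numpy as np

def item_analysis(responses):
    """Classical item analysis: facility = proportion correct per item,
    discrimination = correlation of each item with the rest-of-test score."""
    r = np.asarray(responses, dtype=float)   # rows = examinees, cols = items
    facility = r.mean(axis=0)
    total = r.sum(axis=1)
    disc = np.array([np.corrcoef(r[:, j], total - r[:, j])[0, 1]
                     for j in range(r.shape[1])])
    return facility, disc

# Hypothetical responses from 5 examinees on 3 items (1 = correct).
data = [[1, 1, 0],
        [1, 0, 0],
        [1, 1, 1],
        [0, 0, 0],
        [1, 1, 1]]
fac, disc = item_analysis(data)
```

Items with mid-range facility and high positive discrimination would be retained for the tailored cloze.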

  18. Costs and returns analysis of improved and alternative cassava ...

    African Journals Online (AJOL)

The specific objective of the study was to analyze the costs and returns of improved and alternative technologies available to farmers in the study area, and their level of adoption of the new technologies. Data were collected from a random sample of 250 farmers and 30 extension staff in the three (3) agricultural zones ...

  19. Improving the effectiveness of geological prospecting with neutron activation analysis

    International Nuclear Information System (INIS)

    Fardy, J.J.

    1984-01-01

    Two examples of the use of neutron activation analysis to improve the effectiveness of geological prospecting are examined. The first is application to the direct hydrogeochemical prospecting for gold in surface waters. The second shows how multielement data banks produced by NAA for a geological formation provide a powerful method for the classification of ore bodies and sedimentary materials

  20. Improving the Individual Work Performance Questionnaire using Rasch analysis.

    NARCIS (Netherlands)

    Koopmans, L.; Bernaards, C.M.; Hildebrandt, V.H.; Buuren, S. van; Beek, A.J. van der; Vet, H.C.W. de

    2014-01-01

Recently, the Individual Work Performance Questionnaire (IWPQ) version 0.2 was developed using Rasch analysis. The goal of the current study was to improve targeting of the IWPQ scales by including additional items. The IWPQ 0.2 (original) and 0.3 (including additional items) were examined using Rasch analysis.
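For reference, the Rasch model underlying the IWPQ analysis expresses the probability of endorsing an item as a logistic function of person ability minus item difficulty; better targeting means item difficulties that cover the range of abilities in the sample. A minimal sketch:

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability of a positive response for a person with
    ability theta on an item with difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

When theta equals b, the endorsement probability is exactly 0.5; items far from the sample's ability distribution carry little measurement information.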

  1. Thermal hydraulic analysis of the JMTR improved LEU-core

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, Toshio; Nagao, Yoshiharu; Komukai, Bunsaku; Naka, Michihiro; Fujiki, Kazuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Takeda, Takashi [Radioactive Waste Management and Nuclear Facility Decommissioning Technology Center, Tokai, Ibaraki (Japan)

    2003-01-01

After investigation of a new core arrangement for the JMTR reactor intended to enhance fuel burn-up and consequently extend the operation period, the "improved LEU core", which utilizes 2 additional fuel elements in place of formerly installed reflector elements, was adopted. This report describes the results of the thermal-hydraulic analysis of the improved LEU core as a part of the safety analysis for the licensing. The analysis covers the steady state, abnormal operational transients, and accidents described in the annexes of the licensing documents as design basis events. Calculation conditions for the computer codes were conservatively determined based on the neutronic analysis results and other data. The analysis showed that the safety criteria for fuel temperature, DNBR, and primary coolant temperature were satisfied, and the results were used in the licensing. The operation license of the JMTR with the improved LEU core was granted in March 2001, and reactor operation with the new core started in November 2001 as the 142nd operation cycle. (author)

  2. Using Operational Analysis to Improve Access to Pulmonary Function Testing

    Directory of Open Access Journals (Sweden)

    Ada Ip

    2016-01-01

Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilization of testing resources. Results. Qualitative analysis demonstrated that stakeholder groups had discrepant views on access and capacity in the laboratory. Mean daily resource utilization was 0.64 (SD 0.15), with monthly average utilization consistently less than 0.75. Reserved testing slots for subspecialty clinics were poorly utilized, leaving many testing slots unfilled. When subspecialty demand exceeded the number of reserved slots, there was sufficient capacity in the pulmonary function schedule to accommodate the added demand. Findings were shared with stakeholders and influenced scheduling process improvements. Conclusion. This study highlights the importance of operational data to identify causes of poor access, guide system decision-making, and determine effects of improvement initiatives in a variety of healthcare settings. Importantly, simple operational analysis can help to improve the efficiency of health systems with little or no added financial investment.
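The utilization measure reported above (tests performed divided by available slots) can be sketched as follows, with hypothetical counts standing in for the laboratory's data:

```python
# Hypothetical daily counts against a hypothetical capacity of 20 slots/day;
# daily utilization = tests performed / slots available.
slots_per_day = 20
tests_done = [12, 14, 11, 16, 13]
util = [t / slots_per_day for t in tests_done]
mean_util = sum(util) / len(util)
sd_util = (sum((u - mean_util) ** 2 for u in util) / len(util)) ** 0.5
```

A mean utilization well below 1.0, as in the study, indicates unfilled capacity rather than true access limits.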

  3. Direct effects of casein phosphopeptides on growth and differentiation of in vitro cultured osteoblastic cells (MC3T3-E1).

    Science.gov (United States)

    Tulipano, Giovanni; Bulgari, Omar; Chessa, Stefania; Nardone, Alessandro; Cocchi, Daniela; Caroli, Anna

    2010-02-25

Casein phosphopeptides (CPPs), obtained by enzymatic hydrolysis of caseins in vitro, have been shown to enhance calcium solubility and to increase the calcification of embryonic rat bones in the diaphyseal area. Little is known about the direct effects of CPPs on cultured osteoblastic cells. Calcium in the microenvironment surrounding bone cells is not only important for the mineralization of the extracellular matrix; it is also believed to provide preosteoblasts with a signal that modulates their proliferation and differentiation. The aim of the present study was to investigate the direct effects of four selected casein phosphopeptides on osteoblastic cell (MC3T3-E1) viability and differentiation. The selected peptides were obtained by chemical synthesis and differed in the number of phosphorylated sites and in the amino acid spacing between two phosphorylated sites, in order to further characterize the relationship between structure and function. The results obtained in this work demonstrate that CPPs may directly affect osteoblast-like cell growth, calcium uptake, and ultimately calcium deposition in the extracellular matrix. The effects exerted by distinct CPPs on osteogenesis in vitro can be either stimulatory or inhibitory. Differential short amino acid sequences in their molecules, like the -SpEE- and -SpTSpEE- motifs, are likely the molecular determinants of their biological activities on osteoblastic cells. Moreover, two genetic variants of CPPs showing one amino acid change in their sequence may profoundly differ in their biological activities. Finally, our data may also suggest important clues about the role of intrinsic phosphorylated peptides derived from endogenous phosphorylated proteins in bone metabolism, apart from extrinsic CPPs. Copyright 2009 Elsevier B.V. All rights reserved.

  4. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  5. Development and improvement of safety analysis code for geological disposal

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

In order to confirm long-term safety concerning geological disposal, a probabilistic safety assessment code and other analysis codes, which can evaluate the probability of each event and its influence on the engineered barrier and natural barrier, were introduced. We confirmed the basic functions of those codes and studied the relation between those functions and the FEP/PID which should be taken into consideration in safety assessment. We are planning to develop a 'Nuclide Migration Assessment System' for the purpose of improving the efficiency of assessment work, preventing human error in analysis, and assuring the quality of the analysis environment and analysis work for safety assessment. As the first step, we defined the system requirements and decided the system composition and the functions to be implemented based on those requirements. (author)

  6. Light ion microbeam analysis / processing system and its improvement

    International Nuclear Information System (INIS)

    Koka, Masashi; Ishii, Yasuyuki; Yamada, Naoto; Ohkubo, Takeru; Kamiya, Tomihiro; Satoh, Takahiro; Kada, Wataru; Kitamura, Akane; Iwata, Yoshihiro

    2016-03-01

A MeV-class light ion microbeam system has been developed for micro-analysis and micro-fabrication with high spatial resolution at the 3-MV single-ended accelerator of Takasaki Ion Accelerators for Advanced Radiation Application, Takasaki Advanced Radiation Research Institute, Sector of Nuclear Science Research, Japan Atomic Energy Agency. This report describes the technical improvements to the main apparatus (the accelerator, beam-transport lines, and microbeam system) and to auxiliary equipment/parts for ion beam applications such as Particle Induced X-ray/Gamma-ray Emission (PIXE/PIGE) analysis, 3-D element distribution analysis using PIXE Computed Tomography (CT), Ion Beam Induced Luminescence (IBIL) analysis, and Proton Beam Writing with microbeam scanning, with a functional outline of these apparatus and equipment/parts. (author)

  7. Assessment of phosphopeptide enrichment/precipitation method for LC-MS/MS based phosphoproteomic analysis of plant tissue

    DEFF Research Database (Denmark)

    Ye, Juanying; Rudashevskaya, Elena; Hansen, Thomas Aarup

    ... Arabidopsis thaliana (Col-0) leaves using a two-phase partitioning system. The concentration of plasma membrane protein was determined by Bradford assay. Protein was digested with Lys-C for 4 hours and then by trypsin overnight. The peptide mixture was purified with IMAC, TiO2, CPP, SIMAC (IMAC+TiO2), the combination...

  8. Performance analysis and improvement of WPAN MAC for home networks.

    Science.gov (United States)

    Mehta, Saurabh; Kwak, Kyung Sup

    2010-01-01

    The wireless personal area network (WPAN) is an emerging wireless technology for future short range indoor and outdoor communication applications. The IEEE 802.15.3 medium access control (MAC) is proposed to coordinate the access to the wireless medium among the competing devices, especially for short range and high data rate applications in home networks. In this paper we use analytical modeling to study the performance analysis of WPAN (IEEE 802.15.3) MAC in terms of throughput, efficient bandwidth utilization, and delay with various ACK policies under error channel condition. This allows us to introduce a K-Dly-ACK-AGG policy, payload size adjustment mechanism, and Improved Backoff algorithm to improve the performance of the WPAN MAC. Performance evaluation results demonstrate the impact of our improvements on network capacity. Moreover, these results can be very useful to WPAN application designers and protocol architects to easily and correctly implement WPAN for home networking.
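A simplified version of this kind of throughput model, assuming an independent bit-error channel and a basic send-and-ACK policy (not the paper's full K-Dly-ACK-AGG analysis), might look like:

```python
def frame_success(ber, bits):
    """Probability a frame of `bits` arrives intact on an independent
    bit-error channel with bit error rate `ber`."""
    return (1.0 - ber) ** bits

def throughput(payload_bits, overhead_bits, ber, rate_bps):
    """Effective throughput of a simple send-and-ACK policy: only the
    payload of successfully received frames counts toward throughput."""
    total = payload_bits + overhead_bits
    return frame_success(ber, total) * payload_bits / (total / rate_bps)
```

Models of this shape explain why payload-size adjustment helps: larger frames amortize overhead but fail more often at a given BER, so an intermediate size maximizes throughput.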

  10. Improving Immunizations in Children: A Clinical Break-even Analysis.

    Science.gov (United States)

    Jones, Kyle Bradford; Spain, Chad; Wright, Hannah; Gren, Lisa H

    2015-06-01

Immunizing the population is a vital public health priority. This article describes a resident-led continuous quality improvement project to improve the immunization rates of children under 3 years of age at two urban family medicine residency clinics in Salt Lake City, Utah, as well as a break-even cost analysis of the intervention for the clinics. Immunization records were distributed to provider-medical assistant teamlets daily for each pediatric patient scheduled in clinic, to decrease missed opportunities. An outreach intervention by letter, followed by telephone call reminders, was conducted to reach children under 3 years of age who were behind on recommended immunizations for age (total n=457; behind on immunizations n=101). Immunization rates were monitored at 3 months following the start of the intervention, and a break-even analysis of the outreach intervention was performed for the clinics. Immunization rates improved from a baseline of 75.1% (n=133) and 79.6% (n=223) at the two clinics to 92.1% (n=163) and 89.6% (n=251), respectively, at 3 months following the start of the intervention. The break-even point required 36 immunizations to be administered. Significant improvement in the immunization rate of patients under 3 years of age at two family medicine residency training clinics was achieved by decreasing missed opportunities for immunization in clinic and through outreach by letters and follow-up phone calls. The intervention showed positive revenue for both clinics. © 2015 Marshfield Clinic.
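The break-even logic can be sketched as follows; the dollar figures are hypothetical stand-ins (not from the article), chosen only to illustrate how a 36-immunization break-even point could arise:

```python
import math

def break_even_units(fixed_cost, revenue_per_unit, variable_cost_per_unit):
    """Smallest whole number of units at which revenue covers total cost."""
    margin = revenue_per_unit - variable_cost_per_unit
    return math.ceil(fixed_cost / margin)

# Hypothetical figures: $1800 outreach cost, $75 billed and $25 variable
# cost per immunization.
n = break_even_units(1800, 75, 25)
```

Any immunizations administered beyond the break-even count represent positive net revenue from the intervention.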

  11. Training needs analysis for MSMEs: how to improve training effectiveness

    Science.gov (United States)

    Rohayati, Y.; Wulandari, S.

    2017-12-01

The study aims to analyze training needs for MSMEs in the area of Kabupaten Bandung, selecting the case of MSMEs belonging to the Association for Agricultural Product Processing and focusing on marketing as the main topic of the training. The needs analysis was required to improve training participation and effectiveness. Both aspects are important to notice, since making MSMEs participate in training is not an easy task. Similarly, the needs analysis was carried out to counter participants' perception that the training gives no benefit or is ineffective because it does not meet their needs, when it is actually meant to help MSMEs improve their marketing knowledge and thereby their success. This research involved 100 MSMEs with business ages ranging from less than five years to more than 15 years, dominated by MSMEs targeting local marketing areas. The data were collected by survey with a judgmental sampling technique. From a descriptive analysis, it can be concluded that marketing training materials for MSMEs should focus on improving marketing skills such as product development, sales, and the use of marketing media, as well as legal aspects such as the need for certification and product branding. The results of the study also indicate a need for training supplemented by visits to more successful MSMEs and by practice using on-the-job training methods.

  12. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as a part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how the degraded plant condition affects the operator cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the HRA method developed. HEPs of the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results obtained with the two methods are compared to depict their differences and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of the factors to be considered in the evaluation of human errors, incorporation of degraded plant safety conditions into HRA, and investigation of HEPs affected by the contents of operator tasks were made to improve the HRA method, which can integrate an operator cognitive action model into the ATHENA method. In addition, the detailed procedures of the improved method were delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies; these cases were also evaluated using the THERP method.
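As an illustration of the kind of calculation THERP involves, the standard THERP dependence equations adjust a nominal HEP upward when a task depends on a preceding failed task (a sketch of the textbook equations, not the JNES method itself):

```python
def conditional_hep(nominal, dependence):
    """Conditional HEP for a task given failure of the preceding task,
    using the standard THERP dependence-level equations."""
    n = nominal
    return {"zero":     n,
            "low":      (1 + 19 * n) / 20,
            "moderate": (1 + 6 * n) / 7,
            "high":     (1 + n) / 2,
            "complete": 1.0}[dependence]
```

Even a small nominal HEP rises sharply under high dependence, which is one reason dependence modeling dominates THERP sequence results.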

  13. Improved hydrogen combustion model for multi-compartment analysis

    International Nuclear Information System (INIS)

    Ogino, Masao; Hashimoto, Takashi

    2000-01-01

NUPEC has been improving a hydrogen combustion model in the MELCOR code for severe accident analysis. In the proposed combustion model, the flame velocity in a node is predicted using six different flame front shapes: fireball, prism, bubble, spherical jet, plane jet, and parallelepiped. A verification study of the proposed model was carried out using the NUPEC large-scale combustion test results, following previous work in which the GRS/Battelle multi-compartment combustion test results had been used. The test cases selected for the study were the premixed test and the scenario-oriented test, which simulated the severe accident sequences of an actual plant. The MELCOR code incorporating the proposed model predicted the results of both the premixed test and the scenario-oriented test of the NUPEC large-scale tests sufficiently well. The improved MELCOR code was confirmed to simulate the combustion behavior in a multi-compartment containment vessel during a severe accident with an acceptable degree of accuracy. Application of the new model to LWR severe accident analysis will be continued. (author)

  14. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

The objective of this paper is to improve the recognition of captured QR code images blurred by defocus, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality; focus is an important factor that affects the quality of the image. This study discusses out-of-focus QR code images and aims to improve the recognition of the contents of the QR code image. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image, and this method is also used in this investigation. A blurred QR code image is separated into nine levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image, and the reconstructed QR code images are then compared. The final experimental results indicate improvements in identification.
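A pillbox (circular averaging) kernel of the kind used to simulate defocus can be generated as follows (a minimal numpy sketch; the radius value is illustrative):

```python
import numpy as np

def pillbox_kernel(radius):
    """Circular averaging (pillbox) kernel: uniform weight inside a disc
    of the given radius, normalized to sum to 1."""
    r = int(np.ceil(radius))
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    k = ((x ** 2 + y ** 2) <= radius ** 2).astype(float)
    return k / k.sum()

# Convolving an image with this kernel simulates one defocus level;
# larger radii give stronger blur.
k = pillbox_kernel(3)
```

Varying the radius over a range of values is one plausible way to produce the nine blur levels the study describes.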

  15. Improvement of numerical analysis method for FBR core characteristics. 3

    International Nuclear Information System (INIS)

    Takeda, Toshikazu; Yamamoto, Toshihisa; Kitada, Takanori; Katagi, Yousuke

    1998-03-01

As part of the improvement of numerical analysis methods for FBR core characteristics, studies on several topics have been conducted: the multiband method, Monte Carlo perturbation, and the nodal transport method. This report is composed of the following three parts. Part 1: Improvement of the Reaction Rate Calculation Method in the Blanket Region Based on the Multiband Method. A method was developed for precise evaluation of the reaction rate distribution in the blanket region using the multiband method. With the 3-band parameters obtained from the ordinary fitting method, major reaction rates such as the U-238 capture, U-235 fission, Pu-239 fission and U-238 fission rate distributions were analyzed. Part 2: Improvement of the Reactivity Estimation Method Based on Monte-Carlo Perturbation Theory. Perturbation methods based on Monte-Carlo perturbation theory were investigated and introduced into the calculation code. The Monte-Carlo perturbation code was applied to the MONJU core and the calculated results were compared to the reference. Part 3: Improvement of Nodal Transport Calculation for Hexagonal Geometry. A method to evaluate the intra-subassembly power distribution from the nodal averaged neutron flux and the surface fluxes at the node boundaries was developed based on transport theory. (J.P.N.)

  16. Micromechanical analysis of polyacrylamide-modified concrete for improving strengths

    Energy Technology Data Exchange (ETDEWEB)

    Sun Zengzhi [School of Materials Science and Engineering, Chang' an University, Xi' an 710064 (China)], E-mail: zz-sun@126.com; Xu Qinwu [Pavement research, Transtec Group Inc., Austin 78731 (United States)], E-mail: qinwu_xu@yahoo.com

    2008-08-25

This paper studies how polyacrylamide (PAM) alters the physicochemical and mechanical properties of concrete. The microstructure of PAM-modified concrete and the physicochemical reaction between PAM and concrete were studied through scanning electron microscope (SEM), differential thermal analysis (DTA), thermal gravimetric analysis (TGA), and infrared spectrum analysis. Meanwhile, the workability and strengths of cement paste and concrete were tested. PAM's modification mechanism was also discussed. Results indicate that PAM reacts with the Ca²⁺ and Al³⁺ cations produced by concrete hydration to form ionic compounds and reduce the crystallization of Ca(OH)₂, acting as a flexible filler and reinforcement in the porosity of concrete and, therefore, improving concrete's engineering properties. PAM also significantly alters the microstructure at the aggregate-cement interfacial transition zone. Mechanical testing results indicate that the fluidity of cement paste decreases initially, then increases, and decreases again with increasing PAM content. PAM can effectively improve the flexural strength, bonding strength, dynamic impact resistance, and fatigue life of concrete, though it reduces the compressive strength to some extent.

  18. Security analysis and improvements to the PsychoPass method.

    Science.gov (United States)

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

In a recent paper, Pietro Cipresso et al. proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. The objective was to perform a security analysis of the PsychoPass method and outline its limitations and possible improvements. We used brute-force analysis and dictionary-attack analysis of the PsychoPass method to outline its weaknesses. The first issue with the PsychoPass method is that it requires password reproduction on the same keyboard layout as was used to generate the password. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.
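The brute-force reasoning behind such an analysis reduces to keyspace size: entropy grows with character-set size and password length, and cracking time with the keyspace. A hedged sketch (the guess rate is an assumed figure, not from the paper):

```python
import math

def entropy_bits(charset_size, length):
    """Entropy in bits of a uniformly random password."""
    return length * math.log2(charset_size)

def years_to_crack(charset_size, length, guesses_per_second=1e10):
    """Expected brute-force time (average case: half the keyspace),
    at an assumed attacker guess rate."""
    return (charset_size ** length / 2) / guesses_per_second / (3600 * 24 * 365)
```

This is why a 24-character password drawn from a small effective character set can still be weak, while widening the set with SHIFT/ALT-GR combinations raises the per-character entropy.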

  19. Modified paraffin wax for improvement of histological analysis efficiency.

    Science.gov (United States)

    Lim, Jin Ik; Lim, Kook-Jin; Choi, Jin-Young; Lee, Yong-Keun

    2010-08-01

Paraffin wax is usually used as an embedding medium for the histological analysis of natural tissue. However, it is not easy to obtain a sufficient number of satisfactorily sectioned slices because of the difference in mechanical properties between the paraffin and the embedded tissue. We describe a modified paraffin wax that can improve the histological analysis efficiency of natural tissue, composed of paraffin and ethylene vinyl acetate (EVA) resin (0, 3, 5, and 10 wt %). The softening temperature of the paraffin/EVA media was similar to that of paraffin (50-60 degrees C), and the paraffin/EVA media dissolved completely in xylene after 30 min at 50 degrees C. Physical properties such as the load under the same compressive displacement, elastic recovery, and crystal intensity increased with increasing EVA content. The 5 wt % EVA medium was regarded as the optimal composition, based on the sectioning efficiency measured by the number of unimpaired sectioned slices, the load under the same compressive displacement, and the elastic recovery test. Based on staining tests of sectioned slices embedded in the 5 wt % EVA medium with hematoxylin and eosin (H&E), Masson trichrome (MT), and other stains, it was concluded that the modified paraffin wax can improve histological analysis efficiency with various natural tissues. (c) 2010 Wiley-Liss, Inc.

  20. Toward improved analysis of concentration data: Embracing nondetects.

    Science.gov (United States)

    Shoari, Niloofar; Dubé, Jean-Sébastien

    2018-03-01

Various statistical tests on concentration data serve to support decision-making regarding characterization and monitoring of contaminated media, assessing exposure to a chemical, and quantifying the associated risks. However, the routine statistical protocols cannot be directly applied because of challenges arising from nondetects or left-censored observations, which are concentration measurements below the detection limit of measuring instruments. Despite the existence of techniques based on survival analysis that can adjust for nondetects, these are seldom taken into account properly. A comprehensive review of the literature showed that managing policies regarding analysis of censored data do not always agree and that guidance from regulatory agencies may be outdated. Therefore, researchers and practitioners commonly resort to the most convenient way of tackling the censored data problem by substituting nondetects with arbitrary constants prior to data analysis, although this is generally regarded as a bias-prone approach. Hoping to improve the interpretation of concentration data, the present article aims to familiarize researchers in different disciplines with the significance of left-censored observations and provides theoretical and computational recommendations (under both frequentist and Bayesian frameworks) for adequate analysis of censored data. In particular, the present article synthesizes key findings from previous research with respect to 3 noteworthy aspects of inferential statistics: estimation of descriptive statistics, hypothesis testing, and regression analysis. Environ Toxicol Chem 2018;37:643-656. © 2017 SETAC.
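The survival-analysis treatment of nondetects replaces substitution with a censored likelihood: detected values contribute their density, while each nondetect contributes the probability mass below its detection limit. A minimal sketch under a normal model with hypothetical data (a coarse grid search stands in for a proper optimizer):

```python
import math

def norm_logpdf(x, mu, s):
    return -0.5 * math.log(2 * math.pi) - math.log(s) - 0.5 * ((x - mu) / s) ** 2

def norm_logcdf(x, mu, s):
    p = 0.5 * (1.0 + math.erf((x - mu) / (s * math.sqrt(2.0))))
    return math.log(max(p, 1e-300))  # clamp to avoid log(0) far in the tail

def censored_nll(mu, s, detects, detection_limits):
    """Negative log-likelihood: density for detected values, plus the
    CDF mass below the detection limit for each nondetect."""
    ll = sum(norm_logpdf(x, mu, s) for x in detects)
    ll += sum(norm_logcdf(dl, mu, s) for dl in detection_limits)
    return -ll

# Hypothetical data: four detects and two nondetects censored at DL = 1.0.
detects, dls = [1.2, 1.5, 2.0, 2.3], [1.0, 1.0]
grid = [(m / 100, s / 100) for m in range(0, 301, 5) for s in range(20, 201, 5)]
mu_hat, s_hat = min(grid, key=lambda p: censored_nll(p[0], p[1], detects, dls))
```

As expected, the censored-likelihood mean estimate falls below the mean of the detects alone, since the nondetects carry information that substitution methods distort.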

  1. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, the factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from the safety literature and validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs). Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems, characterized by the largest gaps, were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.
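The gap computation itself is simple: for each factor, gap = mean importance rating minus mean actual-status rating, ranked largest first. A sketch with hypothetical ratings (the study's own survey values are not reproduced here):

```python
# Hypothetical mean ratings on a 5-point scale: (importance, actual status).
factors = {
    "management support":             (4.8, 3.1),
    "appropriate supervision":        (4.6, 3.2),
    "sufficient resource allocation": (4.5, 3.3),
    "teamwork":                       (4.4, 3.4),
}
gaps = {f: imp - act for f, (imp, act) in factors.items()}
ranked = sorted(gaps, key=gaps.get, reverse=True)  # largest gap first
```

Factors at the top of the ranking are the ones where raising actual status toward importance promises the largest improvement in the safety program.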

  2. Crystal quality analysis and improvement using x-ray topography

    International Nuclear Information System (INIS)

    Maj, J.; Goetze, K.; Macrander, A.; Zhong, Y.; Huang, X.; Maj, L.

    2008-01-01

    The Topography X-ray Laboratory of the Advanced Photon Source (APS) at Argonne National Laboratory operates as a collaborative effort with APS users to produce high performance crystals for APS X-ray beamline experiments. For many years the topography laboratory has worked closely with an on-site optics shop to help ensure the production of crystals with the highest quality, most stress-free surface finish possible. It has been instrumental in evaluating and refining methods used to produce high quality crystals. Topographical analysis has been shown to be an effective method to quantify and determine the distribution of stresses, to help identify methods that would mitigate the stresses and improve the rocking curve, and to create CCD images of the crystal. This paper describes the topography process and offers methods for reducing crystal stresses in order to substantially improve the crystal optics.

  3. Harmonic analysis of electrified railway based on improved HHT

    Science.gov (United States)

    Wang, Feng

    2018-04-01

    In this paper, the causes and harms of harmonics in the electric locomotive power supply system are first studied and analyzed. Based on the characteristics of these harmonics, the Hilbert-Huang transform (HHT) method is introduced. Following an in-depth analysis of the empirical mode decomposition (EMD) method and the Hilbert transform, the causes of, and solutions to, the endpoint effect and the modal aliasing problem in the HHT method are explored. For the endpoint effect, this paper uses a point-symmetric extension method to extend the collected data; to address the modal aliasing problem, it preprocesses the signal with a high-frequency auxiliary harmonic and gives an empirical formula for that auxiliary harmonic. Finally, combining the suppression of the HHT endpoint effect and of modal aliasing, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
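
    The point-symmetric extension used against the EMD endpoint effect can be sketched in a few lines. This is a minimal illustration of the general technique (odd reflection about each endpoint), not the paper's exact formulation:

    ```python
    def point_symmetric_extend(x, n_ext):
        """Extend a signal at both ends by point symmetry (odd reflection)
        about each endpoint, so the envelope splines used in EMD have
        support beyond the measured record and the endpoint effect shrinks."""
        left = [2.0 * x[0] - x[i] for i in range(n_ext, 0, -1)]
        right = [2.0 * x[-1] - x[-1 - i] for i in range(1, n_ext + 1)]
        return left + list(x) + right
    ```

    For example, extending [0, 1, 2, 3] by two samples yields [-2, -1, 0, 1, 2, 3, 4, 5]; after decomposition, the extended portions are discarded.
    
    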

  4. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  5. Improved signal analysis for motional Stark effect data

    International Nuclear Information System (INIS)

    Makowski, M.A.; Allen, S.L.; Ellis, R.; Geer, R.; Jayakumar, R.J.; Moller, J.M.; Rice, B.W.

    2005-01-01

    Nonideal effects in the optical train of the motional Stark effect diagnostic have been modeled using the Mueller matrix formalism. The effects examined are birefringence in the vacuum windows, an imperfect reflective mirror, and signal pollution due to the presence of a circularly polarized light component. Relations for the measured intensity ratio are developed for each case. These relations suggest fitting functions to more accurately model the calibration data. One particular function, termed the tangent offset model, is found to fit the data for all channels better than the currently used tangent slope function. Careful analysis of the calibration data with the fitting functions reveals that a nonideal effect is present in the edge array and is attributed to nonideal performance of a mirror in that system. The result of applying the fitting function to the analysis of our data has been to improve the equilibrium reconstruction.

  6. MITG post-test analysis and design improvements

    International Nuclear Information System (INIS)

    Schock, A.

    1983-01-01

    The design, performance analysis, and key attributes of the Modular Isotopic Thermoelectric Generator (MITG) were described in a 1981 IECEC paper; and the design, fabrication, and testing of prototypical MITG test assemblies were described in preceding papers in these proceedings. Each test assembly simulated a typical modular slice of the flight generator. The present paper describes a detailed thermal-stress analysis, which identified the causes of stress-related problems observed during the tests. It then describes how additional analyses were used to evaluate design changes to alleviate those problems. Additional design improvements are discussed in the next paper in these proceedings, which also describes revised fabrication procedures and updated performance estimates for the generator.

  7. Improvements and experience in the analysis of reprocessing samples

    International Nuclear Information System (INIS)

    Koch, L.; Cricchio, A.; Meester, R. de; Romkowski, M.; Wilhelmi, M.; Arenz, H.J.; Stijl, E. van der; Baeckmann, A. von

    1976-01-01

    Improvements in the analysis of input samples for reprocessing were obtained. To cope with the decomposition of reprocessing input solutions owing to the high radioactivity, an aluminium capsule technique was developed. A known amount of the dissolver solution was weighed into an aluminium can, dried, and the capsule was sealed. In this form, the sample could be stored over a long period and could be redissolved later for the analysis. The isotope correlation technique offers an attractive alternative for measuring the plutonium isotopic content in the dissolver solution. Moreover, this technique allows for consistency checks of analytical results. For this purpose, a data bank of correlated isotopic data is in use. To improve the efficiency of analytical work, four automatic instruments have been developed. The conditioning of samples for the U-Pu isotopic measurement was achieved by an automatic ion exchanger. A mass spectrometer, to which a high vacuum lock is connected, allows the automatic measurement of U-Pu samples. A process-computer controls the heating, focusing and scanning processes during the measurement and evaluates the data. To ease the data handling, alpha-spectrometry as well as a balance have been automated. (author)

  8. [Analysis of thickening polysaccharides by the improved diethyldithioacetal derivatization method].

    Science.gov (United States)

    Akiyama, Takumi; Yamazaki, Takeshi; Tanamoto, Kenichi

    2011-01-01

    The identification test for thickening polysaccharides containing neutral saccharides and uronic acids was investigated by GC analysis of constituent monosaccharides. The reported method, in which monosaccharides were converted to diethyldithioacetal derivatives with ethanethiol followed by trimethylsilylation, was improved in terms of operability and reproducibility of GC/MS analysis. The suitability of the improved diethyldithioacetal derivatization method was determined for seven thickening polysaccharides, i.e., carob bean gum, guar gum, karaya gum, gum arabic, gum ghatti, tragacanth gum and peach gum. The samples were acid-hydrolyzed to form monosaccharides. The hydrolysates were derivatized and analyzed with GC/FID. Each sugar derivative was detected as a single peak and was well separated from others on the chromatograms. The amounts of constituent monosaccharides in thickening polysaccharides were successfully estimated. Seven polysaccharides were distinguished from each other on the basis of constituent monosaccharides. Further examination of the time period of hydrolysis of polysaccharides using peach gum showed that the optimal times were not the same for all monosaccharides. A longer time was needed to hydrolyze glucuronic acid than neutral saccharides. The findings suggest that hydrolysis time may sometimes affect the analytical results on composition of constituent monosaccharides in polysaccharides.

  9. Multispectral fingerprinting for improved in vivo cell dynamics analysis

    Directory of Open Access Journals (Sweden)

    Cooper Cameron HJ

    2010-09-01

    Full Text Available Abstract Background Tracing cell dynamics in the embryo becomes tremendously difficult when cell trajectories cross in space and time and tissue density obscures individual cell borders. Here, we used the chick neural crest (NC) as a model to test multicolor cell labeling and multispectral confocal imaging strategies to overcome these roadblocks. Results We found that multicolor nuclear cell labeling and multispectral imaging led to improved resolution of in vivo NC cell identification by providing a unique spectral identity for each cell. NC cell spectral identity allowed for more accurate cell tracking and was consistent during short term time-lapse imaging sessions. Computer model simulations predicted significantly better object counting for increasing cell densities in 3-color compared to 1-color nuclear cell labeling. To better resolve cell contacts, we show that a combination of 2-color membrane and 1-color nuclear cell labeling dramatically improved the semi-automated analysis of NC cell interactions, yet preserved the ability to track cell movements. We also found channel versus lambda scanning of multicolor labeled embryos significantly reduced the time and effort of image acquisition and analysis of large 3D volume data sets. Conclusions Our results reveal that multicolor cell labeling and multispectral imaging provide a cellular fingerprint that may uniquely determine a cell's position within the embryo. Together, these methods offer a spectral toolbox to resolve in vivo cell dynamics in unprecedented detail.

  10. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
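
    The alias method they implement can be sketched as follows. This is a generic Vose-style construction for illustration; the names and structure are not taken from the MCNP subroutine:

    ```python
    import random

    def build_alias_table(probs):
        """Vose's alias method: O(n) setup for O(1) sampling of a discrete
        probability distribution over n outcomes (e.g. source voxels)."""
        n = len(probs)
        scaled = [p * n for p in probs]
        prob, alias = [0.0] * n, [0] * n
        small = [i for i, p in enumerate(scaled) if p < 1.0]
        large = [i for i, p in enumerate(scaled) if p >= 1.0]
        while small and large:
            s, l = small.pop(), large.pop()
            prob[s], alias[s] = scaled[s], l        # column s: keep s or fall to alias l
            scaled[l] -= 1.0 - scaled[s]            # l donates its excess mass to s
            (small if scaled[l] < 1.0 else large).append(l)
        for i in small + large:                     # leftovers are exactly full columns
            prob[i] = 1.0
        return prob, alias

    def sample(prob, alias, rng=random):
        i = rng.randrange(len(prob))                # pick a column uniformly: O(1)
        return i if rng.random() < prob[i] else alias[i]
    ```

    Each draw costs one uniform integer and one uniform float regardless of how many voxels the mesh has, which is what makes fine discretizations affordable.
    
    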

  11. Does plyometric training improve strength performance? A meta-analysis.

    Science.gov (United States)

    Sáez-Sáez de Villarreal, Eduardo; Requena, Bernardo; Newton, Robert U

    2010-09-01

    The majority of the research suggests plyometric training (PT) improves maximal strength performance as measured by 1RM, isometric MVC or slow-velocity isokinetic testing. However, the effectiveness of PT depends upon various factors. A meta-analysis of 15 studies with a total of 31 effect sizes (ES) was carried out to analyse the role of various factors on the effects of PT on strength performance. The inclusion criteria for the analysis were: (a) studies using PT programs for lower limb muscles; (b) studies employing true experimental design and valid and reliable measurements; (c) studies including sufficient data to calculate ES. When subjects can adequately follow plyometric exercises, the training gains are independent of fitness level. Subjects in either good or poor physical condition benefit equally from plyometric work, and men obtain similar strength results to women following PT. In relation to the variables of program design, a training volume of less than 10 weeks with more than 15 sessions, as well as the implementation of high-intensity programs with more than 40 jumps per session, were the strategies that seem to maximize the probability of obtaining significantly greater improvements in performance (p<0.05). In order to optimise strength enhancement, the combination of different types of plyometrics with weight-training would be recommended, rather than utilizing only one form (p<0.05). The responses identified in this analysis are essential and should be considered by the strength and conditioning professional with regard to the most appropriate dose-response trends for PT to optimise strength gains.

  12. Improving patient safety in radiotherapy through error reporting and analysis

    International Nuclear Information System (INIS)

    Findlay, Ú.; Best, H.; Ottrey, M.

    2016-01-01

    Aim: To improve patient safety in radiotherapy (RT) through the analysis and publication of radiotherapy errors and near misses (RTE). Materials and methods: RTE are submitted on a voluntary basis by NHS RT departments throughout the UK to the National Reporting and Learning System (NRLS) or directly to Public Health England (PHE). RTE are analysed by PHE staff using frequency trend analysis based on the classification and pathway coding from Towards Safer Radiotherapy (TSRT). PHE, in conjunction with the Patient Safety in Radiotherapy Steering Group, publish learning from these events on a triannual basis, summarised biennially, so their occurrence might be mitigated. Results: Since the introduction of this initiative in 2010, over 30,000 RTE reports have been submitted. The number of RTE reported in each biennial cycle has grown, ranging from 680 (2010) to 12,691 (2016) RTE. The vast majority of the RTE reported are lower level events, thus not affecting the outcome of patient care. Of the level 1 and 2 incidents reported, it is known that the majority of them affected only one fraction of a course of treatment. This means that corrective action could be taken over the remaining treatment fractions so the incident did not have a significant impact on the patient or the outcome of their treatment. Analysis of the RTE reports demonstrates that generation of error is not confined to one professional group or to any particular point in the pathway. It also indicates that the pattern of errors is replicated across service providers in the UK. Conclusion: Use of the terminology, classification and coding of TSRT, together with implementation of the national voluntary reporting system described within this report, allows clinical departments to compare their local analysis to the national picture. Further opportunities to improve learning from this dataset must be exploited through development of the analysis and of proactive risk management strategies.

  13. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses, a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.
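
    As an illustration of one such convective field, the lifted condensation level can be estimated from surface temperature and dewpoint with Espy's well-known approximation. This is a generic textbook formula, not necessarily the one INCA uses:

    ```python
    def lcl_height_m(temp_c, dewpoint_c):
        """Espy's approximation: the lifted condensation level lies roughly
        125 m above the surface per degree Celsius of dewpoint depression."""
        return 125.0 * (temp_c - dewpoint_c)
    ```

    For example, a surface parcel at 25 °C with a 15 °C dewpoint would be expected to condense near 1250 m above ground.
    
    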

  14. Response surface analysis to improve dispersed crude oil biodegradation

    Energy Technology Data Exchange (ETDEWEB)

    Zahed, Mohammad A.; Aziz, Hamidi A.; Mohajeri, Leila [School of Civil Engineering, Universiti Sains Malaysia, Nibong Tebal, Penang (Malaysia); Isa, Mohamed H. [Civil Engineering Department, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia)

    2012-03-15

    In this research, the bioremediation of dispersed crude oil, based on the amount of nitrogen and phosphorus supplementation in the closed system, was optimized by the application of response surface methodology and central composite design. Correlation analysis of the mathematical-regression model demonstrated that a quadratic polynomial model could be used to optimize the hydrocarbon bioremediation (R² = 0.9256). Statistical significance was checked by analysis of variance and residual analysis. Natural attenuation removed 22.1% of the crude oil in 28 days. The highest removal under un-optimized conditions, 68.1%, was observed using nitrogen of 20.00 mg/L and phosphorus of 2.00 mg/L in 28 days, while the optimization process exhibited a crude oil removal of 69.5% with nitrogen of 16.05 mg/L and phosphorus of 1.34 mg/L in 27 days; optimization can therefore improve biodegradation in a shorter time with less nutrient consumption. (Copyright © 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  15. Full-motion video analysis for improved gender classification

    Science.gov (United States)

    Flora, Jeffrey B.; Lochtefeld, Darrell F.; Iftekharuddin, Khan M.

    2014-06-01

    The ability of computer systems to perform gender classification using the dynamic motion of the human subject has important applications in medicine, human factors, and human-computer interface systems. Previous works in motion analysis have used data from sensors (including gyroscopes, accelerometers, and force plates), radar signatures, and video. However, full-motion video and motion-capture range data provide a dataset of higher temporal and spatial resolution for the analysis of dynamic motion. Works using motion capture data have been limited by small datasets in a controlled environment. In this paper, we apply machine learning techniques to a new dataset that has a larger number of subjects. Additionally, these subjects move unrestricted through a capture volume, representing a more realistic, less controlled environment. We conclude that existing linear classification methods are insufficient for gender classification on the larger dataset captured in a relatively uncontrolled environment. A method based on a nonlinear support vector machine classifier is proposed to obtain gender classification for the larger dataset. In experimental testing with a dataset consisting of 98 trials (49 subjects, 2 trials per subject), classification rates using leave-one-out cross-validation are improved from 73% using linear discriminant analysis to 88% using the nonlinear support vector machine classifier.
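
    The leave-one-out protocol behind those classification rates can be sketched generically. The nearest-centroid classifier below is only a stand-in for the paper's LDA and SVM classifiers, and the toy data are invented:

    ```python
    def loocv_accuracy(X, y, fit, predict):
        """Leave-one-out cross-validation: train on all-but-one sample,
        test on the held-out one, and repeat for every sample."""
        correct = 0
        for i in range(len(X)):
            model = fit(X[:i] + X[i + 1:], y[:i] + y[i + 1:])
            correct += predict(model, X[i]) == y[i]
        return correct / len(X)

    def fit_nearest_centroid(X, y):
        """Per-class mean of the feature vectors."""
        groups = {}
        for xi, yi in zip(X, y):
            groups.setdefault(yi, []).append(xi)
        return {c: [sum(col) / len(v) for col in zip(*v)] for c, v in groups.items()}

    def predict_nearest_centroid(centroids, x):
        def d2(a, b):
            return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return min(centroids, key=lambda c: d2(centroids[c], x))
    ```

    With 98 trials, this yields 98 train/test splits, exactly one per held-out trial; any classifier with the same fit/predict interface can be plugged in.
    
    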

  16. TENDENCY OF IMPROVEMENT ANALYSIS OF VENTURE ACTIVITY FOR MANAGEMENT DECISIONS

    Directory of Open Access Journals (Sweden)

    G.Yu. Iakovetс

    2015-03-01

    Full Text Available This article addresses the current trends and prospects of venture financing of new innovative enterprises, one of the most effective alternative financing sources for an entity, albeit one with a high degree of risk. Venture financing differs from other sources of business financing: income from venture capital investments can greatly exceed the volume of investment, but the associated risks are significant, which makes it necessary to build an effective system of venture capital investment. In the course of the study, problems in the analysis and minimization of risks in venture financing of innovative enterprises were also revealed. Defining the characteristics of analysis and risk assessment of venture financing helps to find ways to minimize, systematize, avoid, and prevent risks in the deployment of venture capital. The study also identified the major areas of improvement in the analysis of venture activity for management decisions.

  17. Improving knowledge management systems with latent semantic analysis

    International Nuclear Information System (INIS)

    Sebok, A.; Plott, C.; LaVoie, N.

    2006-01-01

    Latent Semantic Analysis (LSA) offers a technique for improving lessons learned and knowledge management systems. These systems are expected to become more widely used in the nuclear industry, as experienced personnel leave and are replaced by younger, less-experienced workers. LSA is a machine learning technology that allows searching of text based on meaning rather than predefined keywords or categories. Users can enter and retrieve data using their own words, rather than relying on constrained language lists or navigating an artificially structured database. LSA-based tools can greatly enhance the usability and usefulness of knowledge management systems and thus provide a valuable tool to assist nuclear industry personnel in gathering and transferring worker expertise. (authors)
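
    At its core, LSA embeds documents in a low-dimensional latent space via a truncated singular value decomposition of the term-document matrix, so queries can match lessons-learned entries on meaning rather than exact keywords. A minimal sketch (illustrative only, not any particular product's implementation):

    ```python
    import numpy as np

    def lsa_embed(term_doc, k):
        """Embed each document as a k-dimensional vector: truncated SVD
        keeps only the k strongest term co-occurrence patterns, which is
        what lets semantically related texts match without shared words."""
        U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
        return (np.diag(s[:k]) @ Vt[:k]).T  # one k-vector per document

    def cosine(a, b):
        return float(a @ b) / float(np.linalg.norm(a) * np.linalg.norm(b))
    ```

    A free-text query is folded into the same space and compared to every stored document by cosine similarity, which replaces keyword lists and fixed category navigation.
    
    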

  18. Photovoltaic module reliability improvement through application testing and failure analysis

    Science.gov (United States)

    Dumas, L. N.; Shumka, A.

    1982-01-01

    During the first four years of the U.S. Department of Energy (DOE) National Photovoltaic Program, the Jet Propulsion Laboratory Low-Cost Solar Array (LSA) Project purchased about 400 kW of photovoltaic modules for test and experiments. In order to identify, report, and analyze test and operational problems with the Block Procurement modules, a problem/failure reporting and analysis system was implemented by the LSA Project with the main purpose of providing manufacturers with feedback from test and field experience needed for the improvement of product performance and reliability. A description of the more significant types of failures is presented, taking into account interconnects, cracked cells, dielectric breakdown, delamination, and corrosion. Current design practices and reliability evaluations are also discussed. The conducted evaluation indicates that current module designs incorporate damage-resistant and fault-tolerant features which address field failure mechanisms observed to date.

  19. An improved method for the analysis of alpha spectra

    International Nuclear Information System (INIS)

    Equillor, Hugo E.

    2004-01-01

    In this work we describe a methodology, developed in recent years, for the analysis of alpha emitter spectra obtained with ion-implanted detectors, that tends to solve some of the problems shown by this type of spectra. This is an improved methodology with respect to that described in a previous publication. The method is based on the application of a mathematical function that allows the tail of an alpha peak to be modeled, in order to evaluate the part of the peak that is not seen in cases of partial superposition with another peak. A calculation program that works in a semiautomatic way, with the possibility of interactive intervention by the analyst, has been developed simultaneously and is described in detail. (author)

  20. Analysis and improvement measures of flight delay in China

    Science.gov (United States)

    Zang, Yuhang

    2017-03-01

    Firstly, this paper establishes a principal component regression model to analyze the data quantitatively, using principal component analysis to extract three principal component factors of flight delays. The least squares method is then used to analyze these factors, and the regression equation is obtained by substitution; the analysis finds that the main cause of flight delays is the airlines, followed by weather and traffic. Aiming at these problems, this paper improves the controllable aspect, traffic flow control. For the runway terminal area, an adaptive genetic queuing model is established. Based on Beijing Capital International Airport, an optimization method is established for landing fifteen planes simultaneously on three runways; comparing the results with those of the existing FCFS (first-come, first-served) algorithm demonstrates the superiority of the model.
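
    The two-stage estimator described here, principal components followed by least squares on the component scores, can be sketched as follows. The function names and the toy data are illustrative, not the paper's:

    ```python
    import numpy as np

    def pcr_fit(X, y, k):
        """Principal component regression: project the centered predictors
        onto the top-k principal directions, then fit ordinary least squares
        on the component scores."""
        mu = X.mean(axis=0)
        Xc = X - mu
        # principal directions = right singular vectors of the centered data
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        W = Vt[:k].T                    # loadings of the top-k components
        Z = Xc @ W                      # component scores
        beta, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
        return mu, y.mean(), W, beta

    def pcr_predict(model, X):
        mu, ybar, W, beta = model
        return (X - mu) @ W @ beta + ybar
    ```

    With k equal to the number of predictors this reduces to ordinary least squares; choosing a smaller k (three components in the paper's case) discards the low-variance directions before the regression step.
    
    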

  1. Improving assessment of personality disorder traits through social network analysis.

    Science.gov (United States)

    Clifton, Allan; Turkheimer, Eric; Oltmanns, Thomas F

    2007-10-01

    When assessing personality disorder traits, not all judges make equally valid judgments of all targets. The present study uses social network analysis to investigate factors associated with reliability and validity in peer assessment. Participants were groups of military recruits (N=809) who acted as both targets and judges in a round-robin design. Participants completed self- and informant versions of the Multisource Assessment of Personality Pathology. Social network matrices were constructed based on reported acquaintance, and cohesive subgroups were identified. Judges who shared a mutual subgroup were more reliable and had higher self-peer agreement than those who did not. Partitioning networks into two subgroups achieved more consistent improvements than multiple subgroups. We discuss implications for multiple informant assessments.

  2. Plant improvements through the use of benchmarking analysis

    International Nuclear Information System (INIS)

    Messmer, J.R.

    1993-01-01

    As utilities approach the turn of the century, customer and shareholder satisfaction is threatened by rising costs. Environmental compliance expenditures, coupled with low load growth and aging plant assets are forcing utilities to operate existing resources in a more efficient and productive manner. PSI Energy set out in the spring of 1992 on a benchmarking mission to compare four major coal fired plants against others of similar size and makeup, with the goal of finding the best operations in the country. Following extensive analysis of the 'Best in Class' operation, detailed goals and objectives were established for each plant in seven critical areas. Three critical processes requiring rework were identified and required an integrated effort from all plants. The Plant Improvement process has already resulted in higher operation productivity, increased emphasis on planning, and lower costs due to effective material management. While every company seeks improvement, goals are often set in an ambiguous manner. Benchmarking aids in setting realistic goals based on others' actual accomplishments. This paper describes how the utility's short term goals will move them toward being a lower cost producer

  3. Generalization in the XCSF classifier system: analysis, improvement, and extension.

    Science.gov (United States)

    Lanzi, Pier Luca; Loiacono, Daniele; Wilson, Stewart W; Goldberg, David E

    2007-01-01

    We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the spread of the eigenvalues of the autocorrelation matrix associated with each classifier is large. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations, instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase the generalization capabilities of XCSF: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that while all three approaches significantly improve XCSF, least squares approaches appear to be best performing and most robust. Finally we show how XCSF can be extended to include polynomial approximations.
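
    The contrast between the Widrow-Hoff update and its recursive least squares replacement can be sketched with generic textbook formulations (this is not the exact XCSF code, and the step size and initialization constants are illustrative):

    ```python
    import numpy as np

    def widrow_hoff(w, x, y, eta=0.2):
        """Widrow-Hoff (LMS) update used by standard XCSF: one gradient
        step on the squared prediction error. Convergence slows when the
        input autocorrelation matrix has a large eigenvalue spread."""
        return w + eta * (y - w @ x) * x

    class RLS:
        """Recursive least squares: tracks the inverse input correlation
        matrix, so convergence does not depend on the eigenvalue spread."""
        def __init__(self, n, delta=100.0):
            self.w = np.zeros(n)
            self.P = delta * np.eye(n)       # inverse correlation estimate
        def update(self, x, y):
            Px = self.P @ x
            g = Px / (1.0 + x @ Px)          # gain vector
            self.w = self.w + g * (y - self.w @ x)
            self.P = self.P - np.outer(g, Px)
            return self.w
    ```

    On a linear target the RLS weights settle after roughly one pass over distinct inputs, which is why it lets the weights catch up before the accuracy pressure acts.
    
    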

  4. An efficiency improvement in warehouse operation using simulation analysis

    Science.gov (United States)

    Samattapapong, N.

    2017-11-01

    In general, industry requires an efficient system for warehouse operation. There are many important factors that must be considered when designing an efficient warehouse system. The most important is an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. Given all these factors, researchers are interested in studying work systems and warehouse distribution. We start by collecting the important data for storage, such as information on products, on size and location, on data collection and on production, and use all this information to build a simulation model in the Flexsim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, many scenarios to alleviate that problem were generated and tested through simulation analysis. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the ability to transport products increased from 10.2% to 50.9%. Thus, this can be considered the best of the tested methods for increasing efficiency in the warehouse operation.
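
    The paper's model is built in Flexsim®, but the bottleneck logic it tests can be illustrated with a minimal single-server queue in plain Python. The arrival and service rates below are hypothetical, not the warehouse's measured values:

    ```python
    import random

    def average_wait(arrival_rate, service_rate, n_jobs, seed=1):
        """Single-server FIFO queue (a stand-in for the conveyor) with
        exponential interarrival and service times; returns the mean time
        jobs spend waiting before service begins."""
        rng = random.Random(seed)
        t_arrival = server_free = total_wait = 0.0
        for _ in range(n_jobs):
            t_arrival += rng.expovariate(arrival_rate)
            start = max(t_arrival, server_free)   # wait while the server is busy
            total_wait += start - t_arrival
            server_free = start + rng.expovariate(service_rate)
        return total_wait / n_jobs
    ```

    Raising the bottleneck's service rate cuts the average queueing time sharply, which mirrors the improvement the scenarios achieved for the conveyor.
    
    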

  5. Improvements in biamperometric method for remote analysis of uranium

    International Nuclear Information System (INIS)

    Palamalai, A.; Thankachan, T.S.; Balasubramanian, G.R.

    1979-01-01

    One of the titrimetric methods most suitable for remote operations with Master Slave Manipulators inside hot cells is the biamperometric method. The biamperometric method for the analysis of uranium reported in the literature is found to give rise to a significant bias, especially with low aliquots of uranium and the waste volume is also considerable which is not desirable from the point of view of radioactive waste disposal. In the present method, the bias as well as waste volume are reduced. Also addition of vanadyl sulphate is found necessary to provide a sharp end point in the titration curve. The role of vanadyl sulphate in improving the titration method has been investigated by spectrophotometry and electrometry. A new mechanism for the role of vanadyl sulphate which is in conformity with the observations made in coulometric titration of uranium, is proposed. Interference from deliberate additions of high concentrations of stable species of fission product elements is found negligible. Hence this method is considered highly suitable for remote analysis of uranium in intensely radioactive reprocessing solutions for control purposes, provided radioactivity does not pose new problems. (auth.)

  6. [Improvement of 2-mercaptoimidazoline analysis in rubber products containing chlorine].

    Science.gov (United States)

    Kaneko, Reiko; Haneishi, Nahoko; Kawamura, Yoko

    2012-01-01

    An improved analysis method for 2-mercaptoimidazoline in rubber products containing chlorine was developed. In the official method, 2-mercaptoimidazoline (20 µg/mL) is detected by TLC with two developing solvents, but this method is not quantitative. Instead, we employed HPLC using water-methanol (9 : 1) as the mobile phase. This procedure decreased interfering peaks, and the quantitation limit was 2 µg/mL of standard solution. 2-Mercaptoimidazoline was confirmed by GC/MS (5 µg/mL) and LC/MS (1 µg/mL) in the scan mode. For preparation of the test solution, a soaking extraction method, in which 20 mL of methanol was added to the sample and allowed to stand overnight at about 40°C, was used. This gave values similar to the Soxhlet extraction method (official method) and was more convenient. The results indicate that our procedure is suitable for analysis of 2-mercaptoimidazoline. When 2-mercaptoimidazoline is detected, it is confirmed by either GC/MS or LC/MS.

  7. Sensitivity analysis for improving nanomechanical photonic transducers biosensors

    International Nuclear Information System (INIS)

    Fariña, D; Álvarez, M; Márquez, S; Lechuga, L M; Dominguez, C

    2015-01-01

    The achievement of high sensitivity and highly integrated transducers is one of the main challenges in the development of high-throughput biosensors. The aim of this study is to improve the final sensitivity of an opto-mechanical device to be used as a reliable biosensor. We report the analysis of the mechanical and optical properties of optical waveguide microcantilever transducers, and their dependence on device design and dimensions. The selected layout (geometry), based on two butt-coupled misaligned waveguides, displays better sensitivities than an aligned one. With this configuration, we find that an optimal microcantilever thickness range between 150 nm and 400 nm would both increase microcantilever bending during the biorecognition process and raise the optical sensitivity to 4.8 × 10⁻² nm⁻¹, an order of magnitude higher than that of other similar opto-mechanical devices. Moreover, the analysis shows that single-mode behaviour of the propagating radiation is required to avoid modal interference that could lead to misinterpretation of the readout signal. (paper)

  8. Effects of the addition of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) on mechanical properties of luting and lining glass ionomer cement

    Science.gov (United States)

    Heravi, Farzin; Bagheri, Hossein; Rangrazi, Abdolrasoul; Mojtaba Zebarjad, Seyed

    2016-07-01

    Recently, the addition of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) into glass ionomer cements (GICs) has attracted interest due to its remineralization of teeth and its antibacterial effects. However, it must be verified that the incorporation of CPP-ACP does not significantly impair the mechanical properties of the cement. The purpose of this study was to evaluate the effects of the addition of CPP-ACP on the mechanical properties of a luting and lining GIC. The first step was to synthesize the CPP-ACP. The CPP-ACP was then added into a luting and lining GIC at concentrations of 1%, 1.56% and 2%. GIC without CPP-ACP was used as a control group. The results revealed that the incorporation of CPP-ACP up to 1.56% (w/w) increased the flexural strength (29%), diametral tensile strength (36%) and microhardness (18%), followed by a reduction in these mechanical properties at 2% (w/w) CPP-ACP. The wear rate was significantly decreased (23%) at the 1.56% (w/w) concentration of CPP-ACP and increased at 2% (w/w). Accordingly, the addition of 1.56% (w/w) CPP-ACP into luting and lining GIC had no adverse effect on its mechanical properties and could be used in clinical practice.

  9. [Effect of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) treatment on the shear bond strength of orthodontic brackets after tooth bleaching].

    Science.gov (United States)

    Lu, Jing; Ding, Xiao-jun; Yu, Xiao-ping; Gong, Yi-ming

    2015-10-01

    To evaluate the effect of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) treatment on the shear bond strength of orthodontic brackets after tooth bleaching. One hundred extracted human premolars were randomly divided and treated according to 5 groups (n=20): (1) no treatment; (2) 10% carbamide peroxide bleaching; (3) 38% hydrogen peroxide bleaching; (4) 10% carbamide peroxide bleaching and CPP-ACP paste; (5) 38% hydrogen peroxide bleaching and CPP-ACP paste. In all groups, the brackets were bonded using a conventional acid-etch and bond system (Transbond XT, 3M Unitek, Monrovia, Calif). The shear bond strength and adhesive remnant index (ARI) of the brackets were determined, and the data were analyzed by ANOVA and the Bonferroni test using the SPSS 13.0 software package. The use of 10% carbamide peroxide and 38% hydrogen peroxide bleaching significantly decreased the shear bond strength of orthodontic brackets compared with the untreated group (P<0.05). The ARI did not show any significant difference before and after CPP-ACP treatment. After tooth bleaching, CPP-ACP treatment had little influence on the shear bond strength of orthodontic brackets.

  10. Novel Tyrosine Phosphorylation Sites in Rat Skeletal Muscle Revealed by Phosphopeptide Enrichment and HPLC-ESI-MS/MS

    Science.gov (United States)

    Zhang, Xiangmin; Højlund, Kurt; Luo, Moulun; Meyer, Christian; Thangiah, Geetha; Yi, Zhengping

    2012-01-01

    Tyrosine phosphorylation plays a fundamental role in many cellular processes including differentiation, growth and insulin signaling. In insulin resistant muscle, aberrant tyrosine phosphorylation of several proteins has been detected. However, due to the low abundance of tyrosine phosphorylation, few tyrosine phosphorylation sites have been identified in mammalian skeletal muscle to date. Here, we used immunoprecipitation of phosphotyrosine peptides prior to HPLC-ESI-MS/MS analysis to improve the discovery of tyrosine phosphorylation in relatively small skeletal muscle biopsies from rats. This resulted in the identification of 87 distinctly localized tyrosine phosphorylation sites in 46 muscle proteins, 31 of which appear to be novel. The tyrosine phosphorylated proteins included major enzymes in the glycolytic pathway and glycogen metabolism, sarcomeric proteins, and proteins involved in Ca2+ homeostasis and phosphocreatine resynthesis. Among proteins regulated by insulin, we found tyrosine phosphorylation sites in glycogen synthase and two of its inhibitors, GSK-3α and DYRK1A. Moreover, tyrosine phosphorylation sites were identified in several MAP kinases and a protein tyrosine phosphatase, SHPTP2. These results provide the largest catalogue of mammalian skeletal muscle tyrosine phosphorylation sites to date and provide novel targets for the investigation of human skeletal muscle phosphoproteins in various disease states. PMID:22609512

  11. Improvement of top shield analysis technology for CANDU 6 reactor

    International Nuclear Information System (INIS)

    Kim, Kyo Yoon; Jin, Young Kwon; Lee, Sung Hee; Moon, Bok Ja; Kim, Yong Il

    1996-07-01

    For Wolsung NPP unit 1, radiation shielding analysis was performed using neutron diffusion codes, the one-dimensional discrete ordinates code ANISN, and analytical methods. For Wolsung NPP units 2, 3 and 4, the two-dimensional discrete ordinates code DOT substituted for the neutron diffusion codes; in other words, the analysis methods and computer codes used for radiation shielding of the CANDU 6 type reactor have been improved. Recently the Monte Carlo code MCNP has been widely utilized in the field of radiation physics and other radiation-related areas because it can describe an object in detail through three-dimensional modelling and can adopt continuous-energy cross-section libraries. The Monte Carlo method has been reported to be competitive with the discrete ordinates method in the field of radiation shielding, and to be superior for complex-geometry problems. However, the Monte Carlo method had not been used for radiation streaming calculation in the shielding design of CANDU type reactors. Neutron and gamma radiation are expected to stream from the calandria through the penetrations to the reactivity mechanism deck (R/M deck), because many reactivity control units established on the R/M deck extend from the deck to the calandria within penetrations provided by guide tube extensions. More precise estimation of radiation streaming is required because the R/M deck is classified as an accessible area that atomic workers may enter when necessary. Therefore neutron and gamma dose rates on the R/M deck in the top shield system of the CANDU 6 reactor were estimated using the MCNP code. 9 tabs., 17 figs., 21 refs. (Author)

  12. Receiver operating characteristic analysis improves diagnosis by radionuclide ventriculography

    International Nuclear Information System (INIS)

    Dickinson, C.Z.; Forman, M.B.; Vaugh, W.K.; Sandler, M.P.; Kronenberg, M.W.

    1985-01-01

    Receiver operating characteristic (ROC) analysis evaluates continuous variables to define diagnostic criteria for the optimal sensitivity (SENS) and specificity (SPEC) of a test. The authors studied exercise-induced chest pain (CP), ST changes on electrocardiography (ECG) and rest-exercise gated radionuclide ventriculography (RVG) using ROC analysis to clarify the optimal criteria for detecting myocardial ischemia due to coronary atherosclerosis (CAD). The data of 95 consecutive patients studied with coronary angiography, rest-exercise RVG and ECG were reviewed; 77 patients had "significant" CAD (≥50% lesions). Exercise-induced CP, ECG abnormalities (ST-T shifts) and RVG abnormalities (change in ejection fraction, 2-view regional wall motion change and relative end-systolic volume) were evaluated to define the optimal SENS/SPEC of each and of the combined data. ROC curves were constructed by multiple logistic regression (MLR). By MLR, RVG alone was superior to ECG and CP. The combination of all three produced the best ROC curve for the entire group and for clinical subsets based on the number of diseased vessels and the presence or absence of prior myocardial infarction. When CP, ECG and RVG were combined, the optimal SENS/SPEC for detection of single vessel disease was 88/86; for 3-vessel disease it was 93/95. Thus, the application of RVG for the diagnosis of myocardial ischemia is improved by the inclusion of ECG and CP data through a multiple logistic regression model. ROC analysis allows clinical application of multiple data for diagnosing CAD at the desired SENS/SPEC rather than by arbitrary single-standard criteria

  13. Improvement of powertrain efficiency through energy breakdown analysis

    International Nuclear Information System (INIS)

    Damiani, Lorenzo; Repetto, Matteo; Prato, Alessandro Pini

    2014-01-01

    Highlights: • Energy breakdown analysis for the vehicular powertrain. • Model for road vehicle simulation in different missions. • Implemented powertrain management strategies: intelligent gearbox, Stop and Start, free wheel. • Innovative hybrid powertrain aimed at minimizing engine thermodynamic cycles. • Evaluation of the fuel savings associated with each management strategy. - Abstract: A vehicular powertrain can be thought of as an energy conversion chain, each component being characterized by its efficiency. Significant global efficiency improvements can be achieved once the system energy breakdown is identified and the losses connected to each powertrain component are isolated; it is then possible to carry out the most appropriate interventions. This paper presents a simulation study of a diesel-fuelled commercial vehicle powertrain based on this point of view. The work aims at identifying the energy flows involved in the system during different missions, proposing an intelligent combination of technical solutions that minimizes fuel consumption. Through a validated Matlab–Simulink model able to produce the powertrain energy breakdown, simulations are carried out to evaluate the fuel saving associated with a series of powertrain management logics that minimize engine losses, recover reverse power in deceleration and braking, and eliminate useless engine cycles. Tests were performed for different real missions (urban, extra-urban and highway). The results show a 23% reduction in fuel consumption (average over urban, extra-urban and highway missions) compared to the traditional powertrain. Clearly, such a result positively affects CO2 emissions

  14. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species of conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  15. Improvement of retinal blood vessel detection using morphological component analysis.

    Science.gov (United States)

    Imani, Elaheh; Javidi, Malihe; Pourreza, Hamid-Reza

    2015-03-01

    Detection and quantitative measurement of variations in the retinal blood vessels can help diagnose several diseases, including diabetic retinopathy. Intrinsic characteristics of abnormal retinal images make blood vessel detection difficult. The major problem with traditional vessel segmentation algorithms is producing false positive vessels in the presence of diabetic retinopathy lesions. To overcome this problem, a novel scheme for extracting retinal blood vessels based on the morphological component analysis (MCA) algorithm is presented in this paper. MCA was developed based on sparse representation of signals. This algorithm assumes that each signal is a linear combination of several morphologically distinct components. In the proposed method, the MCA algorithm with appropriate transforms is adopted to separate vessels and lesions from each other. Afterwards, the Morlet wavelet transform is applied to enhance the retinal vessels. The final vessel map is obtained by adaptive thresholding. The performance of the proposed method is measured on the publicly available DRIVE and STARE datasets and compared with several state-of-the-art methods. Accuracies of 0.9523 and 0.9590 have been achieved on the DRIVE and STARE datasets respectively, which not only exceed those of most methods but are also superior to the second human observer's performance. The results show that the proposed method can achieve improved detection in abnormal retinal images and decrease false positive vessels in pathological regions compared to other methods. The robustness of the method in the presence of noise is also shown via experimental results. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Analysis and improvement of the quantum image matching

    Science.gov (United States)

    Dang, Yijie; Jiang, Nan; Hu, Hao; Zhang, Wenyin

    2017-11-01

    We investigate the quantum image matching algorithm proposed by Jiang et al. (Quantum Inf Process 15(9):3543-3572, 2016). Although the complexity of this algorithm is much better than that of the classical exhaustive algorithm, there may be an error in it: after matching the area between two images, only the pixel at the upper left corner of the matched area plays a part in the following steps. That is to say, the paper matched only one pixel, instead of an area. If more than one pixel in the big image is the same as the one at the upper left corner of the small image, the algorithm will randomly measure one of them, which causes the error. In this paper, an improved version is presented which takes full advantage of the whole matched area to locate a small image in a big image. The theoretical analysis indicates that the network complexity is higher than that of the previous algorithm, but it is still far lower than that of the classical algorithm. Hence, this algorithm is still efficient.
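    The flaw is easiest to see against the classical exhaustive matcher that the quantum algorithm's complexity is compared with: a correct matcher must verify every pixel of the candidate area at each offset, and several offsets may legitimately match. A minimal sketch with toy binary images (illustrative, not from the paper):

```python
def match_area(big, small):
    """Classical exhaustive matching: slide the small image over the big
    one and report every offset where *all* pixels agree -- not just the
    pixel at the upper-left corner."""
    H, W = len(big), len(big[0])
    h, w = len(small), len(small[0])
    hits = []
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            if all(big[r + i][c + j] == small[i][j]
                   for i in range(h) for j in range(w)):
                hits.append((r, c))
    return hits

big = [[0, 1, 0, 1],
       [1, 0, 1, 0],
       [0, 1, 0, 1]]
small = [[1, 0],
         [0, 1]]
positions = match_area(big, small)   # several full-area matches exist
```

A corner-only check, by contrast, would fire at every offset where the single upper-left pixel happens to agree — exactly the ambiguity the improved algorithm resolves.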

  17. Improved sampling and analysis of images in corneal confocal microscopy.

    Science.gov (United States)

    Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R

    2017-10-01

    Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is in constant progress, there are methodological limitations in terms of image sampling and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method for the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. Twenty-three idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined both manually and automatically. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was no statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling method, indicating the importance of clear guidelines for the image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. This study underlines the importance of improving the analysis of the

  18. A Multivariate Analysis of Factors Affecting Adoption of Improved ...

    African Journals Online (AJOL)

    may compete for scarce household resources such as draft power, labor and chemical ..... The probability distribution of the joint adoption probabilities of improved .... would be that improved varieties being divisible and scale neutral may.

  19. Ethical analysis to improve decision-making on health technologies.

    Science.gov (United States)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian; Lühmann, Dagmar; Mäkelä, Marjukka; Velasco-Garrido, Marcial; Autti-Rämö, Ilona

    2008-08-01

    Health technology assessment (HTA) is the multidisciplinary study of the implications of the development, diffusion and use of health technologies. It supports health-policy decisions by providing a joint knowledge base for decision-makers. To increase its policy relevance, HTA tries to extend beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs, and their implementation may also have significant impact on people other than the patient. These are essential considerations for health policy. The ethics model is structured around key ethical questions rather than philosophical theories, to be applicable to different cultures and usable by non-philosophers. Integrating ethical considerations into HTA can improve the relevance of technology assessments for health care and health policy in both developed and developing countries.

  20. Comparative analysis of metagenomes of Italian top soil improvers

    International Nuclear Information System (INIS)

    Gigliucci, Federica; Brambilla, Gianfranco; Tozzoli, Rosangela; Michelacci, Valeria; Morabito, Stefano

    2017-01-01

    Biosolids originating from Municipal Waste Water Treatment Plants are proposed as top soil improvers (TSI) for their beneficial input of organic carbon on agricultural lands. Their use to amend soil is controversial, as it may lead to the presence of emerging hazards of anthropogenic or animal origin in an environment devoted to food production. In this study, we used shotgun metagenomics sequencing as a tool to characterize the hazards related to the TSIs. The samples showed the presence of many virulence genes associated with different diarrheagenic E. coli pathotypes as well as of different antimicrobial resistance-associated genes. Genes conferring resistance to fluoroquinolones were the most relevant class of antimicrobial resistance genes observed in all the samples tested. To a lesser extent, traits associated with resistance to methicillin in staphylococci and genes conferring resistance to streptothricin, fosfomycin and vancomycin were also identified. The most represented metal resistance genes were cobalt-zinc-cadmium related, accounting for 15–50% of the sequence reads in the different metagenomes out of the total number of those mapping on the class of resistance-to-compounds determinants. Moreover, the taxonomic analysis comparing compost-based samples and biosolids derived from municipal sewage-sludge treatments divided the samples into separate populations, based on microbiota composition. The results confirm that metagenomics can efficiently detect genomic traits associated with pathogens and antimicrobial resistance in complex matrices, and this approach can be used for the traceability of TSI samples using microorganism profiles as indicators of their origin. - Highlights: • Sludge- and green-based biosolids analysed by metagenomics. • Biosolids may introduce microbial hazards in the food chain. • Metagenomics enables tracking biosolids’ sources.

  1. Comparative analysis of metagenomes of Italian top soil improvers

    Energy Technology Data Exchange (ETDEWEB)

    Gigliucci, Federica, E-mail: Federica.gigliucci@libero.it [Department of Veterinary Public Health and Food Safety, Istituto Superiore di Sanità, Viale Regina Elena, 299 00161 Rome (Italy); Department of Sciences, University Roma,Tre, Viale Marconi, 446, 00146 Rome (Italy); Brambilla, Gianfranco; Tozzoli, Rosangela; Michelacci, Valeria; Morabito, Stefano [Department of Veterinary Public Health and Food Safety, Istituto Superiore di Sanità, Viale Regina Elena, 299 00161 Rome (Italy)

    2017-05-15

    Biosolids originating from Municipal Waste Water Treatment Plants are proposed as top soil improvers (TSI) for their beneficial input of organic carbon on agricultural lands. Their use to amend soil is controversial, as it may lead to the presence of emerging hazards of anthropogenic or animal origin in an environment devoted to food production. In this study, we used shotgun metagenomics sequencing as a tool to characterize the hazards related to the TSIs. The samples showed the presence of many virulence genes associated with different diarrheagenic E. coli pathotypes as well as of different antimicrobial resistance-associated genes. Genes conferring resistance to fluoroquinolones were the most relevant class of antimicrobial resistance genes observed in all the samples tested. To a lesser extent, traits associated with resistance to methicillin in staphylococci and genes conferring resistance to streptothricin, fosfomycin and vancomycin were also identified. The most represented metal resistance genes were cobalt-zinc-cadmium related, accounting for 15–50% of the sequence reads in the different metagenomes out of the total number of those mapping on the class of resistance-to-compounds determinants. Moreover, the taxonomic analysis comparing compost-based samples and biosolids derived from municipal sewage-sludge treatments divided the samples into separate populations, based on microbiota composition. The results confirm that metagenomics can efficiently detect genomic traits associated with pathogens and antimicrobial resistance in complex matrices, and this approach can be used for the traceability of TSI samples using microorganism profiles as indicators of their origin. - Highlights: • Sludge- and green-based biosolids analysed by metagenomics. • Biosolids may introduce microbial hazards in the food chain. • Metagenomics enables tracking biosolids’ sources.

  2. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it (1) does not require any licensing fees and (2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
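    Contact-window prediction of the kind ITACA performs reduces to root finding on an elevation-angle function: a pass begins and ends where the elevation crosses the visibility threshold. The sketch below uses plain bisection as a stand-in for Brent's method (which adds secant and inverse-quadratic steps for faster convergence); the elevation profile and all numbers are illustrative, not SCaN asset parameters.

```python
import math

def bisect_root(f, a, b, tol=1e-9):
    """Bracketing root finder -- a simple stand-in for Brent's method,
    which converges faster but needs the same sign-change bracket."""
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("root must be bracketed")
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:     # root lies in [a, m]
            b = m
        else:                  # root lies in [m, b]
            a, fa = m, f(m)
    return 0.5 * (a + b)

# Toy elevation-angle profile (degrees) over a 90-minute orbit; the
# profile and the 0-degree visibility threshold are hypothetical.
def elevation(t_min):
    return 30.0 * math.sin(2 * math.pi * t_min / 90.0) - 10.0

rise = bisect_root(elevation, 0.0, 22.5)   # elevation crosses 0 going up
set_ = bisect_root(elevation, 22.5, 45.0)  # and crosses back down
contact_window = set_ - rise               # minutes of visibility
```

Brent's method typically needs far fewer function evaluations than bisection for the same tolerance, which matters when each evaluation involves full orbit propagation.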

  3. The Effect of Casein Phosphopeptide-Amorf Calcium Phosphate and Acidulated Phosphate Fluoride Gel on Dental Erosion in Primary Teeth: An in Vitro Study.

    Science.gov (United States)

    Maden, Eda Arat; Acar, Özge; Altun, Ceyhan; Polat, Günseli Güven

    This study aimed to investigate the effect of acidulated phosphate fluoride (APF) gel and casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) on the dental erosion produced by a carbonated soft drink in primary teeth. The study evaluated, in an in vitro model, the effect of APF gel and CPP-ACP on dental enamel previously subjected to an erosive challenge with a carbonated soft drink. Sixty sound human primary molars were prepared by embedding the crown sections in acrylic resin blocks, leaving the enamel surfaces exposed. The surface roughness of the enamel was measured with profilometry at baseline. Specimens were randomly divided into three treatment groups (n=20): artificial saliva, CPP-ACP, and 1.23% APF gel. All specimens were then exposed to an erosive challenge of carbonated soft drink and artificial saliva for 20 cycles of 20 seconds each. Demineralization-remineralization cycles were repeated twice at eight-hour intervals and roughness values were measured. Enamel samples were treated with artificial saliva, CPP-ACP or 1.23% APF gel applied for 10 min after the erosive challenge. The arithmetic average roughness (Ra) readings were recorded after the remineralization agents were applied. The mean surface roughness in all groups increased significantly after the erosion process and decreased after remineralization treatment. After treatment, the mean surface roughness of the 1.23% APF gel group was significantly less than that of the other groups, and the mean surface roughness of the artificial saliva group was significantly more than that of the other groups. Under the conditions of this study, artificial saliva, CPP-ACP and 1.23% APF treatments were all able to reduce erosive enamel loss produced by carbonated soft drink in primary teeth; however, 1.23% APF gel showed the highest protective effect against erosive enamel loss.

  4. A multivariate analysis of factors affecting adoption of improved ...

    African Journals Online (AJOL)

    This paper analyzes the synergies/tradeoffs involved in the adoption of improved varieties of multiple crops in the mixed crop-livestock production systems of the highlands of Ethiopia A multivariate probit (MVP) model involving a system of four equations for the adoption decision of improved varieties of barley, potatoes, ...

  5. Improvement of gas chromatographic analysis for organic acids and ...

    African Journals Online (AJOL)

    Yomi

    2010-08-27

    Aug 27, 2010 ... and ethanol fermentation by using the anaerobic bacterium. Clostridium ... GC analysis. Standard solution for GC analysis consisted of acetic acid (Sigma-. Aldrich ... Microorganism and inoculum preparation. C. beijerinckii ...

  6. A novel joint analysis framework improves identification of differentially expressed genes in cross disease transcriptomic analysis

    Directory of Open Access Journals (Sweden)

    Wenyi Qin

    2018-02-01

    Abstract. Motivation: Detecting differentially expressed (DE) genes between a disease group and a normal control group is one of the most common analyses of genome-wide transcriptomic data. Since most studies have few samples, researchers have used meta-analysis to group different datasets for the same disease. Even then, in many cases the statistical power is still not enough. Given that many diseases share the same disease genes, it is desirable to design a statistical framework that can identify diseases' common and specific DE genes simultaneously to improve the identification power. Results: We developed a novel empirical-Bayes-based mixture model to identify DE genes in a specific study by leveraging the shared information across multiple disease expression data sets. The effectiveness of joint analysis was demonstrated through comprehensive simulation studies and two real data applications. The simulation results showed that our method consistently outperformed single-data-set analysis and two other meta-analysis methods in identification power. In real data analysis, our method demonstrated better identification power in detecting DE genes and prioritized more disease-related genes and pathways than single-data-set analysis. Over 150% more disease-related genes are identified by our method in application to Huntington's disease. We expect that our method will provide researchers a new way of utilizing available data sets from different diseases when the sample size of the focused disease is limited.
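    The power argument — shared signal across related datasets can push a gene past significance that no single dataset detects — can be illustrated with a deliberately crude stand-in for the paper's empirical-Bayes mixture model: combining per-dataset z-scores by Stouffer's method. The z-values below are hypothetical.

```python
import math

def stouffer(z_scores):
    """Stouffer's combined z: sum of per-dataset z-scores over sqrt(k).
    A crude proxy for borrowing strength across datasets."""
    return sum(z_scores) / math.sqrt(len(z_scores))

# Hypothetical evidence for one gene in four related disease datasets:
# none clears the usual z > 1.96 cutoff alone, but the joint signal does.
per_dataset = [1.5, 1.2, 1.8, 1.4]
combined = stouffer(per_dataset)   # 5.9 / 2 = 2.95
```

The paper's mixture model goes further than any flat combination like this: it also decides, per gene, whether the signal is shared across diseases or disease-specific.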

  7. Analysis and improvement of security of energy smart grids

    International Nuclear Information System (INIS)

    Halimi, Halim

    2014-01-01

    The Smart grid is the next generation power grid, which is a new self-healing, self-activating form of electricity network, and integrates power-flow control, increased quality of electricity, and energy reliability, energy efficiency and energy security using information and communication technologies. Communication networks play a critical role in smart grid, as the intelligence of smart grid is built based on information exchange across the power grid. Its two-way communication and electricity flow enable to monitor, predict and manage the energy usage. To upgrade an existing power grid into a smart grid, it requires an intelligent and secure communication infrastructure. Because of that, the main goal of this dissertation is to propose new architecture and implementation of algorithms for analysis and improvement of the security and reliability in smart grid. In power transmission segments of smart grid, wired communications are usually adopted to ensure robustness of the backbone power network. In contrast, for a power distribution grid, wireless communications provide many benefits such as low cost high speed links, easy setup of connections among different devices/appliances, and so on. Wireless communications are usually more vulnerable to security attacks than wired ones. Developing appropriate wireless communication architecture and its security measures is extremely important for a smart grid system. This research addresses physical layer security in a Wireless Smart Grid. Hence a defense Quorum- based algorithm is proposed to ensure physical security in wireless communication. The new security architecture for smart grid that supports privacy-preserving, data aggregation and access control is defined. This architecture consists of two parts. In the first part we propose to use an efficient and privacy-preserving aggregation scheme (EPPA), which aggregates real-time data of consumers by Local Gateway. During aggregation the privacy of consumers is

  8. Improved streaming analysis technique: spherical harmonics expansion of albedo data

    International Nuclear Information System (INIS)

    Albert, T.E.; Simmons, G.L.

    1979-01-01

An improved albedo scattering technique was implemented with a three-dimensional Monte Carlo transport code for use in analyzing radiation streaming problems. The improvement was based on a shifted spherical harmonics expansion of the doubly differential albedo data base. The result of the improvement was a factor of 3 to 10 reduction in data storage requirements and approximately a factor of 3 to 6 increase in computational speed. Comparisons of results obtained using the technique with measurements are shown for neutron streaming in one- and two-legged square concrete ducts
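The abstract does not give the expansion itself. As a hedged, one-dimensional sketch of the idea, the snippet below fits a low-order Legendre expansion (the polar ingredient of a spherical harmonics expansion) to a tabulated angular albedo profile, illustrating how a few coefficients can replace a dense table, which is where the storage reduction comes from. The albedo profile is made up for illustration.

```python
import numpy as np

# Hypothetical albedo data collapsed to one angular variable: tabulated
# reflected intensity vs. the cosine of the exit angle.
mu = np.linspace(-1, 1, 201)                 # direction cosines
albedo = 0.5 + 0.3 * mu + 0.15 * mu**2       # smooth angular profile (invented)

# Fit a low-order Legendre expansion: a handful of coefficients replaces
# the full table of 201 values.
coeffs = np.polynomial.legendre.legfit(mu, albedo, deg=4)
reconstructed = np.polynomial.legendre.legval(mu, coeffs)

print(coeffs.size, "coefficients vs", albedo.size, "tabulated values")
print("max reconstruction error:", np.max(np.abs(reconstructed - albedo)))
```

The real doubly differential data would use products of such expansions over incident and exit angles, but the storage argument is the same.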

  9. Improved Extreme Learning Machine based on the Sensitivity Analysis

    Science.gov (United States)

    Cui, Licheng; Zhai, Huawei; Wang, Benchao; Qu, Zengtang

    2018-03-01

Extreme learning machines (ELMs) and their improved variants are weak on some points, such as computational complexity, learning error and so on. After deep analysis, and referencing the importance of hidden nodes in SVMs, a novel method for analyzing node sensitivity is proposed that matches people's cognitive habits. Based on this, an improved ELM is proposed: it can remove hidden nodes before the learning-error target is met and can efficiently manage the number of hidden nodes, so as to improve its performance. Comparative tests show it performs better in learning time, accuracy and so on.
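As a rough sketch of the general idea, not the paper's exact sensitivity measure, the code below trains a basic ELM (random hidden layer, least-squares output weights), scores each hidden node by how much the training error grows when its contribution is removed, prunes the least sensitive half, and re-solves the output weights. All data and sizes are invented.

```python
import numpy as np

def elm_train(X, y, n_hidden, rng):
    """Basic ELM: random input weights and biases, sigmoid hidden layer,
    output weights solved by least squares (pseudoinverse)."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))       # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                 # output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]
W, b, beta = elm_train(X, y, n_hidden=60, rng=rng)

# Crude per-node sensitivity (a stand-in for the paper's measure): the rise
# in training error when that node's output weight is zeroed out.
base_err = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
sens = np.empty(beta.size)
for j in range(beta.size):
    beta_j = beta.copy()
    beta_j[j] = 0.0
    sens[j] = np.mean((elm_predict(X, W, b, beta_j) - y) ** 2) - base_err

# Prune the least sensitive half of the hidden nodes, then re-solve.
keep = np.argsort(sens)[beta.size // 2:]
H_kept = 1.0 / (1.0 + np.exp(-(X @ W[:, keep] + b[keep])))
beta_kept = np.linalg.pinv(H_kept) @ y
pruned_err = np.mean((H_kept @ beta_kept - y) ** 2)
print(f"MSE with 60 nodes: {base_err:.5f}, with 30 nodes: {pruned_err:.5f}")
```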

  10. Does ownership of improved dairy cow breeds improve child nutrition? A pathway analysis for Uganda.

    Directory of Open Access Journals (Sweden)

    Nassul S Kabunga

Full Text Available The promotion of livestock production is widely believed to support enhanced diet quality and child nutrition, but the empirical evidence for this causal linkage remains narrow and ambiguous. This study examines whether adoption of improved dairy cow breeds is linked to farm-level outcomes that translate into household-level benefits, including improved child nutrition outcomes, in Uganda. Using nationwide data from Uganda's National Panel Survey, propensity score matching is used to create an unbiased counterfactual, based on observed characteristics, to assess the net impacts of improved dairy cow adoption. All estimates were tested for robustness and sensitivity to variations in observable and unobservable confounders. Results based on the matched samples showed that households adopting improved dairy cows significantly increased milk yield, by over 200% on average. This resulted in higher milk sales and milk intakes, demonstrating the potential of this agricultural technology both to integrate households into modern value chains and to increase households' access to animal source foods. Use of improved dairy cows increased household food expenditures by about 16%. Although undernutrition was widely prevalent in the study sample and in matched households, the adoption of improved dairy cows was associated with lower child stunting in adopter households. In scale terms, results also showed that holding a larger farm tends to support adoption and also stimulates the household's ability to achieve gains from adoption, which can translate into enhanced nutrition.
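Propensity score matching, as used in this study, can be sketched on simulated data. The code below (all numbers invented, not the survey data) fits a logistic propensity model by gradient descent, matches each "adopter" to the non-adopter with the closest propensity score, and recovers a treatment effect that the naive group difference overstates because adoption is confounded with covariates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 2))                       # household covariates (simulated)
# Selection into adoption depends on the covariates (confounding):
p_adopt = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
t = rng.uniform(size=n) < p_adopt
# Outcome: a true adoption effect of +2, plus covariate effects and noise.
y = 2.0 * t + 1.5 * x[:, 0] + x[:, 1] + rng.normal(size=n)

# Propensity score via logistic regression fit by gradient descent.
Xd = np.column_stack([np.ones(n), x])
w = np.zeros(3)
for _ in range(5000):
    p = 1 / (1 + np.exp(-Xd @ w))
    w += 0.1 * Xd.T @ (t - p) / n
ps = 1 / (1 + np.exp(-Xd @ w))

# Match each adopter to the non-adopter with the closest propensity score.
treated, control = np.where(t)[0], np.where(~t)[0]
matches = control[np.abs(ps[control][None, :] - ps[treated][:, None]).argmin(axis=1)]
att = np.mean(y[treated] - y[matches])

naive = y[t].mean() - y[~t].mean()
print(f"naive difference: {naive:.2f}, matched ATT: {att:.2f} (truth: 2.0)")
```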

  11. Alternative Frameworks for Improving Government Organizational Performance: A Comparative Analysis

    National Research Council Canada - National Science Library

    Simon, Cary

    1997-01-01

    .... Six major frameworks emerging in the U.S. since 1980, applicable to the public sector, and designed to enhance organizational change toward improved performance are reviewed and analyzed: Total Quality; 'Excellence...

  12. Improved Methods for Pitch Synchronous Linear Prediction Analysis of Speech

    OpenAIRE

    劉, 麗清

    2015-01-01

Linear prediction (LP) analysis has been applied to speech systems over the last few decades. The LP technique is well suited to speech analysis due to its ability to model the speech production process approximately. Hence LP analysis has been widely used for speech enhancement, low-bit-rate speech coding in cellular telephony, speech recognition, characteristic parameter extraction (vocal tract resonance frequencies, and the fundamental frequency, called pitch) and so on. However, the performance of the co...

  13. Analysis of Alternatives (AoA) Process Improvement Study

    Science.gov (United States)

    2016-12-01

...analysis, cost analysis, sustainment considerations, early systems engineering analyses, threat projections, and market research. ... primarily the Equipping (EE), Sustaining (SS) and Training (TT) Program Evaluation Groups (PEGs) and Long-range Investment Requirements Analysis

  14. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On the one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes, such as an abrupt change in the yearly increment, and anatomical changes such as reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  15. The Application of Fishbone Diagram Analysis to Improve School Quality

    Science.gov (United States)

    Slameto

    2016-01-01

With the enactment of the National Education Standards (NES), the measurement of school quality became clear; the NES became a reference for school development programs to improve school quality. However, the existing form of these programs is still problematic, so a sound proposal needs to be prepared. In the real conditions, the school shows,…

  16. Does Competition Improve Public School Efficiency? A Spatial Analysis

    Science.gov (United States)

    Misra, Kaustav

    2010-01-01

    Proponents of educational reform often call for policies to increase competition between schools. It is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. In many parts of the country, public schools experience significant competition from private schools; however,…

  17. an improved structural model for seismic analysis of tall frames

    African Journals Online (AJOL)

    Dr Obe

ABSTRACT. This paper proposed and examined an improved structural model ... The equation of motion of the multi-storey building shown in fig. 2 can be ... The response of the nth mode at any time t of the MDOF system demands the solution of ...

  18. Using semantic analysis to improve speech recognition performance

    OpenAIRE

    Erdoğan, Hakan; Erdogan, Hakan; Sarıkaya, Ruhi; Sarikaya, Ruhi; Chen, Stanley F.; Gao, Yuqing; Picheny, Michael

    2005-01-01

    Although syntactic structure has been used in recent work in language modeling, there has not been much effort in using semantic analysis for language models. In this study, we propose three new language modeling techniques that use semantic analysis for spoken dialog systems. We call these methods concept sequence modeling, two-level semantic-lexical modeling, and joint semantic-lexical modeling. These models combine lexical information with varying amounts of semantic information, using ann...

  19. Optimizing Bus Passenger Complaint Service through Big Data Analysis: Systematized Analysis for Improved Public Sector Management

    Directory of Open Access Journals (Sweden)

    Weng-Kun Liu

    2016-12-01

    Full Text Available With the advances in industry and commerce, passengers have become more accepting of environmental sustainability issues; thus, more people now choose to travel by bus. Government administration constitutes an important part of bus transportation services as the government gives the right-of-way to transportation companies allowing them to provide services. When these services are of poor quality, passengers may lodge complaints. The increase in consumer awareness and developments in wireless communication technologies have made it possible for passengers to easily and immediately submit complaints about transportation companies to government institutions, which has brought drastic changes to the supply–demand chain comprised of the public sector, transportation companies, and passengers. This study proposed the use of big data analysis technology including systematized case assignment and data visualization to improve management processes in the public sector and optimize customer complaint services. Taichung City, Taiwan, was selected as the research area. There, the customer complaint management process in public sector was improved, effectively solving such issues as station-skipping, allowing the public sector to fully grasp the service level of transportation companies, improving the sustainability of bus operations, and supporting the sustainable development of the public sector–transportation company–passenger supply chain.

  20. Accredited Health Department Partnerships to Improve Health: An Analysis of Community Health Assessments and Improvement Plans.

    Science.gov (United States)

    Kronstadt, Jessica; Chime, Chinecherem; Bhattacharya, Bulbul; Pettenati, Nicole

    The Public Health Accreditation Board (PHAB) Standards & Measures require the development and updating of collaborative community health assessments (CHAs) and community health improvement plans (CHIPs). The goal of this study was to analyze the CHAs and CHIPs of PHAB-accredited health departments to identify the types of partners engaged, as well as the objectives selected to measure progress toward improving community health. The study team extracted and coded data from documents from 158 CHA/CHIP processes submitted as part of the accreditation process. Extracted data included population size, health department type, data sources, and types of partner organizations. Health outcome objectives were categorized by Healthy People 2020 Leading Health Indicator (LHI), as well as by the 7 broad areas in the PHAB reaccreditation framework for population health outcomes reporting. Participants included health departments accredited between 2013 and 2016 that submitted CHAs and CHIPs to PHAB, including 138 CHAs/CHIPs from local health departments and 20 from state health departments. All the CHAs/CHIPs documented collaboration with a broad array of partners, with hospitals and health care cited most frequently (99.0%). Other common partners included nonprofit service organizations, education, business, and faith-based organizations. Small health departments more frequently listed many partner types, including law enforcement and education, compared with large health departments. The majority of documents (88.6%) explicitly reference Healthy People 2020 goals, with most addressing the LHIs nutrition/obesity/physical activity and access to health services. The most common broad areas from PHAB's reaccreditation framework were preventive health care and individual behavior. This study demonstrates the range of partners accredited health departments engage with to collaborate on improving their communities' health as well as the objectives used to measure community health

  1. An analysis of radioisotope power systems using improved AMTEC cells

    International Nuclear Information System (INIS)

    El-Genk, M.S.; Tournier, J.M.

    1998-01-01

Recently, a ground demo of eight AMTEC (PX-3G) cells was tested successfully in vacuum at the Air Force Research Laboratory (AFRL). Results showed that the electric power output and voltage of the best-performing PX-3G cell fall short of meeting the requirements of the Pluto/Express (PX) mission. Using the basic configuration of the PX-3G cell, several design changes are explored to improve the cell performance. Also, several integration options of the improved PX-3G cells with General-Purpose Heat Source (GPHS) modules are investigated for an electric power level of 130 We and a 15-year mission. The options explored include varying the number of GPHS modules and AMTEC cells, and using fresh or aged fuel. The effects of changing the generators' output voltage (24 V or 28 V) on the evaporator and BASE metal-ceramic braze temperatures and the temperature margin in the cell are also examined

  2. GOCI Level-2 Processing Improvements and Cloud Motion Analysis

    Science.gov (United States)

    Robinson, Wayne D.

    2015-01-01

    The Ocean Biology Processing Group has been working with the Korean Institute of Ocean Science and Technology (KIOST) to process geosynchronous ocean color data from the GOCI (Geostationary Ocean Color Instrument) aboard the COMS (Communications, Ocean and Meteorological Satellite). The level-2 processing program, l2gen has GOCI processing as an option. Improvements made to that processing are discussed here as well as a discussion about cloud motion effects.

  3. Using Principal Component Analysis to Improve Fallout Characterization

    Science.gov (United States)

    2017-03-23

Overview Current high-fidelity methods of post-detonation forensics are time consuming. The ability to focus these methods on areas of highest...time intensive forensic methods [6]. This research will build upon that success by increasing the available data and improving the correlation between...provides a rapid method of characterization for each sample. Each sample was scanned using a uniform grid as shown in Figure 14. This grid provides
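The snippet above is fragmentary, but the method it names, principal component analysis, is standard. A minimal numpy sketch with simulated stand-in data (not the study's measurements): each sample is a long feature vector, PCA of the mean-centered data matrix finds the directions of highest variance, and projecting samples onto the top components gives a compact characterization.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated spectra-like features for 50 samples: 200 channels whose
# variation is driven by 3 latent factors plus small noise (stand-in data).
latent = rng.normal(size=(50, 3))
loadings = rng.normal(size=(3, 200))
X = latent @ loadings + 0.1 * rng.normal(size=(50, 200))

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance fraction per component
scores = Xc @ Vt[:3].T                   # project samples onto top 3 PCs

print(f"variance captured by 3 components: {explained[:3].sum():.3f}")
```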

  4. Improvements in longwall downtime analysis and fault identification

    Energy Technology Data Exchange (ETDEWEB)

    Daniel Bongers [CRCMining (Australia)

    2006-12-15

In this project we have developed a computer program for recording detailed information relating to face equipment downtime in longwall mining operations. This software is intended to replace the current manual recording of delay information, which has been proven to be inaccurate. The software developed is intended to be operated from the maingate computer. Users are provided with a simple user interface requesting the nature of each delay in production, which is time-stamped in alignment with the SCADA system, removing the need for operators to estimate the start time and duration of each delay. Each instance of non-production is recorded to a database, which may be accessed by surface computers, removing the need to transcribe the deputy's report into the delay database. An additional suggestive element has been developed, based on sophisticated fault detection technology, which reduces the data input required by operators and provides a basis for the implementation of real-time fault detection. Both the basic recording software and the suggestive element offer improvements in efficiency and accuracy to longwall operations. More accurate data allow improved maintenance planning and improved measures of operational KPIs. The suggestive element offers the potential for rapid fault diagnosis, and potentially delay forecasting, which may be used to reduce lost time associated with machine downtime.

  5. Improving the usefulness of accounting data in financial analysis

    Directory of Open Access Journals (Sweden)

    A Saville

    2004-04-01

    Full Text Available Accounting practices are flawed.  As a consequence, the accounting data generated by firms are generally open to interpretation, often misleading and sometimes patently false.  Yet, financial analysts place tremendous confidence in accounting data when appraising investments and investment strategies.  The implications of financial analysis based on questionable information are numerous, and range from inexact analysis to acute investment error.  To rectify this situation, this paper identifies a set of simple, yet highly effective corrective measures, which have the capacity to move accounting practice into a realm wherein accounting starts to ‘count what counts’.  The net result would be delivery of accounting data that more accurately reflect firms’ economic realities and, as such, are more useful in the task of financial analysis.

  6. Risk Analysis for Performance Improvement in a Romanian Pharmaceutical Company

    Directory of Open Access Journals (Sweden)

    Dana Corina Deselnicu

    2018-05-01

Full Text Available The paper presents a risk management analysis carried out to investigate the operations of a Romanian company dealing with the distribution of pharmaceutical products. The main risks challenging the company were identified, described and classified, providing a scientific base for further analysis. The identified inherent risks were then evaluated using tools such as the risk index method and the risk matrix in order to assess their tolerance level. Based on the results of the evaluation, risk mitigation strategies and measures were proposed for the management of the analysed risks. Relevant conclusions were drawn from the experience.

  7. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis.

    Science.gov (United States)

    Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha

    2017-07-01

    Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), beginning March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Scores (STQ score). An interrupted time series linear regression model compared the STQ score during 14 months before the CQIP implementation to the first 14 months after. During the 29-month study period 3,822 patients met study criteria. 1,028 patients needed one or more of the five studied interventions during the study period. All five endpoints had a significant increase between the pre-CQI and post-CQI periods (pRwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the injured in a resource-limited setting. Copyright © 2017 Elsevier Ltd. All rights reserved.
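An interrupted time series regression of the kind described can be sketched as a segmented OLS model with a level-change and a slope-change term at the intervention month. Everything below is simulated for illustration (not SAMU's data); the coefficients correspond to the pre-intervention trend, the jump at programme launch, and the post-launch trend change.

```python
import numpy as np

rng = np.random.default_rng(0)
# Monthly quality score: 14 pre-intervention and 14 post-intervention points.
t = np.arange(28)
post = (t >= 14).astype(float)                # indicator: after intervention
t_since = np.where(t >= 14, t - 14.0, 0.0)    # months since intervention
# True process: flat at 60 pre-intervention, +15 level jump, +1/month slope.
score = 60 + 15 * post + 1.0 * t_since + rng.normal(0, 2, size=28)

# Segmented regression: intercept, pre-trend, level change, slope change.
X = np.column_stack([np.ones(28), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
intercept, pre_slope, level_change, slope_change = beta
print(f"level change: {level_change:.1f}, slope change: {slope_change:.2f}")
```

In practice one would also report standard errors and check for autocorrelation in the residuals, which plain OLS ignores.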

  8. Comments: Improving Weighting Methods for Causal Mediation Analysis

    Science.gov (United States)

    Imai, Kosuke

    2012-01-01

    The author begins this discussion by thanking Larry Hedges, the editor of the journal, for giving him an opportunity to provide a commentary on this stimulating article. He also would like to congratulate the authors of the article for their insightful discussion on causal mediation analysis, which is one of the most important and challenging…

  9. Improving Family Forest Knowledge Transfer through Social Network Analysis

    Science.gov (United States)

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  10. Temporal Land Cover Analysis for Net Ecosystem Improvement

    Energy Technology Data Exchange (ETDEWEB)

    Ke, Yinghai; Coleman, Andre M.; Diefenderfer, Heida L.

    2013-04-09

We delineated 8 watersheds contributing to previously defined river reaches within the 1,468-km2 historical floodplain of the tidally influenced lower Columbia River and estuary. We assessed land-cover change at the watershed, reach, and restoration site scales by reclassifying remote-sensing data from the National Oceanic and Atmospheric Administration Coastal Change Analysis Program’s land cover/land change product into forest, wetland, and urban categories. The analysis showed a 198.3 km2 loss of forest cover during the first 6 years of the Columbia Estuary Ecosystem Restoration Program, 2001–2006. Total measured urbanization in the contributing watersheds of the estuary during the full 1996-2006 change analysis period was 48.4 km2. Trends in forest gain/loss and urbanization differed between watersheds. Wetland gains and losses were within the margin of error of the satellite imagery analysis. No significant land cover change was measured at restoration sites, although it was visible in aerial imagery; therefore, the 30-m land-cover product may not be appropriate for assessment of early-stage wetland restoration. These findings suggest that floodplain restoration sites in reaches downstream of watersheds with decreasing forest cover will be subject to increased sediment loads, and those downstream of urbanization will experience effects of increased impervious surfaces on hydrologic processes.
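The per-category change accounting described above reduces to counting reclassified pixels in two classified rasters and scaling by pixel area. A toy sketch with simulated 30 m rasters (codes and numbers invented, not the C-CAP product):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two classified land-cover rasters for the same area (simulated),
# with codes 0=other, 1=forest, 2=wetland, 3=urban; 30 m pixels.
lc_1996 = rng.choice([0, 1, 2, 3], size=(500, 500), p=[0.2, 0.6, 0.15, 0.05])
lc_2006 = lc_1996.copy()
# Convert forest in a strip of the raster to urban, mimicking forest loss.
strip = lc_2006[:100]            # view into lc_2006, so edits propagate
strip[strip == 1] = 3

pixel_km2 = (30 * 30) / 1e6      # one 30 m pixel in km^2
for code, name in [(1, "forest"), (2, "wetland"), (3, "urban")]:
    change = ((lc_2006 == code).sum() - (lc_1996 == code).sum()) * pixel_km2
    print(f"{name}: {change:+.2f} km^2")
```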

  11. Improvement of gas chromatographic analysis for organic acids and ...

    African Journals Online (AJOL)

    Yomi

    2010-08-27

    Aug 27, 2010 ... short retention time and fair recognition peak of the compounds were obtained under the ... GC for acid and solvent analysis from ABE fermentation ... FID was kept at 230°C. Nitrogen gas was used as a carrier gas at a.

  12. Improving the Computational Morphological Analysis of a Swahili ...

    African Journals Online (AJOL)

    approach to the morphological analysis of Swahili. We particularly focus our discussion on its ability to retrieve lemmas for word forms and evaluate it as a tool for corpus-based dictionary compilation. Keywords: LEXICOGRAPHY, MORPHOLOGY, CORPUS ANNOTATION, LEMMATIZATION, MACHINE LEARNING, SWAHILI ...

  13. Engendering mobility: towards improved gender analysis in the transport sector

    CSIR Research Space (South Africa)

    Venter, C

    2006-01-01

    Full Text Available It is the purpose of this chapter to advance the discourse between gender analysis and transport, specifically within the urban development context, with a view to promoting an understanding of the strategic role of transport, access and mobility...

  14. An improvement analysis on video compression using file segmentation

    Science.gov (United States)

    Sharma, Shubhankar; Singh, K. John; Priya, M.

    2017-11-01

Over the past two decades, the rapid evolution of the Internet has led to a massive rise in video technology and in video consumption over the Internet, which now accounts for the bulk of data traffic in general. Because video consumes so much data on the World Wide Web, reducing the burden on the Internet and the bandwidth consumed by video lets users access video data more easily. For this purpose, many video codecs have been developed, such as HEVC/H.265 and VP9, although comparing such codecs raises the question of which is the superior technology in terms of rate distortion and coding standard. This paper addresses the difficulty of achieving low delay in video compression and in video applications, e.g. ad-hoc video conferencing/streaming or surveillance. It also benchmarks the HEVC and VP9 video compression techniques through subjective evaluations of High Definition video content played back in web browsers. Moreover, it presents the experimental approach of dividing a video file into several segments for compression and putting them back together, to improve the efficiency of video compression on the web as well as in offline mode.
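The segmentation idea in the last sentence, splitting a file into segments and reassembling it losslessly, can be sketched as follows (the segment size and stand-in data are invented; a real pipeline would compress each segment independently between the two steps):

```python
import os

def split_segments(data: bytes, seg_size: int):
    """Split a (video) byte stream into fixed-size segments; each segment
    could then be compressed or transmitted independently."""
    return [data[i:i + seg_size] for i in range(0, len(data), seg_size)]

def join_segments(segments):
    """Reassemble the original byte stream from its segments."""
    return b"".join(segments)

# Stand-in for a video file: 1 MiB of pseudo-random bytes.
video = os.urandom(1 << 20)
segments = split_segments(video, seg_size=256 * 1024)
print(len(segments), "segments")                 # 4 segments
print(join_segments(segments) == video)          # lossless reassembly: True
```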

  15. Improved lumped parameter for annular fuel element thermohydraulic analysis

    International Nuclear Information System (INIS)

    Duarte, Juliana Pacheco; Su, Jian; Alvim, Antonio Carlos Marques

    2011-01-01

Annular fuel elements have been intensively studied for the purpose of increasing power density in light water reactors (LWR). This paper presents an improved lumped parameter model for the dynamics of a LWR core with annular fuel elements, composed of three sub-models: the fuel dynamics model, the neutronics model, and the coolant energy balance model. The transient heat conduction in the radial direction is analyzed through an improved lumped parameter formulation. The Hermite approximation for integration is used to obtain the average temperatures of the fuel and cladding and also the average heat flux. The volumetric heat generation in the fuel rods was obtained from the point kinetics equations with six delayed neutron groups. The equations for the average temperatures of fuel and cladding are solved along with the point kinetics equations, assuming linear reactivity and coolant temperature in cases of reactivity insertion. The analytical development of the model and the numerical solution of the ordinary differential equation system were obtained using Mathematica 7.0. The dynamic behaviors of the average temperatures of fuel, cladding and coolant in transient events, as well as the reactor power, were analyzed. (author)
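The point kinetics part of such a model can be sketched independently of the heat conduction sub-model. The code below integrates the standard six-delayed-group point kinetics equations with a fixed-step RK4 scheme, using typical textbook U-235 delayed-neutron constants and an illustrative generation time; the paper's full model would additionally couple reactivity to the lumped fuel and coolant temperatures.

```python
import numpy as np

# Six-group point kinetics (typical textbook U-235 delayed-neutron data).
beta_i = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
lam = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # decay consts, 1/s
beta, Lambda = beta_i.sum(), 1e-4                            # total beta, gen. time (s)

def rhs(y, rho):
    """dn/dt = (rho - beta)/Lambda * n + sum(lam_i * C_i);
       dC_i/dt = beta_i/Lambda * n - lam_i * C_i."""
    n, C = y[0], y[1:]
    dn = (rho - beta) / Lambda * n + np.sum(lam * C)
    dC = beta_i / Lambda * n - lam * C
    return np.concatenate([[dn], dC])

def rk4_step(y, rho, dt):
    k1 = rhs(y, rho)
    k2 = rhs(y + 0.5 * dt * k1, rho)
    k3 = rhs(y + 0.5 * dt * k2, rho)
    k4 = rhs(y + dt * k3, rho)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Start critical at equilibrium, then insert a small step reactivity.
y = np.concatenate([[1.0], beta_i / (Lambda * lam)])   # equilibrium precursors
rho, dt = 0.1 * beta, 1e-4                             # +10 cents, step size
for _ in range(int(1.0 / dt)):                         # simulate 1 s
    y = rk4_step(y, rho, dt)
print(f"relative power after 1 s: {y[0]:.3f}")
```

For a +10 cent insertion the power undergoes a prompt jump of roughly beta/(beta - rho) and then rises slowly on the asymptotic period.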

  16. SIFT Based Vein Recognition Models: Analysis and Improvement

    Directory of Open Access Journals (Sweden)

    Guoqing Wang

    2017-01-01

Full Text Available Scale-Invariant Feature Transform (SIFT) is being investigated more and more as a way to realize a less-constrained hand vein recognition system. Contrast enhancement (CE), which compensates for deficient dynamic range, is a must for SIFT-based frameworks to improve performance. However, our experiments provide evidence of a negative influence of CE on SIFT matching. We show that the number of keypoints extracted by gradient-based detectors increases greatly under different CE methods, while the matching of the extracted invariant descriptors is negatively influenced in terms of Precision-Recall (PR) and Equal Error Rate (EER). Rigorous experiments with state-of-the-art CE methods, and with those adopted in published SIFT-based hand vein recognition systems, demonstrate this influence. Furthermore, an improved SIFT model that imports the RootSIFT kernel and a Mirror Match Strategy into a unified framework is proposed to exploit the positive change in keypoints and compensate for the negative influence brought by CE.
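The RootSIFT kernel mentioned above has a compact closed form: L1-normalise each descriptor, then take the element-wise square root, so that Euclidean comparisons on the transformed vectors implement the Hellinger kernel on the original histograms. A minimal sketch on made-up descriptors:

```python
import numpy as np

def rootsift(desc, eps=1e-12):
    """RootSIFT transform: L1-normalise each SIFT descriptor, then take the
    element-wise square root. Euclidean distance between the transformed
    vectors then corresponds to the Hellinger kernel on the originals."""
    desc = desc / (np.abs(desc).sum(axis=1, keepdims=True) + eps)
    return np.sqrt(desc)

# Two hypothetical 128-D SIFT descriptors (non-negative histogram entries).
rng = np.random.default_rng(0)
d = rng.uniform(0, 1, size=(2, 128))
r = rootsift(d)

# Each transformed descriptor is L2-normalised by construction:
print(np.allclose((r ** 2).sum(axis=1), 1.0))   # True
```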

  17. Analysis of radial electric field in LHD towards improved confinement

    International Nuclear Information System (INIS)

    Yokoyama, M.; Ida, K.; Sanuki, H.; Itoh, K.; Narihara, K.; Tanaka, K.; Kawahata, K.; Ohyabu, N.

    2001-05-01

The radial electric field (E_r) properties in LHD have been investigated to provide guidance towards improved confinement with possible E_r transition and bifurcation. The ambipolar E_r is obtained from the neoclassical flux based on the analytical formulae. This approach is appropriate to clarify ambipolar E_r properties in a wide range of temperature and density in a more transparent way. The comparison between the calculated E_r and the experimentally measured one has shown qualitatively good agreement, such as the threshold density for the transition from ion root to electron root. The calculations also reproduce well the experimentally observed tendency that the electron root becomes possible with increasing temperatures even at higher density, and that the ion root is enhanced at higher density. Based on the usefulness of this approach for analyzing E_r in LHD, calculations over a wide range have been performed to clarify the parameter region of interest where multiple solutions of E_r can exist. This is the region where E_r transition and bifurcation may be realized, as already experimentally confirmed in CHS. The systematic calculations give a comprehensive understanding of the experimentally observed E_r properties, which indicates an optimum path towards improved confinement. (author)

  18. Continuous quality improvement using intelligent infusion pump data analysis.

    Science.gov (United States)

    Breland, Burnis D

    2010-09-01

    The use of continuous quality-improvement (CQI) processes in the implementation of intelligent infusion pumps in a community teaching hospital is described. After the decision was made to implement intelligent i.v. infusion pumps in a 413-bed community teaching hospital, drug libraries for use in the safety software had to be created. Before the drug libraries could be created, it was necessary to determine the epidemiology of medication use in the various clinical care areas. Standardization of medication administration was achieved through the CQI process, using the practical knowledge of clinicians at the bedside and evidence-based drug safety parameters from the scientific literature. Post-implementation, CQI allowed refinement of clinically important safety limits while minimizing inappropriate, meaningless soft-limit alerts on a few select agents. Assigning individual clinical care areas (CCAs) to individual patient care units facilitated customization of drug libraries and identification of specific CCA compliance concerns. Between June 2007 and June 2008, there were seven library updates, involving drug additions and deletions, customization of individual CCAs, and alteration of limits. Overall compliance with safety software use rose over time, from 33% in November 2006 to over 98% in December 2009. Many potentially clinically significant dosing errors were intercepted by the safety software, prompting edits by end users. Only 4-6% of soft-limit alerts resulted in edits. Compliance rates for use of infusion pump safety software varied among CCAs over time. Education, auditing, and refinement of drug libraries led to improved compliance in most CCAs.

  19. Improved Monte Carlo Method for PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Choi, Jongsoo

    2016-01-01

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties arise from limitations in knowledge. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and to assist in developing a strategy for accommodating them in the regulatory process. Uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach that calculates mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs) using efficient random number generators, so as to meet Capability Category III of the ASME/ANS PRA standard. Audit calculations are needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.

  20. Improved Monte Carlo Method for PSA Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jongsoo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    The treatment of uncertainty is an important issue for regulatory decisions. Uncertainties arise from limitations in knowledge. A probabilistic approach has exposed some of these limitations and provided a framework to assess their significance and to assist in developing a strategy for accommodating them in the regulatory process. Uncertainty analysis (UA) is usually based on the Monte Carlo method. This paper proposes a Monte Carlo UA approach that calculates mean risk metrics accounting for the state-of-knowledge correlation (SOKC) between basic events (including CCFs) using efficient random number generators, so as to meet Capability Category III of the ASME/ANS PRA standard. Audit calculations are needed in PSA regulatory reviews of uncertainty analysis results submitted for licensing. The proposed Monte Carlo UA approach provides a high degree of confidence in PSA reviews. All PSAs need to account for the SOKC between event probabilities to meet the ASME/ANS PRA standard.
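
The state-of-knowledge correlation (SOKC) these records account for can be illustrated with a toy Monte Carlo: identical components share one uncertain failure probability, so each trial should reuse a single sample for all of them; independent draws understate the mean of products such as minimal-cutset contributions. A sketch (the lognormal median and error factor are illustrative values, not from the paper):

```python
import math
import random
import statistics

def mean_parallel_failure(median, ef, n=200_000, sokc=True, seed=1):
    """Mean failure probability of two identical components in parallel.

    The basic-event probability is lognormal with the given median and
    error factor (EF = p95/p50). With SOKC the same sample is reused for
    both components in each trial; without it the two draws are
    independent, which understates the mean of the product."""
    rng = random.Random(seed)
    mu = math.log(median)
    sigma = math.log(ef) / 1.6449  # EF = exp(1.645 * sigma)
    vals = []
    for _ in range(n):
        p1 = rng.lognormvariate(mu, sigma)
        p2 = p1 if sokc else rng.lognormvariate(mu, sigma)
        vals.append(p1 * p2)
    return statistics.fmean(vals)

with_sokc = mean_parallel_failure(1e-3, 3.0, sokc=True)
without = mean_parallel_failure(1e-3, 3.0, sokc=False)
print(with_sokc > without)  # correlated sampling gives the larger mean
```

Analytically, E[p^2] = E[p]^2 * exp(sigma^2) for a lognormal p, so ignoring the SOKC biases the mean cutset probability low by that factor.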

  1. Improved asymptotic stability analysis for uncertain delayed state neural networks

    International Nuclear Information System (INIS)

    Souza, Fernando O.; Palhares, Reinaldo M.; Ekel, Petr Ya.

    2009-01-01

    This paper presents a new linear matrix inequality (LMI) based approach to the stability analysis of artificial neural networks (ANN) subject to time-delay and polytope-bounded uncertainties in the parameters. The main objective is to propose a less conservative condition for the stability analysis, using Gu's discretized Lyapunov-Krasovskii functional theory and an alternative strategy for introducing slack matrices. Two computer simulation examples are presented to support the theoretical predictions. In the first example, Hopf bifurcation theory is used to verify the stability of the system when the origin becomes unstable. The second example illustrates how the proposed approach can provide better stability performance compared with other approaches in the literature

  2. In House HSV PCR, Process Improvement and Cost Effectiveness Analysis

    Science.gov (United States)

    2017-09-15

    [Abstract not available: the record consists of report documentation form residue. Recoverable fields: poster, dated 09/15/2017; title: "Cost-Analysis: In-house HSV PCR capabilities"; 3 pages.]

  3. Optimization to improve precision in neutron activation analysis

    International Nuclear Information System (INIS)

    Yustina Tri Handayani

    2010-01-01

    The level of precision or accuracy required in an analysis should satisfy both general requirements and customer needs. In presenting analytical results, the level of precision is expressed as an uncertainty; the general requirement is the Horwitz prediction. Factors affecting the uncertainty in Neutron Activation Analysis (NAA) include the sample mass, the standard mass, the concentration in the standard, the sample count, the standard count, and the counting geometry. Therefore, to achieve the expected level of precision, these parameters need to be optimized. A standard concentration of similar material is applied as the basis of calculation; here NIST SRM 2704 is applied for sediment samples. The sample mass, irradiation time, and cooling time can be modified to obtain the expected uncertainty. The predictions show that the precision levels for Al, V, Mg, Mn, K, Na, As, Cr, Co, Fe, and Zn satisfy the Horwitz criterion. The predicted counts and standard deviations for Mg-27 and Zn-65 were higher than the actual values because of overlap between the Mg-27 and Mn-54 peaks and between the Zn-65 and Fe-59 peaks. The predicted uncertainty for Ca exceeds the Horwitz value, since the microscopic cross section, the radiation emission probability of Ca-49, and the gamma spectrometer efficiency at 3084 keV are all relatively small; because these values are fixed, the precision can only be increased by extending the counting time and increasing the number of samples. The predictions are in accordance with experimental results. (author)
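
The Horwitz criterion used here as the general requirement has a standard closed form, RSD(%) = 2^(1 - 0.5 * log10 C) with C the mass fraction, and the listed uncertainty factors combine in quadrature. A sketch with an illustrative uncertainty budget (the component values are assumptions, not the paper's numbers):

```python
import math

def horwitz_rsd_percent(mass_fraction):
    """Horwitz predicted relative standard deviation (%) for an analyte
    concentration C expressed as a mass fraction (1 ppm = 1e-6)."""
    return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

def combined_relative_uncertainty(*rel_components):
    """Combine independent relative uncertainties in quadrature, as for
    sample mass, standard mass, counting statistics and geometry in NAA."""
    return math.sqrt(sum(u * u for u in rel_components))

# At 1 ppm the Horwitz prediction allows about 16% RSD
print(round(horwitz_rsd_percent(1e-6), 1))  # → 16.0

# Illustrative NAA budget: masses 0.2% each, counting 3%, geometry 1%
u = combined_relative_uncertainty(0.002, 0.002, 0.03, 0.01)
print(u < horwitz_rsd_percent(1e-6) / 100)  # → True, within the criterion
```

Because the counting-statistics term usually dominates, extending counting time (or measuring more samples) is the lever the abstract identifies when the other factors are fixed.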

  4. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
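
As a baseline for the equivalent-hydraulic-diameter idea above, the geometric definition D_h = 4A/P_w and a laminar wall-shear estimate can be sketched as follows. This is a simplified illustration, not the COBRA-TF implementation; the square-cell dimensions and water-like properties are assumptions, and the paper's method would replace the geometric D_h with one backed out of the input velocity profile:

```python
def hydraulic_diameter(flow_area, wetted_perimeter):
    """Geometric hydraulic diameter D_h = 4A / P_w."""
    return 4.0 * flow_area / wetted_perimeter

def laminar_wall_shear(rho, mu, velocity, d_h):
    """Wall shear stress from the laminar Darcy friction factor
    f = 64/Re, using tau_w = (f/4) * rho * v**2 / 2."""
    re = rho * velocity * d_h / mu
    f = 64.0 / re
    return (f / 4.0) * rho * velocity ** 2 / 2.0

# Square subchannel cell, 10 mm x 10 mm, water-like properties
d_h = hydraulic_diameter(0.01 * 0.01, 4 * 0.01)
print(round(d_h, 4))                                      # → 0.01
print(round(laminar_wall_shear(1000.0, 1e-3, 0.05, d_h), 4))  # → 0.04
```

Substituting a per-cell equivalent D_h into `laminar_wall_shear` is the mechanism by which a velocity profile from CFD or experiment redistributes the wall drag among cells.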

  5. Improved methods for dependent failure analysis in PSA

    International Nuclear Information System (INIS)

    Ballard, G.M.; Games, A.M.

    1988-01-01

    The basic design principle used to ensure the safe operation of nuclear power plant is defence in depth. This normally takes the form of redundant equipment and systems which provide protection even if a number of equipment failures occur. Such redundancy is particularly effective in ensuring that multiple, independent equipment failures with the potential to jeopardise reactor safety will be rare events. However, the achievement of high reliability has served to highlight the potentially dominant role of multiple, dependent failures of equipment and systems. Analysis of reactor operating experience has shown that dependent failure events are the major contributors to safety system failures and to reactor incidents and accidents. In parallel, PSA studies have shown that the results of a safety analysis are sensitive to the assumptions made about the dependent failure (CCF) probability for safety systems: a Westinghouse analysis showed that increasing system dependent-failure probabilities by a factor of 5 led to a factor of 4 increase in core damage frequency. This paper deals particularly with the engineering concepts underlying dependent failure assessment, touching briefly on aspects of data. It is specifically not the intent of our work to develop a new mathematical model of CCF, but to aid the use of existing models
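
The sensitivity described (scaling the dependent-failure probability scales the system result almost proportionally) falls out of the common beta-factor CCF model, in which a fraction beta of each component's unavailability is common-cause. A sketch for a 1-out-of-2 redundant system; the beta-factor model is one standard existing CCF model, not necessarily the one discussed in the paper, and the numbers are illustrative:

```python
def redundant_pair_unavailability(q, beta):
    """1-out-of-2 system under the beta-factor model: the independent
    part ((1 - beta) * q)**2 plus the common-cause part beta * q."""
    return ((1.0 - beta) * q) ** 2 + beta * q

q = 1e-3
base = redundant_pair_unavailability(q, 0.05)
stressed = redundant_pair_unavailability(q, 0.25)  # 5x the CCF fraction
print(round(stressed / base, 2))  # → 4.92, close to a full factor of 5
```

Because the common-cause term beta * q dwarfs the squared independent term, redundancy buys little once beta is non-negligible, which is exactly why CCF assumptions dominate PSA results.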

  6. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    Directory of Open Access Journals (Sweden)

    Shao Jie

    2014-01-01

    Full Text Available A modeling approach based on an improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with memory effects. In this model, the hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions. The error curves of the sum of squared errors (SSE), varying with the number of hidden neurons and the iteration step, are studied to determine the number of hidden layer neurons. Simulation results for the half-bridge class-D power amplifier (CDPA), with a two-tone signal and broadband signals as input, show that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL) model, the Chebyshev neural network (CNN) model, and the basic Elman neural network (BENN) model, the proposed model has better performance.
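
The Chebyshev orthogonal basis used in place of sigmoid activations is generated by the recurrence T0 = 1, T1 = x, T(k+1) = 2x*Tk - T(k-1). A sketch of the basis plus one hypothetical hidden-unit update in the spirit of the IENN (the weights, the context feedback and the clamping to [-1, 1] are illustrative assumptions, not the paper's model):

```python
def chebyshev_basis(x, order):
    """T_0..T_order at x via the recurrence T_{k+1} = 2x*T_k - T_{k-1}."""
    t = [1.0, x]
    for _ in range(order - 1):
        t.append(2.0 * x * t[-1] - t[-2])
    return t[: order + 1]

def ienn_hidden_step(x, context, w_in, w_ctx, weights):
    """One hidden-unit update (sketch): the activation is a Chebyshev
    expansion of the mixed input instead of a sigmoid; `context` is the
    previous hidden output fed back, Elman-style."""
    u = max(-1.0, min(1.0, w_in * x + w_ctx * context))  # clamp to T_k domain
    basis = chebyshev_basis(u, len(weights) - 1)
    return sum(w * t for w, t in zip(weights, basis))

print(chebyshev_basis(0.5, 3))  # → [1.0, 0.5, -0.5, -1.0]
```

Orthogonality of the T_k on [-1, 1] is what lets the SSE-versus-neuron-count curves in the abstract behave smoothly as basis functions are added.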

  7. Improvement of testing and maintenance based on fault tree analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2000-01-01

    Testing and maintenance of safety equipment is an important issue that contributes significantly to the safe and efficient operation of a nuclear power plant. In this paper a method which extends the classical fault tree with time is presented. Its mathematical model is represented by a set of equations that include time requirements defined in a house event matrix. The house event matrix is a representation of house events switched on and off over discrete points in time; the house events switch parts of the fault tree on and off in accordance with the status of the plant configuration. The time-dependent top event probability is calculated by fault tree evaluations. The arrangement of component outages is determined on the basis of minimizing the mean system unavailability. The results show that application of the method may improve the timing of testing and maintenance activities for safety equipment. (author)
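
The house event matrix idea can be sketched directly: house events switch subtrees on and off at discrete time points, and the time-dependent top-event probability follows from evaluating the tree at each point. A toy two-train AND gate (the probabilities and the staggered test schedule are illustrative assumptions):

```python
def top_event_probability(q_a, q_b, a_in_test, b_in_test):
    """Top = A AND B (two redundant trains); a train under test is
    switched to unavailable (probability 1) by its house event."""
    pa = 1.0 if a_in_test else q_a
    pb = 1.0 if b_in_test else q_b
    return pa * pb

# House event matrix: one row per house event, one column per discrete
# time point; "train in test" is staggered so the outages never overlap.
house_matrix = {
    "A": [True, False, False, False],
    "B": [False, False, True, False],
}
q = [top_event_probability(1e-3, 1e-3, a, b)
     for a, b in zip(house_matrix["A"], house_matrix["B"])]
mean_unavailability = sum(q) / len(q)
print(q)
print(mean_unavailability)
```

Re-running the evaluation with overlapping outages (both house events true at the same time point) drives that point's unavailability to 1, which is why arranging outages to minimize the mean is the optimization target.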

  8. Improved part-of-speech prediction in suffix analysis.

    Directory of Open Access Journals (Sweden)

    Mario Fruzangohar

    Full Text Available MOTIVATION: Predicting the part-of-speech (POS) tag of an unknown word in a sentence is a significant challenge. This is particularly difficult in biomedicine, where POS tags serve as an input for training sophisticated literature summarization techniques, such as those based on Hidden Markov Models (HMMs). Different approaches have been taken to deal with the POS tagging challenge, but with one exception--the TnT POS tagger--previous publications on POS tagging have omitted details of the suffix analysis used for handling unknown words. The suffix of an English word is a strong predictor of its POS tag. As a prerequisite for an accurate HMM POS tagger for biomedical publications, we present an efficient suffix prediction method for integration into a POS tagger. RESULTS: We have implemented a fully functional HMM POS tagger using experimentally optimised suffix-based prediction. Our simple suffix analysis method significantly outperformed the probability-interpolation-based TnT method. We have also shown how important suffix analysis can be for probability estimation of a known word (one in the training corpus) with an unseen POS tag, a common scenario with a small training corpus. We then integrated this simple method into our POS tagger and determined an optimised parameter set for both methods, which can help developers optimise their current algorithms based on our results. We also introduce, for the first time, the concept of counting methods in maximum likelihood estimation and show how the counting method can affect the prediction result. Finally, we describe how machine-learning techniques were applied to identify words for which prediction of POS tags was always incorrect, and propose a method to handle words of this type. AVAILABILITY AND IMPLEMENTATION: Java source code, binaries and setup instructions are freely available at http://genomes.sapac.edu.au/text_mining/pos_tagger.zip.
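
The suffix-based prediction described can be sketched as a longest-suffix back-off over tag counts. This is a minimal illustration, not the authors' optimised method; the toy corpus, the 4-character suffix cap and the default tag are assumptions:

```python
from collections import Counter, defaultdict

def train_suffix_model(tagged_words, max_suffix=4):
    """Count tag frequencies for every word suffix up to max_suffix chars."""
    counts = defaultdict(Counter)
    for word, tag in tagged_words:
        for k in range(1, min(max_suffix, len(word)) + 1):
            counts[word[-k:]][tag] += 1
    return counts

def predict_tag(counts, word, max_suffix=4, default="NN"):
    """Back off from the longest known suffix to shorter ones."""
    for k in range(min(max_suffix, len(word)), 0, -1):
        tags = counts.get(word[-k:])
        if tags:
            return tags.most_common(1)[0][0]
    return default

corpus = [("phosphorylation", "NN"), ("activation", "NN"),
          ("phosphorylated", "VBN"), ("activated", "VBN"),
          ("rapidly", "RB")]
model = train_suffix_model(corpus)
print(predict_tag(model, "methylation"))  # → NN  (matches suffix "tion")
print(predict_tag(model, "methylated"))   # → VBN (matches suffix "ated")
```

A real tagger would smooth these counts into emission probabilities for the HMM rather than taking a hard majority vote; the back-off ordering is the part the abstract emphasises.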

  9. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
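
Hack's stream length-gradient (SL) index underlying the cited methods can be sketched on a synthetic longitudinal profile: segments whose SL greatly exceeds the profile mean are candidate knickpoints. This is a simplified reading of the method (the threshold factor of 2 and the profile data are illustrative assumptions, and the real tool works on DEM-extracted drainage networks):

```python
def sl_index(distances, elevations):
    """Hack's stream length-gradient index per segment:
    SL = (dH/dL) * L, with L taken here as the distance from the
    headwater to the segment midpoint. High SL relative to the profile
    mean flags a potential knickpoint."""
    out = []
    for (l0, h0), (l1, h1) in zip(zip(distances, elevations),
                                  zip(distances[1:], elevations[1:])):
        slope = (h0 - h1) / (l1 - l0)      # downstream drop per metre
        out.append(slope * 0.5 * (l0 + l1))
    return out

# Synthetic profile with a steep step (knickpoint) in the third segment
dist = [0, 1000, 2000, 3000, 4000]
elev = [500, 480, 460, 380, 365]           # 80 m drop mid-stream
sl = sl_index(dist, elev)
mean_sl = sum(sl) / len(sl)
flags = [s > 2 * mean_sl for s in sl]
print(flags)  # → [False, False, True, False]
```

Mapping such flagged segments against lineaments and epicentres is the spatial correlation step the abstract describes for the James River test area.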

  10. Recent improvements in plutonium gamma-ray analysis using MGA

    International Nuclear Information System (INIS)

    Ruhter, W.D.; Gunnink, R.

    1992-06-01

    MGA is a gamma-ray spectrum analysis program for determining relative plutonium isotopic abundances. It can determine plutonium isotopic abundances to better than 1% using a high-resolution, low-energy, planar germanium detector and measurement times of ten minutes or less. We have modified MGA to allow determination of absolute plutonium isotopic abundances in solutions: with calibration of a detector using a known solution concentration in a well-defined sample geometry, plutonium solution concentrations can be determined. MGA can also include analysis of a second, high-energy spectrum to determine fission product abundances relative to total plutonium. For the high-energy gamma-ray measurements we have devised a new hardware configuration in which both the low- and high-energy gamma-ray detectors are mounted in a single cryostat, thereby reducing the weight and volume of the detector systems. We describe the detector configuration and the performance of the MGA program in determining plutonium concentrations in solutions and fission product abundances

  11. Analysis and Measures to Improve Waste Management in Schools

    Directory of Open Access Journals (Sweden)

    Elena Cristina Rada

    2016-08-01

    Full Text Available Assessing waste production in schools highlights the contribution of school children and school staff to the total amount of waste generated in a region, as well as any poor recycling practices (the so-called separate collection of waste) in schools, which could be improved through educational activities. Educating young people about the importance of environmental issues is essential, since instilling the right behavior in school children also benefits the behavior of their families. The way waste management was carried out in different schools in Trento (northern Italy) was analyzed: a primary school, a secondary school, and three high schools were taken as case studies. The possible influence of the age of the students and of the various activities carried out within the schools on the different waste-separation behaviors was also evaluated. The results showed that the production of waste did not depend only on the size of the institutes and on the number of occupants but, especially, on the type of activities carried out in addition to ordinary classes and on the habits of both pupils and staff. In light of the results obtained, corrective measures were proposed to the schools, aimed at increasing students' awareness of the importance of correct waste management behavior and the application of good recycling practices.

  12. Systematic wavelength selection for improved multivariate spectral analysis

    Science.gov (United States)

    Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.

    1995-01-01

    Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g. the concentration of an analyte such as glucose in blood, or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the model's fitness of the determination for the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
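
The genetic-algorithm step of the selection process can be sketched over wavelength bitmasks with a fitness of the form F = f(cost, performance). Everything here is an illustrative assumption: the hypothetical INFORMATIVE set stands in for cross-validated multivariate calibration performance, and the GA parameters are not from the patent:

```python
import random

N_WAVELENGTHS = 20
INFORMATIVE = {2, 5, 6, 11, 17}  # hypothetical "predictive" wavelengths

def fitness(mask):
    """F = f(cost, performance): reward informative picks, penalise
    subset size (the cost term)."""
    performance = len(INFORMATIVE & {i for i, b in enumerate(mask) if b})
    cost = sum(mask)
    return performance - 0.1 * cost

def evolve(pop_size=40, generations=60, p_mut=0.05, seed=7):
    """Elitist GA over wavelength-subset bitmasks."""
    rng = random.Random(seed)
    pop = [[rng.random() < 0.5 for _ in range(N_WAVELENGTHS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_WAVELENGTHS)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [(not g) if rng.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
selected = sorted(i for i, b in enumerate(best) if b)
print(selected)
```

In the patented method, repeated GA runs are tallied into a count spectrum, smoothed, and thresholded; the single run above corresponds to one contribution to that tally.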

  13. Performance analysis of PV plants: Optimization for improving profitability

    International Nuclear Information System (INIS)

    Díez-Mediavilla, M.; Alonso-Tristán, C.; Rodríguez-Amigo, M.C.; García-Calderón, T.; Dieste-Velasco, M.I.

    2012-01-01

    Highlights: ► Real PV production from two 100 kW p grid-connected installations is conducted. ► Data sets on production were collected over an entire year. ► Economic results highlight the importance of properly selecting the system components. ► Performance of PV plants is directly related to improvements of all components. - Abstract: A study is conducted of real PV production from two 100 kW p grid-connected installations located in the same area, both of which experience the same fluctuations in temperature and radiation. Data sets on production were collected over an entire year and both installations were compared under various levels of radiation. The installations were assembled with mono-Si panels, mounted on the same support system, and the power supply was equal for the inverter and the measurement system; the same parameters were also employed for the wiring, and electrical losses were calculated in both cases. The results, in economic terms, highlight the importance of properly selecting the system components and the design parameters for maximum profitability.

  14. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets, and it is efficient because it identifies only the necessary and sufficient conditions for system failure in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks, and its computational efficiency is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate system inequalities, which is useful in the reliability estimation of capacitated networks
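
Minimal system cutsets can be illustrated by brute force on a small bridge network: remove candidate sets of intermediate nodes and test source-sink connectivity, keeping only minimal disconnecting sets. This is only an illustration of the concept; the paper's algorithm avoids such exhaustive search:

```python
from itertools import combinations

def connected(nodes, edges, s, t):
    """Depth-first reachability of t from s in the surviving subnetwork."""
    if s not in nodes or t not in nodes:
        return False
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in nodes and b in nodes:
            adj[a].add(b)
            adj[b].add(a)
    seen, stack = {s}, [s]
    while stack:
        n = stack.pop()
        if n == t:
            return True
        for m in adj[n] - seen:
            seen.add(m)
            stack.append(m)
    return False

def minimal_cutsets(nodes, edges, s, t):
    """Enumerate minimal sets of intermediate nodes whose failure
    disconnects source s from sink t (brute force, smallest first,
    skipping supersets of cutsets already found)."""
    inner = [n for n in nodes if n not in (s, t)]
    cuts = []
    for r in range(1, len(inner) + 1):
        for combo in combinations(inner, r):
            c = set(combo)
            if any(prev <= c for prev in cuts):
                continue  # not minimal: contains a smaller cutset
            if not connected(set(nodes) - c, edges, s, t):
                cuts.append(c)
    return cuts

# Bridge network: s-a, s-b, a-b, a-t, b-t
nodes = {"s", "a", "b", "t"}
edges = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]
print(minimal_cutsets(nodes, edges, "s", "t"))  # only {a, b} cuts s from t
```

Each minimal cutset yields one system inequality (the system fails if every member of some cutset fails), which is the link to capacitated-network reliability mentioned at the end of the abstract.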

  15. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...... to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs...

  16. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  17. Trends in Mediation Analysis in Nursing Research: Improving Current Practice.

    Science.gov (United States)

    Hertzog, Melody

    2018-06-01

    The purpose of this study was to describe common approaches used by nursing researchers to test mediation models and evaluate them within the context of current methodological advances. MEDLINE was used to locate studies testing a mediation model and published from 2004 to 2015 in nursing journals. Design (experimental/correlation, cross-sectional/longitudinal, model complexity) and analysis (method, inclusion of test of mediated effect, violations/discussion of assumptions, sample size/power) characteristics were coded for 456 studies. General trends were identified using descriptive statistics. Consistent with findings of reviews in other disciplines, evidence was found that nursing researchers may not be aware of the strong assumptions and serious limitations of their analyses. Suggestions for strengthening the rigor of such studies and an overview of current methods for testing more complex models, including longitudinal mediation processes, are presented.
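
One of the current methods such reviews recommend over older causal-steps or Sobel approaches is the percentile bootstrap of the indirect effect a*b, which can be sketched with closed-form OLS slopes. The simulated data and effect sizes below are illustrative assumptions, not from the study:

```python
import random

def slope(x, y):
    """Simple OLS slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def b_path(x, m, y):
    """Slope of y on m controlling for x (two-predictor OLS)."""
    n = len(x)
    mx, mm, my = sum(x) / n, sum(m) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    smm = sum((a - mm) ** 2 for a in m)
    sxm = sum((a - mx) * (b - mm) for a, b in zip(x, m))
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    smy = sum((a - mm) * (b - my) for a, b in zip(m, y))
    return (smy * sxx - sxy * sxm) / (smm * sxx - sxm ** 2)

def bootstrap_indirect(x, m, y, n_boot=1000, seed=3):
    """Percentile-bootstrap 95% CI for the indirect effect a*b."""
    rng = random.Random(seed)
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xs = [x[i] for i in idx]
        ms = [m[i] for i in idx]
        ys = [y[i] for i in idx]
        est.append(slope(xs, ms) * b_path(xs, ms, ys))
    est.sort()
    return est[int(0.025 * n_boot)], est[int(0.975 * n_boot)]

# Simulated data with a true indirect effect a*b = 0.5 * 0.6 = 0.3
rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(200)]
m = [0.5 * xi + rng.gauss(0, 1) for xi in x]
y = [0.6 * mi + 0.2 * xi + rng.gauss(0, 1) for xi, mi in zip(x, m)]
lo, hi = bootstrap_indirect(x, m, y)
print(lo > 0)  # the 95% CI excludes zero, supporting mediation
```

Resampling whole cases (not residuals) keeps the a and b estimates correlated within each bootstrap draw, which is what makes this interval valid for the non-normal product a*b.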

  18. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    Health technology assessment (HTA) is the multidisciplinary study of the implications of the development, diffusion and use of health technologies. It supports health-policy decisions by providing a joint knowledge base for decision-makers. To increase its policy relevance, HTA tries to extend...... beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs...

  19. Using task analysis to improve the requirements elicitation in health information system.

    Science.gov (United States)

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2007-01-01

    This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.

  20. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Directory of Open Access Journals (Sweden)

    Gafurov Andrey

    2018-01-01

    Full Text Available The specific nature of high-rise investment projects, entailing long-term construction, high risks, etc., implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved cost-benefit analysis algorithm for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the cost-benefit analysis algorithm for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the “Project analysis scenario” flowchart, improving the quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping, for better cost-benefit project analysis given the broad range of risks in high-rise construction; and analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in the implementation of high-rise projects.

  1. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Science.gov (United States)

    Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir

    2018-03-01

    The specific nature of high-rise investment projects, entailing long-term construction, high risks, etc., implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved cost-benefit analysis algorithm for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the cost-benefit analysis algorithm for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, improving the quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping, for better cost-benefit project analysis given the broad range of risks in high-rise construction; and analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in the implementation of high-rise projects.
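
Two of the methods these records list, the weighted average cost of capital and dynamic (discounted) cost-benefit values with sensitivity to the discount rate, can be sketched as follows. All project figures are illustrative assumptions:

```python
def wacc(equity, debt, cost_equity, cost_debt, tax_rate):
    """Weighted average cost of capital, with the debt tax shield."""
    total = equity + debt
    return (equity / total) * cost_equity + \
           (debt / total) * cost_debt * (1.0 - tax_rate)

def npv(rate, cash_flows):
    """Dynamic cost-benefit value: discounted sum, cash_flows[0] at t=0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative high-rise project: large up-front outlay, long payback
flows = [-100.0] + [14.0] * 12
r = wacc(equity=60, debt=40, cost_equity=0.14, cost_debt=0.08, tax_rate=0.20)
base = npv(r, flows)
stressed = npv(r + 0.02, flows)  # sensitivity to the discount rate
print(round(r, 4))      # → 0.1096
print(base > stressed)  # NPV falls as the discount rate rises
```

Repeating the `stressed` calculation across the risk-mapped variables (rate, cost overruns, delayed revenues) produces the critical-ratio sensitivity table the improved algorithm relies on.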

  2. Revisiting Ulchin 4 SGTR Accident - Analysis for EOP Improvement

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eun-Hye; Lee, Wook-Jo; Jerng, Dong-Wook [Chung-Ang University, Seoul (Korea, Republic of)

    2016-10-15

    The Steam Generator Tube Rupture (SGTR) is a design basis accident in which a U-tube inside the SG is breached so that reactor coolant is released through the broken tube. In operating Nuclear Power Plants (NPPs), maintaining the integrity of the core and preventing radiation release are paramount. Because of these risks, many researchers have studied SGTR scenarios, their impacts, and ways to mitigate them. One study was performed to provide an experimental database of aerosol particle retention and to develop models to support accident management interventions during SGTR. Scaled-down models of NPPs were used for the experiments, and MELCOR and SCDAP/RELAP5 were used to simulate a design basis SGTR accident. That study played a major role in resolving uncertainties in various physical models of aerosol mechanical resuspension. Another study analyzed an SGTR accident for the System-integrated Modular Advanced Reactor (SMART), focusing on the amount of break flow and using the TASS/SMRS code. It assumed that the maximum leak was generated, and found that high RCS pressure, low core inlet coolant temperature, and a low break location in the SG cassette contributed to leakage. Although the leakage was large, there was no direct release to the atmosphere because the pressure of the secondary loop was maintained below the safety relief valve set point. In the present analysis, the mitigation procedures for an SGTR occurring in shutdown condition and in full power condition were compared. In shutdown condition, core uncovery would not take place within 16 hours whether or not the cooling procedures are performed, so only the integrated amount of break flow needs to be considered; from this point of view, cooling through the intact SG only (case 3) is the best way to minimize the break flow. In full power condition, the core water level changes due to the high reactor power. The important thing to protect NPP is to keep

  3. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subjected to harsh conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in harsh conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. This leads inevitably to scattering of the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, alongside deterministic evaluations with computer codes, probabilistic and statistical methods to predict the structural component response. This work initiates the extension of the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting from a deterministic analysis performed with the CANTUP computer code, which was developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose, the structure of the deterministic CANTUP computer code was reviewed. The code was ported from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran PowerStation platform. To perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran PowerStation platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subjected to probabilistic evaluation. 
    All the values of these properties obtained for all the values for
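    The sampling step described above (a normal distribution centred on the deterministic value with a 5% standard deviation) can be sketched as follows; this is a minimal illustration using Python's standard library in place of the IMSL subroutine, and the 97 GPa nominal modulus is an assumed value, not one from the CANTUP code.

    ```python
    import random

    def sample_property(nominal, rel_std=0.05, n=10000, seed=42):
        """Generate pseudo-random values of a material property from a
        normal distribution centred on its deterministic value, with a
        5% relative standard deviation as in the study above."""
        rng = random.Random(seed)
        return [rng.gauss(nominal, rel_std * nominal) for _ in range(n)]

    # Illustrative nominal Young's modulus for a pressure tube, in GPa.
    samples = sample_property(97.0)
    mean = sum(samples) / len(samples)
    ```

    Each sampled value would then drive one deterministic run, and the spread of the resulting tube deflections and effective stresses gives the probabilistic response.
    
    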

  4. Improving Power System Stability Using Transfer Function: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    G. Shahgholian

    2017-10-01

    In this paper, a small-signal dynamic model of a single-machine infinite-bus (SMIB) power system that includes an IEEE type-ST1 excitation system and a PSS based on a transfer function structure is presented. The effects of changes in the operating condition of a power system on dynamic performance are examined. The dynamic performance of the closed-loop system is analyzed based on its eigenvalues. The effectiveness of parameter changes on dynamic stability is verified by simulation results. Three types of PSS are considered for analysis: (a) the derivative PSS (DPSS), (b) the lead-lag or conventional PSS (CPSS), and (c) the proportional-integral-derivative PSS (PID-PSS). The objective function is formulated to increase the damping ratio of the electromechanical mode eigenvalues. Simulation results show that the PID-PSS performs better, with less overshoot and shorter settling time compared with the CPSS and DPSS, under different load operations and significant system parameter variations.
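    The damping-ratio objective above follows directly from the real and imaginary parts of an electromechanical mode eigenvalue. A minimal sketch, in which the numeric mode is illustrative rather than taken from the paper's SMIB model:

    ```python
    def damping_ratio(eig):
        """Damping ratio of an oscillatory mode lambda = sigma + j*omega:
        zeta = -sigma / sqrt(sigma**2 + omega**2). A PSS is tuned to
        increase this value for the electromechanical mode."""
        return -eig.real / abs(eig)

    # Illustrative lightly damped electromechanical mode (rad/s):
    mode = complex(-0.5, 7.0)
    zeta = damping_ratio(mode)   # roughly 7% damping
    ```
    
    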

  5. Improving Human/Autonomous System Teaming Through Linguistic Analysis

    Science.gov (United States)

    Meszaros, Erica L.

    2016-01-01

    An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogeneous teams. Procedures exist for licensing pilots to operate in the national airspace system, and current work is being done to define methods for validating the function of autonomous systems; however, there is no method in place for assessing the interaction of these two disparate systems. Moreover, these systems are currently operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can lead to increased usability and allow for operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  6. System Engineering Analysis For Improved Scout Business Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    Van Slyke, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-01-30

    monitoring of content that is accessible. The study examines risks associated with information security, technological change, and the continued popularity of Scouting. Mitigation is based on the system functions that are defined. The approach to developing an improved system for facilitating Boy Scout leader functions was iterative, with insights into capabilities emerging in the course of working through the use cases and sequence diagrams.

  7. Quantitation of multisite EGF receptor phosphorylation using mass spectrometry and a novel normalization approach

    DEFF Research Database (Denmark)

    Erba, Elisabetta Boeri; Matthiesen, Rune; Bunkenborg, Jakob

    2007-01-01

    Using stable isotope labeling and mass spectrometry, we performed a sensitive, quantitative analysis of multiple phosphorylation sites of the epidermal growth factor (EGF) receptor. Phosphopeptide detection efficiency was significantly improved by using the tyrosine phosphatase inhibitor sodium p...

  8. Using robust statistics to improve neutron activation analysis results

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Ticianelli, Regina B.; Figueiredo, Ana Maria G.

    2011-01-01

    Neutron activation analysis (NAA) is an analytical technique in which an unknown sample is submitted to a neutron flux in a nuclear reactor, and its elemental composition is calculated by measuring the induced activity produced. In the relative NAA method, one or more well-characterized samples (usually certified reference materials - CRMs) are irradiated together with the unknown ones, and the concentration of each element is then calculated by comparing the areas of the gamma-ray peaks related to that element. When two or more CRMs are used as reference, the concentration of each element can be determined in several different ways, either using more than one gamma-ray peak for that element (when available), or using the results obtained in the comparison with each CRM. Therefore, determining the best estimate for the concentration of each element in the sample can be a delicate issue. In this work, samples from three CRMs were irradiated together and the elemental concentration in one of them was calculated using the other two as reference. Two sets of peaks were analyzed for each element: a smaller set containing only the literature-recommended gamma-ray peaks and a larger one containing all peaks related to that element that could be quantified in the gamma-ray spectra; the most recommended transition was also used as a benchmark. The resulting data for each element were then reduced using up to five different statistical approaches: the usual (and not robust) unweighted and weighted means, together with three robust means: the Limitation of Relative Statistical Weight, Normalized Residuals, and Rajeval. The resulting concentration values were then compared to the certified value for each element, allowing for discussion of both the performance of each statistical tool and the best choice of peaks for each element. (author)
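    As a sketch of the non-robust baseline among the statistical approaches listed (the unweighted and inverse-variance weighted means), with hypothetical peak results for a single element; the robust estimators named above refine this starting point when outlying peaks are present:

    ```python
    def unweighted_mean(values):
        return sum(values) / len(values)

    def weighted_mean(values, sigmas):
        """Inverse-variance weighted mean and its standard uncertainty."""
        weights = [1.0 / s ** 2 for s in sigmas]
        wsum = sum(weights)
        mean = sum(w * v for w, v in zip(weights, values)) / wsum
        return mean, wsum ** -0.5

    # Hypothetical concentrations (mg/kg) for one element, from four
    # gamma-ray peak / CRM combinations, with their uncertainties:
    vals = [10.2, 10.5, 9.9, 12.0]
    sigs = [0.2, 0.3, 0.25, 1.0]
    wm, wu = weighted_mean(vals, sigs)
    ```

    The imprecise fourth value pulls the unweighted mean to 10.65 but barely moves the weighted mean (about 10.21), which is why the choice of estimator, and of peaks, matters.
    
    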

  9. Comparative Phosphoproteomic Analysis of the Developing Seeds in Two Indica Rice ( Oryza sativa L.) Cultivars with Different Starch Quality.

    Science.gov (United States)

    Pang, Yuehan; Zhou, Xin; Chen, Yaling; Bao, Jinsong

    2018-03-21

    Protein phosphorylation plays important roles in the regulation of various molecular events such as plant growth and seed development. However, its involvement in starch biosynthesis is less understood. Here, a comparative phosphoproteomic analysis of two indica rice cultivars during grain development was performed. A total of 2079 and 2434 phosphopeptides from 1273 and 1442 phosphoproteins were identified, covering 2441 and 2808 phosphosites in indica rice 9311 and Guangluai4 (GLA4), respectively. Comparative analysis identified 303 differentially phosphorylated peptides, and 120 and 258 specifically phosphorylated peptides in 9311 and GLA4, respectively. Phosphopeptides in starch biosynthesis related enzymes such as AGPase, SSIIa, SSIIIa, BEI, BEIIb, PUL, and Pho1 were identified. GLA4 and 9311 had different amylose content, pasting viscosities, and gelatinization temperature, suggesting subtle differences in starch biosynthesis and regulation between GLA4 and 9311. Our study will give added impetus to further understanding the regulatory mechanism of starch biosynthesis at the phosphorylation level.

  10. IMPROVING FINANCIAL ANALYSIS OF ORGANIZATIONS IN ORDER TO PREVENT THEIR INSOLVENCY

    Directory of Open Access Journals (Sweden)

    V. N. Alferov

    2012-01-01

    The current regulatory analysis of the financial condition of insolvent organizations has some disadvantages and does not take into account the features of analysis based on consolidated financial statements under IFRS and GAAP. In this work, on the basis of a comparative analysis of the financial condition of a number of large Russian companies, calculated from their accounting statements prepared under Russian accounting standards, IFRS, and GAAP, proposals are developed to improve the analysis of the financial condition of insolvent organizations.

  11. IMPROVING FINANCIAL ANALYSIS OF ORGANIZATIONS IN ORDER TO PREVENT THEIR INSOLVENCY

    Directory of Open Access Journals (Sweden)

    V. N. Alferov

    2013-01-01

    The current regulatory analysis of the financial condition of insolvent organizations has some disadvantages and does not take into account the features of analysis based on consolidated financial statements under IFRS and GAAP. In this work, on the basis of a comparative analysis of the financial condition of a number of large Russian companies, calculated from their accounting statements prepared under Russian accounting standards, IFRS, and GAAP, proposals are developed to improve the analysis of the financial condition of insolvent organizations.

  12. The improvement gap in energy intensity: Analysis of China's thirty provincial regions using the improved DEA (data envelopment analysis) model

    International Nuclear Information System (INIS)

    Li, Ke; Lin, Boqiang

    2015-01-01

    Enacting a reduction target for energy intensity in provinces has become an important issue for the central and local governments in China. But the energy intensity index has provided little information about energy efficiency improvement potential. This study re-estimates the TFEE (total-factor energy efficiency) using an improved DEA (data envelopment analysis) model, which combines the super-efficiency and sequential DEA models to avoid the “discriminating power problem” and “technical regress”, and then uses it to calculate the TEI (target for energy intensity). The REI (improvement potential in energy intensity) is calculated as the difference between the TEI and the actual level of energy intensity. In application, we calculate the REIs for different provinces under the metafrontier and group-frontier respectively, and their ratios are the technology gaps for energy use. The main result shows that China's REIs fluctuate around 21%, 7.5% and 12% for Eastern, Central and Western China respectively; and Eastern China has the highest level of energy technology. These findings reveal that the energy intensities of China's provinces do not converge to the optimal level. Therefore, the target of energy-saving policy for regions should be to enhance the energy efficiency of the inefficient ones, and thereby reduce the gap for improvement in energy intensity across regions. - Highlights: • We present an improved DEA model to calculate the TFEE (total-factor energy efficiency). • The improved TFEE combines with a meta-frontier analysis. • We establish a new indicator for the improvement gap in energy intensity. • Improvement in energy intensity of regions in China is analysed
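    The relation between the efficiency score, the target intensity, and the improvement potential can be sketched as follows; the intensity and score values are illustrative assumptions, not the paper's estimates:

    ```python
    def energy_intensity_targets(actual_intensity, tfee):
        """Target energy intensity (TEI) implied by a total-factor
        energy efficiency score, and the improvement potential
        REI = actual - TEI."""
        tei = actual_intensity * tfee
        return tei, actual_intensity - tei

    # Illustrative province: actual intensity 1.25 (tce per 10^4 yuan,
    # assumed) with a TFEE score of 0.79 under the metafrontier.
    tei, rei = energy_intensity_targets(1.25, 0.79)
    share = rei / 1.25   # fraction of current intensity that could be saved
    ```

    With these assumed numbers the saving potential is 21% of the current intensity, matching the order of magnitude the abstract reports for Eastern China.
    
    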

  13. Logistics analysis to Improve Deployability (LOG-AID): Field Experiment/Results

    National Research Council Canada - National Science Library

    Evers, Kenneth

    2000-01-01

    .... Under sponsorship of the Air Force Research Laboratory Logistics Readiness Branch (AFRL/HESR), the Synergy team analyzed the current wing-level deployment process as part of the Logistics Analysis to Improve Deployability (LOG-AID) program...

  14. An Analysis of the Army Service Acquisition Review Requirements and the Perceived Effectiveness on Intended Improvements

    Science.gov (United States)

    2016-06-01

    June 2016. By: Roxanne Moss, Michael Vukovich, Megan Weidner. Keywords: selection evaluation, market research, metrics, competition, acquisition schedule, consolidation, recommendation.

  15. Functional improvement after carotid endarterectomy: demonstrated by gait analysis and acetazolamide stress brain perfusion SPECT

    International Nuclear Information System (INIS)

    Kim, J. S.; Kim, G. E.; Yoo, J. Y.; Kim, D. G.; Moon, D. H.

    2005-01-01

    Scientific documentation of neurologic improvement following carotid endarterectomy (CEA) has not been established. The purpose of this prospective study was to investigate whether CEA performed for internal carotid artery flow lesions improves gait and cerebrovascular hemodynamic status in patients with gait disturbance. We prospectively performed pre- and post-CEA gait analysis and acetazolamide stress brain perfusion SPECT (Acz-SPECT) with Tc-99m ECD in 91 patients (M/F: 81/10, mean age: 64.1 y) who had gait disturbance before receiving CEA. Gait performance was assessed using a Vicon 370 motion analyzer. Gait improvement after CEA was correlated with cerebrovascular hemodynamic change as well as symptom duration. As a control, 12 hemiparetic stroke patients (M/F: 9/3, mean age: 51 y) who did not receive CEA underwent gait analysis twice at a one-week interval to evaluate whether repeat testing of gait performance shows a learning effect. Of the 91 patients, 73 (80%) showed gait improvement (change of gait speed > 10%) and 42 (46%) showed marked improvement (change of gait speed > 20%), but no improvement was observed in the control group on repeat testing. Post-operative cerebrovascular hemodynamic improvement was noted in 49 (54%) of the 91 patients. Gait improvement was markedly greater in the group with cerebrovascular hemodynamic improvement than in the no-change group (p<0.05). Marked gait improvement and cerebrovascular hemodynamic improvement were noted in 53% and 61%, respectively, of the patients who had less than a 3-month history of symptoms, compared to 31% and 24% of the patients with a history longer than 3 months (p<0.05). Marked gait improvement was obtained in patients whose cerebrovascular hemodynamic status improved on Acz-SPECT after CEA. These results suggest that functional improvement such as gait can result from improved perfusion of the misery perfusion area, which remains viable for a longer period than previously reported in the literature.

  16. Improvements to the COBRA-TF (EPRI) computer code for steam generator analysis. Final report

    International Nuclear Information System (INIS)

    Stewart, C.W.; Barnhart, J.S.; Koontz, A.S.

    1980-09-01

    The COBRA-TF (EPRI) code has been improved and extended for pressurized water reactor steam generator analysis. New features and models have been added in the areas of subcooled boiling and heat transfer, turbulence, numerics, and global steam generator modeling. The code's new capabilities are qualified against selected experimental data and demonstrated for typical global and microscale steam generator analysis

  17. Cause-Effect Analysis: Improvement of a First Year Engineering Students' Calculus Teaching Model

    Science.gov (United States)

    van der Hoff, Quay; Harding, Ansie

    2017-01-01

    This study focuses on the mathematics department at a South African university and in particular on teaching of calculus to first year engineering students. The paper reports on a cause-effect analysis, often used for business improvement. The cause-effect analysis indicates that there are many factors that impact on secondary school teaching of…

  18. Design Improvements on Graded Insulation of Power Transformers Using Transient Electric Field Analysis and Visualization Technique

    OpenAIRE

    Yamashita, Hideo; Nakamae, Eihachiro; Namera, Akihiro; Cingoski, Vlatko; Kitamura, Hideo

    1998-01-01

    This paper deals with design improvements on graded insulation of power transformers using transient electric field analysis and a visualization technique. A calculation method for transient electric field analysis inside a power transformer impressed with impulse voltage is presented: first, an equivalent lumped electric network for the power transformer is constructed by dividing the transformer windings into several blocks and computing the electric circuit parameters.

  19. The Strategic Analysis as a Management Tool to Improve the Performance of National Enterprises

    Directory of Open Access Journals (Sweden)

    Shtal Tetiana V.

    2018-01-01

    The publication considers the issue of improving the performance of enterprises, in particular their international activities. To address this problem, the management of international activity development uses a variety of tools, one of which is strategic analysis, which makes it possible to analyze the overall status of an enterprise and to determine directions for improving its efficiency. The main methods of strategic analysis, and the appropriateness of their use depending on the goals and objectives set, are analyzed. Practical application of individual methods of strategic analysis (such as the Adizes corporate lifecycle model, Porter's «five forces» model of competitiveness, analysis of financial indicators and costs, PEST analysis, and SWOT analysis) is considered on the example of machine-building enterprises specializing in the production of turbo-expanders. Recommendations for improving their efficiency are offered.

  20. Peer-Assisted Analysis of Resident Feedback Improves Clinical Teaching: A Case Report.

    Science.gov (United States)

    Mai, Christine L; Baker, Keith

    2017-07-01

    Anesthesiologists play an important role in educating future clinicians. Yet few residency programs incorporate teaching skills into faculty development. Consequently, many anesthesiologists have limited training to supervise and educate residents. In turn, these attendings may receive negative feedback and poor evaluations from residents without a means to effectively improve. Peer-assisted teaching between faculty members may serve as a strategy to improve teaching skills. We report a case of peer-assisted analysis of resident feedback to identify specific areas of concern that were targeted for improvement. This approach resulted in improved teaching scores and feedback for the faculty member.

  1. Use-related risk analysis for medical devices based on improved FMEA.

    Science.gov (United States)

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
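    One common way grey relational theory enters an improved FMEA is to rank failure modes by their grey relational grade against an ideal rating series, instead of by the crisp severity-occurrence-detection RPN product. A minimal sketch under that assumption; the ratings, device, and ranking scheme are illustrative, and the paper's exact formulation may differ:

    ```python
    def grey_relational_grades(ratings, reference, rho=0.5):
        """Grey relational grade of each failure mode's (S, O, D) ratings
        against an ideal reference series. A lower grade means the mode
        is further from the ideal, i.e. higher risk in this ranking."""
        deltas = [[abs(x - r) for x, r in zip(row, reference)] for row in ratings]
        flat = [d for row in deltas for d in row]
        dmin, dmax = min(flat), max(flat)
        grades = []
        for row in deltas:
            coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
            grades.append(sum(coeffs) / len(coeffs))
        return grades

    # Hypothetical 1-10 severity/occurrence/detection ratings for three
    # use errors on an imaging device (illustrative, not from the paper):
    modes = [[8, 3, 4], [5, 6, 2], [9, 7, 6]]
    ideal = [1, 1, 1]   # best possible ratings
    grades = grey_relational_grades(modes, ideal)
    ```

    Here the third mode receives the lowest grade and so ranks as the highest use-related risk.
    
    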

  2. Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE)

    Science.gov (United States)

    2015-05-01

    May 13-14, 2015. Presented by Dr. Ying... Methods: Lexical Link Analysis (LLA) Core, with LLA reports and visualizations; Collaborative Learning Agents (CLA) for

  3. Improving the problem analysis in cost-benefit analysis for transport projects : An explorative study

    NARCIS (Netherlands)

    Annema, J.A.; Mouter, N.

    2013-01-01

    Key actors (consultants, scientists and policy makers) in the Netherlands transport policy cost-benefit analysis (CBA) practice consider ‘problem analysis’ to be one of the important CBA substantive problems. Their idea is that a good-quality problem analysis can help to identify proper solutions,

  4. Surface plasmon resonance thermodynamic and kinetic analysis as a strategic tool in drug design. Distinct ways for phosphopeptides to plug into Src- and Grb2 SH2 domains

    NARCIS (Netherlands)

    de Mol, Nico J; Dekker, Frank J; Broutin, Isabel; Fischer, Marcel J E; Liskamp, Rob M J; Dekker, Frank

    2005-01-01

    Thermodynamic and kinetic studies of biomolecular interactions give insight into specificity of molecular recognition processes and advance rational drug design. Binding of phosphotyrosine (pY)-containing peptides to Src- and Grb2-SH2 domains was investigated using a surface plasmon resonance

  5. Improvement of the computing speed of the FBR fuel pin bundle deformation analysis code 'BAMBOO'

    International Nuclear Information System (INIS)

    Ito, Masahiro; Uwaba, Tomoyuki

    2005-04-01

    JNC has developed a coupled analysis system consisting of a fuel pin bundle deformation analysis code, 'BAMBOO', and a thermal hydraulics analysis code, 'ASFRE-IV', for the purpose of evaluating the integrity of a subassembly under the BDI condition. This coupled analysis took much computation time because it needs convergent calculations to obtain numerically stationary solutions for thermal and mechanical behaviors. We improved the computation time of the BAMBOO code analysis to make the coupled analysis practicable. 'BAMBOO' is a FEM code, and as such its matrix calculations consume a large memory area to temporarily store intermediate results in the solution of simultaneous linear equations. The code used the hard disk drive (HDD) as virtual memory to save random access memory (RAM) of the computer. However, the use of the HDD increased the computation time because input/output (I/O) processing with the HDD took much time in data accesses. We improved the code so that it conducts I/O processing only with the RAM in matrix calculations and runs on high-performance computers. This improvement considerably increased the CPU occupation rate during the simulation and reduced the total simulation time of the BAMBOO code to about one-seventh of that before the improvement. (author)

  6. Maintaining and improving of the training program on the analysis software in CMS

    International Nuclear Information System (INIS)

    Malik, S; Hoehle, F; Lassila-Perini, K; Hinzmann, A; Wolf, R; Shipsey, I

    2012-01-01

    Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.

  7. Analysis of Human Errors in Industrial Incidents and Accidents for Improvement of Work Safety

    DEFF Research Database (Denmark)

    Leplat, J.; Rasmussen, Jens

    1984-01-01

    Methods for the analysis of work accidents are discussed, and a description is given of the use of a causal situation analysis in terms of a 'variation tree' in order to explain the course of events of the individual cases and to identify possible improvements. The difficulties in identifying...... 'causes' of accidents are discussed, and it is proposed to analyze accident reports with the specific aim of identifying the potential for future improvements rather than causes of past events. In contrast to traditional statistical analysis of work accident data, which typically give very general...... recommendations, the method proposed identifies very explicit countermeasures. Improvements require a change in human decisions during equipment design, work planning, or the execution itself. The use of a model of human behavior drawing a distinction between automated skill-based behavior, rule-based 'know...

  8. An Improved Biclustering Algorithm and Its Application to Gene Expression Spectrum Analysis

    OpenAIRE

    Qu, Hua; Wang, Liu-Pu; Liang, Yan-Chun; Wu, Chun-Guo

    2016-01-01

    The Cheng and Church algorithm is an important approach among biclustering algorithms. In this paper, the process of extending the bicluster in the second stage of the Cheng and Church algorithm is improved, and the selection of two important parameters is discussed. The results of the improved algorithm applied to gene expression spectrum analysis show that, compared with the Cheng and Church algorithm, the quality of the clustering results is obviously enhanced, the mined expression models are better, and the d...
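    For context, the score the Cheng and Church algorithm minimizes is the mean squared residue of a submatrix; a minimal sketch of that score, with toy matrices that are purely illustrative:

    ```python
    def mean_squared_residue(matrix):
        """H(I, J) score from the Cheng and Church biclustering algorithm:
        the mean squared residue of a submatrix. A perfectly coherent
        bicluster (rows differing only by additive shifts) scores 0."""
        n_rows, n_cols = len(matrix), len(matrix[0])
        row_means = [sum(r) / n_cols for r in matrix]
        col_means = [sum(matrix[i][j] for i in range(n_rows)) / n_rows
                     for j in range(n_cols)]
        total = sum(row_means) / n_rows
        h = 0.0
        for i in range(n_rows):
            for j in range(n_cols):
                res = matrix[i][j] - row_means[i] - col_means[j] + total
                h += res * res
        return h / (n_rows * n_cols)

    # Additive-pattern bicluster: each row is a shift of the others.
    coherent = [[1, 2, 3], [4, 5, 6], [2, 3, 4]]
    ```

    The extension stage mentioned above adds rows and columns while keeping this score below a threshold delta.
    
    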

  9. Internal environment analysis and its improvement in company Ltd "German Products Baltics"

    OpenAIRE

    Štekels, Jānis

    2012-01-01

    The topic of the Bachelor's thesis is "Internal environment analysis and its improvement in company Ltd 'German Products Baltics'". The objective of the thesis is to explore and analyze the internal environment of the company and to develop proposals for its improvement. The subject of the work is relevant because each company, before starting its business or changing something in its operations, should understand its strengths and weaknesses by analyzing the internal environment. The Bachelor's thesis consists of ...

  10. An improved multiple linear regression and data analysis computer program package

    Science.gov (United States)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
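    As a sketch of the core calculation such a package automates, here is an ordinary least-squares fit in double precision with residuals and R-squared for a single independent variable; the data are illustrative, and NEWRAP extends this to multiple variables, cross plots, ANOVA tables, and t-statistics:

    ```python
    def simple_regression(x, y):
        """Ordinary least-squares slope and intercept, plus residuals
        and R^2, computed in double precision."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        intercept = my - slope * mx
        resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
        ss_res = sum(r * r for r in resid)
        ss_tot = sum((yi - my) ** 2 for yi in y)
        return slope, intercept, resid, 1.0 - ss_res / ss_tot

    # Illustrative data, close to y = 1 + 2x:
    slope, intercept, resid, r2 = simple_regression(
        [0.0, 1.0, 2.0, 3.0], [1.1, 2.9, 5.0, 7.2])
    ```

    Plotting `resid` against the independent and dependent variables, as NEWRAP does, is what reveals curvature or heteroscedasticity that the R-squared alone hides.
    
    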

  11. Virtual reality for improving balance in patients after stroke: A systematic review and meta-analysis.

    Science.gov (United States)

    Li, Zhen; Han, Xiu-Guo; Sheng, Jing; Ma, Shao-Jun

    2016-05-01

    To evaluate the effectiveness of virtual reality interventions for improving balance in people after stroke. Systematic review and meta-analysis of randomized controlled trials. Studies were obtained by searching the following databases: MEDLINE, CINAHL, EMBASE, Web of Science, and CENTRAL. Two reviewers assessed studies for inclusion, extracted data, and assessed trial quality. Sixteen studies involving 428 participants were included. People who received virtual reality interventions showed marked improvements in the Berg Balance Scale (mean difference: 1.46, 95% confidence interval: 0.09-2.83), supporting the use of virtual reality to improve balance after stroke. © The Author(s) 2015.
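    A pooled estimate like the Berg Balance Scale result above comes from combining per-study mean differences; a minimal fixed-effect inverse-variance sketch, where the study numbers are illustrative rather than the review's actual data:

    ```python
    def pooled_mean_difference(mean_diffs, std_errs):
        """Fixed-effect inverse-variance pooling of per-study mean
        differences; returns the pooled estimate and its 95% CI."""
        weights = [1.0 / se ** 2 for se in std_errs]
        wsum = sum(weights)
        pooled = sum(w * d for w, d in zip(weights, mean_diffs)) / wsum
        se = wsum ** -0.5
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    # Three hypothetical trials: mean difference and standard error each.
    md, ci = pooled_mean_difference([1.2, 2.0, 0.8], [0.9, 1.1, 1.4])
    ```

    A confidence interval whose lower bound stays above zero, as in the review's result, is what marks the pooled improvement as statistically significant.
    
    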

  12. Waste Minimization Improvements Achieved Through Six Sigma Analysis Result In Significant Cost Savings

    International Nuclear Information System (INIS)

    Mousseau, Jeffrey D.; Jansen, John R.; Janke, David H.; Plowman, Catherine M.

    2003-01-01

    Improved waste minimization practices at the Department of Energy's (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) are leading to a 15% reduction in the generation of hazardous and radioactive waste. Bechtel BWXT Idaho, LLC (BBWI), the prime management and operations contractor at the INEEL, applied the Six Sigma improvement process to the INEEL Waste Minimization Program to review existing processes and define opportunities for improvement. Our Six Sigma analysis team, composed of an executive champion, a process owner, a Black Belt and a Yellow Belt, and technical and business team members, used this statistics-based process approach to analyze work processes and produced ten recommendations for improvement. Recommendations ranged from waste generator financial accountability for newly generated waste to enhanced employee recognition programs for waste minimization efforts. These improvements have now been implemented to reduce waste generation rates and are producing positive results.

  13. Improvement on reaction model for sodium-water reaction jet code and application analysis

    International Nuclear Information System (INIS)

    Itooka, Satoshi; Saito, Yoshinori; Okabe, Ayao; Fujimata, Kazuhiro; Murata, Shuuichi

    2000-03-01

    In selecting a reasonable DBL for a steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. Improvements to the sodium-water reaction (SWR) jet code (LEAP-JET ver. 1.30) and application analyses against water injection tests were performed to confirm the validity of the code. In the code improvement, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on the calculation mesh size. Test calculations using the improved code (LEAP-JET ver. 1.40) were carried out for the conditions of the SWAT-3 Run-19 test and for an actual-scale SG. The results confirm that the predicted SWR jet behavior and the model's influence on the analysis results are reasonable. For the application to the water injection tests, water injection behavior and SWR jet behavior in the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were analyzed using the LEAP-BLOW and LEAP-JET codes. In the LEAP-BLOW application, a parameter survey identified the injection nozzle diameter needed to reproduce the water leak rate. In the LEAP-JET application, the temperature behavior of the SWR jet was investigated. (author)

  14. Improvement in Student Data Analysis Skills after Out-of-Class Assignments

    Directory of Open Access Journals (Sweden)

    Kristen Lee Williams Walton

    2016-12-01

    Full Text Available The ability to understand and interpret data is a critical aspect of scientific thinking.  However, although data analysis is often a focus in biology majors classes, many textbooks for allied health majors classes are primarily content-driven and do not include substantial amounts of experimental data in the form of graphs and figures.  In a lower-division allied health majors microbiology class, students were exposed to data from primary journal articles as take-home assignments and their data analysis skills were assessed in a pre-/posttest format.  Students were given 3 assignments that included data analysis questions.  Assignments ranged from case studies that included a figure from a journal article to reading a short journal article and answering questions about multiple figures or tables.  Data were represented as line or bar graphs, gel photographs, and flow charts.  The pre- and posttest was designed incorporating the same types of figures to assess whether the assignments resulted in any improvement in data analysis skills.  The mean class score showed a small but significant improvement from the pretest to the posttest across three semesters of testing.  Scores on individual questions testing accurate conclusions and predictions improved the most.  This supports the conclusion that a relatively small number of out-of-class assignments through the semester resulted in a significant improvement in data analysis abilities in this population of students.
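
    A pre-/posttest gain of this kind is commonly checked with a paired t-test; a stdlib-only sketch on hypothetical student scores:

```python
import math
import statistics

def paired_t(pre, post):
    """Mean gain and t statistic for paired pre/post scores (df = n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample standard deviation of the gains
    return mean_d, mean_d / (sd_d / math.sqrt(len(diffs)))

# Hypothetical data-analysis scores (out of 10) for eight students
pre = [4, 5, 3, 6, 5, 4, 6, 5]
post = [6, 6, 4, 7, 7, 5, 6, 7]
gain, t_stat = paired_t(pre, post)
print(round(gain, 2), round(t_stat, 2))
```

    The t statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain a p-value.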

  15. Improvement of precision method of spectrophotometry with inner standardization and its use in plutonium solutions analysis

    International Nuclear Information System (INIS)

    Stepanov, A.V.; Stepanov, D.A.; Nikitina, S.A.; Gogoleva, T.D.; Grigor'eva, M.G.; Bulyanitsa, L.S.; Panteleev, Yu.A.; Pevtsova, E.V.; Domkin, V.D.; Pen'kin, M.V.

    2006-01-01

    The precision method of spectrophotometry with inner standardization is used for the analysis of pure Pu solutions. The spectrophotometer and the spectrophotometric method of analysis were improved to decrease the random component of the method's relative error. The influence of U and Np impurities and of corrosion products on the systematic component of the error, and the effect of fluoride ion on the completeness of Pu oxidation during sample preparation, were studied.

  16. Improvement potential of a real geothermal power plant using advanced exergy analysis

    International Nuclear Information System (INIS)

    Gökgedik, Harun; Yürüsoy, Muhammet; Keçebaş, Ali

    2016-01-01

    The main purpose of this paper is to quantitatively evaluate the thermodynamic performance of a geothermal power plant (GPP) from the point of view of its potential for improvement. Sources of inefficiency and irreversibility can be determined through exergy analysis. Advanced exergy analysis is better suited to determining the real potential for thermodynamic improvement of the system, because it splits exergy destruction into unavoidable and avoidable portions. The performance-critical components and the potential for exergy efficiency improvement of a GPP were determined by means of advanced exergy analysis. The plant studied is the Bereket GPP in Denizli, Turkey, a currently operating system. The results show that the avoidable portion of exergy destruction in all components except the turbines is higher than the unavoidable portion; therefore, much can be done to lessen the irreversibilities in the components of the Bereket GPP. The total exergy efficiency of the system is found to be 9.60%, and it can be increased up to 15.40% by making improvements in the overall components. Although the heat exchangers had lower exergy and modified exergy efficiencies, their exergy improvement potentials were high. Finally, the plant's old technology is believed to be one of the main reasons for the low efficiencies. - Highlights: • Evaluation of the potential for improvement of a GPP using advanced exergy analysis. • Efficiency can be increased up to 15.40% by making improvements in the components. • Heat exchangers have the highest avoidable exergy destruction, making them the least efficient components in the plant. • The old technology is believed to be a main reason for the low efficiencies.
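
    The avoidable/unavoidable split at the heart of advanced exergy analysis is a simple per-component decomposition; a sketch with hypothetical component values (not the Bereket plant figures):

```python
def improvement_potential(components):
    """Split each component's exergy destruction into avoidable and
    unavoidable parts: avoidable = E_D_total - E_D_unavoidable.

    components: {name: (E_D_total_kW, E_D_unavoidable_kW)}
    Returns {name: (avoidable_kW, avoidable_fraction)}.
    """
    report = {}
    for name, (e_d, e_d_un) in components.items():
        avoidable = e_d - e_d_un
        report[name] = (avoidable, avoidable / e_d)
    return report

# Hypothetical exergy destruction figures for three plant components (kW)
plant = {"turbine": (500.0, 400.0),
         "heat_exchanger": (300.0, 90.0),
         "pump": (50.0, 30.0)}
report = improvement_potential(plant)
for name, (avoidable, frac) in report.items():
    print(f"{name}: {avoidable:.0f} kW avoidable ({frac:.0%})")
```

    Components with a large avoidable fraction, like the hypothetical heat exchanger here, are where design or operational changes pay off most, which mirrors the abstract's conclusion.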

  17. An improved and explicit surrogate variable analysis procedure by coefficient adjustment.

    Science.gov (United States)

    Lee, Seunggeun; Sun, Wei; Wright, Fred A; Zou, Fei

    2017-06-01

    Unobserved environmental, demographic, and technical factors can negatively affect the estimation and testing of the effects of primary variables. Surrogate variable analysis, proposed to tackle this problem, has been widely used in genomic studies. To estimate hidden factors that are correlated with the primary variables, surrogate variable analysis performs principal component analysis either on a subset of features or on all features, but weighting each differently. However, existing approaches may fail to identify hidden factors that are strongly correlated with the primary variables, and the extra step of feature selection and weight calculation makes the theoretical investigation of surrogate variable analysis challenging. In this paper, we propose an improved surrogate variable analysis using all measured features that has a natural connection with restricted least squares, which allows us to study its theoretical properties. Simulation studies and real data analysis show that the method is competitive to state-of-the-art methods.
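
    The core idea, regressing the primary variable out of each feature and looking for a dominant direction in the residuals, can be sketched as follows (a simplified illustration of the surrogate-variable idea, not the coefficient-adjustment procedure of the paper; the data are toy values with a hidden batch factor):

```python
def first_pc(matrix, iters=200):
    """Leading principal component (over columns) via power iteration."""
    n_cols = len(matrix[0])
    v = [1.0] + [0.0] * (n_cols - 1)  # deterministic start vector
    for _ in range(iters):
        xv = [sum(row[j] * v[j] for j in range(n_cols)) for row in matrix]
        w = [sum(matrix[i][j] * xv[i] for i in range(len(matrix)))
             for j in range(n_cols)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def surrogate_variable(features, primary):
    """Regress the primary variable out of each feature, then take the
    first principal component of the residuals as the surrogate variable."""
    n = len(primary)
    mean_p = sum(primary) / n
    var_p = sum((p - mean_p) ** 2 for p in primary)
    residuals = []
    for f in features:
        mean_f = sum(f) / n
        beta = sum((p - mean_p) * (x - mean_f)
                   for p, x in zip(primary, f)) / var_p
        residuals.append([x - mean_f - beta * (p - mean_p)
                          for p, x in zip(primary, f)])
    return first_pc(residuals)

# Toy data: 3 features over 6 samples, driven by a hidden batch factor
primary = [0, 0, 0, 1, 1, 1]
batch = [1, 0, 1, 0, 1, 0]  # unknown to the analysis
features = [[2 * b for b in batch],
            [p + 3 * b for p, b in zip(primary, batch)],
            [1.0 * b for b in batch]]
sv = surrogate_variable(features, primary)
```

    The recovered `sv` tracks the hidden batch variable (up to sign), and would then be included as a covariate when testing the primary variable's effects.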

  18. U.S. Forest Service Region 1 Lake Chemistry, NADP, and IMPROVE air quality data analysis

    Science.gov (United States)

    Jill Grenon; Mark Story

    2009-01-01

    This report was developed to address the need for comprehensive analysis of U.S. Forest Service (USFS) Region 1 air quality monitoring data. The monitoring data includes Phase 3 (long-term data) lakes, National Atmospheric Deposition Program (NADP), and Interagency Monitoring of Protected Visual Environments (IMPROVE). Annual and seasonal data for the periods of record...

  19. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…

  20. Improving torque per kilogram magnet of permanent magnet couplings using finite element analysis

    DEFF Research Database (Denmark)

    Högberg, Stig; Jensen, Bogi Bech; Bendixen, Flemming Buus

    2013-01-01

    This paper presents the methodology and subsequent findings of a performance-improvement routine that employs automated finite element (FE) analysis to increase the torque-per-kilogram-magnet (TPKM) of a permanent magnet coupling (PMC). The routine is applied to a commercially available cylindrical...

  1. Thermal-hydraulic analysis and design improvement for coolant channel of ITER shield block

    International Nuclear Information System (INIS)

    Zhao Ling; Li Huaqi; Zheng Jiantao; Yi Jingwei; Kang Weishan; Chen Jiming

    2013-01-01

    As an important part of ITER, the shield block is used to shield against neutron heat. The structural design of the shield block, especially the design of the inner coolant channels, significantly influences its cooling effectiveness and safety. In this study, a thermal-hydraulic analysis of the shield block was performed with computational fluid dynamics software, optimization suggestions were proposed, and the thermal-hydraulic characteristics of the improved model were analyzed again. The results for the improved model show that the pressure drop through the flow path near the inlet and outlet regions of the shield block has been reduced, as has the total pressure drop along the cooling path; the uniformity of the mass flow rate and velocity distributions in the main cooling branches has been improved; and the local peak temperature of the solid domain is considerably lower, which avoids excessive thermal stress caused by uneven cooling. (authors)

  2. The improved Apriori algorithm based on matrix pruning and weight analysis

    Science.gov (United States)

    Lang, Zhenhong

    2018-04-01

    This paper draws on matrix compression and weight analysis algorithms and proposes an improved Apriori algorithm based on matrix pruning and weight analysis. After the transactional database is scanned only once, the algorithm constructs a Boolean transaction matrix. By counting the ones in the rows and columns of the matrix, infrequent item sets are pruned and a new candidate item set is formed. Then the item weights, transaction weights, and weighted support of the items are calculated, and the frequent item sets are obtained. The experimental results show that the improved Apriori algorithm not only reduces the number of repeated scans of the database, but also improves the efficiency of data correlation mining.
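
    The Boolean-matrix step can be sketched as follows: rows are transactions, columns are items; column counts of ones prune infrequent single items, and candidate supports come from row-wise conjunction (a minimal sketch without the weighting step described in the abstract):

```python
from itertools import combinations

def matrix_apriori(transactions, items, min_support):
    """Frequent itemsets from a Boolean transaction matrix."""
    # Build the Boolean matrix: one row per transaction, one column per item.
    matrix = [[int(item in t) for item in items] for t in transactions]
    # Prune infrequent single items by counting ones in each column.
    col_sum = {it: sum(row[j] for row in matrix) for j, it in enumerate(items)}
    frequent = {frozenset([it]): c for it, c in col_sum.items() if c >= min_support}
    k, current = 2, list(frequent)
    while current:
        candidates = {a | b for a, b in combinations(current, 2) if len(a | b) == k}
        current = []
        for cand in candidates:
            cols = [items.index(it) for it in cand]
            # Support of a candidate = rows whose columns are all ones.
            support = sum(all(row[j] for j in cols) for row in matrix)
            if support >= min_support:
                frequent[cand] = support
                current.append(cand)
        k += 1
    return frequent

baskets = [{"milk", "bread"}, {"milk", "eggs"},
           {"milk", "bread", "eggs"}, {"bread"}]
result = matrix_apriori(baskets, ["milk", "bread", "eggs"], min_support=2)
print(result)
```

    The single database scan happens when the matrix is built; all later support counts operate on the in-memory matrix, which is the source of the speed-up the abstract describes.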

  3. Model extension and improvement for simulator-based software safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H.-W. [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China) and Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China)]. E-mail: hwhwang@iner.gov.tw; Shih Chunkuan [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China); Yih Swu [Department of Computer Science and Information Engineering, Ching Yun University, 229 Chien-Hsin Road, Jung-Li, Taoyuan County 320, Taiwan (China); Chen, M.-H. [Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Lin, J.-M. [Taiwan Power Company (TPC), 242 Roosevelt Road, Section 3, Taipei 100, Taiwan (China)

    2007-05-15

    One of the major concerns when employing digital I&C systems in nuclear power plants is that a digital system may introduce new failure modes that differ from those of the previous analog I&C systems. Various techniques are under development to analyze the hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these are static analysis methods that cannot perform dynamic analysis or capture the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in IEEE Std 7-4.3.2-2003 (IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I&C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. Benchmarking against the ABWR SAR proves that this modified model is capable of accomplishing dynamic, system-level software safety analysis and performs better than the static methods. The improved plant simulation can then be further applied to hazard analysis of operator/digital I&C interface interaction failures and to hardware-in-the-loop fault injection studies.

  4. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase II

    Science.gov (United States)

    2014-04-30

    Eleventh Annual Acquisition Research Symposium, Thursday Sessions, Volume II. Report date: 30 April 2014. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal, Phase II. vocabulary or lexicon, to describe the attributes and surrounding environment of the system. Lexical Link Analysis (LLA) is a form of text mining in which

  5. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III

    Science.gov (United States)

    2015-04-30

    Annual Acquisition Research Symposium, Thursday Sessions, Volume II. Report date: 30 April 2015. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal, Phase III. processes. Lexical Link Analysis (LLA) can help, by applying automation to reveal and depict, to decision-makers, the correlations, associations, and

  6. Powerplant productivity improvement study: policy analysis and incentive assessment. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    Policy options that the Illinois Commerce Commission might adopt in order to promote improved power plant productivity for existing units in Illinois are identified and analyzed. These policy options would generally involve either removing existing disincentives and/or adding direct incentives through the regulatory process. The following activities are reported: in-depth review of existing theoretical and empirical literature in the areas of power plant reliability, regulatory utility efficiency and performance incentives, and impacts of various regulatory mechanisms such as the Fuel Adjustment Clauses on productivity; contacts with other state public utility commissions known to be investigating or implementing productivity improvement incentive mechanisms; documentation and analysis of incentive mechanisms adopted or under consideration in other states; analysis of current regulatory practice in Illinois as it relates to power plant productivity incentives and disincentives; identification of candidate incentive mechanisms for consideration by the Illinois Commerce Commission; and analysis and evaluation of these candidates. 72 references, 8 figures.

  7. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors behind latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was then applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses shortcomings of existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
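
    The ranking step can be illustrated with crisp TOPSIS, which the fuzzy variant generalizes by replacing crisp scores with fuzzy numbers; the criteria and scores below are hypothetical, not the study's data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with crisp TOPSIS.

    matrix: rows = alternatives, columns = criteria scores.
    benefit[j]: True if larger is better for criterion j.
    Returns closeness coefficients in [0, 1] (higher = better).
    """
    n_crit = len(weights)
    # Vector-normalise each column, then apply the criterion weight.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - i) ** 2 for x, i in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical error factors scored on severity, frequency, ease of fix
factors = [[7, 8, 6], [5, 4, 9], [8, 6, 3]]
scores = topsis(factors, weights=[0.5, 0.3, 0.2], benefit=[True, True, True])
print([round(s, 3) for s in scores])
```

    In the fuzzy variant, each score would be a triangular fuzzy number and the distances would be computed between fuzzy quantities before defuzzification.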

  8. Linnorm: improved statistical analysis for single cell RNA-seq expression data.

    Science.gov (United States)

    Yip, Shun H; Wang, Panwen; Kocher, Jean-Pierre A; Sham, Pak Chung; Wang, Junwen

    2017-12-15

    Linnorm is a novel normalization and transformation method for the analysis of single-cell RNA sequencing (scRNA-seq) data. Linnorm is designed to remove technical noise while preserving biological variation in scRNA-seq data, so that existing statistical methods can be improved. Using real scRNA-seq data, we compared Linnorm with existing normalization methods, including NODES, SAMstrt, SCnorm, scran, DESeq and TMM. Linnorm shows advantages in speed, technical noise removal and preservation of cell heterogeneity, which can improve existing methods in the discovery of novel subtypes, pseudo-temporal ordering of cells, clustering analysis, etc. Linnorm also performs better than existing DEG analysis methods, including BASiCS, NODES, SAMstrt, Seurat and DESeq2, in false positive rate control and accuracy. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
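
    Linnorm's own model is more involved, but the baseline problem it addresses, sequencing-depth differences between cells, can be illustrated with a plain library-size normalization followed by a log transform (this is NOT the Linnorm algorithm, just the baseline idea):

```python
import math

def cpm_log(counts):
    """Counts-per-million normalisation plus log1p transform.

    counts: {cell: {gene: raw_count}}. Dividing by each cell's total count
    removes sequencing-depth differences before downstream analysis.
    """
    out = {}
    for cell, genes in counts.items():
        total = sum(genes.values())
        out[cell] = {g: math.log1p(1e6 * c / total) for g, c in genes.items()}
    return out

# Two cells with identical expression profiles at 10x different depth
cells = {"cell1": {"geneA": 10, "geneB": 90},
         "cell2": {"geneA": 100, "geneB": 900}}
norm = cpm_log(cells)
print(norm["cell1"]["geneA"], norm["cell2"]["geneA"])
```

    After normalization the two cells become indistinguishable, as they should be; methods like Linnorm additionally model the mean-variance relationship and filter unstable features.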

  9. Stakeholder analysis of the Programme for Improving Mental health carE (PRIME): baseline findings.

    Science.gov (United States)

    Makan, Amit; Fekadu, Abebaw; Murhar, Vaibhav; Luitel, Nagendra; Kathree, Tasneem; Ssebunya, Joshua; Lund, Crick

    2015-01-01

    The knowledge generated from evidence-based interventions in mental health systems research is seldom translated into policy and practice in low and middle-income countries (LMIC). Stakeholder analysis is a potentially useful tool in health policy and systems research to improve understanding of policy stakeholders and increase the likelihood of knowledge translation into policy and practice. The aim of this study was to conduct stakeholder analyses in the five countries participating in the Programme for Improving Mental health carE (PRIME); evaluate a template used for cross-country comparison of stakeholder analyses; and assess the utility of stakeholder analysis for future use in mental health policy and systems research in LMIC. Using an adapted stakeholder analysis instrument, PRIME country teams in Ethiopia, India, Nepal, South Africa and Uganda identified and characterised stakeholders in relation to the proposed action: scaling-up mental health services. Qualitative content analysis was conducted for stakeholder groups across countries, and a force field analysis was applied to the data. Stakeholder analysis of PRIME has identified policy makers (WHO, Ministries of Health, non-health sector Ministries and Parliament), donors (DFID UK, DFID country offices and other donor agencies), mental health specialists, the media (national and district) and universities as the most powerful, and most supportive actors for scaling up mental health care in the respective PRIME countries. Force field analysis provided a means of evaluating cross-country stakeholder power and positions, particularly for prioritising potential stakeholder engagement in the programme. Stakeholder analysis has been helpful as a research uptake management tool to identify targeted and acceptable strategies for stimulating the demand for research amongst knowledge users, including policymakers and practitioners. Implementing these strategies amongst stakeholders at a country level will
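
    The force field analysis applied to the stakeholder data can be sketched as a weighted balance of driving and restraining forces; the actors, scales, and scores below are hypothetical illustrations, not PRIME findings:

```python
def force_field(stakeholders):
    """Net force for a proposed change from stakeholder power and position.

    stakeholders: {name: (power 1-5, position -2..+2)}; positive positions
    drive the change, negative ones restrain it. Illustrative scales only.
    """
    driving = sum(p * s for p, s in stakeholders.values() if s > 0)
    restraining = -sum(p * s for p, s in stakeholders.values() if s < 0)
    return driving, restraining, driving - restraining

# Hypothetical actors scored on power and position toward scale-up
actors = {"ministry_of_health": (5, 2), "donor": (4, 1),
          "district_media": (2, -1), "private_insurers": (3, -2)}
driving, restraining, net = force_field(actors)
print(driving, restraining, net)
```

    A positive net force suggests the change can proceed if the high-power drivers are engaged; a negative one points to which restraining actors need attention first.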

  10. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small-break LOCA EEQ. The new M/E release analysis methodology has been extended to the M/E release analysis for containment design for large-break LOCA and the main steam line break (MSLB) accident, and named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, to CONTEMPT4/MOD5. The KIMERA methodology is applied here to the MSLB M/E release analysis to validate it for MSLB in containment design. The results are compared with the OPR1000 FSAR.

  11. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Christopher Carling

    2006-03-01

    Full Text Available DESCRIPTION This book addresses and clearly explains soccer match analysis, looks at the very latest in match analysis research and at the innovative technologies used by professional clubs, and bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a practical manual on soccer match analysis for coaches and sport scientists, so that professionals in this field can gather objective data on the players and the team, which in turn can be used by coaches and players to learn more about performance as a whole and gain a competitive advantage. The book efficiently meets these objectives. AUDIENCE The book is targeted at the athlete, the coach, the sports science professional, and any sport-conscious person who wishes to analyze relevant soccer performance. The editors and the contributors are authorities in their respective fields, and this handbook depends on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive, as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  12. Transition towards improved regional wood flows by integrating material flux analysis and agent analysis. The case of Appenzell Ausserrhoden, Switzerland

    International Nuclear Information System (INIS)

    Binder, Claudia R.; Hofer, Christoph; Wiek, Arnim; Scholz, Roland W.

    2004-01-01

    This paper discusses the integration of material flux analysis and agent analysis as the basis for a transition towards improved regional wood management in Appenzell Ausserrhoden (AR), a small Swiss canton located in the Pre-Alps of Switzerland. We present a wood flow analysis for forests, wood processing industries and consumption in AR, accounting for different wood products. We find that the forest is currently significantly underutilized although there are sizeable imports of wood and fuel to this small region. The underutilization of the forest contributes to a skewed age distribution, jeopardizing long-term sustainable development of the forest, as the fulfillment of its protective and production function are likely to be at risk. The wood resources, however, are capable of satisfying current wood demand among the population of AR and wood could even be exported. Underutilization has two main causes: first, wood prices are so low that harvesting trees is a money-losing proposition; second, consumer wood demand and the current supply from forest owners are not aligned. Furthermore, cultural values, lifestyle trends and traditions make an alignment of supply and demand difficult. Consensus and strategy building with the relevant stakeholders on the basis of the results obtained from the wood flow analysis and agent analysis is a reasonable next step to take. We conclude that wood flow analysis combined with agent analysis provide a useful and straightforward tool to be used as the basis of a transition process towards improved regional wood flows, which in turn should contribute to sustainable forest management

  13. An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks

    Science.gov (United States)

    Zhao, Peng-yuan; Huang, Xiao-ping

    2018-03-01

    The traditional spectral analysis method, which assumes a linear system, introduces errors when calculating the fatigue damage of details in liquid cargo tanks, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for assessing the fatigue damage in details of a liquid cargo tank is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies, and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the stress PSD. Taking an independent tank in an LNG carrier as an example, the improved spectral analysis method is shown to be much more accurate than the traditional one by comparing the calculated damage with results from the time-domain method. The proposed spectral analysis method is therefore more accurate for calculating the fatigue damage in details of ship liquid cargo tanks.
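
    The summation of same-frequency harmonic components described above amounts to adding complex phasors before forming the spectrum; a minimal sketch with hypothetical stress components:

```python
import cmath
from collections import defaultdict

def stress_spectrum(components):
    """Combine sinusoidal stress components into a one-sided spectrum.

    components: (frequency_hz, amplitude, phase_rad) tuples. Components at
    the same frequency are added as complex phasors, the analogue of
    summing same-frequency Fourier terms; the spectral ordinate for each
    frequency is amplitude**2 / 2 (mean-square contribution of a sinusoid).
    """
    phasors = defaultdict(complex)
    for freq, amp, phase in components:
        phasors[freq] += cmath.rect(amp, phase)
    return {f: abs(z) ** 2 / 2.0 for f, z in sorted(phasors.items())}

# Hypothetical stress components: two in-phase terms at 0.1 Hz, one at 0.2 Hz
comps = [(0.1, 3.0, 0.0), (0.1, 1.0, 0.0), (0.2, 2.0, 1.57)]
spectrum = stress_spectrum(comps)
print(spectrum)
```

    Because phases are retained until the components are combined, two in-phase terms at the same frequency reinforce each other (here 3 + 1 gives amplitude 4), which a simple addition of individual PSDs would miss.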

  14. Root Cause Analysis and Productivity Improvement Of An Apparel Industry In Bangladesh Through Kaizen Implementation

    Directory of Open Access Journals (Sweden)

    Taposh Kumar Kapuria

    2017-12-01

    Full Text Available The garment industry is playing the pioneering role in improving Bangladesh's economic condition. It started in the late 1970s and is now the leading foreign currency earner for Bangladesh. It is no exaggeration to say that the Bangladesh garment industry is improving garment service quality and adopting innovative design features to survive in the globally competitive market. Global competition in the garment market is changing day by day, and leading garment manufacturers from all over the world are adopting new innovative features and techniques to sustain themselves in this fiercely competitive market. Bangladeshi garment manufacturers are not lagging behind: they also emphasize better service quality by adding the latest design features and using the latest technologies in their garments. The sole purpose of this paper is to identify the root causes of sewing defects in an apparel industry in Bangladesh and to reduce those defects continuously through the Kaizen (continuous improvement) system; in short, to improve the productivity of the apparel industry. The garment manufacturing company studied is ABONTI Color Tex. Ltd. Pareto analysis is used to identify the top defect items, cause-effect analysis helps to identify the root causes of the sewing defects, and Kaizen is then used for continuous minimization of the sewing defects.

  15. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    Science.gov (United States)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool that lets the business leaders of XYZ Hospital see what is actually happening in the business process that has caused long lead times for self-produced medicines in its pharmacy unit, a problem that has triggered many complaints from patients. After deploying this tool, the team found that the pharmacy unit lacks storage and a capsule-packing tool for processing the medicines, and that this condition has caused considerable wasted time in the process. The team therefore proposed that the business leaders procure the required tools in order to shorten the process. This research has resulted in a lead time shortened from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, with the %VA (value-added activity) or Process Cycle Efficiency (PCE) increased from 66% to 68% (considered lean because it is above the required 30%). This result proves that process effectiveness has been increased by the improvement.

  16. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    International Nuclear Information System (INIS)

    Jonny; Nasution, Januar

    2013-01-01

    Value stream mapping is a tool that lets the business leaders of XYZ Hospital see what is actually happening in the business process that has caused long lead times for self-produced medicines in its pharmacy unit, a problem that has triggered many complaints from patients. After deploying this tool, the team found that the pharmacy unit lacks storage and a capsule-packing tool for processing the medicines, and that this condition has caused considerable wasted time in the process. The team therefore proposed that the business leaders procure the required tools in order to shorten the process. This research has resulted in a lead time shortened from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, with the %VA (value-added activity) or Process Cycle Efficiency (PCE) increased from 66% to 68% (considered lean because it is above the required 30%). This result proves that process effectiveness has been increased by the improvement.
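
    The Process Cycle Efficiency figures quoted in both records follow directly from the definition PCE = value-added time / lead time; the value-added minutes below are back-calculated from the reported percentages, so they are estimates rather than measured values:

```python
def process_cycle_efficiency(value_added_min, lead_time_min):
    """PCE = value-added time / total lead time; >30% is commonly called lean."""
    return value_added_min / lead_time_min

# Lead times from the abstract (45 -> 30 min); value-added minutes are
# back-calculated from the reported 66% and 68% PCE figures.
pce_before = process_cycle_efficiency(29.7, 45)
pce_after = process_cycle_efficiency(20.4, 30)
print(f"{pce_before:.0%} -> {pce_after:.0%}")
```

    Note that the improvement cut total lead time by a third while the PCE only rose two points: most of the removed time was waste, but value-added time shrank proportionally as well.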

  17. Co-Inheritance Analysis within the Domains of Life Substantially Improves Network Inference by Phylogenetic Profiling.

    Directory of Open Access Journals (Sweden)

    Junha Shin

    Full Text Available Phylogenetic profiling, a network inference method based on gene inheritance profiles, has been widely used to construct functional gene networks in microbes. However, its utility for network inference in higher eukaryotes has been limited. An improved algorithm with an in-depth understanding of pathway evolution may overcome this limitation. In this study, we investigated the effects of taxonomic structures on co-inheritance analysis using 2,144 reference species in four query species: Escherichia coli, Saccharomyces cerevisiae, Arabidopsis thaliana, and Homo sapiens. We observed three clusters of reference species based on a principal component analysis of the phylogenetic profiles, which correspond to the three domains of life (Archaea, Bacteria, and Eukaryota), suggesting that pathways inherit primarily within specific domains or lower-ranked taxonomic groups during speciation. Hence, the co-inheritance pattern within a taxonomic group may be eroded by confounding inheritance patterns from irrelevant taxonomic groups. We demonstrated that co-inheritance analysis within domains substantially improved network inference not only in microbe species but also in the higher eukaryotes, including humans. Although we observed two sub-domain clusters of reference species within Eukaryota, co-inheritance analysis within these sub-domain taxonomic groups only marginally improved network inference. Therefore, we conclude that co-inheritance analysis within domains is the optimal approach to network inference with the given reference species. The construction of a series of human gene networks with increasing sample sizes of the reference species for each domain revealed that the size of the high-accuracy networks increased as additional reference species genomes were included, suggesting that within-domain co-inheritance analysis will continue to expand human gene networks as genomes of additional species are sequenced. Taken together, we propose that co
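
    Restricting the profile comparison to one domain's reference species, as the within-domain co-inheritance analysis does, can be sketched with a masked similarity measure over toy binary profiles (the profiles and domain split below are illustrative, not the study's data):

```python
def profile_similarity(p1, p2, mask=None):
    """Jaccard similarity of two binary phylogenetic profiles, optionally
    restricted to the reference species selected by `mask` (e.g. one domain)."""
    idx = [i for i in range(len(p1)) if mask is None or mask[i]]
    both = sum(p1[i] and p2[i] for i in idx)
    either = sum(p1[i] or p2[i] for i in idx)
    return both / either if either else 0.0

# Toy profiles over 8 reference species: first 4 bacteria, last 4 eukaryotes
gene_a = [1, 1, 0, 0, 1, 1, 1, 0]
gene_b = [0, 0, 1, 1, 1, 1, 1, 0]
eukaryote_mask = [0, 0, 0, 0, 1, 1, 1, 1]

sim_all = profile_similarity(gene_a, gene_b)                  # across all species
sim_euk = profile_similarity(gene_a, gene_b, eukaryote_mask)  # within one domain
print(sim_all, sim_euk)
```

    Here the two genes are perfectly co-inherited within the eukaryote subset but look only weakly related over the full species panel, mirroring the paper's point that cross-domain inheritance patterns can erode a genuine within-domain signal.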

  18. Enrichment and separation of mono- and multiply phosphorylated peptides using sequential elution from IMAC prior to mass spectrometric analysis

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    Phospho-proteomics relies on methods for efficient purification and sequencing of phosphopeptides from highly complex biological systems using low amounts of starting material. Current methods for phosphopeptide enrichment, e.g., immobilized metal affinity chromatography and titanium dioxide chro...

  19. Improvement of burnup analysis for pebble bed reactors with an accumulative fuel loading scheme

    International Nuclear Information System (INIS)

    Simanullang, Irwan Liapto; Obara, Toru

    2015-01-01

    Given the limitations of natural uranium resources, innovative nuclear power plant concepts that increase the efficiency of nuclear fuel utilization are needed. The Pebble Bed Reactor (PBR) shows some potential to achieve high efficiency in natural uranium utilization. To simplify the PBR concept, a PBR with an accumulative fuel loading scheme was introduced and the Fuel Handling System (FHS) removed. In this concept, pebble balls are added little by little into the reactor core until they reach the top of the core, and all pebble balls are discharged at the end of the operation period. A code based on the MVP/MVP-BURN method has been developed to analyze a PBR with the accumulative fuel loading scheme, and the optimum fuel composition for high burnup performance was found using the code. Previous efforts motivated several improvements to the burnup analysis. First, some errors in the input code were corrected; this correction, together with an overall simplification of the input code, makes analysis of a PBR with the accumulative fuel loading scheme easier. Second, the optimum fuel design had previously been obtained only in infinite geometry. To improve the optimum fuel composition, a parametric survey was performed in finite geometry, varying the amount of Heavy Metal (HM) uranium per pebble and the degree of uranium enrichment. The results show that these improvements in the fuel composition and code input lead to more accurate analysis with the code. (author)

  20. The Effectiveness of Transactional Analysis Group-counseling on the Improvement of Couples’ Family Functioning

    Directory of Open Access Journals (Sweden)

    Ghorban Ali Yahyaee

    2015-06-01

    Full Text Available Background & Aims of the Study: Family functioning is among the most important factors ensuring the mental health of family members; disorder or disturbance in family functioning can cause many psychological problems for family members. The current study examined the effectiveness of transactional analysis group counseling on the improvement of couples' family functioning. Materials & Methods: The study used a semi-experimental design with pretest, posttest, follow-up, and a control group. The statistical population consisted of all couples referred to the psychological and counseling centers of Rasht city in 2012. Participants were initially selected by convenience sampling; after completing the Family Assessment Device and meeting the inclusion score, they were randomly assigned to an experimental or a control group (N = 8 couples per group). The experimental group participated in 12 sessions of group counseling based on transactional analysis, while the control group received no intervention. The gathered data were analyzed using analysis of covariance. Results: There were significant differences between the pretest and posttest scores of the experimental group (p < 0.05), suggesting that transactional analysis group therapy improved the dimensions of family functioning in couples. Conclusions: Transactional analysis group counseling can improve family functioning, and using this approach in work with couples is recommended.

  1. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3% of cases, mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
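The p-chart construction described above can be sketched as follows. This is a minimal illustration with invented subgroup counts; the control limits use the standard 3-sigma binomial formula, p-bar ± 3·sqrt(p-bar·(1 − p-bar)/n):

```python
# Minimal p-chart sketch for adverse-event rates (illustrative data only).

from math import sqrt

def p_chart(event_counts, sample_sizes):
    """Return the centre line and per-subgroup 3-sigma control limits."""
    p_bar = sum(event_counts) / sum(sample_sizes)
    limits = []
    for n in sample_sizes:
        s = 3 * sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - s), min(1.0, p_bar + s)))
    return p_bar, limits

def out_of_control(event_counts, sample_sizes):
    """Flag subgroups whose observed rate falls outside the limits."""
    p_bar, limits = p_chart(event_counts, sample_sizes)
    flags = []
    for c, n, (lo, hi) in zip(event_counts, sample_sizes, limits):
        p = c / n
        flags.append(p < lo or p > hi)
    return flags
```

A stable process keeps all subgroup rates inside the limits; a flagged subgroup signals special-cause variation that warrants intervention, which is how the abstract distinguishes stable from unstable processes.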

  2. Improvements and impacts of GRCh38 human reference on high throughput sequencing data analysis.

    Science.gov (United States)

    Guo, Yan; Dai, Yulin; Yu, Hui; Zhao, Shilin; Samuels, David C; Shyr, Yu

    2017-03-01

    Analyses of high throughput sequencing data start with alignment against a reference genome, which is the foundation for all re-sequencing data analyses. Each new release of the human reference genome has been augmented with improved accuracy and completeness. The latest release, GRCh38, is presumed to contribute more to high throughput sequencing data analysis by providing greater accuracy, but the amount of improvement had not yet been quantified. We conducted a study to compare genomic analysis results between the GRCh38 reference and its predecessor, GRCh37. Through analyses of alignment, single nucleotide polymorphisms, small insertions/deletions, copy number and structural variants, we show that GRCh38 offers overall more accurate analysis of human sequencing data. More importantly, GRCh38 produced fewer false positive structural variants. In conclusion, GRCh38 is an improvement over GRCh37 not only from the genome assembly aspect, but also in yielding more reliable genomic analysis results. Copyright © 2017. Published by Elsevier Inc.

  3. The effects of aromatherapy on sleep improvement: a systematic literature review and meta-analysis.

    Science.gov (United States)

    Hwang, Eunhee; Shin, Sujin

    2015-02-01

    To evaluate the existing data on aromatherapy interventions for improvement of sleep quality. Systematic literature review and meta-analysis on the effects of aromatherapy. Study Sources: Electronic databases, including the Korea Education and Research Information Service (KERIS), Korean studies Information Service System (KISS), National Assembly Library, and eight academies within the Korean Society of Nursing Science, were searched to identify studies published between 2000 and August 2013. Randomized controlled and quasi-experimental trials that included aromatherapy for the improvement of sleep quality. Of the 245 publications identified, 13 studies met the inclusion and exclusion criteria, and 12 studies were used in the meta-analysis. Meta-analysis of the 12 studies using a random-effects model revealed that the use of aromatherapy was effective in improving sleep quality (95% confidence interval [CI], 0.540-1.745; Z=3.716). Subgroup analysis revealed that inhalation aromatherapy (95% CI, 0.792-1.541; Z=6.107) was more effective than massage therapy (95% CI, 0.128-2.166; Z=2.205) in unhealthy (95% CI, 0.248-1.100; Z=3.100) and healthy (95% CI, 0.393-5.104; Z=2.287) participants, respectively. Readily available aromatherapy treatments appear to be effective and promote sleep. Thus, it is essential to develop specific guidelines for the efficient use of aromatherapy.
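The random-effects pooling reported above (a pooled effect with 95% CI and Z statistic) is commonly computed with the DerSimonian-Laird estimator of between-study variance; the abstract does not name the estimator, so that choice, and the study effects below, are assumptions for illustration:

```python
# Hedged sketch of a random-effects meta-analysis (DerSimonian-Laird tau^2).
# The effect sizes and variances below are invented, not the review's data.

from math import sqrt

def dersimonian_laird(effects, variances):
    """Pool effect sizes under a random-effects model; return pooled
    effect, 95% CI, Z statistic, and the tau^2 heterogeneity estimate."""
    w = [1 / v for v in variances]                    # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0   # between-study variance
    w_star = [1 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = sqrt(1 / sum(w_star))
    z = pooled / se
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, z, tau2

# Invented per-study standardized effects and their variances.
effects = [0.9, 1.2, 0.6, 1.5]
variances = [0.10, 0.20, 0.15, 0.25]
pooled, ci, z, tau2 = dersimonian_laird(effects, variances)
```

The CI and Z values quoted in the abstract (e.g. 95% CI 0.540-1.745, Z = 3.716) are outputs of exactly this kind of calculation on the review's own study-level data.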

  4. Enhancing the discussion of alternatives in EIA using principal component analysis leads to improved public involvement

    International Nuclear Information System (INIS)

    Kamijo, Tetsuya; Huang, Guangwei

    2017-01-01

    The purpose of this study is to show the effectiveness of principal component analysis (PCA) as a method of alternatives analysis useful for improving the discussion of alternatives and public involvement. This study examined public consultations by applying quantitative text analysis (QTA) to the minutes of meetings and showed a positive correlation between the discussion of alternatives and the sense of public involvement. The discussion of alternatives may improve public involvement. A multiple-criteria analysis table for alternatives with detailed scores may exclude the public from involvement because of the general public's limited capacity to understand the mathematical algorithm and to process large amounts of information. PCA allowed for the reduction of multiple criteria down to a small number of uncorrelated variables (principal components), a display of the merits and demerits of the alternatives, and potentially made the identification of preferable alternatives by the stakeholders easier. PCA is likely to enhance the discussion of alternatives and, as a result, lead to improved public involvement.
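The reduction step can be sketched as follows: PCA collapses correlated criteria columns into a leading uncorrelated component. The alternatives table below is invented, and the leading component is found by power iteration rather than a full eigendecomposition:

```python
# Hedged sketch: reducing a multi-criteria alternatives table to its first
# principal component via power iteration on the covariance matrix.

from math import sqrt

def first_principal_component(rows, iters=200):
    """Return the leading eigenvector, its eigenvalue, and the PC1 score
    of each alternative (row) in a criteria matrix."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    eigval = sum(v[a] * sum(cov[a][b] * v[b] for b in range(d))
                 for a in range(d))
    scores = [sum(xi[j] * v[j] for j in range(d)) for xi in x]
    return v, eigval, scores
```

When two criteria are strongly correlated, one component carries nearly all their information, which is what lets stakeholders compare alternatives along a few axes instead of a full score table.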

  5. Prediction of improvement in skin fibrosis in diffuse cutaneous systemic sclerosis: a EUSTAR analysis.

    Science.gov (United States)

    Dobrota, Rucsandra; Maurer, Britta; Graf, Nicole; Jordan, Suzana; Mihai, Carina; Kowal-Bielecka, Otylia; Allanore, Yannick; Distler, Oliver

    2016-10-01

    Improvement of skin fibrosis is part of the natural course of diffuse cutaneous systemic sclerosis (dcSSc). Recognising those patients most likely to improve could help tailor clinical management and support cohort enrichment for clinical trials. In this study, we aimed to identify predictors for improvement of skin fibrosis in patients with dcSSc. We performed a longitudinal analysis of the European Scleroderma Trials And Research (EUSTAR) registry including patients with dcSSc, fulfilling American College of Rheumatology criteria, baseline modified Rodnan skin score (mRSS) ≥7 and follow-up mRSS at 12±2 months. The primary outcome was skin improvement (decrease in mRSS of >5 points and ≥25%) at 1 year follow-up. A respective increase in mRSS was considered progression. Candidate predictors for skin improvement were selected by expert opinion and logistic regression with bootstrap validation was applied. From the 919 patients included, 218 (24%) improved and 95 (10%) progressed. Eleven candidate predictors for skin improvement were analysed. The final model identified high baseline mRSS and absence of tendon friction rubs as independent predictors of skin improvement. The baseline mRSS was the strongest predictor of skin improvement, independent of disease duration. An upper threshold between 18 and 25 performed best in enriching for progressors over regressors. Patients with advanced skin fibrosis at baseline and absence of tendon friction rubs are more likely to regress in the next year than patients with milder skin fibrosis. These evidence-based data can be implemented in clinical trial design to minimise the inclusion of patients who would regress under standard of care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
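The modelling approach named above (logistic regression with bootstrap validation) can be sketched as follows. Everything here is invented for illustration: the toy data, the single predictor, the gradient-ascent fit, and the bootstrap settings; the EUSTAR analysis used eleven candidate predictors and its own validation protocol:

```python
# Hedged sketch of logistic regression with bootstrap validation.
# Data and settings are invented; not the EUSTAR analysis.

import random
from math import exp

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """One-predictor logistic regression fitted by gradient ascent."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

def bootstrap_slope(xs, ys, n_boot=200, seed=1):
    """Percentile interval for the slope from bootstrap resamples."""
    rng = random.Random(seed)
    pool = list(range(len(xs)))
    slopes = []
    for _ in range(n_boot):
        sample = [rng.choice(pool) for _ in pool]
        _, b1 = fit_logistic([xs[i] for i in sample],
                             [ys[i] for i in sample])
        slopes.append(b1)
    slopes.sort()
    return slopes[int(0.025 * n_boot)], slopes[min(n_boot - 1, int(0.975 * n_boot))]

# Toy data: higher baseline mRSS -> higher chance of skin improvement.
mrss = [8, 10, 12, 15, 18, 22, 26, 30, 34, 38]
improved = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
```

A positive slope that survives bootstrap resampling is the kind of evidence behind "high baseline mRSS predicts improvement" in the abstract.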

  6. Analysis of transient heat conduction in a PWR fuel rod by an improved lumped parameter approach

    Energy Technology Data Exchange (ETDEWEB)

    Dourado, Eneida Regina G. [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Cotta, Renato M. [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Mecanica; Jian, Su, E-mail: eneidadourado@gmail.com, E-mail: sujian@nuclear.ufrj.br, E-mail: cotta@mecanica.ufrj.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2017-07-01

    This paper analyzes transient heat conduction in a nuclear fuel rod by an improved lumped parameter approach. One-dimensional transient heat conduction is considered, assuming circumferential symmetry and neglecting axial conduction. The thermal conductivity and specific heat of the fuel pellet are treated as temperature dependent, while the thermophysical properties of the cladding are considered constant. Hermite approximation of the integrals is used to obtain the average temperature and heat flux in the radial direction. Significant improvement over the classical lumped parameter formulation has been achieved. The proposed model can also be used in dynamic analysis of PWRs and in nuclear power plant simulators. (author)

  7. Analysis of transient heat conduction in a PWR fuel rod by an improved lumped parameter approach

    International Nuclear Information System (INIS)

    Dourado, Eneida Regina G.; Cotta, Renato M.; Jian, Su

    2017-01-01

    This paper analyzes transient heat conduction in a nuclear fuel rod by an improved lumped parameter approach. One-dimensional transient heat conduction is considered, assuming circumferential symmetry and neglecting axial conduction. The thermal conductivity and specific heat of the fuel pellet are treated as temperature dependent, while the thermophysical properties of the cladding are considered constant. Hermite approximation of the integrals is used to obtain the average temperature and heat flux in the radial direction. Significant improvement over the classical lumped parameter formulation has been achieved. The proposed model can also be used in dynamic analysis of PWR and nuclear power plant simulators. (author)
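The Hermite two-point approximations commonly used in improved lumped-parameter formulations can be illustrated numerically. In this minimal sketch (not the paper's code), H(0,0) is the ordinary trapezoidal rule and H(1,1) adds an endpoint-derivative correction, which is the source of the improvement over classical lumping; the integrand is a stand-in, not a real temperature profile:

```python
# Two-point Hermite integration formulas used in improved lumped models.
# The integrand below is an invented stand-in for a radial profile.

def h00(f, a, b):
    """H(0,0): the plain trapezoidal rule."""
    return (b - a) / 2 * (f(a) + f(b))

def h11(f, df, a, b):
    """H(1,1): trapezoidal rule corrected with endpoint derivatives."""
    h = b - a
    return h / 2 * (f(a) + f(b)) + h * h / 12 * (df(a) - df(b))

# Stand-in profile f(r) = r**3 on [0, 1]; its exact integral is 1/4.
f = lambda r: r ** 3
df = lambda r: 3 * r ** 2
```

For this cubic, H(1,1) reproduces the exact integral while H(0,0) overshoots by a factor of two, illustrating why the derivative-corrected average improves on the classical lumped (arithmetic-mean) formulation.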

  8. Comparative evaluation of the effects of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and xylitol-containing chewing gum on salivary flow rate, pH and buffering capacity in children: An in vivo study.

    Science.gov (United States)

    Hegde, Rahul J; Thakkar, Janhavi B

    2017-01-01

    This study aimed to compare and evaluate the changes in the salivary flow rate, pH, and buffering capacity before and after chewing casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and xylitol-containing chewing gums in children. Sixty children aged between 8 and 12 years were selected for the study. They were randomly divided into Group 1 (CPP-ACP chewing gum) and Group 2 (xylitol-containing chewing gum) comprising thirty children each. Unstimulated and stimulated saliva samples at 15 and 30 min interval were collected from all children. All the saliva samples were estimated for salivary flow rate, pH, and buffering capacity. Significant increase in salivary flow rate, pH, and buffering capacity from baseline to immediately after spitting the chewing gum was found in both the study groups. No significant difference was found between the two study groups with respect to salivary flow rate and pH. Intergroup comparison indicated a significant increase in salivary buffer capacity in Group 1 when compared to Group 2. Chewing gums containing CPP-ACP and xylitol can significantly increase the physiochemical properties of saliva. These physiochemical properties of saliva have a definite relation with caries activity in children.

  9. Comparative evaluation of the effects of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and xylitol-containing chewing gum on salivary flow rate, pH and buffering capacity in children: An in vivo study

    Directory of Open Access Journals (Sweden)

    Rahul J Hegde

    2017-01-01

    Full Text Available Aim: This study aimed to compare and evaluate the changes in the salivary flow rate, pH, and buffering capacity before and after chewing casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and xylitol-containing chewing gums in children. Materials and Methods: Sixty children aged between 8 and 12 years were selected for the study. They were randomly divided into Group 1 (CPP-ACP chewing gum) and Group 2 (xylitol-containing chewing gum) comprising thirty children each. Unstimulated and stimulated saliva samples at 15 and 30 min interval were collected from all children. All the saliva samples were estimated for salivary flow rate, pH, and buffering capacity. Results: Significant increase in salivary flow rate, pH, and buffering capacity from baseline to immediately after spitting the chewing gum was found in both the study groups. No significant difference was found between the two study groups with respect to salivary flow rate and pH. Intergroup comparison indicated a significant increase in salivary buffer capacity in Group 1 when compared to Group 2. Conclusion: Chewing gums containing CPP-ACP and xylitol can significantly increase the physiochemical properties of saliva. These physiochemical properties of saliva have a definite relation with caries activity in children.

  10. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.

    2015-01-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used a 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
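The decomposition step can be illustrated on toy data: project each pulse onto the leading principal component of the pulse set and use the projection as an energy estimator. This is a hedged sketch, not the authors' pipeline; the two-exponential pulse model and all parameters are invented:

```python
# Toy illustration of PCA-based pulse analysis on simulated pulses.
# Pulse shapes, amplitudes, and time constants are invented.

from math import exp, sqrt

def make_pulse(amplitude, n=64, tau_rise=2.0, tau_fall=10.0):
    """Simple two-exponential pulse model (invented stand-in)."""
    return [amplitude * (exp(-t / tau_fall) - exp(-t / tau_rise))
            for t in range(n)]

def first_pc(pulses, iters=100):
    """Leading principal component of a pulse set via power iteration,
    applied to X^T X without forming the covariance matrix explicitly."""
    n, d = len(pulses), len(pulses[0])
    mean = [sum(p[j] for p in pulses) / n for j in range(d)]
    x = [[p[j] - mean[j] for j in range(d)] for p in pulses]
    v = [1.0] * d
    for _ in range(iters):
        proj = [sum(xi[j] * v[j] for j in range(d)) for xi in x]
        w = [sum(proj[i] * x[i][j] for i in range(n)) for j in range(d)]
        norm = sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    scores = [sum(xi[j] * v[j] for j in range(d)) for xi in x]
    return v, scores

pulses = [make_pulse(a) for a in (1.0, 2.0, 3.0, 4.0)]
_, scores = first_pc(pulses)
```

Because these toy pulses are amplitude-scaled copies of one shape, the PC1 scores are linear in amplitude; real data add further components (shape changes, temperature drift) that this method captures as additional descriptive parameters.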

  11. Construction Delay Analysis Techniques—A Review of Application Issues and Improvement Needs

    Directory of Open Access Journals (Sweden)

    Nuhu Braimah

    2013-07-01

    Full Text Available The time for performance of a project is usually of the essence to the employer and the contractor. This has made it quite imperative for contracting parties to analyse project delays for purposes of making right decisions on potential time and/or cost compensation claims. Over the years, existing delay analysis techniques (DATs for aiding this decision-making have been helpful but have not succeeded in curbing the high incidence of disputes associated with delay claims resolutions. A major source of the disputes lies with the limitations and capabilities of the techniques in their practical use. Developing a good knowledge of these aspects of the techniques is of paramount importance in understanding the real problematic issues involved and their improvement needs. This paper seeks to develop such knowledge and understanding (as part of a wider research work via: an evaluation of the most common DATs based on a case study, a review of the key relevant issues often not addressed by the techniques, and the necessary improvements needs. The evaluation confirmed that the various techniques yield different analysis results for the same delay claims scenario, mainly due to their unique application procedures. The issues that are often ignored in the analysis but would also affect delay analysis results are: functionality of the programming software employed for the analysis, resource loading and levelling requirements, resolving concurrent delays, and delay-pacing strategy. Improvement needs by way of incorporating these issues in the analysis and focusing on them in future research work are the key recommendations of the study.

  12. Three success factors for continual improvement in healthcare: an analysis of the reports of improvement team members.

    Science.gov (United States)

    Brandrud, Aleidis Skard; Schreiner, Ada; Hjortdahl, Per; Helljesen, Gro Sævil; Nyen, Bjørnar; Nelson, Eugene C

    2011-03-01

    The objectives of the Breakthrough Series Collaborative are to close the gap between what we know and what we do, and to contribute to continuous quality improvement (CQI) of healthcare through collaborative learning. The improvement efforts are guided by a systematic approach, combining professional and improvement knowledge. To explore what the improvement teams have learnt from participating in the collaborative and from dealing with promoting and inhibiting factors encountered. Qualitative interviews with 19 team members were conducted in four focus groups, using the Critical Incident Technique. A critical incident is one that makes significant contributions, either positively or negatively, to an activity. The elements of a culture of improvement are revealed by the critical incidents, and reflect the eight domains of knowledge, as a product of collaborative learning. The improvement knowledge and skills of individuals are important elements, but not enough to achieve sustainable changes. 90% of the material reflects the need for a system of CQI to solve the problems that organisations experience in trying to make lasting improvements. A pattern of three success factors for CQI emerges: (1) continuous and reliable information, including measurement, about best and current practice; (2) engagement of everybody in all phases of the improvement work: the patient and family, the leadership, the professional environment and the staff; and (3) an infrastructure based on improvement knowledge, with multidisciplinary teams, available coaching, learning systems and sustainability systems.

  13. Effectiveness of Cognitive and Transactional Analysis Group Therapy on Improving Conflict-Solving Skill

    Directory of Open Access Journals (Sweden)

    Bahram A. Ghanbari-Hashemabadi

    2012-03-01

    Full Text Available Background: Today, learning communication skills such as conflict solving is very important. The purpose of the present study was to investigate the efficacy of cognitive and transactional analysis group therapy in improving conflict-solving skill. Materials and Method: This was an experimental study with pretest-posttest and a control group. Forty-five clients referred to the counseling and psychological services center of Ferdowsi University of Mashhad were selected by screening and randomly divided into three equal groups: a control group (15 participants), a cognitive experimental group (15 participants), and a transactional analysis group (15 participants). A conflict-solving questionnaire was used to collect data, and the interventions were cognitive and transactional analysis group therapy administered over 8 weekly two-hour sessions. Mean and standard deviation were used at the descriptive level and one-way ANOVA at the inferential level. Results: The conflict-solving skills of the two experimental groups increased significantly. Conclusion: The findings indicate that both cognitive and transactional analysis group therapy can be effective interventions for improving conflict-solving skills.

  14. Improvements of Physical Models in TRITGO code for Tritium Behavior Analysis in VHTR

    International Nuclear Information System (INIS)

    Yoo, Jun Soo; Tak, Nam Il; Lim, Hong Sik

    2010-01-01

    Tritium is a radioactive material with a half-life of 12.32 years, generated in a Very High Temperature gas-cooled Reactor (VHTR) core by ternary fission in the fuel as well as by neutron absorption reactions of impurities. Accurate prediction of tritium behavior and of its concentration in product hydrogen is therefore important for public safety. The TRITGO code was developed by General Atomics (GA) for estimating tritium production and distribution in high temperature gas-cooled reactors. However, some of its models are hard-wired to a specific reactor type or are overly simplified, which limits the applicability of the analysis results, so major improvements are needed for better predictions. In this study, several model improvements are suggested and their effects are evaluated through analysis of the PMR600 design concept.

  15. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    Energy Technology Data Exchange (ETDEWEB)

    Park, Dae Woong [Korea Testing and Research Institute, Kwachun (Korea, Republic of)

    2015-03-15

    A centrifuge works on the principle that particles with different densities separate at a rate proportional to the centrifugal force during high-speed rotation: dense particles are quickly precipitated, while particles with relatively smaller densities precipitate more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process, and measuring the improvement in sludge conveyance efficiency is central to this technology. In this study, a smoothed particle hydrodynamics (SPH) analysis of a sludge-conveying decanter centrifuge was performed to evaluate the efficiency improvement. The analysis was applied both to the original centrifuge model and to the modified design, a ball-plate rail model, to evaluate the sludge transfer efficiency.

  16. Multivariate analysis method for energy calibration and improved mass assignment in recoil spectrometry

    International Nuclear Information System (INIS)

    El Bouanani, Mohamed; Hult, Mikael; Persson, Leif; Swietlicki, Erik; Andersson, Margaretha; Oestling, Mikael; Lundberg, Nils; Zaring, Carina; Cohen, D.D.; Dytlewski, Nick; Johnston, P.N.; Walker, S.R.; Bubb, I.F.; Whitlow, H.J.

    1994-01-01

    Heavy ion recoil spectrometry is rapidly becoming a well established analysis method, but the associated data processing is still not well developed. The pronounced nonlinear response of silicon detectors to heavy ions leads to serious limitations and complications in mass gating, which is the principal factor in obtaining energy spectra with minimal cross talk between elements. To overcome this limitation, a simple empirical formula with an associated multiple regression method is proposed for the absolute energy calibration of the time-of-flight/energy-dispersive detector telescope used in recoil spectrometry. A radical improvement in mass assignment was realized, which allows more accurate depth profiling and makes the data processing much easier. ((orig.))
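A multiple-regression calibration of this kind can be sketched as ordinary least squares on an assumed linear model. The paper's empirical formula is not reproduced here; the model form E ≈ a + b·t + c·m and the synthetic data below are invented for illustration:

```python
# Hedged sketch: multiple linear regression for absolute energy calibration.
# Model form and data are invented; not the paper's empirical formula.

def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def fit_linear(X, y):
    """Least squares via the normal equations: minimise ||X beta - y||."""
    n, d = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(d)]
           for a in range(d)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(d)]
    return solve(XtX, Xty)
```

Fitting such a surface jointly in time-of-flight and mass is what lets the calibration absorb the detector's mass-dependent (nonlinear) response and thereby sharpen the mass gating.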

  17. Analysis of means of improving the uncontrolled lateral motions of personal airplanes

    Science.gov (United States)

    McKinney, Marion O., Jr.

    1951-01-01

    A theoretical analysis has been made of means of improving the uncontrolled motions of personal airplanes. The purpose of this investigation was to determine whether such airplanes could be made to fly uncontrolled for an indefinite period of time without getting into dangerous attitudes and for a reasonable period of time (1 to 3 min) without deviating excessively from their original course. The results of this analysis indicated that the uncontrolled motions of a personal airplane could be made safe as regards spiral tendencies and could be greatly improved as regards maintenance of course without resort to an autopilot. The only way to make the uncontrolled motions completely satisfactory as regards continuous maintenance of course, however, is to use a conventional type of autopilot.

  18. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    International Nuclear Information System (INIS)

    Park, Dae Woong

    2015-01-01

    A centrifuge works on the principle that particles with different densities will separate at a rate proportional to the centrifugal force during high-speed rotation. Dense particles are quickly precipitated, and particles with relatively smaller densities are precipitated more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process. This is a core technology for measuring the sludge conveyance efficiency improvement. In this study, a smoothed particle hydro-dynamic analysis was performed for a decanter centrifuge used to convey sludge to evaluate the efficiency improvement. This analysis was applied to both the original centrifugal model and the design change model, which was a ball-plate rail model, to evaluate the sludge transfer efficiency.

  19. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis.

    Science.gov (United States)

    Motooka, Masanobu; Uno, Shigeyasu

    2018-02-02

    Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced it by improving the fabrication process of the paper-based sensors. Finally, we evaluated the limit of detection of the sensor with sample vapor of ethanol gas. The experiment showed about a 50% reduction in the limit of detection compared to the previously reported sensor. This result suggests the sensor could be applied in diagnosis, such as for diabetes, by further lowering the limit of detection.
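The link between background current and detection limit can be sketched with the common 3-sigma definition, LOD = 3·sd(blank)/sensitivity: lowering the background (and its noise) directly lowers the LOD. The current values and sensitivity below are invented for illustration:

```python
# Minimal LOD sketch using the common 3-sigma definition.
# Blank-current readings and sensitivity are invented values.

def limit_of_detection(blank_currents, sensitivity):
    """3-sigma LOD from replicate blank (background) current readings."""
    n = len(blank_currents)
    mean = sum(blank_currents) / n
    sd = (sum((c - mean) ** 2 for c in blank_currents) / (n - 1)) ** 0.5
    return 3 * sd / sensitivity

# Background current in nA; sensitivity in nA per ppm ethanol (invented).
before = [12.0, 14.0, 10.0, 13.0, 11.0]   # noisy, high background
after = [2.0, 2.4, 1.8, 2.2, 2.1]         # reduced background
```

With the same sensitivity, the lower-noise background yields a markedly smaller LOD, which is the mechanism behind the roughly 50% improvement reported above.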

  20. Demand, capacity, and access of the outpatient clinic: A framework for analysis and improvement.

    Science.gov (United States)

    van Bussel, Erik Martijn; van der Voort, Marc Boudewijn Victor Rouppe; Wessel, Ronald N; van Merode, Godefridus G

    2018-06-01

    While theoretical frameworks for optimization of the outpatient processes are abundant, practical step-by-step analyses to give leads for improvement, to forecast capacity, and to support decision making are sparse. This article demonstrates how to evaluate and optimize the triad of demand, (future) capacity, and access time of the outpatient clinic using a structured six-step method. All individual logistical patient data of an orthopaedic outpatient clinic of one complete year were analysed using the six-step method to evaluate demand, supply, and access time. Trends in the data were retrospectively analysed and evaluated for potential improvements. A model for decision making was tested. Both the analysis of the method and actual results were considered as main outcomes. More than 25 000 appointments were analysed. The six-step method proved sufficient to yield valuable insights and leads for improvement. While the overall match between demand and capacity was considered adequate, the variability in capacity was much higher than in demand, thereby leading to delays in access time. Holidays and subsequent weeks proved to have a great influence on demand, capacity, and access time. Using the six-step method, several unfavourable characteristics of the outpatient clinic were revealed, and a better match between demand, supply, and access time could have been reached with only minor adjustments. Last, a clinic-specific prediction and decision model for demand and capacity was made using the six-step method. The six-step analysis can successfully be applied to redesign and improve the outpatient health care process. The results of the analysis showed that national holidays and variability in demand and capacity have a big influence on the outpatient clinic. Using the six-step method, practical improvements in outpatient logistics were easily found and leads for future decision making were derived. © 2018 The Authors Journal of Evaluation in Clinical Practice
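The central observation above, that capacity variability (not average capacity) drives access delays, can be sketched as a simple backlog recursion: unmet demand carries forward week by week. The weekly figures below are invented; this is not the article's model:

```python
# Hedged sketch of the demand/capacity/access-time triad.
# Weekly demand and capacity figures are invented for illustration.

def backlog_series(demand, capacity):
    """Carry unmet demand (the access-time backlog) forward each week."""
    backlog, series = 0, []
    for d, c in zip(demand, capacity):
        backlog = max(0, backlog + d - c)
        series.append(backlog)
    return series

demand = [100, 100, 100, 100]
steady_capacity = [100, 100, 100, 100]
holiday_capacity = [100, 40, 160, 100]  # holiday dip, then catch-up
```

Both capacity schedules supply 400 slots in total, yet the variable one builds a 60-appointment backlog in the holiday week, illustrating why matched totals alone do not guarantee short access times.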

  1. Analysis of human serum by liquid chromatography-mass spectrometry: improved sample preparation and data analysis.

    Science.gov (United States)

    Govorukhina, N I; Reijmers, T H; Nyangoma, S O; van der Zee, A G J; Jansen, R C; Bischoff, R

    2006-07-07

    Discovery of biomarkers is a fast-developing field in proteomics research. Liquid chromatography coupled on-line to mass spectrometry (LC-MS) has become a powerful method for the sensitive detection, quantification and identification of proteins and peptides in biological fluids like serum. However, the presence of highly abundant proteins often masks those of lower abundance and thus generally prevents their detection and identification in proteomics studies. To perform future comparative analyses of samples from a serum bank of cervical cancer patients in a longitudinal and cross-sectional manner, methodology based on the depletion of high-abundance proteins followed by tryptic digestion and LC-MS has been developed. Two sample preparation methods were tested in terms of their efficiency in depleting high-abundance serum proteins and how they affect the repeatability of the LC-MS data sets. The first method comprised depletion of human serum albumin (HSA) on a dye-ligand chromatographic support and of immunoglobulin G (IgG) on an immobilized Protein A support, followed by tryptic digestion, fractionation by cation-exchange chromatography, trapping on a C18 column and reversed-phase LC-MS. The second method included depletion of the six most abundant serum proteins based on multiple immunoaffinity chromatography, followed by tryptic digestion, trapping on a C18 column and reversed-phase LC-MS. Repeatability of the overall procedures was evaluated in terms of retention time and peak area for a selected number of endogenous peptides, showing that the second method, besides being less time consuming, gave more repeatable results (retention time: <0.1% RSD; peak area: <30% RSD). Application of an LC-MS component detection algorithm followed by principal component analysis (PCA) enabled discrimination of serum samples that were spiked with horse heart cytochrome C from non-spiked serum and the detection of a concentration trend, which correlated with the amount of spiked horse heart
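
    The final step described here, component detection followed by PCA to separate spiked from non-spiked serum, can be illustrated with a minimal SVD-based sketch on an invented peak-area table. The component-detection algorithm itself is not reproduced; all values and the choice of three "cytochrome C" columns are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical peak-area table: rows = serum runs, columns = detected components.
# The last three columns mimic tryptic peptides of spiked cytochrome C, present
# at increasing levels in the spiked samples only.
base = rng.normal(100.0, 5.0, size=(8, 12))
spike_levels = np.array([0, 0, 0, 0, 1, 2, 4, 8], dtype=float)
base[:, -3:] += 30.0 * spike_levels[:, None]

# PCA via SVD on the mean-centred matrix (no external fitting library needed).
X = base - base.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S           # sample coordinates in principal-component space

pc1 = scores[:, 0]
print("PC1 scores:", np.round(pc1, 1))
```

    In a setup like this, the first principal component tracks the spike level, which is the "concentration trend" the abstract refers to; the sign of PC1 is arbitrary, so only its correlation magnitude with the spike level is meaningful.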

  2. Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.

    Science.gov (United States)

    Echinaka, Yuki; Ozeki, Yukiyasu

    2016-10-01

    The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without assuming any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. On the basis of this method, the bootstrap method is introduced and a numerical criterion for discriminating the transition type is proposed.

  3. Numerical Analysis of a Centrifugal Fan for Improved Performance using Splitter Vanes

    OpenAIRE

    N. Yagnesh Sharma; K. Vasudeva Karanth

    2009-01-01

    The flow field in a centrifugal fan is highly complex with flow reversal taking place on the suction side of impeller and diffuser vanes. Generally performance of the centrifugal fan could be enhanced by judiciously introducing splitter vanes so as to improve the diffusion process. An extensive numerical whole field analysis on the effect of splitter vanes placed in discrete regions of suspected separation points is possible using CFD. This paper examines the effect of sp...

  4. Improvement of safety by analysis of costs and benefits of the system

    OpenAIRE

    T. Karkoszka; M. Andraczke

    2011-01-01

    Purpose: of the paper has been the assessment of the dependence between improvement of the implemented occupational health and safety management system and both minimization of costs connected with occupational health and safety assurance and optimization of real work conditions.Design/methodology/approach: used for the analysis has included definition of the occupational health and safety system with regard to the rules and tool allowing for occupational safety assurance in the organisationa...

  5. Transcriptome Analysis of Maize Immature Embryos Reveals the Roles of Cysteine in Improving Agrobacterium Infection Efficiency

    Science.gov (United States)

    Liu, Yan; Zhang, Zhiqiang; Fu, Junjie; Wang, Guoying; Wang, Jianhua; Liu, Yunjun

    2017-01-01

    Maize Agrobacterium-mediated transformation efficiency has been greatly improved in recent years. Antioxidants such as cysteine can significantly improve maize transformation frequency by improving the Agrobacterium infection efficiency. However, the mechanism underlying the transformation improvement after cysteine exposure has not been elucidated. In this study, we showed that the addition of cysteine to the co-cultivation medium significantly increased the Agrobacterium infection efficiency of hybrid HiII and inbred line Z31 maize embryos. Reactive oxygen species contents were higher in embryos treated with cysteine than in untreated embryos. We further investigated the mechanism behind the cysteine-related increase in infection efficiency using transcriptome analysis. The results showed that the cysteine treatment up-regulated 939 genes and down-regulated 549 genes in both Z31 and HiII. Additionally, more differentially expressed genes were found in HiII embryos than in Z31 embryos, suggesting that HiII was more sensitive to the cysteine treatment than Z31. GO analysis showed that the up-regulated genes were mainly involved in the oxidation reduction process. The up-regulation of these genes could help maize embryos to cope with the oxidative stress stimulated by Agrobacterium infection. The down-regulated genes were mainly involved in cell wall and membrane metabolism, such as aquaporin and expansin genes. Decreased expression of these cell wall integrity genes could loosen the cell wall, thereby improving the entry of Agrobacterium into plant cells. This study offers insight into the role of cysteine in improving Agrobacterium-mediated transformation of maize immature embryos. PMID:29089955

  6. Hyponatremia improvement is associated with a reduced risk of mortality: evidence from a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Giovanni Corona

    Full Text Available Hyponatremia is the most common electrolyte disorder and is associated with increased morbidity and mortality. However, there is no clear demonstration that improvement of the serum sodium concentration ([Na+]) counteracts the increased risk of mortality associated with hyponatremia. Thus, we performed a meta-analysis of the published studies that addressed the effect of hyponatremia improvement on mortality. A Medline, Embase and Cochrane search was performed to retrieve all English-language studies of human subjects published up to June 30th 2014, using the following words: "hyponatremia", "hyponatraemia", "mortality", "morbidity" and "sodium". Fifteen studies satisfied the inclusion criteria, encompassing a total of 13,816 patients. The identification of relevant abstracts, the selection of studies and the subsequent data extraction were performed independently by two of the authors, and conflicts were resolved by a third investigator. Across all fifteen studies, any improvement of hyponatremia was associated with a reduced risk of overall mortality (OR = 0.57 [0.40-0.81]). The association was even stronger when only those studies (n = 8) reporting a threshold for serum [Na+] improvement to >130 mmol/L were considered (OR = 0.51 [0.31-0.86]). The reduced mortality rate persisted at follow-up (OR = 0.55 [0.36-0.84] at 12 months). Meta-regression analyses showed that the reduced mortality associated with hyponatremia improvement was more evident in older subjects and in those with lower serum [Na+] at enrollment. This meta-analysis documents for the first time that improvement in serum [Na+] in hyponatremic patients is associated with a reduction of overall mortality.
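
    The pooling step of such a meta-analysis, combining per-study odds ratios on the log scale with inverse-variance weights, can be sketched as follows. The study values below are invented for illustration, not the fifteen studies actually pooled.

```python
import math

# Hypothetical per-study odds ratios with 95% confidence intervals
# (illustrative numbers only, not the data of the meta-analysis).
studies = [
    (0.45, 0.25, 0.80),
    (0.60, 0.38, 0.95),
    (0.70, 0.42, 1.17),
    (0.52, 0.30, 0.90),
]

# Fixed-effect inverse-variance pooling on the log-OR scale. The standard
# error is recovered from the CI width, since log(upper/lower) = 2 * 1.96 * SE.
num = den = 0.0
for or_, lo, hi in studies:
    se = math.log(hi / lo) / (2 * 1.96)
    w = 1.0 / se**2
    num += w * math.log(or_)
    den += w

pooled = math.exp(num / den)
ci_half = 1.96 / math.sqrt(den)
lo95 = math.exp(num / den - ci_half)
hi95 = math.exp(num / den + ci_half)
print(f"pooled OR = {pooled:.2f} [{lo95:.2f}, {hi95:.2f}]")
```

    A random-effects model (e.g. DerSimonian-Laird) would widen the weights by a between-study variance term; the log-scale mechanics are otherwise the same.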

  7. Use of peers to improve adherence to antiretroviral therapy: a global network meta-analysis.

    Science.gov (United States)

    Kanters, Steve; Park, Jay Jh; Chan, Keith; Ford, Nathan; Forrest, Jamie; Thorlund, Kristian; Nachega, Jean B; Mills, Edward J

    2016-01-01

    It is unclear whether using peers can improve adherence to antiretroviral therapy (ART). To inform the World Health Organization's global guidance on adherence interventions, we conducted a systematic review and network meta-analysis to determine the effectiveness of using peers for achieving adequate adherence and viral suppression. We searched for randomized clinical trials of peer-based interventions to promote adherence to ART in HIV populations. We searched six electronic databases from inception to July 2015 and major conference abstracts from the preceding three years. We examined the outcomes of adherence and viral suppression among trials conducted worldwide and those specific to low- and middle-income countries (LMIC) using pairwise and network meta-analyses. Twenty-two trials met the inclusion criteria. We found similar results between pairwise and network meta-analyses, and between the global and LMIC settings. Peer supporter+Telephone was superior to standard-of-care in improving adherence in both the global network (odds-ratio [OR]=4.79, 95% credible intervals [CrI]: 1.02, 23.57) and the LMIC settings (OR=4.83, 95% CrI: 1.88, 13.55). Peer support alone, however, did not lead to improvement in ART adherence in either setting. For viral suppression, we found no difference in effects among interventions, owing to the limited number of trials. Our analysis showed that peer support leads to modest improvement in adherence. These modest effects may be due to the fact that in many settings, particularly in LMICs, programmes already include peer supporters, adherence clubs and family disclosures for treatment support. Rather than introducing new interventions, a focus on improving the quality of delivery of existing services may be a more practical and effective way to improve adherence to ART.

  8. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been improved in the MARS 1.3 code since the release of MARS 1.3 in July 1998. The new features include: - implementation of point kinetics model into the 3D module - unification of the heat structure model - extension of the control function to the 3D module variables - improvement of the 3D module input check function. Each of the items has been implemented in the developmental version of the MARS 1.3.1 code and, then, independently verified and assessed. The effectiveness of the new features is well verified and it is shown that these improvements greatly extend the code capability and enhance the user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code developmental activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs.

  9. Improving SFR Economics through Innovations from Thermal Design and Analysis Aspects

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Hongbin Zhang; Vincent Mousseau; Per F. Peterson

    2008-06-01

    Achieving economic competitiveness as compared to LWRs and other Generation IV (Gen-IV) reactors is one of the major requirements for large-scale investment in commercial sodium-cooled fast reactor (SFR) power plants. Advances in R&D for advanced SFR fuel and structural materials provide key long-term opportunities to improve SFR economics. In addition, other new opportunities are emerging to further improve SFR economics. This paper provides an overview of potential ideas from the perspective of thermal hydraulics to improve SFR economics. These include a new hybrid loop-pool reactor design to further optimize the economics, safety, and reliability of SFRs with more flexibility; a multiple-reheat and intercooling helium Brayton cycle to improve plant thermal efficiency and reduce safety-related overnight and operating costs; and modern multi-physics thermal analysis methods to reduce analysis uncertainties and the associated requirements for over-conservatism in reactor design. This paper reviews advances in all three of these areas and their potential beneficial impacts on SFR economics.

  11. Analysis of improved and original designs of a 16 inch long penultimate stage turbine blade

    International Nuclear Information System (INIS)

    Carnero, A.; Kubiak, J.A.; Mendez, R.

    1994-01-01

    A finite element analysis of a 16 inch long penultimate stage (L-1) blade was carried out to evaluate the improved and the original designs. The original design of the blade used a "blade-tenon-shroud" system to form blade groups (6 blades per group). The improved design applied the concept of the Integral Shroud Blade (ISB), in which all the blades form a single 360 degree group. The paper presents an application of the finite element method to compute the natural frequencies, steady-state and alternating stresses, deformation due to forces acting on the blades, and modal shapes of the blade group. For the improved design it was also necessary to compute the dynamic response of a 360 degree blade-disk arc, to include the effect of the flexible disk fastening, where blade-disk interaction is important for identifying certain resonant conditions. It was concluded from the finite element results that the steady-state stresses in the improved blade were lower and the tangential mode shapes were eliminated. This is a great advantage, since in the original design the first tangential mode shape and the higher steady-state stresses in the tenon contributed to the frequent failure of the "blade-tenon-shroud" system

  12. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    Science.gov (United States)

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a form of scientific method: empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and

  13. A Lean Six Sigma approach to the improvement of the selenium analysis method

    Directory of Open Access Journals (Sweden)

    Bronwyn C. Cloete

    2012-11-01

    Full Text Available Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a form of scientific method: empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any

  15. Exergy Analysis of a Subcritical Refrigeration Cycle with an Improved Impulse Turbo Expander

    Directory of Open Access Journals (Sweden)

    Zhenying Zhang

    2014-08-01

    Full Text Available The impulse turbo expander (ITE is employed to replace the throttling valve in the vapor compression refrigeration cycle to improve the system performance. An improved ITE and the corresponding cycle are presented. In the new cycle, the ITE not only acts as an expansion device with work extraction, but also serves as an economizer with vapor injection. An increase of 20% in the isentropic efficiency can be attained for the improved ITE compared with the conventional ITE owing to the reduction of the friction losses of the rotor. The performance of the novel cycle is investigated based on energy and exergy analysis. A correlation of the optimum intermediate pressure in terms of ITE efficiency is developed. The improved ITE cycle increases the exergy efficiency by 1.4%–6.1% over the conventional ITE cycle, 4.6%–8.3% over the economizer cycle and 7.2%–21.6% over the base cycle. Furthermore, the improved ITE cycle is also preferred due to its lower exergy loss.

  16. Improvement in smile esthetics following orthodontic treatment: a retrospective study utilizing standardized smile analysis.

    Science.gov (United States)

    Maganzini, Anthony L; Schroetter, Sarah B; Freeman, Kathy

    2014-05-01

    To quantify smile esthetics following orthodontic treatment and determine whether these changes are correlated with the severity of the initial malocclusion. A standardized smile mesh analysis that evaluated nine lip-tooth characteristics was applied to two groups of successfully treated patients: group 1 (initial American Board of Orthodontics Discrepancy Index [DI] score <20) and group 2 (initial DI score >20). T-tests were used to detect significant differences between the low-DI and high-DI groups for baseline pretreatment measurements, baseline posttreatment measurements, and changes from pre- to posttreatment. A Spearman correlation test compared the initial DI values with the changes in the nine smile measurements. Five of the smile measurements were improved in both groups following orthodontic treatment. Both groups demonstrated improved incisor exposure, an improved gingival smile line, an increase in smile width, a decreased buccal corridor space, and an improvement in smile consonance. Spearman correlation tests showed that the initial DI value was not correlated with changes in any of the individual smile measurements. Smile esthetics is improved by orthodontic treatment regardless of the initial severity of the malocclusion. In other words, patients with more complex orthodontic issues and their counterparts with minor malocclusions benefitted equally from treatment in terms of their smile esthetics.

  17. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement.

    Science.gov (United States)

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna; Nyström, Monica Elisabeth

    2016-07-29

    Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices, in order to understand when and how kaizen is used in healthcare. We analysed 186 structured kaizen documents containing improvement suggestions that were produced by 165 employees at a Swedish hospital. Directed content analysis was used to categorise the suggestions into the following categories: type of situation (proactive or reactive) triggering an action; type of process addressed (technical/administrative, support and clinical); complexity level (simple or complex); and type of outcomes aimed for (operational or sociotechnical). Compliance with the kaizen template was calculated. 72% of the improvement suggestions were reactions to a perceived problem. Support, technical and administrative, and primary clinical processes were involved in 47%, 38% and 16% of the suggestions, respectively. The majority of the kaizen documents addressed simple situations and focused on operational outcomes. The degree of compliance with the kaizen template was high for several items concerning the identification of problems and the proposed solutions, and low for items related to the testing and implementation of solutions. There is a need to combine kaizen practices with improvement and innovation practices that help staff and managers to address complex issues, such as the improvement of clinical care processes. The limited focus on sociotechnical aspects and the partial compliance with kaizen templates may indicate a limited understanding of the entire kaizen process and of how it relates to the overall organisational goals. This in turn can hamper the sustainability of kaizen practices and results. Published by the BMJ Publishing Group Limited. For

  18. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement

    Science.gov (United States)

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna

    2016-01-01

    Objectives Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices, in order to understand when and how kaizen is used in healthcare. Methods We analysed 186 structured kaizen documents containing improvement suggestions that were produced by 165 employees at a Swedish hospital. Directed content analysis was used to categorise the suggestions into the following categories: type of situation (proactive or reactive) triggering an action; type of process addressed (technical/administrative, support and clinical); complexity level (simple or complex); and type of outcomes aimed for (operational or sociotechnical). Compliance with the kaizen template was calculated. Results 72% of the improvement suggestions were reactions to a perceived problem. Support, technical and administrative, and primary clinical processes were involved in 47%, 38% and 16% of the suggestions, respectively. The majority of the kaizen documents addressed simple situations and focused on operational outcomes. The degree of compliance with the kaizen template was high for several items concerning the identification of problems and the proposed solutions, and low for items related to the testing and implementation of solutions. Conclusions There is a need to combine kaizen practices with improvement and innovation practices that help staff and managers to address complex issues, such as the improvement of clinical care processes. The limited focus on sociotechnical aspects and the partial compliance with kaizen templates may indicate a limited understanding of the entire kaizen process and of how it relates to the overall organisational goals. This in turn can hamper the sustainability of kaizen practices and results. PMID:27473953

  19. Improved spectrophotometric analysis of fullerenes C60 and C70 in high-solubility organic solvents.

    Science.gov (United States)

    Törpe, Alexander; Belton, Daniel J

    2015-01-01

    Fullerenes are among a number of recently discovered carbon allotropes that exhibit unique and versatile properties. The analysis of these materials is of great importance and interest. We present previously unreported spectroscopic data for C60 and C70 fullerenes in high-solubility solvents, including error bounds, so as to allow reliable colorimetric analysis of these materials. The Beer-Lambert-Bouguer law is found to be valid at all wavelengths. The measured data were highly reproducible, and yielded high-precision molar absorbance coefficients for C60 and C70 in o-xylene and o-dichlorobenzene, which both exhibit a high solubility for these fullerenes, and offer the prospect of improved extraction efficiency. A photometric method for a C60/C70 mixture analysis was validated with standard mixtures, and subsequently improved for real samples by correcting for light scattering, using a power-law fit. The method was successfully applied to the analysis of C60/C70 mixtures extracted from fullerene soot.
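
    The photometric mixture analysis described here rests on the Beer-Lambert-Bouguer law being linear in concentration, so a two-component C60/C70 mixture measured at two wavelengths reduces to a 2x2 linear system. The sketch below uses invented absorption coefficients and concentrations; the real coefficients for o-xylene and o-dichlorobenzene are those measured in the paper, and a power-law scattering correction would be subtracted from the measured absorbances first.

```python
import numpy as np

# Hypothetical molar absorption coefficients (L mol^-1 cm^-1) for C60 and C70
# at two wavelengths where the two spectra differ strongly.
#                lambda1  lambda2
eps = np.array([[52000.0,  4000.0],   # C60
                [16000.0, 21000.0]])  # C70

c_true = np.array([1.2e-5, 0.8e-5])   # mol/L, used to simulate a mixture

# Beer-Lambert for a 1 cm path: A(lambda) = sum_i eps_i(lambda) * c_i
A = eps.T @ c_true

# Recover the two concentrations by solving the 2x2 linear system.
c_est = np.linalg.solve(eps.T, A)
print("estimated concentrations (mol/L):", c_est)
```

    The method works as long as the coefficient matrix is well-conditioned, i.e. the two fullerenes' spectra are sufficiently different at the chosen wavelengths.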

  20. Does Flywheel Paradigm Training Improve Muscle Volume and Force? A Meta-Analysis.

    Science.gov (United States)

    Nuñez Sanchez, Francisco J; Sáez de Villarreal, Eduardo

    2017-11-01

    Núñez Sanchez, FJ and Sáez de Villarreal, E. Does flywheel paradigm training improve muscle volume and force? A meta-analysis. J Strength Cond Res 31(11): 3177-3186, 2017-Several studies have confirmed the efficacy of flywheel paradigm training for improving muscle volume and force. A meta-analysis of 13 studies with a total of 18 effect sizes was performed to analyse the role of various factors in the effectiveness of flywheel paradigm training. The following inclusion criteria were employed for the analysis: (a) randomized studies; (b) high validity and reliability instruments; (c) published in a high-quality peer-reviewed journal; (d) healthy participants; (e) studies where the eccentric programme was described; and (f) studies where increases in muscle volume and force were measured before and after training. Increases in muscle volume and force were noted through the use of flywheel systems during short periods of training. The increase in muscle mass appears not to have been influenced by the existence of eccentric overload during the exercise. The increase in force was significantly higher with the existence of eccentric overload during the exercise. The responses identified in this analysis are essential and should be considered by strength and conditioning professionals regarding the most appropriate dose-response trends for flywheel paradigm systems to optimize the increase in muscle volume and force.

  1. ASTM clustering for improving coal analysis by near-infrared spectroscopy.

    Science.gov (United States)

    Andrés, J M; Bona, M T

    2006-11-15

    Multivariate analysis techniques have been applied to near-infrared (NIR) spectra of coals to investigate the relationship between nine coal properties (moisture (%), ash (%), volatile matter (%), fixed carbon (%), heating value (kcal/kg), carbon (%), hydrogen (%), nitrogen (%) and sulphur (%)) and the corresponding predictor variables. In this work, a whole set of coal samples was grouped into six more homogeneous clusters following the ASTM reference classification method prior to the application of calibration methods to each coal set. The results obtained showed a considerable improvement in the determination error compared with the calibration for the whole sample set. For some groups, the established calibrations approached the quality required by the ASTM/ISO norms for laboratory analysis. To predict property values for a new coal sample, that sample must first be assigned to its respective group. Thus, the ability to discriminate and classify coal samples by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS) in the NIR range was also studied by applying Soft Independent Modelling of Class Analogy (SIMCA) and Linear Discriminant Analysis (LDA) techniques. Modelling of the groups by SIMCA led to overlapping models that cannot discriminate for unique classification. On the other hand, the application of Linear Discriminant Analysis improved the classification of the samples, but not enough to be satisfactory for every group considered.
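
The LDA step used to assign a sample to a group can be sketched with a minimal two-class Fisher linear discriminant on two-dimensional features. The feature values below are synthetic illustrations standing in for NIR-derived predictors, not data from the study.

```python
# Minimal two-class Fisher linear discriminant: project onto
# w = Sw^-1 (m1 - m0) and classify against the midpoint threshold.
# The 2-D "spectral" features here are invented for illustration.

def mean(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def scatter(rows, mu):
    # within-class scatter matrix (2x2): sum of outer products of deviations
    s = [[0.0, 0.0], [0.0, 0.0]]
    for r in rows:
        d = [r[0] - mu[0], r[1] - mu[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

class0 = [[1.0, 2.0], [1.2, 1.9], [0.9, 2.2], [1.1, 2.1]]
class1 = [[3.0, 4.0], [3.2, 3.8], [2.9, 4.1], [3.1, 4.2]]

m0, m1 = mean(class0), mean(class1)
s0, s1 = scatter(class0, m0), scatter(class1, m1)
Sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]

# w = Sw^-1 (m1 - m0), using the closed-form 2x2 inverse
det = Sw[0][0] * Sw[1][1] - Sw[0][1] * Sw[1][0]
dm = [m1[0] - m0[0], m1[1] - m0[1]]
w = [( Sw[1][1] * dm[0] - Sw[0][1] * dm[1]) / det,
     (-Sw[1][0] * dm[0] + Sw[0][0] * dm[1]) / det]

# classify a new sample by projecting onto w, threshold at the class midpoint
threshold = sum(wi * (a + b) / 2 for wi, a, b in zip(w, m0, m1))
sample = [2.8, 3.9]
score = sum(wi * xi for wi, xi in zip(w, sample))
print("group 1" if score > threshold else "group 0")
```

The published work classified into six ASTM groups rather than two, but the projection-and-threshold logic generalizes in the usual multi-class way.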

  2. Personal microbiome analysis improves student engagement and interest in Immunology, Molecular Biology, and Genomics undergraduate courses

    Science.gov (United States)

    Bridgewater, Laura C.; Jensen, Jamie L.; Breakwell, Donald P.; Nielsen, Brent L.; Johnson, Steven M.

    2018-01-01

    A critical area of emphasis for science educators is the identification of effective means of teaching and engaging undergraduate students. Personal microbiome analysis is a means of identifying the microbial communities found on or in our body. We hypothesized that the use of personal microbiome analysis in the classroom could improve science education by making courses more applied and engaging for undergraduate students. We tested this prediction in three Brigham Young University undergraduate courses: Immunology, Advanced Molecular Biology Laboratory, and Genomics. All three courses include a two-week microbiome unit. During the 2016 semester, students could submit their own personal microbiome kit or use the demo data, whereas during the 2017 semester students were given access to microbiome data from an anonymous individual. The students were surveyed before, during, and after the human microbiome unit to determine whether analyzing their own personal microbiome data, compared to analyzing demo microbiome data, impacted student engagement and interest. We found that personal microbiome analysis significantly enhanced students' engagement and interest while completing microbiome assignments, the self-reported time students spent researching the microbiome during the two-week unit, and students' attitudes regarding the course overall. Thus, integrating personal microbiome analysis in the classroom was a powerful means of improving student engagement and interest in undergraduate science courses. PMID:29641525

  3. Improved helicopter aeromechanical stability analysis using segmented constrained layer damping and hybrid optimization

    Science.gov (United States)

    Liu, Qiang; Chattopadhyay, Aditi

    2000-06-01

    Aeromechanical stability plays a critical role in helicopter design, and lead-lag damping is crucial to this design. In this paper, the use of segmented constrained layer (SCL) damping treatment and composite tailoring is investigated for improved rotor aeromechanical stability using a formal optimization technique. The principal load-carrying member in the rotor blade is represented by a composite box beam, of arbitrary thickness, with surface-bonded SCLs. A comprehensive theory is used to model the smart box beam. Ground resonance and air resonance analysis models are implemented for the rotor blade built around the composite box beam with SCLs. The Pitt-Peters dynamic inflow model is used in the air resonance analysis under hover conditions. A hybrid optimization technique is used to investigate the optimum design of the composite box beam with surface-bonded SCLs for improved damping characteristics. Parameters such as the stacking sequence of the composite laminates and the placement of the SCLs are used as design variables. Detailed numerical studies are presented for the aeromechanical stability analysis. It is shown that the optimum blade design yields a significant increase in rotor lead-lag regressive modal damping compared to the initial system.

  4. Multiple breath washout analysis in infants: quality assessment and recommendations for improvement.

    Science.gov (United States)

    Anagnostopoulou, Pinelopi; Egger, Barbara; Lurà, Marco; Usemann, Jakob; Schmidt, Anne; Gorlanova, Olga; Korten, Insa; Roos, Markus; Frey, Urs; Latzin, Philipp

    2016-03-01

    Infant multiple breath washout (MBW) testing serves as a primary outcome in clinical studies. However, it is still unknown whether current software algorithms allow between-centre comparisons. In this study of healthy infants, we quantified MBW measurement errors and tried to improve data quality by simply changing software settings. We analyzed best-quality MBW measurements performed with an ultrasonic flowmeter in 24 infants from two centres in Switzerland using the current software settings. To challenge the robustness of these settings, we also used alternative analysis approaches. Using the current analysis software, the coefficient of variation (CV) for functional residual capacity (FRC) differed significantly between centres (mean ± SD (%): 9.8 ± 5.6 and 5.8 ± 2.9, respectively, p = 0.039). In addition, FRC values calculated during the washout differed by between -25% and +30% from those of the washin of the same tracing. Results were mainly influenced by analysis settings and temperature recordings. Changing a few algorithms resulted in significantly more robust analysis. Non-systematic inter-centre differences can be reduced by using correctly recorded environmental data and simple changes in the software algorithms. We provide recommendations that greatly improve the quality of infant MBW outcomes and that can be applied when multicentre trials are conducted.
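
The between-centre metric compared above, the coefficient of variation, is simply the sample standard deviation expressed as a percentage of the mean. A small sketch, with illustrative FRC values rather than data from the study:

```python
# Coefficient of variation (CV = SD / mean, in %) for repeated FRC
# measurements. The FRC values (mL) are invented for illustration.
import statistics

def cv_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

frc_runs = [121.0, 118.5, 126.2]  # three technically acceptable runs
print(f"CV = {cv_percent(frc_runs):.1f}%")
```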

  5. Improving the effectiveness of FMEA analysis in automotive – a case study

    Directory of Open Access Journals (Sweden)

    Ványi Gábor

    2016-06-01

    Many industries, for example automotive, have well-defined product development process definitions and risk evaluation methods. FMEA (Failure Mode and Effects Analysis) is a first-line risk analysis method in design, which has been implemented in development and production for decades. Although the first applications focused on mechanical and electrical design and functionality, today software components are implemented in many modern vehicle systems. However, standards and industry-specific associations do not specify any "best practice" for designing the interactions of multiple entities in one model. This case study focuses on modelling interconnections and on improving the FMEA modelling process in the automotive industry. Selecting and grouping software components for the analysis is discussed, but software architecture design patterns are excluded from the study.

  6. Cause-effect analysis: improvement of a first year engineering students' calculus teaching model

    Science.gov (United States)

    van der Hoff, Quay; Harding, Ansie

    2017-01-01

    This study focuses on the mathematics department at a South African university, and in particular on the teaching of calculus to first-year engineering students. The paper reports on a cause-effect analysis, a technique often used for business improvement. The cause-effect analysis indicates that there are many factors that impact on the secondary school teaching of mathematics, factors that the tertiary sector has no control over. The analysis also identifies the undesirable issues that are at the root of impeded success in the calculus module. Most important is that students are not encouraged to become independent thinkers from an early age. This causes problems in follow-up courses, where students are expected to have learned to cope with the workload and to understand certain concepts. A new model was designed to lessen the impact of these undesirable issues.

  7. Welding deformation analysis based on improved equivalent strain method to cover external constraint during cooling stage

    Directory of Open Access Journals (Sweden)

    Tae-Jun Kim

    2015-09-01

    In the present study, external restraints imposed normal to the plate during the cooling stage were found to be effective in reducing the angular distortion of butt-welded or fillet-welded plates. A welding analysis model under external force during the cooling stage was idealized as a prismatic member subjected to pure bending. The external restraint was represented by vertical forces on both sides of the workpiece, producing a bending stress in the transverse direction. The additional bending stress distribution across the plate thickness was reflected in the improved inherent strain model, and a set of inherent strain charts with different levels of bending stress was newly calculated. Welding deformation can then be calculated from an elastic linear FE analysis using the inherent strain values taken from the charts, and the results were compared with those from a 3D thermal elasto-plastic FE analysis.

  8. Improved method for minimizing sulfur loss in analysis of particulate organic sulfur.

    Science.gov (United States)

    Park, Ki-Tae; Lee, Kitack; Shin, Kyoungsoon; Jeong, Hae Jin; Kim, Kwang Young

    2014-02-04

    The global sulfur cycle depends primarily on the metabolism of marine microorganisms, which release sulfur gas into the atmosphere and thus affect the redistribution of sulfur globally as well as the earth's climate system. To better quantify sulfur release from the ocean, analysis of the production and distribution of organic sulfur in the ocean is necessary. This report describes a wet-based method for accurate analysis of particulate organic sulfur (POS) in the marine environment. The proposed method overcomes the considerable loss of sulfur (up to 80%) that occurs during analysis using conventional methods involving drying. Use of the wet-based POS extraction procedure in conjunction with a sensitive sulfur analyzer enabled accurate measurements of cellular POS. Data obtained using this method will enable accurate assessment of how rapidly sulfur can transfer among pools. Such information will improve understanding of the role of POS in the oceanic sulfur cycle.

  9. Economic analysis of interventions to improve village chicken production in Myanmar.

    Science.gov (United States)

    Henning, J; Morton, J; Pym, R; Hla, T; Sunn, K; Meers, J

    2013-07-01

    A cost-benefit analysis using deterministic and stochastic modelling was conducted to identify the net benefits for households that adopt (1) vaccination of individual birds against Newcastle disease (ND) or (2) improved management of chick rearing, by providing coops for the protection of chicks from predation and chick starter feed inside a creep feeder to support chicks' nutrition, in village chicken flocks in Myanmar. Partial budgeting was used to assess the additional costs and benefits associated with each of the two interventions tested relative to neither strategy. In the deterministic model, over the first 3 years after the introduction of the interventions, the cumulative sum of the net differences from neither strategy was 13,189 Kyat for ND vaccination and 77,645 Kyat for improved chick management (effective exchange rate in 2005: 1000 Kyat = 1 US$). Both interventions were also profitable after discounting over a 10-year period; Net Present Values for ND vaccination and improved chick management were 30,791 and 167,825 Kyat, respectively. The Benefit-Cost Ratio for ND vaccination was very high (28.8). This was lower for improved chick management, due to the greater costs of the intervention, but still favourable at 4.7. Using both interventions concurrently yielded a Net Present Value of 470,543 Kyat and a Benefit-Cost Ratio of 11.2 over the 10-year period in the deterministic model. Using the stochastic model, for the first 3 years following the introduction of the interventions, the mean cumulative sums of the net difference were similar to the values obtained from the deterministic model. Sensitivity analysis indicated that the cumulative net differences were strongly influenced by grower bird sale income, particularly under improved chick management. The effects of the strategies on the odds of households selling and consuming birds after 7 months, and the numbers of birds being sold or consumed after this period, also influenced profitability. Cost variations for
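
The Net Present Value and Benefit-Cost Ratio figures above come from standard discounted cash-flow arithmetic. A minimal sketch follows; the cash flows and discount rate are hypothetical, not the study's numbers.

```python
# NPV and Benefit-Cost Ratio for an intervention with yearly incremental
# benefits and costs, as used in partial budgeting. All figures below
# are invented for illustration.

def npv(rate, flows):
    # flows[t] is the cash flow in year t (t = 0 is the start year)
    return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

def benefit_cost_ratio(rate, benefits, costs):
    return npv(rate, benefits) / npv(rate, costs)

benefits = [0, 12000, 15000, 15000]   # Kyat per year
costs    = [5000, 1000, 1000, 1000]   # Kyat per year
rate = 0.10

net = npv(rate, [b - c for b, c in zip(benefits, costs)])
bcr = benefit_cost_ratio(rate, benefits, costs)
print(f"NPV = {net:.0f} Kyat, BCR = {bcr:.1f}")
```

Note that the BCR discounts benefits and costs separately, so it is not simply NPV divided by undiscounted costs.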

  10. Combined analysis of cortical (EEG) and nerve stump signals improves robotic hand control.

    Science.gov (United States)

    Tombini, Mario; Rigosa, Jacopo; Zappasodi, Filippo; Porcaro, Camillo; Citi, Luca; Carpaneto, Jacopo; Rossini, Paolo Maria; Micera, Silvestro

    2012-01-01

    Interfacing an amputee's upper-extremity stump nerves to control a robotic hand requires training of the individual and algorithms to process interactions between cortical and peripheral signals. The objective was to evaluate for the first time whether EEG-driven analysis of peripheral neural signals, recorded as an amputee practices, could improve the classification of motor commands. Four thin-film longitudinal intrafascicular electrodes (tf-LIFEs-4) were implanted in the median and ulnar nerves of the stump in the distal upper arm for 4 weeks. Artificial intelligence classifiers were implemented to analyze LIFE signals recorded while the participant attempted 3 different hand and finger movements as pictures representing these tasks were randomly presented on a screen. In the final week, the participant was trained to perform the same movements with a robotic hand prosthesis through modulation of tf-LIFE-4 signals. To improve classification performance, an event-related desynchronization/synchronization (ERD/ERS) procedure was applied to the EEG data to identify the exact timing of each motor command. Real-time control of neural (motor) output was achieved by the participant. By focusing electroneurographic (ENG) signal analysis on an EEG-driven time window, movement classification performance improved. After training, the participant regained normal modulation of background rhythms for movement preparation (α/β band desynchronization) in the sensorimotor area contralateral to the missing limb. Moreover, coherence analysis found a restored α band synchronization of the Rolandic area with frontal and parietal ipsilateral regions, similar to that observed in the opposite hemisphere for movement of the intact hand. Of note, phantom limb pain (PLP) resolved for several months. Combining information from both cortical (EEG) and stump nerve (ENG) signals improved the classification performance compared with tf-LIFE signal processing alone; training led to cortical reorganization and

  11. Gene set analysis: limitations in popular existing methods and proposed improvements.

    Science.gov (United States)

    Mishra, Pashupati; Törönen, Petri; Leino, Yrjö; Holm, Liisa

    2014-10-01

    Gene set analysis is the analysis of a set of genes that collectively contribute to a biological process. Most popular gene set analysis methods are based on empirical P-values, which require a large number of permutations. Despite the numerous gene set analysis methods developed in the past decade, the most popular methods still suffer from serious limitations. We present a gene set analysis method (mGSZ) based on the Gene Set Z-scoring function (GSZ) and asymptotic P-values. Asymptotic P-value calculation requires fewer permutations and thus speeds up the gene set analysis process. We compare the GSZ scoring function with seven popular gene set scoring functions and show that GSZ stands out as the best scoring function. In addition, we show improved performance of the GSA method when the max-mean statistic is replaced by the GSZ scoring function. We demonstrate the importance of both gene and sample permutations by showing the consequences of the absence of one or the other. A comparison of asymptotic and empirical methods of P-value estimation demonstrates a clear advantage of asymptotic P-values over empirical ones. We show that mGSZ outperforms the state-of-the-art methods based on two different evaluations. We compared mGSZ results with permutation and rotation tests and show that rotation does not improve our asymptotic P-values. We also propose well-known asymptotic distribution models for three of the compared methods. mGSZ is available as an R package from cran.r-project.org.
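
The speed advantage of asymptotic over empirical P-values comes from evaluating a closed-form tail probability instead of counting permutation exceedances. As a simple illustration (using a standard normal null for the score; mGSZ's actual null distribution is different and is fitted to the data):

```python
# Asymptotic two-sided P-value for a test score under a standard normal
# null, versus the permutation route that would need thousands of
# resamples to resolve small P-values. Illustration only; mGSZ fits a
# different asymptotic distribution to the GSZ score.
import math

def asymptotic_p(z):
    """Two-sided P-value under a standard normal null."""
    return math.erfc(abs(z) / math.sqrt(2.0))

for z in (1.96, 3.0):
    print(f"z = {z}: p = {asymptotic_p(z):.4f}")
```

An empirical P-value from N permutations can never be smaller than 1/(N+1), which is why closed-form tails matter for genome-scale multiple testing.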

  12. Total sensitivity and uncertainty analysis for LWR pin-cells with improved UNICORN code

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • A new model is established for the total sensitivity and uncertainty analysis. • The NR approximation applied in S&U analysis can be avoided by the new model. • Sensitivity and uncertainty analysis is performed for PWR pin-cells by the new model. • The effects of the NR approximation for the PWR pin-cells are quantified. - Abstract: In this paper, improvements to the multigroup cross-section perturbation model have been proposed and applied in the self-developed UNICORN code, which is capable of performing total sensitivity and total uncertainty analysis for neutron-physics calculations by applying the direct numerical perturbation method and the statistical sampling method, respectively. The narrow resonance (NR) approximation was applied in the multigroup cross-section perturbation model implemented in UNICORN. As an improvement to the NR approximation to refine the multigroup cross-section perturbation model, an ultrafine-group cross-section perturbation model has been established, in which the actual perturbations are applied to the ultrafine-group cross-section library and the reconstructions of the resonance cross sections are performed by solving the neutron slowing-down equation. The total sensitivity and total uncertainty analyses were then applied to the LWR pin-cells, using both the multigroup and the ultrafine-group cross-section perturbation models. The numerical results show that the NR approximation overestimates the relative sensitivity coefficients and the corresponding uncertainty results for the LWR pin-cells, and that the effects of the NR approximation are significant for σ(n,γ) and σ(n,elas) of 238U. Therefore, the effects of the NR approximation applied in the total sensitivity and total uncertainty analysis for the neutron-physics calculations of LWR should be taken into account.
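
The direct numerical perturbation method mentioned above estimates a relative sensitivity coefficient S = (dk/k)/(dσ/σ) by perturbing one input and re-evaluating the response. A toy sketch, where a simple ratio stands in for the neutron-physics calculation (this is not the UNICORN model):

```python
# Relative sensitivity coefficient by direct numerical perturbation:
# perturb one input by a small relative amount and take a central
# difference. The "response" below is a toy production/absorption ratio,
# invented for illustration.

def response(sigma_a, sigma_f):
    # toy k-style ratio: production over absorption
    return 2.4 * sigma_f / sigma_a

def rel_sensitivity(f, args, index, h=1e-3):
    up = list(args); dn = list(args)
    up[index] *= (1.0 + h)
    dn[index] *= (1.0 - h)
    dfdx = (f(*up) - f(*dn)) / (2.0 * h)  # df per unit relative perturbation
    return dfdx / f(*args)

args = (0.30, 0.15)
print(rel_sensitivity(response, args, 0))  # w.r.t. sigma_a: about -1
print(rel_sensitivity(response, args, 1))  # w.r.t. sigma_f: about +1
```

For this toy response the exact coefficients are -1 and +1, so the central difference also serves as a convergence check on the step size h.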

  13. Improving production of treated and untreated verbs in aphasia: A meta-analysis

    Directory of Open Access Journals (Sweden)

    Vânia de Aguiar

    2016-09-01

    BACKGROUND. Demographic and clinical predictors of aphasia recovery have been identified in the literature. However, little attention has been devoted to identifying and distinguishing predictors of improvement for different outcomes, e.g., production of treated vs. untreated materials. These outcomes may rely on different mechanisms, and therefore be predicted by different variables. Furthermore, treatment features are not typically accounted for when studying predictors of aphasia recovery. This is partly due to the small numbers of cases reported in studies, but also to limitations of the data analysis techniques usually employed. METHOD. We reviewed the literature on predictors of aphasia recovery, and conducted a meta-analysis of single-case studies designed to assess the efficacy of treatments for verb production. The contribution of demographic, clinical, and treatment-related variables was assessed by means of Random Forests (a machine-learning technique used in classification and regression). Two outcomes were investigated: production of treated verbs (for 142 patients) and of untreated verbs (for 166 patients). RESULTS. Improved production of treated verbs was predicted by a three-way interaction of pre-treatment scores on tests for verb comprehension and word repetition, and the frequency of treatment sessions. Improvement in production of untreated verbs was predicted by an interaction including the use of morphological cues, presence of grammatical impairment, pre-treatment scores on a test for noun comprehension, and frequency of treatment sessions. CONCLUSION. Improvement in the production of treated verbs occurs frequently. It may depend on restoring access to and/or knowledge of lexeme representations, and requires relative sparing of semantic knowledge (as measured by verb comprehension) and phonological output abilities (including working memory, as measured by word repetition). Improvement in the production of untreated verbs has not been

  14. Recent Improvements at CEA on Trace Analysis of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    Pointurier, F.; Hubert, A.; Faure, A.L.; Pottin, A.C.; Mourier, W.; Marie, O.

    2010-01-01

    In this paper, we present some results of R&D work conducted at CEA to improve, on the one hand, the performance of the techniques already in use for the detection of undeclared activities and, on the other hand, to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring information complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences involving mercury, lead or iridium atoms are in some cases necessary. Efforts must be put into improving the purification procedure. Micro-Raman spectrometry allows the chemical composition of uranium compounds to be determined at the scale of a microscopic object, using pre-location of the particles by SEM and relocation of these particles by mathematical calculations. However, particles below 5 μm are hardly relocated, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although precision in isotopic ratio measurement is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high, and more developments are in progress to validate and improve this method. (author)

  15. AN IMPROVED DISTANCE AND MASS ESTIMATE FOR SGR A* FROM A MULTISTAR ORBIT ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Boehle, A.; Ghez, A. M.; Meyer, L.; Yelda, S.; Albers, S.; Martinez, G. D.; Becklin, E. E.; Do, T.; Morris, M. R.; Sitarski, B.; Witzel, G. [UCLA, Department of Physics and Astronomy, Los Angeles, CA 90095 (United States); Schödel, R. [Instituto de Astrofísica de Andalucía (CSIC), Glorieta de la Astronomía S/N, E-18008 Granada (Spain); Lu, J. R. [Institute for Astronomy, University of Hawaii, Honolulu, HI 96822 (United States); Matthews, K., E-mail: aboehle@astro.ucla.edu [Division of Physics, Mathematics, and Astronomy, California Institute of Technology, MC 301-17, Pasadena, CA 91125 (United States)

    2016-10-10

    We present new, more precise measurements of the mass and distance of our Galaxy’s central supermassive black hole, Sgr A*. These results stem from a new analysis that more than doubles the time baseline for astrometry of faint stars orbiting Sgr A*, combining 2 decades of speckle imaging and adaptive optics data. Specifically, we improve our analysis of the speckle images by using information about a star’s orbit from the deep adaptive optics data (2005-2013) to inform the search for the star in the speckle years (1995-2005). When this new analysis technique is combined with the first complete re-reduction of Keck Galactic Center speckle images using speckle holography, we are able to track the short-period star S0-38 (K-band magnitude = 17, orbital period = 19 yr) through the speckle years. We use the kinematic measurements from speckle holography and adaptive optics to estimate the orbits of S0-38 and S0-2 and thereby improve our constraints on the mass (M_bh) and distance (R_o) of Sgr A*: M_bh = (4.02 ± 0.16 ± 0.04) × 10^6 M_⊙ and R_o = 7.86 ± 0.14 ± 0.04 kpc. The uncertainties in M_bh and R_o as determined by the combined orbital fit of S0-2 and S0-38 are improved by factors of 2 and 2.5, respectively, compared to an orbital fit of S0-2 alone, and by a factor of ~2.5 compared to previous results from stellar orbits. This analysis also limits the extended dark mass within 0.01 pc to less than 0.13 × 10^6 M_⊙ at 99.7% confidence, a factor of 3 lower than prior work.

  16. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    VINCENT, ANDREW

    2005-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues concerning the use of software in Department of Energy (DOE) facilities for analyzing hazards and for designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments and, in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that, through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve the resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan for Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analyses by precluding inappropriate software applications and by promoting best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning toward establishing stronger, standards-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture

  17. Protein cleavage strategies for an improved analysis of the membrane proteome

    Directory of Open Access Journals (Sweden)

    Poetsch Ansgar

    2006-03-01

    Background: Membrane proteins still remain elusive in proteomic studies. This is in part due to the distribution of the amino acids lysine and arginine, which are less frequent in integral membrane proteins and almost absent in transmembrane helices. As these amino acids are the cleavage targets for the commonly used protease trypsin, alternative cleavage conditions, which should improve membrane protein analysis, were tested by in silico digestion for the three organisms Saccharomyces cerevisiae, Halobacterium sp. NRC-1, and Corynebacterium glutamicum as representatives of eukaryotes, archaea and eubacteria. Results: For the membrane proteomes of all three analyzed organisms, we identified cleavage conditions that achieve better sequence and proteome coverage than trypsin. Greater improvement was obtained for the bacteria than for yeast, which was attributed to differences in protein size and GRAVY. It was demonstrated for bacteriorhodopsin that the in silico predictions agree well with the experimental observations. Conclusion: For all three examined organisms, a combination of chymotrypsin and staphylococcal peptidase I gave significantly better results than trypsin. As some of the improved cleavage conditions are no more elaborate than trypsin digestion and have proven useful in practice, we suggest that cleavage at both hydrophilic and hydrophobic amino acids should facilitate the analysis of membrane proteins in general.
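
An in silico tryptic digest of the kind used above can be sketched with the standard trypsin rule: cleave C-terminal to K or R, except when the next residue is P. The sequence below is an arbitrary illustration, not one of the proteins from the study.

```python
# In silico tryptic digest: split after K/R unless followed by P.
# The zero-width regex keeps the K/R on the N-terminal peptide.
import re

def trypsin_digest(seq):
    return [p for p in re.split(r"(?<=[KR])(?!P)", seq) if p]

peptides = trypsin_digest("MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHR")
print(peptides)
```

Running the same split with a chymotrypsin-style pattern (cleaving after hydrophobic residues such as F/W/Y/L) and comparing peptide length distributions shows why hydrophobic transmembrane stretches yield few observable tryptic peptides.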

  18. A model for improving energy efficiency in industrial motor system using multicriteria analysis

    International Nuclear Information System (INIS)

    Herrero Sola, Antonio Vanderley; Mota, Caroline Maria de Miranda; Kovaleski, Joao Luiz

    2011-01-01

    In recent years, several policies have been proposed by governments and global institutions to improve the efficient use of energy in industries worldwide. However, projects in industrial motor systems require a new approach, particularly in the area of decision making, considering the organizational barriers to energy efficiency. Despite their wide application elsewhere, multicriteria methods have until now remained unexplored in industrial motor systems. This paper proposes a multicriteria model using the PROMETHEE II method, with the aim of ranking alternatives for induction motor replacement. A comparative analysis of the model, applied to a Brazilian industry, has shown that multicriteria analysis performs better on energy savings as well as return on investment than a single criterion. The paper strongly recommends the dissemination of multicriteria decision aiding as a policy to support decision makers in industry and to improve energy efficiency in electric motor systems. - Highlights: → The lack of a decision model in industrial motor systems is the main motivation of the research. → A multicriteria model based on the PROMETHEE method is proposed with the aim of supporting decision makers in industry. → The model can help overcome some barriers within industries, improving energy efficiency in industrial motor systems.

  19. A model for improving energy efficiency in industrial motor system using multicriteria analysis

    Energy Technology Data Exchange (ETDEWEB)

    Herrero Sola, Antonio Vanderley, E-mail: sola@utfpr.edu.br [Federal University of Technology, Parana, Brazil (UTFPR)-Campus Ponta Grossa, Av. Monteiro Lobato, Km 4, CEP: 84016-210 (Brazil); Mota, Caroline Maria de Miranda, E-mail: carolmm@ufpe.br [Federal University of Pernambuco, Cx. Postal 7462, CEP 50630-970, Recife (Brazil); Kovaleski, Joao Luiz [Federal University of Technology, Parana, Brazil (UTFPR)-Campus Ponta Grossa, Av. Monteiro Lobato, Km 4, CEP: 84016-210 (Brazil)

    2011-06-15

    In recent years, several policies have been proposed by governments and global institutions to improve the efficient use of energy in industries worldwide. However, projects in industrial motor systems require a new approach, mainly in the decision-making area, given the organizational barriers to energy efficiency. Despite their wide application elsewhere, multicriteria methods have so far remained unexplored in industrial motor systems. This paper proposes a multicriteria model using the PROMETHEE II method, with the aim of ranking alternatives for induction motor replacement. A comparative analysis of the model, applied to a Brazilian industry, has shown that multicriteria analysis delivers better performance on energy saving, as well as on return on investment, than a single criterion. The paper strongly recommends the dissemination of multicriteria decision aiding as a policy to support decision makers in industry and to improve energy efficiency in electric motor systems. - Highlights: → The lack of a decision model in industrial motor systems is the main motivation of the research. → A multicriteria model based on the PROMETHEE method is proposed to support decision makers in industry. → The model can help overcome some barriers within industries, improving energy efficiency in industrial motor systems.
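PROMETHEE II ranks alternatives by pairwise comparisons on each criterion, aggregated into a net outranking flow. A minimal numpy sketch of that ranking step, using the simplest "usual" preference function and hypothetical motor-replacement data (the paper's actual criteria, weights and preference functions are not reproduced here):

```python
import numpy as np

def promethee_ii(X, weights, maximize):
    """PROMETHEE II net outranking flows (higher = better).

    X        : (n_alternatives, n_criteria) evaluation matrix
    weights  : criterion weights summing to 1
    maximize : per-criterion flag (True = larger value preferred)
    Uses the simple 'usual' preference function P(d) = 1 if d > 0 else 0.
    """
    X = np.asarray(X, float)
    n = X.shape[0]
    sign = np.where(maximize, 1.0, -1.0)
    pi = np.zeros((n, n))                     # aggregated preference of a over b
    for a in range(n):
        for b in range(n):
            if a != b:
                d = sign * (X[a] - X[b])      # signed per-criterion difference
                pi[a, b] = np.sum(weights * (d > 0))
    phi_plus = pi.sum(axis=1) / (n - 1)       # positive (leaving) flow
    phi_minus = pi.sum(axis=0) / (n - 1)      # negative (entering) flow
    return phi_plus - phi_minus               # net flow

# hypothetical replacement alternatives scored on energy saving (%),
# investment cost (k$) and payback time (years)
X = [[8.0, 120.0, 3.0],
     [6.5,  90.0, 2.5],
     [9.0, 150.0, 4.0]]
w = np.array([0.6, 0.25, 0.15])
phi = promethee_ii(X, w, maximize=[True, False, False])
ranking = np.argsort(-phi)                    # indices, best alternative first
```

The net flows always sum to zero; the ordering, not the absolute values, carries the decision.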

  20. A data envelopment analysis based model for proposing safety improvements: a FMEA approach

    International Nuclear Information System (INIS)

    Garcia, Pauli A. de A.; Barbosa Junior, Gilberto V.; Melo, P.F. Frutuoso e

    2005-01-01

    When performing a probabilistic safety assessment, one important step is the identification of the critical or weak points of all systems to be considered. By properly ranking these critical points, improvement recommendations may be proposed in order to reduce the associated risks. Many tools are available for the identification of critical points, like Failure Mode and Effect Analysis (FMEA) and Hazard and Operability Studies (HAZOP). Once the failure modes or deviations are identified, indices associated with the occurrence probabilities, the detection potential, and the severity of the effects are assigned to them, and the failure modes or deviations are ranked. It is common practice to assign risk priority numbers for this purpose. These numbers are obtained by multiplying the three aforementioned indices, which typically vary from 1 to 10 (natural numbers). Here, the greater the index, the worse the situation. In this paper, a data envelopment analysis (DEA) based model is used to identify the most critical failure modes or deviations and, by means of their respective distances to the boundary, to assess the improvement percentage for each index of each failure mode or deviation. Starting from this identification procedure, the decision maker can more efficiently propose improvement actions, like reliability allocation, detection design, protective barriers, etc. (author)
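The conventional risk priority number scheme described above can be sketched in a few lines; the failure modes and index values below are illustrative, not taken from the paper (whose contribution is precisely to replace this simple product with a DEA-based ranking):

```python
# Conventional FMEA risk priority number (RPN) ranking.
# Failure modes and 1-10 index values are hypothetical.
failure_modes = {
    "seal leak":       {"occurrence": 4, "severity": 8, "detection": 6},
    "pump cavitation": {"occurrence": 6, "severity": 5, "detection": 3},
    "valve stuck":     {"occurrence": 2, "severity": 9, "detection": 7},
}

def rpn(idx):
    # RPN = occurrence x severity x detection, each index in 1..10;
    # the greater the RPN, the more critical the failure mode
    return idx["occurrence"] * idx["severity"] * idx["detection"]

# most critical failure mode first
ranked = sorted(failure_modes, key=lambda name: rpn(failure_modes[name]),
                reverse=True)
```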

  1. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)

    2012-01-01

    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA generated particles in an ICP is studied using a high-speed, high frame rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work for ICP torch developments specifically tailored for the improvement of LA sample introduction is presented. An abnormal scarcity of metal-argon polyatomic ions (MAr{sup +}) is observed during ICP-MS analysis. Evidence shows that MAr{sup +} ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  2. Cross-platform analysis of cancer microarray data improves gene expression based classification of phenotypes

    Directory of Open Access Journals (Sweden)

    Eils Roland

    2005-11-01

    Full Text Available Abstract Background The extensive use of DNA microarray technology in the characterization of the cell transcriptome is leading to an ever increasing amount of microarray data from cancer studies. Although similar questions for the same type of cancer are addressed in these different studies, a comparative analysis of their results is hampered by the use of heterogeneous microarray platforms and analysis methods. Results In contrast to a meta-analysis approach where results of different studies are combined on an interpretative level, we investigate here how to directly integrate raw microarray data from different studies for the purpose of supervised classification analysis. We use median rank scores and quantile discretization to derive numerically comparable measures of gene expression from different platforms. These transformed data are then used for training of classifiers based on support vector machines. We apply this approach to six publicly available cancer microarray gene expression data sets, which consist of three pairs of studies, each examining the same type of cancer, i.e. breast cancer, prostate cancer or acute myeloid leukemia. For each pair, one study was performed by means of cDNA microarrays and the other by means of oligonucleotide microarrays. In each pair, high classification accuracies (> 85%) were achieved with training and testing on data instances randomly chosen from both data sets in a cross-validation analysis. To exemplify the potential of this cross-platform classification analysis, we use two leukemia microarray data sets to show that important genes with regard to the biology of leukemia are selected in an integrated analysis, which are missed in either single-set analysis. Conclusion Cross-platform classification of multiple cancer microarray data sets yields discriminative gene expression signatures that are found and validated on a large number of microarray samples, generated by different laboratories and
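The quantile discretization step can be illustrated as follows; the bin count and the data are hypothetical, and the full pipeline described above additionally involves median rank scores and SVM training:

```python
import numpy as np

def quantile_discretize(x, n_bins=8):
    """Map each value in a sample to its quantile bin (0..n_bins-1).

    Applied per array, this replaces platform-specific intensities with
    ordinal values that are numerically comparable across platforms
    (the bin count is an illustrative choice, not taken from the paper).
    """
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

# the same eight genes measured on a cDNA (log-ratio) and an
# oligonucleotide (intensity) platform; hypothetical, rank-consistent values
cdna  = np.array([-1.2, 0.3, 2.5, 0.0, -0.4, 1.1, 0.8, -2.0])
oligo = np.array([120., 540., 9800., 410., 230., 1500., 900., 60.])
cdna_bins  = quantile_discretize(cdna)
oligo_bins = quantile_discretize(oligo)  # identical bins despite different scales
```

Because the transform depends only on within-sample ranks, the two platforms end up on the same discrete scale and can be pooled for classifier training.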

  3. Improved Regression Analysis of Temperature-Dependent Strain-Gage Balance Calibration Data

    Science.gov (United States)

    Ulbrich, N.

    2015-01-01

    An improved approach is discussed that may be used to directly include first and second order temperature effects in the load prediction algorithm of a wind tunnel strain-gage balance. The improved approach was designed for the Iterative Method that fits strain-gage outputs as a function of calibration loads and uses a load iteration scheme during the wind tunnel test to predict loads from measured gage outputs. The improved approach assumes that the strain-gage balance is at a constant uniform temperature when it is calibrated and used. First, the method introduces a new independent variable for the regression analysis of the balance calibration data. The new variable is designed as the difference between the uniform temperature of the balance and a global reference temperature. This reference temperature should be the primary calibration temperature of the balance so that, if needed, a tare load iteration can be performed. Then, two temperature-dependent terms are included in the regression models of the gage outputs. They are the temperature difference itself and the square of the temperature difference. Simulated temperature-dependent data obtained from Triumph Aerospace's 2013 calibration of NASA's ARC-30K five component semi-span balance is used to illustrate the application of the improved approach.
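The regression structure described above, gage output fitted against load plus first- and second-order terms in the temperature difference, can be sketched on synthetic data (all coefficients, the reference temperature and the single-load simplification below are made up for illustration; the actual method handles multi-component load models):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
load = rng.uniform(-1.0, 1.0, n)          # applied calibration load (normalized)
T = rng.uniform(15.0, 35.0, n)            # balance temperature, deg C
T_ref = 20.0                              # primary calibration temperature
dT = T - T_ref                            # the new independent variable

# synthetic gage output with first- and second-order temperature effects
out = 1000.0 * load + 4.0 * dT - 0.05 * dT**2 + rng.normal(0, 0.5, n)

# regression model: output ~ intercept + load + dT + dT^2
A = np.column_stack([np.ones(n), load, dT, dT**2])
coef, *_ = np.linalg.lstsq(A, out, rcond=None)
# coef[1..3] recover the load sensitivity and both temperature terms
```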

  4. Proposal for the improvement of IRD safety culture based on risk analysis

    International Nuclear Information System (INIS)

    Aguiar, L.A.; Ferreira, P.R.R.; Silveira, C.S.

    2017-01-01

    Safety culture (SC) is a concept concerning the relationship of individuals and organizations to safety in a specific activity. Any organization that carries out activities involving risks has a SC, even at minimal levels. People perceive different types of radiation risks in very different ways; identifying and analyzing the possible radiation risks resulting from normal operation or accident conditions is therefore an important step toward improving the SC of an organization. The main objective is to present guidelines for the improvement of the safety culture in the Institute of Radiation Protection and Dosimetry - IRD through a risk-based approach. The methodology proposed here is: A) select a division of the IRD for a case study; B) assess the level of the 10 basic safety culture elements in the selected IRD division; C) conduct a survey of the hazards and risks associated with the various activities carried out by the division; D) reassess the level of the 10 basic SC elements; and E) analyze the results and correlate the impact of risk knowledge on safety culture improvement. The expected result is improvement of safety and of the safety culture through an understanding of the radiation risks and hazards related to the work and the working environment, thus enforcing a collective commitment to safety by teams and individuals and raising the safety culture to higher levels. (author)

  5. Improving energy productivity in paddy production through benchmarking-An application of data envelopment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chauhan, Narvendra Singh [Department of Agronomy, Uttar Banga Krishi Viswavidyalaya, P.O. Pundibari, District Cooch Behar (West Bengal) 736 165 (India)]. E-mail: nsc_01@rediffmail.com; Mohapatra, Pratap K.J. [Department of Industrial Engineering and Management, Indian Institute of Technology, Kharagpur (West Bengal) 721 302 (India); Pandey, Keshaw Prasad [Department of Agricultural and Food Engineering, Indian Institute of Technology, Kharagpur (West Bengal) 721 302 (India)

    2006-06-15

    In this study, a data envelopment analysis approach has been used to determine the efficiencies of farmers with regard to energy use in rice production activities in the alluvial zone in the state of West Bengal in India. The study has helped to segregate efficient farmers from inefficient ones, identify wasteful uses of energy from different sources by inefficient farmers and to suggest reasonable savings in energy uses from different sources. The methods of cross efficiency matrix and distribution of virtual inputs are used to get insights into the performance of individual farmers, rank efficient farmers and identify the improved operating practices followed by a group of truly efficient farmers. The results reveal that, on average, about 11.6% of the total input energy could be saved if the farmers follow the input package recommended by the study. The study also suggests that better use of power tillers and introduction of improved machinery would improve the efficiency of energy use and thereby improve the energy productivity of the rice production system in the zone.
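The core DEA computation behind such benchmarking, an input-oriented constant-returns-to-scale (CCR) model, can be sketched with scipy; the farm data below are hypothetical and far smaller than the study's:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y):
    """Input-oriented CCR efficiency score for each decision-making unit.

    X : (n_units, n_inputs)  e.g. energy inputs from each source per farm
    Y : (n_units, n_outputs) e.g. paddy yield
    Returns theta in (0, 1]; theta < 1 marks an inefficient unit whose
    inputs could be radially scaled down by the factor theta.
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1 .. lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                        # minimize theta
        A_ub = np.zeros((m + s, 1 + n))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[o]               # sum_j lambda_j x_j <= theta * x_o
        A_ub[:m, 1:] = X.T
        A_ub[m:, 1:] = -Y.T               # sum_j lambda_j y_j >= y_o
        b_ub[m:] = -Y[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
        scores.append(res.x[0])
    return np.array(scores)

# hypothetical farms: inputs = (diesel MJ, fertilizer MJ), output = grain (t/ha)
X = [[100, 50], [120, 80], [90, 60]]
Y = [[5.0], [5.0], [4.5]]
theta = dea_efficiency(X, Y)              # farm 1 comes out inefficient
```

The radial score can still hide input-specific slacks (farm 2 here scores 1.0 despite excess fertilizer), which is why studies like this one complement it with cross-efficiency and virtual-input analyses.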

  6. Proposal for the improvement of IRD safety culture based on risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, L.A.; Ferreira, P.R.R. [Instituto de Radioproteção e Dosimetria (DIRAD/IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Silveira, C.S., E-mail: laguiar@ird.gov.br [Comissão Nacional de Energia Nuclear (DRS/CGMI/CNEN), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    Safety culture (SC) is a concept concerning the relationship of individuals and organizations to safety in a specific activity. Any organization that carries out activities involving risks has a SC, even at minimal levels. People perceive different types of radiation risks in very different ways; identifying and analyzing the possible radiation risks resulting from normal operation or accident conditions is therefore an important step toward improving the SC of an organization. The main objective is to present guidelines for the improvement of the safety culture in the Institute of Radiation Protection and Dosimetry - IRD through a risk-based approach. The methodology proposed here is: A) select a division of the IRD for a case study; B) assess the level of the 10 basic safety culture elements in the selected IRD division; C) conduct a survey of the hazards and risks associated with the various activities carried out by the division; D) reassess the level of the 10 basic SC elements; and E) analyze the results and correlate the impact of risk knowledge on safety culture improvement. The expected result is improvement of safety and of the safety culture through an understanding of the radiation risks and hazards related to the work and the working environment, thus enforcing a collective commitment to safety by teams and individuals and raising the safety culture to higher levels. (author)

  7. Improving energy productivity in paddy production through benchmarking-An application of data envelopment analysis

    International Nuclear Information System (INIS)

    Chauhan, Narvendra Singh; Mohapatra, Pratap K.J.; Pandey, Keshaw Prasad

    2006-01-01

    In this study, a data envelopment analysis approach has been used to determine the efficiencies of farmers with regard to energy use in rice production activities in the alluvial zone in the state of West Bengal in India. The study has helped to segregate efficient farmers from inefficient ones, identify wasteful uses of energy from different sources by inefficient farmers and to suggest reasonable savings in energy uses from different sources. The methods of cross efficiency matrix and distribution of virtual inputs are used to get insights into the performance of individual farmers, rank efficient farmers and identify the improved operating practices followed by a group of truly efficient farmers. The results reveal that, on average, about 11.6% of the total input energy could be saved if the farmers follow the input package recommended by the study. The study also suggests that better use of power tillers and introduction of improved machinery would improve the efficiency of energy use and thereby improve the energy productivity of the rice production system in the zone.

  8. Economic analysis of the health impacts of housing improvement studies: a systematic review

    Science.gov (United States)

    Fenwick, Elisabeth; Macdonald, Catriona; Thomson, Hilary

    2013-01-01

    Background Economic evaluation of public policies has been advocated but rarely performed. Studies from a systematic review of the health impacts of housing improvement included data on costs and some economic analysis. Examination of these data provides an opportunity to explore the difficulties and the potential for economic evaluation of housing. Methods Data were extracted from all studies included in the systematic review of housing improvement which had reported costs and economic analysis (n=29/45). The reported data were assessed for their suitability to economic evaluation. Where an economic analysis was reported the analysis was described according to pre-set definitions of various types of economic analysis used in the field of health economics. Results 25 studies reported cost data on the intervention and/or benefits to the recipients. Of these, 11 studies reported data which was considered amenable to economic evaluation. A further four studies reported conducting an economic evaluation. Three of these studies presented a hybrid ‘balance sheet’ approach and indicated a net economic benefit associated with the intervention. One cost-effectiveness evaluation was identified but the data were unclearly reported; the cost-effectiveness plane suggested that the intervention was more costly and less effective than the status quo. Conclusions Future studies planning an economic evaluation need to (i) make best use of available data and (ii) ensure that all relevant data are collected. To facilitate this, economic evaluations should be planned alongside the intervention with input from health economists from the outset of the study. When undertaken appropriately, economic evaluation provides the potential to make significant contributions to housing policy. PMID:23929616

  9. Aggregate analysis of regulatory authority assessors' comments to improve the quality of periodic safety update reports.

    Science.gov (United States)

    Jullian, Sandra; Jaskiewicz, Lukasz; Pfannkuche, Hans-Jürgen; Parker, Jeremy; Lalande-Luesink, Isabelle; Lewis, David J; Close, Philippe

    2015-09-01

    Marketing authorization holders (MAHs) are expected to provide high-quality periodic safety update reports (PSURs) on their pharmaceutical products to health authorities (HAs). We present a novel instrument aiming at improving quality of PSURs based on standardized analysis of PSUR assessment reports (ARs) received from the European Union HAs across products and therapeutic areas. All HA comments were classified into one of three categories: "Request for regulatory actions," "Request for medical and scientific information," or "Data deficiencies." The comments were graded according to their impact on patients' safety, the drug's benefit-risk profile, and the MAH's pharmacovigilance system. A total of 476 comments were identified through the analysis of 63 PSUR HA ARs received in 2013 and 2014; 47 (10%) were classified as "Requests for regulatory actions," 309 (65%) as "Requests for medical and scientific information," and 118 (25%) comments were related to "Data deficiencies." The most frequent comments were requests for labeling changes (35 HA comments in 19 ARs). The aggregate analysis revealed commonly raised issues and prompted changes of the MAH's procedures related to the preparation of PSURs. The authors believe that this novel instrument based on the evaluation of PSUR HA ARs serves as a valuable mechanism to enhance the quality of PSURs and decisions about optimization of the use of the products and, therefore, contributes to improve further the MAH's pharmacovigilance system and patient safety. Copyright © 2015 John Wiley & Sons, Ltd.

  10. Bladed wheels damage detection through Non-Harmonic Fourier Analysis improved algorithm

    Science.gov (United States)

    Neri, P.

    2017-05-01

    Recent papers introduced Non-Harmonic Fourier Analysis for bladed wheel damage detection. This technique showed its potential in estimating the frequency of sinusoidal signals even when the acquisition time is short with respect to the vibration period, provided that certain hypotheses are fulfilled. However, previously proposed algorithms showed severe limitations in detecting cracks at their early stage. The present paper proposes an improved algorithm which detects a blade vibration frequency shift caused by a crack whose size is very small compared to the blade width. Such a technique could be implemented for condition-based maintenance, allowing non-contact methods to be used for vibration measurements. A stator-fixed laser sensor could monitor all the blades as they pass in front of the spot, giving precious information about the wheel's health. In this configuration the acquisition time for each blade becomes shorter as the machine's rotational speed increases, and traditional Discrete Fourier Transform analysis then yields a frequency resolution too poor for small frequency shift detection. Non-Harmonic Fourier Analysis, by contrast, showed high reliability in vibration frequency estimation even with data samples collected in a short time range. A description of the improved algorithm is provided in the paper, along with a comparison with the previous one. Finally, a validation of the method is presented, based on finite element simulation results.
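The basic idea, evaluating the Fourier correlation on a frequency grid much finer than the harmonic DFT grid, can be sketched as follows. This is an illustrative reading of Non-Harmonic Fourier Analysis, not the paper's improved algorithm, and all numbers are invented:

```python
import numpy as np

def nhfa_peak(x, fs, f_grid):
    """Estimate a sinusoid's frequency from a short record by correlating
    it with complex exponentials on an arbitrarily fine (non-harmonic)
    frequency grid and locating the magnitude peak."""
    t = np.arange(len(x)) / fs
    mags = np.abs(np.exp(-2j * np.pi * np.outer(f_grid, t)) @ x)
    return f_grid[np.argmax(mags)]

fs = 10_000.0
t = np.arange(64) / fs                     # only 64 samples: ~3 vibration periods
x = np.sin(2 * np.pi * 487.3 * t)          # hypothetical blade vibration, 487.3 Hz
f_grid = np.arange(400.0, 600.0, 0.1)      # 0.1 Hz trial spacing
f_hat = nhfa_peak(x, fs, f_grid)
```

With 64 samples the DFT bin spacing would be fs/64 ≈ 156 Hz; the fine-grid peak recovers the frequency far more closely (small residual bias remains from spectral leakage of such a short real-valued record).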

  11. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    Science.gov (United States)

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost-intensive, and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payment by insurance companies due to lack of pre-approvals or inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in the radiology outpatient test scheduling process to reduce financial losses associated with process errors. This method of analysis is also applicable to other departments in the hospital.

  12. Assessment of modern spectral analysis methods to improve wavenumber resolution of F-K spectra

    International Nuclear Information System (INIS)

    Shirley, T.E.; Laster, S.J.; Meek, R.A.

    1987-01-01

    The improvement in wavenumber spectra obtained by using high resolution spectral estimators is examined. Three modern spectral estimators were tested, namely the Autoregressive/Maximum Entropy (AR/ME) method, the Extended Prony method, and an eigenstructure method. They were combined with the conventional Fourier method by first transforming each trace with a Fast Fourier Transform (FFT). A high resolution spectral estimator was applied to the resulting complex spatial sequence for each frequency. The collection of wavenumber spectra thus computed comprises a hybrid f-k spectrum with high wavenumber resolution and less spectral ringing. Synthetic and real data records containing 25 traces were analyzed by using the hybrid f-k method. The results show an FFT-AR/ME f-k spectrum has noticeably better wavenumber resolution and more spectral dynamic range than conventional spectra when the number of channels is small. The observed improvement suggests the hybrid technique is potentially valuable in seismic data analysis
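The per-frequency high-resolution step can be illustrated with a Yule-Walker autoregressive spectrum of a short complex spatial sequence, a stand-in for one frequency slice of the hybrid f-k scheme described above (the AR order, grid, noise level and trace count are illustrative choices, and the paper's AR/ME estimator may differ in detail):

```python
import numpy as np

rng = np.random.default_rng(2)

# complex spatial sequence for one temporal frequency: 12 traces crossed
# by a single plane wave at normalized wavenumber k0 (cycles/trace)
n_tr, k0, p = 12, 0.23, 4                 # trace count, wavenumber, AR order
d = np.arange(n_tr)
x = np.exp(2j * np.pi * k0 * d) + 0.02 * (rng.standard_normal(n_tr)
                                          + 1j * rng.standard_normal(n_tr))

# biased autocorrelation estimates r[0..p]
r = np.array([np.vdot(x[:n_tr - k], x[k:]) / n_tr for k in range(p + 1)])

# Yule-Walker equations: sum_m a_m r[k-m] = -r[k], for k = 1..p
R = np.array([[r[i - j] if i >= j else np.conj(r[j - i])
               for j in range(p)] for i in range(p)])
a = np.linalg.solve(R, -r[1:])

# AR power spectrum over a fine wavenumber grid; its peak estimates k0
grid = np.linspace(-0.5, 0.5, 2001)
denom = 1.0 + np.exp(-2j * np.pi * np.outer(grid, np.arange(1, p + 1))) @ a
k_hat = grid[np.argmax(1.0 / np.abs(denom) ** 2)]
```

With only 12 channels, a conventional spatial FFT would resolve wavenumbers no finer than about 1/12 ≈ 0.083 cycles/trace, while the AR spectrum localizes the plane wave much more sharply, which is the motivation for the hybrid estimator.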

  13. State-of-the-art review of sodium fire analysis and current notions for improvements

    International Nuclear Information System (INIS)

    Tasi, S.S.

    1978-01-01

    Sodium releases from postulated pipe ruptures, as well as failures of sodium handling equipment in liquid metal fast breeder reactors, may lead to substantial pressure-temperature transients in the sodium system cells, as well as in the reactor containment building. Sodium fire analyses are currently performed with analytical tools, such as the SPRAY, SOMIX, SPOOL-FIRE and SOFIRE-II codes. A review and evaluation of the state-of-the-art in sodium fire analysis is presented, and suggestions for further improvements are made. This work is based, in part, on studies made at Brookhaven National Laboratory during the past several years in the areas of model development and improvement associated with the accident analyses of LMFBRs

  14. An improved principal component analysis based region matching method for fringe direction estimation

    Science.gov (United States)

    He, A.; Quan, C.

    2018-04-01

    The principal component analysis (PCA) and region-matching combined method is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and its algorithm for converting orientation to direction in mask areas is computationally heavy and unoptimized. We propose an improved PCA-based region-matching method for fringe direction estimation, which includes an improved and robust mask construction scheme, and a fast, optimized orientation-to-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, the fringe pattern filtered by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used for the Hilbert spiral transform (HST) to demodulate the phase. Subsequently, the windowed Fourier ridge (WFR) method is used to refine the phase. The robustness and effectiveness of the proposed method are demonstrated on both simulated and experimental fringe patterns.
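In PCA-based fringe analysis, the orientation step is commonly a principal-component fit of the local intensity gradients; a minimal sketch under that assumption (not the paper's exact algorithm, and the synthetic fringe pattern is invented):

```python
import numpy as np

def local_fringe_orientation(block):
    """Estimate fringe orientation in an image block by PCA of the
    intensity gradients: the principal gradient direction is normal to
    the fringes, so the fringe orientation is perpendicular to it.
    The result is modulo pi; resolving it to a *direction* (modulo 2*pi)
    is the region-matching step discussed above."""
    gy, gx = np.gradient(block.astype(float))
    G = np.stack([gx.ravel(), gy.ravel()])           # 2 x N gradient samples
    C = G @ G.T / G.shape[1]                         # 2 x 2 covariance
    w, V = np.linalg.eigh(C)                         # ascending eigenvalues
    nx, ny = V[:, 1]                                 # principal gradient axis
    return (np.arctan2(ny, nx) + np.pi / 2) % np.pi  # fringe orientation

# synthetic straight fringes at 150 degrees (normal at 60 degrees)
y, x = np.mgrid[0:32, 0:32]
fringes = np.cos(2 * np.pi * (x * np.cos(np.pi / 3)
                              + y * np.sin(np.pi / 3)) / 8.0)
theta = local_fringe_orientation(fringes)            # approx 5*pi/6
```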

  15. ANALYSIS AND IMPROVEMENT OF PRODUCTION EFFICIENCY IN A CONSTRUCTION MACHINE ASSEMBLY LINE

    Directory of Open Access Journals (Sweden)

    Alidiane Xavier

    2016-07-01

    Full Text Available Increased competitiveness in the market encourages the ongoing development of systems and production processes. The aim is to increase production efficiency so that production costs and waste are reduced as far as possible, increasing product competitiveness. The objective of this study was to analyze the overall results of implementing the Kaizen philosophy at a manufacturer of construction machinery, using action research methodology to study in situ the macro production process, from the receipt of parts to the end of the assembly line, prioritizing the analysis of shipping and handling times. The results show that continuous improvement activities directly impact the elimination of waste from the assembly process, mainly waste related to shipping and handling, improving production efficiency by 30% in the studied processes.

  16. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    Science.gov (United States)

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery. PMID:26181208

  17. Analysis of the nutritional management practices in intensive care: Identification of needs for improvement.

    Science.gov (United States)

    Lázaro-Martín, N I; Catalán-González, M; García-Fuentes, C; Terceros-Almanza, L; Montejo-González, J C

    2015-12-01

    To analyze nutritional management practices in intensive care (ICU) in order to detect the need for improvement actions, and to re-evaluate the process after implementation of those actions. Prospective observational study in 3 phases: 1) observation; 2) analysis, proposal development and dissemination; 3) analysis of the implementation. ICU of a high-complexity hospital. Adult ICU patients expected to require more than 48h of artificial nutrition. Parenteral nutrition (PN), enteral nutrition (EN) (type, average effective volume, complications) and average nutritional ratio were recorded. A total of 229 patients were included (phase 1: 110, phase 3: 119). After analyzing the initial results, the following were proposed: increased and earlier use of EN, increased protein intake, monitoring of nutritional effectiveness, and increased indication of supplementary PN. The measures were disseminated at specific meetings. During phase 3 more patients received EN (55.5 vs. 78.2%, P=.001), with no significant difference in the start time (1.66 vs. 2.33 days), duration (6.82 vs. 10.12 days) or complications (37.7 vs. 47.3%). Use of hyperproteic diets was higher in phase 3 (0 vs. 13.01%, P<.05). The use of PN was similar (48.2 vs. 48.7%), with a tendency toward a later onset in phase 3 (1.25±1.25 vs. 2.45±3.22 days). There were no significant differences in the average nutritional ratio (0.56±0.28 vs. 0.61±0.27, P=.56). The use of EN and the protein intake increased, without appreciable effects on the other improvement measures. Other methods appear to be necessary for the proper implementation of improvement measures. Copyright © 2015 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.

  18. Error analysis and algorithm implementation for an improved optical-electric tracking device based on MEMS

    Science.gov (United States)

    Sun, Hong; Wu, Qian-zhong

    2013-09-01

    In order to improve the precision of an optical-electric tracking device, an improved MEMS-based design is proposed that addresses the tracking error and random drift of the gyroscope sensor. Following the principles of time-series analysis of random sequences, an AR model of the gyro random error is established and the gyro output signals are filtered repeatedly with a Kalman filter. An ARM microcontroller drives the servo motor under a fully closed-loop fuzzy PID control algorithm, with lead-compensation and feed-forward links added to reduce the response lag to angle inputs: the feed-forward link makes the output follow the input closely, while the lead-compensation link shortens the response to input signals and thereby reduces errors. A wireless video monitor module gathers video signals and sends them to an upper computer, where remote monitoring software (Visual Basic 6.0) displays the servo motor's running state in real time. The main error sources are also analyzed in detail: quantitative analysis of the errors contributed by the bandwidth and the gyro sensor makes the proportion of each error in the total error more transparent and, consequently, helps decrease the error of the system. Simulation and experimental results show that the system has good tracking characteristics, making it valuable for engineering applications.
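The drift-filtering idea, an AR model of the gyro random error inside a Kalman filter, can be sketched for a first-order AR drift (the AR coefficient, noise variances and stationary-gyro setup below are illustrative, not the paper's identified model):

```python
import numpy as np

rng = np.random.default_rng(1)

# AR(1) model of the gyro random drift:  b[k] = phi * b[k-1] + w[k]
phi, q, r = 0.98, 1e-4, 4e-2            # AR coefficient, process / measurement noise
n = 2000
w = rng.normal(0, np.sqrt(q), n)
drift = np.zeros(n)
for k in range(1, n):
    drift[k] = phi * drift[k - 1] + w[k]
z = drift + rng.normal(0, np.sqrt(r), n)   # output of a stationary gyro

# scalar Kalman filter tracking the drift state
b_hat = np.zeros(n)
P = 1.0
for k in range(1, n):
    b_pred = phi * b_hat[k - 1]          # predict using the AR model
    P = phi * P * phi + q
    K = P / (P + r)                      # Kalman gain
    b_hat[k] = b_pred + K * (z[k] - b_pred)
    P = (1 - K) * P

raw_err = np.mean((z - drift) ** 2)      # unfiltered gyro error power
kf_err = np.mean((b_hat - drift) ** 2)   # filtered error power (much smaller)
```

Subtracting the estimated drift from the live gyro output is then what reduces the random-error contribution to the tracking loop.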

  19. Confirmatory factor analysis and recommendations for improvement of the Autonomy-Preference-Index (API).

    Science.gov (United States)

    Simon, Daniela; Kriston, Levente; Loh, Andreas; Spies, Claudia; Scheibler, Fueloep; Wills, Celia; Härter, Martin

    2010-09-01

    Validation of the German version of the Autonomy-Preference-Index (API), a measure of patients' preferences for decision making and information seeking. Stepwise confirmatory factor analysis was conducted on a sample of patients (n = 1592) treated in primary care for depression (n = 186), surgical and internal medicine inpatients (n = 811) and patients with minor trauma treated in an emergency department (n = 595). An initial test of the model was done on calculation and validation halves of the sample. Both local and global indexes-of-fit suggested modifications to the scale. The scale was modified and re-tested in the calculation sample and confirmed in the validation sample. Subgroup analyses for age, gender and type of treatment setting were also performed. The confirmatory analysis led to a modified version of the API with better local and global indexes-of-fit for samples of German-speaking patients. Two items of the sub-scale, 'preference for decision-making', and one item of the sub-scale, 'preference for information seeking', showed very low reliability scores and were deleted. As a result, several global indexes-of-fit improved significantly. The modified scale was confirmed on the validation sample with acceptable to good indices of fit. Results of subgroup analyses indicated that no adaptations were necessary. This first confirmatory analysis for a German-speaking population showed that the API was improved by the removal of several items. There were theoretically plausible explanations for this improvement suggesting that the modifications might also be appropriate in English and other language versions.

  20. Drug supply indicators: Pitfalls and possibilities for improvements to assist comparative analysis.

    Science.gov (United States)

    Singleton, Nicola; Cunningham, Andrew; Groshkova, Teodora; Royuela, Luis; Sedefov, Roumen

    2018-06-01

    Interventions to tackle the supply of drugs are seen as standard components of illicit drug policies. Therefore drug market-related administrative data, such as seizures, price, purity and drug-related offending, are used in most countries for policy monitoring and assessment of the drug situation. International agencies, such as the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) and the UN Office of Drugs and Crime, also monitor and report on the drug situation cross-nationally and therefore seek to collect and make available key data in a uniform manner from the countries they cover. However, these data are not primarily collected for this purpose, which makes interpretation and comparative analysis difficult. Examples of limitations of these data sources include: the extent to which they reflect operational priorities rather than market changes; question marks over the robustness of and consistency in data collection methods; and issues around the timeliness of data availability. Such problems are compounded by cultural, social and contextual differences between countries. Making sense of such data is therefore challenging, and great care must be taken when using them. Nevertheless, these data provide an important window on a hidden area, so improving the quality of the data collected and expanding its scope should be a priority for those seeking to understand or monitor drug markets and supply reduction. In addition to highlighting some of the potential pitfalls in using supply indicators for comparative analysis, this paper presents a selection of options for improvements based on the current EMCDDA programme of work to improve their supply-related monitoring and analysis. The conceptual framework developed to steer this work may have wider application. Adopting this approach has the potential to provide a richer picture of drug markets, at both national and international levels, and make it easier to compare data between countries.

  1. Cognitive-Based Interventions to Improve Mobility: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Marusic, Uros; Verghese, Joe; Mahoney, Jeannette R

    2018-06-01

    A strong relation between cognition and mobility has been identified in aging, supporting a role for enhancing mobility through cognitive-based interventions. However, a critical evaluation of the consistency of treatment effects of cognitive-based interventions is currently lacking. The objective of this study was 2-fold: (1) to review the existing literature on cognitive-based interventions aimed at improving mobility in older adults and (2) to assess the clinical effectiveness of cognitive interventions on gait performance. A systematic review of randomized controlled trials (RCTs) of cognitive training interventions for improving simple (normal walking) and complex (dual-task walking) gait was conducted in February 2018. Older adults without major cognitive, psychiatric, neurologic, and/or sensory impairments were included. Random-effect meta-analyses and a subsequent meta-regression were performed to generate overall cognitive intervention effects on single- and dual-task walking conditions. Ten RCTs met inclusion criteria, with a total of 351 participants included in this meta-analysis. Cognitive training interventions revealed a small effect of intervention on complex gait [effect size (ES) = 0.47, 95% confidence interval (CI) 0.13 to 0.81, P = .007, I² = 15.85%], but not simple gait (ES = 0.35, 95% CI -0.01 to 0.71, P = .057, I² = 57.32%). Moreover, a meta-regression analysis revealed that intervention duration, training frequency, total number of sessions, and total minutes spent in intervention were not significant predictors of improvement in dual-task walking speed, though there was a suggestive trend toward a negative association between dual-task walking speed improvements and individual training session duration (P = .067). This meta-analysis provides support for the fact that cognitive training interventions can improve mobility-related outcomes, especially during challenging walking conditions requiring higher-order executive

  2. Improving power output of inertial energy harvesters by employing principal component analysis of input acceleration

    Science.gov (United States)

    Smilek, Jan; Hadas, Zdenek

    2017-02-01

    In this paper we propose the use of principal component analysis to process the measured acceleration data in order to determine the direction of acceleration with the highest variance on given frequency of interest. This method can be used for improving the power generated by inertial energy harvesters. Their power output is highly dependent on the excitation acceleration magnitude and frequency, but the axes of acceleration measurements might not always be perfectly aligned with the directions of movement, and therefore the generated power output might be severely underestimated in simulations, possibly leading to false conclusions about the feasibility of using the inertial energy harvester for the examined application.
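
    The core idea, recovering the dominant excitation direction from misaligned accelerometer axes, can be sketched with a plain covariance-based PCA. The synthetic 3-axis record, the 17 Hz excitation and the noise level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 3-axis accelerometer record: a 17 Hz excitation acts along a
# direction NOT aligned with the sensor axes (all values invented).
t = np.linspace(0.0, 10.0, 2000)
true_dir = np.array([0.6, 0.7, 0.39])
true_dir /= np.linalg.norm(true_dir)
signal = np.sin(2.0 * np.pi * 17.0 * t)
acc = np.outer(signal, true_dir) + 0.05 * rng.normal(size=(t.size, 3))

# PCA via eigen-decomposition of the sample covariance matrix.
acc_c = acc - acc.mean(axis=0)
cov = acc_c.T @ acc_c / (acc_c.shape[0] - 1)
eigval, eigvec = np.linalg.eigh(cov)      # eigenvalues in ascending order
pc1 = eigvec[:, -1]                       # direction of largest variance

# The first principal axis recovers the actual excitation direction (up to sign).
print(abs(pc1 @ true_dir))                # close to 1.0
```

With the principal axis in hand, the harvester's sensitive axis can be aligned with `pc1`, or the measured record projected onto it before power simulations, avoiding the underestimation the abstract warns about.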

  3. Cost-benefit analysis of improved air quality in an office building

    DEFF Research Database (Denmark)

    Djukanovic, R.; Wargocki, Pawel; Fanger, Povl Ole

    2002-01-01

    A cost-benefit analysis of measures to improve air quality in an existing air-conditoned office building (11581 m2, 864 employees) was carried out for hot, temperate and cold climates and for two operating modes: Variable Air Volume (VAV) with economizer; and Constant Air Volume (CAV) with heat...... recovery. The annual energy cost and first cost of the HVAC system were calculated using DOE 2.1E for different levels of air quality (10-50% dissatisfied). This was achieved by changing the outdoor air supply rate and the pollution loads. Previous studies have documented a 1.1% increase in office...

  4.  The Assembly of Lean Production: An Analysis of Doing Production Improvements

    OpenAIRE

    Andersson, Gunnar

    2011-01-01

    This thesis is an analysis of the assembly of the zero-defects project at Glomma Papp AS, a company manufacturing paper, corrugated board, solid board and displays in Sarpsborg, Norway. The zero-defects project was a local production-improvement project based on approaches, tools and methods known as Lean. The project is seen as an actor-network, which means that its reality, and the understandings and practices of it, are effects of the web of people, structures, technologies and others w...

  5. Analysis and improvement of a chaos-based image encryption algorithm

    International Nuclear Information System (INIS)

    Xiao Di; Liao Xiaofeng; Wei Pengcheng

    2009-01-01

    The security of digital image attracts much attention recently. In Guan et al. [Guan Z, Huang F, Guan W. Chaos-based image encryption algorithm. Phys Lett A 2005; 346: 153-7.], a chaos-based image encryption algorithm has been proposed. In this paper, the cause of potential flaws in the original algorithm is analyzed in detail, and then the corresponding enhancement measures are proposed. Both theoretical analysis and computer simulation indicate that the improved algorithm can overcome these flaws and maintain all the merits of the original one.

  6. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  7. Analysis and Alternate Selection of Nanopowder Modifiers to Improve a Special Protective Coating System

    Directory of Open Access Journals (Sweden)

    S. P. Bardakhanov

    2017-01-01

    Full Text Available This paper presents a practical approach for the rational choice of silica nanopowders as modifiers to control and improve the performance of protective coating systems operating in harsh environmental conditions. The approach is based on a multiparameter analysis of the reactivity of similar silica nanoparticles synthesized by chemical and physical methods. The analysis indicates distinct adsorption centers due to differences in particle formation; the features of the formation and adsorption mechanisms lead to a higher diffusion capacity of the nanoparticles synthesized by physical methods into a paint material and finally result in stronger chemical bonds between the system elements. The approach allows the consumption of paint materials to be reduced by 30% or more and the coating adhesion, and hence the system life, to be increased at least 2-3 times. The validity of the approach is illustrated through data obtained from comparative modeling, factory testing, and practical use of modified systems.

  8. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods to improve the parameters of a lead molding process. For this reason, a Failure Mode and Effects Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) regardless of the previously defined potential problem and its preventive action. This contributed to the recognition of root causes and corrective actions, and to a change of production parameters. It showed how these methods, the level of their organization, and systematic, rigorous study affect molding process parameters.

  9. Improved soil particle-size analysis by gamma-ray attenuation

    International Nuclear Information System (INIS)

    Oliveira, J.C.M.; Vaz, C.M.P.; Reichardt, K.; Swartzendruber, D.

    1997-01-01

    The size distribution of particles is useful for physical characterization of soil. This study was conducted to determine whether a new method of soil particle-size analysis by gamma-ray attenuation could be further improved by changing the depth and time of measurement of the suspended particle concentration during sedimentation. In addition to the advantage of nondestructive, undisturbed measurement by gamma-ray attenuation, as compared with conventional pipette or hydrometer methods, the modifications here suggested and employed do substantially decrease the total time for analysis, and will also facilitate total automation and generalize the method for other sedimentation studies. Experimental results are presented for three different Brazilian soil materials, and illustrate the nature of the fine detail provided in the cumulative particle-size distribution as given by measurements obtained during the relatively short time period of 28 min.
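
    The sedimentation side of the method rests on Stokes' law: at a fixed measurement depth, the largest particle diameter still in suspension shrinks with settling time. A minimal sketch, with typical values for quartz particles in water assumed for illustration (not taken from the paper):

```python
import math

# Stokes'-law sedimentation: largest particle diameter still in suspension
# above depth h after settling time t. Assumed fluid/particle properties:
eta   = 1.0e-3    # water viscosity, Pa*s (about 20 degC)
rho_s = 2650.0    # particle density, kg/m^3 (typical quartz)
rho_f = 1000.0    # fluid density, kg/m^3
g     = 9.81      # gravitational acceleration, m/s^2

def stokes_diameter(h, t):
    """Max diameter (m) of particles still above depth h (m) after t seconds."""
    return math.sqrt(18.0 * eta * h / ((rho_s - rho_f) * g * t))

# After 28 min of settling, measured at a 0.1 m depth (hypothetical geometry):
d = stokes_diameter(0.1, 28 * 60)
print(d * 1e6)  # diameter in micrometres, on the order of 10 um (silt range)
```

Each gamma-ray attenuation reading at depth h and time t thus corresponds to the mass fraction of particles finer than `stokes_diameter(h, t)`, which is how varying depth and time shortens the analysis.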

  10. Improving Eastern Bluebird nest box performance using computer analysis of satellite images

    Directory of Open Access Journals (Sweden)

    Sarah Svatora

    2012-06-01

    Full Text Available Bird conservationists have been introducing man-made boxes in an effort to increase the bluebird population. In this study we use computer analysis of satellite images to show that the performance of the boxes used by Eastern Bluebirds (Sialia sialis in Michigan can be improved by about 48%. The analysis is based on a strong correlation found between the edge directionality measured in the satellite image of the area around the box, and the preferences of the birds when selecting their nesting site. The method is based on satellite images taken from Google Earth, and can be used by conservationists to select a box placement strategy that will optimize the efficacy of the boxes deployed in a given area.

  11. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving verification and validation process, and thus overall software quality. Traceability can be most beneficial when the system changes. If changes are made to high-level requirements it implies that low-level requirements need to be modified. Traceability ensures that requirements are appropriately and efficiently verified at various levels whereas analysis ensures that a rightly interpreted set of requirements is produced.
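
    A traceability check of the kind described above can be sketched as a pair of set operations: every low-level requirement must trace to a high-level one, and every high-level requirement must be covered. All requirement IDs here are invented for illustration:

```python
# Hypothetical requirements traceability matrix: low-level requirement -> parent
# high-level requirement. IDs are invented, not from any real project.
high = {"HLR-1", "HLR-2", "HLR-3"}
trace = {
    "LLR-1": "HLR-1",
    "LLR-2": "HLR-1",
    "LLR-3": "HLR-2",
    "LLR-4": "HLR-3",
}

# Orphans: low-level requirements whose parent does not exist.
orphans = {llr for llr, hlr in trace.items() if hlr not in high}
# Uncovered: high-level requirements with no low-level child to verify them.
uncovered = high - set(trace.values())

print(orphans, uncovered)  # both empty -> the matrix is consistent
```

When high-level requirements change, re-running such a check immediately flags the low-level requirements and verifications that must be revisited, which is the maintenance benefit the paper describes.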

  12. Improved Polynomial Fuzzy Modeling and Controller with Stability Analysis for Nonlinear Dynamical Systems

    Directory of Open Access Journals (Sweden)

    Hamed Kharrati

    2012-01-01

    Full Text Available This study presents an improved model and controller for nonlinear plants using polynomial fuzzy model-based (FMB) systems. To minimize mismatch between the polynomial fuzzy model and the nonlinear plant, the suitable parameters of the membership functions are determined in a systematic way. Defining an appropriate fitness function and utilizing Taylor series expansion, a genetic algorithm (GA) is used to form the shape of the membership functions in polynomial forms, which are afterwards used in fuzzy modeling. To validate the model, a controller based on the proposed polynomial fuzzy systems is designed and then applied to both the original nonlinear plant and the fuzzy model for comparison. Additionally, stability analysis for the proposed polynomial FMB control system is investigated employing Lyapunov theory and a sum of squares (SOS) approach. Moreover, the form of the membership functions is considered in the stability analysis. The SOS-based stability conditions are attained using SOSTOOLS. Simulation results are also given to demonstrate the effectiveness of the proposed method.

  13. Improving the Design of Capacitive Micromachined Ultrasonic Transducers Aided with Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    A Martowicz

    2016-09-01

    Full Text Available The paper presents the results of an analysis performed to search for feasible design improvements for a capacitive micromachined ultrasonic transducer. The search was aided by sensitivity analysis and the application of the Response Surface Method. A multiphysics approach was taken in the elaborated finite element model of one cell of the described transducer in order to include the significant physical phenomena present in the modelled microdevice. The set of twelve uncertain input and design parameters consists of geometric, material and control properties. The amplitude of the dynamic membrane deformation of the transducer was chosen as the studied parameter. The objective of the performed study was defined as the task of finding robust design configurations of the transducer, i.e. configurations characterized by a maximal value of the deformation amplitude with minimal variation of it.

  14. Use of sensitivity, uncertainty and data adjustment analysis to improve nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Kodeli, I [CEA-Centre d' Etudes de Saclay, SERMA/LEPP, Gif sur Yvette (France); Sartori, E [OECD/ NEA Data Bank, Issy-les-Moulineaux (France); Remec, I [Inst. Jozef Stefan, Ljubljana (Slovenia)

    1992-07-01

    Sensitivity and adjustment analysis provide valuable information about radiation transport calculations. Together they give us a clear insight into the importance of different calculational parameters and tell us how much we can trust our result. Without this information the calculation cannot be considered comprehensive. On the other hand, adjustment permits the improvement of our database and thus reduces the uncertainty of our target quantities for a range of specific applications. With the objective to validate the methodology used in reactor shielding analysis, these techniques were applied to PWR power plants manufactured by EDF (52 capsules analysed in France) and Westinghouse (capsule and cavity dosimetry for the Krsko NPP in Slovenia), as well as to the ASPIS benchmark experiment. (author)

  15. Continuous quality improvement in a Maltese hospital using logical framework analysis.

    Science.gov (United States)

    Buttigieg, Sandra C; Gauci, Dorothy; Dey, Prasanta

    2016-10-10

    Purpose The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, intensive care unit, surgical ward, and acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues, specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, project plan template using logical framework (LOGFRAME) is created for each setting. Findings This study shows substantial improvement in quality across the three settings. LFA proved to be effective to analyse quality issues and suggest improvement measures objectively. Research limitations/implications This paper applies LFA in specific, albeit, diverse settings in one hospital. For validation purposes, it would be ideal to analyse in other settings within the same hospital, as well as in several hospitals. It also adopts a bottom-up approach when this can be triangulated with other sources of data. Practical implications LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators and facilitates the development of a business case for improvement. Originality/value LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.

  16. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the

  17. Improving estimation of kinetic parameters in dynamic force spectroscopy using cluster analysis

    Science.gov (United States)

    Yen, Chi-Fu; Sivasankar, Sanjeevi

    2018-03-01

    Dynamic Force Spectroscopy (DFS) is a widely used technique to characterize the dissociation kinetics and interaction energy landscape of receptor-ligand complexes with single-molecule resolution. In an Atomic Force Microscope (AFM)-based DFS experiment, receptor-ligand complexes, sandwiched between an AFM tip and substrate, are ruptured at different stress rates by varying the speed at which the AFM-tip and substrate are pulled away from each other. The rupture events are grouped according to their pulling speeds, and the mean force and loading rate of each group are calculated. These data are subsequently fit to established models, and energy landscape parameters such as the intrinsic off-rate (koff) and the width of the potential energy barrier (xβ) are extracted. However, due to large uncertainties in determining mean forces and loading rates of the groups, errors in the estimated koff and xβ can be substantial. Here, we demonstrate that the accuracy of fitted parameters in a DFS experiment can be dramatically improved by sorting rupture events into groups using cluster analysis instead of sorting them according to their pulling speeds. We test different clustering algorithms including Gaussian mixture, logistic regression, and K-means clustering, under conditions that closely mimic DFS experiments. Using Monte Carlo simulations, we benchmark the performance of these clustering algorithms over a wide range of koff and xβ, under different levels of thermal noise, and as a function of both the number of unbinding events and the number of pulling speeds. Our results demonstrate that cluster analysis, particularly K-means clustering, is very effective in improving the accuracy of parameter estimation, particularly when the number of unbinding events is limited and not well separated into distinct groups. Cluster analysis is easy to implement, and our performance benchmarks serve as a guide in choosing an appropriate method for DFS data analysis.
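
    A minimal sketch of the clustering step: grouping synthetic rupture events by K-means on standardized (log loading rate, rupture force) features. The group centers, spreads and counts are illustrative only, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic rupture events: three overlapping groups of
# (log loading rate, rupture force) pairs -- invented data.
centers = np.array([[2.0, 40.0], [3.0, 55.0], [4.0, 70.0]])
events = np.vstack([c + rng.normal(0.0, [0.3, 5.0], size=(100, 2)) for c in centers])

def kmeans(x, k, iters=100):
    """Plain K-means, initialized from randomly chosen data points."""
    cent = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - cent[None]) ** 2).sum(-1), axis=1)
        new = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                        else cent[j] for j in range(k)])
        if np.allclose(new, cent):
            break
        cent = new
    return labels, cent

# Standardize features so the force scale does not dominate the distance.
z = (events - events.mean(axis=0)) / events.std(axis=0)
labels, _ = kmeans(z, 3)
print([int(np.sum(labels == j)) for j in range(3)])  # cluster sizes
```

In a real analysis, the mean force and loading rate of each recovered cluster would then be fed to the model fit (e.g. the standard Bell-Evans relation); scikit-learn's `KMeans` or `GaussianMixture` can replace the hand-rolled loop above.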

  18. The 3rd ATLAS Domestic Standard Problem for Improvement of Safety Analysis Technology

    International Nuclear Information System (INIS)

    Choi, Ki-Yong; Kang, Kyoung-Ho; Park, Yusun; Kim, Jongrok; Bae, Byoung-Uhn; Choi, Nam-Hyun

    2014-01-01

    The third ATLAS DSP (domestic standard problem exercise) was launched at the end of 2012 in response to the strong need for continuation of the ATLAS DSP. A guillotine break of a main steam line without LOOP at a zero-power condition was selected as the target scenario, and the exercise was successfully completed at the beginning of 2014. In the 3rd ATLAS DSP, comprehensive use of the integral effect test data was made by dividing the analysis into three topics: (1) scale-up, where extrapolation of the ATLAS IET data was investigated; (2) 3D analysis, where the improvement obtainable from 3D modeling was studied; and (3) 1D sensitivity analysis, where the key phenomena affecting the SLB simulation were identified and a best-modeling guideline was derived. Through such DSP exercises, it has been possible to effectively utilize the high-quality ATLAS experimental data to enhance thermal-hydraulic understanding and to validate the safety analysis codes. A strong human network and technical expertise sharing among the various nuclear experts are also important outcomes of this program.

  19. Discovery learning with hierarchy concept to improve analysis ability and study achievement hydrolysis subject

    Directory of Open Access Journals (Sweden)

    Leny Yuliatun

    2017-10-01

    Full Text Available The aim of this research was to apply Discovery Learning (DL) supported by a concept hierarchy to improve analysis ability and chemistry achievement in the Hydrolysis subject among the eleventh-grade Science 1 students of SMA N Karangpandan in the 2016/2017 academic year. The study was a Classroom Action Research conducted in two cycles, each consisting of four steps: planning, implementing, observing, and reflecting. The subjects were the 40 students of the eleventh-grade Science 1 class. Data were obtained from the teacher and students through interviews, observation, documentation, tests, and questionnaires, and were analyzed using descriptive qualitative techniques. The results show that analysis ability rose from 52.5% in cycle I to 65% in cycle II, students' cognitive achievement rose from 57.5% in cycle I to 75% in cycle II, and affective achievement rose from 90% in cycle I to 92.5% in cycle II. Thus the number of students attaining the affective aspect still increased, even though all of its indicators had already been reached in cycle I.

  20. A social work study on the effect of transactional analysis on the improvement of intimacy attitude

    Directory of Open Access Journals (Sweden)

    Parvin Gol

    2013-04-01

    Full Text Available The purpose of this paper is to investigate the impact of group counseling using transactional analysis on the improvement of intimacy attitude in depressed patients in the city of Esfahan, Iran. A semi-experimental design with pretest-posttest control groups was conducted among 30 patients. The sample was selected through available sampling from among the depressed patients referred to psychiatric centers, and patients were randomly assigned to experimental and control groups. The measurement instruments were the intimacy attitude scale (IAS) questionnaire by Amidon et al. (1983) [Amidon, E., Kumar, V. K., & Treadwell, T. (1983). Measurement of intimacy attitudes: The intimacy attitude scale-revisited. Journal of Personality Assessment, 47(6), 635-639.] and the Beck Depression Inventory (BDI). The pretest and posttest scores of the intimacy attitude scale were analyzed in both the experimental and control groups using repeated-measures analysis of variance. The findings indicate that group counseling using transactional analysis increases the level of intimacy attitude in depressed individuals. It also increases emotional intimacy, but it does not increase mental intimacy.

  1. Improving cardiovascular care through outpatient cardiac rehabilitation: an analysis of payment models that would improve quality and promote use.

    Science.gov (United States)

    Mead, Holly; Grantham, Sarah; Siegel, Bruce

    2014-01-01

    Much attention has been paid to improving the care of patients with cardiovascular disease by focusing attention on delivery system redesign and payment reforms that encompass the healthcare spectrum, from an acute episode to maintenance of care. However, 1 area of cardiovascular disease care that has received little attention in the advancement of quality is cardiac rehabilitation (CR), a comprehensive secondary prevention program that is significantly underused despite evidence-based guidelines recommending its use. The purpose of this article was to analyze the applicability of 2 payment and reimbursement models, pay-for-performance and bundled payments for episodes of care, that can promote the use of CR. We conclude that a payment model combining elements of both pay-for-performance and episodes of care would increase the use of CR, which would both improve quality and increase efficiency in cardiac care. Specific elements would need to be clearly defined, however, including: (a) how an episode is defined, (b) how to hold providers accountable for the care they provide, (c) how to encourage participation among CR providers, and (d) how to determine an equitable distribution of payment. Demonstrations testing new payment models must be implemented to generate empirical evidence that a melded pay-for-performance and episode-based care payment model will improve quality and efficiency.

  2. Analysis of the dynamic response improvement of a turbocharged diesel engine driven alternating current generating set

    International Nuclear Information System (INIS)

    Katrasnik, Tomaz; Medica, Vladimir; Trenc, Ferdinand

    2005-01-01

    Reliability of electricity supply systems is among the most important necessities of modern society. Turbocharged diesel engine driven alternating current generating sets are often used to prevent electric blackouts and/or as prime electric energy suppliers. It is well known that turbocharged diesel engines suffer from an inadequate response to a sudden load increase, this being a consequence of the nature of the energy exchange between the engine and the turbocharger. The dynamic response of turbocharged diesel engines could be improved by electric assisting systems, either by direct energy supply with an integrated starter-generator-booster (ISG) mounted on the engine flywheel, or by indirect energy supply with an electrically assisted turbocharger. An experimentally verified zero-dimensional computer simulation method was used for the analysis of both types of electrical assistance. The paper offers an analysis of the interaction between a turbocharged diesel engine and different electric assisting systems, as well as the requirements for the supporting electric motors that could improve the dynamic response of a diesel engine while driving an AC generating set. When performance-class compliance is a concern, it is evident that an integrated starter-generator-booster outperforms an electrically assisted turbocharger for the investigated generating set. However, the electric energy consumption and frequency recovery times are smaller when an electrically assisted turbocharger is applied.

  3. Security Analysis and Improvements of Authentication and Access Control in the Internet of Things

    Science.gov (United States)

    Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon

    2014-01-01

    Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18–21 June 2012, pp. 588–592). According to our analysis, Jing et al.'s protocol is costly in the message exchange and the security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements facilitate many services to the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks, and achieves better efficiency at low communication cost. PMID:25123464

  4. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive Cs-137 source has been analyzed for the radioactive parent Cs-137 and its stable decay daughter Ba-137. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the 'age' determined. This paper reports an uncertainty analysis identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution ICP-MS for the 'age' determination of sealed sources is presented. The results are compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
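The 'age' estimate in this record follows from simple decay arithmetic: if all stable Ba-137 grew in from Cs-137 decay after purification, then N_Ba/N_Cs = e^(λt) − 1, so t = ln(1 + R)/λ. A minimal sketch of that relation (not the authors' code; the half-life is the standard literature value of about 30.08 y):

```python
import math

T_HALF_CS137 = 30.08  # years, Cs-137 half-life

def source_age(ba_to_cs_atom_ratio):
    """Estimate years since Cs purification from the Ba-137/Cs-137 atom ratio.

    Assumes all stable Ba-137 present grew in from Cs-137 decay after
    purification, so N_Ba / N_Cs = exp(lam * t) - 1, giving
    t = ln(1 + R) / lam with lam = ln(2) / T_half.
    """
    lam = math.log(2) / T_HALF_CS137  # decay constant in 1/years
    return math.log(1.0 + ba_to_cs_atom_ratio) / lam

# e.g. a measured atom ratio of 0.26 corresponds to roughly 10 years
age = source_age(0.26)
```

The sensitivity of the age to errors in the measured ratio follows by differentiating the same expression, which is why the ID-ICP-MS uncertainty on the Ba measurement dominates the 'age' uncertainty.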

  5. Improved ankle push-off power following cheilectomy for hallux rigidus: a prospective gait analysis study.

    Science.gov (United States)

    Smith, Sheryl M; Coleman, Scott C; Bacon, Stacy A; Polo, Fabian E; Brodsky, James W

    2012-06-01

    There is limited objective scientific information on the functional effects of cheilectomy. The purpose of this study was to test the hypothesis that cheilectomy for hallux rigidus improves gait by increasing ankle push-off power. Seventeen patients with symptomatic Stage 1 or Stage 2 hallux rigidus were studied. Pre- and postoperative first metatarsophalangeal (MTP) range of motion and AOFAS hallux scores were recorded. A gait analysis was performed within 4 weeks prior to surgery and repeated at a minimum of 1 year after surgery. Gait analysis was done using a three-dimensional motion capture system and a force platform embedded in a 10-m walkway. Gait velocity, sagittal plane ankle range of motion, and peak sagittal plane ankle push-off power were analyzed. Following cheilectomy, significant increases were noted for first MTP range of motion and AOFAS hallux score. First MTP motion improved an average of 16.7 degrees, from a mean of 33.9 degrees preoperatively to 50.6 degrees postoperatively. Peak ankle push-off power increased from 1.71±0.92 W/kg to 2.05±0.75 W/kg, supporting the hypothesis that cheilectomy for hallux rigidus increases ankle push-off power.

  6. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to determine inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The modified BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  7. Vulnerability Identification and Design-Improvement-Feedback using Failure Analysis of Digital Control System Designs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eunchan; Bae, Yeonkyoung [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)

    2013-05-15

    Fault tree analyses let analysts establish the failure sequences of components as a logical model and confirm the result at the plant level. These two analyses provide insights regarding what improvements are needed to increase availability, because they express the quantified design attributes of the system as minimal cut sets and availability values interfaced with component reliability data in the fault trees. This combined failure analysis method helps system users understand system characteristics, including its weaknesses and strengths in relation to faults, in the design stage before system operation. This study explained why a digital system could have weaknesses in the methods used to transfer control signals or data, and how those vulnerabilities could cause unexpected outputs. In particular, the result of the analysis confirmed that complex optical communication is not recommended for digital data transmission in the critical systems of nuclear power plants. Regarding the loop controllers in Design A, the logic configuration should be changed to prevent spurious actuation due to a single failure, using hardware or software improvements such as cross checking between redundant modules or diagnosis of output signal integrity. Unavailability calculations support these insights from the failure analyses of the systems. In the near future, KHNP will perform failure mode and effect analyses in the design stage before purchasing non-safety-related digital system packages. In addition, the design requirements of the system will be confirmed based on evaluation of overall system availability or unavailability.

  8. Security analysis and improvements of authentication and access control in the Internet of Things.

    Science.gov (United States)

    Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon

    2014-08-13

    Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18-21 June 2012, pp. 588-592). According to our analysis, Jing et al.'s protocol is costly in the message exchange and the security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements facilitate many services to the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks, and achieves better efficiency at low communication cost.

  9. Improved detection of anterior left ventricular aneurysm with multiharmonic fourier analysis

    International Nuclear Information System (INIS)

    Valette, H.B.; Bourguignon, M.H.; Merlet, P.; Gregoire, M.C.; Le Guludec, D.; Pascal, O.; Briandet, P.; Syrota, A.

    1990-01-01

    Single and multiharmonic Fourier analysis of LAO 30-45 degrees gated blood-pool studies were performed in a selected group of 30 patients with a left ventricular anterior aneurysm proven by contrast angiography. The sensitivity of the first harmonic phase image for the diagnosis of ventricular aneurysm was 80%. The clear phase shift (greater than 110 degrees) between the normal and the aneurysmal areas was missing in six patients. Peak acceleration images (the negative maximum of the second derivative of the Fourier series) were calculated for each pixel with the analytical Fourier formula using two or three harmonics. A clear phase shift (greater than 126 degrees) then appeared in all the patients. This improvement was related to the increased weight of the second and third harmonics in the aneurysmal area when compared to control patients or to patients with dilative cardiomyopathy. Multiharmonic Fourier analysis clearly improved the sensitivity of the diagnosis of anterior left ventricular aneurysm on LAO 30-45 degrees gated blood-pool images.
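The peak acceleration image described here is the analytic second derivative of the truncated Fourier series fitted to each pixel's time-activity curve, evaluated where it is most negative. An illustrative numpy sketch of that computation for a single curve (function names are ours, not the authors' implementation):

```python
import numpy as np

def harmonic_fit(counts, n_harmonics):
    """Fourier coefficients a0, a_k, b_k of one cardiac cycle sampled at N frames."""
    counts = np.asarray(counts, float)
    N = len(counts)
    t = 2 * np.pi * np.arange(N) / N
    a0 = counts.mean()
    a = np.array([2 / N * np.sum(counts * np.cos(k * t)) for k in range(1, n_harmonics + 1)])
    b = np.array([2 / N * np.sum(counts * np.sin(k * t)) for k in range(1, n_harmonics + 1)])
    return a0, a, b

def peak_acceleration_phase(counts, n_harmonics=3, n_eval=720):
    """Phase (degrees) where the analytic second derivative of the fitted
    Fourier series is most negative (the 'peak acceleration' of the pixel)."""
    _, a, b = harmonic_fit(counts, n_harmonics)
    k = np.arange(1, n_harmonics + 1)
    phi = np.linspace(0, 2 * np.pi, n_eval, endpoint=False)
    # d^2/dphi^2 of sum_k [a_k cos(k phi) + b_k sin(k phi)] = -k^2 * (same terms)
    d2 = (-(k ** 2)[None, :] * (a * np.cos(np.outer(phi, k)) +
                                b * np.sin(np.outer(phi, k)))).sum(axis=1)
    return np.degrees(phi[np.argmin(d2)])
```

For a purely sinusoidal curve the second derivative is most negative at the curve's own maximum, so the peak acceleration phase coincides with the first-harmonic phase; the diagnostic gain comes from pixels where higher harmonics shift that extremum.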

  10. Prediction of line failure fault based on weighted fuzzy dynamic clustering and improved relational analysis

    Science.gov (United States)

    Meng, Xiaocheng; Che, Renfei; Gao, Shi; He, Juntao

    2018-04-01

    With the advent of the big data age, power system research has entered a new stage. At present, the main application of big data in the power system is early warning analysis for power equipment: by collecting relevant historical fault data, system security is improved by predicting the warning and failure rate of different kinds of equipment under certain relational factors. In this paper, a method of line failure rate warning is proposed. Firstly, fuzzy dynamic clustering is carried out based on the collected historical information. To account for the imbalance between the attributes, weights are assigned according to each attribute's coefficient of variation, and weighted fuzzy clustering is then used to process the data more effectively. Then, by analyzing the basic idea and properties of relational analysis model theory, the gray relational model is improved by combining the slope model and the Deng model, and the incremental composition of the two sequences is also incorporated into the gray relational model to obtain the gray relational degree between the samples. The failure rate is predicted according to the weighting principle. Finally, the process is illustrated with an example, and the validity and superiority of the proposed method are verified.
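Two of the ingredients named above are standard and easy to sketch: coefficient-of-variation weights for unbalanced attributes, and Deng's classical gray relational coefficient. A hypothetical Python illustration (the paper's specific slope/increment modification of the Deng model is not reproduced here):

```python
import numpy as np

def cv_weights(X):
    """Attribute weights from the coefficient of variation (std/mean),
    normalised to sum to 1: more dispersed attributes get larger weight."""
    X = np.asarray(X, float)
    cv = X.std(axis=0) / X.mean(axis=0)
    return cv / cv.sum()

def gray_relational_degree(ref, cmp_seq, rho=0.5, weights=None):
    """Deng's gray relational degree between a reference sequence and a
    comparison sequence (both assumed already normalised)."""
    ref = np.asarray(ref, float)
    cmp_seq = np.asarray(cmp_seq, float)
    delta = np.abs(ref - cmp_seq)
    # Deng's gray relational coefficient with resolution coefficient rho
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    w = np.full(len(ref), 1.0 / len(ref)) if weights is None else np.asarray(weights, float)
    return float(np.sum(w * coeff))
```

Passing the `cv_weights` output as `weights` ties the two steps together: attributes with higher dispersion contribute more to the relational degree, which is the imbalance correction the abstract describes.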

  11. Improving the sensory quality of flavored liquid milk by engaging sensory analysis and consumer preference.

    Science.gov (United States)

    Zhi, Ruicong; Zhao, Lei; Shi, Jingye

    2016-07-01

    Developing innovative products that satisfy various groups of consumers helps a company maintain a leading market share. The hedonic scale and just-about-right (JAR) scale are 2 popular methods for hedonic assessment and product diagnostics. In this paper, we chose to study flavored liquid milk because it is one of the most necessary nutrient sources in China. The hedonic scale and JAR scale methods were combined to provide directional information for flavored liquid milk optimization. Two methods of analysis (penalty analysis and partial least squares regression on dummy variables) were used and the results were compared. This paper had 2 aims: (1) to investigate consumer preferences of basic flavor attributes of milk from various cities in China; and (2) to determine the improvement direction for specific products and the ideal overall liking for consumers in various cities. The results showed that consumers in China have local-specific requirements for characteristics of flavored liquid milk. Furthermore, we provide a consumer-oriented product design method to improve sensory quality according to the preference of particular consumers. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  12. Designing small universal k-mer hitting sets for improved analysis of high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Yaron Orenstein

    2017-10-01

    Full Text Available With the rapidly increasing volume of deep sequencing data, more efficient algorithms and data structures are needed. Minimizers are a central recent paradigm that has improved various sequence analysis tasks, including hashing for faster read overlap detection, sparse suffix arrays for creating smaller indexes, and Bloom filters for speeding up sequence search. Here, we propose an alternative paradigm that can lead to substantial further improvement in these and other tasks. For integers k and L > k, we say that a set of k-mers is a universal hitting set (UHS) if every possible L-long sequence must contain a k-mer from the set. We develop a heuristic called DOCKS to find a compact UHS, which works in two phases: the first phase is solved optimally, and for the second we propose several efficient heuristics, trading set size for speed and memory. The use of heuristics is motivated by showing the NP-hardness of a closely related problem. We show that DOCKS works well in practice and produces UHSs that are very close to a theoretical lower bound. We present results for various values of k and L, and by applying them to real genomes we show that UHSs indeed improve over minimizers. In particular, DOCKS uses less than 30% of the 10-mers needed to span the human genome compared to minimizers. The software and computed UHSs are freely available at github.com/Shamir-Lab/DOCKS/ and acgt.cs.tau.ac.il/docks/, respectively.
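The UHS definition above can be checked directly by enumeration for small parameters. A brute-force sketch of the definition (for intuition only; it is exponential in L, whereas DOCKS itself operates on a de Bruijn graph to stay tractable):

```python
from itertools import product

def is_universal_hitting_set(kmers, k, L, alphabet="ACGT"):
    """Check the UHS definition by brute force: every L-long string over the
    alphabet must contain at least one k-mer from the set. Only usable for
    tiny k and L, since there are |alphabet|**L candidate sequences."""
    kset = set(kmers)
    for letters in product(alphabet, repeat=L):
        seq = "".join(letters)
        if not any(seq[i:i + k] in kset for i in range(L - k + 1)):
            return False  # found an L-long sequence the set fails to hit
    return True
```

For example, over the binary alphabet with k=2 and L=3, the set {"00", "11"} is not universal (the string "010" contains neither), while adding "01" makes it universal.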

  13. Improving student satisfaction of Andalas University Dormitory through Service Quality and Importance Performance Analysis

    Science.gov (United States)

    Putri, Nilda Tri; Anggraini, Larisa

    2018-03-01

    Residential satisfaction with university dormitories is one of the significant aspects of sustainability in higher education. This research investigated the quality of dormitory services in the Andalas University dormitory based on student satisfaction. According to residential management, residential student enrollment has increased gradually at Andalas University. In 2016, the residential capacity was 1686 students, but only 1081 students could stay at the dormitory because some rooms were in bad condition. There are many problems and complaints regarding the dormitory's service quality, i.e. water problems, leaky rooms and bathrooms, cleanliness, and inadequate facilities in the residential college. In addition, 20% of last year's residential students checked out before their contracts ran out. The aims of this research are to understand the GAP that exists between the expectations and perceptions of residential students regarding service quality, and to evaluate improvement priorities for services using Importance Performance Analysis. This study measures service quality using the Responsiveness, Assurance, Empathy, Reliability and Tangible dimensions. A negative GAP indicates that the actual services are worse than expected, and the GAP highlights areas for improvement. Based on the IPA, management should improve the following service dimensions: responsiveness, tangibles and assurance.

  14. Why Does Rebalancing Class-Unbalanced Data Improve AUC for Linear Discriminant Analysis?

    Science.gov (United States)

    Xue, Jing-Hao; Hall, Peter

    2015-05-01

    Many established classifiers fail to identify the minority class when it is much smaller than the majority class. To tackle this problem, researchers often first rebalance the class sizes in the training dataset, through oversampling the minority class or undersampling the majority class, and then use the rebalanced data to train the classifiers. This leads to interesting empirical patterns. In particular, using the rebalanced training data can often improve the area under the receiver operating characteristic curve (AUC) for the original, unbalanced test data. The AUC is a widely-used quantitative measure of classification performance, but the property that it increases with rebalancing has, as yet, no theoretical explanation. In this note, using Gaussian-based linear discriminant analysis (LDA) as the classifier, we demonstrate that, at least for LDA, there is an intrinsic, positive relationship between the rebalancing of class sizes and the improvement of AUC. We show that the largest improvement of AUC is achieved, asymptotically, when the two classes are fully rebalanced to be of equal sizes.
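The mechanism the note analyzes can be sketched with a toy heteroscedastic example in numpy (an illustration of the general setup, not the authors' derivation): the pooled covariance in Gaussian LDA weights each class by its sample size, so oversampling the minority class shifts the discriminant direction, and with unequal class covariances this typically changes the test-set AUC.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_lda_direction(X, y):
    """Gaussian LDA direction w = pooled_cov^{-1} (mu1 - mu0). The pooled
    covariance weights each class by its sample size, which is exactly
    where class imbalance enters the estimator."""
    X0, X1 = X[y == 0], X[y == 1]
    n0, n1 = len(X0), len(X1)
    pooled = (n0 * np.cov(X0.T, bias=True) + n1 * np.cov(X1.T, bias=True)) / (n0 + n1)
    return np.linalg.solve(pooled, X1.mean(axis=0) - X0.mean(axis=0))

def auc(scores, y):
    """AUC computed as the normalized Mann-Whitney rank-sum statistic."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n1 = int(y.sum())
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n0 * n1)

# Heteroscedastic classes: majority (n=2000) vs minority (n=100).
X0 = rng.normal(0.0, 1.0, size=(2000, 2))
X1 = rng.normal(1.5, 2.0, size=(100, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(2000), np.ones(100)]
w_unbal = fit_lda_direction(X, y)

# Rebalance by oversampling the minority class to the majority size.
idx = rng.choice(100, size=2000, replace=True)
Xb = np.vstack([X0, X1[idx]])
yb = np.r_[np.zeros(2000), np.ones(2000)]
w_bal = fit_lda_direction(Xb, yb)

# Evaluate both directions on a fresh, balanced test set.
Xt = np.vstack([rng.normal(0.0, 1.0, size=(1000, 2)),
                rng.normal(1.5, 2.0, size=(1000, 2))])
yt = np.r_[np.zeros(1000), np.ones(1000)]
auc_unbal = auc(Xt @ w_unbal, yt)
auc_bal = auc(Xt @ w_bal, yt)
```

Whether rebalancing helps in any one draw depends on the sampling noise; the note's contribution is the asymptotic result that full rebalancing maximizes the AUC gain for LDA.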

  15. Life cycle assessment of Italian citrus-based products. Sensitivity analysis and improvement scenarios.

    Science.gov (United States)

    Beccali, Marco; Cellura, Maurizio; Iudicello, Maria; Mistretta, Marina

    2010-07-01

    Though many studies concern the agro-food sector in the EU and Italy and its environmental impacts, the literature is quite lacking in works applying LCA to citrus products. This paper is one of the first studies on the environmental impacts of citrus products aimed at suggesting feasible strategies and actions to improve their environmental performance. In particular, it is part of a research effort to estimate the environmental burdens associated with the production of the following citrus-based products: essential oil, natural juice and concentrated juice from oranges and lemons. The life cycle assessment of these products, published in a previous paper, had highlighted significant environmental issues in terms of energy consumption, associated CO(2) emissions, and water consumption. Starting from those results, the authors carry out an improvement analysis of the assessed production system, whereby sustainable scenarios for saving water and energy are proposed to reduce its environmental burdens. In addition, a sensitivity analysis is performed to estimate the effects of the chosen methods on the outcome of the study. Uncertainty related to allocation methods, secondary data sources, and initial assumptions on cultivation, transport modes, and waste management is analysed. The performed analyses show that each assessed eco-profile is influenced differently by the uncertainty study. Different assumptions on initial data and methods produced very noticeable variations in the energy and environmental performances of the final products. Besides, the results show energy and environmental benefits that clearly indicate improvement of the products' eco-profile by reusing purified water for irrigation, using rail for the delivery of final products when possible, and adopting efficient technologies, such as mechanical vapour recompression, in the pasteurisation and

  16. Improving distillation method and device of tritiated water analysis for ultra high decontamination efficiency

    International Nuclear Information System (INIS)

    Fang, Hsin-Fa; Wang, Chu-Fang; Lin, Chien-Kung

    2015-01-01

    Monitoring environmental tritiated water is important for understanding the contamination dispersion of nuclear facilities. Tritium is a pure beta emitter that is usually measured by Liquid Scintillation Counting (LSC). The average energy of the tritium beta is only 5.658 keV, which makes LSC counting of tritium easily interfered with by betas emitted by other radionuclides. Environmental tritiated water samples therefore usually need to be decontaminated by distillation to reduce the interference. After the Fukushima Nuclear Accident, the highest gross beta concentration of groundwater samples obtained around the Fukushima Daiichi Nuclear Power Station exceeded 1,000,000 Bq/l, so a distillation method with ultra-high decontamination efficiency is needed for environmental tritiated water analysis. This study improves the heating temperature control for better sub-boiling distillation and modifies the height of the container of the air-cooled distillation device for a better fractional distillation effect. The DF of Cs-137 of the distillation may reach 450,000, which is far better than the prior study. The average loss rate of the improved method and device is about 2.6%, which is better than the bias value listed in ASTM D4107-08. The modified air-cooled distillation device provides an easy-handling, water-saving, low-cost and effective way of purifying water samples for beta-contaminated samples that need ultra-high decontamination treatment. - Highlights: • The improvements of method and device raise the decontamination factor (DF) of Cs-137 of distillation to over 450,000. • The decontamination factor (DF) value may be increased about 20 times by increasing the height of the container from 7 cm to 20 cm. • The device provides an easy-handling, water-saving, low-cost, stable and effective way to distill samples for tritiated water analysis.

  17. Crossing the Barriers: An Analysis of Land Access Barriers to Geothermal Development and Potential Improvement Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Katherine R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-10-04

    Developers have identified many non-technical barriers to geothermal power development, including access to land. Activities required for accessing land, such as environmental review and private and public leasing, can take a considerable amount of time and can delay or prevent project development. This paper discusses the impacts on available geothermal resources and deployment caused by land access challenges, including tribal and cultural resources, environmentally sensitive areas, biological resources, land ownership, federal and state lease queues, and proximity to military installations. In this analysis, we identified challenges that have the potential to prevent development of identified and undiscovered hydrothermal geothermal resources. We found that an estimated 400 MW of identified geothermal resource potential and 4,000 MW of undiscovered geothermal resource potential were either closed to development or subject to one or more significant barriers that could prevent development at the site. Potential improvement scenarios that could be employed to overcome these barriers include (1) providing continuous funding to the U.S. Forest Service (USFS) for processing geothermal leases and permit applications and (2) the creation of advanced environmental mitigation measures. The model results forecast that continuous funding to the USFS could result in deployment of an additional 80 MW of geothermal capacity by 2030 and 124 MW by 2050 when compared to the business-as-usual scenario. The creation of advanced environmental mitigation measures coupled with continuous funding to the USFS could result in deployment of an additional 97 MW of geothermal capacity by 2030 and 152 MW by 2050 when compared to the business-as-usual scenario. The small impact on potential deployment in these improvement scenarios suggests that these 4,400 MW face other barriers to development in addition to land access.
In other words, simply

  18. Knee Kinematic Improvement After Total Knee Replacement Using a Simplified Quantitative Gait Analysis Method

    Directory of Open Access Journals (Sweden)

    Hassan Sarailoo

    2013-10-01

    Full Text Available Objectives: The aim of this study was to extract suitable spatiotemporal and kinematic parameters to determine how Total Knee Replacement (TKR) alters patients’ knee kinematics during gait, using a rapid and simplified quantitative two-dimensional gait analysis procedure. Methods: Two-dimensional kinematic gait patterns of 10 participants were collected before and after TKR surgery, using a 60 Hz camcorder in the sagittal plane. The kinematic parameters were then extracted from the gait data. A Student's t-test was used to compare the group-averaged spatiotemporal and peak kinematic characteristics in the sagittal plane. The knee condition was also evaluated using the Oxford Knee Score (OKS) questionnaire to ensure that each subject was placed in the right group. Results: The results showed a significant improvement in knee flexion during the stance and swing phases after TKR surgery. Walking speed increased as a result of improved stride length and cadence, but this increase was not statistically significant. Both the post-TKR and control groups showed an increment in spatiotemporal and peak kinematic characteristics between comfortable and fast walking speeds. Discussion: The objective kinematic parameters extracted from 2D gait data were able to show significant improvements of the knee joint after TKR surgery. The patients with TKR surgery were also able to improve their knee kinematics at fast walking speed to the same degree as the control group. These results provide good insight into the capabilities of the presented method to evaluate knee functionality before and after TKR surgery and to define a more effective rehabilitation program.

  19. Improvement of the methods for company’s fixed assets analysis

    Directory of Open Access Journals (Sweden)

    T. A. Zhurkina

    2018-01-01

    Full Text Available Fixed assets are an integral component of the productive capacity of any enterprise, and the financial results of the enterprise largely depend on the intensity and efficiency of their use. The analysis of fixed assets is usually carried out using an integrated and systematic approach, based on their availability, their movement, and the efficiency of their use (including their active part). In the opinion of some authors, the traditional methods of analyzing fixed assets have a number of shortcomings, since they do not take into account the life cycle of an enterprise, the ecological aspects of the operation of fixed assets, or the operational specifics of the individual divisions of a company and its branches. In order to improve the methodology for analyzing fixed assets, the authors propose the use of formalized and non-formalized criteria for analyzing the risks associated with fixed asset use. A survey questionnaire was designed to determine the likelihood of the risk of economic losses associated with the use of fixed assets. The authors propose using an integral indicator for analyzing the risk of using fixed assets in dynamics. To improve the auditing procedure, the authors propose segregating economic transactions with fixed assets according to their cycles, in accordance with the stage of their reproduction. Operational analysis is important for managing the efficiency of fixed asset use, especially during a critical period. Analyzing the regularity of grain combine performance would reduce losses during harvesting, allow the work to be completed within a strictly defined time frame, and allow employees to be remunerated for high-quality and intensive performance of their tasks.

  20. Workplace interventions to improve work ability: A systematic review and meta-analysis of their effectiveness.

    Science.gov (United States)

    Oakman, Jodi; Neupane, Subas; Proper, Karin I; Kinsman, Natasha; Nygård, Clas-Håkan

    2018-03-01

    Objective Extended working lives due to an ageing population will necessitate the maintenance of work ability across the life course. This systematic review aimed to analyze whether workplace interventions positively impact work ability. Methods We searched the Medline, PsycINFO, CINAHL and Embase databases using relevant terms. Work-based interventions were those focused on individuals, the workplace, or multilevel (a combination). Work ability - measured using the work ability index (WAI) or the single-item work ability score (WAS) - was the outcome measure. Grading of Recommendations Assessment, Development & Evaluation (GRADE) criteria were used to assess evidence quality, and impact statements were developed to synthesize the results. Meta-analysis was undertaken where appropriate. Results We reviewed 17 randomized control trials (comprising 22 articles). Multilevel interventions (N=5) included changes to work arrangements and liaisons with supervisors, whilst individual-focused interventions (N=12) involved behavior change or exercise programs. We identified only evidence of a moderate quality for either individual or multilevel interventions aiming to improve work ability. The meta-analysis of 13 studies found a small positive significant effect of interventions on work ability [overall pooled mean 0.12, 95% confidence interval (CI) 0.03-0.21] with no heterogeneity for the effect size (χ²=11.28, P=0.51; I²=0%). Conclusions The meta-analysis showed a small positive effect, suggesting that workplace interventions might improve work ability. However, the quality of the evidence base was only moderate, precluding any firm conclusion. Further high-quality studies are required to establish the role of interventions on work ability.
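The pooled estimate and heterogeneity statistics quoted above are the standard inverse-variance fixed-effect computation together with Cochran's Q and I². A generic sketch of that calculation (not the review's actual analysis code):

```python
import math

def fixed_effect_pool(effects, ses):
    """Inverse-variance fixed-effect meta-analysis.

    Returns the pooled effect, its 95% CI, Cochran's Q, and the I^2
    heterogeneity statistic (as a percentage, floored at 0)."""
    w = [1.0 / se ** 2 for se in ses]            # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se_pooled = math.sqrt(1.0 / sum(w))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, ci, q, i2
```

An I² of 0%, as reported in the review, simply means Q did not exceed its degrees of freedom, i.e. the between-study variation is no larger than expected from sampling error alone.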

  1. An improved method for Multipath Hemispherical Map (MHM) based on Trend Surface Analysis

    Science.gov (United States)

    Wang, Zhiren; Chen, Wen; Dong, Danan; Yu, Chao

    2017-04-01

    Among the various approaches developed for detecting the multipath effect in high-accuracy GNSS positioning, only MHM (Multipath Hemispherical Map) and SF (Sidereal Filtering) can be implemented in real-time GNSS data processing. SF is based on the time repeatability of the satellites, which makes it suitable only for static environments, while the spatiotemporal-repeatability-based MHM is applicable not only to static environments but also to dynamic carriers in a static multipath environment, such as ships and airplanes, and uses a much smaller number of parameters than ASF. However, the MHM method also has certain defects. Since the MHM takes the mean of the residuals in each grid cell as the filter value, it is better suited to medium- to low-frequency multipath regimes. Existing research indicates that, by contrast, the advanced Sidereal Filtering (ASF) method performs better at reducing high-frequency multipath than MHM. To address this problem and improve MHM's performance on high-frequency multipath, we combined a bivariate trend surface analysis method with the original MHM model to effectively analyze the particular spatial distribution and variation trends of the multipath effect. We computed trend surfaces of the residuals within each grid cell by least-squares procedures, and chose the best results through successive tests of fit. The enhanced MHM grid was constructed from the set of coefficients of the fitted equation instead of the mean value. Analysis of actual observations shows that the improved MHM model has a positive effect on high-frequency multipath reduction and significantly reduces the root mean square (RMS) value of the carrier residuals. Keywords: Trend Surface Analysis; Multipath Hemispherical Map; high frequency multipath effect
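The cell-level change described above, storing fitted trend-surface coefficients instead of a single mean residual, can be sketched as a least-squares polynomial fit over (azimuth, elevation). A hypothetical illustration (function names and the quadratic degree are our assumptions, not taken from the paper):

```python
import numpy as np

def fit_trend_surface(az, el, res, degree=2):
    """Least-squares trend surface over (azimuth, elevation) within one grid cell.

    Fits res ~ sum_{i+j<=degree} c_ij * az**i * el**j and returns the term
    exponents and coefficients, which the enhanced MHM cell stores in place
    of a single mean residual."""
    az, el, res = (np.asarray(v, float) for v in (az, el, res))
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack([az ** i * el ** j for i, j in terms])
    coef, *_ = np.linalg.lstsq(A, res, rcond=None)
    return terms, coef

def eval_trend_surface(terms, coef, az, el):
    """Multipath correction predicted by the fitted surface at (az, el)."""
    return sum(c * az ** i * el ** j for (i, j), c in zip(terms, coef))
```

Because the correction now varies within the cell instead of being constant, the surface can track residual variation that a cell mean averages away, which is the intuition behind the improved high-frequency performance.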

  2. Interventions to improve gross motor performance in children with neurodevelopmental disorders: a meta-analysis.

    Science.gov (United States)

    Lucas, Barbara R; Elliott, Elizabeth J; Coggan, Sarah; Pinto, Rafael Z; Jirikowic, Tracy; McCoy, Sarah Westcott; Latimer, Jane

    2016-11-29

    Gross motor skills are fundamental to childhood development. The effectiveness of current physical therapy options for children with mild to moderate gross motor disorders is unknown. The aim of this study was to systematically review the literature to investigate the effectiveness of conservative interventions to improve gross motor performance in children with a range of neurodevelopmental disorders. A systematic review with meta-analysis was conducted. MEDLINE, EMBASE, AMED, CINAHL, PsycINFO, PEDro, Cochrane Collaboration and Google Scholar databases and clinical trial registries were searched. Eligible studies were published randomised controlled trials including children aged 3 to ≤18 years with (i) Developmental Coordination Disorder (DCD) or Cerebral Palsy (CP) (Gross Motor Function Classification System Level 1) or Developmental Delay or Minimal Acquired Brain Injury or Prematurity, with gross motor outcomes obtained using a standardised assessment tool. Meta-analysis was performed to determine the pooled effect of intervention on gross motor function. Methodological quality and strength of meta-analysis recommendations were evaluated using PEDro and the GRADE approach, respectively. Of 2513 papers, 9 met the inclusion criteria, including children with CP (n = 2) or DCD (n = 7) receiving 11 different interventions. Only two of the 9 trials showed an effect for treatment. Using the least conservative trial outcomes, a large beneficial effect of intervention was shown (SMD: -0.8; 95% CI: -1.1 to -0.5) with "very low quality" GRADE ratings. Using the most conservative trial outcomes, there is no treatment effect (SMD: -0.1; 95% CI: -0.3 to 0.2) with "low quality" GRADE ratings. Study limitations included the small number and poor quality of the available trials. Although we found that some interventions with a task-orientated framework can improve gross motor outcomes in children with DCD or CP, these findings are limited by the very low quality of the available evidence. High quality intervention

  3. Improving primary health care facility performance in Ghana: efficiency analysis and fiscal space implications.

    Science.gov (United States)

    Novignon, Jacob; Nonvignon, Justice

    2017-06-12

    Health centers in Ghana play an important role in health care delivery, especially in deprived communities. They usually serve as the first line of service and meet basic health care needs. Unfortunately, these facilities are faced with inadequate resources. While health policy makers seek to increase the resources committed to primary healthcare, it is important to understand the nature of the inefficiencies that exist in these facilities. Therefore, the objectives of this study are threefold: (i) estimate efficiency among primary health facilities (health centers), (ii) examine the potential fiscal space from improved efficiency and (iii) investigate the efficiency disparities between public and private facilities. Data were from the 2015 Access Bottlenecks, Cost and Equity (ABCE) project conducted by the Institute for Health Metrics and Evaluation. Stochastic Frontier Analysis (SFA) was used to estimate the efficiency of health facilities. Efficiency scores were then used to compute potential savings from improved efficiency. Outpatient visits were used as the output, while the number of personnel, hospital beds, and expenditure on other capital items and administration were used as inputs. Disparities in efficiency between public and private facilities were estimated using the Nopo matching decomposition procedure. The average efficiency score across all health centers included in the sample was estimated to be 0.51. Average efficiency was estimated to be about 0.65 and 0.50 for private and public facilities, respectively. Significant disparities in efficiency were identified across the various administrative regions. With regard to potential fiscal space, we found that, on average, facilities could save about GH₵11,450.70 (US$7633.80) if efficiency were improved. We also found that fiscal space from efficiency gains varies across rural/urban as well as private/public facilities, if best practices are followed. The matching decomposition showed an efficiency gap of 0.29 between private

  4. An improved model for whole genome phylogenetic analysis by Fourier transform.

    Science.gov (United States)

    Yin, Changchuan; Yau, Stephen S-T

    2015-10-07

    and demonstrates that the improved DFT dissimilarity measure is an efficient and effective similarity measure of DNA sequences. Due to its high efficiency and accuracy, the proposed DFT similarity measure is successfully applied on phylogenetic analysis for individual genes and large whole bacterial genomes. Copyright © 2015 Elsevier Ltd. All rights reserved.
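
A DFT-based dissimilarity of the general kind described here maps each sequence to binary indicator signals per nucleotide, takes their power spectra, and compares spectra of a common length. The sketch below illustrates that idea in a simplified form (zero-padding to a common length rather than the paper's exact scaling scheme; the function names are illustrative):

```python
import numpy as np

BASES = "ACGT"

def dft_spectrum(seq, n):
    """Summed power spectrum of the four binary indicator sequences,
    computed with an n-point FFT (zero-padded) and length-normalized."""
    spec = np.zeros(n)
    for b in BASES:
        u = np.array([1.0 if c == b else 0.0 for c in seq])
        spec += np.abs(np.fft.fft(u, n)) ** 2
    return spec / len(seq)

def dft_distance(s1, s2):
    """Euclidean distance between the two power spectra on a common length."""
    n = max(len(s1), len(s2))
    return float(np.linalg.norm(dft_spectrum(s1, n) - dft_spectrum(s2, n)))
```

Identical sequences yield distance zero; the measure needs no alignment, which is what makes whole-genome comparisons tractable.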

  5. Modified Truncated Multiplicity Analysis to Improve Verification of Uranium Fuel Cycle Materials

    International Nuclear Information System (INIS)

    LaFleur, A.; Miller, K.; Swinhoe, M.; Belian, A.; Croft, S.

    2015-01-01

    Accurate verification of 235U enrichment and mass in UF6 storage cylinders and the UO2F2 holdup contained in the process equipment is needed to improve international safeguards and nuclear material accountancy at uranium enrichment plants. Small UF6 cylinders (1.5'' and 5'' diameter) are used to store the full range of enrichments from depleted to highly-enriched UF6. For independent verification of these materials, it is essential that the 235U mass and enrichment measurements do not rely on facility operator declarations. Furthermore, in order to be deployed by IAEA inspectors to detect undeclared activities (e.g., during complementary access), it is also imperative that the measurement technique is quick, portable, and sensitive to a broad range of 235U masses. Truncated multiplicity analysis is a technique that reduces the variance in the measured count rates by only considering moments 1, 2, and 3 of the multiplicity distribution. This is especially important for reducing the uncertainty in the measured doubles and triples rates in environments with a high cosmic ray background relative to the uranium signal strength. However, we believe that the existing truncated multiplicity analysis throws away too much useful data by truncating the distribution after the third moment. This paper describes a modified truncated multiplicity analysis method that determines the optimal moment to truncate the multiplicity distribution based on the measured data. Experimental measurements of small UF6 cylinders and UO2F2 working reference materials were performed at Los Alamos National Laboratory (LANL). The data were analyzed using traditional and modified truncated multiplicity analysis to determine the optimal moment to truncate the multiplicity distribution to minimize the uncertainty in the measured count rates. The results from this analysis directly support nuclear safeguards at enrichment plants and provide a more accurate verification method for UF6
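
Truncated multiplicity analysis retains only the first few moments of the measured multiplicity histogram. The reduced factorial moments on which the singles/doubles/triples rates are built can be sketched as below; the detector-specific deadtime, efficiency, and gate-fraction corrections needed to convert moments into rates are omitted:

```python
from math import comb

def reduced_factorial_moments(hist, kmax=3):
    """Reduced factorial moments m_k = sum_n C(n, k) * P(n) of a neutron
    multiplicity histogram, where hist[n] is the number of events with
    multiplicity n. Truncation keeps only k = 1..kmax (here 3)."""
    total = sum(hist)
    p = [h / total for h in hist]          # normalized multiplicity distribution
    return [sum(comb(n, k) * pn for n, pn in enumerate(p))
            for k in range(1, kmax + 1)]
```

The modified method described above would choose the truncation order from the data rather than fixing kmax = 3.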

  6. Flipped classroom improves student learning in health professions education: a meta-analysis.

    Science.gov (United States)

    Hew, Khe Foon; Lo, Chung Kwan

    2018-03-15

    The use of the flipped classroom approach has become increasingly popular in health professions education. However, no meta-analysis has been published that specifically examines the effect of flipped versus traditional classrooms on student learning. This study examined the findings of comparative articles through a meta-analysis in order to summarize the overall effects of teaching with the flipped classroom approach. We focused specifically on a set of flipped classroom studies in which pre-recorded videos were provided before face-to-face class meetings. These comparative articles focused on health care professionals including medical students, residents, doctors, nurses, or learners in other health care professions and disciplines (e.g., dental, pharmacy, environmental or occupational health). Using predefined study eligibility criteria, seven electronic databases were searched in mid-April 2017 for relevant articles. Methodological quality was graded using the Medical Education Research Study Quality Instrument (MERSQI). Effect sizes, heterogeneity estimates, analysis of possible moderators, and publication bias were computed using the COMPREHENSIVE META-ANALYSIS software. A meta-analysis of 28 eligible comparative studies (between-subject design) showed an overall significant effect in favor of flipped classrooms over traditional classrooms for health professions education (standardized mean difference, SMD = 0.33, 95% confidence interval, CI = 0.21-0.46, p < 0.05). The flipped classroom approach was more effective when instructors used quizzes at the start of each in-class session. More respondents reported they preferred flipped to traditional classrooms. Current evidence suggests that the flipped classroom approach in health professions education yields a significant improvement in student learning compared with traditional teaching methods.

  7. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.

  8. Improving the durability of the optical fiber sensor based on strain transfer analysis

    Science.gov (United States)

    Wang, Huaping; Jiang, Lizhong; Xiang, Ping

    2018-05-01

    To realize reliable, long-term strain detection, the durability of optical fiber sensors has attracted more and more attention. Packaging techniques have been considered an effective method, as they can enhance the survival ratio of optical fiber sensors in the harsh construction and service environments of civil engineering. To monitor the internal strain of structures, embedded installation is adopted. Due to the different material properties of the host material and the protective layer, the monitored structure embedded with sensors can be regarded as a typical model containing inclusions. An interfacial characteristic between the sensor and host material clearly exists, and the contact interface is prone to debonding failure induced by large interfacial shear stress. To recognize local interfacial debonding damage and extend the effective life cycle of the embedded sensor, strain transfer analysis of a general three-layered sensing model is conducted to investigate the failure mechanism. The perturbation of the embedded sensor on the local strain field of the host material is discussed. Based on the theoretical analysis, the distribution of the interfacial shear stress along the sensing length is characterized and adopted for the diagnosis of local interfacial debonding, and the sensitive parameters influencing the interfacial shear stress are also investigated. The research in this paper explores the interfacial debonding failure mechanism of embedded sensors based on strain transfer analysis and provides a theoretical basis for enhancing the interfacial bonding properties and improving the durability of embedded optical fiber sensors.

  9. SWOT analysis of a pediatric rehabilitation programme: a participatory evaluation fostering quality improvement.

    Science.gov (United States)

    Camden, Chantal; Swaine, Bonnie; Tétreault, Sylvie; Bergeron, Sophie

    2009-01-01

    To present the results of a strengths, weaknesses, opportunities and threats (SWOT) analysis used as part of a process aimed at reorganising services provided within a pediatric rehabilitation programme (PRP) in Quebec, Canada and to report the perceptions of the planning committee members regarding the usefulness of the SWOT in this process. Thirty-six service providers working in the PRP completed a SWOT questionnaire and reported what they felt worked and what did not work in the existing model of care. Their responses were used by a planning committee over a 12-month period to assist in the development of a new service delivery model. Committee members shared their thoughts about the usefulness of the SWOT. Current programme strengths included favourable organisational climate and interdisciplinary work whereas weaknesses included lack of psychosocial support to families and long waiting times for children. Opportunities included working with community partners, whereas fear of losing professional autonomy with the new service model was a threat. The SWOT results helped the planning committee redefine the programme goals and make decisions to improve service coordination. SWOT analysis was deemed as a very useful tool to help guide service reorganisation. SWOT analysis appears to be an interesting evaluation tool to promote awareness among service providers regarding the current functioning of a rehabilitation programme. It fosters their active participation in the reorganisation of a new service delivery model for pediatric rehabilitation.

  10. Analysis and improvement of digital control stability for master-slave manipulator system

    International Nuclear Information System (INIS)

    Yoshida, Koichi; Yabuta, Tetsuro

    1992-01-01

    Some bilateral controls of master-slave systems have been designed that can realize high-fidelity telemanipulation, as if the operator were manipulating the object directly. While typical robot systems are controlled by a software servo system using a digital computer, little work has been published on the design and analysis of digital control for these systems, which must consider the time delay of sensor signals and the zero-order-hold effect of command signals on actuators. This paper presents a digital control analysis for a single-degree-of-freedom master-slave system including impedance models of both the human operator and the task object, which clarifies an index for stability. The stability result suggests a virtual master-slave system concept, which improves digital control stability. We first analyze a dynamic control method of the master-slave system in discrete time with respect to the stability problem; the method can realize high-fidelity telemanipulation in continuous time. Secondly, using the results of the stability analysis, a robust control scheme for the master-slave system is proposed, and the validity of this scheme is finally confirmed by simulation. Consequently, any combination of master and slave modules with dynamic models of these manipulators can be used to construct a stable master-slave system. (author)

  11. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are serious concerns. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used to identify the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through record review and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and to introduce preventive measures, in order to find areas where resources need to be allocated to improve patient safety.

  12. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    International Nuclear Information System (INIS)

    Kim, Heui Dong; Cho, Sung Won; Park, Jong Hwa; Hong, Sung Wan; Yoo, Dong Han; Hwang, Moon Kyoo; Noh, Kee Man; Song, Yong Man

    1995-07-01

    For prevention and mitigation of containment failure during a severe accident, the study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation equation for the flame velocity of a premixed H2/air/steam gas has been suggested, and the combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of a vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued, in order to review and examine the limitations and deficiencies of the existing models. A pre-test calculation was performed to support the severe accident experiment for the molten corium-concrete interaction study, and analyses of the crust formation process and the heat transfer characteristics of the crust have been carried out. A stress analysis code was developed using the finite element method for reactor vessel lower head failure analysis. Through the international PHEBUS-FP program and participation in the software development, research on the core degradation process and on fission product release and transport is ongoing. The CONTAIN and MELCOR codes were continuously updated in cooperation with the USNRC, and French computer codes such as ICARE2, ESCADRE and SOPHAEROS were also installed on the SUN workstation. 204 figs, 61 tabs, 87 refs. (Author)

  13. MO-FG-202-06: Improving the Performance of Gamma Analysis QA with Radiomics- Based Image Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wootton, L; Nyflot, M; Ford, E [University of Washington Department of Radiation Oncology, Seattle, WA (United States); Chaovalitwongse, A [University of Washington Department of Industrial and Systems Engineering, Seattle, Washington (United States); University of Washington Department of Radiology, Seattle, WA (United States); Li, N [University of Washington Department of Industrial and Systems Engineering, Seattle, Washington (United States)

    2016-06-15

    Purpose: The use of gamma analysis for IMRT quality assurance has well-known limitations. Traditionally, a simple thresholding technique is used to evaluate passing criteria. However, like any image, the gamma distribution is rich in information, most of which thresholding discards. We therefore propose a novel method of analyzing gamma images that uses quantitative image features borrowed from radiomics, with the goal of improving error detection. Methods: 368 gamma images were generated from 184 clinical IMRT beams. For each beam, the dose to a phantom was measured with EPID dosimetry and compared to the TPS dose calculated with and without normally distributed (2 mm sigma) errors in MLC positions. The magnitudes of 17 intensity-histogram and size-zone radiomic features were derived from each image. The features that differed most significantly between image sets were determined with ROC analysis. A linear machine-learning model was trained on these features to classify images as with or without errors on 180 gamma images. The model was then applied to an independent validation set of 188 additional gamma distributions, half with and half without errors. Results: The most significant features for detecting errors were histogram kurtosis (p=0.007) and three size-zone metrics (p<1e-6 for each). The size-zone metrics detected clusters of high gamma-value pixels under mispositioned MLCs. The model applied to the validation set had an AUC of 0.8, compared to 0.56 for traditional gamma analysis with the decision threshold restricted to 98% or less. Conclusion: A radiomics-based image analysis method was developed that is more effective in detecting errors than traditional gamma analysis. Though the pilot study here considers only MLC position errors, radiomics-based methods for other error types are being developed, which may provide better error detection and useful information on the source of detected errors.
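
The AUC figures quoted in the results can be reproduced conceptually with the rank-based (Mann-Whitney) estimator of the area under the ROC curve; a small self-contained sketch, independent of the study's analysis software:

```python
def auc_from_scores(pos, neg):
    """Rank-based AUC: the probability that a randomly chosen positive
    (error-containing) case scores higher than a randomly chosen negative
    one, with ties counted as half. Equals Mann-Whitney U / (n_pos * n_neg)."""
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

Perfectly separated scores give AUC = 1.0, and indistinguishable scores give 0.5, the chance level against which the reported 0.8 is compared.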

  14. Partnership capacity for community health improvement plan implementation: findings from a social network analysis.

    Science.gov (United States)

    McCullough, J Mac; Eisen-Cohen, Eileen; Salas, S Bianca

    2016-07-13

    Many health departments collaborate with community organizations on community health improvement processes. While a number of resources exist to plan and implement a community health improvement plan (CHIP), little empirical evidence exists on how to leverage and expand partnerships when implementing a CHIP. The purpose of this study was to identify characteristics of the network involved in implementing the CHIP in one large community. The aims of this analysis are to: 1) identify essential network partners (and thereby highlight potential network gaps), 2) gauge current levels of partner involvement, 3) understand and effectively leverage network resources, and 4) enable a data-driven approach for future collaborative network improvements. We collected primary data via survey from n = 41 organizations involved in the Health Improvement Partnership of Maricopa County (HIPMC), in Arizona. Using the previously validated Program to Analyze, Record, and Track Networks to Enhance Relationships (PARTNER) tool, organizations provided information on existing ties with other coalition members, including frequency and depth of partnership and eight categories of perceived value/trust of each current partner organization. The coalition's overall network had a density score of 30%, a degree centralization score of 73%, and a trust score of 81%. Network maps are presented to identify existing relationships between HIPMC members according to partnership frequency and intensity, duration of involvement in the coalition, and self-reported contributions to the coalition. Overall, the number of ties and other partnership measures were positively correlated with an organization's perceived value and trustworthiness as rated by other coalition members. Our study presents a novel use of social network analysis methods to evaluate the coalition of organizations involved in implementing a CHIP in an urban community. The large coalition had relatively low network density but high
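
The whole-network summary measures reported (density 30%, degree centralization 73%) have standard definitions for an undirected graph; a minimal sketch of both, not tied to the PARTNER tool:

```python
def density(n, edges):
    """Fraction of possible ties that are present in an undirected network."""
    return 2.0 * len(edges) / (n * (n - 1))

def degree_centralization(n, edges):
    """Freeman degree centralization: how far the degree distribution is
    from the most centralized possible (star) network, on a 0-1 scale."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    dmax = max(deg)
    # (n-1)*(n-2) is the maximum possible sum of (dmax - d_i), attained by a star
    return sum(dmax - d for d in deg) / ((n - 1) * (n - 2))
```

A star network scores 1.0 on centralization; a network where every member has the same number of ties scores 0.0.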

  15. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) to determine environmental levels of 239Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically-induced precipitation phase, followed by anion exchange chromatography, and employs a chemical tracer, 236Pu. We have been able to establish an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies, samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)

  16. Analysis of the Efficacy of an Intervention to Improve Parent-Adolescent Problem Solving.

    Science.gov (United States)

    Semeniuk, Yulia Yuriyivna; Brown, Roger L; Riesch, Susan K

    2016-07-01

    We conducted a two-group longitudinal, partially nested randomized controlled trial to examine whether young adolescent youth-parent dyads participating in Mission Possible: Parents and Kids Who Listen, in contrast to a comparison group, would demonstrate improved problem-solving skill. The intervention is based on the Circumplex Model and Social Problem-Solving Theory. The Circumplex Model posits that balanced families, that is, those characterized by high cohesion, flexibility, and open communication, function best. Social Problem-Solving Theory informs the process and skills of problem solving. The Conditional Latent Growth Modeling analysis revealed no statistically significant differences in problem solving among the final sample of 127 dyads in the intervention and comparison groups. Analyses of effect sizes indicated large-magnitude group effects for selected scales for youth and dyads, suggesting potential efficacy and identifying for whom the intervention may be efficacious if study limitations and lessons learned are addressed. © The Author(s) 2016.

  17. Analysis of the characteristics of taxi services as a prerequisite for their improvement

    Directory of Open Access Journals (Sweden)

    Vujić Nenad

    2014-01-01

    Full Text Available The expansion of the services sector is a characteristic of modern, developed societies and influences the national economy. Therefore, the analysis of services, as a concept and a part of marketing, is very significant. In this sense, the paper investigates a particular service: taxi services in the capital of Serbia. Through this research, the authors try to define the groups of customers of taxi services and their preferences and attitudes. The research was performed in the period from May to July 2014, through direct contact with customers of taxi services. The results of the research confirmed the initial hypothesis and provide possibilities for further insight into the way taxi services are used and the general circumstances that characterize them in the region. On this basis, proposals are provided for the improvement of taxi services and easier outreach to target groups.

  18. Environmental impact assessment in Colombia: Critical analysis and proposals for improvement

    International Nuclear Information System (INIS)

    Toro, Javier; Requena, Ignacio; Zamorano, Montserrat

    2010-01-01

    The evaluation of Environmental Impact Assessment (EIA) systems is a highly recommended strategy for enhancing their effectiveness and quality. This paper describes an evaluation of EIA in Colombia, using the model and the control mechanisms proposed and applied in other countries by Christopher Wood and Ortolano. The evaluation criteria used are based on Principles of Environmental Impact Assessment Best Practice, such as effectiveness and control features, and they were contrasted with the opinions of a panel of Colombian EIA experts as a means of validating the results of the study. The evaluation found that EIA regulations in Colombia were ineffective because of limited scope, inadequate administrative support and the absence of effective control mechanisms and public participation. This analysis resulted in a series of recommendations regarding the further development of the EIA system in Colombia with a view to improving its quality and effectiveness.

  19. Analysis from reviews in Social Media to improve hotel´s online reputation

    Directory of Open Access Journals (Sweden)

    Daissy Hatblathy Moya Sánchez

    2017-07-01

    Full Text Available Today, hoteliers have problems handling their online reputation due to bad reviews received on social networks. The aim of this research is to identify the key factors to consider in the operation of each hotel in order to avoid negative comments and increase its online reputation. The study analyzed the ratings received through virtual channels by 57 Latin American hotels belonging to the GHL Hotel Chain from March 31st, 2015 to March 31st, 2016. Using the software Revinate, the reviews were analyzed by department and then classified to develop a manual of good practices. From the analysis of those comments, recommendations were made for six areas of the hotels - Rooms, Food and Beverage, Front Desk, Business Center, Security, and Management - to optimize quality in the hotels and thus improve their online reputation.

  20. Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue

    Science.gov (United States)

    Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.

    1989-01-01

    The high performance liquid chromatography (HPLC) method of Flores and Galston (1982 Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. In addition, two polyamines, putrescine and diaminopropane, are not well resolved using this method. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.

  1. DAMBE7: New and Improved Tools for Data Analysis in Molecular Biology and Evolution.

    Science.gov (United States)

    Xia, Xuhua

    2018-06-01

    DAMBE is a comprehensive software package for genomic and phylogenetic data analysis on Windows, Linux, and Macintosh computers. New functions include imputing missing distances and phylogeny simultaneously (paving the way to building large phage and transposon trees), new bootstrapping/jackknifing methods for PhyPA (phylogenetics from pairwise alignments), and an improved function for fast and accurate estimation of the shape parameter of the gamma distribution for fitting rate heterogeneity over sites. The previous method corrects multiple hits for each site independently; DAMBE's new method uses all sites simultaneously for the correction. DAMBE, featuring a user-friendly graphic interface, is freely available from http://dambe.bio.uottawa.ca (last accessed April 17, 2018).

  2. Improved analysis of bacterial CGH data beyond the log-ratio paradigm

    Directory of Open Access Journals (Sweden)

    Aakra Ågot

    2009-03-01

    Full Text Available Abstract Background Existing methods for analyzing bacterial CGH data from two-color arrays are based on log-ratios only, a paradigm inherited from expression studies. We propose an alternative approach, where microarray signals are used in a different way and sequence identity is predicted using a supervised learning approach. Results A data set containing 32 hybridizations of sequenced versus sequenced genomes has been used to test and compare methods. A ROC analysis has been performed to illustrate the ability to rank probes with respect to Present/Absent calls. Classification into Present and Absent is compared with that of a Gaussian mixture model. Conclusion The results indicate that our proposed method improves on existing methods with respect to the ranking and classification of probes, especially for multi-genome arrays.
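The ROC analysis in this record scores how well Present probes can be ranked above Absent ones. A minimal sketch of the rank-based area under the ROC curve (equivalent to the Mann-Whitney statistic); the probe scores below are hypothetical, not taken from the study:

```python
def roc_auc(scores_present, scores_absent):
    """Rank-based AUC: probability that a Present probe outscores an Absent one,
    counting ties as half a win."""
    wins = ties = 0
    for p in scores_present:
        for a in scores_absent:
            if p > a:
                wins += 1
            elif p == a:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_present) * len(scores_absent))
```

An AUC of 1.0 means the ranking separates the two classes perfectly; 0.5 is no better than chance.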

  3. Analysis of an Online Match Discussion Board: Improving the Otolaryngology—Head and Neck Surgery Match

    Science.gov (United States)

    Kozin, Elliott D.; Sethi, Rosh; Lehmann, Ashton; Remenschneider, Aaron K.; Golub, Justin S.; Reyes, Samuel A.; Emerick, Kevin; Lee, Daniel J.; Gray, Stacey T.

    2015-01-01

    Introduction “The Match” has become the accepted selection process for graduate medical education. Otomatch.com has provided an online forum for Otolaryngology-Head and Neck Surgery (OHNS) Match-related questions for over a decade. Herein, we aim to 1) delineate the types of posts on Otomatch to better understand the perspective of medical students applying for residency and 2) provide recommendations to potentially improve the Match process. Methods Discussion forum posts on Otomatch between December 2001 and April 2014 were reviewed. The title of each thread and total number of views were recorded for quantitative analysis. Each thread was organized into one of six major categories and one of eighteen subcategories, based on chronology within the application cycle and topic. National Resident Matching Program (NRMP) data were utilized for comparison. Results We identified 1,921 threads corresponding to over 2 million page views. Over 40% of threads related to questions about specific programs, and 27% were discussions about interviews. Views, a surrogate measure for popularity, reflected different trends. The majority of individuals viewed posts on interviews (42%), program specific questions (20%) and how to rank programs (11%). Increase in viewership tracked with a rise in applicant numbers based on NRMP data. Conclusions Our study provides an in depth analysis of a popular discussion forum for medical students interested in the OHNS Match. The most viewed posts are about interview dates and questions regarding specific programs. We provide suggestions to address unmet needs for medical students and potentially improve the Match process. PMID:25550223

  4. A critical analysis of energy efficiency improvement potentials in Taiwan's cement industry

    International Nuclear Information System (INIS)

    Huang, Yun-Hsun; Chang, Yi-Lin; Fleiter, Tobias

    2016-01-01

    The cement industry is the second most energy-intensive sector in Taiwan, which underlines the need to understand its potential for energy efficiency improvement. A bottom-up model-based assessment is utilized to conduct a scenario analysis of energy saving opportunities up to the year 2035. The analysis is supported by detailed expert interviews in all cement plants of Taiwan. The simulation results reveal that by 2035, eighteen energy efficient technologies could result in 25% savings for electricity and 9% savings for fuels under the technical diffusion scenario. This potential amounts to about 5000 TJ/year in total, of which 91% can be implemented cost-effectively assuming a discount rate of 10%. Policy makers should support a fast diffusion of these technologies. Additionally, policy makers can tap further saving potentials: first, by decreasing the clinker share, which is currently regulated to a minimum of 95%; second, by relaxing the prohibition on building new cement plants to allow replacement of existing capacity with new innovative plants in the coming years; third, by supporting the use of alternative fuels, which is currently still a niche in Taiwan. - Highlights: •We analyze energy efficiency improvement potentials in Taiwan's cement industry. •Eighteen process-specific technologies are analyzed using a bottom-up model. •Our model systematically reflects the diffusion of technologies over time. •We find energy-saving potentials of 25% for electricity and 9% for fuels in 2035. •91% of the energy-saving potentials can be realized cost-effectively.
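The cost-effectiveness screening above (91% of savings realizable at a 10% discount rate) amounts to a net-present-value test per technology. A hedged sketch of that test; the investment, saving, and lifetime figures below are illustrative, not from the study:

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows; index 0 is year zero (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def cost_effective(investment, annual_saving, lifetime_years, rate=0.10):
    """True if discounted energy-cost savings repay the up-front investment."""
    flows = [-investment] + [annual_saving] * lifetime_years
    return npv(flows, rate) > 0
```

With hypothetical numbers, an efficiency measure costing 100 that saves 60 per year pays back within two years at a 10% discount rate, but not within one.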

  5. Observations of Tunable Resistive Pulse Sensing for Exosome Analysis: Improving System Sensitivity and Stability.

    Science.gov (United States)

    Anderson, Will; Lane, Rebecca; Korbie, Darren; Trau, Matt

    2015-06-16

    Size distribution and concentration measurements of exosomes are essential when investigating their cellular function and uptake. Recently, a particle size distribution and concentration measurement platform known as tunable resistive pulse sensing (TRPS) has seen increased use for the characterization of exosome samples. TRPS measures the brief increase in electrical resistance (a resistive pulse) produced by individual submicrometer/nanoscale particles as they translocate through a size-tunable submicrometer/micrometer-sized pore, embedded in an elastic membrane. Unfortunately, TRPS measurements are susceptible to issues surrounding system stability, where the pore can become blocked by particles, and sensitivity issues, where particles are too small to be detected against the background noise of the system. Herein, we provide a comprehensive analysis of the parameters involved in TRPS exosome measurements and demonstrate the ability to improve system sensitivity and stability by the optimization of system parameters. We also provide the first analysis of system noise, sensitivity cutoff limits, and accuracy with respect to exosome measurements and offer an explicit definition of system sensitivity that indicates the smallest particle diameter that can be detected within the noise of the trans-membrane current. A comparison of exosome size measurements from both TRPS and cryo-electron microscopy is also provided, finding that a significant number of smaller exosomes fell below the detection limit of the TRPS platform and offering one potential insight as to why there is such large variability in the exosome size distribution reported in the literature. We believe the observations reported here may assist others in improving TRPS measurements for exosome samples and other submicrometer biological and nonbiological particles.

  6. Significant improvement of accuracy and precision in the determination of trace rare earths by fluorescence analysis

    International Nuclear Information System (INIS)

    Ozawa, L.; Hersh, H.N.

    1976-01-01

    Most of the rare earths in yttrium, gadolinium and lanthanum oxides emit characteristic fluorescent line spectra under irradiation with photons, electrons and x rays. The sensitivity and selectivity of the rare earth fluorescences are high enough to determine trace amounts (0.01 to 100 ppm) of rare earths. The absolute fluorescent intensities of solids, however, are markedly affected by the synthesis procedure, level of contamination and crystal perfection, resulting in poor accuracy and low precision for the method (larger than 50 percent error). Special care in preparation of the samples is required to obtain good accuracy and precision. It is found that the accuracy and precision for the determination of trace (less than 10 ppm) rare earths by fluorescence analysis improve significantly, while still maintaining the sensitivity, when the determination is made by comparing the ratio of the fluorescent intensities of the trace rare earths to that of a deliberately added rare earth as reference. The variation in the absolute fluorescent intensity remains, but is compensated for by measuring the fluorescent line intensity ratio. Consequently, the determination of trace rare earths (with less than 3 percent error) is easily made by a photoluminescence technique in which the rare earths are excited directly by photons. Accuracy is still maintained when the absolute fluorescent intensity is reduced by 50 percent through contamination by Ni, Fe, Mn or Pb (about 100 ppm). Determination accuracy is also improved for fluorescence analysis by electron excitation and x-ray excitation. For some rare earths, however, accuracy with these techniques is reduced because indirect excitation mechanisms are involved. The excitation mechanisms and the interferences between rare earths are also reported.

  7. Failure mode and effect analysis: improving intensive care unit risk management processes.

    Science.gov (United States)

    Askari, Roohollah; Shafii, Milad; Rafiei, Sima; Abolhassani, Mohammad Sadegh; Salarikhah, Elaheh

    2017-04-18

    Purpose Failure modes and effects analysis (FMEA) is a practical tool to evaluate risks, discover failures in a proactive manner and propose corrective actions to reduce or eliminate potential risks. The purpose of this paper is to apply the FMEA technique to examine the hazards associated with the process of service delivery in the intensive care unit (ICU) of a tertiary hospital in Yazd, Iran. Design/methodology/approach This was a before-after study conducted between March 2013 and December 2014. A FMEA team was formed and all potential hazards associated with ICU services - their frequency and severity - were identified. A risk priority number (RPN) was then calculated for each activity as an indicator flagging high-priority areas that need special attention and resource allocation. Findings Eight failure modes with the highest priority scores, including endotracheal tube defect, wrong placement of endotracheal tube, EVD interface, aspiration failure during suctioning, chest tube failure, tissue injury and deep vein thrombosis, were selected for improvement. Findings affirmed that the improvement strategies were generally satisfying and significantly decreased total failures. Practical implications Application of FMEA in ICUs proved to be effective in proactively decreasing the risk of failures and corrected the control measures up to acceptable levels in all eight areas of function. Originality/value Using a prospective risk assessment approach such as FMEA could be beneficial in dealing with potential failures by proposing preventive actions in a proactive manner. The method could be used as a tool for continuous quality improvement in healthcare, as it identifies both systemic and human errors and offers practical advice to deal effectively with them.
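The risk priority number used in FMEA is conventionally the product of severity, occurrence, and detection scores. The abstract does not give the scoring scale used in the study, so the 1-10 range and the failure-mode values below are illustrative only:

```python
def risk_priority_number(severity, occurrence, detection):
    """Conventional FMEA metric: RPN = S * O * D, each typically rated 1-10."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are usually rated on a 1-10 scale")
    return severity * occurrence * detection

# Rank hypothetical failure modes by RPN, highest priority first
modes = {"tube defect": (9, 4, 6), "suction failure": (7, 5, 3)}
ranked = sorted(modes, key=lambda m: risk_priority_number(*modes[m]), reverse=True)
```

The modes at the top of the ranking are the candidates for corrective action, as in the eight failure modes selected in the study.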

  8. Improved automated analysis of radon (222Rn) and thoron (220Rn) in natural waters.

    Science.gov (United States)

    Dimova, Natasha; Burnett, William C; Lane-Smith, Derek

    2009-11-15

    Natural radon ((222)Rn) and thoron ((220)Rn) can be used as tracers of various chemical and physical processes in the environment. We present here results from an extended series of laboratory experiments intended to improve the automated analysis of (222)Rn and (220)Rn in water using a modified RAD AQUA (Durridge Inc.) system. Previous experience with similar equipment showed that it takes about 30-40 min for the system to equilibrate to radon-in-water concentration increases and even longer for the response to return to baseline after a sharp spike. While the original water/gas exchanger setup was built only for radon-in-water measurement, our goal here is to provide an automated system capable of high resolution and good sensitivity for both radon- and thoron-in-water detections. We found that faster water flow rates substantially improved the response for both isotopes while thoron is detected most efficiently at airflow rates of 3 L/min. Our results show that the optimum conditions for fastest response and sensitivity for both isotopes are at water flow rates up to 17 L/min and an airflow rate of 3 L/min through the detector. Applications for such measurements include prospecting for naturally occurring radioactive material (NORM) in pipelines and locating points of groundwater/surface water interaction.

  9. Risk analysis of urban gas pipeline network based on improved bow-tie model

    Science.gov (United States)

    Hao, M. J.; You, Q. J.; Yue, Z.

    2017-11-01

    Gas pipeline networks are a major hazard source in urban areas, and in the event of an accident there could be grave consequences. In order to understand more clearly the causes and consequences of gas pipeline network accidents, and to develop prevention and mitigation measures, the author puts forward the application of an improved bow-tie model to analyze the risks of urban gas pipeline networks. The improved bow-tie model analyzes accident causes from four aspects: human, materials, environment and management; it also analyzes the consequences from four aspects: casualty, property loss, environment and society. It then quantifies the causes and consequences. Risk identification, risk analysis, risk assessment, risk control and risk management are clearly shown in the model figures, and prevention and mitigation measures can be suggested accordingly to help reduce the accident rate of gas pipeline networks. The results show that the whole process of an accident can be visually investigated using the bow-tie model, which can also provide reasons for, and predict the consequences of, an unfortunate event. The model is therefore of great significance for analyzing leakage failures of gas pipeline networks.

  10. Wavelet analysis to decompose a vibration simulation signal to improve pre-distribution testing of packaging

    Science.gov (United States)

    Griffiths, K. R.; Hicks, B. J.; Keogh, P. S.; Shires, D.

    2016-08-01

    In general, vehicle vibration is non-stationary and has a non-Gaussian probability distribution; yet existing testing methods for packaging design employ Gaussian distributions to represent vibration induced by road profiles. This frequently results in over-testing and/or over-design of the packaging to meet a specification and correspondingly leads to wasteful packaging and product waste, which represent $15bn per year in the USA and €3bn per year in the EU. The purpose of the paper is to enable a measured non-stationary acceleration signal to be replaced by a constructed signal that includes as far as possible any non-stationary characteristics of the original signal. The constructed signal consists of a concatenation of decomposed shorter-duration signals, each having its own kurtosis level. Wavelet analysis is used for the decomposition into inner and outlier signal components. The constructed signal has a power spectral density (PSD) similar to that of the original signal, without incurring excessive acceleration levels. This allows an improved and more representative simulated input signal to be generated that can be used on the current generation of shaker tables. The wavelet decomposition method is also demonstrated experimentally through two correlation studies. It is shown that significant improvements over current international standards for packaging testing are achievable; hence the potential for more efficient packaging system design is possible.
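The decomposition above assigns each shorter-duration segment its own kurtosis level. The study performs the inner/outlier split with wavelet analysis; the segment-kurtosis idea itself can be sketched directly, with a hypothetical segment length and threshold (a simplification, not the paper's method):

```python
def kurtosis(xs):
    """Excess kurtosis: ~0 for Gaussian data, > 0 for heavy-tailed (bursty) data."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 ** 2) - 3.0

def label_segments(signal, seg_len, threshold=1.0):
    """Tag each full segment 'outlier' (high kurtosis, non-Gaussian burst)
    or 'inner' (background vibration)."""
    segments = [signal[i:i + seg_len] for i in range(0, len(signal), seg_len)]
    return ["outlier" if kurtosis(s) > threshold else "inner"
            for s in segments if len(s) == seg_len]
```

A steady segment classifies as inner, while a segment dominated by a single shock spike classifies as outlier.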

  11. Using containment analysis to improve component cooling water heat exchanger limits

    International Nuclear Information System (INIS)

    Da Silva, H.C.; Tajbakhsh, A.

    1995-01-01

    The Comanche Peak Steam Electric Station design requires that exit temperatures from the Component Cooling Water Heat Exchanger remain below 330.37 K during the Emergency Core Cooling System recirculation stage, following a hypothetical Loss of Coolant Accident (LOCA). Due to measurements indicating a higher than expected combination of: (a) high fouling factor in the Component Cooling Water Heat Exchanger with (b) high ultimate heat sink temperatures, that might lead to temperatures in excess of the 330.37 K limit, if a LOCA were to occur, TUElectric adjusted key flow rates in the Component Cooling Water network. This solution could only be implemented with improvements to the containment analysis methodology of record. The new method builds upon the CONTEMPT-LT/028 code by: (a) coupling the long term post-LOCA thermohydraulics with a more detailed analytical model for the complex Component Cooling Water Heat Exchanger network and (b) changing the way mass and energy releases are calculated after core reflood and steam generator energy is dumped to the containment. In addition, a simple code to calculate normal cooldowns was developed to confirm RHR design bases were met with the improved limits

  12. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station, and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, when used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.

  13. Improvement of the fringe analysis algorithm for wavelength scanning interferometry based on filter parameter optimization.

    Science.gov (United States)

    Zhang, Tao; Gao, Feng; Muhamedsalih, Hussam; Lou, Shan; Martin, Haydn; Jiang, Xiangqian

    2018-03-20

    The phase slope method, which estimates height from the fringe pattern frequency, and the phase method, which estimates height from the fringe phase, are the fringe analysis algorithms widely used in interferometry. Generally, both extract the phase information by filtering the signal in the frequency domain after a Fourier transform. Among the numerous papers in the literature about these algorithms, it is found that the design of the filter, which plays an important role, has never been discussed in detail. This paper focuses on the filter design in these algorithms for wavelength scanning interferometry (WSI), optimizing the parameters to acquire the best results. The spectral characteristics of the interference signal are analyzed first. The effective signal is found to be narrow-band (near single frequency), and the central frequency is calculated theoretically; the position of the filter pass-band is thereby determined. The width of the filter window is optimized in simulation to balance the elimination of noise against the ringing of the filter. Experimental validation of the approach is provided, and the results agree very well with the simulation. The experiment shows that accuracy can be improved by optimizing the filter design, especially when the signal quality, i.e., the signal-to-noise ratio (SNR), is low. The proposed method also shows the potential to improve immunity to environmental noise by adapting the filter to the signal, once the signal SNR can be estimated accurately.

  14. Analysis of the Convention on Nuclear Safety and Suggestions for Improvement

    International Nuclear Information System (INIS)

    Choi, K. S.; Viet, Phuong Nguyen

    2013-01-01

    The innovative approach of the Convention, which is based on incentives rather than legal binding, had been considered successful in strengthening nuclear safety worldwide. However, the nuclear accident at the Fukushima Dai-ichi Nuclear Power Plant (Japan) in March 2011 exposed a number of weaknesses of the Convention. Given that context, this paper analyses the characteristics of the CNS in order to understand the advantages and disadvantages of the Convention, and finally to suggest some possible improvements. The analysis in this paper shows that the incentive approach of the CNS has succeeded in facilitating the active role of its Contracting Parties in producing the National Reports and participating in the peer review of these reports. However, the uneven quality of the National Reports, the differing levels of participation in the peer review process by Contracting Parties, and the lack of transparency of the peer review have undermined the effectiveness of the Convention in strengthening the international safety regime and in preventing the serious regulatory errors that occurred in Japan before the Fukushima accident. Therefore, the peer review process should be reformed in a more transparent and independent direction, while an advisory group of regulators within the CNS might also be useful in improving the effectiveness of the Convention, as already proven by good practice in the European Union. Only with such effective change can the CNS maintain its pivotal role in the international safety regime

  15. Numerical analysis of an entire ceramic kiln under actual operating conditions for the energy efficiency improvement.

    Science.gov (United States)

    Milani, Massimo; Montorsi, Luca; Stefani, Matteo; Saponelli, Roberto; Lizzano, Maurizio

    2017-12-01

    The paper focuses on the analysis of an industrial ceramic kiln in order to improve energy efficiency and thus reduce fuel consumption and the corresponding carbon dioxide emissions. A lumped and distributed parameter model of the entire system is constructed to simulate the performance of the kiln under actual operating conditions. The model is able to predict accurately the temperature distribution along the different modules of the kiln and the operation of the many natural gas burners employed to provide the required thermal power. Furthermore, the temperature of the tiles is also simulated, so that the quality of the final product can be addressed by the modelling. Numerical results are validated against experimental measurements carried out on a real ceramic kiln during regular production operations. The developed numerical model proves to be an efficient tool for the investigation of different design solutions for the kiln's components. In addition, a number of control strategies for the system working conditions can be simulated and compared in order to define the best trade-off in terms of fuel consumption and product quality. In particular, the paper analyzes the effect of a new burner type characterized by internal heat recovery capability aimed at improving the energy efficiency of the ceramic kiln. The fuel saving and the related reduction of carbon dioxide emissions were on the order of 10% when compared to the standard burner. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Does Self-Control Training Improve Self-Control? A Meta-Analysis.

    Science.gov (United States)

    Friese, Malte; Frankenbach, Julius; Job, Veronika; Loschelder, David D

    2017-11-01

    Self-control is positively associated with a host of beneficial outcomes. Therefore, psychological interventions that reliably improve self-control are of great societal value. A prominent idea suggests that training self-control by repeatedly overriding dominant responses should lead to broad improvements in self-control over time. Here, we conducted a random-effects meta-analysis based on robust variance estimation of the published and unpublished literature on self-control training effects. Results based on 33 studies and 158 effect sizes revealed a small-to-medium effect of g = 0.30, 95% confidence interval [0.17, 0.42]. Moderator analyses found that training effects tended to be larger for (a) self-control stamina rather than strength, (b) studies with inactive compared to active control groups, (c) males than females, and (d) when proponents of the strength model of self-control were (co)authors of a study. Bias-correction techniques suggested the presence of small-study effects and/or publication bias and arrived at smaller effect size estimates (range: corrected g = 0.13 to 0.24). The mechanisms underlying the effect are poorly understood. There is not enough evidence to conclude that the repeated control of dominant responses is the critical element driving training effects.
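The study pools effect sizes with a random-effects model using robust variance estimation. A simpler textbook variant, the DerSimonian-Laird estimator, illustrates the pooling step; the effect sizes and variances below are made up for illustration:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled effect size via the DerSimonian-Laird estimator
    (not the robust variance estimation the study itself uses)."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # heterogeneity
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    return sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
```

With homogeneous studies the estimate collapses to the inverse-variance fixed-effect mean; heterogeneity inflates tau2 and evens out the weights.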

  17. Systematic review and meta-analysis of behavioral interventions to improve child pedestrian safety.

    Science.gov (United States)

    Schwebel, David C; Barton, Benjamin K; Shen, Jiabin; Wells, Hayley L; Bogar, Ashley; Heath, Gretchen; McCullough, David

    2014-09-01

    Pedestrian injuries represent a pediatric public health challenge. This systematic review/meta-analysis evaluated behavioral interventions to teach children pedestrian safety. Multiple search strategies identified eligible manuscripts (published before April 1, 2013, randomized design, evaluating behavioral child pedestrian safety interventions). Screening 1,951 abstracts yielded 125 full-text retrievals; 25 were retained for data extraction, and 6 were later omitted due to insufficient data. In all, 19 articles reporting 25 studies were included. Risk of bias and quality of evidence were assessed. Behavioral interventions generally improve children's pedestrian safety, both immediately after training and at follow-up several months later. Quality of the evidence was low to moderate. Available evidence suggested interventions targeting dash-out prevention, crossing at parked cars, and selecting safe routes across intersections were effective. Individualized/small-group training for children was the most effective training strategy based on available evidence. Behaviorally based interventions improve children's pedestrian safety. Efforts should continue to develop creative, cost-efficient, and effective interventions. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. A fuzzy MICMAC analysis for improving supply chain performance of basic vaccines in developing countries.

    Science.gov (United States)

    Chandra, Dheeraj; Kumar, Dinesh

    2018-03-01

    In recent years, the demand to improve child immunization coverage globally and the development of new vaccines and technologies have made the vaccine market very complex. The rise in such complexity often gives birth to numerous issues in the vaccine supply chain, which are the primary cause of its poor performance. Identifying the causes of poor performance can help determine how to address it. The goal of the present study is to identify and analyze important issues in the supply chain of the basic vaccines required for child immunization in developing countries. Twenty-five key issues, as various factors of the vaccine supply chain, are presented in this paper. A fuzzy MICMAC analysis has been carried out to classify the factors based on their driving and dependence power and to develop a hierarchy-based model. Further, the findings have been discussed with field experts to identify the critical factors. Three factors, better demand forecasting, communication between supply chain members, and proper planning and scheduling, have been identified as the critical factors of the vaccine supply chain. These factors should be given special care to improve vaccine supply chain performance.
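MICMAC classifies factors by driving power (row sums of the reachability matrix) and dependence power (column sums). A crisp, non-fuzzy sketch with a hypothetical 3-factor reachability matrix (the study's fuzzy variant replaces the binary entries with membership values):

```python
def micmac_powers(reach):
    """Driving power (row sums) and dependence power (column sums)
    of a binary reachability matrix."""
    n = len(reach)
    driving = [sum(row) for row in reach]
    dependence = [sum(reach[i][j] for i in range(n)) for j in range(n)]
    return driving, dependence

def classify(driving, dependence, cutoff):
    """Standard MICMAC quadrants: driver, dependent, linkage, autonomous."""
    labels = []
    for d, p in zip(driving, dependence):
        if d > cutoff and p > cutoff:
            labels.append("linkage")
        elif d > cutoff:
            labels.append("driver")
        elif p > cutoff:
            labels.append("dependent")
        else:
            labels.append("autonomous")
    return labels
```

Factors landing in the driver quadrant (high driving, low dependence) are the ones that warrant the most management attention, as with the three critical factors identified in the study.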

  19. [Security of hospital infusion practices: From an a priori risk analysis to an improvement action plan].

    Science.gov (United States)

    Pignard, J; Cosserant, S; Traore, O; Souweine, B; Sautou, V

    2016-03-01

    Infusion in care units, and all the more in intensive care units, is a complex process which can be the source of many risks for the patient. As part of an institutional approach to improving the quality and safety of patient healthcare, a risk mapping of infusion practices was performed. The analysis focused on intravenous infusion situations in adults; an a priori risk assessment methodology was applied and a multidisciplinary working group established. Forty-three risks were identified in the infusion process (prescription, preparation and administration). The assessment of these risks and of the existing means of control showed that 48% of them would have a highly critical impact on patient safety. Recommendations were developed for the 20 risks considered most critical, to limit their occurrence and severity and to improve their level of control. An institutional action plan was developed and validated by the Drug and Sterile Medical Devices Commission. This mapping produced an exhaustive inventory of the potential risks associated with infusion. Following this work, multidisciplinary groups were set up to work on different themes, and regular quarterly meetings were established to follow the progress of the various projects. Risk mapping will also be performed in pediatric and oncology units, where the risks associated with handling toxic products are omnipresent. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  20. Analysis of drought characteristics for improved understanding of a water resource system

    Directory of Open Access Journals (Sweden)

    A. T. Lennard

    2014-09-01

    Full Text Available Droughts are a reoccurring feature of the UK climate; recent drought events (2004–2006 and 2010–2012 have highlighted the UK’s continued vulnerability to this hazard. There is a need for further understanding of extreme events, particularly from a water resource perspective. A number of drought indices are available, which can help to improve our understanding of drought characteristics such as frequency, severity and duration. However, at present little of this is applied to water resource management in the water supply sector. Improved understanding of drought characteristics using indices can inform water resource management plans and enhance future drought resilience. This study applies the standardised precipitation index (SPI to a series of rainfall records (1962–2012 across the water supply region of a single utility provider. Key droughts within this period are analysed to develop an understanding of the meteorological characteristics that lead to, exist during and terminate drought events. The results of this analysis highlight how drought severity and duration can vary across a small-scale water supply region, indicating that the spatial coherence of drought events cannot be assumed.
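The SPI applied above transforms accumulated rainfall into a standard-normal score. The full index fits a gamma distribution to the rainfall record before transforming to normal quantiles; the simplified z-score version below conveys the idea, and the drought classes follow the commonly used SPI thresholds:

```python
def standardised_index(series, value):
    """Simplified SPI-style z-score of a rainfall total against its record.
    The full SPI fits a gamma distribution first, then maps to the
    standard normal; this sketch standardizes directly."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / (n - 1)) ** 0.5
    return (value - mean) / sd

def drought_class(z):
    """Common SPI drought classes: <= -2 extreme, <= -1.5 severe, <= -1 moderate."""
    if z <= -2.0:
        return "extreme"
    if z <= -1.5:
        return "severe"
    if z <= -1.0:
        return "moderate"
    return "near normal or wet"
```

Applied per rain gauge, such an index lets drought severity and duration be compared across the supply region, as the study does across a single utility provider's records.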

  1. SEISMIC FRAGILITY ANALYSIS OF IMPROVED RC FRAMES USING DIFFERENT TYPES OF BRACING

    Directory of Open Access Journals (Sweden)

    HAMED HAMIDI JAMNANI

    2017-04-01

    Full Text Available Application of bracings to increase the lateral stiffness of building structures is a technique of seismic improvement that engineers frequently have recourse to. Accordingly, investigating the role of bracings in concrete structures, along with the development of seismic fragility curves, is of overriding concern to civil engineers. In this research, an ordinary RC building, designed according to the 1st edition of the Iranian seismic code, was selected for examination. According to the FEMA 356 code, this building is considered to be vulnerable. To improve the seismic performance of this building, 3 different types of bracings, namely Concentrically Braced Frames, Eccentrically Braced Frames and Buckling Restrained Frames, were employed, and each bracing element was distributed in 3 different locations in the building. The researchers developed fragility curves and utilized 30 earthquake records, with Peak Ground Acceleration as the seismic intensity scale, to carry out a time history analysis. Two damage scales, Inter-Story Drift and Plastic Axial Deformation, were also used. The numerical results obtained from this investigation confirm that Plastic Axial Deformation is more reliable than conventional approaches in developing fragility curves for retrofitted frames. On this basis, the researchers selected the suitable damage scale and developed and compared log-normal distributions of fragility curves, first for the original and then for the retrofitted building.
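A log-normal fragility curve of the kind developed above gives the probability of exceeding a damage state at a given intensity (here PGA); the median capacity and dispersion values below are hypothetical, not the study's fitted parameters:

```python
import math

def fragility(pga, median, beta):
    """Lognormal fragility curve: P(damage state exceeded | PGA), with
    median capacity `median` (in g) and log-standard deviation `beta`."""
    z = (math.log(pga) - math.log(median)) / beta
    # Standard normal CDF built from the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

At the median capacity the exceedance probability is exactly 50%, and the curve rises monotonically with PGA; comparing the fitted medians of the original and retrofitted buildings quantifies the seismic improvement.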

  2. Improvements and validation of the transient analysis code MOREL for molten salt reactors

    International Nuclear Information System (INIS)

    Zhuang Kun; Zheng Youqi; Cao Liangzhi; Hu Tianliang; Wu Hongchun

    2017-01-01

    The liquid fuel salt used in molten salt reactors (MSRs) serves as fuel and coolant simultaneously. On the one hand, the delayed neutron precursors circulate in the whole primary loop and part of them decay outside the core. On the other hand, the fission heat is carried off directly by the fuel flow. These two features require a new analysis method coupling fluid flow, heat transfer and neutronics. In this paper, the recent update of the MOREL code is presented. The update includes: (1) an improved quasi-static method for the kinetics equation with a convection term; (2) a multi-channel thermal-hydraulic model based on the geometric features of MSRs; (3) the Variational Nodal Method, used to solve the neutron diffusion equation in place of the original analytic-basis-function expansion nodal method. The update brings a significant improvement in the efficiency of the MOREL code and extends its capability to real core simulation with feedback. Numerical results and experimental data from the Molten Salt Reactor Experiment (MSRE) are used to verify and validate the updated MOREL code. The results agree well with the experimental data, confirming that the new development of the MOREL code is correct and effective. (author)

  3. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    Science.gov (United States)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    In order to obtain cassava starch films with mechanical properties improved relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as the filler of the biofilm, and glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested by the correlation established with the experimental data through a Pareto chart. The modified clay was the factor of greatest statistical significance for the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
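A 2³ factorial screening of the kind described can be sketched as follows; the coded ±1 design matrix is standard, but the factor order and the tensile-strength responses are invented for illustration, not the paper's measurements:

```python
from itertools import product

# Full 2^3 factorial design at coded levels -1/+1 for the three factors
# (starch, glycerol, modified clay), one run per combination.
design = list(product([-1, 1], repeat=3))
tensile = [2.1, 2.4, 1.8, 2.0, 3.0, 3.5, 2.6, 3.1]   # MPa, synthetic responses

def main_effect(design, y, factor):
    """Main effect = mean response at +1 minus mean response at -1."""
    hi = [yi for row, yi in zip(design, y) if row[factor] == 1]
    lo = [yi for row, yi in zip(design, y) if row[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = [main_effect(design, tensile, f) for f in range(3)]
```

Ranking the absolute effects reproduces, in miniature, what a Pareto chart does: the factor with the largest bar is the one of greatest statistical significance for the response.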

  4. Natural gas to improve energy security in Small Island Developing States: A techno-economic analysis

    Directory of Open Access Journals (Sweden)

    Pravesh Raghoo

    Full Text Available There is a paucity of studies on natural gas-based energy production in Small Island Developing States (SIDS), even though technological improvements today are likely to make the application of natural gas more and more feasible. The development of natural gas in some regions of the Pacific, Africa, the Indian Ocean and the Caribbean attracts nearby countries, and the advent of compressed natural gas (CNG) technology, which can serve regional markets, are two motivations for SIDS to develop natural gas-based energy provision. A third factor concerns long-term energy security. Due to continued reliance on fossil fuels and slow uptake of renewable energy, there is a need to diversify SIDS’ energy mix for a sustainable electricity industry. Comparing the opportunities and constraints of liquefied natural gas (LNG) and compressed natural gas (CNG) in a SIDS-specific context, this paper discusses how to improve the integration of natural gas into prevailing energy regimes in SIDS as an alternative fuel to oil and a complement to renewable energy sources. To illustrate feasibility in practice, a techno-economic analysis is carried out using the island of Mauritius as an example. Keywords: Energy security, Natural gas, Small Island Developing States

  5. Reactive Landing of Gas-Phase Ions as a Tool for the Fabrication of Metal Oxide Surfaces for In Situ Phosphopeptide Enrichment

    Czech Academy of Sciences Publication Activity Database

    Blacken, G. R.; Volný, Michael; Diener, M.; Jackson, K. E.; Ranjitkar, P.; Maly, D. J.; Tureček, F.

    2009-01-01

    Roč. 20, č. 6 (2009), s. 915-926 ISSN 1044-0305 Institutional research plan: CEZ:AV0Z50200510 Keywords : TANDEM MASS-SPECTROMETRY * SELECTIVE DETECTION * PHOSPHOPROTEOME ANALYSIS Subject RIV: EE - Microbiology, Virology Impact factor: 3.391, year: 2009

  6. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Moon, Young Min; Lee, Dong Won; Lee, Sang Ik; Kim, Eung Soo; Yeom, Keum Soo [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2002-03-15

    The objective of the present research is to perform separate effect tests and to assess the RELAP5/MOD3.2 code for the analysis of thermal-hydraulic behavior in the reactor coolant system, and thereby to improve the auditing technology of safety analysis. The three Separate Effect Tests (SETs) are reflux condensation in the U-tube, direct contact condensation in the hot-leg and mixture level buildup in the pressurizer. Experimental data and empirical correlations are obtained through the SETs. On the basis of the three SET works, models in RELAP5 are modified and improved and compared with the data. The Korea Standard Nuclear Power Plant (KSNP) is assessed using the modified RELAP5. In the reflux condensation test, data on heat transfer coefficients and flooding are obtained and the condensation models are modified using the non-iterative model; as a result, the modified code better predicts the data. In the direct contact condensation test, data on heat transfer coefficients are obtained for cocurrent and countercurrent flow between the mixture gas and the water under horizontal stratified flow conditions. Several condensation and friction models are modified, and they predict the present data well. In the mixture level test, data on the mixture level and the onset of water draining into the surge line are obtained. The standard RELAP5 over-predicts the mixture level and the void fraction in the pressurizer; a simple modification of the model related to the pool void fraction is suggested. The KSNP is assessed using the standard and the modified RELAP5 resulting from the experimental and code works for the SETs. In the case of the pressurizer manway opening with the secondary side of the steam generators available, the modified code predicts that the collapsed level in the pressurizer barely accumulates. The presence and location of the opening and the secondary condition of the steam generators have an effect on the coolant inventory. The

  7. [Examination of safety improvement by failure record analysis that uses reliability engineering].

    Science.gov (United States)

    Kato, Kyoichi; Sato, Hisaya; Abe, Yoshihisa; Ishimori, Yoshiyuki; Hirano, Hiroshi; Higashimura, Kyoji; Amauchi, Hiroshi; Yanakita, Takashi; Kikuchi, Kei; Nakazawa, Yasuo

    2010-08-20

    How maintenance checks of medical treatment systems, including start-of-work and end-of-work checks, were effective for preventive maintenance and safety improvement was verified. In this research, data on device failures in multiple facilities were collected, and the trouble-repair records were analyzed by the techniques of reliability engineering. An analysis of data on the systems (8 general systems, 6 Angio systems, 11 CT systems, 8 MRI systems, 8 RI systems, and 9 radiation therapy systems) used in eight hospitals was performed. The data collection period was the nine months from April to December 2008. Seven items were analyzed, including: (1) mean time between failures (MTBF); (2) mean time to repair (MTTR); (3) mean down time (MDT); (4) number of failures found by the morning check; (5) failure occurrence time according to modality. The classification of breakdowns per device, their incidence, and their tendency could be understood by introducing reliability engineering. Analysis, evaluation, and feedback on the failure history are useful to keep downtime to a minimum and to ensure safety.
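The reliability-engineering quantities in items (1)-(3) can be computed from a repair log along these lines. This is a minimal sketch: the record structure, field names and example log are hypothetical, not the study's data:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical failure-record layout; the field names and example log are
# invented for illustration, not taken from the study's data.
@dataclass
class FailureRecord:
    failed_at: datetime
    restored_at: datetime

def reliability_metrics(records, observation_hours):
    """MTBF, MTTR and availability from a repair log (sketch)."""
    n = len(records)
    downtime = sum((r.restored_at - r.failed_at).total_seconds() / 3600.0
                   for r in records)
    mtbf = (observation_hours - downtime) / n   # mean time between failures (h)
    mttr = downtime / n                         # mean time to repair (h)
    availability = mtbf / (mtbf + mttr)
    return mtbf, mttr, availability

log = [
    FailureRecord(datetime(2008, 4, 2, 9), datetime(2008, 4, 2, 13)),   # 4 h repair
    FailureRecord(datetime(2008, 7, 15, 8), datetime(2008, 7, 16, 8)),  # 24 h repair
]
mtbf, mttr, avail = reliability_metrics(log, observation_hours=24 * 275)
```

Grouping such a log by modality before calling the function gives the per-modality breakdown the study reports.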

  8. Improving the accuracy of effect-directed analysis: the role of bioavailability.

    Science.gov (United States)

    You, Jing; Li, Huizhen

    2017-12-13

    Aquatic ecosystems have been suffering from contamination by multiple stressors. Traditional chemical-based risk assessment usually fails to explain the toxicity contributions from contaminants that are not regularly monitored or that have an unknown identity. Diagnosing the causes of observed adverse outcomes in the environment is of great importance in ecological risk assessment, and effect-directed analysis (EDA) has been designed to fulfill this purpose. The EDA approach is now increasingly used in aquatic risk assessment owing to its ability to perform effect-directed nontarget analysis; however, a lack of environmental relevance makes conventional EDA less favorable. In particular, ignoring bioavailability in EDA may cause a biased and even erroneous identification of causative toxicants in a mixture. Taking bioavailability into consideration is therefore essential to improve the accuracy of EDA diagnosis. The present article reviews the current status and applications of EDA practices that incorporate bioavailability. The use of biological samples is the most obvious way to include bioavailability in EDA applications, but its development is limited by small sample sizes and a lack of evidence for metabolizable compounds. Bioavailability/bioaccessibility-based extraction (bioaccessibility-directed and partitioning-based extraction) and passive-dosing techniques are recommended for integrating bioavailability into EDA diagnosis of abiotic samples. Lastly, future perspectives on expanding and standardizing the use of biological samples and bioavailability-based techniques in EDA are discussed.

  9. Simplified inelastic seismic response analysis of piping system using improved capacity spectrum method

    International Nuclear Information System (INIS)

    Iijima, Tadashi

    2005-01-01

    We applied the improved capacity spectrum method (ICSM) to a piping system with an asymmetric load-deformation relationship in a piping elbow. The capacity spectrum method can predict an inelastic response by balancing the structural capacity obtained from the load-deformation relationship with the seismic demand defined by an acceleration-displacement response spectrum. The ICSM employs (1) an effective damping ratio and period based on a statistical methodology, and (2) practical procedures for obtaining a balance between the structural capacity and the seismic demand. The effective damping ratio and period are defined so as to maximize the probability that predicted response errors lie inside the -10 to 20% range. Without taking asymmetry into consideration, the displacement calculated using the load-deformation relationship on the stiffer side was 39% larger than that of a time history analysis by a direct integration method. When asymmetry was taken into account, the calculated displacement was only 14% larger than that of the time history analysis. Thus, we verified that the ICSM can predict the inelastic response with errors lying within the -10 to 20% range, by taking into account the asymmetric load-deformation relationship of the piping system. (author)
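The balancing step at the heart of any capacity spectrum method can be illustrated with a toy intersection search. The curves and grid below are invented for the sketch; the ICSM's statistically derived effective damping ratio and period are not modelled here:

```python
def performance_point(capacity, demand):
    """Return the first point where structural capacity meets seismic demand.

    Both inputs are (spectral displacement, spectral acceleration) pairs
    sampled on a common displacement grid: capacity rises with displacement
    while demand falls, so their crossing approximates the inelastic
    response predicted by the method.
    """
    for (disp, cap), (_, dem) in zip(capacity, demand):
        if cap >= dem:
            return disp, cap
    return None  # no balance found within the sampled range

# Invented sample curves on a common displacement grid (m, g).
capacity = [(0.01, 0.10), (0.02, 0.18), (0.03, 0.24), (0.04, 0.28)]
demand   = [(0.01, 0.35), (0.02, 0.27), (0.03, 0.21), (0.04, 0.16)]
point = performance_point(capacity, demand)
```

An asymmetric load-deformation relationship, as in the elbow studied, would simply supply different capacity curves for the stiffer and softer loading directions.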

  10. RIPOSTE: a framework for improving the design and analysis of laboratory-based research

    Science.gov (United States)

    Masca, Nicholas GD; Hensor, Elizabeth MA; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam KA; Teare, M Dawn

    2015-01-01

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results. DOI: http://dx.doi.org/10.7554/eLife.05519.001 PMID:25951517

  11. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    International Nuclear Information System (INIS)

    Sandusky, Peter; Appiah-Amponsah, Emmanuel; Raftery, Daniel

    2011-01-01

    One dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY-based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate the taurine concentrations and distort its variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY determined taurine concentrations produce better scores plot subpopulation cluster resolution.
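The bucket-integration step being compared can be sketched as follows, including the overlap bias the record describes: when a background resonance falls in the same bucket as the target, the bucket integral overestimates the target's contribution. The synthetic peaks, bucket width and positions are assumptions, not the paper's parameters:

```python
def bucket_integrate(ppm, intensity, width=0.04):
    """Conventional bucket integration of a 1-D spectrum (sketch).

    Sums intensity over fixed-width chemical-shift buckets. Any overlapping
    resonance falling in the same bucket as a target inflates that bucket's
    integral, which is the overestimation bias discussed above.
    """
    start = min(ppm)
    n = int((max(ppm) - start) / width) + 1
    buckets = [0.0] * n
    for x, y in zip(ppm, intensity):
        buckets[min(int((x - start) / width), n - 1)] += y
    return buckets

# Synthetic spectrum: a target peak plus an overlapping background peak
# (positions and widths are invented, not actual taurine shifts).
lorentz = lambda x, x0, w: w * w / ((x - x0) ** 2 + w * w)
ppm = [3.0 + i * 0.0002 for i in range(3001)]
spectrum = [lorentz(x, 3.25, 0.01) + 0.5 * lorentz(x, 3.27, 0.02) for x in ppm]
buckets = bucket_integrate(ppm, spectrum)
```

Both peaks land in the same bucket here, so that bucket's integral mixes the target and background areas; a selective 1D TOCSY read peak would isolate the target resonance instead.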

  12. Sensitivity improvements, in the determination of mercury in biological tissues by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cornett, C R; Samudralwar, D L; Ehmann, W D [Kentucky Univ., Lexington, KY (United States). Dept. of Chemistry; Markesbery, W R [Kentucky Univ., Lexington, KY (United States)

    1995-08-01

    The possible association of dental amalgam surface exposure, brain mercury (Hg) levels, and pathological markers of Alzheimer's disease (AD) in the brain is the subject of an on-going study in our laboratory. Two radiochemical neutron activation analysis methods and the use of instrumental neutron activation analysis (INAA) with Compton suppression spectrometry have been evaluated for improving our INAA Hg detection limit (2.8±0.6 ng/g, wet-weight basis) in human tissue. Large numbers of samples dictated the use of a purely instrumental method or rapid, simple radiochemical separations. Human brain tissues and NIST biological standards were analyzed using a precipitation of Hg₂Cl₂, a solvent extraction utilizing sodium diethyldithiocarbamate, conventional INAA, and INAA with Compton suppression. The radiochemical precipitation of Hg₂Cl₂ proved to be the most useful method for our study because it provided a simultaneous, quantitative determination of silver (Ag) and a Hg detection limit in brain tissue of 1.6±0.1 ng/g (wet-weight basis). (author). 12 refs., 2 tabs.

  13. Scanner Uniformity improvements for radiochromic film analysis with matt reflectance backing

    International Nuclear Information System (INIS)

    Butson, M.; Yu, P.K.N.

    2011-01-01

    Full text: A simple and reproducible method for increasing desktop scanner uniformity for the analysis of radiochromic films is presented. Scanner uniformity, especially in the non-scan direction, is well known to be problematic for transmission scanning of radiochromic film, and corrections normally need to be applied. These corrections depend on scanner coordinates and on the applied dose level, which complicates dosimetry procedures. This study has highlighted that by using reflectance scanning in combination with a matt, white backing material instead of the conventional gloss scanner finish, substantial increases in scanner uniformity can be achieved within 90% of the scanning area. Uniformity within ±1% over the scanning area was found for the Epson V700 scanner tested. This compares with ±3% for reflection scanning with the gloss backing material and ±4% for transmission scanning. The matt backing material used was simply 5 layers of standard-quality white printing paper (80 g/m²). It was found that 5 layers was the optimal backing; most of the improvement was seen with a minimum of 3 layers, and above 5 layers no extra benefit was seen. This may eliminate the need to perform position-dependent scanner corrections on desktop scanners for radiochromic film dosimetry. (author)

  14. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    Energy Technology Data Exchange (ETDEWEB)

    Sandusky, Peter [Eckerd College, Department of Chemistry (United States); Appiah-Amponsah, Emmanuel; Raftery, Daniel, E-mail: raftery@purdue.edu [Purdue University, Department of Chemistry (United States)

    2011-04-15

    One dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY-based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate the taurine concentrations and distort its variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY determined taurine concentrations produce better scores plot subpopulation cluster resolution.

  15. Improvements to PATRIC, the all-bacterial Bioinformatics Database and Analysis Resource Center

    Science.gov (United States)

    Wattam, Alice R.; Davis, James J.; Assaf, Rida; Boisvert, Sébastien; Brettin, Thomas; Bun, Christopher; Conrad, Neal; Dietrich, Emily M.; Disz, Terry; Gabbard, Joseph L.; Gerdes, Svetlana; Henry, Christopher S.; Kenyon, Ronald W.; Machi, Dustin; Mao, Chunhong; Nordberg, Eric K.; Olsen, Gary J.; Murphy-Olson, Daniel E.; Olson, Robert; Overbeek, Ross; Parrello, Bruce; Pusch, Gordon D.; Shukla, Maulik; Vonstein, Veronika; Warren, Andrew; Xia, Fangfang; Yoo, Hyunseung; Stevens, Rick L.

    2017-01-01

    The Pathosystems Resource Integration Center (PATRIC) is the bacterial Bioinformatics Resource Center (https://www.patricbrc.org). Recent changes to PATRIC include a redesign of the web interface and some new services that provide users with a platform that takes them from raw reads to an integrated analysis experience. The redesigned interface allows researchers direct access to tools and data, and the emphasis has changed to user-created genome-groups, with detailed summaries and views of the data that researchers have selected. Perhaps the biggest change has been the enhanced capability for researchers to analyze their private data and compare it to the available public data. Researchers can assemble their raw sequence reads and annotate the contigs using RASTtk. PATRIC also provides services for RNA-Seq, variation, model reconstruction and differential expression analysis, all delivered through an updated private workspace. Private data can be compared by ‘virtual integration’ to any of PATRIC's public data. The number of genomes available for comparison in PATRIC has expanded to over 80 000, with a special emphasis on genomes with antimicrobial resistance data. PATRIC uses this data to improve both subsystem annotation and k-mer classification, and tags new genomes as having signatures that indicate susceptibility or resistance to specific antibiotics. PMID:27899627

  16. Improved candidate generation and coverage analysis methods for design optimization of symmetric multi-satellite constellations

    Science.gov (United States)

    Matossian, Mark G.

    1997-01-01

    Much attention in recent years has focused on commercial telecommunications ventures involving constellations of spacecraft in low and medium Earth orbit. These projects often require investments on the order of billions of dollars (US$) for development and operations, but surprisingly little work has been published on constellation design optimization for coverage analysis, traffic simulation and launch sequencing for constellation build-up strategies. This paper addresses the two most critical aspects of constellation orbital design — efficient constellation candidate generation and coverage analysis. Inefficiencies and flaws in the current standard algorithm for constellation modeling are identified, and a corrected and improved algorithm is presented. In the 1970s, John Walker and G. V. Mozhaev developed innovative strategies for continuous global coverage using symmetric non-geosynchronous constellations, sometimes referred to as rosette, or Walker, constellations. In 1980, the late Arthur Ballard extended and generalized the work of Walker into a detailed algorithm for the NAVSTAR/GPS program, which deployed a 24-satellite symmetric constellation. Ballard's important contribution was published in his "Rosette Constellations of Earth Satellites."
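A symmetric (Walker delta) constellation candidate of the kind Walker and Ballard described can be generated from the t/p/f notation roughly as follows. This is a sketch only; the example parameters are illustrative, not a reconstruction of any flown design:

```python
def walker_delta(t, p, f, inc_deg):
    """Generate a Walker delta pattern t/p/f (sketch).

    Returns (RAAN, in-plane phase, inclination) in degrees for each of the
    t satellites spread over p equally spaced planes; f controls the phase
    offset between adjacent planes. Orbit size is left out of the sketch.
    """
    s = t // p                                   # satellites per plane
    sats = []
    for plane in range(p):
        raan = 360.0 * plane / p                 # ascending-node spacing
        for k in range(s):
            phase = (360.0 * k / s + 360.0 * f * plane / t) % 360.0
            sats.append((raan, phase, inc_deg))
    return sats

# A 24/6/2 pattern at 55 degrees inclination, loosely GPS-like; these
# numbers are assumptions chosen only to exercise the generator.
constellation = walker_delta(24, 6, 2, 55.0)
```

Sweeping t, p and f over small ranges is one simple way to enumerate the symmetric candidates that a coverage-analysis step then scores.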

  17. Drought Characteristic Analysis Based on an Improved PDSI in the Wei River Basin of China

    Directory of Open Access Journals (Sweden)

    Lei Zou

    2017-03-01

    Full Text Available In this study, to improve the efficiency of the original Palmer Drought Severity Index (PDSI_original), we coupled the Soil and Water Assessment Tool (SWAT) and PDSI_original to construct a drought index called PDSI_SWAT. The constructed PDSI_SWAT is applied in the Wei River Basin (WRB) of China during 1960–2012. The comparison of the PDSI_SWAT with four other commonly used drought indices reveals the effectiveness of the PDSI_SWAT in describing the drought propagation processes in the WRB. The whole WRB exhibits a drying trend, with more significant trends in the northern, southeastern and western WRB than in the remaining regions. Furthermore, the drought frequencies show that drought is more likely to occur in the northern part than in the southern part of the WRB. The principal component analysis method based on the PDSI_SWAT reveals that the whole basin can be further divided into three distinct sub-regions with different drought variability, i.e., the northern, southeastern and western parts. Additionally, these three sub-regions are also consistent with the spatial pattern of drought shown by the drought frequency. The wavelet transform analysis method indicates that El Niño-Southern Oscillation (ENSO) events have strong impacts on inducing droughts in the WRB. The results of this study could be beneficial for scientific water resources management and drought assessment in the current study area and also provide a valuable reference for other areas with similar climatic characteristics.

  18. Signal Quality Improvement Algorithms for MEMS Gyroscope-Based Human Motion Analysis Systems: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Jiaying Du

    2018-04-01

    Full Text Available Motion sensors such as MEMS gyroscopes and accelerometers are characterized by a small size, light weight, high sensitivity, and low cost. They are used in an increasing number of applications. However, they are easily influenced by environmental effects such as temperature change, shock, and vibration. Thus, signal processing is essential for minimizing errors and improving signal quality and system stability. The aim of this work is to investigate and present a systematic review of different signal error reduction algorithms that are used for MEMS gyroscope-based motion analysis systems for human motion analysis or have the potential to be used in this area. A systematic search was performed with the search engines/databases of the ACM Digital Library, IEEE Xplore, PubMed, and Scopus. Sixteen papers that focus on MEMS gyroscope-related signal processing and were published in journals or conference proceedings in the past 10 years were found and fully reviewed. Seventeen algorithms were categorized into four main groups: Kalman-filter-based algorithms, adaptive-based algorithms, simple filter algorithms, and compensation-based algorithms. The algorithms were analyzed and presented along with their characteristics such as advantages, disadvantages, and time limitations. A user guide to the most suitable signal processing algorithms within this area is presented.

  19. Signal Quality Improvement Algorithms for MEMS Gyroscope-Based Human Motion Analysis Systems: A Systematic Review.

    Science.gov (United States)

    Du, Jiaying; Gerdtman, Christer; Lindén, Maria

    2018-04-06

    Motion sensors such as MEMS gyroscopes and accelerometers are characterized by a small size, light weight, high sensitivity, and low cost. They are used in an increasing number of applications. However, they are easily influenced by environmental effects such as temperature change, shock, and vibration. Thus, signal processing is essential for minimizing errors and improving signal quality and system stability. The aim of this work is to investigate and present a systematic review of different signal error reduction algorithms that are used for MEMS gyroscope-based motion analysis systems for human motion analysis or have the potential to be used in this area. A systematic search was performed with the search engines/databases of the ACM Digital Library, IEEE Xplore, PubMed, and Scopus. Sixteen papers that focus on MEMS gyroscope-related signal processing and were published in journals or conference proceedings in the past 10 years were found and fully reviewed. Seventeen algorithms were categorized into four main groups: Kalman-filter-based algorithms, adaptive-based algorithms, simple filter algorithms, and compensation-based algorithms. The algorithms were analyzed and presented along with their characteristics such as advantages, disadvantages, and time limitations. A user guide to the most suitable signal processing algorithms within this area is presented.
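As a minimal illustration of the first algorithm group in the two review records above, a scalar Kalman filter applied to a noisy rate signal might look like this. The constant-rate model and the tuning constants are assumptions for the sketch, not values taken from any of the sixteen reviewed papers:

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.04):
    """Scalar Kalman filter smoothing a noisy rate signal (sketch).

    q is the process-noise variance and r the measurement-noise variance;
    both are illustrative tuning constants, not values from the review.
    """
    x, p = 0.0, 1.0              # state estimate and its variance
    smoothed = []
    for z in measurements:
        p += q                   # predict: constant-rate model, uncertainty grows
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update with the innovation
        p *= 1.0 - k
        smoothed.append(x)
    return smoothed

# Synthetic gyroscope output: a constant 0.5 deg/s rate plus noise.
rng = random.Random(1)
noisy = [0.5 + rng.gauss(0.0, 0.2) for _ in range(500)]
smoothed = kalman_1d(noisy)
```

Raising q makes the filter track rapid rate changes more closely at the cost of passing more noise through, which is the basic trade-off the adaptive algorithms in the review try to manage automatically.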

  20. Is Recreational Soccer Effective for Improving VO2max? A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Milanović, Zoran; Pantelić, Saša; Čović, Nedim; Sporiš, Goran; Krustrup, Peter

    2015-09-01

    Soccer is the most popular sport worldwide, with a long history and currently more than 500 million active participants, of whom 300 million are registered football club members. On the basis of scientific findings showing positive fitness and health effects of recreational soccer, FIFA (Fédération Internationale de Football Association) introduced the slogan "Playing football for 45 min twice a week – best prevention of non-communicable diseases" in 2010. The objective of this paper was to perform a systematic review and meta-analysis of the literature to determine the effects of recreational soccer on maximal oxygen uptake (VO2max). Six electronic databases (MEDLINE, PubMed, SPORTDiscus, Web of Science, CINAHL and Google Scholar) were searched for original research articles. A manual search was performed to cover the areas of recreational soccer, recreational physical activity, recreational small-sided games and VO2max using the following key terms, either singly or in combination: recreational small-sided games, recreational football, recreational soccer, street football, street soccer, effect, maximal oxygen uptake, peak oxygen uptake, cardiorespiratory fitness, VO2max. The inclusion criteria were divided into four sections: type of study, type of participants, type of interventions and type of outcome measures. Probabilistic magnitude-based inferences for meta-analysed effects were based on standardised thresholds for small, moderate and large changes (0.2, 0.6 and 1.2, respectively) derived from between-subject standard deviations for baseline fitness. Seventeen studies met the inclusion criteria and were included in the systematic review and meta-analysis. Mean differences showed that VO2max increased by 3.51 mL/kg/min (95% CI 3.07–4.15) over a recreational soccer training programme in comparison with other training models. The meta-analysed effects of recreational soccer on VO2max compared with the controls of no exercise, continuous running and strength
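The pooling behind a mean-difference meta-analysis of this kind can be sketched with the inverse-variance (fixed-effect) formula. The per-study values below are invented for illustration; the review's actual model and its seventeen studies are not reproduced here:

```python
import math

# Hypothetical per-study mean differences in VO2max (mL/kg/min) and their
# standard errors; invented values, not the review's data.
studies = [(3.2, 0.6), (4.1, 0.9), (2.8, 0.5), (3.9, 0.7)]

def fixed_effect_pool(studies):
    """Inverse-variance (fixed-effect) pooling of mean differences."""
    weights = [1.0 / se ** 2 for _, se in studies]
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval from the normal approximation.
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

pooled, (ci_lo, ci_hi) = fixed_effect_pool(studies)
```

Each study is weighted by the inverse of its variance, so precise studies dominate the pooled estimate; a random-effects model would additionally widen the interval for between-study heterogeneity.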

  1. Improving emissions inventories in North America through systematic analysis of model performance during ICARTT and MILAGRO

    Science.gov (United States)

    Mena, Marcelo Andres

    During 2004 and 2006 the University of Iowa provided air quality forecast support for flight planning of the ICARTT and MILAGRO field campaigns. A method for improving model performance in comparison to observations is shown. The method allows identifying sources of model error from boundary conditions and emissions inventories. Simultaneous analysis of horizontal interpolation of model error and error covariance showed that the error in ozone modeling is highly correlated with the error of its precursors, and that there is geographical correlation as well. During ICARTT, ozone modeling error was reduced by updating the National Emissions Inventory from the 1999 to the 2001 version, and further by updating large point source emissions from continuous monitoring data. Additional improvements were achieved by reducing area emissions of NOx by 60% for states in the Southeast United States. Ozone error was highly correlated with NOy error during this campaign, and ozone production in the United States was most sensitive to NOx emissions. During MILAGRO, model performance in terms of correlation coefficients was higher, but ozone modeling error was large due to overestimation of NOx and VOC emissions in Mexico City during forecasting. Large model improvements were obtained by decreasing NOx emissions in Mexico City by 50% and VOC emissions by 60%. Recurring ozone error is spatially correlated with CO and NOy error. Sensitivity studies show that Mexico City aerosol can reduce regional photolysis rates by 40% and ozone formation by 5-10%. Mexico City emissions can enhance NOy and O3 concentrations over the Gulf of Mexico by up to 10-20% and can convert regional ozone production regimes from VOC-limited to NOx-limited. A method of interpolation of observations along flight tracks is shown, which can be used to infer the direction of outflow plumes. Ratios such as O3/NOy and NOx/NOy can provide information on chemical characteristics of the plume, such as age.

  2. Error analysis and system improvements in phase-stepping methods for photoelasticity

    International Nuclear Information System (INIS)

    Wenyan Ji

    1997-11-01

    In the past, automated photoelasticity has been demonstrated to be one of the most efficient techniques for determining the complete state of stress in a 3-D component. However, the measurement accuracy, which depends on many aspects of both the theoretical foundations and the experimental procedures, has not been studied thoroughly. The objective of this thesis is to reveal the intrinsic properties of the errors, provide methods for reducing them and, finally, improve the system accuracy. A general formulation for a polariscope with all the optical elements in arbitrary orientations was derived using Mueller matrices. The derivation reveals an inherent connectivity among the optical elements and provides insight into the errors. This formulation also shows a common foundation among photoelastic techniques; consequently, these techniques share many common error sources. The phase-stepping system proposed by Patterson and Wang was used as an exemplar to analyse the errors and develop the proposed improvements. This system can be divided into four functional parts: the optical system, the light source, the image acquisition equipment and the image analysis software. All possible error sources were investigated separately, and methods for reducing the influence of the errors and improving the system accuracy are presented. To identify the contribution of each possible error to the final system output, a model was used to simulate the errors and analyse their consequences, so that the contribution of each error source to the results can be estimated quantitatively and the accuracy of the system improved. For a conventional polariscope, the system accuracy can be as high as 99.23% for the fringe order, with an error of less than 5 degrees for the isoclinic angle. The PSIOS system is limited to low fringe orders; for a fringe order of less than 1.5, the accuracy is 94.60% for fringe order.
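
    The Mueller-matrix formulation described above propagates a Stokes vector through the 4×4 matrix of each optical element in turn. A minimal sketch with ideal linear polarizers only (no retarders, no error terms), using assumed angles:

```python
import numpy as np

def polarizer(theta):
    """Mueller matrix of an ideal linear polarizer at angle theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return 0.5 * np.array([
        [1,     c,     s,     0],
        [c,     c * c, s * c, 0],
        [s,     s * c, s * s, 0],
        [0,     0,     0,     0],
    ])

# Unpolarized, unit-intensity light as a Stokes vector [I, Q, U, V].
s_in = np.array([1.0, 0.0, 0.0, 0.0])

# Chain elements by left-multiplying their Mueller matrices, as in the
# general polariscope formulation: the last element is applied last.
s_single = polarizer(0.0) @ s_in                          # one polarizer
s_crossed = polarizer(np.pi / 2) @ polarizer(0.0) @ s_in  # crossed pair

print(s_single[0])        # transmitted intensity: 0.5
print(abs(s_crossed[0]))  # extinction between crossed polarizers: ~0
```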

  3. Reduced COPD Exacerbation Risk Correlates With Improved FEV1: A Meta-Regression Analysis.

    Science.gov (United States)

    Zider, Alexander D; Wang, Xiaoyan; Buhr, Russell G; Sirichana, Worawan; Barjaktarevic, Igor Z; Cooper, Christopher B

    2017-09-01

    The mechanism by which various classes of medication reduce COPD exacerbation risk remains unknown. We hypothesized a correlation between reduced exacerbation risk and improvement in airway patency as measured by FEV1. By systematic review, COPD trials were identified that reported therapeutic changes in predose FEV1 (dFEV1) and occurrence of moderate to severe exacerbations. Using meta-regression analysis, a model was generated with dFEV1 as the moderator variable and the absolute difference in exacerbation rate (RD), the ratio of exacerbation rates (RR), or the hazard ratio (HR) as the dependent variable. The analysis of RD and RR included 119,227 patients, and the HR analysis included 73,475 patients. For every 100-mL change in predose FEV1, the HR decreased by 21% (95% CI, 17-26; P < .001; R² = 0.85) and the absolute exacerbation rate decreased by 0.06 per patient per year (95% CI, 0.02-0.11; P = .009; R² = 0.05), which corresponded to an RR of 0.86 (95% CI, 0.81-0.91; P < .001; R² = 0.20). The relationship with exacerbation risk remained statistically significant across multiple subgroup analyses. A significant correlation between increased FEV1 and lower COPD exacerbation risk suggests that airway patency is an important mechanism responsible for this effect. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
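
    The meta-regression at the core of this analysis can be illustrated as a weighted least-squares fit of a log effect size against the moderator variable. The trial-level numbers below are synthetic stand-ins, not the study's data:

```python
import numpy as np

# Hypothetical trial-level data: change in predose FEV1 (mL), log hazard
# ratio of exacerbation, and inverse-variance weights (all illustrative).
dfev1   = np.array([50.0, 80.0, 100.0, 120.0, 150.0])
log_hr  = np.array([-0.10, -0.18, -0.24, -0.28, -0.35])
weights = np.array([120.0, 90.0, 150.0, 60.0, 100.0])  # ~1 / Var(log HR)

# Weighted least squares for log(HR) ~ a + b * dFEV1, the core of a
# (fixed-effect) meta-regression: solve the weighted normal equations.
X = np.column_stack([np.ones_like(dfev1), dfev1])
W = np.diag(weights)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ log_hr)

# Slope rescaled to a 100-mL improvement, reported as a hazard ratio.
print(f"HR per 100 mL: {np.exp(beta[1] * 100):.2f}")  # ~0.78 with these numbers
```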

  4. Improvements in quantification of low z element analysis for Sr- and conventional TXRF

    International Nuclear Information System (INIS)

    Baur, K.; Brennan, S.; Pianetta, P.; Kerner, J.; Zhu, Q.; Burrow, B.

    2000-01-01

    As the dimensions of integrated circuits continue to shrink, the amount of tolerable contamination on Si wafer surfaces also decreases. Contaminants of primary concern are transition metals and light elements such as Al. Total reflection x-ray fluorescence (TXRF) spectroscopy using synchrotron radiation from the Stanford Synchrotron Radiation Laboratory (SSRL) is one of the most powerful techniques for trace impurity analysis on Si wafer surfaces; it is also among the most sensitive techniques and the only one that is non-destructive. Having established a detection sensitivity for transition elements better than that required by the semiconductor industry, the current effort focuses on improving the sensitivity of detection and data analysis for light elements. Because of the neighboring Si signal from the substrate, this can be achieved only by tuning the excitation energy below the Si-K absorption edge. For conventional TXRF systems this can be done by using the W-M fluorescence line (1.78 keV) for excitation or by exploiting the tunability of synchrotron radiation. However, this results in a substantial increase in background due to resonant x-ray Raman scattering, which dominates the background of the Al-K fluorescence line and consequently limits the achievable sensitivity for detecting Al surface contaminants. In particular, we find that for a precise determination of the achievable sensitivity, the specific shape of the continuous Raman background must be used in the deconvolution. This data analysis opens a new perspective for conventional TXRF systems to overcome background problems in quantification, and first results are presented. (author)

  5. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feng, E-mail: fwang@unu.edu [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study are provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams and encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly owing to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars of IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between the various data points. With this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which in turn increases the reliability of e-waste estimates compared with approaches without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions, and shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimates.
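
    The sales-stock-lifespan linkage central to the IOA method can be sketched as a convolution of historical sales with a discard-probability (lifespan) distribution. The sales figures and Weibull parameters below are illustrative assumptions, not the Dutch case-study data:

```python
import math

# Illustrative sales time series (units put on market per year); hypothetical numbers.
sales = {2005: 1000, 2006: 1200, 2007: 1500, 2008: 1400}

def discard_probability(age, shape=2.0, scale=7.0):
    """P(product discarded at integer age), via CDF differences of a Weibull lifespan."""
    cdf = lambda t: 1.0 - math.exp(-((t / scale) ** shape)) if t > 0 else 0.0
    return cdf(age + 1) - cdf(age)

def ewaste_generated(year):
    """Sales-lifespan IOA estimate: convolve past sales with discard probabilities."""
    return sum(units * discard_probability(year - sold)
               for sold, units in sales.items() if sold <= year)

# E-waste arising in one target year from all earlier cohorts of sales.
print(round(ewaste_generated(2012), 1))
```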

  6. Using the failure mode and effects analysis model to improve parathyroid hormone and adrenocorticotropic hormone testing

    Directory of Open Access Journals (Sweden)

    Magnezi R

    2016-12-01

    Racheli Magnezi,1 Asaf Hemi,1 Rina Hemi2 1Department of Management, Public Health and Health Systems Management Program, Bar Ilan University, Ramat Gan; 2Endocrine Service Unit, Sheba Medical Center, Tel Aviv, Israel. Background: Risk management in health care systems applies to all hospital employees and directors, as they deal with human life and emergency routines. There is a constant need to decrease risk and increase patient safety in the hospital environment. The purpose of this article is to review the laboratory testing procedures for parathyroid hormone and adrenocorticotropic hormone (which are characterized by short half-lives), to track failure modes and risks, and to offer solutions for preventing them. During a routine quality improvement review at the Endocrine Laboratory in Tel Hashomer Hospital, we discovered that these tests are frequently repeated unnecessarily because of multiple failures. The repetition of the tests inconveniences patients and creates extra work for the laboratory and logistics personnel, as well as for the nurses and doctors who must perform many tasks with limited resources. Methods: A team of eight staff members, led by the Head of the Endocrine Laboratory, carried out the analysis. The failure mode and effects analysis (FMEA) model was used to analyze the laboratory testing procedure; it was designed to simplify the process steps and to identify and rank possible failures. Results: A total of 23 failure modes were found within the process, 19 of which were ranked by level of severity. The FMEA model prioritizes failures by their risk priority number (RPN). For example, the most serious failure was the delay after the samples were collected from the department (RPN = 226.1). Conclusion: This model helped us to visualize the process in a simple way. After analyzing the information, solutions were proposed to prevent failures, and a method to completely avoid the top four problems was also developed.
Keywords: failure mode
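
    The RPN ranking used by FMEA multiplies severity, occurrence, and detection scores to prioritize failures. A minimal sketch with hypothetical failure modes (not the study's actual worksheet):

```python
# Illustrative FMEA worksheet: failure modes with severity (S), occurrence (O),
# and detection (D) scores on 1-10 scales. Entries are hypothetical.
failure_modes = [
    ("Sample delayed after collection",    9, 7, 4),
    ("Tube mislabeled",                    8, 3, 5),
    ("Sample not kept chilled in transit", 7, 5, 6),
    ("Analyzer calibration drift",         6, 4, 3),
]

# Risk Priority Number: RPN = S * O * D; higher RPN means fix first.
ranked = sorted(((name, s * o * d) for name, s, o, d in failure_modes),
                key=lambda item: item[1], reverse=True)

for name, rpn in ranked:
    print(f"RPN {rpn:4d}  {name}")
```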

  7. Improving configuration management of thermalhydraulic analysis by automating the linkage between pipe geometry and plant idealization

    International Nuclear Information System (INIS)

    Gibb, R.; Girard, R.; Thompson, W.

    1997-01-01

    All safety analysis codes require some representation of actual plant data as part of their input. Such representations, referred to at Point Lepreau Generating Station (PLGS) as plant idealizations, may include piping layout; orifice, pump or valve opening characteristics; boundary conditions of various sorts; reactor physics parameters; etc. As computing power increases, the numerical capabilities of thermalhydraulic analysis tools become more sophisticated, requiring more detailed assessments and consequently more complex idealizations of the system models. Thus a need has emerged to create a precise plant model layout in electronic form that ensures a realistic representation of the plant systems and from which analytical approximations of any chosen degree of accuracy may be created. The benefits of this process are twofold. First, the job of developing a plant idealization is made simpler, and therefore cheaper for the utility. More important, however, are the improvements in documentation and reproducibility that the process imparts to the resultant idealization. Just as the software that performs the numerical operations on the input data must be subject to verification and validation, equally robust measures must be taken to ensure that these operations are applied to valid, formally documented idealizations. Since CATHENA is one of the most important thermalhydraulic codes used for safety analysis at PLGS, the main effort was directed towards the system plant models for this code. This paper reports the results of the work carried out at PLGS and ANSL to link the existing piping database to the actual CATHENA plant idealization. An introduction to the concept is given first, followed by a description of the databases, the supervisory tool that manages the data, and the associated software. An intermediate code applied some thermalhydraulic rules to the data and translated the resultant data

  8. Qualitative Improvement Methods Through Analysis of Inquiry Contents for Cancer Registration

    Science.gov (United States)

    Boo, Yoo-Kyung; Lim, Hyun-Sook; Kim, Jung-Eun; Kim, Kyoung-Beom; Won, Young-Joo

    2017-06-25

    Background: In Korea, the national cancer database was constructed after the initiation of the national cancer registration project in 1980, and the annual national cancer registration report has been published every year since 2005. Consequently, data management must begin at the data collection stage in order to ensure quality. Objectives: To determine the suitability of cancer registries’ inquiry tools through analysis of inquiries to the Korea Central Cancer Registry (KCCR), and to identify needs for improving the quality of cancer registration. Methods: Results of 721 inquiries to the KCCR from 2000 to 2014 were analyzed by inquiry year, question type, and medical institution characteristics. Using Stata version 14.1, descriptive analysis was performed to identify general participant characteristics, and chi-square analysis was applied to investigate significant differences in distribution characteristics by factors affecting the quality of cancer registration data. Results: The number of inquiries increased in 2005–2009. During this period there were various changes, including the addition of cancer registration items such as brain tumors, and guideline updates. Of the inquirers, 65.3% worked at hospitals in metropolitan cities, and 60.89% of hospitals had 601–1000 beds. Tertiary hospitals had the highest number of inquiries (64.91%), and the most frequent question types were histological codes (353; 48.96%), primary sites (92; 12.76%), and reportability (76; 10.54%). Conclusions: A cancer registration inquiry system is an effective resource when registrars are unsure about codes during cancer registration, or when confronting cancer cases for which prior clinical knowledge or information on the cancer registration guidelines is insufficient.

  9. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    International Nuclear Information System (INIS)

    Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter

    2013-01-01

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study are provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams and encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly owing to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars of IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between the various data points. With this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which in turn increases the reliability of e-waste estimates compared with approaches without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions, and shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimates.

  10. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples; taxonomic profiling and binning methods in particular are commonly used for such tasks. Tools in these two categories employ several techniques, e.g., read mapping, k-mer alignment, and composition analysis, and vary in how they construct the corresponding reference sequence databases. In addition, different tools perform well on different datasets and configurations. All this variation makes it complicated for researchers to decide which methods to use, and installation, configuration, and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, MetaMeta provides more sensitive and reliable results, with the presence of each organism supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available on the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at https://gitlab.com/rki_bioinformatics.
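
    The co-occurrence integration idea can be sketched roughly as follows; the tool names, abundances, minimum-support rule, and simple averaging are illustrative assumptions, not MetaMeta's actual scoring scheme:

```python
from collections import defaultdict

# Taxonomic profiles from three hypothetical tools: taxon -> relative abundance.
profiles = {
    "toolA": {"E. coli": 0.5, "B. subtilis": 0.3, "S. aureus": 0.2},
    "toolB": {"E. coli": 0.6, "B. subtilis": 0.4},
    "toolC": {"E. coli": 0.4, "S. aureus": 0.1, "P. putida": 0.5},
}

# Simplified co-occurrence integration: keep taxa reported by a minimum number
# of tools, average their abundances over all tools, then renormalize.
MIN_TOOLS = 2

support = defaultdict(list)
for prof in profiles.values():
    for taxon, abundance in prof.items():
        support[taxon].append(abundance)

kept = {t: sum(a) / len(profiles) for t, a in support.items() if len(a) >= MIN_TOOLS}
total = sum(kept.values())
merged = {t: ab / total for t, ab in kept.items()}

print(sorted(merged))  # "P. putida" (reported by one tool only) is dropped
```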

  11. Recommendations to improve imaging and analysis of brain lesion load and atrophy in longitudinal studies of multiple sclerosis

    DEFF Research Database (Denmark)

    Vrenken, H; Jenkinson, M; Horsfield, M A

    2013-01-01

    Focal lesions and brain atrophy are the most extensively studied aspects of multiple sclerosis (MS), but the image acquisition and analysis techniques used can be further improved, especially those for studying within-patient changes of lesion load and atrophy longitudinally. Reviewing magnetic resonance image analysis methods for assessing brain lesion load and atrophy, this paper makes recommendations to improve these measures for longitudinal studies of MS. Briefly, they are: (1) images should be acquired using 3D pulse sequences, with near-isotropic spatial resolution and multiple image contrasts ...

  12. An Improved Rigid Multibody Model for the Dynamic Analysis of the Planetary Gearbox in a Wind Turbine

    Directory of Open Access Journals (Sweden)

    Wenguang Yang

    2016-01-01

    This paper proposes an improved rigid multibody model for the dynamic analysis of the planetary gearbox in a wind turbine. The improvements mainly include choosing the inertial frame as the reference frame of the carrier, the ring, and the sun, and adding a new degree of freedom for each planet. An element assembly method is introduced to build the model, and a time-varying mesh stiffness model is presented. A planetary gear study case is employed to verify the validity of the improved model. Comparisons between the improved model and the traditional model show that their natural characteristics are very close; the improved model obtains the correct equivalent moment of inertia of the planetary gear in transient simulation, and all the rotation speeds satisfy the transmission relationships well; harmonic resonance and resonance modulation phenomena can be found in their vibration signals. The improved model is then applied in a multistage gearbox dynamics analysis to demonstrate its potential. Modal analysis and transient analysis, with and without time-varying mesh stiffness, are conducted. The rotation speeds from the transient analysis are consistent with theory, and resonance modulation can be found in the vibration signals.
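
    Time-varying mesh stiffness of the kind referenced here is often approximated as a rectangular wave alternating between double- and single-tooth-pair contact over each mesh cycle. A sketch with assumed parameter values (not the paper's):

```python
def mesh_stiffness(t, k_mean=4.0e8, k_amp=1.0e8, mesh_freq=500.0, ratio=0.6):
    """Rectangular-wave approximation of time-varying gear mesh stiffness (N/m).

    ratio is the fraction of the mesh cycle with two tooth pairs in contact;
    all parameter values here are illustrative assumptions.
    """
    phase = (t * mesh_freq) % 1.0  # position within the current mesh cycle
    return k_mean + k_amp if phase < ratio else k_mean - k_amp

# Stiffness alternates between double-pair (high) and single-pair (low)
# contact as the gears rotate through one mesh cycle.
samples = [mesh_stiffness(t / 5000.0) for t in range(10)]
print(samples)
```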