WorldWideScience

Sample records for improved phosphopeptide analysis

  1. Online Nanoflow Multidimensional Fractionation for High Efficiency Phosphopeptide Analysis*

    Science.gov (United States)

    Ficarro, Scott B.; Zhang, Yi; Carrasco-Alfonso, Marlene J.; Garg, Brijesh; Adelmant, Guillaume; Webber, James T.; Luckey, C. John; Marto, Jarrod A.

    2011-01-01

    Despite intense, continued interest in global analyses of signaling cascades through mass spectrometry-based studies, the large-scale, systematic production of phosphoproteomics data has been hampered in part by inefficient fractionation strategies subsequent to phosphopeptide enrichment. Here we explore two novel multidimensional fractionation strategies for analysis of phosphopeptides. In the first technique we utilize aliphatic ion pairing agents to improve retention of phosphopeptides at high pH in the first dimension of a two-dimensional RP-RP configuration. The second approach is based on the addition of strong anion exchange as the second dimension in a three-dimensional reversed phase (RP)-strong anion exchange (SAX)-RP configuration. Both techniques provide for automated, online data acquisition, with the 3-D platform providing the highest performance both in terms of separation peak capacity and the number of unique phosphopeptide sequences identified per μg of cell lysate consumed. Our integrated RP-SAX-RP platform provides several analytical figures of merit, including: (1) orthogonal separation mechanisms in each dimension; (2) high separation peak capacity; (3) efficient retention of singly- and multiply-phosphorylated peptides; (4) compatibility with automated, online LC-MS analysis. We demonstrate the reproducibility of RP-SAX-RP and apply it to the analysis of phosphopeptides derived from multiple biological contexts, including an in vitro model of acute myeloid leukemia in addition to primary polyclonal CD8+ T-cells activated in vivo through bacterial infection and then purified from a single mouse. PMID:21788404
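
    The peak-capacity advantage of adding an orthogonal dimension can be approximated by treating the total peak capacity as the product of the per-dimension capacities, an idealization that assumes full orthogonality. A minimal sketch with assumed, illustrative per-dimension values (not taken from the record above):

```python
# Illustrative estimate of multidimensional peak capacity. Assumes fully
# orthogonal dimensions (an idealization); the per-dimension capacities
# below are hypothetical and are not taken from the record above.

def total_peak_capacity(per_dimension_capacities):
    """Product of per-dimension peak capacities (ideal orthogonality)."""
    total = 1
    for n in per_dimension_capacities:
        total *= n
    return total

rp_rp = [10, 150]          # hypothetical: 10 first-dimension fractions x 150 in the RP gradient
rp_sax_rp = [10, 8, 150]   # hypothetical: an added 8-fraction SAX dimension

print("2-D RP-RP    :", total_peak_capacity(rp_rp))        # 1500
print("3-D RP-SAX-RP:", total_peak_capacity(rp_sax_rp))    # 12000
```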

  2. Enhanced detection and desalting free protocol for phosphopeptides eluted from immobilized Fe (III) affinity chromatography in direct MALDI TOF analysis.

    Science.gov (United States)

    Zhu, Li; Zhang, Jing; Guo, Yinlong

    2014-01-16

    The IMAC strategy is widely used for phosphopeptide enrichment, but most current eluents contain a large amount of salt, which must be removed before MS detection. Here, we present techniques to elute phosphopeptides with low-ionization-efficiency reagents, which can be left in the eluate for direct MS analysis, eliminating desalting and the subsequent steps. Several reagents were studied, including 5-sulfosalicylic acid dihydrate, acetylacetone and glyphosate. The results show that glyphosate has outstanding advantages: only monophosphopeptides are eluted with glyphosate solution, whereas all phosphopeptides are eluted with negatively charged glyphosate ions at pH 9. Moreover, the high ionic strength minimizes nonspecific electrostatic interactions in the elution step and limits the generation of phosphopeptide-metal ion adducts such as sodium or Fe(3+) counterparts. The S/N of phosphopeptides could be enhanced 3- to 5-fold in MALDI MS detection, and phosphopeptide recovery is greatly improved compared with counterparts eluted by commonly used elution buffers. By applying this reagent to IMAC elution, the whole experimental process becomes more convenient, time-saving and cost-saving, which is of great importance for the enrichment and detection of phosphopeptides in phosphoproteomics research. This desalting-free, signal-enhanced elution method improves the sensitivity and detection of phosphopeptides in MALDI TOF MS analysis while saving both time and cost. With these advantages, it is highly appropriate for high-throughput phosphoproteomics analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Phosphopeptide analysis of rodent epididymal spermatozoa.

    Science.gov (United States)

    Baker, Mark A; Hetherington, Louise; Weinberg, Anita; Velkov, Tony

    2014-12-30

    Spermatozoa are quite unique amongst cell types. Although produced in the testis, both nuclear gene transcription and translation are switched off once the precursor round cell begins to elongate and differentiate into what is morphologically recognized as a spermatozoon. However, the spermatozoon is very immature, having no ability for motility or egg recognition. Both of these events occur once the spermatozoa transit a secondary organ known as the epididymis. During the ~12 day passage that it takes for a sperm cell to pass through the epididymis, post-translational modifications of existing proteins play a pivotal role in the maturation of the cell. One major facet of this maturation is protein phosphorylation. In order to characterize phosphorylation events taking place during sperm maturation, both pure sperm cell populations and phosphopeptide pre-fractionation must be established. Using back flushing techniques, a method for the isolation of pure spermatozoa of high quality and yield from the distal or caudal epididymides is outlined. The steps for solubilization, digestion, and pre-fractionation of sperm phosphopeptides through TiO2 affinity chromatography are explained. Once isolated, phosphopeptides can be analyzed by MS to identify protein phosphorylation events on specific amino acid residues and to quantify the levels of phosphorylation taking place during the sperm maturation processes.

  4. Improved detection of hydrophilic phosphopeptides using graphite powder microcolumns and mass spectrometry: evidence for in vivo doubly phosphorylated dynamin I and dynamin III

    DEFF Research Database (Denmark)

    Larsen, Martin Røssel; Graham, Mark E; Robinson, Phillip J

    2004-01-01

    A common strategy in proteomics to improve the number and quality of peptides detected by mass spectrometry (MS) is to desalt and concentrate proteolytic digests using reversed phase (RP) chromatography prior to analysis. However, this does not allow for detection of small or hydrophilic peptides, or peptides altered in hydrophilicity such as phosphopeptides. We used microcolumns to compare the ability of RP resin or graphite powder to retain phosphopeptides. A number of standard phosphopeptides and a biologically relevant phosphoprotein, dynamin I, were analyzed. MS revealed that some phosphopeptides… …a large improvement in the detection of small amounts of phosphopeptides by MS, and the approach has major implications for both small- and large-scale projects in phosphoproteomics.

  5. Phosphoric acid as a matrix additive for MALDI MS analysis of phosphopeptides and phosphoproteins

    DEFF Research Database (Denmark)

    Kjellström, Sven; Jensen, Ole Nørregaard

    2004-01-01

    …2,5-dihydroxybenzoic acid (2,5-DHB) matrix. Phosphoric acid in combination with 2,5-DHB matrix significantly enhanced phosphopeptide ion signals in MALDI mass spectra of crude peptide mixtures derived from the phosphorylated proteins alpha-casein and beta-casein. The beneficial effects of adding up to 1% phosphoric acid to 2,5-DHB were also observed in LC-MALDI-MS analysis of tryptic phosphopeptides of B. subtilis PrkC phosphoprotein. Finally, the mass resolution of MALDI mass spectra of intact proteins was significantly improved by using phosphoric acid in 2,5-DHB matrix.

  6. Assessment of phosphopeptide enrichment/precipitation method for LC-MS/MS based phosphoproteomic analysis of plant tissue

    DEFF Research Database (Denmark)

    Ye, Juanying; Rudashevskaya, Elena; Hansen, Thomas Aarup

    …is necessary. At present, numerous phosphopeptide enrichment approaches have been established and applied to complex biological samples. We and others have reported that multi-step phosphopeptide purification methods enable better recovery of phosphopeptides and achieve higher selectivity and sensitivity than standard sample preparation protocols. Here, we combine three phosphopeptide enrichment methods (IMAC, TiO2 and calcium phosphate precipitation (CPP)) and apply them to phosphoproteomic analysis of an Arabidopsis thaliana plasma membrane preparation. Method: Plant plasma membranes were isolated from Arabidopsis… The overlap between the three enrichment experiments was quite small. We are currently investigating further combinations of enrichment methods: SIMAC enrichment and the combination of CPP and IMAC enrichment. Samples will be analyzed by LTQ-Orbitrap-ETD MS, and the behavior of phosphopeptides on CID…

  7. Evaluating multiplexed quantitative phosphopeptide analysis on a hybrid quadrupole mass filter/linear ion trap/orbitrap mass spectrometer.

    Science.gov (United States)

    Erickson, Brian K; Jedrychowski, Mark P; McAlister, Graeme C; Everley, Robert A; Kunz, Ryan; Gygi, Steven P

    2015-01-20

    As a driver for many biological processes, phosphorylation remains an area of intense research interest. Advances in multiplexed quantitation utilizing isobaric tags (e.g., TMT and iTRAQ) have the potential to create a new paradigm in quantitative proteomics. New instrumentation and software are propelling these multiplexed workflows forward, which results in more accurate, sensitive, and reproducible quantitation across tens of thousands of phosphopeptides. This study assesses the performance of multiplexed quantitative phosphoproteomics on the Orbitrap Fusion mass spectrometer. Utilizing a two-phosphoproteome model of precursor ion interference, we assessed the accuracy of phosphopeptide quantitation across a variety of experimental approaches. These methods included the use of synchronous precursor selection (SPS) to enhance TMT reporter ion intensity and accuracy. We found that (i) ratio distortion remained a problem for phosphopeptide analysis in multiplexed quantitative workflows, (ii) ratio distortion can be overcome by the use of an SPS-MS3 scan, (iii) interfering ions generally possessed a different charge state than the target precursor, and (iv) selecting only the phosphate neutral loss peak (single notch) for the MS3 scan still provided accurate ratio measurements. Remarkably, these data suggest that the underlying cause of interference may not be due to coeluting and cofragmented peptides but instead from consistent, low level background fragmentation. Finally, as a proof-of-concept 10-plex experiment, we compared phosphopeptide levels from five murine brains to five livers. In total, the SPS-MS3 method quantified 38 247 phosphopeptides, corresponding to 11 000 phosphorylation sites. With 10 measurements recorded for each phosphopeptide, this equates to more than 628 000 binary comparisons collected in less than 48 h.
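
    Ratio compression by co-isolated interference, the problem that the SPS-MS3 approach is designed to mitigate, can be illustrated with a toy calculation; the intensities below are hypothetical and the model ignores noise and isotopic impurity.

```python
# Toy model of TMT reporter-ion ratio compression by co-isolated interference.
# All intensities are hypothetical; real data also involve noise, isotopic
# impurity and multiple interfering species.

def observed_ratio(true_a, true_b, interference):
    """Reporter ratio when an interfering population (present at ~1:1)
    adds the same signal to both channels."""
    return (true_a + interference) / (true_b + interference)

true_a, true_b = 10.0, 1.0   # a true 10:1 change for the target phosphopeptide
for interference in (0.0, 1.0, 5.0):
    print(f"interference={interference:>4}: observed ratio = "
          f"{observed_ratio(true_a, true_b, interference):.2f}")
# With no interference the 10:1 ratio is recovered; as interference grows the
# observed ratio is compressed toward 1:1, which is the motivation for SPS-MS3.
```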

  8. Identification of phosphopeptides with unknown cleavage specificity by a de novo sequencing assisted database search strategy.

    Science.gov (United States)

    Dong, Mingming; Ye, Mingliang; Cheng, Kai; Dong, Jing; Zhu, Jun; Qin, Hongqiang; Bian, Yangyang; Zou, Hanfa

    2014-11-01

    In theory, proteases with broad cleavage specificity could be applied to digest protein samples to improve phosphoproteomic analysis coverage. However, in practice this approach is seldom employed. This is because the identification of phosphopeptides without enzyme specificity by a conventional database search strategy is extremely difficult due to the huge search space. In this study, we investigated the performance of a de novo sequencing assisted database search strategy for the identification of such phosphopeptides. Firstly, we compared the performance of the conventional database search strategy and the de novo sequencing assisted database search strategy for the identification of peptides and phosphopeptides without setting enzyme specificity. It was found that the identification sensitivity dropped significantly for the conventional strategy, while it was only slightly decreased for the new approach. Then, this new search strategy was applied to identify phosphopeptides generated by Proteinase K digestion, which resulted in the identification of 717 phosphopeptides. Finally, this strategy was utilized for the identification of serum endogenous phosphopeptides, which were generated in vivo by different kinds of proteases and kinases, and the identification of 68 unique serum endogenous phosphopeptides was successfully achieved. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
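
    The search-space problem described above can be made concrete by counting candidate peptides with and without enzyme specificity. A minimal sketch; the sequence, length window and missed-cleavage settings are arbitrary examples, not the authors' parameters.

```python
# Count candidate peptides for a database search with and without enzyme
# specificity. The sequence, length window and missed-cleavage settings are
# arbitrary examples; real protein databases make the gap far larger.

def nonspecific_candidates(seq, min_len=6, max_len=30):
    """All substrings within the length window (no cleavage rule)."""
    n = len(seq)
    return sum(1 for i in range(n)
               for j in range(i + min_len, min(n, i + max_len) + 1))

def tryptic_candidates(seq, min_len=6, max_len=30, missed=2):
    """Fully tryptic peptides (cleave after K/R, not before P)."""
    sites = [0] + [i + 1 for i in range(len(seq) - 1)
                   if seq[i] in "KR" and seq[i + 1] != "P"] + [len(seq)]
    count = 0
    for a in range(len(sites) - 1):
        for b in range(a + 1, min(a + missed + 2, len(sites))):
            if min_len <= sites[b] - sites[a] <= max_len:
                count += 1
    return count

protein = "MKSTEEQLARLPGSDKTVQERFNSPLLAKDGGRTMEELSAKQPWDNRVTSEEK"  # made-up sequence
print("non-specific candidates:", nonspecific_candidates(protein))
print("tryptic candidates     :", tryptic_candidates(protein))
```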

  9. Desalting of phosphopeptides by tandem polypyrrole-c18 reverse phase micropipette tip (TMTip(PPY-C18)) based on hybrid electrostatic, Π-Π stacking and hydrophobic interactions for mass spectrometric analysis.

    Science.gov (United States)

    Zheng, Shi; Wang, Xiaoli; Fu, Jieying; Hu, Xuejiao; Xiao, Xiao; Huang, Lulu; Zhou, Youe; Zhong, Hongying

    2012-04-29

    Desalting and concentration of peptides using reverse phase (RP) C18 chromatographic material based on hydrophobic interaction is a routine approach used in mass spectrometry (MS)-based proteomics. However, MS detection of small hydrophilic peptides, in particular, phosphopeptides that bear multiple negative charges, is challenging due to insufficient binding to the C18 stationary phase. We describe here the development of a new desalting method that takes advantage of the unique properties of polypyrrole (PPY). The presence of positively charged nitrogen atoms under acidic conditions and polyunsaturated bonds in polypyrrole provides the prospect of enhanced adsorption of phosphopeptides or hydrophilic peptides through extra electrostatic and Π-Π stacking interactions in addition to hydrophobic interactions. In tandem with reversed phase C18 chromatographic material, this new desalting method, termed TMTip(PPY-C18), can significantly improve the MS detection of phosphopeptides with multiple phosphate groups and other small hydrophilic peptides. It has been applied not only to tryptic digests of model proteins but also to the analysis of complex lysates of zebrafish eggs. The number of detected phosphate groups on a peptide ranged from 1 to 6. In particular, the polypyrrole-based method can also be used under basic conditions. Thus it provides a useful means to handle peptides that may not be detectable under acidic conditions. It can be envisioned that the TMTip(PPY-C18) should be able to facilitate the exploration of large-scale phosphoproteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Desalting of phosphopeptides by tandem polypyrrole-c18 reverse phase micropipette tip (TMTip(PPY-C18)) based on hybrid electrostatic, Π-Π stacking and hydrophobic interactions for mass spectrometric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zheng Shi; Wang Xiaoli; Fu Jieying; Hu Xuejiao; Xiao Xiao; Huang Lulu; Zhou Youe [Key Laboratory of Pesticides and Chemical Biology, Ministry of Education, College of Chemistry, Central China Normal University, Wuhan, Hubei 430079 (China); Zhong Hongying, E-mail: hyzhong@mail.ccnu.edu.cn [Key Laboratory of Pesticides and Chemical Biology, Ministry of Education, College of Chemistry, Central China Normal University, Wuhan, Hubei 430079 (China)

    2012-04-29

    Highlights: • A new micropipette tip, TMTip(PPY-C18), was developed for desalting of phosphopeptides. • TMTip(PPY-C18) is based on polypyrrole in tandem with C18 chromatographic material. • TMTip(PPY-C18) combines electrostatic, Π-Π stacking and hydrophobic interactions. • TMTip(PPY-C18) can be used in both acidic and basic experimental conditions. - Abstract: Desalting and concentration of peptides using reverse phase (RP) C18 chromatographic material based on hydrophobic interaction is a routine approach used in mass spectrometry (MS)-based proteomics. However, MS detection of small hydrophilic peptides, in particular, phosphopeptides that bear multiple negative charges, is challenging due to insufficient binding to the C18 stationary phase. We describe here the development of a new desalting method that takes advantage of the unique properties of polypyrrole (PPY). The presence of positively charged nitrogen atoms under acidic conditions and polyunsaturated bonds in polypyrrole provides the prospect of enhanced adsorption of phosphopeptides or hydrophilic peptides through extra electrostatic and Π-Π stacking interactions in addition to hydrophobic interactions. In tandem with reversed phase C18 chromatographic material, this new desalting method, termed TMTip(PPY-C18), can significantly improve the MS detection of phosphopeptides with multiple phosphate groups and other small hydrophilic peptides. It has been applied not only to tryptic digests of model proteins but also to the analysis of complex lysates of zebrafish eggs. The number of detected phosphate groups on a peptide ranged from 1 to 6. In particular, the polypyrrole-based method can also be used under basic conditions. Thus it provides a useful means to handle peptides that may not be detectable under acidic conditions. It can be envisioned that the TMTip(PPY-C18) should be able to…

  11. Estimating the Efficiency of Phosphopeptide Identification by Tandem Mass Spectrometry

    Science.gov (United States)

    Hsu, Chuan-Chih; Xue, Liang; Arrington, Justine V.; Wang, Pengcheng; Paez Paez, Juan Sebastian; Zhou, Yuan; Zhu, Jian-Kang; Tao, W. Andy

    2017-06-01

    Mass spectrometry has played a significant role in the identification of unknown phosphoproteins and sites of phosphorylation in biological samples. Analyses of protein phosphorylation, particularly large scale phosphoproteomic experiments, have recently been enhanced by efficient enrichment, fast and accurate instrumentation, and better software, but challenges remain because of the low stoichiometry of phosphorylation and poor phosphopeptide ionization efficiency and fragmentation due to neutral loss. Phosphoproteomics has become an important dimension in systems biology studies, and it is essential to have efficient analytical tools to cover a broad range of signaling events. To evaluate current mass spectrometric performance, we present here a novel method to estimate the efficiency of phosphopeptide identification by tandem mass spectrometry. Phosphopeptides were directly isolated from whole plant cell extracts, dephosphorylated, and then incubated with one of three purified kinases—casein kinase II, mitogen-activated protein kinase 6, and SNF-related protein kinase 2.6—along with 16O4- and 18O4-ATP separately for in vitro kinase reactions. Phosphopeptides were enriched and analyzed by LC-MS. The phosphopeptide identification rate was estimated by comparing phosphopeptides identified by tandem mass spectrometry with phosphopeptide pairs generated by stable isotope labeled kinase reactions. Overall, we found that current high speed and high accuracy mass spectrometers can only identify 20%-40% of total phosphopeptides primarily due to relatively poor fragmentation, additional modifications, and low abundance, highlighting the urgent need for continuous efforts to improve phosphopeptide identification efficiency.
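
    The pairing logic behind such an efficiency estimate can be sketched as follows: light (16O4) and heavy (18O4) kinase reactions should yield phosphopeptide features separated by a fixed mass shift per phosphate, and the efficiency is the fraction of detected pairs that also received an MS/MS identification. The mass shift, tolerance and feature lists below are placeholders, not values from the study.

```python
# Simplified pairing of light/heavy phosphopeptide features to estimate the
# MS/MS identification efficiency (identified / detected pairs). DELTA is a
# placeholder for the light/heavy mass difference per phosphate (it depends
# on how many labeled oxygens are transferred); the tolerance and feature
# lists are likewise hypothetical.

DELTA = 6.01   # Da per phosphate, placeholder value
TOL = 0.01     # Da matching tolerance, placeholder

light = [1250.52, 1384.60, 1501.71, 1672.80]   # hypothetical neutral masses
heavy = [1256.53, 1390.61, 1678.81]            # hypothetical neutral masses
identified = {1250.52, 1672.80}                # light masses with an MS/MS ID

paired = [m for m in light
          if any(abs((m + DELTA) - h) <= TOL for h in heavy)]
efficiency = sum(1 for m in paired if m in identified) / len(paired)

print(f"detected pairs: {len(paired)}, identification efficiency: {efficiency:.0%}")
```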

  12. Phosphopeptide enrichment by immobilized metal affinity chromatography

    DEFF Research Database (Denmark)

    Thingholm, Tine E.; Larsen, Martin R.

    2016-01-01

    Immobilized metal affinity chromatography (IMAC) has been the method of choice for phosphopeptide enrichment prior to mass spectrometric analysis for many years and it is still used extensively in many laboratories. Using the affinity of negatively charged phosphate groups towards positively charged metal ions…

  13. Identification of a major phosphopeptide in human tristetraprolin by phosphopeptide mapping and mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Heping Cao

    Tristetraprolin/zinc finger protein 36 (TTP/ZFP36) binds and destabilizes some pro-inflammatory cytokine mRNAs. TTP-deficient mice develop a profound inflammatory syndrome due to excessive production of pro-inflammatory cytokines. TTP expression is induced by various factors including insulin and extracts from cinnamon and green tea. TTP is highly phosphorylated in vivo and is a substrate for several protein kinases. Multiple phosphorylation sites are identified in human TTP, but it is difficult to assign major vs. minor phosphorylation sites. This study aimed to generate additional information on TTP phosphorylation using phosphopeptide mapping and mass spectrometry (MS). Wild-type and site-directed mutant TTP proteins were expressed in transfected human cells followed by in vivo radiolabeling with [32P]-orthophosphate. Histidine-tagged TTP proteins were purified with Ni-NTA affinity beads and digested with trypsin and lysyl endopeptidase. The digested peptides were separated on a C18 column by high-performance liquid chromatography. Wild-type and all mutant TTP proteins were localized in the cytosol, were phosphorylated extensively in vivo, and were capable of binding to ARE-containing RNA probes. Mutant TTP with S90 and S93 mutations resulted in the disappearance of a major phosphopeptide peak. Mutant TTP with an S197 mutation resulted in another major phosphopeptide peak being eluted earlier than the wild-type. Additional mutations at S186, S296 and T271 exhibited little effect on phosphopeptide profiles. MS analysis identified the peptide that was missing in the S90 and S93 mutant protein as LGPELSPSPTSPTATSTTPSR (corresponding to amino acid residues 83-103 of human TTP). MS also identified a major phosphopeptide associated with the first zinc-finger region. These analyses suggest that the tryptic peptide containing S90 and S93 is a major phosphopeptide in human TTP.
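
    For reference, the monoisotopic mass of the S90/S93-containing tryptic peptide with zero, one or two phosphate groups can be computed from standard residue masses (+79.96633 Da per phosphate); this is a generic calculation, independent of the study's data.

```python
# Monoisotopic mass of the S90/S93-containing tryptic peptide of human TTP
# with 0, 1 or 2 phospho groups, computed from standard residue masses
# (+79.96633 Da per HPO3). A generic calculation, independent of the study.

RESIDUE = {"L": 113.08406, "G": 57.02146, "P": 97.05276, "E": 129.04259,
           "S": 87.03203, "T": 101.04768, "A": 71.03711, "R": 156.10111}
WATER, PHOSPHO = 18.01056, 79.96633

peptide = "LGPELSPSPTSPTATSTTPSR"
base = sum(RESIDUE[aa] for aa in peptide) + WATER
for n in range(3):
    print(f"{n} phospho group(s): M = {base + n * PHOSPHO:.4f} Da")
```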

  14. Evaluation of phosphopeptide enrichment strategies for quantitative TMT analysis of complex network dynamics in cancer-associated cell signalling

    Directory of Open Access Journals (Sweden)

    Benedetta Lombardi

    2015-03-01

    Defining alterations in signalling pathways in normal and malignant cells is becoming a major field in proteomics. A number of different approaches have been established to isolate, identify and quantify phosphorylated proteins and peptides. In the current report, a comparison of SCX prefractionation versus an antibody-based approach, both coupled to TiO2 enrichment and applied to TMT-labelled cellular lysates, is described. The antibody strategy was more complete for enriching phosphopeptides and allowed the identification of a large set of proteins known to be phosphorylated (715 protein groups) with a minimum number of not previously known phosphorylated proteins (2).

  15. Identification of class I MHC-associated phosphopeptides as targets for cancer immunotherapy.

    Science.gov (United States)

    Zarling, Angela L; Polefrone, Joy M; Evans, Anne M; Mikesh, Leann M; Shabanowitz, Jeffrey; Lewis, Sarah T; Engelhard, Victor H; Hunt, Donald F

    2006-10-03

    Alterations in phosphorylation of cellular proteins are a hallmark of malignant transformation. Degradation of these phosphoproteins could generate cancer-specific class I MHC-associated phosphopeptides recognizable by CD8+ T lymphocytes. In a comparative analysis of phosphopeptides presented on the surface of melanoma, ovarian carcinoma, and B lymphoblastoid cells, we find 5 of 36 that are restricted to the solid tumors and common to both cancers. Differential presentation of these peptides can result from differential phosphorylation of the source proteins. Recognition of the peptides on cancer cells by phosphopeptide-specific CD8+ T lymphocytes validates the potential of these phosphopeptides as immunotherapeutic targets.

  16. Correction of errors in tandem mass spectrum extraction enhances phosphopeptide identification.

    Science.gov (United States)

    Hao, Piliang; Ren, Yan; Tam, James P; Sze, Siu Kwan

    2013-12-06

    The tandem mass spectrum extraction of phosphopeptides is more difficult and error-prone than that of unmodified peptides due to their lower abundance, lower ionization efficiency, cofragmentation with other high-abundance peptides, and the use of MS(3) on MS(2) fragments with neutral losses. However, there are still no established methods to evaluate its correctness. Here we propose to identify and correct these errors via the combinatorial use of multiple spectrum extraction tools. We evaluated five free and two commercial extraction tools using Mascot and phosphoproteomics raw data from an LTQ FT Ultra, in which RawXtract 1.9.9.2 identified the highest number of unique phosphopeptides (peptide expectation value …) exporting MS/MS fragments. We then corrected the errors by selecting the best extracted MGF file for each spectrum among the three tools for another database search. With the errors corrected, spectrum matches and unique peptide identifications increased by 22.4% and 12.2%, respectively, compared with the best single method. Correction of errors in spectrum extraction improves both the sensitivity and confidence of phosphopeptide identification. Data analysis on nonphosphopeptide spectra indicates that this strategy applies to unmodified peptides as well. The identification of errors in spectrum extraction will promote the improvement of spectrum extraction tools in the future.
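
    The "best extraction per spectrum" idea can be sketched as follows: parse the MGF output of several extraction tools, key spectra by title, and keep the version with the most fragment peaks. The selection criterion and file names below are placeholders; the authors' actual selection procedure is not specified in this record.

```python
# Merge several MGF extractions of the same raw file, keeping for each
# spectrum the version with the most fragment peaks. A simplified sketch:
# the selection criterion and file names are placeholders, not the
# authors' actual procedure.

def read_mgf(path):
    """Return {TITLE: list of block lines} for a simple MGF file."""
    spectra, block, title = {}, None, None
    with open(path) as fh:
        for line in fh:
            line = line.rstrip("\n")
            if line == "BEGIN IONS":
                block, title = [line], None
            elif block is None:
                continue                      # text outside any spectrum block
            elif line == "END IONS":
                block.append(line)
                if title is not None:
                    spectra[title] = block
                block = None
            else:
                if line.startswith("TITLE="):
                    title = line[6:]
                block.append(line)
    return spectra

def n_peaks(block):
    """Count fragment peak lines (lines starting with a digit)."""
    return sum(1 for l in block if l[:1].isdigit())

def merge(paths):
    best = {}
    for p in paths:
        for title, block in read_mgf(p).items():
            if title not in best or n_peaks(block) > n_peaks(best[title]):
                best[title] = block
    return best

if __name__ == "__main__":
    merged = merge(["toolA.mgf", "toolB.mgf", "toolC.mgf"])  # placeholder paths
    with open("merged_best.mgf", "w") as out:
        for block in merged.values():
            out.write("\n".join(block) + "\n\n")
```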

  17. Remineralization effect of casein phosphopeptide-amorphous calcium phosphate on enamel white spot lesions. A quantitative energy dispersive X ray elemental analysis: An in vitro study

    Directory of Open Access Journals (Sweden)

    Fabrizio Guerra

    2014-06-01

    Background: The objective of this study was to evaluate, by means of elemental analysis, the mineral density and the calcium and phosphorus weight percent of sound, demineralized and CPP-ACP-treated enamel. Elemental analysis determines the elemental and isotopic composition of a biological sample. It can be qualitative (determining what elements are present) and quantitative (determining how much of each is present). Elemental analysis was performed on randomly assigned samples using an INCA Energy 250 energy-dispersive X-ray spectroscopy system (Oxford Analytical Instruments Ltd., UK). Methods: 12 sound premolars were extracted for orthodontic reasons. Each tooth was sectioned with a double-faced diamond microtome under water cooling into three sections, for a total of 36 samples, and the samples were randomly assigned to three groups of 12 samples each: Group 1 (control), Group 2 (WS: white spot) and Group 3 (WST: white spot treated). Samples in Groups 2 and 3 underwent equal 24 h and 48 h acid bath exposures. All the treated samples (Group 3) were then coated with CPP-ACP for 5 min before immersion into water twice a day. Group 2 served as the control for enamel damage evaluation. Inca Point & ID, an analytical software platform for SEM, was used for elemental analysis of samples from Group 1 (C), Group 2 (WS) and Group 3 (WST) in order to determine the weight % and atomic % of Ca and P. Results: The analyses of samples from the three groups show different weight % and atomic % of Ca and P, and clearly reflect the different mineralization rates. Conclusions: 10% casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) complex promotes remineralization in vitro. The results of this in vitro study completely agree with this statement. Clinical studies to investigate the intraoral effectiveness of topical applications of CPP-ACP on white spot lesions are required to confirm these results.
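
    Weight percent and atomic percent, as reported by EDX, are interconvertible given atomic masses, and the Ca/P ratio is often the quantity of interest for enamel mineral. A minimal sketch with placeholder values, not the study's measurements:

```python
# Convert EDX weight percent to atomic percent and compute the Ca/P ratio.
# The weight-percent values below are placeholders, not the study's data.

ATOMIC_MASS = {"Ca": 40.078, "P": 30.974, "O": 15.999, "C": 12.011}

def atomic_percent(weight_percent):
    """weight_percent: {element: wt%} -> {element: atomic %}."""
    moles = {el: w / ATOMIC_MASS[el] for el, w in weight_percent.items()}
    total = sum(moles.values())
    return {el: 100.0 * m / total for el, m in moles.items()}

sample_wt = {"Ca": 35.0, "P": 17.0, "O": 41.0, "C": 7.0}   # hypothetical wt%
at = atomic_percent(sample_wt)
print({el: round(v, 1) for el, v in at.items()})
print("Ca/P (atomic):", round(at["Ca"] / at["P"], 2))
```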

  18. PhosphoHunter: An Efficient Software Tool for Phosphopeptide Identification

    Directory of Open Access Journals (Sweden)

    Alessandra Tiengo

    2015-01-01

    Phosphorylation is a protein posttranslational modification. It is responsible for the activation/inactivation of disease-related pathways, thanks to its role as a "molecular switch." The study of phosphorylated proteins is therefore a key point for proteomic analyses focused on the identification of diagnostic/therapeutic targets. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) is the most widely used analytical approach. Although unmodified peptides are automatically identified by consolidated algorithms, phosphopeptides still require automated tools to avoid time-consuming manual interpretation. To improve phosphopeptide identification efficiency, a novel procedure was developed and implemented in a Perl/C tool called PhosphoHunter, here proposed and evaluated. It includes a preliminary heuristic step for filtering out the MS/MS spectra produced by nonphosphorylated peptides before sequence identification. A method to assess the statistical significance of identified phosphopeptides was also formulated. PhosphoHunter performance was tested on a dataset of 1500 MS/MS spectra and it was compared with two other tools: Mascot and Inspect. Comparisons demonstrated that a strong point of PhosphoHunter is sensitivity, suggesting that it is able to identify real phosphopeptides with superior performance. Performance indexes depend on a single parameter (intensity threshold) that users can tune according to the study aim. All three tools localized >90% of phosphosites.
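
    A common heuristic for pre-filtering candidate phosphopeptide CID spectra is to look for a dominant neutral loss of H3PO4 (~97.98 Da) from the precursor. Whether this matches PhosphoHunter's internal filter is not stated in this record, so the sketch below is a generic illustration with placeholder thresholds.

```python
# Generic pre-filter for putative phosphopeptide CID spectra: flag spectra
# with a strong peak at precursor - 97.977/z. Thresholds are placeholders;
# this is not necessarily PhosphoHunter's actual algorithm.

H3PO4 = 97.9769  # monoisotopic mass of the neutral loss

def has_neutral_loss(precursor_mz, charge, peaks, tol=0.5, min_rel_intensity=0.5):
    """peaks: list of (m/z, intensity). True if a peak near
    precursor - H3PO4/charge reaches min_rel_intensity of the base peak."""
    if not peaks:
        return False
    target = precursor_mz - H3PO4 / charge
    base = max(i for _, i in peaks)
    return any(abs(mz - target) <= tol and i >= min_rel_intensity * base
               for mz, i in peaks)

# Hypothetical doubly charged spectrum with a strong neutral-loss peak
spectrum = [(300.1, 120.0), (651.3, 950.0), (700.3, 400.0)]
print(has_neutral_loss(700.29, 2, spectrum))   # True
```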

  19. The use of titanium dioxide micro-columns to selectively isolate phosphopeptides from proteolytic digests

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Larsen, Martin R

    2009-01-01

    Titanium dioxide has very high affinity for phosphopeptides and it has become an efficient alternative to already existing methods for phosphopeptide enrichment from complex samples. Peptide loading in a highly acidic environment in the presence of 2,5-dihydroxybenzoic acid (DHB), phthalic acid, or glycolic acid has been shown to improve selectivity significantly by reducing unspecific binding from nonphosphorylated peptides. The enriched phosphopeptides bound to the titanium dioxide are subsequently eluted from the micro-column using an alkaline buffer. Titanium dioxide chromatography is extremely tolerant towards most buffers used in biological experiments. It is highly robust and as such it has become one of the methods of choice in large-scale phosphoproteomics. Here we describe the protocol for phosphopeptide enrichment using titanium dioxide chromatography followed by desalting…

  20. Functionalized diamond nanopowder for phosphopeptides enrichment from complex biological fluids

    Energy Technology Data Exchange (ETDEWEB)

    Hussain, Dilshad [Division of Analytical Chemistry, Institute of Chemical Sciences, Bahauddin Zakariya University, Multan 60800 (Pakistan); Najam-ul-Haq, Muhammad, E-mail: najamulhaq@bzu.edu.pk [Division of Analytical Chemistry, Institute of Chemical Sciences, Bahauddin Zakariya University, Multan 60800 (Pakistan); Institute of Analytical Chemistry and Radiochemistry, Leopold-Franzens University, Innrain 80-82, A-6020 Innsbruck (Austria); Jabeen, Fahmida; Ashiq, Muhammad N.; Athar, Muhammad [Division of Analytical Chemistry, Institute of Chemical Sciences, Bahauddin Zakariya University, Multan 60800 (Pakistan); Rainer, Matthias; Huck, Christian W.; Bonn, Guenther K. [Institute of Analytical Chemistry and Radiochemistry, Leopold-Franzens University, Innrain 80-82, A-6020 Innsbruck (Austria)

    2013-05-02

    Graphical abstract: -- Highlights: • Derivatization of diamond nanopowder as IMAC and RP material. • Characterization with SEM, EDX and FT-IR. • Phosphopeptide enrichment from standard as well as real samples. • Desalting and human serum profiling with reproducible results. • MALDI-MS analysis with database identification. -- Abstract: Diamond is known for its high affinity and biocompatibility towards biomolecules and is used extensively in separation sciences and life science research. In the present study, diamond nanopowder is derivatized as Immobilized Metal Ion Affinity Chromatographic (IMAC) material for phosphopeptide enrichment and as Reversed Phase (C-18) media for the desalting of complex mixtures and human serum profiling through MALDI-TOF-MS. Functionalized diamond nanopowder is characterized by Fourier transform infrared (FT-IR) spectroscopy, scanning electron microscopy (SEM) and energy dispersive X-ray (EDX) spectroscopy. Diamond-IMAC is applied to a standard protein (β-casein), spiked human serum, egg yolk and non-fat milk for phosphopeptide enrichment. Results show the selectivity of the synthesized IMAC-diamond immobilized with Fe3+ and La3+ ions. To extend its use, diamond-IMAC is also applied to serum samples from gall bladder carcinoma in search of potential biomarkers. Database searching is carried out with the Mascot program (www.matrixscience.com) for the assignment of phosphorylation sites. Diamond nanopowder is thus a separation medium with multifunctional use and can be applied to cancer protein profiling for diagnosis and biomarker identification.

  1. Phosphopeptide Purification by IMAC with Fe(III) and Ga(III)

    DEFF Research Database (Denmark)

    Steen, Hanno; Stensballe, Allan

    2007-01-01

    INTRODUCTION: Immobilized metal ion affinity chromatography (IMAC) makes use of matrix-bound metals to affinity-purify phosphoproteins and phosphopeptides. Commonly used metals in early studies such as Ni(2+), Co(2+), Zn(2+), and Mn(2+) were shown to bind strongly to proteins with a high density of histidines. More recently, immobilized Fe(3+), Ga(3+), and Al(3+) metal ions have been used for the selective enrichment of phosphopeptides from complex proteolytic digest mixtures containing both phosphorylated and nonphosphorylated components. The use of a nitrilotriacetic acid (NTA) matrix over iminodiacetic-acid-modified matrices has been reported to provide an advantage in selectivity. The development of elution conditions that are directly compatible with MS analysis of the enriched phosphopeptide samples provides the option to interface IMAC and MS online. This protocol describes the enrichment…

  2. Phosphopeptide enrichment with cross-linked Os(II)(dmebpy)2Cl-derivatized acrylamide and vinylimidazole copolymer.

    Science.gov (United States)

    Zhou, Jie

    2018-01-15

    Reversible phosphorylation of proteins catalyzed by kinases and phosphatases plays a key regulatory role in intracellular biological processes. Protein phosphorylation profiling is still a challenge due to its low stoichiometry, the diversity of phosphorylated protein forms, and the dynamic nature of phosphorylation states. Mass spectrometry (MS) has been widely used for the characterization of protein phosphorylation, due to its high sensitivity and MS/MS sequencing capability. However, the low abundance and ionization efficiency of phosphorylated peptides and interference from their non-phosphorylated counterparts and other peptides in the enzymatic digests of proteins complicate the localization of phosphorylation sites in liquid chromatography (LC)/MS analysis. Therefore, the enrichment of phosphopeptides from the digests is often required before LC/MS. Immobilized metal affinity chromatography (IMAC) and metal oxide affinity chromatography (MOAC) are the two most commonly used enrichment techniques for phosphopeptides prior to MS analysis. A cross-linked Os(II)(4,4'-dimethyl-2,2'-bipyridine)2Cl-derivatized acrylamide and vinylimidazole copolymer was applied for the enrichment of phosphopeptides. Under neutral loading buffer conditions phosphopeptides bind to the Os-polymer without nonspecific binding of acidic peptides. Differential binding of monophosphorylated and multiply phosphorylated peptides can be achieved under different concentrations of imidazole. Sequential elution of bound phosphopeptides can be obtained with elution buffers of different pH values below 3. The loading buffers with imidazole can be aqueous or 7/3 H2O/ACN. Once phosphopeptides bind onto the Os-polymer, washing with water, 0.1% acetic acid (pH ~3) or 1/1 H2O/ACN with 0.05% acetic acid (pH ~3) does not elute phosphopeptides. The Os-polymer does not show bias in binding and elution toward phosphopeptide standards with singly, doubly and triply phosphorylated sites…

  3. Parallel reaction monitoring on a Q Exactive mass spectrometer increases reproducibility of phosphopeptide detection in bacterial phosphoproteomics measurements.

    Science.gov (United States)

    Taumer, Christoph; Griesbaum, Lena; Kovacevic, Alen; Soufi, Boumediene; Nalpas, Nicolas C; Macek, Boris

    2018-03-29

    An increasing number of studies report the relevance of protein Ser/Thr/Tyr phosphorylation in bacterial physiology, yet the analysis of this type of modification in bacteria still presents a considerable challenge. Unlike in eukaryotes, where tens of thousands of phosphorylation events likely occupy more than two thirds of the proteome, the abundance of protein phosphorylation is much lower in bacteria. Even state-of-the-art phosphopeptide enrichment protocols fail to remove the high background of abundant unmodified peptides, leading to low signal intensity and undersampling of phosphopeptide precursor ions in consecutive data-dependent MS runs. Consequently, large-scale bacterial phosphoproteomic datasets often suffer from poor reproducibility and a high number of missing values. Here we explore the application of parallel reaction monitoring (PRM) on a Q Exactive mass spectrometer in bacterial phosphoproteome analysis, focusing especially on run-to-run sampling reproducibility. In multiple measurements of identical phosphopeptide-enriched samples, we show that PRM outperforms data-dependent acquisition (DDA) in terms of detection frequency, reaching almost complete sampling efficiency, compared to 20% in DDA. We observe a similar trend over multiple heterogeneous phosphopeptide-enriched samples and conclude that PRM shows great promise in bacterial phosphoproteomics analyses where reproducible detection and quantification of a relatively small set of phosphopeptides is desired. Bacterial phosphorylated peptides occur in low abundance compared to their unmodified counterparts, and are therefore rarely reproducibly detected in shotgun (DDA) proteomics measurements. Here we show that parallel reaction monitoring complements DDA analyses and makes detection of known, targeted phosphopeptides more reproducible. This will be of significance in replicated MS measurements that aim to reproducibly detect and quantify phosphopeptides of interest.
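
    The detection-frequency comparison can be expressed in a few lines: for each targeted phosphopeptide, count the fraction of replicate runs in which it was detected. The run contents below are invented for illustration.

```python
# Per-peptide detection frequency across replicate runs, the metric used to
# compare PRM and DDA sampling reproducibility. The run contents are invented.

def detection_frequency(runs):
    """runs: list of sets of detected peptide IDs -> {peptide: fraction of runs}."""
    all_peptides = set().union(*runs)
    return {p: sum(p in run for run in runs) / len(runs)
            for p in sorted(all_peptides)}

dda_runs = [{"pepA", "pepB"}, {"pepA"}, {"pepA", "pepC"}]   # undersampled
prm_runs = [{"pepA", "pepB", "pepC"}] * 3                   # targeted, complete

print("DDA:", detection_frequency(dda_runs))
print("PRM:", detection_frequency(prm_runs))
```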

  4. Proteolytic Digestion and TiO2 Phosphopeptide Enrichment Microreactor for Fast MS Identification of Proteins

    Science.gov (United States)

    Deng, Jingren; Lazar, Iulia M.

    2016-04-01

    The characterization of phosphorylation state(s) of a protein is best accomplished by using isolated or enriched phosphoprotein samples or their corresponding phosphopeptides. The process is typically time-consuming as, often, a combination of analytical approaches must be used. To facilitate throughput in the study of phosphoproteins, a microreactor that enables a novel strategy for performing fast proteolytic digestion and selective phosphopeptide enrichment was developed. The microreactor was fabricated using 100 μm i.d. fused-silica capillaries packed with 1-2 mm beds of C18 and/or TiO2 particles. Proteolytic digestion-only, phosphopeptide enrichment-only, and sequential proteolytic digestion/phosphopeptide enrichment microreactors were developed and tested with standard protein mixtures. The protein samples were adsorbed on the C18 particles, quickly digested with a proteolytic enzyme infused over the adsorbed proteins, and further eluted onto the TiO2 microreactor for enrichment in phosphopeptides. A number of parameters were optimized to speed up the digestion and enrichment processes, including microreactor dimensions, sample concentrations, digestion time, flow rates, buffer compositions, and pH. The effective time for the steps of proteolytic digestion and enrichment was less than 5 min. For simple samples, such as standard protein mixtures, this approach provided equivalent or better results than conventional bench-top methods, in terms of both enzymatic digestion and selectivity. Analysis times and reagent costs were reduced ~10- to 15-fold. Preliminary analysis of cell extracts and recombinant proteins indicated the feasibility of integration of these microreactors in more advanced workflows amenable for handling real-world biological samples.

  5. Development of an enrichment method for endogenous phosphopeptide characterization in human serum.

    Science.gov (United States)

    La Barbera, Giorgia; Capriotti, Anna Laura; Cavaliere, Chiara; Ferraris, Francesca; Laus, Michele; Piovesana, Susy; Sparnacci, Katia; Laganà, Aldo

    2018-01-01

    This work describes the development of an enrichment method for the analysis of endogenous phosphopeptides in serum. Endogenous peptides can play significant biological roles, and some of them could be exploited as future biomarkers. In this context, blood is one of the most useful biofluids for screening, but a systematic investigation of the endogenous peptides, especially phosphorylated ones, is still lacking, mainly due to the lack of suitable analytical methods. Thus, in this paper, different phosphopeptide enrichment strategies were pursued, based either on metal oxide affinity chromatography (MOAC, in the form of commercial TiO2 spin columns or a magnetic graphitized carbon black-TiO2 composite), or on immobilized metal ion affinity chromatography (IMAC, in the form of Ti4+-IMAC magnetic material or commercial Fe3+-IMAC spin columns). While MOAC strategies proved completely unsuccessful, probably due to interfering phospholipids displacing phosphopeptides, the IMAC materials performed very well. Different sample preparation strategies were tested, comprising direct dilution with the loading buffer, organic solvent precipitation, and lipid removal from the matrix, as well as the addition of phosphatase inhibitors during sample handling for maximized endogenous phosphopeptide enrichment. All data were acquired by a shotgun peptidomics approach, in which peptide samples were separated by reversed-phase nanoHPLC hyphenated with high-resolution tandem mass spectrometry. The devised method allowed the identification of 176 endogenous phosphopeptides in fresh serum with added inhibitors by the direct dilution protocol and the Ti4+-IMAC magnetic material enrichment, but good results could also be obtained from the commercial Fe3+-IMAC spin column adapted to the batch enrichment protocol.

  6. Enrichment and characterization of phosphopeptides by immobilized metal affinity chromatography (IMAC) and mass spectrometry

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N

    2009-01-01

    The combination of immobilized metal affinity chromatography (IMAC) and mass spectrometry is a widely used technique for enrichment and sequencing of phosphopeptides. In the IMAC method, negatively charged phosphate groups interact with positively charged metal ions (Fe3+, Ga3+, and Al3+) and this interaction makes it possible to enrich phosphorylated peptides from rather complex peptide samples. Phosphopeptide enrichment by IMAC is sensitive and specific for peptide mixtures derived from pure proteins or simple protein mixtures. The selectivity of the IMAC method is, however, limited when working with peptide mixtures derived from highly complex samples, e.g., whole-cell extracts, where sample prefractionation is advisable. Furthermore, lowering the pH value of the sample loading buffer reduces nonspecific binding to the IMAC resin significantly, thereby improving the selectivity of IMAC…

  7. Phosphopeptide enrichment with inorganic nanofibers prepared by forcespinning technology

    Czech Academy of Sciences Publication Activity Database

    Křenková, Jana; Morávková, J.; Buk, J.; Foret, František

    2016-01-01

    Roč. 1427, JAN (2016), s. 8-15 ISSN 0021-9673 R&D Projects: GA ČR(CZ) GA14-06319S; GA ČR(CZ) GBP206/12/G014 Institutional support: RVO:68081715 Keywords : nanofibers * enrichment * phosphopeptides Subject RIV: CB - Analytical Chemistry , Separation Impact factor: 3.981, year: 2016

  8. Optimized IMAC-IMAC protocol for phosphopeptide recovery from complex biological samples

    DEFF Research Database (Denmark)

    Ye, Juanying; Zhang, Xumin; Young, Clifford

    2010-01-01

    …using Fe(III)-NTA IMAC resin, and it proved to be highly selective in the phosphopeptide enrichment of a highly diluted standard sample (1:1000) prior to MALDI MS analysis. We also observed that a higher iron purity led to an increased IMAC enrichment efficiency. The optimized method was then adapted to phosphoproteome analyses of cell lysates of high protein complexity. From either 20 microg of mouse sample or 50 microg of Drosophila melanogaster sample, more than 1000 phosphorylation sites were identified in each study using IMAC-IMAC and LC-MS/MS. We demonstrate efficient separation of multiply phosphorylated… characterization of phosphoproteins in functional phosphoproteomics research projects.

  9. 3-Aminoquinoline/p-coumaric acid as a MALDI matrix for glycopeptides, carbohydrates, and phosphopeptides.

    Science.gov (United States)

    Fukuyama, Yuko; Funakoshi, Natsumi; Takeyama, Kohei; Hioki, Yusaku; Nishikaze, Takashi; Kaneshiro, Kaoru; Kawabata, Shin-Ichirou; Iwamoto, Shinichi; Tanaka, Koichi

    2014-02-18

    Glycosylation and phosphorylation are important post-translational modifications in biological processes and biomarker research. The difficulty in analyzing these modifications is mainly their low abundance and dissociation of labile regions such as sialic acids or phosphate groups. One solution in matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is to improve matrices for glycopeptides, carbohydrates, and phosphopeptides by increasing the sensitivity and suppressing dissociation of the labile regions. Recently, a liquid matrix 3-aminoquinoline (3-AQ)/α-cyano-4-hydroxycinnamic acid (CHCA) (3-AQ/CHCA), introduced by Kolli et al. in 1996, has been reported to increase sensitivity for carbohydrates or phosphopeptides, but it has not been systematically evaluated for glycopeptides. In addition, 3-AQ/CHCA enhances the dissociation of labile regions. In contrast, a liquid matrix 1,1,3,3-tetramethylguanidinium (TMG, G) salt of p-coumaric acid (CA) (G3CA) was reported to suppress dissociation of sulfate groups or sialic acids of carbohydrates. Here we introduce a liquid matrix 3-AQ/CA for glycopeptides, carbohydrates, and phosphopeptides. All of the analytes were detected as [M + H]+ or [M - H]- with higher or comparable sensitivity using 3-AQ/CA compared with 3-AQ/CHCA or 2,5-dihydroxybenzoic acid (2,5-DHB). The sensitivity was increased 1- to 1000-fold using 3-AQ/CA. The dissociation of labile regions such as sialic acids or phosphate groups and the fragmentation of neutral carbohydrates were suppressed more using 3-AQ/CA than using 3-AQ/CHCA or 2,5-DHB. 3-AQ/CA was thus determined to be an effective MALDI matrix for high sensitivity and the suppression of dissociation of labile regions in glycosylation and phosphorylation analyses.

  10. A novel tantalum-based sol-gel packed microextraction syringe for highly specific enrichment of phosphopeptides in MALDI-MS applications.

    Science.gov (United States)

    Çelikbıçak, Ömür; Atakay, Mehmet; Güler, Ülkü; Salih, Bekir

    2013-08-07

    A new tantalum-based sol-gel material was synthesized using a unique sol-gel synthesis pathway by PEG incorporation into the sol-gel structure without performing a calcination step. This improved its chemical and physical properties for the high capacity and selective enrichment of phosphopeptides from protein digests in complex biological media. The specificity of the tantalum-based sol-gel material for phosphopeptides was evaluated and compared with tantalum(V) oxide (Ta2O5) in different phosphopeptide enrichment applications. The tantalum-based sol-gel and tantalum(V) oxide were characterized in detail using FT-IR spectroscopy, X-ray diffraction (XRD) and scanning electron microscopy (SEM), and also using a surface area and pore size analyzer. In the characterization studies, the surface morphology, pore volume, crystallinity of the materials and PEG incorporation into the sol-gel structure to produce a more hydrophilic material were successfully demonstrated. The X-ray diffractograms of the two different materials were compared and it was noted that the broad signals of the tantalum-based sol-gel clearly represented the amorphous structure of the sol-gel material, which was more likely to create enough surface area and to provide more accessible tantalum atoms for phosphopeptides to be easily adsorbed when compared with the neat and more crystalline structure of Ta2O5. Therefore, the phosphopeptide enrichment performance of the tantalum-based sol-gels was found to be remarkably higher than that of the more crystalline Ta2O5 in our studies. Phosphopeptides at femtomole levels could be selectively enriched using the tantalum-based sol-gel and detected with a higher signal-to-noise ratio by matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS). Moreover, phosphopeptides in a tryptic digest of non-fat bovine milk as a complex real-world biological sample were retained with higher yield using a tantalum-based sol-gel. Additionally, the sol-gel material…

  11. Nanoparticle-modified monolithic pipette tips for phosphopeptide enrichment

    Czech Academy of Sciences Publication Activity Database

    Křenková, Jana; Foret, František

    2013-01-01

    Roč. 405, č. 7 (2013), s. 2175-2183 ISSN 1618-2642 R&D Projects: GA ČR(CZ) GAP301/11/2055 Grant - others:Jihomoravský kraj(CZ) 2SGA2721 Program:2SGA Institutional support: RVO:68081715 Keywords : nanoparticles * monolith * phosphopeptide Enrichment Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.578, year: 2013

  12. Global proteomic profiling of phosphopeptides using electron transfer dissociation tandem mass spectrometry

    DEFF Research Database (Denmark)

    Molina, Henrik; Horn, David M; Tang, Ning

    2007-01-01

    Electron transfer dissociation (ETD) is a recently introduced mass spectrometric technique that provides a more comprehensive coverage of peptide sequences and posttranslational modifications. Here, we evaluated the use of ETD for a global phosphoproteome analysis. In all, we identified a total of 1,435 phosphorylation sites from human embryonic kidney 293T cells, of which 1,141 (approximately 80%) were not previously described. A detailed comparison of ETD and collision-induced dissociation (CID) modes showed that ETD identified 60% more phosphopeptides than CID, with an average of 40% more fragment ions that facilitated localization of phosphorylation sites. Although our data indicate that ETD is superior to CID for phosphorylation analysis, the two methods can be effectively combined in alternating ETD and CID modes for a more comprehensive analysis. Combining ETD and CID, from this single…

  13. Comprehensive and Reproducible Phosphopeptide Enrichment Using Iron Immobilized Metal Ion Affinity Chromatography (Fe-IMAC) Columns

    NARCIS (Netherlands)

    Ruprecht, Benjamin; Koch, Heiner; Medard, Guillaume; Mundt, Max; Kuster, Bernhard; Lemeer, Simone

    Advances in phosphopeptide enrichment methods enable the identification of thousands of phosphopeptides from complex samples. Current offline enrichment approaches using TiO2, Ti, and Fe immobilized metal ion affinity chromatography (IMAC) material in batch or microtip format are widely used, but…

  14. On-bead chemical synthesis and display of phosphopeptides for affinity pull-down proteomics

    DEFF Research Database (Denmark)

    Malene, Brandt; Madsen, Jens C.; Bunkenborg, Jakob

    2006-01-01

    We describe a new method for phosphopeptide proteomics based on the solid-phase synthesis of phosphopeptides on beads suitable for affinity pull-down experiments. Peptide sequences containing the Bad Ser112 and Ser136 phosphorylation motifs were used as bait in affinity pull-down experiments… (aldehyde) at the C terminus for potential activity-based proteomics. The synthetic support-bound Bad phosphopeptides were able to pull down 14-3-3zeta. Furthermore, Bad phosphopeptides bound endogenous 14-3-3 proteins, and all seven members of the 14-3-3 family were identified by mass spectrometry. In control experiments, none of the unphosphorylated Bad peptides bound transfected 14-3-3zeta or endogenous 14-3-3. We conclude that the combined synthesis and display of phosphopeptides on-bead is a fast and efficient method for affinity pull-down proteomics.

  15. Molar incisor hypomineralization treatment with casein phosphopeptide and amorphous calcium phosphate in children.

    Science.gov (United States)

    Pasini, Marco; Giuca, Maria R; Scatena, Martina; Gatto, Roberto; Caruso, Silvia

    2018-02-01

    The purpose of this study was to evaluate the sensitivity of teeth with MIH in children before and after the use of a tooth mousse containing casein phosphopeptide and amorphous calcium phosphate (CPP-ACP). Forty patients, both males and females, aged from 8 to 13 years, who had a molar with MIH hypersensitivity, were included in this study. In the test group (20 subjects), a tooth mousse with CPP-ACP was used, while fluoride toothpaste was used in the control group. Dental sensitivity to mechanical and thermal stimuli was evaluated before (T0) and 120 days after the beginning of the treatment (T1). In the test group, the thermal sensitivity decreased significantly (P < 0.05) … no significant difference (P > 0.05) was observed by comparing males with females. The use of the remineralizing agent containing CPP-ACP resulted in a significant improvement in dental sensitivity in patients with MIH.

  16. Treatment of post-orthodontic white spot lesions with casein phosphopeptide-stabilised amorphous calcium phosphate.

    Science.gov (United States)

    Bröchner, Ann; Christensen, Carsten; Kristensen, Bjarne; Tranæus, Sofia; Karlsson, Lena; Sonnesen, Liselotte; Twetman, Svante

    2011-06-01

    This study aims to investigate the effect of topical applications of 10% casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) on white spot lesions (WSL) detected after treatment with fixed orthodontic appliances. Sixty healthy adolescents with ≥1 clinically visible WSL at debonding were recruited and randomly allocated to a randomised controlled trial with two parallel groups. The intervention group was instructed to topically apply a CPP-ACP -containing agent (Tooth Mousse, GC Europe) once daily and the subjects of the control group brushed their teeth with standard fluoride toothpaste. The intervention period was 4 weeks and the endpoints were quantitative light-induced fluorescence (QLF) on buccal surfaces of the upper incisors, cuspids and first premolars and visual scoring from digital photos. The attrition rate was 15%, mostly due to technical errors, and 327 lesions were included in the final evaluation. A statistically significant (p < 0.05) regression of the WSL was disclosed in both study groups compared to baseline, but there was no difference between the groups. The mean area of the lesions decreased by 58% in the CPP-ACP group and 26% in the fluoride group (p = 0.06). The QLF findings were largely reflected by the clinical scores. No side effects were reported. Topical treatment of white spot lesions after debonding of orthodontic appliances with a casein phosphopeptide-stabilised amorphous calcium phosphate agent resulted in significantly reduced fluorescence and a reduced area of the lesions after 4 weeks as assessed by QLF. The improvement was however not superior to the "natural" regression following daily use of fluoride toothpaste.

  17. Magnetic graphitic carbon nitride anion exchanger for specific enrichment of phosphopeptides.

    Science.gov (United States)

    Zhu, Gang-Tian; He, Xiao-Mei; Chen, Xi; Hussain, Dilshad; Ding, Jun; Feng, Yu-Qi

    2016-03-11

    Anion-exchange chromatography (AEX) is one of the chromatography-based methods effectively being used for phosphopeptide enrichment. However, the development of AEX materials with high specificity toward phosphopeptides is still less explored as compared to immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). In this work, magnetic graphitic carbon nitride (MCN) was successfully prepared and introduced as a promising AEX candidate for phosphopeptide enrichment. Due to the extremely abundant content of nitrogen with basic functionality on the surface, this material kept excellent retention for phosphopeptides at pH as low as 1.8. Benefiting from the large binding capacity at such low pH, MCN showed remarkable specificity to capture phosphopeptides from tryptic digests of standard protein mixtures as well as nonfat milk and human serum. In addition, MCN was also applied to selective enrichment of phosphopeptides from the tryptic digests of rat brain lysate and 2576 unique phosphopeptides were successfully identified. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Phosphopeptide derivatization signatures to identify serine and threonine phosphorylated peptides by mass spectrometry.

    Science.gov (United States)

    Molloy, M P; Andrews, P C

    2001-11-15

    The development of rapid, global methods for monitoring states of protein phosphorylation would provide greater insight for understanding many fundamental biological processes. Current best practices use mass spectrometry (MS) to profile digests of purified proteins for evidence of phosphorylation. However, this approach is beset by inherent difficulties in both identifying phosphopeptides from within a complex mixture containing many other unmodified peptides and ionizing phosphopeptides in positive-ion MS. We have modified an approach that uses barium hydroxide to rapidly eliminate the phosphoryl group of serine and threonine modified amino acids, creating dehydroamino acids that are susceptible to nucleophilic derivatization. By derivatizing a protein digest with a mixture of two different alkanethiols, phosphopeptide-specific derivatives were readily distinguished by MS due to their characteristic ion-pair signature. The resulting tagged ion pairs accommodate simple and rapid screening for phosphopeptides in a protein digest, obviating the use of isotopically labeled samples for qualitative phosphopeptide detection. MALDI-MS is used in a first pass manner to detect derivatized phosphopeptides, while the remaining sample is available for tandem MS to reveal the site of derivatization and, thus, phosphorylation. We demonstrated the technique by identifying phosphopeptides from beta-casein and ovalbumin. The approach was further used to examine in vitro phosphorylation of recombinant human HSP22 by protein kinase C, revealing phosphorylation of Thr-63.
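
    The ion-pair readout described above lends itself to a simple peak-list screen: after tagging with two different alkanethiols, each former phosphopeptide should appear as a doublet separated by the fixed mass difference between the two tags. The minimal Python sketch below (illustrative only, not the authors' software) scans a centroided peak list for such doublets; the 28.0313 Da delta assumes an ethanethiol/1-butanethiol pair, which is an assumption rather than the reagents used in the study.

        # Illustrative screen for phosphopeptide ion-pair signatures after dual-thiol
        # derivatization. TAG_DELTA assumes an ethanethiol / 1-butanethiol pair
        # (2 x CH2 = 28.0313 Da); substitute the delta of the thiols actually used.
        TAG_DELTA = 28.0313   # Da, hypothetical mass difference between the two tags
        TOLERANCE = 0.02      # Da, matching tolerance for centroided MALDI peaks

        def find_ion_pairs(peaks, delta=TAG_DELTA, tol=TOLERANCE):
            """Return (mz_light, mz_heavy) pairs separated by the tag mass difference."""
            peaks = sorted(peaks)
            pairs = []
            for i, (mz_low, _) in enumerate(peaks):
                target = mz_low + delta
                for mz_high, _ in peaks[i + 1:]:
                    if mz_high > target + tol:
                        break
                    if abs(mz_high - target) <= tol:
                        pairs.append((mz_low, mz_high))
            return pairs

        if __name__ == "__main__":
            # Toy peak list: one doublet (candidate phosphopeptide) plus an unrelated peak.
            spectrum = [(1031.42, 5200.0), (1059.45, 4800.0), (1460.80, 9100.0)]
            print(find_ion_pairs(spectrum))   # -> [(1031.42, 1059.45)]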

  19. The antigenic identity of human class I MHC phosphopeptides is critically dependent upon phosphorylation status.

    Science.gov (United States)

    Mohammed, Fiyaz; Stones, Daniel H; Zarling, Angela L; Willcox, Carrie R; Shabanowitz, Jeffrey; Cummings, Kara L; Hunt, Donald F; Cobbold, Mark; Engelhard, Victor H; Willcox, Benjamin E

    2017-08-15

    Dysregulated post-translational modification provides a source of altered self-antigens that can stimulate immune responses in autoimmunity, inflammation, and cancer. In recent years, phosphorylated peptides have emerged as a group of tumour-associated antigens presented by MHC molecules and recognised by T cells, and represent promising candidates for cancer immunotherapy. However, the impact of phosphorylation on the antigenic identity of phosphopeptide epitopes is unclear. Here we examined this by determining structures of MHC-bound phosphopeptides bearing canonical position 4-phosphorylations in the presence and absence of their phosphate moiety, and examining phosphopeptide recognition by the T cell receptor (TCR). Strikingly, two peptides exhibited major conformational changes upon phosphorylation, involving a similar molecular mechanism, which focussed changes on the central peptide region most critical for T cell recognition. In contrast, a third epitope displayed little conformational alteration upon phosphorylation. In addition, binding studies demonstrated TCR interaction with an MHC-bound phosphopeptide was both epitope-specific and absolutely dependent upon phosphorylation status. These results highlight the critical influence of phosphorylation on the antigenic identity of naturally processed class I MHC epitopes. In doing so they provide a molecular framework for understanding phosphopeptide-specific immune responses, and have implications for the development of phosphopeptide antigen-specific cancer immunotherapy approaches.

  20. Three-dimensional ordered titanium dioxide-zirconium dioxide film-based microfluidic device for efficient on-chip phosphopeptide enrichment.

    Science.gov (United States)

    Zhao, De; He, Zhongyuan; Wang, Gang; Wang, Hongzhi; Zhang, Qinghong; Li, Yaogang

    2016-09-15

    Microfluidic technology plays a significant role in separating biomolecules, because of its miniaturization, integration, and automation. Introducing micro/nanostructured functional materials can improve the properties of microfluidic devices, and extend their application. Inverse opal has a three-dimensional ordered net-like structure. It possesses a large surface area and exhibits good mass transport, making it a good candidate for bio-separation. This study exploits inverse opal titanium dioxide-zirconium dioxide films for on-chip phosphopeptide enrichment. Titanium dioxide-zirconium dioxide inverse opal film-based microfluidic devices were constructed from templates of 270-, 340-, and 370-nm-diameter poly(methylmethacrylate) spheres. The phosphopeptide enrichments of these devices were determined by matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry. The device constructed from the 270-nm-diameter sphere template exhibited good comprehensive phosphopeptide enrichment, and was the best among these three devices. Because the size of opal template used in construction was the smallest, the inverse opal film therefore had the smallest pore sizes and the largest surface area. Enrichment by this device was also better than those of similar devices based on nanoparticle films and single component films. The titanium dioxide-zirconium dioxide inverse opal film-based device provides a promising approach for the efficient separation of various biomolecules. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Improving Loop Dependence Analysis

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven

    2017-01-01

    Programmers can no longer depend on new processors to have significantly improved single-thread performance. Instead, gains have to come from other sources such as the compiler and its optimization passes. Advanced passes make use of information on the dependencies related to loops. We improve th...

  2. Phosphotyrosine-based-phosphoproteomics scaled-down to biopsy level for analysis of individual tumor biology and treatment selection.

    Science.gov (United States)

    Labots, Mariette; van der Mijn, Johannes C; Beekhof, Robin; Piersma, Sander R; de Goeij-de Haas, Richard R; Pham, Thang V; Knol, Jaco C; Dekker, Henk; van Grieken, Nicole C T; Verheul, Henk M W; Jiménez, Connie R

    2017-06-06

    Mass spectrometry-based phosphoproteomics of cancer cell and tissue lysates provides insight in aberrantly activated signaling pathways and potential drug targets. For improved understanding of individual patient's tumor biology and to allow selection of tyrosine kinase inhibitors in individual patients, phosphoproteomics of small clinical samples should be feasible and reproducible. We aimed to scale down a pTyr-phosphopeptide enrichment protocol to biopsy-level protein input and assess reproducibility and applicability to tumor needle biopsies. To this end, phosphopeptide immunoprecipitation using anti-phosphotyrosine beads was performed using 10, 5 and 1mg protein input from lysates of colorectal cancer (CRC) cell line HCT116. Multiple needle biopsies from 7 human CRC resection specimens were analyzed at the 1mg-level. The total number of phosphopeptides captured and detected by LC-MS/MS ranged from 681 at 10mg input to 471 at 1mg HCT116 protein. ID-reproducibility ranged from 60.5% at 10mg to 43.9% at 1mg. Per 1mg-level biopsy sample, >200 phosphopeptides were identified with 57% ID-reproducibility between paired tumor biopsies. Unsupervised analysis clustered biopsies from individual patients together and revealed known and potential therapeutic targets. This study demonstrates the feasibility of label-free pTyr-phosphoproteomics at the tumor biopsy level based on reproducible analyses using 1mg of protein input. The considerable number of identified phosphopeptides at this level is attributed to an effective down-scaled immuno-affinity protocol as well as to the application of ID propagation in the data processing and analysis steps. Unsupervised cluster analysis reveals patient-specific profiles. Together, these findings pave the way for clinical trials in which pTyr-phosphoproteomics will be performed on pre- and on-treatment biopsies. Such studies will improve our understanding of individual tumor biology and may enable future p
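
    The ID-reproducibility figures quoted above are overlap statistics between replicate identification lists. A minimal sketch of one such calculation is shown below (Python; the phosphopeptide identifiers are invented placeholders, and the exact definition used in the study may differ, e.g. overlap relative to the union rather than the smaller list).

        # Minimal identification-reproducibility calculation between two replicate
        # phosphopeptide identification lists: shared identifications divided by the
        # size of the smaller list. Peptide strings below are invented placeholders.
        def id_reproducibility(ids_a, ids_b):
            a, b = set(ids_a), set(ids_b)
            if not a or not b:
                return 0.0
            return len(a & b) / min(len(a), len(b))

        run1 = {"LIEDNEpYTAR", "DYpTPTPGK", "IGEGpTYGVVYK"}
        run2 = {"LIEDNEpYTAR", "IGEGpTYGVVYK", "VADPDHDHpTGFLTEYVATR"}
        print(f"ID reproducibility: {id_reproducibility(run1, run2):.1%}")  # 66.7%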

  3. Manipulating the fragmentation patterns of phosphopeptides via gas-phase boron derivatization: determining phosphorylation sites in peptides with multiple serines.

    Science.gov (United States)

    Gronert, Scott; Li, Kathy H; Horiuchi, Mizue

    2005-12-01

    Trivalent boron species readily react with protonated phosphopeptides to give addition products with the loss of boron ligands. In the present study, trimethoxyborane (TMB), diisopropoxymethylborane (DIPM), and diethylmethoxyborane (DEMB) were allowed to react with four phosphopeptides, VsSF, LSsF, LsGASA, and VSGAsA (lower-case s indicates phosphoserine). Each of the phosphopeptides contains one serine that is phosphorylated and one that is not. Under collision-activated dissociation (CAD) conditions, the boron-derivatized peptides give fragmentation patterns that differ significantly from that of the protonated phosphopeptide. The patterns vary, depending on the number of labile (i.e., alkoxy) ligands on the boron. In general, boron derivatization increases the yield of phosphate-containing sequence ions, but dramatic effects are only seen with certain reagent/peptide combinations. However, the suite of reagents provides a means of altering and increasing the information content of phosphopeptide CAD spectra.

  4. Treatment of post-orthodontic white spot lesions with casein phosphopeptide-stabilised amorphous calcium phosphate

    DEFF Research Database (Denmark)

    Bröchner, Ann; Christensen, Carsten; Kristensen, Bjarne

    2010-01-01

    This study aims to investigate the effect of topical applications of 10% casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) on white spot lesions (WSL) detected after treatment with fixed orthodontic appliances. Sixty healthy adolescents with >/=1 clinically visible WSL at debonding were...... findings were largely reflected by the clinical scores. No side effects were reported. Topical treatment of white spot lesions after debonding of orthodontic appliances with a casein phosphopeptide-stabilised amorphous calcium phosphate agent resulted in significantly reduced fluorescence and a reduced...

  5. Comparison the effect of casein phosphopeptide amorphous calcium phosphate and fluoride varnish on dentin hypersensitivity reduction

    Directory of Open Access Journals (Sweden)

    Leila Pishevar

    2015-09-01

    Full Text Available Introduction: Nowadays, several casein components such as casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) are widely considered a suitable replacement for fluoride compounds. The aim of this study was to compare the effect of CPP-ACP paste and fluoride varnish on dentin hypersensitivity reduction. Materials & Methods: In this clinical trial study, thirty adult patients between the ages of 20 and 50 years, presenting with the chief complaint of dentin hypersensitivity, were examined. The loss of dentin was less than 0.5 mm. The subjects were divided into three groups: in groups I and II, patients were treated using CPP-ACP paste and fluoride varnish, respectively, following the manufacturers' instructions; group III received a placebo gel. A visual analog scale was used to assess subjects' response to compressed air and ice stimuli at baseline and at 7, 28 and 60 days after treatment. Data were analyzed by one-way analysis of variance (ANOVA) and Duncan's post hoc test using SPSS software version 21. Results: The results showed a statistically significant difference between the groups (P < 0.05). In the fluoride varnish and CPP-ACP paste groups, dentin hypersensitivity decreased significantly when baseline scores were compared with post-treatment scores at 7, 28 and 60 days (P < 0.05). There was no statistically significant difference in dentin hypersensitivity reduction between the fluoride varnish and CPP-ACP paste groups. Conclusion: The results of this study showed that both fluoride varnish and CPP-ACP paste effectively reduced dentinal hypersensitivity compared with the placebo control group.

  6. Comparative evaluation of Nano-hydroxyapatite and casein Phosphopeptide-amorphous calcium phosphate on the remineralization potential of early enamel lesions: An in vitro study

    Directory of Open Access Journals (Sweden)

    Anshul Sharma

    2017-01-01

    Full Text Available Background: Remineralizing agents in a wide variety of formulations have proved beneficial in caries management. The casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) nanocomplex has been recommended and used as a remineralizing agent. Nano-hydroxyapatite (n-HAp) is one of the most biocompatible and bioactive materials, with a wide range of applications in dentistry, but does it perform better than CPP-ACP? Aims: To evaluate and compare the remineralizing efficiency of pastes containing nano-hydroxyapatite and casein phosphopeptide-amorphous calcium phosphate. Settings and Design: The study was a single-blinded in vitro study with a lottery method of randomization, approved by the Institutional Ethics Committee. Materials and Methods: Thirty non-carious premolar teeth were demineralized, divided into two groups and subjected to remineralization. The samples were analysed for surface hardness and mineral content. Statistical Analysis: Student's t-test and repeated-measures ANOVA were applied. Results: Average hardness in the nano-hydroxyapatite group increased to 340 ± 31.70 SD and 426 ± 50.62 SD at 15 and 30 days, respectively, and that of the CPP-ACP group to 355.83 ± 38.55 SD and 372.67 ± 53.63 SD. The change in the hardness values was not statistically significant, with a P value of 0.39 (P > 0.05). Calcium and phosphorus levels increased in both groups, but the increase was not significant. Conclusion: Both agents are effective in causing remineralization of enamel. Nano-hydroxyapatite is more effective than casein phosphopeptide-amorphous calcium phosphate in increasing the calcium and phosphorus content of enamel, and this effect is more evident over a longer treatment period. Key Message: Remineralizing agents are a boon for caries management. With the advent of many formulations it is difficult to clinically select the agent. This study compares the remineralizing potential of Casein

  7. Enzymatic Dissolution of Biocomposite Solids Consisting of Phosphopeptides to Form Supramolecular Hydrogels

    KAUST Repository

    Shi, Junfeng

    2015-10-14

    Enzyme-catalyzed dephosphorylation is essential for biomineralization and bone metabolism. Here we report the exploration of using enzymatic reaction to transform biocomposites of phosphopeptides and calcium (or strontium) ions to supramolecular hydrogels as a mimic of enzymatic dissolution of biominerals. 31P NMR shows that strong affinity between the phosphopeptides and alkaline earth metal ions (e.g., Ca2+ or Sr2+) induces the formation of biocomposites as precipitates. Electron microscopy reveals that the enzymatic reaction regulates the morphological transition from particles to nanofibers. Rheology confirms the formation of a rigid hydrogel. As the first example of enzyme-instructed dissolution of a solid to form supramolecular nanofibers/hydrogels, this work provides an approach to generate soft materials with desired properties, expands the application of supramolecular hydrogelators, and offers insights to control the demineralization of calcified soft tissues.

  8. Iminodiacetic acid-modified magnetic poly(2-hydroxyethyl methacrylate)-based microspheres for phosphopeptide enrichment

    Czech Academy of Sciences Publication Activity Database

    Novotná, L.; Emmerová, T.; Horák, Daniel; Kučerová, Z.; Tichá, M.

    2010-01-01

    Vol. 1217, No. 51 (2010), pp. 8032-8040 ISSN 0021-9673 R&D Projects: GA AV ČR(CZ) KAN401220801; GA ČR GA203/09/0857; GA ČR GAP503/10/0664 Institutional research plan: CEZ:AV0Z40500505 Keywords: IMAC phosphopeptide separation * IDA-modified magnetic microspheres * Porcine pepsin A Subject RIV: EE - Microbiology, Virology Impact factor: 4.194, year: 2010

  9. Monodisperse REPO4 (RE = Yb, Gd, Y) hollow microspheres covered with nanothorns as affinity probes for selectively capturing and labeling phosphopeptides.

    Science.gov (United States)

    Cheng, Gong; Zhang, Ji-Lin; Liu, Yan-Lin; Sun, De-Hui; Ni, Jia-Zuan

    2012-02-13

    Rare-earth phosphate microspheres with unique structures were developed as affinity probes for the selective capture and tagging of phosphopeptides. Prickly REPO4 (RE = Yb, Gd, Y) monodisperse microspheres, which have hollow structures, low densities, high specific surface areas, and large adsorptive capacities, were prepared by an ion-exchange method. The elemental compositions and crystal structures of these affinity probes were confirmed by energy-dispersive spectroscopy (EDS), powder X-ray diffraction (XRD), and Fourier-transform infrared (FTIR) spectroscopy. The morphologies of these compounds were investigated using scanning electron microscopy (SEM), transmission electron microscopy (TEM), and nitrogen-adsorption isotherms. The potential ability of these microspheres to selectively capture and label target biological molecules was evaluated by using protein-digestion analysis and a real sample as well as by comparison with the widely used TiO2 affinity microspheres. These results show that these porous rare-earth phosphate microspheres are highly promising probes for the rapid purification and recognition of phosphopeptides. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Iron oxide nanoparticle coating of organic polymer-based monolithic columns for phosphopeptide enrichment

    Czech Academy of Sciences Publication Activity Database

    Křenková, Jana; Foret, František

    2011-01-01

    Vol. 34, No. 16-17 (2011), pp. 2106-2112 ISSN 1615-9306 R&D Projects: GA ČR(CZ) GAP301/11/2055; GA ČR(CZ) GPP206/11/P004 Grant - others: Jihomoravský kraj(CZ) 2SGA2721 Program: 2SGA Institutional research plan: CEZ:AV0Z40310501 Keywords: iron oxide nanoparticles * monolithic column * phosphopeptide enrichment Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 2.733, year: 2011

  11. Hierarchically templated beads with tailored pore structure for phosphopeptide capture and phosphoproteomics

    DEFF Research Database (Denmark)

    Wierzbicka, Celina; Torsetnes, Silje B.; Jensen, Ole N.

    2017-01-01

    Two templating approaches to produce imprinted phosphotyrosine capture beads with a controllable pore structure are reported and compared with respect to their ability to enrich phosphopeptides from a tryptic peptide mixture. The beads were prepared by the polymerization of urea-based host monomers...... and crosslinkers inside the pores of macroporous silica beads with both free and immobilized template. In the final step the silica was removed by fluoride etching resulting in mesoporous polymer replicas with narrow pore size distributions, pore diameters ≈ 10 nm and surface area > 260 m2 g-1. The beads displayed...

  12. Neighbor-directed histidine N(τ) alkylation. A route to imidazolium-containing phosphopeptide macrocycles

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Wen-Jian [National Cancer Inst., Frederick, MD (United States); Park, Jung-Eun [National Cancer Inst., Bethesda, MD (United States); Grant, Robert [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Lai, Christopher C. [National Cancer Inst., Frederick, MD (United States); Kelley, James A. [National Cancer Inst., Frederick, MD (United States); Yaffe, Michael B. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Lee, Kyung S. [National Cancer Inst., Bethesda, MD (United States); Burke, Terrence R. [National Cancer Inst., Frederick, MD (United States)

    2015-07-07

    Our recently discovered, selective, on-resin route to N(τ)-alkylated imidazolium-containing histidine residues affords new strategies for peptide mimetic design. Here, we demonstrate the use of this chemistry to prepare a series of macrocyclic phosphopeptides, in which imidazolium groups serve as ring-forming junctions. These cationic moieties subsequently serve to charge-mask the phosphoamino acid group that directed their formation. Furthermore, neighbor-directed histidine N(τ)-alkylation opens the door to new families of phosphopeptidomimetics for use in a range of chemical biology contexts.

  13. Uranyl Photocleavage of Phosphopeptides Yields Truncated C-Terminally Amidated Peptide Products

    DEFF Research Database (Denmark)

    Elnegaard, Rasmus L B; Møllegaard, Niels Erik; Zhang, Qiang

    2017-01-01

    The uranyl ion (UO2(2+)) binds phosphopeptides with high affinity, and when irradiated with UV-light, it can cleave the peptide backbone. In this study, high-accuracy tandem mass spectrometry and enzymatic assays were used to characterise the photocleavage products resulting from the uranyl photocleavage reaction of a tetraphosphorylated β-casein model peptide. We show that the primary photocleavage products of the uranyl-catalysed reaction are C-terminally amidated. This could be of great interest to the pharmaceutical industry, as efficient peptide amidation reactions are one of the top challenges in green pharmaceutical chemistry.

  14. Undesirable charge-enhancement of isobaric tagged phosphopeptides leads to reduced identification efficiency

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Palmisano, Giuseppe; Kjeldsen, Frank

    2010-01-01

    labeling of proteins and peptides for in vitro cell culture systems (stable isotope labeling using amino acids in cell culture, SILAC) or isobaric peptide labels such as isobaric tags for relative and absolute quantitation (iTRAQ) and tandem mass tags (TMT) for both in vitro and in vivo systems...... identifications observed for large cell- or tissue-based data sets from labeled and nonlabeled peptide mixtures. Ammonia vapor sprayed perpendicular to the electrospray needle during ionization resulted in an overall decrease in the average charge states and a concomitant increase in phosphopeptide...

  15. Phosphopeptide Enrichment by Covalent Chromatography after Derivatization of Protein Digests Immobilized on Reversed-Phase Supports

    Science.gov (United States)

    Nika, Heinz; Nieves, Edward; Hawke, David H.; Angeletti, Ruth Hogue

    2013-01-01

    A rugged sample-preparation method for comprehensive affinity enrichment of phosphopeptides from protein digests has been developed. The method uses a series of chemical reactions to incorporate efficiently and specifically a thiol-functionalized affinity tag into the analyte by barium hydroxide catalyzed β-elimination with Michael addition using 2-aminoethanethiol as nucleophile and subsequent thiolation of the resulting amino group with sulfosuccinimidyl-2-(biotinamido) ethyl-1,3-dithiopropionate. Gentle oxidation of cysteine residues, followed by acetylation of α- and ε-amino groups before these reactions, ensured selectivity of reversible capture of the modified phosphopeptides by covalent chromatography on activated thiol sepharose. The use of C18 reversed-phase supports as a miniaturized reaction bed facilitated optimization of the individual modification steps for throughput and completeness of derivatization. Reagents were exchanged directly on the supports, eliminating sample transfer between the reaction steps and thus, allowing the immobilized analyte to be carried through the multistep reaction scheme with minimal sample loss. The use of this sample-preparation method for phosphopeptide enrichment was demonstrated with low-level amounts of in-gel-digested protein. As applied to tryptic digests of α-S1- and β-casein, the method enabled the enrichment and detection of the phosphorylated peptides contained in the mixture, including the tetraphosphorylated species of β-casein, which has escaped chemical procedures reported previously. The isolates proved highly suitable for mapping the sites of phosphorylation by collisionally induced dissociation. β-Elimination, with consecutive Michael addition, expanded the use of the solid-phase-based enrichment strategy to phosphothreonyl peptides and to phosphoseryl/phosphothreonyl peptides derived from proline-directed kinase substrates and to their O-sulfono- and O-linked β-N-acetylglucosamine (O

  16. Adding casein phosphopeptide-amorphous calcium phosphate to sports drinks to eliminate in vitro erosion.

    Science.gov (United States)

    Ramalingam, L; Messer, L B; Reynolds, E C

    2005-01-01

    Enamel erosion can occur with frequent consumption of sports drinks. The purpose of this study was to determine a minimal concentration of casein phosphopeptide-stabilized amorphous calcium phosphate (CPP-ACP) which, when added to a sports drink, would eliminate such erosion in vitro. Human enamel specimens were immersed in: (1) the sports drink Powerade; (2) Powerade plus one of four concentrations of CPP-ACP (0.063%, 0.09%, 0.125%, 0.25%); or (3) double deionized water. Windows of test and control enamel were profiled, and the enamel surface characteristics were examined under scanning electron microscopy (SEM). The pH of the test solutions increased and the titratable acidity decreased with increasing CPP-ACP concentrations. Erosive step lesions occurred in specimens immersed in Powerade (mean depth = 38.70 ± 5.60 kA), which were eliminated by the addition of CPP-ACP to Powerade at all test concentrations except 0.063% CPP-ACP. Microscopic surface irregularities on test enamel were observed, apparent as adherent granules or globules. These may represent redeposited mineral phases following mobilization of calcium and phosphate from CPP-ACP. Tasters in a taste panel could not distinguish Powerade from Powerade plus 0.125% CPP-ACP. Adding casein phosphopeptide-stabilized amorphous calcium phosphate to the sports drink Powerade significantly reduced the beverage's erosivity without affecting the product's taste.

  17. [Preventive and remineralization effect over incipient lesions of caries decay by phosphopeptide-amorphous calcium phosphate].

    Science.gov (United States)

    Juárez-López, María Lilia Adriana; Hernández-Palacios, Rosa Diana; Hernández-Guerrero, Juan Carlos; Jiménez-Farfán, Dolores; Molina-Frechero, Nelly

    2014-01-01

    INTRODUCTION. Dental caries continues to affect a large percentage of Mexican children, and it is currently advised that, if diagnosed at an early stage, it can be reversed with minimally invasive treatments. Casein phosphopeptide-amorphous calcium phosphate, known as CPP-ACP, is a phosphoprotein capable of releasing calcium and phosphate ions into the oral environment, promoting remineralization. OBJECTIVE. To evaluate the effect of CPP-ACP with added fluoride in a school-based preventive program. MATERIAL AND METHODS. A quasi-experimental study was conducted in 104 six-year-old schoolchildren. The children were classified into three groups and for six months received biweekly applications of different treatments: casein phosphopeptide-amorphous calcium phosphate with added fluoride (CPP-ACPF), sodium fluoride (NaF), or no agent (control group). Clinical evaluation was performed with the laser fluorescence technique (Diagnodent model 2095). A total of 1,340 teeth were included: 294 teeth with incipient lesions and 1,046 healthy teeth. The χ2 and McNemar statistical tests were used. RESULTS. In the group that received the application of CPP-ACPF, 38% of incipient carious lesions were remineralized compared with 21% in the group receiving NaF (p < 0.05). The biweekly application of CPP-ACPF for six months showed a protective and remineralizing effect on incipient carious lesions, and its action was better than that of NaF. However, to reduce the impact of dental caries in schoolchildren it is important to have a comprehensive preventive approach that includes promoting self-care, as well as the application of sealants.

  18. Magnetite/Ceria-Codecorated Titanoniobate Nanosheet: A 2D Catalytic Nanoprobe for Efficient Enrichment and Programmed Dephosphorylation of Phosphopeptides.

    Science.gov (United States)

    Min, Qianhao; Li, Siyuan; Chen, Xueqin; Abdel-Halim, E S; Jiang, Li-Ping; Zhu, Jun-Jie

    2015-05-13

    Global characterization and in-depth understanding of the phosphoproteome based on mass spectrometry (MS) desperately needs a highly efficient affinity probe during sample preparation. In this work, a ternary nanocomposite of magnetite/ceria-codecorated titanoniobate nanosheet (MC-TiNbNS) was synthesized by the electrostatic assembly of Fe3O4 nanospheres and in situ growth of CeO2 nanoparticles on pre-exfoliated titanoniobate, and eventually utilized as the probe and catalyst for the enrichment and dephosphorylation of phosphopeptides. The two-dimensional (2D) structured titanoniobate nanosheet not only promoted the efficacy of capturing phosphopeptides with enlarged surface area, but also functioned as a substrate for embracing the magnetic anchor Fe3O4 to enable magnetic separation and the phosphatase mimic CeO2 to produce identifying signatures of phosphopeptides. Compared to single-component TiNbNS or CeO2 nanoparticles, the ternary nanocomposite provided direct evidence of the number of phosphorylation sites while maintaining the enrichment efficiency. Moreover, by altering the on-sheet CeO2 coverage, the dephosphorylation activity could be fine-tuned, generating continuously adjustable signal intensities of both phosphopeptides and their dephosphorylated tags. Exhaustive detection of both mono- and multiphosphorylated peptides with precise counting of their phosphorylation sites was achieved in the primary mass spectra in the cases of digests of standard phosphoprotein and skim milk, as well as a more complex biological sample, human serum. With the resulting highly informative mass spectra, this multifunctional probe can be used as a promising tool for the fast and comprehensive characterization of phosphopeptides in MS-based phosphoproteomics.
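
    As a rough illustration of the site-counting readout described above (not the authors' data processing), the number of phosphorylation sites follows from the mass shift between a phosphopeptide and its dephosphorylated tag, since each lost HPO3 group removes 79.9663 Da. In the Python sketch below, the m/z values are approximate [M+H]+ values for the well-known β-casein monophosphopeptide FQpSEEQQQTEDELQDK, used purely as an example.

        # Counting phosphorylation sites from the mass shift between a phosphopeptide
        # and its dephosphorylated counterpart: each lost HPO3 group removes 79.9663 Da.
        # Illustrative arithmetic only; the example m/z values are approximate.
        HPO3 = 79.9663  # Da, monoisotopic mass of one HPO3 group

        def count_phosphosites(phospho_mz, dephospho_mz, tol=0.05):
            n = (phospho_mz - dephospho_mz) / HPO3
            n_rounded = round(n)
            if abs(n - n_rounded) * HPO3 > tol:
                raise ValueError("mass shift is not a clean multiple of 79.9663 Da")
            return n_rounded

        print(count_phosphosites(2061.83, 1981.86))  # -> 1 (monophosphorylated peptide)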

  19. Simultaneous quantification of protein phosphorylation sites using liquid chromatography-tandem mass spectrometry-based targeted proteomics: a linear algebra approach for isobaric phosphopeptides.

    Science.gov (United States)

    Xu, Feifei; Yang, Ting; Sheng, Yuan; Zhong, Ting; Yang, Mi; Chen, Yun

    2014-12-05

    As one of the most studied post-translational modifications (PTM), protein phosphorylation plays an essential role in almost all cellular processes. Current methods are able to predict and determine thousands of phosphorylation sites, whereas stoichiometric quantification of these sites is still challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics is emerging as a promising technique for site-specific quantification of protein phosphorylation using proteolytic peptides as surrogates of proteins. However, several issues may limit its application, one of which relates to the phosphopeptides with different phosphorylation sites and the same mass (i.e., isobaric phosphopeptides). While employment of site-specific product ions allows for these isobaric phosphopeptides to be distinguished and quantified, site-specific product ions are often absent or weak in tandem mass spectra. In this study, linear algebra algorithms were employed as an add-on to targeted proteomics to retrieve information on individual phosphopeptides from their common spectra. To achieve this simultaneous quantification, a LC-MS/MS-based targeted proteomics assay was first developed and validated for each phosphopeptide. Given the slope and intercept of calibration curves of phosphopeptides in each transition, linear algebraic equations were developed. Using a series of mock mixtures prepared with varying concentrations of each phosphopeptide, the reliability of the approach to quantify isobaric phosphopeptides containing multiple phosphorylation sites (≥ 2) was discussed. Finally, we applied this approach to determine the phosphorylation stoichiometry of heat shock protein 27 (HSP27) at Ser78 and Ser82 in breast cancer cells and tissue samples.
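
    The core of the linear-algebra add-on described above can be illustrated with a two-peptide, two-transition toy system: if each transition signal is approximately a linear combination of the concentrations of the two isobaric phosphopeptides, with slopes taken from their individual calibration curves, the concentrations follow from solving a small linear system. The Python/numpy sketch below is a simplified illustration under that assumption (intercepts already subtracted from the signals; all numbers invented), not the assay described in the paper.

        # Toy resolution of two isobaric phosphopeptides (e.g., pSer78- vs pSer82-
        # containing peptides of the same mass) from two shared transitions.
        import numpy as np

        # slopes[i, j]: response of transition i per unit concentration of peptide j,
        # taken from each peptide's individual calibration curve (invented numbers).
        slopes = np.array([[1.80, 0.40],
                           [0.30, 2.10]])
        # intercept-corrected signals measured for the two transitions in the mixture
        signals = np.array([10.6, 11.1])

        concentrations = np.linalg.solve(slopes, signals)
        for name, c in zip(["pSer78 peptide", "pSer82 peptide"], concentrations):
            print(f"{name}: {c:.2f}")   # estimated concentration of each isobaric form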

  20. Synthesis of Thermally Switchable Chromatographic Materials with Immobilized Ti4+ for Enrichment of Phosphopeptides by Reversible Addition Fragmentation Chain Transfer Polymerization

    Science.gov (United States)

    Wang, Di; Cao, Zhihan; Pang, Xinzhu; Deng, Yulin; Li, Bo; Dai, Rongji

    2018-01-01

    Reversible phosphorylation of proteins is one of the most crucial types of post-translational modifications (PTMs), and it plays a significant role in diverse biological processes. However, the separation of phosphorylated peptides is still an analytical challenge in phosphoproteomics, because phosphopeptides are always present in low stoichiometry. Thus, enrichment of phosphopeptides before detection is indispensable. In this study, a novel temperature-regulated separation protocol was developed. Silica@p(NIPAAm-co-IPPA)-Ti4+, a new Ti(IV)-IMAC (immobilized metal affinity chromatography) material, was synthesized by reversible addition fragmentation chain transfer (RAFT) polymerization. Owing to the unique thermally responsive properties of poly(N-isopropylacrylamide) (PNIPAAm), the captured phosphorylated peptides could be released simply by changing the temperature, without applying any eluent that could damage the phosphopeptides. We employed isopropanol phosphonic acid (IPPA) as an IMAC ligand for the immobilization of Ti(IV), which could increase the specific adsorption of phosphopeptides. The enrichment and release properties were examined by treatment with pyridoxal 5'-phosphate (PLP) and casein phosphopeptides (CPP); both phosphorylated compounds showed temperature-stimulated binding to Ti4+. Finally, silica@p(NIPAAm-co-IPPA)-Ti4+ was successfully employed in the pretreatment of phosphopeptides in a tryptic digest of α-casein and human serum albumin (HSA). The results indicated the great potential of this new temperature-responsive material in phosphoproteomics studies.

  1. An SH2 domain model of STAT5 in complex with phospho-peptides define "STAT5 Binding Signatures".

    Science.gov (United States)

    Gianti, Eleonora; Zauhar, Randy J

    2015-05-01

    The signal transducer and activator of transcription 5 (STAT5) is a member of the STAT family of proteins, implicated in cell growth and differentiation. STAT activation is regulated by phosphorylation of protein monomers at conserved tyrosine residues, followed by binding to phospho-peptide pockets and subsequent dimerization. STAT5 is implicated in the development of severe pathological conditions, including many cancer forms. However, nowadays a few STAT5 inhibitors are known, and only one crystal structure of the inactive STAT5 dimer is publicly available. With a view to enabling structure-based drug design, we have: (1) analyzed phospho-peptide binding pockets on SH2 domains of STAT5, STAT1 and STAT3; (2) generated a model of STAT5 bound to phospho-peptides; (3) assessed our model by docking against a class of known STAT5 inhibitors (Müller et al. in ChemBioChem 9:723-727, 2008); (4) used molecular dynamics simulations to optimize the molecular determinants responsible for binding and (5) proposed unique "Binding Signatures" of STAT5. Our results put in place the foundations to address STAT5 as a target for rational drug design, from sequence, structural and functional perspectives.

  2. An SH2 domain model of STAT5 in complex with phospho-peptides define ``STAT5 Binding Signatures''

    Science.gov (United States)

    Gianti, Eleonora; Zauhar, Randy J.

    2015-05-01

    The signal transducer and activator of transcription 5 (STAT5) is a member of the STAT family of proteins, implicated in cell growth and differentiation. STAT activation is regulated by phosphorylation of protein monomers at conserved tyrosine residues, followed by binding to phospho-peptide pockets and subsequent dimerization. STAT5 is implicated in the development of severe pathological conditions, including many cancer forms. However, nowadays a few STAT5 inhibitors are known, and only one crystal structure of the inactive STAT5 dimer is publicly available. With a view to enabling structure-based drug design, we have: (1) analyzed phospho-peptide binding pockets on SH2 domains of STAT5, STAT1 and STAT3; (2) generated a model of STAT5 bound to phospho-peptides; (3) assessed our model by docking against a class of known STAT5 inhibitors (Müller et al. in ChemBioChem 9:723-727, 2008); (4) used molecular dynamics simulations to optimize the molecular determinants responsible for binding and (5) proposed unique "Binding Signatures" of STAT5. Our results put in place the foundations to address STAT5 as a target for rational drug design, from sequence, structural and functional perspectives.

  3. Structural Basis for the Presentation of Tumor-Associated MHC Class II-Restricted Phosphopeptides to CD4+ T Cells

    Energy Technology Data Exchange (ETDEWEB)

    Li, Y.; Depontieu, F; Sidney, J; Salay, T; Engelhard, V; Hunt, D; Sette, A; Topalian, S; Mariuzza, R

    2010-01-01

    Dysregulated protein phosphorylation is a hallmark of malignant transformation. Transformation can generate major histocompatibility complex (MHC)-bound phosphopeptides that are differentially displayed on tumor cells for specific recognition by T cells. To understand how phosphorylation alters the antigenic identity of self-peptides and how MHC class II molecules present phosphopeptides for CD4+ T-cell recognition, we determined the crystal structure of a phosphopeptide derived from melanoma antigen recognized by T cells-1 (pMART-1), selectively expressed by human melanomas, in complex with HLA-DR1. The structure revealed that the phosphate moiety attached to the serine residue at position P5 of pMART-1 is available for direct interactions with T-cell receptor (TCR) and that the peptide N-terminus adopts an unusual conformation orienting it toward TCR. This structure, combined with measurements of peptide affinity for HLA-DR1 and of peptide-MHC recognition by pMART-1-specific T cells, suggests that TCR recognition is focused on the N-terminal portion of pMART-1. This recognition mode appears to be distinct from that of foreign antigen complexes but is remarkably reminiscent of the way autoreactive TCRs engage self- or altered self-peptides, consistent with the tolerogenic nature of tumor-host immune interactions.

  4. An Innovative Approach to Treat Incisors Hypomineralization (MIH: A Combined Use of Casein Phosphopeptide-Amorphous Calcium Phosphate and Hydrogen Peroxide—A Case Report

    Directory of Open Access Journals (Sweden)

    Stefano Mastroberardino

    2012-01-01

    Full Text Available Molar Incisor Hypomineralization (MIH) is characterized by a developmentally derived deficiency in mineral enamel. Affected teeth present demarcated enamel opacities, ranging from white to brown; hypoplasia can also be associated. Patients frequently report aesthetic discomfort if anterior teeth are involved. This problem leads patients to request a bleaching treatment to improve aesthetic conditions. Nevertheless, hydrogen peroxide can produce serious side-effects, resulting from further mineral loss. Microabrasion and/or a composite restoration are the treatments of choice in teeth with mild/moderate MIH, but they also entail enamel loss. Recently, a new remineralizing agent based on Casein Phosphopeptide-Amorphous Calcium Phosphate (CPP-ACP) has been proposed to be effective in hypomineralized enamel, also improving aesthetic conditions. The present paper presents a case report of a young man with white opacities on incisors treated with a combined use of CPP-ACP mousse and hydrogen peroxide gel to correct the aesthetic defect. The patient was instructed to use CPP-ACP for two hours per day for three months in order to obtain enamel remineralization, followed by a combined use of CPP-ACP and bleaching agent for a further two months. At the end of this five-month treatment, a noticeable aesthetic improvement of the opacities was observed.

  5. An Innovative Approach to Treat Incisors Hypomineralization (MIH): A Combined Use of Casein Phosphopeptide-Amorphous Calcium Phosphate and Hydrogen Peroxide—A Case Report

    Science.gov (United States)

    Mastroberardino, Stefano; Campus, Guglielmo; Strohmenger, Laura; Villa, Alessandro; Cagetti, Maria Grazia

    2012-01-01

    Molar Incisor Hypomineralization (MIH) is characterized by a developmentally derived deficiency in mineral enamel. Affected teeth present demarcated enamel opacities, ranging from white to brown; hypoplasia can also be associated. Patients frequently report aesthetic discomfort if anterior teeth are involved. This problem leads patients to request a bleaching treatment to improve aesthetic conditions. Nevertheless, hydrogen peroxide can produce serious side-effects, resulting from further mineral loss. Microabrasion and/or a composite restoration are the treatments of choice in teeth with mild/moderate MIH, but they also entail enamel loss. Recently, a new remineralizing agent based on Casein Phosphopeptide-Amorphous Calcium Phosphate (CPP-ACP) has been proposed to be effective in hypomineralized enamel, also improving aesthetic conditions. The present paper presents a case report of a young man with white opacities on incisors treated with a combined use of CPP-ACP mousse and hydrogen peroxide gel to correct the aesthetic defect. The patient was instructed to use CPP-ACP for two hours per day for three months in order to obtain enamel remineralization, followed by a combined use of CPP-ACP and bleaching agent for a further two months. At the end of this five-month treatment, a noticeable aesthetic improvement of the opacities was observed. PMID:23243519

  6. Analysis and Improvement of Fireworks Algorithm

    OpenAIRE

    Xi-Guang Li; Shou-Fei Han; Chang-Qing Gong

    2017-01-01

    The Fireworks Algorithm is a recently developed swarm intelligence algorithm to simulate the explosion process of fireworks. Based on the analysis of each operator of Fireworks Algorithm (FWA), this paper improves the FWA and proves that the improved algorithm converges to the global optimal solution with probability 1. The proposed algorithm improves the goal of further boosting performance and achieving global optimization where mainly include the following strategies. Firstly using the opp...

  7. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  8. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasoor

    2011-01-01

    We present some improved analytical results as part of the ongoing work on the analysis of Fugue-256 hash function, a second round candidate in the NIST's SHA3 competition. First we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the final transformation of Fugue-256 to 16...

  9. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  10. Improvement of electrophoresis performance by spectral analysis ...

    African Journals Online (AJOL)

    This paper describes a new design of standard agarose gel electrophoresis procedure for nucleic acids analysis. The electrophoresis was improved by using the real-time spectral analysis of the samples to increase its performance. A laser beam illuminated the analysed sample at wavelength with the highest absorption of ...

  11. The decreased of Streptococcus Mutans growth after topical application of phosphopeptide amorphous calcium phosphate paste

    Directory of Open Access Journals (Sweden)

    Tika Faradina Araf

    2011-07-01

    Full Text Available Casein Phosphopeptide-Amorphous Calcium Phosphate (CPP-ACP) paste is a topical application substance consisting of a series of milk-derived peptides produced by phosphorylation, and it has antibacterial activity. The objective of this research was to find out the difference in Streptococcus mutans growth before and after CPP-ACP paste was given topically to children's teeth. The method of the research was a quasi-experiment. The research samples were 10 students of MI Al Falah Islamic Boarding School, Jatinangor, West Java, Indonesia, collected with a purposive sampling technique. This research used dental plaque from children's teeth before and after application of CPP-ACP paste. The plaque was cultivated in the selective medium Tryptone Yeast Cysteine Sucrose Bacitracin (TYCSB) in duplicate. Streptococcus mutans colonies in TYCSB were counted with a Stuart colony counter and statistically analyzed using a paired t-test. The results showed that the average Streptococcus mutans growth before application of CPP-ACP paste was 57.05 colonies, whereas after application of CPP-ACP paste it was 9.4 colonies at 1 day, 2.85 at 3 days, and 1.7 at 14 days. The research concluded that there was a decrease in Streptococcus mutans growth in plaque isolates after CPP-ACP paste was topically applied to children's teeth.

  12. Improved security analysis of Fugue-256 (poster)

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde; Bagheri, Nasoor

    2011-01-01

    We present some improved analytical results as part of the ongoing work on the analysis of Fugue-256 hash function, a second round candidate in the NIST's SHA3 competition. First we improve Aumasson and Phan's integral distinguisher on the 5.5 rounds of the final transformation of Fugue-256 to 16.5 rounds. Next we improve the designers' meet-in-the-middle preimage attack on Fugue-256 from 2^480 time and memory to 2^416. Finally, we comment on possible methods to obtain free-start distinguishers and free-start collisions for Fugue-256. © 2011 Springer-Verlag.

  13. Effect of casein phosphopeptide - amorphous calcium phosphate containing chewing gum on salivary concentration of calcium and phosphorus: An in-vivo study

    Directory of Open Access Journals (Sweden)

    B P Santhosh

    2012-01-01

    Full Text Available Aim: Caries clinical trials of sugar-free chewing gum have shown that the gum is noncariogenic and in fact has an anticariogenic effect through the stimulation of saliva. Sugar-free gums, therefore, may be an excellent delivery vehicle for safe and effective additives capable of promoting enamel remineralization. Casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) nanocomplexes incorporated into sugar-free chewing gum have been shown to remineralize enamel subsurface lesions in situ. So this study was conducted to evaluate the effect of CPP-ACP-containing sugar-free chewing gum on the salivary concentration of calcium and phosphorus. Materials and Methods: Unstimulated saliva from each of 24 selected subjects was collected. Then each subject was given two pellets of chewing gum containing CPP-ACP and asked to chew for a period of 20 min, after which saliva samples were collected from each individual. Once all the samples were collected, they were assessed for calcium and phosphorus concentration using affiliated reagent kits and a photometer. Statistical Analysis Used: Data obtained were analyzed using Student's paired t-test. Results: A significant difference was found in the calcium and phosphorus concentration of saliva before and after chewing the CPP-ACP-containing chewing gum. Conclusions: Chewing of CPP-ACP-containing chewing gum showed a significant increase in the salivary concentration of calcium for a prolonged period of time; hence, it may help in the remineralization of tooth surfaces.

  14. MS1 Peptide Ion Intensity Chromatograms in MS2 (SWATH) Data Independent Acquisitions. Improving Post Acquisition Analysis of Proteomic Experiments*

    Science.gov (United States)

    Rardin, Matthew J.; Schilling, Birgit; Cheng, Lin-Yang; MacLean, Brendan X.; Sorensen, Dylan J.; Sahu, Alexandria K.; MacCoss, Michael J.; Vitek, Olga; Gibson, Bradford W.

    2015-01-01

    Quantitative analysis of discovery-based proteomic workflows now relies on high-throughput large-scale methods for identification and quantitation of proteins and post-translational modifications. Advancements in label-free quantitative techniques, using either data-dependent or data-independent mass spectrometric acquisitions, have coincided with improved instrumentation featuring greater precision, increased mass accuracy, and faster scan speeds. We recently reported on a new quantitative method called MS1 Filtering (Schilling et al. (2012) Mol. Cell. Proteomics 11, 202–214) for processing data-independent MS1 ion intensity chromatograms from peptide analytes using the Skyline software platform. In contrast, data-independent acquisitions from MS2 scans, or SWATH, can quantify all fragment ion intensities when reference spectra are available. As each SWATH acquisition cycle typically contains an MS1 scan, these two independent label-free quantitative approaches can be acquired in a single experiment. Here, we have expanded the capability of Skyline to extract both MS1 and MS2 ion intensity chromatograms from a single SWATH data-independent acquisition in an Integrated Dual Scan Analysis approach. The performance of both MS1 and MS2 data was examined in simple and complex samples using standard concentration curves. Cases of interferences in MS1 and MS2 ion intensity data were assessed, as were the differentiation and quantitation of phosphopeptide isomers in MS2 scan data. In addition, we demonstrated an approach for optimization of SWATH m/z window sizes to reduce interferences using MS1 scans as a guide. Finally, a correlation analysis was performed on both MS1 and MS2 ion intensity data obtained from SWATH acquisitions on a complex mixture using a linear model that automatically removes signals containing interferences. This work demonstrates the practical advantages of properly acquiring and processing MS1 precursor data in addition to MS2 fragment ion

  15. MS1 Peptide Ion Intensity Chromatograms in MS2 (SWATH) Data Independent Acquisitions. Improving Post Acquisition Analysis of Proteomic Experiments.

    Science.gov (United States)

    Rardin, Matthew J; Schilling, Birgit; Cheng, Lin-Yang; MacLean, Brendan X; Sorensen, Dylan J; Sahu, Alexandria K; MacCoss, Michael J; Vitek, Olga; Gibson, Bradford W

    2015-09-01

    Quantitative analysis of discovery-based proteomic workflows now relies on high-throughput large-scale methods for identification and quantitation of proteins and post-translational modifications. Advancements in label-free quantitative techniques, using either data-dependent or data-independent mass spectrometric acquisitions, have coincided with improved instrumentation featuring greater precision, increased mass accuracy, and faster scan speeds. We recently reported on a new quantitative method called MS1 Filtering (Schilling et al. (2012) Mol. Cell. Proteomics 11, 202-214) for processing data-independent MS1 ion intensity chromatograms from peptide analytes using the Skyline software platform. In contrast, data-independent acquisitions from MS2 scans, or SWATH, can quantify all fragment ion intensities when reference spectra are available. As each SWATH acquisition cycle typically contains an MS1 scan, these two independent label-free quantitative approaches can be acquired in a single experiment. Here, we have expanded the capability of Skyline to extract both MS1 and MS2 ion intensity chromatograms from a single SWATH data-independent acquisition in an Integrated Dual Scan Analysis approach. The performance of both MS1 and MS2 data was examined in simple and complex samples using standard concentration curves. Cases of interferences in MS1 and MS2 ion intensity data were assessed, as were the differentiation and quantitation of phosphopeptide isomers in MS2 scan data. In addition, we demonstrated an approach for optimization of SWATH m/z window sizes to reduce interferences using MS1 scans as a guide. Finally, a correlation analysis was performed on both MS1 and MS2 ion intensity data obtained from SWATH acquisitions on a complex mixture using a linear model that automatically removes signals containing interferences. This work demonstrates the practical advantages of properly acquiring and processing MS1 precursor data in addition to MS2 fragment ion
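
    A simplified flavour of the interference screening described above is to correlate each MS2 fragment-ion chromatogram with the MS1 precursor chromatogram across the elution peak and discard traces that do not track it. The Python sketch below illustrates this idea with Pearson correlation and invented intensity traces; it is a simplification, not the Skyline workflow or the linear model used by the authors.

        # Simplified interference screen: keep only MS2 fragment chromatograms that
        # correlate well with the MS1 precursor chromatogram across the elution peak.
        # Intensity traces are invented; this mimics the idea, not the actual model.
        import numpy as np

        def pearson(x, y):
            return float(np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1])

        ms1_trace = [1, 4, 12, 30, 55, 61, 40, 18, 6, 2]          # precursor chromatogram
        ms2_traces = {
            "y7": [2, 5, 14, 33, 58, 63, 43, 20, 7, 3],           # tracks the precursor
            "y5": [1, 3, 11, 28, 52, 59, 38, 17, 5, 2],           # tracks the precursor
            "b4": [30, 28, 31, 29, 33, 30, 55, 80, 95, 90],       # co-eluting interference
        }

        kept = {ion: trace for ion, trace in ms2_traces.items()
                if pearson(ms1_trace, trace) >= 0.95}
        print(sorted(kept))   # -> ['y5', 'y7']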

  16. Structure of a 14-3-3σ–YAP phosphopeptide complex at 1.15 Å resolution

    International Nuclear Information System (INIS)

    Schumacher, Benjamin; Skwarczynska, Malgorzata; Rose, Rolf; Ottmann, Christian

    2010-01-01

    The first structure of a 14-3-3 protein–phosphopeptide complex is reported at 1.15 Å resolution. The YAP 14-3-3-binding motif is revealed for the first time using crystallographic tools. The 14-3-3 proteins are a class of eukaryotic acidic adapter proteins, with seven isoforms in humans. 14-3-3 proteins mediate their biological function by binding to target proteins and influencing their activity. They are involved in pivotal pathways in the cell such as signal transduction, gene expression, enzyme activation, cell division and apoptosis. The Yes-associated protein (YAP) is a WW-domain protein that exists in two transcript variants of 48 and 54 kDa in humans. By transducing signals from the cytoplasm to the nucleus, YAP is important for transcriptional regulation. In both variants, interaction with 14-3-3 proteins after phosphorylation of Ser127 is important for nucleocytoplasmic trafficking, via which the localization of YAP is controlled. In this study, 14-3-3σ has been cloned, purified and crystallized in complex with a phosphopeptide from the YAP 14-3-3-binding domain, which led to a crystal that diffracted to 1.15 Å resolution. The crystals belonged to space group C222₁, with unit-cell parameters a = 82.3, b = 112.1, c = 62.9 Å

  17. Characterization of biases in phosphopeptide enrichment by Ti(4+)-immobilized metal affinity chromatography and TiO2 using a massive synthetic library and human cell digests

    NARCIS (Netherlands)

    Matheron, Lucrece; van den Toorn, Henk; Heck, Albert J R; Mohammed, Shabaz

    2014-01-01

    Outcomes of comparative evaluations of enrichment methods for phosphopeptides depend highly on the experimental protocols used, the operator, the source of the affinity matrix, and the samples analyzed. Here, we attempt such a comparative study exploring a very large synthetic library containing

  18. Spinel-type manganese ferrite (MnFe2O4) microspheres: A novel affinity probe for selective and fast enrichment of phosphopeptides.

    Science.gov (United States)

    Long, Xing-Yu; Li, Jia-Yuan; Sheng, Dong; Lian, Hong-Zhen

    2017-05-01

    The spinel-type magnetic manganese ferrite (MnFe2O4) microspheres synthesized by a simple solvothermal method were used as a novel adsorbent for selective enrichment and effective isolation of phosphopeptides. The uniform MnFe2O4 magnetic affinity microspheres (MAMSs) had a narrow particle size distribution between 250 and 260 nm, and displayed superparamagnetism with a saturation magnetization value of 67.0 emu/g. The possible formation mechanism of MnFe2O4 microspheres with ferric and manganous sources as dual precursors was comprehensively elucidated by comparison with those of Fe3O4 nanoparticles and MnOOH nanosheets, respectively, each prepared with either a ferric or a manganous source as single precursor. It was suggested that the spherical or sheet nanostructures could be achieved via secondary recrystallization or Ostwald ripening. The MnFe2O4 MAMS probe exhibited excellent dispersibility in aqueous solution, rapid magnetic separation within 15 s, as well as good reusability. More importantly, MnFe2O4 was highly selective for phosphopeptides because of the strong coordination interaction between metal ions (Fe3+ and Mn2+) and the phosphate groups of phosphopeptides. This high specificity was demonstrated by effectively enriching phosphopeptides from a digest mixture of β-casein and bovine serum albumin (BSA) with a high content of non-phosphopeptides, and was further embodied in phosphopeptide enrichment from non-fat milk digests and human serum. Consequently, the prepared MnFe2O4 affinity materials are expected to possess great potential in phosphoproteome research. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Economic Analysis of Improved Alkaline Water Electrolysis

    International Nuclear Information System (INIS)

    Kuckshinrichs, Wilhelm; Ketelaer, Thomas; Koj, Jan Christian

    2017-01-01

    Alkaline water electrolysis (AWE) is a mature hydrogen production technology and there exists a range of economic assessments for available technologies. For advanced AWEs, which may be based on novel polymer-based membrane concepts, it is of prime importance that development comes along with new configurations and technical and economic key process parameters for AWE that might be of interest for further economic assessments. This paper presents an advanced AWE technology referring to three different sites in Europe (Germany, Austria, and Spain). The focus is on financial metrics, the projection of key performance parameters of advanced AWEs, and further financial and tax parameters. For financial analysis from an investor’s (business) perspective, a comprehensive assessment of a technology not only comprises cost analysis but also further financial analysis quantifying attractiveness and supply/market flexibility. Therefore, based on cash flow (CF) analysis, a comprehensible set of metrics may comprise levelized cost of energy or, respectively, levelized cost of hydrogen (LCH) for cost assessment, net present value (NPV) for attractiveness analysis, and variable cost (VC) for analysis of market flexibility. The German AWE site turns out to perform best in all three financial metrics (LCH, NPV, and VC). Though there are slight differences in investment cost and operation and maintenance cost projections for the three sites, the major cost impact is due to the electricity cost. Although investment cost is slightly lower and labor cost is significantly lower in Spain, the difference cannot outweigh the higher electricity cost compared to Germany. Given the assumption that the electrolysis operators are customers directly and actively participating in power markets, and based on the regulatory framework in the three countries, in this special case electricity cost in Germany is lowest. However, as electricity cost is profoundly influenced by political decisions as
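    The levelized-cost and net-present-value metrics named above can be written down compactly. The following is a minimal sketch of how LCH and NPV are typically computed from discounted cash flows; the function names, parameters and the numbers in the example are illustrative assumptions, not figures from the study.

```python
# Minimal sketch of the LCH and NPV metrics; all numbers are hypothetical.

def levelized_cost_of_hydrogen(capex, opex_per_year, h2_kg_per_year, years, rate):
    """LCH = discounted lifetime cost / discounted lifetime hydrogen output."""
    cost = capex + sum(opex_per_year / (1 + rate) ** t for t in range(1, years + 1))
    output = sum(h2_kg_per_year / (1 + rate) ** t for t in range(1, years + 1))
    return cost / output  # currency units per kg of H2


def net_present_value(capex, net_cash_flow_per_year, years, rate):
    """NPV of the plant from an investor's perspective."""
    return -capex + sum(net_cash_flow_per_year / (1 + rate) ** t
                        for t in range(1, years + 1))


# Hypothetical 1 MW-class electrolyser, 20-year lifetime, 8% discount rate
print(levelized_cost_of_hydrogen(1.2e6, 4.5e5, 1.5e5, 20, 0.08))
print(net_present_value(1.2e6, 1.0e5, 20, 0.08))
```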

  20. Analysis and Improvement of Fireworks Algorithm

    Directory of Open Access Journals (Sweden)

    Xi-Guang Li

    2017-02-01

    Full Text Available The Fireworks Algorithm is a recently developed swarm intelligence algorithm that simulates the explosion process of fireworks. Based on an analysis of each operator of the Fireworks Algorithm (FWA), this paper improves the FWA and proves that the improved algorithm converges to the global optimal solution with probability 1. The proposed algorithm aims to further boost performance and achieve global optimization, mainly through the following strategies. Firstly, the population is initialized using opposition-based learning. Secondly, a new explosion amplitude mechanism for the optimal firework is proposed. In addition, adaptive t-distribution mutation for non-optimal individuals and elite opposition-based learning for the optimal individual are used. Finally, a new selection strategy, namely Disruptive Selection, is proposed to reduce the running time of the algorithm compared with FWA. In our simulation, we apply the CEC2013 standard functions and compare the proposed algorithm (IFWA) with SPSO2011, FWA, EFWA and dynFWA. The results show that the proposed algorithm has better overall performance on the test functions.
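    The opposition-based learning initialization mentioned above can be sketched in a few lines. The snippet below is an illustrative reconstruction under common OBL conventions; the sphere objective and all parameter values are hypothetical stand-ins, not the paper's benchmark setup.

```python
import numpy as np

def obl_initialization(pop_size, dim, lower, upper, fitness):
    """Opposition-based learning initialization (sketch).

    Generate a random population plus its 'opposite' population
    (lower + upper - x) and keep the best pop_size individuals."""
    rng = np.random.default_rng(0)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    opposite = lower + upper - pop
    candidates = np.vstack([pop, opposite])
    fitness_values = np.apply_along_axis(fitness, 1, candidates)
    best = np.argsort(fitness_values)[:pop_size]   # minimization assumed
    return candidates[best]

# Example with the sphere function as a stand-in objective
sphere = lambda x: float(np.sum(x ** 2))
initial_pop = obl_initialization(pop_size=20, dim=10, lower=-5.0, upper=5.0, fitness=sphere)
print(initial_pop.shape)  # (20, 10)
```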

  1. Casein phosphopeptides and CaCl2 increase penicillin production and cause an increment in microbody/peroxisome proteins in Penicillium chrysogenum.

    Science.gov (United States)

    Domínguez-Santos, Rebeca; Kosalková, Katarina; García-Estrada, Carlos; Barreiro, Carlos; Ibáñez, Ana; Morales, Alejandro; Martín, Juan-Francisco

    2017-03-06

    Transport of penicillin intermediates and penicillin secretion are still poorly characterized in Penicillium chrysogenum (re-identified as Penicillium rubens). Calcium (Ca2+) plays an important role in the metabolism of filamentous fungi, and casein phosphopeptides (CPP) are involved in Ca2+ internalization. In this study we observe that the effect of CaCl2 and CPP is additive and promotes an increase in penicillin production of up to 10-12 fold. The combination of CaCl2 and CPP greatly promotes expression of the three penicillin biosynthetic genes. Comparative proteomic analysis by 2D-DIGE identified 39 proteins differentially represented in P. chrysogenum Wisconsin 54-1255 after CPP/CaCl2 addition. The most interesting group of overrepresented proteins were a peroxisomal catalase, three proteins of the methylcitrate cycle, two aminotransferases and cystathionine β-synthase, which are directly or indirectly related to the formation of penicillin amino acid precursors. Importantly, two of the enzymes of the penicillin pathway (isopenicillin N synthase and isopenicillin N acyltransferase) are clearly induced after CPP/CaCl2 addition. Most of these overrepresented proteins are either authentic peroxisomal proteins or microbody-associated proteins. This evidence suggests that addition of CPP/CaCl2 promotes the formation of penicillin precursors and the penicillin biosynthetic enzymes in peroxisomes and vesicles, which may be involved in transport and secretion of penicillin. Penicillin biosynthesis in Penicillium chrysogenum is one of the best characterized secondary metabolism processes. However, the mechanism by which penicillin is secreted still remains to be elucidated. Taking into account the role played by Ca2+ and CPP in the secretory pathway and considering the positive effect that Ca2+ exerts on penicillin production, the analysis of global protein changes produced after CPP/CaCl2 addition is very helpful to decipher the processes related to the

  2. Methodological considerations for improving Western blot analysis.

    Science.gov (United States)

    MacPhee, Daniel J

    2010-01-01

    The need for a technique that could allow the determination of antigen specificity of antisera led to the development of a method that allowed the production of a replica of proteins, which had been separated electrophoretically on polyacrylamide gels, on to a nitrocellulose membrane. This method was coined Western blotting and is very useful to study the presence, relative abundance, relative molecular mass, post-translational modification, and interaction of specific proteins. As a result it is utilized routinely in many fields of scientific research such as chemistry, biology and biomedical sciences. This review serves to touch on some of the methodological conditions that should be considered to improve Western blot analysis, particularly as a guide for graduate students but also scientists who wish to continue adapting this now fundamental research tool. Copyright 2009 Elsevier Inc. All rights reserved.

  3. In vitro evaluation of casein phosphopeptide-amorphous calcium phosphate effect on the shear bond strength of dental adhesives to enamel

    Directory of Open Access Journals (Sweden)

    Niloofar Shadman

    2015-01-01

    Full Text Available Background: Casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) is applied for remineralization of early caries lesions or tooth sensitivity conditions and may affect subsequent resin bonding. This in vitro study investigated the effect of CPP-ACP on the shear bond strength of dental adhesives to enamel. Materials and Methods: Sixty extracted human molar teeth were selected and randomly divided into three groups and six subgroups. Buccal or lingual surfaces of the teeth were prepared to create a flat enamel surface. The adhesives used were Tetric N-Bond, AdheSE and AdheSE One F. In three subgroups, before applying adhesives, enamel surfaces were treated with Tooth Mousse CPP-ACP for one hour, rinsed and stored at 37°C and 100% humidity. This procedure was repeated for 5 days and then adhesives were applied and Tetric N-Ceram composite was adhered to the enamel. The same procedure was performed for the other three subgroups without CPP-ACP treatment. After 24-hour water storage, samples were subjected to shear bond strength testing in a universal testing machine. Failure modes were determined by stereomicroscope. Data were analyzed by t-test and one-way analysis of variance with P < 0.05. In the subgroups without CPP-ACP application, there were statistically significant differences among all subgroups. Tetric N-Bond had the highest and AdheSE One F had the lowest shear bond strength. Conclusion: CPP-ACP application reduces the shear bond strength of AdheSE and AdheSE One F to enamel but not of Tetric N-Bond.

  4. The molecular basis of FHA domain:phosphopeptide binding specificity and implications for phospho-dependent signaling mechanisms.

    Science.gov (United States)

    Durocher, D; Taylor, I A; Sarbassova, D; Haire, L F; Westcott, S L; Jackson, S P; Smerdon, S J; Yaffe, M B

    2000-11-01

    Forkhead-associated (FHA) domains are a class of ubiquitous signaling modules that appear to function through interactions with phosphorylated target molecules. We have used oriented peptide library screening to determine the optimal phosphopeptide binding motifs recognized by several FHA domains, including those within a number of DNA damage checkpoint kinases, and determined the X-ray structure of Rad53p-FHA1, in complex with a phospho-threonine peptide, at 1.6 A resolution. The structure reveals a striking similarity to the MH2 domains of Smad tumor suppressor proteins and reveals a mode of peptide binding that differs from SH2, 14-3-3, or PTB domain complexes. These results have important implications for DNA damage signaling and CHK2-dependent tumor suppression, and they indicate that FHA domains play important and unsuspected roles in S/T kinase signaling mechanisms in prokaryotes and eukaryotes.

  5. Identification of 14-3-3 Proteins Phosphopeptide-Binding Specificity Using an Affinity-Based Computational Approach.

    Directory of Open Access Journals (Sweden)

    Zhao Li

    Full Text Available The 14-3-3 proteins are a highly conserved family of homodimeric and heterodimeric molecules, expressed in all eukaryotic cells. In human cells, this family consists of seven distinct but highly homologous 14-3-3 isoforms. 14-3-3σ is the only isoform directly linked to cancer in epithelial cells, and it is regulated by major tumor suppressor genes. For each 14-3-3 isoform, we have 1,000 peptide motifs with experimental binding affinity values. In this paper, we present a novel method for identifying peptide motifs binding to the 14-3-3σ isoform. First, we propose a sampling criterion to build a predictor for each new peptide sequence. Then, we select nine physicochemical properties of amino acids to describe each peptide motif. We also use auto-cross covariance to extract correlative properties of amino acids in any two positions. Finally, we use elastic net to predict affinity values of peptide motifs, based on ridge regression and the least absolute shrinkage and selection operator (LASSO). Our method is tested on the 1,000 known peptide motifs binding to seven 14-3-3 isoforms. On the 14-3-3σ isoform, our method has overall Pearson product-moment correlation coefficient (PCC) and root mean squared error (RMSE) values of 0.84 and 252.31 for the N-terminal sublibrary, and 0.77 and 269.13 for the C-terminal sublibrary. We predict affinity values of 16,000 peptide sequences and relative binding ability across six permutated positions similar to the experimental values. We identify phosphopeptides that preferentially bind to 14-3-3σ over other isoforms. Several positions on the peptide motifs fall in the same amino acid category as the experimental substrate specificity of phosphopeptides binding to 14-3-3σ. Our method is fast and reliable and is a general computational method that can be used in peptide-protein binding identification in proteomics research.
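    As a rough illustration of the regression step described above, the sketch below fits an elastic net (which blends the ridge and LASSO penalties) to peptide descriptor vectors and reports PCC and RMSE. The feature encoding and data here are synthetic placeholders, not the authors' descriptors or affinity measurements.

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical stand-ins: each peptide motif is encoded as a fixed-length
# vector of physicochemical descriptors (random numbers for illustration).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 90))          # e.g. 9 properties x 10 positions
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=1000)  # fake affinities

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = ElasticNet(alpha=0.1, l1_ratio=0.5)   # blends ridge and LASSO penalties
model.fit(X_train, y_train)
pred = model.predict(X_test)

pcc = np.corrcoef(y_test, pred)[0, 1]
rmse = mean_squared_error(y_test, pred) ** 0.5
print(f"PCC={pcc:.2f}  RMSE={rmse:.2f}")
```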

  6. Continuous improvement projects: an authorship bibliometric analysis.

    Science.gov (United States)

    Gonzalez Aleu, Fernando; Van Aken, Eileen M

    2017-06-12

    Purpose The purpose of this paper is to describe the current research on hospital continuous improvement projects (CIPs) from an author characteristics' perspective. This work addresses the following questions: who are the predominant research authors in hospital CIPs? To what extent are the research communities collaborating in distinct research groups? How internationalized has hospital CIPs research become with respect to author location? Design/methodology/approach A systematic literature review was conducted, identifying 302 academic publications related to hospital CIPs. Publications were analyzed using: author, quantity, diversity, collaboration, and impact. Findings Hospital CIPs are increasingly attracting new scholars each year. Based on the authors' analysis, authors publishing in this area can be described as a relatively new international community given the countries represented. Originality/value This paper describes the current hospital CIP research by assessing author characteristics. Future work should examine additional attributes to characterize maturity such as how new knowledge is being created and to what extent new knowledge is being disseminated to practitioners.

  7. SPORTS ORGANIZATIONS MANAGEMENT IMPROVEMENT: A SURVEY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Alin Molcut

    2015-07-01

    Full Text Available Sport organizations exist to perform tasks that can only be executed through cooperative effort, and sport management is responsible for the performance and success of these organizations. The main aim of the paper is to analyze several issues in the management of sports organizations in order to assess their quality of management. In this respect a questionnaire has been designed for performing a survey analysis through a statistical approach. The investigation was conducted over a period of 3 months, and a number of football managers and coaches working in football clubs in the counties of Timis and Arad, at the level of training for children and juniors, were questioned. The results suggest that there is a significant interest in improving management across children's teams and under-21 clubs, with emphasis on players' participation and rewarding performance. Furthermore, we can state that in the sports clubs a vision and a mission are established, as well as general club objectives referring to both sporting performance and financial performance.

  8. Neighbor-Directed Histidine N (s)–Alkylation: A Route to Imidazolium-Containing Phosphopeptide Macrocycles-Biopolymers | Center for Cancer Research

    Science.gov (United States)

    Our recently discovered, selective, on-resin route to N(s)-alkylated imidazolium-containing histidine residues affords new strategies for peptide mimetic design. Here, we demonstrate the use of this chemistry to prepare a series of macrocyclic phosphopeptides, in which imidazolium groups serve as ring-forming junctions. Interestingly, these cationic moieties subsequently serve to charge-mask the phosphoamino acid group that directed their formation.

  9. Improved security analysis of Fugue-256

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Bagheri, Nasour; Knudsen, Lars Ramkilde

    2011-01-01

    ... transform is mapped with a transform to a 256-bit digest. In this paper, we present some improved as well as new analytical results of Fugue-256 (with length padding). First we improve Aumasson and Phan’s integral distinguisher on the 5.5 rounds of the G transform to 16.5 rounds, thus showing weak diffusion in the G transform. Next we improve the designers’ meet-in-the-middle preimage attack on Fugue-256 from 2^480 time and memory to 2^416. Next we study the security of Fugue-256 against free-start distinguishers and free-start collisions. In this direction, we use an improved variant of the differential ...

  10. Hydrophilic Nb{sup 5+}-immobilized magnetic core–shell microsphere – A novel immobilized metal ion affinity chromatography material for highly selective enrichment of phosphopeptides

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xueni; Liu, Xiaodan; Feng, Jianan [Pharmaceutical Analysis Department, School of Pharmacy, Fudan University, Shanghai 201203 (China); Li, Yan, E-mail: yanli@fudan.edu.cn [Pharmaceutical Analysis Department, School of Pharmacy, Fudan University, Shanghai 201203 (China); Deng, Chunhui [Department of Chemistry and Institutes of Biomedical Sciences, Fudan University, Shanghai 200433 (China); Duan, Gengli [Pharmaceutical Analysis Department, School of Pharmacy, Fudan University, Shanghai 201203 (China)

    2015-06-23

    Highlights: • A new IMAC material (Fe{sub 3}O{sub 4}@PD-Nb{sup 5+}) was synthesized. • The strong magnetic behaviors of the microspheres ensure fast and easy separation. • The enrichment ability was tested by human serum and nonfat milk. • The results were compared with other IMAC materials including the commercial kits. • All results proved the good enrichment ability, especially for multiphosphopeptides. - Abstract: Rapid and selective enrichment of phosphopeptides from complex biological samples is essential and challenging in phosphorylated proteomics. In this work, for the first time, niobium ions were directly immobilized on the surface of polydopamine-coated magnetic microspheres through a facile and effective synthetic route. The Fe{sub 3}O{sub 4}@polydopamine-Nb{sup 5+} (denoted as Fe{sub 3}O{sub 4}@PD-Nb{sup 5+}) microspheres possess merits of high hydrophilicity and good biological compatibility, and demonstrated low limit of detection (2 fmol). The selectivity was also basically satisfactory (β-casein:BSA = 1:500) to capture phosphopeptides. They were also successfully applied for enrichment of phosphopeptides from real biological samples such as human serum and nonfat milk. Compared with Fe{sub 3}O{sub 4}@PD-Ti{sup 4+} microspheres, the Fe{sub 3}O{sub 4}@PD-Nb{sup 5+} microspheres exhibit superior selectivity to multi-phosphorylated peptides, and thus may be complementary to the conventional IMAC materials.

  11. Facile preparation of SiO2/TiO2 composite monolithic capillary column and its application in enrichment of phosphopeptides.

    Science.gov (United States)

    Wang, Shao-Ting; Wang, Meng-Ya; Su, Xin; Yuan, Bi-Feng; Feng, Yu-Qi

    2012-09-18

    A novel SiO(2)/TiO(2) composite monolithic capillary column was prepared by sol-gel technology and successfully applied to enrich phosphopeptides as a metal oxide affinity chromatography (MOAC) material. For the monolith preparation, tetramethoxysilane (TMOS) and tetrabutoxytitanium (TBOT) were used as silica and titania source, respectively, and glycerol was introduced to attenuate the activity of titanium precursor, which provided a mild synthetic condition. The prepared monolith was characterized by energy dispersive X-ray spectroscopy (EDX) and X-ray diffraction (XRD). The results revealed an approximate 1/2 molar ratio of titanium to silica as well as an atom-scale homogeneity in the framework. The scanning electron microscopy (SEM) results demonstrated an excellent anchorage between the column and the inner capillary wall, and nitrogen adsorption-desorption experiments showed a bimodal porosity with a narrow mesopore distribution around 3.6 nm. The prepared monolith was then applied for selective enrichment of phosphopeptides from the digestion mixture of phosphoproteins and bovine serum albumin (BSA) as well as human blood serum, nonfat milk, and egg white using an in-tube solid phase microextraction (SPME) system. Our results showed that SiO(2)/TiO(2) composite monolithic capillary column could efficiently enrich the phosphopeptides from complex matrixes. To the best of our knowledge, this is the first attempt for preparing the silica-metal composite monolithic capillary column, which offers the promising application of the monolith on phosphoproteomics study.

  12. Occupational Analysis: A Continuous Improvement Approach

    National Research Council Canada - National Science Library

    Duffy, Tom

    1998-01-01

    .... In doing so, the Air Force has implemented "Quality Air Force (QAF)" (AF Handbook 90-502). QAF is a leadership commitment that inspires trust, teamwork, and continuous improvement everywhere in the Air Force...

  13. Improving Public Perception of Behavior Analysis.

    Science.gov (United States)

    Freedman, David H

    2016-05-01

    The potential impact of behavior analysis is limited by the public's dim awareness of the field. The mass media rarely cover behavior analysis, other than to echo inaccurate negative stereotypes about control and punishment. The media instead play up appealing but less-evidence-based approaches to problems, a key example being the touting of dubious diets over behavioral approaches to losing excess weight. These sorts of claims distort or skirt scientific evidence, undercutting the fidelity of behavior analysis to scientific rigor. Strategies for better connecting behavior analysis with the public might include reframing the field's techniques and principles in friendlier, more resonant form; pushing direct outcome comparisons between behavior analysis and its rivals in simple terms; and playing up the "warm and fuzzy" side of behavior analysis.

  14. Helping agencies improve their planning analysis techniques.

    Science.gov (United States)

    2011-11-18

    This report summarizes the results of a peer review of the AZTDM. The peer review was : supported by the Travel Model Improvement Program (TMIP), which is sponsored by FHWA. : The peer review of a travel model can serve multiple purposes, including i...

  15. Analysis of improvement ways of creative accounting

    Directory of Open Access Journals (Sweden)

    I.A. Yuhimenko-Nazaruk

    2017-02-01

    Full Text Available The necessity of research aimed at finding ways to improve creative accounting is substantiated. The existing approaches of researchers to eliminating the negative consequences of creative accounting are analyzed. Four main groups of researchers' approaches to the improvement of creative accounting are singled out and analyzed. The common and distinctive features of the researchers' proposals on the improvement of creative accounting are examined. The reasons why the ethical approach to the improvement of creative accounting cannot be applied in Ukraine under modern conditions are substantiated. The necessity of improving the procedural aspects of creative accounting on the basis of the concept of a true and fair view is demonstrated. A classification of the approaches to the construction of accounting methodology in the context of the use of creative accounting is developed. The main provisions of the concept of a true and fair view are studied; their use provides an adequate reflection of the company's economic reality in financial reporting.

  16. Calcium bioaccessibility and uptake by human intestinal like cells following in vitro digestion of casein phosphopeptide-calcium aggregates.

    Science.gov (United States)

    Perego, Silvia; Del Favero, Elena; De Luca, Paola; Dal Piaz, Fabrizio; Fiorilli, Amelia; Cantu', Laura; Ferraretto, Anita

    2015-06-01

    Casein phosphopeptides (CPPs), derived by casein proteolysis, can bind calcium ions and keep them in solution. In vitro studies have demonstrated CPP-induced cell calcium uptake, depending on the formation of (CPP + calcium) complexes and on the degree of differentiation of the intestinal cells. With the present study, we address the persistence of the complexes and of the CPP-induced calcium uptake in intestinal like cells after the digestion process, thus examining their eligibility to serve as nutraceuticals. A calcium-preloaded CPP preparation of commercial origin (Ca-CPPs) was subjected to in vitro digestion. The evolution of the supramolecular structure of the Ca-CPP complexes was studied using laser-light and X-ray scattering. The bioactivity of the pre- and post-digestion Ca-CPPs was determined in differentiated Caco2 and HT-29 cells by video imaging experiments using Fura-2. We found that Ca-CPP aggregates keep a complex supramolecular organization upon digestion, despite getting smaller in size and increasing internal calcium dispersion. Concomitantly and most interestingly, digested Ca-CPPs clearly enhance the uptake of calcium ions, especially in Caco2 cells. In contrast, digestion depletes the ability of post-loaded decalcified-CPPs (Ca-dekCPPs), with a weaker internal structure, to induce calcium uptake. The enhanced bioactivity reached upon digestion strongly suggests a recognized role of Ca-CPPs, in the form used here, as nutraceuticals.

  17. International Space Station Future Correlation Analysis Improvements

    Science.gov (United States)

    Laible, Michael R.; Pinnamaneni, Murthy; Sugavanam, Sujatha; Grygier, Michael

    2018-01-01

    Ongoing modal analyses and model correlation are performed on different configurations of the International Space Station (ISS). These analyses utilize on-orbit dynamic measurements collected using four main ISS instrumentation systems: External Wireless Instrumentation System (EWIS), Internal Wireless Instrumentation System (IWIS), Space Acceleration Measurement System (SAMS), and Structural Dynamic Measurement System (SDMS). Remote Sensor Units (RSUs) are network relay stations that acquire flight data from sensors. Measured data is stored in the Remote Sensor Unit (RSU) until it receives a command to download data via RF to the Network Control Unit (NCU). Since each RSU has its own clock, it is necessary to synchronize measurements before analysis. Imprecise synchronization impacts analysis results. A study was performed to evaluate three different synchronization techniques: (i) measurements visually aligned to analytical time-response data using model comparison, (ii) Frequency Domain Decomposition (FDD), and (iii) lag from cross-correlation to align measurements. This paper presents the results of this study.
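    The third technique above, aligning measurements by the lag at the peak of their cross-correlation, can be illustrated with a short sketch. The function and signals below are hypothetical stand-ins for two sensor channels; they are not the ISS flight data or the study's actual implementation.

```python
import numpy as np

def estimate_lag(reference, signal, fs):
    """Estimate the time offset (in seconds) of `signal` relative to
    `reference` from the peak of their cross-correlation."""
    ref = reference - reference.mean()
    sig = signal - signal.mean()
    xcorr = np.correlate(sig, ref, mode="full")
    lag_samples = np.argmax(xcorr) - (len(ref) - 1)
    return lag_samples / fs

# Hypothetical example: two accelerometer channels sampled at 100 Hz,
# the second one delayed by 0.25 s
fs = 100.0
t = np.arange(0, 10, 1 / fs)
ref = np.sin(2 * np.pi * 1.5 * t)
shifted = np.roll(ref, int(0.25 * fs))
print(estimate_lag(ref, shifted, fs))  # ~0.25
```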

  18. Planning, Conducting, and Documenting Data Analysis for Program Improvement

    Science.gov (United States)

    Winer, Abby; Taylor, Cornelia; Derrington, Taletha; Lucas, Anne

    2015-01-01

    This 2015 document was developed to help technical assistance (TA) providers and state staff define and limit the scope of data analysis for program improvement efforts, including the State Systemic Improvement Plan (SSIP); develop a plan for data analysis; document alternative hypotheses and additional analyses as they are generated; and…

  19. Recent improvements in Thomson scattering data analysis

    International Nuclear Information System (INIS)

    Tillack, M.S.; Lazarus, E.A.

    1980-04-01

    A new profile analysis package for use with the Thomson scattering data on ISX-B has recently been implemented. The primary feature of this package is a weighted least squares fitting of temperature and density data to generate a representative curve, as opposed to the previous hand-fitting technique. The changes will automate the manner in which data are transmitted and manipulated, without affecting the calculational techniques previously used. The computer programs have also been used to estimate the sensitivity of various plasma quantities to the accuracy of the Thomson scattering data
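    The weighted least-squares profile fitting described above can be illustrated generically. The sketch below fits a low-order polynomial to a radial electron-temperature profile with per-point weights of 1/σ; the radii, temperatures and errors are invented for illustration, and this fitting form is not the ISX-B package's actual parameterization.

```python
import numpy as np

# Weighted least-squares fit of a radial electron-temperature profile,
# as a generic stand-in for the profile-fitting step described above.
r = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])         # minor radius (m), hypothetical
te = np.array([950., 900., 780., 600., 380., 150.])   # Te (eV), hypothetical
sigma = np.array([40., 40., 35., 30., 30., 25.])       # measurement errors (eV)

# Fit Te(r) with a low-order polynomial, weighting each point by 1/sigma
coeffs = np.polyfit(r, te, deg=2, w=1.0 / sigma)
fit = np.poly1d(coeffs)
print(fit(0.25))  # interpolated Te at r = 0.25 m
```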

  20. Landsat analysis for improving bamboo forest mapping

    Science.gov (United States)

    Yamamoto, Y.; Suzuoki, Y.; Tsuboi, T.; Iinuma, T.; Iwashita, K.; Nishikawa, H.

    Using satellite data and field data collected periodically over the past years, the vegetation area and the underlying substrate have been mapped. Numerous methods for detecting "vegetation changes" with the aid of digital satellite data have been utilized. Among those methods, vegetation indices such as RVI, NDVI, or SVI are the most suitable for estimating the change. Vegetation indices are mathematical transformations designed to assess the spectral contribution of vegetation to multispectral observations. Bamboo grove, the primary subject of this study, is well known as a rapidly growing plant, and, on the other hand, the expansion of bamboo groves has been discussed as a regional environmental issue. Change detection of the bamboo-covered area would help inform preventive countermeasures against bamboo expansion into unwanted areas. To detect the bamboo-covered area accurately, the optimal vegetation indices and band combinations were established through statistical analysis based on continuous Landsat data.
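    For reference, the vegetation indices mentioned above are simple band combinations. The snippet below computes NDVI and RVI from red and near-infrared reflectances; the two-pixel example values are hypothetical.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED).

    `red` and `nir` are reflectance arrays from the corresponding Landsat
    bands; values near +1 indicate dense green vegetation."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + 1e-10)  # small epsilon avoids division by zero

def rvi(red, nir):
    """Ratio Vegetation Index: NIR / RED."""
    return np.asarray(nir, dtype=float) / (np.asarray(red, dtype=float) + 1e-10)

# Hypothetical two-pixel example: dense bamboo canopy vs. bare soil
print(ndvi([0.05, 0.20], [0.45, 0.25]))   # ~[0.80, 0.11]
print(rvi([0.05, 0.20], [0.45, 0.25]))
```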

  1. Improvement of product design process by knowledge value analysis

    OpenAIRE

    XU, Yang; BERNARD, Alain; PERRY, Nicolas; LAROCHE, Florent

    2013-01-01

    Nowadays, design activities remain the core issue for global product development. As knowledge is more and more integrated, effective analysis of knowledge value becomes very useful for the improvement of product design processes. This paper aims at proposing a framework of knowledge value analysis in the context of product design process. By theoretical analysis and case study, the paper illustrates how knowledge value can be calculated and how the results can help the improvement of product...

  2. Atomic force microscopic comparison of remineralization with casein-phosphopeptide amorphous calcium phosphate paste, acidulated phosphate fluoride gel and iron supplement in primary and permanent teeth: An in-vitro study

    Directory of Open Access Journals (Sweden)

    Nikita Agrawal

    2014-01-01

    Full Text Available Context: Demineralization of tooth by erosion is caused by frequent contact between the tooth surface and acids present in soft drinks. Aim: The objective of the present study was to evaluate the remineralization potential of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) paste, 1.23% acidulated phosphate fluoride (APF) gel and an iron supplement on dental erosion by soft drinks in human primary and permanent enamel using atomic force microscopy (AFM). Materials and Methods: Specimens were made from 15 extracted primary and 15 permanent teeth, which were randomly divided into three treatment groups: CPP-ACP paste, APF gel and iron supplement. AFM was used for baseline readings, followed by a demineralization and remineralization cycle. Results and Statistics: Almost all groups of samples showed remineralization, that is, a reduction in surface roughness, which was greatest with CPP-ACP paste. Statistical analysis was performed using one-way ANOVA and the Mann-Whitney U-test with P < 0.05. Conclusions: It can be concluded that the application of CPP-ACP paste is effective in preventing dental erosion from soft drinks.

  3. Casein phosphopeptide-amorphous calcium phosphate incorporated into sugar confections inhibits the progression of enamel subsurface lesions in situ.

    Science.gov (United States)

    Walker, G D; Cai, F; Shen, P; Adams, G G; Reynolds, C; Reynolds, E C

    2010-01-01

    Casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) has been demonstrated to exhibit anticariogenic activity in randomized, controlled clinical trials of sugar-free gum and a tooth cream. Two randomized, double-blind, crossover studies were conducted to investigate the potential of CPP-ACP added to hard candy confections to slow the progression of enamel subsurface lesions in an in situ model. The confections studied were: (1) control sugar (65% sucrose + 33% glucose syrup); (2) control sugar-free; (3) sugar + 0.5% (w/w) CPP-ACP; (4) sugar + 1.0% (w/w) CPP-ACP; (5) sugar-free + 0.5% (w/w) CPP-ACP. Participants (10 and 14 in study 1 and 2) wore a removable palatal appliance containing enamel half-slabs with subsurface lesions, except for meals and oral hygiene procedures, and consumed 1 confection 6 times a day for 10 days. The enamel half-slabs were inset to allow the development of plaque on the enamel surface. Participants rested for 1 week before crossing over to another confection. The appliances were stored in a humid container at 37 degrees C when not in the mouth. After each treatment period, the enamel half-slabs were removed, paired with their demineralized control half-slabs, embedded, sectioned and then analysed using transverse microradiography. In both studies consumption of the control sugar confection resulted in significant demineralization (progression) of the enamel subsurface lesions. However, consumption of the sugar confections containing CPP-ACP did not result in lesion progression, but in fact in significant remineralization (regression) of the lesions. Remineralization by consumption of the sugar + 1.0% CPP-ACP confection was significantly greater than that obtained with the sugar-free confection. Copyright 2010 S. Karger AG, Basel.

  4. Casein Phosphopeptide-Amorphous Calcium Phosphate Reduces Streptococcus mutans Biofilm Development on Glass Ionomer Cement and Disrupts Established Biofilms.

    Directory of Open Access Journals (Sweden)

    Stuart G Dashper

    Full Text Available Glass ionomer cements (GIC) are dental restorative materials that are suitable for modification to help prevent dental plaque (biofilm) formation. The aim of this study was to determine the effects of incorporating casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) into a GIC on the colonisation and establishment of Streptococcus mutans biofilms and the effects of aqueous CPP-ACP on established S. mutans biofilms. S. mutans biofilms were either established in flow cells before a single ten min exposure to 1% w/v CPP-ACP treatment or cultured in static wells or flow cells with either GIC or GIC containing 3% w/w CPP-ACP as the substratum. The biofilms were then visualised using confocal laser scanning microscopy after BacLight LIVE/DEAD staining. A significant decrease in biovolume and average thickness of S. mutans biofilms was observed in both static and flow cell assays when 3% CPP-ACP was incorporated into the GIC substratum. A single ten min treatment with aqueous 1% CPP-ACP resulted in a 58% decrease in biofilm biomass and thickness of established S. mutans biofilms grown in a flow cell. The treatment also significantly altered the structure of these biofilms compared with controls. The incorporation of 3% CPP-ACP into GIC significantly reduced S. mutans biofilm development, indicating another potential anticariogenic mechanism of this material. Additionally, aqueous CPP-ACP disrupted established S. mutans biofilms. The use of CPP-ACP containing GIC combined with regular CPP-ACP treatment may lower the S. mutans challenge.

  5. Casein Phosphopeptide-Amorphous Calcium Phosphate Reduces Streptococcus mutans Biofilm Development on Glass Ionomer Cement and Disrupts Established Biofilms.

    Science.gov (United States)

    Dashper, Stuart G; Catmull, Deanne V; Liu, Sze-Wei; Myroforidis, Helen; Zalizniak, Ilya; Palamara, Joseph E A; Huq, N Laila; Reynolds, Eric C

    2016-01-01

    Glass ionomer cements (GIC) are dental restorative materials that are suitable for modification to help prevent dental plaque (biofilm) formation. The aim of this study was to determine the effects of incorporating casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) into a GIC on the colonisation and establishment of Streptococcus mutans biofilms and the effects of aqueous CPP-ACP on established S. mutans biofilms. S. mutans biofilms were either established in flow cells before a single ten min exposure to 1% w/v CPP-ACP treatment or cultured in static wells or flow cells with either GIC or GIC containing 3% w/w CPP-ACP as the substratum. The biofilms were then visualised using confocal laser scanning microscopy after BacLight LIVE/DEAD staining. A significant decrease in biovolume and average thickness of S. mutans biofilms was observed in both static and flow cell assays when 3% CPP-ACP was incorporated into the GIC substratum. A single ten min treatment with aqueous 1% CPP-ACP resulted in a 58% decrease in biofilm biomass and thickness of established S. mutans biofilms grown in a flow cell. The treatment also significantly altered the structure of these biofilms compared with controls. The incorporation of 3% CPP-ACP into GIC significantly reduced S. mutans biofilm development, indicating another potential anticariogenic mechanism of this material. Additionally, aqueous CPP-ACP disrupted established S. mutans biofilms. The use of CPP-ACP containing GIC combined with regular CPP-ACP treatment may lower the S. mutans challenge.

  6. Analysis of event data recorder data for vehicle safety improvement

    Science.gov (United States)

    2008-04-01

    The Volpe Center performed a comprehensive engineering analysis of Event Data Recorder (EDR) data supplied by the National Highway Traffic Safety Administration (NHTSA) to assess its accuracy and usefulness in crash reconstruction and improvement of ...

  7. Improving Department of Defense Global Distribution Performance Through Network Analysis

    Science.gov (United States)

    2016-06-01

    ... maximum time allowed by SDDB business rules, 365 days, and run the improvement algorithm again. Using the budget of 20 improvement days for both the ... [Master's thesis, June 2016, by Justin A. Thompson; thesis advisor: Samuel E. Buttrey.]

  8. Improving Quality Using Architecture Fault Analysis with Confidence Arguments

    Science.gov (United States)

    2015-03-01

    By Peter H. Feiler, Charles B. Weinstock, and John B. Goodenough. ... argument are represented explicitly. As reasons for doubt, called defeaters, are removed, confidence in system claims increases [Goodenough 2013, Weinstock ...]. Referenced report: Feiler, Peter; Goodenough, John; Gurfinkel, Arie; Weinstock, Charles; & Wrage, Lutz. Reliability Improvement and Validation Framework (CMU/SEI-2012-SR-013).

  9. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  10. Improved time complexity analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2015-01-01

    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm with population size μ ≤ n^(1/8−ε) requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations

  11. an improved structural model for seismic analysis of tall frames

    African Journals Online (AJOL)

    Dr Obe

    ABSTRACT. This paper proposed and examined an improved structural model that overcomes the deficiencies of the shear frame model by considering the effects of flexible horizontal members and column axial loads in seismic analysis of multi-storey frames. Matrix displacement method of analysis is used on the basis of ...

  12. De Novo Sequencing of Tryptic Phosphopeptides Using Matrix-Assisted Laser Desorption/Ionization Based Tandem Mass Spectrometry with Hydrogen Atom Attachment.

    Science.gov (United States)

    Asakawa, Daiki; Takahashi, Hidenori; Iwamoto, Shinichi; Tanaka, Koichi

    2018-02-20

    Phosphorylation is the most abundant protein modification, and tandem mass spectrometry (MS/MS) with radical-based fragmentation techniques has proven to be a promising method for phosphoproteomic applications, owing to its ability to determine phosphorylation sites on proteins. The radical-induced fragmentation technique involves the attachment or abstraction of hydrogen to peptides in an ion trap mass spectrometer, in a process called hydrogen attachment/abstraction dissociation (HAD), which has only been recently developed. In the present investigation, we have analyzed model phosphopeptides and phosphoprotein digests using HAD-MS/MS, combined with matrix-assisted laser desorption/ionization (MALDI), in order to demonstrate the usefulness of the HAD-MS/MS-based analytical method. The tryptic peptides were categorized as arginine- and lysine-terminated peptides, and MALDI HAD-MS/MS is found to facilitate the sequencing of arginine-terminated tryptic peptides, because of the selective observation of C-terminal side fragment ions. In contrast, MALDI HAD-MS/MS of lysine-terminated tryptic peptides produced both N- and C-terminal side fragments, such that the mass spectra were complex. The guanidination of peptide converted lysine into homoarginine, which facilitated the interpretation of MALDI HAD-MS/MS mass spectra. The present method was useful for de novo sequencing of tryptic phosphopeptides.

  13. Combined effect of paste containing casein phosphopeptide-amorphous calcium phosphate and fluoride on enamel lesions: an in vitro pH-cycling study.

    Science.gov (United States)

    Ogata, Kiyokazu; Warita, Sachie; Shimazu, Kisaki; Kawakami, Tomomi; Aoyagi, Kyoko; Karibe, Hiroyuki

    2010-01-01

    The purpose of this study was to show through enamel remineralization that a combination of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and fluoride is better than fluoride alone when the processing time for remineralization is short. The bovine enamel slabs (n=28) were subjected to pH cycling for 4 days. Paste containing CPP-ACP and/or fluoride was applied for 30 minutes daily. The sections were observed using a polarizing microscope and microradiographs. In the groups treated solely with sodium fluoride (NaF) solution, tissue loss on the enamel surface was observed. On the other hand, in the groups that had been treated with a mixture of the NaF solution and CPP-ACP, the enamel surface was maintained. These results show that casein phosphopeptide-amorphous calcium phosphate-containing paste has the ability to maintain the enamel surface; the combined use of CPP-ACP paste and fluoride enhances this ability, thereby reducing demineralization.

  14. Improvement of software for analysis of visual meteor data

    Science.gov (United States)

    Veljković, K.; Ivanović, I.

    2015-01-01

    In this paper, we present improvements made on our software for the analysis of visual meteor data. R package MetFns received major updates. Selection filters and algorithms for calculation of zenithal hourly rate and population index, as well as accompanying graphics, are corrected and their performance is improved. Web application MetRApp contains a completely remade user interface and some new features. Also, calculation performances are optimized.
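    For context, the zenithal hourly rate that MetFns computes follows the standard visual-meteor reduction. The sketch below implements that textbook formula; the function name, argument names and the example interval are illustrative assumptions, not code from the R package.

```python
import math

def zhr(n_meteors, t_eff_hours, radiant_alt_deg, lim_mag, r, k_obstruction=0.0):
    """Zenithal Hourly Rate from one visual observing interval (sketch of the
    standard IMO-style reduction):

        ZHR = N * F * r^(6.5 - lm) / (T_eff * sin(h_R))

    where F = 1 / (1 - k) corrects for field-of-view obstruction."""
    f = 1.0 / (1.0 - k_obstruction)
    return (n_meteors * f * r ** (6.5 - lim_mag)) / (
        t_eff_hours * math.sin(math.radians(radiant_alt_deg)))

# Hypothetical interval: 24 meteors in 1.5 h, radiant at 55 deg, lm = 6.2, r = 2.2
print(round(zhr(24, 1.5, 55.0, 6.2, 2.2), 1))
```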

  15. Improving Automatic Text Classification by Integrated Feature Analysis

    Science.gov (United States)

    Busagala, Lazaro S. P.; Ohyama, Wataru; Wakabayashi, Tetsushi; Kimura, Fumitaka

    Feature transformation in automatic text classification (ATC) can lead to better classification performance. Furthermore, dimensionality reduction is important in ATC. Hence, feature transformation and dimensionality reduction are performed to obtain lower computational costs with improved classification performance. However, feature transformation and dimensionality reduction techniques have conventionally been considered in isolation. In such cases, classification performance can be lower than when they are integrated. Therefore, we propose an integrated feature analysis approach which improves classification performance at lower dimensionality. Moreover, we propose a multiple feature integration technique which also improves classification effectiveness.
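    A generic way to integrate feature transformation with dimensionality reduction, in the spirit of (though not identical to) the approach proposed above, is to chain TF-IDF weighting, truncated SVD and a linear classifier in one pipeline, as sketched below with scikit-learn; the dataset choice and parameters are illustrative assumptions.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Feature transformation (TF-IDF weighting) combined with dimensionality
# reduction (truncated SVD / latent semantic analysis) in a single pipeline,
# rather than treating the two steps in isolation.
train = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])
test = fetch_20newsgroups(subset="test", categories=["sci.space", "rec.autos"])

clf = make_pipeline(
    TfidfVectorizer(sublinear_tf=True, stop_words="english"),
    TruncatedSVD(n_components=100, random_state=0),
    LogisticRegression(max_iter=1000),
)
clf.fit(train.data, train.target)
print("accuracy:", clf.score(test.data, test.target))
```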

  16. Improvements in analysis techniques for segmented mirror arrays

    Science.gov (United States)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, which allows surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking analysis has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  17. Can an understanding of transactional analysis improve postgraduate clinical supervision?

    Science.gov (United States)

    Sivan, Manoj; McKimm, Judy; Held, Sam

    2011-01-01

    Clinical supervision in postgraduate medical training is vital in producing competent and safe health-care practitioners. Effective communication between supervisors and trainees at an interpersonal and professional level determines the quality of the supervision process. Transactional analysis, a theory of personality, can be used to enhance understanding of interpersonal interactions and improve the outcomes of clinical training.

  18. Costs and returns analysis of improved and alternative cassava ...

    African Journals Online (AJOL)

    The specific objective of the study was an analysis of the costs and returns of improved and alternative technologies available to farmers in the study area and of their level of adoption of the new technologies. Data were collected from a random sample of 250 farmers and 30 extension staff in the three (3) agricultural zones ...

  19. Sensitivity analysis and its application for dynamic improvement

    Indian Academy of Sciences (India)

    Keywords. Sensitivity analysis; dynamic improvement; structural modification; laser beam printer; motorbike; disc drive; mechatronics; automobile engine. Abstract. In order to determine appropriate points where the natural frequency or mode shape under consideration can be effectively modified by structural modification, the ...

  20. Improved Runtime Analysis of the Simple Genetic Algorithm

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2013-01-01

    A runtime analysis of the Simple Genetic Algorithm (SGA) for the OneMax problem has recently been presented proving that the algorithm requires exponential time with overwhelming probability. This paper presents an improved analysis which overcomes some limitations of our previous one. Firstly......, the new result holds for population sizes up to μ = n^(1/4−ε), which is an improvement up to a power of 2 larger. Secondly, we present a technique to bound the diversity of the population that does not require a bound on its bandwidth. Apart from allowing a stronger result, we believe this is a major...... improvement towards the reusability of the techniques in future systematic analyses of GAs. Finally, we consider the more natural SGA using selection with replacement rather than without replacement although the results hold for both algorithmic versions. Experiments are presented to explore the limits

  1. Thermal hydraulic analysis of the JMTR improved LEU-core

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, Toshio; Nagao, Yoshiharu; Komukai, Bunsaku; Naka, Michihiro; Fujiki, Kazuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Takeda, Takashi [Radioactive Waste Management and Nuclear Facility Decommissioning Technology Center, Tokai, Ibaraki (Japan)

    2003-01-01

    After the investigation of the new core arrangement for the JMTR reactor in order to enhance the fuel burn-up and consequently extend the operation period, the ''improved LEU core'' that utilized 2 additional fuel elements instead of formerly installed reflector elements, was adopted. This report describes the results of the thermal-hydraulic analysis of the improved LEU core as a part of safety analysis for the licensing. The analysis covers steady state, abnormal operational transients and accidents, which were described in the annexes of the licensing documents as design bases events. Calculation conditions for the computer codes were conservatively determined based on the neutronic analysis results and others. The results of the analysis, that revealed the safety criteria were satisfied on the fuel temperature, DNBR and primary coolant temperature, were used in the licensing. The operation license of the JMTR with the improved LEU core was granted in March 2001, and the reactor operation with new core started in November 2001 as 142nd operation cycle. (author)

  2. Improving preconception health and care: a situation analysis.

    Science.gov (United States)

    Goodfellow, Ashley; Frank, John; McAteer, John; Rankin, Jean

    2017-08-23

    The purpose of this situation analysis was to explore the views of health and non-health professionals working with women of childbearing age on current and future delivery of preconception care in one National Health Service (NHS) Board area in Scotland. The situation analysis was undertaken using a mixed methods approach. Six focus groups were conducted organised by profession - general practitioners (GPs), practice nurses, health visitors, family nurses, guidance teachers and youth workers. Existing evidence of effective preconception care interventions informed focus group guides. A survey was undertaken with community pharmacists which provided qualitative data for analysis. Focus group transcripts were analysed by two researchers using a thematic analysis approach. There was lack of awareness of preconception health and its importance amongst the target group. Levels of unplanned pregnancy hampered efforts to deliver interventions. Professional knowledge, capacity and consistency of practice were viewed as challenges, as was individual compliance with preconception care advice. Improvement requires multifaceted action, including ensuring the school curriculum adequately prepares adolescents for future parenthood, increasing awareness through communication and marketing, supporting professional knowledge and practice and capitalising on existing opportunities for preconception care, and ensuring services are equitable and targeted to need. Delivery of preconception care needs to be improved both before and between pregnancies to improve outcomes for women and infants. Action is required at individual, organisational and community levels to ensure this important issue is at the forefront of preventative care and preventative spending.

  3. Domain analysis and modeling to improve comparability of health statistics.

    Science.gov (United States)

    Okada, M; Hashimoto, H; Ohida, T

    2001-01-01

    Health statistics is an essential element in improving the ability of managers of health institutions, healthcare researchers, policy makers, and health professionals to formulate appropriate courses of reaction and to make decisions based on evidence. To ensure adequate health statistics, standards are of critical importance. A study on healthcare statistics domain analysis is underway in an effort to improve the usability and comparability of health statistics. The ongoing study focuses on structuring the domain knowledge and making the knowledge explicit, with a data element dictionary as the core. Supplemental to the dictionary are a domain term list, a terminology dictionary, and a data model to help organize the concepts constituting the health statistics domain.

  4. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  5. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  6. MEK-1 phosphorylation by MEK kinase, Raf, and mitogen-activated protein kinase: analysis of phosphopeptides and regulation of activity.

    OpenAIRE

    Gardner, A M; Vaillancourt, R R; Lange-Carter, C A; Johnson, G L

    1994-01-01

    MEK-1 is a dual threonine and tyrosine recognition kinase that phosphorylates and activates mitogen-activated protein kinase (MAPK). MEK-1 is in turn activated by phosphorylation. Raf and MAPK/extracellular signal-regulated kinase kinase (MEKK) independently phosphorylate and activate MEK-1. Recombinant MEK-1 is also capable of autoactivation. Purified recombinant wild type MEK-1 and a mutant kinase inactive MEK-1 were used as substrates for MEKK, Raf, and autophosphorylation. MEK-1 phosphory...

  7. Training needs analysis for MSMEs: how to improve training effectiveness

    Science.gov (United States)

    Rohayati, Y.; Wulandari, S.

    2017-12-01

    The study aims to analyze training needs for MSMEs in the area of Kabupaten Bandung, selecting the case of MSMEs joined in the Association for Agricultural Product Process and focusing on marketing as the main topic of the training. The needs analysis was required to improve training participation and effectiveness. Both aspects are important to notice, since making MSMEs participate in training is not an easy task. The needs analysis was also carried out to anticipate participants' perception that the training offers no benefit or is ineffective because it does not meet their needs, even though it is actually intended to help MSMEs improve the marketing knowledge expected to lead to their success. This research involved 100 MSMEs with business ages ranging from less than five years to more than 15 years. The participating MSMEs were dominated by businesses targeting local marketing areas. The data were collected by survey using a judgmental sampling technique. Descriptive analysis leads to the conclusion that marketing training materials for these MSMEs should focus on improving marketing skills such as product development, sales, and the use of marketing media, as well as on legal aspects such as the need for certification and product branding. The results of the study also indicate a need for training supplemented by visits to more successful MSMEs as well as practice through on-the-job training methods.

  8. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA which is suited to handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how degraded plant conditions affect the operator's cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences carried out to investigate the applicability of the HRA method developed. HEPs for the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results obtained with the two methods are compared to highlight their differences and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of the factors to be considered in the evaluation of human errors, incorporation of degraded plant safety conditions into HRA, and investigation of HEPs affected by the contents of operator tasks were carried out to improve the HRA method, which can integrate an operator cognitive action model into the ATHENA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  9. Requirements Analysis for Future Satellite Gravity Mission Improved-GRACE

    Science.gov (United States)

    Zheng, Wei; Hsu, Houtse; Zhong, Min; Yun, Meijuan

    2015-01-01

    The Earth's gravitational field from the Next-Generation Gravimetry Mission (NGGM) and the Improved-Gravity Recovery and Climate Experiment (Improved-GRACE) complete up to degree and order 120 is recovered by a closed-loop numerical simulation using different orbital altitudes of 325 and 300 km, different orbital inclinations of 96.78° and 89° and different inter-satellite ranges of 10 and 50 km. The preferred orbit parameters of the future twin Improved-GRACE satellites are proposed based on the results of the simulations in this study. The research results show: (1) In order to achieve the scientific objectives, which require that the accuracy of the next-generation Earth gravity field models is at least one order of magnitude better than that of the current gravity models, the orbit design at an altitude of 300 ± 50 km is recommended for the future Improved-GRACE mission. This altitude is determined by a trade-off analysis between the recovery accuracy of the gravity field and the operational lifetime of the satellite system. (2) Because the accuracy of the Earth's gravitational field from NGGM with an orbital inclination of 96.78° will be decreased due to a lack of the observation data in the polar areas, we propose that a near-polar orbit (inclination of 89° ± 2°) is a preferable selection for the future twin Improved-GRACE satellites. (3) The future Improved-GRACE mission has to adopt an inter-satellite range of 50 ± 10 km, because the common signals of the Earth's gravitational field between the twin NGGM satellites will be substantially eliminated with a shorter inter-satellite range of 10 km. With these orbit design parameters, the Earth's gravitational field from the Improved-GRACE mission is precisely recovered complete up to degree and order 120 with a cumulative geoid height error of about 0.7 mm.

  10. Spiral analysis-improved clinical utility with center detection.

    Science.gov (United States)

    Wang, Hongzhi; Yu, Qiping; Kurtis, Mónica M; Floyd, Alicia G; Smith, Whitney A; Pullman, Seth L

    2008-06-30

    Spiral analysis is a computerized method that measures human motor performance from handwritten Archimedean spirals. It quantifies normal motor activity, and detects early disease as well as dysfunction in patients with movement disorders. The clinical utility of spiral analysis is based on kinematic and dynamic indices derived from the original spiral trace, which must be detected and transformed into mathematical expressions with great precision. Accurately determining the center of the spiral and reducing spurious low frequency noise caused by center selection error is important to the analysis. Handwritten spirals do not all start at the same point, even when marked on paper, and drawing artifacts are not easily filtered without distortion of the spiral data and corruption of the performance indices. In this report, we describe a method for detecting the optimal spiral center and reducing the unwanted drawing artifacts. To demonstrate overall improvement to spiral analysis, we study the impact of the optimal spiral center detection in different frequency domains separately and find that it notably improves the clinical spiral measurement accuracy in low frequency domains.
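
    The clinical indices above depend on locating the spiral center precisely. As a purely illustrative sketch (not the authors' published algorithm; the function names and synthetic trace are hypothetical), center detection can be posed as a search for the center that minimizes the residual of a least-squares Archimedean-spiral fit r = a + b·theta to the drawn trace:

        import numpy as np
        from scipy.optimize import minimize

        def spiral_fit_residual(center, x, y):
            """Mean squared residual of a least-squares fit r = a + b*theta
            for a given candidate center (illustrative, not the published method)."""
            cx, cy = center
            dx, dy = x - cx, y - cy
            r = np.hypot(dx, dy)
            theta = np.unwrap(np.arctan2(dy, dx))   # continuous angle along the trace
            A = np.column_stack([np.ones_like(theta), theta])
            coef, *_ = np.linalg.lstsq(A, r, rcond=None)
            return np.mean((r - A @ coef) ** 2)

        def detect_center(x, y):
            """Optimise the spiral center starting from the centroid of the trace."""
            x0 = np.array([x.mean(), y.mean()])
            return minimize(spiral_fit_residual, x0, args=(x, y), method="Nelder-Mead").x

        # synthetic spiral with a small centre offset and drawing noise
        t = np.linspace(0, 6 * np.pi, 800)
        x = 2.0 + (1.0 + 0.5 * t) * np.cos(t) + np.random.normal(0, 0.05, t.size)
        y = -1.0 + (1.0 + 0.5 * t) * np.sin(t) + np.random.normal(0, 0.05, t.size)
        print(detect_center(x, y))   # should land close to (2, -1)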

  11. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

    OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was used...... to model correlation between 19 register-based indicators for the quality of non-steroidal anti-inflammatory drug (NSAID) prescribing. RESULTS: The correlation between indicators ranged widely from 0 to 0.93. Factor analysis revealed three dimensions of quality: (1) "Coxib preference", comprising...... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns....
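
    The record does not include the authors' statistical code. The following sketch, with invented data, only shows the general shape of such an analysis in Python: standardize a practice-by-indicator matrix and extract a small number of latent quality dimensions. Note that scikit-learn's FactorAnalysis is a maximum-likelihood factor model, used here as a substitute for the principal factor extraction applied in the study.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis
        from sklearn.preprocessing import StandardScaler

        # toy matrix: rows = general practices, columns = 19 prescribing indicators
        rng = np.random.default_rng(0)
        X = rng.normal(size=(180, 19))

        X_std = StandardScaler().fit_transform(X)

        # extract three latent quality dimensions, as in the study's three factors
        fa = FactorAnalysis(n_components=3, random_state=0).fit(X_std)
        loadings = fa.components_.T          # indicator-by-factor loading matrix (19 x 3)
        print(np.round(loadings, 2))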

  12. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

    The objective of this paper is to present an innovative design for improving the recognition of a captured QR code image with blur through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality. Focus is an important factor that affects the quality of the image. This study discusses out-of-focus QR code images and aims to improve the recognition of the contents of the QR code image. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image, and this method is also used in this investigation to improve the recognition of a captured QR code image. A blurred QR code image is separated into nine blur levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image, and the nine reconstructed QR code images obtained with these methods are then compared. The final experimental results indicate improvements in identification.
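
    A pillbox (circular averaging) filter is a normalized disk-shaped convolution kernel. The following minimal sketch, assuming NumPy/SciPy and an arbitrary set of nine blur radii (the paper's actual blur levels are not reproduced here), simulates an out-of-focus capture:

        import numpy as np
        from scipy.ndimage import convolve

        def pillbox_kernel(radius):
            """Circular averaging (pillbox) kernel with the given radius in pixels."""
            size = 2 * int(np.ceil(radius)) + 1
            yy, xx = np.mgrid[:size, :size] - size // 2
            kernel = ((xx ** 2 + yy ** 2) <= radius ** 2).astype(float)
            return kernel / kernel.sum()

        def defocus(image, radius):
            """Simulate an out-of-focus capture by convolving with a pillbox kernel."""
            return convolve(image, pillbox_kernel(radius), mode="nearest")

        image = np.random.rand(128, 128)     # stand-in for a captured QR code image
        blurred_levels = [defocus(image, r) for r in range(1, 10)]   # nine blur levels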

  13. Security analysis and improvements to the PsychoPass method.

    Science.gov (United States)

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

    In a recent paper, Pietro Cipresso et al. proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. The objective was to perform a security analysis of the PsychoPass method and outline its limitations and possible improvements. We used brute-force analysis and dictionary-attack analysis of the PsychoPass method to outline its weaknesses. The first issue with the PsychoPass method is that it requires the password to be reproduced on the same keyboard layout as was used to generate it. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 key distances apart. The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.
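
    The strength claims above rest on exhaustive-search arithmetic. The sketch below shows only the form of such a calculation; the per-key choice count, password length and guessing rate are illustrative assumptions, not the threat model or figures of the paper.

        import math

        def worst_case_years(choices_per_key, n_keys, guesses_per_second=1e12):
            """Worst-case exhaustive-search time for a password of n_keys independent,
            uniformly chosen keys (all parameter values here are assumptions)."""
            bits = n_keys * math.log2(choices_per_key)
            seconds = (2.0 ** bits) / guesses_per_second
            return bits, seconds / (3600 * 24 * 365.25)

        bits, years = worst_case_years(choices_per_key=94, n_keys=12)
        print(f"{bits:.1f} bits of entropy, about {years:,.0f} years at 1e12 guesses/s")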

  14. Micromechanical analysis of polyacrylamide-modified concrete for improving strengths

    International Nuclear Information System (INIS)

    Sun Zengzhi; Xu Qinwu

    2008-01-01

    This paper studies how polyacrylamide (PAM) alters the physicochemical and mechanical properties of concrete. The microstructure of PAM-modified concrete and the physicochemical reaction between PAM and concrete were studied through scanning electron microscope (SEM), differential thermal analysis (DTA), thermal gravimetric analysis (TGA), and infrared spectrum analysis. Meanwhile, the workability and strengths of cement paste and concrete were tested. PAM's modification mechanism was also discussed. Results indicate that PAM reacts with the Ca2+ and Al3+ cations produced by concrete hydration to form ionic compounds and reduce the crystallization of Ca(OH)2, acting as a flexible filler and reinforcement in the porosity of concrete and, therefore, improving concrete's engineering properties. PAM also significantly alters the microstructure at the aggregate-cement interfacial transition zone. Mechanical testing results indicate that the fluidity of cement paste decreases initially, then increases, and decreases again with increasing PAM content. PAM can effectively improve the flexural strength, bonding strength, dynamic impact resistance, and fatigue life of concrete, though it reduces the compressive strength to some extent.

  15. Modified paraffin wax for improvement of histological analysis efficiency.

    Science.gov (United States)

    Lim, Jin Ik; Lim, Kook-Jin; Choi, Jin-Young; Lee, Yong-Keun

    2010-08-01

    Paraffin wax is usually used as an embedding medium for histological analysis of natural tissue. However, it is not easy to obtain a sufficient number of satisfactory sectioned slices because of the difference in mechanical properties between the paraffin and the embedded tissue. We describe a modified paraffin wax that can improve the histological analysis efficiency of natural tissue, composed of paraffin and ethylene vinyl acetate (EVA) resin (0, 3, 5, and 10 wt %). The softening temperature of the paraffin/EVA media was similar to that of paraffin (50-60 degrees C). The paraffin/EVA media dissolved completely in xylene after 30 min at 50 degrees C. Physical properties such as the amount of load under the same compressive displacement, elastic recovery, and crystal intensity increased with increased EVA content. The 5 wt % EVA medium was regarded as the optimal composition, based on the sectioning efficiency measured by the number of unimpaired sectioned slices, the amount of load under the same compressive displacement, and the elastic recovery test. Based on staining tests of sectioned slices embedded in the 5 wt % EVA medium with hematoxylin and eosin (H&E), Masson trichrome (MT), and other stains, it was concluded that the modified paraffin wax can improve the histological analysis efficiency with various natural tissues. (c) 2010 Wiley-Liss, Inc.

  16. Toward improved analysis of concentration data: Embracing nondetects.

    Science.gov (United States)

    Shoari, Niloofar; Dubé, Jean-Sébastien

    2018-03-01

    Various statistical tests on concentration data serve to support decision-making regarding characterization and monitoring of contaminated media, assessing exposure to a chemical, and quantifying the associated risks. However, the routine statistical protocols cannot be directly applied because of challenges arising from nondetects or left-censored observations, which are concentration measurements below the detection limit of measuring instruments. Despite the existence of techniques based on survival analysis that can adjust for nondetects, these are seldom taken into account properly. A comprehensive review of the literature showed that managing policies regarding analysis of censored data do not always agree and that guidance from regulatory agencies may be outdated. Therefore, researchers and practitioners commonly resort to the most convenient way of tackling the censored data problem by substituting nondetects with arbitrary constants prior to data analysis, although this is generally regarded as a bias-prone approach. Hoping to improve the interpretation of concentration data, the present article aims to familiarize researchers in different disciplines with the significance of left-censored observations and provides theoretical and computational recommendations (under both frequentist and Bayesian frameworks) for adequate analysis of censored data. In particular, the present article synthesizes key findings from previous research with respect to 3 noteworthy aspects of inferential statistics: estimation of descriptive statistics, hypothesis testing, and regression analysis. Environ Toxicol Chem 2018;37:643-656. © 2017 SETAC.
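
    One survival-analysis-style alternative to substitution is a censored maximum-likelihood fit, in which each nondetect contributes the cumulative probability below its detection limit. A minimal sketch, assuming lognormally distributed concentrations and invented data (not from the article):

        import numpy as np
        from scipy import stats, optimize

        # toy concentration data: detected values plus four nondetects ("<0.5")
        observed = np.array([1.2, 0.8, 2.5, 3.1, 0.6, 1.9])
        det_limits = np.array([0.5, 0.5, 0.5, 0.5])

        def neg_log_likelihood(params):
            """Lognormal likelihood: density for detects, CDF at the detection
            limit for nondetects (standard censored-data likelihood, sketch only)."""
            mu, log_sigma = params
            sigma = np.exp(log_sigma)
            ll_detects = stats.norm.logpdf(np.log(observed), mu, sigma) - np.log(observed)
            ll_censored = stats.norm.logcdf(np.log(det_limits), mu, sigma)
            return -(ll_detects.sum() + ll_censored.sum())

        res = optimize.minimize(neg_log_likelihood, x0=[0.0, 0.0], method="Nelder-Mead")
        mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
        print(f"estimated mean concentration: {np.exp(mu_hat + sigma_hat**2 / 2):.2f}")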

  17. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

    To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs). Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.

  18. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify its critical infrastructures. This approach improves computational efficiency and allows for application to large-scale road networks. The research involves defining the vulnerability concept, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  19. An improved convergence analysis of smoothed aggregation algebraic multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Brezina, Marian [Univ. of Colorado, Boulder, CO (United States). Dept. of Applied Mathematics; Vaněk, Petr [University of West Bohemia (Czech Republic). Dept. of Mathematics; Vassilevski, Panayot S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing

    2011-03-02

    We present an improved analysis of the smoothed aggregation (SA) algebraic multigrid method (AMG), extending the original proof in [SA] and its modification in [Va08]. The new result imposes fewer restrictions on the aggregates, which makes it easier to verify in practice. Also, we extend a result in [Van] that allows us to use aggressive coarsening at all levels, due to the special properties of the polynomial smoother that we use and analyze, and thus provide a multilevel convergence estimate with bounds independent of the coarsening ratio.

  20. Improved method and apparatus for chromatographic quantitative analysis

    Science.gov (United States)

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  1. MITG post-test analysis and design improvements

    International Nuclear Information System (INIS)

    Schock, A.

    1983-01-01

    The design, performance analysis, and key attributes of the Modular Isotopic Thermoelectric Generator (MITG) were described in a 1981 IECEC paper; and the design, fabrication, and testing of prototypical MITG test assemblies were described in preceding papers in these proceedings. Each test assembly simulated a typical modular slice of the flight generator. The present paper describes a detailed thermal-stress analysis, which identified the causes of stress-related problems observed during the tests. It then describes how additional analyses were used to evaluate design changes to alleviate those problems. Additional design improvements are discussed in the next paper in these proceedings, which also describes revised fabrication procedures and updated performance estimates for the generator

  2. Missile placement analysis based on improved SURF feature matching algorithm

    Science.gov (United States)

    Yang, Kaida; Zhao, Wenjie; Li, Dejun; Gong, Xiran; Sheng, Qian

    2015-03-01

    Precise battle damage assessment from video images for the analysis of missile placement is a new study area. The article proposes an improved speeded-up robust features algorithm, named restricted speeded-up robust features (RSURF), which combines the combat application of TV-command-guided missiles with the characteristics of video imagery. Its restrictions are reflected in two aspects: the first restricts the extraction area of feature points; the second restricts the number of feature points. The process of missile placement analysis based on video images was designed, and video splicing with random sample consensus purification was implemented. The RSURF algorithm is shown to have good real-time performance while maintaining accuracy.
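
    The two restrictions described in the abstract (limiting the feature-extraction area and the number of feature points) and the random sample consensus (RANSAC) purification step can be illustrated with a short OpenCV sketch. This is hypothetical code, not the authors' RSURF implementation; ORB stands in for SURF, which is only shipped in opencv-contrib builds.

        import cv2
        import numpy as np

        def match_restricted(img1, img2, roi, max_features=500):
            """Detect features only inside a region of interest and cap their number,
            then purify the matches with RANSAC (illustrative sketch only)."""
            x, y, w, h = roi
            mask = np.zeros(img1.shape[:2], dtype=np.uint8)
            mask[y:y + h, x:x + w] = 255

            orb = cv2.ORB_create(nfeatures=max_features)
            kp1, des1 = orb.detectAndCompute(img1, mask)
            kp2, des2 = orb.detectAndCompute(img2, None)

            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = matcher.match(des1, des2)

            src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            return H, int(inliers.sum())

        # usage: img1, img2 are grayscale frames; roi = (x, y, width, height)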

  3. Improvements and experience in the analysis of reprocessing samples

    International Nuclear Information System (INIS)

    Koch, L.; Cricchio, A.; Meester, R. de; Romkowski, M.; Wilhelmi, M.; Arenz, H.J.; Stijl, E. van der; Baeckmann, A. von

    1976-01-01

    Improvements in the analysis of input samples for reprocessing were obtained. To cope with the decomposition of reprocessing input solutions owing to the high radioactivity, an aluminium capsule technique was developed. A known amount of the dissolver solution was weighed into an aluminium can, dried, and the capsule was sealed. In this form, the sample could be stored over a long period and could be redissolved later for the analysis. The isotope correlation technique offers an attractive alternative for measuring the plutonium isotopic content in the dissolver solution. Moreover, this technique allows for consistency checks of analytical results. For this purpose, a data bank of correlated isotopic data is in use. To improve the efficiency of analytical work, four automatic instruments have been developed. The conditioning of samples for the U-Pu isotopic measurement was achieved by an automatic ion exchanger. A mass spectrometer, to which a high vacuum lock is connected, allows the automatic measurement of U-Pu samples. A process-computer controls the heating, focusing and scanning processes during the measurement and evaluates the data. To ease the data handling, alpha-spectrometry as well as a balance have been automated. (author)

  4. Multispectral fingerprinting for improved in vivo cell dynamics analysis

    Directory of Open Access Journals (Sweden)

    Cooper Cameron HJ

    2010-09-01

    Background: Tracing cell dynamics in the embryo becomes tremendously difficult when cell trajectories cross in space and time and tissue density obscures individual cell borders. Here, we used the chick neural crest (NC) as a model to test multicolor cell labeling and multispectral confocal imaging strategies to overcome these roadblocks. Results: We found that multicolor nuclear cell labeling and multispectral imaging led to improved resolution of in vivo NC cell identification by providing a unique spectral identity for each cell. NC cell spectral identity allowed for more accurate cell tracking and was consistent during short-term time-lapse imaging sessions. Computer model simulations predicted significantly better object counting for increasing cell densities in 3-color compared to 1-color nuclear cell labeling. To better resolve cell contacts, we show that a combination of 2-color membrane and 1-color nuclear cell labeling dramatically improved the semi-automated analysis of NC cell interactions, yet preserved the ability to track cell movements. We also found that channel versus lambda scanning of multicolor labeled embryos significantly reduced the time and effort of image acquisition and analysis of large 3D volume data sets. Conclusions: Our results reveal that multicolor cell labeling and multispectral imaging provide a cellular fingerprint that may uniquely determine a cell's position within the embryo. Together, these methods offer a spectral toolbox to resolve in vivo cell dynamics in unprecedented detail.

  5. Group learning improves case analysis in veterinary medicine.

    Science.gov (United States)

    Pickrell, John A; Boyer, John; Oehme, Frederick W; Clegg, Victoria L; Sells, Nikki

    2002-01-01

    Group learning has become important to professional students in the healing sciences. Groups share factual and procedural resources to enhance their performances. We investigated the extent to which students analyzing case-based evaluations as teams acquired an immediate performance advantage relative to those analyzing them as individuals and the extent to which group work on one problem led to better performance by individual students on related problems. We blinded written evaluations by randomly assigning numbers to groups of students and using removable tracers. Differences between groups and individuals were evaluated using Student's t statistic. Similar comparisons were evaluated by meta-analysis to determine overall trends. Students who analyzed evaluations as a group had an 8.5% performance advantage over those who analyzed them as individuals. When evaluations were divided into those asking questions related to treatment, differential diagnosis, and prognosis, specific performance advantages for groups relative to individuals were 8.9%, 5.9%, and 6.1% respectively. Students who had previously been trained by group evaluations had a 1.5% advantage relative to those who received their training as individuals. Answers by students analyzing evaluations as groups suggested a deeper understanding, in large part because of their improved ability to explain treatment and to conduct differential diagnosis. These improvements suggested limited abilities to use previous experience to improve present performance.

  6. Skill analysis part 3: improving a practice skill.

    Science.gov (United States)

    Price, Bob

    In this, the third and final article in a series on practice skill analysis, attention is given to imaginative ways of improving a practice skill. Having analysed and evaluated a chosen skill in the previous two articles, it is time to look at new ways to proceed. Creative people are able to be analytical and imaginative. The process of careful reasoning involved in analysing and evaluating a skill will not necessarily be used to improve it. To advance a skill, there is a need to engage in more imaginative, free-thinking processes that allow the nurse to think afresh about his or her chosen skill. Suggestions shared in this article are not exhaustive, but the material presented does illustrate measures that in the author's experience seem to have potential. Consideration is given to how the improved skill might be envisaged (an ideal skill in use). The article is illustrated using the case study of empathetic listening, which has been used throughout this series.

  7. ECONOMIC AND ENERGETICAL ANALYSIS OF IMPROVED WASTE UTILIZATION PLASMA TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Serghei VAMBOL

    2015-07-01

    Purpose. Energy and economic evaluation of the improved plasma waste utilization process, and substantiation of the expediency of the improved plasma technology by comparing its energy consumption with that of other thermal utilization methods. Methodology. Analysis of existing and advanced waste management methods and their impact on environmental safety; consideration of the energy and monetary costs of implementing two different waste management technologies. Results. Comparative calculations carried out for the considered technologies under standard conditions show that the heating values of the product gases from conventional and plasma gasification differ, owing largely to the significant amount of nitrogen present in conventional gasification. From the point of view of minimizing energy and monetary costs and ensuring environmental safety, the proposed improved plasma technology is the more promising option. Because the processing of waste yields useful, saleable products such as liquefied methane, synthetic gas (94% methane) and a fuel gas for heating, the technology is cost-effective. Originality. The ecological and economic efficiency of the proposed improved plasma waste utilization technology is shown and evaluated in comparison with other thermal techniques. Practical value. The energy and monetary costs of implementing two different waste management technologies, ordinary gasification and gasification using plasma generators, are considered and substantiated. The proposed plasma waste utilization technology yields useful, saleable products such as liquefied methane, synthetic gas and a fuel gas for heating. A plant implementing the improved plasma process can compensate for daily and seasonal fluctuations in electricity and heat consumption by allowing the storage of the fuel products obtained.

  8. Does plyometric training improve strength performance? A meta-analysis.

    Science.gov (United States)

    Sáez-Sáez de Villarreal, Eduardo; Requena, Bernardo; Newton, Robert U

    2010-09-01

    The majority of the research suggests that plyometric training (PT) improves maximal strength performance as measured by 1RM, isometric MVC or slow-velocity isokinetic testing. However, the effectiveness of PT depends upon various factors. A meta-analysis of 15 studies with a total of 31 effect sizes (ES) was carried out to analyse the role of various factors on the effects of PT on strength performance. The inclusion criteria for the analysis were: (a) studies using PT programs for lower limb muscles; (b) studies employing a true experimental design and valid and reliable measurements; (c) studies including sufficient data to calculate ES. When subjects can adequately follow plyometric exercises, the training gains are independent of fitness level. Subjects in either good or poor physical condition benefit equally from plyometric work, and men obtain similar strength results to women following PT. In relation to the variables of program design, a training duration of fewer than 10 weeks with more than 15 sessions, as well as the implementation of high-intensity programs with more than 40 jumps per session, were the strategies that seem to maximize the probability of obtaining significantly greater improvements in performance (p<0.05). In order to optimise strength enhancement, the combination of different types of plyometrics with weight training would be recommended, rather than utilizing only one form (p<0.05). The responses identified in this analysis are essential and should be considered by the strength and conditioning professional with regard to the most appropriate dose-response trends for PT to optimise strength gains.
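
    The pooling behind such a meta-analysis can be sketched as follows: compute a small-sample-corrected standardized mean difference (Hedges' g) per study and combine the studies with inverse-variance weights. The group means, SDs and sample sizes below are invented for illustration and are not the 15 studies analysed in the paper.

        import numpy as np

        def hedges_g(m1, s1, n1, m2, s2, n2):
            """Standardized mean difference with small-sample (Hedges) correction."""
            s_pooled = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
            d = (m1 - m2) / s_pooled
            g = (1 - 3 / (4 * (n1 + n2 - 2) - 1)) * d
            var_g = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
            return g, var_g

        # toy studies: (mean, SD, n) for plyometric-trained and control groups
        studies = [((105, 12, 15), (98, 11, 15)),
                   ((88, 10, 20), (84, 9, 20)),
                   ((120, 15, 12), (110, 14, 12))]

        gs, ws = [], []
        for (m1, s1, n1), (m2, s2, n2) in studies:
            g, v = hedges_g(m1, s1, n1, m2, s2, n2)
            gs.append(g)
            ws.append(1 / v)                     # inverse-variance weights

        print(f"fixed-effect pooled Hedges' g = {np.average(gs, weights=ws):.2f}")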

  9. Improving patient safety in radiotherapy through error reporting and analysis

    International Nuclear Information System (INIS)

    Findlay, Ú.; Best, H.; Ottrey, M.

    2016-01-01

    Aim: To improve patient safety in radiotherapy (RT) through the analysis and publication of radiotherapy errors and near misses (RTE). Materials and methods: RTE are submitted on a voluntary basis by NHS RT departments throughout the UK to the National Reporting and Learning System (NRLS) or directly to Public Health England (PHE). RTE are analysed by PHE staff using frequency trend analysis based on the classification and pathway coding from Towards Safer Radiotherapy (TSRT). PHE, in conjunction with the Patient Safety in Radiotherapy Steering Group, publish learning from these events on a triannual basis, summarised biennially, so that their occurrence might be mitigated. Results: Since the introduction of this initiative in 2010, over 30,000 RTE reports have been submitted. The number of RTE reported in each biennial cycle has grown, ranging from 680 (2010) to 12,691 (2016). The vast majority of the RTE reported are lower level events and thus do not affect the outcome of patient care. Of the level 1 and 2 incidents reported, the majority are known to have affected only one fraction of a course of treatment. This means that corrective action could be taken over the remaining treatment fractions, so the incident did not have a significant impact on the patient or the outcome of their treatment. Analysis of the RTE reports demonstrates that generation of error is not confined to one professional group or to any particular point in the pathway. It also indicates that the pattern of errors is replicated across service providers in the UK. Conclusion: Use of the terminology, classification and coding of TSRT, together with implementation of the national voluntary reporting system described within this report, allows clinical departments to compare their local analysis to the national picture. Further opportunities to improve learning from this dataset must be exploited through development of the analysis and of proactive risk management strategies.

  10. Response surface analysis to improve dispersed crude oil biodegradation

    Energy Technology Data Exchange (ETDEWEB)

    Zahed, Mohammad A.; Aziz, Hamidi A.; Mohajeri, Leila [School of Civil Engineering, Universiti Sains Malaysia, Nibong Tebal, Penang (Malaysia); Isa, Mohamed H. [Civil Engineering Department, Universiti Teknologi PETRONAS, Tronoh, Perak (Malaysia)

    2012-03-15

    In this research, the bioremediation of dispersed crude oil, based on the amount of nitrogen and phosphorus supplementation in a closed system, was optimized by the application of response surface methodology and central composite design. Correlation analysis of the mathematical regression model demonstrated that a quadratic polynomial model could be used to optimize the hydrocarbon bioremediation (R² = 0.9256). Statistical significance was checked by analysis of variance and residual analysis. Natural attenuation removed 22.1% of the crude oil in 28 days. The highest removal under un-optimized conditions, 68.1%, was observed using 20.00 mg/L nitrogen and 2.00 mg/L phosphorus in 28 days, while the optimization process yielded a crude oil removal of 69.5% with 16.05 mg/L nitrogen and 1.34 mg/L phosphorus in 27 days; optimization can therefore improve biodegradation in a shorter time with less nutrient consumption. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
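
    Response surface methodology with a central composite design amounts to fitting a full quadratic model in the factors (here nitrogen and phosphorus) and locating its stationary point. A minimal sketch with made-up data, not the study's measurements:

        import numpy as np

        # toy design: nitrogen and phosphorus levels (mg/L) and removal response (%)
        N = np.array([10, 10, 20, 20, 15, 15, 15, 22, 8], dtype=float)
        P = np.array([1.0, 2.0, 1.0, 2.0, 1.5, 1.5, 1.5, 1.5, 1.5])
        removal = np.array([52, 55, 60, 58, 66, 67, 65, 59, 50], dtype=float)

        # full quadratic model: b0 + b1*N + b2*P + b3*N^2 + b4*P^2 + b5*N*P
        X = np.column_stack([np.ones_like(N), N, P, N**2, P**2, N * P])
        b, *_ = np.linalg.lstsq(X, removal, rcond=None)

        # stationary point of the fitted surface: set the gradient to zero
        A = np.array([[2 * b[3], b[5]],
                      [b[5], 2 * b[4]]])
        n_opt, p_opt = np.linalg.solve(A, -np.array([b[1], b[2]]))
        print(f"stationary point: N = {n_opt:.2f} mg/L, P = {p_opt:.2f} mg/L")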

  11. TENDENCY OF IMPROVEMENT ANALYSIS OF VENTURE ACTIVITY FOR MANAGEMENT DECISIONS

    Directory of Open Access Journals (Sweden)

    G.Yu. Iakovets

    2015-03-01

    Full Text Available The questions concerning the definition of current trends and prospects of venture financing new innovative enterprises as one of the most effective and alternative, but with a high degree of risk financing sources of the entity. The features of venture financing that is different from other sources of business financing, as well as income from investments of venture capital can greatly exceed the volume of investments, but at the same time such financing risks are significant, so it all makes it necessary to build an effective system of venture capital investments in the workplace. In the course of the study also revealed problems of analysis and minimization of risks in the performance of venture financing of innovative enterprises. Defining characteristics analysis and risk assessment of venture financing helps to find ways to minimize and systematization, avoidance and prevention of risks in the performance of venture capital. The study also identified the major areas of improvement analysis of venture capital for management decisions.

  12. Full-motion video analysis for improved gender classification

    Science.gov (United States)

    Flora, Jeffrey B.; Lochtefeld, Darrell F.; Iftekharuddin, Khan M.

    2014-06-01

    The ability of computer systems to perform gender classification using the dynamic motion of a human subject has important applications in medicine, human factors, and human-computer interface systems. Previous works in motion analysis have used data from sensors (including gyroscopes, accelerometers, and force plates), radar signatures, and video. However, full-motion video motion-capture range data provide a higher-resolution temporal and spatial dataset for the analysis of dynamic motion. Works using motion capture data have been limited by small datasets collected in a controlled environment. In this paper, we apply machine learning techniques to a new dataset that has a larger number of subjects. Additionally, these subjects move unrestricted through a capture volume, representing a more realistic, less controlled environment. We conclude that existing linear classification methods are insufficient for gender classification on the larger dataset captured in a relatively uncontrolled environment. A method based on a nonlinear support vector machine classifier is proposed to obtain gender classification for the larger dataset. In experimental testing with a dataset consisting of 98 trials (49 subjects, 2 trials per subject), classification rates using leave-one-out cross-validation are improved from 73% using linear discriminant analysis to 88% using the nonlinear support vector machine classifier.
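
    The modelling step described above can be sketched with scikit-learn: compare linear discriminant analysis against an RBF-kernel support vector machine under leave-one-out cross-validation. The features and labels below are synthetic stand-ins, not the motion dataset used in the paper.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # synthetic stand-in for motion features: 98 trials, 20 features, binary label
        rng = np.random.default_rng(1)
        X = rng.normal(size=(98, 20))
        y = (X[:, 0] * X[:, 1] + 0.5 * rng.normal(size=98) > 0).astype(int)  # nonlinear boundary

        loo = LeaveOneOut()
        for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                          ("RBF-SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)))]:
            acc = cross_val_score(clf, X, y, cv=loo).mean()
            print(f"{name:8s} leave-one-out accuracy: {acc:.2f}")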

  13. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses, a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.
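
    For reference, the standard textbook definitions of CAPE and CIN used in such convective analyses (general meteorological definitions, not taken from the INCA documentation) can be written as

        \mathrm{CAPE} = g \int_{z_\mathrm{LFC}}^{z_\mathrm{EL}} \frac{T_{v,\mathrm{parcel}}(z) - T_{v,\mathrm{env}}(z)}{T_{v,\mathrm{env}}(z)} \, dz ,
        \qquad
        \mathrm{CIN} = -\, g \int_{z_\mathrm{sfc}}^{z_\mathrm{LFC}} \frac{T_{v,\mathrm{parcel}}(z) - T_{v,\mathrm{env}}(z)}{T_{v,\mathrm{env}}(z)} \, dz ,

    where T_v is virtual temperature, z_sfc the parcel origin level, z_LFC the level of free convection, and z_EL the equilibrium level.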

  14. Improving knowledge management systems with latent semantic analysis

    International Nuclear Information System (INIS)

    Sebok, A.; Plott, C.; LaVoie, N.

    2006-01-01

    Latent Semantic Analysis (LSA) offers a technique for improving lessons learned and knowledge management systems. These systems are expected to become more widely used in the nuclear industry, as experienced personnel leave and are replaced by younger, less-experienced workers. LSA is a machine learning technology that allows searching of text based on meaning rather than predefined keywords or categories. Users can enter and retrieve data using their own words, rather than relying on constrained language lists or navigating an artificially structured database. LSA-based tools can greatly enhance the usability and usefulness of knowledge management systems and thus provide a valuable tool to assist nuclear industry personnel in gathering and transferring worker expertise. (authors)
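
    In its usual form, LSA builds a term-document matrix, applies a truncated singular value decomposition, and matches queries to documents in the resulting low-dimensional semantic space. The sketch below uses scikit-learn with a toy set of lessons-learned entries; it illustrates the technique, not the system evaluated by the authors.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        # toy "lessons learned" entries; a real system would index thousands of reports
        docs = [
            "valve packing leak found during routine inspection of feedwater system",
            "operator error during switchyard maintenance caused loss of offsite power",
            "procedure ambiguity led to incorrect lineup of service water pumps",
            "corrosion identified on feedwater piping supports during walkdown",
        ]

        vectorizer = TfidfVectorizer(stop_words="english")
        tfidf = vectorizer.fit_transform(docs)

        # latent semantic space: truncated SVD of the term-document matrix
        lsa = TruncatedSVD(n_components=2, random_state=0)
        doc_vecs = lsa.fit_transform(tfidf)

        # a query phrased in the user's own words rather than the indexed keywords
        query_vec = lsa.transform(vectorizer.transform(["pipe rust near feedwater line"]))
        scores = cosine_similarity(query_vec, doc_vecs)[0]
        print("best match:", docs[scores.argmax()])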

  15. An improved method for the analysis of alpha spectra

    International Nuclear Information System (INIS)

    Equillor, Hugo E.

    2004-01-01

    In this work we describe a methodology, developed over the last years, for the analysis of alpha-emitter spectra obtained with ion-implanted detectors, which tends to solve some of the problems shown by this type of spectra. It is an improved methodology with respect to that described in a previous publication. The method is based on the application of a mathematical function that allows the tail of an alpha peak to be modelled, in order to evaluate the part of the peak that is not seen in cases of partial superposition with another peak. A calculation program that works in a semiautomatic way, with the possibility of interactive intervention by the analyst, has been developed simultaneously and is described in detail. (author)

  16. Improving assessment of personality disorder traits through social network analysis.

    Science.gov (United States)

    Clifton, Allan; Turkheimer, Eric; Oltmanns, Thomas F

    2007-10-01

    When assessing personality disorder traits, not all judges make equally valid judgments of all targets. The present study uses social network analysis to investigate factors associated with reliability and validity in peer assessment. Participants were groups of military recruits (N=809) who acted as both targets and judges in a round-robin design. Participants completed self- and informant versions of the Multisource Assessment of Personality Pathology. Social network matrices were constructed based on reported acquaintance, and cohesive subgroups were identified. Judges who shared a mutual subgroup were more reliable and had higher self-peer agreement than those who did not. Partitioning networks into two subgroups achieved more consistent improvements than multiple subgroups. We discuss implications for multiple informant assessments.

  17. Plant improvements through the use of benchmarking analysis

    International Nuclear Information System (INIS)

    Messmer, J.R.

    1993-01-01

    As utilities approach the turn of the century, customer and shareholder satisfaction is threatened by rising costs. Environmental compliance expenditures, coupled with low load growth and aging plant assets are forcing utilities to operate existing resources in a more efficient and productive manner. PSI Energy set out in the spring of 1992 on a benchmarking mission to compare four major coal fired plants against others of similar size and makeup, with the goal of finding the best operations in the country. Following extensive analysis of the 'Best in Class' operation, detailed goals and objectives were established for each plant in seven critical areas. Three critical processes requiring rework were identified and required an integrated effort from all plants. The Plant Improvement process has already resulted in higher operation productivity, increased emphasis on planning, and lower costs due to effective material management. While every company seeks improvement, goals are often set in an ambiguous manner. Benchmarking aids in setting realistic goals based on others' actual accomplishments. This paper describes how the utility's short term goals will move them toward being a lower cost producer

  18. Generalization in the XCSF classifier system: analysis, improvement, and extension.

    Science.gov (United States)

    Lanzi, Pier Luca; Loiacono, Daniele; Wilson, Stewart W; Goldberg, David E

    2007-01-01

    We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the spread of the eigenvalues of the autocorrelation matrix associated with each classifier is large. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations, instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase the generalization capabilities of XCSF: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that while all three approaches significantly improve XCSF, least squares approaches appear to be best performing and most robust. Finally we show how XCSF can be extended to include polynomial approximations.
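
    The convergence issue discussed above can be illustrated by contrasting the Widrow-Hoff (LMS) update with a recursive least squares (RLS) update on inputs whose components have a large variance (eigenvalue) spread. This is a generic sketch of the two update rules, not the XCSF implementation; the learning rate, scales and number of steps are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        w_true = np.array([2.0, -1.0, 0.5])
        scales = np.array([1.0, 10.0, 100.0])    # large eigenvalue spread in the inputs

        w_lms = np.zeros(3)
        w_rls = np.zeros(3)
        P = np.eye(3) * 1e3                      # RLS inverse-correlation matrix
        eta = 1e-5                               # LMS rate kept small for stability

        for _ in range(5000):
            x = rng.normal(size=3) * scales
            y = w_true @ x

            # Widrow-Hoff / LMS: gradient step on the squared prediction error
            w_lms += eta * (y - w_lms @ x) * x

            # recursive least squares: per-sample exact least-squares update
            k = P @ x / (1.0 + x @ P @ x)
            w_rls += k * (y - w_rls @ x)
            P -= np.outer(k, x @ P)

        print("LMS estimate:", np.round(w_lms, 3))   # slow along the small-scale direction
        print("RLS estimate:", np.round(w_rls, 3))   # close to w_true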

  19. An efficiency improvement in warehouse operation using simulation analysis

    Science.gov (United States)

    Samattapapong, N.

    2017-11-01

    In general, industry requires an efficient system for warehouse operation. There are many important factors that must be considered when designing an efficient warehouse system. The most important is an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. Because of these factors, researchers are interested in studying warehouse work systems and distribution. We started by collecting the important data for storage, such as information on products, size and location, data collection and production, and used all this information to build a simulation model in the FlexSim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, many scenarios to improve that problem were generated and tested through the simulation analysis process. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the ability to transport the product increased from 10.2% to 50.9%. Thus, it can be stated that this is the best method for increasing efficiency in the warehouse operation.

  20. Sensitivity analysis for improving nanomechanical photonic transducers biosensors

    International Nuclear Information System (INIS)

    Fariña, D; Álvarez, M; Márquez, S; Lechuga, L M; Dominguez, C

    2015-01-01

    The achievement of high sensitivity and highly integrated transducers is one of the main challenges in the development of high-throughput biosensors. The aim of this study is to improve the final sensitivity of an opto-mechanical device to be used as a reliable biosensor. We report the analysis of the mechanical and optical properties of optical waveguide microcantilever transducers, and their dependency on device design and dimensions. The selected layout (geometry) based on two butt-coupled misaligned waveguides displays better sensitivities than an aligned one. With this configuration, we find that an optimal microcantilever thickness range between 150 nm and 400 nm would increase both the microcantilever bending during the biorecognition process and the optical sensitivity, to 4.8 × 10⁻² nm⁻¹, an order of magnitude higher than other similar opto-mechanical devices. Moreover, the analysis shows that single-mode behaviour of the propagating radiation is required to avoid modal interference that could lead to misinterpretation of the readout signal. (paper)

  1. Improvements in biamperometric method for remote analysis of uranium

    International Nuclear Information System (INIS)

    Palamalai, A.; Thankachan, T.S.; Balasubramanian, G.R.

    1979-01-01

    One of the titrimetric methods most suitable for remote operations with Master Slave Manipulators inside hot cells is the biamperometric method. The biamperometric method for the analysis of uranium reported in the literature is found to give rise to a significant bias, especially with low aliquots of uranium and the waste volume is also considerable which is not desirable from the point of view of radioactive waste disposal. In the present method, the bias as well as waste volume are reduced. Also addition of vanadyl sulphate is found necessary to provide a sharp end point in the titration curve. The role of vanadyl sulphate in improving the titration method has been investigated by spectrophotometry and electrometry. A new mechanism for the role of vanadyl sulphate which is in conformity with the observations made in coulometric titration of uranium, is proposed. Interference from deliberate additions of high concentrations of stable species of fission product elements is found negligible. Hence this method is considered highly suitable for remote analysis of uranium in intensely radioactive reprocessing solutions for control purposes, provided radioactivity does not pose new problems. (auth.)

  2. Improved inhomogeneous finite elements for fabric reinforced composite mechanics analysis

    Science.gov (United States)

    Foye, R. L.

    1992-01-01

    There is a need to do routine stress/failure analysis of fabric reinforced composite microstructures to provide additional confidence in critical applications and guide materials development. Conventional methods of 3-D stress analysis are time consuming to set up, run and interpret. A need exists for simpler methods of modeling these structures and analyzing the models. The principal difficulty is the discrete element mesh generation problem. Inhomogeneous finite elements are worth investigating for application to these problems because they eliminate the mesh generation problem. However, there are penalties associated with these elements. Their convergence rates can be slow compared to homogeneous elements. Also, there is no accepted method for obtaining detailed stresses in the constituent materials of each element. This paper shows that the convergence rate can be significantly improved by a simple device which substitutes homogeneous elements for the inhomogeneous ones. The device is shown to work well in simple one and two dimensional problems. However, demonstration of the application to more complex two and three dimensional problems remains to be done. Work is also progressing toward more realistic fabric microstructural geometries.

  3. Effects of the addition of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) on mechanical properties of luting and lining glass ionomer cement

    Science.gov (United States)

    Heravi, Farzin; Bagheri, Hossein; Rangrazi, Abdolrasoul; Mojtaba Zebarjad, Seyed

    2016-07-01

    Recently, the addition of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) into glass ionomer cements (GICs) has attracted interest due to its remineralization of teeth and its antibacterial effects. However, it should be investigated to ensure that the incorporation of CPP-ACP does not have significant adverse effects on its mechanical properties. The purpose of this study was to evaluate the effects of the addition of CPP-ACP on the mechanical properties of luting and lining GIC. The first step was to synthesize the CPP-ACP. Then the CPP-ACP at concentrations of 1%, 1.56% and 2% of CPP-ACP was added into a luting and lining GIC. GIC without CPP-ACP was used as a control group. The results revealed that the incorporation of CPP-ACP up to 1.56%(w/w) increased the flexural strength (29%), diametral tensile strength (36%) and microhardness (18%), followed by a reduction in these mechanical properties at 2%(w/w) CPP-ACP. The wear rate was significantly decreased (23%) in 1.56%(w/w) concentration of CPP-ACP and it was increased in 2%(w/w). Accordingly, the addition of 1.56%(w/w) CPP-ACP into luting and lining GIC had no adverse effect on the mechanical properties of luting and lining GIC and could be used in clinical practice.

  4. Effectiveness of casein phosphopeptide-amorphous calcium phosphate and lysozyme, lactoferrin, and lactoperoxidase in reducing Streptococcus mutans counts in dentinal caries.

    Science.gov (United States)

    Pinheiro, Sérgio Luiz; Azenha, Giuliana Rodrigues; Araujo, Giovana Spagnolo Albamonte; Puppin Rontani, Regina Maria

    2017-01-01

    This study compared the capacity of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) to that of a combination of lysozyme, lactoferrin, and lactoperoxidase (LLL) in root canal disinfectant for reducing the Streptococcus mutans counts from dentinal caries. Forty human permanent third molars were selected, and flat dentin surfaces were created. Carious lesions were induced using a microbiological model. The specimens were randomly divided into 2 groups (n = 20) according to the type of agent used: group 1, CPP-ACP; group 2, LLL. The S mutans counts were performed before application and after the first, second, and third applications of the agents. The duration of each application was 3 minutes. Carious dentin specimens were homogenized, diluted, and seeded onto mitis salivarius-bacitracin plates for viable counts of S mutans. Results showed that there was no significant reduction in the number of S mutans in group 1 after the applications of CPP-ACP (P > 0.05). In group 2, a significant reduction of S mutans was observed after the third application of LLL (P < 0.01). These results indicate that 3 applications of LLL enzymes can be used to reduce the number of S mutans in dentinal caries lesions.

  5. Remineralization effects of casein phosphopeptide-amorphous calcium phosphate crème on artificial early enamel lesions of primary teeth.

    Science.gov (United States)

    Zhang, Qiong; Zou, Jing; Yang, Ran; Zhou, Xuedong

    2011-09-01

    Caries in children younger than 72 months is called early childhood caries (ECC). Sixty-six per cent of Chinese children younger than 5 years old have dental decay, and about 97% of them are untreated. This in vitro study was conducted to evaluate the remineralization effects of the casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) crème on the artificial early enamel lesions of the primary teeth and to assess its caries-prevention efficiency. Enamel specimens with artificial early lesions were produced and were then randomly divided into Group A: distilled and deionized water, DDW, as negative control; Group B: CPP-ACP crème, test group; Group C: 500 ppm NaF solution, as positive control. The enamel surface microhardness (SMH) was measured before, after demineralization, and 30 days after remineralization. The results were analysed with the SPSS 13.0 software package. The enamel specimens were analysed by the scanning electron microscope. The CPP-ACP crème increased SMH of the eroded enamel significantly more than 500 ppm NaF solution did. The morphology of the enamel was different in each group. The CPP-ACP crème is effective in remineralizing early enamel lesions of the primary teeth, a little more effective than 500 ppm NaF and can be used for the prevention of ECC. 2011 The Authors. International Journal of Paediatric Dentistry © 2011 BSPD, IAPD and Blackwell Publishing Ltd.

  6. Receiver operating characteristic analysis improves diagnosis by radionuclide ventriculography

    International Nuclear Information System (INIS)

    Dickinson, C.Z.; Forman, M.B.; Vaugh, W.K.; Sandler, M.P.; Kronenberg, M.W.

    1985-01-01

    Receiver operating characteristic analysis (ROC) evaluates continuous variables to define diagnostic criteria for the optimal sensitivity (SENS) and specificity (SPEC) of a test. The authors studied exercise-induced chest pain (CP), ST-changes on electrocardiography (ECG) and rest-exercise gated radionuclide ventriculography (RVG) using ROC to clarify the optimal criteria for detecting myocardial ischemia due to coronary atherosclerosis (CAD). The data of 95 consecutive patients studied with coronary angiography, rest-exercise RVG and ECG were reviewed. 77 patients had "significant" CAD (≥50% lesions). Exercise-induced CP, ECG abnormalities (ST-T shifts) and RVG abnormalities (change in ejection fraction, 2-view regional wall motion change and relative end-systolic volume) were evaluated to define the optimal SENS/SPEC of each and of the combined data. ROC curves were constructed by multiple logistic regression (MLR). By MLR, RVG alone was superior to ECG and CP. The combination of all three produced the best ROC curve for the entire group and for clinical subsets based on the number of diseased vessels and the presence or absence of prior myocardial infarction. When CP, ECG and RVG were combined, the optimal SENS/SPEC for detection of single vessel disease was 88/86. The SENS/SPEC for 3 vessel disease was 93/95. Thus, the application of RVG for the diagnosis of myocardial ischemia is improved with the inclusion of ECG and CP data by the use of a multiple logistic regression model. ROC analysis allows clinical application of multiple data for diagnosing CAD at a desired SENS/SPEC rather than by arbitrary single-standard criteria
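
    The combination of CP, ECG and RVG variables by multiple logistic regression, followed by selection of an operating point on the ROC curve, can be sketched as below. The data are synthetic stand-ins generated for illustration, not the 95-patient cohort of the study.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_curve, roc_auc_score

        # synthetic stand-ins: chest pain (0/1), ST shift (mm), RVG ejection-fraction change
        rng = np.random.default_rng(42)
        n = 95
        cad = rng.integers(0, 2, n)                           # disease status
        cp = (rng.random(n) < 0.3 + 0.4 * cad).astype(int)
        st = rng.normal(0.5 + 1.0 * cad, 0.8, n)
        ef_change = rng.normal(5 - 7 * cad, 4, n)

        X = np.column_stack([cp, st, ef_change])
        model = LogisticRegression().fit(X, cad)              # multiple logistic regression
        score = model.predict_proba(X)[:, 1]

        fpr, tpr, _ = roc_curve(cad, score)
        best = (tpr - fpr).argmax()                           # Youden's J operating point
        print(f"AUC = {roc_auc_score(cad, score):.2f}; "
              f"SENS/SPEC = {tpr[best]:.2f}/{1 - fpr[best]:.2f}")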

  7. An integrated sampling and analysis approach for improved biodiversity monitoring.

    Science.gov (United States)

    DeWan, Amielle A; Zipkin, Elise F

    2010-05-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  8. Improving CMD Areal Density Analysis: Algorithms and Strategies

    Directory of Open Access Journals (Sweden)

    R. E. Wilson

    2014-06-01

    Full Text Available Essential ideas, successes, and difficulties of Areal Density Analysis (ADA) for color-magnitude diagrams (CMDs) of resolved stellar populations are examined, with explanation of various algorithms and strategies for optimal performance. A CMD-generation program computes theoretical datasets with simulated observational error and a solution program inverts the problem by the method of Differential Corrections (DC) so as to compute parameter values from observed magnitudes and colors, with standard error estimates and correlation coefficients. ADA promises not only impersonal results, but also significant saving of labor, especially where a given dataset is analyzed with several evolution models. Observational errors and multiple star systems, along with various single star characteristics and phenomena, are modeled directly via the Functional Statistics Algorithm (FSA). Unlike Monte Carlo, FSA is not dependent on a random number generator. Discussions include difficulties and overall requirements, such as need for fast evolutionary computation and realization of goals within machine memory limits. Degradation of results due to influence of pixelization on derivatives, Initial Mass Function (IMF) quantization, IMF steepness, low Areal Densities (A), and large variation in A are reduced or eliminated through a variety of schemes that are explained sufficiently for general application. The Levenberg-Marquardt and MMS algorithms for improvement of solution convergence are contained within the DC program. An example of convergence, which typically is very good, is shown in tabular form. A number of theoretical and practical solution issues are discussed, as are prospects for further development.

  9. Improving CMD Areal Density Analysis: Algorithms and Strategies

    Science.gov (United States)

    Wilson, R. E.

    2014-06-01

    Essential ideas, successes, and difficulties of Areal Density Analysis (ADA) for color-magnitude diagrams (CMDs) of resolved stellar populations are examined, with explanation of various algorithms and strategies for optimal performance. A CMD-generation program computes theoretical datasets with simulated observational error and a solution program inverts the problem by the method of Differential Corrections (DC) so as to compute parameter values from observed magnitudes and colors, with standard error estimates and correlation coefficients. ADA promises not only impersonal results, but also significant saving of labor, especially where a given dataset is analyzed with several evolution models. Observational errors and multiple star systems, along with various single star characteristics and phenomena, are modeled directly via the Functional Statistics Algorithm (FSA). Unlike Monte Carlo, FSA is not dependent on a random number generator. Discussions include difficulties and overall requirements, such as need for fast evolutionary computation and realization of goals within machine memory limits. Degradation of results due to influence of pixelization on derivatives, Initial Mass Function (IMF) quantization, IMF steepness, low Areal Densities (A), and large variation in A are reduced or eliminated through a variety of schemes that are explained sufficiently for general application. The Levenberg-Marquardt and MMS algorithms for improvement of solution convergence are contained within the DC program. An example of convergence, which typically is very good, is shown in tabular form. A number of theoretical and practical solution issues are discussed, as are prospects for further development.
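
    The Levenberg-Marquardt step mentioned above can be illustrated with a toy least-squares fit. This is not the ADA/DC code; the model, data and noise level are invented for illustration, and the standard errors are derived from the Jacobian only by analogy with the error estimates reported by DC.

        # Toy Levenberg-Marquardt least-squares fit of two parameters to noisy data.
        import numpy as np
        from scipy.optimize import least_squares

        rng = np.random.default_rng(2)
        x = np.linspace(0.1, 5.0, 60)
        true_a, true_b = 2.5, 0.8
        y = true_a * np.exp(-true_b * x) + rng.normal(0, 0.02, x.size)

        def residuals(p):
            a, b = p
            return a * np.exp(-b * x) - y

        fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")  # Levenberg-Marquardt
        print("estimated parameters:", fit.x)

        # Approximate standard errors from the Jacobian at the solution.
        J = fit.jac
        cov = np.linalg.inv(J.T @ J) * (fit.fun @ fit.fun) / (x.size - 2)
        print("standard errors:", np.sqrt(np.diag(cov)))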

  10. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species of conservation concern. In this article, we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  11. Improvements of Defect Analysis by Space-Charge Wave Spectroscopy

    Science.gov (United States)

    Voit, Kay-Michael; Hilling, Burkhard; Schmidt, Heinz-Jürgen; Imlau, Mirco

    2011-03-01

    We report on our recent advances in space-charge wave spectroscopy, which can be used to investigate defect structures in classical high-resistivity semiconductors and insulators. It allows estimation of the effective trap concentrations as well as the effective donor density N_eff and the product μτ of electron mobility and lifetime in the conduction band. We present a novel method of space-charge wave excitation, using a superposition of a running and a static sinusoidal illumination pattern. Thus, in contrast to the former oscillating pattern, we acquire a distinct direction of movement. The proposed new technique can be regarded as a substantial improvement, as the theoretical analysis is no longer limited by numerous assumptions such as low modulation depth or small oscillation amplitudes. It not only overcomes these limits of the experimental configuration, improving the accuracy of SCW spectroscopy, but also provides additional information, such as the sign of the charge carriers. Financial support by the DFG within the graduate college 695 "Nonlinearities of Optical Materials" and the project IM 37/5-1 is gratefully acknowledged.

  12. Improved sampling and analysis of images in corneal confocal microscopy.

    Science.gov (United States)

    Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R

    2017-10-01

    Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is constantly improving, there are methodological limitations in terms of sampling of images and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method for the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. Twenty-three idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined both manually and automatically. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was no statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling methods, indicating the importance of clear guidelines for the image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. This study underlines the importance of improving the analysis of the CCM images.

  13. Quantitative Proteomic and Phosphoproteomic Analysis of Trypanosoma cruzi Amastigogenesis

    DEFF Research Database (Denmark)

    Queiroz, Rayner M L; Charneau, Sebastien; Mandacaru, Samuel C

    2014-01-01

    this well-established differentiation protocol to perform a comprehensive quantitative proteomic and phosphoproteomic analysis of the T. cruzi amastigogenesis. Samples from fully differentiated forms and two biologically relevant intermediate time points were Lys-C/trypsin digested, iTRAQ-labeled and multiplexed. Subsequently, phosphopeptides were enriched using TiO2 matrix. Non-phosphorylated peptides were HILIC-fractionated prior to LC-MS/MS analysis. LC-MS/MS and bioinformatics procedures were used for protein and phosphopeptide quantitation, identification and phosphorylation site assignment. We could ... induced by incubation in acidic medium were also evinced. To our knowledge, this work is the most comprehensive quantitative proteomics study of the T. cruzi amastigogenesis and these data will provide a trustworthy basis for future studies and possibly for new potential drug targets....

  14. Comparative analysis of metagenomes of Italian top soil improvers

    International Nuclear Information System (INIS)

    Gigliucci, Federica; Brambilla, Gianfranco; Tozzoli, Rosangela; Michelacci, Valeria; Morabito, Stefano

    2017-01-01

    Biosolids originating from Municipal Waste Water Treatment Plants are proposed as top soil improvers (TSI) for their beneficial input of organic carbon on agricultural lands. Their use to amend soil is controversial, as it may lead to the presence of emerging hazards of anthropogenic or animal origin in the environment devoted to food production. In this study, we used shotgun metagenomics sequencing as a tool to characterize the hazards related to the TSIs. The samples showed the presence of many virulence genes associated with different diarrheagenic E. coli pathotypes as well as of different antimicrobial resistance-associated genes. The genes conferring resistance to fluoroquinolones were the most relevant class of antimicrobial resistance genes observed in all the samples tested. To a lesser extent, traits associated with resistance to methicillin in staphylococci and genes conferring resistance to streptothricin, fosfomycin and vancomycin were also identified. The most represented metal resistance genes were cobalt-zinc-cadmium related, accounting for 15–50% of the sequence reads mapping to resistance-to-compounds determinants in the different metagenomes. Moreover, the taxonomic analysis performed by comparing compost-based samples and biosolids derived from municipal sewage-sludge treatments divided the samples into separate populations, based on the microbiota composition. The results confirm that metagenomics is efficient in detecting genomic traits associated with pathogens and antimicrobial resistance in complex matrices and that this approach can be used for the traceability of TSI samples using the microorganisms’ profiles as indicators of their origin. - Highlights: • Sludge- and green-based biosolids analysed by metagenomics. • Biosolids may introduce microbial hazards into the food chain. • Metagenomics enables tracking biosolids’ sources.

  15. Ethical analysis to improve decision-making on health technologies.

    Science.gov (United States)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian; Lühmann, Dagmar; Mäkelä, Marjukka; Velasco-Garrido, Marcial; Autti-Rämö, Ilona

    2008-08-01

    Health technology assessment (HTA) is the multidisciplinary study of the implications of the development, diffusion and use of health technologies. It supports health-policy decisions by providing a joint knowledge base for decision-makers. To increase its policy relevance, HTA tries to extend beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs, and their implementation may also have significant impact on people other than the patient. These are essential considerations for health policy. The ethics model is structured around key ethical questions rather than philosophical theories, to be applicable to different cultures and usable by non-philosophers. Integrating ethical considerations into HTA can improve the relevance of technology assessments for health care and health policy in both developed and developing countries.

  16. Comparative analysis of metagenomes of Italian top soil improvers.

    Science.gov (United States)

    Gigliucci, Federica; Brambilla, Gianfranco; Tozzoli, Rosangela; Michelacci, Valeria; Morabito, Stefano

    2017-05-01

    Biosolids originating from Municipal Waste Water Treatment Plants are proposed as top soil improvers (TSI) for their beneficial input of organic carbon on agricultural lands. Their use to amend soil is controversial, as it may lead to the presence of emerging hazards of anthropogenic or animal origin in the environment devoted to food production. In this study, we used shotgun metagenomics sequencing as a tool to characterize the hazards related to the TSIs. The samples showed the presence of many virulence genes associated with different diarrheagenic E. coli pathotypes as well as of different antimicrobial resistance-associated genes. The genes conferring resistance to fluoroquinolones were the most relevant class of antimicrobial resistance genes observed in all the samples tested. To a lesser extent, traits associated with resistance to methicillin in staphylococci and genes conferring resistance to streptothricin, fosfomycin and vancomycin were also identified. The most represented metal resistance genes were cobalt-zinc-cadmium related, accounting for 15-50% of the sequence reads mapping to resistance-to-compounds determinants in the different metagenomes. Moreover, the taxonomic analysis performed by comparing compost-based samples and biosolids derived from municipal sewage-sludge treatments divided the samples into separate populations, based on the microbiota composition. The results confirm that metagenomics is efficient in detecting genomic traits associated with pathogens and antimicrobial resistance in complex matrices and that this approach can be used for the traceability of TSI samples using the microorganisms' profiles as indicators of their origin. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. The role of fluoride and casein phosphopeptide/amorphous calcium phosphate in the prevention of erosive/abrasive wear in an in vitro model using hydrochloric acid.

    Science.gov (United States)

    Wegehaupt, Florian J; Attin, T

    2010-01-01

    To investigate the effect of various fluoride compounds and casein phosphopeptide/amorphous calcium phosphate (CPP-ACP) on the reduction of erosive/abrasive tooth wear. Forty enamel samples were prepared from bovine lower incisors, stratified and allocated to 4 groups (1-4). Samples in group 1 remained untreated and served as negative controls. The test samples were treated for 2 min/day as follows: group 2 amine/sodium fluoride gel (pH 4.8; 12,500 ppm), group 3 sodium fluoride gel (pH 7.1; 12,500 ppm) and group 4 CPP-ACP-containing mousse. De- and remineralization cycling was performed for 20 days with 6 erosive attacks for 20 s with HCl (pH 3.0) per day. Samples were stored in artificial saliva between cycles and overnight. Toothbrushing (15 s; 60 strokes/min; load 2.5 N) with a toothpaste slurry was performed each day before the first and 1 h after the last erosive exposure. Tooth wear was measured by comparing baseline surface profiles with the corresponding posttreatment profiles. Tooth wear was significantly reduced in groups 2 and 3 compared with group 1, while the enamel loss of group 4 was not significantly lower compared to the negative control group 1. Between the fluoride groups 2 and 3, no significant difference in tooth wear was recorded. Erosive/abrasive tooth wear under the conditions used could be reduced significantly by the daily application of fluoride gels, irrespective of the fluoride compound, while the application of CPP-ACP-containing mousse was less effective. Copyright 2010 S. Karger AG, Basel.

  18. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software which are integrated with relevant analysis parameters specific to SCaN assets and SCaN supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, 2) will provide an all-in-one package for various analysis capabilities that normally requires add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.

  19. A multivariate analysis of factors affecting adoption of improved ...

    African Journals Online (AJOL)

    This paper analyzes the synergies/tradeoffs involved in the adoption of improved varieties of multiple crops in the mixed crop-livestock production systems of the highlands of Ethiopia. A multivariate probit (MVP) model involving a system of four equations for the adoption decisions of improved varieties of barley, potatoes, ...

  20. A Multivariate Analysis of Factors Affecting Adoption of Improved ...

    African Journals Online (AJOL)

    and significant effects on the likelihood of adopting improved varieties of barley and wheat and vice versa, indicating synergistic effects among the adoption decisions of the two groups of crops. On the other hand, the area share of improved varieties of wheat negatively and significantly affects the chances of using ...

  1. A novel joint analysis framework improves identification of differentially expressed genes in cross disease transcriptomic analysis

    Directory of Open Access Journals (Sweden)

    Wenyi Qin

    2018-02-01

    Full Text Available Abstract Motivation Detecting differentially expressed (DE) genes between disease and normal control groups is one of the most common analyses of genome-wide transcriptomic data. Since most studies have few samples, researchers have used meta-analysis to group different datasets for the same disease. Even then, in many cases the statistical power is still insufficient. Given that many diseases share the same disease genes, it is desirable to design a statistical framework that can identify diseases’ common and specific DE genes simultaneously to improve the identification power. Results We developed a novel empirical Bayes based mixture model to identify DE genes in a specific study by leveraging the shared information across multiple different disease expression data sets. The effectiveness of joint analysis was demonstrated through comprehensive simulation studies and two real data applications. The simulation results showed that our method consistently outperformed single data set analysis and two other meta-analysis methods in identification power. In real data analysis, our method overall demonstrated better identification power in detecting DE genes and prioritized more disease-related genes and disease-related pathways than single data set analysis. Over 150% more disease-related genes were identified by our method in application to Huntington’s disease. We expect that our method will provide researchers a new way of utilizing available data sets from different diseases when the sample size of the disease of interest is limited.
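
    A highly simplified sketch of the joint-analysis idea: per-gene statistics from two diseases are modeled together so that shared signal informs both. Here a two-component Gaussian mixture on simulated z-score pairs stands in for the authors' empirical Bayes mixture model; the proportions and effect sizes are invented.

        # Borrowing strength across two disease data sets: cluster per-gene z-score
        # pairs into a null and a "differentially expressed" component.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(3)
        z_null = rng.normal(0, 1, size=(900, 2))        # non-DE genes
        z_de = rng.normal(3, 1, size=(100, 2))          # DE genes shifted in both studies
        Z = np.vstack([z_null, z_de])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(Z)
        de_comp = np.argmax(gmm.means_.sum(axis=1))     # component with larger means = DE
        posterior_de = gmm.predict_proba(Z)[:, de_comp]
        print("genes called DE at posterior > 0.9:", int((posterior_de > 0.9).sum()))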

  2. Analysis and improvement of security of energy smart grids

    International Nuclear Information System (INIS)

    Halimi, Halim

    2014-01-01

    The smart grid is the next-generation power grid: a new self-healing, self-activating form of electricity network that integrates power-flow control, increased quality of electricity, and energy reliability, energy efficiency and energy security using information and communication technologies. Communication networks play a critical role in the smart grid, as its intelligence is built on information exchange across the power grid. Its two-way communication and electricity flow make it possible to monitor, predict and manage energy usage. Upgrading an existing power grid into a smart grid requires an intelligent and secure communication infrastructure. Because of that, the main goal of this dissertation is to propose a new architecture and implementation of algorithms for the analysis and improvement of security and reliability in the smart grid. In the power transmission segments of the smart grid, wired communications are usually adopted to ensure robustness of the backbone power network. In contrast, for a power distribution grid, wireless communications provide many benefits such as low-cost high-speed links, easy setup of connections among different devices/appliances, and so on. Wireless communications are usually more vulnerable to security attacks than wired ones, so developing an appropriate wireless communication architecture and its security measures is extremely important for a smart grid system. This research addresses physical layer security in a wireless smart grid; hence a defensive quorum-based algorithm is proposed to ensure physical security in wireless communication. A new security architecture for the smart grid that supports privacy preservation, data aggregation and access control is defined. This architecture consists of two parts. In the first part we propose to use an efficient and privacy-preserving aggregation scheme (EPPA), which aggregates real-time data of consumers at the Local Gateway. During aggregation the privacy of consumers is preserved.

  3. Improved Extreme Learning Machine based on the Sensitivity Analysis

    Science.gov (United States)

    Cui, Licheng; Zhai, Huawei; Wang, Benchao; Qu, Zengtang

    2018-03-01

    Extreme learning machines and their improved variants have weaknesses, such as computational complexity and learning error. After in-depth analysis, and drawing on the importance of hidden nodes in SVMs, a novel sensitivity analysis method is proposed that matches people’s cognitive habits. Based on this, an improved ELM is proposed: it can remove hidden nodes before the learning error target is reached and can efficiently manage the number of hidden nodes, thereby improving performance. Comparative tests show that it performs better in learning time, accuracy and other respects.
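
    A minimal sketch of an ELM whose hidden nodes are pruned by a simple importance measure before re-solving the readout. The specific sensitivity measure of the paper is not reproduced here; the data, node counts and pruning rule below are illustrative only.

        # ELM with a simple node-pruning step: the random hidden layer is fixed, the
        # readout is solved by least squares, the least important nodes are removed,
        # and the readout is re-solved on the remaining nodes.
        import numpy as np

        rng = np.random.default_rng(4)
        X = rng.normal(size=(200, 5))
        y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.1, 200)

        n_hidden = 100
        W = rng.normal(size=(5, n_hidden))               # random input weights (fixed)
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)                           # hidden-layer outputs
        beta = np.linalg.pinv(H) @ y                     # least-squares readout

        # Sensitivity proxy: importance of node j ~ |beta_j| * std(H[:, j])
        importance = np.abs(beta) * H.std(axis=0)
        keep = np.argsort(importance)[-30:]              # keep the 30 most important nodes
        H2 = H[:, keep]
        beta2 = np.linalg.pinv(H2) @ y

        for name, Hm, bm in [("full", H, beta), ("pruned", H2, beta2)]:
            mse = np.mean((Hm @ bm - y) ** 2)
            print(f"{name:6s} hidden nodes={Hm.shape[1]:3d}  training MSE={mse:.4f}")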

  4. Does ownership of improved dairy cow breeds improve child nutrition? A pathway analysis for Uganda.

    Science.gov (United States)

    Kabunga, Nassul S; Ghosh, Shibani; Webb, Patrick

    2017-01-01

    The promotion of livestock production is widely believed to support enhanced diet quality and child nutrition, but the empirical evidence for this causal linkage remains narrow and ambiguous. This study examines whether adoption of improved dairy cow breeds is linked to farm-level outcomes that translate into household-level benefits including improved child nutrition outcomes in Uganda. Using nationwide data from Uganda's National Panel Survey, propensity score matching is used to create an unbiased counterfactual, based on observed characteristics, to assess the net impacts of improved dairy cow adoption. All estimates were tested for robustness and sensitivity to variations in observable and unobservable confounders. Results based on the matched samples showed that households adopting improved dairy cows significantly increased milk yield, by over 200% on average. This resulted in higher milk sales and milk intakes, demonstrating the potential of this agricultural technology to both integrate households into modern value chains and increase households' access to animal source foods. Use of improved dairy cows increased household food expenditures by about 16%. Although undernutrition was widely prevalent in the study sample and in matched households, the adoption of improved dairy cows was associated with lower child stunting in adopter households. In scale terms, results also showed that holding larger farms tends to support adoption, but that this also stimulates the household's ability to achieve gains from adoption, which can translate into enhanced nutrition.
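
    The propensity-score-matching step can be sketched as follows on synthetic data. The survey covariates and outcomes of the study are not reproduced; the single confounder, treatment effect and sample size below are invented.

        # Propensity score matching: model adoption from an observed confounder,
        # match each adopter to the nearest non-adopter on the score, and compare
        # the naive difference with the matched ATT estimate.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(5)
        n = 1000
        farm_size = rng.gamma(2.0, 2.0, n)                        # observed confounder
        adopt = rng.binomial(1, 1.0 / (1.0 + np.exp(-(farm_size - 4.0))))
        outcome = 2.0 * adopt + 0.5 * farm_size + rng.normal(0, 1, n)   # true effect = 2

        lr = LogisticRegression().fit(farm_size.reshape(-1, 1), adopt)
        ps = lr.predict_proba(farm_size.reshape(-1, 1))[:, 1]           # propensity scores

        treated = np.where(adopt == 1)[0]
        control = np.where(adopt == 0)[0]
        nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
        _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
        matched = control[idx.ravel()]                 # matched controls (with replacement)

        print("naive difference :", round(outcome[treated].mean() - outcome[control].mean(), 2))
        print("matched ATT      :", round(np.mean(outcome[treated] - outcome[matched]), 2))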

  5. Does ownership of improved dairy cow breeds improve child nutrition? A pathway analysis for Uganda.

    Directory of Open Access Journals (Sweden)

    Nassul S Kabunga

    Full Text Available The promotion of livestock production is widely believed to support enhanced diet quality and child nutrition, but the empirical evidence for this causal linkage remains narrow and ambiguous. This study examines whether adoption of improved dairy cow breeds is linked to farm-level outcomes that translate into household-level benefits including improved child nutrition outcomes in Uganda. Using nationwide data from Uganda's National Panel Survey, propensity score matching is used to create an unbiased counterfactual, based on observed characteristics, to assess the net impacts of improved dairy cow adoption. All estimates were tested for robustness and sensitivity to variations in observable and unobservable confounders. Results based on the matched samples showed that households adopting improved dairy cows significantly increased milk yield, by over 200% on average. This resulted in higher milk sales and milk intakes, demonstrating the potential of this agricultural technology to both integrate households into modern value chains and increase households' access to animal source foods. Use of improved dairy cows increased household food expenditures by about 16%. Although undernutrition was widely prevalent in the study sample and in matched households, the adoption of improved dairy cows was associated with lower child stunting in adopter households. In scale terms, results also showed that holding larger farms tends to support adoption, but that this also stimulates the household's ability to achieve gains from adoption, which can translate into enhanced nutrition.

  6. Alternative Frameworks for Improving Government Organizational Performance: A Comparative Analysis

    National Research Council Canada - National Science Library

    Simon, Cary

    1997-01-01

    .... Six major frameworks emerging in the U.S. since 1980, applicable to the public sector, and designed to enhance organizational change toward improved performance are reviewed and analyzed: Total Quality; 'Excellence...

  7. Using external data sources to improve audit trail analysis.

    OpenAIRE

    Herting, R. L.; Asaro, P. V.; Roth, A. C.; Barnes, M. R.

    1999-01-01

    Audit trail analysis is the primary means of detection of inappropriate use of the medical record. While audit logs contain large amounts of information, the information required to determine useful user-patient relationships is often not present. Adequate information isn't present because most audit trail analysis systems rely on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system wh...

  8. Improved Methods for Pitch Synchronous Linear Prediction Analysis of Speech

    OpenAIRE

    劉, 麗清

    2015-01-01

    Linear prediction (LP) analysis has been applied to speech system over the last few decades. LP technique is well-suited for speech analysis due to its ability to model speech production process approximately. Hence LP analysis has been widely used for speech enhancement, low-bit-rate speech coding in cellular telephony, speech recognition, characteristic parameter extraction (vocal tract resonances frequencies, fundamental frequency called pitch) and so on. However, the performance of the co...

  9. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they cause high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well-distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allows improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  10. Why economic analysis of health system improvement interventions matters

    Directory of Open Access Journals (Sweden)

    Edward Ivor Broughton

    2016-10-01

    Full Text Available There is little evidence to direct health systems toward providing efficient interventions to address medical errors, defined as an unintended act of omission or commission or one not executed as intended that may or may not cause harm to the patient but does not achieve its intended outcome. We believe that lack of guidance on what is the most efficient way to reduce adverse events and improve the quality of health care limits the scale-up of health system improvement interventions. Challenges to economic evaluation of these interventions include defining and implementing improvement interventions in different settings with high fidelity, capturing all of the positive and negative effects of the intervention, using process measures of effectiveness rather than health outcomes, and determining the full cost of the intervention and all economic consequences its effects. However, health system improvement interventions should be treated similarly to individual medical interventions and undergo rigorous economic evaluation to provide actionable evidence to guide policy-makers in decisions of resources allocation for improvement activities among other competing demands for health care resources.

  11. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis.

    Science.gov (United States)

    Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha

    2017-07-01

    Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), beginning March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric, and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Score (STQ score). An interrupted time series linear regression model compared the STQ score during the 14 months before the CQIP implementation to the first 14 months after. During the 29-month study period 3,822 patients met study criteria. 1,028 patients needed one or more of the five studied interventions during the study period. All five endpoints had a significant increase between the pre-CQIP and post-CQIP periods, with an improvement of +6.1% (p=0.017) and sustained monthly improvements in care delivery, improving at a rate of 0.7% per month (p=0.028). The SAMU experience demonstrates the utility of a responsive, data-driven quality improvement programme to yield significant immediate and sustained improvements in pre-hospital care for trauma in Rwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the injured.
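
    A minimal segmented-regression sketch of the interrupted time series model (pre-existing trend, immediate level change and slope change at the start of the programme) on synthetic monthly scores; the effect sizes below are invented, chosen only to resemble the reported numbers, and are not the SAMU results.

        # Interrupted time series via segmented OLS regression on simulated data.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        months = np.arange(28)                       # 14 months pre, 14 months post
        post = (months >= 14).astype(float)          # 1 after the programme starts
        time_since = np.where(post == 1, months - 14.0, 0.0)

        score = 70 + 0.1 * months + 6.0 * post + 0.7 * time_since + rng.normal(0, 1.5, 28)

        X = sm.add_constant(np.column_stack([months, post, time_since]))
        fit = sm.OLS(score, X).fit()
        # params: [baseline, pre-existing trend, immediate level change, change in slope]
        print(fit.params)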

  12. Method for improving accuracy in full evaporation headspace analysis.

    Science.gov (United States)

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by the solid content change in the samples has a great impact on the measurement accuracy of conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique is effective in minimizing this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
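
    In multiple headspace extraction, successive extraction peak areas typically decay close to geometrically, so the total analyte signal can be extrapolated from a few measured extractions. A small numerical sketch with invented peak areas (not data from the paper):

        # Multiple headspace extraction arithmetic: A_i ~ A_1 * q**(i - 1),
        # so the total area is A_1 / (1 - q); q is fitted on a log scale.
        import numpy as np

        areas = np.array([1520.0, 1110.0, 820.0, 600.0])      # peak areas, extractions 1..4
        i = np.arange(len(areas))                             # 0, 1, 2, 3
        slope, intercept = np.polyfit(i, np.log(areas), 1)    # ln A_i = ln A_1 + i * ln q
        q = np.exp(slope)
        total_area = areas[0] / (1.0 - q)
        print(f"decay ratio q = {q:.3f}, extrapolated total area = {total_area:.0f}")
        # total_area is then converted to a concentration with an external calibration.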

  13. A Note on Improved Homotopy Analysis Method for Solving the Jeffery-Hamel Flow

    OpenAIRE

    Motsa, Sandile Sydney; Sibanda, Precious; Marewo, Gerald T.; Shateyi, Stanford

    2010-01-01

    This paper presents the solution of the nonlinear equation that governs the flow of a viscous, incompressible fluid between two converging-diverging rigid walls using an improved homotopy analysis method. The results obtained by this new technique show that the improved homotopy analysis method converges much faster than both the homotopy analysis method and the optimal homotopy asymptotic method. This improved technique is observed to be much more accurate than these traditional ...

  14. Error analysis to improve the speech recognition accuracy on ...

    Indian Academy of Sciences (India)

    Telugu language is one of the most widely spoken south Indian languages. In the proposed Telugu speech recognition system, errors obtained from decoder are analysed to improve the performance of the speech recognition system. Static pronunciation dictionary plays a key role in the speech recognition accuracy.

  15. An Analysis of Appropriateness of Improved Rice Technology for ...

    African Journals Online (AJOL)

    The study was carried out in selected states of South Western Nigeria in order to investigate the appropriateness of improved rice technology for women farmers. A purposive sampling technique was used to select 320 women farmers from the study area. Data collection was carried out using structured interview schedule.

  16. Improved omit set displacement recoveries in dynamic analysis

    Science.gov (United States)

    Allen, Tom; Cook, Greg; Walls, Bill

    1993-09-01

    Two related methods for improving the dependent (OMIT set) displacements after performing a Guyan reduction are presented. The theoretical bases for the methods are derived. The NASTRAN DMAP ALTERs used to implement the methods in a NASTRAN execution are described. Data are presented that verify the methods and the NASTRAN DMAP ALTERs.

  17. Does Competition Improve Public School Efficiency? A Spatial Analysis

    Science.gov (United States)

    Misra, Kaustav; Grimes, Paul W.; Rogers, Kevin E.

    2012-01-01

    Advocates for educational reform frequently call for policies to increase competition between schools because it is argued that market forces naturally lead to greater efficiencies, including improved student learning, when schools face competition. Researchers examining this issue are confronted with difficulties in defining reasonable measures…

  18. Improving Capabilities for Irregular Warfare. Volume 2. Capabilities Analysis

    Science.gov (United States)

    2007-08-01

    and Iraq were limited to six orbits at any one time. Only six operator stations were available at Nellis AFB in Nevada, where Predators were ... improve this system will need to hit the ground running with a well-thought-out plan. The cost of “learn as you go” will be high: hunger and patience

  19. A Multivariate Analysis of Factors Affecting Adoption of Improved ...

    African Journals Online (AJOL)

    Internal factors that affect the adoption and use of agricultural technologies include farmers' attitude ... Single probit and logit models are often employed to model discrete choices such as adoption of improved ..... suggests that there are unobservable factors affecting both choices and reveals an association after controlling ...

  20. Rapid economic analysis of northern hardwood stand improvement options

    Science.gov (United States)

    William B. Leak

    1980-01-01

    Data and methodology are provided for projecting basal area, diameter, volumes, and values by product for northern hardwood stands, and for determining the rate of return on stand improvement investments. The method is rapid, requires a minimum amount of information, and should prove useful for on-the-ground economic analyses.

  1. Error analysis to improve the speech recognition accuracy on ...

    Indian Academy of Sciences (India)

    measures, error-rate and Word Error Rate (WER) by application of the proposed method. Keywords. Speech recognition; pronunciation dictionary modification method; error analysis; F-measure. 1. Introduction. Speech is one of the easiest modes of ...

  2. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    National Research Council Canada - National Science Library

    Hill, Raymond

    2001-01-01

    ... Laboratory, Logistics Research Division, Logistics Readiness Branch to propose a research agenda entitled, "Models, Web-based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance...

  3. Optimizing Bus Passenger Complaint Service through Big Data Analysis: Systematized Analysis for Improved Public Sector Management

    Directory of Open Access Journals (Sweden)

    Weng-Kun Liu

    2016-12-01

    Full Text Available With the advances in industry and commerce, passengers have become more accepting of environmental sustainability issues; thus, more people now choose to travel by bus. Government administration constitutes an important part of bus transportation services as the government gives the right-of-way to transportation companies allowing them to provide services. When these services are of poor quality, passengers may lodge complaints. The increase in consumer awareness and developments in wireless communication technologies have made it possible for passengers to easily and immediately submit complaints about transportation companies to government institutions, which has brought drastic changes to the supply–demand chain comprised of the public sector, transportation companies, and passengers. This study proposed the use of big data analysis technology including systematized case assignment and data visualization to improve management processes in the public sector and optimize customer complaint services. Taichung City, Taiwan, was selected as the research area. There, the customer complaint management process in public sector was improved, effectively solving such issues as station-skipping, allowing the public sector to fully grasp the service level of transportation companies, improving the sustainability of bus operations, and supporting the sustainable development of the public sector–transportation company–passenger supply chain.

  4. Analysis and improvement of face detection based on surf cascade

    Science.gov (United States)

    Hu, Siquan; Zhang, Caihong; Liu, Lei

    2017-08-01

    This paper studies the limitations of the commonly employed boosting cascade framework, focusing on factors such as data, features, weak classifiers and stages. A set of novel experiments was performed to show their relationships. The model contains three key points: SURF features, weak classifiers based on logistic regression, and an AUC-based cascade learning algorithm. The paper adds cross-validation to the logistic regression training, which improves accuracy and greatly speeds up convergence. Eventually only five stages and about 100 weak classifiers are needed. The frontal face detector raises the rejection rate to 99% over the first three stages, greatly decreases the number of false positives, and achieves performance comparable to other non-CNN techniques on the FDDB dataset.
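
    One cascade stage of the kind described can be sketched as a cross-validated logistic regression whose rejection threshold is chosen from the ROC curve. The features below are synthetic stand-ins for SURF descriptors, and the pass rate and sample sizes are arbitrary; this is not the paper's detector.

        # One stage of an AUC-driven cascade: fit a cross-validated logistic regression
        # weak classifier, then pick the threshold that keeps ~99.5% of positives while
        # rejecting as many negatives as possible.
        import numpy as np
        from sklearn.linear_model import LogisticRegressionCV
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(7)
        pos = rng.normal(1.0, 1.0, size=(500, 8))        # "face" patches (features)
        neg = rng.normal(0.0, 1.0, size=(5000, 8))       # "non-face" patches
        X, y = np.vstack([pos, neg]), np.r_[np.ones(500), np.zeros(5000)]

        clf = LogisticRegressionCV(cv=5).fit(X, y)       # cross-validated regularisation
        score = clf.predict_proba(X)[:, 1]
        print("stage AUC:", round(roc_auc_score(y, score), 3))

        fpr, tpr, thr = roc_curve(y, score)
        ok = tpr >= 0.995                                # nearly all positives must pass
        stage_thr = thr[ok][np.argmin(fpr[ok])]
        print("threshold %.3f rejects %.0f%% of non-faces"
              % (stage_thr, 100 * (1 - fpr[ok].min())))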

  5. Mass Spectrometry Vapor Analysis for Improving Explosives Detection Canine Proficiency

    Science.gov (United States)

    2017-02-10

    ionization (SESI), 8,19-21 dielectric barrier discharge ionization (DBDI), 21,22 selected-ion-flow-tube (SIFT), 23,24 and proton transfer reaction ... sensors in addition to as service teams, will improve training efficiency and operational performance. In support of canine training, the Department of ... for both materials, but transfer efficiency into the source may be poor. Instrument response also saturated when analyzing cyclohexanone. These

  6. Improving the usefulness of accounting data in financial analysis

    Directory of Open Access Journals (Sweden)

    A Saville

    2004-04-01

    Full Text Available Accounting practices are flawed.  As a consequence, the accounting data generated by firms are generally open to interpretation, often misleading and sometimes patently false.  Yet, financial analysts place tremendous confidence in accounting data when appraising investments and investment strategies.  The implications of financial analysis based on questionable information are numerous, and range from inexact analysis to acute investment error.  To rectify this situation, this paper identifies a set of simple, yet highly effective corrective measures, which have the capacity to move accounting practice into a realm wherein accounting starts to ‘count what counts’.  The net result would be delivery of accounting data that more accurately reflect firms’ economic realities and, as such, are more useful in the task of financial analysis.

  7. Using external data sources to improve audit trail analysis.

    Science.gov (United States)

    Herting, R L; Asaro, P V; Roth, A C; Barnes, M R

    1999-01-01

    Audit trail analysis is the primary means of detection of inappropriate use of the medical record. While audit logs contain large amounts of information, the information required to determine useful user-patient relationships is often not present. Adequate information isn't present because most audit trail analysis systems rely on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system where information available in the medical record is augmented with external information sources such as: database sources, Light-weight Directory Access Protocol (LDAP) server sources, and World Wide Web (WWW) database sources. We discuss several issues that arise when combining the information from each of these disparate information sources. Furthermore, we explain how the enhanced person specific information obtained can be used to determine user-patient relationships that might signify a motive for inappropriately accessing a patient's medical record.

  8. The Subjectivity Problem: Improving Triangulation Approaches in Metaphor Analysis Studies

    Directory of Open Access Journals (Sweden)

    Sonya L. Armstrong

    2011-06-01

    Full Text Available Metaphor analysis procedures for uncovering participant conceptualizations have been well-established in qualitative research settings since the early 1980s; however, one common criticism of metaphor analysis is the trustworthiness of the findings. Namely, accurate determination of the conceptual metaphors held by participants based on the investigation of linguistic metaphors has been identified as a methodological issue because of the subjectivity involved in the interpretation; that is, because they are necessarily situated in specific social and cultural milieus, meanings of particular metaphors are not universally constructed nor understood. In light of these critiques, this article provides examples of two different triangulation methods that can be employed to supplement the trustworthiness of the findings when metaphor analysis methodologies are used.

  9. A quality improvement study using fishbone analysis and an electronic medical records intervention to improve care for children with asthma.

    Science.gov (United States)

    Gold, Jonathan; Reyes-Gastelum, David; Turner, Jane; Davies, H Dele

    2014-01-01

    Despite expert guidelines, gaps persist in the quality of care for children with asthma. This study sought to identify barriers and potential interventions to improve compliance with national asthma prevention guidelines at a single academic pediatric primary care clinic. Using the plan-do-check-act (PDCA) quality improvement framework and fishbone analysis, several barriers to consistent asthma processes and possible interventions were identified by a group of key stakeholders. Two interventions were implemented using the electronic medical record (EMR). Physician documentation of asthma quality measures was analyzed before the intervention and at 2 subsequent time points over 16 months. Documentation of asthma action plans improved in the core group (P ...). PDCA and fishbone analysis in conjunction with embedded EMR tools can improve asthma care in a pediatric primary care setting.

  10. Rasch Analysis for Psychometric Improvement of Science Attitude Rating Scales

    Science.gov (United States)

    Oon, Pey-Tee; Fan, Xitao

    2017-01-01

    Students' attitude towards science (SAS) is often a subject of investigation in science education research. Survey of rating scale is commonly used in the study of SAS. The present study illustrates how Rasch analysis can be used to provide psychometric information of SAS rating scales. The analyses were conducted on a 20-item SAS scale used in an…

  11. Improving the Computational Morphological Analysis of a Swahili ...

    African Journals Online (AJOL)

    approach to the morphological analysis of Swahili. We particularly focus our discussion on its ability to retrieve lemmas for word forms and evaluate it as a tool for corpus-based dictionary compilation. Keywords: LEXICOGRAPHY, MORPHOLOGY, CORPUS ANNOTATION, LEMMATIZATION, MACHINE LEARNING, SWAHILI ...

  12. Temporal Land Cover Analysis for Net Ecosystem Improvement

    Energy Technology Data Exchange (ETDEWEB)

    Ke, Yinghai; Coleman, Andre M.; Diefenderfer, Heida L.

    2013-04-09

    We delineated 8 watersheds contributing to previously defined river reaches within the 1,468-km2 historical floodplain of the tidally influenced lower Columbia River and estuary. We assessed land-cover change at the watershed, reach, and restoration site scales by reclassifying remote-sensing data from the National Oceanic and Atmospheric Administration Coastal Change Analysis Program’s land cover/land change product into forest, wetland, and urban categories. The analysis showed a 198.3 km2 loss of forest cover during the first 6 years of the Columbia Estuary Ecosystem Restoration Program, 2001–2006. Total measured urbanization in the contributing watersheds of the estuary during the full 1996-2006 change analysis period was 48.4 km2. Trends in forest gain/loss and urbanization differed between watersheds. Wetland gains and losses were within the margin of error of the satellite imagery analysis. No significant land cover change was measured at restoration sites, although it was visible in aerial imagery, therefore, the 30-m land-cover product may not be appropriate for assessment of early-stage wetland restoration. These findings suggest that floodplain restoration sites in reaches downstream of watersheds with decreasing forest cover will be subject to increased sediment loads, and those downstream of urbanization will experience effects of increased impervious surfaces on hydrologic processes.
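
    The reclassification-and-change-tally step can be sketched as follows. The class codes, rasters and pixel size below are made up for illustration and do not correspond to the actual C-CAP coding scheme.

        # Reclassify detailed land-cover codes into forest / wetland / urban and tally
        # per-category area change between two dates for a 30 m grid (toy rasters).
        import numpy as np

        reclass = {11: "urban", 12: "urban", 41: "forest", 42: "forest", 90: "wetland"}
        lc_2001 = np.array([[41, 41, 90], [42, 11, 90], [41, 41, 12]])
        lc_2006 = np.array([[41, 11, 90], [11, 11, 90], [41, 41, 12]])

        pixel_km2 = (30 * 30) / 1e6                       # 30 m pixels expressed in km^2
        for cat in ("forest", "wetland", "urban"):
            codes = [k for k, v in reclass.items() if v == cat]
            a1 = np.isin(lc_2001, codes).sum() * pixel_km2
            a2 = np.isin(lc_2006, codes).sum() * pixel_km2
            print(f"{cat:8s} {a1:.6f} -> {a2:.6f} km^2 (change {a2 - a1:+.6f})")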

  13. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied to preprocess the original microtubule image, and the Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were clear. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
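
    The GLCM texture step can be sketched with scikit-image as below (the function names graycomatrix/graycoprops assume scikit-image 0.19 or later). The BEMD preprocessing is omitted because it has no standard library implementation, and the input image here is synthetic rather than a real IMF1 image.

        # Grey-level co-occurrence matrix texture features on a synthetic 8-bit image.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

        rng = np.random.default_rng(8)
        img = (rng.random((64, 64)) * 255).astype(np.uint8)    # stand-in for an IMF1 image

        glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        for prop in ("contrast", "correlation", "energy", "homogeneity"):
            print(prop, graycoprops(glcm, prop).mean())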

  14. Improving Family Forest Knowledge Transfer through Social Network Analysis

    Science.gov (United States)

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  15. Stiffness Analysis and Improvement of Bolt-Plate Contact Assemblies

    DEFF Research Database (Denmark)

    Pedersen, Niels Leergaard; Pedersen, Pauli

    2008-01-01

    and the plate is extended by the possibility of designing a gap, that is, a nonuniform distance between the bolt and plate before prestressing. Designing the gap function generates the possibility for a better stress field by which the stiffness of the bolt is lowered, and at the same time the stiffness...... of the members is increased. Both of these changes have a positive influence on the lifetime of the connections. From designing a varying gap size distribution, it is found that the stiffness become a function of the loading. It is shown that similar improvements in the stiffness ratio between the bolt...

  16. Analysis and improvement of vehicle information sharing networks

    Science.gov (United States)

    Gong, Hang; He, Kun; Qu, Yingchun; Wang, Pu

    2016-06-01

    Based on large-scale mobile phone data, mobility demand was estimated and locations of vehicles were inferred in the Boston area. Using the spatial distribution of vehicles, we analyze the vehicle information sharing network generated by the vehicle-to-vehicle (V2V) communications. Although a giant vehicle cluster is observed, the coverage and the efficiency of the information sharing network remain limited. Consequently, we propose a method to extend the information sharing network's coverage by adding long-range connections between targeted vehicle clusters. Furthermore, we employ the optimal design strategy discovered in square lattice to improve the efficiency of the vehicle information sharing network.
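
    A small sketch of the idea with networkx: build a proximity-based V2V graph, add a few long-range links from smaller clusters to the giant one, and compare global efficiency before and after. The node count, communication radius and number of added links are arbitrary and do not reproduce the Boston data.

        # Extend a toy V2V information-sharing network with long-range inter-cluster links.
        import networkx as nx

        G = nx.random_geometric_graph(300, radius=0.06, seed=9)   # vehicles + V2V range
        print("clusters:", nx.number_connected_components(G),
              " global efficiency:", round(nx.global_efficiency(G), 3))

        clusters = sorted(nx.connected_components(G), key=len, reverse=True)
        giant = next(iter(clusters[0]))
        for small in clusters[1:6]:                  # link up to 5 smaller clusters
            G.add_edge(giant, next(iter(small)))     # e.g. via a roadside unit or cellular relay

        print("clusters:", nx.number_connected_components(G),
              " global efficiency:", round(nx.global_efficiency(G), 3))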

  17. An improvement analysis on video compression using file segmentation

    Science.gov (United States)

    Sharma, Shubhankar; Singh, K. John; Priya, M.

    2017-11-01

    Over the past two decades, the rapid evolution of the Internet has led to a massive rise in video technology and in video consumption over the Internet, which now makes up the bulk of data traffic. Because video accounts for so much data on the World Wide Web, reducing the burden on the Internet and the bandwidth consumed by video allows users to access video data more easily. For this purpose, many video codecs have been developed, such as HEVC/H.265 and V9, although with such codecs it remains unclear which offers the better technology in terms of rate distortion and coding standard. This paper addresses the difficulty of achieving low delay in video compression and in video applications such as ad-hoc video conferencing/streaming or surveillance. It also benchmarks the HEVC and V9 video compression techniques using subjective estimations of high-definition video content played back in web browsers. Moreover, it presents an experimental approach of dividing the video file into several segments for compression and putting them back together to improve the efficiency of video compression on the web as well as in offline mode.

  18. SIFT Based Vein Recognition Models: Analysis and Improvement

    Directory of Open Access Journals (Sweden)

    Guoqing Wang

    2017-01-01

    Full Text Available Scale-Invariant Feature Transform (SIFT) is being investigated more and more to realize a less-constrained hand vein recognition system. Contrast enhancement (CE), compensating for deficient dynamic range, is a must for a SIFT-based framework to improve performance. However, our experiments reveal evidence of a negative influence of CE on SIFT matching. We show that the number of keypoints extracted by gradient-based detectors increases greatly with different CE methods, while on the other hand the matching result of the extracted invariant descriptors is negatively influenced in terms of Precision-Recall (PR) and Equal Error Rate (EER). Rigorous experiments with state-of-the-art and other CE methods adopted in published SIFT-based hand vein recognition systems demonstrate this influence. Moreover, an improved SIFT model that imports the RootSIFT kernel and a Mirror Match Strategy into a unified framework is proposed, to exploit the positive change in keypoints and compensate for the negative influence brought by CE.
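
    For reference, the RootSIFT kernel mentioned above has a simple, widely cited form: L1-normalise each SIFT descriptor and take the element-wise square root, so that Euclidean matching approximates a Hellinger kernel. A minimal NumPy sketch (the 128-dimensional descriptor shape and the function name are assumptions):

```python
import numpy as np

def rootsift(descriptors: np.ndarray, eps: float = 1e-7) -> np.ndarray:
    """Convert SIFT descriptors (n x 128) to RootSIFT:
    L1-normalise each row, then take the element-wise square root."""
    descriptors = descriptors.astype(np.float64)
    l1 = np.abs(descriptors).sum(axis=1, keepdims=True) + eps
    return np.sqrt(descriptors / l1)

# Toy usage: Euclidean distance on RootSIFT approximates a Hellinger kernel on SIFT
d = np.random.randint(0, 256, size=(5, 128))
print(rootsift(d).shape)   # (5, 128)
```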

  19. Analysis of radial electric field in LHD towards improved confinement

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, M.; Ida, K.; Sanuki, H.; Itoh, K.; Narihara, K.; Tanaka, K.; Kawahata, K.; Ohyabu, N.

    2001-05-01

    The radial electric field (E{sub r}) properties in LHD have been investigated to provide guidance towards improved confinement with possible E{sub r} transition and bifurcation. The ambipolar E{sub r} is obtained from the neoclassical flux based on the analytical formulae. This approach is appropriate for clarifying ambipolar E{sub r} properties over a wide range of temperature and density in a more transparent way. The comparison between the calculated E{sub r} and the experimentally measured one has shown qualitatively good agreement, such as in the threshold density for the transition from ion root to electron root. The calculations also reproduce well the experimentally observed tendency that the electron root becomes possible by increasing temperatures even at higher density and that the ion root is enhanced at higher density. Based on the usefulness of this approach for analyzing E{sub r} in LHD, calculations over a wide range have been performed to clarify the parameter region of interest where multiple solutions of E{sub r} can exist. This is the region where E{sub r} transition and bifurcation may be realized, as already experimentally confirmed in CHS. The systematic calculations give a comprehensive understanding of the experimentally observed E{sub r} properties, which indicates an optimum path towards improved confinement. (author)
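
    The ambipolarity condition behind the ion-root/electron-root discussion is Gamma_e(E_r) = Gamma_i(E_r), which can admit multiple roots. A schematic sketch follows; the flux functions are invented toy stand-ins (not the LHD neoclassical formulae), shaped only so that the difference has three roots.

```python
import numpy as np
from scipy.optimize import brentq

# Toy stand-ins for the neoclassical particle fluxes (NOT the LHD analytical formulae);
# they are shaped so that Gamma_e - Gamma_i has three roots, mimicking the ion root,
# an unstable middle root, and the electron root.
def gamma_e(Er):
    return 2.0 + 0.1 * Er

def gamma_i(Er):
    return gamma_e(Er) - 0.05 * (Er + 2.0) * (Er - 1.0) * (Er - 3.0)

def ambipolar_roots(Er_min=-5.0, Er_max=5.0, n=400):
    """Find all E_r satisfying Gamma_e(E_r) = Gamma_i(E_r) by bracketing sign changes."""
    f = lambda Er: gamma_e(Er) - gamma_i(Er)
    grid = np.linspace(Er_min, Er_max, n)
    roots = []
    for a, b in zip(grid[:-1], grid[1:]):
        if f(a) == 0.0:
            roots.append(a)
        elif f(a) * f(b) < 0:          # sign change brackets a root
            roots.append(brentq(f, a, b))
    return roots

print(ambipolar_roots())   # -> approximately [-2.0, 1.0, 3.0]
```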

  20. Analysis of radial electric field in LHD towards improved confinement

    International Nuclear Information System (INIS)

    Yokoyama, M.; Ida, K.; Sanuki, H.; Itoh, K.; Narihara, K.; Tanaka, K.; Kawahata, K.; Ohyabu, N.

    2001-05-01

    The radial electric field (E_r) properties in LHD have been investigated to provide guidance towards improved confinement with possible E_r transition and bifurcation. The ambipolar E_r is obtained from the neoclassical flux based on the analytical formulae. This approach is appropriate for clarifying ambipolar E_r properties over a wide range of temperature and density in a more transparent way. The comparison between the calculated E_r and the experimentally measured one has shown qualitatively good agreement, such as in the threshold density for the transition from ion root to electron root. The calculations also reproduce well the experimentally observed tendency that the electron root becomes possible by increasing temperatures even at higher density and that the ion root is enhanced at higher density. Based on the usefulness of this approach for analyzing E_r in LHD, calculations over a wide range have been performed to clarify the parameter region of interest where multiple solutions of E_r can exist. This is the region where E_r transition and bifurcation may be realized, as already experimentally confirmed in CHS. The systematic calculations give a comprehensive understanding of the experimentally observed E_r properties, which indicates an optimum path towards improved confinement. (author)

  1. Improving E-Business Design through Business Model Analysis

    OpenAIRE

    Ilayperuma, Tharaka

    2010-01-01

    To a rapidly increasing degree, traditional organizational structures evolve in large parts of the world towards online business using modern Information and Communication Technology (ICT) capabilities. For efficient applications of inter-organizational information systems, the alignment between business and ICT is a key factor. In this context, business analysis using business modelling can be regarded as a first step in designing economically sustainable e-business solutions. This thesis ex...

  2. In House HSV PCR, Process Improvement and Cost Effectiveness Analysis

    Science.gov (United States)

    2017-09-15

    Poster, dated 09/15/2017. Title and subtitle recoverable from the report documentation page: Cost-Analysis: In-house HSV PCR capabilities.

  3. Improved methods for dependent failure analysis in PSA

    International Nuclear Information System (INIS)

    Ballard, G.M.; Games, A.M.

    1988-01-01

    The basic design principle used in ensuring the safe operation of a nuclear power plant is defence in depth. This normally takes the form of redundant equipment and systems which provide protection even if a number of equipment failures occur. Such redundancy is particularly effective in ensuring that multiple, independent equipment failures with the potential for jeopardising reactor safety will be rare events. However, the achievement of high reliability has served to highlight the potentially dominant role of multiple, dependent failures of equipment and systems. Analysis of reactor operating experience has shown that dependent failure events are the major contributors to safety system failures and reactor incidents and accidents. In parallel, PSA studies have shown that the results of a safety analysis are sensitive to assumptions made about the dependent failure (CCF) probability for safety systems. Thus a Westinghouse analysis showed that increasing system dependent failure probabilities by a factor of 5 led to a factor 4 increase in core. This paper particularly refers to the engineering concepts underlying dependent failure assessment, touching briefly on aspects of data. It is specifically not the intent of our work to develop a new mathematical model of CCF but to aid the use of existing models.

  4. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)

  5. Optimization to improve precision in neutron activation analysis

    International Nuclear Information System (INIS)

    Yustina Tri Handayani

    2010-01-01

    The level of precision or accuracy required in an analysis should satisfy both general requirements and customer needs. In presenting the results of the analysis, the level of precision is expressed as an uncertainty. The general requirement used here is the Horwitz prediction. Factors affecting the uncertainty in Neutron Activation Analysis (NAA) include the mass of the sample, the mass of the standard, the concentration in the standard, the counts of the sample, the counts of the standard and the counting geometry. Therefore, to achieve the expected level of precision, these parameters need to be optimized. A standard concentration of similar material is applied as the basis of calculation; in the calculation, NIST SRM 2704 is applied for sediment samples. The mass of the sample, the irradiation time and the cooling time can be modified to obtain the expected uncertainty. The prediction results show that the level of precision for Al, V, Mg, Mn, K, Na, As, Cr, Co, Fe, and Zn meets the Horwitz requirement. The predicted counts and standard deviations for Mg-27 and Zn-65 were higher than the actual values owing to the overlap of the Mg-27 and Mn-54 peaks and of the Zn-65 and Fe-59 peaks. The precision level for Ca is greater than the Horwitz limit, since the microscopic cross section, the radiation emission probability of Ca-49 and the gamma spectrometer efficiency at 3084 keV are relatively small; in this case precision can only be increased by extending the counting time and increasing the number of samples, because the other values are fixed. The prediction results are in accordance with experimental results. (author)
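
    For illustration, the main counting-statistics and reference contributions in relative NAA can be combined in quadrature to predict the relative uncertainty. A simplified sketch (ignoring decay-timing and geometry corrections; all parameter values are assumptions):

```python
import math

def relative_uncertainty(counts_sample, counts_standard,
                         u_rel_mass_sample=0.001, u_rel_mass_standard=0.001,
                         u_rel_conc_standard=0.02):
    """Combine the main relative uncertainty contributions in quadrature.
    Peak-area uncertainties are taken as Poisson (sqrt(N)/N); the mass and
    standard-concentration terms are supplied as relative uncertainties."""
    u_count_sample = 1.0 / math.sqrt(counts_sample)
    u_count_standard = 1.0 / math.sqrt(counts_standard)
    return math.sqrt(u_count_sample**2 + u_count_standard**2 +
                     u_rel_mass_sample**2 + u_rel_mass_standard**2 +
                     u_rel_conc_standard**2)

# Example: longer counting (more counts) shrinks the counting contribution
print(relative_uncertainty(10_000, 50_000))   # ~0.023, i.e. about 2.3 % relative
```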

  6. [Predictors of symptomatic improvement in eating disorders. Preliminary analysis].

    Science.gov (United States)

    Cygankiewicz, Patrycja; Solecka, Dorota; Pilecki, Maciej Wojciech; Józefik, Barbara

    2012-01-01

    The article discusses the preliminary results of a follow-up study carried out in 2009-2010 on former patients with a diagnosis of anorexia nervosa and bulimia nervosa, first seen in 2001-2004 at the Department of Child and Adolescent Psychiatry, the Jagiellonian University Medical College in Krakow. At that time they had been taking part in a research project, whose aim was to define the relationships among the psychopathological picture of eating disorders, self-image and family relations and also the influence of socio-cultural factors. The aim of the current study is to attempt to define factors influencing the course and prognosis of eating disorders in the studied group. Results from the Eating Disorder Inventory Questionnaire (EDI) and the Polish version of Family Assessment Measure (KOR) in the first study were juxtaposed with the clinical state and parameters of psychosocial functioning of the studied women assessed on the basis of the follow-up study. In the studied group, 13 girls suffered from anorexia nervosa--restricting type, 6 from anorexia nervosa binge-eating/purging type, and 6 from bulimia. In the studied group, there was complete symptomatic improvement in 12 persons (48%), subclinical symptoms continued to be observed in 9 persons (36%), and 4 persons (16%) met full diagnostic criteria for eating disorders. The most favourable course was observed in the group with a diagnosis of anorexia nervosa restricting type. The least favourable was observed in the group with a diagnosis of bulimia.

  7. Improvement of testing and maintenance based on fault tree analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2000-01-01

    Testing and maintenance of safety equipment is an important issue, which significantly contributes to safe and efficient operation of a nuclear power plant. In this paper a method, which extends the classical fault tree with time, is presented. Its mathematical model is represented by a set of equations, which include time requirements defined in the house events matrix. The house events matrix is a representation of house events switched on and off through discrete points of time. It includes house events that switch parts of the fault tree on and off in accordance with the status of the plant configuration. The time-dependent top event probability is calculated by fault tree evaluations. The arrangement of component outages is determined on the basis of minimization of the mean system unavailability. The results show that application of the method may improve the time placement of testing and maintenance activities of safety equipment. (author)
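
    A minimal sketch of the idea, assuming a two-train safety system whose top event is an AND of both trains and a house events matrix that forces a train unavailable while it is under test; all numbers are invented. It shows how the arrangement of outages (staggered vs. simultaneous) changes the mean system unavailability.

```python
# Two-train safety system: TOP event = both trains unavailable (A AND B).
# House events force a train unavailable while it is out for testing.

def top_probability(p_a, p_b, test_a, test_b):
    """Time-point top event probability; a house event (test_*) sets that
    train's unavailability to 1 for the duration of the test."""
    q_a = 1.0 if test_a else p_a
    q_b = 1.0 if test_b else p_b
    return q_a * q_b                      # AND gate

def mean_unavailability(p_a, p_b, house_matrix):
    """Average the top event probability over the discrete time points of
    the house events matrix (each row: test status of train A and train B)."""
    probs = [top_probability(p_a, p_b, ta, tb) for ta, tb in house_matrix]
    return sum(probs) / len(probs)

p_a = p_b = 1e-3
staggered    = [(False, False), (True, False), (False, False), (False, True)]
simultaneous = [(False, False), (True, True),  (False, False), (False, False)]

print(mean_unavailability(p_a, p_b, staggered))     # ~5.0e-4
print(mean_unavailability(p_a, p_b, simultaneous))  # ~0.25: simultaneous outages dominate
```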

  8. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    Directory of Open Access Journals (Sweden)

    Shao Jie

    2014-01-01

    Full Text Available A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of hidden layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA) with two-tone signal and broadband signals as input have shown that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL) model, Chebyshev neural network (CNN) model, and basic Elman neural network (BENN) model, the proposed model has better performance.

  9. Numerical analysis of modeling based on improved Elman neural network.

    Science.gov (United States)

    Jie, Shao; Li, Wang; WeiSong, Zhao; YaQin, Zhong; Malekian, Reza

    2014-01-01

    A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of hidden layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA) with two-tone signal and broadband signals as input have shown that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL) model, Chebyshev neural network (CNN) model, and basic Elman neural network (BENN) model, the proposed model has better performance.
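
    For illustration only: a forward-pass sketch of an Elman-style network whose hidden activations are Chebyshev orthogonal basis functions instead of sigmoids. The architecture, layer sizes and weights are assumptions and training is omitted; this is not the IENN of the paper.

```python
import numpy as np

def chebyshev_basis(x, order):
    """Evaluate Chebyshev polynomials of the first kind T_0..T_{order-1}
    at x (elementwise) via the recurrence T_n = 2 x T_{n-1} - T_{n-2}."""
    T = [np.ones_like(x), x]
    for _ in range(2, order):
        T.append(2.0 * x * T[-1] - T[-2])
    return np.stack(T[:order], axis=-1)

class ChebyshevElmanSketch:
    """Forward pass only: Elman-style context feedback with Chebyshev activations."""
    def __init__(self, n_in, n_hidden, n_out, order=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in  = rng.normal(scale=0.3, size=(n_in, n_hidden))
        self.W_ctx = rng.normal(scale=0.3, size=(n_hidden, n_hidden))
        self.W_out = rng.normal(scale=0.3, size=(n_hidden * order, n_out))
        self.order = order
        self.context = np.zeros(n_hidden)

    def step(self, x):
        pre = np.tanh(x @ self.W_in + self.context @ self.W_ctx)  # keep pre-activations in [-1, 1]
        hidden = chebyshev_basis(pre, self.order).reshape(-1)     # Chebyshev activations
        self.context = pre                                        # Elman context feedback
        return hidden @ self.W_out

net = ChebyshevElmanSketch(n_in=1, n_hidden=8, n_out=1)
for t in np.linspace(0, 1, 5):
    print(net.step(np.array([np.sin(2 * np.pi * t)])))
```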

  10. Improving Between-Shot Fusion Data Analysis with Parallel Structures

    Energy Technology Data Exchange (ETDEWEB)

    CHET NIETER

    2005-07-27

    In the Phase I project we concentrated on three technical objectives to demonstrate the feasibility of the Phase II project: (1) the development of a parallel MDSplus data handler, (2) the parallelization of existing fusion data analysis packages, and (3) the development of techniques to automatically generate parallelized code using pre-compiler directives. We summarize the results of the Phase I research for each of these objectives below. We also describe below additional accomplishments related to the development of the TaskDL and mpiDL parallelization packages.

  11. Recent improvements in plutonium gamma-ray analysis using MGA

    International Nuclear Information System (INIS)

    Ruhter, W.D.; Gunnink, R.

    1992-06-01

    MGA is a gamma-ray spectrum analysis program for determining relative plutonium isotopic abundances. It can determine plutonium isotopic abundances to better than 1% using a high-resolution, low-energy, planar germanium detector and measurement times of ten minutes or less. We have modified MGA to allow determination of absolute plutonium isotopic abundances in solutions. With calibration of a detector using a known solution concentration in a well-defined sample geometry, plutonium solution concentrations can be determined. MGA can also analyze a second, high-energy spectrum to determine fission product abundances relative to total plutonium. For the high-energy gamma-ray measurements we have devised a new hardware configuration in which both the low- and high-energy gamma-ray detectors are mounted in a single cryostat, thereby reducing the weight and volume of the detector systems. We describe the detector configuration and the performance of the MGA program for determining plutonium concentrations in solutions and fission product abundances.

  12. Analysis and Measures to Improve Waste Management in Schools

    Directory of Open Access Journals (Sweden)

    Elena Cristina Rada

    2016-08-01

    Full Text Available Assessing waste production in schools highlights the contribution of school children and school staff to the total amount of waste generated in a region, as well as any poor practices of recycling (the so-called separate collection of waste) in schools by the students, which could be improved through educational activities. Educating young people regarding the importance of environmental issues is essential, since instilling the right behavior in school children is also beneficial to the behavior of their families. The way waste management was carried out in different schools in Trento (northern Italy) was analyzed: a primary school, a secondary school, and three high schools were taken as cases of study. The possible influence of the age of the students and of the various activities carried out within the schools on the different behaviors in separating waste was also evaluated. The results showed that the production of waste did not only depend on the size of the institutes and on the number of occupants, but, especially, on the type of activities carried out in addition to the ordinary classes and on the habits of both pupils and staff. In the light of the results obtained, some corrective measures were proposed to schools, aimed at increasing the awareness of the importance of the right behavior in waste management by students and the application of good practices of recycling.

  13. Systematic wavelength selection for improved multivariate spectral analysis

    Science.gov (United States)

    Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.

    1995-01-01

    Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g. the concentration of an analyte such as glucose in blood or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the model's fitness of the determination for the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
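
    A hedged sketch of the genetic-algorithm wavelength-subset idea on synthetic spectra: the fitness combines prediction performance (least-squares calibration error) with a cost term for the number of wavelengths, F = f(cost, performance). The data, GA operators and penalty weight are all assumptions, not the patented procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectra: 60 samples x 50 wavelengths; the analyte only influences
# wavelengths 10-14, the rest are noise (stand-in for a real calibration set).
n_samples, n_wavelengths = 60, 50
concentration = rng.uniform(0, 1, n_samples)
spectra = rng.normal(0, 1, (n_samples, n_wavelengths))
spectra[:, 10:15] += np.outer(concentration, rng.uniform(1, 2, 5))

def fitness(mask):
    """F = f(cost, performance): penalise prediction error and subset size."""
    if mask.sum() == 0:
        return -np.inf
    X = spectra[:, mask.astype(bool)]
    X1 = np.column_stack([X, np.ones(n_samples)])
    coef, *_ = np.linalg.lstsq(X1, concentration, rcond=None)
    rmse = np.sqrt(np.mean((X1 @ coef - concentration) ** 2))
    return -(rmse + 0.01 * mask.sum())          # smaller error, fewer wavelengths

def genetic_selection(pop_size=40, generations=60, p_mut=0.02):
    pop = rng.integers(0, 2, (pop_size, n_wavelengths))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]               # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_wavelengths)            # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_wavelengths) < p_mut        # mutation
            child = np.where(flip, 1 - child, child)
            children.append(child)
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return np.flatnonzero(best)

print(genetic_selection())   # should concentrate on the informative wavelengths ~10-14
```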

  14. Performance analysis of PV plants: Optimization for improving profitability

    International Nuclear Information System (INIS)

    Díez-Mediavilla, M.; Alonso-Tristán, C.; Rodríguez-Amigo, M.C.; García-Calderón, T.; Dieste-Velasco, M.I.

    2012-01-01

    Highlights: ► Real PV production from two 100 kWp grid-connected installations is conducted. ► Data sets on production were collected over an entire year. ► Economic results highlight the importance of properly selecting the system components. ► Performance of PV plants is directly related to improvements of all components. - Abstract: A study is conducted of real PV production from two 100 kWp grid-connected installations located in the same area, both of which experience the same fluctuations in temperature and radiation. Data sets on production were collected over an entire year and both installations were compared under various levels of radiation. The installations were assembled with mono-Si panels, mounted on the same support system, and the power supply was equal for the inverter and the measurement system; the same parameters were also employed for the wiring, and electrical losses were calculated in both cases. The results, in economic terms, highlight the importance of properly selecting the system components and the design parameters for maximum profitability.

  15. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...... to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs...

  16. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...... to only analyse the ethical consequences of a technology, but also the ethical issues of the whole HTA process must be considered. Selection of assessment topics, methods and outcomes is essentially a value-laden decision. Health technologies may challenge moral or cultural values and beliefs...

  17. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. It is efficient because it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. Its computational efficiency is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized for the generation of system inequalities, which is useful in reliability estimation of capacitated networks
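
    For small networks, the minimal-cutset concept can be illustrated by brute force: enumerate node subsets of increasing order and keep those that disconnect supply from demand and are not supersets of an already found cutset. The sketch below (assuming networkx; the example network is invented) is conceptual only and is not the paper's efficient algorithm.

```python
from itertools import combinations
import networkx as nx

def minimal_cutsets(G, source, sink, max_order=3):
    """Brute-force enumeration of minimal node cutsets that disconnect
    source from sink (suitable only for small distribution networks)."""
    candidates = [n for n in G if n not in (source, sink)]
    cutsets = []
    for order in range(1, max_order + 1):
        for combo in combinations(candidates, order):
            if any(set(c) <= set(combo) for c in cutsets):
                continue                      # a subset already cuts: not minimal
            H = G.copy()
            H.remove_nodes_from(combo)
            if not nx.has_path(H, source, sink):
                cutsets.append(combo)
    return cutsets

# Small looped distribution network: supply S feeds demand D through nodes 1-4
G = nx.Graph([("S", 1), (1, 2), (2, "D"), ("S", 3), (3, 4), (4, "D")])
print(minimal_cutsets(G, "S", "D"))
# [(1, 3), (1, 4), (2, 3), (2, 4)] -- every pair that breaks both feed paths
```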

  18. Pooled calibrations and retainment of outliers improve chemical analysis

    DEFF Research Database (Denmark)

    Andersen, Jens; Alfaloje, Haedar S.H.

    2012-01-01

    Analytical chemistry has a large responsibility in society, and credibility and reliability are important concepts associated with chemical analysis. Metrology and Quality Assurance (QA) are key areas of interest in contemporary research. Quality in measurements is illustrated by a series...... of experiments with several analytical technologies comprising ICP-MS, GC-MS and AAS. The scientific methodology relies on the concept of reproducibility, which depends on the type of analyte and type of apparatus. By applying the principle of pooled calibrations it is shown that the performance of the apparatus...... indicate that the procedures outlined in the Eurachem/CITAC Guide are of tremendous value to the analytical sciences because they direct researchers' attention towards the concept of consensus values rather than towards true values. Introduction of certified reference materials (CRMs) in metrology has...

  19. An improved method for thin layer chromatographic analysis of saponins.

    Science.gov (United States)

    Sharma, Om P; Kumar, Neeraj; Singh, Bikram; Bhat, Tej K

    2012-05-01

    Analysis of saponins by thin layer chromatography (TLC) is reported. The solvent system was n-butanol:water:acetic acid (84:14:7). Detection of saponins on the TLC plates after development and air-drying was done by immersion in a suspension of sheep erythrocytes, followed by washing off the excess blood on the plate surface. Saponins appeared as white spots against a pink background. The protocol provided specific detection of saponins in the saponin-enriched extracts from Aesculus indica (Wall. ex Camb.) Hook.f., Lonicera japonica Thunb., Silene inflata Sm., Sapindus mukorossi Gaertn., Chlorophytum borivilianum Santapau & Fernandes, Asparagus adscendens Roxb., Asparagus racemosus Willd., Agave americana L., Camellia sinensis [L.] O. Kuntze. The protocol is convenient, inexpensive, does not require any corrosive chemicals and provides specific detection of saponins. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Ethical analysis to improve decision-making on health technologies

    DEFF Research Database (Denmark)

    Saarni, Samuli I; Hofmann, Bjørn; Lampe, Kristian

    2008-01-01

    beyond effectiveness and costs to also considering the social, organizational and ethical implications of technologies. However, a commonly accepted method for analysing the ethical aspects of health technologies is lacking. This paper describes a model for ethical analysis of health technology...... that is easy and flexible to use in different organizational settings and cultures. The model is part of the EUnetHTA project, which focuses on the transferability of HTAs between countries. The EUnetHTA ethics model is based on the insight that the whole HTA process is value laden. It is not sufficient...... and their implementation may also have significant impact on people other than the patient. These are essential considerations for health policy. The ethics model is structured around key ethical questions rather than philosophical theories, to be applicable to different cultures and usable by non-philosophers...

  1. Database improvements for motor vehicle/bicycle crash analysis

    Science.gov (United States)

    Lusk, Anne C; Asgarzadeh, Morteza; Farvid, Maryam S

    2015-01-01

    Background Bicycling is healthy but needs to be safer for more to bike. Police crash templates are designed for reporting crashes between motor vehicles, but not between vehicles/bicycles. If written/drawn bicycle-crash-scene details exist, these are not entered into spreadsheets. Objective To assess which bicycle-crash-scene data might be added to spreadsheets for analysis. Methods Police crash templates from 50 states were analysed. Reports for 3350 motor vehicle/bicycle crashes (2011) were obtained for the New York City area and 300 cases selected (with drawings and on roads with sharrows, bike lanes, cycle tracks and no bike provisions). Crashes were redrawn and new bicycle-crash-scene details were coded and entered into the existing spreadsheet. The association between severity of injuries and bicycle-crash-scene codes was evaluated using multiple logistic regression. Results Police templates only consistently include pedal-cyclist and helmet. Bicycle-crash-scene coded variables for templates could include: 4 bicycle environments, 18 vehicle impact-points (opened-doors and mirrors), 4 bicycle impact-points, motor vehicle/bicycle crash patterns, in/out of the bicycle environment and bike/relevant motor vehicle categories. A test of including these variables suggested that, with bicyclists who had minor injuries as the control group, bicyclists on roads with bike lanes riding outside the lane had lower likelihood of severe injuries (OR, 0.40, 95% CI 0.16 to 0.98) compared with bicyclists riding on roads without bicycle facilities. Conclusions Police templates should include additional bicycle-crash-scene codes for entry into spreadsheets. Crash analysis, including with big data, could then be conducted on bicycle environments, motor vehicle potential impact points/doors/mirrors, bicycle potential impact points, motor vehicle characteristics, location and injury. PMID:25835304

  2. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal

    Science.gov (United States)

    2014-05-01

    Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal. Presentation dated May 14-15, 2014, by Dr. Ying Zhao and Dr. Douglas J. MacKinnon. Recoverable content from the slides: methods of System Self-awareness (SSA) and Lexical Link Analysis (LLA); extracting relations among PE, MDAP, and ACAT II; extracting costs.

  3. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Directory of Open Access Journals (Sweden)

    Gafurov Andrey

    2018-01-01

    Full Text Available The specific nature of high-rise investment projects entailing long-term construction, high risks, etc. implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the “Project analysis scenario” flowchart, improving quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping for better cost-benefit project analysis provided the broad range of risks in high-rise construction; analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in implementation of high-rise projects.

  4. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Science.gov (United States)

    Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir

    2018-03-01

    The specific nature of high-rise investment projects entailing long-term construction, high risks, etc. implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, improving quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping for better cost-benefit project analysis provided the broad range of risks in high-rise construction; analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in implementation of high-rise projects.
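
    A compact sketch of the dynamic cost-benefit core described above: discount the project cash flows at the weighted average cost of capital and vary one critical variable to gauge sensitivity. All figures and names are invented for illustration.

```python
def wacc(equity, debt, cost_equity, cost_debt, tax_rate):
    """Weighted average cost of capital."""
    total = equity + debt
    return (equity / total) * cost_equity + (debt / total) * cost_debt * (1 - tax_rate)

def npv(cash_flows, rate):
    """Discount a series of yearly cash flows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative high-rise project: large up-front outlay, sales revenue from year 3
rate = wacc(equity=40e6, debt=60e6, cost_equity=0.18, cost_debt=0.10, tax_rate=0.20)
base_flows = [-80e6, -30e6, -10e6, 55e6, 65e6, 70e6]

# Sensitivity of NPV to one critical variable (here: the sales revenue level)
for factor in (0.8, 0.9, 1.0, 1.1):
    flows = [cf * factor if cf > 0 else cf for cf in base_flows]
    print(f"revenue x{factor:.1f}: NPV = {npv(flows, rate) / 1e6:,.1f} M")
```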

  5. Delamination Modeling of Composites for Improved Crash Analysis

    Science.gov (United States)

    Fleming, David C.

    1999-01-01

    Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated, including a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for accurate solution are great and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structure in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.

  6. Improving Power System Stability Using Transfer Function: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    G. Shahgholian

    2017-10-01

    Full Text Available In this paper, a small-signal dynamic model of a single-machine infinite-bus (SMIB) power system that includes an IEEE type-ST1 excitation system and a PSS based on a transfer function structure is presented. The effect of changes in the operating condition of the power system on dynamic performance is examined. The dynamic performance of the closed-loop system is analyzed based on its eigenvalues. The effect of parameter changes on dynamic stability is verified by simulation results. Three types of PSS have been considered for analysis: (a) the derivative PSS, (b) the lead-lag PSS or conventional PSS, and (c) the proportional-integral-derivative PSS. The objective function is formulated to increase the damping ratio of the electromechanical mode eigenvalues. Simulation results show that the PID-PSS performs better, with less overshoot and shorter settling time, compared with the CPSS and DPSS under different load operation and significant system parameter variation conditions.
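
    For illustration, the conventional (lead-lag) PSS named above is commonly written as K_s * sT_w/(1+sT_w) * (1+sT_1)/(1+sT_2) * (1+sT_3)/(1+sT_4). A SciPy sketch that builds this transfer function and reads off its phase lead near the electromechanical frequency range follows; the parameter values are assumptions, not those of the paper.

```python
import numpy as np
from scipy import signal

def lead_lag_pss(Ks=10.0, Tw=10.0, T1=0.20, T2=0.05, T3=0.20, T4=0.05):
    """Conventional PSS: washout stage times two lead-lag stages,
    Ks * (s*Tw)/(1+s*Tw) * (1+s*T1)/(1+s*T2) * (1+s*T3)/(1+s*T4)."""
    num = [Ks * Tw, 0.0]              # washout numerator: Ks*Tw*s
    den = [Tw, 1.0]
    for Ta, Tb in ((T1, T2), (T3, T4)):
        num = np.polymul(num, [Ta, 1.0])
        den = np.polymul(den, [Tb, 1.0])
    return signal.TransferFunction(num, den)

pss = lead_lag_pss()
w = np.logspace(-1, 2, 200)                       # rad/s, spans electromechanical modes
w, mag, phase = signal.bode(pss, w)
i = np.argmin(np.abs(w - 2 * np.pi * 1.0))        # ~1 Hz local mode
print(f"phase lead near 1 Hz: {phase[i]:.1f} deg, gain: {mag[i]:.1f} dB")
```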

  7. Improving Human/Autonomous System Teaming Through Linguistic Analysis

    Science.gov (United States)

    Meszaros, Erica L.

    2016-01-01

    An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogenous teams. Procedures exist for licensing pilots to operate in the national airspace system and current work is being done to define methods for validating the function of autonomous systems, however there is no method in place for assessing the interaction of these two disparate systems. Moreover, currently these systems are operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can lead to increased usability and allow for operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  8. System Engineering Analysis For Improved Scout Business Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    Van Slyke, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-01-30

    monitoring of content that is accessible. The study examines risks associated with information security, technological change and the continued popularity of Scouting. Mitigation is based on the system functions that are defined. The approach to developing an improved system for facilitating Boy Scout leader functions was iterative, with insights into capabilities coming in the course of working through the use cases and sequence diagrams.

  9. Using robust statistics to improve neutron activation analysis results

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Ticianelli, Regina B.; Figueiredo, Ana Maria G.

    2011-01-01

    Neutron activation analysis (NAA) is an analytical technique where an unknown sample is submitted to a neutron flux in a nuclear reactor, and its elemental composition is calculated by measuring the induced activity produced. By using the relative NAA method, one or more well-characterized samples (usually certified reference materials - CRMs) are irradiated together with the unknown ones, and the concentration of each element is then calculated by comparing the areas of the gamma ray peaks related to that element. When two or more CRMs are used as reference, the concentration of each element can be determined by several different ways, either using more than one gamma ray peak for that element (when available), or using the results obtained in the comparison with each CRM. Therefore, determining the best estimate for the concentration of each element in the sample can be a delicate issue. In this work, samples from three CRMs were irradiated together and the elemental concentration in one of them was calculated using the other two as reference. Two sets of peaks were analyzed for each element: a smaller set containing only the literature-recommended gamma-ray peaks and a larger one containing all peaks related to that element that could be quantified in the gamma-ray spectra; the most recommended transition was also used as a benchmark. The resulting data for each element was then reduced using up to five different statistical approaches: the usual (and not robust) unweighted and weighted means, together with three robust means: the Limitation of Relative Statistical Weight, Normalized Residuals and Rajeval. The resulting concentration values were then compared to the certified value for each element, allowing for discussion on both the performance of each statistical tool and on the best choice of peaks for each element. (author)
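
    To make the contrast concrete, the sketch below compares a plain weighted mean with a simple normalized-residuals-style robust mean that iteratively excludes the most discrepant result; it is a schematic stand-in, not the exact Normalized Residuals, Rajeval or Limitation of Relative Statistical Weight algorithms, and the data values are invented.

```python
import numpy as np

def weighted_mean(values, sigmas):
    w = 1.0 / np.asarray(sigmas, float) ** 2
    mean = np.sum(w * np.asarray(values, float)) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

def normalized_residuals_mean(values, sigmas, limit=2.0):
    """Iteratively discard the result with the largest normalized residual
    |x_i - mean| / sigma_i above `limit`, then recompute the weighted mean."""
    values, sigmas = np.asarray(values, float), np.asarray(sigmas, float)
    keep = np.ones(len(values), dtype=bool)
    while keep.sum() > 1:
        mean, _ = weighted_mean(values[keep], sigmas[keep])
        r = np.abs(values - mean) / sigmas
        r[~keep] = 0.0
        worst = np.argmax(r)
        if r[worst] <= limit:
            break
        keep[worst] = False
    return weighted_mean(values[keep], sigmas[keep])

# Concentrations (mg/kg) for one element obtained from several peaks/comparators;
# the last value is an outlier caused, e.g., by an interfering peak.
vals   = [10.2, 9.9, 10.4, 10.1, 13.8]
sigmas = [0.3, 0.4, 0.5, 0.3, 0.6]
print(weighted_mean(vals, sigmas))              # pulled up by the outlier
print(normalized_residuals_mean(vals, sigmas))  # outlier rejected
```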

  10. IMPROVING FINANCIAL ANALYSIS OF ORGANIZATIONS IN ORDER TO PREVENT THEIR INSOLVENCY

    Directory of Open Access Journals (Sweden)

    V. N. Alferov

    2012-01-01

    Full Text Available The current regulatory analysis of the financial condition of insolvent organizations has some disadvantages and does not account for the features of analysis based on consolidated financial statements under IFRS and GAAP. In this work, on the basis of a comparative analysis of the financial condition of a number of large Russian companies, calculated from their accounting statements prepared under Russian accounting standards, IFRS and GAAP, proposals are developed to improve the analysis of the financial condition of insolvent institutions.

  11. IMPROVING FINANCIAL ANALYSIS OF ORGANIZATIONS IN ORDER TO PREVENT THEIR INSOLVENCY

    Directory of Open Access Journals (Sweden)

    V. N. Alferov

    2013-01-01

    Full Text Available The current regulatory analysis of the financial condition of insolvent organizations has some disadvantages and does not account for the features of analysis based on consolidated financial statements under IFRS and GAAP. In this work, on the basis of a comparative analysis of the financial condition of a number of large Russian companies, calculated from their accounting statements prepared under Russian accounting standards, IFRS and GAAP, proposals are developed to improve the analysis of the financial condition of insolvent institutions.

  12. Dr. Dahl publishes article on improving intelligence analysis through the use of case studies

    OpenAIRE

    2017-01-01

    NSA’s Dr. Erik Dahl has recently published an article in Intelligence and National Security entitled “Getting beyond Analysis by Anecdote: Improving Intelligence Analysis through the Use of Case Studies.” Dahl argues that although American intelligence officials have been trying since the 9/11 attacks to improve the quality of their analysis, they have so far failed to make much use of one of the most common methods used in the social sciences: case study analysis. Through better u...

  13. Logistics analysis to Improve Deployability (LOG-AID): Field Experiment/Results

    National Research Council Canada - National Science Library

    Evers, Kenneth

    2000-01-01

    .... Under sponsorship of the Air Force Research Laboratory Logistics Readiness Branch (AFRL/HESR), the Synergy team analyzed the current wing-level deployment process as part of the Logistics Analysis to Improve Deployability (LOG-AID) program...

  14. Functional improvement after carotid endarterectomy: demonstrated by gait analysis and acetazolamide stress brain perfusion SPECT

    International Nuclear Information System (INIS)

    Kim, J. S.; Kim, G. E.; Yoo, J. Y.; Kim, D. G.; Moon, D. H.

    2005-01-01

    Scientific documentation of neurologic improvement following carotid endarterectomy (CEA) has not been established. The purpose of this prospective study was to investigate whether CEA performed for an internal carotid artery flow lesion improves gait and cerebrovascular hemodynamic status in patients with gait disturbance. We prospectively performed pre- and post-CEA gait analysis and acetazolamide stress brain perfusion SPECT (Acz-SPECT) with Tc-99m ECD in 91 patients (M/F: 81/10, mean age: 64.1 y) who had gait disturbance before receiving CEA. Gait performance was assessed using a Vicon 370 motion analyzer. Gait improvement after CEA was correlated with cerebrovascular hemodynamic change as well as symptom duration. As a control, 12 hemiparetic stroke patients (M/F: 9/3, mean age: 51 y) who did not receive CEA underwent gait analysis twice at a one-week interval to evaluate whether repeat testing of gait performance shows a learning effect. Of the 91 patients, 73 (80%) showed gait improvement (change of gait speed > 10%) and 42 (46%) showed marked improvement (change of gait speed > 20%), whereas no improvement was observed in the control group at the repeat test. Post-operative cerebrovascular hemodynamic improvement was noted in 49 (54%) of the 91 patients. There was marked gait improvement in the patient group with cerebrovascular hemodynamic improvement compared with the no-change group (p<0.05). Marked gait improvement and cerebrovascular hemodynamic improvement were noted in 53% and 61%, respectively, of the patients who had less than a 3-month history of symptoms, compared with 31% and 24% of the patients with a longer history (p<0.05). Marked gait improvement was obtained in patients who showed improvement of cerebrovascular hemodynamic status on Acz-SPECT after CEA. These results suggest that functional improvement such as gait can result from improved perfusion of the misery perfusion area, which remains viable for a longer period than previously reported in the literature

  15. On-treatment analysis of the Improved Reduction of Outcomes: Vytorin Efficacy International Trial (IMPROVE-IT).

    Science.gov (United States)

    Blazing, Michael A; Giugliano, Robert P; de Lemos, James A; Cannon, Christopher P; Tonkin, Andrew; Ballantyne, Christie M; Lewis, Basil S; Musliner, Thomas A; Tershakovec, Andrew M; Lokhnygina, Yuliya; White, Jennifer A; Reist, Craig; McCagg, Amy; Braunwald, Eugene

    2016-12-01

    We aimed to determine the efficacy and safety of adding ezetimibe (Ez) to simvastatin (S) in a post-acute coronary syndrome (ACS) population in a prespecified on-treatment analysis. We evaluated 17,706 post-ACS patients from the IMPROVE-IT trial who had low-density lipoprotein cholesterol values between 50 and 125 mg/dL and who received Ez 10 mg/d with S 40 mg/d (Ez/S) or placebo with simvastatin 40 mg/d (P/S). The primary composite end point was cardiovascular death, myocardial infarction, unstable angina, coronary revascularization ≥30 days postrandomization, or stroke. The on-treatment analysis included patients who received study drug for the duration of the trial or experienced a primary end point or noncardiovascular death within 30 days of drug discontinuation. Mean low-density lipoprotein cholesterol values at 1 year were 71 mg/dL for P/S and 54 mg/dL for Ez/S (absolute difference -17 mg/dL = -24%; P < .001). The 7-year Kaplan-Meier estimate of the primary end point occurred in 32.4% in the P/S arm and 29.8% in the Ez/S arm (absolute difference 2.6%; adjusted HR 0.92 [95% CI 0.87-0.98]; P = .01). The absolute treatment effect favoring Ez/S was 30% greater than in the intention-to-treat analysis of IMPROVE-IT. This analysis provides additional support for the efficacy and safety of adding Ez to S in this high-risk, post-ACS population. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Cause-Effect Analysis: Improvement of a First Year Engineering Students' Calculus Teaching Model

    Science.gov (United States)

    van der Hoff, Quay; Harding, Ansie

    2017-01-01

    This study focuses on the mathematics department at a South African university and in particular on teaching of calculus to first year engineering students. The paper reports on a cause-effect analysis, often used for business improvement. The cause-effect analysis indicates that there are many factors that impact on secondary school teaching of…

  17. IMPROVED EFFECT AND FEM ANALYSIS OF VACUUM CONSOLIDATION METHOD USES PRELOAD EMBANKMENT

    Science.gov (United States)

    Hirata, Masafumi; Fukuda, Jun; Nobuta, Junichi; Nishikawa, Kouji; Yamada, Kouichi; Kawaida, Minoru

    In the vacuum consolidation method, the vacuum pressure acts toward the inside of the improvement region. When an embankment is used together with it, reduced lateral deformation and rapid construction of the embankment are possible. However, FEM analysis is necessary to predict such an improvement effect. In this paper, the improvement effect and the deformation characteristics of the vacuum consolidation method executed in the Wakasa construction were verified. Moreover, a soil-water coupled FEM analysis was performed, and its use for execution management was examined. In the Wakasa construction, it was confirmed that the vacuum consolidation method was highly effective in reducing deformation of the surrounding soil, accelerating consolidation, and shortening the construction period. The FEM analysis reproduced the improvement effect of the vacuum consolidation with high accuracy. In the Wakasa construction, stable embankment construction was achieved by using the analytical results for execution management.

  18. Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE)

    Science.gov (United States)

    2015-05-01

    Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE). Presentation dated May 13-14, 2015, by Dr. Ying Zhao and colleagues. Recoverable content from the slides: LLA Core with LLA reports and visualizations; Collaborative Learning Agents (CLA).

  19. Surface plasmon resonance thermodynamic and kinetic analysis as a strategic tool in drug design. Distinct ways for phosphopeptides to plug into Src- and Grb2 SH2 domains

    NARCIS (Netherlands)

    de Mol, Nico J; Dekker, Frank J; Broutin, Isabel; Fischer, Marcel J E; Liskamp, Rob M J; Dekker, Frank

    2005-01-01

    Thermodynamic and kinetic studies of biomolecular interactions give insight into specificity of molecular recognition processes and advance rational drug design. Binding of phosphotyrosine (pY)-containing peptides to Src- and Grb2-SH2 domains was investigated using a surface plasmon resonance

  20. Improvement of the computing speed of the FBR fuel pin bundle deformation analysis code 'BAMBOO'

    International Nuclear Information System (INIS)

    Ito, Masahiro; Uwaba, Tomoyuki

    2005-04-01

    JNC has developed a coupled analysis system of the fuel pin bundle deformation analysis code 'BAMBOO' and the thermal hydraulics analysis code 'ASFRE-IV' for the purpose of evaluating the integrity of a subassembly under the BDI condition. This coupled analysis took much computation time because it needs convergent calculations to obtain numerically stationary solutions for thermal and mechanical behaviors. We improved the computation time of the BAMBOO code analysis to make the coupled analysis practicable. 'BAMBOO' is a FEM code, and as such its matrix calculations consume a large memory area to temporarily store intermediate results in the solution of simultaneous linear equations. The code used the Hard Disk Drive (HDD) as a virtual memory area to save the Random Access Memory (RAM) of the computer. However, the use of the HDD increased the computation time because Input/Output (I/O) processing with the HDD took much time in data accesses. We improved the code so that it conducts I/O processing only with the RAM in matrix calculations and runs on high-performance computers. This improvement considerably increased the CPU occupation rate during the simulation and reduced the total simulation time of the BAMBOO code to about one-seventh of that before the improvement. (author)

  1. Analysis of Human Errors in Industrial Incidents and Accidents for Improvement of Work Safety

    DEFF Research Database (Denmark)

    Leplat, J.; Rasmussen, Jens

    1984-01-01

    Methods for the analysis of work accidents are discussed, and a description is given of the use of a causal situation analysis in terms of a 'variation tree' in order to explain the course of events of the individual cases and to identify possible improvements. The difficulties in identifying...... 'causes' of accidents are discussed, and it is proposed to analyze accident reports with the specific aim of identifying the potential for future improvements rather than causes of past events. In contrast to traditional statistical analysis of work accident data, which typically give very general...... recommendations, the method proposed identifies very explicit countermeasures. Improvements require a change in human decisions during equipment design, work planning, or the execution itself. The use of a model of human behavior drawing a distinction between automated skill-based behavior, rule-based 'know......

  2. Analysis of Human Errors in Industrial Incidents and Accidents for Improvement of Work Safety

    DEFF Research Database (Denmark)

    Leplat, J.; Rasmussen, Jens

    1984-01-01

    Methods for the analysis of work accidents are discussed, and a description is given of the use of a causal situation analysis in terms of a 'variation tree' in order to explain the course of events of the individual cases and to identify possible improvements. The difficulties in identifying...... 'causes' of accidents are discussed, and it is proposed to analyze accident reports with the specific aim of identifying the potential for future improvements rather than causes of past events. In contrast to traditional statistical analysis of work accident data, which typically give very general...... recommendations, the method proposed identifies very explicit countermeasures. Improvements require a change in human decisions during equipment design, work planning, or the execution itself. The use of a model of human behavior drawing a distinction between automated skill-based behavior, rule-based 'know...

  3. Maintaining and improving of the training program on the analysis software in CMS

    International Nuclear Information System (INIS)

    Malik, S; Hoehle, F; Lassila-Perini, K; Hinzmann, A; Wolf, R; Shipsey, I

    2012-01-01

    Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.

  4. An improved multiple linear regression and data analysis computer program package

    Science.gov (United States)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
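
    A compact NumPy/SciPy sketch of the core computations such a package performs: an ordinary least-squares fit in double precision with t-statistics and probability levels per coefficient, plus residuals for diagnostic plots. It is a generic illustration, not the NEWRAP code; all variable names and data are assumptions.

```python
import numpy as np
from scipy import stats

def fit_mlr(X, y):
    """Ordinary least squares in double precision with t-statistics per coefficient."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=np.float64)])
    y = np.asarray(y, dtype=np.float64)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    s2 = resid @ resid / dof
    cov = s2 * np.linalg.inv(X.T @ X)
    t = beta / np.sqrt(np.diag(cov))
    p = 2 * stats.t.sf(np.abs(t), dof)       # probability levels for each t-statistic
    return beta, t, p, resid

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = 2.0 + 1.5 * X[:, 0] - 0.7 * X[:, 2] + rng.normal(scale=0.3, size=40)  # X[:, 1] irrelevant
beta, t, p, resid = fit_mlr(X, y)
print(np.round(beta, 3), np.round(p, 4))  # p for X[:, 1] should be large (candidate to reject)
```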

  5. Improvement in Student Data Analysis Skills after Out-of-Class Assignments

    Directory of Open Access Journals (Sweden)

    Kristen Lee Williams Walton

    2016-12-01

    Full Text Available The ability to understand and interpret data is a critical aspect of scientific thinking.  However, although data analysis is often a focus in biology majors classes, many textbooks for allied health majors classes are primarily content-driven and do not include substantial amounts of experimental data in the form of graphs and figures.  In a lower-division allied health majors microbiology class, students were exposed to data from primary journal articles as take-home assignments and their data analysis skills were assessed in a pre-/posttest format.  Students were given 3 assignments that included data analysis questions.  Assignments ranged from case studies that included a figure from a journal article to reading a short journal article and answering questions about multiple figures or tables.  Data were represented as line or bar graphs, gel photographs, and flow charts.  The pre- and posttest was designed incorporating the same types of figures to assess whether the assignments resulted in any improvement in data analysis skills.  The mean class score showed a small but significant improvement from the pretest to the posttest across three semesters of testing.  Scores on individual questions testing accurate conclusions and predictions improved the most.  This supports the conclusion that a relatively small number of out-of-class assignments through the semester resulted in a significant improvement in data analysis abilities in this population of students.

  6. Improvement on reaction model for sodium-water reaction jet code and application analysis

    International Nuclear Information System (INIS)

    Itooka, Satoshi; Saito, Yoshinori; Okabe, Ayao; Fujimata, Kazuhiro; Murata, Shuuichi

    2000-03-01

    In selecting a reasonable DBL for the steam generator (SG), it is necessary to improve the analytical method for estimating the sodium temperature during failure propagation due to overheating. Improvement of the sodium-water reaction (SWR) jet code (LEAP-JET ver.1.30) and application analyses against water injection tests, carried out to confirm the validity of the code, were performed. In the code improvement, a gas-liquid interface area density model was introduced to develop a chemical reaction model with little dependence on the calculation mesh size. Test calculations using the improved code (LEAP-JET ver.1.40) were carried out for the conditions of the SWAT-3·Run-19 test and an actual-scale SG, and it was confirmed that the predicted SWR jet behavior and the influence of the model on the analysis results are reasonable. For the application analyses against the water injection tests, water injection behavior and SWR jet behavior in the new SWAT-1 (SWAT-1R) and SWAT-3 (SWAT-3R) tests were analyzed using the LEAP-BLOW code and the LEAP-JET code. In the LEAP-BLOW application analysis, a parameter survey was performed; as a result, the injection nozzle diameter needed to simulate the water leak rate was confirmed. In the LEAP-JET application analysis, the temperature behavior of the SWR jet was investigated. (author)

  7. Improvement of precision method of spectrophotometry with inner standardization and its use in plutonium solutions analysis

    International Nuclear Information System (INIS)

    Stepanov, A.V.; Stepanov, D.A.; Nikitina, S.A.; Gogoleva, T.D.; Grigor'eva, M.G.; Bulyanitsa, L.S.; Panteleev, Yu.A.; Pevtsova, E.V.; Domkin, V.D.; Pen'kin, M.V.

    2006-01-01

    A precision spectrophotometric method with inner standardization is used for the analysis of pure Pu solutions. The spectrophotometer and the spectrophotometric analysis procedure were improved to decrease the random component of the relative error of the method. The influence of U and Np impurities and of corrosion products on the systematic component of the error of the method, and the effect of fluoride ion on the completeness of Pu oxidation during sample preparation, are also studied.

  8. An improved and explicit surrogate variable analysis procedure by coefficient adjustment.

    Science.gov (United States)

    Lee, Seunggeun; Sun, Wei; Wright, Fred A; Zou, Fei

    2017-06-01

    Unobserved environmental, demographic, and technical factors can negatively affect the estimation and testing of the effects of primary variables. Surrogate variable analysis, proposed to tackle this problem, has been widely used in genomic studies. To estimate hidden factors that are correlated with the primary variables, surrogate variable analysis performs principal component analysis either on a subset of features or on all features, but weighting each differently. However, existing approaches may fail to identify hidden factors that are strongly correlated with the primary variables, and the extra step of feature selection and weight calculation makes the theoretical investigation of surrogate variable analysis challenging. In this paper, we propose an improved surrogate variable analysis using all measured features that has a natural connection with restricted least squares, which allows us to study its theoretical properties. Simulation studies and real data analysis show that the method is competitive to state-of-the-art methods.
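
    The following is a minimal sketch of the generic surrogate-variable idea (regress out the primary variable, then take principal components of the residual matrix); it is an illustration under simplified assumptions, not the coefficient-adjustment procedure proposed in the paper, and all names and data are synthetic.

```python
# Hedged sketch of the generic surrogate-variable idea: remove the primary
# variable's fit from the feature matrix, then take leading principal
# components of the residuals as candidate hidden factors.
import numpy as np

def surrogate_variables(Y, x, n_sv=2):
    """Y: (features, samples) data matrix; x: (samples,) primary variable."""
    X = np.column_stack([np.ones_like(x), x])
    beta, _, _, _ = np.linalg.lstsq(X, Y.T, rcond=None)   # per-feature fits
    R = Y - (X @ beta).T                                   # residual matrix
    _, _, vt = np.linalg.svd(R, full_matrices=False)
    return vt[:n_sv].T                                     # (samples, n_sv) surrogates

rng = np.random.default_rng(1)
x = rng.normal(size=30)                      # primary variable
hidden = rng.normal(size=30)                 # unobserved batch factor
Y = np.outer(rng.normal(size=200), x) + np.outer(rng.normal(size=200), hidden) \
    + rng.normal(scale=0.1, size=(200, 30))
sv = surrogate_variables(Y, x)
print("corr(SV1, hidden factor):", round(abs(np.corrcoef(sv[:, 0], hidden)[0, 1]), 2))
```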

  9. U.S. Forest Service Region 1 Lake Chemistry, NADP, and IMPROVE air quality data analysis

    Science.gov (United States)

    Jill Grenon; Mark Story

    2009-01-01

    This report was developed to address the need for comprehensive analysis of U.S. Forest Service (USFS) Region 1 air quality monitoring data. The monitoring data includes Phase 3 (long-term data) lakes, National Atmospheric Deposition Program (NADP), and Interagency Monitoring of Protected Visual Environments (IMPROVE). Annual and seasonal data for the periods of record...

  10. ULg Spectra: An Interactive Software Tool to Improve Undergraduate Students' Structural Analysis Skills

    Science.gov (United States)

    Agnello, Armelinda; Carre, Cyril; Billen, Roland; Leyh, Bernard; De Pauw, Edwin; Damblon, Christian

    2018-01-01

    The analysis of spectroscopic data to solve chemical structures requires practical skills and drills. In this context, we have developed ULg Spectra, a computer-based tool designed to improve the ability of learners to perform complex reasoning. The identification of organic chemical compounds involves gathering and interpreting complementary…

  11. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…

  12. The improved Apriori algorithm based on matrix pruning and weight analysis

    Science.gov (United States)

    Lang, Zhenhong

    2018-04-01

    Drawing on matrix compression and weight analysis algorithms, this paper proposes an improved Apriori algorithm based on matrix pruning and weight analysis. After the transactional database is scanned only once, the algorithm constructs a boolean transaction matrix. By counting the 1s in the rows and columns of the matrix, infrequent itemsets are pruned and a new candidate itemset is formed. Then the item weights, the transaction weights and the weighted support of the items are calculated, and the frequent itemsets are obtained. The experimental results show that the improved Apriori algorithm not only reduces the number of repeated scans of the database, but also improves the efficiency of data correlation mining.
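
    A minimal sketch of the matrix-pruning step described above, assuming a toy transaction set and a fixed minimum support; the weighting scheme of the paper is not reproduced here.

```python
# Hedged sketch of the matrix-based pruning step: build a boolean transaction
# matrix in a single scan, count the 1s per column to drop infrequent items,
# then count candidate pairs directly on the pruned matrix.
from itertools import combinations
import numpy as np

transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c", "e"}, {"a", "b", "c"}]
items = sorted(set().union(*transactions))
M = np.array([[item in t for item in items] for t in transactions], dtype=int)

min_support = 2
col_counts = M.sum(axis=0)                       # support of each single item
keep = [i for i, c in enumerate(col_counts) if c >= min_support]
M_pruned = M[:, keep]                            # matrix pruning: drop infrequent columns
kept_items = [items[i] for i in keep]

# candidate 2-itemsets: support is the count of rows where both columns are 1
frequent_pairs = {}
for i, j in combinations(range(len(kept_items)), 2):
    support = int(np.sum(M_pruned[:, i] & M_pruned[:, j]))
    if support >= min_support:
        frequent_pairs[(kept_items[i], kept_items[j])] = support
print(frequent_pairs)   # {('a', 'b'): 2, ('a', 'c'): 3, ('b', 'c'): 3}
```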

  13. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses gaps in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
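
    For readers unfamiliar with TOPSIS, the sketch below shows the crisp core of the ranking calculation; in the fuzzy variant the ratings become triangular fuzzy numbers and the distances are taken between fuzzy sets, and the criteria, weights and scores here are purely illustrative.

```python
# Sketch of the crisp TOPSIS ranking step; all inputs are made up for illustration.
import numpy as np

# rows: candidate error factors, columns: evaluation criteria (all benefit-type)
scores = np.array([[7.0, 6.0, 8.0, 5.0],
                   [5.0, 8.0, 6.0, 7.0],
                   [8.0, 5.0, 7.0, 6.0]])
weights = np.array([0.3, 0.3, 0.2, 0.2])

norm = scores / np.sqrt((scores ** 2).sum(axis=0))   # vector normalisation
v = norm * weights                                   # weighted normalised matrix
ideal_best, ideal_worst = v.max(axis=0), v.min(axis=0)
d_best = np.sqrt(((v - ideal_best) ** 2).sum(axis=1))
d_worst = np.sqrt(((v - ideal_worst) ** 2).sum(axis=1))
closeness = d_worst / (d_best + d_worst)             # rank factors by this value
print("closeness coefficients:", np.round(closeness, 3))
```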

  14. Powerplant productivity improvement study: policy analysis and incentive assessment. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    Policy options that the Illinois Commerce Commission might adopt in order to promote improved power plant productivity for existing units in Illinois are identified and analyzed. These policy options would generally involve either removing existing disincentives and/or adding direct incentives through the regulatory process. The following activities are reported: in-depth review of existing theoretical and empirical literature in the areas of power plant reliability, regulatory utility efficiency and performance incentives, and impacts of various regulatory mechanisms such as the Fuel Adjustment Clauses on productivity; contacts with other state public utility commissions known to be investigating or implementing productivity improvement incentive mechanisms; documentation and analysis of incentive mechanisms adopted or under consideration in other states; analysis of current regulatory practice in Illinois as it relates to power plant productivity incentives and disincentives; identification of candidate incentive mechanisms for consideration by the Illinois Commerce Commission; and analysis and evaluation of these candidates. 72 references, 8 figures.

  15. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III

    Science.gov (United States)

    2015-04-30

    ...processes. Lexical Link Analysis (LLA) can help, by applying automation to reveal and depict, to decision-makers, the correlations, associations, and...

  16. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase II

    Science.gov (United States)

    2014-04-30

    ...vocabulary or lexicon, to describe the attributes and surrounding environment of the system. Lexical Link Analysis (LLA) is a form of text mining in which...

  17. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for the equipment environmental qualification (EEQ) on loss-of-coolant accident (LOCA) has been recently developed and adopted for small break LOCA EEQ. The new M/E release analysis methodology is extended to the M/E release analysis for the containment design for large break LOCA and the main steam line break (MSLB) accident, and is named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K with an enhanced M/E model and a LOCA long-term model, and CONTEMPT4/MOD5. This KIMERA methodology is applied to the MSLB M/E release analysis to evaluate the validity of the KIMERA methodology for MSLB in containment design. The results are compared with the OPR 1000 FSAR

  18. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Christopher Carling

    2006-03-01

    Full Text Available DESCRIPTION This book addresses and appropriately explains soccer match analysis, looking at the very latest in match analysis research and at the innovative technologies used by professional clubs. This handbook also bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a prepared manual on soccer match analysis in general for coaches and sport scientists. Thus, the professionals in this field would gather objective data on the players and the team, which in turn could be used by coaches and players to learn more about performance as a whole and gain a competitive advantage as a result. The book efficiently meets these objectives. AUDIENCE The book is targeted at the athlete, the coach, the sports science professional or any sport-conscious person who wishes to analyze relevant soccer performance. The editors and the contributors are authorities in their respective fields, and this handbook depends on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  19. Improving educational environment in medical colleges through transactional analysis practice of teachers.

    Science.gov (United States)

    Rajan, Marina; Chacko, Thomas

    2012-01-01

    A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps develop 'awareness' of intrapersonal and interpersonal processes. The objectives were to improve self-awareness among medical educators, to bring about self-directed change in practices among medical educators, and to assess the usefulness of TA tools for the same. An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and the passivity and script theories of adult functional styles were taught experientially with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after, at three months, and one year after training. The mean improvement in self-'awareness' was 13.3% (95% C.I. 9.3-17.2) among nineteen participants, and this persisted one year after training. Changes in practices within a year included collecting feedback, new teaching styles and better relationships with students. These findings demonstrate sustainable and measurable improvement in self-awareness through practice of TA tools. Improvement in the self-'awareness' of faculty resulted in self-directed changes in teaching practices. Medical faculty judged the TA tools effective for improving self-awareness leading to self-directed changes.

  20. Transition towards improved regional wood flows by integrating material flux analysis and agent analysis. The case of Appenzell Ausserrhoden, Switzerland

    International Nuclear Information System (INIS)

    Binder, Claudia R.; Hofer, Christoph; Wiek, Arnim; Scholz, Roland W.

    2004-01-01

    This paper discusses the integration of material flux analysis and agent analysis as the basis for a transition towards improved regional wood management in Appenzell Ausserrhoden (AR), a small Swiss canton located in the Pre-Alps of Switzerland. We present a wood flow analysis for forests, wood processing industries and consumption in AR, accounting for different wood products. We find that the forest is currently significantly underutilized although there are sizeable imports of wood and fuel to this small region. The underutilization of the forest contributes to a skewed age distribution, jeopardizing long-term sustainable development of the forest, as the fulfillment of its protective and production function are likely to be at risk. The wood resources, however, are capable of satisfying current wood demand among the population of AR and wood could even be exported. Underutilization has two main causes: first, wood prices are so low that harvesting trees is a money-losing proposition; second, consumer wood demand and the current supply from forest owners are not aligned. Furthermore, cultural values, lifestyle trends and traditions make an alignment of supply and demand difficult. Consensus and strategy building with the relevant stakeholders on the basis of the results obtained from the wood flow analysis and agent analysis is a reasonable next step to take. We conclude that wood flow analysis combined with agent analysis provide a useful and straightforward tool to be used as the basis of a transition process towards improved regional wood flows, which in turn should contribute to sustainable forest management

  1. An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks

    Science.gov (United States)

    Zhao, Peng-yuan; Huang, Xiao-ping

    2018-03-01

    Errors arise in calculating the fatigue damage of details in liquid cargo tanks when the traditional spectral analysis method, which is based on a linear system, is used, because of the nonlinear relationship between the dynamic stress and the ship acceleration. An improved spectral analysis method for the assessment of the fatigue damage in details of a liquid cargo tank is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This analysis method can take the nonlinear relationship into consideration, and the fatigue damage is then calculated from the PSD of stress. Taking an independent tank in an LNG carrier as an example, the accuracy of the improved spectral analysis method is shown to be much better than that of the traditional spectral analysis method by comparing the calculated damage results with results from the time-domain method. The proposed spectral analysis method is more accurate in calculating the fatigue damage in details of ship liquid cargo tanks.
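
    A minimal sketch of the general workflow the abstract describes, under simplifying assumptions (a Miner-rule S-N curve N S^m = C and a narrow-band Rayleigh amplitude distribution); it is not the authors' formulation, and all numbers are illustrative.

```python
# Hedged sketch: build the stress PSD from the harmonic components of the stress
# responses to individual sinusoidal waves, then integrate a narrow-band spectral
# fatigue damage estimate. S-N parameters and wave/stress data are illustrative.
import numpy as np
from math import gamma, pi

fs, T = 20.0, 600.0                      # sampling rate [Hz], duration [s]
t = np.arange(0, T, 1.0 / fs)

# stress responses (MPa) induced by individual sinusoidal wave components;
# each may be a nonlinear function of the wave elevation or ship acceleration
responses = [30 * np.sin(2 * pi * 0.10 * t + 0.3),
             18 * np.sin(2 * pi * 0.15 * t + 1.1),
             10 * np.sin(2 * pi * 0.25 * t)]

# sum amplitudes of harmonic components with the same frequency via one FFT
stress = np.sum(responses, axis=0)
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
amp = 2.0 * np.abs(np.fft.rfft(stress)) / len(t)      # one-sided amplitudes
df = freqs[1] - freqs[0]
psd = amp ** 2 / (2.0 * df)                           # one-sided stress PSD

m0 = np.trapz(psd, freqs)                             # spectral moments
m2 = np.trapz(psd * freqs ** 2, freqs)
nu0 = np.sqrt(m2 / m0)                                # zero-upcrossing rate [Hz]
m_sn, C = 3.0, 1.0e12                                 # illustrative S-N parameters
damage = nu0 * T / C * (np.sqrt(2 * m0)) ** m_sn * gamma(1 + m_sn / 2)
print(f"m0={m0:.1f} MPa^2, upcrossing rate={nu0:.3f} Hz, damage={damage:.2e}")
```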

  2. Improving the PSA quality in the human reliability analysis of pre-accident human errors

    International Nuclear Information System (INIS)

    Kang, D.-I.; Jung, W.-D.; Yang, J.-E.

    2004-01-01

    This paper describes the activities for improving the Probabilistic Safety Assessment (PSA) quality in the human reliability analysis (HRA) of the pre-accident human errors for the Korea Standard Nuclear Power Plant (KSNP). We evaluate the HRA results of the PSA for the KSNP and identify the items to be improved using the ASME PRA Standard. Evaluation results show that the ratio of items to be improved for pre-accident human errors is relatively high when compared with the ratio of those for post-accident human errors. They also show that more than 50% of the items to be improved for pre-accident human errors are related to the identification and screening analysis for them. In this paper, we develop the modeling guidelines for pre-accident human errors and apply them to the auxiliary feedwater system of the KSNP. Application results show that more than 50% of the items to be improved for the pre-accident human errors of the auxiliary feedwater system are resolved. (author)

  3. Root Cause Analysis and Productivity Improvement Of An Apparel Industry In Bangladesh Through Kaizen Implementation

    Directory of Open Access Journals (Sweden)

    Taposh Kumar Kapuria

    2017-12-01

    Full Text Available Garments industry is playing the pioneering role in improving Bangladesh economic condition. It was started in late 1970’s and now the leading foreign currency earner for Bangladesh. It’s no dubiousness to say that, the Bangladesh garment industry is ameliorating garment’s service quality and innovative design features to exist in the global competitive market. Global competition in the garment’s market is changing day to day. Leading garment manufacturer from all over the world are adopting new innovative features and techniques to sustain global fierce competitive market. But the point is, Bangladeshi garment manufacturers are not lingered. They are also emphasizing on better service quality by adding latest design features and using the latest technologies to the garments. The sole purpose of this paper is to identify the root causes of sewing defects of an apparel industry in Bangladesh and continuous improvement in reducing the defects through Kaizen (Continuous Improvement system. In short, productivity improvement of the apparel industry. Our studied garment manufacturing company is “ABONTI Color Tex. Ltd.”. Pareto Analysis is used to identify the top defect items. Cause-Effect Analysis helped to identify the root causes of sewing defects. Then, Kaizen is used for continuous improvement of the minimization of sewing defects.

  4. Comparative evaluation of the effects of casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and xylitol-containing chewing gum on salivary flow rate, pH and buffering capacity in children: An in vivo study

    Directory of Open Access Journals (Sweden)

    Rahul J Hegde

    2017-01-01

    Full Text Available Aim: This study aimed to compare and evaluate the changes in the salivary flow rate, pH, and buffering capacity before and after chewing casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) and xylitol-containing chewing gums in children. Materials and Methods: Sixty children aged between 8 and 12 years were selected for the study. They were randomly divided into Group 1 (CPP-ACP chewing gum) and Group 2 (xylitol-containing chewing gum), comprising thirty children each. Unstimulated and stimulated saliva samples at 15 and 30 min intervals were collected from all children. All the saliva samples were estimated for salivary flow rate, pH, and buffering capacity. Results: Significant increases in salivary flow rate, pH, and buffering capacity from baseline to immediately after spitting out the chewing gum were found in both study groups. No significant difference was found between the two study groups with respect to salivary flow rate and pH. Intergroup comparison indicated a significant increase in salivary buffer capacity in Group 1 when compared to Group 2. Conclusion: Chewing gums containing CPP-ACP and xylitol can significantly increase the physiochemical properties of saliva. These physiochemical properties of saliva have a definite relation with caries activity in children.

  5. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    Science.gov (United States)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool that lets the business leaders of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit. This problem has triggered many complaints from patients. After deploying this tool, the team found that the pharmacy unit lacks storage and capsule-packing equipment for processing the medicines, a condition that causes considerable wasted time in the process. The team therefore proposed that the business leaders procure the required tools in order to shorten the process. This work shortened the lead time from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, and increased the %VA (value-added activity), or Process Cycle Efficiency (PCE), from 66% to 68% (considered lean because it is above the required 30%). This result shows that process effectiveness was increased by the improvement.

  6. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    International Nuclear Information System (INIS)

    Jonny; Nasution, Januar

    2013-01-01

    Value stream mapping is a tool that lets the business leaders of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit. This problem has triggered many complaints from patients. After deploying this tool, the team found that the pharmacy unit lacks storage and capsule-packing equipment for processing the medicines, a condition that causes considerable wasted time in the process. The team therefore proposed that the business leaders procure the required tools in order to shorten the process. This work shortened the lead time from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, and increased the %VA (value-added activity), or Process Cycle Efficiency (PCE), from 66% to 68% (considered lean because it is above the required 30%). This result shows that process effectiveness was increased by the improvement.
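
    A quick check of the Process Cycle Efficiency figures quoted in both records (PCE = value-added time / total lead time); the value-added minutes are back-calculated from the reported percentages, not taken from the study itself.

```python
# Worked check of the reported PCE values; value-added minutes are inferred.
def pce(value_added_min, lead_time_min):
    return value_added_min / lead_time_min

for lead_time, reported_pce in [(45.0, 0.66), (30.0, 0.68)]:
    va = lead_time * reported_pce            # implied value-added minutes
    print(f"lead time {lead_time:.0f} min -> value-added {va:.1f} min, "
          f"PCE = {pce(va, lead_time):.0%}")
```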

  7. The effects of aromatherapy on sleep improvement: a systematic literature review and meta-analysis.

    Science.gov (United States)

    Hwang, Eunhee; Shin, Sujin

    2015-02-01

    To evaluate the existing data on aromatherapy interventions for improvement of sleep quality. Systematic literature review and meta-analysis on the effects of aromatherapy. Study Sources: Electronic databases, including the Korea Education and Research Information Service (KERIS), Korean studies Information Service System (KISS), National Assembly Library, and eight academies within the Korean Society of Nursing Science, were searched to identify studies published between 2000 and August 2013. Randomized controlled and quasi-experimental trials that included aromatherapy for the improvement of sleep quality. Of the 245 publications identified, 13 studies met the inclusion and exclusion criteria, and 12 studies were used in the meta-analysis. Meta-analysis of the 12 studies using a random-effects model revealed that the use of aromatherapy was effective in improving sleep quality (95% confidence interval [CI], 0.540-1.745; Z=3.716). Subgroup analysis revealed that inhalation aromatherapy (95% CI, 0.792-1.541; Z=6.107) was more effective than massage therapy (95% CI, 0.128-2.166; Z=2.205) in unhealthy (95% CI, 0.248-1.100; Z=3.100) and healthy (95% CI, 0.393-5.104; Z=2.287) participants, respectively. Readily available aromatherapy treatments appear to be effective and promote sleep. Thus, it is essential to develop specific guidelines for the efficient use of aromatherapy.

  8. Improvement of burnup analysis for pebble bed reactors with an accumulative fuel loading scheme

    International Nuclear Information System (INIS)

    Simanullang, Irwan Liapto; Obara, Toru

    2015-01-01

    Given the limitations of natural uranium resources, innovative nuclear power plant concepts that increase the efficiency of nuclear fuel utilization are needed. The Pebble Bed Reactor (PBR) shows some potential to achieve high efficiency in natural uranium utilization. To simplify the PBR concept, a PBR with an accumulative fuel loading scheme was introduced and the Fuel Handling System (FHS) removed. In this concept, the pebble balls are added little by little into the reactor core until they reach the top of the core, and all pebble balls are discharged from the core at the end of the operation period. A code based on the MVP/MVP-BURN method has been developed to analyze a PBR with the accumulative fuel loading scheme, and the optimum fuel composition for high burnup performance was found using this code. Previous efforts provided several motivations to improve the burnup analysis. First, some errors in the input code were corrected; this correction, and an overall simplification of the input, make analysis of a PBR with the accumulative fuel loading scheme easier. Second, the optimum fuel design had previously been obtained only in infinite geometry. To improve the optimum fuel composition, a parametric survey was performed by varying the amount of heavy metal (HM) uranium per pebble and the degree of uranium enrichment, and the entire parametric survey was analyzed in finite geometry. The results show that the improvements in the fuel composition can lead to a more accurate analysis with the code. (author)

  9. Prediction of improvement in skin fibrosis in diffuse cutaneous systemic sclerosis: a EUSTAR analysis.

    Science.gov (United States)

    Dobrota, Rucsandra; Maurer, Britta; Graf, Nicole; Jordan, Suzana; Mihai, Carina; Kowal-Bielecka, Otylia; Allanore, Yannick; Distler, Oliver

    2016-10-01

    Improvement of skin fibrosis is part of the natural course of diffuse cutaneous systemic sclerosis (dcSSc). Recognising those patients most likely to improve could help tailoring clinical management and cohort enrichment for clinical trials. In this study, we aimed to identify predictors for improvement of skin fibrosis in patients with dcSSc. We performed a longitudinal analysis of the European Scleroderma Trials And Research (EUSTAR) registry including patients with dcSSc, fulfilling American College of Rheumatology criteria, baseline modified Rodnan skin score (mRSS) ≥7 and follow-up mRSS at 12±2 months. The primary outcome was skin improvement (decrease in mRSS of >5 points and ≥25%) at 1 year follow-up. A respective increase in mRSS was considered progression. Candidate predictors for skin improvement were selected by expert opinion and logistic regression with bootstrap validation was applied. From the 919 patients included, 218 (24%) improved and 95 (10%) progressed. Eleven candidate predictors for skin improvement were analysed. The final model identified high baseline mRSS and absence of tendon friction rubs as independent predictors of skin improvement. The baseline mRSS was the strongest predictor of skin improvement, independent of disease duration. An upper threshold between 18 and 25 performed best in enriching for progressors over regressors. Patients with advanced skin fibrosis at baseline and absence of tendon friction rubs are more likely to regress in the next year than patients with milder skin fibrosis. These evidence-based data can be implemented in clinical trial design to minimise the inclusion of patients who would regress under standard of care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  10. Analysis of transient heat conduction in a PWR fuel rod by an improved lumped parameter approach

    Energy Technology Data Exchange (ETDEWEB)

    Dourado, Eneida Regina G. [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Cotta, Renato M. [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Mecanica; Jian, Su, E-mail: eneidadourado@gmail.com, E-mail: sujian@nuclear.ufrj.br, E-mail: cotta@mecanica.ufrj.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2017-07-01

    This paper aims to analyze transient heat conduction in a nuclear fuel rod by an improved lumped parameter approach. One-dimensional transient heat conduction is considered, with the circumferential symmetry assumed and the axial conduction neglected. The thermal conductivity and specific heat in the fuel pellet are considered temperature dependent, while the thermophysical properties of the cladding are considered constant. Hermite approximation for integration is used to obtain the average temperature and heat flux in the radial direction. Significant improvement over the classical lumped parameter formulation has been achieved. The proposed model can be also used in dynamic analysis of PWR and nuclear power plant simulators. (author)
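
    For reference, the two Hermite integration formulas commonly used in improved lumped-parameter formulations are shown below in their generic form (H0,0 is the plain trapezoidal rule, H1,1 its corrected two-point form); the specific relations derived in the paper are not reproduced here.

```latex
% Generic Hermite integration formulas; background only, not the paper's derived relations.
\[
  H_{0,0}:\quad \int_{0}^{h} y(x)\,dx \;\approx\; \frac{h}{2}\,\bigl[y(0)+y(h)\bigr]
\]
\[
  H_{1,1}:\quad \int_{0}^{h} y(x)\,dx \;\approx\; \frac{h}{2}\,\bigl[y(0)+y(h)\bigr]
  \;+\; \frac{h^{2}}{12}\,\bigl[y'(0)-y'(h)\bigr]
\]
```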

  11. Cost-benefit analysis of improved air quality in an office building

    DEFF Research Database (Denmark)

    Djukanovic, R.; Wargocki, Pawel; Fanger, Povl Ole

    2002-01-01

    A cost-benefit analysis of measures to improve air quality in an existing air-conditoned office building (11581 m2, 864 employees) was carried out for hot, temperate and cold climates and for two operating modes: Variable Air Volume (VAV) with economizer; and Constant Air Volume (CAV) with heat...... productivity for every 10% reduction in the proportion of occupants entering a space who are dissatisfied with the air quality. With this assumption, the annual benefit due to improved air quality was always at least 10 times higher than the increase in annual energy and maintenance costs. The payback time...

  12. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; et al.

    2015-01-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
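
    A minimal sketch of the PCA step on an ensemble of digitized pulse records, using synthetic data; the mapping from component scores to energy is calibration-dependent and only indicated schematically, and none of the names below come from the instrument software.

```python
# Hedged sketch: stack baseline-subtracted pulse records, take an SVD, and use
# the leading component scores as descriptive parameters (pulse height,
# arrival-time shift, shape changes). Data and names are synthetic.
import numpy as np

def pca_scores(pulses, n_components=3):
    """pulses: (n_records, n_samples) array of digitized pulse records."""
    X = pulses - pulses.mean(axis=0)                  # remove the average pulse
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # projection onto components
    return scores, Vt[:n_components]

# toy records: a template pulse with record-to-record height and timing jitter
t = np.linspace(0, 1, 512)
template = np.exp(-t / 0.2) - np.exp(-t / 0.02)
rng = np.random.default_rng(7)
heights = 1.0 + 0.05 * rng.normal(size=200)
pulses = np.array([h * np.roll(template, rng.integers(-2, 3)) for h in heights])
pulses += 0.01 * rng.normal(size=pulses.shape)

scores, components = pca_scores(pulses)
# the first score tracks pulse height; a calibration would map scores to energy
print("corr(score 1, height) =", round(abs(np.corrcoef(scores[:, 0], heights)[0, 1]), 3))
```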

  13. Application of numerical grid generation for improved CFD analysis of multiphase screw machines

    Science.gov (United States)

    Rane, S.; Kovačević, A.

    2017-08-01

    Algebraic grid generation is widely used for discretization of the working domain of twin screw machines. It is fast and offers good control over the placement of grid nodes. However, the grid qualities needed to handle multiphase flows, such as oil injection, may at times be difficult to achieve. In order to obtain fast solutions for multiphase screw machines, it is important to further improve the quality and robustness of the computational grid. In this paper, a deforming grid of a twin screw machine is generated using algebraic transfinite interpolation to produce an initial mesh, upon which an elliptic partial differential equation (PDE) of the Poisson form is solved numerically to produce a smooth final computational mesh. The quality of the numerical cells and their distribution obtained by the differential method is greatly improved. In addition, a similar procedure was introduced to fully smooth the transition of the partitioning rack curve between the rotors, improving the continuous movement of grid nodes and in turn the robustness and speed of the Computational Fluid Dynamics (CFD) solver. An analysis of an oil-injected twin screw compressor is presented to compare the improvements in grid quality factors in the regions of importance such as the interlobe space, the radial tip and the core of the rotor. The proposed method, which combines algebraic and differential grid generation, offers a significant improvement in grid quality and in the robustness of the numerical solution.
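
    A minimal two-dimensional illustration of the two-step idea (algebraic transfinite interpolation followed by elliptic smoothing); real screw-rotor grids involve rack partitioning and control functions that are not shown here, and the boundary curves below are invented.

```python
# Hedged sketch: a transfinite interpolation (Coons patch) from four boundary
# curves gives the initial mesh, then a few Laplace-smoothing sweeps (the
# homogeneous limit of the Poisson grid equations) redistribute interior nodes.
import numpy as np

ni, nj = 21, 11
xi = np.linspace(0, 1, ni)[:, None]
eta = np.linspace(0, 1, nj)[None, :]

# boundary curves (illustrative): wavy bottom, straight top, straight sides
bottom = np.stack([np.linspace(0, 1, ni), 0.1 * np.sin(np.pi * np.linspace(0, 1, ni))], -1)
top    = np.stack([np.linspace(0, 1, ni), np.ones(ni)], -1)
left   = np.stack([np.zeros(nj), np.linspace(bottom[0, 1], 1, nj)], -1)
right  = np.stack([np.ones(nj), np.linspace(bottom[-1, 1], 1, nj)], -1)

# transfinite interpolation (Coons patch)
grid = ((1 - eta)[..., None] * bottom[:, None, :] + eta[..., None] * top[:, None, :]
        + (1 - xi)[..., None] * left[None, :, :] + xi[..., None] * right[None, :, :]
        - ((1 - xi) * (1 - eta))[..., None] * bottom[0]
        - (xi * eta)[..., None] * top[-1]
        - (xi * (1 - eta))[..., None] * bottom[-1]
        - ((1 - xi) * eta)[..., None] * top[0])

# elliptic (Laplace) smoothing of the interior nodes, boundaries held fixed
for _ in range(200):
    grid[1:-1, 1:-1] = 0.25 * (grid[2:, 1:-1] + grid[:-2, 1:-1]
                               + grid[1:-1, 2:] + grid[1:-1, :-2])
print("grid shape:", grid.shape)   # (ni, nj, 2) array of node coordinates
```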

  14. Construction Delay Analysis Techniques—A Review of Application Issues and Improvement Needs

    Directory of Open Access Journals (Sweden)

    Nuhu Braimah

    2013-07-01

    Full Text Available The time for performance of a project is usually of the essence to the employer and the contractor. This has made it quite imperative for contracting parties to analyse project delays for purposes of making right decisions on potential time and/or cost compensation claims. Over the years, existing delay analysis techniques (DATs for aiding this decision-making have been helpful but have not succeeded in curbing the high incidence of disputes associated with delay claims resolutions. A major source of the disputes lies with the limitations and capabilities of the techniques in their practical use. Developing a good knowledge of these aspects of the techniques is of paramount importance in understanding the real problematic issues involved and their improvement needs. This paper seeks to develop such knowledge and understanding (as part of a wider research work via: an evaluation of the most common DATs based on a case study, a review of the key relevant issues often not addressed by the techniques, and the necessary improvements needs. The evaluation confirmed that the various techniques yield different analysis results for the same delay claims scenario, mainly due to their unique application procedures. The issues that are often ignored in the analysis but would also affect delay analysis results are: functionality of the programming software employed for the analysis, resource loading and levelling requirements, resolving concurrent delays, and delay-pacing strategy. Improvement needs by way of incorporating these issues in the analysis and focusing on them in future research work are the key recommendations of the study.

  15. Effectiveness of Cognitive and Transactional Analysis Group Therapy on Improving Conflict-Solving Skill

    Directory of Open Access Journals (Sweden)

    Bahram A. Ghanbari-Hashemabadi

    2012-03-01

    Full Text Available Background: Today, learning communication skills such as conflict solving is very important. The purpose of the present study was to investigate the efficiency of cognitive and transactional analysis group therapy in improving conflict-solving skills. Materials and Method: This is an experimental study with a pretest-posttest design and a control group. Forty-five clients attending the counseling and psychological services center of Ferdowsi University of Mashhad were chosen based on a screening method and were randomly divided into three equal groups: a control group (15 participants), a cognitive experimental group (15 participants) and a transactional analysis group (15 participants). A conflict-solving questionnaire was used to collect data, and the interventions, cognitive and transactional analysis group therapy, were administered during 8 weekly two-hour sessions. Mean and standard deviation were used for data analysis at the descriptive level and one-way ANOVA was used at the inferential level. Results: The results of the study suggest that the conflict-solving skills in the two experimental groups were significantly increased. Conclusion: The findings of this research indicate that both cognitive and transactional analysis group therapy could be effective interventions for improving conflict-solving skills.

  16. Smoothed Particle Hydro-dynamic Analysis of Improvement in Sludge Conveyance Efficiency of Screw Decanter Centrifuge

    International Nuclear Information System (INIS)

    Park, Dae Woong

    2015-01-01

    A centrifuge works on the principle that particles with different densities will separate at a rate proportional to the centrifugal force during high-speed rotation. Dense particles are quickly precipitated, and particles with relatively smaller densities are precipitated more slowly. A decanter-type centrifuge is used to remove, concentrate, and dehydrate sludge in a water treatment process. This is a core technology for measuring the sludge conveyance efficiency improvement. In this study, a smoothed particle hydro-dynamic analysis was performed for a decanter centrifuge used to convey sludge to evaluate the efficiency improvement. This analysis was applied to both the original centrifugal model and the design change model, which was a ball-plate rail model, to evaluate the sludge transfer efficiency.

  17. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis.

    Science.gov (United States)

    Motooka, Masanobu; Uno, Shigeyasu

    2018-02-02

    Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced the background current by improving the fabrication process of the sensors utilizing paper. Finally, we evaluated the limit of detection of the sensor with the sample vapor of ethanol gas. The experiment showed about a 50% reduction of the limit of detection compared to previously-reported sensor. This result presents the possibility of the sensor being applied in diagnosis, such as for diabetes, by further lowering the limit of detection.

  18. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis

    Directory of Open Access Journals (Sweden)

    Masanobu Motooka

    2018-02-01

    Full Text Available Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced the background current by improving the fabrication process of the sensors utilizing paper. Finally, we evaluated the limit of detection of the sensor with the sample vapor of ethanol gas. The experiment showed about a 50% reduction of the limit of detection compared to previously-reported sensor. This result presents the possibility of the sensor being applied in diagnosis, such as for diabetes, by further lowering the limit of detection.

  19. Analysis of means of improving the uncontrolled lateral motions of personal airplanes

    Science.gov (United States)

    Mckinney, Marion O., Jr.

    1951-01-01

    A theoretical analysis has been made of means of improving the uncontrolled motions of personal airplanes. The purpose of this investigation was to determine whether such airplanes could be made to fly uncontrolled for an indefinite period of time without getting into dangerous attitudes and for a reasonable period of time (1 to 3 min) without deviating excessively from their original course. The results of this analysis indicated that the uncontrolled motions of a personal airplane could be made safe as regards spiral tendencies and could be greatly improved as regards maintenance of course without resort to an autopilot. The only way to make the uncontrolled motions completely satisfactory as regards continuous maintenance of course, however, is to use a conventional type of autopilot.

  20. Improvements of Physical Models in TRITGO code for Tritium Behavior Analysis in VHTR

    International Nuclear Information System (INIS)

    Yoo, Jun Soo; Tak, Nam Il; Lim, Hong Sik

    2010-01-01

    Since tritium is a radioactive material with a 12.32-year half-life and is generated by ternary fission in the fuel as well as by neutron absorption reactions of impurities in a Very High Temperature gas-cooled Reactor (VHTR) core, accurate prediction of tritium behavior and of its concentration in the product hydrogen is important for public safety. In this respect, the TRITGO code was developed by General Atomics (GA) for estimating tritium production and distribution in high temperature gas-cooled reactors. However, some models in it are hard-wired to a specific reactor type or are too simplified, which makes the analysis results less applicable, so major improvements need to be considered for better predictions. In this study, several model improvements are suggested and their effects are evaluated through analysis of the PMR600 design concept

  1. Design improvement and dynamic finite element analysis of novel ITI dental implant under dynamic chewing loads.

    Science.gov (United States)

    Cheng, Yung-Chang; Lin, Deng-Huei; Jiang, Cho-Pei; Lee, Shyh-Yuan

    2015-01-01

    The main aim of this article was to apply a uniform design of experiments method to reduce the micromotion of a novel ITI dental implant model under dynamic loads. Combining the characteristics of the traditional ITI and Nano-Tite implants, a new implant with concave holes has been constructed. Explicit dynamic finite element analysis showed that, compared to the traditional ITI dental implant model, the micromotion of the new dental implant model was significantly reduced. For the runs of the uniform design of experiments, the dynamic finite element analysis method was applied to calculate the maximum micromotion of the full model. Finally, the design that produced the minimum micromotion among all the simulated experiments was selected as the improved design. Compared to the original design, which was associated with a micromotion of 45.11 μm, the micromotion of the improved version was 31.37 μm, for an improvement rate of 30.5%.

  2. Using digital notifications to improve attendance in clinic: systematic review and meta-analysis

    Science.gov (United States)

    Robotham, Dan; Satkunanathan, Safarina; Reynolds, John; Stahl, Daniel; Wykes, Til

    2016-01-01

    Objectives Assess the impact of text-based electronic notifications on improving clinic attendance, in relation to study quality (according to risk of bias), and to assess simple ways in which notifications can be optimised (ie, impact of multiple notifications). Design Systematic review, study quality appraisal assessing risk of bias, data synthesised in meta-analyses. Data sources MEDLINE, EMBASE, PsycINFO, Web of Science and Cochrane Database of Systematic Reviews (01.01.05 until 25.4.15). A systematic search to discover all studies containing quantitative data for synthesis into meta-analyses. Eligibility criteria Studies examining the effect of text-based electronic notifications on prescheduled appointment attendance in healthcare settings. Primary analysis included experimental studies where randomisation was used to define allocation to intervention and where a control group consisting of ‘no reminders’ was used. Secondary meta-analysis included studies comparing text reminders with voice reminders. Studies lacking sufficient information for inclusion (after attempting to contact study authors) were excluded. Outcome measures Primary outcomes were rate of attendance/non-attendance at healthcare appointments. Secondary outcome was rate of rescheduled and cancelled appointments. Results 26 articles were included. 21 included in the primary meta-analysis (8345 patients receiving electronic text notifications, 7731 patients receiving no notifications). Studies were included from Europe (9), Asia (7), Africa (2), Australia (2) and America (1). Patients who received notifications were 23% more likely to attend clinic than those who received no notification (risk ratio=1.23, 67% vs 54%). Those receiving notifications were 25% less likely to ‘no show’ for appointments (risk ratio=.75, 15% vs 21%). Results were similar when accounting for risk of bias, region and publication year. Multiple notifications were significantly more effective at improving

  3. Nonlinear analysis of an improved continuum model considering headway change with memory

    Science.gov (United States)

    Cheng, Rongjun; Wang, Jufeng; Ge, Hongxia; Li, Zhipeng

    2018-01-01

    Considering the effect of headway changes with memory, an improved continuum model of traffic flow is proposed in this paper. The linear stability condition of the new model, including the effect of headway changes with memory, is obtained by means of linear stability theory. Through nonlinear analysis, the KdV-Burgers equation is derived to describe the propagating behavior of the traffic density wave near the neutral stability line. Numerical simulations are carried out to study the improved traffic flow model, exploring how headway changes with memory affect each car's velocity, the density and energy consumption. Numerical results show that when the effect of headway changes with memory is considered, traffic jams can be suppressed efficiently. Furthermore, the results demonstrate that accounting for headway changes with memory can offset the drawback of relying on historical information, which improves the stability of traffic flow and minimizes car energy consumption.
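
    For orientation, the generic form of the KdV-Burgers equation that such reductive perturbation analyses arrive at near the neutral stability line is given below; the coefficients depend on the specific model parameters (reaction time, memory effect) and are not reproduced here.

```latex
% Generic KdV--Burgers form for the density perturbation \rho(x,t);
% \alpha, \beta, \gamma are model-dependent coefficients (background only).
\[
  \frac{\partial \rho}{\partial t}
  + \alpha\, \rho \frac{\partial \rho}{\partial x}
  - \beta\, \frac{\partial^{2} \rho}{\partial x^{2}}
  + \gamma\, \frac{\partial^{3} \rho}{\partial x^{3}} = 0
\]
```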

  4. An improved spectral homotopy analysis method for solving boundary layer problems

    Directory of Open Access Journals (Sweden)

    Sibanda Precious

    2011-01-01

    Full Text Available Abstract This article presents an improved spectral-homotopy analysis method (ISHAM for solving nonlinear differential equations. The implementation of this new technique is shown by solving the Falkner-Skan and magnetohydrodynamic boundary layer problems. The results obtained are compared to numerical solutions in the literature and MATLAB's bvp4c solver. The results show that the ISHAM converges faster and gives accurate results.
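
    The bvp4c baseline mentioned above has a close Python analogue in scipy.integrate.solve_bvp; the sketch below solves the standard Falkner-Skan form f''' + f f'' + beta (1 - f'^2) = 0 as such a reference solution, with the domain truncation and initial guess chosen purely for illustration.

```python
# Reference-solution sketch for the Falkner-Skan boundary layer problem,
#   f''' + f f'' + beta * (1 - f'^2) = 0,  f(0) = f'(0) = 0,  f'(inf) = 1,
# using scipy.integrate.solve_bvp. Domain cutoff and guess are illustrative.
import numpy as np
from scipy.integrate import solve_bvp

beta = 0.5   # wedge parameter (illustrative)

def rhs(eta, y):
    f, fp, fpp = y
    return np.vstack([fp, fpp, -f * fpp - beta * (1.0 - fp ** 2)])

def bc(y0, yinf):
    return np.array([y0[0], y0[1], yinf[1] - 1.0])

eta = np.linspace(0, 10, 200)
y_guess = np.zeros((3, eta.size))
y_guess[1] = 1.0 - np.exp(-eta)            # rough guess satisfying the far-field BC
y_guess[0] = eta - (1.0 - np.exp(-eta))
y_guess[2] = np.exp(-eta)

sol = solve_bvp(rhs, bc, eta, y_guess)
print("converged:", sol.status == 0, " f''(0) =", round(float(sol.sol(0.0)[2]), 4))
```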

  5. Analysis of After-Sales Service in an Effort to Improve Customer Satisfaction at PT. Nusantara Motor in Balikpapan

    OpenAIRE

    Mursidah

    2014-01-01

    The purpose of this research is to determine the effect of after-sales service on customer satisfaction at PT. Nusantara Motor, while the usefulness of this study is as input material for the company in planning and improving its marketing strategy, in particular after-sales service in accordance with customer needs and desires. The analysis tool used is the simple linear regression equation Y = a + bX. The regression coefficient (b) = 0.9874, which means there is an increase in customer satisfaction influenced by the...

  6. Central aortic reservoir-wave analysis improves prediction of cardiovascular events in elderly hypertensives.

    Science.gov (United States)

    Narayan, Om; Davies, Justin E; Hughes, Alun D; Dart, Anthony M; Parker, Kim H; Reid, Christopher; Cameron, James D

    2015-03-01

    Several morphological parameters based on the central aortic pressure waveform are proposed as cardiovascular risk markers, yet no study has definitively demonstrated the incremental value of any waveform parameter in addition to currently accepted biomarkers in elderly, hypertensive patients. The reservoir-wave concept combines elements of wave transmission and Windkessel models of arterial pressure generation, defining an excess pressure superimposed on a background reservoir pressure. The utility of pressure rate constants derived from reservoir-wave analysis in prediction of cardiovascular events is unknown. Carotid blood pressure waveforms were measured prerandomization in a subset of 838 patients in the Second Australian National Blood Pressure Study. Reservoir-wave analysis was performed and indices of arterial function, including the systolic and diastolic rate constants, were derived. Survival analysis was performed to determine the association between reservoir-wave parameters and cardiovascular events. The incremental utility of reservoir-wave parameters in addition to the Framingham Risk Score was assessed. Baseline values of the systolic rate constant were independently predictive of clinical outcome (hazard ratio, 0.33; 95% confidence interval, 0.13-0.82; P=0.016 for fatal and nonfatal stroke and myocardial infarction and hazard ratio, 0.38; 95% confidence interval, 0.20-0.74; P=0.004 for the composite end point, including all cardiovascular events). Addition of this parameter to the Framingham Risk Score was associated with an improvement in predictive accuracy for cardiovascular events as assessed by the integrated discrimination improvement and net reclassification improvement indices. This analysis demonstrates that baseline values of the systolic rate constant predict clinical outcomes in elderly patients with hypertension and incrementally improve prognostication of cardiovascular events. © 2014 American Heart Association, Inc.

  7. Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.

    Science.gov (United States)

    Echinaka, Yuki; Ozeki, Yukiyasu

    2016-10-01

    The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. Applying this method, the bootstrap method is introduced and a numerical discrimination for the transition type is proposed.
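
    The sketch below illustrates the general idea of fitting a scaling collapse without a parametric model function (collapse the relaxation curves for trial parameter values, fit the cloud of collapsed points with a kernel regressor, and score the parameters by the residual); it uses kernel ridge regression and synthetic data, not the Bayesian Gaussian-process formulation of the paper, and all variables are illustrative.

```python
# Hedged sketch of a nonparametric scaling-collapse fit; data, kernel and the
# scaling variables are illustrative, not the paper's formulation.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def collapse_residual(curves, tau, lam):
    """curves: list of (t, m, T) relaxation data sets; tau: dict T -> tau(T)."""
    X, Y = [], []
    for t, m, T in curves:
        X.append(np.log(t / tau[T]))      # scaling variable
        Y.append(np.log(m * t ** lam))    # scaled observable
    X, Y = np.concatenate(X)[:, None], np.concatenate(Y)
    model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1.0).fit(X, Y)
    return np.mean((model.predict(X) - Y) ** 2)

# synthetic curves obeying m(t, T) = t^(-lam_true) * g(t / tau_true(T))
lam_true, tau_true = 0.25, {1.0: 50.0, 1.1: 200.0, 1.2: 800.0}
rng = np.random.default_rng(3)
curves = []
for T, tau_T in tau_true.items():
    t = np.logspace(0, 4, 60)
    m = t ** (-lam_true) * np.exp(-(t / tau_T) ** 0.5) * np.exp(0.01 * rng.normal(size=t.size))
    curves.append((t, m, T))

best = min(((collapse_residual(curves, tau_true, lam), lam)
            for lam in np.linspace(0.1, 0.4, 7)), key=lambda p: p[0])
print("best lambda on the trial grid:", round(best[1], 2))   # expected near 0.25
```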

  8. Information Operations Versus Civilian Marketing and Advertising: A Comparative Analysis to Improve IO Planning and Strategy

    Science.gov (United States)

    2008-03-01

    ...concepts, relevant to IO, which are known successful marketing practices. Successful marketing strategy includes the basic "4Ps of marketing"...

  9. Use of peers to improve adherence to antiretroviral therapy: a global network meta-analysis.

    Science.gov (United States)

    Kanters, Steve; Park, Jay Jh; Chan, Keith; Ford, Nathan; Forrest, Jamie; Thorlund, Kristian; Nachega, Jean B; Mills, Edward J

    2016-01-01

    It is unclear whether using peers can improve adherence to antiretroviral therapy (ART). To construct the World Health Organization's global guidance on adherence interventions, we conducted a systematic review and network meta-analysis to determine the effectiveness of using peers for achieving adequate adherence and viral suppression. We searched for randomized clinical trials of peer-based interventions to promote adherence to ART in HIV populations. We searched six electronic databases from inception to July 2015 and major conference abstracts within the last three years. We examined the outcomes of adherence and viral suppression among trials done worldwide and those specific to low- and middle-income countries (LMIC) using pairwise and network meta-analyses. Twenty-two trials met the inclusion criteria. We found similar results between pairwise and network meta-analyses, and between the global and LMIC settings. Peer supporter+Telephone was superior to standard-of-care in improving adherence in both the global network (odds ratio [OR]=4.79, 95% credible interval [CrI]: 1.02, 23.57) and the LMIC setting (OR=4.83, 95% CrI: 1.88, 13.55). Peer support alone, however, did not lead to improvement in ART adherence in either setting. For viral suppression, we found no difference in effects among interventions, owing to the limited number of trials. Our analysis showed that peer support leads to modest improvement in adherence. These modest effects may be due to the fact that in many settings, particularly in LMICs, programmes already include peer supporters, adherence clubs and family disclosures for treatment support. Rather than introducing new interventions, a focus on improving the quality of delivery of existing services may be a more practical and effective way to improve adherence to ART.
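
    For readers unfamiliar with how study-level odds ratios such as those above are combined, the following sketch performs a simple fixed-effect, inverse-variance pooling of log odds ratios for one pairwise comparison; the 2x2 counts are hypothetical and the multi-treatment network machinery of the actual analysis is omitted.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 counts per trial: (events_intervention, n_intervention,
#                                     events_control, n_control)
trials = [(45, 60, 30, 58), (80, 120, 62, 118), (25, 40, 20, 42)]

log_or, var = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c                      # non-events in each arm
    log_or.append(np.log((a * d) / (b * c)))   # log odds ratio
    var.append(1 / a + 1 / b + 1 / c + 1 / d)  # its approximate variance

w = 1 / np.array(var)                          # inverse-variance weights
pooled = np.sum(w * np.array(log_or)) / w.sum()
se = np.sqrt(1 / w.sum())
ci = pooled + np.array([-1, 1]) * stats.norm.ppf(0.975) * se

print("pooled OR = %.2f (95%% CI %.2f-%.2f)" % tuple(np.exp([pooled, *ci])))
```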

  10. Transcriptome Analysis of Maize Immature Embryos Reveals the Roles of Cysteine in Improving Agrobacterium Infection Efficiency

    Science.gov (United States)

    Liu, Yan; Zhang, Zhiqiang; Fu, Junjie; Wang, Guoying; Wang, Jianhua; Liu, Yunjun

    2017-01-01

    Maize Agrobacterium-mediated transformation efficiency has been greatly improved in recent years. Antioxidants such as cysteine can significantly improve maize transformation frequency by improving the Agrobacterium infection efficiency. However, the mechanism underlying the transformation improvement after cysteine exposure has not been elucidated. In this study, we showed that the addition of cysteine to the co-cultivation medium significantly increased the Agrobacterium infection efficiency of hybrid HiII and inbred line Z31 maize embryos. Reactive oxygen species contents were higher in embryos treated with cysteine than in those without cysteine. We further investigated the mechanism behind the cysteine-related increase in infection efficiency using transcriptome analysis. The results showed that the cysteine treatment up-regulated 939 genes and down-regulated 549 genes in both Z31 and HiII. Additionally, more differentially expressed genes were found in HiII embryos than in Z31 embryos, suggesting that HiII was more sensitive to the cysteine treatment than Z31. GO analysis showed that the up-regulated genes were mainly involved in the oxidation-reduction process. The up-regulation of these genes could help maize embryos to cope with the oxidative stress stimulated by Agrobacterium infection. The down-regulated genes were mainly involved in cell wall and membrane metabolism, such as aquaporin and expansin genes. Decreased expression of these cell wall integrity genes could loosen the cell wall, thereby improving the entry of Agrobacterium into plant cells. This study offers insight into the role of cysteine in improving Agrobacterium-mediated transformation of maize immature embryos. PMID:29089955

  11. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    International Nuclear Information System (INIS)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been added to the MARS code since the release of MARS 1.3 in July 1998. The new features include: - implementation of a point kinetics model into the 3D module - unification of the heat structure model - extension of the control function to the 3D module variables - improvement of the 3D module input check function. Each of these items has been implemented in the developmental version of the MARS 1.3.1 code and then independently verified and assessed. The effectiveness of the new features has been verified, and it is shown that these improvements greatly extend the code's capability and enhance its user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code development activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs

  12. Improvement of multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok

    1998-09-01

    The MARS (Multi-dimensional Analysis of Reactor Safety) code is a multi-dimensional, best-estimate thermal-hydraulic system analysis code. This report describes the new features that have been added to the MARS code since the release of MARS 1.3 in July 1998. The new features include: - implementation of a point kinetics model into the 3D module - unification of the heat structure model - extension of the control function to the 3D module variables - improvement of the 3D module input check function. Each of these items has been implemented in the developmental version of the MARS 1.3.1 code and then independently verified and assessed. The effectiveness of the new features has been verified, and it is shown that these improvements greatly extend the code's capability and enhance its user friendliness. Relevant input data changes are also described. In addition to the improvements, this report briefly summarizes the future code development activities that are being carried out or planned, such as coupling of MARS 1.3 with the containment code CONTEMPT and the three-dimensional reactor kinetics code MASTER 2.0. (author). 8 refs.

  13. Improving SFR Economics through Innovations from Thermal Design and Analysis Aspects

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Hongbin Zhang; Vincent Mousseau; Per F. Peterson

    2008-06-01

    Achieving economic competitiveness compared to LWRs and other Generation IV (Gen-IV) reactors is one of the major requirements for large-scale investment in commercial sodium cooled fast reactor (SFR) power plants. Advances in R&D for advanced SFR fuel and structural materials provide key long-term opportunities to improve SFR economics. In addition, other new opportunities are emerging to further improve SFR economics. This paper provides an overview of potential ideas from the perspective of thermal hydraulics to improve SFR economics. These include a new hybrid loop-pool reactor design to further optimize the economics, safety, and reliability of SFRs with more flexibility, a multiple-reheat and intercooling helium Brayton cycle to improve plant thermal efficiency and reduce safety-related overnight and operating costs, and modern multi-physics thermal analysis methods to reduce analysis uncertainties and the associated requirements for over-conservatism in reactor design. This paper reviews advances in all three of these areas and their potential beneficial impacts on SFR economics.

  14. Improved Proteomic Analysis Following Trichloroacetic Acid Extraction of Bacillus anthracis Spore Proteins

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Brooke LD; Wunschel, David S.; Sydor, Michael A.; Warner, Marvin G.; Wahl, Karen L.; Hutchison, Janine R.

    2015-08-07

    Proteomic analysis of bacterial samples provides valuable information about cellular responses and functions under different environmental pressures. Proteomic analysis is dependent upon efficient extraction of proteins from bacterial samples without introducing bias toward extraction of particular protein classes. While no single method can recover 100% of the bacterial proteins, selected protocols can improve overall protein isolation, peptide recovery, or enrich for certain classes of proteins. The method presented here is technically simple and does not require specialized equipment such as a mechanical disrupter. Our data reveal that for particularly challenging samples, such as B. anthracis Sterne spores, trichloroacetic acid extraction improved the number of proteins identified within a sample compared to bead beating (714 vs 660, respectively). Further, TCA extraction enriched for 103 known spore-specific proteins, whereas bead beating resulted in 49 unique proteins. Analysis of C. botulinum samples grown for 5 days, composed of vegetative biomass and spores, showed a similar trend, with improved protein yields and identification using our method compared to bead beating. Interestingly, easily lysed samples, such as B. anthracis vegetative cells, were processed equally effectively via TCA and bead beating, but TCA extraction remains the easiest and most cost-effective option. As with all assays, supplemental methods such as implementation of an alternative preparation method may provide additional insight into the protein biology of the bacteria being studied.

  15. A Lean Six Sigma approach to the improvement of the selenium analysis method

    Directory of Open Access Journals (Sweden)

    Bronwyn C. Cloete

    2012-11-01

    Full Text Available Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method type, which is empirical, inductive and deductive, and systematic, which relies on data, and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any

  16. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    Science.gov (United States)

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method type, which is empirical, inductive and deductive, and systematic, which relies on data, and is fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and

  18. Kaizen practice in healthcare: a qualitative analysis of hospital employees' suggestions for improvement.

    Science.gov (United States)

    Mazzocato, Pamela; Stenfors-Hayes, Terese; von Thiele Schwarz, Ulrica; Hasson, Henna; Nyström, Monica Elisabeth

    2016-07-29

    Kaizen, or continuous improvement, lies at the core of lean. Kaizen is implemented through practices that enable employees to propose ideas for improvement and solve problems. The aim of this study is to describe the types of issues and improvement suggestions that hospital employees feel empowered to address through kaizen practices, in order to understand when and how kaizen is used in healthcare. We analysed 186 structured kaizen documents containing improvement suggestions that were produced by 165 employees at a Swedish hospital. Directed content analysis was used to categorise the suggestions into the following categories: type of situation (proactive or reactive) triggering an action; type of process addressed (technical/administrative, support and clinical); complexity level (simple or complex); and type of outcomes aimed for (operational or sociotechnical). Compliance with the kaizen template was calculated. 72% of the improvement suggestions were reactions to a perceived problem. Support, technical and administrative, and primary clinical processes were involved in 47%, 38% and 16% of the suggestions, respectively. The majority of the kaizen documents addressed simple situations and focused on operational outcomes. The degree of compliance with the kaizen template was high for several items concerning the identification of problems and the proposed solutions, and low for items related to the testing and implementation of solutions. There is a need to combine kaizen practices with improvement and innovation practices that help staff and managers to address complex issues, such as the improvement of clinical care processes. The limited focus on sociotechnical aspects and the partial compliance with kaizen templates may indicate a limited understanding of the entire kaizen process and of how it relates to the overall organisational goals. This in turn can hamper the sustainability of kaizen practices and results. Published by the BMJ Publishing Group Limited.

  19. Application of exergy analysis for improving energy efficiency of natural gas liquids recovery processes

    International Nuclear Information System (INIS)

    Shin, Jihoon; Yoon, Sekwang; Kim, Jin-Kuk

    2015-01-01

    A thermodynamic analysis and optimization method is applied to provide design guidelines for improving the energy efficiency and cost-effectiveness of natural gas liquids recovery processes. Exergy analysis is adopted in this study as a thermodynamic tool to evaluate the loss of exergy associated with irreversibility in natural gas liquids recovery processes, from which a conceptual understanding of inefficient design features or equipment can be obtained. Natural gas liquids processes are modeled and simulated within the UniSim® simulator, from which detailed thermodynamic information is obtained for calculating exergy loss. The optimization framework is developed by minimizing overall exergy loss, as an objective function, subject to product specifications and engineering constraints. The optimization is carried out within MATLAB® with the aid of a stochastic solver based on genetic algorithms. The process simulator is linked to and interacts with the optimization solver, through which optimal operating conditions can be determined. A case study is presented to illustrate the benefit of using exergy analysis for the design and optimization of natural gas liquids processes and to demonstrate the applicability of the design method proposed in this paper. - Highlights: • Application of exergy analysis for natural gas liquids (NGL) recovery processes. • Minimization of exergy loss for improving energy efficiency. • A systematic optimization framework for the design of NGL recovery processes
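
    The study couples a UniSim process model to a genetic algorithm in MATLAB. As a hedged stand-in for that setup, the sketch below minimizes a toy surrogate exergy-loss function under a product-specification constraint with SciPy's stochastic differential-evolution solver; the variable names, bounds and functions are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.optimize import differential_evolution, NonlinearConstraint

def exergy_loss(x):
    """Toy surrogate for total exergy loss as a function of two operating
    variables (e.g. demethanizer pressure in bar, pre-cooling temperature
    in degC); a real study would call the process simulator here."""
    p, t = x
    return 0.02 * (p - 22.0) ** 2 + 0.05 * (t + 35.0) ** 2 + 3.0

def c2_recovery(x):
    """Toy product specification (fraction of ethane recovered)."""
    p, t = x
    return 0.98 - 0.004 * abs(p - 20.0) - 0.003 * abs(t + 40.0)

spec = NonlinearConstraint(c2_recovery, 0.90, np.inf)   # recovery >= 90 %
result = differential_evolution(exergy_loss,
                                bounds=[(15.0, 35.0), (-60.0, -20.0)],
                                constraints=(spec,), seed=0)
print("optimal operating point:", result.x, "exergy loss:", result.fun)
```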

  20. Does Flywheel Paradigm Training Improve Muscle Volume and Force? A Meta-Analysis.

    Science.gov (United States)

    Nuñez Sanchez, Francisco J; Sáez de Villarreal, Eduardo

    2017-11-01

    Núñez Sanchez, FJ and Sáez de Villarreal, E. Does flywheel paradigm training improve muscle volume and force? A meta-analysis. J Strength Cond Res 31(11): 3177-3186, 2017-Several studies have confirmed the efficacy of flywheel paradigm training for improving muscle volume and force. A meta-analysis of 13 studies with a total of 18 effect sizes was performed to analyse the role of various factors on the effectiveness of flywheel paradigm training. The following inclusion criteria were employed for the analysis: (a) randomized studies; (b) high validity and reliability instruments; (c) published in a high-quality peer-reviewed journal; (d) healthy participants; (e) studies where the eccentric programme was described; and (f) studies where increases in muscle volume and force were measured before and after training. Increases in muscle volume and force were noted with the use of flywheel systems during short periods of training. The increase in muscle mass appears not to have been influenced by the presence of eccentric overload during the exercise. The increase in force was significantly higher with the presence of eccentric overload during the exercise. The responses identified in this analysis are essential and should be considered by strength and conditioning professionals regarding the most appropriate dose-response trends for flywheel paradigm systems to optimize the increase in muscle volume and force.

  1. Assessment and improvement of the Plasmodium yoelii yoelii genome annotation through comparative analysis.

    Science.gov (United States)

    Vaughan, Ashley; Chiu, Sum-Ying; Ramasamy, Gowthaman; Li, Ling; Gardner, Malcolm J; Tarun, Alice S; Kappe, Stefan H I; Peng, Xinxia

    2008-07-01

    The sequencing of the genome of Plasmodium yoelii, a model rodent malaria parasite, has greatly facilitated research for the development of new drug and vaccine candidates against malaria. Unfortunately, only preliminary gene models were annotated on the partially sequenced genome, mostly by in silico gene prediction, and there has been no major improvement of the annotation since 2002. Here we report a systematic assessment of the accuracy of the genome annotation based on a detailed analysis of a comprehensive set of cDNA sequences and proteomics data. We found that the coverage of the current annotation tends to be biased toward genes expressed in the blood stages of the parasite life cycle. Based on our proteomic analysis, we estimate that about 15% of the liver stage proteome data we have generated is absent from the current annotation. Through comparative analysis we identified and manually curated a further 510 P. yoelii genes which have clear orthologs in the P. falciparum genome but were not present or were incorrectly annotated in the current annotation. This study suggests that improvements of the current P. yoelii genome annotation should focus on genes expressed in stages other than the blood stages. Comparative analysis will be critically helpful for this re-annotation. The addition of newly annotated genes will facilitate the use of P. yoelii as a model system for studying human malaria. Supplementary data are available at Bioinformatics online.

  2. Improved method for minimizing sulfur loss in analysis of particulate organic sulfur.

    Science.gov (United States)

    Park, Ki-Tae; Lee, Kitack; Shin, Kyoungsoon; Jeong, Hae Jin; Kim, Kwang Young

    2014-02-04

    The global sulfur cycle depends primarily on the metabolism of marine microorganisms, which release sulfur gas into the atmosphere and thus affect the redistribution of sulfur globally as well as the earth's climate system. To better quantify sulfur release from the ocean, analysis of the production and distribution of organic sulfur in the ocean is necessary. This report describes a wet-based method for accurate analysis of particulate organic sulfur (POS) in the marine environment. The proposed method overcomes the considerable loss of sulfur (up to 80%) that occurs during analysis using conventional methods involving drying. Use of the wet-based POS extraction procedure in conjunction with a sensitive sulfur analyzer enabled accurate measurements of cellular POS. Data obtained using this method will enable accurate assessment of how rapidly sulfur can transfer among pools. Such information will improve understanding of the role of POS in the oceanic sulfur cycle.

  3. Applying transactional analysis and personality assessment to improve patient counseling and communication skills.

    Science.gov (United States)

    Lawrence, Lesa

    2007-08-15

    To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients.

  4. Improving the effectiveness of FMEA analysis in automotive – a case study

    Directory of Open Access Journals (Sweden)

    Ványi Gábor

    2016-06-01

    Full Text Available Many industries, for example automotive, have well-defined product development process definitions and risk evaluation methods. The FMEA (Failure Mode and Effects Analysis) is a first-line risk analysis method in design, which has been implemented in development and production for decades. Although the first applications focused on mechanical and electrical design and functionality, today software components are implemented in many modern vehicle systems. However, standards and industry-specific associations do not specify any "best practice" for how to design the interactions of multiple entities in one model. This case study focuses on modelling interconnections and on improving the FMEA modelling process in the automotive industry. The selection and grouping of software components for the analysis is discussed, but software architecture design patterns are excluded from the study.
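
    Independent of the interconnection question discussed above, the core FMEA prioritization step is the risk priority number (RPN = severity x occurrence x detection). A minimal sketch, with made-up automotive failure modes and ratings:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    item: str
    effect: str
    severity: int     # 1-10
    occurrence: int   # 1-10
    detection: int    # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        # Risk Priority Number used to rank failure modes
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("CAN gateway", "message timeout", 7, 4, 6),
    FailureMode("door ECU software", "wrong lock state", 8, 3, 7),
    FailureMode("wiring harness", "intermittent contact", 6, 5, 4),
]

for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.item:20s} {fm.effect:22s} RPN={fm.rpn}")
```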

  5. Efficient Fault Localization and Failure Analysis Techniques for Improving IC Yield

    Directory of Open Access Journals (Sweden)

    Ankush Oberai

    2018-02-01

    Full Text Available With the increase in the complexity of semiconductor device processes and the challenge of satisfying high market demands, yield enhancement has become a crucial factor. Discovering and reacting to yield problems only at the end of the production line may cause unbearable yield loss, leading to longer times to market. Thus, the time and cost involved in fault isolation may be significantly shortened by effectively utilizing fault diagnosis technology, supporting yield improvement. Hence, for yield analysis, a highly integrated data network with software analysis tools has been established to reduce the fault analysis time. Synopsys Avalon, a product used for fault localization, is described in this paper; it aids in achieving better integrated circuit yields. This paper also illustrates various fault localization techniques for faster problem identification and discusses analytical tools such as the photon emission microscope and the transmission emission microscope for faster determination of device failures.

  6. Combined analysis of cortical (EEG) and nerve stump signals improves robotic hand control.

    Science.gov (United States)

    Tombini, Mario; Rigosa, Jacopo; Zappasodi, Filippo; Porcaro, Camillo; Citi, Luca; Carpaneto, Jacopo; Rossini, Paolo Maria; Micera, Silvestro

    2012-01-01

    Interfacing an amputee's upper-extremity stump nerves to control a robotic hand requires training of the individual and algorithms to process interactions between cortical and peripheral signals. The aim was to evaluate, for the first time, whether EEG-driven analysis of peripheral neural signals recorded while an amputee practices could improve the classification of motor commands. Four thin-film longitudinal intrafascicular electrodes (tf-LIFEs-4) were implanted in the median and ulnar nerves of the stump in the distal upper arm for 4 weeks. Artificial intelligence classifiers were implemented to analyze LIFE signals recorded while the participant tried to perform 3 different hand and finger movements as pictures representing these tasks were randomly presented on a screen. In the final week, the participant was trained to perform the same movements with a robotic hand prosthesis through modulation of tf-LIFE-4 signals. To improve the classification performance, an event-related desynchronization/synchronization (ERD/ERS) procedure was applied to EEG data to identify the exact timing of each motor command. Real-time control of neural (motor) output was achieved by the participant. By focusing electroneurographic (ENG) signal analysis in an EEG-driven time window, movement classification performance improved. After training, the participant regained normal modulation of background rhythms for movement preparation (α/β band desynchronization) in the sensorimotor area contralateral to the missing limb. Moreover, coherence analysis found a restored α band synchronization of the Rolandic area with frontal and parietal ipsilateral regions, similar to that observed in the opposite hemisphere for movement of the intact hand. Of note, phantom limb pain (PLP) resolved for several months. Combining information from both cortical (EEG) and stump nerve (ENG) signals improved the classification performance compared with tf-LIFE signal processing alone; training led to cortical reorganization and

  7. CAT 2 - An improved version of Cryogenic Analysis Tools for online and offline monitoring and analysis of large size cryostats

    Science.gov (United States)

    Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.

    2017-02-01

    CAT, Cryogenic Analysis Tools, is a software package developed using the LabVIEW and ROOT environments to analyze the performance of large-size cryostats, where many parameters, inputs, and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New graphical user interfaces have been developed in order to make the use of the full package more user-friendly, and a process of resource optimization has been carried out. The offline analysis of the full cryostat performance is available both through the ROOT command-line interface and through the new graphical interfaces.

  8. Gene set analysis: limitations in popular existing methods and proposed improvements.

    Science.gov (United States)

    Mishra, Pashupati; Törönen, Petri; Leino, Yrjö; Holm, Liisa

    2014-10-01

    Gene set analysis is the analysis of a set of genes that collectively contribute to a biological process. Most popular gene set analysis methods are based on empirical P-values, which require a large number of permutations. Despite the numerous gene set analysis methods developed in the past decade, the most popular methods still suffer from serious limitations. We present a gene set analysis method (mGSZ) based on the Gene Set Z-scoring function (GSZ) and asymptotic P-values. Asymptotic P-value calculation requires fewer permutations and thus speeds up the gene set analysis process. We compare the GSZ scoring function with seven popular gene set scoring functions and show that GSZ stands out as the best scoring function. In addition, we show improved performance of the GSA method when the max-mean statistic is replaced by the GSZ scoring function. We demonstrate the importance of both gene and sample permutations by showing the consequences in the absence of one or the other. A comparison of asymptotic and empirical methods of P-value estimation demonstrates a clear advantage of asymptotic P-values over empirical P-values. We show that mGSZ outperforms the state-of-the-art methods based on two different evaluations. We compared mGSZ results with permutation and rotation tests and show that rotation does not improve our asymptotic P-values. We also propose well-known asymptotic distribution models for three of the compared methods. mGSZ is available as an R package from cran.r-project.org. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
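
    The GSZ statistic itself is defined in the paper; as a simplified stand-in, the sketch below standardizes the mean score of a gene set against its expectation under random sets of the same size and converts it to an asymptotic normal P-value, illustrating why asymptotic P-values avoid very large numbers of permutations. The scores and set membership are synthetic.

```python
import numpy as np
from scipy import stats

def gene_set_z(scores, in_set):
    """Standardized difference between the mean score of genes in the set
    and the mean expected under random sets of the same size (a simplified
    stand-in for the GSZ statistic)."""
    scores, in_set = np.asarray(scores, float), np.asarray(in_set, bool)
    m, n = in_set.sum(), scores.size
    mu, sigma = scores.mean(), scores.std(ddof=1)
    # Variance of the mean of m genes drawn without replacement
    var_mean = (sigma ** 2 / m) * (n - m) / (n - 1)
    return (scores[in_set].mean() - mu) / np.sqrt(var_mean)

rng = np.random.default_rng(2)
scores = rng.normal(0, 1, 5000)           # e.g. per-gene differential expression scores
in_set = np.zeros(5000, bool)
in_set[:50] = True
scores[:50] += 0.6                        # make the set modestly enriched

z = gene_set_z(scores, in_set)
p_asymptotic = stats.norm.sf(abs(z)) * 2  # two-sided asymptotic P-value
print(f"z = {z:.2f}, asymptotic P = {p_asymptotic:.2e}")
```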

  9. Improving Production of Treated and Untreated Verbs in Aphasia: A Meta-Analysis.

    Science.gov (United States)

    de Aguiar, Vânia; Bastiaanse, Roelien; Miceli, Gabriele

    2016-01-01

    Background: Demographic and clinical predictors of aphasia recovery have been identified in the literature. However, little attention has been devoted to identifying and distinguishing predictors of improvement for different outcomes, e.g., production of treated vs. untreated materials. These outcomes may rely on different mechanisms, and therefore be predicted by different variables. Furthermore, treatment features are not typically accounted for when studying predictors of aphasia recovery. This is partly due to the small numbers of cases reported in studies, but also to limitations of data analysis techniques usually employed. Method: We reviewed the literature on predictors of aphasia recovery, and conducted a meta-analysis of single-case studies designed to assess the efficacy of treatments for verb production. The contribution of demographic, clinical, and treatment-related variables was assessed by means of Random Forests (a machine-learning technique used in classification and regression). Two outcomes were investigated: production of treated (for 142 patients) and untreated verbs (for 166 patients). Results: Improved production of treated verbs was predicted by a three-way interaction of pre-treatment scores on tests for verb comprehension and word repetition, and the frequency of treatment sessions. Improvement in production of untreated verbs was predicted by an interaction including the use of morphological cues, presence of grammatical impairment, pre-treatment scores on a test for noun comprehension, and frequency of treatment sessions. Conclusion: Improvement in the production of treated verbs occurs frequently. It may depend on restoring access to and/or knowledge of lexeme representations, and requires relative sparing of semantic knowledge (as measured by verb comprehension) and phonological output abilities (including working memory, as measured by word repetition). Improvement in the production of untreated verbs has not been reported very often
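
    To make the Random Forest step concrete, here is a small, entirely synthetic sketch in which baseline test scores and treatment frequency are related to a binary "improved" outcome and ranked by permutation importance; the feature names and the simulated relationship only mimic the kind of interaction reported above and are not the study's data.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 160
df = pd.DataFrame({
    "verb_comprehension": rng.uniform(0, 100, n),   # pre-treatment test scores
    "word_repetition": rng.uniform(0, 100, n),
    "sessions_per_week": rng.integers(1, 6, n),
    "months_post_onset": rng.integers(2, 120, n),
})
# Synthetic outcome: improvement is more likely with better baseline scores
# and more frequent sessions (mimicking the interaction reported above).
p = 1 / (1 + np.exp(-(0.02 * df.verb_comprehension + 0.015 * df.word_repetition
                      + 0.4 * df.sessions_per_week - 3.5)))
improved = rng.random(n) < p

X_tr, X_te, y_tr, y_te = train_test_split(df, improved, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
imp = permutation_importance(rf, X_te, y_te, n_repeats=20, random_state=0)
for name, score in sorted(zip(df.columns, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name:20s} {score:.3f}")
```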

  10. Improving production of treated and untreated verbs in aphasia: A meta-analysis

    Directory of Open Access Journals (Sweden)

    Vânia de Aguiar

    2016-09-01

    Full Text Available BACKGROUND. Demographic and clinical predictors of aphasia recovery have been identified in the literature. However, little attention has been devoted to identifying and distinguishing predictors of improvement for different outcomes, e.g., production of treated vs. untreated materials. These outcomes may rely on different mechanisms, and therefore be predicted by different variables. Furthermore, treatment features are not typically accounted for when studying predictors of aphasia recovery. This is partly due to the small numbers of cases reported in studies, but also to limitations of data analysis techniques usually employed. METHOD. We reviewed the literature on predictors of aphasia recovery, and conducted a meta-analysis of single-case studies designed to assess the efficacy of treatments for verb production. The contribution of demographic, clinical, and treatment-related variables was assessed by means of Random Forests (a machine-learning technique used in classification and regression. Two outcomes were investigated: production of treated (for 142 patients and untreated verbs (for 166 patients. RESULTS. Improved production of treated verbs was predicted by a three-way interaction of pre-treatment scores on tests for verb comprehension and word repetition, and the frequency of treatment sessions. Improvement in production of untreated verbs was predicted by an interaction including the use of morphological cues, presence of grammatical impairment, pre-treatment scores on a test for noun comprehension and frequency of treatment sessions. CONCLUSION. Improvement in the production of treated verbs occurs frequently. It may depend on restoring access to and/or knowledge of lexeme representations, and requires relative sparing of semantic knowledge (as measured by verb comprehension and phonological output abilities (including working memory, as measured by word repetition. Improvement in the production of untreated verbs has not been

  11. Aerobic Exercise Improves Cognitive Functioning in People With Schizophrenia: A Systematic Review and Meta-Analysis

    Science.gov (United States)

    Stubbs, Brendon; Rosenbaum, Simon; Vancampfort, Davy; Malchow, Berend; Schuch, Felipe; Elliott, Rebecca; Nuechterlein, Keith H.; Yung, Alison R.

    2017-01-01

    Abstract Cognitive deficits are pervasive among people with schizophrenia and treatment options are limited. There has been an increased interest in the neurocognitive benefits of exercise, but a comprehensive evaluation of studies to date is lacking. We therefore conducted a meta-analysis of all controlled trials investigating the cognitive outcomes of exercise interventions in schizophrenia. Studies were identified from a systematic search across major electronic databases from inception to April 2016. Meta-analyses were used to calculate pooled effect sizes (Hedges g) and 95% CIs. We identified 10 eligible trials with cognitive outcome data for 385 patients with schizophrenia. Exercise significantly improved global cognition (g = 0.33, 95% CI = 0.13–0.53, P = .001) with no statistical heterogeneity (I2 = 0%). The effect size in the 7 studies that were randomized controlled trials was g = 0.43. Greater dosages of exercise were associated with larger improvements in global cognition (β = .005, P = .065), and interventions supervised by physical activity professionals were also more effective (g = 0.47). Exercise significantly improved the cognitive domains of working memory (g = 0.39, P = .024, N = 7, n = 282), social cognition (g = 0.71, P = .002, N = 3, n = 81), and attention/vigilance (g = 0.66, P = .005, N = 3, n = 104). Effects on processing speed, verbal memory, visual memory and reasoning and problem solving were not significant. This meta-analysis provides evidence that exercise can improve cognitive functioning among people with schizophrenia, particularly from interventions using higher dosages of exercise. Given the challenges in improving cognition, and the wider health benefits of exercise, a greater focus on providing supervised exercise to people with schizophrenia is needed. PMID:27521348
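
    For readers who want to see how per-study effect sizes of this kind are pooled, the sketch below computes bias-corrected Hedges g from group means and SDs, combines studies with inverse-variance weights and reports Cochran's Q-based I²; the study-level numbers are invented.

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Bias-corrected standardized mean difference and its variance."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction
    g = j * d
    var = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var

# Hypothetical exercise-vs-control cognition change scores (mean, SD, n per arm)
studies = [(2.1, 4.0, 20, 0.5, 4.2, 19),
           (1.5, 3.5, 35, 0.2, 3.6, 33),
           (3.0, 5.0, 15, 1.1, 4.8, 16)]

g, v = np.array([hedges_g(*s) for s in studies]).T
w = 1 / v
pooled = np.sum(w * g) / w.sum()
Q = np.sum(w * (g - pooled) ** 2)            # Cochran's Q
i2 = max(0.0, (Q - (len(g) - 1)) / Q) * 100 if Q > 0 else 0.0
print(f"pooled g = {pooled:.2f}, I^2 = {i2:.0f}%")
```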

  12. Recent Improvements at CEA on Trace Analysis of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    Pointurier, F.; Hubert, A.; Faure, A.L.; Pottin, A.C.; Mourier, W.; Marie, O.

    2010-01-01

    In this paper, we present some results of R&D work conducted at CEA to improve, on the one hand, the performance of the techniques already in use for the detection of undeclared activities and, on the other hand, to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring new information, complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences involving mercury, lead or iridium atoms are in some cases necessary. Effort must be put into improving the purification procedure. Micro-Raman spectrometry allows the chemical composition of uranium compounds to be determined at the scale of individual microscopic objects, using pre-location of the particles by SEM and relocation of these particles by mathematical calculation. However, particles below 5 μm are hard to relocate, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although the precision of isotope ratio measurements is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high, and more developments are in progress to validate and improve this method. (author)

  13. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    VINCENT, ANDREW

    2005-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture

  14. Improving defect visibility in square pulse thermography of metallic components using correlation analysis

    Science.gov (United States)

    Xu, Changhang; Xie, Jing; Huang, Weiping; Chen, Guoming; Gong, Xumei

    2018-03-01

    Infrared (IR) thermography has gained wide application as an important non-destructive testing (NDT) technique. Improving defect visibility is critical to achieving an accurate detection result with IR thermography. In this study, we propose a novel approach to improving defect visibility in square pulse thermography (SPT) of metallic components. In the proposed approach, the correlation function of contrast (CFC) is defined for the first time. Based on the theories of heat conduction and correlation analysis, the differences in CFC between defects and sound regions are determined. We found that the peak lag time of the CFC is an effective feature for discriminating defects from sound regions in SPT. A new image is then constructed using the peak lag time of the CFC to improve defect visibility. To verify the efficiency of the proposed approach, an experiment was conducted on a steel specimen, and principal component analysis (PCA) and the presented approach were compared. The results show that through the proposed approach, defects in metallic components can be indicated more clearly and detected more accurately.
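
    A rough sketch of the underlying idea, under the assumption that the correlation function of contrast is obtained by cross-correlating each pixel's contrast-versus-time signal with a reference signal and imaging the lag of the correlation peak; the exact reference choice and normalization used by the authors may differ, and the thermal sequence here is synthetic.

```python
import numpy as np

def peak_lag_map(frames, reference):
    """For every pixel, cross-correlate its contrast-vs-time signal with a
    reference signal and return the lag (in frames) at which the correlation
    peaks, so that defect and sound regions can peak at different lags."""
    n_t, h, w = frames.shape
    ref = (reference - reference.mean()) / reference.std()
    lags = np.arange(-n_t + 1, n_t)
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            sig = frames[:, i, j]
            sig = (sig - sig.mean()) / (sig.std() + 1e-12)
            corr = np.correlate(sig, ref, mode="full")
            out[i, j] = lags[np.argmax(corr)]
    return out

# Synthetic square-pulse sequence: 100 frames of a 32x32 surface where a
# "defect" patch responds later than the surrounding sound material.
t = np.linspace(0, 10, 100)
sound = np.exp(-t / 2.0)
delay = 10                              # frames by which the defect response lags
defect = np.roll(sound, delay)
defect[:delay] = sound[0]
frames = np.tile(sound[:, None, None], (1, 32, 32))
frames[:, 10:20, 10:20] = defect[:, None, None]
frames += np.random.default_rng(4).normal(0, 0.01, frames.shape)

lag_image = peak_lag_map(frames, reference=sound)
print("mean lag, sound region:", lag_image[0:5, 0:5].mean(),
      "defect region:", lag_image[12:18, 12:18].mean())
```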

  15. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)

    2012-01-01

    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA-generated particles in an ICP is studied using a high-speed, high-frame-rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work on ICP torch developments specifically tailored to the improvement of LA sample introduction is presented. An abnormal scarcity of metal-argon polyatomic ions (MAr+) is observed during ICP-MS analysis. Evidence shows that MAr+ ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  16. Cross-platform analysis of cancer microarray data improves gene expression based classification of phenotypes

    Directory of Open Access Journals (Sweden)

    Eils Roland

    2005-11-01

    Full Text Available Abstract Background The extensive use of DNA microarray technology in the characterization of the cell transcriptome is leading to an ever increasing amount of microarray data from cancer studies. Although similar questions for the same type of cancer are addressed in these different studies, a comparative analysis of their results is hampered by the use of heterogeneous microarray platforms and analysis methods. Results In contrast to a meta-analysis approach where results of different studies are combined on an interpretative level, we investigate here how to directly integrate raw microarray data from different studies for the purpose of supervised classification analysis. We use median rank scores and quantile discretization to derive numerically comparable measures of gene expression from different platforms. These transformed data are then used for training of classifiers based on support vector machines. We apply this approach to six publicly available cancer microarray gene expression data sets, which consist of three pairs of studies, each examining the same type of cancer, i.e. breast cancer, prostate cancer or acute myeloid leukemia. For each pair, one study was performed by means of cDNA microarrays and the other by means of oligonucleotide microarrays. In each pair, high classification accuracies (> 85%) were achieved with training and testing on data instances randomly chosen from both data sets in a cross-validation analysis. To exemplify the potential of this cross-platform classification analysis, we use two leukemia microarray data sets to show that important genes with regard to the biology of leukemia are selected in an integrated analysis, which are missed in either single-set analysis. Conclusion Cross-platform classification of multiple cancer microarray data sets yields discriminative gene expression signatures that are found and validated on a large number of microarray samples, generated by different laboratories and
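
    As an illustration of the data-integration step, the sketch below applies a simple per-sample quantile discretization to two synthetic "platforms" measured on very different scales and then trains a linear SVM on the combined data; the data are simulated, and the median rank score transform mentioned above is not shown.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def quantile_discretize(X, n_bins=8):
    """Replace each sample's expression values by per-sample quantile bins,
    removing platform-specific scale so cDNA and oligonucleotide data become
    numerically comparable."""
    out = np.empty_like(X, dtype=int)
    for i, row in enumerate(X):
        edges = np.quantile(row, np.linspace(0, 1, n_bins + 1)[1:-1])
        out[i] = np.digitize(row, edges)
    return out

rng = np.random.default_rng(5)
n_genes = 200
# Two "platforms" measuring the same biology on very different scales.
labels = np.repeat([0, 1], 40)
signal = np.outer(labels, np.linspace(1.0, 0.0, n_genes))
platform_a = 2.0 * (signal + rng.normal(0, 0.5, (80, n_genes)))            # e.g. cDNA log-ratios
platform_b = 500.0 + 300.0 * (signal + rng.normal(0, 0.5, (80, n_genes)))  # e.g. oligo intensities

X = np.vstack([quantile_discretize(platform_a), quantile_discretize(platform_b)])
y = np.concatenate([labels, labels])
print("cross-validated accuracy:", cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean())
```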

  17. Improving energy productivity in paddy production through benchmarking-An application of data envelopment analysis

    International Nuclear Information System (INIS)

    Chauhan, Narvendra Singh; Mohapatra, Pratap K.J.; Pandey, Keshaw Prasad

    2006-01-01

    In this study, a data envelopment analysis approach has been used to determine the efficiencies of farmers with regard to energy use in rice production activities in the alluvial zone of the state of West Bengal in India. The study has helped to segregate efficient farmers from inefficient ones, identify wasteful uses of energy from different sources by inefficient farmers, and suggest reasonable savings in energy use from different sources. The methods of cross-efficiency matrix and distribution of virtual inputs are used to gain insight into the performance of individual farmers, rank efficient farmers and identify the improved operating practices followed by a group of truly efficient farmers. The results reveal that, on average, about 11.6% of the total input energy could be saved if the farmers follow the input package recommended by the study. The study also suggests that better use of power tillers and the introduction of improved machinery would improve the efficiency of energy use and thereby improve the energy productivity of the rice production system in the zone
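
    The efficiency scores in such a study come from solving one linear program per farm. A minimal sketch of the input-oriented CCR envelopment model with SciPy's linprog, using toy energy inputs and a single yield output; the real study uses more inputs, and the recommended input package differs.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, k):
    """Input-oriented CCR efficiency of farm k: the factor theta by which its
    energy inputs could be scaled down while a non-negative combination of
    all farms still produces at least its output."""
    n, m = inputs.shape          # farms x inputs
    _, s = outputs.shape         # farms x outputs
    # Decision variables: [theta, lambda_1 ... lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lambda_j * x_ij - theta * x_ik <= 0   for every input i
    A_in = np.hstack([-inputs[k].reshape(-1, 1), inputs.T])
    # -sum_j lambda_j * y_rj <= -y_rk              for every output r
    A_out = np.hstack([np.zeros((s, 1)), -outputs.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -outputs[k]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Toy data: energy inputs (diesel, electricity, fertiliser; arbitrary units/ha)
# and one output (paddy yield, t/ha) for five hypothetical farms.
inputs = np.array([[9.0, 4.0, 6.0],
                   [7.5, 3.0, 5.0],
                   [10., 5.0, 7.0],
                   [6.0, 2.5, 4.5],
                   [8.0, 4.5, 6.5]])
outputs = np.array([[5.2], [5.0], [5.1], [4.9], [4.6]])

for k in range(len(inputs)):
    print(f"farm {k}: efficiency = {dea_efficiency(inputs, outputs, k):.2f}")
```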

  18. [Failure mode and effects analysis to improve quality in clinical trials].

    Science.gov (United States)

    Mañes-Sevilla, M; Marzal-Alfaro, M B; Romero Jiménez, R; Herranz-Alonso, A; Sanchez Fresneda, M N; Benedi Gonzalez, J; Sanjurjo-Sáez, M

    2018-02-15

    The failure mode and effects analysis (FMEA) has been used as a tool in risk management and quality improvement. The objective of this study is to identify the weaknesses in processes in the clinical trials area of a Pharmacy Department (PD) with substantial research activity, in order to improve the safety of the usual procedures. A multidisciplinary team was created to analyse each of the critical points, identified as possible failure modes, in the development of clinical trials in the PD. For each failure mode, the possible cause and effect were identified, criticality was calculated using the risk priority number, and possible corrective actions were discussed. Six sub-processes were defined in the development of clinical trials in the PD, with the dispensing and prescription/validation sub-processes being the most likely to generate errors. All the improvement actions established in the FMEA were implemented in the Clinical Trials area. The FMEA is a useful tool in proactive risk management because it allows us to identify where mistakes are being made, analyse the causes that give rise to them, prioritize, and adopt solutions for risk reduction. The FMEA improves process safety and quality in the PD. Copyright © 2018 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  19. Analysis and Improvement of Attitude Output Accuracy in Rotation Inertial Navigation System

    Directory of Open Access Journals (Sweden)

    Kui Li

    2015-01-01

    Full Text Available An inertial navigation system (INS) measures a vehicle's angular rate and acceleration with orthogonally mounted tri-axis gyroscopes and accelerometers and then calculates the vehicle's real-time attitude, velocity, and position. Gyroscope drifts and accelerometer biases are the key factors that affect navigation accuracy. Theoretical analysis and experimental results show that the influence of gyroscope drifts and accelerometer biases can be restrained greatly in a rotation INS (RINS) by driving the inertial measurement unit (IMU) to rotate regularly, thus improving navigation accuracy significantly. High accuracy in position and velocity should theoretically be matched by high accuracy in attitude, since INS is based on dead reckoning. However, marine and vehicle experiments show that the short-term attitude output accuracy of RINS is actually worse than that of a non-rotating INS. The loss of attitude accuracy has serious impacts on many task systems where high attitude accuracy is required. This paper investigates the cause of the attitude output accuracy loss in RINS and then proposes a new attitude output accuracy improvement algorithm for RINS. Experimental results show that the proposed attitude compensation method can improve short-term pitch and roll output accuracy from 20~30 arc seconds to less than 5 arc seconds, and azimuth output accuracy from 2~3 arc minutes to less than 0.5 arc minutes in RINS.

  20. Economic analysis of the health impacts of housing improvement studies: a systematic review

    Science.gov (United States)

    Fenwick, Elisabeth; Macdonald, Catriona; Thomson, Hilary

    2013-01-01

    Background Economic evaluation of public policies has been advocated but rarely performed. Studies from a systematic review of the health impacts of housing improvement included data on costs and some economic analysis. Examination of these data provides an opportunity to explore the difficulties and the potential for economic evaluation of housing. Methods Data were extracted from all studies included in the systematic review of housing improvement which had reported costs and economic analysis (n=29/45). The reported data were assessed for their suitability to economic evaluation. Where an economic analysis was reported the analysis was described according to pre-set definitions of various types of economic analysis used in the field of health economics. Results 25 studies reported cost data on the intervention and/or benefits to the recipients. Of these, 11 studies reported data which was considered amenable to economic evaluation. A further four studies reported conducting an economic evaluation. Three of these studies presented a hybrid ‘balance sheet’ approach and indicated a net economic benefit associated with the intervention. One cost-effectiveness evaluation was identified but the data were unclearly reported; the cost-effectiveness plane suggested that the intervention was more costly and less effective than the status quo. Conclusions Future studies planning an economic evaluation need to (i) make best use of available data and (ii) ensure that all relevant data are collected. To facilitate this, economic evaluations should be planned alongside the intervention with input from health economists from the outset of the study. When undertaken appropriately, economic evaluation provides the potential to make significant contributions to housing policy. PMID:23929616

  1. Bladed wheels damage detection through Non-Harmonic Fourier Analysis improved algorithm

    Science.gov (United States)

    Neri, P.

    2017-05-01

    Recent papers introduced the Non-Harmonic Fourier Analysis for bladed wheel damage detection. This technique showed its potential in estimating the frequency of sinusoidal signals even when the acquisition time is short with respect to the vibration period, provided that certain hypotheses are fulfilled. However, previously proposed algorithms showed severe limitations in detecting cracks at their early stage. The present paper proposes an improved algorithm which allows the detection of a blade vibration frequency shift due to a crack whose size is very small compared to the blade width. Such a technique could be implemented for condition-based maintenance, allowing the use of non-contact methods for vibration measurements. A stator-fixed laser sensor could monitor all the blades as they pass in front of the spot, giving precious information about the wheel's health. This configuration determines an acquisition time for each blade which becomes shorter as the machine rotational speed increases. In this situation, traditional Discrete Fourier Transform analysis results in poor frequency resolution and is not suitable for detecting small frequency shifts. Non-Harmonic Fourier Analysis instead showed high reliability in vibration frequency estimation even with data samples collected in a short time range. A description of the improved algorithm is provided in the paper, along with a comparison with the previous one. Finally, a validation of the method is presented, based on finite element simulation results.
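
    A minimal illustration of the non-harmonic idea: estimate a sinusoid's frequency by least-squares fitting over a continuous frequency grid rather than only at the DFT bins, so that a small crack-induced shift remains visible in a short record. This is not the paper's algorithm; the sampling rate, record length and frequencies below are invented.

```python
import numpy as np

def nonharmonic_frequency(signal, t, f_grid):
    """Least-squares sinusoid power at every trial frequency (a non-harmonic
    Fourier coefficient); the grid is not restricted to the DFT bins k/T, so
    shifts far below 1/T can still be resolved."""
    power = []
    for f in f_grid:
        basis = np.column_stack([np.cos(2 * np.pi * f * t),
                                 np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(basis, signal, rcond=None)
        power.append(coef @ coef)
    return f_grid[np.argmax(power)]

fs, duration = 5000.0, 0.02                 # 20 ms of data per blade passage
t = np.arange(0, duration, 1 / fs)
healthy, cracked = 487.0, 485.5             # Hz: a small crack-induced shift
f_grid = np.arange(450.0, 520.0, 0.1)       # much finer than the 50 Hz DFT spacing

for name, f0 in [("healthy", healthy), ("cracked", cracked)]:
    x = np.sin(2 * np.pi * f0 * t) + 0.05 * np.random.default_rng(6).normal(size=t.size)
    print(name, "estimated frequency:", nonharmonic_frequency(x, t, f_grid), "Hz")
```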

  2. Aggregate analysis of regulatory authority assessors' comments to improve the quality of periodic safety update reports.

    Science.gov (United States)

    Jullian, Sandra; Jaskiewicz, Lukasz; Pfannkuche, Hans-Jürgen; Parker, Jeremy; Lalande-Luesink, Isabelle; Lewis, David J; Close, Philippe

    2015-09-01

    Marketing authorization holders (MAHs) are expected to provide high-quality periodic safety update reports (PSURs) on their pharmaceutical products to health authorities (HAs). We present a novel instrument aiming at improving the quality of PSURs, based on a standardized analysis of PSUR assessment reports (ARs) received from European Union HAs across products and therapeutic areas. All HA comments were classified into one of three categories: "Request for regulatory actions," "Request for medical and scientific information," or "Data deficiencies." The comments were graded according to their impact on patients' safety, the drug's benefit-risk profile, and the MAH's pharmacovigilance system. A total of 476 comments were identified through the analysis of 63 PSUR HA ARs received in 2013 and 2014; 47 (10%) were classified as "Requests for regulatory actions," 309 (65%) as "Requests for medical and scientific information," and 118 (25%) were related to "Data deficiencies." The most frequent comments were requests for labeling changes (35 HA comments in 19 ARs). The aggregate analysis revealed commonly raised issues and prompted changes to the MAH's procedures for the preparation of PSURs. The authors believe that this novel instrument based on the evaluation of PSUR HA ARs serves as a valuable mechanism to enhance the quality of PSURs and of decisions about optimizing the use of the products and, therefore, contributes to further improving the MAH's pharmacovigilance system and patient safety. Copyright © 2015 John Wiley & Sons, Ltd.

  3. Improved liquid chromatography combined with pulsed electrochemical detection for the analysis of etimicin sulfate.

    Science.gov (United States)

    Wu, Yuning; Zhao, Wei; Zhu, Xiaoyue; Wang, Fang; Zhang, Mei; Fan, Xialei; Yuan, Yaozuo; Hu, Changqin; Deng, Xiaolan; Adams, Erwin

    2016-04-01

    This paper describes an improved liquid chromatography method combined with pulsed electrochemical detection for the analysis of etimicin sulfate. In total, 22 impurities could be separated. A TSK-GEL C18 column (250 mm × 4.6 mm i.d., 5 μm) is used, and the mobile phase is composed of 40 mL of acetonitrile and 960 mL of an aqueous solution containing trifluoroacetic acid (15 mL/L), pentafluoropropionic acid (500 μL/L), 50% sodium hydroxide (8 mL/L) and sodium sulfate (1.5 g/L). The pH of the aqueous solution is adjusted to 3.5 with 0.8 M sodium hydroxide. The influence of the different chromatographic parameters on the separation was investigated. A quadruple potential-time waveform was applied to the electrodes of the detection cell, and 0.8 M sodium hydroxide was added post-column to raise the pH to at least 12 before detection. A central composite experimental design was used to describe the relationship between factors and response values and to perform factorial analysis. Compared to previously published investigations, this improved method shows higher sensitivity, better separation ability and robustness, and has been incorporated into the Chinese Pharmacopoeia 2015 for the analysis of etimicin sulfate. A number of commercial samples of etimicin sulfate were also analyzed using this method. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity.

    Science.gov (United States)

    Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey

    2016-02-07

    Real-time Raman spectroscopy can be used to assist in assessing skin lesions suspicious for cancer. Most diagnostic algorithms are based on the full band of the Raman spectra, either in the fingerprint region or the high wavenumber region. In this paper we explored wavenumber-selection-based analysis in Raman spectroscopy for skin cancer diagnosis. Wavenumber selection was implemented using windows of wavenumbers and leave-one-out cross-validated stepwise regression or the least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected windows of wavenumbers using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). In total, a combined cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions was included and divided into a training cohort (n = 518) and a testing cohort (n = 127) according to measurement time. It was found that the area under the receiver operating characteristic (ROC) curve improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for fixed sensitivities of 0.99-0.90 improved from 0.17-0.65 to 0.20-0.75 with wavenumber-selection-based analysis.
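As a rough illustration of the selection step, the sketch below uses an L1-penalised (LASSO-type) logistic regression with cross-validation to pick a sparse subset of wavenumber channels from synthetic spectra. It is not the authors' pipeline; the data, informative channel indices and corpus size are invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in: 200 spectra x 1000 wavenumber channels, labels 0 = benign, 1 = cancer.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1000))
y = (X[:, 450] + X[:, 720] + 0.5 * rng.normal(size=200) > 0).astype(int)

# L1-penalised logistic regression with a cross-validated penalty acts as a
# LASSO-style wavenumber selector: most coefficients shrink exactly to zero.
model = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(Cs=10, penalty="l1", solver="saga", cv=5, max_iter=5000),
)
model.fit(X, y)

coef = model.named_steps["logisticregressioncv"].coef_.ravel()
selected = np.flatnonzero(coef)
print(f"{selected.size} wavenumber channels selected, e.g. indices {selected[:10]}")
```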

  5. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    Science.gov (United States)

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.

  6. PWR (pressurized water reactor) water treatment improvements: Cost-benefit analysis: Final report

    International Nuclear Information System (INIS)

    Siegwarth, D.P.; Bickerstaff, J.A.; Chakravorti, R.

    1988-05-01

    Pressurized water reactor steam generators and turbines have experienced a variety of corrosion problems as a result of ionic, corrosion product and oxidizing species transport into the steam generators. This project considered the design, cost and benefit of equipment modifications and additions which would decrease secondary cycle impurity transport. Improving condenser integrity, adding full-flow condensate polishers, providing low dissolved oxygen in makeup water and installing all-ferrous heat exchangers are four changes that can significantly improve secondary water quality. Conceptual designs and costs of these four concepts at a 1160 MWe pressurized water reactor are summarized. The expected chemistry and operational benefits are discussed, and a cost-benefit analysis is given.

  7. Assessment of modern spectral analysis methods to improve wavenumber resolution of F-K spectra

    International Nuclear Information System (INIS)

    Shirley, T.E.; Laster, S.J.; Meek, R.A.

    1987-01-01

    The improvement in wavenumber spectra obtained by using high-resolution spectral estimators is examined. Three modern spectral estimators were tested, namely the Autoregressive/Maximum Entropy (AR/ME) method, the Extended Prony method, and an eigenstructure method. They were combined with the conventional Fourier method by first transforming each trace with a Fast Fourier Transform (FFT). A high-resolution spectral estimator was then applied to the resulting complex spatial sequence for each frequency. The collection of wavenumber spectra thus computed comprises a hybrid f-k spectrum with high wavenumber resolution and less spectral ringing. Synthetic and real data records containing 25 traces were analyzed using the hybrid f-k method. The results show that an FFT-AR/ME f-k spectrum has noticeably better wavenumber resolution and more spectral dynamic range than conventional spectra when the number of channels is small. The observed improvement suggests the hybrid technique is potentially valuable in seismic data analysis.
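A minimal sketch of the hybrid idea follows: take a temporal FFT of each trace, then estimate a high-resolution wavenumber spectrum across the short spatial sequence at one frequency. For simplicity the sketch fits the autoregressive model by ordinary least squares rather than the AR/ME, Prony or eigenstructure estimators used in the study, and the synthetic gather, trace spacing and model order are assumed values.

```python
import numpy as np

def ar_spectrum(x, order, wavenumbers):
    """Least-squares AR fit of a (complex) spatial sequence and its power spectrum.

    A simple stand-in for the higher-resolution AR/ME estimator discussed above.
    """
    n = len(x)
    # Linear prediction problem: x[k] ~ -sum_m a[m] * x[k-m], for k = order..n-1.
    A = np.column_stack([x[order - m - 1:n - m - 1] for m in range(order)])
    b = x[order:]
    a, *_ = np.linalg.lstsq(A, -b, rcond=None)
    sigma2 = np.mean(np.abs(b + A @ a) ** 2)
    # PSD(k) = sigma^2 / |1 + sum_m a[m] exp(-j*2*pi*k*m)|^2
    m = np.arange(1, order + 1)
    denom = 1.0 + np.exp(-2j * np.pi * np.outer(wavenumbers, m)) @ a
    return sigma2 / np.abs(denom) ** 2

# Synthetic gather: 25 traces, 256 time samples, one dipping plane wave (assumed geometry).
n_tr, n_t, dt = 25, 256, 0.004
t = np.arange(n_t) * dt
traces = np.array([np.sin(2 * np.pi * 20 * (t - 0.0008 * i)) for i in range(n_tr)])

# Step 1: temporal FFT of each trace.
spectra = np.fft.rfft(traces, axis=1)          # shape (n_tr, n_freq)
freqs = np.fft.rfftfreq(n_t, dt)

# Step 2: AR wavenumber spectrum across the 25-channel spatial sequence at 20 Hz,
# evaluated on a fine wavenumber grid (cycles per trace spacing).
k_grid = np.linspace(-0.5, 0.5, 501)
f_idx = np.argmin(np.abs(freqs - 20.0))
pk = ar_spectrum(spectra[:, f_idx], order=6, wavenumbers=k_grid)
print("peak wavenumber:", k_grid[np.argmax(pk)], "cycles/trace spacing")
```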

  8. Development of an improved commercial sector energy model for national policy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, D.B.

    1992-12-01

    Pacific Northwest Laboratory provided support to the Office of Conservation and Renewable Energy (CE), under the Office of Planning and Assessment, to develop improved energy and environmental analysis tools. Commercial building sector energy models from the past decade were analyzed in order to provoke comment and stimulate discussion between potential model users and developers as to the appropriate structure and capability of a commercial sector energy model supported by CE. Three specific areas were examined during this review: (1) recent suggestions and guidance as to what constitutes a minimal set of requirements and capabilities for a commercial buildings energy model for CE; (2) a review of several existing models in terms of their general structure and how they match up with the requirements listed previously; and (3) an overview of a proposed improved commercial sector energy model.

  9. Surgical videos for accident analysis, performance improvement, and complication prevention: time for a surgical black box?

    Science.gov (United States)

    Gambadauro, Pietro; Magos, Adam

    2012-03-01

    Conventional audit of surgical records through review of surgical results provides useful knowledge but hardly helps identify the technical reasons lying behind specific outcomes or complications. Surgical teams not only need to know that a complication might happen but also how and when it is most likely to happen. Functional awareness is therefore needed to prevent complications, know how to deal with them, and improve overall surgical performance. The authors wish to argue that the systematic recording and reviewing of surgical videos, a "surgical black box," might improve surgical care, help prevent complications, and allow accident analysis. A possible strategy to test this hypothesis is presented and discussed. Recording and reviewing surgical interventions, apart from helping us achieve functional awareness and increasing the safety profile of our performance, allows us also to effectively share our experience with colleagues. The authors believe that those potential implications make this hypothesis worth testing.

  10. An improved principal component analysis based region matching method for fringe direction estimation

    Science.gov (United States)

    He, A.; Quan, C.

    2018-04-01

    The principal component analysis (PCA) and region matching combined method is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and its algorithm for converting orientation to direction in the mask areas is computationally heavy and non-optimized. We propose an improved PCA-based region matching method for fringe direction estimation, which includes an improved and robust mask construction scheme and a fast, optimized orientation-to-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, the fringe pattern filtered by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used in the Hilbert spiral transform (HST) to demodulate the phase. Subsequently, the windowed Fourier ridge (WFR) method is used to refine the phase. The robustness and effectiveness of the proposed method are demonstrated on both simulated and experimental fringe patterns.
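The PCA step itself can be sketched in a few lines: the local fringe orientation is taken as the direction of the principal component of the image-gradient vectors in a sliding window. The sketch below, with an invented synthetic fringe pattern and window size, only recovers the orientation, which is ambiguous by pi; resolving that ambiguity into a direction map is what the mask-based conversion discussed above addresses, and is not reproduced here.

```python
import numpy as np

def fringe_orientation_pca(img, win=9):
    """Local fringe orientation (radians) from PCA of gradient vectors in a sliding window."""
    gy, gx = np.gradient(img.astype(float))   # gradients along rows (y) and columns (x)
    h, w = img.shape
    half = win // 2
    theta = np.zeros((h, w))
    for i in range(half, h - half):
        for j in range(half, w - half):
            gxw = gx[i - half:i + half + 1, j - half:j + half + 1].ravel()
            gyw = gy[i - half:i + half + 1, j - half:j + half + 1].ravel()
            cov = np.cov(np.vstack([gxw, gyw]))
            vals, vecs = np.linalg.eigh(cov)
            vx, vy = vecs[:, np.argmax(vals)]  # principal gradient direction in the window
            theta[i, j] = np.arctan2(vy, vx)   # orientation only: ambiguous modulo pi
    return theta

# Synthetic fringes whose intensity varies along a direction 30 degrees from the x-axis.
y, x = np.mgrid[0:128, 0:128]
fringes = np.cos(2 * np.pi * (x * np.cos(np.pi / 6) + y * np.sin(np.pi / 6)) / 12.0)
orientation = fringe_orientation_pca(fringes)
print(np.rad2deg(orientation[64, 64]))  # ~30 deg (or its pi-ambiguous twin, ~-150 deg)
```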

  11. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    Science.gov (United States)

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery. PMID:26181208

  12. ANALYSIS AND IMPROVEMENT OF PRODUCTION EFFICIENCY IN A CONSTRUCTION MACHINE ASSEMBLY LINE

    Directory of Open Access Journals (Sweden)

    Alidiane Xavier

    2016-07-01

    Full Text Available Increased competitiveness in the market encourages the ongoing development of systems and production processes. The aim is to increase production efficiency so that production costs and waste are reduced as far as possible, making the product more competitive. The objective of this study was to analyze the overall results of implementing a Kaizen philosophy at a manufacturer of construction machinery, using an action research methodology in which the macro production process, from receipt of parts to the end of the assembly line, was studied in situ, prioritizing the analysis of shipping and handling times. The results show that the continuous improvement activities directly contributed to the elimination of waste from the assembly process, mainly related to shipping and handling, improving production efficiency by 30% in the processes studied.

  13. Regional fringe analysis for improving depth measurement in phase-shifting fringe projection profilometry

    Science.gov (United States)

    Chien, Kuang-Che Chang; Tu, Han-Yen; Hsieh, Ching-Huang; Cheng, Chau-Jern; Chang, Chun-Yen

    2018-01-01

    This study proposes a regional fringe analysis (RFA) method to detect the regions of a target object in captured shifted images to improve depth measurement in phase-shifting fringe projection profilometry (PS-FPP). In the RFA method, region-based segmentation is exploited to segment the de-fringed image of a target object, and a multi-level fuzzy-based classification with five presented features is used to analyze and discriminate the regions of an object from the segmented regions, which were associated with explicit fringe information. Then, in the experiment, the performance of the proposed method is tested and evaluated on 26 test cases made of five types of materials. The qualitative and quantitative results demonstrate that the proposed RFA method can effectively detect the desired regions of an object to improve depth measurement in the PS-FPP system.

  14. Silica Fume and Fly Ash Admixtures Can Help to Improve RPC Durability: Combined Microscopic Analysis

    Directory of Open Access Journals (Sweden)

    Xiao Li-guang

    2016-01-01

    Full Text Available Silica fume/fly ash RPC can greatly improve durability. When 8% of the cement was replaced by an equal amount of silica fume and 10% of mechanically activated fly ash (re-mixed for 15 min) was added, chloride ion flux measurements showed that the impermeability of the doubly admixed RPC improved significantly compared with the reference RPC. In addition, determination of the internal pore structure by static nitrogen adsorption showed that the integral pore volume of the admixed RPC was significantly lower than that of the reference RPC. Finally, SEM-based microscopic analysis of the internal structure of the admixed RPC and its formation mechanism showed that the SF/FA combination fully embodies the "synergistic" principle of composite materials.

  15. Analysis of the nutritional management practices in intensive care: Identification of needs for improvement.

    Science.gov (United States)

    Lázaro-Martín, N I; Catalán-González, M; García-Fuentes, C; Terceros-Almanza, L; Montejo-González, J C

    2015-12-01

    To analyze nutritional management practices in the intensive care unit (ICU) in order to identify the need for improvement actions, and to re-evaluate the process after implementation of those actions. Prospective observational study in 3 phases: 1) observation; 2) analysis, proposal development and dissemination; 3) analysis of the implementation. ICU of a hospital of high complexity. Adult ICU patients expected to require more than 48 h of artificial nutrition. Parenteral nutrition (PN), enteral nutrition (EN) (type, average effective volume, complications) and average nutritional ratio. A total of 229 patients (phase 1: 110, phase 3: 119). After analyzing the initial results, the following were proposed: increased use and earlier initiation of EN, increased protein intake, monitoring of nutritional effectiveness, and increased indication of supplementary PN. The measures were communicated at dedicated meetings. During phase 3 more patients received EN (55.5 vs. 78.2%, P=.001), with no significant difference in start time (1.66 vs. 2.33 days), duration (6.82 vs. 10.12 days) or complications (37.7 vs. 47.3%). Use of high-protein diets was higher in phase 3 (0 vs. 13.01%, P<.05). The use of PN was similar (48.2 vs. 48.7%), with a tendency toward a later onset in phase 3 (1.25±1.25 vs. 2.45±3.22 days). There were no significant differences in the average nutritional ratio (0.56±0.28 vs. 0.61±0.27, P=.56). The use of EN and the protein intake increased, with no appreciable effect on the other improvement measures. Other methods appear to be necessary for the proper implementation of improvement measures. Copyright © 2015 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.

  16. Primary health care contribution to improve health outcomes in Bogota-Colombia: a longitudinal ecological analysis

    Directory of Open Access Journals (Sweden)

    Mosquera Paola A

    2012-08-01

    Full Text Available Abstract Background Colombia has a highly segmented and fragmented national health system that contributes to inequitable health outcomes. In 2004 the district government of Bogota initiated a Primary Health Care (PHC) strategy to improve health care access and population health status. This study aims to analyse the contribution of the PHC strategy to the improvement of health outcomes, controlling for socioeconomic variables. Methods A longitudinal ecological analysis using data from secondary sources was carried out. The analysis used data from 2003 and 2007 (one year before and 3 years after the PHC implementation). A Primary Health Care Index (PHCI) of coverage intensity was constructed. According to the PHCI, localities were classified into two groups: high and low coverage. A multivariate analysis using a Poisson regression model for each year separately and a panel Poisson regression model to assess changes between the groups over the years was developed. Dependent variables were infant mortality rate, under-5 mortality rate, infant mortality rate due to acute diarrheal disease and pneumonia, prevalence of acute malnutrition, vaccination coverage for diphtheria, pertussis and tetanus (DPT) and prevalence of exclusive breastfeeding. The independent variable was the PHCI. Control variables were sewerage coverage, health system insurance coverage and quality of life index. Results The high PHCI localities as compared with the low PHCI localities showed significant risk reductions in under-5 mortality (13.8%) and infant mortality due to pneumonia (37.5%) between 2003 and 2007. The probability of being vaccinated for DPT also showed a significant increase of 4.9%. The risk of infant mortality and of acute malnutrition in children under 5 years was lower in the high coverage group than in the low one; however, relative changes were not statistically significant. Conclusions Despite the adverse contextual conditions and the limitations imposed by the

  17. Drug supply indicators: Pitfalls and possibilities for improvements to assist comparative analysis.

    Science.gov (United States)

    Singleton, Nicola; Cunningham, Andrew; Groshkova, Teodora; Royuela, Luis; Sedefov, Roumen

    2018-03-03

    Interventions to tackle the supply of drugs are seen as standard components of illicit drug policies. Therefore drug market-related administrative data, such as seizures, price, purity and drug-related offending, are used in most countries for policy monitoring and assessment of the drug situation. International agencies, such as the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) and the UN Office of Drugs and Crime, also monitor and report on the drug situation cross-nationally and therefore seek to collect and make available key data in a uniform manner from the countries they cover. However, these data are not primarily collected for this purpose, which makes interpretation and comparative analysis difficult. Examples of limitations of these data sources include: the extent to which they reflect operational priorities rather than market changes; question marks over the robustness of and consistency in data collection methods, and issues around the timeliness of data availability. Such problems are compounded by cultural, social and contextual differences between countries. Making sense of such data is therefore challenging and extreme care needs to be taken using it. Nevertheless, these data provide an important window on a hidden area, so improving the quality of the data collected and expanding its scope should be a priority for those seeking to understand or monitor drug markets and supply reduction. In addition to highlighting some of the potential pitfalls in using supply indicators for comparative analysis, this paper presents a selection of options for improvements based on the current EMCDDA programme of work to improve their supply-related monitoring and analysis. The conceptual framework developed to steer this work may have wider application. Adopting this approach has the potential to provide a richer picture of drug markets, at both national and international levels, and make it easier to compare data between countries. Copyright

  18. Rational improvement of the engineered isobutanol-producing Bacillus subtilis by elementary mode analysis

    Directory of Open Access Journals (Sweden)

    Li Shanshan

    2012-08-01

    Full Text Available Abstract Background Isobutanol is considered a leading candidate for the replacement of current fossil fuels and is expected to be produced biotechnologically. Owing to its valuable features, Bacillus subtilis has been engineered as an isobutanol producer, but it needs to be further optimized for more efficient production. Since elementary mode analysis (EMA) is a powerful tool for the systematic analysis of metabolic network structures and cell metabolism, it may be of great importance in rational strain improvement. Results The metabolic network of the isobutanol-producing B. subtilis BSUL03 was first constructed for EMA. Considering the actual cellular physiological state, 239 elementary modes (EMs) were screened from a total of 11,342 EMs for potential target prediction. On this basis, lactate dehydrogenase (LDH) and the pyruvate dehydrogenase complex (PDHC) were predicted as the most promising inactivation candidates according to flux flexibility analysis and intracellular flux distribution simulation. Then, the in silico designed mutants were experimentally constructed. The maximal isobutanol yield of the LDH- and PDHC-deficient strain BSUL05 reached 61% of the theoretical value, 0.36 ± 0.02 C-mol isobutanol/C-mol glucose, which was 2.3-fold that of BSUL03. Moreover, this mutant produced approximately 70% more isobutanol, reaching a maximal titer of 5.5 ± 0.3 g/L in fed-batch fermentations. Conclusions EMA was employed as a guiding tool to direct rational improvement of the engineered isobutanol-producing B. subtilis. The consistency between model prediction and experimental results demonstrates the rationality and accuracy of this EMA-based approach for target identification. This network-based rational strain improvement strategy could serve as a promising concept to engineer efficient B. subtilis hosts for isobutanol, as well as other valuable products.

  19. Evaluation of a possible direct effect by casein phosphopeptides on paracellular and vitamin D controlled transcellular calcium transport mechanisms in intestinal human HT-29 and Caco2 cell lines.

    Science.gov (United States)

    Colombini, Alessandra; Perego, Silvia; Ardoino, Ilaria; Marasco, Emiliano; Lombardi, Giovanni; Fiorilli, Amelia; Biganzoli, Elia; Tettamanti, Guido; Ferraretto, Anita

    2013-08-01

    Intestinal cells are continuously exposed to food, whose components are able to modulate some of their physiological functions. Among the bioactive food derivatives are casein phosphopeptides (CPPs), produced by in vitro or in vivo casein digestion, which are able to form aggregates with calcium ions and to increase the uptake of the mineral in differentiated intestinal human HT-29 and Caco2 cells. Since extracellular calcium is a known inactivator of the TRPV6 channel, which is also involved in colon cancer progression, the present study aims to determine a possible modulation by CPPs of the molecular structures responsible for paracellular and/or transcellular calcium absorption in these two cell lines. Paracellular calcium transport was determined by TEER measurements in Caco2 cells and by Lucifer Yellow flux in HT-29 cells. The possible modulation of the transcellular calcium absorption machinery by CPPs was investigated by determining the mRNA expression of both the TRPV6 calcium channel and the VDR receptor in 1,25(OH)₂D₃ pre-treated undifferentiated/differentiated cells. The results obtained point out that: (i) CPPs do not affect paracellular calcium absorption; (ii) 1,25(OH)₂D₃ increases TRPV6 mRNA expression in both types of cells; in the case of HT-29 cells this is the first demonstration of the presence of the TRPV6 channel; (iii) CPPs per se are not able to affect VDR and TRPV6 mRNA expression; (iv) CPP administration does not affect TRPV6 mRNA expression in 1,25(OH)₂D₃ pre-treated HT-29 and Caco2 cells. Unlike peptides derived from the digestion of cheese whey protein, the digestion of milk casein produces peptides with no effect on TRPV6 calcium channel expression, though the same peptides are able to promote calcium uptake by the intestinal cells.

  20. Using digital notifications to improve attendance in clinic: systematic review and meta-analysis.

    Science.gov (United States)

    Robotham, Dan; Satkunanathan, Safarina; Reynolds, John; Stahl, Daniel; Wykes, Til

    2016-10-24

    To assess the impact of text-based electronic notifications on clinic attendance, in relation to study quality (according to risk of bias), and to assess simple ways in which notifications can be optimised (i.e., the impact of multiple notifications). Systematic review, with study quality appraisal assessing risk of bias and data synthesised in meta-analyses. MEDLINE, EMBASE, PsycINFO, Web of Science and Cochrane Database of Systematic Reviews (01.01.05 until 25.4.15). A systematic search to discover all studies containing quantitative data for synthesis into meta-analyses. Studies examining the effect of text-based electronic notifications on prescheduled appointment attendance in healthcare settings. Primary analysis included experimental studies where randomisation was used to define allocation to intervention and where a control group consisting of 'no reminders' was used. Secondary meta-analysis included studies comparing text reminders with voice reminders. Studies lacking sufficient information for inclusion (after attempting to contact study authors) were excluded. Primary outcomes were rates of attendance/non-attendance at healthcare appointments. The secondary outcome was the rate of rescheduled and cancelled appointments. 26 articles were included, 21 in the primary meta-analysis (8345 patients receiving electronic text notifications, 7731 patients receiving no notifications). Studies were included from Europe (9), Asia (7), Africa (2), Australia (2) and America (1). Patients who received notifications were 23% more likely to attend clinic than those who received no notification (risk ratio=1.23, 67% vs 54%). Those receiving notifications were 25% less likely to 'no show' for appointments (risk ratio=0.75, 15% vs 21%). Results were similar when accounting for risk of bias, region and publication year. Multiple notifications were significantly more effective at improving attendance than single notifications. Voice notifications appeared more effective than text

  1. Ranking agricultural practices on soil water improvements: a meta-analysis

    Science.gov (United States)

    Basche, A.; DeLonge, M. S.; Gonzalez, J.

    2016-12-01

    Increased rainfall variability is well documented in the historic record and predicted to intensify with future climate change. Managing excess water in periods of heavy rain and a lack of water in periods of inadequate precipitation will continue to be a challenge. Improving soil resiliency through increased water storage is a promising strategy to combat the effects of both rainfall extremes. The goal of this research is to quantify to what extent various conservation and ecological practices can improve soil hydrology. We are conducting a global meta-analysis focused on studies where conservation and ecological practices are compared to more conventional management. To date we have analyzed 100 studies with more than 450 paired comparisons to understand the effect of management on water infiltration rates, a critical process that ensures water enters the soil profile for crop use, water storage and runoff prevention. The database will be expanded to include studies measuring soil porosity and the water retained at field capacity. Statistical analysis has been done with both a bootstrap method and a mixed model that weights studies based on precision while accounting for between-study variation. We find that conservation and ecological practices, ranging from no-till, cover crops and crop rotation to perennial crops and agroforestry, on average significantly increased water infiltration rates relative to more conventional practice controls (mean of 75%, standard error 25%). There were significant differences between practices, with perennial and agroforestry systems showing the greatest potential for improving water infiltration rates (> 100% increase). Cover crops also led to a significant increase in water infiltration rates (> 60%), while crop rotations and no-till systems did not consistently demonstrate increases. We also found that studies needed to include alternative management for more than two years to detect a significant increase. Overall this global meta-analysis
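A precision-weighted random-effects pooling of per-study effects, the generic form of the mixed-model weighting described above, can be sketched as follows. The sketch uses the DerSimonian-Laird estimator on invented log response ratios of infiltration rate; it is not the authors' model or data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate of study effects (e.g. log response ratios)."""
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances                                   # fixed-effect (precision) weights
    mu_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - mu_fe) ** 2)                # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_re = 1.0 / (variances + tau2)                       # random-effects weights
    mu = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, se, tau2

# Hypothetical per-study log response ratios of infiltration rate (treatment/control).
log_rr = [0.55, 0.80, 0.10, 0.95, 0.40]
var_rr = [0.04, 0.09, 0.02, 0.12, 0.05]
mu, se, tau2 = dersimonian_laird(log_rr, var_rr)
print(f"pooled change: {100 * (np.exp(mu) - 1):.0f}% (95% CI from mu +/- 1.96*se)")
```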

  2. Improving torque per kilogram magnet of permanent magnet couplings using finite element analysis

    DEFF Research Database (Denmark)

    Högberg, Stig; Jensen, Bogi Bech; Bendixen, Flemming Buus

    2013-01-01

    This paper presents the methodology and subsequent findings of a performance-improvement routine that employs automated finite element (FE) analysis to increase the torque-per-kilogram-magnet (TPKM) of a permanent magnet coupling (PMC). The routine is applied to a commercially available cylindrical PMC with rectangular permanent magnets (PM), and a new design is discovered which increases TPKM by 15.6%. Furthermore, the study is repeated using concave/convex-shaped PMs, which results in an increase of TPKM of 57.6%. The FE models are validated against experimental measurements of the static....

  3. Improving IT project governance: A reflective analysis based on critical systems heuristics

    Directory of Open Access Journals (Sweden)

    David Johnstone

    2017-05-01

    Full Text Available IT project governance involves establishing authority structures, policies and mechanisms for IT projects. However, the way governance arrangements are implemented can sometimes exclude or marginalise important stakeholders. In this paper, we use critical systems thinking, and the notions of boundary critique and entrenched structural conflict, to inform a critical re-analysis of a case study where the governance proved relatively ineffective. We use the ‘twelve questions’ from the critical systems heuristics (CSH approach to diagnose problems with governance arrangements and suggest solutions. Based on this, we suggest the CSH approach has theoretical and practical efficacy for improving IT project governance in general.

  4. DAMBE7: New and improved tools for data analysis in molecular biology and evolution.

    Science.gov (United States)

    Xia, Xuhua

    2018-04-14

    DAMBE is a comprehensive software package for genomic and phylogenetic data analysis on Windows, Linux and Macintosh computers. New functions include imputing missing distances and phylogeny simultaneously (paving the way to building large phage and transposon trees), new bootstrapping/jackknifing methods for PhyPA (phylogenetics from pairwise alignments), and an improved function for fast and accurate estimation of the shape parameter of the gamma distribution for fitting rate heterogeneity over sites. The previous method corrected multiple hits for each site independently; DAMBE's new method uses all sites simultaneously for correction. DAMBE, featuring a user-friendly graphic interface, is freely available from http://dambe.bio.uottawa.ca.

  5. An improved method for reactor coolant pump abnormality monitoring using power line signal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jae Cheon [Korea Power Engineering Company, Korea Advanced Institute of Science and Technology, 150 deokjin-dong, Yuseong-ku, Daejeon (Korea, Republic of)]. E-mail jcjung@kopec.co.kr; Seong, Poong Hyun [Korea Power Engineering Company, Korea Advanced Institute of Science and Technology, 150 deokjin-dong, Yuseong-ku, Daejeon (Korea, Republic of)

    2006-01-15

    An improved method to detect reactor coolant pump (RCP) abnormalities is suggested in this work. The monitoring parameters acquired from power line signal analysis are motor torque, motor speed and characteristic harmonic frequencies. A combination of the Wigner-Ville distribution (WVD) and a feature area matrix comparison method is used for abnormality diagnosis. For validation of the proposed method, tests were performed during the cool-down and heat-up phases of a nuclear power plant (NPP), with cross-comparison against the RCP vibration monitoring system (VMS). The diagnostic predictions were verified against pump internal inspection results.

  6. Thrombocytopenia and craniotomy for tumor: A National Surgical Quality Improvement Program analysis.

    Science.gov (United States)

    Dasenbrock, Hormuzdiyar H; Devine, Christopher A; Liu, Kevin X; Gormley, William B; Claus, Elizabeth B; Smith, Timothy R; Dunn, Ian F

    2016-06-01

    To the authors' knowledge, the current study is the first national analysis of the association between preoperative platelet count and outcomes after craniotomy. Patients who underwent craniotomy for tumor were extracted from the prospective National Surgical Quality Improvement Program registry (2007-2014) and stratified by preoperative thrombocytopenia, defined as mild (125,000-149,000/μL), moderate (100,000-124,000/μL), severe (75,000-99,000/μL), or very severe (<75,000/μL) ... craniotomy for tumor. Cancer 2016;122:1708-17. © 2016 American Cancer Society.

  7. Human Factors Analysis to Improve the Processing of Ares-1 Launch Vehicle

    Science.gov (United States)

    Stambolian, Damon B.; Dippolito, Gregory M.; Nyugen, Bao; Dischinger, Charles; Tran, Donald; Henderson, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of human factors analysis in improving the ground processing procedures for the Ares-1 launch vehicle. The flight vehicle engineering designers for the Ares-1 launch vehicle had to design the vehicle for effective, efficient and safe ground operations within the cramped dimensions of a rocket design. The use of a mockup of the area where the technician would be required to work proved to be a very effective method of promoting collaboration between the Ares-1 designers and the ground operations personnel.

  8. Improving power output of inertial energy harvesters by employing principal component analysis of input acceleration

    Science.gov (United States)

    Smilek, Jan; Hadas, Zdenek

    2017-02-01

    In this paper we propose the use of principal component analysis to process measured acceleration data in order to determine the direction of acceleration with the highest variance at a given frequency of interest. This method can be used to improve the power generated by inertial energy harvesters. Their power output is highly dependent on the excitation acceleration magnitude and frequency, but the axes of the acceleration measurements might not always be perfectly aligned with the directions of movement, and therefore the generated power output might be severely underestimated in simulations, possibly leading to false conclusions about the feasibility of using an inertial energy harvester for the examined application.
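A minimal sketch of this idea follows: band-pass the 3-axis acceleration around the frequency of interest and take the leading principal component of the filtered samples as the direction of maximum variance. The sampling rate, excitation frequency and measurement data below are assumed, invented values.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, f0 = 1000.0, 50.0                        # sample rate and frequency of interest (assumed)
t = np.arange(0, 5, 1 / fs)

# Hypothetical measurement: a 50 Hz excitation along a direction not aligned with the axes.
direction = np.array([0.6, 0.7, 0.39])
direction /= np.linalg.norm(direction)
acc = np.outer(np.sin(2 * np.pi * f0 * t), direction) + 0.1 * np.random.randn(t.size, 3)

# Band-pass around the frequency of interest, then PCA of the filtered 3-axis samples.
b, a = butter(4, [f0 - 5, f0 + 5], btype="bandpass", fs=fs)
acc_f = filtfilt(b, a, acc, axis=0)
cov = np.cov(acc_f.T)
vals, vecs = np.linalg.eigh(cov)
principal_dir = vecs[:, np.argmax(vals)]     # direction with the highest variance near f0

print("recovered direction:", np.round(principal_dir, 2))
print("alignment with true direction:", abs(principal_dir @ direction))
```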

  9. Analysis and Alternate Selection of Nanopowder Modifiers to Improve a Special Protective Coating System

    Directory of Open Access Journals (Sweden)

    S. P. Bardakhanov

    2017-01-01

    Full Text Available This paper presents a practical approach for the rational choice of silica nanopowders as modifiers to control and improve the performance of protective coating systems operating in harsh environmental conditions. The approach is based on a multiparameter analysis of the reactivity of similar silica nanoparticles synthesized by chemical and physical methods. The analysis indicates distinct adsorption centers arising from differences in particle formation; the features of the formation and adsorption mechanisms lead to a higher diffusion capacity of the nanoparticles synthesized by physical methods into the paint material and ultimately result in stronger chemical bonds between the system elements. The approach allows the consumption of paint materials to be reduced by 30% or more and the coating adhesion, and hence the system life, to be increased at least 2-3 times. The validity of the approach is illustrated with data obtained from comparative modeling, factory testing, and practical use of modified systems.

  10. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods to the improvement of the parameters of a lead molding process. For this purpose, a Failure Mode and Effects Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) despite the previously defined potential problem and its preventive action. This contributed to the recognition of root causes and corrective actions and to a change of production parameters. It showed how these methods, the level of their organization, and systematic and rigorous study affect molding process parameters.

  11. Improved gap filling method based on singular spectrum analysis and its application in space environment

    Science.gov (United States)

    Li, Xiangzhen; Liu, Shuai; Li, Zhi; Gong, Jiancun

    2017-11-01

    Missing data are a common phenomenon in space environment measurements, impeding or even blocking subsequent model building, prediction and posterior analysis. To fill these data gaps, an improved filling method based on iterative singular spectrum analysis is proposed. It first extracts a distribution array of the gaps and then fills the gaps using all known data. The distribution array is utilized to generate the test sets for cross-validation. The embedding window length and the number of principal components are determined by a discrete particle swarm optimization algorithm in a noncontinuous fashion. The effectiveness and adaptability of the filling method are demonstrated by tests on solar wind data and geomagnetic indices from years of different solar activity.
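The iterative SSA idea can be sketched as follows: embed the series into a trajectory matrix, reconstruct it from the leading singular components, replace only the missing samples with the reconstruction, and repeat. This is a simplified illustration with a fixed window length and component count (the study optimizes these with discrete particle swarm optimization, which is not reproduced); the series and gap below are invented.

```python
import numpy as np

def ssa_reconstruct(x, window, n_comp):
    """Reconstruct a series from the leading n_comp singular components of its trajectory matrix."""
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])   # window x k Hankel matrix
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :n_comp] * s[:n_comp]) @ vt[:n_comp]
    # Diagonal (Hankel) averaging back to a series of length n.
    rec, counts = np.zeros(n), np.zeros(n)
    for j in range(k):
        rec[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return rec / counts

def ssa_fill_gaps(x, window=50, n_comp=4, n_iter=30):
    """Iteratively fill NaN gaps: initialize with the series mean, then re-impute from SSA."""
    x = np.asarray(x, float)
    gaps = np.isnan(x)
    filled = np.where(gaps, np.nanmean(x), x)
    for _ in range(n_iter):
        rec = ssa_reconstruct(filled, window, n_comp)
        filled[gaps] = rec[gaps]
    return filled

# Hypothetical geomagnetic-index-like series with a 30-sample gap.
t = np.arange(1000)
series = np.sin(2 * np.pi * t / 27.0) + 0.3 * np.sin(2 * np.pi * t / 13.5)
series_gappy = series.copy()
series_gappy[400:430] = np.nan
restored = ssa_fill_gaps(series_gappy)
print("max gap error:", np.max(np.abs(restored[400:430] - series[400:430])))
```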

  12. Sentiment Analysis in Spanish for Improvement of Products and Services: A Deep Learning Approach

    Directory of Open Access Journals (Sweden)

    Mario Andrés Paredes-Valverde

    2017-01-01

    Full Text Available Sentiment analysis is an important area that makes it possible to know public opinion about several aspects; this information helps organizations gauge customer satisfaction. Social networks such as Twitter are important information channels because real-time information can be obtained and processed from them. In this sense, we propose a deep-learning-based approach that allows companies and organizations to detect opportunities for improving the quality of their products or services through sentiment analysis. This approach is based on a convolutional neural network (CNN) and word2vec. To determine the effectiveness of this approach for classifying tweets, we conducted experiments with different sizes of a Twitter corpus composed of 100,000 tweets. We obtained encouraging results, with a precision of 88.7%, a recall of 88.7%, and an F-measure of 88.7% on the complete dataset.
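The general shape of such a classifier, a word-embedding layer followed by a 1-D convolution and pooling, can be sketched with Keras as below. This is only an illustrative architecture under assumed vocabulary and sequence sizes; the Spanish corpus, pre-trained word2vec embeddings and tuned hyperparameters of the paper are not reproduced, and the toy data are random.

```python
import numpy as np
from tensorflow.keras import layers, models

vocab_size, seq_len, embed_dim = 20_000, 40, 100   # assumed sizes, for illustration only

model = models.Sequential([
    layers.Embedding(vocab_size, embed_dim),        # pre-trained word2vec weights could be loaded here
    layers.Conv1D(128, kernel_size=5, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),          # positive vs. negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy stand-in for tokenized tweets (integer word indices) and sentiment labels.
X = np.random.randint(1, vocab_size, size=(512, seq_len))
y = np.random.randint(0, 2, size=(512,))
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(X[:3], verbose=0).ravel())
```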

  13. Improving the Design of Capacitive Micromachined Ultrasonic Transducers Aided with Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    A Martowicz

    2016-09-01

    Full Text Available The paper presents the results of an analysis performed to search for feasible design improvements for a capacitive micromachined ultrasonic transducer. The search was aided by sensitivity analysis and application of the Response Surface Method. A multiphysics approach was taken in the elaborated finite element model of one cell of the transducer in order to include the significant physical phenomena present in the modelled microdevice. The set of twelve uncertain input and design parameters consists of geometric, material and control properties. The amplitude of the dynamic membrane deformation of the transducer was chosen as the studied response. The objective of the study was to find robust design configurations of the transducer, i.e. configurations characterized by a maximal value of the deformation amplitude with minimal variation.

  14. Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft

    Science.gov (United States)

    Stambolian, Damon B.; Schlierf, Roland; Miller, Darcy; Posada, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderon, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of Human factors and timeline analysis to have a more efficient and effective processing flow. The solution involved developing a written timeline of events that included each activity within each functional flow block. Each activity had computer animation videos and pictures of the people involved and the hardware. The Human Factors Engineering Analysis Tool (HFEAT) was improved by modifying it to include the timeline of events. The HFEAT was used to define the human factors requirements and design solutions were developed for these requirements. An example of a functional flow block diagram is shown, and a view from one of the animations (i.e., short stack pallet) is shown and explained.

  15. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  16. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    Science.gov (United States)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving verification and validation process, and thus overall software quality. Traceability can be most beneficial when the system changes. If changes are made to high-level requirements it implies that low-level requirements need to be modified. Traceability ensures that requirements are appropriately and efficiently verified at various levels whereas analysis ensures that a rightly interpreted set of requirements is produced.

  17. Structure of CPV17 polyhedrin determined by the improved analysis of serial femtosecond crystallographic data.

    Science.gov (United States)

    Ginn, Helen M; Messerschmidt, Marc; Ji, Xiaoyun; Zhang, Hanwen; Axford, Danny; Gildea, Richard J; Winter, Graeme; Brewster, Aaron S; Hattne, Johan; Wagner, Armin; Grimes, Jonathan M; Evans, Gwyndaf; Sauter, Nicholas K; Sutton, Geoff; Stuart, David I

    2015-03-09

    The X-ray free-electron laser (XFEL) allows the analysis of small weakly diffracting protein crystals, but has required very many crystals to obtain good data. Here we use an XFEL to determine the room temperature atomic structure for the smallest cytoplasmic polyhedrosis virus polyhedra yet characterized, which we failed to solve at a synchrotron. These protein microcrystals, roughly a micron across, accrue within infected cells. We use a new physical model for XFEL diffraction, which better estimates the experimental signal, delivering a high-resolution XFEL structure (1.75 Å), using fewer crystals than previously required for this resolution. The crystal lattice and protein core are conserved compared with a polyhedrin with less than 10% sequence identity. We explain how the conserved biological phenotype, the crystal lattice, is maintained in the face of extreme environmental challenge and massive evolutionary divergence. Our improved methods should open up more challenging biological samples to XFEL analysis.

  18. Predictors of psychological improvement on non-professional suicide message boards: content analysis.

    Science.gov (United States)

    Niederkrotenthaler, T; Gould, M; Sonneck, G; Stack, S; Till, B

    2016-12-01

    Suicide message boards have been at the core of debates about negative influences of the Internet on suicidality. Nothing is currently known about communication styles that may help users to psychologically improve in these settings. In all, 1182 archival threads with 20 499 individual postings from seven non-professional suicide message boards supporting an 'against-suicide', 'neutral' or 'pro-suicide' attitude were randomly selected and subject to content analysis. Initial needs of primary posters (i.e. individual who open a thread), their psychological improvement by the end of the thread, their responses received and indicators of suicidality were coded. Differences between 'pro-suicide', 'neutral' and 'against suicide' boards, and correlations between primary posters and respondents in terms of suicidality were assessed. Logistic regression was used to test associations with psychological improvement. 'Pro-suicide' boards (n = 4) differed from 'neutral' (n = 1) and 'against-suicide' (n = 2) boards in terms of communicated contents. Indicators of suicidality correlated moderately to strongly between primary posters and respondents on 'pro-suicide' message boards, but less on other boards. Several communicative strategies were associated with psychological improvement in primary posters, including the provision of constructive advice [adjusted odds ratio (aOR) 4.10, 95% confidence interval (CI) 2.40-7.03], active listening (aOR 1.60, 95% CI 1.12-2.27), sympathy towards the poster (aOR 2.22, 95% CI 1.68-2.95) and provision of alternatives to suicide (aOR 2.30, 95% CI 1.67-3.18). Respondents resemble primary posters with regard to suicidality in 'pro-suicide' boards, which may hinder psychological improvement. Still, opportunities to intervene in these settings using simple communication techniques exist and need to be taken and evaluated.

  19. Continuous quality improvement in a Maltese hospital using logical framework analysis.

    Science.gov (United States)

    Buttigieg, Sandra C; Gauci, Dorothy; Dey, Prasanta

    2016-10-10

    Purpose The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, intensive care unit, surgical ward, and acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues, specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, project plan template using logical framework (LOGFRAME) is created for each setting. Findings This study shows substantial improvement in quality across the three settings. LFA proved to be effective to analyse quality issues and suggest improvement measures objectively. Research limitations/implications This paper applies LFA in specific, albeit, diverse settings in one hospital. For validation purposes, it would be ideal to analyse in other settings within the same hospital, as well as in several hospitals. It also adopts a bottom-up approach when this can be triangulated with other sources of data. Practical implications LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators and facilitates the development of a business case for improvement. Originality/value LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.

  20. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the

  1. Improved Persistent Scatterer analysis using Amplitude Dispersion Index optimization of dual polarimetry data

    Science.gov (United States)

    Esmaeili, Mostafa; Motagh, Mahdi

    2016-07-01

    Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual/quad polarized data provides an additional source of information to further improve the capability of InSAR time-series analysis. In this paper we use dual-polarized data and combine Amplitude Dispersion Index (ADI) optimization of pixels with a phase stability criterion for PSInSAR analysis. ADI optimization is performed using a simulated annealing algorithm to increase the number of Persistent Scatterer Candidates (PSCs). The phase stability of the PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) TerraSAR-X acquisitions from July 2013 to January 2014 over a subsidence area in Iran and compare the effectiveness of the method for both agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in urban and non-urban regions. Compared to single-pol data, the use of the optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with a high rate of deformation, which suffer from loss of phase stability over time. The classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of the PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly single-bounce.
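For reference, the Amplitude Dispersion Index at the heart of PS candidate selection is simply the temporal standard deviation of the calibrated amplitude divided by its temporal mean, computed per pixel over the stack and thresholded. The sketch below illustrates this on an invented amplitude stack; the polarimetric optimization by simulated annealing is not reproduced, and the 0.25 threshold is just a commonly used value, not necessarily the one used in the study.

```python
import numpy as np

def amplitude_dispersion_index(amp_stack):
    """ADI per pixel: temporal std / temporal mean of SAR amplitude (stack: n_images x rows x cols)."""
    return amp_stack.std(axis=0) / amp_stack.mean(axis=0)

# Hypothetical stack of 17 calibrated amplitude images, 300 x 300 pixels.
rng = np.random.default_rng(1)
stack = rng.rayleigh(scale=1.0, size=(17, 300, 300))       # distributed (unstable) scatterers
stack[:, 150, 150] = 5.0 + 0.1 * rng.standard_normal(17)   # one stable, bright scatterer

adi = amplitude_dispersion_index(stack)
ps_candidates = np.argwhere(adi < 0.25)    # commonly used ADI threshold for PS candidates
print(f"{len(ps_candidates)} PS candidates; pixel (150, 150) ADI = {adi[150, 150]:.3f}")
```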

  2. Improving estimation of kinetic parameters in dynamic force spectroscopy using cluster analysis

    Science.gov (United States)

    Yen, Chi-Fu; Sivasankar, Sanjeevi

    2018-03-01

    Dynamic Force Spectroscopy (DFS) is a widely used technique to characterize the dissociation kinetics and interaction energy landscape of receptor-ligand complexes with single-molecule resolution. In an Atomic Force Microscope (AFM)-based DFS experiment, receptor-ligand complexes, sandwiched between an AFM tip and substrate, are ruptured at different stress rates by varying the speed at which the AFM tip and substrate are pulled away from each other. The rupture events are grouped according to their pulling speeds, and the mean force and loading rate of each group are calculated. These data are subsequently fit to established models, and energy landscape parameters such as the intrinsic off-rate (koff) and the width of the potential energy barrier (xβ) are extracted. However, due to large uncertainties in determining the mean forces and loading rates of the groups, errors in the estimated koff and xβ can be substantial. Here, we demonstrate that the accuracy of fitted parameters in a DFS experiment can be dramatically improved by sorting rupture events into groups using cluster analysis instead of sorting them according to their pulling speeds. We test different clustering algorithms including Gaussian mixture, logistic regression, and K-means clustering, under conditions that closely mimic DFS experiments. Using Monte Carlo simulations, we benchmark the performance of these clustering algorithms over a wide range of koff and xβ, under different levels of thermal noise, and as a function of both the number of unbinding events and the number of pulling speeds. Our results demonstrate that cluster analysis, particularly K-means clustering, is very effective in improving the accuracy of parameter estimation, especially when the number of unbinding events is limited and the events are not well separated into distinct groups. Cluster analysis is easy to implement, and our performance benchmarks serve as a guide in choosing an appropriate method for DFS data analysis.
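
    A minimal sketch of the clustering idea, assuming synthetic rupture events: instead of grouping by nominal pulling speed, K-means groups events by (log loading rate, rupture force), and per-cluster means are then available for fitting kinetic models. The cluster count, feature scaling and simulated data are illustrative assumptions.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # Hypothetical rupture events: loading rates (pN/s) around three regimes, forces (pN)
      loading_rate = np.concatenate([rng.lognormal(m, 0.3, 200) for m in (6, 8, 10)])
      force = 10.0 * np.log(loading_rate) + rng.normal(0, 8, loading_rate.size)

      X = np.column_stack([np.log(loading_rate), force / force.std()])   # scaled features
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

      for k in range(3):
          sel = labels == k
          print(f"cluster {k}: mean force {force[sel].mean():.1f} pN, "
                f"mean loading rate {loading_rate[sel].mean():.2e} pN/s")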

  3. A social work study on the effect of transactional analysis on the improvement of intimacy attitude

    Directory of Open Access Journals (Sweden)

    Parvin Gol

    2013-04-01

    Full Text Available The purpose of this paper is to investigate the impact of group counseling using transactional analysis on the improvement of intimacy attitude in depressed patients in the city of Esfahan, Iran. A semi-experimental design with pretest-posttest control groups was conducted among 30 patients. The sample was selected through available sampling from among the depressed patients referred to psychiatric centers. They were randomly assigned into experimental and control groups. The measurement instruments were the intimacy attitude scale (IAS) questionnaire by Amidon et al. (1983) [Amidon, E., Kumar, V. K., & Treadwell, T. (1983). Measurement of intimacy attitudes: The intimacy attitude scale-revisited. Journal of Personality Assessment, 47(6), 635-639.] and the Beck Depression Inventory (BDI). The pretest and posttest scores of the intimacy attitude scale questionnaire were analyzed in both experimental and control groups. For statistical analysis of the data, repeated measures analysis of variance was carried out. The research findings indicated that group counseling using transactional analysis increases the level of intimacy attitude in depressed individuals. It also increases emotional intimacy, but it does not increase mental intimacy.

  4. Computerized lung sound analysis following clinical improvement of pulmonary edema due to congestive heart failure exacerbations.

    Science.gov (United States)

    Wang, Zhen; Xiong, Ying-xia

    2010-05-05

    Although acute congestive heart failure (CHF) patients typically present with abnormal auscultatory findings on lung examination, lung sounds are not normally subjected to rigorous analysis. The goals of this study were to use a computerized analytic acoustic tool to evaluate lung sound patterns in CHF patients during acute exacerbation and after clinical improvement and to compare CHF profiles with those of normal individuals. Lung sounds throughout the respiratory cycle were captured using a computerized acoustic-based imaging technique. Thirty-two consecutive CHF patients were imaged at the time of presentation to the emergency department and after clinical improvement. Digital images were created, and the geographical area of the images and the lung sound patterns were quantitatively analyzed. The geographical areas of the vibration energy image of acute CHF patients without and with radiographically evident pulmonary edema were (67.9 +/- 4.7) and (60.3 +/- 3.5) kilo-pixels, respectively (P sound increased to (74.5 +/- 4.4) and (73.9 +/- 3.9) kilo-pixels (P sound analysis may be useful to track in acute CHF exacerbations.

  5. Development of thermodynamic optimum searching (TOS) to improve the prediction accuracy of flux balance analysis.

    Science.gov (United States)

    Zhu, Yan; Song, Jiangning; Xu, Zixiang; Sun, Jibin; Zhang, Yanping; Li, Yin; Ma, Yanhe

    2013-03-01

    Flux balance analysis (FBA) has been widely used in calculating steady-state flux distributions that provide important information for metabolic engineering. Several thermodynamics-based methods, for example, quantitative assignment of reaction directionality and energy balance analysis, have been developed to improve the prediction accuracy of FBA. However, these methods can only generate a thermodynamically feasible range, rather than the most thermodynamically favorable solution. We therefore developed a novel optimization method termed thermodynamic optimum searching (TOS) to calculate the thermodynamically optimal solution, based on the second law of thermodynamics, the minimum magnitude of the Gibbs free energy change and the maximum entropy production principle (MEPP). TOS was then applied to five physiological conditions of Escherichia coli to evaluate its effectiveness. The resulting prediction accuracy was significantly improved (by 10.7-48.5%) compared with 13C-fluxome data, indicating that TOS can be considered an advanced calculation and prediction tool in metabolic engineering. Copyright © 2012 Wiley Periodicals, Inc.
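
    For readers unfamiliar with the baseline method, a plain FBA calculation (without the TOS thermodynamic extension) reduces to a linear program: maximize an objective flux subject to the steady-state mass balance S·v = 0 and flux bounds. The toy network, bounds and objective below are assumptions for illustration only.

      import numpy as np
      from scipy.optimize import linprog

      # Toy network A_ext -> A -> B -> biomass with reactions v1, v2, v3
      S = np.array([[1, -1,  0],    # metabolite A balance
                    [0,  1, -1]])   # metabolite B balance
      bounds = [(0, 10), (0, 10), (0, 10)]        # flux bounds (assumed)
      c = np.array([0.0, 0.0, -1.0])              # linprog minimizes, so negate to maximize v3

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
      print("optimal flux distribution:", res.x)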

  6. Diesel engine noise source identification based on EEMD, coherent power spectrum analysis and improved AHP

    Science.gov (United States)

    Zhang, Junhong; Wang, Jian; Lin, Jiewei; Bi, Fengrong; Guo, Qian; Chen, Kongwu; Ma, Liang

    2015-09-01

    As the essential foundation of noise reduction, many noise source identification methods have been developed and applied in engineering practice. To identify the noise sources of different engine parts over a broad frequency band at various typical speeds, this paper presents an integrated noise source identification method based on ensemble empirical mode decomposition (EEMD), coherent power spectrum analysis, and an improved analytic hierarchy process (AHP). The measured noise is decomposed into several IMFs with physical meaning, which ensures that the coherence analysis between the IMFs and the vibration signals is meaningful. An improved AHP is developed by introducing an objective weighting function to replace the traditional subjective evaluation, which makes the results no longer dependent on subjective judgment and at the same time provides better consistency. The proposed noise identification model is applied to identify the surface-radiated noise of a diesel engine. As a result, the frequency-dependent contributions of different engine parts to different test points at different speeds are obtained, and an overall weight order is obtained as oil pan  >  left body  >  valve chamber cover  >  gear chamber casing  >  right body  >  flywheel housing, which provides effective guidance for noise reduction.
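
    The decomposition/coherence stage can be sketched as follows, assuming the third-party PyEMD package (EMD-signal) for EEMD and synthetic noise and vibration signals; the improved AHP weighting is not reproduced. Each IMF is related to a vibration signal through magnitude-squared coherence.

      import numpy as np
      from scipy.signal import coherence
      from PyEMD import EEMD   # pip install EMD-signal (assumed available)

      fs = 1000.0
      t = np.arange(0, 1.0, 1.0 / fs)
      noise = (np.sin(2 * np.pi * 120 * t) + 0.5 * np.sin(2 * np.pi * 320 * t)
               + 0.2 * np.random.randn(t.size))                     # synthetic engine noise
      vibration = np.sin(2 * np.pi * 120 * t + 0.3) + 0.2 * np.random.randn(t.size)

      imfs = EEMD(trials=20).eemd(noise, t)       # ensemble empirical mode decomposition
      for i, imf in enumerate(imfs):
          f, cxy = coherence(imf, vibration, fs=fs, nperseg=256)
          print(f"IMF {i}: peak coherence {cxy.max():.2f} at {f[np.argmax(cxy)]:.0f} Hz")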

  7. Improved RNA analysis for immediate autopsy of temporal bone soft tissues.

    Science.gov (United States)

    Lin, J; Kawano, H; Paparella, M M; Ho, S B

    1999-01-01

    RNA analysis is essential for understanding biological activities of a cell or tissue. Unfortunately, retrieval of RNA from existing archives of human temporal bones has proven extremely difficult due to degradation of RNA molecules. The major factors that contribute to degradation of RNA in specimens from autopsied temporal bones are tissue autolysis due to time elapsed before autopsy, and technical problems in processing the bones after harvest. We therefore focused on improving the survival of RNA in human temporal bones by shortening the time to autopsy and through modification of the processing technique by removing targeted tissues directly from the temporal bones and by avoiding time-consuming decalcification and celloidin-embedding. Eight temporal bones collected at immediate autopsies were used in this study. Representative mRNAs, ranging from high (MUC5B, physically unstable) to low (beta-actin, physically stable) molecular weights, and from abundant (MUC5B) to non-abundant (MUC1) RNA, were studied by in situ hybridization, Northern blot technique, or both. Using this modified protocol in autopsies performed up to 6 h after death, the existence of mRNAs was demonstrated in all bones studied. This improved method demonstrates the feasibility of the use of autopsied temporal bone tissues for RNA analysis.

  8. Analysis of the dynamic response improvement of a turbocharged diesel engine driven alternating current generating set

    International Nuclear Information System (INIS)

    Katrasnik, Tomaz; Medica, Vladimir; Trenc, Ferdinand

    2005-01-01

    Reliability of electric supply systems is among the basic necessities of modern society. Turbocharged diesel engine driven alternating current generating sets are often used to prevent electrical blackouts and/or as prime electric energy suppliers. It is well known that turbocharged diesel engines suffer from an inadequate response to a sudden load increase, this being a consequence of the nature of the energy exchange between the engine and the turbocharger. The dynamic response of turbocharged diesel engines could be improved by electric assisting systems, either by direct energy supply with an integrated starter-generator-booster (ISG) mounted on the engine flywheel, or by an indirect energy supply with an electrically assisted turbocharger. An experimentally verified zero-dimensional computer simulation method was used for the analysis of both types of electrical assistance. The paper offers an analysis of the interaction between a turbocharged diesel engine and different electric assisting systems, as well as the requirements for the supporting electric motors that could improve the dynamic response of a diesel engine while driving an AC generating set. When performance class compliance is a concern, it is evident that an integrated starter-generator-booster outperforms an electrically assisted turbocharger for the investigated generating set. However, the electric energy consumption and frequency recovery times are smaller when an electrically assisted turbocharger is applied

  9. Image preprocessing improves Fourier-based texture analysis of nuclear chromatin.

    Science.gov (United States)

    Adam, Randall L; Leite, Neucimar J; Metze, Konradin

    2008-06-01

    To investigate whether preprocessing of digitized images can improve the image analysis of chromatin of cytologic preparations using Fast Fourier Transformation (FFT). In a preprocessing step the nuclear borders of the segmented nuclei were smoothed, thus avoiding the Airy ring artifact. We tested this method by comparing the inertia values of digitized cardiomyocyte nuclei of rats of different ages. Furthermore, we created in silico nuclear images with chromatin alterations at or near the nuclear edge in order to investigate the robustness of our method. After preprocessing, the FFT-derived variable inertia discriminated the chromatin structure of the nuclei at different ages significantly better in every frequency range. The investigation of simulated nuclei revealed that within the frequency range from 1.8 microm to 0.72 microm, smoothing of the borders does not interfere with the detection of chromatin changes at the nuclear border. Smoothing of borders in segmented images can improve the analysis of Fourier-derived variables of the chromatin texture.
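
    As an illustration of the kind of Fourier-domain texture feature discussed above, the sketch below computes a frequency-weighted power measure of a (synthetic) nucleus image within a radial frequency band. It is a hedged stand-in and does not reproduce the paper's exact definition of inertia or the border-smoothing step.

      import numpy as np

      def fourier_band_inertia(image, r_min, r_max):
          """image: 2-D grayscale nucleus; r_min/r_max: radial frequency band (cycles/pixel)."""
          F = np.fft.fftshift(np.fft.fft2(image - image.mean()))
          power = np.abs(F) ** 2
          ny, nx = image.shape
          fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
          fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
          r = np.hypot(fx, fy)
          band = (r >= r_min) & (r < r_max)
          return np.sum(power[band] * r[band] ** 2) / power.sum()

      nucleus = np.random.rand(64, 64)     # stand-in for a segmented, border-smoothed nucleus
      print(fourier_band_inertia(nucleus, 0.05, 0.25))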

  10. Security analysis and improvements of authentication and access control in the Internet of Things.

    Science.gov (United States)

    Ndibanje, Bruce; Lee, Hoon-Jae; Lee, Sang-Gon

    2014-08-13

    The Internet of Things is a ubiquitous concept where physical objects are connected over the internet and are provided with unique identifiers to enable their self-identification to other devices and the ability to continuously generate data and transmit it over a network. Hence, the security of the network, data and sensor devices is a paramount concern in the IoT network as it grows very fast in terms of exchanged data and interconnected sensor nodes. This paper analyses the authentication and access control method used in the Internet of Things presented by Jing et al. (Authentication and Access Control in the Internet of Things. In Proceedings of the 2012 32nd International Conference on Distributed Computing Systems Workshops, Macau, China, 18-21 June 2012, pp. 588-592). According to our analysis, Jing et al.'s protocol is costly in terms of message exchange, and its security assessment is not strong enough for such a protocol. Therefore, we propose improvements to the protocol to fill the discovered weakness gaps. The protocol enhancements provide many services to the users, such as user anonymity, mutual authentication, and secure session key establishment. Finally, the performance and security analysis show that the improved protocol possesses many advantages against popular attacks and achieves better efficiency at low communication cost.

  11. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula has been modified and the breast symmetry index (BSI) from 129 women after breast surgery has been calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers different from the first author to calculate inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992 with almost no difference between different observers. All subjective scores of the 50 patients correlated with the modified BSI score with a high Pearson correlation coefficient of 0.909 (p BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  12. Improvement of reflood model in RELAP5 code based on sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dong; Liu, Xiaojing; Yang, Yanhua, E-mail: yanhuay@sjtu.edu.cn

    2016-07-15

    Highlights: • Sensitivity analysis is performed on the reflood model of RELAP5. • The selected influential models are discussed and modified. • The modifications are assessed by FEBA experiment and better predictions are obtained. - Abstract: Reflooding is an important and complex process to the safety of nuclear reactor during loss of coolant accident (LOCA). Accurate prediction of the reflooding behavior is one of the challenge tasks for the current system code development. RELAP5 as a widely used system code has the capability to simulate this process but with limited accuracy, especially for low inlet flow rate reflooding conditions. Through the preliminary assessment with six FEBA (Flooding Experiments with Blocked Arrays) tests, it is observed that the peak cladding temperature (PCT) is generally underestimated and bundle quench is predicted too early compared to the experiment data. In this paper, the improvement of constitutive models related to reflooding is carried out based on single parametric sensitivity analysis. Film boiling heat transfer model and interfacial friction model of dispersed flow are selected as the most influential models to the results of interests. Then studies and discussions are specifically focused on these sensitive models and proper modifications are recommended. These proposed improvements are implemented in RELAP5 code and assessed against FEBA experiment. Better agreement between calculations and measured data for both cladding temperature and quench time is obtained.

  13. Repeated Quantitative Urine Toxicology Analysis May Improve Chronic Pain Patient Compliance with Opioid Therapy.

    Science.gov (United States)

    Knezevic, Nebojsa Nick; Khan, Omar M; Beiranvand, Afsaneh; Candido, Kenneth D

    2017-02-01

    Even though serious efforts have been undertaken by different medical societies to reduce opioid use for treating chronic benign pain, many Americans continue to seek pain relief through opioid consumption. Assuring compliance of these patients may be a difficult aspect of proper management even with regular behavioral monitoring. The purpose of this study was to accurately assess the compliance of chronic opioid-consuming patients in an outpatient setting and evaluate if utilizing repeated urine drug testing (UDT) could improve compliance. Retrospective analysis of prospectively collected data. Outpatient pain management clinic. After Institutional Review Board (IRB) approval, a retrospective analysis of data for 500 patients was conducted. We included patients who were aged 18 years and older who were treated with opioid analgesic medication for chronic pain. Patients were asked to provide supervised urine toxicology specimens during their regular clinic visits, and were asked to do so without prior notification. The specimens were sent to an external laboratory for quantitative testing using liquid chromatography-tandem mass spectrometry. Three hundred and eighty-six (77.2%) patients were compliant with prescribed medications and did not use any illicit drugs or undeclared medications. Forty-one (8.2%) patients tested positive for opioid medication(s) that were not prescribed in our clinic; 8 (1.6%) of the patients were positive for medication that was not prescribed by any physician and was not present in the Illinois Prescription Monitoring Program; 5 (1%) patients tested negative for prescribed opioids; and 60 (12%) patients were positive for illicit drugs (8.6% marijuana, 3.2% cocaine, 0.2% heroin). Repeated UDTs following education and disclosure, showed 49 of the 77 patients (63.6%) had improved compliance. This was a single-site study and we normalized concentrations of opioids in urine with creatinine levels while specific gravity normalization was not

  14. Improving student satisfaction of Andalas University Dormitory through Service Quality and Importance Performance Analysis

    Science.gov (United States)

    Putri, Nilda Tri; Anggraini, Larisa

    2018-03-01

    Residential satisfaction in university dormitories serves as one of the significant aspects in the framework of sustainability in higher education. This research investigated the quality of dormitory services in the Andalas University dormitory based on student satisfaction. According to the residential management, residential student enrollment has increased gradually at Andalas University. In 2016, the residential capacity was 1686 students, but only 1081 students could stay at the dormitory because some rooms were in bad condition. There are many problems and complaints regarding the dormitory's service quality, i.e., water problems, leaky rooms and bathrooms, cleanliness and inadequate facilities in the residential college. In addition, 20% of last year's residential students checked out before their contracts ran out. The aims of this research are to understand the level of the GAP that exists between the expectations and perceptions of residential students in the context of service quality and to evaluate improvement priority services using Importance Performance Analysis. This study measures service quality using the Responsiveness, Assurance, Empathy, Reliability and Tangible dimensions. A negative GAP indicates that the actual services fall short of what was expected, and the GAP highlights areas for improvement. Based on IPA, management should improve the following service dimensions: responsiveness, tangible and assurance.

  15. Designing small universal k-mer hitting sets for improved analysis of high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Yaron Orenstein

    2017-10-01

    Full Text Available With the rapidly increasing volume of deep sequencing data, more efficient algorithms and data structures are needed. Minimizers are a central recent paradigm that has improved various sequence analysis tasks, including hashing for faster read overlap detection, sparse suffix arrays for creating smaller indexes, and Bloom filters for speeding up sequence search. Here, we propose an alternative paradigm that can lead to substantial further improvement in these and other tasks. For integers k and L > k, we say that a set of k-mers is a universal hitting set (UHS) if every possible L-long sequence must contain a k-mer from the set. We develop a heuristic called DOCKS to find a compact UHS, which works in two phases: the first phase is solved optimally, and for the second we propose several efficient heuristics, trading set size for speed and memory. The use of heuristics is motivated by showing the NP-hardness of a closely related problem. We show that DOCKS works well in practice and produces UHSs that are very close to a theoretical lower bound. We present results for various values of k and L, and by applying them to real genomes we show that UHSs indeed improve over minimizers. In particular, DOCKS uses less than 30% of the 10-mers needed to span the human genome compared to minimizers. The software and computed UHSs are freely available at github.com/Shamir-Lab/DOCKS/ and acgt.cs.tau.ac.il/docks/, respectively.
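
    The UHS definition can be checked directly for toy parameters: the sketch below verifies, by brute force over all L-long sequences, whether a given k-mer set hits every one of them. The parameters are tiny on purpose; DOCKS itself targets much larger k and L where enumeration is infeasible.

      from itertools import product

      def is_universal_hitting_set(kmers, k, L):
          """Return (True, None) if every L-long ACGT sequence contains a k-mer from the set."""
          kmer_set = set(kmers)
          for seq in product("ACGT", repeat=L):
              s = "".join(seq)
              if not any(s[i:i + k] in kmer_set for i in range(L - k + 1)):
                  return False, s            # witness sequence missed by the set
          return True, None

      # Example: the 2-mers beginning with 'A' are not a UHS for k=2, L=4,
      # because some 4-long sequences contain no window starting with 'A'.
      ok, witness = is_universal_hitting_set({"AA", "AC", "AG", "AT"}, k=2, L=4)
      print(ok, witness)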

  16. Improving food safety within the dairy chain: an application of conjoint analysis.

    Science.gov (United States)

    Valeeva, N I; Meuwissen, M P M; Lansink, A G J M Oude; Huirne, R B M

    2005-04-01

    This study determined the relative importance of attributes of food safety improvement in the production chain of fluid pasteurized milk. The chain was divided into 4 blocks: "feed" (compound feed production and its transport), "farm" (dairy farm), "dairy processing" (transport and processing of raw milk, delivery of pasteurized milk), and "consumer" (retailer/catering establishment and pasteurized milk consumption). The concept of food safety improvement focused on 2 main groups of hazards: chemical (antibiotics and dioxin) and microbiological (Salmonella, Escherichia coli, Mycobacterium paratuberculosis, and Staphylococcus aureus). Adaptive conjoint analysis was used to investigate food safety experts' perceptions of the attributes' importance. Preference data from individual experts (n = 24) on 101 attributes along the chain were collected in a computer-interactive mode. Experts perceived the attributes from the "feed" and "farm" blocks as being more vital for controlling the chemical hazards; whereas the attributes from the "farm" and "dairy processing" were considered more vital for controlling the microbiological hazards. For the chemical hazards, "identification of treated cows" and "quality assurance system of compound feed manufacturers" were considered the most important attributes. For the microbiological hazards, these were "manure supply source" and "action in salmonellosis and M. paratuberculosis cases". The rather high importance of attributes relating to quality assurance and traceability systems of the chain participants indicates that participants look for food safety assurance from the preceding participants. This information has substantial decision-making implications for private businesses along the chain and for the government regarding the food safety improvement of fluid pasteurized milk.

  17. Analysis of Technological Innovation and Environmental Performance Improvement in Aviation Sector

    Science.gov (United States)

    Lee, Joosung; Mo, Jeonghoon

    2011-01-01

    The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector—aircraft manufacturers and airlines—has also made significant efforts to improve the fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s while the high oil prices in the 1970s and on did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be the feasible option with a meaningful reduction in aviation’s lifecycle environmental impact if they can achieve sufficient economies of scale. PMID:22016716

  18. Analysis of technological innovation and environmental performance improvement in aviation sector.

    Science.gov (United States)

    Lee, Joosung; Mo, Jeonghoon

    2011-09-01

    The past oil crises have caused dramatic improvements in fuel efficiency in all industrial sectors. The aviation sector-aircraft manufacturers and airlines-has also made significant efforts to improve the fuel efficiency through more advanced jet engines, high-lift wing designs, and lighter airframe materials. However, the innovations in energy-saving aircraft technologies do not coincide with the oil crisis periods. The largest improvement in aircraft fuel efficiency took place in the 1960s while the high oil prices in the 1970s and on did not induce manufacturers or airlines to achieve a faster rate of innovation. In this paper, we employ a historical analysis to examine the socio-economic reasons behind the relatively slow technological innovation in aircraft fuel efficiency over the last 40 years. Based on the industry and passenger behaviors studied and prospects for alternative fuel options, this paper offers insights for the aviation sector to shift toward more sustainable technological options in the medium term. Second-generation biofuels could be the feasible option with a meaningful reduction in aviation's lifecycle environmental impact if they can achieve sufficient economies of scale.

  19. Genome shuffling of Propionibacterium shermanii for improving vitamin B12 production and comparative proteome analysis.

    Science.gov (United States)

    Zhang, Ying; Liu, Jian-Zhong; Huang, Jun-Sheng; Mao, Zong-Wan

    2010-07-20

    Genome shuffling is an efficient approach for the rapid improvement of microbial phenotype. Here we improved the vitamin B12 production of Propionibacterium shermanii by genome shuffling based on inactivated protoplast fusion. A genome-shuffled strain with a vitamin B12 titer of 2.85 mgl(-1), named Propionibacterium shermanii F2-3, was obtained. The genome-shuffled strain produced about 61% more vitamin B12 than the parent strain after 96 h. Comparative analysis of the proteome profiles was conducted between Propionibacterium shermanii 17 and F2-3. The expression levels of 38 proteins varied significantly in the genome-shuffled strain compared with those in the parent strain. Of these proteins, 22 were up-regulated and 16 were down-regulated. Of the up-regulated proteins, 6 (glutaminyl-tRNA synthetase (GlnS), delta-aminolevulinic acid dehydratase (HemB), methionine synthase (MetH), riboflavin synthase (RibE), phosphofructokinase (PfkA) and isocitrate dehydrogenase (Icd)) are involved in the vitamin B12 biosynthesis pathway. They may be the key enzymes of vitamin B12 biosynthesis. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  20. Energy spectrum analysis of blast waves based on an improved Hilbert-Huang transform

    Science.gov (United States)

    Li, L.; Wang, F.; Shang, F.; Jia, Y.; Zhao, C.; Kong, D.

    2017-05-01

    Using the improved Hilbert-Huang transform (HHT), this paper investigates the problems of analysis and interpretation of the energy spectrum of a blast wave. It has been previously established that the energy spectrum is an effective feature by which to characterize a blast wave. In fact, the higher the energy spectra in a frequency band of a blast wave, the greater the damage to a target in the same frequency band. However, most current research focuses on analyzing wave signals in the time domain or frequency domain rather than considering the energy spectrum. We propose here an improved HHT method combined with a wavelet packet to extract the energy spectrum feature of a blast wave. When applying the HHT, the signal is first roughly decomposed into a series of intrinsic mode functions (IMFs) by empirical mode decomposition. The wavelet packet method is then performed on each IMF to eliminate noise on the energy spectrum. Second, a coefficient is introduced to remove unrelated IMFs. The energy of each instantaneous frequency can be derived through the Hilbert transform. The energy spectrum can then be obtained by adding up all the components after the wavelet packet filters and screens them through a coefficient to obtain the effective IMFs. The effectiveness of the proposed method is demonstrated by 12 groups of experimental data, and an energy attenuation model is established based on the experimental data. The improved HHT is a precise method for blast wave signal analysis. For other shock wave signals from blasting experiments, an energy frequency time distribution and energy spectrum can also be obtained through this method, allowing for more practical applications.
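
    The Hilbert stage described above can be sketched independently of the decomposition: the analytic signal of each retained IMF yields instantaneous amplitude and frequency, and squared amplitude is binned by frequency to build an energy spectrum. Here a single synthetic, decaying component stands in for the screened IMFs, and the 100 Hz bins are an assumption.

      import numpy as np
      from scipy.signal import hilbert

      fs = 10_000.0
      t = np.arange(0, 0.1, 1.0 / fs)
      imf = np.exp(-40 * t) * np.sin(2 * np.pi * 800 * t)    # decaying blast-like component

      analytic = hilbert(imf)
      amplitude = np.abs(analytic)
      phase = np.unwrap(np.angle(analytic))
      inst_freq = np.diff(phase) / (2 * np.pi) * fs           # instantaneous frequency (Hz)

      bins = np.arange(0, 2000, 100)                          # 100 Hz energy bins (assumed)
      energy, _ = np.histogram(inst_freq, bins=bins, weights=amplitude[:-1] ** 2)
      print(dict(zip(bins[:-1].astype(int), np.round(energy, 4))))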

  1. Life cycle assessment of Italian citrus-based products. Sensitivity analysis and improvement scenarios.

    Science.gov (United States)

    Beccali, Marco; Cellura, Maurizio; Iudicello, Maria; Mistretta, Marina

    2010-07-01

    Though many studies concern the agro-food sector in the EU and Italy, and its environmental impacts, literature is quite lacking in works regarding LCA application on citrus products. This paper represents one of the first studies on the environmental impacts of citrus products in order to suggest feasible strategies and actions to improve their environmental performance. In particular, it is part of a research aimed to estimate environmental burdens associated with the production of the following citrus-based products: essential oil, natural juice and concentrated juice from oranges and lemons. The life cycle assessment of these products, published in a previous paper, had highlighted significant environmental issues in terms of energy consumption, associated CO(2) emissions, and water consumption. Starting from such results the authors carry out an improvement analysis of the assessed production system, whereby sustainable scenarios for saving water and energy are proposed to reduce environmental burdens of the examined production system. In addition, a sensitivity analysis to estimate the effects of the chosen methods will be performed, giving data on the outcome of the study. Uncertainty related to allocation methods, secondary data sources, and initial assumptions on cultivation, transport modes, and waste management is analysed. The results of the performed analyses allow stating that every assessed eco-profile is differently influenced by the uncertainty study. Different assumptions on initial data and methods showed very sensible variations in the energy and environmental performances of the final products. Besides, the results show energy and environmental benefits that clearly state the improvement of the products eco-profile, by reusing purified water use for irrigation, using the railway mode for the delivery of final products, when possible, and adopting efficient technologies, as the mechanical vapour recompression, in the pasteurisation and

  2. Developing person-centred analysis of harm in a paediatric hospital: a quality improvement report.

    Science.gov (United States)

    Lachman, Peter; Linkson, Lynette; Evans, Trish; Clausen, Henning; Hothi, Daljit

    2015-05-01

    The provision of safe care is complex and difficult to achieve. Awareness of what happens in real time is one of the ways to develop a safe system within a culture of safety. At Great Ormond Street Hospital, we developed and tested a tool specifically designed for patients and families to report harm, with the aim of raising awareness and opportunities for staff to continually improve and provide safe care. Over a 10-month period, we developed processes to report harm. We used the Model for Improvement and multiple Plan, Do, Study, Act cycles for testing. We measured changes using culture surveys as well as analysis of the reports. The tool was tested in different formats and moved from a provider centric to a person-centred tool analysed in real time. An independent person working with the families was best placed to support reporting. Immediate feedback to families was managed by senior staff, and provided the opportunity for clarification, transparency and apologies. Feedback to staff provided learning opportunities. Improvements in culture climate and staff reporting were noted in the short term. The integration of patient involvement in safety monitoring systems is essential to achieve safety. The high number of newly identified 'near-misses' and 'critical incidents' by families demonstrated an underestimation of potentially harmful events. This testing and introduction of a self-reporting, real-time bedside tool has led to active engagement with families and patients and raised situation awareness. We believe that this will lead to improved and safer care in the longer term. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  3. Crossing the Barriers: An Analysis of Land Access Barriers to Geothermal Development and Potential Improvement Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Katherine R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-10-04

    Developers have identified many non-technical barriers to geothermal power development, including access to land. Activities required for accessing land, such as environmental review and private and public leasing can take a considerable amount of time and can delay or prevent project development. This paper discusses the impacts to available geothermal resources and deployment caused by land access challenges, including tribal and cultural resources, environmentally sensitive areas, biological resources, land ownership, federal and state lease queues, and proximity to military installations. In this analysis, we identified challenges that have the potential to prevent development of identified and undiscovered hydrothermal geothermal resources. We found that an estimated 400 MW of identified geothermal resource potential and 4,000 MW of undiscovered geothermal resource potential were either unallowed for development or contained one or more significant barriers that could prevent development at the site. Potential improvement scenarios that could be employed to overcome these barriers include (1) providing continuous funding to the U.S. Forest Service (USFS) for processing geothermal leases and permit applications and (2) the creation of advanced environmental mitigation measures. The model results forecast that continuous funding to the USFS could result in deployment of an additional 80 MW of geothermal capacity by 2030 and 124 MW of geothermal capacity by 2050 when compared to the business-as-usual scenario. The creation of advanced environmental mitigation measures coupled with continuous funding to the USFS could result in deployment of an additional 97 MW of geothermal capacity by 2030 and 152 MW of geothermal capacity by 2050 when compared to the business-as-usual scenario. The small impact on potential deployment in these improvement scenarios suggests that these 4,400 MW have other barriers to development in addition to land access. In other words, simply

  4. Music-assisted relaxation to improve sleep quality: meta-analysis.

    Science.gov (United States)

    de Niet, Gerrit; Tiemens, Bea; Lendemeijer, Bert; Hutschemaekers, Giel

    2009-07-01

    This paper is a report of a meta-analysis conducted to evaluate the efficacy of music-assisted relaxation for sleep quality in adults and elders with sleep complaints with or without a co-morbid medical condition. Clinical studies have shown that music can influence treatment outcome in a positive and beneficial way. Music holds the promise of counteracting psychological presleep arousal and thus improving the preconditions for sleep. We conducted a search in the Embase (1997 - July 2008), Medline (1950 - July 2008), Cochrane (2000 - July 2008), Psychinfo (1987 - July 2008) and Cinahl (1982 - July 2008) databases for randomized controlled trials reported in English, German, French and Dutch. The outcome measure of interest was sleep quality. Data were extracted from the included studies using predefined data fields. The researchers independently assessed the quality of the trials using the Delphi list. Only studies with a score of 5 points or higher were included. A pooled analysis was performed based on a fixed effect model. Five randomized controlled trials with six treatment conditions and a total of 170 participants in intervention groups and 138 controls met our inclusion criteria. Music-assisted relaxation had a moderate effect on the sleep quality of patients with sleep complaints (standardized mean difference, -0.74; 95% CI: -0.96, -0.46). Subgroup analysis revealed no statistically significant contribution of accompanying measures. Music-assisted relaxation can be used without intensive investment in training and materials and is therefore cheap, easily available and can be used by nurses to promote music-assisted relaxation to improve sleep quality.
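
    The fixed-effect pooling step reported above (inverse-variance weighting of standardized mean differences) can be written in a few lines; the per-study SMDs and variances below are hypothetical, not the values from the five included trials.

      import numpy as np

      smd = np.array([-0.9, -0.6, -0.8, -0.5, -0.7])   # hypothetical per-study effect sizes
      var = np.array([0.10, 0.08, 0.12, 0.09, 0.11])   # hypothetical per-study variances

      w = 1.0 / var                                     # inverse-variance weights
      pooled = np.sum(w * smd) / np.sum(w)
      se = np.sqrt(1.0 / np.sum(w))
      ci = (pooled - 1.96 * se, pooled + 1.96 * se)
      print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")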

  5. An improved method for Multipath Hemispherical Map (MHM) based on Trend Surface Analysis

    Science.gov (United States)

    Wang, Zhiren; Chen, Wen; Dong, Danan; Yu, Chao

    2017-04-01

    Among the various approaches developed for detecting the multipath effect in high-accuracy GNSS positioning, only MHM (Multipath Hemispherical Map) and SF (Sidereal Filtering) can be implemented in real-time GNSS data processing. SF is based on the time repeatability of the satellite geometry and is suitable only for static environments, while the spatiotemporal repeatability-based MHM is applicable not only to static environments but also to dynamic carriers with a static multipath environment, such as ships and airplanes, and utilizes a much smaller number of parameters than ASF. However, the MHM method also has certain defects. Since MHM takes the mean of the residuals in each grid cell as the filter value, it is more suitable when the multipath regime is of medium to low frequency. Existing research indicates that, by contrast, the newly advanced Sidereal Filtering (ASF) method performs better than MHM at high-frequency multipath reduction. To solve the above problem and improve MHM's performance on high-frequency multipath, we combined a binary trend surface analysis method with the original MHM model to effectively capture the particular spatial distribution and variation trends of the multipath effect. We computed trend surfaces of the residuals within each grid cell by least-squares procedures and chose the best results through a successive test. The enhanced MHM grid was constructed from the set of coefficients of the fitted equation instead of the mean value. According to the analysis of actual observations, the improved MHM model shows a positive effect on high-frequency multipath reduction and significantly reduces the root mean square (RMS) value of the carrier residuals. Keywords: Trend Surface Analysis; Multipath Hemispherical Map; high frequency multipath effect
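
    The core idea, replacing a grid cell's mean residual with a fitted trend surface, can be sketched with ordinary least squares on a second-order polynomial in the within-cell coordinates. The synthetic azimuth/elevation residuals and the quadratic form are illustrative assumptions, not the paper's exact model-selection procedure.

      import numpy as np

      rng = np.random.default_rng(1)
      az = rng.uniform(0, 5, 200)                    # azimuth within one grid cell (deg)
      el = rng.uniform(30, 35, 200)                  # elevation within the cell (deg)
      resid = 0.002 * az - 0.001 * el + 0.0005 * az * el + rng.normal(0, 0.001, 200)

      # Design matrix for z = b0 + b1*x + b2*y + b3*x^2 + b4*x*y + b5*y^2
      A = np.column_stack([np.ones_like(az), az, el, az ** 2, az * el, el ** 2])
      coef, *_ = np.linalg.lstsq(A, resid, rcond=None)

      def multipath_correction(azimuth, elevation):
          x = np.array([1.0, azimuth, elevation, azimuth ** 2,
                        azimuth * elevation, elevation ** 2])
          return x @ coef      # correction predicted by the trend surface, not the cell mean

      print(multipath_correction(2.5, 32.0))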

  6. Texture analysis improves level set segmentation of the anterior abdominal wall

    International Nuclear Information System (INIS)

    Xu, Zhoubing; Allen, Wade M.; Baucom, Rebeccah B.; Poulose, Benjamin K.; Landman, Bennett A.

    2013-01-01

    Purpose: The treatment of ventral hernias (VH) has been a challenging problem for medical care. Repair of these hernias is fraught with failure; recurrence rates ranging from 24% to 43% have been reported, even with the use of biocompatible mesh. Currently, computed tomography (CT) is used to guide intervention through expert, but qualitative, clinical judgments; notably, quantitative metrics based on image processing are not used. The authors propose that image segmentation methods to capture the three-dimensional structure of the abdominal wall and its abnormalities will provide a foundation on which to measure geometric properties of hernias and surrounding tissues and, therefore, to optimize intervention. Methods: In this study with 20 clinically acquired CT scans on postoperative patients, the authors demonstrated a novel approach to geometric classification of the abdominal wall. The authors’ approach uses a texture analysis based on Gabor filters to extract feature vectors and follows a fuzzy c-means clustering method to estimate voxelwise probability memberships for eight clusters. The memberships estimated from the texture analysis are helpful to identify anatomical structures with inhomogeneous intensities. The membership was used to guide the level set evolution, as well as to derive an initial start close to the abdominal wall. Results: Segmentation results on abdominal walls were both quantitatively and qualitatively validated with surface errors based on manually labeled ground truth. Using texture, mean surface errors for the outer surface of the abdominal wall were less than 2 mm, with 91% of the outer surface less than 5 mm away from the manual tracings; errors were significantly greater (2–5 mm) for methods that did not use the texture. Conclusions: The authors’ approach establishes a baseline for characterizing the abdominal wall for improving VH care. Inherent texture patterns in CT scans are helpful to the tissue classification, and texture
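
    The texture stage can be sketched as follows: a small Gabor filter bank converts a CT slice into per-voxel feature vectors, which are then clustered into tissue memberships. The slice is synthetic, the filter-bank parameters are assumptions, and K-means is used here as a simple stand-in for the fuzzy c-means step described in the record.

      import numpy as np
      from skimage.filters import gabor
      from sklearn.cluster import KMeans

      slice_img = np.random.rand(128, 128)           # stand-in for one abdominal CT slice

      features = []
      for frequency in (0.1, 0.2, 0.4):              # assumed filter-bank frequencies
          for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
              real, imag = gabor(slice_img, frequency=frequency, theta=theta)
              features.append(np.hypot(real, imag))  # magnitude response per voxel
      X = np.stack(features, axis=-1).reshape(-1, len(features))

      labels = KMeans(n_clusters=8, n_init=4, random_state=0).fit_predict(X)
      membership_map = labels.reshape(slice_img.shape)
      print(np.bincount(labels))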

  7. Improving primary health care facility performance in Ghana: efficiency analysis and fiscal space implications.

    Science.gov (United States)

    Novignon, Jacob; Nonvignon, Justice

    2017-06-12

    Health centers in Ghana play an important role in health care delivery, especially in deprived communities. They usually serve as the first line of service and meet basic health care needs. Unfortunately, these facilities are faced with inadequate resources. While health policy makers seek to increase resources committed to primary healthcare, it is important to understand the nature of inefficiencies that exist in these facilities. Therefore, the objectives of this study are threefold: (i) estimate efficiency among primary health facilities (health centers), (ii) examine the potential fiscal space from improved efficiency and (iii) investigate the efficiency disparities in public and private facilities. Data were from the 2015 Access Bottlenecks, Cost and Equity (ABCE) project conducted by the Institute for Health Metrics and Evaluation. Stochastic Frontier Analysis (SFA) was used to estimate the efficiency of health facilities. Efficiency scores were then used to compute potential savings from improved efficiency. Outpatient visits were used as the output, while the number of personnel, hospital beds, expenditure on other capital items and administration were used as inputs. Disparities in efficiency between public and private facilities were estimated using the Nopo matching decomposition procedure. The average efficiency score across all health centers included in the sample was estimated to be 0.51. Also, average efficiency was estimated to be about 0.65 and 0.50 for private and public facilities, respectively. Significant disparities in efficiency were identified across the various administrative regions. With regard to potential fiscal space, we found that, on average, facilities could save about GH₵11,450.70 (US$7633.80) if efficiency were improved. We also found that fiscal space from efficiency gains varies across rural/urban as well as private/public facilities, if best practices are followed. The matching decomposition showed an efficiency gap of 0.29 between private

  8. An improved model for whole genome phylogenetic analysis by Fourier transform.

    Science.gov (United States)

    Yin, Changchuan; Yau, Stephen S-T

    2015-10-07

    and demonstrates that the improved DFT dissimilarity measure is an efficient and effective similarity measure of DNA sequences. Due to its high efficiency and accuracy, the proposed DFT similarity measure is successfully applied on phylogenetic analysis for individual genes and large whole bacterial genomes. Copyright © 2015 Elsevier Ltd. All rights reserved.
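
    A hedged sketch of a DFT-based dissimilarity between DNA sequences, in the spirit of the record above but not reproducing the paper's exact even-scaling step: each base is mapped to a binary indicator signal, sequences are zero-padded to a common length, and normalized power spectra are compared with a Euclidean distance.

      import numpy as np

      def dft_power_spectrum(seq, length):
          spectra = []
          for base in "ACGT":
              indicator = np.array([1.0 if b == base else 0.0 for b in seq])
              padded = np.zeros(length)
              padded[: indicator.size] = indicator
              spectra.append(np.abs(np.fft.fft(padded)) ** 2)
          return np.concatenate(spectra)

      def dft_distance(seq1, seq2):
          n = max(len(seq1), len(seq2))
          p1, p2 = dft_power_spectrum(seq1, n), dft_power_spectrum(seq2, n)
          return np.linalg.norm(p1 / p1.sum() - p2 / p2.sum())

      print(dft_distance("ATGCGTACGTTAGC", "ATGCGTACGATAGC"))   # similar pair
      print(dft_distance("ATGCGTACGTTAGC", "GGGGCCCCAAAATT"))   # dissimilar pair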

  9. Effects of improved modeling on best estimate BWR severe accident analysis

    International Nuclear Information System (INIS)

    Hyman, C.R.; Ott, L.J.

    1984-01-01

    Since 1981, ORNL has completed best estimate studies analyzing several dominant BWR accident scenarios. These scenarios were identified by early Probabilistic Risk Assessment (PRA) studies and detailed ORNL analysis complements such studies. In performing these studies, ORNL has used the MARCH code extensively. ORNL investigators have identified several deficiencies in early versions of MARCH with regard to BWR modeling. Some of these deficiencies appear to have been remedied by the most recent release of the code. It is the purpose of this paper to identify several of these deficiencies. All the information presented concerns the degraded core thermal/hydraulic analysis associated with each of the ORNL studies. This includes calculations of the containment response. The period of interest is from the time of permanent core uncovery to the end of the transient. Specific objectives include the determination of the extent of core damage and timing of major events (i.e., onset of Zr/H2O reaction, initial clad/fuel melting, loss of control blade structure, etc.). As mentioned previously the major analysis tool used thus far was derived from an early version of MARCH. BWRs have unique features which must be modeled for best estimate severe accident analysis. ORNL has developed and incorporated into its version of MARCH several improved models. These include (1) channel boxes and control blades, (2) SRV actuations, (3) vessel water level, (4) multi-node analysis of in-vessel water inventory, (5) comprehensive hydrogen and water properties package, (6) first order correction to the ideal gas law, and (7) separation of fuel and cladding. Ongoing and future modeling efforts are required. These include (1) detailed modeling for the pressure suppression pool, (2) incorporation of B4C/steam reaction models, (3) phenomenological model of corium mass transport, and (4) advanced corium/concrete interaction modeling. 10 references, 17 figures, 1 table

  10. Sensitivity analysis of thermodynamic properties of liquid water: a general approach to improve empirical potentials.

    Science.gov (United States)

    Iordanov, Tzvetelin D; Schenter, Gregory K; Garrett, Bruce C

    2006-01-19

    A sensitivity analysis of bulk water thermodynamics is presented in an effort to understand the relation between qualitative features of molecular potentials and properties that they predict. The analysis is incorporated in molecular dynamics simulations and investigates the sensitivity of the Helmholtz free energy, internal energy, entropy, heat capacity, pressure, thermal pressure coefficient, and static dielectric constant to components of the potential rather than the parameters of a given functional form. The sensitivities of the properties are calculated with respect to the van der Waals repulsive and the attractive parts, plus short- and long-range Coulomb parts of three four site empirical water potentials: TIP4P, Dang-Chang and TTM2R. The polarization sensitivity is calculated for the polarizable Dang-Chang and TTM2R potentials. This new type of analysis allows direct comparisons of the sensitivities for different potentials that use different functional forms. The analysis indicates that all investigated properties are most sensitive to the van der Waals repulsive, the short-range Coulomb and the polarization components of the potentials. When polarization is included in the potentials, the magnitude of the sensitivity of the Helmholtz free energy, internal energy, and entropy with respect to this part of the potential is comparable in magnitude to the other electrostatic components. In addition similarities in trends of observed sensitivities for nonpolarizable and polarizable potentials lead to the conclusion that the complexity of the model is not of critical importance for the calculation of these thermodynamic properties for bulk water. The van der Waals attractive and the long-range Coulomb sensitivities are relatively small for the entropy, heat capacity, thermal pressure coefficient and the static dielectric constant, while small changes in any of the potential contributions will significantly affect the pressure. The analysis suggests a procedure

  11. Flipped classroom improves student learning in health professions education: a meta-analysis.

    Science.gov (United States)

    Hew, Khe Foon; Lo, Chung Kwan

    2018-03-15

    The use of flipped classroom approach has become increasingly popular in health professions education. However, no meta-analysis has been published that specifically examines the effect of flipped classroom versus traditional classroom on student learning. This study examined the findings of comparative articles through a meta-analysis in order to summarize the overall effects of teaching with the flipped classroom approach. We focused specifically on a set of flipped classroom studies in which pre-recorded videos were provided before face-to-face class meetings. These comparative articles focused on health care professionals including medical students, residents, doctors, nurses, or learners in other health care professions and disciplines (e.g., dental, pharmacy, environmental or occupational health). Using predefined study eligibility criteria, seven electronic databases were searched in mid-April 2017 for relevant articles. Methodological quality was graded using the Medical Education Research Study Quality Instrument (MERSQI). Effect sizes, heterogeneity estimates, analysis of possible moderators, and publication bias were computed using the COMPREHENSIVE META-ANALYSIS software. A meta-analysis of 28 eligible comparative studies (between-subject design) showed an overall significant effect in favor of flipped classrooms over traditional classrooms for health professions education (standardized mean difference, SMD = 0.33, 95% confidence interval, CI = 0.21-0.46, p flipped classroom approach was more effective when instructors used quizzes at the start of each in-class session. More respondents reported they preferred flipped to traditional classrooms. Current evidence suggests that the flipped classroom approach in health professions education yields a significant improvement in student learning compared with traditional teaching methods.

  12. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used for identification of the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After reporting of the mentioned event, the details of the incident were elaborated through reviewing records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the mentioned event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also points out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety.

  13. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.
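
    The primary outcome measure described above, rank correlation between the longitudinal volume change and a clinical variable, amounts to a Spearman correlation; the sketch below uses synthetic volume changes and an MSFC-like score rather than the study data.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(2)
      volume_change = rng.normal(0.1, 0.5, 247)                         # hypothetical, mL
      clinical_score = -0.3 * volume_change + rng.normal(0, 0.6, 247)   # MSFC-like variable

      rho, p_value = spearmanr(volume_change, clinical_score)
      print(f"Spearman rho = {rho:.3f}, p = {p_value:.3g}")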

  14. Application of computational fluid dynamics methods to improve thermal hydraulic code analysis

    Science.gov (United States)

    Sentell, Dennis Shannon, Jr.

    A computational fluid dynamics code is used to model the primary natural circulation loop of a proposed small modular reactor for comparison to experimental data and best-estimate thermal-hydraulic code results. Recent advances in computational fluid dynamics code modeling capabilities make them attractive alternatives to the current conservative approach of coupled best-estimate thermal hydraulic codes and uncertainty evaluations. The results from a computational fluid dynamics analysis are benchmarked against the experimental test results of a 1:3 length, 1:254 volume, full pressure and full temperature scale small modular reactor during steady-state power operations and during a depressurization transient. A comparative evaluation of the experimental data, the thermal hydraulic code results and the computational fluid dynamics code results provides an opportunity to validate the best-estimate thermal hydraulic code's treatment of a natural circulation loop and provide insights into expanded use of the computational fluid dynamics code in future designs and operations. Additionally, a sensitivity analysis is conducted to determine those physical phenomena most impactful on operations of the proposed reactor's natural circulation loop. The combination of the comparative evaluation and sensitivity analysis provides the resources for increased confidence in model developments for natural circulation loops and provides for reliability improvements of the thermal hydraulic code.

  15. Research on the improvement of nuclear safety -The development of a severe accident analysis code-

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Heui Dong; Cho, Sung Won; Park, Jong Hwa; Hong, Sung Wan; Yoo, Dong Han; Hwang, Moon Kyoo; Noh, Kee Man; Song, Yong Man [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    For the prevention and mitigation of containment failure during a severe accident, this study focuses on severe accident phenomena, especially those occurring inside the cavity, and is intended to improve existing models and develop analytical tools for the assessment of severe accidents. A correlation equation for the flame velocity of premixed H{sub 2}/air/steam gas has been suggested, and combustion flame characteristics were analyzed using a developed computer code. For the analysis of the expansion phase of a vapor explosion, a mechanical model has been developed. The development of a debris entrainment model in a reactor cavity with captured volume has been continued in order to review and examine the limitations and deficiencies of the existing models. A pre-test calculation was performed to support the severe accident experiment for the molten corium concrete interaction study, and the crust formation process and heat transfer characteristics of the crust have been investigated. A stress analysis code was developed using the finite element method for the reactor vessel lower head failure analysis. Through the international PHEBUS-FP program and participation in the software development, research on the core degradation process and fission product release and transport is ongoing. The CONTAIN and MELCOR codes were continuously updated under the cooperation with the USNRC, and French-developed computer codes such as ICARE2, ESCADRE and SOPHAEROS were also installed on the SUN workstation. 204 figs, 61 tabs, 87 refs. (Author).

  16. Analysis and improvement of digital control stability for master-slave manipulator system

    International Nuclear Information System (INIS)

    Yoshida, Koichi; Yabuta, Tetsuro

    1992-01-01

    Some bilateral controls of master-slave systems have been designed which can realize high-fidelity telemanipulation, as if the operator were manipulating the object directly. While usual robot systems are controlled by a software-servo system using a digital computer, little work has been published on the design and analysis of digital control of these systems, which must consider the time delay of sensor signals and the zero-order-hold effect of command signals on actuators. This paper presents a digital control analysis for a single-degree-of-freedom master-slave system including impedance models of both the human operator and the task object, which clarifies an index for stability. The stability result suggests a virtual master-slave system concept, which improves digital control stability. We first analyze a dynamic control method of the master-slave system in discrete time with respect to the stability problem, a method which can realize high-fidelity telemanipulation in continuous time. Secondly, using the results of the stability analysis, a robust control scheme for the master-slave system is proposed, and the validity of this scheme is finally confirmed by simulation. Consequently, any combination of master and slave modules incorporating the dynamic models of these manipulators can be used to construct a stable master-slave system. (author)

  17. Improving the Measurement of Shared Cultural Schemas with Correlational Class Analysis: Theory and Method

    Directory of Open Access Journals (Sweden)

    Andrei Boutyline

    2017-05-01

    Full Text Available Measurement of shared cultural schemas is a central methodological challenge for the sociology of culture. Relational Class Analysis (RCA) is a recently developed technique for identifying such schemas in survey data. However, existing work lacks a clear definition of such schemas, which leaves RCA’s accuracy largely unknown. Here, I build on the theoretical intuitions behind RCA to arrive at this definition. I demonstrate that shared schemas should result in linear dependencies between survey rows—the relationship usually measured with Pearson’s correlation. I thus modify RCA into a “Correlational Class Analysis” (CCA). When I compare the methods using a broad set of simulations, results show that CCA is reliably more accurate at detecting shared schemas than RCA, even in scenarios that substantially violate CCA’s assumptions. I find no evidence of theoretical settings where RCA is more accurate. I then revisit a previous RCA analysis of the 1993 General Social Survey musical tastes module. Whereas RCA partitioned these data into three schematic classes, CCA partitions them into four. I compare these results with a multiple-groups analysis in structural equation modeling and find that CCA’s partition yields greatly improved model fit over RCA. I conclude with a parsimonious framework for future work.
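
    The key computational move described above, detecting shared schemas as linear dependencies between survey rows, can be sketched in a few lines. This is only a simplified illustration of the correlation step, not the published CCA algorithm; the toy data and the idea of thresholding absolute correlations to group respondents are assumptions.

```python
import numpy as np

def row_correlations(responses):
    """Absolute Pearson correlations between survey rows (respondents).

    responses: 2-D array with one row per respondent and one column per item.
    High absolute correlation between two rows means their answers are close
    to linearly dependent, i.e. candidates for sharing a cultural schema.
    """
    return np.abs(np.corrcoef(responses))  # np.corrcoef correlates rows by default

# Toy data: respondents 0-1 answer with one pattern, respondents 2-3 with another.
survey = np.array([
    [1, 2, 3, 4, 5],
    [2, 3, 4, 5, 6],
    [5, 4, 3, 2, 1],
    [5, 3, 3, 2, 1],
])
R = row_correlations(survey)
# Grouping rows whose pairwise |r| exceeds a cutoff yields correlational classes.
print(np.round(R, 2))
```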

  18. Reliability of multiresolution deconvolution for improving depth resolution in SIMS analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boulakroune, M’Hamed, E-mail: Boulakroune.mhamed@univ-ouargla.dz

    2016-11-15

    Highlights: • Recovery of SIMS profiles by enhancement of depth resolution using multiresolution deconvolution. • The multiresolution deconvolution is based on Tikhonov-Miller regularization and wavelet analysis. • Local application of the regularization parameter at each resolution level yields smoothed signals without noise-related artifacts. • The aim is to show the ability of multiresolution deconvolution to restore two extremely different structures, one large and one thin. • On the thin structure, deconvolution by zone was successfully applied. - Abstract: This paper deals with the effectiveness and reliability of a multiresolution deconvolution algorithm for the recovery of Secondary Ion Mass Spectrometry (SIMS) profiles altered by the measurement. This new algorithm is characterized as a regularized wavelet transform. It combines ideas from Tikhonov-Miller regularization, wavelet analysis and deconvolution algorithms in order to benefit from the advantages of each. The SIMS profiles were obtained by analysis of two structures of boron in a silicon matrix using a Cameca IMS-6f instrument at oblique incidence. The first structure is large, consisting of two distant wide boxes, and the second is a thin structure containing ten delta-layers, to which deconvolution by zone was applied. It is shown that this new multiresolution algorithm gives the best results. In particular, local application of the regularization parameter to the blurred and estimated solutions at each resolution level yielded smoothed signals without creating artifacts related to the noise content of the profile. This led to a significant improvement in depth resolution and peak maxima.
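
    To make the regularization idea concrete, the sketch below shows a single-resolution, frequency-domain Tikhonov-regularized deconvolution of a measured depth profile. It deliberately omits the wavelet decomposition and the level-by-level choice of the regularization parameter that distinguish the multiresolution algorithm described above; the response function and the value of the regularization parameter are assumptions.

```python
import numpy as np

def tikhonov_deconvolve(measured, response, alpha=1e-2):
    """Frequency-domain Tikhonov-regularised deconvolution (single resolution).

    measured : measured depth profile (1-D array)
    response : depth resolution function of the instrument, same length, centred
    alpha    : regularisation parameter trading resolution gain against noise
    """
    M = np.fft.fft(measured)
    H = np.fft.fft(np.fft.ifftshift(response))
    # Regularised inverse filter: H* / (|H|^2 + alpha)
    F = np.conj(H) / (np.abs(H) ** 2 + alpha)
    return np.real(np.fft.ifft(F * M))
```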

  19. Preparation of Improved Turkish DataSet for Sentiment Analysis in Social Media

    Directory of Open Access Journals (Sweden)

    Makinist Semiha

    2017-01-01

    Full Text Available A public dataset with a variety of properties suitable for sentiment analysis [1], event prediction, trend detection and other text mining applications is needed in order to be able to successfully perform analysis studies. The vast majority of data on social media is text-based, and it is not possible to apply machine learning processes directly to these raw data, since several different processes are required to prepare the data before the algorithms can be applied. For example, different misspellings of the same word enlarge the word vector space unnecessarily, thereby reducing the success of the algorithm and increasing the computational power requirement. This paper presents an improved Turkish dataset with an effective spelling correction algorithm based on Hadoop [2]. The collected data are recorded on the Hadoop Distributed File System, and the text-based data are processed using the MapReduce programming model. This method is suitable for the storage and processing of large-scale text-based social media data. In this study, movie reviews have been automatically recorded with Apache ManifoldCF (MCF) [3] and data clusters have been created. Various methods, such as Levenshtein distance and fuzzy string matching, have been compared for creating a public dataset from the collected data. Experimental results show that the proposed algorithm detects and corrects spelling errors successfully, and the resulting dataset can be used as an open-source dataset in sentiment analysis studies.
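
    As an illustration of the string-distance comparison mentioned above, the sketch below corrects a misspelled token against a small vocabulary using the classic Levenshtein distance. It is a toy stand-in for the Hadoop/MapReduce pipeline described in the paper; the vocabulary, threshold and example words are assumptions.

```python
def levenshtein(a: str, b: str) -> int:
    """Dynamic-programming edit distance between two words."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def correct(word, vocabulary, max_distance=2):
    """Return the closest vocabulary word within max_distance, else the word unchanged."""
    best = min(vocabulary, key=lambda v: levenshtein(word, v))
    return best if levenshtein(word, best) <= max_distance else word

print(correct("güsel", ["güzel", "film", "kötü"]))  # -> 'güzel'
```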

  20. In vivo dynamics of skeletal muscle Dystrophin in zebrafish embryos revealed by improved FRAP analysis.

    Science.gov (United States)

    Bajanca, Fernanda; Gonzalez-Perez, Vinicio; Gillespie, Sean J; Beley, Cyriaque; Garcia, Luis; Theveneau, Eric; Sear, Richard P; Hughes, Simon M

    2015-10-13

    Dystrophin forms an essential link between sarcolemma and cytoskeleton, perturbation of which causes muscular dystrophy. We analysed Dystrophin binding dynamics in vivo for the first time. Within maturing fibres of host zebrafish embryos, our analysis reveals a pool of diffusible Dystrophin and complexes bound at the fibre membrane. Combining modelling, an improved FRAP methodology and direct semi-quantitative analysis of bleaching suggests the existence of two membrane-bound Dystrophin populations with widely differing bound lifetimes: a stable, tightly bound pool, and a dynamic bound pool with high turnover rate that exchanges with the cytoplasmic pool. The three populations were found consistently in human and zebrafish Dystrophins overexpressed in wild-type or dmd(ta222a/ta222a) zebrafish embryos, which lack Dystrophin, and in Gt(dmd-Citrine)(ct90a) that express endogenously-driven tagged zebrafish Dystrophin. These results lead to a new model for Dystrophin membrane association in developing muscle, and highlight our methodology as a valuable strategy for in vivo analysis of complex protein dynamics.
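
    The population structure described above (a diffusible pool, a rapidly exchanging bound pool and a stably bound pool) is the kind of model that can be fitted to a normalised FRAP recovery curve. The sketch below fits a simple two-exponential recovery with an immobile fraction; the functional form, parameter names and toy data are assumptions for illustration, not the authors' published fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def frap_recovery(t, a_fast, k_fast, a_slow, k_slow, immobile):
    """Two-component recovery plus an immobile (stably bound) fraction.

    The fast component stands in for the diffusible pool, the slow component
    for the dynamically exchanging bound pool; the plateau (1 - immobile)
    reflects molecules that do not exchange on the experimental timescale.
    """
    return (1.0 - immobile) - a_fast * np.exp(-k_fast * t) - a_slow * np.exp(-k_slow * t)

# Synthetic normalised fluorescence in the bleached region vs. time (s)
t = np.linspace(0, 300, 61)
y = frap_recovery(t, 0.3, 0.5, 0.3, 0.01, 0.4) + np.random.normal(0, 0.01, t.size)

popt, _ = curve_fit(frap_recovery, t, y,
                    p0=[0.3, 0.1, 0.3, 0.005, 0.3],
                    bounds=(0, [1, 10, 1, 1, 1]))
print(dict(zip(["a_fast", "k_fast", "a_slow", "k_slow", "immobile"], np.round(popt, 3))))
```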

  1. SWOT analysis of a pediatric rehabilitation programme: a participatory evaluation fostering quality improvement.

    Science.gov (United States)

    Camden, Chantal; Swaine, Bonnie; Tétreault, Sylvie; Bergeron, Sophie

    2009-01-01

    To present the results of a strengths, weaknesses, opportunities and threats (SWOT) analysis used as part of a process aimed at reorganising services provided within a pediatric rehabilitation programme (PRP) in Quebec, Canada and to report the perceptions of the planning committee members regarding the usefulness of the SWOT in this process. Thirty-six service providers working in the PRP completed a SWOT questionnaire and reported what they felt worked and what did not work in the existing model of care. Their responses were used by a planning committee over a 12-month period to assist in the development of a new service delivery model. Committee members shared their thoughts about the usefulness of the SWOT. Current programme strengths included favourable organisational climate and interdisciplinary work whereas weaknesses included lack of psychosocial support to families and long waiting times for children. Opportunities included working with community partners, whereas fear of losing professional autonomy with the new service model was a threat. The SWOT results helped the planning committee redefine the programme goals and make decisions to improve service coordination. SWOT analysis was deemed as a very useful tool to help guide service reorganisation. SWOT analysis appears to be an interesting evaluation tool to promote awareness among service providers regarding the current functioning of a rehabilitation programme. It fosters their active participation in the reorganisation of a new service delivery model for pediatric rehabilitation.

  2. Partnership capacity for community health improvement plan implementation: findings from a social network analysis.

    Science.gov (United States)

    McCullough, J Mac; Eisen-Cohen, Eileen; Salas, S Bianca

    2016-07-13

    Many health departments collaborate with community organizations on community health improvement processes. While a number of resources exist to plan and implement a community health improvement plan (CHIP), little empirical evidence exists on how to leverage and expand partnerships when implementing a CHIP. The purpose of this study was to identify characteristics of the network involved in implementing the CHIP in one large community. The aims of this analysis are to: 1) identify essential network partners (and thereby highlight potential network gaps), 2) gauge current levels of partner involvement, 3) understand and effectively leverage network resources, and 4) enable a data-driven approach for future collaborative network improvements. We collected primary data via survey from n = 41 organizations involved in the Health Improvement Partnership of Maricopa County (HIPMC), in Arizona. Using the previously validated Program to Analyze, Record, and Track Networks to Enhance Relationships (PARTNER) tool, organizations provided information on existing ties with other coalition members, including frequency and depth of partnership and eight categories of perceived value/trust of each current partner organization. The coalition's overall network had a density score of 30 %, degree centralization score of 73 %, and trust score of 81 %. Network maps are presented to identify existing relationships between HIPMC members according to partnership frequency and intensity, duration of involvement in the coalition, and self-reported contributions to the coalition. Overall, number of ties and other partnership measures were positively correlated with an organization's perceived value and trustworthiness as rated by other coalition members. Our study presents a novel use of social network analysis methods to evaluate the coalition of organizations involved in implementing a CHIP in an urban community. The large coalition had relatively low network density but high
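
    The whole-network statistics reported above (density and degree centralization) are straightforward to compute once the tie data are loaded into a graph. The sketch below uses networkx on a tiny hypothetical coalition; the organisation names and edges are illustrative only.

```python
import networkx as nx

# Hypothetical coalition: nodes are partner organisations, edges are reported ties.
G = nx.Graph()
G.add_edges_from([
    ("health_dept", "hospital_A"), ("health_dept", "clinic_B"),
    ("health_dept", "nonprofit_C"), ("hospital_A", "clinic_B"),
])

density = nx.density(G)                 # share of possible ties actually present
degrees = dict(G.degree())
n, max_deg = G.number_of_nodes(), max(degrees.values())
# Freeman degree centralization: how strongly ties concentrate on a single hub
centralization = sum(max_deg - d for d in degrees.values()) / ((n - 1) * (n - 2))
print(f"density = {density:.2f}, degree centralization = {centralization:.2f}")
```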

  3. Partnership capacity for community health improvement plan implementation: findings from a social network analysis

    Directory of Open Access Journals (Sweden)

    J. Mac McCullough

    2016-07-01

    Full Text Available Abstract Background Many health departments collaborate with community organizations on community health improvement processes. While a number of resources exist to plan and implement a community health improvement plan (CHIP), little empirical evidence exists on how to leverage and expand partnerships when implementing a CHIP. The purpose of this study was to identify characteristics of the network involved in implementing the CHIP in one large community. The aims of this analysis are to: 1) identify essential network partners (and thereby highlight potential network gaps), 2) gauge current levels of partner involvement, 3) understand and effectively leverage network resources, and 4) enable a data-driven approach for future collaborative network improvements. Methods We collected primary data via survey from n = 41 organizations involved in the Health Improvement Partnership of Maricopa County (HIPMC), in Arizona. Using the previously validated Program to Analyze, Record, and Track Networks to Enhance Relationships (PARTNER) tool, organizations provided information on existing ties with other coalition members, including frequency and depth of partnership and eight categories of perceived value/trust of each current partner organization. Results The coalition’s overall network had a density score of 30 %, degree centralization score of 73 %, and trust score of 81 %. Network maps are presented to identify existing relationships between HIPMC members according to partnership frequency and intensity, duration of involvement in the coalition, and self-reported contributions to the coalition. Overall, number of ties and other partnership measures were positively correlated with an organization’s perceived value and trustworthiness as rated by other coalition members. Conclusions Our study presents a novel use of social network analysis methods to evaluate the coalition of organizations involved in implementing a CHIP in an urban community. The

  4. An Improved, Automated Whole-Air Sampler and VOC Analysis System: Results from SONGNEX 2015

    Science.gov (United States)

    Lerner, B. M.; Gilman, J.; Tokarek, T. W.; Peischl, J.; Koss, A.; Yuan, B.; Warneke, C.; Isaacman-VanWertz, G. A.; Sueper, D.; De Gouw, J. A.; Aikin, K. C.

    2015-12-01

    Accurate measurement of volatile organic compounds (VOCs) in the troposphere is critical for the understanding of emissions and physical and chemical processes that can impact both air quality and climate. Airborne VOC measurements have proven challenging due to the requirements of short sample collection times (=10 s) to maximize spatial resolution and sampling frequency and high sensitivity (pptv) to chemically diverse hydrocarbons, halocarbons, oxygen- and nitrogen-containing VOCs. NOAA ESRL CSD has built an improved whole air sampler (iWAS) which collects compressed ambient air samples in electropolished stainless steel canisters, based on the NCAR HAIS Advanced Whole Air Sampler [Atlas and Blake]. Post-flight chemical analysis is performed with a custom-built gas chromatograph-mass spectrometer system that pre-concentrates analyte cryostatically via a Stirling cooler, an electromechanical chiller which precludes the need for liquid nitrogen to reach trapping temperatures. For the 2015 Shale Oil and Natural Gas Nexus Study (SONGNEX), CSD conducted iWAS measurements on 19 flights aboard the NOAA WP-3D aircraft between March 19th and April 27th. Nine oil and natural gas production regions were surveyed during SONGNEX and more than 1500 air samples were collected and analyzed. For the first time, we employed real-time mapping of sample collection combined with live data from fast time-response measurements (e.g. ethane) for more uniform surveying and improved target plume sampling. Automated sample handling allowed for more than 90% of iWAS canisters to be analyzed within 96 hours of collection - for the second half of the campaign improved efficiencies reduced the median sample age at analysis to 36 hours. A new chromatography peak-fitting software package was developed to minimize data reduction time by an order of magnitude without a loss of precision or accuracy. Here we report mixing ratios for aliphatic and aromatic hydrocarbons (C2-C8) along with select

  5. Quantitative Phosphoproteomic Analysis of Soybean Root Hairs Inoculated with Bradyrhizobium japonicum

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Tran H.; Brechenmacher, Laurent; Aldrich, Joshua T.; Clauss, Therese RW; Gritsenko, Marina A.; Hixson, Kim K.; Libault, Marc; Tanaka, Kiwamu; Yang, Feng; Yao, Qiuming; Pasa-Tolic, Ljiljana; Xu, Dong; Nguyen, Henry T.; Stacey, Gary

    2012-11-11

    Root hairs are single hair-forming cells on roots that function to increase root surface area, enhancing water and nutrient uptake. In leguminous plants, root hairs also play a critical role as the site of infection by symbiotic nitrogen fixing rhizobia, leading to the formation of a novel organ, the nodule. The initial steps in the rhizobia-root hair infection process are known to involve specific receptor kinases and subsequent kinase cascades. Here, we characterize the phosphoproteome of the root hairs and the corresponding stripped roots (i.e., roots from which root hairs were removed) during rhizobial colonization and infection to gain insight into the molecular mechanism of root hair cell biology. We chose soybean (Glycine max L.), one of the most important crop plants in the legume family, for this study because of its larger root size, which permits isolation of sufficient root hair material for phosphoproteomic analysis. Phosphopeptides derived from root hairs and stripped roots, mock inoculated or inoculated with the soybean-specific rhizobium Bradyrhizobium japonicum, were labeled with the isobaric tag 8-plex ITRAQ, enriched using Ni-NTA magnetic beads and subjected to nRPLC-MS/MS analysis using HCD and decision tree guided CID/ETD strategy. A total of 1,625 unique phosphopeptides, spanning 1,659 non-redundant phosphorylation sites, were detected from 1,126 soybean phosphoproteins. Among them, 273 phosphopeptides corresponding to 240 phosphoproteins were found to be significantly regulated (>1.5 fold abundance change) in response to inoculation with B. japonicum. The data reveal unique features of the soybean root hair phosphoproteome, including root hair and stripped root-specific phosphorylation suggesting a complex network of kinase-substrate and phosphatase-substrate interactions in response to rhizobial inoculation.
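
    The regulation criterion used above (a greater than 1.5-fold abundance change between inoculated and mock-inoculated samples) amounts to a simple ratio filter on the quantified reporter-ion intensities. The sketch below shows that filtering step on a hypothetical table; the column names, values and layout are assumptions, not the study's data.

```python
import pandas as pd

# Hypothetical summed reporter-ion intensities per phosphopeptide
df = pd.DataFrame({
    "phosphopeptide": ["pep1", "pep2", "pep3"],
    "inoculated":     [1.8e6, 9.0e5, 2.0e5],
    "mock":           [1.0e6, 8.5e5, 4.0e5],
})

df["fold_change"] = df["inoculated"] / df["mock"]
# Keep peptides changing by more than 1.5-fold in either direction
regulated = df[(df["fold_change"] > 1.5) | (df["fold_change"] < 1 / 1.5)]
print(regulated[["phosphopeptide", "fold_change"]])
```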

  6. Improvement of microtome cutting process of carbon nanotube composite sample preparation for TEM analysis

    Science.gov (United States)

    Trayner, Sarah

    As research progresses towards nanoscale materials, there is a need for a more efficient and effective way to obtain ultra-thin samples for imaging under the transmission electron microscope (TEM) for atomic-resolution analysis. Various methods are used to obtain thin samples; this research is a continuous effort to study and improve the ultra-microtome cutting technique to provide an effective and reliable approach for obtaining ultra-thin (25-50 nm) cross sections of a CNT/polymer composite for high-resolution TEM analysis. Improvements were achieved by studying the relationships between the chosen cutting parameters, sample characteristics and TEM image quality. From this information, a cutting protocol was established so that ultra-thin sample slices can be produced by different microtome operators for high-resolution TEM analysis. In addition, a custom tool was created to aid in the sample collection process. In this research, three composite samples were studied for both microtome cutting and TEM analysis: 1) unidirectional (UD) IM7/BMI composite; 2) single-layer CNT buckypaper (BP)/epoxy nanocomposite; 3) 3-layer CNT BP/BMI nanocomposite. The resultant TEM images revealed a clear microstructure consisting of amorphous resin and graphite crystalline packing. The UD IM7/BMI composite TEM results did not reveal an interfacial region, indicating a need for even thinner cross sections. TEM results for the single-layer CNT BP/epoxy nanocomposite revealed the alignment direction of the nanotubes and numerous stacks of CNT bundles. In addition, there was visible flattening of CNT packing into dumbbell shapes, similar to results obtained by Alan Windle. TEM results for the 3-layer CNT BP/BMI nanocomposite revealed uniformly cut resin. However, when the diamond knife reached graphite crystalline regions, the nanotube either became deformed into a cone-like structure, was cut thicker than the resin, or folded over onto itself. This is most likely

  7. Term AnalysisImproving the Quality of Learning and Application Documents in Engineering Design

    Directory of Open Access Journals (Sweden)

    S. Weiss

    2006-01-01

    Full Text Available Conceptual homogeneity is one determinant of the quality of text documents. A concept remains the same if the words used (termini) change [1, 2]. In other words, termini can vary while the concept retains the same meaning. Human beings are able to handle concepts and termini because of their semantic network, which is able to connect termini to the actual context and thus identify the adequate meaning of the termini. Problems can arise when humans have to learn new content and, correspondingly, new concepts. Since the content is basically imparted by text via particular termini, it is a challenge to establish the right concept from the text with the termini. A term might be known, but have a different meaning [3, 4]. Therefore, it is very important to build up the correct understanding of concepts within a text. This is only possible when concepts are explained by the right termini, within an adequate context, and above all, homogeneously. So, when setting up or using text documents for teaching or application, it is essential to provide conceptual homogeneity. Understandably, the quality of documents is, ceteris paribus, reciprocally proportional to variations of termini. Therefore, an analysis of variations of termini could form a basis for specific improvement of conceptual homogeneity. Consequently, an exposition of variations of termini as control and improvement parameters is carried out in this investigation. This paper describes the functionality and the benefits of a tool called TermAnalysis. TermAnalysis is a software tool developed

  8. Thermodynamic analysis questions claims of improved cardiac efficiency by dietary fish oil

    Science.gov (United States)

    Goo, Eden; Chapman, Brian; Hickey, Anthony J.R.

    2016-01-01

    Studies in the literature describe the ability of dietary supplementation by omega-3 fish oil to increase the pumping efficiency of the left ventricle. Here we attempt to reconcile such studies with our own null results. We undertake a quantitative analysis of the improvement that could be expected theoretically, subject to physiological constraints, by posing the following question: By how much could efficiency be expected to increase if inefficiencies could be eliminated? Our approach utilizes thermodynamic analyses to investigate the contributions, both singly and collectively, of the major components of cardiac energetics to total cardiac efficiency. We conclude that it is unlikely that fish oils could achieve the required diminution of inefficiencies without greatly compromising cardiac performance. PMID:27574288

  9. Use of video-feedback, reflection, and interactive analysis to improve nurse leadership practices.

    Science.gov (United States)

    Crenshaw, Jeannette T

    2012-01-01

    The chronic shortage of registered nurses (RNs) affects patient safety and health care quality. Many factors affect the RN shortage in the workforce, including negative work environments, exacerbated by ineffective leadership approaches. Improvements in the use of relationship-based leadership approaches lead to healthier work environments that foster RN satisfaction and reduce RN turnover and vacancy rates in acute care settings. In this article, an innovative approach to reduce nurse turnover and decrease vacancy rates in acute care settings is described. Video feedback with reflection and interactive analysis is an untapped resource for nurse leaders and aspiring nurse leaders in their development of effective leadership skills. This unique method may be an effective leadership strategy for addressing recruitment and retention issues in a diverse workforce.

  10. Cross-sectional atom probe tomography sample preparation for improved analysis of fins on SOI.

    Science.gov (United States)

    Martin, Andrew J; Weng, Weihao; Zhu, Zhengmao; Loesing, Rainer; Shaffer, James; Katnani, Ahmad

    2016-02-01

    Sample preparation for atom probe tomography of 3D semiconductor devices has proven to significantly affect field evaporation and the reliability of reconstructed data. A cross-sectional preparation method is applied to state-of-the-art Si finFET technology on SOI. This preparation approach advantageously provides a conductive path for voltage and heat, offers analysis of many fins within a single tip, and improves resolution across interfaces of particular interest. Measured B and Ge profiles exhibit good correlation with SIMS and EDX and show no signs of B clustering or pile-up near the Si/SiGe interface of the fin. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Improved analysis of bacterial CGH data beyond the log-ratio paradigm

    Directory of Open Access Journals (Sweden)

    Aakra Ågot

    2009-03-01

    Full Text Available Abstract Background Existing methods for analyzing bacterial CGH data from two-color arrays are based on log-ratios only, a paradigm inherited from expression studies. We propose an alternative approach, where microarray signals are used in a different way and sequence identity is predicted using a supervised learning approach. Results A data set containing 32 hybridizations of sequenced versus sequenced genomes has been used to test and compare methods. A ROC analysis has been performed to illustrate the ability to rank probes with respect to Present/Absent calls. Classification into Present and Absent is compared with that of a Gaussian mixture model. Conclusion The results indicate our proposed method is an improvement over existing methods with respect to ranking and classification of probes, especially for multi-genome arrays.
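
    The ROC analysis mentioned in the Results section ranks probes by a score and asks how well that ranking separates Present from Absent probes on the sequenced test genomes. A minimal sketch with scikit-learn is shown below; the scores and labels are hypothetical.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical probe scores from a supervised classifier (higher = more likely Present)
# and the known Present (1) / Absent (0) status from the sequenced genomes.
truth  = np.array([1, 1, 1, 0, 1, 0, 0, 1, 0, 0])
scores = np.array([0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.35, 0.3, 0.2, 0.1])

fpr, tpr, thresholds = roc_curve(truth, scores)
print(f"AUC = {roc_auc_score(truth, scores):.2f}")  # ranking quality of Present vs. Absent calls
```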

  12. Analysis of the Efficacy of an Intervention to Improve Parent-Adolescent Problem Solving.

    Science.gov (United States)

    Semeniuk, Yulia Yuriyivna; Brown, Roger L; Riesch, Susan K

    2016-07-01

    We conducted a two-group longitudinal partially nested randomized controlled trial to examine whether young adolescent youth-parent dyads participating in Mission Possible: Parents and Kids Who Listen, in contrast to a comparison group, would demonstrate improved problem-solving skill. The intervention is based on the Circumplex Model and Social Problem-Solving Theory. The Circumplex Model posits that families who are balanced, that is characterized by high cohesion and flexibility and open communication, function best. Social Problem-Solving Theory informs the process and skills of problem solving. The Conditional Latent Growth Modeling analysis revealed no statistically significant differences in problem solving among the final sample of 127 dyads in the intervention and comparison groups. Analyses of effect sizes indicated large magnitude group effects for selected scales for youth and dyads portraying a potential for efficacy and identifying for whom the intervention may be efficacious if study limitations and lessons learned were addressed. © The Author(s) 2016.

  13. Analysis of the characteristics of taxi services as a prerequisite for their improvement

    Directory of Open Access Journals (Sweden)

    Vujić Nenad

    2014-01-01

    Full Text Available The expansion of the services sector is characteristic of modern, developed societies and influences the national economy. Therefore, the analysis of services, as a concept and as part of marketing, is very significant. In this sense, the paper investigates a particular service: taxi services in the capital of Serbia. Through this research, the authors try to define the groups of customers of taxi services and their preferences and attitudes. The research was performed in the period from May to July 2014, by direct contact with customers of taxi services. The results of the research have confirmed the initial hypothesis and provide possibilities for further insight into the way taxi services are used and the general circumstances that characterize them in the mentioned region. On this basis, proposals are provided for the improvement of taxi services and for reaching target groups more easily.

  14. Analysis from reviews in Social Media to improve hotel´s online reputation

    Directory of Open Access Journals (Sweden)

    Daissy Hatblathy Moya Sánchez

    2017-07-01

    Full Text Available Today, hoteliers have problems handling online reputation due to bad reviews they have received on social networks. The aim of this research is to identify the key factors to consider in the operation of each hotel in order to avoid negative comments and increase online reputation. The study analyzed the ratings received through virtual channels by 57 Latin American hotels belonging to the GHL Hotel Chain from March 31st, 2015 until March 31st, 2016. Using the software Revinate, the reviews were analyzed by department and then classified to develop a manual of good practices. From the analysis of those comments, recommendations were made for six areas of the hotels: Rooms, Food and Beverage, Front Desk, Business Center, Security, and Management, to optimize quality in the hotels and thus improve their online reputation.

  15. Improved method for HPLC analysis of polyamines, agmatine and aromatic monoamines in plant tissue

    Science.gov (United States)

    Slocum, R. D.; Flores, H. E.; Galston, A. W.; Weinstein, L. H.

    1989-01-01

    The high performance liquid chromatographic (HPLC) method of Flores and Galston (1982 Plant Physiol 69: 701) for the separation and quantitation of benzoylated polyamines in plant tissues has been widely adopted by other workers. However, due to previously unrecognized problems associated with the derivatization of agmatine, this important intermediate in plant polyamine metabolism cannot be quantitated using this method. Also, two polyamines, putrescine and diaminopropane, are not well resolved using this method. A simple modification of the original HPLC procedure greatly improves the separation and quantitation of these amines, and further allows the simultaneous analysis of phenethylamine and tyramine, which are major monoamine constituents of tobacco and other plant tissues. We have used this modified HPLC method to characterize amine titers in suspension-cultured carrot (Daucus carota L.) cells and tobacco (Nicotiana tabacum L.) leaf tissues.

  16. Improved Label-Free LC-MS Analysis by Wavelet-Based Noise Rejection

    Directory of Open Access Journals (Sweden)

    Salvatore Cappadona

    2010-01-01

    Full Text Available Label-free LC-MS analysis allows determining the differential expression level of proteins in multiple samples, without the use of stable isotopes. This technique is based on the direct comparison of multiple runs, obtained by continuous detection in MS mode. Only differentially expressed peptides are selected for further fragmentation, thus avoiding the bias toward abundant peptides typical of data-dependent tandem MS. The computational framework includes detection, alignment, normalization and matching of peaks across multiple sets, and several software packages are available to address these processing steps. Yet, more care should be taken to improve the quality of the LC-MS maps entering the pipeline, as this parameter severely affects the results of all downstream analyses. In this paper we show how the inclusion of a preprocessing step of background subtraction in a common laboratory pipeline can lead to an enhanced inclusion list of peptides selected for fragmentation and consequently to better protein identification.
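
    As a concrete, much simplified stand-in for the background-subtraction preprocessing step discussed above, the sketch below estimates a slowly varying baseline with a running minimum and subtracts it from an LC-MS intensity trace, leaving the sharper chromatographic peaks. The actual paper uses a wavelet-based approach; the window length and smoothing here are assumptions.

```python
import numpy as np

def rolling_minimum_baseline(intensity, window=101):
    """Estimate a slowly varying background as a running minimum, then smooth it."""
    half = window // 2
    padded = np.pad(intensity, half, mode="edge")
    baseline = np.array([padded[i:i + window].min() for i in range(len(intensity))])
    kernel = np.ones(half) / half          # light smoothing to avoid step artefacts
    return np.convolve(baseline, kernel, mode="same")

# Usage: corrected_trace = intensity - rolling_minimum_baseline(intensity)
```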

  17. Environmental impact assessment in Colombia: Critical analysis and proposals for improvement

    International Nuclear Information System (INIS)

    Toro, Javier; Requena, Ignacio; Zamorano, Montserrat

    2010-01-01

    The evaluation of Environmental Impact Assessment (EIA) systems is a highly recommended strategy for enhancing their effectiveness and quality. This paper describes an evaluation of EIA in Colombia, using the model and the control mechanisms proposed and applied in other countries by Christopher Wood and Ortolano. The evaluation criteria used are based on Principles of Environmental Impact Assessment Best Practice, such as effectiveness and control features, and they were contrasted with the opinions of a panel of Colombian EIA experts as a means of validating the results of the study. The results show that EIA regulations in Colombia are ineffective because of their limited scope, inadequate administrative support and the absence of effective control mechanisms and public participation. This analysis resulted in a series of recommendations regarding the further development of the EIA system in Colombia with a view to improving its quality and effectiveness.

  18. A critical analysis of energy efficiency improvement potentials in Taiwan's cement industry

    International Nuclear Information System (INIS)

    Huang, Yun-Hsun; Chang, Yi-Lin; Fleiter, Tobias

    2016-01-01

    The cement industry is the second most energy-intensive sector in Taiwan, which underlines the need to understand its potential for energy efficiency improvement. A bottom-up model-based assessment is utilized to conduct a scenario analysis of energy saving opportunities up to the year 2035. The analysis is supported by detailed expert interviews in all cement plants of Taiwan. The simulation results reveal that by 2035, eighteen energy efficient technologies could result in 25% savings for electricity and 9% savings for fuels under the technical diffusion scenario. This potential totally amounts to about 5000 TJ/year, of which 91% can be implemented cost-effectively assuming a discount rate of 10%. Policy makers should support a fast diffusion of these technologies. Additionally, policy makers can tap further saving potentials. First, by decreasing the clinker share, which is currently regulated to a minimum of 95%. Second, by extending the prohibition to build new cement plants by allowing for replacement of existing capacity with new innovative plants in the coming years. Third, by supporting the use of alternative fuels, which is currently still a niche in Taiwan. - Highlights: •We analyze energy efficiency improvement potentials in Taiwan's cement industry. •Eighteen process-specific technologies are analyzed using a bottom-up model. •Our model systematically reflects the diffusion of technologies over time. •We find energy-saving potentials of 25% for electricity and 9% for fuels in 2035. •91% of the energy-saving potentials can be realized cost-effectively.

  19. Improving time-frequency domain sleep EEG classification via singular spectrum analysis.

    Science.gov (United States)

    Mahvash Mohammadi, Sara; Kouchaki, Samaneh; Ghavami, Mohammad; Sanei, Saeid

    2016-11-01

    Manual sleep scoring is deemed to be tedious and time consuming. Even among automatic methods, such as time-frequency (T-F) representations, there is still room for improvement. To optimise the efficiency of T-F domain analysis of sleep electroencephalography (EEG), a novel approach for automatically identifying the brain waves, sleep spindles, and K-complexes from sleep EEG signals is proposed. The proposed method is based on singular spectrum analysis (SSA). The single-channel EEG signal (C3-A2) is initially decomposed and then the desired components are automatically separated. In addition, the noise is removed to enhance the discriminative ability of the features. The T-F features obtained after the preprocessing stage are classified using multi-class support vector machines (SVMs) and used for the identification of four sleep stages over three sleep types. Furthermore, to emphasise the usefulness of the proposed method, the automatically determined spindles are parameterised to discriminate three sleep types. The four sleep stages are classified through SVM twice: with and without the preprocessing stage. The mean accuracy, sensitivity, and specificity before the preprocessing stage are 71.5±0.11%, 56.1±0.09% and 86.8±0.04%, respectively. However, these values increase significantly to 83.6±0.07%, 70.6±0.14% and 90.8±0.03% after applying SSA. The new T-F representation has been compared with existing benchmarks. Our results show that the proposed method outperforms the previous methods in terms of identification and representation of sleep stages. Experimental results confirm the performance improvement in terms of classification rate and also the representative T-F domain. Copyright © 2016 Elsevier B.V. All rights reserved.
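
    The SSA decomposition step described above (embedding the single-channel EEG into a trajectory matrix, decomposing it, and reconstructing components that can be grouped into slow waves, spindle-band activity and noise) can be sketched as follows. This is a generic, minimal SSA implementation for illustration; the window length, the grouping of components and the subsequent SVM stage are not shown and are not taken from the paper.

```python
import numpy as np

def ssa_decompose(x, window):
    """Basic singular spectrum analysis of a 1-D signal.

    Returns one reconstructed elementary series per singular value; grouping
    selected series separates oscillatory EEG components from noise.
    """
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: each column is a lagged window of the signal
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    components = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])          # rank-1 elementary matrix
        # Anti-diagonal averaging maps the matrix back to a 1-D series
        comp = np.array([Xi[::-1, :].diagonal(t - (window - 1)).mean()
                         for t in range(n)])
        components.append(comp)
    return np.array(components)
```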

  20. Observations of Tunable Resistive Pulse Sensing for Exosome Analysis: Improving System Sensitivity and Stability.

    Science.gov (United States)

    Anderson, Will; Lane, Rebecca; Korbie, Darren; Trau, Matt

    2015-06-16

    Size distribution and concentration measurements of exosomes are essential when investigating their cellular function and uptake. Recently, a particle size distribution and concentration measurement platform known as tunable resistive pulse sensing (TRPS) has seen increased use for the characterization of exosome samples. TRPS measures the brief increase in electrical resistance (a resistive pulse) produced by individual submicrometer/nanoscale particles as they translocate through a size-tunable submicrometer/micrometer-sized pore, embedded in an elastic membrane. Unfortunately, TRPS measurements are susceptible to issues surrounding system stability, where the pore can become blocked by particles, and sensitivity issues, where particles are too small to be detected against the background noise of the system. Herein, we provide a comprehensive analysis of the parameters involved in TRPS exosome measurements and demonstrate the ability to improve system sensitivity and stability by the optimization of system parameters. We also provide the first analysis of system noise, sensitivity cutoff limits, and accuracy with respect to exosome measurements and offer an explicit definition of system sensitivity that indicates the smallest particle diameter that can be detected within the noise of the trans-membrane current. A comparison of exosome size measurements from both TRPS and cryo-electron microscopy is also provided, finding that a significant number of smaller exosomes fell below the detection limit of the TRPS platform and offering one potential insight as to why there is such large variability in the exosome size distribution reported in the literature. We believe the observations reported here may assist others in improving TRPS measurements for exosome samples and other submicrometer biological and nonbiological particles.

  1. Analysis of an Online Match Discussion Board: Improving the Otolaryngology—Head and Neck Surgery Match

    Science.gov (United States)

    Kozin, Elliott D.; Sethi, Rosh; Lehmann, Ashton; Remenschneider, Aaron K.; Golub, Justin S.; Reyes, Samuel A.; Emerick, Kevin; Lee, Daniel J.; Gray, Stacey T.

    2015-01-01

    Introduction “The Match” has become the accepted selection process for graduate medical education. Otomatch.com has provided an online forum for Otolaryngology-Head and Neck Surgery (OHNS) Match-related questions for over a decade. Herein, we aim to 1) delineate the types of posts on Otomatch to better understand the perspective of medical students applying for residency and 2) provide recommendations to potentially improve the Match process. Methods Discussion forum posts on Otomatch between December 2001 and April 2014 were reviewed. The title of each thread and total number of views were recorded for quantitative analysis. Each thread was organized into one of six major categories and one of eighteen subcategories, based on chronology within the application cycle and topic. National Resident Matching Program (NRMP) data were utilized for comparison. Results We identified 1,921 threads corresponding to over 2 million page views. Over 40% of threads related to questions about specific programs, and 27% were discussions about interviews. Views, a surrogate measure for popularity, reflected different trends. The majority of individuals viewed posts on interviews (42%), program-specific questions (20%) and how to rank programs (11%). Increases in viewership tracked with a rise in applicant numbers based on NRMP data. Conclusions Our study provides an in-depth analysis of a popular discussion forum for medical students interested in the OHNS Match. The most viewed posts are about interview dates and questions regarding specific programs. We provide suggestions to address unmet needs for medical students and potentially improve the Match process.

  2. Reactive Landing of Gas-Phase Ions as a Tool for the Fabrication of Metal Oxide Surfaces for In Situ Phosphopeptide Enrichment

    Czech Academy of Sciences Publication Activity Database

    Blacken, G. R.; Volný, Michael; Diener, M.; Jackson, K. E.; Ranjitkar, P.; Maly, D. J.; Tureček, F.

    2009-01-01

    Vol. 20, No. 6 (2009), pp. 915-926 ISSN 1044-0305 Institutional research plan: CEZ:AV0Z50200510 Keywords: TANDEM MASS-SPECTROMETRY * SELECTIVE DETECTION * PHOSPHOPROTEOME ANALYSIS Subject RIV: EE - Microbiology, Virology Impact factor: 3.391, year: 2009

  3. Recommendations to improve imaging and analysis of brain lesion load and atrophy in longitudinal studies of multiple sclerosis

    NARCIS (Netherlands)

    Vrenken, H.; Jenkinson, M.; Horsfield, M.A.; Battaglini, M.; van Schijndel, R.A.; Rostrup, E.; Geurts, J.J.G.; Fisher, E.; Zijdenbos, A.; Ashburner, J.; Miller, D. H.; Filippi, M.; Fazekas, F.; Rovaris, M.; Rovira, A.; Barkhof, F.; De Stefano, N.

    2013-01-01

    Focal lesions and brain atrophy are the most extensively studied aspects of multiple sclerosis (MS), but the image acquisition and analysis techniques used can be further improved, especially those for studying within-patient changes of lesion load and atrophy longitudinally. Improved accuracy and

  4. How to Use Value-Added Analysis to Improve Student Learning: A Field Guide for School and District Leaders

    Science.gov (United States)

    Kennedy, Kate; Peters, Mary; Thomas, Mike

    2012-01-01

    Value-added analysis is the most robust, statistically significant method available for helping educators quantify student progress over time. This powerful tool also reveals tangible strategies for improving instruction. Built around the work of Battelle for Kids, this book provides a field-tested continuous improvement model for using…

  5. Failure mode and effect analysis: improving intensive care unit risk management processes.

    Science.gov (United States)

    Askari, Roohollah; Shafii, Milad; Rafiei, Sima; Abolhassani, Mohammad Sadegh; Salarikhah, Elaheh

    2017-04-18

    Purpose Failure modes and effects analysis (FMEA) is a practical tool to evaluate risks, discover failures in a proactive manner and propose corrective actions to reduce or eliminate potential risks. The purpose of this paper is to apply the FMEA technique to examine the hazards associated with the process of service delivery in the intensive care unit (ICU) of a tertiary hospital in Yazd, Iran. Design/methodology/approach This was a before-after study conducted between March 2013 and December 2014. An FMEA team was formed to identify all potential hazards associated with ICU services, along with their frequency and severity. A risk priority number was then calculated for each activity as an indicator of high-priority areas that need special attention and resource allocation. Findings Eight failure modes with the highest priority scores, including endotracheal tube defect, wrong placement of endotracheal tube, EVD interface, aspiration failure during suctioning, chest tube failure, tissue injury and deep vein thrombosis, were selected for improvement. The findings affirmed that the improvement strategies were generally satisfactory and significantly decreased total failures. Practical implications Application of FMEA in ICUs proved effective in proactively decreasing the risk of failures and brought the control measures up to acceptable levels in all eight areas of function. Originality/value Using a prospective risk assessment approach such as FMEA can be beneficial in dealing with potential failures by proposing preventive actions in a proactive manner. The method can be used as a tool for continuous quality improvement in healthcare, since it identifies both systemic and human errors and offers practical advice for dealing with them effectively.
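
    The risk priority number referred to above follows the usual FMEA convention of multiplying severity, occurrence and detectability ratings; failure modes with the largest products are prioritised for corrective action. The sketch below illustrates the calculation with made-up ratings, not the study's actual scores.

```python
# Hypothetical 1-10 ratings for a few ICU failure modes (illustrative values only)
failure_modes = {
    "endotracheal tube defect":          {"severity": 9,  "occurrence": 4, "detection": 5},
    "wrong placement of ET tube":        {"severity": 10, "occurrence": 3, "detection": 4},
    "aspiration failure during suction": {"severity": 7,  "occurrence": 5, "detection": 3},
}

for mode, r in sorted(failure_modes.items(),
                      key=lambda kv: -(kv[1]["severity"] * kv[1]["occurrence"] * kv[1]["detection"])):
    rpn = r["severity"] * r["occurrence"] * r["detection"]   # risk priority number
    print(f"{mode:38s} RPN = {rpn}")
```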

  6. Instrumented gait analysis: a measure of gait improvement by a wheeled walker in hospitalized geriatric patients.

    Science.gov (United States)

    Schülein, Samuel; Barth, Jens; Rampp, Alexander; Rupprecht, Roland; Eskofier, Björn M; Winkler, Jürgen; Gaßmann, Karl-Günter; Klucken, Jochen

    2017-02-27

    In an increasingly aging society, reduced mobility is one of the most important factors limiting activities of daily living and overall quality of life. The ability to walk independently contributes to mobility, but is increasingly restricted by numerous diseases that impair gait and balance. The aim of this cross-sectional observational study was to examine whether spatio-temporal gait parameters derived from mobile instrumented gait analysis can be used to measure the gait-stabilizing effects of a wheeled walker (WW) and whether these gait parameters may serve as surrogate markers in hospitalized patients with multifactorial gait and balance impairment. One hundred six patients (ages 68-95) wearing inertial-sensor-equipped shoes passed an instrumented walkway with and without gait support from a WW. The walkway assessed the fall-risk-associated gait parameters velocity, swing time, stride length, stride time variability and double support time variability. The inertial-sensor-equipped shoes measured heel strike and toe off angles, and foot clearance. The use of a WW improved the spatio-temporal parameters velocity, swing time and stride length, and the sagittal-plane parameters heel strike and toe off angles, in all patients. First-time users (FTUs) showed gait parameter improvement patterns similar to those of frequent WW users (FUs). However, FUs with higher levels of gait impairment improved more in velocity, stride length and toe off angle compared to the FTUs. The impact of a WW can be quantified objectively by instrumented gait assessment. Thus, objective gait parameters may serve as surrogate markers for the use of walking aids in patients with gait and balance impairments.

  7. [Security of hospital infusion practices: From an a priori risk analysis to an improvement action plan].

    Science.gov (United States)

    Pignard, J; Cosserant, S; Traore, O; Souweine, B; Sautou, V

    2016-03-01

    Infusion in care units, and all the more in intensive care units, is a complex process which can be the source of many risks for the patient. As part of an institutional approach for improving the quality and safety of patient healthcare, a risk mapping of infusion practices was performed. The analysis was focused on intravenous infusion situations in adults; the a priori risk assessment methodology was applied and a multidisciplinary working group established. Forty-three risks were identified for the infusion process (prescription, preparation and administration). The risk assessment and the existing means of control showed that 48% of them would have a highly critical impact on patient safety. Recommendations were developed for the 20 risks considered to be most critical, to limit their occurrence and severity and to improve their level of control. An institutional action plan was developed and validated by the Drug and Sterile Medical Devices Commission. This mapping allowed an exhaustive inventory of the potential risks associated with infusion. At the end of this work, multidisciplinary groups were set up to work on different themes, and regular quarterly meetings were established to follow the progress of the various projects. Risk mapping will also be performed in pediatric and oncology units, where the risks associated with the handling of toxic products are omnipresent. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  8. Improvements and validation of the transient analysis code MOREL for molten salt reactors

    International Nuclear Information System (INIS)

    Zhuang Kun; Zheng Youqi; Cao Liangzhi; Hu Tianliang; Wu Hongchun

    2017-01-01

    The liquid fuel salt used in molten salt reactors (MSRs) serves as fuel and coolant simultaneously. On the one hand, the delayed neutron precursors circulate in the whole primary loop and part of them decay outside the core. On the other hand, the fission heat is carried off directly by the fuel flow. These two features require a new analysis method coupling fluid flow, heat transfer and neutronics. In this paper, the recent update of the MOREL code is presented. The update includes: (1) an improved quasi-static method for the kinetics equation with a convection term; (2) a multi-channel thermal-hydraulic model developed based on the geometric features of MSRs; (3) the Variational Nodal Method, used to solve the neutron diffusion equation instead of the original analytic basis function expansion nodal method. The update brings a significant improvement in the efficiency of the MOREL code, and the capability of the MOREL code is extended to real core simulation with feedback. Numerical results and experimental data gained from the Molten Salt Reactor Experiment (MSRE) are used to verify and validate the updated MOREL code. The results agree well with the experimental data, which proves that the new development of the MOREL code is correct and effective. (author)

  9. Use of Selection Indices Based on Multivariate Analysis for Improving Grain Yield in Rice

    Directory of Open Access Journals (Sweden)

    Hossein SABOURI

    2008-12-01

    Full Text Available In order to study selection indices for improving rice grain yield, a cross was made between an Iranian traditional rice (Oryza sativa L.) variety, Tarommahalli, and an improved indica rice variety, Khazar, in 2006. The traits of the parents (30 plants), the F1 (30 plants) and the F2 generation (492 individuals) were evaluated at the Rice Research Institute of Iran (RRII) during 2007. Heritabilities of the number of panicles per plant, plant height, days to heading and panicle exsertion were greater than that of grain yield. The selection indices were developed using the results of multivariate analysis. To evaluate selection strategies to maximize grain yield, 14 selection indices were calculated based on two methods (optimum and base) and combinations of 12 traits with various economic weights. The results showed that selection for grain weight, number of panicles per plant and panicle length, using their phenotypic and/or genotypic direct effects (path coefficients) as economic weights, should serve as an effective selection criterion with either the optimum or the base index.
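
    A selection index of the kind described above is a weighted sum of trait values, with the weights here taken as the direct (path-coefficient) effects of each trait on yield. The sketch below shows the arithmetic for a single plant; the trait values and weights are illustrative, not the study's estimates.

```python
import numpy as np

traits = ["grain_weight", "panicles_per_plant", "panicle_length"]
x = np.array([0.8, 1.2, -0.3])    # standardised trait values for one F2 plant
b = np.array([0.45, 0.30, 0.15])  # direct effects (path coefficients) used as economic weights

selection_index = float(np.dot(b, x))   # I = sum_i b_i * x_i
print(f"Selection index = {selection_index:.3f}")  # rank plants by this score and keep the top fraction
```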

  10. Systematic review and meta-analysis of behavioral interventions to improve child pedestrian safety.

    Science.gov (United States)

    Schwebel, David C; Barton, Benjamin K; Shen, Jiabin; Wells, Hayley L; Bogar, Ashley; Heath, Gretchen; McCullough, David

    2014-09-01

    Pedestrian injuries represent a pediatric public health challenge. This systematic review/meta-analysis evaluated behavioral interventions to teach children pedestrian safety. Multiple strategies derived eligible manuscripts (published before April 1, 2013, randomized design, evaluated behavioral child pedestrian safety interventions). Screening 1,951 abstracts yielded 125 full-text retrievals. 25 were retained for data extraction, and 6 were later omitted due to insufficient data. In all, 19 articles reporting 25 studies were included. Risk of bias and quality of evidence were assessed. Behavioral interventions generally improve children's pedestrian safety, both immediately after training and at follow-up several months later. Quality of the evidence was low to moderate. Available evidence suggested interventions targeting dash-out prevention, crossing at parked cars, and selecting safe routes across intersections were effective. Individualized/small-group training for children was the most effective training strategy based on available evidence. Behaviorally based interventions improve children's pedestrian safety. Efforts should continue to develop creative, cost-efficient, and effective interventions. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Improving Students’ Argumentation Style Ability in Writing Essay through Discourse Analysis Model Critical Thinking Map Oriented

    Directory of Open Access Journals (Sweden)

    R. Panca Pertiwi Hidayati

    2017-03-01

    Full Text Available Students’ ability in writing an essay, as one of the language skills that can improve creativity in language, is a serious problem that should be investigated by in-depth research. Besides showing that language is a thinking tool, writing competence can be seen as a product, from the perspective of measuring critical thinking ability, and also as a process, from the perspective of individual development. One indicator for measuring students’ critical thinking is their ability to deliver an argumentation style in their essays. The goal of this research is to obtain an objective discourse analysis model, critical thinking map oriented, for improving students’ argumentation style skills in essay writing, which can then be used as a model for developing students’ creativity in a variety of writing skills. Based on the paired-samples t test, the pre-test/post-test difference is significant, because the significance (2-tailed) is less than 0.05 at the 95% confidence level. This means there is a real difference between the ability before and after the treatment: the sample’s ability after the treatment is better than before. When compared with the control class, the experiment class shows significant superiority in all aspects of the essay writing assessment.
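
    The significance test reported above is a paired-samples t test on the pre-test and post-test scores of the same students. A minimal sketch with hypothetical scores is shown below; the numbers are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pre-test and post-test essay scores for the same eight students
pre  = np.array([62, 58, 70, 65, 60, 55, 68, 63])
post = np.array([70, 66, 75, 72, 68, 61, 74, 70])

t_stat, p_two_tailed = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p (2-tailed) = {p_two_tailed:.4f}")
# p < 0.05 at the 95% confidence level -> the post-test improvement is significant
```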

  12. Effectiveness of goal management training® in improving executive functions: A meta-analysis.

    Science.gov (United States)

    Stamenova, Vessela; Levine, Brian

    2018-03-14

    Our objective was to review the literature and quantitatively summarise the effectiveness of Goal Management Training® (GMT) (alone or in combination with other training approaches) in improving executive functions in adult populations. Ovid, Scopus, Web of Science, and ProQuest Dissertations & Theses Global were searched for articles citing "goal management training". Any group trials (n > 3) in adults that used multiple-session GMT programmes were included in the analyses. Outcome variables were extracted and classified into one of nine cognitive measures domains: executive functioning tasks, everyday executive functioning tasks, subjective executive tasks rated by the patient, subjective executive tasks rated by proxy, working memory, speed of processing, long-term memory, instrumental activities of daily living and general mental health status questionnaires. A total of 21 publications, containing 19 separate treatment group samples were included in the final analyses. Significantly positive small to moderate effect sizes were observed in all cognitive measure domains (except speed of processing) with effects maintained at follow-up assessments for all followed-up outcome measures, except for subjective ratings by patients and proxy. The analysis suggests that GMT is an effective intervention, leading to moderate improvements in executive functions that are usually maintained at follow-up.

  13. Improvement of the fringe analysis algorithm for wavelength scanning interferometry based on filter parameter optimization.

    Science.gov (United States)

    Zhang, Tao; Gao, Feng; Muhamedsalih, Hussam; Lou, Shan; Martin, Haydn; Jiang, Xiangqian

    2018-03-20

    The phase slope method, which estimates height from the fringe pattern frequency, and the algorithm which estimates height from the fringe phase are the fringe analysis algorithms most widely used in interferometry. Generally, both extract the phase information by filtering the signal in the frequency domain after a Fourier transform. Among the numerous papers in the literature on these algorithms, the design of the filter, which plays an important role, has never been discussed in detail. This paper focuses on the filter design in these algorithms for wavelength scanning interferometry (WSI), optimizing the parameters to acquire the best results. The spectral characteristics of the interference signal are analyzed first. The effective signal is found to be narrow-band (near single frequency), and the central frequency is calculated theoretically; this determines the position of the filter pass-band. The width of the filter window is then optimized in simulation to balance noise elimination against filter ringing. Experimental validation of the approach is provided, and the results agree very well with the simulation. The experiment shows that accuracy can be improved by optimizing the filter design, especially when the signal quality, i.e., the signal-to-noise ratio (SNR), is low. The proposed method also shows the potential of improving immunity to environmental noise by adapting the filter to the signal once the signal SNR can be estimated accurately.
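
    The Fourier-domain filtering step described above can be illustrated with a short Python sketch: a synthetic single-frequency fringe signal is transformed, a rectangular pass-band of adjustable half-width is placed around the expected fringe frequency, and the phase slope is recovered from the inverse transform. The fringe frequency, noise level and window width are assumed values for illustration, not the paper's optimized filter parameters.

    ```python
    import numpy as np

    # Synthetic wavelength-scanning fringe signal: single frequency plus noise.
    n = 1024
    k = np.arange(n)                      # scan index
    f0 = 0.08                             # assumed fringe frequency (cycles/sample)
    rng = np.random.default_rng(0)
    fringe = 1.0 + 0.5 * np.cos(2 * np.pi * f0 * k + 0.7) + 0.05 * rng.normal(size=n)

    # FFT and a rectangular pass-band centred on the expected fringe frequency;
    # 'half_width' plays the role of the filter-window parameter that must
    # balance noise rejection against ringing.
    spectrum = np.fft.fft(fringe)
    freqs = np.fft.fftfreq(n)
    half_width = 0.01
    mask = np.abs(freqs - f0) < half_width     # keeps only the +f0 lobe

    # Inverse transform of the filtered spectrum gives a complex signal whose
    # unwrapped phase is (ideally) linear in the scan index; its slope carries
    # the height information in the phase-slope method.
    analytic = np.fft.ifft(spectrum * mask)
    phase = np.unwrap(np.angle(analytic))
    slope = np.polyfit(k, phase, 1)[0]
    print("recovered fringe frequency:", round(slope / (2 * np.pi), 4))
    ```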

  14. Analysis of drought characteristics for improved understanding of a water resource system

    Directory of Open Access Journals (Sweden)

    A. T. Lennard

    2014-09-01

    Full Text Available Droughts are a recurring feature of the UK climate; recent drought events (2004–2006 and 2010–2012) have highlighted the UK’s continued vulnerability to this hazard. There is a need for further understanding of extreme events, particularly from a water resource perspective. A number of drought indices are available, which can help to improve our understanding of drought characteristics such as frequency, severity and duration. However, at present little of this is applied to water resource management in the water supply sector. Improved understanding of drought characteristics using indices can inform water resource management plans and enhance future drought resilience. This study applies the standardised precipitation index (SPI) to a series of rainfall records (1962–2012) across the water supply region of a single utility provider. Key droughts within this period are analysed to develop an understanding of the meteorological characteristics that lead to, exist during and terminate drought events. The results of this analysis highlight how drought severity and duration can vary across a small-scale water supply region, indicating that the spatial coherence of drought events cannot be assumed.
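
    The SPI calculation applied in the study can be sketched as follows: precipitation totals for the chosen accumulation period are fitted with a gamma distribution, and each total is mapped through the fitted cumulative probability to a standard normal quantile. The rainfall series below is randomly generated for illustration and is not the utility's 1962–2012 record; a full SPI implementation would also handle months with zero rainfall explicitly.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical monthly rainfall totals (mm); illustrative data only.
    rng = np.random.default_rng(0)
    rain = rng.gamma(shape=2.0, scale=30.0, size=600)   # ~50 years of months

    # Fit a gamma distribution to the accumulated totals (here SPI-1, i.e.
    # one-month accumulation), with the location parameter fixed at zero.
    shape, loc, scale = stats.gamma.fit(rain, floc=0)

    # Transform each total to a cumulative probability, then to the standard
    # normal quantile: the standardised precipitation index.
    cdf = stats.gamma.cdf(rain, shape, loc=loc, scale=scale)
    spi = stats.norm.ppf(cdf)

    # Negative SPI indicates drier-than-normal conditions; a common convention
    # flags SPI <= -1 as moderate drought or worse.
    print("fraction of months with SPI <= -1:", float(np.mean(spi <= -1)))
    ```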

  15. A fuzzy MICMAC analysis for improving supply chain performance of basic vaccines in developing countries.

    Science.gov (United States)

    Chandra, Dheeraj; Kumar, Dinesh

    2018-03-01

    In recent years, the demand to improve child immunization coverage globally and the development of the latest vaccines and technology have made the vaccine market very complex. The rise in such complexities often gives birth to numerous issues in the vaccine supply chain, which are the primary cause of its poor performance. Identifying the causes of poor performance can help determine how to address it. The goal of the present study is to identify and analyze important issues in the supply chain of the basic vaccines required for child immunization in developing countries. Twenty-five key issues, representing various factors of the vaccine supply chain, are presented in this paper. A fuzzy MICMAC analysis has been carried out to classify the factors based on their driving and dependence power and to develop a hierarchy-based model. Further, the findings have been discussed with field experts to identify the critical factors. Three factors, namely better demand forecasting, communication between supply chain members, and proper planning and scheduling, have been identified as the critical factors of the vaccine supply chain. These factors should be given special care to improve vaccine supply chain performance.
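
    The MICMAC classification step mentioned above can be illustrated with a small Python sketch: a fuzzy direct-influence matrix among factors is summed along rows (driving power) and columns (dependence power), and each factor is placed in one of the four MICMAC quadrants. The factor names, influence values and thresholds are invented for illustration and do not reproduce the study's 25 factors or expert ratings.

    ```python
    import numpy as np

    # Hypothetical fuzzy direct-influence matrix among 5 supply-chain factors
    # (rows influence columns); values in [0, 1] are illustrative only.
    factors = ["demand forecast", "communication", "planning", "cold chain", "funding"]
    F = np.array([
        [0.0, 0.7, 0.9, 0.3, 0.1],
        [0.5, 0.0, 0.7, 0.3, 0.1],
        [0.3, 0.5, 0.0, 0.7, 0.3],
        [0.1, 0.1, 0.3, 0.0, 0.1],
        [0.3, 0.1, 0.5, 0.7, 0.0],
    ])

    # MICMAC powers: driving power = row sum, dependence power = column sum.
    driving = F.sum(axis=1)
    dependence = F.sum(axis=0)

    # Classify each factor into a quadrant, using the mean power as threshold.
    d_mid, p_mid = driving.mean(), dependence.mean()
    for name, d, p in zip(factors, driving, dependence):
        if d >= d_mid and p < p_mid:
            quadrant = "independent (driver)"
        elif d >= d_mid:
            quadrant = "linkage"
        elif p >= p_mid:
            quadrant = "dependent"
        else:
            quadrant = "autonomous"
        print(f"{name:16s} driving={d:.1f} dependence={p:.1f} -> {quadrant}")
    ```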

  16. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    Science.gov (United States)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    In order to obtain cassava starch films with mechanical properties improved relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as the filler material of the biofilm, and glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested against the experimental data using a Pareto chart. The modified clay was the factor of greatest statistical significance on the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
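
    A full 2³ factorial fit of the kind described above can be sketched in a few lines of Python: the eight coded runs are regressed on main effects and two-factor interactions, and the resulting coefficients are what a Pareto chart would rank. The response values below are invented for illustration, not the measured tensile strengths from the study.

    ```python
    import numpy as np

    # Coded levels (-1/+1) for the full 2^3 design: starch, glycerol, clay.
    levels = np.array([[s, g, c] for s in (-1, 1) for g in (-1, 1) for c in (-1, 1)])
    tensile = np.array([3.1, 4.0, 2.5, 3.2, 5.8, 6.9, 4.6, 5.5])  # MPa, hypothetical

    # Model matrix with intercept, main effects and two-factor interactions.
    s, g, c = levels.T
    X = np.column_stack([np.ones(8), s, g, c, s * g, s * c, g * c])
    coef, *_ = np.linalg.lstsq(X, tensile, rcond=None)

    names = ["intercept", "starch", "glycerol", "clay",
             "starch*glycerol", "starch*clay", "glycerol*clay"]
    for name, b in zip(names, coef):
        # In a Pareto chart these effects (scaled by their standard errors)
        # would be ranked to show which factors dominate tensile strength.
        print(f"{name:16s} {b:+.3f}")
    ```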

  17. Analysis of the Convention on Nuclear Safety and Suggestions for Improvement

    International Nuclear Information System (INIS)

    Choi, K. S.; Viet, Phuong Nguyen

    2013-01-01

    The innovative approach of the Convention, which is based on incentives rather than legal binding, had been considered successful in strengthening nuclear safety worldwide. However, the nuclear accident at the Fukushima Dai-ichi Nuclear Power Plant (Japan) in March 2011 exposed a number of weaknesses of the Convention. Given that context, this paper analyses the characteristics of the CNS in order to understand the advantages and disadvantages of the Convention, and finally suggests some possible improvements. The analysis in this paper shows that the incentive approach of the CNS has succeeded in facilitating the active roles of its Contracting Parties in preparing the National Reports and participating in the peer review of these reports. However, the inconsistent quality of the National Reports, the different levels of participation in the peer review process by different Contracting Parties, and the lack of transparency of the peer review have undermined the effectiveness of the Convention in strengthening the international safety regime as well as in preventing the serious regulatory errors that had occurred in Japan before the Fukushima accident. Therefore, the peer review process should be reformed in a more transparent and independent direction, while an advisory group of regulators within the CNS might also be useful in improving the effectiveness of the Convention, as already proven by good practice in the European Union. Only with such effective change can the CNS maintain its pivotal role in the international safety regime

  18. Risk analysis of urban gas pipeline network based on improved bow-tie model

    Science.gov (United States)

    Hao, M. J.; You, Q. J.; Yue, Z.

    2017-11-01

    Gas pipeline networks are a major hazard source in urban areas; in the event of an accident, there can be grave consequences. In order to understand more clearly the causes and consequences of gas pipeline network accidents, and to develop prevention and mitigation measures, the authors put forward the application of an improved bow-tie model to analyze the risks of urban gas pipeline networks. The improved bow-tie model analyzes accident causes from four aspects (human, materials, environment and management) and consequences from four aspects (casualty, property loss, environment and society), and then quantifies both. Risk identification, risk analysis, risk assessment, risk control, and risk management are shown clearly in the model figures, which can then suggest prevention and mitigation measures to help reduce the accident rate of the gas pipeline network. The results show that the whole process of an accident can be visually investigated using the bow-tie model, which can also provide reasons for, and predict the consequences of, an unfortunate event. It is of great significance for analyzing leakage failures of gas pipeline networks.

  19. Improvement and error analysis of quantitative information extraction in diffraction-enhanced imaging

    International Nuclear Information System (INIS)

    Yang Hao; Xuan Rui-Jiao; Hu Chun-Hong; Duan Jing-Hao

    2014-01-01

    Diffraction-enhanced imaging (DEI) is a powerful phase-sensitive technique that provides higher spatial resolution and superior contrast of weakly absorbing objects than conventional radiography. It derives contrast from the X-ray absorption, refraction, and ultra-small-angle X-ray scattering (USAXS) properties of an object. The separation of the different contrast contributions from the images is an important issue for the potential application of DEI. In this paper, an improved DEI (IDEI) method is proposed based on Gaussian curve fitting of the rocking curve (RC). Utilizing only three input images, the IDEI method can accurately separate the absorption, refraction, and USAXS contrasts produced by the object, and can therefore be viewed as an improvement on the extended DEI (EDEI) method. The IDEI method circumvents the limitations of the EDEI method since it does not impose a Taylor approximation on the RC. Additionally, an analysis of the IDEI model errors is performed to further investigate the factors that lead to image artifacts, and finally validation studies are conducted using computer simulation and synchrotron experimental data. (interdisciplinary physics and related areas of science and technology)
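
    The core of the fitting step described above, reduced to a sketch: a measured rocking curve is fitted with a Gaussian, whose amplitude, centre shift and width broadening (relative to the intrinsic curve) carry the absorption, refraction and USAXS information respectively. The synthetic curve and starting values below are illustrative, not beamline data or the paper's actual extraction formulas.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic rocking curve: reflectivity versus analyser angle (microrad).
    theta = np.linspace(-10, 10, 81)
    rng = np.random.default_rng(1)
    rc = 0.9 * np.exp(-(theta - 0.6) ** 2 / (2 * 2.5 ** 2))   # assumed shape
    rc_noisy = rc + 0.01 * rng.normal(size=theta.size)

    def gaussian(x, amp, mu, sigma):
        return amp * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

    # Fit the Gaussian model to the measured curve; in a Gaussian-RC treatment
    # the reduced amplitude, shifted centre and broadened width observed with
    # the sample in the beam map to absorption, refraction and USAXS contrast.
    (amp, mu, sigma), _ = curve_fit(gaussian, theta, rc_noisy, p0=[1.0, 0.0, 3.0])
    print(f"amplitude={amp:.3f}, centre={mu:.3f} urad, width sigma={sigma:.3f} urad")
    ```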

  20. SEISMIC FRAGILITY ANALYSIS OF IMPROVED RC FRAMES USING DIFFERENT TYPES OF BRACING

    Directory of Open Access Journals (Sweden)

    HAMED HAMIDI JAMNANI

    2017-04-01

    Full Text Available The application of bracing to increase the lateral stiffness of building structures is a technique of seismic improvement that engineers frequently have recourse to. Accordingly, investigating the role of bracing in concrete structures, along with the development of seismic fragility curves, is of overriding concern to civil engineers. In this research, an ordinary RC building, designed according to the 1st edition of the Iranian seismic code, was selected for examination. According to FEMA 356, this building is considered to be vulnerable. To improve the seismic performance of this building, 3 different types of bracing (Concentrically Braced Frames, Eccentrically Braced Frames and Buckling Restrained Frames) were employed, and each bracing element was distributed in 3 different locations in the building. The researchers developed fragility curves and utilized 30 earthquake records on the Peak Ground Acceleration seismic intensity scale to carry out a time history analysis. Two damage scales, Inter-Story Drift and Plastic Axial Deformation, were also used. The numerical results obtained from this investigation confirm that Plastic Axial Deformation is more reliable than conventional approaches in developing fragility curves for retrofitted frames. In line with what is proposed, the researchers selected the suitable damage scale and developed and compared log-normal fragility curves, first for the original and then for the retrofitted building.
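
    A common way to turn time-history results into a fragility curve, sketched below under simple assumptions: each record contributes a PGA value and a binary indicator of whether the chosen damage state was exceeded, and a log-normal fragility function is fitted by maximum likelihood. The PGA values and exceedance flags are invented for illustration and are not the 30 records analysed in the study.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize

    # Hypothetical results: PGA (g) of each record and whether the damage
    # state was exceeded (1) or not (0).
    pga = np.array([0.08, 0.12, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40, 0.50, 0.60])
    exceeded = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

    def neg_log_like(params):
        ln_median, beta = params
        # Log-normal fragility: P(damage | PGA) = Phi((ln PGA - ln median) / beta)
        p = stats.norm.cdf((np.log(pga) - ln_median) / beta)
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(exceeded * np.log(p) + (1 - exceeded) * np.log(1 - p))

    res = minimize(neg_log_like, x0=[np.log(0.3), 0.4], method="Nelder-Mead")
    median, beta = np.exp(res.x[0]), res.x[1]
    print(f"fragility median PGA = {median:.2f} g, log-standard deviation = {beta:.2f}")
    ```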

  1. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station, and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, when used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.

  2. Using containment analysis to improve component cooling water heat exchanger limits

    International Nuclear Information System (INIS)

    Da Silva, H.C.; Tajbakhsh, A.

    1995-01-01

    The Comanche Peak Steam Electric Station design requires that exit temperatures from the Component Cooling Water Heat Exchanger remain below 330.37 K during the Emergency Core Cooling System recirculation stage, following a hypothetical Loss of Coolant Accident (LOCA). Due to measurements indicating a higher than expected combination of: (a) high fouling factor in the Component Cooling Water Heat Exchanger with (b) high ultimate heat sink temperatures, that might lead to temperatures in excess of the 330.37 K limit, if a LOCA were to occur, TUElectric adjusted key flow rates in the Component Cooling Water network. This solution could only be implemented with improvements to the containment analysis methodology of record. The new method builds upon the CONTEMPT-LT/028 code by: (a) coupling the long term post-LOCA thermohydraulics with a more detailed analytical model for the complex Component Cooling Water Heat Exchanger network and (b) changing the way mass and energy releases are calculated after core reflood and steam generator energy is dumped to the containment. In addition, a simple code to calculate normal cooldowns was developed to confirm RHR design bases were met with the improved limits

  3. IMPROVEMENT OF THE LOCA PSA MODEL USING A BEST-ESTIMATE THERMAL-HYDRAULIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    DONG HYUN LEE

    2014-08-01

    Full Text Available Probabilistic Safety Assessment (PSA) has been widely used to estimate the overall safety of nuclear power plants (NPPs), and it provides base information for risk-informed application (RIA) and risk-informed regulation (RIR). For the effective and correct use of PSA in RIA/RIR-related decision making, the risk estimated by a PSA model should be as realistic as possible. In this work, a best-estimate thermal-hydraulic analysis of loss-of-coolant accidents (LOCAs) for the Hanul Nuclear Units 3&4 is first carried out in a systematic way. That is, the behaviors of the peak cladding temperature (PCT) were analyzed with various combinations of break sizes, the operating conditions of safety systems, and the operator's action time for aggressive secondary cooling. Thereafter, the results of the thermal-hydraulic analysis have been reflected in the improvement of the PSA model by changing both the accident sequences and the success criteria of the event trees for the LOCA scenarios.

  4. Using Critical Discourse Analysis Based Instruction to Improve EFL Learners’ Writing Complexity, Accuracy and Fluency

    Directory of Open Access Journals (Sweden)

    Hamid Marashi

    2016-11-01

    Full Text Available The literature of ELT is perhaps overwhelmed by attempts to enhance learners' writing through the application of different methodologies. One such methodology is critical discourse analysis, which is founded upon stressing not only the decoding of the propositional meaning of a text but also its ideological assumptions. Accordingly, this study was an attempt to investigate the impact of critical discourse analysis-based (CDA) instruction on EFL learners' writing complexity, accuracy, and fluency (CAF). To fulfill the purpose of this study, 60 female intermediate EFL learners were selected from among a total number of 100 through their performance on a piloted sample PET. Based on the results, the students were randomly assigned to a control and an experimental group with 30 participants in each. Both groups underwent the same amount of teaching time during 17 sessions, which included a treatment of CDA instruction for the experimental group. A writing posttest was administered at the end of the instruction to both groups and their mean scores on the test were compared through a MANOVA. The results led to the rejection of the three null hypotheses, thereby demonstrating that the learners in the experimental group benefited significantly more than those in the control group in terms of improving their writing CAF. Accordingly, it is recommended that CDA instruction be incorporated more frequently in writing classes, following, of course, adequate syllabus design and materials development.

  5. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    International Nuclear Information System (INIS)

    Sandusky, Peter; Appiah-Amponsah, Emmanuel; Raftery, Daniel

    2011-01-01

    One dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY-based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate the taurine concentrations and distort its variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY determined taurine concentrations produce better scores plot subpopulation cluster resolution.

  6. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    Energy Technology Data Exchange (ETDEWEB)

    Sandusky, Peter [Eckerd College, Department of Chemistry (United States); Appiah-Amponsah, Emmanuel; Raftery, Daniel, E-mail: raftery@purdue.edu [Purdue University, Department of Chemistry (United States)

    2011-04-15

    One dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY-based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate the taurine concentrations and distort its variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY determined taurine concentrations produce better scores plot subpopulation cluster resolution.
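
    The PCA step that both records above refer to can be sketched with a small, invented data matrix: rows are samples, columns are metabolite concentrations (e.g. as obtained from 1D TOCSY read peaks), and the scores from a mean-centred, scaled SVD give the scores plot on which subpopulation clusters are judged. The sample values and the group shift in the first column are illustrative only, not the study's urine measurements.

    ```python
    import numpy as np

    # Hypothetical metabolite concentration matrix: 20 urine samples x 4
    # metabolites; the first metabolite ("taurine") differs between groups.
    rng = np.random.default_rng(2)
    group_a = rng.normal([4.0, 10.0, 2.0, 1.0], 0.3, size=(10, 4))
    group_b = rng.normal([6.5, 10.0, 2.0, 1.0], 0.3, size=(10, 4))
    X = np.vstack([group_a, group_b])

    # Mean-centre and scale each metabolite, then take the SVD; right singular
    # vectors are the PC loadings and the projections are the PC scores.
    Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt.T
    explained = s ** 2 / np.sum(s ** 2)

    print("variance explained by PC1, PC2:", np.round(explained[:2], 2))
    print("PC1 scores, group A:", np.round(scores[:10, 0], 1))
    print("PC1 scores, group B:", np.round(scores[10:, 0], 1))
    ```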

  7. Drought Characteristic Analysis Based on an Improved PDSI in the Wei River Basin of China

    Directory of Open Access Journals (Sweden)

    Lei Zou

    2017-03-01

    Full Text Available In this study, to improve the efficiency of the original Palmer Drought Severity Index (PDSI_original), we coupled the Soil and Water Assessment Tool (SWAT) and PDSI_original to construct a drought index called PDSI_SWAT. The constructed PDSI_SWAT is applied to the Wei River Basin (WRB) of China during 1960–2012. The comparison of the PDSI_SWAT with four other commonly used drought indices reveals the effectiveness of the PDSI_SWAT in describing the drought propagation processes in the WRB. The whole WRB exhibits a drying trend, with more significant trends in the northern, southeastern and western WRB than in the remaining regions. Furthermore, the drought frequencies show that drought is more likely to occur in the northern part than in the southern part of the WRB. The principal component analysis method based on the PDSI_SWAT reveals that the whole basin can be further divided into three distinct sub-regions with different drought variability, i.e., the northern, southeastern and western parts. Additionally, these three sub-regions are also consistent with the spatial pattern of drought shown by the drought frequency. The wavelet transform analysis method indicates that El Niño-Southern Oscillation (ENSO) events have strong impacts on inducing droughts in the WRB. The results of this study could be beneficial for scientific water resources management and drought assessment in the study area and also provide a valuable reference for other areas with similar climatic characteristics.

  8. Cross-sectional atom probe tomography sample preparation for improved analysis of fins on SOI

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Andrew J., E-mail: andy.martin@globalfoundries.com; Weng, Weihao; Zhu, Zhengmao; Loesing, Rainer; Shaffer, James; Katnani, Ahmad

    2016-02-15

    Sample preparation for atom probe tomography of 3D semiconductor devices has proven to significantly affect field evaporation and the reliability of reconstructed data. A cross-sectional preparation method is applied to state-of-the-art Si finFET technology on SOI. This preparation approach advantageously provides a conductive path for voltage and heat, offers analysis of many fins within a single tip, and improves resolution across interfaces of particular interest. Measured B and Ge profiles exhibit good correlation with SIMS and EDX and show no signs of B clustering or pile-up near the Si/SiGe interface of the fin. - Highlights: • Cross-section atom probe tomography sample preparation of fins on SOI. • >5 fins captured in single atom probe tip via cross-section method. • Oxides affect collection efficiency, reconstruction accuracy, and data reliability. • Sample orientation affects field evaporation of dissimilar materials. • Data is well-matched to SIMS and EDX analysis.

  9. An improved quadratic inference function for parameter estimation in the analysis of correlated data.

    Science.gov (United States)

    Westgate, Philip M; Braun, Thomas M

    2013-08-30

    Generalized estimating equations (GEE) are commonly employed for the analysis of correlated data. However, the quadratic inference function (QIF) method is increasing in popularity because of its multiple theoretical advantages over GEE. We focus on the fact that the QIF method is more efficient than GEE when the working covariance structure for the data is misspecified. It has been shown that, because of the use of an empirical weighting covariance matrix inside its estimating equations, the QIF method's realized estimation performance can be inferior to GEE's when the number of independent clusters is not large. We therefore propose an alternative weighting matrix for the QIF, which asymptotically is an optimally weighted combination of the empirical covariance matrix and its model-based version, derived by minimizing its expected quadratic loss. Use of the proposed weighting matrix maintains the large-sample advantages the QIF approach has over GEE and, as shown via simulation, improves small-sample parameter estimation. We also illustrate the proposed method in the analysis of a longitudinal study. Copyright © 2012 John Wiley & Sons, Ltd.
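
    The idea of combining an empirical weighting matrix with a model-based target can be illustrated with a generic linear-shrinkage sketch in Python. The mixing weight below is a fixed, assumed value; the paper derives its weight by minimizing an expected quadratic loss, which is not reproduced here, and the "model-based" target is taken to be a simple diagonal matrix purely for illustration.

    ```python
    import numpy as np

    # Simulated cluster-level quantities for a study with a small number of
    # independent clusters (the setting where the empirical matrix is noisy).
    rng = np.random.default_rng(3)
    n_clusters, dim = 20, 4
    g = rng.normal(size=(n_clusters, dim))

    emp_cov = np.cov(g, rowvar=False, bias=True)     # empirical weighting matrix
    model_cov = np.diag(np.diag(emp_cov))            # assumed model-based target

    alpha = 0.3                                      # assumed mixing weight
    combined = alpha * model_cov + (1 - alpha) * emp_cov

    # The combined matrix is typically better conditioned than the raw
    # empirical one, which is the practical motivation for the combination.
    print("condition number, empirical:", round(float(np.linalg.cond(emp_cov)), 1))
    print("condition number, combined: ", round(float(np.linalg.cond(combined)), 1))
    ```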

  10. PROJECT MANAGEMENT OF CULTURAL EVENTS IN A PUBLIC UNIVERSITY: ANALYSIS OF PRACTICES AND IMPROVEMENT PROPOSITIONS

    Directory of Open Access Journals (Sweden)

    Cláudia Fabiana Gohr,

    2013-09-01

    Full Text Available This article aims to describe the management practices used for conducting cultural events adopted by the Department of Culture of a public university, identifying critical points and providing the basis for developing a specific methodology for the organization. The research adopted a qualitative approach, both to identify practices with those responsible for project management and to conduct a case study of a project undertaken by the institution. For data collection we used semi-structured interviews, participant observation and document analysis. It was found that the project management of cultural events at the university has some particularities, which require the development of specific management processes. Moreover, when compared with the best practices suggested by the literature, the project management of events conducted by the university is still at an incipient stage of maturity. Based on the results, we present some proposals for improving the project management of cultural events in the Department of Culture, especially regarding the adoption of management tools for the projects and activities that must be performed at each stage of the life cycle of a cultural event.

  11. Improved process analytical technology for protein A chromatography using predictive principal component analysis tools.

    Science.gov (United States)

    Hou, Ying; Jiang, Canping; Shukla, Abhinav A; Cramer, Steven M

    2011-01-01

    Protein A chromatography is widely employed for the capture and purification of antibodies and Fc-fusion proteins. Due to the high cost of protein A resins, there is a significant economic driving force for using these chromatographic materials for a large number of cycles. The maintenance of column performance over the resin lifetime is also a significant concern in large-scale manufacturing. In this work, several statistical methods are employed to develop a novel principal component analysis (PCA)-based tool for predicting protein A chromatographic column performance over time. A method is developed to carry out detection of column integrity failures before their occurrence without the need for a separate integrity test. In addition, analysis of various transitions in the chromatograms was also employed to develop PCA-based models to predict both subtle and general trends in real-time protein A column yield decay. The developed approach has significant potential for facilitating timely and improved decisions in large-scale chromatographic operations in line with the process analytical technology (PAT) guidance from the Food and Drug Administration (FDA). © 2010 Wiley Periodicals, Inc.

  12. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    Science.gov (United States)

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  13. Is Recreational Soccer Effective for Improving VO2max? A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Milanović, Zoran; Pantelić, Saša; Čović, Nedim; Sporiš, Goran; Krustrup, Peter

    2015-09-01

    Soccer is the most popular sport worldwide, with a long history and currently more than 500 million active participants, of whom 300 million are registered football club members. On the basis of scientific findings showing positive fitness and health effects of recreational soccer, FIFA (Fédération Internationale de Football Association) introduced the slogan "Playing football for 45 min twice a week-best prevention of non-communicable diseases" in 2010. The objective of this paper was to perform a systematic review and meta-analysis of the literature to determine the effects of recreational soccer on maximal oxygen uptake (VO2max). Six electronic databases (MEDLINE, PubMed, SPORTDiscus, Web of Science, CINAHL and Google Scholar) were searched for original research articles. A manual search was performed to cover the areas of recreational soccer, recreational physical activity, recreational small-sided games and VO2max using the following key terms, either singly or in combination: recreational small-sided games, recreational football, recreational soccer, street football, street soccer, effect, maximal oxygen uptake, peak oxygen uptake, cardiorespiratory fitness, VO2max. The inclusion criteria were divided into four sections: type of study, type of participants, type of interventions and type of outcome measures. Probabilistic magnitude-based inferences for meta-analysed effects were based on standardised thresholds for small, moderate and large changes (0.2, 0.6 and 1.2, respectively) derived from between-subject standard deviations for baseline fitness. Seventeen studies met the inclusion criteria and were included in the systematic review and meta-analysis. Mean differences showed that VO2max increased by 3.51 mL/kg/min (95 % CI 3.07-4.15) over a recreational soccer training programme in comparison with other training models. The meta-analysed effects of recreational soccer on VO2max compared with the controls of no exercise, continuous running and strength

  14. Crossing the Barriers: An Analysis of Permitting Barriers to Geothermal Development and Potential Improvement Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Young, Katherine R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-10-04

    Developers have identified many non-technical barriers to geothermal power development, including permitting. Activities required for permitting, such as the associated environmental reviews, can take a considerable amount of time and delay project development. This paper discusses the impacts to geothermal development timelines due to the permitting challenges, including the regulatory framework, environmental review process, and ancillary permits. We identified barriers that have the potential to prevent geothermal development or delay timelines and defined improvement scenarios that could assist in expediting geothermal development and permitting timelines and lead to the deployment of additional geothermal resources by 2030 and 2050: (1) the creation of a centralized federal geothermal permitting office and utilization of state permit coordination offices as well as (2) an expansion of existing categorical exclusions applicable to geothermal development on Bureau of Land Management public lands to include the oil and gas categorical exclusions passed as part of the Energy Policy Act of 2005. We utilized the Regional Energy Deployment System (ReEDS) and the Geothermal Electricity Technology Evaluation Model (GETEM) to forecast baseline geothermal deployment based on previous analysis of geothermal project development and permitting timelines. The model results forecast that reductions in geothermal project timelines can have a significant impact on geothermal deployment. For example, using the ReEDS model, we estimated that reducing timelines by two years, perhaps due to the creation of a centralized federal geothermal permitting office and utilization of state permit coordination offices, could result in deployment of an additional 204 MW by 2030 and 768 MW by 2050 - a 13% improvement when compared to the business as usual scenario. The model results forecast that a timeline improvement of four years - for example with an expansion of existing categorical

  15. Systematic Review and Meta-Analysis of Interventions to Improve Access and Coverage of Adolescent Immunizations.

    Science.gov (United States)

    Das, Jai K; Salam, Rehana A; Arshad, Ahmed; Lassi, Zohra S; Bhutta, Zulfiqar A

    2016-10-01

    Vaccination strategies are among the most successful and cost-effective public health strategies for preventing disease and death. Until recently, most existing immunization programs targeted infants and children younger than 5 years, which has successfully reduced global infant and child mortality. Adolescent immunization has been relatively neglected, leaving a quarter of the world's population underimmunized and hence vulnerable to a number of preventable diseases. In recent years, a large number of programs have been launched to increase the uptake of different vaccines in adolescents; however, the recommended vaccination coverage among the adolescent population overall remains very low, especially in low- and middle-income countries. Adolescent vaccination has received significantly more attention since the advent of the human papillomavirus (HPV) vaccine in 2006. However, only half of the adolescent girls in the United States received a single dose of HPV vaccine, while merely 43% and 33% received two and three doses, respectively. We systematically reviewed literature published up to December 2014 and included 23 studies on the effectiveness of interventions to improve immunization coverage among adolescents. Moderate-quality evidence suggested an overall increase in vaccination coverage by 78% (relative risk: 1.78; 95% confidence interval: 1.41-2.23). Review findings suggest that interventions including implementing vaccination requirements in school, sending reminders, and national permissive recommendations for adolescent vaccination have the potential to improve immunization uptake. Strategies to improve coverage for HPV vaccines resulted in a significant decrease in the prevalence of HPV by 44% and genital warts by 33%; however, the quality of evidence was low. Analysis from single studies with low- or very low-quality evidence suggested significant decreases in varicella deaths, measles incidence, rubella susceptibility, and incidence of

  16. Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marten, Alex; Kopp, Robert E.; Shouse, Kate C.; Griffiths, Charles; Hodson, Elke L.; Kopits, Elizabeth; Mignone, Bryan K.; Moore, Chris; Newbold, Steve; Waldhoff, Stephanie T.; Wolverton, Ann

    2013-04-01

    to updating the estimates regularly as modeling capabilities and scientific and economic knowledge improves. To help foster further improvements in estimating the SCC, the U.S. Environmental Protection Agency and the U.S. Department of Energy hosted a pair of workshops on “Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis.” The first focused on conceptual and methodological issues related to integrated assessment modeling and the second brought together natural and social scientists to explore methods for improving damage assessment for multiple sectors. These two workshops provide the basis for the 13 papers in this special issue.

  17. Interventions to improve gross motor performance in children with neurodevelopmental disorders: a meta-analysis.

    Science.gov (United States)

    Lucas, Barbara R; Elliott, Elizabeth J; Coggan, Sarah; Pinto, Rafael Z; Jirikowic, Tracy; McCoy, Sarah Westcott; Latimer, Jane

    2016-11-29

    Gross motor skills are fundamental to childhood development. The effectiveness of current physical therapy options for children with mild to moderate gross motor disorders is unknown. The aim of this study was to systematically review the literature to investigate the effectiveness of conservative interventions to improve gross motor performance in children with a range of neurodevelopmental disorders. A systematic review with meta-analysis was conducted. MEDLINE, EMBASE, AMED, CINAHL, PsycINFO, PEDro, Cochrane Collaboration, Google Scholar databases and clinical trial registries were searched. Published randomised controlled trials including children 3 to ≤18 years with (i) Developmental Coordination Disorder (DCD) or Cerebral Palsy (CP) (Gross Motor Function Classification System Level 1) or Developmental Delay or Minimal Acquired Brain Injury or Prematurity (Disorders; and (ii) receiving non-pharmacological or non-surgical interventions from a health professional and (iii) gross motor outcomes obtained using a standardised assessment tool. Meta-analysis was performed to determine the pooled effect of intervention on gross motor function. Methodological quality and strength of meta-analysis recommendations were evaluated using PEDro and the GRADE approach respectively. Of 2513 papers, 9 met inclusion criteria including children with CP (n = 2) or DCD (n = 7) receiving 11 different interventions. Only two of 9 trials showed an effect for treatment. Using the least conservative trial outcomes a large beneficial effect of intervention was shown (SMD:-0.8; 95% CI:-1.1 to -0.5) with "very low quality" GRADE ratings. Using the most conservative trial outcomes there is no treatment effect (SMD:-0.1; 95% CI:-0.3 to 0.2) with "low quality" GRADE ratings. Study limitations included the small number and poor quality of the available trials. Although we found that some interventions with a task-orientated framework can improve gross motor outcomes in children with

  18. IMPROVEMENT OF EXPERT ANALYSIS FOR ROAD TRAFFIC ACCIDENTS USING COMPUTER SIMULATION PROGRAMS

    Directory of Open Access Journals (Sweden)

    S. A. Azemsha

    2015-01-01

    Full Text Available The existing methods of auto-technical expertise presuppose the selection of some parameters on the basis of the expert's intuition and experience; vehicle type, loading rate and road conditions are also not taken into account when deceleration is to be determined. The analysis carried out here establishes that the application of special software makes it possible to significantly improve the efficiency of the work directed at solving the assigned tasks, to speed up calculation processes, to qualitatively decrease the probability of arithmetic errors, and to visualize the results of the investigations. The possibility of using various models for the dynamic simulation of vehicle motion and collision (in the form of 3D models) is established in the paper, taking into account the specific features of the vehicle's technical condition, its loading rate and the condition of the roadway surface. The paper also makes it possible to obtain a dynamic display of the reconstructed accident mechanism in axonometric projection and to produce video clips with the camera positioned at any spatial point: road, roadside, a raised position, a moving vehicle, or the driver's seat in the vehicle. The paper contains an analysis of the capabilities of road traffic accident simulation programs and a statistical analysis showing the significance of differences between simulation results when various programs are used. The paper presents the initial data and results of vehicle speed calculation on the basis of braking track length, obtained with the help of a road traffic accident express analysis (a classical approach) and PC-Crash when additional influencing factors are taken into account. A number of shortcomings that must be removed in the analyzed software products were revealed while analyzing the simulation results. On the basis of the executed analysis in

  19. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results on different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available at the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .

  20. Improvement of the analysis of the biochemical oxygen demand (BOD) of Mediterranean seawater by seeding control.

    Science.gov (United States)

    Simon, F Xavier; Penru, Ywann; Guastalli, Andrea R; Llorens, Joan; Baig, Sylvie

    2011-07-15

    Biochemical oxygen demand (BOD) is a useful parameter for assessing the biodegradability of dissolved organic matter in water. At the same time, this parameter is used to evaluate the efficiency with which certain processes remove biodegradable natural organic matter (NOM). However, the values of BOD in seawater are very low (around 2 mg O2 L-1) and the methods used for its analysis are poorly developed. The increasing attention given to seawater desalination in the Mediterranean environment, and related phenomena such as reverse osmosis membrane biofouling, have stimulated interest in seawater BOD close to the Spanish coast. In this study the BOD analysis protocol was refined by introducing a new step in which a critical quantity of autochthonous microorganisms, measured as adenosine triphosphate, is added. For the samples analyzed, this improvement allowed us to obtain reliable and replicable BOD measurements, standardized with solutions of glucose-glutamic acid and acetate. After 7 days of analysis, more than 80% of the ultimate BOD is achieved, which in the case of easily biodegradable compounds represents nearly 60% of the theoretical oxygen demand. The BOD7 obtained for the Mediterranean Sea was found to be 2.0 ± 0.3 mg O2 L-1, but this value decreased with seawater storage time due to the rapid consumption of labile compounds. No significant differences were found between two sampling points located on the Spanish coast, since their organic matter content was similar. Finally, the determination of seawater BOD without the use of an inoculum may lead to an underestimation of BOD. Copyright © 2011 Elsevier B.V. All rights reserved.
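
    The relation between 7-day and ultimate BOD quoted above is consistent with simple first-order BOD kinetics, BOD_t = BOD_u (1 - e^(-kt)). The sketch below evaluates that curve for an assumed rate constant; the value of k is illustrative and is not reported in the abstract, which only states that more than 80% of ultimate BOD is reached after 7 days.

    ```python
    import numpy as np

    # First-order BOD kinetics: BOD_t = BOD_u * (1 - exp(-k * t)).
    bod_ultimate = 2.0   # mg O2/L, of the order reported for Mediterranean seawater
    k = 0.25             # 1/day, assumed first-order rate constant (illustrative)

    t = np.arange(0, 15)                       # days
    bod_t = bod_ultimate * (1 - np.exp(-k * t))

    frac_at_7 = bod_t[7] / bod_ultimate
    print(f"fraction of ultimate BOD exerted after 7 days: {frac_at_7:.0%}")
    ```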

  1. Improvements in quantification of low-Z element analysis for SR- and conventional TXRF

    International Nuclear Information System (INIS)

    Baur, K.; Brennan, S.; Pianetta, P.; Kerner, J.; Zhu, Q.; Burrow, B.

    2000-01-01

    As the dimensions of integrated circuits continue to shrink, the amount of tolerable contamination on Si wafer surfaces also decreases. Contaminants of primary concern are transition metals and light elements like Al. Total reflection X-ray fluorescence (TXRF) spectroscopy using synchrotron radiation from the Stanford Synchrotron Radiation Laboratory (SSRL) is one of the most powerful techniques for trace impurity analysis on Si wafer surfaces. In addition, it is among the more sensitive techniques and the only one that is non-destructive. Having established a better detection sensitivity for transition elements than that required by the semiconductor industry, the current effort focuses on improving the sensitivity of the detection and data analysis of light elements. Due to the presence of the neighboring Si signal from the substrate, this can only be achieved by tuning the excitation energy below the Si K absorption edge. For conventional TXRF systems this can be done by using the W-M fluorescence line (1.78 keV) for excitation or by employing the tunability of synchrotron radiation. However, this results in a substantial increase in background due to resonant X-ray Raman scattering. This scattering dominates the background behavior of the Al K fluorescence line, and consequently limits the achievable sensitivity for the detection of Al surface contaminants. In particular, we find that for a precise determination of the achievable sensitivity, the specific shape of the continuous Raman background must be used in the deconvolution. This data analysis opens a new perspective for conventional TXRF systems to overcome background problems in quantification, and first results will be presented. (author)

  2. Using the failure mode and effects analysis model to improve parathyroid hormone and adrenocorticotropic hormone testing

    Directory of Open Access Journals (Sweden)

    Magnezi R

    2016-12-01

    Full Text Available Background: Risk management in health care systems applies to all hospital employees and directors as they deal with human life and emergency routines. There is a constant need to decrease risk and increase patient safety in the hospital environment. The purpose of this article is to review the laboratory testing procedures for parathyroid hormone and adrenocorticotropic hormone (which are characterized by short half-lives), to track failure modes and risks, and to offer solutions to prevent them. During a routine quality improvement review at the Endocrine Laboratory in Tel Hashomer Hospital, we discovered that these tests are frequently repeated unnecessarily due to multiple failures. The repetition of the tests inconveniences patients and leads to extra work for the laboratory and logistics personnel as well as the nurses and doctors who have to perform many tasks with limited resources. Methods: A team of eight staff members, accompanied by the Head of the Endocrine Laboratory, carried out the analysis. The failure mode and effects analysis model (FMEA) was used to analyze the laboratory testing procedure and was designed to simplify the process steps and indicate and rank possible failures. Results: A total of 23 failure modes were found within the process, 19 of which were ranked by level of severity. The FMEA model prioritizes failures by their risk priority number (RPN). For example, the most serious failure was the delay after the samples were collected from the department (RPN = 226.1). Conclusion: This model helped us to visualize the process in a simple way. After analyzing the information, solutions were proposed to prevent failures, and a method to completely avoid the top four problems was also developed. Keywords: failure mode
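
    The RPN ranking at the heart of the FMEA described above reduces to a few lines: each failure mode is scored for severity, occurrence and detection, and modes are ranked by RPN = S x O x D. The failure modes and scores below are invented for illustration; they are not the 23 modes or the team-averaged scores from the laboratory study.

    ```python
    # Minimal FMEA ranking sketch (hypothetical failure modes and 1-10 scores).
    failure_modes = [
        # (description, severity, occurrence, detection)
        ("delay after sample collection from the department", 8, 7, 4),
        ("sample not kept chilled during transport",          7, 5, 5),
        ("tube mislabelled at phlebotomy",                    9, 2, 6),
        ("analyser calibration drift",                        6, 3, 3),
    ]

    # Risk priority number: RPN = severity * occurrence * detection.
    ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes),
                    reverse=True)
    for rpn, name in ranked:
        print(f"RPN {rpn:4d}  {name}")
    ```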

  3. An improved Agrobacterium-mediated transformation system for the functional genetic analysis of Penicillium marneffei.

    Science.gov (United States)

    Kummasook, Aksarakorn; Cooper, Chester R; Vanittanakom, Nongnuch

    2010-12-01

    We have developed an improved Agrobacterium-mediated transformation (AMT) system for the functional genetic analysis of Penicillium marneffei, a thermally dimorphic, human pathogenic fungus. Our AMT protocol included the use of conidia or pre-germinated conidia of P. marneffei as the host recipient for T-DNA from Agrobacterium tumefaciens and co-cultivation at 28°C for 36 hours. Bleomycin-resistant transformants were selected as yeast-like colonies following incubation at 37°C. The efficiency of transformation was approximately 123 ± 3.27 and 239 ± 13.12 transformants per plate when using 5 × 10^4 conidia and pre-germinated conidia as starting materials, respectively. Southern blot analysis demonstrated that 95% of transformants contained single copies of T-DNA. Inverse PCR was employed for identifying flanking sequences at the T-DNA insertion sites. Analysis of these sequences indicated that integration occurred as random recombination events. Among the mutants isolated were previously described stuA and gasC defective strains. These AMT-derived mutants possessed single T-DNA integrations within their particular coding sequences. In addition, other morphological and pigmentation mutants possessing a variety of gene-specific defects were isolated, including two mutants having T-DNA integrations within putative promoter regions. One of the latter integration events was accompanied by the deletion of the entire corresponding gene. Collectively, these results indicated that AMT could be used for large-scale, functional genetic analyses in P. marneffei. Such analyses can potentially facilitate the identification of those genetic elements related to morphogenesis, as well as pathogenesis in this medically important fungus.

  4. Improving configuration management of thermalhydraulic analysis by automating the linkage between pipe geometry and plant idealization

    International Nuclear Information System (INIS)

    Gibb, R.; Girard, R.; Thompson, W.

    1997-01-01

    All safety analysis codes require some representation of actual plant data as a part of their input. Such representations, referred to at Point Lepreau Generating Station (PLGS) as plant idealizations, may include piping layout, orifice, pump or valve opening characteristics, boundary conditions of various sorts, reactor physics parameters, etc. As computing power increases, the numerical capabilities of thermalhydraulic analysis tools become more sophisticated, requiring more detailed assessments, and consequently more complex and complicated idealizations of the system models. Thus, a need has emerged to create a precise plant model layout in electronic form which ensures a realistic representation of the plant systems, and from which analytical approximations of any chosen degree of accuracy may be created. The benefits of this process are twofold. Firstly, the job of developing a plant idealization is made simpler, and therefore cheaper for the utility. More important, however, are the improvements in documentation and reproducibility that this process imparts to the resultant idealization. Just as the software that performs the numerical operations on the input data must be subject to verification/validation, equally robust measures must be taken to ensure that these software operations are applied to valid idealizations that are formally documented. Since the CATHENA code is one of the most important thermalhydraulic codes used for safety analysis at PLGS, the main effort was directed towards the system plant models for this code. This paper reports the results of the work carried out at PLGS and ANSL to link the existing piping database to the actual CATHENA plant idealization. An introduction to the concept is given first, followed by a description of the databases, the supervisory tool which manages the data, and the associated software. An intermediate code, which applied some thermalhydraulic rules to the data, and translated the resultant data

  5. An Improved Rigid Multibody Model for the Dynamic Analysis of the Planetary Gearbox in a Wind Turbine

    Directory of Open Access Journals (Sweden)

    Wenguang Yang

    2016-01-01

    This paper proposes an improved rigid multibody model for the dynamic analysis of the planetary gearbox in a wind turbine. The improvements mainly include choosing the inertia frame as the reference frame of the carrier, the ring, and the sun, and adding a new degree of freedom for each planet. An element assembly method is introduced to build the model, and a time-varying mesh stiffness model is presented. A planetary gear study case is employed to verify the validity of the improved model. Comparisons between the improved model and the traditional model show that the natural characteristics are very close; the improved model can obtain the right equivalent moment of inertia of the planetary gear in the transient simulation, and all the rotation speeds satisfy the transmission relationships well; harmonic resonance and resonance modulation phenomena can be found in their vibration signals. The improved model is applied in a multistage gearbox dynamics analysis to reveal the prospects of the model. Modal analysis and transient analysis with and without time-varying mesh stiffness considered are conducted. The rotation speeds from the transient analysis are consistent with the theory, and resonance modulation can be found in the vibration signals.
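
    The record does not give the form of the time-varying mesh stiffness model, but a common textbook approximation is a square wave alternating between single- and double-tooth-pair engagement. The sketch below implements that generic approximation; the mesh frequency, stiffness values, and contact ratio are illustrative placeholders, not parameters from the paper.

```python
import numpy as np

def mesh_stiffness(t, f_mesh=500.0, k_single=4.0e8, k_double=6.5e8, contact_ratio=1.6):
    """Square-wave approximation of gear mesh stiffness (N/m).

    During each mesh cycle, a fraction (contact_ratio - 1) of the period
    has two tooth pairs in contact and the remainder a single pair.
    All numerical values here are illustrative placeholders.
    """
    phase = (t * f_mesh) % 1.0                      # position within one mesh cycle
    double_contact = phase < (contact_ratio - 1.0)  # fraction with two pairs engaged
    return np.where(double_contact, k_double, k_single)

# Example: sample two milliseconds of the stiffness waveform
t = np.linspace(0.0, 2.0e-3, 1000)
k = mesh_stiffness(t)
print(k.min(), k.max())
```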

  6. A Bayesian technique for improving the sensitivity of the atmospheric neutrino L/E analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, A. S. T. [Univ. of Cambridge (United Kingdom); Chapman, J. D. [Univ. of Cambridge (United Kingdom); Thomson, M. A. [Univ. of Cambridge (United Kingdom)

    2013-04-01

    This paper outlines a method for improving the precision of atmospheric neutrino oscillation measurements. One experimental signature for these oscillations is an observed deficit in the rate of νμ charged-current interactions with an oscillatory dependence on Lν/Eν, where Lν is the neutrino propagation distance and Eν is the neutrino energy. For contained-vertex atmospheric neutrino interactions, the Lν/Eν resolution varies significantly from event to event. The precision of the oscillation measurement can be improved by incorporating information on Lν/Eν resolution into the oscillation analysis. In the analysis presented
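
    For context, the underlying two-flavour survival probability is P(νμ→νμ) = 1 − sin²(2θ) sin²(1.27 Δm² Lν/Eν), with Lν in km, Eν in GeV, and Δm² in eV². The sketch below evaluates this standard formula and, as a loose illustration of the resolution idea, averages the prediction over a Gaussian spread in log10(L/E); the smearing width and the averaging scheme are assumptions, not the Bayesian treatment developed in the paper.

```python
import numpy as np

def survival_prob(L_over_E, sin2_2theta=1.0, dm2=2.4e-3):
    """Two-flavour nu_mu survival probability.

    L_over_E in km/GeV, dm2 in eV^2:
    P = 1 - sin^2(2theta) * sin^2(1.27 * dm2 * L/E).
    """
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2 * L_over_E) ** 2

def smeared_prob(L_over_E, sigma_log10=0.2, n_samples=200,
                 rng=np.random.default_rng(0)):
    """Average the prediction over a Gaussian spread in log10(L/E)
    (illustrative stand-in for an event-by-event resolution)."""
    log_loe = np.log10(L_over_E)
    samples = 10.0 ** rng.normal(log_loe, sigma_log10,
                                 size=(n_samples,) + np.shape(L_over_E))
    return survival_prob(samples).mean(axis=0)

print(survival_prob(500.0), smeared_prob(500.0))
```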

  7. Gynecologic Oncology Group quality assurance audits: analysis and initiatives for improvement.

    Science.gov (United States)

    Blessing, John A; Bialy, Sally A; Whitney, Charles W; Stonebraker, Bette L; Stehman, Frederick B

    2010-08-01

    The Gynecologic Oncology Group (GOG) is a multi-institution, multi-discipline Cooperative Group funded by the National Cancer Institute (NCI) to conduct clinical trials which investigate the treatment, prevention, control, quality of survivorship, and translational science of gynecologic malignancies. In 1982, the NCI initiated a program of on-site quality assurance audits of participating institutions. Each is required to be audited at least once every 3 years. In GOG, the audit mandate is the responsibility of the GOG Quality Assurance Audit Committee and is centralized in the Statistical and Data Center (SDC). Each component (Regulatory, Investigational Drug Pharmacy, Patient Case Review) is classified as Acceptable; Acceptable, follow-up required; or Unacceptable. The objective of this analysis was to determine frequently occurring deviations and to develop focused, innovative solutions to address them. A database was created to examine the deviations noted at the most recent audit conducted at 57 GOG parent institutions during 2004-2007. Cumulatively, this involved 687 patients and 306 protocols. The results documented commendable performance: Regulatory (39 Acceptable; 17 Acceptable, follow-up; 1 Unacceptable); Pharmacy (41 Acceptable; 3 Acceptable, follow-up; 1 Unacceptable; 12 N/A); Patient Case Review (31 Acceptable; 22 Acceptable, follow-up; 4 Unacceptable). The nature of major and lesser deviations was analyzed to create and enhance initiatives for improvement of the quality of clinical research. As a result, Group-wide proactive initiatives were undertaken, audit training sessions have emphasized recurring issues, and GOG Data Management Subcommittee agendas have provided targeted instruction and training. The analysis was based upon parent institutions only; affiliate institutions and Community Clinical Oncology Program participants were not included, although it is assumed their areas of difficulty are similar. The coordination of the GOG Quality Assurance Audit program in the SDC has

  8. Trabeculectomy Improves Vessel Response Measured by Dynamic Vessel Analysis (DVA) in Glaucoma Patients.

    Science.gov (United States)

    Selbach, J Michael; Schallenberg, Maurice; Kramer, Sebastian; Anastassiou, Gerasimos; Steuhl, Klaus-Peter; Vilser, Walthard; Kremmer, Stephan

    2014-01-01

    To determine the effects of surgical IOP reduction (trabeculectomy) on retinal blood flow parameters in glaucoma patients using Dynamic Vessel Analysis (DVA). 26 eyes of 26 patients with progressive primary open-angle glaucoma (POAG) despite maximal topical therapy were examined before and after trabeculectomy. The responses of the retinal vessels to flickering light provocation were measured with DVA the day before surgery and 4 to 6 weeks after trabeculectomy. Between 3 and 4 weeks before surgery, all local therapies were stopped and systemic therapy with acetazolamide and preservative-free topical steroid eye drops was started. In 19 patients (73%), an inadequate response to the flicker stimulation was measured preoperatively. In these patients, the maximum dilation of arteries and veins was reduced significantly as compared to healthy eyes. In this group, the maximum dilation of the arteries following the flicker provocation improved from 1.4% before to 3.8% following trabeculectomy (p<0.01). In retinal veins, this parameter increased from 3.1% to 4.6% (p<0.05). In the 7 patients whose arterial and venous reactions to flickering light provocation did not differ preoperatively from healthy eyes, there was no significant change after surgery. The initial baseline values of arteries and veins (MU) did not deviate significantly in either group. POAG patients with progressive disease and impaired vascular regulation benefit from IOP-lowering trabeculectomy in terms of vascular reactivity and dilative reserve, indicating a possible improvement of retinal perfusion following effective IOP control. Future studies with long-term follow-up must determine the clinical importance of these findings for the treatment of glaucoma patients.
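
    The flicker-response metric reported here is essentially the peak vessel diameter during flicker stimulation expressed as a percentage of the resting diameter. The sketch below shows that arithmetic; the baseline definition and the averaging over flicker cycles used by the commercial DVA software are more involved, so this is only an illustrative simplification with made-up values.

```python
def flicker_dilation_percent(diameter_trace, baseline_mean):
    """Maximum flicker-induced dilation as a percentage of the resting
    vessel diameter (illustrative; the DVA device applies its own
    baseline definition and cycle averaging)."""
    peak = max(diameter_trace)
    return 100.0 * (peak - baseline_mean) / baseline_mean

# Example with made-up measurements in arbitrary DVA units
baseline = 142.0
trace = [141.5, 143.0, 146.8, 147.4, 145.2]
print(f"{flicker_dilation_percent(trace, baseline):.1f} %")
```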

  9. FDG uptake heterogeneity evaluated by fractal analysis improves the differential diagnosis of pulmonary nodules

    International Nuclear Information System (INIS)

    Miwa, Kenta; Inubushi, Masayuki; Wagatsuma, Kei; Nagao, Michinobu; Murata, Taisuke; Koyama, Masamichi; Koizumi, Mitsuru; Sasaki, Masayuki

    2014-01-01

    Purpose: The present study aimed to determine whether fractal analysis of morphological complexity and intratumoral heterogeneity of FDG uptake can help to differentiate malignant from benign pulmonary nodules. Materials and methods: We retrospectively analyzed data from 54 patients with suspected non-small cell lung cancer (NSCLC) who were examined by FDG PET/CT. Pathological assessments of biopsy specimens confirmed 35 and 19 nodules as NSCLC and inflammatory lesions, respectively. The morphological fractal dimension (m-FD), maximum standardized uptake value (SUVmax) and density fractal dimension (d-FD) of target nodules were calculated from CT and PET images. Fractal dimension is a quantitative index of morphological complexity and tracer uptake heterogeneity; higher values indicate increased complexity and heterogeneity. Results: The m-FD, SUVmax and d-FD significantly differed between malignant and benign pulmonary nodules (p < 0.05). Although the diagnostic ability was better for d-FD than m-FD and SUVmax, the difference did not reach statistical significance. Tumor size correlated significantly with SUVmax (r = 0.51, p < 0.05), but not with either m-FD or d-FD. Furthermore, m-FD combined with either SUVmax or d-FD improved diagnostic accuracy to 92.6% and 94.4%, respectively. Conclusion: The d-FD of intratumoral heterogeneity of FDG uptake can help to differentially diagnose malignant and benign pulmonary nodules. The SUVmax and d-FD obtained from FDG-PET images provide different types of information that are equally useful for differential diagnoses. Furthermore, the morphological complexity determined by CT combined with heterogeneous FDG uptake determined by PET improved diagnostic accuracy
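
    Fractal dimension is commonly estimated by box counting: cover the structure with boxes of side s, count how many boxes are occupied, and take the slope of log N(s) versus log s. The sketch below applies generic 2-D box counting to a binary mask; it is not the specific m-FD or d-FD implementation used in the study, and the box sizes are arbitrary.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary 2-D mask by box counting.

    The dimension is the negative slope of log(N_boxes) vs log(box size),
    where N_boxes counts boxes that contain at least one foreground pixel.
    Generic sketch, not the algorithm of the cited study.
    """
    counts = []
    for s in box_sizes:
        h, w = mask.shape
        trimmed = mask[: h - h % s, : w - w % s]
        blocks = trimmed.reshape(h // s, s, w // s, s)
        occupied = blocks.any(axis=(1, 3))
        counts.append(occupied.sum())
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope

# Example: a filled disc should give a dimension close to 2
yy, xx = np.mgrid[:128, :128]
disc = (xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2
print(box_counting_dimension(disc))
```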

  10. Improving distillation method and device of tritiated water analysis for ultra high decontamination efficiency.

    Science.gov (United States)

    Fang, Hsin-Fa; Wang, Chu-Fang; Lin, Chien-Kung

    2015-12-01

    Monitoring environmental tritiated water is important for understanding the dispersion of contamination from nuclear facilities. Tritium is a pure beta radionuclide which is usually measured by Liquid Scintillation Counting (LSC). The average energy of the tritium beta is only 5.658 keV, which makes LSC counting of tritium easily interfered with by betas emitted by other radionuclides. Environmental tritiated water samples therefore usually need to be decontaminated by distillation to reduce this interference. After the Fukushima Nuclear Accident, the highest gross beta concentration of groundwater samples obtained around the Fukushima Daiichi Nuclear Power Station was over 1,000,000 Bq/l. There is thus a need for a distillation method with ultra-high decontamination efficiency for environmental tritiated water analysis. This study was intended to improve the heating temperature control for better sub-boiling distillation and to modify the height of the container of the air-cooling distillation device for a better fractional distillation effect. The decontamination factor (DF) for Cs-137 of the distillation may reach 450,000, which is far better than in the prior study. The average loss rate of the improved method and device is about 2.6%, which is better than the bias value listed in ASTM D4107-08. It is proven that the modified air-cooling distillation device can provide an easy-handling, water-saving, low-cost and effective way of purifying water samples for samples more heavily contaminated with beta radionuclides which need ultra-high decontamination treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
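
    The two figures of merit quoted above, the decontamination factor and the loss rate, are simple ratios. The sketch below shows the arithmetic with made-up activity concentrations and volumes chosen only to land in the same range as the reported values.

```python
def decontamination_factor(activity_before_bq_l, activity_after_bq_l):
    """DF = interfering-nuclide activity concentration before distillation
    divided by the concentration remaining in the distillate."""
    return activity_before_bq_l / activity_after_bq_l

def loss_rate_percent(volume_in_ml, volume_recovered_ml):
    """Fraction of the sample water not recovered by the distillation."""
    return 100.0 * (volume_in_ml - volume_recovered_ml) / volume_in_ml

# Illustrative numbers only (not taken from the study)
print(decontamination_factor(1.0e6, 2.2))   # roughly 450,000
print(loss_rate_percent(100.0, 97.4))       # roughly 2.6 %
```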

  11. [Analysis of interventions designed to improve clinical supervision of student nurses in Benin].

    Science.gov (United States)

    Otti, André; Pirson, Magali; Piette, Danielle; Coppieters T Wallant, Yves

    2017-12-05

    The absence of an explicit and coherent conception of the articulation between theory and practice in the reform of nursing training in Benin has resulted in poor quality clinical supervision of student nurses. The objective of this article is to analyze two interventions designed to improve the quality of supervision. A student welcome booklet, developed by means of a consultative and provocative participatory approach, was tested with twelve student nurses versus a control group. Content analysis of the data collected by individual semi-directed interviews and during two focus groups demonstrated the value of this tool. Student nurses were also taught to use training diaries inspired by "experiential learning". Training diaries were analysed using a grid based on the descriptive elements of the five types of training diaries described by Scheepers (2008). According to the student nurses, the welcome booklet provided them with structured information to be used as a reference during their training and a better understanding of their teachers, and allowed them to situate the resources of the training course with a lower level of stress. Fifty-eight per cent of the training diaries were mosaics, reflecting the reflective practice and self-regulated learning of student nurses. This activity also promoted metacognitive dialogue with their supervisors. The student welcome booklet appeared to facilitate integration of student nurses into the clinical setting and promoted professional and organizational socialization. The training diary improved the quality of clinical learning through repeated reflective observation by student nurses and helped to maintain permanent communication with the supervisors.

  12. Improving the position resolution of highly segmented HPGe detectors using pulse shape analysis methods

    International Nuclear Information System (INIS)

    Descovich, Martina

    2002-01-01

    This work presents an approach for determining the interaction position of γ rays in highly segmented HPGe detectors. A precise knowledge of the interaction position enables the effective granularity of the detector to be substantially improved and a calibration of the detector response as a function of position to be performed. An improved position resolution is fundamental for the development of arrays of γ ray tracking detectors. The performance of a highly segmented germanium detector (TIGRE) has been characterised. TIGRE consists of a large volume coaxial high-purity n-type germanium crystal with a 24-fold segmented outer contact. Due to its high granularity and its fast electronics, TIGRE represents a unique example of a tracking detector, having low noise output signals, fast rise time and good energy resolution. In order to calibrate the response of the detector as a function of the interaction position, a dedicated scanning apparatus has been developed and the front surface of the detector has been scanned. The method developed for position determination is based on the digital analysis of the preamplifier signal, whose features are position dependent. A two-dimensional position resolution is accomplished by combining the radial position information, contained in the rise time of the pulse shape leading edge, with the azimuthal position information, carried by the magnitude of the transient charge signals induced in the spectator segments. Utilising this method, a position resolution of 0.6 mm, both radially and along the azimuthal direction, can be achieved in the most sensitive part of the detector. (author)
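
    The position-determination idea described above reduces to two lookups: the 10-90% rise time of the net charge pulse maps to radius, and the relative size of the transient charges induced in the neighbouring (spectator) segments maps to the azimuthal position within the segment. The sketch below is a toy version of that logic; the calibration lookup, sampling interval, and signal conventions are assumptions, not the analysis code used with TIGRE.

```python
import numpy as np

def interaction_position(pulse, neighbours, radius_lut, t_sample_ns=10.0):
    """Toy position estimate from a segmented-HPGe pulse.

    pulse      : net charge signal of the hit segment, normalised to 1.
    neighbours : (left, right) transient signals of the spectator segments.
    radius_lut : callable mapping 10-90% rise time (ns) -> radius (mm),
                 assumed to come from a scanning calibration.
    Simplified illustration only, not the TIGRE analysis code.
    """
    pulse = np.asarray(pulse)

    # Radial coordinate from the 10-90% rise time of the leading edge
    t10 = np.argmax(pulse >= 0.1) * t_sample_ns
    t90 = np.argmax(pulse >= 0.9) * t_sample_ns
    radius_mm = radius_lut(t90 - t10)

    # Azimuthal coordinate from the left/right transient-charge asymmetry
    left, right = (np.abs(np.asarray(s)).max() for s in neighbours)
    asymmetry = (right - left) / (right + left)  # -1 (near left) .. +1 (near right)
    return radius_mm, asymmetry
```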

  13. Spatial analysis of ecosystem service relationships to improve targeting of payments for hydrological services.

    Science.gov (United States)

    Mokondoko, Pierre; Manson, Robert H; Ricketts, Taylor H; Geissert, Daniel

    2018-01-01

    Payments for hydrological services (PHS) are popular tools for conserving ecosystems and their water-related services. However, improving the spatial targeting and impacts of PHS, as well as their ability to foster synergies with other ecosystem services (ES), remains challenging. We aimed to use spatial analyses to evaluate the targeting performance of México's National PHS program in central Veracruz. We quantified the effectiveness of areas targeted for PHS in actually covering areas of high HS provision and social priority during 2003-2013. First, we quantified the provisioning and spatial distributions of two target ES (water yield and soil retention) and one non-target ES (carbon storage) using InVEST. Subsequently, pairwise relationships among ES were quantified using spatial correlation and overlap analyses. Finally, we evaluated targeting by: (i) prioritizing areas of individual and overlapping ES; (ii) quantifying spatial co-occurrences of these priority areas with those targeted by PHS; (iii) evaluating the extent to which PHS directly contribute to HS delivery; and (iv) testing whether PHS targeted areas disproportionately covered areas with high ecological and social priority. We found that modelled priority areas exhibited non-random distributions and distinct spatial patterns. Our results show significant pairwise correlations between all ES, suggesting synergistic relationships. However, our analysis showed a significantly lower overlap than expected and thus significant mismatches between PHS targeted areas and all types of priority areas. These findings suggest that the targeting of areas with high HS provisioning and social priority by Mexico's PHS program could be improved significantly. This study underscores: (1) the importance of using maps of HS provisioning as the main targeting criteria in PHS design to channel payments towards areas that require future conservation, and (2) the need for future research that helps balance ecological and socioeconomic
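
    The pairwise correlation and overlap analyses can be illustrated with two co-registered ES rasters: a rank correlation between cell values, and the fraction of one service's top-quantile cells that fall inside the other's. The sketch below uses synthetic rasters and an arbitrary 80th-percentile threshold; the study's actual priority definitions and statistics may differ.

```python
import numpy as np
from scipy import stats

def pairwise_spearman(raster_a, raster_b):
    """Spearman rank correlation between two co-registered ES rasters,
    ignoring no-data cells (NaN)."""
    a, b = raster_a.ravel(), raster_b.ravel()
    ok = ~np.isnan(a) & ~np.isnan(b)
    rho, _ = stats.spearmanr(a[ok], b[ok])
    return rho

def priority_overlap(raster_a, raster_b, quantile=0.8):
    """Fraction of top-quantile cells of raster_a that are also
    top-quantile cells of raster_b (a simple overlap index)."""
    hot_a = raster_a >= np.nanquantile(raster_a, quantile)
    hot_b = raster_b >= np.nanquantile(raster_b, quantile)
    return (hot_a & hot_b).sum() / hot_a.sum()

# Example on synthetic rasters (real ES layers would come from InVEST outputs)
rng = np.random.default_rng(1)
water_yield = rng.random((100, 100))
carbon = 0.6 * water_yield + 0.4 * rng.random((100, 100))
print(pairwise_spearman(water_yield, carbon), priority_overlap(water_yield, carbon))
```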

  14. Identification of quality improvement areas in pediatric MRI from analysis of patient safety reports

    International Nuclear Information System (INIS)

    Jaimes, Camilo; Murcia, Diana J.; Miguel, Karen; DeFuria, Cathryn; Sagar, Pallavi; Gee, Michael S.

    2018-01-01

    Analysis of safety reports has been utilized to guide practice improvement efforts in adult magnetic resonance imaging (MRI). Data specific to pediatric MRI could help target areas of improvement in this population. To estimate the incidence of safety reports in pediatric MRI and to determine associated risk factors. In a retrospective HIPAA-compliant, institutional review board-approved study, a single-institution Radiology Information System was queried to identify MRI studies performed in pediatric patients (0-18 years old) from 1/1/2010 to 12/31/2015. The safety report database was queried for events matching the same demographic and dates. Data on patient age, gender, location (inpatient, outpatient, emergency room [ER]), and the use of sedation/general anesthesia were recorded. Safety reports were grouped into categories based on the cause and their severity. Descriptive statistics were used to summarize continuous variables. Chi-square analyses were performed for univariate determination of statistical significance of variables associated with safety report rates. A multivariate logistic regression was used to control for possible confounding effects. A total of 16,749 pediatric MRI studies and 88 safety reports were analyzed, yielding a rate of 0.52%. There were significant differences in the rate of safety reports between patients younger than 6 years (0.89%) and those older (0.41%) (P<0.01), sedated (0.8%) and awake children (0.45%) (P<0.01), and inpatients (1.1%) and outpatients (0.4%) (P<0.01). The use of sedation/general anesthesia is an independent risk factor for a safety report (P=0.02). The most common causes for safety reports were service coordination (34%), drug reactions (19%), and diagnostic test and ordering errors (11%). The overall rate of safety reports in pediatric MRI is 0.52%. Interventions should focus on vulnerable populations, such as younger patients, those requiring sedation, and those in need of acute medical attention. (orig.)
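
    The univariate comparisons reported above reduce to chi-square tests on 2×2 tables of safety-report counts versus totals for two patient groups. The sketch below shows that test with made-up counts of roughly the magnitude implied by the reported rates; it is not the study's dataset.

```python
from scipy.stats import chi2_contingency

def compare_report_rates(events_a, total_a, events_b, total_b):
    """Chi-square test comparing safety-report rates between two groups
    (e.g., sedated vs. awake patients)."""
    table = [[events_a, total_a - events_a],
             [events_b, total_b - events_b]]
    chi2, p, _, _ = chi2_contingency(table)
    rate_a, rate_b = events_a / total_a, events_b / total_b
    return rate_a, rate_b, p

# Illustrative counts only (not the study's data)
print(compare_report_rates(40, 5000, 48, 11749))
```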

  15. Improving the Efficiency and Ease of Healthcare Analysis Through Use of Data Visualization Dashboards.

    Science.gov (United States)

    Stadler, Jennifer G; Donlon, Kipp; Siewert, Jordan D; Franken, Tessa; Lewis, Nathaniel E

    2016-06-01

    The digitization of a patient's health record has profoundly impacted medicine and healthcare. The compilation and accessibility of medical history has provided clinicians an unprecedented, holistic account of a patient's conditions, procedures, medications, family history, and social situation. In addition to the bedside benefits, this level of information has opened the door for population-level monitoring and research, the results of which can be used to guide initiatives that are aimed at improving quality of care. Cerner Corporation partners with health systems to help guide population management and quality improvement projects. With such an enormous and diverse client base, varying in geography, size, organizational structure, and analytic needs, discerning meaning in the data and how they fit with a particular hospital's goals is a slow, difficult task that requires clinical, statistical, and technical literacy. This article describes the development of dashboards for efficient data visualization at the healthcare facility level. Focusing on two areas with broad clinical importance, sepsis patient outcomes and 30-day hospital readmissions, dashboards were developed with the goal of aggregating data and providing meaningful summary statistics, highlighting critical performance metrics, and providing easily digestible visuals that can be understood by a wide range of personnel with varying levels of skill and areas of expertise. These internal-use dashboards have allowed associates in multiple roles to perform a quick and thorough assessment of a hospital of interest by providing the data to answer necessary questions and to
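
    A minimal sketch of the kind of facility-level aggregation such dashboards surface is shown below, assuming a hypothetical encounter-level extract; the column names, metrics, and grouping are illustrative, not Cerner's actual data model.

```python
import pandas as pd

# Hypothetical encounter-level extract; column names are illustrative only.
encounters = pd.DataFrame({
    "facility": ["A", "A", "A", "B", "B", "B"],
    "sepsis_flag": [1, 0, 1, 0, 1, 0],
    "expired": [0, 0, 1, 0, 0, 0],
    "readmitted_30d": [1, 0, 0, 0, 1, 1],
})

# Facility-level summary metrics of the kind a dashboard might display
summary = encounters.groupby("facility").agg(
    n_encounters=("expired", "size"),         # total encounters
    sepsis_rate=("sepsis_flag", "mean"),      # share of encounters with sepsis
    mortality_rate=("expired", "mean"),       # crude in-hospital mortality
    readmission_rate=("readmitted_30d", "mean"),
)
print(summary)
```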