WorldWideScience

Sample records for refinement protocol combining

  1. IEEE 802.11 Wireless LANs: Performance Analysis and Protocol Refinement

    Directory of Open Access Journals (Sweden)

    Chatzimisios P.

    2005-01-01

    Full Text Available The IEEE 802.11 protocol is emerging as a widely used standard and has become the most mature technology for wireless local area networks (WLANs). In this paper, we focus on the tuning of the IEEE 802.11 protocol parameters, taking into consideration, in addition to throughput efficiency, performance metrics such as the average packet delay, the probability of a packet being discarded when it reaches the maximum retransmission limit, the average time to drop a packet, and the packet interarrival time. We present an analysis, validated by simulation, that is based on a Markov chain model commonly used in the literature. We further study the improvement in these performance metrics obtained by employing suitable protocol parameters according to the specific communication needs of the IEEE 802.11 protocol, for both the basic access and RTS/CTS access schemes. We show that the use of a higher initial contention window size does not considerably degrade performance in small networks and performs significantly better in any other scenario. Moreover, we conclude that the combination of a lower maximum contention window size and a higher retry limit considerably improves performance. Results indicate that appropriate adjustment of the protocol parameters enhances performance and improves the services that the IEEE 802.11 protocol provides to various communication applications.
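    The analysis above rests on a Bianchi-style Markov chain model of the DCF backoff, in which a per-slot transmission probability and a conditional collision probability are solved jointly. As a rough illustration only (the parameter values and the damped fixed-point iteration below are assumptions of this sketch, not the paper's exact settings):

```python
def transmission_probability(W, m, p):
    """Stationary probability tau that a station transmits in a random
    slot (Bianchi's closed form), given the initial contention window W,
    the number of backoff stages m, and the collision probability p."""
    num = 2.0 * (1.0 - 2.0 * p)
    den = (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
    return num / den

def solve_fixed_point(n, W=32, m=5, steps=500):
    """Solve the coupled equations tau = f(p), p = 1 - (1 - tau)^(n - 1)
    for n contending stations; the 0.5 damping keeps the iteration stable."""
    p = 0.1  # initial guess for the collision probability
    for _ in range(steps):
        tau = transmission_probability(W, m, p)
        p = 0.5 * p + 0.5 * (1.0 - (1.0 - tau) ** (n - 1))
    return tau, p

tau, p = solve_fixed_point(n=10)
```

    Running the solver for increasing n shows the expected qualitative behaviour: the collision probability grows with the number of stations, which is what motivates tuning the contention window and retry limit.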

  2. Combining Techniques to Refine Item to Skills Q-Matrices with a Partition Tree

    Science.gov (United States)

    Desmarais, Michel C.; Xu, Peng; Beheshti, Behzad

    2015-01-01

    The problem of mapping items to skills is gaining interest with the emergence of recent techniques that can use data both for defining this mapping and for refining mappings given by experts. We investigate the problem of refining a mapping from an expert by combining the output of different techniques. The combination is based on a partition tree…

  3. Combined analysis of fourteen nuclear genes refines the Ursidae phylogeny.

    Science.gov (United States)

    Pagès, Marie; Calvignac, Sébastien; Klein, Catherine; Paris, Mathilde; Hughes, Sandrine; Hänni, Catherine

    2008-04-01

    Despite numerous studies, questions remain about the evolutionary history of the Ursidae, and additional independent genetic markers were needed to resolve these ambiguities. For this purpose, we sequenced ten nuclear genes for all eight extant bear species. By combining these new sequences with those of four other recently published nuclear markers, we provide new insights into the phylogenetic relationships among members of the family Ursidae. The hypothesis that the giant panda was the first species to diverge among ursids is definitively confirmed, and the precise branching order within the genus Ursus is clarified for the first time. Moreover, our analyses indicate that the American and Asiatic black bears do not cluster as sister taxa, as had been previously hypothesised. Sun and sloth bears clearly appear as the most basal ursine species, but uncertainties about their exact relationships remain. Since our larger dataset did not enable us to resolve this last question, identifying rare genomic changes in bear genomes could be a promising avenue for further studies.

  4. The Effects of the Combination of a Refined Carbohydrate Diet and Exposure to Hyperoxia in Mice

    Science.gov (United States)

    Soares, Nicia Pedreira; Campos, Keila Karine Duarte; Pena, Karina Braga; Bandeira, Ana Carla Balthar

    2016-01-01

    Obesity is a multifactorial disease with genetic, social, and environmental influences. This study aims at analyzing the effects of the combination of a refined carbohydrate diet and exposure to hyperoxia on the pulmonary oxidative and inflammatory response in mice. Twenty-four mice were divided into four groups: control group (CG), hyperoxia group (HG), refined carbohydrate diet group (RCDG), and refined carbohydrate diet + hyperoxia group (RCDHG). The experimental diet was composed of 10% sugar, 45% standard diet, and 45% sweet condensed milk. For 24 hours, the HG and RCDHG were exposed to hyperoxia and the CG and RCDG to ambient air. After the exposures were completed, the animals were euthanized, and blood, bronchoalveolar lavage fluid, and lungs were collected for analyses. The HG showed higher levels of interferon-γ in adipose tissue as compared to other groups and higher levels of interleukin-10 and tumor necrosis factor-α compared to the CG and RCDHG. SOD and CAT activities in the pulmonary parenchyma decreased in the RCDHG as compared to the CG. There was an increase of lipid peroxidation in the HG, RCDG, and RCDHG as compared to the CG. A refined carbohydrate diet combined with hyperoxia promoted inflammation and redox imbalance in adult mice. PMID:28018521

  5. The Effects of the Combination of a Refined Carbohydrate Diet and Exposure to Hyperoxia in Mice

    Directory of Open Access Journals (Sweden)

    Nicia Pedreira Soares

    2016-01-01

    Full Text Available Obesity is a multifactorial disease with genetic, social, and environmental influences. This study aims at analyzing the effects of the combination of a refined carbohydrate diet and exposure to hyperoxia on the pulmonary oxidative and inflammatory response in mice. Twenty-four mice were divided into four groups: control group (CG), hyperoxia group (HG), refined carbohydrate diet group (RCDG), and refined carbohydrate diet + hyperoxia group (RCDHG). The experimental diet was composed of 10% sugar, 45% standard diet, and 45% sweet condensed milk. For 24 hours, the HG and RCDHG were exposed to hyperoxia and the CG and RCDG to ambient air. After the exposures were completed, the animals were euthanized, and blood, bronchoalveolar lavage fluid, and lungs were collected for analyses. The HG showed higher levels of interferon-γ in adipose tissue as compared to other groups and higher levels of interleukin-10 and tumor necrosis factor-α compared to the CG and RCDHG. SOD and CAT activities in the pulmonary parenchyma decreased in the RCDHG as compared to the CG. There was an increase of lipid peroxidation in the HG, RCDG, and RCDHG as compared to the CG. A refined carbohydrate diet combined with hyperoxia promoted inflammation and redox imbalance in adult mice.

  6. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    Science.gov (United States)

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  7. Service refinement

    Institute of Scientific and Technical Information of China (English)

    HE JiFeng

    2008-01-01

    This paper presents a refinement calculus for service components. We model the behaviour of an individual service by a guarded design, which enables one to separate the responsibility of clients from the commitment made by the system, and to identify a component by a set of failures and divergences. Protocols are introduced to coordinate the interactions between a component and its external environment. We adopt the notion of process refinement to formalize the substitutivity of components, and provide a complete proof method based on the notion of simulations.

  8. Antibody structure determination using a combination of homology modeling, energy-based refinement, and loop prediction

    Science.gov (United States)

    Zhu, Kai; Day, Tyler; Warshaviak, Dora; Murrett, Colleen; Friesner, Richard; Pearlman, David

    2017-01-01

    We present the blinded prediction results in the Second Antibody Modeling Assessment (AMA-II) using a fully automatic antibody structure prediction method implemented in the programs BioLuminate and Prime. We have developed a novel knowledge based approach to model the CDR loops, using a combination of sequence similarity, geometry matching, and the clustering of database structures. The homology models are further optimized with a physics-based energy function (VSGB2.0), which improves the model quality significantly. H3 loop modeling remains the most challenging task. Our ab initio loop prediction performs well for the H3 loop in the crystal structure context, and allows improved results when refining the H3 loops in the context of homology models. For the 10 human and mouse derived antibodies in this assessment, the average RMSDs for the homology model Fv and framework regions are 1.19 Å and 0.74 Å, respectively. The average RMSDs for five non-H3 CDR loops range from 0.61 Å to 1.05 Å, and the H3 loop average RMSD is 2.91 Å using our knowledge-based loop prediction approach. The ab initio H3 loop predictions yield an average RMSD of 1.28 Å when performed in the context of the crystal structure and 2.67 Å in the context of the homology modeled structure. Notably, our method for predicting the H3 loop in the crystal structure environment ranked first among the seven participating groups in AMA-II, and our method made the best prediction among all participants for seven of the ten targets. PMID:24619874

  9. Antibody structure determination using a combination of homology modeling, energy-based refinement, and loop prediction.

    Science.gov (United States)

    Zhu, Kai; Day, Tyler; Warshaviak, Dora; Murrett, Colleen; Friesner, Richard; Pearlman, David

    2014-08-01

    We present the blinded prediction results in the Second Antibody Modeling Assessment (AMA-II) using a fully automatic antibody structure prediction method implemented in the programs BioLuminate and Prime. We have developed a novel knowledge based approach to model the CDR loops, using a combination of sequence similarity, geometry matching, and the clustering of database structures. The homology models are further optimized with a physics-based energy function (VSGB2.0), which improves the model quality significantly. H3 loop modeling remains the most challenging task. Our ab initio loop prediction performs well for the H3 loop in the crystal structure context, and allows improved results when refining the H3 loops in the context of homology models. For the 10 human and mouse derived antibodies in this assessment, the average RMSDs for the homology model Fv and framework regions are 1.19 Å and 0.74 Å, respectively. The average RMSDs for five non-H3 CDR loops range from 0.61 Å to 1.05 Å, and the H3 loop average RMSD is 2.91 Å using our knowledge-based loop prediction approach. The ab initio H3 loop predictions yield an average RMSD of 1.28 Å when performed in the context of the crystal structure and 2.67 Å in the context of the homology modeled structure. Notably, our method for predicting the H3 loop in the crystal structure environment ranked first among the seven participating groups in AMA-II, and our method made the best prediction among all participants for seven of the ten targets. © 2014 Wiley Periodicals, Inc.

  10. An Authentication Protocol Based on Combined RFID-Biometric System

    Directory of Open Access Journals (Sweden)

    Mohamed Benmohammed

    2012-04-01

    Full Text Available Radio Frequency Identification (RFID) and biometric technologies have evolved rapidly in recent years and are used in several applications, such as access control. Among the important characteristics of RFID tags, we mention the limitation of resources (memory, energy, …). Our work focuses on the design of an RFID authentication protocol which uses biometric data and which guarantees secrecy, authentication, and privacy. Our protocol requires a PRNG (Pseudo-Random Number Generator), a robust hash function, and a biometric hash function. The biometric hash function is used to optimize and to protect the biometric data. For the security analysis of the proposed protocol, we use the AVISPA and SPAN tools to verify authentication and secrecy.
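    As a rough illustration of the kind of hash-based challenge-response authentication the abstract describes (the message flow, class names, and the use of SHA-256 below are assumptions made for this sketch, not the paper's actual protocol):

```python
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    """Robust hash function H; SHA-256 stands in for it here."""
    return hashlib.sha256(b"|".join(parts)).digest()

class Tag:
    """RFID tag holding a secret key and a biometric hash (the output of
    the biometric hash function applied to the enrolled template)."""
    def __init__(self, key: bytes, bio_hash: bytes):
        self.key, self.bio_hash = key, bio_hash
    def respond(self, nonce: bytes) -> bytes:
        # Prove knowledge of key and biometric hash without sending
        # either in the clear: only H(key, bio_hash, nonce) is emitted.
        return h(self.key, self.bio_hash, nonce)

class Reader:
    """Back-end verifier with a database of enrolled tags."""
    def __init__(self, db):
        self.db = db  # tag_id -> (key, bio_hash)
    def authenticate(self, tag_id, tag: Tag) -> bool:
        nonce = secrets.token_bytes(16)  # fresh PRNG challenge
        key, bio_hash = self.db[tag_id]
        return tag.respond(nonce) == h(key, bio_hash, nonce)

key, bio = secrets.token_bytes(16), secrets.token_bytes(32)
tag = Tag(key, bio)
reader = Reader({"tag1": (key, bio)})
ok = reader.authenticate("tag1", tag)
```

    The fresh nonce per session is what defeats simple replay of an eavesdropped response; formal verification of such flows is exactly what tools like AVISPA automate.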

  11. Towards Symbolic Model-Based Mutation Testing: Combining Reachability and Refinement Checking

    CERN Document Server

    Aichernig, Bernhard K; 10.4204/EPTCS.80.7

    2012-01-01

    Model-based mutation testing uses altered test models to derive test cases that are able to reveal whether a modelled fault has been implemented. This requires conformance checking between the original and the mutated model. This paper presents an approach for symbolic conformance checking of action systems, which are well-suited to specify reactive systems. We also consider nondeterminism in our models. Hence, we do not check for equivalence, but for refinement. We encode the transition relation as well as the conformance relation as a constraint satisfaction problem and use a constraint solver in our reachability and refinement checking algorithms. Explicit conformance checking techniques often face state space explosion. First experimental evaluations show that our approach has potential to outperform explicit conformance checkers.

  12. Synergistic Separation Behavior of Boron in Metallurgical Grade Silicon Using a Combined Slagging and Gas Blowing Refining Technique

    Science.gov (United States)

    Wu, Jijun; Zhou, Yeqiang; Ma, Wenhui; Xu, Min; Yang, Bin

    2017-02-01

    A combined slagging and gas blowing refining technique for boron removal from metallurgical grade silicon, using a CaO-SiO2-CaCl2 slag and a mixed Ar-O2-H2O gas, is investigated. Oxygen gas blowing in combination with water vapor shows a markedly higher removal efficiency of boron compared with blowing either oxygen or water vapor alone. Thermodynamic analysis indicates that the synergistic separation behavior of boron results from the combined action of CaCl2 and O2. The boron content is reduced from 22 to 0.75 ppmw, a removal efficiency of 96.6 pct.
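    The quoted removal efficiency is simply the fractional drop in boron content, which can be checked directly from the reported values:

```python
def removal_efficiency(initial_ppmw, final_ppmw):
    """Fraction of the impurity removed, expressed as a percentage."""
    return 100.0 * (initial_ppmw - final_ppmw) / initial_ppmw

# Boron figures reported in the abstract: 22 ppmw down to 0.75 ppmw.
eff = removal_efficiency(22.0, 0.75)  # ~96.6 pct
```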

  13. Synergistic Separation Behavior of Boron in Metallurgical Grade Silicon Using a Combined Slagging and Gas Blowing Refining Technique

    Science.gov (United States)

    Wu, Jijun; Zhou, Yeqiang; Ma, Wenhui; Xu, Min; Yang, Bin

    2016-11-01

    A combined slagging and gas blowing refining technique for boron removal from metallurgical grade silicon, using a CaO-SiO2-CaCl2 slag and a mixed Ar-O2-H2O gas, is investigated. Oxygen gas blowing in combination with water vapor shows a markedly higher removal efficiency of boron compared with blowing either oxygen or water vapor alone. Thermodynamic analysis indicates that the synergistic separation behavior of boron results from the combined action of CaCl2 and O2. The boron content is reduced from 22 to 0.75 ppmw, a removal efficiency of 96.6 pct.

  14. 6-Aminopenicillanic acid revisited: A combined solid state NMR and in silico refinement

    Science.gov (United States)

    Aguiar, Daniel Lima Marques de; San Gil, Rosane Aguiar da Silva; Alencastro, Ricardo Bicca de; Souza, Eugenio Furtado de; Borré, Leandro Bandeira; Vaiss, Viviane da Silva; Leitão, Alexandre Amaral

    2016-09-01

    13C/15N (experimental and ab initio) solid-state NMR was used to achieve an affordable way to improve the hydrogen refinement of the 6-aminopenicillanic acid (6-APA) structure. The lattice effect on the isotropic chemical shifts was probed using two different magnetic shielding calculations: isolated molecules and the periodic crystal structure. The electron density difference maps of optimized and non-optimized structures were calculated in order to investigate the interactions inside the 6-APA unit cell. The 13C and 15N chemical shift assignments were unambiguously established. In addition, several ambiguities in the literature 13C resonance assignments could be properly resolved.

  15. Modeling of the inhomogeneity of grain refinement during combined metal forming process by finite element and cellular automata methods

    Energy Technology Data Exchange (ETDEWEB)

    Majta, Janusz; Madej, Łukasz; Svyetlichnyy, Dmytro S.; Perzyński, Konrad; Kwiecień, Marcin, E-mail: mkwiecie@agh.edu.pl; Muszka, Krzysztof

    2016-08-01

    The potential of the discrete cellular automata technique to predict grain refinement in wires produced using a combined metal forming process is presented and discussed in the paper. The developed combined metal forming process can be treated as one of the Severe Plastic Deformation (SPD) techniques and consists of three different modes of deformation: asymmetric drawing with bending, namely accumulated angular drawing (AAD), wire drawing (WD), and wire flattening (WF). To accurately replicate the complex stress state at both the macro and micro scales during subsequent deformations, a two-stage modeling approach was used. First, the Finite Element Method (FEM), implemented in the commercial ABAQUS software, was applied to simulate the entire combined forming process at the macro scale. Then, based on the FEM results, the Cellular Automata (CA) method was applied to simulate grain refinement at the microstructure level. Data transferred between the FEM and CA methods included a set of files with strain tensor components obtained from selected integration points in the macro scale model. As a result of the CA simulation, detailed information on microstructure evolution under severe plastic deformation conditions was obtained, namely: changes in the shape and size of the modeled representative volume with its imposed microstructure, changes in the number of grains, subgrains, and dislocation cells, development of the grain boundary angle distribution, as well as changes in the pole figures. To evaluate the CA model's predictive capabilities, the results of the computer simulation were compared with scanning electron microscopy and electron backscattered diffraction (SEM/EBSD) studies of samples after the AAD+WD+WF process.

  16. A combined protocol for identification of maggots of forensic interest.

    Science.gov (United States)

    Tuccia, Fabiola; Giordani, Giorgia; Vanin, Stefano

    2016-07-01

    In forensic entomology, the estimated age of insects is used to estimate the minimum post-mortem interval. As insect development is temperature dependent and species specific, correct species identification is fundamental. In the majority of cases, molecular identification is based on a destructive approach. In this paper, a working protocol for the molecular identification of fly larvae without affecting the anatomical characters used for morphological identification is presented. The suggested technique preserves the larval exoskeleton and the unused soft tissues in the same vial, allowing repetition of both the morphological and molecular identification and reducing the risk of loss of the evidence. This method also allows the size of the specimens to be measured before their morphological and biomolecular characterization. In order to demonstrate that this technique can be applied to maggots across a large range of sizes, it has been tested and validated using larvae of different sizes, from ~1.7-1.3 cm [Calliphora vomitoria and Lucilia sericata (Diptera: Calliphoridae)] to ~10-6.5 mm [Musca domestica (Diptera: Muscidae) and Megaselia scalaris (Diptera: Phoridae)]. The importance of a unique identifier and of a complete database with all the specimen information (origin, sample size, identification, etc.) is also discussed.

  17. Protein NMR structures refined without NOE data.

    Science.gov (United States)

    Ryu, Hyojung; Kim, Tae-Rae; Ahn, SeonJoo; Ji, Sunyoung; Lee, Jinhyuk

    2014-01-01

    The refinement of low-quality structures is an important challenge in protein structure prediction. Many studies have been conducted on protein structure refinement; the refinement of structures derived from NMR spectroscopy has been especially intensively studied. In this study, we generated a flat-bottom distance potential instead of using NOE data, because NOE data carry ambiguity and uncertainty. The potential was derived from distance information in the given structures and prevented structural dislocation during the refinement process. A simulated annealing protocol was used to minimize the potential energy of the structure. The protocol was tested on 134 NMR structures in the Protein Data Bank (PDB) that also have X-ray structures. Among them, 50 structures were used as a training set to find the optimal "width" parameter in the flat-bottom distance potential functions. In the validation set (the other 84 structures), most of the 12 quality assessment scores of the refined structures were significantly improved (total score increased from 1.215 to 2.044). Moreover, the secondary structure similarity of the refined structures was improved over that of the originals. Finally, we demonstrate that the combination of two energy potentials, the statistical torsion angle potential (STAP) and the flat-bottom distance potential, can drive the refinement of NMR structures.
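    A flat-bottom restraint of the kind described above is zero inside a tolerance band around the reference distance and penalizes deviations outside it, which is what prevents structural dislocation while leaving local motion free. The harmonic form and the force constant below are illustrative assumptions; the abstract specifies only the flat-bottom shape and a tunable width:

```python
def flat_bottom_potential(d, d0, width, k=1.0):
    """Flat-bottom distance restraint: zero penalty while the distance d
    stays within +/- width of the reference distance d0 (taken from the
    starting structure), harmonic with force constant k outside the band.
    The quadratic tail is an assumed functional form for this sketch."""
    excess = abs(d - d0) - width
    return k * excess ** 2 if excess > 0 else 0.0
```

    During simulated annealing this term is summed over restrained atom pairs and added to the rest of the energy function, so atoms can relax freely within the band but pay a growing cost for drifting away from the reference geometry.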

  18. A COMBINED ADMISSION CONTROL ALGORITHM WITH DA PROTOCOL FOR SATELLITE ATM NETWORKS

    Institute of Scientific and Technical Information of China (English)

    Lu Rong; Cao Zhigang

    2006-01-01

    Admission control is an important strategy for Quality of Service (QoS) provisioning in Asynchronous Transfer Mode (ATM) networks. Based on a control-theory model of the resources on-Demand Allocation (DA) protocol, the paper studies the effect of the protocol on the statistical characteristics of network traffic, and proposes a combined connection admission control algorithm with the DA protocol to achieve full utilization of link resources in satellite communication systems. The proposed algorithm is based on a cross-layer design approach. Theoretical analysis and system simulation results show that the proposed algorithm can admit more connections within given admission thresholds than one that does not take the DA protocol into account. Thus, the proposed algorithm can increase the admission ratio of traffic sources in satellite ATM networks and improve satellite link utilization.

  19. Combining simulations and solution experiments as a paradigm for RNA force field refinement

    CERN Document Server

    Cesari, Andrea; Bussi, Giovanni

    2016-01-01

    Recent computational efforts have shown that the current potential energy models used in molecular dynamics are not accurate enough to describe the conformational ensemble of RNA oligomers and suggest that molecular dynamics should be complemented with experimental data. We here propose a scheme based on the maximum entropy principle to combine simulations with bulk experiments. In the proposed scheme the noise arising from both the measurements and the forward models used to back calculate the experimental observables is explicitly taken into account. The method is tested on RNA nucleosides and is then used to construct chemically consistent corrections to the Amber RNA force field that allow a large set of experimental data on nucleosides and dinucleosides to be correctly reproduced. The transferability of these corrections is assessed against independent data on tetranucleotides and displays a previously unreported agreement with experiments. This procedure can be applied to enforce multiple experimental d...
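    The maximum entropy scheme described above reweights simulation frames so that the ensemble average of a back-calculated observable matches the bulk measurement, while perturbing the original ensemble as little as possible; the optimal weights take the exponential form w_i ∝ exp(-λ s_i). A minimal single-observable sketch (the gradient update on the Lagrange multiplier and all numbers are illustrative, not the paper's implementation, which also models noise):

```python
import numpy as np

def maxent_weights(s, s_exp, lr=0.05, steps=5000):
    """Find per-frame weights w_i proportional to exp(-lam * s_i) such
    that the reweighted average of observable s matches the target s_exp.
    lam is the Lagrange multiplier enforcing the experimental constraint;
    since d<s>/dlam = -Var(s) < 0, raising lam lowers the average."""
    s = np.asarray(s, dtype=float)
    lam = 0.0
    for _ in range(steps):
        w = np.exp(-lam * s)
        w /= w.sum()
        lam += lr * (np.dot(w, s) - s_exp)  # push average toward s_exp
    w = np.exp(-lam * s)
    return w / w.sum()

# Toy ensemble: four frames with observable values 1..4, target average 2.
s = np.array([1.0, 2.0, 3.0, 4.0])
w = maxent_weights(s, s_exp=2.0)
```

    In the force-field application, the same multipliers that fit the nucleoside data are then recast as chemically consistent dihedral corrections, which is what makes the corrections transferable to larger oligomers.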

  20. Refinement of linkage of human severe combined immunodeficiency (SCIDXI) to polymorphic markers in Xq13

    Energy Technology Data Exchange (ETDEWEB)

    Puck, J.M. (Children's Hospital of Philadelphia, PA (United States) Univ. of Pennsylvania School of Medicine, Philadelphia, PA (United States)); Conely, M.E. (St. Jude Children's Research Hospital and Univ. of Tennessee School of Medicine, Memphis (United States)); Bailey, L.C. (Univ. of Pennsylvania School of Medicine, Philadelphia, PA (United States))

    1993-07-01

    The most common form of human severe combined immunodeficiency (SCID) is inherited as an X-linked recessive genetic defect, MIM 300400. The disease locus, SCIDX1, has previously been placed in Xq13.1-q21.1 by demonstration of linkage to polymorphic markers between DXS159 and DXS3 and by exclusion from interstitial deletions of Xq21.1-q21.3. The authors report an extension of previous linkage studies, with new markers and a total of 25 SCIDX1 families including female carriers identified by nonrandom X chromosome inactivation in their T lymphocytes. SCIDX1 was nonrecombinant with DXS441, with a lod score of 17.96. Linkage relationships of new markers in the SCIDX1 families were consistent with the linkage map generated in the families of the Centre d'Etude du Polymorphisms Humain (CEPH) and with available physical map data. The most likely locus order was DXS1-(DXS159,DXS153)-DXS106-DXS132-DXS453-(SCIDX1,PGK1, DXS325,DXS347,DXS441)-DXS447-DXS72-DXYS1X-DXS3. The SCIDX1 region now spans approximately 10 Mb of DNA in Xq13; this narrowed genetic localization will assist efforts to identify gene candidates and will improve genetic management for families with SCID. 25 refs., 3 figs., 2 tabs.

  1. New criteria for liver transplantation in adults : the combined Groningen and Rotterdam protocol

    NARCIS (Netherlands)

    Jansen, PLM

    1998-01-01

    A combined protocol for liver transplantation has been written by the teams of Groningen and Rotterdam. This is to ensure that the criteria for selection and timing of liver transplantation, and the procedures for patient evaluation, are identical or at least very similar. Also, the waiting list pro

  2. Structure of 3 at.% and 9 at.% Si-doped HfO{sub 2} from combined refinement of X-ray and neutron diffraction patterns

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Lili [Department of Materials Science and Engineering, North Carolina State University, NC 27695 (United States); School of Information Science and Technology, Northwest University, Xi'an 710127 (China); Hou, Dong; Usher, Tedi-Marie; Iamsasri, Thanakorn; Fancher, Chris M.; Forrester, Jennifer S. [Department of Materials Science and Engineering, North Carolina State University, NC 27695 (United States); Nishida, Toshikazu [Department of Electrical and Computer Engineering, University of Florida, FL 32611 (United States); Moghaddam, Saeed [Department of Mechanical and Aerospace Engineering, University of Florida, FL 32611 (United States); Jones, Jacob L., E-mail: jacobjones@ncsu.edu [Department of Materials Science and Engineering, North Carolina State University, NC 27695 (United States)

    2015-10-15

    The crystal structure of 3 at.% and 9 at.% Si-doped HfO{sub 2} powder was determined through refinements using X-ray and neutron diffraction patterns. The lattice parameters, atomic positions, dopant occupancy, and the second phase fraction were determined with high precision using a combined full pattern fitting via the Rietveld method. The results show that both 3 at.% and 9 at.% Si-doped HfO{sub 2} powder exhibit the monoclinic crystal structure with P 1 2{sub 1}/c 1 space group. Through the combined refinement, the crystal structure parameters, especially for the positions and occupancies of the lighter atoms, were more precisely determined compared to independent X-ray diffraction refinement. Although the ionic radius of Si{sup 4+} is smaller than Hf{sup 4+}, with increasing Si occupancy, the unit cell volume slightly increases; possible mechanisms for this effect are discussed. Moreover, the refined results provide evidence of the existence of a non-equilibrium phase of Hf{sub x}Si{sub 1−x}O{sub 2}. The second phase (SiO{sub 2}) fraction is determined as 0.17 at.% for 3 at.% Si-doped HfO{sub 2} powders and 1.7 at.% for 9 at.% Si-doped HfO{sub 2} powders. - Highlights: • X-ray and neutron diffraction patterns were used to determine crystal structure. • Results from independent refinements and a combined refinement were compared. • Highly precise structural parameters were obtained by the combined refinement. • Structural data of 3 at.% and 9 at.% Si-doped HfO{sub 2} are provided. • The unit cell volume slightly increases with Si concentration from 3 to 9 at.%.

  3. The Protocol of Fixed Reconstruction for Severely Worn Teeth Combined with Anterior Deep Bite

    Directory of Open Access Journals (Sweden)

    Ya-Wen Zhao

    2017-01-01

    Full Text Available Full mouth reconstruction is one of the most effective methods to restore severely worn teeth that have suffered a reduced vertical dimension. Although the use of an overlay splint restoration for a trial period, allowing the patient to adapt to an increased vertical dimension, is the recognized method, the specific protocol for moving from the transitional splint to the fixed reconstruction is yet to be established. This case report describes a 50-year-old female patient who had severely worn teeth combined with an anterior deep bite and chewing pain. The protocol of the treatment process is described.

  4. New, combined, and reduced dosing treatment protocols cure Trypanosoma cruzi infection in mice.

    Science.gov (United States)

    Bustamante, Juan M; Craft, Julie M; Crowe, Byron D; Ketchie, Sarah A; Tarleton, Rick L

    2014-01-01

    The development of treatment protocols with reduced toxicity and equivalent or improved efficacy for Trypanosoma cruzi infection is a priority. We tested the effectiveness of benznidazole (BZ), nifurtimox (NFX), and other prospective drugs in intermittent and combined treatment protocols to cure T. cruzi infections initiated with susceptible and drug-resistant parasite strains. A 40-day course of BZ, NFX, or the oxaborole AN4169 cured 100% of mice, whereas posaconazole (POS) and NTLA-1 (a nitro-triazole) cured approximately 90% and 20% of mice, respectively. Reducing the overall dosage of BZ or NFX by using an intermittent (once every 5 days) schedule, or combining 5 daily doses of POS with 7 intermittent doses of BZ, also provided approximately 100% cure. T. cruzi strains resistant to BZ were also found to be resistant to other drugs (POS), and extending the time of treatment or combining drugs did not increase cure rates with these isolates. Thus, dosing schedules for anti-T. cruzi compounds should be determined empirically, and compounds targeting different pathways may be combined to yield effective therapies with reduced toxicity. This work also suggests that standard treatment protocols using BZ and NFX may be significantly overdosing patients, perhaps contributing to the adverse events.

  5. Establishing an educational programme for nurses to supply emergency hormonal contraception (combined method) to protocol.

    Science.gov (United States)

    Brittain, D

    1999-10-01

    This paper gives an account of an innovative educational programme developed by the Department of Midwifery Studies at the University of Central Lancashire (UCLAN) in 1995. The North West Regional Health Authority (NWRHA) approached the Department of Midwifery Studies to develop an educational programme for family planning nurses to supply the combined method of emergency hormonal contraception (EHC) under protocol when a doctor was not present. The purpose was to increase the availability and accessibility of EHC for young people in the North West region. The 3-day programme was designed to complement previous ENB 901/900 training, and also to provide the nurses with the specific skills and knowledge required to undertake this new role. One hundred and thirty-nine nurses from the North West area attended the programme between 1995 and 1998. Students were assessed both theoretically and clinically. Extending the role of family planning nurses to supply EHC gives purchasers and providers of sexual health care the potential to offer a wider range of accessible services. The recently published interim Crown Report1 on the supply and administration of medicines under group protocols states that protocols should specify clear arrangements for professional responsibility and accountability. Appropriate training is essential to ensure that the extended role of the nurse in family planning is fully understood.

  6. NEREC, an effective brain mapping protocol for combined language and long-term memory functions.

    Science.gov (United States)

    Perrone-Bertolotti, Marcela; Girard, Cléa; Cousin, Emilie; Vidal, Juan Ricardo; Pichat, Cédric; Kahane, Philippe; Baciu, Monica

    2015-12-01

    Temporal lobe epilepsy can induce functional plasticity in temporoparietal networks involved in language and long-term memory processing. Previous studies in healthy subjects have revealed the relative difficulty for this network to respond effectively across different experimental designs, as compared to more reactive regions such as frontal lobes. For a protocol to be optimal for clinical use, it has to first show robust effects in a healthy cohort. In this study, we developed a novel experimental paradigm entitled NEREC, which is able to reveal the robust participation of temporoparietal networks in a uniquely combined language and memory task, validated in an fMRI study with healthy subjects. Concretely, NEREC is composed of two runs: (a) an intermixed language-memory task (confrontation naming associated with encoding in nonverbal items, NE) to map language (i.e., word retrieval and lexico-semantic processes) combined with simultaneous long-term verbal memory encoding (NE items named but also explicitly memorized) and (b) a memory retrieval task of items encoded during NE (word recognition, REC) intermixed with new items. Word recognition is based on both perceptual-semantic familiarity (feeling of 'know') and accessing stored memory representations (remembering). In order to maximize the remembering and recruitment of medial temporal lobe structures, we increased REC difficulty by changing the modality of stimulus presentation (from nonverbal during NE to verbal during REC). We report that (a) temporoparietal activation during NE was attributable to both lexico-semantic (language) and memory (episodic encoding and semantic retrieval) processes; that (b) encoding activated the left hippocampus, bilateral fusiform, and bilateral inferior temporal gyri; and that (c) task recognition (recollection) activated the right hippocampus and bilateral but predominant left fusiform gyrus. The novelty of this protocol consists of (a) combining two tasks in one (language

  7. Combining Wireless Sensor Networks and Groundwater Transport Models: Protocol and Model Development in a Simulative Environment

    Science.gov (United States)

    Barnhart, K.; Urteaga, I.; Han, Q.; Porta, L.; Jayasumana, A.; Illangasekare, T.

    2007-12-01

    , and protocols necessary for a closed-loop simulation online, combining work across multiple disciplines. This simulation environment will expedite software development and a large-scale experimental aquifer will be used for further validation of the techniques. The results presented here address: setup of a WSN simulator which cooperates with transport models, development of fault detection techniques into the WSN routing protocol which are particular to this application, and planned steps in building a transport model capable of working in the WSN context.

  8. Detection of telomerase activity by combination of telomeric repeat amplification protocol and electrochemiluminescence assay

    Institute of Scientific and Technical Information of China (English)

    Xiao Ming Zhou; Li Jia

    2008-01-01

    A highly sensitive telomerase detection method that combines the telomeric repeat amplification protocol (TRAP) with a magnetic-bead-based electrochemiluminescence (ECL) assay has been developed. Briefly, telomerase recognizes the biotinylated telomerase synthesis primer (B-TS) and synthesizes extension products, which then serve as the templates for PCR amplification using B-TS as the forward primer and Tris(2,2'-bipyridyl)ruthenium (TBR)-labeled ACX (TBR-ACX) as the reverse primer. The amplified product is captured on streptavidin-coated paramagnetic beads and detected by ECL. Telomerase-positive HeLa cells were used to validate the feasibility of the method. The experimental results showed that as few as 10 cancer cells can be detected easily. The method is a useful tool for telomerase activity analysis due to its sensitivity, rapidity, safety, high throughput, and low cost. It can be used for screening a large number of clinical samples.

  9. Antitumor action of non thermal plasma sources, DBD and Plasma Gun, alone or in combined protocols

    Science.gov (United States)

    Robert, Eric; Brullé, Laura; Vandamme, Marc; Riès, Delphine; Le Pape, Alain; Pouvesle, Jean-Michel

    2012-10-01

    The presentation deals with the assessment of two non-thermal plasma sources developed and optimized for oncology applications. The first plasma source is a floating-electrode dielectric barrier discharge powered at a few hundred Hz, which delivers air plasma directly onto the surface of cell culture medium in dishes or onto the skin or organs of mice bearing cancer tumors. The second plasma source, the so-called Plasma Gun, is a plasma jet source triggered in noble gas and transferred through high-aspect-ratio flexible capillaries, targeting cells or tumors after plasma transfer in air through the ``plasma plume'' generated at the capillary outlet. In vitro evidence for massive cancer cell destruction and in vivo reductions of tumor activity and growth rate have been measured with both plasma sources. DNA damage, cell cycle arrest, and apoptosis induction were also demonstrated following the application of either plasma source, both in vitro and in vivo. The comparison of plasma treatment with state-of-the-art chemotherapeutic alternatives has been performed, and, last but not least, the benefit of combined protocols involving plasma and chemotherapeutic treatments has been evidenced for mice bearing orthotopic pancreas cancer and is under evaluation for colon tumors.

  10. Absolute Paleointensity Estimates using Combined Shaw and Pseudo-Thellier Experimental Protocols

    Science.gov (United States)

    Foucher, M. S.; Smirnov, A. V.

    2016-12-01

    Data on the long-term evolution of Earth's magnetic field intensity have great potential to advance our understanding of many aspects of the Earth's evolution. However, paleointensity determination is one of the most challenging aspects of paleomagnetic research, so the quantity and quality of existing paleointensity data remain limited, especially for older epochs. While the Thellier double-heating method remains the most commonly used paleointensity technique, its applicability is limited for many rocks that undergo magneto-mineralogical alteration during the successive heating steps required by the method. In order to reduce the probability of alteration, several alternative methods that involve a limited number of heating steps, or none, have been proposed. However, continued efforts are needed to better understand the physical foundations and relative efficiency of reduced-heating and non-heating methods in recovering the true paleofield strength, and to better constrain their calibration factors. We will present the results of our investigation of synthetic and natural magnetite-bearing samples using a combination of the LTD-DHT Shaw and pseudo-Thellier experimental protocols for absolute paleointensity estimation.
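    The core arithmetic behind a pseudo-Thellier estimate can be sketched as a least-squares slope calculation: the field estimate is taken proportional to the slope of NRM demagnetized against ARM acquired, scaled by a calibration factor. The data values and the calibration constant below are illustrative assumptions, not results from the study.

```python
# Sketch: pseudo-Thellier-style paleointensity from the slope of
# NRM lost vs. ARM gained (least squares), times a calibration factor.
# All numbers here are made up for illustration.

def paleointensity(nrm_lost, arm_gained, calibration_uT=1.0):
    n = len(nrm_lost)
    mx = sum(arm_gained) / n
    my = sum(nrm_lost) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(arm_gained, nrm_lost))
             / sum((x - mx) ** 2 for x in arm_gained))
    return abs(slope) * calibration_uT

# perfectly linear demo data: slope 0.5, calibration 80 -> 40.0 (microtesla)
print(paleointensity([0.0, 0.5, 1.0], [0.0, 1.0, 2.0], calibration_uT=80.0))
```

In practice the fit would be restricted to a linear segment of the demagnetization/remagnetization data and the calibration factor determined experimentally.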

  11. Refinement Modal Logic

    CERN Document Server

    Bozzelli, Laura; French, Tim; Hales, James; Pinchinat, Sophie

    2012-01-01

    In this paper we present refinement modal logic. A refinement is like a bisimulation, except that from the three relational requirements only 'atoms' and 'back' need to be satisfied. Our logic contains a new operator 'forall' in addition to the standard modalities 'Box' for each agent. The operator 'forall' acts as a quantifier over the set of all refinements of a given model. We call it the refinement operator. As a variation on a bisimulation quantifier, it can be seen as a refinement quantifier over a variable not occurring in the formula bound by the operator. The logic combines the simplicity of multi-agent modal logic with some powers of monadic second order quantification. We present a sound and complete axiomatization of multiagent refinement modal logic. We also present an extension of the logic to the modal mu-calculus, and an axiomatization for the single-agent version of this logic. Examples and applications are also discussed: to software verification and design (the set of agents can also be s...
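    The 'atoms'-and-'back' characterisation above can be made concrete with a small sketch. The single-agent restriction and the model encoding (dicts mapping states to atom sets and to successor sets) are assumptions made for illustration, not the paper's notation.

```python
# Sketch: check the two conditions that make a relation a refinement
# ('atoms' and 'back'; the 'forth' clause of bisimulation is dropped).

def is_refinement(rel, val_m, succ_m, val_n, succ_n):
    """rel: set of (s, t) pairs relating states of M to states of
    the candidate refinement N."""
    for s, t in rel:
        # 'atoms': related states satisfy the same propositional atoms
        if val_m[s] != val_n[t]:
            return False
        # 'back': every successor of t in N is matched by some successor of s
        for t2 in succ_n[t]:
            if not any((s2, t2) in rel for s2 in succ_m[s]):
                return False
    return True

# Toy example: N drops one branch of M, which a refinement may do.
val_m = {0: {"p"}, 1: {"q"}, 2: {"r"}}
succ_m = {0: {1, 2}, 1: set(), 2: set()}
val_n = {"a": {"p"}, "b": {"q"}}
succ_n = {"a": {"b"}, "b": set()}
rel = {(0, "a"), (1, "b")}
print(is_refinement(rel, val_m, succ_m, val_n, succ_n))  # prints True
```

Because 'forth' is not required, N need not preserve all behaviour of M; only what N does must trace back to M.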

  12. Combining Q2MM Modeling and Kinetic Studies for Refinement of the Osmium-catalyzed Asymmetric Dihydroxylation (AD) Mnemonic

    DEFF Research Database (Denmark)

    Fristrup, Peter; Jensen, Gitte Holm; Andersen, Marie Louise Nygaard

    2006-01-01

    The interactions between the substrate and the ligand in the Sharpless AD reaction have been examined in detail, using a combination of substrate competition experiments and molecular modeling of transition states. There is a good agreement between computational and experimental results, in parti...

  13. Using OpenRefine

    CERN Document Server

    Verborgh, Ruben

    2013-01-01

    The book is styled on a Cookbook, containing recipes - combined with free datasets - which will turn readers into proficient OpenRefine users in the fastest possible way.This book is targeted at anyone who works on or handles a large amount of data. No prior knowledge of OpenRefine is required, as we start from the very beginning and gradually reveal more advanced features. You don't even need your own dataset, as we provide example data to try out the book's recipes.

  14. Screening performance for trisomy 21 comparing first trimester combined screening and a first trimester contingent screening protocol including ductus venosus and tricuspid flow

    DEFF Research Database (Denmark)

    Ekelund, Charlotte Kvist; Petersen, Olav Bjørn; Sundberg, Karin Milner

    2012-01-01

    To compare the standard first trimester combined risk assessment for trisomy 21 with a contingent screening protocol including tricuspid flow and ductus venosus flow.
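    The contingent protocol being compared can be sketched as a simple decision rule: the first-trimester combined risk triages pregnancies into high-, intermediate-, and low-risk groups, and only the intermediate group is assessed with the ductus venosus and tricuspid flow markers. The 1:100 and 1:1000 cut-offs below are illustrative assumptions, not the thresholds used in the study.

```python
# Sketch of contingent-screening decision logic. Cut-offs are invented
# for illustration only.

def contingent_screen(combined_risk, ductus_venosus_abnormal=None,
                      tricuspid_regurgitation=None):
    """combined_risk: risk expressed as a fraction, e.g. 1/50 -> 0.02."""
    if combined_risk >= 1 / 100:          # high risk: refer directly
        return "invasive testing offered"
    if combined_risk < 1 / 1000:          # low risk: no further testing
        return "screen negative"
    # intermediate risk: the second-stage flow markers decide
    if ductus_venosus_abnormal or tricuspid_regurgitation:
        return "invasive testing offered"
    return "screen negative"

print(contingent_screen(1 / 50))                 # prints: invasive testing offered
print(contingent_screen(1 / 500, False, False))  # prints: screen negative
```

The point of the contingent design is that the extra ultrasound markers are only needed for the intermediate-risk group, which keeps the workload low.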

  15. An 8-Node Shell Element for Nonlinear Analysis of Shells Using the Refined Combination of Membrane and Shear Interpolation Functions

    Directory of Open Access Journals (Sweden)

    Woo-Young Jung

    2013-01-01

    An improved 8-node shell finite element applicable to the geometrically linear and nonlinear analyses of plates and shells is presented. Based on previous first-order shear deformation theory, the finite element model is further improved by the combined use of assumed natural strains and different sets of collocation points for the interpolation of the different strain components. The influence on the shell element of various conditions such as locations, number of enhanced membranes, and shear interpolation is also identified. By using the assumed natural strain method with proper interpolation functions, the present shell element exhibits neither membrane nor shear locking behavior, even when full integration is used in the formulation. Furthermore, to characterize the efficiency of these modifications of the 8-node shell finite elements, numerical studies are carried out for the geometrically linear and nonlinear analysis of plates and shells. In comparison to some other shell elements, the numerical examples indicate that the modified element exhibits locking-free behavior and better performance. More specifically, the numerical examples of an annular plate presented herein show the good validity, efficiency, and accuracy of the developed nonlinear shell element.

  16. Cross-Layer Protocol Combining Tree Routing and TDMA Slotting in Wireless Sensor Networks

    Science.gov (United States)

    Bai, Ronggang; Ji, Yusheng; Lin, Zhiting; Wang, Qinghua; Zhou, Xiaofang; Qu, Yugui; Zhao, Baohua

    Unlike in other networks, the load and direction of data traffic in wireless sensor networks are rather predictable, and the relationships between nodes are cooperative rather than competitive. These features allow the protocol stack to be designed in a cross-layer, interactive way instead of as a strict hierarchy. The proposed cross-layer protocol CLWSN optimizes the channel allocation in the MAC layer using information from the routing tables, reducing the conflicting sets and improving throughput. Simulations revealed that it outperforms SMAC and MINA in terms of delay and energy consumption.
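    The cross-layer idea, MAC slot allocation informed by the routing tree, can be sketched as follows. The tree encoding, the parent/sibling conflict rule, and the greedy slot assignment are illustrative assumptions rather than CLWSN's actual algorithm.

```python
# Sketch: derive each node's conflicting set from the routing tree
# (parent, children, and siblings) and greedily assign TDMA slots so
# that no two conflicting nodes share a slot.

def conflict_sets(parent):
    """parent: dict child -> parent describing the routing tree."""
    children = {}
    for c, p in parent.items():
        children.setdefault(p, set()).add(c)
    nodes = set(parent) | set(parent.values())
    conflicts = {}
    for n in nodes:
        nbrs = set(children.get(n, set()))          # own children
        if n in parent:
            p = parent[n]
            nbrs.add(p)                             # parent
            nbrs |= children.get(p, set()) - {n}    # siblings share the parent
        conflicts[n] = nbrs
    return conflicts

def assign_slots(conflicts):
    slots = {}
    for n in sorted(conflicts):                     # deterministic order
        taken = {slots[m] for m in conflicts[n] if m in slots}
        slots[n] = next(s for s in range(len(conflicts)) if s not in taken)
    return slots

tree = {"b": "a", "c": "a", "d": "b"}               # "a" is the sink
slots = assign_slots(conflict_sets(tree))
# no conflicting pair ends up in the same slot
assert all(slots[n] != slots[m]
           for n, cs in conflict_sets(tree).items() for m in cs)
```

Because the conflict sets come from the routing layer rather than from full two-hop neighbourhood discovery, fewer slots are typically needed, which is the throughput gain the abstract refers to.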

  17. RNA Structure Refinement using the ERRASER-Phenix pipeline

    Science.gov (United States)

    Chou, Fang-Chieh; Echols, Nathaniel; Terwilliger, Thomas C.; Das, Rhiju

    2015-01-01

    The final step of RNA crystallography involves the fitting of coordinates into electron density maps. The large number of backbone atoms in RNA presents a difficult and tedious challenge, particularly when experimental density is poor. The ERRASER-Phenix pipeline can improve an initial set of RNA coordinates automatically based on a physically realistic model of atomic-level RNA interactions. The pipeline couples diffraction-based refinement in Phenix with the Rosetta-based real-space refinement protocol ERRASER (Enumerative Real-Space Refinement ASsisted by Electron density under Rosetta). The combination of ERRASER and Phenix can improve the geometrical quality of RNA crystallographic models while maintaining or improving the fit to the diffraction data (as measured by Rfree). Here we present a complete tutorial for running ERRASER-Phenix through the Phenix GUI, from the command line, and via an application in the Rosetta On-line Server that Includes Everyone (ROSIE). PMID:26227049

  18. Uncovering Expertise-Related Differences in Troubleshooting Performance: Combining Eye Movement and Concurrent Verbal Protocol Data

    NARCIS (Netherlands)

    Van Gog, Tamara; Paas, Fred; Van Merriënboer, Jeroen

    2007-01-01

    This study explored the value of eye movement data for uncovering relatively small expertise-related differences in electrical circuit-troubleshooting performance, and describes that value in relation to concurrent verbal protocols. Results show that in the ‘problem orientation’ phase, higher expert

  19. Performance Analysis of Genetic Zone Routing Protocol Combined With Vertical Handover Algorithm for 3G-WiFi Offload

    Directory of Open Access Journals (Sweden)

    Setiyo Budiyanto

    2014-11-01

    In deployment scenarios with multiple base stations there is usually a deficiency in the routing protocols for load balancing in the wireless network. In this study, we propose a routing algorithm that can be implemented in Mobile Ad hoc Networks (MANETs) as well as third-generation (3G)-Wireless Fidelity (WiFi) offload networks. We combined the Genetic Zone Routing Protocol (GZRP) with the Vertical Handover (VHO) algorithm in a 3G-WiFi offload network with multiple base stations. Simulation results show that the proposed algorithm yields improvements in the received signal strength (increased by up to 25 dBm), user throughput (approximately 1 Mbps to 2.5 Mbps), and data rate (increased by up to 2.5 Mbps).
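    A vertical-handover decision of the kind combined with GZRP can be sketched as a received-signal-strength rule with hysteresis, so that the terminal does not ping-pong between networks. The threshold values below are illustrative assumptions, not the paper's parameters.

```python
# Sketch: RSS-based vertical handover between 3G and WiFi with a
# hysteresis band. Thresholds (-75 / -85 dBm) are invented for
# illustration.

def vho_decision(current_net, rss_wifi_dbm, wifi_enter=-75, wifi_exit=-85):
    """Pick the access network from the WiFi signal strength."""
    if current_net == "3G" and rss_wifi_dbm >= wifi_enter:
        return "WiFi"              # WiFi strong enough: offload the terminal
    if current_net == "WiFi" and rss_wifi_dbm < wifi_exit:
        return "3G"                # WiFi fading: fall back before it is lost
    return current_net             # inside the hysteresis band: stay put

# the gap between the two thresholds prevents ping-pong handovers
assert vho_decision("3G", -70) == "WiFi"
assert vho_decision("WiFi", -80) == "WiFi"   # weak but not lost: no handover
assert vho_decision("WiFi", -88) == "3G"
```

Real VHO algorithms usually also weigh load, cost, and dwell time, but the hysteresis structure above is the core of the decision.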

  20. The Combination of SAT and SBT Protocols May Help Reduce the Incidence of Ventilator-Associated Pneumonia in the Burn Intensive Care Unit.

    Science.gov (United States)

    Lee, Yann-Leei Larry; Sims, Kaci D; Butts, Charles C; Frotan, M Amin; Kahn, Steven; Brevard, Sidney B; Simmons, Jon D

    There are few published reports on the unique nature of burn patients using a paired spontaneous awakening and spontaneous breathing protocol. A combined protocol was implemented in our burn intensive care unit (ICU) on January 1, 2012. This study evaluates the impact of this protocol on patient outcomes in a burn ICU. We performed a retrospective review of our burn registry over 4 years, including all patients placed on mechanical ventilation. In the latter 2 years, patients meeting criteria underwent daily spontaneous awakening trial; if successful, spontaneous breathing trial was performed. Patient data included age, burn size, percent full-thickness burn, tracheostomy, and inhalation injury. Outcome measures included ventilator days, ICU and hospital lengths of stay, pneumonia, and disposition. Data were analyzed using Graphpad Prism and IBM SPSS software, with statistical significance defined as P < .05. There were 171 admissions in the preprotocol period and 136 after protocol implementation. Protocol patients had greater percent full-thickness burns, but did not differ in other characteristics. The protocol group had significantly shorter ICU length of stay, fewer ventilator days, and lower pneumonia incidence. Hospital length of stay, disposition, and mortality were not significantly different. Among patients with inhalation injuries, the protocol group exhibited fewer ventilator and ICU days. Protocol implementation in a burn ICU was accompanied by decreased ventilator days and a reduced incidence of pneumonia. A combined spontaneous awakening and breathing protocol is safe and may improve clinical practice in the burn ICU.

  1. Effects of a combined mechanical stimulation protocol: Value for skeletal muscle tissue engineering.

    Science.gov (United States)

    Boonen, Kristel J M; Langelaan, Marloes L P; Polak, Roderick B; van der Schaft, Daisy W J; Baaijens, Frank P T; Post, Mark J

    2010-05-28

    Skeletal muscle is an appealing topic for tissue engineering because of its variety in applications for regenerative medicine, in vitro physiological model systems, and in vitro meat production. Besides conventional biochemical cues to promote muscle tissue maturation in vitro, biophysical stimuli are necessary to reach the desired functionality and texture of the engineered tissue. Stretch, caused by active movements of the body, is an important factor present in the niche of muscle progenitor cells in vivo. We therefore investigated the effects of uniaxial ramp stretch (2%) followed by uniaxial intermittent dynamic stretch (4%) on C2C12 and murine muscle progenitor cells in a 2D and 3D environment and found that stretch negatively influenced maturation in all cases, demonstrated by decreased expression of MRFs and sarcomere proteins at the RNA level and a delay in the formation of cross striations. We therefore conclude that the current protocol is not recommended for skeletal muscle tissue engineering purposes.

  2. Promoting teamwork and surgical optimization: combining TeamSTEPPS with a specialty team protocol.

    Science.gov (United States)

    Tibbs, Sheila Marie; Moss, Jacqueline

    2014-11-01

    This quality improvement project was a 300-day descriptive preintervention and postintervention comparison consisting of a convenience sample of 18 gynecology surgical team members. We administered the Team Strategies & Tools to Enhance Performance and Patient Safety (TeamSTEPPS®) Teamwork Perception Questionnaire to measure the perception of teamwork. In addition, we collected data regarding rates of compliance (ie, huddle, time out) and measurable surgical procedure times. Results showed a statistically significant increase in the number of team members present for each procedure (mean 2.34 before compared with 2.61 after; P = .038) and in final time-out (FTO) compliance as a result of a clarification of the definition of the FTO (mean 1.05 before compared with 1.18 after; P = .004). Additionally, there was improvement in staff members' perception of teamwork. The implementation of team training, protocols, and algorithms can enhance surgical optimization, communication, and work relationships.

  3. Clinical Study of Endostar Combined with DP Protocol in Treatment of Advanced Esophageal Cancer

    Directory of Open Access Journals (Sweden)

    Wen-ying DENG

    2015-09-01

    Objective: To observe the clinical outcomes of Endostar combined with the DP regimen for treating advanced esophageal cancer. Methods: A total of 62 patients with advanced esophageal cancer admitted from May 2011 to May 2013 were enrolled in a prospective, randomized controlled trial; 2 cases were excluded from the study because of grade IV digestive tract reaction and myelosuppression. Therefore, 60 cases could be evaluated, and these were divided into a combined group (given Endostar + DP) and a single chemotherapy group, with 30 cases in each group. The level of VEGF, tumor size, and CT perfusion (CTP) parameters, including blood flow (BF), blood volume (BV), mean transit time (MTT), and permeability surface (PS), were determined before and after treatment for comparison. The Kaplan-Meier method was used to analyze the overall survival (OS) of the 2 groups. Results: The efficacy of the combined group was superior to that of the single chemotherapy group. The level of vascular endothelial growth factor (VEGF) in the combined group was obviously lower than that in the single chemotherapy group after treatment (P<0.01). Compared with pretreatment values in the combined group, BF, BV, and PS decreased while MTT increased after treatment (P<0.05); however, there were no significant differences in the single chemotherapy group before and after treatment (P>0.05). The median OS was 30 months (95%CI: 20.935-39.065) for the combined group and 21 months (95%CI: 15.109-26.591) for the single chemotherapy group (P=0.048). The 1-, 2- and 3-year survival rates were 86.2%, 59.3% and 36.6% in the combined group, and 70.8%, 32.1% and 17.8% in the single chemotherapy group, respectively. Conclusion: Endostar can down-regulate the expression of VEGF, improve the state of hypertransfusion and high permeability of tumor vessels, achieve a better curative effect with only mild adverse reactions, and prolong the survival time of patients with advanced esophageal cancer when combined with chemotherapy.

  4. Efficiency of estrous synchronization in tropical sheep by combining short-interval cloprostenol-based protocols and "male effect".

    Science.gov (United States)

    Contreras-Solis, I; Vasquez, B; Diaz, T; Letelier, C; Lopez-Sebastian, A; Gonzalez-Bulnes, A

    2009-04-01

    This study assessed the efficacy of a protocol combining short-interval cloprostenol-based treatments and the "male effect" for estrous synchronization in hair sheep. In Experiment 1, 24 ewes were randomly assigned to three groups (n=8) and treated with cloprostenol on Days 3, 5 and 7 after ovulation, respectively. Estradiol secretion during the follicular phase was similar among groups. Onset of estrus (Psynchronized in 68 ewes. Nineteen of them (group FGA) were treated using intravaginal sponges impregnated with fluorogestone acetate for 12 days and inseminated at 55h. Forty-nine females (group ME) were treated like ME group. Twenty-four (ME48 group) and 25 ewes (ME55 group) were inseminated at 48 and 55h after treatment, respectively. The fertility rate was numerically higher in ME48 than in the ME55 and FGA groups (62.5, 44.0 and 47.4%, respectively). In conclusion, the combined use of short-interval cloprostenol treatment and the "male effect" may be an adequate alternative for synchronizing estrus and applying artificial insemination in hair sheep throughout the entire year.

  5. Coloured Petri Net Refinement Specification and Correctness Proof with Coq

    Science.gov (United States)

    Choppy, Christine; Mayero, Micaela; Petrucci, Laure

    2009-01-01

    In this work, we address the formalisation in Coq of refinement for symmetric nets, a subclass of coloured Petri nets. We first provide a formalisation of the net models and of their type refinement in Coq. Then the Coq proof assistant is used to prove the refinement correctness lemma. An example adapted from a protocol example illustrates our work.

  6. Clinical Study of Endostar Combined with DP Protocol in Treatment of Advanced Esophageal Cancer

    Institute of Scientific and Technical Information of China (English)

    DENG Wen-ying; LI Ning; LUO Su-xia

    2015-01-01

    Objective: To observe the clinical outcomes of Endostar combined with the DP regimen for treating advanced esophageal cancer. Methods: A total of 62 patients with advanced esophageal cancer admitted from May 2011 to May 2013 were enrolled in a prospective, randomized controlled trial; 2 cases were excluded from the study because of grade IV digestive tract reaction and myelosuppression. Therefore, 60 cases could be evaluated, and these were divided into a combined group (given Endostar + DP) and a single chemotherapy group, 30 cases in each group. The level of VEGF, tumor size and CT perfusion (CTP) parameters, including blood flow (BF), blood volume (BV), mean transit time (MTT), and permeability surface (PS), before and after treatment were determined for comparison. The Kaplan-Meier method was used to analyze the overall survival (OS) of the 2 groups. Results: The efficacy of the combined group was superior to that of the single chemotherapy group. The level of vascular endothelial growth factor (VEGF) in the combined group was obviously lower than that in the single chemotherapy group after treatment (P<0.01). The median OS was 30 months (95%CI: 20.935-39.065) for the combined group and 21 months (95%CI: 15.109-26.591) for the single chemotherapy group, respectively (P=0.048). The 1-, 2- and 3-year survival rates were 86.2%, 59.3% and 36.6% in the combined group, and 70.8%, 32.1% and 17.8% in the single chemotherapy group, respectively. Conclusion: Endostar can down-regulate the expression of VEGF, improve the state of hypertransfusion and high permeability of tumor vessels, achieve a better curative effect with only mild adverse reactions, and prolong the survival time of patients with advanced esophageal cancer when combined with chemotherapy.

  7. Combined and sequential treatment of oral and maxillofacial malignancies: an evolving concept and clinical protocol

    Institute of Scientific and Technical Information of China (English)

    ZHENG Jia-wei; QIU Wei-liu; ZHANG Zhi-yuan

    2008-01-01

    Objective: To introduce the concept and rational regimens and present the latest developments in the combined treatment of oral and maxillofacial malignancies. Data sources: The related published literature was searched through the CNKI database and MEDLINE using the terms oral cancer, oral and maxillofacial malignancies, combined and sequential therapy, and multidisciplinary approach. Study selection: The available related literature was read and evaluated, and studies that met the inclusion criteria were selected. Results: The results show that oral and maxillofacial malignancies diagnosed at an early stage (stages Ⅰ and Ⅱ) can be well treated with surgery alone and/or radiotherapy with optimal outcome, but advanced or recurrent disease should be treated with rational combined and sequential treatment modalities. The use of concomitant chemoradiotherapy, taxane-containing three-drug induction regimens, and Cetuximab in combination with chemotherapy or radiotherapy demonstrated favorable results in previously untreated patients with head and neck squamous cell carcinoma. Conclusions: The concept of combined and sequential treatment of advanced oral and maxillofacial malignancies should be widely accepted, and the rational regimen for each individual and each type of entity should be determined based on the anatomical site and the patient's performance status.

  8. A study of two protocols combining aglepristone and cloprostenol to treat open cervix pyometra in the bitch.

    Science.gov (United States)

    Gobello, Cristina; Castex, Gervasio; Klima, Liliana; Rodríguez, Raúl; Corrada, Yanina

    2003-09-15

    To compare the efficacy and safety of two protocols using a combination of aglepristone and cloprostenol for the treatment of open cervix pyometra in the bitch, and to describe the progesterone (P4) serum profiles before and during treatment, 15 bitches were randomly allocated to two treatment groups: I (n = 8): aglepristone was administered at 10 mg/kg, s.c., on Days 1, 3, 8, and 15 (if not cured), combined with cloprostenol at a dose of 1 microg/kg, s.c., on Days 3 and 8; and II (n = 7): the same treatment with aglepristone as Treatment I, but cloprostenol on Days 3, 5, 8, 10, 12, and 15 (if not cured). Before the beginning of the treatments and then on Days 8, 15, and 29, all bitches were evaluated for clinical signs, side effects, hemogram, serum P4 concentrations, and uterine diameters. Bitches in both treatment groups, with (n = 6) or without (n = 9; ≥1.2 ng/ml) initial basal P4 serum concentrations, achieved treatment success without side effects and with no significant differences, either on Day 15 (6/8 for Treatment I and 4/7 for Treatment II) or on Day 29 (2/8 for Treatment I and 3/7 for Treatment II). In both treatment groups, clinical signs, blood parameters, and uterine diameters improved to normal values throughout the experiments. A significant interaction between day and treatment was found for the percentage change in P4 when all bitches were considered together. Redevelopment of pyometra in the next estrous cycle occurred in 20% of the bitches. One nonrecurrent bitch was mated and whelped a normal litter. It is concluded that these two combined protocols proved to be efficient and safe in reversing clinical signs of open cervix pyometra independently of initial P4 concentrations, and that the number of cloprostenol administrations seemed to have an effect on P4 serum changes throughout treatment.

  9. Refining Inductive Types

    CERN Document Server

    Atkey, Robert; Ghani, Neil

    2012-01-01

    Dependently typed programming languages allow sophisticated properties of data to be expressed within the type system. Of particular use in dependently typed programming are indexed types that refine data by computationally useful information. For example, the N-indexed type of vectors refines lists by their lengths. Other data types may be refined in similar ways, but programmers must produce purpose-specific refinements on an ad hoc basis, developers must anticipate which refinements to include in libraries, and implementations must often store redundant information about data and their refinements. In this paper we show how to generically derive inductive characterisations of refinements of inductive types, and argue that these characterisations can alleviate some of the aforementioned difficulties associated with ad hoc refinements. Our characterisations also ensure that standard techniques for programming with and reasoning about inductive types are applicable to refinements, and that refinements can the...
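    The redundancy the abstract mentions can be illustrated even in a dynamically typed setting: an ad hoc refinement of lists by their length must store the index alongside the data and keep the two in agreement by hand. Python, the `Vec` class, and its invariant are illustrative assumptions; the paper itself works in a dependently typed setting where the index lives in the type.

```python
# Sketch: a length-indexed "vector" as an ad hoc refinement of lists.
# The length is stored redundantly next to the data, and every
# operation must maintain the invariant by hand.

from dataclasses import dataclass

@dataclass(frozen=True)
class Vec:
    """A list refined by its length; the index must agree with the data."""
    length: int
    items: tuple

    def __post_init__(self):
        # the refinement invariant, checked at construction time
        assert self.length == len(self.items)

def cons(x, v: Vec) -> Vec:
    return Vec(v.length + 1, (x,) + v.items)   # index updated in lockstep

v = cons(1, cons(2, Vec(0, ())))
assert v.length == 2
```

In a dependently typed language the length would be part of the type (`Vec n a`), the invariant would hold by construction, and, as the paper argues, such refinements can be derived generically rather than written out case by case.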

  10. Long term successful weight loss with a combination biphasic ketogenic Mediterranean diet and Mediterranean diet maintenance protocol.

    Science.gov (United States)

    Paoli, Antonio; Bianco, Antonino; Grimaldi, Keith A; Lodi, Alessandra; Bosco, Gerardo

    2013-12-18

    Weight loss protocols can only be considered successful if they deliver consistent results over the long term, a goal which is often elusive, so much so that the term "yo-yo" is used to describe the perennial weight loss/weight regain battle common in obesity. We hypothesized that a ketogenic Mediterranean diet with phytoextracts (KEMEPHY) combined with the acknowledged health benefits of traditional Mediterranean nutrition may favor long term weight loss. We analysed 89 male and female obese subjects, aged between 25 and 65 years, who were overall healthy apart from being overweight. The subjects followed a staged diet protocol over a period of 12 months: 20 days of KEMEPHY; 20 days of a low-carbohydrate, non-ketogenic diet; 4 months of Mediterranean normocaloric nutrition; a second 20-day ketogenic phase; and 6 months of Mediterranean normocaloric nutrition. For the majority of subjects (88.25%) there was significant loss of weight (from 100.7 ± 16.54 to 84.59 ± 9.71 kg; BMI from 35.42 ± 4.11 to 30.27 ± 3.58) and body fat (from 43.44% ± 6.34% to 33.63% ± 7.6%) during both ketogenic phases, followed by successful maintenance, without weight regain, during the 6 month stabilization phase, with only 8 subjects failing to comply. There were also significant and stable decreases in total cholesterol, LDLc, triglycerides and glucose levels over the 12 month study period. HDLc showed small increases after the ketogenic phases but over the full 12 months there was no significant change. No significant changes were observed in ALT, AST, creatinine or BUN. The combination of a biphasic KEMEPHY diet separated by longer periods of maintenance nutrition, based on the traditional Mediterranean diet, led to successful long term weight loss and improvements in health risk factors in a majority of subjects; compliance was very high, which was a key determinant of the results seen.

  11. RNA 3D Structure Modeling by Combination of Template-Based Method ModeRNA, Template-Free Folding with SimRNA, and Refinement with QRNAS.

    Science.gov (United States)

    Piatkowski, Pawel; Kasprzak, Joanna M; Kumar, Deepak; Magnus, Marcin; Chojnowski, Grzegorz; Bujnicki, Janusz M

    2016-01-01

    RNA encompasses an essential part of all known forms of life. The functions of many RNA molecules are dependent on their ability to form complex three-dimensional (3D) structures. However, experimental determination of RNA 3D structures is laborious and challenging, and therefore, the majority of known RNAs remain structurally uncharacterized. To address this problem, computational structure prediction methods were developed that either utilize information derived from known structures of other RNA molecules (by way of template-based modeling) or attempt to simulate the physical process of RNA structure formation (by way of template-free modeling). All computational methods suffer from various limitations that make theoretical models less reliable than high-resolution experimentally determined structures. This chapter provides a protocol for computational modeling of RNA 3D structure that overcomes major limitations by combining two complementary approaches: template-based modeling, which is capable of predicting global architectures based on similarity to other molecules but often fails to predict local unique features, and template-free modeling, which can predict the local folding but is limited to modeling the structure of relatively small molecules. Here, we combine the use of the template-based method ModeRNA with the template-free method SimRNA. ModeRNA requires a sequence alignment of the target RNA sequence to be modeled with a template of known structure; it generates a model that predicts the structure of a conserved core and provides a starting point for modeling of variable regions. SimRNA can be used to fold small RNAs and to refine models of larger RNAs that have a correctly modeled core. ModeRNA can be either downloaded, compiled and run locally or run through a web interface at http://genesilico.pl/modernaserver/ .
SimRNA is currently available to download for local use as a precompiled software package at http://genesilico.pl/software/stand-alone/simrna and as a

  12. Ultrasound assisted extraction of food and natural products. Mechanisms, techniques, combinations, protocols and applications. A review.

    Science.gov (United States)

    Chemat, Farid; Rombaut, Natacha; Sicaire, Anne-Gaëlle; Meullemiestre, Alice; Fabiano-Tixier, Anne-Sylvie; Abert-Vian, Maryline

    2017-01-01

    This review presents a complete picture of current knowledge on ultrasound-assisted extraction (UAE) in food ingredients and products, nutraceutical, cosmetic, pharmaceutical and bioenergy applications. It provides the necessary theoretical background and some details about extraction by ultrasound, the techniques and their combinations, the mechanisms (fragmentation, erosion, capillarity, detexturation, and sonoporation), applications from laboratory to industry, security, and environmental impacts. The ultrasound extraction procedures and the important parameters influencing their performance are also covered, together with the advantages and drawbacks of each UAE technique. Ultrasound-assisted extraction is a research topic that affects several fields of modern plant-based chemistry. All the reported applications have shown that ultrasound-assisted extraction is a green and economically viable alternative to conventional techniques for food and natural products. The main benefits are reductions in extraction and processing time, in the amount of energy and solvents used, in the number of unit operations, and in CO2 emissions.

  13. Design of a protocol for combined laser hyperthermia-photodynamic therapy in the esophagus

    Energy Technology Data Exchange (ETDEWEB)

    London, R A; Eichler, J; Liebetrudt, J; Ziegenhagen, L

    2000-02-01

    Photodynamic laser therapy (PDT) for esophageal cancer has recently been studied in animal and clinical trials. In several animal experiments a synergistic effect was found by simultaneously applying PDT and hyperthermia (HT). In this paper an optical fiber system is described which can be used in the esophagus for combined PDT with a 1 W dye laser and HT with a 15–40 W Nd:YAG laser. Phantoms using cow muscle were developed to simulate the geometry of the esophagus. The spatial-temporal temperature field during HT was measured. The results were compared with calculations using a coupled Monte Carlo laser transport/finite difference heat transport model implemented in the LATIS computer program. Measurements and calculations yield a realistic description of the temperature distribution during HT under various experimental conditions. The LATIS program allows the prediction of the effects of blood perfusion for in-vivo situations. The results show that perfusion has considerable influence on the temperature field, which must be considered for in-vivo applications.
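
    The qualitative effect of perfusion on the temperature field can be illustrated with a toy one-dimensional explicit finite-difference model in which a perfusion term carries heat back toward body temperature. This is a drastic simplification of the coupled Monte Carlo/finite-difference LATIS calculation described above, and every parameter below is illustrative:

```python
# Toy 1D bioheat model: dT/dt = a*d2T/dx2 + S/(rho*c) - w*(T - Ta),
# where w lumps the blood-perfusion heat sink into a single rate constant.
# Illustrative parameters only; not the LATIS model from the paper.
import numpy as np

def simulate(perfusion_w, steps=600):
    nx, dx, dt = 41, 1e-3, 0.1          # 4 cm of tissue, 0.1 s time steps
    k, rho, c = 0.5, 1000.0, 3600.0     # conductivity, density, heat capacity
    a = k / (rho * c)                   # diffusivity; dt << dx**2/(2a), stable
    Ta = 37.0                           # body temperature, deg C
    T = np.full(nx, Ta)
    S = np.zeros(nx)
    S[18:23] = 2e6                      # laser heat deposition near centre, W/m^3
    for _ in range(steps):
        lap = np.zeros(nx)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        T = T + dt * (a * lap + S / (rho * c) - perfusion_w * (T - Ta))
        T[0] = T[-1] = Ta               # boundaries held at body temperature
    return T

T_no = simulate(0.0)     # no perfusion
T_perf = simulate(0.01)  # with perfusion: the heat sink lowers the peak
print(T_no.max(), T_perf.max())
```

    Since the perfusion term is a pure heat sink wherever tissue is above body temperature, the perfused peak is always lower, mirroring the paper's conclusion that perfusion must be accounted for in vivo.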

  14. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence faces growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Because the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
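
    The CPI statistic itself has a simple closed form: at each locus, sum the frequencies of all alleles observed in the mixture and square the sum (the probability that both alleles of a random person fall among those observed); the CPI is the product of these per-locus values, and CPE = 1 - CPI. A minimal sketch (the allele frequencies are illustrative, not from the article, and no dropout or stutter handling is attempted):

```python
# Combined Probability of Inclusion (CPI) for a DNA mixture.
# Per locus: PI = (sum of observed-allele frequencies)^2.
# CPI = product of PI over loci; CPE = 1 - CPI.
# Frequencies below are illustrative only.

def cpi(loci_allele_freqs):
    """loci_allele_freqs: one list of observed-allele frequencies per locus,
    assuming a mixture with no suspected dropout at any locus."""
    result = 1.0
    for freqs in loci_allele_freqs:
        p = sum(freqs)      # probability a random allele is among those observed
        result *= p * p     # both alleles of a random person must be observed
    return result

mixture = [
    [0.10, 0.20],          # locus 1: two alleles observed
    [0.25, 0.05, 0.10],    # locus 2: three alleles observed
]
print(cpi(mixture))        # 0.09 * 0.16 = 0.0144
print(1 - cpi(mixture))    # combined probability of exclusion
```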

  15. Hirshfeld atom refinement.

    Science.gov (United States)

    Capelli, Silvia C; Bürgi, Hans-Beat; Dittrich, Birger; Grabowsky, Simon; Jayatilaka, Dylan

    2014-09-01

    Hirshfeld atom refinement (HAR) is a method which determines structural parameters from single-crystal X-ray diffraction data by using an aspherical atom partitioning of tailor-made ab initio quantum mechanical molecular electron densities without any further approximation. Here the original HAR method is extended by implementing an iterative procedure of successive cycles of electron density calculations, Hirshfeld atom scattering factor calculations and structural least-squares refinements, repeated until convergence. The importance of this iterative procedure is illustrated via the example of crystalline ammonia. The new HAR method is then applied to X-ray diffraction data of the dipeptide Gly-l-Ala measured at 12, 50, 100, 150, 220 and 295 K, using Hartree-Fock and BLYP density functional theory electron densities and three different basis sets. All positions and anisotropic displacement parameters (ADPs) are freely refined without constraints or restraints - even those for hydrogen atoms. The results are systematically compared with those from neutron diffraction experiments at the temperatures 12, 50, 150 and 295 K. Although non-hydrogen-atom ADPs differ by up to three combined standard uncertainties (csu's), all other structural parameters agree within less than 2 csu's. Using our best calculations (BLYP/cc-pVTZ, recommended for organic molecules), the accuracy of determining bond lengths involving hydrogen atoms from HAR is better than 0.009 Å for temperatures of 150 K or below; for hydrogen-atom ADPs it is better than 0.006 Å² as judged from the mean absolute X-ray minus neutron differences. These results are among the best ever obtained. Remarkably, the precision of determining bond lengths and ADPs for the hydrogen atoms from the HAR procedure is comparable with that from the neutron measurements - an outcome which is obtained with a routinely achievable resolution of the X-ray data of 0.65 Å.
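
    The iterative scheme described above (recompute the electron density from the current geometry, derive Hirshfeld atom scattering factors, refine by least squares, repeat until the shifts vanish) has the generic shape of a fixed-point iteration. A toy Python skeleton of that loop, with stand-in functions rather than real quantum-chemistry or crystallographic code:

```python
# Generic shape of an iterate-until-convergence refinement loop: each cycle
# recomputes the model from the current parameters, and iteration stops when
# the largest parameter shift falls below a threshold. The "cycle" below is
# a toy stand-in, not a real density/scattering-factor/least-squares step.

def iterative_refinement(params, one_cycle, tol=1e-6, max_cycles=50):
    for cycle in range(1, max_cycles + 1):
        new_params = one_cycle(params)   # density -> scattering factors -> LSQ
        shift = max(abs(n - o) for n, o in zip(new_params, params))
        params = new_params
        if shift < tol:                  # converged: all shifts below threshold
            return params, cycle
    raise RuntimeError("did not converge")

# Toy "cycle" that contracts halfway toward a fixed point each iteration,
# mimicking the convergent behaviour of the successive HAR cycles.
target = [1.0, 2.0, 3.0]
cycle_fn = lambda p: [0.5 * (pi + ti) for pi, ti in zip(p, target)]
final, n_cycles = iterative_refinement([0.0, 0.0, 0.0], cycle_fn)
print(final, n_cycles)
```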

  16. Combining motivational and volitional strategies to promote unsupervised walking in patients with fibromyalgia: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Pastor, María-Ángeles; López-Roig, Sofía; Lledó, Ana; Peñacoba, Cecilia; Velasco, Lilian; Schweiger-Gallo, Inge; Cigarán, Margarita; Ecija, Carmen; Limón, Ramón; Sanz, Yolanda

    2014-04-11

    Fibromyalgia patients are often advised to engage in regular low- to moderate-intensity physical exercise. The need for fibromyalgia patients to walk has been stressed in previous research. Behavioral self-regulation theories suggest that a combination of motivational aspects (to develop or strengthen a behavioral intention: Theory of Planned Behavior) and volitional aspects (translation of intention into behavior: implementation intentions) is more effective than a single intervention. In this paper, we describe a protocol for identifying the motivational processes (using the Theory of Planned Behavior) involved in the practice of walking (phase I) and for studying the efficacy of an intervention that combines motivational and volitional contents to enhance the acquisition and continuation of this exercise behavior (phase II). The paper also shows the characteristics of eligible individuals (women who do not walk) and ineligible populations (women who walk, or who do not walk because of comorbidity without a medical recommendation to walk). Both groups consist of members of any of four patients' associations in Spain who are between 18 and 70 years of age and meet the London Fibromyalgia Epidemiology Study Screening Questionnaire criteria for fibromyalgia. Furthermore, using this study protocol, we will explore the characteristics of participants (eligible women who agreed to participate in the study) and nonparticipants (eligible women who refused to participate). Two studies will be conducted: Phase I will be a cross-sectional study, and phase II will be a triple-blind, randomized longitudinal study with two treatment groups and one active control group. The questionnaires were sent to a total of 2,227 members of four patients' associations in Spain. A total of 920 participants with fibromyalgia returned the questionnaires, and 582 were ultimately selected to participate. The first data gathered have allowed us to identify the characteristics of the study population and

  17. Combined direct regeneration protocols in tissue culture of different cumin genotypes based on pre-existing meristems.

    Science.gov (United States)

    Ebrahimie, Esmaeil; Hosseinzadeh, Abdolhadi; Nagavi, Mohammad Reza; Ghannadha, Mohammad Reza; Mohammadie-Dehcheshmeh, Manijeh

    2007-05-01

    Rapid and genotype-independent protocols for two direct in vitro morphogenesis pathways, direct shoot organogenesis from embryo and direct shoot proliferation from node, have been developed in cumin (Cuminum cyminum L.). Direct regeneration, which occurs without passing through a callus phase, is important since less somaclonal variation and genotype dependency are likely to arise from these methods in comparison with regeneration through callus. After embryo culture, shoots with single-cellular origin were regenerated from the meristematic zone of the embryo without any intermediate callus phase. In contrast, proliferated shoots with multi-cellular origin were directly regenerated from the axillary buds (meristems) of node explants. Effects of different concentrations of 6-benzylaminopurine (BAP), alpha-naphthaleneacetic acid (NAA) and indole-3-acetic acid (IAA) in B5 medium on embryo and node cultures as well as on subculture were studied in detail. In the direct organogenesis pathway from the embryo explant, 0.1 mg L(-1) NAA + 1 mg L(-1) IAA gave the highest shoot regeneration response (89.5 shoots per regenerated explant), whereas 0.1 mg L(-1) BAP + 1 mg L(-1) NAA was the most effective combination for direct shoot proliferation from the node explant (42 shoots per regenerated explant). BAP (a cytokinin) had an inhibitory effect on induction of the direct shoot organogenesis pathway from the embryo explant, while a low concentration of BAP (0.1 mg L(-1)) had a positive effect on the direct shoot proliferation pathway from the node explant. Subculturing was not necessary for shoot multiplication and elongation in embryo culture, whereas multiplication and elongation of shoots in node culture required subculture on growth regulator-free medium. In another part of the study, the behavior of different cumin genotypes in the direct regeneration pathways was examined. Both the direct organogenesis and direct proliferation pathways were applicable to different cumin genotypes and regenerated plants were

  18. Long Term Successful Weight Loss with a Combination Biphasic Ketogenic Mediterranean Diet and Mediterranean Diet Maintenance Protocol

    Directory of Open Access Journals (Sweden)

    Antonio Paoli

    2013-12-01

    Full Text Available Weight loss protocols can only be considered successful if they deliver consistent results over the long term—a goal which is often elusive, so much so that the term “yo-yo” is used to describe the perennial weight loss/weight regain battle common in obesity. We hypothesized that a ketogenic Mediterranean diet with phytoextracts (KEMEPHY) combined with the acknowledged health benefits of traditional Mediterranean nutrition may favor long term weight loss. We analysed 89 male and female obese subjects, aged between 25 and 65 years, who were overall healthy apart from being overweight. The subjects followed a staged diet protocol over a period of 12 months: 20 days of KEMEPHY; 20 days low carb, non-ketogenic; 4 months Mediterranean normocaloric nutrition; a second 20 day ketogenic phase followed by 6 months of Mediterranean normocaloric nutrition. For the majority of subjects (88.25%) there was significant loss of weight (from 100.7 ± 16.54 to 84.59 ± 9.71 kg; BMI from 35.42 ± 4.11 to 30.27 ± 3.58) and body fat (from 43.44% ± 6.34% to 33.63% ± 7.6%) during both ketogenic phases, followed by successful maintenance, without weight regain, during the 6 month stabilization phase, with only 8 subjects failing to comply. There were also significant and stable decreases in total cholesterol, LDLc, triglycerides and glucose levels over the 12 month study period. HDLc showed small increases after the ketogenic phases, but over the full 12 months there was no significant change. No significant changes were observed in ALT, AST, creatinine or BUN. The combination of a biphasic KEMEPHY diet separated by longer periods of maintenance nutrition, based on the traditional Mediterranean diet, led to successful long term weight loss and improvements in health risk factors in a majority of subjects; compliance was very high, which was a key determinant of the results seen.

  19. On the refinement calculus

    CERN Document Server

    Vickers, Trevor

    1992-01-01

    On the Refinement Calculus gives one view of the development of the refinement calculus and its attempt to bring together - among other things - Z specifications and Dijkstra's programming language. It is an excellent source of reference material for all those seeking the background and mathematical underpinnings of the refinement calculus.

  20. Does roflumilast decrease exacerbations in severe COPD patients not controlled by inhaled combination therapy? the REACT study protocol

    Directory of Open Access Journals (Sweden)

    Calverley PM

    2012-06-01

    Full Text Available Peter MA Calverley,1 Fernando J Martinez,2 Leonardo M Fabbri,3 Udo-Michael Goehring,4 Klaus F Rabe5–7 1Clinical Science Center, University Hospital Aintree, Liverpool, United Kingdom; 2Department of Internal Medicine, University of Michigan, MI, USA; 3Department of Respiratory Diseases, University of Modena and Reggio Emilia, Modena, Italy; 4Nycomed (a Takeda company), Konstanz, Germany; 5University of Kiel, Kiel, Germany; 6Krankenhaus Grosshansdorf, Center for Pulmonology and Thoracic Surgery, Grosshansdorf, Germany; 7Department of Pulmonology, Leiden University Medical Center, Leiden, The Netherlands. Background: Many patients with chronic obstructive pulmonary disease (COPD) continue to suffer exacerbations, even when treated with maximum recommended therapy (eg, inhaled combinations of long-acting β2-agonist and high dose inhaled corticosteroids, with or without a long-acting anticholinergic [long-acting muscarinic antagonist]). Roflumilast is approved to treat severe COPD in patients with chronic bronchitis – and a history of frequent exacerbations – as an add-on to bronchodilators. Purpose: The REACT (Roflumilast in the Prevention of COPD Exacerbations While Taking Appropriate Combination Treatment) study (identification number RO-2455-404-RD, clinicaltrials.gov identifier NCT01329029) will investigate whether roflumilast further reduces exacerbations when added to inhaled combination therapy in patients still suffering from frequent exacerbations. Patients and methods: REACT is a 1-year randomized, double-blind, multicenter, phase III/IV study of roflumilast 500 µg once daily or placebo on top of a fixed long-acting β2-agonist/inhaled corticosteroid combination. A concomitant long-acting muscarinic antagonist will be allowed at stable doses. The primary outcome is the rate of moderate or severe COPD exacerbations. Using a Poisson regression model with a two-sided significance level of 5%, a sample size of 967 patients per treatment group

  1. OPTIMIZING EUCALYPTUS PULP REFINING

    Institute of Scientific and Technical Information of China (English)

    Vail Manfredi

    2004-01-01

    This paper discusses the refining of bleached eucalyptus kraft pulp (BEKP).Pilot plant tests were carried out to optimize the refining process and to identify the effects of refining variables on final paper quality and process costs.The following parameters are discussed: pulp consistency, disk pattern design, refiner speed,energy input, refiner configuration (parallel or serial)and refining intensity.The effects of refining on pulp fibers were evaluated against the pulp quality properties, such as physical strengths, bulk, opacity and porosity, as well as the interactions with the papermaking process, such as paper machine runnability, paper breaks and refining control.The results showed that process optimization,considering pulp quality and refining costs, was obtained when eucalyptus pulp is refined at the lowest intensity and the highest pulp consistency possible. Changes in the operational refining conditions will have the highest impact on total energy requirements (costs) without any significant effect on final paper properties.It was also observed that classical ways to control the industrial operation, such as those based on drainage measurements, do not represent the best alternative to maximize either the final paper properties or the paper machine runnability.
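
    Two of the variables discussed, refining intensity and energy input, are conventionally quantified as specific edge load (net power divided by the bar-edge length cut per second) and net specific refining energy (net power per tonne of dry pulp). A back-of-the-envelope sketch with illustrative numbers (not values from the paper):

```python
# Specific edge load (SEL, J/m) and net specific refining energy (SRE, kWh/t).
# SEL = net power / (bar edge length per revolution * rotational speed);
# SRE = net power / dry-pulp mass flow. All numbers are illustrative.

def specific_edge_load(net_power_kw, bel_km_per_rev, speed_rpm):
    edge_speed_m_per_s = bel_km_per_rev * 1000.0 * speed_rpm / 60.0
    return net_power_kw * 1000.0 / edge_speed_m_per_s     # J/m (= Ws/m)

def specific_refining_energy(net_power_kw, flow_l_per_min, consistency):
    # assumes stock density of ~1 kg/L; consistency is a mass fraction
    dry_t_per_h = flow_l_per_min * 60.0 * consistency / 1000.0
    return net_power_kw / dry_t_per_h                     # kWh/t

sel = specific_edge_load(net_power_kw=200, bel_km_per_rev=100, speed_rpm=1500)
sre = specific_refining_energy(net_power_kw=200, flow_l_per_min=1000,
                               consistency=0.045)
print(round(sel, 3), "J/m;", round(sre, 1), "kWh/t")
```

    With these illustrative inputs the intensity comes out at 0.08 J/m, in the low-intensity range the paper recommends for eucalyptus, and the high consistency keeps the specific energy per tonne modest.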

  2. Hirshfeld atom refinement

    Directory of Open Access Journals (Sweden)

    Silvia C. Capelli

    2014-09-01

    Full Text Available Hirshfeld atom refinement (HAR) is a method which determines structural parameters from single-crystal X-ray diffraction data by using an aspherical atom partitioning of tailor-made ab initio quantum mechanical molecular electron densities without any further approximation. Here the original HAR method is extended by implementing an iterative procedure of successive cycles of electron density calculations, Hirshfeld atom scattering factor calculations and structural least-squares refinements, repeated until convergence. The importance of this iterative procedure is illustrated via the example of crystalline ammonia. The new HAR method is then applied to X-ray diffraction data of the dipeptide Gly–l-Ala measured at 12, 50, 100, 150, 220 and 295 K, using Hartree–Fock and BLYP density functional theory electron densities and three different basis sets. All positions and anisotropic displacement parameters (ADPs) are freely refined without constraints or restraints – even those for hydrogen atoms. The results are systematically compared with those from neutron diffraction experiments at the temperatures 12, 50, 150 and 295 K. Although non-hydrogen-atom ADPs differ by up to three combined standard uncertainties (csu's), all other structural parameters agree within less than 2 csu's. Using our best calculations (BLYP/cc-pVTZ, recommended for organic molecules), the accuracy of determining bond lengths involving hydrogen atoms from HAR is better than 0.009 Å for temperatures of 150 K or below; for hydrogen-atom ADPs it is better than 0.006 Å² as judged from the mean absolute X-ray minus neutron differences. These results are among the best ever obtained. Remarkably, the precision of determining bond lengths and ADPs for the hydrogen atoms from the HAR procedure is comparable with that from the neutron measurements – an outcome which is obtained with a routinely achievable resolution of the X-ray data of 0.65 Å.

  3. Protocol for the process evaluation of interventions combining performance-based financing with health equity in Burkina Faso.

    Science.gov (United States)

    Ridde, Valéry; Turcotte-Tremblay, Anne-Marie; Souares, Aurélia; Lohmann, Julia; Zombré, David; Koulidiati, Jean Louis; Yaogo, Maurice; Hien, Hervé; Hunt, Matthew; Zongo, Sylvie; De Allegri, Manuela

    2014-10-12

    The low quality of healthcare and the presence of user fees in Burkina Faso contribute to low utilization of healthcare and elevated levels of mortality. To improve access to high-quality healthcare and equity, national authorities are testing different intervention arms that combine performance-based financing with community-based health insurance and pro-poor targeting. There is a need to evaluate the implementation of these unique approaches. We developed a research protocol to analyze the conditions that led to the emergence of these intervention arms, the fidelity between the activities initially planned and those conducted, the implementation and adaptation processes, the sustainability of the interventions, the possibilities for scaling them up, and their ethical implications. The study adopts a longitudinal multiple case study design with several embedded levels of analyses. To represent the diversity of contexts where the intervention arms are carried out, we will select three districts. Within districts, we will select both primary healthcare centers (n = 18) representing different intervention arms and the district or regional hospital (n = 3). We will select contrasted cases in relation to their initial performance (good, fair, poor). Over a period of 18 months, we will use quantitative and qualitative data collection and analytical tools to study these cases including in-depth interviews, participatory observation, research diaries, and questionnaires. We will give more weight to qualitative methods compared to quantitative methods. Performance-based financing is expanding rapidly across low- and middle-income countries. The results of this study will enable researchers and decision makers to gain a better understanding of the factors that can influence the implementation and the sustainability of complex interventions aiming to increase healthcare quality as well as equity.

  4. Automated protein structure modeling in CASP9 by I-TASSER pipeline combined with QUARK-based ab initio folding and FG-MD-based structure refinement.

    Science.gov (United States)

    Xu, Dong; Zhang, Jian; Roy, Ambrish; Zhang, Yang

    2011-01-01

    I-TASSER is an automated pipeline for protein tertiary structure prediction using multiple threading alignments and iterative structure assembly simulations. In CASP9 experiments, two new algorithms, QUARK and fragment-guided molecular dynamics (FG-MD), were added to the I-TASSER pipeline for improving the structural modeling accuracy. QUARK is a de novo structure prediction algorithm used for structure modeling of proteins that lack detectable template structures. For distantly homologous targets, QUARK models are found useful as a reference structure for selecting good threading alignments and guiding the I-TASSER structure assembly simulations. FG-MD is an atomic-level structural refinement program that uses structural fragments collected from PDB structures to guide molecular dynamics simulations and improve the local structure of predicted models, including hydrogen-bonding networks, torsion angles, and steric clashes. Despite considerable progress in both template-based and template-free structure modeling, significant improvements in protein target classification, domain parsing, model selection, and ab initio folding of β-proteins are still needed to further improve the I-TASSER pipeline. Copyright © 2011 Wiley-Liss, Inc.

  5. Refining the Openness-Performance Relationship: Construct Specificity, Contextualization, Social Skill, and the Combination of Trait Self- and Other-Ratings.

    Science.gov (United States)

    Kholin, Mareike; Meurs, James A; Blickle, Gerhard; Wihler, Andreas; Ewen, Christian; Momm, Tassilo D

    2016-01-01

    Scholars have raised concerns that openness to experience has ambiguous relationships with performance. In this study, we examine both openness and one of its narrower dimensions, learning approach. In addition, the research context was made narrow (i.e., higher education academic performance in science), and social skill was interactively combined with peer- and self-rated personality in the prediction of academic performance (i.e., grades). We found that those high on learning approach, but not those high on openness, performed better academically 1 year later than those lower on learning approach. Furthermore, for those high and average on social skill, increased peer-rated learning approach was associated with higher performance. Finally, the combination of self- and other-ratings of learning approach was a better predictor of academic performance than the combination of self- and other-ratings of openness. The relationship of openness with academic performance benefits from narrowing predictors and criteria, framing the study within a relevant context, accounting for social skill, and combining self- and other trait ratings.

  6. China Becomes Globe's Second Largest Oil Refiner

    Institute of Scientific and Technical Information of China (English)

    Zhang Weijun

    2010-01-01

    China's crude oil refining capacity reached 477 million tons by the end of last year, ranking second in the world. CNPC and Sinopec now own 27 percent of the country's oil refineries, with a combined refining capacity amounting to 76 percent of the country's total. As the country's biggest oil refiner, Sinopec has increased its refining capacity by 72.8 percent in the past ten years, a growth rate of 6.3 percent per year, ranking third in the world, according to statistics released by Sinopec. Meanwhile, the total capacity of China's local oil refining enterprises has reached 88 million tons per year. According to Sinopec, China has built 17 ten-million-ton oil refineries, which account for half of the country's total capacity.

  7. Combined pre-injection wrist and ankle MRI protocol and steroid joint injections in juvenile idiopathic arthritis

    Energy Technology Data Exchange (ETDEWEB)

    Kan, J.H. [Texas Children's Hospital, Department of Radiology, Houston, TX (United States)]; Graham, T.B. [Monroe Carell Jr. Children's Hospital at Vanderbilt, Department of Pediatric Rheumatology, Nashville, TN (United States)]

    2011-10-15

    Precise localization of affected compartments of the wrist and ankle in children with an established diagnosis of juvenile idiopathic arthritis (JIA) is clinically challenging. The purpose of this paper is to describe our experience utilizing a pre-injection MRI protocol of the wrist and ankle for localizing disease activity followed by fluoroscopically guided joint injections in children with JIA. (orig.)

  8. OPTIMIZING EUCALYPTUS PULP REFINING

    Institute of Scientific and Technical Information of China (English)

    Vail Manfredi

    2004-01-01

    This paper discusses the refining of bleached eucalyptus kraft pulp (BEKP). Pilot plant tests were carried out to optimize the refining process and to identify the effects of refining variables on final paper quality and process costs. The following parameters are discussed: pulp consistency, disk pattern design, refiner speed, energy input, refiner configuration (parallel or serial) and refining intensity. The effects of refining on pulp fibers were evaluated against the pulp quality properties, such as physical strengths, bulk, opacity and porosity, as well as the interactions with the papermaking process, such as paper machine runnability, paper breaks and refining control. The results showed that process optimization, considering pulp quality and refining costs, was obtained when eucalyptus pulp is refined at the lowest intensity and the highest pulp consistency possible. Changes in the operational refining conditions will have the highest impact on total energy requirements (costs) without any significant effect on final paper properties. It was also observed that classical ways to control the industrial operation, such as those based on drainage measurements, do not represent the best alternative to maximize either the final paper properties or the paper machine runnability.

  9. Refined Semilattices of Semigroups

    Institute of Scientific and Technical Information of China (English)

    Liang Zhang; K.P. Shum; Ronghua Zhang

    2001-01-01

    In this paper, we introduce the concept of refined semilattices of semigroups. This is a modified concept of the generally strong semilattice of semigroups initiated by Zhang and Huang. By using the concept of generally strong semilattice, Zhang and Huang showed that a regular band can be expressed by a generally strong semilattice of rectangular bands. However, the proof of the associativity for the multiplication is not complete and there exist some gaps in their construction of regular bands. We now revise the generally strong semilattices and call them refined semilattices. In this way, we are able to remove the gaps, and the associative law of the multiplication can be verified. As an application, we prove that a band is regular if and only if it is a refined semilattice of rectangular bands. In fact, refined semilattices provide a new device in the construction of new semigroups from the old ones.

  10. A Matricial Algorithm for Polynomial Refinement

    CERN Document Server

    King, Emily J

    2011-01-01

    In order to have a multiresolution analysis, the scaling function must be refinable. That is, it must be a linear combination of 2-dilation, $\\mathbb{Z}$-translates of itself. Refinable functions used in connection with wavelets are typically compactly supported. In 2002, David Larson posed the question, "Are all polynomials (of a single variable) finitely refinable?" That summer the author proved that the answer was indeed yes, using basic linear algebra. The result was presented in a number of talks but had not been typed up until now. The purpose of this short note is to record that particular proof.
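
    The linear-algebra argument can be reproduced numerically: to express p(x) as sum_k c_k p(2x - k) with k = 0..deg(p), sample both sides at deg(p)+1 points and solve the resulting square linear system. A small sketch (the polynomial is illustrative, and for it the sampled system happens to be invertible):

```python
# Numerically find coefficients c_k with p(x) = sum_k c_k * p(2x - k),
# k = 0..deg(p), by solving a (deg+1) x (deg+1) linear system built from
# sample points. Illustrative check of finite refinability, not the proof.
import numpy as np

def refinement_coeffs(p):
    """p: polynomial coefficients, lowest degree first."""
    deg = len(p) - 1
    xs = np.arange(deg + 1, dtype=float)      # deg+1 sample points: 0, 1, ...
    # Matrix M[j, k] = p(2*xs[j] - k); solve M c = p(xs).
    M = np.array([[np.polynomial.polynomial.polyval(2 * x - k, p)
                   for k in range(deg + 1)] for x in xs])
    rhs = np.polynomial.polynomial.polyval(xs, p)
    return np.linalg.solve(M, rhs)

p = [1.0, 1.0, 1.0]                # p(x) = 1 + x + x^2
c = refinement_coeffs(p)           # exact solution: [3/4, -7/8, 3/8]

# Verify the refinement identity at a point not used for solving:
x = 5.0
lhs = np.polynomial.polynomial.polyval(x, p)
rhs = sum(ck * np.polynomial.polynomial.polyval(2 * x - k, p)
          for k, ck in enumerate(c))
print(c, lhs, rhs)
```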

  11. Refinement by interface instantiation

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Hoang, Thai Son

    2012-01-01

    be easily refined. Our first contribution hence is a proposal for a new construct called interface that encapsulates the external variables, along with a mechanism for interface instantiation. Using the new construct and mechanism, external variables can be refined consistently. Our second contribution...... is an approach for verifying the correctness of Event-B extensions using the supporting Rodin tool. We illustrate our approach by proving the correctness of interface instantiation....

  12. NAFTA opportunities: Petroleum refining

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    The North American Free Trade Agreement (NAFTA) creates a more transparent environment for the sale of refined petroleum products to Mexico, and locks in access to Canada's relatively open market for these products. Canada and Mexico are sizable United States export markets for refined petroleum products, with exports of $556 million and $864 million, respectively, in 1992. These markets represent approximately 24 percent of total U.S. exports of these goods.

  13. The IVF Outcome Counseling Based on the Model Combining DHEAS and Age in Patients with Low AMH Prior to the First Cycle of GnRH Antagonist Protocol of Ovarian Stimulation

    Directory of Open Access Journals (Sweden)

    Miro Šimun Alebić

    2013-01-01

    Full Text Available Objective. To investigate the endocrine and/or clinical characteristics of women with low anti-Müllerian hormone (AMH) that could improve the accuracy of IVF outcome prediction based on female age alone prior to the first GnRH antagonist IVF cycle. Methods. Medical records of 129 patients with low AMH level (<6.5 pmol/L) who underwent their first GnRH antagonist ovarian stimulation protocol for IVF/ICSI were retrospectively analyzed. The main outcome measure was the area under the ROC curve (AUC-ROC) for the models combining age and other potential predictive factors for clinical pregnancy. Results. The clinical pregnancy rate (CPR) per initiated cycle was 11.6%. For the prediction of clinical pregnancy, DHEAS and age showed AUC-ROC of 0.726 (95% CI 0.641–0.801) and 0.662 (95% CI 0.573–0.743), respectively. The predictive accuracy of the model combining age and DHEAS (AUC-ROC 0.796; 95% CI 0.716–0.862) was significantly higher than that of age alone. In patients <37.5 years with DHEAS  pmol/L, 60% (9/15) of all pregnancies were achieved, with a CPR of 37.5%. Conclusions. DHEAS appears to be predictive of clinical pregnancy in younger women (<37.5 years) with low AMH after the first GnRH antagonist IVF cycle. Therefore, the DHEAS-age model could refine pretreatment counseling on pregnancy prospects following IVF.
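    The model comparison at the heart of this abstract rests on the AUC-ROC, which has a simple rank-based (Mann-Whitney) form. A self-contained sketch on synthetic data (the generative model, units and every number below are invented for illustration; they are not the study's data):

```python
import numpy as np

def auc(y, score):
    """Rank-based AUC: probability that a random positive case
    receives a higher score than a random negative case."""
    y = np.asarray(y, dtype=bool)
    pos, neg = score[y], score[~y]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(0)
n = 1000
age = rng.normal(36.0, 4.0, n)
dheas = rng.normal(4.5, 1.5, n)  # hypothetical scale
# Synthetic outcome: younger age and higher DHEAS raise pregnancy odds
logit = -0.15 * (age - 36.0) + 0.8 * (dheas - 4.5) - 2.0
pregnant = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

auc_age = auc(pregnant, -age)  # younger -> higher score
auc_combined = auc(pregnant, -0.15 * age + 0.8 * dheas)
```

    On data generated this way, the combined score separates outcomes better than age alone, mirroring the reported gain from adding DHEAS to the model.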

  14. Randomised controlled trial of two sequential artemisinin-based combination therapy regimens to treat uncomplicated falciparum malaria in African children: a protocol to investigate safety, efficacy and adherence

    DEFF Research Database (Denmark)

    Schallig, Henk D. F. H.; Tinto, Halidou; Sawa, Patrick

    2017-01-01

    Background Management of uncomplicated Plasmodium falciparum malaria relies on artemisinin-based combination therapies (ACTs). These highly effective regimens have contributed to reductions in malaria morbidity and mortality. However, artemisinin resistance in Asia and changing parasite...... whether prolonged ACT-based regimens using currently available formulations can eliminate potentially resistant parasites. The protocol investigates whether a sequential course of two licensed ACT in 1080 children aged 6–120 months exhibits superior efficacy against acute P. falciparum malaria and non...... gametocytaemia, occurrence of treatment-related adverse events in the double-ACT versus single-ACT arms, carriage of molecular markers of drug resistance, drug kinetics and patient adherence to treatment. Discussion This protocol addresses efficacy and safety of sequential ACT regimens in P. falciparum malaria...

  15. The Pirate group intervention protocol: description and a case report of a modified constraint-induced movement therapy combined with bimanual training for young children with unilateral spastic cerebral palsy.

    NARCIS (Netherlands)

    Aarts, P.B.M.; Hartingsveldt, M. van; Anderson, P.G.; Tillaar, I. van den; Burg, J. van der; Geurts, A.C.H.

    2012-01-01

    The purpose of this article was to describe a child-friendly modified constraint-induced movement therapy protocol that is combined with goal-directed task-specific bimanual training (mCIMT-BiT). This detailed description elucidates the approach and supports various research reports. This protocol i

  16. A Formal Software Development Approach Using Refinement Calculus

    Institute of Scientific and Technical Information of China (English)

    王云峰; 庞军; et al.

    2001-01-01

    The advantage of COOZ (Complete Object-Oriented Z) is its ability to specify large-scale software, but it does not support refinement calculus, so its application in software development is confined. Including refinement calculus in COOZ overcomes this disadvantage during design and implementation. The separation between design and implementation in structure and notation is removed as well, so the software can be developed smoothly in the same frame. The combination of COOZ and refinement calculus builds an object-oriented frame, in which a specification in COOZ is refined stepwise to code by calculus. In this paper, a development model based on COOZ and refinement calculus is established. Data refinement is harder to handle in a refinement tool than ordinary algorithmic refinement, since data refinement usually has to be done on a large program component at once. Regarding the implementation technology of refinement calculus, a data refinement calculator is constructed, and an approach to data refinement based on data refinement calculus and program window inference is offered.

  17. A combination therapy protocol of plasmapheresis, intravenous immunoglobulins and betamethasone to treat anti-Ro/La-related congenital atrioventricular block. A case series and review of the literature.

    Science.gov (United States)

    Ruffatti, Amelia; Marson, Piero; Svaluto-Moreolo, Giorgio; Marozio, Luca; Tibaldi, Maria; Favaro, Maria; Calligaro, Antonia; Grava, Chiara; Hoxha, Ariela; Pengo, Vittorio; Punzi, Leonardo

    2013-05-01

    The aim of this report was to evaluate the efficacy and safety of a combined treatment protocol used to treat 2nd and 3rd degree anti-Ro/La-related congenital atrioventricular block (CAVB). Six consecutive women diagnosed with 2nd degree (three cases) or 3rd degree block (three cases) between 2009 and 2011 referred to our outpatient clinic underwent a combination therapy protocol composed of weekly plasmapheresis, fortnightly 1g/kg intravenous immunoglobulins (IVIG) and daily betamethasone (4mg/day) throughout pregnancy. IVIG (1g/kg) treatment in the neonates was begun at birth and administered every fifteen days until passive maternal antibodies became undetectable. The fetuses affected with 2nd degree block (cases 1, 2 and 3) reverted to a normal atrioventricular conduction after combined therapy, while those with a 3rd degree block remained stable (case 4), showed an increase in the ventricular rate (case 5) or an improvement in cardiac function (case 6). None of the fetuses with 3rd degree CAVB had other cardiac complications such as cardiomyopathy or fetal hydrops. The follow-up of the children affected with 2nd degree CAVB revealed a complete regression of the block in cases 1 and 3, and no signs of relevant worsening in case 2. The infants affected with 3rd degree block (cases 4, 5, and 6) remained stable and until now only one has required a pacemaker at the age of 10months. If these results are confirmed by large-scale studies, this protocol could lead to improved outcomes in the treatment of this devastating disease. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Refinement for Probabilistic Systems with Nondeterminism

    Directory of Open Access Journals (Sweden)

    David Streader

    2011-06-01

    Full Text Available Before we combine actions and probabilities two very obvious questions should be asked. Firstly, what does "the probability of an action" mean? Secondly, how does probability interact with nondeterminism? Neither question has a single universally agreed-upon answer, but by considering these questions at the outset we build a novel and hopefully intuitive probabilistic event-based formalism. In previous work we have characterised refinement via the notion of testing. Basically, if one system passes all the tests that another system passes (and maybe more) we say the first system is a refinement of the second. This is, in our view, an important way of characterising refinement, via the question "what sort of refinement should I be using?" We use testing in this paper as the basis for our refinement. We develop tests for probabilistic systems by analogy with the tests developed for non-probabilistic systems. We make sure that our probabilistic tests, when performed on non-probabilistic automata, give us refinement relations which agree with those for non-probabilistic automata. We formalise this property as a vertical refinement.
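    The testing characterisation quoted above ("one system passes all the tests that another system passes, and maybe more") is, at its most abstract, set inclusion on passed tests. A toy Python sketch (the test identifiers are invented):

```python
def refines(first_passes, second_passes):
    """Per the abstract: `first` refines `second` when it passes
    every test that `second` passes (and possibly more)."""
    return set(second_passes) <= set(first_passes)

# Hypothetical test identifiers
spec_passes = {"t_deposit", "t_withdraw"}
impl_passes = {"t_deposit", "t_withdraw", "t_audit"}  # strictly more tests

ok = refines(impl_passes, spec_passes)      # True: impl refines spec
not_ok = refines(spec_passes, impl_passes)  # False: spec does not refine impl
```

    The paper's contribution is to pick tests for probabilistic systems so that this inclusion, applied to non-probabilistic automata, recovers the familiar refinement relations.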

  19. Refining and petrochemicals

    Energy Technology Data Exchange (ETDEWEB)

    Constancio, Silva

    2006-07-01

    In 2004, refining margins showed a clear improvement that persisted throughout the first three quarters of 2005. This enabled oil companies to post significantly higher earnings for their refining activity in 2004 compared to 2003, with the results of the first half of 2005 confirming this trend. As for petrochemicals, despite a steady rise in the naphtha price, higher cash margins enabled a turnaround in 2004 as well as a clear improvement in oil company financial performance that should continue in 2005, judging by the net income figures reported for the first half-year. Despite this favorable business environment, capital expenditure in refining and petrochemicals remained at a low level, especially investment in new capacity, but a number of projects are being planned for the next five years. (author)

  20. Technology combined with a counseling protocol to stimulate physical activity of chronically ill patients in primary care.

    Science.gov (United States)

    Verwey, R; van der Weegen, S; Spreeuwenberg, M; Tange, H; van der Weijden, T; de Witte, L

    2014-01-01

    An iterative user-centered design method was used to develop and test mobile technology (the It's LiFe! tool/monitor) embedded in primary care, followed by a three months feasibility study with 20 patients and three nurses. The tool consists of an accelerometer that transfers data to an app on a Smartphone, which is subsequently connected to a server. Physical activity levels are measured in minutes per day compared to pre-set activity goals, which are set by patients in dialogue with nurses. Nurses can monitor patients' physical activity via a secured website. The counseling protocol is based on the Five A's model and consists of a limited number of behavior change consultations intertwined with interaction with and responses from the tool. The technology supports nurses when performing physical activity counseling. Provided that no connectivity problems occur, the It's LiFe! intervention is feasible, and its longitudinal effects will be tested in a cluster RCT.
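    The tool's core feedback loop, daily activity minutes measured against a goal set by the patient in dialogue with the nurse, can be sketched in a few lines (the field names, goal value and data are hypothetical):

```python
# Daily activity minutes as uploaded from the accelerometer app (made-up data)
daily_minutes = {"mon": 25, "tue": 42, "wed": 10, "thu": 31}
goal = 30  # minutes/day, agreed between patient and nurse

# Positive delta = goal met; negative = shortfall visible to the nurse
feedback = {day: minutes - goal for day, minutes in daily_minutes.items()}
days_on_target = sum(1 for delta in feedback.values() if delta >= 0)
```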

  1. Grain Refinement of Deoxidized Copper

    Science.gov (United States)

    Balart, María José; Patel, Jayesh B.; Gao, Feng; Fan, Zhongyun

    2016-10-01

    This study reports the current status of grain refinement of copper accompanied in particular by a critical appraisal of grain refinement of phosphorus-deoxidized, high residual P (DHP) copper microalloyed with 150 ppm Ag. Some deviations exist in terms of the growth restriction factor ( Q) framework, on the basis of empirical evidence reported in the literature for grain size measurements of copper with individual additions of 0.05, 0.1, and 0.5 wt pct of Mo, In, Sn, Bi, Sb, Pb, and Se, cast under a protective atmosphere of pure Ar and water quenching. The columnar-to-equiaxed transition (CET) has been observed in copper, with an individual addition of 0.4B and with combined additions of 0.4Zr-0.04P and 0.4Zr-0.04P-0.015Ag and, in a previous study, with combined additions of 0.1Ag-0.069P (in wt pct). CETs in these B- and Zr-treated casts have been ascribed to changes in the morphology and chemistry of particles, concurrently in association with free solute type and availability. No further grain-refining action was observed due to microalloying additions of B, Mg, Ca, Zr, Ti, Mn, In, Fe, and Zn (~0.1 wt pct) with respect to DHP-Cu microalloyed with Ag, and therefore are no longer relevant for the casting conditions studied. The critical microalloying element for grain size control in deoxidized copper and in particular DHP-Cu is Ag.

  2. Incremental Bisimulation Abstraction Refinement

    DEFF Research Database (Denmark)

    Godskesen, Jens Christian; Song, Lei; Zhang, Lijun

    2013-01-01

    an abstraction refinement approach for the probabilistic computation tree logic (PCTL), which is based on incrementally computing a sequence of may- and must-quotient automata. These are induced by depth-bounded bisimulation equivalences of increasing depth. The approach is both sound and complete, since...

  3. SU-E-T-620: Dosimetric Compliance Study for a New Prostate Protocol of Combined High Dose Rate Brachytherapy and Stereotactic Body Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Peng, C; Giaddui, T; Den, R; Harrison, A; Yu, Y [Thomas Jefferson University, Philadelphia, PA (United States)

    2014-06-15

    Purpose: To investigate the adherence of treatment plans of prostate cancer patients to the dosimetric compliance criteria of the new in-house phase I trial of high dose rate (HDR) brachytherapy combined with stereotactic body radiotherapy (SBRT) for intermediate-risk prostate cancer patients. Methods: Ten prostate cancer patients were treated using this trial. They received one fraction of HDR to 15 Gy, followed by an external beam (EB) boost of 3.2 Gy (Level 1, five patients) or 3.94 Gy (Level 2, five patients) per fraction for 10 or 7 fractions, respectively, both equivalent to EB treatments of 113.5 Gy in 2 Gy fractions. The EB plans were either IMRT or VMAT plans. DVH analysis was performed to verify the adherence of treatment plans to the dosimetric criteria of the trial. Results: For Level 1 patients, target coverage was adequate, with CTV V32Gy(%) of 99.0±1.0 (mean ± 1 standard deviation) and PTV V31Gy(%) of 99.6±0.3. PTV V32.9Gy(%) is 1.4±3.1 and PTVmax is 32.9±0.2 Gy. Rectum, bladder and femoral head sparing was well within protocol criteria. For Level 2 patients, CTV V27.6Gy(%) is 98.7±1.8 and PTV V26.7Gy(%) is 99.0±1.4. PTV V28.4Gy(%) is 1.3±1.4, with three patients having minor deviations from the protocol. Again, critical structures were spared in compliance with the protocol. The analysis of HDR plans shows similar results, with adequate dose coverage of the prostate and sparing of critical structures including the urethra and rectum. V100(%) and V90(%) of the prostate are 96.0±1.1 and 98.9±0.5. Urethra D10(%) is 113.1±2.9. Rectum V80(cc) is 1.4±0.5. The hotspot in the prostate is substantially higher than the protocol specifies, but the hotspot criteria are only guidelines, serving to lower the dose to the urethra. Conclusion: This new high biological equivalent dose prostate trial has been carried out successfully for ten patients. Based on dosimetric analysis, all HDR and external beam plans were compliant with the protocol criteria, with only minor deviations.
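    The DVH compliance analysis described above reduces to checking each plan metric against a protocol limit. A sketch with hypothetical limits loosely echoing the Level-1 coverage numbers (the trial's actual criteria are not reproduced here):

```python
import operator

# Hypothetical protocol limits: metric name -> (comparison, limit)
CRITERIA = {
    "CTV_V32Gy_pct":   (operator.ge, 95.0),  # coverage: at least 95%
    "PTV_V31Gy_pct":   (operator.ge, 95.0),
    "PTV_V32.9Gy_pct": (operator.le, 5.0),   # hotspot volume: at most 5%
}

def check_plan(metrics, criteria=CRITERIA):
    """Return the (metric, value, limit) entries that violate the criteria."""
    return [(name, metrics[name], limit)
            for name, (ok, limit) in criteria.items()
            if not ok(metrics[name], limit)]

# Mean Level-1 values quoted in the abstract pass these hypothetical limits
plan = {"CTV_V32Gy_pct": 99.0, "PTV_V31Gy_pct": 99.6, "PTV_V32.9Gy_pct": 1.4}
violations = check_plan(plan)
```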

  4. Rehabilitation of atrophic maxilla using the combination of autogenous and allogeneic bone grafts followed by protocol-type prosthesis.

    Science.gov (United States)

    Margonar, Rogério; dos Santos, Pâmela Letícia; Queiroz, Thallita Pereira; Marcantonio, Elcio

    2010-11-01

    Currently, there are several techniques for the rehabilitation of atrophic maxillary ridges in literature. The grafting procedure using autogenous bone is considered ideal by many researchers, as it shows osteogenic capability and causes no antigenic reaction. However, this type of bone graft has some shortcomings, mainly the restricted availability of donor sites. In recent years, several alternatives have been investigated to supply the disadvantages of autogenous bone grafts. In such studies, allogeneic bone grafts, which are obtained from individuals with different genetic load, but from the same species, have been extensively used. They can be indicated in cases of arthroplasty, surgical knee reconstruction, large bone defects, and in oral and maxillofacial reconstruction. Besides showing great applicability and biocompatibility, this type of bone is available in unlimited quantities. On the other hand, allogeneic bone may have the disadvantage of transmitting infectious diseases. Atrophic maxillae can be treated with bone grafts followed by osseointegrated implants to obtain aesthetic and functional oral rehabilitation. This study aimed to show the viability of allogeneic bone grafting in an atrophic maxilla, followed by oral rehabilitation with dental implant and protocol-type prosthesis within a 3-year follow-up period by means of a clinical case report.

  5. Perspicuity and Granularity in Refinement

    CERN Document Server

    Boiten, Eerke

    2011-01-01

    This paper reconsiders refinements which introduce actions on the concrete level which were not present at the abstract level. It draws a distinction between concrete actions which are "perspicuous" at the abstract level, and changes of granularity of actions between different levels of abstraction. The main contribution of this paper is in exploring the relation between these different methods of "action refinement", and the basic refinement relation that is used. In particular, it shows how the "refining skip" method is incompatible with failures-based refinement relations, and consequently some decisions in designing Event-B refinement are entangled.

  6. Protocol for a prospective observational study of conventional treatment and traditional Korean medicine combination treatment for children with cerebral palsy.

    Science.gov (United States)

    Yoo, Jeong-Eun; Yun, Young-Ju; Shin, Yong-Beom; Kim, Nam-Kwen; Kim, Soo-Yeon; Shin, Myung-Jun; Yu, Sun-Ae

    2016-06-08

    Cerebral palsy leads to many complications as well as delayed motor development, and early intensive rehabilitation in infancy, which is based on the theory of brain plasticity, is emphasized. In addition to conventional treatment, including physical, occupational, or speech-language therapies, children also have a demand for traditional Korean medicine interventions such as acupuncture or herbal medicine; however, a lack of evidence has made traditional Korean medicine difficult to implement in practice. We planned a multicentre, prospective, observational study to assess the effectiveness, safety and cost-effectiveness of conventional treatment and traditional Korean medicine combination treatment for children with cerebral palsy. Three hundred children with cerebral palsy aged 6 to 78 months will be recruited from six institutions. Data from each child are collected every month for a one-year period, during which time treatment might be changed or discontinued. A qualified investigator visits the sites to measure effectiveness variables, including Gross Motor Function Measure and Paediatric Evaluation of Disability Inventory. Adverse events and cost-effectiveness variables are collected using surveys conducted at baseline, mid-study, and end of study, as well as monthly tracking surveys. In the analyses, participants will be classified into two groups: group A children will be the conventional treatment group with physical, occupational, speech-language or other conventional rehabilitation therapies, whereas group B children will be the combination treatment group with traditional Korean medicine interventions, that is, herbal medicine, chuna, moxibustion and acupuncture, in addition to conventional treatment. Only a few clinical case reports have evaluated the effectiveness and safety of traditional Korean medicine; therefore, more data are required to provide optimal information to children with cerebral palsy and their guardians. We hypothesized that

  7. Refinement for Administrative Policies

    OpenAIRE

    Dekker, MAC; Etalle, S Sandro

    2007-01-01

    Flexibility of management is an important requisite for access control systems as it allows users to adapt the access control system in accordance with practical requirements. This paper builds on earlier work where we defined administrative policies for a general class of RBAC models. We present a formal definition of administrative refinement and we show that there is an ordering for administrative privileges which yields administrative refinements of policies. We argue (by giving an examp...

  8. Individual patient data meta-analysis of combined treatments versus psychotherapy (with or without pill placebo), pharmacotherapy or pill placebo for adult depression: a protocol

    Science.gov (United States)

    Weitz, Erica; Kleiboer, Annet; van Straten, Annemieke; Hollon, Steven D; Cuijpers, Pim

    2017-01-01

    Introduction: There are many proven treatments (psychotherapy, pharmacotherapy or their combination) for depression. Although there is growing evidence for the effectiveness of combination treatment (psychotherapy + pharmacotherapy) over pharmacotherapy alone, psychotherapy alone or psychotherapy plus pill placebo for depression, little is known about which specific groups of patients may respond best to combined treatment versus monotherapy. Conventional meta-analysis techniques have limitations when tasked with examining whether specific individual characteristics moderate the effect of treatment on depression. Therefore, this protocol outlines an individual patient data (IPD) meta-analysis to explore which patients, with which clinical characteristics, have better outcomes in combined treatment compared with psychotherapy (alone or with pill placebo), pharmacotherapy and pill placebo. Methods and Analysis: Study searches are completed using an established database of randomised controlled trials (RCTs) on the psychological treatment of adult depression that has previously been reported. Searches were conducted in PubMed, PsycInfo, Embase and the Cochrane Central Register of Controlled Trials. RCTs comparing combination treatment (psychotherapy + pharmacotherapy) with psychotherapy (with or without pill placebo), pharmacotherapy or pill placebo for the treatment of adult depression will be included. Study authors of eligible trials will be contacted and asked to contribute IPD. Conventional meta-analysis techniques will be used to examine differences between studies that have contributed data and those that did not. Then, IPD will be harmonised, and analysis using multilevel regression will be conducted to examine effect moderators of treatment outcomes. Dissemination: Study results outlined above will be published in peer-reviewed journals. Study results will contribute to better understanding whether certain patients respond best to combined
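    The planned moderator analysis, multilevel regression on harmonised IPD, can be approximated in a numpy-only sketch that uses per-study intercept dummies as a simplified fixed-effects stand-in for the multilevel model (all data and coefficients below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)
n_studies, n_per = 6, 80
study = np.repeat(np.arange(n_studies), n_per)
treat = rng.integers(0, 2, study.size).astype(float)  # 1 = combined treatment
severity = rng.normal(20.0, 5.0, study.size)          # baseline depression score
study_effect = rng.normal(0.0, 2.0, n_studies)[study]

# Synthetic truth: combined treatment lowers post-treatment scores,
# and lowers them more for patients with higher baseline severity
post_score = (10.0 - 2.0 * treat - 0.15 * treat * (severity - 20.0)
              + 0.3 * severity + study_effect
              + rng.normal(0.0, 3.0, study.size))

# Design: per-study dummies, treatment, severity, moderator interaction
dummies = (study[:, None] == np.arange(n_studies)).astype(float)
X = np.column_stack([dummies, treat, severity, treat * (severity - 20.0)])
beta, *_ = np.linalg.lstsq(X, post_score, rcond=None)

treatment_effect = beta[n_studies]  # fitted main effect of combined treatment
interaction = beta[-1]              # < 0: severity moderates the benefit
```

    A proper IPD analysis would fit random study intercepts (and possibly random slopes) instead of dummies, but the moderator logic, a treatment-by-characteristic interaction term, is the same.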

  9. A switch therapy protocol with intravenous azithromycin and ciprofloxacin combination for severe, relapsing chronic bacterial prostatitis: a prospective non-comparative pilot study.

    Science.gov (United States)

    Kolumbić Lakos, A; Skerk, V; Maleković, G; Dujnić Spoljarević, T; Kovacic, D; Pasini, M; Markotić, A; Magri, V; Perletti, G

    2011-12-01

    Chronic bacterial prostatitis (CBP) is characterized by intense clinical symptoms, frequent relapse episodes and poor quality of life. Aggressive antibacterial therapy is warranted to eradicate the causative pathogens and to achieve a permanent cure. We administered a "switch-therapy" protocol to 30 patients showing severe CBP symptoms and two or more relapse episodes in the previous 12 months. Patients received intravenous azithromycin (500 mg/day) and ciprofloxacin (800 mg/day) for 3 days, followed by oral ciprofloxacin (1 g/day) for 25 days. Twenty-seven (90%) patients showed pathogen eradication at test-of-cure (TOC) visit. Five cases of infection relapse were detected at follow-up. At the TOC visit, 25 patients (83%) showed mild/absent symptoms, measured with the NIH-chronic prostatitis symptom index. These results indicate the efficacy of a "switch-therapy" protocol, based on combined azithromycin and ciprofloxacin. Comparative studies on larger CBP patient populations are warranted to confirm these encouraging results.

  10. Security Policy Enforcement Through Refinement Process

    CERN Document Server

    Stouls, Nicolas

    2010-01-01

    In the area of networks, a common method of enforcing a security policy expressed in a high-level language is based on an ad hoc, manual rewriting process. We argue that it is possible to build a formal link between concrete and abstract terms, which can be dynamically computed from the environment data. In order to progressively introduce configuration data and then simplify the proof obligations, we use the B refinement process. We present a case study modeling a network monitor. This program, described by refinement following the layers of the TCP/IP protocol suite, has to warn about all observed events that do not respect the security policy. To design this model, we use the Event-B method because it is suitable for modeling network concepts. This work has been done within the framework of the POTESTAT project, based on the research of network testing methods from a high-level security policy.

  11. Combination Protocol of Low-Frequency rTMS and Intensive Occupational Therapy for Post-stroke Upper Limb Hemiparesis: a 6-year Experience of More Than 1700 Japanese Patients.

    Science.gov (United States)

    Kakuda, Wataru; Abo, Masahiro; Sasanuma, Jinichi; Shimizu, Masato; Okamoto, Takatsugu; Kimura, Chikou; Kakita, Kiyohito; Hara, Hiroyoshi

    2016-06-01

    Several years ago, we proposed a combination protocol of repetitive transcranial magnetic stimulation (rTMS) and intensive occupational therapy (OT) for upper limb hemiparesis after stroke. Subsequently, the number of patients treated with the protocol has increased in Japan. We aimed to present the latest data on our proposed combination protocol for post-stroke upper limb hemiparesis as a result of a multi-institutional study. After confirming that a patient met the inclusion criteria for the protocol, they were scheduled to receive the 15-day inpatient protocol. In the protocol, two sessions of 20-min rTMS and 120-min occupational therapy were provided daily, except for Sundays and the days of admission/discharge. Motor function of the affected upper limb was evaluated by the Fugl-Meyer assessment (FMA) and Wolf motor function test (WMFT) at admission/discharge and at 4 weeks after discharge if possible. A total of 1725 post-stroke patients were studied (mean age at admission 61.4 ± 13.0 years). The scheduled 15-day protocol was completed by all patients. At discharge, the increase in FMA score, shortening in performance time of WMFT, and increase in functional ability scale (FAS) score of WMFT were significant (FMA score 46.8 ± 12.2 to 50.9 ± 11.4 points, p stroke, although its efficacy should be confirmed in a randomized controlled study.
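    The admission-to-discharge FMA comparison reported above is a paired pre/post analysis; a hand-rolled paired t statistic on simulated scores that merely mimic the abstract's summary statistics (the individual scores are synthetic, not the study's):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50                                  # simulated subsample, not all 1725 patients
fma_pre = np.clip(rng.normal(46.8, 12.2, n), 0.0, 66.0)  # FMA upper-limb max is 66
gain = rng.normal(4.1, 3.0, n)          # mean gain mimicking 46.8 -> 50.9 points
fma_post = np.clip(fma_pre + gain, 0.0, 66.0)

diff = fma_post - fma_pre
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))  # paired t statistic
```

    As the abstract itself notes, a within-group pre/post test cannot replace a randomized controlled comparison; it only quantifies the observed change.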

  12. Combined modality program with possible organ preservation for invasive bladder carcinoma: Results of RTOG protocol 85-12

    Energy Technology Data Exchange (ETDEWEB)

    Tester, W.; Porter, A.; Asbell, S.; Coughlin, C.; Heaney, J.; Krall, J.; Martz, K.; Venner, P.; Hammond, E. (Radiation Therapy Oncology Group, Philadelphia, PA (United States))

    1993-04-02

    This Phase 2 study was designed to test the tolerance and effectiveness of concurrent cisplatin-radiotherapy in the treatment of invasive bladder cancer. Objectives were to determine toxicity, complete response rate, bladder preservation rate, and survival. Patients with invasive bladder cancer, clinical Stages T2-4, N0-2 or NX, M0, were treated with pelvic radiotherapy (40 Gy in 4 weeks) and cisplatin 100 mg/m² on days 1 and 22. Complete responders were given an additional 24 Gy bladder boost plus a third dose of cisplatin; patients with residual tumor after 40 Gy were assigned radical cystectomy. The complete remission rate following cisplatin and 40 Gy for evaluable cases was 31/47 (66%). Acute toxicity was acceptable, with only two patients not completing induction therapy. Patients with poorly differentiated tumors were more likely to achieve complete remission. Of fully evaluable patients, 28/42 (67%) achieved complete remission with induction therapy, 11 remain continuously in remission, and eight have relapsed with the bladder as the only site of failure. Five of these eight cases relapsed with noninvasive tumor. Of the 14 patients who failed to achieve complete remission, only three remain disease-free. Median survival is not reached, with 17/42 (19/48) deaths reported. Actuarial survival is 64% at 3 years. This combined cisplatin-radiotherapy regimen was moderately well tolerated and associated with tumor clearance in 66% of patients treated. Isolated bladder recurrences with invasive carcinoma are infrequent. Better definition of pretreatment selection criteria is needed if combined-modality treatment is to achieve disease control and organ preservation for patients with bladder cancer. 37 refs., 3 figs., 5 tabs.

  13. A Two-Phase Expansion Protocol Combining Interleukin (IL-15 and IL-21 Improves Natural Killer Cell Proliferation and Cytotoxicity against Rhabdomyosarcoma

    Directory of Open Access Journals (Sweden)

    Juliane Wagner

    2017-06-01

    Full Text Available Rhabdomyosarcoma (RMS) is the most common soft tissue malignancy in children. Despite intensive research in recent decades, the prognosis for patients with metastatic or relapsed disease has hardly improved. New therapeutic concepts in anti-tumor therapy aim to modulate the patient's immune system to increase its aggressiveness toward tumor cells or to achieve targeted effects. Besides surgery, radiotherapy and chemotherapy, immune activation by direct application of cytokines or antibodies, and adoptive cell therapy, are promising approaches. In recent years, adoptive transfer of natural killer (NK) cells came into the focus of translational medicine because of their high cytotoxic potential against transformed malignant cells. A main challenge of NK cell therapy is that it requires a large number of functional NK cells. Therefore, ex vivo NK cell expansion protocols are currently being developed. Many culturing strategies are based on the addition of feeder or accessory cells, which need to be removed prior to the clinical application of the final NK cell product. In this study, we addressed feeder cell-free expansion methods using common γ-chain cytokines, especially IL-15 and IL-21. Our results demonstrated the high potential of IL-15 for NK cell expansion, while IL-21 triggered NK cell maturation and functionality. Hence, we established a two-phase expansion protocol with IL-15 to induce early NK cell expansion, followed by short exposure to IL-21 to boost the cytotoxic activity of NK cells against RMS cells. Further functional analyses revealed enhanced degranulation and secretion of pro-inflammatory cytokines such as interferon-γ and tumor necrosis factor-α. In a proof-of-concept in vivo study, we also observed a therapeutic effect of adoptively transferred IL-15-expanded and IL-21-boosted NK cells in combination with image-guided high-precision radiation therapy using a luciferase-transduced RMS xenograft model. In summary, this two

  14. A combined static-dynamic single-dose imaging protocol to compare quantitative dynamic SPECT with static conventional SPECT.

    Science.gov (United States)

    Sciammarella, Maria; Shrestha, Uttam M; Seo, Youngho; Gullberg, Grant T; Botvinick, Elias H

    2017-08-03

    SPECT myocardial perfusion imaging (MPI) is a clinical mainstay that is typically performed with static imaging protocols and visually or semi-quantitatively assessed for perfusion defects based upon the relative intensity of myocardial regions. Dynamic cardiac SPECT is a newer imaging technique based on time-varying information about radiotracer distribution, which permits the evaluation of regional myocardial blood flow (MBF) and coronary flow reserve (CFR). In this work, a preliminary feasibility study was conducted in a small patient sample designed to implement a unique combined static-dynamic single-dose one-day visit imaging protocol to compare quantitative dynamic SPECT with static conventional SPECT for improving the diagnosis of coronary artery disease (CAD). Fifteen patients (11 males, 4 females, mean age 71 ± 9 years) were enrolled for a combined dynamic and static SPECT (Infinia Hawkeye 4, GE Healthcare) imaging protocol with a single dose of (99m)Tc-tetrofosmin administered at rest and a single dose administered at stress in a one-day visit. Out of 15 patients, 11 had selective coronary angiography (SCA), 8 within 6 months and the rest within 24 months of SPECT imaging, without intervening symptoms or interventions. The extent and severity of perfusion defects in each myocardial region was graded visually. Dynamically acquired data were also used to estimate MBF and CFR. Both visually graded images and estimated CFR were tested against SCA as a reference to evaluate the validity of the methods. Overall, conventional static SPECT was normal in ten patients and abnormal in five patients, dynamic SPECT was normal in 12 patients and abnormal in three patients, and CFR from dynamic SPECT was normal in nine patients and abnormal in six patients. Among the 11 patients with SCA, conventional SPECT was normal in 5, 3 of whom had documented CAD on SCA, giving an overall accuracy of 64%, sensitivity of 40% and specificity of 83%. Dynamic SPECT image
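The reported accuracy, sensitivity, and specificity follow from a standard confusion-matrix calculation. A short sketch; the individual counts below are inferred from the reported percentages in the 11 SCA patients, not stated in the record:

```python
# Sensitivity, specificity, and accuracy for a binary test against a
# reference standard (here: SPECT reads vs. selective coronary angiography).
# Counts are illustrative, chosen to reproduce the percentages reported
# for conventional SPECT: TP=2, FN=3, FP=1, TN=5.

def diagnostic_metrics(tp, fn, fp, tn):
    """Return (sensitivity, specificity, accuracy) as fractions."""
    sensitivity = tp / (tp + fn)                  # true-positive rate
    specificity = tn / (tn + fp)                  # true-negative rate
    accuracy = (tp + tn) / (tp + fn + fp + tn)    # all correct calls
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_metrics(tp=2, fn=3, fp=1, tn=5)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} accuracy={acc:.0%}")
# prints: sensitivity=40% specificity=83% accuracy=64%
```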

  15. Self-consistent QM/MM methodologies for structural refinement of photosystem II and other macromolecules of biological interest

    Energy Technology Data Exchange (ETDEWEB)

    Batista, Enrique R [Los Alamos National Laboratory]; Sproviero, Eduardo M [YALE UNIV]; Newcomer, Michael [YALE UNIV]; Gascon, Jose A [YALE UNIV]; Batista, Victor S [YALE UNIV]

    2008-01-01

    The combination of quantum mechanics and molecular mechanics (QM/MM) is one of the most promising approaches to study the structure, function, and properties of proteins and nucleic acids. However, there are instances in which the limitations of either the MM method (lack of a proper electronic description) or the QM method (limited to a small number of atoms) prevent a proper description of the system. To address this issue, we review here our approach to fine-tune the structure of biological systems using post-QM/MM refinements. These protocols are based on spectroscopic data and/or partitioning of the system to extend the QM description to a larger region of a protein. We illustrate these methodologies through applications to several biomolecules, which were pre-optimized at the QM/MM level and then further refined using post-QM/MM refinement methodologies: mod(QM/MM), which refines the atomic charges of the residues included in the MM region to account for polarization effects; mod(QM/MM)-opt, which partitions the MM region into smaller parts and optimizes each part in an iterative, self-consistent way; and the Polarized-Extended X-Ray Absorption Fine Structure (P-EXAFS) fitting procedure, which fine-tunes the atomic coordinates to reproduce experimental polarized EXAFS spectra. The first two techniques were applied to the guanine quadruplex, while the P-EXAFS refinement was applied to the oxygen-evolving complex of photosystem II.
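The iterative, self-consistent partition optimization behind mod(QM/MM)-opt can be illustrated in spirit with a toy block-relaxation loop. This is a hypothetical sketch on a convex quadratic "energy", not the authors' code: each partition is relaxed with the others frozen, and the sweeps are repeated until the solution stops changing.

```python
import numpy as np

# Toy analogue of an iterative, self-consistent partition optimization:
# minimize E(x) = 0.5 x^T A x - b^T x by relaxing one block of variables
# at a time with the complementary block frozen (block Gauss-Seidel).
# For a convex quadratic this converges to the global minimum A^-1 b.

rng = np.random.default_rng(0)
n = 6
blocks = [slice(0, 3), slice(3, 6)]          # two "regions" of 3 variables each
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)                  # symmetric positive definite
b = rng.normal(size=n)

x = np.zeros(n)
for sweep in range(200):                     # self-consistent sweeps
    x_old = x.copy()
    for blk in blocks:
        # Optimize this block with the other region held fixed:
        rhs = b[blk] - A[blk] @ x + A[blk, blk] @ x[blk]
        x[blk] = np.linalg.solve(A[blk, blk], rhs)
    if np.linalg.norm(x - x_old) < 1e-10:    # reached self-consistency
        break

print(np.allclose(x, np.linalg.solve(A, b), atol=1e-6))  # prints True
```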

  16. Combination Analgesic Development for Enhanced Clinical Efficacy (CADENCE Trial): Study Protocol for a Double-Blind, Randomized, Placebo-Controlled Crossover Trial of an Alpha-Lipoic Acid - Pregabalin Combination for the Treatment of Fibromyalgia Pain.

    Science.gov (United States)

    Gilron, Ian; Tu, Dongsheng; Holden, Ronald; Towheed, Tanveer; Vandenkerkhof, Elizabeth; Milev, Roumen

    2017-08-04

    Fibromyalgia is a clinical disorder commonly presenting with chronic widespread pain as well as sleep disturbance, fatigue, depression, and cognitive dysfunction. There is an urgent need for treatment strategies that provide better pain relief and fewer adverse effects (AEs). Efforts to develop rational combinations of specific fibromyalgia treatments have demonstrated potential for measurable improvements in pain relief, quality of life, and health care utilization. More than half of fibromyalgia patients receive 2 or more analgesics, but current combination use is based on limited evidence. As an early proof-of-concept project from the Canadian Institutes of Health Research-Strategy on Patient-Oriented Research Chronic Pain Network, this trial protocol is expected to advance the field by rigorously evaluating a new treatment combination for fibromyalgia. We will test the hypothesis that analgesic combinations containing at least one nonsedating agent would be as safe as, but more effective than, either monotherapy because of additive pain relief without increasing overall AEs. Pregabalin (PGB), a sedating anticonvulsant, is proven effective for fibromyalgia, and the antioxidant, alpha-lipoic acid (ALA), one of the only nonsedating systemic agents proven effective for neuropathic pain, is currently being evaluated in fibromyalgia. Thus, we will conduct a clinical trial to compare a PGB+ALA combination to each monotherapy for fibromyalgia. Using a double-blind, double-dummy, crossover design, 54 adults with fibromyalgia will be randomly allocated to 1 of 6 sequences of treatment with PGB, ALA, and PGB+ALA combination. During each of 3 different treatment periods, participants will take 2 sets of capsules containing (1) ALA (or placebo) and (2) PGB (or placebo) for 31 days, followed by an 11-day taper/washout period. The primary outcome will be mean daily pain intensity (0 to 10 scale) at maximal tolerated doses (MTDs) during each period. 
Secondary outcomes, assessed at
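The allocation implied by the crossover design (6 sequences = 3! orderings of the three treatment conditions, 54 participants, hence 9 per sequence) can be sketched as follows; the seed and the exactly balanced 9-per-sequence split are assumptions for illustration:

```python
import itertools
import random

# Sketch of the allocation scheme implied by the protocol: 54 participants
# randomised across the 6 possible orderings of the three treatment periods
# (PGB, ALA, PGB+ALA), 9 participants per sequence.

treatments = ["PGB", "ALA", "PGB+ALA"]
sequences = list(itertools.permutations(treatments))   # 3! = 6 sequences

rng = random.Random(2024)                              # assumed seed
allocation = [seq for seq in sequences
              for _ in range(54 // len(sequences))]    # 9 copies of each sequence
rng.shuffle(allocation)                                # randomise order of entry

print(len(sequences), len(allocation))                 # prints: 6 54
# Each participant receives every treatment exactly once, in the order
# given by their assigned sequence.
```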

  17. Predictors of remission in depression to individual and combined treatments (PReDICT: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Dunlop Boadie W

    2012-07-01

    -week course of treatment, during which they receive a combination of CBT and antidepressant medication. Predictors of the primary outcome, remission, will be identified for overall and treatment-specific effects, and a statistical model incorporating multiple predictors will be developed to predict outcomes. Discussion The PReDICT study’s evaluation of biological, psychological, and clinical factors that may differentially impact treatment outcomes represents a sizeable step toward developing personalized treatments for MDD. Identified predictors should help guide the selection of initial treatments, and identify those patients most vulnerable to recurrence, who thus warrant maintenance or combination treatments to achieve and maintain wellness. Trial registration Clinicaltrials.gov Identifier: NCT00360399. Registered 02 AUG 2006. First patient randomized 09 FEB 2007.
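A multi-predictor outcome model of the kind described is commonly a logistic regression combining baseline predictors into a probability of remission. A minimal sketch on synthetic data; the predictors and all numbers are hypothetical, not study data:

```python
import numpy as np

# Schematic multi-predictor model: logistic regression fitted by plain
# gradient ascent on the log-likelihood. Synthetic, standardised predictors
# stand in for baseline measures (e.g. severity, a biomarker, age).

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 3))                 # hypothetical baseline predictors
true_w = np.array([1.5, -1.0, 0.5])
y = (X @ true_w + rng.normal(scale=0.5, size=n) > 0).astype(float)  # "remission"

w = np.zeros(3)
for _ in range(2000):
    z = np.clip(X @ w, -30, 30)             # clip for numerical safety
    p = 1.0 / (1.0 + np.exp(-z))
    w += 0.1 * X.T @ (y - p) / n            # gradient ascent step

p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))
accuracy = ((p > 0.5) == (y > 0.5)).mean()
print(f"in-sample accuracy: {accuracy:.2f}")
```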

  18. A refined atomic scale model of the Saccharomyces cerevisiae K+-translocation protein Trk1p combined with experimental evidence confirms the role of selectivity filter glycines and other key residues.

    Science.gov (United States)

    Zayats, Vasilina; Stockner, Thomas; Pandey, Saurabh Kumar; Wörz, Katharina; Ettrich, Rüdiger; Ludwig, Jost

    2015-05-01

    Potassium ion (K+) uptake in yeast is mediated mainly by the Trk1/2 proteins, which enable cells to survive at external K+ concentrations as low as a few μM. Fungal Trks are related to prokaryotic TRK and Ktr and plant HKT K+ transport systems. Overall sequence similarity is very low, thus requiring experimental verification of homology models. Here a refined structural model of Saccharomyces cerevisiae Trk1 is presented that was obtained by combining homology modeling, molecular dynamics simulation and experimental verification through functional analysis of mutants. Structural models and experimental results showed that glycines within the selectivity filter, conserved among the K-channel/transporter family, are not only important for protein function but are also required for correct folding/membrane targeting. A conserved aspartic acid in the PA helix (D79) and a lysine in the M2D helix (K1147) were proposed earlier to interact. Our results suggested individual roles of these residues in folding, structural integrity and function. While mutations of D79 completely abolished protein folding, mutations at position 1147 were tolerated to some extent. Intriguingly, a secondary interaction of D79 with R76 could enhance the folding/stability of Trk1 and enable a fraction of Trk1[K1147A] to fold. The part of the ion permeation path containing the selectivity filter is shaped similarly to that of ion channels. However, below the selectivity filter it is obstructed, or regulated, by a proline-containing loop. The presented model could provide the structural basis for addressing the long-standing question of whether Trk1 is a passive or active ion-translocation system.

  19. Refining and petrochemicals

    Energy Technology Data Exchange (ETDEWEB)

    Benazzi, E.; Alario, F

    2004-07-01

    In 2003, refining margins showed a clear improvement that continued throughout the first three quarters of 2004. Oil companies posted significantly higher earnings in 2003 compared to 2002, with the results of first quarter 2004 confirming this trend. Due to higher feedstock prices, the implementation of new capacity and more intense competition, the petrochemicals industry was not able to boost margins in 2003. In such difficult business conditions, aggravated by soaring crude prices, the petrochemicals industry is not likely to see any improvement in profitability before the second half of 2004. (author)

  20. Refining and petrochemicals

    Energy Technology Data Exchange (ETDEWEB)

    Benazzi, E

    2003-07-01

    Down sharply in 2002, refining margins showed a clear improvement in the first half-year of 2003. As a result, the earnings reported by oil companies for financial year 2002 were significantly lower than in 2001, but the prospects are brighter for 2003. In the petrochemicals sector, slow demand and higher feedstock prices eroded margins in 2002, especially in Europe and the United States. The financial results for the first part of 2003 seem to indicate that sector profitability will not improve before 2004. (author)

  1. Protocol for the combined immunosuppression & radiotherapy in thyroid eye disease (CIRTED) trial: A multi-centre, double-masked, factorial randomised controlled trial

    Directory of Open Access Journals (Sweden)

    Kingston Laura

    2008-01-01

    Full Text Available Abstract Background Medical management of thyroid eye disease remains controversial due to a paucity of high quality evidence on long-term treatment outcomes. Glucocorticoids are known to be effective initially but have significant side-effects with long-term use, and recrudescence can occur on cessation. Current evidence is conflicting on the efficacy of radiotherapy and non-steroid systemic immunosuppression, and the majority of previous studies have been retrospective, uncontrolled, small or poorly designed. The Combined Immunosuppression and Radiotherapy in Thyroid Eye Disease (CIRTED) trial was designed to investigate the efficacy of radiotherapy and azathioprine in combination with a standard course of oral prednisolone in patients with active thyroid eye disease. Methods/design Patients with active thyroid eye disease will be randomised to receive (i) azathioprine or oral placebo and (ii) radiotherapy or sham-radiotherapy in this multi-centre, factorial randomised controlled trial. The primary outcome is improvement in disease severity (assessed using a composite binary measure) at 12 months, and secondary end-points include quality of life scores and health economic measures. Discussion The CIRTED trial is the first study to evaluate the role of radiotherapy and azathioprine as part of a long-term, combination immunosuppressive treatment regime for thyroid eye disease. It will provide evidence for the role of radiotherapy and prolonged immunosuppression in the management of this condition, as well as pilot data on their use in combination. We have paid particular attention in the trial design to establishing (a) robust placebo controls and masking protocols which are effective and safe for both radiotherapy and the systemic administration of an antiproliferative drug; (b) effective inclusion and exclusion criteria to select for active disease; and (c) pragmatic outcome measures. Trial registration Current controlled trials

  2. Macromolecular crystallographic structure refinement

    Directory of Open Access Journals (Sweden)

    Afonine, Pavel V.

    2015-04-01

    Full Text Available Model refinement is a key step in crystallographic structure determination that ensures the final atomic structure of the macromolecule represents the measured diffraction data as well as possible. Several decades of effort have gone into developing methods and computational tools to streamline this step. In this manuscript we provide a brief overview of the major milestones in crystallographic computing and methods development pertinent to structure refinement.

  3. Multidataset Refinement, Resonant Diffraction, and Magnetic Structures.

    Science.gov (United States)

    Attfield, J Paul

    2004-01-01

    The scope of Rietveld and other powder diffraction refinements continues to expand, driven by improvements in instrumentation, methodology and software. This will be illustrated by examples from our research in recent years. Multidataset refinement is now commonplace; the datasets may be from different detectors, e.g., in a time-of-flight experiment, or from separate experiments, such as at several x-ray energies giving resonant information. The complementary use of x rays and neutrons is exemplified by a recent combined refinement of the monoclinic superstructure of magnetite, Fe3O4, below the 122 K Verwey transition, which reveals evidence for Fe(2+)/Fe(3+) charge ordering. Powder neutron diffraction data continue to be used for the solution and Rietveld refinement of magnetic structures. Time-of-flight instruments on cold neutron sources can produce data that have a high intensity and good resolution at high d-spacings. Such profiles have been used to study incommensurate magnetic structures such as FeAsO4 and β-CrPO4. A multiphase, multidataset refinement of the phase-separated perovskite (Pr0.35Y0.07Th0.04Ca0.04Sr0.5)MnO3 has been used to fit three components with different crystal and magnetic structures at low temperatures.
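The idea behind a combined (multidataset) refinement, several datasets jointly constraining one shared set of parameters, can be sketched as a weighted joint least-squares fit. This toy example is not Rietveld refinement itself; the two synthetic "patterns" merely stand in for X-ray and neutron data that respond differently to the same parameters:

```python
import numpy as np

# Joint refinement sketch: two datasets depending on the same parameters
# are fitted together by minimising the weighted sum of both residuals,
# so each dataset constrains the shared model.

rng = np.random.default_rng(7)
params_true = np.array([2.0, -0.5])               # shared "structural" parameters

x1 = np.linspace(0, 1, 40)                        # dataset 1 ("X-ray")
x2 = np.linspace(0, 1, 25)                        # dataset 2 ("neutron")
A1 = np.column_stack([x1, np.ones_like(x1)])
A2 = np.column_stack([x2**2, np.ones_like(x2)])   # different response, same params
y1 = A1 @ params_true + rng.normal(scale=0.01, size=x1.size)
y2 = A2 @ params_true + rng.normal(scale=0.02, size=x2.size)

w1, w2 = 1 / 0.01, 1 / 0.02                       # weight by estimated noise level
A = np.vstack([w1 * A1, w2 * A2])                 # stack the weighted design matrices
y = np.concatenate([w1 * y1, w2 * y2])
params_fit, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.round(params_fit, 2))                    # close to [2.0, -0.5]
```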

  4. Validation of a new protocol for {sup 18}F-FDG infusion using an automatic combined dispenser and injector system

    Energy Technology Data Exchange (ETDEWEB)

    Lecchi, Michela; Lucignani, Giovanni; Maioli, Claudio; Ignelzi, Giuseppe; Del Sole, Angelo [University of Milan, Department of Health Sciences and Centre of Molecular and Cellular Imaging (IMAGO), Milan (Italy); Unit of Nuclear Medicine, San Paolo Hospital, Department of Diagnostic Services, Milan (Italy)

    2012-11-15

    In nuclear medicine, radiopharmaceuticals are usually administered in unit doses partitioned from multi-dose vials. The partitioning typically takes place in a radiopharmacy, depending on local practice. Automatic, as opposed to manual, partitioning and administration should reduce radiation exposure of the personnel involved, improve the accuracy of the administration and mitigate contamination. This study set out to verify and validate the {sup 18}F-fluorodeoxyglucose (FDG) administration procedure performed using Intego™ (MEDRAD, Inc., Warrendale, PA, USA), a combined dispenser and injector system. We considered maintenance of sterility and the system's potential to improve, with respect to the manual procedure, the accuracy of net administered {sup 18}F-FDG radioactivity in patients and the radiation protection of operators. A media-fill procedure was used to assess whether sterility is maintained during use of the Intego™ system. Simulating a typical working day's setup and use of the system, we investigated the accuracy of the net administered {sup 18}F-FDG activity obtained with Intego™ versus the manual dose delivery system. We also measured personnel radiation exposure during use of Intego™ and during manual administration and recorded and compared environmental doses in the two conditions. The radiopharmaceutical remained sterile in all the tests performed. The accuracy of the net {sup 18}F-FDG activity delivered to the patients was found to be within 3 percentage points, as required by European Association of Nuclear Medicine (EANM) guidelines on {sup 18}F-FDG imaging procedures. With Intego™, the residual radioactivity in the tubing was 0.20 MBq, corresponding to approximately 0.07 % of the mean activity delivered. With manual injection, the residual radioactivity in the syringe averaged 7.37 MBq, corresponding to a mean error of 2.9 % in the delivered dose. During the injection step of the positron emission
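As a quick consistency check on the residual-activity figures: the mean delivered activities are not stated in the excerpt, so they are inferred here from each arm's reported residual and percentage (hypothetical arithmetic, not from the paper):

```python
# Back-of-the-envelope check: a residual of R MBq amounting to P % of the
# delivered activity implies a mean delivered activity of R / (P/100) MBq.

def implied_delivered(residual_mbq, percent):
    """Mean delivered activity (MBq) consistent with a residual and its percentage."""
    return residual_mbq / (percent / 100)

manual = implied_delivered(7.37, 2.9)    # manual syringe arm
intego = implied_delivered(0.20, 0.07)   # automated arm
print(round(manual), round(intego))      # prints: 254 286
# The two implied values differ slightly, suggesting the mean delivered
# activity was not identical between the two arms.
```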

  5. Refinery Efficiency Improvement

    Energy Technology Data Exchange (ETDEWEB)

    WRI

    2002-05-15

    Refinery processes that convert heavy oils to lighter distillate fuels require heating for distillation, hydrogen addition or carbon rejection (coking). Efficiency is limited by the formation of insoluble carbon-rich coke deposits. Heat exchangers and other refinery units must be shut down for mechanical coke removal, resulting in a significant loss of output and revenue. When a residuum is heated above the temperature at which pyrolysis occurs (340 C, 650 F), there is typically an induction period before coke formation begins (Magaril and Aksenova 1968, Wiehe 1993). To avoid fouling, refiners often stop heating a residuum before coke formation begins, using arbitrary criteria. In many cases, this heating is stopped sooner than necessary, resulting in less than maximum product yield. Western Research Institute (WRI) has developed innovative Coking Index concepts (patent pending) which can be used for process control by refiners to heat residua to the threshold, but not beyond the point, at which coke formation begins when petroleum residua are heated at pyrolysis temperatures (Schabron et al. 2001). The development of this universal predictor solves a long-standing problem in petroleum refining. These Coking Indexes have great potential value in improving the efficiency of distillation processes. The Coking Indexes were found to apply to residua in a universal manner, and the theoretical basis for the indexes has been established (Schabron et al. 2001a, 2001b, 2001c). For the first time, a few simple measurements indicate how close undesired coke formation is on the coke formation induction timeline. The Coking Indexes can lead to new process controls that can improve refinery distillation efficiency by several percentage points. 
Petroleum residua consist of an ordered continuum of solvated polar materials usually referred to as asphaltenes dispersed in a lower polarity solvent phase held together by intermediate polarity materials usually referred to as

  6. Refinement of macromolecular structures against neutron data with SHELXL2013.

    Science.gov (United States)

    Gruene, Tim; Hahn, Hinrich W; Luebben, Anna V; Meilleur, Flora; Sheldrick, George M

    2014-02-01

    Some of the improvements in SHELX2013 make SHELXL convenient to use for refinement of macromolecular structures against neutron data without the support of X-ray data. The new NEUT instruction adjusts the behaviour of the SFAC instruction as well as the default bond lengths of the AFIX instructions. This work presents a protocol on how to use SHELXL for refinement of protein structures against neutron data. It includes restraints extending the Engh & Huber [Acta Cryst. (1991), A47, 392-400] restraints to H atoms and discusses several of the features of SHELXL that make the program particularly useful for the investigation of H atoms with neutron diffraction. SHELXL2013 is already adequate for the refinement of small molecules against neutron data, but there is still room for improvement, like the introduction of chain IDs for the refinement of macromolecular structures.

  7. Comparison of Placido disc and Scheimpflug image-derived topography-guided excimer laser surface normalization combined with higher fluence CXL: the Athens Protocol, in progressive keratoconus

    Directory of Open Access Journals (Sweden)

    Kanellopoulos AJ

    2013-07-01

    Full Text Available Anastasios John Kanellopoulos,1,2 George Asimellis1 (1Laservision.gr Eye Institute, Athens, Greece; 2New York University School of Medicine, Department of Ophthalmology, NY, NY, USA). Background: The purpose of this study was to compare the safety and efficacy of two alternative corneal topography data sources used in topography-guided excimer laser normalization, combined with corneal collagen cross-linking, in the management of keratoconus using the Athens protocol: a Placido disc imaging device and a Scheimpflug imaging device. Methods: A total of 181 consecutive patients with keratoconus who underwent the Athens protocol between 2008 and 2011 were studied preoperatively and at months 1, 3, 6, and 12 postoperatively for visual acuity, keratometry, and anterior surface corneal irregularity indices. Two groups were formed, depending on the primary source used for topography-guided photoablation: group A (Placido disc) and group B (Scheimpflug rotating camera). One-year changes in visual acuity, keratometry, and seven anterior surface corneal irregularity indices were studied in each group. Results: Changes in visual acuity, expressed as the difference between postoperative and preoperative corrected distance visual acuity, were +0.12 ± 0.20 (range +0.60 to -0.45) for group A and +0.19 ± 0.20 (range +0.75 to -0.30) for group B. In group A, K1 (flat keratometry) changed from 45.202 ± 3.782 D to 43.022 ± 3.819 D, indicating a flattening of -2.18 D, and K2 (steep keratometry) changed from 48.670 ± 4.066 D to 45.865 ± 4.794 D, indicating a flattening of -2.805 D. In group B, K1 (flat keratometry) changed from 46.213 ± 4.082 D to 43.190 ± 4.398 D, indicating a flattening of -3.023 D, and K2 (steep keratometry) changed from 50.774 ± 5.210 D to 46.380 ± 5.006 D, indicating a flattening of -4.394 D. For group A, the index of surface variance decreased to -5.07% and the index of height decentration to -26.81%. In group B, the index of surface variance
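The quoted flattening values are simply postoperative minus preoperative mean keratometry; a quick check using the group means reported above:

```python
# Keratometric "flattening" = postoperative - preoperative mean keratometry,
# computed from the group means quoted in the abstract (dioptres).

k_readings = {                      # (preop, postop) mean keratometry in D
    ("A", "K1"): (45.202, 43.022),
    ("A", "K2"): (48.670, 45.865),
    ("B", "K1"): (46.213, 43.190),
    ("B", "K2"): (50.774, 46.380),
}

for (group, k), (pre, post) in k_readings.items():
    print(f"group {group} {k}: {post - pre:+.3f} D")
# group A K1: -2.180 D, group A K2: -2.805 D,
# group B K1: -3.023 D, group B K2: -4.394 D
```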

  8. Structure refinement of astrophyllite

    Institute of Scientific and Technical Information of China (English)

    MA; Zhesheng

    2001-01-01

    [1]Abdel-Fattah M. Abdel-Rahman, Mineral chemistry and paragenesis of astrophyllite from Egypt, Mineralogical Magazine, 1992, 56: 17-26.[2]Liu Yan, Ma Zhesheng, Han Xiuling et al., Astrophyllite from the Namjabarwa Area, Eastern Tibet, Acta Petrologica et Mineralogica, 1997, 16(4): 338-340.[3]Peng Zhizhong, Ma Zhesheng, The crystal structure of astrophyllite (in Russian), Scientia Sinica, 1963, 12(2): 272-276.[4]Peng Zhizhong, Ma Zhesheng, The crystal structure of triclinic mangano-astrophyllite (in Russian), Scientia Sinica (Science in China), 1964, 13(7): 1180-1183.[5]Shi Nicheng, Ma Zhesheng, Li Guowu et al., Structure refinement of monoclinic astrophyllite, Acta Crystallographica, Section B, 1998, B54: 109-114.[6]Woodrow, P. J., The crystal structure of astrophyllite, Acta Crystallographica, 1967, 22: 673-678.[7]Семенов Е. И., Куплетскит-новый минерал группы астрофиллита [Kupletskite, a new mineral of the astrophyllite group], ДАН, 1956, 108(5): 933-936.[8]Nickel, E. H., Rowland, J. E., Charette, D. J., Niobophyllite, the niobium analogue of astrophyllite: a new mineral from Seal Lake, Labrador, Canad. Mineral., 1964, 8(1): 40.[9]X-Ray Laboratory of Hubei Geologic College, The crystal chemistry of astrophyllite group minerals (in Chinese), Scientia Geologica Sinica, 1974, (1): 18-30.[10]Sheldrick, G. M., Program for the solution of crystal structures, SHELX86, University of Göttingen, 1985, Germany.[11]Sheldrick, G. M., Program for the refinement of crystal structures, SHELXL93, University of Göttingen, 1993, Germany.[12]Liebau, F., Structural Chemistry of Silicates: Structure, Bonding, and Classification, Heidelberg: Springer-Verlag, 1985.[13]Ferraris, G., Ivaldi, G., Khomyakov, A. P. et al., Nafertisite, a layer titanosilicate member of a polysomatic series including mica, Eur. J. Mineral., 1996, 8: 241-249.[14]Ferraris, G., Polysomatism as a tool for correlating properties and structure, in EMU Notes in

  9. High Intensity Interval- vs Resistance or Combined- Training for Improving Cardiometabolic Health in Overweight Adults (Cardiometabolic HIIT-RT Study): study protocol for a randomised controlled trial.

    Science.gov (United States)

    Ramírez-Vélez, Robinson; Hernandez, Alejandra; Castro, Karem; Tordecilla-Sanders, Alejandra; González-Ruíz, Katherine; Correa-Bautista, Jorge Enrique; Izquierdo, Mikel; García-Hermoso, Antonio

    2016-06-24

    Although evidence shows the positive health effects of physical activity, most of the adult population in Colombia are sedentary. It is, therefore, important to implement strategies that generate changes in lifestyle behaviours. This protocol describes a study in which we will compare the effects of 12 weeks of high-intensity interval training (HIIT), resistance training (RT) or combined training (HIIT + RT) on the improvement of body composition, endothelial function, blood pressure, blood lipids, and cardiorespiratory fitness in a cohort of sedentary, overweight adults (aged 30-50 years). Sixty sedentary, overweight adults attending primary care in Bogotá, Colombia will be included in a factorial randomised controlled trial. Participants will be randomly assigned to the following intervention groups: (1) non-exercise group: usual care with dietary support, (2) HIIT group: 4 × 4-min intervals at 85-95 % maximum heart rate (HRmax) (with the target zone maintained for at least 2 minutes), interspersed with a 4-min recovery period at 65 % HRmax, (3) RT group: completing a resistance circuit (including upper and lower muscle groups) as many times as needed according to the subject's weight until an expenditure of 500 kcal at 40-80 % of one-rep max (1RM) has been achieved, and (4) combined group: HIIT + RT. The primary end point for effectiveness is vascular function as measured by flow-mediated vasodilatation 1 week after the end of exercise training. The results of this study will provide new information about the possible effect of the programme in improving the cardiometabolic health of overweight adults, making more efficient use of resources over time. ClinicalTrials.gov ID: NCT02715063 . Registered on 8 March 2016.
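The heart-rate target zones described above can be sketched as below; the 220 - age estimate of HRmax is an assumption for illustration (the protocol itself works from each participant's measured HRmax):

```python
# Target heart-rate zones for the HIIT prescription: 4-min work intervals
# at 85-95 % HRmax, recovery at 65 % HRmax. HRmax is estimated here with
# the common (assumed) 220 - age formula.

def hr_zone(age, lo_frac, hi_frac):
    """Return the (low, high) heart-rate bounds in beats per minute."""
    hr_max = 220 - age
    return round(hr_max * lo_frac), round(hr_max * hi_frac)

work = hr_zone(40, 0.85, 0.95)      # work interval for a 40-year-old
recovery = hr_zone(40, 0.65, 0.65)  # recovery target for the same person
print(work, recovery)               # prints: (153, 171) (117, 117)
```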

  10. Safety and Efficacy of miltefosine alone and in combination with sodium stibogluconate and liposomal amphotericin B for the treatment of primary visceral leishmaniasis in East Africa: study protocol for a randomized controlled trial

    NARCIS (Netherlands)

    Omollo, R.; Alexander, N.; Edwards, T.; Khalil, E.A.G.; Younis, B.M.; Abuzaid, A.A.; Wasunna, M.; Njoroge, N.; Kinoti, D.; Kirigi, G.; Dorlo, T.P.C.; Ellis, S.; Balasegaram, M.; Musa, A.M.

    2011-01-01

    BACKGROUND: Treatment options for Visceral Leishmaniasis (VL) in East Africa are far from satisfactory due to cost, toxicity, prolonged treatment duration or emergence of parasite resistance. Hence there is a need to explore alternative treatment protocols such as miltefosine alone or in combination

  11. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called the coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of an increased feedback and receiver complexity.
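The underlying (uncoded) binary splitting tree that the proposal builds on can be simulated in a few lines. SIC and the graph-code decoding of the proposed protocol are deliberately omitted from this sketch: colliding users flip a fair coin to join the left or right subgroup, and each subgroup is resolved recursively.

```python
import random

# Standard binary splitting-tree collision resolution: a slot with 0 or 1
# transmitters is resolved immediately; a collision splits the contenders
# into two random subgroups that are resolved recursively.

def resolve(users, rng):
    """Return the number of slots needed to resolve the given contenders."""
    if len(users) <= 1:
        return 1                                     # idle or success slot
    left = [u for u in users if rng.random() < 0.5]  # coin flip per user
    right = [u for u in users if u not in left]
    return 1 + resolve(left, rng) + resolve(right, rng)  # collision + subtrees

rng = random.Random(42)
n = 8
slots = resolve(list(range(n)), rng)
print(f"{n} users resolved in {slots} slots "
      f"(throughput {n / slots:.2f} packets/slot)")
```

Full resolution of n users always costs at least 2n - 1 slots (n success leaves plus the internal collision slots), which is why the coded variant's early termination and joint decoding pay off.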

  12. The Application of Cytidyl Guanosyl Oligodeoxynucleotide Can Affect the Antitumor Immune Response Induced by a Combined Protocol of Cryoablation and Dendritic Cells in Lewis Lung Cancer Model.

    Science.gov (United States)

    Zhang, Mi; Yin, Tianquan; Lu, Yuan; Feng, Huasong

    2016-04-19

    BACKGROUND Recently, several combined therapeutic strategies and targeted agents have been under investigation for their potential role in lung cancer. The combined administration of dendritic cells (DCs) and immune-adjuvant cytidyl guanosyl oligodeoxynucleotide (CpG-ODN) after cryosurgery has proven to be an effective strategy for treating lung cancer. However, whether the timing of CpG-ODN application could affect the therapeutic results remained to be explored. MATERIAL AND METHODS Lewis lung cancer (LLC)-bearing mice received cryoablation and injection of ex vivo-cultured DCs into the peritumoral zone. Subsequently, CpG-ODN was administered to experimental animals 6 hours, 12 hours, and 24 hours after DC injection. The mice in the control group received coadministration of DCs and CpG-ODN simultaneously. Therapeutic effects were evaluated by survival rates. Resistance to rechallenge with LLC cells was assessed by lung metastasis and in vitro cytotoxicity of splenocytes. Furthermore, T-cell subsets and multiple cytokines (interleukin [IL]-4, -10, and -12; interferon [IFN]-γ; tumor necrosis factor [TNF]-α) in the blood were assessed to elucidate the underlying mechanisms. RESULTS Higher ratios of CD4+ and CD8+ T cells and higher levels of IL-12, IFN-γ, and TNF-α were found in the blood of the mice that received CpG-ODN therapy 12 h after DC injection. The cytotoxic potency of the splenocytes of these mice was significantly higher compared with the mice in other groups. Moreover, the mice receiving CpG-ODN therapy 12 h after DC injection showed significantly better resistance to rechallenge. Compared with the mice in other groups, the mice receiving CpG-ODN therapy 12 h after DC injection were superior in survival rates and antimetastatic effects. CONCLUSIONS Our study suggested that the therapeutic efficacy was closely associated with the timing of CpG-ODN administration in the combined therapeutic protocol of cryoablation, DCs, and immune adjuvant. In situ

  13. Crystal structure refinement with SHELXL

    Energy Technology Data Exchange (ETDEWEB)

    Sheldrick, George M., E-mail: gsheldr@shelx.uni-ac.gwdg.de [Department of Structural Chemistry, Georg-August Universität Göttingen, Tammannstraße 4, Göttingen 37077 (Germany)

    2015-01-01

    New features added to the refinement program SHELXL since 2008 are described and explained. The improvements in the crystal structure refinement program SHELXL have been closely coupled with the development and increasing importance of the CIF (Crystallographic Information Framework) format for validating and archiving crystal structures. An important simplification is that now only one file in CIF format (for convenience, referred to simply as ‘a CIF’) containing embedded reflection data and SHELXL instructions is needed for a complete structure archive; the program SHREDCIF can be used to extract the .hkl and .ins files required for further refinement with SHELXL. Recent developments in SHELXL facilitate refinement against neutron diffraction data, the treatment of H atoms, the determination of absolute structure, the input of partial structure factors and the refinement of twinned and disordered structures. SHELXL is available free to academics for the Windows, Linux and Mac OS X operating systems, and is particularly suitable for multiple-core processors.
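
Conceptually, the single-file archive works because CIF can embed the reflection data as a multi-line text field that a tool such as SHREDCIF extracts back into a .hkl file. The toy parser below illustrates that idea only; the tag name `_shelx_hkl_file` and the sample CIF are assumptions for illustration, not the actual SHREDCIF implementation or the full CIF syntax.

```python
# Illustrative sketch (not SHREDCIF): pull an embedded reflection block
# out of a combined CIF. Assumes the reflections sit in a CIF multi-line
# text field (delimited by lines starting with ';') under a tag named
# _shelx_hkl_file -- both are assumptions for this example.
def extract_embedded_hkl(cif_text):
    """Return the text of the embedded .hkl block, or None if absent."""
    lines = cif_text.splitlines()
    try:
        start = lines.index("_shelx_hkl_file") + 1
    except ValueError:
        return None                      # tag not present
    if not lines[start].startswith(";"):
        return None                      # not a multi-line text field
    end = start + 1
    while not lines[end].startswith(";"):
        end += 1                         # scan to the closing ';'
    return "\n".join(lines[start + 1:end])

# A made-up minimal combined CIF with two embedded reflections.
sample = """data_mystructure
_shelx_hkl_file
;
   1   0   0  152.3   4.1
   0   1   0   87.6   2.9
;
"""
print(extract_embedded_hkl(sample))
```

A real tool would also recover the embedded .ins/.res instructions and handle the full CIF grammar; this sketch only shows why one archive file can carry everything needed to restart a refinement.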

  14. Parallel Adaptive Mesh Refinement

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L; Hornung, R; Plassmann, P; Wissink, A

    2005-03-04

    As large-scale, parallel computers have become more widely available and numerical models and algorithms have advanced, the range of physical phenomena that can be simulated has expanded dramatically. Many important science and engineering problems exhibit solutions with localized behavior where highly-detailed salient features or large gradients appear in certain regions which are separated by much larger regions where the solution is smooth. Examples include chemically-reacting flows with radiative heat transfer, high Reynolds number flows interacting with solid objects, and combustion problems where the flame front is essentially a two-dimensional sheet occupying a small part of a three-dimensional domain. Modeling such problems numerically requires approximating the governing partial differential equations on a discrete domain, or grid. Grid spacing is an important factor in determining the accuracy and cost of a computation. A fine grid may be needed to resolve key local features while a much coarser grid may suffice elsewhere. Employing a fine grid everywhere may be inefficient at best and, at worst, may make an adequately resolved simulation impractical. Moreover, the location and resolution of the fine grid required for an accurate solution is a dynamic property of a problem's transient features and may not be known a priori. Adaptive mesh refinement (AMR) is a technique that can be used with both structured and unstructured meshes to adjust local grid spacing dynamically to capture solution features with an appropriate degree of resolution. Thus, computational resources can be focused where and when they are needed most to efficiently achieve an accurate solution without incurring the cost of a globally-fine grid. Figure 1.1 shows two example computations using AMR; on the left is a structured mesh calculation of an impulsively-sheared contact surface and on the right is the fuselage and volume discretization of an RAH-66 Comanche helicopter [35]. Note the
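
The flag-and-refine cycle described in this record can be illustrated with a toy one-dimensional example: cells whose local solution jump exceeds a tolerance are split, so points cluster around a steep front while smooth regions keep the coarse spacing. This is a minimal sketch of the general AMR idea, not the structured/unstructured machinery of the report; the function name, tolerance, and test profile are invented for illustration.

```python
# Toy 1-D adaptive refinement: split every cell whose solution jump
# exceeds a tolerance. Illustrative only; thresholds are made up.
import numpy as np

def refine_1d(x, f, tol):
    """Insert a midpoint in every cell where |f(b) - f(a)| > tol."""
    new_x = [x[0]]
    for a, b in zip(x[:-1], x[1:]):
        if abs(f(b) - f(a)) > tol:
            new_x.append(0.5 * (a + b))   # refine: split the cell
        new_x.append(b)
    return np.array(new_x)

# A steep front near x = 0, loosely analogous to a thin flame sheet.
f = lambda x: np.tanh(20 * x)
x = np.linspace(-1.0, 1.0, 11)            # coarse, uniform start
for _ in range(3):                        # three adaptive passes
    x = refine_1d(x, f, tol=0.2)

# Points cluster near the front; smooth regions keep the coarse spacing.
print(len(x))
```

Production AMR frameworks do this hierarchically (levels of nested patches, with coarsening as features move), but the resource argument is the same: fine resolution only where the estimated error demands it.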

  15. An approach of requirements tracing in formal refinement

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Leuschel, Michael;

    2010-01-01

    Formal modeling of computing systems yields models that are intended to be correct with respect to the requirements that have been formalized. The complexity of typical computing systems can be addressed by formal refinement introducing all the necessary details piecemeal. We report on preliminary results that we have obtained for tracing informal natural-language requirements into formal models across refinement levels. The approach uses the WRSPM reference model for requirements modeling, and Event-B for formal modeling and formal refinement. The combined use of WRSPM and Event-B is facilitated by the rudimentary refinement notion of WRSPM, which provides the foundation for tracing requirements to formal refinements. We assume that requirements are evolving, meaning that we have to cope with frequent changes of the requirements model and the formal model. Our approach is capable of dealing with frequent changes.

  16. Implementing a Combined Cognitive-Behavioral + Exercise Therapy Protocol for Use by Older Adults with Chronic Back Pain: Evidence for a Possible Race/Ethnicity Effect

    Science.gov (United States)

    Beissner, Katherine; Parker, Samantha; Henderson, Charles R.; Pal, Anusmiriti; Papaleontiou, Maria; Reid, M.C.

    2012-01-01

    Purpose To determine the feasibility and potential efficacy of a self-management program that combines cognitive-behavioral strategies with exercise for use by seniors with chronic back pain, and to assess for possible race/ethnicity differences in program impact. Design/Methods Sixty-nine participants attending senior centers in New York City enrolled in the 8-week group-based program, with approximately equal numbers of African Americans (n=24), Hispanics (n=25), and non-Hispanic whites (n=20) enrolling. Participants provided weekly input on their perceived understanding and usefulness of program components. Efficacy outcomes included pain-related disability, as measured by the Roland Morris Disability Questionnaire (RMDQ), pain intensity, pain self-efficacy, depressive symptom score, social activity, and functional status. Results Eighty percent of enrollees completed the program, and 84% of program participants indicated they did the weekly practice/homework exercises. Program content was rated as understandable and highly useful to participants. Significant decreases in RMDQ scores were found for non-Hispanic white (adjusted change score −3.53), African American (−3.89), and Hispanic (−8.45) participants. Significant improvements in all other efficacy outcomes (pain intensity, social activity, activities of daily living, depressive symptoms) were observed, but only for Hispanic participants. Conclusions These results confirm that implementation of the protocol in urban senior centers is feasible, and the program shows potential efficacy in affecting pain-related disability among a diverse population of older adults. The race/ethnicity differences observed in the current study merit further investigation. PMID:22015623

  17. Efficacy of combined conservative therapies on clinical outcomes in patients with thumb base osteoarthritis: protocol for a randomised, controlled trial (COMBO)

    Science.gov (United States)

    Deveza, Leticia A; Wajon, Anne; Bennell, Kim L; Vicenzino, Bill; Hodges, Paul; Eyles, Jillian P; Jongs, Ray; Riordan, Edward A; Duong, Vicky; Min Oo, Win; O'Connell, Rachel; Meneses, Sarah R F

    2017-01-01

    Introduction Management of thumb base osteoarthritis (OA) using a combination of therapies is common in clinical practice; however, evidence for the efficacy of this approach is lacking. The aim of this study is to determine the effect of a combination of conservative therapies for the treatment of thumb base OA compared with an education control group. Methods and analysis This is a randomised, controlled, single-centre, two-arm superiority trial with 1:1 allocation ratio; with assessor and statistician blinded. Participants are blinded to the trial's hypothesis and to the interventions received by the opposite group. A total of 204 participants will be recruited from the community and randomised using a computer-generated schedule. The intervention group will receive education for joint protection and OA, a splint for the base of the thumb, hand exercises and topical diclofenac sodium 1% gel over 6 weeks. The control group will receive education for joint protection and OA alone. Main inclusion criteria are pain ≥40 mm (Visual Analogue Scale, 0–100) at the base of the thumb, impairment in hand function ≥6 (Functional Index for Hand Osteoarthritis, 0–30) and radiographic thumb base OA (Kellgren Lawrence grade ≥2). Participants currently receiving any of the intervention components will be excluded. Outcomes will be measured at 2, 6 and 12 weeks. The primary outcome is change in pain and hand function from baseline to 6 weeks. Other outcomes include changes in grip and pinch strength, quality of life, presence of joint swelling and tenderness, duration of joint stiffness, patient's global assessment and use of rescue medication. Analysis will be performed according to the intention-to-treat principle. Adverse events will be monitored throughout the study. Ethics and dissemination This protocol is approved by the local ethics committee (HREC/15/HAWKE/479). Dissemination will occur through presentations at international conferences and publication

  18. Combined low-dose contrast-enhanced MR angiography and perfusion for acute ischemic stroke at 3T: A more efficient stroke protocol.

    Science.gov (United States)

    Nael, K; Meshksar, A; Ellingson, B; Pirastehfar, M; Salamon, N; Finn, P; Liebeskind, D S; Villablanca, J P

    2014-06-01

    There is a need to improve image acquisition speed for MR imaging in evaluation of patients with acute ischemic stroke. The purpose of this study was to evaluate the feasibility of a 3T MR stroke protocol that combines low-dose contrast-enhanced MRA and dynamic susceptibility contrast perfusion, without additional contrast. Thirty patients with acute stroke who underwent 3T MR imaging followed by DSA were retrospectively enrolled. TOF-MRA of the neck and brain and 3D contrast-enhanced MRA of the craniocervical arteries were obtained. A total of 0.1 mmol/kg of gadolinium was used for both contrast-enhanced MRA (0.05 mmol/kg) and dynamic susceptibility contrast perfusion (0.05 mmol/kg) (referred to as half-dose). An age-matched control stroke population underwent TOF-MRA and full-dose (0.1 mmol/kg) dynamic susceptibility contrast perfusion. The cervicocranial arteries were divided into 25 segments. Degree of arterial stenosis on contrast-enhanced MRA and TOF-MRA was compared with DSA. Time-to-maximum maps (>6 seconds) were evaluated for image quality and hypoperfusion. Quantitative analysis of arterial input function curves, SNR, and maximum T2* effects were compared between half- and full-dose groups. The intermodality agreements (k) for arterial stenosis were 0.89 for DSA/contrast-enhanced MRA and 0.63 for DSA/TOF-MRA. Detection specificity of >50% arterial stenosis was lower for TOF-MRA (89%) versus contrast-enhanced MRA (97%) as the result of overestimation of 10% (39/410) of segments by TOF-MRA. The DWI-perfusion mismatch was identified in both groups with high interobserver agreement (r = 1). There was no significant difference between full width at half maximum of the arterial input function curves (P = .14) or the SNR values (0.6) between the half-dose and full-dose groups. 
In patients with acute stroke, combined low-dose contrast-enhanced MRA and dynamic susceptibility contrast perfusion at 3T is feasible and results in significant scan time and contrast dose

  19. Crystallization in lactose refining-a review.

    Science.gov (United States)

    Wong, Shin Yee; Hartel, Richard W

    2014-03-01

    In the dairy industry, crystallization is an important separation process used in the refining of lactose from whey solutions. In the refining operation, lactose crystals are separated from the whey solution through nucleation, growth, and/or aggregation. The rate of crystallization is determined by the combined effect of crystallizer design, processing parameters, and impurities on the kinetics of the process. This review summarizes studies on lactose crystallization, including the mechanism, theory of crystallization, and the impact of various factors affecting the crystallization kinetics. In addition, an overview of the industrial crystallization operation highlights the problems faced by the lactose manufacturer. The approaches that are beneficial to the lactose manufacturer for process optimization or improvement are summarized in this review. Over the years, much knowledge has been acquired through extensive research. However, the industrial crystallization process is still far from optimized. Therefore, future effort should focus on transferring the new knowledge and technology to the dairy industry.

  20. Some Bounds on Security Protocol Analysis--Combining Model Checking and Strand Spaces

    Institute of Scientific and Technical Information of China (English)

    刘怡文; 李伟琴

    2002-01-01

    Strand Spaces serve as a model for security protocol analysis. In this paper, the main characteristics of Strand Spaces are briefly introduced, and its advantages and disadvantages are presented. An algorithm for building an ideal model of a protocol is proposed, which is used to bound both the abilities of the penetrator and the number of concurrent protocol runs. Combining Model Checking and Strand Spaces, a method is proposed that uses both the automatic reasoning mechanism of Model Checking and the bounds on security protocol analysis to achieve effective analysis of security protocols while avoiding state explosion problems.

  1. Crystal structure refinement with SHELXL.

    Science.gov (United States)

    Sheldrick, George M

    2015-01-01

    The improvements in the crystal structure refinement program SHELXL have been closely coupled with the development and increasing importance of the CIF (Crystallographic Information Framework) format for validating and archiving crystal structures. An important simplification is that now only one file in CIF format (for convenience, referred to simply as 'a CIF') containing embedded reflection data and SHELXL instructions is needed for a complete structure archive; the program SHREDCIF can be used to extract the .hkl and .ins files required for further refinement with SHELXL. Recent developments in SHELXL facilitate refinement against neutron diffraction data, the treatment of H atoms, the determination of absolute structure, the input of partial structure factors and the refinement of twinned and disordered structures. SHELXL is available free to academics for the Windows, Linux and Mac OS X operating systems, and is particularly suitable for multiple-core processors.

  2. On Modal Refinement and Consistency

    DEFF Research Database (Denmark)

    Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej

    2007-01-01

    Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement complete with respect to the standard notions of implementation is shown to be computationally hard (co-NP hard). Second, we consider four forms of consistency (existence of implementations) for modal specifications. We characterize each operationally, giving algorithms for deciding, and for synthesizing implementations, together…

  3. A formal model of fair exchange protocols

    Institute of Scientific and Technical Information of China (English)

    QING Sihan; LI Gaicheng

    2005-01-01

    Based on the study of existing fair exchange protocols, this paper sets up an accurate formal model by stepwise refinement. In the process of refinement an unreliable channel is employed to simulate attack behavior. The model provides a novel formal definition of exchanged items, and presents the formal goals for fairness, accountability, etc., reflecting the inherent requirements for fair exchange protocols across the board. In order to check, prove, and design fair exchange protocols effectively and efficiently, the model puts forward a novel property of abuse-freeness which applies to all fair exchange protocols, gives a formal definition for the trust strand of the third party, and presents general criteria for designing a secure and effective fair exchange protocol. Taking a typical fair exchange protocol as an example, this paper presents the steps for analysing fair exchange protocols with our model. An unknown attack is uncovered. The analysis reveals the process of a complete attack, discovering the deeper reasons that enable the attack. Finally, we modify the flawed protocol, and the revised protocol ensures the desirable properties.

  4. Reducing treatments in cattle superovulation protocols by combining a pituitary extract with a 5% hyaluronan solution: Is it able to diminish activation of the hypothalamic pituitary adrenal axis compared to the traditional protocol?

    Science.gov (United States)

    Biancucci, Andrea; Sbaragli, Tatiana; Comin, Antonella; Sylla, Lakamy; Monaci, Maurizio; Peric, Tanja; Stradaioli, Giuseppe

    2016-03-15

    Traditional superovulation protocols that include multiple gonadotropin treatments are time-consuming and labor intensive. These protocols require multiple handling and restraining of embryo donors. This will likely increase the risks of injuries in both animals and humans and induce stress that may lead to a reduced superovulatory response. These risks are more evident when working with cattle that are rarely handled or raised on extensive grazing. The objectives of this experiment were to compare the efficacy of a split-injection protocol of a porcine pituitary-derived FSH (pFSH) preparation (slow release [SR] group) to the traditional 4-day treatment with pFSH administered twice daily (C group) and to determine the concentrations of cortisol in the hair as a marker of activation of the hypothalamic-pituitary-adrenal (HPA) axis during the two superovulatory treatments. Thirty-two heifers were stimulated twice in a 2 × 2 crossover design and compared for ovarian response and numbers and characteristics of recovered ova-embryos among treatments. No differences between SR and C groups were found in terms of percentage of responsive animals (100% vs. 93.8%) and ovulation rate (83.7 ± 1.1 vs. 79.5 ± 1.0%). A positive correlation was found between the number of follicles responsive to pFSH (2-8 mm) at the beginning of treatments and the superovulatory response, and no differences were found in these follicular populations between the two treatment groups. The numbers of CLs, ova-embryos, fertilized ova, transferable and freezable embryos recovered per cow were found to be significantly higher in SR compared with C group (14.0 ± 1.6 vs. 10.6 ± 1.0, 12.1 ± 1.6 vs. 7.6 ± 1.0, 11.1 ± 1.1 vs. 7.3 ± 1.0, 9.6 ± 1.4 vs. 6.6 ± 1.0, and 9.4 ± 1.4 vs. 6.0 ± 1.0 for SR and C group, respectively). The SR group also produced a significantly greater number of excellent- and/or good-quality embryos compared with the C group. The concentrations of cortisol in the hair at

  5. (dtltt) protocol

    African Journals Online (AJOL)

    2013-03-01

    Mar 1, 2013 ... Keywords: multi-access, multiservice, network, synchronous, asynchronous, traffic, timed-token. 1. ... 12, 13] SAFENET [14], Manufacturing Automation Protocol (MAP) ... token circulation on mobile Ad Hoc Networks. 21th In-...

  6. Zone refining of plutonium metal

    Energy Technology Data Exchange (ETDEWEB)

    Blau, Michael S. [Univ. of Idaho, Moscow, ID (United States)

    1994-08-01

    The zone refining process was applied to Pu metal containing known amounts of impurities. Rod specimens of plutonium metal were melted into and contained in tantalum boats, each of which was passed horizontally through a three-turn, high-frequency coil in such a manner as to cause a narrow molten zone to pass through the Pu metal rod 10 times. The impurity elements Co, Cr, Fe, Ni, Np, U were found to move in the same direction as the molten zone as predicted by binary phase diagrams. The elements Al, Am, and Ga moved in the opposite direction of the molten zone as predicted by binary phase diagrams. As the impurity alloy was zone refined, δ-phase plutonium metal crystals were produced. The first few zone refining passes were more effective than each later pass because an oxide layer formed on the rod surface. There was no clear evidence of better impurity movement at the slower zone refining speed. Also, constant or variable coil power appeared to have no effect on impurity movement during a single run (10 passes). This experiment was the first step to developing a zone refining process for plutonium metal.
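
The impurity transport underlying this experiment is commonly described by Pfann's single-pass zone-melting equation, C(x) = C0·(1 − (1 − k)·exp(−kx/l)), where k is the impurity distribution coefficient and l the zone length. The sketch below evaluates that textbook relation for an assumed k = 0.1 (an impurity that travels with the molten zone, as Co, Cr, Fe, and Ni do here); it illustrates the principle only and is not a model fitted to the report's data.

```python
# Pfann's single-pass zone-refining profile (textbook relation).
# k = 0.1 and l = 1 are illustrative assumptions, not measured values.
import math

def single_pass_profile(x, c0=1.0, k=0.1, zone_len=1.0):
    """Impurity concentration after one zone pass, valid for positions
    x (in the same units as zone_len) before the final zone length."""
    return c0 * (1.0 - (1.0 - k) * math.exp(-k * x / zone_len))

# Concentration along the rod, in units of zone lengths: the leading
# end is purified to about k*C0, and C rises back toward C0 downstream.
for x in (0.0, 2.0, 5.0, 10.0):
    print(f"x = {x:4.1f} zone lengths: C/C0 = {single_pass_profile(x):.3f}")
```

Repeated passes (10 here) sharpen this gradient further, which is why impurities with k < 1 accumulate at the exit end of the rod while the entry end is purified.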

  7. Bauxite Mining and Alumina Refining

    Science.gov (United States)

    Frisch, Neale; Olney, David

    2014-01-01

    Objective: To describe bauxite mining and alumina refining processes and to outline the relevant physical, chemical, biological, ergonomic, and psychosocial health risks. Methods: Review article. Results: The most important risks relate to noise, ergonomics, trauma, and caustic soda splashes of the skin/eyes. Other risks of note relate to fatigue, heat, and solar ultraviolet and for some operations tropical diseases, venomous/dangerous animals, and remote locations. Exposures to bauxite dust, alumina dust, and caustic mist in contemporary best-practice bauxite mining and alumina refining operations have not been demonstrated to be associated with clinically significant decrements in lung function. Exposures to bauxite dust and alumina dust at such operations are also not associated with the incidence of cancer. Conclusions: A range of occupational health risks in bauxite mining and alumina refining require the maintenance of effective control measures. PMID:24806720

  8. Data refinement for true concurrency

    Directory of Open Access Journals (Sweden)

    Brijesh Dongol

    2013-05-01

    The majority of modern systems exhibit sophisticated concurrent behaviour, where several system components modify and observe the system state with fine-grained atomicity. Many systems (e.g., multi-core processors, real-time controllers) also exhibit truly concurrent behaviour, where multiple events can occur simultaneously. This paper presents data refinement defined in terms of an interval-based framework, which includes high-level operators that capture non-deterministic expression evaluation. By modifying the type of an interval, our theory may be specialised to cover data refinement of both discrete and continuous systems. We present an interval-based encoding of forward simulation, then prove that our forward simulation rule is sound with respect to our data refinement definition. A number of rules for decomposing forward simulation proofs over both sequential and parallel composition are developed.

  9. Pointing Refinement of SIRTF Images

    CERN Document Server

    Masci, F; Moshir, M; Shupe, D; Fowler, J W; Fowler, John W.

    2002-01-01

    The soon-to-be-launched Space Infrared Telescope Facility (SIRTF) shall produce image data with an a posteriori pointing knowledge of 1.4 arcsec (1 sigma radial) with a goal of 1.2 arcsec in the International Celestial Reference System (ICRS). To perform robust image coaddition, mosaic generation, extraction and position determination of faint sources, the pointing will need to be refined to better than a few tenths of an arcsecond. We use a linear-sparse matrix solver to find a "global-minimization" of all relative image offsets in a mosaic from which refined pointings and orientations can be computed. This paper summarizes the pointing-refinement algorithm and presents the results of testing on simulated data.
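
The "global minimization" of relative offsets can be sketched as a linear least-squares problem: each measured offset between an overlapping image pair contributes one equation d_ij ≈ p_j − p_i, and one image is anchored to remove the overall translation freedom. The sketch below is a one-dimensional toy with invented numbers, using a dense solver in place of the sparse-matrix solver the paper describes.

```python
# Toy global offset refinement: solve least squares for absolute
# pointing corrections p from noisy pairwise offsets. Illustrative
# only; a real mosaic uses 2-D offsets and a sparse solver.
import numpy as np

def refine_pointings(n, pairs):
    """pairs: list of (i, j, measured_offset). Anchors image 0 near 0."""
    rows = len(pairs) + 1
    A = np.zeros((rows, n))
    b = np.zeros(rows)
    for r, (i, j, d) in enumerate(pairs):
        A[r, j], A[r, i], b[r] = 1.0, -1.0, d   # d_ij ≈ p_j - p_i
    A[-1, 0] = 1.0                              # gauge fix: p[0] ≈ 0
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# True 1-D pointing errors for 4 images, with slightly noisy offsets.
true_p = np.array([0.0, 0.3, -0.1, 0.2])
pairs = [(0, 1, 0.31), (1, 2, -0.41), (2, 3, 0.30),
         (0, 3, 0.19), (0, 2, -0.09)]
p = refine_pointings(4, pairs)
print(np.round(p, 3))
```

Because every overlap adds a constraint, the least-squares solution averages down the per-pair measurement noise, which is the essence of refining a whole mosaic jointly rather than chaining offsets image by image.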

  10. Symmetric cryptographic protocols for extended millionaires' problem

    Institute of Scientific and Technical Information of China (English)

    LI ShunDong; WANG DaoShun; DAI YiQi

    2009-01-01

    Yao's millionaires' problem is a fundamental problem in secure multiparty computation, and its solutions have become building blocks of many secure multiparty computation solutions. Unfortunately, most protocols for the millionaires' problem are constructed based on public-key cryptography, and thus are inefficient. Furthermore, all protocols are designed to solve the basic millionaires' problem, that is, to privately determine which of two natural numbers is greater. If the numbers are real, existing solutions do not directly work. These features limit the extensive application of the existing protocols. This study introduces and refines the first symmetric cryptographic protocol for the basic millionaires' problem, and then extends it to privately determining which of two real numbers is greater, called the extended millionaires' problem, and proposes corresponding protocols. Constructed based on symmetric cryptography, these protocols are very efficient.

  11. Biofuels Refining Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Lobban, Lance [Univ. of Oklahoma, Norman, OK (United States)

    2017-03-28

    carbon capture and hydrogen efficiency. Our research approach combined catalyst synthesis, measurements of catalyst activity and selectivity in different reactor systems and conditions, and detailed catalyst characterization to develop fundamental understanding of reaction pathways and the capability to predict product distributions. Nearly all of the candidate catalysts were prepared in-house via standard techniques such as impregnation, co-impregnation, or chemical vapor deposition. Supports were usually purchased, but in some cases coprecipitation was used to simultaneously create the support and active component, which can be advantageous for strong active component-support interactions and for achieving high active component dispersion. In-house synthesis also allowed for studies of the effects on catalyst activity and selectivity of such factors as support porosity, calcination temperature, and reduction/activation conditions. Depending on the physical characteristics of the molecule, catalyst activity measurements were carried out in tubular flow reactors (for vapor phase reactions) or stirred tank reactors (for liquid phase reactions) over a wide range of pressures and temperatures. Reactant and product concentrations were measured using gas chromatography (both on-line and off-line, with TCD, FID, and/or mass spectrometric detection). For promising catalysts, detailed physicochemical characterization was carried out using FTIR, Raman, XPS, and XRD spectroscopies (all available in our laboratories) and TEM spectroscopy (available at OU). Additional methods included temperature programmed techniques (TPD, TPO) and surface area measurements by nitrogen adsorption techniques.

  12. DSm Vector Spaces of Refined Labels

    CERN Document Server

    Kandasamy, W B Vasantha

    2011-01-01

    In this book the authors introduce the notion of DSm vector spaces of refined labels. They also realize the refined labels as a plane and an n-dimensional space. Further, using these refined labels, several algebraic structures are defined. Finally, the DSm semivector space of refined labels is described. The authors also propose some research problems.

  13. Refining Nodes and Edges of State Machines

    DEFF Research Database (Denmark)

    Hallerstede, Stefan; Snook, Colin

    2011-01-01

    State machines are hierarchical automata that are widely used to structure complex behavioural specifications. We develop two notions of refinement of state machines, node refinement and edge refinement. We compare the two notions by means of examples and argue that, by adopting simple conventions, … refinement theory and UML-B state machine refinement influences the style of node refinement. Hence we propose a method with a direct proof of state machine refinement, avoiding the detour via Event-B that is needed by UML-B.

  14. Conformal refinement of unstructured quadrilateral meshes

    Energy Technology Data Exchange (ETDEWEB)

    Garimella, Rao [Los Alamos National Laboratory

    2009-01-01

    We present a multilevel adaptive refinement technique for unstructured quadrilateral meshes in which the mesh is kept conformal at all times. This means that the refined mesh, like the original, is formed of only quadrilateral elements that intersect strictly along edges or at vertices, i.e., vertices of one quadrilateral element do not lie in an edge of another quadrilateral. Elements are refined using templates based on 1:3 refinement of edges. We demonstrate that by careful design of the refinement and coarsening strategy, we can maintain high quality elements in the refined mesh. We demonstrate the method on a number of examples with dynamically changing refinement regions.

  15. The effects of a combined static-dynamic stretching protocol on athletic performance in elite Gaelic footballers: A randomised controlled crossover trial.

    Science.gov (United States)

    Loughran, Martin; Glasgow, Philip; Bleakley, Chris; McVeigh, Joseph

    2017-05-01

    To determine the effect of three different static-dynamic stretching protocols on sprint and jump performance in Gaelic footballers. Double-blind, controlled, crossover trial. Sports Institute research environment. Seventeen male elite level Gaelic footballers, aged 18-30 years, completed three stretching protocols. Athletic performance was measured by countermovement jump height and power, and timed 10 m, 20 m, and 40 m sprints. Static stretching reduced sprint speed by 1.1% over 40 m and 1.0% over 20 m. Static stretching also reduced countermovement jump height by 10.6% and jump power by 6.4%. When static stretching was followed by dynamic stretching, sprint speed improved by 1.0% over 20 m and 0.7% over 40 m. The combined static-dynamic stretching protocol also improved countermovement jump height by 8.7%. Static stretching reduces sprint speed and jump performance; it should be followed by dynamic stretching during warm-up to nullify any performance deficits caused by static stretching. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Selective refinement and selection of near-native models in protein structure prediction.

    Science.gov (United States)

    Zhang, Jiong; Barz, Bogdan; Zhang, Jingfen; Xu, Dong; Kosztin, Ioan

    2015-10-01

    In recent years, in silico protein structure prediction has reached a level where fully automated servers can generate large pools of near-native structures. However, the identification and further refinement of the best structures from the pool of models remain problematic. To address these issues, we have developed (i) a target-specific selective refinement (SR) protocol; and (ii) a molecular dynamics (MD) simulation based ranking (SMDR) method. In SR the all-atom refinement of structures is accomplished via the Rosetta Relax protocol, subject to specific constraints determined by the size and complexity of the target. The best-refined models are selected with SMDR by testing their relative stability against gradual heating through all-atom MD simulations. Through extensive testing we have found that Mufold-MD, our fully automated protein structure prediction server updated with the SR and SMDR modules, consistently outperformed its previous versions.

  17. A rapid and versatile combined DNA/RNA extraction protocol and its application to the analysis of a novel DNA marker set polymorphic between Arabidopsis thaliana ecotypes Col-0 and Landsberg erecta

    Directory of Open Access Journals (Sweden)

    Coupland George

    2005-08-01

    Background Many established PCR-based approaches in plant molecular biology rely on lengthy and expensive methods for isolation of nucleic acids. Although several rapid DNA isolation protocols are available, they have not been tested for simultaneous RNA isolation for RT-PCR applications. In addition, traditional map-based cloning technologies often use ill-proportioned marker regions even when working with the model plant Arabidopsis thaliana, where the availability of the full genome sequence can now be exploited for the creation of high-density marker systems. Results We designed a high-density polymorphic marker set between two frequently used ecotypes. This new polymorphic marker set allows size separation of PCR products on agarose gels and provides an initial resolution of 10 cM in linkage mapping experiments, facilitated by a rapid plant nucleic acid extraction protocol using minimal amounts of A. thaliana tissue. Using this extraction protocol, we have also characterized segregating T-DNA insertion mutations. In addition, we have shown that our rapid nucleic acid extraction protocol can also be used for monitoring transcript levels by RT-PCR amplification. Finally we have demonstrated that our nucleic acid isolation method is also suitable for other plant species, such as tobacco and barley. Conclusion To facilitate high-throughput linkage mapping and other genomic applications, our nucleic acid isolation protocol yields sufficient quality of DNA and RNA templates for PCR and RT-PCR reactions, respectively. This new technique requires considerably less time compared to other purification methods, and in combination with a new polymorphic PCR marker set dramatically reduces the workload required for linkage mapping of mutations in A. thaliana utilizing crosses between Col-0 and Landsberg erecta (Ler) ecotypes.

  18. Refining analgesia strategies using lasers.

    Science.gov (United States)

    Hampshire, Victoria

    2015-08-01

    Sound programs for the humane care and use of animals within research facilities incorporate experimental refinements such as multimodal approaches for pain management. These approaches can include non-traditional strategies along with more established ones. The use of lasers for pain relief is growing in popularity among companion animal veterinary practitioners and technologists. Therefore, its application in the research sector warrants closer consideration.

  19. On Interaction Refinement in Middleware

    DEFF Research Database (Denmark)

    Truyen, Eddy; Jørgensen, Bo Nørregaard; Joosen, Wouter;

    2000-01-01

    components together. We have examined a reflective technique that improves the dynamics of this gluing process such that interaction between components can be refined at run-time. In this paper, we show how we have used this reflective technique to dynamically integrate into the architecture of middleware...

  20. Automated refinement of macromolecular structures at low resolution using prior information

    Science.gov (United States)

    Kovalevskiy, Oleg; Nicholls, Robert A.; Murshudov, Garib N.

    2016-01-01

    Since the ratio of the number of observations to adjustable parameters is small at low resolution, it is necessary to use complementary information for the analysis of such data. ProSMART is a program that can generate restraints for macromolecules using homologous structures, as well as generic restraints for the stabilization of secondary structures. These restraints are used by REFMAC5 to stabilize the refinement of an atomic model. However, the optimal refinement protocol varies from case to case, and it is not always obvious how to select appropriate homologous structure(s), or other sources of prior information, for restraint generation. After running extensive tests on a large data set of low-resolution models, the best-performing refinement protocols and strategies for the selection of homologous structures have been identified. These strategies and protocols have been implemented in the Low-Resolution Structure Refinement (LORESTR) pipeline. The pipeline performs auto-detection of twinning and selects the optimal scaling method and solvent parameters. LORESTR can either use user-supplied homologous structures, or run an automated BLAST search and download homologues from the PDB. The pipeline executes multiple model-refinement instances using different parameters in order to find the best protocol. Tests show that the automated pipeline improves R factors, geometry and Ramachandran statistics for 94% of the low-resolution cases from the PDB included in the test set. PMID:27710936
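The protocol search that LORESTR performs can be pictured as a small grid search over refinement settings, keeping the run with the lowest R-free. The sketch below is a hypothetical driver, not LORESTR's actual code: `run_refinement` is an invented stand-in with a toy scoring rule, and the parameter names are illustrative.

```python
import itertools

def run_refinement(model, restraints, jelly_body, solvent):
    """Invented stand-in for one refinement run: returns (R-work, R-free).
    The toy scoring just rewards extra restraints; a real driver would call
    out to a refinement engine."""
    r_work = 0.30 - 0.02 * jelly_body - 0.03 * (restraints == "homologue") - 0.01 * solvent
    return r_work, r_work + 0.04

def best_protocol(model):
    """Try every combination of settings and keep the run with the lowest R-free."""
    grid = itertools.product(["none", "homologue"], [0, 1], [0, 1])
    runs = []
    for restraints, jelly_body, solvent in grid:
        _, r_free = run_refinement(model, restraints, jelly_body, solvent)
        runs.append((r_free, restraints, jelly_body, solvent))
    return min(runs)

print(best_protocol("model.pdb"))  # the settings with the lowest toy R-free win
```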

  1. Potential of 80-kV high-resolution cone-beam CT imaging combined with an optimized protocol for neurological surgery

    Energy Technology Data Exchange (ETDEWEB)

    Kanayama, Seisaku; Hara, Takayuki [Toranomon Hospital, Department of Neurosurgery, Tokyo (Japan); Hamada, Yusuke [Toranomon Hospital, Department of Radiology, Tokyo (Japan); Matsumaru, Yuji [Toranomon Hospital, Department of Neuro-Endovascular Therapy, Tokyo (Japan)

    2014-11-05

    With the development of computed tomography (CT) and magnetic resonance imaging (MRI), the use of conventional X-ray angiography including digital subtraction angiography (DSA) for diagnosis has decreased, as it is an invasive technique with a risk of neurological complications. However, X-ray angiography imaging technologies have progressed markedly, along with the development of endovascular treatments. A newly developed angiography technique using cone-beam CT (CBCT) technology provides higher spatial resolution than conventional CT. Herein, we describe the potential of this technology for neurosurgical operations with reference to clinical cases. Two hundred twenty-five patients who received 80-kV high-resolution CBCT from July 2011 to June 2014 for preoperative examinations were included in this study. For pathognomonic cases, images were taken with suitable reconstruction modes and contrast protocols. Cases were compared with intraoperative findings or images from other modalities. We observed the following pathognomonic types: (1) imaging of the distal dural ring (DDR) and the surrounding structure for paraclinoid aneurysms, (2) imaging of thin blood vessels, and (3) imaging of both brain tumors and their surrounding anatomy. Our devised 80-kV high-resolution CBCT imaging system provided clear visualization of detailed anatomy when compared with other modalities in almost all cases. Only two cases provided poor visualization due to movement artifact. Eighty-kilovolt high-resolution CBCT has the potential to provide detailed anatomy for neurosurgical operations when utilizing suitable modes and contrast protocols. (orig.)

  2. Medium-/Long-Term Effects of a Specific Exercise Protocol Combined with Patient Education on Spine Mobility, Chronic Fatigue, Pain, Aerobic Fitness and Level of Disability in Fibromyalgia

    Directory of Open Access Journals (Sweden)

    Erika Giannotti

    2014-01-01

    Full Text Available Objective. To propose a rehabilitation protocol able to produce immediate and long-term beneficial effects on level of disability and overall performance in ADLs. Materials and Methods. Forty-one FM patients were randomized to an exercise and educational-behavioral programme group (experimental group, EG = 21) or to a control group (CG = 20). Each subject was evaluated before, at the end (T1), and after 6 months (T6) from the conclusion of the rehabilitation treatment using the Fibromyalgia Impact Questionnaire (FIQ), the visual analogue scale (VAS), the Health Assessment Questionnaire (HAQ), the fatigue severity scale (FSS), the 6-minute walking test (6MWT), tender points count (TPC), and spinal active range of motion. The exercise protocol included 20 sessions consisting of self-awareness, stretching, strengthening, spine flexibility, and aerobic exercises, which patients were subsequently educated to perform at home. Results. The two groups were comparable at baseline. At T1, the EG showed a positive trend in FIQ, VAS, HAQ, and FSS scales and significant improvement in 6MWT and in most spinal active range of motion measurements (P between 0.001 and 0.04). The positive results were maintained at the follow-up. Conclusion. The proposed programme was well tolerated and produced immediate and medium-term beneficial effects improving function and strain endurance. This trial is registered with DRKS00005071 on DRKS.

  3. Refining Visually Detected Object poses

    DEFF Research Database (Denmark)

    Holm, Preben; Petersen, Henrik Gordon

    2010-01-01

    Automated industrial assembly today requires that the 3D position and orientation (hereafter 'pose') of the objects to be assembled are known precisely. Today this precision is mostly established by a dedicated mechanical object alignment system. However, such systems are often dedicated...... that enables direct assembly. Conventional vision systems and laser triangulation systems can locate randomly placed known objects (with 3D CAD models available) with some accuracy, but not necessarily a good enough accuracy. In this paper, we present a novel method for refining the pose accuracy of an object...... that has been located based on the appearance as detected by a monocular camera. We illustrate the quality of our refinement method experimentally....

  4. Iterative Goal Refinement for Robotics

    Science.gov (United States)

    2014-06-01

    Iterative Goal Refinement for Robotics. Mark Roberts, Swaroop Vattam, Ronald Alford, Bryan Auslander, Justin Karneeb, Matthew Molineaux... robotics researchers and practitioners. We present a goal lifecycle and define a formal model for GR that (1) relates distinct disciplines concerning...researchers to collaborate in exploring this exciting frontier. 1. Introduction Robotic systems often act using incomplete models in environments

  5. Beneficial effects of short-term combination exercise training on diverse cognitive functions in healthy older people: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Nouchi Rui

    2012-10-01

    Full Text Available Abstract Background Results of previous studies have shown that exercise training can improve cognitive functions in healthy older people. Some studies have demonstrated that long-term combination exercise training can facilitate memory function improvement better than either aerobic or strength exercise training alone. Nevertheless, it remains unclear whether short-term combination exercise training can improve diverse cognitive functions in healthy older people or not. We investigate the effects of four weeks of short-term combination exercise training on various cognitive functions (executive functions, episodic memory, short-term memory, working memory, attention, reading ability, and processing speed) of healthy older people. Methods A single-blinded intervention with two parallel groups (combination exercise training; waiting list control) is used. Testers are blind to the study hypothesis and the participants’ group membership. Through an advertisement in a local newspaper, 64 healthy older adults are recruited and then assigned randomly to a combination exercise training group or a waiting list control group. Participants in the combination exercise training group must participate in the short-term combination exercise training (aerobic and strength exercise training) three days per week during the four weeks (12 workouts in total). The waiting list group does not participate in the combination exercise training. The primary outcome measure is the Stroop test score: a measure of executive function. Secondary outcome measures are assessments including the Verbal Fluency Task, Logical Memory, First and Second Names, Digit Span Forward, Digit Span Backward, Japanese Reading Test, Digit Cancellation Task, Digit Symbol Coding, and Symbol Search. We assess these outcome measures before and after the intervention. 
Discussion This report is the first of a study that investigates the beneficial effects of short-term combination exercise training on

  6. Refining the classification criteria for primary Sjogren syndrome

    NARCIS (Netherlands)

    Vissink, Arjan; Bootsma, Hendrika

    2017-01-01

    The 2016 ACR-EULAR classification criteria for primary Sjogren syndrome, which are intended to facilitate uniform classification of patients for enrolment in clinical studies, combine features of previous criteria sets. These new criteria are a step in the right direction, but further refinement would ...

  7. The management of cornea blindness from severe corneal scarring, with the Athens Protocol (transepithelial topography-guided PRK therapeutic remodeling, combined with same-day, collagen cross-linking)

    Directory of Open Access Journals (Sweden)

    Kanellopoulos AJ

    2012-02-01

    Full Text Available Anastasios John Kanellopoulos, Laservision.gr Institute, Athens, Greece; Manhattan Eye, Ear and Throat Hospital, New York, NY, USA; New York University Medical School, New York, NY, USA. Purpose: To evaluate the safety and efficacy of combined transepithelial topography-guided photorefractive keratectomy (PRK) therapeutic remodeling, combined with same-day collagen cross-linking (CXL). This protocol was used for the management of cornea blindness due to severe corneal scarring. Methods: A 57-year-old man had severe corneal blindness in both eyes. Both corneas had significant central scars attributed to a firework explosion 45 years ago, when the patient was 12 years old. Corrected distance visual acuity (CDVA) was 20/100 in both eyes (OU) with refraction: +4.00, –4.50 at 135° in the right eye and +3.50, –1.00 at 55° in the left. Respective keratometries were: 42.3, 60.4 at 17° and 35.8, 39.1 at 151.3°. Cornea transplantation was the recommendation by multiple cornea specialists as the treatment of choice. Prior to considering a transplant, we decided to employ the Athens Protocol (combined topography-guided partial PRK and CXL) in the right eye in February 2010 and in the left eye in September 2010. The treatment plan for both eyes was designed on the topography-guided WaveLight excimer laser platform. Results: Fifteen months after the right eye treatment, the right cornea had improved translucency and was topographically stable with uncorrected distance visual acuity (UDVA) 20/50 and CDVA 20/40 with refraction +0.50, –2.00 at 5°. We noted a similar outcome after similar treatment applied in the left eye, with UDVA 20/50 and CDVA 20/40 with –0.50, –2.00 at 170° at the 8-month follow-up. Conclusion: In this case, successful management of severe cornea abnormalities and scarring with the Athens Protocol may provide an effective alternative to other existing surgical or medical options. Keywords: Athens Protocol, collagen cross-linking

  8. The Refined Function-Behaviour-Structure Framework

    NARCIS (Netherlands)

    Diertens, B.

    2013-01-01

    We refine the function-behaviour-structure framework for design introduced by John Gero in order to deal with complexity. We do this by connecting the frameworks for the design of two models, one being the refinement of the other. The result is a refined framework for the design of an object on two levels.

  9. Histology protocols

    Directory of Open Access Journals (Sweden)

    CarloAlberto Redi

    2010-06-01

    Full Text Available Tim D. Hewitson & Ian A. Darby (Eds.), Humana Press, Totowa, New Jersey (USA). Series: Springer Protocols, Methods in Molecular Biology, Volume 611, 2010. Pages: 230; € 83.15. ISBN: 978-1-60327-344-2. Impressive as it may sound in an era in which biology sees a clear dominance of reductionism, with the idea that complexity can be disentangled more and more thanks to the use of molecular tools, the reader will remain fascinated by this slim and agile volume devoted to bringing together what are apparently two separate worlds: molecular biology and histology. Simply a reminder to the youngest scientists.....

  10. Inflation in a refined racetrack

    CERN Document Server

    Wen, Wen-Yu

    2007-01-01

    In this note, we refine the racetrack inflation model constructed in arXiv:hep-th/0406230 by including the open string modulus. This modulus encodes the embedding of our braneworld inside some Calabi-Yau throat. We argue that, in general, this open string modulus dynamically runs with the inflaton field thanks to its nonlinear coupling. A full analysis becomes difficult because the scalar potential changes progressively during the inflation epoch. Nevertheless, by explicit construction we are still able to build a realistic model through appropriate choices of the initial conditions.

  11. THE MASTER PROTOCOL CONCEPT

    Science.gov (United States)

    Allegra, Carmen J.

    2015-01-01

    During the past decade, biomedical technologies have undergone an explosive evolution: from the publication of the first complete human genome in 2003, after more than a decade of effort and at a cost of hundreds of millions of dollars, to the present time, when a complete genomic sequence can be available in less than a day and at a small fraction of the cost of the original sequence. The widespread availability of next-generation genomic sequencing has opened the door to the development of precision oncology. The need to test multiple new targeted agents, both alone and in combination with other targeted therapies as well as classic cytotoxic agents, demands the development of novel therapeutic platforms (particularly Master Protocols) capable of efficiently and effectively testing multiple targeted agents or targeted therapeutic strategies in relatively small patient subpopulations. Here, we describe the Master Protocol concept, with a focus on the expected gains and complexities of the use of this design. An overview of Master Protocols currently active or in development is provided, along with a more extensive discussion of the Lung Master Protocol (Lung-MAP study). PMID:26433553

  12. Analyzing the effect of routing protocols on media access control protocols in radio networks

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C. L. (Christopher L.); Drozda, M. (Martin); Marathe, A. (Achla); Marathe, M. V. (Madhav V.)

    2002-01-01

    We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well-known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR, and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured w.r.t. five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long-term fairness, and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topology and traffic situations. The result has an important implication: no combination of routing protocol and MAC protocol is the best over all situations. Also, the performance analysis of protocols at a given level in the protocol stack needs to be studied not locally in isolation but as a part of the complete protocol stack. A novel aspect of our work is the use of the statistical technique ANOVA (Analysis of Variance) to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.
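The ANOVA step mentioned in the abstract can be illustrated with a hand-rolled one-way F test: group latency samples by routing protocol and compare between-group to within-group variance. The latency figures below are invented for illustration; only the method mirrors the paper's approach.

```python
# One-way ANOVA by hand: does the routing protocol (AODV, DSR, LAR scheme 1)
# significantly affect mean packet latency under a fixed MAC protocol?
# The latency samples (in ms) are invented for illustration.

groups = {
    "AODV": [12.1, 13.4, 11.8, 12.9, 13.0],
    "DSR":  [14.2, 15.1, 13.8, 14.9, 14.4],
    "LAR1": [12.0, 12.5, 11.9, 12.3, 12.7],
}

def one_way_anova_f(groups):
    samples = list(groups.values())
    all_x = [x for g in samples for x in g]
    grand = sum(all_x) / len(all_x)
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in samples)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in samples for x in g)
    df_between = len(samples) - 1
    df_within = len(all_x) - len(samples)
    return (ss_between / df_between) / (ss_within / df_within)

f_stat = one_way_anova_f(groups)
# the F(2, 12) critical value at alpha = 0.05 is about 3.89
print(f"F = {f_stat:.1f}, significant: {f_stat > 3.89}")
```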

  13. Elliptic Solvers with Adaptive Mesh Refinement on Complex Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Phillip, B.

    2000-07-24

    Adaptive Mesh Refinement (AMR) is a numerical technique for locally tailoring the resolution of computational grids. Multilevel algorithms for solving elliptic problems on adaptive grids include the Fast Adaptive Composite grid method (FAC) and its parallel variants (AFAC and AFACx). Theory confirming that the convergence rates of FAC and AFAC are independent of the number of refinement levels exists under certain ellipticity and approximation property conditions. Similar theory needs to be developed for AFACx. The effectiveness of multigrid-based elliptic solvers such as FAC, AFAC, and AFACx on adaptively refined overlapping grids is not clearly understood. Finally, a non-trivial eye model problem will be solved by combining the power of using overlapping grids for complex moving geometries, AMR, and multilevel elliptic solvers.
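Local refinement of the kind AMR performs can be sketched in one dimension: recursively split any cell whose error indicator exceeds a tolerance. The sample function, indicator, and tolerance below are illustrative choices, not taken from FAC/AFAC.

```python
# 1-D sketch of adaptive refinement: bisect any cell whose second-difference
# error indicator exceeds a tolerance. Cells are (left, right, level) tuples.
import math

def refine(cells, f, tol, max_level=6):
    out = []
    for a, b, lvl in cells:
        mid = 0.5 * (a + b)
        err = abs(f(a) - 2.0 * f(mid) + f(b))   # local curvature estimate
        if err > tol and lvl < max_level:
            out += refine([(a, mid, lvl + 1), (mid, b, lvl + 1)], f, tol, max_level)
        else:
            out.append((a, b, lvl))
    return out

f = lambda x: math.tanh(20.0 * (x - 0.3))       # sharp layer near x = 0.3
mesh = refine([(0.0, 1.0, 0)], f, tol=1e-2)
# cells cluster around the steep layer; the smooth half stays coarse
print(len(mesh), max(lvl for _, _, lvl in mesh))
```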

  14. Treatment of locally advanced carcinomas of head and neck with intensity-modulated radiation therapy (IMRT in combination with cetuximab and chemotherapy: the REACH protocol

    Directory of Open Access Journals (Sweden)

    Simon Christian

    2010-11-01

    Full Text Available Abstract Background Primary treatment of carcinoma of the oro-/hypopharynx or larynx may consist of combined platinum-containing chemoradiotherapy. In order to improve clinical outcome (i.e. local control/overall survival), combined therapy is intensified by the addition of the EGFR inhibitor cetuximab (Erbitux®). Radiation therapy (RT) is carried out as intensity-modulated RT (IMRT) to avoid higher-grade acute and late toxicity by sparing of surrounding normal tissues. Methods/Design The REACH study is a prospective phase II study combining chemoradiotherapy with carboplatin/5-Fluorouracil (5-FU) and the monoclonal epidermal growth factor receptor (EGFR) antibody cetuximab (Erbitux®) as intensity-modulated radiation therapy in patients with locally advanced squamous-cell carcinomas of the oropharynx, hypopharynx or larynx. Patients receive weekly chemotherapy infusions in the 1st and 5th week of RT. Additionally, cetuximab is administered weekly throughout the treatment course. IMRT is delivered in a classical concomitant boost concept (bid) from fraction 16 to a total dose of 69.9 Gy. Discussion The primary endpoint of the trial is local-regional control (LRC). Disease-free survival, progression-free survival, overall survival, toxicity, proteomic and genomic analyses are secondary endpoints. The aim is to explore the efficacy as well as the safety and feasibility of this combined radioimmunochemotherapy in order to improve the outcome of patients with advanced head and neck cancer. Trial registration ISRCTN87356938

  15. phenix.ensemble_refinement: a test study of apo and holo BACE1

    NARCIS (Netherlands)

    Burnley, B.T.; Gros, P.

    2013-01-01

    phenix.ensemble_refinement (Burnley et al. 2012) combines molecular dynamics (MD) simulations with X-ray structure refinement to generate ensemble models fitted to diffraction data. It is an evolution of the 'time-averaging' method first proposed by Gros et al. in 1990 (Gros, van Gunsteren, and Hol)

  16. Rotary impeller refinement of 7075Al alloy

    Institute of Scientific and Technical Information of China (English)

    WANG Liping; GUO Erjun; HUANG Yongchang; LU Bin

    2009-01-01

    The effects of four parameters, gas flow, rotational speed, refining time, and stewing time, on the rotary impeller refinement of 7075 Al were studied. The effects of C2Cl6 refining, rotary impeller refining, and composite refining of 7075 Al alloy were compared with each other. The results showed that the greatest impact parameter of rotary impeller refinement was rotational speed, followed by gas flow, refining time, and stewing time. The optimum purification parameters obtained by orthogonal analysis were as follows: rotor speed of 400 r/min, inert gas flow of 0.4 mL/h, refining time of 15 min, and stewing time of 6 min. The best degassing effect can be obtained by the composite refining of C2Cl6 and rotary impeller. The degassing rates of C2Cl6, rotary impeller, and composite refining were 34.5%, 69.2%, and 78%, respectively. The mechanical properties of the specimen refined by rotary impeller were higher than those by C2Cl6 refining, but lower than those by composite refining.
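The orthogonal analysis described above amounts to computing main effects: average the response at each level of each factor and keep the level with the best mean. A minimal sketch follows, with invented runs and degassing rates; only the winning levels are chosen to echo the abstract's optimum.

```python
# Main-effects analysis for an orthogonal (Taguchi-style) experiment: average
# the response at each level of each factor and keep the best level.
# The runs and degassing rates below are invented for illustration.

runs = [
    # (speed r/min, gas flow mL/h, refining time min, stewing time min) -> degassing rate %
    ((300, 0.2, 10, 4), 52.0),
    ((300, 0.4, 15, 6), 61.0),
    ((400, 0.2, 15, 6), 68.0),
    ((400, 0.4, 10, 4), 66.0),
]

factors = ["speed", "gas_flow", "refine_time", "stew_time"]

def best_levels(runs):
    best = {}
    for i, name in enumerate(factors):
        totals = {}
        for setting, y in runs:
            totals.setdefault(setting[i], []).append(y)
        # the level with the highest mean response wins
        best[name] = max(totals, key=lambda lv: sum(totals[lv]) / len(totals[lv]))
    return best

print(best_levels(runs))
```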

  17. Zone refining of plutonium metal

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The purpose of this study was to investigate zone refining techniques for the purification of plutonium metal. The redistribution of 10 impurity elements from zone melting was examined. Four tantalum boats were loaded with plutonium impurity alloy, placed in a vacuum furnace, heated to 700°C, and held at temperature for one hour. Ten passes were made with each boat. Metallographic and chemical analyses performed on the plutonium rods showed that, after 10 passes, moderate movement of certain elements was achieved. Molten zone speeds of 1 or 2 inches per hour had no effect on impurity element movement. Likewise, the application of constant or variable power had no effect on impurity movement. The study implies that development of a zone refining process to purify plutonium is feasible. Development of a process will be hampered by two factors: (1) the effect of the oxide layer formed on the exposed surface of the material on impurity element redistribution is not understood, and (2) the tantalum container material is not inert in the presence of plutonium. Cold boat studies are planned, with higher temperature and vacuum levels, to determine the effect of these factors. 5 refs., 1 tab., 5 figs.
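The single-pass impurity redistribution that zone melting produces is classically described by Pfann's equation, C(x) = C0·(1 − (1 − k)·e^(−kx/L)), for a bar of initial concentration C0, effective segregation coefficient k, and zone length L. A small sketch with illustrative values (not measured plutonium data):

```python
# Single-pass zone-melting impurity profile (Pfann's equation):
#   C(x) = C0 * (1 - (1 - k) * exp(-k * x / L))
# where k is the effective segregation coefficient and L the zone length.
# The numbers below are illustrative only.
import math

def single_pass_profile(c0, k, zone_len, xs):
    return [c0 * (1.0 - (1.0 - k) * math.exp(-k * x / zone_len)) for x in xs]

c0, k, L = 100.0, 0.3, 1.0          # ppm, segregation coefficient, zone lengths
xs = [0.0, 1.0, 2.0, 4.0, 8.0]
profile = single_pass_profile(c0, k, L, xs)
# for k < 1 the impurity is swept toward the far end of the bar:
# C(0) = k * C0 and C(x) approaches C0 away from the starting end
print([round(c, 1) for c in profile])
```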

  18. Adaptive mesh refinement for shocks and material interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Dai, William Wenlong [Los Alamos National Laboratory

    2010-01-01

    There are three kinds of adaptive mesh refinement (AMR) in structured meshes. Block-based AMR sometimes over-refines meshes. Cell-based AMR treats cells individually and thus loses the advantage of the nature of structured meshes. Patch-based AMR is intended to combine the advantages of block- and cell-based AMR, i.e., the nature of structured meshes and sharp regions of refinement. But patch-based AMR has its own difficulties. For example, patch-based AMR typically cannot preserve symmetries of physics problems. In this paper, we present an approach for patch-based AMR for hydrodynamics simulations. The approach consists of clustering, symmetry preserving, mesh continuity, flux correction, communications, management of patches, and load balance. The special features of this patch-based AMR include symmetry preserving, efficiency of refinement across shock fronts and material interfaces, special implementation of flux correction, and patch management in parallel computing environments. To demonstrate the capability of the AMR framework, we show both two- and three-dimensional hydrodynamics simulations with many levels of refinement.

  19. Bayesian ensemble refinement by replica simulations and reweighting.

    Science.gov (United States)

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-28

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.
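The maximum-entropy reweighting at the core of the EROS-style formulation can be sketched for a single ensemble-averaged observable: perturb uniform weights as w_i ∝ exp(−λ·f_i) and solve for the multiplier λ that matches the experimental average. The observable values and target below are invented.

```python
# Maximum-entropy reweighting for one ensemble-averaged observable: weights
# w_i proportional to exp(-lam * f_i) minimally perturb the uniform ensemble
# while matching the target average. Observable values and target are invented.
import math

f = [1.0, 2.0, 3.0, 4.0, 5.0]   # observable computed on each structure
target = 2.5                     # experimental ensemble average

def reweighted_avg(lam):
    w = [math.exp(-lam * fi) for fi in f]
    return sum(wi * fi for wi, fi in zip(w, f)) / sum(w)

# reweighted_avg decreases monotonically in lam, so bisect for the multiplier
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if reweighted_avg(mid) > target:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
weights = [math.exp(-lam * fi) for fi in f]
z = sum(weights)
weights = [w / z for w in weights]
# structures with small f are upweighted to pull the average down to 2.5
```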

  20. Refinement in Z and Object-Z foundations and advanced applications

    CERN Document Server

    Derrick, John

    2013-01-01

    Refinement is one of the cornerstones of the formal approach to software engineering, and its use in various domains has led to research on new applications and generalisation. This book brings together this important research in one volume, with the addition of examples drawn from different application areas. It covers four main themes:
    - Data refinement and its application to Z
    - Generalisations of refinement that change the interface and atomicity of operations
    - Refinement in Object-Z
    - Modelling state and behaviour by combining Object-Z with CSP
    Refinement in Z and Object-Z: Foundations and Advanced Applications

  1. Unilever food safety assurance system for refined vegetable oils and fats

    Directory of Open Access Journals (Sweden)

    van Duijn Gerrit

    2010-03-01

    Full Text Available The Unilever Food Safety Assurance system for refined oils and fats is based on risk assessments for the presence of contaminants or pesticide residues in crude oils, and refining process studies to validate the removal of these components. Crude oil risk assessments were carried out by combining supply chain visits and analyses of the contaminant and pesticide residue levels in a large number of crude oil samples. Contaminants like polyaromatic hydrocarbons and hydrocarbons of mineral origin, and pesticide residues, can largely be removed by refining. For many years, this Food Safety Assurance system has proven to be effective in controlling contaminant levels in refined vegetable oils and fats.

  2. KoBaMIN: a knowledge-based minimization web server for protein structure refinement

    Science.gov (United States)

    Rodrigues, João P. G. L. M.; Levitt, Michael; Chopra, Gaurav

    2012-01-01

    The KoBaMIN web server provides an online interface to a simple, consistent and computationally efficient protein structure refinement protocol based on minimization of a knowledge-based potential of mean force. The server can be used to refine either a single protein structure or an ensemble of proteins starting from their unrefined coordinates in PDB format. The refinement method is particularly fast and accurate due to the underlying knowledge-based potential derived from structures deposited in the PDB; as such, the energy function implicitly includes the effects of solvent and the crystal environment. Our server allows for an optional but recommended step that optimizes stereochemistry using the MESHI software. The KoBaMIN server also allows comparison of the refined structures with a provided reference structure to assess the changes brought about by the refinement protocol. The performance of KoBaMIN has been benchmarked widely on a large set of decoys, including all models generated at the seventh worldwide experiment on the critical assessment of techniques for protein structure prediction (CASP7), and it was also shown to produce top-ranking predictions in the refinement category at both CASP8 and CASP9, yielding consistently good results across a broad range of model quality values. The web server is fully functional and freely available at http://csb.stanford.edu/kobamin. PMID:22564897
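Refinement by minimizing a knowledge-based potential can be caricatured with a toy harmonic pair potential whose minimum encodes a statistically preferred distance; gradient descent then relaxes a distorted chain toward it. Everything here (potential form, d0, step size) is an invented stand-in for KoBaMIN's actual potential of mean force.

```python
# Toy "knowledge-based" refinement: gradient descent on a harmonic pair
# potential U = sum_i (d_i - d0)^2 over successive distances d_i, driving a
# distorted 1-D chain toward the preferred spacing d0.

def refine(coords, d0=3.8, lr=0.05, steps=500):
    x = list(coords)
    for _ in range(steps):
        grad = [0.0] * len(x)
        for i in range(len(x) - 1):
            g = 2.0 * ((x[i + 1] - x[i]) - d0)   # dU/dd for one pair
            grad[i] -= g
            grad[i + 1] += g
        x = [xi - lr * gi for xi, gi in zip(x, grad)]
    return x

refined = refine([0.0, 2.0, 9.0, 10.0])          # distorted "backbone"
gaps = [b - a for a, b in zip(refined, refined[1:])]
print([round(g, 3) for g in gaps])   # all gaps relax toward d0 = 3.8
```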

  3. Autologous hematopoietic stem cell transplantation in combination with immunoablative protocol in secondary progressive multiple sclerosis: A 10-year follow-up of the first transplanted patient

    Directory of Open Access Journals (Sweden)

    Obradović Dragana

    2016-01-01

    Full Text Available Introduction. Multiple sclerosis (MS) is an immune-mediated disease of the central nervous system that affects young individuals and leads to severe disability. High-dose immunoablation followed by autologous hematopoietic stem cell transplantation (AHSCT) has been considered over the last 15 years as a potentially effective therapeutic approach for aggressive MS. The most recent long-term follow-up results suggest that AHSCT is not only effective for highly aggressive MS, but for relapsing-remitting MS as well, providing long-term remission, or maybe even cure. We present a 10-year follow-up of the first MS patient treated by immunoablation therapy and AHSCT. Case report. A 27-year-old male experienced the first symptoms, intermittent numbness and paresthesia of the arms and legs, which was treated for two years by a psychiatrist as an anxiety disorder. After he developed severe paraparesis he was admitted to the Neurology Clinic and diagnosed with MS. Our patient developed aggressive MS with frequent relapses, rapid disability progression and transition to the secondary progressive form 6 years after MS onset [Expanded Disability Status Scale (EDSS) 7.0, Ambulation Index (AI) 7]. AHSCT was performed; cyclophosphamide was used for hematopoietic stem cell mobilization and the BEAM protocol was used as the conditioning regimen. No major adverse events followed the AHSCT. Neurological impairment improved (EDSS 6.5, AI 6) and remained unchanged during a 10-year follow-up. Brain MRI follow-up showed the absence of gadolinium-enhancing lesions and a mild progression of brain atrophy. Conclusion. The patient with rapidly evolving, aggressive, noninflammatory MS initially improved and remained stable, without disability progression, for 10 years after AHSCT. This kind of treatment should be considered in aggressive MS, or in MS patients unresponsive to disease-modifying treatment, since appropriately timed AHSCT may not only prevent disability progression but reduce

  4. Phenotype in combination with genotype improves outcome prediction in acute myeloid leukemia: a report from Children's Oncology Group protocol AAML0531.

    Science.gov (United States)

    Voigt, Andrew P; Eidenschink Brodersen, Lisa; Alonzo, Todd A; Gerbing, Robert B; Menssen, Andrew J; Wilson, Elisabeth R; Kahwash, Samir; Raimondi, Susana C; Hirsch, Betsy A; Gamis, Alan S; Meshinchi, Soheil; Wells, Denise A; Loken, Michael R

    2017-09-07

Diagnostic biomarkers can be used to determine relapse risk in acute myeloid leukemia, and certain genetic aberrancies have prognostic relevance. A diagnostic immunophenotypic expression profile, which quantifies the amounts of distinct gene products, not just their presence or absence, was established to improve outcome prediction for patients with acute myeloid leukemia. The immunophenotypic expression profile, which defines each patient's leukemia as a location in 15-dimensional space, was generated for 769 patients enrolled in the Children's Oncology Group AAML0531 protocol. Unsupervised hierarchical clustering grouped patients with similar immunophenotypic expression profiles into eleven patient cohorts, demonstrating strong associations among phenotype, genotype, morphology, and outcome. Of 95 patients with inv(16), 79% segregated in Cluster A. Of 109 patients with t(8;21), 92% segregated in Clusters A and B. Of 152 patients with 11q23 alterations, 78% segregated in Clusters D, E, F, G, or H. For both inv(16) and 11q23 abnormalities, differential phenotypic expression identified patient groups with different survival characteristics (P<0.05). Clinical outcome analysis revealed that Cluster B (predominantly t(8;21)) was associated with favorable outcome (P<0.001) and Clusters E, G, H, and K were associated with adverse outcomes (P<0.05). Multivariable regression analysis revealed that Clusters E, G, H, and K were independently associated with worse survival (P range <0.001 to 0.008). The Children's Oncology Group AAML0531 trial is registered at www.clinicaltrials.gov as NCT00372593. Copyright © 2017, Ferrata Storti Foundation.

  5. ATRX mRNA expression combined with IDH1/2 mutational status and Ki-67 expression refines the molecular classification of astrocytic tumors: evidence from the whole transcriptome sequencing of 169 samples.

    Science.gov (United States)

    Cai, Jinquan; Yang, Pei; Zhang, Chuanbao; Zhang, Wei; Liu, Yanwei; Bao, Zhaoshi; Liu, Xing; Du, Wenzhong; Wang, Hongjun; Jiang, Tao; Jiang, Chuanlu

    2014-05-15

Astrocytic tumors are the most common primary brain tumors in adults. ATRX mutations have been identified in gliomas and are correlated with loss of ATRX expression, which causes alternative lengthening of telomeres (ALT) leading to genomic instability. In this study, we aimed to explore the role of ATRX mRNA expression alteration in the progression and subclassification of astrocytic tumors and examine its impact on clinical outcome. We investigated ATRX mRNA expression and its association with IDH1 and IDH2 mutations in 169 adult astrocytic tumors using whole transcriptome sequencing. In our cohort, low ATRX mRNA expression was detected in 68% of astrocytomas, 50% of anaplastic astrocytomas and 41.6% of glioblastomas. Low ATRX expression closely overlapped with mutations in IDH1/2, and an association between low ATRX expression and longer overall survival was identified in our cohort. ATRX combined with IDH1/2 and Ki-67 was used to re-classify patients with astrocytic tumors: group A1, containing IDH1/2 mutations and low ATRX expression, predicted a better prognostic outcome, whereas group A3, carrying wild-type IDH1/2 and high Ki-67 expression, had the shortest overall survival; IDH-mutant tumors with low ATRX expression and IDH-wild-type tumors with high Ki-67 expression were grouped into group A2. In summary, our results showed that ATRX in cooperation with IDH1/2 and Ki-67 defines three subgroups of astrocytic tumors regardless of the conventional WHO grade consensus. The molecular stratification in astrocytic tumors may aid in treatment strategy selection, therapeutic trial design, and clinical prognosis evaluation.

  6. Protein NMR structures refined with Rosetta have higher accuracy relative to corresponding X-ray crystal structures.

    Science.gov (United States)

    Mao, Binchen; Tejero, Roberto; Baker, David; Montelione, Gaetano T

    2014-02-05

We have found that refinement of protein NMR structures using Rosetta with experimental NMR restraints yields more accurate protein NMR structures than those that have been deposited in the PDB using standard refinement protocols. Using 40 pairs of NMR and X-ray crystal structures determined by the Northeast Structural Genomics Consortium, for proteins ranging in size from 5 to 22 kDa, restrained Rosetta-refined structures fit better to the raw experimental data, are in better agreement with their X-ray counterparts, and have better phasing power compared to conventionally determined NMR structures. For 37 proteins for which NMR ensembles were available and which had similar structures in solution and in the crystal, all of the restrained Rosetta-refined NMR structures were sufficiently accurate to be used for solving the corresponding X-ray crystal structures by molecular replacement. The protocol for restrained refinement of protein NMR structures was also compared with restrained CS-Rosetta calculations. For proteins smaller than 10 kDa, restrained CS-Rosetta, starting from extended conformations, provides slightly more accurate structures, while for proteins in the size range of 10-25 kDa the less CPU-intensive restrained Rosetta refinement protocols provided equally or more accurate structures. The restrained Rosetta protocols described here can improve the accuracy of protein NMR structures and should find broad and general utility for studies of protein structure and function.

  7. A phase II prospective study of the "Sandwich" protocol, L-asparaginase, cisplatin, dexamethasone and etoposide chemotherapy combined with concurrent radiation and cisplatin, in newly diagnosed, I/II stage, nasal type, extranodal natural killer/T-cell lymphoma.

    Science.gov (United States)

    Jiang, Ming; Zhang, Li; Xie, Li; Zhang, Hong; Jiang, Yu; Liu, Wei-Ping; Zhang, Wen-Yan; Tian, Rong; Deng, Yao-Tiao; Zhao, Sha; Zou, Li-Qun

    2017-07-25

Nasal-type, extranodal NK/T cell lymphoma (ENKTCL) is a special type of lymphoma with geographic and racial specificity. To date, the standard first-line treatment has not been unified. In our previous report, the "sandwich" protocol produced good results. Continuing with the "sandwich" mode, a new chemotherapy regimen composed of L-asparaginase, cisplatin, etoposide and dexamethasone (LVDP) plus concurrent chemoradiotherapy (CCRT) was conducted in more patients with newly diagnosed, stage I/II ENKTCL. Sixty-six patients were enrolled. The overall response rate was 86.4%, including 83.3% complete response and 3.0% partial remission. With a median follow-up of 23.5 months, 3-year overall survival and 3-year progression-free survival were 70.1% and 67.4%, respectively. The survival rate in stage II and extra-cavity stage I was significantly lower than that in limited stage I (p < 0.05). We therefore consider the "sandwich" mode worthy of being generalized, and LVDP combined with CCRT an effective protocol for stage I/II ENKTCL. However, this regimen was not suitable for all stage I/II patients and warrants investigation in larger, stratified samples. This study was a registered clinical trial with number ChiCTR-TNC-12002353.

  8. Antimicrobial photodynamic therapy combined with periodontal treatment for metabolic control in patients with type 2 diabetes mellitus: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Evangelista, Erika Elisabeth; França, Cristiane Miranda; Veni, Priscila; de Oliveira Silva, Tamires; Gonçalves, Rafael Moredo; de Carvalho, Verônica Franco; Deana, Alessandro Melo; Fernandes, Kristianne P S; Mesquita-Ferrari, Raquel A; Camacho, Cleber P; Bussadori, Sandra Kalil; Alvarenga, Letícia Heineck; Prates, Renato Araujo

    2015-05-27

The protocol for this trial was registered with ClinicalTrials.gov under registration number NCT01964833 on 14 October 2013.

  9. The intracarotid amobarbital procedure (Wada test) with two protocols combined, Montreal and Seattle

    Directory of Open Access Journals (Sweden)

    Paulo J. M. Leite

    1997-06-01

Full Text Available The intracarotid amobarbital procedure was carried out in 8 male and 7 female candidates for temporal lobectomy, and a female candidate for frontal lesionectomy, aged 18-50 (mean 32.5) years. Language and memory were tested after injection in each hemisphere. Both were assessed with the Montreal procedure; in 9 patients, language and memory were evaluated with the Seattle procedure as well. In 12 patients the left hemisphere was dominant for language; three had bilateral dominance. In 1 patient the Seattle procedure demonstrated the dominant hemisphere through relative slowness of speech during the drug effect in the left hemisphere. Memory was localized to the left hemisphere in 12 patients and to the right in 2; it was bilateral in 1, and in another lateralization was not possible. In 1 patient memory dominance was determined by the Montreal protocol alone because of lack of cooperation. These early results indicate that the methods may be complementary for determination of language and memory dominance in epilepsy surgery candidates.

  10. Adaptive mesh refinement in Titanium

    Energy Technology Data Exchange (ETDEWEB)

    Colella, Phillip; Wen, Tong

    2005-01-21

In this paper, we evaluate Titanium's usability as a high-level parallel programming language through a case study, where we implement a subset of Chombo's functionality in Titanium. Chombo is a software package applying the Adaptive Mesh Refinement methodology to numerical Partial Differential Equations at the production level. In Chombo, a library approach to parallel programming is used (C++ and Fortran, with MPI), whereas Titanium is a Java dialect designed for high-performance scientific computing. The performance of our implementation is studied and compared with that of Chombo in solving Poisson's equation based on two grid configurations from a real application. Counts of lines of code from both sides are also provided.
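
The refinement step at the heart of AMR can be illustrated in miniature: flag cells where the local jump in the solution exceeds a threshold, then refine only the flagged cells. This is a hedged sketch of the general block-structured AMR idea, not Chombo's or Titanium's actual API; the sample values and threshold are illustrative.

```python
# Minimal AMR-style flag-and-refine sketch on a 1-D grid.

def flag_cells(values, threshold):
    """Flag both neighbors of any face where the jump exceeds the threshold."""
    flags = [False] * len(values)
    for i in range(1, len(values)):
        if abs(values[i] - values[i - 1]) > threshold:
            flags[i - 1] = flags[i] = True
    return flags

def refine(values, flags):
    """Split each flagged cell into two children (piecewise-constant fill)."""
    return [child for v, f in zip(values, flags)
            for child in ([v, v] if f else [v])]

coarse = [0.0, 0.0, 0.1, 1.0, 1.0, 1.0]   # steep front between cells 2 and 3
flags = flag_cells(coarse, threshold=0.5)
fine = refine(coarse, flags)
```

Only the two cells straddling the front are refined, which is the cost-saving property that makes AMR attractive for problems with localized features.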

  11. Algorithm refinement for fluctuating hydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Sarah A.; Bell, John B.; Garcia, Alejandro L.

    2007-07-03

This paper introduces an adaptive mesh and algorithm refinement method for fluctuating hydrodynamics. This particle-continuum hybrid simulates the dynamics of a compressible fluid with thermal fluctuations. The particle algorithm is direct simulation Monte Carlo (DSMC), a molecular-level scheme based on the Boltzmann equation. The continuum algorithm is based on the Landau-Lifshitz Navier-Stokes (LLNS) equations, which incorporate thermal fluctuations into macroscopic hydrodynamics by using stochastic fluxes. It uses a recently developed solver for LLNS, based on third-order Runge-Kutta. We present numerical tests of systems in and out of equilibrium, including time-dependent systems, and demonstrate dynamic adaptive refinement by the computation of a moving shock wave. Mean system behavior and second moment statistics of our simulations match theoretical values and benchmarks well. We find that particular attention should be paid to the spectrum of the flux at the interface between the particle and continuum methods, specifically for the non-hydrodynamic (kinetic) time scales.
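
The stochastic-flux construction can be illustrated with a much simpler 1-D analog: a finite-volume diffusion update in which each face flux carries a zero-mean Gaussian fluctuation, so the scheme stays exactly conservative. This is a hedged sketch of the general idea only; it is not the paper's DSMC/LLNS hybrid or its third-order Runge-Kutta solver, and the grid size, time step, and noise amplitude are illustrative.

```python
import random

def step(u, diffusivity, dt, dx, noise_amp, rng):
    """One conservative finite-volume step with fluctuating face fluxes."""
    n = len(u)
    flux = [0.0] * (n + 1)  # faces 0..n; boundary fluxes stay zero
    for i in range(1, n):
        deterministic = -diffusivity * (u[i] - u[i - 1]) / dx
        # Stochastic flux: zero-mean Gaussian noise added at each face.
        flux[i] = deterministic + noise_amp * rng.gauss(0.0, 1.0)
    # What leaves one cell enters its neighbor, so the total is conserved.
    return [u[i] - dt / dx * (flux[i + 1] - flux[i]) for i in range(n)]

rng = random.Random(0)
u = [1.0] * 8
u_next = step(u, diffusivity=0.1, dt=0.01, dx=1.0, noise_amp=0.05, rng=rng)
```

Because the noise is applied to fluxes rather than to cell values directly, conservation holds to round-off even though individual cells fluctuate.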

  12. SILICON REFINING BY VACUUM TREATMENT

    Directory of Open Access Journals (Sweden)

    André Alexandrino Lotto

    2014-12-01

Full Text Available This work investigates phosphorus removal from metallurgical grade silicon (MG-Si, 98.5% to 99% Si) by vacuum treatment. Melting experiments were carried out in a vacuum induction furnace, varying parameters such as temperature, time, and the ratio of the area exposed to the vacuum to the volume of molten silicon. Chemical analyses were obtained by inductively coupled plasma (ICP) spectrometry and evaluated in light of the thermodynamic and kinetic aspects of the vaporization of phosphorus from silicon. Phosphorus was decreased from 33 to approximately 1.5 ppm after three hours of vacuum treatment, indicating that evaporation is the rate-controlling step for the temperature, pressure, and stirring conditions used, and that refining by this process is technically feasible.
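
If the evaporation-controlled removal behaves as a first-order process, C(t) = C0·exp(-k·(A/V)·t), the reported drop from 33 ppm to about 1.5 ppm over three hours implies an apparent rate constant that can be back-calculated. A minimal sketch, assuming first-order kinetics and folding the unspecified area-to-volume ratio into the apparent constant:

```python
import math

def apparent_rate_constant(c0_ppm, c_ppm, hours):
    """Back out k_app from C(t) = C0 * exp(-k_app * t), with A/V folded into k_app."""
    return math.log(c0_ppm / c_ppm) / hours

# Reported values from the abstract: 33 ppm -> ~1.5 ppm in 3 hours.
k_app = apparent_rate_constant(33.0, 1.5, 3.0)  # units: 1/hour
```

Under this assumption, each additional hour of treatment at the same conditions would remove roughly the same fraction of the remaining phosphorus.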

  13. Synergistic effect of combined transcranial direct current stimulation/constraint-induced movement therapy in children and young adults with hemiparesis: study protocol.

    Science.gov (United States)

    Gillick, Bernadette; Menk, Jeremiah; Mueller, Bryon; Meekins, Gregg; Krach, Linda E; Feyma, Timothy; Rudser, Kyle

    2015-11-12

Perinatal stroke occurs in more than 1 in 2,500 live births, and the resultant congenital hemiparesis necessitates investigation into interventions which may improve long-term function and decrease the burden of care beyond current therapies ( http://www.cdc.gov/ncbddd/cp/data.html ). Constraint-Induced Movement Therapy (CIMT) is recognized as an effective hemiparesis rehabilitation intervention. Transcranial direct current stimulation as an adjunct treatment to CIMT may potentiate neuroplastic responses and improve motor function. The methodology of a clinical trial in children, designed as a placebo-controlled, serial-session, non-invasive brain stimulation trial incorporating CIMT, is described here. The primary hypotheses are 1) that no serious adverse events will occur in children receiving non-invasive brain stimulation and 2) that children in the stimulation intervention group will show significant improvements in hand motor function compared to children in the placebo stimulation control group. A randomized, controlled, double-blinded clinical trial. Twenty children and/or young adults (ages 8-21) with congenital hemiparesis will be enrolled. The intervention group will receive ten 2-hour sessions of transcranial direct current stimulation combined with constraint-induced movement therapy and the control group will receive sham stimulation with CIMT. The primary outcome measure is safety assessment of transcranial direct current stimulation by physician evaluation, vital sign monitoring and symptom reports. Additionally, hand function will be evaluated using the Assisting Hand Assessment, grip strength and assessment of goals using the Canadian Occupational Performance Measure. Neuroimaging will confirm diagnoses, corticospinal tract integrity and cortical activation. Motor cortical excitability will also be examined using transcranial magnetic stimulation techniques.
Combining non-invasive brain stimulation and CIMT interventions has the potential to improve motor

  14. Combining Exergame Training with Omega-3 Fatty Acid Supplementation: Protocol for a Randomized Controlled Study assessing the Effect on Neuronal Structure/Function in the Elderly Brain

    Directory of Open Access Journals (Sweden)

    Alexandra Schättin

    2016-11-01

Full Text Available A common problem in the older population is the risk of falling, which might lead to injury, immobility, and reduced survival. Age-related neuronal changes, e.g., decline in gray- and white-matter, affect neuronal, cognitive, and motor functioning. The improvement of these factors might decrease fall events in the elderly. Studies showed that the sole administration of video game-based physical exercise, a so-called exergame, or of omega-3 fatty acid (FA) may improve motor and/or cognitive functioning through neuronal changes in the brain of older adults. The aim of this study is to assess the effects of a combination of exergame training with omega-3 FA supplementation on the elderly brain. We hypothesize that an intervention using a combination approach affects the neuronal structure and function of the elderly brain differently as compared to the sole administration of exergame training. The study is a parallel, double-blinded, randomized controlled trial lasting 26 weeks. Sixty autonomous living, non-smoking, and right-handed healthy older (>65 years) adults who live independently or in a senior residency are included, randomized, and allocated to one of two study groups. The experimental group receives a daily amount of 13.5 ml fish oil (including 2.9 g of omega-3 FA), whereas the control group receives a daily amount of 13.5 ml olive oil for 26 weeks. After 16 weeks, both groups start with an exergame training program three times per week. Measurements are performed at three time points by treatment-blinded investigators: pre-intervention measurements, a blood sample after 16 weeks, and post-intervention measurements. The main outcomes are motor evoked potentials of the right M. tibialis anterior (transcranial magnetic stimulation) and response-related potentials (electroencephalography) during a cognitive test. For secondary outcomes, reaction times during cognitive tests and spatio-temporal parameters during gait performance are measured. Statistics

  15. Technical Considerations for Filler and Neuromodulator Refinements

    Science.gov (United States)

    Wilson, Anthony J.; Chang, Brian L.; Percec, Ivona

    2016-01-01

    Background: The toolbox for cosmetic practitioners is growing at an unprecedented rate. There are novel products every year and expanding off-label indications for neurotoxin and soft-tissue filler applications. Consequently, aesthetic physicians are increasingly challenged by the task of selecting the most appropriate products and techniques to achieve optimal patient outcomes. Methods: We employed a PubMed literature search of facial injectables from the past 10 years (2005–2015), with emphasis on those articles embracing evidence-based medicine. We evaluated the scientific background of every product and the physicochemical properties that make each one ideal for specific indications. The 2 senior authors provide commentary regarding their clinical experience with specific technical refinements of neuromodulators and soft-tissue fillers. Results: Neurotoxins and fillers are characterized by unique physical characteristics that distinguish each product. This results in subtle but important differences in their clinical applications. Specific indications and recommendations for the use of the various neurotoxins and soft-tissue fillers are reviewed. The discussion highlights refinements in combination treatments and product physical modifications, according to specific treatment zones. Conclusions: The field of facial aesthetics has evolved dramatically, mostly secondary to our increased understanding of 3-dimensional structural volume restoration. Our work reviews Food and Drug Administration–approved injectables. In addition, we describe how to modify products to fulfill specific indications such as treatment of the mid face, décolletage, hands, and periorbital regions. Although we cannot directly evaluate the duration or exact physical properties of blended products, we argue that “product customization” is safe and provides natural results with excellent patient outcomes. PMID:28018778

  16. Essays on refining markets and environmental policy

    Science.gov (United States)

    Oladunjoye, Olusegun Akintunde

This thesis comprises three essays. The first two essays examine empirically the relationship between crude oil prices and wholesale gasoline prices in the U.S. petroleum refining industry, while the third essay determines the optimal combination of an emissions tax and an environmental research and development (ER&D) subsidy when firms organize ER&D either competitively or as a research joint venture (RJV). In the first essay, we estimate an error correction model to determine the effects of market structure on the speed of adjustment of wholesale gasoline prices to crude oil price changes. The results indicate that market structure does not have a strong effect on the dynamics of price adjustment in the three regional markets examined. In the second essay, we allow for inventories to affect the relationship between crude oil and wholesale gasoline prices by allowing them to affect the probability of regime change in a Markov-switching model of the refining margin. We find that low gasoline inventory increases the probability of switching from the low-margin regime to the high-margin regime and also increases the probability of staying in the high-margin regime. This is consistent with the predictions of the competitive storage theory. In the third essay, we extend Industrial Organization R&D theory to the determination of optimal environmental policies. We find that the RJV is socially desirable. In comparison to competitive ER&D, we suggest that regulators should encourage RJVs with a lower emissions tax and a higher subsidy, as these will lead to the coordination of ER&D activities and eliminate duplication of efforts while firms internalize their technological spillover externality.

  17. Technical Considerations for Filler and Neuromodulator Refinements.

    Science.gov (United States)

    Montes, José Raúl; Wilson, Anthony J; Chang, Brian L; Percec, Ivona

    2016-12-01

    Background: The toolbox for cosmetic practitioners is growing at an unprecedented rate. There are novel products every year and expanding off-label indications for neurotoxin and soft-tissue filler applications. Consequently, aesthetic physicians are increasingly challenged by the task of selecting the most appropriate products and techniques to achieve optimal patient outcomes. Methods: We employed a PubMed literature search of facial injectables from the past 10 years (2005-2015), with emphasis on those articles embracing evidence-based medicine. We evaluated the scientific background of every product and the physicochemical properties that make each one ideal for specific indications. The 2 senior authors provide commentary regarding their clinical experience with specific technical refinements of neuromodulators and soft-tissue fillers. Results: Neurotoxins and fillers are characterized by unique physical characteristics that distinguish each product. This results in subtle but important differences in their clinical applications. Specific indications and recommendations for the use of the various neurotoxins and soft-tissue fillers are reviewed. The discussion highlights refinements in combination treatments and product physical modifications, according to specific treatment zones. Conclusions: The field of facial aesthetics has evolved dramatically, mostly secondary to our increased understanding of 3-dimensional structural volume restoration. Our work reviews Food and Drug Administration-approved injectables. In addition, we describe how to modify products to fulfill specific indications such as treatment of the mid face, décolletage, hands, and periorbital regions. Although we cannot directly evaluate the duration or exact physical properties of blended products, we argue that "product customization" is safe and provides natural results with excellent patient outcomes.

  18. Combining Exergame Training with Omega-3 Fatty Acid Supplementation: Protocol for a Randomized Controlled Study Assessing the Effect on Neuronal Structure/Function in the Elderly Brain.

    Science.gov (United States)

    Schättin, Alexandra; de Bruin, Eling D

    2016-01-01

A common problem in the older population is the risk of falling and related injury, immobility, and reduced survival. Age-related neuronal changes, e.g., decline in gray- and white-matter, affect neuronal, cognitive, and motor functioning. The improvement of these factors might decrease fall events in the elderly. Studies showed that administration of video game-based physical exercise, a so-called exergame, or omega-3 fatty acid (FA) may improve motor and/or cognitive functioning through neuronal changes in the brain of older adults. The aim of this study is to assess the effects of a combination of exergame training with omega-3 FA supplementation on the elderly brain. We hypothesize that an intervention using a combination approach differently affects the neuronal structure and function of the elderly's brain as compared to the sole administration of exergame training. The study is a parallel, double-blinded, randomized controlled trial lasting 26 weeks. Sixty autonomous living, non-smoking, and right-handed healthy older (>65 years) adults who live independently or in a senior residency are included, randomized, and allocated to one of two study groups. The experimental group receives a daily amount of 13.5 ml fish oil (including 2.9 g of omega-3 FA), whereas the control group receives a daily amount of 13.5 ml olive oil for 26 weeks. After 16 weeks, both groups start with an exergame training program three times per week. Measurements are performed at three time points by treatment-blinded investigators: pre-intervention measurements, a blood sample after 16 weeks, and post-intervention measurements. The main outcomes are motor evoked potentials of the right M. tibialis anterior (transcranial magnetic stimulation) and response-related potentials (electroencephalography) during a cognitive test. For secondary outcomes, reaction time during cognitive tests and spatio-temporal parameters during gait performance are measured.
Statistics will include effect sizes and a

  19. Zone refining of cadmium and related characterization

    Indian Academy of Sciences (India)

    N R Munirathnam; D S Prasad; Ch Sudheer; J V Rao; T L Prakash

    2005-06-01

We present zone refining results for cadmium using a horizontal resistive zone refiner under a constant flow of moisture-free hydrogen gas. Boron impurity in cadmium can be avoided by using a quartz (GE 214 grade) boat in lieu of a high-purity graphite boat. Analytical results from inductively coupled plasma optical emission spectrometry (ICP-OES) show that the majority of the impurities are below detection limits. Comparatively, zinc is the most difficult impurity element to remove from the cadmium matrix by zone refining.
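
For a single molten-zone pass, the solute profile along the ingot is commonly described by Pfann's equation, C(x) = C0·[1 − (1 − k)·exp(−k·x/L)], where k is the effective distribution coefficient and L the zone length. Impurities whose k is close to 1 barely segregate and are therefore hard to remove, consistent with the observation above about zinc. A minimal sketch with an illustrative k, not a measured value for any impurity in cadmium:

```python
import math

def single_pass_profile(c0, k, zone_length, positions):
    """Pfann's single-pass zone-refining solute profile C(x)."""
    return [c0 * (1.0 - (1.0 - k) * math.exp(-k * x / zone_length))
            for x in positions]

# Illustrative parameters: initial concentration 1.0 (normalized), k = 0.5.
profile = single_pass_profile(c0=1.0, k=0.5, zone_length=1.0,
                              positions=[0.0, 1.0, 5.0])
```

At the starting end C(0) = k·C0, and the concentration rises back toward C0 along the ingot, which is why repeated passes are needed in practice.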

  20. Refined curve counting on complex surfaces

    OpenAIRE

    Göttsche, Lothar; Shende, Vivek

    2012-01-01

We define refined invariants which "count" nodal curves in sufficiently ample linear systems on surfaces, conjecture that their generating function is multiplicative, and conjecture explicit formulas in the case of K3 and abelian surfaces. We also give a refinement of the Caporaso-Harris recursion, and conjecture that it produces the same invariants in the sufficiently ample setting. The refined recursion specializes at y = -1 to the Itenberg-Kharlamov-Shustin recursion for Welschinger invariants.

  1. Treatment Protocols as Hierarchical Structures

    Science.gov (United States)

    Ben-Bassat, Moshe; Carlson, Richard W.; Puri, Vinod K.; Weil, Max Harry

    1978-01-01

We view a treatment protocol as a hierarchical structure of therapeutic modules. The lowest level of this structure consists of individual therapeutic actions. Combinations of individual actions define higher level modules, which we call routines. Routines are designed to manage limited clinical problems, such as the routine for fluid loading to correct hypovolemia. Combinations of routines and additional actions, together with comments, questions, or precautions organized in a branching logic, in turn, define the treatment protocol for a given disorder. Adoption of this modular approach may facilitate the formulation of treatment protocols, since the physician is not required to prepare complex flowcharts. This hierarchical approach also allows protocols to be updated and modified in a flexible manner. By use of such a standard format, individual components may be fitted together to create protocols for multiple disorders. The technique is suited for computer implementation. We believe that this hierarchical approach may facilitate standardization of patient care as well as aid in clinical teaching. A protocol for acute pancreatitis is used to illustrate this technique.
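
The hierarchy described above (actions at the bottom, routines built from actions, and a protocol built from routines plus branching logic) maps naturally onto a small data structure. A minimal sketch; the class names and the fluid-loading example are illustrative, not taken from the paper's actual implementation:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class Action:
    """Lowest level: a single therapeutic action."""
    description: str

@dataclass
class Routine:
    """Mid level: a named sequence of actions for one limited clinical problem."""
    name: str
    steps: List[Action] = field(default_factory=list)

@dataclass
class Protocol:
    """Top level: routines selected by branching logic over clinical findings."""
    disorder: str
    branches: List[Tuple[Callable[[Dict], bool], Routine]] = field(default_factory=list)

    def applicable_routines(self, findings: Dict) -> List[Routine]:
        return [routine for condition, routine in self.branches if condition(findings)]

fluid_loading = Routine("fluid loading", [
    Action("give 250 ml crystalloid bolus"),
    Action("reassess blood pressure"),
])
pancreatitis = Protocol("acute pancreatitis", branches=[
    (lambda f: f.get("hypovolemia", False), fluid_loading),
])

chosen = pancreatitis.applicable_routines({"hypovolemia": True})
```

Because routines are self-contained modules, the same `fluid_loading` routine can be reused in protocols for other disorders, which is exactly the update-and-reuse flexibility the abstract argues for.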

  2. Using Generalized Equivalent Uniform Dose Atlases to Combine and Analyze Prospective Dosimetric and Radiation Pneumonitis Data From 2 Non-Small Cell Lung Cancer Dose Escalation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Liu Fan; Yorke, Ellen D. [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Belderbos, Jose S.A.; Borst, Gerben R. [The Netherlands Cancer Institute, Antoni Van Leeuwenhoek Hospital, Amsterdam (Netherlands); Rosenzweig, Kenneth E. [Mount Sinai School of Medicine, New York, New York (United States); Lebesque, Joos V. [The Netherlands Cancer Institute, Antoni Van Leeuwenhoek Hospital, Amsterdam (Netherlands); Jackson, Andrew, E-mail: jacksona@mskcc.org [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States)

    2013-01-01

Purpose: To demonstrate the use of a generalized equivalent uniform dose (gEUD) atlas for data pooling in radiation pneumonitis (RP) modeling, to determine the dependence of RP on gEUD, to study the consistency between data sets, and to verify the increased statistical power of the combination. Methods and Materials: Patients enrolled in prospective phase I/II dose escalation studies of radiation therapy of non-small cell lung cancer at Memorial Sloan-Kettering Cancer Center (MSKCC) (78 pts) and the Netherlands Cancer Institute (NKI) (86 pts) were included; 10 (13%) and 14 (17%) experienced RP requiring steroids (RPS) within 6 months after treatment. gEUD was calculated from dose-volume histograms. Atlases for each data set were created using 1-Gy steps from exact gEUDs and RPS data. The Lyman-Kutcher-Burman model was fit to the atlas and exact gEUD data. Heterogeneity and inconsistency statistics for the fitted parameters were computed. gEUD maps of the probability of RPS rate ≥20% were plotted. Results: The 2 data sets were homogeneous and consistent. The best fit values of the volume effect parameter a were small, with upper 95% confidence limit around 1.0 in the joint data. The likelihood profiles around the best fit a values were flat in all cases, making determination of the best fit a weak. All confidence intervals (CIs) were narrower in the joint than in the individual data sets. The minimum P value for correlations of gEUD with RPS in the joint data was .002, compared with P=.01 and .05 for MSKCC and NKI data sets, respectively. gEUD maps showed that at small a, RPS risk increases with gEUD. Conclusions: The atlas can be used to combine gEUD and RPS information from different institutions and model gEUD dependence of RPS. RPS has a large volume effect with the mean dose model barely included in the 95% CI. Data pooling increased statistical power.
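
The gEUD used above is a power mean over the dose-volume histogram, gEUD = (Σᵢ vᵢ·Dᵢᵃ)^(1/a), where a is the volume-effect parameter: a = 1 recovers the mean dose (the large-volume-effect limit discussed in the conclusions), while large a approaches the maximum dose. A minimal sketch with illustrative DVH values, not trial data:

```python
def geud(doses, volume_fractions, a):
    """gEUD = (sum_i v_i * D_i**a) ** (1/a), normalizing v_i to sum to 1."""
    total = sum(volume_fractions)
    return sum(v / total * d**a for d, v in zip(doses, volume_fractions)) ** (1.0 / a)

# Illustrative differential DVH: dose bin centers (Gy) and fractional volumes.
doses = [5, 15, 25, 35, 45]
fractions = [0.40, 0.25, 0.15, 0.12, 0.08]

mean_dose = geud(doses, fractions, 1.0)   # a = 1: mean dose
near_max = geud(doses, fractions, 20.0)   # large a: weighted toward max dose
```

The small best-fit a reported above means RPS risk tracks a mean-dose-like quantity rather than the hottest spot of the dose distribution.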

  3. Combined effect of new complete dentures and simple dietary advice on nutritional status in edentulous patients: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Komagamine, Yuriko; Kanazawa, Manabu; Iwaki, Maiko; Jo, Ayami; Suzuki, Hiroyuki; Amagai, Noriko; Minakuchi, Shunsuke

    2016-11-09

    Individuals who are edentulous have a lower intake of fruit, vegetables, fiber, and protein compared with their dentate counterparts because tooth loss is accompanied by a decrease in ability to chew. Whether or not a combination of prosthetic rehabilitation and simple dietary advice produces improvement in dietary intake among edentulous persons is unclear. We aim to investigate the effect of a simultaneous combination of simple dietary advice delivered by dentists and provision of new complete dentures on dietary intake in edentulous individuals who request new dentures. Through a double-blinded, parallel, randomized controlled trial in which 70 edentate persons who request new complete dentures will be enrolled, eligible study participants will be randomly allocated to either a dietary intervention group receiving dietary advice or to a control group receiving only advice on the care and maintenance of dentures. Outcome measures include daily intake of nutrients and food items, assessed using a brief self-administered diet history questionnaire; antioxidant capacity, determined using blood and urine samples; nutritional status, assessed with the Mini-Nutritional Assessment-Short Form; oral health-related quality of life, assessed with the Japanese version of the Oral Health Impact Profile-EDENT and the Geriatric Oral Health Assessment Index; subjective chewing ability; masticatory performance, assessed using a color-changeable chewing gum and a gummy jelly; patient self-assessment of dentures; mild cognitive impairment, assessed with the Japanese version of the Montreal Cognitive Assessment; and functional capacity, assessed with the Japan Science and Technology Agency Index of Competence. Outcome measures, except for antioxidant capacity, are to be implemented at three time points: at baseline and at 3 and 6 months following intervention. Antioxidant capacity data are to be collected twice: at baseline and at 3 months following intervention. Differences

  4. Physiotherapy alone or in combination with corticosteroid injection for acute lateral epicondylitis in general practice: A protocol for a randomised, placebo-controlled study

    Directory of Open Access Journals (Sweden)

    Holmedal Øystein

    2009-12-01

    Full Text Available Abstract Background Lateral epicondylitis is a painful condition responsible for loss of function and sick leave for long periods of time. In many countries, treatment guidelines recommend a wait-and-see policy, reflecting that no conclusions on the best treatment can be drawn from the available research, published studies and meta-analyses. Methods/Design Randomized double-blind controlled clinical trial in a primary care setting. While earlier trials have either compared corticosteroid injections to physical therapy or to oral naproxen, we will compare the clinical effect of physiotherapy alone or physiotherapy combined with corticosteroid injection in the initial treatment of acute tennis elbow. Patients seeing their general practitioner with lateral elbow pain of recent onset will be randomised to one of three interventions: 1: physiotherapy, corticosteroid injection and naproxen; 2: physiotherapy, placebo injection and naproxen; or 3: wait-and-see treatment with naproxen alone. Treatment and assessments are done by two different doctors, and the contents of the injection are unknown to both the treating doctor and the patient. The primary outcome measure is the patient's evaluation of improvement after 6, 12, 26 and 52 weeks. Secondary outcome measures are pain, function and severity of the main complaint, pain-free grip strength, maximal grip strength, pressure-pain threshold, the patient's satisfaction with the treatment, and duration of sick leave. Conclusion This article describes a randomized, double-blind, controlled clinical trial with a one-year follow-up to investigate the effects of adding steroid injections to physiotherapy in acute lateral epicondylitis. Trial Registration ClinicalTrials.gov Identifier: NCT00826462

  5. Protein structure refinement by optimization.

    Science.gov (United States)

    Carlsen, Martin; Røgen, Peter

    2015-09-01

    Knowledge-based protein potentials are simplified potentials designed to improve the quality of protein models, which is important as more accurate models are more useful for biological and pharmaceutical studies. Consequently, knowledge-based potentials often are designed to be efficient in ordering a given set of deformed structures, denoted decoys, according to how close they are to the relevant native protein structure. This, however, does not necessarily imply that energy minimization of this potential will bring the decoys closer to the native structure. In this study, we introduce an iterative strategy to improve the convergence of decoy structures. It works by adding energy-optimized decoys to the pool of decoys used to construct the next and improved knowledge-based potential. We demonstrate that this strategy results in significantly improved decoy convergence on Titan high-resolution decoys and refinement targets from Critical Assessment of protein Structure Prediction competitions. Our potential is formulated in Cartesian coordinates and has a fixed backbone potential that restricts motions to be close to those of a dihedral model, a fixed hydrogen-bonding potential, and a variable coarse-grained carbon-alpha potential consisting of a pair potential and a novel solvent potential, both B-spline based, as we use explicit gradients and Hessians for efficient energy optimization.
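    The iterative strategy can be caricatured in one dimension: the "native" sits at x = 0, decoys are scalars, the knowledge-based potential is a quadratic fit to distance-to-native scores, and energy-minimized decoys are fed back into the training pool. Everything below (the quadratic form, the gradient-descent settings) is a stand-in for the paper's Cartesian-coordinate potential, not its actual functional form:

```python
def fit_potential(decoys, targets):
    """Least-squares fit of a toy potential E(x) = w2*x**2 + w1*x,
    trained so that energies track the target scores (distance to native).
    Solves the 2x2 normal equations directly."""
    s4 = sum(x ** 4 for x in decoys)
    s3 = sum(x ** 3 for x in decoys)
    s2 = sum(x ** 2 for x in decoys)
    b2 = sum(x ** 2 * t for x, t in zip(decoys, targets))
    b1 = sum(x * t for x, t in zip(decoys, targets))
    det = s4 * s2 - s3 * s3
    return (s2 * b2 - s3 * b1) / det, (s4 * b1 - s3 * b2) / det

def minimize(x, w2, w1, lr=0.1, steps=200):
    """Gradient descent on E(x) = w2*x**2 + w1*x."""
    for _ in range(steps):
        x -= lr * (2.0 * w2 * x + w1)
    return x

def refine_pool(pool, rounds=3):
    """The paper's loop in caricature: refit the potential, energy-minimize
    each decoy, and feed the optimized decoys back into the pool."""
    for _ in range(rounds):
        w2, w1 = fit_potential(pool, [abs(x) for x in pool])  # native at 0
        pool = pool + [minimize(x, w2, w1) for x in pool]
    return pool
```

    On a symmetric decoy set the fitted well bottoms out near the native, so each round's minimized decoys land closer to x = 0 than the decoys they came from.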

  6. Model Checking Linearizability via Refinement

    Science.gov (United States)

    Liu, Yang; Chen, Wei; Liu, Yanhong A.; Sun, Jun

    Linearizability is an important correctness criterion for implementations of concurrent objects. Automatic checking of linearizability is challenging because it requires checking that 1) all executions of concurrent operations be serializable, and 2) the serialized executions be correct with respect to the sequential semantics. This paper describes a new method to automatically check linearizability based on refinement relations from abstract specifications to concrete implementations. Our method avoids the often difficult task of determining linearization points in implementations, but can also take advantage of linearization points if they are given. The method exploits model checking of finite state systems specified as concurrent processes with shared variables. Partial order reduction is used to effectively reduce the search space. The approach is built into a toolset that supports a rich set of concurrent operators. The tool has been used to automatically check a variety of implementations of concurrent objects, including the first algorithms for the mailbox problem and scalable NonZero indicators. Our system was able to find all known and injected bugs in these implementations.
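    Refinement from an abstract specification to a concrete implementation can be illustrated with the simplest member of the family: trace refinement against a deterministic spec, checked by synchronized reachability. This is a toy stand-in for the tool's richer operators and partial-order reduction, not the paper's algorithm:

```python
from collections import deque

def refines(impl, impl_init, spec, spec_init):
    """Check that every trace of `impl` is a trace of `spec`.

    impl: state -> list of (label, successor)   (may be nondeterministic)
    spec: state -> {label: successor}           (assumed deterministic,
          which makes trace inclusion a simple product reachability check)
    """
    seen = {(impl_init, spec_init)}
    work = deque(seen)
    while work:
        i, s = work.popleft()
        for label, i2 in impl.get(i, []):
            if label not in spec.get(s, {}):
                return False          # impl takes a step the spec forbids
            pair = (i2, spec[s][label])
            if pair not in seen:
                seen.add(pair)
                work.append(pair)
    return True

# Demo: a lock whose spec demands strict acquire/release alternation.
SPEC = {'free': {'acquire': 'held'}, 'held': {'release': 'free'}}
GOOD = {'free': [('acquire', 'held')], 'held': [('release', 'free')]}
BAD = {'free': [('acquire', 'held'), ('release', 'free')],  # spurious release
       'held': [('release', 'free')]}
```

    Checking linearizability as in the paper additionally requires relating concurrent histories to the sequential spec, but the core machinery, exploring a product of implementation and specification states, has this shape.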

  7. A web delivered intervention for depression combining Behavioural Activation with physical activity promotion: study protocol for a pilot randomised controlled trial

    Directory of Open Access Journals (Sweden)

    Jeffrey David Lambert

    2015-10-01

    Full Text Available Background: Physical activity (PA) yields moderate effect sizes for treating depression (Cooney et al., 2013). PA may also help reduce depressive relapse, providing additional psychological benefits such as positive self-regard and a sense of competence (Babyak et al., 2000). Behavioural Activation (BA) is an evidence-based psychological therapy for depression, which aims to get people more engaged with activities that provide positive reinforcement for non-depressed behaviours (Hopko, Lejuez, LePage, Hopko, & McNeil, 2003). The structured nature of BA is consistent with the use of good behaviour change techniques (specific goal-setting, self-regulation), offering a potential platform for promoting PA alongside depression treatment. BA may also be useful for gradually increasing PA in people who are more sedentary than the general population. Aims: This pilot randomised controlled trial aims to assess the feasibility, acceptability, and fidelity of a web-delivered intervention combining BA and PA (eBAcPAc) to enhance mental and physical health, and to assess the trial methods. Method: A community sample of 120 people exhibiting symptoms of depression and participating in less than 150 minutes of moderate to vigorous physical activity per week will be randomized to receive eBAcPAc or to a wait-list control group. eBAcPAc is informed by previous work (Farrand et al., 2014; Pentecost et al., 2015) and further developed using the Centre for eHealth Research and Disease management Roadmap (CeHReS) (van Gemert-Pijnen et al., 2011) in order to be applied in a web-based setting. A platform hosted by the University of Glasgow, which has been used to deliver a wide range of successful web-delivered interventions for mental health, will be used to deliver eBAcPAc. Feasibility measures will include data on recruitment, attrition and acceptability. Pre-post outcome measures will include the PHQ-9, and self-reported and accelerometer-measured PA. Process and

  8. Preventing relapse in recurrent depression using mindfulness-based cognitive therapy, antidepressant medication or the combination: trial design and protocol of the MOMENT study

    Directory of Open Access Journals (Sweden)

    Huijbers Marloes J

    2012-08-01

    Full Text Available Abstract Background Depression is a common psychiatric disorder characterized by a high rate of relapse and recurrence. The most commonly used strategy to prevent relapse/recurrence is maintenance treatment with antidepressant medication (mADM). Recently, it has been shown that Mindfulness-Based Cognitive Therapy (MBCT) is at least as effective as mADM in reducing the relapse/recurrence risk. However, it is not yet known whether combination treatment of MBCT and mADM is more effective than either of these treatments alone. Given the fact that most patients have a preference for either mADM or for MBCT, the aim of the present study is to answer the following questions. First, what is the effectiveness of MBCT in addition to mADM? Second, how large is the risk of relapse/recurrence in patients withdrawing from mADM after participating in MBCT, compared to those who continue to use mADM after MBCT? Methods/design Two parallel-group, multi-center randomized controlled trials are conducted. Adult patients with a history of depression (3 or more episodes), currently either in full or partial remission and currently treated with mADM (6 months or longer), are recruited. In the first trial, we compare mADM on its own with mADM plus MBCT. In the second trial, we compare MBCT on its own, including tapering of mADM, with mADM plus MBCT. Follow-up assessments are administered at 3-month intervals for 15 months. Primary outcome is relapse/recurrence. Secondary outcomes are time to, duration and severity of relapse/recurrence, quality of life, personality, several process variables, and incremental cost-effectiveness ratio. Discussion Taking into account patient preferences, this study will provide information about (a) the clinical and cost-effectiveness of mADM only compared with mADM plus MBCT, in patients with a preference for mADM, and (b) the clinical and cost-effectiveness of withdrawing from mADM after MBCT, compared with mADM plus MBCT, in patients with a

  9. Refinement Checking on Parametric Modal Transition Systems

    DEFF Research Database (Denmark)

    Benes, Nikola; Kretínsky, Jan; Larsen, Kim Guldstrand

    2015-01-01

    Modal transition systems (MTS) is a well-studied specification formalism of reactive systems supporting a step-wise refinement methodology. Despite its many advantages, the formalism as well as its currently known extensions are incapable of expressing some practically needed aspects in the refin...

  10. Refined large N duality for torus knots

    DEFF Research Database (Denmark)

    Nawata, Satoshi; Kameyama, Masaya

    We formulate large N duality of U(N) refined Chern-Simons theory with a torus knot/link in S³. By studying refined BPS states in M-theory, we provide the explicit form of low-energy effective actions of Type IIA string theory with D4-branes on the Ω-background. This form enables us to relate...

  12. Refined Black Hole Ensembles and Topological Strings

    CERN Document Server

    Aganagic, Mina

    2012-01-01

    We formulate a refined version of the Ooguri-Strominger-Vafa (OSV) conjecture. The original OSV conjecture, Z_{BH} = |Z_{top}|^2, relates the BPS black hole partition function to the topological string partition function Z_{top}. In the refined conjecture, Z_{BH} is the partition function of BPS black holes counted with spin, or more precisely the protected spin character. Z_{top} becomes the partition function of the refined topological string, which is itself an index. Both the original and the refined conjectures are examples of large N duality in the 't Hooft sense. The refined conjecture applies to non-compact Calabi-Yau manifolds only, so the black holes are really BPS particles with large entropy, of order N^2. The refined OSV conjecture states that the refined BPS partition function has a large N dual which is captured by the refined topological string. We provide evidence that the conjecture holds by studying local Calabi-Yau threefolds consisting of line bundles over a genus g Riemann surface. We show that...

  14. Body surface area adapted iopromide 300 mg/ml versus 370 mg/ml contrast medium injection protocol: Influence on quantitative and clinical assessment in combined PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Verburg, Frederik A., E-mail: fverburg@ukaachen.de [RWTH Aachen University Hospital, Department of Nuclear Medicine, Pauwelsstraße 30, 52074 Aachen (Germany); Maastricht University Medical Center, Department of Nuclear Medicine, P. Debyelaan 25, 6229 HX Maastricht (Netherlands); Apitzsch, Jonas [RWTH Aachen University Hospital, Department of Diagnostic and Interventional Radiology, Pauwelsstraße 30, 52074 Aachen (Germany); Lensing, Carina [RWTH Aachen University Hospital, Department of Nuclear Medicine, Pauwelsstraße 30, 52074 Aachen (Germany); Kuhl, Christiane K. [RWTH Aachen University Hospital, Department of Diagnostic and Interventional Radiology, Pauwelsstraße 30, 52074 Aachen (Germany); Pietsch, Hubertus [Bayer Pharma AG, Müllerstrasse 178, 13353 Berlin (Germany); Mottaghy, Felix M. [RWTH Aachen University Hospital, Department of Nuclear Medicine, Pauwelsstraße 30, 52074 Aachen (Germany); Maastricht University Medical Center, Department of Nuclear Medicine, P. Debyelaan 25, 6229 HX Maastricht (Netherlands); Behrendt, Florian F. [RWTH Aachen University Hospital, Department of Nuclear Medicine, Pauwelsstraße 30, 52074 Aachen (Germany)

    2013-12-01

    Purpose: To investigate the quantitative and qualitative differences between combined positron emission tomography and computed X-ray tomography (PET/CT) enhanced with contrast medium at an iodine concentration of either 300 mg/ml or 370 mg/ml. Materials and methods: 120 consecutive patients scheduled for F-18-fluorodeoxyglucose (FDG) PET/CT were included. The first (second) 60 patients received contrast medium with 300 (370) mg iodine/ml. Intravenous injection protocols were adapted for an identical iodine delivery rate (1.3 g/s) and body surface area (BSA) adapted iodine dose (22.26 g I/m²). Maximum and mean standardized uptake values (SUVmax; SUVmean) and contrast enhancement (HU) were determined in the ascending aorta, the abdominal aorta, the inferior vena cava, the portal vein, the liver and the right kidney in the venous contrast medium phase. PET data were evaluated visually for the presence of malignancy and for image quality. Results: Both media caused significantly higher values of HU, SUVmean and SUVmax in the enhanced PET/CT than in the non-enhanced one (all p < 0.01). There were no significant differences in the degree of increase of HU, SUVmean and SUVmax between the two contrast media at any anatomic site (all p > 0.05). Visual evaluation of lesions showed no differences between contrast and non-contrast PET/CT or between the two different contrast media (p = 0.77). Conclusion: When using a constant iodine delivery rate and total iodine dose in a BSA-adapted injection protocol, there are no quantitative or qualitative differences in either CT or PET between contrast media with iodine concentrations of 300 mg/ml and 370 mg/ml, respectively.
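    The "identical iodine delivery rate and BSA-adapted dose" constraint fixes both the pump rate and the injected volume once a concentration is chosen. A sketch of that arithmetic (the function name is illustrative; the 1.3 g iodine/s default is assumed here as the clinically typical scale):

```python
def injection_protocol(conc_mg_per_ml, bsa_m2,
                       iodine_dose_g_per_m2=22.26,
                       iodine_rate_g_per_s=1.3):
    """Flow rate (ml/s) and total volume (ml) for a BSA-adapted injection.

    The 22.26 g iodine per m^2 BSA dose matches the protocol above;
    the delivery-rate default is an assumption for illustration.
    """
    conc_g_per_ml = conc_mg_per_ml / 1000.0
    flow_ml_per_s = iodine_rate_g_per_s / conc_g_per_ml
    volume_ml = iodine_dose_g_per_m2 * bsa_m2 / conc_g_per_ml
    return flow_ml_per_s, volume_ml
```

    The 300 mg/ml medium is simply pushed faster and in larger volume than the 370 mg/ml medium, so that grams of iodine per second and per patient come out identical, which is why the two arms are comparable.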

  15. Evaluation of Enrichment Protocols for Bacterial Endosymbionts of Ciliates by Real-Time PCR.

    Science.gov (United States)

    Castelli, Michele; Lanzoni, Olivia; Rossi, Leonardo; Potekhin, Alexey; Schrallhammer, Martina; Petroni, Giulio

    2016-06-01

    Large-scale studies on obligate bacterial endosymbionts may frequently require preliminary purification and enrichment protocols, which are often elaborate to set up and to evaluate, especially if the host organism is a protist. The purpose of this study was to develop a real-time PCR-based strategy and employ it to assess two such enrichment protocols for Holospora caryophila, hosted by the ciliate Paramecium. Four SSU rRNA gene-targeted real-time PCR assays were designed, which made it possible to compare the amount of H. caryophila to that of other organisms, namely the host, its food bacterium (Raoultella planticola), and free-living bacteria present in the culture medium. By using the real-time PCR assays in combination, it was possible to conclude that the "cell fractionation" protocol was quite successful in enriching the symbiont, while the "Percoll gradient" protocol will need further refinement to be fully repeatable. The proposed approach has the potential to facilitate and encourage future studies in the still underexplored field of bacterial endosymbionts of ciliates and other protists. It can also find valuable applications for experimental questions other than those tested, such as fast and precise assessment of symbiont abundance in natural populations and comparison among multiple coexisting symbionts.
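    Comparing the amount of one organism to another from real-time PCR data typically goes through Delta-Ct arithmetic: each amplification cycle multiplies the template, so a lower threshold cycle means more starting material. A hedged sketch of how an enrichment factor could be derived from such assays (function names and the normalization scheme are illustrative, not the paper's exact procedure):

```python
def relative_abundance(ct_target, ct_reference, efficiency=2.0):
    """Target-to-reference ratio from threshold cycles (Ct).

    `efficiency` is the per-cycle amplification factor
    (2.0 = perfect doubling); lower Ct means more template.
    """
    return efficiency ** (ct_reference - ct_target)

def enrichment_factor(ct_sym_before, ct_host_before,
                      ct_sym_after, ct_host_after):
    """Fold-change of the symbiont/host ratio across a protocol step."""
    before = relative_abundance(ct_sym_before, ct_host_before)
    after = relative_abundance(ct_sym_after, ct_host_after)
    return after / before
```

    For example, a symbiont assay that moves from 2 cycles behind the host assay to 2 cycles ahead of it corresponds to a 16-fold enrichment (assuming perfect doubling).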

  16. North Dakota Refining Capacity Study

    Energy Technology Data Exchange (ETDEWEB)

    Dennis Hill; Kurt Swenson; Carl Tuura; Jim Simon; Robert Vermette; Gilberto Marcha; Steve Kelly; David Wells; Ed Palmer; Kuo Yu; Tram Nguyen; Juliam Migliavacca

    2011-01-05

    According to a 2008 report issued by the United States Geological Survey, North Dakota and Montana have an estimated 3.0 to 4.3 billion barrels of undiscovered, technically recoverable oil in an area known as the Bakken Formation. Given the size and remoteness of the discovery, the question became: can a business case be made for increasing refining capacity in North Dakota? And, if so, what is the impact on existing players in the region? To answer the question, a study committee composed of leaders in the region's petroleum industry was brought together to define the scope of the study, hire a consulting firm and oversee the study. The study committee met frequently to provide input on the findings and modify the course of the study as needed. The study concluded that Petroleum Administration for Defense District II (PADD II) has an oversupply of gasoline. With that in mind, a niche market, naphtha, was identified. Naphtha is used as a diluent for pipelining bitumen (heavy crude) from Canada to crude markets. The study predicted there will continue to be an increase in demand for naphtha through 2030. The study estimated the optimal configuration for the refinery at 34,000 barrels per day (BPD), producing 15,000 BPD of naphtha with a 52 percent refinery charge for jet and diesel yield. The financial modeling assumed the sponsor of the refinery would invest its own capital to pay for construction costs. Under this assumption, the internal rate of return is 9.2 percent, which is not sufficient to attract traditional investment given the risk factor of the project. Those interested in pursuing this niche market will therefore need to identify incentives to improve the rate of return.
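    The 9.2 percent figure is an internal rate of return on projected cash flows, i.e. the discount rate at which the project's net present value is zero. The computation is plain root-finding; a sketch with hypothetical cash flows (the numbers below are illustrative, not the study's model):

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows (index 0 = today)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection.

    Assumes the usual investment profile (one sign change in the
    cash flows), so NPV is decreasing in the rate on [lo, hi].
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

    A hypothetical plant costing 500 (in some unit) and returning 55 per year for 20 years yields an IRR of roughly 9 percent, illustrating why a return in that range may not clear a risk-adjusted hurdle rate without incentives.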

  17. Time Critical Isosurface Refinement and Smoothing

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, V.; Bajaj, C.L.

    2000-07-10

    Multi-resolution data-structures and algorithms are key in Visualization to achieve real-time interaction with large data-sets. Research has primarily focused on the off-line construction of such representations, mostly using decimation schemes. Drawbacks of this class of approaches include: (i) the inability to maintain interactivity when the displayed surface changes frequently, and (ii) the inability to control the global geometry of the embedding (no self-intersections) of any approximated level of detail of the output surface. In this paper we introduce a technique for on-line construction and smoothing of progressive isosurfaces. Our hybrid approach combines the flexibility of a progressive multi-resolution representation with the advantages of a recursive subdivision scheme. Our main contributions are: (i) a progressive algorithm that builds a multi-resolution surface by successive refinements, so that a coarse representation of the output is generated as soon as a coarse representation of the input is provided; (ii) application of the same scheme to smooth the surface by means of a 3D recursive subdivision rule; and (iii) a multi-resolution representation in which any adaptively selected level-of-detail surface is guaranteed to be free of self-intersections.
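    The flavor of "smoothing by a recursive subdivision rule" can be shown in 2D with Chaikin corner cutting, a classic subdivision scheme for polylines. The paper's rule operates on 3D isosurface meshes; this is only an analogue of the idea that each refinement step simultaneously adds resolution and smooths:

```python
def chaikin(points, levels=1):
    """Recursive corner-cutting subdivision of an open polyline.

    Each level replaces every edge (p, q) with the two points
    0.75*p + 0.25*q and 0.25*p + 0.75*q, so the curve gains vertices
    and loses sharp corners at the same time.
    """
    for _ in range(levels):
        refined = [points[0]]                    # keep open-curve endpoints
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        refined.append(points[-1])
        points = refined
    return points
```

    Because each level only reads the previous level's vertices, a coarse smoothed output is available as soon as coarse input exists, the same progressive property the paper exploits.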

  18. Grain refinement of AZ31 magnesium alloy by electromagnetic stirring under effect of grain-refiner

    Indian Academy of Sciences (India)

    S Y Gao; Q C Le; Z Q Zhang; J Z Cui

    2012-08-01

    The effects of electromagnetic stirring and an Al4C3 grain refiner on the grain refinement of semicontinuously cast AZ31 magnesium alloy were discussed in this investigation. The results indicate that electromagnetic stirring has an effective refining effect on the grain size of AZ31 magnesium alloy in the presence of the Al4C3 grain refiner. Electromagnetic stirring can 'activate' the Al4C3 particles, resulting in more heterogeneous nucleation sites for the primary α-Mg grains. However, a longer holding time can 'deactivate' the Al4C3 particles and poison the grain refining effect.

  19. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation of communication protocols. With it, partners can suggest a new protocol by sending its specification. After formally verifying the specification, each partner generates an implementation, which can then be used for establishing communication. We also present a practical realisation of the Protocol Implementation Generator framework based on the LySatool and a translator from the LySa language into C or Java.

  20. Bipolar Neutrosophic Refined Sets and Their Applications in Medical Diagnosis

    OpenAIRE

    Deli, Irfan; Yusuf, Şubaş

    2016-01-01

    This paper proposes concept of bipolar neutrosophic refined set and its some operations. Firstly, a score certainty and accuracy function to compare the bipolar neutrosophic refined information is defined. Secondly, to aggregate the bipolar neutrosophic refined information, a bipolar neutrosophic refined weighted average operator and a bipolar neutrosophic refined weighted geometric operator is developed.

  2. Topological coding and its application in the refinement of SIFT.

    Science.gov (United States)

    Liu, Run Zong; Tang, Yuan Yan; Fang, Bin

    2014-11-01

    Point pattern matching plays a prominent role in the fields of computer vision and pattern recognition. A technique combining circular onion peeling and radial decomposition is proposed to analyze the topological structure of a point pattern. The analysis derives a feature which records the topological structure of the point pattern. This novel feature is free from the isometric assumption. It can resist, to some degree, various deformations such as adding points, suppressing points, affine transformations, projective transformations and elastic transformations. A refinement solution for the well-known scale-invariant feature transform (SIFT) algorithm is also proposed, based on a probabilistic analysis of this feature. Experimental results show that the proposed refinement solution for SIFT using this feature is effective and robust.
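    The "circular onion peeling" part of the analysis has a standard computational-geometry core: repeatedly take the convex hull of the point pattern and peel it off. A self-contained sketch; the layer-size signature computed here is a simplification of the paper's feature, shown only to make the peeling concrete:

```python
def convex_hull(pts):
    """Andrew's monotone-chain convex hull of 2D points."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def onion_layers(pts):
    """Peel successive convex hulls; the sequence of layer sizes is a
    crude topology-flavored signature of the point pattern."""
    pts = list(set(pts))
    layers = []
    while pts:
        hull = set(convex_hull(pts))
        layers.append(len(hull))
        pts = [p for p in pts if p not in hull]
    return layers
```

    A unit square with one interior point, for instance, peels into layers of sizes 4 and then 1.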

  3. Facade model refinement by fusing terrestrial laser data and image

    Science.gov (United States)

    Liu, Yawen; Qin, Sushun

    2015-12-01

    The building facade model is one of the main landscape elements of a city and a basic component of city geographic information. It is widely useful in accurate path planning, real navigation through the urban environment, location-based applications, etc. In this paper, a method of facade model refinement by fusing terrestrial laser data and imagery is presented. It uses the matching of model edges and image lines, combined with laser-data verification, to effectively refine a facade geometry model reconstructed from laser data. The laser data of geometric structures on the building facade, such as windows, balconies and doors, are segmented and used as a constraint for further selecting the optimal model edges located at the boundary between point data and no data. The results demonstrate that the deviation of model edges caused by the laser sampling interval can be removed by the proposed method.

  4. General refinement strategy for magnetic structures using spherical neutron polarimetry and representation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Poole, A. [Department of Chemistry, University College London, 20 Gordon Street, London WC1H 0AJ (United Kingdom); London Centre for Nanotechnology, 17-19 Gordon Street, London WC1H 0AH (United Kingdom); Lelievre-Berna, E. [Institut Laue-Langevin, BP 156, 38042 Grenoble Cedex 9 (France); Wills, A.S., E-mail: a.s.wills@ucl.ac.u [Department of Chemistry, University College London, 20 Gordon Street, London WC1H 0AJ (United Kingdom); London Centre for Nanotechnology, 17-19 Gordon Street, London WC1H 0AH (United Kingdom)

    2009-09-01

    Spherical neutron polarimetry (SNP) is a powerful technique for the determination of magnetic structures which may otherwise be intractable. The complexity of the neutron scattering process and the large number of possible trial structures typically lead to refinements based on simple trial-and-error generation of candidate models, and to a possible failure to explore all valid models. Combining the model symmetry types determined from representational analysis with reverse-Monte Carlo refinement creates a generalized refinement strategy for SNP data. It allows refinement in terms of symmetry-adapted modes built up from the basis vectors that describe the orientations of the magnetic moments on the different magnetic sites, and from the different domains that are possible in a sample: spin (S)-domains and K-domains. This methodology typically leads to a large reduction in the number of refined parameters as well as the rigorous inclusion of any symmetry-related domains. In combination with reverse-Monte Carlo refinement algorithms, a general strategy for refining complex magnetic structures can be created. We present an example of a frustrated magnetic structure that has been determined using this approach: Er₂Ti₂O₇.
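    The reverse-Monte-Carlo part of the strategy amounts to a stochastic search over the mode coefficients that minimizes the misfit to data. A toy version on plain vectors (a real refinement fits polarization matrices and includes domain populations; everything here, including the greedy acceptance rule, is schematic):

```python
import random

def reverse_monte_carlo(observed, basis, steps=20000, step_size=0.05, seed=1):
    """Fit coefficients c so that sum_k c[k]*basis[k] matches `observed`,
    by a random walk over coefficients that accepts only chi^2-lowering
    moves -- the spirit of reverse-Monte-Carlo mode refinement."""
    rng = random.Random(seed)
    coeffs = [0.0] * len(basis)

    def chi2(c):
        return sum(
            (obs - sum(ck * bk[i] for ck, bk in zip(c, basis))) ** 2
            for i, obs in enumerate(observed)
        )

    best = chi2(coeffs)
    for _ in range(steps):
        k = rng.randrange(len(coeffs))
        trial = list(coeffs)
        trial[k] += rng.uniform(-step_size, step_size)
        c2 = chi2(trial)
        if c2 < best:
            coeffs, best = trial, c2
    return coeffs, best
```

    The payoff described in the abstract is visible even here: the search space is the handful of symmetry-adapted coefficients rather than every moment direction independently.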

  5. Clinical efficacy of β-lactam/β-lactamase inhibitor combinations for the treatment of bloodstream infection due to extended-spectrum β-lactamase-producing Enterobacteriaceae in haematological patients with neutropaenia: a study protocol for a retrospective observational study (BICAR)

    Science.gov (United States)

    Gudiol, C; Royo-Cebrecos, C; Tebe, C; Abdala, E; Akova, M; Álvarez, R; Maestro-de la Calle, G; Cano, A; Cervera, C; Clemente, W T; Martín-Dávila, P; Freifeld, A; Gómez, L; Gottlieb, T; Gurguí, M; Herrera, F; Manzur, A; Maschmeyer, G; Meije, Y; Montejo, M; Peghin, M; Rodríguez-Baño, J; Ruiz-Camps, I; Sukiennik, T C; Carratalà, J

    2017-01-01

    Introduction Bloodstream infection (BSI) due to extended-spectrum β-lactamase-producing Gram-negative bacilli (ESBL-GNB) is increasing at an alarming pace worldwide. Although β-lactam/β-lactamase inhibitor (BLBLI) combinations have been suggested as an alternative to carbapenems for the treatment of BSI due to these resistant organisms in the general population, their usefulness for the treatment of BSI due to ESBL-GNB in haematological patients with neutropaenia is yet to be elucidated. The aim of the BICAR study is to compare the efficacy of BLBLI combinations with that of carbapenems for the treatment of BSI due to an ESBL-GNB in this population. Methods and analysis A multinational, multicentre, observational retrospective study. Episodes of BSI due to ESBL-GNB occurring in haematological patients and haematopoietic stem cell transplant recipients with neutropaenia from 1 January 2006 to 31 March 2015 will be analysed. The primary end point will be the case-fatality rate within 30 days of onset of BSI. The secondary end points will be 7-day and 14-day case-fatality rates, microbiological failure, colonisation/infection by resistant bacteria, superinfection, intensive care unit admission and development of adverse events. Sample size The number of expected episodes of BSI due to ESBL-GNB in the participating centres will be 260, with a ratio of control to experimental participants of 2. Ethics and dissemination The protocol of the study was approved at the first site by the Research Ethics Committee (REC) of Hospital Universitari de Bellvitge. Approval will also be sought from all relevant RECs. Any formal presentation or publication of data from this study will be considered as a joint publication by the participating investigators and will follow the recommendations of the International Committee of Medical Journal Editors (ICMJE). The study has been endorsed by the European Study Group for Bloodstream Infection and Sepsis (ESGBIS) and the European Study Group

  6. Multivariate refinement equation with nonnegative masks

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper is concerned with multivariate refinement equations of the type ψ(x) = ∑_{α∈Z^s} a(α) ψ(Mx − α), where ψ is the unknown function defined on the s-dimensional Euclidean space R^s, a is a finitely supported nonnegative sequence on Z^s, and M is an s × s dilation matrix with m := |det M|. We characterize the existence of an L²-solution of the refinement equation in terms of the spectral radius of a certain finite matrix or transition operator associated with the refinement mask a and dilation matrix M. For s = 1 and M = 2, sufficient and necessary conditions are obtained to characterize the existence of a continuous solution of this refinement equation.
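For the one-dimensional case (s = 1, M = 2), the behaviour of such an equation can be illustrated numerically with the cascade (subdivision) algorithm. The sketch below assumes the nonnegative mask a = (1/2, 1, 1/2), whose continuous solution is the hat function; it is an illustration of the setting, not the paper's spectral-radius characterization.

```python
# Cascade/subdivision iteration for the refinement equation
# psi(x) = sum_k a(k) * psi(2x - k), with nonnegative mask a = (1/2, 1, 1/2).
# Starting from a delta sequence, repeated subdivision produces samples of
# the continuous solution on the dyadic grid (spacing 2**-n).

def subdivide(c, mask):
    """One subdivision step: out[2j + k] += mask[k] * c[j]."""
    out = [0.0] * (2 * (len(c) - 1) + len(mask))
    for j, cj in enumerate(c):
        for k, ak in enumerate(mask):
            out[2 * j + k] += ak * cj
    return out

mask = [0.5, 1.0, 0.5]   # sums to 2 = |det M|, a necessary condition
c = [1.0]                # delta sequence
for _ in range(6):
    c = subdivide(c, mask)

# The iterates stay bounded (max value 1) and symmetric, consistent with
# convergence to the hat function psi(x) = max(0, 1 - |x - 1|).
print(max(c))   # -> 1.0
```

For masks that violate the existence conditions the same iteration blows up or oscillates, which is exactly what the transition-operator criterion detects.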

  7. Refining the theory of basic individual values.

    Science.gov (United States)

    Schwartz, Shalom H; Cieciuch, Jan; Vecchione, Michele; Davidov, Eldad; Fischer, Ronald; Beierlein, Constanze; Ramos, Alice; Verkasalo, Markku; Lönnqvist, Jan-Erik; Demirutku, Kursad; Dirilen-Gumus, Ozlem; Konty, Mark

    2012-10-01

    We propose a refined theory of basic individual values intended to provide greater heuristic and explanatory power than the original theory of 10 values (Schwartz, 1992). The refined theory more accurately expresses the central assumption of the original theory that research has largely ignored: Values form a circular motivational continuum. The theory defines and orders 19 values on the continuum based on their compatible and conflicting motivations, expression of self-protection versus growth, and personal versus social focus. We assess the theory with a new instrument in 15 samples from 10 countries (N = 6,059). Confirmatory factor and multidimensional scaling analyses support discrimination of the 19 values, confirming the refined theory. Multidimensional scaling analyses largely support the predicted motivational order of the values. Analyses of predictive validity demonstrate that the refined values theory provides greater and more precise insight into the value underpinnings of beliefs. Each value correlates uniquely with external variables.

  8. Progressive refinement for robust image registration

    Institute of Scientific and Technical Information of China (English)

    Li Song; Yuanhua Zhou; Jun Zhou

    2005-01-01

    A new image registration algorithm with robust cost function and progressive refinement estimation is developed on the basis of the direct method (DM). The robustness lies in M-estimation to avert larger local noise and outliers.

  9. Oil Refining Giants to Cut Output

    Institute of Scientific and Technical Information of China (English)

    Zhang Dingmin

    2002-01-01

    China's refined oil product output rose slightly in the first five months of this year on strong demand, but the two biggest domestic producers plan to cut production rates in an attempt to bolster sluggish prices.

  10. Multidataset Refinement Resonant Diffraction, and Magnetic Structures

    OpenAIRE

    Attfield, J. Paul

    2004-01-01

    The scope of Rietveld and other powder diffraction refinements continues to expand, driven by improvements in instrumentation, methodology and software. This will be illustrated by examples from our research in recent years. Multidataset refinement is now commonplace; the datasets may be from different detectors, e.g., in a time-of-flight experiment, or from separate experiments, such as at several x-ray energies giving resonant information. The complementary use of x rays and neutrons is exe...

  11. A Grid-enabled web portal for NMR structure refinement with AMBER.

    Science.gov (United States)

    Bertini, Ivano; Case, David A; Ferella, Lucio; Giachetti, Andrea; Rosato, Antonio

    2011-09-01

    The typical workflow for NMR structure determination involves collecting thousands of conformational restraints, calculating a bundle of 20-40 conformers in agreement with them and refining the energetics of these conformers. The structure calculation step employs simulated annealing based on molecular dynamics (MD) simulations with very simplified force fields. The value of refining the calculated conformers using restrained MD (rMD) simulations with state-of-the-art force fields is documented. This refinement however presents various subtleties, from the proper formatting of conformational restraints to the definition of suitable protocols. We describe a web interface to set up and run calculations with the AMBER package, which we called AMPS-NMR (AMBER-based Portal Server for NMR structures). The interface allows the refinement of NMR structures through rMD. Some predefined protocols are provided for this purpose, which can be personalized; it is also possible to create an entirely new protocol. AMPS-NMR can handle various restraint types. Standard rMD refinements in explicit water of the structures of three different proteins are shown as examples. AMPS-NMR additionally includes a workspace for the user to store different calculations. As an ancillary service, a web interface to AnteChamber is available, enabling the calculation of force field parameters for organic molecules such as ligands in protein-ligand adducts. AMPS-NMR is embedded within the NMR services of the WeNMR project and is available at http://py-enmr.cerm.unifi.it/access/index/amps-nmr; its use requires registration with a digital certificate. ivanobertini@cerm.unifi.it Supplementary data are available at Bioinformatics online.

  12. Refining MARGINS Mini-Lessons Using Classroom Observations

    Science.gov (United States)

    Iverson, E. A.; Manduca, C. A.; McDaris, J. R.; Lee, S.

    2009-12-01

    process instructive. Activity authors found the observations very helpful and the first mini-lessons have now been revised using feedback from testers. Initial results show that the tested mini-lessons give students hands-on experience with scientific data and help students make connections between geologic phenomena and data. Productive feedback ranged from suggestions for improving activity design, adaptations for other audiences, suggestions for clearer presentation, and tips for using the materials. The team plans to broaden the use of the protocol to test and refine all of the mini-lessons in the MARGINS collection.

  13. Reaching Adolescent Gay, Bisexual, and Queer Men Online: Development and Refinement of a National Recruitment Strategy

    OpenAIRE

    Tonya L Prescott; Phillips II, Gregory; DuBois, L. Zachary; Bull, Sheana S; Mustanski, Brian; Ybarra, Michele L

    2016-01-01

    Background Using social networking websites to recruit research participants is increasingly documented in the literature, although few studies have leveraged these sites to reach those younger than 18 years. Objective To discuss the development and refinement of a recruitment protocol to reach and engage adolescent gay, bisexual, and other teenaged men who have sex with men (AGBM). Participants were recruited for development and evaluation activities related to Guy2Guy, a text messaging–base...

  14. Some Strategies for the Refinement of Disordered Structures with the SHELXL-97 Refinement Program

    Institute of Scientific and Technical Information of China (English)

    Ng Seik Weng

    2005-01-01

    Examples of the refinement of disordered structures taken from work published in Acta Crystallographica are presented together with explanations on how the disordered structures are dealt with. The constraint and restraint commands in the SHELXL-97 refinement program that are specific to the particular type of disorder are described. Some ways to divine the hydrogen atoms on water are given in detail. The refinements of the Flack parameters and twinned structures are briefly discussed.

  15. Refining SCJ Mission Specifications into Parallel Handler Designs

    Directory of Open Access Journals (Sweden)

    Frank Zeyda

    2013-05-01

    Full Text Available Safety-Critical Java (SCJ is a recent technology that restricts the execution and memory model of Java in such a way that applications can be statically analysed and certified for their real-time properties and safe use of memory. Our interest is in the development of comprehensive and sound techniques for the formal specification, refinement, design, and implementation of SCJ programs, using a correct-by-construction approach. As part of this work, we present here an account of laws and patterns that are of general use for the refinement of SCJ mission specifications into designs of parallel handlers used in the SCJ programming paradigm. Our notation is a combination of languages from the Circus family, supporting state-rich reactive models with the addition of class objects and real-time properties. Our work is a first step to elicit laws of programming for SCJ and fits into a refinement strategy that we have developed previously to derive SCJ programs.

  16. Parallel Block Structured Adaptive Mesh Refinement on Graphics Processing Units

    Energy Technology Data Exchange (ETDEWEB)

    Beckingsale, D. A. [Atomic Weapons Establishment (AWE), Aldermaston (United Kingdom); Gaudin, W. P. [Atomic Weapons Establishment (AWE), Aldermaston (United Kingdom); Hornung, R. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gunney, B. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Herdman, J. A. [Atomic Weapons Establishment (AWE), Aldermaston (United Kingdom); Jarvis, S. A. [Atomic Weapons Establishment (AWE), Aldermaston (United Kingdom)

    2014-11-17

    Block-structured adaptive mesh refinement is a technique that can be used when solving partial differential equations to reduce the number of zones necessary to achieve the required accuracy in areas of interest. These areas (shock fronts, material interfaces, etc.) are recursively covered with finer mesh patches that are grouped into a hierarchy of refinement levels. Despite the potential for large savings in computational requirements and memory usage without a corresponding reduction in accuracy, AMR adds overhead in managing the mesh hierarchy, adding complex communication and data movement requirements to a simulation. In this paper, we describe the design and implementation of a native GPU-based AMR library, including: the classes used to manage data on a mesh patch, the routines used for transferring data between GPUs on different nodes, and the data-parallel operators developed to coarsen and refine mesh data. We validate the performance and accuracy of our implementation using three test problems and two architectures: an eight-node cluster, and over four thousand nodes of Oak Ridge National Laboratory’s Titan supercomputer. Our GPU-based AMR hydrodynamics code performs up to 4.87× faster than the CPU-based implementation, and has been scaled to over four thousand GPUs using a combination of MPI and CUDA.

  17. Evolving frontiers in severe polytrauma management - refining the essential principles.

    Science.gov (United States)

    Chak Wah, Kam; Wai Man, Choi; Janet Yuen Ha, Wong; Lai, Vincent; Kit Shing John, Wong

    2013-01-01

    This editorial aims to refine the severe polytrauma management principles. While keeping ABCDE priorities, the termination of futile resuscitation and the early use of tourniquet to stop exsanguinating limb bleeding are crucial. Difficult-airway-management (DAM) is by a structured 5-level approach. The computerised tomography (CT) scanner is the tunnel to death for hemodynamically unstable patients. Focused Abdominal Sonography for Trauma-Ultrasonography (FAST USG) has replaced diagnostic peritoneal lavage (DPL) and is expanding to USG life support. Direct whole-body multidetector-row computed tomography (MDCT) expedites diagnosis and treatment. Non-operative management is a viable option in rapid responders in shock. Damage control resuscitation, comprising permissive hypotension, hemostatic resuscitation and damage control surgery (DCS), helps prevent the lethal triad of trauma. Massive transfusion protocol reduces mortality and decreases the blood requirement. DCS attains rapid correction of the deranged physiology. Mortality reduction in major pelvic disruption requires a multi-disciplinary protocol, the novel pre-peritoneal pelvic packing and the angio-embolization. When operation is the definitive treatment for injury, prevention is the best therapy.

  18. Research on Protocol Migration

    Institute of Scientific and Technical Information of China (English)

    Wang Yun; Gu Guanqun; et al.

    1996-01-01

    This paper elaborates the concept and model of protocol migration in network interconnection. Migration strategies and principles are discussed, and several cases are studied in detail, showing the basic procedure and techniques used in protocol migration.

  19. Views on Behaviour Protocols and Their Semantic Foundation

    Science.gov (United States)

    Bauer, Sebastian S.; Hennicker, Rolf

    We consider UML protocol state machines where states are annotated with invariants and transitions are labelled with pre- and postconditions of operations. We claim that the meaning of a protocol strongly depends on the role in which the protocol is actually used. In our study we distinguish three different views on protocols: The user’s and the implementor’s perspective, which both are inspired by the contract principle, and an interaction perspective which focuses on collaborations. For each view we define a model-theoretic semantics based on labelled transition systems and simulation relations integrating control flow and the evolution of data states. Our approach is compositional in the sense that correct user and implementation models can be composed to a correct interaction model. Moreover, we define a refinement relation for protocols which retains our compositionality results.

  20. Human Umbilical Cord Mesenchymal Stromal Cell Transplantation in Myocardial Ischemia (HUC-HEART Trial). A Study Protocol of a Phase 1/2, Controlled and Randomized Trial in Combination with Coronary Artery Bypass Grafting.

    Science.gov (United States)

    Can, Alp; Ulus, Ahmet Tulga; Cinar, Ozgur; Topal Celikkan, Ferda; Simsek, Erdal; Akyol, Mesut; Canpolat, Ugur; Erturk, Murat; Kara, Fadil; Ilhan, Osman

    2015-10-01

    Mesenchymal stem cells (MSCs), which may be obtained from the bone marrow, have been studied for more than a decade in the setting of coronary artery disease (CAD). Adipose tissue-derived MSCs have recently come into focus and are being tested in a series of clinical trials. MSC-like cells have also been derived from a variety of sources, including umbilical cord stroma, or HUC-MSCs. The HUC-HEART trial (ClinicalTrials.gov Identifier: NCT02323477) is a phase 1/2, controlled, multicenter, randomized clinical study of the intramyocardial delivery of allogeneic HUC-MSCs in patients with chronic ischemic cardiomyopathy. A total of 79 patients (ages 30-80) with left ventricle ejection fractions ranging between 25 and 45% will be randomized in a 2:1:1 pattern in order to receive an intramyocardial injection of either HUC-MSCs or autologous bone marrow-derived mononuclear cells (BM-MNCs) in combination with coronary arterial bypass grafting (CABG) surgery. The control group of patients will receive no cells and undergo CABG alone. Human HUC-MSCs will be isolated, propagated and banked in accordance with a cGMP protocol, whereas the autologous BM-MNCs will be isolated via aspiration from the iliac crest and subsequently processed in a closed-circuit cell purification system shortly before cell transplantation. The cell injections will be implemented in 10 peri-infarct areas. Baseline and post-transplantation outcome measures will be primarily utilized to test both the safety and the efficacy of the administered cells for up to 12 months.

  1. Safety and Efficacy of miltefosine alone and in combination with sodium stibogluconate and liposomal amphotericin B for the treatment of primary visceral leishmaniasis in East Africa: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Kinoti Dedan

    2011-06-01

    Full Text Available Abstract Background Treatment options for Visceral Leishmaniasis (VL in East Africa are far from satisfactory due to cost, toxicity, prolonged treatment duration or emergence of parasite resistance. Hence there is a need to explore alternative treatment protocols such as miltefosine alone or in combinations including miltefosine, sodium stibogluconate (SSG or liposomal amphotericin B. The aim of this trial is to identify regimen(s which are sufficiently promising for future trials in East Africa. Methods/Design A phase II randomized, parallel arm, open-labelled trial is being conducted to assess the efficacy of each of the three regimens: liposomal amphotericin B with SSG, Liposomal amphotericin B with miltefosine and miltefosine alone. The primary endpoint is cure at day 28 with secondary endpoint at day 210 (6 months. Initial cure is a single composite measure based on parasitologic evaluation (bone marrow, spleen or lymph node aspirate and clinical assessment. Repeated interim analyses have been planned after recruitment of 15 patients in each arm with a maximum sample size of 63 for each. These will follow group-sequential methods (the triangular test to identify when a regimen is inadequate (90% efficacy. We describe a method to ensure consistency of the sequential analysis of day 28 cure with the non-sequential analysis of day 210 cure. Discussion A regimen with adequate efficacy would be a candidate for treatment of VL with reasonable costs. The design allows repeated testing throughout the trial recruitment period while maintaining good statistical properties (Type I & II error rates and reducing the expected sample sizes. Trial Registration ClinicalTrials.gov: NCT01067443

  2. ASYNCHRONOUS BYZANTINE AGREEMENT PROTOCOL BASED ON VERIFIABLE SIGNATURE SHARING

    Institute of Scientific and Technical Information of China (English)

    Ji Dongyao; Feng Dengguo

    2006-01-01

    An (⌊n/3⌋ − 1)-resilient Asynchronous Byzantine Agreement Protocol (ABAP) that combines verifiable signature sharing and random secret sharing is proposed. The protocol works in an asynchronous network environment and produces Byzantine agreement within a fixed expected number of computational rounds. The correctness of the protocol is proved in theory.

  3. Protocol channels as a new design alternative of covert channels

    CERN Document Server

    Wendzel, Steffen

    2008-01-01

    Covert channel techniques are used by attackers to transfer hidden data. There are two main categories of covert channels: timing channels and storage channels. This paper introduces a third category called protocol channels. A protocol channel switches one of at least two protocols to send a bit combination to a destination while sent packets include no hidden information themselves.

  4. The evolution of oil refining in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Reid, A. [CONCAWE, Brussels (Belgium)

    2013-04-01

    Back in 1963 when CONCAWE was founded, the world looked very different from what it is today, and so did the global and European refining industry. Oil product markets were expanding fast and new refineries were being built at a steady rate. The oil crisis of the 1970s brought an abrupt end to this, heralding a long era of consolidation and stepwise adaptation. At the same time the nature of the global oil business shifted from fully integrated companies producing, transporting and refining their own oil to a much more diversified situation where oil production ('upstream') and refining/distribution ('downstream') gradually became two essentially separate businesses. From being purely a 'cost centre' in an integrated chain, refining has become a separate activity in its own right, operating as a 'profit centre' between two global markets - crude oil and products - which, although not entirely independent, have their own dynamics and influences. In addition demand gradually shifted towards lighter products while the quality requirements on all products were considerably tightened. This article explores the new challenges that these changes have imposed on EU refiners, and describes CONCAWE's contributions to understanding their impact on refinery production and investments.

  5. Development of an efficient grain refiner for Al-7Si alloy and its modification with strontium

    Energy Technology Data Exchange (ETDEWEB)

    Kori, S.A.; Murty, B.S.; Chakraborty, M. [Indian Inst. of Technol., Kharagpur (India). Dept. of Metall. and Mater. Eng.

    2000-05-15

    The grain refining response of Al and Al-7Si alloy has been studied with various Al-Ti, Al-B and Al-Ti-B master alloys at different addition levels. The results show that Al-B and B rich Al-Ti-B master alloys cannot grain refine Al, while they are efficient grain refiners to Al-7Si alloy. The level of grain refinement saturates after 0.03% of Ti or B for most of the master alloys studied both at short and long holding times. The grain refining efficiency of some elements other than Ti and B on Al-7Si alloy has also been studied. Interestingly, all the elements studied (B, Cr, Fe, Mg, Ni, Ti and Zr) have resulted in some grain refinement of Al-7Si alloy at short holding time and have shown fading/poisoning on long holding, which increased in the order of B (no poisoning), Ti, Cr, Ni, Fe, Mg, Zr. Sr (0.02%) has been found to provide complete modification of the eutectic in Al-7Si alloy within 2 min, which is not lost even after long holding up to 120 min. Significant improvements in the mechanical properties have been obtained by a combination of grain refinement and modification to an extent that was not possible by either of them alone. (orig.)

  6. Topological Design of Protocols

    CERN Document Server

    Jaffe, Arthur; Wozniakowski, Alex

    2016-01-01

    We give a topological simulation for tensor networks that we call the two-string model. In this approach we give a new way to design protocols, and we discover a new multipartite quantum communication protocol. We introduce the notion of topologically compressed transformations. Our new protocol can implement multiple, non-local compressed transformations among multi-parties using one multipartite resource state.

  7. Vertical Protocol Composition

    DEFF Research Database (Denmark)

    Groß, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    The security of key exchange and secure channel protocols, such as TLS, has been studied intensively. However, only few works have considered what happens when the established keys are actually used—to run some protocol securely over the established “channel”. We call this a vertical protocol com...

  8. Refining Animal Models to Enhance Animal Welfare

    Institute of Scientific and Technical Information of China (English)

    Patricia V.Turner

    2012-01-01

    The use of animals in research will be necessary for scientific advances in the basic and biomedical sciences for the foreseeable future. As we learn more about the ability of animals to experience pain, suffering, and distress, particularly for mammals, it becomes the responsibility of scientists, institutions, animal caregivers, and veterinarians to seek ways to improve the lives of research animals and refine their care and use. Refinement is one of the three R's emphasized by Russell and Burch, and refers to modification of procedures to minimise the potential for pain, suffering and distress. It may also refer to procedures used to enhance animal comfort. This paper summarizes considerations for refinements in research animals.

  9. FEM electrode refinement for electrical impedance tomography.

    Science.gov (United States)

    Grychtol, Bartlomiej; Adler, Andy

    2013-01-01

    Electrical Impedance Tomography (EIT) reconstructs images of electrical tissue properties within a body from electrical transfer impedance measurements at surface electrodes. Reconstruction of EIT images requires the solution of an inverse problem in soft field tomography, where a sensitivity matrix, J, of the relationship between internal changes and measurements is calculated, and then a pseudo-inverse of J is used to update the image estimate. It is therefore clear that a precise calculation of J is required for solution accuracy. Since it is generally not possible to use analytic solutions, the finite element method (FEM) is typically used. It has generally been recommended in the EIT literature that FEMs be refined near electrodes, since the electric field and sensitivity is largest there. In this paper we analyze the accuracy requirement for FEM refinement near electrodes in EIT and describe a technique to refine arbitrary FEMs.
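The pseudo-inverse update described above can be sketched in a few lines. The example below uses a made-up 3-measurement, 2-element sensitivity matrix (not a real FEM-derived one) and Tikhonov regularization, x = (JᵀJ + λI)⁻¹ Jᵀ y, which is one common way to regularize the ill-posed inversion; it illustrates why an accurate J matters, since errors in J propagate directly into the image estimate.

```python
# One-step linearized EIT-style image update: given sensitivity matrix J and
# a measurement change y, estimate the internal conductivity change x via a
# Tikhonov-regularized pseudo-inverse: x = (J^T J + lam*I)^-1 J^T y.
# J here is a hypothetical 3x2 example, not a real EIT forward model.

def jt_y(J, y):
    # J^T y, with J given as a list of rows
    return [sum(J[i][k] * y[i] for i in range(len(J))) for k in range(len(J[0]))]

def reconstruct(J, y, lam=1e-3):
    # Normal matrix A = J^T J + lam*I is 2x2 here, so invert it analytically.
    a = sum(r[0] * r[0] for r in J) + lam
    b = sum(r[0] * r[1] for r in J)
    d = sum(r[1] * r[1] for r in J) + lam
    det = a * d - b * b
    g = jt_y(J, y)
    return [(d * g[0] - b * g[1]) / det,
            (-b * g[0] + a * g[1]) / det]

J = [[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]]   # hypothetical sensitivities
x_true = [0.7, -0.4]                        # "true" conductivity change
y = [sum(row[k] * x_true[k] for k in range(2)) for row in J]
x_hat = reconstruct(J, y)
print(x_hat)   # close to x_true for small lam
```

In a real solver J comes from the FEM forward model, which is precisely why the paper's electrode-local mesh refinement pays off.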

  10. Model refinement for offshore platforms: Experimental study

    Science.gov (United States)

    Zhang, Min; Chen, Zongli; Wu, Yanjian

    2017-08-01

    Offshore jacket platforms are widely used in offshore oil and gas exploitation. Finite element models of such structures need to have many degrees of freedom (DOFs) to represent the geometrical detail of complex structures, thereby leading to incompatibility in the number of DOFs of experimental models. To bring them both to the same order while ensuring that the essential eigen- properties of the refined model match those of experimental models, an extended model refinement procedure is presented in this paper. Vibration testing of an offshore jacket platform model is performed to validate the applicability of the proposed approach. A full-order finite element model of the platform is established and then tuned to meet the measured modal properties identified from the acceleration signals. Both model reduction and modal expansion methods are investigated, as well as various scenarios of sensor arrangements. Upon completion of the refinement, the updated jacket platform model matches the natural frequencies of the measured model well.

  11. Refining Linear Fuzzy Rules by Reinforcement Learning

    Science.gov (United States)

    Berenji, Hamid R.; Khedkar, Pratap S.; Malkani, Anil

    1996-01-01

    Linear fuzzy rules are increasingly being used in the development of fuzzy logic systems. Radial basis functions have also been used in the antecedents of the rules for clustering in product space which can automatically generate a set of linear fuzzy rules from an input/output data set. Manual methods are usually used in refining these rules. This paper presents a method for refining the parameters of these rules using reinforcement learning which can be applied in domains where supervised input-output data is not available and reinforcements are received only after a long sequence of actions. This is shown for a generalization of radial basis functions. The formation of fuzzy rules from data and their automatic refinement is an important step in closing the gap between the application of reinforcement learning methods in the domains where only some limited input-output data is available.
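The rule structure described (Gaussian radial-basis antecedents gating linear consequents) can be sketched generically. The centers and consequent parameters below are illustrative inventions, and the sketch shows only the inference step, not the paper's reinforcement-learning refinement of the parameters.

```python
import math

# A Takagi-Sugeno-style system with RBF antecedents: each rule i fires with
# weight w_i(x) = exp(-(x - c_i)^2 / (2 s_i^2)) and proposes a linear
# consequent a_i*x + b_i; the output is the normalized weighted sum.
# Centers and parameters are illustrative, not learned values.

rules = [
    # (center, sigma, a, b)
    (-1.0, 0.5,  1.0, 0.0),
    ( 0.0, 0.5,  0.0, 1.0),
    ( 1.0, 0.5, -1.0, 0.0),
]

def infer(x):
    ws = [math.exp(-(x - c) ** 2 / (2 * s * s)) for c, s, _, _ in rules]
    num = sum(w * (a * x + b) for w, (_, _, a, b) in zip(ws, rules))
    return num / sum(ws)

print(infer(0.0))   # dominated by the middle rule; equals 1/(1 + 2*e**-2)
```

Refinement in the reinforcement setting then amounts to nudging (a_i, b_i) and the RBF parameters in the direction of delayed rewards rather than supervised targets.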

  12. Blind Collective Signature Protocol

    Directory of Open Access Journals (Sweden)

    Nikolay A. Moldovyan

    2011-06-01

    Full Text Available Using the digital signature (DS) scheme specified by the Belarusian DS standard, collective and blind collective DS protocols are designed. Signature formation is performed simultaneously by all of the assigned signers; therefore the proposed protocols can also be used as protocols for simultaneously signing a contract. The proposed blind collective DS protocol represents a particular implementation of the blind multisignature schemes, a novel type of signature scheme. The proposed protocols are the first implementations of multisignature schemes based on the Belarusian signature standard.

  13. A model based security testing method for protocol implementation.

    Science.gov (United States)

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementation is important and hard to be verified. Since the penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them together to generate the suitable test cases to verify the security of protocol implementation.

  14. A Model Based Security Testing Method for Protocol Implementation

    Directory of Open Access Journals (Sweden)

    Yu Long Fu

    2014-01-01

    Full Text Available The security of protocol implementation is important and hard to be verified. Since the penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them together to generate the suitable test cases to verify the security of protocol implementation.

  15. Grid refinement for entropic lattice Boltzmann models.

    Science.gov (United States)

    Dorschner, B; Frapolli, N; Chikatamarla, S S; Karlin, I V

    2016-11-01

    We propose a multidomain grid refinement technique with extensions to entropic incompressible, thermal, and compressible lattice Boltzmann models. Its validity and accuracy are assessed by comparison to available direct numerical simulation and experiment for the simulation of isothermal, thermal, and viscous supersonic flow. In particular, we investigate the advantages of grid refinement for the setups of turbulent channel flow, flow past a sphere, Rayleigh-Bénard convection, as well as the supersonic flow around an airfoil. Special attention is paid to analyzing the adaptive features of entropic lattice Boltzmann models for multigrid simulations.

  16. Grid refinement for entropic lattice Boltzmann models

    CERN Document Server

    Dorschner, B; Chikatamarla, S S; Karlin, I V

    2016-01-01

    We propose a novel multi-domain grid refinement technique with extensions to entropic incompressible, thermal and compressible lattice Boltzmann models. Its validity and accuracy are assessed by comparison to available direct numerical simulation and experiment for the simulation of isothermal, thermal and viscous supersonic flow. In particular, we investigate the advantages of grid refinement for the set-ups of turbulent channel flow, flow past a sphere, Rayleigh-Bénard convection as well as the supersonic flow around an airfoil. Special attention is paid to analyzing the adaptive features of entropic lattice Boltzmann models for multi-grid simulations.

  17. Contextual Distance Refining for Image Retrieval

    KAUST Repository

    Islam, Almasri

    2014-09-16

    Recently, a number of methods have been proposed to improve image retrieval accuracy by capturing context information. These methods try to compensate for the fact that a visually less similar image might be more relevant because it depicts the same object. We propose a new, fast method for refining any pairwise distance metric: it works by iteratively discovering the object in the image from the most similar images and then refining the distance metric accordingly. Tests show that our technique improves over the state of the art in terms of accuracy on the MPEG7 dataset.
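A generic flavor of such pairwise-distance refinement (not the authors' exact algorithm) rescales each distance by the local neighborhood statistics of its endpoints, in the spirit of contextual dissimilarity measures:

```python
# Refine a pairwise distance matrix by local context: scale d(i, j) by the
# mean distances of i and j to their k nearest neighbors, so points sitting
# in "tight" neighborhoods are pulled closer together. A generic sketch,
# not the specific method of the abstract above.

def knn_radius(D, i, k):
    # mean distance from i to its k nearest neighbors (excluding itself)
    return sum(sorted(D[i][j] for j in range(len(D)) if j != i)[:k]) / k

def refine(D, k=2, iters=1):
    n = len(D)
    for _ in range(iters):
        r = [knn_radius(D, i, k) for i in range(n)]
        D = [[D[i][j] / (((r[i] * r[j]) ** 0.5) or 1.0) for j in range(n)]
             for i in range(n)]
    return D

D = [[0.0, 1.0, 4.0],
     [1.0, 0.0, 5.0],
     [4.0, 5.0, 0.0]]
R = refine(D, k=2)
print(R[0][1], R[0][2])   # rescaled, but neighbor ordering is preserved here
```

The iterative step mirrors the abstract's strategy of letting the most similar images inform the next round of distance updates.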

  18. Concrete Cutting Refinement for Crater Repair

    Science.gov (United States)

    2015-08-01

ERDC/GSL TR-15-29, August 2015. Concrete Cutting Refinement for Crater Repair, Geotechnical and Structures Laboratory. ...by ERDC, visit the ERDC online library at http://acwc.sdp.sirsi.net/client/default. ...equipment in both soft and hard aggregate-mixed concrete. The investigated equipment included the Vermeer RW1236W and CC1531, Caterpillar SW45 and

  19. Adaptive mesh refinement for storm surge

    KAUST Repository

    Mandli, Kyle T.

    2014-03-01

    An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run. © 2014 Elsevier Ltd.
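The core idea of AMR, flagging cells where the solution varies sharply and subdividing only those, can be sketched in a few lines. The names and the gradient criterion below are illustrative assumptions, not GeoClaw's actual API:

```python
# Sketch of adaptive mesh refinement (AMR) in 1D: insert a midpoint into
# every cell whose jump in the field exceeds a tolerance, so resolution
# concentrates where the solution (e.g. a surge front) varies sharply.

def refine_1d(xs, f, tol):
    """xs: sorted cell-edge coordinates; f: field sampled at edges;
    tol: jump threshold triggering refinement of a cell."""
    out = [xs[0]]
    for i in range(len(xs) - 1):
        if abs(f(xs[i + 1]) - f(xs[i])) > tol:
            out.append(0.5 * (xs[i] + xs[i + 1]))  # refine this cell only
        out.append(xs[i + 1])
    return out

xs = [0.0, 0.25, 0.5, 0.75, 1.0]
surge = lambda x: 0.0 if x < 0.5 else 1.0   # sharp front near the "coast"
fine = refine_1d(xs, surge, 0.5)            # only the front cell is split
```

Production AMR codes apply this flag-and-subdivide cycle recursively over patches in space and time; the one-shot, one-dimensional version above only illustrates the cost saving of local refinement.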

  20. Using supercritical fluids to refine hydrocarbons

    Science.gov (United States)

    Yarbro, Stephen Lee

    2015-06-09

    A system and method for reactively refining hydrocarbons, such as heavy oils with API gravities of less than 20 degrees and bitumen-like hydrocarbons with viscosities greater than 1000 cp at standard temperature and pressure, using a selected fluid at supercritical conditions. A reaction portion of the system and method delivers lightweight, volatile hydrocarbons to an associated contacting unit which operates in mixed subcritical/supercritical or supercritical modes. Using thermal diffusion, multiphase contact, or a momentum generating pressure gradient, the contacting unit separates the reaction products into portions that are viable for use or sale without further conventional refining and hydro-processing techniques.

  1. Relativistic MHD with Adaptive Mesh Refinement

    CERN Document Server

    Anderson, M; Liebling, S L; Neilsen, D; Anderson, Matthew; Hirschmann, Eric; Liebling, Steven L.; Neilsen, David

    2006-01-01

    We solve the relativistic magnetohydrodynamics (MHD) equations using a finite difference Convex ENO method (CENO) in 3+1 dimensions within a distributed parallel adaptive mesh refinement (AMR) infrastructure. In flat space we examine a Balsara blast wave problem along with a spherical blast wave and a relativistic rotor test both with unigrid and AMR simulations. The AMR simulations substantially improve performance while reproducing the resolution equivalent unigrid simulation results. We also investigate the impact of hyperbolic divergence cleaning for the spherical blast wave and relativistic rotor. We include unigrid and mesh refinement parallel performance measurements for the spherical blast wave.

  2. Adaptive Mesh Refinement for Storm Surge

    CERN Document Server

    Mandli, Kyle T

    2014-01-01

An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run.

  3. Refining - Panorama 2008; Raffinage - Panorama 2008

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    Investment rallied in 2007, and many distillation and conversion projects likely to reach the industrial stage were announced. With economic growth sustained in 2006 and still pronounced in 2007, oil demand remained strong - especially in emerging countries - and refining margins stayed high. Despite these favorable business conditions, tensions persisted in the refining sector, which has fallen far behind in terms of investing in refinery capacity. It will take renewed efforts over a long period to catch up. Looking at recent events that have affected the economy in many countries (e.g. the sub-prime crisis), prudence remains advisable.

  4. On-orbit Geometric Parameters Refinement of Mapping Satellite-1 Triple Line Array Camera

    Directory of Open Access Journals (Sweden)

    GENG Hongyi

    2016-03-01

Full Text Available To find a model and method of on-orbit geometric refinement suitable for the triple line array camera of Mapping Satellite-1, this paper first analyzed the impact of exterior orientation line element errors on geometric parameter refinement. It then eliminated high-frequency noise by preprocessing the attitude data, compensated the low-frequency flutter of the satellite platform in flight with a sine function, and designed a constant angular error model for the lenses of the triple line array camera. In addition, an interior orientation model using pixel coordinates directly as observations was constructed based on the conventional additional parameter model, and the best combination of refinement model parameters and solution strategy was determined by unilateral-control extrapolative location. The experiments show that planar and vertical accuracies of about 1 GSD and 0.8 GSD are achieved with the proposed refinement plan and a rational distribution of GCPs.

  5. Development of series products from Eucommia ulmoides Oliv. leaves by the combined technique of microfluidization nanometer refining and complex biological enzymolysis

    Institute of Scientific and Technical Information of China (English)

    罗阳帆; 万茵; 柴建新; 陈建芳; 李红歌; 王建涛; 胡明; 付桂明

    2011-01-01

In this paper, three health products from Eucommia ulmoides Oliv. leaves were produced by a combined technique of microfluidization nanometer refining and complex biological enzymolysis. The crude fiber in the leaves was processed into tiny microcrystalline cellulose, which strongly adsorbs and protects the active components of the leaves; this also eliminated the gritty mouthfeel and allowed full use of the active constituents. After treatment, three health products, Eucommia ulmoides leaf juice, instant granules and chewable tablets, were produced by blending, sterilization, canning, granulation-drying and tableting. The products are consistent with the trend toward natural, healthy and nutritious foods and open a new route for the effective utilization of Eucommia ulmoides Oliv. leaves.

  6. A Rational Approach to Cryptographic Protocols

    CERN Document Server

    Caballero-Gil, P; Bruno-Castañeda, C; 10.1016/j.mcm.2006.12.013

    2010-01-01

This work initiates an analysis of several cryptographic protocols from a rational point of view using a game-theoretical approach, which allows us to represent not only the protocols but also possible misbehaviour of parties. Concretely, several concepts from two-person games and two-party cryptographic protocols are combined in order to model the latter as the former. One of the main advantages of analysing a cryptographic protocol in the game-theory setting is the possibility of describing improved and stronger cryptographic solutions, because possible adversarial behaviours may be taken into account directly. With these tools, protocols can be studied in a malicious model in order to find equilibrium conditions that make it possible to protect honest parties against all possible adversary strategies.

  7. Mesh Generation via Local Bisection Refinement of Triangulated Grids

    Science.gov (United States)

    2015-06-01

tb [Maubach 1995, Theorem 5.1]. This is exploited in the recursive algorithm RefineTriangle (Algorithm 2) to compatibly refine a given triangle; the recursion depth of RefineTriangle is bounded by the maximum level of refinement in T [Maubach 1995]. RefineTriangle calls itself repeatedly on a sequence of triangles until a compatibly divisible triangle is found. This sequence of triangles is then bisected in reverse order to preserve
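The bisection step such algorithms build on can be sketched as follows; the recursive compatibility search of RefineTriangle is omitted here, and this illustrative snippet only splits a single triangle across the midpoint of its longest edge:

```python
# Longest-edge bisection of one triangle: a simplified stand-in for the
# bisection primitive used in local refinement of triangulated grids.
import math

def bisect(tri):
    """tri: tuple of three (x, y) vertices. Returns the two children
    produced by splitting the longest edge at its midpoint."""
    def d(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    a, b, c = tri
    # rotate vertices so (a, b) is the longest edge
    edges = [(d(a, b), (a, b, c)), (d(b, c), (b, c, a)), (d(c, a), (c, a, b))]
    _, (a, b, c) = max(edges)
    m = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    return (a, m, c), (m, b, c)

# split the unit right triangle across its hypotenuse
t1, t2 = bisect(((0, 0), (1, 0), (0, 1)))
```

In a full mesh refiner, each bisection is propagated to neighbouring triangles (the "compatibly divisible" check) so that no hanging nodes remain.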

  8. POLY-SCALE REFINABLE FUNCTION AND THEIR PROPERTIES

    Institute of Scientific and Technical Information of China (English)

    YANG Shou-zhi

    2006-01-01

Poly-scale refinable functions with dilation factor a are introduced, and the existence of solutions of the poly-scale refinable equation is investigated. In particular, necessary and sufficient conditions for the orthonormality of the solution function φ of a poly-scale refinable equation with integer dilation factor a are established. Some properties of poly-scale refinable functions are discussed, and several examples illustrate how to use the method to construct poly-scale refinable functions.
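For orientation, the classical single-scale refinable equation with dilation factor a is shown below, together with a sketch of a poly-scale variant in which coefficients appear at several dilation levels; the exact form used in the paper may differ:

```latex
% Classical refinable equation with integer dilation factor a:
\varphi(x) = \sum_{k \in \mathbb{Z}} c_k \, \varphi(a x - k)

% Poly-scale variant (sketch): terms at several dilation levels a^j
\varphi(x) = \sum_{j=1}^{m} \sum_{k \in \mathbb{Z}} c_{j,k} \, \varphi(a^{j} x - k)
```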

  9. Anomalies in the refinement of isoleucine

    NARCIS (Netherlands)

    Berntsen, K.R.M.; Vriend, G.

    2014-01-01

    A study of isoleucines in protein structures solved using X-ray crystallography revealed a series of systematic trends for the two side-chain torsion angles chi1 and chi2 dependent on the resolution, secondary structure and refinement software used. The average torsion angles for the nine rotamers w

  10. Structural refinement and coarsening in deformed metals

    DEFF Research Database (Denmark)

    Hansen, N.; Huang, X.; Xing, Q.

    2005-01-01

    The microstructural refinement by plastic deformation is analysed in terms of key parameters, the spacing between and the misorientation angle across the boundaries subdividing the structure. Coarsening of such structures by annealing is also characterised. For both deformed and annealed structur...

  11. Energy Bandwidth for Petroleum Refining Processes

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2006-10-01

    The petroleum refining energy bandwidth report analyzes the most energy-intensive unit operations used in U.S. refineries: crude oil distillation, fluid catalytic cracking, catalytic hydrotreating, catalytic reforming, and alkylation. The "bandwidth" provides a snapshot of the energy losses that can potentially be recovered through best practices and technology R&D.

  12. SOME REFINEMENTS OF ENESTROM-KAKEYA THEOREM

    Institute of Scientific and Technical Information of China (English)

    A.Aziz; B.A.Zargar

    2007-01-01

In this paper we present certain interesting refinements of the well-known Enestrom-Kakeya theorem on the distribution of zeros of polynomials, which among other things also improve upon some results of Aziz and Mohammad, Govil and Rehman, and others.
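For reference, the classical Enestrom-Kakeya theorem that such results refine states:

```latex
% Enestrom-Kakeya theorem: if p(z) = \sum_{j=0}^{n} a_j z^j has real
% coefficients satisfying 0 < a_0 \le a_1 \le \dots \le a_n, then every
% zero z of p lies in the closed unit disk:
|z| \le 1 .
```

Refinements of this type typically replace the unit disk with a smaller region determined by the coefficient ratios.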

  13. Refinement from a control problem to program

    DEFF Research Database (Denmark)

    Schenke, Michael; Ravn, Anders P.

    1996-01-01

    for a control task, exemplified by a steam boiler.The same formalism is used to refine requirements to a functional design.Through a suitable transformation this is taken to an event andaction based formalism. Finally components in this design for a distributedarchitecture are transformed to occam-like programs....

  14. Two different anticoagulant protocols for combined-type artificial kidney treatment

    Institute of Scientific and Technical Information of China (English)

    邢利; 王沛育; 蔡文利; 王柠; 李海霞; 王艳芳

    2011-01-01

BACKGROUND: Combined-type artificial kidney can completely remove various toxins in uremia patients and prevent and treat long-term complications caused by dialysis, but coagulation in the dialyzer and perfusion apparatus troubles physicians and patients. OBJECTIVE: To investigate the anticoagulant effects and safety of two different anticoagulant protocols in combined-type artificial kidney treatment of chronic renal failure. METHODS: Heparin sodium or low molecular weight heparin sodium was used for 4 weeks of anticoagulation in chronic renal failure patients receiving combined-type artificial kidney treatment (hemoperfusion + hemodialysis) at 1.5-2.0 times the amount of the original anticoagulant protocols. Anticoagulant effects, hemorrhage, prothrombin time, partial thromboplastin time, thrombin time, blood platelet and hemoglobin levels before treatment and at 4 and 8 weeks after treatment were compared between the two anticoagulant drugs. RESULTS AND CONCLUSION: With heparin sodium, grade 1 blood coagulation appeared in two cases and grade 2 in one case. With low molecular weight heparin sodium, grade 1 blood coagulation appeared in three cases and grade 2 in two cases. Grade 3 blood coagulation was not observed with either drug. With heparin sodium, grade 1 hemorrhage was observed in 12 cases and grade 3 hemorrhage in three cases. With low molecular weight heparin sodium, grade 1 hemorrhage was observed in two cases and grade 2 hemorrhage in one case. After heparin sodium application, blood platelets were reduced and partial thromboplastin time was prolonged, with no significant difference compared with before treatment. There was no significant difference in any index between before and after low molecular weight heparin sodium application (P > 0.05). These results showed that in combined-type artificial kidney treatment of chronic renal failure patients, low molecular weight heparin sodium and

  15. Satellite SAR geocoding with refined RPC model

    Science.gov (United States)

    Zhang, Lu; Balz, Timo; Liao, Mingsheng

    2012-04-01

Recent studies have proved that the Rational Polynomial Camera (RPC) model is able to act as a reliable replacement for the rigorous Range-Doppler (RD) model in the geometric processing of satellite SAR datasets, but its capability in absolute geolocation of SAR images has not been evaluated quantitatively. Therefore, in this article the problems of error analysis and refinement of the SAR RPC model are investigated to improve the absolute accuracy of SAR geolocation. Range propagation delay and azimuth timing error are identified as the two major error sources for SAR geolocation. An approach based on SAR image simulation and real-to-simulated image matching is developed to estimate and correct these two errors. Afterwards a refined RPC model can be built from the error-corrected RD model and then used in satellite SAR geocoding. Three experiments with different settings are designed and conducted to comprehensively evaluate the accuracies of SAR geolocation with both ordinary and refined RPC models. All the experimental results demonstrate that with RPC model refinement the absolute location accuracies of geocoded SAR images can be improved significantly, particularly in the Easting direction. In another experiment the computation efficiencies of SAR geocoding with the RD and RPC models are compared quantitatively. The results show that by using the RPC model such efficiency can be improved remarkably, by at least 16 times. In addition the problem of DEM data selection for SAR image simulation in RPC model refinement is studied by a comparative experiment. The results reveal that the best choice is to use DEM datasets of spatial resolution comparable to that of the SAR images.
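For context, an RPC model expresses each image coordinate as a ratio of polynomials in normalized ground coordinates. The sketch below uses a hypothetical 4-term affine basis and placeholder coefficients; real RPC models use a 20-term cubic basis per polynomial:

```python
# Sketch of Rational Polynomial Camera (RPC) evaluation: one image axis
# (row or column) is P_num(lat, lon, h) / P_den(lat, lon, h) over a
# polynomial basis. Coefficients here are placeholders, not sensor data.

def rpc_eval(num, den, lat, lon, h):
    """Evaluate an image coordinate over a minimal 4-term affine basis."""
    basis = [1.0, lat, lon, h]
    p = sum(c * b for c, b in zip(num, basis))
    q = sum(c * b for c, b in zip(den, basis))
    return p / q

# trivial model: row = 2 * lat, denominator identically 1
row = rpc_eval([0.0, 2.0, 0.0, 0.0], [1.0, 0.0, 0.0, 0.0], 0.5, 0.1, 0.0)
```

Refining an RPC model, as in the article, amounts to re-fitting these coefficients against a corrected rigorous model (here, the error-corrected Range-Doppler model).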

  16. Encryption Switching Protocols

    OpenAIRE

    Couteau, Geoffroy; Peters, Thomas; Pointcheval, David

    2016-01-01

International audience; We formally define the primitive of encryption switching protocol (ESP), allowing one to switch between two encryption schemes. Intuitively, this two-party protocol converts given ciphertexts from one scheme into ciphertexts of the same messages under the other scheme, for any polynomial number of switches, in any direction. Although ESP is a special kind of two-party computation protocol, it turns out that ESP implies general two-party computation (2-PC) under natural con...

  17. Multiparty Quantum Cryptographic Protocol

    Institute of Scientific and Technical Information of China (English)

    M. Ramzan; M. K. Khan

    2008-01-01

We propose a multiparty quantum cryptographic protocol. Unitary operators applied by Bob and Charlie on their respective qubits of a tripartite entangled state encode a classical symbol that can be decoded at Alice's end with the help of a decoding matrix. Eve's presence can be detected by the disturbance of the decoding matrix. Our protocol is secure against intercept-resend attacks. Furthermore, it is efficient and deterministic in the sense that two classical bits can be transferred per entangled pair of qubits. It is worth mentioning that in this protocol the same symbol can be used for key distribution and Eve's detection, which enhances the efficiency of the protocol.

  18. Structure prediction for CASP8 with all-atom refinement using Rosetta

    Science.gov (United States)

    Raman, Srivatsan; Vernon, Robert; Thompson, James; Tyka, Michael; Sadreyev, Ruslan; Pei, Jimin; Kim, David; Kellogg, Elizabeth; DiMaio, Frank; Lange, Oliver; Kinch, Lisa; Sheffler, Will; Kim, Bong-Hyun; Das, Rhiju; Grishin, Nick V.; Baker, David

    2012-01-01

    We describe predictions made using the Rosetta structure prediction methodology for the Eighth Critical Assessment of Techniques for Protein Structure Prediction. Aggressive sampling and all-atom refinement were carried out for nearly all targets. A combination of alignment methodologies was used to generate starting models from a range of templates, and the models were then subjected to Rosetta all atom refinement. For 50 targets with readily identified templates, the best submitted model was better than the best alignment to the best template in the Protein Data Bank for 24 domains, and improved over the best starting model for 43 domains. For 13 targets where only very distant sequence relationships to proteins of known structure were detected, models were generated using the Rosetta de novo structure prediction methodology followed by all-atom refinement; in several cases the submitted models were better than those based on the available templates. Of the 12 refinement challenges, the best submitted model improved on the starting model in 7 cases. These improvements over the starting template-based models and refinement tests demonstrate the power of Rosetta structure refinement in improving model accuracy. PMID:19701941

  19. Internet Protocol Television (IPTV)

    Directory of Open Access Journals (Sweden)

    Lokesh Mittal

    2012-09-01

Full Text Available IPTV is one of the most widely used Internet and IP technologies. IPTV is a service for the delivery of broadcast TV, movies on demand and other interactive multimedia services over a secure, end-to-end, operator-managed broadband IP data network with the desired QoS to the public with a broadband Internet connection. An IPTV system may also include Internet services such as Web access and VoIP, in which case it may be called Triple Play and is typically supplied by a broadband operator using the same infrastructure. IPTV is not Internet video, which simply allows users to watch videos, such as movie previews and webcams, over the Internet in a best-effort fashion. IPTV technology offers revenue-generating opportunities for telecom and cable service providers. For traditional telephone service providers, Triple Play is delivered using a combination of optical fiber and Digital Subscriber Line (DSL) technologies to the residential base. IPTV is a system in which a digital television service is delivered using Internet Protocol over a network infrastructure, which may include delivery by a broadband connection. A general definition of IPTV is television content that, instead of being delivered through traditional broadcast and cable formats, is received by the viewer through the technologies used for computer networks. This paper discusses what IPTV is, how it works, its advantages and its applications.

  20. Parabolic refined invariants and Macdonald polynomials

    CERN Document Server

    Chuang, Wu-yen; Donagi, Ron; Pantev, Tony

    2013-01-01

    A string theoretic derivation is given for the conjecture of Hausel, Letellier, and Rodriguez-Villegas on the cohomology of character varieties with marked points. Their formula is identified with a refined BPS expansion in the stable pair theory of a local root stack, generalizing previous work of the first two authors in collaboration with G. Pan. Haiman's geometric construction for Macdonald polynomials is shown to emerge naturally in this context via geometric engineering. In particular this yields a new conjectural relation between Macdonald polynomials and refined local orbifold curve counting invariants. The string theoretic approach also leads to a new spectral cover construction for parabolic Higgs bundles in terms of holomorphic symplectic orbifolds.

  1. Adaptive Mesh Refinement for Characteristic Grids

    CERN Document Server

    Thornburg, Jonathan

    2009-01-01

I consider techniques for Berger-Oliger adaptive mesh refinement (AMR) when numerically solving partial differential equations with wave-like solutions, using characteristic (double-null) grids. Such AMR algorithms are naturally recursive, and the best-known past Berger-Oliger characteristic AMR algorithm, that of Pretorius & Lehner (J. Comp. Phys. 198 (2004), 10), recurses on individual "diamond" characteristic grid cells. This leads to the use of fine-grained memory management, with individual grid cells kept in 2-dimensional linked lists at each refinement level. This complicates the implementation and adds overhead in both space and time. Here I describe a Berger-Oliger characteristic AMR algorithm which instead recurses on null slices. This algorithm is very similar to the usual Cauchy Berger-Oliger algorithm, and uses relatively coarse-grained memory management, allowing entire null slices to be stored in contiguous arrays in memory. The algorithm is very efficient in both space and ti...

  2. Cleaning Data with OpenRefine

    Directory of Open Access Journals (Sweden)

    Seth van Hooland

    2013-08-01

Full Text Available Duplicate records, empty values and inconsistent formats are phenomena we should be prepared to deal with when using historical data sets. This lesson will teach you how to discover inconsistencies in data contained within a spreadsheet or a database. As we increasingly share, aggregate and reuse data on the web, historians will need to respond to data quality issues which inevitably pop up. Using a program called OpenRefine, you will be able to easily identify systematic errors such as blank cells, duplicates, spelling inconsistencies, etc. OpenRefine not only allows you to quickly diagnose the accuracy of your data, but also to act upon certain errors in an automated manner.
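The systematic errors OpenRefine targets, stray whitespace, case inconsistencies, blanks and duplicates, can be illustrated with plain Python; OpenRefine performs analogous trim, cluster and de-duplicate operations interactively:

```python
# Emulating three basic OpenRefine-style cleanups on a toy column of
# place names: drop blanks, trim whitespace, and cluster entries that
# differ only in letter case (keeping the first spelling seen).

records = ["Paris", " paris", "PARIS ", "Lyon", "", "Lyon"]

cleaned = [r.strip() for r in records if r.strip()]   # drop blanks, trim
clusters = {}                                          # case-insensitive clustering
for r in cleaned:
    clusters.setdefault(r.lower(), r)                  # keep first spelling seen
deduped = sorted(clusters.values())
```

OpenRefine's actual clustering goes further (key collision, nearest-neighbour methods), but the trim-cluster-merge workflow is the same.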

  3. Using supercritical fluids to refine hydrocarbons

    Science.gov (United States)

    Yarbro, Stephen Lee

    2014-11-25

This is a method to reactively refine hydrocarbons, such as heavy oils with API gravities of less than 20 degrees and bitumen-like hydrocarbons with viscosities greater than 1000 cp at standard temperature and pressure, using a selected fluid at supercritical conditions. The reaction portion of the method delivers lighter-weight, more volatile hydrocarbons to an attached contacting device that operates in mixed subcritical or supercritical modes. This separates the reaction products into portions that are viable for use or sale without further conventional refining and hydro-processing techniques. The method produces valuable products with fewer processing steps and lower costs, increases worker safety through reduced processing and handling, allows greater opportunity for new oil field development with subsequent positive economic impact, and reduces the related carbon dioxide and wastes typical of conventional refineries.

  4. European refiners re-adjust margins strategy

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.G. [ed.

    1996-05-01

Refiners in Europe are adjusting operating strategies to reflect the volatility of tight operating margins. From the unexpected availability of quality crudes (e.g., Brent, 0.3% sulfur) to the role of government in refinery planning, the European refining industry is positioning itself to reverse the past few years of steadily declining profitability. Unlike the expected increases in US gasoline demand, European gasoline consumption is not expected to increase, and heavy fuel oil consumption is also declining. However, diesel fuel consumption is expected to increase, even though diesel processing capacity has recently decreased (i.e., more imports). Possible strategies that Europeans may adopt to improve margins and reduce volatility include: increasing conversion capacity to supply growing demand for middle distillates and LPG; alleviating refinery cash-flow problems with alliances; and directing discretionary investment toward retail merchandising (unless there is a clear trend toward a widening of the sweet-sour crude price differential).

  5. Solidification Based Grain Refinement in Steels

    Science.gov (United States)

    2009-07-24

and a coarser microstructure forms (see Figure 17). ROLE OF COBALT ALUMINATE: CoAl2O4 produced very different results in the 1010 and 1030 TA...determine what role CoAl2O4 played in both steels. Other researchers have observed refinement with cobalt aluminate. Their work has been using...produced in this work. The one unexplained result for CoAl2O4 is the different undercooling behavior found. Cobalt aluminate reduced undercooling in the

  6. Mesh refinement strategy for optimal control problems

    OpenAIRE

    Paiva, Luis Tiago; Fontes, Fernando,

    2013-01-01

International audience; Direct methods are becoming the most used technique to solve nonlinear optimal control problems. Regular time meshes having equidistant spacing are frequently used. However, in some cases these meshes cannot cope accurately with nonlinear behavior. One way to improve the solution is to select a new mesh with a greater number of nodes. Another way involves adaptive mesh refinement. In this case, the mesh nodes have non-equidistant spacing, which allows a non-uniform node...

  7. Understanding Solidification Based Grain Refinement in Steels

    Science.gov (United States)

    2014-12-18

interatomic spacing along [uvw]s, and θ = the angle between the [uvw]s and [uvw]n. Work on the grain refinement of aluminum and magnesium alloys has found...chemistry ternary plot from ASPEX system. The author found some evidence of reactions occurring with some rare earth oxide or sulfide inclusions and... sulfide. Another approach would be to react a sample of misch metal or rare earth silicide at elevated temperatures to form the desired oxide or

  8. REFINEMENTS OF THE FAN-TODD'S INEQUALITIES

    Institute of Scientific and Technical Information of China (English)

    庄国基

    2002-01-01

Refinements to inequalities on inner product spaces are presented. In this respect, the inequalities dealt with in this paper are: Cauchy's inequality, Bessel's inequality, Fan-Todd's inequality and Fan-Todd's determinantal inequality. In each case, a strictly increasing function is put forward which lies between the smaller and the larger quantities of each inequality. As a result, an improved condition for equality in Fan-Todd's determinantal inequality is deduced.

  9. Refining Housing, Husbandry and Care for Animals Used in Studies Involving Biotelemetry

    Directory of Open Access Journals (Sweden)

    Penny Hawkins

    2014-06-01

    Full Text Available Biotelemetry can contribute towards reducing animal numbers and suffering in disciplines including physiology, pharmacology and behavioural research. However, the technique can also cause harm to animals, making biotelemetry a ‘refinement that needs refining’. Current welfare issues relating to the housing and husbandry of animals used in biotelemetry studies are single vs. group housing, provision of environmental enrichment, long term laboratory housing and use of telemetered data to help assess welfare. Animals may be singly housed because more than one device transmits on the same wavelength; due to concerns regarding damage to surgical sites; because they are wearing exteriorised jackets; or if monitoring systems can only record from individually housed animals. Much of this can be overcome by thoughtful experimental design and surgery refinements. Similarly, if biotelemetry studies preclude certain enrichment items, husbandry refinement protocols can be adapted to permit some environmental stimulation. Nevertheless, long-term laboratory housing raises welfare concerns and maximum durations should be defined. Telemetered data can be used to help assess welfare, helping to determine endpoints and refine future studies. The above measures will help to improve data quality as well as welfare, because experimental confounds due to physiological and psychological stress will be minimised.

  10. Crystal Structure Refinement of Synthetic Pure Gyrolite

    Directory of Open Access Journals (Sweden)

    Arūnas Baltušnikas

    2015-03-01

Full Text Available Pure calcium silicate hydrate, gyrolite, was prepared at 473 K under saturated steam pressure in a rotating autoclave. The crystal structure of synthetic gyrolite was investigated by X-ray diffraction and refined using Le Bail, Rietveld and crystal structure modelling methods. Background and peak shape parameters and verification of the space group were determined by Le Bail full-pattern decomposition. Peculiarities of the interlayer sheet X of the gyrolite unit cell were highlighted by Rietveld refinement. A possible atomic arrangement in interlayer sheet X was solved by a global optimization method. The most likely crystal structure model of gyrolite was calculated by final Rietveld refinement. It was shown crystallographically that the cell parameters are: a = 0.9713(2) nm, b = 0.9715(2) nm, c = 2.2442(3) nm and alpha = 95.48(2)°, beta = 91.45(2)°, gamma = 120.05(3)°. DOI: http://dx.doi.org/10.5755/j01.ms.21.1.5460

  11. Mesh refinement strategy for optimal control problems

    Science.gov (United States)

    Paiva, L. T.; Fontes, F. A. C. C.

    2013-10-01

Direct methods are becoming the most used technique to solve nonlinear optimal control problems. Regular time meshes having equidistant spacing are frequently used. However, in some cases these meshes cannot cope accurately with nonlinear behavior. One way to improve the solution is to select a new mesh with a greater number of nodes. Another way involves adaptive mesh refinement. In this case, the mesh nodes have non-equidistant spacing, which allows non-uniform node collocation. In the method presented in this paper, a time-mesh refinement strategy based on the local error is developed. After computing a solution on a coarse mesh, the local error is evaluated, which gives information about the subintervals of the time domain where refinement is needed. This procedure is repeated until the local error reaches a user-specified threshold. The technique is applied to solve the car-like vehicle problem aiming at minimum consumption. The approach developed in this paper leads to results with greater accuracy and yet with lower overall computational time as compared to using a time mesh with equidistant spacing.
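The local-error-driven refinement loop described above can be sketched in a few lines; the error estimator here is a stand-in function, not the paper's actual estimator:

```python
# Sketch of local-error time-mesh refinement: after a coarse solve,
# estimate the local error on each subinterval and insert a midpoint
# wherever it exceeds a user threshold; repeat until all pass.

def refine_mesh(nodes, local_error, tol, max_iter=10):
    for _ in range(max_iter):
        new = [nodes[0]]
        changed = False
        for a, b in zip(nodes, nodes[1:]):
            if local_error(a, b) > tol:
                new.append((a + b) / 2)   # non-equidistant node insertion
                changed = True
            new.append(b)
        nodes = new
        if not changed:                   # every subinterval within tolerance
            return nodes
    return nodes

# stand-in estimator: error scales with interval length, largest near t = 0
err = lambda a, b: (b - a) * (1.5 - a)
mesh = refine_mesh([0.0, 0.5, 1.0], err, 0.4)
```

One pass splits both coarse intervals; the second pass finds every subinterval below tolerance and stops, yielding a finer but still non-uniform mesh.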

  12. Refining of biodiesel by ceramic membrane separation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yong; Ou, Shiyi; Tan, Yanlai; Tang, Shuze [Department of Food Science and Engineering, Jinan University, Guangzhou 510632 (China); Wang, Xingguo; Liu, Yuanfa [School of Food Science and Technology, Jiangnan University, Wuxi 214112 (China)

    2009-03-15

    A ceramic membrane separation process for biodiesel refining was developed to reduce the considerable amount of water needed in the conventional water-washing process. Crude biodiesel produced from refined palm oil was micro-filtered by ceramic membranes with pore sizes of 0.6, 0.2 and 0.1 µm to remove the residual soap and free glycerol, at a transmembrane pressure of 0.15 MPa and a temperature of 60 °C. The flux through the membrane was maintained at 300 L m⁻² h⁻¹ when the volumetric concentration ratio reached 4. The contents of potassium, sodium, calcium and magnesium in the whole permeate were 1.40, 1.78, 0.81 and 0.20 mg/kg respectively, as determined by inductively coupled plasma-atomic emission spectroscopy. These values are lower than the EN 14538 specifications. The residual free glycerol in the permeate, estimated by water extraction, was 0.0108 wt.%. This ceramic membrane technology is a promising environmentally friendly process for the refining of biodiesel. (author)

  13. Using Induction to Refine Information Retrieval Strategies

    Science.gov (United States)

    Baudin, Catherine; Pell, Barney; Kedar, Smadar

    1994-01-01

    Conceptual information retrieval systems use structured document indices, domain knowledge and a set of heuristic retrieval strategies to match user queries with a set of indices describing a document's content. Such retrieval strategies increase the set of relevant documents retrieved (increasing recall), but at the expense of returning additional irrelevant documents (decreasing precision). In conceptual information retrieval systems this tradeoff is usually managed by hand, and with difficulty. This paper discusses ways of managing the tradeoff by applying standard induction algorithms to refine the retrieval strategies in an engineering design domain. We gathered examples of query/retrieval pairs during the system's operation, using feedback from a user on the retrieved information. We then fed these examples to the induction algorithm and generated decision trees that refine the existing set of retrieval strategies. We found that (1) induction improved precision on a set of queries generated by another user, without a significant loss in recall, and (2) in an interactive mode, the decision trees pointed out flaws in the retrieval and indexing knowledge and suggested ways to refine the retrieval strategies.
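The induction step can be illustrated with a minimal information-gain stump over query/feedback examples. The feature names and the log below are invented for illustration; the paper used full decision trees in an engineering design domain:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a non-empty list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def best_stump(examples):
    """Pick the boolean feature whose split maximizes information gain.
    Each example is (features_dict, relevant_bool)."""
    labels = [rel for _, rel in examples]
    base = entropy(labels)
    best = None
    for feat in examples[0][0]:
        yes = [rel for f, rel in examples if f[feat]]
        no = [rel for f, rel in examples if not f[feat]]
        rem = (len(yes) * entropy(yes) + len(no) * entropy(no)) / len(examples)
        gain = base - rem
        if best is None or gain > best[1]:
            best = (feat, gain)
    return best

# Hypothetical relevance-feedback log: did the strategy's match share a
# domain term with the query, and was it found via a synonym link?
log = [
    ({"shares_term": True,  "via_synonym": False}, True),
    ({"shares_term": True,  "via_synonym": True},  True),
    ({"shares_term": False, "via_synonym": True},  False),
    ({"shares_term": False, "via_synonym": False}, False),
    ({"shares_term": True,  "via_synonym": False}, True),
    ({"shares_term": False, "via_synonym": True},  False),
]
feat, gain = best_stump(log)
print(feat, round(gain, 3))
```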

  14. Parallel object-oriented adaptive mesh refinement

    Energy Technology Data Exchange (ETDEWEB)

    Balsara, D.; Quinlan, D.J.

    1997-04-01

    In this paper we study adaptive mesh refinement (AMR) for elliptic and hyperbolic systems. We use the Asynchronous Fast Adaptive Composite Grid Method (AFACX), a parallel algorithm based upon the Fast Adaptive Composite Grid Method (FAC), as a test case of an adaptive elliptic solver. For our hyperbolic example we use TVD and ENO schemes for solving the Euler and MHD equations. We use the structured grid load balancer MLB as a tool for obtaining a load-balanced distribution in a parallel environment. Parallel adaptive mesh refinement poses difficulties in expressing the basic single-grid solver, whether elliptic or hyperbolic, in a fashion that parallelizes seamlessly. It also requires that these basic solvers work together within the adaptive mesh refinement algorithm, which uses the single-grid solvers as one part of its adaptive solution process. We show that use of AMR++, an object-oriented library within the OVERTURE Framework, simplifies the development of AMR applications. Parallel support is provided and abstracted through the use of the P++ parallel array class.
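The flag-and-refine step at the heart of structured AMR can be sketched in a one-dimensional toy: flag cells where the solution varies sharply, then group contiguous flagged cells into refinement patches. This is a generic illustration, not AMR++/OVERTURE code:

```python
def flag_cells(u, tol):
    """Flag cells where the undivided gradient of u exceeds tol."""
    return [abs(u[i + 1] - u[i]) > tol for i in range(len(u) - 1)]

def patches(flags):
    """Group contiguous flagged cells into refinement patches (start, end)."""
    out, start = [], None
    for i, f in enumerate(flags + [False]):
        if f and start is None:
            start = i
        elif not f and start is not None:
            out.append((start, i))
            start = None
    return out

# Toy field with a sharp front around index 5; only that region is refined
u = [0, 0, 0, 0, 0.1, 0.9, 1, 1, 1]
print(patches(flag_cells(u, tol=0.5)))
```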

  15. Formal refinement of extended state machines

    Directory of Open Access Journals (Sweden)

    Thomas Fayolle

    2016-06-01

    Full Text Available In a traditional formal development process, e.g. using the B method, the informal user requirements are (manually translated into a global abstract formal specification. This translation is especially difficult to achieve. The Event-B method was developed to incrementally and formally construct such a specification using stepwise refinement. Each increment takes into account new properties and system aspects. In this paper, we propose to couple a graphical notation called Algebraic State-Transition Diagrams (ASTD with an Event-B specification in order to provide a better understanding of the software behaviour. The dynamic behaviour is captured by the ASTD, which is based on automata and process algebra operators, while the data model is described by means of an Event-B specification. We propose a methodology to incrementally refine such specification couplings, taking into account new refinement relations and consistency conditions between the control specification and the data specification. We compare the specifications obtained using each approach for readability and proof complexity. The advantages and drawbacks of the traditional approach and of our methodology are discussed. The whole process is illustrated by a railway CBTC-like case study. Our approach is supported by tools for translating ASTD's into B and Event-B into B.

  16. Composing Interfering Abstract Protocols

    Science.gov (United States)

    2016-04-01

    While protocol-based techniques to reason about interference abound, they do not address two practical concerns: the decidability of protocol …

  17. Computer network time synchronization the network time protocol

    CERN Document Server

    Mills, David L

    2006-01-01

    What started with the sundial has, thus far, been refined to a level of precision based on atomic resonance: Time. Our obsession with time is evident in this continued scaling down to nanosecond resolution and beyond. But this obsession is not without warrant. Precision and time synchronization are critical in many applications, such as air traffic control and stock trading, and pose complex and important challenges in modern information networks.Penned by David L. Mills, the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol
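The core on-wire calculation of NTP, which the book develops in detail, derives the clock offset and round-trip delay from four timestamps: client send (t1), server receive (t2), server send (t3), client receive (t4). A minimal sketch with illustrative numbers:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Clock offset and round-trip delay from the four NTP timestamps:
    t1 client send, t2 server receive, t3 server send, t4 client receive."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0  # estimated server-minus-client offset
    delay = (t4 - t1) - (t3 - t2)           # round-trip network delay
    return offset, delay

# Client clock 5 ms behind the server, symmetric 20 ms round trip (toy values)
offset, delay = ntp_offset_delay(100.0, 115.0, 116.0, 121.0)
print(offset, delay)
```

The offset estimate is exact only when the outbound and return network delays are symmetric, which is the key assumption NTP's algorithms work to mitigate.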

  18. A Formal Software Development Approach Using Refinement Calculus

    Institute of Scientific and Technical Information of China (English)

    WANG Yunfeng; PANG Jun; ZHA Ming; YANG Zhaohui; ZHENG Guoliang

    2001-01-01

    The advantage of COOZ (Complete Object-Oriented Z) is its ability to specify large-scale software, but it does not support refinement calculus, which confines its application in software development. Including refinement calculus in COOZ overcomes this disadvantage during design and implementation, and removes the separation between design and implementation in structure and notation, so that software can be developed smoothly within the same framework. The combination of COOZ and refinement calculus builds an object-oriented frame in which a COOZ specification is refined stepwise to code by calculus. In this paper, a development model based on COOZ and refinement calculus is established. Data refinement is harder to handle in a refinement tool than ordinary algorithmic refinement, since data refinement usually has to be done on a large program component at once. As to the implementation technology of refinement calculus, a data refinement calculator is constructed and an approach to data refinement based on data refinement calculus and program window inference is offered.

  19. Level 5: user refinement to aid the fusion process

    Science.gov (United States)

    Blasch, Erik P.; Plano, Susan

    2003-04-01

    The revised JDL fusion model's Level 4, process refinement, covers a broad spectrum of actions such as sensor management and control. A limitation of Level 4 is the ambiguity of the purpose of control: whether it serves user needs or system operation. Level 5, user refinement, is a modification to the revised JDL model that distinguishes between machine process refinement and user refinement. User refinement can be either human control actions or refinement of the user's cognitive model. In many cases, fusion research concentrates on the machine and does not take full advantage of the human, not only as a qualified expert to refine the fusion process, but also as the customer for whom the fusion system is designed. Without user refinement, sensor fusion is incomplete and inadequate, and the user may disregard its worth. To capture user capabilities, we explore the concept of user refinement through decision and action based on situational leadership models. We develop a Fuse-Act Situational User Refinement (FASUR) model that details four refinement behaviors: Neglect, Consult, Rely, and Interact, and five refinement functions: Planning, Organizing, Coordinating, Directing, and Controlling. Process refinement varies for different systems and different user information needs. By designing a fusion system with a specific user in mind, via Level 5, a fusion architecture can meet users' information needs in varying situations, extend user sensing capabilities for action, and increase human-machine interaction.

  20. Refined Dummy Atom Model of Mg(2+) by Simple Parameter Screening Strategy with Revised Experimental Solvation Free Energy.

    Science.gov (United States)

    Jiang, Yang; Zhang, Haiyang; Feng, Wei; Tan, Tianwei

    2015-12-28

    Metal ions play an important role in the catalysis of metalloenzymes. To investigate metalloenzymes via molecular modeling, a set of accurate force field parameters for metal ions is highly desirable. To extend its application range and improve its performance, the dummy atom model of metal ions was refined through a simple parameter screening strategy, using the Mg(2+) ion as an example. With the AMBER ff03 force field and the TIP3P water model, the refined model accurately reproduced the experimental geometric and thermodynamic properties of Mg(2+). Compared with point charge models and previous dummy atom models, the refined dummy atom model yields enhanced performance in producing reliable ATP/GTP-Mg(2+)-protein conformations in three metalloenzyme systems with single or double metal centers. Like other unbonded models, the refined model failed to reproduce the Mg-Mg distance and favored a monodentate binding of carboxylate groups; these drawbacks need to be considered with care. The outperformance of the refined model is mainly attributed to the use of a revised (more accurate) experimental solvation free energy and a suitable free energy correction protocol. This work provides a parameter screening strategy that can be readily applied to refine the dummy atom models of other metal ions.
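The screening idea itself is generic: score each candidate parameter set against target observables and keep the best. In the sketch below the `simulate` function is a smooth toy stand-in for an MD estimate of (solvation free energy, ion-oxygen distance), and every number is illustrative, not from the paper:

```python
import itertools

def screen(candidates, simulate, targets, weights):
    """Return the candidate minimizing the weighted relative deviation of
    simulated observables from their targets (generic screening loop)."""
    def score(params):
        obs = simulate(*params)
        return sum(w * abs((o - t) / t)
                   for o, t, w in zip(obs, targets, weights))
    return min(candidates, key=score)

def simulate(sigma, eps):
    # Hypothetical response surface, NOT a real force-field calculation:
    # returns (solvation free energy, ion-oxygen distance)
    dG = -400.0 - 50.0 * eps + 40.0 * (sigma - 1.0)
    r = 1.8 + 0.3 * sigma - 0.05 * eps
    return dG, r

grid = list(itertools.product([0.9, 1.0, 1.1], [0.5, 1.0, 1.5]))
best = screen(grid, simulate, targets=(-450.0, 2.1), weights=(1.0, 1.0))
print(best)
```

In a real refinement the inner `simulate` call is a full free-energy simulation, which is why keeping the candidate grid small matters.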

  1. Connective tissue diseases : Refining the classification criteria for primary Sjögren syndrome

    NARCIS (Netherlands)

    Vissink, Arjan; Bootsma, Hendrika

    2017-01-01

    The 2016 ACR-EULAR classification criteria for primary Sjögren syndrome, which are intended to facilitate uniform classification of patients for enrolment in clinical studies, combine features of previous criteria sets. These new criteria are a step in the right direction, but further refinement would be needed.

  2. Interactive visual exploration and refinement of cluster assignments.

    Science.gov (United States)

    Kern, Michael; Lex, Alexander; Gehlenborg, Nils; Johnson, Chris R

    2017-09-12

    With ever-increasing amounts of data produced in biology research, scientists are in need of efficient data analysis methods. Cluster analysis, combined with visualization of the results, is one such method that can be used to make sense of large data volumes. At the same time, cluster analysis is known to be imperfect and depends on the choice of algorithms, parameters, and distance measures. Most clustering algorithms don't properly account for ambiguity in the source data, as records are often assigned to discrete clusters, even if an assignment is unclear. While there are metrics and visualization techniques that allow analysts to compare clusterings or to judge cluster quality, there is no comprehensive method that allows analysts to evaluate, compare, and refine cluster assignments based on the source data, derived scores, and contextual data. In this paper, we introduce a method that explicitly visualizes the quality of cluster assignments, allows comparisons of clustering results and enables analysts to manually curate and refine cluster assignments. Our methods are applicable to matrix data clustered with partitional, hierarchical, and fuzzy clustering algorithms. Furthermore, we enable analysts to explore clustering results in context of other data, for example, to observe whether a clustering of genomic data results in a meaningful differentiation in phenotypes. Our methods are integrated into Caleydo StratomeX, a popular, web-based, disease subtype analysis tool. We show in a usage scenario that our approach can reveal ambiguities in cluster assignments and produce improved clusterings that better differentiate genotypes and phenotypes.
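One simple way to surface the ambiguity the authors visualize is a per-record assignment margin: the gap between a record's distances to its nearest and second-nearest cluster centroid. A small sketch (not Caleydo StratomeX code; the points and centroids are invented):

```python
def assignment_margin(point, centroids):
    """Gap between the distances to the nearest and second-nearest centroid;
    a small margin flags an ambiguous cluster assignment."""
    d = sorted(sum((p - c) ** 2 for p, c in zip(point, cen)) ** 0.5
               for cen in centroids)
    return d[1] - d[0]

centroids = [(0.0, 0.0), (4.0, 0.0)]
clear = assignment_margin((0.5, 0.0), centroids)      # close to one centroid
ambiguous = assignment_margin((2.0, 0.1), centroids)  # nearly equidistant
print(round(clear, 2), round(ambiguous, 2))
```

Records with margins near zero are exactly the ones an analyst would want to inspect and possibly reassign by hand.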

  3. Grain Refinement of Permanent Mold Cast Copper Base Alloys

    Energy Technology Data Exchange (ETDEWEB)

    M.Sadayappan; J.P.Thomson; M.Elboujdaini; G.Ping Gu; M. Sahoo

    2005-04-01

    Grain refinement is a well-established process for many cast and wrought alloys. The mechanical properties of various alloys can be enhanced by reducing the grain size. Refinement is also known to improve casting characteristics such as fluidity and hot tearing. Grain refinement of copper-base alloys is not widely practiced, especially in the sand casting process. In permanent mold casting of copper alloys, however, it is now common to use grain refinement to counteract the problem of severe hot tearing, which also improves the pressure tightness of plumbing components. The mechanism of grain refinement in copper-base alloys is not well understood. The issues to be studied include the effect of minor alloy additions on the microstructure, their interaction with the grain refiner, the effect of cooling rate, and the loss of grain refinement (fading). In this investigation, efforts were made to explore and understand grain refinement of copper alloys, especially under permanent mold casting conditions.

  4. The Analysis of the Refined Financial Management of Modern Enterprises

    OpenAIRE

    Li Ran

    2016-01-01

    This paper briefly introduces the concept of the refined financial management, elaborates on its characteristics and puts forward some main points about it. It also comes up with some personal suggestions for reference on effective ways of refining financial management.

  5. Effect of a low dose combined oral contraceptive pill on the hormonal profile and cycle outcome following COS with a GnRH antagonist protocol in women over 35 years old.

    Science.gov (United States)

    Bakas, Panagiotis; Hassiakos, Dimitrios; Grigoriadis, Charalampos; Vlahos, Nikolaos F; Liapis, Angelos; Creatsas, George

    2014-11-01

    This prospective study examines whether pre-treatment with two different doses of an oral contraceptive pill (OCP) significantly modifies the hormonal profile and/or the IVF/ICSI outcome following COS with a GnRH antagonist protocol. Infertile patients were allocated to receive either an OCP containing 0.03 mg of ethinylestradiol and 3 mg of drospirenone, or an OCP containing 0.02 mg of ethinylestradiol and 3 mg of drospirenone, prior to initiation of controlled ovarian stimulation (COS) with recombinant gonadotropins on a variable multi-dose antagonist protocol (Ganirelix), while the control group underwent COS without OCP pretreatment. The lower-dose OCP was associated with recovery of FSH on day 3 instead of day 5, but the synchronization of the follicular cohort, the number of retrieved oocytes and the clinical pregnancy rate were similar to those with the higher-dose OCP.

  6. IPv6 Protocol Analyzer

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the emergence of the next-generation Internet protocol (IPv6), it is expected to replace the current version of the Internet protocol (IPv4), whose address space will be exhausted in the near future. Besides providing adequate address space, the new 128-bit IP brings other new features such as address autoconfiguration, quality of service, simpler routing, security, mobility and multicasting. Current protocol analyzers cannot handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module that decodes an IPv6 packet and provides a detailed breakdown of its construction. It has to understand the detailed construction of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. It thus increases network administrators' understanding of the protocol and helps them solve protocol-related problems in an IPv6 network environment.
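The fixed 40-byte IPv6 header has a well-defined layout (RFC 8200: version, traffic class and flow label packed into the first 4 bytes, then payload length, next header, hop limit, and two 16-byte addresses), so the decoding step such an analyzer performs can be sketched directly; the sample packet values are invented:

```python
import struct

def decode_ipv6_header(packet: bytes):
    """Decode the 40-byte fixed IPv6 header (RFC 8200 layout)."""
    if len(packet) < 40:
        raise ValueError("IPv6 header is 40 bytes")
    vtf, payload_len, next_header, hop_limit = struct.unpack("!IHBB", packet[:8])
    return {
        "version": vtf >> 28,
        "traffic_class": (vtf >> 20) & 0xFF,
        "flow_label": vtf & 0xFFFFF,
        "payload_length": payload_len,
        "next_header": next_header,   # e.g. 6 = TCP, 17 = UDP
        "hop_limit": hop_limit,
        "src": packet[8:24].hex(),
        "dst": packet[24:40].hex(),
    }

# Minimal illustrative packet: version 6, 8-byte UDP payload, hop limit 64
hdr = struct.pack("!IHBB", (6 << 28) | 0x12345, 8, 17, 64) \
      + bytes(15) + b"\x01" + bytes(15) + b"\x02"
print(decode_ipv6_header(hdr))
```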

  7. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  8. Protocol for determining bull trout presence

    Science.gov (United States)

    Peterson, James; Dunham, Jason B.; Howell, Philip; Thurow, Russell; Bonar, Scott

    2002-01-01

    The Western Division of the American Fisheries Society was requested to develop protocols for determining presence/absence and potential habitat suitability for bull trout. The general approach adopted is similar to the process for the marbled murrelet, whereby interim guidelines are used initially and the protocols are subsequently refined as data are collected. Current data were considered inadequate to precisely identify suitable habitat but could be useful in stratifying sampling units for presence/absence surveys. The presence/absence protocol builds on previous approaches (Hillman and Platts 1993; Bonar et al. 1997), except that it uses the variation in observed bull trout densities instead of a minimum threshold density and adjusts for measured differences in sampling efficiency due to gear types and habitat characteristics. The protocol consists of: 1. recommended sample sizes with 80% and 95% detection probabilities for juvenile and resident adult bull trout for day and night snorkeling and electrofishing, adjusted for varying habitat characteristics, for 50 m and 100 m sampling units; 2. sampling design considerations, including possible habitat characteristics for stratification; 3. habitat variables to be measured in the sampling units; and 4. guidelines for training sampling crews. Criteria for habitat strata consist of coarse, watershed-scale characteristics (e.g., mean annual air temperature) and fine-scale, reach- and habitat-specific features (e.g., water temperature, channel width). The protocols will be revised in the future using data from ongoing presence/absence surveys, additional research on sampling efficiencies, and development of models of habitat/species occurrence.
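The link between per-unit detection probability and recommended sample sizes can be sketched with the standard 1 − (1 − p)^n argument, assuming independent sampling units; the per-unit probability below is illustrative, not a value from the protocol's tables:

```python
import math

def units_needed(p_detect, confidence):
    """Number of sampling units needed so that, if fish are present, the
    chance of at least one detection reaches the target confidence,
    assuming independent units with per-unit detection probability p."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_detect))

# Illustrative per-unit detection probability of 0.3 for the two
# detection targets the protocol tabulates (80% and 95%)
print(units_needed(0.3, 0.80), units_needed(0.3, 0.95))
```

This is why the protocol's sample sizes grow with the detection target and shrink when gear or habitat conditions raise the per-unit detection probability.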

  9. ATM and Internet protocol

    CERN Document Server

    Bentall, M; Turton, B

    1998-01-01

    Asynchronous Transfer Mode (ATM) is a protocol that allows data, sound and video transferred between independent networks via ISDN links to be supplied to, and interpreted by, the various system protocols. ATM and Internet Protocol explains the working of the ATM and B-ISDN network for readers with a basic understanding of telecommunications. It provides a handy reference for everyone working with ATM who may not require the full standards in detail but needs a comprehensive guide to ATM. A substantial section is devoted to the problems of running IP over ATM, and there is some discussion o

  10. Playing With Population Protocols

    Directory of Open Access Journals (Sweden)

    Xavier Koegler

    2009-06-01

    Full Text Available Population protocols have been introduced as a model of sensor networks consisting of very limited mobile agents with no control over their own movement: A collection of anonymous agents, modeled by finite automata, interact in pairs according to some rules. Predicates on the initial configurations that can be computed by such protocols have been characterized under several hypotheses. We discuss here whether and when the rules of interactions between agents can be seen as a game from game theory. We do so by discussing several basic protocols.
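A minimal population protocol in this model, the one-way epidemic computing the predicate "at least one agent starts in state 1", can be simulated with random pairwise interactions; the agent count and step budget below are arbitrary illustration choices:

```python
import random

def run_or_protocol(states, steps=10_000, seed=0):
    """Pairwise-interaction protocol computing 'some agent started in
    state 1': when a 1 meets a 0, the 0 adopts state 1."""
    rng = random.Random(seed)
    states = list(states)
    for _ in range(steps):
        i, j = rng.sample(range(len(states)), 2)  # random interacting pair
        if states[i] or states[j]:                # the interaction rule
            states[i] = states[j] = 1
    return states

final = run_or_protocol([1] + [0] * 19)
print(all(s == 1 for s in final))
```

The agents have no identities and no control over who they meet, which is exactly the sensor-network setting the model captures; the predicate is stable because state 1 can never be lost.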

  11. Linear Logical Voting Protocols

    DEFF Research Database (Denmark)

    DeYoung, Henry; Schürmann, Carsten

    2012-01-01

    Current approaches to electronic implementations of voting protocols involve translating legal text to source code of an imperative programming language. Because the gap between legal text and source code is very large, it is difficult to trust that the program meets its legal specification. In response, we promote linear logic as a high-level language for both specifying and implementing voting protocols. Our linear logical specifications of the single-winner first-past-the-post (SW-FPTP) and single transferable vote (STV) protocols demonstrate that this approach leads to concise implementations that closely correspond to their legal specification, thereby increasing trust.
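For contrast with the imperative encodings the authors criticize, even the simplest of the two protocols, SW-FPTP, is short when counted directly; a toy Python tally (the alphabetical tie-breaking here is invented purely for determinism, real statutes specify their own rules):

```python
from collections import Counter

def fptp_winner(ballots):
    """Single-winner first-past-the-post: the candidate with the most
    first-preference votes wins. Ties go to the alphabetically first
    candidate (an illustrative choice, not a legal rule)."""
    tally = Counter(ballot[0] for ballot in ballots)
    return max(sorted(tally), key=tally.get)

# Each ballot lists candidates in preference order; FPTP uses only the first
ballots = [["A", "B"], ["B", "A"], ["A", "C"], ["C", "A"], ["A", "B"]]
print(fptp_winner(ballots))
```

The gap the paper targets is precisely that such code, however short, does not visibly correspond to the legal text, whereas a linear logical specification can serve as both.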

  12. Refinement verification of the lazy caching algorithm

    NARCIS (Netherlands)

    Hesselink, Wim H.

    2006-01-01

    The lazy caching algorithm of Afek et al. (ACM Trans. Program. Lang. Syst. 15, 182-206, 1993) is a protocol that allows the use of local caches with delayed updates. It results in a memory model that is not atomic (linearizable) but only sequentially consistent as defined by Lamport. In Distributed

  13. Current State and Prospect of China's Refining Sector

    Institute of Scientific and Technical Information of China (English)

    Zhu He

    2012-01-01

    (1) China's refining capacity will reach 650 million tons per year; cooperation with oil-producing countries in the refining sector will be strengthened; and a diversified competitive framework for the sector will be further promoted.

  14. Query Refinement by Multi Word Term expansions and semantic synonymy

    CERN Document Server

    Lux-Pogodalla, Veronila

    2008-01-01

    We developed a system, TermWatch (https://stid-bdd.iut.univ-metz.fr/TermWatch/index.pl), which combines the linguistic extraction of terms and their structuring into a terminological network with a clustering algorithm. In this paper we explore its ability to integrate the most promising aspects of studies on query refinement: the choice of meaningful text units to cluster (domain terms), the choice of tight semantic relations with which to cluster terms, and the structuring of terms into a network enabling a better perception of domain concepts. We ran this experiment on the 367,645 English abstracts of the PASCAL 2005-2006 bibliographic database (http://www.inist.fr) and compared the structured terminological resource automatically built by TermWatch to the English segment of the TermScience resource (http://termsciences.inist.fr/), which contains 88,211 terms.

  15. The IVF Outcome Counseling Based on the Model Combining DHEAS and Age in Patients with Low AMH Prior to the First Cycle of GnRH Antagonist Protocol of Ovarian Stimulation

    Science.gov (United States)

    Alebić, Miro Šimun; Žuvić-Butorac, Marta

    2013-01-01

    Objective. To investigate the endocrine and/or clinical characteristics of women with low anti-Müllerian hormone (AMH) that could improve the accuracy of IVF outcome prediction based on the female age alone prior to the first GnRH antagonist IVF cycle. Methods. Medical records of 129 patients with low AMH level (5.7 pmol/L, 60% (9/15) of all pregnancies were achieved with CPR of 37.5%. Conclusions. DHEAS appears to be predictive for clinical pregnancy in younger women (<37.5 years) with low AMH after the first GnRH antagonist IVF cycle. Therefore, DHEAS-age model could refine the pretreatment counseling on pregnancy prospects following IVF. PMID:23509455

  16. Quantum Geometry of Refined Topological Strings

    CERN Document Server

    Aganagic, Mina; Dijkgraaf, Robbert; Krefl, Daniel; Vafa, Cumrun

    2011-01-01

    We consider branes in refined topological strings. We argue that their wave-functions satisfy a Schrödinger equation depending on multiple times and prove this in the case where the topological string has a dual matrix model description. Furthermore, in the limit where one of the equivariant rotations approaches zero, the brane partition function satisfies a time-independent Schrödinger equation. We use this observation, as well as the back reaction of the brane on the closed string geometry, to offer an explanation of the connection between integrable systems and N=2 gauge systems in four dimensions observed by Nekrasov and Shatashvili.

  17. Quantum geometry of refined topological strings

    Science.gov (United States)

    Aganagic, Mina; Cheng, Miranda C. N.; Dijkgraaf, Robbert; Krefl, Daniel; Vafa, Cumrun

    2012-11-01

    We consider branes in refined topological strings. We argue that their wavefunctions satisfy a Schrödinger equation depending on multiple times and prove this in the case where the topological string has a dual matrix model description. Furthermore, in the limit where one of the equivariant rotations approaches zero, the brane partition function satisfies a time-independent Schrödinger equation. We use this observation, as well as the back reaction of the brane on the closed string geometry, to offer an explanation of the connection between integrable systems and N=2 gauge systems in four dimensions observed by Nekrasov and Shatashvili.

  18. A refinement driven component-based design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter;

    2007-01-01

    Computer-aided software engineering (CASE) tools increasingly integrate sophisticated checkers, generators and transformations. A feasible approach to ensuring high quality of such add-ins is to base them on sound formal foundations. This paper summarizes our research on the Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process, and shows where they may be integrated in CASE tools for adding formally supported add-ins.

  19. Liquid rheology study on refined rapeseed oil

    Institute of Scientific and Technical Information of China (English)

    刘其梅; 罗迎社; 殷水平; 陈胜铭; 张党权; 彭万喜

    2008-01-01

    The rapeseed oil extracted from mature seeds was purified by refining, and a rheological analysis of viscosity versus dynamic shear rate was carried out over a series of temperatures. The results show that at 20, 40, 60 and 80 °C, as the shear rate gradually rises the torque increases accordingly, but the viscosity does not vary distinctly. This suggests that when rapeseed oil is used as a raw material for edible oil and industrial applications, working procedures at high temperature will not distinctly influence its rheological characteristics.

  20. Retrenchment. An engineering variation on refinement

    Energy Technology Data Exchange (ETDEWEB)

    Banach, R.; Poppleton, M

    1999-07-01

    It is argued that refinement, in which I/O signatures stay the same, preconditions are weakened and postconditions strengthened, is too restrictive to describe all but a fraction of many realistic developments. An alternative notion is proposed called retrenchment, which allows information to migrate between I/O and state aspects of operations at different levels of abstraction, and which allows only a fraction of the high level behaviour to be captured at the low level. This permits more of the informal aspects of design to be formally captured and checked. The details are worked out for the B-Method. (author)

  1. Surface biotechnology for refining cochlear implants.

    Science.gov (United States)

    Tan, Fei; Walshe, Peter; Viani, Laura; Al-Rubeai, Mohamed

    2013-12-01

    The advent of the cochlear implant is phenomenal because it is the first surgical prosthesis that is capable of restoring one of the senses. The subsequent rapid evolution of cochlear implants through increasing complexity and functionality has been synchronized with the recent advancements in biotechnology. Surface biotechnology has refined cochlear implants by directly influencing the implant–tissue interface. Emerging surface biotechnology strategies are exemplified by nanofibrous polymeric materials, topographical surface modification, conducting polymer coatings, and neurotrophin-eluting implants. Although these novel developments have received individual attention in the recent literature, the time has come to investigate their collective applications to cochlear implants to restore lost hearing.

  2. Adaptive refinement tools for tetrahedral unstructured grids

    Science.gov (United States)

    Pao, S. Paul (Inventor); Abdol-Hamid, Khaled S. (Inventor)

    2011-01-01

    An exemplary embodiment providing one or more improvements includes software which is robust, efficient, and has a very fast run time for user directed grid enrichment and flow solution adaptive grid refinement. All user selectable options (e.g., the choice of functions, the choice of thresholds, etc.), other than a pre-marked cell list, can be entered on the command line. The ease of application is an asset for flow physics research and preliminary design CFD analysis where fast grid modification is often needed to deal with unanticipated development of flow details.

  3. Computing Refined Buneman Trees in Cubic Time

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.; Östlin, A.

    2003-01-01

    Reconstructing the evolutionary tree for a set of n species based on pairwise distances between the species is a fundamental problem in bioinformatics. Neighbor joining is a popular distance based tree reconstruction method. It always proposes fully resolved binary trees despite missing evidence...... in the underlying distance data. Distance based methods based on the theory of Buneman trees and refined Buneman trees avoid this problem by only proposing evolutionary trees whose edges satisfy a number of constraints. These trees might not be fully resolved but there is strong combinatorial evidence for each...

  4. Acoustic Logging Modeling by Refined Biot's Equations

    Science.gov (United States)

    Plyushchenkov, Boris D.; Turchaninov, Victor I.

    An explicit uniform completely conservative finite difference scheme for the refined Biot's equations is proposed. This system is modified according to the modern theory of dynamic permeability and tortuosity in a fluid-saturated elastic porous media. The approximate local boundary transparency conditions are constructed. The acoustic logging device is simulated by the choice of appropriate boundary conditions on its external surface. This scheme and these conditions are satisfactory for exploring borehole acoustic problems in permeable formations in a real axial-symmetrical situation. The developed approach can be adapted for a nonsymmetric case also.

  5. Refining thick brane models via electroweak data

    CERN Document Server

    Chumbes, A E R; Hott, M B

    2015-01-01

    After discussing the localization of Abelian and non-Abelian gauge fields and Higgs fields on a thick brane, we introduce a procedure of dimensional reduction and its consequences for the rescaled parameters of the boson sector of the Standard Model. The parameters encode a power dependence on the (usually narrow) warp factor, and hence also depend on the position along the extra dimension inside the thick brane. In this vein, the observable parameters may be used to refine braneworld models via the brane thickness.

  6. Formal language theory: refining the Chomsky hierarchy.

    Science.gov (United States)

    Jäger, Gerhard; Rogers, James

    2012-07-19

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages).
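
    The distinction between the regular and context-free levels reviewed above can be made concrete with a toy recognizer for each. This is an illustrative sketch, not taken from the article: a two-state finite automaton suffices for the regular language (ab)*, while {a^n b^n} requires an unbounded counter, which no finite automaton can provide.

```python
def dfa_ab_star(s):
    """Finite automaton for the regular language (ab)*: two states, no counter."""
    state = 0
    for ch in s:
        if state == 0 and ch == 'a':
            state = 1
        elif state == 1 and ch == 'b':
            state = 0
        else:
            return False
    return state == 0


def anbn(s):
    """Recognizer for {a^n b^n}: needs unbounded memory (a counter), so no
    finite automaton can accept exactly this language."""
    count = 0
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:
                return False
            count += 1
        elif ch == 'b':
            seen_b = True
            count -= 1
            if count < 0:
                return False
        else:
            return False
    return count == 0
```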

  7. A Technical Refinement of the Vernier Optometer.

    Science.gov (United States)

    1988-07-01

    [Scanned report documentation form; only fragments are recoverable.] Naval Aerospace Medical Research Laboratory report: A Technical Refinement of the Vernier Optometer. Personal authors: William B. Cushman and Leonard A. Temme. Subject terms: vernier optometer, laser.

  8. Refining the asteroid taxonomy by polarimetric observations

    Science.gov (United States)

    Belskaya, I. N.; Fornasier, S.; Tozzi, G. P.; Gil-Hutton, R.; Cellino, A.; Antonyuk, K.; Krugly, Yu. N.; Dovgopol, A. N.; Faggi, S.

    2017-03-01

    We present new results of polarimetric observations of 15 main belt asteroids of different composition. By merging new and published data we determined polarimetric parameters characterizing individual asteroids and mean values of the same parameters characterizing different taxonomic classes. The majority of asteroids show polarimetric phase curves close to the average curve of the corresponding class. We show that using polarimetric data it is possible to refine asteroid taxonomy and derive a polarimetric classification for 283 main belt asteroids. Polarimetric observations of asteroid (21) Lutetia are found to exhibit possible variations of the position angle of the polarization plane over the surface.

  9. Relativistic MHD with adaptive mesh refinement

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Matthew [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, LA 70803-4001 (United States); Hirschmann, Eric W [Department of Physics and Astronomy, Brigham Young University, Provo, UT 84602 (United States); Liebling, Steven L [Department of Physics, Long Island University-C W Post Campus, Brookville, NY 11548 (United States); Neilsen, David [Department of Physics and Astronomy, Brigham Young University, Provo, UT 84602 (United States)

    2006-11-22

    This paper presents a new computer code to solve the general relativistic magnetohydrodynamics (GRMHD) equations using distributed parallel adaptive mesh refinement (AMR). The fluid equations are solved using a finite difference convex ENO method (CENO) in 3 + 1 dimensions, and the AMR is Berger-Oliger. Hyperbolic divergence cleaning is used to control the {nabla} . B = 0 constraint. We present results from three flat space tests, and examine the accretion of a fluid onto a Schwarzschild black hole, reproducing the Michel solution. The AMR simulations substantially improve performance while reproducing the resolution equivalent unigrid simulation results. Finally, we discuss strong scaling results for parallel unigrid and AMR runs.

  10. Unitary appreciative inquiry: evolution and refinement.

    Science.gov (United States)

    Cowling, W Richard; Repede, Elizabeth

    2010-01-01

    Unitary appreciative inquiry (UAI), developed over the past 20 years, provides an orientation and process for uncovering human wholeness and discovering life patterning in individuals and groups. Refinements and a description of studies using UAI are presented. Assumptions and conceptual underpinnings of the method distinguishing its contributions from other methods are reported. Data generation strategies that capture human wholeness and elucidate life patterning are proposed. Data synopsis as an alternative to analysis is clarified and explicated. Standards that suggest enhancing the legitimacy of knowledge and credibility of research are specified. Potential expansions of UAI offer possibilities for extending epistemologies, aesthetic integration, and theory development.

  11. Development of an acquisition protocol and a segmentation algorithm for wounds of cutaneous Leishmaniasis in digital images

    Science.gov (United States)

    Diaz, Kristians; Castañeda, Benjamín; Miranda, César; Lavarello, Roberto; Llanos, Alejandro

    2010-03-01

    We developed a protocol for the acquisition of digital images and an algorithm for a color-based automatic segmentation of cutaneous lesions of Leishmaniasis. The protocol for image acquisition provides control over the working environment to manipulate brightness, lighting and undesirable shadows on the injury using indirect lighting. Also, this protocol was used to accurately calculate the area of the lesion expressed in mm2 even in curved surfaces by combining the information from two consecutive images. Different color spaces were analyzed and compared using ROC curves in order to determine the color layer with the highest contrast between the background and the wound. The proposed algorithm is composed of three stages: (1) Location of the wound determined by threshold and mathematical morphology techniques to the H layer of the HSV color space, (2) Determination of the boundaries of the wound by analyzing the color characteristics in the YIQ space based on masks (for the wound and the background) estimated from the first stage, and (3) Refinement of the calculations obtained on the previous stages by using the discrete dynamic contours algorithm. The segmented regions obtained with the algorithm were compared with manual segmentations made by a medical specialist. Broadly speaking, our results support that color provides useful information during segmentation and measurement of wounds of cutaneous Leishmaniasis. Results from ten images showed 99% specificity, 89% sensitivity, and 98% accuracy.
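
    Stage 1 of the algorithm above (thresholding the H layer, then applying mathematical morphology) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the function names, the threshold values, and the use of a simple 3x3 opening are assumptions.

```python
import numpy as np

def threshold_hue(h, lo, hi):
    """Binary mask of pixels whose hue falls in [lo, hi]."""
    return (h >= lo) & (h <= hi)

def erode(mask):
    """3x3 binary erosion via shifted logical ANDs."""
    m = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            m &= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return m

def dilate(mask):
    """3x3 binary dilation via shifted logical ORs."""
    m = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            m |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return m

def locate_wound(hue, lo=0.9, hi=1.0):
    """Stage 1 sketch: threshold the H layer, then open (erode + dilate)
    to remove isolated specks while keeping the lesion blob."""
    return dilate(erode(threshold_hue(hue, lo, hi)))
```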

  12. 1996 : Track Count Protocol

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The goal of St. Vincent National Wildlife Refuge's Track Count Protocol is to provide an index to the population size of game animals inhabiting St. Vincent Island.

  13. Quantum deniable authentication protocol

    Science.gov (United States)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2014-07-01

    Previously proposed quantum identity authentication schemes involve only authentication between two communicators, but communication with deniability is often desired in electronic applications such as online negotiation and electronic voting. In this paper, we propose a quantum deniable authentication protocol. Based on the properties of unitary transformations and a quantum one-way function, this protocol ensures that only the specified receiver can identify the true source of a given message, and that the specified receiver cannot prove that source to a third party, via a transcript simulation algorithm. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of the scheme. Security analysis shows that this protocol satisfies the basic security requirements of deniable authentication, namely completeness and deniability, and can withstand forgery, impersonation, and intercept-resend attacks.

  14. Unconditionally Secure Protocols

    DEFF Research Database (Denmark)

    Meldgaard, Sigurd Torkel

    This thesis contains research on the theory of secure multi-party computation (MPC), especially information theoretically (as opposed to computationally) secure protocols. It contains results from two main lines of work. One line is on Information Theoretically Secure Oblivious RAMs, and how they are used to speed up the secure computation. An Oblivious RAM is a construction for a client with a small $O(1)$ internal memory to store $N$ pieces of data on a server while revealing nothing more than the size of the memory $N$ and the number of accesses; this specifically includes hiding the access pattern. The other line looks at the communication complexity of protocols in this model, and at perfectly secure protocols. We show general protocols for any finite functionality with statistical security and optimal communication complexity (but an exponential amount of preprocessing). And for two...
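
    The access-pattern-hiding property of an Oblivious RAM can be illustrated with the trivial linear-scan construction, a textbook baseline rather than the thesis' scheme: every access touches all N cells, so the pattern observed by the server is independent of the queried index, at O(N) cost per access.

```python
class TrivialORAM:
    """Minimal linear-scan ORAM sketch (illustrative baseline, not from the
    thesis). Every read or write scans all N cells, so the server-visible
    access pattern reveals only N and the number of accesses."""

    def __init__(self, n):
        self.cells = [None] * n
        self.touches = []                   # what the 'server' observes

    def access(self, op, index, value=None):
        result = None
        for i in range(len(self.cells)):    # scan everything, always
            self.touches.append(i)
            if i == index:
                result = self.cells[i]
                if op == 'write':
                    self.cells[i] = value
        return result
```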

  15. The Extended Clustering Ad Hoc Routing Protocol (ECRP)

    Directory of Open Access Journals (Sweden)

    Buthayna Al-Sharaa

    2013-06-01

    Full Text Available Ad hoc networks are a collection of mobile nodes communicating via wireless channels without any fixed infrastructure. Because of their ease and low cost of deployment, ad hoc networks have many attractive applications in different fields. The topology of an ad hoc network changes dynamically, and each node in the network can act as a host or a router. With the increase in the number of wireless devices and the large amount of traffic to be exchanged, the demand for scalable routing protocols has grown. This paper presents a scalable routing protocol, based on the AODV protocol, called the Extended Clustering Ad Hoc Routing Protocol (ECRP). It is a hybrid protocol, combining reactive and proactive approaches to routing, and it uses the Global Positioning System to determine the position of certain nodes in the network. The evaluation methodology and simulation results obtained show that the protocol is efficient and scales well in large networks.

  16. A Multiple Access Protocol for Multimedia Transmission over Wireless Networks

    CERN Document Server

    Yu, Hong

    2012-01-01

    This paper develops and evaluates the performance of an advanced multiple access protocol for transmission of a full complement of multimedia signals, consisting of various combinations of voice, video, data, text and images, over wireless networks. The protocol is called Advanced Multiple Access Protocol for Multimedia Transmission (AMAPMT) and is to be used in the data link layer of the protocol stack. The principle of operation of the protocol is presented in a number of logical flow charts. The protocol grants permission to transmit to a source on the basis of a priority scheme that takes into account a time-to-live (TTL) parameter of all the transactions, selectable priorities assigned to all the sources, and relevant channel state information (CSI), in this order. Performance of the protocol is evaluated in terms of quality of service parameters like packet loss ratio (PLR), mean packet transfer delay (MPTD) and throughput. Using a simulation model based on an OPNET simulation software package does the evalua...
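
    The stated grant rule, TTL first, then source priority, then channel state, amounts to a lexicographic ordering of pending transactions. A minimal sketch follows; the class and field names are assumptions for illustration, not part of AMAPMT.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    ttl: float        # time-to-live remaining (smaller = more urgent)
    priority: int     # assigned source priority (smaller = higher priority)
    channel_q: float  # channel state indicator (larger = better channel)

def grant_order(pending):
    """Order transmission grants lexicographically: TTL, then source
    priority, then channel state, matching the order stated in the abstract."""
    return sorted(pending, key=lambda t: (t.ttl, t.priority, -t.channel_q))
```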

  17. USA-USSR protocol

    CERN Multimedia

    1970-01-01

    On 30 November the USA Atomic Energy Commission and the USSR State Committee for the Utilization of Atomic Energy signed, in Washington, a protocol 'on carrying out of joint projects in the field of high energy physics at the accelerators of the National Accelerator Laboratory (Batavia) and the Institute for High Energy Physics (Serpukhov)'. The protocol will be in force for five years and can be extended by mutual agreement.

  18. Cognitive Protocol Stack Design

    Science.gov (United States)

    2015-12-30

    [Fragments of a scanned report documentation form; only partial text is recoverable.] In the ARO "Cognitive Protocol Stack Design" project we proposed cognitive networking solutions published in international venues. The recoverable fragments mention data not directly related to the protocol stack (e.g., environmental or positioning data) that can be exploited to design and test novel cognitive networking solutions; the challenge of guaranteeing quality of service (QoS); and 5G technologies being developed to answer the need for further increasing network capacity.

  19. Grain refinement of AZ31 magnesium alloy by Al-Ti-C-Y master alloy

    Institute of Scientific and Technical Information of China (English)

    XU Chunxiang; LU Binfeng; L(U) Zhengling; LIANG Wei

    2008-01-01

    Al-Ti-C-Y master alloy was prepared by combining SHS technique and melting-casting method. The microstructure of master alloy and its grain-refining effect on AZ31 alloy were investigated by means of OM, XRD, SEM and EDS. Experimental results indicated that the prepared master alloy consisted of α-Al, TiAl3, TiC and Al3Y phases, and exhibited good grain-refining performance of AZ31 alloy. Morphology of α-Mg changed from coarse dendritic to fine equiaxed and the average grain size of α-Mg matrix reduced from the original 580 to 170 μm after adding 1.0 wt.% master alloy. The grain refining efficiency of Al-Ti-C-Y master alloy on AZ31 alloy was mainly attributed to heterogeneous nucleation of TiC particles and grain growth restriction of Al-Y compound or TiC at grain boundaries.

  20. GPU-Accelerated Asynchronous Error Correction for Mixed Precision Iterative Refinement

    Energy Technology Data Exchange (ETDEWEB)

    Antz, Hartwig [Karlsruhe Inst. of Technology (KIT) (Germany); Luszczek, Piotr [Univ. of Tennessee, Knoxville, TN (United States); Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Univ. of Manchester (United Kingdom); Heuveline, Vinent [Karlsruhe Inst. of Technology (KIT) (Germany)

    2011-12-14

    In hardware-aware high performance computing, block-asynchronous iteration and mixed precision iterative refinement are two techniques that are applied to leverage the computing power of SIMD accelerators like GPUs. Although they use a very different approach for this purpose, they share the basic idea of compensating the convergence behaviour of an inferior numerical algorithm by a more efficient usage of the provided computing power. In this paper, we want to analyze the potential of combining both techniques. Therefore, we implement a mixed precision iterative refinement algorithm using a block-asynchronous iteration as an error correction solver, and compare its performance with a pure implementation of a block-asynchronous iteration and an iterative refinement method using double precision for the error correction solver. For matrices from the University of Florida Matrix Collection, we report the convergence behaviour and provide the total solver runtime using different GPU architectures.
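
    The core of mixed precision iterative refinement can be sketched without the GPU or the block-asynchronous solver. A minimal CPU illustration follows, assuming a direct float32 solve stands in for the paper's error correction solver: the inner solves run in low precision, residuals are accumulated in double precision, and the correction loop recovers double-precision accuracy.

```python
import numpy as np

def mixed_precision_refine(A, b, iters=5):
    """Sketch of mixed precision iterative refinement: inner solves in
    float32 (standing in for the accelerator's fast low precision), with
    residual computation and accumulation in float64."""
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                    # double precision residual
        c = np.linalg.solve(A32, r.astype(np.float32))   # low precision correction
        x += c.astype(np.float64)                        # error correction step
    return x
```

    For a well-conditioned system, each correction step shrinks the error by roughly the factor lost to single precision, so a handful of iterations reaches full double-precision accuracy.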

  1. Visual Deprivation Causes Refinement of Intracortical Circuits in the Auditory Cortex

    Directory of Open Access Journals (Sweden)

    Xiangying Meng

    2015-08-01

    Full Text Available Loss of a sensory modality can lead to functional enhancement of the remaining senses. For example, short-term visual deprivations, or dark exposure (DE, can enhance neuronal responses in the auditory cortex to sounds. These enhancements encompass increased spiking rates and frequency selectivity as well as increased spiking reliability. Although we previously demonstrated enhanced thalamocortical transmission after DE, increased synaptic strength cannot account for increased frequency selectivity or reliability. We thus investigated whether other changes in the underlying circuitry contributed to improved neuronal responses. We show that DE can lead to refinement of intra- and inter-laminar connections in the mouse auditory cortex. Moreover, we use a computational model to show that the combination of increased transmission and circuit refinement can lead to increased firing reliability. Thus cross-modal influences can alter the spectral and temporal processing of sensory stimuli by refinement of thalamocortical and intracortical circuits.

  2. A newly designed radiation therapy protocol in combination with prednisolone as treatment for meningoencephalitis of unknown origin in dogs: a prospective pilot study introducing magnetic resonance spectroscopy as monitor tool.

    Science.gov (United States)

    Beckmann, Katrin; Carrera, Inés; Steffen, Frank; Golini, Lorenzo; Kircher, Patrick R; Schneider, Uwe; Bley, Carla Rohrer

    2015-01-31

    A plethora of treatment options have been described for canine meningoencephalitis of unknown origin (MUO), yet a gold standard has not been established. The aim of this prospective pilot study was to document the effect of a newly designed 30 Gray (Gy) radiation therapy (RT) protocol plus corticosteroids as treatment for focal and multifocal MUO, to monitor clinical and imaging changes during the course of the disease with conventional magnetic resonance imaging (MRI) and proton MR spectroscopy (H-1 MRS), and to detect the occurrence of radiation-related side effects. Six dogs (3 with focal and 3 with multifocal lesions) were included in the study. The RT protocol consisted of 30 Gy in 10 fractions. The neurological status of all six dogs improved during RT, with 3 of 6 cases returning to a normal condition. One dog was euthanized early during follow-up (<3 weeks after the end of RT). Three-month follow-up MRI was normal in one dog and improved in 3 dogs, and H-1 MRS normalized in 4. In the dog without improvement of the MRI lesions, N-acetyl aspartate continued to decrease, while choline and creatine concentrations remained stable during that time. This dog was euthanized 18 months after the end of RT due to relapse. One dog was lost to follow-up 12 months after completion of RT. The other 3 dogs were still alive at the time of writing. RT with 30 Gy in 10 fractions can provide an additional option for anti-inflammatory treatment of focal and multifocal MUO. The protocol used for treatment monitoring was feasible, and no side effects of RT were observed during the follow-up period. Moreover, H-1 MRS could represent a new and non-invasive tool to monitor the progression of the disease during the treatment course.

  3. Addiction therapy. Refining deep brain stimulation to emulate optogenetic treatment of synaptic pathology.

    Science.gov (United States)

    Creed, Meaghan; Pascoli, Vincent Jean; Lüscher, Christian

    2015-02-06

    Circuit remodeling driven by pathological forms of synaptic plasticity underlies several psychiatric diseases, including addiction. Deep brain stimulation (DBS) has been applied to treat a number of neurological and psychiatric conditions, although its effects are transient and mediated by largely unknown mechanisms. Recently, optogenetic protocols that restore normal transmission at identified synapses in mice have provided proof of the idea that cocaine-adaptive behavior can be reversed in vivo. The most efficient protocol relies on the activation of metabotropic glutamate receptors, mGluRs, which depotentiates excitatory synaptic inputs onto dopamine D1 receptor medium-sized spiny neurons and normalizes drug-adaptive behavior. We discovered that acute low-frequency DBS, refined by selective blockade of dopamine D1 receptors, mimics optogenetic mGluR-dependent normalization of synaptic transmission. Consequently, there was a long-lasting abolishment of behavioral sensitization. Copyright © 2015, American Association for the Advancement of Science.

  4. Neurofeedback as a Treatment for Impulsivity in a Forensic Psychiatric Population With Substance Use Disorder: Study Protocol of a Randomized Controlled Trial Combined With an N-of-1 Clinical Trial.

    Science.gov (United States)

    Fielenbach, Sandra; Donkers, Franc Cl; Spreen, Marinus; Bogaerts, Stefan

    2017-01-25

    Impulsivity and substance use disorder (SUD) are strongly interconnected: persons scoring high on impulsivity are more vulnerable to developing substance abuse, face more challenges in successful treatment, and are more prone to engage in criminal behavior. Studies have shown that impulsivity and craving for substances are strongly correlated. Neurofeedback is an effective treatment for reducing impulsive behavior. This study intends to determine to what extent a neurofeedback intervention aimed at reducing impulsivity can also reduce levels of craving in forensic patients with SUD and comorbid Axis I and/or II diagnoses. The main objective of this study is to investigate to what extent a reduction in impulsivity by a sensorimotor rhythm (SMR) neurofeedback intervention will lead to a reduction in craving in a population of forensic psychiatric patients with a diagnosis of SUD. Participants will be male SUD patients with various comorbidities residing in an inpatient forensic treatment facility, approached through treatment supervisors for participation. Participants have tested positive for drug use in the past 24 months. The study consists of 2 parts: a randomized controlled trial (RCT) and an n-of-1 clinical series. In the RCT, 50 patients will be randomly assigned to an intervention (n=25) or a control (n=25) condition. Patients in the intervention group will receive 20 SMR neurofeedback sessions aimed at reducing impulsivity; participants in the control group receive treatment as usual (TAU). Additionally, 4 in-depth n-of-1 clinical trials will be conducted in which the effects of an SMR neurofeedback intervention will be compared to those of sham neurofeedback. Results of this study are expected by the end of 2017. This protocol describes the design of a study testing the effects of an impulsivity-based neurofeedback protocol among forensic patients with SUD and various comorbidities. We expect a significant reduction in impulsive behavior, level of

  5. Refining Automatically Extracted Knowledge Bases Using Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Chunhua Li

    2017-01-01

    Full Text Available Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.
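
    The use of semantic constraints to detect potential errors and select the most beneficial questions can be sketched as follows. This is a toy illustration assuming triple-structured facts and a functional-predicate constraint (at most one object per subject for certain predicates); it is not the paper's rank-based or graph-based algorithm.

```python
from collections import defaultdict

def conflicts(facts):
    """Detect violations of a functional-predicate constraint: facts are
    (subject, predicate, object) triples, and a subject may have at most one
    object for a functional predicate. Returns the conflicting groups."""
    groups = defaultdict(set)
    for s, p, o in facts:
        groups[(s, p)].add(o)
    return {k: v for k, v in groups.items() if len(v) > 1}

def rank_for_crowd(facts):
    """Select questions for crowdsourcing: ask about the (subject, predicate)
    pairs with the most conflicting candidate objects first, pruning
    unconflicted facts entirely."""
    c = conflicts(facts)
    return sorted(c, key=lambda k: len(c[k]), reverse=True)
```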

  6. Multiuser Successive Refinement and Multiple Description Coding

    CERN Document Server

    Tian, Chao; Diggavi, Suhas

    2007-01-01

    We consider the multiuser successive refinement (MSR) problem, where the users are connected to a central server via links with different noiseless capacities, and each user wishes to reconstruct in a successive-refinement fashion. An achievable region is given for the two-user two-layer case and it provides the complete rate-distortion region for the Gaussian source under the MSE distortion measure. The key observation is that this problem includes the multiple description (MD) problem (with two descriptions) as a subsystem, and the techniques useful in the MD problem can be extended to this case. We show that the coding scheme based on the universality of random binning is sub-optimal, because multiple Gaussian side informations only at the decoders do incur performance loss, in contrast to the case of single side information at the decoder. We further show that unlike the single user case, when there are multiple users, the loss of performance by a multistage coding approach can be unbounded for the Gaussi...
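
    The successive-refinement idea, where a coarse base layer is later improved by a refinement layer, can be illustrated with a two-stage uniform quantizer. This is a toy sketch, not the paper's coding scheme: the step sizes are arbitrary assumptions.

```python
import numpy as np

def uniform_quantize(x, step):
    """Midtread uniform scalar quantizer with the given step size."""
    return step * np.round(x / step)

def successive_refine(x, coarse_step=1.0, fine_step=0.25):
    """Sketch of successive refinement: a base layer quantizes the source
    coarsely; a refinement layer quantizes the base-layer residual, so a
    user receiving both layers reconstructs with lower distortion."""
    base = uniform_quantize(x, coarse_step)
    refinement = uniform_quantize(x - base, fine_step)
    return base, base + refinement
```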

  7. DSR-Based Selective Repeat ARQ Protocol in MANET

    Institute of Scientific and Technical Information of China (English)

    张全新; 宋瀚涛

    2003-01-01

    Efficient routing algorithms for mobile ad hoc networks (MANETs) are studied. After analyzing the traditional dynamic source routing (DSR) protocol and the selective repeat ARQ protocol in detail, a scheme combining the two is put forward. In networks, and especially in wireless networks, nodes can process data much faster than they can transmit it, so the DSR-based selective repeat ARQ protocol has practical significance in MANETs.
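
    The selective repeat ARQ component can be sketched as follows: each frame in the window is acknowledged individually, and only unacknowledged frames are retransmitted. This is a minimal illustration with an assumed loss model (frames in `lose_first_try` are lost on their first transmission only); actual frame transfer and ACK signalling are abstracted away.

```python
def selective_repeat(n_frames, lose_first_try, window=4):
    """Count total transmissions needed to deliver n_frames frames under
    selective repeat ARQ: the window slides to the first unacked frame, and
    only frames that were lost are resent."""
    acked = [False] * n_frames
    transmissions = 0
    attempt = 0
    while not all(acked):
        base = acked.index(False)            # window starts at first unacked frame
        for i in range(base, min(base + window, n_frames)):
            if not acked[i]:
                transmissions += 1
                if not (attempt == 0 and i in lose_first_try):
                    acked[i] = True          # individual ACK for frame i
        attempt += 1
    return transmissions
```

    With no losses, exactly one transmission per frame is needed; each lost frame costs exactly one retransmission, in contrast to go-back-N, which would resend the whole window.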

  8. 40 CFR 80.235 - How does a refiner obtain approval as a small refiner?

    Science.gov (United States)

    2010-07-01

    ... small refiner status, plus any subsidiary companies, any parent company and subsidiaries of the parent company, and any joint venture partners: (1)(i) A listing of the name and address of each location where... corporate contact person. (d) For joint ventures, the total number of employees includes the...

  9. Quantum Key Distribution Protocol with Private-Public Key

    OpenAIRE

    Serna, Eduin H.

    2009-01-01

    A quantum cryptographic protocol based on a combination of public key and private key cryptography is presented. Unlike the BB84 protocol [1] and its many variants [2,3], two quantum channels are used. The present scheme does not require information reconciliation mechanisms to derive the key. Three related key distribution systems are described.

  10. A Dialogue Game Protocol for Agent Purchase Negotiations

    NARCIS (Netherlands)

    McBurney, P.; Eijk, R.M. van; Parsons, S.; Amgoud, L.

    2003-01-01

    We propose a dialogue game protocol for purchase negotiation dialogues which identifies appropriate speech acts, defines constraints on their utterances, and specifies the different sub-tasks agents need to perform in order to engage in dialogues according to this protocol. Our formalism combines a

  12. Cochleotoxicity monitoring protocol.

    Science.gov (United States)

    Ferreira Penêda, José; Barros Lima, Nuno; Ribeiro, Leandro; Helena, Diamantino; Domingues, Bruno; Condé, Artur

    2017-05-10

    Cochlear damage is frequent in long-term aminoglycosides therapy or chemotherapeutic treatments with platinum-based agents. Despite its prevalence, it is currently underestimated and underdiagnosed. A monitoring protocol is vital to the early detection of cochleotoxicity and its implementation is widely encouraged in every hospital unit. Our aim was to elaborate a cochleotoxicity monitoring protocol for patients treated with platinum compounds or aminoglycosides antibiotics. PubMed® database was searched using terms relevant to drug cochleotoxicity in order to identify the most adequate protocol. Several articles and guidelines influenced our decision. There is no consensus on a universal monitoring protocol. Its formulation and application rely heavily on available resources and personnel. High-frequency audiometry and otoacoustic emissions play an important role on early detection of cochleotoxicity caused by aminoglycoside antibiotics and platinum compounds. A cochleotoxicity monitoring protocol consisting on an initial evaluation, treatment follow-up and post-treatment evaluation is proposed. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Otorrinolaringología y Cirugía de Cabeza y Cuello. All rights reserved.

  13. GRChombo: Numerical relativity with adaptive mesh refinement

    Science.gov (United States)

    Clough, Katy; Figueras, Pau; Finkel, Hal; Kunesch, Markus; Lim, Eugene A.; Tunyasuvunakool, Saran

    2015-12-01

    In this work, we introduce GRChombo: a new numerical relativity code which incorporates full adaptive mesh refinement (AMR) using block structured Berger-Rigoutsos grid generation. The code supports non-trivial ‘many-boxes-in-many-boxes’ mesh hierarchies and massive parallelism through the message passing interface. GRChombo evolves the Einstein equation using the standard BSSN formalism, with an option to turn on CCZ4 constraint damping if required. The AMR capability permits the study of a range of new physics which has previously been computationally infeasible in a full 3 + 1 setting, while also significantly simplifying the process of setting up the mesh for these problems. We show that GRChombo can stably and accurately evolve standard spacetimes such as binary black hole mergers and scalar collapses into black holes, demonstrate the performance characteristics of our code, and discuss various physics problems which stand to benefit from the AMR technique.

  14. Formal language theory: refining the Chomsky hierarchy

    Science.gov (United States)

    Jäger, Gerhard; Rogers, James

    2012-01-01

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena of natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, both relevant to current research in cognitive science: the mildly context-sensitive languages (which are located between the context-free and context-sensitive languages) and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages). PMID:22688632

  15. Algorithm refinement for stochastic partial differential equations.

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, F. J. (Francis J.); Garcia, Alejandro L.,; Tartakovsky, D. M. (Daniel M.)

    2001-01-01

    A hybrid particle/continuum algorithm is formulated for Fickian diffusion in the fluctuating hydrodynamic limit. The particles are taken as independent random walkers; the fluctuating diffusion equation is solved by finite differences with deterministic and white-noise fluxes. At the interface between the particle and continuum computations the coupling is by flux matching, giving exact mass conservation. This methodology is an extension of Adaptive Mesh and Algorithm Refinement to stochastic partial differential equations. A variety of numerical experiments were performed for both steady and time-dependent scenarios. In all cases the mean and variance of density are captured correctly by the stochastic hybrid algorithm. For a non-stochastic version (i.e., using only deterministic continuum fluxes) the mean density is correct, but the variance is reduced except within the particle region, far from the interface. Extensions of the methodology to fluid mechanics applications are discussed.
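    The two ingredients of such a hybrid scheme can be sketched in a few lines: independent random walkers on one side and an explicit finite-difference solution of the diffusion equation on the other, both started from the same point mass. This is an illustrative sketch only; the flux-matching coupling at the particle/continuum interface and the white-noise continuum fluxes of the full algorithm are omitted, and all parameter values are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
D, dt, dx, steps = 1.0, 0.001, 0.05, 200

# Particle description: independent random walkers (Fickian diffusion).
x = np.zeros(10_000)                       # all walkers start at the origin
for _ in range(steps):
    x += rng.normal(0.0, np.sqrt(2 * D * dt), size=x.size)

# Continuum description: explicit finite differences on the diffusion
# equation, starting from the same unit point mass at the origin.
grid = np.arange(-3, 3, dx)
rho = np.zeros(grid.size)
rho[grid.size // 2] = 1.0 / dx             # discrete delta function, unit mass
lam = D * dt / dx**2                       # 0.4 here, below the 0.5 stability limit
for _ in range(steps):
    rho[1:-1] += lam * (rho[2:] - 2 * rho[1:-1] + rho[:-2])

# Both descriptions approach a Gaussian of variance 2*D*t.
t = steps * dt
print(x.var(), 2 * D * t)                  # walker variance vs. theory
```

    In the hybrid algorithm of the abstract, the two descriptions would occupy adjacent subdomains and exchange mass through matched fluxes at the interface.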

  16. Performance of a streaming mesh refinement algorithm.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, David C.; Pebay, Philippe Pierre

    2004-08-01

    In SAND report 2004-1617, we outline a method for edge-based tetrahedral subdivision that does not rely on saving state or communication to produce compatible tetrahedralizations. This report analyzes the performance of the technique by characterizing (a) mesh quality, (b) execution time, and (c) traits of the algorithm that could affect quality or execution time differently for different meshes. It also details the method used to debug the several hundred subdivision templates that the algorithm relies upon. Mesh quality is on par with other similar refinement schemes, and throughput on modern hardware can exceed 600,000 output tetrahedra per second; the report examines the algorithm's traits in detail.

  17. Refinement and stability of grain structure

    Energy Technology Data Exchange (ETDEWEB)

    John Humphreys, F.; Bate, P.S. [Manchester Univ. (United Kingdom). Manchester Materials Science Centre

    2001-07-01

    The various methods of producing fine-grained alloys are discussed and it is concluded that thermomechanical processing routes are most suitable for the economic production of the large quantities of material required for structural applications. The limits of grain refinement by conventional discontinuous recrystallization are considered, and the production of micron-grained alloys by continuous recrystallization processes during or after large strain deformation is discussed. The stability of highly deformed microstructures against recrystallization is analysed, and the effect of second-phase particles on grain growth is discussed. It is shown that perturbations of the Zener drag during high temperature deformation may lead to dynamic grain growth in two-phase alloys. (orig.)

  18. Error bounds from extra precise iterative refinement

    Energy Technology Data Exchange (ETDEWEB)

    Demmel, James; Hida, Yozo; Kahan, William; Li, Xiaoye S.; Mukherjee, Soni; Riedy, E. Jason

    2005-02-07

    We present the design and testing of an algorithm for iterative refinement of the solution of linear equations, where the residual is computed with extra precision. This algorithm was originally proposed in the 1960s [6, 22] as a means to compute very accurate solutions to all but the most ill-conditioned linear systems of equations. However, two obstacles have until now prevented its adoption in standard subroutine libraries like LAPACK: (1) there was no standard way to access the higher precision arithmetic needed to compute residuals, and (2) it was unclear how to compute a reliable error bound for the computed solution. The completion of the new BLAS Technical Forum Standard [5] has recently removed the first obstacle. To overcome the second obstacle, we show how a single application of iterative refinement can be used to compute an error bound in any norm at small cost, and use this to compute both an error bound in the usual infinity norm, and a componentwise relative error bound. We report extensive test results on over 6.2 million matrices of dimension 5, 10, 100, and 1000. As long as a normwise (resp. componentwise) condition number computed by the algorithm is less than 1/(max{10, √n}·ε_w), the computed normwise (resp. componentwise) error bound is at most 2·max{10, √n}·ε_w, and indeed bounds the true error. Here, n is the matrix dimension and ε_w is the single-precision roundoff error. For worse conditioned problems, we get similarly small correct error bounds in over 89.4% of cases.
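    The abstract's core idea can be sketched in a few lines: factor and solve in the working (single) precision, but accumulate the residual r = b - Ax in extra (double) precision and iterate on the correction. This is a minimal illustrative sketch, not the XBLAS-based LAPACK implementation, and it omits the error-bound computation described above; the matrix and iteration count are assumed for illustration.

```python
import numpy as np

def solve_with_refinement(A, b, iters=3):
    """Iterative refinement: solve in float32, residuals in float64."""
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                   # extra-precise residual
        d = np.linalg.solve(A32, r.astype(np.float32))  # correction in working precision
        x += d.astype(np.float64)
    return x

rng = np.random.default_rng(1)
n = 50
A = 5.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))  # well-conditioned test matrix
x_true = rng.standard_normal(n)
b = A @ x_true

x_single = np.linalg.solve(A.astype(np.float32), b.astype(np.float32)).astype(np.float64)
x_refined = solve_with_refinement(A, b)
print(np.abs(x_single - x_true).max(), np.abs(x_refined - x_true).max())
```

    A real implementation would reuse one LU factorization for every correction solve rather than calling `solve` repeatedly; the repeated solve here is only for brevity.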

  19. Towards a Protocol for the Collection of VGI Vector Data

    Directory of Open Access Journals (Sweden)

    Peter Mooney

    2016-11-01

    Full Text Available A protocol for the collection of vector data in Volunteered Geographic Information (VGI) projects is proposed. VGI is a source of crowdsourced geographic data and information which is comparable to, and in some cases better than, equivalent data from National Mapping Agencies (NMAs) and Commercial Surveying Companies (CSCs). However, NMAs and CSCs differ in many ways from VGI projects in how they collect, analyse, manage and distribute geographic information. NMAs and CSCs make use of robust and standardised data collection protocols, whilst VGI projects often provide guidelines rather than rigorous data collection specifications. The proposed protocol addresses formalising the collection and creation of vector data in VGI projects in three principal ways: by manual vectorisation; field survey; and reuse of existing data sources. This protocol is intended to be generic rather than being linked to any specific VGI project. We believe that this is the first protocol for VGI vector data collection that has been formally described in the literature. Consequently, this paper shall serve as a starting point for on-going development and refinement of the protocol.

  20. TECHNOLOGICAL REFINEMENT OF GRAIN CROP PRODUCTION INVOLVING MACHINERY APPLICATION

    Directory of Open Access Journals (Sweden)

    Maslov G. G.

    2016-01-01

    Full Text Available Directions for refining the machine technologies used in the production of spiked cereals are suggested; the corresponding directions for their technical update were examined in our previous article. The drawbacks of modern machine-based crop production are analysed, and ways to eliminate them are presented through technology optimization, resource and energy preservation, machine technologies for soil improvement, and new innovative technological solutions. The proposed technology optimization takes into account rigorous alternation of crops in the rotation, optimization of breeds and crossbreeds, sowing of intermediate crops simultaneously with harvesting of the previous crop, and the introduction of progressive methods of chemical treatment and synchronous tillage. Resource and energy preservation is based on combining technological operations whose agricultural timing coincides into a single machinery pass across the field, application of the mobile power unit (UPU-450), low- and ultralow-volume spraying, optimized choice of individual agricultural methods during crop production, and resource calculation of the estimated crop yield. Among the soil improvement measures, we consider mechanization of the processes that restore natural soil formation, introduction of defecate, use of stubble remains, and the compulsory presence of permanent grasses in the crop rotation. New innovative solutions in crop production technologies include refinement of the mechanization facilities for tillage and spraying and new methods of crop harvesting (unwinnowed grain, root tow, cleaning of the threshed heap after harvesting, etc.). We have also analysed ways to mechanize "organic farming" and seed treatment with biological mixtures.

  1. Cytoskeleton - Methods and Protocols

    Directory of Open Access Journals (Sweden)

    CarloAlberto Redi

    2010-03-01

    Full Text Available Cytoskeleton - Methods and Protocols. Second edition, 2010; Ray H. Gavin (Ed.); Springer Protocols - Methods in Molecular Biology, vol. 586; Humana Press, Totowa, New Jersey (USA); Pages: 390; €95.44; ISBN: 978-1-60761-375-6. Ray H. Gavin, from the Brooklyn College of The City University of New York, Brooklyn, NY, USA, wrote a few lines as a preface to this book. This is quite understandable: there is no great need for words when there are facts that sustain and favour the dissemination of a cultural product. This is the case with the second edition of Cytoskeleton - Methods and Protocols, which appears just ten years after the first edition...

  2. DNA repair protocols

    DEFF Research Database (Denmark)

    Bjergbæk, Lotte

    In its 3rd edition, this Methods in Molecular Biology™ book covers the eukaryotic response to genomic insult, including advanced protocols and standard techniques in the field of DNA repair. It offers expert guidance for DNA repair, recombination, and replication. Current knowledge of the mechanisms that regulate DNA repair has grown significantly over the past years with technology advances such as RNA interference, advanced proteomics and microscopy as well as high-throughput screens. The third edition of DNA Repair Protocols covers various aspects of the eukaryotic response to genomic insult, including recent advanced protocols as well as standard techniques used in the field of DNA repair. Both mammalian and non-mammalian model organisms are covered in the book, and many of the techniques can be applied with only minor modifications to systems other than the one described. Written in the highly...

  3. Blind Cognitive MAC Protocols

    CERN Document Server

    Mehanna, Omar; Gamal, Hesham El

    2008-01-01

    We consider the design of cognitive Medium Access Control (MAC) protocols enabling an unlicensed (secondary) transmitter-receiver pair to communicate over the idle periods of a set of licensed channels, i.e., the primary network. The objective is to maximize data throughput while maintaining the synchronization between secondary users and avoiding interference with licensed (primary) users. No statistical information about the primary traffic is assumed to be available a priori to the secondary user. We investigate two distinct sensing scenarios. In the first, the secondary transmitter is capable of sensing all the primary channels, whereas it senses one channel only in the second scenario. In both cases, we propose MAC protocols that efficiently learn the statistics of the primary traffic online. Our simulation results demonstrate that the proposed blind protocols asymptotically achieve the throughput obtained when prior knowledge of primary traffic statistics is available.
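    The flavor of such online learning can be illustrated with a toy single-channel-sensing loop: the secondary user maintains empirical idle-probability estimates and, apart from occasional exploration, senses the channel currently believed to be most often idle. This ε-greedy estimator is an illustrative stand-in of our own, not the specific protocol of the paper; the channel idle probabilities and exploration rate are assumed.

```python
import random

random.seed(0)

# True (unknown to the secondary user) idle probabilities of 3 primary channels.
p_idle = [0.2, 0.5, 0.8]

idle_counts = [0] * 3
sense_counts = [0] * 3
throughput = 0
eps = 0.1                                    # exploration rate (assumed)
slots = 20_000

for _ in range(slots):
    if random.random() < eps:
        ch = random.randrange(3)             # explore a random channel
    else:                                    # exploit best empirical estimate
        est = [idle_counts[c] / sense_counts[c] if sense_counts[c] else 1.0
               for c in range(3)]
        ch = est.index(max(est))
    idle = random.random() < p_idle[ch]      # sense the chosen channel
    sense_counts[ch] += 1
    idle_counts[ch] += idle
    throughput += idle                       # transmit only when the channel is idle

print(throughput / slots)                    # approaches the best channel's idle rate
```

    With no prior statistics, the estimator concentrates its sensing on the channel with the highest empirical idle rate, which is the basic mechanism behind "blind" throughput learning.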

  4. IP Routing Protocols

    Science.gov (United States)

    Nolasco Pinto, Armando

    2002-03-01

    Uyless Black is a widely known expert in computer networks and data communications. He is author of more than ten books in the communication technologies field, which puts him in a good position to address this topic. In IP Routing Protocols he starts by providing the background and concepts required for understanding TCP/IP technology. This is done clearly and assumes little prior knowledge of the area. As might be expected, he emphasizes the IP route discovery problem. Later he details several routing protocols.

  5. Apoptosis - Methods and Protocols

    Directory of Open Access Journals (Sweden)

    CarloAlberto Redi

    2010-03-01

    Full Text Available Apoptosis - Methods and Protocols. Second edition, 2009; Peter Erhardt and Ambrus Toth (Eds.); Springer Protocols - Methods in Molecular Biology, vol. 559; Humana Press, Totowa, New Jersey (USA); Pages: 400; €88.35; ISBN: 978-1-60327-016-8. The editors rightly begin the preface by telling us that: “The ability to detect and quantify apoptosis, to understand its biochemistry and to identify its regulatory genes and proteins is crucial to biomedical research”. Nowadays this is a grounding concept of biology and medicine. What is particularly remarkable...

  6. Hybrid quantum repeater protocol with fast local processing

    DEFF Research Database (Denmark)

    Borregaard, Johannes; Brask, Jonatan Bohr; Sørensen, Anders Søndberg

    2012-01-01

    We propose a hybrid quantum repeater protocol combining the advantages of continuous and discrete variables. The repeater is based on the previous work of Brask et al. [Phys. Rev. Lett. 105, 160501 (2010)], but we present two ways of improving this protocol. In the previous protocol, entangled single-photon states are produced and grown into superpositions of coherent states, known as two-mode cat states. The entanglement is then distributed using homodyne detection. To improve the protocol, we replace the time-consuming nonlocal growth of cat states with local growth of single-mode cat states, eliminating...

  7. A Secure and Efficient Handover Authentication Protocol for Wireless Networks

    Directory of Open Access Journals (Sweden)

    Weijia Wang

    2014-06-01

    Full Text Available Handover authentication protocol is a promising access control technology in the fields of WLANs and mobile wireless sensor networks. In this paper, we first review an efficient handover authentication protocol, named PairHand, together with its known security attacks and improvements. Then, we present an improved key recovery attack using the linear-combining method and reanalyze its feasibility on the improved PairHand protocol. Finally, we present a new handover authentication protocol, which not only achieves the same desirable efficiency features of PairHand, but also enjoys provable security in the random oracle model.

  8. Predictor Variables and Screening Protocol for Depressive and Anxiety Disorders in Cancer Outpatients.

    Directory of Open Access Journals (Sweden)

    Manuela Polidoro Lima

    Full Text Available Cancer patients are at increased risk of persistent depressive and anxiety symptoms and disorders compared to the general population. However, these issues are not always identified, which may worsen the prognosis and increase morbidity and mortality. Therefore, the objectives of this study are to identify predictor variables (demographic and clinical) for the development of mood and anxiety disorders in cancer outpatients and to propose a probabilistic screening protocol considering these variables and certain standardized screening instruments. A total of 1,385 adults, of both genders, receiving outpatient cancer care were evaluated using a questionnaire and screening instruments. Thereafter, 400 of these subjects responded to the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (SCID-IV) by telephone to confirm or rule out the presence of a Current Major Depressive Episode (CMDE) or Anxiety Disorder (AD). Of the patients surveyed, 64% met the criteria for CMDE and 41% for AD. Female gender was found to be a risk factor for both disorders, and the presence of previous psychiatric history and marital status (divorced and widowed) were risk factors for anxiety disorders. When scoring above the recommended cutoff score, the screening instruments also indicated a risk of the studied disorders. Based on these findings, a screening protocol and nomograms were created for the quantification, combination and probabilistic estimate of risk, with accuracy indicators >0.68. The prevalence rates for the disorders under study are extremely high in cancer patients. The use of the proposed protocol and nomogram can facilitate rapid and wide screening, thus refining triage and supporting the establishment of criteria for referral to mental health professionals, so that patients can be properly diagnosed and treated.

  9. Effects and mechanisms of grain refinement in aluminium alloys

    Indian Academy of Sciences (India)

    K T Kashyap; T Chandrashekar

    2001-08-01

    Grain refinement plays a crucial role in improving the characteristics and properties of cast and wrought aluminium alloys. Generally, Al–Ti and Al–Ti–B master alloys are added to aluminium alloys to grain refine the solidified product. The mechanism of grain refinement is the subject of considerable controversy in the scientific literature. The nucleant effect, i.e. which particle and which of its characteristics nucleate α-Al, has been the subject of intensive research. Lately, the solute effect, i.e. the effect of dissolved titanium on grain refinement, has come to the forefront of grain refinement research. The present paper attempts to review the literature on the nucleant and solute effects on grain refinement and addresses the importance of dissolved titanium in promoting nucleation of α-Al on nucleant particles.

  10. On macromolecular refinement at subatomic resolution with interatomic scatterers

    Energy Technology Data Exchange (ETDEWEB)

    Afonine, Pavel V.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.; Lunin, Vladimir Y.; Urzhumtsev, Alexandre

    2007-11-09

    A study of the accurate electron density distribution in molecular crystals at subatomic resolution, better than ~1.0 Å, requires more detailed models than those based on independent spherical atoms. A tool conventionally used in small-molecule crystallography is the multipolar model. Even at upper resolution limits of 0.8-1.0 Å, the number of experimental data is insufficient for full multipolar model refinement. As an alternative, a simpler model composed of conventional independent spherical atoms augmented by additional scatterers to model bonding effects has been proposed. Refinement of these mixed models for several benchmark datasets gave results comparable in quality with those of multipolar refinement and superior to those for conventional models. Applications to several datasets of both small molecules and macromolecules are shown. These refinements were performed using the general-purpose macromolecular refinement module phenix.refine of the PHENIX package.

  11. Deformable elastic network refinement for low-resolution macromolecular crystallography.

    Science.gov (United States)

    Schröder, Gunnar F; Levitt, Michael; Brunger, Axel T

    2014-09-01

    Crystals of membrane proteins and protein complexes often diffract to low resolution owing to their intrinsic molecular flexibility, heterogeneity or the mosaic spread of micro-domains. At low resolution, the building and refinement of atomic models is a more challenging task. The deformable elastic network (DEN) refinement method developed previously has been instrumental in the determination of several structures at low resolution. Here, DEN refinement is reviewed, recommendations for its optimal usage are provided and its limitations are discussed. Representative examples of the application of DEN refinement to challenging cases of refinement at low resolution are presented. These cases include soluble as well as membrane proteins determined at limiting resolutions ranging from 3 to 7 Å. Potential extensions of the DEN refinement technique and future perspectives for the interpretation of low-resolution crystal structures are also discussed.

  12. Refinement of herpesvirus B-capsid structure on parallel supercomputers.

    Science.gov (United States)

    Zhou, Z H; Chiu, W; Haskell, K; Spears, H; Jakana, J; Rixon, F J; Scott, L R

    1998-01-01

    Electron cryomicroscopy and icosahedral reconstruction are used to obtain the three-dimensional structure of the 1250-Å-diameter herpesvirus B-capsid. The centers and orientations of particles in focal pairs of 400-kV, spot-scan micrographs are determined and iteratively refined by common-lines-based local and global refinement procedures. We describe the rationale behind choosing shared-memory multiprocessor computers for executing the global refinement, which is the most computationally intensive step in the reconstruction procedure. This refinement has been implemented on three different shared-memory supercomputers. The speedup and efficiency are evaluated by using test data sets with different numbers of particles and processors. Using this parallel refinement program, we refine the herpesvirus B-capsid from 355-particle images to 13-Å resolution. The map shows new structural features and interactions of the protein subunits in the three distinct morphological units: penton, hexon, and triplex of this T = 16 icosahedral particle.
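    The speedup and efficiency figures evaluated in the abstract are simple ratios of wall times; a minimal sketch with hypothetical timings (the report's actual measurements are not reproduced here):

```python
# Parallel speedup and efficiency for a refinement run.
# Timings are made-up placeholders, not measurements from the paper.
timings = {1: 1200.0, 4: 330.0, 16: 95.0}   # processors -> wall time (s), hypothetical
t1 = timings[1]

for p in sorted(timings):
    speedup = t1 / timings[p]        # how much faster than the serial run
    efficiency = speedup / p         # fraction of ideal linear scaling
    print(f"p={p:2d}  speedup={speedup:5.2f}  efficiency={efficiency:.2f}")
```

    Efficiency below 1.0 reflects the serial fraction and communication overhead of the global refinement step.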

  13. Mirror of the refined topological vertex from a matrix model

    CERN Document Server

    Eynard, B

    2011-01-01

    We find an explicit matrix model computing the refined topological vertex, starting from its representation in terms of plane partitions. We then find the spectral curve of that matrix model, and thus the mirror symmetry of the refined vertex. With the same method we also find a matrix model for the strip geometry, and we find its mirror curve. The fact that there is a matrix model shows that the refined topological string amplitudes also satisfy the "remodeling the B-model" construction.

  14. Mars Sample Handling Protocol Workshop Series: Workshop 4

    Science.gov (United States)

    Race Margaret S. (Editor); DeVincenzi, Donald L. (Editor); Rummel, John D. (Editor); Acevedo, Sara E. (Editor)

    2001-01-01

    In preparation for missions to Mars that will involve the return of samples to Earth, it will be necessary to prepare for the receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but specific detailed protocols for the handling and testing of returned samples must still be developed. To further refine the requirements for sample hazard testing and to develop the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened a series of workshops in 2000-2001. The overall objective of the Workshop Series was to produce a Draft Protocol by which returned martian sample materials can be assessed for biological hazards and examined for evidence of life (extant or extinct) while safeguarding the purity of the samples from possible terrestrial contamination. This report also provides a record of the proceedings of Workshop 4, the final Workshop of the Series, which was held in Arlington, Virginia, June 5-7, 2001. During Workshop 4, the sub-groups were provided with a draft of the protocol compiled in May 2001 from the work done at prior Workshops in the Series. Then eight sub-groups were formed to discuss the following assigned topics: Review and Assess the Draft Protocol for Physical/Chemical Testing; Review and Assess the Draft Protocol for Life Detection Testing; Review and Assess the Draft Protocol for Biohazard Testing; Environmental and Health/Monitoring and Safety Issues; Requirements of the Draft Protocol for Facilities and Equipment; Contingency Planning for Different Outcomes of the Draft Protocol; Personnel Management Considerations in Implementation of the Draft Protocol; and Draft Protocol Implementation Process and Update Concepts. This report provides the first complete presentation of the Draft Protocol for Mars Sample Handling to meet planetary protection needs. This Draft Protocol

  15. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    Science.gov (United States)

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius2 and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.

  16. Security Protocol Design: A Case Study Using Key Distribution Protocols

    Directory of Open Access Journals (Sweden)

    Reiner Dojen

    2009-10-01

    Full Text Available Nowadays security protocols are a key component in providing security services for fixed and mobile networks. These services include data confidentiality, radio link encryption, message integrity, mobile subscriber authentication, electronic payment, certified e-mail, contract signing and non-repudiation. This paper is concerned with the design of effective security protocols. Security protocols are introduced and some common attacks against security protocols are discussed. The vulnerabilities that lead to the attacks are analyzed and guidelines for effective security protocol design are proposed. The presented guidelines are applied to the Andrew Secure RPC protocol and its adapted versions. It is demonstrated that compliance with the guidelines successfully avoids freshness and parallel session attacks.

  17. Physical model of aluminium refining process in URC-7000

    Directory of Open Access Journals (Sweden)

    M. Saternus

    2009-07-01

    Full Text Available The paper presents short characteristics of the reactors most frequently used for the continuous refining of aluminium and its alloys. Refining depends on the flow rate of the refining gas, which influences the shape and size of the gas bubbles. A physical model was therefore created to determine the level of gas bubble dispersion in liquids. Schemes of gas dispersion in liquid metal are presented for refining gas flow rates ranging from 2 to 30 dm³/min. The ranges of refining gas flow rate corresponding to five dispersion patterns (no dispersion, minimum, intimate, uniform and overdispersion) were also determined.

  18. New Process for Grain Refinement of Aluminum. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Joseph A. Megy

    2000-09-22

    A new method of grain refining aluminum, involving in-situ formation of boride nuclei in molten aluminum just prior to casting, has been developed in the subject DOE program over the last thirty months by a team consisting of JDC, Inc., Alcoa Technical Center, GRAS, Inc., Touchstone Labs, and GKS Engineering Services. The manufacturing process to make boron trichloride for grain refining is much simpler than preparing conventional grain refiners, with attendant environmental, capital, and energy savings. The manufacture of boride grain refining nuclei using the fy-Gem process avoids clusters, salt and oxide inclusions that cause quality problems in aluminum today.

  19. Protocol Materials: A Clarification.

    Science.gov (United States)

    Innerd, Wilfred; O'Gorman, David

    "Protocol materials" are records or recordings of a wide variety of behavioral situations. Characteristically they are neither simulated nor extensively edited. They are to be used for the empirical verification of concepts derived from both educational theory and the social sciences. They are attempts to capture reality so that it may be studied…

  20. Principles of Protocol Design

    DEFF Research Database (Denmark)

    Sharp, Robin

    This is a new and updated edition of a book first published in 1994. The book introduces the reader to the principles used in the construction of a large range of modern data communication protocols, as used in distributed computer systems of all kinds. The approach taken is rather a formal one...

  1. Hybrid quantum repeater protocol with fast local processing

    DEFF Research Database (Denmark)

    Borregaard, Johannes; Brask, Jonatan Bohr; Sørensen, Anders Søndberg

    2012-01-01

    We propose a hybrid quantum repeater protocol combining the advantages of continuous and discrete variables. The repeater is based on the previous work of Brask et al. [Phys. Rev. Lett. 105, 160501 (2010)], but we present two ways of improving this protocol. In the previous protocol, entangled single-photon states are produced and grown into superpositions of coherent states, known as two-mode cat states. The entanglement is then distributed using homodyne detection. To improve the protocol, we replace the time-consuming nonlocal growth of cat states with local growth of single-mode cat states, eliminating the need for classical communication during growth. Entanglement is generated in subsequent connection processes. Furthermore, the growth procedure is optimized. We review the main elements of the original protocol and present the two modifications. Finally, the two protocols are compared and the modified...

  2. An Internet Key Exchange Protocol Based on Public Key Infrastructure

    Institute of Scientific and Technical Information of China (English)

    朱建明; 马建峰

    2004-01-01

    Internet key exchange (IKE) is an automated key exchange mechanism that is used to facilitate the transfer of IPSec security associations (SAs). Public key infrastructure (PKI) is considered a key element for providing security to new distributed communication networks and services. In this paper, we concentrate on the properties of the IKE Phase 1 protocol. After investigating the IKE protocol and PKI technology, we combine them and present an implementation scheme of IKE based on PKI. We then give a logic analysis of the proposed protocol with BAN logic and discuss its security. The result indicates that the protocol is correct and satisfies the security requirements of Internet key exchange.
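    The mathematical core of IKE Phase 1 is a Diffie-Hellman exchange whose public values are authenticated; in the PKI variant, via certificate signatures. A deliberately tiny sketch of that core follows, with an illustrative SKEYID-style hash derivation. The modulus here is far too small for real use, and real IKE uses standardized multi-thousand-bit MODP groups plus certificate-based authentication, both omitted here.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters (insecure; for illustration only).
p = 4_294_967_291      # the prime 2**32 - 5
g = 5

a = secrets.randbelow(p - 2) + 1   # initiator's private exponent
b = secrets.randbelow(p - 2) + 1   # responder's private exponent
A = pow(g, a, p)                   # public values, exchanged in Phase 1 messages
B = pow(g, b, p)                   # (in IKE with PKI these would be signed)

# Both sides derive the same shared secret from the other's public value.
k_init = pow(B, a, p)
k_resp = pow(A, b, p)

# Simplified stand-in for IKE's SKEYID keying-material derivation.
skeyid = hashlib.sha256(str(k_init).encode()).hexdigest()
print(k_init == k_resp, skeyid[:16])
```

    The security of the real protocol rests on authenticating A and B (e.g. with PKI certificates) so that a man-in-the-middle cannot substitute its own public values.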

  3. MILITARY MULTI-HOP ROUTING PROTOCOL

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A military tactical network is one of the most important applications of ad hoc networks. Existing routing protocols are based on a simple, universal network model that supports multi-hop routing but cannot work efficiently in special military applications: the narrow bandwidth and limited energy of the wireless network make general routing protocols largely redundant. In order to satisfy military needs properly, this letter makes a detailed analysis of the physical nodes and traffic distribution in a military network, draws on the cross-layer design concept, and combines the traffic characteristics of the application layer with routing protocol design. The routing problem is divided into crucial and non-crucial routes, which are maintained by separate methods. The protocol not only realizes the basic transmission function but also works efficiently, making it well suited to military applications.

  4. Protocols for Copying and Proofreading in Template-Assisted Polymerization

    Science.gov (United States)

    Pigolotti, Simone; Sartori, Pablo

    2016-03-01

    We discuss how information encoded in a template polymer can be stochastically copied into a copy polymer. We consider four different stochastic copy protocols of increasing complexity, inspired by building blocks of the mRNA translation pathway. In the first protocol, monomer incorporation occurs in a single stochastic transition. We then move to a more elaborate protocol in which an intermediate step can be used for error correction. Finally, we discuss the operating regimes of two kinetic proofreading protocols: one in which proofreading acts from the final copying step, and one in which it acts from an intermediate step. We review known results for these models and, in some cases, extend them to analyze all possible combinations of energetic and kinetic discrimination. We show that, in each of these protocols, only a limited number of these combinations leads to an improvement of the overall copying accuracy.
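The effect the abstract describes, where an extra discrimination step squares the error ratio, can be illustrated numerically in the textbook Hopfield limit. This is a simplification, not the paper's full four-protocol analysis, and the 4 kT discrimination energy is an arbitrary example value:

```python
import math

def error_one_step(delta):
    """Copying in a single transition: wrong monomers are incorporated
    at a rate suppressed by exp(-delta) relative to right ones, where
    delta is the discrimination free energy in units of kT."""
    f = math.exp(-delta)
    return f / (1.0 + f)

def error_proofread(delta):
    """Idealized Hopfield-style proofreading: an additional, nearly
    irreversible discard step discriminates by the same factor, so the
    wrong/right incorporation ratio is squared (the textbook limit)."""
    f = math.exp(-delta) ** 2
    return f / (1.0 + f)

# With 4 kT of discrimination, the error rate drops from roughly 1.8%
# for the one-step protocol to roughly 0.03% with proofreading.
print(error_one_step(4.0), error_proofread(4.0))
```

The paper's contribution is to map out which combinations of energetic and kinetic discrimination actually realize this kind of gain in each protocol; the sketch above only shows the best-case squaring.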

  5. Adaptive mesh refinement and multilevel iteration for multiphase, multicomponent flow in porous media

    Energy Technology Data Exchange (ETDEWEB)

    Hornung, R.D. [Duke Univ., Durham, NC (United States)

    1996-12-31

    An adaptive local mesh refinement (AMR) algorithm originally developed for unsteady gas dynamics is extended to multi-phase flow in porous media. Within the AMR framework, we combine specialized numerical methods to treat the different aspects of the partial differential equations. Multi-level iteration and domain decomposition techniques are incorporated to accommodate elliptic/parabolic behavior. High-resolution shock capturing schemes are used in the time integration of the hyperbolic mass conservation equations. When combined with AMR, these numerical schemes provide high resolution locally in a more efficient manner than if they were applied on a uniformly fine computational mesh. We will discuss the interplay of physical, mathematical, and numerical concerns in the application of adaptive mesh refinement to flow in porous media problems of practical interest.
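The idea of refining locally instead of using a uniformly fine mesh can be sketched in one dimension with a gradient-based flagging criterion. This is a deliberately simple stand-in for the error estimators production AMR codes use, and all names are invented:

```python
import numpy as np

def flag_cells(u, x, tol):
    """Flag intervals whose solution gradient exceeds tol (a simplistic
    criterion; real AMR codes use Richardson-type error estimates)."""
    return np.abs(np.diff(u) / np.diff(x)) > tol

def refine_once(x, flags):
    """Bisect every flagged interval: one level of local refinement."""
    pts = [x[0]]
    for i, hot in enumerate(flags):
        if hot:
            pts.append(0.5 * (x[i] + x[i + 1]))
        pts.append(x[i + 1])
    return np.array(pts)

# A sharp saturation front, the kind of feature that makes uniform
# grids wasteful in multiphase porous-media flow:
x = np.linspace(0.0, 1.0, 21)
u = 0.5 * (1.0 - np.tanh(40.0 * (x - 0.5)))
grid = refine_once(x, flag_cells(u, x, tol=2.0))  # extra points only near x = 0.5
```

Repeating the flag-and-bisect cycle per level, with separate solvers for the elliptic and hyperbolic parts, is the structural pattern the abstract describes.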

  6. A Pareto-optimal refinement method for protein design scaffolds.

    Science.gov (United States)

    Nivón, Lucas Gregorio; Moretti, Rocco; Baker, David

    2013-01-01

    Computational design of protein function involves a search for amino acids with the lowest energy subject to a set of constraints specifying function. In many cases a set of natural protein backbone structures, or "scaffolds", are searched to find regions where functional sites (an enzyme active site, ligand binding pocket, protein-protein interaction region, etc.) can be placed, and the identities of the surrounding amino acids are optimized to satisfy functional constraints. Input native protein structures almost invariably have regions that score very poorly with the design force field, and any design based on these unmodified structures may result in mutations away from the native sequence solely as a result of the energetic strain. Because the input structure is already a stable protein, it is desirable to keep the total number of mutations to a minimum and to avoid mutations resulting from poorly-scoring input structures. Here we describe a protocol using cycles of minimization with combined backbone/sidechain restraints that is Pareto-optimal with respect to RMSD to the native structure and energetic strain reduction. The protocol should be broadly useful in the preparation of scaffold libraries for functional site design.
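The Pareto-optimality criterion in the abstract, where no candidate is kept if another is at least as good in both RMSD to native and energetic strain, reduces to a standard non-dominated filter. A minimal sketch with made-up (rmsd, energy) values:

```python
def pareto_front(candidates):
    """Keep candidates not dominated in (rmsd_to_native, energy);
    lower is better in both coordinates."""
    def dominated(p, q):
        return q[0] <= p[0] and q[1] <= p[1] and q != p
    return [p for p in candidates if not any(dominated(p, q) for q in candidates)]

# Hypothetical refined models: (RMSD in angstroms, Rosetta-style energy).
models = [(0.2, -310.0), (0.5, -335.0), (0.8, -330.0), (1.1, -350.0), (1.4, -340.0)]
front = pareto_front(models)
# -> [(0.2, -310.0), (0.5, -335.0), (1.1, -350.0)]
```

In the protocol, the restraint weight in each minimization cycle plays the role of the knob that trades RMSD against strain; the front is what remains after discarding cycles that lose on both axes.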

  7. A family of quantum protocols

    CERN Document Server

    Devetak, I; Winter, A

    2003-01-01

    We introduce two dual, purely quantum protocols: for entanglement distillation assisted by quantum communication (the "mother" protocol) and for entanglement-assisted quantum communication (the "father" protocol). We show how a large class of "children" protocols (including many previously known ones) can be derived from the two by direct application of teleportation or super-dense coding. Furthermore, the parent may be recovered from most of the children protocols by making them "coherent". We also summarize the various resource trade-offs these protocols give rise to.

  8. GRChombo : Numerical Relativity with Adaptive Mesh Refinement

    CERN Document Server

    Clough, Katy; Finkel, Hal; Kunesch, Markus; Lim, Eugene A; Tunyasuvunakool, Saran

    2015-01-01

    Numerical relativity has undergone a revolution in the past decade. With a well-understood mathematical formalism, and full control over the gauge modes, it is now entering an era in which the science can be properly explored. In this work, we introduce GRChombo, a new numerical relativity code written to take full advantage of modern parallel computing techniques. GRChombo's features include full adaptive mesh refinement with block structured Berger-Rigoutsos grid generation which supports non-trivial "many-boxes-in-many-boxes" meshing hierarchies, and massive parallelism through the Message Passing Interface (MPI). GRChombo evolves the Einstein equation with the standard BSSN formalism, with an option to turn on CCZ4 constraint damping if required. We show that GRChombo passes all the standard "Apples-to-Apples" code comparison tests. We also show that it can stably and accurately evolve vacuum black hole spacetimes such as binary black hole mergers, and non-vacuum spacetimes such as scalar collapses into b...

  9. Worldsheet realization of the refined topological string

    Energy Technology Data Exchange (ETDEWEB)

    Antoniadis, I., E-mail: ignatios.antoniadis@cern.ch [Department of Physics, CERN – Theory Division, CH-1211 Geneva 23 (Switzerland); Florakis, I., E-mail: florakis@mppmu.mpg.de [Max-Planck-Institut für Physik, Werner-Heisenberg-Institut, 80805 München (Germany); Hohenegger, S., E-mail: stefan.hohenegger@cern.ch [Department of Physics, CERN – Theory Division, CH-1211 Geneva 23 (Switzerland); Narain, K.S., E-mail: narain@ictp.trieste.it [High Energy Section, The Abdus Salam International Center for Theoretical Physics, Strada Costiera, 11-34014 Trieste (Italy); Zein Assi, A., E-mail: zeinassi@cern.ch [Department of Physics, CERN – Theory Division, CH-1211 Geneva 23 (Switzerland); Centre de Physique Théorique (UMR CNRS 7644), Ecole Polytechnique, 91128 Palaiseau (France)

    2013-10-01

    A worldsheet realization of the refined topological string is proposed in terms of physical string amplitudes that compute generalized N=2 F-terms of the form F_{g,n} W^{2g} Υ^{2n} in the effective supergravity action. These terms involve the chiral Weyl superfield W and a superfield Υ defined as an N=2 chiral projection of a particular anti-chiral T̄-vector multiplet. In Heterotic and Type I theories, obtained upon compactification on the six-dimensional manifold K3×T², T is the usual Kähler modulus of the T² torus. These amplitudes are computed exactly at the one-loop level in string theory. They are shown to reproduce the correct perturbative part of the Nekrasov partition function in the field theory limit when expanded around an SU(2) enhancement point of the string moduli space. The two deformation parameters ε₋ and ε₊ of the Ω supergravity background are then identified with the constant field-strength backgrounds for the anti-self-dual graviphoton and the self-dual gauge field of the T̄ vector multiplet, respectively.

  10. Refining and blending of aviation turbine fuels.

    Science.gov (United States)

    White, R D

    1999-02-01

    Aviation turbine fuels (jet fuels) are similar to other petroleum products that have a boiling range of approximately 300°F to 550°F. Kerosene and the No. 1 grades of fuel oil, diesel fuel, and gas turbine oil share many physical and chemical properties with jet fuel. The similarity among these products should allow toxicology data on one material to be extrapolated to the others. Refineries in the USA manufacture jet fuel to meet industry standard specifications. Civilian aircraft primarily use Jet A or Jet A-1 fuel as defined by ASTM D 1655. Military aircraft use JP-5 or JP-8 fuel as defined by MIL-T-5624R and MIL-T-83133D, respectively. The freezing point and flash point are the principal differences between the finished fuels. Common refinery processes that produce jet fuel include distillation, caustic treatment, hydrotreating, and hydrocracking. Each of these refining processes may be the final step in producing jet fuel, and blending of two or more refinery process streams is sometimes needed to produce jet fuel that meets the desired specifications. The chemical additives allowed in jet fuel are also defined in the product specifications. In many cases the customer, rather than the refinery, adds them to the fuel to meet specific storage or flight-condition requirements.

  11. FPGA Congestion-Driven Placement Refinement

    Energy Technology Data Exchange (ETDEWEB)

    Vicente de, J.

    2005-07-01

    Routing congestion usually limits the full exploitation of FPGA logic resources, which raises the question of how much benefit can be gained by estimating congestion at the placement stage. In recent years, the idea of detailed placement that takes congestion into account has been gaining acceptance. In this paper, we use the Thermodynamic Simulated Annealing (TSA) algorithm to perform congestion-driven placement refinement on top of a common Bounding-Box pre-optimized solution. The adaptive properties of TSA allow the search to preserve the quality of the pre-optimized solution while improving other fine-grain objectives. For the cost function, two approaches are considered. In the first, Expected Occupation (EO), a detailed probabilistic model of channel congestion is evaluated. We show that, despite the minute detail of EO, the inherent uncertainty of this probabilistic model prevents it from relieving congestion beyond what the Bounding-Box cost function alone achieves. In the second approach, we use the fast Rectilinear Steiner Regions algorithm to measure, rather than estimate, global routing congestion. This second strategy successfully reduces the required channel width for a set of benchmark circuits with respect to the widespread Versatile Place and Route (VPR) tool. (Author) 31 refs.
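The refinement loop described above can be sketched as a simulated annealing pass over an initial placement. TSA's thermodynamic, adaptive temperature schedule is replaced here by plain geometric cooling, and the congestion term is omitted, so this is only a structural sketch with invented names:

```python
import math, random

def bbox_cost(pos, nets):
    """Half-perimeter bounding-box wirelength over all nets."""
    total = 0
    for net in nets:
        xs = [pos[c][0] for c in net]
        ys = [pos[c][1] for c in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def anneal_refine(pos, nets, steps=3000, t0=4.0, seed=7):
    """Refine an initial placement by annealing small cell moves.
    A congestion-aware variant would add a congestion penalty to the
    cost; here only wirelength is optimized."""
    rng = random.Random(seed)
    cells = list(pos)
    cost = best = bbox_cost(pos, nets)
    best_pos = dict(pos)
    t = t0
    for _ in range(steps):
        c = rng.choice(cells)
        old = pos[c]
        pos[c] = (old[0] + rng.randint(-1, 1), old[1] + rng.randint(-1, 1))
        new = bbox_cost(pos, nets)
        # Metropolis rule: always accept downhill, sometimes uphill.
        if new <= cost or rng.random() < math.exp((cost - new) / t):
            cost = new
            if cost < best:
                best, best_pos = cost, dict(pos)
        else:
            pos[c] = old
        t *= 0.999        # geometric cooling (TSA adapts this instead)
    return best_pos, best

cells = {"u": (0, 0), "v": (9, 9), "w": (0, 9)}
nets = [("u", "v"), ("v", "w")]
refined, cost = anneal_refine(dict(cells), nets)
```

Swapping `bbox_cost` for a weighted sum of wirelength and a measured congestion map is the structural change the second approach in the paper corresponds to.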

  12. Linear Prediction Using Refined Autocorrelation Function

    Directory of Open Access Journals (Sweden)

    M. Shahidur Rahman

    2007-07-01

    Full Text Available This paper proposes a new technique for improving the performance of linear prediction analysis by utilizing a refined version of the autocorrelation function. Problems in analyzing voiced speech with linear prediction often arise from the harmonic structure of the excitation source, which causes the autocorrelation function to be an aliased version of that of the vocal tract impulse response. To estimate the vocal tract characteristics accurately, this aliasing must be eliminated. In this paper, we employ a homomorphic deconvolution technique in the autocorrelation domain to eliminate the aliasing caused by periodicity. The resulting autocorrelation function of the vocal tract impulse response yields a significant improvement in estimating formant frequencies. The accuracy of formant estimation is verified on synthetic vowels over a wide range of pitch frequencies typical of male and female speakers. The validity of the proposed method is also illustrated by inspecting the spectral envelopes of natural speech spoken by a high-pitched female speaker. The synthesis filter obtained by the current method is guaranteed to be stable, which makes it superior to many of its alternatives.
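The processing chain described, smoothing the spectrum homomorphically to suppress excitation harmonics and then running linear prediction on the resulting autocorrelation, can be sketched as follows. The liftering step is a generic cepstral smoother, not the paper's exact refinement, and `lifter=30` and the test signal are arbitrary choices:

```python
import numpy as np

def refined_autocorr(x, order, lifter):
    """Autocorrelation of a homomorphically smoothed spectrum.

    Keeping only the first `lifter` cepstral coefficients suppresses
    the fine harmonic structure of the excitation, leaving the
    vocal-tract envelope (a generic cepstral smoother)."""
    n = 1 << (2 * len(x) - 1).bit_length()          # zero-padded FFT size
    spec = np.abs(np.fft.rfft(x, n)) ** 2 + 1e-12   # power spectrum
    ceps = np.fft.irfft(np.log(spec))               # real cepstrum
    ceps[lifter:-lifter] = 0.0                      # low-quefrency lifter
    smooth = np.exp(np.fft.rfft(ceps).real)         # smoothed spectrum
    return np.fft.irfft(smooth)[: order + 1]        # its autocorrelation

def levinson(r, order):
    """Levinson-Durbin recursion: autocorrelation -> LPC coefficients.
    Stability of the synthesis filter follows from r being a valid
    (positive definite) autocorrelation sequence."""
    a = np.zeros(order + 1)
    a[0], err = 1.0, r[0]
    for i in range(1, order + 1):
        k = -(r[i] + np.dot(a[1:i], r[1:i][::-1])) / err
        a[1:i + 1] += k * a[:i][::-1]               # order-update of a
        err *= 1.0 - k * k
    return a, err

# Synthetic voiced-like signal; the smoothed autocorrelation feeds LPC.
x = np.sin(2 * np.pi * 0.05 * np.arange(256))
lpc, pred_err = levinson(refined_autocorr(x, order=12, lifter=30), 12)
```

Replacing `refined_autocorr` with the plain biased autocorrelation reproduces the aliasing problem the paper sets out to fix.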

  13. Elliptic Solvers for Adaptive Mesh Refinement Grids

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D.J.; Dendy, J.E., Jr.; Shapira, Y.

    1999-06-03

    We are developing multigrid methods that will efficiently solve elliptic problems with anisotropic and discontinuous coefficients on adaptive grids. The final product will be a library that provides for the simplified solution of such problems. This library will directly benefit the efforts of other Laboratory groups. The focus of this work is research on serial and parallel elliptic algorithms and the inclusion of our black-box multigrid techniques into this new setting. The approach applies the Los Alamos object-oriented class libraries that greatly simplify the development of serial and parallel adaptive mesh refinement applications. In the final year of this LDRD, we focused on putting the software together; in particular we completed the final AMR++ library, we wrote tutorials and manuals, and we built example applications. We implemented the Fast Adaptive Composite Grid method as the principal elliptic solver. We presented results at the Overset Grid Conference and other more AMR specific conferences. We worked on optimization of serial and parallel performance and published several papers on the details of this work. Performance remains an important issue and is the subject of continuing research work.

  14. Water Refining By the Membrane Separation Method

    Directory of Open Access Journals (Sweden)

    Zdenka Bolanca

    2000-03-01

    Full Text Available The investigations in this paper address the possibility of refining wastewater from the deinking flotation process by membrane separation. Offset impressions made with conventional printing inks, as well as with models based on raw materials of renewable origin, were used in the recycling process. Before and after the water treatment, the concentrations of sulphate, chloride, nitrate and total organic substance, the oxygen consumption, the pH and the electric conductivity were determined in all samples by standard analytical methods. The results show that the concentrations of the investigated inorganic and organic pollutants depend on the characteristics of the prints used in the recycling process. The decrease in pollutant concentration achieved by membrane separation is discussed in relation to the boundary values of water quality required for paper production. Besides its specific scientific contribution, the practical value of the investigation lies in its contribution to closing the water circuit in paper production.

  15. Refinement of some estimates for values of the hypergeometric functions

    Directory of Open Access Journals (Sweden)

    P. L. Ivankov

    2014-01-01

    Full Text Available In this paper we consider the arithmetic properties of values of generalized hypergeometric functions and their derivatives (including derivatives with respect to parameter). Some parameters of these functions are irrational. The research aims at refining the lower bound estimates of the moduli of heterogeneous linear forms, with coefficients from some imaginary quadratic field, in the abovementioned values (heterogeneity here is understood in the sense that the function values are considered together with the unit). To solve this problem we use an effective construction of simultaneous approximations, choosing the degrees of the polynomials in this construction in a special way. The latter circumstance allows more accurate estimates of the moduli of the corresponding linear forms. Similar problems are usually solved by Siegel's method, a well-known method in the theory of transcendental numbers in which functional approximations are constructed by the Dirichlet principle. The advantage of this method is the generality of its results. In the case of irrational parameters it cannot be applied directly, because the minimal common denominator of the coefficients of a hypergeometric series with irrational parameters grows too fast. Therefore, the first estimates of the moduli of linear forms in values of hypergeometric functions with irrational parameters were obtained via an effective construction of linear approximating forms. Subsequently, a method has been developed that comprises both elements of Siegel's method and those of the method based on the effective construction of linear approximating forms. Using this combined method enables us to estimate the linear forms under consideration in the most general situation.
    This method, however, has not yet been used to study the arithmetic nature of hypergeometric function values differentiated with respect to parameter. Note that all known estimates of the type under consideration still fall far short

  16. OT-Combiners Via Secure Computation

    DEFF Research Database (Denmark)

    Harnik, Danny; Ishai, Yuval; Kushilevitz, Eyal

    2008-01-01

    An OT-combiner implements a secure oblivious transfer (OT) protocol using oracle access to n OT-candidates of which at most t may be faulty. We introduce a new general approach for combining OTs by making a simple and modular use of protocols for secure computation. Specifically, we obtain an OT-combiner from any instantiation of the following two ingredients: (1) a t-secure n-party protocol for the OT functionality, in a network consisting of secure point-to-point channels and a broadcast primitive; and (2) a secure two-party protocol for a functionality determined by the former multiparty protocol, in a network consisting of a single OT-channel. Our approach applies both to the “semi-honest” and the “malicious” models of secure computation, yielding the corresponding types of OT-combiners. Instantiating our general approach with secure computation protocols from the literature, we conceptually simplify...
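For intuition, the simplest instance of OT-combining is the folklore XOR-combiner for two 1-out-of-2 OT candidates, far weaker than the paper's general construction: each message is secret-shared across both candidates, so a single leaky candidate reveals only a uniformly random share. All function names are invented, and `ideal_ot` is an honest stand-in candidate:

```python
import secrets

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def ideal_ot(m0, m1, c):
    """An honest 1-out-of-2 OT candidate: the receiver learns only m_c."""
    return m1 if c else m0

def combined_ot(ot1, ot2, m0, m1, c):
    """XOR-combiner: each message is secret-shared across both candidates,
    so the receiver must run both OTs (with the same choice bit c) to
    reconstruct anything, and one faulty candidate sees only a random
    share. Messages must have equal length in this sketch."""
    a0 = secrets.token_bytes(len(m0))
    a1 = secrets.token_bytes(len(m1))
    share1 = ot1(a0, a1, c)                     # first candidate: random shares
    share2 = ot2(xor(m0, a0), xor(m1, a1), c)   # second: masked messages
    return xor(share1, share2)
```

The paper's contribution is a general recipe subsuming ad hoc combiners like this one, by routing the sharing through an arbitrary secure-computation protocol for the OT functionality.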

  17. [Treatment of localized forms of Hodgkin's disease with 3 courses of chemotherapy (ABVD-MP) in combination with wide focal and prophylactic lumbo-splenic radiotherapy. The POF 81/12 protocol].

    Science.gov (United States)

    Desablens, B; Tourani, J M; Casassus, P; Brière, J; Harousseau, J L; Lemevel, A; Ifrah, N; Le Prise, P Y; Gandhour, C; Guilhot, F

    1989-11-25

    Between October 10, 1981 and December 31, 1987, we used the Hodgkin POF 81/12 protocol to treat 235 patients aged 5 to 65 years (mean 30 years) with localized Hodgkin's disease clinically classified as stage IA (n = 6), stage IIA (n = 128) or stage IIB (n = 53). A contiguous lesion was present in 22 cases and a mediastinal lesion in 170 cases. The patients received 3 monthly courses of ABVD-MP (doxorubicin 25, bleomycin 10, vinblastine 6, dacarbazine 375 and methylprednisolone 120 mg per square meter intravenously on days 1 and 15), except for stage IA non-mediastinal patients, who received only one course. Thereafter, in the absence of failure (no change or progression under chemotherapy), a 40 Gy wide focal irradiation and a 30 Gy prophylactic lumbo-splenic irradiation were performed. Complete remission (CR) was obtained in 229 patients, and the 6 failures (4 after ABVD-MP, 2 after radiotherapy) were treated with specific programmes. By December 1, 1988 (median follow-up 42 months, range 12-86 months) we had recorded 9 relapses (after 9 to 51 months) and 7 deaths (2 failures, 2 relapses and 3 patients in CR: ovarian carcinoma, road accident, exploratory pleural puncture). The current actuarial relapse and survival rates at 7 years are 5 and 94 percent, respectively. Two unfavourable forms of the disease were identified: infra-diaphragmatic with massive lumbo-aortic lesions (5 cases: 1 failure, 1 relapse) and supra-diaphragmatic with a mediastinum/chest ratio of 0.45 or more (30 cases: 5 failures, 5 relapses). In the 200 patients devoid of these 2 risk factors the results were: CR 100 percent, only 2 relapses, and survival at 7 years 98 percent.

  18. Investigation of PDE5/PDE6 and PDE5/PDE11 selective potent tadalafil-like PDE5 inhibitors using combination of molecular modeling approaches, molecular fingerprint-based virtual screening protocols and structure-based pharmacophore development.

    Science.gov (United States)

    Kayık, Gülru; Tüzün, Nurcan Ş; Durdagi, Serdar

    2017-12-01

    The essential biological function of phosphodiesterase (PDE) enzymes is to regulate the cytoplasmic levels of the intracellular second messengers 3',5'-cyclic guanosine monophosphate (cGMP) and/or 3',5'-cyclic adenosine monophosphate (cAMP). The PDE family has 11 isoenzymes. Of these, PDE5 has attracted special attention over the years since its recognition as the target enzyme in treating erectile dysfunction. Because the catalytic domains of PDE6 and PDE11 resemble that of PDE5 in amino acid sequence and secondary structure, first-generation PDE5 inhibitors (i.e. sildenafil and vardenafil) are also competitive inhibitors of PDE6 and PDE11. Since the major challenge in designing novel PDE5 inhibitors is to decrease this cross-reactivity, in this study we attempt to identify potent tadalafil-like PDE5 inhibitors with PDE5/PDE6 and PDE5/PDE11 selectivity. To this end, a similarity-based virtual screening protocol is applied to the "clean drug-like subset of the ZINC database", which contains more than 20 million small compounds. Moreover, molecular dynamics (MD) simulations of the selected hits complexed with PDE5 and the off-targets were performed to gain insight into the structural and dynamical behavior of the selected molecules as selective PDE5 inhibitors. Since tadalafil blocks hERG1 potassium channels in a concentration-dependent manner, the cardiotoxicity of the hit molecules was also predicted. The results of this study can be useful for the design of novel, safe and selective PDE5 inhibitors.

  19. Filled pause refinement based on the pronunciation probability for lecture speech.

    Science.gov (United States)

    Long, Yan-Hua; Ye, Hong

    2015-01-01

    Nowadays, although automatic speech recognition has become quite proficient at recognizing or transcribing well-prepared fluent speech, transcribing speech that contains many disfluencies, such as spontaneous conversational and lecture speech, remains problematic. Filled pauses (FPs) are the most frequently occurring disfluencies in this type of speech. Recent studies have shown that FPs increase the error rates of state-of-the-art speech transcription, primarily because most FPs are not well annotated in the training data transcriptions and because of the similar acoustic characteristics of FPs and some common non-content words. To enhance the speech transcription system, we propose a new automatic refinement approach to detect FPs in British English lecture speech transcription. The approach combines the pronunciation probability of each word in the dictionary with acoustic and language model scores for FP refinement through a modified speech recognition forced-alignment framework. We evaluate the proposed approach on the Reith Lectures speech transcription task, in which only imperfect training transcriptions are available. Successful results are achieved for both the development and evaluation datasets. Acoustic models trained on different styles of speech genres are investigated with respect to FP refinement. To further validate the effectiveness of the proposed approach, speech transcription performance is also examined using systems built on training data transcriptions with and without FP refinement.

  20. Refinements of the Weyl tensor classification in five dimensions

    NARCIS (Netherlands)

    Coley, A.; Hervik, S.; Ortaggio, M.; Wylleman, L.

    2012-01-01

    We refine the null alignment classification of the Weyl tensor of a five-dimensional spacetime. The paper focusses on the algebraically special alignment types N, III, II and D, while types I and G are briefly discussed. A first refinement is provided by the notion of spin type of the components of

  1. Crystal structure refinement: a crystallographer's guide to SHELXL

    CERN Document Server

    2006-01-01

    A crystallographer's guide to SHELXL, covering various aspects of practical crystal structure refinement, from the treatment of hydrogen atoms to the assignment of atom types, and more. After an introduction to SHELXL, a brief survey of crystal structure refinement is provided.

  2. Optimization of Refining Craft for Vegetable Insulating Oil

    Science.gov (United States)

    Zhou, Zhu-Jun; Hu, Ting; Cheng, Lin; Tian, Kai; Wang, Xuan; Yang, Jun; Kong, Hai-Yang; Fang, Fu-Xin; Qian, Hang; Fu, Guang-Pan

    2016-05-01

    Because of its environmental friendliness, vegetable insulating oil is considered an ideal substitute for mineral oil in the insulation and cooling of transformers. The main steps of the traditional refining process are alkali refining, bleaching and distillation. This process gives satisfactory results when refining insulating oil in small doses, but it cannot be applied directly to a large-capacity reaction kettle. Using rapeseed oil as the crude oil, this paper optimizes the refining process for a large-capacity reaction kettle. The optimized process adds an acid degumming step; the alkali compound includes a sodium silicate component in the alkali refining step, and the ratio of each component is optimized. Activated clay and activated carbon are added in a 10:1 proportion in the decolorization step, which effectively reduces the oil's acid value and dielectric loss. Vacuum degassing instead of distillation further reduces the acid value. Comparing the performance parameters of the refined oil with mineral insulating oil, the dielectric loss of the vegetable insulating oil is still high, and further measures will be needed to optimize it in the future.

  3. Cavitation-aided grain refinement in aluminium alloys

    NARCIS (Netherlands)

    Atamanenko, T.V.

    2010-01-01

    This thesis deals with grain refinement under the influence of ultrasonic-driven cavitation in aluminium casting processes. Three major goals of this research were: (1) to identify the mechanism of the cavitation-aided grain refinement at different stages of solidification; (2) to reveal the

  4. Review and Prospect of China's Oil Refining Industry

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Driven by the steady and rapid development of the domestic economy and the rapid growth of oil demand in recent years, China's oil refining industry has overcome the effects of fluctuating international crude oil prices and the international financial crisis, and has seen a rapid increase in refining capacity and continuous growth in processed crude oil and production of gasoline,

  5. The Analysis of the Refined Financial Management of Modern Enterprises

    Directory of Open Access Journals (Sweden)

    Li Ran

    2016-01-01

    Full Text Available This paper briefly introduces the concept of refined financial management, elaborates on its characteristics, and puts forward its main points. It also offers some suggestions for reference on effective ways of refining financial management.

  7. REFMAC5 for the refinement of macromolecular crystal structures

    Energy Technology Data Exchange (ETDEWEB)

    Murshudov, Garib N., E-mail: garib@ysbl.york.ac.uk [Structural Biology Laboratory, Department of Chemistry, University of York, Heslington, York YO10 5YW (United Kingdom); Skubák, Pavol [Biophysical Structural Chemistry, Leiden University, PO Box 9502, 2300 RA Leiden (Netherlands); Lebedev, Andrey A. [Structural Biology Laboratory, Department of Chemistry, University of York, Heslington, York YO10 5YW (United Kingdom); Pannu, Navraj S. [Biophysical Structural Chemistry, Leiden University, PO Box 9502, 2300 RA Leiden (Netherlands); Steiner, Roberto A. [Randall Division of Cell and Molecular Biophysics, New Hunt’s House, King’s College London, London (United Kingdom); Nicholls, Robert A. [Structural Biology Laboratory, Department of Chemistry, University of York, Heslington, York YO10 5YW (United Kingdom); Winn, Martyn D. [STFC Daresbury Laboratory, Warrington WA4 4AD (United Kingdom); Long, Fei; Vagin, Alexei A. [Structural Biology Laboratory, Department of Chemistry, University of York, Heslington, York YO10 5YW (United Kingdom)

    2011-04-01

    The general principles behind the macromolecular crystal structure refinement program REFMAC5 are described. This paper describes various components of the macromolecular crystallographic refinement program REFMAC5, which is distributed as part of the CCP4 suite. REFMAC5 utilizes different likelihood functions depending on the diffraction data employed (amplitudes or intensities), the presence of twinning and the availability of SAD/SIRAS experimental diffraction data. To ensure chemical and structural integrity of the refined model, REFMAC5 offers several classes of restraints and choices of model parameterization. Reliable models at resolutions at least as low as 4 Å can be achieved thanks to low-resolution refinement tools such as secondary-structure restraints, restraints to known homologous structures, automatic global and local NCS restraints, ‘jelly-body’ restraints and the use of novel long-range restraints on atomic displacement parameters (ADPs) based on the Kullback–Leibler divergence. REFMAC5 additionally offers TLS parameterization and, when high-resolution data are available, fast refinement of anisotropic ADPs. Refinement in the presence of twinning is performed in a fully automated fashion. REFMAC5 is a flexible and highly optimized refinement package that is ideally suited for refinement across the entire resolution spectrum encountered in macromolecular crystallography.

  8. RBT—A Tool for Building Refined Buneman Trees

    DEFF Research Database (Denmark)

    Besenbacher, Søren; Mailund; Westh-Nielsen, Lasse;

    2005-01-01

    We have developed a tool implementing an efficient algorithm for refined Buneman tree reconstruction. The algorithm—which has the same complexity as the neighbour-joining method and the (plain) Buneman tree construction—enables refined Buneman tree reconstruction on large taxa sets....

  9. CNPC Hits New High in Refining & Petrochemical Business

    Institute of Scientific and Technical Information of China (English)

    Luo Shichao

    2010-01-01

    During the "11th Five-Year Plan" period, CNPC adjusted the distribution of its refining territory and steadily increased the volume of crude oil processed, which is significant for the sustained, fast development of its upstream, midstream and downstream businesses. Of the 28 main economic and technical indicators of the refining and chemical business, 21 hit record highs.

  10. Refined Orbital Architecture for Targets of Naval Interest

    Science.gov (United States)

    2015-03-01

    ...pioneered the Walker delta pattern. The technique distributes t total satellites into p orbital planes, evenly spaced, with relative phasing f. (Naval Postgraduate School thesis, Monterey, California; author: Andrew Konowicz; approved for public release, distribution unlimited.)

  11. Carpet: Adaptive Mesh Refinement for the Cactus Framework

    Science.gov (United States)

    Schnetter, Erik; Hawley, Scott; Hawke, Ian

    2016-11-01

    Carpet is an adaptive mesh refinement and multi-patch driver for the Cactus Framework (ascl:1102.013). Cactus is a software framework for solving time-dependent partial differential equations on block-structured grids, and Carpet acts as a driver layer providing adaptive mesh refinement, multi-patch capability, parallelization, and efficient I/O.

  12. Refined analysis results for multimedia network costs and profits

    DEFF Research Database (Denmark)

    Tahkokorpi, M.; Falch, Morten; Skouby, Knud Erik

    This deliverable describes the techno-economic business model developed in EURORIM WP3 and presents the refined results of the multimedia service delivery cost-profit calculations...

  13. Mitosis Methods & Protocols

    Directory of Open Access Journals (Sweden)

    CarloAlberto Redi

    2010-06-01

    Full Text Available Mitosis Methods & Protocols. Andrew D. McAinsh (Ed.), Humana Press, Totowa, New Jersey (USA). Series: Springer Protocols, Methods in Molecular Biology, Volume 545, 2009. ISBN: 978-1-60327-992-5.   It is quite clear from the contents of this book that the remarkably fascinating phenomenon of mitosis (which captured, and is still capturing, the attention of entire generations of scientists) is still open to research. This is mainly due to our lack of knowledge of the many multifaceted events of this extraordinarily complex process. The reader glancing through the Contents and Contributors sections is left speechless: all of the first-class models (i.e., budding yeast, Caenorhabditis, Drosophila, Xenopus and human) are presented..... 

  14. Symmetric cryptographic protocols

    CERN Document Server

    Ramkumar, Mahalingam

    2014-01-01

    This book focuses on protocols and constructions that make good use of symmetric pseudo random functions (PRF) like block ciphers and hash functions - the building blocks for symmetric cryptography. Readers will benefit from detailed discussion of several strategies for utilizing symmetric PRFs. Coverage includes various key distribution strategies for unicast, broadcast and multicast security, and strategies for constructing efficient digests of dynamic databases using binary hash trees. • Provides detailed coverage of symmetric key protocols • Describes various applications of symmetric building blocks • Includes strategies for constructing compact and efficient digests of dynamic databases
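The "digests of dynamic databases using binary hash trees" that the book covers are commonly realized as Merkle trees. A minimal sketch (SHA-256, duplicating the last node on odd levels, which is one of several conventions and not necessarily the book's):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(records):
    """Compact digest of a database: a binary hash tree over its records."""
    level = [h(r) for r in records]
    if not level:
        return h(b"")
    while len(level) > 1:
        if len(level) % 2:                    # odd level: duplicate last node
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]
```

Changing any single record changes the root, so the root serves as a compact, efficiently updatable digest of the whole database.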

  15. Satellite Communications Using Commercial Protocols

    Science.gov (United States)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space- based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  16. Computer technology in oil refining: cost or benefit

    Energy Technology Data Exchange (ETDEWEB)

    Payne, B. (KBC Process Technology (GB))

    1990-04-01

    There is undoubtedly a commitment in the oil refining industry to computerise wherever possible, and to develop advanced mathematical modelling techniques to improve profitability. However, many oil refiners are now asking themselves whether computer solutions are a cost, or are truly a benefit to their organisation. Problems have been caused by distributed computing running out of control in many organisations. This has been partly brought under control recently by advanced networking of PCs along with mainframe facilities, and by the development of management information systems with common databases on which all users can build their applications. Implementation of information technology strategies has helped many refiners to plan the way ahead. The use of computers across the refining sector in the current marketplace is reviewed. The conclusion drawn is that although computer technology is a cost, it can also be ranked as a significant benefit and success in the refining industry at present. (author).

  17. Action Refinement for Real-Time Concurrent Processes with Urgency

    Institute of Scientific and Technical Information of China (English)

    Guang-Ping Qin; Jin-Zhao Wu

    2005-01-01

    Action refinement for real-time concurrent processes with urgent interactions is studied, where a partial-order setting, i.e., timed bundle event structures, is used as the system model and a real-time LOTOS-like process algebra is used as the specification language. It is shown that the proposed refinement approaches have the commonly expected properties:(1) the behaviour of the refined process can be inferred compositionally from the behaviour of the original process and from the behaviour of the processes substituted for actions; (2) the timed extensions of pomset (partially ordered multiset) trace equivalence and history preserving bisimulation equivalence are both congruences under the refinement; (3) the syntactic and semantic refinements coincide up to the aforementioned equivalence relations with respect to a cpo-based denotational semantics.

  18. Review and Prospect of China's Oil Refining Industry

    Institute of Scientific and Technical Information of China (English)

    Jin Yun; Zhu He

    2012-01-01

    The "12th Five-Year Plan" period is critical for China's oil refining industry to transform its growth mode and grow from big to strong. The industry will base itself on domestic market demand and maintain rapid development through construction and reconstruction. The total scale will continue to grow, crude runs and product oil output will rise steadily, and the structure and layout of refining units will be further adjusted and optimized. During the period, the number of 10-million-tonne refineries will rise to about 30, the quality of oil products will be upgraded, overseas interests will expand, and the diversified market competition pattern of the oil refining industry will develop further.

  19. Specification and analysis of a high speed transport protocol

    Science.gov (United States)

    Tipici, Huseyin A.

    1993-06-01

    While networks have been getting faster, perceived throughput at the application has not always increased accordingly, and the bottleneck has moved to the communications processing part of the system. The issues that cause the performance bottlenecks in current transport protocols are discussed in this thesis, and a further study of a high speed transport protocol which tries to overcome these difficulties with some unique features is presented. Using the Systems of Communicating Machines (SCM) model as a framework, a refined and improved version of the formal protocol specification is built over the previous work, and it is analyzed to verify that the protocol is free from logical errors such as deadlock, unspecified reception, unexecuted transitions and blocking loops. The analysis is conducted in two phases, consisting of the application of the associated system state analysis and the simulation of the protocol using the programming language Ada. The thesis also presents the difficulties encountered during the course of the analysis, and suggests possible solutions to some of the problems.
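The deadlock check described here amounts to a reachability search over the global state space: a reachable non-final state with no outgoing transitions is a deadlock. A minimal illustrative sketch (the two-machine example and its state names are hypothetical, not taken from the thesis):

```python
from collections import deque

def find_deadlocks(initial, transitions, finals):
    """BFS over the global state space; report reachable non-final states
    with no outgoing transitions (the SCM-style deadlock condition)."""
    seen, queue, deadlocks = {initial}, deque([initial]), []
    while queue:
        state = queue.popleft()
        succs = transitions.get(state, [])
        if not succs and state not in finals:
            deadlocks.append(state)
        for nxt in succs:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return deadlocks

# Toy example: sender and receiver each end up waiting for the other,
# so the composite state ('w', 'w') is a deadlock.
trans = {('i', 'i'): [('w', 'i')], ('w', 'i'): [('w', 'w')]}
```

The same traversal can also collect unspecified receptions or unexecuted transitions by recording which transitions are never taken.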

  20. Effective dose comparison between protocols stitched and usual protocols in dental cone beam CT for complete arcade

    Energy Technology Data Exchange (ETDEWEB)

    Soares, M. R.; Maia, A. F. [Universidade Federal de Sergipe, Departamento de Fisica, Cidade Universitaria Prof. Jose Aloisio de Campos, Marechal Rondon s/n, Jardim Rosa Elze, 49-100000 Sao Cristovao, Sergipe (Brazil); Batista, W. O. G. [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Barbalho, Salvador, 40301015 Bahia (Brazil); Lara, P. A., E-mail: wilsonottobatista@gmail.com [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

    To visualize the complete dental arcade, dental radiology currently offers two separate approaches: [1] protocols with a diameter encompassing the entire arch (single Fov) or [2] protocols with multiple fields of view (Fov) which together encompass the entire arch (stitched Fov). The objective of this study is to evaluate effective dose values for full dental arcade examination protocols available on different units with these two options. For this, a female anthropomorphic phantom manufactured by Radiology Support Devices, with twenty-six thermoluminescent dosimeters inserted at relevant organs and positions, was used. The phantom was irradiated under clinical conditions. The following protocols were compared: [a] 14.0 cm x 8.5 cm and [b] 8.5 cm x 8.5 cm (Gendex GXCB 500 tomograph), [c] stitched protocol for the jaw combining three volumes of 5.0 cm x 3.7 cm (Kodak 9000 3D scanner), [d] stitched Fov protocol of 5.0 cm x 8.0 cm (Planmeca ProMax 3D) and [e] single Fov technique of 14 cm x 8 cm (i-CAT Classical). Our results for the effective dose ranged between 43.1 and 111.1 μSv for the single Fov technique and between 44.5 and 236.2 μSv for the stitched Fov technique. Protocol [d] presented the highest estimated effective dose, and the lowest was registered for [a]. These results demonstrate that the stitched Fov protocol of the Kodak 9000 3D machine, applied to the upper dental arch, has practically the same effective dose as that obtained with the extended diameter protocol [a], which evaluates the upper and lower arcade in a single image. They also demonstrate that protocol [d] gives an estimate five times higher than protocol [a]. Thus, we conclude that in practical terms the stitched Fov protocol [c] presents no dosimetric advantage over the other protocols. (Author)
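For context, the effective dose reported here is the tissue-weighted sum E = Σ w_T·H_T over the organ doses measured by the thermoluminescent dosimeters. A minimal sketch using a small illustrative subset of ICRP 103 tissue weighting factors for head-region organs (the study's full organ list and weights are not reproduced here):

```python
# Illustrative subset of ICRP 103 tissue weighting factors (dimensionless)
W_T = {"thyroid": 0.04, "salivary_glands": 0.01, "brain": 0.01, "skin": 0.01}

def effective_dose(organ_doses_uSv):
    """E = sum over tissues T of w_T * H_T, in the same units as H_T."""
    return sum(W_T[t] * h for t, h in organ_doses_uSv.items())
```

With hypothetical organ doses of 100 μSv to the thyroid and 50 μSv to the skin, this subset contributes 4.5 μSv to the effective dose.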

  1. Dysphonia risk screening protocol

    Directory of Open Access Journals (Sweden)

    Katia Nemr

    2016-03-01

    Full Text Available OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and, after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics.
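Using the cut-off points reported in the abstract, the screening decision itself is a simple per-group threshold on the total score; a minimal sketch (the group labels are illustrative):

```python
# Cut-off points on the total protocol score, as reported in the study
CUTOFFS = {"child": 22.50, "adult_woman": 29.25, "adult_man": 22.75, "senior": 27.10}

def at_risk(group, total_score):
    """Flag dysphonia risk when the total score exceeds the group's cut-off."""
    return total_score > CUTOFFS[group]
```

The reported group means (46.09 dysphonic vs 15.55 non-dysphonic) fall well on either side of every cut-off, consistent with the high sensitivity and specificity the authors report.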

  2. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo;

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  3. Refining the quantitative pathway of the Pathways to Mathematics model.

    Science.gov (United States)

    Sowinski, Carla; LeFevre, Jo-Anne; Skwarchuk, Sheri-Lynn; Kamawar, Deepthi; Bisanz, Jeffrey; Smith-Chant, Brenda

    2015-03-01

    In the current study, we adopted the Pathways to Mathematics model of LeFevre et al. (2010). In this model, there are three cognitive domains--labeled as the quantitative, linguistic, and working memory pathways--that make unique contributions to children's mathematical development. We attempted to refine the quantitative pathway by combining children's (N=141 in Grades 2 and 3) subitizing, counting, and symbolic magnitude comparison skills using principal components analysis. The quantitative pathway was examined in relation to dependent numerical measures (backward counting, arithmetic fluency, calculation, and number system knowledge) and a dependent reading measure, while simultaneously accounting for linguistic and working memory skills. Analyses controlled for processing speed, parental education, and gender. We hypothesized that the quantitative, linguistic, and working memory pathways would account for unique variance in the numerical outcomes; this was the case for backward counting and arithmetic fluency. However, only the quantitative and linguistic pathways (not working memory) accounted for unique variance in calculation and number system knowledge. Not surprisingly, only the linguistic pathway accounted for unique variance in the reading measure. These findings suggest that the relative contributions of quantitative, linguistic, and working memory skills vary depending on the specific cognitive task. Copyright © 2014 Elsevier Inc. All rights reserved.
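Combining subitizing, counting, and symbolic magnitude comparison into a single "quantitative pathway" score via principal components analysis can be sketched as follows, using z-scoring followed by projection onto the first right singular vector (the authors' exact PCA settings and software are not reproduced here):

```python
import numpy as np

def first_component(scores):
    """Project z-scored measures (rows = children, columns = the three
    quantitative measures) onto the first principal component, yielding
    one composite score per child."""
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
    _, _, vt = np.linalg.svd(z, full_matrices=False)   # rows of vt = PCs
    return z @ vt[0]
```

When the three measures share a strong common factor, the first component recovers it, which is the rationale for collapsing them into one pathway score.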

  4. A Refined Hybrid Image Retrieval System using Text and Color

    Directory of Open Access Journals (Sweden)

    Nidhi Goel

    2012-07-01

    Full Text Available Image retrieval (IR) continues to be one of the most exciting and fastest growing research areas due to significant progress in data storage and image acquisition techniques. Broadly, image retrieval can be text based or content based. Text-Based Image Retrieval (TBIR) is proficient in 'named-entity' queries (e.g. searching for images of 'TajMahal'). Content-Based Image Retrieval (CBIR) shows its proficiency in querying by visual content. Both techniques have their own advantages and disadvantages, and neither has been very successful in uncovering the hidden meanings/semantics of an image. In this paper, we propose a hybrid approach that improves the quality of image retrieval and overcomes the limitations of the individual approaches. For text retrieval, term frequency-inverse document frequency (tf-idf) weightings and cosine similarity are used, whereas for content matching the search space is narrowed down using color moments; the two results are then combined to give better results than either individual approach. Further refinement using a color histogram technique improves the performance of the system significantly.
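The text-plus-color fusion described here can be sketched as follows: plain tf-idf with cosine similarity on the text side, first-order color moments (mean and standard deviation per channel) on the content side, and a weighted combination. The fusion weight and the inverse-distance form are illustrative assumptions, not the authors' exact formulation:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Plain tf-idf: tf = raw term count, idf = log(N / document frequency)."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc.split()))
    return [{t: c * math.log(n / df[t]) for t, c in Counter(doc.split()).items()}
            for doc in docs]

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def color_moments(pixels):
    """First two color moments (mean, standard deviation) per RGB channel."""
    feats = []
    for ch in range(3):
        vals = [p[ch] for p in pixels]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        feats += [mean, math.sqrt(var)]
    return feats

def hybrid_score(text_sim, color_dist, alpha=0.5):
    """Weighted fusion: text similarity plus inverted color-moment distance."""
    return alpha * text_sim + (1 - alpha) / (1 + color_dist)
```

In practice the color-moment distance would first prune the candidate set, with the fused score ranking the survivors.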

  5. QAODV: Improved Security Routing Protocol of AODV

    OpenAIRE

    Ruihui Mu; Wuqiang Zhang

    2013-01-01

    Network control system (NCS) has higher requirements on QoS index of the network, but MANET can hardly meet the demands of control systems because of its own characteristics. The paper improves AODV and proposes AODV routing protocol QAODV with certain QoS security in combination with NCS and MANET, and expounds routing detection, routing register and routing maintenance. Simulation results show that end-to-end delay and packet loss of QAODV reduces evidently compared with AODV, and the proto...

  6. Development of a combined canine distemper virus specific RT-PCR protocol for the differentiation of infected and vaccinated animals (DIVA) and genetic characterization of the hemagglutinin gene of seven Chinese strains demonstrated in dogs.

    Science.gov (United States)

    Yi, Li; Cheng, Shipeng; Xu, Hongli; Wang, Jianke; Cheng, Yuening; Yang, Shen; Luo, Bin

    2012-01-01

    A combined reverse-transcription polymerase chain reaction (RT-PCR) method was developed for the detection and differentiation of wild-type and vaccine strains of the canine distemper virus (CDV). A pair of primers (P1/P2) was used to detect both CDV wild-type strains and vaccines. Another pair (P3/P4) was used to detect only CDV wild-type strains. A 335bp fragment was amplified from the genomic RNA of the vaccine and wild-type strains. A 555bp fragment was amplified specifically from the genomic RNA of the wild-type strains. No amplification was achieved for the uninfected cells, cells infected with canine parvovirus, canine coronavirus, or canine adenovirus. The combined RT-PCR method detected effectively and differentiated the CDV wild-type and vaccine strains by two separate RT-PCRs. The method can be used for clinical detection and epidemiological surveillance. The phylogenetic analysis of the hemagglutinin gene of the local wild-type CDV strains revealed that the seven local isolates all belonged to the Asia-1 lineage, and were clustered closely with one another at the same location. These results suggested that the CDV genotype Asia-1 is circulating currently in domestic dogs in China.
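The differentiation logic of the combined RT-PCR reduces to interpreting the two amplicons: P1/P2 yield a 335 bp product from both vaccine and wild-type strains, while P3/P4 yield a 555 bp product from wild-type strains only. A sketch of that decision rule:

```python
def interpret_rt_pcr(band_335bp: bool, band_555bp: bool) -> str:
    """Interpret the two amplicons of the combined CDV RT-PCR:
    335 bp (P1/P2) = any CDV; 555 bp (P3/P4) = wild-type only."""
    if band_335bp and band_555bp:
        return "CDV wild-type strain"
    if band_335bp:
        return "CDV vaccine strain"
    return "CDV not detected"
```

A 555 bp band without the 335 bp band would indicate a technical failure of the P1/P2 reaction rather than a valid result; the sketch above does not model that case.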

  7. Rationale and protocol of the MetNET-1 trial, a prospective, single center, phase II study to evaluate the activity and safety of everolimus in combination with octreotide LAR and metformin in patients with advanced pancreatic neuroendocrine tumors.

    Science.gov (United States)

    Pusceddu, Sara; de Braud, Filippo; Concas, Laura; Bregant, Cristina; Leuzzi, Livia; Formisano, Barbara; Buzzoni, Roberto

    2014-01-01

    Abnormal PI3K-AKT-mTOR pathway signalling and autocrine activation of the mTOR pathway, mediated through insulin-like growth factor-1, have been implicated in the proliferation of pancreatic neuroendocrine tumor (pNET) cells. Everolimus, an mTOR inhibitor, has shown antitumor benefit in pNETs alone and in combination with octreotide LAR in the RADIANT-1 and RADIANT-3 studies. Although everolimus-based phase II/III trials have improved progression-free survival in pNET, its use has not prolonged overall survival. Metformin has recently shown some anti-cancer activity in both in vitro and in vivo studies through its indirect capacity to decrease insulin and insulin-like growth factor-1 (IGF-1) levels and through its antitumour effect of promoting AMPK activation and consequently inhibiting the TSC1-2/mTOR complex. In light of mounting retrospective evidence of metformin's anticancer activity, a prospective evaluation is required to either confirm or discard these preliminary findings. With the aim of evaluating the antiproliferative effect of metformin in combination with everolimus and octreotide LAR in patients with well-differentiated pancreatic neuroendocrine tumors, a single-arm, prospective, single-center phase II study was designed (MetNET-1 trial, NCT 02294006). Forty-three patients are expected to be evaluated. The study is ongoing, and recruitment is estimated to be completed in August 2016. Results are anticipated in 2017.

  8. Production and characterization of refined oils obtained from Indian oil sardine (Sardinella longiceps).

    Science.gov (United States)

    Chakraborty, Kajal; Joseph, Deepu

    2015-01-28

    Crude Sardinella longiceps oil was refined in different stages such as degumming, neutralization, bleaching, and deodorization. The efficiency of these processes was evaluated on the basis of free fatty acid (FFA), peroxide (PV), p-anisidine (pAV), total oxidation (TOTOX), thiobarbituric acid reactive species (TBARS) values, Lovibond CIE-L*a*b* color analyses, and (1)H NMR or GC-MS experiments. The utilities of NMR-based proton signal characteristics as new analytical tools to understand the signature peaks and relative abundance of different fatty acids and monitoring the refining process of fish oil have been demonstrated. Phosphoric acid (1%) was found to be an effective degumming reagent to obtain oil with the lowest FFA, PV, pAV, TOTOX, and TBARS values and highest color reduction. Significant reduction in the contents of hydrocarbon functionalities as shown by the decrease in proton integral in the characteristic (1)H NMR region was demonstrated by using 1% H3PO4 during the course of the degumming process. A combination (1.25:3.75%) of activated charcoal and Fuller's earth at 3% concentration for a stirring time of 40 min was found to be effective in bleaching the sardine oil. This study demonstrated that unfavorable odor-causing components, particularly low molecular weight carbonyl compounds, could successfully be removed by the refining process. The alkane-dienals/alkanes, which cause unfavorable fishy odors, were successfully removed by distillation (100 °C) under vacuum with aqueous acetic acid solution (0.25 N) to obtain greater quality of refined sardine oil, a rich source of essential fatty acids and improved oxidative stability. The present study demonstrated that the four-stage refinement process of sardine oil resulted in a significant improvement in quality characteristics and nutritional values, particularly n-3 PUFAs, with improved fish oil characteristics for use in the pharmaceutical and functional food industries.

  9. Model Refinement in the Model Driven Architecture Context

    Directory of Open Access Journals (Sweden)

    Paulo Cezar Stadzisz

    2012-01-01

    Full Text Available Problem statement: Model Driven Architecture (MDA) is a software development approach based on the design and the transformation of models. In MDA, models are systematically translated to other models and to source code. Model transformation plays a key role in MDA. Several model transformation languages have been launched lately, aiming to facilitate the translation of input models to output models. The employment of such languages in practical contexts has succeeded, although quite often those languages cannot be directly applied to a particular type of model transformation, called refinement. Approach: This study provides a general overview of model refinement and investigates two approaches for model refinement based on the Atlas Transformation Language (ATL), referred to as refining mode and module superimposition. ATL is a widely adopted language for solving model transformation problems in the MDA approach. Results: This study presents the comparative results obtained from the analysis of the refining mode and module superimposition approaches, emphasizing their application benefits. Conclusion: The increasing use of MDA for the design of software systems has encouraged research into how developers may benefit from approaches that perform model refinement. The main advantages achieved with the use of the module superimposition technique are improved maintainability and reusability, obtained through module composition and rule superimposition. In turn, the refining mode stands out for its ease of use.

  10. Adjoint Methods for Guiding Adaptive Mesh Refinement in Tsunami Modeling

    Science.gov (United States)

    Davis, B. N.; LeVeque, R. J.

    2016-12-01

    One difficulty in developing numerical methods for tsunami modeling is the fact that solutions contain time-varying regions where much higher resolution is required than elsewhere in the domain, particularly when tracking a tsunami propagating across the ocean. The open source GeoClaw software deals with this issue by using block-structured adaptive mesh refinement to selectively refine around propagating waves. For problems where only a target area of the total solution is of interest (e.g., one coastal community), a method that allows identifying and refining the grid only in regions that influence this target area would significantly reduce the computational cost of finding a solution. In this work, we show that solving the time-dependent adjoint equation and using a suitable inner product with the forward solution allows more precise refinement of the relevant waves. We present the adjoint methodology first in one space dimension for illustration and in a broad context since it could also be used in other adaptive software, and potentially for other tsunami applications beyond adaptive refinement. We then show how this adjoint method has been integrated into the adaptive mesh refinement strategy of the open source GeoClaw software and present tsunami modeling results showing that the accuracy of the solution is maintained and the computational time required is significantly reduced through the integration of the adjoint method into adaptive mesh refinement.
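The adjoint-guided refinement criterion can be sketched in its simplest pointwise form: flag a cell when the magnitude of the product of the forward and adjoint solutions exceeds a tolerance, so only waves that actually influence the target area trigger refinement. This is a 1-D illustrative simplification of the inner-product criterion, not GeoClaw's implementation:

```python
import numpy as np

def flag_cells(forward, adjoint, tol):
    """Flag cells for refinement where the pointwise product of the forward
    solution and the adjoint solution exceeds tol. Large forward waves with
    a near-zero adjoint (no influence on the target) are left unrefined."""
    return np.abs(forward * adjoint) > tol
```

A wave of unit amplitude in a region where the adjoint vanishes is ignored, while the same wave where the adjoint is large is refined; this is how the method avoids tracking waves that never reach the target community.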

  11. The REFER (REFer for EchocaRdiogram) protocol: a prospective validation of a clinical decision rule, NT-proBNP, or their combination, in the diagnosis of heart failure in primary care. Rationale and design

    Directory of Open Access Journals (Sweden)

    Tait Lynda

    2012-10-01

    Full Text Available Abstract Background Heart failure is a major cause of mortality and morbidity. As mortality rates are high, it is important that patients seen by general practitioners with symptoms suggestive of heart failure are identified quickly and treated appropriately. Identifying patients with heart failure or deciding which patients need further tests is a challenge. All patients with suspected heart failure should be diagnosed using objective tests such as echocardiography, but it is expensive, often delayed, and limited by the significant skill shortage of trained echocardiographers. Alternative approaches for diagnosing heart failure are currently limited. Clinical decision tools that combine clinical signs, symptoms or patient characteristics are designed to be used to support clinical decision-making and validated according to strict methodological procedures. The REFER Study aims to determine the accuracy and cost-effectiveness of our previously derived novel, simple clinical decision rule, a natriuretic peptide assay, or their combination, in the triage for referral for echocardiography of symptomatic adult patients who present in general practice with symptoms suggestive of heart failure. Methods/design This is a prospective, Phase II observational, diagnostic validation study of a clinical decision rule, natriuretic peptides or their combination, for diagnosing heart failure in primary care. Consecutive adult primary care patients 55 years of age or over presenting to their general practitioner with a chief complaint of recent new onset shortness of breath, lethargy or peripheral ankle oedema of over 48 hours duration, with no obvious recurrent, acute or self-limiting cause will be enrolled. Our reference standard is based upon a three step expert specialist consensus using echocardiography and clinical variables and tests. Discussion Our clinical decision rule offers a potential solution to the diagnostic challenge of providing a timely and

  12. Adenosine stress protocols for myocardial perfusion imaging

    Directory of Open Access Journals (Sweden)

    Baškot Branislav

    2008-01-01

    Full Text Available Background/Aim. The treadmill test combined with myocardial perfusion scintigraphy (MPS) is a commonly used technique in the assessment of coronary artery disease. There are many patients, however, who may not be able to undergo the treadmill test. Such patients would benefit from pharmacological stress procedures combined with MPS. The most commonly used pharmacological agents for cardiac stress are coronary vasodilators (adenosine, dipyridamole) and catecholamines. Concomitant low-level treadmill exercise with adenosine pharmacologic stress (AdenoEX) during MPS has become commonly used in recent years. A number of studies have demonstrated a beneficial impact of the AdenoEX protocol. The aim of the study was, besides introducing into practice two types of pharmacological stress protocols with adenosine as a preparation for MPS, to compare and monitor the frequency of their side effects and their image quality and acquisition, as well as to standardize the onset time of acquisition (diagnostic imaging) for both protocols. Methods. A total of 130 patients underwent a pharmacological stress test with adenosine (vasodilator). In 108 of the patients we performed concomitant low-level (50 W) exercise (AdenoEX) on a bicycle ergometer. In 28 of the patients we performed an abbreviated adenosine protocol (AdenoSCAN). Side effects of adenosine were followed and compared between the two protocols, AdenoEX and AdenoSCAN. Also compared were image quality and the suggested time of acquisition after the stress test. Results. Numerous side effects were found, but being short-lived they did not require any active interventions. The benefits of AdenoEX versus AdenoSCAN included decreased side effects (62% vs 87%), improved safety and patient tolerance, improved target-to-background ratios because of less subdiaphragmatic activity, earlier acquisition, and improved sensitivity. Conclusion. 
The safety and efficacy of adenosine pharmacological stress is even better with concomitant

  13. CT protocol review and optimization.

    Science.gov (United States)

    Kofler, James M; Cody, Dianna D; Morin, Richard L

    2014-03-01

    To reduce the radiation dose associated with CT scans, much attention is focused on CT protocol review and improvement. In fact, annual protocol reviews will soon be required for ACR CT accreditation. A major challenge in the protocol review process is determining whether a current protocol is optimal and deciding what steps to take to improve it. In this paper, the authors describe methods for pinpointing deficiencies in CT protocols and provide a systematic approach for optimizing them. Emphasis is placed on a team approach, with a team consisting of at least one radiologist, one physicist, and one technologist. This core team completes a critical review of all aspects of a CT protocol and carefully evaluates proposed improvements. Changes to protocols are implemented only with consensus of the core team, with consideration of all aspects of the CT examination, including image quality, radiation dose, patient care and safety, and workflow.

  14. FAST CP: protocol of a randomised controlled trial of the efficacy of a 12-week combined Functional Anaerobic and Strength Training programme on muscle properties and mechanical gait deficiencies in adolescents and young adults with spastic-type cerebral palsy.

    Science.gov (United States)

    Gillett, Jarred G; Lichtwark, Glen A; Boyd, Roslyn N; Barber, Lee A

    2015-06-26

    Individuals with cerebral palsy (CP) have muscles that are smaller, weaker and more resistant to stretch compared to typically developing people. Progressive resistance training leads to increases in muscle size and strength. In CP, the benefits of resistance training alone may not transfer to improve other activities such as walking; however, the transfer of strength improvements to improved mobility may be enhanced by performing training that involves specific functional tasks or motor skills. This study aims to determine the efficacy of combined functional anaerobic and strength training in (1) influencing muscle strength, structure and function and (2) to determine if any changes in muscle strength and structure following training impact on walking ability and gross motor functional capacity and performance in the short (following 3 months of training) and medium terms (a further 3 months post-training). 40 adolescents and young adults with CP will be recruited to undertake a 12-week training programme. The training programme will consist of 3 × 75 min sessions per week, made up of 5 lower limb resistance exercises and 2-3 functional anaerobic exercises per session. The calf muscles will be specifically targeted, as they are the most commonly impacted muscles in CP and are a key muscle group involved in walking. If, as we believe, muscle properties change following combined strength and functional training, there may be long-term benefits of this type of training in slowing the deterioration of muscle function in people with spastic-type CP. Ethical approval has been obtained from the ethics committees at The University of Queensland (2014000066) and Children's Health Queensland (HREC/15/QRCH/30). The findings will be disseminated by publications in peer-reviewed journals, conferences and local research organisations' media. Australian and New Zealand Clinical Trials Registry (ACTRN12614001217695). Published by the BMJ Publishing Group Limited. 

  15. FAST CP: protocol of a randomised controlled trial of the efficacy of a 12-week combined Functional Anaerobic and Strength Training programme on muscle properties and mechanical gait deficiencies in adolescents and young adults with spastic-type cerebral palsy

    Science.gov (United States)

    Gillett, Jarred G; Lichtwark, Glen A; Boyd, Roslyn N; Barber, Lee A

    2015-01-01

    Introduction Individuals with cerebral palsy (CP) have muscles that are smaller, weaker and more resistant to stretch compared to typically developing people. Progressive resistance training leads to increases in muscle size and strength. In CP, the benefits of resistance training alone may not transfer to improve other activities such as walking; however, the transfer of strength improvements to improved mobility may be enhanced by performing training that involves specific functional tasks or motor skills. This study aims to determine the efficacy of combined functional anaerobic and strength training in (1) influencing muscle strength, structure and function and (2) to determine if any changes in muscle strength and structure following training impact on walking ability and gross motor functional capacity and performance in the short (following 3 months of training) and medium terms (a further 3 months post-training). Methods and analysis 40 adolescents and young adults with CP will be recruited to undertake a 12-week training programme. The training programme will consist of 3×75 min sessions per week, made up of 5 lower limb resistance exercises and 2–3 functional anaerobic exercises per session. The calf muscles will be specifically targeted, as they are the most commonly impacted muscles in CP and are a key muscle group involved in walking. If, as we believe, muscle properties change following combined strength and functional training, there may be long-term benefits of this type of training in slowing the deterioration of muscle function in people with spastic-type CP. Ethics and dissemination Ethical approval has been obtained from the ethics committees at The University of Queensland (2014000066) and Children's Health Queensland (HREC/15/QRCH/30). The findings will be disseminated by publications in peer-reviewed journals, conferences and local research organisations’ media. Trial registration number Australian and New Zealand Clinical Trials

  16. FRENCH PROTOCOL CARDS

    CERN Multimedia

    Division des Ressources Humaines; Human Resources Division; Tel. 74683-79494

    2000-01-01

    Senior officials, holders of FRENCH PROTOCOL cards (blue cards) due to expire on 31.12.2000, are requested to return these cards and those of family members, for extension to: Bureau des cartes, Bât 33.1-009/1-015 Should the three spaces for authentication on the back of the card be full, please enclose two passport photographs for a new card. In the case of children aged 14 and over, an attestation of dependency and a school certificate should be returned with the card.

  18. FRENCH PROTOCOL CARDS

    CERN Multimedia

    Division du Personnel

    1999-01-01

Senior officials, holders of FRENCH PROTOCOL cards (blue cards) due to expire on 31.12.1999, are requested to return these cards and those of family members, for extension to: Bureau des cartes, bâtiment 33.1-025. Should the 3 spaces for authentication on the back of the card be full, please enclose 2 passport photographs for a new card. In the case of children aged 14 and over, an attestation of dependency and a school certificate should be returned with the card. Personnel Division, Tel. 79494/74683

  19. Adaptive mesh refinement for stochastic reaction-diffusion processes

    Science.gov (United States)

    Bayati, Basil; Chatelain, Philippe; Koumoutsakos, Petros

    2011-01-01

    We present an algorithm for adaptive mesh refinement applied to mesoscopic stochastic simulations of spatially evolving reaction-diffusion processes. The transition rates for the diffusion process are derived on adaptive, locally refined structured meshes. Convergence of the diffusion process is presented and the fluctuations of the stochastic process are verified. Furthermore, a refinement criterion is proposed for the evolution of the adaptive mesh. The method is validated in simulations of reaction-diffusion processes as described by the Fisher-Kolmogorov and Gray-Scott equations.
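The refinement criterion the abstract mentions can be illustrated with a minimal sketch (our own toy example, not the authors' algorithm): refine every 1-D cell whose local solution gradient exceeds a threshold, with hypothetical node data.

```python
def refine_mesh(xs, us, tol=0.5):
    """Insert a midpoint into every cell whose slope magnitude exceeds tol.

    xs: sorted node coordinates; us: solution values at the nodes.
    Returns the refined node list; new values are linearly interpolated.
    """
    new_xs, new_us = [xs[0]], [us[0]]
    for i in range(len(xs) - 1):
        dx = xs[i + 1] - xs[i]
        slope = abs(us[i + 1] - us[i]) / dx
        if slope > tol:                         # refinement criterion
            new_xs.append(xs[i] + dx / 2)
            new_us.append((us[i] + us[i + 1]) / 2)
        new_xs.append(xs[i + 1])
        new_us.append(us[i + 1])
    return new_xs, new_us

# A steep front near x = 0.5 triggers refinement only in that cell.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
us = [0.0, 0.0, 1.0, 1.0, 1.0]
rx, ru = refine_mesh(xs, us, tol=0.5)
```

Applied repeatedly as the solution evolves, this gives the adaptive, locally refined grids the abstract describes, though the paper's criterion for stochastic fluctuations is more elaborate.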

  20. China Expands Oil Refining Capacity to Meet Rising Demand

    Institute of Scientific and Technical Information of China (English)

    Yin Yin

    2006-01-01

With the booming of China's auto market, the oil refining industry is falling over itself to boost capacity in the coastal regions. Sinochem Corporation is the latest out of the starting blocks. Its new oil refining project in southeastern Fujian Province will see it move into territory already occupied by the nation's big-three oil companies - China National Petrochemical Corporation (Sinopec), China's largest oil refiner, China National Petroleum Corporation (CNPC), the largest crude producer in the country, and China National Offshore Oil Corporation (CNOOC), the giant offshore oil player.

  1. Segmental Refinement: A Multigrid Technique for Data Locality

    KAUST Repository

    Adams, Mark F.

    2016-08-04

    We investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. We present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.
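Segmental refinement builds on standard multigrid. As background, here is a minimal two-grid correction cycle for the 1-D Poisson problem -u'' = f (an illustrative sketch, not the paper's domain-decomposed scheme; the smoothing counts and the Jacobi coarse solve are our assumptions):

```python
import math

def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi smoothing for -u'' = f with zero boundary values."""
    n = len(u)
    for _ in range(sweeps):
        v = u[:]
        for i in range(n):
            left = v[i - 1] if i > 0 else 0.0
            right = v[i + 1] if i < n - 1 else 0.0
            u[i] = (1 - w) * v[i] + w * (left + right + h * h * f[i]) / 2.0
    return u

def residual(u, f, h):
    n = len(u)
    out = []
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        out.append(f[i] - (2.0 * u[i] - left - right) / (h * h))
    return out

def two_grid(u, f, h):
    """One two-grid correction cycle, the building block of multigrid."""
    u = jacobi(u, f, h, 3)                                   # pre-smooth
    r = residual(u, f, h)
    # full-weighting restriction of the residual to the coarse grid (2h)
    rc = [(r[2 * j] + 2.0 * r[2 * j + 1] + r[2 * j + 2]) / 4.0
          for j in range((len(r) - 1) // 2)]
    ec = jacobi([0.0] * len(rc), rc, 2 * h, 200)             # coarse solve
    # linear interpolation of the coarse error, then correction
    for i in range(len(u)):
        if i % 2 == 1:
            u[i] += ec[i // 2]
        else:
            left = ec[i // 2 - 1] if i // 2 >= 1 else 0.0
            right = ec[i // 2] if i // 2 < len(ec) else 0.0
            u[i] += (left + right) / 2.0
    return jacobi(u, f, h, 3)                                # post-smooth

# -u'' = pi^2 sin(pi x) on (0,1), u(0) = u(1) = 0; exact solution sin(pi x).
n, h = 15, 1.0 / 16.0
f = [math.pi ** 2 * math.sin(math.pi * (i + 1) * h) for i in range(n)]
u = [0.0] * n
for _ in range(10):
    u = two_grid(u, f, h)
```

In full multigrid the coarse solve is itself recursive; segmental refinement, as the abstract notes, additionally decomposes the fine grids so that this inter-grid communication is confined to subdomains.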

  2. Field validation of protocols developed to evaluate in-line mastitis detection systems.

    Science.gov (United States)

    Kamphuis, C; Dela Rue, B T; Eastwood, C R

    2016-02-01

This paper reports on a field validation of previously developed protocols for evaluating the performance of in-line mastitis-detection systems. The protocols outlined 2 requirements of these systems: (1) to detect cows with clinical mastitis (CM) promptly and accurately to enable timely and appropriate treatment and (2) to identify cows with high somatic cell count (SCC) to manage bulk milk SCC levels. Gold standard measures, evaluation tests, performance measures, and performance targets were proposed. The current study validated the protocols on commercial dairy farms with automated in-line mastitis-detection systems using both electrical conductivity (EC) and SCC sensor systems that both monitor at whole-udder level. The protocol for requirement 1 was applied on 3 commercial farms. For requirement 2, the protocol was applied on 6 farms; 3 of them had low bulk milk SCC (128×10³ cells/mL) and were the same farms as used for field evaluation of requirement 1. Three farms with high bulk milk SCC (270×10³ cells/mL) were additionally enrolled. The field evaluation methodology and results were presented at a workshop including representation from 7 international suppliers of in-line mastitis-detection systems. Feedback was sought on the acceptance of standardized performance evaluation protocols and recommended refinements to the protocols. Although the methodology for requirement 1 was relatively labor intensive and required organizational skills over an extended period, no major issues were encountered during the field validation of both protocols. The validation, thus, proved the protocols to be practical. Also, no changes to the data collection process were recommended by the technology supplier representatives. However, 4 recommendations were made to refine the protocols: inclusion of an additional analysis that ignores small (low-density) clot observations in the definition of CM, extension of the time window from 4 to 5 milkings for timely alerts for CM
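The kind of performance measures such protocols specify could be computed along these lines (a hedged sketch with hypothetical alert data; the protocols' exact gold-standard definitions and time windows are not reproduced here):

```python
def evaluate_alerts(cm_milkings, alert_milkings, total_milkings, window=4):
    """Sensitivity of CM detection with a timeliness window, plus false-alert rate.

    cm_milkings: milking indices at which clinical mastitis was observed
    (the gold standard); alert_milkings: indices at which the in-line system
    raised an alert. An alert within `window` milkings before or at the CM
    observation counts as a timely detection.
    """
    alerts = set(alert_milkings)
    windows = {a for m in cm_milkings for a in range(m - window, m + 1)}
    detected = sum(
        1 for m in cm_milkings
        if any(a in alerts for a in range(m - window, m + 1))
    )
    sensitivity = detected / len(cm_milkings)
    false_alerts = len(alerts - windows)
    false_alert_rate = 1000 * false_alerts / total_milkings  # per 1,000 milkings
    return sensitivity, false_alert_rate

# Hypothetical season: 3 CM cases, 3 system alerts, 600 milkings in total.
sens, far = evaluate_alerts(cm_milkings=[10, 50, 90],
                            alert_milkings=[8, 49, 70],
                            total_milkings=600)
```

Here two of the three cases are detected in time (sensitivity 2/3) and the unmatched alert counts toward the false-alert rate.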

  3. Estimation of the Thurstonian model for the 2-AC protocol

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Lee, Hye-Seong; Brockhoff, Per B.

    2012-01-01

The 2-AC protocol is a 2-AFC protocol with a “no-difference” option and is technically identical to the paired preference test with a “no-preference” option. The Thurstonian model for the 2-AC protocol is parameterized by δ and a decision parameter τ, the estimates of which can be obtained by fairly simple well-known methods. In this paper we describe how standard errors of the parameters can be obtained and how exact power computations can be performed. We also show how the Thurstonian model for the 2-AC protocol is closely related to a statistical model known as a cumulative probit model. This relationship makes it possible to extract estimates and standard errors of δ and τ from general statistical software, and furthermore, it makes it possible to combine standard regression modelling with the Thurstonian model for the 2-AC protocol. A model for replicated 2-AC data is proposed using cumulative
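The "fairly simple well-known methods" amount to inverting the cumulative probit at the observed category proportions. A sketch under an assumed sign convention (which may differ from the paper's parameterization), using only the Python standard library:

```python
import math
from statistics import NormalDist

def estimate_2ac(n_a, n_no, n_b):
    """Probit estimates of (delta, tau) from 2-AC response counts.

    n_a, n_no, n_b: counts of "prefer A", "no preference", "prefer B".
    Assumed cumulative-probit identities (sign conventions vary):
        Phi^{-1}(p_A)        = (-delta - tau) / sqrt(2)
        Phi^{-1}(p_A + p_no) = ( tau - delta) / sqrt(2)
    """
    total = n_a + n_no + n_b
    z1 = NormalDist().inv_cdf(n_a / total)
    z2 = NormalDist().inv_cdf((n_a + n_no) / total)
    delta = -(z1 + z2) / math.sqrt(2)
    tau = (z2 - z1) / math.sqrt(2)
    return delta, tau

# Hypothetical panel: 15 prefer A, 10 no preference, 75 prefer B.
delta, tau = estimate_2ac(15, 10, 75)
```

The decision parameter τ is positive by construction whenever the "no preference" category is used at all; standard errors, as the abstract notes, come from the cumulative probit machinery rather than from these closed forms.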

  4. Protocol for Combined Analysis of FOXFIRE, SIRFLOX, and FOXFIRE-Global Randomized Phase III Trials of Chemotherapy +/- Selective Internal Radiation Therapy as First-Line Treatment for Patients With Metastatic Colorectal Cancer.

    Science.gov (United States)

    Virdee, Pradeep S; Moschandreas, Joanna; Gebski, Val; Love, Sharon B; Francis, E Anne; Wasan, Harpreet S; van Hazel, Guy; Gibbs, Peter; Sharma, Ricky A

    2017-03-28

    In colorectal cancer (CRC), unresectable liver metastases are associated with a poor prognosis. The FOXFIRE (an open-label randomized phase III trial of 5-fluorouracil, oxaliplatin, and folinic acid +/- interventional radioembolization as first-line treatment for patients with unresectable liver-only or liver-predominant metastatic colorectal cancer), SIRFLOX (randomized comparative study of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma), and FOXFIRE-Global (assessment of overall survival of FOLFOX6m plus SIR-Spheres microspheres versus FOLFOX6m alone as first-line treatment in patients with nonresectable liver metastases from primary colorectal carcinoma in a randomized clinical study) clinical trials were designed to evaluate the efficacy and safety of combining first-line chemotherapy with selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres, also called transarterial radioembolization. The aim of this analysis is to prospectively combine clinical data from 3 trials to allow adequate power to evaluate the impact of chemotherapy with SIRT on overall survival. Eligible patients are adults with histologically confirmed CRC and unequivocal evidence of liver metastases which are not treatable by surgical resection or local ablation with curative intent at the time of study entry. Patients may also have limited extrahepatic metastases. Final analysis will take place when all participants have been followed up for a minimum of 2 years. Efficacy and safety estimates derived using individual participant data (IPD) from SIRFLOX, FOXFIRE, and FOXFIRE-Global will be pooled using 2-stage prospective meta-analysis. Secondary outcome measures include progression-free survival (PFS), liver-specific PFS, health-related quality of life, response rate, resection rate, and adverse event profile. The large study population will

  5. Gradual refinement for application-specific MPSoC design from Simulink model to RTL implementation

    Institute of Scientific and Technical Information of China (English)

Kai HUANG; Xiao-lang YAN; Sang-il HAN; Soo-ik CHAE; Ahmed A. JERRAYA; Katalin POPOVICI; Xavier GUERIN; Lisane BRISOLARA; Luigi CARRO

    2009-01-01

The application-specific multiprocessor system-on-chip (MPSoC) architecture is becoming an attractive solution to deal with increasingly complex embedded applications, which require both high performance and flexible programmability. As an effective method for MPSoC development, we present a gradual refinement flow starting from a high-level Simulink model to a synthesizable and executable hardware and software specification. The proposed methodology consists of five different abstract levels: Simulink combined algorithm and architecture model (CAAM), virtual architecture (VA), transaction-accurate architecture (TA), virtual prototype (VP) and field-programmable gate array (FPGA) emulation. Experimental results of Motion-JPEG and H.264 show that the proposed gradual refinement flow can generate various MPSoC architectures from an original Simulink model, allowing processor, communication and task design space exploration.

  6. The Kantorovich macro-or-mesoscopic refined solution for the heterogeneous functionally gradient material complex structure

    Institute of Scientific and Technical Information of China (English)

    LI; Yong(李永); SONG; Jian(宋健); ZHANG; Zhiming(张志民)

    2003-01-01

This paper studies a complex structure made of functionally gradient materials: an applicable triangular cantilever plate structure locally fixed and supported by its round revolving axis. Combined with the generalized Euler equation and the generalized boundary conditions, the Kantorovich method and the principle of the two-independent-variable generalized calculus of variations are adopted to establish the bending governing equation of the plates and to work out the solution. In comparison with previous work on the problem, this paper, taking into account three generalized mechanical factors and FGM macro-or-mesoscopic heterogeneity, proposes a new concept of translating the issue of a theoretical initial value into a problem of semi-analytical boundary value to obtain the refined solution, and then investigates the joint effect of the graded stress fields. Thereby a refined version of the Kantorovich macro-or-mesoscopic solution is developed.

  7. COMPARISON ON REFINEMENT OF IRON POWDER BY BALL MILLING ASSISTED BY DIFFERENT EXTERNAL FIELDS

    Institute of Scientific and Technical Information of China (English)

    L.Y. Dai; B. Cao; M. Zhu

    2006-01-01

Cryogenic milling and milling in conjunction with dielectric barrier discharge plasma (DBDP) have been separately set up. The combined effect of low temperature and plasma on ball milling has been investigated by examining the refinement of particle size and grain size of iron powder using scanning electron microscopy, X-ray diffraction, and small-angle X-ray scattering. It was found that the mean size of iron particles could reach 104 nm after only 10 hours of ball milling in conjunction with DBDP, whereas a minimum average grain size of 8.4 nm was obtained by cryomilling at -20℃; however, it is difficult to refine the particle size and grain size under the same milling conditions in the absence of DBDP and cryogenic temperature.

  8. Effect of transcranial direct current stimulation combined with gait and mobility training on functionality in children with cerebral palsy: study protocol for a double-blind randomized controlled clinical trial.

    Science.gov (United States)

    Grecco, Luanda André Collange; Duarte, Natália de Almeida Carvalho; de Mendonça, Mariana Emerenciano; Pasini, Hugo; Lima, Vânia Lúcia Costa de Carvalho; Franco, Renata Calhes; de Oliveira, Luis Vicente Franco; de Carvalho, Paulo de Tarso Camilo; Corrêa, João Carlos Ferrari; Collange, Nelci Zanon; Sampaio, Luciana Maria Malosá; Galli, Manuela; Fregni, Felipe; Oliveira, Claudia Santos

    2013-10-11

The project proposes three innovative intervention techniques (treadmill training, mobility training with virtual reality and transcranial direct current stimulation) that can be safely administered to children with cerebral palsy. The combination of transcranial stimulation and physical therapy resources will provide the training of a specific task with multiple rhythmic repetitions of the phases of the gait cycle, providing rich sensory stimuli with a modified excitability threshold of the primary motor cortex to enhance local synaptic efficacy and potentiate motor learning. A prospective, double-blind, randomized, controlled, analytical, clinical trial will be carried out. Eligible participants will be children with cerebral palsy classified on levels I, II and III of the Gross Motor Function Classification System between four and ten years of age. The participants will be randomly allocated to four groups: 1) gait training on a treadmill with placebo transcranial stimulation; 2) gait training on a treadmill with active transcranial stimulation; 3) mobility training with virtual reality and placebo transcranial stimulation; 4) mobility training with virtual reality and active transcranial stimulation. Transcranial direct current stimulation will be applied with the anodal electrode positioned in the region of the dominant hemisphere over C3, corresponding to the primary motor cortex, and the cathode positioned in the supraorbital region contralateral to the anode. A 1 mA current will be applied for 20 minutes. Treadmill training and mobility training with virtual reality will be performed in 30-minute sessions five times a week for two weeks (total of 10 sessions). Evaluations will be performed on four occasions: one week prior to the intervention; one week following the intervention; one month after the end of the intervention; and 3 months after the end of the intervention. The evaluations will involve three-dimensional gait analysis, analysis of cortex

  9. Plasmodium falciparum gametocyte carriage, sex ratios and asexual parasite rates in Nigerian children before and after a treatment protocol policy change instituting the use of artemisinin-based combination therapies

    Directory of Open Access Journals (Sweden)

    Grace Olusola Gbotosho

    2011-09-01

The effects of artemisinin-based combination therapies (ACTs) on transmission of Plasmodium falciparum were evaluated after a policy change instituting the use of ACTs in an endemic area. P. falciparum gametocyte carriage, sex ratios and inbreeding rates were examined in 2,585 children at presentation with acute falciparum malaria during a 10-year period from 2001-2010. Asexual parasite rates were also evaluated from 2003-2010 in 10,615 children before and after the policy change. Gametocyte carriage declined significantly from 12.4% in 2001 to 3.6% in 2010 (χ² for trend = 44.3, p < 0.0001), but sex ratios and inbreeding rates remained unchanged. Additionally, overall parasite rates remained unchanged before and after the policy change (47.2% vs. 45.4%), but these rates declined significantly from 2003-2010 (χ² for trend = 35.4, p < 0.0001). Chloroquine (CQ) and artemether-lumefantrine (AL) were used as prototype drugs before and after the policy change, respectively. AL significantly shortened the duration of male gametocyte carriage in individual patients after treatment began compared with CQ (log rank statistic = 7.92, p = 0.005). ACTs reduced the rate of gametocyte carriage in children with acute falciparum infections at presentation and shortened the duration of male gametocyte carriage after treatment. However, parasite population sex ratios, inbreeding rates and overall parasite rate were unaffected.
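A χ² test for trend over ordered groups, as reported in the abstract, can be computed in the Cochran-Armitage style; the sketch below uses hypothetical counts, not the study's data:

```python
def chi2_trend(cases, totals, scores=None):
    """Cochran-Armitage chi-square statistic (1 df) for a trend in proportions.

    cases[i] / totals[i]: positives and group size for ordered group i
    (e.g. gametocyte carriers per survey period); scores default to 0, 1, 2, ...
    """
    k = len(cases)
    if scores is None:
        scores = list(range(k))
    N = sum(totals)
    p = sum(cases) / N                                  # pooled proportion
    t = sum(s * (r - n * p) for s, r, n in zip(scores, cases, totals))
    var = p * (1 - p) * (
        sum(n * s * s for s, n in zip(scores, totals))
        - sum(n * s for s, n in zip(scores, totals)) ** 2 / N
    )
    return t * t / var

# Hypothetical counts: carriage falling across three ordered survey periods.
chi2 = chi2_trend(cases=[62, 40, 18], totals=[500, 500, 500])
```

A large statistic relative to the χ²(1 df) distribution, as here, indicates a significant monotone trend in the proportions.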

  10. Refined Aluminum Industry Suffers From Deficit and Western Investment Accelerates

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Against the backdrop of losses across the refined aluminum industry, investment in electrolytic aluminum is accelerating. The reporter learnt from a recent survey that many companies, including Shandong Xinfa Group, East Hope

  11. Biorthogonal multiple wavelets generated by vector refinement equation

    Institute of Scientific and Technical Information of China (English)

    Song LI; Jun XIAN

    2007-01-01

Biorthogonal multiple wavelets are generated from refinable function vectors by using the multiresolution analysis. In this paper we provide a general method for the construction of compactly supported biorthogonal multiple wavelets by refinable function vectors which are the solutions of vector refinement equations of the form φ(x) = Σ_{α∈Z^s} a(α)φ(Mx − α), x ∈ R^s, where the vector of functions φ = (φ1, ..., φr)^T is in (L2(R^s))^r, a := (a(α))_{α∈Z^s} is a finitely supported sequence of r × r matrices called the refinement mask, and M is an s × s integer matrix such that lim_{n→∞} M^{−n} = 0. Our characterizations are in the general setting and the main results of this paper are the real extensions of some known results.

  12. Biorthogonal multiple wavelets generated by vector refinement equation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Biorthogonal multiple wavelets are generated from refinable function vectors by using the multiresolution analysis. In this paper we provide a general method for the construction of compactly supported biorthogonal multiple wavelets by refinable function vectors which are the solutions of vector refinement equations of the form φ(x) = Σ_{α∈Z^s} a(α)φ(Mx − α), x ∈ R^s, where the vector of functions φ = (φ1, ..., φr)^T is in (L2(R^s))^r, a := (a(α))_{α∈Z^s} is a finitely supported sequence of r × r matrices called the refinement mask, and M is an s × s integer matrix such that lim_{n→∞} M^{−n} = 0. Our characterizations are in the general setting and the main results of this paper are the real extensions of some known results.
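In the scalar case (r = 1, s = 1, M = 2), a refinement equation of this form can be solved numerically by the cascade (subdivision) algorithm; the hat-function mask used below is an illustrative assumption, not taken from the paper:

```python
def cascade(mask, offset, levels):
    """Approximate the solution of phi(x) = sum_k a[k] phi(2x - k)
    by subdivision: c_{n+1}[j] = sum_k a[j - 2k] c_n[k].

    mask: coefficients a[offset], a[offset+1], ... (should sum to 2);
    after n levels the returned dict approximates phi on the grid 2^{-n} Z,
    i.e. c[j] ~ phi(j / 2**levels).
    """
    def a(i):
        j = i - offset
        return mask[j] if 0 <= j < len(mask) else 0.0

    c = {0: 1.0}                      # start from a delta sequence
    lo, hi = 0, 0
    for _ in range(levels):
        new = {}
        for j in range(2 * lo + offset, 2 * hi + offset + len(mask)):
            s = sum(a(j - 2 * k) * c[k] for k in range(lo, hi + 1))
            if s:
                new[j] = s
        c = new
        lo, hi = min(c), max(c)
    return c

# Mask a = (1/2, 1, 1/2) on k = -1, 0, 1: phi is the piecewise-linear hat.
c = cascade([0.5, 1.0, 0.5], offset=-1, levels=3)
```

For this mask the iteration reproduces the hat function exactly at dyadic points, e.g. c[4] = phi(1/2) = 1/2; the vector/matrix case of the paper replaces the scalar coefficients with r × r matrices.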

  13. Recent Development and Status of CNPC's Refining and Chemical Industry

    Institute of Scientific and Technical Information of China (English)

    Zhou Weiyong

    1997-01-01

In the past one and a half years since implementing China National Petroleum Corporation's (CNPC) Ninth Five-Year Plan (1996-2000), CNPC's oil refining and chemical industry has made considerable progress.

  14. Latest Development in China's Oil Refining & Petrochemical Sector

    Institute of Scientific and Technical Information of China (English)

    Xia Zuhua

    2008-01-01

China's big-three oil companies - CNPC, Sinopec and CNOOC - have intensified their competition in the domestic oil refining and petrochemical sector in recent years so as to better position themselves in the oil products market at home.

  15. Value Source of Integrated Refining/Chemical Plant

    Institute of Scientific and Technical Information of China (English)

    Zhao Jianwei

    2009-01-01

Based on an integrated refining/chemical plant processing 15 Mt/a of crude and manufacturing 1.0 Mt/a of ethylene under the guideline of "engaging in refining, olefins and aromatics by whatever appropriate means" to maximize the overall value of the integrated refining/chemical plant, it is necessary to concentrate on the flow diagram and the scheme for mutual supply of materials between the refinery and the ethylene plant. After analyzing the feedstock slate and the composition and properties of products, it is proposed to optimize the integrated refining/chemical plant in order to reduce investment and operating cost and realize maximization of the value of the integrated plant.

  16. ADAPTIVE MODEL REFINEMENT FOR THE IONOSPHERE AND THERMOSPHERE

    Data.gov (United States)

National Aeronautics and Space Administration — Adaptive Model Refinement for the Ionosphere and Thermosphere. Anthony M. D'Amato, Aaron J. Ridley, and Dennis S. Bernstein. Abstract: Mathematical models of...

  17. High resolution single particle refinement in EMAN2.1.

    Science.gov (United States)

    Bell, James M; Chen, Muyuan; Baldwin, Philip R; Ludtke, Steven J

    2016-05-01

EMAN2.1 is a complete image processing suite for quantitative analysis of grayscale images, with a primary focus on transmission electron microscopy, with complete workflows for performing high resolution single particle reconstruction, 2-D and 3-D heterogeneity analysis, random conical tilt reconstruction and subtomogram averaging, among other tasks. In this manuscript we provide the first detailed description of the high resolution single particle analysis pipeline and the philosophy behind its approach to the reconstruction problem. High resolution refinement is a fully automated process, and involves an advanced set of heuristics to select optimal algorithms for each specific refinement task. A gold standard FSC is produced automatically as part of refinement, providing a robust resolution estimate for the final map, and this is used to optimally filter the final CTF phase and amplitude corrected structure. Additional methods are in place to reduce model bias during refinement, and to permit cross-validation using other computational methods.
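The gold-standard FSC mentioned above correlates two independently refined half-maps shell by shell in Fourier space. A sketch of the FSC formula on synthetic Fourier coefficients (our own toy data, not EMAN2.1's implementation):

```python
import random

def fsc(f1, f2, shells):
    """Fourier shell correlation between two independent half-map transforms.

    f1, f2: dicts mapping a frequency index to a complex coefficient;
    shells: lists of frequency indices, one list per resolution shell.
    """
    curve = []
    for shell in shells:
        num = sum(f1[q] * f2[q].conjugate() for q in shell).real
        den = (sum(abs(f1[q]) ** 2 for q in shell)
               * sum(abs(f2[q]) ** 2 for q in shell)) ** 0.5
        curve.append(num / den)
    return curve

# Synthetic half maps: shared signal everywhere, but independent noise that
# swamps the signal in the high-frequency shell.
random.seed(1)
gauss = lambda s: complex(random.gauss(0, s), random.gauss(0, s))
signal = {q: gauss(1.0) for q in range(40)}
half1 = {q: signal[q] + gauss(0.05 if q < 20 else 20.0) for q in range(40)}
half2 = {q: signal[q] + gauss(0.05 if q < 20 else 20.0) for q in range(40)}
curve = fsc(half1, half2, [list(range(20)), list(range(20, 40))])
```

The curve stays near 1 where signal dominates and falls toward 0 where the half-maps share only noise; the spatial frequency where it crosses a fixed threshold is reported as the resolution estimate.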

  18. Risk Assessment: Perchloroethylene Dry Cleaners Refined Human Health Risk Characterization

    Science.gov (United States)

    This November 2005 memo and appendices describe the methods by which EPA conducted its refined risk assessment of the Major Source and Area Source facilities within the perchloroethylene (perc) dry cleaners source category.

  19. Refinement of reduced-models for dynamic systems

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    A refinement procedure for the reduced models of structural dynamic systems is presented in this article. The refinement procedure is to "tune" the parameters of a reduced model, which could be obtained from any traditional model reduction scheme, into an improved reduced model. Upon the completion of the refinement, the improved reduced model matches the dynamic characteristics - the chosen structural frequencies and their mode shapes - of the full order model. Mathematically, the procedure to implement the model refinement technique is an application of the recently developed cross-model cross-mode (CMCM) method for model updating. A numerical example of reducing a 5-DOF (degree-of-freedom) classical mass-spring (or shear-building) model into a 3-DOF generalized mass-spring model is demonstrated in this article.

  20. CITGO Petroleum Corporation and PDV Midwest Refining, LLC Settlement

    Science.gov (United States)

    CITGO Petroleum Corporation and PDV Midwest Refining, LLC (collectively, CITGO) have agreed to pay a $1,955,000 civil penalty, perform environmental projects totaling more than $2 million, and spend an estimated $42 million in injunctive relief to resolve.

  1. REFINABLE DISTRIBUTIONS SUPPORTED ON SELF-AFFINE TILES

    Institute of Scientific and Technical Information of China (English)

Dai Xinrong

    2002-01-01

In this paper, some conditions are given which ensure that compactly supported refinable distributions supported on a self-affine tile are Lebesgue-Stieltjes measures or absolutely continuous measures with respect to Lebesgue-Stieltjes measures.

  2. Rack gasoline and refining margins - wanted: a summer romance

    Energy Technology Data Exchange (ETDEWEB)

    1988-04-13

For the first time since late 1987, apparent refining margins on the US benchmark crude oil (based on spot purchase prices) are virtually zero. This felicitous bit of news comes loaded with possibilities of positive (maybe even good) margins in coming months, if the differential between crude buying prices and the value of the refined barrel continues to improve. What refiners in the US market are watching most closely right now are motorists. This issue also contains the following: (1) ED refining netback data for the US Gulf and Western Coasts, Rotterdam, and Singapore, prices for early April 1988; and (2) ED fuel price/tax series for countries of the Western Hemisphere, April 1988 edition. 5 figures, 5 tables.
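The apparent refining margin the article tracks is simply the volume-weighted value of the refined barrel minus the crude purchase price; a sketch with hypothetical prices and yields (not the article's figures):

```python
def gross_refining_margin(crude_price, yields, product_prices):
    """Gross refining margin in $/bbl: product-barrel value minus crude cost.

    yields: fraction of each product obtained per barrel of crude;
    product_prices: selling price of each product in $/bbl.
    """
    product_value = sum(yields[p] * product_prices[p] for p in yields)
    return product_value - crude_price

# Hypothetical late-1980s-style numbers, $/bbl.
margin = gross_refining_margin(
    crude_price=16.00,
    yields={"gasoline": 0.50, "distillate": 0.30, "residual": 0.20},
    product_prices={"gasoline": 19.00, "distillate": 17.00, "residual": 10.00},
)
```

With these numbers the product barrel is worth $16.60 against a $16.00 crude cost, a thin $0.60/bbl gross margin; netback calculations like those in the issue additionally subtract transport and processing costs.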

  3. National protocol framework for the inventory and monitoring of bees

    Science.gov (United States)

Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; O'Brien, Lee

    2016-01-01

    This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 species of native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of what bee species are present on their lands and to provide an inexpensive, simple technique for monitoring bees continuously and for monitoring and evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS’ jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland starting in 2002 and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures or SOPs and adheres to national standards of protocol content and organization. The Protocol Narrative

  4. On Refined Mining%论精细采矿

    Institute of Scientific and Technical Information of China (English)

    曲胜利; 孙豁然; 李少辉

    2011-01-01

With some abandoned mines resuming production, the traditional extensive mining model has run into many problems during the extraction of residual ore bodies. The way to solve these problems is to adopt a new model for exploiting mineral resources: refined mining. The basic meaning and principles of refined production are introduced, and the concept of refined mining and its scope of application are proposed. The main content of refined mining includes refining the description of the mining target, the stoping scheme, the mine construction, the mine information, the safety assurance and the mine management. Refined mining comprehensively adopts advanced technologies in many respects, combined with refined production management, so as to achieve the best economic, environmental and social benefits of mining.

  5. On the application of grits to thermomechanical pulp refining

    Energy Technology Data Exchange (ETDEWEB)

    Somboon, P.

    2009-07-01

    The objective of this thesis was to develop a mechanical pulping process capable of producing good-quality pulp while consuming less electrical energy. The study focused on applying grits to TMP refining to enhance the breaking of the fiber cell wall, promoting faster development of pulp fibers to the quality desired for papermaking. The study comprises laboratory-scale tests and the development of an industrial application, including trials at pilot scale. A preliminary trial with the grit application in TMP refining was conducted at laboratory scale. The results showed that the grits should be introduced between the first and second stages of TMP refining. The grit treatment of the TMP fibers caused disruption of the wall structure, opening of the outer layers and peeling-off of the cell wall. Efficient disruption with minimal shortening and weakening of fibers was obtained by operating at a low intensity and a high frequency of treatment. According to an experiment with first-stage TMP pulp, the disrupted pulp developed faster during subsequent refining, while energy consumption was reduced by up to 30% without a significant loss of pulp quality. With the aim of developing an industrial application, the refiner segments were modified by applying grits to the refiner segment surfaces. The grits were made from self-fluxing tungsten-carbide powder and a Ni-base alloy powder, which were laser-clad onto the surface of the breaker bars, the inner part of a segment. Trials with grit segments were carried out on a pilot refiner. The grit segments were applied in first-stage TMP refining, followed by treatment with base segments operated under normal mill conditions. The grit segments were found to have no negative effects on the refining system. A refiner equipped with grit segments, operated at a speed of 2400 rpm, produced pulp with a higher level of disruption of fiber cell walls than a refiner equipped with the reference segments. 
According to the results, the

  6. COMBIT: protocol of a randomised comparison trial of COMbined modified constraint induced movement therapy and bimanual intensive training with distributed model of standard upper limb rehabilitation in children with congenital hemiplegia.

    Science.gov (United States)

    Boyd, Roslyn N; Ziviani, Jenny; Sakzewski, Leanne; Miller, Laura; Bowden, Joanne; Cunnington, Ross; Ware, Robert; Guzzetta, Andrea; Al Macdonell, Richard; Jackson, Graeme D; Abbott, David F; Rose, Stephen

    2013-06-28

    Children with congenital hemiplegia often present with limitations in using their impaired upper limb which impacts on independence in activities of daily living, societal participation and quality of life. Traditional therapy has adopted a bimanual training approach (BIM) and more recently, modified constraint induced movement therapy (mCIMT) has emerged as a promising unimanual approach. Evidence of enhanced neuroplasticity following mCIMT suggests that the sequential application of mCIMT followed by bimanual training may optimise outcomes (Hybrid CIMT). It remains unclear whether more intensely delivered group based interventions (hCIMT) are superior to distributed models of individualised therapy. This study aims to determine the optimal density of upper limb training for children with congenital hemiplegia. A total of 50 children (25 in each group) with congenital hemiplegia will be recruited to participate in this randomized comparison trial. Children will be matched in pairs at baseline and randomly allocated to receive an intensive block group hybrid model of combined mCIMT followed by intensive bimanual training delivered in a day camp model (COMBiT; total dose 45 hours direct, 10 hours of indirect therapy), or a distributed model of standard occupational therapy and physiotherapy care (SC) over 12 weeks (total 45 hours direct and indirect therapy). Outcomes will be assessed at 13 weeks after commencement, and retention of effects tested at 26 weeks. The primary outcomes will be bimanual coordination and unimanual upper-limb capacity. Secondary outcomes will be participation and quality of life. Advanced brain imaging will assess neurovascular changes in response to treatment. Analysis will follow standard principles for RCTs, using two-group comparisons on all participants on an intention-to-treat basis. Comparisons will be between treatment groups using generalized linear models. ACTRN12613000181707.

  7. Research on bevacizumab combined with the FOLFOX protocol to treat metastatic colorectal cancer

    Institute of Scientific and Technical Information of China (English)

    吴玉立; 吕超; 郑铎; 吴硕东

    2013-01-01

    The common chemotherapy for colorectal cancer is the FOLFOX scheme, which contains fluorouracil, leucovorin, and oxaliplatin. Numerous clinical trials have demonstrated that bevacizumab combined with the FOLFOX scheme is safe and effective in cancer therapy, but adverse reactions including hypertension, neurotoxicity, and gastrointestinal bleeding and perforation have been reported. Scholars have carried out a series of studies on overall survival time, tumor response rate, and quality of life for patients with metastatic colorectal cancer treated with the combined scheme, but they have drawn different conclusions; the usefulness and safety of the combined scheme still need to be proved by more RCTs and meta-analyses.

  8. Phase III trial of postoperative cisplatin, interferon alpha-2b, and 5-FU combined with external radiation treatment versus 5-FU alone for patients with resected pancreatic adenocarcinoma – CapRI: study protocol [ISRCTN62866759

    Directory of Open Access Journals (Sweden)

    Schmitz-Winnenthal H

    2005-04-01

    Full Text Available Abstract After surgical intervention with curative intent in specialised centres, the five-year survival of patients with carcinoma of the exocrine pancreas is only 15%. The ESPAC-1 trial showed an increased five-year survival of 21% achieved with adjuvant chemotherapy. Investigators from the Virginia Mason Clinic have reported a 5-year survival rate of 55% in a phase II trial evaluating adjuvant chemotherapy, immunotherapy and external-beam radiation. Design The CapRI study is an open, controlled, prospective, randomised multi-centre phase III trial. Patients in study arm A will be treated as outpatients with 5-fluorouracil, cisplatin and 3 million units of interferon alpha-2b for 5 1/2 weeks, combined with external-beam radiation. After chemoradiation the patients receive continuous 5-FU infusions for two more cycles. Patients in study arm B will be treated as outpatients with intravenous bolus injections of folinic acid, followed by intravenous bolus injections of 5-FU, given on 5 consecutive days every 28 days for 6 cycles. A total of 110 patients with specimen-proven R0 or R1 resected pancreatic adenocarcinoma will be enrolled. An interim analysis for patient safety will be done one year after the start of recruitment. Evaluation of the primary endpoint will be performed two years after the last patient's enrolment. Discussion The aim of this study is to evaluate the overall survival attained by chemoradiotherapy including interferon alpha-2b administration compared with adjuvant chemotherapy. The influence of interferon alpha on the effectiveness of the chemoradiation regimen, the toxicity, the disease-free interval and the quality of life are analysed. Different factors are tested for their potential role as predictive markers.

  9. Reverse mechanical after effect during hydrogenation of zone refined iron

    Energy Technology Data Exchange (ETDEWEB)

    Spivak, L.V.; Skryabina, N.E.; Kurmaeva, L.D.; Smirnov, L.V. (Permskij Gosudarstvennyj Univ. (USSR); AN SSSR, Sverdlovsk. Inst. Fiziki Metallov)

    1984-12-01

    The relationship between hydrogenation and the reverse mechanical after-effect (RMA), a microplastic deformation, has been studied in zone-refined iron. Metallographic investigations and mechanical testing of samples hydrogenated under torsional strain have been performed. It is shown that in zone-refined iron the formation of voids responsible for irreversible hydrogen embrittlement does not occur, but the hydrogen-initiated RMA strain is conserved, i.e. the RMA effects are independent of the presence of discontinuities.

  10. Refinements and Social Order Beliefs: A Unified Survey

    OpenAIRE

    Atsushi Kajii; Stephen Morris

    1997-01-01

    This paper presents a simple framework that allows us to survey and relate some different strands of the game theory literature. We describe a "canonical" way of adding incomplete information to a complete information game. This framework allows us to give a simple "complete theory" interpretation (Kreps 1990) of standard normal form refinements such as perfection, and to relate refinements both to the "higher order beliefs literature" (Rubinstein 1989; Monderer and Samet 1989; Morris, Rob an...

  11. Refinement of Fourier Coefficients from the Stokes Deconvoluted Profile

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A computer-aided experimental technique was used to study the Stokes deconvolution of X-ray diffraction profiles. A considerable difference can be found between the Fourier coefficients obtained from the deconvolution of singlet and doublet experimental profiles. Nevertheless, the resultant physical profiles corresponding to the singlet and doublet profiles are identical. An approach is proposed to refine the Fourier coefficients; the refined coefficients coincide well with those obtained from the deconvolution of the singlet experimental profile.
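In the Stokes method, the Fourier coefficients of the physical profile are the ratio of the measured profile's coefficients to the instrumental profile's. A minimal noise-free NumPy sketch (the Gaussian profile shapes and widths are invented for illustration):

```python
import numpy as np

# Noise-free toy: the measured profile h is the circular convolution of a
# "physical" profile f with an instrumental profile g (shapes are assumptions).
N = 256
x = np.arange(N)
f = np.exp(-0.5 * ((x - N // 2) / 6.0) ** 2)    # physical profile
g = np.exp(-0.5 * ((x - N // 2) / 1.5) ** 2)    # instrumental broadening
g /= g.sum()
G = np.fft.fft(np.fft.ifftshift(g))             # centre g at index 0 first
h = np.real(np.fft.ifft(np.fft.fft(f) * G))     # simulated measured profile

# Stokes deconvolution: the Fourier coefficients of the physical profile are
# the ratio of measured to instrumental coefficients, F = H / G.
F = np.fft.fft(h) / G
f_rec = np.real(np.fft.ifft(F))
print(np.allclose(f_rec, f))  # True in this noise-free example
```

With noisy data the division amplifies high-frequency noise where G is small, which is one motivation for refining the coefficients rather than using the raw ratio.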

  12. A Simple Method for Grain Refinement of Pure Aluminium

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In order to produce fully equiaxed and fine-grained pure Al, a simple casting process using a cooling slope was examined and compared with a conventional casting process under the same casting conditions. With the cooling slope, grains over the whole section of the castings were markedly refined, and the grain-refining effect was maintained at mold temperatures up to 550℃ under the examined casting conditions.

  13. Low-radiation-dose dual-phase MDCT protocol with split contrast media dose and time optimization: protocol design for renal donors evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Bazeed, Mohamed Fayez (Dept. of Diagnostic Radiology, Faculty of Medicine, Mansoura Univ. (Egypt)), email: bazeed@mans.eun.eg; Fooshang, Fawzy F (National Inst. of Urology and Nephrology (Egypt)); Ahmed, Magdy Aly (Nephrology Dept., Armed Forces Hospitals Southern Region (Saudi Arabia))

    2011-10-15

    Background A routine multiphase computed tomography (CT) protocol is associated with high radiation exposure for potential kidney donors. To reduce radiation exposure, several authors have suggested a reduction in the number of phases. Purpose To evaluate a low-radiation-dose, dual-phase protocol (i.e. a protocol with an unenhanced phase and a combined vascular and excretory phase) for the preoperative evaluation of potential renal donors. Material and Methods Sixty-five potential renal donors were divided into two groups. The first group was scanned with a routine quadruple-phase protocol (non-contrast, arterial, venous, and delayed), and the second group was scanned with a triple-phase protocol (the dual-phase protocol plus a venous phase). In the second group, we replaced CT angiography with a routine abdominal CT technique. In addition to the evaluation of renal arteries, veins, and excretory systems, the radiation dose of the suggested protocol was compared to that of the routine quadruple-phase protocol. Results The suggested protocol was efficient in the evaluation of the renal arteries, veins, and excretory systems in all studied potential renal donors. Renal arteries were well visualized in the combined vascular-excretory phase using the routine abdominal CT technique; no significant difference was noted when these results were compared to those obtained from the CT angiography used in the quadruple-phase protocol. The mean effective radiation dose of the suggested dual-phase protocol was only 34% of the dose resulting from the routine quadruple-phase protocol. Conclusion Use of a low-radiation, dual-phase CT protocol, relying on an unenhanced phase and a combined vascular and excretory phase, significantly reduced the radiation dose. Furthermore, the proposed protocol provides adequate visualization of the renal arteries and veins, and affords sufficient opacification of the urinary tract using improved acquisition triggering.

  14. Some theoretical comparisons of refined Ritz vectors and Ritz vectors

    Institute of Scientific and Technical Information of China (English)

    JIA Zhongxiao

    2004-01-01

    Refined projection methods proposed by the author have received attention internationally. We are concerned with a conventional projection method and its refined counterpart for computing approximations to a simple eigenpair (λ, x) of a large matrix A. Given a subspace W that contains an approximation to x, these two methods compute approximations (μ, x̃) and (μ, x̂) to (λ, x), respectively. We establish three results. First, the refined eigenvector approximation, or simply the refined Ritz vector, x̂ is unique as the deviation of x from W approaches zero if λ is simple. Second, in terms of the residual norm of the refined approximate eigenpair (μ, x̂), we derive lower and upper bounds for the sine of the angle between the Ritz vector x̃ and the refined eigenvector approximation x̂, and we prove that x̃ ≠ x̂ unless x̂ = x. Third, we establish relationships between the residual norm ‖Ax̃ − μx̃‖ of the conventional method and the residual norm ‖Ax̂ − μx̂‖ of the refined method, and we show that the latter is always smaller than the former if (μ, x̂) is not an exact eigenpair of A, indicating that the refined projection method is superior to its conventional counterpart.
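The comparison can be made concrete numerically. A minimal NumPy sketch (the random symmetric matrix and random subspace are illustrative assumptions): compute a Ritz pair from the projected matrix, then the refined Ritz vector as the right singular vector of (A − μI)W with smallest singular value; the refined residual is never larger.

```python
import numpy as np

# Illustrative setup: a random symmetric A and a random k-dimensional subspace.
rng = np.random.default_rng(0)
n, k = 100, 8
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
W, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal basis of the subspace

# Conventional Rayleigh-Ritz: eigenpairs of the projected matrix W^T A W.
mus, S = np.linalg.eigh(W.T @ A @ W)
mu = mus[0]               # a Ritz value mu
x_ritz = W @ S[:, 0]      # its Ritz vector (unit norm)

# Refined Ritz vector: the unit vector in span(W) minimizing ||(A - mu*I)w||,
# i.e. W times the right singular vector of (A - mu*I)W for the smallest
# singular value.
_, sv, Vt = np.linalg.svd((A - mu * np.eye(n)) @ W, full_matrices=False)
x_ref = W @ Vt[-1]

r_ritz = np.linalg.norm(A @ x_ritz - mu * x_ritz)
r_ref = np.linalg.norm(A @ x_ref - mu * x_ref)
print(r_ref <= r_ritz)  # True: the refined residual is never larger
```

The inequality holds by construction: the Ritz vector is one of the unit vectors in span(W), and the refined Ritz vector minimizes the residual over all of them.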

  16. Ultrasonic sensor to characterize wood pulp during refining.

    Science.gov (United States)

    Greenwood, M S; Panetta, P D; Bond, L J; McCaw, M W

    2006-12-22

    A novel sensor concept has been developed for measuring the degree of refining, the water retention value (WRV), and the weight percentage of wood pulp during the refining process. The measurement time is less than 5 min, and the sensor can operate in a slip-stream of the process line or as an at-line instrument. The degree of refining and the WRV are determined from settling measurements. The settling of a pulp suspension (with a weight percentage less than 0.5 wt%) is observed after the mixer, which keeps the pulp uniformly distributed, is turned off. The attenuation of ultrasound as a function of time is recorded, and these data show a peak at a time designated the "peak time." The peak time T increases with the degree of refining, as demonstrated by measuring pulp samples with known degrees of refining. The WRV can be determined using the relative peak time, defined as the ratio T(2)/T(1), where T(1) is an initial peak time and T(2) is the value after additional refining. This method offers the industry an alternative to the current time-consuming WRV test.

  17. Effect of refining on quality and composition of sunflower oil.

    Science.gov (United States)

    Pal, U S; Patra, R K; Sahoo, N R; Bakhara, C K; Panda, M K

    2015-07-01

    An experimental oil-refining unit has been developed and tested for sunflower oil. Crude pressed sunflower oil obtained from a local oil mill was refined using a chemical method comprising degumming, neutralization, bleaching and dewaxing. The quality and composition of the crude and refined oil were analysed and compared. Reductions in phosphorus content from 6.15 ppm to 0, FFA content from 1.1 to 0.24 % (as oleic acid), peroxide value from 22.5 to 7.9 meq/kg, wax content from 1,420 to 200 ppm and colour absorbance from 0.149 to 0.079 (measured in a spectrophotometer at 460 nm) were observed from crude to refined oil. Refining did not have a significant effect on the fatty acid composition, as seen from the percentage peak areas in the GC-MS chromatogram. The proportion of unsaturated fatty acids in both oils was about 95 %, comprising 9-octadecenoic acid (oleic acid) and 11,14-eicosadienoic acid (an elongated form of linoleic acid). These results will be useful to small entrepreneurs and farmers refining sunflower oil for better marketability.

  18. Hybrid protocols plus natural treatments for inflammatory conditions.

    Science.gov (United States)

    1998-01-01

    Hybrid protocols combine one, two, or three pharmaceutical drugs with several nutritional or immune-based therapies. These protocols are not limited solely to FDA-approved drugs or strictly to alternative therapies. The rationale for using a hybrid protocol is to find an effective antiviral regimen that also restores immune function. The goal is to obtain the benefits of protease inhibitors without viral resistance and side effects which include problems with fat metabolism and cholesterol levels. Natural treatments for inflammatory conditions are also described. Options include licorice root, ginger root, and slippery elm.

  19. A Secure Elliptic Curve-Based RFID Protocol

    Institute of Scientific and Technical Information of China (English)

    Santi Martínez; Magda Valls; Concepció Roig; Josep M. Miret; Francesc Giné

    2009-01-01

    Nowadays, the use of Radio Frequency Identification (RFID) systems in industry and stores has increased. Nevertheless, some of these systems present privacy problems that may discourage potential users. Hence, high-confidence and efficient privacy protocols are urgently needed. Previous studies in the literature proposed schemes that are proven to be secure, but they have scalability problems. A feasible and scalable protocol to guarantee privacy is presented in this paper. The proposed protocol uses elliptic curve cryptography combined with a zero-knowledge-based authentication scheme. An analysis proving the system secure, and even forward secure, is also provided.
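The combination of elliptic curve cryptography with zero-knowledge authentication can be illustrated with a Schnorr-style identification round. The sketch below is not the paper's protocol, just the general idea, on the tiny textbook curve y² = x³ + 2x + 2 over F₁₇ (far too small for real security, where standardized curves would be used):

```python
import secrets

# Toy Schnorr-style zero-knowledge identification over a tiny elliptic curve.
# Curve, base point and order are the textbook example with G = (5, 1) of
# order 19; an assumption for illustration only, not the paper's protocol.
P, A, B = 17, 2, 2          # field prime and curve coefficients
G, N = (5, 1), 19           # base point and its prime order

def inv(a):
    return pow(a, P - 2, P)  # modular inverse in F_P (P is prime)

def add(Q, R):               # point addition; None is the point at infinity
    if Q is None: return R
    if R is None: return Q
    (x1, y1), (x2, y2) = Q, R
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    lam = ((3 * x1 * x1 + A) * inv(2 * y1) if Q == R
           else (y2 - y1) * inv((x2 - x1) % P)) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, Q):               # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = add(R, Q)
        Q = add(Q, Q)
        k >>= 1
    return R

d = 7                        # tag's long-term secret key
T = mul(d, G)                # tag's public key

# One identification round: commitment, challenge, response.
r = secrets.randbelow(N - 1) + 1
Rpt = mul(r, G)              # prover commits to a fresh nonce
c = secrets.randbelow(N)     # verifier's random challenge
s = (r + c * d) % N          # response; s alone reveals nothing about d

# Verifier accepts iff s*G == R + c*T, since (r + c*d)G = rG + c(dG).
print(mul(s, G) == add(Rpt, mul(c, T)))  # True for an honest prover
```

Each round reveals only a random-looking (commitment, challenge, response) triple, which is what gives the reader authentication without the tag ever disclosing its key.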

  20. Stream Control Transmission Protocol Steganography

    CERN Document Server

    Fraczek, Wojciech; Szczypiorski, Krzysztof

    2010-01-01

    Stream Control Transmission Protocol (SCTP) is a new transport-layer protocol intended to replace the TCP (Transmission Control Protocol) and UDP (User Datagram Protocol) protocols in future IP networks. Currently, it is implemented in operating systems such as BSD, Linux, HP-UX and Sun Solaris. It is also supported in the operating system of Cisco network devices (Cisco IOS) and may be used in Windows. This paper describes potential steganographic methods that may be applied to SCTP and may pose a threat to network security. The proposed methods utilize new, characteristic SCTP features such as multi-homing and multi-streaming. The identified new threats and suggested countermeasures may be used as a supplement to RFC 5062, which describes security attacks on the SCTP protocol, and can motivate further standard modifications.
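As an illustration of why multi-streaming lends itself to steganography, the toy sketch below (not from the paper, and not real SCTP) encodes hidden bits purely in the sender's choice of stream identifier for each data chunk: with S streams available, each chunk can covertly carry log₂(S) bits alongside its innocuous payload.

```python
# Toy covert channel via stream-identifier choice (hypothetical parameters;
# this simulates only the encoding, it does not speak SCTP).
S = 8
BITS_PER_CHUNK = S.bit_length() - 1   # 3 hidden bits per chunk for S = 8

def encode(message_bits: str) -> list[int]:
    """Map each group of hidden bits to the stream id used for one chunk."""
    ids = []
    for i in range(0, len(message_bits), BITS_PER_CHUNK):
        group = message_bits[i:i + BITS_PER_CHUNK].ljust(BITS_PER_CHUNK, "0")
        ids.append(int(group, 2))
    return ids

def decode(stream_ids: list[int], nbits: int) -> str:
    """Recover the hidden bits from the observed sequence of stream ids."""
    bits = "".join(format(i, f"0{BITS_PER_CHUNK}b") for i in stream_ids)
    return bits[:nbits]

msg = "1011001110"
ids = encode(msg)             # stream id chosen for each chunk
print(ids, decode(ids, len(msg)) == msg)
```

A warden who cannot distinguish a steganographic stream assignment from a legitimate one cannot detect the channel, which is why countermeasures focus on normalizing such protocol choices.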

  1. Survey protocol for invasive species

    OpenAIRE

    Menza, Charles

    2009-01-01

    This protocol was developed by the Biogeography Branch of NOAA’s Center for Coastal Monitoring and Assessment to support invasive species research by the Papahānaumokuākea Marine National Monument. The protocol’s objective is to detect Carijoa riisei and Hypnea musciformis in deepwater habitats using visual surveys by technical divers. Note: This protocol is designed to detect the presence or absence of invasive species. A distinct protocol is required to collect information on abundance ...

  2. A PROGRAMMED LABOUR - PROTOCOL

    Directory of Open Access Journals (Sweden)

    Savita

    2014-12-01

    Full Text Available A prospective randomized clinical study of the outcome of labour following a "Programmed Labour Protocol" was done at the Department of OBG, MRMC Gulbarga. The protocol was aimed at the dual objective of providing pain relief during labour and reaching the goal of safe motherhood by optimizing outcomes. AIMS AND OBJECTIVES: To shorten the duration of labour; to assess the effect of labour analgesia; to monitor the events during labour; and to lower the incidence of operative deliveries. METHODS: 100 primiparous pregnant women admitted to the labour room were randomly selected. The protocol is designed to apply to low-risk primiparous women with a singleton cephalic presentation, no evidence of CPD, and spontaneous onset of labour. RESULTS: The duration of all stages of labour was shortened, with an especially significant reduction in the duration of the active phase. CONCLUSION: Programmed labour is a simple, easy, and effective method for painless and safe delivery.

  3. Protocols for Scholarly Communication

    CERN Document Server

    Pepe, Alberto; Pepe, Alberto; Yeomans, Joanne

    2007-01-01

    CERN, the European Organization for Nuclear Research, has operated an institutional preprint repository for more than 10 years. The repository contains over 850,000 records of which more than 450,000 are full-text OA preprints, mostly in the field of particle physics, and it is integrated with the library's holdings of books, conference proceedings, journals and other grey literature. In order to encourage effective propagation and open access to scholarly material, CERN is implementing a range of innovative library services into its document repository: automatic keywording, reference extraction, collaborative management tools and bibliometric tools. Some of these services, such as user reviewing and automatic metadata extraction, could make up an interesting testbed for future publishing solutions and certainly provide an exciting environment for e-science possibilities. The future protocol for scientific communication should naturally guide authors towards OA publication and CERN wants to help reach a full...

  4. Enhancing scientific reasoning by refining students' models of multivariable causality

    Science.gov (United States)

    Keselman, Alla

    Inquiry learning as an educational method is gaining increasing support among elementary and middle school educators. In inquiry activities at the middle school level, students are typically asked to conduct investigations and infer causal relationships about multivariable causal systems. In these activities, students usually demonstrate significant strategic weaknesses and insufficient metastrategic understanding of task demands. The present work suggests that these weaknesses arise from students' deficient mental models of multivariable causality, in which the effects of individual features are neither additive nor constant. This study is an attempt to develop an intervention aimed at enhancing scientific reasoning by refining students' models of multivariable causality. Three groups of students engaged in a scientific investigation activity over seven weekly sessions. By creating unique combinations of five features potentially involved in an earthquake mechanism and observing the associated risk-meter readings, students had to find out which of the features were causal and learn to predict earthquake risk. Additionally, students in the instructional and practice groups engaged in self-directed practice in making scientific predictions. The instructional group also participated in weekly instructional sessions on making predictions based on multivariable causality. Students in the practice and instructional conditions showed small to moderate improvement in their attention to the evidence and in their metastrategic ability to recognize effective investigative strategies in the work of other students. They also demonstrated a trend towards making a greater number of valid inferences than the control-group students. Additionally, students in the instructional condition showed significant improvement in their ability to draw inferences based on multiple records. They also developed more accurate knowledge about non-causal features of the system. 
These gains were maintained
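The control-of-variables strategy the intervention targets can be sketched as follows; the five binary features and the hidden risk function are hypothetical stand-ins for the earthquake task, not the study's materials.

```python
import itertools

# Hypothetical ground truth: of five binary features, only features 0 and 3
# affect the risk reading (students must discover this from the records).
def risk(features):
    return features[0] + 2 * features[3]

records = {f: risk(f) for f in itertools.product([0, 1], repeat=5)}

# Control-of-variables strategy: a feature is causal iff changing it while
# holding every other feature fixed ever changes the outcome.
def is_causal(i):
    for f, r in records.items():
        g = list(f)
        g[i] = 1 - g[i]
        if records[tuple(g)] != r:
            return True
    return False

print([i for i in range(5) if is_causal(i)])  # -> [0, 3]
```

The strategy is exactly the controlled comparison students struggle with: every inference rests on a pair of records identical except in one feature.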

  5. Refined exposure assessment of Brown HT (E 155)

    Directory of Open Access Journals (Sweden)

    European Food Safety Authority

    2014-05-01

    Full Text Available The European Food Safety Authority (EFSA) carried out an exposure assessment of Brown HT (E 155), taking into account additional information on its use in foods as consumed. In 2010, the EFSA Panel on Food Additives and Nutrient Sources added to Food (ANS) adopted a scientific opinion on the re-evaluation of Brown HT and concluded that dietary exposure in both adults and 1-10 year old children at the high level may exceed the Acceptable Daily Intake (ADI) for Brown HT of 1.5 mg/kg body weight (bw)/day at the upper end of the range. Following this conclusion, the European Commission requested EFSA to perform a refined exposure assessment for this food colour. Data on the presence of Brown HT in foods were requested from relevant stakeholders through a call for usage and concentration data. Usage levels were provided to EFSA for six out of 37 food categories in which Brown HT is authorised. A limited number of analytical results were also reported to EFSA, all below the limit of detection (LOD) or limit of quantification (LOQ). Exposure assessment was performed using the EFSA Comprehensive Food Consumption Database. Three different scenarios were considered: (i) exposure estimates based on Maximum Permitted Levels (MPLs), (ii) a combination of MPLs and reported maximum use levels, and (iii) reported maximum use levels only. Under the first two scenarios, high exposure levels (95th percentile) exceeded the ADI for all age groups, with the exception of the elderly. In comparison to the previous assessment, for both children and adults, the current mean exposure estimates are of the same order of magnitude, while the 95th-percentile exposure is lower, particularly in adults. The mean and high-level exposure estimates of Brown HT are below the ADI for all population groups when considering the reported use levels only.
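The three scenarios amount to the same exposure sum evaluated with different concentration inputs. A sketch with entirely hypothetical numbers (the real assessment combines per-category consumption data from the EFSA Comprehensive Food Consumption Database with the authorised and reported levels):

```python
# All category names, intakes and concentrations below are invented for
# illustration; only the structure of the calculation mirrors the scenarios.
ADI = 1.5  # mg/kg bw/day, the ADI for Brown HT (E 155)

# Hypothetical daily intake of three authorised categories, g food per kg bw
intake = {"fine bakery wares": 2.0, "flavoured drinks": 10.0, "desserts": 1.5}
# Hypothetical concentrations, mg Brown HT per kg food
mpl = {"fine bakery wares": 50.0, "flavoured drinks": 20.0, "desserts": 30.0}
reported_use = {"flavoured drinks": 5.0}   # industry data for one category only

def exposure(levels):
    # mg/kg bw/day: sum over categories of (g/kg bw/day) x (mg/kg food) / 1000
    return sum(intake[c] * levels.get(c, 0.0) / 1000.0 for c in intake)

scenario_mpl = exposure(mpl)                       # (i) MPLs everywhere
scenario_mix = exposure({**mpl, **reported_use})   # (ii) MPLs refined by use data
scenario_rep = exposure(reported_use)              # (iii) reported levels only
for name, e in [("MPL", scenario_mpl), ("mixed", scenario_mix),
                ("reported", scenario_rep)]:
    print(f"{name}: {e:.3f} mg/kg bw/day ({'exceeds' if e > ADI else 'below'} ADI)")
```

Because reported use levels never exceed the MPLs, the three scenarios are ordered from most to least conservative, which matches the pattern of results in the opinion.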

  6. LOCALLY REFINED SPLINES REPRESENTATION FOR GEOSPATIAL BIG DATA

    Directory of Open Access Journals (Sweden)

    T. Dokken

    2015-08-01

    Full Text Available When viewed from a distance, large parts of the topography of landmasses and the bathymetry of the sea and ocean floor can be regarded as a smooth background with local features. Consequently, a digital elevation model combining a compact smooth representation of the background with locally added features has the potential of providing a compact and accurate representation of topography and bathymetry. The recent introduction of Locally Refined B-splines (LR B-splines) allows the granularity of spline representations to be locally adapted to the complexity of the smooth shape approximated. This allows few degrees of freedom to be used in areas with little variation, while adding extra degrees of freedom in areas in need of more modelling flexibility. In the EU FP7 Integrating Project IQmulus we exploit LR B-splines for approximating large point clouds representing the bathymetry of the smooth sea and ocean floor. A drastic reduction is demonstrated in the bulk of the data representation compared to the size of the input point clouds. The representation is very well suited for exploiting the power of GPUs for visualization, as the spline format is transferred to the GPU and the triangulation needed for visualization is generated on the GPU according to the viewing parameters. LR B-splines are interoperable with other elevation-model representations such as LIDAR data, raster representations, and triangulated irregular networks, as these can be used as input to the LR B-spline approximation algorithms. Output to these formats can be generated from the LR B-spline applications according to the resolution criteria required. The spline models are well suited for change detection, as new sensor data can efficiently be compared to the compact LR B-spline representation.
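The core idea, spending degrees of freedom only where the data demand them, can be sketched in one dimension with piecewise-linear interpolation standing in for LR B-splines (the bathymetry profile, knot counts, and tolerance are all invented):

```python
import numpy as np

# 1-D analogy of locally refined approximation: knots are inserted only where
# the residual exceeds a tolerance, so the flat "sea floor" keeps few degrees
# of freedom while a narrow seamount attracts many.
rng = np.random.default_rng(1)
xs = np.sort(rng.uniform(0.0, 1.0, 2000))
depth = -50.0 + 5.0 * np.exp(-(((xs - 0.7) / 0.02) ** 2))  # floor + local feature

knots = np.linspace(0.0, 1.0, 5)
tol = 0.05
for _ in range(12):
    fit = np.interp(xs, knots, np.interp(knots, xs, depth))
    err = np.abs(fit - depth)
    new = [(a + b) / 2 for a, b in zip(knots[:-1], knots[1:])
           if ((xs >= a) & (xs <= b)).any()
           and err[(xs >= a) & (xs <= b)].max() > tol]
    if not new:                      # everywhere within tolerance: stop
        break
    knots = np.sort(np.concatenate([knots, new]))

print(f"{len(knots)} knots approximate {len(xs)} points")
```

The knot density ends up concentrated around the feature at x = 0.7, while the smooth background is represented by the original handful of knots, which is the compression effect the abstract describes for LR B-spline surfaces.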

  7. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.;

    2005-01-01

    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques ...... suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  8. Status Quo and Future of China's Refining Sector

    Institute of Scientific and Technical Information of China (English)

    Zhu He

    2012-01-01

    Ever since the beginning of the 21st century and China's entry into the WTO, China's refining sector has made great strides, supported by rapid economic growth and strong demand for petroleum, despite facing fiercer international competition, higher oil prices, policy-related losses (losses caused by refined-product prices being set lower than crude prices), the international financial crisis, unstable energy geopolitics, etc.

  9. A Performance Evaluation of Improved IPv6 Routing Protocol for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Vu Chien Thang

    2016-12-01

    Full Text Available In the near future, IP-based wireless sensor networks will play a key role in several application scenarios such as smart grid, smart home, healthcare, and building automation. An IPv6 routing protocol is expected to provide internet connectivity to any IP-based sensor node. In this paper, we propose the IRPL protocol for IP-based wireless sensor networks. IRPL selects the optimal path using a combination of two routing metrics: the link quality and the remaining energy state of the preferred parent. In IRPL, the two metrics are combined using an alpha weight. IRPL is implemented in ContikiOS and evaluated using simulation and testbed experiments. The results show that IRPL achieves better network lifetime, data delivery ratio, and energy balance than the traditional RPL protocol.
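A minimal sketch of the alpha-weighted parent selection described above; the [0, 1] normalization of both metrics and the alpha value are assumptions made for illustration, not values taken from the paper.

```python
# Hypothetical candidate parents; both metrics assumed normalized to [0, 1],
# higher is better. The weight alpha trades link quality against energy.
def rank(parent, alpha=0.6):
    return alpha * parent["link_quality"] + (1 - alpha) * parent["energy"]

candidates = [
    {"id": "A", "link_quality": 0.9, "energy": 0.2},  # strong link, nearly drained
    {"id": "B", "link_quality": 0.7, "energy": 0.9},  # decent link, fresh battery
    {"id": "C", "link_quality": 0.4, "energy": 1.0},
]

best = max(candidates, key=rank)
print(best["id"])  # -> B (0.78 beats A's 0.62 and C's 0.64)
```

Including remaining energy in the rank steers traffic away from nearly drained nodes, which is how such a combined metric improves network lifetime and energy balance relative to a link-quality-only objective.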

  10. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    Science.gov (United States)

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons, with chemical analysis following an acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of 19 organochlorine pesticides by gas chromatography; only three had detectable pesticide concentrations. A separate sample of A-horizon soil was collected for microbial characterization by phospholipid fatty acid analysis (PLFA), soil enzyme assays, and determination of selected human and agricultural pathogens. Collection, preservation and analysis of samples for both organic compounds and microbial characterization add considerable complication to the sampling and preservation protocols and significant cost to a continental-scale survey. Both issues must be considered carefully before adopting these parameters as part of the soil geochemical survey of North America.

  11. Microcontroller Protocol for Secure Broadcast in Controller Area Networks

    Directory of Open Access Journals (Sweden)

    B.Vijayalakshmi

    2014-04-01

    Full Text Available The Controller Area Network (CAN) is a bus commonly used by controllers inside vehicles and in various industrial control applications. In the past, controllers were assumed to operate within secure perimeters, but today these environments are well connected to the outside world, and recent incidents have shown them to be extremely vulnerable to cyber-attacks. To withstand such threats, security can be implemented in the application layer of CAN. Here we design, refine and implement a broadcast authentication protocol based on the well-known paradigm of key chains and time synchronization, a mechanism commonly used in wireless sensor networks, which lets us take advantage of symmetric primitives without needing secret shared keys during broadcast. Because process control is a time-critical operation, we make several refinements to improve the authentication delay, studying trade-offs in computational speed, memory and bandwidth up to the point of using reduced versions of hash functions that can assure ad hoc security.
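    A minimal sketch of the key-chain paradigm the protocol builds on (as in TESLA-style broadcast authentication): the sender commits to the end of a hash chain, MACs each message under a not-yet-disclosed chain key, and discloses keys later, so receivers can verify a key by hashing it back to the commitment. The CAN-specific refinements, time-synchronization machinery and disclosure scheduling discussed in the abstract are omitted, and all names here are illustrative.

```python
import hashlib
import hmac

def h(x):
    """One-way function used to build the key chain."""
    return hashlib.sha256(x).digest()

def make_chain(seed, n):
    """Build a key chain K_n -> K_0 with K_i = H(K_{i+1}).
    chain[i] == K_i; K_0 is the publicly committed anchor."""
    chain = [seed]
    for _ in range(n):
        chain.append(h(chain[-1]))
    chain.reverse()
    return chain

def authenticate(msg, key):
    """MAC a broadcast frame under the current (still secret) chain key."""
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify_key(disclosed, i, commitment):
    """Receiver check: hashing the disclosed key i times must reach K_0."""
    x = disclosed
    for _ in range(i):
        x = h(x)
    return hmac.compare_digest(x, commitment)

chain = make_chain(b"secret-seed", 4)
commitment = chain[0]                        # K_0, distributed authentically once
tag = authenticate(b"ctrl frame", chain[2])  # MAC under K_2 during interval 2
# Later, K_2 is disclosed; any receiver validates it against the commitment:
print(verify_key(chain[2], 2, commitment))   # True
```

    Once the disclosed key is validated, the receiver recomputes the MAC and accepts the frame; security rests on keys being disclosed only after their use interval, which is why the time synchronization the abstract mentions is essential.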

  12. Severe hypomagnesaemia with tetany following ESHAP protocol

    Directory of Open Access Journals (Sweden)

    Majumdar Gautam

    2002-01-01

    Full Text Available Abstract Background One patient with B-cell non-Hodgkin's lymphoma developed severe hypomagnesaemia and tetany 15 days after the first course of treatment with the ESHAP protocol. This prompted a careful look at the incidence and severity of hypomagnesaemia during treatment with this combination chemotherapy. Method This patient and two further patients receiving the same treatment were monitored for hypomagnesaemia throughout their treatment period. Result All three patients developed significant hypomagnesaemia requiring intravenous magnesium infusion in the second and third weeks after treatment, though not after every course of chemotherapy. Conclusions The ESHAP protocol is often associated with significant hypomagnesaemia two to three weeks after treatment. Therefore, serum magnesium levels should be monitored throughout the treatment period.

  13. CRYPTOGRAPHIC PROTOCOL DEPENDING ON BIOMETRIC AUTHENTICATION

    Directory of Open Access Journals (Sweden)

    SANJUKTA PAL

    2013-02-01

    Full Text Available In the modern age, security is the most challenging issue for protecting data used by computers. This cryptographic protocol based on biometric authentication combines cryptography and biometric authentication. The standard idea of cryptography applies: using a key, plain text is converted into cipher text (encryption) and cipher text back into plain text (decryption). Here, fingerprint geometry, one of the most promising biometric modalities, is used as the key for encryption and decryption. The protocol is essentially an algorithm for matching the key, i.e., matching stored fingerprint images (DB Images) against a subsequently presented fingerprint image (Final Image). Matching is performed on binary conversions of the images. The algorithm is suitable for any type of data (text data, multimedia data, etc.).
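    The binary-conversion matching step can be illustrated as below. The pixel-agreement score and the match threshold are illustrative assumptions, since the abstract does not give the exact matching criterion (production fingerprint systems typically match minutiae rather than raw pixels).

```python
def binarize(gray, threshold=128):
    """Convert a grayscale image (list of rows of 0-255 values) to 0/1 pixels."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def similarity(img_a, img_b):
    """Fraction of pixels that agree between two equally sized binary images."""
    total = matches = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            matches += (pa == pb)
    return matches / total

def is_match(db_image, final_image, threshold=0.95):
    """Accept the probe if enough binary pixels agree with the stored template."""
    return similarity(db_image, final_image) >= threshold

stored = binarize([[200, 30], [40, 220]])   # enrolled "DB Image"
probe  = binarize([[190, 25], [35, 210]])   # presented "Final Image"
print(is_match(stored, probe))              # True: identical after binarization
```

    Binarization makes the comparison robust to small illumination differences, which is presumably why the protocol matches binary conversions rather than raw grayscale images.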

  14. Treatment protocols for cervical cancer

    Directory of Open Access Journals (Sweden)

    Vujkov Tamara

    2002-01-01

    Full Text Available Introduction Cervical cancer is the second most common cancer in women worldwide and the second leading cause of cancer death among women. About 95% (90% in developed countries) of invasive carcinomas are of squamous type, and 5% (10% in developed countries) are adenocarcinomas. The FIGO classification of cervical carcinomas, based on clinical staging and prognostic factors, dictates therapeutic procedures and helps in designing treatment protocols. Therapeutic modalities Surgical therapy includes conization, radical hysterectomy with pelvic lymphadenectomy, and palliative operations (urinary diversion and colostomy). Radiotherapy, brachytherapy and teletherapy have most recently been combined with chemotherapy as concurrent chemoradiation. Discussion and conclusion No change in therapeutic modalities will ever decrease the mortality rate of cervical carcinoma as much as education, prevention and early screening. The 5-year survival for locally advanced disease has not improved during the last 40 years, a result of failure to deliver therapy to the paraaortic region. Paraaortic lymph nodes should be evaluated before therapy planning by imaging procedures or, more exactly, by surgical staging: laparoscopy or laparotomy. Radical operations for cervical carcinoma should be performed by experienced surgeons, educated in this type of operation, with a sufficient number of cases.

  15. Modified calibration protocol evaluated in a model-based testing of SBR flexibility

    DEFF Research Database (Denmark)

    Corominas, Lluís; Sin, Gürkan; Puig, Sebastià;

    2011-01-01

    The purpose of this paper is to refine the BIOMATH calibration protocol for SBR systems, in particular to develop a pragmatic calibration protocol that takes advantage of SBR information-rich data, defines a simulation strategy to obtain proper initial conditions for model calibration and provides statistical evaluation of the calibration outcome. The updated calibration protocol is then evaluated on a case study to obtain a thoroughly validated model for testing the flexibility of an N-removing SBR to adapt the operating conditions to the changing influent wastewater load. The performance of reference operation using fixed phase length and dissolved oxygen set points and two real-time control strategies is compared to find optimal operation under dynamic conditions. The results show that a validated model of high quality is obtained using the updated protocol and that the optimization of the system's performance can be achieved by implementing the proposed control strategies.

  16. Modified calibration protocol evaluated in a model-based testing of SBR flexibility.

    Science.gov (United States)

    Corominas, Lluís; Sin, Gürkan; Puig, Sebastià; Balaguer, Maria Dolors; Vanrolleghem, Peter A; Colprim, Jesús

    2011-02-01

    The purpose of this paper is to refine the BIOMATH calibration protocol for SBR systems, in particular to develop a pragmatic calibration protocol that takes advantage of SBR information-rich data, defines a simulation strategy to obtain proper initial conditions for model calibration and provides statistical evaluation of the calibration outcome. The updated calibration protocol is then evaluated on a case study to obtain a thoroughly validated model for testing the flexibility of an N-removing SBR to adapt the operating conditions to the changing influent wastewater load. The performance of reference operation using fixed phase length and dissolved oxygen set points and two real-time control strategies is compared to find optimal operation under dynamic conditions. The results show that a validated model of high quality is obtained using the updated protocol and that the optimization of the system's performance can be achieved in different manners by implementing the proposed control strategies.

  17. A high order special relativistic hydrodynamic code with space-time adaptive mesh refinement

    CERN Document Server

    Zanotti, Olindo

    2013-01-01

    We present a high order one-step ADER-WENO finite volume scheme with space-time adaptive mesh refinement (AMR) for the solution of the special relativistic hydrodynamics equations. By adopting a local discontinuous Galerkin predictor method, a high order one-step time discretization is obtained, with no need for Runge-Kutta sub-steps. This turns out to be particularly advantageous in combination with space-time adaptive mesh refinement, which has been implemented following a "cell-by-cell" approach. As in existing second order AMR methods, the present higher order AMR algorithm also features time-accurate local time stepping (LTS), where grids on different spatial refinement levels are allowed to use different time steps. We also compare two different Riemann solvers for the computation of the numerical fluxes at the cell interfaces. The new scheme has been validated on a sample of numerical test problems in one, two and three spatial dimensions, exploring its ability to resolve the propagation of relativ...

  18. Refined BPS invariants, Chern-Simons theory, and the quantum dilogarithm

    Science.gov (United States)

    Dimofte, Tudor Dan

    In this thesis, we consider two main subjects: the refined BPS invariants of Calabi-Yau threefolds, and three-dimensional Chern-Simons theory with complex gauge group. We study the wall-crossing behavior of refined BPS invariants using a variety of techniques, including a four-dimensional supergravity analysis, statistical-mechanical melting crystal models, and relations to new mathematical invariants. We conjecture an equivalence between refined invariants and the motivic Donaldson-Thomas invariants of Kontsevich and Soibelman. We then consider perturbative Chern-Simons theory with complex gauge group, combining traditional and novel approaches to the theory (including a new state integral model) to obtain exact results for perturbative partition functions. We thus obtain a new class of topological invariants, which are not of finite type, defined in the background of genuinely nonabelian flat connections. The two main topics, BPS invariants and Chern-Simons theory, are connected at both a formal and (we believe) deeper conceptual level by the striking central role that the quantum dilogarithm function plays in each.

  19. Experimental and numerical investigation of the refinement of Hf by EBM

    Science.gov (United States)

    Donchev, Veliko; Vutova, Katia; Vassileva, Vania

    2014-05-01

    Electron beam melting (EBM) in vacuum is one of the most promising technologies for refining and recycling of metals that react with oxygen when heated. Hf is such a metal. Pure Hf (with a small content of gas and metal impurities) is needed for a variety of applications in the aerospace industry and metallurgy, in the production of components of nuclear reactors, microprocessors, optical components etc. We conducted experiments with the ELIT-60 equipment on Hf ingots at electron beam powers of 12, 15, 17 kW and obtained data about the concentration of impurities by the ICP-MS method. For further understanding and optimizing the Hf refining processes, a non-stationary heat model was applied for numerical simulation of the heat transfer processes. Simulation data about the liquid pool variation during the e-beam treatment was thus obtained. The flatness of the crystallization front shape, which is connected to the structure quality, was investigated by optimization criteria related to the curvature of the liquid/solid boundary curve. We also describe an algorithm for calculation of the criteria. One of the criteria was applied to EBM of Hf for different electron beam powers; the results obtained were confirmed by the experimental data. Combining experimental, theoretical and simulation results, a proper technological regime is proposed for better Hf refining.

  20. Using high-order methods on adaptively refined block-structured meshes - discretizations, interpolations, and filters.

    Energy Technology Data Exchange (ETDEWEB)

    Ray, Jaideep; Lefantzi, Sophia; Najm, Habib N.; Kennedy, Christopher A.

    2006-01-01

    Block-structured adaptively refined meshes (SAMR) strive for efficient resolution of partial differential equations (PDEs) solved on large computational domains by clustering mesh points only where required by large gradients. Previous work has indicated that fourth-order convergence can be achieved on such meshes by using a suitable combination of high-order discretizations, interpolations, and filters and can deliver significant computational savings over conventional second-order methods at engineering error tolerances. In this paper, we explore the interactions between the errors introduced by discretizations, interpolations and filters. We develop general expressions for high-order discretizations, interpolations, and filters, in multiple dimensions, using a Fourier approach, facilitating the high-order SAMR implementation. We derive a formulation for the necessary interpolation order for given discretization and derivative orders. We also illustrate this order relationship empirically using one and two-dimensional model problems on refined meshes. We study the observed increase in accuracy with increasing interpolation order. We also examine the empirically observed order of convergence, as the effective resolution of the mesh is increased by successively adding levels of refinement, with different orders of discretization, interpolation, or filtering.

  1. Refining stent technologies for femoral interventions.

    Science.gov (United States)

    Bosiers, M; Deloose, K; Callaert, J; Maene, L; Keirse, K; Verbist, J; Peeters, P

    2012-08-01

    Stents were created as a mechanical scaffold to prevent vessel recoil and luminal renarrowing after percutaneous transluminal angioplasty (PTA). In femoropopliteal arteries, the indication for stent implantation remains a much-debated topic, especially in long lesion configurations. Ever since the first stents were introduced on the market, in-stent restenosis (ISR) has been an important issue. Stent design has progressed greatly in recent decades, from a first generation plagued by high fracture rates and low primary patency rates to newer designs that tackle these shortcomings. More flexible and longer stents decreased the high fracture rates, and drug-eluting stents offered a solution to restenosis through local drug application. The difficult recrossability of a lesion containing a permanent vascular scaffold is an obstacle that the drug-coated balloon (DCB) overcomes. Future perspectives in the treatment of femoropopliteal lesions lie in bioresorbable stent implantation, which combines the advantages of a drug-eluting scaffolding stent without leaving a foreign object behind in the long term. Further investigation in this area should eventually yield a superior endovascular treatment modality with high long-term patency rates and minimal drawbacks.

  2. TUMORS INVADING PARAPHARYNGEAL SPACE: REFINED IMAGING DIAGNOSIS

    Institute of Scientific and Technical Information of China (English)

    Zhuang Qixin; Cheng Yingsheng; Yang Shixun; Shang kezhong; Yan Xinhua

    1998-01-01

    Objective: To investigate imaging findings of tumors invading the parapharyngeal space. Methods: Magnetic resonance imaging (MRI), computerized tomography (CT), magnetic resonance angiography (MRA) and digital subtraction angiography (DSA) findings of 19 patients with tumors infiltrating the parapharyngeal space, confirmed by surgery and pathology, were analysed, including four branchial cleft cysts, three jugular glomus tumors, four carotid body tumors, three neurilemomas and five carcinomas of the nasopharynx involving the parapharyngeal space. Fifteen patients underwent MRI scanning, nine CT scanning, three MRA and five DSA. Results: MRI provided clinically useful information about the size, shape, extent and site of the parapharyngeal space tumors, and also their effects on adjacent structures. The main MRI feature of paraganglioma was many low-signal tortuous and creeping vessels within the tumor. The main CT feature of jugular glomus tumor was jugular foramen enlargement with bone destruction. Tumor vessels were clearly displayed by MRA and DSA. Conclusion: MRI was superior to CT in the diagnosis of tumors invading the parapharyngeal space. The location and nature of the lesions could be diagnosed accurately by MRI combined with CT or DSA.

  3. Refining the conditions for sympatric ecological speciation.

    Science.gov (United States)

    Débarre, F

    2012-12-01

    Can speciation occur in a single population when different types of resources are available, in the absence of any geographical isolation, or any spatial or temporal variation in selection? The controversial topics of sympatric speciation and ecological speciation have already stimulated many theoretical studies, most of them agreeing on the fact that mechanisms generating disruptive selection, some level of assortment, and enough heterogeneity in the available resources, are critical for sympatric speciation to occur. Few studies, however, have combined the three factors and investigated their interactions. In this article, I analytically derive conditions for sympatric speciation in a general model where the distribution of resources can be uni- or bimodal, and where a parameter controls the range of resources that an individual can exploit. This approach bridges the gap between models of a unimodal continuum of resources and Levene-type models with discrete resources. I then test these conditions against simulation results from a recently published article (Thibert-Plante & Hendry, 2011, J. Evol. Biol. 24: 2186-2196) and confirm that sympatric ecological speciation is favoured when (i) selection is disruptive (i.e. individuals with an intermediate trait are at a local fitness minimum), (ii) resources are differentiated enough and (iii) mating is assortative. I also discuss the role of mating preference functions and the need (or lack thereof) for bimodality in resource distributions for diversification.

  4. "Best fit" framework synthesis: refining the method.

    Science.gov (United States)

    Carroll, Christopher; Booth, Andrew; Leaviss, Joanna; Rick, Jo

    2013-03-13

    Following publication of the first worked example of the "best fit" method of evidence synthesis for the systematic review of qualitative evidence in this journal, the originators of the method identified a need to specify more fully some aspects of this particular derivative of framework synthesis. We therefore present a second such worked example in which all techniques are defined and explained, and their appropriateness is assessed. Specified features of the method include the development of new techniques to identify theories in a systematic manner; the creation of an a priori framework for the synthesis; and the "testing" of the synthesis. An innovative combination of existing methods of quality assessment, analysis and synthesis is used to complete the process. This second worked example was a qualitative evidence synthesis of employees' views of workplace smoking cessation interventions, in which the "best fit" method was found to be practical and fit for purpose. The method is suited to producing context-specific conceptual models for describing or explaining the decision-making and health behaviours of patients and other groups. It offers a pragmatic means of conducting rapid qualitative evidence synthesis and generating programme theories relating to intervention effectiveness, which might be of relevance both to researchers and policy-makers.

  5. Refining signal decomposition for GRETINA detectors

    Science.gov (United States)

    Prasher, V. S.; Campbell, C. M.; Cromaz, M.; Crawford, H. L.; Wiens, A.; Lee, I. Y.; Macchiavelli, A. O.; Lister; Merchan, E.; Chowdhury, P.; Radford, D. C.

    2013-04-01

    The reconstruction of the original direction and energy of gamma rays through locating their interaction points in solid state detectors is a crucial evolving technology for nuclear physics, space science and homeland security. New arrays AGATA and GRETINA have been built for nuclear science based on highly segmented germanium crystals. The signal decomposition process fits the observed waveform from each crystal segment with a linear combination of pre-calculated basis signals. This process occurs on an event-by-event basis in real time to extract the position and energy of γ-ray interactions. The methodology for generating a basis of pulse shapes, varying according to the position of the charge generating interactions, is in place. Improvements in signal decomposition can be realized by better modeling the crystals. Specifically, a better understanding of the true impurity distributions, internal electric fields, and charge mobilities will lead to more reliable bases, more precise definition of the interaction points, and hence more reliable tracking. In this presentation we will cover the current state-of-the-art for basis generation and then discuss the sensitivity of the predicted pulse shapes when varying some key parameters.
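    The core of signal decomposition, fitting an observed segment waveform with a linear combination of pre-calculated basis signals, can be sketched as a least-squares problem. This two-basis toy solves the normal equations in closed form and is illustrative only; GRETINA's real-time decomposition fits many position-dependent basis signals per event.

```python
def fit_two_basis(observed, b1, b2):
    """Least-squares amplitudes (a1, a2) such that observed ~ a1*b1 + a2*b2,
    obtained by solving the 2x2 normal equations directly."""
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    g11, g12, g22 = dot(b1, b1), dot(b1, b2), dot(b2, b2)   # Gram matrix
    r1, r2 = dot(b1, observed), dot(b2, observed)           # right-hand side
    det = g11 * g22 - g12 * g12
    a1 = (g22 * r1 - g12 * r2) / det
    a2 = (g11 * r2 - g12 * r1) / det
    return a1, a2

# Two toy basis pulses, standing in for interactions at two detector positions:
b1 = [0.0, 1.0, 0.5, 0.0]
b2 = [0.0, 0.0, 1.0, 0.5]
# Observed waveform: 2 units of the first pulse plus 3 units of the second.
observed = [2 * x + 3 * y for x, y in zip(b1, b2)]
a1, a2 = fit_two_basis(observed, b1, b2)
print(round(a1, 6), round(a2, 6))  # 2.0 3.0
```

    The fitted amplitudes recover the energy deposited at each candidate position; the quality of the basis (impurity distributions, fields, mobilities) determines how well these amplitudes localize the true interaction points, which is exactly the sensitivity the abstract discusses.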

  6. Transdermal Delivery of Nisoldipine: Refinement of Vehicles.

    Science.gov (United States)

    El Maghraby, Gamal M; Ahmed, Amal A; Osman, Mohamed A

    2015-01-01

    Nisoldipine is used for the treatment of hypertension and angina pectoris. However, it has very low bioavailability, which is attributed to extensive pre-systemic metabolism. In addition, nisoldipine is highly potent (used at a low dose). Taking into consideration the fact that transdermal delivery avoids pre-systemic metabolism and is only suitable for potent drugs, nisoldipine can be considered an excellent candidate for transdermal delivery. Accordingly, the purpose of this study was to optimize nisoldipine transdermal delivery. That was achieved initially by investigating the effect of vehicles on skin penetration. The tested vehicles were ranked with respect to transdermal flux of nisoldipine as isopropyl myristate > oleic acid > propylene glycol > water > polyethylene glycol 400. A combination of oleic acid with propylene glycol was synergistic, with a ratio of 1:2 w/w being the best. These results were taken further to develop microemulsion systems using either oleic acid or isopropyl myristate as the oil phase. Both cases employed polyoxyethylene sorbitan monooleate as a surfactant, with propylene glycol utilized as a cosurfactant in the case of oleic acid and ethanol in the case of isopropyl myristate. The developed microemulsions produced significant enhancement in nisoldipine transdermal delivery, with the flux being even greater than that obtained from the corresponding pure vehicles. This achievement was recorded for optimum microemulsion formulations which contained a cosurfactant. The study provided stepwise optimization of a vehicle for transdermal delivery of nisoldipine.

  7. Mathematical Modeling of the Vacuum Circulation Refining Processof Molten Steel

    Institute of Scientific and Technical Information of China (English)

    魏季和

    2003-01-01

    The available studies in the literature on mathematical modeling of the vacuum circulation (RH) refining process of molten steel are briefly reviewed, and the latest advances obtained by the author and his research group are summarized. On the basis of the mass and momentum balances in the system, a new mathematical model for decarburization and degassing during the RH and RH-KTB refining processes of molten steel was proposed and developed. The refining roles of the three reaction sites, i.e. the up-snorkel zone, the droplet group and the steel bath in the vacuum vessel, were considered in the model. It was assumed that mass transfer of reactive components in the molten steel is the rate-controlling step of the refining reactions, and the friction losses and drags of flows in the snorkels and vacuum vessel were all counted. The model was applied to the refining of molten steel in a multifunction RH degasser of 90 t capacity; the decarburization and degassing processes in the degasser under RH and RH-KTB operating conditions were modeled and analyzed. In addition, proceeding from two-resistance mass transfer theory and the mass balance of sulphur in the system, a kinetic model for desulphurization by powder injection and blowing in RH refining of molten steel was developed. Modeling and predictions of the process of injecting and blowing lime-based powder flux under assumed operating modes, with different initial contents of sulphur and amounts of powder injected and blown, were carried out for an RH degasser of 300 t capacity. For the RH and RH-KTB refining processes, and for desulphurization by powder injection and blowing in RH refining, the results predicted by the models were in good agreement with data from industrial experiments and practice. These models may be expected to offer useful information and a reliable basis for determining and optimizing

  8. Massive transfusion protocols: current best practice

    Directory of Open Access Journals (Sweden)

    Hsu YM

    2016-03-01

    Full Text Available Yen-Michael S Hsu,1 Thorsten Haas,2 Melissa M Cushing1 1Department of Pathology and Laboratory Medicine, Weill Cornell Medical College, New York, NY, USA; 2Department of Anesthesia, University Children's Hospital Zurich, Zurich, Switzerland Abstract: Massive transfusion protocols (MTPs are established to provide rapid blood replacement in a setting of severe hemorrhage. Early optimal blood transfusion is essential to sustain organ perfusion and oxygenation. There are many variables to consider when establishing an MTP, and studies have prospectively evaluated different scenarios and patient populations to establish the best practices to attain improved patient outcomes. The establishment and utilization of an optimal MTP is challenging given the ever-changing patient status during resuscitation efforts. Much of the MTP literature comes from the trauma population, due to the fact that massive hemorrhage is the leading cause of preventable trauma-related death. As we come to further understand the positive and negative clinical impacts of transfusion-related factors, massive transfusion practice can be further refined. This article will first discuss specific MTPs targeting different patient populations and current relevant international guidelines. Then, we will examine a wide selection of therapeutic products to support MTPs, including newly available products and the most suitable of the traditional products. Lastly, we will discuss the best design for an MTP, including ratio-based MTPs and MTPs based on the use of point-of-care coagulation diagnostic tools. Keywords: hemorrhage, MTP, antifibrinolytics, coagulopathy, trauma, ratio, logistics, guidelines, hemostatic

  9. Chapter 23: Combined Heat and Power

    Energy Technology Data Exchange (ETDEWEB)

    Simons, George [Itron, Davis, CA (United States); Barsun, Stephan [Itron, Davis, CA (United States)

    2016-11-01

    This protocol defines a combined heat and power (CHP) measure as a system that sequentially generates both electrical energy and useful thermal energy from one fuel source at a host customer's facility or residence. This protocol is aimed primarily at regulators and administrators of ratepayer-funded CHP programs; however, project developers may find the protocol useful to understand how CHP projects are evaluated.
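    The defining property of a CHP measure, sequential generation of electricity and useful heat from one fuel input, is usually quantified by electrical and total efficiency. The sketch below uses standard first-law accounting and the 3.412 MMBtu/MWh conversion; it is an illustration of the metric definitions, not the protocol's specific evaluation formulas.

```python
def chp_efficiencies(fuel_in_mmbtu, electric_out_mwh, useful_thermal_mmbtu):
    """First-law accounting for a CHP system: electrical efficiency and
    total (electric + useful thermal) efficiency relative to fuel input.
    Uses the standard conversion 1 MWh = 3.412 MMBtu."""
    electric_mmbtu = electric_out_mwh * 3.412
    electrical_eff = electric_mmbtu / fuel_in_mmbtu
    total_eff = (electric_mmbtu + useful_thermal_mmbtu) / fuel_in_mmbtu
    return electrical_eff, total_eff

e, t = chp_efficiencies(fuel_in_mmbtu=100.0,
                        electric_out_mwh=10.0,
                        useful_thermal_mmbtu=40.0)
print(f"electric {e:.0%}, total {t:.0%}")  # electric 34%, total 74%
```

    The gap between the two numbers is the recovered heat: counting only the electrical output, this hypothetical unit looks like a mediocre generator, while the total efficiency shows why sequential use of one fuel source is the point of a CHP measure.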

  10. A Composed Protocol of Quantum Identity Authentication Plus Quantum Key Distribution Based on Squeezed States

    Institute of Scientific and Technical Information of China (English)

    张盛; 王剑; 唐朝京; 张权

    2011-01-01

    It is established that a single quantum cryptography protocol usually cooperates with other cryptographic systems, such as an authentication system, in the real world. However, few protocols have been proposed on how to combine two or more quantum protocols. To fill this gap, we propose a composed quantum protocol, containing both quantum identity authentication and quantum key distribution, using squeezed states. Hence, not only can the identity be verified, but a new private key can also be generated by our new protocol. We also analyze the security under an optimal attack, and the efficiency, which is defined by the threshold of the tolerable error rate, using the Gaussian error function.

  11. Microstructure and tensile properties of magnesium alloy modified by Si/Ca based refiner

    Institute of Scientific and Technical Information of China (English)

    DUAN Zhi-chao; SUN Yang-shan; WEI Yu; DU Wen-wen; XUE Feng; ZHU Tian-bai

    2005-01-01

    The microstructure and mechanical properties of pure magnesium and AZ31 alloy with Ca/Si-based refiner additions were investigated. The results indicate that adding Ca/Si-based refiners to pure magnesium and AZ31 alloy results in remarkable microstructure refinement. With the proper amount of refiner addition, the grain size in as-cast ingots can be an order of magnitude lower than without refiner addition. A small amount of refiner added to AZ31 alloy increases both ultimate strength and yield strength significantly, while the ductility of the alloy with refiner addition is similar to that without. Refiner addition also improves the deformability of AZ31 alloy, and extruded or hot-rolled specimens (rods or sheets) with refiner addition exhibit higher surface quality and better mechanical properties than those without.

  12. REFINING BEHAVIOR OF BAMBOO LONG FIBER FRACTION PULP

    Institute of Scientific and Technical Information of China (English)

    Nishi K.Bhardwaj; S.V.Subrahmanyam; A.G.Kulkarni

    2004-01-01

    Refining of pulp contributes significantly to the papermaking process by affecting the runnability of the stock and the quality of the end product. The greater the production rate of a paper machine, the greater the effect of machine downtime through breaks. A correct approach to refining treatment is essential for energy savings and a product with the desired properties. Papermakers are being encouraged to use non-wood fibres for a variety of reasons: the environment, changing agricultural policy and long-term fibre supply. Compared with wood fibres, there is less knowledge of how to treat non-wood fibres for optimum performance. Laboratory studies aimed at understanding the behaviour of unbleached bamboo long fibre fraction pulp during refining in an Escher Wyss Laborefiner under simulated process conditions included investigations of important parameters such as wet web tensile, wet web elongation, water retention value and paper strength properties. The paper describes the influence of two specific edge loads, 1000 and 1500 Ws/km, on refining of the pulp. Bamboo long fibre fraction pulp responds better to lower-intensity refining for developing pulp strength.

  13. Optimal modifying and refining processes for A356 aluminium alloys

    Institute of Scientific and Technical Information of China (English)

    ZUO Xiu-rong; ZHONG Zhi-guo

    2006-01-01

    The influence of Ti, B, RE (cerium-rich rare earth mixtures) and Sr, applied in modification and refining processes, on A356 aluminium alloys was studied with the aid of optical microscopy (OM) and other techniques. The results show that the grain-refining efficiency of Grain-Refining Aluminium Ingot (GRAI) and Al-Ti master alloys is similar, but Ti addition from GRAI costs less than addition via Al-Ti master alloys, so producing A356 aluminium alloys with GRAI is economically feasible. Grain refining with GRAI at 0.01% Ti plus an additional 0.03% B offers significant advantages over GRAI with 0.1% Ti alone in A356 aluminium alloys: it transforms the dendritic α(Al) grain structure into an equiaxed α(Al) grain structure, together with a partially modified eutectic silicon structure. RE addition to A356 aluminium alloys with an Sr modifier achieves small dendrite arm spacing (DAS), reduced porosity, well-refined fibrous silicon, and small, dispersed eutectic colonies.

  15. Refined food addiction: a classic substance use disorder.

    Science.gov (United States)

    Ifland, J R; Preuss, H G; Marcus, M T; Rourke, K M; Taylor, W C; Burau, K; Jacobs, W S; Kadish, W; Manso, G

    2009-05-01

    Overeating in industrial societies is a significant problem, linked to an increasing incidence of overweight and obesity, and the resultant adverse health consequences. We advance the hypothesis that a possible explanation for overeating is that processed foods with high concentrations of sugar and other refined sweeteners, refined carbohydrates, fat, salt, and caffeine are addictive substances. Therefore, many people lose control over their ability to regulate their consumption of such foods. The loss of control over these foods could account for the global epidemic of obesity and other metabolic disorders. We assert that overeating can be described as an addiction to refined foods that conforms to the DSM-IV criteria for substance use disorders. To examine the hypothesis, we relied on experience with self-identified refined foods addicts, as well as critical reading of the literature on obesity, eating behavior, and drug addiction. Reports by self-identified food addicts illustrate behaviors that conform to the 7 DSM-IV criteria for substance use disorders. The literature also supports use of the DSM-IV criteria to describe overeating as a substance use disorder. The observational and empirical data strengthen the hypothesis that certain refined food consumption behaviors meet the criteria for substance use disorders, not unlike tobacco and alcohol. This hypothesis could lead to a new diagnostic category, as well as therapeutic approaches to changing overeating behaviors.

  16. Efficient Modularity Optimization: Multi-Step Greedy Algorithm and Vertex Mover Refinement

    CERN Document Server

    Schuetz, Philipp

    2007-01-01

    Identifying strongly connected substructures in large networks provides insight in their coarse-grained organization. Several approaches based on the optimization of a quality function, e.g. the modularity, have been proposed. We have developed a multi-step extension of the greedy algorithm (MSG) for modularity optimization and combined it with a simplistic refinement procedure called "Vertex Mover" (VM) which reassigns vertices to neighboring communities to improve the final modularity value. With an appropriate choice of the step-width in the MSG, the combined MSG&VM algorithm is able to find solutions of higher modularity than those reported previously. The multi-step extension does not alter the running time expectation of the greedy algorithm. It has been reported earlier that the greedy algorithm is the most efficient modularity optimization procedure for a generic (sparse) network (Danon et al., J. Stat. Mech. P09008 (2005)). Therefore, we conclude that the combined MSG&VM algorithm yields ne...
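    The "Vertex Mover" refinement described in this abstract can be sketched as a greedy local pass: each vertex is tentatively reassigned to its neighbours' communities, and the move yielding the highest modularity is kept; sweeps repeat until no move improves Q. A minimal Python sketch follows (the function names and the brute-force recomputation of Q are illustrative assumptions, not the paper's implementation, which would use incremental modularity gains):

    ```python
    from collections import defaultdict

    def modularity(adj, comm):
        """Newman modularity Q. adj: {node: set(neighbours)} for an
        undirected, unweighted graph; comm: {node: community id}."""
        m = sum(len(nbrs) for nbrs in adj.values()) / 2.0  # edge count
        intra = defaultdict(float)  # edges inside each community
        deg = defaultdict(float)    # total degree per community
        for v, nbrs in adj.items():
            deg[comm[v]] += len(nbrs)
            for w in nbrs:
                if comm[v] == comm[w]:
                    intra[comm[v]] += 0.5  # each intra edge seen twice
        return sum(intra[c] / m - (deg[c] / (2 * m)) ** 2 for c in deg)

    def vertex_mover(adj, comm, max_sweeps=10):
        """Reassign each vertex to the neighbouring community giving the
        largest modularity; sweep until no move improves Q."""
        for _ in range(max_sweeps):
            improved = False
            for v in adj:
                current = comm[v]
                best_c, best_q = current, modularity(adj, comm)
                for c in {comm[w] for w in adj[v]} - {current}:
                    comm[v] = c  # tentative move
                    q = modularity(adj, comm)
                    if q > best_q + 1e-12:
                        best_c, best_q = c, q
                comm[v] = best_c  # keep the best assignment found
                if best_c != current:
                    improved = True
            if not improved:
                break
        return comm

    # Two triangles bridged by one edge; vertex 1 starts misassigned.
    adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
           3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
    comm = {0: 0, 1: 1, 2: 0, 3: 1, 4: 1, 5: 1}
    comm = vertex_mover(adj, comm)  # moves vertex 1 back to community 0
    ```

    Recomputing Q from scratch for every candidate move keeps the sketch short but costs O(n·m) per sweep; a practical refiner would track the modularity change of a single-vertex move in O(deg(v)).
    
    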

  17. AN IMPROVED AUTHENTICATED KEY AGREEMENT PROTOCOL

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    In 1999, Seo and Sweeney proposed a simple authenticated key agreement protocol designed to act as a Diffie-Hellman key agreement protocol with user authentication. Various attacks on this protocol, and enhancements to it, have been described in the literature. Recently, Ku and Wang proposed an improved authenticated key agreement protocol and asserted that it could withstand the existing attacks. This paper shows that Ku and Wang's protocol is still vulnerable to the modification attack and presents an improved authenticated key agreement protocol that enhances the security of Ku and Wang's protocol. The protocol achieves more efficient performance by replacing exponentiation operations with message authentication code operations.
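    The efficiency idea in the last sentence (using a MAC over the protocol transcript for key confirmation instead of further exponentiations) can be illustrated with a generic Diffie-Hellman sketch. This is not Seo-Sweeney's, Ku-Wang's, or the paper's actual protocol; the function names, the toy group parameters, and the transcript format are illustrative assumptions:

    ```python
    import hashlib
    import hmac
    import secrets

    # Demo-sized group only: real deployments use standardized groups
    # of at least 2048 bits, not a 127-bit prime.
    P = (1 << 127) - 1  # Mersenne prime, readable but NOT secure
    G = 3

    def dh_keypair():
        """Return (private exponent, public value g^x mod p)."""
        x = secrets.randbelow(P - 2) + 1
        return x, pow(G, x, P)

    def shared_key(priv, other_pub):
        """Derive a symmetric key from the raw DH shared secret."""
        z = pow(other_pub, priv, P)
        return hashlib.sha256(z.to_bytes(16, "big")).digest()

    def confirm_tag(key, transcript):
        # Key confirmation via one MAC computation; this is the cheap
        # operation that stands in for an extra exponentiation.
        return hmac.new(key, transcript, hashlib.sha256).digest()

    # Alice and Bob exchange public values, derive the same key, and
    # verify each other's confirmation tag over the transcript.
    a_priv, a_pub = dh_keypair()
    b_priv, b_pub = dh_keypair()
    k_a = shared_key(a_priv, b_pub)
    k_b = shared_key(b_priv, a_pub)
    transcript = a_pub.to_bytes(16, "big") + b_pub.to_bytes(16, "big")
    ok = hmac.compare_digest(confirm_tag(k_a, transcript),
                             confirm_tag(k_b, transcript))
    ```

    A modification attack of the kind the abstract mentions tampers with the exchanged public values; MACing the full transcript under the derived key is the standard way to detect such tampering during confirmation.
    
    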

  18. Protocols.io: Virtual Communities for Protocol Development and Discussion.

    Science.gov (United States)

    Teytelman, Leonid; Stoliartchouk, Alexei; Kindler, Lori; Hurwitz, Bonnie L

    2016-08-01

    The detailed know-how to implement research protocols frequently remains restricted to the research group that developed the method or technology. This knowledge often exists at a level that is too detailed for inclusion in the methods section of scientific articles. Consequently, methods are not easily reproduced, leading to a loss of time and effort by other researchers. The challenge is to develop a method-centered collaborative platform to connect with fellow researchers and discover state-of-the-art knowledge. Protocols.io is an open-access platform for detailing, sharing, and discussing molecular and computational protocols that can be useful before, during, and after publication of research results.

  19. Immune Disorder HSCT Protocol

    Science.gov (United States)

    2016-11-01

    Immune Deficiency Disorders; Severe Combined Immunodeficiency; Chronic Granulomatous Disease; X-linked Agammaglobulinemia; Wiskott-Aldrich Syndrome; Hyper-IgM; DiGeorge Syndrome; Chediak-Higashi Syndrome; Common Variable Immune Deficiency; Immune Dysregulatory Disorders; Hemophagocytic Lymphohistiocytosis; IPEX; Autoimmune Lymphoproliferative Syndrome; X-linked Lymphoproliferative Syndrome

  20. Study on Laboratory Method for Refining of SR Diesel Fuel

    Institute of Scientific and Technical Information of China (English)

    Li Junling; Zheng Tinglu; Han Zhaping

    2006-01-01

    A method for refining straight-run diesel fuel was studied on the laboratory scale, in order to bring the acid number of the diesel fraction into compliance with the standard while removing the naphthenic acids contained in the diesel without causing environmental pollution. After comparing the refining effect of three solvents, isopropyl alcohol-HOA was selected as the best solvent. The relationship between the acid number of the diesel fraction and the amount of solvent used, and the relationships between solvent concentration, temperature, and the stability of the diesel in terms of its acid number, were also investigated. Experimental results showed that when the mass fraction of the HOA-IPA solvent was 20%, at a dosage of 17 mL of solvent and a temperature of 30℃, the acid number of the refined diesel fraction was 0.015 mg KOH/g with good acidity stability in the diesel fraction.