WorldWideScience

Sample records for throughput proteome-wide precision

  1. Precision production: enabling deterministic throughput for precision aspheres with MRF

    Science.gov (United States)

    Maloney, Chris; Entezarian, Navid; Dumas, Paul

    2017-10-01

    Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer used exclusively in high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres, but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user-friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.

  2. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate-based chemical screening and high-content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening are performed at different scales, primarily in multiwell plate-based assays with a wide range of readout possibilities, with a focus on ultraminiaturization to allow affordable screening for academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research, such as genomics, sequencing and biobanking operations. Importantly, in line with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is focused on high throughput systems biology platforms for functional profiling of patient cells in personalized and precision medicine projects.

  3. Mass spectrometry analysis of proteome-wide proteolytic post-translational degradation of proteins

    OpenAIRE

    Shen, Yufeng; Hixson, Kim K.; Tolić, Nikola; Camp, David G.; Purvine, Samuel O.; Moore, Ronald J.; Smith, Richard D.

    2008-01-01

    Proteolytic degradation of proteins is essential to proper cell function and the cell life cycle. Here, we study protein degradation in yeast Saccharomyces cerevisiae cells on a proteome-wide scale by detecting the intermediate peptides produced from the intracellular degradation of proteins using sequencing-based tandem mass spectrometry. By tracing the detected ~1,100 peptides and their ~200 protein substrate origins we obtain evidence for new insights into the proteome-wide prot...

  4. A Proteome-wide, Quantitative Survey of In Vivo Ubiquitylation Sites Reveals Widespread Regulatory Roles

    DEFF Research Database (Denmark)

    Wagner, Sebastian Alexander; Beli, Petra; Weinert, Brian Tate

    2011-01-01

    Post-translational modification of proteins by ubiquitin is a fundamentally important regulatory mechanism. However, proteome-wide analysis of endogenous ubiquitylation remains a challenging task, and has almost always relied on cells expressing affinity-tagged ubiquitin. Here we combine single...

  5. Spectrum-to-Spectrum Searching Using a Proteome-wide Spectral Library*

    Science.gov (United States)

    Yen, Chia-Yu; Houel, Stephane; Ahn, Natalie G.; Old, William M.

    2011-01-01

    The unambiguous assignment of tandem mass spectra (MS/MS) to peptide sequences remains a key unsolved problem in proteomics. Spectral library search strategies have emerged as a promising alternative for peptide identification, in which MS/MS spectra are directly compared against a reference library of confidently assigned spectra. Two problems relate to library size. First, reference spectral libraries are limited to rediscovery of previously identified peptides and are not applicable to new peptides, because of their incomplete coverage of the human proteome. Second, problems arise when searching a spectral library the size of the entire human proteome. We observed that traditional dot product scoring methods do not scale well with spectral library size, showing a reduction in sensitivity as library size increases. We show that this problem can be addressed by optimizing scoring metrics for spectrum-to-spectrum searches with large spectral libraries. MS/MS spectra for the 1.3 million predicted tryptic peptides in the human proteome are simulated using a kinetic fragmentation model (MassAnalyzer version 2.1) to create a proteome-wide simulated spectral library. Searches of the simulated library increase MS/MS assignments by 24% compared with Mascot when using probabilistic and rank-based scoring methods. The proteome-wide coverage of the simulated library leads to an 11% increase in unique peptide assignments, compared with parallel searches of a reference spectral library. Further improvement is attained when reference spectra and simulated spectra are combined into a hybrid spectral library, yielding 52% more MS/MS assignments compared with Mascot searches. Our study demonstrates the advantages of using probabilistic and rank-based scores to improve performance of spectrum-to-spectrum search strategies. PMID:21532008
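
    As a concrete illustration of the scoring issue discussed in this abstract, the Python sketch below compares an observed spectrum against library spectra with a normalized dot product and then re-expresses the results as ranks, the style of rank-based scoring that behaves better as the library grows. The bin width, square-root transform and toy peaks are illustrative choices, not the authors' code.

        import numpy as np

        def bin_spectrum(mz, intensity, bin_width=1.0, mz_max=2000.0):
            """Bin peaks onto a fixed m/z grid; sqrt softens dominant peaks."""
            bins = np.zeros(int(mz_max / bin_width))
            idx = np.clip((np.asarray(mz) / bin_width).astype(int), 0, len(bins) - 1)
            np.add.at(bins, idx, intensity)
            return np.sqrt(bins)

        def dot_score(query, library):
            """Classic normalized dot product between two binned spectra."""
            denom = np.linalg.norm(query) * np.linalg.norm(library)
            return float(query @ library / denom) if denom else 0.0

        def rank_score(query, candidates):
            """Dot products plus their ranks (0 = best match) over all candidates."""
            scores = np.array([dot_score(query, c) for c in candidates])
            order = np.argsort(-scores)
            ranks = np.empty_like(order)
            ranks[order] = np.arange(len(order))
            return scores, ranks

        # toy usage: one query spectrum against a two-entry library
        q = bin_spectrum([175.1, 300.2, 512.3], [10, 40, 25])
        lib = [bin_spectrum([175.1, 300.2, 512.3], [12, 35, 30]),
               bin_spectrum([147.1, 260.1, 480.2], [20, 20, 20])]
        scores, ranks = rank_score(q, lib)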

  6. Improving quantitative precision and throughput by reducing calibrator use in liquid chromatography-tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rule, Geoffrey S., E-mail: geoffrey.s.rule@aruplab.com [ARUP Institute for Clinical and Experimental Pathology, 500 Chipeta Way, Salt Lake City, UT 84108 (United States); Rockwood, Alan L. [ARUP Institute for Clinical and Experimental Pathology, 500 Chipeta Way, Salt Lake City, UT 84108 (United States); Department of Pathology, University of Utah School of Medicine, 2100 Jones Medical Research Bldg., Salt Lake City, UT 84132 (United States)

    2016-05-05

    To improve efficiency in our mass spectrometry laboratories we have made efforts to reduce the number of calibration standards utilized for quantitation over time. We often analyze three or more batches of 96 samples per day, on a single instrument, for a number of assays. With a conventional calibration scheme at six concentration levels this amounts to more than 5000 calibration points per year. Modern LC-tandem mass spectrometric instrumentation, however, is extremely rugged, and isotopically labelled internal standards are widely available. This led us to consider whether alternative calibration strategies could reduce the number of calibration standards analyzed while still retaining precise and accurate quantitation. Here we demonstrate how, by utilizing a single calibration point in each sample batch and using the resulting response factor (RF) to update an existing, historical response factor (HRF), we are able to obtain improved precision over a conventional multipoint calibration approach, as judged by quality control samples. The laboratory component of this study was conducted with an existing LC-tandem mass spectrometric method for three androgen analytes in our production laboratory. Using examples from both simulated and laboratory data we illustrate several aspects of our single-point alternative calibration strategy and compare it with a conventional, multipoint calibration approach. We conclude that both the cost and burden of preparing multiple calibration standards with every batch of samples can be reduced while maintaining, or even improving, analytical quality. - Highlights: • Use of a weighted single point calibration approach improves quantitative precision. • A weighted response factor approach incorporates historical calibration information. • Several scenarios are discussed with regard to their influence on quantitation.
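
    The single-point strategy described above is easy to state concretely. The following Python sketch is a minimal reading of it, assuming an exponentially weighted update of the historical response factor; the weighting constant, update rule and peak areas are illustrative assumptions, not the authors' published procedure.

        def response_factor(area_analyte, area_is, calibrator_conc):
            """Response factor from the single calibration point run with the batch."""
            return (area_analyte / area_is) / calibrator_conc

        def update_hrf(hrf_old, rf_batch, alpha=0.2):
            """Blend the batch RF into the historical response factor (HRF)."""
            return rf_batch if hrf_old is None else (1 - alpha) * hrf_old + alpha * rf_batch

        def quantify(area_analyte, area_is, hrf):
            """Back-calculate concentration from the analyte/IS peak-area ratio."""
            return (area_analyte / area_is) / hrf

        # batch workflow: one calibrator per batch refreshes the HRF used for samples
        hrf = None
        for batch in [{"cal": (5.2e5, 4.9e5, 10.0), "samples": [(2.4e5, 5.0e5)]},
                      {"cal": (5.0e5, 5.1e5, 10.0), "samples": [(3.1e5, 4.8e5)]}]:
            rf = response_factor(*batch["cal"])
            hrf = update_hrf(hrf, rf)
            concentrations = [quantify(a, i, hrf) for a, i in batch["samples"]]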

  7. Improving quantitative precision and throughput by reducing calibrator use in liquid chromatography-tandem mass spectrometry

    International Nuclear Information System (INIS)

    Rule, Geoffrey S.; Rockwood, Alan L.

    2016-01-01

    To improve efficiency in our mass spectrometry laboratories we have made efforts to reduce the number of calibration standards utilized for quantitation over time. We often analyze three or more batches of 96 samples per day, on a single instrument, for a number of assays. With a conventional calibration scheme at six concentration levels this amounts to more than 5000 calibration points per year. Modern LC-tandem mass spectrometric instrumentation, however, is extremely rugged, and isotopically labelled internal standards are widely available. This led us to consider whether alternative calibration strategies could reduce the number of calibration standards analyzed while still retaining precise and accurate quantitation. Here we demonstrate how, by utilizing a single calibration point in each sample batch and using the resulting response factor (RF) to update an existing, historical response factor (HRF), we are able to obtain improved precision over a conventional multipoint calibration approach, as judged by quality control samples. The laboratory component of this study was conducted with an existing LC-tandem mass spectrometric method for three androgen analytes in our production laboratory. Using examples from both simulated and laboratory data we illustrate several aspects of our single-point alternative calibration strategy and compare it with a conventional, multipoint calibration approach. We conclude that both the cost and burden of preparing multiple calibration standards with every batch of samples can be reduced while maintaining, or even improving, analytical quality. - Highlights: • Use of a weighted single point calibration approach improves quantitative precision. • A weighted response factor approach incorporates historical calibration information. • Several scenarios are discussed with regard to their influence on quantitation.

  8. High Throughput, High Precision Hot Testing Tool for HBLED Wafer Level Testing

    Energy Technology Data Exchange (ETDEWEB)

    Solarz, Richard [KLA-Tencor Corporation, Milpitas, CA (United States); McCord, Mark [KLA-Tencor Corporation, Milpitas, CA (United States)

    2015-12-31

    The Socrates research effort developed an in-depth understanding of, and demonstrated in a prototype tool, new precise methods for the characterization of color characteristics and flux from individual LEDs for the production of uniform quality lighting. This effort was focused on improving the color quality and consistency of solid-state lighting and potentially reducing characterization costs for all LED product types. The patented laser hot testing method was demonstrated to be far more accurate than all current state-of-the-art color and flux characterization methods in use by the solid-state lighting industry today. A separately patented LED grouping method (statistical binning) was demonstrated to be a useful approach to improving utilization of entire lots of manufactured LEDs with large color and flux distributions for high quality color solid-state lighting. At the conclusion of the research in late 2015, however, the solid-state lighting industry was generally satisfied with its existing production methods for high quality color products for the small segment of customers that demand it, albeit with added costs.

  9. 3D profile-based approach to proteome-wide discovery of novel human chemokines.

    Directory of Open Access Journals (Sweden)

    Aurelie Tomczak

    Full Text Available Chemokines are small secreted proteins with important roles in immune responses. They share a conserved three-dimensional (3D) structure, the so-called IL8-like chemokine fold, which is stabilized by disulfide bridges characteristic of this protein family. Sequence- and profile-based computational methods have been proficient in discovering novel chemokines by making use of their sequence-conserved cysteine patterns. However, it has recently been shown that some chemokines escaped annotation by these methods due to low sequence similarity to known chemokines and to a different arrangement of cysteines in sequence and in 3D. Innovative methods overcoming the limitations of current techniques may allow the discovery of new remote homologs in the still functionally uncharacterized fraction of the human genome. We report a novel computational approach for proteome-wide identification of remote homologs of the chemokine family that uses fold recognition techniques in combination with a scaffold-based automatic mapping of disulfide bonds to define a 3D profile of the chemokine protein family. By applying our methodology to all currently uncharacterized human protein sequences, we have discovered two novel proteins that, without having significant sequence similarity to known chemokines or characteristic cysteine patterns, show strong structural resemblance to known anti-HIV chemokines. Detailed computational analysis and experimental structural investigations based on mass spectrometry and circular dichroism support our structural predictions and highlight several other chemokine-like features. The results obtained support their functional annotation as putative novel chemokines and encourage further experimental characterization. The identification of remote homologs of human chemokines may provide new insights into the molecular mechanisms causing pathologies such as cancer or AIDS, and may contribute to the development of novel treatments. Besides

  10. Comparative Analysis of Proteome-Wide Lysine Acetylation in Juvenile and Adult Schistosoma japonicum

    Directory of Open Access Journals (Sweden)

    Qing Li

    2017-11-01

    Full Text Available Schistosomiasis is a devastating parasitic disease caused by trematodes of the genus Schistosoma. Eggs produced by sexually mature schistosomes are the causative agents of pathogenesis and transmission. Elucidating the molecular mechanisms of schistosome development and sexual maturation would facilitate the prevention and control of schistosomiasis. Acetylation of lysine is a dynamic and reversible post-translational modification playing key roles in many biological processes, including development, in both eukaryotes and prokaryotes. To investigate the impacts of lysine acetylation on Schistosoma japonicum (S. japonicum) development and sexual maturation, we used immunoaffinity-based acetyllysine peptide enrichment combined with mass spectrometry (MS) to perform the first comparative analysis of proteome-wide lysine acetylation in both female and male, juvenile (18 days post infection, 18 dpi) and adult (28 dpi) schistosome samples. In total, we identified 874 unique acetylated sites in 494 acetylated proteins. The four samples shared 47 acetylated sites and 46 proteins. More acetylated sites and proteins shared by both females and males were identified in 28 dpi adults (189 and 143, respectively) than in 18 dpi schistosomula (76 and 59, respectively). More stage-unique acetylated sites and proteins were also identified in 28 dpi adults (494 and 210, respectively) than in 18 dpi schistosomula (73 and 44, respectively). Functional annotation showed that, across developmental stages and genders, a number of proteins involved in muscle movement, glycometabolism, lipid metabolism, energy metabolism, environmental stress resistance, antioxidation, etc., displayed distinct acetylation profiles, in accordance with the changes in their biological functions during schistosome development, suggesting that lysine acetylation exerts important regulatory roles in schistosome development. Taken together, our data provided the first

  11. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Science.gov (United States)

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  12. Real-Time Control System for Improved Precision and Throughput in an Ultrafast Carbon Fiber Placement Robot Using a SoC FPGA Extended Processing Platform

    Directory of Open Access Journals (Sweden)

    Gilberto Ochoa-Ruiz

    2017-01-01

    Full Text Available We present an architecture for accelerating the processing and execution of control commands in an ultrafast fiber placement robot. The system consists of a robotic arm designed by Coriolis Composites whose purpose is to move along a surface, on which composite fibers are deposited, via an independently controlled head. In the first system implementation, the control commands were sent via Profibus by a PLC, limiting the reaction time and thus the precision of the fiber placement and the maximum throughput. Therefore, a custom real-time solution was imperative in order to improve the performance and to meet the stringent requirements of the target industry (avionics, aeronautical systems). The solution presented in this paper is based on the use of a SoC FPGA processing platform running a real-time operating system (FreeRTOS), which has enabled an improved command retrieval mechanism. The system’s placement precision was improved by a factor of 20 (from 1 mm to 0.05 mm), while the maximum achievable throughput was 1 m/s, compared to the average 30 cm/s provided by the original solution, enabling fabrication of larger and more complex pieces in a fraction of the time.

  13. High-throughput dual-color precision imaging for brain-wide mapping of the connectome with cytoarchitectonic landmarks at the cellular level (Conference Presentation)

    Science.gov (United States)

    Luo, Qingming; Gong, Hui; Yuan, Jing; Li, Xiangning; Li, Anan; Xu, Tonghui

    2017-02-01

    Deciphering the fine morphology and precise location of neurons and neural circuits is crucial to enhancing our understanding of brain function and diseases. Traditionally, we have had to map brain images to coarse axial-sampling planar reference atlases to orient neural structures. However, this approach might fail to orient neural projections at single-cell resolution due to position errors resulting from individual differences at the cellular level. Here, we present a high-throughput imaging method that can automatically obtain the fine morphologies and precise locations of both neurons and circuits, employing wide-field large-volume tomography to acquire three-dimensional images of thick tissue and implementing real-time soma counterstaining to obtain cytoarchitectonic landmarks during the imaging process. The reconstruction and orientation of brain-wide neural circuits at single-neuron resolution can be accomplished for the same mouse brain without additional counterstains or image registration. Using our method, mouse brain imaging datasets of multiple type-specific neurons and circuits were successfully acquired, demonstrating its versatility. The results show that the simultaneous acquisition of labeled neural structures and a cytoarchitecture reference at single-neuron resolution in the same brain greatly facilitates precise tracing of long-range projections and accurate locating of nuclei. Our method provides a novel and effective tool for application in studies on genetic dissection, brain function and the pathology of the nervous system.

  14. Proteome-wide dataset supporting the study of ancient metazoan macromolecular complexes

    Directory of Open Access Journals (Sweden)

    Sadhna Phanse

    2016-03-01

    Full Text Available Our analysis examines the conservation of multiprotein complexes among metazoa through the use of high resolution biochemical fractionation and precision mass spectrometry applied to soluble cell extracts from 5 representative model organisms: Caenorhabditis elegans, Drosophila melanogaster, Mus musculus, Strongylocentrotus purpuratus, and Homo sapiens. The interaction network obtained from the data was validated globally in 4 distant species (Xenopus laevis, Nematostella vectensis, Dictyostelium discoideum, Saccharomyces cerevisiae) and locally by targeted affinity-purification experiments. Here we provide details of our massive set of supporting biochemical fractionation data, available via ProteomeXchange (http://www.ebi.ac.uk/pride/archive/projects/PXD002319 to http://www.ebi.ac.uk/pride/archive/projects/PXD002328), PPIs via BioGRID (185267), and interaction network projections via http://metazoa.med.utoronto.ca, all made fully accessible to allow further exploration. The datasets here are related to the research article on metazoan macromolecular complexes in Nature [1]. Keywords: Proteomics, Metazoa, Protein complexes, Biochemical fractionation

  15. Improving data quality and preserving HCD-generated reporter ions with EThcD for isobaric tag-based quantitative proteomics and proteome-wide PTM studies

    International Nuclear Information System (INIS)

    Yu, Qing; Shi, Xudong; Feng, Yu; Kent, K. Craig; Li, Lingjun

    2017-01-01

    Mass spectrometry (MS)-based isobaric labeling has undergone rapid development in recent years due to its capability for high throughput quantitation. Apart from its originally designed use with collision-induced dissociation (CID) and higher-energy collisional dissociation (HCD), the isobaric tagging technique can also work with electron-transfer dissociation (ETD), which provides complementarity to CID and is preferred for sequencing peptides with post-translational modifications (PTMs). However, ETD suffers from long reaction times, reduced duty cycle and bias against peptides with lower charge states. In addition, the common fragmentation mechanism in ETD results in altered reporter ion production, decreased multiplexing capability, and even loss of quantitation capability for some isobaric tags, including custom-designed dimethyl leucine (DiLeu) tags. Here, we demonstrate a novel electron-transfer/higher-energy collision dissociation (EThcD) approach that preserves the original reporter ion channels, mitigates bias against lower charge states, improves sensitivity, and significantly improves data quality for quantitative proteomics and proteome-wide PTM studies. Systematic optimization was performed to achieve a balance between data quality and sensitivity. We provide a direct comparison of EThcD with ETD and HCD for DiLeu- and TMT-labeled HEK cell lysate and IMAC-enriched phosphopeptides. Results demonstrate improved data quality and phosphorylation localization accuracy while preserving sufficient reporter ion production. Biological studies were performed to investigate phosphorylation changes in a mouse vascular smooth muscle cell line treated with four different conditions. Overall, EThcD exhibits superior performance compared to conventional ETD and offers distinct advantages compared to HCD in isobaric labeling-based quantitative proteomics and quantitative PTM studies. - Highlights: • EThcD was optimized for isobaric tag-labeled peptides for quantitative

  16. Improving data quality and preserving HCD-generated reporter ions with EThcD for isobaric tag-based quantitative proteomics and proteome-wide PTM studies

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Qing [School of Pharmacy, University of Wisconsin, Madison, WI 53705 (United States); Shi, Xudong [Department of Surgery, School of Medicine and Public Health, University of Wisconsin, Madison, WI 53705 (United States); Feng, Yu [School of Pharmacy, University of Wisconsin, Madison, WI 53705 (United States); Kent, K. Craig [Department of Surgery, School of Medicine and Public Health, University of Wisconsin, Madison, WI 53705 (United States); Li, Lingjun, E-mail: lingjun.li@wisc.edu [School of Pharmacy, University of Wisconsin, Madison, WI 53705 (United States); Department of Chemistry, University of Wisconsin, Madison, WI 53706 (United States)

    2017-05-22

    Mass spectrometry (MS)-based isobaric labeling has undergone rapid development in recent years due to its capability for high throughput quantitation. Apart from its originally designed use with collision-induced dissociation (CID) and higher-energy collisional dissociation (HCD), the isobaric tagging technique can also work with electron-transfer dissociation (ETD), which provides complementarity to CID and is preferred for sequencing peptides with post-translational modifications (PTMs). However, ETD suffers from long reaction times, reduced duty cycle and bias against peptides with lower charge states. In addition, the common fragmentation mechanism in ETD results in altered reporter ion production, decreased multiplexing capability, and even loss of quantitation capability for some isobaric tags, including custom-designed dimethyl leucine (DiLeu) tags. Here, we demonstrate a novel electron-transfer/higher-energy collision dissociation (EThcD) approach that preserves the original reporter ion channels, mitigates bias against lower charge states, improves sensitivity, and significantly improves data quality for quantitative proteomics and proteome-wide PTM studies. Systematic optimization was performed to achieve a balance between data quality and sensitivity. We provide a direct comparison of EThcD with ETD and HCD for DiLeu- and TMT-labeled HEK cell lysate and IMAC-enriched phosphopeptides. Results demonstrate improved data quality and phosphorylation localization accuracy while preserving sufficient reporter ion production. Biological studies were performed to investigate phosphorylation changes in a mouse vascular smooth muscle cell line treated with four different conditions. Overall, EThcD exhibits superior performance compared to conventional ETD and offers distinct advantages compared to HCD in isobaric labeling-based quantitative proteomics and quantitative PTM studies. - Highlights: • EThcD was optimized for isobaric tag-labeled peptides for quantitative

  17. Deformability measurement of red blood cells using a microfluidic channel array and an air cavity in a driving syringe with high throughput and precise detection of subpopulations.

    Science.gov (United States)

    Kang, Yang Jun; Ha, Young-Ran; Lee, Sang-Joon

    2016-01-07

    Red blood cell (RBC) deformability has been considered a potential biomarker for monitoring pathological disorders. High throughput and precise detection of subpopulations are essential in the measurement of RBC deformability. In this paper, we propose a new method to measure RBC deformability by evaluating temporal variations in the average velocity of blood flow and the image intensity of successively clogged RBCs in a microfluidic channel array over specific time durations. In addition, to effectively detect differences in subpopulations of RBCs, an air compliance effect is employed by adding an air cavity into a disposable syringe. The syringe was equally filled with a blood sample (V(blood) = 0.3 mL, hematocrit = 50%) and air (V(air) = 0.3 mL). Owing to the air compliance effect, blood flow in the microfluidic device behaved transiently, depending on the fluidic resistance of the microfluidic device. Based on the transient behaviors of blood flow, the deformability of RBCs is quantified by evaluating three representative parameters, namely, the minimum value of the average velocity of blood flow, a clogging index, and the delivered blood volume. The proposed method was applied to measure the deformability of blood samples consisting of homogeneous RBCs fixed with four different concentrations of glutaraldehyde solution (0%-0.23%). The proposed method was also employed to evaluate the deformability of blood samples partially mixed with normal RBCs and hardened RBCs. Thereafter, the deformability of RBCs infected by the human malaria parasite Plasmodium falciparum was measured. As a result, the three parameters varied significantly, depending on the degree of deformability. In addition, the deformability measurement of blood samples was completed in a short time (∼10 min). Therefore, the proposed method has significant potential for deformability measurement of blood samples containing hematological diseases with high throughput and precise detection of
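
    The three parameters named above lend themselves to a simple numerical illustration. The Python sketch below adopts plausible stand-in definitions (the velocity minimum, relative flow loss over the run as a clogging index, and the time integral of velocity as delivered volume); the paper's exact formulas may differ, and all input data are synthetic.

        import numpy as np

        def deformability_parameters(t, velocity, channel_area_mm2):
            """t in s, velocity in mm/s, channel cross-section in mm^2."""
            v = np.asarray(velocity, dtype=float)
            u_min = v.min()                                # minimum average velocity
            clogging_index = 1.0 - v[-1] / v[0]            # relative flow loss over the run
            volume_ul = np.trapz(v, t) * channel_area_mm2  # delivered volume (mm^3 = uL)
            return u_min, clogging_index, volume_ul

        t = np.linspace(0, 600, 601)             # 10 min measurement window
        v = 2.0 * np.exp(-t / 300) + 0.2         # decaying flow as channels clog
        print(deformability_parameters(t, v, channel_area_mm2=0.01))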

  18. A novel one-class SVM based negative data sampling method for reconstructing proteome-wide HTLV-human protein interaction networks.

    Science.gov (United States)

    Mei, Suyu; Zhu, Hao

    2015-01-26

    Protein-protein interaction (PPI) prediction is generally treated as a problem of binary classification, wherein negative data sampling is still an open problem to be addressed. The commonly used random sampling is prone to yield less representative negative data with considerable false negatives. Meanwhile, rational constraints are seldom exerted on model selection to reduce the risk of false positive predictions for most existing computational methods. In this work, we propose a novel negative data sampling method based on a one-class SVM (support vector machine) to predict proteome-wide protein interactions between the HTLV retrovirus and Homo sapiens, wherein the one-class SVM is used to choose reliable and representative negative data, and a two-class SVM is used to yield proteome-wide outcomes as predictive feedback for rational model selection. Computational results suggest that the one-class SVM is better suited to negative data sampling than a two-class PPI predictor, and that the predictive-feedback-constrained model selection helps to yield a rational predictive model that reduces the risk of false positive predictions. Some predictions have been validated by the recent literature. Lastly, gene ontology-based clustering of the predicted PPI networks is conducted to provide valuable cues for the pathogenesis of the HTLV retrovirus.
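
    The two-stage scheme described here is straightforward to prototype with scikit-learn. In the Python sketch below, the feature vectors for candidate HTLV-human protein pairs are random placeholders and the hyperparameters are illustrative; only the one-class/two-class structure follows the abstract.

        import numpy as np
        from sklearn.svm import OneClassSVM, SVC

        rng = np.random.default_rng(0)
        X_pos = rng.normal(1.0, 1.0, size=(200, 50))         # known interacting pairs
        X_unlabeled = rng.normal(0.0, 1.0, size=(2000, 50))  # candidate pairs

        # 1) fit a one-class SVM on the positives; candidates scored as outliers
        #    (far from the positive class) are taken as reliable negatives
        oc = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(X_pos)
        X_neg = X_unlabeled[oc.predict(X_unlabeled) == -1]

        # 2) train an ordinary two-class SVM on positives vs. sampled negatives,
        #    then score the whole candidate space as predictive feedback
        X = np.vstack([X_pos, X_neg])
        y = np.r_[np.ones(len(X_pos)), np.zeros(len(X_neg))]
        clf = SVC(kernel="rbf", probability=True).fit(X, y)
        proteome_wide_scores = clf.predict_proba(X_unlabeled)[:, 1]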

  19. Novel CTL epitopes identified through a Y. pestis proteome-wide analysis in the search for vaccine candidates against plague.

    Science.gov (United States)

    Zvi, Anat; Rotem, Shahar; Zauberman, Ayelet; Elia, Uri; Aftalion, Moshe; Bar-Haim, Erez; Mamroud, Emanuelle; Cohen, Ofer

    2017-10-20

    The causative agent of plague, Yersinia pestis, is a highly virulent pathogen and a potential bioweapon. Depending on the route of infection, two prevalent forms of the disease are known: bubonic and pneumonic. The latter has a high fatality rate. In the absence of a licensed vaccine, intense efforts to develop a safe and efficacious vaccine have been conducted, and humoral-driven subunit vaccines containing the F1 and LcrV antigens are currently in clinical trials. It is well known that a cellular immune response might have an essential additive value to immunity and protection against Y. pestis infection. Nevertheless, very few documented epitopes eliciting a protective T-cell response have been reported. Here, we present a combined high throughput computational and experimental effort towards the identification of CD8 T-cell epitopes. All 4067 proteins of Y. pestis were analyzed with state-of-the-art, recently developed prediction algorithms aimed at mapping potential MHC class I binders. A compilation of the results obtained from several prediction methods revealed a total of 238,000 peptide candidates, which necessitated downstream filtering criteria. Our previously established and proven approach for enrichment of true positive CTL epitopes, which relies on mapping clusters rich in tandem or overlapping predicted MHC binders ("hotspots"), was applied, together with considerations of predicted binding affinity. A total of 1532 peptides were tested for their ability to elicit a specific T-cell response by following the production of IFNγ from splenocytes isolated from vaccinated mice. Altogether, the screen resulted in 178 positive responders (11.8%), all novel Y. pestis CTL epitopes. These epitopes span 113 Y. pestis proteins. Substantial enrichment of membrane-associated proteins was detected among epitopes selected from hotspots of predicted MHC binders. These results considerably expand the repertoire of known CTL epitopes in Y. pestis and pave the way to
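
    The "hotspot" enrichment step lends itself to a short illustration. The Python sketch below groups predicted 9-mer MHC binders whose positions overlap or lie in tandem along a protein; the clustering thresholds are illustrative assumptions, not the authors' published criteria.

        def hotspots(starts, peptide_len=9, min_cluster=3, max_gap=None):
            """Group predicted binders whose peptides overlap or abut; keep big clusters."""
            if not starts:
                return []
            if max_gap is None:
                max_gap = peptide_len  # tandem = next binder starts within one length
            starts = sorted(starts)
            clusters, current = [], [starts[0]]
            for s in starts[1:]:
                if s - current[-1] <= max_gap:
                    current.append(s)
                else:
                    if len(current) >= min_cluster:
                        clusters.append((current[0], current[-1] + peptide_len))
                    current = [s]
            if len(current) >= min_cluster:
                clusters.append((current[0], current[-1] + peptide_len))
            return clusters  # (start, end) spans enriched in predicted binders

        # e.g. start positions pooled from several prediction algorithms for one protein
        print(hotspots([4, 6, 11, 13, 80, 200, 203, 205, 210]))  # [(4, 22), (200, 219)]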

  20. Correlation of proteome-wide changes with social immunity behaviors provides insight into resistance to the parasitic mite, Varroa destructor, in the honey bee (Apis mellifera).

    Science.gov (United States)

    Parker, Robert; Guarna, M Marta; Melathopoulos, Andony P; Moon, Kyung-Mee; White, Rick; Huxter, Elizabeth; Pernal, Stephen F; Foster, Leonard J

    2012-06-29

    Disease is a major factor driving the evolution of many organisms. In honey bees, selection for social behavioral responses is the primary adaptive process facilitating disease resistance. One such process, hygienic behavior, enables bees to resist multiple diseases, including the damaging parasitic mite Varroa destructor. The genetic elements and biochemical factors that drive the expression of these adaptations are currently unknown. Proteomics provides a tool to identify proteins that control behavioral processes, and these proteins can be used as biomarkers to aid identification of disease tolerant colonies. We sampled a large cohort of commercial queen lineages, recording overall mite infestation, hygiene, and the specific hygienic response to V. destructor. We performed proteome-wide correlation analyses in larval integument and adult antennae, identifying several proteins highly predictive of behavior and reduced hive infestation. In the larva, response to wounding was identified as a key adaptive process leading to reduced infestation, and chitin biosynthesis and immune responses appear to represent important disease resistant adaptations. The speed of hygienic behavior may be underpinned by changes in the antenna proteome, and chemosensory and neurological processes could also provide specificity for detection of V. destructor in antennae. Our results provide, for the first time, some insight into how complex behavioural adaptations manifest in the proteome of honey bees. The most important biochemical correlations provide clues as to the underlying molecular mechanisms of social and innate immunity of honey bees. Such changes are indicative of potential divergence in processes controlling the hive-worker maturation.
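
    A proteome-wide correlation screen of the kind described can be sketched in a few lines of Python. In the example below, the abundance matrix and hygiene scores are random placeholders, and the Bonferroni cut-off is an illustrative choice rather than the authors' statistical procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n_colonies, n_proteins = 40, 1200
        abundance = rng.normal(size=(n_colonies, n_proteins))  # quantified proteins
        hygiene_score = rng.normal(size=n_colonies)            # per-colony behaviour

        # correlate every protein's abundance with the behavioural score
        results = []
        for j in range(n_proteins):
            r, p = stats.pearsonr(abundance[:, j], hygiene_score)
            results.append((j, r, p))

        # simple multiple-testing control, then rank by |r| for biomarker candidates
        alpha = 0.05 / n_proteins
        candidates = sorted((x for x in results if x[2] < alpha),
                            key=lambda x: -abs(x[1]))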

  1. Proteome-wide analysis of the amino terminal status of Escherichia coli proteins at the steady-state and upon deformylation inhibition.

    Science.gov (United States)

    Bienvenut, Willy V; Giglione, Carmela; Meinnel, Thierry

    2015-07-01

    A proteome-wide analysis was performed in Escherichia coli to identify the impact on protein N-termini of actinonin, an antibiotic that specifically inhibits peptide deformylase (PDF). A strategy and tool suite (SILProNaQ) was employed to provide large-scale quantitation of N-terminal modifications. In control conditions, more than 1000 unique N-termini were identified, with 56% showing initiator methionine removal. Additional modifications corresponded to partial or complete Nα-acetylation (10%) and N-formyl retention (5%). Among the proteins undergoing these N-terminal modifications, 140 unique N-termini from translocated membrane proteins were highlighted. The very early time-course impact of actinonin was followed after addition of bacteriostatic concentrations of the drug. Under these conditions, 26% of all proteins no longer underwent deformylation after 10 min, a value reaching more than 60% of all characterized proteins after 40 min of treatment. The N-formylation ratio measured on individual proteins increased with the same trend. Upon early PDF inhibition, two major categories of proteins retained their N-formyl group: a large number of inner membrane proteins and many proteins involved in protein synthesis, including factors assisting the nascent chains in early cotranslational events. All MS data have been deposited in ProteomeXchange with identifiers PXD001979, PXD002012 and PXD001983 (http://proteomecentral.proteomexchange.org/dataset/PXD001979, http://proteomecentral.proteomexchange.org/dataset/PXD002012 and http://proteomecentral.proteomexchange.org/dataset/PXD001983). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Preexisting CD4+ T-cell immunity in human population to avian influenza H7N9 virus: whole proteome-wide immunoinformatics analyses.

    Directory of Open Access Journals (Sweden)

    Venkata R Duvvuri

    Full Text Available In 2013, a novel avian influenza H7N9 virus was identified in humans in China. The antigenically distinct H7N9 surface glycoproteins raised concerns about a lack of cross-protective neutralizing antibodies. Epitope-specific preexisting T-cell immunity was one of the protective mechanisms in pandemic 2009 H1N1, even in the absence of cross-protective antibodies. Hence, the assessment of preexisting CD4+ T-cell immunity to conserved epitopes shared between H7N9 and human influenza A viruses (IAV) is critical. A comparative whole proteome-wide immunoinformatics analysis was performed to predict the CD4+ T-cell epitopes that are commonly conserved within the proteome of H7N9 in reference to IAV subtypes (H1N1, H2N2, and H3N2). The commonly conserved CD4+ T-cell epitopes (∼556) were further screened against the Immune Epitope Database (IEDB) to validate their immunogenic potential. This analysis revealed that 45.5% (253 of 556) of the epitopes are experimentally proven to induce CD4+ T-cell memory responses. In addition, we also found that 23.3% of the CD4+ T-cell epitopes have ≥90% sequence homology with experimentally defined CD8+ T-cell epitopes. We also conducted a population coverage analysis across different ethnicities using the commonly conserved CD4+ T-cell epitopes and corresponding HLA-DRB1 alleles. Interestingly, the indigenous populations from Canada, the United States, Mexico and Australia exhibited low coverage (28.65% to 45.62%) when compared with other ethnicities (57.77% to 94.84%). In summary, the present analysis demonstrates evidence for the likely presence of preexisting T-cell immunity in the human population and also sheds light on the potential risk of the H7N9 virus among indigenous populations, given their high susceptibility during previous pandemic influenza events. This information is crucial for public health policy in targeting priority groups for immunization programs.
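
    The conservation step of such an analysis can be illustrated with a simple k-mer intersection in Python. The sketch below uses toy sequences; a real run would use full protein sets per subtype, a 15-mer window typical for MHC class II peptides, and downstream epitope prediction, none of which is reproduced here.

        def kmers(seq, k=15):
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        def conserved_peptides(h7n9_proteins, human_iav_proteins, k=15):
            """Return k-mers present in both proteomes (candidate conserved epitopes)."""
            target = set().union(*(kmers(s, k) for s in h7n9_proteins))
            reference = set().union(*(kmers(s, k) for s in human_iav_proteins))
            return target & reference

        # toy input: one protein per proteome; real input is hundreds of sequences
        shared = conserved_peptides(
            ["MKTIIALSYILCLVFAQKLPGNDNSTATLCLGHHAVPN"],
            ["MKAILVVLLYTFATANADTLCIGYHANNSTDTVDTVLE"])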

  3. Non-synonymous variations in cancer and their effects on the human proteome: workflow for NGS data biocuration and proteome-wide analysis of TCGA data.

    Science.gov (United States)

    Cole, Charles; Krampis, Konstantinos; Karagiannis, Konstantinos; Almeida, Jonas S; Faison, William J; Motwani, Mona; Wan, Quan; Golikov, Anton; Pan, Yang; Simonyan, Vahan; Mazumder, Raja

    2014-01-27

    Next-generation sequencing (NGS) technologies have resulted in petabytes of scattered data, decentralized in archives, databases and sometimes in isolated hard-disks that are inaccessible for browsing and analysis. It is expected that curated secondary databases will help organize some of this Big Data, thereby allowing users to better navigate, search and compute on it. To address the above challenge, we have implemented an NGS biocuration workflow and are analyzing short read sequences and associated metadata from cancer patients to better understand the human variome. Curation of variation and other related information from control (normal tissue) and case (tumor) samples will provide comprehensive background information that can be used in genomic medicine research and application studies. Our approach includes a CloudBioLinux Virtual Machine which is used upstream of an integrated High-performance Integrated Virtual Environment (HIVE) that encapsulates a Curated Short Read archive (CSR) and a proteome-wide variation effect analysis tool (SNVDis). As a proof of concept, we have curated and analyzed control and case breast cancer datasets from the NCI cancer genomics program, The Cancer Genome Atlas (TCGA). Our efforts include reviewing and recording in CSR available clinical information on patients, mapping of the reads to the reference followed by identification of non-synonymous Single Nucleotide Variations (nsSNVs), and integrating the data with tools that allow analysis of the effect of nsSNVs on the human proteome. Furthermore, we have also developed a novel phylogenetic analysis algorithm that uses SNV positions and can be used to classify the patient population. The workflow described here lays the foundation for analysis of short read sequence data to identify rare and novel SNVs that are not present in dbSNP and thereby provides a more comprehensive understanding of the human variome. Variation results for single genes as well as the entire study are available
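
    The SNV-based patient classification mentioned above can be illustrated as follows. The encoding and Jaccard-distance clustering in this Python sketch are a generic stand-in, not the paper's novel algorithm, and the variant positions are placeholders.

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.cluster.hierarchy import linkage, fcluster

        # each patient as the set of nsSNV positions observed in their tumor sample
        patients = {
            "P1": {"chr1:115258747", "chr17:7577538"},
            "P2": {"chr1:115258747", "chr3:178936091"},
            "P3": {"chr17:7577538"},
        }
        positions = sorted(set().union(*patients.values()))
        X = np.array([[pos in snvs for pos in positions]
                      for snvs in patients.values()], dtype=bool)

        dist = pdist(X, metric="jaccard")        # fraction of discordant variants
        tree = linkage(dist, method="average")   # hierarchical clustering
        labels = fcluster(tree, t=2, criterion="maxclust")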

  4. Precision manufacturing

    CERN Document Server

    Dornfeld, David

    2008-01-01

    Today there is a high demand for high-precision products. The manufacturing processes are now highly sophisticated and derive from a specialized genre called precision engineering. Precision Manufacturing provides an introduction to precision engineering and manufacturing, with an emphasis on the design and performance of precision machines and machine tools, metrology, tooling elements, machine structures, sources of error, precision machining processes and precision process planning, as well as the critical role that precision machine design for manufacturing has played in technological developments over the last few hundred years. In addition, the influence of sustainable manufacturing requirements in precision processes is introduced. Drawing upon years of practical experience and using numerous examples and illustrative applications, David Dornfeld and Dae-Eun Lee cover precision manufacturing as it applies to: The importance of measurement and metrology in the context of Precision Manufacturing. Th...

  5. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of a user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
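
    The kind of cross-referencing CrossCheck automates reduces, at its core, to set intersection plus an enrichment statistic. A minimal Python sketch follows, with placeholder dataset contents and an assumed background of 20,000 protein-coding genes; it is not CrossCheck's actual implementation.

        from scipy.stats import hypergeom

        published = {
            "genome_wide_RNAi_screen_X": {"TP53", "ATM", "RIPK1", "CASP8"},
            "kinase_substrate_predictions": {"RIPK1", "MLKL", "AKT1"},
        }
        user_hits = {"RIPK1", "CASP8", "TBK1"}
        background = 20000  # approx. number of protein-coding genes (assumption)

        for name, dataset in published.items():
            overlap = user_hits & dataset
            # hypergeometric enrichment p-value for the observed overlap
            p = hypergeom.sf(len(overlap) - 1, background,
                             len(dataset), len(user_hits))
            print(f"{name}: overlap={sorted(overlap)}, p={p:.3g}")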

  6. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    Science.gov (United States)

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  8. Why precision?

    Energy Technology Data Exchange (ETDEWEB)

    Bluemlein, Johannes

    2012-05-15

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  9. Why precision?

    International Nuclear Information System (INIS)

    Bluemlein, Johannes

    2012-05-01

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  10. Precision translator

    Science.gov (United States)

    Reedy, Robert P.; Crawford, Daniel W.

    1984-01-01

    A precision translator for focusing a beam of light on the end of a glass fiber, which includes two tuning fork-like members rigidly connected to each other. These members each have two prongs whose separation is adjusted by a screw, thereby adjusting the orthogonal positioning of a glass fiber attached to one of the members. This translator is made of simple parts and is capable of holding its adjustment even under rough handling.

  11. Precision Cosmology

    Science.gov (United States)

    Jones, Bernard J. T.

    2017-04-01

    Preface; Notation and conventions; Part I. 100 Years of Cosmology: 1. Emerging cosmology; 2. The cosmic expansion; 3. The cosmic microwave background; 4. Recent cosmology; Part II. Newtonian Cosmology: 5. Newtonian cosmology; 6. Dark energy cosmological models; 7. The early universe; 8. The inhomogeneous universe; 9. The inflationary universe; Part III. Relativistic Cosmology: 10. Minkowski space; 11. The energy momentum tensor; 12. General relativity; 13. Space-time geometry and calculus; 14. The Einstein field equations; 15. Solutions of the Einstein equations; 16. The Robertson-Walker solution; 17. Congruences, curvature and Raychaudhuri; 18. Observing and measuring the universe; Part IV. The Physics of Matter and Radiation: 19. Physics of the CMB radiation; 20. Recombination of the primeval plasma; 21. CMB polarisation; 22. CMB anisotropy; Part V. Precision Tools for Precision Cosmology: 23. Likelihood; 24. Frequentist hypothesis testing; 25. Statistical inference: Bayesian; 26. CMB data processing; 27. Parametrising the universe; 28. Precision cosmology; 29. Epilogue; Appendix A. SI, CGS and Planck units; Appendix B. Magnitudes and distances; Appendix C. Representing vectors and tensors; Appendix D. The electromagnetic field; Appendix E. Statistical distributions; Appendix F. Functions on a sphere; Appendix G. Acknowledgements; References; Index.

  12. Green throughput taxation

    International Nuclear Information System (INIS)

    Bruvoll, A.; Ibenholt, K.

    1998-01-01

    According to optimal taxation theory, raw materials should be taxed to capture the embedded scarcity rent in their value. To reduce both natural resource use and the corresponding emissions, or the throughput in the economic system, the best policy may be a tax on material inputs. As a first approach to throughput taxation, this paper considers a tax on intermediates in the framework of a dynamic computable general equilibrium model with environmental feedbacks. To balance the budget, payroll taxes are reduced. As a result, welfare indicators such as material consumption and leisure time consumption are reduced, while, on the other hand, all the environmental indicators improve. 27 refs

  13. Throughput rate study

    International Nuclear Information System (INIS)

    Ford, L.; Bailey, W.; Gottlieb, P.; Emami, F.; Fleming, M.; Robertson, D.

    1993-01-01

    The Civilian Radioactive Waste Management System (CRWMS) Management and Operating (M&O) Contractor has completed a study to analyze system-wide impacts of operating the CRWMS at varying throughput rates, including the 3000 MTU/year rate which has been assumed in the past. Impacts of throughput rate on all phases of CRWMS operations (acceptance, transportation, storage and disposal) were evaluated. The results of the study indicate that a range from 3000 to 5000 MTU/year is preferred, based on system cost per MTU of SNF emplaced and on logistics constraints.

  14. Lessons we learned from high-throughput and top-down systems biology analyses about glioma stem cells.

    Science.gov (United States)

    Mock, Andreas; Chiblak, Sara; Herold-Mende, Christel

    2014-01-01

    A growing body of evidence suggests that glioma stem cells (GSCs) account for tumor initiation, therapy resistance, and the subsequent regrowth of gliomas. Thus, continuous efforts have been undertaken to further characterize this subpopulation of less differentiated tumor cells. Although we are able to enrich GSCs, we still lack a comprehensive understanding of GSC phenotypes and behavior. The advent of high-throughput technologies raised hope that incorporation of these newly developed platforms would help to tackle such questions. Since then, a number of comparative genome-, transcriptome- and proteome-wide studies on GSCs have been conducted, giving new insights into GSC biology. However, lessons had to be learned in designing high-throughput experiments, and some of the resulting conclusions fell short of expectations because the studies were performed on only a few GSC lines or at a single molecular level rather than with an integrative poly-omics approach. Despite these shortcomings, our knowledge of GSC biology has markedly expanded, with a number of survival-associated biomarkers as well as glioma-relevant signaling pathways and therapeutic targets having been identified. In this article we review recent findings obtained by comparative high-throughput analyses of GSCs. We further summarize fundamental concepts of systems biology as well as its applications for glioma stem cell research.

  15. Precision Airdrop (Largage de precision)

    Science.gov (United States)

    2005-12-01

    This approach avoids including a magnetic compass for the heading reference, which has difficulties due to local changes in the magnetic field.

  16. MXIbus data throughput tests

    International Nuclear Information System (INIS)

    Botlo, M.; Dunning, J.; Jagieski, M.; Miller, L.; Romero, A.

    1992-11-01

    A series of tests was conducted to evaluate data transfer rates using the MXIbus architecture. The tests were conducted by the DAQ group in the Physics Research Division. The MXIbus from National Instruments provides a multisystem extension interface bus, allowing multiple VME chassis to be networked. Other bus architectures that can participate in the network include VXIbus, IBM PC-AT bus, Sun SBus, Mac NuBus and stand-alone instruments with the appropriate MXIbus adapter cards. From a functional standpoint, the MXIbus provides the capability to enlarge the address space in a fashion that is transparent to the software application. The tests were designed to measure data throughput when using the MXIbus with other industry off-the-shelf hardware. This report contains discussions on: MXIbus architecture and general guidelines; the commercial hardware and software used in each set of tests; and a brief description of each set of tests, observations and conclusions.

  17. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    A cornerstone of current drug discovery is high throughput screening assays, which allow examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with the sufficiently high time resolution and sensitivity required for precise and direct... The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery.

  18. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

    A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr·L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr·L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the "snail") passes over the pumping surface and removes the frost from it either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pumped limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as the reactive ion etch semiconductor process, may also be feasible.
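
    As a back-of-the-envelope check of the figures quoted above (an illustration, not part of the report): throughput Q, pumping speed S, and inlet pressure p are related by Q = S·p. A minimal sketch in Python:

        # Throughput Q (Torr·L/s) = pumping speed S (L/s) x inlet pressure p (Torr).
        def inlet_pressure(throughput_torr_l_s, speed_l_s):
            """Inlet pressure (Torr) implied by Q = S * p."""
            return throughput_torr_l_s / speed_l_s

        p_d2 = inlet_pressure(30.0, 2000.0)   # deuterium: 0.015 Torr at the inlet
        p_exhaust = 200 * p_d2                # compression ratio 200 -> ~3 Torr exhaust
        print(f"D2 inlet pressure: {p_d2:.3f} Torr, exhaust: ~{p_exhaust:.1f} Torr")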

  19. An Unsupervised kNN Method to Systematically Detect Changes in Protein Localization in High-Throughput Microscopy Images.

    Directory of Open Access Journals (Sweden)

    Alex Xijie Lu

    Full Text Available Despite the importance of characterizing genes that exhibit subcellular localization changes between conditions in proteome-wide imaging experiments, many recent studies still rely upon manual evaluation to assess the results of high-throughput imaging experiments. We describe and demonstrate an unsupervised k-nearest neighbours method for the detection of localization changes. Compared to previous classification-based supervised change detection methods, our method is much simpler and faster, and operates directly on the feature space to overcome limitations in needing to manually curate training sets that may not generalize well between screens. In addition, the output of our method is flexible in its utility, generating both a quantitatively ranked list of localization changes that permit user-defined cut-offs, and a vector for each gene describing feature-wise direction and magnitude of localization changes. We demonstrate that our method is effective at the detection of localization changes using the Δrpd3 perturbation in Saccharomyces cerevisiae, where we capture 71.4% of previously known changes within the top 10% of ranked genes, and find at least four new localization changes within the top 1% of ranked genes. The results of our analysis indicate that simple unsupervised methods may be able to identify localization changes in images without laborious manual image labelling steps.

  20. An Unsupervised kNN Method to Systematically Detect Changes in Protein Localization in High-Throughput Microscopy Images.

    Science.gov (United States)

    Lu, Alex Xijie; Moses, Alan M

    2016-01-01

    Despite the importance of characterizing genes that exhibit subcellular localization changes between conditions in proteome-wide imaging experiments, many recent studies still rely upon manual evaluation to assess the results of high-throughput imaging experiments. We describe and demonstrate an unsupervised k-nearest neighbours method for the detection of localization changes. Compared to previous classification-based supervised change detection methods, our method is much simpler and faster, and operates directly on the feature space to overcome limitations in needing to manually curate training sets that may not generalize well between screens. In addition, the output of our method is flexible in its utility, generating both a quantitatively ranked list of localization changes that permit user-defined cut-offs, and a vector for each gene describing feature-wise direction and magnitude of localization changes. We demonstrate that our method is effective at the detection of localization changes using the Δrpd3 perturbation in Saccharomyces cerevisiae, where we capture 71.4% of previously known changes within the top 10% of ranked genes, and find at least four new localization changes within the top 1% of ranked genes. The results of our analysis indicate that simple unsupervised methods may be able to identify localization changes in images without laborious manual image labelling steps.
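
    A minimal sketch of the kind of unsupervised kNN change detection described above (the feature dimensionality, distance measure, and scoring rule here are illustrative assumptions, not the published pipeline): each gene is scored by how far its treated-condition feature vector sits from its k nearest neighbours among the control-condition vectors, and genes are ranked by that score.

        import numpy as np

        def knn_change_scores(control, treated, k=5):
            """Mean distance from each treated feature vector to its k nearest
            control vectors; large scores suggest a localization change."""
            scores = []
            for x in treated:
                d = np.linalg.norm(control - x, axis=1)  # distances to all controls
                scores.append(np.sort(d)[:k].mean())     # mean of k smallest
            return np.asarray(scores)

        rng = np.random.default_rng(0)
        control = rng.normal(size=(1000, 32))            # 1000 genes, 32 image features
        treated = control + rng.normal(scale=0.05, size=control.shape)
        treated[:10] += 3.0                              # simulate 10 genuine changes

        ranked = np.argsort(knn_change_scores(control, treated))[::-1]
        print("top-ranked genes:", ranked[:10])          # recovers the simulated changes

    The ranked output corresponds to the quantitatively ranked list of localization changes mentioned in the abstract; a user-defined cut-off on the score plays the role of the cut-offs described there.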

  1. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  2. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels....... A cornerstone in current drug discovery is high throughput screening assays which allow examination of the activity of specific ion channels though only to a limited extent. Conventional patch clamp remains the sole technique with sufficiently high time resolution and sensitivity required for precise and direct...... characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of high throughput philosophy are emerging and predicted to make a number of ion...

  3. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue increasing the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...

  4. The newest precision measurement

    International Nuclear Information System (INIS)

    Lee, Jing Gu; Lee, Jong Dae

    1974-05-01

    This book introduces the basics of precision measurement: measurement of length, limit gauges, measurement of angles, measurement of surface roughness, measurement of shapes and locations, measurement of outlines, measurement of external and internal threads, gear testing, accuracy inspection of machine tools, three-dimensional coordinate measuring machines, digitalisation of precision measurement, automation of precision measurement, measurement of cutting tools, measurement using lasers, and points to consider in choosing a length measuring instrument.

  5. Practical precision measurement

    International Nuclear Information System (INIS)

    Kwak, Ho Chan; Lee, Hui Jun

    1999-01-01

    This book introduces basic knowledge of precision measurement: measurement of length, precision measurement of minor diameters, measurement of angles, measurement of surface roughness, three-dimensional measurement, measurement of locations and shapes, measurement of screws, gear testing, cutting tool testing, rolling bearing testing, and digitalisation of measurement. It covers height gauges, how to test surface roughness, measurement of flatness and straightness, external and internal thread testing, gear tooth measurement, milling cutters, taps, rotation precision measurement, and optical transducers.

  6. [Precision and personalized medicine].

    Science.gov (United States)

    Sipka, Sándor

    2016-10-01

    The author describes the concept of "personalized medicine" and the newly introduced "precision medicine". "Precision medicine" applies the terms "phenotype", "endotype" and "biomarker" in order to characterize the various diseases more precisely. Using "biomarkers", the homogeneous type of a disease (a "phenotype") can be divided into subgroups called "endotypes" requiring different forms of treatment and financing. The good results of "precision medicine" have become especially apparent in relation to allergic and autoimmune diseases. The application of this new way of thinking is going to be necessary in Hungary, too, in the near future for participants, controllers and financing boards of healthcare. Orv. Hetil., 2016, 157(44), 1739-1741.

  7. Precision Clock Evaluation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Tests and evaluates high-precision atomic clocks for spacecraft, ground, and mobile applications. Supports performance evaluation, environmental testing,...

  8. Precision machining commercialization

    International Nuclear Information System (INIS)

    1978-01-01

    To accelerate precision machining development so as to realize more of the potential savings within the next few years of known Department of Defense (DOD) part procurement, the Air Force Materials Laboratory (AFML) is sponsoring the Precision Machining Commercialization Project (PMC). PMC is part of the Tri-Service Precision Machine Tool Program of the DOD Manufacturing Technology Five-Year Plan. The technical resources supporting PMC are provided under sponsorship of the Department of Energy (DOE). The goal of PMC is to minimize precision machining development time and cost risk for interested vendors. PMC will do this by making available the high precision machining technology as developed in two DOE contractor facilities, the Lawrence Livermore Laboratory of the University of California and the Union Carbide Corporation, Nuclear Division, Y-12 Plant, at Oak Ridge, Tennessee

  9. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic sequences or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  10. Precision digital control systems

    Science.gov (United States)

    Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.

    This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.

  11. submitter LEP precision results

    CERN Document Server

    Kawamoto, T

    2001-01-01

    Precision measurements at LEP are reviewed, with main focus on the electroweak measurements and tests of the Standard Model. Constraints placed by the LEP measurements on possible new physics are also discussed.

  12. Description of precision colorimeter

    OpenAIRE

    Campos Acosta, Joaquín; Pons Aglio, Alicia; Corróns, Antonio

    1987-01-01

    Describes the use of a fully automatic, computer-controlled absolute spectroradiometer as a precision colorimeter. The chromaticity coordinates of several types of light sources have been obtained with this measurement system.

  13. NCI Precision Medicine

    Science.gov (United States)

    This illustration represents the National Cancer Institute’s support of research to improve precision medicine in cancer treatment, in which unique therapies treat an individual’s cancer based on specific genetic abnormalities of that person’s tumor.

  14. Laser precision microfabrication

    CERN Document Server

    Sugioka, Koji; Pique, Alberto

    2010-01-01

    Miniaturization and high precision are rapidly becoming a requirement for many industrial processes and products. As a result, there is greater interest in the use of laser microfabrication technology to achieve these goals. This book, composed of 16 chapters, covers all the topics of laser precision processing, from fundamental aspects to industrial applications, for both inorganic and biological materials. It reviews the state of the art of research and technological development in the area of laser processing.

  15. Throughput capacity of the Asbestos Conversion Unit

    International Nuclear Information System (INIS)

    Hyman, M.H.

    1996-10-01

    An engineering assessment is presented for factors that could significantly limit the throughput capacity of the Asbestos Conversion Unit. The assessment focuses mainly on volumetric throughput capacity (and related mass rate and feed density), and energy input. Important conclusions that were reached during this assessment are that the throughput is limited by feed densification capability and that the design energy input rating appears to be adequate.

  16. High throughput diffractive multi-beam femtosecond laser processing using a spatial light modulator

    Energy Technology Data Exchange (ETDEWEB)

    Kuang Zheng [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom)], E-mail: z.kuang@liv.ac.uk; Perrie, Walter [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom); Leach, Jonathan [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom); Sharp, Martin; Edwardson, Stuart P. [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom); Padgett, Miles [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom); Dearden, Geoff; Watkins, Ken G. [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom)

    2008-12-30

    High throughput femtosecond laser processing is demonstrated by creating multiple beams using a spatial light modulator (SLM). The diffractive multi-beam patterns are modulated in real time by computer generated holograms (CGHs), which can be calculated by appropriate algorithms. An interactive LabVIEW program is adopted to generate the relevant CGHs. Optical efficiency at this stage is shown to be ≈50% into first order beams and real time processing has been carried out at 50 Hz refresh rate. Results obtained demonstrate high precision surface micro-structuring on silicon and Ti6Al4V with throughput gain >1 order of magnitude.
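
    The abstract does not name the hologram-design algorithm; a common choice for computing phase-only CGHs that split one beam into several first-order spots is the Gerchberg-Saxton iterative Fourier-transform algorithm. A minimal sketch under that assumption (illustrative grid size and spot positions; assumes the SLM plane and the processing plane are Fourier conjugates):

        import numpy as np

        N = 256
        target = np.zeros((N, N))                        # desired focal-plane spots
        target[96, 96] = target[96, 160] = target[160, 96] = target[160, 160] = 1.0

        phase = 2 * np.pi * np.random.rand(N, N)         # random starting phase
        for _ in range(50):
            focal = np.fft.fft2(np.exp(1j * phase))      # propagate SLM -> focal plane
            focal = np.sqrt(target) * np.exp(1j * np.angle(focal))  # impose target amplitude
            phase = np.angle(np.fft.ifft2(focal))        # keep phase only at the SLM

        spots = np.abs(np.fft.fft2(np.exp(1j * phase)))**2
        print("fraction of power in target spots:", spots[target > 0].sum() / spots.sum())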

  17. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    Science.gov (United States)

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. In contrast, microfluidic 3D culture allows precise control of cell microenvironments and provides throughput higher by orders of magnitude. In this chapter, we will look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  18. Precision Experiments at LEP

    CERN Document Server

    de Boer, Wim

    2015-01-01

    The Large Electron Positron Collider (LEP) established the Standard Model (SM) of particle physics with unprecedented precision, including all its radiative corrections. These led to predictions for the masses of the top quark and Higgs boson, which were beautifully confirmed later on. After these precision measurements the Nobel Prize in Physics was awarded in 1999 jointly to 't Hooft and Veltman "for elucidating the quantum structure of electroweak interactions in physics". Another hallmark of the LEP results were the precise measurements of the gauge coupling constants, which excluded unification of the forces within the SM, but allowed unification within the supersymmetric extension of the SM. This increased the interest in Supersymmetry (SUSY) and Grand Unified Theories, especially since the SM has no candidate for the elusive dark matter, while Supersymmetry provides an excellent candidate for dark matter. In addition, Supersymmetry removes the quadratic divergencies of the SM and predicts the Hig...

  19. Precision muonium spectroscopy

    International Nuclear Information System (INIS)

    Jungmann, Klaus P.

    2016-01-01

    The muonium atom is the purely leptonic bound state of a positive muon and an electron. It has a lifetime of 2.2 µs. The absence of any known internal structure provides for precision experiments to test fundamental physics theories and to determine accurate values of fundamental constants. In particular ground state hyperfine structure transitions can be measured by microwave spectroscopy to deliver the muon magnetic moment. The frequency of the 1s–2s transition in the hydrogen-like atom can be determined with laser spectroscopy to obtain the muon mass. With such measurements fundamental physical interactions, in particular quantum electrodynamics, can also be tested at highest precision. The results are important input parameters for experiments on the muon magnetic anomaly. The simplicity of the atom enables further precise experiments, such as a search for muonium–antimuonium conversion for testing charged lepton number conservation and searches for possible antigravity of muons and dark matter. (author)

  20. Precision genome editing

    DEFF Research Database (Denmark)

    Steentoft, Catharina; Bennett, Eric P; Schjoldager, Katrine Ter-Borch Gram

    2014-01-01

    Precise and stable gene editing in mammalian cell lines has until recently been hampered by the lack of efficient targeting methods. While different gene silencing strategies have had tremendous impact on many biological fields, they have generally not been applied with wide success in the field...... of glycobiology, primarily due to their low efficiencies, with resultant failure to impose substantial phenotypic consequences upon the final glycosylation products. Here, we review novel nuclease-based precision genome editing techniques enabling efficient and stable gene editing, including gene disruption...... by introducing single or double-stranded breaks at a defined genomic sequence. We here compare and contrast the different techniques and summarize their current applications, highlighting cases from the field of glycobiology as well as pointing to future opportunities. The emerging potential of precision gene...

  1. Precision electron polarimetry

    International Nuclear Information System (INIS)

    Chudakov, E.

    2013-01-01

    A new generation of precise Parity-Violating experiments will require a sub-percent accuracy of electron beam polarimetry. Compton polarimetry can provide such accuracy at high energies, but at a few hundred MeV the small analyzing power limits the sensitivity. Møller polarimetry provides a high analyzing power independent of the beam energy, but is limited by the properties of the polarized targets commonly used. Options for precision polarimetry at 300 MeV will be discussed, in particular a proposal to use ultra-cold atomic hydrogen traps to provide a 100%-polarized electron target for Møller polarimetry.

  2. A passion for precision

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit

    2006-01-01

    For more than three decades, the quest for ever higher precision in laser spectroscopy of the simple hydrogen atom has inspired many advances in laser, optical, and spectroscopic techniques, culminating in femtosecond laser optical frequency combs as perhaps the most precise measuring tools known to man. Applications range from optical atomic clocks and tests of QED and relativity to searches for time variations of fundamental constants. Recent experiments are extending frequency comb techniques into the extreme ultraviolet. Laser frequency combs can also control the electric field of ultrashort light pulses, creating powerful new tools for the emerging field of attosecond science.

  3. Improving Precision of Types

    DEFF Research Database (Denmark)

    Winther, Johnni

    Types in programming languages provide a powerful tool for the programmer to document the code so that a large aspect of the intent can not only be presented to fellow programmers but also be checked automatically by compilers. The precision with which types model the behavior of programs...... is crucial to the quality of these automated checks, and in this thesis we present three different improvements to the precision of types in three different aspects of the Java programming language. First we show how to extend the type system in Java with a new type which enables the detection of unintended...

  4. Precision physics at LHC

    International Nuclear Information System (INIS)

    Hinchliffe, I.

    1997-05-01

    In this talk the author gives a brief survey of some physics topics that will be addressed by the Large Hadron Collider currently under construction at CERN. Instead of discussing the reach of this machine for new physics, the author gives examples of the types of precision measurements that might be made if new physics is discovered

  5. Precision Muonium Spectroscopy

    NARCIS (Netherlands)

    Jungmann, Klaus P.

    2016-01-01

    The muonium atom is the purely leptonic bound state of a positive muon and an electron. It has a lifetime of 2.2 µs. The absence of any known internal structure provides for precision experiments to test fundamental physics theories and to determine accurate values of fundamental constants. In

  6. What is precision medicine?

    Science.gov (United States)

    König, Inke R; Fuchs, Oliver; Hansen, Gesine; von Mutius, Erika; Kopp, Matthias V

    2017-10-01

    The term "precision medicine" has become very popular over recent years, fuelled by scientific as well as political perspectives. Despite its popularity, its exact meaning, and how it is different from other popular terms such as "stratified medicine", "targeted therapy" or "deep phenotyping" remains unclear. Commonly applied definitions focus on the stratification of patients, sometimes referred to as a novel taxonomy, and this is derived using large-scale data including clinical, lifestyle, genetic and further biomarker information, thus going beyond the classical "signs-and-symptoms" approach.While these aspects are relevant, this description leaves open a number of questions. For example, when does precision medicine begin? In which way does the stratification of patients translate into better healthcare? And can precision medicine be viewed as the end-point of a novel stratification of patients, as implied, or is it rather a greater whole?To clarify this, the aim of this paper is to provide a more comprehensive definition that focuses on precision medicine as a process. It will be shown that this proposed framework incorporates the derivation of novel taxonomies and their role in healthcare as part of the cycle, but also covers related terms. Copyright ©ERS 2017.

  7. Uplink SDMA with Limited Feedback: Throughput Scaling

    Directory of Open Access Journals (Sweden)

    Jeffrey G. Andrews

    2008-01-01

    Full Text Available Combined space division multiple access (SDMA) and scheduling exploit both spatial multiplexing and multiuser diversity, increasing throughput significantly. Both SDMA and scheduling require feedback of multiuser channel state information (CSI). This paper focuses on uplink SDMA with limited feedback, which refers to efficient techniques for CSI quantization and feedback. To quantify the throughput of uplink SDMA and derive design guidelines, the throughput scaling with system parameters is analyzed. The specific parameters considered include the numbers of users, antennas, and feedback bits. Furthermore, different SNR regimes and beamforming methods are considered. The derived throughput scaling laws are observed to change for different SNR regimes. For instance, the throughput scales logarithmically with the number of users in the high SNR regime but double logarithmically in the low SNR regime. The analysis of throughput scaling suggests guidelines for scheduling in uplink SDMA. For example, to maximize throughput scaling, scheduling should use the criterion of minimum quantization errors for the high SNR regime and maximum channel power for the low SNR regime.
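
    Restating the two regimes quoted above in asymptotic notation (the symbols R for sum throughput and U for the number of users, and the Θ-notation, are shorthand introduced here rather than taken from the paper):

        R_{\mathrm{high\ SNR}} = \Theta(\log U), \qquad R_{\mathrm{low\ SNR}} = \Theta(\log \log U)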

  8. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale-up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  9. Precision synchrotron radiation detectors

    International Nuclear Information System (INIS)

    Levi, M.; Rouse, F.; Butler, J.

    1989-03-01

    Precision detectors to measure synchrotron radiation beam positions have been designed and installed as part of beam energy spectrometers at the Stanford Linear Collider (SLC). The distance between pairs of synchrotron radiation beams is measured absolutely to better than 28 µm on a pulse-to-pulse basis. This contributes less than 5 MeV to the error in the measurement of SLC beam energies (approximately 50 GeV). A system of high-resolution video cameras viewing precisely-aligned fiducial wire arrays overlaying phosphorescent screens has achieved this accuracy. Also, detectors of synchrotron radiation using the charge developed by the ejection of Compton-recoil electrons from an array of fine wires are being developed. 4 refs., 5 figs., 1 tab

  10. A passion for precision

    CERN Multimedia

    CERN. Geneva

    2006-01-01

    For more than three decades, the quest for ever higher precision in laser spectroscopy of the simple hydrogen atom has inspired many advances in laser, optical, and spectroscopic techniques, culminating in femtosecond laser optical frequency combs as perhaps the most precise measuring tools known to man. Applications range from optical atomic clocks and tests of QED and relativity to searches for time variations of fundamental constants. Recent experiments are extending frequency comb techniques into the extreme ultraviolet. Laser frequency combs can also control the electric field of ultrashort light pulses, creating powerful new tools for the emerging field of attosecond science.

  11. Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent

    Science.gov (United States)

    De Sa, Christopher; Feldman, Matthew; Ré, Christopher; Olukotun, Kunle

    2018-01-01

    Stochastic gradient descent (SGD) is one of the most popular numerical algorithms used in machine learning and other domains. Since this is likely to continue for the foreseeable future, it is important to study techniques that can make it run fast on parallel hardware. In this paper, we provide the first analysis of a technique called Buckwild! that uses both asynchronous execution and low-precision computation. We introduce the DMGC model, the first conceptualization of the parameter space that exists when implementing low-precision SGD, and show that it provides a way to both classify these algorithms and model their performance. We leverage this insight to propose and analyze techniques to improve the speed of low-precision SGD. First, we propose software optimizations that can increase throughput on existing CPUs by up to 11×. Second, we propose architectural changes, including a new cache technique we call an obstinate cache, that increase throughput beyond the limits of current-generation hardware. We also implement and analyze low-precision SGD on the FPGA, which is a promising alternative to the CPU for future SGD systems. PMID:29391770
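
    A minimal sketch of the low-precision idea in its simplest form (round-to-nearest fixed-point weights inside a plain SGD loop; this illustrates the general technique, not the paper's Buckwild! implementation or its DMGC parameterization):

        import numpy as np

        def quantize(x, scale=1/256, bits=16):
            """Round to the nearest representable fixed-point value."""
            lo, hi = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
            return np.clip(np.round(x / scale), lo, hi) * scale

        rng = np.random.default_rng(1)
        X = rng.normal(size=(1000, 10))
        w_true = rng.normal(size=10)
        y = X @ w_true + 0.01 * rng.normal(size=1000)    # linear-regression data

        w, lr = np.zeros(10), 0.05
        for _ in range(20):                              # 20 epochs of SGD
            for i in rng.permutation(len(y)):
                grad = (X[i] @ w - y[i]) * X[i]          # single-sample gradient
                w = quantize(w - lr * grad)              # low-precision write-back

        print("distance to true weights:", np.linalg.norm(w - w_true))

    The final error settles near the quantization resolution rather than at machine precision, which is the basic accuracy/throughput trade-off the paper analyzes.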

  12. Quad precision delay generator

    International Nuclear Information System (INIS)

    Krishnan, Shanti; Gopalakrishnan, K.R.; Marballi, K.R.

    1997-01-01

    A Quad Precision Delay Generator delays a digital edge by a programmed amount of time, varying from nanoseconds to microseconds. The output of this generator has an amplitude of the order of tens of volts and rise time of the order of nanoseconds. This was specifically designed and developed to meet the stringent requirements of the plasma focus experiments. Plasma focus is a laboratory device for producing and studying nuclear fusion reactions in hot deuterium plasma. 3 figs

  13. Multiple and high-throughput droplet reactions via combination of microsampling technique and microfluidic chip

    KAUST Repository

    Wu, Jinbo

    2012-11-20

    Microdroplets offer unique compartments for accommodating a large number of chemical and biological reactions in tiny volume with precise control. A major concern in droplet-based microfluidics is the difficulty to address droplets individually and achieve high throughput at the same time. Here, we have combined an improved cartridge sampling technique with a microfluidic chip to perform droplet screenings and aggressive reaction with minimal (nanoliter-scale) reagent consumption. The droplet composition, distance, volume (nanoliter to subnanoliter scale), number, and sequence could be precisely and digitally programmed through the improved sampling technique, while sample evaporation and cross-contamination are effectively eliminated. Our combined device provides a simple model to utilize multiple droplets for various reactions with low reagent consumption and high throughput. © 2012 American Chemical Society.

  14. Precision electroweak measurements

    International Nuclear Information System (INIS)

    Demarteau, M.

    1996-11-01

    Recent electroweak precision measurements from e+e− and p p̄ colliders are presented. Some emphasis is placed on the recent developments in the heavy flavor sector. The measurements are compared to predictions from the Standard Model of electroweak interactions. All results are found to be consistent with the Standard Model. The indirect constraint on the top quark mass from all measurements is in excellent agreement with the direct mt measurements. Using the world's electroweak data in conjunction with the current measurement of the top quark mass, the constraints on the Higgs mass are discussed.

  15. Electroweak precision tests

    International Nuclear Information System (INIS)

    Monteil, St.

    2009-12-01

    This document aims at summarizing a dozen years of the author's research in High Energy Physics, in particular precision tests of the electroweak theory. Parity-violating asymmetry measurements at LEP with the ALEPH detector, together with global consistency checks of the Kobayashi-Maskawa paradigm within the CKM-fitter group, are gathered in the first part of the document. The second part deals with the unpublished instrumental work on the design, tests, production and commissioning of the elements of the Pre-Shower detector of the LHCb spectrometer at LHC. Physics perspectives with LHCb are eventually discussed as a conclusion. (author)

  16. Ultra-precision bearings

    CERN Document Server

    Wardle, F

    2015-01-01

    Ultra-precision bearings can achieve extreme accuracy of rotation, making them ideal for use in numerous applications across a variety of fields, including hard disk drives, roundness measuring machines and optical scanners. Ultraprecision Bearings provides a detailed review of the different types of bearing and their properties, as well as an analysis of the factors that influence motion error, stiffness and damping. Following an introduction to basic principles of motion error, each chapter of the book is then devoted to the basic principles and properties of a specific type of bearin

  17. Dimensioning storage and computing clusters for efficient high throughput computing

    International Nuclear Information System (INIS)

    Accion, E; Bria, A; Bernabeu, G; Caubet, M; Delfino, M; Espinal, X; Merino, G; Lopez, F; Martinez, F; Planas, E

    2012-01-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and the total volume of data continue increasing. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.

  18. Printing Proteins as Microarrays for High-Throughput Function Determination

    Science.gov (United States)

    MacBeath, Gavin; Schreiber, Stuart L.

    2000-09-01

    Systematic efforts are currently under way to construct defined sets of cloned genes for high-throughput expression and purification of recombinant proteins. To facilitate subsequent studies of protein function, we have developed miniaturized assays that accommodate extremely low sample volumes and enable the rapid, simultaneous processing of thousands of proteins. A high-precision robot designed to manufacture complementary DNA microarrays was used to spot proteins onto chemically derivatized glass slides at extremely high spatial densities. The proteins attached covalently to the slide surface yet retained their ability to interact specifically with other proteins, or with small molecules, in solution. Three applications for protein microarrays were demonstrated: screening for protein-protein interactions, identifying the substrates of protein kinases, and identifying the protein targets of small molecules.

  19. Precision lifetime measurements

    International Nuclear Information System (INIS)

    Tanner, C.E.

    1994-01-01

    Precision measurements of atomic lifetimes provide important information necessary for testing atomic theory. The authors employ resonant laser excitation of a fast atomic beam to measure excited state lifetimes by observing the decay-in-flight of the emitted fluorescence. A similar technique was used by Gaupp et al., who reported measurements with precisions of less than 0.2%. Their program includes lifetime measurements of the low lying p states in alkali and alkali-like systems. Motivation for this work comes from a need to test the atomic many-body-perturbation theory (MBPT) that is necessary for interpretation of parity nonconservation experiments in atomic cesium. The authors have measured the cesium 6p ²P₁/₂ and 6p ²P₃/₂ state lifetimes to be 34.934±0.094 ns and 30.499±0.070 ns respectively. With minor changes to the apparatus, they have extended their measurements to include the lithium 2p ²P₁/₂ and 2p ²P₃/₂ states.

  20. Fundamentals of precision medicine

    Science.gov (United States)

    Divaris, Kimon

    2018-01-01

    Imagine a world where clinicians make accurate diagnoses and provide targeted therapies to their patients according to well-defined, biologically-informed disease subtypes, accounting for individual differences in genetic make-up, behaviors, cultures, lifestyles and the environment. This is not as utopic as it may seem. Relatively recent advances in science and technology have led to an explosion of new information on what underlies health and what constitutes disease. These novel insights emanate from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, as well as epigenomics and exposomics—such ‘omics data can now be generated at unprecedented depth and scale, and at rapidly decreasing cost. Making sense and integrating these fundamental information domains to transform health care and improve health remains a challenge—an ambitious, laudable and high-yield goal. Precision dentistry is no longer a distant vision; it is becoming part of the rapidly evolving present. Insights from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, and epigenomics and exposomics have reached an unprecedented depth and scale. Much more needs to be done, however, for the realization of precision medicine in the oral health domain. PMID:29227115

  1. Precision measurements in supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Johnathan Lee [Stanford Univ., CA (United States)

    1995-05-01

    Supersymmetry is a promising framework in which to explore extensions of the standard model. If candidates for supersymmetric particles are found, precision measurements of their properties will then be of paramount importance. The prospects for such measurements and their implications are the subject of this thesis. If charginos are produced at the LEP II collider, they are likely to be one of the few available supersymmetric signals for many years. The author considers the possibility of determining fundamental supersymmetry parameters in such a scenario. The study is complicated by the dependence of observables on a large number of these parameters. He proposes a straightforward procedure for disentangling these dependences and demonstrates its effectiveness by presenting a number of case studies at representative points in parameter space. In addition to determining the properties of supersymmetric particles, precision measurements may also be used to establish that newly-discovered particles are, in fact, supersymmetric. Supersymmetry predicts quantitative relations among the couplings and masses of superparticles. The author discusses tests of such relations at a future e+e− linear collider, using measurements that exploit the availability of polarizable beams. Stringent tests of supersymmetry from chargino production are demonstrated in two representative cases, and fermion and neutralino processes are also discussed.

  2. Precision muon physics

    Science.gov (United States)

    Gorringe, T. P.; Hertzog, D. W.

    2015-09-01

    The muon is playing a unique role in sub-atomic physics. Studies of muon decay both determine the overall strength and establish the chiral structure of weak interactions, as well as setting extraordinary limits on charged-lepton-flavor-violating processes. Measurements of the muon's anomalous magnetic moment offer singular sensitivity to the completeness of the standard model and the predictions of many speculative theories. Spectroscopy of muonium and muonic atoms gives unmatched determinations of fundamental quantities including the magnetic moment ratio μμ/μp, lepton mass ratio mμ/me, and proton charge radius rp. Also, muon capture experiments are exploring elusive features of weak interactions involving nucleons and nuclei. We will review the experimental landscape of contemporary high-precision and high-sensitivity experiments with muons. One focus is the novel methods and ingenious techniques that achieve such precision and sensitivity in recent, present, and planned experiments. Another focus is the uncommonly broad and topical range of questions in atomic, nuclear and particle physics that such experiments explore.

  3. Precision Joining Center

    Science.gov (United States)

    Powell, J. W.; Westphal, D. A.

    1991-08-01

    A workshop to obtain input from industry on the establishment of the Precision Joining Center (PJC) was held on July 10-12, 1991. The PJC is a center for training Joining Technologists in advanced joining techniques and concepts in order to promote the competitiveness of U.S. industry. The center will be established as part of the DOE Defense Programs Technology Commercialization Initiative, and operated by EG&G Rocky Flats in cooperation with the American Welding Society and the Colorado School of Mines Center for Welding and Joining Research. The overall objectives of the workshop were to validate the need for a Joining Technologists to fill the gap between the welding operator and the welding engineer, and to assure that the PJC will train individuals to satisfy that need. The consensus of the workshop participants was that the Joining Technologist is a necessary position in industry, and is currently used, with some variation, by many companies. It was agreed that the PJC core curriculum, as presented, would produce a Joining Technologist of value to industries that use precision joining techniques. The advantage of the PJC would be to train the Joining Technologist much more quickly and more completely. The proposed emphasis of the PJC curriculum on equipment intensive and hands-on training was judged to be essential.

  4. Computer-determined assay time based on preset precision

    International Nuclear Information System (INIS)

    Foster, L.A.; Hagan, R.; Martin, E.R.; Wachter, J.R.; Bonner, C.A.; Malcom, J.E.

    1994-01-01

    Most current assay systems for special nuclear materials (SNM) operate on the principle of a fixed assay time which provides acceptable measurement precision without sacrificing the required throughput of the instrument. Waste items to be assayed for SNM content can contain a wide range of nuclear material. Counting all items for the same preset assay time results in a wide range of measurement precision and wastes time at the upper end of the calibration range. A short time sample taken at the beginning of the assay could optimize the analysis time on the basis of the required measurement precision. To illustrate the technique of automatically determining the assay time, measurements were made with a segmented gamma scanner at the Plutonium Facility of Los Alamos National Laboratory with the assay time for each segment determined by counting statistics in that segment. Segments with very little SNM were quickly determined to be below the lower limit of the measurement range and the measurement was stopped. Segments with significant SNM were optimally assayed to the preset precision. With this method the total assay time for each item is determined by the desired preset precision. This report describes the precision-based algorithm and presents the results of measurements made to test its validity.
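
    A minimal sketch of a precision-based stopping rule of this kind (Poisson counting statistics give a relative uncertainty of roughly 1/√N after N counts; the rates, thresholds, and time step below are illustrative, not the instrument's):

        import random

        def assay_segment(rate_cps, target_rel=0.01, floor_cps=0.5,
                          t_probe=5.0, t_max=600.0):
            """Count in 1 s steps until the preset relative precision is met,
            the segment is flagged below range, or the time limit is hit."""
            counts, t = 0, 0.0
            while t < t_max:
                # crude 1 s Poisson draw via 1000 Bernoulli trials
                counts += sum(random.random() < rate_cps / 1000 for _ in range(1000))
                t += 1.0
                if t >= t_probe and counts / t < floor_cps:
                    return counts, t, "below measurement range"
                if counts and counts ** -0.5 <= target_rel:
                    return counts, t, "preset precision reached"
            return counts, t, "time limit reached"

        print(assay_segment(50.0))   # ~10^4 counts needed for 1% -> stops near 200 s
        print(assay_segment(0.1))    # aborts right after the 5 s probe period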

  5. Precision Medicine in Cancer Treatment

    Science.gov (United States)

    Precision medicine helps doctors select cancer treatments that are most likely to help patients based on a genetic understanding of their disease. Learn about the promise of precision medicine and the role it plays in cancer treatment.

  6. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  7. Precision Medicine in Cardiovascular Diseases

    Directory of Open Access Journals (Sweden)

    Yan Liu

    2017-02-01

    Full Text Available Since President Obama announced the Precision Medicine Initiative in the United States, more and more attention has been paid to precision medicine. However, clinicians have already used it to treat conditions such as cancer. Many cardiovascular diseases have a familial presentation, and genetic variants are associated with the prevention, diagnosis, and treatment of cardiovascular diseases, which are the basis for providing precise care to patients with cardiovascular diseases. Large-scale cohorts and multiomics are critical components of precision medicine. Here we summarize the application of precision medicine to cardiovascular diseases based on cohort and omic studies, and hope to elicit discussion about future health care.

  8. Precisely predictable Dirac observables

    CERN Document Server

    Cordes, Heinz Otto

    2006-01-01

    This work presents a "Clean Quantum Theory of the Electron", based on Dirac’s equation. "Clean" in the sense of a complete mathematical explanation of the well known paradoxes of Dirac’s theory, and a connection to classical theory, including the motion of a magnetic moment (spin) in the given field, all for a charged particle (of spin ½) moving in a given electromagnetic field. This theory is relativistically covariant, and it may be regarded as a mathematically consistent quantum-mechanical generalization of the classical motion of such a particle, à la Newton and Einstein. Normally, our fields are time-independent, but also discussed is the time-dependent case, where slightly different features prevail. A "Schroedinger particle", such as a light quantum, experiences a very different (time-dependent) "Precise Predictablity of Observables". An attempt is made to compare both cases. There is not the Heisenberg uncertainty of location and momentum; rather, location alone possesses a built-in uncertainty ...

  9. Prompt and Precise Prototyping

    Science.gov (United States)

    2003-01-01

    For Sanders Design International, Inc., of Wilton, New Hampshire, every passing second between the concept and realization of a product is essential to succeed in the rapid prototyping industry where amongst heavy competition, faster time-to-market means more business. To separate itself from its rivals, Sanders Design aligned with NASA's Marshall Space Flight Center to develop what it considers to be the most accurate rapid prototyping machine for fabrication of extremely precise tooling prototypes. The company's Rapid ToolMaker System has revolutionized production of high quality, small-to-medium sized prototype patterns and tooling molds with an exactness that surpasses that of computer numerically-controlled (CNC) machining devices. Created with funding and support from Marshall under a Small Business Innovation Research (SBIR) contract, the Rapid ToolMaker is a dual-use technology with applications in both commercial and military aerospace fields. The advanced technology provides cost savings in the design and manufacturing of automotive, electronic, and medical parts, as well as in other areas of consumer interest, such as jewelry and toys. For aerospace applications, the Rapid ToolMaker enables fabrication of high-quality turbine and compressor blades for jet engines on unmanned air vehicles, aircraft, and missiles.

  10. Precisely Tracking Childhood Death.

    Science.gov (United States)

    Farag, Tamer H; Koplan, Jeffrey P; Breiman, Robert F; Madhi, Shabir A; Heaton, Penny M; Mundel, Trevor; Ordi, Jaume; Bassat, Quique; Menendez, Clara; Dowell, Scott F

    2017-07-01

    Little is known about the specific causes of neonatal and under-five childhood death in high-mortality geographic regions due to a lack of primary data and dependence on inaccurate tools, such as verbal autopsy. To meet the ambitious new Sustainable Development Goal 3.2 to eliminate preventable child mortality in every country, better approaches are needed to precisely determine specific causes of death so that prevention and treatment interventions can be strengthened and focused. Minimally invasive tissue sampling (MITS) is a technique that uses needle-based postmortem sampling, followed by advanced histopathology and microbiology to definitively determine cause of death. The Bill & Melinda Gates Foundation is supporting a new surveillance system called the Child Health and Mortality Prevention Surveillance network, which will determine cause of death using MITS in combination with other information, and yield cause-specific population-based mortality rates, eventually in up to 12-15 sites in sub-Saharan Africa and south Asia. However, the Gates Foundation funding alone is not enough. We call on governments, other funders, and international stakeholders to expand the use of pathology-based cause of death determination to provide the information needed to end preventable childhood mortality.

  11. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

    This was a poster displayed at the Symposium. Advances on previous high-throughput methods for screening biomass recalcitrance have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have reduced plate-to-plate variation in control biomass recalcitrance and improved confidence in sugar-release differences between samples. With smaller errors, plant researchers can identify low-recalcitrance candidates with a higher degree of assurance. Specific findings: reactor plate metallurgy significantly impacts sugar release, since aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity; removal of starch and extractives significantly decreases control biomass variability; new enzyme formulations give more consistent and higher conversion levels, although they required re-optimization for switchgrass; and pretreatment time and temperature (severity) should be adjusted to the specific biomass type, i.e., woody vs. herbaceous. Desalting of enzyme preps to remove low-molecular-weight stabilizers would likely improve conversion levels, due to water-activity impacts on enzyme structure and substrate interactions, but was not attempted here owing to the need to continually desalt and validate precise enzyme concentration and activity.

  12. Fast and sensitive detection of indels induced by precise gene targeting

    DEFF Research Database (Denmark)

    Yang, Zhang; Steentoft, Catharina; Hauge, Camilla

    2015-01-01

    The nuclease-based gene editing tools are rapidly transforming capabilities for altering the genome of cells and organisms with great precision and in high throughput studies. A major limitation in application of precise gene editing lies in lack of sensitive and fast methods to detect...... and characterize the induced DNA changes. Precise gene editing induces double-stranded DNA breaks that are repaired by error-prone non-homologous end joining leading to introduction of insertions and deletions (indels) at the target site. These indels are often small and difficult and laborious to detect...

  13. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M A; Fink, D; Hua, Q; Jacobsen, G E; Lawson, E M; Smith, A M; Tuniz, C [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as ¹⁴C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, ¹⁴C/¹³C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by ¹⁴C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure ¹⁴C with 0.5% precision. In the two years since becoming operational, more than 1000 ¹⁴C samples have been measured. Recent improvements in precision for ¹⁴C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
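
    The 0.5% figure quoted above is consistent with simple Poisson counting statistics (a back-of-the-envelope check, not a number taken from the paper):

        \frac{\sigma_N}{N} \approx \frac{1}{\sqrt{N}} = 0.005
        \quad\Longrightarrow\quad
        N \approx \frac{1}{0.005^{2}} = 4 \times 10^{4} \ \text{counts of } {}^{14}\mathrm{C}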

  14. Precise Truss Assembly Using Commodity Parts and Low Precision Welding

    Science.gov (United States)

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, W. R.; Correll, Nikolaus

    2014-01-01

    Hardware and software design and system integration for an intelligent precision jigging robot (IPJR), which allows high precision assembly using commodity parts and low-precision bonding, is described. Preliminary 2D experiments that are motivated by the problem of assembling space telescope optical benches and very large manipulators on orbit using inexpensive, stock hardware and low-precision welding are also described. An IPJR is a robot that acts as the precise "jigging", holding parts of a local structure assembly site in place, while an external low precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (for this prototype, a human using only low precision tools) to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. The analysis of the assembly error and the results of building a square structure and a ring structure are discussed. Options for future work to extend the IPJR paradigm to building 3D structures at micron precision are also summarized.

  15. [Precision nutrition in the era of precision medicine].

    Science.gov (United States)

    Chen, P Z; Wang, H

    2016-12-06

    Precision medicine has been increasingly incorporated into clinical practice and is enabling a new era for disease prevention and treatment. As an important constituent of precision medicine, precision nutrition has also been drawing more attention during physical examinations. The main aim of precision nutrition is to provide safe and efficient intervention methods for disease treatment and management, through fully considering the genetics, lifestyle (dietary, exercise and lifestyle choices), metabolic status, gut microbiota and physiological status (nutrient level and disease status) of individuals. Three major components should be considered in precision nutrition, including individual criteria for sufficient nutritional status, biomarker monitoring or techniques for nutrient detection and the applicable therapeutic or intervention methods. It was suggested that, in clinical practice, many inherited and chronic metabolic diseases might be prevented or managed through precision nutritional intervention. For generally healthy populations, because lifestyles, dietary factors, genetic factors and environmental exposures vary among individuals, precision nutrition is warranted to improve their physical activity and reduce disease risks. In summary, research and practice is leading toward precision nutrition becoming an integral constituent of clinical nutrition and disease prevention in the era of precision medicine.

  16. Precision medicine in myasthenia gravis: begin from the data precision

    Science.gov (United States)

    Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng

    2016-01-01

    Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. The data of MG is far from individually precise now, partially due to the rarity and heterogeneity of this disease. In this review, we provide the basic insights of MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity and response to treatment based on references and our previous studies. Subgroups and quantitative traits of MG are discussed in the sense of data precision. The role of disease registries and scientific bases of precise analysis are also discussed to ensure better collection and analysis of MG data. PMID:27127759

  17. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for
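
    Since the record mentions kinetic data for methyl orange decomposition, a minimal sketch (Python, hypothetical absorbance readings) of the pseudo-first-order fit such microplate assays typically use; the model choice here is an assumption, not taken from the paper:

    ```python
    # Minimal sketch (hypothetical readings): methyl orange decay is commonly
    # modeled as pseudo-first-order, C(t) = C0 * exp(-k t), so a linear fit of
    # ln(C0/C) versus t yields the rate constant k for each microplate well.
    import numpy as np

    t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 60.0])            # minutes
    absorbance = np.array([0.80, 0.62, 0.49, 0.38, 0.30, 0.18])  # MO at ~465 nm

    y = np.log(absorbance[0] / absorbance)
    k, intercept = np.polyfit(t, y, 1)        # slope = rate constant
    print(f"k = {k:.4f} 1/min (half-life = {np.log(2)/k:.1f} min)")
    ```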

  18. High throughput imaging cytometer with acoustic focussing.

    Science.gov (United States)

    Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter

    2015-10-31

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.
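
    The quoted numbers imply roughly 2,600 beads per frame; a minimal back-of-envelope sketch (Python), with the motion-blur inputs marked as assumptions:

    ```python
    # Minimal sketch: back-of-envelope numbers from the abstract. At 80 frames/s,
    # a throughput of 208,000 beads/s implies ~2,600 beads imaged per frame, and
    # residual motion blur is roughly (cell speed * exposure) / pixel size times
    # whatever fraction of the motion the galvo mirror tracking fails to cancel.
    frame_rate_hz = 80.0
    throughput_per_s = 208_000.0
    print(f"beads per frame: {throughput_per_s / frame_rate_hz:.0f}")  # ~2600

    # Hypothetical blur estimate (illustrative values, not from the paper):
    flow_speed_um_s = 5_000.0    # particle speed in the channel (assumed)
    exposure_s = 0.010           # 10 ms exposure as quoted
    pixel_um = 1.0               # object-plane pixel size (assumed)
    tracking_residual = 0.005    # fraction of motion NOT cancelled by the galvo (assumed)
    blur_px = flow_speed_um_s * exposure_s * tracking_residual / pixel_um
    print(f"residual blur: {blur_px:.2f} px")
    ```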

  19. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  1. A programmable, scalable-throughput interleaver

    NARCIS (Netherlands)

    Rijshouwer, E.J.C.; Berkel, van C.H.

    2010-01-01

    The interleaver stages of digital communication standards show a surprisingly large variation in throughput, state sizes, and permutation functions. Furthermore, data rates for 4G standards such as LTE-Advanced will exceed typical baseband clock frequencies of handheld devices. Multistream operation

  2. HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.

    Science.gov (United States)

    Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael

    2017-01-01

    Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancements in high-throughput omics profiling technologies, collections of large study cohorts, and the developments of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, petabytes of biomedical data generated by multiple measurement modalities poses a significant challenge for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, as well as discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine.Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.

  3. Pharmacometabolomics informs viromics towards precision medicine

    Directory of Open Access Journals (Sweden)

    Aggeliki Balasopoulou

    2016-10-01

    Full Text Available Nowadays, we are experiencing the big data era with the emerging challenge of single data interpretation. Although the advent of high-throughput technologies as well as chemo- and bioinformatics tools presents pan-omics data as the way forward to precision medicine, personalized health care and tailor-made therapeutics can only be envisaged when interindividual variability in response to/toxicity of xenobiotics can be interpreted and thus, predicted. We know that such variability is the net outcome of genetics (host and microbiota) and environmental factors (diet, lifestyle, polypharmacy, microbiota) and for this, tremendous efforts have been made to clarify key-molecules from correlation to causality to clinical significance. Herein, we focus on the host-microbiome interplay and its direct and indirect impact on efficacy and toxicity of xenobiotics and we inevitably wonder about the role of viruses, as the least acknowledged ones. We present the emerging discipline of pharmacometabolomics-informed viromics, in which pre-dose metabotypes can assist modeling and prediction of interindividual response to/toxicity of xenobiotics. Such features, either alone or in combination with host genetics, can power biomarker discovery so long as the features are variable among patients, stable enough to be of predictive value, and better than pre-existing tools for predicting therapeutic efficacy/toxicity.

  4. High throughput screening method for assessing heterogeneity of microorganisms

    NARCIS (Netherlands)

    Ingham, C.J.; Sprenkels, A.J.; van Hylckama Vlieg, J.E.T.; Bomer, Johan G.; de Vos, W.M.; van den Berg, Albert

    2006-01-01

    The invention relates to the field of microbiology. Provided is a method which is particularly powerful for High Throughput Screening (HTS) purposes. More specifically, a high throughput method for determining heterogeneity or interactions of microorganisms is provided.

  5. Application of ToxCast High-Throughput Screening and ...

    Science.gov (United States)

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  6. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  7. Quantitative high throughput analytics to support polysaccharide production process development.

    Science.gov (United States)

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  8. MEASUREMENT AND PRECISION, EXPERIMENTAL VERSION.

    Science.gov (United States)

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    This document is an experimental version of a programmed text on measurement and precision. Part I contains 24 frames dealing with precision and significant figures encountered in various mathematical computations and measurements. Part II begins with a brief section on experimental data, covering such points as (1) establishing the zero point, (2)…

  9. Fabrication of combinatorial nm-planar electrode array for high throughput evaluation of organic semiconductors

    International Nuclear Information System (INIS)

    Haemori, M.; Edura, T.; Tsutsui, K.; Itaka, K.; Wada, Y.; Koinuma, H.

    2006-01-01

    We have fabricated a combinatorial nm-planar electrode array by using photolithography and chemical mechanical polishing processes for high throughput electrical evaluation of organic devices. Sub-nm precision was achieved with respect to the average level difference between each pair of electrodes and a dielectric layer. The insulating property between the electrodes is high enough to measure I-V characteristics of organic semiconductors. Bottom-contact field-effect-transistors (FETs) of pentacene were fabricated on this electrode array by use of molecular beam epitaxy. It was demonstrated that the array could be used as a pre-patterned device substrate for high throughput screening of the electrical properties of organic semiconductors

  10. Clinical proteomics-driven precision medicine for targeted cancer therapy: current overview and future perspectives.

    Science.gov (United States)

    Zhou, Li; Wang, Kui; Li, Qifu; Nice, Edouard C; Zhang, Haiyuan; Huang, Canhua

    2016-01-01

    Cancer is a common disease that is a leading cause of death worldwide. Currently, early detection and novel therapeutic strategies are urgently needed for more effective management of cancer. Importantly, protein profiling using clinical proteomic strategies, with spectacular sensitivity and precision, offers excellent promise for the identification of potential biomarkers that would direct the development of targeted therapeutic anticancer drugs for precision medicine. In particular, clinical sample sources, including tumor tissues and body fluids (blood, feces, urine and saliva), have been widely investigated using modern high-throughput mass spectrometry-based proteomic approaches combined with bioinformatic analysis, to pursue the possibilities of precision medicine for targeted cancer therapy. Discussed in this review are the current advantages and limitations of clinical proteomics, the available strategies of clinical proteomics for the management of precision medicine, as well as the challenges and future perspectives of clinical proteomics-driven precision medicine for targeted cancer therapy.

  11. Precision medicine for nurses: 101.

    Science.gov (United States)

    Lemoine, Colleen

    2014-05-01

    To introduce the key concepts and terms associated with precision medicine and support understanding of future developments in the field by providing an overview and history of precision medicine, related ethical considerations, and nursing implications. Current nursing, medical and basic science literature. Rapid progress in understanding the oncogenic drivers associated with cancer is leading to a shift toward precision medicine, where treatment is based on targeting specific genetic and epigenetic alterations associated with a particular cancer. Nurses will need to embrace the paradigm shift to precision medicine, expend the effort necessary to learn the essential terminology, concepts and principles, and work collaboratively with physician colleagues to best position our patients to maximize the potential that precision medicine can offer. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Achieving high data throughput in research networks

    International Nuclear Information System (INIS)

    Matthews, W.; Cottrell, L.

    2001-01-01

    After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3. So within a few years the SLAC OC3 (155 Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to infrastructure are essential, and detailed understanding of performance issues and the requirements for reliable high throughput transfers is critical. In this talk results from active and passive monitoring and direct measurements of throughput will be reviewed. Methods for achieving the ambitious requirements will be discussed
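
    A minimal sketch (Python) of the arithmetic behind the link-saturation claim, assuming decimal units and transfers spread evenly over each year:

    ```python
    # Minimal sketch: why the OC3 link saturates. With the dataset doubling each
    # year and a third exported to IN2P3, the required average rate soon exceeds
    # the 155 Mbps line rate (and real TCP throughput is lower still).
    SECONDS_PER_YEAR = 365 * 24 * 3600
    oc3_mbps = 155.0

    dataset_tb = 165.0
    for year in range(1, 5):
        dataset_tb *= 2                       # data collection doubles yearly
        export_tb = dataset_tb / 3            # a third goes to IN2P3
        needed_mbps = export_tb * 8e6 / SECONDS_PER_YEAR  # TB -> megabits, per second
        print(f"year {year}: export {export_tb:.0f} TB -> "
              f"{needed_mbps:.0f} Mbps average (line rate {oc3_mbps:.0f} Mbps)")
    ```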

  13. High throughput salt separation from uranium deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S.W.; Park, K.M.; Kim, J.G.; Kim, I.T.; Park, S.B., E-mail: swkwon@kaeri.re.kr [Korea Atomic Energy Research Inst. (Korea, Republic of)

    2014-07-01

    It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites in pyroprocessing. A multilayer porous crucible system was proposed in this study to increase the throughput of the salt distiller. An integrated sieve-crucible assembly was also investigated for the practical use of the porous crucible system. The salt evaporation behaviors were compared between the conventional nonporous crucible and the porous crucible. Two step weight reductions took place in the porous crucible, whereas the salt weight was reduced only at high temperature by distillation in the nonporous crucible. The first weight reduction in the porous crucible was caused by liquid salt penetrating out through the perforated crucible as the temperature was raised to the distillation temperature. Multilayer porous crucibles offer the benefit of an expanded evaporation surface area. (author)

  14. Implicit Consensus: Blockchain with Unbounded Throughput

    OpenAIRE

    Ren, Zhijie; Cong, Kelong; Pouwelse, Johan; Erkin, Zekeriya

    2017-01-01

    Recently, the blockchain technique was put in the spotlight as it introduced a systematic approach for multiple parties to reach consensus without needing trust. However, the application of this technique in practice is severely restricted due to its limitations in throughput. In this paper, we propose a novel consensus model, namely the implicit consensus, with a distinctive blockchain-based distributed ledger in which each node holds its individual blockchain. In our system, the consensus i...

  15. Advanced bioanalytics for precision medicine.

    Science.gov (United States)

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics, and new imaging technique applications suitable for implementing precision medicine. Graphical abstract Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  16. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high dimensional neuroinformatic representations, an index containing O(10^3-10^4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  17. Precision Oncology: Between Vaguely Right and Precisely Wrong.

    Science.gov (United States)

    Brock, Amy; Huang, Sui

    2017-12-01

    Precision Oncology seeks to identify and target the mutation that drives a tumor. Despite its straightforward rationale, concerns about its effectiveness are mounting. What is the biological explanation for the "imprecision?" First, Precision Oncology relies on indiscriminate sequencing of genomes in biopsies that barely represent the heterogeneous mix of tumor cells. Second, findings that defy the orthodoxy of oncogenic "driver mutations" are now accumulating: the ubiquitous presence of oncogenic mutations in silent premalignancies or the dynamic switching without mutations between various cell phenotypes that promote progression. Most troublesome is the observation that cancer cells that survive treatment still will have suffered cytotoxic stress and thereby enter a stem cell-like state, the seeds for recurrence. The benefit of "precision targeting" of mutations is inherently limited by this counterproductive effect. These findings confirm that there is no precise linear causal relationship between tumor genotype and phenotype, a reminder of logician Carveth Read's caution that being vaguely right may be preferable to being precisely wrong. An open-minded embrace of the latest inconvenient findings indicating nongenetic and "imprecise" phenotype dynamics of tumors as summarized in this review will be paramount if Precision Oncology is ultimately to lead to clinical benefits. Cancer Res; 77(23); 6473-9. ©2017 American Association for Cancer Research.

  18. Numerical precision control and GRACE

    International Nuclear Information System (INIS)

    Fujimoto, J.; Hamaguchi, N.; Ishikawa, T.; Kaneko, T.; Morita, H.; Perret-Gallix, D.; Tokura, A.; Shimizu, Y.

    2006-01-01

    The control of the numerical precision of large-scale computations like those generated by the GRACE system for automatic Feynman diagram calculations has become an intrinsic part of those packages. Recently, Hitachi Ltd. has developed in FORTRAN a new library HMLIB for quadruple and octuple precision arithmetic where the number of lost-bits is made available. This library has been tested with success on the 1-loop radiative correction to e⁺e⁻ → e⁺e⁻τ⁺τ⁻. It is shown that the approach followed by HMLIB provides an efficient way to track down the source of numerical significance losses and to deliver high-precision results yet minimizing computing time
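
    HMLIB itself is FORTRAN and reports lost bits directly; as a rough stand-in, a minimal Python sketch using mpmath that estimates significance loss by re-evaluating a cancellation-prone expression at higher precision (the expression and both precision settings are illustrative assumptions, not HMLIB's mechanism):

    ```python
    # Minimal sketch (not HMLIB): one common way to track significance loss is
    # to re-evaluate a cancellation-prone expression at elevated precision and
    # count the digits that survive at working precision.
    from mpmath import mp, mpf, cos, fabs, log10

    def f(x):
        # Catastrophic cancellation for small x: 1 - cos(x) ~= x^2 / 2.
        return 1 - cos(x)

    mp.dps = 16                 # double-like working precision
    x = mpf("1e-6")
    low = f(x)

    mp.dps = 34                 # quadruple-like reference precision
    ref = f(x)

    rel_err = fabs(low - ref) / fabs(ref)
    digits_retained = -log10(rel_err)
    print(f"working result:  {low}")
    print(f"reference value: {ref}")
    print(f"correct digits kept: ~{float(digits_retained):.1f} of 16")
    ```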

  19. Lung Cancer Precision Medicine Trials

    Science.gov (United States)

    Patients with lung cancer are benefiting from the boom in targeted and immune-based therapies. With a series of precision medicine trials, NCI is keeping pace with the rapidly changing treatment landscape for lung cancer.

  1. Precision engineering: an evolutionary perspective.

    Science.gov (United States)

    Evans, Chris J

    2012-08-28

    Precision engineering is a relatively new name for a technology with roots going back over a thousand years; those roots span astronomy, metrology, fundamental standards, manufacturing and money-making (literally). Throughout that history, precision engineers have created links across disparate disciplines to generate innovative responses to society's needs and wants. This review combines historical and technological perspectives to illuminate precision engineering's current character and directions. It first provides us a working definition of precision engineering and then reviews the subject's roots. Examples will be given showing the contributions of the technology to society, while simultaneously showing the creative tension between the technological convergence that spurs new directions and the vertical disintegration that optimizes manufacturing economics.

  2. How GNSS Enables Precision Farming

    Science.gov (United States)

    2014-12-01

    Precision farming: feeding a growing population; enabling those who feed the world. Immediate and ongoing needs: population growth (more to feed) and urbanization (decrease in arable land). Food production must double by 2050 to meet world demand. To meet thi...

  3. FROM PERSONALIZED TO PRECISION MEDICINE

    Directory of Open Access Journals (Sweden)

    K. V. Raskina

    2017-01-01

    Full Text Available The need to maintain a high quality of life against a backdrop of its inevitably increasing duration is one of the main problems of modern health care. The concept of "right drug to the right patient at the right time", which initially bore the name "personalized", is now unanimously endorsed by the international scientific community as "precision medicine". Precision medicine takes all individual characteristics into account: genetic diversity, environment, lifestyle, and even bacterial microflora, and also involves the use of the latest technological developments, which serve to ensure that each patient gets the assistance that best fits their state. In the United States, Canada and France, national precision medicine programs have already been submitted and implemented. The aim of this review is to describe the dynamic integration of precision medicine methods into routine medical practice and the life of modern society. The description of the new paradigm's prospects is complemented by figures proving the success already achieved in the application of precise methods, for example in the targeted therapy of cancer. All in all, the presence of real-life examples proving the regularity of the transition to a new paradigm, and the wide range of technical and diagnostic capabilities available and constantly evolving, make the all-round transition to precision medicine almost inevitable.

  4. Probabilistic Coexistence and Throughput of Cognitive Dual-Polarized Networks

    Directory of Open Access Journals (Sweden)

    J.-M. Dricot

    2010-01-01

    Full Text Available Diversity techniques for cognitive radio networks are important since they enable the primary and secondary terminals to efficiently share the spectral resources in the same location simultaneously. In this paper, we investigate a simple, yet powerful, diversity scheme by exploiting the polarimetric dimension. More precisely, we evaluate a scenario where the cognitive terminals use cross-polarized communications with respect to the primary users. Our approach is network-centric, that is, the performance of the proposed dual-polarized system is investigated in terms of link throughput in the primary and the secondary networks. In order to carry out this analysis, we impose a probabilistic coexistence constraint derived from an information-theoretic approach, that is, we enforce a guaranteed capacity for a primary terminal for a high fraction of time. Improvements brought about by the use of our scheme are demonstrated analytically and through simulations. In particular, the main simulation parameters are extracted from a measurement campaign dedicated to the characterization of indoor-to-indoor and outdoor-to-indoor polarization behaviors. Our results suggest that the polarimetric dimension represents a remarkable opportunity, yet easily implementable, in the context of cognitive radio networks.
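
    A minimal Monte Carlo sketch (Python) of the probabilistic coexistence constraint described above; all link parameters (SNR, interference level, cross-polar discrimination, capacity floor) are illustrative assumptions, not values from the paper:

    ```python
    # Minimal sketch (illustrative parameters, not the paper's model): the
    # coexistence constraint asks that the primary link capacity
    # C = log2(1 + SINR) stay above a guaranteed value C_min for a fraction
    # (1 - eps) of fading realizations. Cross-polarized secondary transmissions
    # are attenuated by the cross-polar discrimination (XPD) leakage factor.
    import math
    import random

    random.seed(1)

    def rayleigh_power():
        # Exponentially distributed power gain (unit-mean Rayleigh fading).
        return random.expovariate(1.0)

    snr_primary = 100.0     # mean primary SNR (20 dB), assumed
    inr_secondary = 10.0    # mean secondary interference-to-noise ratio, assumed
    xpd_leakage = 0.05      # only 5% of cross-polarized interference couples, assumed
    c_min = 2.0             # guaranteed capacity, bits/s/Hz, assumed
    trials = 100_000

    ok = 0
    for _ in range(trials):
        sinr = (snr_primary * rayleigh_power()) / (
            1.0 + xpd_leakage * inr_secondary * rayleigh_power())
        if math.log2(1.0 + sinr) >= c_min:
            ok += 1

    print(f"Pr[C >= {c_min}] = {ok / trials:.3f}")  # compare against 1 - eps
    ```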

  5. [Morphometry of pulmonary tissue: From manual to high throughput automation].

    Science.gov (United States)

    Sallon, C; Soulet, D; Tremblay, Y

    2017-12-01

    Weibel's research has shown that any alteration of the pulmonary structure has effects on function. This demonstration required a quantitative analysis of lung structures called morphometry. This is possible thanks to stereology, a set of methods based on principles of geometry and statistics. His work has helped to better understand the morphological harmony of the lung, which is essential for its proper functioning. An imbalance leads to pathophysiology such as chronic obstructive pulmonary disease in adults and bronchopulmonary dysplasia in neonates. It is by studying this imbalance that new therapeutic approaches can be developed. These advances are achievable only through morphometric analytical methods, which are increasingly precise and focused, in particular thanks to the high-throughput automation of these methods. This review makes a comparison between an automated method that we developed in the laboratory and semi-manual methods of morphometric analyzes. The automation of morphometric measurements is a fundamental asset in the study of pulmonary pathophysiology because it is an assurance of robustness, reproducibility and speed. This tool will thus contribute significantly to the acceleration of the race for the development of new drugs. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.

  6. Using high-throughput barcode sequencing to efficiently map connectomes.

    Science.gov (United States)

    Peikon, Ian D; Kebschull, Justus M; Vagin, Vasily V; Ravens, Diana I; Sun, Yu-Chi; Brouzes, Eric; Corrêa, Ivan R; Bressan, Dario; Zador, Anthony M

    2017-07-07

    The function of a neural circuit is determined by the details of its synaptic connections. At present, the only available method for determining a neural wiring diagram with single synapse precision (a 'connectome') is based on imaging methods that are slow, labor-intensive and expensive. Here, we present SYNseq, a method for converting the connectome into a form that can exploit the speed and low cost of modern high-throughput DNA sequencing. In SYNseq, each neuron is labeled with a unique random nucleotide sequence (an RNA 'barcode') which is targeted to the synapse using engineered proteins. Barcodes in pre- and postsynaptic neurons are then associated through protein-protein crosslinking across the synapse, extracted from the tissue, and joined into a form suitable for sequencing. Although our failure to develop an efficient barcode joining scheme precludes the widespread application of this approach, we expect that with further development SYNseq will enable tracing of complex circuits at high speed and low cost. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
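
    A minimal sketch (Python) of why random barcodes of this kind are effectively unique, using birthday-problem arithmetic; the 30-nt length and the population sizes are assumptions for illustration:

    ```python
    # Minimal sketch: a random barcode of length 30 nt is drawn from
    # K = 4**30 possible sequences, so for N labeled neurons the probability
    # of any two sharing a barcode is ~ 1 - exp(-N^2 / (2K)) (birthday problem).
    import math

    barcode_len = 30
    K = 4 ** barcode_len           # ~1.2e18 distinct sequences
    for n_neurons in (1e6, 1e8):   # illustrative population sizes
        p_collision = 1.0 - math.exp(-n_neurons**2 / (2 * K))
        print(f"N = {n_neurons:.0e}: P(any collision) ~ {p_collision:.2e}")
    ```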

  7. Personalized In Vitro and In Vivo Cancer Models to Guide Precision Medicine | Office of Cancer Genomics

    Science.gov (United States)

    Precision medicine is an approach that takes into account the influence of individuals' genes, environment, and lifestyle exposures to tailor interventions. Here, we describe the development of a robust precision cancer care platform that integrates whole-exome sequencing with a living biobank that enables high-throughput drug screens on patient-derived tumor organoids. To date, 56 tumor-derived organoid cultures and 19 patient-derived xenograft (PDX) models have been established from the 769 patients enrolled in an Institutional Review Board-approved clinical trial.

  8. Toward high throughput optical metamaterial assemblies.

    Science.gov (United States)

    Fontana, Jake; Ratna, Banahalli R

    2015-11-01

    Optical metamaterials have unique engineered optical properties. These properties arise from the careful organization of plasmonic elements. Transitioning these properties from laboratory experiments to functional materials may lead to disruptive technologies for controlling light. A significant issue impeding the realization of optical metamaterial devices is the need for robust and efficient assembly strategies to govern the order of the nanometer-sized elements while enabling macroscopic throughput. This mini-review critically highlights recent approaches and challenges in creating these artificial materials. As the ability to assemble optical metamaterials improves, new unforeseen opportunities may arise for revolutionary optical devices.

  9. (Super Variable Costing-Throughput Costing)

    OpenAIRE

    Çakıcı, Cemal

    2006-01-01

    (Super Variable Costing-Throughput Costing) The aim of this study is to explain the super-variable costing method, which is a new subject in cost and management accounting, and to show how it works in practice. Briefly, super-variable costing can be defined as a costing method that uses only direct material costs in calculating product costs and treats all other costs (direct labor and overhead) as period costs or operating costs. By using the super-variable costing method, product costs ar...

  10. Nanomaterials for Cancer Precision Medicine.

    Science.gov (United States)

    Wang, Yilong; Sun, Shuyang; Zhang, Zhiyuan; Shi, Donglu

    2018-04-01

    Medical science has recently advanced to the point where diagnosis and therapeutics can be carried out with high precision, even at the molecular level. A new field of "precision medicine" has consequently emerged with specific clinical implications and challenges that can be well-addressed by newly developed nanomaterials. Here, a nanoscience approach to precision medicine is provided, with a focus on cancer therapy, based on a new concept of "molecularly-defined cancers." "Next-generation sequencing" is introduced to identify the oncogene that is responsible for a class of cancers. This new approach is fundamentally different from all conventional cancer therapies that rely on diagnosis of the anatomic origins where the tumors are found. To treat cancers at molecular level, a recently developed "microRNA replacement therapy" is applied, utilizing nanocarriers, in order to regulate the driver oncogene, which is the core of cancer precision therapeutics. Furthermore, the outcome of the nanomediated oncogenic regulation has to be accurately assessed by the genetically characterized, patient-derived xenograft models. Cancer therapy in this fashion is a quintessential example of precision medicine, presenting many challenges to the materials communities with new issues in structural design, surface functionalization, gene/drug storage and delivery, cell targeting, and medical imaging. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Precision Medicine and Men's Health.

    Science.gov (United States)

    Mata, Douglas A; Katchi, Farhan M; Ramasamy, Ranjith

    2017-07-01

    Precision medicine can greatly benefit men's health by helping to prevent, diagnose, and treat prostate cancer, benign prostatic hyperplasia, infertility, hypogonadism, and erectile dysfunction. For example, precision medicine can facilitate the selection of men at high risk for prostate cancer for targeted prostate-specific antigen screening and chemoprevention administration, as well as assist in identifying men who are resistant to medical therapy for prostatic hyperplasia, who may instead require surgery. Precision medicine-trained clinicians can also let couples know whether their specific cause of infertility should be bypassed by sperm extraction and in vitro fertilization to prevent abnormalities in their offspring. Though precision medicine's role in the management of hypogonadism has yet to be defined, it could be used to identify biomarkers associated with individual patients' responses to treatment so that appropriate therapy can be prescribed. Last, precision medicine can improve erectile dysfunction treatment by identifying genetic polymorphisms that regulate response to medical therapies and by aiding in the selection of patients for further cardiovascular disease screening.

  12. Precision Medicine in Gastrointestinal Pathology.

    Science.gov (United States)

    Wang, David H; Park, Jason Y

    2016-05-01

    Precision medicine is the promise of individualized therapy and management of patients based on their personal biology. There are now multiple global initiatives to perform whole-genome sequencing on millions of individuals. In the United States, an early program was the Million Veteran Program, and a more recent proposal in 2015 by the president of the United States is the Precision Medicine Initiative. To implement precision medicine in routine oncology care, genetic variants present in tumors need to be matched with effective clinical therapeutics. When we focus on the current state of precision medicine for gastrointestinal malignancies, it becomes apparent that there is a mixed history of success and failure. This review presents the current state of precision medicine using gastrointestinal oncology as a model: currently available targeted therapeutics, promising new findings in clinical genomic oncology, remaining quality issues in genomic testing, and emerging oncology clinical trial designs, drawing on clinical genomic studies on gastrointestinal malignancies, clinical oncology trials on therapeutics targeted to molecular alterations, and emerging clinical oncology study designs. Translating our ability to sequence thousands of genes into meaningful improvements in patient survival will be the challenge for the next decade.

  13. Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.

    Science.gov (United States)

    Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin

    2016-02-01

    High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.

  14. Optical tools for high-throughput screening of abrasion resistance of combinatorial libraries of organic coatings

    Science.gov (United States)

    Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.

    2002-02-01

    Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8x6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coatings performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.

  15. High-throughput gene expression profiling of memory differentiation in primary human T cells

    Directory of Open Access Journals (Sweden)

    Russell Kate

    2008-08-01

    Full Text Available Abstract Background The differentiation of naive T and B cells into memory lymphocytes is essential for immunity to pathogens. Therapeutic manipulation of this cellular differentiation program could improve vaccine efficacy and the in vitro expansion of memory cells. However, chemical screens to identify compounds that induce memory differentiation have been limited by (1) the lack of reporter-gene or functional assays that can distinguish naive and memory-phenotype T cells at high throughput and (2) the lack of a suitable cell line representative of naive T cells. Results Here, we describe a method for gene-expression based screening that allows primary naive and memory-phenotype lymphocytes to be discriminated based on complex gene signatures corresponding to these differentiation states. We used ligation-mediated amplification and a fluorescent, bead-based detection system to quantify simultaneously 55 transcripts representing naive and memory-phenotype signatures in purified populations of human T cells. The use of a multi-gene panel allowed better resolution than any constituent single gene. The method was precise, correlated well with Affymetrix microarray data, and could be easily scaled up for high-throughput. Conclusion This method provides a generic solution for high-throughput differentiation screens in primary human T cells where no single-gene or functional assay is available. This screening platform will allow the identification of small molecules, genes or soluble factors that direct memory differentiation in naive human lymphocytes.
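
    A minimal sketch (Python) of the multi-gene signature idea: the gene names and intensities below are hypothetical placeholders, and the simple z-score difference is an illustrative scoring choice, not the authors' exact statistic:

    ```python
    # Minimal sketch (hypothetical genes/values): with a panel measuring naive-
    # and memory-associated transcripts, one simple way to call a population is
    # the difference of mean z-scored expression between the two signature sets.
    import statistics

    # Hypothetical bead-assay intensities for one sorted T-cell sample.
    expression = {"CCR7": 9.1, "SELL": 8.7, "TCF7": 8.9,    # naive-associated
                  "KLRG1": 3.2, "GZMB": 2.9, "PRDM1": 3.5}  # memory-associated
    naive_genes = ["CCR7", "SELL", "TCF7"]
    memory_genes = ["KLRG1", "GZMB", "PRDM1"]

    values = list(expression.values())
    mu, sd = statistics.mean(values), statistics.stdev(values)
    z = {gene: (v - mu) / sd for gene, v in expression.items()}

    score = (statistics.mean(z[g] for g in naive_genes)
             - statistics.mean(z[g] for g in memory_genes))
    print(f"naive-vs-memory score: {score:+.2f}")  # positive -> naive-like
    ```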

  16. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    Science.gov (United States)

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on-time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96 well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on-time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established
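
    A minimal sketch (Python, hypothetical calibration points) of what gravimetric calibration of a fluorescence assay amounts to: fit signal against the extractive reference, then invert the fit for unknowns:

    ```python
    # Minimal sketch (hypothetical data): gravimetric calibration of the Nile
    # red signal means fitting fluorescence against lipid content measured by
    # an extractive reference method, then inverting the fit for unknowns.
    import numpy as np

    # Calibration standards: extractive lipid content vs. Nile red fluorescence.
    lipid_mg_per_l = np.array([0.0, 25.0, 50.0, 100.0, 200.0])       # gravimetric
    fluorescence = np.array([120.0, 980.0, 1890.0, 3750.0, 7400.0])  # a.u. (hypothetical)

    slope, offset = np.polyfit(lipid_mg_per_l, fluorescence, 1)

    def lipid_from_signal(f):
        """Invert the linear calibration to get absolute lipid concentration."""
        return (f - offset) / slope

    print(f"unknown well at 2500 a.u. -> {lipid_from_signal(2500.0):.0f} mg/L lipid")
    ```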

  17. Sensing technologies for precision irrigation

    CERN Document Server

    Ćulibrk, Dubravko; Minic, Vladan; Alonso Fernandez, Marta; Alvarez Osuna, Javier; Crnojevic, Vladimir

    2014-01-01

    This brief provides an overview of state-of-the-art sensing technologies relevant to the problem of precision irrigation, an emerging field within the domain of precision agriculture. Applications of wireless sensor networks, satellite data and geographic information systems in the domain are covered. This brief presents the basic concepts of the technologies and emphasizes the practical aspects that enable the implementation of intelligent irrigation systems. The authors target a broad audience interested in this theme and organize the content in five chapters, each concerned with a specific technology needed to address the problem of optimal crop irrigation. Professionals and researchers will find the text a thorough survey with practical applications.

  18. Precision measurement with atom interferometry

    International Nuclear Information System (INIS)

    Wang Jin

    2015-01-01

    Development of atom interferometry and its application in precision measurement are reviewed in this paper. The principle, features and the implementation of atom interferometers are introduced, the recent progress of precision measurement with atom interferometry, including determination of gravitational constant and fine structure constant, measurement of gravity, gravity gradient and rotation, test of weak equivalence principle, proposal of gravitational wave detection, and measurement of quadratic Zeeman shift are reviewed in detail. Determination of gravitational redshift, new definition of kilogram, and measurement of weak force with atom interferometry are also briefly introduced. (topical review)

  19. ELECTROWEAK PHYSICS AND PRECISION STUDIES

    International Nuclear Information System (INIS)

    MARCIANO, W.

    2005-01-01

    The utility of precision electroweak measurements for predicting the Standard Model Higgs mass via quantum loop effects is discussed. Current values of m_W, sin²θ_W(m_Z) in the MS-bar scheme, and m_t imply a relatively light Higgs which is below the direct experimental bound but possibly consistent with Supersymmetry expectations. The existence of Supersymmetry is further suggested by a 2σ discrepancy between experiment and theory for the muon anomalous magnetic moment. Constraints from precision studies on other types of "New Physics" are also briefly described

  20. Universal precision sine bar attachment

    Science.gov (United States)

    Mann, Franklin D. (Inventor)

    1989-01-01

    This invention relates to an attachment for a sine bar which can be used to perform measurements during lathe operations or other types of machining operations. The attachment can be used for setting precision angles on vises, dividing heads, rotary tables and angle plates. It can also be used in the inspection of machined parts, when close tolerances are required, and in the layout of precision hardware. The novelty of the invention is believed to reside in a specific versatile sine bar attachment for measuring a variety of angles on a number of different types of equipment.
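
    For readers unfamiliar with sine bars, a minimal worked example (Python) of the underlying relationship, with a common 5-inch bar assumed for illustration:

    ```python
    # Minimal sketch: the sine bar relationship behind setting a precision
    # angle. With gauge blocks of height h under one roller of a sine bar of
    # length L (roller center to roller center), the bar tilts to asin(h / L).
    import math

    L = 5.0             # sine bar length, inches (common size, assumed)
    theta_deg = 30.0    # desired angle

    h = L * math.sin(math.radians(theta_deg))
    print(f"stack gauge blocks to h = {h:.4f} in for {theta_deg} degrees")  # 2.5000 in
    ```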

  1. Introduction to precise numerical methods

    CERN Document Server

    Aberth, Oliver

    2007-01-01

    Precise numerical analysis may be defined as the study of computer methods for solving mathematical problems either exactly or to prescribed accuracy. This book explains how precise numerical analysis is constructed. The book also provides exercises which illustrate points from the text and references for the methods presented. All disc-based content for this title is now available on the Web. · Clearer, simpler descriptions and explanations of the various numerical methods · Two new types of numerical problems: accurately solving partial differential equations with the included software and computing line integrals in the complex plane.

  2. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination, which are highly indicative of such traits as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of experimental setup with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousands of germination tests, several times a day by a single person.
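
    A minimal sketch (Python) of the curve-fitting step mentioned above; the Hill-type function and the hypothetical counts are illustrative assumptions, not GERMINATOR's exact implementation:

    ```python
    # Minimal sketch (hypothetical counts): fitting a Hill-type curve to
    # cumulative germination, then reading off the parameters the text
    # mentions: maximum germination, rate (via t50), and uniformity (t84-t16).
    import numpy as np
    from scipy.optimize import brentq, curve_fit

    def hill(t, gmax, t50, steep):
        return gmax * t**steep / (t50**steep + t**steep)

    hours = np.array([12, 24, 36, 48, 60, 72, 96, 120], dtype=float)
    germinated_pct = np.array([0, 5, 22, 55, 78, 88, 93, 94], dtype=float)

    (gmax, t50, steep), _ = curve_fit(hill, hours, germinated_pct, p0=[95, 48, 5])

    def time_at(frac):
        # Time at which a given fraction of gmax has germinated.
        return brentq(lambda t: hill(t, gmax, t50, steep) - frac * gmax, 1, 500)

    u8416 = time_at(0.84) - time_at(0.16)   # uniformity measure
    print(f"Gmax = {gmax:.1f}%, t50 = {t50:.1f} h, "
          f"uniformity (t84-t16) = {u8416:.1f} h")
    ```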

  3. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
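
    A minimal sketch (Python) of the scaled quantile residual diagnostic described above, evaluated against a known model so the residuals should stay small; the implementation details are illustrative, not the paper's code:

    ```python
    # Minimal sketch: if a fitted CDF F is correct, u_(i) = F(x_(i)) behaves
    # like uniform order statistics with mean i/(n+1) and Beta(i, n+1-i)
    # spread; residuals scaled by that spread should scatter within a few
    # units with no visible trend.
    from math import erf, sqrt

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    x = np.sort(rng.normal(size=n))

    # Candidate model: standard normal CDF (here, the true model).
    u = np.array([0.5 * (1 + erf(v / sqrt(2))) for v in x])

    i = np.arange(1, n + 1)
    mean = i / (n + 1)
    std = np.sqrt(i * (n + 1 - i)) / ((n + 1) * np.sqrt(n + 2))  # Beta(i, n+1-i) std
    scaled_resid = (u - mean) / std

    print(f"max |scaled residual| = {np.abs(scaled_resid).max():.2f}")
    # Values staying within ~3 indicate the fitted PDF is consistent with the data.
    ```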

  4. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis), causing adverse reproductive and developmental effects. Historically, a lack of assays meant that few chemicals had been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD-validated H295R steroidogenesis assay, which uses human adrenocortical carcinoma cells, into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones, including progestagens, androgens, estrogens, and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays; 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay, and 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the …
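
    The clustering step described above groups chemicals by how they shift the measured hormones. The sketch below is a hedged illustration of that idea with entirely hypothetical fold-change data and a generic hierarchical clustering, not the study's actual pipeline.

    ```python
    # Hedged sketch: cluster chemicals by their profiles of hormone-level
    # changes (e.g., log2 fold-changes across the 10 measured hormones) to
    # group putative mechanisms of action. Data are hypothetical.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(0)
    profiles = rng.normal(size=(8, 10))   # 8 chemicals x 10 hormones (log2 FC)

    z = linkage(pdist(profiles, metric="correlation"), method="average")
    clusters = fcluster(z, t=3, criterion="maxclust")
    print(clusters)                       # cluster label per chemical
    ```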

  5. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  6. A Programmable, Scalable-Throughput Interleaver

    Directory of Open Access Journals (Sweden)

    E. J. C. Rijshouwer

    2010-01-01

    The interleaver stages of digital communication standards show a surprisingly large variation in throughput, state sizes, and permutation functions. Furthermore, data rates for 4G standards such as LTE-Advanced will exceed typical baseband clock frequencies of handheld devices. Multistream operation for Software Defined Radio and iterative decoding algorithms will call for ever higher interleave data rates. Our interleave machine is built around 8 single-port SRAM banks and can be programmed to generate up to 8 addresses every clock cycle. The scalable architecture combines SIMD and VLIW concepts with an efficient resolution of bank conflicts. A wide range of cellular, connectivity, and broadcast interleavers have been mapped on this machine, with throughputs up to more than 0.5 Gsymbol/second. Although it was designed for channel interleaving, the application domain of the interleaver extends also to Turbo interleaving. The presented configuration of the architecture is designed as a part of a programmable outer receiver on a prototype board. It offers (near) universal programmability to enable the implementation of new interleavers. The interleaver measures 2.09 mm² in 65 nm CMOS (including memories) and proves functional on silicon.
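
    The bank-conflict issue mentioned above follows from the 8 single-port SRAM banks: a batch of up to 8 addresses can complete in one cycle only if no two addresses land in the same bank. Below is a toy sketch; the address-to-bank mapping (addr mod 8) is an assumption for illustration, as real designs may hash or rotate the mapping to spread conflicts.

    ```python
    # Toy check of SRAM bank conflicts for a batch of interleaver addresses.
    # Single-port banks serve at most one access per cycle, so a batch costs
    # as many cycles as the most-loaded bank.
    from collections import Counter

    NUM_BANKS = 8

    def cycles_needed(addresses):
        load = Counter(a % NUM_BANKS for a in addresses)  # assumed mapping
        return max(load.values())

    batch = [3, 11, 19, 4, 12, 5, 6, 7]   # 3, 11, 19 all hit bank 3
    print(cycles_needed(batch))            # -> 3 cycles for this batch
    ```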

  8. Student throughput variables and properties: Varying cohort sizes

    Directory of Open Access Journals (Sweden)

    Lucas C.A. Stoop

    2017-11-01

    A recent research paper described how student throughput variables and properties combine to explain the behaviour of stationary or simplified throughput systems. Such behaviour can be understood in terms of the locus of a point in the triangular admissible region of the H-S plane, where H represents headcounts and S successful credits, each depending on the system properties at that point. The efficiency of the student throughput process is given by the ratio S/H. Simplified throughput systems are characterised by stationary graduation and dropout patterns of students as well as by annual intakes of student cohorts of equal size. The effect of varying the size of the annual intakes of student cohorts is reported on here. The observations made lead to the establishment of a more generalised student throughput theory which includes the simplified theory as a special case. The generalised theory still retains the notion of a triangular admissible region in the H-S plane but with the size and shape of the triangle depending on the size of the student cohorts. The ratio S/H again emerges as the process efficiency measure for throughput systems in general with unchanged roles assigned to important system properties. This theory provides for a more fundamental understanding of student throughput systems encountered in real life. Significance: A generalised stationary student throughput theory through varying cohort sizes allows for a far better understanding of real student throughput systems.
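
    A minimal numeric sketch of the stationary picture described above: with a constant annual cohort and a fixed year-by-year survival and pass pattern, headcount H and successful credits S settle to steady-state values, and S/H gives the efficiency. All figures below are hypothetical.

    ```python
    # Hedged sketch of a stationary student-throughput system: constant annual
    # cohort N and a fixed survival profile by study year. H is total steady-state
    # headcount, S the credits passed per year; S/H is the efficiency measure.
    N = 1000                          # students entering each year
    survival = [1.0, 0.8, 0.7, 0.65]  # fraction of a cohort enrolled in year k
    credits = [0.75, 0.85, 0.9, 0.95] # average fraction of a full load passed

    H = sum(N * s for s in survival)
    S = sum(N * s * c for s, c in zip(survival, credits))
    print(f"H = {H:.0f}, S = {S:.0f} credit-units, efficiency S/H = {S / H:.2f}")
    ```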

  9. Robust Throughput Boosting for Low Latency Dynamic Partial Reconfiguration

    DEFF Research Database (Denmark)

    Nannarelli, Alberto; Re, M.; Cardarilli, Gian Carlo

    2017-01-01

    Reducing the configuration time of portions of an FPGA at run time is crucial in contemporary FPGA-based accelerators. In this work, we propose a method to increase the throughput for FPGA dynamic partial reconfiguration by using standard IP blocks. The throughput is increased by over-clocking the …

  10. PFP total process throughput calculation and basis of estimate

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

    The PFP Process Throughput Calculation and Basis of Estimate document provides the calculated value and basis of estimate for process throughput associated with material stabilization operations conducted in 234-52 Building. The process throughput data provided reflect the best estimates of material processing rates, consistent with experience at the Plutonium Finishing Plant (PFP) and other U.S. Department of Energy (DOE) sites. The rates shown reflect demonstrated capacity during "full" operation. They do not reflect impacts of building down time. Therefore, these throughput rates need to have a Total Operating Efficiency (TOE) factor applied.

  11. Precision medicine and molecular imaging: new targeted approaches toward cancer therapeutic and diagnosis

    Science.gov (United States)

    Ghasemi, Mojtaba; Nabipour, Iraj; Omrani, Abdolmajid; Alipour, Zeinab; Assadi, Majid

    2016-01-01

    This paper reviews the importance and role of precision medicine and molecular imaging technologies in cancer diagnosis, for both therapeutic and diagnostic purposes. Precision medicine is progressively becoming a hot topic in all disciplines related to biomedical investigation and has the capacity to become the paradigm for clinical practice. The future of medicine lies in early diagnosis and individually appropriate treatments, a concept that has been named precision medicine: delivering the right treatment to the right patient at the right time. Molecular imaging is quickly being recognized as a tool with the potential to ameliorate every aspect of cancer treatment. On the other hand, emerging high-throughput technologies such as omics techniques and systems approaches have generated a paradigm shift for biological systems in advanced life science research. In this review, we describe precision medicine, the difference between precision medicine and personalized medicine, the precision medicine initiative, systems biology/medicine approaches (such as genomics, radiogenomics, transcriptomics, proteomics, and metabolomics), P4 medicine, the relationship between systems biology/medicine approaches and precision medicine, and molecular imaging modalities and their utility in cancer treatment and diagnosis. Accordingly, precision medicine and molecular imaging will enable us to accelerate and improve cancer management in future medicine. PMID:28078184

  13. STANFORD (SLAC): Precision electroweak result

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Precision testing of the electroweak sector of the Standard Model has intensified with the recent publication of results from the SLD collaboration's 1993 run on the Stanford Linear Collider, SLC. Using a highly polarized electron beam colliding with an unpolarized positron beam, SLD physicists measured the left-right asymmetry at the Z boson resonance with dramatically improved accuracy over 1992.
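
    For reference, the left-right asymmetry is conventionally extracted from the polarized event counts as follows; this is the standard textbook relation, not a formula quoted from the record itself:

    ```latex
    % Left-right asymmetry at the Z pole from event counts taken with left-
    % and right-handed electron beam polarization (standard convention):
    A_{LR} \;=\; \frac{1}{\langle P_e \rangle}\,\frac{N_L - N_R}{N_L + N_R}
    ```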

  14. Spin and precision electroweak physics

    Energy Technology Data Exchange (ETDEWEB)

    Marciano, W.J. [Brookhaven National Lab., Upton, NY (United States)

    1994-12-01

    A perspective on fundamental parameters and precision tests of the Standard Model is given. Weak neutral current reactions are discussed with emphasis on those processes involving (polarized) electrons. The role of electroweak radiative corrections in determining the top quark mass and probing for "new physics" is described.

  15. Spin and precision electroweak physics

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1993-01-01

    A perspective on fundamental parameters and precision tests of the Standard Model is given. Weak neutral current reactions are discussed with emphasis on those processes involving (polarized) electrons. The role of electroweak radiative corrections in determining the top quark mass and probing for "new physics" is described.

  16. Precision surveying system for PEP

    International Nuclear Information System (INIS)

    Gunn, J.; Lauritzen, T.; Sah, R.; Pellisier, P.F.

    1977-01-01

    A semi-automatic precision surveying system is being developed for PEP. Reference elevations for vertical alignment will be provided by a liquid level. The short range surveying will be accomplished using a Laser Surveying System featuring automatic data acquisition and analysis

  17. Precision medicine at the crossroads.

    Science.gov (United States)

    Olson, Maynard V

    2017-10-11

    There are bioethical, institutional, economic, legal, and cultural obstacles to creating the robust, precompetitive data resource that will be required to advance the vision of "precision medicine," the ability to use molecular data to target therapies to patients for whom they offer the most benefit at the least risk. Creation of such an "information commons" was the central recommendation of the 2011 report Toward Precision Medicine issued by a committee of the National Research Council of the USA (Committee on a Framework for Development of a New Taxonomy of Disease; National Research Council. Toward precision medicine: building a knowledge network for biomedical research and a new taxonomy of disease. 2011). In this commentary, I review the rationale for creating an information commons and the obstacles to doing so; then, I endorse a path forward based on the dynamic consent of research subjects interacting with researchers through trusted mediators. I assert that the advantages of the proposed system overwhelm alternative ways of handling data on the phenotypes, genotypes, and environmental exposures of individual humans; hence, I argue that its creation should be the central policy objective of early efforts to make precision medicine a reality.

  18. Proton gyromagnetic precision measurement system

    International Nuclear Information System (INIS)

    Zhu Deming; Deming Zhu

    1991-01-01

    A computerized control and measurement system used in the proton gyromagnetic precision measurement is described. It adopts CAMAC data acquisition equipment, using on-line control and analysis with the HP85 and PDP-11/60 computer systems. It also adopts the RSX11M operating system, and the control software is written in FORTRAN.

  19. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques have been applied to many research fields nowadays. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue, or in screening multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the light intensity change over time on an enzyme spot gives information about the reaction rate. The same microarray can be used many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different …
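
    The rate extraction described above (light intensity change over time on an enzyme spot) amounts to fitting the early part of an intensity time series. Below is a minimal sketch with synthetic numbers, not the dissertation's actual processing.

    ```python
    # Hedged sketch: estimate an initial reaction rate for one microarray spot
    # from a CCD intensity time series by fitting a line to the early points.
    import numpy as np

    t = np.arange(0, 60, 5)                      # seconds after reaction start
    intensity = 100 + 3.2 * t + np.random.default_rng(2).normal(0, 2, t.size)

    rate, offset = np.polyfit(t, intensity, 1)   # slope = counts per second
    print(f"initial rate ~ {rate:.2f} counts/s")
    ```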

  20. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  1. Throughput/inventory dollar-days

    DEFF Research Database (Denmark)

    Gupta, Mahesh; Andersen, Soeren

    2018-01-01

    As the semiconductor industry moves away from vertical integration, performance measures play an increasingly important role to ensure effective collaboration. This paper demonstrates that the theory of constraints (TOC)-based measures, Throughput and Inventory Dollar-Days (T/IDD), induce autonomous supply chain (SC) links to function as a synergistic whole and thereby improve the performance of the whole SC network significantly. We model an SC network of a well-known TOC case study using discrete event simulation and discuss managerial implications of these measures via a set of scenarios. The scenarios explain how these measures – without sharing sensitive financial data – allow members of an SC network to monitor both the effectiveness (TDD) and efficiency (IDD) of SC members and lead them to create win-win solutions following well-known TOC-based planning and control concepts. We conclude …
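
    As commonly defined in the TOC literature (this is the usual textbook formulation, hedged here since the paper may refine it), TDD multiplies the dollar value of an overdue order by its days late, and IDD multiplies the dollar value of held inventory by its days in the link:

    ```python
    # Hedged sketch of the TOC measures as commonly defined:
    # TDD penalizes lateness (value of overdue orders x days late; target is 0),
    # IDD penalizes stock sitting in a link (inventory value x days held).
    # All figures are hypothetical.
    late_orders = [(5000, 3), (1200, 7)]   # (order dollar value, days late)
    inventory = [(800, 30), (2500, 12)]    # (inventory dollar value, days held)

    tdd = sum(value * days for value, days in late_orders)
    idd = sum(value * days for value, days in inventory)
    print(f"TDD = {tdd} dollar-days (effectiveness), IDD = {idd} dollar-days (efficiency)")
    ```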

  2. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    Science.gov (United States)

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  3. High-throughput screening (HTS) and modeling of the retinoid ...

    Science.gov (United States)

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  4. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High-throughput toxicokinetics (HTTK) is an approach that allows for rapid estimation of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e., reverse toxicokinetics or RTK) is used to convert high-throughput in vitro toxicity screening (HTS) data …
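
    In its simplest linear form, the reverse-dosimetry arithmetic behind RTK is a unit conversion: divide the in vitro active concentration by the predicted steady-state plasma concentration per unit dose. The sketch below uses hypothetical values; real HTTK workflows add population variability and other refinements.

    ```python
    # Hedged sketch of HTTK-based reverse dosimetry under linear toxicokinetics:
    # scale an in vitro active concentration to the external dose that would
    # produce it at steady state. Values are hypothetical.
    ac50_uM = 3.0               # in vitro activity concentration from HTS
    css_uM_per_mgkgday = 1.5    # predicted steady-state plasma conc. per 1 mg/kg/day

    aed_mgkgday = ac50_uM / css_uM_per_mgkgday   # administered equivalent dose
    print(f"AED ~ {aed_mgkgday:.2f} mg/kg/day")
    ```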

  5. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    Science.gov (United States)

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real-world situations. The U.S. EPA has created a public tool (the R package “httk”) for high-throughput tox…

  6. TCP Throughput Profiles Using Measurements over Dedicated Connections

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL; Liu, Qiang [ORNL; Sen, Satyabrata [ORNL; Towsley, Don [University of Massachusetts, Amherst; Vardoyan, Gayane [University of Massachusetts, Amherst; Kettimuthu, R. [Argonne National Laboratory (ANL); Foster, Ian [University of Chicago

    2017-06-01

    Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
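
    A toy version of the generic two-phase model described above abstracts a ramp-up lasting some number of RTTs followed by a sustained rate; longer RTTs stretch the ramp and pull the average down slowly, giving concave-looking profiles. The functional form and constants below are assumptions for illustration, not the paper's model.

    ```python
    # Hedged sketch of a two-phase throughput abstraction: ramp-up for a number
    # of RTTs (slow start), then a sustained rate set by buffers and parallel
    # streams. Constants are illustrative.
    def avg_throughput_gbps(size_gb, rtt_s, sustained_gbps, ramp_rtts=20):
        bits = size_gb * 8                            # transfer size in Gb
        ramp_time = ramp_rtts * rtt_s                 # time spent ramping up
        ramp_bits = 0.5 * sustained_gbps * ramp_time  # ~half rate while ramping
        if ramp_bits >= bits:
            return None                               # ends before sustainment
        return bits / (ramp_time + (bits - ramp_bits) / sustained_gbps)

    for rtt_ms in (10, 100, 366):                     # RTT range from the paper
        print(f"{rtt_ms} ms: {avg_throughput_gbps(100, rtt_ms / 1000, 9.5):.2f} Gb/s")
    ```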

  7. Precision and accuracy in radiotherapy

    International Nuclear Information System (INIS)

    Brenner, J.D.

    1989-01-01

    The required precision due to random errors in the delivery of a fractionated dose regime is considered. It is argued that suggestions that 1-3% precision is needed may be unnecessarily conservative. It is further suggested that random and systematic errors should not be combined with equal weight to yield an overall target uncertainty in dose delivery, systematic errors being of greater significance. The authors conclude that imprecise dose delivery and inaccurate dose delivery affect patient-cure results differently. Whereas, for example, a 10% inaccuracy in dose delivery would be quite catastrophic in the case considered here, a corresponding imprecision would have a much smaller effect on overall success rates. (author). 14 refs.; 2 figs.

  8. Precision electroweak physics at LEP

    Energy Technology Data Exchange (ETDEWEB)

    Mannelli, M.

    1994-12-01

    Copious event statistics, a precise understanding of the LEP energy scale, and a favorable experimental situation at the Z⁰ resonance have allowed the LEP experiments to provide both dramatic confirmation of the Standard Model of strong and electroweak interactions and to place substantially improved constraints on the parameters of the model. The author concentrates on those measurements relevant to the electroweak sector. It will be seen that the precision of these measurements sensitively probes the structure of the Standard Model at the one-loop level, where the calculation of the observables measured at LEP is affected by the value chosen for the top quark mass. One finds that the LEP measurements are consistent with the Standard Model, but only if the mass of the top quark is measured to be within a restricted range of about 20 GeV.

  9. Environment-assisted precision measurement

    DEFF Research Database (Denmark)

    Goldstein, G.; Cappellaro, P.; Maze, J. R.

    2011-01-01

    We describe a method to enhance the sensitivity of precision measurements that takes advantage of the environment of a quantum sensor to amplify the response of the sensor to weak external perturbations. An individual qubit is used to sense the dynamics of surrounding ancillary qubits, which are in turn affected by the external field to be measured. The resulting sensitivity enhancement is determined by the number of ancillas that are coupled strongly to the sensor qubit; it does not depend on the exact values of the coupling strengths and is resilient to many forms of decoherence. The method achieves nearly Heisenberg-limited precision measurement, using a novel class of entangled states. We discuss specific applications to improve clock sensitivity using trapped ions and magnetic sensing based on electronic spins in diamond.

  10. Precise object tracking under deformation

    International Nuclear Information System (INIS)

    Saad, M.H

    2010-01-01

    Precise object tracking is an essential issue in several serious applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems, satellite imaging, etc. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring, and changes of illumination. This research is a trial to solve these serious problems in visual object tracking, by which the quality of the overall system will be improved. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by OLS. This framework presents a robust ranging technique to track a visual target in place of the traditional, expensive ranging sensors. The presented research work is applied to a real video stream and achieves high-precision results.

  11. Fit to Electroweak Precision Data

    International Nuclear Information System (INIS)

    Erler, Jens

    2006-01-01

    A brief review of electroweak precision data from LEP, SLC, the Tevatron, and low energies is presented. The global fit to all data including the most recent results on the masses of the top quark and the W boson reinforces the preference for a relatively light Higgs boson. I will also give an outlook on future developments at the Tevatron Run II, CEBAF, the LHC, and the ILC

  13. Precision measurements of electroweak parameters

    CERN Document Server

    Savin, Alexander

    2017-01-01

    A set of selected precise measurements of SM parameters from the LHC experiments is discussed. Results on the W-mass measurement and the forward-backward asymmetry in the production of Drell-Yan events, in both the dielectron and dimuon decay channels, are presented together with results on effective mixing angle measurements. Electroweak production of vector bosons in association with two jets is discussed.

  14. Precision titration mini-calorimeter

    International Nuclear Information System (INIS)

    Ensor, D.; Kullberg, L.; Choppin, G.

    1977-01-01

    The design and testing of a small-volume calorimeter of high precision and simple construction are described. The calorimeter operates with solution sample volumes in the range of 3 to 5 ml. The results of experiments on the entropy changes for two standard reactions, (1) the reaction of tris(hydroxymethyl)aminomethane with hydrochloric acid and (2) the reaction between mercury(II) and bromide ions, are reported to confirm the accuracy and overall performance of the calorimeter.

  15. High-throughput characterization methods for lithium batteries

    Directory of Open Access Journals (Sweden)

    Yingchun Lyu

    2017-09-01

    The development of high-performance lithium-ion batteries requires the discovery of new materials and the optimization of key components. In contrast with the traditional one-by-one method, high-throughput methods can synthesize and characterize a large number of compositionally varying samples, accelerating the discovery, development, and optimization of materials. Because of rapid progress in thin-film and automatic control technologies, thousands of compounds with different compositions can now be synthesized rapidly, even in a single experiment. However, the lack of rapid or combinatorial characterization technologies to match the high-throughput synthesis methods limits the application of high-throughput technology. Here, we review a series of representative high-throughput characterization methods used for lithium batteries, including high-throughput structural and electrochemical characterization methods and rapid measurement technologies based on synchrotron light sources.

  16. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Hong Kezhu

    2007-01-01

    The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: the synchronous array method (SAM) and slotted ALOHA. The regular topologies considered are square, hexagonal, and triangular. Both nonfading channels and Rayleigh fading channels are examined, and both omnidirectional and directional antennas are considered. Our analysis shows that SAM leads to a much higher network throughput than slotted ALOHA. The network throughput in this paper is measured in either bit-hops per second per Hertz per node or bit-meters per second per Hertz per node, and the exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future work on the throughput of large networks.
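
    For orientation, the classical single-channel slotted-ALOHA result that underlies the comparison is S = G·e^(-G), which peaks at 1/e when the offered load G equals 1. The paper's spatial bit-hops analysis builds on, but goes well beyond, this baseline.

    ```python
    # Classical slotted-ALOHA baseline: with offered load G (attempts per slot),
    # the expected successful fraction is S = G * exp(-G), maximized at 1/e for
    # G = 1. This is the textbook result, not the paper's spatial metric.
    import math

    for g in (0.5, 1.0, 2.0):
        print(f"G = {g}: S = {g * math.exp(-g):.3f}")
    ```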

  18. Knowledge of Precision Farming Beneficiaries

    Directory of Open Access Journals (Sweden)

    A.V. Greena

    2016-05-01

    Precision farming is one of many advanced farming practices that make production more efficient through better resource management and reduced wastage. TN-IAMWARM is a World Bank-funded project that aims to improve farm productivity and income through better water management. The present study was carried out in the Kambainallur sub-basin of Dharmapuri district with 120 TN-IAMWARM beneficiaries as respondents. The results indicated that more than three-fourths (76.67%) of the respondents had a high level of knowledge of precision farming technologies, which was made possible by the implementation of the TN-IAMWARM project. The study further revealed that educational status, occupational status, and exposure to agricultural messages made a positive and significant contribution to the knowledge level of the respondents at the 0.01 level of probability, whereas experience in precision farming and social participation made a positive and significant contribution at the 0.05 level of probability.

  19. -Omic and Electronic Health Record Big Data Analytics for Precision Medicine.

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D

    2017-02-01

    Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcome. It has long lasting societal impact.

  1. Constraining supersymmetry with precision data

    International Nuclear Information System (INIS)

    Pierce, D.M.; Erler, J.

    1997-01-01

    We discuss the results of a global fit to precision data in supersymmetric models. We consider both gravity- and gauge-mediated models. As the superpartner spectrum becomes light, the global fit to the data typically results in larger values of χ². We indicate the regions of parameter space which are excluded by the data. We discuss the additional effect of the B(B→X_sγ) measurement. Our analysis excludes chargino masses below M_Z in the simplest gauge-mediated model with μ>0, with stronger constraints for larger values of tanβ. © 1997 American Institute of Physics

  2. High precision Standard Model Physics

    International Nuclear Information System (INIS)

    Magnin, J.

    2009-01-01

    The main goal of the LHCb experiment, one of the four large experiments of the Large Hadron Collider, is to try to answer the question of why Nature prefers matter over antimatter. This will be done by studying the decays of b quarks and their antimatter partners, b-bar, which will be produced by the billions in 14 TeV p-p collisions by the LHC. In addition, as 'beauty' particles mainly decay into charm particles, an interesting program of charm physics will be carried out, allowing quantities such as the D0-D0bar mixing to be measured with incredible precision.

  3. Electroweak precision measurements in CMS

    CERN Document Server

    Dordevic, Milos

    2017-01-01

    An overview of recent results on electroweak precision measurements from the CMS Collaboration is presented. Studies of the weak boson differential transverse momentum spectra, Z boson angular coefficients, forward-backward asymmetry of Drell-Yan lepton pairs and charge asymmetry of W boson production are made in comparison to the state-of-the-art Monte Carlo generators and theoretical predictions. The results show a good agreement with the Standard Model. As a proof of principle for future W mass measurements, a W-like analysis of the Z boson mass is performed.

  4. Precision proton spectrometers for CMS

    CERN Document Server

    Albrow, Michael

    2013-01-01

    We plan to add high-precision tracking and timing detectors at z = ±240 m from CMS to study exclusive processes p + p → p + X + p at high luminosity. This enables the LHC to be used as a tagged photon-photon collider, with X = l+l- and W+W-, and as a "tagged" gluon-gluon collider (with a spectator gluon) for QCD studies with jets. A second stage at z = 240 m would allow observations of exclusive Higgs boson production.

  5. Precise Analysis of String Expressions

    DEFF Research Database (Denmark)

    Christensen, Aske Simon; Møller, Anders; Schwartzbach, Michael Ignatieff

    2003-01-01

    We perform static analysis of Java programs to answer a simple question: which values may occur as results of string expressions? The answers are summarized for each expression by a regular language that is guaranteed to contain all possible values. We present several applications of this analysis, including statically checking the syntax of dynamically generated expressions, such as SQL queries. Our analysis constructs flow graphs from class files and generates a context-free grammar with a nonterminal for each string expression. The language of this grammar is then widened into a regular language … are automatically produced. We present extensive benchmarks demonstrating that the analysis is efficient and produces results of useful precision.

  6. The Many Faces of Precision

    Directory of Open Access Journals (Sweden)

    Andy Clark

    2013-05-01

    An appreciation of the many roles of ‘precision-weighting’ (upping the gain on select populations of prediction error units) opens the door to better accounts of planning and ‘offline simulation’, makes suggestive contact with large bodies of work on embodied and situated cognition, and offers new perspectives on the ‘active brain’. Combined with the complex affordances of language and culture, and operating against the essential backdrop of a variety of more biologically basic ploys and stratagems, the result is a maximally context-sensitive, restless, constantly self-reconfiguring architecture.

  7. Thin films for precision optics

    International Nuclear Information System (INIS)

    Araujo, J.F.; Maurici, N.; Castro, J.C. de

    1983-01-01

    The technology of producing dielectric and/or metallic thin films for high-precision optical components is discussed. Computer programs were developed to calculate and register, graphically, the reflectance and transmittance spectra of multi-layer films. The technology of vacuum evaporation of several materials was implemented in our thin-films laboratory, and various films for optics were then developed. The possibility of first calculating film characteristics and then producing the film is of great advantage, since it reduces the time required to produce a new type of film and also reduces the cost of the project. (C.L.B.)

  8. Plant responses to ambient temperature fluctuations and water-limiting conditions: A proteome-wide perspective

    Czech Academy of Sciences Publication Activity Database

    Johnova, P.; Skalák, J.; Saiz-Fernandez, I.; Brzobohatý, Břetislav

    2016-01-01

    Vol. 1864, No. 8 (2016), pp. 916-931. ISSN 1570-9639. Institutional support: RVO:68081707. Keywords: Plant proteome; Heat stress; Cold stress. Subject RIV: BO - Biophysics. Impact factor: 2.773, year: 2016

  9. Identification of novel biomarkers in pediatric primitive neuroectodermal tumors and ependymomas by proteome-wide analysis

    NARCIS (Netherlands)

    de Bont, Judith M.; den Boer, Monique L.; Kros, Johan M.; Passier, Monique M. C. J.; Reddinglus, Roel E.; Smitt, Peter A. E. Sillevis; Luider, Theo M.; Pieters, Rob

    The aim of this study was to identify aberrantly expressed proteins in pediatric primitive neuroectodermal tumors (PNETs) and ependymomas. Tumor tissue of 29 PNET and 12 ependymoma patients was subjected to 2-dimensional difference gel electrophoresis. Gel analysis resulted in 79 protein spots …

  10. Molecular mechanisms in ageing and cancer : Redox signalling & proteome-wide protein turnover

    NARCIS (Netherlands)

    Visscher, M.

    2016-01-01

    Cells are exposed to reactive oxygen species (ROS) that derive both from internal and external sources. The formation of cellular ROS is linked to ageing and age-related diseases like cancer. Because of conflicting data, the link between ROS and ageing or cancer is not exactly clear. ROS are in some …

  11. NHS-Esters As Versatile Reactivity-Based Probes for Mapping Proteome-Wide Ligandable Hotspots.

    Science.gov (United States)

    Ward, Carl C; Kleinman, Jordan I; Nomura, Daniel K

    2017-06-16

    Most of the proteome is considered undruggable, oftentimes hindering translational efforts for drug discovery. Identifying previously unknown druggable hotspots in proteins would enable strategies for pharmacologically interrogating these sites with small molecules. Activity-based protein profiling (ABPP) has arisen as a powerful chemoproteomic strategy that uses reactivity-based chemical probes to map reactive, functional, and ligandable hotspots in complex proteomes, which has enabled inhibitor discovery against various therapeutic protein targets. Here, we report an alkyne-functionalized N-hydroxysuccinimide-ester (NHS-ester) as a versatile reactivity-based probe for mapping the reactivity of a wide range of nucleophilic ligandable hotspots, including lysines, serines, threonines, and tyrosines, encompassing active sites, allosteric sites, post-translational modification sites, protein interaction sites, and previously uncharacterized potential binding sites. Surprisingly, we also show that fragment-based NHS-ester ligands can be made to confer selectivity for specific lysine hotspots on specific targets including Dpyd, Aldh2, and Gstt1. We thus put forth NHS-esters as promising reactivity-based probes and chemical scaffolds for covalent ligand discovery.

  12. Proteome-wide dataset supporting functional study of tyrosine kinases in breast cancer

    Directory of Open Access Journals (Sweden)

    Nicos Angelopoulos

    2016-06-01

    Tyrosine kinases (TKs) play an essential role in regulating various cellular activities, and dysregulation of TK signaling contributes to oncogenesis. However, less than half of the TKs have been thoroughly studied. Through a combined use of RNAi and stable isotope labeling with amino acids in cell culture (SILAC)-based quantitative proteomics, a global functional proteomic landscape of TKs in breast cancer was recently revealed, highlighting a comprehensive and highly integrated signaling network regulated by TKs (Stebbing et al., 2015 [1]). We collate the enormous amount of proteomic data in an open-access platform, providing a valuable resource for studying the function of TKs in cancer and benefiting the scientific community. Here we present a detailed description related to this study (Stebbing et al., 2015 [1]); the raw data have been deposited to the ProteomeXchange Consortium via the PRIDE partner repository with the identifier PXD002065 (http://www.ebi.ac.uk/pride/archive/projects/PXD002065).

  13. Proteome-wide Adaptations of Mouse Skeletal Muscles during a Full Month in Space.

    Science.gov (United States)

    Tascher, Georg; Brioche, Thomas; Maes, Pauline; Chopard, Angèle; O'Gorman, Donal; Gauquelin-Koch, Guillemette; Blanc, Stéphane; Bertile, Fabrice

    2017-07-07

    The safety of space flight is challenged by a severe loss of skeletal muscle mass, strength, and endurance that may compromise the health and performance of astronauts. The molecular mechanisms underpinning muscle atrophy and decreased performance have been studied mostly after short duration flights and are still not fully elucidated. By deciphering the muscle proteome changes elicited in mice after a full month aboard the BION-M1 biosatellite, we observed that the antigravity soleus incurred the greatest changes compared with locomotor muscles. Proteomics data notably suggested mitochondrial dysfunction, metabolic and fiber type switching toward glycolytic type II fibers, structural alterations, and calcium signaling-related defects to be the main causes for decreased muscle performance in flown mice. Alterations of the protein balance, mTOR pathway, myogenesis, and apoptosis were expected to contribute to muscle atrophy. Moreover, several signs reflecting alteration of telomere maintenance, oxidative stress, and insulin resistance were found as possible additional deleterious effects. Finally, 8 days of recovery post flight were not sufficient to restore completely flight-induced changes. Thus in-depth proteomics analysis unraveled the complex and multifactorial remodeling of skeletal muscle structure and function during long-term space flight, which should help define combined sets of countermeasures before, during, and after the flight.

  14. Proteome-wide analysis and diel proteomic profiling of the cyanobacterium Arthrospira platensis PCC 8005.

    Directory of Open Access Journals (Sweden)

    Sabine Matallana-Surget

    The filamentous cyanobacterium Arthrospira platensis has a long history of use as a food supply and it has been used by the European Space Agency in the MELiSSA project, an artificial microecosystem which supports life during long-term manned space missions. This study assesses progress in the field of cyanobacterial shotgun proteomics and light/dark diurnal cycles by focusing on Arthrospira platensis. Several fractionation workflows including gel-free and gel-based protein/peptide fractionation procedures were used and combined with LC-MS/MS analysis, enabling the overall identification of 1306 proteins, which represents 21% coverage of the theoretical proteome. A total of 30 proteins were found to be significantly differentially regulated under light/dark growth transition. Interestingly, most of the proteins showing differential abundance were related to photosynthesis, the Calvin cycle and translation processes. A novel aspect and major achievement of this work is the successful improvement of the cyanobacterial proteome coverage using a 3D LC-MS/MS approach, based on an immobilized metal affinity chromatography, a suitable tool that enabled us to eliminate the most abundant protein, the allophycocyanin. We also demonstrated that cell growth follows a light/dark cycle in A. platensis. This preliminary proteomic study has highlighted new characteristics of the Arthrospira platensis proteome in terms of diurnal regulation.

  15. Proteome-wide Identification of Poly(ADP-Ribosyl)ation Targets in Different Genotoxic Stress Responses

    DEFF Research Database (Denmark)

    Jungmichel, S.; Rosenthal, F.; Altmeyer, M.

    2013-01-01

    … Nuclear proteins encompassing nucleic acid binding properties are prominently PARylated upon genotoxic stress, consistent with the nuclear localization of ARTD1/PARP1 and ARTD2/PARP2. Distinct differences in proteins becoming PARylated upon various genotoxic insults are observed, exemplified …

  16. Proteome-wide analysis of arginine monomethylation reveals widespread occurrence in human cells

    DEFF Research Database (Denmark)

    Larsen, Sara C; Sylvestersen, Kathrine B; Mund, Andreas

    2016-01-01

    … to the frequency of somatic mutations at arginine methylation sites throughout the proteome, we observed that somatic mutations were common at arginine methylation sites in proteins involved in mRNA splicing. Furthermore, in HeLa and U2OS cells, we found that distinct arginine methyltransferases differentially … kidney 293 cells, indicating that the occurrence of this modification is comparable to phosphorylation and ubiquitylation. A site-level conservation analysis revealed that arginine methylation sites are less evolutionarily conserved compared to arginines that were not identified as modified … as coactivator-associated arginine methyltransferase 1 (CARM1) or PRMT1 increased the RNA binding function of HNRNPUL1. High-content single-cell imaging additionally revealed that knocking down CARM1 promoted the nuclear accumulation of SRSF2, independent of cell cycle phase. Collectively, the presented human …

  17. Proteome-wide identification of WRN-interacting proteins in untreated and nuclease-treated samples

    DEFF Research Database (Denmark)

    Lachapelle, Sophie; Gagné, Jean-Philippe; Garand, Chantal

    2011-01-01

    Werner syndrome (WS) is characterized by the premature onset of several age-associated pathologies. The protein defective in WS patients (WRN) is a helicase/exonuclease involved in DNA repair, replication, telomere maintenance, and transcription. Here, we present the results of a large-scale prot…

  18. Proteome-wide analysis of neural stem cell differentiation to facilitate transition to cell replacement therapies

    Czech Academy of Sciences Publication Activity Database

    Žižková, Martina; Suchá, Rita; Tylečková, Jiřina; Jarkovská, Karla; Mairychová, Kateřina; Kotrčová, Eva; Marsala, M.; Gadher, S. J.; Kovářová, Hana

    2015-01-01

    Vol. 12, No. 1 (2015), pp. 83-95. ISSN 1478-9450. R&D Projects: GA MŠk ED2.1.00/03.0124; GA TA ČR(CZ) TA01011466. Institutional support: RVO:67985904. Keywords: cell therapy; immunomodulation; neural stem cell differentiation; neural subpopulation; neurodegenerative disease. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 3.465, year: 2015

  19. AOPs and Biomarkers: Bridging High Throughput Screening ...

    Science.gov (United States)

    As high-throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will benefit from additional data sources that connect the magnitude of perturbation in the in vitro system to a level of concern at the organism or population level. The adverse outcome pathway (AOP) concept provides an ideal framework for combining these complementary data. Recent international efforts under the auspices of the Organisation for Economic Co-operation and Development (OECD) have resulted in an AOP wiki designed to house formal descriptions of AOPs suitable for use in regulatory decision making. Recent efforts have built upon this to include an ontology describing the AOP with linkages to biological pathways, physiological terminology, and taxonomic applicability domains. Incorporation of an AOP network tool developed by the U.S. Army Corps of Engineers also allows consideration of cumulative risk from chemical and non-chemical stressors. Biomarkers are an important complement to formal AOP descriptions, particularly when dealing with susceptible subpopulations or life stages in human health risk assessment. To address the issue of non-chemical stressors that may modify effects of criteria air pollutants, a novel method was used to integrate blood gene expression data with hema…

  20. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high-throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration-response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit-call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration-response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration-response data. Bootstrap resampling determines confidence intervals for …
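
    A hedged sketch of the bootstrap idea described above: resample the concentration-response points, refit a Hill model each time, and read a confidence interval for potency off the bootstrap distribution. The data, model form, and bounds below are illustrative, not ToxCast's pipeline.

    ```python
    # Bootstrap uncertainty for one concentration-response curve: resample
    # points with replacement, refit a Hill model, take percentiles of the
    # refit AC50 values. Data and bounds are synthetic/illustrative.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(c, top, ac50, n):
        return top * c**n / (ac50**n + c**n)

    conc = np.logspace(-3, 2, 8)                        # uM
    rng = np.random.default_rng(0)
    resp = hill(conc, 100, 1.0, 1.2) + rng.normal(0, 5, conc.size)

    boot_ac50 = []
    for _ in range(500):
        idx = rng.integers(0, conc.size, conc.size)     # resample with replacement
        try:
            p, _ = curve_fit(hill, conc[idx], resp[idx], p0=[100, 1.0, 1.0],
                             bounds=([0, 1e-4, 0.1], [200, 1e3, 5]))
            boot_ac50.append(p[1])
        except RuntimeError:
            continue                                    # skip non-converged refits
    lo, hi = np.percentile(boot_ac50, [2.5, 97.5])
    print(f"AC50 95% CI: [{lo:.2f}, {hi:.2f}] uM")
    ```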

  1. Enhanced throughput for infrared automated DNA sequencing

    Science.gov (United States)

    Middendorf, Lyle R.; Gartside, Bill O.; Humphrey, Pat G.; Roemer, Stephen C.; Sorensen, David R.; Steffens, David L.; Sutter, Scott L.

    1995-04-01

    Several enhancements have been developed and applied to infrared automated DNA sequencing resulting in significantly higher throughput. A 41 cm sequencing gel (31 cm well-to-read distance) combines high resolution of DNA sequencing fragments with optimized run times yielding two runs per day of 500 bases per sample. A 66 cm sequencing gel (56 cm well-to-read distance) produces sequence read lengths of up to 1000 bases for ds and ss templates using either T7 polymerase or cycle-sequencing protocols. Using a multichannel syringe to load 64 lanes allows 16 samples (compatible with 96-well format) to be visualized for each run. The 41 cm gel configuration allows 16,000 bases per day (16 samples × 500 bases/sample × 2 ten-hour runs/day) to be sequenced with the advantages of infrared technology. Enhancements to internal labeling techniques using an infrared-labeled dATP molecule (Boehringer Mannheim GmbH, Penzberg, Germany) and Sequenase (U.S. Biochemical) have also been made. The inclusion of glycerol in the sequencing reactions yields greatly improved results for some primer and template combinations. The inclusion of α-thio-dNTPs in the labeling reaction increases signal intensity two- to three-fold.

  2. A fast image encryption system based on chaotic maps with finite precision representation

    International Nuclear Information System (INIS)

    Kwok, H.S.; Tang, Wallace K.S.

    2007-01-01

    In this paper, a fast chaos-based image encryption system with stream cipher structure is proposed. In order to achieve a fast throughput and facilitate hardware realization, 32-bit precision representation with fixed point arithmetic is assumed. The major core of the encryption system is a pseudo-random keystream generator based on a cascade of chaotic maps, serving the purpose of sequence generation and random mixing. Unlike other existing chaos-based pseudo-random number generators, the proposed keystream generator not only achieves a very fast throughput, but also passes the statistical tests of an up-to-date test suite even under quantization. The overall design of the image encryption system is explained, and a detailed cryptanalysis is given and compared with some existing schemes
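
    As a rough illustration of the fixed-point arithmetic the paper assumes, the sketch below iterates a single logistic map in 32-bit fixed-point (Q0.32) format and extracts one keystream byte per step. The actual system cascades several maps with additional random mixing; a lone logistic map, as used here, is not cryptographically secure.

```python
# Hedged sketch of a 32-bit fixed-point chaotic keystream generator.
# Toy example only: the paper's cascade of maps and mixing stage are omitted.
FRAC_BITS = 32                        # Q0.32 fixed point: values in [0, 1)
MASK = (1 << FRAC_BITS) - 1

def logistic_step(x):
    """One iteration of x -> 4x(1-x), entirely in integer arithmetic."""
    prod = (x * ((1 << FRAC_BITS) - x)) >> FRAC_BITS  # Q0.64 -> Q0.32
    return (prod << 2) & MASK                         # multiply by 4

def keystream(seed, nbytes):
    """Generate nbytes of keystream from a nonzero 32-bit seed."""
    x, out = seed & MASK, bytearray()
    for _ in range(nbytes):
        x = logistic_step(x)
        out.append((x >> 24) & 0xFF)  # emit the high byte each iteration
    return bytes(out)

# Stream-cipher use: XOR pixel bytes with the keystream.
plaintext = b"pixel data"
cipher = bytes(p ^ k for p, k in zip(plaintext, keystream(0x9E3779B9, len(plaintext))))
```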

  3. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have advanced rapidly. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  4. Throughput Capacity of Ad Hoc Networks with Route Discovery

    Directory of Open Access Journals (Sweden)

    Blum Rick S

    2007-01-01

    Full Text Available Throughput capacity of large ad hoc networks has been shown to scale adversely with the size of the network. However, the need for the nodes to find or repair routes has not been analyzed in this context. In this paper, we explicitly take route discovery into account and obtain the scaling law for the throughput capacity under general assumptions on the network environment, node behavior, and the quality of route discovery algorithms. We also discuss a number of possible scenarios and show that the need for route discovery may change the scaling for the throughput capacity.
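
    For context, the classical per-node throughput of a random ad hoc network without route-discovery overhead (the Gupta-Kumar result) is recalled below; the abstract's point is that accounting for route discovery can worsen this scaling. The formula is background, not the paper's derived law.

```latex
% Per-node throughput of a random ad hoc network of n nodes sharing a
% channel of bandwidth W, before route-discovery overhead is included:
\lambda(n) = \Theta\!\left(\frac{W}{\sqrt{n \log n}}\right)
```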

  5. Precision luminosity measurements at LHCb

    CERN Document Server

    Aaij, Roel; Adinolfi, Marco; Affolder, Anthony; Ajaltouni, Ziad; Akar, Simon; Albrecht, Johannes; Alessio, Federico; Alexander, Michael; Ali, Suvayu; Alkhazov, Georgy; Alvarez Cartelle, Paula; Alves Jr, Antonio Augusto; Amato, Sandra; Amerio, Silvia; Amhis, Yasmine; An, Liupan; Anderlini, Lucio; Anderson, Jonathan; Andreassen, Rolf; Andreotti, Mirco; Andrews, Jason; Appleby, Robert; Aquines Gutierrez, Osvaldo; Archilli, Flavio; Artamonov, Alexander; Artuso, Marina; Aslanides, Elie; Auriemma, Giulio; Baalouch, Marouen; Bachmann, Sebastian; Back, John; Badalov, Alexey; Baesso, Clarissa; Baldini, Wander; Barlow, Roger; Barschel, Colin; Barsuk, Sergey; Barter, William; Batozskaya, Varvara; Battista, Vincenzo; Bay, Aurelio; Beaucourt, Leo; Beddow, John; Bedeschi, Franco; Bediaga, Ignacio; Belogurov, Sergey; Belous, Konstantin; Belyaev, Ivan; Ben-Haim, Eli; Bencivenni, Giovanni; Benson, Sean; Benton, Jack; Berezhnoy, Alexander; Bernet, Roland; Bettler, Marc-Olivier; van Beuzekom, Martinus; Bien, Alexander; Bifani, Simone; Bird, Thomas; Bizzeti, Andrea; Bjørnstad, Pål Marius; Blake, Thomas; Blanc, Frédéric; Blouw, Johan; Blusk, Steven; Bocci, Valerio; Bondar, Alexander; Bondar, Nikolay; Bonivento, Walter; Borghi, Silvia; Borgia, Alessandra; Borsato, Martino; Bowcock, Themistocles; Bowen, Espen Eie; Bozzi, Concezio; Brambach, Tobias; Bressieux, Joël; Brett, David; Britsch, Markward; Britton, Thomas; Brodzicka, Jolanta; Brook, Nicholas; Brown, Henry; Bursche, Albert; Buytaert, Jan; Cadeddu, Sandro; Calabrese, Roberto; Calvi, Marta; Calvo Gomez, Miriam; Campana, Pierluigi; Campora Perez, Daniel; Carbone, Angelo; Carboni, Giovanni; Cardinale, Roberta; Cardini, Alessandro; Carson, Laurence; Carvalho Akiba, Kazuyoshi; Casse, Gianluigi; Cassina, Lorenzo; Castillo Garcia, Lucia; Cattaneo, Marco; Cauet, Christophe; Cenci, Riccardo; Charles, Matthew; Charpentier, Philippe; Chefdeville, Maximilien; Chen, Shanzhen; Cheung, Shu-Faye; Chiapolini, Nicola; Chrzaszcz, Marcin; Ciba, Krzystof; Cid Vidal, Xabier; Ciezarek, Gregory; Clarke, Peter; Clemencic, Marco; Cliff, Harry; Closier, Joel; Coco, Victor; Cogan, Julien; Cogneras, Eric; Cojocariu, Lucian; Collazuol, Gianmaria; Collins, Paula; Comerma-Montells, Albert; Contu, Andrea; Cook, Andrew; Coombes, Matthew; Coquereau, Samuel; Corti, Gloria; Corvo, Marco; Counts, Ian; Couturier, Benjamin; Cowan, Greig; Craik, Daniel Charles; Cruz Torres, Melissa Maria; Cunliffe, Samuel; Currie, Robert; D'Ambrosio, Carmelo; Dalseno, Jeremy; David, Pascal; David, Pieter; Davis, Adam; De Bruyn, Kristof; De Capua, Stefano; De Cian, Michel; De Miranda, Jussara; De Paula, Leandro; De Silva, Weeraddana; De Simone, Patrizia; Dean, Cameron Thomas; Decamp, Daniel; Deckenhoff, Mirko; Del Buono, Luigi; Déléage, Nicolas; Derkach, Denis; Deschamps, Olivier; Dettori, Francesco; Di Canto, Angelo; Dijkstra, Hans; Donleavy, Stephanie; Dordei, Francesca; Dorigo, Mirco; Dosil Suárez, Alvaro; Dossett, David; Dovbnya, Anatoliy; Dreimanis, Karlis; Dujany, Giulio; Dupertuis, Frederic; Durante, Paolo; Dzhelyadin, Rustem; Dziurda, Agnieszka; Dzyuba, Alexey; Easo, Sajan; Egede, Ulrik; Egorychev, Victor; Eidelman, Semen; Eisenhardt, Stephan; Eitschberger, Ulrich; Ekelhof, Robert; Eklund, Lars; El Rifai, Ibrahim; Elsasser, Christian; Ely, Scott; Esen, Sevda; Evans, Hannah Mary; Evans, Timothy; Falabella, Antonio; Färber, Christian; Farinelli, Chiara; Farley, Nathanael; Farry, Stephen; Fay, Robert; Ferguson, Dianne; Fernandez Albor, Victor; Ferreira Rodrigues, Fernando; Ferro-Luzzi, Massimiliano; 
Filippov, Sergey; Fiore, Marco; Fiorini, Massimiliano; Firlej, Miroslaw; Fitzpatrick, Conor; Fiutowski, Tomasz; Fol, Philip; Fontana, Marianna; Fontanelli, Flavio; Forty, Roger; Francisco, Oscar; Frank, Markus; Frei, Christoph; Frosini, Maddalena; Fu, Jinlin; Furfaro, Emiliano; Gallas Torreira, Abraham; Galli, Domenico; Gallorini, Stefano; Gambetta, Silvia; Gandelman, Miriam; Gandini, Paolo; Gao, Yuanning; García Pardiñas, Julián; Garofoli, Justin; Garra Tico, Jordi; Garrido, Lluis; Gascon, David; Gaspar, Clara; Gauld, Rhorry; Gavardi, Laura; Geraci, Angelo; Gersabeck, Evelina; Gersabeck, Marco; Gershon, Timothy; Ghez, Philippe; Gianelle, Alessio; Gianì, Sebastiana; Gibson, Valerie; Giubega, Lavinia-Helena; Gligorov, V.V.; Göbel, Carla; Golubkov, Dmitry; Golutvin, Andrey; Gomes, Alvaro; Gotti, Claudio; Grabalosa Gándara, Marc; Graciani Diaz, Ricardo; Granado Cardoso, Luis Alberto; Graugés, Eugeni; Graziani, Giacomo; Grecu, Alexandru; Greening, Edward; Gregson, Sam; Griffith, Peter; Grillo, Lucia; Grünberg, Oliver; Gui, Bin; Gushchin, Evgeny; Guz, Yury; Gys, Thierry; Hadjivasiliou, Christos; Haefeli, Guido; Haen, Christophe; Haines, Susan; Hall, Samuel; Hamilton, Brian; Hampson, Thomas; Han, Xiaoxue; Hansmann-Menzemer, Stephanie; Harnew, Neville; Harnew, Samuel; Harrison, Jonathan; He, Jibo; Head, Timothy; Heijne, Veerle; Hennessy, Karol; Henrard, Pierre; Henry, Louis; Hernando Morata, Jose Angel; van Herwijnen, Eric; Heß, Miriam; Hicheur, Adlène; Hill, Donal; Hoballah, Mostafa; Hombach, Christoph; Hulsbergen, Wouter; Hunt, Philip; Hussain, Nazim; Hutchcroft, David; Hynds, Daniel; Idzik, Marek; Ilten, Philip; Jacobsson, Richard; Jaeger, Andreas; Jalocha, Pawel; Jans, Eddy; Jaton, Pierre; Jawahery, Abolhassan; Jing, Fanfan; John, Malcolm; Johnson, Daniel; Jones, Christopher; Joram, Christian; Jost, Beat; Jurik, Nathan; Kandybei, Sergii; Kanso, Walaa; Karacson, Matthias; Karbach, Moritz; Karodia, Sarah; Kelsey, Matthew; Kenyon, Ian; Ketel, Tjeerd; Khanji, Basem; Khurewathanakul, Chitsanu; Klaver, Suzanne; Klimaszewski, Konrad; Kochebina, Olga; Kolpin, Michael; Komarov, Ilya; Koopman, Rose; Koppenburg, Patrick; Korolev, Mikhail; Kozlinskiy, Alexandr; Kravchuk, Leonid; Kreplin, Katharina; Kreps, Michal; Krocker, Georg; Krokovny, Pavel; Kruse, Florian; Kucewicz, Wojciech; Kucharczyk, Marcin; Kudryavtsev, Vasily; Kurek, Krzysztof; Kvaratskheliya, Tengiz; La Thi, Viet Nga; Lacarrere, Daniel; Lafferty, George; Lai, Adriano; Lambert, Dean; Lambert, Robert W; Lanfranchi, Gaia; Langenbruch, Christoph; Langhans, Benedikt; Latham, Thomas; Lazzeroni, Cristina; Le Gac, Renaud; van Leerdam, Jeroen; Lees, Jean-Pierre; Lefèvre, Regis; Leflat, Alexander; Lefrançois, Jacques; Leo, Sabato; Leroy, Olivier; Lesiak, Tadeusz; Leverington, Blake; Li, Yiming; Likhomanenko, Tatiana; Liles, Myfanwy; Lindner, Rolf; Linn, Christian; Lionetto, Federica; Liu, Bo; Lohn, Stefan; Longstaff, Iain; Lopes, Jose; Lopez-March, Neus; Lowdon, Peter; Lu, Haiting; Lucchesi, Donatella; Luo, Haofei; Lupato, Anna; Luppi, Eleonora; Lupton, Oliver; Machefert, Frederic; Machikhiliyan, Irina V; Maciuc, Florin; Maev, Oleg; Malde, Sneha; Malinin, Alexander; Manca, Giulia; Mancinelli, Giampiero; Mapelli, Alessandro; Maratas, Jan; Marchand, Jean François; Marconi, Umberto; Marin Benito, Carla; Marino, Pietro; Märki, Raphael; Marks, Jörg; Martellotti, Giuseppe; Martens, Aurelien; Martín Sánchez, Alexandra; Martinelli, Maurizio; Martinez Santos, Diego; Martinez Vidal, Fernando; Martins Tostes, Danielle; Massafferri, André; Matev, Rosen; Mathe, 
Zoltan; Matteuzzi, Clara; Maurin, Brice; Mazurov, Alexander; McCann, Michael; McCarthy, James; McNab, Andrew; McNulty, Ronan; McSkelly, Ben; Meadows, Brian; Meier, Frank; Meissner, Marco; Merk, Marcel; Milanes, Diego Alejandro; Minard, Marie-Noelle; Moggi, Niccolò; Molina Rodriguez, Josue; Monteil, Stephane; Morandin, Mauro; Morawski, Piotr; Mordà, Alessandro; Morello, Michael Joseph; Moron, Jakub; Morris, Adam Benjamin; Mountain, Raymond; Muheim, Franz; Müller, Katharina; Mussini, Manuel; Muster, Bastien; Naik, Paras; Nakada, Tatsuya; Nandakumar, Raja; Nasteva, Irina; Needham, Matthew; Neri, Nicola; Neubert, Sebastian; Neufeld, Niko; Neuner, Max; Nguyen, Anh Duc; Nguyen, Thi-Dung; Nguyen-Mau, Chung; Nicol, Michelle; Niess, Valentin; Niet, Ramon; Nikitin, Nikolay; Nikodem, Thomas; Novoselov, Alexey; O'Hanlon, Daniel Patrick; Oblakowska-Mucha, Agnieszka; Obraztsov, Vladimir; Oggero, Serena; Ogilvy, Stephen; Okhrimenko, Oleksandr; Oldeman, Rudolf; Onderwater, Gerco; Orlandea, Marius; Otalora Goicochea, Juan Martin; Owen, Patrick; Oyanguren, Maria Arantza; Pal, Bilas Kanti; Palano, Antimo; Palombo, Fernando; Palutan, Matteo; Panman, Jacob; Papanestis, Antonios; Pappagallo, Marco; Pappalardo, Luciano; Parkes, Christopher; Parkinson, Christopher John; Passaleva, Giovanni; Patel, Girish; Patel, Mitesh; Patrignani, Claudia; Pearce, Alex; Pellegrino, Antonio; Pepe Altarelli, Monica; Perazzini, Stefano; Perret, Pascal; Perrin-Terrin, Mathieu; Pescatore, Luca; Pesen, Erhan; Pessina, Gianluigi; Petridis, Konstantin; Petrolini, Alessandro; Picatoste Olloqui, Eduardo; Pietrzyk, Boleslaw; Pilař, Tomas; Pinci, Davide; Pistone, Alessandro; Playfer, Stephen; Plo Casasus, Maximo; Polci, Francesco; Poluektov, Anton; Polycarpo, Erica; Popov, Alexander; Popov, Dmitry; Popovici, Bogdan; Potterat, Cédric; Price, Eugenia; Price, Joseph David; Prisciandaro, Jessica; Pritchard, Adrian; Prouve, Claire; Pugatch, Valery; Puig Navarro, Albert; Punzi, Giovanni; Qian, Wenbin; Rachwal, Bartolomiej; Rademacker, Jonas; Rakotomiaramanana, Barinjaka; Rama, Matteo; Rangel, Murilo; Raniuk, Iurii; Rauschmayr, Nathalie; Raven, Gerhard; Redi, Federico; Reichert, Stefanie; Reid, Matthew; dos Reis, Alberto; Ricciardi, Stefania; Richards, Sophie; Rihl, Mariana; Rinnert, Kurt; Rives Molina, Vincente; Robbe, Patrick; Rodrigues, Ana Barbara; Rodrigues, Eduardo; Rodriguez Perez, Pablo; Roiser, Stefan; Romanovsky, Vladimir; Romero Vidal, Antonio; Rotondo, Marcello; Rouvinet, Julien; Ruf, Thomas; Ruiz, Hugo; Ruiz Valls, Pablo; Saborido Silva, Juan Jose; Sagidova, Naylya; Sail, Paul; Saitta, Biagio; Salustino Guimaraes, Valdir; Sanchez Mayordomo, Carlos; Sanmartin Sedes, Brais; Santacesaria, Roberta; Santamarina Rios, Cibran; Santovetti, Emanuele; Sarti, Alessio; Satriano, Celestina; Satta, Alessia; Saunders, Daniel Martin; Savrina, Darya; Schiller, Manuel; Schindler, Heinrich; Schlupp, Maximilian; Schmelling, Michael; Schmidt, Burkhard; Schneider, Olivier; Schopper, Andreas; Schubiger, Maxime; Schune, Marie Helene; Schwemmer, Rainer; Sciascia, Barbara; Sciubba, Adalberto; Semennikov, Alexander; Sepp, Indrek; Serra, Nicola; Serrano, Justine; Sestini, Lorenzo; Seyfert, Paul; Shapkin, Mikhail; Shapoval, Illya; Shcheglov, Yury; Shears, Tara; Shekhtman, Lev; Shevchenko, Vladimir; Shires, Alexander; Silva Coutinho, Rafael; Simi, Gabriele; Sirendi, Marek; Skidmore, Nicola; Skwarnicki, Tomasz; Smith, Anthony; Smith, Edmund; Smith, Eluned; Smith, Jackson; Smith, Mark; Snoek, Hella; Sokoloff, Michael; Soler, Paul; Soomro, Fatima; Souza, Daniel; 
Souza De Paula, Bruno; Spaan, Bernhard; Sparkes, Ailsa; Spradlin, Patrick; Sridharan, Srikanth; Stagni, Federico; Stahl, Marian; Stahl, Sascha; Steinkamp, Olaf; Stenyakin, Oleg; Stevenson, Scott; Stoica, Sabin; Stone, Sheldon; Storaci, Barbara; Stracka, Simone; Straticiuc, Mihai; Straumann, Ulrich; Stroili, Roberto; Subbiah, Vijay Kartik; Sun, Liang; Sutcliffe, William; Swientek, Krzysztof; Swientek, Stefan; Syropoulos, Vasileios; Szczekowski, Marek; Szczypka, Paul; Szumlak, Tomasz; T'Jampens, Stephane; Teklishyn, Maksym; Tellarini, Giulia; Teubert, Frederic; Thomas, Christopher; Thomas, Eric; van Tilburg, Jeroen; Tisserand, Vincent; Tobin, Mark; Tolk, Siim; Tomassetti, Luca; Tonelli, Diego; Topp-Joergensen, Stig; Torr, Nicholas; Tournefier, Edwige; Tourneur, Stephane; Tran, Minh Tâm; Tresch, Marco; Trisovic, Ana; Tsaregorodtsev, Andrei; Tsopelas, Panagiotis; Tuning, Niels; Ubeda Garcia, Mario; Ukleja, Artur; Ustyuzhanin, Andrey; Uwer, Ulrich; Vacca, Claudia; Vagnoni, Vincenzo; Valenti, Giovanni; Vallier, Alexis; Vazquez Gomez, Ricardo; Vazquez Regueiro, Pablo; Vázquez Sierra, Carlos; Vecchi, Stefania; Velthuis, Jaap; Veltri, Michele; Veneziano, Giovanni; Vesterinen, Mika; Viaud, Benoit; Vieira, Daniel; Vieites Diaz, Maria; Vilasis-Cardona, Xavier; Vollhardt, Achim; Volyanskyy, Dmytro; Voong, David; Vorobyev, Alexey; Vorobyev, Vitaly; Voß, Christian; de Vries, Jacco; Waldi, Roland; Wallace, Charlotte; Wallace, Ronan; Walsh, John; Wandernoth, Sebastian; Wang, Jianchun; Ward, David; Watson, Nigel; Websdale, David; Whitehead, Mark; Wicht, Jean; Wiedner, Dirk; Wilkinson, Guy; Williams, Matthew; Williams, Mike; Wilschut, Hans; Wilson, Fergus; Wimberley, Jack; Wishahi, Julian; Wislicki, Wojciech; Witek, Mariusz; Wormser, Guy; Wotton, Stephen; Wright, Simon; Wyllie, Kenneth; Xie, Yuehong; Xing, Zhou; Xu, Zhirui; Yang, Zhenwei; Yuan, Xuhao; Yushchenko, Oleg; Zangoli, Maria; Zavertyaev, Mikhail; Zhang, Liming; Zhang, Wen Chao; Zhang, Yanxi; Zhelezov, Alexey; Zhokhov, Anatoly; Zhong, Liang; Zvyagin, Alexander

    2014-12-05

    Measuring cross-sections at the LHC requires the luminosity to be determined accurately at each centre-of-mass energy $\sqrt{s}$. In this paper results are reported from the luminosity calibrations carried out at the LHC interaction point 8 with the LHCb detector for $\sqrt{s}$ = 2.76, 7 and 8 TeV (proton-proton collisions) and for $\sqrt{s_{NN}}$ = 5 TeV (proton-lead collisions). Both the "van der Meer scan" and "beam-gas imaging" luminosity calibration methods were employed. It is observed that the beam density profile cannot always be described by a function that is factorizable in the two transverse coordinates. The introduction of a two-dimensional description of the beams improves significantly the consistency of the results. For proton-proton interactions at $\sqrt{s}$ = 8 TeV a relative precision of the luminosity calibration of 1.47% is obtained using van der Meer scans and 1.43% using beam-gas imaging, resulting in a combined precision of 1.12%. Applying the calibration to the full data set determin...
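
    As a quick plausibility check on the quoted numbers, the snippet below applies the standard inverse-variance combination to the two calibration precisions. It gives about 1.03%, smaller than the published 1.12%, because the naive formula ignores the correlated systematic uncertainties shared by the two methods.

```python
# Hedged sketch: naive combination of independent relative uncertainties.
# The published combined value (1.12%) is larger because the van der Meer
# and beam-gas results share correlated systematics, ignored here.
def combine_uncorrelated(*sigmas):
    """Inverse-variance combination, valid only for independent inputs."""
    return sum(1.0 / s ** 2 for s in sigmas) ** -0.5

print(f"{combine_uncorrelated(1.47, 1.43):.2f}%")  # ~1.03%, vs 1.12% published
```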

  6. Laser precision microfabrication in Japan

    Science.gov (United States)

    Miyamoto, Isamu; Ooie, Toshihiko; Takeno, Shozui

    2000-11-01

    Electronic devices such as mobile phones and microcomputers have rapidly expanded their market in recent years owing to enhanced performance, downsizing and cost reduction. This has been realized by innovation in the precision micro-fabrication technology of semiconductors and printed wiring circuit boards (PWBs), where laser technologies such as lithography, drilling, trimming, welding and soldering play an important role. In photolithography, for instance, KrF excimer lasers with a resolution of 0.18 micrometers have been used in production instead of mercury lamps. Laser drilling of PWBs has reached rates of over 1000 holes per second, and approximately 800 laser drilling systems for PWBs are expected to be delivered to the world market this year; most of these laser processing systems are manufactured in Japan. Trends in laser micro-fabrication in Japanese industry are described, along with recent topics in R&D, government-supported projects and future tasks of industrial laser precision micro-fabrication, on the basis of a survey conducted by the Japan Laser Processing Society.

  7. Precision experiments in electroweak interactions

    International Nuclear Information System (INIS)

    Swartz, M.L.

    1990-03-01

    The electroweak theory of Glashow, Weinberg, and Salam (GWS) has become one of the twin pillars upon which our understanding of all particle physics phenomena rests. It is a brilliant achievement that qualitatively and quantitatively describes all of the vast quantity of experimental data that have been accumulated over some forty years. Note that the word quantitatively must be qualified. The low energy limiting cases of the GWS theory, Quantum Electrodynamics and the V-A Theory of Weak Interactions, have withstood rigorous testing. The high energy synthesis of these ideas, the GWS theory, has not yet been subjected to comparably precise scrutiny. The recent operation of a new generation of proton-antiproton (p̄p) and electron-positron (e⁺e⁻) colliders has made it possible to produce and study large samples of the electroweak gauge bosons W± and Z⁰. We expect that these facilities will enable very precise tests of the GWS theory to be performed in the near future. In keeping with the theme of this Institute, Physics at the 100 GeV Mass Scale, these lectures will explore the current status and the near-future prospects of these experiments

  8. Antihydrogen production and precision experiments

    International Nuclear Information System (INIS)

    Nieto, M.M.; Goldman, T.; Holzscheiter, M.H.

    1996-01-01

    The study of CPT invariance with the highest achievable precision in all particle sectors is of fundamental importance for physics. Equally important is the question of the gravitational acceleration of antimatter. In recent years, impressive progress has been achieved in capturing antiprotons in specially designed Penning traps, in cooling them to energies of a few milli-electron volts, and in storing them for hours in a small volume of space. Positrons have been accumulated in large numbers in similar traps, and low energy positron or positronium beams have been generated. Finally, steady progress has been made in trapping and cooling neutral atoms. Thus the ingredients to form antihydrogen at rest are at hand. Once antihydrogen atoms have been captured at low energy, spectroscopic methods can be applied to interrogate their atomic structure with extremely high precision and compare it to its normal matter counterpart, the hydrogen atom. Especially the 1S-2S transition, with a lifetime of the excited state of 122 msec and thereby a natural linewidth of 5 parts in 10¹⁶, offers in principle the possibility to directly compare matter and antimatter properties at a level of 1 part in 10¹⁶

  9. Laser fusion and precision engineering

    International Nuclear Information System (INIS)

    Nakai, Sadao

    1989-01-01

    The development of laser fusion energy, aimed at achieving energy self-sufficiency for Japan and establishing a long-term national perspective, rests on a wide range of advanced science and technology. Its promotion is therefore expected to act as a powerful driver for the development of the creative science and technology that Japan particularly needs. Research on laser fusion has advanced steadily in elucidating the physics of pellet implosion, its basic concept, and the parameters of compressed plasmas. A neutron yield of 10¹³ was achieved in September 1986, and compression to 600 times solid density in October 1988. Based on these results, laser fusion research is now positioned to pursue the ignition condition and the realization of break-even. Optical components, high-power laser technology, fuel pellet production, high-resolution measurement, the simulation of implosion using a supercomputer and so on are closely related to precision engineering. In this report, the mechanism of laser fusion, the present status of its research, and the underlying technologies and precision engineering are described. (K.I.)

  10. The Precision Field Lysimeter Concept

    Science.gov (United States)

    Fank, J.

    2009-04-01

    The understanding and interpretation of leaching processes have improved significantly during the past decades. Unlike laboratory experiments, which are mostly performed under very controlled conditions (e.g. homogeneous, uniform packing of pre-treated test material, saturated steady-state flow conditions, and controlled uniform hydraulic conditions), lysimeter experiments generally simulate actual field conditions. Lysimeters may be classified according to different criteria such as type of soil block used (monolithic or reconstructed), drainage (drainage by gravity or vacuum or a water table may be maintained), or weighing or non-weighing lysimeters. In 2004, experimental investigations were set up to assess the impact of different farming systems on groundwater quality of the shallow floodplain aquifer of the river Mur in Wagna (Styria, Austria). The sediment is characterized by a thin layer (30 - 100 cm) of sandy Dystric Cambisol and underlying gravel and sand. Three precisely weighing equilibrium tension block lysimeters have been installed in agricultural test fields to compare water flow and solute transport under (i) organic farming, (ii) conventional low input farming and (iii) extensification by mulching grass. Specific monitoring equipment is used to reduce the well-known shortcomings of lysimeter investigations: The lysimeter core is excavated as an undisturbed monolithic block (circular, 1 m² surface area, 2 m depth) to prevent destruction of the natural soil structure and pore system. Tracing experiments have been carried out to investigate the occurrence of artificial preferential flow and transport along the walls of the lysimeters. The results show that such effects can be neglected. Precisely weighing load cells are used to constantly determine the weight loss of the lysimeter due to evaporation and transpiration and to measure different forms of precipitation. The accuracy of the weighing apparatus is 0.05 kg, or 0.05 mm water equivalent
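
    The equivalence between the 0.05 kg weighing accuracy and 0.05 mm of water follows directly from the lysimeter geometry: over a 1 m² surface, 1 kg of water (density about 1000 kg/m³) forms a 1 mm layer. A minimal conversion helper, for illustration only:

```python
# Convert a lysimeter mass change to water-equivalent depth.
# 1 kg of water (~1000 kg/m^3) over 1 m^2 is a 1 mm layer.
def mass_change_to_water_mm(delta_kg, area_m2=1.0, rho_kg_m3=1000.0):
    """Water-equivalent depth (mm) for a mass change (kg) on a lysimeter."""
    return delta_kg / (rho_kg_m3 * area_m2) * 1000.0  # metres -> millimetres

print(mass_change_to_water_mm(0.05))  # 0.05 mm, matching the quoted accuracy
```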

  11. Precision pharmacology for Alzheimer's disease.

    Science.gov (United States)

    Hampel, Harald; Vergallo, Andrea; Aguilar, Lisi Flores; Benda, Norbert; Broich, Karl; Cuello, A Claudio; Cummings, Jeffrey; Dubois, Bruno; Federoff, Howard J; Fiandaca, Massimo; Genthon, Remy; Haberkamp, Marion; Karran, Eric; Mapstone, Mark; Perry, George; Schneider, Lon S; Welikovitch, Lindsay A; Woodcock, Janet; Baldacci, Filippo; Lista, Simone

    2018-04-01

    The complex multifactorial nature of polygenic Alzheimer's disease (AD) presents significant challenges for drug development. AD pathophysiology is progressing in a non-linear dynamic fashion across multiple systems levels - from molecules to organ systems - and through adaptation, to compensation, and decompensation to systems failure. Adaptation and compensation maintain homeostasis: a dynamic equilibrium resulting from the dynamic non-linear interaction between genome, epigenome, and environment. An individual vulnerability to stressors exists on the basis of individual triggers, drivers, and thresholds accounting for the initiation and failure of adaptive and compensatory responses. Consequently, the distinct pattern of AD pathophysiology in space and time must be investigated on the basis of the individual biological makeup. This requires the implementation of systems biology and neurophysiology to facilitate Precision Medicine (PM) and Precision Pharmacology (PP). The regulation of several processes at multiple levels of complexity from gene expression to cellular cycle to tissue repair and system-wide network activation has different time delays (temporal scale) according to the affected systems (spatial scale). The initial failure might originate and occur at every level potentially affecting the whole dynamic interrelated systems within an organism. Unraveling the spatial and temporal dynamics of non-linear pathophysiological mechanisms across the continuum of hierarchical self-organized systems levels and from systems homeostasis to systems failure is key to understand AD. Measuring and, possibly, controlling space- and time-scaled adaptive and compensatory responses occurring during AD will represent a crucial step to achieve the capacity to substantially modify the disease course and progression at the best suitable timepoints, thus counteracting disrupting critical pathophysiological inputs. This approach will provide the conceptual basis for effective

  12. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-01-01

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  13. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated...

  14. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  15. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  16. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  17. Precision cosmology and the landscape

    International Nuclear Information System (INIS)

    Bousso, Raphael; Bousso, Raphael

    2006-01-01

    After reviewing the cosmological constant problem--why is Lambda not huge?--I outline the two basic approaches that had emerged by the late 1980s, and note that each made a clear prediction. Precision cosmological experiments now indicate that the cosmological constant is nonzero. This result strongly favors the environmental approach, in which vacuum energy can vary discretely among widely separated regions in the universe. The need to explain this variation from first principles constitutes an observational constraint on fundamental theory. I review arguments that string theory satisfies this constraint, as it contains a dense discretuum of metastable vacua. The enormous landscape of vacua calls for novel, statistical methods of deriving predictions, and it prompts us to reexamine our description of spacetime on the largest scales. I discuss the effects of cosmological dynamics, and I speculate that weighting vacua by their entropy production may allow for prior-free predictions that do not resort to explicitly anthropic arguments

  18. GPS Precision Timing at CERN

    CERN Document Server

    Beetham, C G

    1999-01-01

    For the past decade, the Global Positioning System (GPS) has been used to provide precise time, frequency and position co-ordinates world-wide. Recently, equipment has become available specialising in providing extremely accurate timing information, referenced to Universal Time Co-ordinates (UTC). This feature has been used at CERN to provide time of day information for systems that have been installed in the Proton Synchrotron (PS), Super Proton Synchrotron (SPS) and the Large Electron Positron (LEP) machines. The different systems are described as well as the planned developments, particularly with respect to optical transmission and the Inter-Range Instrumentation Group IRIG-B standard, for future use in the Large Hadron Collider (LHC).

  19. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

    Full Text Available Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen) genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. This approach first selects sequences containing the SNPs present in all alleles of the locus of interest, and next calculates the number of base changes necessary to convert a candidate probe sequence to the closest subsequence within the set of sequences likely to be present in the sample (including the remainder of the human genome), in order to identify those candidate probes which are "ultraspecific" for the allele of interest. Due to the high specificity of these sequences, it is possible that preliminary steps such as PCR amplification are no longer necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest resolution typing can be achieved for the minimal cost of production. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides a higher resolution than has previously been developed and includes more alleles than previously considered. Based upon the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.
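
    The specificity test at the heart of this design can be viewed as a nearest-subsequence search: a probe qualifies as "ultraspecific" when even the closest same-length window in the background requires several base changes. The toy scan below illustrates the idea; a real pipeline would use a genome index rather than a naive loop, and the sequences are made up.

```python
# Hedged sketch of the ultraspecificity test: minimum Hamming distance from
# a candidate probe to any window of a background sequence. Toy inputs only.
def min_hamming_to_background(probe, background):
    """Smallest number of base changes turning probe into any window."""
    k, best = len(probe), len(probe)
    for i in range(len(background) - k + 1):
        d = sum(a != b for a, b in zip(probe, background[i:i + k]))
        best = min(best, d)
    return best

# Keep the probe only if the closest background match needs several
# mismatches; the threshold is a design choice, not from the paper.
print(min_hamming_to_background("ACGTTGCA", "TTTTACGATGCATTTT"))
```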

  20. Towards a compact and precise sample holder for macromolecular crystallography.

    Science.gov (United States)

    Papp, Gergely; Rossi, Christopher; Janocha, Robert; Sorez, Clement; Lopez-Marrero, Marcos; Astruc, Anthony; McCarthy, Andrew; Belrhali, Hassan; Bowler, Matthew W; Cipriani, Florent

    2017-10-01

    Most of the sample holders currently used in macromolecular crystallography offer limited storage density and poor initial crystal-positioning precision upon mounting on a goniometer. This has now become a limiting factor at high-throughput beamlines, where data collection can be performed in a matter of seconds. Furthermore, this lack of precision limits the potential benefits emerging from automated harvesting systems that could provide crystal-position information which would further enhance alignment at beamlines. This situation provided the motivation for the development of a compact and precise sample holder with corresponding pucks, handling tools and robotic transfer protocols. The development process included four main phases: design, prototype manufacture, testing with a robotic sample changer and validation under real conditions on a beamline. Two sample-holder designs are proposed: NewPin and miniSPINE. They share the same robot gripper and allow the storage of 36 sample holders in uni-puck footprint-style pucks, which represents 252 samples in a dry-shipping dewar commonly used in the field. The pucks are identified with human- and machine-readable codes, as well as with radio-frequency identification (RFID) tags. NewPin offers a crystal-repositioning precision of up to 10 µm but requires a specific goniometer socket. The storage density could reach 64 samples using a special puck designed for fully robotic handling. miniSPINE is less precise but uses a goniometer mount compatible with the current SPINE standard. miniSPINE is proposed for the first implementation of the new standard, since it is easier to integrate at beamlines. An upgraded version of the SPINE sample holder with a corresponding puck named SPINEplus is also proposed in order to offer a homogenous and interoperable system. The project involved several European synchrotrons and industrial companies in the fields of consumables and sample-changer robotics. Manual handling of mini

  1. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report. Daniel G. Gdula et al.

  2. Development of sensor guided precision sprayers

    NARCIS (Netherlands)

    Nieuwenhuizen, A.T.; Zande, van de J.C.

    2013-01-01

    Sensor guided precision sprayers were developed to automate the spray process with a focus on emission reduction and identical or increased efficacy, with the precision agriculture concept in mind. Within the project “Innovations2” sensor guided precision sprayers were introduced to leek,

  3. Droplet electrospray ionization mass spectrometry for high throughput screening for enzyme inhibitors.

    Science.gov (United States)

    Sun, Shuwen; Kennedy, Robert T

    2014-09-16

    High throughput screening (HTS) is important for identifying molecules with desired properties. Mass spectrometry (MS) is potentially powerful for label-free HTS due to its high sensitivity, speed, and resolution. Segmented flow, where samples are manipulated as droplets separated by an immiscible fluid, is an intriguing format for high throughput MS because it can be used to reliably and precisely manipulate nanoliter volumes and can be directly coupled to electrospray ionization (ESI) MS for rapid analysis. In this study, we describe a "MS Plate Reader" that couples standard multiwell plate HTS workflow to droplet ESI-MS. The MS plate reader can reformat 3072 samples from eight 384-well plates into nanoliter droplets segmented by an immiscible oil at 4.5 samples/s and sequentially analyze them by MS at 2 samples/s. Using the system, a label-free screen for cathepsin B modulators against 1280 chemicals was completed in 45 min with a high Z-factor (>0.72) and no false positives (24 of 24 hits confirmed). The assay revealed 11 structures not previously linked to cathepsin inhibition. For even larger scale screening, reformatting and analysis could be conducted simultaneously, which would enable more than 145,000 samples to be analyzed in 1 day.
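
    The Z-factor quoted for the screen is the standard screening-window statistic of Zhang et al. (1999), where values above roughly 0.5 indicate an excellent assay. A minimal implementation, with made-up control readings:

```python
# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg| (Zhang et al., 1999).
import numpy as np

def z_factor(pos, neg):
    """Screening-window coefficient from positive/negative control wells."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Toy control wells: well-separated controls give Z' close to 1.
print(round(z_factor([98, 101, 99, 102], [5, 7, 6, 4]), 2))
```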

  4. A direct comparison of remote sensing approaches for high-throughput phenotyping in plant breeding

    Directory of Open Access Journals (Sweden)

    Maria Tattaris

    2016-08-01

    Full Text Available Remote sensing (RS) of plant canopies permits non-intrusive, high-throughput monitoring of plant physiological characteristics. This study compared three RS approaches: a low-flying UAV (unmanned aerial vehicle), proximal sensing, and satellite-based imagery. Two physiological traits were considered, canopy temperature (CT) and a vegetation index (NDVI), to determine the most viable approaches for large-scale crop genetic improvement. The UAV-based platform achieves plot-level resolution while measuring several hundred plots in one mission via high-resolution thermal and multispectral imagery measured at altitudes of 30-100 m. The satellite measures multispectral imagery from an altitude of 770 km. Information was compared with proximal measurements using IR thermometers and an NDVI sensor at a distance of 0.5-1 m above plots. For robust comparisons, CT and NDVI were assessed on panels of elite cultivars under irrigated and drought conditions, in different thermal regimes, and on un-adapted genetic resources under water deficit. Correlations between airborne data and yield/biomass at maturity were generally higher than equivalent proximal correlations. NDVI was derived from high-resolution satellite imagery only for larger-sized plots (8.5 × 2.4 m) due to restricted pixel density. Results support the use of UAV-based RS techniques for high-throughput phenotyping for both precision and efficiency.
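
    NDVI, one of the two traits compared above, is a simple per-pixel ratio of near-infrared and red reflectance and is computed identically for UAV, proximal, and satellite data. A minimal sketch with illustrative reflectance values:

```python
# NDVI = (NIR - Red) / (NIR + Red), computed per pixel or per plot.
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index; eps guards against 0/0."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + eps)

print(float(ndvi(0.45, 0.08)))  # ~0.70: a dense, healthy canopy
```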

  5. Development and operation of a high-throughput accurate-wavelength lens-based spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Bell, Ronald E., E-mail: rbell@pppl.gov [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)

    2014-11-15

    A high-throughput spectrometer for the 400–820 nm wavelength range has been developed for charge exchange recombination spectroscopy or general spectroscopy. A large 2160 mm⁻¹ grating is matched with fast f/1.8 200 mm lenses, which provide stigmatic imaging. A precision optical encoder measures the grating angle with an accuracy ≤0.075 arc sec. A high quantum efficiency low-etaloning CCD detector allows operation at longer wavelengths. A patch panel allows input fibers to interface with interchangeable fiber holders that attach to a kinematic mount at the entrance slit. Computer-controlled hardware allows automated control of wavelength, timing, f-number, automated data collection, and wavelength calibration.
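
    The wavelength scale of such a spectrometer rests on the grating equation mλ = d(sin α + sin β), with the grating angle read from the precision encoder. The sketch below assumes an illustrative fixed included angle between the entrance and exit arms; it is not the instrument's actual geometry.

```python
# Hedged sketch: first-order wavelength from the grating equation for a
# 2160 mm^-1 grating. The 30-degree included angle is an assumption.
import math

def wavelength_nm(theta_deg, grooves_per_mm=2160.0, order=1, included_deg=30.0):
    """Diffracted wavelength (nm) for grating rotation angle theta (deg)."""
    d_nm = 1e6 / grooves_per_mm                    # groove spacing in nm
    half = math.radians(included_deg) / 2.0
    theta = math.radians(theta_deg)
    return d_nm * (math.sin(theta + half) + math.sin(theta - half)) / order

print(f"{wavelength_nm(45.0):.1f} nm")  # ~632.5 nm for these toy parameters
```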

  6. High-Throughput Particle Manipulation Based on Hydrodynamic Effects in Microchannels

    Directory of Open Access Journals (Sweden)

    Chao Liu

    2017-03-01

    Full Text Available Microfluidic techniques are effective tools for precise manipulation of particles and cells, whose enrichment and separation is crucial for a wide range of applications in biology, medicine, and chemistry. Recently, lateral particle migration induced by the intrinsic hydrodynamic effects in microchannels, such as inertia and elasticity, has shown its promise for high-throughput and label-free particle manipulation. The particle migration can be engineered to realize the controllable focusing and separation of particles based on a difference in size. The widespread use of inertial and viscoelastic microfluidics depends on the understanding of hydrodynamic effects on particle motion. This review will summarize the progress in the fundamental mechanisms and key applications of inertial and viscoelastic particle manipulation.

  7. High-throughput analysis of amino acids in plant materials by single quadrupole mass spectrometry

    DEFF Research Database (Denmark)

    Dahl-Lassen, Rasmus; van Hecke, Jan Julien Josef; Jørgensen, Henning

    2018-01-01

    that it is very time consuming, with typical chromatographic run times of 70 min or more. Results: We have here developed a high-throughput method for analysis of amino acid profiles in plant materials. The method combines classical protein hydrolysis and derivatization with fast separation by UHPLC and detection by a single quadrupole (QDa) mass spectrometer. The chromatographic run time is reduced to 10 min and the precision, accuracy and sensitivity of the method are in line with other recent methods utilizing advanced and more expensive mass spectrometers. The sensitivity of the method is at least a factor 10..., reducing the overall analytical costs compared to methods based on more advanced mass spectrometers.

  8. Ratiometric fluorescent pH-sensitive polymers for high-throughput monitoring of extracellular pH

    OpenAIRE

    Zhang, Liqiang; Su, Fengyu; Kong, Xiangxing; Lee, Fred; Day, Kevin; Gao, Weimin; Vecera, Mary E.; Sohr, Jeremy M.; Buizer, Sean; Tian, Yanqing; Meldrum, Deirdre R

    2016-01-01

    Extracellular pH has a strong effect on cell metabolism and growth. Precisely detecting extracellular pH with high throughput is critical for cell metabolism research and fermentation applications. In this research, a series of ratiometric fluorescent pH-sensitive polymers are developed and the ps-pH-neutral is characterized as the best one for exclusive detection of extracellular pH. Poly(N-(2-hydroxypropyl)methacrylamide) (PHPMA) is used as the host polymer to increase the water solubility ...
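
    Ratiometric readouts of this kind typically map the ratio R of two emission intensities onto pH through a sigmoidal, Henderson-Hasselbalch-style calibration between the acid and base limiting ratios. The sketch below inverts such a calibration; the pKa and limiting ratios are illustrative, not the polymer's measured values.

```python
# Hedged sketch: invert a ratiometric pH calibration curve.
# R_min, R_max and pKa are illustrative placeholders.
import math

def ph_from_ratio(R, R_min=0.2, R_max=3.0, pKa=7.0):
    """pH = pKa + log10((R - R_min) / (R_max - R)), valid for R_min < R < R_max."""
    return pKa + math.log10((R - R_min) / (R_max - R))

print(round(ph_from_ratio(1.6), 2))  # mid-range ratio -> pH near the pKa
```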

  9. Precision measurements with atom interferometry

    Science.gov (United States)

    Schubert, Christian; Abend, Sven; Schlippert, Dennis; Ertmer, Wolfgang; Rasel, Ernst M.

    2017-04-01

    Interferometry with matter waves enables precise measurements of rotations, accelerations, and differential accelerations [1-5]. This is exploited for determining fundamental constants [2], in fundamental science as e.g. testing the universality of free fall [3], and is applied for gravimetry [4], and gravity gradiometry [2,5]. At the Institut für Quantenoptik in Hannover, different approaches are pursued. A large scale device is designed and currently being set up to investigate the gain in precision for gravimetry, gradiometry, and fundamental tests on large baselines [6]. For field applications, a compact and transportable device is being developed. Its key feature is an atom chip source providing a collimated high flux of atoms which is expected to mitigate systematic uncertainties [7,8]. The atom chip technology and miniaturization benefits from microgravity experiments in the drop tower in Bremen and sounding rocket experiments [8,9] which act as pathfinders for space borne operation [10]. This contribution will report about our recent results. The presented work is supported by the CRC 1227 DQ-mat, the CRC 1128 geo-Q, the RTG 1729, the QUEST-LFS, and by the German Space Agency (DLR) with funds provided by the Federal Ministry of Economic Affairs and Energy (BMWi) due to an enactment of the German Bundestag under Grant No. DLR 50WM1552-1557. [1] P. Berg et al., Phys. Rev. Lett., 114, 063002, 2015; I. Dutta et al., Phys. Rev. Lett., 116, 183003, 2016. [2] J. B. Fixler et al., Science 315, 74 (2007); G. Rosi et al., Nature 510, 518, 2014. [3] D. Schlippert et al., Phys. Rev. Lett., 112, 203002, 2014. [4] A. Peters et al., Nature 400, 849, 1999; A. Louchet-Chauvet et al., New J. Phys. 13, 065026, 2011; C. Freier et al., J. of Phys.: Conf. Series 723, 012050, 2016. [5] J. M. McGuirk et al., Phys. Rev. A 65, 033608, 2002; P. Asenbaum et al., arXiv:1610.03832. [6] J. Hartwig et al., New J. Phys. 17, 035011, 2015. [7] H. Ahlers et al., Phys. Rev. Lett. 116, 173601

  10. The NCI Genomic Data Commons as an engine for precision medicine.

    Science.gov (United States)

    Jensen, Mark A; Ferretti, Vincent; Grossman, Robert L; Staudt, Louis M

    2017-07-27

    The National Cancer Institute Genomic Data Commons (GDC) is an information system for storing, analyzing, and sharing genomic and clinical data from patients with cancer. The recent high-throughput sequencing of cancer genomes and transcriptomes has produced a big data problem that precludes many cancer biologists and oncologists from gleaning knowledge from these data regarding the nature of malignant processes and the relationship between tumor genomic profiles and treatment response. The GDC aims to democratize access to cancer genomic data and to foster the sharing of these data to promote precision medicine approaches to the diagnosis and treatment of cancer.

  11. Development of a mobile multispectral imaging platform for precise field phenotyping

    DEFF Research Database (Denmark)

    Svensgaard, Jesper; Roitsch, Thomas Georg; Christensen, Svend

    2014-01-01

    nitrogen levels, replicated on two different soil types at four different dates from 15 May (BBCH 13) to 18 June (BBCH 41 to 57). The images were analyzed and derived vegetation coverage and Normalized Difference Vegetation index (NDVI) were used to assess varietal differences. The results showed...... potentials for differentiating between the varieties using both vegetation coverage and NDVI, especially at the early growth stages. The perspectives of high-precision and high-throughput imaging for field phenotyping are discussed including the potentials of measuring varietal differences via spectral...

  12. Precision is in their nature

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    There are more than 100 of them in the LHC ring and they have a total of about 400 degrees of freedom. Each one has 4 motors and the newest ones have their own beam-monitoring pickups. Their jaws constrain the relativistic, high-energy particles to a very small transverse area and protect the machine aperture. We are speaking about the LHC collimators, those ultra-precise instruments that leave escaping unstable particles no chance.   The internal structure of a new LHC collimator featuring (see red arrow) one of the beam position monitor's pickups. Designed at CERN but mostly produced by very specialised manufacturers in Europe, the LHC collimators are among the most complex elements of the accelerator. Their job is to control and safely dispose of the halo particles that are produced by unavoidable beam losses from the circulating beam core. “The LHC collimation system has been designed to ensure that beam losses in superconducting magnets remain below quench limits in al...

  13. The Age of Precision Cosmology

    Science.gov (United States)

    Chuss, David T.

    2012-01-01

    In the past two decades, our understanding of the evolution and fate of the universe has increased dramatically. This "Age of Precision Cosmology" has been ushered in by measurements that have both elucidated the details of the Big Bang cosmology and set the direction for future lines of inquiry. Our universe appears to consist of 5% baryonic matter; 23% of the universe's energy content is dark matter, which is responsible for the observed structure in the universe; and 72% of the energy density is so-called "dark energy" that is currently accelerating the expansion of the universe. In addition, our universe has been measured to be geometrically flat to 1%. These observations and related details of the Big Bang paradigm have hinted that the universe underwent an epoch of accelerated expansion known as "inflation" early in its history. In this talk, I will review the highlights of modern cosmology, focusing on the contributions made by measurements of the cosmic microwave background, the faint afterglow of the Big Bang. I will also describe new instruments designed to measure the polarization of the cosmic microwave background in order to search for evidence of cosmic inflation.

  14. High precision redundant robotic manipulator

    International Nuclear Information System (INIS)

    Young, K.K.D.

    1998-01-01

    A high precision redundant robotic manipulator for overcoming constraints imposed by obstacles or by a highly congested work space is disclosed. One embodiment of the manipulator has four degrees of freedom and another embodiment has seven degrees of freedom. Each of the embodiments utilizes a first selective compliant assembly robot arm (SCARA) configuration to provide high stiffness in the vertical plane and a second SCARA configuration to provide high stiffness in the horizontal plane. The seven-degree-of-freedom embodiment also utilizes kinematic redundancy to provide the capability of avoiding obstacles that lie between the base of the manipulator and the end effector or link of the manipulator. These additional three degrees of freedom are added at the wrist link of the manipulator to provide pitch, yaw and roll. The seven-degree-of-freedom embodiment uses one revolute joint per degree of freedom. For each of the revolute joints, a harmonic gear coupled to an electric motor is introduced, and together with properly designed servo controllers provides an end-point repeatability of less than 10 microns. 3 figs

  15. Studying antimatter with laser precision

    CERN Multimedia

    Katarina Anthony

    2012-01-01

    The next generation of antihydrogen trapping devices, ALPHA-2, is moving into CERN’s Antiproton Decelerator (AD) hall. This brand-new experiment will allow the ALPHA collaboration to conduct studies of antimatter with greater precision. ALPHA spokesperson Jeffrey Hangst was recently awarded a grant by the Carlsberg Foundation, which will be used to purchase equipment for the new experiment.   A 3-D view of the new magnet (in blue) and cryostat. The red lines show the paths of laser beams. LHC-type current leads for the superconducting magnets are visible on the top-right of the image. The ALPHA collaboration has been working to trap and study antihydrogen since 2006. Using antiprotons provided by CERN’s Antiproton Decelerator (AD), ALPHA was the first experiment to trap antihydrogen and to hold it long enough to study its properties. “The new ALPHA-2 experiment will use integrated lasers to probe the trapped antihydrogen,” explains Jeffrey Hangst, ALP...

  16. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    Directory of Open Access Journals (Sweden)

    Prasanth VP

    2006-08-01

    Full Text Available Abstract Background With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow. Results A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is presently being used for the capture of high-throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusion A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping
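
    The "auto-binning of alleles" mentioned above amounts to clustering raw fragment-size estimates into discrete allele bins. The abstract does not specify the algorithm; the simple gap-based one-dimensional clustering below is one plausible sketch of how such binning could work.

```python
# Hedged sketch of SSR allele auto-binning: group sorted fragment sizes,
# starting a new bin whenever the gap exceeds a threshold (illustrative).
def bin_alleles(sizes, min_gap=1.5):
    """Return one rounded representative size per allele bin."""
    bins, current = [], []
    for s in sorted(sizes):
        if current and s - current[-1] > min_gap:
            bins.append(current)
            current = []
        current.append(s)
    bins.append(current)
    return [round(sum(b) / len(b)) for b in bins]

print(bin_alleles([182.1, 182.4, 184.3, 184.6, 190.0]))  # -> [182, 184, 190]
```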

  17. High precision anatomy for MEG.

    Science.gov (United States)

    Troebinger, Luzia; López, José David; Lutti, Antoine; Bradbury, David; Bestmann, Sven; Barnes, Gareth

    2014-02-01

    Precise MEG estimates of neuronal current flow are undermined by uncertain knowledge of the head location with respect to the MEG sensors. This is either due to head movements within the scanning session or systematic errors in co-registration to anatomy. Here we show how such errors can be minimized using subject-specific head-casts produced using 3D printing technology. The casts fit the scalp of the subject internally and the inside of the MEG dewar externally, reducing within session and between session head movements. Systematic errors in matching to MRI coordinate system are also reduced through the use of MRI-visible fiducial markers placed on the same cast. Bootstrap estimates of absolute co-registration error were of the order of 1 mm. Estimates of relative co-registration error were < 1.5 mm between sessions. We corroborated these scalp based estimates by looking at the MEG data recorded over a 6 month period. We found that the between session sensor variability of the subject's evoked response was of the order of the within session noise, showing no appreciable noise due to between-session movement. Simulations suggest that the between-session sensor level amplitude SNR improved by a factor of 5 over conventional strategies. We show that at this level of coregistration accuracy there is strong evidence for anatomical models based on the individual rather than canonical anatomy; but that this advantage disappears for errors of greater than 5 mm. This work paves the way for source reconstruction methods which can exploit very high SNR signals and accurate anatomical models; and also significantly increases the sensitivity of longitudinal studies with MEG. © 2013. Published by Elsevier Inc. All rights reserved.

  18. High precision anatomy for MEG☆

    Science.gov (United States)

    Troebinger, Luzia; López, José David; Lutti, Antoine; Bradbury, David; Bestmann, Sven; Barnes, Gareth

    2014-01-01

    Precise MEG estimates of neuronal current flow are undermined by uncertain knowledge of the head location with respect to the MEG sensors. This is due either to head movements within the scanning session or to systematic errors in co-registration to anatomy. Here we show how such errors can be minimized using subject-specific head-casts produced using 3D printing technology. The casts fit the scalp of the subject internally and the inside of the MEG dewar externally, reducing within-session and between-session head movements. Systematic errors in matching to the MRI coordinate system are also reduced through the use of MRI-visible fiducial markers placed on the same cast. Bootstrap estimates of absolute co-registration error were of the order of 1 mm. Estimates of relative co-registration error were < 1.5 mm between sessions. We corroborated these scalp-based estimates by looking at the MEG data recorded over a 6-month period. We found that the between-session sensor variability of the subject's evoked response was of the order of the within-session noise, showing no appreciable noise due to between-session movement. Simulations suggest that the between-session sensor-level amplitude SNR improved by a factor of 5 over conventional strategies. We show that at this level of co-registration accuracy there is strong evidence for anatomical models based on the individual rather than canonical anatomy; but that this advantage disappears for errors of greater than 5 mm. This work paves the way for source reconstruction methods which can exploit very high SNR signals and accurate anatomical models, and also significantly increases the sensitivity of longitudinal studies with MEG. PMID:23911673

  19. Micro-patterned agarose gel devices for single-cell high-throughput microscopy of E. coli cells.

    Science.gov (United States)

    Priest, David G; Tanaka, Nobuyuki; Tanaka, Yo; Taniguchi, Yuichi

    2017-12-21

    High-throughput microscopy of bacterial cells has elucidated fundamental cellular processes including cellular heterogeneity and cell division homeostasis. Polydimethylsiloxane (PDMS)-based microfluidic devices provide advantages including precise positioning of cells and high throughput; however, device fabrication is time-consuming and requires specialised skills. Agarose pads are a popular alternative; however, cells often clump together, which hinders single-cell quantitation. Here, we imprint agarose pads with micro-patterned 'capsules', to trap individual cells, and 'lines', to direct cellular growth outwards in a straight line. We implement this micro-patterning into multi-pad devices called CapsuleHotel and LineHotel for high-throughput imaging. CapsuleHotel provides ~65,000 capsule structures per mm² that isolate individual Escherichia coli cells. In contrast, LineHotel provides ~300 line structures per mm that direct growth of micro-colonies. With CapsuleHotel, a quantitative single-cell dataset of ~10,000 cells across 24 samples can be acquired and analysed in under 1 hour. LineHotel allows tracking growth of >10 micro-colonies across 24 samples simultaneously for up to 4 generations. These easy-to-use devices can be provided in kit format, and will accelerate discoveries in diverse fields ranging from microbiology to systems and synthetic biology.

  20. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong; Xu, Chao [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China)

    2016-04-15

    This paper proposes a new entropy extraction mechanism based on sampling phase jitter in ring oscillators, making a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in the FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with maximum entropy and a fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are exploited to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method, in a purely digital fashion, can provide high-speed, high-quality random bit sequences for a variety of embedded applications.
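
    The mechanism lends itself to a toy simulation: a free-running oscillator accumulates period jitter, so sampling its output with an unrelated clock yields increasingly unpredictable bits. The sketch below models that principle only, not the paper's FPGA design; the period, jitter and sampling values are invented for illustration.

    ```python
    import numpy as np

    # Toy model of jitter-based entropy extraction (all parameters assumed).
    rng = np.random.default_rng(1)

    T_RO = 1.0       # nominal ring-oscillator period (arbitrary units)
    JITTER = 0.02    # assumed RMS period jitter
    N_BITS = 100_000
    T_SAMPLE = 7.37  # sampling interval, incommensurate with T_RO

    # Edge times of the oscillator, with Gaussian jitter accumulating each period.
    n_periods = int(N_BITS * T_SAMPLE / T_RO) + 2
    edges = np.cumsum(T_RO + JITTER * rng.standard_normal(n_periods))
    sample_times = T_SAMPLE * np.arange(1, N_BITS + 1)

    # Output bit = whether the sample falls in the first half of the current period.
    idx = np.searchsorted(edges, sample_times)
    phase = (sample_times - edges[idx - 1]) / (edges[idx] - edges[idx - 1])
    bits = (phase < 0.5).astype(int)

    print("bit bias:", bits.mean())  # approaches 0.5 as jitter accumulates
    ```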

  1. [Progress in precision medicine: a scientific perspective].

    Science.gov (United States)

    Wang, B; Li, L M

    2017-01-10

    Precision medicine is a new strategy for disease prevention and treatment that takes into account differences in genetics, environment and lifestyle among individuals and makes precise disease classification and diagnosis, which can provide patients with personalized, targeted prevention and treatment. Large-scale population cohort studies are fundamental for precision medicine research, and can produce the best evidence for precision medicine practices. Current criticisms of precision medicine mainly focus on the very small proportion of patients who benefit, the neglect of social determinants of health, and the possible waste of limited medical resources. In spite of this, precision medicine is still a most promising research area, and would become a health care practice model in the future.

  2. Precision Medicine, Cardiovascular Disease and Hunting Elephants.

    Science.gov (United States)

    Joyner, Michael J

    2016-01-01

    Precision medicine postulates improved prediction, prevention, diagnosis and treatment of disease based on patient-specific factors, especially DNA sequence (i.e., gene) variants. Ideas related to precision medicine stem from the much-anticipated "genetic revolution in medicine" arising seamlessly from the human genome project (HGP). In this essay I deconstruct the concept of precision medicine and raise questions about the validity of the paradigm in general and its application to cardiovascular disease. Thus far precision medicine has underperformed relative to the vision promulgated by enthusiasts. While niche successes for precision medicine are likely, the promises of broad-based transformation should be viewed with skepticism. Open discussion and debate related to precision medicine are urgently needed to avoid misapplication of resources, hype, iatrogenic interventions, and distraction from established approaches with ongoing utility. Failure to engage in such debate will lead to negative unintended consequences from a revolution that might never come. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Improved precision and accuracy for microarrays using updated probe set definitions

    Directory of Open Access Journals (Sweden)

    Larsson Ola

    2007-02-01

    Full Text Available Abstract Background Microarrays enable high throughput detection of transcript expression levels. Different investigators have recently introduced updated probe set definitions to more accurately map probes to our current knowledge of genes and transcripts. Results We demonstrate that updated probe set definitions provide both better precision and accuracy in probe set estimates compared to the original Affymetrix definitions. We show that the improved precision mainly depends on the increased number of probes that are integrated into each probe set, but we also demonstrate an improvement when the same number of probes is used. Conclusion Updated probe set definitions not only offer expression levels that are more accurately associated with genes and transcripts but also improvements in the estimated transcript expression levels. These results support the use of updated probe set definitions for analysis and meta-analysis of microarray data.
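
    The precision argument is basic averaging statistics: a probe set estimate formed as the mean of n probe-level signals has a standard error that shrinks roughly as 1/sqrt(n), so folding more probes into a set improves precision. A hedged numerical illustration (all numbers invented, not the paper's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    TRUE_LOG_EXPR = 8.0   # assumed true expression level (log2 scale)
    PROBE_SD = 0.6        # assumed probe-level noise

    for n_probes in (4, 11, 20):
        # 10,000 simulated arrays; each probe set estimate = mean of its probes.
        probes = TRUE_LOG_EXPR + PROBE_SD * rng.standard_normal((10_000, n_probes))
        est = probes.mean(axis=1)
        print(f"{n_probes:2d} probes: SD of estimate = {est.std():.3f} "
              f"(theory {PROBE_SD / np.sqrt(n_probes):.3f})")
    ```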

  4. Active and Precise Control of Microdroplet Division Using Horizontal Pneumatic Valves in Bifurcating Microchannel

    Directory of Open Access Journals (Sweden)

    Shuichi Shoji

    2013-05-01

    Full Text Available This paper presents a microfluidic system for the active and precise control of microdroplet division in a micro device. Using two horizontal pneumatic valves formed downstream of a bifurcating microchannel, the flow resistances of the downstream channels were variably controlled. With this resistance control, the volumetric ratio of the downstream flows was changed, and water-in-oil microdroplets were divided into two daughter droplets of different volumes corresponding to the ratio. The microfluidic channels and pneumatic valves were fabricated by a single-step soft lithography process of PDMS (polydimethylsiloxane) using an SU-8 mold. A wide-range control of the daughter droplets’ volume ratio was achieved by the simple channel structure. The volumetric ratio between large and small daughter droplets ranged from 1 to 70, and a smallest droplet volume of 14 pL was obtained. The proposed microfluidic device is applicable for precise and high throughput droplet-based digital synthesis.
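
    The splitting principle fits in a few lines: under laminar flow each branch carries Q = ΔP/R, so for a shared junction pressure the daughter volumes split in inverse proportion to the branch resistances that the valves set. A minimal sketch of that relation (the 1000 pL mother volume is an assumed figure, not from the paper):

    ```python
    # Hedged sketch of the operating principle, not the authors' device control code.

    def daughter_volumes(v_mother_pl: float, r1: float, r2: float):
        """Split a mother droplet volume (pL) between branches of resistance r1, r2."""
        q1, q2 = 1.0 / r1, 1.0 / r2   # relative flows under a common pressure drop
        f1 = q1 / (q1 + q2)
        return v_mother_pl * f1, v_mother_pl * (1.0 - f1)

    # One branch squeezed to 70x the resistance of the other reproduces the
    # paper's ~1:70 volume ratio and a ~14 pL small daughter droplet.
    big, small = daughter_volumes(1000.0, 1.0, 70.0)
    print(f"large: {big:.0f} pL, small: {small:.0f} pL")
    ```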

  5. Applying Precision Medicine and Immunotherapy Advances from Oncology to Host-Directed Therapies for Infectious Diseases.

    Science.gov (United States)

    Mahon, Robert N; Hafner, Richard

    2017-01-01

    To meet the challenges of increasing antimicrobial resistance, the infectious disease community needs innovative therapeutics. Precision medicine and immunotherapies are transforming cancer therapeutics by targeting the regulatory signaling pathways that are involved not only in malignancies but also in the metabolic and immunologic function of the tumor microenvironment. Infectious diseases target many of the same regulatory pathways as they modulate host metabolic functions for their own nutritional requirements and to impede host immunity. These similarities, and the advances made in precision medicine and immuno-oncology that are relevant to the current development of host-directed therapies (HDTs) for infectious diseases, are discussed. To harness this potential, improvements in drug screening methods and the development of assays that utilize research tools, including high throughput multiplexes, already developed by oncology are essential. A multidisciplinary approach that brings together immunologists, infectious disease specialists, and oncologists will be necessary to fully develop the potential of HDTs.

  6. Sensing Technologies for Precision Phenotyping in Vegetable Crops: Current Status and Future Challenges

    Directory of Open Access Journals (Sweden)

    Pasquale Tripodi

    2018-04-01

    Full Text Available Increasing the ability to investigate plant function and structure through non-invasive methods with high accuracy has become a major target in plant breeding and precision agriculture. Emerging approaches in plant phenotyping play a key role in unraveling quantitative traits responsible for growth, production, quality, and resistance to various stresses. Beyond fully automatic phenotyping systems, several promising technologies can help accurately characterize a wide range of plant traits at affordable costs and with high throughput. In this review, we revisit the principles of proximal and remote sensing, describing the application of non-invasive devices for precision phenotyping applied to protected horticulture. The potential and constraints of big data management and integration with “omics” disciplines will also be discussed.

  7. ASSESSMENT OF SYSTEMATIC CHROMATIC ERRORS THAT IMPACT SUB-1% PHOTOMETRIC PRECISION IN LARGE-AREA SKY SURVEYS

    Energy Technology Data Exchange (ETDEWEB)

    Li, T. S.; DePoy, D. L.; Marshall, J. L.; Boada, S.; Mondrik, N.; Nagasawa, D. [George P. and Cynthia Woods Mitchell Institute for Fundamental Physics and Astronomy, and Department of Physics and Astronomy, Texas A and M University, College Station, TX 77843 (United States); Tucker, D.; Annis, J.; Finley, D. A.; Kent, S.; Lin, H.; Marriner, J.; Wester, W. [Fermi National Accelerator Laboratory, P.O. Box 500, Batavia, IL 60510 (United States); Kessler, R.; Scolnic, D. [Kavli Institute for Cosmological Physics, University of Chicago, Chicago, IL 60637 (United States); Bernstein, G. M. [Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Burke, D. L.; Rykoff, E. S. [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); James, D. J.; Walker, A. R. [Cerro Tololo Inter-American Observatory, National Optical Astronomy Observatory, Casilla 603, La Serena (Chile); Collaboration: DES Collaboration; and others

    2016-06-01

    Meeting the science goals for many current and future ground-based optical large-area sky surveys requires that the calibrated broadband photometry is both stable in time and uniform over the sky to 1% precision or better. Past and current surveys have achieved photometric precision of 1%–2% by calibrating the survey’s stellar photometry with repeated measurements of a large number of stars observed in multiple epochs. The calibration techniques employed by these surveys only consider the relative frame-by-frame photometric zeropoint offset and the focal plane position-dependent illumination corrections, which are independent of the source color. However, variations in the wavelength dependence of the atmospheric transmission and the instrumental throughput induce source color-dependent systematic errors. These systematic errors must also be considered to achieve the most precise photometric measurements. In this paper, we examine such systematic chromatic errors (SCEs) using photometry from the Dark Energy Survey (DES) as an example. We first define a natural magnitude system for DES and calculate the systematic errors on stellar magnitudes when the atmospheric transmission and instrumental throughput deviate from the natural system. We conclude that the SCEs caused by the change of airmass in each exposure, the change of the precipitable water vapor and aerosol in the atmosphere over time, and the non-uniformity of instrumental throughput over the focal plane can be up to 2% in some bandpasses. We then compare the calculated SCEs with the observed DES data. For the test sample data, we correct these errors using measurements of the atmospheric transmission and instrumental throughput from auxiliary calibration systems. The residual after correction is less than 0.3%. Moreover, we calculate such SCEs for Type Ia supernovae and elliptical galaxies and find that the chromatic errors for non-stellar objects are redshift-dependent and can be larger than those for stars at certain redshifts.

  8. Assessment of Systematic Chromatic Errors that Impact Sub-1% Photometric Precision in Large-Area Sky Surveys

    Energy Technology Data Exchange (ETDEWEB)

    Li, T. S. [et al.

    2016-05-27

    Meeting the science goals for many current and future ground-based optical large-area sky surveys requires that the calibrated broadband photometry is stable in time and uniform over the sky to 1% precision or better. Past surveys have achieved photometric precision of 1-2% by calibrating the survey's stellar photometry with repeated measurements of a large number of stars observed in multiple epochs. The calibration techniques employed by these surveys only consider the relative frame-by-frame photometric zeropoint offset and the focal plane position-dependent illumination corrections, which are independent of the source color. However, variations in the wavelength dependence of the atmospheric transmission and the instrumental throughput induce source color-dependent systematic errors. These systematic errors must also be considered to achieve the most precise photometric measurements. In this paper, we examine such systematic chromatic errors using photometry from the Dark Energy Survey (DES) as an example. We define a natural magnitude system for DES and calculate the systematic errors on stellar magnitudes when the atmospheric transmission and instrumental throughput deviate from the natural system. We conclude that the systematic chromatic errors caused by the change of airmass in each exposure, the change of the precipitable water vapor and aerosol in the atmosphere over time, and the non-uniformity of instrumental throughput over the focal plane can be up to 2% in some bandpasses. We compare the calculated systematic chromatic errors with the observed DES data. For the test sample data, we correct these errors using measurements of the atmospheric transmission and instrumental throughput. The residual after correction is less than 0.3%. We also find that the errors for non-stellar objects are redshift-dependent and can be larger than those for stars at certain redshifts.
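
    The origin of a systematic chromatic error can be reproduced with toy synthetic photometry: a magnitude is an integral of the source spectrum against the system throughput, so perturbing the throughput shifts red and blue sources by different amounts, and that differential shift is the chromatic error. The bandpass and spectra below are invented, not DES curves:

    ```python
    import numpy as np

    def synth_mag(wl, flux, throughput):
        """Synthetic magnitude; the zeropoint cancels when taking differences."""
        return -2.5 * np.log10(np.trapz(flux * throughput * wl, wl)
                               / np.trapz(throughput * wl, wl))

    wl = np.linspace(500.0, 600.0, 501)                   # nm, toy bandpass
    nominal = np.exp(-0.5 * ((wl - 550.0) / 25.0) ** 2)   # "natural system" throughput
    shifted = np.exp(-0.5 * ((wl - 553.0) / 25.0) ** 2)   # perturbed throughput

    for name, flux in (("blue", (wl / 550.0) ** -2.0),    # power-law toy spectra
                       ("red",  (wl / 550.0) ** +2.0)):
        dm = synth_mag(wl, flux, shifted) - synth_mag(wl, flux, nominal)
        print(f"{name} source: magnitude shift = {dm * 1000:+.1f} mmag")
    ```

    The blue-minus-red difference between the printed shifts is the source color-dependent error that per-exposure measurements of the atmosphere and instrument allow the surveys to correct.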

  9. New methods for precision Moeller polarimetry*

    International Nuclear Information System (INIS)

    Gaskell, D.; Meekins, D.G.; Yan, C.

    2007-01-01

    Precision electron beam polarimetry is becoming increasingly important as parity violation experiments attempt to probe the frontiers of the standard model. In the few-GeV regime, Moeller polarimetry is well suited to high-precision measurements; however, it is generally limited to use at relatively low beam currents (<10 μA). We present a novel technique that will enable precision Moeller polarimetry at very large currents, up to 100 μA. (orig.)

  10. The Development of Precise Engineering Surveying Technology

    Directory of Open Access Journals (Sweden)

    LI Guangyun

    2017-10-01

    Full Text Available With the construction of big science projects in China, precise engineering surveying technology has developed rapidly in the 21st century. The paper first summarizes the current state of development of precise engineering surveying instruments and theory. It then focuses on three typical cases of precise engineering surveying practice: accelerator alignment, industrial measurement, and high-speed railway surveying technology.

  11. Modeling and control of precision actuators

    CERN Document Server

    Kiong, Tan Kok

    2013-01-01

    Introduction; Growing Interest in Precise Actuators; Types of Precise Actuators; Applications of Precise Actuators; Nonlinear Dynamics and Modeling; Hysteresis; Creep; Friction; Force Ripples; Identification and Compensation of Preisach Hysteresis in Piezoelectric Actuators; SVD-Based Identification and Compensation of Preisach Hysteresis; High-Bandwidth Identification and Compensation of Hysteretic Dynamics in Piezoelectric Actuators; Concluding Remarks; Identification and Compensation of Frict…

  12. New EU Governance Modes in Professional Sport: Enhancing Throughput Legitimacy

    Directory of Open Access Journals (Sweden)

    Arnout Geeraert

    2014-08-01

    Full Text Available This article explores the limits and opportunities for enhancing the democratic legitimacy of EU actions in the field of professional sport using new modes of governance. It presents a conceptual toolkit by which the ‘throughput legitimacy’ of an EU policy can be analysed. Analysing the throughput legitimacy of the European social dialogue, we establish that, by improving the latter, both input and output legitimacy can be increased. The EU could borrow some of the positive elements of the social dialogue approach and incorporate them in the steering of other issues in professional sport. For instance, it may be interesting to pre-establish certain conditions on representativeness and relevance for participation in the policy process. Crucially, working on a clear theme-by-theme basis instead of organising outsized gatherings such as the EU sport forum would definitely benefit throughput legitimacy.

  13. Precision platform for convex lens-induced confinement microscopy

    Science.gov (United States)

    Berard, Daniel; McFaul, Christopher M. J.; Leith, Jason S.; Arsenault, Adriel K. J.; Michaud, François; Leslie, Sabrina R.

    2013-10-01

    We present the conception, fabrication, and demonstration of a versatile, computer-controlled microscopy device which transforms a standard inverted fluorescence microscope into a precision single-molecule imaging station. The device uses the principle of convex lens-induced confinement [S. R. Leslie, A. P. Fields, and A. E. Cohen, Anal. Chem. 82, 6224 (2010)], which employs a tunable imaging chamber to enhance background rejection and extend diffusion-limited observation periods. Using nanopositioning stages, this device achieves repeatable and dynamic control over the geometry of the sample chamber on scales as small as the size of individual molecules, enabling regulation of their configurations and dynamics. Using microfluidics, this device enables serial insertion as well as sample recovery, facilitating temporally controlled, high-throughput measurements of multiple reagents. We report on the simulation and experimental characterization of this tunable chamber geometry, and its influence upon the diffusion and conformations of DNA molecules over extended observation periods. This new microscopy platform has the potential to capture, probe, and influence the configurations of single molecules, with dramatically improved imaging conditions in comparison to existing technologies. These capabilities are of immediate interest to a wide range of research and industry sectors in biotechnology, biophysics, materials, and chemistry.

  14. Multiplexed precision genome editing with trackable genomic barcodes in yeast.

    Science.gov (United States)

    Roy, Kevin R; Smith, Justin D; Vonesch, Sibylle C; Lin, Gen; Tu, Chelsea Szu; Lederer, Alex R; Chu, Angela; Suresh, Sundari; Nguyen, Michelle; Horecka, Joe; Tripathi, Ashutosh; Burnett, Wallace T; Morgan, Maddison A; Schulz, Julia; Orsley, Kevin M; Wei, Wu; Aiyar, Raeka S; Davis, Ronald W; Bankaitis, Vytas A; Haber, James E; Salit, Marc L; St Onge, Robert P; Steinmetz, Lars M

    2018-07-01

    Our understanding of how genotype controls phenotype is limited by the scale at which we can precisely alter the genome and assess the phenotypic consequences of each perturbation. Here we describe a CRISPR-Cas9-based method for multiplexed accurate genome editing with short, trackable, integrated cellular barcodes (MAGESTIC) in Saccharomyces cerevisiae. MAGESTIC uses array-synthesized guide-donor oligos for plasmid-based high-throughput editing and features genomic barcode integration to prevent plasmid barcode loss and to enable robust phenotyping. We demonstrate that editing efficiency can be increased more than fivefold by recruiting donor DNA to the site of breaks using the LexA-Fkh1p fusion protein. We performed saturation editing of the essential gene SEC14 and identified amino acids critical for chemical inhibition of lipid signaling. We also constructed thousands of natural genetic variants, characterized guide mismatch tolerance at the genome scale, and ascertained that cryptic Pol III termination elements substantially reduce guide efficacy. MAGESTIC will be broadly useful to uncover the genetic basis of phenotypes in yeast.

  15. Vacuum ultraviolet spectropolarimeter design for precise polarization measurements.

    Science.gov (United States)

    Narukage, Noriyuki; Auchère, Frédéric; Ishikawa, Ryohko; Kano, Ryouhei; Tsuneta, Saku; Winebarger, Amy R; Kobayashi, Ken

    2015-03-10

    Precise polarization measurements in the vacuum ultraviolet (VUV) region provide a new means for inferring weak magnetic fields in the upper atmosphere of the Sun and stars. We propose a VUV spectropolarimeter design ideally suited for this purpose. This design is proposed and adopted for the NASA-JAXA Chromospheric Lyman-Alpha SpectroPolarimeter (CLASP), which will record the linear polarization (Stokes Q and U) of the hydrogen Lyman-α line (121.567 nm) profile. The expected degree of polarization is on the order of 0.1%. Our spectropolarimeter has two optically symmetric channels to simultaneously measure orthogonal linear polarization states with a single concave diffraction grating that serves both as the spectral dispersion element and beam splitter. This design has a minimal number of reflective components with a high VUV throughput. Consequently, these design features allow us to minimize the polarization errors caused by possible time variation of the VUV flux during the polarization modulation and by statistical photon noise.
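
    The dual-channel scheme reduces to a difference-over-sum estimate: with simultaneous counts in the two orthogonal channels, Q/I = (I0 - I90)/(I0 + I90), and the photon-noise floor on that fraction is roughly 1/sqrt(N) for N detected photons, which sets the photon budget needed to reach the 0.1% level. A hedged sketch with invented counts:

    ```python
    import numpy as np

    def stokes_q_over_i(c0: float, c90: float) -> float:
        """Fractional Stokes Q from counts in two orthogonal channels."""
        return (c0 - c90) / (c0 + c90)

    N_TOTAL = 4e6   # photons collected (assumed)
    Q_TRUE = 1e-3   # 0.1% linear polarization

    rng = np.random.default_rng(7)
    c0 = rng.poisson(0.5 * N_TOTAL * (1 + Q_TRUE))
    c90 = rng.poisson(0.5 * N_TOTAL * (1 - Q_TRUE))

    sigma_q = 1.0 / np.sqrt(N_TOTAL)   # photon-noise limit on Q/I
    print(f"Q/I = {stokes_q_over_i(c0, c90):.2e} +/- {sigma_q:.1e}")
    ```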

  16. Management of genetic epilepsies: From empirical treatment to precision medicine.

    Science.gov (United States)

    Striano, Pasquale; Vari, Maria Stella; Mazzocchetti, Chiara; Verrotti, Alberto; Zara, Federico

    2016-05-01

    Despite the over 20 antiepileptic drugs (AEDs) now licensed for epilepsy treatment, seizures can be effectively controlled in only ∼70% of patients. Thus, epilepsy treatment is still challenging in about one third of patients, and this may lead to a severely medically, physically, and socially disabling condition. However, there is clear evidence of heterogeneity of response to existing AEDs and a significant unmet need for effective intervention. A number of studies have shown that polymorphisms may influence the poor or inadequate therapeutic response as well as the occurrence of adverse effects. In addition, the new frontier of genomic technologies, including chromosome microarrays and next-generation sequencing, has improved our understanding of the genetic architecture of epilepsies. Recent findings in some genetic epilepsy syndromes provide insights into mechanisms of epileptogenesis, revealing the role of a number of genes with different functions, such as ion channels and proteins associated with the synaptic vesicle cycle or involved in energy metabolism. The rapid progress of high-throughput genomic sequencing and the corresponding analysis tools in molecular diagnosis is revolutionizing practice, and for some monogenic epilepsies the molecular confirmation may already influence the choice of treatment. Moreover, the novel genetic methods, which are able to analyze all known genes at a reasonable price, are of paramount importance for discovering novel therapeutic avenues and individualized (or precision) medicine. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. The development of a high-throughput measurement method of octanol/water distribution coefficient based on hollow fiber membrane solvent microextraction technique.

    Science.gov (United States)

    Bao, James J; Liu, Xiaojing; Zhang, Yong; Li, Youxin

    2014-09-15

    This paper describes the development of a novel high-throughput hollow fiber membrane solvent microextraction technique for the simultaneous measurement of the octanol/water distribution coefficient (logD) for organic compounds such as drugs. The method is based on a designed system consisting of a 96-well plate modified with 96 hollow fiber membrane tubes and a matching lid with 96 center holes and 96 side holes distributed in 96 grids. Each center hole was glued to a hollow fiber membrane tube sealed on one end, which is used to separate the aqueous phase from the octanol phase. A needle, such as a microsyringe or automatic sampler, can be directly inserted into the membrane tube to deposit octanol as the acceptor phase or take out the mixture of the octanol and the drug. Each side hole is filled with aqueous phase and can freely take solvent in and out as the donor phase from the outside of the hollow fiber membranes. The logD can be calculated by measuring the drug concentration in each phase after extraction equilibrium. After a comprehensive comparison, a polytetrafluoroethylene hollow fiber with a thickness of 210 μm, an extraction time of 300 min, a temperature of 25 °C and atmospheric pressure without stirring were selected for the high throughput measurement. The correlation coefficient of the linear fit of the logD values of five drugs determined by our system against reference values is 0.9954, showing good accuracy. The -8.9% intra-day and -4.4% inter-day precision of logD for metronidazole indicates good precision. In addition, the logD values of eight drugs were simultaneously and successfully measured, indicating that this 96-well high-throughput measurement method for logD values is accurate, precise, reliable and useful for high throughput screening. Copyright © 2014 Elsevier B.V. All rights reserved.
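
    The final calculation step is a simple log ratio: once the two phases reach equilibrium, logD is the base-10 logarithm of the drug concentration in the octanol acceptor phase over that in the aqueous donor phase. A minimal sketch with invented concentrations (units cancel as long as both are the same):

    ```python
    import math

    def log_d(conc_octanol: float, conc_water: float) -> float:
        """logD = log10(C_octanol / C_water), concentrations in identical units."""
        return math.log10(conc_octanol / conc_water)

    # Hypothetical post-equilibrium measurements for one drug.
    print(f"logD = {log_d(conc_octanol=45.0, conc_water=3.1):.2f}")  # ~1.16
    ```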

  18. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification. Monoclonal antibodies were raised to different targets in single batch runs of 6-10 wk using multiplexed immunisations, automated fusion and cell-culture, and a novel antigen-coated microarray-screening assay. In a large-scale experiment, where eight mice were immunized with ten antigens each, we generated

  19. High-throughput screening to identify inhibitors of lysine demethylases.

    Science.gov (United States)

    Gale, Molly; Yan, Qin

    2015-01-01

    Lysine demethylases (KDMs) are epigenetic regulators whose dysfunction is implicated in the pathology of many human diseases, including various types of cancer, inflammation and X-linked intellectual disability. Particular demethylases have been identified as promising therapeutic targets, and tremendous efforts are being devoted toward developing suitable small-molecule inhibitors for clinical and research use. Several high-throughput screening strategies have been developed to screen for small-molecule inhibitors of KDMs, each with advantages and disadvantages in terms of time, cost, effort, reliability and sensitivity. In this Special Report, we review and evaluate the high-throughput screening methods utilized for discovery of novel small-molecule KDM inhibitors.

  20. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Full Text Available Development of next generation batteries requires a breakthrough in materials. The traditional one-by-one method, which synthesizes one single-composition material at a time, is time-consuming and costly. High throughput and combinatorial experimentation is an effective method to synthesize and characterize huge numbers of materials over a broader compositional region in a short time, which greatly speeds up the discovery and optimization of materials at lower cost. In this work, high throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentation are introduced.

  1. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  2. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  3. A New High-Throughput Approach to Genotype Ancient Human Gastrointestinal Parasites.

    Science.gov (United States)

    Côté, Nathalie M L; Daligault, Julien; Pruvost, Mélanie; Bennett, E Andrew; Gorgé, Olivier; Guimaraes, Silvia; Capelli, Nicolas; Le Bailly, Matthieu; Geigl, Eva-Maria; Grange, Thierry

    2016-01-01

    Human gastrointestinal parasites are good indicators for hygienic conditions and health status of past and present individuals and communities. While microscopic analysis of eggs in sediments of archeological sites often allows their taxonomic identification, this method is rarely effective at the species level, and requires both the survival of intact eggs and their proper identification. Genotyping via PCR-based approaches has the potential to achieve a precise species-level taxonomic determination. However, so far it has mostly been applied to individual eggs isolated from archeological samples. To increase the throughput and taxonomic accuracy, as well as reduce costs of genotyping methods, we adapted a PCR-based approach coupled with next-generation sequencing to perform precise taxonomic identification of parasitic helminths directly from archeological sediments. Our study of twenty-five 100 to 7,200 year-old archeological samples proved this to be a powerful, reliable and efficient approach for species determination even in the absence of preserved eggs, either as a stand-alone method or as a complement to microscopic studies.

  4. Throughput and latency programmable optical transceiver by using DSP and FEC control.

    Science.gov (United States)

    Tanimura, Takahito; Hoshida, Takeshi; Kato, Tomoyuki; Watanabe, Shigeki; Suzuki, Makoto; Morikawa, Hiroyuki

    2017-05-15

    We propose and experimentally demonstrate a proof-of-concept of a programmable optical transceiver that enables simultaneous optimization of multiple programmable parameters (modulation format, symbol rate, power allocation, and FEC) to satisfy throughput, signal quality, and latency requirements. The proposed optical transceiver also accommodates multiple sub-channels that can transport different optical signals with different requirements. The multiple degrees of freedom of the parameters often make it difficult to find the optimum combination due to an explosion of the number of combinations. The proposed optical transceiver reduces the number of combinations and finds feasible sets of programmable parameters by using constraints on the parameters combined with a precise analytical model. For precise BER prediction with a specified set of parameters, we model the sub-channel BER as a function of OSNR, modulation format, symbol rate, and the power difference between sub-channels. Next, we formulate simple constraints on the parameters and combine them with the analytical model to seek feasible sets of programmable parameters. Finally, we experimentally demonstrate the end-to-end operation of the proposed optical transceiver in an offline manner, including low-density parity-check (LDPC) FEC encoding and decoding, under a specific use case with a latency-sensitive application and 40-km transmission.
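
    The search strategy can be sketched as constrained enumeration: generate the parameter combinations, prune with simple throughput constraints, and keep only those whose predicted BER meets the target. The BER model below is a deliberately crude placeholder, not the paper's calibrated analytical model, and all option lists and thresholds are assumptions:

    ```python
    from itertools import product

    FORMATS = {"QPSK": 2, "16QAM": 4, "64QAM": 6}   # bits per symbol
    SYMBOL_RATES = [16e9, 32e9]                     # baud (assumed options)
    OSNR_DB = 24.0                                  # available OSNR (assumed)

    def predicted_ber(bits_per_sym, symbol_rate, osnr_db):
        # Placeholder monotone model: denser formats and faster rates need more OSNR.
        required = 3.0 * bits_per_sym + 3.0 * (symbol_rate / 16e9)
        return 10.0 ** (-2.0 - (osnr_db - required))

    feasible = []
    for (fmt, bps), rate in product(FORMATS.items(), SYMBOL_RATES):
        throughput = bps * rate                     # pre-FEC bit rate
        ber = predicted_ber(bps, rate, OSNR_DB)
        if throughput >= 100e9 and ber <= 1e-3:     # example requirements
            feasible.append((fmt, rate, throughput, ber))

    for fmt, rate, tp, ber in feasible:
        print(f"{fmt} @ {rate/1e9:.0f} GBd -> {tp/1e9:.0f} Gb/s, BER ~ {ber:.1e}")
    ```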

  5. Personalized In Vitro and In Vivo Cancer Models to Guide Precision Medicine

    Science.gov (United States)

    Pauli, Chantal; Hopkins, Benjamin D.; Prandi, Davide; Shaw, Reid; Fedrizzi, Tarcisio; Sboner, Andrea; Sailer, Verena; Augello, Michael; Puca, Loredana; Rosati, Rachele; McNary, Terra J.; Churakova, Yelena; Cheung, Cynthia; Triscott, Joanna; Pisapia, David; Rao, Rema; Mosquera, Juan Miguel; Robinson, Brian; Faltas, Bishoy M.; Emerling, Brooke E.; Gadi, Vijayakrishna K.; Bernard, Brady; Elemento, Olivier; Beltran, Himisha; Dimichelis, Francesca; Kemp, Christopher J.; Grandori, Carla; Cantley, Lewis C.; Rubin, Mark A.

    2017-01-01

    Precision Medicine is an approach that takes into account the influence of individuals' genes, environment and lifestyle exposures to tailor interventions. Here, we describe the development of a robust precision cancer care platform that integrates whole exome sequencing (WES) with a living biobank enabling high throughput drug screens on patient-derived tumor organoids. To date, 56 tumor-derived organoid cultures and 19 patient-derived xenograft (PDX) models have been established from the 769 patients enrolled in an IRB-approved clinical trial. Because genomics alone was insufficient to identify therapeutic options for the majority of patients with advanced disease, we used high throughput drug screening to identify effective strategies. Analysis of tumor-derived cells from four cases, two uterine malignancies and two colon cancers, identified effective drugs and drug combinations that were subsequently validated using 3D cultures and PDX models. This platform thereby promotes the discovery of novel therapeutic approaches that can be assessed in clinical trials and provides personalized therapeutic options for individual patients where standard clinical options have been exhausted. PMID:28331002

  6. Reconciling evidence-based medicine and precision medicine in the era of big data: challenges and opportunities.

    Science.gov (United States)

    Beckmann, Jacques S; Lew, Daniel

    2016-12-19

    This era of groundbreaking scientific developments in high-resolution, high-throughput technologies is allowing the cost-effective collection and analysis of huge, disparate datasets on individual health. Proper data mining and translation of the vast datasets into clinically actionable knowledge will require the application of clinical bioinformatics. These developments have triggered multiple national initiatives in precision medicine-a data-driven approach centering on the individual. However, clinical implementation of precision medicine poses numerous challenges. Foremost, precision medicine needs to be contrasted with the powerful and widely used practice of evidence-based medicine, which is informed by meta-analyses or group-centered studies from which mean recommendations are derived. This "one size fits all" approach can provide inadequate solutions for outliers. Such outliers, which are far from an oddity as all of us fall into this category for some traits, can be better managed using precision medicine. Here, we argue that it is necessary and possible to bridge between precision medicine and evidence-based medicine. This will require worldwide and responsible data sharing, as well as regularly updated training programs. We also discuss the challenges and opportunities for achieving clinical utility in precision medicine. We project that, through collection, analyses and sharing of standardized medically relevant data globally, evidence-based precision medicine will shift progressively from therapy to prevention, thus leading eventually to improved, clinician-to-patient communication, citizen-centered healthcare and sustained well-being.

  7. Precision measurements at a muon collider

    International Nuclear Information System (INIS)

    Dawson, S.

    1995-01-01

    We discuss the potential for making precision measurements of M_W and M_T at a muon collider and the motivations for each measurement. A comparison is made with the precision measurements expected at other facilities. The measurement of the top quark decay width is also discussed.

  8. Visual thread quality for precision miniature mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, L.K.

    1981-04-01

    Threaded features have eight visual appearance factors which can affect their function in precision miniature mechanisms. The Bendix practice in deburring, finishing, and accepting these conditions on miniature threads is described, as is their impact on precision miniature electromechanical assemblies.

  9. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T
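
    The truncated mention of "the statistic T" presumably refers to a chi-square-type test on duplicates; a common form is T = sum_i (x_i1 - x_i2)^2 / (s_i1^2 + s_i2^2), compared against a chi-square distribution with one degree of freedom per pair, so a large T signals that the a priori precision estimates understate the real variability. A hedged sketch with invented duplicate data:

    ```python
    from scipy.stats import chi2

    # Each tuple: (result 1, result 2, a priori sd 1, a priori sd 2). Data invented.
    pairs = [(10.2, 9.8, 0.3, 0.3), (5.1, 5.6, 0.4, 0.4), (20.3, 19.5, 0.5, 0.5)]

    T = sum((a - b) ** 2 / (sa ** 2 + sb ** 2) for a, b, sa, sb in pairs)
    p = chi2.sf(T, df=len(pairs))
    print(f"T = {T:.2f}, p = {p:.2f}")  # small p => precision estimates inadequate
    ```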

  10. An aberrant precision account of autism.

    Directory of Open Access Journals (Sweden)

    Rebecca P Lawson

    2014-05-01

    Full Text Available Autism is a neurodevelopmental disorder characterised by problems with social communication, restricted interests and repetitive behaviour. A recent and controversial article presented a compelling normative explanation for the perceptual symptoms of autism in terms of a failure of Bayesian inference (Pellicano and Burr, 2012). In response, we suggested that when Bayesian inference is grounded in its neural instantiation – namely, predictive coding – many features of autistic perception can be attributed to aberrant precision (or beliefs about precision) within the context of hierarchical message passing in the brain (Friston et al., 2013). Here, we unpack the aberrant precision account of autism. Specifically, we consider how empirical findings – which speak directly or indirectly to neurobiological mechanisms – are consistent with the aberrant encoding of precision in autism; in particular, an imbalance of the precision ascribed to sensory evidence relative to prior beliefs.

  11. Precision medicine for psychopharmacology: a general introduction.

    Science.gov (United States)

    Shin, Cheolmin; Han, Changsu; Pae, Chi-Un; Patkar, Ashwin A

    2016-07-01

    Precision medicine is an emerging medical model that can provide accurate diagnoses and tailored therapeutic strategies for patients based on data pertaining to genes, microbiomes, environment, family history and lifestyle. Here, we provide basic information about precision medicine and newly introduced concepts, such as the precision medicine ecosystem and big data processing, and omics technologies including pharmacogenomics, pharmacometabolomics, pharmacoproteomics, pharmacoepigenomics, connectomics and exposomics. The authors review the current state of omics in psychiatry and the future direction of psychopharmacology as it moves towards precision medicine. Expert commentary: Advances in precision medicine have been facilitated by achievements in multiple fields, including large-scale biological databases, powerful methods for characterizing patients (such as genomics, proteomics, metabolomics, diverse cellular assays, and even social networks and mobile health technologies), and computer-based tools for analyzing large amounts of data.

  12. Precision surveying the principles and geomatics practice

    CERN Document Server

    Ogundare, John Olusegun

    2016-01-01

    A comprehensive overview of high precision surveying, including recent developments in geomatics and their applications. This book covers advanced precision surveying techniques, their proper use in engineering and geoscience projects, and their importance in the detailed analysis and evaluation of surveying projects. The early chapters review the fundamentals of precision surveying: the types of surveys; survey observations; standards and specifications; and accuracy assessments for angle, distance and position difference measurement systems. The book also covers network design and 3-D coordinating systems before discussing specialized topics such as structural and ground deformation monitoring techniques and analysis, mining surveys, tunneling surveys, and alignment surveys. Precision Surveying: The Principles and Geomatics Practice: * Covers structural and ground deformation monitoring analysis, advanced techniques in mining and tunneling surveys, and high precision alignment of engineering structures *…

  13. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carry-over for samples in a variety of different matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Omaha, Nebraska) as a powerful tool in radiogenic and non-traditional isotope research.

  14. Optimization and high-throughput screening of antimicrobial peptides.

    Science.gov (United States)

    Blondelle, Sylvie E; Lohner, Karl

    2010-01-01

    While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries combined with easy and less expensive access to new technologies has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple-target assays or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope in discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving the peptides' selectivity.

  15. HTTK: R Package for High-Throughput Toxicokinetics

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concentrations ...

  16. Using Six Sigma and Lean methodologies to improve OR throughput.

    Science.gov (United States)

    Fairbanks, Catharine B

    2007-07-01

    Improving patient flow in the perioperative environment is challenging, but it has positive implications both for staff members and for the facility. One facility in Vermont improved patient throughput by incorporating Six Sigma and Lean methodologies for patients undergoing elective procedures. The results of the project were significantly improved patient flow and increased teamwork and pride among perioperative staff members. (c) AORN, Inc, 2007.

  17. Throughput Performance Evaluation of Multiservice Multirate OCDMA in Flexible Networks

    DEFF Research Database (Denmark)

    Raddo, Thiago R.; Sanches, Anderson L.; Tafur Monroy, Idelfonso

    2016-01-01

    in the system. The bit error rate (BER) and packet correct probability expressions are derived, considering the multiple-access interference as binomially distributed. Packet throughput expressions, on the other hand, are derived considering Poisson, binomial, and Markov chain approaches for the composite......, the binomial approach proved to be more straightforward, computationally more efficient, and just as accurate as the Markov chain approach....

  18. Packet throughput performance of multiservice, multirate OCDMA in elastic networks

    DEFF Research Database (Denmark)

    Raddo, Thiago R.; Sanches, Anderson L.; Tafur Monroy, Idelfonso

    2016-01-01

    the multiple-access interference (MAI) as binomially distributed. The packet throughput expression, by its turn, is derived considering a Poisson distribution for the composite packet arrivals. Numerical results show that the multicode technique is a good candidate for future multiservice, multirate OCDMA...

  19. Fundamental Tradeoffs among Reliability, Latency and Throughput in Cellular Networks

    DEFF Research Database (Denmark)

    Soret, Beatriz; Mogensen, Preben; Pedersen, Klaus I.

    2014-01-01

    We address the fundamental tradeoffs among latency, reliability and throughput in a cellular network. The most important elements influencing the KPIs in a 4G network are identified, and the inter-relationships among them are discussed. We use the effective bandwidth and the effective capacity..., in which latency and reliability will be two of the principal KPIs.

  20. Fun with High Throughput Toxicokinetics (CalEPA webinar)

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  1. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  2. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  3. Throughput-constrained DVFS for scenario-aware dataflow graphs

    NARCIS (Netherlands)

    Damavandpeyma, M.; Stuijk, S.; Basten, T.; Geilen, M.C.W.; Corporaal, H.

    2013-01-01

    Dynamic behavior of streaming applications can be effectively modeled by scenario-aware dataflow graphs (SADFs). Many streaming applications must provide timing guarantees (e.g., throughput) to assure their quality-of-service. For instance, a video decoder which is running on a mobile device is
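
    The core decision in throughput-constrained DVFS can be shown in miniature: for each scenario, pick the lowest frequency whose worst-case iteration time still meets the required rate. This sketch ignores the SADF analysis that would derive the cycle counts and simply assumes them as inputs:

    ```python
    FREQS_MHZ = [200, 400, 600, 800, 1000]   # assumed DVFS operating points

    def min_frequency(cycles_per_iteration: int, required_iters_per_sec: float) -> int:
        """Smallest frequency (MHz) that satisfies the throughput constraint."""
        needed_mhz = cycles_per_iteration * required_iters_per_sec / 1e6
        for f in FREQS_MHZ:
            if f >= needed_mhz:
                return f
        raise ValueError("throughput constraint infeasible at every level")

    # Hypothetical decoder scenario: 30 frames/s at 12M cycles per frame -> 400 MHz.
    print(min_frequency(12_000_000, 30.0), "MHz")
    ```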

  4. Throughput maximization of parcel sorter systems by scheduling inbound containers

    NARCIS (Netherlands)

    Haneyah, S.W.A.; Schutten, Johannes M.J.; Fikse, K.; Clausen, Uwe; ten Hompel, Michael; Meier, J. Fabian

    2013-01-01

    This paper addresses the inbound container scheduling problem for automated sorter systems in express parcel sorting. The purpose is to analyze which container scheduling approaches maximize the throughput of sorter systems. We build on existing literature, particularly on the dynamic load balancing

  5. Aspects of multiuser MIMO for cell throughput maximization

    DEFF Research Database (Denmark)

    Bauch, Gerhard; Tejera, Pedro; Guthy, Christian

    2007-01-01

    We consider a multiuser MIMO downlink scenario where the resources in time, frequency and space are allocated such that the total cell throughput is maximized. This is achieved by exploiting multiuser diversity, i.e. the physical resources are allocated to the user with the highest SNR. We assume...

  6. Max-plus algebraic throughput analysis of synchronous dataflow graphs

    NARCIS (Netherlands)

    de Groote, Robert; Kuper, Jan; Broersma, Haitze J.; Smit, Gerardus Johannes Maria

    2012-01-01

    In this paper we present a novel approach to throughput analysis of synchronous dataflow (SDF) graphs. Our approach is based on describing the evolution of actor firing times as a linear time-invariant system in max-plus algebra. Experimental results indicate that our approach is faster than
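
    The max-plus formulation is compact enough to demo: firing times evolve as x(k+1) = A (x) x(k), where the i-th entry of A (x) x is max_j (A_ij + x_j), and the asymptotic growth rate lambda of x(k) is the iteration period, so throughput is 1/lambda. A hedged sketch with an invented timing matrix:

    ```python
    import numpy as np

    def maxplus_matvec(A, x):
        """Max-plus matrix-vector product: result_i = max_j (A_ij + x_j)."""
        return np.max(A + x[None, :], axis=1)

    # Invented actor timing matrix; -inf marks absent dependencies.
    A = np.array([[2.0, 5.0],
                  [-np.inf, 3.0]])

    x = np.zeros(2)
    for _ in range(200):                 # power iteration in the max-plus sense
        prev, x = x, maxplus_matvec(A, x)

    lam = np.max(x - prev)               # growth per iteration = period
    print(f"period ~ {lam:.2f}, throughput ~ {1.0 / lam:.3f} iterations per time unit")
    ```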

  7. Throughput of a MIMO OFDM based WLAN system

    NARCIS (Netherlands)

    Schenk, T.C.W.; Dolmans, G.; Modonesi, I.

    2004-01-01

    In this paper, the system throughput of a wireless local-area-network (WLAN) based on multiple-input multipleoutput orthogonal frequency division multiplexing (MIMO OFDM) is studied. A broadband channel model is derived from indoor channel measurements. This model is used in simulations to evaluate

  8. High-throughput bioinformatics with the Cyrille2 pipeline system.

    NARCIS (Netherlands)

    Fiers, M.W.E.J.; Burgt, van der A.; Datema, E.; Groot, de J.C.W.; Ham, van R.C.H.J.

    2008-01-01

    Background - Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses

  9. Analyzing logistic map pseudorandom number generators for periodicity induced by finite precision floating-point representation

    International Nuclear Information System (INIS)

    Persohn, K.J.; Povinelli, R.J.

    2012-01-01

    Highlights: A chaotic pseudorandom number generator (C-PRNG) poorly explores the key space. A C-PRNG is finite and periodic when implemented on a finite precision computer. We present a method to determine the period lengths of a C-PRNG. Abstract: Because of the mixing and aperiodic properties of chaotic maps, such maps have been used as the basis for pseudorandom number generators (PRNGs). However, when implemented on a finite precision computer, chaotic maps have finite and periodic orbits. This manuscript explores the consequences finite precision has on the periodicity of a PRNG based on the logistic map. A comparison is made with conventional methods of generating pseudorandom numbers. The approach used to determine the number, delay, and period of the orbits of the logistic map at varying degrees of precision (3 to 23 bits) is described in detail, including the use of the Condor high-throughput computing environment to parallelize independent tasks of analyzing a large initial seed space. Results demonstrate that in terms of pathological seeds and effective bit length, a PRNG based on the logistic map performs exponentially worse than conventional PRNGs.
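
    The experiment type is easy to re-create at small scale: quantize each logistic-map iterate to k fractional bits and walk the orbit until a state repeats, which yields the orbit's delay (tail length) and period. The parameters below are mine, not the manuscript's exact setup:

    ```python
    def logistic_fixed(x: float, r: float = 4.0, k: int = 16) -> float:
        """One logistic-map step, rounded to k fractional bits of precision."""
        y = r * x * (1.0 - x)
        return round(y * (1 << k)) / (1 << k)

    def delay_and_period(seed: float, k: int):
        """Walk the orbit until a state repeats; return (delay, period)."""
        seen, x, step = {}, seed, 0
        while x not in seen:
            seen[x] = step
            x = logistic_fixed(x, k=k)
            step += 1
        return seen[x], step - seen[x]

    for k in (8, 12, 16):
        delay, period = delay_and_period(seed=0.123456789, k=k)
        print(f"{k:2d} bits: delay = {delay}, period = {period}")
    ```

    Because only 2**k + 1 grid values exist at k bits, every orbit is eventually periodic; the observed periods fall far short of the full state space, which is the pathology the study quantifies.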

  10. Throughput Analysis of TCP Variants on a WiMAX Network Model

    Directory of Open Access Journals (Sweden)

    Medi Taruk

    2016-07-01

    Full Text Available Transmission Control Protocol (TCP) is a protocol that works at the transport layer of the OSI model. TCP was originally designed for wired networks, but the rapid development of network technology and user demand call for its adaptation to wireless devices. One wireless implementation is based on the Worldwide Interoperability for Microwave Access (WiMAX) network model, which offers a variety of advantages, particularly in access speed. In this work, NS-2 is used to examine the throughput of four TCP variants, TCP-Tahoe, TCP-Reno, TCP-Vegas, and TCP-SACK, over a WiMAX network model under several observation scenarios. The first scenario observes the throughput of each variant when only that variant operates in the network. The second observes the throughput of all variants running at the same time with equivalent QoS, but with the possibility of small congestion given sufficient link capacity. The third observes throughput under multiple congestion. The WiMAX network provides the scheduling services UGS, rtPS and ertPS using the UDP protocol, and nrtPS and BE using the TCP protocol. The network simulator (NS-2) is used to compare the performance of TCP-based services on the WiMAX network with the QoS parameters throughput, packet loss, fairness and delay.
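
    The throughput and packet-loss figures such a study reports are computed from the simulator's event trace. A toy sketch of that post-processing, using an invented three-column trace rather than the real NS-2 trace format:

    ```python
    # Toy trace rows: (event, time_s, pkt_bytes); 'r' = received at the sink,
    # 'd' = dropped. Real NS-2 traces carry more columns; this is a stand-in.
    trace = [
        ("r", 0.10, 1040), ("r", 0.15, 1040), ("d", 0.18, 1040),
        ("r", 0.22, 1040), ("r", 0.31, 1040), ("r", 0.40, 1040),
    ]

    recv = [(t, size) for ev, t, size in trace if ev == "r"]
    drops = sum(1 for ev, _, _ in trace if ev == "d")

    duration = recv[-1][0] - recv[0][0]                  # observation window
    throughput_kbps = 8 * sum(s for _, s in recv) / duration / 1e3
    loss_pct = 100.0 * drops / len(trace)                # one event per packet
    print(f"throughput: {throughput_kbps:.1f} kbit/s, loss: {loss_pct:.1f}%")
    ```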

  11. Toward precision medicine in Alzheimer's disease.

    Science.gov (United States)

    Reitz, Christiane

    2016-03-01

    In Western societies, Alzheimer's disease (AD) is the most common form of dementia and the sixth leading cause of death. In recent years, the concept of precision medicine, an approach for disease prevention and treatment that is personalized to an individual's specific pattern of genetic variability, environment and lifestyle factors, has emerged. While for some diseases, in particular select cancers and a few monogenetic disorders such as cystic fibrosis, significant advances in precision medicine have been made over the past years, for most other diseases precision medicine is only in its beginning. To advance the application of precision medicine to a wider spectrum of disorders, governments around the world are starting to launch Precision Medicine Initiatives, major efforts to generate the extensive scientific knowledge needed to integrate the model of precision medicine into every day clinical practice. In this article we summarize the state of precision medicine in AD, review major obstacles in its development, and discuss its benefits in this highly prevalent, clinically and pathologically complex disease.

  12. [Precision Nursing: Individual-Based Knowledge Translation].

    Science.gov (United States)

    Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung

    2016-12-01

    U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.

  13. Accurate molecular diagnosis of phenylketonuria and tetrahydrobiopterin-deficient hyperphenylalaninemias using high-throughput targeted sequencing

    Science.gov (United States)

    Trujillano, Daniel; Perez, Belén; González, Justo; Tornador, Cristian; Navarrete, Rosa; Escaramis, Georgia; Ossowski, Stephan; Armengol, Lluís; Cornejo, Verónica; Desviat, Lourdes R; Ugarte, Magdalena; Estivill, Xavier

    2014-01-01

    Genetic diagnostics of phenylketonuria (PKU) and tetrahydrobiopterin (BH4) deficient hyperphenylalaninemia (BH4DH) rely on methods that scan for known mutations or on laborious molecular tools that use Sanger sequencing. We have implemented a novel and much more efficient strategy based on high-throughput multiplex-targeted resequencing of four genes (PAH, GCH1, PTS, and QDPR) that, when affected by loss-of-function mutations, cause PKU and BH4DH. We have validated this approach in a cohort of 95 samples with the previously known PAH, GCH1, PTS, and QDPR mutations and one control sample. Pooled barcoded DNA libraries were enriched using a custom NimbleGen SeqCap EZ Choice array and sequenced using a HiSeq2000 sequencer. The combination of several robust bioinformatics tools allowed us to detect all known pathogenic mutations (point mutations, short insertions/deletions, and large genomic rearrangements) in the 95 samples, without detecting spurious calls in these genes in the control sample. We then used the same capture assay in a discovery cohort of 11 uncharacterized HPA patients using a MiSeq sequencer. In addition, we report the precise characterization of the breakpoints of four genomic rearrangements in PAH, including a novel deletion of 899 bp in intron 3. Our study is a proof-of-principle that high-throughput-targeted resequencing is ready to substitute classical molecular methods to perform differential genetic diagnosis of hyperphenylalaninemias, allowing the establishment of specifically tailored treatments a few days after birth. PMID:23942198

  14. High-throughput gender identification of penguin species using melting curve analysis.

    Science.gov (United States)

    Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei

    2014-04-03

    Most species of penguins are sexually monomorphic, and it is therefore difficult to identify their genders visually for monitoring population stability in terms of sex-ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. Preliminary tests indicated that the Griffiths P2/P8 primers were not suitable for MCA. Based on sequence alignment of the Chromo-Helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA indicated that the melting temperature (Tm) values for the P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA allow precise high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.
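
    The sexing rule in the abstract reduces to checking which Tm windows contain peaks. A small sketch, using the P. papua windows quoted above; a real pipeline would calibrate these windows per instrument and species:

    ```python
    def classify_gender(tm_peaks,
                        zw_window=(79.75, 80.5),   # P2/PGU-ZW2 Tm range (P. papua)
                        w_window=(81.0, 81.5)):    # P2/PGU-W2 Tm range (P. papua)
        """Peak-pattern rule: both peaks -> female, ZW-common only -> male."""
        present = lambda w: any(w[0] <= t <= w[1] for t in tm_peaks)
        if present(zw_window) and present(w_window):
            return "female"
        if present(zw_window):
            return "male"
        return "undetermined"        # failed assay or out-of-range species

    print(classify_gender([80.1, 81.2]))   # -> female
    print(classify_gender([80.0]))         # -> male
    ```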

  15. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale L

    2005-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  16. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale

    2004-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  17. Advances in Precision Medicine: Tailoring Individualized Therapies.

    Science.gov (United States)

    Matchett, Kyle B; Lynam-Lennon, Niamh; Watson, R William; Brown, James A L

    2017-10-25

    The traditional bench-to-bedside pipeline involves using model systems and patient samples to provide insights into pathways deregulated in cancer. This discovery reveals new biomarkers and therapeutic targets, ultimately stratifying patients and informing cohort-based treatment options. Precision medicine (molecular profiling of individual tumors combined with established clinical-pathological parameters) reveals, in real-time, individual patient's diagnostic and prognostic risk profile, informing tailored and tumor-specific treatment plans. Here we discuss advances in precision medicine presented at the Irish Association for Cancer Research Annual Meeting, highlighting examples where personalized medicine approaches have led to precision discovery in individual tumors, informing customized treatment programs.

  18. Mixed-Precision Spectral Deferred Correction: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Grout, Ray W. S.

    2015-09-02

    Convergence of spectral deferred correction (SDC), where low-order time integration methods are used to construct higher-order methods through iterative refinement, can be accelerated in terms of computational effort by using mixed-precision methods. Using ideas from multi-level SDC (in turn based on FAS multigrid ideas), some of the SDC correction sweeps can use function values computed in reduced precision without adversely impacting the accuracy of the final solution. This is particularly beneficial for the performance of combustion solvers such as S3D [6] which require double precision accuracy but are performance limited by the cost of data motion.
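
    The mechanism is that each correction sweep raises the formal order of the solution, so rounding introduced in early sweeps is itself corrected later. A simplified illustration in Python, using plain Picard iteration on trapezoid nodes instead of true SDC sweeps (no low-order propagator or FAS correction), with the right-hand side evaluated in float32 during the first sweeps:

    ```python
    import numpy as np

    def mixed_precision_sweeps(lam=-1.0, y0=1.0, T=0.5, n=9,
                               sweeps=8, low_precision_sweeps=4):
        """Correction sweeps for y' = lam*y on [0, T] via Picard iteration
        on trapezoid nodes. The first sweeps evaluate the right-hand side
        in float32; later float64 sweeps correct the rounding they introduce."""
        t = np.linspace(0.0, T, n)
        y = np.full(n, float(y0))                  # initial guess: constant
        for k in range(sweeps):
            rhs_dtype = np.float32 if k < low_precision_sweeps else np.float64
            f = (lam * y).astype(rhs_dtype).astype(np.float64)
            # y(t_m) = y0 + integral_0^{t_m} f, cumulative trapezoid rule.
            incr = 0.5 * (f[1:] + f[:-1]) * np.diff(t)
            y = y0 + np.concatenate(([0.0], np.cumsum(incr)))
        return t, y

    t, y = mixed_precision_sweeps()
    print("max error vs exp(lam*t):", np.abs(y - np.exp(-t)).max())
    ```

    The final error is set by the trapezoid discretization and the float64 sweeps, not by the float32 rounding, which is the property the paper exploits to trade precision for speed.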

  19. Accuracy and precision in thermoluminescence dosimetry

    International Nuclear Information System (INIS)

    Marshall, T.O.

    1984-01-01

    The question of accuracy and precision in thermoluminescent dosimetry, particularly in relation to lithium fluoride phosphor, is discussed. The more important sources of error, including those due to the detectors, the reader, annealing and dosemeter design, are identified and methods of reducing their effects on accuracy and precision to a minimum are given. Finally, the accuracy and precision achievable for three quite different applications are discussed, namely, for personal dosimetry, environmental monitoring and for the measurement of photon dose distributions in phantoms. (U.K.)

  20. High-speed steel for precision cast tools

    International Nuclear Information System (INIS)

    Karwiarz, J.; Mazur, A.

    2001-01-01

    The test results of high-vanadium high-speed steel (SWV9) for precision cast tools are presented. Face-milling cutters of the NFCa80A type were tested under industrial operating conditions. The average lifetime of the SWV9 steel tools was 3-10 times longer compared to conventional high-speed milling cutters. Metallography of the SWV9 precision cast steel revealed a distribution of primary vanadium carbides in the steel matrix that is beneficial for tool properties. The presented results make a good argument for wide application of high-vanadium high-speed steel in precision cast tools. (author)

  1. Precision mechatronics based on high-precision measuring and positioning systems and machines

    Science.gov (United States)

    Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert

    2007-06-01

    Precision mechatronics is defined in the paper as the science and engineering of a new generation of high precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics. Nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with uncertainties as small as possible is discussed. The integration of several optical and tactile nanoprobes makes the 3D-nanopositioning machine suitable for various tasks, such as long range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free form surface measurement as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.

  2. In-field High Throughput Phenotyping and Cotton Plant Growth Analysis Using LiDAR.

    Science.gov (United States)

    Sun, Shangpeng; Li, Changying; Paterson, Andrew H; Jiang, Yu; Xu, Rui; Robertson, Jon S; Snider, John L; Chee, Peng W

    2018-01-01

    Plant breeding programs and a wide range of plant science applications would greatly benefit from the development of in-field high throughput phenotyping technologies. In this study, a terrestrial LiDAR-based high throughput phenotyping system was developed. A 2D LiDAR was applied to scan plants from overhead in the field, and an RTK-GPS was used to provide spatial coordinates. Precise 3D models of scanned plants were reconstructed based on the LiDAR and RTK-GPS data. The ground plane of the 3D model was separated by the RANSAC algorithm, and a Euclidean clustering algorithm was applied to remove noise generated by weeds. After that, clean 3D surface models of cotton plants were obtained, from which three plot-level morphologic traits including canopy height, projected canopy area, and plant volume were derived. Canopy heights ranging from the 85th percentile to the maximum height were computed from the histogram of the z coordinates of all measured points; projected canopy area was derived by projecting all points onto the ground plane; and a trapezoidal-rule-based algorithm was proposed to estimate plant volume. Results of validation experiments showed good agreement between LiDAR measurements and manual measurements for maximum canopy height, projected canopy area, and plant volume, with R²-values of 0.97, 0.97, and 0.98, respectively. The developed system was used to scan the whole field repeatedly over the period from 43 to 109 days after planting. Growth trends and growth rate curves for all three derived morphologic traits were established over the monitoring period for each cultivar. Overall, the four different cultivars showed similar growth trends and growth rate patterns. Each cultivar continued to grow until ~88 days after planting, and from then on varied little. However, the actual values were cultivar specific. Correlation analysis between morphologic traits and final yield was conducted over the monitoring period. When considering each cultivar individually
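
    A compact sketch of how the three plot-level traits might be derived from a cleaned point cloud (ground plane already removed); the grid cell size, slab count, and toy data below are invented for illustration:

    ```python
    import numpy as np

    def plot_traits(points, cell=0.05, pct=85):
        """Derive plot-level traits from an (N, 3) point cloud of one plot.

        Assumes the ground has already been removed (the paper uses RANSAC)
        and that z is height above ground in metres; `cell` is an invented
        x-y grid resolution for the occupancy estimates.
        """
        z = points[:, 2]
        height = np.percentile(z, pct)            # e.g. 85th-percentile height
        ij = np.floor(points[:, :2] / cell).astype(int)
        # Projected canopy area: occupied cells of the x-y occupancy grid.
        area = len({tuple(c) for c in ij}) * cell**2
        # Plant volume: cross-section areas sampled at height levels and
        # integrated with the trapezoidal rule, as the paper proposes.
        levels = np.linspace(0.0, z.max(), 21)
        half = (levels[1] - levels[0]) / 2.0
        sections = np.array([
            len({tuple(c) for c in ij[np.abs(z - zk) <= half]}) * cell**2
            for zk in levels])
        volume = float(np.sum(0.5 * (sections[1:] + sections[:-1])
                              * np.diff(levels)))
        return height, area, volume

    pts = np.random.rand(5000, 3) * [1.0, 1.0, 0.8]   # toy 1 m x 1 m plot
    print(plot_traits(pts))
    ```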

  3. High precision spectrophotometric analysis of thorium

    International Nuclear Information System (INIS)

    Palmieri, H.E.L.

    1984-01-01

    An accurate and precise determination of thorium is proposed. A precision of about 0.1% is required for the determination of macroquantities of thorium when processed. After an extensive literature search on this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin-S as indicator. To obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically, by plotting added titrant versus absorbance; here the end-point was obtained by a non-linear least-squares fit, using the Fletcher and Powell minimization procedure and a computer program. Besides the equivalence point, other parameters of the titration were determined: the indicator concentration, the absorbance of the metal-indicator complex, and the stability constants of the metal-indicator and metal-EDTA complexes. (Author) [pt
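
    The end-point-by-fitting idea can be illustrated with scipy. The sketch below fits a simple two-segment (broken-line) absorbance model to synthetic data; the thesis itself fits a full chemical model that also recovers the indicator concentration and stability constants, which is not reproduced here:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_segment(v, v_eq, a0, s1, s2):
        """Absorbance vs titrant volume: slope s1 before the equivalence
        point v_eq, slope s2 after it (continuous broken-line model)."""
        return np.where(v < v_eq, a0 + s1 * v,
                        a0 + s1 * v_eq + s2 * (v - v_eq))

    # Synthetic titration curve with noise (volumes in ml, invented values).
    rng = np.random.default_rng(0)
    v = np.linspace(0.0, 10.0, 60)
    a = two_segment(v, 6.2, 0.90, -0.10, -0.005) + rng.normal(0, 0.003, v.size)

    popt, pcov = curve_fit(two_segment, v, a, p0=[5.0, 1.0, -0.1, 0.0])
    print(f"equivalence point: {popt[0]:.3f} +/- {np.sqrt(pcov[0, 0]):.3f} ml")
    ```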

  4. Thorium spectrophotometric analysis with high precision

    International Nuclear Information System (INIS)

    Palmieri, H.E.L.

    1983-06-01

    An accurate and precise determination of thorium is proposed. A precision of about 0.1% is required for the determination of macroquantities of thorium processed. After an extensive literature search on this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin S as indicator. To obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically, by plotting added titrant versus absorbance; here the end-point was obtained by a non-linear least-squares fit, using the Fletcher and Powell minimization procedure and a computer program. (author)

  5. Cardiovascular Precision Medicine in the Genomics Era

    Directory of Open Access Journals (Sweden)

    Alexandra M. Dainis, BS

    2018-04-01

    Full Text Available Summary: Precision medicine strives to delineate disease using multiple data sources—from genomics to digital health metrics—in order to be more precise and accurate in our diagnoses, definitions, and treatments of disease subtypes. By defining disease at a deeper level, we can treat patients based on an understanding of the molecular underpinnings of their presentations, rather than grouping patients into broad categories with one-size-fits-all treatments. In this review, the authors examine how precision medicine, specifically that surrounding genetic testing and genetic therapeutics, has begun to make strides in both common and rare cardiovascular diseases in the clinic and the laboratory, and how these advances are beginning to enable us to more effectively define risk, diagnose disease, and deliver therapeutics for each individual patient. Key Words: genome sequencing, genomics, precision medicine, targeted therapeutics

  6. Equity and Value in 'Precision Medicine'.

    Science.gov (United States)

    Gray, Muir; Lagerberg, Tyra; Dombrádi, Viktor

    2017-04-01

    Precision medicine carries huge potential in the treatment of many diseases, particularly those with high-penetrance monogenic underpinnings. However, precision medicine through genomic technologies also has ethical implications. We will define allocative, personal, and technical value ('triple value') in healthcare and how this relates to equity. Equity is here taken to be implicit in the concept of triple value in countries that have publicly funded healthcare systems. It will be argued that precision medicine risks concentrating resources to those that already experience greater access to healthcare and power in society, nationally as well as globally. Healthcare payers, clinicians, and patients must all be involved in optimising the potential of precision medicine, without reducing equity. Throughout, the discussion will refer to the NHS RightCare Programme, which is a national initiative aiming to improve value and equity in the context of NHS England.

  7. The forthcoming era of precision medicine.

    Science.gov (United States)

    Gamulin, Stjepan

    2016-11-01

    The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients' groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism ("big data"), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  8. Epistemology, Ethics, and Progress in Precision Medicine.

    Science.gov (United States)

    Hey, Spencer Phillips; Barsanti-Innes, Brianna

    2016-01-01

    The emerging paradigm of precision medicine strives to leverage the tools of molecular biology to prospectively tailor treatments to the individual patient. Fundamental to the success of this movement is the discovery and validation of "predictive biomarkers," which are properties of a patient's biological specimens that can be assayed in advance of therapy to inform the treatment decision. Unfortunately, research into biomarkers and diagnostics for precision medicine has fallen well short of expectations. In this essay, we examine the portfolio of research activities into the excision repair cross complement group 1 (ERCC1) gene as a predictive biomarker for precision lung cancer therapy as a case study in elucidating the epistemological and ethical obstacles to developing new precision medicines.

  9. A Note on "Accuracy" and "Precision"

    Science.gov (United States)

    Stallings, William M.; Gillmore, Gerald M.

    1971-01-01

    Advocates the use of "precision" rather than "accuracy" in defining reliability. These terms are consistently differentiated in certain sciences. Review of psychological and measurement literature reveals, however, interchangeable usage of the terms in defining reliability. (Author/GS)
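
    The distinction is easy to show numerically: precision is the spread (reproducibility) of repeated measurements, accuracy their closeness to the true value. A tiny demonstration with invented readings:

    ```python
    import statistics as st

    true_value = 10.0
    readings = {
        "A (precise, inaccurate)": [10.31, 10.29, 10.30, 10.32, 10.30],
        "B (accurate, imprecise)": [9.6, 10.5, 9.9, 10.4, 9.7],
    }
    for name, xs in readings.items():
        bias = st.mean(xs) - true_value        # inaccuracy (systematic error)
        spread = st.stdev(xs)                  # imprecision (random error)
        print(f"{name}: bias {bias:+.2f}, standard deviation {spread:.2f}")
    ```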

  10. Precision axial translator with high stability.

    Science.gov (United States)

    Bösch, M A

    1979-08-01

    We describe a new type of translator which is inherently stable against torsion and twisting. This concentric translator is also ideally suited for precise axial motion with clearance of the center line.

  11. Mechanics and Physics of Precise Vacuum Mechanisms

    CERN Document Server

    Deulin, E. A; Panfilov, Yu V; Nevshupa, R. A

    2010-01-01

    In this book the Russian expertise in the field of the design of precise vacuum mechanics is summarized. A wide range of physical applications of mechanism design in the electronic, optical-electronic, chemical, and aerospace industries is presented in a comprehensible way. Topics treated include the method of regulating and determining microparticle flows in vacuum equipment and mechanisms for electronics; precise mechanisms of nanoscale precision based on magnetic and electric rheology; precise harmonic rotary and non-coaxial nut-screw linear-motion vacuum feedthroughs with technical parameters considered the best in the world; elastically deformed vacuum motion feedthroughs without the use of friction couples; and a computer system for predicting vacuum mechanism failures. This English edition incorporates a number of features which should improve its usefulness as a textbook without changing the basic organization or the general philosophy of presentation of the subject matter of the original Russian work. Exper...

  12. Precision Munition Electro-Sciences Facility

    Data.gov (United States)

    Federal Laboratory Consortium — This facility allows the characterization of the electro-magnetic environment produced by a precision weapon in free flight. It can measure the radiofrequency (RF)...

  13. Precision electroweak physics at the Tevatron

    International Nuclear Information System (INIS)

    James, Eric B.

    2006-01-01

    An overview of Tevatron electroweak measurements performed by the CDF and DØ experiments is presented. The current status and future prospects for high precision measurements of electroweak parameters and detailed studies of boson production are highlighted. (author)

  14. Precision Guidance with Impact Angle Requirements

    National Research Council Canada - National Science Library

    Ford, Jason

    2001-01-01

    This paper examines a weapon system precision guidance problem in which the objective is to guide a weapon onto a non-manoeuvring target so that a particular desired angle of impact is achieved using...

  15. Precise subtyping for synchronous multiparty sessions

    Directory of Open Access Journals (Sweden)

    Mariangiola Dezani-Ciancaglini

    2016-02-01

    Full Text Available The notion of subtyping has gained an important role both in theoretical and applicative domains: in lambda and concurrent calculi as well as in programming languages. The soundness and the completeness, together referred to as the preciseness of subtyping, can be considered from two different points of view: operational and denotational. The former preciseness has been recently developed with respect to type safety, i.e. the safe replacement of a term of a smaller type when a term of a bigger type is expected. The latter preciseness is based on the denotation of a type which is a mathematical object that describes the meaning of the type in accordance with the denotations of other expressions from the language. The result of this paper is the operational and denotational preciseness of the subtyping for a synchronous multiparty session calculus. The novelty of this paper is the introduction of characteristic global types to prove the operational completeness.

  16. Prospects for Precision Neutrino Cross Section Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Deborah A. [Fermilab

    2016-01-28

    The need for precision cross section measurements is more urgent now than ever before, given the central role neutrino oscillation measurements play in the field of particle physics. The definition of precision is something worth considering, however. In order to build the best model for an oscillation experiment, cross section measurements should span a broad range of energies, neutrino interaction channels, and target nuclei. Precision might better be defined not in the final uncertainty associated with any one measurement but rather with the breadth of measurements that are available to constrain models. Current experience shows that models are better constrained by 10 measurements across different processes and energies with 10% uncertainties than by one measurement of one process on one nucleus with a 1% uncertainty. This article describes the current status of and future prospects for the field of precision cross section measurements considering the metric of how many processes, energies, and nuclei have been studied.

  17. Precise Calculation of Complex Radioactive Decay Chains

    National Research Council Canada - National Science Library

    Harr, Logan J

    2007-01-01

    ...). An application of the exponential moments function is used with a transmutation matrix in the calculation of complex radioactive decay chains to achieve greater precision than can be attained through current methods...
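
    The report's exponential-moments refinement is not reproduced here, but the transmutation-matrix formulation it builds on is standard: write the Bateman equations as dN/dt = M N and evaluate N(t) = exp(Mt) N0. A sketch with a hypothetical two-step chain and invented decay constants:

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Chain A -> B -> C (stable); decay constants in 1/s (hypothetical).
    lam_a, lam_b = 1e-3, 5e-4
    # Transmutation (Bateman) matrix M such that dN/dt = M @ N.
    M = np.array([[-lam_a,    0.0, 0.0],
                  [ lam_a, -lam_b, 0.0],
                  [   0.0,  lam_b, 0.0]])

    N0 = np.array([1e6, 0.0, 0.0])        # start with pure A
    t = 3600.0                            # one hour
    N = expm(M * t) @ N0                  # matrix-exponential solution
    print(dict(zip("ABC", N.round(1))))   # populations; total is conserved
    ```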

  18. Collaborative Genomics Study Advances Precision Oncology

    Science.gov (United States)

    A collaborative study conducted by two Office of Cancer Genomics (OCG) initiatives highlights the importance of integrating structural and functional genomics programs to improve cancer therapies, and more specifically, contribute to precision oncology treatments for children.

  19. Nucleon measurements at the precision frontier

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Carl E. [Physics Department, College of William and Mary, Williamsburg, VA 23187 (United States)

    2013-11-07

    We comment on nucleon measurements at the precision frontier. As examples of what can be learned, we concentrate on three topics, which are parity violating scattering experiments, the proton radius puzzle, and the symbiosis between nuclear and atomic physics.

  20. The forthcoming era of precision medicine

    Directory of Open Access Journals (Sweden)

    Stjepan Gamulin

    2016-11-01

    Full Text Available Abstract. The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients’ groups. Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism (“big data”, development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. Conclusion. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach.

  1. PRECISION ELECTROWEAK MEASUREMENTS AND THE HIGGS MASS

    International Nuclear Information System (INIS)

    MARCIANO, W.J.

    2004-01-01

    The utility of precision electroweak measurements for predicting the Standard Model Higgs mass via quantum loop effects is discussed. Current constraints from m_W and sin²θ_W(m_Z) in the MS-bar scheme imply a relatively light Higgs, ≲ 154 GeV, which is consistent with Supersymmetry expectations. The existence of Supersymmetry is further suggested by a discrepancy between experiment and theory for the muon anomalous magnetic moment. Constraints from precision studies on other types of 'New Physics' are also briefly described

  2. Precision Medicine-Nobody Is Average.

    Science.gov (United States)

    Vinks, A A

    2017-03-01

    Medicine gets personal and tailor-made treatments are underway. Hospitals have started to advertise their advanced genomic testing capabilities and even their disruptive technologies to help foster a culture of innovation. The prediction in the lay press is that in decades from now we may look back and see 2017 as the year precision medicine blossomed. It is all part of the Precision Medicine Initiative that takes into account individual differences in people's genes, environments, and lifestyles. © 2017 ASCPT.

  3. The role of precise time in IFF

    Science.gov (United States)

    Bridge, W. M.

    1982-01-01

    The application of precise time to the identification of friend or foe (IFF) problem is discussed. The simple concept of knowing when to expect each signal is exploited in a variety of ways to achieve an IFF system which is hard to detect, minimally exploitable and difficult to jam. Precise clocks are the backbone of the concept and the various candidates for this role are discussed. The compact rubidium-controlled oscillator is the only practical candidate.

  4. Precision siting of a particle accelerator

    International Nuclear Information System (INIS)

    Cintra, Jorge Pimentel

    1996-01-01

    Precise location is a specialized survey task that requires highly skilled work to avoid unrecoverable results at the project installation. Depending on the stage of the process, different specifications apply, calling for different instruments: theodolite, measurement tape, distance meter, invar wire. This paper, based on experience obtained at the installation of particle accelerator equipment, deals with the general principles of precise location: tolerance definitions, techniques for increasing accuracy, scheduling of locations, sensitivity analysis, and quality control methods. (author)

  5. High-throughput tandem mass spectrometry multiplex analysis for newborn urinary screening of creatine synthesis and transport disorders, Triple H syndrome and OTC deficiency.

    Science.gov (United States)

    Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela

    2014-09-25

    Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidineacetic acid, orotic acid, uracil, creatinine and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy varied ... screening for inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Principles of precision medicine in stroke.

    Science.gov (United States)

    Hinman, Jason D; Rost, Natalia S; Leung, Thomas W; Montaner, Joan; Muir, Keith W; Brown, Scott; Arenillas, Juan F; Feldmann, Edward; Liebeskind, David S

    2017-01-01

    The era of precision medicine has arrived and conveys tremendous potential, particularly for stroke neurology. The diagnosis of stroke, its underlying aetiology, theranostic strategies, recurrence risk and path to recovery are populated by a series of highly individualised questions. Moreover, the phenotypic complexity of a clinical diagnosis of stroke makes a simple genetic risk assessment only partially informative on an individual basis. The guiding principles of precision medicine in stroke underscore the need to identify, value, organise and analyse the multitude of variables obtained from each individual to generate a precise approach to optimise cerebrovascular health. Existing data may be leveraged with novel technologies, informatics and practical clinical paradigms to apply these principles in stroke and realise the promise of precision medicine. Importantly, precision medicine in stroke will only be realised once efforts to collect, value and synthesise the wealth of data collected in clinical trials and routine care starts. Stroke theranostics, the ultimate vision of synchronising tailored therapeutic strategies based on specific diagnostic data, demand cerebrovascular expertise on big data approaches to clinically relevant paradigms. This review considers such challenges and delineates the principles on a roadmap for rational application of precision medicine to stroke and cerebrovascular health. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  7. Precision medicine needs pioneering clinical bioinformaticians.

    Science.gov (United States)

    Gómez-López, Gonzalo; Dopazo, Joaquín; Cigudosa, Juan C; Valencia, Alfonso; Al-Shahrour, Fátima

    2017-10-25

    Success in precision medicine depends on accessing high-quality genetic and molecular data from large, well-annotated patient cohorts that couple biological samples to comprehensive clinical data, which in conjunction can lead to effective therapies. From such a scenario emerges the need for a new professional profile, an expert bioinformatician with training in clinical areas who can make sense of multi-omics data to improve therapeutic interventions in patients, and the design of optimized basket trials. In this review, we first describe the main policies and international initiatives that focus on precision medicine. Secondly, we review the currently ongoing clinical trials in precision medicine, introducing the concept of 'precision bioinformatics', and we describe current pioneering bioinformatics efforts aimed at implementing tools and computational infrastructures for precision medicine in health institutions around the world. Thirdly, we discuss the challenges related to the clinical training of bioinformaticians, and the urgent need for computational specialists capable of assimilating medical terminologies and protocols to address real clinical questions. We also propose some skills required to carry out common tasks in clinical bioinformatics and some tips for emergent groups. Finally, we explore the future perspectives and the challenges faced by precision medicine bioinformatics. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Towards precision medicine; a new biomedical cosmology.

    Science.gov (United States)

    Vegter, M W

    2018-02-10

    Precision Medicine has become a common label for data-intensive and patient-driven biomedical research. Its intended future is reflected in endeavours such as the Precision Medicine Initiative in the USA. This article addresses the question whether it is possible to discern a new 'medical cosmology' in Precision Medicine, a concept that was developed by Nicholas Jewson to describe comprehensive transformations involving various dimensions of biomedical knowledge and practice, such as vocabularies, the roles of patients and physicians and the conceptualisation of disease. Subsequently, I will elaborate my assessment of the features of Precision Medicine with the help of Michel Foucault, by exploring how precision medicine involves a transformation along three axes: the axis of biomedical knowledge, of biomedical power and of the patient as a self. Patients are encouraged to become the managers of their own health status, while the medical domain is reframed as a data-sharing community, characterised by changing power relationships between providers and patients, producers and consumers. While the emerging Precision Medicine cosmology may surpass existing knowledge frameworks; it obscures previous traditions and reduces research-subjects to mere data. This in turn, means that the individual is both subjected to the neoliberal demand to share personal information, and at the same time has acquired the positive 'right' to become a member of the data-sharing community. The subject has to constantly negotiate the meaning of his or her data, which can either enable self-expression, or function as a commanding Superego.

  9. Precision oncology: origins, optimism, and potential.

    Science.gov (United States)

    Prasad, Vinay; Fojo, Tito; Brada, Michael

    2016-02-01

    Imatinib, the first and arguably the best targeted therapy, became the springboard for developing drugs aimed at molecular targets deemed crucial to tumours. As this development unfolded, a revolution in the speed and cost of genetic sequencing occurred. The result--an armamentarium of drugs and an array of molecular targets--set the stage for precision oncology, a hypothesis that cancer treatment could be markedly improved if therapies were guided by a tumour's genomic alterations. Drawing lessons from the biological basis of cancer and recent empirical investigations, we take a more measured view of precision oncology's promise. Ultimately, the promise is not our concern, but the threshold at which we declare success. We review reports of precision oncology alongside those of precision diagnostics and novel radiotherapy approaches. Although confirmatory evidence is scarce, these interventions have been widely endorsed. We conclude that the current path will probably not be successful or, at a minimum, will have to undergo substantive adjustments before it can be successful. For the sake of patients with cancer, we hope one form of precision oncology will deliver on its promise. However, until confirmatory studies are completed, precision oncology remains unproven, and as such, a hypothesis in need of rigorous testing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Precision validation of MIPAS-Envisat products

    Directory of Open Access Journals (Sweden)

    C. Piccolo

    2007-01-01

    Full Text Available This paper discusses the variation and validation of the precision, or estimated random error, associated with the ESA Level 2 products from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS. This quantity represents the propagation of the radiometric noise from the spectra through the retrieval process into the Level 2 profile values. The noise itself varies with time, steadily rising between ice decontamination events, but the Level 2 precision has a greater variation due to the atmospheric temperature which controls the total radiance received. Hence, for all species, the precision varies latitudinally/seasonally with temperature, with a small superimposed temporal structure determined by the degree of ice contamination on the detectors. The precision validation involves comparing two MIPAS retrievals at the intersections of ascending/descending orbits. For 5 days per month of full resolution MIPAS operation, the standard deviation of the matching profile pairs is computed and compared with the precision given in the MIPAS Level 2 data, except for NO2 since it has a large diurnal variation between ascending/descending intersections. Even taking into account the propagation of the pressure-temperature retrieval errors into the VMR retrieval, the standard deviation of the matching pairs is usually a factor 1–2 larger than the precision. This is thought to be due to effects such as horizontal inhomogeneity of the atmosphere and instability of the retrieval.
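
    The validation statistic is straightforward: for matched ascending/descending pairs, the observed scatter of the profile differences is compared with the scatter implied by the quoted precisions, sqrt(σ_asc² + σ_desc²). A sketch on synthetic data, with array shapes and values invented:

    ```python
    import numpy as np

    def precision_ratio(prof_a, prof_d, sig_a, sig_d):
        """Observed scatter of matched ascending/descending differences over
        the scatter implied by the quoted precisions; ~1 validates them.
        Inputs are (n_pairs, n_levels) arrays."""
        observed = (prof_a - prof_d).std(axis=0, ddof=1)
        expected = np.sqrt((sig_a**2 + sig_d**2).mean(axis=0))
        return observed / expected

    # Toy data: 200 pairs on 10 levels with quoted precision 0.5 everywhere.
    rng = np.random.default_rng(1)
    truth = rng.normal(300.0, 5.0, size=(1, 10))        # shared true profile
    pa = truth + rng.normal(0.0, 0.5, (200, 10))
    pd = truth + rng.normal(0.0, 0.5, (200, 10))
    sig = np.full((200, 10), 0.5)
    print(precision_ratio(pa, pd, sig, sig).round(2))   # ~1.0 at each level
    ```

    The paper reports ratios of roughly 1 to 2 rather than 1, which it attributes to horizontal inhomogeneity of the atmosphere and retrieval instability, effects absent from this idealized toy.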

  11. The economic case for precision medicine.

    Science.gov (United States)

    Gavan, Sean P; Thompson, Alexander J; Payne, Katherine

    2018-01-01

    Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case of a precision medicine, can be prioritized by the extent that they reduce the uncertainty expressed by decision-makers.

  12. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples... Floc properties could be correlated to the presence of the species that are regarded as “strong” and “weak” floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high throughput approach for a rapid and cheap community profiling of activated sludge that in combination with multivariate statistics can be used

  13. High-throughput theoretical design of lithium battery materials

    International Nuclear Information System (INIS)

    Ling Shi-Gang; Gao Jian; Xiao Rui-Juan; Chen Li-Quan

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next generation lithium batteries are of great significance to achieve performance, and some representative criteria are: higher energy density, better safety, and faster charge/discharge speed. (topical review)
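
    At its core, such a screening flow is a filter over a structural database against the stated search criteria. A deliberately simplified sketch, with candidate records and thresholds invented for illustration:

    ```python
    # Hypothetical records standing in for a structural-database query result.
    candidates = [
        {"id": "mat-001", "capacity_mAh_g": 180, "voltage_V": 3.9, "strain_pct": 2.1},
        {"id": "mat-002", "capacity_mAh_g": 250, "voltage_V": 4.1, "strain_pct": 6.8},
        {"id": "mat-003", "capacity_mAh_g": 210, "voltage_V": 3.7, "strain_pct": 1.2},
    ]

    def energy_density(c):                        # gravimetric, in mWh/g
        return c["capacity_mAh_g"] * c["voltage_V"]

    # Invented screening criteria: high energy density and low strain.
    hits = [c for c in candidates
            if energy_density(c) > 700 and c["strain_pct"] < 3.0]
    for c in sorted(hits, key=energy_density, reverse=True):
        print(c["id"], round(energy_density(c)), "mWh/g")
    ```

    Real pipelines replace the dictionary lookups with first-principles calculations run automatically per structure; the selection logic, however, stays this simple.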

  14. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob eHeintze

    2013-10-01

    Full Text Available Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA) and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas9)-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss-of-function screening. Here we discuss some of the challenges in engineering CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.

  15. MONO: A program to calculate synchrotron beamline monochromator throughputs

    International Nuclear Information System (INIS)

    Chapman, D.

    1989-01-01

    A set of Fortran programs has been developed to calculate the expected throughput of x-ray monochromators with a filtered synchrotron source, applicable to bending magnet and wiggler beamlines. These programs calculate the normalized throughput and filtered synchrotron spectrum passed by multiple-element, flat unfocussed monochromator crystals of the Bragg or Laue type as a function of incident beam divergence, energy and polarization. The reflected and transmitted beam of each crystal is calculated using the dynamical theory of diffraction. Multiple crystal arrangements in the dispersive and non-dispersive mode are allowed, as well as crystal asymmetry and energy or angle offsets. Filters or windows of arbitrary elemental composition may be used to filter the incident synchrotron beam. This program should be useful for predicting the intensities available from many beamline configurations as well as assisting in the design of new monochromator and analyzer systems. 6 refs., 3 figs
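
    Two of the ingredients MONO combines are simple enough to sketch: the Bragg condition λ = 2d sin θ, which sets the crystal angle for a given energy, and Beer-Lambert attenuation for upstream filters. The dynamical-theory reflectivity at the heart of the program is not reproduced, and the Be attenuation coefficient below is an illustrative placeholder:

    ```python
    import numpy as np

    HC_KEV_ANGSTROM = 12.39842          # h*c in keV*Angstrom

    def bragg_angle_deg(energy_kev, d_spacing=3.1356):   # Si(111) d in Angstrom
        """Bragg condition lambda = 2*d*sin(theta) for a flat crystal."""
        lam = HC_KEV_ANGSTROM / energy_kev
        return np.degrees(np.arcsin(lam / (2.0 * d_spacing)))

    def filter_transmission(mu_per_cm, thickness_um):
        """Beer-Lambert attenuation exp(-mu*t) for an upstream filter/window."""
        return np.exp(-mu_per_cm * thickness_um * 1e-4)

    print(f"Si(111) Bragg angle at 10 keV: {bragg_angle_deg(10.0):.2f} deg")
    # 250 um Be window; mu = 1.0 cm^-1 is an illustrative placeholder value.
    print(f"window transmission: {filter_transmission(1.0, 250.0):.3f}")
    ```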

  16. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  17. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    High-throughput screening is extensively applied for identification of drug targets and drug discovery, and recently it found entry into toxicity testing. Reverse phase protein arrays (RPPAs) are widely used for quantification of protein markers. We reasoned that RPPAs can also be utilized beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of siRNAs with known killing effects as a model system to demonstrate that RPPA-based protein quantification can serve as a substitute readout of cell viability, thereby reliably reflecting toxicity. In terms of automation, cell exposure, protein harvest, serial dilution and sample reformatting were performed using

  18. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxa endogenous reference genes, from GMO constructions, screening targets, construct-specific, and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  19. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  20. High-throughput optical system for HDES hyperspectral imager

    Science.gov (United States)

    Václavík, Jan; Melich, Radek; Pintr, Pavel; Pleštil, Jan

    2015-01-01

    Affordable, long-wave infrared hyperspectral imaging calls for the use of an uncooled FPA with high-throughput optics. This paper describes the design of the optical part of a stationary hyperspectral imager in a spectral range of 7-14 µm with a field of view of 20°×10°. The imager employs a push-broom method realized by a scanning mirror. High throughput and a demand for simplicity and rigidity led to a fully refractive design with highly aspheric surfaces and off-axis positioning of the detector array. The design was optimized to exploit the machinability of infrared materials by the SPDT method and simple assembly.

  1. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  2. An Overlay Architecture for Throughput Optimal Multipath Routing

    Science.gov (United States)

    2017-01-14

    maximum throughput. Finally, we propose a threshold-based policy (BP-T) and a heuristic policy (OBP), which dynamically control traffic bifurcations... network stability region is available. Second, given any subset of nodes that are controllable, we also wish to develop an optimal routing policy that... case when tunnels do not overlap. We also develop a heuristic overlay control policy for use on general topologies, and show through simulation that

  3. Development of rapid high throughput biodosimetry tools for radiological triage

    International Nuclear Information System (INIS)

    Balajee, Adayabalam S.; Escalona, Maria; Smith, Tammy; Ryan, Terri; Dainiak, Nicholas

    2018-01-01

    Accidental or intentional radiological or nuclear (R/N) disasters constitute a major threat around the globe that can affect from tens to thousands of people. Currently available cytogenetic biodosimeters are time consuming and laborious to perform, making them impractical for triage scenarios. Therefore, it is imperative to develop high throughput techniques which will enable timely assessment of personalized dose for making an appropriate 'life-saving' clinical decision

  4. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Full Text Available Abstract Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
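
    The architectural point, many independent queries matched against one shared reference, can be mimicked on a CPU in a few lines (no suffix tree or GPU here; each read is scanned independently, which is what makes the workload parallel over queries):

    ```python
    import re
    from multiprocessing import Pool

    REFERENCE = "ACGT" * 2500        # stand-in for the shared reference sequence

    def match_read(read):
        """All exact (overlapping) match positions of one read in REFERENCE.
        MUMmerGPU walks a suffix tree on the GPU; a plain scan is used here."""
        return read, [m.start() for m in re.finditer(f"(?={re.escape(read)})",
                                                     REFERENCE)]

    if __name__ == "__main__":
        reads = ["ACGTACGT", "GTAC", "TTTT"]     # toy queries
        with Pool() as pool:                     # one read per worker task
            for read, hits in pool.map(match_read, reads):
                print(read, "->", len(hits), "hits")
    ```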

  5. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  6. A Functional High-Throughput Assay of Myelination in Vitro

    Science.gov (United States)

    2014-07-01

    Keywords: human induced pluripotent stem cells, hydrogels, 3D culture, electrophysiology, high-throughput assay. ...image the 3D rat dorsal root ganglion (DRG) cultures with sufficiently low background as to detect electrically-evoked depolarization events... of voltage-sensitive dyes. We have made substantial progress in Task 4.1. We have fabricated neural fiber tracts from DRG explants and

  7. On Throughput Maximization in Constant Travel-Time Robotic Cells

    OpenAIRE

    Milind Dawande; Chelliah Sriskandarajah; Suresh Sethi

    2002-01-01

    We consider the problem of scheduling operations in bufferless robotic cells that produce identical parts. The objective is to find a cyclic sequence of robot moves that minimizes the long-run average time to produce a part or, equivalently, maximizes the throughput rate. The robot can be moved in simple cycles that produce one unit or in more complicated cycles that produce multiple units. Because one-unit cycles are the easiest to understand, implement, and control, they are widely used i...
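
    To make the one-unit cycle concrete, the sketch below computes the long-run average time per part (and hence the throughput rate) for the simplest one-unit cycle in a hypothetical two-machine cell with a linear layout; all parameter values are illustrative assumptions, not data from the paper.

      # Long-run average time per part for the simplest one-unit cycle in a
      # two-machine bufferless robotic cell with a linear layout.  All
      # parameter values are illustrative assumptions.
      delta = 2.0          # robot travel time between adjacent positions (s)
      eps   = 1.0          # time to load/unload a part (s)
      a, b  = 20.0, 35.0   # processing times on machines M1 and M2 (s)

      # The robot accompanies each part: pick at input, load M1, wait a,
      # carry to M2, wait b, drop at output, return empty to the input.
      # 6 load/unload operations, 3*delta forward travel, 3*delta return.
      cycle_time = 6 * eps + 6 * delta + a + b
      print(f"cycle time : {cycle_time:.1f} s per part")
      print(f"throughput : {3600 / cycle_time:.1f} parts per hour")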

  8. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  9. Conducting Precision Medicine Research with African Americans.

    Science.gov (United States)

    Halbert, Chanita Hughes; McDonald, Jasmine; Vadaparampil, Susan; Rice, LaShanta; Jefferson, Melanie

    2016-01-01

    Precision medicine is an approach to detecting, treating, and managing disease that is based on individual variation in genetic, environmental, and lifestyle factors. Precision medicine is expected to reduce health disparities, but this will be possible only if studies have adequate representation of racial minorities. It is critical to anticipate the rates at which individuals from diverse populations are likely to participate in precision medicine studies as research initiatives are being developed. We evaluated the likelihood of participating in a clinical study for precision medicine in an observational study conducted between October 2010 and February 2011 in a national sample of African Americans. The outcome was intention to participate in a government-sponsored study that involves providing a biospecimen and generates data that could be shared with other researchers to conduct future studies. One third of respondents would participate in a clinical study for precision medicine. Only gender had a significant independent association with participation intentions. Men had a 1.86-fold (95% CI = 1.11, 3.12, p = 0.02) increased likelihood of participating in a precision medicine study compared to women in the model that included overall barriers and facilitators. In the model with specific participation barriers, distrust was associated with a reduced likelihood of participating in the research described in the vignette (OR = 0.57, 95% CI = 0.34, 0.96, p = 0.04). African Americans may have low enrollment in precision medicine initiative (PMI) research. As PMI research is implemented, extensive efforts will be needed to ensure adequate representation. Additional research is needed to identify optimal ways of ethically describing precision medicine studies to ensure sufficient recruitment of racial minorities.

  10. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Background: Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results: We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion: The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines.
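
    The Scheduler/Executor split described above can be sketched in miniature: the scheduler releases a job once all of its input dependencies have been produced, and the executor runs it. This is a hypothetical single-process analogue in Python; Cyrille2 itself tracks data in a database and dispatches jobs to a compute cluster.

      # Minimal scheduler/executor analogue of a pipeline system: a job
      # becomes runnable once all of its input dependencies exist.
      # Hypothetical sketch, not Cyrille2's implementation.
      from collections import deque

      pipeline = {                       # job -> (dependencies, action)
          "clean":    ((),                    lambda: "cleaned reads"),
          "align":    (("clean",),            lambda: "alignments"),
          "annotate": (("align",),            lambda: "annotations"),
          "report":   (("align", "annotate"), lambda: "report"),
      }

      def run(pipeline):
          done, queue = {}, deque(pipeline)
          while queue:
              job = queue.popleft()
              deps, action = pipeline[job]
              if all(d in done for d in deps):   # Scheduler: inputs ready?
                  done[job] = action()           # Executor: run the job
                  print(f"ran {job} -> {done[job]}")
              else:
                  queue.append(job)              # re-schedule until inputs exist
          return done

      run(pipeline)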

  11. On Maximizing the Throughput of Packet Transmission under Energy Constraints.

    Science.gov (United States)

    Wu, Weiwei; Dai, Guangli; Li, Yan; Shan, Feng

    2018-06-23

    More and more Internet of Things (IoT) wireless devices have been providing ubiquitous services over recent years. Since most of these devices are powered by batteries, a fundamental trade-off to be addressed is between the energy consumed and the data throughput achieved in wireless data transmission. By exploiting the rate-adaptive capacities of wireless devices, most existing works on energy-efficient data transmission try to design rate-adaptive transmission policies to maximize the amount of transmitted data bits under the energy constraints of devices. Such solutions, however, cannot apply to scenarios where data packets have respective deadlines and only integrally transmitted data packets contribute. Thus, this paper introduces a notion of weighted throughput, which measures how much total value of data packets is successfully and integrally transmitted before their own deadlines. By designing efficient rate-adaptive transmission policies, this paper aims to make the best use of the energy and maximize the weighted throughput. More challenging but of practical significance, we consider the fading effect of wireless channels in both offline and online scenarios. In the offline scenario, we develop an optimal algorithm that computes the optimal solution in pseudo-polynomial time, which is the best possible as the problem undertaken is NP-hard. In the online scenario, we propose an efficient heuristic algorithm based on optimal properties derived for the optimal offline solution. Simulation results validate the efficiency of the proposed algorithm.
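
    Once the energy cost of transmitting each packet integrally before its deadline has been fixed, the core selection problem is knapsack-like, which is one way to see why the offline optimum is pseudo-polynomial in the energy budget. The sketch below shows that core dynamic program under assumed integer energy costs; the paper's actual algorithm additionally handles rate adaptation, deadlines, and channel fading.

      # Core of weighted-throughput maximization: choose packets to transmit
      # integrally under an energy budget, maximizing total value.  A
      # knapsack-style dynamic program, pseudo-polynomial in the budget.
      # Values and energy costs below are illustrative assumptions.
      def max_weighted_throughput(packets, budget):
          """packets: list of (value, integer energy cost); budget: int."""
          best = [0] * (budget + 1)      # best[e] = max value using <= e energy
          for value, cost in packets:
              for e in range(budget, cost - 1, -1):
                  best[e] = max(best[e], best[e - cost] + value)
          return best[budget]

      packets = [(10, 4), (7, 3), (12, 6), (3, 1)]   # (value, energy cost)
      print(max_weighted_throughput(packets, budget=10))
      # -> 22, e.g. the packets with values 10 and 12 at total cost 10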

  12. High-throughput electrical characterization for robust overlay lithography control

    Science.gov (United States)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high throughput and robust overlay measurement is a challenge in current 14nm and advanced upcoming nodes with the transition to 300mm and upcoming 450mm semiconductor manufacturing, where slight deviation in overlay has significant impact on reliability and yield [1]. An exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specification [2]. Here, we discuss limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements due to sensitivity, throughput and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance where minima and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss challenges of integrating the electrical measurement based approach in semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for accessing simultaneous overlay as well as process window and margins from a robust, high throughput and electrical measurement approach.
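
    The parabolic-fitting step lends itself to a short worked example: resistance measured at a series of programmed misalignments is fit to a quadratic, and the vertex of the fit estimates the overlay error. The numbers below are made up for illustration; this is not the authors' analysis code, which also extracts inflection points for process-window characterization.

      # Fit resistance vs. programmed misalignment to a parabola; the vertex
      # locates the overlay error.  Data values are made up for illustration.
      import numpy as np

      offset = np.array([-30, -20, -10, 0, 10, 20, 30])   # programmed misalignment (nm)
      resist = np.array([58.1, 43.9, 35.2, 31.8, 33.0, 40.2, 52.7])  # measured R (ohm)

      a, b, c = np.polyfit(offset, resist, deg=2)   # R ~ a*x^2 + b*x + c
      overlay = -b / (2 * a)                        # vertex: misalignment at minimum R
      print(f"estimated overlay error: {overlay:+.1f} nm")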

  13. High-Throughput Block Optical DNA Sequence Identification.

    Science.gov (United States)

    Sagar, Dodderi Manjunatha; Korshoj, Lee Erik; Hanson, Katrina Bethany; Chowdhury, Partha Pratim; Otoupal, Peter Britton; Chatterjee, Anushree; Nagpal, Prashant

    2018-01-01

    Optical techniques for molecular diagnostics or DNA sequencing generally rely on small molecule fluorescent labels, which utilize light with a wavelength of several hundred nanometers for detection. Developing a label-free optical DNA sequencing technique will require nanoscale focusing of light, a high-throughput and multiplexed identification method, and a data compression technique to rapidly identify sequences and analyze genomic heterogeneity for big datasets. Such a method should identify characteristic molecular vibrations using optical spectroscopy, especially in the "fingerprinting region" from ≈400-1400 cm^-1. Here, surface-enhanced Raman spectroscopy is used to demonstrate label-free identification of DNA nucleobases with multiplexed 3D plasmonic nanofocusing. While nanometer-scale mode volumes prevent identification of single nucleobases within a DNA sequence, the block optical technique can identify A, T, G, and C content in DNA k-mers. The content of each nucleotide in a DNA block can be a unique and high-throughput method for identifying sequences, genes, and other biomarkers as an alternative to single-letter sequencing. Additionally, coupling two complementary vibrational spectroscopy techniques (infrared and Raman) can improve block characterization. These results pave the way for developing a novel, high-throughput block optical sequencing method with lossy genomic data compression using k-mer identification from multiplexed optical data acquisition. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
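
    Block identification by nucleotide content rather than letter order is easy to make concrete: every DNA k-mer maps to its (A, T, G, C) counts, and distinct sequences with the same composition collapse to the same block. A small illustrative sketch:

      # Map DNA k-mers to their base-content "blocks": k-mers with the same
      # (A, T, G, C) counts are indistinguishable to block optical readout.
      from collections import Counter

      def block(kmer):
          counts = Counter(kmer.upper())
          return tuple(counts[base] for base in "ATGC")

      for kmer in ("AATGC", "ATAGC", "GGGTC"):
          print(kmer, "->", block(kmer))
      # AATGC and ATAGC share block (2, 1, 1, 1): same content, different order.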

  14. High-throughput selection for cellulase catalysts using chemical complementation.

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W

    2008-12-24

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.

  15. Container Throughput Forecasting Using Dynamic Factor Analysis and ARIMAX Model

    Directory of Open Access Journals (Sweden)

    Marko Intihar

    2017-11-01

    The paper examines the impact of integrating macroeconomic indicators on the accuracy of a container throughput time series forecasting model. For this purpose, dynamic factor analysis and an AutoRegressive Integrated Moving-Average model with eXogenous inputs (ARIMAX) are used. Both methodologies are integrated into a novel four-stage heuristic procedure. Firstly, dynamic factors are extracted from external macroeconomic indicators influencing the observed throughput. Secondly, a family of ARIMAX models of different orders is generated based on the derived factors. In the third stage, diagnostic and goodness-of-fit testing is applied, which includes statistical criteria such as fit performance, information criteria, and parsimony. Finally, the best model is heuristically selected and tested on real data from the Port of Koper. The results show that by incorporating macroeconomic indicators into the forecasting model, more accurate future throughput forecasts can be achieved. The model is also used to produce forecasts for the next four years, indicating more oscillatory behaviour in the period 2018-2020. Hence, care must be taken concerning any bigger investment decisions initiated from the management side. It is believed that the proposed model might be a useful reinforcement of the existing forecasting module in the observed port.
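
    A minimal version of the ARIMAX stage can be written with the SARIMAX class from Python's statsmodels package, using an exogenous regressor in place of the paper's extracted dynamic factors. Everything below (the series, the factor, and the (1,1,1) order) is synthetic and illustrative, not the Port of Koper data or the selected model.

      # ARIMAX forecast of container throughput with one exogenous
      # macroeconomic factor.  Synthetic data, arbitrary (1,1,1) order.
      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(0)
      n = 60
      factor = np.cumsum(rng.normal(0.5, 1.0, n))             # stand-in dynamic factor
      throughput = 100 + 3 * factor + rng.normal(0, 2.0, n)   # synthetic throughput

      model = SARIMAX(throughput, exog=factor.reshape(-1, 1), order=(1, 1, 1))
      fit = model.fit(disp=False)

      future = (factor[-1] + 0.5 * np.arange(1, 5)).reshape(-1, 1)  # assumed factor path
      print(fit.forecast(steps=4, exog=future))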

  16. Precision Departure Release Capability (PDRC): NASA to FAA Research Transition

    Science.gov (United States)

    Engelland, Shawn; Davis, Thomas J.

    2013-01-01

    After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and

  17. Precision forging technology for aluminum alloy

    Science.gov (United States)

    Deng, Lei; Wang, Xinyun; Jin, Junsong; Xia, Juchen

    2018-03-01

    Aluminum alloy is a preferred metal material for lightweight part manufacturing in the aerospace, automobile, and weapon industries due to its good physical properties, such as low density, high specific strength, and good corrosion resistance. However, during forging processes, underfilling, folding, broken streamlines, cracks, coarse grains, and other macro- or microdefects are easily generated because of the deformation characteristics of aluminum alloys, including a narrow forgeable temperature region, fast heat dissipation to dies, strong adhesion, high strain rate sensitivity, and large flow resistance. This makes it difficult for forged parts to attain precise shape and enhanced properties. In this paper, progress in precision forging technologies for aluminum alloy parts is reviewed. Several advanced precision forging technologies have been developed, including closed die forging, isothermal die forging, local loading forging, metal flow forging with relief cavity, auxiliary force or vibration loading, casting-forging hybrid forming, and stamping-forging hybrid forming. High-precision aluminum alloy parts can be realized by controlling the forging processes and parameters or by combining precision forging technologies with other forming technologies. The development of these technologies is beneficial to promoting the application of aluminum alloys in the manufacturing of lightweight parts.

  18. Precision medicine for advanced prostate cancer.

    Science.gov (United States)

    Mullane, Stephanie A; Van Allen, Eliezer M

    2016-05-01

    Precision cancer medicine, the use of genomic profiling of patient tumors at the point-of-care to inform treatment decisions, is rapidly changing treatment strategies across cancer types. Precision medicine for advanced prostate cancer may identify new treatment strategies and change clinical practice. In this review, we discuss the potential and challenges of precision medicine in advanced prostate cancer. Although primary prostate cancers do not harbor highly recurrent targetable genomic alterations, recent reports on the genomics of metastatic castration-resistant prostate cancer has shown multiple targetable alterations in castration-resistant prostate cancer metastatic biopsies. Therapeutic implications include targeting prevalent DNA repair pathway alterations with PARP-1 inhibition in genomically defined subsets of patients, among other genomically stratified targets. In addition, multiple recent efforts have demonstrated the promise of liquid tumor profiling (e.g., profiling circulating tumor cells or cell-free tumor DNA) and highlighted the necessary steps to scale these approaches in prostate cancer. Although still in the initial phase of precision medicine for prostate cancer, there is extraordinary potential for clinical impact. Efforts to overcome current scientific and clinical barriers will enable widespread use of precision medicine approaches for advanced prostate cancer patients.

  19. The Challenges of Precision Medicine in COPD.

    Science.gov (United States)

    Cazzola, Mario; Calzetta, Luigino; Rogliani, Paola; Matera, Maria Gabriella

    2017-08-01

    Pheno-/endotyping chronic obstructive pulmonary disease (COPD) is important because it provides patients with precise and personalized medicine. The central concept of precision medicine is to take individual variability into account when making management decisions. Precision medicine should ensure that patients get the right treatment at the right dose at the right time, with minimum harmful consequences and maximum efficacy. Ideally, we should search for genetic and molecular biomarker-based profiles. Given the clinical complexity of COPD, it seems likely that a panel of several biomarkers will be required to characterize pathogenetic factors and their course over time. The need for biomarkers to guide the clinical care of individuals with COPD and to enhance the possibilities of success in drug development is clear and urgent, but biomarker development is tremendously challenging and expensive, and translation of research efforts to date has been largely ineffective. Furthermore, the development of personalized treatments will require a much more detailed understanding of the clinical and biological heterogeneity of COPD. Therefore, we are still far from being able to apply precision medicine in COPD, and the treatable-traits and FEV1-free approaches are attempts at precision medicine in COPD that must still be considered quite unsophisticated.

  20. Apparatus for precision micromachining with lasers

    Science.gov (United States)

    Chang, J.J.; Dragon, E.P.; Warner, B.E.

    1998-04-28

    A new material processing apparatus using a short-pulsed, high-repetition-rate visible laser for precision micromachining utilizes a near diffraction limited laser, a high-speed precision two-axis tilt-mirror for steering the laser beam, an optical system for either focusing or imaging the laser beam on the part, and a part holder that may consist of a cover plate and a back plate. The system is generally useful for precision drilling, cutting, milling and polishing of metals and ceramics, and has broad application in manufacturing precision components. Precision machining has been demonstrated through percussion drilling and trepanning using this system. With a 30 W copper vapor laser running at multi-kHz pulse repetition frequency, straight parallel holes with size varying from 500 microns to less than 25 microns and with aspect ratios up to 1:40 have been consistently drilled with good surface finish on a variety of metals. Micromilling and microdrilling on ceramics using a 250 W copper vapor laser have also been demonstrated with good results. Materialographic sections of machined parts show little (submicron scale) recast layer and heat affected zone. 1 fig.

  1. IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.

    Science.gov (United States)

    Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis

    2018-04-01

    Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. By integrating the precision balance as an Internet of Things (IoT) device, it gains the ability to provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time or remotely for timely intervention of dispense errors. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
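
    The host-side pattern (poll a serial-connected balance, publish each reading over the network) can be sketched as follows. The device path, the balance's line format, and the endpoint URL are all assumptions for illustration; the system described above uses an Arduino as the network bridge rather than a PC.

      # Poll a serial-connected precision balance and publish readings to a
      # web endpoint, mirroring the IoT pattern described above.  The device
      # path, line format, and URL are assumptions, not the authors' setup.
      import serial      # pyserial
      import requests

      PORT = "/dev/ttyUSB0"                              # assumed device path
      URL = "http://lab-server.local/api/dispense-weights"  # assumed endpoint

      with serial.Serial(PORT, baudrate=9600, timeout=2) as balance:
          while True:
              line = balance.readline().decode("ascii", errors="ignore").strip()
              if not line:
                  continue                        # timeout with no reading
              try:
                  grams = float(line.split()[0])  # e.g. "1.0234 g" -> 1.0234
              except ValueError:
                  continue                        # skip non-numeric status lines
              requests.post(URL, json={"weight_g": grams}, timeout=5)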

  2. High-Throughput Screening and Quantitation of Target Compounds in Biofluids by Coated Blade Spray-Mass Spectrometry.

    Science.gov (United States)

    Tascon, Marcos; Gómez-Ríos, Germán Augusto; Reyes-Garcés, Nathaly; Poole, Justen; Boyacı, Ezel; Pawliszyn, Janusz

    2017-08-15

    Most contemporary methods of screening and quantitating controlled substances and therapeutic drugs in biofluids typically require laborious, time-consuming, and expensive analytical workflows. In recent years, our group has worked toward developing microextraction (μe)-mass spectrometry (MS) technologies that merge all of the tedious steps of the classical methods into a simple, efficient, and low-cost methodology. Unquestionably, the automation of these technologies allows for faster sample throughput, greater reproducibility, and radically reduced analysis times. Coated blade spray (CBS) is a μe technology engineered for extracting/enriching analytes of interest in complex matrices, and it can be directly coupled with MS instruments to achieve efficient screening and quantitative analysis. In this study, we introduced CBS as a technology that can be arranged to perform either rapid diagnostics (single vial) or the high-throughput (96-well plate) analysis of biofluids. Furthermore, we demonstrate that performing 96 CBS extractions at the same time allows the total analysis time to be reduced to less than 55 s per sample. Aiming to validate the versatility of CBS, substances comprising a broad range of molecular weights, moieties, protein binding, and polarities were selected. Thus, the high-throughput (HT)-CBS technology was used for the concomitant quantitation of 18 compounds (mixture of anabolics, β-2 agonists, diuretics, stimulants, narcotics, and β-blockers) spiked in human urine and plasma samples. Excellent precision (∼2.5%), accuracy (≥90%), and linearity (R^2 ≥ 0.99) were attained for all the studied compounds, and the limits of quantitation (LOQs) were within the range of 0.1 to 10 ng·mL^-1 for plasma and 0.25 to 10 ng·mL^-1 for urine. The results reported in this paper confirm CBS's great potential for achieving subsixty-second analyses of target compounds in a broad range of fields such as those related to clinical diagnosis, food, the

  3. High-throughput on-chip in vivo neural regeneration studies using femtosecond laser nano-surgery and microfluidics

    Science.gov (United States)

    Rohde, Christopher B.; Zeng, Fei; Gilleland, Cody; Samara, Chrysanthi; Yanik, Mehmet F.

    2009-02-01

    In recent years, the advantages of using small invertebrate animals as model systems for human disease have become increasingly apparent and have resulted in three Nobel Prizes in medicine or chemistry during the last six years for studies conducted on the nematode Caenorhabditis elegans (C. elegans). The availability of a wide array of species-specific genetic techniques, along with the transparency of the worm and its ability to grow in minute volumes make C. elegans an extremely powerful model organism. We present a suite of technologies for complex high-throughput whole-animal genetic and drug screens. We demonstrate a high-speed microfluidic sorter that can isolate and immobilize C. elegans in a well-defined geometry, an integrated chip containing individually addressable screening chambers for incubation and exposure of individual animals to biochemical compounds, and a device for delivery of compound libraries in standard multiwell plates to microfluidic devices. The immobilization stability obtained by these devices is comparable to that of chemical anesthesia and the immobilization process does not affect lifespan, progeny production, or other aspects of animal health. The high-stability enables the use of a variety of key optical techniques. We use this to demonstrate femtosecond-laser nanosurgery and three-dimensional multiphoton microscopy. Used alone or in various combinations these devices facilitate a variety of high-throughput assays using whole animals, including mutagenesis and RNAi and drug screens at subcellular resolution, as well as high-throughput high-precision manipulations such as femtosecond-laser nanosurgery for large-scale in vivo neural degeneration and regeneration studies.

  4. High throughput nanoimprint lithography for semiconductor memory applications

    Science.gov (United States)

    Ye, Zhengmao; Zhang, Wei; Khusnatdinov, Niyaz; Stachowiak, Tim; Irving, J. W.; Longsine, Whitney; Traub, Matthew; Fletcher, Brian; Liu, Weijun

    2017-03-01

    Imprint lithography is a promising technology for replication of nano-scale features. For semiconductor device applications, Canon deposits a low viscosity resist on a field by field basis using jetting technology. A patterned mask is lowered into the resist fluid, which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are two critical components to meeting throughput requirements for imprint lithography. Using a similar approach to what is already done for many deposition and etch processes, imprint stations can be clustered to enhance throughput. The FPA-1200NZ2C is a four station cluster system designed for high volume manufacturing. For a single station, throughput includes overhead, resist dispense, resist fill time (or spread time), exposure and separation. Resist exposure time and mask/wafer separation are well understood processing steps with typical durations on the order of 0.10 to 0.20 seconds. To achieve a total process throughput of 17 wafers per hour (wph) for a single station, it is necessary to complete the fluid fill step in 1.2 seconds. For a throughput of 20 wph, fill time must be reduced to only 1.1 seconds. There are several parameters that can impact resist filling. Key parameters include resist drop volume (smaller is better), system controls (which address drop spreading after jetting), Design for Imprint or DFI (to accelerate drop spreading) and material engineering (to promote wetting between the resist and underlying adhesion layer). In addition, it is mandatory to maintain fast filling, even for edge field imprinting. In this paper, we address the improvements made in all of these parameters to first enable a 1.20 second filling process for a device like pattern and have demonstrated this capability for both full fields and edge fields. Non
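
    The throughput arithmetic behind these fill-time targets can be made explicit: a wafers-per-hour target fixes a per-field time budget, and subtracting the fixed steps leaves the allowable fill time. In the sketch below, the field count per wafer and the dispense/overhead split are assumptions chosen for illustration; only the 17 wph target and the 0.10-0.20 s exposure/separation range come from the text.

      # Per-field time budget for single-station imprint throughput.  Field
      # count and overhead split are assumed; exposure and separation follow
      # the 0.10-0.20 s range quoted above.
      wafers_per_hour = 17
      fields_per_wafer = 100          # assumption for a 300 mm wafer

      per_field = 3600 / (wafers_per_hour * fields_per_wafer)  # s per field

      exposure, separation = 0.15, 0.15   # s, within the quoted range
      dispense, overhead   = 0.40, 0.20   # s, assumed split

      fill_budget = per_field - (exposure + separation + dispense + overhead)
      print(f"time per field: {per_field:.2f} s")
      print(f"allowable fill: {fill_budget:.2f} s")  # ~1.2 s, matching the text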

  5. The Future of Precision Medicine in Oncology.

    Science.gov (United States)

    Millner, Lori M; Strotman, Lindsay N

    2016-09-01

    Precision medicine in oncology focuses on identifying which therapies are most effective for each patient based on genetic characterization of the cancer. Traditional chemotherapy is cytotoxic and destroys all cells that are rapidly dividing. The foundation of precision medicine is targeted therapies and selecting patients who will benefit most from these therapies. One of the newest aspects of precision medicine is liquid biopsy. A liquid biopsy includes analysis of circulating tumor cells, cell-free nucleic acid, or exosomes obtained from a peripheral blood draw. These can be studied individually or in combination and collected serially, providing real-time information as a patient's cancer changes. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Toward precision medicine in primary biliary cholangitis.

    Science.gov (United States)

    Carbone, Marco; Ronca, Vincenzo; Bruno, Savino; Invernizzi, Pietro; Mells, George F

    2016-08-01

    Primary biliary cholangitis is a chronic, cholestatic liver disease characterized by a heterogeneous presentation, symptomatology, disease progression and response to therapy. In contrast, clinical management and treatment of PBC is homogeneous with a 'one size fits all' approach. The evolving research landscape, with the emergence of the -omics field and the availability of large patient cohorts are creating a unique opportunity of translational epidemiology. Furthermore, several novel disease and symptom-modifying agents for PBC are currently in development. The time is therefore ripe for precision medicine in PBC. In this manuscript we describe the concept of precision medicine; review current approaches to risk-stratification in PBC, and speculate how precision medicine in PBC might develop in the near future. Copyright © 2016 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  7. High-Precision Computation and Mathematical Physics

    International Nuclear Information System (INIS)

    Bailey, David H.; Borwein, Jonathan M.

    2008-01-01

    At the present time, IEEE 64-bit floating-point arithmetic is sufficiently accurate for most scientific applications. However, for a rapidly growing body of important scientific computing applications, a higher level of numeric precision is required. Such calculations are facilitated by high-precision software packages that include high-level language translation modules to minimize the conversion effort. This paper presents a survey of recent applications of these techniques and provides some analysis of their numerical requirements. These applications include supernova simulations, climate modeling, planetary orbit calculations, Coulomb n-body atomic systems, scattering amplitudes of quarks, gluons and bosons, nonlinear oscillator theory, Ising theory, quantum field theory and experimental mathematics. We conclude that high-precision arithmetic facilities are now an indispensable component of a modern large-scale scientific computing environment.
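
    As a small taste of what high-precision packages do, the example below evaluates a famously delicate quantity in Python with the mpmath library (an illustrative choice; the applications surveyed use dedicated packages with language translation modules).

      # Double precision vs. 50-digit arithmetic on a delicate value:
      # exp(pi*sqrt(163)) lies within about 7.5e-13 of an integer, a
      # distinction 64-bit floats cannot resolve.  Uses the mpmath package.
      import math
      from mpmath import mp, exp, pi, sqrt

      print(math.exp(math.pi * math.sqrt(163)))  # double precision: digits lost

      mp.dps = 50                  # work with 50 significant digits
      x = exp(pi * sqrt(163))
      print(x)                     # 262537412640768743.99999999999925...
      print(x - int(x))            # the tiny fractional part survives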

  8. Subdomain Precise Integration Method for Periodic Structures

    Directory of Open Access Journals (Sweden)

    F. Wu

    2014-01-01

    A subdomain precise integration method is developed for the dynamical responses of periodic structures comprising many identical structural cells. The proposed method is based on the precise integration method, the subdomain scheme, and the repeatability of the periodic structures. In the proposed method, each structural cell is seen as a super element that is solved using the precise integration method, considering the repeatability of the structural cells. The computational efforts and the memory size of the proposed method are reduced, while high computational accuracy is achieved. Therefore, the proposed method is particularly suitable to solve the dynamical responses of periodic structures. Two numerical examples are presented to demonstrate the accuracy and efficiency of the proposed method through comparison with the Newmark and Runge-Kutta methods.
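
    At the core of the precise integration method is the computation of the state-transition matrix T = exp(H·τ) by the 2^N scaling-and-doubling algorithm, which carries only the small increment Ta = T - I through the doubling so that fine detail is not swamped by the identity part. A compact numpy rendering of that standard step is given below; the subdomain and super-element bookkeeping of the paper is omitted.

      # Precise integration (2^N algorithm): build T = exp(H*tau) from its
      # small increment Ta = T - I, so the doubling recursion
      # Ta <- 2*Ta + Ta@Ta never swamps small terms with the identity.
      import numpy as np

      def precise_integration(H, tau, N=20):
          n = H.shape[0]
          s = (H * tau) / 2.0**N                 # tiny sub-step
          # 4th-order Taylor expansion of exp(s) - I
          Ta = s + s @ s @ (np.eye(n) + s / 3.0 + s @ s / 12.0) / 2.0
          for _ in range(N):                     # exp(2x)-I = 2(exp(x)-I)+(exp(x)-I)^2
              Ta = 2.0 * Ta + Ta @ Ta
          return np.eye(n) + Ta

      # Undamped oscillator x'' + 4x = 0 as a first-order system.
      H = np.array([[0.0, 1.0], [-4.0, 0.0]])
      T = precise_integration(H, tau=0.1)
      print(T)   # agrees with the exact matrix exponential to near machine precision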

  9. Microhartree precision in density functional theory calculations

    Science.gov (United States)

    Gulans, Andris; Kozhevnikov, Anton; Draxl, Claudia

    2018-04-01

    To address ultimate precision in density functional theory calculations we employ the full-potential linearized augmented plane-wave + local-orbital (LAPW + lo) method and justify its usage as a benchmark method. LAPW + lo and two completely unrelated numerical approaches, the multiresolution analysis (MRA) and the linear combination of atomic orbitals, yield total energies of atoms with mean deviations of 0.9 and 0.2 μHa, respectively. Spectacular agreement with the MRA is also reached for total and atomization energies of the G2-1 set consisting of 55 molecules. With the example of α iron we demonstrate the capability of LAPW + lo to reach μHa/atom precision also for periodic systems, which also allows for the distinction between the numerical precision and the accuracy of a given functional.

  10. Nanotechnology: The new perspective in precision agriculture

    Directory of Open Access Journals (Sweden)

    Joginder Singh Duhan

    2017-09-01

    Nanotechnology is an interdisciplinary research field. In the recent past, efforts have been made to improve agricultural yield through exhaustive research in nanotechnology. The green revolution resulted in indiscriminate use of pesticides and chemical fertilizers, which caused loss of soil biodiversity and led pathogens and pests to develop resistance. Nanoparticle-mediated delivery of materials to plants and advanced biosensors for precision farming are possible only with nanoparticles or nanochips. Nanoencapsulated conventional fertilizers, pesticides and herbicides help in the slow and sustained release of nutrients and agrochemicals, resulting in precise dosage to the plants. Nanotechnology-based plant viral disease detection kits are also becoming popular and are useful in speedy and early detection of viral diseases. In this article, the potential uses and benefits of nanotechnology in precision agriculture are discussed. Modern nanotechnology-based tools and techniques have the potential to address the various problems of conventional agriculture and can revolutionize this sector.

  11. Quantification of rapid Myosin regulatory light chain phosphorylation using high-throughput in-cell Western assays: comparison to Western immunoblots.

    Directory of Open Access Journals (Sweden)

    Hector N Aguilar

    2010-04-01

    Quantification of phospho-proteins (PPs) is crucial when studying cellular signaling pathways. Western immunoblotting (WB) is commonly used for the measurement of relative levels of signaling intermediates in experimental samples. However, WB is in general a labour-intensive and low-throughput technique. Because of variability in protein yield and phospho-signal preservation during protein harvesting, and potential loss of antigen during protein transfer, WB provides only semi-quantitative data. By comparison, the "in-cell western" (ICW) technique has high-throughput capacity and requires less extensive sample preparation. Thus, we compared the ICW technique to WB for measuring phosphorylated myosin regulatory light chain (PMLC20) in primary cultures of uterine myocytes to assess their relative specificity, sensitivity, precision, and quantification of biologically relevant responses. ICWs are cell-based microplate assays for quantification of protein targets in their cellular context. ICWs utilize a two-channel infrared (IR) scanner (Odyssey®) to quantify signals arising from near-infrared (NIR) fluorophores conjugated to secondary antibodies. One channel is dedicated to measuring the protein of interest and the second is used for data normalization of the signal in each well of the microplate. Using uterine myocytes, we assessed oxytocin (OT)-stimulated MLC20 phosphorylation measured by ICW and WB, both using NIR fluorescence. ICW and WB data were comparable regarding signal linearity, signal specificity, and time course of phosphorylation response to OT. ICW and WB yield comparable biological data. The advantages of ICW over WB are its high-throughput capacity, improved precision, and reduced sample preparation requirements. ICW might provide better sensitivity and precision with low-quantity samples or for protocols requiring large numbers of samples. These features make the ICW technique an excellent tool for the study of phosphorylation endpoints.

  12. Automated high-throughput quantification of mitotic spindle positioning from DIC movies of Caenorhabditis embryos.

    Directory of Open Access Journals (Sweden)

    David Cluet

    The mitotic spindle is a microtubule-based structure that elongates to accurately segregate chromosomes during anaphase. Its position within the cell also dictates the future cell cleavage plan, thereby determining daughter cell orientation within a tissue or cell fate adoption for polarized cells. Therefore, the mitotic spindle ensures at the same time proper cell division and developmental precision. Consequently, spindle dynamics is a matter of intensive research. Among the different cellular models that have been explored, the one-cell stage C. elegans embryo has been an essential and powerful system to dissect the molecular and biophysical basis of spindle elongation and positioning. Indeed, in this large and transparent cell, spindle poles (or centrosomes) can be easily detected from simple DIC microscopy by the human eye. To perform quantitative and high-throughput analysis of spindle motion, we developed a computer program, ACT (Automated Centrosome Tracking), for DIC movies of C. elegans embryos. We therefore offer an alternative to the image acquisition and processing of transgenic lines expressing fluorescent spindle markers. Consequently, experiments on large sets of cells can be performed with a simple setup using inexpensive microscopes. Moreover, analysis of any mutant or wild-type background is accessible because laborious rounds of crosses with transgenic lines become unnecessary. Last, our program allows spindle detection in other nematode species, which offer the same quality of DIC images but for which techniques of transgenesis are not accessible. Thus, our program also opens the way towards a quantitative evolutionary approach to spindle dynamics. Overall, our computer program is a unique macro for the image- and movie-processing platform ImageJ. It is user-friendly and freely available under an open-source licence. ACT allows batch-wise analysis of large sets of mitosis events. Within 2 minutes, a single movie is processed.

  13. Lateral Temperature-Gradient Method for High-Throughput Characterization of Material Processing by Millisecond Laser Annealing.

    Science.gov (United States)

    Bell, Robert T; Jacobs, Alan G; Sorg, Victoria C; Jung, Byungki; Hill, Megan O; Treml, Benjamin E; Thompson, Michael O

    2016-09-12

    A high-throughput method for characterizing the temperature dependence of material properties following microsecond to millisecond thermal annealing, exploiting the temperature gradients created by a lateral gradient laser spike anneal (lgLSA), is presented. Laser scans generate spatial thermal gradients of up to 5 °C/μm with peak temperatures ranging from ambient to in excess of 1400 °C, limited only by laser power and materials thermal limits. Discrete spatial property measurements across the temperature gradient are then equivalent to independent measurements after varying temperature anneals. Accurate temperature calibrations, essential to quantitative analysis, are critical and methods for both peak temperature and spatial/temporal temperature profile characterization are presented. These include absolute temperature calibrations based on melting and thermal decomposition, and time-resolved profiles measured using platinum thermistors. A variety of spatially resolved measurement probes, ranging from point-like continuous profiling to large area sampling, are discussed. Examples from annealing of III-V semiconductors, CdSe quantum dots, low-κ dielectrics, and block copolymers are included to demonstrate the flexibility, high throughput, and precision of this technique.

  14. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  15. High Throughput Screening Method for Systematic Surveillance of Drugs of Abuse by Multisegment Injection-Capillary Electrophoresis-Mass Spectrometry.

    Science.gov (United States)

    DiBattista, Alicia; Rampersaud, Dianne; Lee, Howard; Kim, Marcus; Britz-McKibbin, Philip

    2017-11-07

    New technologies are urgently required for reliable drug screening given a worldwide epidemic of prescription drug abuse and its devastating socioeconomic impacts on public health. Primary screening of drugs of abuse (DoA) currently relies on immunoassays that are prone to bias and are not applicable to detect an alarming array of psychoactive stimulants, tranquilizers, and synthetic opioids. These limitations impact patient safety when monitoring for medication compliance, drug substitution, or misuse/abuse and require follow-up confirmatory testing by more specific yet lower throughput instrumental methods. Herein, we introduce a high throughput platform for nontargeted screening of a broad spectrum of DoA and their metabolites based on multisegment injection-capillary electrophoresis-mass spectrometry (MSI-CE-MS). We demonstrate that MSI-CE-MS enables serial injections of 10 samples within a single run using high resolution MS with full-scan data acquisition. Unambiguous drug identification was achieved by four or more independent parameters, including comigration with a deuterated internal standard or in silico prediction of electromigration behavior together with accurate mass, most likely molecular formula, as well as MS/MS as required for confirmation testing. Acceptable precision was demonstrated for over 50 DoA at 3 concentration levels over 4 days (median coefficient of variance = 13%, n = 117) with minimal ion suppression, isobaric interferences, and sample carry-over. The method meets screening cutoff levels in human urine while allowing for systematic surveillance, specimen verification, and retrospective testing of designer drugs that elude conventional drug tests.

  16. IS-seq: a novel high throughput survey of in vivo IS6110 transposition in multiple Mycobacterium tuberculosis genomes

    Directory of Open Access Journals (Sweden)

    Reyes Alejandro

    2012-06-01

    Background: The insertion element IS6110 is one of the main sources of genomic variability in Mycobacterium tuberculosis, the etiological agent of human tuberculosis. Although IS6110 has been used extensively as an epidemiological marker, the identification of the precise chromosomal insertion sites has been limited by technical challenges. Here, we present IS-seq, a novel method that combines high-throughput sequencing using Illumina technology with efficient combinatorial sample multiplexing to simultaneously probe 519 clinical isolates, identifying almost all the flanking regions of the element in a single experiment. Results: We identified a total of 6,976 IS6110 flanking regions on the different isolates. When validated using reference strains, the method had 100% specificity and 98% positive predictive value. The insertions mapped to both coding and non-coding regions, and in some cases interrupted genes thought to be essential for virulence or in vitro growth. Strains were classified into families using insertion sites, and high agreement with previous studies was observed. Conclusions: This high-throughput IS-seq method, which can also be used to map insertions in other organisms, extends previous surveys of in vivo interrupted loci and provides a baseline for probing the consequences of disruptions in M. tuberculosis strains.

  17. CRISPR-Cas9 epigenome editing enables high-throughput screening for functional regulatory elements in the human genome.

    Science.gov (United States)

    Klann, Tyler S; Black, Joshua B; Chellappan, Malathi; Safi, Alexias; Song, Lingyun; Hilton, Isaac B; Crawford, Gregory E; Reddy, Timothy E; Gersbach, Charles A

    2017-06-01

    Large genome-mapping consortia and thousands of genome-wide association studies have identified non-protein-coding elements in the genome as having a central role in various biological processes. However, decoding the functions of the millions of putative regulatory elements discovered in these studies remains challenging. CRISPR-Cas9-based epigenome editing technologies have enabled precise perturbation of the activity of specific regulatory elements. Here we describe CRISPR-Cas9-based epigenomic regulatory element screening (CERES) for improved high-throughput screening of regulatory element activity in the native genomic context. Using dCas9-KRAB repressor and dCas9-p300 activator constructs and lentiviral single guide RNA libraries to target DNase I hypersensitive sites surrounding a gene of interest, we carried out both loss- and gain-of-function screens to identify regulatory elements for the β-globin and HER2 loci in human cells. CERES readily identified known and previously unidentified regulatory elements, some of which were dependent on cell type or direction of perturbation. This technology allows the high-throughput functional annotation of putative regulatory elements in their native chromosomal context.

  18. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
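
    The segmentation recipe described (superpixels classified by a Random Forest) follows a standard pattern, sketched below with scikit-image and scikit-learn on a synthetic image. This is a minimal stand-in under assumed toy labels, not the authors' pipeline, which is trained on labeled plant imagery.

      # Superpixel + Random Forest segmentation in miniature: oversegment
      # with SLIC, describe each superpixel by its mean colour, classify
      # superpixels as plant or background.  Synthetic image and toy labels
      # stand in for real training data.
      import numpy as np
      from skimage.segmentation import slic
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(1)
      img = rng.uniform(0.3, 0.5, (64, 64, 3))    # brownish "soil"
      img[16:48, 16:48] = [0.2, 0.7, 0.2]         # green "plant" patch
      img += rng.normal(0, 0.02, img.shape)

      segments = slic(img, n_segments=100, compactness=10, start_label=0)
      feats = np.array([img[segments == s].mean(axis=0)
                        for s in np.unique(segments)])

      # Toy labels: a superpixel is "plant" if it is green-dominant.
      labels = (feats[:, 1] > feats[:, 0] + 0.1).astype(int)

      clf = RandomForestClassifier(n_estimators=50, random_state=0)
      clf.fit(feats, labels)
      mask = clf.predict(feats)[segments]         # per-pixel plant mask
      print(f"plant area: {mask.sum()} of {mask.size} pixels")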

  19. Identification of antifungal compounds active against Candida albicans using an improved high-throughput Caenorhabditis elegans assay.

    Directory of Open Access Journals (Sweden)

    Ikechukwu Okoli

    2009-09-01

    Candida albicans, the most common human pathogenic fungus, can establish a persistent lethal infection in the intestine of the microscopic nematode Caenorhabditis elegans. The C. elegans-C. albicans infection model was previously adapted to screen for antifungal compounds. Modifications to this screen have been made to facilitate a high-throughput assay, including co-inoculation of nematodes with C. albicans and instrumentation allowing precise dispensing of worms into assay wells, eliminating two labor-intensive steps. This high-throughput method was utilized to screen a library of 3,228 compounds comprising 1,948 bioactive compounds and 1,280 small molecules derived via diversity-oriented synthesis. Nineteen compounds were identified that conferred an increase in C. elegans survival, including most known antifungal compounds within the chemical library. In addition to seven clinically used antifungal compounds, twelve compounds were identified which are not primarily used as antifungal agents, including three immunosuppressive drugs. This assay also allowed the assessment of the relative minimal inhibitory concentration, the effective concentration in vivo, and the toxicity of the compound in a single assay.

  1. Pushing the precision frontier in Collider Physics

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The interplay between precise theory predictions and experimental measurements has written a success story in particle physics. After a brief journey into history, we will review recent developments which have led to "revolutions" with regard to precision calculations and to new insights into the structure of quantum field theory. The second part of the talk will focus on phenomenology, especially on Higgs boson pair production as a window to physics beyond the Standard Model, manifesting itself in a modification of those Higgs couplings which are still to a large extent unconstrained, in particular the Higgs boson self-coupling.

  2. Non-precision approach in manual mode

    Directory of Open Access Journals (Sweden)

    М. В. Коршунов

    2013-07-01

    This paper considers a method for the non-precision approach of an aircraft in manual mode with a constant flight-path angle. The advantage of this method is that constructing the approach with a constant flight-path angle provides a stable flight path. A detailed analysis of the feasibility of approaches flown by the above-mentioned method is also presented. The conclusions contain recommendations regarding the use of the described method of non-precision approach during training flights.

  3. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

    Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A treatment-to-treatment fluctuation, over 11 treatments, of less than ±0.10 cm (S.D.) was obtained for anatomical points inside the treatment field. This, however, applies only to the specific anatomical points selected for the positioning procedure and not to all points within the portal. We have generalized this procedure and suggested a means by which any target volume can be consistently positioned, potentially approaching this degree of precision. (orig.)

  4. Precision medicine: what's all the fuss about?

    Science.gov (United States)

    Barker, Richard

    2016-01-01

    Precision medicine is now recognized globally as a major new era in medicine. It is being driven by advances in genomics and other 'omics' but also by the desire on the part of both health systems and governments to offer more targeted and cost-effective care. However, it faces a number of challenges, from the economics of developing more expensive companion diagnostics to the need to educate patients and the public on the advantages for them. New models of both R&D and care delivery are needed to capture the scientific, clinical and economic benefits of precision medicine.

  5. High precision detector robot arm system

    Science.gov (United States)

    Shu, Deming; Chu, Yong

    2017-01-31

    A method and high precision robot arm system are provided, for example, for X-ray nanodiffraction with an X-ray nanoprobe. The robot arm system includes duo-vertical-stages and a kinematic linkage system. A two-dimensional (2D) vertical plane ultra-precision robot arm supporting an X-ray detector provides positioning and manipulating of the X-ray detector. A vertical support for the 2D vertical plane robot arm includes spaced apart rails respectively engaging a first bearing structure and a second bearing structure carried by the 2D vertical plane robot arm.

  6. Fundamental limits of scintillation detector timing precision

    International Nuclear Information System (INIS)

    Derenzo, Stephen E; Choong, Woon-Seng; Moses, William W

    2014-01-01

    In this paper we review the primary factors that affect the timing precision of a scintillation detector. Monte Carlo calculations were performed to explore the dependence of the timing precision on the number of photoelectrons, the scintillator decay and rise times, the depth of interaction uncertainty, the time dispersion of the optical photons (modeled as an exponential decay), the photodetector rise time and transit time jitter, the leading-edge trigger level, and electronic noise. The Monte Carlo code was used to estimate the practical limits on the timing precision for an energy deposition of 511 keV in 3 mm × 3 mm × 30 mm Lu₂SiO₅:Ce and LaBr₃:Ce crystals. The calculated timing precisions are consistent with the best experimental literature values. We then calculated the timing precision for 820 cases that sampled scintillator rise times from 0 to 1.0 ns, photon dispersion times from 0 to 0.2 ns, photodetector time jitters from 0 to 0.5 ns fwhm, and A from 10 to 10 000 photoelectrons per ns decay time. Since the timing precision R was found to depend on A^(-1/2) more than any other factor, we tabulated the parameter B, where R = B·A^(-1/2). An empirical analytical formula was found that fit the tabulated values of B with an rms deviation of 2.2% of the value of B. The theoretical lower bound of the timing precision was calculated for the example of 0.5 ns rise time, 0.1 ns photon dispersion, and 0.2 ns fwhm photodetector time jitter. The lower bound was at most 15% lower than leading-edge timing discrimination for A from 10 to 10 000 photoelectrons per ns. A timing precision of 8 ps fwhm should be possible for an energy deposition of 511 keV using currently available photodetectors if a theoretically possible scintillator were developed that could produce 10 000 photoelectrons per ns. (paper)
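
    The flavor of such a Monte Carlo can be conveyed by a toy calculation. The sketch below assumes a bi-exponential photoelectron emission profile (rise and decay), Gaussian photodetector transit-time jitter, and a leading-edge trigger on the n-th earliest photoelectron; all parameter values are placeholders, not those of the paper.

        # Toy Monte Carlo of scintillation-detector timing precision.
        # The sum of two exponential draws samples the bi-exponential
        # (rise/decay) pulse shape; jitter is Gaussian; the trigger is
        # leading-edge on the n-th photoelectron. Values are placeholders.
        import numpy as np

        rng = np.random.default_rng(0)

        def trigger_times(n_pe, tau_rise, tau_decay, jitter_fwhm,
                          n_trig, trials):
            sigma = jitter_fwhm / 2.355          # fwhm -> Gaussian sigma
            times = np.empty(trials)
            for k in range(trials):
                t = (rng.exponential(tau_decay, n_pe)
                     + rng.exponential(tau_rise, n_pe))
                t += rng.normal(0.0, sigma, n_pe)  # transit-time jitter
                times[k] = np.sort(t)[n_trig - 1]  # n-th photoelectron
            return times

        t = trigger_times(n_pe=5000, tau_rise=0.5, tau_decay=40.0,
                          jitter_fwhm=0.2, n_trig=5, trials=2000)
        print("timing precision: %.1f ps fwhm" % (2.355 * t.std() * 1e3))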

  7. Automatic titrator for high precision plutonium assay

    International Nuclear Information System (INIS)

    Jackson, D.D.; Hollen, R.M.

    1986-01-01

    Highly precise assay of plutonium metal is required for accountability measurements. We have developed an automatic titrator for this determination that eliminates analyst bias and requires much less analyst time. The analyst is only required to enter sample data and start the titration. The automated instrument titrates the sample, locates the end point, and outputs the results as a paper-tape printout. Precision of the titration is less than 0.03% relative standard deviation for a single determination at the 250-mg plutonium level, and the titration time is less than 5 min.

  8. High-speed precision motion control

    CERN Document Server

    Yamaguchi, Takashi; Pang, Chee Khiang

    2011-01-01

    Written for researchers and postgraduate students in Control Engineering, as well as professionals in the Hard Disk Drive industry, this book discusses high-precision and fast servo controls in Hard Disk Drives (HDDs). The editors present a number of control algorithms that enable fast seeking and high-precision positioning, and pose problems drawn from commercial products, making the book valuable to researchers in HDDs. Each chapter is self-contained and progresses from concept to technique, presenting application examples relevant to the automotive, aerospace, aeronautical, and manufacturing industries.

  9. Fundamentals of Pharmacogenetics in Personalized, Precision Medicine.

    Science.gov (United States)

    Valdes, Roland; Yin, DeLu Tyler

    2016-09-01

    This article introduces fundamental principles of pharmacogenetics as applied to personalized and precision medicine. Pharmacogenetics establishes relationships between pharmacology and genetics by connecting phenotypes and genotypes in predicting the response of therapeutics in individual patients. We describe differences between precision and personalized medicine and relate principles of pharmacokinetics and pharmacodynamics to applications in laboratory medicine. We also review basic principles of pharmacogenetics, including its evolution, how it enables the practice of personalized therapeutics, and the role of the clinical laboratory. These fundamentals are a segue for understanding specific clinical applications of pharmacogenetics described in subsequent articles in this issue.

  10. Laser technology for high precision satellite tracking

    Science.gov (United States)

    Plotkin, H. H.

    1974-01-01

    Fixed and mobile laser ranging stations have been developed to track satellites equipped with retro-reflector arrays. These have operated consistently at data rates of once per second with range precision better than 50 cm, using Q-switched ruby lasers with pulse durations of 20 to 40 nanoseconds. Improvements are being incorporated to bring the precision to 10 cm and to permit ranging to more distant satellites. These include improved reflector array designs, processing and analysis of the received reflection pulses, and use of lasers with sub-nanosecond pulse durations.

  11. Precision measurements in nuclear beta decay

    Energy Technology Data Exchange (ETDEWEB)

    Naviliat-Cuncic, Oscar, E-mail: naviliat@nscl.msu.edu [Michigan State University, National Superconducting Cyclotron Laboratory and Department of Physics and Astronomy (United States)

    2013-03-15

    Precision measurements in nuclear beta decay provide sensitive means to determine the fundamental coupling of charged fermions to weak bosons and to test discrete symmetries in the weak interaction. The main motivation of such measurements is to find deviations from Standard Model predictions as possible indications of new physics. I focus here on two topics related to precision measurements in beta decay, namely: (i) the determination of the V_ud element of the Cabibbo-Kobayashi-Maskawa quark mixing matrix from nuclear mirror transitions and (ii) selected measurements of time-reversal-violating correlations in nuclear and neutron decays. These topics complement those presented in other contributions to this conference.

  12. High-throughput machining using high average power ultrashort pulse lasers and ultrafast polygon scanner

    Science.gov (United States)

    Schille, Joerg; Schneider, Lutz; Streek, André; Kloetzer, Sascha; Loeschner, Udo

    2016-03-01

    In this paper, high-throughput ultrashort pulse laser machining of various industrial-grade metals (aluminium, copper, stainless steel) and Al2O3 ceramic is investigated at unprecedented processing speeds. This is achieved by using a high-pulse-repetition-frequency picosecond laser with a maximum average output power of 270 W in conjunction with a unique, in-house developed two-axis polygon scanner. Initially, different polygon scanner concepts are engineered and tested to identify the optimal architecture for ultrafast and precise laser beam scanning. A remarkable scan speed of 1,000 m/s is achieved on the substrate, and thanks to the resulting low pulse overlap, thermal accumulation and plasma absorption effects are avoided at pulse repetition frequencies of up to 20 MHz. In order to identify optimum processing conditions for efficient high-average-power laser machining, the depths of cavities produced under varied parameter settings are analyzed and, from the results obtained, the characteristic removal values are specified. Maximum removal rates as high as 27.8 mm³/min for aluminium, 21.4 mm³/min for copper, 15.3 mm³/min for stainless steel and 129.1 mm³/min for Al2O3 are achieved when the full available laser power is irradiated at the optimum pulse repetition frequency.

  13. Genetic and Nongenetic Determinants of Cell Growth Variation Assessed by High-Throughput Microscopy

    Science.gov (United States)

    Ziv, Naomi; Siegal, Mark L.; Gresham, David

    2013-01-01

    In microbial populations, growth initiation and proliferation rates are major components of fitness and therefore likely targets of selection. We used a high-throughput microscopy assay, which enables simultaneous analysis of tens of thousands of microcolonies, to determine the sources and extent of growth rate variation in the budding yeast (Saccharomyces cerevisiae) in different glucose environments. We find that cell growth rates are regulated by the extracellular concentration of glucose as proposed by Monod (1949), but that significant heterogeneity in growth rates is observed among genetically identical individuals within an environment. Yeast strains isolated from different geographic locations and habitats differ in their growth rate responses to different glucose concentrations. Inheritance patterns suggest that the genetic determinants of growth rates in different glucose concentrations are distinct. In addition, we identified genotypes that differ in the extent of variation in growth rate within an environment despite nearly identical mean growth rates, providing evidence that alleles controlling phenotypic variability segregate in yeast populations. We find that the time to reinitiation of growth (lag) is negatively correlated with growth rate, yet this relationship is strain-dependent. Between environments, the respirative activity of individual cells negatively correlates with glucose abundance and growth rate, but within an environment respirative activity and growth rate show a positive correlation, which we propose reflects differences in protein expression capacity. Our study quantifies the sources of genetic and nongenetic variation in cell growth rates in different glucose environments with unprecedented precision, facilitating their molecular genetic dissection. PMID:23938868
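
    The Monod dependence cited above, μ(S) = μ_max·S/(K_S + S), can be fitted to microcolony growth-rate data in a few lines. The sketch below uses invented (glucose, growth-rate) values purely for illustration; it is not the authors' analysis code.

        # Fitting the Monod model mu(S) = mu_max * S / (Ks + S) to
        # (glucose concentration, growth rate) data -- values are invented.
        import numpy as np
        from scipy.optimize import curve_fit

        def monod(S, mu_max, Ks):
            return mu_max * S / (Ks + S)

        S = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 2.0])       # % glucose (assumed)
        mu = np.array([0.12, 0.28, 0.35, 0.44, 0.46, 0.47])  # 1/h (assumed)

        popt, pcov = curve_fit(monod, S, mu, p0=[0.5, 0.1])
        mu_max, Ks = popt
        print(f"mu_max = {mu_max:.3f} 1/h, Ks = {Ks:.3f} %")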

  14. Commentary: Roles for Pathologists in a High-throughput Image Analysis Team.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Bolon, Brad; Kanaly, Suzanne; Mahrt, Charles R; Rudmann, Dan; Charles, Elaine; Young, G David

    2016-08-01

    Historically, pathologists perform manual evaluation of H&E- or immunohistochemically-stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and the demand for increased precision of manual evaluation grow, the pathologist's assessment will include automated analyses (i.e., "digital pathology") to increase the accuracy, efficiency, and speed of diagnosis and hypothesis testing, and will serve as an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is very exciting, with digital image analysis set to expand pathology roles in research and drug development and to create new career opportunities for pathologists.

  15. Vision-based Nano Robotic System for High-throughput Non-embedded Cell Cutting.

    Science.gov (United States)

    Shang, Wanfeng; Lu, Haojian; Wan, Wenfeng; Fukuda, Toshio; Shen, Yajing

    2016-03-04

    Cell cutting is a significant task in biology research, but highly productive non-embedded cell cutting remains a major challenge for current techniques. This paper proposes a vision-based nano robotic system and uses it to realize automatic non-embedded cell cutting. First, the nano robotic system is developed and integrated with a nanoknife inside an environmental scanning electron microscope (ESEM). Then, the positions of the nanoknife and the single cell are recognized, and the distance between them is calculated dynamically based on image processing. To guarantee positioning accuracy and working efficiency, we propose a distance-regulated speed-adapting strategy, in which the moving speed is adjusted intelligently based on the distance between the nanoknife and the target cell. The results indicate that automatic non-embedded cutting can be achieved within 1-2 min with low invasiveness, benefiting from the highly precise nanorobot system and the sharp edge of the nanoknife. This research paves the way for high-throughput cell cutting under the cell's natural conditions, which is expected to have a significant impact on biology studies, especially in-situ analysis at the cellular and subcellular scale, such as cell interaction investigation, neural signal transduction and low-invasive cell surgery.

  16. Determination of tiamulin in type C medicated swine feeds using high throughput extraction with liquid chromatography.

    Science.gov (United States)

    Moore, Douglas B; Britton, Nanc L; Smallidge, Robert L; Riter, Ken L

    2002-01-01

    An improved method for extraction and analysis of tiamulin is presented to address issues that arose during routine analysis of Type C medicated swine feeds under the current U.S. Food and Drug Administration-Center for Veterinary Medicine (FDA-CVM) approved method. The issues included the need for higher sample throughput and the ability to accommodate a wider variety of feed matrices. Changes to the FDA-CVM approved method include reduced sample size and solvent volumes, phosphate buffering of tartaric acid, centrifugation, and use of a new liquid chromatography column with an adjusted mobile phase composition. A paired-sample study was performed to compare the performance of the new and existing methods; it showed no statistical difference between the sample means of 17 paired samples analyzed by both methods (t = 1.95 at the 0.05 significance level, p = 0.068). A recovery study showed the method precision to be 2.06% (coefficient of variation) with an average standard recovery of 95.8%. Ruggedness test results indicated good overall ruggedness of the method.

  17. Throughput Analysis on 3-Dimensional Underwater Acoustic Network with One-Hop Mobile Relay

    Science.gov (United States)

    Zhong, Xuefeng; Fan, Jiasheng; Guan, Quansheng; Ji, Fei; Yu, Hua

    2018-01-01

    Underwater acoustic communication networks (UACNs) have been considered an essential infrastructure for ocean exploitation, and performance analysis of UACNs is important in underwater acoustic network deployment and management. In this paper, we analyze the network throughput of three-dimensional, randomly deployed transmitter–receiver pairs. Because of the long delay of acoustic channels, complicated networking protocols with heavy signaling overhead may not be appropriate, so we consider only one-hop or two-hop transmission to save signaling cost. That is, we assume the transmitter sends the data packet to the receiver by one-hop direct transmission, or by two-hop transmission via mobile relays. We derive a closed-form formulation of the packet delivery rate with respect to the transmission delay and the number of transmitter–receiver pairs. The correctness of the derivation is verified by computer simulations. Our analysis indicates how to obtain a precise tradeoff between the delay constraint and the network capacity. PMID:29337911

  18. Accurate, high-throughput typing of copy number variation using paralogue ratios from dispersed repeats.

    Science.gov (United States)

    Armour, John A L; Palla, Raquel; Zeeuwen, Patrick L J M; den Heijer, Martin; Schalkwijk, Joost; Hollox, Edward J

    2007-01-01

    Recent work has demonstrated an unexpected prevalence of copy number variation in the human genome, and has highlighted the part this variation may play in predisposition to common phenotypes. Some important genes vary in number over a high range (e.g., DEFB4, which commonly varies between two and seven copies) and have posed formidable technical challenges for accurate copy number typing, so that there are no simple, cheap, high-throughput approaches suitable for large-scale screening. We have developed a simple comparative PCR method based on dispersed repeat sequences, using a single pair of precisely designed primers to amplify products simultaneously from both test and reference loci; the products are subsequently distinguished and quantified via internal sequence differences. We have validated the method for measurement of copy number at DEFB4 by comparing results from >800 DNA samples with copy number measurements by MAPH/REDVR, MLPA and array-CGH. The new Paralogue Ratio Test (PRT) method can require as little as 10 ng of genomic DNA, appears comparable in accuracy to the other methods, and for the first time provides a rapid, simple and inexpensive method for copy number analysis, suitable for typing thousands of samples in large case-control association studies.
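
    The quantification step of the PRT reduces to a simple ratio calculation. The sketch below is a hypothetical illustration of that step, assuming measured peak areas for the test- and reference-locus products, a two-copy reference locus, and a calibration factor from a known two-copy sample; it is not the published analysis code.

        # Paralogue Ratio Test (PRT), quantification step -- illustrative only.
        # One primer pair amplifies test and reference loci together; the two
        # products are distinguished by internal sequence differences and
        # their signal ratio converted to a copy number (reference locus
        # assumed to be present at 2 copies).

        def prt_copy_number(test_peak_area, ref_peak_area, calibration=1.0):
            """calibration: ratio observed for a known 2-copy sample (assumed)."""
            ratio = test_peak_area / ref_peak_area
            return round(2.0 * ratio / calibration)

        print(prt_copy_number(test_peak_area=5400.0, ref_peak_area=2650.0))  # ~4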

  19. Molecular Approaches for High Throughput Detection and Quantification of Genetically Modified Crops: A Review

    Directory of Open Access Journals (Sweden)

    Ibrahim B. Salisu

    2017-10-01

    As genetically modified crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for transgenic content. These diagnostic techniques are mainly divided into two major groups: identification of transgenic (1) DNA and (2) proteins from GMOs and their products. Conventional methods such as PCR (polymerase chain reaction) and enzyme-linked immunosorbent assay (ELISA) were routinely employed for DNA- and protein-based quantification, respectively. Although these techniques (PCR and ELISA) are considered convenient and productive, more advanced technologies are needed that allow high-throughput detection and quantification of GM events, as the production of more complex GMOs increases day by day. Recent approaches such as microarray, capillary gel electrophoresis, digital PCR and next-generation sequencing are therefore more promising, owing to their accuracy and precise detection of transgenic content. The present article is a brief comparative study of all such detection techniques on the basis of their advent, feasibility, accuracy, and cost-effectiveness. However, detection of a specific event, contamination by different events, and determination of fusion as well as stacked-gene proteins remain critical issues for these emerging technologies to address in the future.

  20. Vision-based Nano Robotic System for High-throughput Non-embedded Cell Cutting

    Science.gov (United States)

    Shang, Wanfeng; Lu, Haojian; Wan, Wenfeng; Fukuda, Toshio; Shen, Yajing

    2016-03-01

    Cell cutting is a significant task in biology research, but highly productive non-embedded cell cutting remains a major challenge for current techniques. This paper proposes a vision-based nano robotic system and uses it to realize automatic non-embedded cell cutting. First, the nano robotic system is developed and integrated with a nanoknife inside an environmental scanning electron microscope (ESEM). Then, the positions of the nanoknife and the single cell are recognized, and the distance between them is calculated dynamically based on image processing. To guarantee positioning accuracy and working efficiency, we propose a distance-regulated speed-adapting strategy, in which the moving speed is adjusted intelligently based on the distance between the nanoknife and the target cell. The results indicate that automatic non-embedded cutting can be achieved within 1-2 min with low invasiveness, benefiting from the highly precise nanorobot system and the sharp edge of the nanoknife. This research paves the way for high-throughput cell cutting under the cell's natural conditions, which is expected to have a significant impact on biology studies, especially in-situ analysis at the cellular and subcellular scale, such as cell interaction investigation, neural signal transduction and low-invasive cell surgery.

  1. Digital PCR provides sensitive and absolute calibration for high throughput sequencing

    Directory of Open Access Journals (Sweden)

    Fan H Christina

    2009-03-01

    Background: Next-generation DNA sequencing on the 454, Solexa, and SOLiD platforms requires absolute calibration of the number of molecules to be sequenced. This requirement has two unfavorable consequences. First, large amounts of sample (typically micrograms) are needed for library preparation, limiting the scope of samples that can be sequenced; for many applications, including metagenomics and the sequencing of ancient, forensic, and clinical samples, the quantity of input DNA can be critically limiting. Second, each library requires a titration sequencing run, increasing the cost and lowering the throughput of sequencing. Results: We demonstrate the use of digital PCR to accurately quantify 454 and Solexa sequencing libraries, enabling the preparation of sequencing libraries from nanogram quantities of input material while eliminating costly and time-consuming titration runs of the sequencer. We successfully sequenced low-nanogram-scale bacterial and mammalian DNA samples on the 454 FLX and Solexa DNA sequencing platforms. This study is the first to definitively demonstrate the successful sequencing of picogram quantities of input DNA on the 454 platform, reducing the sample requirement more than 1000-fold without pre-amplification and the associated bias and reduction in library depth. Conclusion: The digital PCR assay allows absolute quantification of sequencing libraries, eliminates uncertainties associated with the construction and application of standard curves for PCR-based quantification, and, with a coefficient of variation close to 10%, is sufficiently precise to enable direct sequencing without titration runs.
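
    The absolute quantification behind digital PCR rests on Poisson statistics: if a fraction p of partitions is positive, the mean number of template molecules per partition is λ = −ln(1 − p). The sketch below applies this standard correction; the partition counts and volume are placeholders.

        # Digital PCR absolute quantification via Poisson statistics:
        # lambda = -ln(1 - p), with p the fraction of positive partitions.
        import math

        def dpcr_concentration(n_positive, n_total, partition_volume_ul):
            p = n_positive / n_total
            lam = -math.log(1.0 - p)          # molecules per partition
            return lam / partition_volume_ul  # molecules per microliter

        # Placeholder counts: 312 positive out of 765 partitions of 6 nL each
        print(dpcr_concentration(312, 765, 0.006))  # ~87 molecules/uL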

  2. Throughput Analysis on 3-Dimensional Underwater Acoustic Network with One-Hop Mobile Relay.

    Science.gov (United States)

    Zhong, Xuefeng; Chen, Fangjiong; Fan, Jiasheng; Guan, Quansheng; Ji, Fei; Yu, Hua

    2018-01-16

    Underwater acoustic communication network (UACN) has been considered as an essential infrastructure for ocean exploitation. Performance analysis of UACN is important in underwater acoustic network deployment and management. In this paper, we analyze the network throughput of three-dimensional randomly deployed transmitter-receiver pairs. Due to the long delay of acoustic channels, complicated networking protocols with heavy signaling overhead may not be appropriate. In this paper, we consider only one-hop or two-hop transmission, to save the signaling cost. That is, we assume the transmitter sends the data packet to the receiver by one-hop direct transmission, or by two-hop transmission via mobile relays. We derive the closed-form formulation of packet delivery rate with respect to the transmission delay and the number of transmitter-receiver pairs. The correctness of the derivation results are verified by computer simulations. Our analysis indicates how to obtain a precise tradeoff between the delay constraint and the network capacity.

  3. Predictive Power of Machine Learning for Optimizing Solar Water Heater Performance: The Potential Application of High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Hao Li

    2017-01-01

    Predicting the performance of a solar water heater (SWH) is challenging due to the complexity of the system. Fortunately, knowledge-based machine learning can provide a fast and precise prediction method for SWH performance. With the predictive power of machine learning models, we can further address a more challenging question: how can a high-performance SWH be designed cost-effectively? Here, we summarize our recent studies and propose a general framework for SWH design using a machine-learning-based high-throughput screening (HTS) method. Design of the water-in-glass evacuated tube solar water heater (WGET-SWH) is selected as a case study to show the potential application of machine-learning-based HTS to the design and optimization of solar energy systems.
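
    A minimal version of such an ML-based HTS loop is sketched below: train a regressor on measured designs, then score a large batch of randomly generated virtual designs and keep the top candidates for experimental validation. The data, model choice, and parameter ranges are placeholders, not the actual WGET-SWH variables.

        # Machine-learning-based high-throughput screening, minimal sketch.
        # All data below are random placeholders standing in for measured
        # design parameters (X_train) and measured performance (y_train).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        X_train = rng.uniform(0.0, 1.0, size=(200, 5))   # 5 design parameters
        y_train = X_train @ rng.uniform(size=5) + rng.normal(0, 0.05, 200)

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X_train, y_train)

        candidates = rng.uniform(0.0, 1.0, size=(100_000, 5))  # virtual designs
        scores = model.predict(candidates)
        best = candidates[np.argsort(scores)[-10:]]  # top-10 designs to test
        print(best)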

  4. A Low Collision and High Throughput Data Collection Mechanism for Large-Scale Super Dense Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Chunyang Lei

    2016-07-01

    Super dense wireless sensor networks (WSNs) have become popular with the development of Internet of Things (IoT), Machine-to-Machine (M2M) communications and Vehicular-to-Vehicular (V2V) networks. While highly dense wireless networks provide efficient and sustainable solutions to collect precise environmental information, a new channel access scheme is needed to solve the channel collision problem caused by the large number of competing nodes accessing the channel simultaneously. In this paper, we propose a space-time random access method based on a directional data transmission strategy, by which collisions in the wireless channel are significantly decreased and channel utilization efficiency is greatly enhanced. Simulation results show that our proposed method can decrease the packet loss rate to less than 2% in large-scale WSNs and, in comparison with other channel access schemes for WSNs, can double the average network throughput.

  5. A Low Collision and High Throughput Data Collection Mechanism for Large-Scale Super Dense Wireless Sensor Networks.

    Science.gov (United States)

    Lei, Chunyang; Bie, Hongxia; Fang, Gengfa; Gaura, Elena; Brusey, James; Zhang, Xuekun; Dutkiewicz, Eryk

    2016-07-18

    Super dense wireless sensor networks (WSNs) have become popular with the development of Internet of Things (IoT), Machine-to-Machine (M2M) communications and Vehicular-to-Vehicular (V2V) networks. While highly dense wireless networks provide efficient and sustainable solutions to collect precise environmental information, a new channel access scheme is needed to solve the channel collision problem caused by the large number of competing nodes accessing the channel simultaneously. In this paper, we propose a space-time random access method based on a directional data transmission strategy, by which collisions in the wireless channel are significantly decreased and channel utilization efficiency is greatly enhanced. Simulation results show that our proposed method can decrease the packet loss rate to less than 2% in large-scale WSNs and, in comparison with other channel access schemes for WSNs, can double the average network throughput.
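
    The collision-reduction idea can be illustrated with a toy slotted random-access model in which transmissions collide only when they share both a time slot and an angular sector, so adding directional sectors divides the effective contention. This is an illustrative simulation, not the protocol of the paper.

        # Toy simulation: packet loss vs. directionality in slotted random
        # access. Collisions occur only between nodes that share a time slot
        # AND an angular sector. Purely illustrative.
        import numpy as np

        rng = np.random.default_rng(1)

        def loss_rate(n_nodes, n_slots, n_sectors, trials=200):
            lost = 0
            for _ in range(trials):
                slots = rng.integers(0, n_slots, n_nodes)
                sectors = rng.integers(0, n_sectors, n_nodes)
                keys = slots * n_sectors + sectors
                _, counts = np.unique(keys, return_counts=True)
                lost += counts[counts > 1].sum()  # all packets in a collision
            return lost / (trials * n_nodes)

        for sectors in (1, 4, 8):
            print(sectors, "sectors:", round(loss_rate(1000, 500, sectors), 3))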

  6. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    Science.gov (United States)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high-throughput, end-to-end post-fabrication processing of high-performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far-ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs), which are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  7. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Machine learning and data mining advance predictive big data analysis in precision animal agriculture.

    Science.gov (United States)

    Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C

    2018-04-14

    Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.

  8. Precision Medicine In Action | NIH MedlinePlus the Magazine

    Science.gov (United States)


  9. NCI and the Precision Medicine Initiative®

    Science.gov (United States)

    NCI's activities related to precision medicine focus on new and expanded precision medicine clinical trials; mechanisms to overcome drug resistance to cancer treatments; and the development of a shared digital repository of precision medicine trial data.

  10. RNA Biomarkers: Frontier of Precision Medicine for Cancer

    Directory of Open Access Journals (Sweden)

    Xiaochen Xi

    2017-02-01

    As an essential part of the central dogma, RNA delivers genetic and regulatory information and reflects cellular states. Accumulating data from high-throughput sequencing technologies show that various RNA molecules can serve as biomarkers for the diagnosis and prognosis of diseases such as cancer. In particular, extracellular RNAs (exRNAs), detectable in various biofluids such as serum, saliva and urine, are emerging as non-invasive biomarkers for earlier cancer diagnosis, monitoring of tumor progression, and prediction of therapy response. In this review, we summarize the latest studies on various types of RNA biomarkers, especially extracellular RNAs, in cancer diagnosis and prognosis, and illustrate several well-known RNA biomarkers of clinical utility. In addition, we describe and discuss general procedures and issues in investigating exRNA biomarkers, and perspectives on the utility of exRNAs in precision medicine.

  11. Precise Manipulation and Patterning of Protein Crystals for Macromolecular Crystallography Using Surface Acoustic Waves.

    Science.gov (United States)

    Guo, Feng; Zhou, Weijie; Li, Peng; Mao, Zhangming; Yennawar, Neela H; French, Jarrod B; Huang, Tony Jun

    2015-06-01

    Advances in modern X-ray sources and detector technology have made it possible for crystallographers to collect usable data on crystals of only a few micrometers or less in size. Despite these developments, sample handling techniques have significantly lagged behind and often prevent the full realization of current beamline capabilities. In order to address this shortcoming, a surface acoustic wave-based method for manipulating and patterning crystals is developed. This method, which does not damage the fragile protein crystals, can precisely manipulate and pattern micrometer and submicrometer-sized crystals for data collection and screening. The technique is robust, inexpensive, and easy to implement. This method not only promises to significantly increase efficiency and throughput of both conventional and serial crystallography experiments, but will also make it possible to collect data on samples that were previously intractable.

  12. A network architecture for precision formation flying using the IEEE 802.11 MAC Protocol

    Science.gov (United States)

    Clare, Loren P.; Gao, Jay L.; Jennings, Esther H.; Okino, Clayton

    2005-01-01

    Precision Formation Flying (PFF) missions involve the tracking and maintenance of spacecraft in a desired geometric formation. The strong coupling of spacecraft in formation-flying control requires inter-spacecraft communication to exchange information. In this paper, we present a network architecture that supports PFF control, from the initial random deployment phase to the final formation. We show that IEEE's 802.11 MAC protocol provides a suitable MAC layer for the application protocol. The IEEE 802.11 MAC has two modes of operation: DCF and PCF. We show that DCF is suitable for the initial deployment phase, while switching to PCF once the spacecraft are in formation improves jitter and throughput. We also consider the effect of routing on protocol performance and suggest when it is profitable to turn off route discovery to achieve better network performance.
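
    The phase-dependent mode switch described above can be stated compactly. The sketch below is a schematic decision rule only; the inputs and the route-discovery handling are simplified illustrations of the architecture, not flight software.

        # Schematic MAC-mode selection for precision formation flying
        # (inputs and thresholds are hypothetical illustrations).
        def select_mac_mode(in_formation: bool,
                            route_discovery_needed: bool) -> dict:
            if in_formation:
                # Coordinated phase: contention-free PCF improves
                # jitter and throughput.
                return {"mode": "PCF", "route_discovery": False}
            # Deployment phase: random topology favors contention-based DCF.
            return {"mode": "DCF", "route_discovery": route_discovery_needed}

        print(select_mac_mode(in_formation=True, route_discovery_needed=False))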

  13. Treatment Algorithms Based on Tumor Molecular Profiling: The Essence of Precision Medicine Trials.

    Science.gov (United States)

    Le Tourneau, Christophe; Kamal, Maud; Tsimberidou, Apostolia-Maria; Bedard, Philippe; Pierron, Gaëlle; Callens, Céline; Rouleau, Etienne; Vincent-Salomon, Anne; Servant, Nicolas; Alt, Marie; Rouzier, Roman; Paoletti, Xavier; Delattre, Olivier; Bièche, Ivan

    2016-04-01

    With the advent of high-throughput molecular technologies, several precision medicine (PM) studies are currently ongoing that include molecular screening programs and PM clinical trials. Molecular profiling programs establish the molecular profile of patients' tumors with the aim to guide therapy based on identified molecular alterations. The aim of prospective PM clinical trials is to assess the clinical utility of tumor molecular profiling and to determine whether treatment selection based on molecular alterations produces superior outcomes compared with unselected treatment. These trials use treatment algorithms to assign patients to specific targeted therapies based on tumor molecular alterations. These algorithms should be governed by fixed rules to ensure standardization and reproducibility. Here, we summarize key molecular, biological, and technical criteria that, in our view, should be addressed when establishing treatment algorithms based on tumor molecular profiling for PM trials.
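
    At its simplest, a fixed-rule treatment algorithm of the kind described is a deterministic, priority-ordered lookup from molecular alteration to assigned therapy. The sketch below is hypothetical; the alteration-drug pairs are invented for the example and carry no clinical meaning.

        # Deterministic assignment of therapy from a tumor molecular profile.
        # The rule table is entirely hypothetical (no clinical meaning); it
        # only illustrates the fixed, reproducible mapping PM trials require.
        RULES = [
            ("GENE_A amplification", "drug_1"),
            ("GENE_B activating mutation", "drug_2"),
            ("PATHWAY_C alteration", "drug_3"),
        ]

        def assign_treatment(alterations):
            for alteration, drug in RULES:        # fixed priority order
                if alteration in alterations:
                    return drug
            return "no matched targeted therapy"  # e.g., standard of care

        print(assign_treatment({"GENE_B activating mutation"}))  # -> drug_2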

  14. Molecular Classification and Pharmacogenetics of Primary Plasma Cell Leukemia: An Initial Approach toward Precision Medicine.

    Science.gov (United States)

    Simeon, Vittorio; Todoerti, Katia; La Rocca, Francesco; Caivano, Antonella; Trino, Stefania; Lionetti, Marta; Agnelli, Luca; De Luca, Luciana; Laurenzana, Ilaria; Neri, Antonino; Musto, Pellegrino

    2015-07-30

    Primary plasma cell leukemia (pPCL) is a rare and aggressive variant of multiple myeloma (MM) which may represent a valid model for high-risk MM. The disease is associated with a very poor prognosis that, unfortunately, has not significantly improved during the last three decades. New high-throughput technologies have allowed a better understanding of the molecular basis of this disease and a move toward risk stratification, providing insights for targeted therapy studies. This knowledge, added to the pharmacogenetic profiling of new and old agents in analyses of efficacy and safety, could help clinical decisions move toward precision medicine and a better clinical outcome for these patients. In this review, we describe the available literature concerning the genomic characterization and pharmacogenetics of plasma cell leukemia (PCL).

  15. Ionospheric Modeling for Precise GNSS Applications

    NARCIS (Netherlands)

    Memarzadeh, Y.

    2009-01-01

    The main objective of this thesis is to develop a procedure for modeling and predicting ionospheric Total Electron Content (TEC) for high precision differential GNSS applications. As the ionosphere is a highly dynamic medium, we believe that to have a reliable procedure it is necessary to transfer

  16. High precision, rapid laser hole drilling

    Science.gov (United States)

    Chang, Jim J.; Friedman, Herbert W.; Comaskey, Brian J.

    2013-04-02

    A laser system produces a first laser beam for rapidly removing the bulk of material in an area to form a ragged hole. The laser system produces a second laser beam for accurately cleaning up the ragged hole so that the final hole has dimensions of high precision.

  17. 40 CFR 75.41 - Precision criteria.

    Science.gov (United States)

    2010-07-01

    ... correlation analysis according to the following procedures. (i) Plot each of the paired emissions readings as... and analysis. To demonstrate precision equal to or better than the continuous emission monitoring system, the owner or operator shall conduct an F-test, a correlation analysis, and a t-test for bias as...
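
    The three statistics named in the rule (an F-test, a correlation analysis, and a t-test for bias) can be computed from paired reference/monitor readings as sketched below. The readings are invented, and the regulatory pass/fail thresholds are deliberately not reproduced; this only illustrates the calculations.

        # Paired-readings precision statistics: F-test variance ratio,
        # Pearson correlation, and paired t-test for bias. Illustration of
        # the calculations only; regulatory thresholds are not reproduced.
        import numpy as np
        from scipy import stats

        reference = np.array([101.2, 98.7, 103.4, 99.9, 100.5, 102.1])  # assumed
        monitor   = np.array([100.8, 99.1, 102.9, 100.4, 100.1, 101.6]) # assumed

        f_ratio = monitor.var(ddof=1) / reference.var(ddof=1)  # F statistic
        r, _ = stats.pearsonr(reference, monitor)              # correlation
        t_stat, p = stats.ttest_rel(reference, monitor)        # bias t-test

        print(f"F = {f_ratio:.3f}, r = {r:.3f}, t = {t_stat:.3f} (p = {p:.3f})")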

  18. Precise relative cross sections for np scattering

    International Nuclear Information System (INIS)

    Goetz, J.; Brogli-Gysin, C.; Hammans, M.; Haffter, P.; Henneck, R.; Jourdan, J.; Masson, G.; Qin, L.M.; Robinson, S.; Sick, I.; Tuccillo, M.

    1994-01-01

    We present data on the differential cross section for neutron-proton scattering at an incident neutron energy of 67 MeV. These data allow a precise determination of the ¹P₁ phase which, in phase-shift analyses, is strongly correlated with the S-D amplitude that we are measuring via different observables. (orig.)

  19. Surfaces in Precision Engineering, Microengineering and Nanotechnology

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Kunzmann, H.; Peggs, G. N.

    2003-01-01

    with precision engineering, microengineering and nanotechnology are presented, encompassing surfaces in computers, MEMS, biomedical systems, light and X-ray optics, as well as in chemical systems. Surface properties at micro and nanoscale are considered, including geometry as well as physical and chemical...

  20. Precision tests of the Standard Model

    International Nuclear Information System (INIS)

    Ol'shevskij, A.G.

    1996-01-01

    The present status of precision measurements of electroweak observables is discussed with special emphasis on results obtained recently. Taken together, these measurements provide the basis for a stringent test of the Standard Model and determination of the SM parameters. 22 refs., 23 figs., 11 tabs

  1. Lane Determination with GPS Precise Point Positioning

    NARCIS (Netherlands)

    Knoop, V.L.; de Bakker, P.F.; Tiberius, C.C.J.M.; van Arem, B.

    2017-01-01

    Modern intelligent transport solutions can achieve an improvement of traffic flow on motorways. With lane-specific measurements and lane-specific control, more measures are possible. Single frequency precise point positioning (PPP) is a newly developed and affordable technique to achieve an

  2. Bringing Precision Medicine to Community Oncologists.

    Science.gov (United States)

    2017-01-01

    Quest Diagnostics has teamed up with Memorial Sloan Kettering Cancer Center and IBM Watson Health to offer IBM Watson Genomics to its network of community cancer centers and hospitals. This new service aims to advance precision medicine by combining genomic tumor sequencing with the power of cognitive computing.

  3. High-precision positioning of radar scatterers

    NARCIS (Netherlands)

    Dheenathayalan, P.; Small, D.; Schubert, A.; Hanssen, R.F.

    2016-01-01

    Remote sensing radar satellites cover wide areas and provide spatially dense measurements, with millions of scatterers. Knowledge of the precise position of each radar scatterer is essential to identify the corresponding object and interpret the estimated deformation. The absolute position accuracy

  4. Precision of quantum tomographic detection of radiation

    Energy Technology Data Exchange (ETDEWEB)

    D'Ariano, G.M. (Dipartimento di Fisica "Alessandro Volta", Via A. Bassi 6, I-27100, Pavia (Italy); Istituto Nazionale di Fisica Nucleare, Sezione di Pavia, Via A. Bassi 6, I-27100, Pavia (Italy)); Macchiavello, Chiara (Dipartimento di Fisica "Alessandro Volta", Via A. Bassi 6, I-27100, Pavia (Italy)); Paris, M.G.A. (Dipartimento di Fisica "Alessandro Volta", Via A. Bassi 6, I-27100, Pavia (Italy))

    1994-11-21

    Homodyne tomography provides an experimental technique for reconstructing the density matrix of the radiation field. Here we analyze the tomographic precision in recovering observables like the photon number, the quadrature, and the phase. We show that tomographic reconstruction, despite providing a complete characterization of the state of the field, is generally much less efficient than conventional detection techniques. (orig.)

  5. Precision of quantum tomographic detection of radiation

    International Nuclear Information System (INIS)

    D'Ariano, G.M.; Macchiavello, Chiara; Paris, M.G.A.

    1994-01-01

    Homodyne tomography provides an experimental technique for reconstructing the density matrix of the radiation field. Here we analyze the tomographic precision in recovering observables like the photon number, the quadrature, and the phase. We show that tomographic reconstruction, despite providing a complete characterization of the state of the field, is generally much less efficient than conventional detection techniques. (orig.)

  6. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  7. Technology on precision measurement of mass

    International Nuclear Information System (INIS)

    2005-10-01

    This book deals with mass and scales as they relate to precision measurement technology, covering how to measure mass with scales and the basics of mass and scales. It includes a translation of the international OIML standard, with its measurement and technical requirements and test report format, alongside the original OIML standard with its metrological and technical requirements and test report format.

  8. Artificial intelligence, physiological genomics, and precision medicine.

    Science.gov (United States)

    Williams, Anna Marie; Liu, Yong; Regner, Kevin R; Jotterand, Fabrice; Liu, Pengyuan; Liang, Mingyu

    2018-04-01

    Big data are a major driver in the development of precision medicine. Efficient analysis methods are needed to transform big data into clinically-actionable knowledge. To accomplish this, many researchers are turning toward machine learning (ML), an approach of artificial intelligence (AI) that utilizes modern algorithms to give computers the ability to learn. Much of the effort to advance ML for precision medicine has been focused on the development and implementation of algorithms and the generation of ever larger quantities of genomic sequence data and electronic health records. However, relevance and accuracy of the data are as important as quantity of data in the advancement of ML for precision medicine. For common diseases, physiological genomic readouts in disease-applicable tissues may be an effective surrogate to measure the effect of genetic and environmental factors and their interactions that underlie disease development and progression. Disease-applicable tissue may be difficult to obtain, but there are important exceptions such as kidney needle biopsy specimens. As AI continues to advance, new analytical approaches, including those that go beyond data correlation, need to be developed and ethical issues of AI need to be addressed. Physiological genomic readouts in disease-relevant tissues, combined with advanced AI, can be a powerful approach for precision medicine for common diseases.

  9. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  10. Explicit Covariance Matrix for Particle Measurement Precision

    CERN Document Server

    Karimäki, Veikko

    1997-01-01

    We derive explicit and precise formulae for the 3 × 3 error matrix of the particle transverse momentum, direction, and impact parameter. The error matrix elements are expressed as functions of up to fourth-order statistical moments of the measured coordinates. The formulae are valid for any curvature and track length in the case of negligible multiple scattering.
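
    The moment dependence can be seen in a simpler setting: for a least-squares parabola fit y = a + bx + cx² to hits at coordinates x_i with resolution σ, the parameter covariance σ²(XᵀX)⁻¹ involves sums of x⁰ through x⁴, i.e. statistical moments of the measured coordinates up to fourth order. The sketch below computes that matrix; it illustrates the idea, not the paper's closed-form expressions for momentum, direction, and impact parameter.

        # Covariance of parabola-fit track parameters from coordinate moments:
        # cov = sigma^2 * (X^T X)^{-1}; the normal matrix contains sums of
        # x^0..x^4, i.e. moments of the measured coordinates up to 4th order.
        import numpy as np

        def fit_covariance(x, sigma):
            S = [np.sum(x**k) for k in range(5)]  # 0th..4th order sums
            XtX = np.array([[S[0], S[1], S[2]],
                            [S[1], S[2], S[3]],
                            [S[2], S[3], S[4]]])
            return sigma**2 * np.linalg.inv(XtX)

        x = np.linspace(0.0, 1.0, 10)        # detector layer positions (assumed)
        print(fit_covariance(x, sigma=0.01)) # 3x3 covariance of (a, b, c)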

  11. Precision cryogenic temperature data acquisition system

    International Nuclear Information System (INIS)

    Farah, Y.; Sondericker, J.H.

    1985-01-01

    A Multiplexed Temperature Data Acquisition System with an overall precision of ±25 ppm has been designed using state-of-the-art electronics to accurately read temperatures between 2.4 K and 600 K from pre-calibrated transducers such as germanium, silicon diode, thermistor or platinum temperature sensors.

  12. A fluence device for precise radiation dosimetry

    International Nuclear Information System (INIS)

    Arnott, R.G.T.; Peak, M.J.

    1979-01-01

    An instrument is described which has been designed to ensure precise positioning of samples and sensing devices in three dimensions at all times during irradiation procedures. The system, which is both robust and sensitive, overcomes difficulties experienced when slight variations in the positioning of a sample under irradiation result in large changes in fluence. (UK)

  13. Fiber Scrambling for High Precision Spectrographs

    Science.gov (United States)

    Kaplan, Zachary; Spronck, J. F. P.; Fischer, D.

    2011-05-01

    The detection of Earth-like exoplanets with the radial velocity method requires extreme Doppler precision and long-term stability in order to measure tiny reflex velocities in the host star. Recent planet searches have led to the detection of so-called "super-Earths" (up to a few Earth masses) that induce radial velocity changes of about 1 m/s; the detection of true Earth analogs, however, requires a precision of 10 cm/s. One of the largest factors limiting Doppler precision is variation in the Point Spread Function (PSF) from observation to observation due to changes in the illumination of the slit and spectrograph optics, so this stability has become a focus of current instrumentation work. Fiber optics have been used since the 1980s to couple telescopes to high-precision spectrographs, initially for simpler mechanical design and control; fiber optics are, however, also naturally efficient scramblers. Scrambling refers to a fiber's ability to produce an output beam independent of its input. Our research is focused on characterizing the scrambling properties of several types of fibers, including circular, square and octagonal fibers. By measuring the intensity distribution after the fiber as a function of input beam position, we can simulate guiding errors that occur at an observatory. Through this, we can determine which fibers produce the most uniform outputs for the severest guiding errors, improving the PSF and allowing sub-m/s precision. However, extensive testing of fibers of supposedly identical core diameter, length and shape from the same manufacturer has revealed the "personality" of individual fibers: differing intensity patterns for supposedly duplicate fibers illuminated identically. Here, we present our results on scrambling characterization as a function of fiber type, while studying individual fiber personality.

  14. Proposal for a CLEO precision vertex detector

    International Nuclear Information System (INIS)

    1991-01-01

    Fermilab experiment E691 and CERN experiment NA32 have demonstrated the enormous power of precision vertexing for studying heavy quark physics, and nearly all collider experiments now have or are installing precision vertex detectors. This is a proposal for a precision vertex detector for CLEO, which will be the pre-eminent heavy quark experiment for at least the next 5 years. The purpose of a precision vertex detector for CLEO is to enhance the capability to isolate B, charm, and tau decays and to make it possible to measure decay times. The precision vertex detector will also significantly improve strange particle identification and help with tracking. The installation and use of this detector at CLEO is an important step in developing a vertex detector for an asymmetric B factory, and therefore in observing CP violation in B decays. The CLEO environment imposes a number of unique conditions and challenges. The machine will be operating near the ϒ(4S) in energy, which means that B's are produced with a very small velocity and travel a distance about half the expected vertex position resolution; as a consequence, B decay time information will not be useful for most physics. On the other hand, the charm products of B decays have a higher velocity. For the long-lived D⁺ in particular, vertex information can be used to isolate the charm particle on an event-by-event basis, which helps significantly in reconstructing B's. The vertex resolution for D's from B's is limited by multiple Coulomb scattering of the necessarily rather low-momentum tracks. As a consequence it is essential to minimize the material, as measured in radiation lengths, in the beam pipe and the vertex detector itself, and to build the beam pipe and detector with the smallest possible radius.

  15. Spike timing precision of neuronal circuits.

    Science.gov (United States)

    Kilinc, Deniz; Demir, Alper

    2018-04-17

    Spike timing is believed to be a key factor in sensory information encoding and computations performed by the neurons and neuronal circuits. However, the considerable noise and variability, arising from the inherently stochastic mechanisms that exist in the neurons and the synapses, degrade spike timing precision. Computational modeling can help decipher the mechanisms utilized by the neuronal circuits in order to regulate timing precision. In this paper, we utilize semi-analytical techniques, which were adapted from previously developed methods for electronic circuits, for the stochastic characterization of neuronal circuits. These techniques, which are orders of magnitude faster than traditional Monte Carlo type simulations, can be used to directly compute the spike timing jitter variance, power spectral densities, correlation functions, and other stochastic characterizations of neuronal circuit operation. We consider three distinct neuronal circuit motifs: Feedback inhibition, synaptic integration, and synaptic coupling. First, we show that both the spike timing precision and the energy efficiency of a spiking neuron are improved with feedback inhibition. We unveil the underlying mechanism through which this is achieved. Then, we demonstrate that a neuron can improve on the timing precision of its synaptic inputs, coming from multiple sources, via synaptic integration: The phase of the output spikes of the integrator neuron has the same variance as that of the sample average of the phases of its inputs. Finally, we reveal that weak synaptic coupling among neurons, in a fully connected network, enables them to behave like a single neuron with a larger membrane area, resulting in an improvement in the timing precision through cooperation.
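
    The Monte Carlo baseline against which such semi-analytical methods are compared can be as simple as the sketch below: a leaky integrate-and-fire neuron driven by a noisy current, with spike-timing jitter estimated as the standard deviation of the first-spike time over repeated trials. All parameter values are illustrative placeholders.

        # Monte Carlo estimate of spike timing jitter for a leaky
        # integrate-and-fire neuron with white-noise input current
        # (Euler-Maruyama integration; parameters are placeholders).
        import numpy as np

        rng = np.random.default_rng(0)

        def first_spike_time(I=1.5, tau=0.02, v_th=1.0, noise=0.3,
                             dt=1e-4, t_max=0.2):
            v, t = 0.0, 0.0
            while t < t_max:
                v += (-v + I) / tau * dt + noise * np.sqrt(dt) * rng.normal()
                t += dt
                if v >= v_th:
                    return t
            return np.nan  # no spike within the window

        times = np.array([first_spike_time() for _ in range(2000)])
        print("jitter (std of spike time): %.3f ms" % (1e3 * np.nanstd(times)))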

  16. Throughput centered prioritization of machines in transfer lines

    International Nuclear Information System (INIS)

    Pascual, R.; Godoy, D.; Louit, D.M.

    2011-01-01

    In an environment of scarce resources and complex production systems, prioritization is key to confronting the challenge of managing physical assets. A number of techniques exist in the literature to prioritize maintenance decisions considering safety, technical and business perspectives. However, the effect of risk-mitigating elements, such as intermediate buffers in production lines, on prioritization has not yet been investigated in depth. In this line, this work proposes a user-friendly graphical technique called the system efficiency influence diagram (SEID). Asset managers may use SEID to identify the machines that have the greatest impact on system throughput, and thus set prioritized maintenance policies and/or redesign buffer capacities. The tool provides insight to the analyst as it decomposes the influence of a given machine on system throughput as the product of two elements: (1) a system influence efficiency factor and (2) a machine unavailability factor. We illustrate its applicability using three case studies: a four-machine transfer line, a vehicle assembly line, and an open-pit mining conveyor system. The results confirm that the machines with the greatest unavailability factors are not necessarily the most important for the efficiency of the production line, as they would be if no intermediate buffers existed. As a decision-aid tool, SEID emphasizes the need to move from a maintenance vision focused on machine availability to a systems engineering perspective. - Highlights: → We propose a graphical technique to prioritize machines in production lines. → The tool is called 'system efficiency influence diagram' (SEID). → It helps setting prioritized maintenance policies and/or redesign of buffers. → The SEID technique focuses on system efficiency and throughput. → We illustrate its applicability using three case studies.
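
    Numerically, the decomposition reads: a machine's impact on line throughput is the product of its system influence efficiency factor and its unavailability factor, so ranking machines is immediate once the two factors are known. The factor values below are invented for illustration.

        # SEID-style prioritization: throughput impact = system influence
        # efficiency factor x machine unavailability factor.
        # All factor values are invented for illustration.
        machines = {
            "M1": {"influence": 0.90, "unavailability": 0.02},
            "M2": {"influence": 0.35, "unavailability": 0.08},
            "M3": {"influence": 0.70, "unavailability": 0.05},
        }

        impact = {m: f["influence"] * f["unavailability"]
                  for m, f in machines.items()}
        for m in sorted(impact, key=impact.get, reverse=True):
            print(m, round(impact[m], 4))
        # Note: M2 has the worst availability but not the highest impact --
        # buffering can mute a machine's influence on system throughput.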

  17. High throughput experimentation for the discovery of new catalysts

    International Nuclear Information System (INIS)

    Thomson, S.; Hoffmann, C.; Johann, T.; Wolf, A.; Schmidt, H.-W.; Farrusseng, D.; Schueth, F.

    2002-01-01

    Full text: The use of combinatorial chemistry to obtain new materials has been developed extensively by the pharmaceutical and biochemical industries, but such approaches have been slow to impact the field of heterogeneous catalysis. The reasons for this lie in the difficulties associated with the synthesis, characterisation, and determination of the catalytic properties of such materials. In many synthetic and catalytic reactions, the conditions used are difficult to emulate with High Throughput Experimentation (HTE). Furthermore, the ability to screen these catalysts simultaneously in real time requires the development and/or modification of characterisation methods. Clearly, there is a need for both high-throughput synthesis and screening of new and novel reactions, and we describe several new concepts that help to achieve these goals. Although such problems have impeded the development of combinatorial catalysis, the fact remains that many highly attractive processes exist for which no suitable catalysts have been developed. The ability to decrease the time needed to evaluate catalysts is therefore essential, which makes the use of high-throughput techniques highly desirable. In this presentation we will describe the synthesis, catalytic testing, and novel screening methods developed at the Max Planck Institute. Automated synthesis procedures, performed with a modified Gilson pipette robot, will be described, as will the development of two fixed-bed reactors (16 and 49 samples) and two three-phase reactors (25 and 29 samples) for catalytic testing. We will also present new techniques for the characterisation of catalysts and catalytic products using standard IR microscopy and infrared focal-plane-array detection, respectively.

  18. Throughput centered prioritization of machines in transfer lines

    Energy Technology Data Exchange (ETDEWEB)

    Pascual, R., E-mail: rpascual@ing.puc.cl [Physical Asset Management Lab, Centro de Mineria, Pontificia Universidad Catolica de Chile, Av. Vicuna Mackenna 4860, Santiago (Chile); Godoy, D. [Physical Asset Management Lab, Centro de Mineria, Pontificia Universidad Catolica de Chile, Av. Vicuna Mackenna 4860, Santiago (Chile); Louit, D.M. [Komatsu Chile S.A., Av. Americo Vespucio 0631, Quilicura, Santiago (Chile)

    2011-10-15

    In an environment of scarce resources and complex production systems, prioritization is key to confronting the challenge of managing physical assets. The literature offers a number of techniques for prioritizing maintenance decisions that consider safety, technical, and business perspectives. However, the effect of risk-mitigating elements-such as intermediate buffers in production lines-on prioritization has not yet been investigated in depth. Along these lines, this work proposes a user-friendly graphical technique called the system efficiency influence diagram (SEID). Asset managers may use SEID to identify the machines that have the greatest impact on system throughput, and thus set prioritized maintenance policies and/or redesign buffer capacities. The tool provides insight to the analyst as it decomposes the influence of a given machine on system throughput into the product of two elements: (1) a system influence efficiency factor and (2) a machine unavailability factor. We illustrate its applicability using three case studies: a four-machine transfer line, a vehicle assembly line, and an open-pit mining conveyor system. The results confirm that the machines with the greatest unavailability factors are not necessarily the most important for the efficiency of the production line, as is the case when no intermediate buffers exist. As a decision-aid tool, SEID emphasizes the need to move from a maintenance vision focused on machine availability to a systems engineering perspective. - Highlights: > We propose a graphical technique to prioritize machines in production lines. > The tool is called the 'system efficiency influence diagram' (SEID). > It helps set prioritized maintenance policies and/or redesign buffers. > The SEID technique focuses on system efficiency and throughput. > We illustrate its applicability using three case studies.

  19. A pocket device for high-throughput optofluidic holographic microscopy

    Science.gov (United States)

    Mandracchia, B.; Bianco, V.; Wang, Z.; Paturzo, M.; Bramanti, A.; Pioggia, G.; Ferraro, P.

    2017-06-01

    Here we introduce a compact holographic microscope embedded onboard a Lab-on-a-Chip (LoC) platform. A wavefront-division interferometer is realized by writing a polymer grating onto the channel to extract a reference wave from the object wave impinging on the LoC. A portion of the beam reaches the samples flowing along the channel path, carrying their information content to the recording device, while one of the diffraction orders from the grating acts as an off-axis reference wave. Polymeric micro-lenses are delivered onto the chip by Pyro-ElectroHydroDynamic (Pyro-EHD) inkjet printing techniques. Thus, all the required optical components are embedded onboard a pocket device, and fast, non-iterative reconstruction algorithms can be used. We use our device in combination with a novel high-throughput technique, named Space-Time Digital Holography (STDH). STDH exploits the motion of samples inside microfluidic channels to obtain a synthetic hologram, mapped in a hybrid space-time domain, with intrinsically useful features. Indeed, a single Linear Sensor Array (LSA) is sufficient to build up a synthetic representation of the entire experiment (i.e. the STDH) with unlimited Field of View (FoV) along the scanning direction, independently of the magnification factor. The throughput of the imaging system is dramatically increased, as STDH provides unlimited FoV and refocusable imaging of samples inside the liquid volume with no need for hologram stitching. To test our embedded STDH microscopy module, we counted, imaged, and tracked in 3D, at high throughput, red blood cells moving inside the channel volume under non-ideal flow conditions.

  20. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The aim is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, the economy, and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high-quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication, and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes, guided by physics-based models and driven by appropriate instrumentation, sensing, and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  1. High-throughput screening to enhance oncolytic virus immunotherapy

    Directory of Open Access Journals (Sweden)

    Allan KJ

    2016-04-01

    Full Text Available KJ Allan,1,2 David F Stojdl,1–3 SL Swift1 1Children's Hospital of Eastern Ontario (CHEO) Research Institute, 2Department of Biology, Microbiology and Immunology, 3Department of Pediatrics, University of Ottawa, Ottawa, ON, Canada Abstract: High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets for improving therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. Keywords: oncolytic, virus, screen, high-throughput, cancer, chemical, genomic, immunotherapy

  2. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both the compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required to test individual samples. High-throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high-throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  3. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin Wu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
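
    A minimal sketch of the pipelining idea described above: evaluating several traits concurrently instead of sequentially. The trait names and the placeholder command are hypothetical; a real pipeline would submit cluster jobs via a batch scheduler rather than echo messages:

    ```python
    import subprocess
    from multiprocessing import Pool

    TRAITS = ["milk_yield", "fertility", "longevity"]  # hypothetical trait list

    def evaluate_trait(trait: str) -> str:
        # Each evaluation could itself be a long cluster job; here a
        # placeholder command stands in for a genomic-prediction run.
        subprocess.run(["echo", f"fitting model for {trait}"], check=True)
        return trait

    if __name__ == "__main__":
        # Run trait evaluations concurrently -- the basic source of the
        # throughput gains discussed in the record.
        with Pool(processes=len(TRAITS)) as pool:
            for done in pool.imap_unordered(evaluate_trait, TRAITS):
                print(f"finished {done}")
    ```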

  4. High-throughput anisotropic plasma etching of polyimide for MEMS

    International Nuclear Information System (INIS)

    Bliznetsov, Vladimir; Manickam, Anbumalar; Ranganathan, Nagarajan; Chen, Junwei

    2011-01-01

    This note describes a new high-throughput process of polyimide etching for the fabrication of MEMS devices with an organic sacrificial-layer approach. Using dual-frequency superimposed capacitively coupled plasma, we achieved a vertical profile of polyimide with an etching rate as high as 3.5 µm min⁻¹. After the fabrication of vertical structures in the polyimide material, additional steps were performed to fabricate structural elements of MEMS by depositing a SiO2 layer and performing release etching of the polyimide. (technical note)

  5. Application of high-throughput DNA sequencing in phytopathology.

    Science.gov (United States)

    Studholme, David J; Glover, Rachel H; Boonham, Neil

    2011-01-01

    The new sequencing technologies are already making a big impact in academic research on medically important microbes and may soon revolutionize diagnostics, epidemiology, and infection control. Plant pathology also stands to gain from exploiting these opportunities. This manuscript reviews some applications of these high-throughput sequencing methods that are relevant to phytopathology, with emphasis on the associated computational and bioinformatics challenges and their solutions. Second-generation sequencing technologies have recently been exploited in genomics of both prokaryotic and eukaryotic plant pathogens. They are also proving to be useful in diagnostics, especially with respect to viruses. Copyright © 2011 by Annual Reviews. All rights reserved.

  6. REDItools: high-throughput RNA editing detection made easy.

    Science.gov (United States)

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. REDItools is written in the Python programming language and is freely available at http://code.google.com/p/reditools/. ernesto.picardi@uniba.it or graziano.pesole@uniba.it Supplementary data are available at Bioinformatics online.

  7. High Throughput System for Plant Height and Hyperspectral Measurement

    Science.gov (United States)

    Zhao, H.; Xu, L.; Jiang, H.; Shi, S.; Chen, D.

    2018-04-01

    Hyperspectral and three-dimensional measurements capture the intrinsic physicochemical properties and the external geometric characteristics of objects, respectively. Currently, a variety of sensors are integrated into a single system to collect spectral and morphological information in agriculture. However, previous experiments were usually performed with several commercial devices on a single platform, and inadequate registration and synchronization among the instruments often resulted in a mismatch between the spectral and 3D information of the same target. Moreover, a narrow field of view (FOV) extends working hours on farms. Therefore, we propose a high-throughput prototype that combines stereo vision and grating dispersion to simultaneously acquire hyperspectral and 3D information.
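
    The stereo-vision half of such a system recovers geometry through standard triangulation, Z = f·B/d. A small sketch of that relation (the calibration values are illustrative, not from the record; plant height would follow as the difference between the ground depth and the canopy-top depth):

    ```python
    def depth_from_disparity(focal_px: float, baseline_m: float,
                             disparity_px: float) -> float:
        """Standard stereo triangulation: Z = f * B / d.

        focal_px     -- focal length in pixels (assumed calibrated)
        baseline_m   -- distance between the two cameras in metres
        disparity_px -- pixel offset of the same point between the views
        """
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a visible point")
        return focal_px * baseline_m / disparity_px

    # Example: f = 1200 px, baseline = 0.1 m, disparity = 48 px -> Z = 2.5 m.
    print(depth_from_disparity(1200, 0.1, 48))
    ```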

  8. A High-Throughput SU-8 Microfluidic Magnetic Bead Separator

    DEFF Research Database (Denmark)

    Bu, Minqiang; Christensen, T. B.; Smistrup, Kristian

    2007-01-01

    We present a novel microfluidic magnetic bead separator based on an SU-8 fabrication technique for high-throughput applications. The experimental results show that magnetic beads can be captured with an efficiency of 91% and 54% at flow rates of 1 mL/min and 4 mL/min, respectively. Integration of soft magnetic elements in the chip leads to a slightly higher capturing efficiency and a more uniform distribution of captured beads over the separation chamber than the system without soft magnetic elements.

  9. Quack: A quality assurance tool for high throughput sequence data.

    Science.gov (United States)

    Thrash, Adam; Arick, Mark; Peterson, Daniel G

    2018-05-01

    The quality of data generated by high-throughput DNA sequencing tools must be rapidly assessed in order to determine how useful the data may be in making biological discoveries; higher quality data leads to more confident results and conclusions. Due to the ever-increasing size of data sets and the importance of rapid quality assessment, tools that analyze sequencing data should quickly produce easily interpretable graphics. Quack addresses these issues by generating information-dense visualizations from FASTQ files at a speed far surpassing other publicly available quality assurance tools in a manner independent of sequencing technology. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
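
    Quack's internals are not described in the record, but the basic computation such tools perform is a per-position pass over read qualities. A minimal sketch, assuming Phred+33 quality encoding (the file path is hypothetical):

    ```python
    from itertools import islice

    def mean_quality_by_position(fastq_path: str) -> list[float]:
        """Mean Phred quality score at each read position in a FASTQ file."""
        totals: list[int] = []
        counts: list[int] = []
        with open(fastq_path) as fh:
            while True:
                record = list(islice(fh, 4))      # FASTQ records are 4 lines
                if len(record) < 4:
                    break
                quals = record[3].rstrip("\n")    # 4th line holds qualities
                for i, ch in enumerate(quals):
                    if i == len(totals):
                        totals.append(0)
                        counts.append(0)
                    totals[i] += ord(ch) - 33     # decode Phred+33
                    counts[i] += 1
        return [t / c for t, c in zip(totals, counts)]

    # Hypothetical usage; a quality drop toward later positions is the
    # classic signature this kind of plot reveals.
    # print(mean_quality_by_position("reads.fastq"))
    ```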

  10. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  11. High Throughput WAN Data Transfer with Hadoop-based Storage

    Science.gov (United States)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Wüerthwein, F.

    2011-12-01

    The Hadoop distributed file system (HDFS) has become increasingly popular in recent years as a key building block of integrated grid storage solutions in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high-energy physics experiments, which must manage, share, and process datasets at the petabyte scale in a highly distributed grid computing environment. In this paper, we present our experience of high-throughput WAN data transfer with an HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  12. High Throughput WAN Data Transfer with Hadoop-based Storage

    International Nuclear Information System (INIS)

    Amin, A; Thomas, M; Bockelman, B; Letts, J; Martin, T; Pi, H; Sfiligoi, I; Wüerthwein, F; Levshina, T

    2011-01-01

    The Hadoop distributed file system (HDFS) has become increasingly popular in recent years as a key building block of integrated grid storage solutions in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high-energy physics experiments, which must manage, share, and process datasets at the petabyte scale in a highly distributed grid computing environment. In this paper, we present our experience of high-throughput WAN data transfer with an HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  13. High throughput platforms for structural genomics of integral membrane proteins.

    Science.gov (United States)

    Mancia, Filippo; Love, James

    2011-08-01

    Structural genomics approaches for integral membrane proteins have been postulated for over a decade, yet specific efforts lag years behind their soluble counterparts. Indeed, high-throughput methodologies for the production and characterization of prokaryotic integral membrane proteins are only now emerging, while large-scale efforts for eukaryotic ones are still in their infancy. Presented here is a review of recent literature on ongoing membrane-protein structural genomics initiatives, with a focus on those implementing techniques aimed at increasing our rate of success for this class of macromolecules. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    Full Text Available This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.

  15. Correction of Microplate Data from High-Throughput Screening.

    Science.gov (United States)

    Wang, Yuhong; Huang, Ruili

    2016-01-01

    High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noise in microplate-formatted data from HTS have unique characteristics, and they can generally be grouped into three categories: run-wise (temporal, multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noise, based mainly on pattern recognition and digital signal processing techniques.
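
    As one concrete example of plate-wise (background-pattern) correction, Tukey's median polish iteratively removes row and column medians; it is the core step of the widely used B-score normalization. A minimal sketch on a synthetic 8x12 plate (a generic recipe, not the chapter's specific method):

    ```python
    import numpy as np

    def median_polish(plate: np.ndarray, n_iter: int = 10) -> np.ndarray:
        """Suppress row/column background trends in a microplate reading
        by iteratively subtracting row and column medians."""
        resid = plate.astype(float).copy()
        for _ in range(n_iter):
            resid -= np.median(resid, axis=1, keepdims=True)  # row medians
            resid -= np.median(resid, axis=0, keepdims=True)  # column medians
        return resid

    # Synthetic 8x12 plate with a linear column gradient (a typical
    # plate-wise background pattern) on top of random well noise.
    rng = np.random.default_rng(1)
    plate = rng.normal(100, 5, size=(8, 12)) + np.arange(12) * 3.0

    corrected = median_polish(plate)
    print(plate.mean(axis=0).round(1))      # rising column means (gradient)
    print(corrected.mean(axis=0).round(1))  # gradient removed, means near 0
    ```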

  16. HIGH THROUGHPUT SYSTEM FOR PLANT HEIGHT AND HYPERSPECTRAL MEASUREMENT

    Directory of Open Access Journals (Sweden)

    H. Zhao

    2018-04-01

    Full Text Available Hyperspectral and three-dimensional measurements capture the intrinsic physicochemical properties and the external geometric characteristics of objects, respectively. Currently, a variety of sensors are integrated into a single system to collect spectral and morphological information in agriculture. However, previous experiments were usually performed with several commercial devices on a single platform, and inadequate registration and synchronization among the instruments often resulted in a mismatch between the spectral and 3D information of the same target. Moreover, a narrow field of view (FOV) extends working hours on farms. Therefore, we propose a high-throughput prototype that combines stereo vision and grating dispersion to simultaneously acquire hyperspectral and 3D information.

  17. Throughput-Based Traffic Steering in LTE-Advanced HetNet Deployments

    DEFF Research Database (Denmark)

    Gimenez, Lucas Chavarria; Kovacs, Istvan Z.; Wigard, Jeroen

    2015-01-01

    The objective of this paper is to propose traffic steering solutions that aim at optimizing the end-user throughput. Two different implementations of an active-mode throughput-based traffic steering algorithm for Heterogeneous Networks (HetNets) are introduced. One that always forces handover of t[…] throughput is generally higher, reaching values of 36% and 18% for the medium- and high-load conditions.

  18. Full-wave current conveyor precision rectifier

    Directory of Open Access Journals (Sweden)

    Đukić Slobodan R.

    2008-01-01

    Full Text Available A circuit that provides precision rectification of small signals with low temperature sensitivity, for frequencies up to 100 kHz and without waveform distortion, is presented. It utilizes an improved second-type current conveyor based on a current-steering output stage and biased silicon diodes. The use of a DC current source to bias the rectifying diodes provides higher temperature stability and a lower DC offset level at the output. The proposed design of the precision rectifier ensures good current-transfer linearity in a range that satisfies class-A operation of the amplifier, and a good voltage-transfer characteristic for low-level signals. Distortion during the zero crossing of the input signal is practically eliminated. The proposed rectifier is realized with standard components.

  19. A precision pulser for main ring extraction

    International Nuclear Information System (INIS)

    Dinkel, J.; Biggs, J.

    1985-01-01

    A pulser has been designed to produce a 14 Hz sinusoidal current pulse at a 2 s repetition rate, with peak amplitudes from 400 A to 3750 A and a long-term stability of ±400 mA. Short-term stability is achieved by the use of a precision voltage regulator for the capacitor bank. This voltage regulator uses gate turn-off thyristors to control the charging current to the 13 mF capacitor bank. The load current is monitored with a precision DC current transductor. The peak value is read into a single-chip microcomputer programmed to act as a digital regulator. The microcomputer calculates reference values for the capacitor bank charging supply and the capacitor bank voltage regulator.
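
    The record says only that the microcomputer reads the measured peak current and computes new reference values; the control law itself is not given. A minimal sketch, assuming a simple proportional-integral (PI) update with illustrative gains:

    ```python
    class DigitalRegulator:
        """Minimal PI-style digital regulator sketch.

        The PI form and the gain values below are illustrative
        assumptions, not the actual law used in the pulser.
        """

        def __init__(self, setpoint_a: float, kp: float = 0.5, ki: float = 0.1):
            self.setpoint = setpoint_a   # desired peak current in amps
            self.kp, self.ki = kp, ki
            self.integral = 0.0

        def update(self, measured_peak_a: float) -> float:
            """Return a correction to the charging-voltage reference."""
            error = self.setpoint - measured_peak_a
            self.integral += error       # accumulate pulse-to-pulse error
            return self.kp * error + self.ki * self.integral

    reg = DigitalRegulator(setpoint_a=3750.0)
    for peak in (3740.2, 3746.8, 3749.1, 3750.4):  # successive pulse readings
        print(f"reference correction: {reg.update(peak):+.2f}")
    ```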

  20. A precision pulser for main ring extraction

    Energy Technology Data Exchange (ETDEWEB)

    Dinkel, J.; Biggs, J.

    1985-10-01

    A pulser has been designed to produce a 14 Hz sinusoidal current pulse at a 2 s repetition rate, with peak amplitudes from 400 A to 3750 A and a long-term stability of ±400 mA. Short-term stability is achieved by the use of a precision voltage regulator for the capacitor bank. This voltage regulator uses gate turn-off thyristors to control the charging current to the 13 mF capacitor bank. The load current is monitored with a precision DC current transductor. The peak value is read into a single-chip microcomputer programmed to act as a digital regulator. The microcomputer calculates reference values for the capacitor bank charging supply and the capacitor bank voltage regulator.

  1. Precision pulser for main ring extraction

    International Nuclear Information System (INIS)

    Dinkel, J.; Biggs, J.

    1985-01-01

    A pulser has been designed to produce a 14 Hz sinusoidal current pulse at a 2 s repetition rate, with peak amplitudes from 400 A to 3750 A and a long-term stability of ±400 mA. Short-term stability is achieved by the use of a precision voltage regulator for the capacitor bank. This voltage regulator uses gate turn-off thyristors to control the charging current to the 13 mF capacitor bank. The load current is monitored with a precision DC current transductor. The peak value is read into a single-chip microcomputer programmed to act as a digital regulator. The microcomputer calculates reference values for the capacitor bank charging supply and the capacitor bank voltage regulator.

  2. Precision experiments with antihydrogen: an outlook

    International Nuclear Information System (INIS)

    Doser, Michael

    2011-01-01

    After a first generation of experiments demonstrated the feasibility of forming low-energy antihydrogen atoms in a controlled manner via several different techniques, a second generation of experiments is now attempting to trap sufficiently cold atoms or to form an atomic beam of antihydrogen atoms. The goal of these experiments is to carry out comparative precision spectroscopy between hydrogen and antihydrogen, with a view to testing the CPT theorem, either through 1S-2S spectroscopy or via a measurement of the hyperfine splitting of the ground state of antihydrogen. A related class of experiments combines techniques from these experiments with recent developments in the formation of positronium to test the gravitational interaction between matter and antimatter. A significant number of challenges and limitations still need to be overcome before precision measurements with antihydrogen become feasible, with the next significant milestones being either the trapping of antihydrogen or the formation of an antihydrogen beam.

  3. Mobile Robotic Teams Applied to Precision Agriculture

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Matthew Oley; Kinoshita, Robert Arthur; Mckay, Mark D; Willis, Walter David; Gunderson, R.W.; Flann, N.S.

    1999-04-01

    The Idaho National Engineering and Environmental Laboratory (INEEL) and Utah State University's Center for Self-Organizing and Intelligent Systems (CSOIS) have developed a team of autonomous robotic vehicles for precision agriculture. A unique technique has been developed to plan, coordinate, and optimize missions for these autonomous vehicles in large structured environments in real time. Two generic tasks are supported: 1) driving to a precise location, and 2) sweeping an area while activating on-board equipment. Sensor data and task-achievement data are shared among the vehicles, enabling them to cooperatively adapt to changing environmental, vehicle, and task conditions. This paper discusses the development of the autonomous robotic team, the details of the mission-planning algorithm, and successful field demonstrations at the INEEL.

  4. Mobile Robotic Teams Applied to Precision Agriculture

    Energy Technology Data Exchange (ETDEWEB)

    M.D. McKay; M.O. Anderson; N.S. Flann (Utah State University); R.A. Kinoshita; R.W. Gunderson; W.D. Willis (INEEL)

    1999-04-01

    The Idaho National Engineering and Environmental Laboratory (INEEL) and Utah State University's Center for Self-Organizing and Intelligent Systems (CSOIS) have developed a team of autonomous robotic vehicles for precision agriculture. A unique technique has been developed to plan, coordinate, and optimize missions for these autonomous vehicles in large structured environments in real time. Two generic tasks are supported: 1) driving to a precise location, and 2) sweeping an area while activating on-board equipment. Sensor data and task-achievement data are shared among the vehicles, enabling them to cooperatively adapt to changing environmental, vehicle, and task conditions. This paper discusses the development of the autonomous robotic team, the details of the mission-planning algorithm, and successful field demonstrations at the INEEL.

  5. Improving the precision of noisy oscillators

    Science.gov (United States)

    Moehlis, Jeff

    2014-04-01

    We consider how the period of an oscillator is affected by white noise, with special attention given to the cases of additive noise and parameter fluctuations. Our treatment is based upon the concepts of isochrons, which extend the notion of the phase of a stable periodic orbit to the basin of attraction of the periodic orbit, and phase response curves, which can be used to understand the geometry of isochrons near the periodic orbit. This includes a derivation of the leading-order effect of noise on the statistics of an oscillator’s period. Several examples are considered in detail, which illustrate the use and validity of the theory, and demonstrate how to improve a noisy oscillator’s precision by appropriately tuning system parameters or operating away from a bifurcation point. It is also shown that appropriately timed impulsive kicks can give further improvements to oscillator precision.
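
    A minimal numerical sketch of the setting: a phase oscillator with additive white noise, integrated by Euler-Maruyama, whose period statistics can be sampled directly. The frequency and noise intensity below are illustrative values, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    omega = 2 * np.pi  # natural frequency -> deterministic period of 1
    D = 0.01           # white-noise intensity (illustrative)
    dt = 1e-3

    def simulate_periods(n_periods: int = 500) -> np.ndarray:
        """Euler-Maruyama integration of dphi = omega*dt + sqrt(2*D)*dW.

        Each time the phase crosses a multiple of 2*pi, record the
        elapsed time, yielding a sample of noisy periods.
        """
        periods, phi, t, t_last, crossings = [], 0.0, 0.0, 0.0, 0
        while len(periods) < n_periods:
            phi += omega * dt + np.sqrt(2 * D * dt) * rng.normal()
            t += dt
            if phi >= 2 * np.pi * (crossings + 1):
                crossings += 1
                periods.append(t - t_last)
                t_last = t
        return np.asarray(periods)

    T = simulate_periods()
    print(f"mean period {T.mean():.4f}, variance {T.var():.2e}")
    # Reducing D (e.g., by tuning parameters away from a bifurcation
    # point, as the paper suggests) shrinks the period variance,
    # i.e., improves the oscillator's precision.
    ```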

  6. Precision measurements of the CKM angle gamma

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The level of CP violation permitted within the Standard Model cannot account for the matter-dominated universe in which we live. Within the Standard Model, the CKM matrix, which describes the quark couplings, is expected to be unitary. By making precise measurements of the CKM matrix parameters, new physics models can be constrained, or, with sufficient precision, the effects of physics beyond the Standard Model might become apparent. The CKM angle gamma is the least well known angle of the unitarity triangle. It is the only angle easily accessible at tree level, and furthermore has almost no theoretical uncertainties. It therefore provides an invaluable Standard Model benchmark against which other new-physics-sensitive tests of CP violation can be made. I will discuss recent measurements of gamma using the Run 1 LHCb dataset, which improve our knowledge of this key parameter.

  7. Precision measurements with LPCTrap at GANIL

    Energy Technology Data Exchange (ETDEWEB)

    Liénard, E., E-mail: lienard@lpccaen.in2p3.fr; Ban, G. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Couratin, C. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Delahaye, P. [GANIL, CEA/DSM-CNRS/IN2P3 (France); Durand, D.; Fabian, X. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Fabre, B. [CELIA, Université Bordeaux, CNRS, CEA (France); Fléchard, X. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Finlay, P. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Mauger, F. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Méry, A. [CIMAP, CEA/CNRS/ENSICAEN, Université de Caen (France); Naviliat-Cuncic, O. [NSCL and Department of Physics and Astronomy, MSU (United States); Pons, B. [CELIA, Université Bordeaux, CNRS, CEA (France); Porobic, T. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Quéméner, G. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Severijns, N. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Thomas, J. C. [GANIL, CEA/DSM-CNRS/IN2P3 (France); Velten, Ph. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium)

    2015-11-15

    The experimental achievements and the results obtained so far with the LPCTrap device installed at GANIL are presented. The apparatus is dedicated to the study of the weak interaction at low energy by means of precise measurements of the β-ν angular correlation parameter in nuclear β decays. So far, the data collected with three isotopes have made it possible to determine, for the first time, the charge-state distributions of the recoiling ions induced by the shakeoff process. The analysis is currently being refined to deduce the correlation parameters, with the potential to improve both the constraint deduced at low energy on exotic tensor currents (⁶He¹⁺) and the precision on the V_ud element of the quark-mixing matrix (³⁵Ar¹⁺ and ¹⁹Ne¹⁺) deduced from the mirror-transitions dataset.

  8. A primer on precision medicine informatics.

    Science.gov (United States)

    Sboner, Andrea; Elemento, Olivier

    2016-01-01

    In this review, we describe the key components of a computational infrastructure for a precision medicine program based on clinical-grade genomic sequencing. Specific aspects covered include software components and hardware infrastructure, reporting, integration into Electronic Health Records for routine clinical use, and regulatory aspects. We emphasize informatics components related to reproducibility and reliability in genomic testing, regulatory compliance, traceability and documentation of processes, integration into clinical workflows, privacy requirements, prioritization and interpretation of results to report based on clinical needs, the rapidly evolving knowledge base of genomic alterations and clinical treatments, and the return of results in a timely and predictable fashion. We also seek to differentiate between the use of precision medicine in germline and cancer settings. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  9. Precision beam splitters for CO2 lasers

    International Nuclear Information System (INIS)

    Franzen, D.L.

    1975-01-01

    Beam splitters for 10-μm lasers are discussed and then applied to the precision measurement of high average powers. In particular, beam splitter stability has been investigated in various materials over the 20-600 W power range, with power densities up to 1 kW/cm². The absolute beam splitter ratios are given along with the achieved measurement precisions. The semiconductors investigated were GaAs, CdTe, and ZnSe, in addition to one alkali halide, KCl. Standard deviations of 1% for the beam splitter ratios over the power range were typical. Absolute ratios agree with the predictions of Fresnel's equations to 1% or better. The best measurement was made on ZnSe, where a standard deviation of 0.4% was obtained for a ratio that agreed with a calculation from Fresnel's equations to better than 0.5%.

  10. Cyclotrons as Drivers for Precision Neutrino Measurements

    International Nuclear Information System (INIS)

    Alonso, J.; Barletta, W. A.; Winslow, L. A.; Shaevitz, M. H.; Spitz, J.; Conrad, J. M.; Toups, M.; Adelmann, A.

    2014-01-01

    As we enter the age of precision measurement in neutrino physics, improved flux sources are required. These must have a well defined flavor content with energies in ranges where backgrounds are low and cross-section knowledge is high. Very few sources of neutrinos can meet these requirements. However, pion/muon and isotope decay-at-rest sources qualify. The ideal drivers for decay-at-rest sources are cyclotron accelerators, which are compact and relatively inexpensive. This paper describes a scheme to produce decay-at-rest sources driven by such cyclotrons, developed within the DAEδALUS program. Examples of the value of the high precision beams for pursuing Beyond Standard Model interactions are reviewed. New results on a combined DAEδALUS—Hyper-K search for CP violation that achieve errors on the mixing matrix parameter of 4° to 12° are presented

  11. Adaptation to high throughput batch chromatography enhances multivariate screening.

    Science.gov (United States)

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High-throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or compare multiple resins in a given process, as opposed to assessing protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here, where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown, and the impact of the load conditions is assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
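
    The well-per-experiment layout amounts to a factorial design mapped onto the plate. A sketch of how 96 multivariate conditions can be enumerated and assigned to wells (the factor names and levels are hypothetical; the study varied load, wash, and elution conditions):

    ```python
    from itertools import product

    # Hypothetical factor levels for a batch chromatography screen:
    loads_mg_ml = [5, 10, 20, 40]            # protein load per mL resin
    salt_mM = [0, 50, 100, 150, 200, 250]    # wash-buffer salt steps
    ph = [5.0, 6.0, 7.0, 8.0]                # elution-buffer pH levels

    # Full factorial: 4 x 6 x 4 = 96 conditions -> one per well.
    conditions = list(product(loads_mg_ml, salt_mM, ph))
    assert len(conditions) == 96

    rows, cols = "ABCDEFGH", range(1, 13)
    wells = [f"{r}{c}" for r in rows for c in cols]
    plate_map = dict(zip(wells, conditions))  # well ID -> (load, salt, pH)
    print(plate_map["A1"], plate_map["H12"])
    ```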

  12. High-throughput fragment screening by affinity LC-MS.

    Science.gov (United States)

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach to hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, each with its own pros and cons, have been employed to screen fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak-affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed at a throughput of 3500 fragments per day. Thirty hits were defined, which subsequently entered a secondary screening using an active-site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin, with an estimated dissociation constant (K_D) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and it therefore promises to be a valuable method for fragment screening.

  13. High-throughput screening of ionic conductivity in polymer membranes

    International Nuclear Information System (INIS)

    Zapata, Pedro; Basak, Pratyay; Carson Meredith, J.

    2009-01-01

    Combinatorial and high-throughput techniques have been used successfully for efficient and rapid property screening in multiple fields, and they offer an advantageous new approach to assaying ionic conductivity and accelerating the development of novel materials in research areas such as fuel cells. A high-throughput ionic conductivity (HTC) apparatus is described and applied to screening candidate polymer electrolyte membranes for fuel cell applications. The device uses a miniature four-point probe for rapid, automated point-to-point AC electrochemical impedance measurements in both liquid and humid-air environments. The conductivity of Nafion 112 HTC validation standards was within 1.8% of the manufacturer's specification. HTC screening of 40 novel Kynar poly(vinylidene fluoride) (PVDF)/acrylic polyelectrolyte (PE) membranes focused on varying the Kynar type (5x) and PE composition (8x) using reduced sample sizes. Two factors were found to be significant in determining proton-conducting capacity: (1) Kynar PVDF series: membranes containing a particular Kynar PVDF type exhibited statistically identical mean conductivity to membranes containing different Kynar PVDF types belonging to the same series or family; and (2) maximum effective amount of polyelectrolyte: increasing the polyelectrolyte content from 55 wt% to 60 wt% showed no statistically significant increase in conductivity. In fact, some membranes experienced a reduction in conductivity.
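
    Converting each four-point impedance reading into a conductivity uses the standard geometric relation σ = L/(R·A). A small sketch (the probe geometry and resistance values are illustrative; the record does not state them):

    ```python
    def in_plane_conductivity(resistance_ohm: float, length_cm: float,
                              width_cm: float, thickness_cm: float) -> float:
        """Membrane conductivity from a four-point-probe resistance.

        Standard relation sigma = L / (R * A), where A = width * thickness
        is the cross-section the current passes through.
        """
        area = width_cm * thickness_cm
        return length_cm / (resistance_ohm * area)  # S/cm

    # Example: R = 850 ohm across a 0.425 cm gap on a 1 cm wide,
    # 50 um thick membrane -> 0.100 S/cm, a Nafion-like value.
    print(f"{in_plane_conductivity(850, 0.425, 1.0, 0.005):.3f} S/cm")
    ```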

  14. High-throughput technology for novel SO2 oxidation catalysts

    International Nuclear Information System (INIS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts for the oxidation of SO2 to SO3. High-throughput methods are reviewed, and the problems encountered in adapting them to the corrosive conditions of SO2 oxidation are described. We show that, while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring SO2 conversion. Installing plain sugar absorbents at the reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of the elements used for prescreening and those remaining after screening of the first catalyst generations. (topical review)

  15. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example, DNA repair proteins, microtubule organizing centers, P bodies, and kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow the quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle, we examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells; hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools for the high-throughput quantitation of kinetochore foci fluorescence. We use this 'FociQuant' tool to compare methods of kinetochore quantitation, and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
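
    The record does not describe FociQuant's implementation, but the generic foci-quantitation recipe is: threshold, label connected regions, report per-focus intensities. A minimal sketch using scikit-image on a synthetic image (not the FociQuant code itself):

    ```python
    import numpy as np
    from skimage import filters, measure

    def quantify_foci(image: np.ndarray) -> list[dict]:
        """Threshold an image, label connected bright regions, and report
        per-focus area and integrated fluorescence intensity."""
        threshold = filters.threshold_otsu(image)   # global Otsu threshold
        labels = measure.label(image > threshold)   # connected components
        foci = []
        for region in measure.regionprops(labels, intensity_image=image):
            foci.append({
                "area_px": region.area,
                "integrated_intensity": region.mean_intensity * region.area,
            })
        return foci

    # Synthetic test image: dim background plus one bright focus.
    rng = np.random.default_rng(3)
    img = rng.normal(10, 2, (128, 128))
    img[40:46, 60:66] += 50
    print(quantify_foci(img))
    ```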

  16. Optimizing SIEM Throughput on the Cloud Using Parallelization.

    Science.gov (United States)

    Alam, Masoom; Ihsan, Asif; Khan, Muazzam A; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, Muhammad Khurram; Farooq, Sajid

    2016-01-01

    Processing large amounts of data in real time to identify security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSPs), mostly hosting their applications on the Cloud, receive events at a very high rate, varying from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of a security framework, OSTROM, built on the Esper complex event processing (CEP) engine, under parallel and non-parallel computational frameworks. We explain three architectures under which Esper can be used to process events, and we investigated the effect on throughput, memory, and CPU usage in each configuration. The results indicate that the performance of the engine is limited by the number of incoming events rather than by the queries being processed. The architecture in which one-quarter of the total events is submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory, and CPU usage.
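
    The best-performing architecture shards the event stream so that each engine instance sees one-quarter of the events while running all queries. The record's system uses the Esper CEP engine (Java); the Python sketch below illustrates only the sharding pattern, with a toy severity filter standing in for a real query:

    ```python
    from multiprocessing import Pool

    N_WORKERS = 4  # mirrors the best setup: 1/4 of the events per instance

    def process_events(shard):
        """Stand-in for one CEP-engine instance running all queries
        against its shard of the event stream (not Esper's actual API)."""
        alerts = [e for e in shard if e["severity"] >= 8]  # toy "query"
        return len(shard), len(alerts)

    if __name__ == "__main__":
        events = [{"id": i, "severity": i % 10} for i in range(100_000)]
        # Round-robin split of the stream across the worker instances.
        shards = [events[i::N_WORKERS] for i in range(N_WORKERS)]
        with Pool(N_WORKERS) as pool:
            for processed, alerts in pool.map(process_events, shards):
                print(f"processed {processed} events, raised {alerts} alerts")
    ```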

  17. A gas trapping method for high-throughput metabolic experiments.

    Science.gov (United States)

    Krycer, James R; Diskin, Ciana; Nelson, Marin E; Zeng, Xiao-Yi; Fazakerley, Daniel J; James, David E

    2018-01-01

    Research into cellular metabolism has become more high-throughput, with typical cell-culture experiments being performed in multiwell plates (microplates). This format presents a challenge when trying to collect gaseous products, such as carbon dioxide (CO2), which requires a sealed environment and a vessel separate from the biological sample. To address this limitation, we developed a gas trapping protocol using perforated plastic lids in sealed cell-culture multiwell plates. We used this trap design to measure CO2 production from glucose and fatty acid metabolism, as well as hydrogen sulfide production from cysteine-treated cells. Our data clearly show that this gas trap can be applied to liquid and solid gas-collection media and can be used to study gaseous product generation by both adherent cells and cells in suspension. Since our gas traps can be adapted to multiwell plates of various sizes, they present a convenient, cost-effective solution that can accommodate the trend toward high-throughput measurements in metabolic research.

  18. Management of High-Throughput DNA Sequencing Projects: Alpheus.

    Science.gov (United States)

    Miller, Neil A; Kingsmore, Stephen F; Farmer, Andrew; Langley, Raymond J; Mudge, Joann; Crow, John A; Gonzalez, Alvaro J; Schilkey, Faye D; Kim, Ryan J; van Velkinburgh, Jennifer; May, Gregory D; Black, C Forrest; Myers, M Kathy; Utsey, John P; Frost, Nicholas S; Sugarbaker, David J; Bueno, Raphael; Gullans, Stephen R; Baxter, Susan M; Day, Steve W; Retzel, Ernest F

    2008-12-26

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural, and basic biological research. Concomitant with these opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analyses. Alpheus is an analysis pipeline, database, and visualization software package for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput and relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing), and Applied Biosystems' SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants, and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis.
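
    A minimal sketch of the variant-filtering step described above, applied to hypothetical records (the field names and thresholds are illustrative, not Alpheus's actual schema):

    ```python
    # Hypothetical variant calls; field names are illustrative only.
    variants = [
        {"pos": 101, "type": "SNP",   "coverage": 42, "allele_freq": 0.51, "qual": 37},
        {"pos": 250, "type": "indel", "coverage": 6,  "allele_freq": 0.18, "qual": 12},
        {"pos": 377, "type": "SNP",   "coverage": 88, "allele_freq": 0.97, "qual": 55},
    ]

    def passes(v, min_cov=10, min_af=0.25, min_qual=20, types=("SNP", "indel")):
        """Filter variant calls on coverage, allele frequency, quality,
        and variant type -- the kinds of criteria the record lists for
        minimizing false positives while keeping true positives."""
        return (v["coverage"] >= min_cov and v["allele_freq"] >= min_af
                and v["qual"] >= min_qual and v["type"] in types)

    kept = [v for v in variants if passes(v)]
    print(f"kept {len(kept)} of {len(variants)} variant calls")
    ```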

  19. High-throughput GPU-based LDPC decoding

    Science.gov (United States)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) codes are linear block codes known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems, such as DVB-S2, WiMAX, Wi-Fi, and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared with its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.
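
    The decoder itself is the iterative message-passing loop the abstract mentions. As a CPU-side illustration, here is the min-sum approximation of sum-product decoding on a toy parity-check matrix (the matrix and LLRs are illustrative, not a standardized code; a GPU version would parallelize the per-node updates):

    ```python
    import numpy as np

    # Toy parity-check matrix (illustrative only, not a DVB-S2/WiMAX code).
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def min_sum_decode(llr, H, max_iter=20):
        """Min-sum LDPC decoding (a common simplification of sum-product).

        llr: per-bit channel log-likelihood ratios (positive = bit likely 0).
        Returns the hard-decision codeword estimate.
        """
        m, n = H.shape
        M = np.zeros((m, n))                   # check-to-variable messages
        for _ in range(max_iter):
            # Variable-to-check messages (extrinsic: exclude own incoming).
            V = np.where(H == 1, llr + M.sum(axis=0) - M, 0.0)
            for i in range(m):                 # check-node update
                idx = np.flatnonzero(H[i])
                v = V[i, idx]
                sign = np.where(v < 0, -1.0, 1.0)
                mag = np.abs(v)
                for k, j in enumerate(idx):
                    rest = np.delete(np.arange(idx.size), k)
                    M[i, j] = sign[rest].prod() * mag[rest].min()
            hard = (llr + M.sum(axis=0) < 0).astype(int)
            if not ((H @ hard) % 2).any():     # all parity checks satisfied
                break
        return hard

    # Noisy LLRs for the all-zero codeword; bit 2 is received unreliably
    # (wrong sign) and is corrected by the message passing.
    llr = np.array([2.5, 1.8, -0.3, 2.2, 1.9, 2.1, 1.7])
    print(min_sum_decode(llr, H))              # expect all zeros
    ```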

  20. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner-ear hair cells, the mechanosensory hair cells on their lateral line allow zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, establishing it as an excellent biomarker for anatomic damage to lateral-line hair cells. Building on work by our group and others, we have built a new, fully automated, high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate the rapid translation of candidate drugs into preclinical mammalian models of hearing loss.

  1. The JCSG high-throughput structural biology pipeline

    International Nuclear Information System (INIS)

    Elsliger, Marc-André; Deacon, Ashley M.; Godzik, Adam; Lesley, Scott A.; Wooley, John; Wüthrich, Kurt; Wilson, Ian A.

    2010-01-01

    The Joint Center for Structural Genomics (JCSG) high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years, making a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step and can adapt to a wide range of protein targets, from bacterial to human. The construction, expansion, and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances. The vast number of targets and the enormous amount of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  2. High-throughput characterization for solar fuels materials discovery

    Science.gov (United States)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts or light absorbers for the PEC do not yet exist. The mission of HTE is to provide accelerated discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). I will present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at the Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  3. Combinatorial chemoenzymatic synthesis and high-throughput screening of sialosides.

    Science.gov (United States)

    Chokhawala, Harshal A; Huang, Shengshu; Lau, Kam; Yu, Hai; Cheng, Jiansong; Thon, Vireak; Hurtado-Ziola, Nancy; Guerrero, Juan A; Varki, Ajit; Chen, Xi

    2008-09-19

Although the vital roles of structures containing sialic acid in biomolecular recognition are well documented, limited information is available on how sialic acid structural modifications, sialyl linkages, and the underlying glycan structures affect the binding or the activity of sialic acid-recognizing proteins and related downstream biological processes. A novel combinatorial chemoenzymatic method has been developed for the highly efficient synthesis of biotinylated sialosides containing different sialic acid structures and different underlying glycans in 96-well plates from biotinylated sialyltransferase acceptors and sialic acid precursors. By transferring the reaction mixtures to NeutrAvidin-coated plates and assaying for the yields of enzymatic reactions using lectins recognizing sialyltransferase acceptors but not the sialylated products, the biotinylated sialoside products can be directly used, without purification, for high-throughput screening to quickly identify the ligand specificity of sialic acid-binding proteins. For a proof-of-principle experiment, 72 biotinylated α2,6-linked sialosides were synthesized in 96-well plates from 4 biotinylated sialyltransferase acceptors and 18 sialic acid precursors using a one-pot three-enzyme system. High-throughput screening assays performed in NeutrAvidin-coated microtiter plates show that whereas Sambucus nigra Lectin binds to α2,6-linked sialosides with high promiscuity, human Siglec-2 (CD22) is highly selective for a number of sialic acid structures and the underlying glycans in its sialoside ligands.
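To make the combinatorial arithmetic concrete (4 acceptors × 18 precursors = 72 sialosides in a 96-well plate), a minimal sketch of such a plate layout, with placeholder compound identifiers rather than the authors' compound names, might look like:

```python
# Hypothetical plate-layout sketch for a 4 x 18 combinatorial synthesis.
from itertools import product
from string import ascii_uppercase

acceptors = [f"Acc{i}" for i in range(1, 5)]        # 4 biotinylated acceptors
precursors = [f"Sia{j:02d}" for j in range(1, 19)]  # 18 sialic acid precursors

# Standard 96-well coordinates: rows A-H, columns 1-12.
wells = [f"{row}{col}" for row in ascii_uppercase[:8] for col in range(1, 13)]

# zip() stops after the 72 acceptor x precursor combinations,
# leaving 24 wells free (e.g., for controls).
layout = dict(zip(wells, product(acceptors, precursors)))

print(len(layout))                  # 72 reactions
print(layout["A1"], layout["F12"])  # ('Acc1', 'Sia01') ('Acc4', 'Sia18')
```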

  4. High-Throughput Analysis and Automation for Glycomics Studies.

    Science.gov (United States)

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also expanding interest in the use of glycomics (for example, in Genome Wide Association Studies) to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases, including cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures; however, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges of each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.
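As a minimal illustration of the quantitation step discussed above, relative glycan abundances are commonly derived by normalizing integrated peak areas to the profile total; the glycan names and areas here are placeholders, not data from the review:

```python
# Illustrative relative quantitation from integrated peak areas
# (placeholder glycan labels and values).
peak_areas = {"G0F": 4200.0, "G1F": 6100.0, "G2F": 1800.0, "Man5": 350.0}

total = sum(peak_areas.values())
relative = {glycan: 100.0 * area / total for glycan, area in peak_areas.items()}

for glycan, pct in sorted(relative.items(), key=lambda kv: -kv[1]):
    print(f"{glycan:>5}: {pct:5.1f}%")
```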

  5. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants is becoming ever more obvious in studies of the relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or even tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, or the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be loaded rapidly into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants, as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  6. Dashboard visualizations: Supporting real-time throughput decision-making.

    Science.gov (United States)

    Franklin, Amy; Gantela, Swaroop; Shifarraw, Salsawit; Johnson, Todd R; Robinson, David J; King, Brent R; Mehta, Amit M; Maddow, Charles L; Hoot, Nathan R; Nguyen, Vickie; Rubio, Adriana; Zhang, Jiajie; Okafor, Nnaemeka G

    2017-07-01

Providing timely and effective care in the emergency department (ED) requires the management of individual patients as well as the flow and demands of the entire department. Strategic changes to work processes, such as adding a flow coordination nurse or a physician in triage, have demonstrated improvements in throughput times. However, such global strategic changes do not address the real-time, often opportunistic workflow decisions of individual clinicians in the ED. We believe that real-time representation of the status of the entire emergency department and each patient within it through information visualizations will better support clinical decision-making in-the-moment and provide for rapid intervention to improve ED flow. This notion is based on previous work where we found that clinicians' workflow decisions were often based on an in-the-moment local perspective, rather than a global perspective. Here, we discuss the challenges of designing and implementing visualizations for ED through a discussion of the development of our prototype Throughput Dashboard and the potential it holds for supporting real-time decision-making.

  7. High-throughput screening with micro-x-ray fluorescence

    International Nuclear Information System (INIS)

    Havrilla, George J.; Miller, Thomasin C.

    2005-01-01

Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Given the increasing threat that chemical warfare (CW) agents will be used both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and to develop means for detecting a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation, coupled with the heteroatoms inherent in the target molecules of the combinatorial reaction, to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation and no special tags for analysis, and its screening time depends on the desired sensitivity.

  8. Optimizing SIEM Throughput on the Cloud Using Parallelization.

    Directory of Open Access Journals (Sweden)

    Masoom Alam

Processing large amounts of data in real time to identify security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSPs), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of a security framework, OSTROM, built on the Esper complex event processing (CEP) engine, under parallel and non-parallel computational frameworks. We explain three architectures under which Esper can be used to process events, and we investigate the effect on throughput, memory, and CPU usage in each configuration. The results indicate that the performance of the engine is limited by the rate of incoming events rather than by the queries being processed. The architecture in which one quarter of the total events is submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory, and CPU usage.
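A minimal Python analogue of the best-performing architecture (each of four instances receives one quarter of the event stream and evaluates all queries) is sketched below. This is not the OSTROM/Esper implementation; the event fields and stand-in "queries" are invented for illustration:

```python
# Sketch of the "1/N of events per instance, all queries on every instance"
# architecture, with processes standing in for CEP engine instances.
from multiprocessing import Pool
import time

# Stand-in "queries": each is a predicate over an event dict.
QUERIES = [
    lambda e: e["failed_logins"] > 5,
    lambda e: e["bytes_out"] > 1_000_000,
]

def process_partition(events):
    """One engine instance: run every query against its share of events."""
    return sum(1 for e in events for q in QUERIES if q(e))

if __name__ == "__main__":
    events = [{"failed_logins": i % 10, "bytes_out": i * 997}
              for i in range(400_000)]
    n_instances = 4
    # Round-robin split: 1/N of the stream goes to each instance.
    chunks = [events[i::n_instances] for i in range(n_instances)]
    start = time.perf_counter()
    with Pool(n_instances) as pool:
        alerts = sum(pool.map(process_partition, chunks))
    elapsed = time.perf_counter() - start
    print(f"{alerts} alerts, {len(events) / elapsed:,.0f} events/s")
```

Comparing the measured events/s here against a single-process run of the same predicates mirrors the paper's throughput comparison across architectures.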

  9. Seismic Motion Stability, Measurement and Precision Control.

    Science.gov (United States)

    1979-12-01

[Abstract recovered only in fragments:] The azimuth reference used for precision alignment of test fixtures and components is a porro prism mounted within the inner laboratory. The orientation of this prism is ascertained, using autocollimating theodolites, either by directly "swinging" Polaris into the laboratory from outside through optical ... [remainder lost; interleaved table-of-contents residue listed sections 6.2 Low Frequency Tilt Loop, 6.3 Mechanical Resonances, and 6.4 Seismometer Matching]

  10. Hidden SUSY from precision gauge unification

    Energy Technology Data Exchange (ETDEWEB)

    Krippendorf, Sven; Nilles, Hans Peter [Bonn Univ. (Germany). Bethe Center for Theoretical Physics; Bonn Univ. (Germany). Physikalisches Inst.; Ratz, Michael [Technische Univ. Muenchen, Garching (Germany). Physik-Department; Winkler, Martin Wolfgang [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-06-15

We revisit the implications of naturalness and gauge unification in the MSSM. We find that precision unification of the couplings in connection with a small μ parameter requires a highly compressed gaugino pattern as it is realized in mirage mediation. Due to the small mass difference between gluino and LSP, collider limits on the gluino mass are drastically relaxed. Without further assumptions, the relic density of the LSP is very close to the observed dark matter density due to coannihilation effects.

  11. Static quarks with improved statistical precision

    International Nuclear Information System (INIS)

    Della Morte, M.; Duerr, S.; Molke, H.; Heitger, J.

    2003-09-01

We present a numerical study of different discretisations of the static action, concerning cut-off effects and the growth of statistical errors with Euclidean time. An error reduction by an order of magnitude can be obtained with respect to the Eichten-Hill action for time separations up to 2 fm, while keeping discretisation errors small. The best actions lead to a big improvement in the precision of the quark mass M_b and of F_{B_s} in the static approximation. (orig.)

  12. Precision Index in the Multivariate Context

    Czech Academy of Sciences Publication Activity Database

    Šiman, Miroslav

    2014-01-01

Vol. 43, No. 2 (2014), pp. 377-387. ISSN 0361-0926. R&D Projects: GA MŠk(CZ) 1M06047. Institutional support: RVO:67985556. Keywords: data depth; multivariate quantile; process capability index; precision index; regression quantile. Subject RIV: BA - General Mathematics. Impact factor: 0.274, year: 2014. http://library.utia.cas.cz/separaty/2014/SI/siman-0425059.pdf

  13. Proliferation of Precision Strike: Issues for Congress

    Science.gov (United States)

    2012-05-14

[Only search-snippet fragments of this record survive:] ... industrial base. Finally, should Congress legislate requirements for DOD to develop precision strike countermeasures and then provide funding for that ... [footnote residue citing Department of Defense, Part 187 (Environmental Effects Abroad of Major Department of Defense Actions), Section 187.3: Definitions, and Bryan Clark and Dan Whiteneck] ... so many missiles ... referring to reports of Venezuelan arms flowing to Colombian guerrillas ... The Chavez regime also has close ties with ...

  14. Electroweak Precision Measurements with the ATLAS Detector

    CERN Document Server

    Linck, Rebecca Anne; The ATLAS collaboration

    2018-01-01

    As part of its ongoing exploration into the nature of the particles produced in high energy proton-proton collisions, the ATLAS detector has been used to perform a number of new precision electroweak measurements. In this talk the recent measurements of the W-boson mass, the Drell-Yan triple-differential cross-section and the polarisation of tau leptons in Z/γ* → ττ decays will be discussed.

  15. Electroweak precision data and gravitino dark matter

    Indian Academy of Sciences (India)

We analyze the precision observables in the context of the GDM, focusing on parameter combinations that fulfill 0.094 < Ω_CDM h² < 0.129 [7]. In order to simplify the analysis in a motivated manner, we ... m_{1/2} discussed above maps into an analogous preference for moderate tan β (see ref. [2]). It can be shown that, at the ...

  16. Precision Jet production for the LHC

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Jet production is one of the basic processes at the LHC with numerous uses for standard model and BSM phenomenology. Understanding this process with suitable precision has been a long-standing goal for the particle physics community. I will report on our recent calculation of the NNLO contribution to jet production using antenna subtraction and discuss what these results might mean for jet phenomenology in the near future.

  17. Precision Landing and Hazard Avoidance Domain

    Science.gov (United States)

    Robertson, Edward A.; Carson, John M., III

    2016-01-01

The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking. Autonomous PL&HA builds upon the core GN&C capabilities developed to enable soft, controlled landings on the Moon, Mars, and other solar system bodies. Through the addition of a Terrain Relative Navigation (TRN) function, precision landing within tens of meters of a map-based target is possible. The addition of a 3-D terrain mapping lidar sensor improves the probability of a safe landing via autonomous, real-time Hazard Detection and Avoidance (HDA). PL&HA significantly improves the probability of mission success and enhances access to sites of scientific interest located in challenging terrain. PL&HA can also utilize external navigation aids, such as navigation satellites and surface beacons. Key elements include:
    - Advanced lidar sensors: high-precision ranging, velocimetry, and 3-D terrain mapping.
    - Terrain Relative Navigation (TRN): compares onboard reconnaissance data with real-time terrain imaging data to update the spacecraft position estimate.
    - Hazard Detection and Avoidance (HDA): generates a high-resolution, 3-D terrain map in real time during the approach trajectory to identify safe landing targets.
    - Inertial navigation during terminal descent: high-precision surface-relative sensors enable accurate inertial navigation during terminal descent and a tightly controlled touchdown within meters of the selected safe landing target.
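As a hedged sketch of the TRN matching idea (illustrative only, not flight software), the position update can be reduced to correlating a descent-camera patch against an onboard reference map and taking the best-matching offset as the position fix; all names here are hypothetical:

```python
# Brute-force normalized cross-correlation as a toy TRN position update.
import numpy as np

def trn_position_fix(reference_map, camera_patch):
    """Return ((row, col), score) of the best match of camera_patch
    within reference_map, using normalized cross-correlation."""
    ph, pw = camera_patch.shape
    patch = (camera_patch - camera_patch.mean()) / (camera_patch.std() + 1e-9)
    best, best_rc = -np.inf, (0, 0)
    H, W = reference_map.shape
    for r in range(H - ph + 1):
        for c in range(W - pw + 1):
            win = reference_map[r:r+ph, c:c+pw]
            win = (win - win.mean()) / (win.std() + 1e-9)
            score = float((patch * win).mean())  # ~Pearson correlation
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc, best

# Synthetic check: cut a noisy patch from a random "map", recover its location.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
r0, c0 = 20, 33
patch = ref[r0:r0+16, c0:c0+16] + 0.05 * rng.standard_normal((16, 16))
print(trn_position_fix(ref, patch))  # ~((20, 33), score near 1)
```

A real TRN system would additionally handle scale, rotation, illumination changes, and outlier rejection, and would fuse the fix with inertial navigation.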

  18. Accuracy and precision in activation analysis: counting

    International Nuclear Information System (INIS)

    Becker, D.A.

    1974-01-01

Accuracy and precision in activation analysis were investigated with regard to the counting of induced radioactivity. The various parameters discussed include configuration, positioning, density, homogeneity, intensity, radioisotopic purity, peak integration, and nuclear constants. Experimental results are presented for many of these parameters. The results obtained indicate that counting errors often contribute significantly to the inaccuracy and imprecision of analyses. The magnitude of these errors ranges from less than 1 percent to 10 percent or more in many cases.
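The counting contribution to precision follows Poisson statistics: a background-subtracted net peak area N_net = N_gross − N_bkg carries σ = √(N_gross + N_bkg), so the relative counting error shrinks as counts accumulate. A small worked example with illustrative numbers (not data from the paper):

```python
# Poisson counting-statistics contribution to activation-analysis precision.
import math

def net_area_precision(gross_counts, background_counts):
    """Return (net area, 1-sigma uncertainty, % relative std. deviation)."""
    net = gross_counts - background_counts
    sigma = math.sqrt(gross_counts + background_counts)
    return net, sigma, 100.0 * sigma / net

for gross, bkg in [(1_000, 400), (100_000, 40_000)]:
    net, sigma, rsd = net_area_precision(gross, bkg)
    print(f"gross={gross:>7}  net={net:>6}  sigma={sigma:7.1f}  RSD={rsd:4.1f}%")
# A 100x increase in counts tightens the counting error from ~6.2% to ~0.6%,
# consistent with counting errors ranging from under 1% to 10% or more.
```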

  19. Hidden SUSY from precision gauge unification

    International Nuclear Information System (INIS)

    Krippendorf, Sven; Nilles, Hans Peter

    2013-06-01

    We revisit the implications of naturalness and gauge unification in the MSSM. We find that precision unification of the couplings in connection with a small μ parameter requires a highly compressed gaugino pattern as it is realized in mirage mediation. Due to the small mass difference between gluino and LSP, collider limits on the gluino mass are drastically relaxed. Without further assumptions, the relic density of the LSP is very close to the observed dark matter density due to coannihilation effects.

  20. Precision Rescue Behavior in North American Ants

    Directory of Open Access Journals (Sweden)

    Katherine Taylor

    2013-07-01

Altruistic behavior, in which one individual provides aid to another at some cost to itself, is well documented. However, some species engage in a form of altruism, called rescue, that places the altruist in immediate danger. Here we investigate one such example, namely rescuing victims captured by predators. In a field experiment with two North American ant species, Tetramorium sp. E and Prenolepis imparis, individuals were held in artificial snares simulating capture. T. sp. E, but not P. imparis, exhibited digging, pulling, and snare biting, the latter precisely targeted to the object binding the victim. These results are the first to document precision rescue in a North American ant species; moreover, unlike rescue in other ants, T. sp. E rescues conspecifics from different colonies, mirroring their atypical social behavior, namely the lack of aggression between non-nestmate (heterocolonial) conspecifics. In a second, observational study designed to demonstrate rescue from an actual predator, T. sp. E victims were dropped into an antlion's pit and the behavior of a single rescuer was observed. Results showed that T. sp. E not only attempted to release the victim, but also risked attacking the predator, suggesting that precision rescue may play an important role in this species' antipredator behavior.