WorldWideScience

Sample records for next-generation high-end parallel

  1. Parallel phase model: a programming model for high-end parallel machines with manycores.

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Junfeng (Syracuse University, Syracuse, NY); Wen, Zhaofang; Heroux, Michael Allen; Brightwell, Ronald Brian

    2009-04-01

    This paper presents a parallel programming model, the Parallel Phase Model (PPM), for next-generation high-end parallel machines based on a distributed-memory architecture consisting of a networked cluster of nodes with a large number of cores on each node. PPM has a unified high-level programming abstraction that facilitates the design and implementation of parallel algorithms exploiting both the parallelism of the many cores and the parallelism at the cluster level. The abstraction is suitable for expressing both fine-grained and coarse-grained parallelism. It includes a few high-level parallel programming language constructs that can be added as an extension to an existing (sequential or parallel) programming language such as C; the implementation of PPM also includes a lightweight runtime library that runs on top of an existing network communication layer (e.g., MPI). The design philosophy of PPM and details of the programming abstraction are also presented. Several unstructured applications that inherently require high-volume random fine-grained data accesses have been implemented in PPM with very promising results.
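
    PPM's actual language constructs are not reproduced here, but the two-level decomposition it targets can be sketched in ordinary Python (names such as node_task and fine_grained are illustrative, not part of PPM):

```python
from concurrent.futures import ThreadPoolExecutor

def fine_grained(x):
    # fine-grained work item, standing in for one random data access
    return x * x

def node_task(chunk):
    # coarse-grained unit, standing in for the work mapped to one node;
    # inside it, a pool of "cores" processes the fine-grained items
    with ThreadPoolExecutor(max_workers=4) as cores:
        return sum(cores.map(fine_grained, chunk))

data = list(range(100))
chunks = [data[i:i + 25] for i in range(0, len(data), 25)]
# at the cluster level each chunk would go to a different node (serial here)
total = sum(node_task(c) for c in chunks)
print(total)  # 328350, the sum of squares 0..99
```

    In the real model the coarse level would run over the network (e.g. MPI) rather than in one process; the point is only the two nested grains of parallelism.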

  2. High-resolution analysis of the 5'-end transcriptome using a next generation DNA sequencer.

    Directory of Open Access Journals (Sweden)

    Shin-ichi Hashimoto

    Massively parallel, tag-based sequencing systems, such as the SOLiD system, hold the promise of revolutionizing the study of whole-genome gene expression due to the number of data points that can be generated in a simple and cost-effective manner. We describe the development of a 5'-end transcriptome workflow for the SOLiD system and demonstrate the advantages in sensitivity and dynamic range offered by this tag-based application over traditional approaches for the study of whole-genome gene expression. 5'-end transcriptome analysis was used to study whole-genome gene expression within a colon cancer cell line, HT-29, treated with the DNA methyltransferase inhibitor 5-aza-2'-deoxycytidine (5Aza). More than 20 million 25-base 5'-end tags were obtained from untreated and 5Aza-treated cells and matched to sequences within the human genome. Seventy-three percent of the mapped unique tags were associated with RefSeq cDNA sequences, corresponding to approximately 14,000 different protein-coding genes in this single cell type. The level of expression of these genes ranged from 0.02 to 4,704 transcripts per cell. The sensitivity of a single sequencing run of the SOLiD platform was 100- to 1,000-fold greater than that observed from 5'-end SAGE data generated from the analysis of 70,000 tags obtained by Sanger sequencing. The high-resolution 5'-end gene expression profiling presented in this study will not only provide novel insight into the transcriptional machinery but should also serve as a basis for a better understanding of cell biology.
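
    The "transcripts per cell" figures above come from normalizing tag counts. A minimal sketch of one plausible normalization; the assumed total of 300,000 mRNA molecules per cell is a common rough figure, not necessarily the paper's exact procedure:

```python
MRNA_PER_CELL = 300_000  # assumed total mRNA per cell (rough literature figure)

def transcripts_per_cell(gene_tags, total_mapped_tags):
    # fraction of tags belonging to the gene, scaled to molecules per cell
    return gene_tags / total_mapped_tags * MRNA_PER_CELL

# e.g. a gene with 2,000 tags out of 20 million mapped tags
tpc = transcripts_per_cell(2_000, 20_000_000)
print(tpc)  # 30.0
```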

  3. Next Generation Parallelization Systems for Processing and Control of PDS Image Node Assets

    Science.gov (United States)

    Verma, R.

    2017-06-01

    We present next-generation parallelization tools to help the Planetary Data System (PDS) Imaging Node (IMG) better monitor, process, and control changes to nearly 650 million file assets spread across more than a dozen machines on which they are referenced or stored.

  4. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.

  5. FY1995 next generation highly parallel database / datamining server using 100 PCs and ATM switch; 1995 nendo tasudai no pasokon wo ATM ketsugoshita jisedai choheiretsu database mining server no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The objective of the research was first to build a highly parallel processing system using 100 personal computers and an ATM switch. The former is a commodity for computing, while the latter can be regarded as a commodity for future communication systems. The second objective was to implement a parallel relational database management system and a parallel data mining system over the 100-PC cluster system. The third was to run decision-support queries typical of data warehouses, to run association rule mining, and to prove the effectiveness of the proposed architecture as a next generation parallel database/data mining server. The performance/cost ratio of PCs is significantly improved compared with workstations and proprietary systems due to mass production. The cost of ATM switches is also decreasing considerably, since ATM is being widely accepted as a communication infrastructure. By combining 100 PCs as computing commodities and an ATM switch as a communication commodity, we built a large-scale parallel processing system inexpensively. Each node employs the Pentium Pro CPU, and the communication bandwidth between PCs is more than 120 Mbits/sec. A new parallel relational DBMS was designed and implemented. TPC-D, a standard benchmark for decision-support applications (100 GBytes), was executed. Our system attained much higher performance than current commercial systems, which are also much more expensive than ours. In addition, we developed a novel parallel data mining algorithm to extract association rules, implemented it on our system, and succeeded in attaining high performance. Thus it is verified that an ATM-connected PC cluster is very promising as a next generation platform for a large-scale database/data mining server. (NEDO)
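
    The association rule mining the server ran can be illustrated with a minimal support/confidence sketch; the project's actual parallel algorithm is not reproduced, and the transactions below are made up:

```python
# toy market-basket transactions
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

def support(itemset):
    # fraction of transactions containing every item in the itemset
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(lhs, rhs):
    # P(rhs | lhs): support of the combined itemset over support of lhs
    return support(lhs | rhs) / support(lhs)

print(support({"bread", "milk"}))       # 0.5
print(confidence({"bread"}, {"milk"}))  # 0.666...
```

    A rule such as {bread} -> {milk} is reported when both its support and its confidence exceed chosen thresholds; the parallel version distributes the counting over the cluster.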

  6. A Fast, High Quality, and Reproducible Parallel Lagged-Fibonacci Pseudorandom Number Generator

    Science.gov (United States)

    Mascagni, Michael; Cuccaro, Steven A.; Pryor, Daniel V.; Robinson, M. L.

    1995-07-01

    We study the suitability of the additive lagged-Fibonacci pseudorandom number generator for parallel computation. This generator has a relatively short period with respect to the size of its seed. However, the short period is more than made up for by the huge number of full-period cycles it contains. These different full-period cycles are called equivalence classes. We show how to enumerate the equivalence classes and how to compute seeds to select a given equivalence class. In addition, we present some theoretical measures of quality for this generator when used in parallel. Next, we conjecture on the size of these measures of quality for this generator. Extensive empirical evidence supports this conjecture. In addition, a probabilistic interpretation of these measures leads to another conjecture similarly supported by empirical evidence. Finally, we give an explicit parallelization suitable for a fully reproducible asynchronous MIMD implementation.
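
    The generator under study follows the recurrence x_n = (x_{n-k} + x_{n-l}) mod 2^m; the lags (5, 17) and 32-bit modulus below are common choices for illustration, not necessarily the authors' parameters:

```python
class ALFG:
    """Additive lagged-Fibonacci generator: x_n = (x_{n-5} + x_{n-17}) mod 2^32."""

    def __init__(self, seed_words, short=5, long_=17, m=2**32):
        # at least one seed word must be odd to land on a full-period cycle
        assert len(seed_words) == long_ and any(w % 2 for w in seed_words)
        self.state = list(seed_words)  # circular buffer of the last `long_` values
        self.k, self.l, self.m = short, long_, m
        self.i = 0

    def next(self):
        # state[i % l] currently holds x_{n-17}; overwrite it with x_n
        v = (self.state[(self.i - self.k) % self.l] +
             self.state[self.i % self.l]) % self.m
        self.state[self.i % self.l] = v
        self.i += 1
        return v

g = ALFG([2 * j + 1 for j in range(17)])
print([g.next() for _ in range(3)])  # [26, 30, 34]
```

    The seed (the 17 words of initial state) selects both the starting point and, as the paper discusses, the equivalence class; parallel streams are obtained by handing each process a seed from a different class.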

  8. Next-generation fiber lasers enabled by high-performance components

    Science.gov (United States)

    Kliner, D. A. V.; Victor, B.; Rivera, C.; Fanning, G.; Balsley, D.; Farrow, R. L.; Kennedy, K.; Hampton, S.; Hawke, R.; Soukup, E.; Reynolds, M.; Hodges, A.; Emery, J.; Brown, A.; Almonte, K.; Nelson, M.; Foley, B.; Dawson, D.; Hemenway, D. M.; Urbanek, W.; DeVito, M.; Bao, L.; Koponen, J.; Gross, K.

    2018-02-01

    Next-generation industrial fiber lasers enable challenging applications that cannot be addressed with legacy fiber lasers. Key features of next-generation fiber lasers include robust back-reflection protection, high power stability, wide power tunability, high-speed modulation and waveform generation, and facile field serviceability. These capabilities are enabled by high-performance components, particularly pump diodes and optical fibers, and by advanced fiber laser designs. We summarize the performance and reliability of nLIGHT diodes, fibers, and next-generation industrial fiber lasers at power levels of 500 W - 8 kW. We show back-reflection studies with up to 1 kW of back-reflected power, power-stability measurements in cw and modulated operation exhibiting sub-1% stability over a 5 - 100% power range, and high-speed modulation (100 kHz) and waveform generation with a bandwidth 20x higher than standard fiber lasers. We show results from representative applications, including cutting and welding of highly reflective metals (Cu and Al) for production of Li-ion battery modules and processing of carbon fiber reinforced polymers.

  9. Next generation initiation techniques

    Science.gov (United States)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of the next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' technique. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models.
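
    The Newtonian-relaxation ("nudging") technique mentioned above can be written down compactly: the model state is relaxed toward an observation with a relaxation coefficient G. A one-variable forward-Euler sketch, with all values illustrative:

```python
def nudge(x, f, obs, G, dt):
    # dx/dt = f(x) + G * (obs - x): model tendency plus relaxation term
    return x + dt * (f(x) + G * (obs - x))

x = 10.0  # initial model state, far from the observed value
for _ in range(100):
    # trivial model tendency f = 0, so only the nudging term acts
    x = nudge(x, lambda s: 0.0, obs=20.0, G=0.5, dt=0.1)
print(round(x, 3))  # close to the observed value 20.0
```

    In a real model, f would be the full dynamics and G would be chosen small enough that the relaxation term does not overwhelm them.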

  10. Demonstration of the First Real-Time End-to-End 40-Gb/s PAM-4 for Next-Generation Access Applications using 10-Gb/s Transmitter

    DEFF Research Database (Denmark)

    Wei, J. L.; Eiselt, Nicklas; Griesser, Helmut

    2016-01-01

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next-generation access applications using 10-Gb/s class transmitters only. Based on the measurement of a real-time 40-Gb/s PAM system, low-cost upstream and downstream link power budgets are estimated. Up...

  11. Targeted next-generation sequencing at copy-number breakpoints for personalized analysis of rearranged ends in solid tumors.

    Directory of Open Access Journals (Sweden)

    Hyun-Kyoung Kim

    BACKGROUND: The concept of utilizing rearranged ends for the development of personalized biomarkers has attracted much attention owing to its clinical applicability. Although targeted next-generation sequencing (NGS) for recurrent rearrangements has been successful in hematologic malignancies, its application to solid tumors is problematic due to the paucity of recurrent translocations. However, copy-number breakpoints (CNBs), which are abundant in solid tumors, can be utilized for identification of rearranged ends. METHOD: As a proof of concept, we performed targeted next-generation sequencing at copy-number breakpoints (TNGS-CNB) in nine colon cancer cases, including seven primary cancers and two cell lines, COLO205 and SW620. For deduction of CNBs, we developed a novel competitive single-nucleotide polymorphism (cSNP) microarray method entailing CNB-region refinement by competitor DNA. RESULT: Using TNGS-CNB, 19 specific rearrangements out of 91 CNBs (20.9%) were identified, and two polymerase chain reaction (PCR)-amplifiable rearrangements were obtained in six cases (66.7%). Significantly, TNGS-CNB, with its high positive identification rate (82.6%) of PCR-amplifiable rearrangements at candidate sites (19/23), just from filtering of aligned sequences, requires little effort for validation. CONCLUSION: Our results indicate that TNGS-CNB, with its utility for identification of rearrangements in solid tumors, can be successfully applied in the clinical laboratory for cancer-relapse and therapy-response monitoring.

  12. Targeted next-generation sequencing at copy-number breakpoints for personalized analysis of rearranged ends in solid tumors.

    Science.gov (United States)

    Kim, Hyun-Kyoung; Park, Won Cheol; Lee, Kwang Man; Hwang, Hai-Li; Park, Seong-Yeol; Sorn, Sungbin; Chandra, Vishal; Kim, Kwang Gi; Yoon, Woong-Bae; Bae, Joon Seol; Shin, Hyoung Doo; Shin, Jong-Yeon; Seoh, Ju-Young; Kim, Jong-Il; Hong, Kyeong-Man

    2014-01-01

    The concept of utilizing rearranged ends for the development of personalized biomarkers has attracted much attention owing to its clinical applicability. Although targeted next-generation sequencing (NGS) for recurrent rearrangements has been successful in hematologic malignancies, its application to solid tumors is problematic due to the paucity of recurrent translocations. However, copy-number breakpoints (CNBs), which are abundant in solid tumors, can be utilized for identification of rearranged ends. As a proof of concept, we performed targeted next-generation sequencing at copy-number breakpoints (TNGS-CNB) in nine colon cancer cases, including seven primary cancers and two cell lines, COLO205 and SW620. For deduction of CNBs, we developed a novel competitive single-nucleotide polymorphism (cSNP) microarray method entailing CNB-region refinement by competitor DNA. Using TNGS-CNB, 19 specific rearrangements out of 91 CNBs (20.9%) were identified, and two polymerase chain reaction (PCR)-amplifiable rearrangements were obtained in six cases (66.7%). Significantly, TNGS-CNB, with its high positive identification rate (82.6%) of PCR-amplifiable rearrangements at candidate sites (19/23), just from filtering of aligned sequences, requires little effort for validation. Our results indicate that TNGS-CNB, with its utility for identification of rearrangements in solid tumors, can be successfully applied in the clinical laboratory for cancer-relapse and therapy-response monitoring.
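
    The filtering step, keeping aligned reads that suggest a rearrangement junction near a copy-number breakpoint, can be illustrated schematically; the data structures and distance threshold below are hypothetical, not the paper's pipeline:

```python
from collections import namedtuple

# a paired-end alignment: where the two ends of one sequenced fragment map
Read = namedtuple("Read", "chrom1 pos1 chrom2 pos2")

def candidate_rearrangement(read, max_insert=10_000):
    if read.chrom1 != read.chrom2:
        return True  # ends on different chromosomes: translocation candidate
    # ends unusually far apart on the same chromosome: junction candidate
    return abs(read.pos2 - read.pos1) > max_insert

reads = [
    Read("chr8", 1_000, "chr8", 1_300),       # concordant pair, discarded
    Read("chr8", 1_000, "chr20", 5_000_000),  # inter-chromosomal candidate
    Read("chr8", 1_000, "chr8", 2_500_000),   # long-range candidate
]
hits = [r for r in reads if candidate_rearrangement(r)]
print(len(hits))  # 2
```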

  13. De Novo Ultrascale Atomistic Simulations On High-End Parallel Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, A; Kalia, R K; Nomura, K; Sharma, A; Vashishta, P; Shimojo, F; van Duin, A; Goddard, III, W A; Biswas, R; Srivastava, D; Yang, L H

    2006-09-04

    We present a de novo hierarchical simulation framework for first-principles based predictive simulations of materials and their validation on high-end parallel supercomputers and geographically distributed clusters. In this framework, high-end chemically reactive and non-reactive molecular dynamics (MD) simulations explore a wide solution space to discover microscopic mechanisms that govern macroscopic material properties, into which highly accurate quantum mechanical (QM) simulations are embedded to validate the discovered mechanisms and quantify the uncertainty of the solution. The framework includes an embedded divide-and-conquer (EDC) algorithmic framework for the design of linear-scaling simulation algorithms with minimal bandwidth complexity and tight error control. The EDC framework also enables adaptive hierarchical simulation with automated model transitioning assisted by graph-based event tracking. A tunable hierarchical cellular decomposition parallelization framework then maps the O(N) EDC algorithms onto Petaflops computers, while achieving performance tunability through a hierarchy of parameterized cell data/computation structures, as well as its implementation using hybrid Grid remote procedure call + message passing + threads programming. High-end computing platforms such as IBM BlueGene/L, SGI Altix 3000 and the NSF TeraGrid provide excellent test grounds for the framework. On these platforms, we have achieved unprecedented scales of quantum-mechanically accurate and well validated, chemically reactive atomistic simulations--1.06 billion-atom fast reactive force-field MD and 11.8 million-atom (1.04 trillion grid points) quantum-mechanical MD in the framework of the EDC density functional theory on adaptive multigrids--in addition to 134 billion-atom non-reactive space-time multiresolution MD, with parallel efficiency as high as 0.998 on 65,536 dual-processor BlueGene/L nodes. We have also achieved an automated execution of hierarchical QM
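
    The divide-and-conquer idea behind the O(N) algorithms, binning particles into cells so that neighbor searches visit only adjacent cells instead of all N particles, can be sketched in a simplified, serial, one-dimensional form:

```python
from collections import defaultdict

def build_cells(positions, cell_size):
    # bin each particle index into the cell containing its coordinate
    cells = defaultdict(list)
    for i, x in enumerate(positions):
        cells[int(x // cell_size)].append(i)
    return cells

def neighbors(i, positions, cells, cell_size, cutoff):
    # only the particle's own cell and the two adjacent cells are searched,
    # so the cost per particle is independent of the total particle count
    c = int(positions[i] // cell_size)
    out = []
    for cc in (c - 1, c, c + 1):
        for j in cells.get(cc, []):
            if j != i and abs(positions[j] - positions[i]) < cutoff:
                out.append(j)
    return out

pos = [0.1, 0.9, 1.2, 3.5]
cells = build_cells(pos, cell_size=1.0)
print(neighbors(2, pos, cells, cell_size=1.0, cutoff=1.0))  # [1]
```

    Production MD codes use the same decomposition in three dimensions, with each cell block assigned to a different processor.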

  14. Next generation platforms for high-throughput bio-dosimetry

    International Nuclear Information System (INIS)

    Repin, Mikhail; Turner, Helen C.; Garty, Guy; Brenner, David J.

    2014-01-01

    Here the general concept of the combined use of plates and tubes in racks compatible with the American National Standards Institute/the Society for Laboratory Automation and Screening microplate formats as the next generation platforms for increasing the throughput of bio-dosimetry assays was described. These platforms can be used at different stages of bio-dosimetry assays starting from blood collection into micro-tubes organised in standardised racks and ending with the cytogenetic analysis of samples in standardised multi-well and multichannel plates. Robotically friendly platforms can be used for different bio-dosimetry assays in minimally equipped laboratories and on cost-effective automated universal biotech systems. (authors)

  15. Technical presentation: Next Generation Oscilloscopes

    CERN Multimedia

    PH Department

    2011-01-01

    Rohde & Schwarz "Next Generation Oscilloscopes": introduction and presentation. Agenda: Wednesday 23 March, 09:30 to 11:30 (open end), Bldg. 13-2-005. Language: English. 09:30 Presentation "Next Generation Oscilloscopes" from Rohde & Schwarz, RTO/RTM in theory and practice (Gerard Walker); 10:15 Technical design details from R&D (Dr. Markus Freidhof); 10:45 Scope and probe roadmap, confidential (Guido Schulze); 11:00 Open discussion: feedback, first impressions, wishes, needs and requirements from CERN (all); 11:30 Expert talks, hands-on (all). Speakers: Dr. Markus Freidhof, Head of R&D Oscilloscopes, Rohde & Schwarz, Germany; Mr. Guido Schulze, ...

  16. Next-generation phylogenomics

    Directory of Open Access Journals (Sweden)

    Chan Cheong Xin

    2013-01-01

    Abstract: Thanks to advances in next-generation technologies, genome sequences are now being generated at breadth (e.g. across environments) and depth (e.g. thousands of closely related strains, individuals or samples) unimaginable only a few years ago. Phylogenomics, the study of evolutionary relationships based on comparative analysis of genome-scale data, has so far been developed as industrial-scale molecular phylogenetics, proceeding in the two classical steps: multiple alignment of homologous sequences, followed by inference of a tree (or multiple trees). However, the algorithms typically employed for these steps scale poorly with the number of sequences, such that for an increasing number of problems, high-quality phylogenomic analysis is (or soon will be) computationally infeasible. Moreover, next-generation data are often incomplete and error-prone, and analysis may be further complicated by genome rearrangement, gene fusion and deletion, lateral genetic transfer, and transcript variation. Here we argue that next-generation data require next-generation phylogenomics, including so-called alignment-free approaches. Reviewers: This article was reviewed by Alexander Panchin (nominated by Mikhail Gelfand), Eugene Koonin and Peter Gogarten. For the full reviews, please go to the Reviewers' comments section.
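
    One common alignment-free approach (a generic illustration, not a specific method from the article) compares k-mer count vectors instead of aligned columns; for example, cosine similarity between 3-mer profiles:

```python
from collections import Counter
from math import sqrt

def kmers(seq, k=3):
    # count every overlapping k-length substring of the sequence
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine(a, b):
    # cosine similarity between two sparse count vectors
    dot = sum(a[w] * b[w] for w in a)
    return dot / (sqrt(sum(v * v for v in a.values())) *
                  sqrt(sum(v * v for v in b.values())))

s1 = "ACGTACGTACGT"
s2 = "ACGTACGAACGT"  # one substitution relative to s1
print(round(cosine(kmers(s1), kmers(s2)), 3))  # 0.878
```

    Because no alignment is computed, the cost scales with sequence length rather than with the number of sequences squared, which is the appeal for next-generation data volumes.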

  17. Converged Wireless Networking and Optimization for Next Generation Services

    Directory of Open Access Journals (Sweden)

    J. Rodriguez

    2010-01-01

    The Next Generation Network (NGN) vision is tending towards the convergence of internet and mobile services, providing the impetus for new market opportunities that combine the appealing services of the internet with the roaming capability of mobile networks. However, this convergence does not go far enough, and with the emergence of new coexistence scenarios, there is a clear need to evolve the current architecture to provide cost-effective end-to-end communication. The LOOP project, a EUREKA-CELTIC driven initiative, is one piece in the jigsaw, helping European industry to sustain a leading role in telecommunications and in the manufacturing of high-value products and machinery by delivering pioneering converged wireless networking solutions that can be successfully demonstrated. This paper provides an overview of the LOOP project and the key achievements that have been funneled into first prototypes for showcasing next generation services for operators and process manufacturers.

  18. Next Generation Solar Collectors for CSP

    Energy Technology Data Exchange (ETDEWEB)

    Molnar, Attila [3M Company, St. Paul, MN (United States); Charles, Ruth [3M Company, St. Paul, MN (United States)

    2014-07-31

    The intent of the “Next Generation Solar Collectors for CSP” program was to develop key technology elements for collectors in Phase 1 (Budget Period 1), to design these elements in Phase 2 (Budget Period 2), and to deploy and test the final collector in Phase 3 (Budget Period 3). 3M and DOE mutually agreed to terminate the program at the end of Budget Period 1, primarily due to timeline issues. However, significant advancements were achieved in developing a next generation reflective material and panel that has the potential to significantly improve the efficiency of CSP systems.

  19. A Survey on 5G: The Next Generation of Mobile Communication

    OpenAIRE

    Panwar, Nisha; Sharma, Shantanu; Singh, Awadhesh Kumar

    2015-01-01

    The rapidly increasing number of mobile devices, voluminous data, and higher data rates are prompting a rethink of the current generation of cellular mobile communication. The next, or fifth, generation (5G) cellular networks are expected to meet high-end requirements. 5G networks are broadly characterized by three unique features: ubiquitous connectivity, extremely low latency, and very high-speed data transfer. 5G networks would provide novel architectures and technologies beyond state...

  20. Examination of concept of next generation computer. Progress report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Hasegawa, Yukihiro; Hirayama, Toshio

    2000-12-01

    The Center for Promotion of Computational Science and Engineering has conducted R&D work on parallel processing technology and in 1999 began examining the concept of a next generation computer. This report describes behavior analyses of quantum calculation codes. It also describes considerations arising from the analyses and the results of examining methods to reduce cache misses. Furthermore, it describes a performance simulator that is being developed to quantitatively examine the concept of the next generation computer. (author)

  1. Designing Next Generation Massively Multithreaded Architectures for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tumeo, Antonino; Secchi, Simone; Villa, Oreste

    2012-08-31

    Irregular applications, such as data mining or graph-based computations, show unpredictable memory/network access patterns and control structures. Massively multi-threaded architectures with large node count, like the Cray XMT, have been shown to address their requirements better than commodity clusters. In this paper we present the approaches that we are currently pursuing to design future generations of these architectures. First, we introduce the Cray XMT and compare it to other multithreaded architectures. We then propose an evolution of the architecture, integrating multiple cores per node and next generation network interconnect. We advocate the use of hardware support for remote memory reference aggregation to optimize network utilization. For this evaluation we developed a highly parallel, custom simulation infrastructure for multi-threaded systems. Our simulator executes unmodified XMT binaries with very large datasets, capturing effects due to contention and hot-spotting, while predicting execution times with greater than 90% accuracy. We also discuss the FPGA prototyping approach that we are employing to study efficient support for irregular applications in next generation manycore processors.

  2. Next generation of optical front-ends for numerical services - 15387

    International Nuclear Information System (INIS)

    Fullenbaum, M.; Durieux, A.; Dubroca, G.; Fuss, P.

    2015-01-01

    Visual inspection and surveillance in environments exhibiting high levels of gamma and neutron radiation are nowadays carried out with analog tubes. Images are thus acquired with analog devices, the vast majority of which rely on 1-inch and 2/3-inch imaging formats and deliver native analog images. There is a growing demand for real-time image processing and distribution through Ethernet services for quicker and seamless process integration throughout many sectors. This calls for the introduction of solid-state sensors (CCD, CMOS) to generate native digital images as the first step and building block towards end-to-end digital processing (ICT), assuming these sensors can be hardened or protected for use in the nuclear industry. On the one hand, these sensor sizes will be significantly reduced (by a factor of 2-3) versus those of the tubes; on the other hand, the high pixel count of solid-state technology offers the opportunity of increased spatial resolution, enabling new or better services and richer information for decision-making purposes. In order to reap the benefits of such sensors, new optical front-ends will have to be designed. Over and beyond merely matching the reduced sensor size to the size of the scenes at stake, the optical performance of these front-ends will also bear on the whole optical chain. As an example, detection and tracking needs will differ from a performance standpoint, and the overall performance will have to be balanced between the optical front-end, the image format, the image-processing software capability, processing speed, and so on. In this paper we review and explain the gaps to be closed in order to switch to a fully digital optical chain, focusing on the optical front-end and the associated cost trade-offs. Finally, we conclude by clearly stating the best

  3. Massively parallel E-beam inspection: enabling next-generation patterned defect inspection for wafer and mask manufacturing

    Science.gov (United States)

    Malloy, Matt; Thiel, Brad; Bunday, Benjamin D.; Wurm, Stefan; Mukhtar, Maseeh; Quoi, Kathy; Kemen, Thomas; Zeidler, Dirk; Eberle, Anna Lena; Garbowski, Tomasz; Dellemann, Gregor; Peters, Jan Hendrik

    2015-03-01

    SEMATECH aims to identify and enable disruptive technologies to meet the ever-increasing demands of semiconductor high volume manufacturing (HVM). As such, a program was initiated in 2012 focused on high-speed e-beam defect inspection as a complement, and eventual successor, to bright field optical patterned defect inspection [1]. The primary goal is to enable a new technology to overcome the key gaps that are limiting modern day inspection in the fab; primarily, throughput and sensitivity to detect ultra-small critical defects. The program specifically targets revolutionary solutions based on massively parallel e-beam technologies, as opposed to incremental improvements to existing e-beam and optical inspection platforms. Wafer inspection is the primary target, but attention is also being paid to next generation mask inspection. During the first phase of the multi-year program multiple technologies were reviewed, a down-selection was made to the top candidates, and evaluations began on proof of concept systems. A champion technology has been selected and as of late 2014 the program has begun to move into the core technology maturation phase in order to enable eventual commercialization of an HVM system. Performance data from early proof of concept systems will be shown along with roadmaps to achieving HVM performance. SEMATECH's vision for moving from early-stage development to commercialization will be shown, including plans for development with industry leading technology providers.

  4. ng: What next-generation languages can teach us about HENP frameworks in the manycore era

    International Nuclear Information System (INIS)

    Binet, Sébastien

    2011-01-01

    Current High Energy and Nuclear Physics (HENP) frameworks were written before multicore systems became widely deployed. A 'single-thread' execution model naturally emerged from that environment; however, this no longer fits the processing model at the dawn of the manycore era. Although previous work focused on minimizing the changes to be applied to the LHC frameworks (because of the data taking phase) while still trying to reap the benefits of the parallel-enhanced CPU architectures, this paper explores what new languages could bring to the design of the next-generation frameworks. Parallel programming is still in an intensive phase of R&D and no silver bullet exists despite the 30+ years of literature on the subject. Yet, several parallel programming styles have emerged: actors, message passing, communicating sequential processes, task-based programming, data flow programming, ... to name a few. We present the work of prototyping a next-generation framework in new and expressive languages (Python and Go) to investigate how code clarity and robustness are affected and what the downsides are of using languages younger than FORTRAN/C/C++.

  5. Start-to-end simulation of x-ray radiation of a next generation light source using the real number of electrons

    Directory of Open Access Journals (Sweden)

    J. Qiang

    2014-03-01

    Full Text Available In this paper we report on start-to-end simulation of a next generation light source based on a high repetition rate free electron laser (FEL) driven by a CW superconducting linac. The simulation integrated the entire system in a seamless start-to-end model, including birth of photoelectrons, transport of the electron beam through 600 m of the accelerator beam delivery system, and generation of coherent x-ray radiation in a two-stage self-seeding undulator beam line. The entire simulation used the real number of electrons (∼2 billion electrons/bunch) to capture the details of the physical shot noise without resorting to artificial filtering to suppress numerical noise. The simulation results shed light on several issues including the importance of space-charge effects near the laser heater and the reliability of x-ray radiation power predictions when using a smaller number of simulation particles. The results show that the microbunching instability in the linac can be controlled with 15 keV uncorrelated energy spread induced by a laser heater and demonstrate that high-brightness, high-flux 1 nm x-ray radiation (∼10^{12} photons/pulse) with full spatial and temporal coherence is achievable.

  6. Next Generation Microchannel Heat Exchangers

    CERN Document Server

    Ohadi, Michael; Dessiatoun, Serguei; Cetegen, Edvin

    2013-01-01

    In Next Generation Microchannel Heat Exchangers, the authors focus on a new generation of highly efficient heat exchangers and present novel data and technical expertise not available in the open literature. Next-generation microchannels offer record-high heat transfer coefficients with pressure drops much lower than those of conventional microchannel heat exchangers. These inherent features promise fast penetration into many new markets, including high-heat-flux cooling of electronics, waste heat recovery and energy efficiency enhancement applications, alternative energy systems, as well as applications in mass exchangers and chemical reactor systems. The combination of up-to-the-minute research findings and technical know-how makes this book very timely as the search intensifies for high-performance heat and mass exchangers that can cut costs in materials consumption.

  7. Towards Next Generation BI Systems

    DEFF Research Database (Denmark)

    Varga, Jovan; Romero, Oscar; Pedersen, Torben Bach

    2014-01-01

    Next generation Business Intelligence (BI) systems require integration of heterogeneous data sources and a strong user-centric orientation. Both needs entail machine-processable metadata to enable automation and allow end users to gain access to relevant data for their decision making processes. ... This framework is based on the findings of a survey of current user-centric approaches mainly focusing on query recommendation assistance. Finally, we discuss the benefits of the framework and present the plans for future work. ...

  8. High-Performance Computing Paradigm and Infrastructure

    CERN Document Server

    Yang, Laurence T

    2006-01-01

    With hyperthreading in Intel processors, hypertransport links in next generation AMD processors, multi-core silicon in today's high-end microprocessors from IBM and emerging grid computing, parallel and distributed computers have moved into the mainstream

  9. NASA's Next Generation Space Geodesy Network

    Science.gov (United States)

    Desai, S. D.; Gross, R. S.; Hilliard, L.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry, J. F.; Merkowitz, S. M.; Murphy, D.; Noll, C. E.; hide

    2012-01-01

    NASA's Space Geodesy Project (SGP) is developing a prototype core site for a next generation Space Geodetic Network (SGN). Each of the sites in this planned network co-locates current state-of-the-art stations from all four space geodetic observing systems, GNSS, SLR, VLBI, and DORIS, with the goal of achieving modern requirements for the International Terrestrial Reference Frame (ITRF). In particular, the driving ITRF requirements for this network are 1.0 mm in accuracy and 0.1 mm/yr in stability, a factor of 10-20 beyond current capabilities. Development of the prototype core site, located at NASA's Geophysical and Astronomical Observatory at the Goddard Space Flight Center, started in 2011 and will be completed by the end of 2013. In January 2012, two operational GNSS stations, GODS and GOON, were established at the prototype site within 100 m of each other. Both stations are being proposed for inclusion into the IGS network. In addition, work is underway for the inclusion of next generation SLR and VLBI stations along with a modern DORIS station. An automated survey system is being developed to measure inter-technique vector ties, and network design studies are being performed to define the appropriate number and distribution of these next generation space geodetic core sites that are required to achieve the driving ITRF requirements. We present the status of this prototype next generation space geodetic core site, results from the analysis of data from the established geodetic stations, and results from the ongoing network design studies.

  10. High-Throughput Next-Generation Sequencing of Polioviruses

    Science.gov (United States)

    Montmayeur, Anna M.; Schmidt, Alexander; Zhao, Kun; Magaña, Laura; Iber, Jane; Castro, Christina J.; Chen, Qi; Henderson, Elizabeth; Ramos, Edward; Shaw, Jing; Tatusov, Roman L.; Dybdahl-Sissoko, Naomi; Endegue-Zanga, Marie Claire; Adeniji, Johnson A.; Oberste, M. Steven; Burns, Cara C.

    2016-01-01

    The poliovirus (PV) is currently targeted for worldwide eradication and containment. Sanger-based sequencing of the viral protein 1 (VP1) capsid region is currently the standard method for PV surveillance. However, the whole-genome sequence is sometimes needed for higher resolution global surveillance. In this study, we optimized whole-genome sequencing protocols for poliovirus isolates and FTA cards using next-generation sequencing (NGS), aiming for high sequence coverage, efficiency, and throughput. We found that DNase treatment of poliovirus RNA followed by random reverse transcription (RT), amplification, and the use of the Nextera XT DNA library preparation kit produced significantly better results than other preparations. The average viral reads per total reads, a measurement of efficiency, was as high as 84.2% ± 15.6%. PV genomes covering >99 to 100% of the reference length were obtained and validated with Sanger sequencing. A total of 52 PV genomes were generated, multiplexing as many as 64 samples in a single Illumina MiSeq run. This high-throughput, sequence-independent NGS approach facilitated the detection of a diverse range of PVs, especially for those in vaccine-derived polioviruses (VDPV), circulating VDPV, or immunodeficiency-related VDPV. In contrast to results from previous studies on other viruses, our results showed that filtration and nuclease treatment did not discernibly increase the sequencing efficiency of PV isolates. However, DNase treatment after nucleic acid extraction to remove host DNA significantly improved the sequencing results. This NGS method has been successfully implemented to generate PV genomes for molecular epidemiology of the most recent PV isolates. Additionally, the ability to obtain full PV genomes from FTA cards will aid in facilitating global poliovirus surveillance. PMID:27927929

  11. The apeNEXT project

    International Nuclear Information System (INIS)

    Belletti, F.; Bodin, F.; Boucaud, Ph.; Cabibbo, N.; Lonardo, A.; De Luca, S.; Lukyanov, M.; Micheli, J.; Morin, L.; Pene, O.; Pleiter, D.; Rapuano, F.; Rossetti, D.; Schifano, S.F.; Simma, H.; Tripiccione, R.; Vicini, P.

    2006-01-01

    Numerical simulations in theoretical high-energy physics (Lattice QCD) require huge computing resources. Several generations of massively parallel computers optimised for these applications have been developed within the APE (array processor experiment) project. Large prototype systems of the latest generation, apeNEXT, are currently being assembled and tested. This contribution explains how the apeNEXT architecture is optimised for Lattice QCD, provides an overview of the hardware and software of apeNEXT, and describes its new features, like the SPMD programming model and the C compiler

  12. Next generation toroidal devices

    International Nuclear Information System (INIS)

    Yoshikawa, Shoichi

    1998-10-01

    A general survey of the possible approach for the next generation toroidal devices was made. Either surprisingly or obviously (depending on one's view), the technical constraints along with the scientific considerations lead to a fairly limited set of systems for the most favorable approach for the next generation devices. Specifically if the magnetic field strength of 5 T or above is to be created by superconducting coils, it imposes minimum in the aspect ratio for the tokamak which is slightly higher than contemplated now for ITER design. The similar technical constraints make the minimum linear size of a stellarator large. Scientifically, it is indicated that a tokamak of 1.5 times in the linear dimension should be able to produce economically, especially if a hybrid reactor is allowed. For the next stellarator, it is strongly suggested that some kind of helical axis is necessary both for the (almost) absolute confinement of high energy particles and high stability and equilibrium beta limits. The author still favors a heliac most. Although it may not have been clearly stated in the main text, the stability afforded by the shearless layer may be exploited fully in a stellarator. (author)

  13. Next generation toroidal devices

    Energy Technology Data Exchange (ETDEWEB)

    Yoshikawa, Shoichi [Princeton Plasma Physics Lab., Princeton Univ., NJ (United States)

    1998-10-01

    A general survey of the possible approach for the next generation toroidal devices was made. Either surprisingly or obviously (depending on one's view), the technical constraints along with the scientific considerations lead to a fairly limited set of systems for the most favorable approach for the next generation devices. Specifically if the magnetic field strength of 5 T or above is to be created by superconducting coils, it imposes minimum in the aspect ratio for the tokamak which is slightly higher than contemplated now for ITER design. The similar technical constraints make the minimum linear size of a stellarator large. Scientifically, it is indicated that a tokamak of 1.5 times in the linear dimension should be able to produce economically, especially if a hybrid reactor is allowed. For the next stellarator, it is strongly suggested that some kind of helical axis is necessary both for the (almost) absolute confinement of high energy particles and high stability and equilibrium beta limits. The author still favors a heliac most. Although it may not have been clearly stated in the main text, the stability afforded by the shearless layer may be exploited fully in a stellarator. (author)

  14. Next generation of accelerators

    International Nuclear Information System (INIS)

    Richter, B.

    1979-01-01

    Existing high-energy accelerators are reviewed, along with those under construction or being designed. Finally, some of the physics issues which go into setting machine parameters, and some of the features of the design of next generation electron and proton machines are discussed

  15. High-performance ferroelectric and magnetoresistive materials for next-generation thermal detector arrays

    Science.gov (United States)

    Todd, Michael A.; Donohue, Paul P.; Watton, Rex; Williams, Dennis J.; Anthony, Carl J.; Blamire, Mark G.

    2002-12-01

    This paper discusses the potential thermal imaging performance achievable from thermal detector arrays and concludes that the current generation of thin-film ferroelectric and resistance bolometer based detector arrays are limited by the detector materials used. It is proposed that the next generation of large uncooled focal plane arrays will need to look towards higher performance detector materials - particularly if they aim to approach the fundamental performance limits and compete with cooled photon detector arrays. Two examples of bolometer thin-film materials are described that achieve high performance by operating around phase transitions. The material Lead Scandium Tantalate (PST) has a paraelectric-to-ferroelectric phase transition around room temperature and is used with an applied field in the dielectric bolometer mode for thermal imaging. PST films grown by sputtering and liquid-source CVD have shown merit figures for thermal imaging a factor of 2 to 3 higher than PZT-based pyroelectric thin films. The material Lanthanum Calcium Manganite (LCMO) has a paramagnetic to ferromagnetic phase transition around -20 °C. This paper describes recent measurements of TCR and 1/f noise in pulsed laser-deposited LCMO films on Neodymium Gallate substrates. These results show that LCMO not only has high TCRs - up to 30%/K - but also low 1/f excess noise, with bolometer merit figures at least an order of magnitude higher than Vanadium Oxide, making it ideal for the next generation of microbolometer arrays. These high performance properties come at the expense of processing complexities, and novel device designs will need to be introduced to realize the potential of these materials in the next generation of thermal detectors.

  16. High Throughput Line-of-Sight MIMO Systems for Next Generation Backhaul Applications

    Science.gov (United States)

    Song, Xiaohang; Cvetkovski, Darko; Hälsig, Tim; Rave, Wolfgang; Fettweis, Gerhard; Grass, Eckhard; Lankl, Berthold

    2017-09-01

    The evolution to ultra-dense next generation networks requires a massive increase in throughput and deployment flexibility. Therefore, novel wireless backhaul solutions that can support these demands are needed. In this work we present an approach for a millimeter wave line-of-sight MIMO backhaul design, targeting transmission rates in the order of 100 Gbit/s. We provide theoretical foundations for the concept showcasing its potential, which are confirmed through channel measurements. Furthermore, we provide insights into the system design with respect to antenna array setup, baseband processing, synchronization, and channel equalization. Implementation in a 60 GHz demonstrator setup proves the feasibility of the system concept for high throughput backhauling in next generation networks.

  17. RES-E-NEXT: Next Generation of RES-E Policy Instruments

    Energy Technology Data Exchange (ETDEWEB)

    Miller, M.; Bird, L.; Cochran, J.; Milligan, M.; Bazilian, M. [National Renewable Energy Laboratory, Golden, CO (United States); Denny, E.; Dillon, J.; Bialek, J.; O’Malley, M. [Ecar Limited (Ireland); Neuhoff, K. [DIW Berlin (Germany)

    2013-07-04

    The RES-E-NEXT study identifies policies that are required for the next phase of renewable energy support. The study analyses policy options that secure high shares of renewable electricity generation and adequate grid infrastructure, enhance flexibility and ensure an appropriate market design. Measures have limited costs or even save money, and policies can be gradually implemented.

  18. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    Science.gov (United States)

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.

  19. MiSeq: A Next Generation Sequencing Platform for Genomic Analysis.

    Science.gov (United States)

    Ravi, Rupesh Kanchi; Walton, Kendra; Khosroheidari, Mahdieh

    2018-01-01

    MiSeq, Illumina's integrated next generation sequencing instrument, uses reversible-terminator sequencing-by-synthesis technology to provide end-to-end sequencing solutions. The MiSeq instrument is one of the smallest benchtop sequencers that can perform onboard cluster generation, amplification, genomic DNA sequencing, and data analysis, including base calling, alignment and variant calling, in a single run. It performs both single- and paired-end runs with adjustable read lengths from 1 × 36 base pairs to 2 × 300 base pairs. A single run can produce output data of up to 15 Gb in as little as 4 h of runtime and can output up to 25 M single reads and 50 M paired-end reads. Thus, MiSeq provides an ideal platform for rapid turnaround time. MiSeq is also a cost-effective tool for various analyses focused on targeted gene sequencing (amplicon sequencing and target enrichment), metagenomics, and gene expression studies. For these reasons, MiSeq has become one of the most widely used next generation sequencing platforms. Here, we provide a protocol to prepare libraries for sequencing using the MiSeq instrument and basic guidelines for analysis of output data from the MiSeq sequencing run.
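The throughput figures quoted above are mutually consistent, as a quick arithmetic check shows (illustrative only; the variable names are invented here):

```python
# Consistency check of the quoted MiSeq figures:
# up to 25 M clusters, each yielding a 2 x 300 bp paired-end read pair.
clusters = 25_000_000
read_length = 300                    # bases per read at maximum read length
paired_reads = clusters * 2          # -> 50 M paired-end reads, as quoted
bases = paired_reads * read_length   # total sequenced bases per run
print(paired_reads)                  # -> 50000000
print(bases / 1e9)                   # -> 15.0, i.e. the quoted 15 Gb
```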

  20. The Secret Life of Exosomes: What Bees Can Teach Us About Next-Generation Therapeutics.

    Science.gov (United States)

    Marbán, Eduardo

    2018-01-16

    Mechanistic exploration has pinpointed nanosized extracellular vesicles, known as exosomes, as key mediators of the benefits of cell therapy. Exosomes appear to recapitulate the benefits of cells and more. As durable azoic entities, exosomes have numerous practical and conceptual advantages over cells. Will cells end up just being used to manufacture exosomes, or will they find lasting value as primary therapeutic agents? Here, a venerable natural process-the generation of honey-serves as an instructive parable. Flowers make nectar, which bees collect and process into honey. Cells make conditioned medium, which laboratory workers collect and process into exosomes. Unlike flowers, honey is durable, compact, and nutritious, but these facts do not negate the value of flowers themselves. The parallels suggest new ways of thinking about next-generation therapeutics. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  1. AugerNext: innovative research studies for the next generation ground-based ultra-high energy cosmic ray experiment

    Directory of Open Access Journals (Sweden)

    Haungs Andreas

    2013-06-01

    Full Text Available The findings so far of the Pierre Auger Observatory and also of the Telescope Array define the requirements for a possible next generation experiment: it needs to be considerably increased in size, it needs a better sensitivity to composition, and it should cover the full sky. AugerNext aims to perform innovative research studies in order to prepare a proposal fulfilling these demands. Such R&D studies are primarily focused on the following areas: (i) consolidation of the detection of cosmic rays using MHz radio antennas; (ii) proof-of-principle of cosmic-ray microwave detection; (iii) test of the large-scale application of a new generation of photo-sensors; (iv) generalization of data communication techniques; (v) development of new ways of muon detection with surface arrays. These AugerNext studies on new innovative detection methods for a next generation cosmic-ray experiment are performed at the Pierre Auger Observatory. The AugerNext consortium presently consists of fourteen partner institutions from nine European countries supported by a network of European funding agencies and it is a principal element of the ASPERA/ApPEC strategic roadmaps.

  2. Is Monte Carlo embarrassingly parallel?

    Energy Technology Data Exchange (ETDEWEB)

    Hoogenboom, J. E. [Delft Univ. of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Delft Nuclear Consultancy, IJsselzoom 2, 2902 LB Capelle aan den IJssel (Netherlands)

    2012-07-01

    Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendez-vous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results, but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Also other time losses in the parallel calculation are identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
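The per-cycle rendezvous described above can be illustrated with a toy, single-process Python sketch (hypothetical code, not from the paper; `run_cycles`, the 0-or-2 branching rule, and all parameters are invented for illustration):

```python
import random

def run_cycles(n_particles=1000, n_cycles=20, k_true=1.0, seed=42):
    """Toy power-iteration Monte Carlo criticality calculation.

    Each cycle ends with a 'rendezvous': the full fission bank is
    collected, k-effective is estimated from it, and the population is
    renormalized for the next cycle. In a distributed MPI code this
    collection is the global synchronization point the paper identifies
    as a bottleneck."""
    rng = random.Random(seed)
    source = [rng.random() for _ in range(n_particles)]  # fission site positions
    k_estimates = []
    for _ in range(n_cycles):
        fission_bank = []
        for site in source:
            # Each history yields 2 fission sites with probability
            # k_true / 2, so the mean multiplicity per history is k_true.
            if rng.random() < k_true / 2:
                fission_bank.append(site + rng.uniform(-0.1, 0.1))
                fission_bank.append(site + rng.uniform(-0.1, 0.1))
        # --- rendezvous: gather the fission bank, estimate k-effective ---
        k_estimates.append(len(fission_bank) / len(source))
        # --- population control: resample the bank back to n_particles ---
        source = [rng.choice(fission_bank) for _ in range(n_particles)]
    return k_estimates

ks = run_cycles()
print(sum(ks) / len(ks))  # average k estimate; should be near k_true = 1.0
```

In an MPI implementation both commented steps become collective operations (e.g. a gather of the fission bank and a reduction for k-effective), which is why they serialize the processors once per cycle.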

  3. Is Monte Carlo embarrassingly parallel?

    International Nuclear Information System (INIS)

    Hoogenboom, J. E.

    2012-01-01

    Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendez-vous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results, but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Also other time losses in the parallel calculation are identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)

  4. Test generation for digital circuits using parallel processing

    Science.gov (United States)

    Hartmann, Carlos R.; Ali, Akhtar-Uz-Zaman M.

    1990-12-01

    Test generation for digital logic circuits is an NP-hard problem. Recently, the availability of low-cost, high-performance parallel machines has spurred interest in developing fast parallel algorithms for computer-aided design and test. This report describes a method of applying a 15-valued logic system for digital logic circuit test vector generation in a parallel programming environment. A concept called fault site testing allows for test generation, in parallel, that targets more than one fault at a given location. The multi-valued logic system allows results obtained by distinct processors and/or processes to be merged by means of simple set intersections. A machine-independent description is given for the proposed algorithm.
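As a sketch of how a multi-valued logic enables merging by set intersection, one can model each of the 15 values as a non-empty subset of the four good/faulty value pairs (this encoding is an assumption of the sketch, in the style of D-calculus extensions; the merge-by-intersection step itself is the mechanism the report describes):

```python
# Each signal value is a non-empty subset of the four (good, faulty)
# value pairs with good, faulty in {0, 1}; the 15 non-empty subsets
# form a 15-valued logic. Hypothetical encoding for illustration.
ALL = frozenset({(0, 0), (0, 1), (1, 0), (1, 1)})

def merge(v1, v2):
    """Merge constraints computed by two processors on the same line.

    The assignments are compatible iff the intersection is non-empty;
    an empty intersection signals a conflict between the partial tests."""
    v = v1 & v2
    if not v:
        raise ValueError("conflict: no consistent assignment for this line")
    return v

# Processor A has determined the line is 1 in the good circuit:
a = frozenset({(1, 0), (1, 1)})
# Processor B allows D (good=1, faulty=0) or a fault-free 0/0 or 1/1:
b = frozenset({(1, 0), (0, 0), (1, 1)})
print(sorted(merge(a, b)))  # -> [(1, 0), (1, 1)]
```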

  5. NREL Next Generation Drivetrain: Mechanical Design and Test Plan (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Keller, J.; Halse, C.

    2014-05-01

    The Department of Energy and industry partners are sponsoring a $3m project for design and testing of a 'Next Generation' wind turbine drivetrain at the National Renewable Energy Laboratory (NREL). This poster focuses on innovative aspects of the gearbox design, completed as part of an end-to-end systems engineering approach incorporating innovations that increase drivetrain reliability, efficiency, torque density and minimize capital cost.

  6. Next generation HOM-damping

    Science.gov (United States)

    Marhauser, Frank

    2017-06-01

    can push the envelope towards quasi HOM-free operation suited for next generation storage and collider rings. Geometrical end-cell shape alterations for the five-cell cavity with already efficient mode damping are discussed as a possibility to further lower specific high impedance modes. The findings are eventually put into relation with demanding impedance instability thresholds in future collider rings.

  7. Parallel Algorithms for the Exascale Era

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-10-19

    New parallel algorithms are needed to reach the Exascale level of parallelism with millions of cores. We look at some of the research developed by students in projects at LANL. The research blends ideas from the early days of computing while weaving in the fresh approach brought by students new to the field of high performance computing. We look at reproducibility of global sums and why it is important to parallel computing. Next we look at how the concept of hashing has led to the development of more scalable algorithms suitable for next-generation parallel computers. Nearly all of this work has been done by undergraduates and published in leading scientific journals.
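The reproducibility problem mentioned above comes from the non-associativity of floating-point addition: a parallel reduction delivers terms in a machine-dependent order, so the rounded result varies from run to run. A minimal sketch of one deterministic remedy, exact fixed-point accumulation (an illustration only; `reproducible_sum` and its `scale` parameter are invented here, and production reproducible-sum algorithms are considerably more careful about range and rounding):

```python
import random

def reproducible_sum(values, scale=2**40):
    """Order-independent sum: round each value to a fixed-point integer,
    add exactly (Python ints are arbitrary precision), convert back.
    Because integer addition is associative, any summation order gives
    a bit-identical result."""
    return sum(round(v * scale) for v in values) / scale

random.seed(1)
vals = [random.uniform(-1, 1) * 10**random.randint(-8, 8) for _ in range(10000)]
shuffled = vals[:]
random.shuffle(shuffled)

# Naive float sums generally change when the summation order changes...
print(sum(vals) == sum(shuffled))
# ...but the fixed-point sum is bit-identical in any order.
print(reproducible_sum(vals) == reproducible_sum(shuffled))  # -> True
```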

  8. Next Generation Molecular Histology Using Highly Multiplexed Ion Beam Imaging (MIBI) of Breast Cancer Tissue Specimens for Enhanced Clinical Guidance

    Science.gov (United States)

    2016-07-01

    Award number: W81XWH-14-1-0192. Title: Next-Generation Molecular Histology Using Highly Multiplexed Ion Beam Imaging (MIBI) of Breast Cancer Tissue Specimens for Enhanced Clinical Guidance.

  9. A National Demonstration Project Building the Next Generation

    International Nuclear Information System (INIS)

    Keuter, Dan; Hughey, Kenneth; Melancon, Steve; Quinn, Edward 'Ted'

    2002-01-01

    energy situation, and in a way which supports U.S. environmental objectives. A key element of this effort will be the reestablishment and maintenance of an industrial base, which can be accessed in response to changing national energy needs. Right now, in a cooperative program through the U.S. Department of Energy, U.S. and Russian dollars are paying for over 700 Russian nuclear scientists and engineers to complete design work on the Gas Turbine - Modular Helium Reactor (GT-MHR), a next generation nuclear power plant that is melt-down proof, substantially more efficient than the existing generation of reactors, creates substantially less waste and is extremely proliferation resistant. To date, the Russians are providing world class engineering design work, resulting in the program being on track to begin construction of this first-of-a-kind reactor by the end of 2005. Just as important, in parallel with this effort, a number of key U.S. utilities are speaking with Congress and the Administration to 'piggy back' off this U.S./Russian effort to promote a joint private-public partnership to construct in parallel a similar first-of-a-kind reactor in the U.S. (authors)

  10. Rapid identification and recovery of ENU-induced mutations with next-generation sequencing and Paired-End Low-Error analysis.

    Science.gov (United States)

    Pan, Luyuan; Shah, Arish N; Phelps, Ian G; Doherty, Dan; Johnson, Eric A; Moens, Cecilia B

    2015-02-14

    Targeting Induced Local Lesions IN Genomes (TILLING) is a reverse genetics approach to directly identify point mutations in specific genes of interest in genomic DNA from a large chemically mutagenized population. Classical TILLING processes, based on enzymatic detection of mutations in heteroduplex PCR amplicons, are slow and labor intensive. Here we describe a new TILLING strategy in zebrafish using direct next generation sequencing (NGS) of 250 bp amplicons followed by Paired-End Low-Error (PELE) sequence analysis. By pooling a genomic DNA library made from over 9,000 N-ethyl-N-nitrosourea (ENU) mutagenized F1 fish into 32 equal pools of 288 fish, each with a unique Illumina barcode, we reduce the complexity of the template to a level at which we can detect mutations that occur in a single heterozygous fish in the entire library. MiSeq sequencing generates 250 base-pair overlapping paired-end reads, and PELE analysis aligns the overlapping sequences to each other and filters out any imperfect matches, thereby eliminating variants introduced during the sequencing process. We find that this filtering step reduces the number of false positive calls 50-fold without loss of true variant calls. After PELE we were able to validate 61.5% of the mutant calls that occurred at a frequency between 1 mutant call:100 wildtype calls and 1 mutant call:1000 wildtype calls in a pool of 288 fish. We then use high-resolution melt analysis to identify the single heterozygous mutation carrier in the 288-fish pool in which the mutation was identified. Using this NGS-TILLING protocol we validated 28 nonsense or splice site mutations in 20 genes, at a two-fold higher efficiency than using traditional Cel1 screening. We conclude that this approach significantly increases screening efficiency and accuracy at reduced cost and can be applied in a wide range of organisms.
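The core PELE idea, that overlapping paired-end reads must agree before a variant call is trusted, can be sketched as follows (hypothetical toy code, not the published pipeline; it assumes identical-length, fully overlapping reads and ignores quality scores and partial overlaps):

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G", "N": "N"}
    return "".join(comp[b] for b in reversed(seq))

def pele_consensus(read1, read2, max_mismatches=0):
    """Toy Paired-End Low-Error filter for fully overlapping read pairs.

    read2 is reverse-complemented and compared base-by-base to read1;
    pairs disagreeing at more than max_mismatches positions are rejected,
    since a disagreement between the two reads of a pair marks a
    sequencing error rather than a true variant (which would be seen
    identically by both reads of the fragment)."""
    r2 = revcomp(read2)
    if len(r2) != len(read1):
        return None  # sketch handles only full-overlap, equal-length pairs
    mismatches = sum(1 for a, b in zip(read1, r2) if a != b)
    return read1 if mismatches <= max_mismatches else None

fragment = "ACGTTGCAGT"
good_pair = (fragment, revcomp(fragment))       # both reads agree
bad_pair = (fragment, revcomp("ACGTTGCAGA"))    # one read carries an error
print(pele_consensus(*good_pair))  # -> ACGTTGCAGT
print(pele_consensus(*bad_pair))   # -> None
```

Filtering out every pair whose two reads disagree is what removes sequencer-introduced variants, which is how the authors gain their reported 50-fold reduction in false positives without losing true variant calls.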

  11. Next generation light water reactors

    International Nuclear Information System (INIS)

    Omoto, Akira

    1992-01-01

    In countries where new orders for nuclear reactors have ceased, the development of new types of light water reactor has been discussed with the aim of reviving nuclear power. In Japan as well, since light water reactors are expected to remain the main power reactors for a long period, next-generation light water reactor technology has been under discussion. Nuclear power development requires an extremely long lead time, and the next-generation light water reactors now under consideration will continue to operate until the middle of the next century; they must therefore anticipate the needs of that age well in advance. Requirements for next-generation light water reactors include improvements in the roles of operators and facilities, simplified design, and flexibility with respect to fuel-cycle trends. The trend of next-generation light water reactor development is discussed. Construction of an ABWR was started in September 1991 as the No. 6 plant of the Kashiwazaki-Kariwa Power Station. (K.I.)

  12. NIRS report of investigations for the development of the next generation PET apparatus. FY 2000

    International Nuclear Information System (INIS)

    2001-03-01

    This report summarizes studies from the representative technology fields involved in developing the next-generation PET apparatus, which is directed toward 3-D imaging, and is intended to support smooth future cooperation between the fields. The investigation started in April 2000 at the National Institute of Radiological Sciences (NIRS) with the cooperation of other facilities, universities, and companies. The report comprises chapters on: detector volume and geometrical efficiency (design criteria for the next-generation PET); scintillators for PET; an investigation of detectors and front-end electronics for the next-generation PET; a measurement system for depth of interaction; a detector simulator; development of an evaluation system for PET detectors; the signal-processing system for the next-generation PET; the list-mode data acquisition method for the next-generation PET; a list-mode data acquisition simulator; image reconstruction; a Monte Carlo simulator for next-generation PET scanners; out-of-field-of-view (FOV) radioactivity; and published papers and presentations. (N.I.)

  13. Advanced Material Strategies for Next-Generation Additive Manufacturing.

    Science.gov (United States)

    Chang, Jinke; He, Jiankang; Mao, Mao; Zhou, Wenxing; Lei, Qi; Li, Xiao; Li, Dichen; Chua, Chee-Kai; Zhao, Xin

    2018-01-22

    Additive manufacturing (AM) has drawn tremendous attention in various fields. In recent years, great efforts have been made to develop novel additive manufacturing processes such as micro-/nano-scale 3D printing, bioprinting, and 4D printing for the fabrication of complex 3D structures with high resolution, living components, and multimaterials. The development of advanced functional materials is important for the implementation of these novel additive manufacturing processes. Here, a state-of-the-art review on advanced material strategies for novel additive manufacturing processes is provided, mainly including conductive materials, biomaterials, and smart materials. The advantages, limitations, and future perspectives of these materials for additive manufacturing are discussed. It is believed that the innovations of material strategies in parallel with the evolution of additive manufacturing processes will provide numerous possibilities for the fabrication of complex smart constructs with multiple functions, which will significantly widen the application fields of next-generation additive manufacturing.

  14. Advanced Material Strategies for Next-Generation Additive Manufacturing

    Directory of Open Access Journals (Sweden)

    Jinke Chang

    2018-01-01

    Full Text Available Additive manufacturing (AM) has drawn tremendous attention in various fields. In recent years, great efforts have been made to develop novel additive manufacturing processes such as micro-/nano-scale 3D printing, bioprinting, and 4D printing for the fabrication of complex 3D structures with high resolution, living components, and multimaterials. The development of advanced functional materials is important for the implementation of these novel additive manufacturing processes. Here, a state-of-the-art review on advanced material strategies for novel additive manufacturing processes is provided, mainly including conductive materials, biomaterials, and smart materials. The advantages, limitations, and future perspectives of these materials for additive manufacturing are discussed. It is believed that the innovations of material strategies in parallel with the evolution of additive manufacturing processes will provide numerous possibilities for the fabrication of complex smart constructs with multiple functions, which will significantly widen the application fields of next-generation additive manufacturing.

  15. Advanced Material Strategies for Next-Generation Additive Manufacturing

    Science.gov (United States)

    Chang, Jinke; He, Jiankang; Zhou, Wenxing; Lei, Qi; Li, Xiao; Li, Dichen

    2018-01-01

    Additive manufacturing (AM) has drawn tremendous attention in various fields. In recent years, great efforts have been made to develop novel additive manufacturing processes such as micro-/nano-scale 3D printing, bioprinting, and 4D printing for the fabrication of complex 3D structures with high resolution, living components, and multimaterials. The development of advanced functional materials is important for the implementation of these novel additive manufacturing processes. Here, a state-of-the-art review on advanced material strategies for novel additive manufacturing processes is provided, mainly including conductive materials, biomaterials, and smart materials. The advantages, limitations, and future perspectives of these materials for additive manufacturing are discussed. It is believed that the innovations of material strategies in parallel with the evolution of additive manufacturing processes will provide numerous possibilities for the fabrication of complex smart constructs with multiple functions, which will significantly widen the application fields of next-generation additive manufacturing. PMID:29361754

  16. Next generation biofuel engineering in prokaryotes

    Science.gov (United States)

    Gronenberg, Luisa S.; Marcheschi, Ryan J.; Liao, James C.

    2014-01-01

    Next-generation biofuels must be compatible with current transportation infrastructure and be derived from environmentally sustainable resources that do not compete with food crops. Many bacterial species have unique properties advantageous to the production of such next-generation fuels. However, no single species possesses all characteristics necessary to make high quantities of fuels from plant waste or CO2. Species containing a subset of the desired characteristics are used as starting points for engineering organisms with all desired attributes. Metabolic engineering of model organisms has yielded high titer production of advanced fuels, including alcohols, isoprenoids and fatty acid derivatives. Technical developments now allow engineering of native fuel producers, as well as lignocellulolytic and autotrophic bacteria, for the production of biofuels. Continued research on multiple fronts is required to engineer organisms for truly sustainable and economical biofuel production. PMID:23623045

  17. Free Space Optics for Next Generation Cellular Backhaul

    KAUST Repository

    Zedini, Emna

    2016-11-01

    The exponential increase in the number of mobile users, coupled with the strong demand for high-speed data services, results in significant growth in the required cellular backhaul capacity. Optimizing cost efficiency while increasing capacity is becoming a key challenge for the cellular backhaul, which refers to the connections between base stations and mobile switching nodes over a variety of transport technologies such as copper, optical fibers, and radio links. These traditional transmission technologies are either expensive or cannot provide high data rates. This work is focused on the opportunities of free-space-optical (FSO) technology in next generation cellular backhaul. FSO is a cost-effective and wide-bandwidth solution as compared with traditional radio-frequency (RF) transmission. Moreover, due to its ease of deployment, license-free operation, high transmission security, and insensitivity to interference, FSO links are becoming an attractive solution for next generation cellular networks. However, the widespread deployment of FSO links is hampered by atmospheric turbulence-induced fading, weather conditions, and pointing errors. Increasing the reliability of FSO systems, while still exploiting their high data rate communications, is a key requirement in the deployment of an FSO-based backhaul. Therefore, the aim of this work is to provide different approaches to address these technical challenges. In this context, an investigation of hybrid automatic repeat request (HARQ) protocols from an information-theoretic perspective is undertaken. Moreover, a performance analysis of asymmetric RF/FSO dual-hop systems is presented. In such system models, multiple RF users can be multiplexed and sent over the FSO link. More specifically, the end-to-end performance metrics are presented in closed form.
This has also motivated study of the performance of dual-hop mixed FSO/RF systems, where the FSO link is used as a multicast channel that serves
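The turbulence-induced fading discussed above can be illustrated with a small Monte Carlo estimate of outage probability under a log-normal irradiance model (an illustrative sketch with assumed parameter values; the thesis itself treats more general fading models and HARQ analytically):

```python
import numpy as np

rng = np.random.default_rng(42)

# Log-normal turbulence model: choosing mean = -sigma_ln^2 / 2
# normalizes the average irradiance to E[I] = 1.
sigma_ln = 0.5                        # assumed turbulence strength
n = 1_000_000
I = rng.lognormal(mean=-sigma_ln**2 / 2, sigma=sigma_ln, size=n)

snr_mean_db = 20.0                    # assumed average electrical SNR
snr = 10**(snr_mean_db / 10) * I**2   # IM/DD: electrical SNR scales as I^2
snr_threshold = 10**(10.0 / 10)       # 10 dB outage threshold (assumed)

# Outage probability: fraction of fading realizations where the
# instantaneous SNR drops below the decoding threshold.
p_outage = float(np.mean(snr < snr_threshold))
```

Under these assumed numbers the link is in outage roughly 2% of the time despite a 10 dB average margin, which is the kind of reliability gap that HARQ and mixed RF/FSO relaying are meant to close.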

  18. A filtering method to generate high quality short reads using illumina paired-end technology.

    Science.gov (United States)

    Eren, A Murat; Vineis, Joseph H; Morrison, Hilary G; Sogin, Mitchell L

    2013-01-01

    Consensus between independent reads improves the accuracy of genome and transcriptome analyses; however, a lack of consensus between very similar sequences in metagenomic studies can, and often does, represent natural variation of biological significance. The machine-assigned quality scores commonly used on next generation platforms do not necessarily correlate with accuracy. Here, we describe using the overlap of paired-end, short sequence reads to identify error-prone reads in marker gene analyses and their contribution to spurious OTUs following clustering analysis using QIIME. Our approach can also reduce error in shotgun sequencing data generated from libraries with small, tightly constrained insert sizes. The open-source implementation of this algorithm in the Python programming language, with user instructions, can be obtained from https://github.com/meren/illumina-utils.

  19. Optimizing Open-Ended Crowdsourcing: The Next Frontier in Crowdsourced Data Management

    Science.gov (United States)

    Parameswaran, Aditya; Sarma, Akash Das; Venkataraman, Vipul

    2017-01-01

    Crowdsourcing is the primary means to generate training data at scale, and when combined with sophisticated machine learning algorithms, crowdsourcing is an enabler for a variety of emergent automated applications impacting all spheres of our lives. This paper surveys the emerging field of formally reasoning about and optimizing open-ended crowdsourcing, a popular and crucially important, but severely understudied class of crowdsourcing—the next frontier in crowdsourced data management. The underlying challenges include distilling the right answer when none of the workers agree with each other, teasing apart the various perspectives adopted by workers when answering tasks, and effectively selecting between the many open-ended operators appropriate for a problem. We describe the approaches that we’ve found to be effective for open-ended crowdsourcing, drawing from our experiences in this space. PMID:28951893

  20. PVT: an efficient computational procedure to speed up next-generation sequence analysis.

    Science.gov (United States)

    Maji, Ranjan Kumar; Sarkar, Arijita; Khatua, Sunirmal; Dasgupta, Subhasis; Ghosh, Zhumur

    2014-06-04

    High-throughput Next-Generation Sequencing (NGS) techniques are advancing genomics and molecular biology research. This technology generates substantially large volumes of data, which poses a major challenge for scientists seeking efficient, cost- and time-effective ways to analyse it. Moreover, certain challenging steps are common to the analysis of the different types of NGS data. Spliced alignment is one such fundamental step, and it is extremely computationally intensive as well as time consuming. Serious problems exist even with the most widely used spliced alignment tools. TopHat is one such widely used tool which, although it supports multithreading, does not efficiently utilize computational resources in terms of CPU utilization and memory. Here we introduce PVT (Pipelined Version of TopHat), in which we take a modular approach by breaking TopHat's serial execution into a pipeline of multiple stages, thereby increasing the degree of parallelization and computational resource utilization. We thus address the shortcomings of TopHat so as to analyse large NGS datasets efficiently. We analysed SRA datasets (SRX026839 and SRX026838) consisting of single-end reads and the SRA dataset SRR1027730 consisting of paired-end reads. We used TopHat v2.0.8 to analyse these datasets and noted the CPU usage, memory footprint, and execution time during spliced alignment. With this baseline information, we designed PVT, which removes the redundant computational steps of spliced alignment and breaks the job into a pipeline of multiple stages (each comprising different steps) to improve resource utilization and reduce execution time. PVT provides an improvement over TopHat for spliced alignment in NGS data analysis, reducing execution time to ~23% for the single-end read dataset. Further, PVT designed for paired end reads showed an
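The pipelining idea behind PVT — breaking a serial job into stages so that one chunk of data can occupy stage 2 while the next chunk occupies stage 1 — can be sketched with a generic thread pipeline (an illustration of the general technique, not PVT's code; stage functions here are placeholders):

```python
import threading
import queue

def pipeline(stages, inputs):
    """Run single-argument stage functions as a thread pipeline:
    item i can be in stage 2 while item i+1 is in stage 1, so the
    stages overlap instead of executing strictly one after another."""
    queues = [queue.Queue() for _ in range(len(stages) + 1)]

    def worker(stage, q_in, q_out):
        while True:
            item = q_in.get()
            if item is None:          # sentinel: propagate and stop
                q_out.put(None)
                return
            q_out.put(stage(item))

    threads = [threading.Thread(target=worker, args=(s, queues[i], queues[i + 1]))
               for i, s in enumerate(stages)]
    for t in threads:
        t.start()
    for item in inputs:
        queues[0].put(item)
    queues[0].put(None)               # signal end of input

    results = []
    while (out := queues[-1].get()) is not None:
        results.append(out)
    for t in threads:
        t.join()
    return results

# Two toy stages standing in for e.g. alignment and post-processing.
out = pipeline([lambda x: x + 1, lambda x: x * 2], [1, 2, 3])
```

The speedup comes from overlap: when the stages are bound by different resources (CPU vs. I/O), their costs overlap rather than add, which is the effect PVT exploits in TopHat's workflow.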

  1. Parallel manipulators with two end-effectors : Getting a grip on Jacobian-based stiffness analysis

    NARCIS (Netherlands)

    Hoevenaars, A.G.L.

    2016-01-01

    Robots that are developed for applications which require a high stiffness-over-inertia ratio, such as pick-and-place robots, machining robots, or haptic devices, are often based on parallel manipulators. Parallel manipulators connect an end-effector to an inertial base using multiple serial

  2. Development of a high performance eigensolver on the peta-scale next generation supercomputer system

    International Nuclear Information System (INIS)

    Imamura, Toshiyuki; Yamada, Susumu; Machida, Masahiko

    2010-01-01

    In present supercomputer systems, multicore and multisocket processors are necessary to build a system, and the choice of interconnect is essential. In addition, for effective development of new codes, high-performance, scalable, and reliable numerical software is one of the key items. ScaLAPACK and PETSc are well-known software packages for distributed-memory parallel computer systems. It is needless to say that software highly tuned for new architectures such as many-core processors must be chosen for real computation. In this study, we present a high-performance, highly scalable eigenvalue solver for the next-generation supercomputer system, the so-called 'K computer'. We have developed two versions, the standard version (eigen_s) and the enhanced-performance version (eigen_sx), on the T2K cluster system housed at the University of Tokyo. Eigen_s employs the conventional algorithms: Householder tridiagonalization, the divide-and-conquer (DC) algorithm, and Householder back-transformation. They are carefully implemented with blocking techniques and a flexible two-dimensional data distribution to reduce the overhead of memory traffic and data transfer, respectively. Eigen_s performs excellently on the T2K system with 4096 cores (theoretical peak 37.6 TFLOPS), showing a fine performance of 3.0 TFLOPS on a two-hundred-thousand-dimensional matrix. The enhanced version, eigen_sx, uses more advanced algorithms: narrow-band reduction, DC for band matrices, and block Householder back-transformation with the WY representation. Although this version is still at a test stage, it achieves 4.7 TFLOPS on a matrix of the same dimension used for eigen_s. (author)
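The conventional three-step pipeline used by eigen_s — Householder tridiagonalization, a tridiagonal eigensolver, and back-transformation — can be reproduced in miniature with SciPy (a serial sketch of the algorithmic steps only; the parallel blocking and 2-D data distribution are the hard part and are not shown, and SciPy's tridiagonal driver stands in for the divide-and-conquer solver):

```python
import numpy as np
from scipy.linalg import hessenberg, eigh_tridiagonal

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
A = (A + A.T) / 2                      # symmetric test matrix

# Step 1: Householder reduction, A = Q T Q^T.
# (For a symmetric matrix, the Hessenberg form is tridiagonal.)
T, Q = hessenberg(A, calc_q=True)

# Step 2: solve the tridiagonal eigenproblem from the diagonal and
# superdiagonal of T (eigen_s uses divide-and-conquer at this step).
w, V = eigh_tridiagonal(np.diag(T), np.diag(T, 1))

# Step 3: back-transform the eigenvectors to the original basis.
X = Q @ V
assert np.allclose(A @ X, X * w)       # A x_j = w_j x_j for every column j
```

The blocking and WY-representation techniques mentioned in the abstract change how steps 1 and 3 move data through the memory hierarchy, not the mathematics shown here.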

  3. System for Informatics in the Molecular Pathology Laboratory: An Open-Source End-to-End Solution for Next-Generation Sequencing Clinical Data Management.

    Science.gov (United States)

    Kang, Wenjun; Kadri, Sabah; Puranik, Rutika; Wurst, Michelle N; Patil, Sushant A; Mujacic, Ibro; Benhamed, Sonia; Niu, Nifang; Zhen, Chao Jie; Ameti, Bekim; Long, Bradley C; Galbo, Filipo; Montes, David; Iracheta, Crystal; Gamboa, Venessa L; Lopez, Daisy; Yourshaw, Michael; Lawrence, Carolyn A; Aisner, Dara L; Fitzpatrick, Carrie; McNerney, Megan E; Wang, Y Lynn; Andrade, Jorge; Volchenboum, Samuel L; Furtado, Larissa V; Ritterhouse, Lauren L; Segal, Jeremy P

    2018-04-24

    Next-generation sequencing (NGS) diagnostic assays increasingly are becoming the standard of care in oncology practice. As the scale of an NGS laboratory grows, management of these assays requires organizing large amounts of information, including patient data, laboratory processes, genomic data, as well as variant interpretation and reporting. Although several Laboratory Information Systems and/or Laboratory Information Management Systems are commercially available, they may not meet all of the needs of a given laboratory, in addition to being frequently cost-prohibitive. Herein, we present the System for Informatics in the Molecular Pathology Laboratory, a free and open-source Laboratory Information System/Laboratory Information Management System for academic and nonprofit molecular pathology NGS laboratories, developed at the Genomic and Molecular Pathology Division at the University of Chicago Medicine. The System for Informatics in the Molecular Pathology Laboratory was designed as a modular end-to-end information system to handle all stages of the NGS laboratory workload from test order to reporting. We describe the features of the system, its clinical validation at the Genomic and Molecular Pathology Division at the University of Chicago Medicine, and its installation and testing within a different academic center laboratory (University of Colorado), and we propose a platform for future community co-development and interlaboratory data sharing. Copyright © 2018. Published by Elsevier Inc.

  4. Multi-gigabit optical interconnects for next-generation on-board digital equipment

    Science.gov (United States)

    Venet, Norbert; Favaro, Henri; Sotom, Michel; Maignan, Michel; Berthon, Jacques

    2017-11-01

    Parallel optical interconnects are experimentally assessed as a technology that may offer the high-throughput data communication capabilities required by next-generation on-board digital processing units. An optical backplane interconnect was breadboarded on the basis of a digital transparent processor that provides flexible connectivity and variable bandwidth in telecom missions with multi-beam antenna coverage. The unit selected for the demonstration required more than tens of Gbit/s to be supported by the backplane. The demonstration made use of commercial parallel optical link modules at 850 nm wavelength, with 12 channels running at up to 2.5 Gbit/s. A flexible optical fibre circuit was developed to route board-to-board connections. It was plugged into the optical transmitter and receiver modules through 12-fibre MPO connectors. BER below 10^-14 and optical link budgets in excess of 12 dB were measured, which would enable broadcasting to be integrated. Integration of the optical backplane interconnect was successfully demonstrated by validating the overall digital processor functionality.

  5. FLEXIBLE, HIGH CHAR YIELD HYBRIDSIL ADHESIVE MATERIALS FOR NEXT GENERATION ABLATIVE THERMAL PROTECTION SYSTEMS, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NanoSonic will create and empirically validate flexible, high char yield HybridSil adhesive nanocomposites for use within current and next generation polymer based...

  6. Generation Next

    Science.gov (United States)

    Hawkins, B. Denise

    2010-01-01

    There is a shortage of accounting professors with Ph.D.s who can prepare the next generation. To help reverse the faculty deficit, the American Institute of Certified Public Accountants (CPAs) has created the new Accounting Doctoral Scholars program by pooling more than $17 million and soliciting commitments from more than 70 of the nation's…

  7. A filtering method to generate high quality short reads using illumina paired-end technology.

    Directory of Open Access Journals (Sweden)

    A Murat Eren

    Full Text Available Consensus between independent reads improves the accuracy of genome and transcriptome analyses; however, a lack of consensus between very similar sequences in metagenomic studies can, and often does, represent natural variation of biological significance. The machine-assigned quality scores commonly used on next generation platforms do not necessarily correlate with accuracy. Here, we describe using the overlap of paired-end, short sequence reads to identify error-prone reads in marker gene analyses and their contribution to spurious OTUs following clustering analysis using QIIME. Our approach can also reduce error in shotgun sequencing data generated from libraries with small, tightly constrained insert sizes. The open-source implementation of this algorithm in the Python programming language, with user instructions, can be obtained from https://github.com/meren/illumina-utils.

  8. JVM: Java Visual Mapping tool for next generation sequencing read.

    Science.gov (United States)

    Yang, Ye; Liu, Juan

    2015-01-01

    We developed a program, JVM (Java Visual Mapping), for mapping next-generation sequencing reads to a reference sequence. The program is implemented in Java and is designed to deal with the millions of short reads generated by Illumina sequencing technology. It employs a seed-index strategy and octal encoding operations for sequence alignment. JVM is useful for DNA-Seq and RNA-Seq when dealing with single-end resequencing. JVM is a desktop application which supports read volumes from 1 MB to 10 GB.
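The seed-index strategy mentioned above — indexing reference k-mer positions and extending candidate hits — can be sketched as follows (a minimal exact-match illustration; JVM's octal encoding and mismatch tolerance are omitted):

```python
from collections import defaultdict

def build_seed_index(reference, k):
    """Map every k-mer in the reference to its start positions."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def map_read(read, reference, index, k):
    """Use the read's first k bases as a seed to look up candidate
    positions, then verify the full read at each candidate."""
    hits = []
    for pos in index.get(read[:k], []):
        if reference[pos:pos + len(read)] == read:
            hits.append(pos)
    return hits

ref = "ACGTACGTGG"
index = build_seed_index(ref, 4)
hits = map_read("ACGTGG", ref, index, 4)
```

The index lookup restricts full comparison to a handful of candidate positions, which is what makes seed-based mappers fast on millions of reads; compact encodings such as JVM's octal scheme shrink the index and speed up the comparisons.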

  9. X-ray computed tomography comparison of individual and parallel assembled commercial lithium iron phosphate batteries at end of life after high rate cycling

    Science.gov (United States)

    Carter, Rachel; Huhman, Brett; Love, Corey T.; Zenyuk, Iryna V.

    2018-03-01

    X-ray computed tomography (X-ray CT) across multiple length scales is utilized for the first time to investigate the physical abuse imposed by high C-rate pulsed discharge on cells wired individually and in parallel. Manufactured lithium iron phosphate cells boasting high rate capability were pulse-power tested in both wiring configurations with high discharge currents of 10C for a large number of cycles (up to 1200) until end of life. Degradation that eludes conventional state-of-health (SOH) monitoring methods is diagnosed using CT by rendering the interior current collector without harm or alteration to the active materials. Correlation of the CT observations with the electrochemical pulse data from the parallel-wired cells reveals the risk of parallel wiring during high C-rate pulse discharge.

  10. Next Generation Microshutter Arrays Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop the next generation MicroShutter Array (MSA) as a multi-object field selector for missions anticipated in the next two decades. For many...

  11. Project control - the next generation

    International Nuclear Information System (INIS)

    Iorii, V.F.; McKinnon, B.L.

    1993-01-01

    The Yucca Mountain Site Characterization Project (YMP) is the U.S. Department of Energy's (DOE) second largest Major System Acquisition Project. We have developed an integrated planning and control system (called PACS) that we believe represents the 'Next Generation' in project control. PACS integrates technical scope, cost, and schedule information for over 50 participating organizations and produces performance measurement reports for science and engineering managers at all levels. Our 'Next Generation' project control tool, PACS, has been found to be in compliance with the new DOE Project Control System Guidelines. Additionally, the nuclear utility oversight group of the Edison Electric Institute has suggested PACS be used as a model for other civilian radioactive waste management projects. A 'Next Generation' project control tool will be necessary to do science in the 21st century.

  12. Developing next-generation telehealth tools and technologies: patients, systems, and data perspectives.

    Science.gov (United States)

    Ackerman, Michael J; Filart, Rosemarie; Burgess, Lawrence P; Lee, Insup; Poropatich, Ronald K

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth?

  13. Next Generation of Photovoltaics New Concepts

    CERN Document Server

    Vega, Antonio; López, Antonio

    2012-01-01

    This book presents new concepts for a next generation of PV. Among these concepts are: multijunction solar cells, multiple-excitation solar cells (or how to benefit from high-energy photons to create more than one electron-hole pair), intermediate band solar cells (or how to take advantage of below-band-gap-energy photons) and related technologies (for quantum dots, nitrides, thin films), and advanced light management approaches (plasmonics). Written by world-class experts in next generation photovoltaics, this book is an essential reference guide accessible to both beginners and experts working with solar cell technology. The book deeply analyzes the current state-of-the-art of the new photovoltaic approaches and outlines the implementation paths of these advanced devices. Topics addressed range from the fundamentals to the description of the state-of-the-art of the new types of solar cells.

  14. Next generation of photovoltaics. New concepts

    Energy Technology Data Exchange (ETDEWEB)

    Cristobal Lopez, Ana Belen; Marti Vega, Antonio; Luque Lopez, Antonio (eds.) [Univ. Politecnica de Madrid (Spain). Inst. de Energia Solar E.T.S.I. Telecomunicacion

    2012-07-01

    This book presents new concepts for a next generation of PV. Among these concepts are: multijunction solar cells, multiple-excitation solar cells (or how to benefit from high-energy photons to create more than one electron-hole pair), intermediate band solar cells (or how to take advantage of below-band-gap-energy photons) and related technologies (for quantum dots, nitrides, thin films), and advanced light management approaches (plasmonics). Written by world-class experts in next generation photovoltaics, this book is an essential reference guide accessible to both beginners and experts working with solar cell technology. The book deeply analyzes the current state-of-the-art of the new photovoltaic approaches and outlines the implementation paths of these advanced devices. Topics addressed range from the fundamentals to the description of the state-of-the-art of the new types of solar cells. (orig.)

  15. Fiscal 1998 research report. Application technology of next-generation high-density energy beams; 1998 nendo chosa hokokusho. Jisedai komitsudo energy beam riyo gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    A survey was made of application technologies for next-generation high-density energy beams. For practical application of laser power, using the laser directly for machining, rather than as an excitation source for YAG crystals, is highly efficient. For generation of high-power coherent beams from semiconductor lasers, phase synchronization and beam combining remain major technological barriers. Short pulse duration, high intensity, and high repetition rate are also important. Since an ultra-short laser pulse ends before heat is transferred to the surrounding material, it is suitable for precision machining, in particular ultra-fine machining. To use beam sources as tools in production processes, development of transmission, focusing, and control technologies, as well as optical fibers and devices, is indispensable. Applicable fields are as follows: machining (more than picoseconds), surface modification (modification and functionalization of tribo-materials and biocompatible materials), complex machining, fabrication of quantum functional structured materials (thin films, ultra-fine particles), agriculture, ultra-precise measurement, non-destructive measurement, and coherent chemistry in the chemical and environmental fields. (NEDO)

  16. Next-Generation Tools For Next-Generation Surveys

    Science.gov (United States)

    Murray, S. G.

    2017-04-01

    The next generation of large-scale galaxy surveys, across the electromagnetic spectrum, loom on the horizon as explosively game-changing datasets, in terms of our understanding of cosmology and structure formation. We are on the brink of a torrent of data that is set to both confirm and constrain current theories to an unprecedented level, and potentially overturn many of our conceptions. One of the great challenges of this forthcoming deluge is to extract maximal scientific content from the vast array of raw data. This challenge requires not only well-understood and robust physical models, but a commensurate network of software implementations with which to efficiently apply them. The halo model, a semi-analytic treatment of cosmological spatial statistics down to nonlinear scales, provides an excellent mathematical framework for exploring the nature of dark matter. This thesis presents a next-generation toolkit based on the halo model formalism, intended to fulfil the requirements of next-generation surveys. Our toolkit comprises three tools: (i) hmf, a comprehensive and flexible calculator for halo mass functions (HMFs) within extended Press-Schechter theory, (ii) the MRP distribution for extremely efficient analytic characterisation of HMFs, and (iii) halomod, an extension of hmf which provides support for the full range of halo model components. In addition to the development and technical presentation of these tools, we apply each to the task of physical modelling. With hmf, we determine the precision of our knowledge of the HMF, due to uncertainty in our knowledge of the cosmological parameters, over the past decade of cosmic microwave background (CMB) experiments. We place rule-of-thumb uncertainties on the predicted HMF for the Planck cosmology, and find that current limits on the precision are driven by modeling uncertainties rather than those from cosmological parameters. 
With the MRP, we create and test a method for robustly fitting the HMF to observed

  17. Mentoring the Next Generation of Social Workers in Palliative and End-of-Life Care: The Zelda Foster Studies Program.

    Science.gov (United States)

    Gardner, Daniel S; Gerbino, Susan; Walls, Jocelyn Warner; Chachkes, Esther; Doherty, Meredith J

    2015-01-01

    As Americans live longer with chronic illnesses, there is a growing need for social workers with the knowledge and skills to deliver quality palliative care to older adults and their families. Nevertheless, there remains a critical shortage of social workers prepared to provide quality palliative and end-of-life care (PELC) and to maintain the field into the next generation. Formal mentorship programs represent an innovative approach to enhancing practice, providing support and guidance, and promoting social work leadership in the field. This article reviews the literature on mentorship as an approach to professional and leadership development for emerging social workers in PELC. The Zelda Foster Studies Program in Palliative and End-of-Life Care bolsters competencies and mentors social workers in PELC over the trajectory of their careers, and enhances the capacity in the field. Findings from the first six years of two components of the ZF Program are examined to illustrate the feasibility, benefits, and challenges of formal mentorship programs. The authors describe the background, structure, and evaluation of the initiative's mentorship programs, and discuss the implications of mentorship in PELC for social work education, practice, and research.

  18. Dynalight Next Generation

    DEFF Research Database (Denmark)

    Jørgensen, Bo Nørregaard; Ottosen, Carl-Otto; Dam-Hansen, Carsten

    2016-01-01

    The project aims to develop the next generation of energy cost-efficient artificial lighting control that enables greenhouse growers to adapt their use of artificial lighting dynamically to fluctuations in the price of electricity. This is a necessity as fluctuations in the price of electricity c...

  19. NASA's Next Generation Space Geodesy Program

    Science.gov (United States)

    Merkowitz, S. M.; Desai, S. D.; Gross, R. S.; Hillard, L. M.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry, J. F.; Murphy, D.; Noll, C. E.; hide

    2012-01-01

Requirements for the ITRF have increased dramatically since the 1980s. The most stringent requirement comes from critical sea level monitoring programs: a global accuracy of 1.0 mm and a stability of 0.1 mm/yr, a factor of 10 to 20 beyond current capability. Other requirements for the ITRF coming from ice mass change, ground motion, and mass transport studies are similar. Current and future satellite missions will have ever-increasing measurement capability and will lead to increasingly sophisticated models of these and other changes in the Earth system. Ground space geodesy networks with enhanced measurement capability will be essential to meeting the ITRF requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. NASA has embarked on a Space Geodesy Program with a long-range goal to build, deploy and operate a next generation NASA Space Geodetic Network (SGN). The plan is to build integrated, multi-technique next-generation space geodetic observing systems as the core contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. Phase 1 of this project has been funded to (1) Establish and demonstrate a next-generation prototype integrated Space Geodetic Station at Goddard's Geophysical and Astronomical Observatory (GGAO), including next-generation SLR and VLBI systems along with modern GNSS and DORIS; (2) Complete ongoing Network Design Studies that describe the appropriate number and distribution of next-generation Space Geodetic Stations for an improved global network; (3) Upgrade analysis capability to handle the next-generation data; (4) Implement a modern

  20. Generation 'Next' and nuclear power

    International Nuclear Information System (INIS)

    Sergeev, A.A.

    2001-01-01

    My generation was labeled by Russian mass media as generation 'Next.' My technical education is above average. My current position is as a mechanical engineer in the leading research and development institute for Russian nuclear engineering for peaceful applications. It is noteworthy to point out that many of our developments were really first-of-a-kind in the history of engineering. However, it is difficult to grasp the importance of these accomplishments, especially since the progress of nuclear technologies is at a standstill. Can generation 'Next' be independent in their attitude towards nuclear power or shall we rely on the opinions of elder colleagues in our industry? (authors)

  1. Next-generation mammalian genetics toward organism-level systems biology.

    Science.gov (United States)

    Susaki, Etsuo A; Ukai, Hideki; Ueda, Hiroki R

    2017-01-01

Organism-level systems biology in mammals aims to identify, analyze, control, and design molecular and cellular networks executing various biological functions in mammals. In particular, system-level identification and analysis of molecular and cellular networks can be accelerated by next-generation mammalian genetics. Mammalian genetics without crossing, in which all production and phenotyping studies of genome-edited animals are completed within a single generation, drastically reduces the time, space, and effort of conducting systems research. Next-generation mammalian genetics is based on recent technological advancements in genome editing and developmental engineering. The process begins with the introduction of double-strand breaks into genomic DNA by site-specific endonucleases, which results in highly efficient genome editing in mammalian zygotes or embryonic stem cells. By using nuclease-mediated genome editing in zygotes, or ~100% embryonic stem cell-derived mouse technology, whole-body knock-out and knock-in mice can be produced within a single generation. These emerging technologies allow us to produce multiple knock-out or knock-in strains in a high-throughput manner. In this review, we discuss the basic concepts and related technologies as well as current challenges and future opportunities for next-generation mammalian genetics in organism-level systems biology.

  2. Rhamnolipids--next generation surfactants?

    Science.gov (United States)

    Müller, Markus Michael; Kügler, Johannes H; Henkel, Marius; Gerlitzki, Melanie; Hörmann, Barbara; Pöhnlein, Martin; Syldatk, Christoph; Hausmann, Rudolf

    2012-12-31

The demand for bio-based processes and materials in the petrochemical industry has significantly increased during the last decade because of the anticipated depletion of petroleum. This trend can be ascribed to three main causes: (1) the increased use of renewable resources for chemical synthesis of already established product classes, (2) the replacement of chemical synthesis of already established product classes by new biotechnological processes based on renewable resources, and (3) the biotechnological production of new molecules with new features or better performance than comparable, already established chemically synthesized products. All three approaches are currently being pursued for surfactant production. Biosurfactants are a very promising and interesting substance class because they are based on renewable resources, sustainable, and biologically degradable. Alkyl polyglycosides are chemically synthesized biosurfactants established on the surfactant market. The first microbiological biosurfactants on the market were sophorolipids. Of all currently known biosurfactants, rhamnolipids have the highest potential for becoming the next generation of biosurfactants introduced on the market. Although the metabolic pathways and genetic regulation of biosynthesis are known qualitatively, the quantitative understanding relevant for bioreactor cultivation is still missing. Additionally, high product titers have been described exclusively with vegetable oil as the sole carbon source in combination with Pseudomonas aeruginosa strains. Competitive productivity is still out of reach for heterologous hosts or non-pathogenic natural producer strains. Thus, on the one hand there is a need to gain a deeper understanding of the regulation of rhamnolipid production at the process and cellular levels during bioreactor cultivation. On the other hand, there is a need for metabolizable renewable substrates which do not compete with food and feed. 
A sustainable bioeconomy approach should

  3. The Next Great Generation?

    Science.gov (United States)

    Brownstein, Andrew

    2000-01-01

    Discusses ideas from a new book, "Millennials Rising: The Next Great Generation," (by Neil Howe and William Strauss) suggesting that youth culture is on the cusp of a radical shift with the generation beginning with this year's college freshmen who are typically team oriented, optimistic, and poised for greatness on a global scale. Includes a…

  4. Development of technology for next generation reactor - Development of next generation reactor in Korea -

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jong Kyun; Chang, Moon Heuy; Hwang, Yung Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); and others

    1993-09-01

The next generation reactor development project aims at overall development of the related technology and obtainment of the corresponding license by 2001. The immediate direction is to determine the reactor type and to build up the design concept in 1994. For trend analysis of foreign next generation reactor development, level-1 PSA, fuel cycle analysis and computer code development are performed on System 80+ and AP 600. In particular, for design characteristics analysis and volume upgrade of AP 600, nuclear fuel and reactor core design analysis, coolant circuit design analysis, mechanical structure design analysis, safety analysis, etc. are performed. (Author).

  5. Tailoring next-generation biofuels and their combustion in next-generation engines

    Energy Technology Data Exchange (ETDEWEB)

    Gladden, John Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wu, Weihua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Taatjes, Craig A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Scheer, Adam Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Turner, Kevin M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Yu, Eizadora T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); O' Bryan, Greg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Powell, Amy Jo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gao, Connie W. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2013-11-01

    Increasing energy costs, the dependence on foreign oil supplies, and environmental concerns have emphasized the need to produce sustainable renewable fuels and chemicals. The strategy for producing next-generation biofuels must include efficient processes for biomass conversion to liquid fuels and the fuels must be compatible with current and future engines. Unfortunately, biofuel development generally takes place without any consideration of combustion characteristics, and combustion scientists typically measure biofuels properties without any feedback to the production design. We seek to optimize the fuel/engine system by bringing combustion performance, specifically for advanced next-generation engines, into the development of novel biosynthetic fuel pathways. Here we report an innovative coupling of combustion chemistry, from fundamentals to engine measurements, to the optimization of fuel production using metabolic engineering. We have established the necessary connections among the fundamental chemistry, engine science, and synthetic biology for fuel production, building a powerful framework for co-development of engines and biofuels.

  6. Construction of a high-density genetic map for grape using next generation restriction-site associated DNA sequencing

    Directory of Open Access Journals (Sweden)

    Wang Nian

    2012-08-01

Background: Genetic mapping and QTL detection are powerful methodologies in plant improvement and breeding. Construction of a high-density, high-quality genetic map would be of great benefit in the production of superior grapes to meet human demand. The high throughput and low cost of the recently developed next generation sequencing (NGS) technology have resulted in its wide application in genome research. Sequencing restriction-site associated DNA (RAD) might be an efficient strategy to simplify genotyping. Combining NGS with RAD has proven to be powerful for single nucleotide polymorphism (SNP) marker development. Results: An F1 population of 100 individual plants was developed. In-silico digestion-site prediction was used to select an appropriate restriction enzyme for construction of a RAD sequencing library. Next generation RAD sequencing was applied to genotype the F1 population and its parents. Applying a cluster strategy for SNP modulation, a total of 1,814 high-quality SNP markers were developed: 1,121 of these were mapped to the female genetic map, 759 to the male map, and 1,646 to the integrated map. A comparison of the genetic maps to the published Vitis vinifera genome revealed both conservation and variations. Conclusions: The applicability of next generation RAD sequencing for genotyping a grape F1 population was demonstrated, leading to the successful development of a genetic map with high density and quality using our designed SNP markers. Detailed analysis revealed that this newly developed genetic map can be used for a variety of genome investigations, such as QTL detection, sequence assembly and genome comparison.
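The in-silico digestion-site prediction step described above amounts to a motif scan over a reference sequence, from which the expected number of RAD tags can be estimated. A minimal sketch, assuming a small set of common enzymes and a toy sequence (neither is from the study):

```python
# Minimal sketch of in-silico digestion-site prediction for choosing a RAD
# restriction enzyme. The enzymes and the toy "genome" are illustrative only.
ENZYMES = {
    "EcoRI": "GAATTC",
    "PstI": "CTGCAG",
    "SbfI": "CCTGCAGG",
}

def digestion_sites(sequence, motif):
    """Return 0-based start positions of every occurrence of the recognition motif."""
    sites, start = [], sequence.find(motif)
    while start != -1:
        sites.append(start)
        start = sequence.find(motif, start + 1)
    return sites

def predicted_tag_count(sequence, motif):
    """Each cut site yields two RAD tags, one on each flank of the cut."""
    return 2 * len(digestion_sites(sequence, motif))

genome = "TTGAATTCAAACCTGCAGGGGAATTCTT"
for name, motif in ENZYMES.items():
    print(name, digestion_sites(genome, motif), predicted_tag_count(genome, motif))
```

In a real design, the scan would run over the whole reference genome (or that of a related species), and the enzyme would be chosen so the predicted tag count matches the desired marker density and sequencing budget.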

  7. An integrated SNP mining and utilization (ISMU) pipeline for next generation sequencing data.

    Science.gov (United States)

    Azam, Sarwar; Rathore, Abhishek; Shah, Trushar M; Telluri, Mohan; Amindala, BhanuPrakash; Ruperao, Pradeep; Katta, Mohan A V S K; Varshney, Rajeev K

    2014-01-01

Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources, and expertise, a daunting combination for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value, with flanking sequences, for identified SNPs in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets, such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data, at high speed. The pipeline is very useful for the plant genetics and breeding community, enabling researchers without computational expertise to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets. It has been developed in Java and is available at http://hpc.icrisat.cgiar.org/ISMU as a standalone
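The polymorphism information content (PIC) value the pipeline reports for each SNP can be computed from allele frequencies with the standard Botstein et al. (1980) formula; a minimal sketch of that textbook formula (not necessarily ISMU's exact implementation):

```python
from itertools import combinations

def pic(allele_freqs):
    """Polymorphism information content (Botstein et al. 1980):
    PIC = 1 - sum_i p_i^2 - sum_{i<j} 2 * p_i^2 * p_j^2"""
    hom = sum(p * p for p in allele_freqs)
    cross = sum(2 * (p * p) * (q * q) for p, q in combinations(allele_freqs, 2))
    return 1.0 - hom - cross

# A biallelic SNP with equal allele frequencies reaches the biallelic maximum:
print(pic([0.5, 0.5]))  # 0.375
```

Markers with higher PIC are more informative for genotyping, which is why a pipeline would rank candidate SNPs by this score before assay design.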

  8. Molecular Diagnostics in Pathology: Time for a Next-Generation Pathologist?

    Science.gov (United States)

    Fassan, Matteo

    2018-03-01

- Comprehensive molecular investigations of mainstream carcinogenic processes have led to the use of effective molecular targeted agents in most cases of solid tumors in clinical settings. - To update readers regarding the evolving role of the pathologist in the therapeutic decision-making process and the introduction of next-generation technologies into pathology practice. - Current literature on the topic, primarily sourced from the PubMed (National Center for Biotechnology Information, Bethesda, Maryland) database, was reviewed. - Adequate evaluation of cytologic-based and tissue-based predictive diagnostic biomarkers largely depends on both proper pathologic characterization and customized processing of biospecimens. Moreover, increased requests for molecular testing have paralleled the recent, sharp decrease in tumor material to be analyzed: material that currently comprises cytology specimens or, at minimum, small biopsies in most cases of metastatic/advanced disease. Traditional diagnostic pathology has been completely revolutionized by the introduction of next-generation technologies, which provide multigene, targeted mutational profiling, even in the most complex of clinical cases. Combining traditional and molecular knowledge, pathologists integrate the morphological, clinical, and molecular dimensions of a disease, leading to a proper diagnosis and, therefore, the most-appropriate tailored therapy.

  9. High-energy green supercapacitor driven by ionic liquid electrolytes as an ultra-high stable next-generation energy storage device

    Science.gov (United States)

    Thangavel, Ranjith; Kannan, Aravindaraj G.; Ponraj, Rubha; Thangavel, Vigneysh; Kim, Dong-Won; Lee, Yun-Sung

    2018-04-01

Development of supercapacitors with high energy density and long cycle life using sustainable materials for next-generation applications is of paramount importance. The ongoing challenge is to elevate the energy density of supercapacitors to be on par with batteries while upholding power and cyclability. In addition, attaining such superior performance with green and sustainable biomass-derived compounds is crucial to address rising environmental concerns. Herein, we demonstrate the use of watermelon rind, a bio-waste from watermelons, towards high-energy, ultra-stable, high-temperature green supercapacitors with a high-voltage ionic liquid electrolyte. Supercapacitors assembled with the ultra-high surface area, hierarchically porous carbon exhibit remarkable performance both at room temperature and at high temperature (60 °C), with maximum energy densities of ∼174 Wh kg⁻¹ (25 °C) and 177 Wh kg⁻¹ (60 °C), based on the active mass of both electrodes. Furthermore, an ultra-high specific power of ∼20 kW kg⁻¹ along with an ultra-stable cycling performance with 90% retention over 150,000 cycles has been achieved even at 60 °C, outperforming supercapacitors assembled with other carbon based materials. These results demonstrate the potential to develop high-performing, green energy storage devices using eco-friendly materials for next generation electric vehicles and other advanced energy storage systems.
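Gravimetric energy figures of this kind follow from the standard capacitor relation E = ½CV². A small sketch with assumed example numbers (the capacitance and voltage below are not from the paper; they are chosen only to land near the reported ~174 Wh kg⁻¹ scale):

```python
def supercap_energy_density(c_specific_f_per_g, v_max):
    """Gravimetric energy density in Wh/kg from E = 0.5 * C * V^2.
    c_specific_f_per_g: cell-level capacitance per gram of both electrodes (F/g);
    0.5 * C * V^2 gives J/g, and 1 J/g equals 1/3.6 Wh/kg."""
    return 0.5 * c_specific_f_per_g * v_max ** 2 / 3.6

# Assumed example values (not the paper's): a ~3.5 V ionic-liquid cell with
# ~102 F/g cell-level capacitance lands near the reported ~174 Wh/kg scale.
print(round(supercap_energy_density(102, 3.5), 1))  # 173.5
```

The quadratic dependence on voltage is why a high-voltage ionic liquid electrolyte, rather than an aqueous one limited to ~1 V, is the key lever for battery-like energy density.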

  10. Cluster cosmology with next-generation surveys.

    Science.gov (United States)

    Ascaso, B.

    2017-03-01

The advent of next-generation surveys will provide a large number of cluster detections that will serve as the basis for constraining cosmological parameters using cluster counts. The two main observational ingredients needed are the cluster selection function and the calibration of the mass-observable relation. In this talk, we present the methodology designed to obtain robust predictions of both ingredients based on realistic cosmological simulations mimicking the following next-generation surveys: J-PAS, LSST and Euclid. We display recent results on the selection functions for these surveys together with others coming from other next-generation surveys such as eROSITA, ACTpol and SPTpol. We notice that the optical and IR surveys will reach the lowest masses between 0.3next-generation surveys and introduce very preliminary results.

  11. High power, repetitive stacked Blumlein pulse generators

    Energy Technology Data Exchange (ETDEWEB)

    Davanloo, F; Borovina, D L; Korioth, J L; Krause, R K; Collins, C B [Univ. of Texas at Dallas, Richardson, TX (United States). Center for Quantum Electronics; Agee, F J [US Air Force Phillips Lab., Kirtland AFB, NM (United States); Kingsley, L E [US Army CECOM, Ft. Monmouth, NJ (United States)

    1997-12-31

The repetitive stacked Blumlein pulse power generators developed at the University of Texas at Dallas consist of several triaxial Blumleins stacked in series at one end. The lines are charged in parallel and synchronously commuted with a single switch at the other end. In this way, relatively low charging voltages are multiplied to give a high discharge voltage across an arbitrary load. Extensive characterization of these novel pulsers has been performed over the past few years. Results indicate that they are capable of producing high power waveforms with rise times and repetition rates in the range of 0.5-50 ns and 1-300 Hz, respectively, using a conventional thyratron, spark gap, or photoconductive switch. The progress in the development and use of stacked Blumlein pulse generators is reviewed. The technology and the characteristics of these novel pulsers driving flash x-ray diodes are discussed. (author). 4 figs., 5 refs.
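The voltage multiplication described above follows from ideal transmission-line relations: a stack of N Blumleins charged in parallel to V ideally delivers N·V into a matched load, with a pulse width set by the line's round-trip transit time. A sketch with illustrative dimensions (not the UT Dallas design values):

```python
C_LIGHT = 2.998e8  # speed of light, m/s

def blumlein_pulse_width(line_length_m, eps_r):
    """Ideal output pulse width of one Blumlein line: the round-trip transit
    time tau = 2 * l * sqrt(eps_r) / c for a dielectric of permittivity eps_r."""
    return 2.0 * line_length_m * eps_r ** 0.5 / C_LIGHT

def stacked_output_voltage(n_lines, charge_voltage):
    """N Blumleins charged in parallel and discharged in series ideally
    deliver N times the charging voltage into a matched load."""
    return n_lines * charge_voltage

# Illustrative: five 0.5 m lines with eps_r = 2.25, charged to 50 kV.
print(round(blumlein_pulse_width(0.5, 2.25) * 1e9, 2), "ns")  # ~5 ns
print(stacked_output_voltage(5, 50e3) / 1e3, "kV")            # 250.0 kV
```

This is the sense in which "relatively low charging voltages are multiplied": the series stack sums the line voltages at discharge while the parallel charging circuit only ever sees the single-line voltage.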

  12. Next-Generation Photon Sources for Grand Challenges in Science and Energy

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-05-01

    to avoid radiation damage in high-resolution spatial imaging and to avoid space-charge broadening in photoelectron spectroscopy and microscopy. But light sources alone are not enough. The photons produced by next-generation light sources must be measured by state-of-the-art experiments installed at fully equipped end stations. Sophisticated detectors with unprecedented spatial, temporal, and spectral resolution must be designed and created. The theory of ultrafast phenomena that have never before been observed must be developed and implemented. Enormous data sets of diffracted signals in reciprocal space and across wide energy ranges must be collected and analyzed in real time so that they can guide the ongoing experiments. These experimental challenges - end stations, detectors, sophisticated experiments, theory, and data handling - must be planned and provided for as part of the photon source.

  13. Next-Generation Sequencing of Tubal Intraepithelial Carcinomas.

    Science.gov (United States)

    McDaniel, Andrew S; Stall, Jennifer N; Hovelson, Daniel H; Cani, Andi K; Liu, Chia-Jen; Tomlins, Scott A; Cho, Kathleen R

    2015-11-01

    High-grade serous carcinoma (HGSC) is the most prevalent and lethal form of ovarian cancer. HGSCs frequently arise in the distal fallopian tubes rather than the ovary, developing from small precursor lesions called serous tubal intraepithelial carcinomas (TICs, or more specifically, STICs). While STICs have been reported to harbor TP53 mutations, detailed molecular characterizations of these lesions are lacking. We performed targeted next-generation sequencing (NGS) on formalin-fixed, paraffin-embedded tissue from 4 women, 2 with HGSC and 2 with uterine endometrioid carcinoma (UEC) who were diagnosed as having synchronous STICs. We detected concordant mutations in both HGSCs with synchronous STICs, including TP53 mutations as well as assumed germline BRCA1/2 alterations, confirming a clonal association between these lesions. Next-generation sequencing confirmed the presence of a STIC clonally unrelated to 1 case of UEC, and NGS of the other tubal lesion diagnosed as a STIC unexpectedly supported the lesion as a micrometastasis from the associated UEC. We demonstrate that targeted NGS can identify genetic alterations in minute lesions, such as TICs, and confirm TP53 mutations as early driving events for HGSC. Next-generation sequencing also demonstrated unexpected associations between presumed STICs and synchronous carcinomas, providing evidence that some TICs are actually metastases rather than HGSC precursors.

  14. Next Generation Inverter

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Zilai [General Motors LLC, Detroit, MI (United States); Gough, Charles [General Motors LLC, Detroit, MI (United States)

    2016-04-22

    The goal of this Cooperative Agreement was the development of a Next Generation Inverter for General Motors’ electrified vehicles, including battery electric vehicles, range extended electric vehicles, plug-in hybrid electric vehicles and hybrid electric vehicles. The inverter is a critical electronics component that converts battery power (DC) to and from the electric power for the motor (AC).

  15. gCUP: rapid GPU-based HIV-1 co-receptor usage prediction for next-generation sequencing.

    Science.gov (United States)

    Olejnik, Michael; Steuwer, Michel; Gorlatch, Sergei; Heider, Dominik

    2014-11-15

Next-generation sequencing (NGS) has a large potential in HIV diagnostics, and genotypic prediction models have been developed and successfully tested in recent years. However, albeit highly accurate, these computational models lack the computational efficiency to reach their full potential. In this study, we demonstrate the use of graphics processing units (GPUs) in combination with a computational prediction model for HIV tropism. Our new model, named gCUP, parallelized and optimized for GPU, is highly accurate and can classify >175,000 sequences per second on an NVIDIA GeForce GTX 460. The computational efficiency of our new model is the next step in enabling NGS technologies to reach clinical significance in HIV diagnostics. Moreover, our approach is not limited to HIV tropism prediction, but can also be easily adapted to other settings, e.g. drug resistance prediction. The source code can be downloaded at http://www.heiderlab.de. Contact: d.heider@wz-straubing.de. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
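gCUP's actual classifier is a GPU-parallelized statistical model, but the flavor of genotypic co-receptor prediction can be illustrated with the much simpler, well-known "11/25 rule" applied to the V3 loop; the sequences below are synthetic examples, not real isolates:

```python
def eleven_twenty_five_rule(v3_loop):
    """The classic '11/25' genotypic heuristic (NOT gCUP's actual model):
    predict CXCR4 (X4) usage if the residue at V3 position 11 or 25 is
    positively charged (R or K), otherwise CCR5 (R5)."""
    assert len(v3_loop) == 35, "canonical V3 loop is 35 residues"
    return "X4" if v3_loop[10] in "RK" or v3_loop[24] in "RK" else "R5"

# Synthetic 35-residue example sequences (not real isolates):
r5_like = "CTRPNNNTRKSIHIGPGRAFYTTGEIIGDIRQAHC"
x4_like = r5_like[:24] + "R" + r5_like[25:]  # charged residue at position 25
print(eleven_twenty_five_rule(r5_like), eleven_twenty_five_rule(x4_like))  # R5 X4
```

A rule this simple evaluates one sequence in microseconds even on a CPU; the point of GPU models like gCUP is to keep that kind of throughput while using far more accurate learned features.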

  16. Dependable Hydrogen and Industrial Heat Generation from the Next Generation Nuclear Plant

    Energy Technology Data Exchange (ETDEWEB)

    Charles V. Park; Michael W. Patterson; Vincent C. Maio; Piyush Sabharwall

    2009-03-01

    The Department of Energy is working with industry to develop a next generation, high-temperature gas-cooled nuclear reactor (HTGR) as a part of the effort to supply the US with abundant, clean and secure energy. The Next Generation Nuclear Plant (NGNP) project, led by the Idaho National Laboratory, will demonstrate the ability of the HTGR to generate hydrogen, electricity, and high-quality process heat for a wide range of industrial applications. Substituting HTGR power for traditional fossil fuel resources reduces the cost and supply vulnerability of natural gas and oil, and reduces or eliminates greenhouse gas emissions. As authorized by the Energy Policy Act of 2005, industry leaders are developing designs for the construction of a commercial prototype producing up to 600 MWt of power by 2021. This paper describes a variety of critical applications that are appropriate for the HTGR with an emphasis placed on applications requiring a clean and reliable source of hydrogen. An overview of the NGNP project status and its significant technology development efforts are also presented.

  17. High Temperature Gas-Cooled Reactors Lessons Learned Applicable to the Next Generation Nuclear Plant

    International Nuclear Information System (INIS)

    Beck, J.M.; Collins, J.W.; Garcia, C.B.; Pincock, L.F.

    2010-01-01

    High Temperature Gas Reactors (HTGR) have been designed and operated throughout the world over the past five decades. These seven HTGRs are varied in size, outlet temperature, primary fluid, and purpose. However, there is much the Next Generation Nuclear Plant (NGNP) has learned and can learn from these experiences. This report captures these various experiences and documents the lessons learned according to the physical NGNP hardware (i.e., systems, subsystems, and components) affected thereby.

  18. Next-Generation Sequencing: From Understanding Biology to Personalized Medicine

    Directory of Open Access Journals (Sweden)

    Benjamin Meder

    2013-03-01

Within just a few years, the new methods for high-throughput next-generation sequencing have generated completely novel insights into the heritability and pathophysiology of human disease. In this review, we wish to highlight the benefits of the current state-of-the-art sequencing technologies for genetic and epigenetic research. We illustrate how these technologies help to constantly improve our understanding of genetic mechanisms in biological systems and summarize the progress made so far. This can be exemplified by the case of heritable heart muscle diseases, so-called cardiomyopathies. Here, next-generation sequencing is able to identify novel disease genes, and first clinical applications demonstrate the successful translation of this technology into personalized patient care.

  19. Accelerating next generation sequencing data analysis with system level optimizations.

    Science.gov (United States)

    Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid

    2017-08-22

Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features of modern computing architectures. To obtain the best execution time and utilize these hardware features, it is necessary to tune the system level parameters before running the application. We studied GATK HaplotypeCaller, a component of common NGS workflows that consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked, and the execution time of HaplotypeCaller was optimized via several system level parameters: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing, (ii) architecture-specific tuning of the PairHMM library for vectorization, (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer, and (iv) switching the CPU frequency governor from the default 'on-demand' mode to 'performance' mode to accelerate the Java multi-threading. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.
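As an illustration of tunings (i) and (iv), the following Linux configuration sketch shows the kind of commands involved; the thread count, heap size, shared-memory limit, and file names are placeholders, not the paper's benchmark settings:

```shell
# Illustrative system-tuning sketch for tunings (i) and (iv); requires root
# on a Linux host. Values below are placeholders, not the paper's settings.

# (iv) switch the CPU frequency governor from 'ondemand' to 'performance'
cpupower frequency-set -g performance

# (i) enlarge the kernel shared memory limit so alignment data can stay resident
sysctl -w kernel.shmmax=$((64 * 1024 * 1024 * 1024))

# (i) run HaplotypeCaller (GATK 3.x invocation) with parallel GC enabled
java -XX:+UseParallelGC -XX:ParallelGCThreads=16 -Xmx48g \
  -jar GenomeAnalysisTK.jar -T HaplotypeCaller \
  -R reference.fasta -I sample.bam -o sample.vcf
```

Such settings are host-specific: garbage-collection thread counts and heap sizes must be matched to the core count and memory of the machine actually running the pipeline.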

  20. Advanced Electrode Materials for High Energy Next Generation Li ion Batteries

    Science.gov (United States)

    Hayner, Cary Michael

    Lithium ion batteries are becoming an increasingly ubiquitous part of modern society. Since their commercial introduction by Sony in 1991, lithium-ion batteries have grown to be the most popular form of electrical energy storage for portable applications. Today, lithium-ion batteries power everything from cellphones and electric vehicles to e-cigarettes, satellites, and electric aircraft. Despite the commercialization of lithium-ion batteries over twenty years ago, it remains the most active field of energy storage research for its potential improvement over current technology. In order to capitalize on these opportunities, new materials with higher energy density and storage capacities must be developed. Unfortunately, most next-generation materials suffer from rapid capacity degradation or severe loss of capacity when rapidly discharged. In this dissertation, the development of novel anode and cathode materials for advanced high-energy and high-power lithium-ion batteries is reported. In particular, the application of graphene-based materials to stabilize active material is emphasized. Graphene, a unique two-dimensional material composed of atomically thin carbon sheets, has shown potential to address unsatisfactory rate capability, limited cycling performance and abrupt failure of these next-generation materials. This dissertation covers four major subjects: development of silicon-graphene composites, impact of carbon vacancies on graphene high-rate performance, iron fluoride-graphene composites, and ternary iron-manganese fluoride synthesis. Silicon is considered the most likely material to replace graphite as the anode active material for lithium-ion batteries due to its ability to alloy with large amounts of lithium, leading to significantly higher specific capacities than the graphite standard. 
However, Si also expands in volume by over 300% upon lithiation, leading to particle fracture and isolation from the conductive support, resulting in cell failure within a few

  1. Fiber to the home: next generation network

    Science.gov (United States)

    Yang, Chengxin; Guo, Baoping

    2006-07-01

    A strategy for Fiber To The Home (FTTH) is presented for next-generation networks capable of carrying converged telephone, television (TV), very high-speed Internet, and very high-speed bi-directional data services (such as video-on-demand (VOD) and gaming). The potential market is analyzed, and the barriers and appropriate strategies are discussed. Several technical problems, such as the various powering methods, optical fiber cables, and different network architectures, are also discussed.

  2. A Next Generation BioPhotonics Workstation

    DEFF Research Database (Denmark)

    Glückstad, Jesper; Palima, Darwin; Tauro, Sandeep

    2011-01-01

    We are developing a Next Generation BioPhotonics Workstation to be applied in research on regulated microbial cell growth including their underlying physiological mechanisms, in vivo characterization of cell constituents and manufacturing of nanostructures and meta-materials.

  3. Next generation breeding.

    Science.gov (United States)

    Barabaschi, Delfina; Tondelli, Alessandro; Desiderio, Francesca; Volante, Andrea; Vaccino, Patrizia; Valè, Giampiero; Cattivelli, Luigi

    2016-01-01

    The genomic revolution of the past decade has greatly improved our understanding of the genetic make-up of living organisms. The sequencing of crop genomes has completely changed our vision and interpretation of genome organization and evolution. Re-sequencing allows the identification of an unlimited number of markers as well as the analysis of germplasm allelic diversity based on allele mining approaches. High throughput marker technologies coupled with advanced phenotyping platforms provide new opportunities for discovering marker-trait associations which can sustain genomic-assisted breeding. The availability of genome sequencing information is enabling genome editing (site-specific mutagenesis), to obtain gene sequences desired by breeders. This review illustrates how next generation sequencing-derived information can be used to tailor genomic tools for different breeders' needs to revolutionize crop improvement. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Next-generation batteries and fuel cells for commercial, military, and space applications

    CERN Document Server

    Jha, A R

    2012-01-01

    Distilling complex theoretical physical concepts into an understandable technical framework, Next-Generation Batteries and Fuel Cells for Commercial, Military, and Space Applications describes primary and secondary (rechargeable) batteries for various commercial, military, spacecraft, and satellite applications for covert communications, surveillance, and reconnaissance missions. It emphasizes the cost, reliability, longevity, and safety of the next generation of high-capacity batteries for applications where high energy density, minimum weight and size, and reliability in harsh conditions are

  5. Progress on next generation linear colliders

    International Nuclear Information System (INIS)

    Ruth, R.D.

    1989-01-01

    In this paper, I focus on reviewing the issues and progress on a next generation linear collider with the general parameters of energy, luminosity, length, power, and technology. The energy range is dictated by physics with a mass reach well beyond LEP, although somewhat short of SSC. The luminosity is that required to obtain 10³-10⁴ units of R₀ per year. The length is consistent with a site on Stanford land with collisions occurring on the SLAC site. The power was determined by economic considerations. Finally, the technology was limited by the desire to have a next generation linear collider before the next century. 25 refs., 3 figs., 6 tabs

  6. Network Restoration for Next-Generation Communication and Computing Networks

    Directory of Open Access Journals (Sweden)

    B. S. Awoyemi

    2018-01-01

    Full Text Available Network failures are undesirable but inevitable occurrences for most modern communication and computing networks. A good network design must be robust enough to handle sudden failures, maintain traffic flow, and restore failed parts of the network within a permissible time frame, at the lowest cost achievable and with as little extra complexity in the network as possible. Emerging next-generation (xG communication and computing networks such as fifth-generation networks, software-defined networks, and internet-of-things networks have promises of fast speeds, impressive data rates, and remarkable reliability. To achieve these promises, these complex and dynamic xG networks must be built with low failure possibilities, high network restoration capacity, and quick failure recovery capabilities. Hence, improved network restoration models have to be developed and incorporated in their design. In this paper, a comprehensive study on network restoration mechanisms that are being developed for addressing network failures in current and emerging xG networks is carried out. Open-ended problems are identified, while invaluable ideas for better adaptation of network restoration to evolving xG communication and computing paradigms are discussed.

  7. Parallel Computing:. Some Activities in High Energy Physics

    Science.gov (United States)

    Willers, Ian

    This paper examines some activities in High Energy Physics that utilise parallel computing. The topic includes all computing from the proposed SIMD front end detectors, the farming applications, high-powered RISC processors and the large machines in the computer centers. We start by looking at the motivation behind using parallelism for general purpose computing. The developments around farming are then described from its simplest form to the more complex system in Fermilab. Finally, there is a list of some developments that are happening close to the experiments.

  8. Next-Generation Library Catalogs and the Problem of Slow Response Time

    Directory of Open Access Journals (Sweden)

    Margaret Brown-Sica

    2010-12-01

    Full Text Available Response time as defined for this study is the time that it takes for all files that constitute a single webpage to travel across the Internet from a Web server to the end user’s browser. In this study, the authors tested response times on queries for identical items in five different library catalogs, one of them a next-generation (NextGen catalog. The authors also discuss acceptable response time and how it may affect the discovery process. They suggest that librarians and vendors should develop standards for acceptable response time and use it in the product selection and development processes.

  9. Prospects for next-generation e+e- linear colliders

    International Nuclear Information System (INIS)

    Ruth, R.D.

    1990-02-01

    The purpose of this paper is to review progress in the US towards a next generation linear collider. During 1988, there were three workshops held on linear colliders: ''Physics of Linear Colliders,'' in Capri, Italy, June 14--18, 1988; Snowmass 88 (Linear Collider subsection), June 27--July 15, 1988; and the SLAC International Workshop on Next Generation Linear Colliders, November 28--December 9, 1988. In this paper, I focus on reviewing the issues and progress on a next generation linear collider. The energy range is dictated by physics with a mass reach well beyond LEP, although somewhat short of SSC. The luminosity is that required to obtain 10³-10⁴ units of R₀ per year. The length is consistent with a site on Stanford land with collisions occurring on the SLAC site; the power was determined by economic considerations. Finally, the technology was limited by the desire to have a next generation linear collider by the next century. 37 refs., 3 figs., 6 tabs

  10. PARISROC, an autonomous front-end ASIC for triggerless acquisition in next generation neutrino experiments

    Science.gov (United States)

    Conforti Di Lorenzo, S.; Campagne, J. E.; Drouet, S.; Dulucq, F.; El Berni, M.; Genolini, B.; de La Taille, C.; Martin-Chassard, G.; Seguin Moreau, N.; Wanlin, E.; Xiangbo, Y.

    2012-12-01

    PARISROC (Photomultiplier ARray Integrated in SiGe ReadOut Chip) is a complete readout chip in AustriaMicroSystems (AMS) SiGe 0.35 μm technology designed to read an array of 16 photomultipliers (PMTs). The ASIC was realized in the context of the PMm2 (square-meter photomultiplier) project, which has proposed a new system of “smart photo-detectors” composed of sensors and read-out electronics dedicated to next-generation neutrino experiments. Future water Cherenkov detectors will be housed in megaton-size water tanks and will therefore have a large photo-detection surface. We propose to segment this large surface into arrays, each with a single front-end electronics, so that only the useful data are sent to the surface to be stored and analyzed. This paper describes the second version of the ASIC and illustrates the chip's principle of operation and its main characteristics through a series of measurements. It is a 16-channel ASIC whose channels work independently, in triggerless mode, all managed by a common digital part. The main innovation is that all the channels are handled independently by the digital part, so that only channels that have triggered are digitized; the data are then transferred to the internal memory and sent out in a data-driven way. The ASIC allows charge and time measurement. We measured a charge-measurement range from 160 fC (1 photoelectron, p.e., at a PMT gain of 10⁶) to 100 pC (around 600 p.e.) at 1% linearity, and time tagging at 1 ns thanks to a 24-bit counter at 10 MHz and a Time-to-Digital Converter (TDC) on a 100 ns ramp.
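The quoted charge scale can be checked with a one-line calculation: at a gain of 10⁶, one photoelectron corresponds to the PMT gain times the elementary charge. A small sketch (the function name is ours, not from the paper):

```python
# Check the paper's charge range: 1 p.e. at PMT gain 1e6 is roughly 160 fC.
ELEMENTARY_CHARGE_C = 1.602e-19  # coulombs

def photoelectrons_to_charge_fC(n_pe, gain=1e6):
    """Anode charge in femtocoulombs for n_pe photoelectrons at the given gain."""
    return n_pe * gain * ELEMENTARY_CHARGE_C * 1e15  # convert C to fC

print(photoelectrons_to_charge_fC(1))    # about 160 fC, the lower range limit
print(photoelectrons_to_charge_fC(600))  # about 96,000 fC = 96 pC, near the 100 pC limit
```

This reproduces both ends of the quoted range: 160 fC at one photoelectron, and roughly 100 pC at around 600 photoelectrons.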

  11. PARISROC, an autonomous front-end ASIC for triggerless acquisition in next generation neutrino experiments

    International Nuclear Information System (INIS)

    Conforti Di Lorenzo, S.; Campagne, J.E.; Drouet, S.; Dulucq, F.; El Berni, M.; Genolini, B.; La Taille, C. de; Martin-Chassard, G.; Seguin Moreau, N.; Wanlin, E.; Xiangbo, Y.

    2012-01-01

    PARISROC (Photomultiplier ARray Integrated in SiGe ReadOut Chip) is a complete readout chip in AustriaMicroSystems (AMS) SiGe 0.35 μm technology designed to read an array of 16 photomultipliers (PMTs). The ASIC was realized in the context of the PMm2 (square-meter photomultiplier) project, which has proposed a new system of “smart photo-detectors” composed of sensors and read-out electronics dedicated to next-generation neutrino experiments. Future water Cherenkov detectors will be housed in megaton-size water tanks and will therefore have a large photo-detection surface. We propose to segment this large surface into arrays, each with a single front-end electronics, so that only the useful data are sent to the surface to be stored and analyzed. This paper describes the second version of the ASIC and illustrates the chip's principle of operation and its main characteristics through a series of measurements. It is a 16-channel ASIC whose channels work independently, in triggerless mode, all managed by a common digital part. The main innovation is that all the channels are handled independently by the digital part, so that only channels that have triggered are digitized; the data are then transferred to the internal memory and sent out in a data-driven way. The ASIC allows charge and time measurement. We measured a charge-measurement range from 160 fC (1 photoelectron, p.e., at a PMT gain of 10⁶) to 100 pC (around 600 p.e.) at 1% linearity, and time tagging at 1 ns thanks to a 24-bit counter at 10 MHz and a Time-to-Digital Converter (TDC) on a 100 ns ramp.

  12. ABrowse - a customizable next-generation genome browser framework

    Directory of Open Access Journals (Sweden)

    Kong Lei

    2012-01-01

    Full Text Available Abstract Background With the rapid growth of genome sequencing projects, the genome browser is becoming indispensable, not only as a visualization system but also as an interactive platform to support open data access and collaborative work. Thus, a customizable genome browser framework with rich functions and flexible configuration is needed to facilitate various genome research projects. Results Based on next-generation web technologies, we have developed a general-purpose genome browser framework, ABrowse, which provides an interactive browsing experience, open data access, and collaborative work support. By supporting Google-Maps-like smooth navigation, ABrowse offers end users a highly interactive browsing experience. To facilitate further data analysis, multiple data-access approaches are supported so that external platforms can retrieve data from ABrowse. To promote collaborative work, an online user space is provided for end users to create, store, and share comments, annotations, and landmarks. For data providers, ABrowse is highly customizable and configurable. The framework provides a set of utilities for importing annotation data conveniently. To build ABrowse on existing annotation databases, data providers can specify SQL statements according to the database schema, and customized pages for the detailed display of annotation entries can easily be plugged in. For developers, new drawing strategies can be integrated into ABrowse for new types of annotation data. In addition, a standard web service is provided for remote data retrieval, providing an underlying machine-oriented programming interface for open data access. Conclusions The ABrowse framework is valuable for end users, data providers, and developers, providing rich user functions and flexible customization approaches. The source code is published under the GNU Lesser General Public License v3.0 and is accessible at http://www.abrowse.org/. 
To demonstrate all the features of ABrowse, a live demo for

  13. ABrowse--a customizable next-generation genome browser framework.

    Science.gov (United States)

    Kong, Lei; Wang, Jun; Zhao, Shuqi; Gu, Xiaocheng; Luo, Jingchu; Gao, Ge

    2012-01-05

    With the rapid growth of genome sequencing projects, the genome browser is becoming indispensable, not only as a visualization system but also as an interactive platform to support open data access and collaborative work. Thus, a customizable genome browser framework with rich functions and flexible configuration is needed to facilitate various genome research projects. Based on next-generation web technologies, we have developed a general-purpose genome browser framework, ABrowse, which provides an interactive browsing experience, open data access, and collaborative work support. By supporting Google-Maps-like smooth navigation, ABrowse offers end users a highly interactive browsing experience. To facilitate further data analysis, multiple data-access approaches are supported so that external platforms can retrieve data from ABrowse. To promote collaborative work, an online user space is provided for end users to create, store, and share comments, annotations, and landmarks. For data providers, ABrowse is highly customizable and configurable. The framework provides a set of utilities for importing annotation data conveniently. To build ABrowse on existing annotation databases, data providers can specify SQL statements according to the database schema, and customized pages for the detailed display of annotation entries can easily be plugged in. For developers, new drawing strategies can be integrated into ABrowse for new types of annotation data. In addition, a standard web service is provided for remote data retrieval, providing an underlying machine-oriented programming interface for open data access. The ABrowse framework is valuable for end users, data providers, and developers, providing rich user functions and flexible customization approaches. The source code is published under the GNU Lesser General Public License v3.0 and is accessible at http://www.abrowse.org/. 
To demonstrate all the features of ABrowse, a live demo for the Arabidopsis thaliana genome has been built at http://arabidopsis.cbi.edu.cn/.

  14. ABrowse - a customizable next-generation genome browser framework

    Science.gov (United States)

    2012-01-01

    Background With the rapid growth of genome sequencing projects, the genome browser is becoming indispensable, not only as a visualization system but also as an interactive platform to support open data access and collaborative work. Thus, a customizable genome browser framework with rich functions and flexible configuration is needed to facilitate various genome research projects. Results Based on next-generation web technologies, we have developed a general-purpose genome browser framework, ABrowse, which provides an interactive browsing experience, open data access, and collaborative work support. By supporting Google-Maps-like smooth navigation, ABrowse offers end users a highly interactive browsing experience. To facilitate further data analysis, multiple data-access approaches are supported so that external platforms can retrieve data from ABrowse. To promote collaborative work, an online user space is provided for end users to create, store, and share comments, annotations, and landmarks. For data providers, ABrowse is highly customizable and configurable. The framework provides a set of utilities for importing annotation data conveniently. To build ABrowse on existing annotation databases, data providers can specify SQL statements according to the database schema, and customized pages for the detailed display of annotation entries can easily be plugged in. For developers, new drawing strategies can be integrated into ABrowse for new types of annotation data. In addition, a standard web service is provided for remote data retrieval, providing an underlying machine-oriented programming interface for open data access. Conclusions The ABrowse framework is valuable for end users, data providers, and developers, providing rich user functions and flexible customization approaches. The source code is published under the GNU Lesser General Public License v3.0 and is accessible at http://www.abrowse.org/. 
To demonstrate all the features of ABrowse, a live demo for Arabidopsis thaliana genome

  15. Toward a next-generation high-energy gamma-ray telescope. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Bloom, E.D.; Evans, L.L. [eds.

    1997-03-01

    Some time has passed between the first Gamma-ray Large Area Space Telescope (GLAST) workshop, Towards a Next Generation High-Energy Gamma-Ray Telescope, held in late August 1994, and the publication of this partial proceedings of that meeting. Since then there has been considerable progress in both the technical and project development of GLAST. From its origins at SLAC/Stanford in early 1992, the collaboration has grown to more than 20 institutions from France, Germany, Italy, Japan, and the US, and is still growing. About half of these are astrophysics/astronomy institutions; the other half are high-energy physics institutions. About 100 astronomers, astrophysicists, and particle physicists are currently spending some fraction of their time on the GLAST R and D program. The late publication date of these proceedings has resulted in some additions to the original content of the meeting. The first paper is actually a brochure prepared for NASA by Peter Michelson in early 1996. Except for the appendix, the other papers in the proceedings were presented at the conference and written up over the following two years. Some presentations were never written up.

  16. Towards a next-generation high-energy gamma-ray telescope. Proceedings

    International Nuclear Information System (INIS)

    Bloom, E.D.; Evans, L.L.

    1997-03-01

    Some time has passed between the first Gamma-ray Large Area Space Telescope (GLAST) workshop, Towards a Next Generation High-Energy Gamma-Ray Telescope, held in late August 1994, and the publication of this partial proceedings of that meeting. Since then there has been considerable progress in both the technical and project development of GLAST. From its origins at SLAC/Stanford in early 1992, the collaboration has grown to more than 20 institutions from France, Germany, Italy, Japan, and the US, and is still growing. About half of these are astrophysics/astronomy institutions; the other half are high-energy physics institutions. About 100 astronomers, astrophysicists, and particle physicists are currently spending some fraction of their time on the GLAST R and D program. The late publication date of these proceedings has resulted in some additions to the original content of the meeting. The first paper is actually a brochure prepared for NASA by Peter Michelson in early 1996. Except for the appendix, the other papers in the proceedings were presented at the conference and written up over the following two years. Some presentations were never written up.

  17. Declarative Parallel Programming in Spreadsheet End-User Development

    DEFF Research Database (Denmark)

    Biermann, Florian

    2016-01-01

    Spreadsheets are first-order functional languages and are widely used in research and industry as a tool to conveniently perform all kinds of computations. Because cells on a spreadsheet are immutable, there are possibilities for implicit parallelization of spreadsheet computations. In this literature study, we provide an overview of the publications on spreadsheet end-user programming and declarative array programming to inform further research on parallel programming in spreadsheets. Our results show that there is a clear overlap between spreadsheet programming and array programming, and we can directly apply results from functional array programming to a spreadsheet model of computations.

  18. Planning Instruction to Meet the Intent of the Next Generation Science Standards

    Science.gov (United States)

    Krajcik, Joseph; Codere, Susan; Dahsah, Chanyah; Bayer, Renee; Mun, Kongju

    2014-03-01

    The National Research Council's Framework for K- 12 Science Education and the Next Generation Science Standards (NGSS Lead States in Next Generation Science Standards: For states, by states. The National Academies Press, Washington, 2013) move teaching away from covering many isolated facts to a focus on a smaller number of disciplinary core ideas (DCIs) and crosscutting concepts that can be used to explain phenomena and solve problems by engaging in science and engineering practices. The NGSS present standards as knowledge-in-use by expressing them as performance expectations (PEs) that integrate all three dimensions from the Framework for K- 12 Science Education. This integration of core ideas, practices, and crosscutting concepts is referred to as three-dimensional learning (NRC in Division of Behavioral and Social Sciences and Education. The National Academies Press, Washington, 2014). PEs state what students can be assessed on at the end of grade level for K-5 and at the end of grade band for 6-8 and 9-12. PEs do not specify how instruction should be developed nor do they serve as objectives for individual lessons. To support students in developing proficiency in the PEs, the elements of the DCIs will need to be blended with various practices and crosscutting concepts. In this paper, we examine how to design instruction to support students in meeting a cluster or "bundle" of PEs and how to blend the three dimensions to develop lesson level PEs that can be used for guiding instruction. We provide a ten-step process and an example of that process that teachers and curriculum designers can use to design lessons that meet the intent of the Next Generation of Science Standards.

  19. Overview of NASA's Next Generation Air Transportation System (NextGen) Research

    Science.gov (United States)

    Swenson, Harry N.

    2009-01-01

    This slide presentation is an overview of the research for the Next Generation Air Transportation System (NextGen). Included is a review of the current air transportation system and the challenges of air transportation research. Also included is a review of the current research highlights and significant accomplishments.

  20. Parallel and pipelined front-end for multi-element silicon detectors in scanning electron microscopy

    International Nuclear Information System (INIS)

    Boulin, C.; Epstein, A.

    1992-01-01

    This paper discusses a silicon quadrant detector (128 elements) implemented as an electron detector in a Scanning Transmission Electron Microscope. As the electron beam scans over the sample, electrons are counted for each pixel. The authors developed an ASIC for the multichannel counting system. The digital front-end carries out the readout of all elements, in four groups, and uses these data to compute linear combinations that generate up to eight simultaneous images. For the preprocessing, the authors implemented a parallel and pipelined system. Dedicated software tools were developed to generate the programs for all the processors; these tools are accessed transparently by the user via a user-friendly interface.
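The linear-combination step can be sketched as follows; the coefficient rows and pixel counts are invented for illustration and are not taken from the paper:

```python
# Sketch: per-pixel counts from the 4 readout groups are mixed by a matrix of
# coefficients to form several simultaneous images. All values are invented.

def combine(counts, coeffs):
    """Return one output value per coefficient row for a single pixel."""
    return [sum(c * x for c, x in zip(row, counts)) for row in coeffs]

# Hypothetical coefficient rows: total signal plus difference images, the kind
# of combinations a quadrant-style detector readout might use.
COEFFS = [
    [1, 1, 1, 1],    # sum of all groups (bright-field-like image)
    [1, -1, 1, -1],  # left-right difference
    [1, 1, -1, -1],  # top-bottom difference
    [1, -1, -1, 1],  # diagonal difference
]

pixel_counts = [10, 12, 9, 11]  # electrons counted in each group for one pixel
print(combine(pixel_counts, COEFFS))  # -> [42, -4, 2, 0]
```

In the hardware described above this mixing runs in a pipelined fashion, one pixel per stage, so up to eight such images emerge concurrently from the same raw counts.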

  1. Next Generation Social Networks

    DEFF Research Database (Denmark)

    Sørensen, Lene Tolstrup; Skouby, Knud Erik

    2008-01-01

    Different online networks for communities of people who share interests, or for individuals who present themselves through user-produced content, are what make up the social networking of today. The purpose of this paper is to discuss perceived user requirements for the next generation of social networks. The paper

  2. Next generation CANDU plants

    International Nuclear Information System (INIS)

    Hedges, K.R.; Yu, S.K.W.

    1998-01-01

    Future CANDU designs will continue to meet the emerging design and performance requirements expected by the operating utilities. The next generation CANDU products will integrate new technologies into both the product features and the engineering and construction work processes associated with delivering the products. The timely incorporation of advanced design features is the approach adopted for the development of the next generation of CANDU. AECL's current products consist of the 700 MW Class CANDU 6 and the 900 MW Class CANDU 9. Evolutionary improvements are continuing with our CANDU products to enhance their adaptability to meet customers' ever-increasing need for higher output. Our key product drivers are improved safety, environmental protection, and improved cost effectiveness. Towards these goals we have made excellent progress in research and development, and our investments are continuing in areas such as fuel channels and passive safety. Our long-term focus is utilizing the fuel cycle flexibility of CANDU reactors as part of the long-term energy mix

  3. Optical Subsystems for Next Generation Access Networks

    DEFF Research Database (Denmark)

    Lazaro, J.A; Polo, V.; Schrenk, B.

    2011-01-01

    Recent optical technologies are providing higher flexibility to next generation access networks: on the one hand, supporting progressive FTTx and specifically FTTH deployment, progressively shortening the copper access network; on the other hand, opening fixed-mobile convergence solutions in next generation PON architectures. An overview is provided of the optical subsystems developed for the implementation of the proposed NG-Access Networks.

  4. Towards the Next Generation of Space Environment Prediction Capabilities.

    Science.gov (United States)

    Kuznetsova, M. M.

    2015-12-01

    Since its establishment more than 15 years ago, the Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) has served as an access point to an expanding collection of state-of-the-art space environment models and frameworks, as well as a hub for collaborative development of next-generation space weather forecasting systems. In partnership with model developers and international research and operational communities, the CCMC integrates new data streams and models from diverse sources into end-to-end space weather impact prediction systems, identifies weak links in data-model and model-model coupling, and leads community efforts to fill those gaps. The presentation will highlight the latest developments and progress in CCMC-led community-wide projects on testing, prototyping, and validation of models, forecasting techniques, and procedures, and will outline ideas on accelerating the implementation of new capabilities in space weather operations.

  5. Building next-generation converged networks theory and practice

    CERN Document Server

    Pathan, Al-Sakib Khan

    2013-01-01

    Supplying a comprehensive introduction to next-generation networks, Building Next-Generation Converged Networks: Theory and Practice strikes a balance between how and why things work and how to make them work. It compiles recent advancements along with basic issues from the wide range of fields related to next generation networks. Containing the contributions of 56 industry experts and researchers from 16 different countries, the book presents relevant theoretical frameworks and the latest research. It investigates new technologies such as IPv6 over Low Power Wireless Personal Area Network (6L

  6. Parallel paving: An algorithm for generating distributed, adaptive, all-quadrilateral meshes on parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Lober, R.R.; Tautges, T.J.; Vaughan, C.T.

    1997-03-01

    Paving is an automated mesh generation algorithm which produces all-quadrilateral elements. It can additionally generate these elements in varying sizes such that the resulting mesh adapts to a function distribution, such as an error function. While powerful, conventional paving is a very serial algorithm in its operation. Parallel paving is the extension of serial paving into parallel environments to perform the same meshing functions as conventional paving, only on distributed, discretized models. This extension allows large, adaptive, parallel finite element simulations to take advantage of paving's meshing capabilities for h-remap remeshing. A significantly modified version of the CUBIT mesh generation code has been developed to host the parallel paving algorithm, demonstrate its capabilities on both two-dimensional and three-dimensional surface geometries, and compare the resulting parallel-produced meshes to conventionally paved meshes for mesh quality and algorithm performance. Sandia's "tiling" dynamic load balancing code has also been extended to work with the paving algorithm to retain parallel efficiency as subdomains undergo iterative mesh refinement.

  7. Next-Generation Sequencing for Binary Protein-Protein Interactions

    Directory of Open Access Journals (Sweden)

    Bernhard Suter

    2015-12-01

    Full Text Available The yeast two-hybrid (Y2H) system exploits host cell genetics in order to display binary protein-protein interactions (PPIs) via defined and selectable phenotypes. Numerous improvements have been made to this method, adapting the screening principle for diverse applications, including drug discovery and the scale-up for proteome-wide interaction screens in human and other organisms. Here we discuss a systematic workflow and analysis scheme for screening data generated by Y2H and related assays that includes high-throughput selection procedures, readout of comprehensive results via next-generation sequencing (NGS), and the interpretation of interaction data via quantitative statistics. The novel assays and tools will serve the broader scientific community to harness the power of NGS technology to address PPI networks in health and disease. We discuss examples of how this next-generation platform can be applied to address specific questions in diverse fields of biology and medicine.

  8. Key thrusts in next generation CANDU. Annex 10

    International Nuclear Information System (INIS)

    Shalaby, B.A.; Torgerson, D.F.; Duffey, R.B.

    2002-01-01

    Current electricity markets and the competitiveness of other generation options such as CCGT have influenced the direction of future nuclear generation. The next generation CANDU has used its key characteristics as the basis to leapfrog into a new design featuring improved economics, enhanced passive safety, enhanced operability and demonstrated fuel cycle flexibility. Many enabling technologies spinning off current CANDU design features are used in the next generation design. Some of these technologies have been developed in support of existing plants and near-term designs, while others will need to be developed and tested. This paper discusses the key principles driving the next generation CANDU design and the fuel cycle flexibility of the CANDU system, which provides synergy with the PWR fuel cycle. (author)

  9. Highly Flexible, Fire Resistant HybridSil Foams for Next Generation Fireproofing, Insulation, and Energy Absorption NASA Applications, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this Phase I STTR program is to adapt NanoSonic's HybridSil™ nanocomposite technology for the creation of next generation highly flexible, fire...

  10. Computational Performance of a Parallelized Three-Dimensional High-Order Spectral Element Toolbox

    Science.gov (United States)

    Bosshard, Christoph; Bouffanais, Roland; Clémençon, Christian; Deville, Michel O.; Fiétier, Nicolas; Gruber, Ralf; Kehtari, Sohrab; Keller, Vincent; Latt, Jonas

    In this paper, a comprehensive performance review of an MPI-based high-order three-dimensional spectral element method C++ toolbox is presented. The focus is put on the performance evaluation of several aspects, with a particular emphasis on parallel efficiency. The performance is analyzed with the help of a time prediction model based on a parameterization of the application and the hardware resources. A tailor-made CFD computation benchmark case is introduced and used to carry out this review, with particular attention to clusters with up to 8192 cores. Some problems in the parallel implementation have been detected and corrected. The theoretical complexities with respect to the number of elements, to the polynomial degree, and to communication needs are correctly reproduced. It is concluded that this type of code has a nearly perfect speed-up on machines with thousands of cores, and is ready to make the step to next-generation petaflop machines.
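
    The kind of time-prediction model the review describes can be illustrated with a minimal sketch (the compute/communication split and all constants below are illustrative assumptions, not the paper's parameterization):

```python
# Minimal sketch of a parallel time-prediction model: runtime on p cores
# is modeled as perfectly divisible compute plus a fixed per-step
# communication cost (latency + message volume / bandwidth).
# All constants are illustrative assumptions, not measured values.

def predicted_runtime(t_serial, p, t_latency=1e-5, halo_bytes=1e6, bandwidth=1e9):
    """Predicted runtime T(p) = T_serial/p + T_comm."""
    t_compute = t_serial / p
    t_comm = t_latency + halo_bytes / bandwidth
    return t_compute + t_comm

def efficiency(t_serial, p, **kw):
    """Parallel efficiency E(p) = T(1) / (p * T(p))."""
    return predicted_runtime(t_serial, 1, **kw) / (p * predicted_runtime(t_serial, p, **kw))

if __name__ == "__main__":
    # With communication costs this small relative to compute, the model
    # predicts near-perfect speed-up even at thousands of cores.
    for p in (1, 64, 1024, 8192):
        print(p, round(efficiency(1000.0, p), 3))
```

    Under these assumed constants the model reproduces the abstract's qualitative conclusion: efficiency stays close to 1 up to 8192 cores because the per-step communication term is dwarfed by the per-core compute term.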

  11. Recent progress in nanostructured next-generation field emission devices

    International Nuclear Information System (INIS)

    Mittal, Gaurav; Lahiri, Indranil

    2014-01-01

    Field emission has been known to mankind for more than a century, and extensive research in this field for the last 40–50 years has led to development of exciting applications such as electron sources, miniature x-ray devices, display materials, etc. In the last decade, large-area field emitters were projected as an important material to revolutionize healthcare and medical devices, and space research. With the advent of nanotechnology and advancements related to carbon nanotubes, field emitters are demonstrating highly enhanced performance and novel applications. Next-generation emitters need ultra-high emission current density, high brightness, excellent stability and reproducible performance. Novel design considerations and application of new materials can lead to achievement of these capabilities. This article presents an overview of recent developments in this field and their effects on improved performance of field emitters. These advancements are demonstrated to hold great potential for application in next-generation field emission devices. (topical review)

  12. Recent progress in nanostructured next-generation field emission devices

    Science.gov (United States)

    Mittal, Gaurav; Lahiri, Indranil

    2014-08-01

    Field emission has been known to mankind for more than a century, and extensive research in this field for the last 40-50 years has led to development of exciting applications such as electron sources, miniature x-ray devices, display materials, etc. In the last decade, large-area field emitters were projected as an important material to revolutionize healthcare and medical devices, and space research. With the advent of nanotechnology and advancements related to carbon nanotubes, field emitters are demonstrating highly enhanced performance and novel applications. Next-generation emitters need ultra-high emission current density, high brightness, excellent stability and reproducible performance. Novel design considerations and application of new materials can lead to achievement of these capabilities. This article presents an overview of recent developments in this field and their effects on improved performance of field emitters. These advancements are demonstrated to hold great potential for application in next-generation field emission devices.

  13. Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools

    Science.gov (United States)

    Ozcan, Aydogan

    2014-01-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550

  14. Designing the next generation (fifth generation computers)

    International Nuclear Information System (INIS)

    Wallich, P.

    1983-01-01

    A description is given of the designs necessary to develop fifth generation computers. An analysis is offered of problems and developments in parallelism, VLSI, artificial intelligence, knowledge engineering and natural language processing. Software developments are outlined including logic programming, object-oriented programming and exploratory programming. Computer architecture is detailed including concurrent computer architecture

  15. Parallel processor programs in the Federal Government

    Science.gov (United States)

    Schneck, P. B.; Austin, D.; Squires, S. L.; Lehmann, J.; Mizell, D.; Wallgren, K.

    1985-01-01

    In 1982, a report dealing with the nation's research needs in high-speed computing called for increased access to supercomputing resources for the research community, research in computational mathematics, and increased research in the technology base needed for the next generation of supercomputers. Since that time a number of programs addressing future generations of computers, particularly parallel processors, have been started by U.S. government agencies. The present paper provides a description of the largest government programs in parallel processing. Established in fiscal year 1985 by the Institute for Defense Analyses for the National Security Agency, the Supercomputing Research Center will pursue research to advance the state of the art in supercomputing. Attention is also given to the DOE applied mathematical sciences research program, the NYU Ultracomputer project, the DARPA multiprocessor system architectures program, NSF research on multiprocessor systems, ONR activities in parallel computing, and NASA parallel processor projects.

  16. High performance parallel computers for science

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1989-01-01

    This paper reports that Fermilab's Advanced Computer Program (ACP) has been developing cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 Mflops (peak), 10 MByte single-board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction

  17. Next Generation Biopharmaceuticals: Product Development.

    Science.gov (United States)

    Mathaes, Roman; Mahler, Hanns-Christian

    2018-04-11

    Therapeutic proteins show rapid market growth. The relatively young biotech industry already represents 20% of the total global pharma market. The biotech industry environment has traditionally been fast-paced and intellectually stimulating. Nowadays the top ten best-selling drugs are dominated by monoclonal antibodies (mAbs). Despite mAbs being the biggest medical breakthrough of the last 25 years, technical innovation does not stand still. The goal remains to preserve the benefits of a conventional mAb (serum half-life and specificity) whilst further improving efficacy and safety, and to open new and better avenues for treating patients, e.g., improving the potency of molecules, target binding, tissue penetration, tailored pharmacokinetics, and reduced adverse effects or immunogenicity. The next generation of biopharmaceuticals can pose specific chemistry, manufacturing, and control (CMC) challenges. In contrast to conventional proteins, next-generation biopharmaceuticals often require lyophilization of the final drug product to ensure storage stability over the shelf-life. In addition, next-generation biopharmaceuticals require analytical methods that cover the different possible degradation patterns and pathways, and product development is far from straightforward. The element of "prior knowledge" does not exist equally for most novel formats compared to antibodies, and thus the assessment of critical quality attributes (CQAs) and the definition of CQA assessment criteria and specifications are difficult, especially in early-stage development.

  18. Combat vehicle crew helmet-mounted display: next generation high-resolution head-mounted display

    Science.gov (United States)

    Nelson, Scott A.

    1994-06-01

    The Combat Vehicle Crew Head-Mounted Display (CVC HMD) program is an ARPA-funded, US Army Natick Research, Development, and Engineering Center monitored effort to develop a high resolution, flat panel HMD for the M1 A2 Abrams main battle tank. CVC HMD is part of the ARPA High Definition Systems (HDS) thrust to develop and integrate small (24 micrometers square pels), high resolution (1280 X 1024 X 6-bit grey scale at 60 frame/sec) active matrix electroluminescent (AMEL) and active matrix liquid crystal displays (AMLCD) for head mounted and projection applications. The Honeywell designed CVC HMD is a next generation head-mounted display system that includes advanced flat panel image sources, advanced digital display driver electronics, high speed (> 1 Gbps) digital interconnect electronics, and light weight, high performance optical and mechanical designs. The resulting dramatic improvements in size, weight, power, and cost have already led to program spin offs for both military and commercial applications.

  19. Next Generation Life Support (NGLS): Rapid Cycle Amine Swing Bed

    Data.gov (United States)

    National Aeronautics and Space Administration — The Rapid Cycle Amine (RCA) swingbed has been identified as a technology with high potential to meet the stringent requirements for the next generation spacesuit's...

  20. Efficient Cryptography for the Next Generation Secure Cloud

    Science.gov (United States)

    Kupcu, Alptekin

    2010-01-01

    Peer-to-peer (P2P) systems, and client-server type storage and computation outsourcing constitute some of the major applications that the next generation cloud schemes will address. Since these applications are just emerging, it is the perfect time to design them with security and privacy in mind. Furthermore, considering the high-churn…

  1. Integrated control of next generation power system

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2010-02-28

    The multi-agent system (MAS) approach has been applied with promising results for enhancing an electric power distribution circuit, such as the Circuit of the Future as developed by Southern California Edison. These next generation power system results include better ability to reconfigure the circuit as well as the increased capability to improve the protection and enhance the reliability of the circuit. There were four main tasks in this project. The specific results for each of these four tasks and their related topics are presented in main sections of this report. Also, there were seven deliverables for this project. The main conclusions for these deliverables are summarized in the identified subtask section of this report. The specific details for each of these deliverables are included in the “Project Deliverables” section at the end of this Final Report.

  2. Visual programming for next-generation sequencing data analytics.

    Science.gov (United States)

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  3. Scanning the horizon for high value-add manufacturing science: Accelerating manufacturing readiness for the next generation of disruptive, high-value curative cell therapeutics.

    Science.gov (United States)

    Hourd, Paul; Williams, David J

    2018-05-01

    Since the regenerative medicine sector entered the second phase of its development (RegenMed 2.0) more than a decade ago, there is increasing recognition that current technology innovation trajectories will drive the next translational phase toward the production of disruptive, high-value curative cell and gene-based regenerative medicines. The aim of this report is to identify the manufacturing science problems that must be addressed to permit translation of these next generation therapeutics. In this short report, a long-lens look within the pluripotent stem cell therapeutic space, both embryonic and induced, is used to gain early insight into where critical technology and manufacturing challenges may emerge. This report offers a future perspective on the development and innovation that will be needed within manufacturing science to add value in the production and commercialization of the next generation of advanced cell therapies and precision medicines. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  4. Random number generators for large-scale parallel Monte Carlo simulations on FPGA

    Science.gov (United States)

    Lin, Y.; Wang, F.; Liu, B.

    2018-05-01

    Through parallelization, field programmable gate arrays (FPGA) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGA presents both new constraints and new opportunities for the implementation of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application-based tests, this study evaluates all four of the RNGs used in previous FPGA-based MC studies and newly proposed FPGA implementations of two well-known high-quality RNGs that are suitable for LPMC studies on FPGA. One of the newly proposed FPGA implementations, a parallel version of the additive lagged Fibonacci generator (Parallel ALFG), is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
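
    As context for the additive lagged Fibonacci generator named above, a minimal software sketch of the recurrence x_n = (x_{n-j} + x_{n-k}) mod 2^m (the lags, word width, and seeding below are illustrative assumptions; the paper's FPGA implementation details are not reproduced here):

```python
# Illustrative additive lagged Fibonacci generator (ALFG):
#   x_n = (x_{n-j} + x_{n-k}) mod 2^m
# The state is the last k outputs, held in a circular buffer (on an FPGA
# this would live in registers or block RAM). Lags (5, 17) and m = 32 are
# assumptions for this sketch, not the parameters used in the study.

class ALFG:
    def __init__(self, seed_words, j=5, k=17, m=32):
        assert len(seed_words) == k and 0 < j < k
        self.state = list(seed_words)   # last k outputs, oldest first
        self.j, self.k = j, k
        self.mask = (1 << m) - 1        # reduce mod 2^m
        self.i = 0                      # index of the oldest word, x_{n-k}

    def next(self):
        s, j, k = self.state, self.j, self.k
        # x_{n-j} sits (k - j) positions after the oldest word x_{n-k}.
        x = (s[(self.i + k - j) % k] + s[self.i]) & self.mask
        s[self.i] = x                   # overwrite the oldest word with x_n
        self.i = (self.i + 1) % k
        return x

# At least one odd seed word is needed for a full-period low-order bit.
rng = ALFG([2 * n + 1 for n in range(17)])
sample = [rng.next() for _ in range(5)]
```

    The additive recurrence maps naturally onto hardware: one adder, one k-word memory, and two read pointers per generator, which is why the lagged Fibonacci family is popular for FPGA Monte Carlo work.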

  5. Next generation PWR

    International Nuclear Information System (INIS)

    Tanaka, Toshihiko; Fukuda, Toshihiko; Usui, Shuji

    2001-01-01

    Since commercial nuclear power generation began in Japan in 1970, LWR development for power generation has aimed to upgrade reliability, safety, operability, maintainability and economy, as well as to increase unit capacity, in order to steadily expand generating output. In Japan, the ABWR (advanced BWR), one of the most promising LWRs in the world, is already in actual use, and the APWR (advanced PWR), with the largest output in the world, is also approaching actual use. APWR development in Japan began in the 1980s, with construction of the first unit planned for early this century. However, large changes in social conditions now place strong demands on the economy of nuclear power generation, so the next design is positioned as an improved development of the APWR, promoted by the collaboration of five PWR generation companies and the Mitsubishi Electric Co., Ltd. Its development therefore began with an investigation of the effects of changing social conditions on nuclear power stations, in order to establish design requirements for the next generation PWR. This paper describes the outline, reactor core design, safety concept, and safety evaluation of the APWR+, and the development of an innovative PWR. (G.K.)

  6. Statistical Approaches for Next-Generation Sequencing Data

    OpenAIRE

    Qiao, Dandi

    2012-01-01

    During the last two decades, genotyping technology has advanced rapidly, which enabled the tremendous success of genome-wide association studies (GWAS) in the search of disease susceptibility loci (DSLs). However, only a small fraction of the overall predicted heritability can be explained by the DSLs discovered. One possible explanation for this "missing heritability" phenomenon is that many causal variants are rare. The recent development of high-throughput next-generation sequencing (NGS) ...

  7. Next generation ATCA control infrastructure for the CMS Phase-2 upgrades

    CERN Document Server

    Smith, Wesley; Svetek, Aleš; Tikalsky, Jes; Fobes, Robert; Dasu, Sridhara; Vicente, Marcelo

    2017-01-01

    A next generation control infrastructure to be used in Advanced TCA (ATCA) blades at the CMS experiment is being designed and tested. Several ATCA systems are being prepared for the High-Luminosity LHC (HL-LHC) and will be installed at CMS during technical stops. The next generation control infrastructure will provide all the necessary hardware, firmware and software required in these systems, decreasing development time and increasing flexibility. The complete infrastructure includes an Intelligent Platform Management Controller (IPMC), a Module Management Controller (MMC) and an Embedded Linux Mezzanine (ELM) processing card.

  8. Conceptual design of next generation MTR

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, Hiroshi; Yamaura, Takayuki; Naka, Michihiro; Kawamata, Kazuo; Izumo, Hironobu; Hori, Naohiko; Nagao, Yoshiharu; Kusunoki, Tsuyoshi; Kaminaga, Masanori; Komori, Yoshihiro; Suzuki, Masahide; Kawamura, Hiroshi [Japan Atomic Energy Agency, Oarai Research and Development Center, Oarai, Ibaraki (Japan); Mine, M [Hitachi-GE Nuclear Energy, Ltd., Hitachi, Ibaraki (Japan); Yamazaki, S [Kawasaki Heavy Industries, Ltd., Kobe, Hyogo (Japan); Ishikawa, S [NGK Insulators, Ltd., Nagoya, Aichi (Japan); Miura, K [Sukegawa Electric Co., Ltd., Takahagi, Ibaraki (Japan); Nakashima, S [Fuji Electric Co., Ltd., Tokyo (Japan); Yamaguchi, K [Chiyoda Technol Corp., Tokyo (Japan)

    2012-03-15

    Conceptual design of a high-performance, low-cost next generation materials testing reactor (MTR), which is expected to be constructed in countries introducing nuclear power plants, started in 2010 at JAEA and nuclear-related companies in Japan. The aims of this conceptual design are to achieve a highly safe reactor, an economical design, a high availability factor and advanced irradiation utilization. The basic reactor concept was determined to be a swimming pool type reactor with a thermal power of 10 MW, water cooled and moderated, with plate type fuel elements, the same as the JMTR. It is expected that such research reactors will be used for human resource development, progress of science and technology, expansion of industrial use, lifetime extension of LWRs and so on. (author)

  9. Real-Time Optimization and Control of Next-Generation Distribution

    Science.gov (United States)

    Real-Time Optimization and Control of Next-Generation Distribution Infrastructure | Grid Modernization | NREL: developing a system-theoretic distribution network management framework that unifies real-time voltage and …

  10. Hacking the next generation

    CERN Document Server

    Dhanjani, Nitesh; Hardin, Brett

    2009-01-01

    With the advent of rich Internet applications, the explosion of social media, and the increased use of powerful cloud computing infrastructures, a new generation of attackers has added cunning new techniques to its arsenal. For anyone involved in defending an application or a network of systems, Hacking: The Next Generation is one of the few books to identify a variety of emerging attack vectors. You'll not only find valuable information on new hacks that attempt to exploit technical flaws, you'll also learn how attackers take advantage of individuals via social networking sites, and abuse

  11. The study of methodologies of software development for the next generation of HEP detector software

    International Nuclear Information System (INIS)

    Ding Yuzheng; Wang Taijie; Dai Guiliang

    1997-01-01

    The authors discuss the characteristics of the next generation of HEP (High Energy Physics) detector software, and describe the basic strategy for the use of object oriented methodologies, languages and tools in the development of the next generation of HEP detector software

  12. Towards High-throughput Immunomics for Infectious Diseases: Use of Next-generation Peptide Microarrays for Rapid Discovery and Mapping of Antigenic Determinants

    DEFF Research Database (Denmark)

    J. Carmona, Santiago; Nielsen, Morten; Schafer-Nielsen, Claus

    2015-01-01

    We developed a highly-multiplexed platform based on next-generation high-density peptide microarrays to map these specificities in Chagas Disease, an exemplar of a human infectious disease caused by the protozoan Trypanosoma cruzi. We designed a high-density peptide microarray containing more than...

  13. Next Generation Summer School

    Science.gov (United States)

    Eugenia, Marcu

    2013-04-01

    On 21.06.2010 the "Next Generation" Summer School opened its doors for its first students. They were introduced to the world of astronomy through astronomical observations, astronomy and radio-astronomy lectures, and laboratory projects meant to initiate them into modern radio astronomy and radio communications. The didactic programme was structured as follows: 1) astronomical elements from the visible spectrum (lectures + practical projects); 2) radio astronomy elements (lectures + practical projects); 3) radio communication basics (didactic and recreational games). The students and professors were accommodated at the agrotouristic pension "Popasul Iancului", situated 800 m from the Marisel Observatory. The first day (the summer solstice) began with a practical activity: determination of the meridian by measurements of a shadow (the direction of a vertical alignment when it has the smallest length). The experiment is very instructive and interesting because it combines notions of physics, spatial geometry and basic astronomy. The next day the activities took place in four stages: the students processed the experimental data obtained on the first day (on sheets of millimetre paper they plotted the length of the shadow alignments against time); each team built its own sundial, with points given for the design and functionality of the sundial; the four teams had to mimic important constellations on cardboard with phosphorescent sticky stars; and the students, accompanied by the professors, took a hiking trip in the surroundings, marking points of interest with a GPS to establish their geographical coordinates. At the end of the day the students produced a small map of the central Marisel area based on the GPS data.
    On the third day, the students were introduced to basic notions of radio astronomy and the principal categories of artificial Earth satellites: low orbit satellites (LEO), medium orbit satellites (MEO) and geostationary satellites (GEO)
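
    The first-day shadow experiment can be sketched numerically: a vertical gnomon's shadow is shortest at local solar noon, and at that moment it points along the local meridian. The latitude, declination, gnomon height and sampling below are illustrative assumptions (Marisel lies near 46.7° N, and the June-solstice declination is about +23.4°):

```python
# Finding the meridian from the shortest shadow of a vertical gnomon.
# Latitude, declination and sampling are illustrative assumptions.
import math

def sun_altitude(lat_deg, dec_deg, hour_angle_deg):
    """Solar altitude from the standard formula
    sin(alt) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(H)."""
    lat, dec, H = map(math.radians, (lat_deg, dec_deg, hour_angle_deg))
    return math.asin(math.sin(lat) * math.sin(dec)
                     + math.cos(lat) * math.cos(dec) * math.cos(H))

def shadow_length(gnomon_height, altitude_rad):
    """Length of the shadow cast by a vertical stick of the given height."""
    return gnomon_height / math.tan(altitude_rad)

lat, dec, h = 46.7, 23.4, 1.0   # degrees, degrees, metres (assumed)
# Hour angle H sweeps 15 degrees per hour; H = 0 is local solar noon.
samples = [(H / 15.0, shadow_length(h, sun_altitude(lat, dec, H)))
           for H in range(-60, 61)]   # -4 h .. +4 h around noon
best_offset_hours, best_len = min(samples, key=lambda t: t[1])
# The minimum falls at hour angle 0, i.e. local solar noon, when the
# shadow direction marks the north-south meridian line.
```

    This is exactly what plotting shadow length against time on millimetre paper shows: a symmetric curve whose minimum identifies solar noon and, with it, the meridian direction.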

  14. Tablet—next generation sequence assembly visualization

    Science.gov (United States)

    Milne, Iain; Bayer, Micha; Cardle, Linda; Shaw, Paul; Stephen, Gordon; Wright, Frank; Marshall, David

    2010-01-01

    Summary: Tablet is a lightweight, high-performance graphical viewer for next-generation sequence assemblies and alignments. Supporting a range of input assembly formats, Tablet provides high-quality visualizations showing data in packed or stacked views, allowing instant access and navigation to any region of interest, and whole contig overviews and data summaries. Tablet is both multi-core aware and memory efficient, allowing it to handle assemblies containing millions of reads, even on a 32-bit desktop machine. Availability: Tablet is freely available for Microsoft Windows, Apple Mac OS X, Linux and Solaris. Fully bundled installers can be downloaded from http://bioinf.scri.ac.uk/tablet in 32- and 64-bit versions. Contact: tablet@scri.ac.uk PMID:19965881

  15. Development of internal CRD for next generation BWR-endurance and robustness tests of ball-bearing materials in high-pressure and high-temperature water

    International Nuclear Information System (INIS)

    Shoji Goto; Shuichi Ohmori; Michitsugu Mori; Shohei Kawano; Tadashi Narabayashi; Shinichi Ishizato

    2005-01-01

    An internal CRD using a heatproof ceramic-insulated coil is under development to make the next-generation BWR competitive and higher in performance. For the 1700 MWe next generation BWR adopting the internal CRDs, the reactor pressure vessel is almost equivalent to that of the 1356 MWe ABWR. Endurance and robustness tests were carried out in order to confirm the durability of the bearing for the internal CRD. The durability of the ball bearing for the internal CRD was evaluated in high-pressure, high-temperature reactor water under current BWR conditions. The experimental results confirmed durability for the number of rotations corresponding to an operating life of 60 years. Crud was added to the water to confirm the robustness of the ball bearing; the test results showed good robustness even under high-density crud conditions compared with current BWRs. This program is conducted as one of the selected offers for the advertised technical developments of the Institute of Applied Energy founded by METI (Ministry of Economy, Trade and Industry) of Japan. (authors)

  16. The NASA Next Generation Stirling Technology Program Overview

    Science.gov (United States)

    Schreiber, J. G.; Shaltens, R. K.; Wong, W. A.

    2005-12-01

    NASA's Science Mission Directorate is developing the next generation of Stirling technology for future Radioisotope Power Systems (RPS) for surface and deep space missions. The next generation Stirling convertor is one of two advanced power conversion technologies currently being developed for future NASA missions, and is capable of operating both in planetary atmospheres and in deep space environments. The Stirling convertor (a free-piston engine integrated with a linear alternator) produces about 90 We(ac) and has a specific power of about 90 We/kg. Operating conditions of T_hot at 850 degrees C and T_rej at 90 degrees C result in an estimated convertor efficiency of about 40 percent. Using the next generation Stirling convertor in a future RPS, the system specific power is estimated at 8 We/kg. The design lifetime is three years on the surface of Mars and fourteen years for deep space missions. Electrical power of about 160 We (BOM) is produced by two free-piston Stirling convertors heated by two General Purpose Heat Source (GPHS) modules. This development is being performed by Sunpower, Athens, OH, with Pratt & Whitney Rocketdyne, Canoga Park, CA, under contract to Glenn Research Center (GRC), Cleveland, Ohio. GRC is guiding the independent testing and technology development for the next generation Stirling generator.
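
    The power figures quoted in the record are internally consistent, as a quick back-of-envelope check shows (the roughly 250 Wt nominal thermal output of a GPHS module is background knowledge, not stated in the record):

```python
# Back-of-envelope check of the record's figures: two ~90 We(ac)
# convertors at ~40% efficiency, and 8 We/kg system specific power.

electrical_per_convertor = 90.0   # We(ac), quoted in the record
efficiency = 0.40                 # ~40% convertor efficiency, quoted

# Thermal input each convertor needs: 90 / 0.40 = 225 Wt, which fits
# comfortably within one GPHS module (~250 Wt nominal, background value).
heat_input = electrical_per_convertor / efficiency

system_power = 160.0              # We at beginning of mission (BOM), quoted
specific_power = 8.0              # We/kg at the system level, quoted

# Implied RPS mass estimate: 160 / 8 = 20 kg.
system_mass = system_power / specific_power
```

    So the two-convertor, two-module arrangement and the 8 We/kg system figure imply a generator of roughly 20 kg, with a small thermal margin per GPHS module.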

  17. High diagnostic yield of syndromic intellectual disability by targeted next-generation sequencing.

    Science.gov (United States)

    Martínez, Francisco; Caro-Llopis, Alfonso; Roselló, Mónica; Oltra, Silvestre; Mayo, Sonia; Monfort, Sandra; Orellana, Carmen

    2017-02-01

    Intellectual disability is a very complex condition where more than 600 genes have been reported. Due to this extraordinary heterogeneity, a large proportion of patients remain without a specific diagnosis and genetic counselling. The need for new methodological strategies in order to detect a greater number of mutations in multiple genes is therefore crucial. In this work, we screened a large panel of 1256 genes (646 pathogenic, 610 candidate) by next-generation sequencing to determine the molecular aetiology of syndromic intellectual disability. A total of 92 patients, negative for previous genetic analyses, were studied together with their parents. Clinically relevant variants were validated by conventional sequencing. A definitive diagnosis was achieved in 29 families by testing the 646 known pathogenic genes. Mutations were found in 25 different genes, where only the genes KMT2D, KMT2A and MED13L were found mutated in more than one patient. A preponderance of de novo mutations was noted even among the X linked conditions. Additionally, seven de novo probably pathogenic mutations were found in the candidate genes AGO1, JARID2, SIN3B, FBXO11, MAP3K7, HDAC2 and SMARCC2. Altogether, this means a diagnostic yield of 39% of the cases (95% CI 30% to 49%). The developed panel proved to be efficient and suitable for the genetic diagnosis of syndromic intellectual disability in a clinical setting. Next-generation sequencing has the potential for high-throughput identification of genetic variations, although the challenges of an adequate clinical interpretation of these variants and the knowledge on further unknown genes causing intellectual disability remain to be solved. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
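
    The quoted diagnostic yield can be checked numerically: 29 known-gene plus 7 candidate-gene diagnoses out of 92 families. A Wilson score interval reproduces the published 95% CI of 30% to 49% (the choice of the Wilson interval is an assumption of this sketch; the record does not state which method the authors used):

```python
# Reproducing the quoted diagnostic yield and its 95% CI with a Wilson
# score interval for a binomial proportion.
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (z=1.96 -> 95%)."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

diagnosed = 29 + 7   # known pathogenic genes + candidate genes
lo, hi = wilson_ci(diagnosed, 92)
print(f"yield = {diagnosed / 92:.0%}, 95% CI {lo:.0%} to {hi:.0%}")
# -> yield = 39%, 95% CI 30% to 49%
```

    The Wilson interval is a common default for proportions at this sample size; a simple normal approximation would give a slightly different, symmetric interval.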

  18. Next-Generation Sequencing of Antibody Display Repertoires

    Directory of Open Access Journals (Sweden)

    Romain Rouet

    2018-02-01

    Full Text Available In vitro selection technology has transformed the development of therapeutic monoclonal antibodies. Using methods such as phage, ribosome, and yeast display, high affinity binders can be selected from diverse repertoires. Here, we review strategies for the next-generation sequencing (NGS of phage- and other antibody-display libraries, as well as NGS platforms and analysis tools. Moreover, we discuss recent examples relating to the use of NGS to assess library diversity, clonal enrichment, and affinity maturation.

  19. Physical Configuration of the Next Generation Home Network

    Science.gov (United States)

    Terada, Shohei; Kakishima, Yu; Hanawa, Dai; Oguchi, Kimio

    The number of broadband users is rapidly increasing worldwide. Japan already has over 10 million FTTH users. Another trend is the rapid digitalization of home electrical equipment, e.g. digital cameras and hard disc recorders. These trends will encourage the emergence of the next generation home network. In this paper, we introduce an image of the next generation home network and describe the five domains into which home devices can be classified. We then clarify the optimum medium with which to configure the network given the requirements imposed by the home environment. Wiring cable lengths for three network topologies are calculated. The results gained from the next generation home network implemented on the first phase testbed are shown. Finally, our conclusions are given.

  20. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    Science.gov (United States)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets, will be presented.

  1. Next-Generation Ultra-Compact Stowage/Lightweight Solar Array System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Deployable Space Systems, Inc. (DSS) has developed a next-generation high performance solar array system that has game-changing performance metrics in terms of...

  2. Parallel grid generation algorithm for distributed memory computers

    Science.gov (United States)

    Moitra, Stuti; Moitra, Anutosh

    1994-01-01

    A parallel grid-generation algorithm and its implementation on the Intel iPSC/860 computer are described. The grid-generation scheme is based on an algebraic formulation of homotopic relations. Methods for utilizing the inherent parallelism of the grid-generation scheme are described, and the implementation of multiple levels of parallelism on multiple-instruction multiple-data machines is indicated. The algorithm is capable of providing near orthogonality and spacing control at solid boundaries while requiring minimal interprocessor communications. Results obtained on the Intel hypercube for a blended wing-body configuration are used to demonstrate the effectiveness of the algorithm. Fortran implementations based on the native programming model of the iPSC/860 computer and the Express system of software tools are reported. Computational gains in execution time speed-up ratios are given.

  3. Parallel processing of genomics data

    Science.gov (United States)

    Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-10-01

    The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, and the analysis of this enormous flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face those issues, efficient, possibly parallel, bioinformatics software needs to be used to preprocess and analyze data, for instance to highlight genetic variation associated with complex diseases. In this paper we present a parallel algorithm for the preprocessing and statistical analysis of genomics data that handles high-dimensional data with good response times. The proposed system can find statistically significant biological markers that discriminate classes of patients who respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
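    The per-marker statistical screening described in the abstract parallelizes naturally, since each marker is tested independently. The following is a minimal sketch of that idea, not the authors' actual software; the marker names, values, and significance threshold are invented for illustration:

    ```python
    from concurrent.futures import ThreadPoolExecutor
    from statistics import mean, stdev

    def t_statistic(case_vals, control_vals):
        """Welch-style t statistic comparing two groups of expression values."""
        m1, m2 = mean(case_vals), mean(control_vals)
        v1, v2 = stdev(case_vals) ** 2, stdev(control_vals) ** 2
        n1, n2 = len(case_vals), len(control_vals)
        return (m1 - m2) / ((v1 / n1 + v2 / n2) ** 0.5)

    # Toy dataset: expression values per marker for two drug-response classes.
    data = {
        "marker_A": ([5.1, 5.3, 5.0, 5.2], [2.0, 2.2, 1.9, 2.1]),  # clearly different
        "marker_B": ([3.0, 3.1, 2.9, 3.2], [3.1, 3.0, 3.2, 2.9]),  # indistinguishable
    }

    def score(item):
        name, (cases, controls) = item
        return name, t_statistic(cases, controls)

    # Each marker is scored independently, so the work parallelizes trivially.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = dict(pool.map(score, data.items()))

    significant = [m for m, t in results.items() if abs(t) > 4.0]
    print(significant)  # only the clearly different marker survives the cutoff
    ```

    A production system would use process-based parallelism and proper multiple-testing correction; the sketch only shows why the per-marker structure scales.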

  4. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    Science.gov (United States)

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  5. Next generation vaccines.

    Science.gov (United States)

    Riedmann, Eva M

    2011-07-01

    In February this year, about 100 delegates gathered for three days in Vienna (Austria) for the Next Generation Vaccines conference. The meeting held in the Vienna Hilton Hotel from 23rd-25th February 2011 had a strong focus on biotech and industry. The conference organizer Jacob Fleming managed to put together a versatile program ranging from the future generation of vaccines to manufacturing, vaccine distribution and delivery, to regulatory and public health issues. Carefully selected top industry experts presented first-hand experience and shared solutions for overcoming the latest challenges in the field of vaccinology. The program also included several case study presentations on novel vaccine candidates in different stages of development. An interactive pre-conference workshop as well as interactive panel discussions during the meeting allowed all delegates to gain new knowledge and become involved in lively discussions on timely, interesting and sometimes controversial topics related to vaccines.

  6. NOAA NEXt-Generation RADar (NEXRAD) Products

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of Level III weather radar products collected from Next-Generation Radar (NEXRAD) stations located in the contiguous United States, Alaska,...

  7. Architectural and Algorithmic Requirements for a Next-Generation System Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    V.A. Mousseau

    2010-05-01

    This document presents high-level architectural and system requirements for a next-generation system analysis code (NGSAC) to support reactor safety decision-making by plant operators and others, especially in the context of light water reactor plant life extension. The capabilities of NGSAC will be different from those of current-generation codes, not only because computers have evolved significantly in the generations since the current paradigm was first implemented, but because the decision-making processes that need the support of next-generation codes are very different from the decision-making processes that drove the licensing and design of the current fleet of commercial nuclear power reactors. The implications of these newer decision-making processes for NGSAC requirements are discussed, and resulting top-level goals for the NGSAC are formulated. From these goals, the general architectural and system requirements for the NGSAC are derived.

  8. The Next Generation Science Standards

    Science.gov (United States)

    Pruitt, Stephen L.

    2015-01-01

    The Next Generation Science Standards (NGSS Lead States 2013) were released almost two years ago. Work tied to the NGSS, their adoption, and implementation continues to move forward around the country. Stephen L. Pruitt, senior vice president, science, at Achieve, an independent, nonpartisan, nonprofit education reform organization that was a lead…

  9. GeneLab for High Schools: Data Mining for the Next Generation

    Science.gov (United States)

    Blaber, Elizabeth A.; Ly, Diana; Sato, Kevin Y.; Taylor, Elizabeth

    2016-01-01

    Modern biological sciences have become increasingly based on molecular biology and high-throughput molecular techniques, such as genomics, transcriptomics, and proteomics. NASA scientists and the NASA Space Biology Program have aimed to examine the fundamental building blocks of life (RNA, DNA and protein) in order to understand the response of living organisms to space and aid in fundamental research discoveries on Earth. In an effort to make NASA-funded science available to everyone, NASA has collected the data from omics studies and curated them in a data system called GeneLab. While most college-level interns, academics and other scientists have had some interaction with omics data sets and analysis tools, high school students often have not. Therefore, the Space Biology Program is implementing a new summer program for high school students that aims to inspire the next generation of scientists to learn about and get involved in space research using GeneLab's Data System. The program consists of three main components: core learning modules, focused on developing students' knowledge of the Space Biology Program and Space Biology research, GeneLab and the data system, and previous research conducted on model organisms in space; networking and teamwork, enabling students to interact with guest lecturers from local universities and their fellow peers, and to visit local universities and genomics centers around the Bay Area; and finally an independent learning project, in which students form small groups, analyze a dataset on the GeneLab platform, generate a hypothesis and develop a research plan to test it. This program will not only help inspire high school students to become involved in space-based research but will also help them develop key critical thinking and bioinformatics skills required for most college degrees and, furthermore, will enable them to establish networks with their peers and connections

  10. Next generation science standards available for comment

    Science.gov (United States)

    Asher, Pranoti

    2012-05-01

    The first public draft of the Next Generation Science Standards (NGSS) is now available for public comment. Feedback on the standards is sought from people who have a stake in science education, including individuals in the K-12, higher education, business, and research communities. Development of NGSS is a state-led effort to define the content and practices students need to learn from kindergarten through high school. NGSS will be based on the U.S. National Research Council's report Framework for K-12 Science Education.

  11. NeSSM: a Next-generation Sequencing Simulator for Metagenomics.

    Directory of Open Access Journals (Sweden)

    Ben Jia

    Full Text Available BACKGROUND: Metagenomics can reveal the vast majority of microbes that have been missed by traditional cultivation-based methods. Due to its extremely wide range of application areas, fast metagenome sequencing simulation systems with high fidelity are in great demand to facilitate the development and comparison of metagenomics analysis tools. RESULTS: We present here a customizable metagenome simulation system: NeSSM (Next-generation Sequencing Simulator for Metagenomics). Combining complete genomes currently available, a community composition table, and sequencing parameters, it can simulate metagenome sequencing better than existing systems. Sequencing error models based on the explicit distribution of errors at each base, as well as sequencing coverage bias, are incorporated in the simulation. In order to improve the fidelity of simulation, tools are provided by NeSSM to estimate the sequencing error models, sequencing coverage bias and the community composition directly from existing metagenome sequencing data. Currently, NeSSM supports single-end and pair-end sequencing for both 454 and Illumina platforms. In addition, a GPU (graphics processing unit) version of NeSSM has also been developed to accelerate the simulation. By comparing the simulated sequencing data from NeSSM with experimental metagenome sequencing data, we have demonstrated that NeSSM performs better in many aspects than existing popular metagenome simulators, such as MetaSim, GemSIM and Grinder. The GPU version of NeSSM is more than one order of magnitude faster than MetaSim. CONCLUSIONS: NeSSM is a fast simulation system for high-throughput metagenome sequencing. It can be helpful to develop tools and evaluate strategies for metagenomics analysis, and it is freely available for academic users at http://cbb.sjtu.edu.cn/~ccwei/pub/software/NeSSM.php.
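    The per-base substitution error model mentioned in the abstract can be illustrated with a toy read simulator. This is a hedged sketch, not NeSSM's actual code; the genome, read length, and 1% error rate are invented, and real simulators also model indels, quality scores, and coverage bias:

    ```python
    import random

    BASES = "ACGT"

    def simulate_read(genome, start, length, per_base_error, rng):
        """Copy a genome substring, substituting each base with a random
        different base with the given per-position error probability."""
        read = []
        for base in genome[start:start + length]:
            if rng.random() < per_base_error:
                base = rng.choice([b for b in BASES if b != base])
            read.append(base)
        return "".join(read)

    rng = random.Random(42)  # fixed seed so the simulation is reproducible
    genome = "".join(rng.choice(BASES) for _ in range(1000))

    # Uniform read starts; a coverage-bias model would skew this distribution.
    starts = [rng.randrange(len(genome) - 100) for _ in range(200)]
    reads = [simulate_read(genome, s, 100, 0.01, rng) for s in starts]

    # Measure the realized substitution rate against the reference.
    errors = sum(
        rb != gb
        for s, read in zip(starts, reads)
        for rb, gb in zip(read, genome[s:s + 100])
    )
    print(errors / (200 * 100))  # close to the requested 1% error rate
    ```

    Fitting `per_base_error` per position from real data, as NeSSM does, replaces the single constant with an empirical error profile.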

  12. Statistical analysis of next generation sequencing data

    CERN Document Server

    Nettleton, Dan

    2014-01-01

    Next Generation Sequencing (NGS) is the latest high throughput technology to revolutionize genomic research. NGS generates massive genomic datasets that play a key role in the big data phenomenon that surrounds us today. To extract signals from high-dimensional NGS data and make valid statistical inferences and predictions, novel data analytic and statistical techniques are needed. This book contains 20 chapters written by prominent statisticians working with NGS data. The topics range from basic preprocessing and analysis with NGS data to more complex genomic applications such as copy number variation and isoform expression detection. Research statisticians who want to learn about this growing and exciting area will find this book useful. In addition, many chapters from this book could be included in graduate-level classes in statistical bioinformatics for training future biostatisticians who will be expected to deal with genomic data in basic biomedical research, genomic clinical trials and personalized med...

  13. Pharmacokinetic and pharmacodynamic considerations for the next generation protein therapeutics.

    Science.gov (United States)

    Shah, Dhaval K

    2015-10-01

    Increasingly sophisticated protein engineering efforts have been undertaken lately to generate protein therapeutics with desired properties. This has resulted in the discovery of the next generation of protein therapeutics, which include: engineered antibodies, immunoconjugates, bi/multi-specific proteins, antibody mimetic novel scaffolds, and engineered ligands/receptors. These novel protein therapeutics possess unique physicochemical properties and act via a unique mechanism-of-action, which collectively makes their pharmacokinetics (PK) and pharmacodynamics (PD) different than other established biological molecules. Consequently, in order to support the discovery and development of these next generation molecules, it becomes important to understand the determinants controlling their PK/PD. This review discusses the determinants that a PK/PD scientist should consider during the design and development of next generation protein therapeutics. In addition, the role of systems PK/PD models in enabling rational development of the next generation protein therapeutics is emphasized.

  14. NIRS report of investigations for the development of the next generation PET apparatus. FY 2002

    International Nuclear Information System (INIS)

    2003-03-01

    The present status of studies conducted by representative technology fields for the development of the next generation PET apparatus, and the summary of opinions given by investigators of nuclear medicine are reported. The former involves chapters of: Summary of representative technologies for the development of the next generation PET apparatus; Count rate analysis of PET apparatuses for the whole body and small animals by PET simulator; Scintillator; DOI (depth of interaction) detector-evaluation of the detector with 256-ch fluorescence polarization-photomultiplier tubes (FP-PMT) trial apparatus etc; Examination of multi-slice DOI-MR compatible detector for PET; Development of application specific integrated circuit (ASIC) for processing the front-end signals; Detector simulation; Circuit for processing PET detector signals; Signal processing-coincidence circuit; Data collection system; Signal processing technology for the next generation PET; Reconstruction of statistical PET image using DOI signals; Monte Carlo simulation and Unique directions-PET for infants and for the whole body autonomic nervous systems and mental activity; and Actual design and evaluation of image reconstruction by statistical means. Opinions are: Progress of clinical PET apparatus; Desirable PET drugs and apparatuses; From clinical practice for the development of the next generation PET apparatus; From clinical psychiatric studies for the development; From application of drug development and basic researches; From brain PET practice; From clinical PET practice; and The role of National Institute of Radiological Sciences (NIRS) in PET development. Also involved is the publication list. (N.I.)

  15. Next generation advanced nuclear reactor designs

    International Nuclear Information System (INIS)

    Turgut, M. H.

    2009-01-01

    Growing energy demand, driven by technological developments and the increase of the world population, together with gradually diminishing energy resources, has made nuclear power an indispensable option. Renewable energy sources like solar, wind and geothermal may be suited to meeting some local needs. Environmentally friendly nuclear energy, a suitable solution to large-scale demands, is evolving toward highly economical, advanced next generation reactors that incorporate technological developments and years of operating experience. The enhancement of safety and reliability, facilitation of maintainability, and impeccable compatibility with the environment are the goals of the new generation reactors. The protection of investment and property is considered, as well as the protection of the environment and mankind. These designs became economically attractive compared to fossil-fired units through the use of standard designs, the replacement of some active systems by passive ones, reduced construction time and increased operating lifetime. Evolutionary designs were introduced first by improving conventional plants; revolutionary systems, denoted Generation IV, were then conceived to meet future needs. Investigations on advanced, proliferation-resistant fuel cycle technologies were initiated to minimize the radioactive waste burden by using new generation fast reactors and ADS transmuters.

  16. IPv6: The Next Generation Internet Protocol

    Indian Academy of Sciences (India)

    addressing, new generation internet. 2. ... required the creation of the next generation of Internet ... IPv6 standards have defined the following Extension headers ..... addresses are represented as x:x:x:x:x:x:x:x, where each x is the hexadecimal ...
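    The x:x:x:x:x:x:x:x textual form mentioned in the excerpt (eight 16-bit groups written in hexadecimal, with the longest run of zero groups compressible to "::") can be demonstrated with Python's standard ipaddress module; the example address below is an arbitrary documentation-range address, not one from the article:

    ```python
    import ipaddress

    addr = ipaddress.IPv6Address("2001:0db8:0000:0000:0000:0000:0000:0001")

    # Canonical compressed form: leading zeros dropped, zero run collapsed to "::".
    print(addr.compressed)     # 2001:db8::1
    # Fully expanded form: all eight 16-bit hexadecimal groups.
    print(addr.exploded)       # 2001:0db8:0000:0000:0000:0000:0000:0001
    # IPv6 addresses are 128 bits long: 8 groups x 16 bits.
    print(addr.max_prefixlen)  # 128
    ```

    The vastly larger 128-bit address space is precisely what resolves the IPv4 exhaustion problem the record alludes to.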

  17. Next Generation Workload Management and Analysis System for Big Data

    Energy Technology Data Exchange (ETDEWEB)

    De, Kaushik [Univ. of Texas, Arlington, TX (United States)

    2017-04-24

    We report on the activities and accomplishments of a four-year project (a three-year grant followed by a one-year no cost extension) to develop a next generation workload management system for Big Data. The new system is based on the highly successful PanDA software developed for High Energy Physics (HEP) in 2005. PanDA is used by the ATLAS experiment at the Large Hadron Collider (LHC), and the AMS experiment at the space station. The program of work described here was carried out by two teams of developers working collaboratively at Brookhaven National Laboratory (BNL) and the University of Texas at Arlington (UTA). These teams worked closely with the original PanDA team – for the sake of clarity the work of the next generation team will be referred to as the BigPanDA project. Their work has led to the adoption of BigPanDA by the COMPASS experiment at CERN, and many other experiments and science projects worldwide.

  18. Conception of Next Generation Networks

    Directory of Open Access Journals (Sweden)

    Slavko Šarić

    2004-11-01

    tool for the realization of additional services and for enabling the control in NGN. The problem of IP routers for NGN has also been mentioned, as well as the importance of the new core generation of optical networks. The conceptual framework of NGN is based today on IP/ATM transport technology, which is at this level of development generally accepted as the optimal transport solution. The problem of addressing caused by the insufficient address space of IPv4 has been stressed, and the solution of that problem has been anticipated with the introduction of IPv6 technology, which, due to its complexity and high costs, would be gradually introduced into the system by a dual approach. The differentiating elements of NGN in relation to the existing networks have been specially pointed out. The modular, that is, plane nature of the NGN conception in relation to the vertical and hierarchical conception of PSTN has been stressed, as well as the privileges that this open conception offers when choosing the equipment of the highest quality by different manufacturers. Both existing voice (TDM) and data (NGN, ATM/IP) networks will act in parallel in the next years until new solutions to NGN have been introduced.

  19. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    International Nuclear Information System (INIS)

    Nash, T.

    1989-05-01

    This paper describes the present and expected future development of distributed memory parallel computers for high energy physics experiments. It covers the use of event parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXES. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC systems, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point to point, rather than bussed, communication will be required. Developments in this direction are described. 6 figs

  20. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    International Nuclear Information System (INIS)

    Nash, T.

    1989-01-01

    This paper describes the present and expected future development of distributed memory parallel computers for high energy physics experiments. It covers the use of event parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXES. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC systems, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point to point, rather than bussed, communication will be required. Developments in this direction are described. (orig.)

  1. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    Science.gov (United States)

    Nash, Thomas

    1989-12-01

    This paper describes the present and expected future development of distributed memory parallel computers for high energy physics experiments. It covers the use of event parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXES. These systems have proven very cost effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC system, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point to point, rather than bussed, communication will be required. Developments in this direction are described.
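    The event-parallel farm model described in these records relies on events being complete, independent units of work that identical workers can process with no inter-event communication. A schematic stand-in for such a farm, with an invented "event" record and a trivial reconstruction step in place of real detector code, might look like:

    ```python
    import queue
    import threading

    def reconstruct(event):
        """Stand-in for per-event reconstruction: sum the raw hit energies."""
        return event["id"], sum(event["hits"])

    # Synthetic events; in a real farm these stream in from the experiment.
    events = [{"id": i, "hits": [i, i + 1, i + 2]} for i in range(100)]

    tasks = queue.Queue()
    for ev in events:
        tasks.put(ev)

    results = {}
    lock = threading.Lock()

    def worker():
        # Each worker pulls whole events. Because no communication between
        # events is needed, the farm scales by simply adding workers.
        while True:
            try:
                ev = tasks.get_nowait()
            except queue.Empty:
                return
            eid, energy = reconstruct(ev)
            with lock:
                results[eid] = energy

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(len(results))  # every event processed exactly once
    ```

    The ACP farms realized the same pattern in hardware, with a dispatcher feeding events to racks of microprocessors instead of a thread pool.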

  2. Next-generation wireless technologies 4G and beyond

    CERN Document Server

    Chilamkurti, Naveen; Chaouchi, Hakima

    2013-01-01

    This comprehensive text/reference examines the various challenges to secure, efficient and cost-effective next-generation wireless networking. Topics and features: presents the latest advances, standards and technical challenges in a broad range of emerging wireless technologies; discusses cooperative and mesh networks, delay tolerant networks, and other next-generation networks such as LTE; examines real-world applications of vehicular communications, broadband wireless technologies, RFID technology, and energy-efficient wireless communications; introduces developments towards the 'Internet o

  3. GASPRNG: GPU accelerated scalable parallel random number generator library

    Science.gov (United States)

    Gao, Shuang; Peterson, Gregory D.

    2013-04-01

    Graphics processors represent a promising technology for accelerating computational science applications. Many computational science applications require fast and scalable random number generation with good statistical properties, so they use the Scalable Parallel Random Number Generators library (SPRNG). We present the GPU Accelerated SPRNG library (GASPRNG) to accelerate SPRNG in GPU-based high performance computing systems. GASPRNG includes code for a host CPU and CUDA code for execution on NVIDIA graphics processing units (GPUs) along with a programming interface to support various usage models for pseudorandom numbers and computational science applications executing on the CPU, GPU, or both. This paper describes the implementation approach used to produce high performance and also describes how to use the programming interface. The programming interface allows a user to be able to use GASPRNG the same way as SPRNG on traditional serial or parallel computers as well as to develop tightly coupled programs executing primarily on the GPU. We also describe how to install GASPRNG and use it. To help illustrate linking with GASPRNG, various demonstration codes are included for the different usage models. GASPRNG on a single GPU shows up to 280x speedup over SPRNG on a single CPU core and is able to scale for larger systems in the same manner as SPRNG. Because GASPRNG generates identical streams of pseudorandom numbers as SPRNG, users can be confident about the quality of GASPRNG for scalable computational science applications. Catalogue identifier: AEOI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOI_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: UTK license. No. of lines in distributed program, including test data, etc.: 167900 No. of bytes in distributed program, including test data, etc.: 1422058 Distribution format: tar.gz Programming language: C and CUDA. 
Computer: Any PC or
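    The reproducibility property claimed for GASPRNG, identical pseudorandom streams whether the code runs serially or in parallel, rests on deriving each logical stream's seed deterministically from a master seed and a stream id, so the draw sequence depends only on the stream id and not on which worker runs it. A stdlib-only sketch of that idea (not GASPRNG's actual lagged-Fibonacci scheme):

    ```python
    import random

    MASTER_SEED = 12345  # arbitrary master seed for the whole computation

    def stream(stream_id, n):
        """Independent generator per logical stream, seeded from
        (master seed, stream id) rather than from the worker identity."""
        rng = random.Random(f"{MASTER_SEED}:{stream_id}")
        return [rng.random() for _ in range(n)]

    # Simulate assigning streams 0..3 to workers in two different orders,
    # as a scheduler might on different runs or different machine sizes.
    serial_order = {i: stream(i, 5) for i in range(4)}
    shuffled_order = {i: stream(i, 5) for i in reversed(range(4))}

    # Per-stream output is identical regardless of scheduling order.
    print(serial_order == shuffled_order)  # True
    ```

    Libraries like SPRNG additionally guarantee statistical independence between the streams, which naive re-seeding does not; the sketch only illustrates the reproducibility mechanism.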

  4. R&D, Marketing, and the Success of Next-Generation Products

    OpenAIRE

    Elie Ofek; Miklos Sarvary

    2003-01-01

    This paper studies dynamic competition in markets characterized by the introduction of technologically advanced next-generation products. Firms invest in new product effort in an attempt to attain industry leadership, thus securing high profits and benefiting from advantages relevant for the success of future product generations. The analysis reveals that when the current leader possesses higher research and development (R&D) competence, it tends to invest more in R&D than rivals and to retain its ...

  5. MAP3D: a media processor approach for high-end 3D graphics

    Science.gov (United States)

    Darsa, Lucia; Stadnicki, Steven; Basoglu, Chris

    1999-12-01

    Equator Technologies, Inc. has used a software-first approach to produce several programmable and advanced VLIW processor architectures that have the flexibility to run both traditional systems tasks and an array of media-rich applications. For example, Equator's MAP1000A is the world's fastest single-chip programmable signal and image processor targeted for digital consumer and office automation markets. The Equator MAP3D is a proposal for the architecture of the next generation of the Equator MAP family. The MAP3D is designed to achieve high-end 3D performance and a variety of customizable special effects by combining special graphics features with high performance floating-point and media processor architecture. As a programmable media processor, it offers the advantages of a completely configurable 3D pipeline--allowing developers to experiment with different algorithms and to tailor their pipeline to achieve the highest performance for a particular application. With the support of Equator's advanced C compiler and toolkit, MAP3D programs can be written in a high-level language. This allows the compiler to successfully find and exploit any parallelism in a programmer's code, thus decreasing the time to market of a given application. The ability to run an operating system makes it possible to run concurrent applications in the MAP3D chip, such as video decoding while executing the 3D pipelines, so that integration of applications is easily achieved--using real-time decoded imagery for texturing 3D objects, for instance. This novel architecture enables an affordable, integrated solution for high performance 3D graphics.

  6. Perspectives on the development of next generation reactor systems safety analysis codes

    International Nuclear Information System (INIS)

    Zhang, H.

    2015-01-01

    'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as the design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s, and it has become necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, new challenges arise in numerical simulations of reactor systems, such as long-lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know-how) in two-phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have a limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast-changing computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  7. Perspectives on the development of next generation reactor systems safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-07-01

    'Full text:' Existing reactor system analysis codes, such as RELAP5-3D and TRAC, have gained worldwide success in supporting reactor safety analyses, as well as the design and licensing of new reactors. These codes are important assets to the nuclear engineering research community, as well as to the nuclear industry. However, most of these codes were originally developed during the 1970s, and it has become necessary to develop next-generation reactor system analysis codes for several reasons. Firstly, as new reactor designs emerge, new challenges arise in numerical simulations of reactor systems, such as long-lasting transients and multi-physics phenomena. These new requirements are beyond the range of applicability of the existing system analysis codes. Advanced modeling and numerical methods must be taken into consideration to improve the existing capabilities. Secondly, by developing next-generation reactor system analysis codes, the knowledge (know-how) in two-phase flow modeling and the highly complex constitutive models will be transferred to the young generation of nuclear engineers. And thirdly, all computer codes have a limited shelf life. It becomes less and less cost-effective to maintain a legacy code, due to the fast-changing computer hardware and software environment. There are several critical perspectives in terms of developing next-generation reactor system analysis codes: 1) The success of the next-generation codes must be built upon the success of the existing codes. The knowledge of the existing codes, not just simply the manuals and codes, but knowing why and how, must be transferred to the next-generation codes. The next-generation codes should encompass the capability of the existing codes. The shortcomings of existing codes should be identified, understood, and properly categorized, for example into model deficiencies or numerical method deficiencies. 2) State-of-the-art models and numerical methods must be considered to

  8. Next Generation Nuclear Plant System Requirements Manual

    International Nuclear Information System (INIS)

    Not Listed

    2008-01-01

    System Requirements Manual for the NGNP Project. The Energy Policy Act of 2005 (H.R. 6; EPAct), which was signed into law by President George W. Bush in August 2005, required the Secretary of the U.S. Department of Energy (DOE) to establish a project to be known as the Next Generation Nuclear Plant (NGNP) Project. According to the EPAct, the NGNP Project shall consist of the research, development, design, construction, and operation of a prototype plant (to be referred to herein as the NGNP) that (1) includes a nuclear reactor based on the research and development (R and D) activities supported by the Generation IV Nuclear Energy Systems initiative, and (2) shall be used to generate electricity, to produce hydrogen, or to both generate electricity and produce hydrogen. The NGNP Project supports both the national need to develop safe, clean, economical nuclear energy and the Nuclear Hydrogen Initiative (NHI), which has the goal of establishing greenhouse-gas-free technologies for the production of hydrogen. The DOE has selected the helium-cooled High Temperature Gas-Cooled Reactor (HTGR) as the reactor concept to be used for the NGNP because it is the only near-term Generation IV concept that has the capability to provide process heat at high enough temperatures for highly efficient production of hydrogen. The EPAct also names the Idaho National Laboratory (INL), the DOE's lead national laboratory for nuclear energy research, as the site for the prototype NGNP.

  9. FY 1999 Report on feasibility research and development for next generation liquid crystal process basic technologies; 1999 nendo jisedai ekisho process kiban gijutsu ni kakawaru sendo kenkyu kaihatsu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Described herein are the FY 1999 results of the feasibility study on the next generation liquid crystal processes. Technology for low-temperature thin film formation fabricates high-purity, high-density Si films useful as the laser annealing (crystallization) precursor by the IBD method, without using thermal annealing. Formation of thin films of a-Si and SiNx on substrates kept at 200 degrees C or lower is studied using a high-density plasma source, and the surface conditions are uniformly controlled over a large area of the film precursor. By controlling the plasma-generating region, the new technology needs less power to produce the film than the conventional CVD method, which uses parallel flat plates. Resource and energy savings through the TFT method are essential for production of liquid-crystal displays, and the techniques for forming the thin films at low temperature are studied. Reduction in wiring resistance (signal transmission delay) is studied for the next generation TFT, and it is found that the Cu film is selectively formed on TiN but not on SiO2 by the MOCVD method at 150 to 180 degrees C. Similarly, the selective film formation is confirmed in the plating technology. The comprehensive investigations for the next generation liquid crystal process technologies cover high-quality polycrystalline Si films and lithography (light exposure). (NEDO)

  10. A software tool for simulation of surfaces generated by ball nose end milling

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2004-01-01

    A software tool for prediction of surface topography of ball nose end milled surfaces was developed. The tool is based on a simplified model of the ideal tool motion and neglects the effects due to run-out, static and dynamic deflections and error motions, but has the merit of generating as output a file in a format readable by a surface processor software (SPIP [2]), for calculation of a number of surface roughness parameters. In the next paragraph a description of the basic features of ball nose end milled surfaces is given, while in paragraph 3 the model is described.

  11. Parallel hierarchical radiosity rendering

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Michael [Iowa State Univ., Ames, IA (United States)

    1993-07-01

    In this dissertation, the step-by-step development of a scalable parallel hierarchical radiosity renderer is documented. First, a new look is taken at the traditional radiosity equation, and a new form is presented in which the matrix of linear system coefficients is transformed into a symmetric matrix, thereby simplifying the problem and enabling a new solution technique to be applied. Next, the state-of-the-art hierarchical radiosity methods are examined for their suitability to parallel implementation, and scalability. Significant enhancements are also discovered which both improve their theoretical foundations and improve the images they generate. The resultant hierarchical radiosity algorithm is then examined for sources of parallelism, and for an architectural mapping. Several architectural mappings are discussed. A few key algorithmic changes are suggested during the process of making the algorithm parallel. Next, the performance, efficiency, and scalability of the algorithm are analyzed. The dissertation closes with a discussion of several ideas which have the potential to further enhance the hierarchical radiosity method, or provide an entirely new forum for the application of hierarchical methods.
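The symmetric form mentioned in the abstract follows from form-factor reciprocity, A_i F_ij = A_j F_ji: scaling the classical radiosity system (I - diag(rho) F) B = E by diag(A_i / rho_i) produces a symmetric coefficient matrix. A minimal NumPy sketch of this transformation (patch counts, areas, reflectivities, and form factors are illustrative assumptions, not data from the dissertation):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.uniform(1.0, 2.0, n)      # patch areas
rho = rng.uniform(0.2, 0.8, n)    # patch reflectivities

# Build form factors obeying reciprocity A_i F_ij = A_j F_ji.
G = rng.uniform(0.0, 1.0, (n, n))
G = (G + G.T) / 2.0
np.fill_diagonal(G, 0.0)
F = G / A[:, None]
F /= 2.0 * F.sum(axis=1).max()    # keep row sums < 1 (energy conservation)
E = rng.uniform(0.0, 1.0, n)      # patch emission

# Classical, nonsymmetric radiosity system: (I - diag(rho) F) B = E.
B_direct = np.linalg.solve(np.eye(n) - rho[:, None] * F, E)

# Scaling by diag(A_i / rho_i) symmetrizes the coefficient matrix:
# M_ij = (A_i / rho_i) * delta_ij - A_i F_ij, and A_i F_ij = A_j F_ji.
M = np.diag(A / rho) - A[:, None] * F
B_sym = np.linalg.solve(M, (A / rho) * E)
```

Both systems yield the same radiosities; the symmetric form additionally opens the door to solvers that exploit symmetry, such as conjugate gradients.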

  12. NEXT GENERATION TURBINE SYSTEM STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Frank Macri

    2002-02-28

    Rolls-Royce has completed a preliminary design and marketing study under a Department of Energy (DOE) cost shared contract (DE-AC26-00NT40852) to analyze the feasibility of developing a clean, high efficiency, and flexible Next Generation Turbine (NGT) system to meet the power generation market needs of the year 2007 and beyond. Rolls-Royce evaluated the full range of its most advanced commercial aerospace and aeroderivative engines alongside the special technologies necessary to achieve the aggressive efficiency, performance, emissions, economic, and flexibility targets desired by the DOE. Heavy emphasis was placed on evaluating the technical risks and the economic viability of various concept and technology options available. This was necessary to ensure the resulting advanced NGT system would provide extensive public benefits and significant customer benefits without introducing unacceptable levels of technical and operational risk that would impair the market acceptance of the resulting product. Two advanced cycle configurations were identified as offering significant advantages over current combined cycle products available in the market. In addition, balance of plant (BOP) technologies, as well as capabilities to improve the reliability, availability, and maintainability (RAM) of industrial gas turbine engines, have been identified. A customer focused survey and economic analysis of a proposed Rolls-Royce NGT product configuration was also accomplished as a part of this research study. The proposed Rolls-Royce NGT solution could offer customers clean, flexible power generation systems with very high efficiencies, similar to combined cycle plants, but at a much lower specific cost, similar to those of simple cycle plants.

  13. Parallel Application Performance on Two Generations of Intel Xeon HPC Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Christopher H.; Long, Hai; Sides, Scott; Vaidhynathan, Deepthi; Jones, Wesley

    2015-10-15

    Two next-generation node configurations hosting the Haswell microarchitecture were tested with a suite of microbenchmarks and application examples, and compared with a current Ivy Bridge production node on NREL's Peregrine high-performance computing cluster. A primary conclusion from this study is that the additional cores are of little value to individual task performance--limitations to application parallelism, or resource contention among concurrently running but independent tasks, limit effective utilization of these added cores. Hyperthreading generally impacts throughput negatively, but can improve performance in the absence of detailed attention to runtime workflow configuration. The observations offer some guidance for procurement of future HPC systems at NREL. First, raw core count must be balanced with available resources, particularly memory bandwidth. Balance-of-system will determine value more than processor capability alone. Second, hyperthreading continues to be largely irrelevant to the workloads that are commonly seen, and were tested here, at NREL. Finally, perhaps the most impactful enhancement to productivity might occur through enabling multiple concurrent jobs per node. Given the right type and size of workload, more may be achieved by doing many slow things at once than by doing fast things in sequence.

  14. Generation After Next Propulsor Research: Robust Design for Embedded Engine Systems

    Science.gov (United States)

    Arend, David J.; Tillman, Gregory; O'Brien, Walter F.

    2012-01-01

    The National Aeronautics and Space Administration, United Technologies Research Center and Virginia Polytechnic Institute and State University have contracted to pursue multi-disciplinary research into boundary layer ingesting (BLI) propulsors for generation-after-next environmentally responsible subsonic fixed wing aircraft. This Robust Design for Embedded Engine Systems project first conducted a high-level vehicle system study based on a large commercial transport class hybrid wing body aircraft, which determined that a 3 to 5 percent reduction in fuel burn could be achieved over a 7,500 nautical mile mission. Both the pylon-mounted baseline and BLI propulsion systems were based on a low-pressure-ratio fan (1.35) in an ultra-high-bypass-ratio engine (16), consistent with the next generation of advanced commercial turbofans. An optimized, coupled BLI inlet and fan system was subsequently designed to achieve performance targets identified in the system study. The resulting system possesses an inlet with total pressure losses less than 0.5 percent, and a fan stage with an efficiency debit of less than 1.5 percent relative to the pylon-mounted, clean-inflow baseline. The subject research project has identified tools and methodologies necessary for the design of next-generation, highly airframe-integrated propulsion systems. These tools will be validated in future large-scale testing of the BLI inlet/fan system in NASA's 8- by 6-foot transonic wind tunnel. In addition, fan unsteady response to screen-generated total pressure distortion is being characterized experimentally in a JT15D engine test rig. These data will document engine sensitivities to distortion magnitude and spatial distribution, providing early insight into key physical processes that will control BLI propulsor design.

  15. Galaxy LIMS for next-generation sequencing

    NARCIS (Netherlands)

    Scholtalbers, J.; Rossler, J.; Sorn, P.; Graaf, J. de; Boisguerin, V.; Castle, J.; Sahin, U.

    2013-01-01

    SUMMARY: We have developed a laboratory information management system (LIMS) for a next-generation sequencing (NGS) laboratory within the existing Galaxy platform. The system provides lab technicians standard and customizable sample information forms, barcoded submission forms, tracking of input

  16. Quantitative miRNA expression analysis: comparing microarrays with next-generation sequencing

    DEFF Research Database (Denmark)

    Willenbrock, Hanni; Salomon, Jesper; Søkilde, Rolf

    2009-01-01

    Recently, next-generation sequencing has been introduced as a promising, new platform for assessing the copy number of transcripts, while the existing microarray technology is considered less reliable for absolute, quantitative expression measurements. Nonetheless, so far, results from the two technologies have only been compared based on biological data, leading to the conclusion that, although they are somewhat correlated, expression values differ significantly. Here, we use synthetic RNA samples, resembling human microRNA samples, to find that microarray expression measures actually correlate better with sample RNA content than expression measures obtained from sequencing data. In addition, microarrays appear highly sensitive and perform equivalently to next-generation sequencing in terms of reproducibility and relative ratio quantification.
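The study's central comparison, correlating each platform's expression measures against the known RNA content of synthetic samples, can be mimicked on toy data. Everything below (the noise model for the array, the per-miRNA capture-bias model for sequencing, and the sample size) is a hypothetical illustration of the evaluation method, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
true_conc = rng.lognormal(mean=2.0, sigma=1.0, size=200)  # known spike-in amounts

# Hypothetical platform models: the microarray reports a noisy, slightly
# compressed response; sequencing reports counts whose rate carries a
# per-miRNA sequence-specific capture bias (e.g. ligation efficiency).
microarray = true_conc ** 0.9 * rng.lognormal(0.0, 0.25, 200)
bias = rng.lognormal(0.0, 1.0, 200)
sequencing = rng.poisson(true_conc * bias).astype(float)

def log_corr(x, y):
    """Pearson correlation on the log scale, as is usual for expression data."""
    return np.corrcoef(np.log1p(x), np.log1p(y))[0, 1]

r_array = log_corr(microarray, true_conc)  # tracks true content closely
r_seq = log_corr(sequencing, true_conc)    # degraded by the capture bias
```

Under these assumptions the array-vs-truth correlation exceeds the sequencing-vs-truth correlation, mirroring the paper's qualitative conclusion; with real data the outcome depends entirely on the actual bias structure.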

  17. Generation of Random Numbers and Parallel Random Number Streams for Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    L. Yu. Barash

    2012-01-01

    Modern methods and libraries for high-quality pseudorandom number generation and for generation of parallel random number streams for Monte Carlo simulations are considered. The probability equidistribution property, and the parameters for which the property holds at dimensions up to the logarithm of the mesh size, are considered for Multiple Recursive Generators.
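Modern libraries expose exactly this kind of parallel-stream support. A sketch using NumPy's SeedSequence spawning with the counter-based Philox generator (the worker count and the Monte Carlo pi estimate are illustrative, not taken from the paper):

```python
import numpy as np

# One master seed; spawn statistically independent child seeds, one per
# worker, so parallel Monte Carlo tasks never share or overlap state.
master = np.random.SeedSequence(12345)
streams = [np.random.Generator(np.random.Philox(s)) for s in master.spawn(4)]

# Each "worker" estimates pi from its own stream; the result is
# reproducible regardless of the order in which workers actually run.
estimates = []
for g in streams:
    pts = g.random((100_000, 2))
    inside = (pts ** 2).sum(axis=1) < 1.0
    estimates.append(4.0 * inside.mean())

pi_hat = sum(estimates) / len(estimates)
```

In a real parallel run, each child seed would be shipped to its own process or MPI rank; only the seeds cross the process boundary, never generator state.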

  18. Big Data Perspective and Challenges in Next Generation Networks

    Directory of Open Access Journals (Sweden)

    Kashif Sultan

    2018-06-01

    With the development towards the next generation of cellular networks, i.e., 5G, the focus has shifted towards meeting higher data rate requirements and exploiting the potential of micro cells and the millimeter wave spectrum. The goals for next generation networks are very high data rates, low latency and handling of big data. The achievement of these goals definitely requires newer architecture designs, upgraded technologies with possible backward support, better security algorithms and intelligent decision-making capability. In this survey, we identify the opportunities which can be provided by 5G networks and discuss the underlying challenges towards implementation and realization of the goals of 5G. This survey also provides a discussion on the recent developments made towards standardization, the architectures which may be potential candidates for deployment and the energy concerns in 5G networks. Finally, the paper presents a big data perspective and the potential of machine learning for optimization and decision making in 5G networks.

  19. Beam dynamics simulations using a parallel version of PARMILA

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1996-01-01

    The computer code PARMILA has been the primary tool for the design of proton and ion linacs in the United States for nearly three decades. Previously it was sufficient to perform simulations with of order 10000 particles, but recently the need to perform high resolution halo studies for next-generation, high intensity linacs has made it necessary to perform simulations with of order 100 million particles. With the advent of massively parallel computers such simulations are now within reach. Parallel computers already make it possible, for example, to perform beam dynamics calculations with tens of millions of particles, requiring over 10 GByte of core memory, in just a few hours. Also, parallel computers are becoming easier to use thanks to the availability of mature, Fortran-like languages such as Connection Machine Fortran and High Performance Fortran. We will describe our experience developing a parallel version of PARMILA and the performance of the new code

  20. Beam dynamics simulations using a parallel version of PARMILA

    International Nuclear Information System (INIS)

    Ryne, Robert

    1996-01-01

    The computer code PARMILA has been the primary tool for the design of proton and ion linacs in the United States for nearly three decades. Previously it was sufficient to perform simulations with of order 10000 particles, but recently the need to perform high resolution halo studies for next-generation, high intensity linacs has made it necessary to perform simulations with of order 100 million particles. With the advent of massively parallel computers such simulations are now within reach. Parallel computers already make it possible, for example, to perform beam dynamics calculations with tens of millions of particles, requiring over 10 GByte of core memory, in just a few hours. Also, parallel computers are becoming easier to use thanks to the availability of mature, Fortran-like languages such as Connection Machine Fortran and High Performance Fortran. We will describe our experience developing a parallel version of PARMILA and the performance of the new code. (author)

  1. Next-generation Digital Earth.

    Science.gov (United States)

    Goodchild, Michael F; Guo, Huadong; Annoni, Alessandro; Bian, Ling; de Bie, Kees; Campbell, Frederick; Craglia, Max; Ehlers, Manfred; van Genderen, John; Jackson, Davina; Lewis, Anthony J; Pesaresi, Martino; Remetey-Fülöpp, Gábor; Simpson, Richard; Skidmore, Andrew; Wang, Changlin; Woodgate, Peter

    2012-07-10

    A speech by then-Vice President Al Gore in 1998 created a vision for a Digital Earth, and played a role in stimulating the development of a first generation of virtual globes, typified by Google Earth, that achieved many but not all of the elements of this vision. The technical achievements of Google Earth, and the functionality of this first generation of virtual globes, are reviewed against the Gore vision. Meanwhile, developments in technology continue, the era of "big data" has arrived, the general public is more and more engaged with technology through citizen science and crowd-sourcing, and advances have been made in our scientific understanding of the Earth system. However, although Google Earth stimulated progress in communicating the results of science, there continue to be substantial barriers in the public's access to science. All these factors prompt a reexamination of the initial vision of Digital Earth, and a discussion of the major elements that should be part of a next generation.

  2. Generation of artificial FASTQ files to evaluate the performance of next-generation sequencing pipelines.

    Directory of Open Access Journals (Sweden)

    Matthew Frampton

    Pipelines for the analysis of Next-Generation Sequencing (NGS) data are generally composed of a set of different publicly available software tools, configured together in order to map short reads of a genome and call variants. The fidelity of pipelines is variable. We have developed ArtificialFastqGenerator, which takes a reference genome sequence as input and outputs artificial paired-end FASTQ files containing Phred quality scores. Since these artificial FASTQs are derived from the reference genome, they provide a gold standard for read alignment and variant calling, thereby enabling the performance of any NGS pipeline to be evaluated. The user can customise DNA template/read length, the modelling of coverage based on GC content, whether to use real Phred base quality scores taken from existing FASTQ files, and whether to simulate sequencing errors. Detailed coverage and error summary statistics are outputted. Here we describe ArtificialFastqGenerator and illustrate its implementation in evaluating a typical bespoke NGS analysis pipeline under different experimental conditions. ArtificialFastqGenerator was released in January 2012. Source code, example files and binaries are freely available under the terms of the GNU General Public License v3.0 from https://sourceforge.net/projects/artfastqgen/.
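To give a flavor of what such a generator does, here is a minimal, hypothetical sketch of producing paired-end FASTQ records with Phred+33 quality strings from a reference sequence. All names, the read/fragment lengths, and the toy quality model are illustrative; this implements none of ArtificialFastqGenerator's GC-based coverage modelling or sequencing-error simulation:

```python
import random

def phred33(q):
    """Map an integer Phred quality score to its ASCII (offset-33) character."""
    return chr(q + 33)

def make_read_pair(ref, read_len, frag_len, rng):
    """Sample a fragment from the reference and emit one read pair:
    R1 from the fragment's 5' end, R2 reverse-complemented from its 3' end."""
    comp = str.maketrans("ACGT", "TGCA")
    start = rng.randrange(0, len(ref) - frag_len + 1)
    frag = ref[start:start + frag_len]
    r1 = frag[:read_len]
    r2 = frag[-read_len:].translate(comp)[::-1]
    # Toy quality model: scores that decay slightly toward the 3' end.
    quals = "".join(phred33(40 - i // 10) for i in range(read_len))
    return (r1, quals), (r2, quals)

rng = random.Random(42)
ref = "".join(rng.choice("ACGT") for _ in range(1000))  # stand-in reference
r1_records = []
for i in range(5):
    (s1, q1), (s2, q2) = make_read_pair(ref, read_len=75, frag_len=200, rng=rng)
    # R1 goes to one FASTQ file; s2/q2 would go to the mate (R2) file.
    r1_records.append(f"@read{i}/1\n{s1}\n+\n{q1}\n")
```

Because every read is sampled from a known reference position, an aligner's output can be checked exactly — which is the gold-standard property the abstract describes.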

  3. Bringing Next-Generation Sequencing into the Classroom through a Comparison of Molecular Biology Techniques

    Science.gov (United States)

    Bowling, Bethany; Zimmer, Erin; Pyatt, Robert E.

    2014-01-01

    Although the development of next-generation (NextGen) sequencing technologies has revolutionized genomic research and medicine, the incorporation of these topics into the classroom is challenging, given an implied high degree of technical complexity. We developed an easy-to-implement, interactive classroom activity investigating the similarities…

  4. Special Issue: Next Generation DNA Sequencing

    Directory of Open Access Journals (Sweden)

    Paul Richardson

    2010-10-01

    Next Generation Sequencing (NGS) refers to technologies that do not rely on traditional dideoxy-nucleotide (Sanger) sequencing, where labeled DNA fragments are physically resolved by electrophoresis. These new technologies rely on different strategies, but essentially all of them make use of real-time data collection of a base-level incorporation event across a massive number of reactions (on the order of millions, versus 96 for capillary electrophoresis, for instance). The major commercial NGS platforms available to researchers are the 454 Genome Sequencer (Roche), the Illumina (formerly Solexa) Genome Analyzer, the SOLiD system (Applied Biosystems/Life Technologies) and the HeliScope (Helicos Corporation). The techniques and different strategies utilized by these platforms are reviewed in a number of the papers in this special issue. These technologies are enabling new applications that take advantage of the massive data produced by this next generation of sequencing instruments. [...]

  5. Next Generation NASA Initiative for Space Geodesy

    Science.gov (United States)

    Merkowitz, S. M.; Desai, S.; Gross, R. S.; Hilliard, L.; Lemoine, F. G.; Long, J. L.; Ma, C.; McGarry J. F.; Murphy, D.; Noll, C. E.; hide

    2012-01-01

    Space geodesy measurement requirements have become more and more stringent as our understanding of the physical processes and our modeling techniques have improved. In addition, current and future spacecraft will have ever-increasing measurement capability and will lead to increasingly sophisticated models of changes in the Earth system. Ground-based space geodesy networks with enhanced measurement capability will be essential to meeting these oncoming requirements and properly interpreting the satellite data. These networks must be globally distributed and built for longevity, to provide the robust data necessary to generate improved models for proper interpretation of the observed geophysical signals. These requirements have been articulated by the Global Geodetic Observing System (GGOS). The NASA Space Geodesy Project (SGP) is developing a prototype core site as the basis for a next generation Space Geodetic Network (SGN) that would be NASA's contribution to a global network designed to produce the higher quality data required to maintain the Terrestrial Reference Frame and provide information essential for fully realizing the measurement potential of the current and coming generation of Earth Observing spacecraft. Each of the sites in the SGN would include co-located, state-of-the-art systems from all four space geodetic observing techniques (GNSS, SLR, VLBI, and DORIS). The prototype core site is being developed at NASA's Geophysical and Astronomical Observatory at Goddard Space Flight Center. The project commenced in 2011 and is scheduled for completion in late 2013. In January 2012, two multiconstellation GNSS receivers, GODS and GODN, were established at the prototype site as part of the local geodetic network. Development and testing are also underway on the next generation SLR and VLBI systems along with a modern DORIS station. An automated survey system is being developed to measure inter-technique vector ties, and network design studies are being

  6. Aptaligner: automated software for aligning pseudorandom DNA X-aptamers from next-generation sequencing data.

    Science.gov (United States)

    Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T; Volk, David E

    2014-06-10

    Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provides this feature as well as length error and noise level cutoff features, is parallelized to run on multiple central processing units (cores), and sorts sequences from a single chip into projects and subprojects.

  7. StarTrax --- The Next Generation User Interface

    Science.gov (United States)

    Richmond, Alan; White, Nick

    StarTrax is a software package to be distributed to end users for installation on their local computing infrastructure. It will provide access to many services of the HEASARC, i.e. bulletins, catalogs, proposal and analysis tools, initially for the ROSAT MIPS (Mission Information and Planning System), later for the Next Generation Browse. A user activating the GUI will reach all HEASARC capabilities through a uniform view of the system, independent of the local computing environment and of the networking method of accessing StarTrax. Use it if you prefer the point-and-click metaphor of modern GUI technology to classical command-line interfaces (CLIs). Notable strengths include: easy to use; excellent portability; very robust server support; feedback button on every dialog; painstakingly crafted User Guide. It is designed to support a large number of input devices including terminals, workstations and personal computers. XVT's Portability Toolkit is used to build the GUI in C/C++ to run on OSF/Motif (UNIX or VMS), OPEN LOOK (UNIX), Macintosh, MS-Windows (DOS), or character systems.

  8. Evaluation Metrics for Intermediate Heat Exchangers for Next Generation Nuclear Reactors

    International Nuclear Information System (INIS)

    Sabharwall, Piyush; Kim, Eung Soo; Anderson, Nolan

    2011-01-01

    The Department of Energy (DOE) is working with industry to develop a next generation, high-temperature gas-cooled reactor (HTGR) as a part of the effort to supply the United States with abundant, clean, and secure energy as initiated by the Energy Policy Act of 2005 (EPAct; Public Law 109-58, 2005). The NGNP Project, led by the Idaho National Laboratory (INL), will demonstrate the ability of the HTGR to generate hydrogen, electricity, and/or high-quality process heat for a wide range of industrial applications.

  9. Software R&D for Next Generation of HEP Experiments, Inspired by Theano

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    In the next decade, the frontiers of High Energy Physics (HEP) will be explored by three machines: the High Luminosity Large Hadron Collider (HL-LHC) in Europe, the Long Base Neutrino Facility (LBNF) in the US, and the International Linear Collider (ILC) in Japan. These next generation experiments must address two fundamental problems in the current generation of HEP experimental software: the inability to take advantage of and adapt to the rapidly evolving processor landscape, and the difficulty in developing and maintaining increasingly complex software systems by physicists. I will propose a strategy, inspired by the automatic optimization and code generation in Theano, to simultaneously address both problems. I will describe three R&D projects with short-term physics deliverables aimed at developing this strategy. The first project is to develop a maximally sensitive General Search for New Physics at the LHC by applying the Matrix Element Method running on the GPUs of HPCs. The second is to classify and reconstru...

  10. Preliminary thoughts on the data acquisition for the next generation of silicon tracking systems

    International Nuclear Information System (INIS)

    Genat, J.F.; Savoy-Navarro, A.

    2007-01-01

    Preliminary thoughts about the data acquisition system to be developed for the next generation of large-area silicon trackers are presented in this paper. This paper describes the set of data delivered by these tracking systems, and the various stages of processing and data flow transmission from the front-end chip sitting on the detector to the last stage in the data processing. How to best profit from state-of-the-art technologies is a major goal. (author)

  11. Towards next-generation biodiversity assessment using DNA metabarcoding

    DEFF Research Database (Denmark)

    Taberlet, Pierre; Coissac, Eric; Pompanon, Francois

    2012-01-01

    Virtually all empirical ecological studies require species identification during data collection. DNA metabarcoding refers to the automated identification of multiple species from a single bulk sample containing entire organisms or from a single environmental sample containing degraded DNA (soil, water, faeces, etc.). It can be implemented for both modern and ancient environmental samples. The availability of next-generation sequencing platforms and ecologists' need for high-throughput taxon identification have facilitated the emergence of DNA metabarcoding. The potential power of DNA...

  12. Next generation solar energy. From fundamentals to applications

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    semiconductor nanostructures in relation to problems of solar energy conversion (V. Klimov); (23) Progress in intermediate band solar cells (A.M. Vega); (24) Microscopic and macroscopic simulation in photovoltaics (J. Nelson); (25) Third generation photovoltaics - Thin film tandem cells, up/down conversion and hot carrier cells (G. Conibeer); (26) Development, design, synthesis and upscaling of CQD (S. Khalil); (27) Probing triplet states in OPV materials and devices (V. Dyakonov); (28) Future generation solar photon conversion to electricity and solar fuels - Multiple Exciton generation in colloidal nanocrystals and quantum dot solar cells and singlet fission in molecules (A. Nozik); (29) Plasmon-enhanced solar upconversion for photovoltaics and photocatalysis (J. Dionne); (30) Alternative materials for scaling up flexible dye solar cells (P.D. Lund); (31) Glow discharge techniques in the chemical analysis of photovoltaic materials (S. Schmitt); (32) Atomic layer deposition technology from R and D efforts to industrial PV production (M. Putkonen); (33) Spectroscopical methods in photovoltaic research (R. Seitz); (34) Scanning electron microscopy, focused ion beams and their analytical techniques to advance material and device developments in next generation photovoltaics (J. Jiruse); (35) Flash lamp annealing - High temperature treatment of surfaces/layers on temperature sensitive substrates in next generation PV technologies (T. Gebel); (36) MOCVD technology for concentrated PV (D. Schmitz); (37) PV technology perspectives in Lithuania (T. Malinauskas); (38) Interfaces in III-V-heterostructure solar cells (A.S. Gudovskikh); (39) Formation and investigation of luminescent silicon nanocrystals for optoelectronic applications (V.Y. Timoshenko); (40) Methods for fault detection in PV systems (T. Zdanowicz); (41) Powder materials in PV, overview and research at Tallinn University of Technology (D. Meissner); (42) Giant radiant heat transfer through the micron gaps (I.S. Nefedov...

  13. Next Generation TRD for CREAM Using Gas Straw Tubes and Foam Radiators

    Science.gov (United States)

    Malinin, A.; Ahn, H.S.; Fedin, O.; Ganel, O.; Han, J.H.; Kim, C.H.; Kim, K.C.; Lee, M.H.; Lutz, L.; Seo, E.S.; Walpole, P.; Wu, J.; Yoo, J.H.; Yoon, Y.S.; Zinn, S.Y.

    The Cosmic Ray Energetics And Mass (CREAM) experiment is designed to investigate the source, propagation and acceleration mechanism of high energy cosmic-ray nuclei, by directly measuring their energy and charge. Incorporating a transition radiation detector (TRD) provides an energy measurement complementary to the calorimeter, as well as additional track reconstruction capability. The next generation CREAM TRD is designed with 4 mm straw tubes to greatly improve tracking over the previous 20 mm tube design, thereby enhancing charge identification in the silicon charge detector (SCD). Plastic foam provides a weight-efficient radiator that doubles as a mechanical support for the straw layers. This design provides a compact, robust, reliable, low density detector to measure incident nucleus energy for 3 < Z < 30 nuclei in the Lorentz gamma factor range of 10²-10⁵. This paper discusses the new TRD design and the low power front end electronics used to achieve the large dynamic range required. Beam test results of a prototype TRD are also reported.

  14. A Comparative Study of Multiplexing Schemes for Next Generation Optical Access Networks

    Science.gov (United States)

    Imtiaz, Waqas A.; Khan, Yousaf; Shah, Pir Mehar Ali; Zeeshan, M.

    2014-09-01

    Passive optical network (PON) is a high-bandwidth, economical solution which can provide the necessary bandwidth to end-users. Wavelength division multiplexed passive optical networks (WDM PONs) and time division multiplexed passive optical networks (TDM PONs) are considered an evolutionary step toward next-generation optical access (NGOA) networks. However, they fail to provide the highest transmission capacity, efficient bandwidth access, and robust dispersion tolerance. Thus future PONs are expected to be based on simpler, efficient and potentially scalable optical code division multiplexed (OCDM) PONs. This paper compares the performance of existing PONs with OCDM PON to determine a suitable scheme for NGOA networks. Two system parameters are used in this paper: fiber length and bit rate. Performance analysis using Optisystem shows that, for sufficient system performance, i.e. bit error rate (BER) ≤ 10⁻⁹ and maximum quality factor (Q) ≥ 6, OCDMA PON performs efficiently up to 50 km at 10 Gbit/s per ONU.
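    The BER and Q thresholds quoted above are tied together by the standard Gaussian-noise approximation for on-off-keyed receivers, BER = ½ erfc(Q/√2), under which Q = 6 corresponds to BER ≈ 10⁻⁹. A minimal sketch of that relation (an editorial illustration, not from the paper):

```python
import math

def q_to_ber(q: float) -> float:
    """Gaussian-noise approximation relating receiver Q factor to
    bit error rate for on-off keying: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Q = 6 reproduces the BER <= 1e-9 acceptance threshold used above.
print(q_to_ber(6.0))
```

    Raising Q even slightly tightens BER sharply, which is why Q ≥ 6 is a common acceptance criterion in PON link budgets.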

  15. Thermal management of next-generation contact-cooled synchrotron x-ray mirrors

    Energy Technology Data Exchange (ETDEWEB)

    Khounsary, A.

    1999-10-29

    In the past decade, several third-generation synchrotron x-ray sources have been constructed and commissioned around the world. Many of the major problems in the development and design of the optical components capable of handling the extremely high heat loads of the generated x-ray beams have been resolved. It is expected, however, that in the next few years even more powerful x-ray beams will be produced at these facilities, for example, by increasing the particle beam current. In this paper, the design of a next generation of synchrotron x-ray mirrors is discussed. The author shows that the design of contact-cooled mirrors capable of handling x-ray beam heat fluxes in excess of 500 W/mm² - or more than three times the present level - is well within reach, and that the limiting factor is thermal stress rather than thermally induced slope error.

  16. Application of photonics in next generation telecommunication satellites payloads

    Science.gov (United States)

    Anzalchi, J.; Inigo, P.; Roy, B.

    2017-11-01

    Next generation broadband telecommunication satellites are required to provide very high data throughput using complex multibeam architectures. These high-throughput 'Terabit/s' satellites will incorporate payloads with a very large quantity of conventional RF equipment, co-axial cables, waveguides, harnesses and ancillary equipment, making Assembly, Integration and Test (AIT) very complex. Use of 'RF over Fiber' and associated photonics equipment can make the AIT process much simpler, with the added benefit of a significant reduction in the number of payload equipment items and in inherent payload mass.

  17. Integration of microbiological, epidemiological and next generation sequencing technologies data for the managing of nosocomial infections

    Directory of Open Access Journals (Sweden)

    Matteo Brilli

    2018-02-01

    Full Text Available At its core, the work of clinical microbiologists consists in retrieving a few bytes of information (species identification; metabolic capacities; staining and antigenic properties; antibiotic resistance profiles, etc.) from pathogenic agents. The development of next generation sequencing technologies (NGS), and the possibility to determine the entire genome of bacterial pathogens, fungi and protozoans, will likely introduce a breakthrough in the amount of information generated by clinical microbiology laboratories: from bytes to megabytes of information for a single isolate. In parallel, the development of novel informatics tools, designed for the management and analysis of so-called Big Data, offers the possibility to search for patterns in databases collecting genomic and microbiological information on the pathogens, as well as epidemiological data and information on the clinical parameters of the patients. Nosocomial infections and antibiotic resistance will likely represent major challenges for clinical microbiologists in the next decades. In this paper, we describe how bacterial genomics based on NGS, integrated with novel informatics tools, could contribute to the control of hospital infections and multi-drug resistant pathogens.

  18. Engineering microbes for tolerance to next-generation biofuels

    Directory of Open Access Journals (Sweden)

    Dunlop Mary J

    2011-09-01

    Full Text Available Abstract A major challenge when using microorganisms to produce bulk chemicals such as biofuels is that the production targets are often toxic to cells. Many biofuels are known to reduce cell viability through damage to the cell membrane and interference with essential physiological processes. Therefore, cells must trade off biofuel production and survival, reducing potential yields. Recently, there have been several efforts towards engineering strains for biofuel tolerance. Promising methods include engineering biofuel export systems, heat shock proteins, membrane modifications, more general stress responses, and approaches that integrate multiple tolerance strategies. In addition, in situ recovery methods and media supplements can help to ease the burden of end-product toxicity and may be used in combination with genetic approaches. Recent advances in systems and synthetic biology provide a framework for tolerance engineering. This review highlights recent targeted approaches towards improving microbial tolerance to next-generation biofuels with a particular emphasis on strategies that will improve production.

  19. New materials for next-generation commercial transports

    National Research Council Canada - National Science Library

    Committee on New Materials for Advanced Civil Aircraft, Commission on Engineering and Technical Systems, National Research Council

    ... civil aircraft throughout their service life. The committee investigated the new materials and structural concepts that are likely to be incorporated into next generation commercial aircraft and the factors influencing application decisions...

  20. Energy Efficient Glass Melting - The Next Generation Melter

    Energy Technology Data Exchange (ETDEWEB)

    David Rue

    2008-03-01

    The objective of this project is to demonstrate a high intensity glass melter, based on the submerged combustion melting technology. This melter will serve as the melting and homogenization section of a segmented, lower-capital cost, energy-efficient Next Generation Glass Melting System (NGMS). After this project, the melter will be ready to move toward commercial trials for some glasses needing little refining (fiberglass, etc.). For other glasses, a second project phase or glass industry research is anticipated to develop the fining stage of the NGMS process.

  1. Compact 2100 nm laser diode module for next-generation DIRCM

    Science.gov (United States)

    Dvinelis, Edgaras; Greibus, Mindaugas; Trinkūnas, Augustinas; Naujokaitė, Greta; Vizbaras, Augustinas; Vizbaras, Dominykas; Vizbaras, Kristijonas

    2017-10-01

    A compact high-power 2100 nm laser diode module for next-generation directional infrared countermeasure (DIRCM) systems is presented. Next-generation DIRCM systems require compact, light-weight and robust laser modules that can provide intense IR light emission capable of disrupting the tracking sensor of a heat-seeking missile. Currently used solid-state and fiber laser solutions for the mid-IR band are bulky and heavy, making them difficult to implement in smaller form-factor DIRCM systems. Recent development of GaSb laser diode technology has greatly improved the optical output powers and efficiencies of laser diodes working in the 1900 - 2450 nm band [1], while also maintaining very attractive size, weight, power consumption and cost characteristics. The performance of the 2100 nm laser diode module presented in this work is based on high-efficiency broad-emitting-area GaSb laser diode technology. Each laser diode emitter is able to provide 1 W of CW output optical power with working-point efficiency up to 20% at a temperature of 20 °C. For output beam collimation, custom-designed fast-axis and slow-axis collimator lenses were used. These lenses were actively aligned and attached using UV epoxy curing. A total of two emitters, stacked vertically, were used in the 2100 nm laser diode module. The final optical output power of the module is up to 2 W at a temperature of 20 °C. The total dimensions of the laser diode module are 35 x 25 x 16 mm (L x W x H), with a weight of 28 grams. Finally, the output beam is bore-sighted to the mechanical axis of the module housing, allowing for easy integration into next-generation DIRCM systems.

  2. Assessment of the Capability of Molten Salt Reactors as a Next Generation High Temperature Reactors

    International Nuclear Information System (INIS)

    Elsheikh, B.M.

    2017-01-01

    The Molten Salt Reactor, according to the Aircraft Reactor Experiment (ARE) and Molten Salt Reactor Experiment (MSRE) programs, was designed to be the first full-scale, commercial nuclear power plant utilizing molten salt liquid fuels, which can be used for producing electricity, producing fissile fuels (breeding), and burning actinides. The high temperature in the primary cycle enables the realization of efficient thermal conversion cycles, with net thermal efficiencies reaching greater than 45% in some reactor designs. Molten salts and liquid salts, because of their low vapor pressure, are excellent candidates for meeting most of the requirements of these high temperature reactors. There is renewed interest in MSRs because of changing goals and new technologies in the use of high-temperature reactors. High-temperature molten salt reactors pose substantial technical challenges, in particular the need for high-effectiveness intermediate heat transfer loop components. This paper discusses and investigates the capability and compatibility of molten salt reactors as a next generation high temperature energy system, together with its technical challenges.

  3. Preparation of next generation set of group cross sections. 3

    International Nuclear Information System (INIS)

    Kaneko, Kunio

    2002-03-01

    This fiscal year, based on the examination results concerning the evaluation energy range of heavy-element unresolved resonance cross sections, the upper energy limit of the range where ultra-fine group cross sections are produced was raised to 50 keV, and an improvement of the group cross section processing system was promoted. At the same time, reflecting the results of studies carried out so far, a function producing delayed neutron data was added to the general-purpose group cross section processing system; thus the preparation of the general-purpose group cross section processing system has been completed. On the other hand, the energy structure, data constitution and data contents of the next generation group cross section set were determined, and the specification of a 151-group next generation group cross section set was defined. Based on the above specification, a concrete library format of the next generation cross section set has been determined. After having carried out the above-described work, using the general-purpose group cross section processing system completed in this study, with the JENDL-3.2 evaluated nuclear data, the 151-group next generation group cross section set for 92 nuclides and the ultra-fine group resonance cross section library for 29 nuclides have been prepared. Utilizing the 151-group next generation group cross section set and the ultra-fine group resonance cross section library, a benchmark test calculation of fast reactors has been performed using an advanced lattice calculation code. It was confirmed, by comparing this result with that of a continuous-energy Monte Carlo code, that the 151-group next generation cross section set has sufficient accuracy. (author)
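    The group-averaging step at the heart of any group cross section processing system is flux-weighted condensation: a coarse-group value is σ_G = Σ_g σ_g φ_g / Σ_g φ_g, summed over the fine groups g mapped into coarse group G. A minimal sketch with hypothetical numbers (an editorial illustration, not JENDL-3.2 data or the system described above):

```python
def collapse(fine_xs, fine_flux, group_map):
    """Flux-weighted condensation of fine-group cross sections into
    coarse groups: sigma_G = sum(sigma_g * phi_g) / sum(phi_g),
    where group_map[g] names the coarse group of fine group g."""
    num, den = {}, {}
    for sig, phi, G in zip(fine_xs, fine_flux, group_map):
        num[G] = num.get(G, 0.0) + sig * phi
        den[G] = den.get(G, 0.0) + phi
    return {G: num[G] / den[G] for G in num}

# Four fine groups collapsed into two coarse groups (made-up values).
coarse = collapse([10.0, 8.0, 2.0, 1.0],
                  [1.0, 1.0, 2.0, 2.0],
                  [0, 0, 1, 1])
```

    The same weighting idea underlies both the 151-group set and the ultra-fine resonance library, only at different energy resolutions.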

  4. Acoustic methods for high-throughput protein crystal mounting at next-generation macromolecular crystallographic beamlines.

    Science.gov (United States)

    Roessler, Christian G; Kuczewski, Anthony; Stearns, Richard; Ellson, Richard; Olechno, Joseph; Orville, Allen M; Allaire, Marc; Soares, Alexei S; Héroux, Annie

    2013-09-01

    To take full advantage of advanced data collection techniques and high beam flux at next-generation macromolecular crystallography beamlines, rapid and reliable methods will be needed to mount and align many samples per second. One approach is to use an acoustic ejector to eject crystal-containing droplets onto a solid X-ray transparent surface, which can then be positioned and rotated for data collection. Proof-of-concept experiments were conducted at the National Synchrotron Light Source on thermolysin crystals acoustically ejected onto a polyimide `conveyor belt'. Small wedges of data were collected on each crystal, and a complete dataset was assembled from a well diffracting subset of these crystals. Future developments and implementation will focus on achieving ejection and translation of single droplets at a rate of over one hundred per second.

  5. Next-generation Sequencing-based genomic profiling: Fostering innovation in cancer care?

    Directory of Open Access Journals (Sweden)

    Gustavo S. Fernandes

    Full Text Available OBJECTIVES: With the development of next-generation sequencing (NGS) technologies, DNA sequencing has been increasingly utilized in clinical practice. Our goal was to investigate the impact of genomic evaluation on treatment decisions for heavily pretreated patients with metastatic cancer. METHODS: We analyzed metastatic cancer patients from a single institution whose cancers had progressed after all available standard-of-care therapies and whose tumors underwent next-generation sequencing analysis. We determined the percentage of patients who received any therapy directed by the test, and its efficacy. RESULTS: From July 2013 to December 2015, 185 consecutive patients were tested using a commercially available next-generation sequencing-based test, and 157 patients were eligible. Sixty-six patients (42.0%) were female, and 91 (58.0%) were male. The mean age at diagnosis was 52.2 years, and the mean number of pre-test lines of systemic treatment was 2.7. One hundred and seventy-seven patients (95.6%) had at least one identified gene alteration. Twenty-four patients (15.2%) underwent systemic treatment directed by the test result. Of these, one patient had a complete response, four (16.7%) had partial responses, two (8.3%) had stable disease, and 17 (70.8%) had disease progression as the best result. The median progression-free survival time with matched therapy was 1.6 months, and the median overall survival was 10 months. CONCLUSION: We identified a high prevalence of gene alterations using a next-generation sequencing test. Although some benefit was associated with matched therapy, most patients had disease progression as the best response, indicating the limited biological potential and unclear clinical relevance of this practice.

  6. Next Generation Attenuation Relationships for the Eastern United States (NGA-East)

    Energy Technology Data Exchange (ETDEWEB)

    Mahin, Stephen [Univ. of California, Berkeley, CA (United States); Bozorgnia, Yousef [Univ. of California, Berkeley, CA (United States)

    2016-04-11

    This is a progress report to DOE for the project Next Generation Attenuation for Central & Eastern US (NGA-East). This progress report consists of numerous monthly progress segments starting June 1, 2010 until December 31, 2015. Please note: the December 2015 progress report was issued in January 2016 due to the final university financial reporting at the end of this project. For each month, there is a technical progress list and an update on the financial progress of the project. This project is jointly funded by the DOE, the US Nuclear Regulatory Commission (NRC) and the Electric Power Research Institute (EPRI). Thus, each segment includes financial progress for these three funding agencies.

  7. Parallel-Vector Algorithm For Rapid Structural Analysis

    Science.gov (United States)

    Agarwal, Tarun R.; Nguyen, Duc T.; Storaasli, Olaf O.

    1993-01-01

    New algorithm developed to overcome deficiency of skyline storage scheme by use of variable-band storage scheme. Exploits both parallel and vector capabilities of modern high-performance computers. Gives engineers and designers opportunity to include more design variables and constraints during optimization of structures. Enables use of more refined finite-element meshes to obtain improved understanding of complex behaviors of aerospace structures leading to better, safer designs. Not only attractive for current supercomputers but also for next generation of shared-memory supercomputers.
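    The variable-band (profile) scheme mentioned above stores, for each row of a symmetric matrix, only the entries from the first nonzero column through the diagonal, so rows with a short local bandwidth cost almost nothing. A toy sketch of the storage step (an editorial illustration, not the NASA code):

```python
def to_profile(A):
    """Variable-band (profile) storage of a symmetric matrix: keep,
    per row, the first nonzero column index and the slice of entries
    from that column through the diagonal."""
    profile = []
    for i, row in enumerate(A):
        first = 0
        while first < i and row[first] == 0:
            first += 1                 # skip leading zeros in the row
        profile.append((first, row[first:i + 1]))
    return profile

A = [[4, 0, 0, 1],
     [0, 5, 2, 0],
     [0, 2, 6, 0],
     [1, 0, 0, 7]]
# Row 2 stores only columns 1..2; row 3 must store columns 0..3.
prof = to_profile(A)
```

    Unlike a fixed skyline envelope, the stored length adapts per row, which is what lets the factorization vectorize over long rows while skipping short ones.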

  8. Design Features and Technology Uncertainties for the Next Generation Nuclear Plant

    Energy Technology Data Exchange (ETDEWEB)

    John M. Ryskamp; Phil Hildebrandt; Osamu Baba; Ron Ballinger; Robert Brodsky; Hans-Wolfgang Chi; Dennis Crutchfield; Herb Estrada; Jean-Claude Garnier; Gerald Gordon; Richard Hobbins; Dan Keuter; Marilyn Kray; Philippe Martin; Steve Melancon; Christian Simon; Henry Stone; Robert Varrin; Werner von Lensa

    2004-06-01

    This report presents the conclusions, observations, and recommendations of the Independent Technology Review Group (ITRG) regarding design features and important technology uncertainties associated with very-high-temperature nuclear system concepts for the Next Generation Nuclear Plant (NGNP). The ITRG performed its reviews during the period November 2003 through April 2004.

  9. Implementing the Next Generation Science Standards

    Science.gov (United States)

    Penuel, William R.; Harris, Christopher J.; DeBarger, Angela Haydel

    2015-01-01

    The Next Generation Science Standards embody a new vision for science education grounded in the idea that science is both a body of knowledge and a set of linked practices for developing knowledge. The authors describe strategies that they suggest school and district leaders consider when designing strategies to support NGSS implementation.

  10. Next-Generation Sequencing Platforms

    Science.gov (United States)

    Mardis, Elaine R.

    2013-06-01

    Automated DNA sequencing instruments embody an elegant interplay among chemistry, engineering, software, and molecular biology and have built upon Sanger's founding discovery of dideoxynucleotide sequencing to perform once-unfathomable tasks. Combined with innovative physical mapping approaches that helped to establish long-range relationships between cloned stretches of genomic DNA, fluorescent DNA sequencers produced reference genome sequences for model organisms and for the reference human genome. New types of sequencing instruments that permit amazing acceleration of data-collection rates for DNA sequencing have been developed. The ability to generate genome-scale data sets is now transforming the nature of biological inquiry. Here, I provide an historical perspective of the field, focusing on the fundamental developments that predated the advent of next-generation sequencing instruments and providing information about how these instruments work, their application to biological research, and the newest types of sequencers that can extract data from single DNA molecules.

  11. Unbundling in Current Broadband and Next-Generation Ultra-Broadband Access Networks

    Science.gov (United States)

    Gaudino, Roberto; Giuliano, Romeo; Mazzenga, Franco; Valcarenghi, Luca; Vatalaro, Francesco

    2014-05-01

    This article overviews the methods currently under investigation for implementing multi-operator open-access/shared-access techniques in next-generation ultra-broadband access architectures, starting from the traditional "unbundling-of-the-local-loop" techniques implemented in legacy twisted-pair digital subscriber line access networks. A straightforward replication of these copper-based unbundling-of-the-local-loop techniques is usually not feasible in next-generation access networks, including fiber-to-the-home point-to-multipoint passive optical networks. To investigate this issue, the article first gives a concise description of traditional copper-based unbundling-of-the-local-loop solutions, then focuses on both next-generation access hybrid fiber-copper digital subscriber line fiber-to-the-cabinet scenarios and on fiber to the home, accounting for the mix of regulatory and technological reasons driving the next-generation access migration path, mostly with regard to the European situation.

  12. Next generation sequencing reveals the hidden diversity of zooplankton assemblages.

    Directory of Open Access Journals (Sweden)

    Penelope K Lindeque

    Full Text Available BACKGROUND: Zooplankton play an important role in our oceans, in biogeochemical cycling and in providing a food source for commercially important fish larvae. However, difficulties in correctly identifying zooplankton hinder our understanding of their roles in marine ecosystem functioning, and can prevent detection of long term changes in their community structure. The advent of massively parallel next generation sequencing technology allows DNA sequence data to be recovered directly from whole community samples. Here we assess the ability of such sequencing to quantify the richness and diversity of a mixed zooplankton assemblage from a productive time series site in the Western English Channel. METHODOLOGY/PRINCIPAL FINDINGS: Plankton net hauls (200 µm) were taken at the Western Channel Observatory station L4 in September 2010 and January 2011. These samples were analysed by microscopy and by metagenetic analysis of the 18S nuclear small subunit ribosomal RNA gene using the 454 pyrosequencing platform. Following quality control, a total of 419,041 sequences were obtained for all samples. The sequences clustered into 205 operational taxonomic units (OTUs) using a 97% similarity cut-off. Allocation of taxonomy by comparison with the National Center for Biotechnology Information database identified 135 OTUs to species level, 11 to genus level and 1 to order; <2.5% of sequences were classified as unknowns. By comparison, a skilled microscopic analyst was able to routinely enumerate only 58 taxonomic groups. CONCLUSIONS: Metagenetics reveals a previously hidden taxonomic richness, especially for Copepoda and hard-to-identify meroplankton such as Bivalvia, Gastropoda and Polychaeta. It also reveals rare species and parasites. We conclude that next generation sequencing of 18S amplicons is a powerful tool for elucidating the true diversity and species richness of zooplankton communities. While this approach allows for broad diversity assessments of plankton, it may...
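    The 97%-similarity OTU clustering described above can be pictured as greedy centroid clustering: each read joins the first existing centroid it matches at or above the threshold, otherwise it seeds a new OTU. A toy sketch using difflib's ratio as a stand-in for sequence identity (an editorial illustration; the study used dedicated sequence-analysis pipelines):

```python
from difflib import SequenceMatcher

def greedy_otu_cluster(seqs, threshold=0.97):
    """Greedy centroid clustering: assign each read to the first
    centroid it matches at >= threshold similarity; otherwise the
    read becomes the centroid of a new OTU."""
    centroids, clusters = [], []
    for s in seqs:
        for i, c in enumerate(centroids):
            if SequenceMatcher(None, s, c).ratio() >= threshold:
                clusters[i].append(s)
                break
        else:
            centroids.append(s)
            clusters.append([s])
    return clusters

# Toy reads: two near-identical 40-mers (one substitution) plus one
# unrelated sequence, so we expect two OTUs.
seqs = ["ACGT" * 10, "ACGT" * 9 + "ACGA", "TTTT" * 10]
clusters = greedy_otu_cluster(seqs)
```

    Real pipelines use alignment-based identity rather than difflib's edit-based ratio, but the greedy threshold logic is the same.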

  13. Wireless and wireline service convergence in next generation optical access networks - the FP7 WISCON project

    DEFF Research Database (Denmark)

    Vegas Olmos, Juan José; Pang, Xiaodan; Lebedev, Alexander

    2014-01-01

    The next generation of information technology demands both high capacity and mobility for applications such as high speed wireless access capable of supporting broadband services. The transport of wireless and wireline signals is converging onto a common telecommunication infrastructure. In this paper, we present the Marie Curie Framework Program 7 project “Wireless and wireline service convergence in next generation optical access networks” (WISCON), which focuses on the conception and study of novel architectures for wavelength-division-multiplexing (WDM) optical multi-modulation format...

  14. Mobile e-Learning for Next Generation Communication Environment

    Science.gov (United States)

    Wu, Tin-Yu; Chao, Han-Chieh

    2008-01-01

    This article develops an environment for mobile e-learning that includes an interactive course, virtual online labs, an interactive online test, and lab-exercise training platform on the fourth generation mobile communication system. The Next Generation Learning Environment (NeGL) promotes the term "knowledge economy." Inter-networking…

  15. Parallel Polarization State Generation.

    Science.gov (United States)

    She, Alan; Capasso, Federico

    2016-05-17

    The control of polarization, an essential property of light, is of wide scientific and technological interest. The general problem of generating arbitrary time-varying states of polarization (SOP) has always been mathematically formulated as a series of linear transformations, i.e. a product of matrices, imposing a serial architecture. Here we show a parallel architecture described by a sum of matrices. The theory is experimentally demonstrated by using a digital micromirror device to modulate spatially separated polarization components of a laser, which are subsequently beam-combined. This method greatly expands the parameter space for engineering devices that control polarization. Consequently, performance characteristics, such as speed, stability, and spectral range, are entirely dictated by the technologies of optical intensity modulation, including absorption, reflection, emission, and scattering. This opens up important prospects for polarization state generation (PSG) with unique performance characteristics, with applications in spectroscopic ellipsometry, spectropolarimetry, communications, imaging, and security.
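    The serial-versus-parallel distinction above is, in Jones calculus terms, a product of matrices versus a weighted sum of matrices acting on the same input state. A toy numerical sketch (matrix elements and weights are illustrative assumptions, not values from the paper):

```python
def matvec(M, v):
    """Apply a 2x2 Jones matrix to a Jones vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def weighted_sum(A, B, wa, wb):
    """Parallel architecture: weighted sum of two Jones matrices,
    as realized by modulating and recombining separate beam paths."""
    return [[wa * A[i][j] + wb * B[i][j] for j in range(2)] for i in range(2)]

H = [[1, 0], [0, 0]]                    # horizontal linear polarizer
Q = [[0.5 + 0.5j, 0.5 - 0.5j],
     [0.5 - 0.5j, 0.5 + 0.5j]]          # quarter-wave plate at 45 degrees

x = [1 / 2 ** 0.5, 1j / 2 ** 0.5]       # circularly polarized input state

serial = matvec(Q, matvec(H, x))                     # cascade: product of matrices
parallel = matvec(weighted_sum(H, Q, 0.6, 0.4), x)   # sum architecture
```

    By linearity, the summed-matrix output equals the weighted sum of the two individually transformed beams, which is exactly what beam combination after separate modulation provides.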

  16. Next generation of high-efficient waste incinerators. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Jappe Frandsen, F.

    2010-11-15

    Modern society produces increasing amounts of combustible waste which may be utilized for heat and power production at a lower emission of CO₂, e.g. by substituting a certain fraction of energy from fossil fuel-fired power stations. In 2007, 20.4% of the district heating and 4.5% of the power produced in Denmark came from thermal conversion of waste, and waste is a very important part of a future sustainable, and independent, Danish energy supply [Frandsen et al., 2009; Grøn Energi, 2010]. In Denmark, approximately 3.3 Mtons of waste was produced in 2005, an amount predicted to increase to 4.4 Mtons by the year 2030. According to Affald Danmark, 25% of the current WtE plant capacity in Denmark is older than 20 years, which is usually considered the technical and economical lifetime of WtE plants. Thus, there is a need for installation of a significant fraction of new waste incineration capacity, preferably with an increased electrical efficiency, within the next few years. Compared to fossil fuels, waste is difficult to handle in terms of pre-treatment, combustion, and generation of reusable solid residues. In particular, the content of inorganic species (S, Cl, K, Na, etc.) is problematic, due to enhanced deposition and corrosion - especially at higher temperatures. This puts severe constraints on the electrical efficiency of grate-fired units utilizing waste, which seldom exceeds 26-27%, compared to 46-48% for coal combustion in suspension. The key parameters when targeting higher electrical efficiency are the pressure and temperature in the steam cycle, which are limited by high-temperature corrosion and by boiler and combustion technology. This report reviews some of the means that can be applied to increase the electrical efficiency of plants firing waste on a grate. (Author)

  17. A Window Into Clinical Next-Generation Sequencing-Based Oncology Testing Practices.

    Science.gov (United States)

    Nagarajan, Rakesh; Bartley, Angela N; Bridge, Julia A; Jennings, Lawrence J; Kamel-Reid, Suzanne; Kim, Annette; Lazar, Alexander J; Lindeman, Neal I; Moncur, Joel; Rai, Alex J; Routbort, Mark J; Vasalos, Patricia; Merker, Jason D

    2017-12-01

    - Detection of acquired variants in cancer is a paradigm of precision medicine, yet little has been reported about clinical laboratory practices across a broad range of laboratories. - To use College of American Pathologists proficiency testing survey results to report on next-generation sequencing-based oncology testing practices. - College of American Pathologists proficiency testing survey results from more than 250 laboratories currently performing molecular oncology testing were used to determine laboratory trends in next-generation sequencing-based oncology testing. - These presented data provide key information about the number of laboratories that currently offer or are planning to offer next-generation sequencing-based oncology testing. Furthermore, we present data from 60 laboratories performing next-generation sequencing-based oncology testing regarding specimen requirements and assay characteristics. The findings indicate that most laboratories are performing tumor-only targeted sequencing to detect single-nucleotide variants and small insertions and deletions, using desktop sequencers and predesigned commercial kits. Despite these trends, a diversity of approaches to testing exists. - This information should be useful to further inform a variety of topics, including national discussions involving clinical laboratory quality systems, regulation and oversight of next-generation sequencing-based oncology testing, and precision oncology efforts in a data-driven manner.

  18. Next generation reactor development activity at Hitachi, Ltd

    International Nuclear Information System (INIS)

    Yamashita, Junichi

    2005-01-01

    Development of innovative nuclear systems in Japan is strongly requested to cope with an uncertain future for nuclear power generation and the fuel cycle. A next-generation reactor system must be deployed early enough to provide several options, such as plutonium multi-recycling, intermediate storage of spent fuel, simplified reprocessing of spent fuel with separated storage of 'Pu+FP' and 'U', spent fuel storage after Pu LWR recycling, and combinations of these, while a future reactor system will target an ideal fuel cycle with higher breeding gain and transmutation of radioactive wastes. Modified designs of the ABWR at large size and at medium and small size have been investigated, as well as a BWR-based RMWR and a supercritical-pressure LWR, to ensure safety and improve economics. Advanced fuel cycle technologies, comprising a combination of the fluoride volatility process and the PUREX process with high decontamination (FLUOREX process) and a modified fluoride volatility process with low decontamination, have been developed. (T. Tanaka)

  19. Public Outreach at RAL: Engaging the Next Generation of Scientists and Engineers

    Science.gov (United States)

    Corbett, G.; Ryall, G.; Palmer, S.; Collier, I. P.; Adams, J.; Appleyard, R.

    2015-12-01

    The Rutherford Appleton Laboratory (RAL) is part of the UK's Science and Technology Facilities Council (STFC). As part of the Royal Charter that established the STFC, the organisation is required to generate public awareness and encourage public engagement and dialogue in relation to the science undertaken. The staff at RAL firmly support this activity as it is important to encourage the next generation of students to consider studying Science, Technology, Engineering, and Mathematics (STEM) subjects, providing the UK with a highly skilled work-force in the future. To this end, the STFC undertakes a variety of outreach activities. This paper will describe the outreach activities undertaken by RAL, particularly focussing on those of the Scientific Computing Department (SCD). These activities include: an Arduino based activity day for 12-14 year-olds to celebrate Ada Lovelace day; running a centre as part of the Young Rewired State - encouraging 11-18 year-olds to create web applications with open data; sponsoring a team in the Engineering Education Scheme - supporting a small team of 16-17 year-olds to solve a real world engineering problem; as well as the more traditional tours of facilities. These activities could serve as an example for other sites involved in scientific computing around the globe.

  20. Composite Materials under Extreme Radiation and Temperature Environments of the Next Generation Nuclear Reactors

    International Nuclear Information System (INIS)

    Simos, N.

    2011-01-01

    In the nuclear energy renaissance, driven by fission reactor concepts utilizing very high temperatures and fast neutron spectra, materials with enhanced performance are expected to play a central role. With the operating temperatures of the Generation III reactors bringing the classical reactor materials close to their performance limits, there is an urgent need to develop and qualify new alloys and composites. Efforts have been focused on the intricate relations and the high demands placed on materials at the anticipated extreme states within the next generation fusion and fission reactors, which combine high radiation fluxes, elevated temperatures and aggressive environments. While nuclear reactors have been in operation for several decades, the structural materials associated with the next generation options need to endure much higher temperatures (1200 C), higher neutron doses (tens of displacements per atom, dpa), and extremely corrosive environments, which are beyond the experience on materials accumulated to date. The most important consideration is the performance and reliability of structural materials for both in-core and out-of-core functions. While there exists a great body of nuclear materials research and operating experience/performance from fission reactors, where epithermal and thermal neutrons interact with materials and alter their physico-mechanical properties - a process that is well understood by now - there are no operating or even experimental facilities that reproduce the extreme conditions of flux and temperature anticipated and thus provide insights into the behaviour of these materials. Materials, however, still need to be developed and their interaction and damage potential or lifetime to be quantified for the next generation nuclear energy. Based on material development advances, composites, and in particular ceramic composites, seem to inherently possess properties suitable for key functions within the

  1. Navigating the tip of the genomic iceberg: Next-generation sequencing for plant systematics.

    Science.gov (United States)

    Straub, Shannon C K; Parks, Matthew; Weitemier, Kevin; Fishbein, Mark; Cronn, Richard C; Liston, Aaron

    2012-02-01

    Just as Sanger sequencing did more than 20 years ago, next-generation sequencing (NGS) is poised to revolutionize plant systematics. By combining multiplexing approaches with NGS throughput, systematists may no longer need to choose between more taxa or more characters. Here we describe a genome skimming (shallow sequencing) approach for plant systematics. Through simulations, we evaluated optimal sequencing depth and performance of single-end and paired-end short read sequences for assembly of nuclear ribosomal DNA (rDNA) and plastomes and addressed the effect of divergence on reference-guided plastome assembly. We also used simulations to identify potential phylogenetic markers from low-copy nuclear loci at different sequencing depths. We demonstrated the utility of genome skimming through phylogenetic analysis of the Sonoran Desert clade (SDC) of Asclepias (Apocynaceae). Paired-end reads performed better than single-end reads. Minimum sequencing depths for high quality rDNA and plastome assemblies were 40× and 30×, respectively. Divergence from the reference significantly affected plastome assembly, but relatively similar references are available for most seed plants. Deeper rDNA sequencing is necessary to characterize intragenomic polymorphism. The low-copy fraction of the nuclear genome was readily surveyed, even at low sequencing depths. Nearly 160000 bp of sequence from three organelles provided evidence of phylogenetic incongruence in the SDC. Adoption of NGS will facilitate progress in plant systematics, as whole plastome and rDNA cistrons, partial mitochondrial genomes, and low-copy nuclear markers can now be efficiently obtained for molecular phylogenetics studies.
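The sequencing-depth thresholds reported above (40x for rDNA, 30x for plastomes) follow from the basic coverage arithmetic of genome skimming. A small sketch of that calculation (the run sizes are hypothetical, not from the study; the Poisson gap estimate is the standard Lander-Waterman approximation):

```python
import math

def expected_depth(n_reads, read_len, target_len):
    """Mean sequencing depth c = N * L / G (Lander-Waterman)."""
    return n_reads * read_len / target_len

def frac_uncovered(depth):
    """Poisson approximation: fraction of target bases with zero coverage."""
    return math.exp(-depth)

# Hypothetical skimming run: 50,000 100-bp read pairs mapping to a
# ~160 kb plastome (numbers are assumptions, for illustration only).
depth = expected_depth(50_000 * 2, 100, 160_000)
print(f"depth ~{depth:.1f}x, uncovered fraction ~{frac_uncovered(depth):.1e}")
```

Because plastids are present in high copy number per cell, even shallow total-DNA sequencing yields plastome depths far above these thresholds, which is what makes the skimming strategy economical.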

  2. Data Analysis and Next Generation Assessments

    Science.gov (United States)

    Pon, Kathy

    2013-01-01

    For the last decade, much of the work of California school administrators has been shaped by the accountability of the No Child Left Behind Act. Now as they stand at the precipice of Common Core Standards and next generation assessments, it is important to reflect on the proficiency educators have attained in using data to improve instruction and…

  3. Exploiting parallel R in the cloud with SPRINT.

    Science.gov (United States)

    Piotrowski, M; McGilvary, G A; Sloan, T M; Mewissen, M; Lloyd, A D; Forster, T; Mitchell, L; Ghazal, P; Hill, J

    2013-01-01

    Advances in DNA microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability, but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the cloud offers an affordable way of meeting this need. Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language, but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon's Elastic Compute Cloud (EC2); the advantages of submitting applications to EC2 from different parts of the world; and whether resource underutilization can improve application performance. The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. It is possible to obtain good, scalable performance, but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. The end-user's location impacts costs due to factors such as local taxation. Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds.
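SPRINT itself is an R package, but the parallel permutation testing it provides can be sketched in Python as a conceptual analogue: split the permutations into independent batches, run each batch in a worker process, and pool the null statistics. This is my own illustration of the pattern, not SPRINT's API.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def perm_batch(args):
    """Null statistics for one batch of label permutations (one worker)."""
    data, labels, n_perm, seed = args
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_perm):
        shuffled = rng.permutation(labels)
        stats.append(data[shuffled == 1].mean() - data[shuffled == 0].mean())
    return stats

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=200)                 # toy expression values
    labels = np.array([0] * 100 + [1] * 100)    # two sample groups
    observed = data[labels == 1].mean() - data[labels == 0].mean()
    # 4 workers x 250 permutations; batches are independent, so they scale
    # across cores (or cloud nodes) with no inter-worker communication.
    jobs = [(data, labels, 250, seed) for seed in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        null = [s for batch in pool.map(perm_batch, jobs) for s in batch]
    p = (np.sum(np.abs(null) >= abs(observed)) + 1) / (len(null) + 1)
    print(f"permutation p-value ~ {p:.3f}")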

  4. Achieving universal access to next generation networks

    DEFF Research Database (Denmark)

    Falch, Morten; Henten, Anders

    The paper examines investment dimensions of next generation networks in a universal service perspective in a European context. The question is how new network infrastructures for getting access to communication, information and entertainment services in the present and future information society...

  5. Educating the Next Generation of Lunar Scientists

    Science.gov (United States)

    Shaner, A. J.; Shipp, S. S.; Allen, J. S.; Kring, D. A.

    2010-12-01

    The Center for Lunar Science and Exploration (CLSE), a collaboration between the Lunar and Planetary Institute (LPI) and NASA’s Johnson Space Center (JSC), is one of seven member teams of the NASA Lunar Science Institute (NLSI). In addition to research and exploration activities, the CLSE team is deeply invested in education and outreach. In support of NASA’s and NLSI’s objective to train the next generation of scientists, CLSE’s High School Lunar Research Project is a conduit through which high school students can actively participate in lunar science and learn about pathways into scientific careers. The High School Lunar Research Project engages teams of high school students in authentic lunar research that envelopes them in the process of science and supports the science goals of the CLSE. Most high school students’ lack of scientific research experience leaves them without an understanding of science as a process. Because of this, each team is paired with a lunar scientist mentor responsible for guiding students through the process of conducting a scientific investigation. Before beginning their research, students undertake “Moon 101,” designed to familiarize them with lunar geology and exploration. Students read articles covering various lunar geology topics and analyze images from past and current lunar missions to become familiar with available lunar data sets. At the end of “Moon 101”, students present a characterization of the geology and chronology of features surrounding the Apollo 11 landing site. To begin their research, teams choose a research subject from a pool of topics compiled by the CLSE staff. After choosing a topic, student teams ask their own research questions, within the context of the larger question, and design their own research approach to direct their investigation. At the conclusion of their research, teams present their results and, after receiving feedback, create and present a conference style poster to a panel of

  6. Next-Generation Genomics Facility at C-CAMP: Accelerating Genomic Research in India

    Science.gov (United States)

    S, Chandana; Russiachand, Heikham; H, Pradeep; S, Shilpa; M, Ashwini; S, Sahana; B, Jayanth; Atla, Goutham; Jain, Smita; Arunkumar, Nandini; Gowda, Malali

    2014-01-01

    Next-Generation Sequencing (NGS; http://www.genome.gov/12513162) is a recent life-sciences technological revolution that allows scientists to decode genomes or transcriptomes at a much faster rate with a lower cost. Genomic-based studies are in a relatively slow pace in India due to the non-availability of genomics experts, trained personnel and dedicated service providers. Using NGS there is a lot of potential to study India's national diversity (of all kinds). We at the Centre for Cellular and Molecular Platforms (C-CAMP) have launched the Next Generation Genomics Facility (NGGF) to provide genomics service to scientists, to train researchers and also work on national and international genomic projects. We have HiSeq1000 from Illumina and GS-FLX Plus from Roche454. The long reads from GS FLX Plus, and high sequence depth from HiSeq1000, are the best and ideal hybrid approaches for de novo and re-sequencing of genomes and transcriptomes. At our facility, we have sequenced around 70 different organisms comprising of more than 388 genomes and 615 transcriptomes – prokaryotes and eukaryotes (fungi, plants and animals). In addition we have optimized other unique applications such as small RNA (miRNA, siRNA etc), long Mate-pair sequencing (2 to 20 Kb), Coding sequences (Exome), Methylome (ChIP-Seq), Restriction Mapping (RAD-Seq), Human Leukocyte Antigen (HLA) typing, mixed genomes (metagenomes) and target amplicons, etc. Translating DNA sequence data from NGS sequencer into meaningful information is an important exercise. Under NGGF, we have bioinformatics experts and high-end computing resources to dissect NGS data such as genome assembly and annotation, gene expression, target enrichment, variant calling (SSR or SNP), comparative analysis etc. 
Our services (sequencing and bioinformatics) have been utilized by more than 45 organizations (academia and industry) both within India and outside, resulting several publications in peer-reviewed journals and several genomic

  7. Next Generation Drivetrain Development and Test Program

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Jonathan; Erdman, Bill; Blodgett, Doug; Halse, Chris; Grider, Dave

    2015-11-03

    This presentation was given at the Wind Energy IQ conference in Bremen, Germany, November 30 through December 2, 2105. It focused on the next-generation drivetrain architecture and drivetrain technology development and testing (including gearbox and inverter software and medium-voltage inverter modules.

  8. A Next Generation Light Source Facility at LBNL

    International Nuclear Information System (INIS)

    Corlett, J.N.; Austin, B.; Baptiste, K.M.; Byrd, J.M.; Denes, P.; Donahue, R.; Doolittle, L.; Falcone, R.W.; Filippetto, D.; Fournier, S.; Li, D.; Padmore, H.A.; Papadopoulos, C.; Pappas, C.; Penn, G.; Placidi, M.; Prestemon, S.; Prosnitz, D.; Qiang, J.; Ratti, A.; Reinsch, M.; Sannibale, F.; Schlueter, R.; Schoenlein, R.W.; Staples, J.W.; Vecchione, T.; Venturini, M.; Wells, R.; Wilcox, R.; Wurtele, J.; Charman, A.; Kur, E.; Zholents, A.A.

    2011-01-01

    The Next Generation Light Source (NGLS) is a design concept, under development at LBNL, for a multibeamline soft x-ray FEL array powered by a ∼2 GeV superconducting linear accelerator, operating with a 1 MHz bunch repetition rate. The CW superconducting linear accelerator is supplied by a high-brightness, high-repetition-rate photocathode electron gun. Electron bunches are distributed from the linac to the array of independently configurable FEL beamlines with nominal bunch rates up to 100 kHz in each FEL, and with even pulse spacing. Individual FELs may be configured for EEHG, HGHG, SASE, or oscillator mode of operation, and will produce high peak and average brightness x-rays with a flexible pulse format, with pulse durations ranging from sub-femtoseconds to hundreds of femtoseconds.

  9. Heterogeneous next-generation wireless network interference model-and its applications

    KAUST Repository

    Mahmood, Nurul Huda

    2014-04-01

    Next-generation wireless systems facilitating better utilisation of the scarce radio spectrum have emerged as a response to inefficient and rigid spectrum assignment policies. These are comprised of intelligent radio nodes that opportunistically operate in the radio spectrum of existing primary systems, yet unwanted interference at the primary receivers is unavoidable. In order to design efficient next-generation systems and to minimise the adverse effect of their interference, it is necessary to realise how the resulting interference impacts the performance of the primary systems. In this work, a generalised framework for the interference analysis of such a next-generation system is presented where the nextgeneration transmitters may transmit randomly with different transmit powers. The analysis is built around a model developed for the statistical representation of the interference at the primary receivers, which is then used to evaluate various performance measures of the primary system. Applications of the derived interference model in designing the next-generation network system parameters are also demonstrated. Such approach provides a unified and generalised framework, the use of which allows a wide range of performance metrics can be evaluated. Findings of the analytical performance analyses are confirmed through extensive computer-based Monte-Carlo simulations. © 2012 John Wiley & Sons, Ltd.

  10. IceCube Gen2. The next-generation neutrino observatory for the South Pole

    Energy Technology Data Exchange (ETDEWEB)

    Santen, Jakob van [DESY, Zeuthen (Germany); Collaboration: IceCube-Collaboration

    2016-07-01

    The IceCube Neutrino Observatory is a cubic-kilometer Cherenkov telescope buried in the ice sheet at the South Pole that detects neutrinos of all flavors with energies from tens of GeV to several PeV. The instrument provided the first measurement of the flux of high-energy astrophysical neutrinos, opening a new window to the TeV universe. At the other end of its sensitivity range, IceCube has provided precision measurements of neutrino oscillation parameters that are competitive with dedicated accelerator-based experiments. Here we present design studies for IceCube Gen2, the next-generation neutrino observatory for the South Pole. Instrumenting a volume of more that 5 km{sup 3} with over 100 new strings, IceCube Gen2 will have substantially greater sensitivity to high-energy neutrinos than current-generation instruments. PINGU, a dense infill array, will lower the energy threshold of the inner detector region to 4 GeV, allowing a determination of the neutrino mass hierarchy. On the surface, a large air shower detector will veto high-energy atmospheric muons and neutrinos from the southern hemisphere, enhancing the reach of astrophysical neutrino searches. With its versatile instrumentation, the IceCube Gen2 facility will allow us to explore the neutrino sky with unprecedented sensitivity, providing new constraints on the sources of the highest-energy cosmic rays, and yield precision data on the mixing and mass ordering of neutrinos.

  11. The next generation PanDA Pilot for and beyond the ATLAS experiment

    CERN Document Server

    Nilsson, Paul; The ATLAS collaboration

    2018-01-01

    The Production and Distributed Analysis system (PanDA) is a pilot-based workload management system that was originally designed for the ATLAS Experiment at the LHC to operate on grid sites. Since the coming LHC data taking runs will require more resources than grid computing alone can provide, the various LHC experiments are engaged in an ambitious program to extend the computing model to include opportunistically used resources such as High Performance Computers (HPCs), clouds and volunteer computers. To this end, PanDA is being extended beyond grids and ATLAS to be used on the new types of resources as well as by other experiments. A new key component is being developed, the next generation PanDA Pilot (Pilot 2). Pilot 2 is a complete rewrite of the original PanDA Pilot which has been used in the ATLAS Experiment for over a decade. The new Pilot architecture follows a component-based approach which improves system flexibility, enables a clear workflow control, evolves the system according to modern function...

  12. Usefulness of Genetic Study by Next-generation Sequencing in High-risk Arrhythmogenic Cardiomyopathy.

    Science.gov (United States)

    Ruiz Salas, Amalio; Peña Hernández, José; Medina Palomo, Carmen; Barrera Cordero, Alberto; Cabrera Bueno, Fernando; García Pinilla, José Manuel; Guijarro, Ana; Morcillo-Hidalgo, Luis; Jiménez Navarro, Manuel; Gómez Doblas, Juan José; de Teresa, Eduardo; Alzueta, Javier

    2018-03-29

    Arrhythmogenic right ventricular cardiomyopathy (ARVC) is an inherited cardiomyopathy characterized by progressive fibrofatty replacement of predominantly right ventricular myocardium. This cardiomyopathy is a frequent cause of sudden cardiac death in young people and athletes. The aim of our study was to determine the incidence of pathological or likely pathological desmosomal mutations in patients with high-risk definite ARVC. This was an observational, retrospective cohort study, which included 36 patients diagnosed with high-risk ARVC in our hospital between January 1998 and January 2015. Genetic analysis was performed using next-generation sequencing. Most patients were male (28 patients, 78%) with a mean age at diagnosis of 45 ± 18 years. A pathogenic or probably pathogenic desmosomal mutation was detected in 26 of the 35 index cases (74%): 5 nonsense, 14 frameshift, 1 splice, and 6 missense. Novel mutations were found in 15 patients (71%). The presence or absence of desmosomal mutations causing the disease and the type of mutation were not associated with specific electrocardiographic, clinical, arrhythmic, anatomic, or prognostic characteristics. The incidence of pathological or likely pathological desmosomal mutations in ARVC is very high, with most mutations causing truncation. The presence of desmosomal mutations was not associated with prognosis. Copyright © 2017 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  13. Applications and Security of Next-Generation, User-Centric Wireless Systems

    Directory of Open Access Journals (Sweden)

    Danfeng Yao

    2010-07-01

    Full Text Available Pervasive wireless systems have significantly improved end-users’ quality of life. As manufacturing costs decrease, communications bandwidth increases, and contextual information is made more readily available, the role of next generation wireless systems in facilitating users’ daily activities will grow. Unique security and privacy issues exist in these wireless, context-aware, often decentralized systems. For example, the pervasive nature of such systems allows adversaries to launch stealthy attacks against them. In this review paper, we survey several emergent personal wireless systems and their applications. These systems include mobile social networks, active implantable medical devices, and consumer products. We explore each system’s usage of contextual information and provide insight into its security vulnerabilities. Where possible, we describe existing solutions for defendingagainst these vulnerabilities. Finally, we point out promising future research directions for improving these systems’ robustness and security

  14. Evaluating the 'next generation' of cell salvage--will it make a difference?

    Science.gov (United States)

    Yarham, Gemma; Clements, Ann; Oliver, Martin; Morris, Christopher; Cumberland, Tom; Bryan, Megan; Jekler, Sasa; Johns, Kathy; Mulholland, John

    2011-07-01

    Donor blood supplies are diminishing, becoming more costly and these transfusions lead to higher mortality in cardiac patients. The transfusion risks and the literature highlight the need for an alternative similar to cell salvage to be routinely considered. The Xtra is the first cell saver to be launched since 2001 and will undoubtedly initiate evolution towards the 'next generation' of cell savers. It is also the first to be launched in a new era where the demand for electronic perfusion data management (EPDM) has grown. The user interface (UI) was easy to use. The increased data entry options improved the quality of the recordable data. The integrated data management system (DMS) was comprehensive. Data was easy to manage and enabled central data compilation, which reduces repeated data, the risk of inconsistent data inventory and provides the potential for research and analyses. The haematocrit of the processed blood is a key quality indicator for cell salvage. The comparison of the manufacturer's integrated protocol, Popt, to our team's own protocol showed that Popt delivered a higher haematocrit on its '1st bowl' (59.1% compared to 57.3%) and its 'total process' end product haematocrit was 0.68% higher. The Popt cycle took an average of 330s, whereas our own settings completed in just over 300s. The Xtra is a device which will lead the evolution of 'next generation' cell saver technology. The user interface and data management system provide export options and the ability to record the level of data required for good EPDM. This is essential to 'future proof' cell salvage technology. The manufacturer's integrated protocol achieved a higher end product haematocrit than our perfusion team's best practice. The design of the Xtra is contemporary, but the DMS equips this cell saver for the new era that faces both Perfusion and Cardiac Surgery.

  15. 76 FR 49776 - The Development and Evaluation of Next-Generation Smallpox Vaccines; Public Workshop

    Science.gov (United States)

    2011-08-11

    ...] The Development and Evaluation of Next-Generation Smallpox Vaccines; Public Workshop AGENCY: Food and... Evaluation of Next-Generation Smallpox Vaccines.'' The purpose of the public workshop is to identify and discuss the key issues related to the development and evaluation of next-generation smallpox vaccines. The...

  16. Applications of nanotechnology, next generation sequencing and microarrays in biomedical research.

    Science.gov (United States)

    Elingaramil, Sauli; Li, Xiaolong; He, Nongyue

    2013-07-01

    Next-generation sequencing technologies, microarrays and advances in bio nanotechnology have had an enormous impact on research within a short time frame. This impact appears certain to increase further as many biomedical institutions are now acquiring these prevailing new technologies. Beyond conventional sampling of genome content, wide-ranging applications are rapidly evolving for next-generation sequencing, microarrays and nanotechnology. To date, these technologies have been applied in a variety of contexts, including whole-genome sequencing, targeted re sequencing and discovery of transcription factor binding sites, noncoding RNA expression profiling and molecular diagnostics. This paper thus discusses current applications of nanotechnology, next-generation sequencing technologies and microarrays in biomedical research and highlights the transforming potential these technologies offer.

  17. Next generation surveillance system (NGSS)

    International Nuclear Information System (INIS)

    Aparo, Massimo

    2006-01-01

    Development of 'functional requirements' for transparency systems may offer a near-term mode of regional cooperation. New requirements under development at the IAEA may provide a foundation for this potential activity. The Next Generation Surveillance System (NGSS) will become the new IAEA remote monitoring system Under new requirements the NGSS would attempt to use more commercial components to reduce cost, increase radiation survivability and further increase reliability. The NGSS must be available in two years due to rapidly approaching obsolescence in the existing DCM family. (author)

  18. A Next Generation Light Source Facility at LBNL

    Energy Technology Data Exchange (ETDEWEB)

    Corlett, J.N.; Austin, B.; Baptiste, K.M.; Byrd, J.M.; Denes, P.; Donahue, R.; Doolittle, L.; Falcone, R.W.; Filippetto, D.; Fournier, S.; Li, D.; Padmore, H.A.; Papadopoulos, C.; Pappas, C.; Penn, G.; Placidi, M.; Prestemon, S.; Prosnitz, D.; Qiang, J.; Ratti, A.; Reinsch, M.; Sannibale, F.; Schlueter, R.; Schoenlein, R.W.; Staples, J.W.; Vecchione, T.; Venturini, M.; Wells, R.; Wilcox, R.; Wurtele, J.; Charman, A.; Kur, E.; Zholents, A.A.

    2011-03-23

    The Next Generation Light Source (NGLS) is a design concept, under development at LBNL, for a multibeamline soft x-ray FEL array powered by a ~;;2 GeV superconducting linear accelerator, operating with a 1 MHz bunch repetition rate. The CW superconducting linear accelerator is supplied by a high-brightness, highrepetition- rate photocathode electron gun. Electron bunches are distributed from the linac to the array of independently configurable FEL beamlines with nominal bunch rates up to 100 kHz in each FEL, and with even pulse spacing. Individual FELs may be configured for EEHG, HGHG, SASE, or oscillator mode of operation, and will produce high peak and average brightness x-rays with a flexible pulse format, with pulse durations ranging from sub-femtoseconds to hundreds of femtoseconds.

  19. High performance parallel I/O

    CERN Document Server

    Prabhat

    2014-01-01

    Gain Critical Insight into the Parallel I/O EcosystemParallel I/O is an integral component of modern high performance computing (HPC), especially in storing and processing very large datasets to facilitate scientific discovery. Revealing the state of the art in this field, High Performance Parallel I/O draws on insights from leading practitioners, researchers, software architects, developers, and scientists who shed light on the parallel I/O ecosystem.The first part of the book explains how large-scale HPC facilities scope, configure, and operate systems, with an emphasis on choices of I/O har

  20. Detecting exact breakpoints of deletions with diversity in hepatitis B viral genomic DNA from next-generation sequencing data.

    Science.gov (United States)

    Cheng, Ji-Hong; Liu, Wen-Chun; Chang, Ting-Tsung; Hsieh, Sun-Yuan; Tseng, Vincent S

    2017-10-01

    Many studies have suggested that deletions in hepatitis B virus (HBV) genomes are associated with the development of progressive liver diseases, even ultimately resulting in hepatocellular carcinoma (HCC). Among the methods for detecting deletions from next-generation sequencing (NGS) data, few consider the characteristics of viruses, such as high evolution rates and high divergence among different HBV genomes. Sequencing highly divergent HBV genomes with NGS technology yields millions of reads, so detecting exact breakpoints of deletions from these big and complex data incurs very high computational cost. We proposed a novel analytical method named VirDelect (Virus Deletion Detect), which uses split-read alignment to detect exact breakpoints and a diversity variable to account for high divergence in single-end read data, such that the computational cost can be reduced without losing accuracy. We use four simulated read datasets and two real paired-end read datasets of HBV genome sequences to verify VirDelect's accuracy by score functions. The experimental results show that VirDelect outperforms the state-of-the-art method Pindel in terms of accuracy score for all simulated datasets, and VirDelect had only two base errors even in real datasets. VirDelect is also shown to deliver high accuracy in analyzing single-end read data as well as paired-end data. VirDelect can serve as an accurate and efficient bioinformatics tool for physiologists, applicable to further analysis of viruses with characteristics similar to HBV in genome length and high divergence. The software program of VirDelect can be downloaded at https://sourceforge.net/projects/virdelect/. Copyright © 2017. Published by Elsevier Inc.
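    The split-read idea behind this kind of breakpoint detection can be illustrated with a toy sketch (not the published VirDelect algorithm): a read whose prefix matches the reference up to some position, and whose remaining suffix re-anchors further downstream, spans a deletion, and the two anchor points are its breakpoints. The helper below is hypothetical; for simplicity it assumes the read aligns at reference position 0, exact matches, and a single deletion.

    ```python
    def find_deletion_breakpoints(reference, read):
        """Locate a deletion by split-read alignment.

        The longest read prefix matching the reference gives the left
        breakpoint; re-anchoring the remaining suffix downstream gives the
        right breakpoint. Returns (left, right) as 0-based reference
        coordinates of the deleted interval, or None if no deletion is seen.
        """
        # Longest contiguous prefix of the read matching the reference.
        p = 0
        while p < len(read) and p < len(reference) and read[p] == reference[p]:
            p += 1
        suffix = read[p:]
        if not suffix:
            return None  # read matches entirely; no deletion evidence
        # Re-anchor the unmatched suffix further downstream in the reference.
        right = reference.find(suffix, p)
        if right <= p:
            return None  # suffix not found downstream; not a simple deletion
        return (p, right)  # reference[p:right] is absent from the sample
    ```

    A real tool must additionally tolerate mismatches (the divergence the abstract emphasizes) and ambiguity when sequence flanking the deletion is repeated (microhomology); this exact-match sketch ignores both.
    
    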

  1. Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations

    Science.gov (United States)

    Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy

    2011-01-01

    This paper reviews three Next Generation Air Transportation System (NextGen) based high fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high altitude environment utilizing high altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.

  2. Massive Parallelism of Monte-Carlo Simulation on Low-End Hardware using Graphic Processing Units

    International Nuclear Information System (INIS)

    Mburu, Joe Mwangi; Hah, Chang Joo Hah

    2014-01-01

    Within the past decade, research has been done on utilizing massive GPU parallelization in core simulation with impressive results, but unfortunately there has been little commercial application in the nuclear field, especially in reactor core simulation. The purpose of this paper is to give an introductory concept of the topic and illustrate the potential of exploiting the massively parallel nature of GPU computing on a simple Monte Carlo simulation with very minimal hardware specifications. For a comparative analysis, a simple two-dimensional Monte Carlo simulation is implemented for both the CPU and GPU in order to evaluate performance gain across the computing devices. The heterogeneous platform utilized in this analysis is a slow notebook with only a 1 GHz processor. The end results are quite surprising, with speedups of almost a factor of 10. In this work, we have utilized heterogeneous computing in a GPU-based approach to a potentially arithmetic-intensive calculation. By running a complex Monte Carlo simulation on the GPU platform, we sped up the computational process by almost a factor of 10 for one million neutrons. This shows how easy, cheap and efficient it is to use GPUs to accelerate scientific computing, and the results should encourage further exploration of this avenue, especially in nuclear reactor physics simulation, where deterministic and stochastic calculations are well suited to parallelization

  3. Massive Parallelism of Monte-Carlo Simulation on Low-End Hardware using Graphic Processing Units

    Energy Technology Data Exchange (ETDEWEB)

    Mburu, Joe Mwangi; Hah, Chang Joo Hah [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-05-15

    Within the past decade, research has been done on utilizing massive GPU parallelization in core simulation with impressive results, but unfortunately there has been little commercial application in the nuclear field, especially in reactor core simulation. The purpose of this paper is to give an introductory concept of the topic and illustrate the potential of exploiting the massively parallel nature of GPU computing on a simple Monte Carlo simulation with very minimal hardware specifications. For a comparative analysis, a simple two-dimensional Monte Carlo simulation is implemented for both the CPU and GPU in order to evaluate performance gain across the computing devices. The heterogeneous platform utilized in this analysis is a slow notebook with only a 1 GHz processor. The end results are quite surprising, with speedups of almost a factor of 10. In this work, we have utilized heterogeneous computing in a GPU-based approach to a potentially arithmetic-intensive calculation. By running a complex Monte Carlo simulation on the GPU platform, we sped up the computational process by almost a factor of 10 for one million neutrons. This shows how easy, cheap and efficient it is to use GPUs to accelerate scientific computing, and the results should encourage further exploration of this avenue, especially in nuclear reactor physics simulation, where deterministic and stochastic calculations are well suited to parallelization.
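    The speedup reported in these two records comes from the fact that Monte Carlo histories are independent and can be evaluated in parallel. As a rough illustration (not the authors' code, and using NumPy vectorization as a stand-in for GPU threads), the sketch below contrasts a scalar loop with a data-parallel version of a simple two-dimensional Monte Carlo estimate of π:

    ```python
    import numpy as np

    def pi_serial(n, rng):
        # One sample at a time: the CPU-style scalar loop.
        hits = 0
        for _ in range(n):
            x, y = rng.random(), rng.random()
            if x * x + y * y <= 1.0:
                hits += 1
        return 4.0 * hits / n

    def pi_vectorized(n, rng):
        # All samples evaluated at once: mirrors mapping independent
        # histories onto many GPU threads.
        x = rng.random(n)
        y = rng.random(n)
        return 4.0 * np.count_nonzero(x * x + y * y <= 1.0) / n
    ```

    On a GPU each sample would map to a thread; the vectorized form shows the same "one instruction, many data points" structure that yields the order-of-magnitude speedups the records describe, though actual neutron transport kernels are far more involved.
    
    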

  4. Next Generation Driver for Attosecond and Laser-plasma Physics.

    Science.gov (United States)

    Rivas, D E; Borot, A; Cardenas, D E; Marcus, G; Gu, X; Herrmann, D; Xu, J; Tan, J; Kormin, D; Ma, G; Dallari, W; Tsakiris, G D; Földes, I B; Chou, S-W; Weidman, M; Bergues, B; Wittmann, T; Schröder, H; Tzallas, P; Charalambidis, D; Razskazovskaya, O; Pervak, V; Krausz, F; Veisz, L

    2017-07-12

    The observation and manipulation of electron dynamics in matter call for attosecond light pulses, routinely available from high-order harmonic generation driven by few-femtosecond lasers. However, the energy limitation of these lasers supports only weak sources and correspondingly linear attosecond studies. Here we report on an optical parametric synthesizer designed for nonlinear attosecond optics and relativistic laser-plasma physics. This synthesizer uniquely combines ultra-relativistic focused intensities of about 10 20  W/cm 2 with a pulse duration of sub-two carrier-wave cycles. The coherent combination of two sequentially amplified and complementary spectral ranges yields sub-5-fs pulses with multi-TW peak power. The application of this source allows the generation of a broad spectral continuum at 100-eV photon energy in gases as well as high-order harmonics in relativistic plasmas. Unprecedented spatio-temporal confinement of light now permits the investigation of electric-field-driven electron phenomena in the relativistic regime and ultimately the rise of next-generation intense isolated attosecond sources.

  5. Next Generation Geothermal Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Brugman, John; Hattar, Mai; Nichols, Kenneth; Esaki, Yuri

    1995-09-01

    cycle. Results of this study indicate that dual flash type plants are preferred at resources with temperatures above 400 F. Closed loop (binary type) plants are preferred at resources with temperatures below 400 F. A rotary separator turbine upstream of a dual flash plant can be beneficial at Salton Sea, the hottest resource, or at high temperature resources where there is a significant variance in wellhead pressures from well to well. Full scale demonstration is required to verify cost and performance. Hot water turbines that recover energy from the spent brine in a dual flash cycle improve that cycle's brine efficiency. Prototype field tests of this technology have established its technical feasibility. If natural gas prices remain low, a combustion turbine/binary hybrid is an economic option for the lowest temperature sites. The use of mixed fluids appears to be an attractive low-risk option. The synchronous turbine option as prepared by Barber-Nichols is attractive but requires a pilot test to prove cost and performance. Dual flash binary bottoming cycles appear promising provided that scaling of the brine/working fluid exchangers is controllable. Metastable expansion, reheater, subatmospheric flash, dual flash backpressure turbine, and hot dry rock concepts do not seem to offer any cost advantage over the baseline technologies. If implemented, the next generation geothermal power plant concept may improve brine utilization but is unlikely to reduce the cost of power generation by much more than 10%. Colder resources will benefit more from the development of a next generation geothermal power plant than will hotter resources. All values presented in this study for plant cost and for busbar cost of power are relative numbers intended to allow an objective and meaningful comparison of technologies. The goal of this study is to assess various technologies on a common basis and, secondarily, to give an approximate idea of the current costs of the technologies at

  6. Optimizing the next generation optical access networks

    DEFF Research Database (Denmark)

    Amaya Fernández, Ferney Orlando; Soto, Ana Cardenas; Tafur Monroy, Idelfonso

    2009-01-01

    Several issues in the design and optimization of the next generation optical access network (NG-OAN) are presented. The noise, the distortion and the fiber optic nonlinearities are considered to optimize the video distribution link in a passive optical network (PON). A discussion of the effect...

  7. IPv6: The Next Generation Internet Protocol

    Indian Academy of Sciences (India)

    IPv6: The Next Generation Internet Protocol - IPv4 and its Shortcomings. Harsha Srinath. General Article, Resonance – Journal of Science Education, Volume 8, Issue 3, March 2003, pp. 33-41.

  8. IPv6: The Next Generation Internet Protocol

    Indian Academy of Sciences (India)

    IPv6: The Next Generation Internet Protocol - New Features in IPv6. Harsha Srinath. General Article, Resonance – Journal of Science Education, Volume 8, Issue 4, April 2003, pp. 8-16.

  9. Next Generation Safeguards Initiative: Human Capital Development

    International Nuclear Information System (INIS)

    Scholz, M.; Irola, G.; Glynn, K.

    2015-01-01

    Since 2008, the Human Capital Development (HCD) subprogramme of the U.S. National Nuclear Security Administration's (NNSA) Next Generation Safeguards Initiative (NGSI) has supported the recruitment, education, training, and retention of the next generation of international safeguards professionals to meet the needs of both the International Atomic Energy Agency (IAEA) and the United States. Specifically, HCD's efforts respond to data indicating that 82% of safeguards experts at U.S. Laboratories will have left the workforce within 15 years. This paper provides an update on the status of the subprogramme since its last presentation at the IAEA Safeguards Symposium in 2010. It highlights strengthened, integrated efforts in the areas of graduate and post-doctoral fellowships, young and midcareer professional support, short safeguards courses, and university engagement. It also discusses lessons learned from the U.S. experience in safeguards education and training as well as the importance of long-range strategies to develop a cohesive, effective, and efficient human capital development approach. (author)

  10. Next Generation Integrated Environment for Collaborative Work Across Internets

    Energy Technology Data Exchange (ETDEWEB)

    Harvey B. Newman

    2009-02-24

    We are now well-advanced in our development, prototyping and deployment of a high performance next generation Integrated Environment for Collaborative Work. The system, aimed at using the capability of ESnet and Internet2 for rapid data exchange, is based on the Virtual Room Videoconferencing System (VRVS) developed by Caltech. The VRVS system has been chosen by the Internet2 Digital Video (I2-DV) Initiative as a preferred foundation for the development of advanced video, audio and multimedia collaborative applications by the Internet2 community. Today, the system supports high-end, broadcast-quality interactivity while enabling a wide variety of clients (Mbone, H.323) to participate in the same conference by running different standard protocols in different contexts with different bandwidth connection limitations. It has a fully Web-integrated user interface, developer and administrative APIs, a widely scalable video network topology based on both multicast domains and unicast tunnels, and demonstrated multiplatform support. This has led to its rapidly expanding production use for national and international scientific collaborations in more than 60 countries. We are also in the process of creating a 'testbed video network' and developing the necessary middleware to support a set of new and essential requirements for rapid data exchange, and a high level of interactivity in large-scale scientific collaborations. These include a set of tunable, scalable differentiated network services adapted to each of the data streams associated with a large number of collaborative sessions, policy-based and network state-based resource scheduling, authentication, and optional encryption to maintain confidentiality of inter-personal communications. High performance testbed video networks will be established in ESnet and Internet2 to test and tune the implementation, using a few target application-sets.

  11. Next Generation Sequencing of Ancient DNA: Requirements, Strategies and Perspectives

    Directory of Open Access Journals (Sweden)

    Michael Knapp

    2010-07-01

    The invention of next-generation sequencing has revolutionized almost all fields of genetics, but few have profited from it as much as the field of ancient DNA research. From its beginnings as an interesting but rather marginal discipline, ancient DNA research is now on its way into the centre of evolutionary biology. In less than a year from its invention, next-generation sequencing had increased the amount of DNA sequence data available from extinct organisms by several orders of magnitude. Ancient DNA research is now not only adding a temporal aspect to evolutionary studies and allowing for the observation of evolution in real time, it also provides important data to help understand the origins of our own species. Here we review progress that has been made in next-generation sequencing of ancient DNA over the past five years and evaluate sequencing strategies and future directions.

  12. NOAA Next Generation Radar (NEXRAD) Level 3 Products

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of Level 3 weather radar products collected from Next-Generation Radar (NEXRAD) stations located in the contiguous United States, Alaska,...

  13. Next Generation CANDU: Conceptual Design for a Short Construction Schedule

    International Nuclear Information System (INIS)

    Hopwood, Jerry M.; Love, Ian J.W.; Elgohary, Medhat; Fairclough, Neville

    2002-01-01

    Atomic Energy of Canada Ltd. (AECL) has very successful experience in implementing new construction methods at the Qinshan (Phase III) twin unit CANDU 6 plant in China. This paper examines the construction method that must be implemented during the conceptual design phase of a project if short construction schedules are to be met. A project schedule of 48 months has been developed for the nth unit of NG (Next Generation) CANDU, with a 42 month construction period from first concrete to in-service. An overall construction strategy has been developed involving paralleling project activities that are normally conducted in series. Many parts of the plant will be fabricated as modules and be installed using heavy lift cranes. The Reactor Building (RB), being on the critical path, has been the focus of considerable assessment, looking at alternative ways of applying the construction strategy to this building. A construction method has been chosen which will result in excess of 80% of internal work being completed as modules or as very streamlined traditional construction. This method is being further evaluated as the detailed layout proceeds. Other areas of the plant have been integrated into the schedule and new construction methods are being applied to these so that further modularization and even greater paralleling of activities will be achieved. It is concluded that the optimized construction method is a requirement which must be implemented through all phases of design to make a 42 month construction schedule a reality. If the construction methods are appropriately chosen, the schedule reductions achieved will make nuclear more competitive. (authors)

  14. Next generation sensors and systems

    CERN Document Server

    2016-01-01

    Written by experts in their areas of research, this book outlines the current status of the fundamentals and analytical concepts, modelling and design issues, technical details, and practical applications of different types of sensors, and discusses trends for the next generation of sensors and systems in the area of sensing technology. It will be useful as a reference for engineers and scientists, especially post-graduate students conducting research on wearable sensors, devices, and technologies.

  15. (U) Ristra Next Generation Code Report

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-22

    LANL’s Weapons Physics management (ADX) and ASC program office have defined a strategy for exascale-class application codes that follows two supportive, and mutually risk-mitigating paths: evolution for established codes (with a strong pedigree within the user community) based upon existing programming paradigms (MPI+X); and Ristra (formerly known as NGC), a high-risk/high-reward push for a next-generation multi-physics, multi-scale simulation toolkit based on emerging advanced programming systems (with an initial focus on data-flow task-based models exemplified by Legion [5]). Development along these paths is supported by the ATDM, IC, and CSSE elements of the ASC program, with the resulting codes forming a common ecosystem, and with algorithm and code exchange between them anticipated. Furthermore, solution of some of the more challenging problems of the future will require a federation of codes working together, using established-pedigree codes in partnership with new capabilities as they come on line. The role of Ristra as the high-risk/high-reward path for LANL’s codes is fully consistent with its role in the Advanced Technology Development and Mitigation (ATDM) sub-program of ASC (see Appendix C), in particular its emphasis on evolving ASC capabilities through novel programming models and data management technologies.

  16. Diagnostics of Primary Immunodeficiencies through Next Generation Sequencing

    Directory of Open Access Journals (Sweden)

    Vera Gallo

    2016-11-01

    Background: Recently, a growing number of novel genetic defects underlying primary immunodeficiencies (PID) have been identified, increasing the number of PIDs to more than 250 well-defined forms. Next-generation sequencing (NGS) technologies and proper filtering strategies have greatly contributed to this rapid evolution, providing the possibility to rapidly and simultaneously analyze large numbers of genes or the whole exome. Objective: To evaluate the role of targeted next-generation sequencing and whole exome sequencing in the diagnosis of a case series characterized by complex or atypical clinical features suggesting a PID that is difficult to diagnose using current diagnostic procedures. Methods: We retrospectively analyzed genetic variants identified through targeted next-generation sequencing or whole exome sequencing in 45 patients with complex PID of unknown etiology. Results: 40 variants were identified using targeted next-generation sequencing, while 5 were identified using whole exome sequencing. Newly identified genetic variants were classified into 4 groups: (I) variations associated with a well-defined PID; (II) variations associated with atypical features of a well-defined PID; (III) functionally relevant variations potentially involved in the immunological features; (IV) non-diagnostic genotype, in which the link with the phenotype is missing. We reached a conclusive genetic diagnosis in 7/45 patients (~16%). Among them, 4 patients presented with a typical well-defined PID. In the remaining 3 cases, mutations were associated with unexpected clinical features, expanding the phenotypic spectrum of typical PIDs. In addition, we identified 31 variants in 10 patients with complex phenotypes, individually not causative per se of the disorder. Conclusion: NGS technologies represent a cost-effective and rapid first-line genetic approach for the evaluation of complex PIDs. Whole exome sequencing, despite a moderately higher cost compared to targeted, is

  17. Next-generation approaches to the microbial ecology of food fermentations

    Directory of Open Access Journals (Sweden)

    Nicholas A. Bokulich; David A. Mills

    2012-07-01

    Food fermentations have enhanced human health since the dawn of time and remain a prevalent means of food processing and preservation. Due to their cultural and nutritional importance, many of these foods have been studied in detail using molecular tools, leading to enhancements in quality and safety. Furthermore, recent advances in high-throughput sequencing technology are revolutionizing the study of food microbial ecology, deepening insight into complex fermentation systems. This review provides insight into novel applications of select molecular techniques, particularly next-generation sequencing technology, for analysis of microbial communities in fermented foods. We present a guideline for integrated molecular analysis of food microbial ecology and a starting point for implementing next-generation analysis of food systems.

  18. Scalable Multicasting over Next-Generation Internet Design, Analysis and Applications

    CERN Document Server

    Tian, Xiaohua

    2013-01-01

    Next-generation Internet providers face high expectations, as contemporary users worldwide expect high-quality multimedia functionality in a landscape of ever-expanding network applications. This volume explores the critical research issue of turning today’s greatly enhanced hardware capacity to good use in designing a scalable multicast  protocol for supporting large-scale multimedia services. Linking new hardware to improved performance in the Internet’s next incarnation is a research hot-spot in the computer communications field.   The methodical presentation deals with the key questions in turn: from the mechanics of multicast protocols to current state-of-the-art designs, and from methods of theoretical analysis of these protocols to applying them in the ns2 network simulator, known for being hard to extend. The authors’ years of research in the field inform this thorough treatment, which covers details such as applying AOM (application-oriented multicast) protocol to IPTV provision and resolving...

  19. Next-Generation Pathology.

    Science.gov (United States)

    Caie, Peter D; Harrison, David J

    2016-01-01

    The field of pathology is rapidly transforming from a semiquantitative and empirical science toward a big data discipline. Large data sets from across multiple omics fields may now be extracted from a patient's tissue sample. Tissue is, however, complex, heterogeneous, and prone to artifact. A reductionist view of tissue and disease progression, which does not take this complexity into account, may lead to single biomarkers failing in clinical trials. The integration of standardized multi-omics big data and the retention of valuable information on spatial heterogeneity are imperative to model complex disease mechanisms. Mathematical modeling through systems pathology approaches is the ideal medium to distill the significant information from these large, multi-parametric, and hierarchical data sets. Systems pathology may also predict the dynamical response of disease progression or response to therapy regimens from a static tissue sample. Next-generation pathology will incorporate big data with systems medicine in order to personalize clinical practice for both prognostic and predictive patient care.

  20. Potential of OFDM for next generation optical access

    Science.gov (United States)

    Fritzsche, Daniel; Weis, Erik; Breuer, Dirk

    2011-01-01

    This paper shows the requirements for next generation optical access (NGOA) networks and analyzes the potential of OFDM (orthogonal frequency division multiplexing) for use in such network scenarios. First, we show the motivation for NGOA systems based on the future requirements on FTTH access systems and list the advantages of OFDM in such scenarios. Next, the basics of OFDM and different methods to generate and detect optical OFDM signals are explained and analyzed. At the transmitter side the options include intensity modulation and the more advanced field modulation of the optical OFDM signal. At the receiver there is a choice between direct detection and coherent detection. As a result of this discussion, we show our vision of the future use of OFDM in optical access networks.
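    The orthogonal-subcarrier principle underlying OFDM can be sketched in a few lines: data symbols are placed on orthogonal subcarriers with an inverse FFT, a cyclic prefix is prepended, and the receiver recovers the symbols with a forward FFT. This is a minimal baseband model, not an NGOA system design; `cp_len` and the 64-subcarrier block size below are illustrative choices.

    ```python
    import numpy as np

    def ofdm_modulate(symbols, cp_len):
        """Map one block of complex (e.g. QPSK) symbols onto orthogonal
        subcarriers with an IFFT and prepend a cyclic prefix."""
        time_signal = np.fft.ifft(symbols)
        return np.concatenate([time_signal[-cp_len:], time_signal])

    def ofdm_demodulate(samples, cp_len):
        """Strip the cyclic prefix and recover the subcarrier symbols
        with a forward FFT."""
        return np.fft.fft(samples[cp_len:])
    ```

    In this ideal model a QPSK block pushed through modulate/demodulate comes back unchanged; fiber dispersion, noise, and the intensity- versus field-modulation and detection choices discussed in the paper all act between the two steps.
    
    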

  1. Power Electronics for the Next Generation Wind Turbine System

    DEFF Research Database (Denmark)

    Ma, Ke

    The wind power generation has been steadily growing both for the total installed capacity and for the individual turbine size. Due to much more significant impacts to the power grid, the power electronics, which can change the behavior of wind turbines from an unregulated power source to an active generation unit, are becoming crucial in the wind turbine system. The objective of this project is to study the power electronics technology used for the next generation of wind turbines. Some emerging challenges as well as potentials, like the cost of energy and reliability, are going to be addressed. First, ... conversion is pushed to the multi-MW level with high power density requirements. It has also been revealed that thermal stress in the power semiconductors is closely related to many determining factors in the wind power application, like reliability, cost, power density, etc.; therefore it is an important ...

  2. Neutronics activities for next generation devices

    International Nuclear Information System (INIS)

    Gohar, Y.

    1985-01-01

    Neutronic activities for the next generation devices are the subject of this paper. The main activities include TFCX and FPD blanket/shield studies, neutronic aspects of ETR/INTOR critical issues, and neutronics computational modules for the tokamak system code and tandem mirror reactor system code. Trade-off analyses, optimization studies, design problem investigations and computational models development for reactor parametric studies carried out for these activities are summarized

  3. Next Generation Germanium Systems for Safeguards Applications

    International Nuclear Information System (INIS)

    Dreyer, J.; Burks, M.; Hull, E.

    2015-01-01

    We are developing the latest generation of highly portable, mechanically cooled germanium systems for safeguards applications. In collaboration with our industrial partner, Ph.D.s Co, we have developed the Germanium Gamma Ray Imager (GeGI), an imager with a 2π field of view. This instrument has been thoroughly field tested in a wide range of environments and has performed reliably even in the harshest conditions. The imaging capability of GeGI complements existing safeguards techniques by allowing for the spatial detection, identification, and characterization of nuclear material. Additionally, imaging can be used in design information verification activities to address potential material diversions. Measurements conducted at the Paducah Gaseous Diffusion Plant highlight the advantages this instrument offers in the identification and localization of LEU, HEU and Pu holdup. GeGI has also been deployed to the Savannah River Site for the measurement of radioactive waste canisters, providing information valuable for waste characterization and inventory accountancy. Measuring 30 x 15 x 23 cm and weighing approximately 15 kg, this instrument is the first portable germanium-based imager. GeGI offers high reliability with the convenience of mechanical cooling, making this instrument ideal for the next generation of safeguards instrumentation. (author)

  4. THE TRAINING OF NEXT GENERATION DATA SCIENTISTS IN BIOMEDICINE.

    Science.gov (United States)

    Garmire, Lana X; Gliske, Stephen; Nguyen, Quynh C; Chen, Jonathan H; Nemati, Shamim; VAN Horn, John D; Moore, Jason H; Shreffler, Carol; Dunn, Michelle

    2017-01-01

    With the booming of new technologies, biomedical science has transformed into a digitalized, data-intensive science. Massive amounts of data need to be analyzed and interpreted, demanding a complete pipeline to train the next generation of data scientists. To meet this need, the trans-institutional Big Data to Knowledge (BD2K) Initiative has been implemented since 2014, complementing other NIH institutional efforts. In this report, we give an overview of the BD2K K01 mentored scientist career awards, which have demonstrated early success. We address the specific training needed in representative data science areas in order to produce the next generation of data scientists in biomedicine.

  5. TALE nucleases and next generation GM crops.

    KAUST Repository

    Mahfouz, Magdy M.

    2011-04-01

    Site-specific and adaptable DNA binding domains are essential modules for developing genome engineering technologies for crop improvement. Transcription activator-like effector (TALE) proteins provide highly specific and adaptable DNA binding modules. TALE chimeric nucleases (TALENs) have been used to generate site-specific double strand breaks (DSBs) in vitro and in yeast, Caenorhabditis elegans, mammalian and plant cells. These genomic DSBs can be generated at predefined, user-selected loci and repaired by either non-homologous end joining (NHEJ) or homology-dependent repair (HDR). Thus, TALENs can be used to achieve site-specific gene addition, stacking, deletion or inactivation. TALE-based genome engineering tools should prove powerful for developing new agricultural biotechnology approaches to crop improvement. Here, we discuss recent research and the potential applications of TALENs to accelerate the generation of genomic variants through targeted mutagenesis and to produce non-transgenic GM crops with desired phenotypes.

  6. A two-stage flow-based intrusion detection model for next-generation networks.

    Science.gov (United States)

    Umer, Muhammad Fahad; Sher, Muhammad; Bi, Yaxin

    2018-01-01

    The next-generation network provides state-of-the-art access-independent services over converged mobile and fixed networks. Security in the converged network environment is a major challenge. Traditional packet and protocol-based intrusion detection techniques cannot be used in next-generation networks due to slow throughput, low accuracy and their inability to inspect encrypted payload. An alternative solution for protection of next-generation networks is to use network flow records for detection of malicious activity in the network traffic. The network flow records are independent of access networks and user applications. In this paper, we propose a two-stage flow-based intrusion detection system for next-generation networks. The first stage uses an enhanced unsupervised one-class support vector machine which separates malicious flows from normal network traffic. The second stage uses a self-organizing map which automatically groups malicious flows into different alert clusters. We validated the proposed approach on two flow-based datasets and obtained promising results.
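The two-stage structure described above can be sketched in miniature. This is a toy illustration only: a distance-from-centroid rule stands in for the one-class SVM of stage one, and nearest-prototype grouping stands in for the self-organizing map of stage two; the flow features, centroid, radius, and prototypes are all invented for the example.

```python
# Toy sketch of a two-stage flow-based intrusion detection pipeline.
# Stage 1: separate malicious flows from normal traffic (stand-in for
# a one-class SVM). Stage 2: group malicious flows into alert clusters
# (stand-in for a self-organizing map).

def stage1_separate(flows, center, radius):
    """Flag flows far from the 'normal' centroid as malicious."""
    def dist(f):
        return sum((a - b) ** 2 for a, b in zip(f, center)) ** 0.5
    normal, malicious = [], []
    for f in flows:
        (malicious if dist(f) > radius else normal).append(f)
    return normal, malicious

def stage2_cluster(malicious, prototypes):
    """Assign each malicious flow to its nearest alert prototype."""
    clusters = {i: [] for i in range(len(prototypes))}
    for f in malicious:
        best = min(range(len(prototypes)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(f, prototypes[i])))
        clusters[best].append(f)
    return clusters

# Flows as (bytes_per_packet, duration) pairs, scaled to [0, 1].
flows = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.9), (0.85, 0.95), (0.1, 0.9)]
normal, bad = stage1_separate(flows, center=(0.12, 0.22), radius=0.3)
alerts = stage2_cluster(bad, prototypes=[(0.9, 0.9), (0.1, 0.9)])
```

The two prototypes play the role of the alert clusters that the self-organizing map would learn automatically in the paper's system.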

  7. Detection of mobile user location on next generation wireless networks

    DEFF Research Database (Denmark)

    Schou, Saowanee; Olesen, Henning

    2005-01-01

    This paper proposes a novel conceptual mechanism for detecting the location of a mobile user on next generation wireless networks. This mechanism can provide location information of a mobile user at different levels of accuracy, by applying the movement detection mechanism of Mobile IPv6 at both macro- and micromobility levels. In this scheme, an intradomain mobility management protocol (IDMP) is applied to manage the location of the mobile terminal. The mobile terminal needs two care-of addresses, a global care-of address (GCoA) and a local care-of address (LCoA). The current location of a Mobile IPv6 device can be determined by mapping the geographical location information with the two care-of addresses and the physical address of the access point where the user is connected. Such a mechanism makes location services for mobile entities available on a global IP network. The end-users can...

  8. Radio resource management for next generation mobile communication systems

    DEFF Research Database (Denmark)

    Wang, Hua

    The key feature of next generation (4G) mobile communication systems is the ability to deliver a variety of multimedia services with different Quality-of-Service (QoS) requirements. Compared to third generation (3G) mobile communication systems, the 4G mobile communication system introduces several...

  9. Energy and luminosity requirements for the next generation of linear colliders

    International Nuclear Information System (INIS)

    Amaldi, U.

    1987-01-01

    In order to gain new knowledge ('new physics') from 'next generation' linear colliders, energy and luminosity are important variables when considering the design of these new elementary particle probes. The standard model of the electroweak interaction is reviewed, and stipulations are given for searches with next generation colliders for the postulated Higgs particle, a new neutral Z particle, and a new quark and a neutral lepton

  10. Next generation network based carrier ethernet test bed for IPTV traffic

    DEFF Research Database (Denmark)

    Fu, Rong; Berger, Michael Stübert; Zheng, Yu

    2009-01-01

    This paper presents a Carrier Ethernet (CE) test bed based on the Next Generation Network (NGN) framework. Since the concept of CE was put forward by the Metro Ethernet Forum (MEF), carrier-grade Ethernet has been attracting more and more interest and is being investigated as a low-cost, high-performance transport network service to carry IPTV traffic. This test bed aims to support research on providing a high-performance carrier-grade Ethernet transport network for IPTV traffic.

  11. An expert system for automatic mesh generation for Sn particle transport simulation in parallel environment

    International Nuclear Information System (INIS)

    Apisit, Patchimpattapong; Alireza, Haghighat; Shedlock, D.

    2003-01-01

    An expert system for generating an effective mesh distribution for the SN particle transport simulation has been developed. This expert system consists of two main parts: 1) an algorithm for generating an effective mesh distribution in a serial environment, and 2) an algorithm for inference of an effective domain decomposition strategy for parallel computing. For the first part, the algorithm prepares an effective mesh distribution considering problem physics and the spatial differencing scheme. For the second part, the algorithm determines a parallel-performance-index (PPI), which is defined as the ratio of the granularity to the degree-of-coupling. The parallel-performance-index provides expected performance of an algorithm depending on computing environment and resources. A large index indicates a high granularity algorithm with relatively low coupling among processors. This expert system has been successfully tested within the PENTRAN (Parallel Environment Neutral-Particle Transport) code system for simulating real-life shielding problems. (authors)
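The parallel-performance-index defined in this abstract is a simple ratio, which a minimal sketch can make concrete. The example granularity and coupling figures below are invented for illustration and are not taken from the PENTRAN study.

```python
# Minimal sketch of the parallel-performance-index (PPI) defined above:
# the ratio of granularity (computation per task) to degree-of-coupling
# (communication among processors). A large PPI indicates a high-granularity
# algorithm with relatively low coupling among processors.

def parallel_performance_index(granularity, degree_of_coupling):
    """Return PPI = granularity / degree-of-coupling."""
    if degree_of_coupling <= 0:
        raise ValueError("degree of coupling must be positive")
    return granularity / degree_of_coupling

# A coarse-grained decomposition (much work per message) scores higher
# than a fine-grained, chatty one at the same communication load.
coarse = parallel_performance_index(granularity=1e6, degree_of_coupling=1e3)
fine = parallel_performance_index(granularity=1e4, degree_of_coupling=1e3)
```

Comparing PPI values across candidate domain decompositions is how the expert system's inference step would rank strategies for a given computing environment.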

  12. An expert system for automatic mesh generation for Sn particle transport simulation in parallel environment

    Energy Technology Data Exchange (ETDEWEB)

    Apisit, Patchimpattapong [Electricity Generating Authority of Thailand, Office of Corporate Planning, Bangkruai, Nonthaburi (Thailand); Alireza, Haghighat; Shedlock, D. [Florida Univ., Department of Nuclear and Radiological Engineering, Gainesville, FL (United States)

    2003-07-01

    An expert system for generating an effective mesh distribution for the SN particle transport simulation has been developed. This expert system consists of two main parts: 1) an algorithm for generating an effective mesh distribution in a serial environment, and 2) an algorithm for inference of an effective domain decomposition strategy for parallel computing. For the first part, the algorithm prepares an effective mesh distribution considering problem physics and the spatial differencing scheme. For the second part, the algorithm determines a parallel-performance-index (PPI), which is defined as the ratio of the granularity to the degree-of-coupling. The parallel-performance-index provides expected performance of an algorithm depending on computing environment and resources. A large index indicates a high granularity algorithm with relatively low coupling among processors. This expert system has been successfully tested within the PENTRAN (Parallel Environment Neutral-Particle Transport) code system for simulating real-life shielding problems. (authors)

  13. Synchronization System for Next Generation Light Sources

    Energy Technology Data Exchange (ETDEWEB)

    Zavriyev, Anton [MagiQ Technologies, Inc., Somerville, MA (United States)

    2014-03-27

    An alternative synchronization technique – one that would allow explicit control of the pulse train, including its repetition rate and delay – is clearly desired. We propose such a scheme. Our method is based on optical interferometry and permits synchronization of the pulse trains generated by two independent mode-locked lasers. As the next generation of x-ray sources will be driven by a clock signal derived from a mode-locked optical source, our technique will provide a way to synchronize the x-ray probe with the optical pump pulses.

  14. Next-Generation Sequencing and Genome Editing in Plant Virology

    Directory of Open Access Journals (Sweden)

    Ahmed Hadidi

    2016-08-01

    Next-generation sequencing (NGS) has been applied to plant virology since 2009. NGS provides highly efficient, rapid, low-cost, high-throughput sequencing of the DNA or RNA genomes of plant viruses and viroids and of the specific small RNAs generated during the infection process. These small RNAs, which frequently cover the whole genome of the infectious agent, are 21-24 nt long and are known as vsRNAs for viruses and vd-sRNAs for viroids. NGS has been used in a number of studies in plant virology including, but not limited to, discovery of novel viruses and viroids as well as detection and identification of those pathogens already known, analysis of genome diversity and evolution, and study of pathogen epidemiology. The genome editing method based on clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 has recently been used successfully to engineer resistance to DNA geminiviruses (family Geminiviridae) by targeting different viral genome sequences in infected Nicotiana benthamiana or Arabidopsis plants. The DNA viruses targeted include tomato yellow leaf curl virus and merremia mosaic virus (begomoviruses); beet curly top virus and beet severe curly top virus (curtoviruses); and bean yellow dwarf virus (mastrevirus). The technique has also been used against the RNA viruses zucchini yellow mosaic virus, papaya ringspot virus and turnip mosaic virus (potyviruses) and cucumber vein yellowing virus (ipomovirus; family Potyviridae) by targeting the translation initiation gene eIF4E in cucumber or Arabidopsis plants. From these recent advances of major importance, it is expected that NGS and CRISPR-Cas technologies will play a significant role in the very near future in advancing the field of plant virology and connecting it with other related fields of biology. Keywords: Next-generation sequencing, NGS, plant virology, plant viruses, viroids, resistance to plant viruses by CRISPR-Cas9
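As background to the CRISPR-Cas9 targeting described above: for the common SpCas9 system, target-site selection amounts to scanning for a 20-nt protospacer immediately 5' of an NGG PAM. The sketch below applies that generic convention to an invented demo sequence; it is not a method or sequence from the cited work.

```python
# Hedged sketch of SpCas9 target-site scanning: Cas9 cuts where a guide RNA
# matches a 20-nt protospacer that sits immediately upstream of an NGG
# protospacer-adjacent motif (PAM). The demo sequence is invented.

def find_cas9_targets(seq, protospacer_len=20):
    """Return (start, protospacer, PAM) for every NGG PAM that has
    enough upstream sequence for a full protospacer."""
    seq = seq.upper()
    hits = []
    for i in range(protospacer_len, len(seq) - 2):
        if seq[i + 1 : i + 3] == "GG":  # PAM is N-G-G at positions i..i+2
            hits.append((i - protospacer_len,
                         seq[i - protospacer_len : i],
                         seq[i : i + 3]))
    return hits

demo = "ACGT" * 6 + "TGG" + "AAAA"  # one TGG PAM after a 24-nt prefix
sites = find_cas9_targets(demo)
```

Tools used in practice additionally score candidate sites for off-target matches elsewhere in the genome, which this sketch omits.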

  15. Mobility Models for Next Generation Wireless Networks Ad Hoc, Vehicular and Mesh Networks

    CERN Document Server

    Santi, Paolo

    2012-01-01

    Mobility Models for Next Generation Wireless Networks: Ad Hoc, Vehicular and Mesh Networks provides the reader with an overview of mobility modelling, encompassing both theoretical and practical aspects related to the challenging mobility modelling task. It also: provides up-to-date coverage of mobility models for next generation wireless networks; offers an in-depth discussion of the most representative mobility models for major next generation wireless network application scenarios, including WLAN/mesh networks, vehicular networks, wireless sensor networks, and...

  16. Methodology on the sparger development for Korean next generation reactor

    International Nuclear Information System (INIS)

    Kim, Hwan Yeol; Hwang, Y.D.; Kang, H.S.; Cho, B.H.; Park, J.K.

    1999-06-01

    In case of an accident, the safety depressurization system of the Korean Next Generation Reactor (KNGR) efficiently depressurizes the reactor by directly discharging steam of high pressure and temperature from the pressurizer into the in-containment refuelling water storage tank (IRWST) through spargers. This report was generated for the purpose of developing the KNGR sparger, and it presents the methodology based on the application of the ABB-Atom approach. Many thermal-hydraulic parameters affecting the maximum bubble cloud pressure were obtained; the maximum bubble cloud pressure transient curve, the so-called forcing function of the KNGR, was suggested; and design inputs for the IRWST (bubble cloud radius vs. time, bubble cloud velocity vs. time, bubble cloud acceleration vs. time, etc.) were generated analytically using the Rayleigh-Plesset equation. (author). 17 refs., 6 tabs., 27 figs
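The Rayleigh-Plesset equation named in this abstract can be integrated numerically in a few lines. The sketch below uses the simplified inviscid, zero-surface-tension form R·R'' + (3/2)·R'^2 = (p_b - p_inf)/rho with invented parameter values; it illustrates the equation itself, not the KNGR design calculation.

```python
# Minimal Euler integration of the simplified Rayleigh-Plesset equation
#   R*R'' + (3/2)*R'^2 = (p_b - p_inf) / rho
# for a bubble of radius R in a liquid of density rho, driven by the
# difference between bubble pressure p_b and ambient pressure p_inf.
# All parameter values are invented for illustration.

def rayleigh_plesset(r0, p_b, p_inf, rho=1000.0, dt=1e-6, steps=1000):
    """Return a list of (time, radius, wall velocity) samples."""
    r, v = r0, 0.0
    history = []
    for n in range(steps):
        # Solve the equation for R'': R'' = ((p_b - p_inf)/rho - 1.5*R'^2) / R
        a = ((p_b - p_inf) / rho - 1.5 * v * v) / r
        v += a * dt
        r += v * dt
        history.append((n * dt, r, v))
    return history

# An over-pressured bubble (p_b > p_inf) grows from its initial radius.
hist = rayleigh_plesset(r0=0.01, p_b=2.0e5, p_inf=1.0e5)
```

A production analysis would add the viscous and surface-tension terms and a time-dependent driving pressure, and would use a higher-order integrator than forward Euler.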

  17. Methodology on the sparger development for Korean next generation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hwan Yeol; Hwang, Y.D.; Kang, H.S.; Cho, B.H.; Park, J.K

    1999-06-01

    In case of an accident, the safety depressurization system of the Korean Next Generation Reactor (KNGR) efficiently depressurizes the reactor by directly discharging steam of high pressure and temperature from the pressurizer into the in-containment refuelling water storage tank (IRWST) through spargers. This report was generated for the purpose of developing the KNGR sparger, and it presents the methodology based on the application of the ABB-Atom approach. Many thermal-hydraulic parameters affecting the maximum bubble cloud pressure were obtained; the maximum bubble cloud pressure transient curve, the so-called forcing function of the KNGR, was suggested; and design inputs for the IRWST (bubble cloud radius vs. time, bubble cloud velocity vs. time, bubble cloud acceleration vs. time, etc.) were generated analytically using the Rayleigh-Plesset equation. (author). 17 refs., 6 tabs., 27 figs.

  18. Engineered CRISPR Systems for Next Generation Gene Therapies.

    Science.gov (United States)

    Pineda, Michael; Moghadam, Farzaneh; Ebrahimkhani, Mo R; Kiani, Samira

    2017-09-15

    An ideal in vivo gene therapy platform provides safe, reprogrammable, and precise strategies which modulate cell and tissue gene regulatory networks with a high temporal and spatial resolution. Clustered regularly interspaced short palindromic repeats (CRISPR), a bacterial adaptive immune system, and its CRISPR-associated protein 9 (Cas9), have gained attention for the ability to target and modify DNA sequences on demand with unprecedented flexibility and precision. The precision and programmability of Cas9 is derived from its complexation with a guide-RNA (gRNA) that is complementary to a desired genomic sequence. CRISPR systems open up widespread applications including genetic disease modeling, functional screens, and synthetic gene regulation. The plausibility of in vivo genetic engineering using CRISPR has garnered significant traction as a next generation in vivo therapeutic. However, there are hurdles that need to be addressed before CRISPR-based strategies are fully implemented. Some key issues center on the controllability of the CRISPR platform, including minimizing genomic off-target effects and maximizing in vivo gene editing efficiency, in vivo cellular delivery, and spatial-temporal regulation. The modifiable components of CRISPR systems: Cas9 protein, gRNA, delivery platform, and the form of CRISPR system delivered (DNA, RNA, or ribonucleoprotein) have recently been engineered independently to design a better genome engineering toolbox. This review focuses on evaluating CRISPR potential as a next generation in vivo gene therapy platform and discusses bioengineering advancements that can address challenges associated with clinical translation of this emerging technology.

  19. Advanced parallel processing with supercomputer architectures

    International Nuclear Information System (INIS)

    Hwang, K.

    1987-01-01

    This paper investigates advanced parallel processing techniques and innovative hardware/software architectures that can be applied to boost the performance of supercomputers. Critical issues on architectural choices, parallel languages, compiling techniques, resource management, concurrency control, programming environment, parallel algorithms, and performance enhancement methods are examined and the best answers are presented. The authors cover advanced processing techniques suitable for supercomputers, high-end mainframes, minisupers, and array processors. The coverage emphasizes vectorization, multitasking, multiprocessing, and distributed computing. In order to achieve these operation modes, parallel languages, smart compilers, synchronization mechanisms, load balancing methods, mapping parallel algorithms, operating system functions, application library, and multidiscipline interactions are investigated to ensure high performance. At the end, they assess the potentials of optical and neural technologies for developing future supercomputers

  20. Precision medicine for cancer with next-generation functional diagnostics.

    Science.gov (United States)

    Friedman, Adam A; Letai, Anthony; Fisher, David E; Flaherty, Keith T

    2015-12-01

    Precision medicine is about matching the right drugs to the right patients. Although this approach is technology agnostic, in cancer there is a tendency to make precision medicine synonymous with genomics. However, genome-based cancer therapeutic matching is limited by incomplete biological understanding of the relationship between phenotype and cancer genotype. This limitation can be addressed by functional testing of live patient tumour cells exposed to potential therapies. Recently, several 'next-generation' functional diagnostic technologies have been reported, including novel methods for tumour manipulation, molecularly precise assays of tumour responses and device-based in situ approaches; these address the limitations of the older generation of chemosensitivity tests. The promise of these new technologies suggests a future diagnostic strategy that integrates functional testing with next-generation sequencing and immunoprofiling to precisely match combination therapies to individual cancer patients.

  1. Next-generation science information network for leading-edge applications

    International Nuclear Information System (INIS)

    Urushidani, S.; Matsukata, J.

    2008-01-01

    High-speed networks are definitely essential tools for leading-edge applications in many research areas, including nuclear fusion research. This paper describes a number of advanced features in the Japanese next-generation science information network, called SINET3, and gives researchers clues on the uses of advanced high-speed network for their applications. The network services have four categories, multiple layer transfer, enriched virtual private network, enhanced quality-of-service, and bandwidth on demand services, and comprise a versatile service platform. The paper also describes the network architecture and advanced networking capabilities that enable economical service accommodation and flexible network resource assignment as well as effective use of Japan's first 40-Gbps lines

  2. Next-generation science information network for leading-edge applications

    Energy Technology Data Exchange (ETDEWEB)

    Urushidani, S. [National Institute of Informatics, 2-1-2 Hitotsubashi Chiyoda-ku, Tokyo 101-8430 (Japan)], E-mail: urushi@nii.ac.jp; Matsukata, J. [National Institute of Informatics, 2-1-2 Hitotsubashi Chiyoda-ku, Tokyo 101-8430 (Japan)

    2008-04-15

    High-speed networks are definitely essential tools for leading-edge applications in many research areas, including nuclear fusion research. This paper describes a number of advanced features in the Japanese next-generation science information network, called SINET3, and gives researchers clues on the uses of advanced high-speed network for their applications. The network services have four categories, multiple layer transfer, enriched virtual private network, enhanced quality-of-service, and bandwidth on demand services, and comprise a versatile service platform. The paper also describes the network architecture and advanced networking capabilities that enable economical service accommodation and flexible network resource assignment as well as effective use of Japan's first 40-Gbps lines.

  3. Next Generation Nuclear Plant Materials Research and Development Program Plan

    Energy Technology Data Exchange (ETDEWEB)

    G. O. Hayner; E.L. Shaber

    2004-09-01

    The U.S. Department of Energy (DOE) has selected the Very High Temperature Reactor (VHTR) design for the Next Generation Nuclear Plant (NGNP) Project. The NGNP will demonstrate the use of nuclear power for electricity and hydrogen production without greenhouse gas emissions. The reactor design will be a graphite moderated, helium-cooled, prismatic or pebble-bed, thermal neutron spectrum reactor that will produce electricity and hydrogen in a state-of-the-art thermodynamically efficient manner. The NGNP will use very high burn-up, low-enriched uranium, TRISO-coated fuel and have a projected plant design service life of 60 years.

  4. Next Generation Nuclear Plant Materials Selection and Qualification Program Plan

    Energy Technology Data Exchange (ETDEWEB)

    R. Doug Hamelin; G. O. Hayner

    2004-11-01

    The U.S. Department of Energy (DOE) has selected the Very High Temperature Reactor (VHTR) design for the Next Generation Nuclear Plant (NGNP) Project. The NGNP will demonstrate the use of nuclear power for electricity and hydrogen production without greenhouse gas emissions. The reactor design is a graphite-moderated, helium-cooled, prismatic or pebble bed thermal neutron spectrum reactor with an average reactor outlet temperature of at least 1000 C. The NGNP will use very high burn-up, low-enriched uranium, TRISO-coated fuel in a once-through fuel cycle. The design service life of the NGNP is 60 years.

  5. Next Generation Science Standards and edTPA: Evidence of Science and Engineering Practices

    Science.gov (United States)

    Brownstein, Erica M.; Horvath, Larry

    2016-01-01

    Science teacher educators in the United States are currently preparing future science teachers to effectively implement the "Next Generation Science Standards" (NGSS) and, in thirteen states, to successfully pass a content-specific high stakes teacher performance assessment, the edTPA. Science education and teacher performance assessment…

  6. Next-generation storm tracking for minimizing service interruption

    Energy Technology Data Exchange (ETDEWEB)

    Sznaider, R. [Meteorlogix, Minneapolis, MN (United States)

    2002-08-01

    Several technological changes have taken place in the field of weather radar since its discovery during World War II. A wide variety of industries have benefited over the years from conventional weather radar displays, providing assistance in forecasting and estimating the potential severity of storms. The characteristics of individual storm cells can now be derived from the next-generation of weather radar systems (NEXRAD). The determination of which storm cells possess distinct features such as large hail or developing tornadoes was made possible through the fusing of various pieces of information with radar pictures. To exactly determine when and where a storm will hit, this data can be combined and overlaid into a display that includes the geographical physical landmarks of a specific region. Combining Geographic Information Systems (GIS) and storm tracking provides a more complete, timely and accurate forecast, which clearly benefits the electric utilities industries. The generation and production of energy are dependent on how hot or cold it will be today and tomorrow. The author described each major feature of this next-generation weather radar system. 9 figs.

  7. Educating the next generation of nature entrepreneurs

    Science.gov (United States)

    Judith C. Jobse; Loes Witteveen; Judith Santegoets; Daan van der Linde

    2015-01-01

    With this paper, it is illustrated that a focus on entrepreneurship training in the nature and wilderness sector is relevant for diverse organisations and situations. The first curricula on nature entrepreneurship are currently being developed. In this paper the authors describe a project that focusses on educating the next generation of nature entrepreneurs, reflect...

  8. Two-Dimensional Metal Oxide Nanomaterials for Next-Generation Rechargeable Batteries.

    Science.gov (United States)

    Mei, Jun; Liao, Ting; Kou, Liangzhi; Sun, Ziqi

    2017-12-01

    The exponential increase in research focused on two-dimensional (2D) metal oxides has offered an unprecedented opportunity for their use in energy conversion and storage devices, especially for promising next-generation rechargeable batteries, such as lithium-ion batteries (LIBs) and sodium-ion batteries (NIBs), as well as some post-lithium batteries, including lithium-sulfur batteries, lithium-air batteries, etc. The introduction of well-designed 2D metal oxide nanomaterials into next-generation rechargeable batteries has significantly enhanced the performance of these energy-storage devices by providing higher chemically active interfaces, shortened ion-diffusion lengths, and improved in-plane carrier-/charge-transport kinetics, which have greatly promoted the development of nanotechnology and the practical application of rechargeable batteries. Here, the recent progress in the application of 2D metal oxide nanomaterials in a series of rechargeable LIBs, NIBs, and other post lithium-ion batteries is reviewed relatively comprehensively. Current opportunities and future challenges for the application of 2D nanomaterials in energy-storage devices to achieve high energy density, high power density, stable cyclability, etc. are summarized and outlined. It is believed that the integration of 2D metal oxide nanomaterials in these clean energy devices offers great opportunities to address challenges driven by increasing global energy demands. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Benchmark problem suite for reactor physics study of LWR next generation fuels

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Ikehara, Tadashi; Ito, Takuya; Saji, Etsuro

    2002-01-01

    This paper proposes a benchmark problem suite for studying the physics of next-generation fuels of light water reactors. The target discharge burnup of the next-generation fuel was set to 70 GWd/t considering the increasing trend in discharge burnup of light water reactor fuels. The UO2 and MOX fuels are included in the benchmark specifications. The benchmark problem consists of three different geometries: fuel pin cell, PWR fuel assembly and BWR fuel assembly. In the pin cell problem, detailed nuclear characteristics such as burnup dependence of nuclide-wise reactivity were included in the required calculation results to facilitate the study of reactor physics. In the assembly benchmark problems, important parameters for in-core fuel management such as local peaking factors and reactivity coefficients were included in the required results. The benchmark problems provide comprehensive test problems for next-generation light water reactor fuels with extended high burnup. Furthermore, since the pin cell, the PWR assembly and the BWR assembly problems are independent, analysis of the entire benchmark suite is not necessary: e.g., the set of pin cell and PWR fuel assembly problems will be suitable for those in charge of PWR in-core fuel management, and the set of pin cell and BWR fuel assembly problems for those in charge of BWR in-core fuel management. (author)
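Among the in-core fuel management parameters mentioned in this abstract, the local peaking factor has a particularly simple definition: the maximum pin power divided by the assembly-average pin power. A minimal sketch with invented pin-power values:

```python
# Sketch of the local peaking factor for a fuel assembly: the ratio of the
# hottest pin's power to the assembly-average pin power. Values are invented.

def local_peaking_factor(pin_powers):
    """Return max(pin power) / mean(pin power) for one assembly."""
    if not pin_powers:
        raise ValueError("pin_powers must be non-empty")
    avg = sum(pin_powers) / len(pin_powers)
    return max(pin_powers) / avg

# Relative pin powers for a (tiny, hypothetical) assembly.
pins = [0.95, 1.00, 1.02, 1.10, 0.93]
lpf = local_peaking_factor(pins)
```

A benchmark submission would report this factor per assembly and per burnup step, alongside the reactivity coefficients the specification asks for.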

  10. The Next Generation: Students Discuss Archaeology in the 21st Century

    OpenAIRE

    Sands, Ashley; Butler, Kristin

    2010-01-01

    The Next Generation Project is a multi-agent, multi-directional cultural diplomacy effort. The need for communication among emerging archaeologists has never been greater. Increasingly, archaeological sites are impacted by military activity, destroyed through the development of dams and building projects, and torn apart through looting. The Next Generation Project works to develop communication via social networking sites online and through in-person meetings at international conferences. As ...

  11. High performance parallel computing of flows in complex geometries: II. Applications

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F; Poinsot, T

    2009-01-01

    Present regulations in terms of pollutant emissions, noise and economical constraints require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system, not only isolated components. However, whatever the design stage considered, these aspects are still not well captured by numerical approaches or well understood. The main challenge lies essentially in the computational requirements that such complex systems impose if they are to be simulated on supercomputers. This paper shows how these new challenges can be addressed by using parallel computing platforms for distinct elements of the more complex systems encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the interest of high-performance computing for solving flows in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed in industrial systems are also described, with particular interest in the computational time and the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows, such as coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some examples of the difficulties with grid generation and data analysis are also presented for these complex industrial applications.

  12. Parallel computing in experimental mechanics and optical measurement: A review (II)

    Science.gov (United States)

    Wang, Tianyi; Kemao, Qian

    2018-05-01

    With advantages such as non-destructiveness, high sensitivity and high accuracy, optical techniques have been successfully applied to the measurement of various important physical quantities in experimental mechanics (EM) and optical measurement (OM). However, in pursuit of higher image resolutions for higher accuracy, the computational burden of optical techniques has become much heavier. Therefore, in recent years, heterogeneous platforms composed of hardware such as CPUs and GPUs have been widely employed to accelerate these techniques due to their cost-effectiveness, short development cycle, easy portability, and high scalability. In this paper, we analyze various works by first illustrating their different architectures, followed by introducing their various parallel patterns for high-speed computation. Next, we review the effects of CPU and GPU parallel computing specifically in EM and OM applications over a broad scope, including digital image/volume correlation, fringe pattern analysis, tomography, hyperspectral imaging, computer-generated holograms, and integral imaging. In our survey, we have found that high parallelism can always be exploited in such applications for the development of high-performance systems.

  13. The next generation of power reactors - safety characteristics

    International Nuclear Information System (INIS)

    Modro, S.M.

    1995-01-01

    The next generation of commercial nuclear power reactors is characterized by a new approach to achieving reliability of their safety systems. In contrast to current generation reactors, these designs apply passive safety features that rely on gravity-driven transfer processes or stored energy, such as gas-pressurized accumulators or electric batteries. This paper discusses the passive safety systems of the AP600 and Simplified Boiling Water Reactor (SBWR) designs.

  14. Next Generation, Si-Compatible Materials and Devices in the Si-Ge-Sn System

    Science.gov (United States)

    2015-10-09

    AFRL-AFOSR-VA-TR-2016-0044: Next generation, Si-compatible materials and devices in the Si-Ge-Sn system. John Kouvetakis, Arizona State University, final report. The work initially focused on growth of next generation Ge1-ySny alloys on Ge-buffered Si wafers via UHV CVD depositions of Ge3H8 and SnD4.

  15. Parallel beam dynamics simulation of linear accelerators

    International Nuclear Information System (INIS)

    Qiang, Ji; Ryne, Robert D.

    2002-01-01

    In this paper we describe parallel particle-in-cell methods for the large scale simulation of beam dynamics in linear accelerators. These techniques have been implemented in the IMPACT (Integrated Map and Particle Accelerator Tracking) code. IMPACT is being used to study the behavior of intense charged particle beams and as a tool for the design of next-generation linear accelerators. As examples, we present applications of the code to the study of emittance exchange in high intensity beams and to the study of beam transport in a proposed accelerator for the development of accelerator-driven waste transmutation technologies.
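
    The particle-in-cell step at the heart of such codes deposits each particle's charge onto a grid before the field solve. A minimal, hypothetical 1-D cloud-in-cell deposition sketch (illustrative only, not IMPACT's actual implementation):

```python
def deposit_charge(positions, charges, n_cells, length):
    """Linear (cloud-in-cell) charge deposition onto a periodic 1-D grid.
    Each particle's charge is shared between its two nearest grid points."""
    dx = length / n_cells
    rho = [0.0] * n_cells
    for x, q in zip(positions, charges):
        s = (x % length) / dx           # position in cell units
        i = int(s)                      # left grid point
        w = s - i                       # linear weight toward the right point
        rho[i % n_cells] += q * (1.0 - w)
        rho[(i + 1) % n_cells] += q * w
    return rho
```

    Because the deposition conserves total charge exactly, the grid charge always sums to the particle charge; in a parallel code each processor deposits its local particles and the partial grids are summed across processors.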

  16. A Survey on Next-generation Power Grid Data Architecture

    Energy Technology Data Exchange (ETDEWEB)

    You, Shutang [University of Tennessee, Knoxville (UTK); Zhu, Dr. Lin [University of Tennessee (UT); Liu, Yong [ORNL; Liu, Yilu [ORNL; Shankar, Mallikarjun (Arjun) [ORNL; Robertson, Russell [Grid Protection Alliance; King Jr, Thomas J [ORNL

    2015-01-01

    The operation and control of power grids will increasingly rely on data. A high-speed, reliable, flexible and secure data architecture is the prerequisite of the next-generation power grid. This paper summarizes the challenges in collecting and utilizing power grid data, and then provides a reference data architecture for future power grids. Based on the data architecture deployment, related research on data architecture is reviewed and summarized in several categories, including data measurement/actuation, data transmission, the data service layer, data utilization, and two cross-cutting issues, interoperability and cyber security. Research gaps and future work are also presented.

  17. HLA typing: Conventional techniques v. next-generation sequencing

    African Journals Online (AJOL)

    The existing techniques have contributed significantly to our current knowledge of allelic diversity. At present, sequence-based typing (SBT) methods, in particular next-generation sequencing. (NGS), provide the highest possible resolution. NGS platforms were initially only used for genomic sequencing, but also showed.

  18. Robust Sub-nanomolar Library Preparation for High Throughput Next Generation Sequencing.

    Science.gov (United States)

    Wu, Wells W; Phue, Je-Nie; Lee, Chun-Ting; Lin, Changyi; Xu, Lai; Wang, Rong; Zhang, Yaqin; Shen, Rong-Fong

    2018-05-04

    Current library preparation protocols for Illumina HiSeq and MiSeq DNA sequencers require ≥2 nM initial library for subsequent loading of denatured cDNA onto flow cells. Such amounts are not always attainable from samples having a relatively low DNA or RNA input, or those for which a limited number of PCR amplification cycles is preferred (less PCR bias and/or more even coverage). A well-tested sub-nanomolar library preparation protocol for Illumina sequencers has, however, not been reported. The aim of this study is to provide a much-needed working protocol for sub-nanomolar libraries to achieve outcomes as informative as those obtained with the higher library input (≥2 nM) recommended by Illumina's protocols. Extensive studies were conducted to validate a robust sub-nanomolar (initial library of 100 pM) protocol using PhiX DNA (as a control), genomic DNA (Bordetella bronchiseptica and microbial mock community B for 16S rRNA gene sequencing), messenger RNA, microRNA, and other small noncoding RNA samples. The utility of our protocol was further explored for PhiX library concentrations as low as 25 pM, which generated only slightly fewer than 50% of the reads achieved under the standard Illumina protocol starting with >2 nM. A sub-nanomolar library preparation protocol (100 pM) could generate next generation sequencing (NGS) results as robust as the standard Illumina protocol. Following the sub-nanomolar protocol, libraries with initial concentrations as low as 25 pM could also be sequenced to yield satisfactory and reproducible sequencing results.
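
    The nanomolar and picomolar figures above come from the standard dsDNA molarity conversion (average molar mass of one double-stranded base pair ≈ 660 g/mol). A small sketch of that arithmetic, not part of the paper's protocol:

```python
def library_nM(conc_ng_per_ul, mean_fragment_bp):
    """Convert a dsDNA library concentration (ng/uL) to nanomolar,
    using 660 g/mol as the average molar mass per base pair."""
    return conc_ng_per_ul * 1e6 / (660.0 * mean_fragment_bp)

def dilution_factor(stock_nM, target_pM):
    """Fold-dilution needed to bring a stock library to the target
    loading concentration (e.g. the 100 pM used in this protocol)."""
    return stock_nM * 1000.0 / target_pM
```

    For example, a 1 ng/µL library with a 400 bp mean fragment length is about 3.8 nM, so reaching a 100 pM loading concentration is roughly a 38-fold dilution.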

  19. Comparison of Four Human Papillomavirus Genotyping Methods: Next-generation Sequencing, INNO-LiPA, Electrochemical DNA Chip, and Nested-PCR.

    Science.gov (United States)

    Nilyanimit, Pornjarim; Chansaenroj, Jira; Poomipak, Witthaya; Praianantathavorn, Kesmanee; Payungporn, Sunchai; Poovorawan, Yong

    2018-03-01

    Human papillomavirus (HPV) infection causes cervical cancer, thus necessitating early detection by screening. Rapid and accurate HPV genotyping is crucial both for the assessment of patients with HPV infection and for surveillance studies. Fifty-eight cervicovaginal samples were tested for HPV genotypes using four methods in parallel: nested-PCR followed by conventional sequencing, INNO-LiPA, electrochemical DNA chip, and next-generation sequencing (NGS). Seven HPV genotypes (16, 18, 31, 33, 45, 56, and 58) were identified by all four methods. Nineteen HPV genotypes were detected by NGS, but not by nested-PCR, INNO-LiPA, or electrochemical DNA chip. Although NGS is relatively expensive and complex, it may serve as a sensitive HPV genotyping method. Because of its highly sensitive detection of multiple HPV genotypes, NGS may serve as an alternative for diagnostic HPV genotyping in certain situations. © The Korean Society for Laboratory Medicine

  20. Parallel visualization on leadership computing resources

    Energy Technology Data Exchange (ETDEWEB)

    Peterka, T; Ross, R B [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Shen, H-W [Department of Computer Science and Engineering, Ohio State University, Columbus, OH 43210 (United States); Ma, K-L [Department of Computer Science, University of California at Davis, Davis, CA 95616 (United States); Kendall, W [Department of Electrical Engineering and Computer Science, University of Tennessee at Knoxville, Knoxville, TN 37996 (United States); Yu, H, E-mail: tpeterka@mcs.anl.go [Sandia National Laboratories, California, Livermore, CA 94551 (United States)

    2009-07-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using techniques similar to those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.
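
    Parallel I/O and load balancing of the kind mentioned here typically start with a static domain decomposition: each process is assigned a near-equal slab of the volume to read and render. A minimal, hypothetical sketch (names are illustrative; production codes use MPI and far richer decompositions):

```python
def block_extents(global_shape, ranks):
    """Split the slowest-varying axis of a volume into near-equal slabs,
    one per rank, so each rank can read and process its slab independently."""
    nz = global_shape[0]
    base, extra = divmod(nz, ranks)
    extents, start = [], 0
    for r in range(ranks):
        count = base + (1 if r < extra else 0)  # spread the remainder evenly
        extents.append((start, start + count))
        start += count
    return extents
```

    The slabs cover the axis exactly once with no overlap, which is what allows each rank's file read to be issued concurrently.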

  1. Parallel visualization on leadership computing resources

    International Nuclear Information System (INIS)

    Peterka, T; Ross, R B; Shen, H-W; Ma, K-L; Kendall, W; Yu, H

    2009-01-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using techniques similar to those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  2. Next-generation sequencing for endocrine cancers: Recent advances and challenges.

    Science.gov (United States)

    Suresh, Padmanaban S; Venkatesh, Thejaswini; Tsutsumi, Rie; Shetty, Abhishek

    2017-05-01

    Contemporary molecular biology research tools have enriched numerous areas of biomedical research that address challenging diseases, including endocrine cancers (pituitary, thyroid, parathyroid, adrenal, testicular, ovarian, and neuroendocrine cancers). These tools have placed several intriguing clues before the scientific community. Endocrine cancers pose a major challenge in health care and research despite considerable attempts by researchers to understand their etiology. Microarray analyses have provided gene signatures from many cells, tissues, and organs that can differentiate healthy states from diseased ones, and even show patterns that correlate with stages of a disease. Microarray data can also elucidate the responses of endocrine tumors to therapeutic treatments. The rapid progress in next-generation sequencing methods has overcome many of the initial challenges of these technologies, and their advantages over microarray techniques have enabled them to emerge as valuable aids for clinical research applications (prognosis, identification of drug targets, etc.). A comprehensive review describing the recent advances in next-generation sequencing methods and their application in the evaluation of endocrine and endocrine-related cancers is lacking. The main purpose of this review is to illustrate the concepts that collectively constitute our current view of the possibilities offered by next-generation sequencing technological platforms, challenges to relevant applications, and perspectives on the future of clinical genetic testing of patients with endocrine tumors. We focus on recent discoveries in the use of next-generation sequencing methods for clinical diagnosis of endocrine tumors in patients and conclude with a discussion on persisting challenges and future objectives.

  3. Cisco Networking Academy: Next-Generation Assessments and Their Implications for K-12 Education

    Science.gov (United States)

    Liu, Meredith

    2014-01-01

    To illuminate the possibilities for next-generation assessments in K-12 schools, this case study profiles the Cisco Networking Academy, which creates comprehensive online training curriculum to teach networking skills. Since 1997, the Cisco Networking Academy has served more than five million high school and college students and now delivers…

  4. The next generation CANDU 6

    International Nuclear Information System (INIS)

    Hopwood, J.M.

    1999-01-01

    AECL's product line of CANDU 6 and CANDU 9 nuclear power plants is adapted to respond to changing market conditions, experience feedback and technological development through a continuous improvement process of design evolution. The CANDU 6 Nuclear Power Plant design is a successful family of nuclear units, with the first four units entering service in 1983 and the most recent entering service this year. A further four CANDU 6 units are under construction. Since 1996, a focused, forward-looking development program has been under way at AECL to incorporate a series of individual improvements and integrate them into the CANDU 6, leading to the evolutionary development of the next-generation enhanced CANDU 6. The CANDU 6 improvements program covers all aspects of an NPP project, including engineering tools improvements, design for improved constructability, scheduling for faster, more streamlined commissioning, and improved operating performance. This enhanced CANDU 6 product will combine the benefits of design provenness (drawing on the more than 70 reactor-years of experience of the seven operating CANDU 6 units) with the advantages of an evolutionary next-generation design. Features of the enhanced CANDU 6 design include: Advanced Human Machine Interface - built around the Advanced CANDU Control Centre; Advanced fuel design - using the newly demonstrated CANFLEX fuel bundle; Improved efficiency - based on improved utilization of waste heat; Streamlined system design - including simplifications to improve performance and safety system reliability; Advanced engineering tools - featuring linked electronic databases from 3D CADDS, equipment specification and material management; Advanced construction techniques - based on open top equipment installation and the use of small skid-mounted modules; and options defined for passive heat sink capability and low-enrichment core optimization. (author)

  5. Bioinformatics for Next Generation Sequencing Data

    Directory of Open Access Journals (Sweden)

    Alberto Magi

    2010-09-01

    Full Text Available The emergence of next-generation sequencing (NGS) platforms imposes increasing demands on statistical methods and bioinformatic tools for the analysis and management of the huge amounts of data generated by these technologies. Even at the early stages of their commercial availability, a large number of software tools already exist for analyzing NGS data. These tools fall into several general categories, including alignment of sequence reads to a reference, base-calling and/or polymorphism detection, de novo assembly from paired or unpaired reads, structural variant detection and genome browsing. This manuscript aims to guide readers in the choice of the available computational tools that can be used to tackle the several steps of the data analysis workflow.
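
    The "alignment of sequence reads to a reference" category these tools cover is commonly built on an exact-match seed index. A minimal, hypothetical k-mer seeding sketch (illustrative only; real aligners add extension, mismatches and compressed indexes):

```python
def build_kmer_index(reference, k):
    """Index every k-mer of the reference by its start positions."""
    index = {}
    for i in range(len(reference) - k + 1):
        index.setdefault(reference[i:i + k], []).append(i)
    return index

def seed_read(read, index, k):
    """Return candidate alignment offsets for a read by exact seed lookup:
    a k-mer at read offset j found at reference position p suggests the
    read starts at p - j."""
    hits = set()
    for j in range(len(read) - k + 1):
        for pos in index.get(read[j:j + k], []):
            hits.add(pos - j)
    return sorted(h for h in hits if h >= 0)
```

    Candidate offsets where several seeds agree are then verified by full comparison; this seed-and-extend structure is what makes alignment of millions of short reads tractable.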

  6. Next Generation HeliMag UXO Mapping Technology

    Science.gov (United States)

    2010-01-01

    This Next Generation HeliMag Unexploded Ordnance (UXO) Mapping technology provides for deployment of seven total-field magnetometers on a Kevlar-reinforced boom mounted on a Bell 206L helicopter. Ancillary instrumentation records aircraft height above ground and attitude, and a fluxgate magnetometer is used to allow for aeromagnetic compensation.

  7. SQoS as the Base for Next Generation Global Infrastructure

    DEFF Research Database (Denmark)

    Madsen, Ole Brun; Knudsen, Thomas Phillip; Pedersen, Jens Myrup

    2003-01-01

    The convergence towards a unified global WAN platform, providing both best effort services and guaranteed high quality services, sets the agenda for the design and implementation of the next generation global information infrastructure. The absence of design principles, allowing for smooth and cost-efficient scalability without loss of control over the structurally based properties, may prevent or seriously delay the introduction of globally available new application and switching services. Reliability and scalability issues are addressed from a structural viewpoint. The concept of Structural Quality...

  8. SQoS as the Base for Next Generation Global Infrastructure

    DEFF Research Database (Denmark)

    Madsen, Ole Brun; Knudsen, Thomas Phillip; Pedersen, Jens Myrup

    The convergence towards a unified global WAN platform, providing both best effort services and guaranteed high quality services, sets the agenda for the design and implementation of the next generation global information infrastructure. The absence of design principles, allowing for smooth and cost-efficient scalability without loss of control over the structurally based properties, may prevent or seriously delay the introduction of globally available new application and switching services. Reliability and scalability issues are addressed from a structural viewpoint. The concept of Structural Quality...

  9. Application of Next Generation Sequencing on Genetic Testing

    DEFF Research Database (Denmark)

    Li, Jian

    The discovery of genetic factors behind an increasing number of human diseases, together with broader public education in genetics, has made demand for genetic testing grow rapidly. However, traditional genetic testing methods cannot meet all of these requirements. Next generation seq...

  10. HLA typing: Conventional techniques v. next-generation sequencing ...

    African Journals Online (AJOL)

    Background. The large number of population-specific polymorphisms present in the HLA complex in the South African (SA) population reduces the probability of finding an adequate HLA-matched donor for individuals in need of an unrelated haematopoietic stem cell transplantation (HSCT). Next-generation sequencing ...

  11. NGSS and the Next Generation of Science Teachers

    Science.gov (United States)

    Bybee, Rodger W.

    2014-01-01

    This article centers on the "Next Generation Science Standards" (NGSS) and their implications for teacher development, particularly at the undergraduate level. After an introduction to NGSS and the influence of standards in the educational system, the article addresses specific educational shifts--interconnecting science and engineering…

  12. Next-generation sequencing offers new insights into DNA degradation

    DEFF Research Database (Denmark)

    Overballe-Petersen, Søren; Orlando, Ludovic Antoine Alexandre; Willerslev, Eske

    2012-01-01

    The processes underlying DNA degradation are central to various disciplines, including cancer research, forensics and archaeology. The sequencing of ancient DNA molecules on next-generation sequencing platforms provides direct measurements of cytosine deamination, depurination and fragmentation rates that previously were obtained only from extrapolations of results from in vitro kinetic experiments performed over short timescales. For example, recent next-generation sequencing of ancient DNA reveals purine bases as one of the main targets of postmortem hydrolytic damage, through base elimination and strand breakage. It also shows substantially increased rates of DNA base-loss at guanosine. In this review, we argue that the latter results from an electron resonance structure unique to guanosine, rather than adenosine having an extra resonance structure over guanosine as previously suggested.

  13. Automotive Radar and Lidar Systems for Next Generation Driver Assistance Functions

    Directory of Open Access Journals (Sweden)

    R. H. Rasshofer

    2005-01-01

    Full Text Available Automotive radar and lidar sensors represent key components for next generation driver assistance functions (Jones, 2001). Today, their use is limited to comfort applications in premium segment vehicles, although an evolution process towards more safety-oriented functions is taking place. Radar sensors available on the market today suffer from low angular resolution and poor target detection at medium ranges (30 to 60 m) over azimuth angles larger than ±30°. In contrast, lidar sensors show large sensitivity towards environmental influences (e.g. snow, fog, dirt). Both sensor technologies today have a rather high cost level, forbidding their widespread usage in mass markets. A common approach to overcome individual sensor drawbacks is the employment of data fusion techniques (Bar-Shalom, 2001). Raw data fusion requires a common, standardized data interface to easily integrate a variety of asynchronous sensor data into a fusion network. Moreover, next generation sensors should be able to dynamically adapt to new situations and should have the ability to work in cooperative sensor environments. As vehicular function development today is being shifted more and more towards virtual prototyping, mathematical sensor models should be available. These models should take into account the sensor's functional principle as well as all typical measurement errors generated by the sensor.
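
    The simplest form of the sensor data fusion mentioned above is inverse-variance weighting of independent estimates, e.g. a radar and a lidar range measurement of the same target. A minimal, hypothetical sketch (not the fusion architecture of the paper, which addresses raw-data fusion networks):

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent sensor estimates.
    measurements: list of (value, variance) pairs. Returns the fused
    estimate and its variance, which is smaller than any single input's."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total
```

    Fusing a radar range of 10.0 m and a lidar range of 12.0 m, each with unit variance, yields 11.0 m with variance 0.5: the fused estimate is always at least as certain as the best individual sensor, which is the basic argument for combining the two complementary technologies.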

  14. Photographic inspection apparatus and process to know the shape and the dimensions of the end parts of steam generator tubes

    International Nuclear Information System (INIS)

    Martin, A.

    1986-01-01

    Before any inspection or repair operation on the tubes of a steam generator, one needs to know the shape and the dimensions of the tube hole near the primary face of the tube plate. The photographic inspection apparatus is moved parallel to the tube plate, inside the water box, such that its optical axis remains parallel to a fixed direction during its displacement. Photographs of the primary face of the tube plate are taken successively with the apparatus in different positions, so as to obtain at least two photographs of each tube to be inspected, under different angles. Photographs of the primary face of the tube plate and of the tube ends are developed at a determined scale. The photographs are then oriented two by two to obtain a stereophotogrammetric view of the end parts of each tube. Measurements and examinations are made from the stereophotogrammetric view obtained for each tube, outside the steam generator zone. The invention concerns both the process and the photographic apparatus described in the present patent [fr
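
    The stereophotogrammetric measurement described rests on the classic stereo relation: two photographs taken from positions a known baseline apart give a parallax (disparity) for each tube end, from which range follows. A hedged, illustrative sketch of that relation only (not the patent's actual measurement procedure):

```python
def depth_from_disparity(focal_length_mm, baseline_mm, disparity_mm):
    """Classic stereo relation Z = f * B / d: range to a feature from its
    parallax d between two photographs taken a baseline B apart."""
    if disparity_mm <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_mm * baseline_mm / disparity_mm
```

    With a 50 mm lens, a 200 mm baseline and a 2 mm measured disparity, the feature lies 5000 mm from the camera plane; larger baselines give larger disparities and hence better depth resolution.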

  15. Precise Thermometry for Next Generation LHC Superconducting Magnet Prototypes

    CERN Document Server

    Datskov, V; Bottura, L; Perez, J C; Borgnolutti, F; Jenninger, B; Ryan, P

    2013-01-01

    The next generation of LHC superconducting magnets is very challenging and must operate in harsh conditions: high radiation doses in a range between 10 and 50 MGy, a high voltage environment of 1 to 5 kV during a quench, dynamic high magnetic fields up to 12 T, and a dynamic temperature range of 1.8 K to 300 K in 0.6 s. For magnet performance and long term reliability it is important to study dynamic thermal effects, such as the heat flux through the magnet structure, or to measure hot spots in conductors during a magnet quench at high sampling rates above 200 Hz. A comparison of cryogenic temperature sensors available on the market is given. An analytical model for a special electrically insulating thermal anchor (Kapton pad) with high voltage insulation is described. A set of instrumentation is proposed for fast monitoring of thermal processes during normal operation, quenches and failure situations. This paper presents the technology applicable for mounting temperature sensors on high voltage superconducting (SC) cables....

  16. Development of High Frequency Transition-Edge-Sensor Polarimeters for Next Generation Cosmic Microwave Background Experiments and Galactic Foreground Measurements

    Science.gov (United States)

    Walker, Samantha; Sierra, Carlos E.; Austermann, Jason Edward; Beall, James; Becker, Dan; Dober, Bradley; Duff, Shannon; Hilton, Gene; Hubmayr, Johannes; Van Lanen, Jeffrey L.; McMahon, Jeff; Simon, Sara M.; Ullom, Joel; Vissers, Michael R.; NIST Quantum Sensors Group

    2018-06-01

    Observations of the cosmic microwave background (CMB) provide a powerful tool for probing the earliest moments of the universe and therefore have the potential to transform our understanding of cosmology. In particular, precision measurements of its polarization can reveal the existence of gravitational waves produced during cosmic inflation. However, these observations are complicated by the presence of astrophysical foregrounds, which may be separated by using broad frequency coverage, as the spectral energy distribution between foregrounds and the CMB is distinct. For this purpose, we are developing large-bandwidth, feedhorn-coupled transition-edge-sensor (TES) arrays that couple polarized light from waveguide to superconducting microstrip by use of a symmetric, planar orthomode transducer (OMT). In this work, we describe two types of pixels, an ultra-high frequency (UHF) design, which operates from 195 GHz-315 GHz, and an extended ultra-high frequency (UHF++) design, which operates from 195 GHz-420 GHz, being developed for next generation CMB experiments that will come online in the next decade, such as CCAT-prime and the Simons Observatory. We present the designs, simulation results, fabrication, and preliminary measurements of these prototype pixels.

  17. EIGER: Next generation single photon counting detector for X-ray applications

    Energy Technology Data Exchange (ETDEWEB)

    Dinapoli, Roberto, E-mail: roberto.dinapoli@psi.ch [Paul Scherrer Institut, 5232 Villigen PSI (Switzerland); Bergamaschi, Anna; Henrich, Beat; Horisberger, Roland; Johnson, Ian; Mozzanica, Aldo; Schmid, Elmar; Schmitt, Bernd; Schreiber, Akos; Shi, Xintian; Theidel, Gerd [Paul Scherrer Institut, 5232 Villigen PSI (Switzerland)

    2011-09-11

    EIGER is an advanced family of single photon counting hybrid pixel detectors, primarily aimed at diffraction experiments at synchrotrons. Optimization of maximal functionality and minimal pixel size (using a 0.25 μm process and conserving the radiation tolerant design) has resulted in 75×75 μm² pixels. Every pixel comprises a preamplifier, shaper, discriminator (with a 6 bit DAC for threshold trimming), a configurable 4/8/12 bit counter with double buffering, as well as readout, control and test circuitry. A novel feature of this chip is its double buffered counter, meaning a next frame can be acquired while the previous one is being read out. An array of 256×256 pixels fits on a ~2×2 cm² chip, and a sensor of ~8×4 cm² will be equipped with eight readout chips to form a module containing 0.5 Mpixel. Several modules can then be tiled to form larger area detectors. Detectors up to 4×8 modules (16 Mpixel) are planned. To achieve frame rates of up to 24 kHz the readout architecture is highly parallel, and the chip readout happens in parallel on 32 readout lines with a 100 MHz Double Data Rate clock. Several chips and singles (i.e. a single chip bump-bonded to a single chip silicon sensor) were tested both with a lab X-ray source and at Swiss Light Source (SLS) beamlines. These tests demonstrate the full functionality of the chip and provide a first assessment of its performance. High resolution X-ray images and 'high speed movies' were produced, even without threshold trimming, at the target system frame rates (up to ~24 kHz in 4 bit mode). In parallel, dedicated hardware, firmware and software had to be developed to cope with the enormous data rate the chip is capable of delivering. Details of the chip design and tests will be given, as well as highlights of both test and final readout systems.
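
    The double-buffered counter is a ping-pong scheme: one counter bank accumulates the current frame while the other holds the finished frame for readout, eliminating readout dead time. A toy software model of the idea (purely illustrative; it does not represent EIGER's actual pixel logic):

```python
class PingPongCounter:
    """Two counter banks per detector: 'active' accumulates photon counts
    for the current frame while 'readout' holds the previous frame."""
    def __init__(self, n_pixels):
        self.active = [0] * n_pixels
        self.readout = [0] * n_pixels

    def count(self, pixel):
        """Register one photon hit in the current frame."""
        self.active[pixel] += 1

    def end_frame(self):
        """Swap banks: the finished frame becomes readable immediately,
        and counting continues in a cleared bank with no dead time."""
        self.active, self.readout = self.readout, self.active
        frame = list(self.readout)
        self.active = [0] * len(self.active)
        return frame
```

    In hardware the swap is a register-bank select rather than a copy, which is what allows acquisition of the next frame to overlap readout of the previous one.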

  18. Design of a fault diagnosis system for next generation nuclear power plants

    International Nuclear Information System (INIS)

    Zhao, K.; Upadhyaya, B.R.; Wood, R.T.

    2004-01-01

    A new design approach for fault diagnosis is developed for next generation nuclear power plants. In the nuclear reactor design phase, data reconciliation is used as an efficient tool to determine the measurement requirements needed to achieve the specified fault diagnosis goals. In the reactor operation phase, plant measurements are collected to estimate uncertain model parameters so that a high fidelity model can be obtained for fault diagnosis. The proposed fault detection and isolation algorithm combines the strengths of first-principle-model-based and historical-data-based fault diagnosis. Principal component analysis on the reconciled data is used to develop a statistical model for fault detection. The principal component model, updated with the most recent reconciled data, is a local linearization around the current plant measurements, so the approach is applicable to generic nonlinear systems. Sensor fault diagnosis and process fault diagnosis are decoupled by treating process fault diagnosis as a parameter estimation problem. The developed approach has been applied to the IRIS helical coil steam generator system to monitor the operational performance of individual steam generators. The approach is general enough to support the design of fault diagnosis systems for next generation nuclear power plants. (authors)
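
    PCA-based fault detection of the kind described is commonly implemented with the squared prediction error (SPE, or Q-statistic): a model is fit on normal operating data, and a new measurement whose residual outside the retained subspace exceeds a threshold is flagged. A hedged sketch of that generic technique, not the authors' exact algorithm (function names are illustrative):

```python
import numpy as np

def fit_pca_monitor(X_normal, n_components):
    """Fit a PCA model on normal operating data: returns the mean, scale,
    and retained loading matrix needed to compute SPE online."""
    mu = X_normal.mean(axis=0)
    sigma = X_normal.std(axis=0) + 1e-12
    Z = (X_normal - mu) / sigma
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T          # retained principal directions
    return mu, sigma, P

def spe(x, mu, sigma, P):
    """Squared prediction error of one measurement vector: the squared
    norm of its residual outside the retained principal subspace."""
    z = (x - mu) / sigma
    residual = z - P @ (P.T @ z)
    return float(residual @ residual)
```

    In use, a threshold is set from the SPE distribution of the training data; a measurement with a strongly anomalous variable produces an SPE well above that threshold and is flagged as a fault.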

  19. A single-stage high pressure steam injector for next generation reactors: test results and analysis

    International Nuclear Information System (INIS)

    Cattadori, G.; Galbiati, L.; Mazzocchi, L.; Vanini, P.

    1995-01-01

    Steam injectors can be used in advanced light water reactors (ALWRs) for high pressure makeup water supply; this solution seems to be very attractive because of the ''passive'' features of steam injectors, that would take advantage of the available energy from primary steam without the introduction of any rotating machinery. The reference application considered in this work is a high pressure safety injection system for a BWR; a water flow rate of about 60 kg/s to be delivered against primary pressures covering a quite wide range up to 9 MPa is required. Nevertheless, steam driven water injectors with similar characteristics could be used to satisfy the high pressure core coolant makeup requirements of next generation PWRs. With regard to BWR application, an instrumented steam injector prototype with a flow rate scaling factor of about 1:6 has been built and tested. The tested steam injector operates at a constant inlet water pressure (about 0.2 MPa) and inlet water temperature ranging from 15 to 37 °C, with steam pressure ranging from 2.5 to 8.7 MPa, always fulfilling the discharge pressure target (10% higher than steam pressure). To achieve these results an original double-overflow flow rate-control/startup system has been developed. (Author)

  20. NOAA Next Generation Radar (NEXRAD) Level 2 Base Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of Level II weather radar data collected from Next-Generation Radar (NEXRAD) stations located in the contiguous United States, Alaska, Hawaii,...

  1. Next-generation phylogeography: a targeted approach for multilocus sequencing of non-model organisms.

    Directory of Open Access Journals (Sweden)

    Jonathan B Puritz

    Full Text Available The field of phylogeography has long since realized the need for and utility of incorporating nuclear DNA (nDNA) sequences into analyses. However, the use of nDNA sequence data at the population level has been hindered by technical laboratory difficulty, sequencing costs, and problematic analytical methods for dealing with genotypic sequence data, especially in non-model organisms. Here, we present a method utilizing the 454 GS-FLX Titanium pyrosequencing platform with the capacity to simultaneously sequence two species of sea star (Meridiastra calcar and Parvulastra exigua) at five different nDNA loci across 16 different populations of 20 individuals each per species. We compare results from 3 populations with traditional Sanger-sequencing-based methods, and demonstrate that this next-generation sequencing platform is more time and cost effective and more sensitive to rare variants than Sanger-based sequencing. A crucial advantage is that the high coverage of clonally amplified sequences simplifies haplotype determination, even in highly polymorphic species. This targeted next-generation approach can greatly increase the use of nDNA sequence loci in phylogeographic and population genetic studies by mitigating many of the time, cost, and analytical issues associated with highly polymorphic, diploid sequence markers.
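
    The point about high coverage simplifying haplotype determination can be illustrated with a toy model: clonally amplified reads of the same allele are identical, so grouping identical reads and discarding rare groups (likely amplification or sequencing errors) recovers the haplotypes directly. A simplified, hypothetical sketch, not the authors' analysis pipeline:

```python
from collections import Counter

def call_haplotypes(reads, min_fraction=0.05):
    """Collapse clonally amplified reads into haplotypes: group identical
    read sequences and drop groups below a frequency cutoff, treating
    them as likely amplification/sequencing errors."""
    counts = Counter(reads)
    total = sum(counts.values())
    return {seq: n for seq, n in counts.items() if n / total >= min_fraction}
```

    With deep coverage, a heterozygous individual shows two abundant read groups and a scattering of rare error groups, so the two haplotypes fall out without the statistical phasing that diploid Sanger traces require.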

  2. 76 FR 77939 - Proposed Provision of Navigation Services for the Next Generation Air Transportation System...

    Science.gov (United States)

    2011-12-15

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration 14 CFR Parts 91, 121, 125, 129, and 135 Proposed Provision of Navigation Services for the Next Generation Air Transportation System (Next...) navigation infrastructure to enable performance-based navigation (PBN) as part of the Next Generation Air...

  3. Metrology/viewing system for next generation fusion reactors

    International Nuclear Information System (INIS)

    Spampinato, P.T.; Barry, R.E.; Chesser, J.B.; Menon, M.M.; Dagher, M.A.

    1997-01-01

    Next generation fusion reactors require accurate measuring systems to verify sub-millimeter alignment of plasma-facing components in the reactor vessel. A metrology system capable of achieving such accuracy must be compatible with the vessel environment of high gamma radiation, high vacuum, elevated temperature, and magnetic field. This environment requires that the system must be remotely deployed. A coherent, frequency modulated laser radar system is being integrated with a remotely operated deployment system to meet these requirements. The metrology/viewing system consists of a compact laser transceiver optics module which is linked through fiber optics to the laser source and imaging units that are located outside of the harsh environment. The deployment mechanism is a telescopic-mast positioning system. This paper identifies the requirements for the International Thermonuclear Experimental Reactor metrology and viewing system, and describes a remotely operated precision ranging and surface mapping system

  4. Graphene Transparent Conductive Electrodes for Next- Generation Microshutter Arrays

    Science.gov (United States)

    Li, Mary; Sultana, Mahmooda; Hess, Larry

    2012-01-01

    Graphene is a single atomic layer of graphite. It is optically transparent and has high electron mobility, and thus has great potential for making transparent conductive electrodes. This invention contributes towards the development of graphene transparent conductive electrodes for next-generation microshutter arrays. The original design for the electrodes of the next generation of microshutters uses indium tin oxide (ITO) as the electrode material. ITO is widely used in NASA flight missions. However, the optical transparency of ITO is limited, and the material is brittle. Also, ITO has been getting more expensive in recent years. The objective of the invention is to develop a graphene transparent conductive electrode to replace ITO. An exfoliation procedure was developed to make graphene from graphite crystals. In addition, large areas of single-layer graphene with high optical transparency were produced using low-pressure chemical vapor deposition (LPCVD). A special graphene transfer procedure was developed for transferring graphene from copper substrates to arbitrary substrates. The concept is to grow large-size graphene sheets using the LPCVD system through chemical reaction, transfer the graphene film to a substrate, dope the graphene to reduce the sheet resistance, and pattern the film to the dimensions of the electrodes in the microshutter array. Graphene transparent conductive electrodes are expected to have a transparency of 97.7% across the electromagnetic spectrum from UV to IR. In comparison, ITO electrodes currently used in microshutter arrays have 85% transparency in the mid-IR, and suffer a dramatic transparency drop at near-IR and shorter wavelengths. Thus, graphene also has potential application as transparent conductive electrodes for Schottky photodiodes in the UV region.

  5. Cloud Sourcing – Next Generation Outsourcing?

    OpenAIRE

    Muhic, Mirella; Johansson, Björn

    2014-01-01

    Although Cloud Sourcing has been around for some time it could be questioned what actually is known about it. This paper presents a literature review on the specific question if Cloud Sourcing could be seen as the next generation of outsourcing. The reason for doing this is that from an initial sourcing study we found that the sourcing decisions seems to go in the direction of outsourcing as a service which could be described as Cloud Sourcing. Whereas some are convinced that Cloud Sourcing r...

  6. Securing Networks from Modern Threats using Next Generation Firewalls

    OpenAIRE

    Delgiusto, Valter

    2016-01-01

    Classic firewalls have long been unable to cope with modern threats that ordinary Internet users are exposed to. This thesis discusses their successors - the next-generation firewalls. The first part of the thesis describes modern threats and attacks. We described in detail the DoS and APT attacks, which are among the most frequent and which may cause most damage to the system under attack. Then we explained the theoretical basics of firewalls and described the functionalities of next gen...

  7. Next generation of energy production systems

    International Nuclear Information System (INIS)

    Rouault, J.; Garnier, J.C.; Carre, F.

    2003-01-01

    This document gathers the slides that have been presented at the Gedepeon conference. Gedepeon is a research group involving scientists from Cea (French atomic energy commission), CNRS (national center of scientific research), EDF (electricity of France) and Framatome that is devoted to the study of new energy sources and particularly to the study of the future generations of nuclear systems. The contributions have been classed into 9 topics: 1) gas cooled reactors, 2) molten salt reactors (MSBR), 3) the recycling of plutonium and americium, 4) reprocessing of molten salt reactor fuels, 5) behavior of graphite under radiation, 6) metallic materials for molten salt reactors, 7) refractory fuels of gas cooled reactors, 8) the nuclear cycle for the next generations of nuclear systems, and 9) organization of research programs on the new energy sources

  8. MCBooster: a tool for MC generation for massively parallel platforms

    CERN Multimedia

    Alves Junior, Antonio Augusto

    2016-01-01

    MCBooster is a header-only, C++11-compliant library for the generation of large samples of phase-space Monte Carlo events on massively parallel platforms. It was released on GitHub in the spring of 2016. The library core algorithms implement the Raubold-Lynch method; they are able to generate the full kinematics of decays with up to nine particles in the final state. The library supports the generation of sequential decays as well as the parallel evaluation of arbitrary functions over the generated events. The output of MCBooster completely accords with popular and well-tested software packages such as GENBOD (W515 from CERNLIB) and TGenPhaseSpace from the ROOT framework. MCBooster is developed on top of the Thrust library and runs on Linux systems. It deploys transparently on NVidia CUDA-enabled GPUs as well as multicore CPUs. This contribution summarizes the main features of MCBooster. A basic description of the user interface and some examples of applications are provided, along with measurements of perfor...
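The building block that phase-space generators like MCBooster parallelize is the isotropic two-body decay in the parent rest frame; the Raubold-Lynch method chains it together for N-body final states. The following is a minimal Python sketch of that kinematic step only, not MCBooster's actual CUDA/Thrust implementation:

```python
import math
import random

def two_body_decay(M, m1, m2, rng=None):
    """Generate one isotropic two-body decay in the parent rest frame.
    Returns two daughter four-momenta as (E, px, py, pz) tuples."""
    rng = rng or random.Random()
    # Breakup momentum from the Kallen triangle function
    p = math.sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2.0 * M)
    # Isotropic direction: uniform in cos(theta) and phi
    cos_t = rng.uniform(-1.0, 1.0)
    sin_t = math.sqrt(1.0 - cos_t**2)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    px, py, pz = p * sin_t * math.cos(phi), p * sin_t * math.sin(phi), p * cos_t
    e1, e2 = math.sqrt(p * p + m1 * m1), math.sqrt(p * p + m2 * m2)
    # Daughters are back to back in the parent rest frame
    return (e1, px, py, pz), (e2, -px, -py, -pz)
```

Energy and momentum conservation hold by construction: the daughter energies sum to the parent mass M and the three-momenta cancel exactly.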

  9. libstable: Fast, Parallel, and High-Precision Computation of α-Stable Distributions in R, C/C++, and MATLAB

    Directory of Open Access Journals (Sweden)

    Javier Royuela-del-Val

    2017-06-01

    Full Text Available α-stable distributions are a family of well-known probability distributions. However, the lack of closed analytical expressions hinders their application. Currently, several tools have been developed to numerically evaluate their density and distribution functions or to estimate their parameters, but available solutions either do not reach sufficient precision on their evaluations or are excessively slow for practical purposes. Moreover, they do not take full advantage of the parallel processing capabilities of current multi-core machines. Other solutions work only on a subset of the α-stable parameter space. In this paper we present an R package and a C/C++ library with a MATLAB front-end that permit parallelized, fast and high precision evaluation of density, distribution and quantile functions, as well as random variable generation and parameter estimation of α-stable distributions in their whole parameter space. The described library can be easily integrated into third party developments.
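The numerical difficulty the abstract alludes to can be seen directly: with no closed-form density for general α, f(x) must be recovered from the characteristic function exp(-|t|^α). Below is a crude pure-Python sketch for the standard symmetric case (β = 0) using trapezoidal quadrature; it is an illustration of the inversion integral only, not libstable's algorithm:

```python
import math

def stable_pdf(x, alpha, n=200000, t_max=50.0):
    """Density of a standard symmetric alpha-stable law (beta = 0):
        f(x) = (1/pi) * integral_0^inf exp(-t**alpha) * cos(x*t) dt
    Simple trapezoidal rule; the fixed cutoff t_max is only adequate
    for alpha around 1 or larger."""
    h = t_max / n
    # Trapezoid endpoints: integrand is 1 at t = 0, ~0 at t = t_max
    total = 0.5 * (1.0 + math.exp(-t_max**alpha) * math.cos(x * t_max))
    for i in range(1, n):
        t = i * h
        total += math.exp(-t**alpha) * math.cos(x * t)
    return h * total / math.pi
```

For α = 1 this reproduces the Cauchy density 1/(π(1 + x²)), and for α = 2 a Gaussian with variance 2. The loss of accuracy for small α and the slow convergence of the oscillatory integral are exactly the corners that libstable's dedicated quadrature is designed to handle.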

  10. Improvements on non-equilibrium and transport Green function techniques: The next-generation TRANSIESTA

    Science.gov (United States)

    Papior, Nick; Lorente, Nicolás; Frederiksen, Thomas; García, Alberto; Brandbyge, Mads

    2017-03-01

    We present novel methods implemented within the non-equilibrium Green function (NEGF) code TRANSIESTA based on density functional theory (DFT). Our flexible, next-generation DFT-NEGF code handles devices with one or multiple electrodes (Ne ≥ 1) with individual chemical potentials and electronic temperatures. We describe its novel methods for electrostatic gating, contour optimizations, and assertion of charge conservation, as well as the newly implemented algorithms for optimized and scalable matrix inversion, performance-critical pivoting, and hybrid parallelization. Additionally, a generic NEGF "post-processing" code (TBTRANS/PHTRANS) for electron and phonon transport is presented with several novelties such as Hamiltonian interpolations, Ne ≥ 1 electrode capability, bond-currents, a generalized interface for user-defined tight-binding transport, transmission projection using eigenstates of a projected Hamiltonian, and fast inversion algorithms for large-scale simulations easily exceeding 10^6 atoms on workstation computers. The new features of both codes are demonstrated and benchmarked for relevant test systems.
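The central quantity such NEGF transport codes compute is the transmission T(E) = Tr[Γ_L G Γ_R G†]. For a single electronic level coupled to two leads in the wide-band limit, that trace collapses to a Breit-Wigner form, which this toy Python sketch evaluates (a textbook one-level model, not TRANSIESTA's self-consistent DFT-NEGF machinery):

```python
def transmission(E, eps0, gamma_L, gamma_R):
    """Landauer transmission of a single level at energy eps0 coupled to
    two wide-band leads with broadenings gamma_L, gamma_R:
        T(E) = G_L * G_R / ((E - eps0)**2 + ((G_L + G_R) / 2)**2)
    i.e. Tr[Gamma_L G Gamma_R G+] with the retarded Green function
    G = 1 / (E - eps0 + 1j * (gamma_L + gamma_R) / 2)."""
    return gamma_L * gamma_R / ((E - eps0) ** 2 + ((gamma_L + gamma_R) / 2.0) ** 2)
```

On resonance with symmetric coupling (Γ_L = Γ_R) the transmission reaches exactly 1; asymmetric coupling or detuning suppresses it, which is the basic line shape behind the transmission spectra these codes produce for full atomistic Hamiltonians.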

  11. Comparative analyses of two Geraniaceae transcriptomes using next-generation sequencing.

    Science.gov (United States)

    Zhang, Jin; Ruhlman, Tracey A; Mower, Jeffrey P; Jansen, Robert K

    2013-12-29

    Organelle genomes of Geraniaceae exhibit several unusual evolutionary phenomena compared to other angiosperm families including accelerated nucleotide substitution rates, widespread gene loss, reduced RNA editing, and extensive genomic rearrangements. Since most organelle-encoded proteins function in multi-subunit complexes that also contain nuclear-encoded proteins, it is likely that the atypical organellar phenomena affect the evolution of nuclear genes encoding organellar proteins. To begin to unravel the complex co-evolutionary interplay between organellar and nuclear genomes in this family, we sequenced nuclear transcriptomes of two species, Geranium maderense and Pelargonium x hortorum. Normalized cDNA libraries of G. maderense and P. x hortorum were used for transcriptome sequencing. Five assemblers (MIRA, Newbler, SOAPdenovo, SOAPdenovo-trans [SOAPtrans], Trinity) and two next-generation technologies (454 and Illumina) were compared to determine the optimal transcriptome sequencing approach. Trinity provided the highest quality assembly of Illumina data with the deepest transcriptome coverage. An analysis to determine the amount of sequencing needed for de novo assembly revealed diminishing returns of coverage and quality with data sets larger than sixty million Illumina paired end reads for both species. The G. maderense and P. x hortorum transcriptomes contained fewer transcripts encoding the PLS subclass of PPR proteins relative to other angiosperms, consistent with reduced mitochondrial RNA editing activity in Geraniaceae. In addition, transcripts for all six plastid targeted sigma factors were identified in both transcriptomes, suggesting that one of the highly divergent rpoA-like ORFs in the P. x hortorum plastid genome is functional. The findings support the use of the Illumina platform and assemblers optimized for transcriptome assembly, such as Trinity or SOAPtrans, to generate high-quality de novo transcriptomes with broad coverage. In addition

  12. Tin-based anode materials with well-designed architectures for next-generation lithium-ion batteries

    Science.gov (United States)

    Liu, Lehao; Xie, Fan; Lyu, Jing; Zhao, Tingkai; Li, Tiehu; Choi, Bong Gill

    2016-07-01

    Tin (Sn) has long been considered to be a promising replacement anode material for graphite in next-generation lithium-ion batteries (LIBs), because of its attractive comprehensive advantages of high gravimetric/volumetric capacities, environmental benignity, low cost, high safety, etc. However, Sn-based anodes suffer from severe capacity fading resulting mainly from their large volume expansions/contractions during lithiation/delithiation and subsequent pulverization, coalescence, delamination from current collectors, and poor Li+/electron transport. To circumvent these issues, a number of extraordinary architectures from nanostructures to anchored, layered/sandwich, core-shell, porous and even integrated structures have been exquisitely constructed to enhance the cycling performance. To cater for the rapid development of Sn-based anodes, we summarize the advances made in structural design principles, fabrication methods, morphological features and battery performance with focus on material structures. In addition, we identify the associated challenges and problems presented by recently-developed anodes and offer suggestions and perspectives for facilitating their practical implementations in next-generation LIBs.

  13. Next generation data harmonization

    Science.gov (United States)

    Armstrong, Chandler; Brown, Ryan M.; Chaves, Jillian; Czerniejewski, Adam; Del Vecchio, Justin; Perkins, Timothy K.; Rudnicki, Ron; Tauer, Greg

    2015-05-01

    Analysts are presented with a never-ending stream of data sources. Often, the subsets of data sources needed to solve a problem are easily identified, but the process of aligning the data sets is time consuming. However, semantic technologies allow for fast harmonization of data to overcome these problems. These include ontologies that serve as alignment targets, visual tools and natural language processing that generate semantic graphs in terms of the ontologies, and analytics that leverage these graphs. This research reviews a developed prototype that employs all of these approaches to perform analysis across disparate data sources documenting violent, extremist events.

  14. Novel nanostructures for next generation dye-sensitized solar cells

    KAUST Repository

    Tétreault, Nicolas; Grätzel, Michael

    2012-01-01

    Herein, we review our latest advancements in nanostructured photoanodes for next generation photovoltaics in general and dye-sensitized solar cells in particular. Bottom-up self-assembly techniques are developed to fabricate large-area 3D

  15. Analysis, design, and experimental evaluation of power calculation in digital droop-controlled parallel microgrid inverters

    DEFF Research Database (Denmark)

    Gao, Ming-zhi; Chen, Min; Jin, Cheng

    2013-01-01

    Parallel operation of distributed generation is an important topic for microgrids, which can provide a highly reliable electric supply service and good power quality to end customers when the utility is unavailable. However, there is a well-known limitation: the power sharing accuracy between...

  16. SAMSIN: the next-generation servo-manipulator

    International Nuclear Information System (INIS)

    Adams, R.H.; Jennrich, C.E.; Korpi, K.W.

    1985-01-01

    The Central Research Laboratories (CRL) Division of Sargent Industries is now developing SAMSIN, a next-generation servo-manipulator. SAMSIN is an acronym for Servo-Actuated Manipulator Systems with Intelligent Networks. This paper discusses the objectives of this development and describes the key features of the servo-manipulator system. There are three main objectives in the SAMSIN development: adaptability, reliability, and maintainability. SAMSIN utilizes standard Sargent/CRL sealed master and slave manipulator arms as well as newly developed compact versions. The mechanical arms have more than 20 yr of successful performance in industrial applications such as hot cells, high vacuums, fuel pools, and explosives handling. The servo-actuator package is in a protective enclosure, which may be sealed in various ways from the remote environment. The force limiting characteristics of the servo-actuators extend motion tendon life. Protective bootings increase the reliability of the arms in an environment that is high in airborne contamination. These bootings also simplify the decontamination of the system. The modularity in construction permits quick removal and replacement of slave arms, wrist joints, tong fingers, and actuator packages for maintenance. SAMSIN utilizes readily available off-the-shelf actuator and control system components. Each manipulator motion uses the same actuator and control system components

  17. Generating multiplex gradients of biomolecules for controlling cellular adhesion in parallel microfluidic channels.

    Science.gov (United States)

    Didar, Tohid Fatanat; Tabrizian, Maryam

    2012-11-07

    Here we present a microfluidic platform to generate multiplex gradients of biomolecules within parallel microfluidic channels, in which a range of multiplex concentration gradients with different profile shapes are simultaneously produced. Nonlinear polynomial gradients were also generated using this device. The gradient generation principle is based on implementing parallel channels, each providing a different hydrodynamic resistance. The generated biomolecule gradients were then covalently functionalized onto the microchannel surfaces. Surface gradients along the channel width were a result of covalent attachment of biomolecules to the surface, which remained functional under high shear stresses (50 dyn/cm²). An IgG antibody conjugated to three different fluorescent dyes (FITC, Cy5 and Cy3) was used to demonstrate the resulting multiplex concentration gradients of biomolecules. The device enabled generation of gradients with up to three different biomolecules in each channel with varying concentration profiles. We were also able to produce 2-dimensional gradients in which biomolecules were distributed along the length and width of the channel. To demonstrate the applicability of the developed design, three different multiplex concentration gradients of REDV and KRSR peptides were patterned along the width of three parallel channels, and the adhesion of primary human umbilical vein endothelial cells (HUVECs) in each channel was subsequently investigated using a single chip.
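The stated design principle, parallel channels with different hydrodynamic resistances sharing one pressure drop, is the hydraulic analogue of resistors in parallel, so each channel's flow fraction follows from its conductance. A minimal Python sketch of that split (illustrative only; the resistance values in the usage note are assumptions, not from the paper):

```python
def flow_fractions(resistances, total_flow=1.0):
    """Split total_flow among parallel channels sharing one pressure drop.
    Hydraulic Ohm's law: Q_i is proportional to 1 / R_i."""
    conductances = [1.0 / r for r in resistances]
    g_total = sum(conductances)
    return [total_flow * g / g_total for g in conductances]
```

For example, flow_fractions([1.0, 2.0, 4.0]) yields fractions 4/7, 2/7 and 1/7: halving a channel's resistance doubles its share of the flow, and hence the concentration profile it delivers.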

  18. Teachers' Practices in High School Chemistry Just Prior to the Adoption of the Next Generation Science Standards

    Science.gov (United States)

    Boesdorfer, Sarah B.; Staude, Kristin D.

    2016-01-01

    Effective professional development that influences teachers' classroom practices starts with what teachers know, understand, and do in their classroom. The Next Generation Science Standards (NGSS) challenge teachers to make changes to their classroom; to help teachers make these changes, it is necessary to know what they are doing in their…

  19. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment

    Science.gov (United States)

    Boel, Annekatrien; Steyaert, Wouter; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-01-01

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and can analyze data from any organism with a sequenced genome. PMID:27461955

  20. Challenges and opportunities in estimating viral genetic diversity from next-generation sequencing data

    Directory of Open Access Journals (Sweden)

    Niko Beerenwinkel

    2012-09-01

    Full Text Available Many viruses, including the clinically relevant RNA viruses HIV and HCV, exist in large populations and display high genetic heterogeneity within and between infected hosts. Assessing intra-patient viral genetic diversity is essential for understanding the evolutionary dynamics of viruses, for designing effective vaccines, and for the success of antiviral therapy. Next-generation sequencing technologies allow the rapid and cost-effective acquisition of thousands to millions of short DNA sequences from a single sample. However, this approach entails several challenges in experimental design and computational data analysis. Here, we review the entire process of inferring viral diversity from sample collection to computing measures of genetic diversity. We discuss sample preparation, including reverse transcription and amplification, and the effect of experimental conditions on diversity estimates due to in vitro base substitutions, insertions, deletions, and recombination. The use of different next-generation sequencing platforms and their sequencing error profiles are compared in the context of various applications of diversity estimation, ranging from the detection of single nucleotide variants to the reconstruction of whole-genome haplotypes. We describe the statistical and computational challenges arising from these technical artifacts, and we review existing approaches, including available software, for their solution. Finally, we discuss open problems, and highlight successful biomedical applications and potential future clinical use of next-generation sequencing to estimate viral diversity.

  1. Numerical Analysis of Flow Field in Generator End-Winding Region

    Directory of Open Access Journals (Sweden)

    Wei Tong

    2008-01-01

    Full Text Available Cooling in the end-winding region of a high-powered, large-sized generator still remains a challenge today because of a number of factors: a large number of parts/components with irregular geometries, complexity in cooling flow paths, flow splitting and mixing, and interactions between rotor-induced rotating flows and nonrotating flows from stationary sections. One of the key challenges is modeling the cooling flows passing through the armature bars, which are made up of bundles of strands of insulated copper wire and are bent oppositely so as to cross each other. This work succeeded in modeling a complex generator end-winding region, with great effort put into simplifying the model by treating the armature bar region as a porous medium. The flow and pressure fields in the end-winding region were investigated numerically using an axisymmetric computational fluid dynamics (CFD) model. Based on the analysis, the cooling flow rate at each flow branch (rotor-stator gap, rotor subslot, outside space block, and small ventilation holes to the heat exchanger) was determined, and the high-pressure-gradient zones were identified. The CFD results have been successfully used to optimize the flow path configuration, improving generator operating performance and control of the cooling flow, as well as minimizing windage losses and flow-induced noise.
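Treating the armature-bar region as a porous medium means replacing its detailed geometry with a bulk Darcy-law momentum sink, so the pressure drop across the region scales linearly with flow rate. A back-of-envelope Python sketch; the viscosity, permeability and dimensions in the usage note are illustrative assumptions, not values from the paper:

```python
def darcy_pressure_drop(mu, length, permeability, area, flow_rate):
    """Darcy's law for flow through a porous block:
        dP = mu * L * Q / (k * A)
    with mu in Pa*s, L in m, k in m^2, A in m^2 and Q in m^3/s."""
    return mu * length * flow_rate / (permeability * area)
```

For instance, air (mu of about 1.8e-5 Pa·s) pushed at 0.05 m³/s through a 0.1 m thick block of permeability 1e-8 m² and frontal area 0.01 m² gives a drop of 900 Pa; a CFD solver applies the same sink term cell by cell instead of over the whole block.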

  2. Preparing the Next Generation of Educators for Democracy

    Science.gov (United States)

    Embry-Jenlink, Karen

    2018-01-01

    In the keynote address of the 42nd annual meeting of the Southeastern Regional Educators Association (SRATE), ATE President Karen Embry-Jenlink examines the critical role of teacher educators in preparing the next generation of citizens and leaders to sustain democracy. Drawing from historic and current events and personal experience,…

  3. Next Generation Science Standards: All Standards, All Students

    Science.gov (United States)

    Lee, Okhee; Miller, Emily C.; Januszyk, Rita

    2014-01-01

    The Next Generation Science Standards (NGSS) offer a vision of science teaching and learning that presents both learning opportunities and demands for all students, particularly student groups that have traditionally been underserved in science classrooms. The NGSS have addressed issues of diversity and equity from their inception, and the NGSS…

  4. Microbial production of next-generation stevia sweeteners

    DEFF Research Database (Denmark)

    Olsson, Kim; Carlsen, Simon; Semmler, Angelika

    2016-01-01

    BACKGROUND: The glucosyltransferase UGT76G1 from Stevia rebaudiana is a chameleon enzyme in the targeted biosynthesis of the next-generation premium stevia sweeteners, rebaudioside D (Reb D) and rebaudioside M (Reb M). These steviol glucosides carry five and six glucose units, respectively......, and have low sweetness thresholds, high maximum sweet intensities and exhibit a greatly reduced lingering bitter taste compared to stevioside and rebaudioside A, the most abundant steviol glucosides in the leaves of Stevia rebaudiana. RESULTS: In the metabolic glycosylation grid leading to production....... This screen made it possible to identify variants, such as UGT76G1Thr146Gly and UGT76G1His155Leu, which diminished accumulation of unwanted side-products and gave increased specific accumulation of the desired Reb D or Reb M sweeteners. This improvement in a key enzyme of the Stevia sweetener biosynthesis...

  5. Next Generation Nuclear Plant Project Evaluation of Siting a HTGR Co-generation Plant on an Operating Commercial Nuclear Power Plant Site

    International Nuclear Information System (INIS)

    Demick, L.E.

    2011-01-01

    This paper summarizes an evaluation by the Idaho National Laboratory (INL) Next Generation Nuclear Plant (NGNP) Project of siting a High Temperature Gas-cooled Reactor (HTGR) plant on an existing nuclear plant site that is located in an area of significant industrial activity. This is a co-generation application in which the HTGR Plant will be supplying steam and electricity to one or more of the nearby industrial plants.

  6. Next Generation Nuclear Plant Project Evaluation of Siting a HTGR Co-generation Plant on an Operating Commercial Nuclear Power Plant Site

    Energy Technology Data Exchange (ETDEWEB)

    L.E. Demick

    2011-10-01

    This paper summarizes an evaluation by the Idaho National Laboratory (INL) Next Generation Nuclear Plant (NGNP) Project of siting a High Temperature Gas-cooled Reactor (HTGR) plant on an existing nuclear plant site that is located in an area of significant industrial activity. This is a co-generation application in which the HTGR Plant will be supplying steam and electricity to one or more of the nearby industrial plants.

  7. Next-Generation Sequencing in the Mycology Lab.

    Science.gov (United States)

    Zoll, Jan; Snelders, Eveline; Verweij, Paul E; Melchers, Willem J G

    New state-of-the-art sequencing techniques offer valuable tools both for the detection of mycobiota and for understanding the molecular mechanisms of resistance against antifungal compounds and of virulence. The introduction of new sequencing platforms with enhanced capacity and reduced costs for sequence analysis provides a potentially powerful tool for mycological diagnosis and research. In this review, we summarize the applications of next-generation sequencing techniques in mycology.

  8. Next generation multi-particle event generators for the MSSM

    International Nuclear Information System (INIS)

    Reuter, J.; Kilian, W.; Hagiwara, K.; Krauss, F.; Schumann, S.; Rainwater, D.

    2005-12-01

    We present a next generation of multi-particle Monte Carlo (MC) event generators for the LHC and the ILC in the MSSM, namely the three program packages MadGraph/MadEvent, WHIZARD/O'Mega and Sherpa/AMEGIC++. The interesting but difficult phenomenology of supersymmetric models at the upcoming colliders demands a corresponding complexity and maturity from simulation tools. This includes multi-particle final states, reducible and irreducible backgrounds, spin correlations, real emission of photons and gluons, etc., which are incorporated in the programs presented here. The framework of a model with such a huge particle content, and one as complicated as the MSSM, makes strenuous tests and comparisons of codes inevitable. Various tests show agreement among the three different programs; the tables of cross sections produced in these tests may serve as a future reference for other codes. Furthermore, first MSSM physics analyses performed with these programs are presented here. (orig.)

  9. Next-Generation Thermal Infrared Multi-Body Radiometer Experiment (TIMBRE)

    Science.gov (United States)

    Kenyon, M.; Mariani, G.; Johnson, B.; Brageot, E.; Hayne, P.

    2016-10-01

    We have developed an instrument concept called TIMBRE which belongs to the important class of instruments called thermal imaging radiometers (TIRs). TIMBRE is the next-generation TIR with unparalleled performance compared to the state-of-the-art.

  10. Cellular Automata-Based Parallel Random Number Generators Using FPGAs

    Directory of Open Access Journals (Sweden)

    David H. K. Hoe

    2012-01-01

    Full Text Available Cellular computing represents a new paradigm for implementing high-speed massively parallel machines. Cellular automata (CA), which consist of an array of locally connected processing elements, are a basic form of a cellular-based architecture. The use of field programmable gate arrays (FPGAs) for implementing CA accelerators has shown promising results. This paper investigates the design of CA-based pseudo-random number generators (PRNGs) using an FPGA platform. To improve the quality of the random numbers that are generated, the basic CA structure is enhanced in two ways. First, the addition of a superrule to each CA cell is considered. The resulting self-programmable CA (SPCA) uses the superrule to determine when to make a dynamic rule change in each CA cell. The superrule takes its inputs from neighboring cells and can itself be considered a second CA working in parallel with the main CA. When implemented on an FPGA, the use of lookup tables in each logic cell removes any restrictions on how the superrules should be defined. Second, a hybrid configuration is formed by combining a CA with a linear feedback shift register (LFSR). This is advantageous for FPGA designs due to the compactness of LFSR implementations. A standard software package for statistically evaluating the quality of random number sequences, known as Diehard, is used to validate the results. Both the SPCA and the hybrid CA/LFSR were found to pass all the Diehard tests.
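For a flavor of the basic mechanism, here is a pure-Python sketch of the simplest CA-based PRNG: a one-dimensional Rule 30 automaton whose centre-column bits form the output stream. The paper's SPCA goes well beyond this (per-cell dynamic rule changes driven by a superrule, targeting FPGA lookup tables rather than software), so treat this only as the baseline it improves on:

```python
def rule30_bits(n_bits, width=65):
    """Pseudo-random bits from the centre column of a Rule 30 cellular
    automaton with periodic boundaries, seeded with a single 1 cell.
    Update rule: new_cell = left XOR (centre OR right)."""
    cells = [0] * width
    cells[width // 2] = 1
    out = []
    for _ in range(n_bits):
        out.append(cells[width // 2])
        cells = [cells[(i - 1) % width] ^ (cells[i] | cells[(i + 1) % width])
                 for i in range(width)]
    return out
```

The centre column begins 1, 1, 0, ... and is far weaker statistically than the SPCA or the hybrid CA/LFSR; passing Diehard-grade tests is precisely what the paper's enhancements are for.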

  11. What can next generation sequencing do for you? Next generation sequencing as a valuable tool in plant research

    OpenAIRE

    Bräutigam, Andrea; Gowik, Udo

    2010-01-01

    Next generation sequencing (NGS) technologies have opened fascinating opportunities for the analysis of plants with and without a sequenced genome on a genomic scale. During the last few years, NGS methods have become widely available and cost effective. They can be applied to a wide variety of biological questions, from the sequencing of complete eukaryotic genomes and transcriptomes, to the genome-scale analysis of DNA-protein interactions. In this review, we focus on the use of NGS for pla...

  12. Next generation tools for genomic data generation, distribution, and visualization.

    Science.gov (United States)

    Nix, David A; Di Sera, Tonya L; Dalley, Brian K; Milash, Brett A; Cundick, Robert M; Quinn, Kevin S; Courdy, Samir J

    2010-09-09

    With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Here we present three open-source, platform-independent software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting-edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq.

  13. Targeted enrichment strategies for next-generation plant biology

    Science.gov (United States)

    Richard Cronn; Brian J. Knaus; Aaron Liston; Peter J. Maughan; Matthew Parks; John V. Syring; Joshua. Udall

    2012-01-01

    The dramatic advances offered by modern DNA sequencers continue to redefine the limits of what can be accomplished in comparative plant biology. Even with recent achievements, however, plant genomes present obstacles that can make it difficult to execute large-scale population and phylogenetic studies on next-generation sequencing platforms. Factors like large genome...

  14. Next Generation Science Standards: Adoption and Implementation Workbook

    Science.gov (United States)

    Peltzman, Alissa; Rodriguez, Nick

    2013-01-01

    The Next Generation Science Standards (NGSS) represent the culmination of years of collaboration and effort by states, science educators and experts from across the United States. Based on the National Research Council's "A Framework for K-12 Science Education" and developed in partnership with 26 lead states, the NGSS, when…

  15. Framework for Leading Next Generation Science Standards Implementation

    Science.gov (United States)

    Stiles, Katherine; Mundry, Susan; DiRanna, Kathy

    2017-01-01

    In response to the need to develop leaders to guide the implementation of the Next Generation Science Standards (NGSS), the Carnegie Corporation of New York provided funding to WestEd to develop a framework that defines the leadership knowledge and actions needed to effectively implement the NGSS. The development of the framework entailed…

  16. Validation of Metagenomic Next-Generation Sequencing Tests for Universal Pathogen Detection.

    Science.gov (United States)

    Schlaberg, Robert; Chiu, Charles Y; Miller, Steve; Procop, Gary W; Weinstock, George

    2017-06-01

    - Metagenomic sequencing can be used for detection of any pathogens using unbiased, shotgun next-generation sequencing (NGS), without the need for sequence-specific amplification. Proof-of-concept has been demonstrated in infectious disease outbreaks of unknown causes and in patients with suspected infections but negative results for conventional tests. Metagenomic NGS tests hold great promise to improve infectious disease diagnostics, especially in immunocompromised and critically ill patients. - To discuss challenges and provide example solutions for validating metagenomic pathogen detection tests in clinical laboratories. A summary of current regulatory requirements, largely based on prior guidance for NGS testing in constitutional genetics and oncology, is provided. - Examples from 2 separate validation studies are provided for steps from assay design, and validation of wet bench and bioinformatics protocols, to quality control and assurance. - Although laboratory and data analysis workflows are still complex, metagenomic NGS tests for infectious diseases are increasingly being validated in clinical laboratories. Many parallels exist to NGS tests in other fields. Nevertheless, specimen preparation, rapidly evolving data analysis algorithms, and incomplete reference sequence databases are idiosyncratic to the field of microbiology and often overlooked.

  17. Mobility management techniques for the next-generation wireless networks

    Science.gov (United States)

    Sun, Junzhao; Howie, Douglas P.; Sauvola, Jaakko J.

    2001-10-01

    Tremendous market demands are pushing the development of mobile communications faster than ever before, leading to the emergence of many new advanced techniques. With the converging of mobile and wireless communications with Internet services, the boundary between mobile personal telecommunications and wireless computer networks is disappearing. Wireless networks of the next generation need the support of all the advances on new architectures, standards, and protocols. Mobility management is an important issue in the area of mobile communications, which can be best solved at the network layer. One of the key features of the next generation wireless networks is an all-IP infrastructure. This paper discusses the mobility management schemes for the next generation mobile networks through extending IP's functions with mobility support. A global hierarchical framework model for the mobility management of wireless networks is presented, in which the mobility management is divided into two complementary tasks: macro mobility and micro mobility. As the macro mobility solution, a basic principle of Mobile IP is introduced, together with the optimal schemes and the advances in IPv6. The disadvantages of Mobile IP in solving the micro mobility problem are analyzed, on the basis of which three main proposals are discussed as the micro mobility solutions for mobile communications, including Hierarchical Mobile IP (HMIP), Cellular IP, and Handoff-Aware Wireless Access Internet Infrastructure (HAWAII). A unified model is also described in which the different micro mobility solutions can coexist simultaneously in mobile networks.

  18. A Generic and Efficient E-field Parallel Imaging Correlator for Next-Generation Radio Telescopes

    Science.gov (United States)

    Thyagarajan, Nithyanandan; Beardsley, Adam P.; Bowman, Judd D.; Morales, Miguel F.

    2017-05-01

    Modern radio telescopes are favouring densely packed array layouts with large numbers of antennas (N_A ≳ 1000). Since the complexity of traditional correlators scales as O(N_A^2), there will be a steep cost for realizing the full imaging potential of these powerful instruments. Through our generic and efficient E-field Parallel Imaging Correlator (EPIC), we present the first software demonstration of a generalized direct imaging algorithm, namely the Modular Optimal Frequency Fourier imager. Not only does it bring down the cost for dense layouts to O(N_A log_2 N_A) but it can also image from irregular layouts and heterogeneous arrays of antennas. EPIC is highly modular, parallelizable, implemented in object-oriented Python, and publicly available. We have verified the images produced to be equivalent to those from traditional techniques to within a precision set by gridding coarseness. We have also validated our implementation on data observed with the Long Wavelength Array (LWA1). We provide a detailed framework for imaging with heterogeneous arrays and show that EPIC robustly estimates the input sky model for such arrays. Antenna layouts with dense filling factors consisting of a large number of antennas such as LWA, the Square Kilometre Array, Hydrogen Epoch of Reionization Array, and Canadian Hydrogen Intensity Mapping Experiment will gain significant computational advantage by deploying an optimized version of EPIC. The algorithm is a strong candidate for instruments targeting transient searches of fast radio bursts as well as planetary and exoplanetary phenomena due to the availability of high-speed calibrated time-domain images and low output bandwidth relative to visibility-based systems.
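The quoted scalings can be compared with a back-of-the-envelope calculation. This is a sketch only: constant factors, the gridding step, and per-channel bookkeeping are ignored, and the function names are invented for illustration.

```python
import math

# Rough operation-count comparison implied by the abstract, per frequency
# channel and time sample: a traditional correlator performs O(N_A^2)
# pairwise cross-correlations, while a direct-imaging approach such as the
# one described scales as O(N_A log2 N_A). Only the trend is meaningful.

def correlator_ops(n_antennas):
    """Pairwise cross-correlation cost, up to a constant factor."""
    return n_antennas ** 2

def direct_imaging_ops(n_antennas):
    """FFT-based direct-imaging cost, up to a constant factor."""
    return n_antennas * math.log2(n_antennas)

# The advantage grows with array size, which is why dense, large-N arrays
# benefit most:
for n in (256, 1024, 4096):
    print(n, correlator_ops(n) / direct_imaging_ops(n))
```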

  19. Experimental and computational studies of thermal mixing in next generation nuclear reactors

    Science.gov (United States)

    Landfried, Douglas Tyler

    The Very High Temperature Reactor (VHTR) is a proposed next generation nuclear power plant. The VHTR utilizes helium as a coolant in the primary loop of the reactor. Helium traveling through the reactor mixes below the reactor in a region known as the lower plenum. In this region there exist large temperature and velocity gradients due to non-uniform heat generation in the reactor core. Due to these large gradients, concern should be given to reducing thermal striping in the lower plenum. Thermal striping is the phenomenon by which temperature fluctuations in the fluid are transferred to and attenuated by surrounding structures. Thermal striping is a known cause of long-term material failure. To better understand and predict thermal striping in the lower plenum, two separate bodies of work have been conducted. First, an experimental facility capable of predictably recreating some aspects of flow in the lower plenum was designed according to scaling analysis of the VHTR. Namely, the facility reproduces jets issuing into a crossflow past a tube bundle. Second, extensive studies investigating the mixing of a non-isothermal parallel round triple-jet at two jet-to-jet spacings were conducted. Experimental results were validated against an open-source computational fluid dynamics package, OpenFOAM. Additional care is given to understanding the implementation of the realizable k-ε and Launder-Gibson RSM turbulence models in OpenFOAM. In order to measure velocity and temperature in the triple-jet experiment, a detailed investigation of temperature-compensated hotwire anemometry was carried out, with special concern given to quantifying the error in the measurements. Finally, qualitative comparisons of trends in the experimental and computational results are conducted. A new and unexpected physical behavior was observed in the center jet, which appeared to spread unexpectedly for close spacings (S/Djet = 1.41).

  20. New Dimensions of Research on Actinomycetes: Quest for Next Generation Antibiotics

    Directory of Open Access Journals (Sweden)

    Polpass Arul Jose

    2016-08-01

    Full Text Available Starting with the discovery of streptomycin, the promise of natural products research on actinomycetes has been captivating researchers and has offered an array of life-saving antibiotics. However, most actinomycetes have received little attention from researchers beyond isolation and activity screening. Noticeable gaps in genomic information and the associated biosynthetic potential of actinomycetes are the main reasons for this situation, which has led to a decline in the discovery rate of novel antibiotics. Recent insights gained from genome mining have revealed a massive existence of previously unrecognized biosynthetic potential in actinomycetes. Successive developments in next-generation sequencing, genome editing, analytical separation and high-resolution spectroscopic methods have reinvigorated interest in such actinomycetes and opened new avenues for the discovery of natural and natural-inspired antibiotics. This article describes the new dimensions that have driven the ongoing resurgence of research on actinomycetes, with historical background since its commencement in 1940, for researchers worldwide. Coupled with increasing advancement in molecular and analytical tools and techniques, the discovery of next-generation antibiotics could be possible by revisiting the untapped potential of actinomycetes from different natural sources.

  1. Cloning and Identification of Recombinant Argonaute-Bound Small RNAs Using Next-Generation Sequencing.

    Science.gov (United States)

    Gangras, Pooja; Dayeh, Daniel M; Mabin, Justin W; Nakanishi, Kotaro; Singh, Guramrit

    2018-01-01

    Argonaute proteins (AGOs) are loaded with small RNAs as guides to recognize target mRNAs. Since the target specificity heavily depends on the base complementarity between two strands, it is important to identify small guide and long target RNAs bound to AGOs. For this purpose, next-generation sequencing (NGS) technologies have extended our appreciation truly to the nucleotide level. However, the identification of RNAs via NGS from scarce RNA samples remains a challenge. Further, most commercial and published methods are compatible with either small RNAs or long RNAs, but are not equally applicable to both. Therefore, a single method that yields quantitative, bias-free NGS libraries to identify small and long RNAs from low levels of input will be of wide interest. Here, we introduce such a procedure that is based on several modifications of two published protocols and allows robust, sensitive, and reproducible cloning and sequencing of small amounts of RNAs of variable lengths. The method was applied to the identification of small RNAs bound to a purified eukaryotic AGO. Following ligation of a DNA adapter to the RNA 3'-end, the key feature of this method is to use the adapter for priming reverse transcription (RT), wherein biotinylated deoxyribonucleotides are specifically incorporated into the extended complementary DNA. Such RT products are enriched on streptavidin beads, circularized while immobilized on beads, and directly used for PCR amplification. We provide a stepwise guide to generate RNA-Seq libraries, their purification, quantification, validation, and preparation for next-generation sequencing. We also provide basic steps in post-NGS data analyses using Galaxy, an open-source, web-based platform.

  2. Convergence of wireless, wireline, and photonics next generation networks

    CERN Document Server

    Iniewski, Krzysztof

    2010-01-01

    Filled with illustrations and practical examples from industry, this book provides a brief but comprehensive introduction to the next-generation wireless networks that will soon replace more traditional wired technologies. Written by a mixture of top industrial experts and key academic professors, it is the only book available that covers both wireless networks (such as wireless local area and personal area networks) and optical networks (such as long-haul and metropolitan networks) in one volume. It gives engineers and engineering students the necessary knowledge to meet challenges of next-ge

  3. Economic factors for the next generation NPPs

    International Nuclear Information System (INIS)

    Bengt, I.; Matzie, R.A.

    1996-01-01

    This paper has summarized the major economic factors that will impact the economic viability of the next generation of nuclear power plants. To make these plants competitive with other sources of electric power, they must have a large plant output (1000-1400 MWe), be constructed over a short time period (on the order of four years or less), be standardized designs which are pre-licensed, and achieve high availability through the use of long operating cycles and short refueling outages. Many features in the design of these plants can promote these attributes. It is the task of the designer, working concurrently with the plant constructor and equipment supplier, to obtain an integrated design that achieves these goals. It is important from the beginning that all interested parties recognize that there must be a balance between the desire for improved safety and the cost to achieve this safety. Similarly, there must be a recognition that the economics of nuclear power plants are based on power generation costs over a sixty-year period, not on the initial capital cost of the plant. The initial capital cost of the plant is only about one-third of the total cost of running the plant for its lifetime. Thus, focusing on the initial capital costs may drive the designers to incorporate features that adversely affect its future operation. Features such as compact plant designs that have restricted access to components, and the use of highly interconnected systems that perform multiple functions, result in increased difficulty of operating and maintaining the plant. Exhaustive planning in all phases of the plant life cycle will reap dramatic dividends in the reduction of power generation costs. The planning done in the design phase by utilizing designers, constructors, and operators will result in a plant that has lower power generation costs.
Planning during the construction phase can result in a shorter schedule, by eliminating essentially all rework

  4. Coupled Qubits for Next Generation Quantum Annealing: Novel Interactions

    Science.gov (United States)

    Samach, Gabriel; Weber, Steven; Hover, David; Rosenberg, Danna; Yoder, Jonilyn; Kim, David; Oliver, William D.; Kerman, Andrew J.

    While the first generation of quantum annealers based on Josephson junction technology have been successfully engineered to represent arrays of spins in the quantum transverse-field Ising model, no circuit architecture to date has succeeded in emulating the more complicated non-stoquastic Hamiltonians of interest for next generation quantum annealing. Here, we present our recent results for tunable ZZ- and XX-coupling between high coherence superconducting flux qubits. We discuss the larger architectures these coupled two-qubit building blocks will enable, as well as comment on the limitations of such architectures. This research was funded by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA) and by the Assistant Secretary of Defense for Research & Engineering under Air Force Contract No. FA8721-05-C-0002. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of ODNI, IARPA, or the US Government.

  5. Designing Next Generation Telecom Regulation

    DEFF Research Database (Denmark)

    Henten, Anders; Samarajiva, Rohan

    Continuously expanding applications of information and communication technologies (ICT) are transforming local, national, regional and international economies into network economies, the foundation for information societies. They are being built upon expanded and upgraded national telecom networks ... to creating an environment to foster a massive expansion in the coverage and capabilities of the information infrastructure networks, with national telecom regulators as the key implementers of the policies of reform. The first phase of reform has focused on industry specific telecom policy and regulation ... – ICT convergence regulation and multisector utility regulation. Whatever structure of next generation telecom regulation is adopted, all countries will need to pay much greater attention to the need for increased coordination of policy directions and regulatory activities both across the industries ...

  6. ERP II: Next-generation Extended Enterprise Resource Planning

    DEFF Research Database (Denmark)

    Møller, Charles

    2004-01-01

    ERP II (ERP/2) systems is a new concept introduced by Gartner Group in 2000 in order to label the latest extensions of the ERP-systems. The purpose of this paper is to explore the next-generation of ERP systems, the Extended Enterprise Resource Planning (EERP or as we prefer to use: e...... impact on extended enterprise architecture.....

  7. ERP II - Next-generation Extended Enterprise Resource Planning

    DEFF Research Database (Denmark)

    Møller, Charles

    2003-01-01

    ERP II (ERP/2) systems is a new concept introduced by Gartner Group in 2000 in order to label the latest extensions of the ERP-systems. The purpose of this paper is to explore the next-generation of ERP systems, the Extended Enterprise Resource Planning (EERP or as we prefer to use: e...... impact on extended enterprise architecture....

  8. Applications and Case Studies of the Next-Generation Sequencing Technologies in Food, Nutrition and Agriculture.

    Science.gov (United States)

    Next-generation sequencing technologies are able to produce high-throughput short sequence reads in a cost-effective fashion. The emergence of these technologies has not only facilitated genome sequencing but also changed the landscape of life sciences. Here I survey their major applications ranging...

  9. High-Average-Power Diffraction Pulse-Compression Gratings Enabling Next-Generation Ultrafast Laser Systems

    Energy Technology Data Exchange (ETDEWEB)

    Alessi, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-11-01

    Pulse compressors for ultrafast lasers have been identified as a technology gap in the push towards high peak power systems with high average powers for industrial and scientific applications. Gratings for ultrashort (sub-150 fs) pulse compressors are metallic and can absorb a significant percentage of laser energy, resulting in up to 40% loss as well as thermal issues which degrade on-target performance. We have developed a next generation gold grating technology which we have scaled to the petawatt-size. This resulted in improvements in efficiency, uniformity and processing as compared to previous substrate-etched gratings for high average power. This new design has a deposited dielectric material for the grating ridge rather than etching directly into the glass substrate. It has been observed that average powers as low as 1 W in a compressor can cause distortions in the on-target beam. We have developed and tested a method of actively cooling diffraction gratings which, in the case of gold gratings, can support a petawatt peak power laser with up to 600 W average power. We demonstrated thermo-mechanical modeling of a grating in its use environment and benchmarked it against experimental measurements. Multilayer dielectric (MLD) gratings are not yet used for these high peak power, ultrashort pulse durations due to their design challenges. We have designed and fabricated broad bandwidth, low dispersion MLD gratings suitable for delivering 30 fs pulses at high average power. This new grating design requires the use of a novel Out Of Plane (OOP) compressor, which we have modeled, designed, built and tested. This prototype compressor yielded a transmission of 90% for a pulse with 45 nm bandwidth, free of spatial and angular chirp. In order to evaluate gratings and compressors built in this project we have commissioned a joule-class ultrafast Ti:Sapphire laser system. Combining the grating cooling and MLD technologies developed here could enable petawatt laser systems to

  10. Environmental Information for the U.S. Next Generation Air Transportation System (NextGen)

    Science.gov (United States)

    Murray, J.; Miner, C.; Pace, D.; Minnis, P.; Mecikalski, J.; Feltz, W.; Johnson, D.; Iskendarian, H.; Haynes, J.

    2009-09-01

    It is estimated that weather is responsible for approximately 70% of all air traffic delays and cancellations in the United States. Annually, this produces an overall economic loss of nearly $40B. The FAA and NASA have determined that weather impacts and other environmental constraints on the U.S. National Airspace System (NAS) will increase to the point of system unsustainability unless the NAS is radically transformed. A Next Generation Air Transportation System (NextGen) is planned to accommodate the anticipated demand for increased system capacity and the super-density operations that this transformation will entail. The heart of the environmental information component that is being developed for NextGen will be a 4-dimensional data cube which will include a single authoritative source comprising probabilistic weather information for NextGen Air Traffic Management (ATM) systems. Aviation weather constraints and safety hazards typically comprise meso-scale, storm-scale and microscale observables that can significantly impact both terminal and enroute aviation operations. With these operational impacts in mind, functional and performance requirements for the NextGen weather system were established which require significant improvements in observation and forecasting capabilities. This will include satellite observations from geostationary and/or polar-orbiting hyperspectral sounders, multi-spectral imagers, lightning mappers, space weather monitors and other environmental observing systems. It will also require improved in situ and remotely sensed observations from ground-based and airborne systems. These observations will be used to better understand and to develop forecasting applications for convective weather, in-flight icing, turbulence, ceilings and visibility, volcanic ash, space weather and the environmental impacts of aviation.
Cutting-edge collaborative research efforts and results from NASA, NOAA and the FAA which address these phenomena are summarized

  11. Cross-Layer Framework for Fine-Grained Channel Access in Next Generation High-Density WiFi Networks

    Institute of Scientific and Technical Information of China (English)

    ZHAO Haitao; ZHANG Shaojie; Emiliano Garcia-Palacios

    2016-01-01

    Densely deployed WiFi networks will play a crucial role in providing the capacity for next generation mobile internet. However, due to increasing interference, overlapped channels in WiFi networks and throughput efficiency degradation, densely deployed WiFi networks are not guaranteed to obtain higher throughput. An emergent challenge is how to efficiently utilize scarce spectrum resources, by matching physical layer resources to traffic demand. In this aspect, access control allocation strategies play a pivotal role but remain too coarse-grained. As a solution, this research proposes a flexible framework for fine-grained channel width adaptation and multi-channel access in WiFi networks. This approach, named SFCA (Subcarrier Fine-grained Channel Access), adopts DOFDM (Discontinuous Orthogonal Frequency Division Multiplexing) at the PHY layer. It allocates the frequency resource with a subcarrier granularity, which facilitates the channel width adaptation for multi-channel access and thus brings more flexibility and higher frequency efficiency. The MAC layer uses a frequency-time domain backoff scheme, which combines the popular time-domain BEB scheme with a frequency-domain backoff to decrease access collision, resulting in higher access probability for the contending nodes. SFCA is compared with FICA (an established access scheme), showing significant outperformance. Finally, we present results for next generation 802.11ac WiFi networks.
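The intuition behind a frequency-time backoff can be illustrated with a toy Monte Carlo sketch. This is not the SFCA MAC itself; the slot counts and the collision criterion are invented for illustration, and only the qualitative effect (more contention dimensions, fewer collisions) is meaningful.

```python
import random

# Toy Monte Carlo sketch (NOT the SFCA protocol): n contending nodes each
# draw a backoff value uniformly at random. With time-only backoff there are
# K slots; adding a frequency dimension of M subchannels gives K*M distinct
# (slot, subchannel) choices, so the winning draw is less often shared.

def collision_prob(n_nodes, n_choices, trials=20000, rng=random.Random(1)):
    """Estimate the probability that the earliest draw is picked by >1 node."""
    collisions = 0
    for _ in range(trials):
        picks = [rng.randrange(n_choices) for _ in range(n_nodes)]
        if picks.count(min(picks)) > 1:  # winner of contention is not unique
            collisions += 1
    return collisions / trials

p_time = collision_prob(10, 16)        # time-domain only: K = 16 slots
p_freq_time = collision_prob(10, 64)   # frequency-time: K * M = 16 * 4
print(p_time, p_freq_time)
```

Enlarging the contention space is the same lever that BEB pulls by doubling the window after a collision; the frequency dimension simply provides it without lengthening the time axis.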

  12. The contribution of next generation sequencing to epilepsy genetics

    DEFF Research Database (Denmark)

    Møller, Rikke S.; Dahl, Hans A.; Helbig, Ingo

    2015-01-01

    During the last decade, next generation sequencing technologies such as targeted gene panels, whole exome sequencing and whole genome sequencing have led to an explosion of gene identifications in monogenic epilepsies including both familial epilepsies and severe epilepsies, often referred to as ...

  13. Cost and schedule reduction for next-generation Candu

    International Nuclear Information System (INIS)

    Hopwood, J.M.; Yu, S.; Pakan, M.; Soulard, M.

    2002-01-01

    AECL has developed a suite of technologies for Candu reactors that enable the next step in the evolution of the Candu family of heavy-water-moderated fuel-channel reactors. These technologies have been combined in the design for the Advanced Candu Reactor (ACR), AECL's next generation Candu power plant. The ACR design builds extensively on the existing Candu experience base, but includes innovations, in design and in delivery technology, that provide very substantial reductions in capital cost and in project schedules. In this paper, main features of next generation design and delivery are summarized, to provide the background basis for the cost and schedule reductions that have been achieved. In particular the paper outlines the impact of the innovative design steps for ACR: - Selection of slightly enriched fuel bundle design; - Use of light water coolant in place of traditional Candu heavy water coolant; - Compact core design with unique reactor physics benefits; - Optimized coolant and turbine system conditions. In addition to the direct cost benefits arising from efficiency improvement, and from the reduction in heavy water, the next generation Candu configuration results in numerous additional indirect cost benefits, including: - Reduction in number and complexity of reactivity mechanisms; - Reduction in number of heavy water auxiliary systems; - Simplification in heat transport and its support systems; - Simplified human-machine interface. The paper also describes the ACR approach to design for constructability. The application of module assembly and open-top construction techniques, based on Candu and other worldwide experience, has been proven to generate savings in both schedule durations and overall project cost, by reducing premium on-site activities, and by improving efficiency of system and subsystem assembly.
AECL's up-to-date experience in the use of 3-D CADDS and related engineering tools has also been proven to reduce both engineering and

  14. An evaluation of possible next-generation high temperature molten-salt power towers.

    Energy Technology Data Exchange (ETDEWEB)

    Kolb, Gregory J.

    2011-12-01

    Since completion of the Solar Two molten-salt power tower demonstration in 1999, the solar industry has been developing initial commercial-scale projects that are 3 to 14 times larger. Like Solar Two, these initial plants will power subcritical steam-Rankine cycles using molten salt with a temperature of 565 °C. The main question explored in this study is whether there is significant economic benefit to develop future molten-salt plants that operate at a higher receiver outlet temperature. Higher temperatures would allow the use of supercritical steam cycles that achieve an improved efficiency relative to today's subcritical cycle (~50% versus ~42%). The levelized cost of electricity (LCOE) of a 565 °C subcritical baseline plant was compared with possible future-generation plants that operate at 600 or 650 °C. The analysis suggests that ~8% reduction in LCOE can be expected by raising salt temperature to 650 °C. However, most of that benefit can be achieved by raising the temperature to only 600 °C. Several other important insights regarding possible next-generation power towers were also drawn: (1) the evaluation of receiver-tube materials that are capable of higher fluxes and temperatures, (2) suggested plant reliability improvements based on a detailed evaluation of the Solar Two experience, and (3) a thorough evaluation of analysis uncertainties.

  15. Resonance analysis in parallel voltage-controlled Distributed Generation inverters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Chen, Zhe

    2013-01-01

    Thanks to the fast responses of the inner voltage and current control loops, the dynamic behaviors of parallel voltage-controlled Distributed Generation (DG) inverters not only rely on the stability of load sharing among them, but are also subject to the interactions between the voltage control loops...

  16. Next generation environment for collaborative research

    International Nuclear Information System (INIS)

    Collados, D.; Denis, G.; Galvez, P.; Newman, H.

    2001-01-01

    Collaborative environments supporting point-to-point and multipoint videoconferencing, document and application sharing across both local and wide area networks, video on demand (broadcast and playback) and interactive text facilities will be a crucial element for the development of the next generation of HEP experiments by geographically dispersed collaborations. The 'Virtual Room Videoconferencing System' (VRVS) has been developed since 1995 in order to provide a low-cost, bandwidth-efficient, extensible means for videoconferencing and remote collaboration over networks within the High Energy and Nuclear Physics communities. VRVS provides a worldwide videoconferencing service and collaborative environment to the research and education communities. VRVS uses the Internet2 and ESnet high-performance network infrastructure to deploy its Web-based system, which now includes more than 5790 registered hosts running VRVS software in more than 50 different countries. VRVS hosts an average of 100 multipoint videoconference and collaborative sessions worldwide every month. There are around 35 reflectors that manage the traffic flow, at HENP labs and universities in the US and Europe. So far, there are 7 Virtual Rooms for worldwide conferences (involving more than one continent), and 4 Virtual Rooms each for intra-continental conferences in the US, Europe and Asia. VRVS continues to expand and implement new digital video technologies, including H.323 ITU standard integration, MPEG-2 videoconferencing integration, shared environments, and Quality of Service

  17. Next-generation digital information storage in DNA.

    Science.gov (United States)

    Church, George M; Gao, Yuan; Kosuri, Sriram

    2012-09-28

    Digital information is accumulating at an astounding rate, straining our ability to store and archive it. DNA is among the most dense and stable information media known. The development of new technologies in both DNA synthesis and sequencing make DNA an increasingly feasible digital storage medium. We developed a strategy to encode arbitrary digital information in DNA, wrote a 5.27-megabit book using DNA microchips, and read the book by using next-generation DNA sequencing.
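The encoding strategy summarized above can be illustrated with a minimal sketch. The mapping here (one bit per base, 0 → 'A', 1 → 'G') is a hypothetical simplification for illustration; the published scheme also handles addressing, redundancy, and synthesis constraints such as homopolymer avoidance.

```python
def bits_to_dna(bits):
    """Map each bit to one base: 0 -> 'A', 1 -> 'G' (illustrative only)."""
    return ''.join('G' if b else 'A' for b in bits)

def dna_to_bits(seq):
    """Inverse mapping from bases back to bits."""
    return [1 if base == 'G' else 0 for base in seq]

def text_to_dna(text):
    """Encode UTF-8 text as a DNA string, most significant bit first."""
    bits = []
    for byte in text.encode('utf-8'):
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    return bits_to_dna(bits)

def dna_to_text(seq):
    """Decode a DNA string produced by text_to_dna back to text."""
    bits = dna_to_bits(seq)
    data = bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[j:j + 8]))
        for j in range(0, len(bits), 8)
    )
    return data.decode('utf-8')
```

For example, the single character "A" (byte 0x41) encodes to the eight-base string "AGAAAAAG", and decoding recovers the original text.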

  18. Popular Imagination and Identity Politics: Reading the Future in "Star Trek: The Next Generation."

    Science.gov (United States)

    Ott, Brian L.; Aoki, Eric

    2001-01-01

    Analyzes the television series "Star Trek: The Next Generation." Theorizes the relationship between collective visions of the future and the identity politics of the present. Argues that "The Next Generation" invites audiences to participate in a shared sense of the future that constrains human agency and (re)produces the…

  19. Issues on generating primordial anisotropies at the end of inflation

    Energy Technology Data Exchange (ETDEWEB)

    Emami, Razieh; Firouzjahi, Hassan, E-mail: emami@mail.ipm.ir, E-mail: firouz@mail.ipm.ir [School of Physics, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)

    2012-01-01

    We revisit the idea of generating primordial anisotropies at the end of inflation in models of inflation with gauge fields. To be specific, we consider the charged hybrid inflation model in which the waterfall field is charged under a U(1) gauge field, so the surface of end of inflation is controlled both by the inflaton and the gauge field. Using the δN formalism properly, we find that the anisotropies generated at the end of inflation from the gauge field fluctuations are exponentially suppressed on cosmological scales. This is because the gauge field evolves exponentially during inflation, while in order to generate appreciable anisotropies at the end of inflation the spectator gauge field has to be frozen. We argue that this is a generic feature, that is, one cannot generate observable anisotropies at the end of inflation within an FRW background.
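The δN computation referenced above takes the following schematic form (generic notation, not the paper's exact expressions): the curvature perturbation is the perturbation in the number of e-folds N from an initial flat slice to the end-of-inflation surface, expanded in the field fluctuations,

```latex
\zeta(\mathbf{x}) \;=\; \delta N
\;=\; \frac{\partial N}{\partial \phi}\,\delta\phi
\;+\; \frac{\partial N}{\partial A}\,\delta A \;+\; \cdots
```

An anisotropic contribution therefore requires the gauge-field term to remain appreciable on cosmological scales; if δA is exponentially damped during inflation, as the abstract argues, this term is suppressed.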

  20. Issues on generating primordial anisotropies at the end of inflation

    International Nuclear Information System (INIS)

    Emami, Razieh; Firouzjahi, Hassan

    2012-01-01

    We revisit the idea of generating primordial anisotropies at the end of inflation in models of inflation with gauge fields. To be specific, we consider the charged hybrid inflation model in which the waterfall field is charged under a U(1) gauge field, so the surface of end of inflation is controlled both by the inflaton and the gauge field. Using the δN formalism properly, we find that the anisotropies generated at the end of inflation from the gauge field fluctuations are exponentially suppressed on cosmological scales. This is because the gauge field evolves exponentially during inflation, while in order to generate appreciable anisotropies at the end of inflation the spectator gauge field has to be frozen. We argue that this is a generic feature, that is, one cannot generate observable anisotropies at the end of inflation within an FRW background.

  1. Experimental study into a hybrid PCCI/CI concept for next-generation heavy-duty diesel engines

    NARCIS (Netherlands)

    Doosje, E.; Willems, F.P.T.; Baert, R.S.G.; Dijk, M.D. van

    2012-01-01

    This paper presents the first results of an experimental study into a hybrid combustion concept for next-generation heavy-duty diesel engines. In this hybrid concept, at low load operating conditions, the engine is run in Pre-mixed Charge Compression Ignition (PCCI) mode, whereas at high load

  2. Modelling of control system architecture for next-generation accelerators

    International Nuclear Information System (INIS)

    Liu, Shi-Yao; Kurokawa, Shin-ichi

    1990-01-01

    Functional, hardware and software system architectures define the fundamental structure of control systems. Modelling is a protocol of system architecture used in system design. This paper reviews the various modelling approaches adopted over the past ten years and suggests a new modelling approach for next-generation accelerators. (author)

  3. Next Generation UV Coronagraph Instrumentation for Solar Cycle-24

    Indian Academy of Sciences (India)

    2016-01-27

    New concepts for next generation instrumentation include imaging ultraviolet spectrocoronagraphs and large-aperture ultraviolet coronagraph spectrometers. An imaging instrument would be the first to obtain absolute spectral line intensities of the extended corona over a wide field of view. Such images ...

  4. Power Electronics for the Next Generation Wind Turbine System

    DEFF Research Database (Denmark)

    Ma, Ke

    This book presents recent studies on the power electronics used for the next generation wind turbine system. Some criteria and tools for evaluating and improving the critical performances of the wind power converters have been proposed and established. The book addresses some emerging problems...

  5. Advanced optical components for next-generation photonic networks

    Science.gov (United States)

    Yoo, S. J. B.

    2003-08-01

    Future networks will require very high throughput, carrying dominantly data-centric traffic. The role of photonic networks employing all-optical systems will become increasingly important in providing scalable bandwidth, agile reconfigurability, and low power consumption in the future. In particular, the self-similar nature of data traffic indicates that packet switching and burst switching will be beneficial in next-generation photonic networks. While the natural conclusion is to pursue photonic packet switching and photonic burst switching systems, there are significant challenges in realizing such systems due to practical limitations in optical component technologies. The lack of a viable all-optical memory technology will continue to drive us towards exploring rapid reconfigurability in the wavelength domain. We will introduce and discuss the advanced optical component technologies behind the Photonic Packet Routing system designed and demonstrated at UC Davis. The system is capable of packet switching and burst switching, as well as circuit switching, with 600 ps switching speed and scalability to 42 petabit/sec aggregated switching capacity. By utilizing a combination of rapidly tunable wavelength conversion and a uniform-loss cyclic frequency (ULCF) arrayed waveguide grating router (AWGR), the system is capable of rapidly switching packets in the wavelength, time, and space domains. The label swapping module inside the Photonic Packet Routing system, containing a Mach-Zehnder wavelength converter and a narrow-band fiber Bragg grating, achieves all-optical label swapping with optical 2R (potentially 3R) regeneration while maintaining optical transparency for the data payload. By utilizing the advanced optical component technologies, the Photonic Packet Routing system successfully demonstrated error-free, cascaded, multi-hop photonic packet switching and routing with optical-label swapping. This paper will review the advanced optical component technologies

  6. Flexibility and Performance of Parallel File Systems

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1996-01-01

    As we gain experience with parallel file systems, it becomes increasingly clear that a single solution does not suit all applications. For example, it appears to be impossible to find a single appropriate interface, caching policy, file structure, or disk-management strategy. Furthermore, the proliferation of file-system interfaces and abstractions makes applications difficult to port. We propose that the traditional functionality of parallel file systems be separated into two components: a fixed core that is standard on all platforms, encapsulating only primitive abstractions and interfaces, and a set of high-level libraries to provide a variety of abstractions and application-programmer interfaces (APIs). We present our current and next-generation file systems as examples of this structure. Their features, such as a three-dimensional file structure, strided read and write interfaces, and I/O-node programs, are specifically designed with the flexibility and performance necessary to support a wide range of applications.
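A strided read interface of the kind mentioned above can be sketched as follows. The API here is hypothetical, not the paper's actual file-system calls: a single request gathers `count` chunks of `block` bytes separated by `stride` bytes, which lets a file system service what would otherwise be many small seeks as one optimizable operation.

```python
import io

def strided_read(f, offset, block, stride, count):
    """Gather `count` blocks of `block` bytes, `stride` bytes apart,
    starting at `offset` (illustrative sketch of a strided interface)."""
    chunks = []
    for i in range(count):
        f.seek(offset + i * stride)       # jump to the start of chunk i
        chunks.append(f.read(block))      # read one fixed-size chunk
    return b''.join(chunks)

# Example: read one "column" of a 4x4 row-major byte matrix in one call.
matrix = io.BytesIO(bytes(range(16)))
column1 = strided_read(matrix, 1, 1, 4, 4)  # bytes 1, 5, 9, 13
```

A real parallel file system would push the whole access pattern down to the I/O nodes instead of looping client-side; the point is only the shape of the interface.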

  7. Codon-Precise, Synthetic, Antibody Fragment Libraries Built Using Automated Hexamer Codon Additions and Validated through Next Generation Sequencing

    Directory of Open Access Journals (Sweden)

    Laura Frigotto

    2015-05-01

    Full Text Available We have previously described ProxiMAX, a technology that enables the fabrication of precise, combinatorial gene libraries via codon-by-codon saturation mutagenesis. ProxiMAX was originally performed using manual, enzymatic transfer of codons via blunt-end ligation. Here we present Colibra™: an automated, proprietary version of ProxiMAX used specifically for antibody library generation, in which double-codon hexamers are transferred during the saturation cycling process. The reduction in process complexity, resulting library quality and an unprecedented saturation of up to 24 contiguous codons are described. Utility of the method is demonstrated via fabrication of complementarity determining regions (CDR in antibody fragment libraries and next generation sequencing (NGS analysis of their quality and diversity.

  8. Development of a framework for the neutronics analysis system for next generation (3)

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hirai, Yasushi; Hyoudou, Hideaki; Tatsumi, Masahiro

    2010-02-01

    Development of innovative analysis methods and models in fundamental studies for next-generation nuclear reactor systems is in progress. In order to efficiently and effectively reflect the latest analysis methods and models in the primary design of commercial reactors and/or in-core fuel management for power reactors, a next-generation analysis system, MARBLE, has been developed. The next-generation analysis system provides solutions to the following requirements: (1) flexibility, extensibility and user-friendliness, so that new methods and models can be applied rapidly and effectively in fundamental studies; (2) quantitative proof of solution accuracy and an adaptive scoping range for design studies; (3) coupled analysis among different study domains, for the purpose of rationalizing plant systems and improving reliability; (4) maintainability and reusability for system extensions, for the purpose of total quality management and development efficiency. The next-generation analysis system supports many fields, such as thermal-hydraulic analysis, structural analysis and reactor physics, and the reactor physics analysis system for fast reactors is being developed first. As for reactor physics analysis methods for fast reactors, the JUPITER standard analysis methods have been established based on past studies. However, the conventional analysis system was extremely inefficient when changing analysis targets and/or modeling levels, owing to a lack of functionality. For this reason, we have developed a next-generation analysis system for reactor physics which reproduces the JUPITER standard analysis method developed so far and newly realizes burnup and design analysis for fast reactors and functions for cross-section adjustment.
In the present study, we examined in detail the existing design and implementation of the ZPPR critical experiment analysis database, followed by unification of the models within the framework of the next-generation analysis system by

  9. Highly scalable parallel processing of extracellular recordings of Multielectrode Arrays.

    Science.gov (United States)

    Gehring, Tiago V; Vasilaki, Eleni; Giugliano, Michele

    2015-01-01

    Technological advances in Multielectrode Arrays (MEAs), used for multisite, parallel electrophysiological recordings, lead to an ever-increasing amount of raw data being generated. Arrays with hundreds to a few thousand electrodes are slowly seeing widespread use, and the expectation is that more sophisticated arrays will become available in the near future. In order to process the large data volumes resulting from MEA recordings there is a pressing need for new software tools able to process many data channels in parallel. Here we present a new tool for processing MEA data recordings that makes use of new programming paradigms and recent technology developments to unleash the power of modern highly parallel hardware, such as multi-core CPUs with vector instruction sets or GPGPUs. Our tool builds on and complements existing MEA data analysis packages. It shows high scalability and can be used to speed up some performance-critical pre-processing steps such as data filtering and spike detection, helping to make the analysis of larger data sets tractable.
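The channel-parallel spike detection mentioned above can be sketched minimally. This is a deliberately simple upward threshold-crossing detector, not the tool's actual algorithm; the key point is that each electrode channel is independent, so detection parallelizes trivially across channels.

```python
from concurrent.futures import ThreadPoolExecutor

def detect_spikes(trace, threshold):
    """Return sample indices where the signal crosses `threshold` upward
    (a toy detector; real pipelines filter and reject artifacts first)."""
    return [i for i in range(1, len(trace))
            if trace[i - 1] < threshold <= trace[i]]

def detect_all(channels, threshold, workers=4):
    """Run detection on every channel concurrently.

    A thread pool stands in for the vectorized/GPU back ends described
    in the abstract; the per-channel work is embarrassingly parallel.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda t: detect_spikes(t, threshold), channels))
```

On real MEA data each `trace` would be a filtered voltage waveform and the threshold would typically be set per channel from a noise estimate (e.g., a multiple of the median absolute deviation).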

  10. Design features in Korean next generation reactor focused on performance and economic viability

    International Nuclear Information System (INIS)

    Lee, J.S.; Chung, M.S.; Na, J.H.; Kim, M.C.; Choi, Y.S.

    2001-01-01

    As of the end of December 1999, Korea's total nuclear power capacity reached 13,716 MWe, with 16 units in operation and 4 units under construction. In addition, as part of the national long-term R and D programme launched in 1992, the Korean Next Generation Reactor (KNGR) is being developed to meet electricity demands in the years to come and is expected to be safer and more economically competitive than any other conventional electric power source in Korea. The KNGR project has successfully completed its second phase and is now in the third phase. In Phase III of the KNGR design development project, KNGR aims at reinforcing economic competitiveness while maintaining safety goals. To achieve these objectives, the design options studied and the design requirements set up in the first phase are pursued while the results of the second phase are being reviewed. This paper summarizes these efforts for design improvement in terms of performance and economic viability, along with the status of nuclear power generation in Korea, focusing on the current KNGR. (author)

  11. Public Policy and the Next Generation of Farmers, Ranchers, Producers, and Agribusiness Leaders.

    Science.gov (United States)

    Gasperini, Frank A

    2017-01-01

    The emerging, next generation of people engaged as managers in agriculture differs from the "baby boomer" farm generation that relishes certain traditions and an agrarian lifestyle. These futuristic producers and managers have been raised in a society that promulgates safety and environmental rules. They have witnessed lives saved by automobile seatbelts and lives improved by clean air and water. They know the basic cost of effective safety compliance is relatively fixed, regardless of the number of employees, and they are willing to invest resources that ensure a culture of safety, because it is economically beneficial, socially responsible, and probably required by the companies to whom they need to market their products. These same millennials understand that society and their customers will not continue to tolerate the high rate of agricultural injuries and deaths indefinitely. Public policy as a means to improve agricultural workers' safety and health is likely to be less resisted by the next generation of farmers, ranchers, producers, and agribusiness leaders who, regardless of legal or regulatory pressure, will implement internal business policies emphasizing safety, health, sustainability, and social justness as they understand it.

  12. Next generation tools for genomic data generation, distribution, and visualization

    Directory of Open Access Journals (Sweden)

    Nix David A

    2010-09-01

    Full Text Available Abstract Background With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq.

  13. Control of Next Generation Aircraft and Wind Turbines

    Science.gov (United States)

    Frost, Susan

    2010-01-01

    The first part of this talk will describe some of the exciting new next generation aircraft that NASA is proposing for the future. These aircraft are being designed to reduce aircraft fuel consumption and environmental impact. Reducing the aircraft weight is one approach that will be used to achieve these goals. A new control framework will be presented that enables lighter, more flexible aircraft to maintain aircraft handling qualities, while preventing the aircraft from exceeding structural load limits. The second part of the talk will give an overview of utility-scale wind turbines and their control. Results of collaboration with Dr. Balas will be presented, including new theory to adaptively control the turbine in the presence of structural modes, with the focus on the application of this theory to a high-fidelity simulation of a wind turbine.

  14. Effects of parallel planning on agreement production.

    Science.gov (United States)

    Veenstra, Alma; Meyer, Antje S; Acheson, Daniel J

    2015-11-01

    An important issue in current psycholinguistics is how the time course of utterance planning affects the generation of grammatical structures. The current study investigated the influence of parallel activation of the components of complex noun phrases on the generation of subject-verb agreement. Specifically, the lexical interference account (Gillespie & Pearlmutter, 2011b; Solomon & Pearlmutter, 2004) predicts more agreement errors (i.e., attraction) for subject phrases in which the head and local noun mismatch in number (e.g., the apple next to the pears) when nouns are planned in parallel than when they are planned in sequence. We used a speeded picture description task that yielded sentences such as the apple next to the pears is red. The objects mentioned in the noun phrase were either semantically related or unrelated. To induce agreement errors, pictures sometimes mismatched in number. In order to manipulate the likelihood of parallel processing of the objects and to test the hypothesized relationship between parallel processing and the rate of agreement errors, the pictures were either placed close together or far apart. Analyses of the participants' eye movements and speech onset latencies indicated slower processing of the first object and stronger interference from the related (compared to the unrelated) second object in the close than in the far condition. Analyses of the agreement errors yielded an attraction effect, with more errors in mismatching than in matching conditions. However, the magnitude of the attraction effect did not differ across the close and far conditions. Thus, spatial proximity encouraged parallel processing of the pictures, which led to interference of the associated conceptual and/or lexical representation, but, contrary to the prediction, it did not lead to more attraction errors. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Developing the next generation of nuclear workers at OPG

    International Nuclear Information System (INIS)

    Spekkens, P.

    2007-01-01

    This presentation is about developing the next generation of nuclear workers at Ontario Power Generation (OPG). Industry developments are creating an urgent need to hire, train and retain new staff. OPG has an aggressive hiring campaign. The training organization is challenged to accommodate the influx of new staff. Collaboration with colleges and universities is increasing the supply of qualified recruits with an interest in nuclear. Programs for functional and leadership training have been developed. Knowledge retention is also urgently required.

  16. NGS QC Toolkit: a toolkit for quality control of next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Ravi K Patel

    Full Text Available Next generation sequencing (NGS) technologies provide a high-throughput means to generate large amounts of sequence data. However, quality control (QC) of the sequence data generated from these technologies is extremely important for meaningful downstream analysis. Further, highly efficient and fast processing tools are required to handle the large volume of datasets. Here, we have developed an application, NGS QC Toolkit, for quality checking and filtering of high-quality data. This toolkit is a standalone and open source application freely available at http://www.nipgr.res.in/ngsqctoolkit.html. All the tools in the application have been implemented in the Perl programming language. The toolkit comprises user-friendly tools for QC of sequencing data generated using the Roche 454 and Illumina platforms, and additional tools to aid QC (sequence format converter and trimming tools) and analysis (statistics tools). A variety of options have been provided to facilitate QC at user-defined parameters. The toolkit is expected to be very useful for the QC of NGS data to facilitate better downstream analysis.

  17. Heterogeneous next-generation wireless network interference model-and its applications

    KAUST Repository

    Mahmood, Nurul Huda; Yilmaz, Ferkan; Alouini, Mohamed-Slim; Øien, Geir Egil

    2014-01-01

    Next-generation wireless systems facilitating better utilisation of the scarce radio spectrum have emerged as a response to inefficient and rigid spectrum assignment policies. These comprise intelligent radio nodes that opportunistically

  18. Epidemiological Studies to Support the Development of Next Generation Influenza Vaccines.

    Science.gov (United States)

    Petrie, Joshua G; Gordon, Aubree

    2018-03-26

    The National Institute of Allergy and Infectious Diseases recently published a strategic plan for the development of a universal influenza vaccine. This plan focuses on improving understanding of influenza infection, the development of influenza immunity, and rational design of new vaccines. Epidemiological studies such as prospective, longitudinal cohort studies are essential to the completion of these objectives. In this review, we discuss the contributions of epidemiological studies to our current knowledge of vaccines and correlates of immunity, and how they can contribute to the development and evaluation of the next generation of influenza vaccines. These studies have been critical in monitoring the effectiveness of current influenza vaccines, identifying issues such as low vaccine effectiveness, reduced effectiveness among those who receive repeated vaccination, and issues related to egg adaptation during the manufacturing process. Epidemiological studies have also identified population-level correlates of protection that can inform the design and development of next generation influenza vaccines. Going forward, there is an enduring need for epidemiological studies to continue advancing knowledge of correlates of protection and the development of immunity, to evaluate and monitor the effectiveness of next generation influenza vaccines, and to inform recommendations for their use.

  19. Next generation multi-material 3D food printer concept

    NARCIS (Netherlands)

    Klomp, D.J.; Anderson, P.D.

    2017-01-01

    3D food printing is a new rapidly developing technology capable of creating food structures that are impossible to create with normal processing techniques. Challenges in this field are creating texture and multi-material food products. To address these challenges a next generation food printer will

  20. Next-generation sequencing approaches to understanding the oral microbiome

    NARCIS (Netherlands)

    Zaura, E.

    2012-01-01

    Until recently, the focus in dental research has been on studying a small fraction of the oral microbiome—so-called opportunistic pathogens. With the advent of next-generation sequencing (NGS) technologies, researchers now have the tools that allow for profiling of the microbiomes and metagenomes at

  1. Next Generation Life Support Project Status

    Science.gov (United States)

    Barta, Daniel J.; Chullen, Cinda; Vega, Leticia; Cox, Marlon R.; Aitchison, Lindsay T.; Lange, Kevin E.; Pensinger, Stuart J.; Meyer, Caitlin E.; Flynn, Michael; Jackson, W. Andrew; hide

    2014-01-01

    Next Generation Life Support (NGLS) is one of over twenty technology development projects sponsored by NASA's Game Changing Development Program. The NGLS Project develops selected life support technologies needed for humans to live and work productively in space, with focus on technologies for future use in spacecraft cabin and space suit applications. Over the last three years, NGLS had five main project elements: Variable Oxygen Regulator (VOR), Rapid Cycle Amine (RCA) swing bed, High Performance (HP) Extravehicular Activity (EVA) Glove, Alternative Water Processor (AWP) and Series-Bosch Carbon Dioxide Reduction. The RCA swing bed, VOR and HP EVA Glove tasks are directed at key technology needs for the Portable Life Support System (PLSS) and pressure garment for an Advanced Extravehicular Mobility Unit (EMU). Focus is on prototyping and integrated testing in cooperation with the Advanced Exploration Systems (AES) Advanced EVA Project. The HP EVA Glove Element, new this fiscal year, includes the generation of requirements and standards to guide development and evaluation of new glove designs. The AWP and Bosch efforts focus on regenerative technologies to further close spacecraft cabin atmosphere revitalization and water recovery loops and to meet technology maturation milestones defined in NASA's Space Technology Roadmaps. These activities are aimed at increasing affordability, reliability, and vehicle self-sufficiency while decreasing mass and mission cost, supporting a capability-driven architecture for extending human presence beyond low-Earth orbit, along a human path toward Mars. This paper provides a status of current technology development activities with a brief overview of future plans.

  2. Dual Converter Fed Open-End Transformer Topology with Parallel Converters and Integrated Magnetics

    DEFF Research Database (Denmark)

    Gohil, Ghanshyamsinh Vijaysinh; Bede, Lorand; Teodorescu, Remus

    2016-01-01

    A converter system for high power applications, connected to a medium-voltage network using a step-up transformer, is presented in this paper. The converter-side winding of the transformer is configured as an open-end winding and both ends of the windings are fed from two different converter groups. Each… that flows between the parallel interleaved VSCs. An integrated inductor is proposed which suppresses the circulating current in both of the converter groups. In addition, the functionality of the line filter inductor is also integrated. Flux in various parts of the integrated inductor is analyzed and a design procedure is also described. The volume and the losses of the proposed solution are compared with those of the state-of-the-art solution. The control of the proposed converter system is also discussed. The analysis has been verified by simulation and experimental results.

  3. Carrier ethernet network control plane based on the Next Generation Network

    DEFF Research Database (Denmark)

    Fu, Rong; Wang, Yanmeng; Berger, Michael Stubert

    2008-01-01

    This paper presents a step towards the realization of a Carrier Ethernet control plane based on the next generation network (NGN). Specifically, transport MPLS (T-MPLS) is taken as the transport technology in Carrier Ethernet. The paper begins by providing an overview of the evolving architecture of the next generation network (NGN). As an essential candidate among the NGN transport technologies, the definition of Carrier Ethernet (CE) is also introduced. The second part of the paper describes the contribution on the T-MPLS based Carrier Ethernet network with a control plane based on NGN… at illustrating the improvement of the Carrier Ethernet network with the NGN control plane.

  4. Development of source term evaluation method for Korean Next Generation Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Keon Jae; Cheong, Jae Hak; Park, Jin Baek; Kim, Guk Gee [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-10-15

In its earlier phase, this project investigated several design features of the radioactive waste processing system, methods to predict nuclide concentrations in the primary coolant, the basic concept of the next generation reactor, and safety goals. In the present project, several source term prediction methods are evaluated comprehensively. The detailed contents of this project are: evaluation of models for nuclide concentration in the Reactor Coolant System; evaluation of primary and secondary coolant concentrations of the reference Nuclear Power Plant (NPP); investigation of prediction parameters for source term evaluation (basic PWR parameters and operational parameters, respectively); radionuclide removal systems and adjustment values of the reference NPP; and suggestion of a source term prediction method for the next generation NPP.

  5. Next generation of relativistic heavy ion accelerators

    International Nuclear Information System (INIS)

    Grunder, H.; Leemann, C.; Selph, F.

    1978-06-01

Results are presented of exploratory and preliminary studies of a next generation of heavy ion accelerators. The conclusion is reached that useful luminosities are feasible in a colliding beam facility for relativistic heavy ions. Such an accelerator complex may be laid out in such a way as to provide extracted beams for fixed target operation, therefore allowing experimentation in an energy region overlapping with that presently available. These dual goals seem achievable without undue complications or penalties with respect to cost and/or performance.

  6. Thermonuclear ignition in the next generation tokamaks

    International Nuclear Information System (INIS)

    Johner, J.

    1989-04-01

The extrapolation of experimental rules describing energy confinement and magnetohydrodynamic stability limits in known tokamaks shows that stable thermonuclear ignition equilibria should exist in this configuration if the product a·B_t^x of the machine dimension and a power of the magnetic field is large enough. Quantitative application of this result to several next-generation tokamak projects shows that such equilibria could exist in those devices, which would also have enough additional heating power to make ignition effectively accessible.

  7. Next Generation Nuclear Plant Project Preliminary Project Management Plan

    International Nuclear Information System (INIS)

    Dennis J. Harrell

    2006-01-01

This draft preliminary project management plan presents the conceptual framework for the Next Generation Nuclear Plant (NGNP) Project, consistent with the authorization in the Energy Policy Act of 2005. In developing this plan, the Idaho National Laboratory has considered three fundamental project planning options that are summarized in the following section. Each of these planning options is literally compliant with the Energy Policy Act of 2005, but each emphasizes different approaches to technology development risks, design, licensing and construction risks, and to the extent of commercialization support provided to the industry. The primary focus of this draft preliminary project management plan is to identify those activities important to Critical Decision-1, at which point a decision on proceeding with the NGNP Project can be made. The conceptual project framework described herein is necessary to establish the scope and priorities for the technology development activities. The framework includes: • A reference NGNP prototype concept based on what is judged to be the lowest risk technology development that would achieve the needed commercial functional requirements to provide an economically competitive nuclear heat source and hydrogen production capability. • A high-level schedule logic for design, construction, licensing, and acceptance testing, including an operational shakedown period that provides proof-of-principle to establish the basis for commercialization decisions by end-users. • An assessment of current technology development plans to support Critical Decision-1 and overall project progress, in which the most important technical and programmatic uncertainties (risks) are evaluated and potential mitigation strategies are identified so that the technology development plans may be modified as required to support ongoing project development. • A rough-order-of-magnitude cost evaluation that provides an initial basis for budget planning. This

  8. Next Generation Antibody Therapeutics Using Bispecific Antibody Technology.

    Science.gov (United States)

    Igawa, Tomoyuki

    2017-01-01

Nearly fifty monoclonal antibodies have been approved to date, and the market for monoclonal antibodies is expected to continue to grow. Since global competition in the field of antibody therapeutics is intense, we need to establish novel antibody engineering technologies to provide true benefit for patients, with differentiated product values. Bispecific antibodies are among the next generation of antibody therapeutics; they can bind two different target antigens with the two arms of an immunoglobulin G (IgG) molecule, and are thus believed to be applicable to various therapeutic needs. Until recently, large scale manufacturing of human IgG bispecific antibodies was impossible. We have established a technology, named asymmetric re-engineering technology (ART)-Ig, to enable large scale manufacturing of bispecific antibodies. Three examples of next generation antibody therapeutics using ART-Ig technology are described. Recent updates are given on bispecific antibodies against factor IXa and factor X for the treatment of hemophilia A, bispecific antibodies against a tumor specific antigen and the T cell surface marker CD3 for cancer immunotherapy, and bispecific antibodies against two different epitopes of a soluble antigen with pH-dependent binding for the elimination of that antigen from plasma.

  9. Feasibility and application on steam injector for next-generation reactor

    International Nuclear Information System (INIS)

    Narabayashi, Tadashi; Ishiyama, Takenori; Miyano, Hiroshi; Nei, Hiromichi; Shioiri, Akio

    1991-01-01

    A feasibility study has been conducted on steam injector for a next generation reactor. The steam injector is a simple, compact passive device for water injection, such as Passive Core Injection System (PCIS) of Passive Containment Cooling System (PCCS), because of easy start-up without an AC power. An analysis model for a steam injector characteristics has been developed, and investigated with a visualized fundamental test for a two-stage Steam Injector System (SIS) for PCIS and a one-stage low pressure SIS for PCCS. The test results showed good agreement with the analysis results. The analysis and the test results showed the SIS could work over a very wide range of the steam pressure, and is applicable for PCIS or PCCS in the next generation reactors. (author)

  10. Results of Analyses of the Next Generation Solvent for Parsons

    International Nuclear Information System (INIS)

    Peters, T.; Washington, A.; Fink, S.

    2012-01-01

    Savannah River National Laboratory (SRNL) prepared a nominal 150 gallon batch of Next Generation Solvent (NGS) for Parsons. This material was then analyzed and tested for cesium mass transfer efficiency. The bulk of the results indicate that the solvent is qualified as acceptable for use in the upcoming pilot-scale testing at Parsons Technology Center. This report describes the analysis and testing of a batch of Next Generation Solvent (NGS) prepared in support of pilot-scale testing in the Parsons Technology Center. A total of ∼150 gallons of NGS solvent was prepared in late November of 2011. Details for the work are contained in a controlled laboratory notebook. Analysis of the Parsons NGS solvent indicates that the material is acceptable for use. SRNL is continuing to improve the analytical method for the guanidine.

  11. Evaluating multiplexed next-generation sequencing as a method in palynology for mixed pollen samples.

    Science.gov (United States)

    Keller, A; Danner, N; Grimmer, G; Ankenbrand, M; von der Ohe, K; von der Ohe, W; Rost, S; Härtel, S; Steffan-Dewenter, I

    2015-03-01

The identification of pollen plays an important role in ecology, palaeo-climatology, honey quality control and other areas. Currently, expert knowledge and reference collections are essential to identify pollen origin through light microscopy. Pollen identification through molecular sequencing and DNA barcoding has been proposed as an alternative approach, but the assessment of mixed pollen samples originating from multiple plant species is still a tedious and error-prone task. Next-generation sequencing has been proposed to avoid this hindrance. In this study we assessed mixed pollen samples through next-generation sequencing of amplicons from the highly variable, species-specific internal transcribed spacer 2 region of nuclear ribosomal DNA. Further, we developed a bioinformatic workflow to analyse these high-throughput data with a newly created reference database. To evaluate feasibility, we compared results from classical identification based on light microscopy with our sequencing results for the same samples. We assessed in total 16 mixed pollen samples, 14 of which originated from honeybee colonies and two from solitary bee nests. The sequencing technique resulted in higher taxon richness (deeper assignments and more identified taxa) compared to light microscopy. Abundance estimations from sequencing data were significantly correlated with abundances counted through light microscopy. Simulation analyses of taxon specificity and sensitivity indicate that 96% of taxa present in the database are correctly identifiable at the genus level and 70% at the species level. Next-generation sequencing thus presents a useful and efficient workflow to identify pollen at the genus and species level without requiring specialised palynological expert knowledge. © 2014 German Botanical Society and The Royal Botanical Society of the Netherlands.
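The reported agreement between sequencing-based and microscopy-based abundances can be illustrated with a rank correlation of the kind typically used for such comparisons. This is a minimal sketch; the taxa and counts are invented for illustration, not the study's data:

```python
# Spearman rank correlation between microscopy pollen counts and amplicon
# read counts per taxon. All numbers below are illustrative.

def ranks(values):
    """1-based ranks of a list of numbers, with ties averaged."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

microscopy = {"Brassica": 540, "Trifolium": 210, "Quercus": 35, "Salix": 12}
reads = {"Brassica": 61000, "Trifolium": 30500, "Quercus": 2100, "Salix": 900}
taxa = sorted(microscopy)
rho = spearman([microscopy[t] for t in taxa], [reads[t] for t in taxa])
```

For perfectly monotone toy data like this, rho comes out as 1.0; on real mixed samples the study reports a significant but weaker correlation.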

  12. Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data

    NARCIS (Netherlands)

    K.M. Hettne (Kristina); J. Boorsma (Jeffrey); D.A.M. van Dartel (Dorien A M); J.J. Goeman (Jelle); E.C. de Jong (Esther); A.H. Piersma (Aldert); R.H. Stierum (Rob); J. Kleinjans (Jos); J.A. Kors (Jan)

    2013-01-01

    textabstractBackground: Availability of chemical response-specific lists of genes (gene sets) for pharmacological and/or toxic effect prediction for compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with

  13. OLEDs : Technology's next generation

    Energy Technology Data Exchange (ETDEWEB)

    Anon

    2001-10-01

Major advances in organic light emitting device (OLED) technology are bringing some science fiction concepts to the brink of reality. At the moment, OLED technology is being developed for the flat panel display industry. Liquid crystal displays dominate the market for wristwatches and cellular phones, for example, while the cathode ray tube plays the same role for television sets and desktop computers. Both have limitations when it comes to meeting the needs of the next generation of smart products. The attributes required include high brightness, low power consumption, high definition, full colour, wide viewing angle, fast response time, portability, and low cost. OLED has the potential to meet all those requirements. Universal Display Corporation (UDC) was founded to specialize in the development and commercialization of OLED technology. A partnership was established early with Princeton University professors, and no fewer than 20 researchers are working on OLED technology projects at Princeton University and the University of Southern California. To date, 35 patents have been issued and 60 others are pending. A joint development agreement was reached with Sony Corporation this year for high efficiency active matrix OLEDs to be used in large area monitor applications. OLED technology is based on vacuum-deposited organic small molecule materials that emit very bright light when electrically stimulated. Three advances in the technology were briefly discussed: TOLED™ (Transparent OLED), SOLED™ (Stacked OLED), and FOLED™ (Flexible OLED). A list detailing the various potential uses for the technology was also included in this paper. 3 figs.

  14. Seamless-merging-oriented parallel inverse lithography technology

    International Nuclear Information System (INIS)

    Yang Yiwei; Shi Zheng; Shen Shanhu

    2009-01-01

Inverse lithography technology (ILT), a promising resolution enhancement technology (RET) for next-generation IC manufacturing, has the capability to push lithography to its limit. However, existing ILT methods are either time-consuming, due to the large layout handled in a single process, or not accurate enough, due to simple block merging in the parallel process. The seamless-merging-oriented parallel ILT method proposed in this paper is fast because of the parallel process; most importantly, convergence enhancement penalty terms (CEPT) introduced into the parallel ILT optimization process take the environment into consideration, as well as environmental change through target updating. This method increases the similarity of the overlapped area between guard-bands and work units, makes the merging process approach seamless, and hence reduces hot-spots. The experimental results show that seamless-merging-oriented parallel ILT not only accelerates the optimization process, but also significantly improves the quality of ILT.
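The guard-band idea (each work unit optimized together with an overlapping margin of its neighbours so that independently computed results merge without seams) can be sketched as a one-dimensional partitioner. The layout width, unit size and guard width below are arbitrary assumptions, not values from the paper:

```python
# Split one axis of a layout into work units, each extended by a guard band
# so that neighbouring units overlap. Sizes are illustrative.
def partition(width, unit, guard):
    """Yield (core_start, core_end, ext_start, ext_end) per work unit."""
    for start in range(0, width, unit):
        end = min(start + unit, width)
        yield start, end, max(0, start - guard), min(width, end + guard)

units = list(partition(width=1000, unit=256, guard=32))
# each unit is optimized over [ext_start, ext_end) but only its
# [core_start, core_end) core is kept when the results are merged
```

Because adjacent extended windows share 2 x guard cells of context, the optimizer sees nearly the same environment on both sides of every seam, which is what makes the merged result approach seamless.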

  15. The "Next Generation Science Standards" and the Life Sciences

    Science.gov (United States)

    Bybee, Rodger W.

    2013-01-01

    Publication of the "Next Generation Science Standards" will be just short of two decades since publication of the "National Science Education Standards" (NRC 1996). In that time, biology and science education communities have advanced, and the new standards will reflect that progress (NRC 1999, 2007, 2009; Kress and Barrett…

  16. DNA Qualification Workflow for Next Generation Sequencing of Histopathological Samples

    Science.gov (United States)

    Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T.; Scarpa, Aldo

    2013-01-01

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for
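The qualification decision the workflow implies (enough Qubit-measured dsDNA for the 40 ng library input, with a large NanoDrop/Qubit discrepancy flagging degraded or contaminated preparations) can be sketched as follows. The 40 ng requirement comes from the abstract; the ratio threshold and all sample readings are illustrative assumptions:

```python
# Toy qualification check: Qubit reports usable dsDNA, NanoDrop reports total
# nucleic acid; a large ratio between the two suggests degradation or
# contamination, as observed for FFPE samples.
def qualify(nanodrop_ng_ul, qubit_ng_ul, volume_ul,
            required_ng=40.0,   # library input from the abstract
            max_ratio=2.0):     # assumed flagging threshold
    usable_ng = qubit_ng_ul * volume_ul
    return {
        "enough_dna": usable_ng >= required_ng,
        "flag_degraded": nanodrop_ng_ul / qubit_ng_ul > max_ratio,
    }

ffpe = qualify(nanodrop_ng_ul=85.0, qubit_ng_ul=20.0, volume_ul=10.0)
fresh = qualify(nanodrop_ng_ul=21.0, qubit_ng_ul=20.0, volume_ul=10.0)
```

Here the hypothetical FFPE sample has enough dsDNA but is flagged (ratio 4.25), while the fresh-frozen sample passes cleanly, mirroring the FF/FFPE divergence the study describes.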

  17. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

Full Text Available Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard

  18. Electric vehicle charge patterns and the electricity generation mix and competitiveness of next generation vehicles

    International Nuclear Information System (INIS)

    Masuta, Taisuke; Murata, Akinobu; Endo, Eiichi

    2014-01-01

Highlights: • The energy system of the whole of Japan is analyzed in this study. • An advanced model based on MARKAL is used for the energy system analysis. • The impact of charge patterns of EVs on the electricity generation mix is evaluated. • Technology competitiveness of the next generation vehicles is also evaluated. - Abstract: The nuclear accident of 2011 brought about a reconsideration of the future electricity generation mix of power systems in Japan. A debate on whether to phase out nuclear power plants and replace them with renewable energy sources is taking place. Demand-side management becomes increasingly important in future Japanese power systems with a large-scale integration of renewable energy sources. This paper considers the charge control of electric vehicles (EVs) through demand-side management. There have been many studies of the control or operation methods of EVs, known as vehicle-to-grid (V2G), and it is important to evaluate both their short-term and long-term operation. In this study, we employ an energy system model to evaluate the impact of the charge patterns of EVs on both the electricity generation mix and the technology competitiveness of the next generation vehicles. An advanced energy system model based on Market Allocation (MARKAL) is used to consider power system control in detail.

  19. Performance studies of the parallel VIM code

    International Nuclear Information System (INIS)

    Shi, B.; Blomquist, R.N.

    1996-01-01

In this paper, the authors evaluate the performance of the parallel version of the VIM Monte Carlo code on the IBM SPx at the High Performance Computing Research Facility at ANL. Three test problems with contrasting computational characteristics were used to assess effects on performance. A statistical method for estimating the inefficiencies due to load imbalance and communication is also introduced. VIM is a large scale continuous energy Monte Carlo radiation transport program and was parallelized using history partitioning, the master/worker approach, and the p4 message passing library. Dynamic load balancing is accomplished when the master processor assigns chunks of histories to workers that have completed a previously assigned task, accommodating variations in the lengths of histories, processor speeds, and worker loads. At the end of each batch (generation), the fission sites and tallies are sent from each worker to the master process, contributing to the parallel inefficiency. All communications are between master and workers, and are serial. The SPx is a scalable 128-node parallel supercomputer with high-performance Omega switches of 63 microsec latency and 35 MBytes/sec bandwidth. For uniform and reproducible performance, they used only the 120 identical regular processors (IBM RS/6000) and excluded the remaining eight planet nodes, which may be loaded by others' jobs.
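The dynamic load balancing described above (a master handing chunks of histories to whichever worker finishes first) can be sketched with a shared work queue. This uses threads for brevity; VIM itself uses the p4 message passing library across nodes, and the chunk size and worker count here are illustrative:

```python
import queue
import threading

def run_batch(n_histories, n_workers=4, chunk=50):
    """Master/worker history partitioning with dynamic chunk assignment."""
    tasks = queue.Queue()
    for start in range(0, n_histories, chunk):
        tasks.put(min(chunk, n_histories - start))  # last chunk may be short

    lock = threading.Lock()
    tallies = {"histories": 0}

    def worker():
        while True:
            try:
                n = tasks.get_nowait()  # take the next chunk as soon as free
            except queue.Empty:
                return                  # no more work this batch
            with lock:                  # end-of-batch tally merge at "master"
                tallies["histories"] += n

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return tallies["histories"]
```

Because workers pull chunks on demand rather than receiving a fixed share up front, slow workers simply take fewer chunks, which is how variable history lengths and processor speeds are absorbed.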

  20. The language parallel Pascal and other aspects of the massively parallel processor

    Science.gov (United States)

    Reeves, A. P.; Bruner, J. D.

    1982-01-01

    A high level language for the Massively Parallel Processor (MPP) was designed. This language, called Parallel Pascal, is described in detail. A description of the language design, a description of the intermediate language, Parallel P-Code, and details for the MPP implementation are included. Formal descriptions of Parallel Pascal and Parallel P-Code are given. A compiler was developed which converts programs in Parallel Pascal into the intermediate Parallel P-Code language. The code generator to complete the compiler for the MPP is being developed independently. A Parallel Pascal to Pascal translator was also developed. The architecture design for a VLSI version of the MPP was completed with a description of fault tolerant interconnection networks. The memory arrangement aspects of the MPP are discussed and a survey of other high level languages is given.

  1. Next generation industrial biotechnology based on extremophilic bacteria.

    Science.gov (United States)

    Chen, Guo-Qiang; Jiang, Xiao-Ran

    2018-04-01

Industrial biotechnology aims to produce bulk chemicals, including polymeric materials and biofuels, by bioprocessing sustainable agricultural products such as starch, fatty acids and/or cellulose. However, traditional bioprocesses require bioreactors made of stainless steel, complicated sterilization, difficult and expensive separation procedures, as well as well-trained engineers able to conduct bioprocessing under sterile conditions, reducing the competitiveness of the bio-products. Amid continued low petroleum prices, next generation industrial biotechnology (NGIB) allows bioprocessing to be conducted under unsterile (open) conditions using ceramic, cement or plastic bioreactors in a continuous way; it should be an energy-, water- and substrate-saving technology with convenient operating procedures. NGIB also requires less capital investment and reduces the demand for highly trained engineers. The foundation of the simplified NGIB is microorganisms that resist contamination by other microbes; one example is fast-growing halophilic bacteria cultivated at high salt concentration and alkaline pH. They have been engineered to produce multiple products at various scales. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Modeling the video distribution link in the Next Generation Optical Access Networks

    International Nuclear Information System (INIS)

    Amaya, F; Cardenas, A; Tafur, I

    2011-01-01

In this work we present a model for the design and optimization of the video distribution link in the next generation optical access network. We analyze the video distribution performance in a SCM-WDM link, including noise, distortion and fiber optic nonlinearities. Additionally, we consider in the model the effect of distributed Raman amplification, used to extend the capacity and the reach of the optical link. In the model, we use the nonlinear Schroedinger equation to obtain capacity limitations and design constraints of the next generation optical access networks.

  3. High voltage pulse generator. [Patent application

    Science.gov (United States)

    Fasching, G.E.

    1975-06-12

An improved high-voltage pulse generator is described which is especially useful in ultrasonic testing of rock core samples. N capacitors are charged in parallel to V volts and at the proper instant are coupled in series to produce a high-voltage pulse of N times V volts. Rapid switching of the capacitors from the paralleled charging configuration to the series discharging configuration is accomplished by using silicon-controlled rectifiers which are chain self-triggered following the initial triggering of the first rectifier, connected between the first and second capacitors. A timing and triggering circuit is provided to properly synchronize triggering pulses to the first SCR at a time when the charging voltage is not being applied to the parallel-connected charging capacitors. The output voltage can be readily increased by adding additional charging networks. The circuit allows the peak level of the output to be easily varied over a wide range by using a variable autotransformer in the charging circuit.
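The parallel-charge/series-discharge scheme described above (a Marx-generator-style arrangement) has simple ideal-case arithmetic: series voltages add while series capacitances divide. The stage count and component values below are illustrative, not from the patent:

```python
# Ideal Marx-style stack: N capacitors charged in parallel to V volts,
# then discharged in series. Switching losses and stray elements ignored.
def pulse_stack(n_stages, charge_volts, cap_farads):
    peak_volts = n_stages * charge_volts                       # voltages add in series
    stored_joules = 0.5 * n_stages * cap_farads * charge_volts ** 2
    series_farads = cap_farads / n_stages                      # capacitance of the stack
    return peak_volts, stored_joules, series_farads

v, e, c = pulse_stack(n_stages=8, charge_volts=5_000, cap_farads=1e-6)
# 8 stages charged to 5 kV give a 40 kV ideal peak from 100 J of stored energy
```

Adding charging networks raises N (and thus the peak), while the autotransformer in the charging circuit scales V, which is the abstract's mechanism for varying the output level.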

  4. Ceria-Based Anodes for Next Generation Solid Oxide Fuel Cells

    Science.gov (United States)

    Mirfakhraei, Behzad

Mixed ionic and electronic conducting materials (MIECs) have been suggested to represent the next generation of solid oxide fuel cell (SOFC) anodes, primarily due to their significantly enhanced active surface area and their tolerance to fuel components. In this thesis, the main focus has been on determining and tuning the physicochemical and electrochemical properties of ceria-based MIECs in the versatile perovskite or fluorite crystal structures. In one direction, BaZr0.1Ce0.7Y0.1M0.1O3-δ (M = Fe, Ni, Co and Yb) (BZCY-M) perovskites were synthesized using solid-state or wet citric acid combustion methods, and the effect of various transition metal dopants on the sintering behavior, crystal structure, chemical stability under CO2 and H2S, and electrical conductivity was investigated. BZCY-Ni, synthesized using the wet combustion method, was the best performing anode, giving a polarization resistance (RP) of 0.4 Ω·cm2 at 800 °C. Scanning electron microscopy and X-ray diffraction analysis showed that this was due to the exsolution of catalytic Ni nanoparticles onto the oxide surface. Evolving from this promising result, the effect of Mo-doped CeO2 (nCMO) or Ni nanoparticle infiltration into a porous Gd-doped CeO2 (GDC) anode (in the fluorite structure) was studied. While 3 wt. % Ni infiltration lowered RP by up to 90%, giving 0.09 Ω·cm2 at 800 °C and exhibiting a ca. 5 times higher tolerance towards 10 ppm H2, nCMO infiltration enhanced the H2 stability by ca. 3 times but had no influence on RP. In parallel work, a first-time study of the Ce3+ and Ce4+ redox process (pseudocapacitance) within GDC anode materials was carried out using cyclic voltammetry (CV) in wet H2 at high temperatures. It was concluded that, at 500-600 °C, the Ce3+/Ce4+ reaction is diffusion controlled, probably due to O2- transport limitations in the outer 5-10 layers of the GDC particles, giving a very high capacitance of ca. 70 F/g. Increasing the temperature ultimately

  5. Next-Generation Sequencing in Clinical Molecular Diagnostics of Cancer: Advantages and Challenges

    Directory of Open Access Journals (Sweden)

    Rajyalakshmi Luthra

    2015-10-01

Full Text Available The application of next-generation sequencing (NGS) to characterize cancer genomes has resulted in the discovery of numerous genetic markers. Consequently, the number of markers that warrant routine screening in molecular diagnostic laboratories, often from limited tumor material, has increased. This increased demand has been difficult to manage by traditional low- and/or medium-throughput sequencing platforms. Massively parallel sequencing capabilities of NGS provide a much-needed alternative for mutation screening in multiple genes with a single low investment of DNA. However, implementation of NGS technologies, most of which are for research use only (RUO), in a diagnostic laboratory needs extensive validation in order to establish Clinical Laboratory Improvement Amendments (CLIA)- and College of American Pathologists (CAP)-compliant performance characteristics. Here, we have reviewed approaches for validation of NGS technology for routine screening of tumors. We discuss the criteria for selecting gene markers to include in the NGS panel and the deciding factors for selecting target capture approaches and sequencing platforms. We also discuss challenges in result reporting, storage and retrieval of the voluminous sequencing data and the future potential of clinical NGS.

  6. Challenges in the Setup of Large-scale Next-Generation Sequencing Analysis Workflows

    Directory of Open Access Journals (Sweden)

    Pranav Kulkarni

Full Text Available While Next-Generation Sequencing (NGS) can now be considered an established analysis technology for research applications across the life sciences, the analysis workflows still require substantial bioinformatics expertise. Typical challenges include the appropriate selection of analytical software tools, the speedup of the overall procedure using HPC parallelization and acceleration technology, the development of automation strategies, data storage solutions and finally the development of methods for full exploitation of the analysis results across multiple experimental conditions. Recently, NGS has begun to expand into clinical environments, where it facilitates diagnostics enabling personalized therapeutic approaches, but is also accompanied by new technological, legal and ethical challenges. There are probably as many overall concepts for the analysis of the data as there are academic research institutions. Among these concepts are, for instance, complex IT architectures developed in-house, ready-to-use technologies installed on-site as well as comprehensive Everything as a Service (XaaS) solutions. In this mini-review, we summarize the key points to consider in the setup of the analysis architectures, mostly for scientific rather than diagnostic purposes, and provide an overview of the current state of the art and challenges of the field.

  7. A task-based parallelism and vectorized approach to 3D Method of Characteristics (MOC) reactor simulation for high performance computing architectures

    Science.gov (United States)

    Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.

    2016-05-01

    In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
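The two key ideas the abstract names, dynamic assignment of independent MOC tracks to threads for load balancing, plus a wide inner loop over each track, can be sketched in miniature. This is an illustrative toy, not the authors' proxy applications: the queue-based worker pool and the attenuation stand-in computation are assumptions.

```python
import threading
import queue

def sweep_tracks(num_tracks, num_threads):
    """Toy 'transport sweep': each independent track is pulled from a shared
    queue, so faster threads automatically claim more work (dynamic load
    balancing). The inner per-track loop is a stand-in for the vectorizable
    segment loop of a real MOC sweep."""
    tasks = queue.Queue()
    for t in range(num_tracks):
        tasks.put(t)
    results = [0.0] * num_tracks

    def worker():
        while True:
            try:
                t = tasks.get_nowait()
            except queue.Empty:
                return  # no tracks left for this thread
            # stand-in for accumulating attenuation along one track's segments
            results[t] = sum((t + s) * 0.5 for s in range(8))

    threads = [threading.Thread(target=worker) for _ in range(num_threads)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

In a real implementation the per-track work is much larger and the inner segment loop is written so a compiler can map it onto SIMD lanes; the scheduling pattern, however, is exactly this work-queue shape.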

  8. Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data

    NARCIS (Netherlands)

    Hettne, K.M.; Boorsma, A.; Dartel, D.A. van; Goeman, J.J.; Jong, E. de; Piersma, A.H.; Stierum, R.H.; Kleinjans, J.C.; Kors, J.A.

    2013-01-01

    BACKGROUND: Availability of chemical response-specific lists of genes (gene sets) for pharmacological and/or toxic effect prediction for compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with gene set

  9. Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data

    NARCIS (Netherlands)

    Hettne, K.M.; Boorsma, A.; Dartel, van D.A.M.; Goeman, J.J.; Jong, de E.; Piersma, A.H.; Stierum, R.H.; Kleinjans, J.C.; Kors, J.A.

    2013-01-01

    Background: Availability of chemical response-specific lists of genes (gene sets) for pharmacological and/or toxic effect prediction for compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with gene set

  10. Multiple Access Techniques for Next Generation Wireless: Recent Advances and Future Perspectives

    Directory of Open Access Journals (Sweden)

    Shree Krishna Sharma

    2016-01-01

    Full Text Available The advances in multiple access techniques have been one of the key drivers in moving from one cellular generation to another. Starting from the first generation, several multiple access techniques have been explored in different generations, and various emerging multiplexing/multiple access techniques are being investigated for the next generation of cellular networks. In this context, this paper first provides a detailed review of the existing Space Division Multiple Access (SDMA)-related works. Subsequently, it highlights the main features and drawbacks of various existing and emerging multiplexing/multiple access techniques. Finally, we propose a novel concept of clustered orthogonal signature division multiple access for the next generation of cellular networks. The proposed concept envisions employing joint antenna coding in order to enhance the orthogonality of SDMA beams with the objective of enhancing the spectral efficiency of future cellular networks.

  11. GROMACS 4.5: A high-throughput and highly parallel open source molecular simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pronk, Sander [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Pall, Szilard [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Schulz, Roland [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Larsson, Per [Univ. of Virginia, Charlottesville, VA (United States); Bjelkmar, Par [Science for Life Lab., Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden); Apostolov, Rossen [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Shirts, Michael R. [Univ. of Virginia, Charlottesville, VA (United States); Smith, Jeremy C. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kasson, Peter M. [Univ. of Virginia, Charlottesville, VA (United States); van der Spoel, David [Science for Life Lab., Stockholm (Sweden); Uppsala Univ., Uppsala (Sweden); Hess, Berk [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Lindahl, Erik [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden)

    2013-02-13

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Here we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.

  12. Design Principles of Next-Generation Digital Gaming for Education.

    Science.gov (United States)

    Squire, Kurt; Jenkins, Henry; Holland, Walter; Miller, Heather; O'Driscoll, Alice; Tan, Katie Philip; Todd, Katie.

    2003-01-01

    Discusses the rapid growth of digital games, describes research at MIT that is exploring the potential of digital games for supporting learning, and offers hypotheses about the design of next-generation educational video and computer games. Highlights include simulations and games; and design principles, including context and using information to…

  13. Identification and Characterization of Key Human Performance Issues and Research in the Next Generation Air Transportation System (NextGen)

    Science.gov (United States)

    Lee, Paul U.; Sheridan, Tom; Poage, James L.; Martin, Lynne Hazel; Jobe, Kimberly K.

    2010-01-01

    This report identifies key human-performance-related issues associated with Next Generation Air Transportation System (NextGen) research in the NASA NextGen-Airspace Project. Four Research Focus Areas (RFAs) in the NextGen-Airspace Project - namely Separation Assurance (SA), Airspace Super Density Operations (ASDO), Traffic Flow Management (TFM), and Dynamic Airspace Configuration (DAC) - were examined closely. In the course of the research, it was determined that the identified human performance issues needed to be analyzed in the context of NextGen operations rather than through basic human factors research. The main gaps in human factors research in NextGen were found in the need for accurate identification of key human-systems related issues within the context of specific NextGen concepts and better design of the operational requirements for those concepts. By focusing on human-system related issues for individual concepts, key human performance issues for the four RFAs were identified and described in this report. In addition, mixed equipage airspace with components of two RFAs were characterized to illustrate potential human performance issues that arise from the integration of multiple concepts.

  14. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in a rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  15. 75 FR 82387 - Next Generation Risk Assessment Public Dialogue Conference

    Science.gov (United States)

    2010-12-30

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9246-7] Next Generation Risk Assessment Public Dialogue Conference AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of Public Dialogue Conference... methods with the National Institutes of Environmental Health Sciences' National Toxicology Program, Center...

  16. Comparison of next generation sequencing technologies for transcriptome characterization

    Directory of Open Access Journals (Sweden)

    Soltis Douglas E

    2009-08-01

    Full Text Available Abstract Background We have developed a simulation approach to help determine the optimal mixture of sequencing methods for the most complete and cost-effective transcriptome sequencing. We compared simulation results for traditional capillary sequencing with "Next Generation" (NG) ultra high-throughput technologies. The simulation model was parameterized using mappings of 130,000 cDNA sequence reads to the Arabidopsis genome (NCBI Accession SRA008180.19). We also generated 454-GS20 sequences and de novo assemblies for the basal eudicot California poppy (Eschscholzia californica) and the magnoliid avocado (Persea americana) using a variety of methods for cDNA synthesis. Results The Arabidopsis reads tagged more than 15,000 genes, including new splice variants and extended UTR regions. Of the total 134,791 reads (13.8 MB), 119,518 (88.7%) mapped exactly to known exons, while 1,117 (0.8%) mapped to introns, 11,524 (8.6%) spanned annotated intron/exon boundaries, and 3,066 (2.3%) extended beyond the ends of annotated UTRs. Sequence-based inference of relative gene expression levels correlated significantly with microarray data. As expected, NG sequencing of normalized libraries tagged more genes than non-normalized libraries, although non-normalized libraries yielded more full-length cDNA sequences. The Arabidopsis data were used to simulate additional rounds of NG and traditional EST sequencing, and various combinations of each. Our simulations suggest a combination of FLX and Solexa sequencing for optimal transcriptome coverage at modest cost. We have also developed ESTcalc http://fgp.huck.psu.edu/NG_Sims/ngsim.pl, an online webtool, which allows users to explore the results of this study by specifying individualized costs and sequencing characteristics. Conclusion NG sequencing technologies are a highly flexible set of platforms that can be scaled to suit different project goals. In terms of sequence coverage alone, the NG sequencing is a dramatic advance

  17. The Gut Microbiotassay: a high-throughput qPCR approach combinable with next generation sequencing to study gut microbial diversity

    DEFF Research Database (Denmark)

    Hermann-Bank, Marie Louise; Skovgaard, Kerstin; Stockmarr, Anders

    2013-01-01

    ®) followed by next generation sequencing. Primers were designed if necessary and all primer sets were screened against DNA extracted from pure cultures of 15 representative bacterial species. Subsequently the setup was tested on DNA extracted from small and large intestinal content from piglets...

  18. Next Generation Space Interconnect Standard (NGSIS): a modular open standards approach for high performance interconnects for space

    Science.gov (United States)

    Collier, Charles Patrick

    2017-04-01

    The Next Generation Space Interconnect Standard (NGSIS) effort is a Government-Industry collaboration effort to define a set of standards for interconnects between space system components with the goal of cost effectively removing bandwidth as a constraint for future space systems. The NGSIS team has selected the ANSI/VITA 65 OpenVPX™ standard family for the physical baseline. The RapidIO protocol has been selected as the basis for the digital data transport. The NGSIS standards are developed to provide sufficient flexibility to enable users to implement a variety of system configurations, while meeting goals for interoperability and robustness for space. The NGSIS approach and effort represents a radical departure from past approaches to achieve a Modular Open System Architecture (MOSA) for space systems and serves as an exemplar for the civil, commercial, and military Space communities as well as a broader high reliability terrestrial market.

  19. Development of next generation tempered and ODS reduced activation ferritic/martensitic steels for fusion energy applications

    Science.gov (United States)

    Zinkle, S. J.; Boutard, J. L.; Hoelzer, D. T.; Kimura, A.; Lindau, R.; Odette, G. R.; Rieth, M.; Tan, L.; Tanigawa, H.

    2017-09-01

    Reduced activation ferritic/martensitic steels are currently the most technologically mature option for the structural material of proposed fusion energy reactors. Advanced next-generation higher performance steels offer the opportunity for improvements in fusion reactor operational lifetime and reliability, superior neutron radiation damage resistance, higher thermodynamic efficiency, and reduced construction costs. The two main strategies for developing improved steels for fusion energy applications are based on (1) an evolutionary pathway using computational thermodynamics modelling and modified thermomechanical treatments (TMT) to produce higher performance reduced activation ferritic/martensitic (RAFM) steels and (2) a higher risk, potentially higher payoff approach based on powder metallurgy techniques to produce very high strength oxide dispersion strengthened (ODS) steels capable of operation to very high temperatures and with potentially very high resistance to fusion neutron-induced property degradation. The current development status of these next-generation high performance steels is summarized, and research and development challenges for the successful development of these materials are outlined. Material properties including temperature-dependent uniaxial yield strengths, tensile elongations, high-temperature thermal creep, Charpy impact ductile-to-brittle transition temperature (DBTT) and fracture toughness behaviour, and neutron irradiation-induced low-temperature hardening and embrittlement and intermediate-temperature volumetric void swelling (including effects associated with fusion-relevant helium and hydrogen generation) are described for research heats of the new steels.

  20. Deep learning—Accelerating Next Generation Performance Analysis Systems?

    Directory of Open Access Journals (Sweden)

    Heike Brock

    2018-02-01

    Full Text Available Deep neural network architectures show superior performance in recognition and prediction tasks of the image, speech and natural language domains. The success of such multi-layered networks encourages their implementation in further application scenarios, such as the retrieval of relevant motion information for performance enhancement in sports. However, to date deep learning is only seldom applied to activity recognition problems of the human motion domain. Therefore, its use for sports data analysis might remain abstract to many practitioners. This paper provides a survey on recent works in the field of high-performance motion data and examines relevant technologies for subsequent deployment in real training systems. In particular, it discusses aspects of data acquisition, processing and network modeling. Analysis suggests the advantage of deep neural networks under difficult and noisy data conditions. However, further research is necessary to confirm the benefit of deep learning for next generation performance analysis systems.

  1. Rapid Conditioning for the Next Generation Melting System

    Energy Technology Data Exchange (ETDEWEB)

    Rue, David M. [Gas Technology Institute, Des Plaines, IL (United States)

    2015-06-17

    This report describes work on Rapid Conditioning for the Next Generation Melting System under US Department of Energy Contract DE-FC36-06GO16010. The project lead was the Gas Technology Institute (GTI). Partners included Owens Corning and Johns Manville. Cost share for this project was provided by NYSERDA (the New York State Energy Research and Development Authority), Owens Corning, Johns Manville, Owens Illinois, and the US natural gas industry through GTI’s SMP and UTD programs. The overarching focus of this project was to study and develop rapid refining approaches for segmented glass manufacturing processes using high-intensity melters such as the submerged combustion melter. The objectives of this project were to 1) test and evaluate the most promising approaches to rapidly condition the homogeneous glass produced from the submerged combustion melter, and 2) design a pilot-scale NGMS system for fiberglass recycle.

  2. RIPng - A Next Generation Routing Protocol (IPv6) | Falaye | Journal ...

    African Journals Online (AJOL)

    ... Information Protocol Next Generation (RIPng) owing to the current depletion rate of IPv4. ... that support the Internet Protocol Version 6 (IPv6).addressing scheme. ... A brief history is given; its various versions are discussed, and detailed ...

  3. Clinical utility of a 377 gene custom next-generation sequencing ...

    Indian Academy of Sciences (India)

    JEN BEVILACQUA

    2017-07-26

    Jul 26, 2017 ... Clinical utility of a 377 gene custom next-generation sequencing epilepsy panel ... number of genes, making it a very attractive option for a condition as .... clinical value of various test offerings to guide decision making.

  4. Optimum fuel allocation in parallel steam generator systems

    International Nuclear Information System (INIS)

    Bollettini, U.; Cangioli, E.; Cerri, G.; Rome Univ. 'La Sapienza'; Trento Univ.

    1991-01-01

    An optimization procedure was developed to allocate fuels among parallel steam generators. The procedure takes into account the level of performance deterioration connected with the loading history (fossil fuel allocation and maintenance) of each steam generator. The optimization objective function is the system hourly cost, with the overall steam demand being satisfied. Costs are due to fuel and electric power supply as well as to plant depreciation and maintenance. To allow the state of each steam generator to be easily updated, particular care was taken in the general formulation of the steam production function by adopting a special efficiency-load curve description based on a deterioration scaling parameter. The influence of the characteristic time interval length on the optimum operation result is investigated. A special implementation of the method based on minimum-cost paths is suggested.
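The allocation problem this abstract describes, splitting a fixed steam demand between parallel generators so that total hourly cost is minimized, can be illustrated with a deliberately simplified sketch. The quadratic cost curves, the parameter values, and the grid search are assumptions for illustration only; the paper's actual model additionally tracks deterioration and uses minimum-cost paths.

```python
def hourly_cost(load, a, b, c):
    """Assumed (illustrative) fuel-cost curve: cost = a + b*load + c*load^2."""
    return a + b * load + c * load * load

def best_split(demand, gen1, gen2, step=1.0):
    """Grid-search the split of `demand` between two generators, each given
    as a (a, b, c) cost-curve tuple, and return (load on generator 1,
    minimum total hourly cost)."""
    best = None
    x = 0.0
    while x <= demand:
        total = hourly_cost(x, *gen1) + hourly_cost(demand - x, *gen2)
        if best is None or total < best[1]:
            best = (x, total)
        x += step
    return best
```

For two generators with curves (5, 2, 0.02) and (5, 3, 0.01) and a demand of 100, the search recovers the split where the marginal costs of the two units are equal, which is the classic economic-dispatch condition.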

  5. AgMIP: Next Generation Models and Assessments

    Science.gov (United States)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6

  6. Simultaneous genomic identification and profiling of a single cell using semiconductor-based next generation sequencing

    Directory of Open Access Journals (Sweden)

    Manabu Watanabe

    2014-09-01

    Full Text Available Combining single-cell methods and next-generation sequencing should provide a powerful means to understand single-cell biology and obviate the effects of sample heterogeneity. Here we report a single-cell identification method and seamless cancer gene profiling using semiconductor-based massively parallel sequencing. A549 cells (an adenocarcinomic human alveolar basal epithelial cell line) were used as a model. Single-cell capture was performed using laser capture microdissection (LCM) with an Arcturus® XT system, and a captured single cell and a bulk population of A549 cells (≈10^6 cells) were subjected to whole genome amplification (WGA). For cell identification, a multiplex PCR method (the AmpliSeq™ SNP HID panel) was used to enrich 136 highly discriminatory SNPs with a genotype concordance probability of 10^-31 to 10^-35. For cancer gene profiling, mutation profiling was performed in parallel using a hotspot panel for 50 cancer-related genes. Sequencing was performed using a semiconductor-based benchtop sequencer. The distribution of sequence reads for both HID and Cancer panel amplicons was consistent across these samples. For the bulk population of cells, the percentages of sequence covered at more than 100× coverage were 99.04% for the HID panel and 98.83% for the Cancer panel, while for the single cell they were 55.93% for the HID panel and 65.96% for the Cancer panel. Partial amplification failure, randomly distributed non-amplified regions arising during the WGA procedure, or random allele dropout probably caused these differences. However, comparative analyses showed that this method successfully discriminated a single A549 cancer cell from a bulk population of A549 cells. Thus, our approach provides a powerful means to overcome tumor sample heterogeneity when searching for somatic mutations.
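The breadth-of-coverage figures quoted in this abstract (e.g. 99.04% of the HID panel covered at more than 100×) come from a simple per-position count over the targeted bases. A minimal sketch, using hypothetical depth values rather than the study's data:

```python
def pct_covered_above(depths, threshold=100):
    """Return the percentage of targeted positions whose read depth exceeds
    `threshold` (the breadth-of-coverage metric at >threshold x).
    `depths` is a list of per-position read depths."""
    if not depths:
        return 0.0
    hits = sum(1 for d in depths if d > threshold)
    return 100.0 * hits / len(depths)
```

Applied per panel (HID or Cancer) and per sample (bulk vs. single cell), this one number summarizes how uniformly the amplicons were sequenced; the single-cell samples score lower because WGA leaves some regions unamplified.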

  7. Next Generation Nuclear Plant GAP Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Ball, Sydney J [ORNL; Burchell, Timothy D [ORNL; Corwin, William R [ORNL; Fisher, Stephen Eugene [ORNL; Forsberg, Charles W. [Massachusetts Institute of Technology (MIT); Morris, Robert Noel [ORNL; Moses, David Lewis [ORNL

    2008-12-01

    As a follow-up to the phenomena identification and ranking table (PIRT) studies conducted recently by NRC on next generation nuclear plant (NGNP) safety, a study was conducted to identify the significant 'gaps' between what is needed and what is already available to adequately assess NGNP safety characteristics. The PIRT studies focused on identifying important phenomena affecting NGNP plant behavior, while the gap study gives more attention to off-normal behavior, uncertainties, and event probabilities under both normal operation and postulated accident conditions. Hence, this process also involved incorporating more detailed evaluations of accident sequences and risk assessments. This study considers thermal-fluid and neutronic behavior under both normal and postulated accident conditions, fission product transport (FPT), high-temperature metals, and graphite behavior and their effects on safety. In addition, safety issues related to coupling process heat (hydrogen production) systems to the reactor are addressed, given the limited design information currently available. Recommendations for further study, including analytical methods development and experimental needs, are presented as appropriate in each of these areas.

  8. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single-board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction. 10 refs., 7 figs
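The hypercube interconnect mentioned for the crates has a compact addressing rule: in a d-dimensional hypercube, node i is linked to exactly the d nodes whose binary ids differ from i in one bit. A minimal sketch (the function name is ours, not part of CANOPY):

```python
def hypercube_neighbors(node, dim):
    """Return the ids of the `dim` neighbors of `node` in a `dim`-dimensional
    hypercube: flip each of the low `dim` bits of `node` in turn."""
    return [node ^ (1 << b) for b in range(dim)]
```

This XOR rule is why hypercubes were popular for lattice codes: any node can reach any other in at most d hops, and nearest-neighbor lattice communication maps naturally onto single-bit flips.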

  9. The Next Generation Science Standards: The Features and Challenges

    Science.gov (United States)

    Pruitt, Stephen L.

    2014-01-01

    Beginning in January of 2010, the Carnegie Corporation of New York funded a two-step process to develop a new set of state-developed science standards intended to prepare students for college and career readiness in science. These new internationally benchmarked science standards, the Next Generation Science Standards (NGSS), were completed in…

  10. Next-to-next-to-leading logarithms in four-fermion electroweak processes at high energy

    International Nuclear Information System (INIS)

    Kuehn, J.H.; Moch, S.; Penin, A.A.; Smirnov, V.A.

    2001-01-01

    We sum up the next-to-next-to-leading logarithmic virtual electroweak corrections to the high energy asymptotics of the neutral current four-fermion processes for light fermions to all orders in the coupling constants using the evolution equation approach. From this all-order result we derive finite-order expressions through next-to-next-to-leading order for the total cross section and various asymmetries. We observe an amazing cancellation between the sizable leading, next-to-leading and next-to-next-to-leading logarithmic contributions at TeV energies.
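The logarithmic hierarchy named here follows the standard Sudakov bookkeeping (a generic sketch of the expansion structure, not this paper's explicit result): with the large logarithm L = ln(s/M²), an n-loop virtual electroweak correction contains powers of L up to 2n, and LL, NLL, NNLL label the first three columns of the tower.

```latex
% Generic structure of the high-energy log expansion; the coefficients
% c^{(n)}_{m} are process dependent.
\mathcal{A} \sim 1 + \sum_{n\ge 1}\alpha^{n}\Big(
    \underbrace{c^{(n)}_{2n}\,L^{2n}}_{\mathrm{LL}}
  + \underbrace{c^{(n)}_{2n-1}\,L^{2n-1}}_{\mathrm{NLL}}
  + \underbrace{c^{(n)}_{2n-2}\,L^{2n-2}}_{\mathrm{NNLL}}
  + \cdots \Big),
\qquad L = \ln\frac{s}{M^{2}}
```

Summing a column "to all orders" via evolution equations, as the abstract describes, resums the corresponding tower of logarithms rather than truncating at a fixed loop order.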

  11. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    Directory of Open Access Journals (Sweden)

    Lee Mike Myung-Ok

    2006-01-01

    Full Text Available This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, 32-bit dedicated RISC processor for control, on-chip program/data memory, data frame buffer, along with a direct memory access (DMA) controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, being used to determine the optimum hardware specification in the early design stage.

  12. Career Advancement: Meeting the Challenges Confronting the Next Generation of Endocrinologists and Endocrine Researchers.

    Science.gov (United States)

    Santen, Richard J; Joham, Anju; Fishbein, Lauren; Vella, Kristen R; Ebeling, Peter R; Gibson-Helm, Melanie; Teede, Helena

    2016-12-01

    Challenges and opportunities face the next generation (Next-Gen) of endocrine researchers and clinicians, the lifeblood of the field of endocrinology for the future. A symposium jointly sponsored by The Endocrine Society and the Endocrine Society of Australia was convened to discuss approaches to addressing present and future Next-Gen needs. Data were collected through literature review, assessment of previously completed questionnaires, commissioning of a new questionnaire, and summarization of symposium discussions. Next-Gen endocrine researchers face diminishing grant funding in inflation-adjusted terms. The average age of individuals being awarded their first independent investigator funding has increased to 45 years. For clinicians, a workforce gap exists between the endocrinologists needed and those currently trained. Clinicians in practice are increasingly becoming employees of integrated hospital systems, resulting in greater time spent on nonclinical issues. Workforce data and published reviews identify challenges specifically related to early-career women in endocrinology. Strategies to address these issues: recommendations encompassed the areas of grant support for research, mentoring, education, templates for career development, specific programs for Next-Gen members run by senior colleagues as outlined in the text, networking, team science, and life/work integration. Endocrine societies focusing on Next-Gen members provide a powerful mechanism to support these critical areas. A concerted effort to empower, train, and support the next generation of clinical endocrinologists and endocrine researchers is necessary to ensure the viability and vibrancy of our discipline and to optimize our contributions to improving health outcomes. Collaborative engagement of endocrine societies globally will be necessary to support our next generation moving forward.

  13. "ASTRO 101" Course Materials 2.0: Next Generation Lecture Tutorials and Beyond

    Science.gov (United States)

    Slater, Stephanie; Grazier, Kevin

    2015-01-01

    Early efforts to create course materials were often local in scale and based on "gut instinct," classroom experience, and observation. While subsequent efforts were often based on those same instincts and observations, they also incorporated the results of many years of education research. These "second generation" course materials, such as lecture tutorials, relied heavily on research indicating that instructors need to actively engage students in the learning process. While imperfect, these curricular innovations have provided evidence that research-based materials can be constructed, can easily be disseminated to a broad audience, and can provide measurable improvement in student learning across many settings. In order to improve upon this prior work, next generation materials must build upon the strengths of these innovations while engineering in findings from education research, cognitive science, and instructor feedback. A next wave of materials, including a set of next generation lecture tutorials, has been constructed with attention to the body of research on student motivation and cognitive load, and is responsive to our body of knowledge on learning difficulties related to specific content in the domain. Based on instructor feedback, these materials have been constructed to cover more of the material typically taught in an ASTRO 101 course, to take less class time, and to be more affordable for students. This next generation of lecture tutorials may serve as a template for the ways in which course materials can be reengineered to respond to current instructor and student needs.

  14. Houston Methodist variant viewer: An application to support clinical laboratory interpretation of next-generation sequencing data for cancer

    Directory of Open Access Journals (Sweden)

    Paul A Christensen

    2017-01-01

    Full Text Available Introduction: Next-generation sequencing (NGS) is increasingly used in clinical and research protocols for patients with cancer. NGS assays are routinely used in clinical laboratories to detect mutations bearing on cancer diagnosis, prognosis and personalized therapy. A typical assay may interrogate 50 or more gene targets that encompass many thousands of possible gene variants. Analysis of NGS data in cancer is a labor-intensive process that can become overwhelming to the molecular pathologist or research scientist. Although commercial tools for NGS data analysis and interpretation are available, they are often costly, lack key functionality or cannot be customized by the end user. Methods: To facilitate NGS data analysis in our clinical molecular diagnostics laboratory, we created a custom bioinformatics tool termed Houston Methodist Variant Viewer (HMVV). HMVV is a Java-based solution that integrates sequencing instrument output, bioinformatics analysis, storage resources and the end-user interface. Results: Compared to the predicate method used in our clinical laboratory, HMVV markedly simplifies the bioinformatics workflow for the molecular technologist and facilitates variant review by the molecular pathologist. Importantly, HMVV reduces time spent researching the biological significance of the variants detected, standardizes the online resources used to perform the variant investigation, and assists in generating the annotated report for the electronic medical record. HMVV also maintains a searchable variant database, including the variant annotations generated by the pathologist, which is useful for downstream quality improvement and research projects. Conclusions: HMVV is a clinical-grade, low-cost, feature-rich, highly customizable platform that we have made available for continued development by the pathology informatics community.
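The variant triage and annotation re-use the abstract describes can be illustrated with a small sketch. HMVV itself is a Java application; everything below (record fields, thresholds, the annotation store) is a hypothetical Python illustration of the workflow, not HMVV's actual API.

```python
# Hypothetical sketch of an HMVV-style triage step: filter NGS variant calls
# by allele frequency and depth, then attach any previously saved pathologist
# annotation from a searchable store. All names and thresholds are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Variant:
    gene: str
    hgvs: str        # e.g. "c.35G>A"
    vaf: float       # variant allele frequency
    depth: int       # read depth at the locus

# Annotation store keyed by (gene, hgvs), standing in for the searchable
# variant database of prior interpretations the abstract mentions.
annotation_db = {("KRAS", "c.35G>A"): "pathogenic, Tier I"}

def triage(variants, min_vaf=0.05, min_depth=100):
    """Keep calls passing QC thresholds; attach any stored annotation."""
    report = []
    for v in variants:
        if v.vaf >= min_vaf and v.depth >= min_depth:
            note = annotation_db.get((v.gene, v.hgvs), "no prior annotation")
            report.append((v.gene, v.hgvs, note))
    return report

calls = [Variant("KRAS", "c.35G>A", 0.32, 950),
         Variant("TP53", "c.215C>G", 0.02, 800)]  # second call is below the VAF cutoff
print(triage(calls))  # only the KRAS call passes QC
```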

  15. Next generation Zero-Code control system UI

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Developing ergonomic user interfaces for control systems is challenging, especially during machine upgrades and commissioning, where several small changes may suddenly be required. Zero-code systems, such as *Inspector*, provide agile features for creating and maintaining control system interfaces. Moreover, these next generation Zero-code systems bring simplicity and uniformity and break the boundaries between users and developers. In this talk we present *Inspector*, a CERN-made Zero-code application development system, and we introduce the major differences and advantages of using Zero-code control systems to develop operational UIs.

  16. Beamstrahlung spectra in next generation linear colliders

    Energy Technology Data Exchange (ETDEWEB)

    Barklow, T.; Chen, P. (Stanford Linear Accelerator Center, Menlo Park, CA (United States)); Kozanecki, W. (DAPNIA-SPP, CEN-Saclay (France))

    1992-04-01

    For the next generation of linear colliders, the energy loss due to beamstrahlung during the collision of the e+e− beams is expected to substantially influence the effective center-of-mass energy distribution of the colliding particles. In this paper, we first derive analytical formulae for the electron and photon energy spectra under multiple beamstrahlung processes, and for the e+e− and γγ differential luminosities. We then apply our formulation to various classes of 500 GeV e+e− linear collider designs currently under study.

  17. Microbial production of next-generation stevia sweeteners.

    Science.gov (United States)

    Olsson, Kim; Carlsen, Simon; Semmler, Angelika; Simón, Ernesto; Mikkelsen, Michael Dalgaard; Møller, Birger Lindberg

    2016-12-07

    The glucosyltransferase UGT76G1 from Stevia rebaudiana is a chameleon enzyme in the targeted biosynthesis of the next-generation premium stevia sweeteners, rebaudioside D (Reb D) and rebaudioside M (Reb M). These steviol glucosides carry five and six glucose units, respectively; they have low sweetness thresholds and high maximum sweet intensities, and exhibit a greatly reduced lingering bitter taste compared to stevioside and rebaudioside A, the most abundant steviol glucosides in the leaves of Stevia rebaudiana. In the metabolic glycosylation grid leading to production of Reb D and Reb M, UGT76G1 was found to catalyze eight different reactions, all involving 1,3-glucosylation of steviol C13- and C19-bound glucoses. Four of these reactions lead to Reb D and Reb M, while the other four result in the formation of side-products unwanted for production. In this work, side-product formation was reduced by targeted optimization of UGT76G1 towards 1,3-glucosylation of steviol glucosides that are already 1,2-diglucosylated. The optimization of UGT76G1 was based on homology modelling, which enabled identification of key target amino acids present in the substrate-binding pocket. These residues were then subjected to site-saturation mutagenesis, and a mutant library containing a total of 1748 UGT76G1 variants was screened for increased accumulation of Reb D or M, as well as for decreased accumulation of side-products. This screen was performed in a Saccharomyces cerevisiae strain expressing all enzymes in the rebaudioside biosynthesis pathway except for UGT76G1. Screening of the mutant library identified mutations with positive impact on the accumulation of Reb D and Reb M. The effect of the introduced mutations on other reactions in the metabolic grid was characterized. This screen made it possible to identify variants, such as UGT76G1-Thr146Gly and UGT76G1-His155Leu, which diminished accumulation of unwanted side-products and gave increased specific accumulation of the desired
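As an aside, the reported library size is consistent with full site-saturation coverage: with 19 non-wild-type amino acid substitutions per residue, 1748 variants corresponds to 92 targeted positions. This is an inference from the numbers given, not a detail the abstract states:

```python
# Sanity check, assuming the 1748-variant library is full site-saturation
# (19 non-wild-type substitutions per targeted residue). This reading of the
# number is an assumption, not stated in the abstract.
substitutions_per_site = 20 - 1   # 20 amino acids minus the wild type
library_size = 1748
positions = library_size // substitutions_per_site
print(positions, positions * substitutions_per_site)  # 92 positions, 1748 variants
```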

  18. Screening for SNPs with Allele-Specific Methylation based on Next-Generation Sequencing Data

    OpenAIRE

    Hu, Bo; Ji, Yuan; Xu, Yaomin; Ting, Angela H

    2013-01-01

    Allele-specific methylation (ASM) has long been studied but mainly documented in the context of genomic imprinting and X chromosome inactivation. Taking advantage of the next-generation sequencing technology, we conduct a high-throughput sequencing experiment with four prostate cell lines to survey the whole genome and identify single nucleotide polymorphisms (SNPs) with ASM. A Bayesian approach is proposed to model the counts of short reads for each SNP conditional on its genotypes of multip...
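The core statistical idea, comparing methylation rates between the two alleles at a heterozygous SNP, can be sketched with a simple Beta-Binomial posterior. This is a minimal stand-in, not the authors' hierarchical Bayesian model; counts, prior, and threshold are illustrative assumptions.

```python
# Minimal Beta-Binomial sketch of ASM detection: per allele, the counts of
# methylated/unmethylated reads give a Beta posterior over the methylation
# rate; a SNP is flagged when one allele's rate is credibly higher.
# This is NOT the paper's exact model -- just the underlying intuition.
import random

def asm_posterior_prob(m_a, u_a, m_b, u_b, prior=(1.0, 1.0), draws=20000, seed=1):
    """Monte Carlo estimate of P(rate_A > rate_B | counts) under Beta priors."""
    rng = random.Random(seed)
    a0, b0 = prior
    hits = 0
    for _ in range(draws):
        ra = rng.betavariate(a0 + m_a, b0 + u_a)  # posterior draw, allele A
        rb = rng.betavariate(a0 + m_b, b0 + u_b)  # posterior draw, allele B
        hits += ra > rb
    return hits / draws

# Illustrative counts -- allele A: 40 methylated / 5 unmethylated reads;
# allele B: 6 / 38.
p = asm_posterior_prob(40, 5, 6, 38)
print(round(p, 3))  # close to 1: strong evidence of allele-specific methylation
```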

  19. Near-term and next-generation nuclear power plant concepts

    International Nuclear Information System (INIS)

    Shiga, Shigenori; Handa, Norihiko; Heki, Hideaki

    2002-01-01

    Near-term and next-generation nuclear reactors will be required to have high economic competitiveness in the deregulated electricity market, flexibility with respect to electricity demand and investment, and good public acceptability. For near-term reactors in the 2010s, Toshiba is developing an improved advanced boiling water reactor (ABWR) based on the present ABWR with newly rationalized systems and components; a construction period of 36 months, one year shorter than the current period; and a power lineup ranging from 800 MWe to 1,600 MWe. For future reactors in the 2020s and beyond, Toshiba is developing the ABWR-II for large-scale, centralized power sources; a supercritical water-cooled power reactor with high thermal efficiency for medium-scale power sources; a modular reactor with siting flexibility for small-scale power sources; and a small, fast neutron reactor with inherent safety for independent power sources. From the viewpoint of efficient uranium resource utilization, a low-moderation BWR core with a high conversion factor is also being developed. (author)

  20. Proceedings of the international workshop on next-generation linear colliders

    International Nuclear Information System (INIS)

    Riordan, M.

    1988-12-01

    This report contains papers on the next-generation of linear colliders. The particular areas of discussion are: parameters; beam dynamics and wakefields; damping rings and sources; rf power sources; accelerator structures; instrumentation; final focus; and review of beam-beam interaction.

  1. C-Arc: A Novel Architecture for Next Generation Context- Aware ...

    African Journals Online (AJOL)

    In this paper, the common architecture principles of context-aware systems are presented and the crucial contextaware architecture issues to support the next generation context-aware systems which will enable seamless service provisioning in heterogeneous, dynamically varying computing and communication ...

  2. Proceedings of the international workshop on next-generation linear colliders

    Energy Technology Data Exchange (ETDEWEB)

    Riordan, M. (ed.)

    1988-12-01

    This report contains papers on the next-generation of linear colliders. The particular areas of discussion are: parameters; beam dynamics and wakefields; damping rings and sources; rf power sources; accelerator structures; instrumentation; final focus; and review of beam-beam interaction.

  3. Decision Optimization for Power Grid Operating Conditions with High- and Low-Voltage Parallel Loops

    Directory of Open Access Journals (Sweden)

    Dong Yang

    2017-05-01

    Full Text Available With the development of higher-voltage power grids, high- and low-voltage parallel loops are emerging, which leads to energy losses and can even threaten the security and stability of power systems. The multi-infeed high-voltage direct current (HVDC) configurations now widespread in AC/DC interconnected power systems make this situation even worse. Aimed at energy saving and system security, a decision optimization method for power grid operating conditions with high- and low-voltage parallel loops is proposed in this paper. Firstly, considering hub substation distribution and power grid structure, parallel loop opening schemes are generated with the Girvan-Newman (GN) algorithm. Then, candidate opening schemes are preliminarily selected from the generated schemes based on a filtering index. Finally, taking into account the influence on power system security, stability, and operating economy, an evaluation model for the candidate opening schemes is founded on the analytic hierarchy process (AHP), and a fuzzy evaluation algorithm is used to find the optimal scheme. Simulation results for a New England 39-bus system and an actual power system validate the effectiveness and superiority of the proposed method.
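The AHP step the abstract mentions reduces to computing priority weights from a pairwise comparison matrix. A common approximation of the principal eigenvector is the row geometric mean; the criteria and comparison values below are illustrative assumptions, not taken from the paper.

```python
# Minimal AHP sketch (geometric-mean approximation of the principal
# eigenvector) for weighting the security / stability / economy criteria.
# The pairwise comparison values are illustrative assumptions.
import math

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Criteria order: security, stability, operating economy.
# An entry of 2 means the row criterion is judged twice as important.
matrix = [[1.0, 2.0, 3.0],
          [1/2, 1.0, 2.0],
          [1/3, 1/2, 1.0]]
w = ahp_weights(matrix)
print([round(x, 3) for x in w])  # security gets the largest weight
```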

  4. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    Science.gov (United States)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  5. Modeling the video distribution link in the Next Generation Optical Access Networks

    DEFF Research Database (Denmark)

    Amaya, F.; Cárdenas, A.; Tafur Monroy, Idelfonso

    2011-01-01

    In this work we present a model for the design and optimization of the video distribution link in the next generation optical access network. We analyze the video distribution performance in a SCM-WDM link, including the noise, the distortion and the fiber optic nonlinearities. Additionally, we consider in the model the effect of distributed Raman amplification, used to extend the capacity and the reach of the optical link. In the model, we use the nonlinear Schrödinger equation with the purpose of obtaining capacity limitations and design constraints of the next generation optical access networks.
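The nonlinear Schrödinger equation referred to here is commonly integrated numerically with the split-step Fourier method, alternating a dispersion step in the frequency domain with a nonlinear phase step in the time domain. The sketch below is a generic lossless illustration of that scheme; the parameter values (typical single-mode fiber dispersion and nonlinearity) are assumptions, not values from the paper.

```python
# Split-step Fourier sketch of the lossless nonlinear Schrödinger equation:
#   dA/dz = -i(beta2/2) d^2A/dt^2 + i*gamma*|A|^2 A
# Parameter values are illustrative (roughly standard single-mode fiber).
import numpy as np

def ssfm(A0, dt, dz, steps, beta2=-21e-27, gamma=1.3e-3):
    """Propagate a field envelope A0(t) over steps*dz metres (symmetric scheme)."""
    n = A0.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=dt)          # angular frequency grid
    half_disp = np.exp(0.25j * beta2 * w**2 * dz)    # half-step dispersion operator
    A = A0.copy()
    for _ in range(steps):
        A = np.fft.ifft(half_disp * np.fft.fft(A))   # D/2
        A *= np.exp(1j * gamma * np.abs(A)**2 * dz)  # full nonlinear step
        A = np.fft.ifft(half_disp * np.fft.fft(A))   # D/2
    return A

t = np.linspace(-50e-12, 50e-12, 1024)               # 100 ps time window
A0 = np.sqrt(1e-3) * np.exp(-(t / 10e-12)**2)        # Gaussian pulse, ~1 mW peak
A = ssfm(A0, t[1] - t[0], dz=100.0, steps=50)        # 5 km of fiber
# The lossless scheme is unitary, so pulse energy is conserved:
print(np.allclose(np.sum(np.abs(A)**2), np.sum(np.abs(A0)**2)))  # True
```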

  6. Promising Practices: Building the Next Generation of School Leaders

    Science.gov (United States)

    Bryant, Jennifer Edic; Escalante, Karen; Selva, Ashley

    2017-01-01

    This study applies transformational leadership theory practices to examine the purposeful ways in which principals work to build the next generation of teacher leaders in response to the shortage of K-12 principals. Given the impact principals have on student development and the shortage of those applying for the principalship, the purpose of this…

  7. Rapid evaluation and quality control of next generation sequencing data with FaQCs.

    Science.gov (United States)

    Lo, Chien-Chi; Chain, Patrick S G

    2014-11-19

    Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
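The kind of quality trimming FaQCs performs, cutting back read ends whose base qualities fall below a threshold, can be sketched in a few lines. This is an illustrative 3'-end trim, not FaQCs' actual algorithm or default parameters.

```python
# Minimal sketch of 3'-end quality trimming of a FASTQ read (illustrative
# logic only -- not FaQCs' actual algorithm or defaults).

def phred(qual_char, offset=33):
    """Decode one Sanger/Illumina-1.8+ quality character to a Phred score."""
    return ord(qual_char) - offset

def trim_3prime(seq, qual, min_q=20):
    """Cut the read back to the last base whose quality is >= min_q."""
    end = len(seq)
    while end > 0 and phred(qual[end - 1]) < min_q:
        end -= 1
    return seq[:end], qual[:end]

seq  = "ACGTACGTAC"
qual = "IIIIIIII#!"            # 'I' = Q40, '#' = Q2, '!' = Q0
print(trim_3prime(seq, qual))  # -> ('ACGTACGT', 'IIIIIIII')
```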

  8. Next generation farms at Fermilab

    International Nuclear Information System (INIS)

    Cudzewicz, R.; Giacchetti, L.; Leininger, M.; Levshina, T.; Pasetes, R.; Schweitzer, M.; Wolbers, S.

    1997-01-01

    The current generation of UNIX farms at Fermilab is rapidly approaching the end of its useful life. The workstations were purchased during the years 1991-1992 and represented the most cost-effective computing available at that time. Acquisition of new workstations is being made to upgrade the UNIX farms for the purpose of providing large amounts of computing for reconstruction of data being collected at the 1996-1997 fixed-target run, as well as to provide simulation computing for CMS, the Auger project, accelerator calculations and other projects that require massive amounts of CPU. 4 refs., 1 fig., 2 tabs

  9. Quantifying population genetic differentiation from next-generation sequencing data

    DEFF Research Database (Denmark)

    Fumagalli, Matteo; Garrett Vieira, Filipe Jorge; Korneliussen, Thorfinn Sand

    2013-01-01

    Over the last few years, new high-throughput DNA sequencing technologies have dramatically increased speed and reduced sequencing costs. However, the use of these sequencing technologies is often challenged by errors and biases associated with the bioinformatical methods used for analyzing the data. Here we present a method for quantifying population genetic differentiation from next-generation sequencing data. In addition, we present a strategy to investigate population structure via Principal Components Analysis. Through extensive simulations, we compare the new method herein proposed to approaches based on genotype calling and demonstrate a marked improvement in estimation accuracy for a wide range of conditions. We apply the method to a large-scale genomic data set of domesticated and wild silkworms sequenced at low coverage. We find that we can infer the fine-scale genetic structure of the sampled individuals.
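Population genetic differentiation of the kind discussed here is typically summarized by Fst. As a point of reference, a standard single-site estimator (Hudson's, as formulated by Bhatia et al.) is sketched below from called allele frequencies; note that the method in this abstract works from genotype likelihoods instead of called genotypes, which this toy version does not attempt.

```python
# Hudson's Fst estimator for one biallelic site, computed from per-population
# allele frequencies. A reference-point sketch only: the abstract's method
# avoids genotype calling entirely, which this toy version does not.

def hudson_fst(p1, n1, p2, n2):
    """p1, p2: alt-allele frequencies; n1, n2: number of sampled alleles."""
    num = (p1 - p2) ** 2 \
        - p1 * (1 - p1) / (n1 - 1) \
        - p2 * (1 - p2) / (n2 - 1)
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

# Near-fixed difference between two populations -> Fst close to 1.
print(round(hudson_fst(0.95, 100, 0.05, 100), 3))
# Identical frequencies -> Fst near zero (slightly negative, sampling correction).
print(round(hudson_fst(0.5, 100, 0.5, 100), 3))
```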

  10. Safety reviews of next-generation light-water reactors

    International Nuclear Information System (INIS)

    Kudrick, J.A.; Wilson, J.N.

    1997-01-01

    The Nuclear Regulatory Commission (NRC) is reviewing three applications for design certification under its new licensing process. The U.S. Advanced Boiling Water Reactor (ABWR) and System 80+ designs have received final design approvals. The AP600 design review is continuing. The goals of design certification are to achieve early resolution of safety issues and to provide a more stable and predictable licensing process. NRC also reviewed the Utility Requirements Document (URD) of the Electric Power Research Institute (EPRI) and determined that its guidance does not conflict with NRC requirements. This review led to the identification and resolution of many generic safety issues. The NRC determined that next-generation reactor designs should achieve a higher level of safety for selected technical and severe accident issues. Accordingly, NRC developed new review standards for these designs based on (1) operating experience, including the accident at Three Mile Island, Unit 2; (2) the results of probabilistic risk assessments of current and next-generation reactor designs; (3) early efforts on severe accident rulemaking; and (4) research conducted to address previously identified generic safety issues. The additional standards were used during the individual design reviews and the resolutions are documented in the design certification rules. 12 refs

  11. Standardization and quality management in next-generation sequencing.

    Science.gov (United States)

    Endrullat, Christoph; Glökler, Jörn; Franke, Philipp; Frohme, Marcus

    2016-09-01

    DNA sequencing continues to evolve quickly even after more than 30 years. Many new platforms have suddenly appeared and formerly established systems have vanished in almost the same manner. Since the establishment of next-generation sequencing devices, this progress has gained momentum due to the continually growing demand for higher throughput, lower costs and better data quality. As a consequence of this rapid development, standardized procedures and data formats as well as comprehensive quality management considerations are still scarce. Here, we list and summarize current standardization efforts and quality management initiatives from companies, organizations and societies in the form of published studies and ongoing projects. These comprise, on the one hand, quality documentation issues such as technical notes, accreditation checklists and guidelines for the validation of sequencing workflows. On the other hand, general standard proposals and quality metrics are being developed and applied to the sequencing workflow steps, with the main focus on upstream processes. Finally, certain standard developments for downstream pipeline data handling, processing and storage are discussed in brief. These standardization approaches represent a first basis for continuing work toward the prospective implementation of next-generation sequencing in important areas such as clinical diagnostics, where reliable results and fast processing are crucial. Additionally, these efforts will exert a decisive influence on the traceability and reproducibility of sequence data.

  12. Towards Next Generation Internet Management: CNGI-CERNET2 Experiences

    Institute of Scientific and Technical Information of China (English)

    Jia-Hai Yang; Hui Zhang; Jin-Xiang Zhang; Chang-Qing An

    2009-01-01

    Manageability is an important feature of the next generation Internet; management and monitoring of IPv6-based networks are proving a big challenge. While leveraging the current IPv4-based SNMP management scheme for the management needs of IPv6 networks is necessary, it is more urgent to coin a new network management architecture to accommodate the scalability and extensibility requirements of next generation Internet management. The paper proposes a novel network management architecture, IMN (Internet Management Network), which creates an overlay network of management nodes. While each management node can perform management tasks autonomously and independently, it can accomplish more sophisticated management tasks by collaboratively invoking management operations or sharing information provided by other management nodes. P2P-based communication services are introduced in IMN to enable such collaboration. The paper presents a prototype implementation based on Web service technology, as well as some of the key technologies, especially solutions to issues arising from the management practice of CERNET2. Experiences from the deployment and operation of CERNET2 and lessons learned from the management practice are discussed.

  13. Raytheon's next generation compact inline cryocooler architecture

    Science.gov (United States)

    Schaefer, B. R.; Bellis, L.; Ellis, M. J.; Conrad, T.

    2014-01-01

    Since the 1970s, Raytheon has developed, built, tested and integrated high performance cryocoolers. Our versatile designs for single and multi-stage cryocoolers provide reliable operation for temperatures from 10 to 200 Kelvin with power levels ranging from 50 W to nearly 600 W. These advanced cryocoolers incorporate clearance seals, flexure suspensions, hermetic housings and dynamic balancing to provide long service life and reliable operation in all relevant environments. Today, sensors face a multitude of cryocooler integration challenges such as exported disturbance, efficiency, scalability, maturity, and cost. As a result, cryocooler selection is application dependent, oftentimes requiring extensive trade studies to determine the most suitable architecture. To optimally meet the needs of next generation passive IR sensors, the Compact Inline Raytheon Stirling 1-Stage (CI-RS1), Compact Inline Raytheon Single Stage Pulse Tube (CI-RP1) and Compact Inline Raytheon Hybrid Stirling/Pulse Tube 2-Stage (CI-RSP2) cryocoolers are being developed to satisfy this suite of requirements. This lightweight, compact, efficient, low vibration cryocooler combines proven 1-stage (RS1 or RP1) and 2-stage (RSP2) cold-head architectures with an inventive set of warm-end mechanisms into a single cooler module, allowing the moving mechanisms for the compressor and the Stirling displacer to be consolidated onto a common axis and in a common working volume. The CI cryocooler is a significant departure from the current Stirling cryocoolers in which the compressor mechanisms are remote from the Stirling displacer mechanism. Placing all of the mechanisms in a single volume and on a single axis provides benefits in terms of package size (30% reduction), mass (30% reduction), thermodynamic efficiency (>20% improvement) and exported vibration performance (≤25 mN peak in all three orthogonal axes at frequencies from 1 to 500 Hz). The main benefit of axial symmetry is that proven balancing

  14. Raytheon's next generation compact inline cryocooler architecture

    International Nuclear Information System (INIS)

    Schaefer, B. R.; Bellis, L.; Ellis, M. J.; Conrad, T.

    2014-01-01

    Since the 1970s, Raytheon has developed, built, tested and integrated high performance cryocoolers. Our versatile designs for single and multi-stage cryocoolers provide reliable operation for temperatures from 10 to 200 Kelvin with power levels ranging from 50 W to nearly 600 W. These advanced cryocoolers incorporate clearance seals, flexure suspensions, hermetic housings and dynamic balancing to provide long service life and reliable operation in all relevant environments. Today, sensors face a multitude of cryocooler integration challenges such as exported disturbance, efficiency, scalability, maturity, and cost. As a result, cryocooler selection is application dependent, oftentimes requiring extensive trade studies to determine the most suitable architecture. To optimally meet the needs of next generation passive IR sensors, the Compact Inline Raytheon Stirling 1-Stage (CI-RS1), Compact Inline Raytheon Single Stage Pulse Tube (CI-RP1) and Compact Inline Raytheon Hybrid Stirling/Pulse Tube 2-Stage (CI-RSP2) cryocoolers are being developed to satisfy this suite of requirements. This lightweight, compact, efficient, low vibration cryocooler combines proven 1-stage (RS1 or RP1) and 2-stage (RSP2) cold-head architectures with an inventive set of warm-end mechanisms into a single cooler module, allowing the moving mechanisms for the compressor and the Stirling displacer to be consolidated onto a common axis and in a common working volume. The CI cryocooler is a significant departure from the current Stirling cryocoolers in which the compressor mechanisms are remote from the Stirling displacer mechanism. Placing all of the mechanisms in a single volume and on a single axis provides benefits in terms of package size (30% reduction), mass (30% reduction), thermodynamic efficiency (>20% improvement) and exported vibration performance (≤25 mN peak in all three orthogonal axes at frequencies from 1 to 500 Hz). The main benefit of axial symmetry is that proven balancing

  15. Performance analysis of next-generation lunar laser retroreflectors

    Science.gov (United States)

    Ciocci, Emanuele; Martini, Manuele; Contessa, Stefania; Porcelli, Luca; Mastrofini, Marco; Currie, Douglas; Delle Monache, Giovanni; Dell'Agnello, Simone

    2017-09-01

    Starting from 1969, Lunar Laser Ranging (LLR) to the Apollo and Lunokhod Cube Corner Retroreflectors (CCRs) provided several tests of General Relativity (GR). When deployed, the Apollo/Lunokhod CCRs design contributed only a negligible fraction of the ranging error budget. Today the improvement over the years in the laser ground stations makes the lunar libration contribution relevant, so the libration now dominates the error budget, limiting the precision of the experimental tests of gravitational theories. The MoonLIGHT-2 project (Moon Laser Instrumentation for General relativity High-accuracy Tests - Phase 2) is a next-generation LLR payload developed by the Satellite/lunar/GNSS laser ranging/altimetry and Cube/microsat Characterization Facilities Laboratory (SCF_Lab) at the INFN-LNF in collaboration with the University of Maryland. With its unique design consisting of a single large CCR unaffected by librations, MoonLIGHT-2 can significantly reduce the error contribution of the reflectors to the measurement of the lunar geodetic precession and other GR tests compared to Apollo/Lunokhod CCRs. This paper treats only this specific next-generation lunar laser retroreflector (MoonLIGHT-2) and is by no means intended to address other contributions to the global LLR error budget. MoonLIGHT-2 is approved to be launched with the Moon Express 1 (MEX-1) mission and will be deployed on the Moon surface in 2018. To validate/optimize MoonLIGHT-2, the SCF_Lab is carrying out a unique experimental test called the SCF-Test: the concurrent measurement of the optical Far Field Diffraction Pattern (FFDP) and the temperature distribution of the CCR under thermal conditions produced with a close-match solar simulator and simulated space environment. The focus of this paper is to describe the SCF_Lab specialized characterization of the performance of our next-generation LLR payload. While this payload will improve the contribution of the error budget of the space segment (MoonLIGHT-2

  16. Next-Generation Beneficial Microbes: The Case of Akkermansia muciniphila

    Directory of Open Access Journals (Sweden)

    Patrice D. Cani

    2017-09-01

    Full Text Available Metabolic disorders associated with obesity and cardiometabolic disorders are a worldwide epidemic. Among the different environmental factors, the gut microbiota is now considered a key player interfering with energy metabolism and host susceptibility to several non-communicable diseases. Among the next-generation beneficial microbes that have been identified, Akkermansia muciniphila is a promising candidate. Indeed, A. muciniphila is inversely associated with obesity, diabetes, cardiometabolic diseases and low-grade inflammation. Besides the numerous correlations observed, a large body of evidence has demonstrated the causal beneficial impact of this bacterium in a variety of preclinical models. Translating these exciting observations to humans would be the next logical step, and it now appears that several obstacles that would prevent the use of A. muciniphila administration in humans have been overcome. Moreover, several lines of evidence indicate that pasteurization of A. muciniphila not only increases its stability but, more importantly, increases its efficacy. This strongly positions A. muciniphila at the forefront of next-generation candidates for developing novel food or pharma supplements with beneficial effects. Finally, a specific protein present on the outer membrane of A. muciniphila, termed Amuc_1100, could be a strong candidate for future drug development. In conclusion, just as plants and the related body of knowledge known as pharmacognosy have been the source for designing drugs over the last century, we propose that microbes and "microbiomegnosy", or knowledge of our gut microbiome, can become a novel source of future therapies.

  17. Simple and robust generation of ultrafast laser pulse trains using polarization-independent parallel-aligned thin films

    Science.gov (United States)

    Wang, Andong; Jiang, Lan; Li, Xiaowei; Wang, Zhi; Du, Kun; Lu, Yongfeng

    2018-05-01

    Ultrafast laser pulse temporal shaping has been widely applied in important applications such as laser materials processing, coherent control of chemical reactions, and ultrafast imaging. However, temporal pulse shaping has remained an in-lab technique due to high cost, low damage threshold, and polarization dependence. Herein we propose a novel design for an ultrafast laser pulse train generation device, which consists of multiple polarization-independent parallel-aligned thin films. Various pulse trains with controllable temporal profiles can be generated flexibly by multiple reflections within the splitting films. Compared with other pulse train generation techniques, this method offers a compact structure, low cost, high damage threshold, and polarization independence. These advantages give it high potential for broad use in ultrafast applications.
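
    The multi-reflection mechanism can be sketched with a toy etalon-style model: each extra round trip inside a partially reflecting film adds a fixed delay and a fixed attenuation. The film thickness, refractive index, and reflectance below are assumed example values, not the authors' design parameters.

    ```python
    # Toy model of pulse-train generation by multiple reflections
    # inside a single partially reflecting film.
    C = 2.99792458e8  # speed of light in vacuum, m/s

    def pulse_train(thickness_m, refractive_index, reflectance, n_pulses):
        """Delays (s) and relative intensities of successive pulse replicas.

        Each extra round trip through the film adds a delay 2*n*d/c and
        is attenuated by reflectance**2 (one bounce per film surface);
        the k = 0 replica is the directly transmitted pulse.
        """
        round_trip = 2 * refractive_index * thickness_m / C
        t0 = (1 - reflectance) ** 2  # direct transmission
        return [(k * round_trip, t0 * reflectance ** (2 * k))
                for k in range(n_pulses)]

    # Assumed 1 mm film, n = 1.5, 30% surface reflectance, 4 replicas.
    for delay, intensity in pulse_train(1e-3, 1.5, 0.3, 4):
        print(f"{delay * 1e12:7.2f} ps  intensity {intensity:.4f}")
    ```

    Stacking several such films, as the abstract describes, multiplies the available delays and lets the relative replica intensities be tailored.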

  18. Development of Industrial High-Speed Transfer Parallel Robot

    International Nuclear Information System (INIS)

    Kim, Byung In; Kyung, Jin Ho; Do, Hyun Min; Jo, Sang Hyun

    2013-01-01

    Parallel robots used in industry require high stiffness or high speed because of their structural characteristics. Nowadays, the importance of rapid transportation has increased in the distribution industry. In this light, an industrial parallel robot has been developed for high-speed transfer. The developed parallel robot can handle a maximum payload of 3 kg. For a payload of 0.1 kg, the trajectory cycle time is 0.3 s (come and go), and the maximum velocity is 4.5 m/s (pick & place work, Adept cycle). In this motion, its maximum acceleration is very high, reaching approximately 13g. In this paper, the design, analysis, and performance test results of the developed parallel robot system are introduced.
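
    The quoted figures can be sanity-checked with a symmetric accelerate-then-decelerate motion profile. The stroke length below is an assumed value chosen for illustration, not the published Adept-cycle path.

    ```python
    import math

    G = 9.80665  # standard gravity, m/s^2

    def triangular_move(distance_m, accel_g):
        """Time and peak velocity for a symmetric triangular velocity profile.

        The robot accelerates at constant accel_g over half the distance,
        then decelerates over the other half.
        """
        a = accel_g * G
        v_peak = math.sqrt(a * distance_m)
        t_total = 2 * math.sqrt(distance_m / a)
        return t_total, v_peak

    # Assumed 0.16 m one-way stroke at the reported ~13g acceleration.
    t, v = triangular_move(0.16, 13.0)
    print(f"one-way move: {t * 1e3:.0f} ms, peak velocity {v:.2f} m/s")
    ```

    Under this assumption, 13g yields a peak velocity near the reported 4.5 m/s and a one-way move of roughly 71 ms, comfortably consistent with the 0.3 s come-and-go cycle.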

  19. Next Generation Surfactants for Improved Chemical Flooding Technology

    Energy Technology Data Exchange (ETDEWEB)

    Laura Wesson; Prapas Lohateeraparp; Jeffrey Harwell; Bor-Jier Shiau

    2012-05-31

    The principal objective of this project was to characterize and test current and next-generation high-performance surfactants for improved chemical flooding technology, focused on reservoirs in the Pennsylvanian-aged (Penn) sands. To meet this objective, the characteristic curvatures (Cc) of twenty-eight anionic surfactants selected for evaluation in chemical flooding formulations were determined. The Cc values ranged from -6.90 to 2.55, with the majority being negative. Crude oil samples from nine Penn sand reservoirs were analyzed for several properties pertinent to surfactant formulation for EOR application, including equivalent alkane carbon numbers, total acid numbers, and viscosity. The brine samples from these same reservoirs were analyzed for several cations and for total dissolved solids. Surfactant formulations were successfully developed for eight reservoirs by the end of the project period. These formulations were comprised of a tertiary mixture of anionic surfactants. The identities of these surfactants are considered proprietary, but suffice it to say that the surfactants in each mixture had varying chemical structures. In addition to the successful development of surfactant formulations for EOR, two successful single-well field tests were conducted. Many aspects must be considered in the development and implementation of effective surfactant formulations, and taking these into account, four additional studies were conducted during this project. These studies focused on the stability of surfactant formulations in the presence of polymers with an associated examination of polymer rheology, the effect of iron complexes in the brine on surfactant stability, the potential use of sacrificial agents to minimize the loss of surfactant to adsorption, and the effect of electrolytes on surfactant adsorption. In these last four studies
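
    The characteristic curvature Cc and the equivalent alkane carbon number (EACN) mentioned above both enter optimal-formulation screening through the hydrophilic-lipophilic deviation (HLD) framework. The sketch below uses a textbook Salager-type correlation with assumed coefficients and a hypothetical reservoir, not the project's proprietary formulations.

    ```python
    import math

    def hld_ionic(salinity_g_per_dl, eacn, cc, temp_c=25.0, k=0.17, alpha_t=0.01):
        """Hydrophilic-lipophilic deviation for an ionic surfactant.

        HLD = ln(S) - K*EACN - alpha_T*(T - 25) + Cc
        HLD ~ 0 marks the optimal (middle-phase microemulsion) salinity
        targeted in surfactant flooding; k and alpha_t are generic
        textbook defaults, not fitted values.
        """
        return (math.log(salinity_g_per_dl) - k * eacn
                - alpha_t * (temp_c - 25.0) + cc)

    # Hypothetical reservoir: EACN 8 crude, surfactant with Cc = -1.0.
    # Scan salinity for the value that brings HLD near zero.
    for s in (2.0, 5.0, 11.0, 20.0):
        print(f"S = {s:5.1f} g/dL  ->  HLD = {hld_ionic(s, 8.0, -1.0):+.2f}")
    ```

    In this toy scan, a negative-Cc (hydrophilic) surfactant reaches HLD ≈ 0 only at elevated salinity, which is why the measured Cc values and brine analyses described above are both needed to match a formulation to a given reservoir.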

  20. The Dynamic Pricing of Next Generation Consumer Durables

    OpenAIRE

    Barry L. Bayus

    1992-01-01

    Learning curve effects, aspects of consumer demand models (e.g., reservation price distributions, intertemporal utility-maximizing behavior), and competitive activity are reasons that have been offered to explain why prices of new durables decline over time. This paper presents an alternative rationale based on the buying behavior for products with overlapping replacement cycles (i.e., next generation products). A model for consumer sales of a new durable is developed by incorporating the re...