WorldWideScience

Sample records for source testing analytical

  1. Electrospray ion source with reduced analyte electrochemistry

    Science.gov (United States)

    Kertesz, Vilmos [Knoxville, TN]; Van Berkel, Gary [Clinton, TN]

    2011-08-23

    An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and a counter-electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.

  2. Emergency analytical testing: things to consider

    CSIR Research Space (South Africa)

    Pretorius, Cecilia J

    2017-07-01

    Full Text Available Circumstances may dictate that samples from mining operations are analysed for unknown compounds that are potentially harmful to humans. These circumstances may be out of the ordinary, unique or isolated incidents. Emergency analytical testing may...

  3. Analytical challenges in sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

    Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.

  4. Infrared source test

    Energy Technology Data Exchange (ETDEWEB)

    Ott, L.

    1994-11-15

    The purpose of the Infrared Source Test (IRST) is to demonstrate the ability to track a ground target with an infrared sensor from an airplane. The system is being developed within the Advanced Technology Program's Theater Missile Defense/Unmanned Aerial Vehicle (UAV) section. The IRST payload consists of an Amber Radiance 1 infrared camera system, a computer, a gimbaled mirror, and a hard disk. The processor is a custom R3000 CPU board made by Risq Modular Systems, Inc. for LLNL. The board has ethernet, SCSI, parallel I/O, and serial ports, a DMA channel, a video (frame buffer) interface, and eight MBytes of main memory. The real-time operating system VxWorks has been ported to the processor. The application code is written in C on a host SUN 4 UNIX workstation. The IRST is the result of a combined effort by physicists, electrical and mechanical engineers, and computer scientists.

  5. Light Source Estimation with Analytical Path-tracing

    OpenAIRE

    Kasper, Mike; Keivan, Nima; Sibley, Gabe; Heckman, Christoffer

    2017-01-01

    We present a novel algorithm for light source estimation in scenes reconstructed with an RGB-D camera, based on an analytically-derived formulation of path-tracing. Our algorithm traces the reconstructed scene with a custom path-tracer and computes the analytical derivatives of the light transport equation from principles in optics. These derivatives are then used to perform gradient descent, minimizing the photometric error between one or more captured reference images and renders of our curre...

  6. Dispersant testing : a study on analytical test procedures

    International Nuclear Information System (INIS)

    Fingas, M.F.; Fieldhouse, B.; Wang, Z.; Environment Canada, Ottawa, ON

    2004-01-01

    Crude oil is a complex mixture of hydrocarbons, ranging from small, volatile compounds to very large, non-volatile compounds. Analysis of the dispersed oil is crucial. This paper describes Environment Canada's ongoing studies on various traits of dispersants. In particular, it describes small studies related to dispersant effectiveness and methods to improve analytical procedures. The study also re-evaluated the analytical procedure for the Swirling Flask Test, which is now part of the ASTM standard procedure. There are new and improved methods for analyzing oil-in-water using gas chromatography (GC). The methods could be further enhanced by integrating the entire chromatogram rather than just the resolved peaks, which would reduce the maximum variation from 5 per cent to about 2 per cent. For oil-dispersant studies, the surfactant-dispersed oil hydrocarbons consist of two parts: GC-resolved hydrocarbons and GC-unresolved hydrocarbons. The study also examined a second feature of the Swirling Flask Test, comparing the standard vessel's side spout with a new vessel that has a septum port instead; the new vessel decreased the variability as well as the energy and mixing in the vessel. Rather than treating it as a variation of the Swirling Flask Test, it was suggested that the spoutless vessel might be considered a completely separate test. 7 refs., 2 tabs., 4 figs
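
    As a toy illustration of the integration point above (integrating the whole chromatogram captures the GC-unresolved "hump" that peak-by-peak integration misses), a synthetic comparison in Python might look like the following sketch; all signal shapes and numbers are illustrative, not Environment Canada's data.

        # Toy comparison: summing resolved peak areas only vs. integrating the whole
        # chromatogram (resolved peaks plus the unresolved "hump"). Synthetic data only.
        import numpy as np

        t = np.linspace(0, 30, 3000)                                   # retention time, min
        dt = t[1] - t[0]
        hump = 0.2 * np.exp(-((t - 15.0) / 8.0) ** 2)                  # unresolved complex mixture
        peaks = sum(np.exp(-((t - c) / 0.05) ** 2) for c in range(2, 29, 2))  # resolved peaks
        signal = hump + peaks

        peak_area = peaks.sum() * dt          # what peak-by-peak integration recovers
        total_area = signal.sum() * dt        # whole-chromatogram integration
        print(f"resolved peaks only: {peak_area:.2f}   whole trace: {total_area:.2f}")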

  7. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82
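
    A minimal sketch of the kind of pipeline described above, using the Python bindings for Apache Tika and a Solr client; the Solr URL, core name, and field names are hypothetical, and this is not the Search Analytics code itself.

        # Hypothetical sketch: extract text from publications with Apache Tika, index it
        # in Solr, then query for co-occurring measurement/subject terms. Not the actual
        # Search Analytics implementation; URL, core and field names are made up.
        import glob
        import pysolr                    # third-party Solr client
        from tika import parser          # Apache Tika Python bindings

        solr = pysolr.Solr("http://localhost:8983/solr/publications", timeout=10)

        docs = []
        for path in glob.glob("corpus/*.pdf"):
            parsed = parser.from_file(path)                 # {"metadata": ..., "content": ...}
            docs.append({"id": path, "text_t": parsed.get("content") or ""})
        solr.add(docs)

        # Find documents linking a sensor property to a subject of interest.
        hits = solr.search('text_t:"spatial resolution" AND text_t:"invasive species"', rows=5)
        for hit in hits:
            print(hit["id"])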

  8. Sampling analytical tests and destructive tests for quality assurance

    International Nuclear Information System (INIS)

    Saas, A.; Pasquini, S.; Jouan, A.; Angelis, de; Hreen Taywood, H.; Odoj, R.

    1990-01-01

    In the context of the third programme of the European Communities on the monitoring of radioactive waste, various methods have been developed for the performance of sampling and measuring tests on encapsulated waste of low and medium level activity, on the one hand, and of high level activity, on the other hand. The purpose was to provide better quality assurance for products to be stored on an interim or long-term basis. Various sampling means are proposed, such as: - sampling of raw waste before conditioning and determination of the representative aliquot, - sampling of encapsulated waste on process output, - sampling of core specimens subjected to measurement before and after cutting. Equipment suitable for these sampling procedures has been developed and, in the case of core samples, a comparison of techniques has been made. The results are described for the various analytical tests carried out on the samples, such as: - mechanical tests, - radiation resistance, - fire resistance, - lixiviation, - determination of free water, - biodegradation, - water resistance, - chemical and radiochemical analysis. Wherever possible, these tests were compared with non-destructive tests on full-scale packages, and some correlations are given. This work has made it possible to improve and clarify sample optimization, refine sampling techniques and methodologies, and draw up characterization procedures. It also provided an occasion for a first collaboration between the laboratories responsible for these studies, which will be furthered in the scope of the 1990-1994 programme

  9. SPANDOM - source projection analytic nodal discrete ordinates method

    International Nuclear Information System (INIS)

    Kim, Tae Hyeong; Cho, Nam Zin

    1994-01-01

    We describe a new discrete ordinates nodal method for the two-dimensional transport equation. We solve the discrete ordinates equation analytically after the source term is projected and represented in polynomials. The method is applied to two fast reactor benchmark problems and compared with the TWOHEX code. The results indicate that the present method accurately predicts not only the multiplication factor but also the flux distribution

  10. Pentaho Business Analytics: a Business Intelligence Open Source Alternative

    Directory of Open Access Journals (Sweden)

    Diana TÂRNĂVEANU

    2012-10-01

    Full Text Available Most organizations strive to obtain fast, interactive and insightful analytics in order to support the most effective and profitable decisions. They need to incorporate huge amounts of data in order to run analyses based on queries and reports with collaborative capabilities. The large variety of Business Intelligence solutions on the market makes it very difficult for organizations to select one and to evaluate the impact of the selected solution on the organization. The need emerges for a strategy to help organizations choose the best solution for their investment. In the past, the Business Intelligence (BI) market was dominated by closed source and commercial tools, but in recent years open source solutions have developed everywhere. An Open Source Business Intelligence solution can be an option in the face of time-sensitive, sprawling requirements and tightening budgets. This paper presents a practical solution implemented in a suite of Open Source Business Intelligence products called Pentaho Business Analytics, which provides data integration, OLAP services, reporting, dashboarding, data mining and ETL capabilities. The study conducted in this paper suggests that the open source phenomenon could become a valid alternative to commercial platforms within the BI context.

  11. Analytic Approximation to Radiation Fields from Line Source Geometry

    International Nuclear Information System (INIS)

    Michieli, I.

    2000-01-01

    Line sources with slab shields represent a typical source-shield configuration in gamma-ray attenuation problems. Such shielding problems often lead to generalized Secant integrals of a specific form. Besides the numerical integration approach, various expansions and rational approximations with limited applicability are in use for computing the value of such integral functions. Lately, the author developed a rapidly convergent infinite series representation of generalized Secant Integrals involving incomplete Gamma functions. Validity of this representation was established for zero and positive values of the integral parameter a (a ≥ 0). In this paper recurrence relations for generalized Secant Integrals are derived, allowing simple approximate analytic calculation of the integral for arbitrary a values. It is demonstrated how the truncated series representation can be used as the basis for such calculations when possibly negative a values are encountered. (author)
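
    The abstract does not reproduce the integral itself; the classical special case underlying line-source/slab-shield attenuation is the Sievert (secant) integral, and a direct numerical evaluation of that form, shown below as a point of reference, is straightforward (the generalized integrals treated by the author carry additional parameters).

        # Reference sketch: numerical evaluation of the classical Sievert (secant) integral
        #   F(theta, a) = integral from 0 to theta of exp(-a * sec(phi)) dphi,  a >= 0,
        # the special case underlying line-source / slab-shield attenuation problems.
        import numpy as np
        from scipy.integrate import quad

        def sievert(theta: float, a: float) -> float:
            value, _ = quad(lambda phi: np.exp(-a / np.cos(phi)), 0.0, theta)
            return value

        print(sievert(np.pi / 3, 1.0))   # example: 60 degree half-angle, one mean free path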

  12. An Analytical Method of Auxiliary Sources Solution for Plane Wave Scattering by Impedance Cylinders

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal; Breinbjerg, Olav

    2004-01-01

    Analytical Method of Auxiliary Sources solutions for plane wave scattering by circular impedance cylinders are derived by transformation of the exact eigenfunction series solutions employing the Hankel function wave transformation. The analytical Method of Auxiliary Sources solution thus obtained...

  13. Test set of gaseous analytes at Hanford tank farms

    International Nuclear Information System (INIS)

    1997-01-01

    DOE has stored toxic and radioactive waste materials in large underground tanks. When the vapors in the tank headspaces vent to the open atmosphere, a potentially dangerous situation can occur for personnel in the area. An open-path atmospheric pollution monitor is being developed to monitor the open air space above these tanks. In developing this infrared spectral monitor as a safety alert instrument, it is important to know which hazardous gases, called the Analytes of Concern, are most likely to be found in dangerous concentrations. The monitor must also consider other gases that could interfere with measurements of the Analytes of Concern. The total list of gases, called the Test Set Analytes, forms the basis for testing the pollution monitor. Prior measurements in 54 tank headspaces have detected 102 toxic air pollutants (TAPs) and over 1000 other analytes. The hazardous analytes are ranked herein by a Hazardous Atmosphere Rating, which combines their measured concentration, their density relative to air, and the concentration at which they become dangerous. The top 20 toxic air pollutants, as ranked by the Hazardous Atmosphere Rating, and the top 20 other analytes, in terms of measured concentrations, are analyzed for possible inclusion in the Test Set Analytes. Of these 40 gases, 20 are selected. To these 20 gases are added the 6 omnipresent atmospheric gases with the highest concentrations, since their spectra could interfere with measurements of the other spectra. The 26 Test Set Analytes are divided into a Primary Set and a Secondary Set. The Primary Set, gases which must be detectable by the monitor, includes the 6 atmospheric gases and the 6 hazardous gases which have been measured at dangerous concentrations. The Secondary Set gases need not be monitored at this time. The infrared spectra indicate that the pollution monitor will detect all 26 Test Set Analytes by thermal emission and will detect 15 Test Set Analytes by laser absorption

  14. Application of californium-252 neutron sources for analytical chemistry

    International Nuclear Information System (INIS)

    Ishii, Daido

    1976-01-01

    The research on the application of Cf-252 neutron sources to analytical chemistry during the period from 1970 to 1974, partly including 1975, is reviewed. The first part is the introduction. The second part gives a general review of symposia, publications and the like. Attention is directed to ERDA publishing the periodical ''Californium-252 Progress'' and to a study group on Cf-252 utilization held by the Japanese Radioisotope Association in 1974. The third part deals with its application to radioactivation analysis. The automated absolute activation analysis (AAAA) of Savannah River is briefly explained. The joint experiment of the Savannah River operations office with the New Brunswick laboratory is mentioned. A Cf-252 radiation source was used for the non-destructive analysis of elements in river water. Fast neutrons from Cf-252 were used for the quantitative analysis of lead in paints. Many applications to industrial control processes have been reported. Attention is drawn to the application of Cf-252 neutron sources to the field search for natural resources. For example, a logging sonde for searching uranium resources was developed. The fourth part deals with the application of analysis using gamma rays from neutron capture. For example, a borehole sonde and the process control analysis of sulfur in fuel utilized capture gamma rays. The prompt gamma rays from neutron capture may be used for the non-destructive analysis of the environment. (Iwakiri, K.)

  15. Testing of the analytical anisotropic algorithm for photon dose calculation

    International Nuclear Information System (INIS)

    Esch, Ann van; Tillikainen, Laura; Pyykkonen, Jukka; Tenhunen, Mikko; Helminen, Hannu; Siljamaeki, Sami; Alakuijala, Jyrki; Paiusco, Marta; Iori, Mauro; Huyskens, Dominique P.

    2006-01-01

    The analytical anisotropic algorithm (AAA) was implemented in the Eclipse (Varian Medical Systems) treatment planning system to replace the single pencil beam (SPB) algorithm for the calculation of dose distributions for photon beams. AAA was developed to improve the dose calculation accuracy, especially in heterogeneous media. The total dose deposition is calculated as the superposition of the dose deposited by two photon sources (primary and secondary) and by an electron contamination source. The photon dose is calculated as a three-dimensional convolution of Monte-Carlo precalculated scatter kernels, scaled according to the electron density matrix. For the configuration of AAA, an optimization algorithm determines the parameters characterizing the multiple source model by optimizing the agreement between the calculated and measured depth dose curves and profiles for the basic beam data. We have combined the acceptance tests obtained in three different departments for 6, 15, and 18 MV photon beams. The accuracy of AAA was tested for different field sizes (symmetric and asymmetric) for open fields, wedged fields, and static and dynamic multileaf collimation fields. Depth dose behavior at different source-to-phantom distances was investigated. Measurements were performed on homogeneous, water equivalent phantoms, on simple phantoms containing cork inhomogeneities, and on the thorax of an anthropomorphic phantom. Comparisons were made among measurements, AAA, and SPB calculations. The optimization procedure for the configuration of the algorithm was successful in reproducing the basic beam data with an overall accuracy of 3%, 1 mm in the build-up region, and 1%, 1 mm elsewhere. Testing of the algorithm in more clinical setups showed comparable results for depth dose curves, profiles, and monitor units of symmetric open and wedged beams below d_max. The electron contamination model was found to be suboptimal to model the dose around d_max, especially for physical
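
    As a schematic of the superposition idea summarized above (dose obtained by convolving precomputed scatter kernels with the released energy, scaled by the electron-density matrix), a toy calculation might look like the sketch below; the arrays, kernel, and density scaling are purely illustrative and are not the AAA implementation.

        # Toy sketch of a kernel-superposition dose calculation: convolve released energy
        # with a precomputed scatter kernel and apply a crude density scaling.
        # Purely illustrative; not the AAA algorithm as implemented in Eclipse.
        import numpy as np
        from scipy.signal import fftconvolve

        terma = np.zeros((32, 32, 32))
        terma[16, 16, 4:28] = 1.0                       # energy released along a toy beam axis
        density = np.ones_like(terma)
        density[:, :, 12:18] = 0.3                      # low-density (cork/lung-like) slab

        z, y, x = np.indices((9, 9, 9)) - 4             # small isotropic kernel grid
        r = np.sqrt(x**2 + y**2 + z**2) + 0.5
        kernel = np.exp(-r) / r**2
        kernel /= kernel.sum()

        dose = fftconvolve(terma * density, kernel, mode="same") / density
        print(dose.shape, float(dose.max()))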

  16. Upgraded Analytical Model of the Cylinder Test

    Energy Technology Data Exchange (ETDEWEB)

    Souers, P. Clark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Lauderbach, Lisa [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Garza, Raul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Ferranti, Louis [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Vitello, Peter [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center

    2013-03-15

    A Gurney-type equation was previously corrected for wall thinning and angle of tilt, and now we have added shock wave attenuation in the copper wall and air gap energy loss. Extensive calculations were undertaken to calibrate the two new energy loss mechanisms across all explosives. The corrected Gurney equation is recommended for cylinder use over the original 1943 form. The effect of these corrections is to add more energy to the adiabat values from a relative volume of 2 to 7, with low energy explosives having the largest correction. The data was pushed up to a relative volume of about 15 and the JWL parameter ω was obtained directly. Finally, the total detonation energy density was locked to the v = 7 adiabat energy density, so that the Cylinder test gives all necessary values needed to make a JWL.

  18. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  19. Analytic tests and their relation to jet fuel thermal stability

    Energy Technology Data Exchange (ETDEWEB)

    Heneghan, S.P.; Kauffman, R.E. [Univ. of Dayton Research Institute, OH (United States)

    1995-05-01

    The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results covering a range of flow and temperature conditions show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.

  20. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  1. Analytical study on model tests of soil-structure interaction

    International Nuclear Information System (INIS)

    Odajima, M.; Suzuki, S.; Akino, K.

    1987-01-01

    Since nuclear power plant (NPP) structures are stiff, heavy and partly embedded, the behavior of those structures during an earthquake depends on the vibrational characteristics of not only the structure but also the soil. Accordingly, seismic response analyses considering the effects of soil-structure interaction (SSI) are extremely important for the seismic design of NPP structures. Many studies have been conducted on analytical techniques concerning SSI, and various analytical models and approaches have been proposed. Based on these studies, SSI analytical codes (computer programs) for NPP structures have been improved at JINS (Japan Institute of Nuclear Safety), one of the departments of NUPEC (Nuclear Power Engineering Test Center) in Japan. These codes are a soil-spring lumped-mass code (SANLUM), a finite element code (SANSSI), and a thin-layered element code (SANSOL). In proceeding with the improvement of the analytical codes, in-situ large-scale forced vibration SSI tests were performed using models simulating light water reactor buildings, and simulation analyses were performed to verify the codes. This paper presents an analytical study to demonstrate the usefulness of the codes

  2. Irregular analytical errors in diagnostic testing - a novel concept.

    Science.gov (United States)

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control. The quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample can compromise the quantitative or qualitative data derived for any analyte. It is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample an irregular analytical error is defined as an inaccuracy (which is the deviation from a reference measurement procedure result) of a test result that is so high it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or an individual single sample associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC

  3. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing

  4. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    Science.gov (United States)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying Video Analytics in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third party and in-house built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which serves to automatically detect a "Visual Event", and EventBrowser, which serves to display and peruse the "Visual Details" captured at the "Visual Event". To deal with open-architecture as well as closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  5. Large source test stand for H-(D-) ion source

    International Nuclear Information System (INIS)

    Larson, R.; McKenzie-Wilson, R.

    1981-01-01

    The Brookhaven National Laboratory Neutral Beam Group has constructed a large source test stand for testing of the various source modules under development. The first objective of the BNL program is to develop a source module capable of delivering 10 A of H-(D-) at 25 kV operating in the steady state mode with satisfactory gas and power efficiency. The large source test stand contains gas supply and vacuum pumping systems, source cooling systems, magnet power supplies and magnet cooling systems, two arc power supplies rated at 25 kW and 50 kW, a large battery driven power supply and an extractor electrode power supply. Figure 1 is a front view of the vacuum vessel showing the control racks with the 36'' vacuum valves and refrigerated baffles mounted behind. Figure 2 shows the rear view of the vessel with a BNL Mk V magnetron source mounted in the source aperture and also shows the cooled magnet coils. Currently two types of sources are under test: a large magnetron source and a hollow cathode discharge source

  6. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

    Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess the laboratory performances, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3·0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in the laboratory testing results. © 2015 The Society for Applied Microbiology.

  7. An analytic data analysis method for oscillatory slug tests.

    Science.gov (United States)

    Chen, Chia-Shyun

    2006-01-01

    An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extreme points, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
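
    A minimal sketch of the data-handling step this method relies on, picking the occurrence times and displacements of the extrema out of an oscillatory head record, is shown below with synthetic data; the damping and frequency values are illustrative only.

        # Minimal sketch: extract the extrema (times and displacements) from an oscillatory
        # slug-test record; these are the inputs used by the adapted van der Kamp analysis.
        import numpy as np
        from scipy.signal import find_peaks

        t = np.linspace(0.0, 60.0, 2000)                     # time, s (synthetic record)
        w = 2.0 * np.pi / 8.0                                # damped angular frequency (toy)
        h = 0.5 * np.exp(-0.05 * t) * np.cos(w * t)          # head displacement, m

        peaks, _ = find_peaks(h)
        troughs, _ = find_peaks(-h)
        extrema = np.sort(np.concatenate([peaks, troughs]))

        # Log decrement and damped period from two successive same-sign extrema.
        delta = np.log(abs(h[extrema[0]] / h[extrema[2]]))
        period = t[extrema[2]] - t[extrema[0]]
        print(f"log decrement ~ {delta:.3f}, damped period ~ {period:.1f} s")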

  8. Civil Society In Tanzania: An Analytical Review Of Sources Of ...

    African Journals Online (AJOL)

    Sixty percent of civil societies deal with social development programmes. Additionally, results show that most civil societies had disproportionate staffing problems; sixty-six percent depended on international sources of funding, while 46% reported that they secured funds from both local and foreign sources of financing.

  9. Neutron Sources for Standard-Based Testing

    Energy Technology Data Exchange (ETDEWEB)

    Radev, Radoslav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McLean, Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-11-10

    The DHS TC Standards and the consensus ANSI Standards use 252Cf as the neutron source for performance testing because its energy spectrum is similar to that of the 235U and 239Pu fission sources used in nuclear weapons. An emission rate of 20,000 ± 20% neutrons per second is used for testing of the radiological requirements both in the ANSI standards and the TCS. Determination of the accurate neutron emission rate of the test source is important for maintaining consistency and agreement between testing results obtained at different testing facilities. Several characteristics in the manufacture and the decay of the source need to be understood and accounted for in order to make an accurate measurement of the performance of the neutron detection instrument. Additionally, neutron response characteristics of the particular instrument need to be known and taken into account, as well as neutron scattering in the testing environment.

  10. Analytical and semi-analytical formalism for the voltage and the current sources of a superconducting cavity under dynamic detuning

    CERN Document Server

    Doleans, M

    2003-01-01

    Elliptical superconducting radio frequency (SRF) cavities are sensitive to frequency detuning because they have a high Q value in comparison with normal conducting cavities and weak mechanical properties. Radiation pressure on the cavity walls, microphonics, and the tuning system are possible sources of dynamic detuning during pulsed SRF cavity operation. A general analytic relation between the cavity voltage, the dynamic detuning function, and the RF control function is developed. The voltage envelope of a cavity under dynamic detuning and dynamic RF control is expressed analytically through an integral formulation. A semi-analytical scheme is derived to calculate the voltage behavior in any practical case. Examples of voltage envelope behavior for different cases of dynamic detuning and RF control functions are shown. The RF control function for a cavity under dynamic detuning is also investigated and, as an application, various filling schemes are presented.
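
    The abstract does not reproduce the formulation; a common baseband envelope model that leads to precisely this kind of integral (integrating-factor) solution is, in an assumed notation that may differ from the author's:

        dV/dt + [ \omega_{1/2} - i\,\Delta\omega(t) ] V(t) = \omega_{1/2}\, V_g(t)

    whose solution for arbitrary detuning \Delta\omega(t) and drive V_g(t) follows from an integrating factor,

        V(t) = e^{-\int_0^t [\omega_{1/2} - i\Delta\omega(\tau)]\,d\tau} \Big( V(0)
               + \omega_{1/2} \int_0^t V_g(\tau)\, e^{\int_0^\tau [\omega_{1/2} - i\Delta\omega(s)]\,ds}\, d\tau \Big)

    with \omega_{1/2} the cavity half-bandwidth; a semi-analytical scheme then amounts to evaluating these integrals piecewise over the sampled RF control and detuning functions.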

  11. Application of radioactive sources in analytical instruments for planetary exploration

    International Nuclear Information System (INIS)

    Economou, T.E.

    2008-01-01

    Full text: In the past 50 years or so, many types of radioactive sources have been used in space exploration. 238Pu is often used in space missions in Radioactive Heater Units (RHU) and Radioisotope Thermoelectric Generators (RTG) for heat and power generation, respectively. In the 1960s, 242Cm alpha radioactive sources were used for the first time in space applications, on 3 Surveyor spacecraft, to obtain the chemical composition of the lunar surface with an instrument based on Rutherford backscattering of the alpha particles from nuclei in the analyzed sample. 242Cm is an alpha emitter of 6.1 MeV alpha particles. Its half-life, 163 days, is short enough to allow sources to be prepared with the necessary high intensity per unit area (up to 470 mCi and FWHM of about 1.5% in the lunar instruments) that results in a narrow energy distribution, yet long enough that the sources have adequate lifetimes for short duration missions. 242Cm is readily prepared in curie quantities by irradiation of 241Am with neutrons in nuclear reactors, followed by chemical separation of the curium from the americium and fission products. For long duration missions, for example missions to Mars, comets, and asteroids, the isotope 244Cm (T1/2 = 18.1 y, Eα = 5.8 MeV) is a better source because of its much longer half-life. Both of these isotopes are also excellent x-ray excitation sources and have been used for that purpose on several planetary missions. For the light elements the excitation is caused mainly by the alpha particles, while for the heavier elements (> Ca) the excitation is mainly due to the x-rays from the Pu L-lines (Ex = 14-18 keV). 244Cm has been used in several variations of the Alpha Proton X-ray Spectrometer (APXS): PHOBOS 1 and 2, Pathfinder, the Russian Mars-96 mission, the Mars Exploration Rover (MER) and Rosetta. Other sources used in X-ray fluorescence instruments in space are 55Fe and 109Cd (Viking 1 and 2, Beagle 2), and 57Co is used in Moessbauer
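
    A quick check of the half-life argument above, using only the half-lives quoted in the abstract, illustrates why 244Cm is preferred for long missions; the two-year mission duration below is a hypothetical example.

        # Fraction of an alpha source's activity remaining after a mission time,
        # using the half-lives quoted above: Cm-242 (163 d) and Cm-244 (18.1 y).
        def fraction_remaining(t_days: float, half_life_days: float) -> float:
            return 0.5 ** (t_days / half_life_days)

        mission_days = 2 * 365.25                      # hypothetical two-year mission
        print("Cm-242:", round(fraction_remaining(mission_days, 163.0), 3))          # ~0.045
        print("Cm-244:", round(fraction_remaining(mission_days, 18.1 * 365.25), 3))  # ~0.93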

  12. Basic Testing of the DUCHAMP Source Finder

    Science.gov (United States)

    Westmeier, T.; Popping, A.; Serra, P.

    2012-01-01

    This paper presents and discusses the results of basic source finding tests in three dimensions (using spectroscopic data cubes) with DUCHAMP, the standard source finder for the Australian Square Kilometre Array Pathfinder. For this purpose, we generated different sets of unresolved and extended H I model sources. These models were then fed into DUCHAMP, using a range of different parameters and methods provided by the software. The main aim of the tests was to study the performance of DUCHAMP on sources with different parameters and morphologies and assess the accuracy of DUCHAMP's source parametrisation. Overall, we find DUCHAMP to be a powerful source finder capable of reliably detecting sources down to low signal-to-noise ratios and accurately measuring their position and velocity. In the presence of noise in the data, DUCHAMP's measurements of basic source parameters, such as spectral line width and integrated flux, are affected by systematic errors. These errors are a consequence of the effect of noise on the specific algorithms used by DUCHAMP for measuring source parameters in combination with the fact that the software only takes into account pixels above a given flux threshold and hence misses part of the flux. In scientific applications of DUCHAMP these systematic errors would have to be corrected for. Alternatively, DUCHAMP could be used as a source finder only, and source parametrisation could be done in a second step using more sophisticated parametrisation algorithms.

  13. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
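
    The abstract does not name the disproportionality statistic used by the prototype; one widely used score for drug-adverse-event pairs is the proportional reporting ratio (PRR), sketched below from a 2x2 contingency table of report (or citation) counts.

        # Illustrative disproportionality score for a drug-adverse-event pair: the
        # proportional reporting ratio (PRR) from a 2x2 contingency table. The actual
        # scoring used by the FDA prototype is not specified in the abstract.
        import math

        def prr(a: int, b: int, c: int, d: int) -> float:
            """a: drug AND event, b: drug without event, c: event without drug, d: neither."""
            return (a / (a + b)) / (c / (c + d))

        a, b, c, d = 40, 960, 200, 98800     # toy counts, e.g. from MeSH co-occurrence mining
        score = prr(a, b, c, d)
        print(f"PRR = {score:.1f}, ln(PRR) = {math.log(score):.2f}")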

  14. A negative ion source test facility

    Energy Technology Data Exchange (ETDEWEB)

    Melanson, S.; Dehnel, M., E-mail: morgan@d-pace.com; Potkins, D.; Theroux, J.; Hollinger, C.; Martin, J.; Stewart, T.; Jackle, P.; Withington, S. [D-Pace, Inc., P.O. Box 201, Nelson, British Columbia V1L 5P9 (Canada); Philpott, C.; Williams, P.; Brown, S.; Jones, T.; Coad, B. [Buckley Systems Ltd., 6 Bowden Road, Mount Wellington, Auckland 1060 (New Zealand)

    2016-02-15

    Progress is being made in the development of an Ion Source Test Facility (ISTF) by D-Pace Inc. in collaboration with Buckley Systems Ltd. in Auckland, NZ. The first phase of the ISTF is to be commissioned in October 2015 with the second phase being commissioned in March 2016. The facility will primarily be used for the development and the commercialization of ion sources. It will also be used to characterize and further develop various D-Pace Inc. beam diagnostic devices.

  15. Review and evaluation of spark source mass spectrometry as an analytical method

    International Nuclear Information System (INIS)

    Beske, H.E.

    1981-01-01

    The analytical features and most important fields of application of spark source mass spectrometry are described with respect to the trace analysis of high-purity materials and the multielement analysis of technical alloys, geochemical and cosmochemical, biological and radioactive materials, as well as in environmental analysis. Comparisons are made to other analytical methods. The distribution of the method as well as opportunities for contract analysis are indicated and developmental tendencies discussed. (orig.) [de

  16. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, Scott E., E-mail: sedavids@utmb.edu [Radiation Oncology, The University of Texas Medical Branch, Galveston, Texas 77555 (United States); Cui, Jing [Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States); Kry, Stephen; Ibbott, Geoffrey S.; Followill, David S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Deasy, Joseph O. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States); Vicic, Milos [Department of Applied Physics, University of Belgrade, Belgrade 11000 (Serbia); White, R. Allen [Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)

    2016-08-15

    points tested. The model was capable of predicting the depth of the maximum dose within 1 mm. Anthropomorphic phantom benchmark testing of modulated and patterned MLC treatment plans showed agreement to measurement within 3% in target regions using thermoluminescent dosimeters (TLD). Using radiochromic film normalized to TLD, a gamma criterion of 3% of maximum dose and 2 mm DTA was applied, with a pass rate of at least 85% in the high dose, high gradient, and low dose regions. Finally, recalculations of patient plans using DPM showed good agreement relative to a commercial TPS when comparing dose volume histograms and 2D dose distributions. Conclusions: A unique analytical source model coupled to the dose planning method Monte Carlo dose calculation code has been modified and validated using basic beam data and anthropomorphic phantom measurement. While this tool can be applied in general use for a particular linac model, specifically it was developed to provide a singular methodology to independently assess treatment plan dose distributions from those clinical institutions participating in National Cancer Institute trials.

  17. Guidelines for testing sealed radiation sources

    International Nuclear Information System (INIS)

    1989-01-01

    These guidelines are based on article 16(1) of the Ordinance on the Implementation of Atomic Safety and Radiation Protection dated 11 October 1984 (VOAS), in connection with article 36 of the Executory Provision to the VOAS, of 11 October 1984. They apply to the testing of sealed sources to verify their intactness, tightness and non-contamination as well as observance of their fixed service time. The type, scope and intervals of testing as well as the evaluation of test results are determined. These guidelines also apply to the testing of radiation sources forming part of radiation equipment, unless otherwise provided for in the type license or permit. These guidelines enter into force on 1 January 1990

  18. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software decreases. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a JAVA open source facility, available at Sourceforge, that can be run via command line, deployed in a web servlet container or integrated in a developer's environment via MAVEN. The TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schemas and Schematron based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, introduction of how to test via the OGC Testing web site and
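
    As a minimal illustration of the kind of assertion such a test suite automates (not TEAM Engine or its CTL syntax, and against a hypothetical endpoint), a hand-rolled GetCapabilities check might look like:

        # Minimal conformance-style assertions against an OGC WFS endpoint.
        # The URL is hypothetical; this only illustrates the idea behind automated tests.
        import requests
        import xml.etree.ElementTree as ET

        url = "https://example.org/wfs"
        resp = requests.get(url, params={"service": "WFS", "request": "GetCapabilities",
                                         "version": "1.0.0"}, timeout=30)

        assert resp.status_code == 200, "HTTP status must be 200"
        assert "xml" in resp.headers.get("Content-Type", ""), "response must be XML"

        root = ET.fromstring(resp.content)
        assert root.tag.endswith("WFS_Capabilities"), "root element must be WFS_Capabilities"
        print("basic GetCapabilities assertions passed")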

  19. In situ impulse test: an experimental and analytical evaluation of data interpretation procedures

    International Nuclear Information System (INIS)

    1975-08-01

    Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study ''close-in'' wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test is different from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10^-1 and 10^-3 percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed ''close-in'' data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results

  20. Analytic solution of field distribution and demagnetization function of ideal hollow cylindrical field source

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-09-01

    The Halbach type hollow cylindrical permanent magnet array (HCPMA) is a volume-compact and energy-conserving field source, which has attracted intense interest for many practical applications. Here, using the complex variable integration method based on the Biot-Savart Law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in the entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in the HCPMA by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
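
    For orientation, the best-known special case of such analytic solutions is the ideal dipole (k = 2) Halbach cylinder, whose interior field is uniform with magnitude

        B = B_r \ln(R_o / R_i)

    where B_r is the remanence and R_i, R_o are the inner and outer radii; the paper's expressions generalize this to arbitrary multipole order and, importantly, to the field inside the magnet material itself.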

  1. An analytical approach for the Propagation Saw Test

    Science.gov (United States)

    Benedetti, Lorenzo; Fischer, Jan-Thomas; Gaume, Johan

    2016-04-01

    The Propagation Saw Test (PST) [1, 2] is an experimental in-situ technique that has been introduced to assess crack propagation propensity in weak snowpack layers buried below cohesive snow slabs. This test has attracted the interest of a large number of practitioners, being relatively easy to perform and providing useful insights for the evaluation of snow instability. The PST procedure requires isolating a snow column 30 centimeters wide and at least 1 meter long in the downslope direction. Then, once the stratigraphy is known (e.g. from a manual snow profile), a saw is used to cut a weak layer which could fail, potentially leading to the release of a slab avalanche. If the length of the saw cut reaches the so-called critical crack length, the onset of crack propagation occurs. Furthermore, depending on snow properties, the crack in the weak layer can initiate the fracture and detachment of the overlying slab. Statistical studies over a large set of field data confirmed the relevance of the PST, highlighting the positive correlation between test results and the likelihood of avalanche release [3]. Recent works provided key information on the conditions for the onset of crack propagation [4] and on the evolution of slab displacement during the test [5]. In addition, experimental studies [6] and simplified models [7] focused on the qualitative description of snowpack properties leading to different failure types, namely full propagation or fracture arrest (with or without slab fracture). However, besides current numerical studies utilizing discrete element methods [8], only little attention has been devoted to a detailed analytical description of the PST able to give a comprehensive mechanical framework of the sequence of processes involved in the test. Consequently, this work aims to provide a quantitative tool for an exhaustive interpretation of the PST, focusing attention on important parameters that influence the test outcomes. First, starting from a pure

  2. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  3. Analytical modeling of Schottky tunneling source impact ionization MOSFET with reduced breakdown voltage

    Directory of Open Access Journals (Sweden)

    Sangeeta Singh

    2016-03-01

    Full Text Available In this paper, we have investigated a novel Schottky tunneling source impact ionization MOSFET (STS-IMOS) to lower the breakdown voltage of the conventional impact ionization MOS (IMOS) and developed an analytical model for it. In STS-IMOS there is an accumulative effect of both impact ionization and source-induced barrier tunneling. The silicide source offers very low parasitic resistance, the outcome of which is an increment in voltage drop across the intrinsic region for the same applied bias. This reduces the operating voltage and hence the device exhibits a significant reduction in both breakdown and threshold voltage. STS-IMOS shows high immunity against hot electron damage, and as a result the device reliability increases significantly. The analytical model for the impact ionization current (Iii) is developed based on the integration of the ionization integral (M). Similarly, to obtain the Schottky tunneling current (ITun) expression, the Wentzel–Kramers–Brillouin (WKB) approximation is employed. Analytical models for the threshold voltage and subthreshold slope are optimized against Schottky barrier height (ϕB) variation. The expression for the drain current is computed as a function of gate-to-drain bias via an integral expression. It is validated by comparing it with technology computer-aided design (TCAD) simulation results as well. In essence, this analytical framework provides the physical background for a better understanding of STS-IMOS and its performance estimation.
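
    The WKB step mentioned above rests on the standard barrier-transmission expression (generic form, not the paper's final ITun formula):

        T(E) \approx \exp\left( -\frac{2}{\hbar} \int_{x_1}^{x_2} \sqrt{2 m^{*} [U(x) - E]}\, dx \right)

    where m* is the carrier effective mass and x1, x2 are the classical turning points of the Schottky barrier of height ϕB; the tunneling current then follows by weighting T(E) with the carrier supply at the source.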

  4. Improvement of spatial discretization error on the semi-analytic nodal method using the scattered source subtraction method

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Tatsumi, Masahiro

    2006-01-01

    In this paper, the scattered source subtraction (SSS) method is newly proposed to improve the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. In the SSS method, the scattered source is subtracted from both sides of the diffusion or transport equation to make the spatial variation of the source term small. The same neutron balance equation is still used in the SSS method. Since the SSS method just modifies the coefficients of the node coupling equations (those used in evaluating the response of partial currents), its implementation is easy. The validity of the present method is verified through test calculations carried out in PWR multi-assembly configurations. The calculation results show that the SSS method can significantly improve the spatial discretization error. Since the SSS method does not have any negative impact on execution time, convergence behavior or memory requirements, it will be useful for reducing the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. (author)
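
    A schematic one-group illustration of the subtraction idea (the actual multi-group semi-analytic nodal formulation is more involved): writing the total source as the within-group scattered source plus an external source and moving the scattered part to the left-hand side leaves only the smoother external source to be spatially approximated.

        % One-group schematic of scattered-source subtraction (illustrative only)
        \begin{align}
          -D\,\nabla^2\phi(\mathbf{r}) + \Sigma_t\,\phi(\mathbf{r}) &= \Sigma_s\,\phi(\mathbf{r}) + S_{\mathrm{ext}}(\mathbf{r}) \\
          -D\,\nabla^2\phi(\mathbf{r}) + \bigl(\Sigma_t - \Sigma_s\bigr)\,\phi(\mathbf{r}) &= S_{\mathrm{ext}}(\mathbf{r})
        \end{align}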

  5. Reconstruction of sound source signal by analytical passive TR in the environment with airflow

    Science.gov (United States)

    Wei, Long; Li, Min; Yang, Debin; Niu, Feng; Zeng, Wu

    2017-03-01

    In the acoustic design of air vehicles, the time-domain signals of noise sources on the surface of air vehicles can serve as data support to reveal the noise source generation mechanism, analyze acoustic fatigue, and take measures for noise insulation and reduction. To rapidly reconstruct the time-domain sound source signals in an environment with flow, a method combining the analytical passive time reversal mirror (AP-TR) with a shear flow correction is proposed. In this method, the negative influence of flow on sound wave propagation is suppressed by the shear flow correction, yielding corrected acoustic propagation time delays and paths. These corrected time delays and paths, together with the microphone array signals, are then submitted to the AP-TR, reconstructing more accurate sound source signals in the environment with airflow. As an analytical method, AP-TR offers a supplementary way in 3D space to reconstruct the signal of a sound source in an environment with airflow instead of the numerical TR. Experiments on the reconstruction of the sound source signals of a pair of loudspeakers are conducted in an anechoic wind tunnel with subsonic airflow to validate the effectiveness and advantages of the proposed method. Moreover, a theoretical and experimental comparison between AP-TR and time-domain beamforming in reconstructing the sound source signal is also discussed.
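
    The common core of passive time-reversal and delay-and-sum reconstruction is undoing the (possibly flow-corrected) propagation delays before summing the array channels. The sketch below is a generic delay-and-sum illustration with made-up geometry and sampling, not the AP-TR formulation of the record above.

        import numpy as np

        def delay_and_sum(mic_signals, delays, fs):
            """Align each microphone channel by its propagation delay and sum.

            mic_signals : (n_mics, n_samples) array of recorded pressures
            delays      : propagation delay of each mic in seconds (e.g. flow-corrected)
            fs          : sampling rate in Hz
            """
            n_mics, n_samples = mic_signals.shape
            out = np.zeros(n_samples)
            for m in range(n_mics):
                shift = int(round(delays[m] * fs))      # integer-sample approximation
                out[:n_samples - shift] += mic_signals[m, shift:]
            return out / n_mics

        # Toy example: one source, three mics at different distances, speed of sound 343 m/s
        fs, c = 51200, 343.0
        t = np.arange(0, 0.02, 1.0 / fs)
        src = np.sin(2 * np.pi * 2000 * t) * np.exp(-200 * t)
        dists = np.array([1.0, 1.3, 1.7])
        delays = dists / c
        sigs = np.stack([np.interp(t - d, t, src, left=0.0) for d in delays])
        rec = delay_and_sum(sigs, delays, fs)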

  6. Shielding Characteristics Using an Ultrasonic Configurable Fan Artificial Noise Source to Generate Modes - Experimental Measurements and Analytical Predictions

    Science.gov (United States)

    Sutliff, Daniel L.; Walker, Bruce E.

    2014-01-01

    An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the NASA Langley Research Center's 14x22 wind tunnel test of the Hybrid Wing Body (HWB) full 3-D 5.8% scale model. The UCFANS is a 5.8% rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of proposed engines using artificial sources (no flow). The purpose of the program was to provide an estimate of the acoustic shielding benefits possible from mounting an engine on the upper surface of a wing; a flat plate model was used as the shielding surface. Simple analytical simulations were used to preview the radiation patterns - Fresnel knife-edge diffraction was coupled with a dense phased array of point sources to compute shielded and unshielded sound pressure distributions for potential test geometries and excitation modes. Contour plots of sound pressure levels, and integrated power levels, from nacelle alone and shielded configurations for both the experimental measurements and the analytical predictions are presented in this paper.

  7. Analytic solution of magnetic induction distribution of ideal hollow spherical field sources

    Science.gov (United States)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-12-01

    The Halbach-type hollow spherical permanent magnet arrays (HSPMA) are volume-compacted, energy-efficient field sources capable of producing multi-Tesla fields in the cavity of the array, which have attracted intense interest for many practical applications. Here, we present analytical solutions of the magnetic induction of the ideal HSPMA in entire space: outside the array, within the cavity of the array, and in the interior of the magnet. We obtain the solutions using the concept of magnetic charge to solve the Poisson and Laplace equations for the HSPMA. Using these analytical field expressions inside the material, a scalar demagnetization function is defined to approximately indicate the regions of magnetization reversal, partial demagnetization, and inverse magnetic saturation. The analytical field solution provides deeper insight into the nature of the HSPMA and offers guidance in designing optimized ones.
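
    For orientation, the classical result for an ideal Halbach ("magic") sphere is a uniform cavity field that grows with the logarithm of the outer-to-inner radius ratio. The snippet below evaluates that textbook expression; it is not the full multipole solution derived in the record above, and the remanence and radii are placeholder values.

        import math

        def ideal_halbach_sphere_field(b_remanence, r_outer, r_inner):
            """Uniform flux density in the cavity of an ideal Halbach sphere:
            B = (4/3) * Br * ln(r_outer / r_inner)."""
            return (4.0 / 3.0) * b_remanence * math.log(r_outer / r_inner)

        # Placeholder values: NdFeB-like remanence of 1.4 T, radius ratio of 4
        print(ideal_halbach_sphere_field(1.4, r_outer=0.20, r_inner=0.05))  # ~2.6 T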

  8. A Test Beamline on Diamond Light Source

    International Nuclear Information System (INIS)

    Sawhney, K. J. S.; Dolbnya, I. P.; Tiwari, M. K.; Alianelli, L.; Scott, S. M.; Preece, G. M.; Pedersen, U. K.; Walton, R. D.

    2010-01-01

    A test beamline, B16, has been built on the 3 GeV Diamond synchrotron radiation source. The beamline covers a wide photon energy range from 2 to 25 keV. The beamline is highly flexible and versatile in terms of the available beam size (a micron to 100 mm) and the range of energy resolution and photon flux, by virtue of its several operational modes and the different interchangeable instruments available in the experimental hutch. Diverse experimental configurations can be flexibly set up using a five-circle diffractometer, a versatile optics test bench, and a suite of detectors. Several experimental techniques, including reflectivity, diffraction and imaging, are routinely available. Details of the beamline and its measured performance are presented.

  9. Sources of Variation in Creep Testing

    Science.gov (United States)

    Loewenthal, William S.; Ellis, David L.

    2011-01-01

    Creep rupture is an important material characteristic for the design of rocket engines. It was observed during the characterization of GRCop-84 that the complete data set had nearly 4 orders of magnitude of scatter. This scatter likely confounded attempts to determine how creep performance was influenced by manufacturing. It was unclear if this variation was from the testing, the material, or both. Sources of variation were examined by conducting tests on identically processed specimens at the same specified stresses and temperatures. Significant differences existed between the five constant-load creep frames. The specimen temperature was higher than the desired temperature by as much as 43 C. It was also observed that the temperature gradient was up to 44 C. Improved specimen temperature control minimized temperature variations. The data from additional tests demonstrated that the results from all five frames were comparable. The variation decreased to 1/2 order of magnitude from 2 orders of magnitude for the baseline data set. Independent determination of creep rates in a reference load frame closely matched the creep rates determined after the modifications. Testing in helium tended to decrease the sample temperature gradient, but helium was not a significant improvement over vacuum.

  10. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    Science.gov (United States)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for the assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point-source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean
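
    The computational burden described above comes from composing many Gaussian point/line-source evaluations over long meteorological time series. As a minimal illustration of the baseline approach (not the hypergeometric solutions of the record), the sketch below approximates a finite crosswind line source by summing ground-level Gaussian point-source kernels; the dispersion-parameter curves are placeholder power laws.

        import numpy as np

        def gaussian_point_ground(q, u, y, sigma_y, sigma_z):
            """Ground-level concentration from a ground-level point source of strength q
            in a wind of speed u along +x, with reflection at the ground."""
            return (q / (np.pi * sigma_y * sigma_z * u)) * np.exp(-0.5 * (y / sigma_y) ** 2)

        def line_source_by_points(q_per_m, u, x, y, half_width=100.0, n=201):
            """Approximate a finite crosswind line source (along y) by n point sources."""
            ys = np.linspace(-half_width, half_width, n)
            dq = q_per_m * (2 * half_width) / n
            sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)   # placeholder rural-like curves
            sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
            return sum(gaussian_point_ground(dq, u, y - ys_i, sigma_y, sigma_z) for ys_i in ys)

        print(line_source_by_points(q_per_m=1.0, u=5.0, x=500.0, y=0.0))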

  11. Generalized Analytical Treatment Of The Source Strength In The Solution Of The Diffusion Equation

    International Nuclear Information System (INIS)

    Essa, Kh.S.M.; EI-Otaify, M.S.

    2007-01-01

    The source release strength (which is an integral part of the mathematical formulation of the diffusion equation) together with the boundary conditions leads to three different forms of the diffusion equation. The obtained forms have been solved analytically under different boundary conditions by using transformation of axes, cosine transforms, and Fourier transforms. Three equivalent alternative mathematical formulations of the problem have been obtained. The estimated solution for the concentrations from the ground source has been compared with observed concentration data from SF6 tracer experiments in low-wind, unstable conditions at the IIT Delhi sports ground. A good agreement between estimated and observed concentrations is found

  12. Modeling Run Test Validity: A Meta-Analytic Approach

    National Research Council Canada - National Science Library

    Vickers, Ross

    2002-01-01

    .... This study utilized data from 166 samples (N = 5,757) to test the general hypothesis that differences in testing methods could account for the cross-situational variation in validity. Only runs >2 km...

  13. TrajAnalytics: An Open-Source, Web-Based Visual Analytics Software of Urban Trajectory Data

    OpenAIRE

    Zhao, Ye

    2018-01-01

    We developed a software system named TrajAnalytics, which explicitly supports interactive visual analytics of emerging trajectory data. It offers data management capability and supports various data queries by leveraging web-based computing platforms. It allows users to visually conduct queries and make sense of massive trajectory data.

  14. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    Science.gov (United States)

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of the errors in a laboratory. The process considered here is a very common procedure, the oral glucose tolerance test, used to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested knowledge of patient preparation. The quality indicator for appropriateness of the test result (QI-1) had the highest error rate. Although QI-5 for sample collection had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely on a yearly basis to identify errors, take corrective action and facilitate their gradual introduction into routine practice.

  15. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

    Science.gov (United States)

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open-source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using visual molecular dynamics, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students who are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.

  16. 105-KE Basin isolation barrier leak rate test analytical development. Revision 1

    International Nuclear Information System (INIS)

    Irwin, J.J.

    1995-01-01

    This document provides an analytical development in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under postulated accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar and turbulent flow. An analytical development is presented for each flow regime. The basic geometry and nomenclature of the postulated leak paths are denoted

  17. The analytical benchmark solution of spatial diffusion kinetics in source driven systems for homogeneous media

    International Nuclear Information System (INIS)

    Oliveira, F.L. de; Maiorino, J.R.; Santos, R.S.

    2007-01-01

    This paper describes a closed-form solution obtained by the expansion method for the general time-dependent diffusion model with delayed emission for source transients in homogeneous media. In particular, starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. Thus, first an analytical solution of the one-group problem without precursors was obtained, followed by the case with one precursor family. The general case of G groups with R families of precursors, although having a closed-form solution, cannot be solved analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used to solve such a problem. To illustrate the general solution, the multi-group (three-group) time-dependent problem without precursors was also solved and the results intercompared with those obtained by the previous one-group models for a given fast homogeneous medium and different types of source transients. The results are being compared with those obtained by numerical methods. (author)
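
    For intuition on the simplest case mentioned above (one group, no precursors, step source in a subcritical system), the space-independent analogue has an elementary closed form; the sketch below evaluates it with placeholder kinetics parameters and is not the spatial diffusion solution of the record.

        import math

        def one_group_step_source(t, rho, gen_time, s, n0=0.0):
            """Closed-form neutron level for dn/dt = (rho/Lambda)*n + S with constant
            negative reactivity rho (subcritical), constant source S, no precursors."""
            a = rho / gen_time
            return n0 * math.exp(a * t) + (s / -a) * (1.0 - math.exp(a * t))

        # Placeholder parameters: rho = -500 pcm, Lambda = 1e-4 s, S = 1e5 n/s
        for t in (0.0, 0.01, 0.1, 1.0):
            print(t, one_group_step_source(t, rho=-0.005, gen_time=1.0e-4, s=1.0e5))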

  18. A new role of proficiency testing in nuclear analytical work

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    2008-01-01

    The most recent definition of measurement result requires a statement of uncertainty whenever results obtained by nuclear or other quantitative methods of analysis are reported. Proficiency testing (PT) therefore must include the ability of laboratories to present not only unbiased quantity values...

  19. Post-Decontamination Vapor Sampling and Analytical Test Methods

    Science.gov (United States)

    2015-08-12

    Residual contamination remaining after an item is decontaminated could pose an exposure hazard to unprotected personnel. The chemical contaminants can include chemical warfare agents (CWAs) or their simulants, nontraditional agents (NTAs), toxic industrial... The methods address a range of test articles from coupons and panels to small fielded equipment items. Subject terms: vapor hazard; vapor sampling; chemical warfare

  20. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that equips the platform not only with several dynamic visual analytical workflows but also with its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  1. Pulsed voltage electrospray ion source and method for preventing analyte electrolysis

    Science.gov (United States)

    Kertesz, Vilmos [Knoxville, TN; Van Berkel, Gary [Clinton, TN

    2011-12-27

    An electrospray ion source and method of operation includes the application of pulsed voltage to prevent electrolysis of analytes with a low electrochemical potential. The electrospray ion source can include an emitter, a counter electrode, and a power supply. The emitter can include a liquid conduit, a primary working electrode having a liquid contacting surface, and a spray tip, where the liquid conduit and the working electrode are in liquid communication. The counter electrode can be proximate to, but separated from, the spray tip. The power system can supply voltage to the working electrode in the form of a pulse wave, where the pulse wave oscillates between at least an energized voltage and a relaxation voltage. The relaxation duration of the relaxation voltage can range from 1 millisecond to 35 milliseconds. The pulse duration of the energized voltage can be less than 1 millisecond and the frequency of the pulse wave can range from 30 to 800 Hz.

  2. The revelation effect: A meta-analytic test of hypotheses.

    Science.gov (United States)

    Aßfalg, André; Bernstein, Daniel M; Hockley, William

    2017-12-01

    Judgments can depend on the activity directly preceding them. An example is the revelation effect whereby participants are more likely to claim that a stimulus is familiar after a preceding task, such as solving an anagram, than without a preceding task. We test conflicting predictions of four revelation-effect hypotheses in a meta-analysis of 26 years of revelation-effect research. The hypotheses' predictions refer to three subject areas: (1) the basis of judgments that are subject to the revelation effect (recollection vs. familiarity vs. fluency), (2) the degree of similarity between the task and test item, and (3) the difficulty of the preceding task. We use a hierarchical multivariate meta-analysis to account for dependent effect sizes and variance in experimental procedures. We test the revelation-effect hypotheses with a model selection procedure, where each model corresponds to a prediction of a revelation-effect hypothesis. We further quantify the amount of evidence for one model compared to another with Bayes factors. The results of this analysis suggest that none of the extant revelation-effect hypotheses can fully account for the data. The general vagueness of revelation-effect hypotheses and the scarcity of data were the major limiting factors in our analyses, emphasizing the need for formalized theories and further research into the puzzling revelation effect.

  3. Source-to-incident flux relation for a tokamak fusion test reactor blanket module

    International Nuclear Information System (INIS)

    Imel, G.R.

    1982-01-01

    The source-to-incident 14-MeV flux relation for a blanket module on the Tokamak Fusion Test Reactor is derived. It is shown that assumptions can be made that allow an analytical expression to be derived, using point kernel methods. In addition, the effect of a nonuniform source distribution is derived, again by relatively simple point kernel methods. It is thought that the methodology developed is valid for a variety of blanket modules on tokamak reactors
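
    The point-kernel building block referred to above is the attenuated uncollided flux from an isotropic point source; integrating it over a distributed (possibly non-uniform) source gives the incident flux at the module. A generic numerical version of that kernel integration is sketched below; the ring geometry, source strengths, and cross-section value are placeholders, not the TFTR configuration.

        import numpy as np

        def point_kernel_flux(src_points, src_strengths, detector, sigma_t=0.0):
            """Uncollided flux at `detector` from discretized isotropic sources:
            phi = sum_i S_i * exp(-sigma_t * r_i) / (4 * pi * r_i**2)."""
            r = np.linalg.norm(src_points - detector, axis=1)
            return np.sum(src_strengths * np.exp(-sigma_t * r) / (4.0 * np.pi * r ** 2))

        # Toy non-uniform ring source in vacuum (sigma_t = 0), detector on the midplane
        theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
        ring = np.column_stack([2.5 * np.cos(theta), 2.5 * np.sin(theta), np.zeros_like(theta)])
        strengths = 1.0e18 * (1.0 + 0.3 * np.cos(theta)) / theta.size   # peaked source distribution
        print(point_kernel_flux(ring, strengths, detector=np.array([3.0, 0.0, 0.0])))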

  4. 10 CFR 39.35 - Leak testing of sealed sources.

    Science.gov (United States)

    2010-01-01

    Section 39.35 Leak testing of sealed sources. (a) Testing and recordkeeping requirements. Each licensee who uses... record of leak test results in units of microcuries and retain the record for inspection by the...

  5. Potential sources of analytical bias and error in selected trace element data-quality analyses

    Science.gov (United States)

    Paul, Angela P.; Garbarino, John R.; Olsen, Lisa D.; Rosen, Michael R.; Mebane, Christopher A.; Struzeski, Tedmund M.

    2016-09-28

    Potential sources of analytical bias and error associated with laboratory analyses for selected trace elements, where concentrations were greater in filtered samples than in paired unfiltered samples, were evaluated by U.S. Geological Survey (USGS) Water Quality Specialists in collaboration with the USGS National Water Quality Laboratory (NWQL) and the Branch of Quality Systems (BQS). Causes for trace-element concentrations in filtered samples to exceed those in associated unfiltered samples have been attributed to variability in analytical measurements, analytical bias, sample contamination either in the field or laboratory, and (or) sample-matrix chemistry. These issues have not only been attributed to data generated by the USGS NWQL but have also been observed in data generated by other laboratories. This study continues the evaluation of potential analytical bias and error resulting from matrix chemistry and instrument variability by evaluating the performance of seven selected trace elements in paired filtered and unfiltered surface-water and groundwater samples collected from 23 sampling sites of varying chemistries in six States, matrix spike recoveries, and standard reference materials. Filtered and unfiltered samples have been routinely analyzed on separate inductively coupled plasma-mass spectrometry instruments. Unfiltered samples are treated with hydrochloric acid (HCl) during an in-bottle digestion procedure; filtered samples are not routinely treated with HCl as part of the laboratory analytical procedure. To evaluate the influence of HCl on different sample matrices, an aliquot of the filtered samples was treated with HCl. The addition of HCl did little to differentiate the analytical results between filtered samples treated with HCl and those left untreated; however, there was a small, but noticeable, decrease in the number of instances where a particular trace-element concentration was greater in a filtered sample than in the associated

  6. Rate-control algorithms testing by using video source model

    DEFF Research Database (Denmark)

    Belyaev, Evgeny; Turlikov, Andrey; Ukhanova, Anna

    2008-01-01

    In this paper, a method for testing rate-control algorithms by means of a video source model is suggested. The proposed method allows algorithm testing to be significantly improved over a large test set.

  7. Waste minimization methods for treating analytical instrumentation effluents at the source

    International Nuclear Information System (INIS)

    Ritter, J.A.; Barnhart, C.

    1995-01-01

    The primary goal of this project was to reduce the amount of hazardous waste being generated by the Savannah River Site Defense Waste Processing Technology-Analytical Laboratory (DWPT-AL). A detailed characterization study was performed on 12 of the liquid effluent streams generated within the DWPT-AL. Two of the streams were not hazardous, and are now being collected separately from the 10 hazardous streams. A secondary goal of the project was to develop in-line methods, using primarily adsorption/ion exchange columns, to treat liquid effluent as it emerges from the analytical instrument as a slow, dripping flow. Samples from the 10 hazardous streams were treated by adsorption in an experimental apparatus that resembled an in-line or at-source column apparatus. The layered adsorbent bed contained activated carbon and ion exchange resin. The column technique did not work on the first three samples of the spectroscopy waste stream, but worked well on the next three samples, which were treated in a different column. It was determined that an unusual form of mercury was present in the first three samples. Similarly, two samples of a combined waste stream were rendered nonhazardous, but the last two samples contained acetonitrile that prevented analysis. The characteristics of these streams changed from the initial characterization study; therefore, continual, in-depth stream characterization is the key to making this project successful

  8. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    Science.gov (United States)

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making through their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. The obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations was studied. The analytical performance was acceptable for all parameters tested. The method comparison demonstrated good agreement with the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on co-oximetry results was found. On the contrary, significant interference of benzalkonium and hemolysis on electrolyte measurements was found, for which the user is notified by an interferent-specific flag. Identification of sample errors from pre-analytical sources, such as interferences, and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, make the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  9. fMRI activation patterns in an analytic reasoning task: consistency with EEG source localization

    Science.gov (United States)

    Li, Bian; Vasanta, Kalyana C.; O'Boyle, Michael; Baker, Mary C.; Nutter, Brian; Mitra, Sunanda

    2010-03-01

    Functional magnetic resonance imaging (fMRI) is used to model brain activation patterns associated with various perceptual and cognitive processes as reflected by the hemodynamic (BOLD) response. While many sensory and motor tasks are associated with relatively simple activation patterns in localized regions, higher-order cognitive tasks may produce activity in many different brain areas involving complex neural circuitry. We applied a recently proposed probabilistic independent component analysis technique (PICA) to determine the true dimensionality of the fMRI data and used EEG localization to identify the common activated patterns (mapped as Brodmann areas) associated with a complex cognitive task like analytic reasoning. Our preliminary study suggests that a hybrid GLM/PICA analysis may reveal additional regions of activation (beyond simple GLM) that are consistent with electroencephalography (EEG) source localization patterns.

  10. Ibmdbpy-spatial: An Open-source implementation of in-database geospatial analytics in Python

    Science.gov (United States)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends and spot anomalies. Although there are a number of open-source spatial analysis libraries like geopandas and shapely available today, most of them have been restricted to manipulation and analysis of geometric objects with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces the network overload, as the complete data need not be replicated into the user's local system altogether and only a subset of the entire dataset can be fetched into memory in a single instance. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features, such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4. The basic architecture of the package consists of three main components - 1) a connection to the dashDB represented by the instance IdaDataBase, which uses a middleware API namely - pypyodbc or jaydebeapi to establish the database connection via

  11. Kinetic calculations for miniature neutron source reactor using analytical and numerical techniques

    International Nuclear Information System (INIS)

    Ampomah-Amoako, E.

    2008-06-01

    The analytical methods (step change in reactivity and ramp change in reactivity) as well as the numerical methods (fixed-point iteration and Runge-Kutta-Gill) were used to simulate the initial build-up of neutrons in a miniature neutron source reactor with and without the temperature feedback effect. The methods were modified to include the photoneutron concentration. PARET 7.3 was used to simulate the transient behaviour of Ghana Research Reactor-1. The PARET code was capable of simulating the transients for 2.1 mk and 4 mk insertions of reactivity with peak powers of 49.87 kW and 92.34 kW, respectively. The PARET code, however, failed to simulate the 6.71 mk insertion of reactivity which was predicted by Akaho et al. through TEMPFED. (au)
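
    A minimal sketch of the numerical side mentioned above: fourth-order Runge-Kutta integration of source-driven point kinetics with a single delayed-neutron group. The photoneutron contribution and the MNSR-specific constants are omitted; apart from the 2.1 mk step quoted in the record, all numbers are placeholders.

        def point_kinetics_rk4(rho, beta, lam, gen_time, source, n0, c0, dt, steps):
            """RK4 integration of one-delayed-group point kinetics with external source:
                dn/dt = ((rho - beta)/Lambda) * n + lam * c + source
                dc/dt = (beta/Lambda) * n - lam * c
            """
            def deriv(n, c):
                dn = ((rho - beta) / gen_time) * n + lam * c + source
                dc = (beta / gen_time) * n - lam * c
                return dn, dc

            n, c = n0, c0
            for _ in range(steps):
                k1n, k1c = deriv(n, c)
                k2n, k2c = deriv(n + 0.5 * dt * k1n, c + 0.5 * dt * k1c)
                k3n, k3c = deriv(n + 0.5 * dt * k2n, c + 0.5 * dt * k2c)
                k4n, k4c = deriv(n + dt * k3n, c + dt * k3c)
                n += dt * (k1n + 2 * k2n + 2 * k3n + k4n) / 6.0
                c += dt * (k1c + 2 * k2c + 2 * k3c + k4c) / 6.0
            return n, c

        # Placeholder values: +2.1 mk step insertion with a weak external source
        print(point_kinetics_rk4(rho=0.0021, beta=0.0079, lam=0.077, gen_time=8.0e-5,
                                 source=100.0, n0=1.0, c0=0.0079 / (0.077 * 8.0e-5),
                                 dt=1.0e-4, steps=10000))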

  12. Aspects related to the testing of sealed radioactive sources

    International Nuclear Information System (INIS)

    Olteanu, C. M.; Nistor, V.; Valeca, S. C.

    2016-01-01

    Sealed radioactive sources are commonly used in a wide range of applications, such as medical, industrial, agricultural and scientific research. The radioactive material is contained within the sealed source and the device allows the radiation to be used in a controlled way. Accidents can result if control over even a small fraction of those sources is lost. Sealed nuclear sources fall under the category of special form radioactive material; therefore they must meet safety requirements during transport according to regulations. Testing sealed radioactive sources is an important step in the conformity assessment process in order to obtain the design approval. In ICN Pitesti, the Reliability and Testing Laboratory is notified by CNCAN to perform tests on sealed radioactive sources. This paper presents aspects of the verification tests on sealed capsules for Iridium-192 sources in order to demonstrate compliance with the regulatory requirements and the quality assurance program of the tests performed. (authors)

  13. MAGNETO-FRICTIONAL MODELING OF CORONAL NONLINEAR FORCE-FREE FIELDS. I. TESTING WITH ANALYTIC SOLUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keppens, R. [School of Astronomy and Space Science, Nanjing University, Nanjing 210023 (China); Xia, C. [Centre for mathematical Plasma-Astrophysics, Department of Mathematics, KU Leuven, B-3001 Leuven (Belgium); Valori, G., E-mail: guoyang@nju.edu.cn [University College London, Mullard Space Science Laboratory, Holmbury St. Mary, Dorking, Surrey RH5 6NT (United Kingdom)

    2016-09-10

    We report our implementation of the magneto-frictional method in the Message Passing Interface Adaptive Mesh Refinement Versatile Advection Code (MPI-AMRVAC). The method aims at applications where local adaptive mesh refinement (AMR) is essential to make follow-up dynamical modeling affordable. We quantify its performance in both domain-decomposed uniform grids and block-adaptive AMR computations, using all frequently employed force-free, divergence-free, and other vector comparison metrics. As test cases, we revisit the semi-analytic solution of Low and Lou in both Cartesian and spherical geometries, along with the topologically challenging Titov–Démoulin model. We compare different combinations of spatial and temporal discretizations, and find that the fourth-order central difference with a local Lax–Friedrichs dissipation term in a single-step marching scheme is an optimal combination. The initial condition is provided by the potential field, which is the potential field source surface model in spherical geometry. Various boundary conditions are adopted, ranging from fully prescribed cases where all boundaries are assigned with the semi-analytic models, to solar-like cases where only the magnetic field at the bottom is known. Our results demonstrate that all the metrics compare favorably to previous works in both Cartesian and spherical coordinates. Cases with several AMR levels perform in accordance with their effective resolutions. The magneto-frictional method in MPI-AMRVAC allows us to model a region of interest with high spatial resolution and large field of view simultaneously, as required by observation-constrained extrapolations using vector data provided with modern instruments. The applications of the magneto-frictional method to observations are shown in an accompanying paper.
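
    The core of any magneto-frictional relaxation is advancing the magnetic field with a pseudo-velocity proportional to the Lorentz force, so that the field relaxes toward a force-free state. The fragment below is a schematic uniform-grid, periodic-boundary illustration in Python; MPI-AMRVAC itself is a Fortran AMR code, and the frictional coefficient, grid, and initial field here are arbitrary.

        import numpy as np

        def curl(f, dx):
            """Central-difference curl of a vector field f with shape (3, nx, ny, nz),
            periodic boundaries via np.roll."""
            def d(a, axis):
                return (np.roll(a, -1, axis=axis) - np.roll(a, 1, axis=axis)) / (2.0 * dx)
            fx, fy, fz = f
            return np.array([d(fz, 1) - d(fy, 2),
                             d(fx, 2) - d(fz, 0),
                             d(fy, 0) - d(fx, 1)])

        def magnetofrictional_step(b, dx, dt, nu=1.0, eps=1e-12):
            """One magneto-frictional update: v = nu * (J x B) / B^2, dB/dt = curl(v x B)."""
            j = curl(b, dx)                          # current density (mu0 absorbed into nu)
            jxb = np.cross(j, b, axis=0)
            v = nu * jxb / (np.sum(b * b, axis=0) + eps)
            return b + dt * curl(np.cross(v, b, axis=0), dx)

        # Toy usage on an 8^3 periodic box seeded with a sheared field
        nx = 8
        x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
        X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
        B = np.array([np.sin(Y), np.zeros_like(X), np.ones_like(X)])
        for _ in range(100):
            B = magnetofrictional_step(B, dx=x[1] - x[0], dt=1e-3)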

  14. Analytical investigation of low temperature lift energy conversion systems with renewable energy source

    International Nuclear Information System (INIS)

    Lee, Hoseong; Hwang, Yunho; Radermacher, Reinhard

    2014-01-01

    The efficiency of renewable-energy-powered energy conversion systems is typically low due to the moderate temperature of the heat source. Therefore, improving their energy efficiency is essential. In this study, the performance of an energy conversion system with a renewable energy source was theoretically investigated in order to explore its design aspects. For this purpose, a computer model of an n-stage low temperature lift energy conversion (LTLEC) system was developed. The results showed that under given operating conditions, such as temperatures and mass flow rates of the heat source and heat sink fluids, the unit power generation of the system increased with the number of stages, and it became saturated when the number of stages reached four. Investigation of several possible working fluids for the optimum-stage LTLEC system revealed that ethanol could be an alternative to ammonia. The heat exchanger effectiveness is a critical factor in the system performance. The power generation was increased by 7.83% for the evaporator and 9.94% for the condenser with a 10% increase in heat exchanger effectiveness. When these low temperature source fluids are applied to the LTLEC system, the heat exchanger performance will be very critical and it has to be designed accordingly. - Highlights: •Energy conversion systems with renewable energy sources are analytically investigated. •A model of a multi-stage low temperature lift energy conversion system was developed. •The system performance increases as the stage number is increased. •The unit power generation is increased with increased HX effectiveness. •Ethanol is found to be a good alternative to ammonia

  15. A 2D semi-analytical model for Faraday shield in ICP source

    International Nuclear Information System (INIS)

    Zhang, L.G.; Chen, D.Z.; Li, D.; Liu, K.F.; Li, X.F.; Pan, R.M.; Fan, M.W.

    2016-01-01

    Highlights: • In this paper, a 2D model of an ICP source with a Faraday shield is proposed, considering the complex structure of the Faraday shield. • An analytical solution is found to evaluate the electromagnetic field in the ICP source with the Faraday shield. • The collision-free motion of electrons in the source is investigated and the results show that the electrons oscillate along the radial direction, which brings insight into how the RF power couples to the plasma. - Abstract: The Faraday shield is a thin copper structure with a large number of slits which is usually used in inductively coupled plasma (ICP) sources. RF power is coupled into the plasma through these slits; therefore the Faraday shield plays an important role in the ICP discharge. However, due to the complex structure of the Faraday shield, the resulting electromagnetic field is quite hard to evaluate. In this paper, a 2D model is proposed on the assumption that the Faraday shield is sufficiently long and the RF coil is uniformly distributed, and the copper is considered an ideal conductor. Under these conditions, the magnetic field inside the source is uniform with only an axial component, while the electric field can be decomposed into a vortex field generated by the changing magnetic field together with a gradient field generated by the electric charge accumulated on the Faraday shield surface, which can be easily found by solving Laplace's equation. The motion of the electrons in the electromagnetic field is investigated and the results show that the electrons oscillate along the radial direction when collisions are neglected. This interesting result brings insight into how the RF power couples into the plasma.

  16. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs

  17. Technical challenges for big data in biomedicine and health: data sources, infrastructure, and analytics.

    Science.gov (United States)

    Peek, N; Holmes, J H; Sun, J

    2014-08-15

    To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.

  18. Analytical formulae to calculate the solid angle subtended at an arbitrarily positioned point source by an elliptical radiation detector

    International Nuclear Information System (INIS)

    Abbas, Mahmoud I.; Hammoud, Sami; Ibrahim, Tarek; Sakr, Mohamed

    2015-01-01

    In this article, we introduce a direct analytical mathematical method for calculating the solid angle, Ω, subtended at a point by closed elliptical contours. The solid angle is required in many areas of optical and nuclear physics to estimate the flux of a particle beam of radiation and to determine the activity of a radioactive source. The validity of the derived analytical expressions was successfully confirmed by comparison with some published data (numerical method)
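
    For reference, the elementary special case of the problem above (a point source on the axis of a circular detector of radius R at distance d) has the closed form Ω = 2π(1 - d/√(d² + R²)); off-axis points and elliptical contours require the fuller analytical treatment of the record. The sketch below checks that on-axis expression against a simple Monte Carlo estimate.

        import math
        import random

        def solid_angle_on_axis_disk(radius, distance):
            """Exact solid angle of a circular disk seen from an on-axis point source."""
            return 2.0 * math.pi * (1.0 - distance / math.hypot(distance, radius))

        def solid_angle_mc(radius, distance, n=200_000, seed=1):
            """Monte Carlo estimate: fraction of isotropic directions hitting the disk."""
            rng = random.Random(seed)
            hits = 0
            for _ in range(n):
                cos_t = 1.0 - 2.0 * rng.random()       # isotropic polar angle
                if cos_t <= 0.0:
                    continue                            # emitted away from the disk plane
                r_at_plane = distance * math.sqrt(1.0 - cos_t ** 2) / cos_t
                hits += r_at_plane <= radius
            return 4.0 * math.pi * hits / n

        print(solid_angle_on_axis_disk(2.0, 5.0), solid_angle_mc(2.0, 5.0))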

  19. Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plum Constituents Under Test Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the Phase II STTR project is to develop a prototype multi-analyte sensor system to detect gaseous analytes present in the test stands during...

  20. From Web Analytics to Product Analytics: The Internet of Things as a New Data Source for Enterprise Information Systems

    OpenAIRE

    Klat , Wilhelm; Stummer , Christian; Decker , Reinhold

    2016-01-01

    The internet of things (IoT) paves the way for a new generation of consumer products that collect and exchange data, constituting a new data source for enterprise information systems (EIS). These IoT-ready products use built-in sensors and wireless communication technologies to capture and share data about product usage and the environment in which the products are used. The dissemination of the internet into the p...

  1. Annual banned-substance review: analytical approaches in human sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm

    2017-01-01

    There has been an immense amount of visibility of doping issues on the international stage over the past 12 months, with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods continuously being updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Analytical description of photon beam phase spaces in inverse Compton scattering sources

    Directory of Open Access Journals (Sweden)

    C. Curatolo

    2017-08-01

    Full Text Available We revisit the description of inverse Compton scattering sources and the photon beams generated therein, emphasizing the behavior of their phase space density distributions and how they depend upon those of the two colliding beams of electrons and photons. The main objective is to provide practical formulas for bandwidth, spectral density, and brilliance which are valid in general for any value of the recoil factor, i.e. both in the Thomson regime of negligible electron recoil and in the deep recoil-dominated Compton region, which is of interest for gamma-gamma colliders and Compton sources for the production of multi-GeV photon beams. We adopt a description based on the center-of-mass reference system of the electron-photon collision, in order to underline the role of the electron recoil and how it controls the relativistic Doppler/boost effect in various regimes. Using the center-of-mass reference frame greatly simplifies the treatment, allowing us to derive simple formulas expressed in terms of the rms momenta of the two colliding beams (emittance, energy spread, etc.) and the collimation angle in the laboratory system. Comparisons with Monte Carlo simulations of inverse Compton scattering in various scenarios are presented, showing very good agreement with the analytical formulas: in particular, we find that the bandwidth dependence on the electron beam emittance, of paramount importance in the Thomson regime, as it limits the amount of focusing imparted to the electron beam, becomes much less sensitive in the deep Compton regime, allowing a stronger focusing of the electron beam to enhance luminosity without loss of mono-chromaticity. A similar effect occurs concerning the bandwidth dependence on the frequency spread of the incident photons: in the deep recoil regime the bandwidth turns out to be much less dependent on the frequency spread. The set of formulas derived here are very helpful in designing inverse Compton sources in diverse regimes, giving a
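
    A quick numeric handle on the regimes discussed above is the maximum (head-on, back-scattered) photon energy including electron recoil; the recoil term in the denominator separates the Thomson limit from the deep Compton regime. The helper below evaluates this standard kinematic formula with illustrative beam energies, not the bandwidth or brilliance formulas of the record.

        M_E_C2_EV = 0.511e6   # electron rest energy [eV]

        def max_scattered_energy_ev(electron_energy_ev, laser_photon_ev):
            """Maximum energy of the back-scattered photon in a head-on collision,
            E_max = 4*gamma^2*E_L / (1 + 4*gamma*E_L/(m_e c^2))."""
            gamma = electron_energy_ev / M_E_C2_EV
            recoil = 4.0 * gamma * laser_photon_ev / M_E_C2_EV
            return 4.0 * gamma ** 2 * laser_photon_ev / (1.0 + recoil)

        # Thomson-like regime: 150 MeV electrons on a 1.2 eV laser -> ~0.4 MeV photons
        print(max_scattered_energy_ev(150e6, 1.2))
        # Recoil becomes noticeable: 5 GeV electrons on the same laser
        print(max_scattered_energy_ev(5e9, 1.2))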

  3. Analytical quality control of neutron activation analysis by interlaboratory comparison and proficiency test

    International Nuclear Information System (INIS)

    Kim, S. H.; Moon, J. H.; Jeong, Y. S.

    2002-01-01

    Two air filters (V-50, P-50) artificially loaded with urban dust were provided by the IAEA, and trace elements were determined non-destructively by instrumental neutron activation analysis for an inter-laboratory comparison and proficiency test. The standard reference material Urban Particulate Matter (NIST SRM 1648) of the National Institute of Standards and Technology was used for internal analytical quality control. About 20 elements were determined in each loaded filter sample. Our analytical data were compared with statistical results obtained using neutron activation analysis, particle-induced X-ray emission spectrometry, inductively coupled plasma mass spectrometry, etc., which were collected from 49 laboratories in 40 countries. From the results, statistically re-treated together with the reported values, the Z-scores of our analytical values are within ±2. In addition, the proficiency test was passed, and the accuracy and precision of the analytical values are reliable. Consequently, it was shown that the analytical quality control for the analysis of air dust samples is adequate
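
    The acceptance criterion quoted above is the usual proficiency-testing z-score; a minimal calculation of it is shown below with made-up numbers.

        def z_score(lab_value, assigned_value, sigma_pt):
            """Proficiency-testing z-score; |z| <= 2 is conventionally 'satisfactory'."""
            return (lab_value - assigned_value) / sigma_pt

        # Hypothetical example: lab reports 41.2 mg/kg Pb, assigned value 39.5 +/- 1.5 mg/kg
        z = z_score(41.2, 39.5, 1.5)
        print(z, "satisfactory" if abs(z) <= 2 else "questionable/unsatisfactory")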

  4. Effects of Analytical and Holistic Scoring Patterns on Scorer Reliability in Biology Essay Tests

    Science.gov (United States)

    Ebuoh, Casmir N.

    2018-01-01

    The literature reveals that the patterns/methods of scoring essay tests have been criticized for not being reliable, and this unreliability is likely to be greater in internal examinations than in external examinations. The purpose of this study is to find out the effects of analytical and holistic scoring patterns on scorer reliability in…

  5. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    International Nuclear Information System (INIS)

    Lowry, N.J.

    1998-01-01

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints

  6. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, N.J.

    1998-10-21

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints.

  7. Cost effectiveness of ovarian reserve testing in in vitro fertilization: a Markov decision-analytic model

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Broekmans, Frank J. M.; van Disseldorp, Jeroen; Fauser, Bart C. J. M.; Eijkemans, Marinus J. C.; Hompes, Peter G. A.; van der Veen, Fulco; Mol, Ben Willem J.

    2011-01-01

    To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). A Markov decision model based on data from the literature and original patient data. Decision analytic framework. Computer-simulated cohort of subfertile women aged 20 to 45 years who are eligible for IVF.

  8. Cost effectiveness of ovarian reserve testing in in vitro fertilization : a Markov decision-analytic model

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Broekmans, Frank J. M.; van Disseldorp, Jeroen; Fauser, Bart C. J. M.; Eijkemans, Marinus J. C.; Hompes, Peter G. A.; van der Veen, Fulco; Mol, Ben Willem J.

    2011-01-01

    Objective: To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). Design: A Markov decision model based on data from the literature and original patient data. Setting: Decision analytic framework. Patient(s): Computer-simulated cohort of subfertile women aged

  9. Pre-Analytical Conditions in Non-Invasive Prenatal Testing of Cell-Free Fetal RHD

    DEFF Research Database (Denmark)

    Clausen, Frederik Banch; Jakobsen, Tanja Roien; Rieneck, Klaus

    2013-01-01

    D positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based...

  10. Verification of the analytical fracture assessments methods by a large scale pressure vessel test

    Energy Technology Data Exchange (ETDEWEB)

    Keinanen, H; Oberg, T; Rintamaa, R; Wallin, K

    1988-12-31

    This document deals with the use of fracture mechanics for the assessment of reactor pressure vessels. Tests have been carried out to verify the analytical fracture assessment methods. The analysis is focused on flaw dimensions and the scatter band of material characteristics. Results are provided and compared to experimental ones. (TEC).

  11. Coagulation Tests and Selected Biochemical Analytes in Dairy Cows with Hepatic Lipidosis

    OpenAIRE

    S. Padilla-Arellanes; F. Constantino-Casas; L. Núnez-Ochoa; J. Doubek; C. Vega-Murguia; J. Bouda

    2007-01-01

    The aim of this study was to determine the values and changes in conventional and optimised clotting tests, as well as in selected biochemical analytes, during hepatic lipidosis in postpartum dairy cows. Ten healthy Holstein cows and ten Holstein cows with hepatic lipidosis were selected based upon clinical history, clinical examination, liver biopsy, flotation test and histological analysis of hepatic tissue. Prothrombin time (PT) and partial thromboplastin time (PTT) were determined in non-diluted and dil...

  12. Chemical/Biological Agent Resistance Test (CBART) Test Fixture System Verification and Analytical Monitioring System Development

    Science.gov (United States)

    2011-03-15

    Progress was made towards the proportional integral derivative (PID) tuning. The CBART near real-time (NRT) analytical system was developed, moved, replumbed, and... The use of trade names in this report does not constitute endorsement of any commercial product. Abbreviations: MFC, mass flow controller; MS, mass spectrometer; MSD, mass selective detector; NRT, near real-time; PID, proportional integral derivative.

  13. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
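
    The following minimal Python sketch illustrates the kind of analytic benchmark such one-group cross-section files support: an analog Monte Carlo estimate of k-infinity for an infinite medium, checked against the closed-form value k_inf = nu*sigma_f/sigma_a. The cross-section values are illustrative placeholders, not data from the report.

        import random

        sigma_c, sigma_f, nu = 0.3, 0.7, 2.5          # capture, fission [1/cm], neutrons per fission (made up)
        sigma_a = sigma_c + sigma_f                   # one-group absorption cross section
        k_analytic = nu * sigma_f / sigma_a           # closed-form k-infinity

        def k_estimate(histories=200_000, seed=1):
            """Analog estimate: each source neutron is absorbed; fission occurs with probability sigma_f/sigma_a."""
            rng = random.Random(seed)
            produced = 0.0
            for _ in range(histories):
                if rng.random() < sigma_f / sigma_a:
                    produced += nu                    # expected neutrons released by that fission
            return produced / histories

        print(f"analytic k_inf = {k_analytic:.4f}, Monte Carlo estimate = {k_estimate():.4f}")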

  14. Analytical validation of an ultra low-cost mobile phone microplate reader for infectious disease testing.

    Science.gov (United States)

    Wang, Li-Ju; Naudé, Nicole; Demissie, Misganaw; Crivaro, Anne; Kamoun, Malek; Wang, Ping; Li, Lei

    2018-07-01

    Most mobile health (mHealth) diagnostic devices for laboratory tests only analyze one sample at a time, which is not suitable for large volume serology testing, especially in low-resource settings with shortage of health professionals. In this study, we developed an ultra-low-cost clinically-accurate mobile phone microplate reader (mReader), and clinically validated this optical device for 12 infectious disease tests. The mReader optically reads 96 samples on a microplate at one time. 771 de-identified patient samples were tested for 12 serology assays for bacterial/viral infections. The mReader and the clinical instrument blindly read and analyzed all tests in parallel. The analytical accuracy and the diagnostic performance of the mReader were evaluated across the clinical reportable categories by comparison with clinical laboratorial testing results. The mReader exhibited 97.59-99.90% analytical accuracy. We envision the mReader can benefit underserved areas/populations and low-resource settings in rural clinics/hospitals at a low cost (~$50 USD) with clinical-level analytical quality. It has the potential to improve health access, speed up healthcare delivery, and reduce health disparities and education disparities by providing access to a low-cost spectrophotometer. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Evaluation of methods to leak test sealed radiation sources

    International Nuclear Information System (INIS)

    Arbeau, N.D.; Scott, C.K.

    1987-04-01

    The methods for the leak testing of sealed radiation sources were reviewed. One hundred and thirty-one equipment vendors were surveyed to identify commercially available leak test instruments. The equipment is summarized in tabular form by radiation type and detector type for easy reference. The radiation characteristics of the licensed sources were reviewed and summarized in a format that can be used to select the most suitable detection method. A test kit is proposed for use by inspectors when verifying a licensee's test procedures. The general elements of leak test procedures are discussed

  16. Analytical solution of spatial kinetics of the diffusion model for subcritical homogeneous systems driven by external source

    International Nuclear Information System (INIS)

    Oliveira, Fernando Luiz de

    2008-01-01

    This work describes an analytical solution obtained by the expansion method for the spatial kinetics using the diffusion model with delayed emission for source transients in homogeneous media. In particular, starting from simple models and increasing the complexity, numerical results were obtained for different types of source transients. The one-group problem without precursors was solved analytically first, followed by the case with one family of precursors. The general case of G groups with R families of precursors, although having a closed-form solution, cannot be evaluated purely analytically, since there are no explicit formulae for the eigenvalues, and numerical methods must be used for that part of the problem. To illustrate the general solution, the multi-group (three groups) time-dependent problem without precursors was solved and the numerical results of a finite difference code were compared with the exact results for different transients. (author)
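
    As a minimal, zero-dimensional stand-in for the comparison described above (not the author's spatial expansion method), the Python sketch below solves one-group kinetics without precursors driven by a constant external source in a subcritical system, and checks an explicit Euler integration against the closed-form solution. All parameter values are illustrative.

        import math

        rho, Lam, S, n0 = -0.005, 1e-4, 1e5, 0.0   # reactivity, generation time [s], external source, initial level
        alpha = rho / Lam                          # transient decay constant (negative when subcritical)

        def n_analytic(t):
            """Closed-form solution of dn/dt = alpha*n + S with n(0) = n0."""
            return n0 * math.exp(alpha * t) + (S / alpha) * (math.exp(alpha * t) - 1.0)

        def n_euler(t_end, dt=1e-4):
            """Explicit finite-difference integration of the same equation."""
            n, t = n0, 0.0
            while t < t_end:
                n += dt * (alpha * n + S)
                t += dt
            return n

        t = 0.5
        print(f"analytic n({t}) = {n_analytic(t):.1f}, Euler = {n_euler(t):.1f}, asymptote S*Lam/|rho| = {-S/alpha:.1f}")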

  17. GA-4/GA-9 honeycomb impact limiter tests and analytical model

    International Nuclear Information System (INIS)

    Koploy, M.A.; Taylor, C.S.

    1991-01-01

    General Atomics (GA) has a test program underway to obtain data on the behavior of a honeycomb impact limiter. The program includes testing of small samples to obtain basic information, as well as testing of complete 1/4-scale impact limiters to obtain load-versus-deflection curves for different crush orientations. GA has used the test results to aid in the development of an analytical model to predict the impact limiter loads. The results also helped optimize the design of the impact limiters for the GA-4 and GA-9 Casks

  18. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches
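
    A hedged Python sketch of the general workflow described above (cohort rebalancing followed by reproducible, cross-validated classification); it does not reproduce the PPMI protocol, and the synthetic features stand in for the imaging, genetics, clinical and demographic data.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.utils import resample

        # imbalanced synthetic cohort standing in for the multi-source observations
        X, y = make_classification(n_samples=2000, n_features=30, weights=[0.9, 0.1], random_state=0)

        # (i) rebalance by up-sampling the minority class (illustration only; in practice
        # rebalancing is done inside each training fold to avoid information leakage)
        idx_min, idx_maj = np.where(y == 1)[0], np.where(y == 0)[0]
        idx_up = resample(idx_min, replace=True, n_samples=len(idx_maj), random_state=0)
        X_bal = np.vstack([X[idx_maj], X[idx_up]])
        y_bal = np.concatenate([y[idx_maj], y[idx_up]])

        # (ii)-(iii) cross-validated, reproducible phenotypic prediction
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("balanced-cohort CV accuracy:", cross_val_score(clf, X_bal, y_bal, cv=5).mean())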

  19. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    Full Text Available A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data-large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources-all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model

  20. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  1. Analytical support for the preparation of bundle test QUENCH-10 on air ingress

    International Nuclear Information System (INIS)

    Birchley, J.; Haste, T.; Homann, C.; Hering, W.

    2005-07-01

    Bundle test QUENCH-10 is dedicated to studying air ingress with subsequent water quench during a supposed accident in a spent fuel storage tank. It was proposed by AEKI, Budapest, Hungary and was performed on 21 July 2004 in the QUENCH facility at Forschungszentrum Karlsruhe. Preparation of the test is based on common analytical work at Forschungszentrum Karlsruhe and Paul Scherrer Institut, Villigen, Switzerland, mainly with the severe accident codes SCDAP/RELAP5 and MELCOR, to derive the protocol for the essential test phases, namely pre-oxidation, air ingress and quench phase. For issues that could not be tackled by this computational work, suggestions for the test conduct were made and applied during the test. Improvements of the experimental set-up and the test conduct were suggested and largely applied. In SCDAP/RELAP5, an error was found: for thick oxide scales, the output value of the oxide scale is appreciably underestimated. For the aims of the test preparation, its consequences could be taken into account. Together with the related computational and other analytical support by the engaged institutions the test is co-financed as test QUENCH-L1 by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998 - 2002 (LACOMERA Project, contract No. FIR1-CT2002-40158). (orig.)

  2. Analytical magmatic source modelling from a joint inversion of ground deformation and focal mechanisms data

    Science.gov (United States)

    Cannavo', Flavio; Scandura, Danila; Palano, Mimmo; Musumeci, Carla

    2014-05-01

    Seismicity and ground deformation represent the principal geophysical methods for volcano monitoring and provide important constraints on subsurface magma movements. The occurrence of migrating seismic swarms, as observed at several volcanoes worldwide, is commonly associated with dike intrusions. In addition, on active volcanoes, (de)pressurization and/or intrusion of magmatic bodies stress and deform the surrounding crustal rocks, often causing earthquakes randomly distributed in time within a volume extending about 5-10 km from the wall of the magmatic bodies. Although advances in space-based geodetic and seismic networks have significantly improved volcano monitoring in the last decades on an increasing number of volcanoes worldwide, quantitative models relating deformation and seismicity are not common. The observation of several episodes of volcanic unrest throughout the world, where the movement of magma through the shallow crust was able to produce local rotation of the ambient stress field, introduces an opportunity to improve the estimate of the parameters of a deformation source. In particular, during these episodes of volcanic unrest a radial pattern of P-axes of the focal mechanism solutions, similar to that of ground deformation, has been observed. Therefore, taking into account additional information from focal mechanisms data, we propose a novel approach to volcanic source modeling based on the joint inversion of deformation and focal plane solutions assuming that both observations are due to the same source. The methodology is first verified against a synthetic dataset of surface deformation and strain within the medium, and then applied to real data from an unrest episode that occurred before the May 13th 2008 eruption at Mt. Etna (Italy). The main results clearly indicate that the joint inversion improves the accuracy of the estimated source parameters by about 70%. The statistical tests indicate that the source depth is the parameter with the highest
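
    A minimal forward model of the kind used in such inversions is sketched below in Python, assuming a Mogi (point pressure) source in an elastic half-space; the geometry and volume change are illustrative values, not the parameters estimated for Mt. Etna.

        import numpy as np

        def mogi_uz(r, depth, dV, nu=0.25):
            """Vertical surface displacement [m] of a point volume change dV [m^3] at a given depth [m]."""
            R = np.sqrt(r**2 + depth**2)              # distance from the source to the surface point
            return (1.0 - nu) * dV / np.pi * depth / R**3

        r = np.linspace(0.0, 10e3, 6)                 # radial distances from the source axis [m]
        print(mogi_uz(r, depth=3e3, dV=5e6))          # predicted uplift profile for dV = 5e6 m^3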

  3. The effects of free recall testing on subsequent source memory.

    Science.gov (United States)

    Brewer, Gene A; Marsh, Richard L; Meeks, Joseph T; Clark-Foos, Arlo; Hicks, Jason L

    2010-05-01

    The testing effect is the finding that prior retrieval of information from memory will result in better subsequent memory for that material. One explanation for these effects is that initial free recall testing increases the recollective details for tested information, which then becomes more available during a subsequent test phase. In three experiments we explored this hypothesis using a source-monitoring test phase after the initial free recall tests. We discovered that memory is differentially enhanced for certain recollective details depending on the nature of the free recall task. Thus further research needs to be conducted to specify how different kinds of memorial details are enhanced by free recall testing.

  4. Generalizing Source Geometry of Site Contamination by Simulating and Analyzing Analytical Solution of Three-Dimensional Solute Transport Model

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2014-01-01

    Full Text Available Due to the uneven distribution of pollutants and the blurred edge of the polluted area, there is uncertainty in the source-term shape of the advection-diffusion equation model of contaminant transport. How to generalize such irregular source terms and deal with these uncertainties is critical but rarely studied in previous research. In this study, the fate and transport of contaminant from rectangular and elliptic source geometries were simulated based on a three-dimensional analytical solute transport model, and a source geometry generalization guideline was developed by comparing the migration of contaminant. The result indicated that the variation of source area size had no effect on pollution plume migration once the plume had migrated as far as five times the source side length. The migration of the pollution plume became slower with increasing aquifer thickness. The contaminant concentration decreased as the scale factor increased, and the differences among the various scale factors became smaller with increasing distance from the source field.
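
    The superposition idea behind such generalizations can be illustrated with a hedged, simplified two-dimensional Python sketch (not the paper's 3-D model): instantaneous point-source solutions are summed over a rectangular and an elliptic source area of equal mass, and the resulting plumes are compared several source lengths downstream. All parameter values are made up.

        import numpy as np

        Dx, Dy, u, t, M = 1.0, 0.1, 0.5, 200.0, 1.0    # dispersion [m^2/d], velocity [m/d], time [d], total mass

        def point_source(x, y, x0, y0, m):
            """2-D instantaneous point-source solution advected by the uniform velocity u."""
            return m / (4*np.pi*t*np.sqrt(Dx*Dy)) * np.exp(-((x - x0 - u*t)**2/(4*Dx*t) + (y - y0)**2/(4*Dy*t)))

        def plume(x, y, inside, a=10.0, b=5.0, n=40):
            """Superpose point sources over the cells of a source area defined by inside(X, Y)."""
            xs, ys = np.meshgrid(np.linspace(-a, a, n), np.linspace(-b, b, n))
            mask = inside(xs, ys)
            m_cell = M / mask.sum()                    # spread the total mass evenly over the source cells
            return sum(point_source(x, y, x0, y0, m_cell) for x0, y0 in zip(xs[mask], ys[mask]))

        rect    = lambda X, Y: (np.abs(X) <= 10.0) & (np.abs(Y) <= 5.0)
        ellipse = lambda X, Y: (X / 10.0)**2 + (Y / 5.0)**2 <= 1.0

        x_obs = u * t                                  # observation point about five source side lengths downgradient
        print("rectangular source:", plume(x_obs, 0.0, rect), " elliptic source:", plume(x_obs, 0.0, ellipse))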

  5. A Generic analytical solution for modelling pumping tests in wells intersecting fractures

    Science.gov (United States)

    Dewandel, Benoît; Lanini, Sandra; Lachassagne, Patrick; Maréchal, Jean-Christophe

    2018-04-01

    The behaviour of transient flow due to pumping in fractured rocks has been studied for at least the past 80 years. Analytical solutions were proposed for solving the issue of a well intersecting and pumping from one vertical, horizontal or inclined fracture in homogeneous aquifers, but their domain of application, even if covering various fracture geometries, was restricted to isotropic or anisotropic aquifers, whose potential boundaries had to be parallel or orthogonal to the fracture direction. The issue thus remains unsolved for many field cases. For example, a well intersecting and pumping a fracture in a multilayer or a dual-porosity aquifer, where intersected fractures are not necessarily parallel or orthogonal to aquifer boundaries, where several fractures with various orientations intersect the well, or the effect of pumping not only in fractures, but also in the aquifer through the screened interval of the well. Using a mathematical demonstration, we show that integrating the well-known Theis analytical solution (Theis, 1935) along the fracture axis is identical to the equally well-known analytical solution of Gringarten et al. (1974) for a uniform-flux fracture fully penetrating a homogeneous aquifer. This result implies that any existing line- or point-source solution can be used for implementing one or more discrete fractures that are intersected by the well. Several theoretical examples are presented and discussed: a single vertical fracture in a dual-porosity aquifer or in a multi-layer system (with a partially intersecting fracture); one and two inclined fractures in a leaky-aquifer system with pumping either only from the fracture(s), or also from the aquifer between fracture(s) in the screened interval of the well. For the cases with several pumping sources, analytical solutions of flowrate contribution from each individual source (fractures and well) are presented, and the drawdown behaviour according to the length of the pumped screened interval of
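
    The key identity described above can be illustrated numerically with the hedged Python sketch below: the Theis point-source solution is integrated along the fracture axis to obtain the drawdown of a uniform-flux vertical fracture, and far from the fracture the result approaches the ordinary Theis solution. Parameter values are arbitrary placeholders.

        import numpy as np
        from scipy.integrate import quad
        from scipy.special import exp1                 # Theis well function W(u) = E1(u)

        T, S, Q, xf, t = 1e-3, 1e-4, 1e-3, 50.0, 1e4   # transmissivity, storativity, rate, fracture half-length, time

        def theis(r):
            """Drawdown of a fully penetrating point (line) source pumping at rate Q."""
            return Q / (4*np.pi*T) * exp1(r**2 * S / (4*T*t))

        def fracture_drawdown(x, y):
            """Uniform-flux fracture: line of Theis sources, each of strength Q/(2*xf) per unit length."""
            integrand = lambda xp: exp1(((x - xp)**2 + y**2) * S / (4*T*t))
            val, _ = quad(integrand, -xf, xf)
            return Q / (2*xf) / (4*np.pi*T) * val

        print("near-fracture drawdown at (10 m, 5 m):", fracture_drawdown(10.0, 5.0))
        print("far-field check (500 m):", fracture_drawdown(500.0, 0.0), "vs point source:", theis(500.0))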

  6. CheapStat: an open-source, "do-it-yourself" potentiostat for analytical and educational applications.

    Directory of Open Access Journals (Sweden)

    Aaron A Rowe

    Full Text Available Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license.

  7. A Test Stand for Ion Sources of Ultimate Reliability

    International Nuclear Information System (INIS)

    Enparantza, R.; Uriarte, L.; Romano, P.; Alonso, J.; Ariz, I.; Egiraun, M.; Bermejo, F. J.; Etxebarria, V.; Lucas, J.; Del Rio, J. M.; Letchford, A.; Faircloth, D.; Stockli, M.

    2009-01-01

    The rationale behind the ITUR project is to perform a comparison between different kinds of H- ion sources using the same beam diagnostics setup. In particular, a direct comparison will be made in terms of the emittance characteristics of Penning-type sources such as those currently in use in the injector for the ISIS (UK) Pulsed Neutron Source and those of volumetric type such as that driving the injector for the ORNL Spallation Neutron Source (TN, U.S.A.). The endeavour here pursued is thus to build an Ion Source Test Stand where virtually any type of source can be tested and its features measured and thus compared to the results of other sources under the same gauge. It would be possible then to establish a common ground for effectively comparing different ion sources. The long term objectives are thus to contribute towards building compact sources of minimum emittance, maximum performance, high reliability-availability, high percentage of desired particle production, stability and high brightness. The project consortium is led by the Tekniker-IK4 research centre and the partners are the companies Elytt Energy and Jema Group. The technical viability is guaranteed by the collaboration between the project consortium and several scientific institutions, such as the CSIC (Spain), the University of the Basque Country (Spain), ISIS (STFC-UK), SNS (ORNL-USA) and CEA in Saclay (France).

  8. Using analytic element models to delineate drinking water source protection areas.

    Science.gov (United States)

    Raymond, Heather A; Bondoc, Michael; McGinnis, John; Metropulos, Kathy; Heider, Pat; Reed, Allison; Saines, Steve

    2006-01-01

    Since 1999, Ohio EPA hydrogeologists have used two analytic element models (AEMs), the proprietary software GFLOW and U.S. EPA's WhAEM, to delineate protection areas for 535 public water systems. Both models now use the GFLOW2001 solution engine, integrate well with Geographic Information System (GIS) technology, have a user-friendly graphical interface, are capable of simulating a variety of complex hydrogeologic settings, and do not rely upon a model grid. These features simplify the modeling process and enable AEMs to bridge the gap between existing simplistic delineation methods and more complex numerical models. Ohio EPA hydrogeologists demonstrated that WhAEM2000 and GFLOW2000 were capable of producing capture zones similar to more widely accepted models by applying the AEMs to eight sites that had been previously delineated using other methods. After the Ohio EPA delineated protection areas using AEMs, more simplistic delineation methods used by other states (volumetric equation and arbitrary fixed radii) were applied to the same water systems to compare the differences between various methods. GIS software and two-tailed paired t-tests were used to quantify the differences in protection areas and analyze the data. The results of this analysis demonstrate that AEMs typically produce significantly different protection areas than the most simplistic delineation methods, in terms of total area and shape. If the volumetric equation had been used instead of AEMs, Ohio would not have protected 265 km2 of critical upgradient area and would have overprotected 269 km2 of primarily downgradient land. Since an increasing number of land-use restrictions are being tied to drinking water protection areas, this analysis has broad policy implications.
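
    The statistical comparison mentioned above can be sketched in a few lines of Python; the protection-area values below are made up for illustration and are not Ohio EPA data.

        import numpy as np
        from scipy import stats

        # hypothetical delineated protection areas [km^2] for the same eight water systems
        aem_area        = np.array([2.1, 3.4, 1.8, 5.0, 2.7, 4.2, 3.9, 2.2])   # analytic element model
        volumetric_area = np.array([1.5, 2.9, 2.0, 3.8, 2.1, 3.5, 4.4, 1.7])   # volumetric equation

        t_stat, p_value = stats.ttest_rel(aem_area, volumetric_area)           # two-tailed paired t-test
        diff = np.mean(aem_area - volumetric_area)
        print(f"paired t = {t_stat:.2f}, two-tailed p = {p_value:.3f}, mean difference = {diff:.2f} km^2")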

  9. A two-dimensional transient analytical solution for a ponded ditch drainage system under the influence of source/sink

    Science.gov (United States)

    Sarmah, Ratan; Tiwari, Shubham

    2018-03-01

    An analytical solution is developed for predicting two-dimensional transient seepage into a ditch drainage network receiving water from a non-uniform steady ponding field at the soil surface under the influence of a source/sink in the flow domain. The flow domain is assumed to be saturated, homogeneous and anisotropic in nature and to have finite extents in the horizontal and vertical directions. The drains are assumed to be vertical and to penetrate down to the impervious layer. The water levels in the drains are unequal and invariant with time. The flow field is also assumed to be under the continuous influence of a time- and space-dependent arbitrary source/sink term. The correctness of the proposed model is checked against a numerical code developed for the purpose and also against the existing analytical solution for the simplified case. The study highlights the significance of the source/sink influence on the subsurface flow. With the imposition of the source and sink term in the flow domain, the pathlines and travel times of water particles deviate from their original positions; moreover, the side and top discharges to the drains are also strongly influenced by the source/sink terms. The travel time and pathlines of water particles are also observed to depend on the height of water in the ditches and on the location of the source/sink activation area.

  10. Annual banned-substance review: analytical approaches in human sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm

    2016-01-01

    The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.

  11. 1+-n+ ECR ION SOURCE DEVELOPMENT TEST STAND

    International Nuclear Information System (INIS)

    Donald P. May

    2006-01-01

    A test stand for the investigation of 1+-n+ charge boosting using an ECR ion source is currently being assembled at the Texas A&M Cyclotron Institute. The ultimate goal is to relate the charge-boosting of ions of stable species to possible charge-boosting of ions of radioactive species extracted from the diverse, low-charge-state ion sources developed for radioactive ion beams

  12. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. Also the theoretical background is described for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III hereof give practical and elaborated examples on how to use the Horwitz approach and formulae for estimating the target standard deviation towards acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)
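
    As a small illustration of the Horwitz approach referred to in the appendices, the Python sketch below computes the predicted reproducibility RSD as a function of concentration; taking the target repeatability as two thirds of that value is a common convention and is an assumption here, not a prescription from the guideline.

        import math

        def horwitz_rsd_percent(c):
            """Horwitz predicted reproducibility RSD (%) for a concentration c expressed as a mass fraction."""
            return 2.0 ** (1.0 - 0.5 * math.log10(c))

        for c in (0.5, 0.05, 0.001):                   # e.g. 50 %, 5 % and 0.1 % active ingredient
            rsd_R = horwitz_rsd_percent(c)
            print(f"C = {c:g}: RSD_R ~ {rsd_R:.1f} %, target repeatability ~ {2 * rsd_R / 3:.1f} %")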

  13. Enhanced H- ion source testing capabilities at LANSCE

    International Nuclear Information System (INIS)

    Ingalls, W.B.; Hardy, M.W.; Prichard, B.A.; Sander, O.R.; Stelzer, J.E.; Stevens, R.R.; Leung, K.N.; Williams, M.D.

    1998-01-01

    As part of the on-going beam-current upgrade in the Proton Storage Ring (PSR) at the Los Alamos Neutron Science Center (LANSCE), the current available from the H- injector will be increased from the present 16 to 18 mA to as much as 40 mA. A collaboration between the Ion Beam Technology Group at Lawrence Berkeley National Laboratory (LBNL) and the Ion Sources and Injectors section of LANSCE-2 at Los Alamos National Laboratory (LANL) has been formed to develop and evaluate a new ion source. A new Ion Source Test Stand (ISTS) has been constructed at LANSCE to evaluate candidate ion sources. The ISTS has been constructed to duplicate as closely as possible the beam transport and ancillary systems presently in use in the LANSCE H- injector, while incorporating additional beam diagnostics for source testing. The construction and commissioning of the ISTS will be described, preliminary results for the proof-of-principle ion source developed by the Berkeley group will be presented, and future plans for the extension of the test stand will be outlined

  14. Special aerosol sources for certification and test of aerosol radiometers

    International Nuclear Information System (INIS)

    Belkina, S.K.; Zalmanzon, Y.E.; Kuznetsov, Y.V.; Rizin, A.I.; Fertman, D.E.

    1991-01-01

    The results are presented of the development and practical application of new radionuclide source types (Special Aerosol Sources (SAS)), that meet the international standard recommendations, which are used for certification and test of aerosol radiometers (monitors) using model aerosols of plutonium-239, strontium-yttrium-90 or uranium of natural isotope composition and certified against Union of Soviet Socialist Republics USSR national radioactive aerosol standard or by means of a reference radiometer. The original technology for source production allows the particular features of sampling to be taken into account as well as geometry and conditions of radionuclides radiation registration in the sample for the given type of radiometer. (author)

  15. Special aerosol sources for certification and test of aerosol radiometers

    Energy Technology Data Exchange (ETDEWEB)

    Belkina, S.K.; Zalmanzon, Y.E.; Kuznetsov, Y.V.; Rizin, A.I.; Fertman, D.E. (Union Research Institute of Instrumentation, Moscow (USSR))

    1991-01-01

    The results are presented of the development and practical application of new radionuclide source types (Special Aerosol Sources (SAS)), that meet the international standard recommendations, which are used for certification and test of aerosol radiometers (monitors) using model aerosols of plutonium-239, strontium-yttrium-90 or uranium of natural isotope composition and certified against Union of Soviet Socialist Republics USSR national radioactive aerosol standard or by means of a reference radiometer. The original technology for source production allows the particular features of sampling to be taken into account as well as geometry and conditions of radionuclides radiation registration in the sample for the given type of radiometer. (author).

  16. Transformational Leadership and Organizational Citizenship Behavior: A Meta-Analytic Test of Underlying Mechanisms.

    Science.gov (United States)

    Nohe, Christoph; Hertel, Guido

    2017-01-01

    Based on social exchange theory, we examined and contrasted attitudinal mediators (affective organizational commitment, job satisfaction) and relational mediators (trust in leader, leader-member exchange; LMX) of the positive relationship between transformational leadership and organizational citizenship behavior (OCB). Hypotheses were tested using meta-analytic path models with correlations from published meta-analyses (761 samples with 227,419 individuals overall). When testing single-mediator models, results supported our expectations that each of the mediators explained the relationship between transformational leadership and OCB. When testing a multi-mediator model, LMX was the strongest mediator. When testing a model with a latent attitudinal mechanism and a latent relational mechanism, the relational mechanism was the stronger mediator of the relationship between transformational leadership and OCB. Our findings help to better understand the underlying mechanisms of the relationship between transformational leadership and OCB.
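
    The single-mediator case of such a meta-analytic path model can be illustrated with the short Python sketch below, which recovers standardized path coefficients from a pooled correlation matrix among transformational leadership (X), LMX (M) and OCB (Y); the correlations used are hypothetical, not the values reported in the paper.

        # pooled (meta-analytic) correlations, illustrative only
        r_xm, r_xy, r_my = 0.50, 0.30, 0.40

        a = r_xm                                           # path X -> M
        b = (r_my - r_xy * r_xm) / (1 - r_xm**2)           # path M -> Y, controlling for X
        c_prime = (r_xy - r_my * r_xm) / (1 - r_xm**2)     # direct effect X -> Y, controlling for M
        indirect = a * b                                   # mediated (indirect) effect

        print(f"a = {a:.3f}, b = {b:.3f}, direct c' = {c_prime:.3f}, indirect a*b = {indirect:.3f}")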

  17. Radioactive particles in the environment: sources, particle characterization and analytical techniques

    International Nuclear Information System (INIS)

    2011-08-01

    Over the years, radioactive particles have been released to the environment from nuclear weapons testing and nuclear fuel cycle operations. However, measurements of environmental radioactivity and any associated assessments are often based on the average bulk mass or surface concentration, assuming that radionuclides are homogeneously distributed as simple ionic species. It has generally not been recognised that radioactive particles present in the environment often contain a significant fraction of the bulk sample activity, leading to sample heterogeneity problems and false and/or erratic measurement data. Moreover, the inherent differences in the transport and bioavailability of particle bound radionuclides compared with those existing as molecules or ions have largely been ignored in dose assessments. To date, most studies regarding radionuclide behaviour in the soil-plant system have dealt with soluble forms of radionuclides. When radionuclides are deposited in a less mobile form, or in case of a superposition of different physico-chemical forms, the behaviour of radionuclides becomes much more complicated and extra efforts are required to provide information about environmental status and behaviour of radioactive particles. There are currently no documents or international guides covering this aspect of environmental impact assessments. To fill this gap, between 2001 and 2008 the IAEA performed a Coordinated Research Programme (CRP- G4.10.03) on the 'Radiochemical, Chemical and Physical Characterization of Radioactive Particles in the Environment' with the objective of development, adoption and application of standardized analytical techniques for the comprehensive study of radioactive particles. The CRP was in line with the IAEA project intended to assist the Member States in building capacity for improving environmental assessments and for management of sites contaminated with radioactive particles. This IAEA-TECDOC presents the findings and achievements of

  18. Development of an Automated LIBS Analytical Test System Integrated with Component Control and Spectrum Analysis Capabilities

    International Nuclear Information System (INIS)

    Ding Yu; Tian Di; Chen Feipeng; Chen Pengfei; Qiao Shujun; Yang Guang; Li Chunsheng

    2015-01-01

    The present paper proposes an automated Laser-Induced Breakdown Spectroscopy (LIBS) analytical test system, which consists of a LIBS measurement and control platform based on a modular design concept and LIBS qualitative spectrum analysis software developed in C#. The platform provides flexible interfacing and automated control; it is compatible with component models from different manufacturers and is constructed in modularized form for easy expandability. For peak identification, a more robust method with improved stability has been achieved by applying additional smoothing to the calculated slope before peaks are identified. For element identification, an improved main-lines analysis method, which checks all candidate elements at each spectral peak to avoid omission of elements without strong spectral lines, is applied to the tested LIBS samples; this method also increases the identification speed. In this paper, actual applications have been carried out. According to the tests, the analytical test system is compatible with components of various models made by different manufacturers. It can automatically control the components to acquire experimental data and conduct filtering, peak identification, qualitative analysis, etc. on the spectral data. (paper)
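
    A hedged Python sketch of the peak-search idea described above (extra smoothing applied to the slope before peaks are identified) on a synthetic spectrum; it is not the authors' C# implementation, and the line positions and noise level are invented.

        import numpy as np
        from scipy.signal import savgol_filter

        wavelength = np.linspace(300.0, 800.0, 5000)                       # [nm]
        spectrum = (np.exp(-(wavelength - 400.0)**2 / 2.0) +               # two synthetic emission lines
                    0.6 * np.exp(-(wavelength - 589.0)**2 / 2.0) +
                    0.02 * np.random.default_rng(0).normal(size=wavelength.size))

        smoothed = savgol_filter(spectrum, window_length=21, polyorder=3)  # smooth the raw spectrum
        slope = savgol_filter(np.gradient(smoothed, wavelength),           # ...and additionally smooth the slope
                              window_length=21, polyorder=3)

        # a peak is where the smoothed slope crosses zero from + to - and the line is reasonably intense
        crossings = np.where((slope[:-1] > 0) & (slope[1:] <= 0) & (smoothed[:-1] > 0.1))[0]
        print("candidate line positions [nm]:", np.round(wavelength[crossings], 1))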

  19. An analytical calculation of the peak efficiency for cylindrical sources perpendicular to the detector axis in gamma-ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, Julio C. [Autoridad Regulatoria Nuclear, Laboratorio de Espectrometria Gamma-CTBTO, Av. Del Libertador 8250, C1429BNP Buenos Aires (Argentina)], E-mail: jaguiar@sede.arn.gov.ar

    2008-08-15

    An analytical expression for the so-called full-energy peak efficiency ε(E) of a cylindrical source with its axis perpendicular to an HPGe detector is derived, using point-source measurements. The formula covers different measuring distances, matrix compositions, densities and gamma-ray energies; the only assumption is that the radioactivity is homogeneously distributed within the source. The term for the photon self-attenuation is included in the calculation. Measurements were made using three different sized cylindrical sources of ²⁴¹Am, ⁵⁷Co, ¹³⁷Cs, ⁵⁴Mn, and ⁶⁰Co with corresponding peaks of 59.5, 122, 662, 835, 1173, and 1332 keV, respectively, and one measurement of a radioactive waste drum for 662, 1173, and 1332 keV.
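
    The underlying idea can be sketched numerically in Python: a point-source full-energy peak efficiency is averaged over the volume of a cylindrical source whose axis is perpendicular to the detector axis, with photon self-attenuation along an approximate path inside the source. The point-efficiency function, attenuation coefficient and geometry below are made-up placeholders, not the calibration from the paper.

        import numpy as np

        R, H, d0, mu = 3.0, 5.0, 10.0, 0.01            # source radius/height [cm], axis-detector distance [cm], mu [1/cm]

        def eps_point(distance_cm):
            """Hypothetical point-source peak efficiency, assumed to follow an inverse-square law."""
            return 5e-3 * (10.0 / distance_cm) ** 2

        def eps_cylinder(n=24):
            """Volume-average the point efficiency over the cylinder (axis along z, detector on the +x axis)."""
            rs = np.linspace(0.0, R, n)
            phis = np.linspace(0.0, 2*np.pi, n, endpoint=False)
            zs = np.linspace(-H/2, H/2, n)
            num = den = 0.0
            for r in rs:
                for phi in phis:
                    for z in zs:
                        x, y = r*np.cos(phi), r*np.sin(phi)
                        dist = np.sqrt((d0 - x)**2 + y**2 + z**2)        # point-to-detector distance
                        path = np.sqrt(R**2 - y**2) - x                  # crude self-attenuation chord toward +x
                        num += eps_point(dist) * np.exp(-mu * path) * r  # r is the cylindrical volume weight
                        den += r
            return num / den

        print("volume-averaged peak efficiency:", eps_cylinder())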

  20. Analytical Model of Coil Spring Damper Based on the Loading Test

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook; Park, Woong Ki [INNOSE TECH Co. LTD, Incheon (Korea, Republic of); Furuya, Osamu [Tokyo City University, Tokyo (Japan); Kurabayashi, Hiroshi [Vibro-System, Tokyo (Japan)

    2016-05-15

    One way of solving such problems is to develop an improved damping element for use in base-isolation and response control systems. Reducing the cost of dampers for large-scale structures is another important task for upgrading overall response control capabilities in the near future. This study has examined a response control device using the elastoplastic hysteresis damping of a metal material. The proposed damper is designed as a coil spring element to obtain a uniform stress in the metal and to reduce low-cycle fatigue under large deformation, thereby improving its repetitive strength during earthquake motions. By using SS400 general structural rolled steel, the cost of the damping element is effectively reduced. The analytical model of the elasto-plastic coil spring damper (CSD) is introduced, and its basic mechanical properties are evaluated experimentally and analytically. The paper describes the design method of the elasto-plastic coil spring damper, the basic mechanical properties evaluated from the loading test, and the analytical model of the damper. It was confirmed that the damping force and mechanical characteristics of the elasto-plastic coil spring damper almost satisfy the design specifications.

  1. Testing methods of ECR ion source experimental platform

    International Nuclear Information System (INIS)

    Zhou Changgeng; Hu Yonghong; Li Yan

    2006-12-01

    The principle and structure of the ECR ion source experimental platform are introduced. The methods for testing the parameters of each main component, as well as the comprehensive parameters under given beam current and beam spot diameter conditions, are summarized for the manufacturing process. Some representative test data are given. Remaining issues (e.g., the plasma density in the discharge chamber and the exact hydrogen flow cannot be measured during operation) and possible solutions are also put forward. (authors)

  2. Project of a test stand for cyclotron ion sources

    International Nuclear Information System (INIS)

    Buettig, H.; Dietrich, J.; Merker, H.; Odrich, H.; Preusche, S.; Weissig, J.

    1978-10-01

    In this work the construction of a test stand for testing and optimization of ion sources of the Rossendorf cyclotron U-120 is presented. The design procedure and the construction of the electromagnet, the vacuum chamber with monant, the vacuum system, the power supply and the detecting system are described. The results of calculations of the motion of ions in the magnetic field are presented. (author)

  3. Source effects on surface waves from Nevada Test Site explosions

    International Nuclear Information System (INIS)

    Patton, H.J.; Vergino, E.S.

    1981-11-01

    Surface waves recorded on the Lawrence Livermore National Laboratory (LLNL) digital network have been used to study five underground nuclear explosions detonated in Yucca Valley at the Nevada Test Site. The purpose of this study is to characterize the reduced displacement potential (RDP) at low frequencies and to test secondary source models of underground explosions. The observations consist of Rayleigh- and Love-wave amplitude and phase spectra in the frequency range 0.03 to 0.16 Hz. We have found that Rayleigh-wave spectral amplitudes are modeled well by a RDP with little or no overshoot for explosions detonated in alluvium and tuff. On the basis of comparisons between observed and predicted source phase, the spall closure source proposed by Viecelli does not appear to be a significant source of Rayleigh waves that reach the far field. We tested two other secondary source models, the strike-slip, tectonic strain release model proposed by Toksoez and Kehrer and the dip-slip thrust model of Masse. The surface-wave observations do not provide sufficient information to discriminate between these models at the low F-values (0.2 to 0.8) obtained for these explosions. In the case of the strike-slip model, the principal stress axes inferred from the fault slip angle and strike angle are in good agreement with the regional tectonic stress field for all but one explosion, Nessel. The results of the Nessel explosion suggest a mechanism other than tectonic strain release

  4. Comparison of analytic source models for head scatter factor calculation and planar dose calculation for IMRT

    International Nuclear Information System (INIS)

    Yan Guanghua; Liu, Chihray; Lu Bo; Palta, Jatinder R; Li, Jonathan G

    2008-01-01

    The purpose of this study was to choose an appropriate head scatter source model for the fast and accurate independent planar dose calculation for intensity-modulated radiation therapy (IMRT) with MLC. The performance of three different head scatter source models regarding their ability to model head scatter and facilitate planar dose calculation was evaluated. A three-source model, a two-source model and a single-source model were compared in this study. In the planar dose calculation algorithm, in-air fluence distribution was derived from each of the head scatter source models while considering the combination of Jaw and MLC opening. Fluence perturbations due to tongue-and-groove effect, rounded leaf end and leaf transmission were taken into account explicitly. The dose distribution was calculated by convolving the in-air fluence distribution with an experimentally determined pencil-beam kernel. The results were compared with measurements using a diode array and passing rates with 2%/2 mm and 3%/3 mm criteria were reported. It was found that the two-source model achieved the best agreement on head scatter factor calculation. The three-source model and single-source model underestimated head scatter factors for certain symmetric rectangular fields and asymmetric fields, but similar good agreement could be achieved when monitor back scatter effect was incorporated explicitly. All the three source models resulted in comparable average passing rates (>97%) when the 3%/3 mm criterion was selected. The calculation with the single-source model and two-source model was slightly faster than the three-source model due to their simplicity
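
    The final step of such a planar dose calculation can be illustrated with the hedged Python sketch below: an idealized in-air fluence map is convolved with a pencil-beam kernel, here approximated by a Gaussian rather than the experimentally determined kernel, and without the three-/two-/single-source head scatter or MLC perturbation modelling described in the paper.

        import numpy as np
        from scipy.signal import fftconvolve

        res = 1.0                                                          # grid resolution [mm]
        x = np.arange(-100, 101) * res
        X, Y = np.meshgrid(x, x)

        fluence = ((np.abs(X) <= 40) & (np.abs(Y) <= 60)).astype(float)   # idealized 8 x 12 cm open aperture
        kernel = np.exp(-(X**2 + Y**2) / (2 * 5.0**2))                    # stand-in pencil-beam kernel (sigma = 5 mm)
        kernel /= kernel.sum()                                            # normalize the kernel

        dose = fftconvolve(fluence, kernel, mode="same")                  # planar dose in arbitrary units
        print("central-axis value:", round(dose[100, 100], 3), "field-edge value:", round(dose[100, 140], 3))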

  5. Comparison of analytic source models for head scatter factor calculation and planar dose calculation for IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Yan Guanghua [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Liu, Chihray; Lu Bo; Palta, Jatinder R; Li, Jonathan G [Department of Radiation Oncology, University of Florida, Gainesville, FL 32610-0385 (United States)

    2008-04-21

    The purpose of this study was to choose an appropriate head scatter source model for the fast and accurate independent planar dose calculation for intensity-modulated radiation therapy (IMRT) with MLC. The performance of three different head scatter source models regarding their ability to model head scatter and facilitate planar dose calculation was evaluated. A three-source model, a two-source model and a single-source model were compared in this study. In the planar dose calculation algorithm, in-air fluence distribution was derived from each of the head scatter source models while considering the combination of Jaw and MLC opening. Fluence perturbations due to tongue-and-groove effect, rounded leaf end and leaf transmission were taken into account explicitly. The dose distribution was calculated by convolving the in-air fluence distribution with an experimentally determined pencil-beam kernel. The results were compared with measurements using a diode array and passing rates with 2%/2 mm and 3%/3 mm criteria were reported. It was found that the two-source model achieved the best agreement on head scatter factor calculation. The three-source model and single-source model underestimated head scatter factors for certain symmetric rectangular fields and asymmetric fields, but similar good agreement could be achieved when monitor back scatter effect was incorporated explicitly. All the three source models resulted in comparable average passing rates (>97%) when the 3%/3 mm criterion was selected. The calculation with the single-source model and two-source model was slightly faster than the three-source model due to their simplicity.

  6. A semi-analytical solution for slug tests in an unconfined aquifer considering unsaturated flow

    Science.gov (United States)

    Sun, Hongbing

    2016-01-01

    A semi-analytical solution considering the vertical unsaturated flow is developed for groundwater flow in response to a slug test in an unconfined aquifer in Laplace space. The new solution incorporates the effects of partial penetrating, anisotropy, vertical unsaturated flow, and a moving water table boundary. Compared to the Kansas Geological Survey (KGS) model, the new solution can significantly improve the fittings of the modeled to the measured hydraulic heads at the late stage of slug tests in an unconfined aquifer, particularly when the slug well has a partially submerged screen and moisture drainage above the water table is significant. The radial hydraulic conductivities estimated with the new solution are comparable to those from the KGS, Bouwer and Rice, and Hvorslev methods. In addition, the new solution also can be used to examine the vertical conductivity, specific storage, specific yield, and the moisture retention parameters in an unconfined aquifer based on slug test data.
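
    Solutions derived in Laplace space, such as the one described above, are typically brought back to the time domain with a numerical inversion; the Python sketch below implements the widely used Stehfest algorithm and checks it on a simple placeholder transform (an exponential head recovery), not on the new slug-test solution itself.

        import math

        def stehfest_weights(N=12):
            """Stehfest coefficients V_i for an (even) number of terms N."""
            V = []
            for i in range(1, N + 1):
                s = 0.0
                for k in range((i + 1) // 2, min(i, N // 2) + 1):
                    s += (k ** (N // 2) * math.factorial(2 * k)) / (
                        math.factorial(N // 2 - k) * math.factorial(k) * math.factorial(k - 1)
                        * math.factorial(i - k) * math.factorial(2 * k - i))
                V.append((-1) ** (i + N // 2) * s)
            return V

        def invert(F, t, N=12):
            """Approximate f(t) from its Laplace transform F(p) using the Stehfest formula."""
            ln2_t = math.log(2.0) / t
            return ln2_t * sum(Vi * F(i * ln2_t) for i, Vi in enumerate(stehfest_weights(N), start=1))

        F = lambda p: 1.0 / (p + 1.0)                  # Laplace transform of exp(-t), used as a check
        for t in (0.5, 1.0, 2.0):
            print(t, round(invert(F, t), 5), round(math.exp(-t), 5))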

  7. Performance specifications for the extra-analytical phases of laboratory testing: Why and how.

    Science.gov (United States)

    Plebani, Mario

    2017-07-01

    An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, greater volumes and more accurate laboratory tests being achieved, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet in the literature no data are available on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state-of-the-art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection and standardised reporting method is mandatory as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also assure guidance for promoting improvement processes and guaranteeing quality care to patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  8. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  9. Source Country Differences in Test Score Gaps: Evidence from Denmark

    Science.gov (United States)

    Rangvid, Beatrice Schindler

    2010-01-01

    We combine data from three studies for Denmark in the PISA 2000 framework to investigate differences in the native-immigrant test score gap by country of origin. In addition to the controls available from PISA data sources, we use student-level data on home background and individual migration histories linked from administrative registers. We find…

  10. Upgrade of the BATMAN test facility for H- source development

    Science.gov (United States)

    Heinemann, B.; Fröschle, M.; Falter, H.-D.; Fantz, U.; Franzen, P.; Kraus, W.; Nocentini, R.; Riedl, R.; Ruf, B.

    2015-04-01

    The development of a radio frequency (RF) driven source for negative hydrogen ions for the neutral beam heating devices of fusion experiments has been successfully carried out at IPP since 1996 on the test facility BATMAN. The required ITER parameters have been achieved with the prototype source consisting of a cylindrical driver on the back side of a racetrack like expansion chamber. The extraction system, called "Large Area Grid" (LAG) was derived from a positive ion accelerator from ASDEX Upgrade (AUG) using its aperture size (ø 8 mm) and pattern but replacing the first two electrodes and masking down the extraction area to 70 cm2. BATMAN is a well diagnosed and highly flexible test facility which will be kept operational in parallel to the half size ITER source test facility ELISE for further developments to improve the RF efficiency and the beam properties. It is therefore planned to upgrade BATMAN with a new ITER-like grid system (ILG) representing almost one ITER beamlet group, namely 5 × 14 apertures (ø 14 mm). Additionally to the standard three grid extraction system a repeller electrode upstream of the grounded grid can optionally be installed which is positively charged against it by 2 kV. This is designated to affect the onset of the space charge compensation downstream of the grounded grid and to reduce the backstreaming of positive ions from the drift space backwards into the ion source. For magnetic filter field studies a plasma grid current up to 3 kA will be available as well as permanent magnets embedded into a diagnostic flange or in an external magnet frame. Furthermore different source vessels and source configurations are under discussion for BATMAN, e.g. using the AUG type racetrack RF source as driver instead of the circular one or modifying the expansion chamber for a more flexible position of the external magnet frame.

  11. Upgrade of the BATMAN test facility for H− source development

    International Nuclear Information System (INIS)

    Heinemann, B.; Fröschle, M.; Falter, H.-D.; Fantz, U.; Franzen, P.; Kraus, W.; Nocentini, R.; Riedl, R.; Ruf, B.

    2015-01-01

    The development of a radio frequency (RF) driven source for negative hydrogen ions for the neutral beam heating devices of fusion experiments has been successfully carried out at IPP since 1996 on the test facility BATMAN. The required ITER parameters have been achieved with the prototype source consisting of a cylindrical driver on the back side of a racetrack like expansion chamber. The extraction system, called “Large Area Grid” (LAG) was derived from a positive ion accelerator from ASDEX Upgrade (AUG) using its aperture size (ø 8 mm) and pattern but replacing the first two electrodes and masking down the extraction area to 70 cm2. BATMAN is a well diagnosed and highly flexible test facility which will be kept operational in parallel to the half size ITER source test facility ELISE for further developments to improve the RF efficiency and the beam properties. It is therefore planned to upgrade BATMAN with a new ITER-like grid system (ILG) representing almost one ITER beamlet group, namely 5 × 14 apertures (ø 14 mm). Additionally to the standard three grid extraction system a repeller electrode upstream of the grounded grid can optionally be installed which is positively charged against it by 2 kV. This is designated to affect the onset of the space charge compensation downstream of the grounded grid and to reduce the backstreaming of positive ions from the drift space backwards into the ion source. For magnetic filter field studies a plasma grid current up to 3 kA will be available as well as permanent magnets embedded into a diagnostic flange or in an external magnet frame. Furthermore different source vessels and source configurations are under discussion for BATMAN, e.g. using the AUG type racetrack RF source as driver instead of the circular one or modifying the expansion chamber for a more flexible position of the external magnet frame

  12. Analytic sensing for multi-layer spherical models with application to EEG source imaging

    OpenAIRE

    Kandaswamy, Djano; Blu, Thierry; Van De Ville, Dimitri

    2013-01-01

    Source imaging maps back boundary measurements to underlying generators within the domain; e. g., retrieving the parameters of the generating dipoles from electrical potential measurements on the scalp such as in electroencephalography (EEG). Fitting such a parametric source model is non-linear in the positions of the sources and renewed interest in mathematical imaging has led to several promising approaches. One important step in these methods is the application of a sensing principle that ...

  13. A very high yield electron impact ion source for analytical mass spectrometry

    International Nuclear Information System (INIS)

    Koontz, S.L.; Bonner Denton, M.

    1981-01-01

    A novel ion source designed for use in mass spectrometric determination of organic compounds is described. The source is designed around a low pressure, large volume, hot cathode Penning discharge. The source operates in the 10⁻⁴-10⁻⁷ torr pressure domain and is capable of producing focusable current densities several orders of magnitude greater than those produced by conventional Nier-type sources. Mass spectra of n-butane and octafluoro-2-butene are presented. An improved signal-to-noise ratio is demonstrated with a General Electric Monopole 300 mass spectrometer. (orig.)

  14. Analytical support for the B4C control rod test QUENCH-07

    International Nuclear Information System (INIS)

    Homann, C.; Hering, W.; Fernandez Benitez, J.A.; Ortega Bernardo, M.

    2003-04-01

    Degradation of B4C absorber rods during a beyond design accident in a nuclear power reactor may be a safety concern. Among others, the integral test QUENCH-07 is performed in the FZK QUENCH facility and supported by analytical work within the Euratom Fifth Framework Programme on Nuclear Fission Safety to get a more profound database. Since the test differed substantially from previous QUENCH tests, much more work had to be done for pretest calculations than usual to guarantee the safety of the facility and to derive the test protocol. Several institutions shared in this work with different computer code systems, as used for nuclear reactor safety analyses. Due to this effort, problems could be identified and solved, leading to several modifications of the originally planned test conduct, until a feasible test protocol could be derived and recommended. All calculations showed the same trends. Especially the high temperatures and hence the small safety margin for the facility were a concern. In this report, contributions of various authors, engaged in this work, are presented. The test QUENCH-07 and the related computational support by the engaged institutions were co-financed by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998 - 2002 (COLOSS Project, contract No. FIKS-CT-1999-00002). (orig.)

  15. Analytical support for the B4C control rod test QUENCH-07

    Energy Technology Data Exchange (ETDEWEB)

    Homann, C.; Hering, W. [Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany). Inst. fuer Reaktorsicherheit; Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany). Programm Nukleare Sicherheitsforschung]; Birchley, J. [Paul Scherrer Inst. (Switzerland)]; Fernandez Benitez, J.A.; Ortega Bernardo, M. [Univ. Politecnica de Madrid (Spain)]

    2003-04-01

    Degradation of B4C absorber rods during a beyond design accident in a nuclear power reactor may be a safety concern. Among others, the integral test QUENCH-07 is performed in the FZK QUENCH facility and supported by analytical work within the Euratom Fifth Framework Programme on Nuclear Fission Safety to get a more profound database. Since the test differed substantially from previous QUENCH tests, much more work had to be done for pretest calculations than usual to guarantee the safety of the facility and to derive the test protocol. Several institutions shared in this work with different computer code systems, as used for nuclear reactor safety analyses. Due to this effort, problems could be identified and solved, leading to several modifications of the originally planned test conduct, until a feasible test protocol could be derived and recommended. All calculations showed the same trends. Especially the high temperatures and hence the small safety margin for the facility were a concern. In this report, contributions of various authors, engaged in this work, are presented. The test QUENCH-07 and the related computational support by the engaged institutions were co-financed by the European Community under the Euratom Fifth Framework Programme on Nuclear Fission Safety 1998 - 2002 (COLOSS Project, contract No. FIKS-CT-1999-00002). (orig.)

  16. The impact of repeat-testing of common chemistry analytes at critical concentrations.

    Science.gov (United States)

    Onyenekwu, Chinelo P; Hudson, Careen L; Zemlin, Annalise E; Erasmus, Rajiv T

    2014-12-01

    Early notification of critical values by the clinical laboratory to the treating physician is a requirement for accreditation and is essential for effective patient management. Many laboratories automatically repeat a critical value before reporting it to prevent possible misdiagnosis. Given today's advanced instrumentation and quality assurance practices, we questioned the validity of this approach. We performed an audit of repeat-testing in our laboratory to assess for significant differences between initial and repeated test results, estimate the delay caused by repeat-testing and to quantify the cost of repeating these assays. A retrospective audit of repeat-tests for sodium, potassium, calcium and magnesium in the first quarter of 2013 at Tygerberg Academic Laboratory was conducted. Data on the initial and repeat-test values and the time that they were performed was extracted from our laboratory information system. The Clinical Laboratory Improvement Amendment criteria for allowable error were employed to assess for significant difference between results. A total of 2308 repeated tests were studied. There was no significant difference in 2291 (99.3%) of the samples. The average delay ranged from 35 min for magnesium to 42 min for sodium and calcium. At least 2.9% of laboratory running costs for the analytes was spent on repeating them. The practice of repeating a critical test result appears unnecessary as it yields similar results, delays notification to the treating clinician and increases laboratory running costs.
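
    The comparison logic behind such an audit is simple to script. The sketch below is a hypothetical illustration (the allowable-error limits and the paired results are assumptions for demonstration, not the study's data): it flags a repeat result as significantly different only when it departs from the initial critical result by more than the analyte's total allowable error.

```python
# Minimal sketch of the repeat-versus-initial comparison described above.
# The allowable-error limits and the sample values are illustrative
# assumptions, not the limits or data used in the cited audit.
allowable_error = {        # hypothetical total allowable error per analyte
    "sodium": 4.0,         # mmol/L
    "potassium": 0.5,      # mmol/L
    "calcium": 0.25,       # mmol/L
    "magnesium": 0.15,     # mmol/L
}

def significant_difference(analyte, initial, repeat):
    """True if the repeat differs from the initial critical result by more
    than the allowable error for that analyte."""
    return abs(repeat - initial) > allowable_error[analyte]

pairs = [("potassium", 6.8, 6.7), ("sodium", 118.0, 124.0), ("calcium", 1.50, 1.52)]
for analyte, first, second in pairs:
    print(analyte, significant_difference(analyte, first, second))
```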

  17. 40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.

    Science.gov (United States)

    2010-07-01

    ... approves the use of E. coli as a fecal indicator for source water monitoring under this paragraph (a). If the repeat sample collected from the ground water source is E. coli positive, the system must comply... listed in paragraph (c)(2) of this section for the presence of E. coli, enterococci, or coliphage...

  18. A systematic quantification of the sources of variation of process analytical measurements in the steel industry

    NARCIS (Netherlands)

    Jellema, R.H.; Louwerse, D.J.; Smilde, A.K.; Gerritsen, M.J.P.; Guldemond, D.; Voet, van der H.; Vereijken, P.F.G.

    2003-01-01

    A strategy is proposed for the identification and quantification of sources of variation in a manufacturing process. The strategy involves six steps: identification and selection of factors, model selection, design of the experiments, performing the experiments, estimation of sources of variation,

  19. Temperature field due to time-dependent heat sources in a large rectangular grid - Derivation of analytical solution

    International Nuclear Information System (INIS)

    Claesson, J.; Probert, T.

    1996-01-01

    The temperature field in rock due to a large rectangular grid of heat-releasing canisters containing nuclear waste is studied. By superposition, the solution is divided into different parts. There is a global temperature field due to the large rectangular canister area, while a local field accounts for the remaining heat source problem. The global field is reduced to a single integral. The local field is also solved analytically using solutions for a finite line heat source and for an infinite grid of point sources. The local solution is reduced to three parts, each of which depends on two spatial coordinates only. The temperatures at the envelope of a canister are given by a single thermal resistance, which is given by an explicit formula. The results are illustrated by a few numerical examples dealing with the KBS-3 concept for storage of nuclear waste. 8 refs
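
    As an illustration of the kind of building block such a superposition uses, the classical temperature rise around a single continuous point source of constant strength q, switched on at t = 0 in an infinite medium of thermal conductivity λ and diffusivity a, is the textbook expression below (given here for orientation only; the paper's grid solution superposes line and point sources and handles time-dependent canister powers):

```latex
% Textbook continuous point-source solution in an infinite medium
% (illustrative building block, not the paper's full grid solution).
\Delta T(r,t) = \frac{q}{4\pi\lambda r}\,
                \operatorname{erfc}\!\left(\frac{r}{2\sqrt{a\,t}}\right)
```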

  20. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    Science.gov (United States)

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  1. Continuous Analytical Performances Monitoring at the On-Site Laboratory through Proficiency, Inter-Laboratory Testing and Inter-Comparison Analytical Methods

    International Nuclear Information System (INIS)

    Duhamel, G.; Decaillon, J.-G.; Dashdondog, S.; Kim, C.-K.; Toervenyi, A.; Hara, S.; Kato, S.; Kawaguchi, T.; Matsuzawa, K.

    2015-01-01

    Since 2008, as one measure to strengthen its quality management system, the On-Site Laboratory for nuclear safeguards at the Rokkasho Reprocessing Plant has increased its participation in domestic and international proficiency and inter-laboratory testing, not only to determine analytical method accuracy, precision and robustness but also to support method development and improvement. This paper provides a description of the testing and its scheduling. It presents the way the testing was optimized to cover most of the analytical methods at the OSL. The paper presents the methodology used for the evaluation of the obtained results based on analysis of variance (ANOVA). Results are discussed with respect to random, systematic and long-term systematic error. (author)
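
    The ANOVA evaluation mentioned above comes down to separating within-run (random) from between-run (systematic) variation. The sketch below is a minimal balanced one-way variance-components calculation on hypothetical replicate data, not the OSL's measurements or its full long-term analysis.

```python
import numpy as np

# One-way random-effects ANOVA on hypothetical replicate data: splits the
# observed variation into a within-run (random error) and a between-run
# (run-to-run systematic) component.
runs = [np.array([10.1, 10.3, 10.2]),   # run 1 replicates
        np.array([ 9.8,  9.9, 10.0]),   # run 2 replicates
        np.array([10.4, 10.5, 10.3])]   # run 3 replicates

k, n = len(runs), len(runs[0])
grand = np.mean(np.concatenate(runs))
ms_within = np.mean([np.var(r, ddof=1) for r in runs])
ms_between = n * np.sum([(np.mean(r) - grand) ** 2 for r in runs]) / (k - 1)

var_within = ms_within                                   # repeatability variance
var_between = max(0.0, (ms_between - ms_within) / n)     # run-to-run variance
print(var_within, var_between)
```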

  2. A Comparison of Two Approaches for the Ruggedness Testing of an Analytical Method

    International Nuclear Information System (INIS)

    Maestroni, Britt

    2016-01-01

    As part of an initiative under the “Red Analitica de Latino America y el Caribe” (RALACA) network the FAO/IAEA Food and Environmental Protection Laboratory validated a multi-residue method for pesticides in potato. One of the parameters to be assessed was the intra laboratory robustness or ruggedness. The objective of this work was to implement a worked example for RALACA laboratories to test for the robustness (ruggedness) of an analytical method. As a conclusion to this study, it is evident that there is a need for harmonization of the definition of the terms robustness/ruggedness, the limits, the methodology and the statistical treatment of the generated data. A worked example for RALACA laboratories to test for the robustness (ruggedness) of an analytical method will soon be posted on the RALACA website (www.red-ralaca.net). This study was carried out with collaborators from LVA (Austria), University of Antwerp (Belgium), University of Leuwen (The Netherlands), Universidad de la Republica (Uruguay) and Agilent technologies.

  3. Data from thermal testing of the Open Source Cryostage

    DEFF Research Database (Denmark)

    Buch, Johannes Lørup; Ramløv, Hans

    2016-01-01

    The data presented here is related to the research article "An open source cryostage and software analysis method for detection of antifreeze activity" (Buch and Ramløv, 2016) [1]. The design of the Open Source Cryostage (OSC) is tested in terms of thermal limits, thermal efficiency and electrical efficiency. This article furthermore includes an overview of the electrical circuitry and a flowchart of the software program controlling the temperature of the OSC. The thermal efficiency data is presented here as degrees per volt and maximum cooling capacity.

  4. Analytical calculation of the solid angle subtended by an arbitrarily positioned ellipsoid to a point source

    International Nuclear Information System (INIS)

    Heitz, Eric

    2017-01-01

    We present a geometric method for computing an ellipse that subtends the same solid-angle domain as an arbitrarily positioned ellipsoid. With this method we can extend existing analytical solid-angle calculations of ellipses to ellipsoids. Our idea consists of applying a linear transformation on the ellipsoid such that it is transformed into a sphere from which a disk that covers the same solid-angle domain can be computed. We demonstrate that by applying the inverse linear transformation on this disk we obtain an ellipse that subtends the same solid-angle domain as the ellipsoid. We provide a MATLAB implementation of our algorithm and we validate it numerically.
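
    The core of the construction is that a linear map sends rays from the source to rays from the source, so the cone of directions subtended by the ellipsoid equals the cone subtended by its spherical image. The sketch below is a numerical illustration of that fact (arbitrary example geometry, Monte Carlo estimate only); it is not the paper's analytical ellipse construction or its MATLAB implementation.

```python
import numpy as np

# Monte Carlo illustration of the direction-mapping idea: a direction u from
# the source (origin) hits the ellipsoid (x-c)^T A (x-c) <= 1 exactly when
# the transformed direction M u (with A = M^T M) hits the unit sphere
# centred at M c, so both bodies subtend the same solid-angle domain.
rng = np.random.default_rng(0)

A = np.diag([1 / 2.0**2, 1 / 1.0**2, 1 / 0.5**2])  # example ellipsoid matrix
c = np.array([4.0, 1.0, 0.5])                      # example ellipsoid centre
M = np.linalg.cholesky(A).T                        # A = M^T M

p = M @ c                                          # sphere centre after mapping
cos_alpha = np.sqrt(1.0 - 1.0 / np.dot(p, p))      # tangent-cone half-angle (source outside)

n = 2_000_000
u = rng.normal(size=(n, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)      # uniform random directions
v = u @ M.T
v /= np.linalg.norm(v, axis=1, keepdims=True)      # mapped, renormalised directions
hits = v @ (p / np.linalg.norm(p)) >= cos_alpha    # direction lies inside the cone?

print("estimated solid angle [sr]:", 4 * np.pi * hits.mean())
```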

  5. Analytical calculation of the solid angle subtended by an arbitrarily positioned ellipsoid to a point source

    Energy Technology Data Exchange (ETDEWEB)

    Heitz, Eric, E-mail: eheitz.research@gmail.com

    2017-04-21

    We present a geometric method for computing an ellipse that subtends the same solid-angle domain as an arbitrarily positioned ellipsoid. With this method we can extend existing analytical solid-angle calculations of ellipses to ellipsoids. Our idea consists of applying a linear transformation on the ellipsoid such that it is transformed into a sphere from which a disk that covers the same solid-angle domain can be computed. We demonstrate that by applying the inverse linear transformation on this disk we obtain an ellipse that subtends the same solid-angle domain as the ellipsoid. We provide a MATLAB implementation of our algorithm and we validate it numerically.

  6. Analytical calculations of the efficiency of gamma scintillators total efficiency for coaxial disk sources

    Energy Technology Data Exchange (ETDEWEB)

    Selim, Y S; Abbas, M I; Fawzy, M A [Physics Department, Faculty of Science, Alexandria University, Aleaxndria (Egypt)

    1997-12-31

    Total efficiency of clad right circular cylindrical Nal(TI) scintillation detector from a coaxial isotropic radiating circular disk source has been calculated by the of rigid mathematical expressions. Results were tabulated for various gamma energies. 2 figs., 5 tabs.

  7. INAA in combination with other analytical techniques in the study of urban aerosol sources

    International Nuclear Information System (INIS)

    Binh, N.T.; Truong, Y.; Ngo, N.T.; Sieu, L.N.; Hien, P.D.

    2000-01-01

    Concentrations of elements in fine and coarse PM10 samples collected in Ho Chi Minh City were determined by INAA for the purpose of characterising air pollution sources using multivariate receptor modeling techniques. Seven sources common to coarse and fine samples were identified. Resuspended soil dust is dominant in the coarse samples accounting for 41% of the particulate mass. In the fine samples, vehicle emissions and coal burning are most important accounting for about 20% each. Although a great number of elements were included in the input data for receptor modeling, the interpretation of emission sources was not always straightforward. Information on other source markers were needed. Therefore, a polarography method was used for quantifying lead, and recently, ion chromatography method became available for quantifying secondary sulphates, nitrates and other water soluble ions. (author)

  8. Analytic and Unambiguous Phase-Based Algorithm for 3-D Localization of a Single Source with Uniform Circular Array

    Directory of Open Access Journals (Sweden)

    Le Zuo

    2018-02-01

    This paper presents an analytic algorithm for estimating three-dimensional (3-D) localization of a single source with uniform circular array (UCA) interferometers. Fourier transforms are exploited to expand the phase distribution of a single source and the localization problem is reformulated as an equivalent spectrum manipulation problem. The 3-D parameters are decoupled to different spectrums in the Fourier domain. Algebraic relations are established between the 3-D localization parameters and the Fourier spectrums. Fourier sampling theorem ensures that the minimum element number for 3-D localization of a single source with a UCA is five. Accuracy analysis provides mathematical insights into the 3-D localization algorithm, showing that a larger number of elements gives higher estimation accuracy. In addition, the phase-based high-order difference invariance (HODI) property of a UCA is found and exploited to realize phase range compression. Following phase range compression, ambiguity resolution is addressed by the HODI of a UCA. A major advantage of the algorithm is that the ambiguity resolution and 3-D localization estimation are both analytic and are processed simultaneously, hence computationally efficient. Numerical simulations and experimental results are provided to verify the effectiveness of the proposed 3-D localization algorithm.
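
    A stripped-down, far-field version of the spectral idea can be written in a few lines. The sketch below assumes a noiseless plane-wave source and unwrapped phases (real interferometer phases are only known modulo 2π, which is exactly the ambiguity the paper resolves with the HODI property); the element count, radius, wavelength and angles are arbitrary example values, and this is not the paper's full near-field 3-D algorithm.

```python
import numpy as np

# Far-field toy: for a UCA of M elements with radius R, the element phases are
# phi_m = (2*pi*R/lam) * sin(theta) * cos(psi - 2*pi*m/M), so their DFT is
# non-zero only in bins +/-1, and azimuth/elevation decouple into the angle
# and magnitude of the first Fourier coefficient.
M, R, lam = 8, 0.5, 0.3                              # elements, radius [m], wavelength [m]
theta, psi = np.deg2rad(40.0), np.deg2rad(110.0)     # true elevation and azimuth

gamma = 2 * np.pi * np.arange(M) / M                 # element angular positions
phi = (2 * np.pi * R / lam) * np.sin(theta) * np.cos(psi - gamma)

X1 = np.fft.fft(phi)[1]                              # first spectral bin
psi_hat = -np.angle(X1)                              # azimuth estimate
amp = 2 * np.abs(X1) / M                             # equals (2*pi*R/lam)*sin(theta)
theta_hat = np.arcsin(amp * lam / (2 * np.pi * R))   # elevation estimate

print(np.rad2deg(psi_hat), np.rad2deg(theta_hat))    # ~110, ~40 degrees
```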

  9. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation

    International Nuclear Information System (INIS)

    Chen, Ming; Yu, Hengyong

    2015-01-01

    Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++, which are linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphic processing units.

  10. Analytic treatment of leading-order parton evolution equations: Theory and tests

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; McKay, Douglas W.

    2009-01-01

    We recently derived an explicit expression for the gluon distribution function G(x,Q²) = xg(x,Q²) in terms of the proton structure function F₂^γp(x,Q²) in leading-order (LO) QCD by solving the LO Dokshitzer-Gribov-Lipatov-Altarelli-Parisi equation for the Q² evolution of F₂^γp(x,Q²) analytically, using a differential-equation method. We showed that accurate experimental knowledge of F₂^γp(x,Q²) in a region of Bjorken x and virtuality Q² is all that is needed to determine the gluon distribution in that region. We rederive and extend the results here using a Laplace-transform technique, and show that the singlet quark structure function F_S(x,Q²) can be determined directly in terms of G from the Dokshitzer-Gribov-Lipatov-Altarelli-Parisi gluon evolution equation. To illustrate the method and check the consistency of existing LO quark and gluon distributions, we used the published values of the LO quark distributions from the CTEQ5L and MRST2001 LO analyses to form F₂^γp(x,Q²), and then solved analytically for G(x,Q²). We find that the analytic and fitted gluon distributions from MRST2001LO agree well with each other for all x and Q², while those from CTEQ5L differ significantly from each other for large x values, x ≳ 0.03-0.05, at all Q². We conclude that the published CTEQ5L distributions are incompatible in this region. Using a nonsinglet evolution equation, we obtain a sensitive test of quark distributions which holds in both LO and next-to-leading order perturbative QCD. We find in either case that the CTEQ5 quark distributions satisfy the tests numerically for small x, but fail the tests for x ≳ 0.03-0.05; their use could potentially lead to significant shifts in predictions of quantities sensitive to large x. We encountered no problems with the MRST2001LO distributions or later CTEQ distributions. We suggest caution in the use of the CTEQ5 distributions.

  11. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J. William

    1999-01-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  12. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  13. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J.W. [California Univ., Riverside, CA (United States). Dept. of Physics

    1999-03-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.) 6 refs.

  14. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J.W.

    1999-01-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.)

  15. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Science.gov (United States)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  16. Analytical performance of centrifuge-based device for clinical chemistry testing.

    Science.gov (United States)

    Suk-Anake, Jamikorn; Promptmas, Chamras

    2012-01-01

    A centrifuge-based device has been introduced to the Samsung Blood Analyzer (SBA). The verification of this analyzer is essential to meet the ISO 15189 standard. Analytical performance was evaluated according to the NCCLS EP05-A method. The results of plasma samples were compared between the SBA and a Hitachi 917 analyzer according to the NCCLS EP09-A2-IR method. Percent recovery was determined via analysis of original control serum and spiked serum. Within-run precision was found to be 0.00 - 6.61% and 0.96 - 5.99% in normal- and abnormal-level assays, respectively, while between-run precision was 1.31 - 9.09% and 0.89 - 6.92%, respectively. The correlation coefficients (r) were > 0.990. The SBA presented analytical accuracy at 96.64 +/- 3.39% to 102.82 +/- 2.75% and 98.31 +/- 4.04% to 103.61 +/- 8.28% recovery, respectively. The results obtained verify that all of the 13 tests performed using the SBA demonstrate good and reliable precision suitable for use in a qualified clinical chemistry laboratory service.
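
    The two headline figures of merit in such a verification, imprecision (CV) and recovery, are straightforward to compute from replicate measurements. The sketch below uses hypothetical replicate data and an assumed target value, not the SBA study's measurements.

```python
import numpy as np

# Within-run coefficient of variation (precision) and percent recovery
# (accuracy) from hypothetical replicates of a control serum.
replicates = np.array([5.02, 4.97, 5.05, 4.99, 5.03])   # measured values, mmol/L
target = 5.00                                           # assigned control value

cv_within = 100 * replicates.std(ddof=1) / replicates.mean()
recovery = 100 * replicates.mean() / target
print(f"within-run CV = {cv_within:.2f}%, recovery = {recovery:.2f}%")
```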

  17. Testing a 1-D Analytical Salt Intrusion Model and the Predictive Equation in Malaysian Estuaries

    Science.gov (United States)

    Gisen, Jacqueline Isabella; Savenije, Hubert H. G.

    2013-04-01

    Little is known about the salt intrusion behaviour in Malaysian estuaries. Study on this topic sometimes requires large amounts of data, especially if 2-D or 3-D numerical models are used for analysis. In poor data environments, 1-D analytical models are more appropriate. For this reason, a fully analytical 1-D salt intrusion model, based on the theory of Savenije in 2005, was tested in three Malaysian estuaries (Bernam, Selangor and Muar) because it is simple and requires minimal data. In order to achieve that, site surveys were conducted in these estuaries during the dry season (June-August) at spring tide by the moving boat technique. Data on cross-sections, water levels and salinity were collected, and then analysed with the salt intrusion model. This paper demonstrates a good fit between the simulated and observed salinity distribution for all three estuaries. Additionally, the calibrated Van der Burgh's coefficient K, dispersion coefficient D0, and salt intrusion length L for the estuaries also displayed reasonable correlations with those calculated from the predictive equations. This indicates that not only is the salt intrusion model valid for the case studies in Malaysia but also the predictive model. Furthermore, the results from this study describe the current state of the estuaries, with which the Malaysian water authority can make decisions on limiting water abstraction or dredging. Keywords: salt intrusion, Malaysian estuaries, discharge, predictive model, dispersion

  18. Exact analytical solution of time-independent neutron transport equation, and its applications to systems with a point source

    International Nuclear Information System (INIS)

    Mikata, Y.

    2014-01-01

    Highlights: • An exact solution for the one-speed neutron transport equation is obtained. • This solution as well as its derivation are believed to be new. • Neutron flux for a purely absorbing material with a point neutron source off the origin is obtained. • Spherically as well as cylindrically piecewise constant cross sections are studied. • Neutron flux expressions for a point neutron source off the origin are believed to be new. - Abstract: An exact analytical solution of the time-independent monoenergetic neutron transport equation is obtained in this paper. The solution is applied to systems with a point source. Systematic analysis of the solution of the time-independent neutron transport equation, and its applications represent the primary goal of this paper. To the best of the author’s knowledge, certain key results on the scalar neutron flux as well as their derivations are new. As an application of these results, a scalar neutron flux for a purely absorbing medium with a spherically piecewise constant cross section and an isotropic point neutron source off the origin as well as that for a cylindrically piecewise constant cross section with a point neutron source off the origin are obtained. Both of these results are believed to be new
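
    For orientation, the simplest special case of the scalar flux discussed above is the textbook uncollided flux from an isotropic point source of strength S at the origin of an infinite, homogeneous, purely absorbing medium with constant total cross section Σt; the paper's results generalize this to piecewise-constant cross sections and sources located off the origin:

```latex
% Uncollided scalar flux, constant cross section, point source at the origin
% (textbook special case of the results summarized above).
\phi(r) = \frac{S\, e^{-\Sigma_t r}}{4\pi r^{2}}
```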

  19. Coagulation Tests and Selected Biochemical Analytes in Dairy Cows with Hepatic Lipidosis

    Directory of Open Access Journals (Sweden)

    S. Padilla-Arellanes

    2007-01-01

    The aim of this study was to determine the values and changes in conventional and optimised clotting tests, as well as in selected biochemical analytes, during hepatic lipidosis in postpartum dairy cows. Ten healthy and ten Holstein cows with hepatic lipidosis were selected based upon clinical history, clinical examination, liver biopsy, flotation test and histological analysis of hepatic tissue. Prothrombin time (PT) and partial thromboplastin time (PTT) were determined in non-diluted and diluted blood plasma samples. Clotting times determined in diluted plasma samples were prolonged in cows with hepatic lipidosis and there was a difference in the PT value at both 50% and 25% plasma dilutions between both groups of animals (P = 0.004 and P = 0.001). Significant differences between healthy animals and cows with hepatic lipidosis were observed in blood serum values for free fatty acids (FFA), aspartate aminotransferase (AST) and triacylglycerols (P = 0.001, P = 0.007 and P = 0.044, respectively). FFA and liver biopsy are better diagnostic indicators for hepatic lipidosis than coagulation tests. The optimised PT is prolonged in cows with hepatic lipidosis and can detect this alteration that cannot be appreciated using the conventional PT test.

  20. Transfer of test-enhanced learning: Meta-analytic review and synthesis.

    Science.gov (United States)

    Pan, Steven C; Rickard, Timothy C

    2018-05-07

    Attempting recall of information from memory, as occurs when taking a practice test, is one of the most potent training techniques known to learning science. However, does testing yield learning that transfers to different contexts? In the present article, we report the findings of the first comprehensive meta-analytic review into that question. Our review encompassed 192 transfer effect sizes extracted from 122 experiments and 67 published and unpublished articles (N = 10,382) that together comprise more than 40 years of research. A random-effects model revealed that testing can yield transferrable learning as measured relative to a nontesting reexposure control condition (d = 0.40, 95% CI [0.31, 0.50]). That transfer of learning is greatest across test formats, to application and inference questions, to problems involving medical diagnoses, and to mediator and related word cues; it is weakest to rearranged stimulus-response items, to untested materials seen during initial study, and to problems involving worked examples. Moderator analyses further indicated that response congruency and elaborated retrieval practice, as well as initial test performance, strongly influence the likelihood of positive transfer. In two assessments for publication bias using PET-PEESE and various selection methods, the moderator effect sizes were minimally affected. However, the intercept predictions were substantially reduced, often indicating no positive transfer when none of the aforementioned moderators are present. Overall, our results motivate a three-factor framework for transfer of test-enhanced learning and have practical implications for the effective use of practice testing in educational and other training contexts. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
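
    The pooled effect size and confidence interval quoted above come from a random-effects model. The sketch below shows a minimal DerSimonian-Laird random-effects pooling on hypothetical effect sizes (it is not the review's data set of 192 effect sizes, nor its moderator or publication-bias analyses).

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of hypothetical Cohen's d values.
d = np.array([0.55, 0.20, 0.41, 0.35, 0.62])    # study effect sizes (illustrative)
v = np.array([0.02, 0.03, 0.01, 0.04, 0.02])    # their sampling variances

w_fe = 1 / v
d_fe = np.sum(w_fe * d) / np.sum(w_fe)          # fixed-effect mean
Q = np.sum(w_fe * (d - d_fe) ** 2)              # heterogeneity statistic
C = np.sum(w_fe) - np.sum(w_fe ** 2) / np.sum(w_fe)
tau2 = max(0.0, (Q - (len(d) - 1)) / C)         # between-study variance

w_re = 1 / (v + tau2)                           # random-effects weights
d_re = np.sum(w_re * d) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(d_re, d_re - 1.96 * se, d_re + 1.96 * se) # pooled d with 95% CI
```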

  1. Comparison of EPRI safety valve test data with analytically determined hydraulic results

    International Nuclear Information System (INIS)

    Smith, L.C.; Howe, K.S.

    1983-01-01

    NUREG-0737 (November 1980) and all subsequent U.S. NRC generic follow-up letters require that all operating plant licensees and applicants verify the acceptability of plant-specific pressurizer safety valve piping systems for valve operation transients by testing. To aid in this verification process, the Electric Power Research Institute (EPRI) conducted an extensive testing program at the Combustion Engineering Test Facility. Pertinent tests simulating dynamic opening of the safety valves for representative upstream environments were carried out. Different models and sizes of safety valves were tested at the simulated operating conditions. Transducers placed at key points in the system monitored a variety of thermal, hydraulic and structural parameters. From this data, a more complete description of the transient can be made. The EPRI test configuration was analytically modeled using a one-dimensional thermal hydraulic computer program that uses the method of characteristics approach to generate key fluid parameters as a function of space and time. The conservation equations are solved by applying both the implicit and explicit characteristic methods. Unbalanced or wave forces were determined for each straight run of pipe bounded on each side by a turn or elbow. Blowdown forces were included, where appropriate. Several parameters were varied to determine the effects on the pressure, hydraulic forces and timings of events. By comparing these quantities with the experimentally obtained data, an approximate picture of the flow dynamics is arrived at. Two cases in particular are presented. These are the hot and cold loop seal discharge tests made with the Crosby 6M6 spring-loaded safety valve. Included in the paper is a description of the hydraulic code, modeling techniques and assumptions, a comparison of the numerical results with experimental data and a qualitative description of the factors which govern pipe support loading. (orig.)

  2. The analytical investigation of the super-Gaussian pump source on ...

    Indian Academy of Sciences (India)

    In this paper, we assumed that the fiber core and first clad are exposed to a pump source with a super-Gaussian profile of order four. The effects of this non-uniform heat deposition on thermal, stress and thermo-optics properties such as temperature-dependent change of refractive index and thermally induced stress have ...

  3. A test on analytic continuation of thermal imaginary-time data

    International Nuclear Information System (INIS)

    Burnier, Y.; Laine, M.; Mether, L.

    2011-01-01

    Some time ago, Cuniberti et al. have proposed a novel method for analytically continuing thermal imaginary-time correlators to real time, which requires no model input and should be applicable with finite-precision data as well. Given that these assertions go against common wisdom, we report on a naive test of the method with an idealized example. We do encounter two problems, which we spell out in detail; this implies that systematic errors are difficult to quantify. On a more positive note, the method is simple to implement and allows for an empirical recipe by which a reasonable qualitative estimate for some transport coefficient may be obtained, if statistical errors of an ultraviolet-subtracted imaginary-time measurement can be reduced to roughly below the per mille level. (orig.)

  4. Latent structure of the Wisconsin Card Sorting Test: a confirmatory factor analytic study.

    Science.gov (United States)

    Greve, Kevin W; Stickle, Timothy R; Love, Jeffrey M; Bianchini, Kevin J; Stanford, Matthew S

    2005-05-01

    The present study represents the first large scale confirmatory factor analysis of the Wisconsin Card Sorting Test (WCST). The results generally support the three factor solutions reported in the exploratory factor analysis literature. However, only the first factor, which reflects general executive functioning, is statistically sound. The secondary factors, while likely reflecting meaningful cognitive abilities, are less stable except when all subjects complete all 128 cards. It is likely that having two discontinuation rules for the WCST has contributed to the varied factor analytic solutions reported in the literature and early discontinuation may result in some loss of useful information. Continued multivariate research will be necessary to better clarify the processes underlying WCST performance and their relationships to one another.

  5. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  6. Effects of fecal sampling on preanalytical and analytical phases in quantitative fecal immunochemical tests for hemoglobin.

    Science.gov (United States)

    Rapi, Stefano; Berardi, Margherita; Cellai, Filippo; Ciattini, Samuele; Chelazzi, Laura; Ognibene, Agostino; Rubeca, Tiziana

    2017-07-24

    Information on preanalytical variability is mandatory to bring laboratories up to ISO 15189 requirements. Fecal sampling is greatly affected by lack of harmonization in laboratory medicine. The aims of this study were to obtain information on the devices used for fecal sampling and to explore the effect of different amounts of feces on the results from the fecal immunochemical test for hemoglobin (FIT-Hb). Four commercial sample collection devices for quantitative FIT-Hb measurements were investigated. The volume of interest (VOI) of the probes was measured from diameter and geometry. Quantitative measurements of the mass of feces were carried out by gravimetry. The effects of an increased amount of feces on the analytical environment were investigated by measuring the Hb values with a single analytical method. The VOI was 8.22, 7.1 and 9.44 mm³ for probes that collected a target of 10 mg of feces, and 3.08 mm³ for one probe that targeted 2 mg of feces. The ratio between recovered and target amounts for the devices ranged from 56% to 121%. Different changes in the measured Hb values were observed when adding increasing amounts of feces to commercial buffers. The amounts of collected material are related to the design of the probes. Three out of 4 manufacturers declare the same target amount using different sampling volumes and obtaining different amounts of collected material. The introduction of standard probes to reduce preanalytical variability could be a useful step for fecal test harmonization and to fulfill the ISO 15189 requirements.
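
    The link between the probe's volume of interest and the nominal mass of collected feces is a simple geometric estimate. The sketch below approximates the sampling groove as a cylinder and assumes a stool density of about 1 g/cm³; the dimensions are illustrative, not any manufacturer's specification.

```python
import math

# Hypothetical groove dimensions and an assumed stool density of ~1 g/cm^3
# (i.e. 1 mg/mm^3); neither value comes from the cited study.
diameter_mm, length_mm = 2.2, 2.5
density_mg_per_mm3 = 1.0

voi_mm3 = math.pi * (diameter_mm / 2) ** 2 * length_mm
mass_mg = voi_mm3 * density_mg_per_mm3
print(f"VOI = {voi_mm3:.2f} mm^3 -> about {mass_mg:.1f} mg of feces")
```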

  7. SOURCE IST 2.0: development and beta testing

    International Nuclear Information System (INIS)

    Barber, D.H.; Iglesias, F.C.; Hoang, Y.; Dickson, L.W.; Dickson, R.S.; Richards, M.J.; Gibb, R.A.

    1999-01-01

    SOURCE IST 2.0 is the Industry Standard fission product release code that is being developed by Ontario Power Generation, New Brunswick Power, Hydro-Quebec, and Atomic Energy of Canada Ltd. This paper is a report on recent progress on requirement specification, code development, and module verification and validation activities. The theoretical basis for each model in the code is described in a module Software Theory Manual. The development of SOURCE IST 2.0 has required code design decisions about how to implement the software requirements. Development and module testing of the β1 release of SOURCE IST 2.0 (released in July 1999) have led to some interesting insights into fission product release modelling. The beta testing process has allowed code developers and analysts to refine the software requirements for the code. The need to verify physical reference data has guided some decisions on the code and data structure design. Examples of these design decisions are provided. Module testing, and verification and validation activities are discussed. These activities include code-targeted testing, stress testing, code inspection, comparison of code with requirements, and comparison of code results with independent algebraic, numerical, or semi-algebraic calculations. The list of isotopes to be modelled by SOURCE IST 2.0 provides an example of a subset of a reference data set. Isotopes are present on the list for a variety of reasons: personnel or public dose, equipment dose (for environmental qualification), fission rate and actinide modelling, or stable (or long-lived) targets for activation processes. To accommodate controlled changes to the isotope list, the isotope list and associated nuclear data are contained in a reference data file. The questions of multiple computing platforms, and of Year 2000 compliance have been addressed by programming rules for the code. By developing and testing modules on most of the different platforms on which the code is intended

  8. Immunochemical faecal occult blood tests have superior stability and analytical performance characteristics over guaiac-based tests in a controlled in vitro study.

    LENUS (Irish Health Repository)

    Lee, Chun Seng

    2011-06-01

    The aims of this study were (1) to determine the measurement accuracy of a widely used guaiac faecal occult blood test (gFOBT) compared with an immunochemical faecal occult blood test (iFOBT) during in vitro studies, including their analytical stability over time at ambient temperature and at 4°C; and (2) to compare analytical imprecision and other characteristics between two commercially available iFOBT methods.

  9. Type testing of devices with inserted radioactive sources

    International Nuclear Information System (INIS)

    Rolle, A.; Droste, B.; Dombrowski, H.

    2006-01-01

    In Germany, devices with inserted radioactive sources can get a type approval if they comply with specific requirements. Whoever operates a device whose type has been approved in accordance with the German Radiation Protection Ordinance does not need an individual authorization. Such type approvals for free use are granted by the Federal Office for Radiation Protection (BfS) on the basis of type testing performed by the Physikalisch-Technische Bundesanstalt (PTB), the national metrology institute, and the Bundesanstalt für Materialforschung und -prüfung (BAM), the Federal Institute for Materials Research and Testing. Main aspects of the assessment are the activity of the radioactive sources, the dose equivalent rate near the devices, the tamper-proofness and leak-tightness of the sources and the safety of the construction of the devices. With the new Radiation Protection Ordinance in 2001, more stringent requirements for a type approval were established. Experiences with the new regulations and the relevant assessment criteria applied by PTB and BAM will be presented. (authors)

  10. Deuterium results at the negative ion source test facility ELISE

    Science.gov (United States)

    Kraus, W.; Wünderlich, D.; Fantz, U.; Heinemann, B.; Bonomo, F.; Riedl, R.

    2018-05-01

    The ITER neutral beam system will be equipped with large radio frequency (RF) driven negative ion sources, with a cross section of 0.9 m × 1.9 m, which have to deliver extracted D⁻ ion beams of 57 A at 1 MeV for 1 h. At the Extraction from a Large Ion Source Experiment (ELISE) test facility, a source of half this size has been in operation since 2013. The goal of this experiment is to demonstrate a high operational reliability and to achieve the extracted current densities and beam properties required for ITER. Technical improvements of the source design and the RF system were necessary to provide reliable operation in steady state with an RF power of up to 300 kW. While in short pulses the required D⁻ current density has almost been reached, the performance in long pulses is determined, in particular in deuterium, by inhomogeneous and unstable currents of co-extracted electrons. By application of refined caesium evaporation and distribution procedures, and by reduction and symmetrization of the electron currents, considerable progress has been made, and D⁻ current densities of up to 190 A/m², corresponding to 66% of the value required for ITER, have been extracted for 45 min.

  11. Development of analytical and numerical models for the assessment and interpretation of hydrogeological field tests

    International Nuclear Information System (INIS)

    Mironenko, V.A.; Rumynin, V.G.; Konosavsky, P.K.; Pozdniakov, S.P.; Shestakov, V.M.; Roshal, A.A.

    1994-07-01

    Mathematical models of the flow and tracer tests in fractured aquifers are being developed for the further study of radioactive waste migration in ground water at the Lake Area, which is associated with one of the waste disposal sites in Russia. The choice of testing methods, tracer types (chemical or thermal) and the appropriate models are determined by the nature of the ongoing ground-water pollution processes and the hydrogeological features of the site under consideration. Special importance is attached to the increased density of wastes as well as to the possible redistribution of solutes both in the liquid phase and in the absorbed state (largely, on fracture surfaces). This allows for studying physical-and-chemical (hydrogeochemical) interaction parameters which are hard to obtain (considering the fractured structure of the rock mass) in the laboratory. Moreover, a theoretical substantiation is being given to the field methods of studying the properties of a fractured stratum aimed at the further construction of the drainage system or the subsurface flow barrier (cutoff wall), as well as the monitoring system that will evaluate the reliability of these ground-water protection measures. The proposed mathematical models are based on a tight combination of analytical and numerical methods, the former being preferred in solving the principal (2D axisymmetrical) class of the problems. The choice of appropriate problems is based on the close feedback with subsequent field tests in the Lake Area. 63 refs

  12. Development of analytical and numerical models for the assessment and interpretation of hydrogeological field tests

    Energy Technology Data Exchange (ETDEWEB)

    Mironenko, V.A.; Rumynin, V.G.; Konosavsky, P.K. [St. Petersburg Mining Inst. (Russian Federation); Pozdniakov, S.P.; Shestakov, V.M. [Moscow State Univ. (Russian Federation); Roshal, A.A. [Geosoft-Eastlink, Moscow (Russian Federation)

    1994-07-01

    Mathematical models of the flow and tracer tests in fractured aquifers are being developed for the further study of radioactive waste migration in ground water at the Lake Area, which is associated with one of the waste disposal sites in Russia. The choice of testing methods, tracer types (chemical or thermal) and the appropriate models are determined by the nature of the ongoing ground-water pollution processes and the hydrogeological features of the site under consideration. Special importance is attached to the increased density of wastes as well as to the possible redistribution of solutes both in the liquid phase and in the absorbed state (largely, on fracture surfaces). This allows for studying physical-and-chemical (hydrogeochemical) interaction parameters which are hard to obtain (considering the fractured structure of the rock mass) in the laboratory. Moreover, a theoretical substantiation is being given to the field methods of studying the properties of a fractured stratum aimed at the further construction of the drainage system or the subsurface flow barrier (cutoff wall), as well as the monitoring system that will evaluate the reliability of these ground-water protection measures. The proposed mathematical models are based on a tight combination of analytical and numerical methods, the former being preferred in solving the principal (2D axisymmetrical) class of the problems. The choice of appropriate problems is based on the close feedback with subsequent field tests in the Lake Area. 63 refs.

  13. Analytical validation of a novel multiplex test for detection of advanced adenoma and colorectal cancer in symptomatic patients.

    Science.gov (United States)

    Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce

    2018-05-30

    Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50-75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92% respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. The results provide the analytical evidence to support the implementation of the novel multi-marker test as

  14. Analytical characteristics of a continuum-source tungsten coil atomic absorption spectrometer.

    Science.gov (United States)

    Rust, Jennifer A; Nóbrega, Joaquim A; Calloway, Clifton P; Jones, Bradley T

    2005-08-01

    A continuum-source tungsten coil electrothermal atomic absorption spectrometer has been assembled, evaluated, and employed in four different applications. The instrument consists of a xenon arc lamp light source, a tungsten coil atomizer, a Czerny-Turner high resolution monochromator, and a linear photodiode array detector. This instrument provides simultaneous multi-element analyses across a 4 nm spectral window with a resolution of 0.024 nm. Such a device might be useful in many different types of analyses. To demonstrate this broad appeal, four very different applications have been evaluated. First of all, the temperature of the gas phase was measured during the atomization cycle of the tungsten coil, using tin as a thermometric element. Secondly, a summation approach for two absorption lines of aluminum falling within the same spectral window (305.5-309.5 nm) was evaluated. This approach improves the sensitivity without requiring any additional preconcentration steps. The third application describes a background subtraction technique, as it is applied to the analysis of an oil emulsion sample. Finally, interference effects caused by Na on the atomization of Pb were studied. The simultaneous measurements of Pb and Na suggest that the negative interference arises at least partially from competition between Pb and Na atoms for H₂ in the gas phase.
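
    The line-summation approach rests on the fact that, under the Beer-Lambert law, the absorbance at each line of the same element is proportional to concentration, so their sum behaves as a single signal with a steeper calibration slope. The sketch below is a hypothetical two-line example (the intensities are invented), not data from the instrument described above.

```python
import numpy as np

# Summation of absorbances at two lines of the same element (hypothetical
# reference and transmitted intensities at the two Al line positions).
I0 = np.array([1000.0, 980.0])     # reference intensities, line 1 and line 2
I = np.array([870.0, 905.0])       # transmitted intensities with analyte present

absorbances = -np.log10(I / I0)    # per-line absorbance (Beer-Lambert)
summed_signal = absorbances.sum()  # combined signal used for calibration
print(absorbances, summed_signal)
```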

  15. Numerical Test of Analytical Theories for Perpendicular Diffusion in Small Kubo Number Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Heusen, M.; Shalchi, A., E-mail: husseinm@myumanitoba.ca, E-mail: andreasm4@yahoo.com [Department of Physics and Astronomy, University of Manitoba, Winnipeg, MB R3T 2N2 (Canada)

    2017-04-20

    In the literature, one can find various analytical theories for perpendicular diffusion of energetic particles interacting with magnetic turbulence. Besides quasi-linear theory, there are different versions of the nonlinear guiding center (NLGC) theory and the unified nonlinear transport (UNLT) theory. For turbulence with high Kubo numbers, such as two-dimensional turbulence or noisy reduced magnetohydrodynamic turbulence, the aforementioned nonlinear theories provide similar results. For slab and small Kubo number turbulence, however, this is not the case. In the current paper, we compare different linear and nonlinear theories with each other and with test-particle simulations for a noisy slab model corresponding to small Kubo number turbulence. We show that UNLT theory agrees very well with all performed test-particle simulations. In the limit of long parallel mean free paths, the perpendicular mean free path approaches asymptotically the quasi-linear limit as predicted by the UNLT theory. For short parallel mean free paths we find a Rechester and Rosenbluth type of scaling as predicted by UNLT theory as well. The original NLGC theory disagrees with all performed simulations regardless of the parallel mean free path. The random ballistic interpretation of the NLGC theory agrees much better with the simulations, but compared to UNLT theory the agreement is inferior. We conclude that for this type of small Kubo number turbulence, only the latter theory allows for an accurate description of perpendicular diffusion.

  16. Numerical Test of Analytical Theories for Perpendicular Diffusion in Small Kubo Number Turbulence

    International Nuclear Information System (INIS)

    Heusen, M.; Shalchi, A.

    2017-01-01

    In the literature, one can find various analytical theories for perpendicular diffusion of energetic particles interacting with magnetic turbulence. Besides quasi-linear theory, there are different versions of the nonlinear guiding center (NLGC) theory and the unified nonlinear transport (UNLT) theory. For turbulence with high Kubo numbers, such as two-dimensional turbulence or noisy reduced magnetohydrodynamic turbulence, the aforementioned nonlinear theories provide similar results. For slab and small Kubo number turbulence, however, this is not the case. In the current paper, we compare different linear and nonlinear theories with each other and with test-particle simulations for a noisy slab model corresponding to small Kubo number turbulence. We show that UNLT theory agrees very well with all performed test-particle simulations. In the limit of long parallel mean free paths, the perpendicular mean free path approaches asymptotically the quasi-linear limit as predicted by the UNLT theory. For short parallel mean free paths we find a Rechester and Rosenbluth type of scaling as predicted by UNLT theory as well. The original NLGC theory disagrees with all performed simulations regardless of the parallel mean free path. The random ballistic interpretation of the NLGC theory agrees much better with the simulations, but compared to UNLT theory the agreement is inferior. We conclude that for this type of small Kubo number turbulence, only the latter theory allows for an accurate description of perpendicular diffusion.

  17. High temperature and dynamic testing of AHSS for an analytical description of the adiabatic cutting process

    Science.gov (United States)

    Winter, S.; Schmitz, F.; Clausmeyer, T.; Tekkaya, A. E.; F-X Wagner, M.

    2017-03-01

    In the automotive industry, advanced high strength steels (AHSS) are widely used as sheet part components to reduce weight, even though this leads to several challenges. The demand for high-quality shear cutting surfaces that do not require reworking can be fulfilled by adiabatic shear cutting: High strain rates and local temperatures lead to the formation of adiabatic shear bands (ASB). While this process is well suited to produce AHSS parts with excellent cutting surface quality, a fundamental understanding of the process is still missing today. In this study, compression tests in a Split-Hopkinson Pressure Bar with an initial strain rate of 1000 s⁻¹ were performed in a temperature range between 200 °C and 1000 °C. The experimental results show that high strength steels with nearly the same mechanical properties at room temperature (RT) may possess a considerably different behavior at higher temperatures. The resulting microstructures after testing at different temperatures were analyzed by optical microscopy. The thermo-mechanical material behavior was then considered in an analytical model. To predict the local temperature increase that occurs during the adiabatic blanking process, experimentally determined flow curves were used. Furthermore, the influence of temperature evolution with respect to phase transformation is discussed. This study contributes to a more complete understanding of the relevant microstructural and thermo-mechanical mechanisms leading to the evolution of ASB during cutting of AHSS.

  18. Thulium-170 oxide heat source experimental and analytical radiation and shielding study

    International Nuclear Information System (INIS)

    Tse, A.; Nelson, C.A.

    1970-05-01

    Radiation dose rates from three thulium-170 oxide sources (20.7, 10.0 and 5.0 thermal watts) were measured through three thicknesses (1/4, 1/2 and 1 inch) of absorber by thermoluminescent dosimetry techniques. Absorber materials used were aluminum, stainless steel, lead, tungsten and depleted uranium. Resultant radiation doses were measured at 19 and 100 cm. Comparison of theoretical dose rates calculated by computer with measured dose rates validated the calculation technique for lead, tungsten and uranium absorbers but not for aluminum and stainless steel. Use of infinite-medium buildup factors (B∞) was thus validated in computation of dose rates for lead, tungsten and uranium absorbers; use of B∞ in computation of dose rates for aluminum and stainless steel absorbers overestimated dose rates vis-a-vis experimentally determined dose rates by an approximate factor of 2.
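
    The calculation technique referred to above rests on the standard point-source attenuation law with a buildup factor. The sketch below is only an illustration of that relation; the attenuation coefficient, unshielded dose rate, and buildup value are hypothetical and are not taken from the report.

```python
import math

def shielded_dose_rate(d0_unshielded, mu, x, buildup):
    """Dose rate behind a slab shield for a point source, using the standard
    exponential attenuation law with a buildup factor B:
        D = D0 * B * exp(-mu * x)
    d0_unshielded : unshielded dose rate at the detector position
    mu            : linear attenuation coefficient of the shield (1/cm)
    x             : shield thickness (cm)
    buildup       : buildup factor (dimensionless, >= 1)
    """
    return d0_unshielded * buildup * math.exp(-mu * x)

# Hypothetical numbers for illustration only (not from the report):
d0 = 10.0          # mrem/h, unshielded dose rate at the detector position
mu_pb = 0.7        # 1/cm, illustrative attenuation coefficient for lead
x = 2.54           # cm, one inch of lead
b_infinite = 2.5   # illustrative infinite-medium buildup factor

print(shielded_dose_rate(d0, mu_pb, x, b_infinite))
# Applying an infinite-medium B to a thin slab of a low-Z absorber tends to
# overestimate the transmitted dose rate, which is the factor-of-2 effect the
# abstract notes for aluminum and stainless steel.
```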

  19. Tests of MVD prototype pad detector with a β⁻ source

    International Nuclear Information System (INIS)

    Yeol Kim, Sang; Gook Kim, Young; Su Ryu, Sang; Hwan Kang, Ju; Simon-Gillo, Jehanne; Sullivan, John P.; Heck, Hubert W. van; Xu Guanghua

    1999-01-01

    The MVD group has been testing two versions of silicon pad detectors. One design uses a single metal layer for readout trace routing. The second type uses two layers of metal, allowing for greatly simplified signal routing. However, because the readout traces for the pads pass over the other pads in the same column (separated by an oxide layer), the double-metal design introduces crosstalk into the system. A simple test stand using a ⁹⁰Sr β⁻ source with scintillator triggers was made to estimate the crosstalk. The crosstalk between pads in the same column of the pad detector was 1.6-3.1%. The values measured between pads in different columns were very close to zero. The measured crosstalk was below our maximum allowed value of 7.8%.
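
    A crosstalk figure like the 1.6-3.1% quoted above is simply the ratio of the signal induced on a neighboring pad to the signal on the hit pad. A minimal sketch with hypothetical pulse heights (the actual ADC values are not given in the abstract):

```python
def crosstalk_percent(signal_on_hit_pad, induced_on_neighbor):
    """Crosstalk expressed as the signal induced on a neighboring pad divided
    by the signal on the hit pad, in percent."""
    return 100.0 * induced_on_neighbor / signal_on_hit_pad

# Hypothetical pulse heights (ADC counts), for illustration only:
print(crosstalk_percent(1000.0, 25.0))  # -> 2.5, within the 1.6-3.1% range quoted above
```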

  20. Getting more out of biomedical documents with GATE's full lifecycle open source text analytics.

    Science.gov (United States)

    Cunningham, Hamish; Tablan, Valentin; Roberts, Angus; Bontcheva, Kalina

    2013-01-01

    This software article describes the GATE family of open source text analysis tools and processes. GATE is one of the most widely used systems of its type with yearly download rates of tens of thousands and many active users in both academic and industrial contexts. In this paper we report three examples of GATE-based systems operating in the life sciences and in medicine. First, in genome-wide association studies which have contributed to the discovery of a head and neck cancer mutation association. Second, medical records analysis which has significantly increased the statistical power of treatment/outcome models in the UK's largest psychiatric patient cohort. Third, richer constructs in drug-related searching. We also explore the ways in which the GATE family supports the various stages of the lifecycle present in our examples. We conclude that the deployment of text mining for document abstraction or rich search and navigation is best thought of as a process, and that with the right computational tools and data collection strategies this process can be made defined and repeatable. The GATE research programme is now 20 years old and has grown from its roots as a specialist development tool for text processing to become a rather comprehensive ecosystem, bringing together software developers, language engineers and research staff from diverse fields. GATE now has a strong claim to cover a uniquely wide range of the lifecycle of text analysis systems. It forms a focal point for the integration and reuse of advances that have been made by many people (the majority outside of the authors' own group) who work in text processing for biomedicine and other areas. GATE is available online under GNU open source licences and runs on all major operating systems. Support is available from an active user and developer community and also on a commercial basis.

  1. Getting more out of biomedical documents with GATE's full lifecycle open source text analytics.

    Directory of Open Access Journals (Sweden)

    Hamish Cunningham

    Full Text Available This software article describes the GATE family of open source text analysis tools and processes. GATE is one of the most widely used systems of its type with yearly download rates of tens of thousands and many active users in both academic and industrial contexts. In this paper we report three examples of GATE-based systems operating in the life sciences and in medicine. First, in genome-wide association studies which have contributed to the discovery of a head and neck cancer mutation association. Second, medical records analysis which has significantly increased the statistical power of treatment/outcome models in the UK's largest psychiatric patient cohort. Third, richer constructs in drug-related searching. We also explore the ways in which the GATE family supports the various stages of the lifecycle present in our examples. We conclude that the deployment of text mining for document abstraction or rich search and navigation is best thought of as a process, and that with the right computational tools and data collection strategies this process can be made defined and repeatable. The GATE research programme is now 20 years old and has grown from its roots as a specialist development tool for text processing to become a rather comprehensive ecosystem, bringing together software developers, language engineers and research staff from diverse fields. GATE now has a strong claim to cover a uniquely wide range of the lifecycle of text analysis systems. It forms a focal point for the integration and reuse of advances that have been made by many people (the majority outside of the authors' own group) who work in text processing for biomedicine and other areas. GATE is available online under GNU open source licences and runs on all major operating systems. Support is available from an active user and developer community and also on a commercial basis.

  2. Preliminary Tests of the DECRIS-SC Ion Source

    CERN Document Server

    Efremov, A; Bechterev, V; Bogomolov, S L; Bondarenko, P G; Datskov, V I; Dmitriev, S; Drobin, V; Lebedev, A; Leporis, M; Malinowski, H; Nikiforov, A; Paschenko, S V; Seleznev, V; Shishov, Yu A; Smirnov, Yu; Tsvineva, G; Yakovlev, B; Yazvitsky, N Yu

    2004-01-01

    A new "liquid He-free" superconducting Electron Cyclotron Resonance Ion Source DECRIS-SC, to be used as injector for the IC-100 small cyclotron, has been designed by FLNR and LHE JINR. The main feature is that a compact refrigerator of Gifford-McMahon type is used to cool the solenoid coils. For the reason of very small cooling power at 4.2 K (about 1 W) our efforts were to optimize the magnetic structure and minimize an external heating of the coils. The maximum magnetic field strength is 3 T and 2 T in injection and extraction region respectively. For the radial plasma confinement a hexapole made of NdFeB permanent magnet is used. The source will be capable of ECR plasma heating using different frequencies (14 GHz or 18 GHz). To be able to deliver usable intensities of solids, the design is also allow axial access for evaporation oven and metal samples using the plasma sputtering technique. Very preliminary results of the source test are presented.

  3. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  4. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    Science.gov (United States)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines. [Figure not available: see fulltext.]

  5. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    Directory of Open Access Journals (Sweden)

    Kronenwett Ralf

    2012-10-01

    Full Text Available Abstract Background EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Methods Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. Results PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, day time, instrument, operator, reagent lots) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer’s laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. Conclusions The EP test showed reproducible performance

  6. Decentral gene expression analysis: analytical validation of the Endopredict genomic multianalyte breast cancer prognosis test

    International Nuclear Information System (INIS)

    Kronenwett, Ralf; Brase, Jan C; Weber, Karsten E; Fisch, Karin; Müller, Berit M; Schmidt, Marcus; Filipits, Martin; Dubsky, Peter; Petry, Christoph; Dietel, Manfred; Denkert, Carsten; Bohmann, Kerstin; Prinzler, Judith; Sinn, Bruno V; Haufe, Franziska; Roth, Claudia; Averdick, Manuela; Ropers, Tanja; Windbergs, Claudia

    2012-01-01

    EndoPredict (EP) is a clinically validated multianalyte gene expression test to predict distant metastasis in ER-positive, HER2-negative breast cancer treated with endocrine therapy alone. The test is based on the combined analysis of 12 genes in formalin-fixed, paraffin-embedded (FFPE) tissue by reverse transcription-quantitative real-time PCR (RT-qPCR). Recently, it was shown that EP is feasible for reliable decentralized assessment of gene expression. The aim of this study was the analytical validation of the performance characteristics of the assay and its verification in a molecular-pathological routine laboratory. Gene expression values to calculate the EP score were assayed by one-step RT-qPCR using RNA from FFPE tumor tissue. Limit of blank, limit of detection, linear range, and PCR efficiency were assessed for each of the 12 PCR assays using serial sample dilutions. Different breast cancer samples were used to evaluate RNA input range, precision and inter-laboratory variability. PCR assays were linear up to Cq values between 35.1 and 37.2. Amplification efficiencies ranged from 75% to 101%. The RNA input range without considerable change of the EP score was between 0.16 and 18.5 ng/μl. Analysis of precision (variation of day, day time, instrument, operator, reagent lots) resulted in a total noise (standard deviation) of 0.16 EP score units on a scale from 0 to 15. The major part of the total noise (SD 0.14) was caused by the replicate-to-replicate noise of the PCR assays (repeatability) and was not associated with different operating conditions (reproducibility). Performance characteristics established in the manufacturer’s laboratory were verified in a routine molecular pathology laboratory. Comparison of 10 tumor samples analyzed in two different laboratories showed a Pearson coefficient of 0.995 and a mean deviation of 0.15 score units. The EP test showed reproducible performance characteristics with good precision and negligible laboratory
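
    Amplification efficiencies like the 75%-101% range reported above are conventionally derived from the slope of a Cq-versus-log10(input) dilution curve. A minimal sketch of that standard relation; the slopes shown are illustrative, not the study's calibration data.

```python
def pcr_efficiency_from_slope(slope):
    """Amplification efficiency from the slope of a Cq-vs-log10(input)
    standard curve: E = 10**(-1/slope) - 1, where 1.0 means 100%."""
    return 10 ** (-1.0 / slope) - 1.0

# A perfectly efficient reaction doubles each cycle, giving a slope of about -3.32:
print(pcr_efficiency_from_slope(-3.32))   # ~1.00 (100%)
# A shallower slope such as -4.11 corresponds to roughly 75% efficiency,
# the low end of the range reported in the abstract:
print(pcr_efficiency_from_slope(-4.11))   # ~0.75
```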

  7. Cost effectiveness of ovarian reserve testing in in vitro fertilization: a Markov decision-analytic model.

    Science.gov (United States)

    Moolenaar, Lobke M; Broekmans, Frank J M; van Disseldorp, Jeroen; Fauser, Bart C J M; Eijkemans, Marinus J C; Hompes, Peter G A; van der Veen, Fulco; Mol, Ben Willem J

    2011-10-01

    To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). A Markov decision model based on data from the literature and original patient data. Decision analytic framework. Computer-simulated cohort of subfertile women aged 20 to 45 years who are eligible for IVF. [1] No treatment, [2] up to three cycles of IVF limited to women under 41 years and no ovarian reserve testing, [3] up to three cycles of IVF with dose individualization of gonadotropins according to ovarian reserve, and [4] up to three cycles of IVF with ovarian reserve testing and exclusion of expected poor responders after the first cycle, with no treatment scenario as the reference scenario. Cumulative live birth over 1 year, total costs, and incremental cost-effectiveness ratios. The cumulative live birth was 9.0% in the no treatment scenario, 54.8% for scenario 2, 70.6% for scenario 3 and 51.9% for scenario 4. Absolute costs per woman for these scenarios were €0, €6,917, €6,678, and €5,892 for scenarios 1, 2, 3, and 4, respectively. Incremental cost-effectiveness ratios (ICER) for scenarios 2, 3, and 4 were €15,166, €10,837, and €13,743 per additional live birth. Sensitivity analysis showed the model to be robust over a wide range of values. Individualization of the follicle-stimulating hormone dose according to ovarian reserve is likely to be cost effective in women who are eligible for IVF, but this effectiveness needs to be confirmed in randomized clinical trials. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
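
    The reported incremental cost-effectiveness ratios follow directly from the listed costs and cumulative live-birth rates, using the no treatment scenario as the reference. A short check of that arithmetic; the small differences from the published €15,166, €10,837 and €13,743 come from rounding in the reported inputs.

```python
# Scenario data taken from the abstract: (cost per woman in EUR, cumulative live birth)
scenarios = {
    "no treatment (reference)": (0.0, 0.090),
    "scenario 2: IVF, no ovarian reserve testing": (6917.0, 0.548),
    "scenario 3: IVF, dose individualization": (6678.0, 0.706),
    "scenario 4: IVF, exclude expected poor responders": (5892.0, 0.519),
}

ref_cost, ref_effect = scenarios["no treatment (reference)"]
for name, (cost, effect) in scenarios.items():
    if name.startswith("no treatment"):
        continue
    # ICER = incremental cost / incremental effect, in EUR per additional live birth
    icer = (cost - ref_cost) / (effect - ref_effect)
    print(f"{name}: ICER ≈ €{icer:,.0f} per additional live birth")
```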

  8. Electron capture detector based on a non-radioactive electron source: operating parameters vs. analytical performance

    Directory of Open Access Journals (Sweden)

    E. Bunert

    2017-12-01

    Full Text Available Gas chromatographs with electron capture detectors are widely used for the analysis of electron-affine substances such as pesticides or chlorofluorocarbons. With detection limits in the low pptv range, electron capture detectors are the most sensitive detectors available for such compounds. Based on their operating principle, they require free electrons at atmospheric pressure, which are usually generated by a β− decay. However, the use of radioactive materials leads to regulatory restrictions regarding purchase, operation, and disposal. Here, we present a novel electron capture detector based on a non-radioactive electron source that shows similar detection limits compared to radioactive detectors but that is not subject to these limitations and offers further advantages such as adjustable electron densities and energies. In this work we show first experimental results using 1,1,2-trichloroethane and sevoflurane, and investigate the effect of several operating parameters on the analytical performance of this new non-radioactive electron capture detector (ECD).

  9. Feed Preparation for Source of Alkali Melt Rate Tests

    International Nuclear Information System (INIS)

    Stone, M. E.; Lambert, D. P.

    2005-01-01

    The purpose of the Source of Alkali testing was to prepare feed for melt rate testing in order to determine the maximum melt-rate for a series of batches where the alkali was increased from 0% Na₂O in the frit (low washed sludge) to 16% Na₂O in the frit (highly washed sludge). This document summarizes the feed preparation for the Source of Alkali melt rate testing. The Source of Alkali melt rate results will be issued in a separate report. Five batches of Sludge Receipt and Adjustment Tank (SRAT) product and four batches of Slurry Mix Evaporator (SME) product were produced to support Source of Alkali (SOA) melt rate testing. Sludge Batch 3 (SB3) simulant and frit 418 were used as targets for the 8% Na₂O baseline run. For the other four cases (0% Na₂O, 4% Na₂O, 12% Na₂O, and 16% Na₂O in frit), special sludge and frit preparations were necessary. The sludge preparations mimicked washing of the SB3 baseline composition, while frit adjustments consisted of increasing or decreasing Na and then re-normalizing the remaining frit components. For all batches, the target glass compositions were identical. The five SRAT products were prepared for testing in the dry fed melt-rate furnace and the four SME products were prepared for the Slurry-fed Melt-Rate Furnace (SMRF). At the same time, the impacts of washing on a baseline composition from a Chemical Process Cell (CPC) perspective could also be investigated. Five process simulations (0% Na₂O in frit, 4% Na₂O in frit, 8% Na₂O in frit or baseline, 12% Na₂O in frit, and 16% Na₂O in frit) were completed in three identical 4-L apparatus to produce the five SRAT products. The SRAT products were later dried and combined with the complementary frits to produce identical glass compositions. All five batches were produced with identical processing steps, including off-gas measurement using online gas chromatographs. Two slurry-fed melter feed batches, a 4% Na₂O in frit run (less washed sludge combined with
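
    The frit adjustment described above (change the Na₂O content, then renormalize the remaining components so the composition still sums to 100%) can be sketched in a few lines. The baseline composition below is only a placeholder for illustration; the report's actual Frit 418 recipe is not given here.

```python
def adjust_frit_na2o(frit_wt_pct, target_na2o_pct):
    """Set the Na2O content of a frit (wt%) and renormalize the other
    components proportionally so the composition still sums to 100%."""
    others = {ox: w for ox, w in frit_wt_pct.items() if ox != "Na2O"}
    scale = (100.0 - target_na2o_pct) / sum(others.values())
    adjusted = {ox: w * scale for ox, w in others.items()}
    adjusted["Na2O"] = target_na2o_pct
    return adjusted

# Hypothetical baseline composition, for illustration only:
baseline = {"SiO2": 76.0, "B2O3": 8.0, "Na2O": 8.0, "Li2O": 8.0}
print(adjust_frit_na2o(baseline, 16.0))  # highly washed sludge case
print(adjust_frit_na2o(baseline, 0.0))   # low washed sludge case
```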

  10. Crowd-sourcing as an analytical method: Metrology of smartphone measurements in heritage science.

    Science.gov (United States)

    Brigham, Rosie; Grau-Bove, Josep; Rudnicka, Anna; Cassar, May; Strlic, Matija

    2018-04-12

    This research assesses the precision, repeatability and accuracy of crowd-sourced scientific measurements, and whether their quality is sufficient to provide usable results. Measurements of colour and area were chosen because of the possibility of producing them with smartphone cameras. The quality of measurements was estimated experimentally by comparing data contributed by anonymous participants in heritage sites with reference measurements of known accuracy and precision. Participants performed the measurements by taking photographs with their smartphones, from which colour and dimensional data could be extracted. The results indicate that smartphone measurements provided by citizen-scientists can be used to measure changes of colour, but that the performance is strongly dependent on the measured colour coordinate and ranges from a minimum detectable colour change or difference between colours of ΔE 3.1 to ΔE 17.2. The same method is able to measure areas when the difference in colour with the neighbouring areas is higher than ΔE 10. These results render the method useful in some heritage science contexts, but higher precision would be desirable: the human eye can detect differences as small as ΔE 2, and a light-fast pigment fades approximately ΔE 8 in its lifetime. There is scope for further research in the automatization of the post-processing of user contributions and the effect of contextual factors (such as detail in the instructions) in the quality of the raw data. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
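
    The ΔE values quoted above are CIELAB colour differences; assuming the simple CIE76 formulation (the abstract does not state which ΔE formula was used), the calculation is just a Euclidean distance in L*a*b* space. The readings below are hypothetical.

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two CIELAB colours (CIE76 Delta E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical L*a*b* readings of the same painted area before and after fading:
before = (62.0, 18.0, 24.0)
after = (64.0, 16.5, 22.0)
print(delta_e_cie76(before, after))  # ~3.2, comparable to the best minimum
                                     # detectable change (ΔE 3.1) reported above
```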

  11. Does the Cognitive Reflection Test actually capture heuristic versus analytic reasoning styles in older adults?

    Science.gov (United States)

    Hertzog, Christopher; Smith, R Marit; Ariel, Robert

    2018-01-01

    Background/Study Context: This study evaluated adult age differences in the original three-item Cognitive Reflection Test (CRT; Frederick, 2005, The Journal of Economic Perspectives, 19, 25-42) and an expanded seven-item version of that test (Toplak et al., 2013, Thinking and Reasoning, 20, 147-168). The CRT is a numerical problem-solving test thought to capture a disposition towards either rapid, intuition-based problem solving (Type I reasoning) or a more thoughtful, analytical problem-solving approach (Type II reasoning). Test items are designed to induce heuristically guided errors that can be avoided if using an appropriate numerical representation of the test problems. We evaluated differences between young adults and old adults in CRT performance and correlates of CRT performance. Older adults (ages 60 to 80) were paid volunteers who participated in experiments assessing age differences in self-regulated learning. Young adults (ages 17 to 35) were students participating for pay as part of a project assessing measures of critical thinking skills or as a young comparison group in the self-regulated learning study. There were age differences in the number of CRT correct responses in two independent samples. Results with the original three-item CRT found older adults to have a greater relative proportion of errors based on providing the intuitive lure. However, younger adults actually had a greater proportion of intuitive errors on the long version of the CRT, relative to older adults. Item analysis indicated a much lower internal consistency of CRT items for older adults. These outcomes do not offer full support for the argument that older adults are higher in the use of a "Type I" cognitive style. The evidence was also consistent with an alternative hypothesis that age differences were due to lower levels of numeracy in the older samples. Alternative process-oriented evaluations of how older adults solve CRT items will probably be needed to determine

  12. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test.

    Science.gov (United States)

    Yang, Hannah P; Walmer, David K; Merisier, Delson; Gage, Julia C; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S; Castle, Philip E

    2011-09-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women, which might be used to decide who needs immediate colposcopy in low-resource settings ("triage test"). We found that the HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay (at a cutpoint of 5000 viral copies) (Kappa = 0.87). DNA sequencing on a subset of 16 HPV16/18/45-positive and 16 HPV16/18/45-negative samples verified the analytic specificity of the research test. It is concluded that the HPV16/18/45 assay is a promising triage test with a minimum detection of approximately 5000 viral copies, the clinically relevant threshold. Published by Elsevier B.V.
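
    The agreement statistics above are kappa coefficients; assuming they are Cohen's kappa computed from 2x2 concordance tables between two binary assays, the calculation is as sketched below. The cell counts are hypothetical, chosen only to show the arithmetic (the abstract does not report the underlying tables).

```python
def cohens_kappa(both_pos, test1_only, test2_only, both_neg):
    """Cohen's kappa for agreement between two binary tests, given the four
    cells of the 2x2 concordance table."""
    n = both_pos + test1_only + test2_only + both_neg
    p_observed = (both_pos + both_neg) / n
    p1_pos = (both_pos + test1_only) / n   # positivity rate of test 1
    p2_pos = (both_pos + test2_only) / n   # positivity rate of test 2
    p_expected = p1_pos * p2_pos + (1 - p1_pos) * (1 - p2_pos)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical counts, for illustration only:
print(cohens_kappa(both_pos=40, test1_only=5, test2_only=6, both_neg=149))
# ~0.84, similar in magnitude to the agreements reported above
```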

  13. Development and testing of analytical models for the pebble bed type HTRs

    International Nuclear Information System (INIS)

    Huda, M.Q.; Obara, T.

    2008-01-01

    The pebble bed type gas cooled high temperature reactor (HTR) appears to be a good candidate for the next generation nuclear reactor technology. These reactors have unique characteristics in terms of the randomness in geometry, and require special techniques to analyze their systems. This study includes activities concerning the testing of computational tools and the qualification of models. Indeed, it is essential that the validated analytical tools be available to the research community. From this viewpoint, codes like MCNP, ORIGEN and RELAP5, which have been used in the nuclear industry for many years, are selected to identify and develop new capabilities needed to support HTR analysis. The geometrical model of the full reactor is obtained by using lattice and universe facilities provided by MCNP. The coupled MCNP-ORIGEN code is used to estimate the burnup and the refuelling scheme. Results obtained from Monte Carlo analysis are interfaced with RELAP5 to analyze the thermal hydraulics and safety characteristics of the reactor. New models and methodologies are developed for several past and present experimental and prototypical facilities that were based on HTR pebble bed concepts. The calculated results are compared with available experimental data and theoretical evaluations showing very good agreement. The ultimate goal of the validation of the computer codes for pebble bed HTR applications is to acquire and reinforce the capability of these general purpose computer codes for performing HTR core design and optimization studies.

  14. Advanced photon source low-energy undulator test line

    International Nuclear Information System (INIS)

    Milton, S.V.

    1997-01-01

    The injector system of the Advanced Photon Source (APS) consists of a linac capable of producing 450-MeV positrons or > 650-MeV electrons, a positron accumulator ring (PAR), and a booster synchrotron designed to accelerate particles to 7 GeV. There are long periods of time when these machines are not required for filling the main storage ring and instead can be used for synchrotron radiation research. We describe here an extension of the linac beam transport called the Low-Energy Undulator Test Line (LEUTL). The LEUTL will have a twofold purpose. The first is to fully characterize innovative, future generation undulators, some of which may prove difficult or impossible to measure by traditional techniques. These might include small-gap and superconducting undulators, very long undulators, undulators with designed-in internal focusing, and helical undulators. This technique also holds the promise of extending the magnetic measurement sensitivity beyond that presently attainable. This line will provide the capability to directly test undulators before their possible insertion into operating storage rings. A second use for the test line will be to investigate the generation of coherent radiation at wavelengths down to a few tens of nanometers

  15. Getting Open Source Right for Big Data Analytics: Software Sharing, Governance, Collaboration and Most of All, Fun!

    Science.gov (United States)

    Mattmann, C. A.

    2013-12-01

    A wave of open source big data analytic infrastructure is currently shaping government, private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software, e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez and Yarn, to name a few; the Berkeley AMPLab stack which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies; and offerings from commercial companies building products around these tools, e.g., Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities, including low-latency/in-memory support versus record-oriented file I/O, high availability, and support for the MapReduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products: they are all released under an open source license, e.g., Apache2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache or Berkeley AMPLab; all are developed collaboratively; and all technologies provide plug-in architecture models and methodologies for allowing others to contribute and participate via various community models. This talk will cover the open source aspects and governance aspects of the aforementioned Big Data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and Big Data initiatives stemming from the Administration including DARPA's XDATA program; NASA's CMAC program; NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment and understanding.

  16. Note: Simulation and test of a strip source electron gun.

    Science.gov (United States)

    Iqbal, Munawar; Islam, G U; Misbah, I; Iqbal, O; Zhou, Z

    2014-06-01

    We present simulation and test of an indirectly heated strip source electron beam gun assembly using the Stanford Linear Accelerator Center (SLAC) electron beam trajectory program. The beam is now sharply focused with a 3.04 mm diameter in the post-anode region at 15.9 mm. The measured emission current and emission density were 1.12 A and 1.15 A/cm², respectively, which corresponds to a power density of 11.5 kW/cm² at a 10 kV acceleration potential. The simulated results were compared with earlier and current experiments and found to be in agreement. The gun operates without any biasing, electrostatic, or magnetic fields and is hence simple and inexpensive. Moreover, it is now more powerful and is useful for accelerator technology due to its high emission and low emittance parameters.
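
    The quoted power density is just the measured emission density multiplied by the acceleration potential; a one-line check of that arithmetic.

```python
emission_density = 1.15         # A/cm^2, measured emission density
acceleration_potential = 10e3   # V, acceleration potential
power_density = emission_density * acceleration_potential  # W/cm^2
print(power_density / 1e3)      # -> 11.5 kW/cm^2, matching the value quoted above
```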

  17. Broadband Liner Optimization for the Source Diagnostic Test Fan

    Science.gov (United States)

    Nark, Douglas M.; Jones, Michael G.

    2012-01-01

    The broadband component of fan noise has grown in relevance with the utilization of increased bypass ratio and advanced fan designs. Thus, while the attenuation of fan tones remains paramount, the ability to simultaneously reduce broadband fan noise levels has become more appealing. This paper describes a broadband acoustic liner optimization study for the scale model Source Diagnostic Test fan. Specifically, in-duct attenuation predictions with a statistical fan source model are used to obtain optimum impedance spectra over a number of flow conditions for three liner locations in the bypass duct. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners aimed at producing impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increased weighting to specific frequencies and/or operating conditions. Typical tonal liner designs targeting single frequencies at one operating condition are first produced to provide baseline performance information. These are followed by multiple broadband design approaches culminating in a broadband liner targeting the full range of frequencies and operating conditions. The broadband liner is found to satisfy the optimum impedance objectives much better than the tonal liner designs. In addition, the broadband liner is found to provide better attenuation than the tonal designs over the full range of frequencies and operating conditions considered. Thus, the current study successfully establishes a process for the initial design and evaluation of novel broadband liner concepts for complex engine configurations.

  18. Ion beam pellet fusion as a CTR neutron test source

    International Nuclear Information System (INIS)

    Arnold, R.; Martin, R.

    1975-07-01

    Pellet fusion, driven by nanosecond pulses containing α particles with 200 MeV energy, is being developed as a neutron source. A prototype system is in the conceptual design stage. During the coming year, engineering design of required accelerator components, storage rings, and pellet configurations, as well as experiments on energy deposition mechanisms, should be accomplished. Successful construction and tests of prototype rings, followed by two years of full scale system construction, would give a source producing a useful flux of fusion neutrons for materials testing. The system as currently envisioned would employ 100 small superconducting high field storage rings (15 cm radius, 140 kG field) which would be synchronously filled with circulating 1 nsec pulses from a 200 MeV linear accelerator over a period of 3 × 10⁻⁴ sec. These ion pulses would all be simultaneously extracted, forming a total current of 10 kA, and focussed from all directions on a deuterium and tritium (DT) pellet with 0.17 mm radius, surrounded by a heavier (metal) coating to increase confinement time and aid compression efficiency. The overall repetition rate, limited principally by physical transport of the pellets, could reach 100/sec. Spacing between pellet and focussing elements would be about 1 m. The predominant engineering problems are the fast extraction mechanism and beam transport devices for the storage rings. Additional theoretical and experimental studies are required on the crucial energy deposition and transport mechanisms in pellets with ion beam heating before firm estimates can be given. Preliminary estimates suggest fusion neutron yields of at least 10¹⁴/sec and possibly 10¹⁶/sec are possible, with optimal pellet dynamics, but without the necessity for any large advances in the state-of-the-art in accelerator and storage ring design. (auth)

  19. Mixed analytical-stochastic simulation method for the recovery of a Brownian gradient source from probability fluxes to small windows.

    Science.gov (United States)

    Dobramysl, U; Holcman, D

    2018-02-15

    Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
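
    A toy illustration of the underlying idea (recovering a source from fluxes to windows), not the paper's mixed analytical-stochastic scheme: if the whole line y = 0 is absorbing, the hitting point of 2D Brownian motion released at (x0, y0) in the upper half-plane follows a Cauchy(x0, y0) distribution, so counts in small windows encode the source position. The source location, window positions, and sample size below are all hypothetical.

```python
import math
import random

def window_flux(x0, y0, a, b):
    """Probability that a particle released at (x0, y0) first hits the segment [a, b]
    of the absorbing line y = 0 (Cauchy hitting distribution)."""
    return (math.atan((b - x0) / y0) - math.atan((a - x0) / y0)) / math.pi

# Hypothetical source and three small "windows" on the line y = 0:
true_x0, true_y0 = 1.5, 2.0
windows = [(-1.1, -0.9), (0.9, 1.1), (2.9, 3.1)]

# "Measure" fluxes by sampling hitting points (equivalent to full trajectories):
random.seed(1)
n = 200_000
hits = [true_x0 + true_y0 * math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
measured = [sum(a <= h <= b for h in hits) / n for a, b in windows]

# Recover the source by a coarse grid search minimising the flux mismatch:
best = min(
    ((x / 10, y / 10) for x in range(-50, 51) for y in range(1, 51)),
    key=lambda s: sum((window_flux(s[0], s[1], a, b) - m) ** 2
                      for (a, b), m in zip(windows, measured)),
)
print("true source:", (true_x0, true_y0), "recovered:", best)
```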

  20. A Simple Analytical Model for Predicting the Detectable Ion Current in Ion Mobility Spectrometry Using Corona Discharge Ionization Sources

    Science.gov (United States)

    Kirk, Ansgar Thomas; Kobelt, Tim; Spehlbrink, Hauke; Zimmermann, Stefan

    2018-05-01

    Corona discharge ionization sources are often used in ion mobility spectrometers (IMS) when a non-radioactive ion source with high ion currents is required. Typically, the corona discharge is followed by a reaction region where analyte ions are formed from the reactant ions. In this work, we present a simple yet sufficiently accurate model for predicting the ion current available at the end of this reaction region when operating at reduced pressure as in High Kinetic Energy Ion Mobility Spectrometers (HiKE-IMS) or most IMS-MS instruments. It yields excellent qualitative agreement with measurement results and is even able to calculate the ion current within an error of 15%. Additional interesting findings of this model are the ion current at the end of the reaction region being independent of the ion current generated by the corona discharge and the ion current in High Kinetic Energy Ion Mobility Spectrometers (HiKE-IMS) growing quadratically when scaling down the length of the reaction region. [Figure not available: see fulltext.]

  1. The Analytical Repository Source-Term (AREST) model: Analysis of spent fuel as a nuclear waste form

    International Nuclear Information System (INIS)

    Apted, M.J.; Liebetrau, A.M.; Engel, D.W.

    1989-02-01

    The purpose of this report is to assess the performance of spent fuel as a final waste form. The release of radionuclides from spent nuclear fuel has been simulated for the three repository sites that were nominated for site characterization in accordance with the Nuclear Waste Policy Act of 1982. The simulation is based on waste package designs that were presented in the environmental assessments prepared for each site. Five distinct distributions for containment failure have been considered, and the release of nuclides from the UO₂ matrix, gap (including grain boundary), crud/surface layer, and cladding has been calculated with the Analytic Repository Source-Term (AREST) code. Separate scenarios involving incongruent and congruent release from the UO₂ matrix have also been examined using the AREST code. Congruent release is defined here as the condition in which the relative mass release rates of a given nuclide and uranium from the UO₂ matrix are equal to their mass ratios in the matrix. Incongruent release refers to release of a given nuclide from the UO₂ matrix controlled by its own solubility-limiting solid phase. Release of nuclides from other sources within the spent fuel (e.g., cladding, fuel/cladding gap) is evaluated separately from either incongruent or congruent matrix release. 51 refs., 200 figs., 9 tabs
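
    The congruent-release condition defined above can be written as a one-line relation (notation introduced here only for illustration):

```latex
% Congruent release: the mass release rate of nuclide i scales with uranium's
% release rate through their mass ratio in the UO2 matrix.
\[
\frac{\dot{m}_i}{\dot{m}_U} \;=\; \frac{m_i}{m_U}
\qquad\Longrightarrow\qquad
\dot{m}_i \;=\; \dot{m}_U \,\frac{m_i}{m_U},
\]
% whereas incongruent release instead caps the dissolved concentration of
% nuclide i at the solubility limit of its own solid phase,
% $c_i \le c_i^{\mathrm{sat}}$.
```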

  2. Pre-analytical conditions in non-invasive prenatal testing of cell-free fetal RHD.

    Directory of Open Access Journals (Sweden)

    Frederik Banch Clausen

    Full Text Available Non-invasive prenatal testing of cell-free fetal DNA (cffDNA) in maternal plasma can predict the fetal RhD type in D negative pregnant women. In Denmark, routine antenatal screening for the fetal RhD gene (RHD) directs the administration of antenatal anti-D prophylaxis only to women who carry an RhD positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based on data from routine antenatal RHD screening. Blood samples were drawn at gestational age 25 weeks. DNA extracted from 1 mL of plasma was analyzed for fetal RHD using a duplex method for exon 7/10. We investigated the effect of blood sample transportation time (n = 110) and ambient outdoor temperatures (n = 1539) on the levels of cffDNA and total DNA. We compared two different quantification methods, the delta Ct method and a universal standard curve. PCR pipetting was compared on two systems (n = 104). The cffDNA level was unaffected by blood sample transportation for up to 9 days and by ambient outdoor temperatures ranging from -10 °C to 28 °C during transport. The universal standard curve was applicable for cffDNA quantification. Identical levels of cffDNA were observed using the two automated PCR pipetting systems. We detected a mean of 100 fetal DNA copies/mL at a median gestational age of 25 weeks (range 10-39, n = 1317). The setup for real-time PCR-based, non-invasive prenatal testing of cffDNA in the Capital Region of Denmark is very robust. Our findings regarding the transportation of blood samples demonstrate the high stability of cffDNA. The applicability of a universal standard curve facilitates easy cffDNA quantification.
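
    A universal standard curve converts a measured quantification cycle (Cq) into copies per reaction and then into copies per mL of plasma. The sketch below shows that standard log-linear conversion only; the slope, intercept, and volumes are hypothetical placeholders, not the Danish laboratory's calibration.

```python
def copies_per_ml(cq, slope, intercept, elution_ul, template_ul, plasma_ml):
    """Convert a qPCR quantification cycle (Cq) into target copies per mL of plasma
    using a log-linear standard curve Cq = slope * log10(copies) + intercept."""
    copies_per_reaction = 10 ** ((cq - intercept) / slope)
    # Scale from the PCR reaction up to the full eluate, then to the plasma volume:
    return copies_per_reaction * (elution_ul / template_ul) / plasma_ml

# Hypothetical calibration and volumes, for illustration only:
print(copies_per_ml(cq=37.5, slope=-3.4, intercept=41.0,
                    elution_ul=60.0, template_ul=5.0, plasma_ml=1.0))
# ~130 copies/mL, on the order of the ~100 fetal copies/mL reported above
```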

  3. AGN outflows as neutrino sources: an observational test

    Science.gov (United States)

    Padovani, P.; Turcati, A.; Resconi, E.

    2018-04-01

    We test the recently proposed idea that outflows associated with Active Galactic Nuclei (AGN) could be neutrino emitters in two complementary ways. First, we cross-correlate a list of 94 "bona fide" AGN outflows with the most complete and updated repository of IceCube neutrinos currently publicly available, assembled by us for this purpose. It turns out that AGN with outflows matched to an IceCube neutrino have outflow rates, kinetic energy rates, and bolometric powers larger than those of AGN with outflows not matched to neutrinos. Second, we carry out a statistical analysis on a catalogue of [O III] λ5007 line profiles using a sample of 23,264 AGN at z values (˜6 and 18 per cent respectively, pre-trial) for relatively high velocities and luminosities. Our results are consistent with a scenario where AGN outflows are neutrino emitters but at present do not provide a significant signal. This can be tested with better statistics and source stacking. A predominant role of AGN outflows in explaining the IceCube data appears in any case to be ruled out.

  4. General-Purpose Heat Source Development: Safety Test Program. Postimpact evaluation, Design Iteration Test 3

    International Nuclear Information System (INIS)

    Schonfeld, F.W.; George, T.G.

    1984-07-01

    The General-Purpose Heat Source (GPHS) provides power for space missions by transmitting the heat of ²³⁸PuO₂ decay to thermoelectric elements. Because of the inevitable return of certain aborted missions, the heat source must be designed and constructed to survive both re-entry and Earth impact. The Design Iteration Test (DIT) series is part of an ongoing test program. In the third test (DIT-3), a full GPHS module was impacted at 58 m/s and 930 °C. The module impacted the target at an angle of 30° to the pole of the large faces. The four capsules used in DIT-3 survived impact with minimal deformation; no internal cracks other than in the regions indicated by Savannah River Plant (SRP) preimpact nondestructive testing were observed in any of the capsules. The 30° impact orientation used in DIT-3 was considerably less severe than the flat-on impact utilized in DIT-1 and DIT-2. The four capsules used in DIT-1 survived, while two of the capsules used in DIT-2 breached; a small quantity (approx. 50 μg) of ²³⁸PuO₂ was released from the capsules breached in the DIT-2 impact. All of the capsules used in DIT-1 and DIT-2 were severely deformed and contained large internal cracks. Postimpact analyses of the DIT-3 test components are described, with emphasis on weld structure and the behavior of defects identified by SRP nondestructive testing.

  5. The full spectrum of climate change adaptation: testing an analytical framework in Tyrolean mountain agriculture (Austria).

    Science.gov (United States)

    Grüneis, Heidelinde; Penker, Marianne; Höferl, Karl-Michael

    2016-01-01

    Our scientific view on climate change adaptation (CCA) is unsatisfying in many ways: it is often dominated by a modernistic perspective of planned, pro-active adaptation, with a selective focus on measures directly responding to climate change impacts, and it is thus far from the real-life conditions of those who are actually affected by climate change. Farmers have to simultaneously adapt to multiple changes. Empirical climate change adaptation research therefore also needs a more integrative perspective on real-life climate change adaptations. This also has to consider "hidden" adaptations, which are not explicitly and directly motivated by CCA but actually contribute to the sector's adaptability to climate change. The aim of the present study is to develop and test an analytic framework that contributes to a broader understanding of CCA and to bridge the gap between scientific expertise and practical action. The framework distinguishes three types of CCA according to their climate-related motivations: explicit adaptations, multi-purpose adaptations, and hidden adaptations. Although agriculture is among the sectors that are most affected by climate change, results from the case study of Tyrolean mountain agriculture show that climate change is ranked behind other more pressing "real-life challenges" such as changing agricultural policies or market conditions. We identified numerous hidden adaptations which make a valuable contribution when dealing with climate change impacts. We conclude that these hidden adaptations not only have to be considered to obtain an integrative and more realistic view of CCA; they also provide a great opportunity for linking adaptation strategies to farmers' realities.

  6. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  7. Final report on the proficiency test of the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network

    International Nuclear Information System (INIS)

    Shakhashiro, A.; Radecki, Z.; Trinkl, A.; Sansone, U.; Benesch, T.

    2005-08-01

    This report presents the statistical evaluation of results from the analysis of 12 radionuclides in 8 samples within the frame of the First Proficiency Test of the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network, organized in 2001-2002 by the Chemistry Unit of the Agency's Laboratory in Seibersdorf. The results were evaluated by using appropriate statistical means to assess laboratory analytical performance and to estimate the overall performance for the determination of each radionuclide. Evaluation of the analytical data for gamma-emitting radionuclides showed that 68% of the data obtained a 'Passed' final score for both the trueness and precision criteria applied to this exercise. However, transuranic radionuclides obtained only 58% for the same criteria. (author)

  8. I. Forensic data analysis by pattern recognition. Categorization of white bond papers by elemental composition. II. Source identification of oil spills by pattern recognition analysis of natural elemental composition. III. Improving the reliability of factor analysis of chemically measured analytical data by utilizing the measured analytical uncertainty. IV. Elucidating the structure of some clinical data

    International Nuclear Information System (INIS)

    Duewer, D.L.

    1977-01-01

    Pattern recognition techniques are applied to the analysis of white bond papers and the problem of determining the source of an oil spill. In each case, an elemental analysis by neutron activation is employed. For the determination of source of oil spills, the field sample was weathered prior to activation analysis. A procedure for including measured analytical uncertainty into data analysis methodology is discussed, with particular reference to factor analysis. The suitability of various dispersion matrices and matrix rank determination criteria for data having analytical uncertainty is investigated. A criterion useful for judging the number of factors insensitive to analytical uncertainty is presented. A model data structure for investigating the behavior of factor analysis techniques in a known, controlled manner is described and analyzed. A chemically interesting test data base having analytical uncertainty is analyzed and compared with the model data. The data structure of 22 blood constituents in three categories of liver disease (viral or toxic hepatitis, alcoholic liver diseases and obstructive processes) is studied using various statistical and pattern recognition techniques. Comparison of classification results on the original data, in combination with principal component analysis, suggests a possible underlying structure for the data. This model structure is tested by the application of two simple data transformations. Analysis of the transformed data appears to confirm that some basic understanding of the studied data has been achieved

  9. Influence of test configuration on the combustion characteristics of polymers as ignition sources

    Science.gov (United States)

    Julien, Howard L.

    1993-01-01

    The experimental evaluation of polymers as ignition sources for metals was accomplished at the NASA White Sands Test Facility (WSTF) using a standard promoted combustion test. These tests involve the transient burning of materials in high-pressure oxygen environments. They have provided data from which design decisions can be made; data include video recordings of ignition and non-ignition for specific combinations of metals and polymers. Other tests provide the measured compositions of combustion products for polymers at select burn times and an empirical basis for estimating burn rates. With the current test configuration, the detailed analysis of test results requires modeling a three-dimensional, transient convection process involving fluid motion, thermal conduction and convection, the diffusion of chemical species, and the erosion of the sample surface. At the high-pressure extremes, it even requires the analysis of turbulent, transient convection, where the physics of the problem are not well known and the computational requirements are not practical at this time. An alternative test configuration that can be analyzed with a relatively simple convection model was developed during the summer period. The principal change consists of replacing a large-diameter polymer disk at the end of the metal test rod with coaxial polymer cylinders that have a diameter nearer to that of the metal rod. The experimental objective is to assess the influence of test geometry on the promotion of metal ignition by testing with different lengths of the polymer and, with an extended effort, to analyze the surface combustion in the redesigned promoted combustion tests through analytical modeling of the process. The analysis shall use the results of cone-calorimeter tests of the polymer material to model primary chemical reactions and, with proper design of the promoted combustion test, modeling of the convection process could be conveniently limited to a quasi-steady boundary layer.

  10. Characterization and source term assessments of radioactive particles from Marshall Islands using non-destructive analytical techniques

    Science.gov (United States)

    Jernström, J.; Eriksson, M.; Simon, R.; Tamborini, G.; Bildstein, O.; Marquez, R. Carlos; Kehl, S. R.; Hamilton, T. F.; Ranebo, Y.; Betti, M.

    2006-08-01

    Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized by non-destructive analytical and microanalytical methods. Composition and elemental distribution in the particles were studied with synchrotron radiation based micro X-ray fluorescence spectrometry. A scanning electron microscope equipped with an energy dispersive X-ray detector and a wavelength dispersive system, as well as a secondary ion mass spectrometer, were used to examine particle surfaces. Based on the elemental composition the particles were divided into two groups: particles with a pure Pu matrix, and particles in which the plutonium is included in a Si/O-rich matrix and is more heterogeneously distributed. All of the particles were identified as nuclear fuel fragments of exploded weapon components. Because they contain plutonium with a low 240Pu/239Pu atomic ratio, less than 0.065, which corresponds to weapons-grade plutonium or a detonation with low fission yield, the particles were identified as originating from the safety test and low-yield tests conducted in the history of Runit Island. The Si/O-rich particles contained traces of 137Cs (239+240Pu/137Cs activity ratio higher than 2500), which indicated that a minor fission process occurred during the explosion. The average 241Am/239Pu atomic ratio in the six particles was 3.7 × 10⁻³ ± 0.2 × 10⁻³ (February 2006), which indicated that the plutonium in the different particles had a similar age.
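
    As a side note on the ratios quoted above, atom ratios and activity ratios are interconvertible through the nuclides' half-lives (A = λN). The snippet below is a minimal sketch of that conversion; the half-lives are standard literature values and the example simply reuses the 241Am/239Pu atom ratio reported in the abstract.

```python
# Minimal sketch: convert an atom ratio N1/N2 to an activity ratio A1/A2
# using A = lambda * N, i.e. A1/A2 = (N1/N2) * (T_half_2 / T_half_1).
# Half-lives (years) are standard literature values, not taken from this study.
T_HALF = {"Am-241": 432.6, "Pu-239": 24110.0}

def activity_ratio_from_atom_ratio(atom_ratio, nuclide1, nuclide2):
    return atom_ratio * T_HALF[nuclide2] / T_HALF[nuclide1]

# Example reusing the reported 241Am/239Pu atom ratio of 3.7e-3:
print(activity_ratio_from_atom_ratio(3.7e-3, "Am-241", "Pu-239"))  # ~0.21
```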

  11. Characterization and source term assessments of radioactive particles from Marshall Islands using non-destructive analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jernstroem, J. [Laboratory of Radiochemistry, Department of Chemistry, P.O. Box 55, FI-00014 University of Helsinki (Finland)]. E-mail: jussi.jernstrom@helsinki.fi; Eriksson, M. [IAEA-MEL, International Atomic Energy Agency - Marine Environment Laboratory, 4 Quai Antoine 1er, MC 98000 Monaco (Monaco); Simon, R. [Institute for Synchrotron Radiation, Forschungszentrum Karlsruhe GmbH, D-76021 Karlsruhe (Germany); Tamborini, G. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Bildstein, O. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Marquez, R. Carlos [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Kehl, S.R. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94551-0808 (United States); Hamilton, T.F. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94551-0808 (United States); Ranebo, Y. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany); Betti, M. [European Commission, Joint Research Centre, Institute for Transuranium Elements, P.O. Box 2340, D-76125 Karlsruhe (Germany)]. E-mail: maria.betti@ec.europa.eu

    2006-08-15

    Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized by non-destructive analytical and microanalytical methods. Composition and elemental distribution in the particles were studied with synchrotron radiation based micro X-ray fluorescence spectrometry. A scanning electron microscope equipped with an energy dispersive X-ray detector and a wavelength dispersive system, as well as a secondary ion mass spectrometer, were used to examine particle surfaces. Based on the elemental composition the particles were divided into two groups: particles with a pure Pu matrix, and particles in which the plutonium is included in a Si/O-rich matrix and is more heterogeneously distributed. All of the particles were identified as nuclear fuel fragments of exploded weapon components. Because they contain plutonium with a low {sup 240}Pu/{sup 239}Pu atomic ratio, less than 0.065, which corresponds to weapons-grade plutonium or a detonation with low fission yield, the particles were identified as originating from the safety test and low-yield tests conducted in the history of Runit Island. The Si/O-rich particles contained traces of {sup 137}Cs ({sup 239+240}Pu/{sup 137}Cs activity ratio higher than 2500), which indicated that a minor fission process occurred during the explosion. The average {sup 241}Am/{sup 239}Pu atomic ratio in the six particles was 3.7 x 10{sup -3} {+-} 0.2 x 10{sup -3} (February 2006), which indicated that the plutonium in the different particles had a similar age.

  12. Use of Strain Measurements from Acoustic Bench Tests of the Battleship Flowliner Test Articles To Link Analytical Model Results to In-Service Resonant Response

    Science.gov (United States)

    Frady, Greg; Smaolloey, Kurt; LaVerde, Bruce; Bishop, Jim

    2004-01-01

    The paper will discuss practical and analytical findings of a test program conducted to assist engineers in determining which analytical strain fields are most appropriate for describing the crack-initiating and crack-propagating stresses in thin-walled cylindrical hardware that serves as part of the Space Shuttle Main Engine's fuel system. In service, the hardware is excited by fluctuating dynamic pressures in a cryogenic fuel that arise from turbulent flow/pump cavitation. A bench test using a simplified system was conducted using acoustic energy in air to excite the test articles. Strain measurements were used to reveal response characteristics of two Flowliner test articles that are assembled as a pair when installed in the engine feed system.

  13. Analytical model of asymmetrical Mixed-Mode Bending test of adhesively bonded GFRP joint

    Czech Academy of Sciences Publication Activity Database

    Ševčík, Martin; Hutař, Pavel; Vassilopoulos, Anastasios P.; Shahverdi, M.

    2015-01-01

    Vol. 9, No. 34 (2015), pp. 237-246 ISSN 1971-8993 R&D Projects: GA MŠk(CZ) EE2.3.30.0063; GA ČR GA15-09347S Institutional support: RVO:68081723 Keywords: GFRP materials * Mixed-Mode bending * Fiber bridging * Analytical model Subject RIV: JL - Materials Fatigue, Friction Mechanics

  14. Proficiency Testing by Interlaboratory Comparison Performed in 2010-2015 for Neutron Activation Analysis and Other Analytical Techniques

    International Nuclear Information System (INIS)

    2017-12-01

    The IAEA supports its Member States in increasing the utilization of their research reactors. Small and medium sized reactors are mostly used for neutron activation analysis (NAA). Although the markets for NAA laboratories have been identified, demonstration of valid analytical results and organizational quality of the work process are preconditions for expanding the stakeholder community, particularly in commercial routine application of this powerful technique. The IAEA has implemented a new mechanism for supporting NAA laboratories in demonstrating their analytical performance by participation in proficiency testing schemes by interlaboratory comparison. This activity makes it possible to identify deviations and non-conformities, their causes, and effective approaches to eliminate them. Over 30 laboratories participated between 2010 and 2015 in consecutive proficiency tests organized by the IAEA in conjunction with the Wageningen Evaluating Programmes for Analytical Laboratories (WEPAL) to assess their analytical performance. This publication reports the findings and lessons learned from this activity. An attached CD-ROM contains individual papers from many participating laboratories, sharing their results and the experience gained through this participation.

  15. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  16. Development and application of test apparatus for classification of sealed source

    International Nuclear Information System (INIS)

    Kim, Dong Hak; Seo, Ki Seog; Bang, Kyoung Sik; Lee, Ju Chan; Son, Kwang Je

    2007-01-01

    Sealed sources have to be tested according to the classification requirements for their typical usages, in accordance with the relevant domestic notice standard and ISO 2919. After each test, the source shall be examined visually for loss of integrity and shall pass an appropriate leakage test. The tests used to classify a sealed source are the temperature, external pressure, impact, vibration and puncture tests. The environmental test conditions corresponding to the class numbers are arranged in increasing order of severity. In this study, the apparatus for these tests, except the vibration test, was developed and applied to three kinds of sealed source. The test conditions used to classify a sealed source were stated, the differences between the domestic notice standard and ISO 2919 were considered, and the test apparatus was built. Using the developed apparatus, we conducted the tests for a 192Ir brachytherapy sealed source and two kinds of sealed source for industrial radiography. The 192Ir brachytherapy sealed source is classified as temperature class 5, external pressure class 3, impact class 2, and vibration and puncture class 1. The two kinds of sealed source for industrial radiography are classified as temperature class 4, external pressure class 2, impact and puncture class 5, and vibration class 1. After the tests, a liquid nitrogen bubble test and a vacuum bubble test were performed to evaluate the safety of the sealed sources

  17. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test

    OpenAIRE

    Yang, Hannah P.; Walmer, David K.; Merisier, Delson; Gage, Julia C.; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S.; Castle, Philip E.

    2011-01-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women, which might be used to decide who needs immediate colposcopy in low-resource settings (“triage test”). We found that HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay (at a cutpoint of 5000 viral copies) (Kap...

  18. High-Activity ICP-AES Measurements in the ATALANTE Facility Applied to Analytical Monitoring of an Extraction Test

    Energy Technology Data Exchange (ETDEWEB)

    Esbelin, E.; Boyer-Deslys, V.; Beres, A.; Viallesoubranne, C. [CEA Marcoule, DEN/DRCP/SE2A/LAMM, BP17171, 30207 Bagnols-sur-Ceze (France)

    2008-07-01

    The Material Analysis and Metrology Laboratory (LAMM) of the CEA's ATALANTE complex ensures analytical monitoring of enhanced separation tests. Certain fission products, actinides and lanthanides were assayed by ICP-AES (Inductively Coupled Plasma-Atomic Emission Spectroscopy) in the CBA shielded analysis line. These analyses were particularly effective for controlling the Diamex test, and contributed to its success. The Diamex process consists in extracting the actinides and lanthanides from a Purex raffinate using a diamide, DMDOHEMA, followed by stripping at low acidity. The major elements analyzed during the test were Am, Nd, Mo, Fe, and Zr.

  19. High-Activity ICP-AES Measurements in the ATALANTE Facility Applied to Analytical Monitoring of an Extraction Test

    International Nuclear Information System (INIS)

    Esbelin, E.; Boyer-Deslys, V.; Beres, A.; Viallesoubranne, C.

    2008-01-01

    The Material Analysis and Metrology Laboratory (LAMM) of the CEA's ATALANTE complex ensures analytical monitoring of enhanced separation tests. Certain fission products, actinides and lanthanides were assayed by ICP-AES (Inductively Coupled Plasma-Atomic Emission Spectroscopy) in the CBA shielded analysis line. These analyses were particularly effective for controlling the Diamex test, and contributed to its success. The Diamex process consists in extracting the actinides and lanthanides from a Purex raffinate using a diamide, DMDOHEMA, followed by stripping at low acidity. The major elements analyzed during the test were Am, Nd, Mo, Fe, and Zr

  20. Explosion overpressure test series: General-Purpose Heat Source development: Safety Verification Test program

    International Nuclear Information System (INIS)

    Cull, T.A.; George, T.G.; Pavone, D.

    1986-09-01

    The General-Purpose Heat Source (GPHS) is a modular, radioisotope heat source that will be used in radioisotope thermoelectric generators (RTGs) to supply electric power for space missions. The first two uses will be the NASA Galileo and the ESA Ulysses missions. The RTG for these missions will contain 18 GPHS modules, each of which contains four 238PuO2-fueled clads and generates 250 W(t). A series of Safety Verification Tests (SVTs) was conducted to assess the ability of the GPHS modules to contain the plutonia in accident environments. Because a launch pad or postlaunch explosion of the Space Transportation System vehicle (space shuttle) is a conceivable accident, the SVT plan included a series of tests that simulated the overpressure exposure the RTG and GPHS modules could experience in such an event. Results of these tests, in which we used depleted UO2 as a fuel simulant, suggest that exposure to overpressures as high as 15.2 MPa (2200 psi), without subsequent impact, does not result in a release of fuel

  1. Laboratory test of source encapsulation for decreasing PCB concentrations

    DEFF Research Database (Denmark)

    Kolarik, Barbara; Andersen, Helle Vibeke; Markowicz, Pawel

    2016-01-01

    This study investigates the effect of encapsulation of tertiary PCB sources with PERMASORB™ Adsorber Wallpaper and the surface emissions trap (cTrap) on the indoor air concentration of PCBs and on the PCB content in the source. The 40-week-long laboratory investigation shows a reduction of the air concentration by approx. 90% for both wallpapers, a level comparable to source removal. The potential for extraction of PCBs from the contaminated materials stays unclear for both wallpapers. The cTrap has shown potential to accumulate PCBs; however, the total content of PCB in the investigated sources has apparently increased. The opposite was observed for the PERMASORB™, where the total PCB content in the sources decreased, with only a small concentration of PCBs measured in the wallpaper at the end of the experiment.

  2. Analytical study for frequency effects on the EPRI/USNRC piping component tests. Part 1: Theoretical basis and model development

    International Nuclear Information System (INIS)

    Adams, T.M.; Branch, E.B.; Tagart, S.W. Jr.

    1994-01-01

    As part of the engineering effort for the Advanced Light Water Reactor, the Advanced Reactor Corporation formed a Piping Technical Core Group to develop a set of improved ASME Boiler and Pressure Vessel Code, Section III design rules and approaches for ALWR plant piping and support design. The technical basis for the proposed changes to the ASME Boiler and Pressure Vessel Code developed by the Technical Core Group for the design of piping relies heavily on the failure margins determined from the EPRI/USNRC piping component test program. The majority of the component tests forming the basis for the reported margins against failure were run with input frequency to natural frequency ratios (Ω/ω) in the range of 0.74 to 0.87. One concern investigated by the Technical Core Group was the effect which could exist on measured margins if the tests had been run at higher or lower frequency ratios than those in the limited frequency ratio range tested. Specifically, the concern investigated was that the proposed Technical Core Group Piping Stress Criteria will allow piping to be designed in the low frequency range (Ω/ω ≥ 2.0), for which there is little test data from the EPRI/USNRC test program. The purpose of this analytical study was to: (1) evaluate the potential for margin variation as a function of the frequency ratio (R_ω = Ω/ω, where Ω is the forcing frequency and ω is the natural component frequency), and (2) recommend a margin reduction factor (MRF) that could be applied to margins determined from the EPRI/USNRC test program to adjust those margins for potential margin variation with frequency ratio. Presented in this paper are the analytical approach and methodology, based on inelastic analysis, that formed the basis of the study. Also discussed are the development of the analytical model, the procedure used to benchmark the model against actual test results, and the various parameter studies conducted

  3. A ring test of in vitro neutral detergent fiber digestibility: analytical variability and sample ranking.

    Science.gov (United States)

    Hall, M B; Mertens, D R

    2012-04-01

    In vitro neutral detergent fiber (NDF) digestibility (NDFD) is an empirical measurement of fiber fermentability by rumen microbes. Variation is inherent in all assays and may be increased as multiple steps or differing procedures are used to assess an empirical measure. The main objective of this study was to evaluate variability within and among laboratories of 30-h NDFD values analyzed in repeated runs. Subsamples of alfalfa (n=4), corn forage (n=5), and grass (n=5) ground to pass a 6-mm screen passed a test for homogeneity. The 14 samples were sent to 10 laboratories on 3 occasions over 12 mo. Laboratories ground the samples and ran 1 to 3 replicates of each sample within fermentation run and analyzed 2 or 3 sets of samples. Laboratories used 1 of 2 NDFD procedures: 8 labs used procedures related to the 1970 Goering and Van Soest (GVS) procedure using fermentation vessels or filter bags, and 2 used a procedure with preincubated inoculum (PInc). Means and standard deviations (SD) of sample replicates within run within laboratory (lab) were evaluated with a statistical model that included lab, run within lab, sample, and lab × sample interaction as factors. All factors affected mean values for 30-h NDFD. The lab × sample effect suggests against a simple lab bias in mean values. The SD ranged from 0.49 to 3.37% NDFD and were influenced by lab and run within lab. The GVS procedure gave greater NDFD values than PInc, with an average difference across all samples of 17% NDFD. Because of the differences between GVS and PInc, we recommend using results in contexts appropriate to each procedure. The 95% probability limits for within-lab repeatability and among-lab reproducibility for GVS mean values were 10.2 and 13.4%, respectively. These percentages describe the span of the range around the mean into which 95% of analytical results for a sample fall for values generated within a lab and among labs. This degree of precision was supported in that the average maximum
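
    As background on the repeatability and reproducibility limits quoted above, one standard way to derive them (in the spirit of ISO 5725) is from a one-way analysis of variance across labs, with 95% limits of roughly 2.77 times the corresponding standard deviation. The sketch below uses invented replicate data purely for illustration; it is not the statistical model actually fitted in the study.

```python
import numpy as np

# Invented example data: NDFD (% of NDF) for one sample, 3 labs x 3 replicates.
results = np.array([[48.2, 49.1, 47.8],
                    [51.0, 50.4, 51.6],
                    [46.9, 47.5, 48.3]])

n_labs, n_rep = results.shape
grand_mean = results.mean()
lab_means = results.mean(axis=1)

# One-way ANOVA components (ISO 5725-style decomposition).
ms_within = ((results - lab_means[:, None])**2).sum() / (n_labs * (n_rep - 1))
ms_between = n_rep * ((lab_means - grand_mean)**2).sum() / (n_labs - 1)

s_r2 = ms_within                                   # repeatability variance
s_L2 = max((ms_between - ms_within) / n_rep, 0.0)  # between-lab variance
s_R2 = s_r2 + s_L2                                 # reproducibility variance

# 95% limits: ~2.77 (= 1.96 * sqrt(2)) times the standard deviation.
print("repeatability limit r  =", 2.77 * np.sqrt(s_r2))
print("reproducibility limit R =", 2.77 * np.sqrt(s_R2))
```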

  4. Analytical Analysis and Field Test Investigation of Consolidation for CCSG Pile Composite Foundation in Soft Clay

    Directory of Open Access Journals (Sweden)

    Jin Yu

    2013-01-01

    Low-grade concrete-cored sand-gravel (CCSG) pile composite foundation is a new kind of composite foundation for thick and soft clay ground treatment. An analytical solution was derived for calculating the consolidation process of this composite foundation by considering the coefficient of horizontal permeability in the smear zone, the radial flow within the sand-gravel shell, and the impervious property of the concrete-cored pile. The results show that Terzaghi's one-dimensional consolidation solution and the consolidation analytical solution of an ordinary composite foundation are special cases of this solution. Curves of the average consolidation degree of the composite foundation under various nondimensional parameters were obtained using a program based on the theoretical formula. Meanwhile, a series of in situ measurements, including the settlement of pile and soil, the pore water pressure, and the total stress under embankment load, were obtained for the CCSG pile composite foundation on a section of the Zhenjiang-Liyang highway. The analyzed results show that this new patented composite foundation technology has many advantages, such as small differential post-construction settlement, reliable quality, high bearing capacity, and stability, and that the consolidation of the composite foundation is largely affected by the nondimensional parameters. The analytical solution is finally verified against the actual measurement data.
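
    Since Terzaghi's one-dimensional solution is cited as a limiting case of the derived solution, it may help to recall its average degree of consolidation as a series in the time factor Tv. The short sketch below evaluates that classical series; the truncation length is an arbitrary choice and the snippet is only a reference point, not part of the authors' program.

```python
import math

def terzaghi_avg_consolidation(Tv, n_terms=100):
    """Average degree of consolidation U(Tv) for Terzaghi's 1-D theory:
    U = 1 - sum_{m=0}^inf (2 / M^2) * exp(-M^2 * Tv),  with M = pi * (2m + 1) / 2."""
    U = 1.0
    for m in range(n_terms):
        M = math.pi * (2 * m + 1) / 2.0
        U -= (2.0 / M**2) * math.exp(-M**2 * Tv)
    return U

# Example: the familiar result that Tv ~ 0.197 gives about 50% consolidation.
print(round(terzaghi_avg_consolidation(0.197), 3))  # ~0.50
```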

  5. The wire optical test: a thorough analytical study in and out of caustic surface, and advantages of a dynamical adaptation

    Science.gov (United States)

    Alejandro Juárez-Reyes, Salvador; Sosa-Sánchez, Citlalli Teresa; Silva-Ortigoza, Gilberto; de Jesús Cabrera-Rosas, Omar; Espíndola-Ramos, Ernesto; Ortega-Vidals, Paula

    2018-03-01

    Among the best known non-interferometric optical tests are the wire test, the Foucault test and the Ronchi test with a low-frequency grating. Since the wire test is the seed from which the other ones can be understood, the aim of the present work is to carry out a thorough study of this test for a lens with symmetry of revolution, for any configuration of the object and detection planes in which the planes intersect two, one or no branches of the caustic region (including the marginal and paraxial foci). To this end, we calculated the vectorial representation of the caustic region, and we found the analytical expression for the pattern; we report that the analytical pattern explicitly depends on the magnitude of a branch of the caustic. With the analytical pattern we computed a set of simulations of a dynamical adaptation of the optical wire test. From those simulations, we have carried out a thorough analysis of the topological structure of the pattern, explaining how the multiple image formation process and the image collapse process take place for each configuration, in particular when both the wire and the detection planes are placed inside the caustic region, which has not been studied before. For the first time, we remark that not only are the intersections of the object and detection planes with the caustic important in the change of pattern topology, but also the projection of the intersection between the caustic and the object plane mapped onto the detection plane, and the virtual projection of the intersection between the caustic and the detection plane mapped onto the object plane. We present that, for the new configurations of the optical system, the wire image consists of curves of the Tschirnhausen cubic, piriform and deformed eight-curve types.

  6. Comparison in the analytical performance between krypton and argon glow discharge plasmas as the excitation source for atomic emission spectrometry.

    Science.gov (United States)

    Wagatsuma, Kazuaki

    2009-04-01

    The emission characteristics of ionic lines of nickel, cobalt, and vanadium were investigated when argon or krypton was employed as the plasma gas in glow discharge optical emission spectrometry. A dc Grimm-style lamp was employed as the excitation source. Detection limits of the ionic lines in each iron-matrix alloy sample were compared between the krypton and the argon plasmas. Particularly intense ionic lines were observed in the emission spectra depending on the discharge gas (krypton or argon), such as Co II 258.033 nm for krypton and Co II 231.707 nm for argon. The explanation for this is that collisions with the plasma gases dominantly populate particular excited levels of the cobalt ion, which can selectively receive the internal energy from each gas ion, for example, the 3d(7)4p (3)G(5) level (6.0201 eV) for krypton and the 3d(7)4p (3)G(4) level (8.0779 eV) for argon. In the determination of nickel as well as cobalt in iron-matrix samples, more sensitive ionic lines could be found in the krypton plasma than in the argon plasma. Detection limits in the krypton plasma were 0.0039 mass% Ni for the Ni II 230.299-nm line and 0.002 mass% Co for the Co II 258.033-nm line. However, in the determination of vanadium, the argon plasma had better analytical performance, giving a detection limit of 0.0023 mass% V for the V II 309.310-nm line.
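
    The detection limits quoted above are typically derived from the blank scatter and the calibration sensitivity. Below is a minimal sketch of the common 3-sigma (IUPAC-style) estimate with invented numbers, since the paper's actual calibration data are not reproduced here.

```python
import numpy as np

# Invented example: repeated blank emission intensities (arbitrary units)
# and a calibration slope (intensity units per mass% of analyte).
blank_intensities = np.array([102.0, 98.5, 101.2, 99.8, 100.6, 100.1])
calibration_slope = 2.6e4  # intensity units per mass%

# Common 3-sigma criterion: DL = 3 * s_blank / sensitivity.
detection_limit = 3.0 * blank_intensities.std(ddof=1) / calibration_slope
print(f"detection limit ~ {detection_limit:.1e} mass%")
```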

  7. Analytical solution for the transient wave propagation of a buried cylindrical P-wave line source in a semi-infinite elastic medium with a fluid surface layer

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng

    2018-02-01

    This article develops an analytical solution for the transient wave propagation of a cylindrical P-wave line source in a semi-infinite elastic solid with a fluid layer. The analytical solution is presented in a simple closed form in which each term represents a transient physical wave. The Scholte equation is derived, through which the Scholte wave velocity can be determined. The Scholte wave is the wave that propagates along the interface between the fluid and solid. To develop the analytical solution, the wave fields in the fluid and solid are defined, their analytical solutions in the Laplace domain are derived using the boundary and interface conditions, and the solutions are then decomposed into series form according to the power series expansion method. Each item of the series solution has a clear physical meaning and represents a transient wave path. Finally, by applying Cagniard's method and the convolution theorem, the analytical solutions are transformed into the time domain. Numerical examples are provided to illustrate some interesting features in the fluid layer, the interface and the semi-infinite solid. When the P-wave velocity in the fluid is higher than that in the solid, two head waves in the solid, one head wave in the fluid and a Scholte wave at the interface are observed for the cylindrical P-wave line source.

  8. Explosion Source Phenomena Using Soviet, Test-Era, Waveform Data

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Paul G.; Rautian, Tatyana G.; Khalturin, Vitaly I.; Phillips, W. Scott

    2006-04-12

    During the nuclear testing era, the former Soviet Union carried out extensive observations of underground nuclear explosions, recording both their own shots and those of foreign nuclear states. Between 1961 and 1989, the Soviet Complex Seismological Expedition deployed seismometers at time-varying subsets of over 150 sites to record explosions at regional distances from the Semipalatinsk and Lop Nor test sites and from the shot points of peaceful nuclear explosions. This data set included recordings from broadband, multi-channel ChISS seismometers that produced a series of narrow band outputs, which could then be measured to perform spectral studies. [ChISS is the Russian abbreviation for multichannel spectral seismometer. In this instrument the signal from the seismometer is passed through a system of narrow bandpass filters and recorded on photo paper. ChISS instruments have from 8 to 16 channels in the frequency range from 100 sec to 40 Hz. We used data mostly from 7 channels, ranging from 0.08 to 5 Hz.] Quantitative, pre-digital era investigations of high-frequency source scaling relied on this type of data. To augment data sets of central Central Asia explosions, we have measured and compiled 537 ChISS coda envelopes for 124 events recorded at Talgar, Kazakhstan, at a distance of about 750 km from Semipalatinsk. Envelopes and calibration levels were measured manually from photo paper records for seven bands between 0.08 and 5 Hz. We obtained from 2 to 10 coda envelope measurements per event, depending on the event size and instrument magnification. Coda lengths varied from 250 to 1400 s. For small events, only bands between 0.6 and 2.5 Hz could be measured. Envelope levels were interpolated or extrapolated to 500 s and we have obtained the dependence of this quantity on magnitude. Coda Q was estimated and found to increase from 232 at 0.08 Hz to 1270 at 5 Hz. These relationships were used to construct an average scaling law of coda spectra for Semipalatinsk
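
    For context on how coda Q is commonly extracted from narrow-band envelopes such as the ChISS records, the single-scattering coda model predicts A(t|f) proportional to t^-1 exp(-pi f t / Q), so Q follows from a straight-line fit of ln(A*t) against lapse time. The snippet below is an illustrative sketch with synthetic data, not the Expedition's actual processing code.

```python
import numpy as np

def estimate_coda_q(t, amplitude, freq):
    """Fit ln(A * t) = c - (pi * f / Q) * t (single-scattering coda model)
    and return the coda Q estimate."""
    slope, _ = np.polyfit(t, np.log(amplitude * t), 1)
    return -np.pi * freq / slope

# Synthetic coda envelope at 1 Hz with Q = 400, plus a little noise.
rng = np.random.default_rng(0)
t = np.linspace(100.0, 500.0, 50)          # lapse time, s
true_q, f = 400.0, 1.0
amp = (1.0 / t) * np.exp(-np.pi * f * t / true_q) * np.exp(rng.normal(0, 0.02, t.size))
print(round(estimate_coda_q(t, amp, f)))   # close to 400
```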

  9. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification.

  10. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    International Nuclear Information System (INIS)

    1995-01-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification

  11. Predictive Big Data Analytics: A Study of Parkinson’s Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    Science.gov (United States)

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.

    2016-01-01

    Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data–large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources–all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several
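
    Point (i) above, rebalancing imbalanced cohorts, can be done in many ways; the sketch below shows one simple, generic strategy (random oversampling of the minority class) purely to make the idea concrete. It is not the authors' specific procedure, and the data and labels are invented.

```python
import numpy as np

def random_oversample(X, y, seed=0):
    """Naively rebalance a binary cohort by resampling the minority class
    with replacement until both classes have equal counts."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    minority = classes[np.argmin(counts)]
    deficit = counts.max() - counts.min()
    idx = np.flatnonzero(y == minority)
    extra = rng.choice(idx, size=deficit, replace=True)
    keep = np.concatenate([np.arange(len(y)), extra])
    return X[keep], y[keep]

# Invented toy cohort: 8 controls (0) vs 2 cases (1).
X = np.arange(20).reshape(10, 2)
y = np.array([0] * 8 + [1] * 2)
Xb, yb = random_oversample(X, y)
print(np.bincount(yb))  # -> [8 8]
```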

  12. Geodesics of electrically and magnetically charged test particles in the Reissner-Nordstroem space-time: Analytical solutions

    International Nuclear Information System (INIS)

    Grunau, Saskia; Kagramanova, Valeria

    2011-01-01

    We present the full set of analytical solutions of the geodesic equations of charged test particles in the Reissner-Nordstroem space-time in terms of the Weierstrass ℘, σ, and ζ elliptic functions. Based on the study of the polynomials in the θ and r equations, we characterize the motion of test particles and discuss their properties. The motion of charged test particles in the Reissner-Nordstroem space-time is compared with the motion of neutral test particles in the field of a gravitomagnetic monopole. Electrically or magnetically charged particles in the Reissner-Nordstroem space-time with magnetic or electric charges, respectively, move on cones similar to neutral test particles in the Taub-NUT space-times.

  13. Comparison of in-plant performance test data with analytic prediction of reactor safety system injection transient (U)

    International Nuclear Information System (INIS)

    Roy, B.N.; Neill, C.H. Jr.

    1993-01-01

    This paper compares the performance test data from injection transients for both of the subsystems of the Supplementary Safety System of the Savannah River Site production reactor with analytical predictions from an in-house thermal hydraulic computer code. The code was initially developed for design validation of the new Supplementary Safety System subsystem, but is shown to be equally capable of predicting the performance of the Supplementary Safety System existing subsystem even though the two subsystem transient injections have marked differences. The code itself was discussed and its validation using prototypic tests with simulated fluids was reported in an earlier paper (Roy and Nomm 1991)

  14. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings

    Directory of Open Access Journals (Sweden)

    Piérard GE

    2015-03-01

    Gérald E Piérard,1,2 Justine Courtois,1 Caroline Ritacco,1 Philippe Humbert,2,3 Ferial Fanian,3 Claudine Piérard-Franchimont1,4,5 1Laboratory of Skin Bioengineering and Imaging (LABIC), Department of Clinical Sciences, Liège University, Liège, Belgium; 2University of Franche-Comté, Besançon, France; 3Department of Dermatology, University Hospital Saint-Jacques, Besançon, France; 4Department of Dermatopathology, Unilab Lg, University Hospital of Liège, Liège, Belgium; 5Department of Dermatology, Regional Hospital of Huy, Huy, Belgium. Background: In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSS). Methods: Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results: With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion: A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allows for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. Keywords: irritation, morphometry, quantitative morphology, stripping

  15. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Naval Sea Systems Command Acquisition Cycle Time: Defining the Problem (David Tate, Institute for Defense Analyses); Schedule Analytics (Jennifer...). The research was comprised of the following high-level steps: identify and review primary data sources... However, detailed reviews of the OMB IT Dashboard data revealed that its schedule data is highly aggregated, limited to program start date and program end date.

  16. The effect of testing versus restudy on retention: a meta-analytic review of the testing effect.

    Science.gov (United States)

    Rowland, Christopher A

    2014-11-01

    Engaging in a test over previously studied information can serve as a potent learning event, a phenomenon referred to as the testing effect. Despite a surge of research in the past decade, existing theories have not yet provided a cohesive account of testing phenomena. The present study uses meta-analysis to examine the effects of testing versus restudy on retention. Key results indicate support for the role of effortful processing as a contributor to the testing effect, with initial recall tests yielding larger testing benefits than recognition tests. Limited support was found for existing theoretical accounts attributing the testing effect to enhanced semantic elaboration, indicating that consideration of alternative mechanisms is warranted in explaining testing effects. Future theoretical accounts of the testing effect may benefit from consideration of episodic and contextually derived contributions to retention resulting from memory retrieval. Additionally, the bifurcation model of the testing effect is considered as a viable framework from which to characterize the patterns of results present across the literature. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
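
    Meta-analyses of this kind typically aggregate standardized mean differences across studies. As a reference point, the sketch below computes a bias-corrected standardized mean difference (Hedges' g) for a single hypothetical testing-versus-restudy comparison; the formula is the standard small-sample correction and the numbers are invented, not drawn from the review.

```python
import math

def hedges_g(mean_test, sd_test, n_test, mean_restudy, sd_restudy, n_restudy):
    """Standardized mean difference with Hedges' small-sample correction."""
    pooled_sd = math.sqrt(((n_test - 1) * sd_test**2 + (n_restudy - 1) * sd_restudy**2)
                          / (n_test + n_restudy - 2))
    d = (mean_test - mean_restudy) / pooled_sd
    correction = 1.0 - 3.0 / (4.0 * (n_test + n_restudy) - 9.0)
    return d * correction

# Invented example: final-test recall of 68% (SD 12, n = 30) after testing
# versus 58% (SD 14, n = 30) after restudy.
print(round(hedges_g(68, 12, 30, 58, 14, 30), 2))  # ~0.76
```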

  17. Multiplicity distributions of gluon and quark jets and tests of QCD analytic predictions

    Science.gov (United States)

    OPAL Collaboration; Ackerstaff, K.; et al.

    Gluon jets are identified in e+e^- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The charged particle multiplicity distribution of the gluon jets is presented, and is analyzed for its mean, dispersion, skew, and kurtosis values, and for its factorial and cumulant moments. The results are compared to the analogous results found for a sample of light quark (uds) jets, also defined inclusively. We observe differences between the mean, skew and kurtosis values of gluon and quark jets, but not between their dispersions. The cumulant moment results are compared to the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the data compared to a next-to-leading order calculation without energy conservation. There is agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets.
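
    For reference, the factorial and cumulant moments mentioned here are conventionally defined from the multiplicity distribution P(n) as follows (a standard formulation, not quoted from the paper):

```latex
% Normalized factorial moments F_q and factorial cumulant moments K_q
% of a charged-particle multiplicity distribution (standard definitions).
F_q = \frac{\langle n(n-1)\cdots(n-q+1)\rangle}{\langle n\rangle^{q}},
\qquad
K_q = F_q - \sum_{m=1}^{q-1}\binom{q-1}{m}\, K_{q-m}\, F_m .
```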

  18. Multiplicity distributions of gluon and quark jets and tests of QCD analytic predictions

    CERN Document Server

    Ackerstaff, K; Allison, J; Altekamp, N; Anderson, K J; Anderson, S; Arcelli, S; Asai, S; Axen, D A; Azuelos, Georges; Ball, A H; Barberio, E; Barlow, R J; Bartoldus, R; Batley, J Richard; Baumann, S; Bechtluft, J; Beeston, C; Behnke, T; Bell, A N; Bell, K W; Bella, G; Bentvelsen, Stanislaus Cornelius Maria; Bethke, Siegfried; Biebel, O; Biguzzi, A; Bird, S D; Blobel, Volker; Bloodworth, Ian J; Bloomer, J E; Bobinski, M; Bock, P; Bonacorsi, D; Boutemeur, M; Bouwens, B T; Braibant, S; Brigliadori, L; Brown, R M; Burckhart, Helfried J; Burgard, C; Bürgin, R; Capiluppi, P; Carnegie, R K; Carter, A A; Carter, J R; Chang, C Y; Charlton, D G; Chrisman, D; Clarke, P E L; Cohen, I; Conboy, J E; Cooke, O C; Cuffiani, M; Dado, S; Dallapiccola, C; Dallavalle, G M; Davis, R; De Jong, S; del Pozo, L A; Desch, Klaus; Dienes, B; Dixit, M S; do Couto e Silva, E; Doucet, M; Duchovni, E; Duckeck, G; Duerdoth, I P; Eatough, D; Edwards, J E G; Estabrooks, P G; Evans, H G; Evans, M; Fabbri, Franco Luigi; Fanti, M; Faust, A A; Fiedler, F; Fierro, M; Fischer, H M; Fleck, I; Folman, R; Fong, D G; Foucher, M; Fürtjes, A; Futyan, D I; Gagnon, P; Gary, J W; Gascon, J; Gascon-Shotkin, S M; Geddes, N I; Geich-Gimbel, C; Geralis, T; Giacomelli, G; Giacomelli, P; Giacomelli, R; Gibson, V; Gibson, W R; Gingrich, D M; Glenzinski, D A; Goldberg, J; Goodrick, M J; Gorn, W; Grandi, C; Gross, E; Grunhaus, Jacob; Gruwé, M; Hajdu, C; Hanson, G G; Hansroul, M; Hapke, M; Hargrove, C K; Hart, P A; Hartmann, C; Hauschild, M; Hawkes, C M; Hawkings, R; Hemingway, Richard J; Herndon, M; Herten, G; Heuer, R D; Hildreth, M D; Hill, J C; Hillier, S J; Hobson, P R; Homer, R James; Honma, A K; Horváth, D; Hossain, K R; Howard, R; Hüntemeyer, P; Hutchcroft, D E; Igo-Kemenes, P; Imrie, D C; Ingram, M R; Ishii, K; Jawahery, A; Jeffreys, P W; Jeremie, H; Jimack, Martin Paul; Joly, A; Jones, C R; Jones, G; Jones, M; Jost, U; Jovanovic, P; Junk, T R; Karlen, D A; Kartvelishvili, V G; Kawagoe, K; Kawamoto, T; Kayal, P I; Keeler, Richard K; Kellogg, R G; Kennedy, B W; Kirk, J; Klier, A; Kluth, S; Kobayashi, T; Kobel, M; Koetke, D S; Kokott, T P; Kolrep, M; Komamiya, S; Kress, T; Krieger, P; Von Krogh, J; Kyberd, P; Lafferty, G D; Lahmann, R; Lai, W P; Lanske, D; Lauber, J; Lautenschlager, S R; Layter, J G; Lazic, D; Lee, A M; Lefebvre, E; Lellouch, Daniel; Letts, J; Levinson, L; Lloyd, S L; Loebinger, F K; Long, G D; Losty, Michael J; Ludwig, J; Macchiolo, A; MacPherson, A L; Mannelli, M; Marcellini, S; Markus, C; Martin, A J; Martin, J P; Martínez, G; Mashimo, T; Mättig, P; McDonald, W J; McKenna, J A; McKigney, E A; McMahon, T J; McPherson, R A; Meijers, F; Menke, S; Merritt, F S; Mes, H; Meyer, J; Michelini, Aldo; Mikenberg, G; Miller, D J; Mincer, A; Mir, R; Mohr, W; Montanari, A; Mori, T; Morii, M; Müller, U; Mihara, S; Nagai, K; Nakamura, I; Neal, H A; Nellen, B; Nisius, R; O'Neale, S W; Oakham, F G; Odorici, F; Ögren, H O; Oh, A; Oldershaw, N J; Oreglia, M J; Orito, S; Pálinkás, J; Pásztor, G; Pater, J R; Patrick, G N; Patt, J; Pearce, M J; Pérez-Ochoa, R; Petzold, S; Pfeifenschneider, P; Pilcher, J E; Pinfold, J L; Plane, D E; Poffenberger, P R; Poli, B; Posthaus, A; Rees, D L; Rigby, D; Robertson, S; Robins, S A; Rodning, N L; Roney, J M; Rooke, A M; Ros, E; Rossi, A M; Routenburg, P; Rozen, Y; Runge, K; Runólfsson, O; Ruppel, U; Rust, D R; Rylko, R; Sachs, K; Saeki, T; Sarkisyan-Grinbaum, E; Sbarra, C; Schaile, A D; Schaile, O; Scharf, F; Scharff-Hansen, P; Schenk, P; Schieck, J; Schleper, P; Schmitt, B; Schmitt, S; Schöning, A; 
Schröder, M; Schultz-Coulon, H C; Schumacher, M; Schwick, C; Scott, W G; Shears, T G; Shen, B C; Shepherd-Themistocleous, C H; Sherwood, P; Siroli, G P; Sittler, A; Skillman, A; Skuja, A; Smith, A M; Snow, G A; Sobie, Randall J; Söldner-Rembold, S; Springer, R W; Sproston, M; Stephens, K; Steuerer, J; Stockhausen, B; Stoll, K; Strom, D; Szymanski, P; Tafirout, R; Talbot, S D; Tanaka, S; Taras, P; Tarem, S; Teuscher, R; Thiergen, M; Thomson, M A; Von Törne, E; Towers, S; Trigger, I; Trócsányi, Z L; Tsur, E; Turcot, A S; Turner-Watson, M F; Utzat, P; Van Kooten, R; Verzocchi, M; Vikas, P; Vokurka, E H; Voss, H; Wäckerle, F; Wagner, A; Ward, C P; Ward, D R; Watkins, P M; Watson, A T; Watson, N K; Wells, P S; Wermes, N; White, J S; Wilkens, B; Wilson, G W; Wilson, J A; Wolf, G; Wyatt, T R; Yamashita, S; Yekutieli, G; Zacek, V; Zer-Zion, D

    1999-01-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The charged particle multiplicity distribution of the gluon jets is presented, and is analyzed for its mean, dispersion, skew, and kurtosis values, and for its factorial and cumulant moments. The results are compared to the analogous results found for a sample of light quark (uds) jets, also defined inclusively. We observe differences between the mean, skew and kurtosis values of gluon and quark jets, but not between their dispersions. The cumulant moment results are compared to the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observe...

  19. Analytical and numerical investigation of trolleybus vertical dynamics on an artificial test track

    Directory of Open Access Journals (Sweden)

    Polach P.

    2009-12-01

    Two virtual models of the ŠKODA 21 Tr low-floor trolleybus, intended for the investigation of vertical dynamic properties during simulated driving on an uneven road surface, are presented in the article. In order to solve the vertical vibrations analytically, a trolleybus model formed by a system of four rigid bodies with seven degrees of freedom coupled by spring-damper elements is used. The influence of the asymmetry of the sprung mass, linear viscous damping and a general kinematic excitation of the wheels is incorporated in the model. The analytical approach to solving the vibrations of the ŠKODA 21 Tr low-floor trolleybus model is a suitable complement to the model based on a numerical solution. Vertical vibrations are numerically solved on a trolleybus multibody model created in the alaska simulation tool. Both virtual trolleybus models are used for simulations of driving on a track composed of vertical obstacles. Conclusions concerning the effects of using linear and nonlinear spring-damper element characteristics are also given.

  20. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four /sup 238/PuO/sub 2/-fueled clads and generates 250 W/sub (t)/. Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  1. General-Purpose Heat Source Safety Verification Test program: Edge-on flyer plate tests

    International Nuclear Information System (INIS)

    George, T.G.

    1987-03-01

    The radioisotope thermoelectric generator (RTG) that will supply power for the Galileo and Ulysses space missions contains 18 General-Purpose Heat Source (GPHS) modules. The GPHS modules provide power by transmitting the heat of 238Pu α-decay to an array of thermoelectric elements. Each module contains four 238PuO2-fueled clads and generates 250 W(t). Because the possibility of a launch vehicle explosion always exists, and because such an explosion could generate a field of high-energy fragments, the fueled clads within each GPHS module must survive fragment impact. The edge-on flyer plate tests were included in the Safety Verification Test series to provide information on the module/clad response to the impact of high-energy plate fragments. The test results indicate that the edge-on impact of a 3.2-mm-thick, aluminum-alloy (2219-T87) plate traveling at 915 m/s causes the complete release of fuel from capsules contained within a bare GPHS module, and that the threshold velocity sufficient to cause the breach of a bare, simulant-fueled clad impacted by a 3.5-mm-thick, aluminum-alloy (5052-T0) plate is approximately 140 m/s

  2. Testing Special Relativity at High Energies with Astrophysical Sources

    Science.gov (United States)

    Stecker, F. W.

    2007-01-01

    Since the group of Lorentz boosts is unbounded, there is a question as to whether Lorentz invariance (LI) holds to infinitely short distances. However, special and general relativity may break down at the Planck scale. Various quantum gravity scenarios such as loop quantum gravity, as well as some forms of string theory and extra dimension models, may imply Lorentz violation (LV) at ultrahigh energies. The Gamma-Ray Large Area Space Telescope (GLAST), to be launched in mid-December, will measure the spectra of distant extragalactic sources of high energy gamma-rays, particularly active galactic nuclei and gamma-ray bursts. GLAST can look for energy-dependent gamma-ray propagation effects from such sources as a signal of Lorentz invariance violation. These sources may also exhibit the high energy cutoffs predicted to be the result of intergalactic annihilation interactions with low energy photons having a flux level as determined by various astronomical observations. With LV, the threshold for such interactions can be significantly raised, changing the predicted absorption turnover in the observed spectrum of the sources. Stecker and Glashow have shown that the existence of such absorption features in the spectra of extragalactic sources puts constraints on LV. Such constraints have important implications for some quantum gravity and large extra dimension models. Future spaceborne detectors dedicated to measuring gamma-ray polarization can look for birefringence effects as a possible signal of loop quantum gravity. A very small LV may also result in the modification or elimination of the GZK effect, thus modifying the spectrum of ultrahigh energy cosmic rays. This possibility can be explored with ground-based arrays such as Auger or with a space-based detector system such as the proposed OWL satellite mission.

  3. Sex Differences in Objective and Projective Dependency Tests: A Meta-Analytic Review.

    Science.gov (United States)

    Bornstein, Robert F.

    1995-01-01

    A meta-analysis of 97 studies published since 1950 that assessed sex differences in scores on objective and projective dependency tests indicated that women consistently obtained higher dependency scores on objective tests, and men obtained higher scores on projective tests. Findings are discussed in terms of sex role socialization. (SLD)

  4. Identification of Evidence for Key Parameters in Decision-Analytic Models of Cost Effectiveness: A Description of Sources and a Recommended Minimum Search Requirement.

    Science.gov (United States)

    Paisley, Suzy

    2016-06-01

    This paper proposes recommendations for a minimum level of searching for data for key parameters in decision-analytic models of cost effectiveness and describes sources of evidence relevant to each parameter type. Key parameters are defined as treatment effects, adverse effects, costs, resource use, health state utility values (HSUVs) and baseline risk of events. The recommended minimum requirement for treatment effects is comprehensive searching according to available methodological guidance. For other parameter types, the minimum is the searching of one bibliographic database plus, where appropriate, specialist sources and non-research-based and non-standard format sources. The recommendations draw on the search methods literature and on existing analyses of how evidence is used to support decision-analytic models. They take account of the range of research and non-research-based sources of evidence used in cost-effectiveness models and of the need for efficient searching. Consideration is given to what constitutes best evidence for the different parameter types in terms of design and scientific quality and to making transparent the judgments that underpin the selection of evidence from the options available. Methodological issues are discussed, including the differences between decision-analytic models of cost effectiveness and systematic reviews when searching and selecting evidence and comprehensive versus sufficient searching. Areas are highlighted where further methodological research is required.

  5. Analytical evaluation on loss of off-site electric power simulation of the High Temperature Engineering Test Reactor

    International Nuclear Information System (INIS)

    Takeda, Takeshi; Nakagawa, Shigeaki; Tachibana, Yukio; Takada, Eiji; Kunitomi, Kazuhiko

    2000-03-01

    A rise-to-power test of the high temperature engineering test reactor (HTTR) started on September 28, 1999, to establish and upgrade the technological basis for the high temperature gas-cooled reactor (HTGR). A loss of off-site electric power test of the HTTR from normal operation at 15 and 30 MW thermal power will be carried out during the rise-to-power test. Analytical evaluations of the transient behavior of the reactor and plant during the loss of off-site electric power were conducted. These evaluations are proposed as benchmark problems for the IAEA coordinated research program on 'Evaluation of HTGR Performance'. This report describes the event scenario of the transient during the loss of off-site electric power, the outline of the major components and systems, the detailed thermal and nuclear data sets for these problems, and pre-test predictions for the benchmark problems obtained with the analytical code 'ACCORD' for in-core and plant dynamics of the HTGR. (author)

  6. Testing the Effectiveness of Cognitive Analytic Therapy for Hypersexuality Disorder: An Intensive Time-Series Evaluation.

    Science.gov (United States)

    Kellett, Stephen; Simmonds-Buckley, Mel; Totterdell, Peter

    2017-08-18

    The evidence base for treatment of hypersexuality disorder (HD) has few studies with appropriate methodological rigor. This study therefore conducted a single case experiment of cognitive analytic therapy (CAT) for HD using an A/B design with extended follow-up. Cruising, pornography usage, masturbation frequency and associated cognitions and emotions were measured daily in a 231-day time series. Following a three-week assessment baseline (A: 21 days), treatment was delivered via outpatient sessions (B: 147 days), with the follow-up period lasting 63 days. Results show that cruising and pornography usage extinguished. The total sexual outlet score no longer met caseness, and the primary nomothetic hypersexuality outcome measure met recovery criteria. Reduced pornography consumption was mediated by reduced obsessionality and greater interpersonal connectivity. The utility of the CAT model for intimacy problems shows promise. Directions for future HD outcome research are also provided.

  7. Testing Convergence of Different Free-Energy Methods in a Simple Analytical System with Hidden Barriers

    Directory of Open Access Journals (Sweden)

    S. Alexis Paz

    2018-03-01

    In this work, we study the influence of hidden barriers on the convergence behavior of three free-energy calculation methods: well-tempered metadynamics (WTMD), adaptive-biasing forces (ABF), and on-the-fly parameterization (OTFP). We construct a simple two-dimensional potential-energy surface (PES) that allows for an exact analytical result for the free energy along any one-dimensional order parameter. We then choose different collective-variable (CV) definitions and PES parameters to create three different systems with increasing sampling challenges. We find that none of the three methods is greatly affected by the hidden barriers in the simplest case considered. The adaptive sampling methods sample faster, while the auxiliary high-friction requirement of OTFP makes it slower for this case. However, a slight change in the CV definition has a strong impact on the ABF and WTMD performance, illustrating the importance of choosing suitable collective variables.
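
    As a minimal illustration of the kind of reference calculation described above, the sketch below computes an exact one-dimensional free-energy profile along an order parameter by numerically integrating the Boltzmann weight of an analytical two-dimensional potential over the orthogonal coordinate. The double-well potential, temperature and grids are hypothetical placeholders, not the PES or collective variables used in the study.

        import numpy as np

        # Hypothetical 2D potential-energy surface (illustrative, not the study's PES):
        # a double well along x weakly coupled to a harmonic coordinate y.
        def pes(x, y):
            return (x**2 - 1.0)**2 + 0.5 * (y - 0.5 * x)**2

        kT = 0.25                              # thermal energy in reduced units
        x = np.linspace(-2.0, 2.0, 401)        # order parameter (collective variable) grid
        y = np.linspace(-4.0, 4.0, 801)        # orthogonal coordinate grid
        X, Y = np.meshgrid(x, y, indexing="ij")

        # Exact free energy along x: F(x) = -kT * ln of the integral of exp(-U(x, y)/kT) over y
        boltzmann = np.exp(-pes(X, Y) / kT)
        free_energy = -kT * np.log(np.trapz(boltzmann, y, axis=1))
        free_energy -= free_energy.min()       # shift so the global minimum is zero

        print(free_energy[::100])              # reference profile for checking sampled estimates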

  8. Tests of the TRAC code against known analytical solutions for stratified flow

    International Nuclear Information System (INIS)

    Black, P.S.; Leslie, D.C.; Hewitt, G.F.

    1987-01-01

    The area averaged equations for gas-liquid flow are briefly summarized and related, for the specific case of stratified flow, to the shallow water equations commonly used in hydraulics. These equations are then compared to the equations used in TRAC-PF/MOD1 and are shown to differ in their treatment of the gravity head terms. A modification of the TRAC code is therefore necessary to bring it into line with established shallow water theory. The corrected form of the code was compared with a number of specific cases, each of which throws further light on the code behavior. The following areas are discussed in the paper: (1) the dam break problem; (2) Kelvin-Helmholtz instability; (3) counter-current flow; and (4) slug flow. It is concluded that detailed comparisons of the code with known analytic solutions and with a number of the more complex phenomenological experiments can give useful insights into its behavior
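
    For orientation, the sketch below evaluates one classical analytical benchmark of the kind referred to above: Ritter's shallow-water solution for an instantaneous dam break onto a dry, frictionless bed. The reservoir depth, spatial points and time are illustrative values, not conditions from the TRAC comparisons.

        import numpy as np

        g = 9.81                    # gravitational acceleration, m/s^2
        h0 = 1.0                    # illustrative reservoir depth upstream of the dam, m
        c0 = np.sqrt(g * h0)        # initial wave celerity

        def ritter(x, t):
            """Depth and velocity for a dam break at x = 0 over a dry, frictionless bed."""
            h = np.where(x < -c0 * t, h0, 0.0)                  # undisturbed reservoir / dry bed
            fan = (x >= -c0 * t) & (x <= 2.0 * c0 * t)          # rarefaction fan
            h = np.where(fan, (2.0 * c0 - x / t) ** 2 / (9.0 * g), h)
            u = np.where(fan, 2.0 / 3.0 * (x / t + c0), 0.0)
            return h, u

        x = np.linspace(-5.0, 10.0, 7)
        h, u = ritter(x, t=1.0)
        print(np.round(h, 3))
        print(np.round(u, 3))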

  9. 10 CFR 34.27 - Leak testing and replacement of sealed sources.

    Science.gov (United States)

    2010-01-01

    10 Energy, 2010 edition: Safety Requirements for Industrial Radiographic Operations, Equipment, § 34.27 Leak testing and replacement of sealed sources. ... replacement ... radiographic exposure device and leak testing of any sealed source must be performed by persons authorized to ...

  10. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    Science.gov (United States)

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. A test is therefore needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. The test was first introduced in 2009 by Monajemi et al in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for its format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools; rather, it complements them in strategies for assessing clinical reasoning comprehensively.

  11. Doubling immunochemistry laboratory testing efficiency with the cobas e 801 module while maintaining consistency in analytical performance.

    Science.gov (United States)

    Findeisen, P; Zahn, I; Fiedler, G M; Leichtle, A B; Wang, S; Soria, G; Johnson, P; Henzell, J; Hegel, J K; Bendavid, C; Collet, N; McGovern, M; Klopprogge, K

    2018-06-04

    The new immunochemistry cobas e 801 module (Roche Diagnostics) was developed to meet increasing demands on routine laboratories to further improve testing efficiency, while maintaining high quality and reliable data. During a non-interventional multicenter evaluation study, the overall performance, functionality and reliability of the new module were investigated under routine-like conditions. It was tested as a dedicated immunochemistry system at four sites and as a consolidator combined with clinical chemistry at three sites. We report on the testing efficiency and analytical performance of the new module. Evaluation of sample workloads with site-specific routine request patterns demonstrated increased speed and almost doubled throughput (up to 300 tests per hour), revealing that one cobas e 801 module can replace two cobas e 602 modules while saving up to 44% floor space. Result stability was demonstrated by QC analysis per assay throughout the study. Precision testing over 21 days yielded excellent results within and between labs, and method comparison against routine results from the cobas e 602 module showed high consistency for all assays under study. In a practicability assessment related to performance and handling, 99% of graded features met (44%) or even exceeded (55%) laboratory expectations, with enhanced reagent management and loading during operation being highlighted. By nearly doubling immunochemistry testing efficiency on the same footprint as a cobas e 602 module, the new module has great potential to further consolidate and enhance laboratory testing while maintaining high quality analytical performance with Roche platforms. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  12. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    Science.gov (United States)

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  13. The Testing Effect in the Psychology Classroom: A Meta-Analytic Perspective

    Science.gov (United States)

    Schwieren, Juliane; Barenberg, Jonathan; Dutke, Stephan

    2017-01-01

    The testing effect is a robust empirical finding in the research on learning and instruction, demonstrating that taking tests during the learning phase facilitates later retrieval from long-term memory. Early evidence came mainly from laboratory studies, though in recent years applied educational researchers have become increasingly interested in…

  14. Designing and Testing Energy Harvesters Suitable for Renewable Power Sources

    Science.gov (United States)

    Synkiewicz, B.; Guzdek, P.; Piekarski, J.; Zaraska, K.

    2016-01-01

    Energy harvesters convert waste power (heat, light and vibration) directly to electric power. Fast progress in their technology, design and areas of application (e.g. the “Internet of Things”) has been observed recently. Their effectiveness is steadily growing, which makes their application to powering sensor networks with wireless data transfer reasonable. The main advantage is independence from wired power sources, which is especially important for monitoring environmental parameters. In this paper we describe the design and realization of a gas sensor monitoring CO level (powered by a TEG) and two autonomous power supply modules, designed and constructed at ITE, powered by modern photovoltaic cells.

  15. Designing and Testing Energy Harvesters Suitable for Renewable Power Sources

    International Nuclear Information System (INIS)

    Synkiewicz, B.; Guzdek, P.; Piekarski, J.; Zaraska, K.

    2016-01-01

    Energy harvesters convert waste power (heat, light and vibration) directly to electric power. Fast progress in their technology, design and areas of application (e.g. the “Internet of Things”) has been observed recently. Their effectiveness is steadily growing, which makes their application to powering sensor networks with wireless data transfer reasonable. The main advantage is independence from wired power sources, which is especially important for monitoring environmental parameters. In this paper we describe the design and realization of a gas sensor monitoring CO level (powered by a TEG) and two autonomous power supply modules, designed and constructed at ITE, powered by modern photovoltaic cells.

  16. SPECTRAL INDEX AS A FUNCTION OF MASS ACCRETION RATE IN BLACK HOLE SOURCES: MONTE CARLO SIMULATIONS AND AN ANALYTICAL DESCRIPTION

    International Nuclear Information System (INIS)

    Laurent, Philippe; Titarchuk, Lev

    2011-01-01

    We present herein a theoretical study of correlations between spectral indexes of X-ray emergent spectra and mass accretion rate (ṁ) in black hole (BH) sources, which provide a definitive signature for BHs. It has been firmly established, using the Rossi X-ray Timing Explorer (RXTE) in numerous BH observations during hard-soft state spectral evolution, that the photon index of X-ray spectra increases when ṁ increases and, moreover, the index saturates at high values of ṁ. In this paper, we present theoretical arguments that the observationally established index saturation effect versus mass accretion rate is a signature of the bulk (converging) flow onto the BH. Also, we demonstrate that the index saturation value depends on the plasma temperature of the converging flow. We self-consistently calculate the Compton cloud (CC) plasma temperature as a function of mass accretion rate using the energy balance between energy dissipation and Compton cooling. We explain the observed index-ṁ correlations using a Monte Carlo simulation of radiative processes in the innermost part (CC) of a BH source, and we account for the Comptonization processes in the presence of thermal and bulk motions as basic types of plasma motion. We show that, when ṁ increases, BH sources evolve to high and very soft states (HSS and VSS, respectively), in which the strong blackbody (BB)-like and steep power-law components are formed in the resulting X-ray spectrum. The simultaneous detection of these two components strongly depends on the sensitivity of high-energy instruments, given that the relative contribution of the hard power-law tail in the resulting VSS spectrum can be very low, which is why, to date, RXTE observations of the VSS X-ray spectrum have been characterized by the presence of the strong BB-like component only. We also predict specific patterns for high-energy e-fold (cutoff) energy (E_fold) evolution with ṁ for thermal and dynamical (bulk

  17. Field-testing competing runoff source and hydrochemical conceptualisations

    Science.gov (United States)

    Western, A. W.; Saffarpour, S.; Adams, R.; Costelloe, J. F.; McDonnell, J.

    2014-12-01

    There are competing conceptualisations of heterogeneity in catchment systems. It is often convenient to divide catchments into zones, for example the soil profile, groundwater aquifers (saturated zone), riparian zones, etc. We also often divide flow sources into distinct categories such as surface runoff, interflow and baseflow, implying a few distinct stores of water. In tracer hydrology we typically assume water from such zones has distinct and invariant chemistry that is used to infer the runoff source mixture through conservative mixing model techniques such as End-Member Mixing Analysis (EMMA). An alternative conceptualisation is that catchments consist of a large number of stores with varying residence times. In this case individual stores contribute a variable proportion of flow and may have a temporally varying composition due to processes such as evapo-concentration. Hence they have a variable influence on the hydrochemistry of runoff. This presentation draws on examples from two field studies in southern Australia that examine the relationships between hydrologic and hydrochemical conceptualisations and the relative variation within and between different hydrologic zones. The implications for water quality behaviour will be examined, and the additional behavioural complexities associated with interactions between runoff pathways for non-conservative chemical species will be discussed.
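
    As a minimal illustration of the conservative mixing calculation that underlies the end-member approach mentioned above, the sketch below performs a two-component hydrograph separation with a single conservative tracer. The end-member and stream concentrations are hypothetical, not values from the field studies.

        # Two-component mixing with one conservative tracer: streamflow is treated as a
        # mixture of two end members with fixed, distinct tracer concentrations.
        c_end_member_a = 0.2    # hypothetical tracer concentration of end member A
        c_end_member_b = 1.4    # hypothetical tracer concentration of end member B

        def fraction_a(c_stream):
            """Fraction of streamflow attributed to end member A by tracer mass balance."""
            return (c_stream - c_end_member_b) / (c_end_member_a - c_end_member_b)

        for c_stream in (1.4, 1.0, 0.6, 0.2):   # stream samples between the two end members
            print(c_stream, round(fraction_a(c_stream), 2))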

  18. Emotional distress following genetic testing for hereditary breast and ovarian cancer: a meta-analytic review.

    Science.gov (United States)

    Hamilton, Jada G; Lobel, Marci; Moyer, Anne

    2009-07-01

    Meta-analysis was used to synthesize results of studies on emotional consequences of predictive genetic testing for BRCA1/2 mutations conferring increased risk of breast and ovarian cancer. Studies assessing anxiety or cancer-specific distress before and after provision of test results (k = 20) were analyzed using a random-effects model. Moderator variables included country of data collection and personal cancer history of study participants. Standardized mean gain effect sizes were calculated for mutation carriers, noncarriers, and those with inconclusive results over short (0-4 weeks), moderate (5-24 weeks), or long (25-52 weeks) periods of time after testing. Distress among carriers increased shortly after receiving results and returned to pretesting levels over time. Distress among noncarriers and those with inconclusive results decreased over time. Some distress patterns differed in studies conducted outside the United States and for individuals with varying cancer histories. Results underscore the importance of time; changes in distress observed shortly after test-result disclosure frequently differed from the pattern of distress seen subsequently. Although emotional consequences of this testing appear minimal, it remains possible that testing may affect cognitive and behavioral outcomes, which have rarely been examined through meta-analysis. Testing may also affect understudied subgroups differently.
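
    A minimal sketch of the random-effects pooling step used in meta-analyses of this kind (DerSimonian-Laird estimator of the between-study variance) is given below. The per-study effect sizes and sampling variances are invented for illustration and are not the values from the reviewed studies.

        import numpy as np

        # Illustrative per-study standardized mean gains and their sampling variances.
        effects = np.array([0.35, 0.10, 0.22, -0.05, 0.18])
        variances = np.array([0.02, 0.05, 0.03, 0.04, 0.02])

        # DerSimonian-Laird estimate of the between-study variance tau^2.
        w_fixed = 1.0 / variances
        mean_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
        q = np.sum(w_fixed * (effects - mean_fixed) ** 2)
        c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
        tau2 = max(0.0, (q - (len(effects) - 1)) / c)

        # Random-effects pooled estimate and its standard error.
        w_random = 1.0 / (variances + tau2)
        pooled = np.sum(w_random * effects) / np.sum(w_random)
        se = np.sqrt(1.0 / np.sum(w_random))
        print(round(pooled, 3), round(se, 3), round(tau2, 3))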

  19. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection

    Science.gov (United States)

    Cross, Robert W.; Boisen, Matthew L.; Millett, Molly M.; Nelson, Diana S.; Oottamasathien, Darin; Hartnett, Jessica N.; Jones, Abigal B.; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A.; Fusco, Marnie L.; Abelson, Dafna M.; Oda, Shunichiro; Brown, Bethany L.; Pham, Ha; Rowland, Megan M.; Agans, Krystle N.; Geisbert, Joan B.; Heinrich, Megan L.; Kulakosky, Peter C.; Shaffer, Jeffrey G.; Schieffelin, John S.; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M.; Wilson, Russell B.; Saphire, Erica Ollmann; Pitts, Kelly R.; Khan, Sheik Humarr; Grant, Donald S.; Geisbert, Thomas W.; Branco, Luis M.; Garry, Robert F.

    2016-01-01

    Background. Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013–2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Methods. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. Results. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription–polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵–9.0 × 10⁸ genomes/mL. Conclusions. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. PMID:27587634

  20. Analytical Validation of the ReEBOV Antigen Rapid Test for Point-of-Care Diagnosis of Ebola Virus Infection.

    Science.gov (United States)

    Cross, Robert W; Boisen, Matthew L; Millett, Molly M; Nelson, Diana S; Oottamasathien, Darin; Hartnett, Jessica N; Jones, Abigal B; Goba, Augustine; Momoh, Mambu; Fullah, Mohamed; Bornholdt, Zachary A; Fusco, Marnie L; Abelson, Dafna M; Oda, Shunichiro; Brown, Bethany L; Pham, Ha; Rowland, Megan M; Agans, Krystle N; Geisbert, Joan B; Heinrich, Megan L; Kulakosky, Peter C; Shaffer, Jeffrey G; Schieffelin, John S; Kargbo, Brima; Gbetuwa, Momoh; Gevao, Sahr M; Wilson, Russell B; Saphire, Erica Ollmann; Pitts, Kelly R; Khan, Sheik Humarr; Grant, Donald S; Geisbert, Thomas W; Branco, Luis M; Garry, Robert F

    2016-10-15

    Ebola virus disease (EVD) is a severe viral illness caused by Ebola virus (EBOV). The 2013-2016 EVD outbreak in West Africa is the largest recorded, with >11 000 deaths. Development of the ReEBOV Antigen Rapid Test (ReEBOV RDT) was expedited to provide a point-of-care test for suspected EVD cases. Recombinant EBOV viral protein 40 antigen was used to derive polyclonal antibodies for RDT and enzyme-linked immunosorbent assay development. ReEBOV RDT limits of detection (LOD), specificity, and interference were analytically validated on the basis of Food and Drug Administration (FDA) guidance. The ReEBOV RDT specificity estimate was 95% for donor serum panels and 97% for donor whole-blood specimens. The RDT demonstrated sensitivity to 3 species of Ebolavirus (Zaire ebolavirus, Sudan ebolavirus, and Bundibugyo ebolavirus) associated with human disease, with no cross-reactivity by pathogens associated with non-EBOV febrile illness, including malaria parasites. Interference testing exhibited no reactivity by medications in common use. The LOD for antigen was 4.7 ng/test in serum and 9.4 ng/test in whole blood. Quantitative reverse transcription-polymerase chain reaction testing of nonhuman primate samples determined the range to be equivalent to 3.0 × 10⁵–9.0 × 10⁸ genomes/mL. The analytical validation presented here contributed to the ReEBOV RDT being the first antigen-based assay to receive FDA and World Health Organization emergency use authorization for this EVD outbreak, in February 2015. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  1. An analytical threshold voltage model for a short-channel dual-metal-gate (DMG) recessed-source/drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Saramekala, G. K.; Santra, Abirmoya; Dubey, Sarvesh; Jit, Satyabrata; Tiwari, Pramod Kumar

    2013-08-01

    In this paper, an analytical short-channel threshold voltage model is presented for a dual-metal-gate (DMG) fully depleted recessed source/drain (Re-S/D) SOI MOSFET. For the first time, the advantages of the recessed source/drain (Re-S/D) and of the dual-metal-gate structure are incorporated simultaneously in a fully depleted SOI MOSFET. Analytical surface potential models at the Si-channel/SiO2 interface and the Si-channel/buried-oxide (BOX) interface have been developed by solving the 2-D Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. Thereupon, a threshold voltage model is derived from the minimum surface potential in the channel. The developed model is analyzed extensively for a variety of device parameters such as the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the BOX, and the control-to-screen gate length ratio. The validity of the present 2D analytical model is verified with ATLAS™, a 2D device simulator from SILVACO Inc.
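
    A minimal numerical sketch of the general approach described above (not the authors' closed-form model) is given below: with the pseudo-2D parabolic-potential approximation, the surface potential along the channel is the sum of two exponentials around a simplified long-channel level, and the threshold voltage is taken as the gate bias at which the channel potential minimum reaches twice the Fermi potential. All device parameters, and the neglect of the depletion-charge term, are assumptions made only for illustration.

        import numpy as np
        from scipy.optimize import brentq

        # Assumed placeholder parameters (illustrative, not the paper's device).
        L_ch   = 50e-9                 # channel length, m
        t_si   = 10e-9                 # silicon film thickness, m
        t_ox   = 2e-9                  # front gate oxide thickness, m
        eps_si = 11.7 * 8.854e-12
        eps_ox = 3.9 * 8.854e-12
        V_bi   = 1.0                   # source/drain built-in potential, V
        V_ds   = 0.05                  # drain bias, V
        V_fb   = -0.9                  # flat-band voltage, V
        phi_f  = 0.45                  # Fermi potential, V

        # Characteristic length of the parabolic-potential (pseudo-2D) approximation.
        lam = np.sqrt(eps_si * t_si * t_ox / eps_ox)

        def min_surface_potential(v_gs):
            """Minimum of the front-surface potential along the channel at a gate bias."""
            phi_l = v_gs - V_fb        # simplified long-channel level (depletion charge neglected)
            s = np.sinh(L_ch / lam)
            a = ((V_bi + V_ds - phi_l) - (V_bi - phi_l) * np.exp(-L_ch / lam)) / (2.0 * s)
            b = ((V_bi - phi_l) * np.exp(L_ch / lam) - (V_bi + V_ds - phi_l)) / (2.0 * s)
            x = np.linspace(0.0, L_ch, 501)
            return np.min(phi_l + a * np.exp(x / lam) + b * np.exp(-x / lam))

        # Threshold voltage: gate bias at which the channel potential minimum reaches 2*phi_f.
        v_th = brentq(lambda v: min_surface_potential(v) - 2.0 * phi_f, -1.0, 1.5)
        print(round(v_th, 3), "V")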

  2. Numerical simulation and experimental validation of the three-dimensional flow field and relative analyte concentration distribution in an atmospheric pressure ion source.

    Science.gov (United States)

    Poehler, Thorsten; Kunte, Robert; Hoenen, Herwart; Jeschke, Peter; Wissdorf, Walter; Brockmann, Klaus J; Benter, Thorsten

    2011-11-01

    In this study, the validation and analysis of steady state numerical simulations of the gas flows within a multi-purpose ion source (MPIS) are presented. The experimental results were obtained with particle image velocimetry (PIV) measurements in a non-scaled MPIS. Two-dimensional time-averaged velocity and turbulent kinetic energy distributions are presented for two dry gas volume flow rates. The numerical results of the validation simulations are in very good agreement with the experimental data. All significant flow features have been correctly predicted within the accuracy of the experiments. For technical reasons, the experiments were conducted at room temperature. Thus, numerical simulations of ionization conditions at two operating points of the MPIS are also presented. It is clearly shown that the dry gas volume flow rate has the most significant impact on the overall flow pattern within the APLI source; far less critical is the (larger) nebulization gas flow. In addition to the approximate solution of the Reynolds-averaged Navier-Stokes equations, a transport equation for the relative analyte concentration has been solved. The results yield information on the three-dimensional analyte distribution within the source. It becomes evident that for ion transport into the MS ion transfer capillary, electromagnetic forces are at least as important as fluid dynamic forces. However, only the fluid dynamics determines the three-dimensional distribution of the analyte gas. Thus, local flow phenomena in close proximity to the spray shield strongly impact the ionization efficiency.

  3. Analytical Solution of the Hyperbolic Heat Conduction Equation for Moving Semi-Infinite Medium under the Effect of Time-Dependent Laser Heat Source

    Directory of Open Access Journals (Sweden)

    R. T. Al-Khairy

    2009-01-01

    The laser heating is modelled as an internal heat source whose capacity is given by g(x, t) = I(t)(1 − R)μe^(−μx), while the semi-infinite body has an insulated boundary. The solution is obtained by the Laplace transform method, and solutions for different time characteristics of the heat source capacity (constant, instantaneous, and exponential) are discussed. The effect of absorption coefficients on the temperature profiles is examined in detail. It is found that the closed-form solution derived in the present study reduces to the previously obtained analytical solution when the medium velocity is set to zero.

  4. Pressurized thermal shocks: the JRC Ispra experimental test rig and analytical results

    International Nuclear Information System (INIS)

    Jovanovic, A.; Lucia, A.C.

    1990-01-01

    The paper addresses several issues of particular interest for the remanent (remaining) life prediction of pressurized components exposed to pressurized thermal shock (PTS) loads, which have been tackled in analytical work performed within the MPA - JRC collaboration for the PTS experimental research at the JRC Ispra. These issues concern, in general, the application of damage mechanics, fracture mechanics and artificial intelligence (including the treatment of uncertainties in the PTS analysis and experiments). The considered issues are essential for further understanding and modelling of crack behaviour and of the component response under PTS conditions. In particular, the development of the FRAP preprocessor and the development and implementation of a methodology for analysing local non-stationary heat transfer coefficients during a PTS are explained in more detail. FRAP is used as a front end to the finite element code ABAQUS for the heat transfer, stress and fracture mechanics analyses. The ABAQUS results are then used for the probabilistic fatigue crack growth analysis performed with the COVASTOL code. (author)

  5. Quality specifications for the extra-analytical phase of laboratory testing: Reference intervals and decision limits.

    Science.gov (United States)

    Ceriotti, Ferruccio

    2017-07-01

    Reference intervals and decision limits are a critical part of the clinical laboratory report. The evaluation of their correct use represents a tool to verify post-analytical quality. Four elements are identified as indicators. 1. The use of decision limits for lipids and glycated hemoglobin. 2. The use, whenever possible, of common reference values. 3. The presence of gender-related reference intervals for at least the following common serum measurands (besides, obviously, the fertility-related hormones): alkaline phosphatase (ALP), alanine aminotransferase (ALT), creatine kinase (CK), creatinine, gamma-glutamyl transferase (GGT), IgM, ferritin, iron, transferrin, urate, red blood cells (RBC), hemoglobin (Hb) and hematocrit (Hct). 4. The presence of age-related reference intervals. The problem of specific reference intervals for elderly people is discussed, but their use is not recommended; on the contrary, pediatric age-related reference intervals are necessary at least for the following common serum measurands: ALP, amylase, creatinine, inorganic phosphate, lactate dehydrogenase, aspartate aminotransferase, urate, insulin-like growth factor 1, white blood cells, RBC, Hb, Hct, alpha-fetoprotein and fertility-related hormones. The lack of such reference intervals may imply significant risks for the patients. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  6. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976

    International Nuclear Information System (INIS)

    Baker, R.D.

    1976-08-01

    Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and the guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Reliable analytical methods development for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is described

  7. Response of HDR-VKL piping system to seismic test excitations: Comparison of analytical predictions and test measurements

    International Nuclear Information System (INIS)

    Srinivasan, M.G.; Kot, C.A.; Hsieh, B.J.

    1989-01-01

    As part of the earthquake investigations at the HDR (Heissdampfreaktor) Test Facility in Kahl/Main, FRG, simulated seismic tests (SHAM) were performed during April--May 1988 on the VKL (Versuchskreislauf) piping system. The purpose of these experiments was to study the behavior of piping subjected to a range of seismic excitation levels including those that exceed design levels manifold and that might induce failure of pipe supports or plasticity in the pipe runs, and to establish seismic margins for piping and pipe supports. Data obtained in the tests are also used to validate analysis methods. Detailed reports on the SHAM experiments are given elsewhere. The objective of this document is to evaluate a subsystem analysis module of the SMACS code. This module is a linear finite-element based program capable of calculating the response of nuclear power plant subsystems subjected to independent multiple-acceleration input excitation. The evaluation is based on a comparison of computational results of simulation of SHAM tests with corresponding test measurements

  8. A two dimensional analytical modeling of surface potential in triple metal gate (TMG) fully-depleted Recessed-Source/Drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Priya, Anjali; Mishra, Ram Awadh

    2016-04-01

    In this paper, analytical modeling of the surface potential is proposed for a new Triple Metal Gate (TMG) fully depleted Recessed-Source/Drain Silicon On Insulator (SOI) Metal Oxide Semiconductor Field Effect Transistor (MOSFET). The metal with the highest work function is placed near the source region and the one with the lowest work function near the drain. The Recessed-Source/Drain SOI MOSFET has a higher drain current than the conventional SOI MOSFET because of its larger source and drain regions. The surface potential model, developed from the 2D Poisson's equation, is verified by comparison with simulation results from the two-dimensional ATLAS simulator. The model is compared with DMG and SMG devices and analysed for different device parameters. The ratio of the metal gate lengths is varied to optimize the result.

  9. Verification of hybrid analysis concept of soil-foundation interaction by field vibration tests - Analytical phase

    International Nuclear Information System (INIS)

    Katayama, I.; Niwa, A.; Kubo, Y.; Penzien, J.

    1987-01-01

    In connection with the previous paper on the same subject, which describes the results obtained from the field vibration tests of five different models, this paper outlines the hybrid analysis code for soil-structure interaction (HASSI) and presents the results of numerical simulations of the responses of model 2C for both the forced vibration test and the natural earthquake excitation

  10. Air Monitoring Network at Tonopah Test Range: Network Description, Capabilities, and Analytical Results

    International Nuclear Information System (INIS)

    Hartwell, William T.; Daniels, Jeffrey; Nikolich, George; Shadel, Craig; Giles, Ken; Karr, Lynn; Kluesner, Tammy

    2012-01-01

    During the period April to June 2008, at the behest of the Department of Energy (DOE), National Nuclear Security Administration, Nevada Site Office (NNSA/NSO), the Desert Research Institute (DRI) constructed and deployed two portable environmental monitoring stations at the Tonopah Test Range (TTR) as part of the Environmental Restoration Project Soils Activity. DRI has operated these stations since that time. A third station was deployed in the period May to September 2011. The TTR is located within the northwest corner of the Nevada Test and Training Range (NTTR), and covers an area of approximately 725.20 km² (280 mi²). The primary objective of the monitoring stations is to evaluate whether and under what conditions there is wind transport of radiological contaminants from Soils Corrective Action Units (CAUs) associated with Operation Roller Coaster on TTR. Operation Roller Coaster was a series of tests, conducted in 1963, designed to examine the stability and dispersal of plutonium in storage and transportation accidents. These tests did not result in any nuclear explosive yield. However, the tests did result in the dispersal of plutonium and contamination of surface soils in the surrounding area.

  11. Analytical Work in Support of the Design and Operation of Two Dimensional Self Streamlining Test Sections

    Science.gov (United States)

    Judd, M.; Wolf, S. W. D.; Goodyer, M. J.

    1976-01-01

    A method has been developed for accurately computing the imaginary flow fields outside a flexible walled test section, applicable to lifting and non-lifting models. The tolerances in the setting of the flexible walls introduce only small levels of aerodynamic interference at the model. While it is not possible to apply corrections for the interference effects, they may be reduced by improving the setting accuracy of the portions of wall immediately above and below the model. Interference effects of the truncation of the length of the streamlined portion of a test section are brought to an acceptably small level by the use of a suitably long test section with the model placed centrally.

  12. Test Method for High β Particle Emission Rate of 63Ni Source Plate

    OpenAIRE

    ZHANG Li-feng

    2015-01-01

    To address the difficulty of measuring the β particle emission rate of the Ni-63 source plates used in Ni-63 betavoltaic batteries, a relative test method based on scintillation current measurement was set up according to the measurement principle of the scintillation detector. The β particle emission rate of a homemade Ni-63 source plate was tested by this method, and the test results were analysed and evaluated; it was initially concluded that the scintillation current method is a feasible way of testing the β particle emi...

  13. Two-dimensional analytical solutions for chemical transport in aquifers. Part 1. Simplified solutions for sources with constant concentration. Part 2. Exact solutions for sources with constant flux rate

    International Nuclear Information System (INIS)

    Shan, C.; Javandel, I.

    1996-05-01

    Analytical solutions are developed for modeling solute transport in a vertical section of a homogeneous aquifer. Part 1 of the series presents a simplified analytical solution for cases in which a constant-concentration source is located at the top (or the bottom) of the aquifer. The following transport mechanisms have been considered: advection (in the horizontal direction), transverse dispersion (in the vertical direction), adsorption, and biodegradation. In the simplified solution, however, longitudinal dispersion is assumed to be relatively insignificant with respect to advection, and has been neglected. Example calculations are given to show the movement of the contamination front, the development of concentration profiles, the mass transfer rate, and an application to determine the vertical dispersivity. The analytical solution developed in this study can be a useful tool in designing an appropriate monitoring system and an effective groundwater remediation method
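
    A generic sketch of the type of simplified solution described above (steady state, a constant-concentration source along the top boundary, advection in the horizontal direction, transverse dispersion in the vertical direction, and longitudinal dispersion, adsorption and biodegradation neglected) is given below. The velocity, dispersion coefficient and evaluation points are illustrative, and the formula is the textbook result for this reduced problem, not necessarily the authors' exact expression.

        import numpy as np
        from scipy.special import erfc

        # Illustrative parameters (not from the report).
        v   = 0.1      # m/day, horizontal advection (seepage) velocity
        d_t = 1e-3     # m^2/day, transverse (vertical) dispersion coefficient
        c0  = 1.0      # normalized source concentration along the top boundary (z = 0)

        def concentration(x, z):
            """Steady-state profile: advection in x, transverse dispersion in z,
            longitudinal dispersion neglected, no decay or retardation."""
            x = np.maximum(x, 1e-12)             # avoid division by zero at the source edge
            return c0 * erfc(z / (2.0 * np.sqrt(d_t * x / v)))

        for z in (0.0, 0.5, 1.0, 2.0):           # depths below the source, m
            print(z, round(float(concentration(100.0, z)), 3))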

  14. Verification test calculations for the Source Term Code Package

    International Nuclear Information System (INIS)

    Denning, R.S.; Wooton, R.O.; Alexander, C.A.; Curtis, L.A.; Cybulskis, P.; Gieseke, J.A.; Jordan, H.; Lee, K.W.; Nicolosi, S.L.

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs

  15. Gender Differences among Children with ADHD on Continuous Performance Tests: A Meta-Analytic Review

    Science.gov (United States)

    Hasson, Ramzi; Fine, Jodene Goldenring

    2012-01-01

    Objective: Gender differences among children with ADHD are not well understood. The continuous performance test (CPT) is the most frequently used direct measure of inattention and impulsivity. This meta-analysis compared CPT performance between boys and girls with and without ADHD. Method: All peer-reviewed ADHD studies published between 1980 and…

  16. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology will include design, analysis, testing, fabrication, procurement, and certification of the Type B containers, allowing their use in support of United States Department of Energy programs. While all aspects of the packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques for enhancing the design and providing a basis for certification. This methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments and how these responses influence the effectiveness of the packaging requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs

  17. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    International Nuclear Information System (INIS)

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions which are presently unavailable. Systematic investigation of the behavior of fission products, activation products and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives which emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site

  18. Simplified analytical modeling of the normal hole erosion test; Modelado analitico simplificado del ensayo normal de ersoion de tubo

    Energy Technology Data Exchange (ETDEWEB)

    Khamlichi, A.; Bezzazi, M.; El Bakkali, L.; Jabbouri, A.; Kissi, B.; Yakhlef, F.; Parron Vera, M. A.; Rubio Cintas, M. D.; Castillo Lopez, O.

    2009-07-01

    The hole erosion test was developed in order to study the erosion phenomenon which occurs in cracks appearing in hydraulic infrastructures such as dams. This test makes it possible to describe experimentally the erosive characteristics of soils by means of an index called the erosion rate and a critical shear stress which indicates the threshold of surface erosion initiation. The objective of this work is to give a model of this experiment by means of a simplified analytical approach. The erosion law is derived by taking the flow regime into account. This law shows that the erosion occurring in the tube is governed by first-order dynamics in which only two parameters are involved: the characteristic time linked to the erosion rate and the shear stress threshold at which erosion begins to develop. (Author) 5 refs.
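
    As a minimal sketch of the first-order dynamics mentioned above, the code below integrates the classical hole-erosion law, in which the erosion rate is proportional to the excess of the wall shear stress over its critical value, for a tube under a constant head. All soil and flow parameters are illustrative placeholders, not the paper's values.

        # Classical hole-erosion law: d(radius)/dt proportional to (tau - tau_c) once
        # the wall shear stress tau exceeds the critical value tau_c.
        k_er   = 1e-4     # erosion-rate coefficient, s/m (assumed)
        tau_c  = 6.0      # critical shear stress for erosion initiation, Pa (assumed)
        rho_d  = 1800.0   # dry density of the soil, kg/m^3 (assumed)
        rho_w  = 1000.0   # water density, kg/m^3
        g      = 9.81
        dH     = 0.8      # head drop across the tube, m (assumed)
        L      = 0.15     # tube length, m (assumed)
        r      = 3e-3     # initial hole radius, m (assumed)

        dt, t_end = 0.5, 600.0
        for _ in range(int(t_end / dt)):
            tau = rho_w * g * dH * r / (2.0 * L)        # wall shear stress under constant head
            if tau > tau_c:
                r += k_er * (tau - tau_c) / rho_d * dt  # radius growth from the erosion law
        print(round(tau, 1), "Pa final shear stress,", round(r * 1e3, 2), "mm final radius")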

  19. A new DG nanoscale TFET based on MOSFETs by using source gate electrode: 2D simulation and an analytical potential model

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2017-08-01

    This paper suggests and investigates a double-gate (DG) MOSFET which emulates a tunnel field-effect transistor (M-TFET). This novel concept combines a double-gate MOSFET that behaves as a tunneling field-effect transistor through work-function engineering. In the proposed structure, in addition to the main gate, we utilize another gate over the source region with zero applied voltage and a proper work function in order to convert the source region from N+ to P+. We examine the impact of varying the source-gate work function and source doping on the device parameters. The simulation results indicate that the M-TFET is well suited to switching applications. We also present a two-dimensional analytical potential model of the proposed structure, obtained by solving Poisson's equation in the x and y directions; the electric field is then obtained by differentiating the potential profile. To validate the model, the analytical results are compared with simulations from the SILVACO ATLAS device simulator.

  20. Analytic model of the stress waves propagation in thin wall tubes, seeking the location of a harmonic point source in its surface

    International Nuclear Information System (INIS)

    Boaratti, Mario Francisco Guerra

    2006-01-01

    Leaks in pressurized tubes generate acoustic waves that propagate through the walls of these tubes and can be captured by accelerometers or by acoustic emission sensors. Knowledge of how these walls vibrate, in other words how these acoustic waves propagate in the material, is fundamental to the detection and localization of the leak source. In this work an analytic model based on the equations of motion of a cylindrical shell was implemented in order to understand the behavior of the tube surface when excited by a point source. Since the cylindrical surface is closed in the circumferential direction, waves that are just starting their trajectory meet waves that have already completed a turn around the shell, in both the clockwise and counter-clockwise directions, generating constructive and destructive interference. After sufficient propagation time, peaks and valleys form on the shell surface, which can be visualized through a graphical representation of the analytic solution. The theoretical results were verified by measurements in an experimental setup consisting of a steel tube terminated in a sand box to simulate an infinite tube. To determine the location of the point source on the surface, an inverse approach was adopted: given the signals from the sensors placed on the tube surface, the theoretical model is used to determine where the source that generated these signals must be. (author)

  1. Analytical solution for the transient response of a fluid/saturated porous medium halfspace system subjected to an impulsive line source

    Science.gov (United States)

    Shan, Zhendong; Ling, Daosheng; Jing, Liping; Li, Yongqiang

    2018-05-01

    In this paper, transient wave propagation is investigated within a fluid/saturated porous medium halfspace system with a planar interface that is subjected to a cylindrical P-wave line source. Assuming the permeability coefficient is sufficiently large, analytical solutions for the transient response of the fluid/saturated porous medium halfspace system are developed. Moreover, the analytical solutions are presented in simple closed forms wherein each term represents a transient physical wave, especially the expressions for head waves. The methodology utilised to determine where the head wave can emerge within the system is also given. The wave fields within the fluid and porous medium are first defined considering the behaviour of two compressional waves and one tangential wave in the saturated porous medium and one compressional wave in the fluid. Substituting these wave fields into the interface continuity conditions, the analytical solutions in the Laplace domain are then derived. To transform the solutions into the time domain, a suitable distortion of the contour is provided to change the integration path of the solution, after which the analytical solutions in the Laplace domain are transformed into the time domain by employing Cagniard's method. Numerical examples are provided to illustrate some interesting features of the fluid/saturated porous medium halfspace system. In particular, the interface wave and head waves that propagate along the interface between the fluid and saturated porous medium can be observed.

  2. Thermal Vacuum Test Correlation of a Zero Propellant Load Case Thermal Capacitance Propellant Gauging Analytical Model

    Science.gov (United States)

    Mckim, Stephen A.

    2016-01-01

    This thesis describes the development and correlation of a thermal model that forms the foundation of a thermal capacitance spacecraft propellant load estimator. Specific details of creating the thermal model for the diaphragm propellant tank used on NASA's Magnetospheric Multiscale spacecraft using ANSYS and the correlation process implemented are presented. The thermal model was correlated to within plus or minus 3 degrees Celsius of the thermal vacuum test data, and was determined sufficient to make future propellant predictions on MMS. The model was also found to be relatively sensitive to uncertainties in applied heat flux and mass knowledge of the tank. More work is needed to improve temperature predictions in the upper hemisphere of the propellant tank where predictions were found to be 2 to 2.5 C lower than the test data. A road map for applying the model to predict propellant loads on the actual MMS spacecraft toward its end of life in 2017-2018 is also presented.
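
    As a minimal illustration of the thermal-capacitance gauging idea that the correlated model supports, the sketch below infers a propellant mass from a known heat pulse and the resulting bulk temperature rise, using an idealized lumped model with no heat leak. All numbers are illustrative placeholders, not MMS values.

        # Idealized lumped thermal-capacitance gauging: the total heat capacity follows
        # from the applied heat pulse and the observed temperature rise; subtracting the
        # known tank contribution leaves the propellant thermal mass.
        c_tank  = 900.0    # J/(kg K), tank shell specific heat (assumed)
        m_tank  = 25.0     # kg, tank dry mass (assumed)
        c_prop  = 1700.0   # J/(kg K), propellant specific heat (assumed)
        q_in    = 60.0     # W, heater power applied during the gauging manoeuvre (assumed)
        dt      = 3600.0   # s, heating duration (assumed)
        delta_t = 1.8      # K, observed bulk temperature rise (assumed)

        c_total = q_in * dt / delta_t                    # total thermal capacitance, J/K
        m_prop = (c_total - m_tank * c_tank) / c_prop    # inferred propellant mass, kg
        print(round(m_prop, 1), "kg of propellant (idealized estimate)")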

  3. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    Science.gov (United States)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
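
    One common way to quantify the agreement between measured and analytical mode shapes (not necessarily the statistic implemented in the report's programs) is the modal assurance criterion, sketched below with invented mode-shape vectors.

        import numpy as np

        def mac(phi_test, phi_fem):
            """Modal assurance criterion between two mode-shape vectors (1 = identical shape)."""
            num = np.abs(phi_test @ phi_fem) ** 2
            return num / ((phi_test @ phi_test) * (phi_fem @ phi_fem))

        # Illustrative shapes at four measurement points (not survey or NASTRAN data).
        phi_measured = np.array([0.0, 0.59, 0.95, 1.00])
        phi_nastran  = np.array([0.0, 0.55, 0.90, 1.00])
        print(round(float(mac(phi_measured, phi_nastran)), 4))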

  4. Characterization of zinc alloy by sheet bulging test with analytical models and digital image correlation

    Science.gov (United States)

    Vitu, L.; Laforge, N.; Malécot, P.; Boudeau, N.; Manov, S.; Milesi, M.

    2018-05-01

    Zinc alloys are used in a wide range of applications such as electronics, automotive and building construction. Their various shapes are generally obtained by metal forming operations such as stamping. Therefore, it is important to characterize the material with adequate tests. The Sheet Bulging Test (SBT) is well recognized in the metal forming community. Different theoretical models from the literature for evaluating the thickness and radius of the deformed sheet in the SBT have been studied in order to obtain the hardening curves of different materials. These theoretical models have the advantage that the experimental procedure is very simple, but Koç et al. showed their limitation, since the appropriate combination of thickness and radius evaluations depends on the material. As zinc alloys are strongly anisotropic with a special crystalline structure, a procedure is adopted for characterizing the hardening curve of a zinc alloy. The anisotropy is first studied with tensile tests, and the SBT with elliptical dies is also investigated. In parallel, Digital Image Correlation (DIC) measurements are carried out. The results obtained from the theoretical models and the DIC measurements are compared, and measurements on post-mortem specimens complete the comparisons. Finally, the DIC measurements give better results, and the resulting hardening curve of the studied zinc alloy is provided.
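
    As a minimal sketch of how a hardening curve is typically extracted from bulge-test measurements with a membrane-theory model, the code below converts pressure, dome radius and apex thickness into balanced-biaxial stress and thickness strain. The data arrays are invented for illustration and are not the measured zinc-alloy results.

        import numpy as np

        # Membrane-theory evaluation at the dome apex: stress = p*rho/(2*t),
        # equivalent strain = ln(t0/t). All values below are illustrative.
        t0  = 0.65e-3                                           # initial sheet thickness, m
        p   = np.array([0.5, 1.0, 1.5, 2.0, 2.4]) * 1e6         # bulging pressure, Pa
        rho = np.array([120.0, 70.0, 52.0, 42.0, 37.0]) * 1e-3  # dome radius, m
        t   = np.array([0.63, 0.60, 0.56, 0.51, 0.47]) * 1e-3   # apex thickness, m

        stress = p * rho / (2.0 * t)      # balanced-biaxial membrane stress at the apex
        strain = np.log(t0 / t)           # equivalent (thickness) strain at the apex

        for e, s in zip(strain, stress):
            print(round(float(e), 3), round(float(s) / 1e6, 1), "MPa")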

  5. Thermal Vacuum Test Correlation of A Zero Propellant Load Case Thermal Capacitance Propellant Gauging Analytics Model

    Science.gov (United States)

    McKim, Stephen A.

    2016-01-01

    This thesis describes the development and test data validation of the thermal model that is the foundation of a thermal capacitance spacecraft propellant load estimator. Specific details of creating the thermal model for the diaphragm propellant tank used on NASA's Magnetospheric Multiscale spacecraft using ANSYS and the correlation process implemented to validate the model are presented. The thermal model was correlated to within plus or minus 3 degrees Centigrade of the thermal vacuum test data, and was found to be relatively insensitive to uncertainties in applied heat flux and mass knowledge of the tank. More work is needed, however, to refine the thermal model to further improve temperature predictions in the upper hemisphere of the propellant tank. Temperature predictions in this portion were found to be 2-2.5 degrees Centigrade lower than the test data. A road map to apply the model to predict propellant loads on the actual MMS spacecraft toward its end of life in 2017-2018 is also presented.

  6. International Test Comparisons: Reviewing Translation Error in Different Source Language-Target Language Combinations

    Science.gov (United States)

    Zhao, Xueyu; Solano-Flores, Guillermo; Qian, Ming

    2018-01-01

    This article addresses test translation review in international test comparisons. We investigated the applicability of the theory of test translation error--a theory of the multidimensionality and inevitability of test translation error--across source language-target language combinations in the translation of PISA (Programme of International…

  7. Improved analytical sensitivity for uranium and plutonium in environmental samples: Cavity ion source thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Ingeneri, Kristofer; Riciputi, L.

    2001-01-01

    Following successful field trials, environmental sampling has played a central role as a routine part of safeguards inspections since early 1996 to verify declared activity and to detect undeclared activity. The environmental sampling program has brought a new series of analytical challenges, and driven a need for advances in verification technology. Environmental swipe samples are often extremely low in concentration of analyte (ng level or lower), yet the need to analyze these samples accurately and precisely is vital, particularly for the detection of undeclared nuclear activities. Thermal ionization mass spectrometry (TIMS) is the standard method of determining isotope ratios of uranium and plutonium in the environmental sampling program. TIMS analysis typically employs 1-3 filaments to vaporize and ionize the sample, and the ions are mass separated and analyzed using magnetic sector instruments due to their high mass resolution and high ion transmission. However, the ionization efficiency (the ratio of material actually detected to material present) of uranium using a standard TIMS instrument is low (0.2%), even under the best conditions. Increasing ionization efficiency by even a small amount would have a dramatic impact for safeguards applications, allowing both improvements in analytical precision and a significant decrease in the amount of uranium and plutonium required for analysis, increasing the sensitivity of environmental sampling

  8. An Analytical Threshold Voltage Model of Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFETs with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2016-10-01

    This paper presents an analytical threshold voltage model for back-gated fully depleted (FD), recessed-source drain silicon-on-insulator metal-oxide-semiconductor field-effect transistors (MOSFETs). Analytical surface potential models have been developed at front and back surfaces of the channel by solving the two-dimensional (2-D) Poisson's equation in the channel region with appropriate boundary conditions assuming a parabolic potential profile in the transverse direction of the channel. The strong inversion criterion is applied to the front surface potential as well as on the back one in order to find two separate threshold voltages for front and back channels of the device, respectively. The device threshold voltage has been assumed to be associated with the surface that offers a lower threshold voltage. The developed model was analyzed extensively for a variety of device geometry parameters like the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the buried oxide, and the applied bias voltages with back-gate control. The proposed model has been validated by comparing the analytical results with numerical simulation data obtained from ATLAS™, a 2-D device simulator from SILVACO.

  9. Benchmark Tests to Develop Analytical Time-Temperature Limit for HANA-6 Cladding for Compliance with New LOCA Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sung Yong; Jang, Hun; Lim, Jea Young; Kim, Dae Il; Kim, Yoon Ho; Mok, Yong Kyoon [KEPCO Nuclear Fuel Co. Ltd., Daejeon (Korea, Republic of)

    2016-10-15

    According to 10CFR50.46c, two analytical time and temperature limits, for breakaway oxidation and post-quench ductility (PQD), should be determined by an approved experimental procedure as described in NRC Regulatory Guides (RG) 1.222 and 1.223. These guides impose rigorous qualification requirements on the test system, such as thermal and weight-gain benchmarks. In order to meet these requirements, KEPCO NF has developed a new dedicated facility to evaluate the LOCA performance of zirconium alloy cladding. In this paper, the qualification results for the test facility and the high-temperature (HT) oxidation model for HANA-6 are summarized. The results of the thermal benchmark tests of the LOCA HT oxidation tester are summarized as follows. 1. A best-estimate HT oxidation model of HANA-6 was developed as the vendor proprietary HT oxidation model. 2. In accordance with RG 1.222 and 1.223, benchmark tests were performed using the LOCA HT oxidation tester. 3. The maximum axial and circumferential temperature differences are ±9 °C and ±2 °C, respectively, at 1200 °C; at the other temperature conditions, the temperature differences are smaller than those at 1200 °C. The thermal benchmark test results meet the requirements of NRC RG 1.222 and 1.223.

  10. Design and tests of a package for the transport of radioactive sources

    International Nuclear Information System (INIS)

    Santos, Paulo de Oliveira

    2011-01-01

    The Type A package was designed for the transportation of seven cobalt-60 sources with a total activity of 1 GBq. The shield thickness needed to meet the dose rate and the transport index established by the radioactive material transport regulations was calculated with the code MCNP (Monte Carlo N-Particle Transport Code, Version 5). The sealed cobalt-60 sources were tested for leakage according to ISO 9978:1992(E). The package was tested according to the CNEN Radioactive Material Transport regulation. The leakage test results for the sources and the package tests demonstrate that the transport from CDTN to the steelmaking industries can be performed safely.

  11. Test for arsenic speciation in waters based on a paper-based analytical device with scanometric detection.

    Science.gov (United States)

    Pena-Pereira, Francisco; Villar-Blanco, Lorena; Lavilla, Isela; Bendicho, Carlos

    2018-06-29

    A rapid, simple and affordable method for arsenic speciation analysis is described in this work. The proposed methodology involves in situ arsine generation, transfer of the volatile to the headspace and its reaction with silver nitrate at the detection zone of a paper-based analytical device (PAD). Thus, silver nitrate acts as a recognition element for arsine in the paper-based sensor. The chemical reaction between the recognition element and the analyte derivative results in the formation of a colored product which can be detected by scanning the detection zone and data treatment with an image processing and analysis program. Detection and injection zones were defined in the paper substrate by formation of hydrophobic barriers, thus enabling the formation of the volatile derivative without affecting the chemical stability of the recognition element present in the PAD. Experimental parameters influencing the analytical performance of the methodology, namely color mode detection, composition of the paper-based sensor and hydride generation and mass transfer conditions, were evaluated. Under optimal conditions, the proposed method showed limits of detection and quantification of 1.1 and 3.6 ng mL⁻¹, respectively. Remarkably, the limit of detection of the method reported herein was much lower than the maximum contaminant levels set by both the World Health Organization and the US Environmental Protection Agency for arsenic in drinking water, unlike several commercially available arsenic test kits. The repeatability, expressed as relative standard deviation, was found to be 7.1% (n = 8). The method was validated against the European Reference Material ERM®-CA615 groundwater and successfully applied to the determination of As(III), As(V) and total inorganic As in different water samples. Furthermore, the method can be used for the screening analysis of total arsenic in waters when a cut-off level of 7 ng mL⁻¹ is used. Copyright © 2018 Elsevier B.V. All rights reserved.
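
    The scanometric step described above amounts to reading the color of the detection zone from a scanned image. A minimal sketch of such a readout is given below; the file names, crop box and choice of the green channel are illustrative assumptions, not the authors' implementation.

      # Minimal sketch of a scanometric readout for a paper-based detection zone.
      # File names, crop box and calibration are hypothetical.
      import numpy as np
      from PIL import Image

      def zone_intensity(path, box):
          """Mean R, G, B intensity of the detection zone (box = left, upper, right, lower)."""
          zone = np.asarray(Image.open(path).convert("RGB").crop(box), dtype=float)
          return zone.mean(axis=(0, 1))

      blank = zone_intensity("blank_strip.png", (100, 100, 200, 200))
      sample = zone_intensity("sample_strip.png", (100, 100, 200, 200))
      signal = blank[1] - sample[1]  # drop in the green channel relative to a blank strip
      print(f"Green-channel signal: {signal:.1f} (convert to concentration via a standard curve)")

    In practice the signal would be converted to an As(III)/As(V) concentration through a calibration curve built from standards.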

  12. Analytic-graphic testing of deformities at the waterworks Pod Bukovcom

    Directory of Open Access Journals (Sweden)

    Jeèný Miloš

    2001-09-01

    Full Text Available The paper presents geodetic measurement results from the deformation survey of the bulk dam at the Pod Bukovcom waterworks near Košice. Periodic geodetic position and levelling measurements have been carried out on the dam since 1999. Test statistics are applied in the deformation survey. The geodetic data obtained from the individual measurement epochs in the geodetic network on the bulk dam at the Pod Bukovcom waterworks are adjusted using the Gauss-Markov model. The geodetic measurements are complemented by an accuracy analysis based on relative and confidence ellipses.

  13. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1993-01-01

    The use of multidisciplinary teams to develop Type B shipping containers improves the quality and reliability of these reusable packagings. Including the people involved in all aspects of the design, certification and use of the package leads to more innovative, user-friendly containers. Concurrent use of testing and analysis allows engineers to more fully characterize a shipping container's responses to the environments given in the regulations, and provides a strong basis for certification. The combination of the input and output of these efforts should provide a general methodology that designers of Type B radioactive material shipping containers can utilize to optimize and certify their designs. (J.P.N.)

  14. Genetic Testing and Tissue Banking for Personalized Oncology: Analytical and Institutional Factors.

    Science.gov (United States)

    Miles, George; Rae, James; Ramalingam, Suresh S; Pfeifer, John

    2015-10-01

    Personalized oncology, or more aptly precision oncogenomics, refers to the identification and implementation of clinically actionable targets tailored to an individual patient's cancer genomic information. Banking of human tissue and other biospecimens establishes a framework to extract and collect the data essential to our understanding of disease pathogenesis and treatment. Cancer cooperative groups in the United States have led the way in establishing robust biospecimen collection mechanisms to facilitate translational research, and combined with technological advances in molecular testing, tissue banking has expanded from its traditional base in academic research and is assuming an increasingly pivotal role in directing the clinical care of cancer patients. Comprehensive screening of tumors by DNA sequencing and the ability to mine and interpret these large data sets from well-organized tissue banks have defined molecular subtypes of cancer. Such stratification by genomic criteria has revolutionized our perspectives on cancer diagnosis and treatment, offering insight into prognosis, progression, and susceptibility or resistance to known therapeutic agents. In turn, this has enabled clinicians to offer treatments tailored to patients that can greatly improve their chances of survival. Unique challenges and opportunities accompany the rapidly evolving interplay between tissue banking and genomic sequencing, and are the driving forces underlying the revolution in precision medicine. Molecular testing and precision medicine clinical trials are now becoming the major thrust behind the cooperative groups' clinical research efforts. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. An analytical approach for a nodal formulation of a two-dimensional fixed-source neutron transport problem in heterogeneous medium

    Energy Technology Data Exchange (ETDEWEB)

    Basso Barichello, Liliane; Dias da Cunha, Rudnei [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Inst. de Matematica; Becker Picoloto, Camila [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Tres, Anderson [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Matematica Aplicada

    2015-05-15

    A nodal formulation of a fixed-source two-dimensional neutron transport problem, in Cartesian geometry, defined in a heterogeneous medium, is solved by an analytical approach. Explicit expressions, in terms of the spatial variables, are derived for averaged fluxes in each region in which the domain is subdivided. The procedure is an extension of an analytical discrete ordinates method, the ADO method, for the solution of the two-dimensional homogeneous medium case. The scheme is developed from the discrete ordinates version of the two-dimensional transport equation along with the level symmetric quadrature scheme. As usual for nodal schemes, relations between the averaged fluxes and the unknown angular fluxes at the contours are introduced as auxiliary equations. Numerical results are in agreement with results available in the literature.
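
    As a notational sketch of the node-averaging step common to such schemes (standard notation assumed here, not the paper's exact equations), the transverse-integrated fluxes over a region of widths Δx and Δy can be written in LaTeX as

      \bar{\psi}_{y,m}(x) = \frac{1}{\Delta y}\int_{0}^{\Delta y} \psi_m(x, y)\, dy,
      \qquad \bar{\psi}_{x,m}(y) = \frac{1}{\Delta x}\int_{0}^{\Delta x} \psi_m(x, y)\, dx,

    where \psi_m is the angular flux in discrete direction m; the auxiliary equations mentioned above relate these averaged fluxes to the unknown angular fluxes on the region contours.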

  16. Pros and cons of analytical methods to quantify surrogate contaminants from the challenge test in recycled polyethylene terephthalate

    Energy Technology Data Exchange (ETDEWEB)

    Felix, Juliana S., E-mail: jfelix@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain); Alfaro, Pilar, E-mail: palfarot@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain); Nerin, Cristina, E-mail: cnerin@unizar.es [Department of Analytical Chemistry, Aragon Institute of Engineering Research (I3A), CPS, University of Zaragoza, Torres Quevedo Bldg., Maria de Luna St. 3, E-50018 Zaragoza (Spain)

    2011-02-14

    Different analytical methods were optimized and applied to quantify certain surrogate contaminants (toluene, chlorobenzene, phenol, limonene and benzophenone) in samples of contaminated and recycled flakes and virgin pellets of polyethylene terephthalate (PET) coming from the industrial challenge test. A screening analysis of the PET samples was carried out by direct solid-phase microextraction (SPME) in headspace mode (HS). The methods developed and used for quantitative analysis were a) total dissolution of PET samples in dichloroacetic acid and analysis by HS-SPME coupled to gas chromatography-mass spectrometry (GC-MS) and, b) dichloromethane extraction and analysis by GC-MS. The concentration of all surrogates in the contaminated PET flakes analyzed by HS-SPME method was lower than expected according to information provided by the supplier. Dichloroacetic acid interacted with the surrogates, resulting in a tremendous decrease of limonene concentration. The degradation compounds from limonene were identified. Dichloromethane extraction and GC-MS analysis evidenced the highest values of analytes in these PET samples. Based on the foregoing data, the efficiency of the recycling process was evaluated, whereby the removal of 99.9% of the surrogates proceeding from the contaminated flakes was confirmed.

  17. Pros and cons of analytical methods to quantify surrogate contaminants from the challenge test in recycled polyethylene terephthalate

    International Nuclear Information System (INIS)

    Felix, Juliana S.; Alfaro, Pilar; Nerin, Cristina

    2011-01-01

    Different analytical methods were optimized and applied to quantify certain surrogate contaminants (toluene, chlorobenzene, phenol, limonene and benzophenone) in samples of contaminated and recycled flakes and virgin pellets of polyethylene terephthalate (PET) coming from the industrial challenge test. A screening analysis of the PET samples was carried out by direct solid-phase microextraction (SPME) in headspace mode (HS). The methods developed and used for quantitative analysis were a) total dissolution of PET samples in dichloroacetic acid and analysis by HS-SPME coupled to gas chromatography-mass spectrometry (GC-MS) and, b) dichloromethane extraction and analysis by GC-MS. The concentration of all surrogates in the contaminated PET flakes analyzed by HS-SPME method was lower than expected according to information provided by the supplier. Dichloroacetic acid interacted with the surrogates, resulting in a tremendous decrease of limonene concentration. The degradation compounds from limonene were identified. Dichloromethane extraction and GC-MS analysis evidenced the highest values of analytes in these PET samples. Based on the foregoing data, the efficiency of the recycling process was evaluated, whereby the removal of 99.9% of the surrogates proceeding from the contaminated flakes was confirmed.

  18. Comparison of General Purpose Heat Source testing with the ANSI N43.6-1977 (R 1989) sealed source standard

    International Nuclear Information System (INIS)

    Grigsby, C.O.

    1998-01-01

    This analysis provides a comparison of the testing of Radioisotope Thermoelectric Generators (RTGs) and RTG components with the testing requirements of ANSI N43.6-1977 (R1989), ''Sealed Radioactive Sources, Categorization''. The purpose of this comparison is to demonstrate that the RTGs meet or exceed the requirements of the ANSI standard, and thus can be excluded from the radioactive inventory of the Chemistry and Metallurgy Research (CMR) building in Los Alamos per Attachment 1 of DOE STD 1027-92. The approach used in this analysis is as follows: (1) describe the ANSI sealed source classification methodology; (2) develop sealed source performance requirements for the RTG and/or RTG components based on criteria from the accident analysis for CMR; (3) compare the existing RTG or RTG component test data to the CMR requirements; and (4) determine the appropriate ANSI classification for the RTG and/or RTG components based on the CMR performance requirements. The CMR requirements for treating RTGs as sealed sources are derived from the radiotoxicity of the isotope (238Pu) and the amount (13 kg) of radioactive material contained in the RTG. The accident analysis for the CMR BIO identifies the bounding accidents as a wing-wide fire, explosion, and earthquake. These accident scenarios set the requirements for RTGs or RTG components stored within the CMR.

  19. Relationship between first trimester aneuploidy screening test serum analytes and placenta accreta.

    Science.gov (United States)

    Büke, Barış; Akkaya, Hatice; Demir, Sibel; Sağol, Sermet; Şimşek, Deniz; Başol, Güneş; Barutçuoğlu, Burcu

    2018-01-01

    The aim of this study was to determine whether there is a relationship between first trimester serum pregnancy-associated plasma protein A (PAPP-A) and free beta human chorionic gonadotropin (fβhCG) MoM values and placenta accreta in women with placenta previa. A total of 88 patients with placenta previa who had first trimester aneuploidy screening test results were enrolled in the study. Nineteen of these patients were also diagnosed with placenta accreta. As probable markers of excessive placental invasion, serum PAPP-A and fβhCG MoM values were compared between the two groups with and without placenta accreta. Patients with placenta accreta had significantly higher serum PAPP-A MoM (1.20 versus 0.865, p = 0.045) and fβhCG MoM (1.42 versus 0.93, p = 0.042) values than patients without accreta. Higher first trimester serum PAPP-A and fβhCG MoM values appear to be associated with placenta accreta in women with placenta previa. Further studies are needed before these promising markers can be used as additional tools for the early detection of placenta accreta.

  20. Linking job demands and resources to employee engagement and burnout: a theoretical extension and meta-analytic test.

    Science.gov (United States)

    Crawford, Eean R; Lepine, Jeffery A; Rich, Bruce Louis

    2010-09-01

    We refine and extend the job demands-resources model with theory regarding appraisal of stressors to account for inconsistencies in relationships between demands and engagement, and we test the revised theory using meta-analytic structural modeling. Results indicate support for the refined and updated theory. First, demands and burnout were positively associated, whereas resources and burnout were negatively associated. Second, whereas relationships among resources and engagement were consistently positive, relationships among demands and engagement were highly dependent on the nature of the demand. Demands that employees tend to appraise as hindrances were negatively associated with engagement, and demands that employees tend to appraise as challenges were positively associated with engagement. Implications for future research are discussed. Copyright 2010 APA, all rights reserved

  1. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    Rakitin, I.D.; Malkin, S.D.; Shalia, V.V.; Fedorov, E.M.; Lebedev, N.N.; Khoudiakov, M.M.

    1999-01-01

    The Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing a new scope of R and D work on MMI improvement for the developer as well as for the user. The paper describes the possibilities for developing, adjusting and testing any new or upgraded operators' support system before its installation in the reference unit's control room. These simulators support the modeling of a wide range of accidents and transients and provide, through special software and Ethernet data communications, links to the prototypes of the operators' support systems. As an example, the paper describes the development and adjustment, with the use of the simulators, of two state-of-the-art operators' support systems. These systems have been developed jointly by the RRC KI and LNPP team. (author)

  2. APL/JHU free flight tests of the General Purpose Heat Source module. Testing: 5-7 March 1984

    International Nuclear Information System (INIS)

    Baker, W.M. II.

    1984-01-01

    Purpose of the test was to obtain statistical information on the dynamics of the General Purpose Heat Source (GPHS) module at terminal speeds. Models were designed to aerodynamically and dynamically represent the GPHS module. Normal and high speed photographic coverage documented the motion of the models. This report documents test parameters and techniques for the free-spin tests. It does not include data analysis

  3. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (by >10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.

  4. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    Science.gov (United States)

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. Providing high-quality laboratory tests is mandatory to assure their value for clinical studies. For adequate quality assurance in laboratory testing, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference intervals are also important factors. To overcome the problems arising from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  5. Plastics from household waste as a source of heavy metal pollution: An inventory study using INAA as the analytical technique

    International Nuclear Information System (INIS)

    Bode, P.; De Bruin, M.; Aalbers, Th.G.; Meyer, P.J.

    1990-01-01

    An inventory study of the levels of cadmium in the plastic component of household waste was carried out using INAA as the analytical technique. With a 2-h irradiation, 2-d decay, and 1-h measurement protocol, adequate sensitivities could be obtained for Cd, and also for a group of other metals: Cr, Co, Ni, Cu, Sr, Zn, As, Se, Mo, Sn, Sb, Ba, and Hg. Red-, orange-, and yellow-colored plastics either contain Cd at high levels (over 1000 mg/kg) or have relatively low Cd concentrations (<50 mg/kg). High concentrations were also occasionally found for Sr, Se, Ba, Sb, and Hg. INAA proved well suited to routine use for such analyses because it requires no destruction step and offers adequate sensitivity, high accuracy, and multielement results.

  6. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    Science.gov (United States)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.

  7. The World Spatiotemporal Analytics and Mapping Project (WSTAMP: Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World’s Largest Open Source Data Sets

    Directory of Open Access Journals (Sweden)

    J. Piburn

    2017-10-01

    Full Text Available Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.

  8. Analytical pyrolysis and thermally assisted hydrolysis and methylation of EUROSOIL humic acid samples: a key to their source

    NARCIS (Netherlands)

    Buurman, P.; Nierop, K.G.J.; Kaal, J.; Senesi, S.I.

    2009-01-01

    Humic acids have been widely investigated by spectroscopic methods, especially NMR and FTIR, and they are known to show significant differences according to their origin. Low-resolution methods such as NMR and FTIR, however, cannot easily distinguish different input sources or establish relations

  9. DEVELOPMENT OF SAMPLING AND ANALYTICAL METHODS FOR THE MEASUREMENT OF NITROUS OXIDE FROM FOSSIL FUEL COMBUSTION SOURCES

    Science.gov (United States)

    The report documents the technical approach and results achieved while developing a grab sampling method and an automated, on-line gas chromatography method suitable to characterize nitrous oxide (N2O) emissions from fossil fuel combustion sources. The two methods developed have...

  10. Analytical Subthreshold Current and Subthreshold Swing Models for a Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFET with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2017-08-01

    Two-dimensional (2D) analytical models for the subthreshold current and subthreshold swing of the back-gated fully depleted recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistor (MOSFET) are presented. The surface potential is determined by solving the 2D Poisson equation in both channel and buried-oxide (BOX) regions, considering suitable boundary conditions. To derive closed-form expressions for the subthreshold characteristics, the virtual cathode potential expression has been derived in terms of the minimum of the front and back surface potentials. The effect of various device parameters such as gate oxide and Si film thicknesses, thickness of source/drain penetration into BOX, applied back-gate bias voltage, etc. on the subthreshold current and subthreshold swing has been analyzed. The validity of the proposed models is established using the Silvaco ATLAS™ 2D device simulator.
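
    For context, the subthreshold swing that such a model predicts follows the usual definition (the device-specific closed-form expression is what the paper derives); in LaTeX notation,

      SS = \left[ \frac{\partial \left( \log_{10} I_{DS} \right)}{\partial V_{GS}} \right]^{-1},
      \qquad SS_{\mathrm{ideal}} = \frac{kT}{q}\,\ln 10 \approx 60\ \mathrm{mV/decade\ at\ }300\ \mathrm{K},

    so the quality of the device and of the model is judged by how closely the extracted swing approaches this ideal limit.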

  11. Dataset for Testing Contamination Source Identification Methods for Water Distribution Networks

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset includes the results of a simulation study using the source inversion techniques available in the Water Security Toolkit. The data was created to test...

  12. Evaluation and Testing of Several Free/Open Source Web Vulnerability Scanners

    OpenAIRE

    Suteva, Natasa; Zlatkovski, Dragi; Mileva, Aleksandra

    2013-01-01

    Web Vulnerability Scanners (WVSs) are software tools for identifying vulnerabilities in web applications. There are commercial WVSs, free/open source WVSs, and some companies offer them as Software-as-a-Service. In this paper, we test and evaluate six free/open source WVSs using the web application WackoPicko, which contains many known vulnerabilities, focusing primarily on false negative rates.

  13. Estimating and Testing the Sources of Evoked Potentials in the Brain.

    Science.gov (United States)

    Huizenga, Hilde M.; Molenaar, Peter C. M.

    1994-01-01

    The source of an event-related brain potential (ERP) is estimated from multivariate measures of ERP on the head under several mathematical and physical constraints on the parameters of the source model. Statistical aspects of estimation are discussed, and new tests are proposed. (SLD)

  14. Manufacturing cost study on the ion sources for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    A study of the cost of manufacturing 48 ion sources for the Mirror Fusion Test Facility is described. The estimate is built up from individual part costs and assembly operation times for the 80 kV prototype source constructed by LLL and described by LLL drawings furnished during December 1978. Recommendations for cost reduction are made

  15. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    Science.gov (United States)

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
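
    A minimal sketch of the forecasting approach described above is given below; it is not the tool's source code, the series is synthetic, monthly data are assumed (seasonal_periods=12), holdout mean absolute error is used as the ranking metric, and the statsmodels implementation of Holt-Winters stands in for the web tool's models.

      # Fit Holt-Winters additive/multiplicative and simple linear regression to
      # monthly test volumes, rank by holdout error, and report the best model.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      rng = np.random.default_rng(0)
      months = pd.date_range("2014-01-01", periods=60, freq="MS")
      volumes = pd.Series(1000 + 5 * np.arange(60)
                          + 100 * np.sin(2 * np.pi * np.arange(60) / 12)
                          + rng.normal(0, 30, 60), index=months)
      train, test = volumes[:-12], volumes[-12:]

      def linear_forecast(y, steps):
          # Simple linear regression on the time index.
          slope, intercept = np.polyfit(np.arange(len(y)), y.values, 1)
          return intercept + slope * (len(y) + np.arange(steps))

      candidates = {
          "Holt-Winters additive": ExponentialSmoothing(
              train, trend="add", seasonal="add", seasonal_periods=12).fit().forecast(12).values,
          "Holt-Winters multiplicative": ExponentialSmoothing(
              train, trend="add", seasonal="mul", seasonal_periods=12).fit().forecast(12).values,
          "Simple linear regression": linear_forecast(train, 12),
      }
      ranking = sorted(candidates, key=lambda m: np.mean(np.abs(candidates[m] - test.values)))
      for name in ranking:
          mae = np.mean(np.abs(candidates[name] - test.values))
          print(f"{name}: holdout MAE = {mae:.1f}")
      print("Best model:", ranking[0])

    Ranking the three candidate models on a holdout window of realized volumes mirrors the paper's point that predicted-versus-realized volumes give a precise way to evaluate utilization management initiatives.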

  16. Open-source software for demand forecasting of clinical laboratory test volumes using time-series analysis

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed

    2017-01-01

    Full Text Available Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand.

  17. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World s Largest Open Source Geographic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Robert N [ORNL; Piburn, Jesse O [ORNL; Sorokine, Alexandre [ORNL; Myers, Aaron T [ORNL; White, Devin A [ORNL

    2015-01-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.

  18. LC-MS/MS analytical procedure to quantify tris(nonylphenyl)phosphite, as a source of the endocrine disruptors 4-nonylphenols, in food packaging materials.

    Science.gov (United States)

    Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry

    2014-01-01

    Tris(nonylphenyl)phosphite (TNPP), an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.

  19. Analytical Evaluation of Preliminary Drop Tests Performed to Develop a Robust Design for the Standardized DOE Spent Nuclear Fuel Canister

    International Nuclear Information System (INIS)

    Ware, A.G.; Morton, D.K.; Smith, N.L.; Snow, S.D.; Rahl, T.E.

    1999-01-01

    The Department of Energy (DOE) has developed a design concept for a set of standard canisters for the handling, interim storage, transportation, and disposal in the national repository of DOE spent nuclear fuel (SNF). The standardized DOE SNF canister has to be capable of handling virtually all of the DOE SNF in a variety of potential storage and transportation systems. It must also be acceptable to the repository, based on current and anticipated future requirements. This expected usage mandates a robust design. The canister design has four unique geometries, with lengths of approximately 10 feet or 15 feet, and an outside nominal diameter of 18 inches or 24 inches. The canister has been developed to withstand a drop from 30 feet onto a rigid (flat) surface, sustaining only minor damage - but no rupture - to the pressure (containment) boundary. The majority of the end drop-induced damage is confined to the skirt and lifting/stiffening ring components, which can be removed if desired after an accidental drop. A canister, with its skirt and stiffening ring removed after an accidental drop, can continue to be used in service with appropriate operational steps being taken. Features of the design concept have been proven through drop testing and finite element analyses of smaller test specimens. Finite element analyses also validated the canister design for drops onto a rigid (flat) surface for a variety of canister orientations at impact, from vertical to 45 degrees off vertical. Actual 30-foot drop testing has also been performed to verify the final design, though limited to just two full-scale test canister drops. In each case, the analytical models accurately predicted the canister response.

  20. Analytic laboratory performance of a point of care urine culture kit for diagnosis and antibiotic susceptibility testing.

    Science.gov (United States)

    Bongard, E; Frimodt-Møller, N; Gal, M; Wootton, M; Howe, R; Francis, N; Goossens, H; Butler, C C

    2015-10-01

    Currently available point-of-care (POC) diagnostic tests for managing urinary tract infections (UTIs) in general practice are limited by poor performance characteristics, and laboratory culture generally provides results only after a few days. This laboratory evaluation compared the analytic performance of the POC UK Flexicult™ (Statens Serum Institut, SSI) urinary kit for quantification, identification and antibiotic susceptibility testing and routine UK National Health Service (NHS) urine processing to an advanced urine culture method. Two hundred urine samples routinely submitted to the Public Health Wales Microbiology Laboratory were divided and: (1) analysed by routine NHS microbiological tests as per local laboratory standard operating procedures, (2) inoculated onto the UK Flexicult™ SSI urinary kit and (3) spiral plated onto Colorex Orientation UTI medium (E&O Laboratories Ltd). Results from the NHS and Flexicult™ methods were compared, and discordant results were checked against the spiral plating method. The UK Flexicult™ SSI urinary kit was compared to routine NHS culture for identification of a pure or predominant uropathogen at ≥10⁵ cfu/mL, with a positive discordancy rate of 13.5% and a negative discordancy rate of 3%. The sensitivity and specificity were 86.7% [95% confidence interval (CI) 73.8-93.7] and 82.6% (95% CI 75.8-87.7), respectively. The UK Flexicult™ SSI urinary kit was comparable to routine NHS urine processing in identifying microbiologically positive UTIs in this laboratory evaluation. However, the number of false-positive samples could lead to over-prescribing of antibiotics in clinical practice. The Flexicult™ SSI kit could be useful as a POC test for UTIs in primary care but further pragmatic evaluations are necessary.
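
    For clarity, the reported performance figures follow the standard definitions (the underlying TP/FN/TN/FP counts are not given in this record); in LaTeX notation,

      \mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad \mathrm{specificity} = \frac{TN}{TN + FP},

    where true/false positives and negatives are counted for the Flexicult™ kit relative to the comparator culture method.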

  1. Sources of traffic and visitors' preferences regarding online public reports of quality: web analytics and online survey results.

    Science.gov (United States)

    Bardach, Naomi S; Hibbard, Judith H; Greaves, Felix; Dudley, R Adams

    2015-05-01

    In the context of the Affordable Care Act, there is extensive emphasis on making provider quality transparent and publicly available. Online public reports of quality exist, but little is known about how visitors find reports or about their purpose in visiting. To address this gap, we gathered website analytics data from a national group of online public reports of hospital or physician quality and surveyed real-time visitors to those websites. Websites were recruited from a national group of online public reports of hospital or physician quality. Analytics data were gathered from each website: number of unique visitors, method of arrival for each unique visitor, and search terms resulting in visits. Depending on the website, a survey invitation was launched for unique visitors on landing pages or on pages with quality information. Survey topics included type of respondent (eg, consumer, health care professional), purpose of visit, areas of interest, website experience, and demographics. There were 116,657 unique visitors to the 18 participating websites (1440 unique visitors/month per website), with most unique visitors arriving through search (63.95%, 74,606/116,657). Websites with a higher percent of traffic from search engines garnered more unique visitors (P=.001). The most common search terms were for individual hospitals (23.25%, 27,122/74,606) and website names (19.43%, 22,672/74,606); medical condition terms were uncommon (0.81%, 605/74,606). Survey view rate was 42.48% (49,560/116,657 invited) resulting in 1755 respondents (participation rate=3.6%). There were substantial proportions of consumer (48.43%, 850/1755) and health care professional respondents (31.39%, 551/1755). Across websites, proportions of consumer (21%-71%) and health care professional respondents (16%-48%) varied. Consumers were frequently interested in using the information to choose providers or assess the quality of their provider (52.7%, 225/427); the majority of those choosing a

  2. Sources of Traffic and Visitors’ Preferences Regarding Online Public Reports of Quality: Web Analytics and Online Survey Results

    Science.gov (United States)

    Hibbard, Judith H; Greaves, Felix; Dudley, R Adams

    2015-01-01

    Background In the context of the Affordable Care Act, there is extensive emphasis on making provider quality transparent and publicly available. Online public reports of quality exist, but little is known about how visitors find reports or about their purpose in visiting. Objective To address this gap, we gathered website analytics data from a national group of online public reports of hospital or physician quality and surveyed real-time visitors to those websites. Methods Websites were recruited from a national group of online public reports of hospital or physician quality. Analytics data were gathered from each website: number of unique visitors, method of arrival for each unique visitor, and search terms resulting in visits. Depending on the website, a survey invitation was launched for unique visitors on landing pages or on pages with quality information. Survey topics included type of respondent (eg, consumer, health care professional), purpose of visit, areas of interest, website experience, and demographics. Results There were 116,657 unique visitors to the 18 participating websites (1440 unique visitors/month per website), with most unique visitors arriving through search (63.95%, 74,606/116,657). Websites with a higher percent of traffic from search engines garnered more unique visitors (P=.001). The most common search terms were for individual hospitals (23.25%, 27,122/74,606) and website names (19.43%, 22,672/74,606); medical condition terms were uncommon (0.81%, 605/74,606). Survey view rate was 42.48% (49,560/116,657 invited) resulting in 1755 respondents (participation rate=3.6%). There were substantial proportions of consumer (48.43%, 850/1755) and health care professional respondents (31.39%, 551/1755). Across websites, proportions of consumer (21%-71%) and health care professional respondents (16%-48%) varied. Consumers were frequently interested in using the information to choose providers or assess the quality of their provider (52.7%, 225

  3. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Division for Analytical Chemistry continued its efforts to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs.

  4. A simple analytical scaling method for a scaled-down test facility simulating SB-LOCAs in a passive PWR

    International Nuclear Information System (INIS)

    Lee, Sang Il

    1992-02-01

    A simple analytical scaling method is developed for a scaled-down test facility simulating SB-LOCAs in a passive PWR. The whole scenario of a SB-LOCA is divided into two phases on the basis of the pressure trend: a depressurization phase and a pot-boiling phase. The pressure and the core mixture level are selected as the most critical parameters to be preserved between the prototype and the scaled-down model. In each phase the highly important phenomena influencing the critical parameters are identified, and the scaling parameters governing these phenomena are generated by the present method. To validate the models used, the Marviken CFT and the 336-rod-bundle experiment are simulated. The models overpredict both the pressure and the two-phase mixture level, but they agree at least qualitatively with the experimental results. In order to check whether the scaled-down model represents the important phenomena well, the nondimensional pressure response of a cold-leg 4-inch break transient is simulated for AP-600 and for the scaled-down model. The results of the present method are in excellent agreement with those of AP-600. It can be concluded that the present method is suitable for scaling a test facility simulating SB-LOCAs in a passive PWR.

  5. Quality management and accreditation in a mixed research and clinical hair testing analytical laboratory setting-a review.

    Science.gov (United States)

    Fulga, Netta

    2013-06-01

    Quality management and accreditation in the analytical laboratory setting are developing rapidly and becoming the standard worldwide. Quality management refers to all the activities used by organizations to ensure product or service consistency. Accreditation is a formal recognition by an authoritative regulatory body that a laboratory is competent to perform examinations and report results. The Motherisk Drug Testing Laboratory is licensed to operate at the Hospital for Sick Children in Toronto, Ontario. The laboratory performs toxicology tests of hair and meconium samples for research and clinical purposes. Most of the samples are involved in chain-of-custody cases. Establishing a quality management system and achieving accreditation have been mandatory by legislation for all Ontario clinical laboratories since 2003. The Ontario Laboratory Accreditation program is based on ISO 15189 (Medical laboratories - Particular requirements for quality and competence), an international standard that has been adopted as a national standard in Canada. The implementation of a quality management system involves management commitment, planning and staff education, documentation of the system, validation of processes, and assessment against the requirements. The maintenance of a quality management system requires control and monitoring of the entire laboratory path of workflow. The process of transformation of a research/clinical laboratory into an accredited laboratory, and the benefits of maintaining an effective quality management system, are presented in this article.

  6. Analytical electron microscopy examination of solid reaction products in long-term test of SRL 200 waste glasses

    International Nuclear Information System (INIS)

    Buck, E.C.; Fortner, J.A.; Bates, J.K.; Feng, X.; Dietz, N.L.; Bradley, C.R.; Tani, B.S.

    1993-01-01

    Alteration phases, found on the leached surfaces and present as colloids in the leachates of 200-based frit (fully active and simulated) nuclear waste glass, reacted under static test conditions at a surface area to leachate volume ratio of 20,000 m⁻¹ for 15 days to 728 days, have been examined by analytical electron microscopy. The compositions of the secondary phases were determined using x-ray energy dispersive spectroscopy and electron energy loss spectroscopy, and structural analysis was accomplished by electron diffraction. Long-term samples of simulated glass, which had undergone an acceleration of reaction after 182 days, possessed a number of silicate secondary phases, including smectite (iron silicate and potassium iron alumino-silicate), weeksite (uranium silicate), zeolite (calcium potassium alumino-silicate), tobermorite (calcium silicate), and a pure silica phase. However, uranium silicates and smectite have also been observed, in both the leachate and the leached layer, in tests which have not undergone the acceleration of reaction, suggesting that these phases are not responsible for the acceleration of reaction.

  7. Discussion and analytical test for inclusion of advanced field and boundary condition in theory of free electron lasers

    Science.gov (United States)

    Niknejadi, Pardis; Madey, John M. J.

    2017-09-01

    By the covariant statement of the distance in space-time separating transmitter and receivers, the emission and absorption of the retarded and advanced waves are all simultaneous. In other words, for signals carried on electromagnetic waves (advanced or retarded), the invariant interval (c dt)² - dr² between the emission of a wave and its absorption at the non-reflecting boundary is always identically zero. Utilizing this principle, we have previously explained the advantages of including the coherent radiation reaction force as part of the solution to the boundary value problem for FELs that radiate into "free space" (Self-Amplified Spontaneous Emission (SASE) FELs) and discussed how the advanced field of the absorber can interact with the radiating particles at the time of emission. Here we present an analytical test which verifies that a multilayer mirror can act as a band-pass filter and can contribute to microbunching in the electron beam, and we discuss the motivation, conditions, requirements, and method for testing this effect.

  8. Novel cellulose-based halochromic test strips for naked-eye detection of alkaline vapors and analytes.

    Science.gov (United States)

    Abou-Yousef, Hussein; Khattab, Tawfik A; Youssef, Yehia A; Al-Balakocy, Naser; Kamel, Samir

    2017-08-01

    A simple, portable and highly sensitive naked-eye test strip is successfully prepared for optical detection of gaseous and aqueous alkaline analytes. A novel pH-sensory tricyanofuran-hydrazone (TCFH) disperse colorant containing a hydrazone recognition functional moiety is synthesized via an azo-coupling reaction between an active methyl-containing tricyanofuran (TCF) heterocycle and the diazonium salt of 4-aminobenzaldehyde, followed by Knoevenagel condensation with malononitrile. UV-vis absorption spectra display solvatochromism and reversible color changes of the TCFH solution in dimethyl sulfoxide in response to pH variations. We investigate the preparation of hydrophobic cellulose/polyethylene terephthalate composites characterized by their high affinity for disperse dyes. Composite films made from CA, Cell/CA, PET/CA, and Cell/PET-CA are produced via a solvent-casting procedure using 10-30% modified cellulose or modified polyethylene terephthalate. The mechanical properties and morphologies of these composite films are investigated. The prepared pH-sensory hydrazone-based disperse dye is then applied to dye the produced cellulose-based composite films employing the high-temperature pressure dyeing procedure. The produced halochromic PET-CA-TCFH test strip provides an instant visible signal from orange to purple upon exposure to alkaline conditions, as demonstrated by the coloration measurements. The sensor strip exhibits high sensitivity and quick detection toward ammonia in both the aqueous and vapor phases by naked-eye observation at room temperature and atmospheric pressure. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Rationale for a spallation neutron source target system test facility at the 1-MW Long-Pulse Spallation Source

    International Nuclear Information System (INIS)

    Sommer, W.F.

    1995-12-01

    The conceptual design study for a 1-MW Long-Pulse Spallation Source at the Los Alamos Neutron Science Center has shown the feasibility of including a spallation neutron test facility at a relatively low cost. This document presents a rationale for developing such a test bed. Currently, neutron scattering facilities operate at a maximum power of 0.2 MW. Proposed new designs call for power levels as high as 10 MW, and future transmutation activities may require as much as 200 MW. A test bed will allow assessment of target neutronics; thermal hydraulics; remote handling; mechanical structure; corrosion in aqueous, non-aqueous, liquid metal, and molten salt systems; thermal shock on systems and system components; and materials for target systems. Reliable data in these areas are crucial to the safe and reliable operation of new high-power facilities. These tests will provide data useful not only to spallation neutron sources proposed or under development, but also to other projects in accelerator-driven transmutation technologies such as the production of tritium

  10. Multiple Sources of Test Bias on the WISC-R and Bender-Gestalt Test.

    Science.gov (United States)

    Oakland, Thomas; Feigenbaum, David

    1979-01-01

    Assessed test bias on the Wechsler Intelligence Test for Children-Revised (WISC-R) and Bender-Gestalt. On the Bender, evidence of bias was infrequent and irregular. On the WISC-R, group differences were most discernible for age, sex, family structure, and race. Consistent patterns of bias were not apparent among comparison groups. (Author)

  11. Field and analytical data relating to the 1972 and 1978 surveys of residual contamination of the Monte Bello Islands and Emu atomic weapons test sites

    International Nuclear Information System (INIS)

    Cooper, M.B.; Duggleby, J.C.

    1980-12-01

    Radiation surveys of the Monte Bello Islands test site in Western Australia, and the Emu test site in South Australia, were carried out in 1972 and 1978. The results have been published in ARL reports ARL/TR--010 and ARL/TR--012. The detailed field and analytical data which formed the basis of those publications are given.

  12. The detection of problem analytes in a single proficiency test challenge in the absence of the Health Care Financing Administration rule violations.

    Science.gov (United States)

    Cembrowski, G S; Hackney, J R; Carey, N

    1993-04-01

    The Clinical Laboratory Improvement Act of 1988 (CLIA 88) has dramatically changed proficiency testing (PT) practices by mandating (1) satisfactory PT for certain analytes as a condition of laboratory operation, (2) fixed PT limits for many of these "regulated" analytes, and (3) an increased number of PT specimens (n = 5) for each testing cycle. For many of these analytes, the fixed limits are much broader than the previously employed Standard Deviation Index (SDI) criteria. Paradoxically, there may be less incentive to identify and evaluate analytically significant outliers to improve the analytical process. Previously described "control rules" to evaluate these PT results are unworkable as they consider only two or three results. We used Monte Carlo simulations of Kodak Ektachem analyzers participating in PT to determine optimal control rules for the identification of PT results that are inconsistent with those from other laboratories using the same methods. The analysis of three representative analytes (potassium, creatine kinase, and iron) was simulated with varying intrainstrument and interinstrument standard deviations (s_i and s_g, respectively), obtained from College of American Pathologists (Northfield, Ill) Quality Assurance Services data and Proficiency Test data, respectively. Analytical errors were simulated in each of the analytes and evaluated in terms of multiples of the interlaboratory SDI. Simple control rules for detecting systematic and random error were evaluated with power function graphs (probability of error detection vs magnitude of error). Based on the simulation results, we recommend screening all analytes for the occurrence of two or more observations exceeding the same ±1 SDI limit. For any analyte satisfying this condition, the mean of the observations should be calculated. For analytes with s_g/s_i ratios between 1.0 and 1.5, a significant systematic error is signaled by the mean exceeding 1.0 SDI. Significant random error
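
    The recommended screening rule lends itself to a few lines of code. The following is a minimal sketch, assuming the five PT results have already been converted to SDI units; the function name, the thresholds quoted from the text, and the example data are illustrative only, not part of the published method.

        def flag_pt_results(sdi, mean_threshold=1.0):
            """Screen five proficiency-test results expressed in SDI units.

            Step 1: flag the analyte if two or more results fall beyond the
                    same +/- 1 SDI limit (all above +1 or all below -1).
            Step 2: for flagged analytes, a mean |SDI| above `mean_threshold`
                    suggests a systematic error (the 1.0 SDI threshold applies
                    to s_g/s_i ratios between 1.0 and 1.5 per the text above).
            """
            above = sum(1 for x in sdi if x > 1.0)
            below = sum(1 for x in sdi if x < -1.0)
            flagged = above >= 2 or below >= 2
            mean_sdi = sum(sdi) / len(sdi)
            systematic = flagged and abs(mean_sdi) > mean_threshold
            return flagged, mean_sdi, systematic

        # Illustrative data only: five PT results in SDI units
        print(flag_pt_results([1.4, 0.8, 1.2, 0.3, 1.1]))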

  13. Design of the 'half-size' ITER neutral beam source for the test facility ELISE

    International Nuclear Information System (INIS)

    Heinemann, B.; Falter, H.; Fantz, U.; Franzen, P.; Froeschle, M.; Gutser, R.; Kraus, W.; Nocentini, R.; Riedl, R.; Speth, E.; Staebler, A.; Wuenderlich, D.; Agostinetti, P.; Jiang, T.

    2009-01-01

    In 2007 the radio frequency driven negative hydrogen ion source developed at IPP in Garching was chosen by the ITER board as the new reference source for the ITER neutral beam system. In order to support the design and the commissioning and operating phases of the ITER test facilities ISTF and NBTF in Padua, IPP is presently constructing a new test facility, ELISE (Extraction from a Large Ion Source Experiment). ELISE will be operated with the so-called 'half-size ITER source', which is an intermediate step between the present small IPP RF sources (1/8 ITER size) and the full-size ITER source. The source will have approximately the width but only half the height of the ITER source. The modular concept with 4 drivers will allow an easy extrapolation to the full ITER size with 8 drivers. Pulsed beam extraction and acceleration up to 60 kV (corresponding to the pre-acceleration voltage of SINGAP) are foreseen. The aim of the design of the ELISE source and extraction system was to be as close as possible to the ITER design; however, it has some modifications allowing better diagnostic access as well as more flexibility for exploring open questions. Therefore one major difference compared to the source of ITER, NBTF or ISTF is the possible operation in air. Specific requirements for RF sources as found on the IPP test facilities BATMAN and MANITU are implemented [A. Staebler, et al., Development of a RF-driven ion source for the ITER NBI system, SOFT Conference 2008, Fusion Engineering and Design, 84 (2009) 265-268].

  14. Iterative and range test methods for an inverse source problem for acoustic waves

    International Nuclear Information System (INIS)

    Alves, Carlos; Kress, Rainer; Serranho, Pedro

    2009-01-01

    We propose two methods for solving an inverse source problem for time-harmonic acoustic waves. Based on the reciprocity gap principle, a nonlinear equation is presented for the locations and intensities of the point sources, which can be solved via Newton iterations. To provide an initial guess for this iteration we suggest a range test algorithm for approximating the source locations. We give a mathematical foundation for the range test and exhibit its feasibility in connection with the iteration method by some numerical examples.
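
    Purely as an illustration of the Newton step involved (not the authors' formulation), a Newton iteration with a finite-difference Jacobian can be sketched as follows; the residual used here is a toy stand-in problem, not the reciprocity-gap functional.

        import numpy as np

        def newton(residual, x0, tol=1e-10, max_iter=50, h=1e-7):
            """Newton iteration with a forward-difference Jacobian."""
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                r = residual(x)
                if np.linalg.norm(r) < tol:
                    break
                J = np.empty((r.size, x.size))   # numerical Jacobian, column by column
                for j in range(x.size):
                    dx = np.zeros_like(x)
                    dx[j] = h
                    J[:, j] = (residual(x + dx) - r) / h
                x = x + np.linalg.solve(J, -r)
            return x

        # Toy stand-in problem (NOT the reciprocity-gap functional): recover a
        # 2-D point-source position from its distances to two known receivers.
        receivers = np.array([[0.0, 0.0], [4.0, 0.0]])
        true_source = np.array([1.5, 2.0])
        measured = np.linalg.norm(receivers - true_source, axis=1)
        residual = lambda p: np.linalg.norm(receivers - p, axis=1) - measured
        print(newton(residual, x0=[1.0, 1.0]))   # -> approximately [1.5, 2.0]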

  15. Simulations of Liners and Test Objects for a New Atlas Advanced Radiography Source

    International Nuclear Information System (INIS)

    Morgan, D. V.; Iversen, S.; Hilko, R. A.

    2002-01-01

    The Advanced Radiographic Source (ARS) will improve the data significantly due to its smaller source width. Because of the enhanced ARS output, larger source-to-object distances become practical. The harder ARS source will allow radiography of thick high-Z targets. The five different spectral simulations resulted in similar imaging-detector-weighted transmission. This work used a limited set of test objects and imaging detectors; other test objects and imaging detectors could possibly change the MVp-sensitivity result. The effect of material motion blur must be considered for the ARS due to the expected smaller X-ray source size. This study supports the original 1.5-MVp value.

  16. Heat and mass release for some transient fuel source fires: A test report

    International Nuclear Information System (INIS)

    Nowlen, S.P.

    1986-10-01

    Nine fire tests using five different trash fuel source packages were conducted by Sandia National Laboratories. This report presents the findings of these tests. Reported data include heat and mass release rates, total heat and mass release, plume temperatures, and average fuel heat of combustion. These tests were conducted as part of the US Nuclear Regulatory Commission sponsored fire safety research program. Data from these tests were intended for use in nuclear power plant probabilistic risk assessment fire analyses. The results were also used as input to a fire test program at Sandia investigating the vulnerability of electrical control cabinets to fire. The fuel packages tested were chosen to be representative of small to moderately sized transient trash fuel sources of the type that would be found in a nuclear power plant. The highest fire intensity encountered during these tests was 145 kW. Plume temperatures did not exceed 820 °C.

  17. Installation and Characterization of Charged Particle Sources for Space Environmental Effects Testing

    Science.gov (United States)

    Skevington, Jennifer L.

    2010-01-01

    Charged particle sources are integral devices used by Marshall Space Flight Center's Environmental Effects Branch (EM50) to simulate space environments for accurate testing of materials and systems. By using these sources inside custom vacuum systems, materials can be tested to determine charging and discharging properties as well as resistance to sputter damage. This knowledge enables scientists and engineers to choose proper materials that will not fail in harsh space environments. This paper describes the steps taken to build a low-energy electron gun (the "Skevington 3000") as well as the methods used to characterize the output of both the Skevington 3000 and a manufactured xenon ion source. Such characterizations include beam flux, beam uniformity, and beam energy. Both sources were deemed suitable for simulating environments in future testing.

  18. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    International Nuclear Information System (INIS)

    Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.; Roberts, Matthew

    2014-01-01

    We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLED) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles at which interference extrema are observed, and the prominence of interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and the spatial broadening is related to the attenuation of the image-source interference prominence due to an averaging effect. The method is applied successfully both on simulated emission patterns and on experimental data, exhibiting a very good agreement with the results obtained by numerical techniques. We investigate the method performance in detail, showing that it is capable of producing accurate estimations for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimations, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in view of the crucial role the exciton distribution plays in determining the device performance.
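
    As a rough illustration of the Bragg-type relation mentioned above, the simplified two-beam condition 2 n z cos θ = m λ can be inverted for the emitter-to-cathode distance z. The metallic reflection phase shift and microcavity corrections that a full treatment includes are deliberately ignored here, and all numbers are placeholders rather than values from the paper.

        import numpy as np

        def emitter_distance_from_extremum(theta_deg, m, wavelength_nm, n_medium):
            """Invert the simplified two-beam (Bragg-type) condition
                 2 * n * z * cos(theta) = m * wavelength
            for the emitter-to-cathode distance z (same units as wavelength).
            theta_deg is the internal propagation angle from the cathode normal;
            reflection phase shift and microcavity corrections are ignored."""
            theta = np.radians(theta_deg)
            return m * wavelength_nm / (2.0 * n_medium * np.cos(theta))

        # Placeholder numbers: a first-order extremum observed at an internal
        # angle of 35 degrees in an organic layer of refractive index 1.8 at 520 nm
        print(emitter_distance_from_extremum(35.0, m=1, wavelength_nm=520.0, n_medium=1.8))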

  19. Analytical estimation of emission zone mean position and width in organic light-emitting diodes from emission pattern image-source interference fringes

    Energy Technology Data Exchange (ETDEWEB)

    Epstein, Ariel, E-mail: ariel.epstein@utoronto.ca; Tessler, Nir, E-mail: nir@ee.technion.ac.il; Einziger, Pinchas D. [Department of Electrical Engineering, Technion-Israel Institute of Technology, Haifa 32000 (Israel); Roberts, Matthew, E-mail: mroberts@cdtltd.co.uk [Cambridge Display Technology Ltd, Building 2020, Cambourne Business Park, Cambourne, Cambridgeshire CB23 6DW (United Kingdom)

    2014-06-14

    We present an analytical method for evaluating the first and second moments of the effective exciton spatial distribution in organic light-emitting diodes (OLED) from measured emission patterns. Specifically, the suggested algorithm estimates the emission zone mean position and width, respectively, from two distinct features of the pattern produced by interference between the emission sources and their images (induced by the reflective cathode): the angles at which interference extrema are observed, and the prominence of interference fringes. The relations between these parameters are derived rigorously for a general OLED structure, indicating that extrema angles are related to the mean position of the radiating excitons via Bragg's condition, and the spatial broadening is related to the attenuation of the image-source interference prominence due to an averaging effect. The method is applied successfully both on simulated emission patterns and on experimental data, exhibiting a very good agreement with the results obtained by numerical techniques. We investigate the method performance in detail, showing that it is capable of producing accurate estimations for a wide range of source-cathode separation distances, provided that the measured spectral interval is large enough; guidelines for achieving reliable evaluations are deduced from these results as well. As opposed to numerical fitting tools employed to perform similar tasks to date, our approximate method explicitly utilizes physical intuition and requires far less computational effort (no fitting is involved). Hence, applications that do not require highly resolved estimations, e.g., preliminary design and production-line verification, can benefit substantially from the analytical algorithm, when applicable. This introduces a novel set of efficient tools for OLED engineering, highly important in view of the crucial role the exciton distribution plays in determining the device performance.

  20. From the Kirsch-Kress potential method via the range test to the singular sources method

    International Nuclear Information System (INIS)

    Potthast, R; Schulz, J

    2005-01-01

    We review three reconstruction methods for inverse obstacle scattering problems. We analyse the relation between the Kirsch-Kress potential method (1986), the range test of Kusiak, Potthast and Sylvester (2003) and the singular sources method of Potthast (2000). In particular, we show that the range test is a logical extension of the Kirsch-Kress method into the category of sampling methods employing the tool of domain sampling. We then show how a multi-wave version of the range test can be set up and work out its relation to the singular sources method. Numerical examples and demonstrations are provided.

  1. Analytic energy gradient of excited electronic state within TDDFT/MMpol framework: Benchmark tests and parallel implementation.

    Science.gov (United States)

    Zeng, Qiao; Liang, WanZhen

    2015-10-07

    The time-dependent density functional theory (TDDFT) has become the most popular method to calculate the electronic excitation energies, describe the excited-state properties, and perform the excited-state geometric optimization of medium and large-size molecules due to the implementation of analytic excited-state energy gradient and Hessian in many electronic structure software packages. To describe the molecules in condensed phase, one usually adopts the computationally efficient hybrid Quantum Mechanics/Molecular Mechanics (QM/MM) models. Here, we extend our previous work on the energy gradient of TDDFT/MM excited state to account for the mutual polarization effects between QM and MM regions, which is believed to hold a crucial position in the potential energy surface of molecular systems when the photoexcitation-induced charge rearrangement in the QM region is drastic. The implementation of a simple polarizable TDDFT/MM (TDDFT/MMpol) model in Q-Chem/CHARMM interface with both the linear response and the state-specific features has been realized. Several benchmark tests and preliminary applications are exhibited to confirm our implementation and assess the effects of different treatment of environmental polarization on the excited-state properties, and the efficiency of parallel implementation is demonstrated as well.

  2. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 2. Assessment of MCNP Statistical Analysis of keff Eigenvalue Convergence with an Analytical Criticality Verification Test Set

    International Nuclear Information System (INIS)

    Sood, Avnet; Forster, R. Arthur; Parsons, D. Kent

    2001-01-01

    Monte Carlo simulations of nuclear criticality eigenvalue problems are often performed by general purpose radiation transport codes such as MCNP. MCNP performs detailed statistical analysis of the criticality calculation and provides feedback to the user with warning messages, tables, and graphs. The purpose of the analysis is to provide the user with sufficient information to assess spatial convergence of the eigenfunction and thus the validity of the criticality calculation. As a test of this statistical analysis package in MCNP, analytic criticality verification benchmark problems have been used for the first time to assess the performance of the criticality convergence tests in MCNP. The MCNP statistical analysis capability has been recently assessed using the 75 multigroup criticality verification analytic problem test set. MCNP was verified with these problems at the 10⁻⁴ to 10⁻⁵ statistical error level using 40 000 histories per cycle and 2000 active cycles. In all cases, the final boxed combined k_eff answer was given with the standard deviation and three confidence intervals that contained the analytic k_eff. To test the effectiveness of the statistical analysis checks in identifying poor eigenfunction convergence, ten problems from the test set were deliberately run incorrectly using 1000 histories per cycle, 200 active cycles, and 10 inactive cycles. Six problems with large dominance ratios were chosen from the test set because they do not achieve the normal spatial mode in the beginning of the calculation. To further stress the convergence tests, these problems were also started with an initial fission source point 1 cm from the boundary, thus increasing the likelihood of a poorly converged initial fission source distribution. The final combined k_eff confidence intervals for these deliberately ill-posed problems did not include the analytic k_eff value. In no case did a bad confidence interval go undetected. Warning messages were given signaling that
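
    MCNP's combined k_eff estimator actually pools collision, absorption, and track-length estimators; as a simplified single-estimator illustration only, the mean and a confidence interval over active cycles can be formed as follows (synthetic data, not values from the test set).

        import numpy as np

        def keff_confidence_interval(cycle_keff, z=1.96):
            """Mean and ~95% confidence interval of per-cycle k_eff estimates,
            using a normal approximation (adequate for thousands of active
            cycles).  Simplified single-estimator illustration; MCNP combines
            three estimators and applies further convergence diagnostics."""
            k = np.asarray(cycle_keff, dtype=float)
            mean = k.mean()
            sem = k.std(ddof=1) / np.sqrt(k.size)
            return mean, mean - z * sem, mean + z * sem

        # Illustrative synthetic active-cycle estimates scattered around 1.0
        rng = np.random.default_rng(0)
        print(keff_confidence_interval(rng.normal(1.0, 0.002, size=2000)))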

  3. Design and qualification testing of a strontium-90 fluoride heat source

    International Nuclear Information System (INIS)

    Fullam, H.T.

    1981-12-01

    The Strontium Heat Source Development Program began at the Pacific Northwest Laboratory (PNL) in 1972 and is scheduled to be completed by the end of FY-1981. The program is currently funded by the US Department of Energy (DOE) By-Product Utilization Program. The primary objective of the program has been to develop the data and technology required to permit the licensing of power systems for terrestrial applications that utilize ⁹⁰SrF₂-fueled radioisotope heat sources. A secondary objective of the program has been to design and qualification-test a general purpose ⁹⁰SrF₂-fueled heat source. The effort expended in the design and testing of the heat source is described. Detailed information is included on: heat source design, licensing requirements, and qualification test requirements; the qualification test procedures; and the fabrication and testing of capsules of various materials. The results obtained in the qualification tests show that the outer capsule design proposed for the ⁹⁰SrF₂ heat source is capable of meeting current licensing requirements when Hastelloy S is used as the outer capsule material. The data also indicate that an outer capsule of Hastelloy C-4 would probably also meet licensing requirements, although Hastelloy S is the preferred material. Therefore, based on the results of this study, the general purpose ⁹⁰SrF₂ heat source will consist of a standard WESF Hastelloy C-276 inner capsule filled with ⁹⁰SrF₂ and a Hastelloy S outer capsule having a 2.375-in. inner diameter and 0.500-in. wall thickness. The end closures for the outer capsule will utilize an interlocking joint design requiring a 0.1-in. penetration closure weld

  4. Analytical methods for the determination of mixtures of bisphenols and derivatives in human and environmental exposure sources and biological fluids. A review

    International Nuclear Information System (INIS)

    Caballero-Casero, N.; Lunar, L.; Rubio, S.

    2016-01-01

    Bisphenol A (BPA) is ubiquitous in humans and the environment. Its potential adverse effects through genomic and non-genomic pathways have fostered BPA replacement by bisphenol analogs that, unfortunately, exert similar adverse effects. Many of these analogs, as well as their derivatives, have already been found in humans and the environment, and major concerns have arisen over their low dose- and mixture-related effects. This review aims to discuss the characteristics of the main analytical methods reported so far for the determination of mixtures of bisphenol analogs and/or derivatives in human and environmental exposure sources and biological fluids. Approaches followed for removal of background contamination, sample preparation, and separation and detection of mixtures of bisphenols and derivatives are critically discussed. Sample treatment is matrix-dependent and common steps include analyte isolation, removal of interferences, evaporation of the extracts and solvent reconstitution. Separation and quantification have been almost exclusively carried out by liquid chromatography tandem mass spectrometry (LC-MS/MS) or gas chromatography mass spectrometry (GC–MS), in the latter case with prior derivatization, but LC-fluorescence detection has also found some applications. The main characteristics, advantages and drawbacks of these methods are comparatively discussed. Although at an early stage, some approaches for the assessment of the risk posed by mixtures of bisphenols, mainly based on the combination of chemical target analysis and toxicity evaluation, have already been applied and are presented here. Current knowledge gaps hindering a reliable assessment of human and environmental risk from mixtures of bisphenols and derivatives are outlined. - Highlights: • Analytical methods for the (bio)monitoring of mixtures of bisphenols are reviewed. • LC and GC coupled to MS are the preferred techniques. • Method-dependent sample treatments are required to remove matrix

  5. Analytical methods for the determination of mixtures of bisphenols and derivatives in human and environmental exposure sources and biological fluids. A review

    Energy Technology Data Exchange (ETDEWEB)

    Caballero-Casero, N.; Lunar, L.; Rubio, S., E-mail: qa1rubrs@uco.es

    2016-02-18

    Bisphenol A (BPA) is ubiquitous in humans and the environment. Its potential adverse effects through genomic and non-genomic pathways have fostered BPA replacement by bisphenol analogs that, unfortunately, exert similar adverse effects. Many of these analogs, as well as their derivatives, have already been found in humans and the environment, and major concerns have arisen over their low dose- and mixture-related effects. This review aims to discuss the characteristics of the main analytical methods reported so far for the determination of mixtures of bisphenol analogs and/or derivatives in human and environmental exposure sources and biological fluids. Approaches followed for removal of background contamination, sample preparation, and separation and detection of mixtures of bisphenols and derivatives are critically discussed. Sample treatment is matrix-dependent and common steps include analyte isolation, removal of interferences, evaporation of the extracts and solvent reconstitution. Separation and quantification have been almost exclusively carried out by liquid chromatography tandem mass spectrometry (LC-MS/MS) or gas chromatography mass spectrometry (GC–MS), in the latter case with prior derivatization, but LC-fluorescence detection has also found some applications. The main characteristics, advantages and drawbacks of these methods are comparatively discussed. Although at an early stage, some approaches for the assessment of the risk posed by mixtures of bisphenols, mainly based on the combination of chemical target analysis and toxicity evaluation, have already been applied and are presented here. Current knowledge gaps hindering a reliable assessment of human and environmental risk from mixtures of bisphenols and derivatives are outlined. - Highlights: • Analytical methods for the (bio)monitoring of mixtures of bisphenols are reviewed. • LC and GC coupled to MS are the preferred techniques. • Method-dependent sample treatments are required to remove matrix

  6. Analytical model of nanoscale junctionless transistors towards controlling of short channel effects through source/drain underlap and channel thickness engineering

    Science.gov (United States)

    Roy, Debapriya; Biswas, Abhijit

    2018-01-01

    We develop a 2D analytical subthreshold model for nanoscale double-gate junctionless transistors (DGJLTs) with gate-source/drain underlap. The model is validated using a well-calibrated TCAD simulation deck benchmarked against experimental data in the literature. To analyze and control short-channel effects, we calculate the threshold voltage, drain-induced barrier lowering (DIBL) and subthreshold swing of DGJLTs using our model and compare them with the corresponding simulated values at a channel length of 20 nm, with channel thickness tSi ranging from 5 to 10 nm, gate-source/drain underlap (LSD) values of 0-7 nm and source/drain doping concentrations (NSD) ranging from 5 to 12 × 10¹⁸ cm⁻³. As tSi reduces from 10 to 5 nm, DIBL drops from 42.5 to 0.42 mV/V at NSD = 10¹⁹ cm⁻³ and LSD = 5 nm, in contrast to a decrease from 71 to 4.57 mV/V without underlap. For a lower tSi, DIBL increases marginally with increasing NSD. The subthreshold swing reduces more rapidly with thinning of the channel than with increasing LSD or decreasing NSD.
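
    DIBL is conventionally reported as the threshold-voltage shift per unit change of drain bias. A minimal sketch of that bookkeeping, with illustrative voltages rather than the paper's data:

        def dibl_mv_per_v(vth_low_v, vth_high_v, vd_low_v=0.05, vd_high_v=1.0):
            """Drain-induced barrier lowering in mV/V: the threshold-voltage shift
            divided by the drain-bias step, sign-flipped so that a threshold drop
            at high drain bias (barrier lowering) comes out positive."""
            return -(vth_high_v - vth_low_v) / (vd_high_v - vd_low_v) * 1000.0

        # Illustrative only: V_th = 0.42 V at V_D = 50 mV, 0.38 V at V_D = 1 V
        print(dibl_mv_per_v(0.42, 0.38))   # ~42 mV/V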

  7. A New 500-kV Ion Source Test Stand for HIF

    International Nuclear Information System (INIS)

    Sangster, T.C.; Ahle, L.E.; Halaxa, E.F.; Karpenko, V.P.; Oldaker, M. E.; Mitchell, J.W.; Beck, D.N.; Bieniosek, F.M.; Henestroza, E.; Kwan, J.W.

    2000-01-01

    One of the most challenging aspects of ion beam driven inertial fusion energy is the reliable and efficient generation of low emittance, high current ion beams. The primary ion source requirements include a rise time of order 1 msec, a pulse width of at least 20 msec, a flattop ripple of less than 0.1% and a repetition rate of at least 5 Hz. Naturally, at such a repetition rate, the duty cycle of the source must be greater than 10⁸ pulses. Although these specifications do not appear to exceed the state-of-the-art for pulsed power, considerable effort remains to develop a suitable high current ion source. Therefore, we are constructing a 500-kV test stand specifically for studying various ion source concepts including surface, plasma and metal vapor arc. This paper will describe the test stand design specifications as well as the details of the various subsystems and components

  8. Orthodontic brackets removal under shear and tensile bond strength resistance tests - a comparative test between light sources

    Science.gov (United States)

    Silva, P. C. G.; Porto-Neto, S. T.; Lizarelli, R. F. Z.; Bagnato, V. S.

    2008-03-01

    We investigated whether a new LED system delivers enough energy to produce efficient shear and tensile bond strength under standardized tests. LEDs emitting at 470 ± 10 nm can be used to photocure composite during bracket fixation. Advantages in tensile and shear bond strength when these systems are used are necessary to justify their clinical use. Forty-eight extracted human premolars and two light sources were selected, one halogen lamp and a LED system. Premolar brackets were bonded with composite resin. Samples were submitted to standardized tests. A comparison between the light sources under the shear bond strength test gave similar results; however, the tensile bond test showed distinct results: a statistical difference at the 1% level between exposure times (40 and 60 seconds) and an interaction between light source and exposure time. The best result was obtained with the halogen lamp used for 60 seconds, even during re-bonding; however, the LED system can be used for bonding and re-bonding brackets if its power density is increased.

  9. Orthodontic brackets removal under shear and tensile bond strength resistance tests – a comparative test between light sources

    International Nuclear Information System (INIS)

    Silva, P C G; Porto-Neto, S T; Lizarelli, R F Z; Bagnato, V S

    2008-01-01

    We investigated whether a new LED system delivers enough energy to produce efficient shear and tensile bond strength under standardized tests. LEDs emitting at 470 ± 10 nm can be used to photocure composite during bracket fixation. Advantages in tensile and shear bond strength when these systems are used are necessary to justify their clinical use. Forty-eight extracted human premolars and two light sources were selected, one halogen lamp and a LED system. Premolar brackets were bonded with composite resin. Samples were submitted to standardized tests. A comparison between the light sources under the shear bond strength test gave similar results; however, the tensile bond test showed distinct results: a statistical difference at the 1% level between exposure times (40 and 60 seconds) and an interaction between light source and exposure time. The best result was obtained with the halogen lamp used for 60 seconds, even during re-bonding; however, the LED system can be used for bonding and re-bonding brackets if its power density is increased.

  10. A simplified model of the source channel of the Leksell GammaKnife tested with PENELOPE

    OpenAIRE

    Al-Dweri, Feras M. O.; Lallena, Antonio M.; Vilches, Manuel

    2004-01-01

    Monte Carlo simulations using the code PENELOPE have been performed to test a simplified model of the source channel geometry of the Leksell Gamma Knife®. The characteristics of the radiation passing through the treatment helmets are analysed in detail. We have found that only primary particles emitted from the source with polar angles smaller than 3° with respect to the beam axis are relevant for the dosimetry of the Gamma Knife. The photon trajectories reaching the out...

  11. Source passing test in Vesivehmaa air field - STUK/HUT team

    International Nuclear Information System (INIS)

    Honkamaa, T.; Tiilikainen, H.; Aarnio, P.; Nikkinen, M.

    1997-01-01

    Carborne radiation monitors were tested for point source responses at distances of 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m using speeds of 20 km h⁻¹ and 50 km h⁻¹. A large pressurised ionisation chamber (PIC), an HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillator detector (size 5'x5') were used. The sources had nominal activities of 22 MBq (⁶⁰Co) and 1.85 GBq (¹³⁷Cs). The ⁶⁰Co source strength was under the detection limit in all measurements. The detection of the ¹³⁷Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for the PIC. Statistical analysis shows that the ¹³⁷Cs source could be detected up to 100 m with the spectrometers and up to 50 m with the PIC if the background is well known. (au)
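
    As a generic illustration of the kind of criterion implied by "if the background is well known", Currie's decision threshold and detection limit for a counting measurement with a well-known background can be computed as follows; the count value is a placeholder, not data from the field test.

        import math

        def currie_limits(background_counts):
            """Currie's decision threshold L_C and detection limit L_D (in net
            counts) for a gross-count measurement when the background is well
            known, at approximately 5% false-positive and false-negative risk."""
            l_c = 1.645 * math.sqrt(background_counts)
            l_d = 2.71 + 3.29 * math.sqrt(background_counts)
            return l_c, l_d

        # Placeholder value: 400 background counts accumulated during the pass
        print(currie_limits(400.0))   # -> (32.9, 68.5)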

  12. Source passing test in Vesivehmaa air field - STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Honkamaa, T.; Tiilikainen, H. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Aarnio, P.; Nikkinen, M. [Helsinki Univ. of Technology, Espoo (Finland)

    1997-12-31

    Carborne radiation monitors were tested for point source responses at distances 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m using speed of 20 km h{sup -1} and 50 km h{sup -1}. A large pressurised ionisation chamber (PIC), and HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillator detector (size 5`x5`) were used. The sources had a nominal activity of 22 MBq ({sup 60}Co) and 1.85 GBq ({sup 137}Cs). The {sup 60}Co source strength was under the detection limit in all measurements. The detection of the {sup 137}Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for PIC. Statistical analysis shows that {sup 137}Cs source could be detected up to 100 m with the spectrometers and up to 50 m with PIC if the background is well known. (au).

  13. Source passing test in Vesivehmaa air field - STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Honkamaa, T; Tiilikainen, H [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Aarnio, P; Nikkinen, M [Helsinki Univ. of Technology, Espoo (Finland)

    1998-12-31

    Carborne radiation monitors were tested for point source responses at distances 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m using speed of 20 km h{sup -1} and 50 km h{sup -1}. A large pressurised ionisation chamber (PIC), and HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillator detector (size 5`x5`) were used. The sources had a nominal activity of 22 MBq ({sup 60}Co) and 1.85 GBq ({sup 137}Cs). The {sup 60}Co source strength was under the detection limit in all measurements. The detection of the {sup 137}Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for PIC. Statistical analysis shows that {sup 137}Cs source could be detected up to 100 m with the spectrometers and up to 50 m with PIC if the background is well known. (au).

  14. ALMA observations of lensed Herschel sources: testing the dark matter halo paradigm

    Science.gov (United States)

    Amvrosiadis, A.; Eales, S. A.; Negrello, M.; Marchetti, L.; Smith, M. W. L.; Bourne, N.; Clements, D. L.; De Zotti, G.; Dunne, L.; Dye, S.; Furlanetto, C.; Ivison, R. J.; Maddox, S. J.; Valiante, E.; Baes, M.; Baker, A. J.; Cooray, A.; Crawford, S. M.; Frayer, D.; Harris, A.; Michałowski, M. J.; Nayyeri, H.; Oliver, S.; Riechers, D. A.; Serjeant, S.; Vaccari, M.

    2018-04-01

    With the advent of wide-area submillimetre surveys, a large number of high-redshift gravitationally lensed dusty star-forming galaxies have been revealed. Because of the simplicity of the selection criteria for candidate lensed sources in such surveys, identified as those with S(500 μm) > 100 mJy, uncertainties associated with the modelling of the selection function are expunged. The combination of these attributes makes submillimetre surveys ideal for the study of strong lens statistics. We carried out a pilot study of the lensing statistics of submillimetre-selected sources by making observations with the Atacama Large Millimeter Array (ALMA) of a sample of strongly lensed sources selected from surveys carried out with the Herschel Space Observatory. We attempted to reproduce the distribution of image separations for the lensed sources using a halo mass function taken from a numerical simulation that contains both dark matter and baryons. We used three different density distributions, one based on analytical fits to the haloes formed in the EAGLE simulation and two density distributions [Singular Isothermal Sphere (SIS) and SISSA] that have been used before in lensing studies. We found that we could reproduce the observed distribution with all three density distributions, as long as we imposed an upper mass transition of ~10¹³ M⊙ for the SIS and SISSA models, above which we assumed that the density distribution could be represented by a Navarro-Frenk-White profile. We show that we would need a sample of ~500 lensed sources to distinguish between the density distributions, which is practical given the predicted number of lensed sources in the Herschel surveys.
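
    For orientation, the image separation produced by a singular isothermal sphere of line-of-sight velocity dispersion σ_v follows the standard result below (quoted here for context, not derived in the paper):

        % Einstein radius and image separation for a singular isothermal sphere (SIS)
        \theta_E = 4\pi\left(\frac{\sigma_v}{c}\right)^{2}\frac{D_{ds}}{D_s},
        \qquad \Delta\theta = 2\,\theta_E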

  15. Acoustic emission non-destructive testing of structures using source location techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Alan G.

    2013-09-01

    The technology of acoustic emission (AE) testing has been advanced and used at Sandia for the past 40 years. AE has been used on structures including pressure vessels, fire bottles, wind turbines, gas wells, nuclear weapons, and solar collectors. This monograph begins with background topics in acoustics and instrumentation and then focuses on current acoustic emission technology. It covers the overall design and system setups for a test, with a wind turbine blade as the object. Test analysis is discussed with an emphasis on source location. Three test examples are presented, two on experimental wind turbine blades and one on aircraft fire extinguisher bottles. Finally, the code for a FORTRAN source location program is given as an example of a working analysis program. Throughout the document, the stress is on actual testing of real structures, not on laboratory experiments.
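
    As a simple illustration of the time-of-arrival idea behind AE source location (not the monograph's FORTRAN program), linear location between two sensors on a structure reduces to one line of algebra; the sensor spacing, wave speed, and timing below are placeholders.

        def linear_ae_source_location(delta_t_s, sensor_spacing_m, wave_speed_mps):
            """Locate an acoustic-emission source on a line between two sensors.

            delta_t_s : arrival time at sensor 1 minus arrival time at sensor 2.
            Returns the source position measured from sensor 1 (sensor 2 sits
            at `sensor_spacing_m`).  Derived from t1 = x/v and t2 = (L - x)/v.
            """
            return 0.5 * (sensor_spacing_m + wave_speed_mps * delta_t_s)

        # Placeholder numbers: 1 m sensor spacing, 3000 m/s plate wave speed,
        # the hit reaches sensor 2 first by 0.1 ms -> source is nearer sensor 2
        print(linear_ae_source_location(delta_t_s=1e-4, sensor_spacing_m=1.0,
                                        wave_speed_mps=3000.0))   # -> 0.65 m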

  16. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
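
    A minimal sketch of the kind of notebook cell this workflow describes: threshold-based spike detection on a voltage trace using only numpy and matplotlib. The synthetic trace, threshold rule, and refractory period are illustrative, not taken from the shrew vagus recordings.

        import numpy as np
        import matplotlib.pyplot as plt

        # Synthetic 1-second trace at 20 kHz with a few injected "spikes"
        fs = 20000
        t = np.arange(fs) / fs
        rng = np.random.default_rng(1)
        trace = rng.normal(0.0, 5.0, size=fs)            # baseline noise, arbitrary units
        for idx in (2500, 8000, 13500):
            trace[idx:idx + 20] += 60.0 * np.exp(-np.arange(20) / 5.0)

        # Simple upward threshold crossing with a 1 ms refractory period
        threshold = 5.0 * np.median(np.abs(trace)) / 0.6745   # robust noise estimate
        crossings = np.flatnonzero((trace[1:] > threshold) & (trace[:-1] <= threshold))
        spike_times, last = [], -np.inf
        for c in crossings:
            if t[c + 1] - last > 1e-3:
                spike_times.append(t[c + 1])
                last = t[c + 1]

        plt.plot(t, trace, lw=0.5)
        plt.plot(spike_times, [threshold] * len(spike_times), "rv")
        plt.xlabel("Time (s)"); plt.ylabel("Amplitude (a.u.)")
        plt.savefig("spikes.png")                        # or plt.show() in a notebook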

  17. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  18. Use of GSR particle analysis program on an analytical SEM to identify sources of emission of airborne particles

    International Nuclear Information System (INIS)

    Chan, Y.C.; Trumper, J.; Bostrom, T.

    2002-01-01

    Full text: High concentrations of airborne particles, in particular PM10 (particulate matter 10 , but has been little used in Australia for airborne particulates. Two sets of 15 mm PM10 samples were collected in March and April 2000 from two sites in Brisbane, one within a suburb and one next to an arterial road. The particles were collected directly onto double-sided carbon tapes with a cascade impactor attached to a high-volume PM10 sampler. The carbon tapes were analysed in a JEOL 840 SEM equipped with a Be-window energy-dispersive X-ray detector and a Moran Scientific microanalysis system. An automated Gun Shot Residue (GSR) program was used together with backscattered electron imaging to characterise and analyse individual particulates. About 6,000 particles in total were analysed for each set of impactor samples. Due to limitations of useful pixel size, only particles larger than about 0.5 μm could be analysed. The size, shape and estimated elemental composition (from Na to Pb) of the particles were subjected to non-hierarchical cluster analysis and the characteristics of the clusters were related to their possible sources of emission. Both samples resulted in similar particle clusters. The particles could be classified into three main categories: non-spherical (58% of the total number of analysed particles, shape factor > 1.1), spherical (15%) and 'carbonaceous' (27%, i.e. with unexplained % of elemental mass >75%). Non-spherical particles were mainly sea salt and soil particles, and a small amount of iron, lead and mineral dust. The spherical particles were mainly sea salt particles and flyash, and a small amount of iron, lead and secondary sulphate dust. The carbonaceous particles included carbon material mixed with secondary aerosols, roadside dust, sea salt or industrial dust. The arterial road sample also contained more roadside dust and less secondary aerosols than the suburb sample. Current limitations with this method are the minimum particle size

  19. Well water quality in rural Nicaragua using a low-cost bacterial test and microbial source tracking.

    Science.gov (United States)

    Weiss, Patricia; Aw, Tiong Gim; Urquhart, Gerald R; Galeano, Miguel Ruiz; Rose, Joan B

    2016-04-01

    Water-related diseases, particularly diarrhea, are major contributors to morbidity and mortality in developing countries. Monitoring water quality on a global scale is crucial to making progress in terms of population health. Traditional analytical methods are difficult to use in many regions of the world in low-resource settings that face severe water quality issues due to the inaccessibility of laboratories. This study aimed to evaluate a new low-cost method (the compartment bag test (CBT)) in rural Nicaragua. The CBT was used to quantify the presence of Escherichia coli in drinking water wells and aimed to determine the source(s) of any microbial contamination. Results indicate that the CBT is a viable method for use in remote rural regions. The overall quality of well water in Pueblo Nuevo, Nicaragua was deemed unsafe, and results led to the conclusion that animal fecal wastes may be one of the leading causes of well contamination. Elevation and depth of wells were not found to impact overall water quality. However rope-pump wells had a 64.1% reduction in contamination when compared with simple wells.

  20. Manufacturing, assembly and tests of SPIDER Vacuum Vessel to develop and test a prototype of ITER neutral beam ion source

    Energy Technology Data Exchange (ETDEWEB)

    Zaccaria, Pierluigi, E-mail: pierluigi.zaccaria@igi.cnr.it [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete S.p.A.), Padova (Italy); Valente, Matteo; Rigato, Wladi; Dal Bello, Samuele; Marcuzzi, Diego; Agostini, Fabio Degli; Rossetto, Federico; Tollin, Marco [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete S.p.A.), Padova (Italy); Masiello, Antonio [Fusion for Energy F4E, Barcelona (Spain); Corniani, Giorgio; Badalocchi, Matteo; Bettero, Riccardo; Rizzetto, Dario [Ettore Zanon S.p.A., Schio (VI) (Italy)

    2015-10-15

    Highlights: • The SPIDER experiment aims to qualify and optimize the ion source for ITER injectors. • The large SPIDER Vacuum Vessel was built and it is under testing at the supplier. • The main working and assembly steps for production are presented in the paper. - Abstract: The SPIDER experiment (Source for the Production of Ions of Deuterium Extracted from an RF plasma) aims to qualify and optimize the full size prototype of the negative ion source foreseen for MITICA (full size ITER injector prototype) and the ITER Heating and Current Drive Injectors. Both SPIDER and MITICA experiments are presently under construction at Consorzio RFX in Padova (I), with the financial support from IO (ITER Organization), Fusion for Energy, Italian research institutions and contributions from Japan and India Domestic Agencies. The vacuum vessel hosting the SPIDER in-vessel components (Beam Source and calorimeters) has been manufactured, assembled and tested during the last two years 2013–2014. The cylindrical vessel, about 6 m long and 4 m in diameter, is composed of two cylindrical modules and two torispherical lids at the ends. All the parts are made by AISI 304 L stainless steel. The possibility of opening/closing the vessel for monitoring, maintenance or modifications of internal components is guaranteed by bolted junctions and suitable movable support structures running on rails fixed to the building floor. A large number of ports, about one hundred, are present on the vessel walls for diagnostic and service purposes. The main working steps for construction and specific technological issues encountered and solved for production are presented in the paper. Assembly sequences and tests on site are furthermore described in detail, highlighting all the criteria and requirements for correct positioning and testing of performances.

  1. Manufacturing, assembly and tests of SPIDER Vacuum Vessel to develop and test a prototype of ITER neutral beam ion source

    International Nuclear Information System (INIS)

    Zaccaria, Pierluigi; Valente, Matteo; Rigato, Wladi; Dal Bello, Samuele; Marcuzzi, Diego; Agostini, Fabio Degli; Rossetto, Federico; Tollin, Marco; Masiello, Antonio; Corniani, Giorgio; Badalocchi, Matteo; Bettero, Riccardo; Rizzetto, Dario

    2015-01-01

    Highlights: • The SPIDER experiment aims to qualify and optimize the ion source for ITER injectors. • The large SPIDER Vacuum Vessel was built and it is under testing at the supplier. • The main working and assembly steps for production are presented in the paper. - Abstract: The SPIDER experiment (Source for the Production of Ions of Deuterium Extracted from an RF plasma) aims to qualify and optimize the full size prototype of the negative ion source foreseen for MITICA (full size ITER injector prototype) and the ITER Heating and Current Drive Injectors. Both SPIDER and MITICA experiments are presently under construction at Consorzio RFX in Padova (I), with the financial support from IO (ITER Organization), Fusion for Energy, Italian research institutions and contributions from Japan and India Domestic Agencies. The vacuum vessel hosting the SPIDER in-vessel components (Beam Source and calorimeters) has been manufactured, assembled and tested during the last two years 2013–2014. The cylindrical vessel, about 6 m long and 4 m in diameter, is composed of two cylindrical modules and two torispherical lids at the ends. All the parts are made by AISI 304 L stainless steel. The possibility of opening/closing the vessel for monitoring, maintenance or modifications of internal components is guaranteed by bolted junctions and suitable movable support structures running on rails fixed to the building floor. A large number of ports, about one hundred, are present on the vessel walls for diagnostic and service purposes. The main working steps for construction and specific technological issues encountered and solved for production are presented in the paper. Assembly sequences and tests on site are furthermore described in detail, highlighting all the criteria and requirements for correct positioning and testing of performances.

  2. Analytical performance, agreement and user-friendliness of six point-of-care testing urine analysers for urinary tract infection in general practice

    NARCIS (Netherlands)

    Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M

    2015-01-01

    OBJECTIVE: Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection

  3. Low-Cost Method for Quantifying Sodium in Coconut Water and Seawater for the Undergraduate Analytical Chemistry Laboratory: Flame Test, a Mobile Phone Camera, and Image Processing

    Science.gov (United States)

    Moraes, Edgar P.; da Silva, Nilbert S. A.; de Morais, Camilo de L. M.; das Neves, Luiz S.; de Lima, Kassio M. G.

    2014-01-01

    The flame test is a classical analytical method that is often used to teach students how to identify specific metals. However, some universities in developing countries have difficulties acquiring the sophisticated instrumentation needed to demonstrate how to identify and quantify metals. In this context, a method was developed based on the flame…
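
    A minimal sketch of the general approach the title suggests: averaging one colour channel over a region of interest in a phone photograph of the flame and building a linear calibration curve. The file names, channel, region, and concentrations are hypothetical placeholders, not the authors' protocol.

        import numpy as np
        from PIL import Image

        def mean_channel_intensity(path, channel=0, box=(400, 300, 600, 500)):
            """Mean intensity of one colour channel (0=R, 1=G, 2=B) inside a
            rectangular region of interest of a flame photograph."""
            img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
            x0, y0, x1, y1 = box
            return img[y0:y1, x0:x1, channel].mean()

        # Hypothetical standards: Na concentrations (mg/L) and their photos
        standards_mg_l = np.array([0.0, 5.0, 10.0, 20.0])
        intensities = np.array([mean_channel_intensity(f"std_{c:g}.jpg")
                                for c in standards_mg_l])

        # Linear calibration: intensity = slope * concentration + intercept
        slope, intercept = np.polyfit(standards_mg_l, intensities, 1)
        sample_intensity = mean_channel_intensity("coconut_water.jpg")
        print((sample_intensity - intercept) / slope)   # estimated Na in mg/L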

  4. Justice at the millennium, a decade later: a meta-analytic test of social exchange and affect-based perspectives.

    Science.gov (United States)

    Colquitt, Jason A; Scott, Brent A; Rodell, Jessica B; Long, David M; Zapata, Cindy P; Conlon, Donald E; Wesson, Michael J

    2013-03-01

    Although a flurry of meta-analyses summarized the justice literature at the turn of the millennium, interest in the topic has surged in the decade since. In particular, the past decade has witnessed the rise of social exchange theory as the dominant lens for examining reactions to justice, and the emergence of affect as a complementary lens for understanding such reactions. The purpose of this meta-analytic review was to test direct, mediating, and moderating hypotheses that were inspired by those 2 perspectives, to gauge their adequacy as theoretical guides for justice research. Drawing on a review of 493 independent samples, our findings revealed a number of insights that were not included in prior meta-analyses. With respect to social exchange theory, our results revealed that the significant relationships between justice and both task performance and citizenship behavior were mediated by indicators of social exchange quality (trust, organizational commitment, perceived organizational support, and leader-member exchange), though such mediation was not apparent for counterproductive behavior. The strength of those relationships did not vary according to whether the focus of the justice matched the target of the performance behavior, contrary to popular assumptions in the literature, or according to whether justice was referenced to a specific event or a more general entity. With respect to affect, our results showed that justice-performance relationships were mediated by positive and negative affect, with the relevant affect dimension varying across justice and performance variables. Our discussion of these findings focuses on the merit in integrating the social exchange and affect lenses in future research.

  5. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. There are also numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the Federal and State health agencies' resources to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  6. Detailed design of the RF source for the 1 MV neutral beam test facility

    International Nuclear Information System (INIS)

    Marcuzzi, D.; Palma, M. Dalla; Pavei, M.; Heinemann, B.; Kraus, W.; Riedl, R.

    2009-01-01

    In the framework of the EU activities for the development of the Neutral Beam Injector for ITER, the detailed design of the Radio Frequency (RF) driven negative ion source to be installed in the 1 MV ITER Neutral Beam Test Facility (NBTF) has been carried out. Results coming from ongoing R and D on IPP test beds [A. Staebler et al., Development of a RF-Driven Ion Source for the ITER NBI System, this conference] and the design of the new ELISE facility [B. Heinemann et al., Design of the Half-Size ITER Neutral Beam Source Test Facility ELISE, this conference] brought several modifications to the solution based on the previous design. An assessment was carried out regarding the Back-Streaming positive Ions (BSI+) that impinge on the back plates of the ion source and cause high and localized heat loads. This led to the redesign of most heated components to increase cooling, and to different choices for the plasma facing materials to reduce the effects of sputtering. The design of the electric circuit, gas supply and the other auxiliary systems has been optimized. Integration with other components of the beam source has been revised, with regard to the interfaces with the supporting structure, the plasma grid and the flexible connections. In this paper the design is presented in detail, together with the results of the analyses performed for the thermo-mechanical verification of the components.

  7. Sealed source and device design safety testing. Volume 4: Technical report on the findings of Task 4, Investigation of sealed source for paper mill digester

    International Nuclear Information System (INIS)

    Benac, D.J.; Iddings, F.A.

    1995-10-01

    This report covers the Task 4 activities for the Sealed Source and Device Safety testing program. SwRI was contracted to investigate a suspected leaking radioactive source installed in a gauge on a paper mill digester. The actual source that was leaking was not available; therefore, SwRI examined another source. SwRI concluded that the encapsulated source it examined was not leaking. However, the presence of Cs-137 on the interior and exterior of the outer encapsulation and handling tube suggests that contamination probably occurred when the source was first manufactured, then installed in the handling tube

  8. Failure analysis of radioisotopic heat source capsules tested under multi-axial conditions

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Stacy, E.; Burgan, C.E.

    In order to qualify small radioisotopic heat sources for a 25-yr design life, multi-axial mechanical tests were performed on the structural components of the heat source. The results of these tests indicated that failure predominantly occurred in the middle of the weld ramp-down zone. Examination of the failure zone by standard metallographic techniques failed to indicate the true cause of failure. A modified technique utilizing chemical etching, scanning electron microscopy, and energy dispersive x-ray analysis was employed and dramatically indicated the true cause of failure, impurity concentration in the ramp-down zone. As a result of the initial investigation, weld parameters for the heat sources were altered. Example welds made with a pulse arc technique did not have this impurity buildup in the ramp-down zone

  9. Performance Test of the Microwave Ion Source with the Multi-layer DC Break

    International Nuclear Information System (INIS)

    Kim, Dae Il; Kwon, Hyeok Jung; Kim, Han Sung; Seol, Kyung Tae; Cho, Yong Sub

    2012-01-01

    A microwave proton source has been developed as a proton injector for the 100-MeV proton linac of the PEFP (Proton Engineering Frontier Project). In a microwave ion source, the high voltage for beam extraction is applied to the plasma chamber and also to the microwave components, such as the 2.45 GHz magnetron, the 3-stub tuner and the waveguides. If the microwave components can instead be installed on the ground side, the ion source can be operated and maintained more easily. For this purpose, a multi-layer DC break has been developed. A multi-layer insulation has the arrangement of conductors and insulators shown in Fig. 1. To verify stable operation as a multi-layer DC break, we checked the radiation of the insulator depending on the materials and carried out a high voltage test of a fabricated multi-layer insulation. In this report, the details of the performance test of the multi-layer DC break will be presented

  10. Performance test of electron cyclotron resonance ion sources for the Hyogo Ion Beam Medical Center

    Science.gov (United States)

    Sawada, K.; Sawada, J.; Sakata, T.; Uno, K.; Okanishi, K.; Harada, H.; Itano, A.; Higashi, A.; Akagi, T.; Yamada, S.; Noda, K.; Torikoshi, M.; Kitagawa, A.

    2000-02-01

    Two electron cyclotron resonance (ECR) ion sources were manufactured for the accelerator facility at the Hyogo Ion Beam Medical Center. H2+, He2+, and C4+ were chosen as the accelerating ions because they have the highest charge to mass ratio among ion states which satisfy the required intensity and quality. The sources have the same structure as the 10 GHz ECR source at the Heavy Ion Medical Accelerator in Chiba except for a few improvements in the magnetic structure. Their performance was investigated at the Sumitomo Heavy Industries factory before shipment. The maximum intensity was 1500 μA for H2+, 1320 μA for He2+, and 580 μA for C4+ at the end of the ion source beam transport line. These are several times higher than required. Sufficient performance was also observed in the flatness and long-term stability of the pulsed beams. These test results satisfy the requirements for medical use.

  11. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS

    Energy Technology Data Exchange (ETDEWEB)

    Thomae, R., E-mail: rthomae@tlabs.ac.za; Conradie, J.; Fourie, D.; Mira, J.; Nemulodi, F. [iThemba LABS, P.O. Box 722, Somerset West 7130 (South Africa); Kuechler, D.; Toivanen, V. [CERN, BE/ABP/HSL, 1211 Geneva 23 (Switzerland)

    2016-02-15

    At the iThemba Laboratory for Accelerator Based Sciences (iThemba LABS) an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions. The source is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed in which the development of high intensity argon and xenon beams is envisaged. In this paper, we present beam experiments with the GTS2 at iThemba LABS, reporting results of continuous wave and afterglow operation of xenon ion beams with oxygen as the supporting gas.

  12. A multi-analyte biosensor for the simultaneous label-free detection of pathogens and biomarkers in point-of-need animal testing.

    Science.gov (United States)

    Ewald, Melanie; Fechner, Peter; Gauglitz, Günter

    2015-05-01

    For the first time, a multi-analyte biosensor platform has been developed using the label-free 1-lambda-reflectometry technique. It is the first such platform that performs multi-analyte measurements without relying on imaging techniques. It is designed to be portable and cost-effective and therefore allows for point-of-need testing or on-site field-testing, with possible applications in diagnostics. This work highlights the application possibilities of this platform in the field of animal testing, but it is also relevant and transferable to human diagnostics. The performance of the platform has been evaluated using relevant reference systems, namely a biomarker (C-reactive protein) and serology (anti-Salmonella antibodies), as well as a panel of real samples (animal sera). The comparison of the working range and limit of detection shows no loss of performance when transferring the separate assays to the multi-analyte setup. Moreover, the new multi-analyte platform allows for discrimination between sera of animals infected with different Salmonella subtypes.

  13. Development and performance test of a continuous source of nitrous acid (HONO)

    Energy Technology Data Exchange (ETDEWEB)

    Ammann, M.; Roessler, E.; Kalberer, M.; Bruetsch, S.; Schwikowski, M.; Baltensperger, U.; Zellweger, C.; Gaeggeler, H.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    Laboratory investigations involving nitrous acid (HONO) require a stable, continuous source of HONO at ppb levels. A flow type generation system based on the reaction of sodium nitrite with sulfuric acid has been developed. Performance and speciation of gaseous products were tested with denuder and chemiluminescence techniques. (author) 2 figs., 2 refs.

  14. What Does a Verbal Test Measure? A New Approach to Understanding Sources of Item Difficulty.

    Science.gov (United States)

    Berk, Eric J. Vanden; Lohman, David F.; Cassata, Jennifer Coyne

    Assessing the construct relevance of mental test results continues to present many challenges, and it has proven to be particularly difficult to assess the construct relevance of verbal items. This study was conducted to gain a better understanding of the conceptual sources of verbal item difficulty using a unique approach that integrates…

  15. Portable test bench for the studies concerning ion sources and ion beam extraction and focusing systems

    International Nuclear Information System (INIS)

    Cordero Lopez, F.

    1961-01-01

    A portable test bench is described, which was designed to check ion sources and ion beam extraction and focusing systems before their use in a 600 keV Cockcroft-Walton accelerator. The vacuum capabilities of the system are analyzed specifically in connection with this particular use. The whole assembly can be considered a portable low-energy (50 keV) accelerator. (Author)

  16. Radionuclides in analytical chemistry

    International Nuclear Information System (INIS)

    Tousset, J.

    1984-01-01

    Applications of radionuclides in analytical chemistry are reviewed in this article: tracers, radioactive sources and activation analysis. Examples are given in all these fields and it is concluded that these methods should be used more widely

  17. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    Science.gov (United States)

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155

  18. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps.

    Science.gov (United States)

    O'Reilly-Shah, Vikas; Mackey, Sean

    2016-06-03

    We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.
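
    The module itself is written for Android and the abstract does not spell out its table layout; the following Python/boto3 sketch (hypothetical table and attribute names, and it assumes configured AWS credentials) only illustrates the kind of DynamoDB write that a survey or experience-sampling response such as the daily glucose question would generate.

```python
import time
import uuid

import boto3

# Hypothetical table and attribute names -- the abstract does not specify the
# actual schema used by Survalytics, only that responses land in DynamoDB.
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("survey_responses")

def store_response(participant_id, question_id, answer):
    """Write one experience-sampling answer plus minimal metadata."""
    table.put_item(
        Item={
            "response_id": str(uuid.uuid4()),   # partition key (assumed)
            "participant_id": participant_id,
            "question_id": question_id,
            "answer": answer,
            "timestamp_utc": int(time.time()),
        }
    )

store_response("subject-042", "daily_glucose_mg_dl", "104")
```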

  19. Sources of pre-analytical variations in yield of DNA extracted from blood samples: analysis of 50,000 DNA samples in EPIC.

    Directory of Open Access Journals (Sweden)

    Elodie Caboux

    Full Text Available The European Prospective Investigation into Cancer and nutrition (EPIC) is a long-term, multi-centric prospective study in Europe investigating the relationships between cancer and nutrition. This study has served as a basis for a number of Genome-Wide Association Studies (GWAS) and other types of genetic analyses. Over a period of 5 years, 52,256 EPIC DNA samples have been extracted using an automated DNA extraction platform. Here we have evaluated the pre-analytical factors affecting DNA yield, including anthropometric, epidemiological and technical factors such as center of subject recruitment, age, gender, body-mass index, disease case or control status, tobacco consumption, number of aliquots of buffy coat used for DNA extraction, extraction machine or procedure, DNA quantification method, degree of haemolysis and variations in the timing of sample processing. We show that the largest significant variations in DNA yield were observed with degree of haemolysis and with center of subject recruitment. Age, gender, body-mass index, cancer case or control status and tobacco consumption also significantly impacted DNA yield. Feedback from laboratories which have analyzed DNA with different SNP genotyping technologies demonstrate that the vast majority of samples (approximately 88%) performed adequately in different types of assays. To our knowledge this study is the largest to date to evaluate the sources of pre-analytical variations in DNA extracted from peripheral leucocytes. The results provide a strong evidence-based rationale for standardized recommendations on blood collection and processing protocols for large-scale genetic studies.

  20. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study the simplified models of the ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed for the different output service schemes.

  1. Testing Procedures and Results of the Prototype Fundamental Power Coupler for the Spallation Neutron Source

    International Nuclear Information System (INIS)

    M. Stirbet; I.E. Campisi; E.F. Daly; G.K. Davis; M. Drury; P. Kneisel; G. Myneni; T. Powers; W.J. Schneider; K.M. Wilson; Y. Kang; K.A. Cummings; T. Hardek

    2001-01-01

    High-power RF testing with peak power in excess of 500 kW has been performed on prototype Fundamental Power Couplers (FPC) for the Spallation Neutron Source superconducting (SNS) cavities. The testing followed the development of procedures for cleaning, assembling and preparing the FPC for installation in the test stand. The qualification of the couplers has occurred for the time being only in a limited set of conditions (travelling wave, 20 pps) as the available RF system and control instrumentation are under improvement

  2. Induced over voltage test on transformers using enhanced Z-source inverter based circuit

    Science.gov (United States)

    Peter, Geno; Sherine, Anli

    2017-09-01

    The normal life of a transformer is well above 25 years. The economical operation of the distribution system depends on the equipment being used, and the economics are such that it is financially advantageous to replace transformers with more than 15 years of service on the secondary market. Testing of transformers is required, as it indicates the extent to which a transformer can comply with the customer's specified requirements and the respective standards (IEC 60076-3). In this paper, induced over voltage testing of transformers using an enhanced Z-source inverter is discussed. Power electronic circuits are now essential for a whole array of industrial electronic products. The bulky motor-generator set, which is used to generate the required frequency for the induced over voltage testing of transformers, is nowadays replaced by a static frequency converter. First a conventional Z-source inverter and then an enhanced Z-source inverter are used to generate the required voltage and frequency to test the transformer in the induced over voltage test, and their characteristics are analysed.

  3. Not if, but how they differ: A meta-analytic test of the nomological networks of burnout and engagement

    Directory of Open Access Journals (Sweden)

    Daniel D. Goering

    2017-06-01

    Full Text Available The distinctiveness between work engagement and burnout has long been an issue of debate. To address this issue, we use a recently developed technique by Yu et al. (2016) to specify and test a meta-analytic structural equation model (MASEM) which accounts for the non-independence between engagement and burnout as well as the simultaneous effects of all relationships in our model, based on job demands-resources (JD-R) theory. We also estimate the degree of variability of these relationships across subpopulations. We report the findings as a distribution of effect size estimates—each estimate in the distribution representing the true effect size for a potential subpopulation—around the mean average estimate for each relationship in the model. Based on the findings, we conclude that overall burnout and engagement display empirically distinct relationships within the JD-R model (i.e., they are not antipodal), particularly in terms of antecedents. Perhaps most interestingly, rather than a polar opposite pattern of relationships, challenge demands have a similarly positive relationship to both burnout (β = 0.35, SD = 0.10) and engagement (β = 0.35, SD = 0.08), suggesting that challenge demands simultaneously lead—in equal force—to both engagement and burnout. In addition, the distributions of effect sizes are nearly identical for both relationships, indicating that this holds true for nearly all subpopulations. As expected, hindrance demands have a positive relationship with burnout (β = 0.31, SD = 0.10) and have a relatively weak, negative relationship on average to engagement (β = −0.07, SD = 0.07); work resources have a negative relationship with burnout (β = −0.15, SD = 0.06) and are positively related to engagement, but in absolute terms they are a stronger predictor of engagement (β = 0.33, SD = 0.05). In terms of outcomes, burnout and engagement predict a variety of behavioral and attitudinal outcomes

  4. Preparation of tracing source layer in simulation test of nuclide migration

    International Nuclear Information System (INIS)

    Zhao Yingjie; Ni Shiwei; Li Weijuan; Yamamoto, T.; Tanaka, T.; Komiya, T.

    1993-01-01

    In cooperative research between CIRP and JAERI on safety assessment for shallow land disposal of low level radioactive waste, a laboratory simulation test of nuclide migration was carried out, in which an undisturbed loess soil column sampled from CIRP's field test site was used as the testing material and three nuclides, Sr-85, Cs-137 and Co-60, were used as tracers. A special experiment on the tracing method was carried out, which included measuring the pH value of quartz sand in HCl solution, determining the eligible water content of quartz sand as tracer carrier, measuring the distribution uniformity of nuclides in the tracing quartz sand, determining the elution rate of nuclides from the tracing quartz sand and checking the activity uniformity of the tracing source layer. The experimental results showed that the tracing source layer, in which fine quartz sand was used as the tracer carrier, satisfied the expected requirements. (1 fig.)

  5. An analytic uranium sources model

    International Nuclear Information System (INIS)

    Singer, C.E.

    2001-01-01

    This document presents a method for estimating uranium resources as a continuous function of extraction costs and describing the uncertainty in the resulting fit. The estimated functions provide convenient extrapolations of currently available data on uranium extraction cost and can be used to predict the effect of resource depletion on future uranium supply costs. As such, they are a useful input for economic models of the nuclear energy sector. The method described here pays careful attention to minimizing built-in biases in the fitting procedure and defines ways to describe the uncertainty in the resulting fits in order to render the procedure and its results useful to the widest possible variety of potential users. (author)

  6. Ultracold neutron source at the PULSTAR reactor: Engineering design and cryogenic testing

    Energy Technology Data Exchange (ETDEWEB)

    Korobkina, E., E-mail: ekorobk@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Box 7909, Raleigh, NC 27695 (United States); Medlin, G. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Wehring, B.; Hawari, A.I. [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Box 7909, Raleigh, NC 27695 (United States); Huffman, P.R.; Young, A.R. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Beaumont, B. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Palmquist, G. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States)

    2014-12-11

    Construction is completed and commissioning is in progress for an ultracold neutron (UCN) source at the PULSTAR reactor on the campus of North Carolina State University. The source utilizes two stages of neutron moderation, one in heavy water at room temperature and the other in solid methane at ∼40K, followed by a converter stage, solid deuterium at 5 K, that allows a single down scattering of cold neutrons to provide UCN. The UCN source rolls into the thermal column enclosure of the PULSTAR reactor, where neutrons will be delivered from a bare face of the reactor core by streaming through a graphite-lined assembly. The source infrastructure, i.e., graphite-lined assembly, heavy-water system, gas handling system, and helium liquefier cooling system, has been tested and all systems operate as predicted. The research program being considered for the PULSTAR UCN source includes the physics of UCN production, fundamental particle physics, and material surface studies of nanolayers containing hydrogen. In the present paper we report details of the engineering and cryogenic design of the facility as well as results of critical commissioning tests without neutrons.

  7. Rapid detection and E-test antimicrobial susceptibility testing of Vibrio parahaemolyticus isolated from seafood and environmental sources in Malaysia.

    Science.gov (United States)

    Al-Othrubi, Saleh M; Hanafiah, Alfizah; Radu, Son; Neoh, Humin; Jamal, Rahaman

    2011-04-01

    To find out the prevalence and antimicrobial susceptibility of Vibrio parahaemolyticus in seafoods and environmental sources. The study was carried out at the Center of Excellence for Food Safety Research, Universiti Putra Malaysia; Universiti Kebangsaan Malaysia; Medical Molecular Biology Institute; and Universiti Kebangsaan Malaysia Hospital, Malaysia between January 2006 and August 2008. One hundred and forty-four isolates from 400 samples of seafood (122 isolates) and seawater sources (22 isolates) were investigated for the presence of thermostable direct hemolysin (tdh+) and TDH-related hemolysin (trh+) genes using the standard methods. The E-test method was used to test the antimicrobial susceptibility. The study indicates low occurrence of tdh+ (0.69%) and trh+ isolates (8.3%). None of the isolates tested possessed both virulence genes. High sensitivity was observed against tetracycline (98%). The mean minimum inhibitory concentration (MIC) of the isolates toward ampicillin increased from 4 μg/ml in 2004 to 24 μg/ml in 2007. The current study demonstrates a low occurrence of pathogenic Vibrio parahaemolyticus in the marine environment and seafood. Nonetheless, the potential risk of vibrio infection due to consumption of Vibrio parahaemolyticus contaminated seafood in Malaysia should not be neglected.

  8. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    Science.gov (United States)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force limited random vibration testing (FLVT) in a number of publications, in which the factor C2 is, besides the random vibration specification, the total mass and the turnover frequency of the load (test item), a very important parameter. A number of computational methods to estimate C2 are described in the literature, i.e. the simple and the complex two degrees of freedom system, STDFS and CTDFS, respectively. Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure transferring the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for the computation of a realistic value of C2 to perform a representative random vibration test based on force limitation, when the description of the adjacent structure (source) is more or less unknown. Marchand formulated a conservative estimate of C2 based on the maximum modal effective mass and damping of the test item (load), when no description of the supporting structure (source) is available [13]. Marchand discussed the formal derivation of C2, using the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and total mass of the load. This method is very convenient for computing the factor C2. However, finite element models are needed to compute the spectra of the PSD of both the acceleration and the force at the interface between load and source. Stevens presented the coupled systems modal approach (CSMA), where simplified asparagus patch models (parallel-oscillator representation) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies. When the random acceleration vibration specification is given the CSMA
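
    As background for the role of C2 (this is the commonly quoted semi-empirical form of Scharton's force limit, not a formula taken from the abstract above), the interface force specification is capped relative to the acceleration specification roughly as

    $$S_{FF}(f)=\begin{cases}C^{2}\,M_{0}^{2}\,S_{AA}(f), & f \le f_{0},\\[4pt] C^{2}\,M_{0}^{2}\,S_{AA}(f)\left(\dfrac{f_{0}}{f}\right)^{2}, & f > f_{0},\end{cases}$$

    where S_AA is the specified acceleration PSD, M_0 is the total mass of the load and f_0 is its turnover frequency; the roll-off exponent above f_0 (here 2) is itself a choice that varies between implementations.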

  9. Long-term storage life of light source modules by temperature cycling accelerated life test

    International Nuclear Information System (INIS)

    Sun Ningning; Tan Manqing; Li Ping; Jiao Jian; Guo Xiaofeng; Guo Wentao

    2014-01-01

    Light source modules are the most crucial and fragile devices that affect the life and reliability of the interferometric fiber optic gyroscope (IFOG). While the light emitting chips were stable in most cases, the module packaging proved to be less satisfactory. In long-term storage or in the working environment, the ambient temperature changes constantly, and thus the packaging and coupling performance of light source modules is likely to degrade slowly because the materials at the bonding interfaces have different coefficients of thermal expansion. A constant temperature accelerated life test cannot evaluate the impact of temperature variation on the performance of a module package, so the temperature cycling accelerated life test was studied. The main failure mechanism affecting light source modules is package failure due to solder fatigue, including fiber coupling shift, loss of cooling efficiency and thermal resistor degradation, so the Norris-Landzberg model was used to model solder fatigue life and to determine the activation energy related to the solder fatigue failure mechanism. By analyzing the test data, the activation energy was determined and the mean life of light source modules in different storage environments with continuously changing temperature was then simulated, providing direct reference data for the storage life prediction of the IFOG. (semiconductor devices)
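
    A minimal sketch of how such a temperature-cycling acceleration factor is typically computed from the Norris-Landzberg model (the exponents, activation energy and cycling numbers below are illustrative defaults, not values taken from the paper):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def norris_landzberg_af(f_use, f_test, dT_use, dT_test, Tmax_use, Tmax_test,
                        m=1.0 / 3.0, n=1.9, Ea=0.12):
    """Acceleration factor AF = N_f(use) / N_f(test) derived from the
    Norris-Landzberg form N_f = C * f**m * dT**(-n) * exp(Ea / (K_B * T_max)).

    f    : cycling frequency (cycles per day)
    dT   : temperature swing per cycle (K)
    Tmax : peak absolute temperature of the cycle (K)
    m, n, Ea : empirical solder-fatigue constants (illustrative values).
    """
    return ((f_use / f_test) ** m
            * (dT_test / dT_use) ** n
            * math.exp(Ea / K_B * (1.0 / Tmax_use - 1.0 / Tmax_test)))

# Example (invented numbers): 2 cycles/day in storage vs 24 cycles/day in the
# chamber, 30 K vs 100 K temperature swings, cycle peaks at 318 K vs 398 K.
af = norris_landzberg_af(2, 24, 30, 100, 318, 398)
print(f"one chamber cycle ~ {af:.0f} storage cycles")
```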

  10. Dependence of the source performance on plasma parameters at the BATMAN test facility

    Science.gov (United States)

    Wimmer, C.; Fantz, U.

    2015-04-01

    The investigation of the dependence of the source performance (high jH-, low je) for optimum Cs conditions on the plasma parameters at the BATMAN (Bavarian Test MAchine for Negative hydrogen ions) test facility is desirable in order to find key parameters for the operation of the source as well as to deepen the physical understanding. The most relevant source physics takes place in the extended boundary layer, which is the plasma layer with a thickness of several cm in front of the plasma grid: the production of H-, its transport through the plasma and its extraction, inevitably accompanied by the co-extraction of electrons. Hence, a link of the source performance with the plasma parameters in the extended boundary layer is expected. In order to characterize electron and negative hydrogen ion fluxes in the extended boundary layer, Cavity Ring-Down Spectroscopy and Langmuir probes have been applied for the measurement of the H- density and the determination of the plasma density, the plasma potential and the electron temperature, respectively. The plasma potential is of particular importance as it determines the sheath potential profile at the plasma grid: depending on the plasma grid bias relative to the plasma potential, a transition in the plasma sheath from an electron repelling to an electron attracting sheath takes place, influencing strongly the electron fraction of the bias current and thus the amount of co-extracted electrons. Dependencies of the source performance on the determined plasma parameters are presented for the comparison of two source pressures (0.6 Pa, 0.45 Pa) in hydrogen operation. The higher source pressure of 0.6 Pa is a standard point of operation at BATMAN with external magnets, whereas the lower pressure of 0.45 Pa is closer to the ITER requirements (p ≤ 0.3 Pa).

  11. Dependence of the source performance on plasma parameters at the BATMAN test facility

    International Nuclear Information System (INIS)

    Wimmer, C.; Fantz, U.

    2015-01-01

    The investigation of the dependence of the source performance (high jH−, low je) for optimum Cs conditions on the plasma parameters at the BATMAN (Bavarian Test MAchine for Negative hydrogen ions) test facility is desirable in order to find key parameters for the operation of the source as well as to deepen the physical understanding. The most relevant source physics takes place in the extended boundary layer, which is the plasma layer with a thickness of several cm in front of the plasma grid: the production of H−, its transport through the plasma and its extraction, inevitably accompanied by the co-extraction of electrons. Hence, a link of the source performance with the plasma parameters in the extended boundary layer is expected. In order to characterize electron and negative hydrogen ion fluxes in the extended boundary layer, Cavity Ring-Down Spectroscopy and Langmuir probes have been applied for the measurement of the H− density and the determination of the plasma density, the plasma potential and the electron temperature, respectively. The plasma potential is of particular importance as it determines the sheath potential profile at the plasma grid: depending on the plasma grid bias relative to the plasma potential, a transition in the plasma sheath from an electron repelling to an electron attracting sheath takes place, influencing strongly the electron fraction of the bias current and thus the amount of co-extracted electrons. Dependencies of the source performance on the determined plasma parameters are presented for the comparison of two source pressures (0.6 Pa, 0.45 Pa) in hydrogen operation. The higher source pressure of 0.6 Pa is a standard point of operation at BATMAN with external magnets, whereas the lower pressure of 0.45 Pa is closer to the ITER requirements (p ≤ 0.3 Pa)

  12. Open source non-invasive prenatal testing platform and its performance in a public health laboratory

    DEFF Research Database (Denmark)

    Johansen, Peter; Richter, Stine R; Balslev-Harder, Marie

    2016-01-01

    OBJECTIVE: The objective of this study was to introduce non-invasive prenatal testing (NIPT) for fetal autosomal trisomies and gender in a Danish public health setting, using semi-conductor sequencing and published open source scripts for analysis. METHODS: Plasma-derived DNA from a total of 375...... correlation (R(2)  = 0.72) to Y-chromosomal content of the male fetus samples. DISCUSSION: We have implemented NIPT into Danish health care using published open source scripts for autosomal aneuploidy detection and fetal DNA fraction estimation showing excellent false negative and false positive rates. Seq...
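
    The abstract does not reproduce the open source scripts themselves, but the standard read-count approach they implement can be illustrated as follows (all numbers, including the chrY background and adult-male reference fractions, are invented placeholders): a z-score of the chromosome-21 read fraction against a euploid reference set flags trisomy, and the chrY read fraction gives a rough fetal-fraction estimate for a male fetus.

```python
import numpy as np

def trisomy_z_score(chr_frac_sample, ref_fracs):
    """NIPT-style z-score: fraction of reads mapping to the chromosome of
    interest, compared against a euploid reference set."""
    ref = np.asarray(ref_fracs, dtype=float)
    return (chr_frac_sample - ref.mean()) / ref.std(ddof=1)

def fetal_fraction_from_y(chry_frac, female_bg=0.0002, adult_male=0.0020):
    """Rough fetal-fraction estimate for a male fetus, assuming the chrY read
    fraction interpolates linearly between a female background and an adult
    male reference (both reference values here are placeholders)."""
    return (chry_frac - female_bg) / (adult_male - female_bg)

# Invented example numbers: chr21 read fraction vs a small euploid reference set
z = trisomy_z_score(0.0135, [0.0129, 0.0130, 0.0131, 0.0130, 0.0129])
ff = fetal_fraction_from_y(0.00038)
print(f"chr21 z-score: {z:.1f}, estimated fetal fraction: {ff:.0%}")
```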

  13. The front end test stand high performance H- ion source at Rutherford Appleton Laboratory.

    Science.gov (United States)

    Faircloth, D C; Lawrie, S; Letchford, A P; Gabor, C; Wise, P; Whitehead, M; Wood, T; Westall, M; Findlay, D; Perkins, M; Savage, P J; Lee, D A; Pozimski, J K

    2010-02-01

    The aim of the front end test stand (FETS) project is to demonstrate that chopped low energy beams of high quality can be produced. FETS consists of a 60 mA Penning Surface Plasma Ion Source, a three solenoid low energy beam transport, a 3 MeV radio frequency quadrupole, a chopper, and a comprehensive suite of diagnostics. This paper details the design and initial performance of the ion source and the laser profile measurement system. Beam current, profile, and emittance measurements are shown for different operating conditions.

  14. Test of Effective Solid Angle code for the efficiency calculation of volume source

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of); Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    It is hard to determine a full energy (FE) absorption peak efficiency curve for an arbitrary volume source by experiment. That is why simulation and semi-empirical methods have been preferred so far, and many works have progressed in various ways. Moens et al. introduced the concept of the effective solid angle by considering the attenuation of γ-rays in the source, the media and the detector. This concept is based on a semi-empirical method. An Effective Solid Angle code (ESA code) has been developed over several years by the Applied Nuclear Physics Group at Seoul National University. The ESA code converts an experimental FE efficiency curve determined using a standard point source into one for a volume source. To test the performance of the ESA code, we measured point standard sources and voluminous certified reference material (CRM) γ-ray sources, and compared them with the efficiency curves obtained in this study. The 200~1500 keV energy region is fitted well. The NIST X-ray mass attenuation coefficient data are currently used to check for the effect of linear attenuation only. We will use the interaction cross-section data obtained from the XCOM code to check each contributing factor, such as the photoelectric effect, incoherent scattering and coherent scattering, in the future. In order to minimize the calculation time and simplify the code, optimization of the algorithm is needed.
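
    A rough sketch of the underlying idea (a Monte-Carlo toy, not the ESA code itself; the geometry numbers, the straight-path self-attenuation approximation and the function names are invented for illustration): the measured point-source efficiency is scaled by the ratio of the attenuation-weighted solid angle of the volume source to that of the point source.

```python
import numpy as np

rng = np.random.default_rng(42)

def solid_angle_disk(p, det_radius, det_z, n=20000):
    """Monte-Carlo estimate of the solid angle of a circular detector face
    (radius det_radius, centred on the z axis at z = det_z) seen from point p."""
    r = det_radius * np.sqrt(rng.random(n))
    phi = 2.0 * np.pi * rng.random(n)
    hits = np.column_stack((r * np.cos(phi), r * np.sin(phi), np.full(n, det_z)))
    vec = hits - p
    d = np.linalg.norm(vec, axis=1)
    cos_t = np.abs(vec[:, 2]) / d
    # Omega = integral of cos(theta)/d^2 over the detector face
    return np.pi * det_radius**2 * np.mean(cos_t / d**2)

def effective_solid_angle_volume(height, radius, det_radius, det_z, mu_src, n_src=300):
    """Attenuation-weighted ('effective') solid angle averaged over a cylindrical
    source just below z = 0; self-attenuation is approximated along the straight
    vertical escape path (a crude simplification)."""
    depth = height * rng.random(n_src)
    r = radius * np.sqrt(rng.random(n_src))
    phi = 2.0 * np.pi * rng.random(n_src)
    pts = np.column_stack((r * np.cos(phi), r * np.sin(phi), -depth))
    weights = np.exp(-mu_src * depth)
    omegas = np.array([solid_angle_disk(p, det_radius, det_z) for p in pts])
    return np.average(omegas, weights=weights)

# eff_vol(E) ~ eff_point(E) * Omega_eff(volume) / Omega_eff(point source)
eff_point = 0.012   # measured point-source FE peak efficiency at this energy
omega_point = solid_angle_disk(np.zeros(3), det_radius=3.0, det_z=5.0)
omega_vol = effective_solid_angle_volume(height=4.0, radius=2.5,
                                         det_radius=3.0, det_z=5.0, mu_src=0.15)
print("estimated volume-source efficiency:",
      round(eff_point * omega_vol / omega_point, 4))
```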

  15. Safety Test Program Summary SNAP 19 Pioneer Heat Source Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1971-07-01

    Sixteen heat source assemblies have been tested in support of the SNAP 19 Pioneer Safety Test Program. Seven were subjected to simulated reentry heating in various plasma arc facilities followed by impact on earth or granite. Six assemblies were tested under abort accident conditions of overpressure, shrapnel impact, and solid and liquid propellant fires. Three capsules were hot impacted under Transit capsule impact conditions to verify comparability of test results between the two similar capsule designs, thus utilizing both Pioneer and Transit Safety Test results to support the Safety Analysis Report for Pioneer. The tests have shown that the fuel is contained under all nominal accident environments, with the exception of minor capsule cracks under severe impact and solid fire environments. No catastrophic capsule failures that would release large quantities of fuel occurred in these tests. In no test was fuel visible to the eye following impact or fire. Breached capsules were defined as those which exhibited thoria contamination on their surfaces following a test, or which exhibited visible cracks in the post-test metallographic analyses.

  16. Preservatives and neutralizing substances in milk: analytical sensitivity of official specific and nonspecific tests, microbial inhibition effect, and residue persistence in milk

    Directory of Open Access Journals (Sweden)

    Livia Cavaletti Corrêa da Silva

    2015-09-01

    Full Text Available Milk fraud has been a recurring problem in Brazil; thus, it is important to know the effect of most frequently used preservatives and neutralizing substances as well as the detection capability of official tests. The objective of this study was to evaluate the analytical sensitivity of legislation-described tests and nonspecific microbial inhibition tests, and to investigate the effect of such substances on microbial growth inhibition and the persistence of detectable residues after 24/48h of refrigeration. Batches of raw milk, free from any contaminant, were divided into aliquots and mixed with different concentrations of formaldehyde, hydrogen peroxide, sodium hypochlorite, chlorine, chlorinated alkaline detergent, or sodium hydroxide. The analytical sensitivity of the official tests was 0.005%, 0.003%, and 0.013% for formaldehyde, hydrogen peroxide, and hypochlorite, respectively. Chlorine and chlorinated alkaline detergent were not detected by regulatory tests. In the tests for neutralizing substances, sodium hydroxide could not be detected when acidity was accurately neutralized. The yogurt culture test gave results similar to those obtained by official tests for the detection of specific substances. Concentrations of 0.05% of formaldehyde, 0.003% of hydrogen peroxide and 0.013% of sodium hypochlorite significantly reduced (P

  17. JRR-3 cold neutron source facility H2-O2 explosion safety proof testing

    International Nuclear Information System (INIS)

    Hibi, T.; Fuse, H.; Takahashi, H.; Akutsu, C.; Kumai, T.; Kawabata, Y.

    1990-01-01

    A Cold Neutron Source (CNS) will be installed in the Japan Research Reactor-3 (JRR-3) at the Japan Atomic Energy Research Institute (JAERI) during its remodeling project. This CNS holds liquid hydrogen at a temperature of about 20 K as a cold neutron source moderator in the heavy water area of the reactor to moderate thermal neutrons from the reactor to cold neutrons of about 5 meV energy. In the hydrogen circuit of the CNS, safety measures are taken to prevent an oxygen/hydrogen reaction (H2-O2 explosion). It is also designed in such a manner that, should an H2-O2 explosion take place, the soundness of all the components can be maintained so as not to harm reactor safety. A test hydrogen circuit identical to that of the CNS (real components designed by TECHNICATOME of France) was manufactured to conduct the H2-O2 explosion test. In this test, a detonation, the severest phenomenon of the oxygen/hydrogen reaction, was produced in the test hydrogen circuit to measure the pressure exerted on the components and their strain, deformation, leakage, cracking, etc. Based on the results of this measurement, the structural strength of the test hydrogen circuit was analyzed. The results of this test show that the hydrogen circuit components have sufficient structural strength to withstand an oxygen/hydrogen reaction

  18. The preliminary tests of the superconducting electron cyclotron resonance ion source DECRIS-SC2.

    Science.gov (United States)

    Efremov, A; Bekhterev, V; Bogomolov, S; Drobin, V; Loginov, V; Lebedev, A; Yazvitsky, N; Yakovlev, B

    2012-02-01

    A new compact version of the "liquid He-free" superconducting ECR ion source, to be used as an injector of highly charged heavy ions for the MC-400 cyclotron, is designed and built at the Flerov Laboratory of Nuclear Reactions in collaboration with the Laboratory of High Energy Physics of JINR. The axial magnetic field of the source is created by the superconducting magnet and the NdFeB hexapole is used for the radial plasma confinement. The microwave frequency of 14 GHz is used for ECR plasma heating. During the first tests, the source shows a good enough performance for the production of medium charge state ions. In this paper, we will present the design parameters and the preliminary results with gaseous ions.

  19. The Application of the Analytic Hierarchy Process and a New Correlation Algorithm to Urban Construction and Supervision Using Multi-Source Government Data in Tianjin

    Directory of Open Access Journals (Sweden)

    Shaoyi Wang

    2018-02-01

    Full Text Available As the era of big data approaches, big data has attracted increasing amounts of attention from researchers. Various types of studies have been conducted and these studies have focused particularly on the management, organization, and correlation of data and calculations using data. Most studies involving big data address applications in scientific, commercial, and ecological fields. However, the application of big data to government management is also needed. This paper examines the application of multi-source government data to urban construction and supervision in Tianjin, China. The analytic hierarchy process and a new approach called the correlation degree algorithm are introduced to calculate the degree of correlation between different approval items in one construction project and between different construction projects. The results show that more than 75% of the construction projects and their approval items are highly correlated. The results of this study suggest that most of the examined construction projects are well supervised, have relatively high probabilities of satisfying the relevant legal requirements, and observe their initial planning schemes.
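
    The paper's exact hierarchy and comparison matrices are not given in the abstract, but a minimal sketch of the analytic hierarchy process step looks like this (the three criteria in the example matrix are invented for illustration): pairwise comparisons are turned into priority weights via the principal eigenvector, with a consistency ratio as a sanity check before the weights feed the correlation scoring.

```python
import numpy as np

# Saaty's random consistency index for matrices of size 1..9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix
    (principal right eigenvector, normalised to sum to 1)."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)       # consistency index
    cr = ci / RI[n] if n >= 3 else 0.0      # consistency ratio
    return w, cr

# Hypothetical criteria for scoring how well an approval item matches a
# project: time overlap, shared location, shared applicant.
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w, cr = ahp_weights(A)
print("weights:", w.round(3), "consistency ratio:", round(cr, 3))
```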

  20. Advancing Explosion Source Theory through Experimentation: Results from Seismic Experiments Since the Moratorium on Nuclear Testing

    Science.gov (United States)

    Bonner, J. L.; Stump, B. W.

    2011-12-01

    On 23 September 1992, the United States conducted the nuclear explosion DIVIDER at the Nevada Test Site (NTS). It would become the last US nuclear test when a moratorium ended testing the following month. Many of the theoretical explosion seismic models used today were developed from observations of hundreds of nuclear tests at NTS and around the world. Since the moratorium, researchers have turned to chemical explosions as a possible surrogate for continued nuclear explosion research. This talk reviews experiments since the moratorium that have used chemical explosions to advance explosion source models. The 1993 Non-Proliferation Experiment examined single-point, fully contained chemical-nuclear equivalence by detonating over a kiloton of chemical explosive at NTS in close proximity to previous nuclear explosion tests. When compared with data from these nearby nuclear explosions, the regional and near-source seismic data were found to be essentially identical after accounting for different yield scaling factors for chemical and nuclear explosions. The relationship between contained chemical explosions and large production mining shots was studied at the Black Thunder coal mine in Wyoming in 1995. The research led to an improved source model for delay-fired mining explosions and a better understanding of mining explosion detection by the International Monitoring System (IMS). The effect of depth was examined in a 1997 Kazakhstan Depth of Burial experiment. Researchers used local and regional seismic observations to conclude that the dominant mechanism for enhanced regional shear waves was local Rg scattering. Travel-time calibration for the IMS was the focus of the 1999 Dead Sea Experiment where a 10-ton shot was recorded as far away as 5000 km. The Arizona Source Phenomenology Experiments provided a comparison of fully- and partially-contained chemical shots with mining explosions, thus quantifying the reduction in seismic amplitudes associated with partial

  1. Clinical Neuropathology practice news 1-2014: Pyrosequencing meets clinical and analytical performance criteria for routine testing of MGMT promoter methylation status in glioblastoma

    Science.gov (United States)

    Preusser, Matthias; Berghoff, Anna S.; Manzl, Claudia; Filipits, Martin; Weinhäusel, Andreas; Pulverer, Walter; Dieckmann, Karin; Widhalm, Georg; Wöhrer, Adelheid; Knosp, Engelbert; Marosi, Christine; Hainfellner, Johannes A.

    2014-01-01

    Testing of the MGMT promoter methylation status in glioblastoma is relevant for clinical decision making and research applications. Two recent and independent phase III therapy trials confirmed a prognostic and predictive value of the MGMT promoter methylation status in elderly glioblastoma patients. Several methods for MGMT promoter methylation testing have been proposed, but seem to be of limited test reliability. Therefore, and also due to feasibility reasons, translation of MGMT methylation testing into routine use has been protracted so far. Pyrosequencing after prior DNA bisulfite modification has emerged as a reliable, accurate, fast and easy-to-use method for MGMT promoter methylation testing in tumor tissues (including formalin-fixed and paraffin-embedded samples). We performed an intra- and inter-laboratory ring trial which demonstrates a high analytical performance of this technique. Thus, pyrosequencing-based assessment of MGMT promoter methylation status in glioblastoma meets the criteria of high analytical test performance and can be recommended for clinical application, provided that strict quality control is performed. Our article summarizes clinical indications, practical instructions and open issues for MGMT promoter methylation testing in glioblastoma using pyrosequencing. PMID:24359605

  2. Prototype tests on the ion source power supplies of the TEXTOR NI-system

    International Nuclear Information System (INIS)

    Goll, O.; Braunsberger, U.; Schwarz, U.

    1987-01-01

    The PINI ion source for the TEXTOR neutral injector is fed by a new modular transistorized power supply. All modules are located in a high voltage cage at 55 kV dc against ground. The normal operation of the injectors includes frequent grid breakdowns causing transient high voltage stresses on the ion source power supplies. These stresses must not disturb the safe operation of the power supplies. The paper describes the setup for extensive testing of a supply prototype module under the expected operating conditions. The main features of this test program are reviewed and the measures taken for safe operation are discussed. As a result of the investigations, recommendations for the installation of the power supplies at the TEXTOR NI system are given

  3. Heavy ion beams from an Alphatross source for use in calibration and testing of diagnostics

    Science.gov (United States)

    Ward, R. J.; Brown, G. M.; Ho, D.; Stockler, B. F. O. F.; Freeman, C. G.; Padalino, S. J.; Regan, S. P.

    2016-10-01

    Ion beams from the 1.7 MV Pelletron Accelerator at SUNY Geneseo have been used to test and calibrate many inertial confinement fusion (ICF) diagnostics and high energy density physics (HEDP) diagnostics used at the Laboratory for Laser Energetics (LLE). The ion source on this accelerator, a radio-frequency (RF) alkali-metal charge exchange source called an Alphatross, is designed to produce beams of hydrogen and helium isotopes. There is interest in accelerating beams of carbon, oxygen, argon, and other heavy ions for use in testing several diagnostics, including the Time Resolved Tandem Faraday Cup (TRTF). The feasibility of generating these heavy ion beams using the Alphatross source will be reported. Small amounts of various gases are mixed into the helium plasma in the ion source bottle. A velocity selector is used to allow the desired ions to pass into the accelerator. As the heavy ions pass through the stripper canal of the accelerator, they emerge in a variety of charge states. The energy of the ion beam at the high-energy end of the accelerator will vary as a function of the charge state; however, the maximum energy deliverable to target is limited by the maximum achievable magnetic field produced by the accelerator's steering magnet. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
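
    As a generic illustration of why the beam energy varies with charge state (standard tandem-accelerator kinematics, not a number taken from this abstract): a singly charged negative ion injected into a tandem with terminal voltage V_T and stripped to charge state q+ in the terminal emerges with

    $$E \approx (1+q)\,e\,V_\mathrm{T},$$

    so a hypothetical 3+ charge state from a 1.7 MV terminal corresponds to roughly 6.8 MeV; which of these charge states can actually be steered onto the target line is then limited by the magnetic rigidity, proportional to $\sqrt{2mE}/q$, that the steering magnet can bend.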

  4. IFMIF [International Fusion Materials Irradiation Facility], an accelerator-based neutron source for fusion components irradiation testing: Materials testing capabilities

    International Nuclear Information System (INIS)

    Mann, F.M.

    1988-08-01

    The International Fusion Materials Irradiation Facility (IFMIF) is proposed as an advanced accelerator-based neutron source for high-flux irradiation testing of large-sized fusion reactor components. The facility would require only small extensions to existing accelerator and target technology originally developed for the Fusion Materials Irradiation Test (FMIT) facility. At the extended facility, neutrons would be produced by a 0.1-A beam of 35-MeV deuterons incident upon a liquid lithium target. The volume available for high-flux (>10¹⁵ n/cm²·s) testing in IFMIF would be over a liter, a factor of about three larger than in the FMIT facility. This is because the effective beam current of 35-MeV deuterons on target can be increased by a factor of ten to 1 A or more. Such an increase can be accomplished by funneling beams of deuterium ions from the radio-frequency quadrupole into a linear accelerator and by taking advantage of recent developments in accelerator technology. Multiple beams and large total current allow great variety in available testing. For example, multiple simultaneous experiments and great flexibility in tailoring spatial distributions of flux and spectra can be achieved. 5 refs., 2 figs., 1 tab

  5. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior

    OpenAIRE

    Hagger, Martin; Chan, Dervin K. C.; Protogerou, Cleo; Chatzisarantis, Nikos L. D.

    2016-01-01

    Objective Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs fr...

  6. Low power microwave tests on RF gun prototype of the Iranian Light Source Facility

    Directory of Open Access Journals (Sweden)

    A Sadeghipanah

    2017-08-01

    Full Text Available In this paper, we introduce the RF electron gun of the Iranian Light Source Facility (ILSF) pre-injection system. The design, fabrication and low-power microwave test results of the prototype RF electron gun are described in detail. This paper also explains the tuning procedure of the prototype RF electron gun to the desired resonant frequency. The outcomes of this project pave the way for the fabrication of the RF electron gun by local industries.

  7. Test stand for magnetron H negative ion source at IPP-Nagoya

    Energy Technology Data Exchange (ETDEWEB)

    Okamura, H; Kuroda, T; Miyahara, A

    1981-02-01

    The test facility for the development of the magnetron H(-) ion source consists of the vacuum system, power supplies, diagnostic equipment, and their controlling electronics. Schematics are presented and relevant items are described, including sequence control, optical links, the charged pulse forming network, the extractor power supply, the magnet power supply, temperature control of the cesium feeder, and the pulsed valve driver. Noise problems and diagnostics are also considered.

  8. Treatment of liquid effluents from uranium analytical method 'DAVIES & GRAY' by electrodialysis and electrodialysis reactive tests

    International Nuclear Information System (INIS)

    Zuniga Alvear, Karina Andrea

    2014-01-01

    This work describes the process that produces liquid waste in the chemical analysis laboratory of the Chilean Nuclear Energy Commission (CCHEN) from the analytical technique known as 'Davies and Gray', and the further treatment of this waste using electrodialysis (ED) and reactive electrodialysis (RED) in order to achieve lower uranium contents in solution. Water contamination is a major problem, since there are many places in the world where water is scarce. For these reasons new treatments must be developed, and ion-selective membranes have opened a new path for these processes. Radioactive liquid wastes are subject to many additional restrictions on their final disposal, which makes water recovery even more difficult, because the regulations impose very strict safety margins on them. In the case of liquid waste containing uranium the concern increases: since uranium is a radioactive element, its content has to be reduced as far as possible, or eliminated outright, in order to avoid any kind of contamination. National regulations and international recommendations stipulate the correct management and disposal of radioactive waste, which can come from any uranium production process. In all such processes the liquid waste contains a certain uranium content, so at the end of the process the discarded waste must go through a conditioning and cleaning step before its release or recycling. In this study, electrodialysis was tested as a treatment for radioactive waste, specifically uranium-containing waste from the chemical analysis laboratory at CCHEN. The electrodialysis process competes directly with other separation processes, such as distillation, ion exchange and reverse osmosis, among others. Classic electrodialysis was developed during the 1950s, and since then different versions have appeared, such as inverse, reactive and reversible electrodialysis. The unidirectional and reactive electrodialysis will be the

  9. Development and tests of molybdenum armored copper components for MITICA ion source

    Science.gov (United States)

    Pavei, Mauro; Böswirth, Bernd; Greuner, Henri; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo

    2016-02-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results.

  10. Development and tests of molybdenum armored copper components for MITICA ion source

    International Nuclear Information System (INIS)

    Pavei, Mauro; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo; Böswirth, Bernd; Greuner, Henri

    2016-01-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results

  11. Development and tests of molybdenum armored copper components for MITICA ion source

    Energy Technology Data Exchange (ETDEWEB)

    Pavei, Mauro, E-mail: mauro.pavei@igi.cnr.it; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo [Consorzio RFX, Corso Stati Uniti, 4, I-35127 Padova (Italy); Böswirth, Bernd; Greuner, Henri [Max-Planck-Institut für Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-02-15

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the Megavolt ITER Injector and Concept Advancement (MITICA) beam source, a solution based on the explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on a copper substrate, compatible with ITER requirements. Prototypes have recently been manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS and the experimental results.

  12. Thermal hydraulic tests of a liquid hydrogen cold neutron source. NISTIR 5026

    International Nuclear Information System (INIS)

    Siegwarth, J.D.; Olson, D.A.; Lewis, M.A.; Rowe, J.M.; Williams, R.E.; Kopetka, P.

    1995-01-01

    The liquid hydrogen cold neutron source designed for the NBSR contains a neutron moderator chamber. The NIST-B electrically heated glass moderator chamber was used to test the NBSR chamber design; the testing showed the following results: stable operation is possible up to at least 2200 watts with two-phase flow; the LH₂ mass quickly reaches a new, stable value after a heat load change; the void fraction is well below 20% at the anticipated power and pressure; restart of the H₂ flow was verified after extending the supply line; and visual inspection showed no dryout or unexpected voids
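
    For reference, the void fraction referred to above is the vapour volume fraction of the two-phase hydrogen flow (reading the quoted figure of 20 as a percentage is an interpretation of the original record):

        \alpha = \frac{V_{\mathrm{vapour}}}{V_{\mathrm{vapour}} + V_{\mathrm{liquid}}}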

  13. Introduction to an open source internet-based testing program for medical student examinations.

    Science.gov (United States)

    Lee, Yoon-Hwan

    2009-12-20

    The author developed a freely available open source internet-based testing program for medical examinations. PHP and JavaScript were used as the programming languages and PostgreSQL as the database management system, on an Apache web server and a Linux operating system. The system approach was that a super user inputs the items, each school administrator inputs the examinees' information, and examinees access the system. The examinee's score is displayed immediately after the examination, together with an item analysis. The set-up of the system, beginning with installation, is described. This may help medical professors to easily adopt an internet-based testing system for medical education.
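
    The item analysis mentioned above typically reports, for each question, a difficulty index (the proportion of examinees answering correctly) and a discrimination index (the correlation between the item score and the rest of the test score). The following is a minimal, hypothetical Python sketch of such a computation; it is not part of the described PHP/PostgreSQL system.

        import numpy as np

        def item_analysis(responses):
            """responses: 2-D array (examinees x items) of 0/1 item scores."""
            responses = np.asarray(responses, dtype=float)
            totals = responses.sum(axis=1)              # total score per examinee
            difficulty = responses.mean(axis=0)         # proportion correct per item
            discrimination = []
            for j in range(responses.shape[1]):
                rest = totals - responses[:, j]         # total score excluding the item itself
                # point-biserial correlation between item score and rest score
                discrimination.append(np.corrcoef(responses[:, j], rest)[0, 1])
            return difficulty, np.array(discrimination)

        # Example: 5 examinees, 3 items (made-up scores)
        scores = [[1, 0, 1], [1, 1, 1], [0, 0, 1], [1, 0, 0], [1, 1, 1]]
        diff, disc = item_analysis(scores)
        print("difficulty:", diff)
        print("discrimination:", disc)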

  14. Fabrication and test of prototype ring magnets for the ALS [Advanced Light Source

    International Nuclear Information System (INIS)

    Tanabe, J.; Avery, R.; Caylor, R.; Green, M.I.; Hoyer, E.; Halbach, K.; Hernandez, S.; Humphries, D.; Kajiyama, Y.; Keller, R.; Low, W.; Marks, S.; Milburn, J.; Yee, D.

    1989-03-01

    Prototype Models for the Advanced Light Source (ALS) Booster Dipole, Quadrupole and Sextupole and the Storage Ring Gradient Magnet, Quadrupole and Sextupole have been constructed. The Booster Magnet Prototypes have been tested. The Storage Ring Magnets are presently undergoing tests and magnetic measurements. This paper reviews the designs and parameters for these magnets, briefly describes features of the magnet designs which respond to the special constraints imposed by the requirements for both accelerator rings, and reviews some of the results of magnet measurements for the prototype. 13 refs., 7 figs., 1 tab

  15. Facility for fast neutron irradiation tests of electronics at the ISIS spallation neutron source

    International Nuclear Information System (INIS)

    Andreani, C.; Pietropaolo, A.; Salsano, A.; Gorini, G.; Tardocchi, M.; Paccagnella, A.; Gerardin, S.; Frost, C. D.; Ansell, S.; Platt, S. P.

    2008-01-01

    The VESUVIO beam line at the ISIS spallation neutron source was set up for neutron irradiation tests in the neutron energy range above 10 MeV. Benchmark activation measurements showed that the beam provides a neutron spectrum similar to the ambient one at sea level, but with an intensity enhanced by a factor of 10⁷. Such conditions are suitable for accelerated testing of electronic components, as was demonstrated here by measurements of soft error rates in recent-technology field programmable gate arrays

  16. Introduction to an Open Source Internet-Based Testing Program for Medical Student Examinations

    Directory of Open Access Journals (Sweden)

    Yoon-Hwan Lee

    2009-12-01

    Full Text Available The author developed a freely available open source internet-based testing program for medical examinations. PHP and JavaScript were used as the programming languages and PostgreSQL as the database management system, on an Apache web server and a Linux operating system. The system approach was that a super user inputs the items, each school administrator inputs the examinees’ information, and examinees access the system. The examinee’s score is displayed immediately after the examination, together with an item analysis. The set-up of the system, beginning with installation, is described. This may help medical professors to easily adopt an internet-based testing system for medical education.

  17. Performance of the CERN plasma lens in laboratory and beam tests at the Antiproton Source

    International Nuclear Information System (INIS)

    Kowalewicz, R.; Lubrano di Scampamorte, M.; Milner, S.; Pedersen, F.; Riege, H.; Christiansen, J.; Frank, K.; Stetter, M.; Tkotz, R.; Boggasch, E.

    1991-01-01

    The CERN plasma lens is based on a dynamic z-pinch which, for 500 ns, creates a cylindrical plasma current conductor 290 mm long and 38 to 45 mm in diameter. The lens is designed for pulsed pinch currents of 400 kA and magnetic field gradients of 200 T/m, produced with stored energies of 56 kJ. Life tests of different lens components were carried out at a repetition rate of 4.8 s/pulse. The results of the first beam tests of the plasma lens at the CERN antiproton source are very encouraging in view of other potential plasma lens applications
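
    As a consistency check (not taken from the cited paper), the field gradient inside a uniformly distributed cylindrical current of radius a carrying total current I is

        \frac{\mathrm{d}B}{\mathrm{d}r} = \frac{\mu_0 I}{2\pi a^2}
        \approx \frac{(4\pi\times 10^{-7})\,(4\times 10^{5})}{2\pi\,(0.020)^2}
        \approx 200\ \mathrm{T\,m^{-1}},

    using I = 400 kA and a pinch radius of about 20 mm (the quoted 38 to 45 mm diameter), which reproduces the design gradient above; the uniform-current-density profile is of course an idealization.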

  18. Summary test results of the particle-beam diagnostics for the Advanced Photon Source (APS) subsystems

    International Nuclear Information System (INIS)

    Lumpkin, A.; Wang, X.; Sellyey, W.; Patterson, D.; Kahana, E.

    1994-01-01

    During the first half of 1994, a number of the diagnostic systems for measurement of the charged-particle beam parameters throughout the subsystems of the Advanced Photon Source (APS) were installed and tested. The particle beams will eventually involve 450-MeV to 7-GeV positrons with different pulse formats. The first test and commissioning results for beam profiles, beam position monitors, loss rate monitors, current monitors, and synchrotron radiation photon monitors have been obtained using 200- to 350-MeV electron beams injected into the subsystems. Data presented are principally from the transport lines and the positron accumulator ring

  19. Single-crate stand-alone CAMAC control system for a negative ion source test facility

    International Nuclear Information System (INIS)

    Juras, R.C.; Ziegler, N.F.

    1979-01-01

    A single-crate CAMAC system was configured to control a negative ion source development facility at ORNL and control software was written for the crate microcomputer. The software uses inputs from a touch panel and a shaft encoder to control the various operating parameters of the test facility and uses the touch panel to display the operating status. Communication to and from the equipment at ion source potential is accomplished over optical fibers from an ORNL-built CAMAC module. A receiver at ion source potential stores the transmitted data and some of these stored values are then used to control discrete parameters of the ion source (i.e., power supply on or off). Other stored values are sent to a multiplexed digital-to-analog converter to provide analog control signals. A transmitter at ion source potential transmits discrete status information and several channels of analog data from an analog-to-digital converter back to the ground-potential receiver where it is stored to be read and displayed by the software
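
    The control path described above (touch panel and shaft encoder at ground potential, setpoints sent over optical fibre, a receiver at ion source potential routing stored values to a multiplexed DAC or to discrete on/off outputs) can be pictured with the toy Python sketch below. The frame format, channel map and function names are hypothetical illustrations; the actual ORNL data format is not described in the record.

        import struct

        # Hypothetical frame: 1-byte channel address + 2-byte value (big-endian),
        # standing in for the words sent over the optical fibre link.
        def encode_setpoint(channel, value):
            return struct.pack(">BH", channel, value)

        def decode_setpoint(frame):
            channel, value = struct.unpack(">BH", frame)
            return channel, value

        # At ion-source potential: store received values, then route them either to a
        # multiplexed DAC channel (analog control) or to a discrete output (e.g. on/off).
        received = {}

        def handle_frame(frame, dac_channels=range(0, 8), discrete_channels=range(8, 16)):
            channel, value = decode_setpoint(frame)
            received[channel] = value
            if channel in dac_channels:
                pass  # write value to the multiplexed DAC here
            elif channel in discrete_channels:
                pass  # drive the discrete control line here (non-zero = on)

        handle_frame(encode_setpoint(3, 2500))   # e.g. set analog channel 3
        print(received)                          # {3: 2500}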

  20. Environmental assessment of general-purpose heat source safety verification testing

    International Nuclear Information System (INIS)

    1995-02-01

    This Environmental Assessment (EA) was prepared to identify and evaluate potential environmental, safety, and health impacts associated with the Proposed Action to test General-Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) assemblies at the Sandia National Laboratories (SNL) 10,000-Foot Sled Track Facility, Albuquerque, New Mexico. RTGs are used to provide a reliable source of electrical power on board some spacecraft when solar power is inadequate during long duration space missions. These units are designed to convert heat from the natural decay of radioisotope fuel into electrical power. Impact test data are required to support DOE's mission to provide radioisotope power systems to NASA and other user agencies. The proposed tests will expand the available safety database regarding RTG performance under postulated accident conditions. Direct observations and measurements of GPHS/RTG performance upon impact with hard, unyielding surfaces are required to verify model predictions and to ensure the continual evolution of the RTG designs that perform safely under varied accident environments. The Proposed Action is to conduct impact testing of RTG sections containing GPHS modules with simulated fuel. End-On and Side-On impact test series are planned

  1. Information Sources Influencing Soil Testing Innovation Adoption by Grape Farmers in the Khorramdarreh Township

    Directory of Open Access Journals (Sweden)

    Seyedeh Shirin Golbaz

    2015-08-01

    Full Text Available Testing soil is recognized to be an important practice for the sustainable use of nutrients, and it has been introduced to Iranian grape farmers as an innovation for over a decade. Its adoption and utilization may be influenced by receiving information from different sources. This study was performed to identify the information sources that may influence the adoption of the soil testing innovation by grape farmers. Using a survey, a sample of 260 out of 3942 grape farmers of the Khorramdarreh Township was selected by a stratified sampling technique, and data were collected through structured interviews using a questionnaire. The content and face validity of the questionnaire were discussed and reviewed by a panel of experts consisting of university staff and agricultural professionals. Its reliability was also assessed through a pilot study, and its main constructs were shown to be reliable using the Cronbach’s alpha test (values between 0.71 and 0.84). Less than half of the grape farmers conducted soil testing in their vineyards. A regression analysis showed that variables such as contact of the farmers with model grape producers, posters received, publications, listening to radio programs, and farmers’ education have a significant positive impact on the adoption of the soil testing innovation. Therefore, both interpersonal and mass media can positively influence farmers' adoption of this innovation.
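
    Cronbach's alpha, used above to assess the reliability of the questionnaire constructs, is computed from the item scores of a construct as α = k/(k−1) · (1 − Σ var(item)/var(total)). A minimal Python sketch with made-up responses (not the study's data) follows.

        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array (respondents x items) of Likert-type scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)       # variance of each item
            total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        # Example: 6 respondents answering a hypothetical 4-item construct
        data = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
                [2, 3, 2, 2], [4, 4, 5, 4], [3, 2, 3, 3]]
        print(round(cronbach_alpha(data), 2))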

  2. Safety quality classification test of the sealed neutron sources used in start-up neutron source rods for Qinshan Nuclear Power Plant

    International Nuclear Information System (INIS)

    Yao Chunbing; Guo Gang; Chao Jinglan; Duan Liming

    1992-01-01

    According to the regulations listed in GB4075, safety quality classification tests have been carried out for the neutron sources. The test items include temperature, external pressure, impact, vibration and puncture; two dummy sealed sources were used for each test item. The testing equipment used has been examined and verified as qualified by the measuring department recognized by the National Standard Bureau. The leak rate of each tested sample is measured by a UL-100 Helium Leak Detector (its minimum detectable leak rate is 1 × 10⁻¹⁰ Pa·m³·s⁻¹). Samples with a leak rate less than 1.33 × 10⁻⁸ Pa·m³·s⁻¹ are considered up to the standard. The test results show that the safety quality classification of the neutron sources has reached the class of GB/E66545, which exceeds the preset class

  3. Make-up of injector test stand (ITS-1) and preliminary results with Model-I ion source

    International Nuclear Information System (INIS)

    Matsuda, S.; Ito, T.; Kondo, U.; Ohara, Y.; Oga, T.; Shibata, T.; Shirakata, H.; Sugawara, T.; Tanaka, S.

    The constitution of the first injector test stand (ITS-1) in the Thermonuclear Division, JAERI, and the performance of the Model-I ion source are described. Heating a plasma by neutral beam injection is one of the promising means of plasma heating in thermonuclear fusion devices. The purpose of the test stand is to develop the ion sources used in such injection systems. The test stand was completed in February 1975 and is capable of testing ion sources up to 12 amps at 30 kV. A hydrogen ion beam of 5.5 amps at 25 kV was obtained with the Model-I ion source

  4. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions is examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed

  5. Beam Profile Measurement of 300 kV Ion Source Test Stand for 1 MV Electrostatic Accelerator

    International Nuclear Information System (INIS)

    Park, Sae-Hoon; Kim, Yu-Seok; Kim, Dae-Il; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2015-01-01

    In this paper, the RF ion source, the ion source test stand, and its test results are presented. The beam profile was measured downstream from the accelerating tube and at the beam dump by using a BPM and a wire scanner. The RF ion source of the test stand is verified by measuring the total beam current with a Faraday cup in the chamber. The KOMAC (KOrea Multi-purpose Accelerator Complex) has been developing a 300 kV ion source test stand for a 1 MV electrostatic accelerator. An ion source and accelerating tube will be installed in a high pressure vessel. The ion source in a high pressure vessel requires high reliability. To confirm the stable operation of the ion source, a test stand was proposed and developed. The ion source will be tested at the test stand to verify its long-term operation conditions. The test stand consists of a 300 kV high voltage terminal, a battery for the ion source power, a 60 Hz inverter, a 200 MHz RF power system, a 5 kV extraction power supply, a 300 kV accelerating tube, and a vacuum system. The beam profile monitor was installed downstream from the accelerating tube. A wire scanner and a Faraday cup were installed at the end of the chamber

  6. Beam Profile Measurement of 300 kV Ion Source Test Stand for 1 MV Electrostatic Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sae-Hoon; Kim, Yu-Seok [Dongguk University, Gyeonju (Korea, Republic of); Kim, Dae-Il; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Multipurpose Accelerator Complex, Gyeongju (Korea, Republic of)

    2015-10-15

    In this paper, the RF ion source, the ion source test stand, and its test results are presented. The beam profile was measured downstream from the accelerating tube and at the beam dump by using a BPM and a wire scanner. The RF ion source of the test stand is verified by measuring the total beam current with a Faraday cup in the chamber. The KOMAC (KOrea Multi-purpose Accelerator Complex) has been developing a 300 kV ion source test stand for a 1 MV electrostatic accelerator. An ion source and accelerating tube will be installed in a high pressure vessel. The ion source in a high pressure vessel requires high reliability. To confirm the stable operation of the ion source, a test stand was proposed and developed. The ion source will be tested at the test stand to verify its long-term operation conditions. The test stand consists of a 300 kV high voltage terminal, a battery for the ion source power, a 60 Hz inverter, a 200 MHz RF power system, a 5 kV extraction power supply, a 300 kV accelerating tube, and a vacuum system. The beam profile monitor was installed downstream from the accelerating tube. A wire scanner and a Faraday cup were installed at the end of the chamber.

  7. Hypothesis tests for the detection of constant speed radiation moving sources

    Energy Technology Data Exchange (ETDEWEB)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Sannie, Guillaume; Gameiro, Jordan; Normand, Stephane [CEA, LIST, Laboratoire Capteurs Architectures Electroniques, 99 Gif-sur-Yvette, (France); Mechin, Laurence [CNRS, UCBN, Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen, 4050 Caen, (France)

    2015-07-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single-channel and multichannel detection algorithms, which are inefficient at too low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on the empirically estimated mean and variance of the signals delivered by the different channels have shown significant gains in the trade-off between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that, in the four relevant configurations (a pedestrian source carrier and a vehicle source carrier, each under a high and a low count-rate radioactive background), the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter for signal-to-noise ratio variations between 2 and 0.8. (authors)
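
    As a deliberately simplified, single-detector illustration of exploiting the Poisson nature of the counting signal (this is not the multi-channel temporal-correlation test of the paper), one can compare the counts registered in a short time window with the expected background and raise an alarm when the Poisson tail probability falls below a chosen false-alarm level:

        from scipy.stats import poisson

        def poisson_alarm(counts, background_rate, window_s, false_alarm_prob=1e-3):
            """Return True if the observed counts are implausibly high for background alone."""
            expected = background_rate * window_s          # mean background counts in the window
            p_value = poisson.sf(counts - 1, expected)     # P(N >= counts | background only)
            return p_value < false_alarm_prob

        # Example (illustrative numbers): 30 cps background, 0.5 s window, 35 counts observed
        print(poisson_alarm(35, background_rate=30.0, window_s=0.5))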

  8. Report on the engineering test of the LBL 30 second neutral beam source for the MFTF-B project

    International Nuclear Information System (INIS)

    Vella, M.C.; Pincosy, P.A.; Hauck, C.A.; Pyle, R.V.

    1984-08-01

    Positive-ion-based neutral beam development in the US has centered on the long-pulse Advanced Positive Ion Source (APIS). APIS eventually focused on the development of 30-second sources for MFTF-B. The Engineering Test was part of competitive testing of the LBL and ORNL long pulse sources carried out for the MFTF-B Project. The test consisted of 500 beam shots with 80 kV, 30-second deuterium beams, and was carried out on the Neutral Beam Engineering Test Facility (NBETF). This report summarizes the results of LBL testing, in which the LBL APIS demonstrated that it would meet the requirements for the MFTF-B 30-second sources. In part as a result of this test, the LBL design was found to be suitable as the baseline for a Common Long Pulse Source design for MFTF-B, TFTR, and Doublet Upgrade

  9. Redshift anisotropy among test-particle sources inside a black hole

    International Nuclear Information System (INIS)

    Debney, G.

    1976-01-01

    An elementary (mass-normalized) model of observers and emitters of light in free-fall within a black hole's radius is investigated in terms of the redshift spectrum induced. All observers and emitters follow the same kinds of trajectories, radially inward and starting from rest at spatial infinity. The major results are concerned with demonstrating the types of redshifts possible in all directions on a typical observer's celestial sphere. These are simulated by considering all equatorial light paths inside and generalizing to three dimensions by symmetry. Under certain assumptions a direction for maximum redshift and one for minimum redshift are obtained; these lie on antipodal points on the observer's celestial sphere. No multiple imaging or focusing is possible from isotropic sources inside r = 2m, however. At this stage no luminosity distances or intensity results are developed; these more complicated relationships would be required to simulate the actual picture getting through to an observer. Some of the redshift results are applied to a black hole whose scale is cosmological. This extreme example is included mainly as a curiosity to illustrate the impact of a simple change of scale and to reemphasize the importance of the microwave isotropy to theoretical models. A careful analytical formulation of general relativistic redshifts as seen in local Lorentz frames provides the tools for this investigation. (author)

  10. Redshift anisotropy among test-particle sources inside a black hole

    Energy Technology Data Exchange (ETDEWEB)

    Debney, G [Virginia Polytechnic Inst. and State Univ., Blacksburg (USA)

    1976-09-01

    An elementary (mass-normalized) model of observers and emitters of light in free-fall within a black hole's radius is investigated in terms of the redshift spectrum induced. All observers and emitters follow the same kinds of trajectories, radially inward and starting from rest at spatial infinity. The major results are concerned with demonstrating the types of redshifts possible in all directions on a typical observer's celestial sphere. These are simulated by considering all equatorial light paths inside and generalizing to three dimensions by symmetry. Under certain assumptions a direction for maximum redshift and one for minimum redshift are obtained; these lie on antipodal points on the observer's celestial sphere. No multiple imaging or focusing is possible from isotropic sources inside r = 2m, however. At this stage no luminosity distances or intensity results are developed; these more complicated relationships would be required to simulate the actual picture getting through to an observer. Some of the redshift results are applied to a black hole whose scale is cosmological. This extreme example is included mainly as a curiosity to illustrate the impact of a simple change of scale and to reemphasize the importance of the microwave isotropy to theoretical models. A careful analytical formulation of general relativistic redshifts as seen in local Lorentz frames provides the tools for this investigation.

  11. Is conscious stimulus identification dependent on knowledge of the perceptual modality? Testing the "source misidentification hypothesis"

    DEFF Research Database (Denmark)

    Overgaard, Morten; Lindeløv, Jonas Kristoffer; Svejstrup, Stinna

    2013-01-01

    This paper reports an experiment intended to test a particular hypothesis derived from blindsight research, which we name the “source misidentification hypothesis.” According to this hypothesis, a subject may be correct about a stimulus without being correct about how she had access … to this knowledge (whether the stimulus was visual, auditory, or something else). We test this hypothesis in healthy subjects, asking them to report whether a masked stimulus was presented auditorily or visually, what the stimulus was, and how clearly they experienced the stimulus using the Perceptual Awareness … experience of the stimulus. To demonstrate that particular levels of reporting accuracy are obtained, we employ a statistical strategy, which operationally tests the hypothesis of non-equality, such that the usual rejection of the null-hypothesis admits the conclusion of equivalence …

  12. Reduction of sources of error and simplification of the Carbon-14 urea breath test

    International Nuclear Information System (INIS)

    Bellon, M.S.

    1997-01-01

    Full text: Carbon-14 urea breath testing is established in the diagnosis of H. pylori infection. The aim of this study was to investigate possible further simplification and to identify error sources in the ¹⁴C urea kit extensively used at the Royal Adelaide Hospital. Thirty-six patients with validated H. pylori status were tested with breath samples taken at 10, 15, and 20 min. Using the single sample value at 15 min, there was no change in the diagnostic category. Reduction of errors in analysis depends on attention to the following details: stability of the absorption solution (now > 2 months), compatibility of the scintillation cocktail/absorption solution (with particular regard to photoluminescence and chemiluminescence), reduction in chemical quenching (moisture reduction), understanding of the counting hardware and its relevance, and appropriate response to deviations in quality assurance. With this experience, we are confident of the performance and reliability of the RAPID-14 urea breath test kit now available commercially

  13. Vacuum tests of a beamline front-end mock-up at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Liu, C.; Nielsen, R.W.; Kruy, T.L.; Shu, D.; Kuzay, T.M.

    1994-01-01

    A mock-up has been constructed to test the functioning and performance of the Advanced Photon Source (APS) front ends. The mock-up consists of all components of the APS insertion-device beamline front end with a differential pumping system. Primary vacuum tests have been performed and compared with finite element vacuum calculations. Pressure distribution measurements using controlled leaks demonstrate better than four decades of pressure difference between the two ends of the mock-up. The measured pressure profiles are consistent with the results of finite element analyses of the system. The safety-control systems are also being tested. A closing time of ∼20 ms for the photon shutter and ∼7 ms for the fast closing valve has been obtained. Experiments on vacuum protection systems indicate that the front end is well protected in case of a vacuum breach

  14. Testing and intercomparison of model predictions of radionuclide migration from a hypothetical area source

    International Nuclear Information System (INIS)

    O'Brien, R.S.; Yu, C.; Zeevaert, T.; Olyslaegers, G.; Amado, V.; Setlow, L.W.; Waggitt, P.W.

    2008-01-01

    This work was carried out as part of the International Atomic Energy Agency's EMRAS program. One aim of the work was to develop scenarios for testing computer models designed for simulating radionuclide migration in the environment, and to use these scenarios for testing the models and comparing predictions from different models. This paper presents the results of the development and testing of a hypothetical area source of NORM waste/residue using two complex computer models and one screening model. There are significant differences in the methods used to model groundwater flow between the complex models. The hypothetical source was used because of its relative simplicity and because of difficulties encountered in finding comprehensive, well-validated data sets for real sites. The source consisted of a simple repository of uniform thickness, with 1 Bq g⁻¹ of uranium-238 (²³⁸U) (in secular equilibrium with its decay products) distributed uniformly throughout the waste. These approximate real situations, such as engineered repositories, waste rock piles, tailings piles and landfills. Specification of the site also included the physical layout, vertical stratigraphic details, soil type for each layer of material, precipitation and runoff details, groundwater flow parameters, and meteorological data. Calculations were carried out with and without a cover layer of clean soil above the waste, for people working and living at different locations relative to the waste. The predictions of the two complex models showed several differences which need more detailed examination. The scenario is available for testing by other modelers. It can also be used as a planning tool for remediation work or for repository design, by changing the scenario parameters and running the models for a range of different inputs. Further development will include applying models to real scenarios and integrating environmental impact assessment methods with the safety assessment tools currently

  15. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    International Nuclear Information System (INIS)

    Woracek, R.; Hofmann, T.; Bulat, M.; Sales, M.; Habicht, K.; Andersen, K.; Strobl, M.

    2016-01-01

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.
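
    For orientation (a generic relation, not a design detail from the paper), the wavelength band and resolution quoted above follow from the de Broglie/time-of-flight relation for neutrons with flight time t over a flight path L:

        \lambda = \frac{h\,t}{m_n L} \;\approx\; 3956\,\frac{t\,[\mathrm{s}]}{L\,[\mathrm{m}]}\ \text{\AA},
        \qquad \frac{\Delta\lambda}{\lambda} \approx \frac{\Delta t}{t},

    so a chopper system that defines the burst duration Δt, as the WFM choppers do, directly sets the relative wavelength resolution.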

  16. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    Energy Technology Data Exchange (ETDEWEB)

    Woracek, R., E-mail: robin.woracek@esss.se [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Hofmann, T.; Bulat, M. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Sales, M. [Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark); Habicht, K. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Andersen, K. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Strobl, M. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark)

    2016-12-11

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.

  17. Design and tests of a package for the transport of radioactive sources; Projeto e testes de uma embalagem para o transporte de fontes radioativas

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Paulo de Oliveira, E-mail: pos@cdtn.b [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-10-26

    The Type A package was designed for the transportation of seven cobalt-60 sources with a total activity of 1 GBq. The shield thickness needed to meet the dose rate and the transport index established by the radioactive transport regulation was calculated with the code MCNP (Monte Carlo N-Particle Transport Code Version 5). The sealed cobalt-60 sources were tested for leakage according to the standard ISO 9978:1992 (E). The package was tested according to the CNEN regulation for the transport of radioactive material. The leakage test results of the sources and the package tests demonstrate that the transport from the CDTN to the steelmaking industries can be safely performed

  18. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    Science.gov (United States)

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has since governed the transfer of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and of the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced into our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. It is conducted so as to control the most important consumer's risks involved, at two levels, in the analytical decisions made within transfer studies: the risk, for the receiving laboratory, of making poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite its insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process established within our company for better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in the transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
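
    Equivalence-based comparative testing of a receiving laboratory against a sending laboratory is commonly framed as two one-sided tests (TOST) on the difference of means against a pre-set acceptance limit ±δ. The Python sketch below illustrates a generic TOST on two sets of assay results; the acceptance limit, significance level and data are illustrative assumptions, not the acceptance criteria of the cited methodology.

        import numpy as np
        from scipy import stats

        def tost_equivalence(x, y, delta, alpha=0.05):
            """Two one-sided t-tests: conclude equivalence if the mean difference
            (x - y) lies within +/- delta at the given alpha level."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            nx, ny = len(x), len(y)
            diff = x.mean() - y.mean()
            sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
            se = np.sqrt(sp2 * (1 / nx + 1 / ny))
            df = nx + ny - 2
            p_lower = 1 - stats.t.cdf((diff + delta) / se, df)   # H0: diff <= -delta
            p_upper = stats.t.cdf((diff - delta) / se, df)       # H0: diff >= +delta
            return max(p_lower, p_upper) < alpha, diff

        # Illustrative assay results (% of label claim) from sending and receiving labs
        sending   = [99.8, 100.2, 99.6, 100.1, 99.9, 100.0]
        receiving = [99.5, 100.0, 99.7, 99.8, 100.1, 99.6]
        print(tost_equivalence(sending, receiving, delta=2.0))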

  19. Testing a path-analytic mediation model of how motivational enhancement physiotherapy improves physical functioning in pain patients.

    Science.gov (United States)

    Cheing, Gladys; Vong, Sinfia; Chan, Fong; Ditchman, Nicole; Brooks, Jessica; Chan, Chetwyn

    2014-12-01

    Pain is a complex phenomenon not easily discerned from psychological, social, and environmental characteristics and is an oft-cited barrier to return to work for people experiencing low back pain (LBP). The purpose of this study was to evaluate a path-analytic mediation model to examine how motivational enhancement physiotherapy, which incorporates tenets of motivational interviewing, improves the physical functioning of patients with chronic LBP. Seventy-six patients with chronic LBP were recruited from the outpatient physiotherapy department of a government hospital in Hong Kong. The re-specified path-analytic model fit the data very well, χ²(3, N = 76) = 3.86, p = .57; comparative fit index = 1.00; and root mean square error of approximation = 0.00. Specifically, results indicated that (a) using motivational interviewing techniques in physiotherapy was associated with increased working alliance with patients, (b) working alliance increased patients' outcome expectancy, and (c) greater outcome expectancy resulted in a reduction of subjective pain intensity and improvement in physical functioning. Change in pain intensity also directly influenced improvement in physical functioning. The effect of motivational enhancement therapy on physical functioning can be explained by social-cognitive factors such as motivation, outcome expectancy, and working alliance. The use of motivational interviewing techniques to increase the outcome expectancy of patients and improve working alliance could further strengthen the impact of physiotherapy on the rehabilitation outcomes of patients with chronic LBP.
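
    Generically, a mediation chain of the kind described (intervention → working alliance → outcome expectancy → functioning) quantifies an indirect effect as the product of the path coefficients along the chain. The Python sketch below estimates a single-mediator indirect effect with ordinary least squares on simulated data; it is illustrative only and does not reproduce the study's model, variables or data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        x = rng.normal(size=n)                      # e.g. motivational-interviewing condition
        m = 0.5 * x + rng.normal(size=n)            # e.g. working alliance (mediator)
        y = 0.4 * m + 0.1 * x + rng.normal(size=n)  # e.g. physical functioning (outcome)

        a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                        # x -> m
        b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # m -> y, controlling x
        print("indirect effect a*b ≈", round(a * b, 3))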

  20. Ground Deformation and Sources geometry of the 2016 Central Italy Earthquake Sequence Investigated through Analytical and Numerical Modeling of DInSAR Measurements and Structural-Geological Data

    Science.gov (United States)

    Solaro, G.; Bonano, M.; Boncio, P.; Brozzetti, F.; Castaldo, R.; Casu, F.; Cirillo, D.; Cheloni, D.; De Luca, C.; De Nardis, R.; De Novellis, V.; Ferrarini, F.; Lanari, R.; Lavecchia, G.; Manunta, M.; Manzo, M.; Pepe, A.; Pepe, S.; Tizzani, P.; Zinno, I.

    2017-12-01

    The 2016 Central Italy seismic sequence started on 24th August with a MW 6.1 event, when the intra-Apennine WSW-dipping Vettore-Gorzano extensional fault system released a destructive earthquake, causing 300 casualties and extensive damage to the town of Amatrice and its surroundings. We generated several interferograms by using ALOS and Sentinel 1-A and B constellation data acquired on both ascending and descending orbits to show that most of the displacement is characterized by two main subsiding lobes of about 20 cm on the fault hanging wall. By inverting the generated interferograms, following the Okada analytical approach, the modelling results account for two sources, related to the main shock and to the most energetic aftershock. Through Finite Element numerical modelling that jointly exploits DInSAR deformation measurements and structural-geological data, we reconstruct the 3D source of the 2016 Amatrice normal fault earthquake, which fits the main shock well. The inversion shows that the co-seismic displacement area was partitioned onto two distinct en echelon fault planes, which at the main-event hypocentral depth (8 km) merge into a single WSW-dipping surface. Slip peaks were higher along the southern half of the Vettore fault, lower along the northern half of the Gorzano fault, and null in the relay zone between the two faults; field evidence of co-seismic surface rupture is consistent with the reconstructed scenario. The following seismic sequence was characterized by numerous aftershocks located southeast and northwest of the epicenter, which decreased in frequency and magnitude until the end of October, when an MW 5.9 event occurred on 26th October about 25 km to the NW of the previous mainshock. Then, on 30th October, a third large event of magnitude MW 6.5 nucleated below the town of Norcia, striking the area between the two preceding events and filling the gap between the previous ruptures. Also in this case, we exploit a large dataset of DInSAR and GPS measurements to investigate

  1. Safety and quality of food contact materials. Part 1: evaluation of analytical strategies to introduce migration testing into good manufacturing practice.

    Science.gov (United States)

    Feigenbaum, A; Scholler, D; Bouquant, J; Brigot, G; Ferrier, D; Franzl, R; Lillemarktt, L; Riquet, A M; Petersen, J H; van Lierop, B; Yagoubi, N

    2002-02-01

    The results of a research project (EU AIR Research Programme CT94-1025) aimed at introducing the control of migration into good manufacturing practice and into enforcement work are reported. Representative polymer classes were defined on the basis of chemical structure, technological function, migration behaviour and market share. These classes were characterized by analytical methods. Analytical techniques were investigated for the identification of potential migrants. High-temperature gas chromatography was shown to be a powerful method, and ¹H magnetic resonance provided a convenient fingerprint of plastic materials. Volatile compounds were characterized by headspace techniques, where it was shown to be essential to differentiate volatile compounds desorbed from those generated during the thermal desorption itself. For metal trace analysis, microwave mineralization followed by atomic absorption was employed. These different techniques were introduced into a systematic testing scheme that is envisaged as being suitable both for industrial control and for enforcement laboratories. Guidelines will be proposed in the second part of this paper.

  2. Modding a free and open source software video game: "Play testing is hard work"

    Directory of Open Access Journals (Sweden)

    Giacomo Poderi

    2014-03-01

    Full Text Available Video game modding is a form of fan productivity in contemporary participatory culture. We see modding as an important way in which modders experience and conceptualize their work. By focusing on modding in a free and open source software video game, we analyze the practice of modding and the way it changes modders' relationship with their object of interest. The modders' involvement is not always associated with fun and creativity. Indeed, activities such as play testing often undermine these dimensions of modding. We present a case study of modding that is based on ethnographic research done for The Battle for Wesnoth, a free and open source software strategy video game entirely developed by a community of volunteers.

  3. General-purpose heat source safety verification test series: SVT-11 through SVT-13

    International Nuclear Information System (INIS)

    George, T.G.; Pavone, D.

    1986-05-01

    The General-Purpose Heat Source (GPHS) is a modular component of the radioisotope thermoelectric generator that will provide power for the Galileo and Ulysses (formerly ISPM) space missions. The GPHS provides power by transmitting the heat of ²³⁸Pu α-decay to an array of thermoelectric elements. Because the possibility of an orbital abort always exists, the heat source was designed and constructed to minimize plutonia release in any accident environment. The Safety Verification Test (SVT) series was formulated to evaluate the effectiveness of GPHS plutonia containment after atmospheric reentry and Earth impact. The first two reports (covering SVT-1 through SVT-10) described the results of flat, side-on, and angular module impacts against steel targets at 54 m/s. This report describes flat-on module impacts against concrete and granite targets, at velocities equivalent to or higher than previous SVTs

  4. Campylobacter species in animal, food, and environmental sources, and relevant testing programs in Canada.

    Science.gov (United States)

    Huang, Hongsheng; Brooks, Brian W; Lowman, Ruff; Carrillo, Catherine D

    2015-10-01

    Campylobacter species, particularly thermophilic campylobacters, have emerged as a leading cause of human foodborne gastroenteritis worldwide, with Campylobacter jejuni, Campylobacter coli, and Campylobacter lari responsible for the majority of human infections. Although most cases of campylobacteriosis are self-limiting, campylobacteriosis represents a significant public health burden. Human illness caused by infection with campylobacters has been reported across Canada since the early 1970s. Many studies have shown that dietary sources, including food, particularly raw poultry and other meat products, raw milk, and contaminated water, have contributed to outbreaks of campylobacteriosis in Canada. Campylobacter spp. have also been detected in a wide range of animal and environmental sources, including water, in Canada. The purpose of this article is to review (i) the prevalence of Campylobacter spp. in animals, food, and the environment, and (ii) the relevant testing programs in Canada with a focus on the potential links between campylobacters and human health in Canada.

  5. Assessing the Impact of Testing Aids on Post-Secondary Student Performance: A Meta-Analytic Investigation

    Science.gov (United States)

    Larwin, Karen H.; Gorman, Jennifer; Larwin, David A.

    2013-01-01

    Testing aids, including student-prepared testing aids (a.k.a., cheat sheets or crib notes) and open-textbook exams, are common practice in post-secondary assessment. There is a considerable amount of published research that discusses and investigates the impact of these testing aids. However, the findings of this research are contradictory and…

  6. 10 CFR 34.67 - Records of leak testing of sealed sources and devices containing depleted uranium.

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), Nuclear Regulatory Commission, Licenses for Industrial… Requirements, § 34.67 Records of leak testing of sealed sources and devices containing depleted uranium. Each…

  7. Various quantum nonlocality tests with a commercial two-photon entanglement source

    International Nuclear Information System (INIS)

    Pomarico, Enrico; Bancal, Jean-Daniel; Sanguinetti, Bruno; Rochdi, Anas; Gisin, Nicolas

    2011-01-01

    Nonlocality is a fascinating and counterintuitive aspect of nature, revealed by the violation of a Bell inequality. The standard and easiest configuration in which Bell inequalities can be measured has been proposed by Clauser-Horne-Shimony-Holt (CHSH). However, alternative nonlocality tests can also be carried out. In particular, Bell inequalities requiring multiple measurement settings can provide deeper fundamental insights about quantum nonlocality, as well as offering advantages in the presence of noise and detection inefficiency. In this paper we show how these nonlocality tests can be performed using a commercially available source of entangled photon pairs. We report the violation of a series of these nonlocality tests (I₃₃₂₂, I₄₄₂₂, and chained inequalities). With the violation of the chained inequality with 4 settings per side we put an upper limit at 0.49 on the local content of the states prepared by the source (instead of 0.63 attainable with CHSH). We also quantify the amount of true randomness that has been created during our experiment (assuming fair sampling of the detected events).
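
    For reference, the standard CHSH quantity mentioned above combines correlators for two measurement settings per side,

        S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad
        |S| \le 2\ \text{(local models)}, \quad |S| \le 2\sqrt{2}\ \text{(quantum states)},

    while I₃₃₂₂ and I₄₄₂₂ denote the analogous inequalities with three and four settings per side, respectively.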

  8. Conceptual and analytical modeling of fracture zone aquifers in hard rock. Implications of pumping tests in the Pohjukansalo well field, east-central Finland

    International Nuclear Information System (INIS)

    Leveinen, J.

    2001-01-01

    Fracture zones with an interconnected network of open fractures can conduct significant groundwater flow and, as in the case of the Pohjukansalo well field in Leppaevirta, can yield enough water for small-scale municipal supply. Glaciofluvial deposits comprising major aquifers commonly overlie fracture zones that can contribute to the water balance directly or indirectly by providing hydraulic interconnections between different formations. Fracture zones and fractures can also transport contaminants in a poorly predictable way. Consequently, hydrogeological research on fracture zones is important for the management and protection of soil aquifers in Finland. Hydraulic properties of aquifers are estimated in situ by well test analyses based on analytical models. Most analytical models rely on the concepts of radial flow and a horizontal slab aquifer. In Paper 1, pump test responses of fracture zones in the Pohjukansalo well field were characterised on the basis of alternative analytical models developed for channelled flow cases. In Paper 2, the tests were analysed using the generalised radial flow (GRF) model and the concept of a fracture network possessing a fractional flow dimension due to limited connectivity compared with ideal 2- or 3-dimensional systems. The analysis provides estimates of hydraulic properties in terms of parameters that do not have a concrete meaning when the flow dimension of the aquifer takes fractional values. Concrete estimates of hydraulic parameters were produced by making simplified assumptions and by using the composite model developed in Paper 3. In addition to estimates of hydraulic parameters, the analysis of hydraulic tests provides qualitative information that is useful when the hydraulic connections in the fracture system are not well known. However, attention should be paid to the frequency of drawdown measurements, particularly for the application of derivative curves. In groundwater studies, analytical models have also been used to estimate
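
    For orientation, the classical two-dimensional radial-flow model that the GRF approach generalizes is the Theis solution for the drawdown s at radial distance r and time t,

        s(r,t) = \frac{Q}{4\pi T}\,W(u), \qquad
        u = \frac{r^2 S}{4 T t}, \qquad
        W(u) = \int_u^\infty \frac{e^{-x}}{x}\,\mathrm{d}x,

    with pumping rate Q, transmissivity T and storativity S; the GRF model (introduced by Barker) replaces the fixed flow dimension of 2 with a possibly fractional dimension, which is why its fitted parameters lose this simple physical interpretation (the GRF expression itself is not reproduced here).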

  9. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
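
    Meta-analytic path analysis of the kind described fits the structural model to a pooled (meta-analytically synthesized) correlation matrix rather than to raw data. The Python sketch below shows the core step for a single structural equation: computing standardized path coefficients from a correlation matrix. The constructs and correlation values are hypothetical and are not the meta-analytic estimates of the cited papers.

        import numpy as np

        # Hypothetical pooled correlations among attitude (A), subjective norm (SN),
        # perceived behavioural control (PBC) and intention (I), in that order.
        R = np.array([[1.00, 0.35, 0.30, 0.50],
                      [0.35, 1.00, 0.25, 0.40],
                      [0.30, 0.25, 1.00, 0.45],
                      [0.50, 0.40, 0.45, 1.00]])

        predictors = [0, 1, 2]   # A, SN, PBC
        outcome = 3              # intention

        Rxx = R[np.ix_(predictors, predictors)]   # predictor intercorrelations
        rxy = R[np.ix_(predictors, [outcome])]    # predictor-outcome correlations
        beta = np.linalg.solve(Rxx, rxy)          # standardized path coefficients
        r2 = (rxy.T @ beta).item()                # variance explained in intention
        print(beta.ravel(), r2)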

  10. Partial discharge tests and characterisation of the Advanced Photon Source linac modulator cables

    International Nuclear Information System (INIS)

    Cours, A.

    2007-01-01

    The Advanced Photon Source (APS) linac modulators are PFN-type pulsers with switch-mode charging power supplies (PSs). The PS and the PFN are connected to each other by 15 feet of 100-kV x-ray cable, with the PFN end of the cable terminated with a connector that was confirmed partial-discharge (PD)-free up to 38 kV ac (53.5 kV peak). The other end of the cable is terminated with a connector that was designed by the PS manufacturer and cannot easily be replaced with another type of connector, since part of it is located inside the densely packed PS. PD tests of the cables with this type of connector show that the PD inception voltages (PDIVs) of different cables fall within a wide voltage range, 21 to 27 kV ac, corresponding to 29 to 38 kV peak. In order to evaluate the insulation condition of the modulator cables, detect insulation deterioration, and ensure failure-preventing equipment maintenance, over the last two years the PDIVs of all high-voltage (HV) cables in use in the modulators have been tested about every three and a half months. Before the tests, all cables were removed from the equipment, carefully cleaned, inspected, and regreased. The tests were performed using a 40-kV PD detector. The test results show that: (1) the PDIVs remain almost unchanged in all tested cables; (2) from test to test, the PDIV of any particular cable may oscillate slightly around some average value, possibly depending on the connector regreasing technique; and (3) there is no direct evidence of cable insulation deterioration during more than two years of operation under voltage higher than the PD inception level.

  11. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  12. The x-ray source application test cassette for radiation exposures at the OMEGA laser

    Energy Technology Data Exchange (ETDEWEB)

    Fournier, K. B.; Rekow, V.; Emig, J. [Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Fisher, J. H.; Newlander, C. D. [Fifth Gait Technologies, Inc., Huntsville, Alabama 35803 (United States); Horton, R. [Gray Research, Inc., Huntsville, Alabama 35806 (United States); Davis, J. [Defense Threat Reduction Agency, Fort Belvoir, Virginia 22060 (United States)

    2012-10-15

    We have designed a sample cassette that can be used to position up to six samples in the OMEGA laser chamber. The cassette accommodates round samples up to 38.1 mm (1.5″) in diameter and square samples up to 27 mm on a side, any of which can be up to 12.7 mm thick. Smaller specimens are centered with spacers. The test cassette allows each sample to have a unique filter scheme, with multiple filter regions in front of each sample. This paper will present mechanical design considerations and operational aspects of the x-ray source application cassette.

  13. Progress toward the development and testing of source reconstruction methods for NIF neutron imaging.

    Science.gov (United States)

    Loomis, E N; Grim, G P; Wilde, C; Wilson, D C; Morgan, G; Wilke, M; Tregillis, I; Merrill, F; Clark, D; Finch, J; Fittinghoff, D; Bower, D

    2010-10-01

    Development of analysis techniques for neutron imaging at the National Ignition Facility is an important and difficult task for the detailed understanding of high-neutron-yield inertial confinement fusion implosions. Once developed, these methods must provide accurate images of the hot and cold fuels so that information about the implosion, such as symmetry and areal density, can be extracted. One method under development involves the numerical inversion of the pinhole image using knowledge of neutron transport through the pinhole aperture from Monte Carlo simulations. In this article we present results of source reconstructions based on simulated images that test the method's effectiveness with regard to pinhole misalignment.
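
    One common way to pose such a numerical inversion is regularized least squares: with a forward matrix A (which in the described method would come from the Monte Carlo simulations of neutron transport through the aperture) mapping the source distribution x to the recorded image b, a Tikhonov-regularized solution minimizes ||Ax − b||² + λ||x||². The Python sketch below is a generic illustration with a random forward matrix, not the NIF analysis code.

        import numpy as np

        def tikhonov_solve(A, b, lam):
            """Solve min ||A x - b||^2 + lam ||x||^2 via an augmented least-squares system."""
            n = A.shape[1]
            A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
            b_aug = np.concatenate([b, np.zeros(n)])
            x, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
            return x

        # Toy example: 1-D "source" blurred by a random forward matrix, recovered with regularization
        rng = np.random.default_rng(1)
        A = rng.random((80, 40))
        x_true = np.zeros(40)
        x_true[15:25] = 1.0
        b = A @ x_true + 0.05 * rng.normal(size=80)
        x_rec = tikhonov_solve(A, b, lam=1.0)
        print(np.round(x_rec[10:30], 2))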

  14. Compact X-ray source at STF (Super Conducting Accelerator Test Facility)

    International Nuclear Information System (INIS)

    Urakawa, J

    2012-01-01

    KEK-STF is a superconducting linear accelerator test facility for developing accelerator technologies for the ILC (International Linear Collider). Our development of advanced accelerator technologies using STF is supported by the Japanese ministry MEXT for the purpose of building a compact, high-brightness X-ray source. Since we are required to demonstrate the generation of high-brightness X-rays based on inverse Compton scattering, using superconducting linear accelerator and laser storage cavity technologies, by October of next year (2012), the design has been fixed and the installation of accelerator components is under way. The necessary technology developments and the planned experiment are explained.
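
    For a rough sense of scale (illustrative numbers, not the STF design values): for a head-on collision between a laser photon of energy E_L and an electron with Lorentz factor γ, and provided 4γE_L ≪ m_ec², the maximum inverse-Compton-scattered photon energy is approximately

        E_{x,\max} \approx 4\,\gamma^2 E_L,

    so, for example, a 1.2 eV (≈1 µm) laser photon scattered off a 40 MeV electron beam (γ ≈ 78) yields X-rays of roughly 4 × 78² × 1.2 eV ≈ 29 keV.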

  15. Development of quality assurance programme for prescribed ionizing radiation source testing. Recommendations

    International Nuclear Information System (INIS)

    1999-01-01

    The document gives guidance to those applying for a licence to perform ionizing radiation source acceptance tests and long-term stability tests, and provides information which should be known when introducing quality assurance systems in compliance with legislative requirements. It is envisaged that this document ('Recommendations') will form a basis for final Safety Guides to be issued by the State Office for Nuclear Safety, the Czech nuclear regulatory authority. The publication is structured as follows. Part 1 gives a glossary of basic terms in quality systems. Part 2 explains quality system principles, paying special attention to radiation safety issues, and describes the structure and scope of quality system documentation. Part 3 explains the individual elements of the quality system and gives practical examples. Part 4 deals with the quality assurance programme; using instructions and practical examples, this part shows how the quality system elements should be applied to long-term stability testing and acceptance testing. A model structure of 2nd degree documentation (guidelines) and a model testing protocol are given in annexes. (P.A.)

  16. The HAW-Project: Test disposal of highly radioactive radiation sources in the Asse salt mine

    International Nuclear Information System (INIS)

    1992-04-01

    Two electrical heater tests were already started in November 1988 and are continuously monitored with respect to the thermomechanical and geochemical response of the rock mass. The handling system necessary for the emplacement of 30 radioactive canisters (Sr-90 and Cs-137 sources) was also developed and successfully tested. This system consists of six multiple transport and storage casks of the type Castor-GSF-5, two above ground/below ground shuttle transport casks of the type Asse TB1, an above ground transfer station, an underground transport vehicle, a disposal machine, and a borehole slider. A laboratory investigation program on radiation effects in salt is being performed in advance of the radioactive canister emplacement. This program includes the investigation of thermally and radiolytically induced water and gas release from the rock salt and the radiolytical decomposition of salt minerals. For gamma dose and dose rate measurements in the test field, measuring systems consisting of ionisation chambers as well as solid state dosemeters were developed and tested. Thermomechanical computer code validation is performed by calculational predictions and parallel investigation of the stress and displacement fields in the underground test field. (orig./HP)

  17. Marketing the HIV test to MSM: ethnic differences in preferred venues and sources.

    Science.gov (United States)

    Lechuga, Julia; Owczarzak, Jill T; Petroll, Andrew E

    2013-05-01

    Lack of awareness of HIV status is associated with an increased likelihood of HIV transmission. We surveyed 633 men who have sex with men (MSM) from diverse ethnic groups recruited from a variety of community venues in a U.S. Midwestern city with rising HIV infection rates. Our first aim was to describe patterns of sexual risk, annual HIV testing frequency, and venues where information about HIV and HIV testing could be disseminated to inner-city MSM. Our second aim was to identify preferred sources to receive information about HIV testing and determine whether these preferences differed by ethnic background. Results indicated that despite similar proportions of high-risk sexual behaviors, compared with African American and Latino MSM, smaller proportions of non-Hispanic White MSM had received an HIV test in the last 12 months. Despite ethnic differences in health care access, a physician's office was the most common HIV testing site. Overall, a majority conveyed a preference to see advertisements in mainstream media outlets. However, when preferences were stratified by ethnicity, African American MSM were the least likely to prefer receiving information from mainstream media and conveyed a stronger preference to receive information from authority figures than non-Hispanic White and Hispanic MSM.

  18. The feasibility of 10 keV X-ray as radiation source in total dose response radiation test

    International Nuclear Information System (INIS)

    Li Ruoyu; Li Bin; Luo Hongwei; Shi Qian

    2005-01-01

    The standard radiation source utilized in traditional total dose response radiation tests is 60Co, which poses environmental hazards. X-rays, as a new radiation source, have advantages such as safety, precise control of dose rate, strong intensity, and the possibility of wafer-level or even on-line testing, which greatly reduces the costs of packaging, testing, and transportation. This paper discusses the feasibility of X-rays replacing 60Co as the radiation source, based on the radiation mechanism and the effects of radiation on gate oxide. (authors)

  19. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  20. A test of unification towards the radio source PKS1413+135

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, M.C., E-mail: up200802537@fc.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Ciências, Universidade do Porto, Rua do Campo Alegre, 4150-007 Porto (Portugal); Julião, M.D., E-mail: meinf12013@fe.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Engenharia, Universidade do Porto, Rua Dr Roberto Frias, 4200-465 Porto (Portugal); Martins, C.J.A.P., E-mail: Carlos.Martins@astro.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Monteiro, A.M.R.V.L., E-mail: mmonteiro@fc.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Ciências, Universidade do Porto, Rua do Campo Alegre, 4150-007 Porto (Portugal); Department of Applied Physics, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands)

    2013-07-09

    We point out that existing astrophysical measurements of combinations of the fine-structure constant α, the proton-to-electron mass ratio μ and the proton gyromagnetic ratio g_p towards the radio source PKS1413+135 can be used to individually constrain each of these fundamental couplings. While the accuracy of the available measurements is not yet sufficient to test the spatial dipole scenario, our analysis serves as a proof of concept as new observational facilities will soon allow significantly more robust tests. Moreover, these measurements can also be used to obtain constraints on certain classes of unification scenarios, and we compare the constraints obtained for PKS1413+135 with those previously obtained from local atomic clock measurements.

  1. The test beamline of the European Spallation Source - Instrumentation development and wavelength frame multiplication

    DEFF Research Database (Denmark)

    Woracek, R.; Hofmann, T.; Bulat, M.

    2016-01-01

    which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor... wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components... This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects...

  2. Radiation Tolerance Qualification Tests of the Final Source Interface Unit for the ALICE Experiment

    CERN Document Server

    Dénes, E; Futó, E; Kerék, A; Kiss, T; Molnár, J; Novák, D; Soós, C; Tölyhi, T; Van de Vyvre, P

    2007-01-01

    The ALICE Detector Data Link (DDL) is a high-speed optical link designed to interface the readout electronics of ALICE sub-detectors to the DAQ computers. The Source Interface Unit (SIU) of the DDL will operate in radiation environment. Previous tests showed that a configuration loss of SRAM-based FPGA devices may happen and the frequency of undetected data errors in the FPGA user memory area is also not acceptable. Therefore, we redesigned the SIU card using another FPGA based on flash technology. In order to detect bit errors in the user memory we added parity check logic to the design. The new SIU has been extensively tested using neutron and proton irradiation to verify its radiation tolerance. In this paper we summarize the design changes, introduce the final design, and the results of the radiation tolerance measurements on the final card.

  3. A test of unification towards the radio source PKS1413+135

    International Nuclear Information System (INIS)

    Ferreira, M.C.; Julião, M.D.; Martins, C.J.A.P.; Monteiro, A.M.R.V.L.

    2013-01-01

    We point out that existing astrophysical measurements of combinations of the fine-structure constant α, the proton-to-electron mass ratio μ and the proton gyromagnetic ratio g_p towards the radio source PKS1413+135 can be used to individually constrain each of these fundamental couplings. While the accuracy of the available measurements is not yet sufficient to test the spatial dipole scenario, our analysis serves as a proof of concept as new observational facilities will soon allow significantly more robust tests. Moreover, these measurements can also be used to obtain constraints on certain classes of unification scenarios, and we compare the constraints obtained for PKS1413+135 with those previously obtained from local atomic clock measurements.

  4. Characterizing the Performance of the Princeton Advanced Test Stand Ion Source

    Science.gov (United States)

    Stepanov, A.; Gilson, E. P.; Grisham, L.; Kaganovich, I.; Davidson, R. C.

    2012-10-01

    The Princeton Advanced Test Stand (PATS) is a compact experimental facility for studying the physics of intense beam-plasma interactions relevant to the Neutralized Drift Compression Experiment - II (NDCX-II). The PATS facility consists of a multicusp RF ion source mounted on a 2 m-long vacuum chamber with numerous ports for diagnostic access. Ar+ beams are extracted from the source plasma with three-electrode (accel-decel) extraction optics. The RF power and extraction voltage (30 - 100 kV) are pulsed to produce 100 μs duration beams at 0.5 Hz with excellent shot-to-shot repeatability. Diagnostics include Faraday cups, a double-slit emittance scanner, and scintillator imaging. This work reports measurements of beam parameters for a range of beam energies (30 - 50 keV) and currents to characterize the behavior of the ion source and extraction optics. Emittance scanner data are used to calculate the beam trace-space distribution and corresponding transverse emittance. When the plasma density changes during a beam pulse, time-resolved emittance scanner data are taken to study the corresponding evolution of the beam trace-space distribution.

  5. A simplified model of the source channel of the Leksell GammaKnife tested with PENELOPE.

    Science.gov (United States)

    Al-Dweri, Feras M O; Lallena, Antonio M; Vilches, Manuel

    2004-06-21

    Monte Carlo simulations using the code PENELOPE have been performed to test a simplified model of the source channel geometry of the Leksell GammaKnife. The characteristics of the radiation passing through the treatment helmets are analysed in detail. We have found that only primary particles emitted from the source with polar angles smaller than 3 degrees with respect to the beam axis are relevant for the dosimetry of the Gamma Knife. The photon trajectories reaching the output helmet collimators at (x, y, z = 236 mm) show strong correlations between ρ = (x² + y²)^(1/2) and their polar angle θ, on one side, and between tan⁻¹(y/x) and their azimuthal angle φ, on the other. This enables us to propose a simplified model which treats the full source channel as a mathematical collimator. This simplified model produces doses in good agreement with those found for the full geometry. In the region of maximal dose, the relative differences between both calculations are within 3% for the 18 and 14 mm helmets, and 10% for the 8 and 4 mm ones. Besides, the simplified model permits a strong reduction (larger than a factor of 15) in the computational time.

  6. Lifetime test on a high-performance dc microwave proton source

    International Nuclear Information System (INIS)

    Sherman, J.D.; Hodgkins, D.J.; Lara, P.D.; Schneider, J.D.; Stevens, R.R. Jr.

    1995-01-01

    Powerful CW proton linear accelerators (100 mA at 0.5-1 GeV) are being proposed for spallation neutron source applications. These production accelerators require high availability and reliability. A microwave proton source, which has already demonstrated several key beam requirements, was operated for one week (170 hours) in a dc mode to test the reliability and lifetime of its plasma generator. The source was operated with 570 W of microwave (2.45 GHz) discharge power and with a 47-kV extraction voltage. This choice of operating parameters gave a proton current density of 250 mA/cm² at 83% proton fraction, which is sufficient for a conservative dc injector design. The beam current was 60-65 mA over most of the week, and was sufficiently focused for RFQ injection. Total beam availability, defined as 47-keV beam-on time divided by elapsed time, was 96.2%. Spark-downs in the high voltage column and a gas flow control problem caused all the downtime; no plasma generator failures were observed.

  7. A 14-MeV beam-plasma neutron source for materials testing

    International Nuclear Information System (INIS)

    Futch, A.H.; Coensgen, F.H.; Damm, C.C.; Molvik, A.W.

    1989-01-01

    The design and performance of 14-MeV beam-plasma neutron sources for accelerated testing of fusion reactor materials are described. Continuous 14-MeV neutron fluxes in the range of 5 to 10 MW/m² at the plasma surface are produced by D-T reactions in a two-component plasma. In the present designs, 14-MeV neutrons result from collisions of energetic deuterium ions, created by transverse injection of 150-keV deuterium atoms, with a fully ionized tritium target plasma. The beam energy, which is deposited at the center of the tritium column, is transferred by electron drag to the warm plasma, which flows axially to the end regions. Neutral gas at high pressure absorbs the energy in the tritium plasma and transfers the heat to the walls of the vacuum vessel. The plasma parameters of the neutron source, in dimensionless units, have been achieved in the 2XIIB high-β plasma. The larger magnetic field of the present design permits scaling to the higher energy and density of the neutron source design. In the extrapolation, care has been taken to preserve the scaling and plasma attributes that contributed to equilibrium, magnetohydrodynamic (MHD) stability, and microstability in 2XIIB. The performance and scaling characteristics are described for several designs chosen to enhance the thermal isolation of the two-component plasmas. 11 refs., 3 figs., 3 tabs

  8. An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.

    Science.gov (United States)

    Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D

    2016-05-01

    Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. Particularly, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be
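    As a rough illustration of how FQRS detection accuracy is typically scored in such benchmarking, the sketch below matches detected beat locations to reference annotations within a fixed tolerance and reports an F1-style statistic; it is a generic sketch under stated assumptions, not code from the article, and the function name, tolerance window and sample data are hypothetical.

      # Hedged sketch: match detected foetal QRS locations to reference annotations
      # within a +/-50 ms window and report TP/FN/FP and F1.
      import numpy as np

      def fqrs_detection_stats(reference, detected, fs=1000, tol_ms=50):
          tol = tol_ms * fs / 1000.0                 # tolerance window in samples
          detected = np.asarray(sorted(detected), dtype=float)
          matched = np.zeros(len(detected), dtype=bool)
          tp = 0
          for ref in reference:
              candidates = np.where(~matched)[0]     # indices of unmatched detections
              if candidates.size == 0:
                  break
              j = candidates[np.argmin(np.abs(detected[candidates] - ref))]
              if abs(detected[j] - ref) <= tol:
                  matched[j] = True
                  tp += 1
          fn = len(reference) - tp
          fp = len(detected) - tp
          f1 = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 0.0
          return tp, fn, fp, f1

      # Hypothetical reference and detected FQRS sample indices (fs = 1000 Hz)
      print(fqrs_detection_stats([400, 800, 1200, 1600], [405, 810, 1650]))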

  9. Variability in source sediment contributions by applying different statistic test for a Pyrenean catchment.

    Science.gov (United States)

    Palazón, L; Navas, A

    2017-06-01

    Information on sediment contribution and transport dynamics from contributing catchments is needed to develop management plans that tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study, several statistical procedures for selecting the optimum composite fingerprint were compared, including (#2) the Kruskal-Wallis H-test followed by discriminant function analysis and (#3) principal components analysis followed by discriminant function analysis. Source contribution results differed between the assessed options, with the greatest differences observed for option #3, the two-step process of principal components analysis and discriminant function analysis. The characteristics of the solutions obtained by the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved using #2, the two-step process of Kruskal-Wallis H-test and discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
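    For readers unfamiliar with the first step of such composite-fingerprint selection, the sketch below shows how the Kruskal-Wallis H-test is commonly used to screen tracer properties that discriminate between source groups before a discriminant function analysis; the data, tracer names and the 0.05 threshold are assumptions for illustration, not values from the study.

      # Hedged sketch: screen candidate tracers with the Kruskal-Wallis H-test,
      # keeping those that discriminate between source groups (p < 0.05).
      import numpy as np
      from scipy import stats

      def screen_tracers(sources, alpha=0.05):
          """sources: dict mapping source name -> {tracer name -> array of values}."""
          tracers = next(iter(sources.values())).keys()
          selected = []
          for tracer in tracers:
              groups = [np.asarray(s[tracer]) for s in sources.values()]
              h, p = stats.kruskal(*groups)
              if p < alpha:
                  selected.append((tracer, h, p))
          return selected

      # Hypothetical tracer data for three source types
      rng = np.random.default_rng(1)
      sources = {
          "badlands":    {"Cs137": rng.normal(2, 0.5, 20), "K": rng.normal(1.0, 0.2, 20)},
          "agriculture": {"Cs137": rng.normal(5, 0.5, 20), "K": rng.normal(1.1, 0.2, 20)},
          "forest":      {"Cs137": rng.normal(8, 0.5, 20), "K": rng.normal(1.0, 0.2, 20)},
      }
      print(screen_tracers(sources))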

  10. Effect of dietary iron source and iron status on iron bioavailability tests in the rat

    International Nuclear Information System (INIS)

    Zhang, D.; Hendricks, D.G.; Mahoney, A.W.

    1986-01-01

    Weanling male rats were made anemic in 7 days by feeding a low iron diet and bleeding. Healthy rats were fed the low iron diet supplemented with ferrous sulfate (29 ppm Fe). Each group was subdivided and fed for 10 days on test diets containing about 29 ppm iron that were formulated with meat:spinach mixtures or meat:soy mixtures to provide 100:0, 75:25, 50:50, 25:75, or 0:100% of the dietary iron from these sources, or from a ferrous sulfate diet. After 3 days on the diets all rats were dosed orally with 2 or 5 microcuries of 59Fe after an 18-hour fast and refeeding for 1.5 hours. Iron status influenced liver iron, carcass iron, liver radioactivity and percent of radioactive dose retained. Diet influenced fecal iron and apparent absorption of iron. In iron bioavailability studies, assessment methodology and the iron status of the test subject greatly influence the estimates of the value of dietary sources of iron.

  11. First in situ operation performance test of ground source heat pump in Tunisia

    International Nuclear Information System (INIS)

    Naili, Nabiha; Attar, Issam; Hazami, Majdi; Farhat, Abdelhamid

    2013-01-01

    Highlights: • Evaluate the geothermal energy in Tunisia. • Study of the performance of a GSHP system for space cooling. • GSHP is a promising alternative for building cooling in Tunisia. - Abstract: The main purpose of this paper is to study the energetic potential of deploying the Ground Source Heat Pump (GSHP) system in Tunisia for cooling mode applications. Therefore, a pilot GSHP system using a horizontal Ground Heat Exchanger (GHE) was installed and tested at the Research and Technology Center of Energy (CRTEn), Borj Cédria. The experiment is conducted in a test room with a floor area of about 12 m². A polyethylene exchanger (PEX), used as a radiant floor cooling (RFC) system, is integrated in the floor of the test room. The experimental setup mainly monitors the ground temperature, the temperature and flow rate of the water circulating in the heat pump and the GHE, as well as the power consumption of the heat pump and circulating pumps. These experimental data are essentially used to evaluate the coefficient of performance of the heat pump (COP_hp) and of the overall system (COP_sys) for continuous operation mode. The COP_hp and COP_sys were found to be 4.25 and 2.88, respectively. These results reveal that the use of the ground source heat pump is very appropriate for Tunisian building cooling.
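    The two performance figures follow from simple ratios of useful cooling to electrical input; the numbers below are hypothetical values chosen only to illustrate the arithmetic that reproduces COPs of about 4.25 and 2.88, not measurements from the paper.

      # Hedged arithmetic sketch (hypothetical powers, not the paper's raw data):
      # COP_hp  = useful cooling / heat-pump electrical input
      # COP_sys = useful cooling / (heat-pump + circulating-pump electrical input)
      cooling_kw    = 3.40   # assumed useful cooling delivered (kW)
      p_heatpump_kw = 0.80   # assumed heat-pump compressor power (kW)
      p_pumps_kw    = 0.38   # assumed circulating-pump power (kW)

      cop_hp  = cooling_kw / p_heatpump_kw
      cop_sys = cooling_kw / (p_heatpump_kw + p_pumps_kw)
      print(f"COP_hp = {cop_hp:.2f}, COP_sys = {cop_sys:.2f}")   # ~4.25 and ~2.88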

  12. OpenSR: An Open-Source Stimulus-Response Testing Framework

    Directory of Open Access Journals (Sweden)

    Carolyn C. Matheus

    2015-01-01

    Full Text Available Stimulus–response (S–R) tests provide a unique way to acquire information about human perception by capturing automatic responses to stimuli and attentional processes. This paper presents OpenSR, a user-centered S–R testing framework providing a graphical user interface that can be used by researchers to customize, administer, and manage one type of S–R test, the implicit association test. OpenSR provides an extensible open-source Web-based framework that is platform independent and can be implemented on most computers using any operating system. In addition, it provides capabilities for automatically generating and assigning participant identifications, assigning participants to different condition groups, tracking responses, and facilitating collecting and exporting of data. The Web technologies and languages used in creating the OpenSR framework are discussed, namely, HTML5, CSS3, JavaScript, jQuery, Twitter Bootstrap, Python, and Django. OpenSR is available for free download.

  13. SUPER-FMIT, an accelerator-based neutron source for fusion components irradiation testing

    International Nuclear Information System (INIS)

    Burke, R.J.; Holmes, J.J.; Johnson, D.L.; Mann, F.M.; Miles, R.R.

    1984-01-01

    The SUPER-FMIT facility is proposed as an advanced accelerator-based neutron source for high flux irradiation testing of large-sized fusion reactor components. The facility would require only small extensions to existing accelerator and target technology originally developed for the Fusion Materials Irradiation Test (FMIT) facility. There, neutrons would be produced by a 0.1 ampere beam of 35 MeV deuterons incident upon a liquid lithium target. The volume available for high flux (> 10¹⁴ n/cm²·s) testing in SUPER-FMIT would be 14 liters, about a factor of 30 larger than in the FMIT facility. This is because the effective beam current of 35 MeV deuterons on target can be increased by a factor of ten to 1.0 amperes or more. Such a large increase can be accomplished by acceleration of multiple beams of molecular deuterium ions (D₂⁺) to 70 MeV in a common accelerator structure. The availability of multiple beams and large total current allows great variety in the testing that can be done. For example, fluxes greater than 10¹⁶ n/cm²·s, multiple simultaneous experiments, and great flexibility in tailoring of spatial distributions of flux and spectra can be achieved.

  14. A General Semi-Analytical Solution for Three Types of Well Tests in Confined Aquifers with a Partially Penetrating Well

    Directory of Open Access Journals (Sweden)

    Shaw-Yang Yang; Hund-Der Yeh

    2012-01-01

    Full Text Available This note develops a general mathematical model for describing the transient hydraulic head response for the constant-head test, constant-flux test, and slug test in a radial confined aquifer system with a partially penetrating well. The Laplace-domain solution for the model is derived by applying the Laplace transform with respect to time and the finite Fourier cosine transform with respect to the z-direction. This new solution has been shown to reduce to the constant-head test solution when discounting the wellbore storage and maintaining a constant well water level. This solution can also be reduced to the constant-flux test solution when discounting the wellbore storage and keeping a constant pumping rate in the well. Moreover, the solution becomes the slug test solution when there is no pumping in the well. This general solution can be used to develop a single computer code to estimate aquifer parameters if coupled with an optimization algorithm, or to assess the effect of well partial penetration on the hydraulic head distribution for the three types of aquifer tests.
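    Laplace-domain solutions of this kind are usually evaluated in the time domain by numerical inversion. The generic Gaver-Stehfest routine below shows how such a solution F(p) could be turned into head-versus-time values inside an estimation code; it is a common choice rather than the authors' code, and the test transform is only a sanity check.

      # Hedged sketch: Gaver-Stehfest numerical inversion of a Laplace-domain
      # function F(p) at time t (N must be even).
      import math

      def stehfest_invert(F, t, N=12):
          ln2_t = math.log(2.0) / t
          total = 0.0
          for i in range(1, N + 1):
              v = 0.0
              for k in range((i + 1) // 2, min(i, N // 2) + 1):
                  v += (k ** (N // 2) * math.factorial(2 * k) /
                        (math.factorial(N // 2 - k) * math.factorial(k) *
                         math.factorial(k - 1) * math.factorial(i - k) *
                         math.factorial(2 * k - i)))
              v *= (-1) ** (i + N // 2)
              total += v * F(i * ln2_t)
          return ln2_t * total

      # Sanity check on a known pair: F(p) = 1/(p + 1)  <->  f(t) = exp(-t)
      print(stehfest_invert(lambda p: 1.0 / (p + 1.0), t=1.0))   # ~0.3679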

  15. Quantifying Acoustic Impacts on Marine Mammals and Sea Turtles: Methods and Analytical Approach for Phase III Training and Testing

    Science.gov (United States)

    2017-06-16

    bottom in these areas. The primary acoustic propagation paths in deep water do not usually involve any interaction with the bottom, whereas in shallow ... during the simulation process according to the typical depth pattern observed for each species. Dive profile information was collected via literature ... Friedlander source signature was examined at two locations (shallow and deep) for a near-surface 1,000 lb net explosive weight charge. Generally there is

  16. Analytical and numerical study of validation test-cases for multi-physic problems: application to magneto-hydro-dynamic

    Directory of Open Access Journals (Sweden)

    D Cébron

    2016-04-01

    Full Text Available The present paper is concerned with the numerical simulation of Magneto-Hydro-Dynamic (MHD) problems with industrial tools. MHD received attention some twenty to thirty years ago as a possible alternative in propulsion applications; MHD-propelled ships have even been designed for that purpose. However, such propulsion systems have proved to be of low efficiency, and fundamental research in the area has progressively received much less attention over the past decades. Numerical simulation of MHD problems could however provide interesting solutions in the field of turbulent flow control. The development of recent efficient numerical techniques for multi-physic applications provides a promising tool for the engineer for that purpose. In the present paper, some elementary test cases in laminar flow with magnetic forcing terms are analysed; the equations of the coupled problem are exposed, analytical solutions are derived in each case and are compared to numerical solutions obtained with a numerical tool for multi-physic applications. The present work can be seen as a validation of numerical tools (based on the finite element method) for academic as well as industrial application purposes.

  17. Chemical synthesis, characterisation, analytical method development and control to promote exposure assessments and toxicological testing. Highlights from COMPARE

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, Aa.; Malmberg, T.; Weiss, J. [Stockholm Univ. (Sweden). Dept. of Environmental Chemistry

    2004-09-15

    The issue of endocrine disruptor effects in wildlife and humans grew increasingly important during the 1990s. As part of the focus on endocrine disruptors, new contaminants and their metabolites were put forward for studies with endpoints related to hormone disruption. One such large group of chemicals and/or metabolites of neutral semi-persistent or persistent compounds was the substituted phenols, particularly the halogenated phenolic compounds (HPCs). Polychlorobiphenylols (OH-PCBs) were reported to be strongly retained in human blood plasma in 1995, and this article was the first study to point out the general retention of several OH-PCBs in the plasma. The metabolic formation of OH-PCBs was well known, and the specific blood retention had been reported for at least one PCB congener, 3,3',4,4'-tetrachlorobiphenyl (CB-77), in some previous studies. The identification of OH-PCBs being retained in blood and their specific binding to transthyretin (TTR) has formed much of the basis for two EU R and D programs, first RENCO and now COMPARE. The present report aims to highlight some of the results obtained within the COMPARE program, mainly dealing with the chemical synthesis, characterisation and analytical aspects of HPCs.

  18. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model the performance of wave energy converters in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be

  19. Radioimmunoassay. A revolution in the analytic procedure

    Energy Technology Data Exchange (ETDEWEB)

    Strecker, H; Eckert, H G [Farbwerke Hoechst A.G., Frankfurt am Main (Germany, F.R.). Radiochemisches Lab.

    1978-04-01

    Radioimmunoassay is an analytic method which combines the sensitivity of radioactive measurement and the specificity of the antigen-antibody reaction. Substances down to a concentration of a few picograms per ml of serum (or biological material) can be measured in the presence of a millionfold excess of otherwise interfering substances. The method is easy to carry out (test tube chemistry). The main field of application at the moment is in endocrinology; further possibilities of application are in pharmaceutical research, environmental protection, forensic medicine, and for general analytic purposes. Radioactive sources are used only in vitro in the nanocurie range, i.e. radiation exposure is negligible.

  20. Study of a spherical torus based volumetric neutron source for nuclear technology testing and development

    International Nuclear Information System (INIS)

    Cheng, E.T.; Cerbone, R.J.; Sviatoslavsky, I.N.; Galambos, L.D.; Peng, Y.-K.M.

    2000-01-01

    A plasma based, deuterium and tritium (DT) fueled, volumetric 14 MeV neutron source (VNS) has been considered as a possible facility to support the development of the demonstration fusion power reactor (DEMO). It can be used to test and develop the necessary fusion blanket and divertor components and provide a sufficient database, particularly on the reliability of nuclear components necessary for DEMO. The VNS device can complement ITER by reducing the cost and risk in the development of DEMO. A low cost, scientifically attractive, and technologically feasible volumetric neutron source based on the spherical torus (ST) concept has been conceived. The ST-VNS, which has a major radius of 1.07 m, aspect ratio 1.4, and plasma elongation three, can produce a neutron wall loading from 0.5 to 5 MW m⁻² at the outboard test section with a modest fusion power level from 38 to 380 MW. It can be used to test necessary nuclear technologies for a fusion power reactor and to develop fusion core components including the divertor, first wall, and power blanket. Using staged operation leading to high neutron wall loading and optimistic availability, a neutron fluence of more than 30 MW year m⁻² is obtainable within 20 years of operation. This will permit the assessment of lifetime and reliability of promising fusion core components in a reactor-relevant environment. A full scale demonstration of power reactor fusion core components is also made possible because of the high neutron wall loading capability. Tritium breeding in such a full scale demonstration can be very useful to ensure the self-sufficiency of the fuel cycle for a candidate power blanket concept.

  1. Cooperative effort between Consorcio European Spallation Source--Bilbao and Oak Ridge National Laboratory spallation neutron source for manufacturing and testing of the JEMA-designed modulator system

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, David E [ORNL

    2017-01-02

    The JEMA modulator was originally developed for the European Spallation Source (ESS) when Spain was under consideration as a location for the ESS facility. Discussions ensued and the Spallation Neutron Source Research Accelerator Division agreed to form a collaboration with ESS-Bilbao (ESS-B) consortium to provide services for specifying the requirements for a version of the modulator capable of operating twelve 550 kW klystrons, monitoring the technical progress on the contract with JEMA, installing and commissioning the modulator at SNS, and performing a 30 day full power test. This work was recently completed, and this report discusses those activities with primary emphasis on the installation and testing activities.

  2. Cooperative effort between Consorcio European Spallation Source--Bilbao and Oak Ridge National Laboratory spallation neutron source for manufacturing and testing of the JEMA-designed modulator system

    International Nuclear Information System (INIS)

    Anderson, David E.

    2017-01-01

    The JEMA modulator was originally developed for the European Spallation Source (ESS) when Spain was under consideration as a location for the ESS facility. Discussions ensued and the Spallation Neutron Source Research Accelerator Division agreed to form a collaboration with ESS-Bilbao (ESS-B) consortium to provide services for specifying the requirements for a version of the modulator capable of operating twelve 550 kW klystrons, monitoring the technical progress on the contract with JEMA, installing and commissioning the modulator at SNS, and performing a 30 day full power test. This work was recently completed, and this report discusses those activities with primary emphasis on the installation and testing activities.

  3. Source-Type Inversion of the September 03, 2017 DPRK Nuclear Test

    Science.gov (United States)

    Dreger, D. S.; Ichinose, G.; Wang, T.

    2017-12-01

    On September 3, 2017, the DPRK announced a nuclear test at their Punggye-ri site. This explosion registered an mb of 6.3 and was well recorded by global and regional seismic networks. We apply the source-type inversion method (e.g. Ford et al., 2012; Nayak and Dreger, 2015), and the MDJ2 seismic velocity model (Ford et al., 2009) to invert low frequency (0.02 to 0.05 Hz) complete three-component waveforms, and first-motion polarities to map the goodness of fit in source-type space. We have used waveform data from the New China Digital Seismic Network (BJT, HIA, MDJ), Korean Seismic Network (TJN), and the Global Seismograph Network (INCN, MAJO). From this analysis, the event discriminates as an explosion. For a pure explosion model, we find a scalar seismic moment of 5.77e+16 Nm (Mw 5.1); however, this model fails to fit the large Love waves registered on the transverse components. The best fitting complete solution finds a total moment of 8.90e+16 Nm (Mw 5.2) that is decomposed as 53% isotropic, 40% double-couple, and 7% CLVD, although the range of isotropic moment from the source-type analysis indicates that it could be as high as 60-80%. The isotropic moment in the source-type inversion is 4.75e16 Nm (Mw 5.05). Assuming elastic moduli from model MDJ2, the explosion cavity radius is approximately 51 m, and the yield estimated using Denny and Johnson (1991) is 246 kt. Approximately 8.5 minutes after the blast a second seismic event was registered, which is best characterized as a vertically closing horizontal crack, perhaps representing the partial collapse of the blast cavity and/or a service tunnel. The total moment of the collapse is 3.34e+16 Nm (Mw 4.95). The volumetric moment of the collapse is 1.91e+16 Nm, approximately 1/3 to 1/2 of the explosive moment. German TerraSAR-X observations of deformation (Wang et al., 2017) reveal large radial outward motions consistent with expected deformation for an explosive source, but lack significant vertical motions above the
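    The moment magnitudes quoted above are consistent with the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1) for M0 in N·m; the short script below is illustrative arithmetic only, not the authors' code.

      # Hedged check: convert the reported scalar moments (N*m) to Mw.
      import math

      def moment_magnitude(m0_newton_meters):
          return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

      for label, m0 in [("pure explosion", 5.77e16),
                        ("best-fit total", 8.90e16),
                        ("collapse",       3.34e16)]:
          print(f"{label}: Mw = {moment_magnitude(m0):.2f}")
      # ~5.11, ~5.23 and ~4.95, matching the Mw 5.1, 5.2 and 4.95 quoted above.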

  4. The influence of testing apparatus stiffness on the source properties of laboratory stick-slip

    Science.gov (United States)

    Kilgore, B. D.; McGarr, A.; Beeler, N. M.; Lockner, D. A.

    2016-12-01

    Stick-slip experiments were performed to determine the influence of the testing apparatus stiffness on source properties, to develop methods to relate stick-slip to natural earthquakes, and to examine the hypothesis of McGarr [2012] that the product of unloading stiffness, k, and slip duration, T, is both scale-independent and approximately constant for both laboratory and natural earthquakes. A double-direct shear load frame was used with Sierra White Granite samples at 2 MPa normal stress, and a remote loading rate of 0.2 µm/s. The stiffness of the test apparatus was varied by more than an order of magnitude by inserting disk springs into the shear loading column adjacent to the granite samples. Servo-controlling slip at a point between the forcing ram and the shear force load cell produced repeatable slip events. Slip and slip duration decrease as k increases, as they do for natural earthquakes. In contrast to earthquakes, stress drop and slip rate decrease with increasing k, and the product kT for these experiments is not constant, but decreases with k. These data, collected over a range of k, do not conform to McGarr's [2012] hypothesis. However, analysis of stick-slip studies from other testing apparatuses is consistent with McGarr's hypothesis; kT is scale-independent, similar to that of earthquakes, equal to the ratio of static stress drop to average slip velocity, and similar to the ratio of shear modulus to wavespeed of rock. These properties result from conducting experiments over a range of sample sizes, using rock samples with the same elastic properties as the Earth, and using testing machines whose stiffnesses decrease, and characteristic periods increase, with scale. A consequence of our experiments and analysis is that extrapolation of lab-scale earthquake source properties to the Earth is more difficult than previously thought, requiring an accounting for the properties of the testing machines and additional research beyond that reported here.
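    The two scale-independent ratios mentioned at the end of the abstract can be compared with back-of-the-envelope numbers; the values below are typical textbook figures for crustal rock and ordinary earthquakes, assumed for illustration, not data from this study.

      # Hedged arithmetic sketch with typical values (not measurements from the study):
      shear_modulus     = 30e9    # Pa, typical crystalline rock
      shear_wavespeed   = 3.0e3   # m/s
      stress_drop       = 3.0e6   # Pa, typical earthquake static stress drop
      avg_slip_velocity = 0.3     # m/s, typical average slip rate

      print(shear_modulus / shear_wavespeed)   # ~1e7 Pa*s/m
      print(stress_drop / avg_slip_velocity)   # ~1e7 Pa*s/m, same order of magnitude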

  5. Testing for lightning as a source of radio bursts observed on the nightside of Venus

    International Nuclear Information System (INIS)

    Sonwalkar, V.S.; Carpenter, D.L.; Strangeway, R.J.

    1990-11-01

    In certain previous studies of radio burst events recorded by the Pioneer Venus Orbiting Electric Field Detector (OEFD), data were sorted for statistical purposes according to occurrence at filter band frequencies smaller than or greater than typical values of the ambient electron gyrofrequency. The expectation in making this distinction was that the lowest frequency signals, at 100 Hz, were candidates for propagation through the ionosphere to the spacecraft in the whistler mode, and that the higher frequency signals, if of subionospheric origin, would require some different ionospheric penetration mechanism. On the basis of certain assumptions about the homogeneity and horizontal stratification of the Venusian nightside ionosphere, methods were developed for case-by-case testing of the hypothesis that any particular burst event originated in subionospheric lightning. The tests, which are capable of refinement, allow prediction of the resonance cone angle, refractive index, wave dispersion, and wave polarization. The tests have been applied to data from 11 periods along 7 orbits, and are believed to represent an improved way of categorizing OEFD burst data for purposes of investigating source/propagation mechanisms. Four of the five burst events that were not found consistent with the lightning hypothesis involved receptions at multiple OEFD filter band frequencies

  6. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  7. Quality-control analytical methods: endotoxins: essential testing for pyrogens in the compounding laboratory, part 3: a simplified endotoxin test method for compounded sterile preparations.

    Science.gov (United States)

    Cooper, James F

    2011-01-01

    The first two parts of the IJPC series on endotoxin testing explained the nature of pyrogenic contamination and described various Limulus amebocyte lysate methods for detecting and measuring endotoxin levels with the bacterial endotoxin test described in the United States Pharmacopeia. This third article in that series describes the endotoxin test that is simplest to perform for pharmacists who prefer to conduct an endotoxin assay at the time of compounding in the pharmacy setting.

  8. Comparison of Video Head Impulse Test (vHIT) Gains Between Two Commercially Available Devices and by Different Gain Analytical Methods.

    Science.gov (United States)

    Lee, Sang Hun; Yoo, Myung Hoon; Park, Jun Woo; Kang, Byung Chul; Yang, Chan Joo; Kang, Woo Suk; Ahn, Joong Ho; Chung, Jong Woo; Park, Hong Ju

    2018-06-01

    To evaluate whether video head impulse test (vHIT) gains are dependent on the measuring device and method of analysis. Prospective study. vHIT was performed in 25 healthy subjects using two devices simultaneously. vHIT gains were compared between these instruments and using five different methods of comparing position and velocity gains during head movement intervals. The two devices produced different vHIT gain results with the same method of analysis. There were also significant differences in the vHIT gains measured using different analytical methods. The gain analytic method that compares the areas under the velocity curve (AUC) of the head and eye movements during head movements showed lower vHIT gains than a method that compared the peak velocities of the head and eye movements. The former method produced the vHIT gain with the smallest standard deviation among the five procedures tested in this study. vHIT gains differ in normal subjects depending on the device and method of analysis used, suggesting that it is advisable for each device to have its own normal values. Gain calculations that compare the AUC of the head and eye movements during the head movements show the smallest variance.
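    The two gain definitions compared in the study can be illustrated with a short sketch; the synthetic traces and the trapezoidal integration are assumptions for illustration, and this is not the software of either commercial device.

      # Hedged sketch: vHIT gain as a ratio of areas under the velocity curves (AUC)
      # versus a ratio of peak velocities, over the head-movement interval.
      import numpy as np

      def _auc(t, y):
          y = np.abs(y)
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

      def vhit_gains(t, head_vel, eye_vel):
          auc_gain  = _auc(t, eye_vel) / _auc(t, head_vel)
          peak_gain = np.max(np.abs(eye_vel)) / np.max(np.abs(head_vel))
          return auc_gain, peak_gain

      # Synthetic half-sine head impulse with a slightly smaller, shorter eye response
      t    = np.linspace(0.0, 0.15, 301)           # 150 ms head movement
      head = 200 * np.sin(np.pi * t / 0.15)        # deg/s
      eye  = 190 * np.sin(np.pi * t / 0.12)        # deg/s
      eye[t > 0.12] = 0.0
      print(vhit_gains(t, head, eye))              # AUC gain comes out lower than peak gain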

  9. Creating and Evaluating a Hypertext System of Documenting Analytical Test Methods in a Chemical Plant Quality Assurance Laboratory.

    Science.gov (United States)

    White, Charles E., Jr.

    The purpose of this study was to develop and implement a hypertext documentation system in an industrial laboratory and to evaluate its usefulness by participative observation and a questionnaire. Existing word-processing test method documentation was converted directly into a hypertext format or "hyperdocument." The hyperdocument was designed and…

  10. Analytic and nearly optimal self-testing bounds for the Clauser-Horne-Shimony-Holt and Mermin inequalities

    DEFF Research Database (Denmark)

    Kaniewski, Jedrzej

    2016-01-01

    that nontrivial fidelity with the singlet can be achieved as long as the violation exceeds β* = (16 + 14√2)/17 ≈ 2.11. In the case of self-testing the tripartite Greenberger-Horne-Zeilinger state using the Mermin inequality, we derive a bound which not only improves on previously known results but turns out...
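    As a quick numerical illustration of the threshold quoted above (purely arithmetic, not part of the cited derivation), the critical CHSH value lies strictly between the classical bound 2 and the Tsirelson bound 2√2.

      # Hedged check of the CHSH self-testing threshold beta* = (16 + 14*sqrt(2))/17.
      import math

      beta_star = (16 + 14 * math.sqrt(2)) / 17
      print(beta_star)                          # ~2.105
      print(2 < beta_star < 2 * math.sqrt(2))   # True: between classical and Tsirelson bounds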

  11. Integral tests of coupled multigroup neutron and gamma cross sections with fission and fusion sources

    International Nuclear Information System (INIS)

    Schriewer, J.; Hehn, G.; Mattes, M.; Pfister, G.; Keinert, J.

    1978-01-01

    Calculations were made for different benchmark experiments in order to test the coupled multigroup neutron and gamma library EURLIB-3, with 100 neutron groups and 20 gamma groups. In cooperation with EURATOM, Ispra, we recently produced this shielding library from ENDF/B-IV data for application in fission and fusion technology. Integral checks were performed for natural lithium, carbon, oxygen, and iron. Since iron is the most important structural material in nuclear technology, we started with calculations of iron benchmark experiments. Most of them are integral experiments of INR, Karlsruhe, but comparisons were also done with benchmark experiments from the USA and Japan. For the experiments with fission sources we obtained satisfactory results. Not all details of the resonances can be checked with the flux measurements and the multigroup cross sections used, but some averaged resonance behaviour of the measured and calculated fluxes can be compared and checked within the error limits given. We find greater differences in the calculations of benchmark experiments with 14 MeV neutron sources. For iron, the group cross sections of EURLIB-3 produce an underestimation of the neutron flux in a broad energy region below the source energy. The conclusion is that the energy degradation by inelastic scattering is too strong. For fusion applications the anisotropy of the inelastic scattering process must be taken into account, which is not done by the processing codes at present. If this is not sufficient, additional corrections have to be applied to the inelastic cross sections of iron in ENDF/B-IV. (author)

  12. Laser-induced plasmas as an analytical source for quantitative analysis of gaseous and aerosol systems: Fundamentals of plasma-particle interactions

    Science.gov (United States)

    Diwakar, Prasoon K.

    2009-11-01

    Laser-induced Breakdown Spectroscopy (LIBS) is a relatively new analytical diagnostic technique which has gained serious attention in the recent past due to its simplicity, robustness, portability, and multi-element analysis capabilities. LIBS has been used successfully for analysis of elements in different media including solids, liquids and gases. From 1963, when the first breakdown study was reported, to 1983, when the first LIBS experiments were reported, the technique has come a long way, but the majority of the fundamental understanding of the processes that occur has taken place in the last few years, which has propelled LIBS in the direction of being a well-established analytical technique. This study, which mostly focuses on LIBS involving aerosols, has been able to unravel some of the mysteries and provide knowledge that will be valuable to the LIBS community as a whole. LIBS processes can be broken down into three basic steps, namely, plasma formation, analyte introduction, and plasma-analyte interactions. In this study, these three steps have been investigated in laser-induced plasma, focusing mainly on the plasma-particle interactions. Understanding plasma-particle interactions and the fundamental processes involved is important in advancing laser-induced breakdown spectroscopy as a reliable and accurate analytical technique. Critical understanding of plasma-particle interactions includes study of the plasma evolution, analyte atomization, and particle dissociation and diffusion. In this dissertation, temporal and spatial studies have been done to understand the fundamentals of the LIBS processes, including the breakdown of gases by the laser pulse, plasma inception mechanisms, plasma evolution, analyte introduction and plasma-particle interactions and their influence on the LIBS signal. Spectral measurements were performed in a laser-induced plasma and the results reveal localized perturbations in the plasma properties in the vicinity of the analyte species, for

  13. Microbial characterization for the Source-Term Waste Test Program (STTP) at Los Alamos

    International Nuclear Information System (INIS)

    Leonard, P.A.; Strietelmeier, B.A.; Pansoy-Hjelvik, M.E.; Villarreal, R.

    1999-01-01

    The effects of microbial activity on the performance of the proposed underground nuclear waste repository, the Waste Isolation Pilot Plant (WIPP) at Carlsbad, New Mexico are being studied at Los Alamos National Laboratory (LANL) as part of an ex situ large-scale experiment. Actual actinide-containing waste is being used to predict the effect of potential brine inundation in the repository in the distant future. The study conditions are meant to simulate what might exist should the underground repository be flooded hundreds of years after closure as a result of inadvertent drilling into brine pockets below the repository. The Department of Energy (DOE) selected LANL to conduct the Actinide Source-Term Waste Test Program (STTP) to confirm the predictive capability of computer models being developed at Sandia National Laboratory

  14. Microbial characterization for the Source-Term Waste Test Program (STTP) at Los Alamos

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, P.A.; Strietelmeier, B.A.; Pansoy-Hjelvik, M.E.; Villarreal, R.

    1999-04-01

    The effects of microbial activity on the performance of the proposed underground nuclear waste repository, the Waste Isolation Pilot Plant (WIPP) at Carlsbad, New Mexico are being studied at Los Alamos National Laboratory (LANL) as part of an ex situ large-scale experiment. Actual actinide-containing waste is being used to predict the effect of potential brine inundation in the repository in the distant future. The study conditions are meant to simulate what might exist should the underground repository be flooded hundreds of years after closure as a result of inadvertent drilling into brine pockets below the repository. The Department of Energy (DOE) selected LANL to conduct the Actinide Source-Term Waste Test Program (STTP) to confirm the predictive capability of computer models being developed at Sandia National Laboratory.

  15. RF power source for the compact linear collider test facility (CTF3)

    CERN Document Server

    McMonagle, G; Brown, Peter; Carron, G; Hanni, R; Mourier, J; Rossat, G; Syratchev, I V; Tanner, L; Thorndahl, L

    2004-01-01

    The CERN CTF3 facility will test and demonstrate many vital components of CLIC (Compact Linear Collider). This paper describes the pulsed RF power source at 2998.55 MHz for the drive-beam accelerator (DBA), which produces a beam with an energy of 150 MeV and a current of 3.5 Amps. Where possible, existing equipment from the LEP preinjector, especially the modulators and klystrons, is being used and upgraded to achieve this goal. A high power RF pulse compression system is used at the output of each klystron, which requires sophisticated RF phase programming on the low level side to achieve the required RF pulse. In addition to the 3 GHz system two pulsed RF sources operating at 1.5 GHz are being built. The first is a wide-band, low power, travelling wave tube (TWT) for the subharmonic buncher (SHB) system that produces a train of "phase coded" subpulses as part of the injector scheme. The second is a high power narrow band system to produce 20 MW RF power to the 1.5 GHz RF deflectors in the delay loop situate...

  16. Basic Concepts in Classical Test Theory: Tests Aren't Reliable, the Nature of Alpha, and Reliability Generalization as a Meta-analytic Method.

    Science.gov (United States)

    Helms, LuAnn Sherbeck

    This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…

  17. Beam diagnostic tools for the negative hydrogen ion source test facility ELISE

    International Nuclear Information System (INIS)

    Nocentini, Riccardo; Fantz, Ursel; Franzen, Peter; Froeschle, Markus; Heinemann, Bernd; Riedl, Rudolf; Ruf, Benjamin; Wuenderlich, Dirk

    2013-01-01

    Highlights: ► We present an overview of beam diagnostic tools foreseen for the new testbed ELISE. ► A sophisticated diagnostic calorimeter allows beam profile measurement. ► A tungsten wire mesh in the beam path provides a qualitative picture of the beam. ► Stripping losses and beam divergence are measured by Hα Doppler shift spectroscopy. -- Abstract: The test facility ELISE, presently being commissioned at IPP, is a first step in the R and D roadmap for the RF driven ion source and extraction system of the ITER NBI system. The “half-size” ITER-like test facility includes a negative hydrogen ion source that can be operated for 1 h. ELISE is expected to extract an ion beam of 20 A at 60 kV for 10 s every 3 min, therefore delivering a total power of 1.2 MW. The extraction area has a geometry that closely reproduces the ITER design, with the same width and half the height, i.e. 1 m × 1 m. This paper presents an overview of beam diagnostic tools foreseen for ELISE. For the commissioning phase, a simple beam dump with basic diagnostic capabilities has been installed. In the second phase, the beam dump will be substituted by a more sophisticated diagnostic calorimeter to allow beam profile measurement. Additionally, a tungsten wire mesh will be introduced in the beam path to provide a qualitative picture of beam size and position. Stripping losses and beam divergence will be measured by means of Hα Doppler shift spectroscopy. An absolute calibration is foreseen in order to measure beam intensity

  18. An analytical model for the distribution of CO2 sources and sinks, fluxes, and mean concentration within the roughness sub-layer

    Science.gov (United States)

    Siqueira, M. B.; Katul, G. G.

    2009-12-01

    A one-dimensional analytical model that predicts foliage CO2 uptake rates, turbulent fluxes, and mean concentration throughout the roughness sub-layer (RSL), a layer that extends from the ground surface up to 5 times the canopy height (h), is proposed. The model combines the mean continuity equation for CO2 with first-order closure principles for turbulent fluxes and with simplified physiological and radiative transfer schemes for foliage uptake. This combination results in a second-order ordinary differential equation in which soil respiration (RE) is imposed as the lower boundary condition and the CO2 concentration well above the RSL as the upper one. An inverse version of the model was tested against data sets from two contrasting ecosystems: a tropical forest (TF, h = 40 m) and a managed irrigated rice canopy (RC, h = 0.7 m), with good agreement noted between modeled and measured mean CO2 concentration profiles within the entire RSL. Sensitivity analysis on the model parameters revealed a plausible scaling regime between them and a dimensionless parameter defined by the ratio between external (RE) and internal (stomatal conductance) characteristics controlling the CO2 exchange process. The model can be used to infer the thickness of the RSL for CO2 exchange, the inequality in zero-plane displacement between CO2 and momentum, and its consequences for modeled CO2 fluxes. A simplified version of the solution is well suited for incorporation into large-scale climate models. Furthermore, the model framework can be used to estimate a priori the relative contributions of the soil surface and the atmosphere to canopy-air CO2 concentration, thereby making it complementary to stable isotope studies. [Figure caption: panels a) and c) show profiles of normalized measured leaf area density for TF and RC, respectively (continuous lines: the constant a used in the model; dashed lines: data-derived profiles); panels b) and d) show modeled and ensemble-averaged measured …]
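    To make the structure of such a model concrete, the sketch below solves a one-dimensional problem of the same general type: first-order closure (flux = -K dC/dz), steady continuity, a linearized foliage sink proportional to leaf area density and to (C - Ci) inside the canopy, a soil respiration flux RE as the lower boundary condition, and a fixed concentration at 5h as the upper one. It is not the authors' model; the profiles and parameter values are illustrative assumptions only.

        # Finite-difference solution of d/dz(K dC/dz) - a*g*(C - Ci) = 0 with a
        # flux boundary condition at the ground and a fixed concentration at 5h.
        # All profiles and parameter values below are assumed for illustration.
        import numpy as np

        h, n = 40.0, 400                    # canopy height (m), grid points
        z = np.linspace(0.0, 5.0 * h, n)
        dz = z[1] - z[0]

        K = 0.1 + 1.5 * z / h               # eddy diffusivity (m2/s), assumed profile
        a = np.where(z <= h, 0.25, 0.0)     # leaf area density (m2/m3), constant in canopy
        g = 5.0e-4                          # bulk leaf conductance (m/s), assumed
        Ci, C_top, RE = 280.0, 400.0, 0.15  # ppm, ppm, ppm m/s (soil respiration flux)

        A = np.zeros((n, n))
        b = np.zeros(n)
        for i in range(1, n - 1):
            Kp = 0.5 * (K[i] + K[i + 1])    # diffusivity at the upper cell face
            Km = 0.5 * (K[i] + K[i - 1])    # diffusivity at the lower cell face
            A[i, i - 1] = Km / dz**2
            A[i, i + 1] = Kp / dz**2
            A[i, i] = -(Kp + Km) / dz**2 - a[i] * g
            b[i] = -a[i] * g * Ci
        # Lower BC: -K dC/dz = RE (upward soil flux); upper BC: C = C_top.
        A[0, 0], A[0, 1], b[0] = -K[0] / dz, K[0] / dz, -RE
        A[-1, -1], b[-1] = 1.0, C_top

        C = np.linalg.solve(A, b)
        F = -0.5 * (K[:-1] + K[1:]) * np.diff(C) / dz   # turbulent flux profile
        print(f"near-ground CO2: {C[0]:.1f} ppm, flux above canopy: {F[-1]:.3f} ppm m/s")

    The printed flux above the canopy is the net of soil respiration and canopy uptake, which is the quantity an inverse version of such a model would compare against eddy-covariance measurements.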

  19. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book gives an account of analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with its definition and classification; sampling and sample treatment; gravimetry, covering the general gravimetric process in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and the chemistry of complex compounds; oxidation-reduction equilibria, including electrode potential and potentiometric titration; solvent extraction; chromatography; and laboratory work, with the basic operations for chemical experiments.
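    One of the topics listed above, the link between electrode potential and potentiometric titration, can be illustrated with a short Nernst-equation computation. The half-reaction, standard potential and titration fractions below are generic textbook-style assumptions, not taken from the book itself.

        # Electrode potential of the Fe3+/Fe2+ couple during a redox titration,
        # computed from the Nernst equation; values are illustrative assumptions.
        import numpy as np

        def nernst(E0: float, n_electrons: int, ratio_ox_red: float,
                   T: float = 298.15) -> float:
            """Potential (V) of ox + n e- <-> red at the given [ox]/[red] ratio."""
            R, F = 8.314, 96485.0
            return E0 + (R * T / (n_electrons * F)) * np.log(ratio_ox_red)

        E0_FE = 0.771   # V, standard potential of the Fe3+/Fe2+ couple
        # Before the equivalence point, the fraction titrated f sets [Fe3+]/[Fe2+],
        # so the indicator electrode potential rises slowly and then steeply.
        for f in (0.10, 0.50, 0.90, 0.99):
            print(f"f = {f:.2f}  E = {nernst(E0_FE, 1, f / (1 - f)):.3f} V")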

  20. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book gives an account of analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with its definition and classification; sampling and sample treatment; gravimetry, covering the general gravimetric process in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and the chemistry of complex compounds; oxidation-reduction equilibria, including electrode potential and potentiometric titration; solvent extraction; chromatography; and laboratory work, with the basic operations for chemical experiments.