WorldWideScience

Sample records for study qa analytical results

  1. Environmental analytical laboratory setup operation and QA/QC

    International Nuclear Information System (INIS)

    Hsu, J.P.; Boyd, J.A.; DeViney, S.

    1991-01-01

    Environmental analysis requires precise and timely measurements. Precision is ensured through quality control, and timeliness through an efficient operation; operational efficiency also ensures cost-competitiveness. Environmental analysis plays a very important role in the environmental protection program. Because of possible involvement in litigation, most environmental analyses follow stringent criteria, such as the U.S. EPA Contract Laboratory Program procedures, with analytical results documented in an orderly manner. The documentation demonstrates that all quality control steps are followed and facilitates data evaluation to determine the quality and usefulness of the data. Furthermore, the extensive records concerning sample checking, chain-of-custody, standard or surrogate preparation, daily refrigerator and oven temperature monitoring, analytical and extraction logbooks, standard operating procedures, etc., are also an important part of the laboratory documentation. Quality control for environmental analysis is becoming more stringent, required documentation is becoming more detailed, and turnaround times are becoming shorter. At the same time, the business is becoming more cost-competitive, and this trend appears likely to continue. In this paper, we discuss how to deliver this high-quality, fast-paced and demanding environmental analysis process at a competitive cost. The key to successful environmental analysis is people: the knowledge and experience of the staff are central to a successful program, and the ability to develop new methods is crucial. In addition, the laboratory information system, laboratory automation and quality assurance/quality control (QA/QC) are major factors in laboratory success. This paper concentrates on these areas.

  2. ACADEMIC PERFORMANCE AFTER AUN-QA CERTIFICATION OF STUDY PROGRAMS AT INSTITUT PERTANIAN BOGOR

    Directory of Open Access Journals (Sweden)

    Adelyna Adelyna

    2016-05-01

    The aim of the research was to evaluate the academic performance progress of six study programs at IPB after AUN-QA certification. The research was a case study of the six study programs that had been certified by AUN-QA up to December 2014. It was conducted with the objectives of defining the indicators relevant to both the IPB balanced scorecard (BSC) and the AUN-QA criteria, analyzing the AUN-QA criteria values after certification, analyzing academic performance based on the BSC key performance indicators (KPI) after certification, and analyzing the problems in improving academic performance as a basis for formulating strategies for improving academic quality. The method used in this research was the balanced scorecard (BSC) approach. The results showed that AUN-QA certification contains 15 relevant criteria and supports the achievement of the IPB BSC. The BSC KPIs supported by the AUN-QA criteria comprise 21 of the 33 IPB BSC indicators, and 14 of them are cascaded to department-level BSC indicators. The AUN-QA criteria values of the study programs have increased, with the highest criterion value in student quality and the lowest in support staff quality. The weak criteria requiring improvement include support staff quality, student assessment, stakeholder feedback, and program specification. Keywords: AUN-QA certification, academic performance, balanced scorecard

  3. A comparative study and analysis of QA requirements for the establishment of a nuclear R and D QA system

    International Nuclear Information System (INIS)

    Kim, Kwan Hyun

    2000-06-01

    This technical report provides recommendations on how to fulfill the requirements of the code in relation to QA activities for nuclear R and D field. This guide applies to the quality assurance (QA) programmes of the responsible organization, i.e. the organization having overall responsibility for the nuclear power plant, as well as to any other separate QA programmes in each stage of a nuclear R and D project. This guide covers QA work on items, services and processes impacting nuclear safety during siting, design, construction, commissioning, operation and decommissioning of nuclear power plants. The impact on safety may occur during the performance of the QA work, or owing to the application of the results of the QA. This guide may, with appropriate modification, also be usefully applied at nuclear installations other than nuclear R and D field

  4. Interlaboratory analytical performance studies; a way to estimate measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Elżbieta Łysiak-Pastuszak

    2004-09-01

    Comparability of data collected within collaborative programmes became the key challenge of analytical chemistry in the 1990s, including monitoring of the marine environment. To obtain relevant and reliable data, the analytical process has to proceed under a well-established Quality Assurance (QA) system with external analytical proficiency tests as an inherent component. A programme called Quality Assurance in Marine Monitoring in Europe (QUASIMEME) was established in 1993 and evolved over the years as the major provider of QA proficiency tests for nutrients, trace metals and chlorinated organic compounds in marine environment studies. The article presents an evaluation of results obtained in QUASIMEME Laboratory Performance Studies by the monitoring laboratory of the Institute of Meteorology and Water Management (Gdynia, Poland) in exercises on nutrient determination in seawater. The measurement uncertainty estimated from routine internal quality control measurements and from results of analytical performance exercises is also presented in the paper.
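    The laboratory-scoring step behind such proficiency exercises can be sketched as follows (a minimal illustration, not the QUASIMEME implementation; the function names and example values are assumptions): a laboratory's result is commonly converted to a z-score against the assigned value and a target standard deviation, with |z| ≤ 2 deemed satisfactory.

```python
def z_score(result, assigned, sigma_p):
    """Proficiency z-score: (reported - assigned) / target standard deviation."""
    return (result - assigned) / sigma_p

def classify(z):
    """Common interpretation: |z|<=2 satisfactory, 2<|z|<3 questionable, |z|>=3 unsatisfactory."""
    az = abs(z)
    if az <= 2:
        return "satisfactory"
    if az < 3:
        return "questionable"
    return "unsatisfactory"

# Hypothetical example: nitrate in seawater, assigned 5.0 umol/L, target sigma 0.25 umol/L
z = z_score(5.4, 5.0, 0.25)   # 1.6
print(classify(z))            # satisfactory
```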

  5. SU-F-T-275: A Correlation Study On 3D Fluence-Based QA and 2D Dose Measurement-Based QA

    International Nuclear Information System (INIS)

    Liu, S; Mazur, T; Li, H; Green, O; Sun, B; Mutic, S; Yang, D

    2016-01-01

    Purpose: The aim of this work was to demonstrate the feasibility and credibility of computing and verifying 3D fluences to assure IMRT and VMAT treatment deliveries, by correlating the passing rates of 3D fluence-based QA (P(α)) with the passing rates of 2D dose measurement-based QA (P(Dm)). Methods: 3D volumetric primary fluences are calculated by forward-projecting the beam apertures, modulated by beam MU values, at all gantry angles. We first introduce simulated machine parameter errors (MU, MLC positions, jaw, gantry and collimator) into the plan. Using passing rates of voxel intensity differences (P(Ir)) and 3D gamma analysis (P(γ)), the calculated 3D fluences, calculated 3D delivered dose, and measured 2D planar dose in phantom from the original plan are then compared with those from the corresponding plans with errors. The correlations among these three groups of passing rates, i.e. 3D fluence-based QA (P(α,Ir) and P(α,γ)), calculated 3D dose (P(Dc,Ir) and P(Dc,γ)), and 2D dose measurement-based QA (P(Dm,Ir) and P(Dm,γ)), are then investigated. Results: 20 treatment plans with 5 different types of errors were tested. Spearman's correlations were found between P(α,Ir) and P(Dc,Ir), and also between P(α,γ) and P(Dc,γ), with averaged p-values of 0.037 and 0.065 and averaged correlation coefficients (ρ) of 0.942 and 0.871, respectively. Using MatriXX QA for IMRT plans, Spearman's correlations were also obtained between P(α,Ir) and P(Dm,Ir) and between P(α,γ) and P(Dm,γ), with p-values of 0.048 and 0.071 and ρ-values of 0.897 and 0.779, respectively. Conclusion: The demonstrated correlations improve the credibility of using 3D fluence-based QA for assuring treatment deliveries of IMRT/VMAT plans. Together with the advantages of high detection sensitivity and better visualization of machine parameter errors, this study further demonstrates the accuracy and feasibility of 3D fluence-based QA in pre-treatment QA and daily QA. Research
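    The rank-correlation computation used to relate the two sets of passing rates can be illustrated with a small self-contained Spearman sketch (the passing-rate lists below are made up, not the study's data):

```python
def ranks(xs):
    """Average ranks, 1-based; ties share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical passing rates: fluence-based QA vs measurement-based QA
p_fluence = [99.1, 97.5, 95.0, 92.3, 90.8]
p_measure = [98.7, 96.9, 96.1, 93.0, 91.5]
print(spearman(p_fluence, p_measure))  # 1.0 (both sequences strictly decreasing)
```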

  6. Comparability between NQA-1 and the QA programs for analytical laboratories within the nuclear industry and EPA hazardous waste laboratories

    International Nuclear Information System (INIS)

    English, S.L.; Dahl, D.R.

    1989-01-01

    There is increasing cooperation between the Department of Energy (DOE), Department of Defense (DOD), and the Environmental Protection Agency (EPA) in the activities associated with monitoring and clean-up of hazardous wastes. Pacific Northwest Laboratory (PNL) examined the quality assurance/quality control programs that the EPA requires of the private sector when performing routine analyses of hazardous wastes to confirm how or if the requirements correspond with PNL's QA program based upon NQA-1. This paper presents the similarities and differences between NQA-1 and the QA program identified in ASTM-C1009-83, Establishing a QA Program for Analytical Chemistry Laboratories within the Nuclear Industry; EPA QAMS-005/80, Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans, which is referenced in Statements of Work for CERCLA analytical activities; and Chapter 1 of SW-846, which is used in analyses of RCRA samples. The EPA QA programs for hazardous waste analyses are easily encompassed within an already established NQA-1 QA program. A few new terms are introduced and there is an increased emphasis upon the QC/verification, but there are many of the same basic concepts in all the programs

  7. SU-F-T-287: A Preliminary Study On Patient Specific VMAT Verification Using a Phosphor-Screen Based Geometric QA System (Raven QA)

    International Nuclear Information System (INIS)

    Lee, M; Yi, B; Wong, J; Ding, K

    2016-01-01

    Purpose: The RavenQA system (LAP Laser, Germany) is a QA device with a phosphor screen detector for performing the QA tasks of TG-142. This study tested whether it is feasible to use the system for patient-specific QA of Volumetric Modulated Arc Therapy (VMAT). Methods: Water-equivalent material (5 cm) is attached to the front of the detector plate of the RavenQA for dosimetry purposes. The plate is then attached to the gantry to synchronize the movement between the detector and the gantry. Since the detector moves together with the gantry, the 'Reset gantry to 0' function of the Eclipse planning system (Varian, CA) is used to simulate the measurement situation when calculating the dose to the detector plate. The same gantry setup is used when delivering the treatment beam for feasibility testing. Cumulative dose is acquired for each arc. The optical scatter component of each image captured by the CCD camera is corrected by deconvolving a 2D spatially invariant optical scatter kernel (OSK). We assume that the OSK is a 2D isotropic point spread function decreasing with the inverse square of the radius from the center. Results: Three VMAT plans (head and neck, whole pelvis, and abdomen-pelvis) were tested. Setup time for measurements was less than 5 minutes. Passing rates of absolute gamma were 99.3%, 98.2% and 95.9%, respectively, for 3%/3mm criteria, and 96.2%, 97.1% and 86.4% for 2%/2mm criteria. The abdomen-pelvis plan has long treatment fields (37 cm), longer than the detector plate (25 cm); this plan showed a relatively lower passing rate than the other plans. Conclusion: An algorithm for IMRT/VMAT verification using the RavenQA has been developed and tested. The model of a spatially invariant OSK works well for deconvolution purposes. It is shown that the RavenQA can be used for patient-specific verification of VMAT. This work is funded in part by a Maryland Industrial Partnership Program grant to University of Maryland and to JPLC who owns the
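    The scatter correction described can be sketched with numpy (an illustrative inverse-square kernel and a Wiener-style regularized deconvolution; the kernel regularization, image size and filter constant are assumptions, not RavenQA values):

```python
import numpy as np

def inverse_square_kernel(shape, eps=1.0):
    """2D isotropic kernel falling off as 1/(r^2 + eps), normalized to unit sum."""
    cy, cx = shape[0] // 2, shape[1] // 2
    y, x = np.ogrid[:shape[0], :shape[1]]
    k = 1.0 / ((y - cy) ** 2 + (x - cx) ** 2 + eps)  # eps avoids the r=0 singularity
    return k / k.sum()

def deconvolve(image, kernel, reg=1e-4):
    """Wiener-style deconvolution: regularized inverse filter in Fourier space.

    Assumes kernel and image have the same shape (circular convolution model).
    """
    K = np.fft.rfft2(np.fft.ifftshift(kernel))       # kernel centered at the origin
    I = np.fft.rfft2(image)
    return np.fft.irfft2(I * np.conj(K) / (np.abs(K) ** 2 + reg), s=image.shape)

# Blur a point source with the kernel, then recover its location
true = np.zeros((64, 64)); true[20, 30] = 1.0
k = inverse_square_kernel(true.shape)
blurred = np.fft.irfft2(np.fft.rfft2(true) * np.fft.rfft2(np.fft.ifftshift(k)), s=true.shape)
restored = deconvolve(blurred, k)
print(np.unravel_index(restored.argmax(), restored.shape))  # (20, 30)
```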

  8. Study of a vibrating plate: comparison between experimental (ESPI) and analytical results

    Science.gov (United States)

    Romero, G.; Alvarez, L.; Alanís, E.; Nallim, L.; Grossi, R.

    2003-07-01

    Real-time electronic speckle pattern interferometry (ESPI) was used for tuning and visualization of natural frequencies of a trapezoidal plate. The plate was excited to resonant vibration by a sinusoidal acoustical source, which provided a continuous range of audio frequencies. Fringe patterns produced during the time-average recording of the vibrating plate—corresponding to several resonant frequencies—were registered. From these interferograms, calculations of vibrational amplitudes by means of zero-order Bessel functions were performed in some particular cases. The system was also studied analytically. The analytical approach developed is based on the Rayleigh-Ritz method and on the use of non-orthogonal right triangular co-ordinates. The deflection of the plate is approximated by a set of beam characteristic orthogonal polynomials generated by using the Gram-Schmidt procedure. A high degree of correlation between computational analysis and experimental results was observed.
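    The amplitude calculation via zero-order Bessel functions can be sketched as follows (a minimal illustration under common time-average ESPI assumptions: fringe brightness varies as J0² of the phase amplitude, so dark fringes fall at the zeros of J0; the He-Ne wavelength and normal illumination/observation geometry are assumptions):

```python
import math

def bessel_j0(x, n=400):
    """J0(x) = (1/pi) * integral_0^pi cos(x sin t) dt, midpoint rule."""
    h = math.pi / n
    return sum(math.cos(x * math.sin((k + 0.5) * h)) for k in range(n)) * h / math.pi

# First zeros of J0 (tabulated values)
J0_ZEROS = [2.404826, 5.520078, 8.653728]

def amplitude_at_dark_fringe(m, wavelength_nm=632.8):
    """Out-of-plane vibration amplitude (nm) at the m-th dark fringe."""
    return J0_ZEROS[m - 1] * wavelength_nm / (4 * math.pi)

print(round(amplitude_at_dark_fringe(1), 1))  # 121.1 nm for the first dark fringe
```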

  9. Sensitivity of volumetric modulated arc therapy patient specific QA results to multileaf collimator errors and correlation to dose volume histogram based metrics.

    LENUS (Irish Health Repository)

    Coleman, Linda

    2013-11-01

    This study investigates the impact of systematic multileaf collimator (MLC) positional errors on the gamma analysis results used for quality assurance (QA) of RapidArc treatments. In addition, it evaluates the relationship between these gamma analysis results and clinical dose volume histogram (DVH) metrics for RapidArc treatment plans.
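    The gamma analysis referred to here can be sketched in one dimension (a simplified global-gamma illustration with a brute-force search; clinical tools operate on interpolated 2D/3D grids):

```python
def gamma_1d(measured, reference, spacing_mm, dd=0.03, dta_mm=3.0):
    """Global gamma index per measured point against a 1D reference profile.

    dd: dose-difference criterion as a fraction of the reference maximum.
    dta_mm: distance-to-agreement criterion in millimetres.
    """
    ref_max = max(reference)
    gammas = []
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dr in enumerate(reference):
            ddist = (i - j) * spacing_mm / dta_mm
            ddose = (dm - dr) / (dd * ref_max)
            best = min(best, (ddist ** 2 + ddose ** 2) ** 0.5)
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

ref = [10, 50, 100, 50, 10]
print(passing_rate(gamma_1d(ref, ref, spacing_mm=1.0)))  # 100.0 for identical profiles
```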

  10. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values, and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results and of these, 350 (84%) were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results that were outside the expected range were identified, and it was suggested that laboratories check calculations and procedures for these results.
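    The control-limit screening described can be sketched as follows (the known value, sigma and reported results below are illustrative, not EML data):

```python
def within_limits(results, known, sigma, k=3.0):
    """Flag each reported result against known +/- k*sigma control limits."""
    lo, hi = known - k * sigma, known + k * sigma
    return [lo <= r <= hi for r in results]

# Hypothetical reported activities against a known value of 10.0 with sigma 0.5
reported = [9.8, 10.4, 12.1, 10.0, 7.2]
flags = within_limits(reported, known=10.0, sigma=0.5)
in_control = sum(flags)
print(f"{in_control}/{len(flags)} = {100 * in_control / len(flags):.0f}% within control limits")
# 3/5 = 60% within control limits
```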

  11. [Comparability study of analytical results between a group of clinical laboratories].

    Science.gov (United States)

    Alsius-Serra, A; Ballbé-Anglada, M; López-Yeste, M L; Buxeda-Figuerola, M; Guillén-Campuzano, E; Juan-Pereira, L; Colomé-Mallolas, C; Caballé-Martín, I

    2015-01-01

    To describe the comparability study of the measurement levels of biological tests processed in biochemistry in Catlab's 4 laboratories. Quality requirements, coefficients of variation and total error (CV% and TE%) were established. Controls were verified against the precision requirements (CV%) for each test and each individual laboratory analyser. Fresh serum samples were used for the comparability study. The differences were analysed using a Microsoft Access® application that produces modified Bland-Altman plots. The comparison of 32 biological parameters that are performed in more than one laboratory and/or analyser generated 306 Bland-Altman graphs. Of these, 101 (33.1%) fell within the accepted range of values based on biological variability, and 205 (66.9%) required revision. Data were re-analysed based on consensus minimum specifications for analytical quality (consensus of the Asociación Española de Farmacéuticos Analistas (AEFA), the Sociedad Española de Bioquímica Clínica y Patología Molecular (SEQC), the Asociación Española de Biopatología Médica (AEBM) and the Sociedad Española de Hematología y Hemoterapia (SEHH), October 2013). With the new specifications, 170 comparisons (56%) fitted the requirements and 136 (44%) required additional review. Taking into account the number of points that exceeded the requirement, random errors, the range of results in which discrepancies were detected, and the range of clinical decision, it was shown that the 44% that required review were acceptable, and the 32 tests were comparable across all laboratories and analysers. The analysis of the results showed that the consensus requirements of the 4 scientific societies were met. However, each laboratory should aim to meet stricter criteria for total error. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
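    The Bland-Altman statistics behind these plots can be sketched numerically (the paired results from two analysers below are made up):

```python
import statistics

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement between paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)             # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired results for one analyte on two analysers
lab1 = [5.2, 7.9, 6.4, 8.8, 5.9]
lab2 = [5.0, 8.1, 6.1, 8.5, 6.0]
bias, (low, high) = bland_altman(lab1, lab2)
print(round(bias, 3))  # 0.1
```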

  12. Deep nets vs expert designed features in medical physics: An IMRT QA case study.

    Science.gov (United States)

    Interian, Yannet; Rideout, Vincent; Kearney, Vasant P; Gennatas, Efstathios; Morin, Olivier; Cheung, Joey; Solberg, Timothy; Valdes, Gilmer

    2018-03-30

    The purpose of this study was to compare the performance of Deep Neural Networks against a technique designed by domain experts in the prediction of gamma passing rates for Intensity Modulated Radiation Therapy Quality Assurance (IMRT QA). A total of 498 IMRT plans across all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding window technique on Clinac iX or TrueBeam Linacs. Measurements were performed using a commercial 2D diode array, and passing rates for 3%/3 mm local dose/distance-to-agreement (DTA) were recorded. Separately, fluence maps calculated for each plan were used as inputs to a convolutional neural network (CNN). The CNNs were trained to predict IMRT QA gamma passing rates using TensorFlow and Keras. A set of model architectures, inspired by the convolutional blocks of the VGG-16 ImageNet model, were constructed and implemented. Synthetic data, created by rotating and translating the fluence maps during training, was used to boost the performance of the CNNs. Dropout, batch normalization, and data augmentation were utilized to help train the model. The performance of the CNNs was compared to a generalized Poisson regression model, previously developed for this application, which used 78 expert-designed features. Deep Neural Networks without domain knowledge achieved comparable performance to a baseline system designed by domain experts in the prediction of 3%/3 mm local gamma passing rates. An ensemble of neural nets resulted in a mean absolute error (MAE) of 0.70 ± 0.05, and the domain expert model resulted in an MAE of 0.74 ± 0.06. Convolutional neural networks (CNNs) with transfer learning can predict IMRT QA passing rates by automatically designing features from the fluence maps without human expert supervision. Predictions from CNNs are comparable to a system carefully designed by physicist experts. © 2018 American Association of Physicists in Medicine.
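    The rotation/translation augmentation described can be sketched with numpy (map size, shift range and the 90-degree rotation set here are illustrative assumptions, not the paper's exact scheme):

```python
import numpy as np

def augment(fluence, shifts=(-2, 0, 2)):
    """Generate rotated and translated copies of a 2D fluence map."""
    out = []
    for k in range(4):                      # 0, 90, 180, 270 degree rotations
        rot = np.rot90(fluence, k)
        for dy in shifts:
            for dx in shifts:
                out.append(np.roll(np.roll(rot, dy, axis=0), dx, axis=1))
    return out

fmap = np.random.rand(64, 64)
copies = augment(fmap)
print(len(copies))  # 36 augmented maps (4 rotations x 9 shifts)
```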

  13. Cryptography based on neural networks - analytical results

    International Nuclear Information System (INIS)

    Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang

    2002-01-01

    The mutual learning process between two parity feed-forward networks with discrete and continuous weights is studied analytically, and we find that the number of steps required to achieve full synchronization between the two networks in the case of discrete weights is finite. The synchronization process is shown to be non-self-averaging and the analytical solution is based on random auxiliary variables. The learning time of an attacker that is trying to imitate one of the networks is examined analytically and is found to be much longer than the synchronization time. Analytical results are found to be in agreement with simulations. (letter to the editor)
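    The mutual-learning setup analysed here can be sketched as a tree parity machine with a Hebbian update applied on agreement (a minimal simulation with assumed sizes K=3, N=4, L=3; this is an illustration of the dynamics, not the paper's analytical treatment):

```python
import numpy as np

K, N, L = 3, 4, 3  # hidden units, inputs per unit, discrete weight bound

def output(w, x):
    """Parity machine: hidden signs and their product as the overall output."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = 1
    return sigma, int(np.prod(sigma))

def update(w, x, sigma, tau, tau_other):
    """Hebbian update, applied only when both machines' outputs agree."""
    if tau != tau_other:
        return
    for k in range(K):
        if sigma[k] == tau:                      # only hidden units matching the output move
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

rng = np.random.default_rng(0)
wa = rng.integers(-L, L + 1, size=(K, N))
wb = wa.copy()                                   # start synchronized to show invariance
for _ in range(50):
    x = rng.choice([-1, 1], size=(K, N))
    sa, ta = output(wa, x)
    sb, tb = output(wb, x)
    update(wa, x, sa, ta, tb)
    update(wb, x, sb, tb, ta)
print((wa == wb).all())  # True: identical networks receive identical updates and stay identical
```

Full synchronization is an absorbing state of this dynamics, which is why the attacker's longer learning time matters for the cryptographic application.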

  14. Analytical results for Abelian projection

    International Nuclear Information System (INIS)

    Ogilvie, Michael C.

    1999-01-01

    Analytic methods for Abelian projection are developed, and a number of results related to string tension measurements are obtained. It is proven that even without gauge fixing, Abelian projection yields string tensions of the underlying non-Abelian theory. Strong arguments are given for similar results in the case where gauge fixing is employed. The subgroup used for projection need only contain the center of the gauge group, and need not be Abelian. While gauge fixing is shown to be in principle unnecessary for the success of Abelian projection, it is computationally advantageous for the same reasons that improved operators, e.g., the use of fat links, are advantageous in Wilson loop measurements

  15. Studies on a Q/A selector for the SECRAL electron cyclotron resonance ion source.

    Science.gov (United States)

    Yang, Y; Sun, L T; Feng, Y C; Fang, X; Lu, W; Zhang, W H; Cao, Y; Zhang, X Z; Zhao, H W

    2014-08-01

    Electron cyclotron resonance ion sources are widely used in heavy ion accelerators in the world because they are capable of producing high current beams of highly charged ions. However, the design of the Q/A selector system for these devices is challenging, because it must have a sufficient ion resolution while controlling the beam emittance growth. Moreover, this system has to be matched for a wide range of ion beam species with different intensities. In this paper, research on the Q/A selector system at the SECRAL (Superconducting Electron Cyclotron Resonance ion source with Advanced design in Lanzhou) platform both in experiment and simulation is presented. Based on this study, a new Q/A selector system has been designed for SECRAL II. The features of the new design including beam simulations are also presented.
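    The physics behind a Q/A selector can be sketched: for ions extracted through a voltage V, the magnetic rigidity is Bρ = p/q = sqrt(2·(m/q)·V) (non-relativistic), so a dipole set to one rigidity passes every species with the same m/q. A small illustration (the 25 kV extraction voltage and species are assumptions, not SECRAL parameters):

```python
import math

U = 1.6605390666e-27   # atomic mass unit, kg
E = 1.602176634e-19    # elementary charge, C

def rigidity(mass_amu, charge_state, extraction_kv):
    """Magnetic rigidity B*rho in T*m for an ion accelerated through extraction_kv."""
    m = mass_amu * U
    q = charge_state * E
    p = math.sqrt(2.0 * m * q * extraction_kv * 1e3)  # non-relativistic momentum
    return p / q

# Ions with equal m/q are selected together, e.g. 4He2+ and 2H1+
print(round(rigidity(4, 2, 25.0), 4) == round(rigidity(2, 1, 25.0), 4))  # True
```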

  16. Kawerau fluid chemistry : analytical results

    International Nuclear Information System (INIS)

    Mroczek, E.K.; Christenson, B.W.; Mountain, B.; Stewart, M.K.

    2001-01-01

    This report summarises the water and gas analytical data collected from Kawerau geothermal field 1998-2000 under the Sustainable Management of Geothermal and Mineral Resources (GMR) Project, Objective 2 'Understanding New Zealand Geothermal Systems'. The work is part of the continuing effort to characterise the chemical, thermal and isotopic signatures of the deep magmatic heat sources which drive our geothermal systems. At Kawerau there is clear indication that the present-day heat source relates to young volcanism within the field. However, being at the margins of the explored reservoir, little is presently known of the characteristics of that heat source. The Kawerau study follows on directly from the recently completed work characterising the geochemical signatures of the Ohaaki hydrothermal system. In the latter study the interpretation of the radiogenic noble gas isotope systematics was of fundamental importance in characterising the magmatic heat source. Unfortunately the collaboration with LLNL, which analysed the isotopes, could not be extended to include the Kawerau data. The gas samples have been archived and will be analysed once a new collaborator is found to continue the work. The purpose of the present compilation is to facilitate the final completion of the study by ensuring the data is accessible in one report. (author). 5 refs., 2 figs., 9 tabs

  17. From Field Notes to Data Portal - A Scalable Data QA/QC Framework for Tower Networks: Progress and Preliminary Results

    Science.gov (United States)

    Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.

    2017-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including a R-Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
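    The point-based automated checks mentioned can be sketched with two classic tests (range and step tests in the style of common QC frameworks; the thresholds and example series are illustrative):

```python
def range_test(values, lo, hi):
    """Flag points outside a plausible physical range."""
    return [not (lo <= v <= hi) for v in values]

def step_test(values, max_step):
    """Flag points that jump more than max_step from the previous point."""
    flags = [False]                            # first point has no predecessor
    for prev, cur in zip(values, values[1:]):
        flags.append(abs(cur - prev) > max_step)
    return flags

temps = [21.1, 21.3, 35.0, 21.2, -60.0]        # degrees C, with a spike and an outlier
print(range_test(temps, -40.0, 50.0))          # [False, False, False, False, True]
print(step_test(temps, 5.0))                   # [False, False, True, True, True]
```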

  18. SU-E-T-636: ProteusONE Machine QA Procedure and Stability Study: Half Year Clinical Operation

    Energy Technology Data Exchange (ETDEWEB)

    Freund, D; Ding, X; Wu, H; Zhang, J; Syh, J; Syh, J; Patel, B; Song, X [Willis-Knighton Medical Center, Shreveport, LA (United States)

    2015-06-15

    Purpose: The objective of this study is to evaluate the stability of ProteusONE, the first commercial PBS proton system, through daily and monthly QA over the first 6 months of clinical operation. Methods: The daily QA test includes IGRT positioning/repositioning, output in the middle of the SOBP, beam flatness, symmetry, in-plane and cross-plane dimensions, and an energy range check. Daily range shifter QA consists of output, symmetry and field size checks to verify its integrity. In the 30-minute daily QA test, all measurements are performed using the MatriXXPT (IBA Dosimetry). The data from these measurements were collected and compared over the first 6 months of clinical operation. In addition to the items checked in daily QA, the summary also includes the monthly QA gantry star shots and an absolute position check using a novel device, the XRV-100. Results: Average machine output at the center of the spread-out Bragg peak was 197.5±0.8 cGy and was within 1% of the baseline of 198.4 cGy. Beam flatness was within 1% cross-plane, with an average of 0.67±0.12%, and within 2% in-plane, with an average of 1.08±0.17%, compared to baseline measurements of 0.6 and 1.03, respectively. In all cases the radiation isocenter shift was less than or equal to 1 mm. Output for the range shifter was within 2% for each individual measurement and averaged 34.4±0.2 cGy compared to a baseline reading of 34.5 cGy. The average range shifter in-plane and cross-plane field size measurements were 19.8±0.5 cm and 20.5±0.4 cm compared with baseline values of 20.19 cm and 20.79 cm, respectively. Range shifter field symmetry averaged less than 1% for both in-plane and cross-plane measurements. Conclusion: All machine metrics over the past 6 months have proved to be stable. Although some averages are outside the baseline measurement, they are within the 1% tolerance and the deviation across all measurements is minimal.
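    The baseline-deviation check underlying these stability results can be sketched (the numbers are the output values quoted in the abstract; the 1% tolerance is the figure stated there):

```python
def percent_deviation(measured, baseline):
    return 100.0 * (measured - baseline) / baseline

def within_tolerance(measured, baseline, tol_percent=1.0):
    return abs(percent_deviation(measured, baseline)) <= tol_percent

# Average output vs baseline from the abstract: 197.5 cGy vs 198.4 cGy
print(round(percent_deviation(197.5, 198.4), 2))  # -0.45
print(within_tolerance(197.5, 198.4))             # True
```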

  19. SU-E-T-636: ProteusONE Machine QA Procedure and Stability Study: Half Year Clinical Operation

    International Nuclear Information System (INIS)

    Freund, D; Ding, X; Wu, H; Zhang, J; Syh, J; Syh, J; Patel, B; Song, X

    2015-01-01

    Purpose: The objective of this study is to evaluate the stability of ProteusONE, the first commercial PBS proton system, through daily and monthly QA over the first 6 months of clinical operation. Methods: The daily QA test includes IGRT positioning/repositioning, output in the middle of the SOBP, beam flatness, symmetry, in-plane and cross-plane dimensions, and an energy range check. Daily range shifter QA consists of output, symmetry and field size checks to verify its integrity. In the 30-minute daily QA test, all measurements are performed using the MatriXXPT (IBA Dosimetry). The data from these measurements were collected and compared over the first 6 months of clinical operation. In addition to the items checked in daily QA, the summary also includes the monthly QA gantry star shots and an absolute position check using a novel device, the XRV-100. Results: Average machine output at the center of the spread-out Bragg peak was 197.5±0.8 cGy and was within 1% of the baseline of 198.4 cGy. Beam flatness was within 1% cross-plane, with an average of 0.67±0.12%, and within 2% in-plane, with an average of 1.08±0.17%, compared to baseline measurements of 0.6 and 1.03, respectively. In all cases the radiation isocenter shift was less than or equal to 1 mm. Output for the range shifter was within 2% for each individual measurement and averaged 34.4±0.2 cGy compared to a baseline reading of 34.5 cGy. The average range shifter in-plane and cross-plane field size measurements were 19.8±0.5 cm and 20.5±0.4 cm compared with baseline values of 20.19 cm and 20.79 cm, respectively. Range shifter field symmetry averaged less than 1% for both in-plane and cross-plane measurements. Conclusion: All machine metrics over the past 6 months have proved to be stable. Although some averages are outside the baseline measurement, they are within the 1% tolerance and the deviation across all measurements is minimal

  20. Gas purity analytics, calibration studies, and background predictions towards the first results of XENON1T

    Energy Technology Data Exchange (ETDEWEB)

    Hasterok, Constanze

    2017-10-25

    The XENON1T experiment aims at the direct detection of the well-motivated dark matter candidate of weakly interacting massive particles (WIMPs) scattering off xenon nuclei. The first science run of 34.2 live days has already achieved the most stringent upper limit on spin-independent WIMP-nucleon cross-sections above masses of 10 GeV, with a minimum of 7.7 × 10⁻⁴⁷ cm² at a mass of 35 GeV. Crucial for this unprecedented sensitivity are a high xenon gas purity and a good understanding of the background. In this work, a procedure is described that was developed to measure the purity of the experiment's xenon inventory of more than three tons during its initial transfer to the detector gas system. The technique of gas chromatography was employed to analyze the noble gas for impurities, with a focus on oxygen and krypton contaminations. Furthermore, studies on the calibration of the experiment's dominant background, induced by natural gamma and beta radiation, were performed. For this, novel sources of radioactive isotopes that can be dissolved in the xenon were employed, namely ²²⁰Rn and tritium. The sources were analyzed in terms of a potential impact on the outcome of a dark matter search. As a result of the promising findings for ²²⁰Rn, the source was successfully deployed in the first science run of XENON1T. The first WIMP search of XENON1T is outlined in this thesis, in which a background component from interactions taking place in close proximity to the detector wall is identified, investigated and modeled. A background prediction was derived and incorporated into the background model of the WIMP search, which was found to be in good agreement with observation.

  1. NRC overview: Repository QA

    International Nuclear Information System (INIS)

    Kennedy, J.E.

    1988-01-01

    The US Department of Energy (DOE) is on the threshold of an extensive program for characterizing Yucca Mountain in Nevada to determine if it is a suitable site for the permanent disposal of high-level nuclear waste. Earlier this year, the DOE published the Consultation Draft Site Characterization Plan for the Nevada site, which describes in some detail the studies that need to be performed to determine if the site is acceptable. In the near future, the final site characterization plan (SCP) is expected to be issued, and large-scale site characterization activities will begin. The data and analyses that will result from the execution of that plan are expected to be the primary basis for the license application to the US Nuclear Regulatory Commission (NRC). Because of the importance of these data and analyses in the assessment of the suitability of the site and in the demonstration of that suitability in the NRC licensing process, the NRC requires in 10 CFR 60 that site characterization be performed under a quality assurance (QA) program. The QA program is designed to provide confidence that data are valid, retrievable, and reproducible. The documentation produced by the program will form an important part of the record on which the suitability of the site is judged in licensing. In addition, because the NRC staff can review only a selected portion of the data collected, the staff will need to rely on the system of controls in the DOE QA program.

  2. Development of Magnetometer Digital Circuit for KSR-3 Rocket and Analytical Study on Calibration Result

    Directory of Open Access Journals (Sweden)

    Eun-Seok Lee

    2002-12-01

    This paper describes the re-design and the calibration results of the MAG digital circuit onboard the KSR-3. We enhanced the sampling rate of the magnetometer data, reduced noise, and increased the reliability of the data. We confirmed that, with the digital calibration of the magnetometer, the AIM resolution decreased by less than 1 nT compared with the analog calibration. Therefore, we used a numerical program to correct this problem. As a result, we could calculate the correction and the error of the data. These corrections will be applied to the magnetometer data after the launch of KSR-3.

  3. The quality assurance process for the ARTSCAN head and neck study - A practical interactive approach for QA in 3DCRT and IMRT

    International Nuclear Information System (INIS)

    Johansson, Karl-Axel; Nilsson, Per; Zackrisson, Bjoern; Ohlson, Birgitta; Kjellen, Elisabeth; Mercke, Claes; Alvarez-Fonseca, Mauricio; Billstroem, Anette; Bjoerk-Eriksson, Thomas; Bjoer, Ove; Ekberg, Lars; Friesland, Signe; Karlsson, Magnus; Lagerlund, Magnus; Lundkvist, Lena; Loefroth, Per-Olov; Loefvander-Thapper, Kerstin; Nilsson, Alla; Nyman, Jan; Persson, Essie

    2008-01-01

    Aim: This paper describes the quality assurance (QA) work performed in the Swedish multicenter ARTSCAN (Accelerated RadioTherapy of Squamous cell CArcinomas in the head and Neck) trial to guarantee high quality in a multicenter study involving modern radiotherapy such as 3DCRT or IMRT. Materials and methods: The study was closed in June 2006 with 750 randomised patients. Radiation therapy-related data for every patient were sent by each participating centre to the QA office, where all trial data were reviewed, analysed and stored. In case of any deviation from the protocol, an interactive process was started between the QA office and the local responsible clinician and/or physicist to increase compliance with the protocol for future randomised patients. Meetings and workshops were held on a regular basis for discussions on various trial-related issues and for the QA office to report on updated results. Results and discussion: This review covers the 734 patients out of a total of 750 who had entered the study. Deviations early in the study were corrected, so that the overall compliance with the protocol was very high. There were only negligible variations in doses and dose distributions to target volumes for each specific site and stage. The quality of the treatments was high. Furthermore, an extensive database of treatment parameters was accumulated for future dose-volume vs. endpoint evaluations. Conclusions: This comprehensive QA programme increased the probability of drawing firm conclusions from our study and may serve as a concept for QA work in future radiotherapy trials where comparatively small effects are searched for in a heterogeneous tumour population.

  4. Dynamically triangulated surfaces - some analytical results

    International Nuclear Information System (INIS)

    Kostov, I.K.

    1987-01-01

    We give a brief review of the analytical results concerning the model of dynamically triangulated surfaces. We discuss the possible types of critical behaviour (depending on the dimension D of the embedding space) and the exact solutions obtained for D=0 and D=-2. The latter are important as a check of the Monte Carlo simulations applied to study the model in more physical dimensions. They also give some general insight into its critical properties.

  5. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques to manufacture high-aspect-ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear... characteristics in the micro-EDM process. A new approach with two novel factors anticipated to directly control the material removal mechanism from the tool electrode is proposed, using a discharge energy factor (DEf) and a dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate... (TWR) and the factors is poor. Thus, individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on...

  6. Compact tokamak reactors. Part 1 (analytic results)

    International Nuclear Information System (INIS)

    Wootton, A.J.; Wiley, J.C.; Edmonds, P.H.; Ross, D.W.

    1996-01-01

    We discuss the possible use of tokamaks for thermonuclear power plants, in particular tokamaks with low aspect ratio and copper toroidal field coils. Three approaches are presented. First, we review and summarize the existing literature. Second, using simple analytic estimates, the size of the smallest tokamak to produce an ignited plasma is derived. This steady-state energy balance analysis is then extended to determine the smallest tokamak power plant, by including the power required to drive the toroidal field and considering two extremes of plasma current drive efficiency. The analytic results will be augmented by a numerical calculation that permits arbitrary plasma current drive efficiency, the results of which will be presented in Part II. Third, a scaling from any given reference reactor design to a copper toroidal field coil device is discussed. Throughout the paper the importance of various restrictions is emphasized, in particular plasma current drive efficiency, plasma confinement, plasma safety factor, plasma elongation, plasma beta, neutron wall loading, blanket availability and recirculating electric power. We conclude that the latest published reactor studies, which show little advantage in using low aspect ratio unless remarkably high-efficiency plasma current drive and low safety factor are combined, can be reproduced with the analytic model.
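
    The "smallest ignited tokamak" argument rests on a steady-state power balance; a very rough proxy for it is the fusion triple product. The sketch below compares n·T·τ_E to an approximate D-T ignition threshold of ~3×10^21 m^-3·keV·s, a textbook order-of-magnitude figure and not the paper's detailed balance:

```python
def ignition_margin(density_m3, temperature_kev, tau_e_s,
                    threshold=3.0e21):
    """Ratio of the fusion triple product n*T*tau_E to a rough D-T
    ignition threshold (~3e21 m^-3 keV s). Values >= 1 suggest
    ignition in this crude zero-dimensional picture."""
    return density_m3 * temperature_kev * tau_e_s / threshold

# Hypothetical operating point:
margin = ignition_margin(1.0e20, 15.0, 2.0)   # -> 1.0, marginally ignited
```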

  7. Achievements and advantages of participation in the IAEA project RER 002/004/1999-2001 'QA/QC of Nuclear Analytical Techniques'

    International Nuclear Information System (INIS)

    Vata, Ion; Cincu, Em.

    2002-01-01

    The National Institute for Physics and Engineering 'Horia Hulubei' (IFIN-HH) decided in the late 1990s to start applying nuclear techniques in economy and social life on a routine scale; reaching this goal implied achieving first-rate analytical performances and complying with the QA/QC requirements, as detailed in the ISO 17025. The IAEA Project appeared in 1999 as the best opportunity and tool for our specialists to become familiar with the standard requirements and begin to implement them in their operations, thus further enabling them to apply for accreditation according to the international criteria. This report outlines the experience gained from the participation in the project. The accomplishments of the project are presented and the main difficulties are identified

  8. Analytic turnaround time study for integrated reporting of pathology results on electronic medical records using the Illuminate system

    Directory of Open Access Journals (Sweden)

    Tawfik O

    2016-09-01

    Timely pathology results are critical for appropriate diagnosis and management of patients. Yet workflows in laboratories remain ad hoc and involve accessing multiple systems with no direct linkage between patient history and prior or pending pathology records for the case being analyzed. A major hindrance in timely reporting of pathology results is the need to incorporate and interface with multiple electronic health records (EHRs). We evaluated the integration of the Illuminate PatientView software (Illuminate) into the pathologist's workflow. Illuminate is a search engine architecture with a repository of textual information from many hospital systems. Our goal was to develop a comprehensive, user-friendly patient summary display to integrate the currently fractionated subspecialty-specific systems. An analytical time study noting changes in turnaround time (TAT) before and after Illuminate implementation was recorded for reviewers, including pathologists, residents and fellows. Reviewers' TAT for 359 cases was recorded (200 cases before and 159 after implementation). The impact of implementing Illuminate on transcriptionists' workflow was also studied. The average TAT to retrieve EHRs prior to Illuminate was 5:32 min (range 1:35-10:50). That time was significantly reduced to 35 seconds (range 10 sec-1:10 min) using Illuminate. Reviewers were very pleased with the ease of accessing information and with the elimination of draft paper documents of the pathology reports, saving transcriptionists up to 65 min/day (range 25-65 min) previously spent matching requisitions with paperwork. Utilizing Illuminate improved workflow, decreased TAT and minimized cost. Patient care can be improved through a comprehensive patient management system that facilitates communication between isolated information systems.
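
    The headline TAT reduction can be reproduced from the quoted figures; the snippet below (illustrative only) parses the m:ss averages and computes the percentage saving:

```python
def to_seconds(ts):
    """Parse an m:ss string such as '5:32' into seconds."""
    minutes, seconds = ts.split(":")
    return int(minutes) * 60 + int(seconds)

before = to_seconds("5:32")   # average TAT before Illuminate, in seconds
after = 35                    # average TAT after, in seconds
reduction_pct = 100.0 * (before - after) / before   # roughly 89% faster
```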

  9. How users adopt healthcare information: An empirical study of an online Q&A community.

    Science.gov (United States)

    Jin, Jiahua; Yan, Xiangbin; Li, Yijun; Li, Yumei

    2016-02-01

    The emergence of social media technology has led to the creation of many online healthcare communities, where patients can easily share and look for healthcare-related information from peers who have experienced a similar problem. However, with increased user-generated content, there is a need to constantly analyse which content should be trusted as one sifts through enormous amounts of healthcare information. This study aims to explore patients' healthcare information seeking behavior in online communities. Based on dual-process theory and the knowledge adoption model, we propose a healthcare information adoption model for online communities. This model highlights that information quality, emotional support, and source credibility are antecedent variables of the adoption likelihood of healthcare information, and that competition among repliers and involvement of recipients moderate the relationship between the antecedent variables and adoption likelihood. Empirical data were collected from the healthcare module of China's biggest Q&A community, Baidu Knows. Text mining techniques were adopted to calculate the information quality and emotional support contained in each reply text. A binary logistic regression model and a hierarchical regression approach were employed to test the proposed conceptual model. Information quality, emotional support, and source credibility have a significant and positive impact on healthcare information adoption likelihood, and among these factors, information quality has the biggest impact on a patient's adoption decision. In addition, competition among repliers and involvement of recipients were tested as moderating effects between these antecedent factors and the adoption likelihood. Results indicate that competition among repliers positively moderates the relationship between source credibility and adoption likelihood, and recipients' involvement positively moderates the relationship between information quality, source credibility, and adoption likelihood.
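
    The moderated logistic model described above can be illustrated schematically. In the sketch below all coefficient values are invented for illustration; the point is only how an interaction (moderation) term enters the adoption-likelihood logit:

```python
import math

def adoption_probability(quality, credibility, competition,
                         b0=-2.0, b_quality=1.2, b_cred=0.8,
                         b_comp=0.1, b_cred_x_comp=0.5):
    """Hypothetical moderated logistic model: the interaction term
    lets competition among repliers strengthen the effect of source
    credibility on adoption likelihood."""
    logit = (b0 + b_quality * quality + b_cred * credibility
             + b_comp * competition
             + b_cred_x_comp * credibility * competition)
    return 1.0 / (1.0 + math.exp(-logit))

# Credibility matters more when competition is high:
effect_low = adoption_probability(0, 1, 0) - adoption_probability(0, 0, 0)
effect_high = adoption_probability(0, 1, 5) - adoption_probability(0, 0, 5)
```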

  10. QA programme documentation

    International Nuclear Information System (INIS)

    Scheibelt, L.

    1980-01-01

    The present paper deals with the following topics: The need for a documented Q.A. program; Establishing a Q.A. program; Q.A. activities; Fundamental policies; Q.A. policies; Quality objectives; Q.A. manual. (orig./RW)

  11. Graded approach for establishment of QA requirements for Type B packaging of radioactive material

    International Nuclear Information System (INIS)

    Fabian, R.R.; Woodruff, K.C.

    1988-01-01

    A study that was conducted by the Nuclear Regulatory Commission for the U.S. Congress to assess the effectiveness of quality assurance (QA) activities has demonstrated a need to modify and improve the application of QA requirements for the nuclear industry. As a result, the packaging community, along with the nuclear industry as a whole, has taken action to increase the efficacy of the QA function. The results of the study indicate that a graded approach for establishing QA requirements is the preferred method. The essence of the graded approach is the establishment of applicable QA requirements to an extent consistent with the importance to safety of an item, component, system, or activity. This paper describes the process that is used to develop the graded approach for QA requirements pertaining to Type B packaging

  12. Reinforcing of QA/QC programs in radiotherapy departments in Croatia: Results of treatment planning system verification

    Energy Technology Data Exchange (ETDEWEB)

    Jurković, Slaven; Švabić, Manda; Diklić, Ana; Smilović Radojčić, Đeni; Dundara, Dea [Clinic for Radiotherapy and Oncology, Physics Division, University Hospital Rijeka, Rijeka (Croatia); Kasabašić, Mladen; Ivković, Ana [Department for Radiotherapy and Oncology, University Hospital Osijek, Osijek (Croatia); Faj, Dario, E-mail: dariofaj@mefos.hr [Department of Physics, School of Medicine, University of Osijek, Osijek (Croatia)

    2013-04-01

    Implementation of advanced techniques in clinical practice can greatly improve the outcome of radiation therapy, but it also makes the process much more complex with a lot of room for errors. An important part of the quality assurance program is verification of treatment planning system (TPS). Dosimetric verifications in anthropomorphic phantom were performed in 4 centers where new systems were installed. A total of 14 tests for 2 photon energies and multigrid superposition algorithms were conducted using the CMS XiO TPS. Evaluation criteria as specified in the International Atomic Energy Agency Technical Reports Series (IAEA TRS) 430 were employed. Results of measurements are grouped according to the placement of the measuring point and the beam energy. The majority of differences between calculated and measured doses in the water-equivalent part of the phantom were in tolerance. Significantly more out-of-tolerance values were observed in “nonwater-equivalent” parts of the phantom, especially for higher-energy photon beams. This survey was done as a part of continuous effort to build up awareness of quality assurance/quality control (QA/QC) importance in the Croatian radiotherapy community. Understanding the limitations of different parts of the various systems used in radiation therapy can systematically improve quality as well.
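
    The pass/fail evaluation of such TPS verification points reduces to a local percent deviation compared against a region-dependent tolerance. The sketch below is illustrative; the tolerance numbers are assumptions standing in for the geometry-dependent IAEA TRS 430 values:

```python
def dose_deviation_pct(calculated, measured):
    """Local percent deviation of TPS-calculated dose from measurement."""
    return 100.0 * (calculated - measured) / measured

# Assumed, simplified tolerances; TRS 430 tabulates these per region
# and beam geometry:
TOLERANCES = {"water_equivalent": 3.0, "inhomogeneity": 4.0}

def point_in_tolerance(calculated, measured, region):
    """True if the calculated dose passes the region's tolerance."""
    return abs(dose_deviation_pct(calculated, measured)) <= TOLERANCES[region]
```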

  13. $W^+ W^-$ + Jet: Compact Analytic Results

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, John [Fermilab; Miller, David [Glasgow U.; Robens, Tania [Dresden, Tech. U.

    2016-01-14

    In the second run of the LHC, which started in April 2015, an accurate understanding of Standard Model processes is more crucial than ever. Processes including electroweak gauge bosons serve as standard candles for SM measurements, and equally constitute important backgrounds for BSM searches. Here we present the NLO QCD virtual contributions to W+W- + jet in an analytic format obtained through unitarity methods, and show results for the full process using an implementation in the Monte Carlo event generator MCFM. Phenomenologically, we investigate total as well as differential cross sections for the LHC at 14 TeV center-of-mass energy, as well as for a future 100 TeV proton-proton machine. In the format presented here, the one-loop virtual contributions also serve as important ingredients in the calculation of W+W- pair production at NNLO.

  14. Development of a QA Phantom for online image registration and resultant couch shifts

    International Nuclear Information System (INIS)

    Arumugam, S.; Jameson, M.G.; Holloway, L.C.

    2010-01-01

    Full text: Purpose: Recently our centre purchased an Elekta Synergy accelerator with kV-CBCT and a hexapod couch attachment. This system allows six degrees of freedom for couch top shifts, based on registration of online imaging. We designed and built a phantom in our centre to test the accuracy and precision of this system. The goal of this project was to investigate the accuracy and practical utilisation of this phantom. Method: The phantom was constructed from perspex sheets and high-density dental putty (Fig. 1). Five high-density regions (three small regions to simulate prostate seeds and two larger regions to simulate bony anatomy) were incorporated to test the manual and automatic registrations within the software. The phantom was utilised to test the accuracy and precision of repositioning with the hexapod couch and imaging system. To achieve this, the phantom was placed on the couch at known orientations and the shifts were quantified using the registration of verification and reference image data sets. True shifts and those predicted by the software were compared. Results: The geometrical accuracy of the phantom was verified with measurements of the CT scan to be within 1 mm of the intended geometry. The image registration and resultant couch shifts were found to be accurate within 1 mm and 0.5 degrees. The phantom was found to be practical and easy to use. Conclusion: The presented phantom provides a less expensive and effective alternative to commercially available systems for verifying image registration and the corresponding six-degrees-of-freedom couch shifts. (author)
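
    Comparing true and software-predicted shifts against the quoted 1 mm / 0.5 degree criteria is a simple element-wise check; the helper below is a hypothetical sketch of that comparison for a six-degrees-of-freedom shift:

```python
def shifts_agree(true_shift, predicted_shift,
                 trans_tol_mm=1.0, rot_tol_deg=0.5):
    """Compare two 6-DoF shifts given as (x, y, z, pitch, roll, yaw),
    translations in mm and rotations in degrees."""
    trans_ok = all(abs(t - p) <= trans_tol_mm
                   for t, p in zip(true_shift[:3], predicted_shift[:3]))
    rot_ok = all(abs(t - p) <= rot_tol_deg
                 for t, p in zip(true_shift[3:], predicted_shift[3:]))
    return trans_ok and rot_ok
```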

  15. LHC Superconducting Dipole Production Follow-up Results of Audit on QA Aspects in Industry

    CERN Document Server

    Modena, M; Cornelis, M; Fessia, P; Liénard, P; Miles, J; de Rijk, G; Savary, F; Sgobba, Stefano; Tommasini, D; Vlogaert, J; Völlinger, C; Wildner, E

    2006-01-01

    The manufacturing of the 1232 superconducting main dipoles for the LHC is under way at three European contractors: Alstom-Jeumont (consortium), Ansaldo Superconduttori Genova and Babcock Noell Nuclear. The manufacturing is proceeding in a very satisfactory way, and in March 2005 the production midpoint was reached. To catch any remaining "weak points" of the production process, and in order to check the quality assurance and control in place for the series production, an audit was launched by CERN during summer-fall 2004. Aspects such as completion of production and quality assurance documentation, structure of QC teams, traceability, calibration and maintenance of tooling, and incoming component inspections were checked during a total of seven visits at the five different production sites. The results of the audit, in terms of analysis of "systematic" and "random" problems encountered as well as corrective actions requested, are presented.

  16. Micro-homogeneity of candidate reference materials: Results from an intercomparison study for the Analytical Quality Control Services (AQCS) of the IAEA

    International Nuclear Information System (INIS)

    Rossbach, M.; Kniewald, G.

    2002-01-01

    The IAEA Analytical Quality Control Services (AQCS) has made available two single cell algae materials IAEA-392 and IAEA-393 as well as an urban dust IAEA-396 to study their use for analytical sample sizes in the milligram range and below. Micro-analytical techniques such as PIXE and μ-PIXE, solid sampling AAS, scanning electron microprobe X-ray analysis and INAA were applied to the determination of trace elements on the basis of μg to mg amounts of the selected materials. The comparability of the mean values as well as the reproducibility of successive measurements is being evaluated in order to compare relative homogeneity factors for many elements in the investigated materials. From the reported results it seems that the algae materials IAEA-392 and IAEA-393 are extremely homogeneous biological materials for a number of elements with an extraordinary sharp particle size distribution below 10 μm. A similar situation seems to hold for the urban dust material IAEA-396 which had been air-jet milled to a particle size distribution around 4 μm. The introduction of these materials as CRMs with very small amounts needed to determine the certified concentrations will help to meet the needs of micro-analytical techniques for natural matrix reference materials. (author)
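
    One common way to summarise such micro-homogeneity data is a relative homogeneity factor that scales the sampling scatter by the square root of the sample mass. The sketch below follows the frequently used Kurfuerst/Pauwels-style formulation; whether this matches the paper's exact definition is an assumption:

```python
import math
import statistics

def relative_homogeneity_factor(values, sample_mass_mg):
    """H_E = s_rel(%) * sqrt(m), with s_rel the relative standard
    deviation of replicate results and m the sample mass in mg.
    Smaller values indicate better micro-homogeneity."""
    s_rel = 100.0 * statistics.stdev(values) / statistics.mean(values)
    return s_rel * math.sqrt(sample_mass_mg)
```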

  17. The influence of bilirubin, haemolysis and turbidity on 20 analytical tests performed on automatic analysers. Results of an interlaboratory study.

    Science.gov (United States)

    Grafmeyer, D; Bondon, M; Manchon, M; Levillain, P

    1995-01-01

    The director of a laboratory has to be sure to give out reliable results for routine tests on automatic analysers regardless of the clinical context. However, he may find hyperbilirubinaemia in some circumstances, parenteral nutrition causing turbidity in others, and haemolysis occurring if sampling is difficult. For this reason, the Commission for Instrumentation of the Société Française de Biologie Clinique (SFBC) (president Alain Feuillu) decided to look into "visible" interferences--bilirubin, haemolysis and turbidity--and their effect on 20 major tests: 13 substrates/chemistries: albumin, calcium, cholesterol, creatinine, glucose, iron, magnesium, phosphorus, total bilirubin, total proteins, triacylglycerols, uric acid, urea, and 7 enzymatic activities: alkaline phosphatase, alanine aminotransferase, alpha-amylase, aspartate aminotransferase, creatine kinase, gamma-glutamyl transferase and lactate dehydrogenase measured on 15 automatic analysers representative of those found on the French market (Astra 8, AU 510, AU 5010, AU 5000, Chem 1, CX 7, Dax 72, Dimension, Ektachem, Hitachi 717, Hitachi 737, Hitachi 747, Monarch, Open 30, Paramax, Wako 30 R) and to see how much they affect the accuracy of results under routine conditions in the laboratory. The study was carried out following the SFBC protocol for the validation of techniques using spiked plasma pools with bilirubin, ditauro-bilirubin, haemoglobin (from haemolysate) and Intralipid (turbidity). Overall, the following results were obtained: haemolysis affects tests the most often (34.5% of cases); total bilirubin interferes in 21.7% of cases; direct bilirubin and turbidity seem to interfere less at around 17%. The different tests are not affected to the same extent; enzyme activity is hardly affected at all; on the other hand certain major tests are extremely sensitive, increasingly so as we go through the following: creatinine (interference of bilirubin), triacylglycerols (interference of bilirubin and

  18. Interacting Brownian Swarms: Some Analytical Results

    Directory of Open Access Journals (Sweden)

    Guillaume Sartoretti

    2016-01-01

    We consider the dynamics of swarms of scalar Brownian agents subject to local imitation mechanisms implemented using mutual rank-based interactions. For appropriate values of the underlying control parameters, the swarm propagates tightly and the distances separating successive agents are iid exponential random variables. Implicitly, the implementation of rank-based mutual interactions requires that agents have infinite interaction ranges. Using the probabilistic size of the swarm's support, we analytically estimate the critical interaction range below which flocked swarms cannot survive. In the second part of the paper, we consider the interactions between two flocked swarms of Brownian agents with finite interaction ranges. Both swarms travel with different barycentric velocities, and agents from both swarms indifferently interact with each other. For appropriate initial configurations, both swarms eventually collide (i.e., all agents interact). Depending on the values of the control parameters, one of the following patterns emerges after collision: (i) both swarms remain essentially flocked, or (ii) the swarms become ultimately quasi-free and recover their nominal barycentric speeds. We derive a set of analytical flocking conditions based on the generalized rank-based Brownian motion. An extensive set of numerical simulations corroborates our analytical findings.

  19. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing

  20. Summary report of the TC regional project on 'QA/QC of nuclear analytical techniques' RER-2-004 (1999-2001)

    International Nuclear Information System (INIS)

    Akgun, A. Fadil

    2002-01-01

    This report provides a summary of the Cekmece Nuclear Research and Training Centre participation in the Project. The Project helped in setting up quality assurance system in the Centre and resulted in a progress in analytical proficiency as shown in the proficiency test results. The main accomplishments are listed along with the tasks to be done

  1. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
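
    At the core of the Eurachem/CITAC approach is the root-sum-of-squares combination of independent uncertainty components. A minimal sketch (the component values in the example budget are hypothetical):

```python
import math

def combined_standard_uncertainty(relative_components):
    """Combine independent relative standard uncertainties by
    root-sum-of-squares, as in the Eurachem/CITAC Guide."""
    return math.sqrt(sum(u * u for u in relative_components))

# Hypothetical budget: calibration, recovery, volume
u_c = combined_standard_uncertainty([0.02, 0.03, 0.01])
U = 2 * u_c   # expanded uncertainty with coverage factor k = 2
```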

  2. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of analytical performance of test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices. 
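
    When biological variation is used to set analytical performance specifications, the widely used "desirable" formulas limit imprecision to half the within-subject CV and bias to a quarter of the combined biological CV. The numeric inputs below are approximate literature values for plasma glucose, used purely for illustration:

```python
import math

def desirable_specs(cv_within_pct, cv_between_pct):
    """Desirable analytical performance specifications derived from
    biological variation: CV_A <= 0.5*CV_I and
    bias <= 0.25*sqrt(CV_I^2 + CV_G^2)."""
    max_cv_analytical = 0.5 * cv_within_pct
    max_bias = 0.25 * math.sqrt(cv_within_pct**2 + cv_between_pct**2)
    return max_cv_analytical, max_bias

# Approximate biological variation of plasma glucose (assumed values):
cv_a, bias = desirable_specs(5.6, 7.5)   # -> about 2.8% CV, 2.3% bias
```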

  3. Patient QA systems for rotational radiation therapy

    DEFF Research Database (Denmark)

    Fredh, Anna; Scherman, J.B.; Munck af Rosenschöld, Per Martin

    2013-01-01

    The purpose of the present study was to investigate the ability of commercial patient quality assurance (QA) systems to detect linear accelerator-related errors.

  4. Impact and payback of a QA/QC program for steam-water chemistry

    International Nuclear Information System (INIS)

    Lerman, S.I.; Wilson, D.

    1992-01-01

    QA/QC programs for analytical laboratories and in-line instrumentation are essential if we are to have any faith in the data they produce. When the analytes are at trace levels, as they frequently are in a steam-water cycle, the importance of QA/QC increases by an order of magnitude. The cost and resources required for such a program, although worthwhile, are frequently underestimated. QA/QC is much more than running a standard several times a week. This paper discusses some of the essential elements of such a program, compares them to the cost, and points out the impact of not having such a program. RP-2712-3 showed how essential QA/QC is to understanding the limitations of instruments doing trace analysis of water. What it did not do, nor was it intended to, is discuss how good reliability can be in your own plant. QA programs that include training of personnel, written procedures, and comprehensive maintenance and inventory programs ensure optimum performance of chemical monitors. QC samples run regularly allow plant personnel to respond to poor performance in a timely manner, appropriate to plant demands. Proper data management establishes the precision information necessary to determine how good our measurements are. Generally, the plant has the advantage of a central laboratory to perform corroborative analysis, and a comprehensive QA/QC program will integrate the plant monitoring operations with the central lab. Where trace analysis is concerned, attention to detail becomes paramount. Instrument performance may be below expected levels, and instruments are probably being run at the bottom end of their optimum range. Without QA/QC the plant manager can have no confidence in analytical results. Poor steam-water chemistry can go unnoticed, causing system deterioration. We cannot afford to wait for another RP-2712-3 to tell us how good our data are.
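
    The "QC samples run regularly" element usually boils down to simple control-chart rules. A minimal Shewhart-style sketch follows; the 2 SD / 3 SD thresholds are the conventional warning and rejection limits, and any given plant's procedure may differ:

```python
def control_status(value, mean, sd):
    """Classify a QC result against Shewhart-style control limits:
    outside +/-3 SD -> reject, outside +/-2 SD -> warn."""
    z = abs(value - mean) / sd
    if z > 3:
        return "reject"
    if z > 2:
        return "warn"
    return "in control"
```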

  5. Analytical Study of Oxalates Coprecipitation

    Directory of Open Access Journals (Sweden)

    Liana MARTA

    2003-03-01

    The paper establishes the oxalate coprecipitation conditions for the synthesis of superconducting systems. A systematic analytical study of the oxalate precipitation conditions has been performed for obtaining superconducting materials in the Bi-Sr-Ca-Cu-O system. For this purpose, the formulae for the solubility of the precipitates as a function of pH and oxalate excess were established. The possible formation of hydroxo-complexes and soluble oxalato-complexes was taken into account. A BASIC program was used for tracing the precipitation curves. The curves of solubility versus pH for different oxalate excesses have been plotted for the four oxalates, using a logarithmic scale. The optimal conditions for quantitative oxalate coprecipitation have been deduced from the diagrams. The theoretical curves were confirmed by experimental results. From the precursors obtained by this method, the BSCCO superconducting phases were obtained by an appropriate thermal treatment. The formation of the superconducting phases was identified by X-ray diffraction analysis.
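The solubility formulae described in the abstract can be sketched numerically. The following Python fragment is a minimal illustration only: it assumes a simple divalent oxalate MC2O4 and literature dissociation constants for oxalic acid, not the actual Bi-Sr-Ca-Cu values used in the paper.

```python
import numpy as np

# Literature acid dissociation constants of oxalic acid (assumed values)
KA1, KA2 = 5.9e-2, 6.4e-5
KSP = 2.3e-9   # illustrative solubility product for a divalent oxalate MC2O4

def oxalate_fraction(pH):
    """Fraction of total oxalate present as free C2O4^2- at a given pH."""
    h = 10.0 ** (-pH)
    return KA1 * KA2 / (h * h + KA1 * h + KA1 * KA2)

def solubility(pH, excess=0.0):
    """Molar solubility of MC2O4; `excess` is the added oxalate concentration (M)."""
    a2 = oxalate_fraction(pH)
    if excess > 0.0:
        return KSP / (excess * a2)      # common-ion suppression by the excess
    return (KSP / a2) ** 0.5            # stoichiometric dissolution

for pH in (2.0, 4.0, 6.0):
    print(f"pH {pH}: S = {solubility(pH):.2e} M, "
          f"S(0.1 M excess) = {solubility(pH, 0.1):.2e} M")
```

Solubility falls as pH rises (a larger fraction of the oxalate is in the free C2O4^2- form) and as the oxalate excess grows, which is the qualitative behavior the precipitation diagrams in the paper exploit.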

  6. Milestone M4900: Simulant Mixing Analytical Results

    Energy Technology Data Exchange (ETDEWEB)

    Kaplan, D.I.

    2001-07-26

    This report addresses Milestone M4900, ''Simulant Mixing Sample Analysis Results,'' and contains the data generated during the ''Mixing of Process Heels, Process Solutions, and Recycle Streams: Small-Scale Simulant'' task. The Task Technical and Quality Assurance Plan for this task is BNF-003-98-0079A. A report with a narrative description and discussion of the data will be issued separately.

  7. Analytic study of resistive instabilities

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Magnus

    2003-05-01

    In a fusion plasma there is always a small amount of resistivity that may cause instabilities. Despite their rather slow growth rates, these can be of major importance for fusion plasma confinement. In this work a MAPLE code was rewritten and simplified to make it possible to analytically solve the linearized MHD equations with resistivity in an RFP configuration. By using the MHD equations, expanding the unknown perturbed quantities u_1r(r) and B_1r(r) as Taylor series and solving for each coefficient, we could obtain eigenvalues, dispersion relations and a relation between the growth rate and the resistivity. The new code was first used to solve two cases with no resistivity and simple unstable equilibria, which gave the correct expected results. The difference from running the original code on these two cases was the greater speed of the calculations and the smaller memory requirement. Then, using an ideal-MHD-stable equilibrium in a plasma with no resistivity, the code gave us solutions which unfortunately were not of the expected kind, but the calculations were still very fast. The resistivity was finally added to the code with the ideal-MHD-stable equilibrium. The program again gave incorrect results. We could, however, see from a relation between the growth rate and the resistivity that the solution may be approximately correct in this domain. Although we did not get all the correct results, we have to consider the fact that we got results that were not possible before. Before this work was carried out we could not get any results at all in the resistive case because of the very long, memory-demanding expressions. In future work and studies it is not only possible to get the desired eigenvalues γ as a function of η but also possible to get expressions for eigenfunctions, dispersion relations and other significant relations with a number of variable parameters. We could also use the method for any geometry and possibly for

  8. Analytic study of resistive instabilities

    International Nuclear Information System (INIS)

    Svensson, Magnus

    2003-05-01

    In a fusion plasma there is always a small amount of resistivity that may cause instabilities. Despite their rather slow growth rates, these can be of major importance for fusion plasma confinement. In this work a MAPLE code was rewritten and simplified to make it possible to analytically solve the linearized MHD equations with resistivity in an RFP configuration. By using the MHD equations, expanding the unknown perturbed quantities u_1r(r) and B_1r(r) as Taylor series and solving for each coefficient, we could obtain eigenvalues, dispersion relations and a relation between the growth rate and the resistivity. The new code was first used to solve two cases with no resistivity and simple unstable equilibria, which gave the correct expected results. The difference from running the original code on these two cases was the greater speed of the calculations and the smaller memory requirement. Then, using an ideal-MHD-stable equilibrium in a plasma with no resistivity, the code gave us solutions which unfortunately were not of the expected kind, but the calculations were still very fast. The resistivity was finally added to the code with the ideal-MHD-stable equilibrium. The program again gave incorrect results. We could, however, see from a relation between the growth rate and the resistivity that the solution may be approximately correct in this domain. Although we did not get all the correct results, we have to consider the fact that we got results that were not possible before. Before this work was carried out we could not get any results at all in the resistive case because of the very long, memory-demanding expressions. In future work and studies it is not only possible to get the desired eigenvalues γ as a function of η but also possible to get expressions for eigenfunctions, dispersion relations and other significant relations with a number of variable parameters. We could also use the method for any geometry and possibly for non

  9. SU-F-P-54: Guidelines to Check Image Registration QA of a Clinical Deformation Registration Software: A Single Institution Preliminary Study

    Energy Technology Data Exchange (ETDEWEB)

    Gill, G; Souri, S; Rea, A; Chen, Y; Antone, J; Qian, X; Riegel, A; Taylor, P; Marrero, M; Diaz, F; Cao, Y; Jamshidi, A; Klein, E [Northwell Health, Lake Success, NY (United States); Barley, S; Sorell, V; Karangelis, G [Oncology Systems Limited, Longbow Close, Shrewsbury SY1 3GZ (United Kingdom); Button, T [Stony Brook University Hospital, Stony Brook, NY (United States)

    2016-06-15

    Purpose: The objective of this study is to verify and analyze the accuracy of clinical deformable image registration (DIR) software. Methods: To test the clinical DIR software qualitatively and quantitatively, we focused on lung radiotherapy and analyzed a single (lung) patient CT scan. Artificial anatomical changes were applied to account for daily variations during the course of treatment, including the planning target volume (PTV) and organs at risk (OAR). The primary CT (pCT) and the structure set (pST) were deformed with a commercial tool (ImSimQA, Oncology Systems Limited) and, after artificial deformation (dCT and dST), sent to another commercial tool (VelocityAI, Varian Medical Systems). In Velocity, the deformed CT and structures (dCT and dST) were inversely deformed back to the original primary CT (dbpCT and dbpST). We compared the dbpST and pST structure sets using similarity metrics. Furthermore, a binary deformation field vector (BDF) was created and sent to the ImSimQA software for comparison with known “ground truth” deformation vector fields (DVF). Results: An image similarity comparison was made using the “ground truth” DVF and “deformed output” BDF, with normalized cross correlation (CC) and mutual information (MI) as outputs in the ImSimQA software. Results for the lung case were MI=0.66 and CC=0.99. The artificial structure deformation in both pST and dbpST was analyzed using the DICE coefficient, mean distance to conformity (MDC) and deformation field error volume histogram (DFEVH), comparing them before and after inverse deformation. We noticed an inadequate structure match for CTV, ITV and PTV due to the close proximity of the heart and the overall effect of lung expansion. Conclusion: We have seen similarity between pCT and dbpCT but not as much between pST and dbpST, because of inadequate structure deformation in the clinical DIR system. This system-based quality assurance test will prepare us for adopting the guidelines of the upcoming AAPM Task Group 132
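Of the similarity metrics mentioned, the DICE coefficient is straightforward to compute directly from binary structure masks. A minimal sketch (the toy masks are hypothetical, not the study's structures):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """DICE similarity 2|A∩B| / (|A| + |B|) for boolean structure masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy example: two partially overlapping "structures" on a small grid
a = np.zeros((10, 10), dtype=bool); a[2:7, 2:7] = True   # 25 voxels
b = np.zeros((10, 10), dtype=bool); b[4:9, 4:9] = True   # 25 voxels
print(dice_coefficient(a, b))   # 2*9 / (25+25) = 0.36
```

A value of 1 means the original and inversely deformed contours coincide voxel-for-voxel; the study's CTV/ITV/PTV mismatches would show up as values well below 1.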

  10. SU-F-P-54: Guidelines to Check Image Registration QA of a Clinical Deformation Registration Software: A Single Institution Preliminary Study

    International Nuclear Information System (INIS)

    Gill, G; Souri, S; Rea, A; Chen, Y; Antone, J; Qian, X; Riegel, A; Taylor, P; Marrero, M; Diaz, F; Cao, Y; Jamshidi, A; Klein, E; Barley, S; Sorell, V; Karangelis, G; Button, T

    2016-01-01

    Purpose: The objective of this study is to verify and analyze the accuracy of clinical deformable image registration (DIR) software. Methods: To test the clinical DIR software qualitatively and quantitatively, we focused on lung radiotherapy and analyzed a single (lung) patient CT scan. Artificial anatomical changes were applied to account for daily variations during the course of treatment, including the planning target volume (PTV) and organs at risk (OAR). The primary CT (pCT) and the structure set (pST) were deformed with a commercial tool (ImSimQA, Oncology Systems Limited) and, after artificial deformation (dCT and dST), sent to another commercial tool (VelocityAI, Varian Medical Systems). In Velocity, the deformed CT and structures (dCT and dST) were inversely deformed back to the original primary CT (dbpCT and dbpST). We compared the dbpST and pST structure sets using similarity metrics. Furthermore, a binary deformation field vector (BDF) was created and sent to the ImSimQA software for comparison with known “ground truth” deformation vector fields (DVF). Results: An image similarity comparison was made using the “ground truth” DVF and “deformed output” BDF, with normalized cross correlation (CC) and mutual information (MI) as outputs in the ImSimQA software. Results for the lung case were MI=0.66 and CC=0.99. The artificial structure deformation in both pST and dbpST was analyzed using the DICE coefficient, mean distance to conformity (MDC) and deformation field error volume histogram (DFEVH), comparing them before and after inverse deformation. We noticed an inadequate structure match for CTV, ITV and PTV due to the close proximity of the heart and the overall effect of lung expansion. Conclusion: We have seen similarity between pCT and dbpCT but not as much between pST and dbpST, because of inadequate structure deformation in the clinical DIR system. This system-based quality assurance test will prepare us for adopting the guidelines of the upcoming AAPM Task Group 132

  11. Quality control and quality assurance of nuclear analytical techniques. Thematic planning of QC/QA in technical co-operations. Report of the external participants

    International Nuclear Information System (INIS)

    Innes, R.W.; Bode, P.; Brickenkamp, C.S.; Casa, A.; Abdul Khalik Haji Wood

    1998-02-01

    In the areas of trade, health, safety, and environmental protection, users of a laboratory's analytical results, for example governments and private institutions, increasingly require demonstrable proof of the reliability and credibility of the laboratory's analytical results against internationally accepted standards. This is so that the products and the decisions based on these laboratory results will be accepted in the respective national and international communities. These requirements are being imposed, for example by the European Community and others, on products to be imported, and can be a significant barrier to trade, especially for developing nations. In addition, there is a growing need for these laboratories to operate efficiently and effectively: to reduce internal waste, to provide reports on time in an economical manner, and to become self-supporting. The need for change is global, and this proposal is for the Agency to pursue a thematic plan for the implementation of quality assurance as partners in development with the selected laboratories using nuclear analytical techniques. This report describes a model project for this thematic approach, to confirm the model's immediate benefits as well as facilitate the long-term sustainability of Member States' laboratories. The model is thematic in that it is also applicable to all other projects for which the credibility and reliability of a laboratory's processes and results must be demonstrated. This model project provides a cost-effective approach for protecting the Agency's investment in these laboratories and strengthening the ability of these national institutions to define, organize, and manage the application of nuclear technology in their respective countries. This pilot project consists of (1) determining the general levels of knowledge and application of quality assurance principles (as delineated in ISO Guide 25) in the responding laboratories; (2) selecting a trial group of

  12. Investigating output and energy variations and their relationship to delivery QA results using Statistical Process Control for helical tomotherapy.

    Science.gov (United States)

    Binny, Diana; Mezzenga, Emilio; Lancaster, Craig M; Trapp, Jamie V; Kairn, Tanya; Crowe, Scott B

    2017-06-01

    The aims of this study were to investigate machine beam parameters using the TomoTherapy quality assurance (TQA) tool, to establish a correlation to patient delivery quality assurance results, and to evaluate the relationship between energy variations detected using different TQA modules. TQA daily measurement results from two treatment machines for periods of up to 4 years were acquired. Analyses of beam quality and of helical and static output variations were made. Variations from planned dose were also analysed using the Statistical Process Control (SPC) technique, and their relationship to output trends was studied. Energy variations appeared to be one of the contributing factors to the delivery output dose seen in the analysis. Ion chamber measurements were reliable indicators of energy and output variations and were linear with patient dose verifications.

  13. SU-F-T-315: Comparative Studies of Planar Dose with Different Spatial Resolution for Head and Neck IMRT QA

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, T; Koo, T [Hallym University Medical Center, Chuncheon, Gangwon (Korea, Republic of)

    2016-06-15

    Purpose: To quantitatively investigate the planar dose difference and the γ value between the reference fluence map with 1 mm detector-to-detector distance and fluence maps with lower spatial resolution, for head and neck intensity modulated radiation therapy (IMRT). Methods: For ten head and neck cancer patients, the IMRT quality assurance (QA) beams were generated using the commercial radiation treatment planning system Pinnacle3 (ver. 8.0.d, Philips Medical System, Madison, WI). For each beam, ten fluence maps (detector-to-detector distance: 1 mm to 10 mm in 1 mm steps) were generated. The fluence maps with detector-to-detector distance larger than 1 mm were interpolated using MATLAB (R2014a, The MathWorks, Natick, MA) with four different interpolation methods: bilinear, cubic spline, bicubic, and nearest neighbor interpolation. These interpolated fluence maps were compared with the reference one using the γ value (criteria: 3%, 3 mm) and the relative dose difference. Results: As the detector-to-detector distance increases, the dose difference between the two maps increases. For fluence maps with the same resolution, the cubic spline and bicubic interpolations are almost equally the best interpolation methods, while the nearest neighbor interpolation is the worst. For example, for 5 mm distance fluence maps, γ≤1 rates are 98.12±2.28%, 99.48±0.66%, 99.45±0.65% and 82.23±0.48% for the bilinear, cubic spline, bicubic, and nearest neighbor interpolations, respectively. For 7 mm distance fluence maps, γ≤1 rates are 90.87±5.91%, 90.22±6.95%, 91.79±5.97% and 71.93±4.92% for the bilinear, cubic spline, bicubic, and nearest neighbor interpolations, respectively. Conclusion: We recommend that a 2-dimensional detector array with high spatial resolution be used as an IMRT QA tool and that the measured fluence maps be interpolated using the cubic spline interpolation or the
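The interpolation comparison can be illustrated with scipy.ndimage.zoom, whose spline order maps roughly onto the methods compared (order 0 = nearest neighbor, 1 = bilinear, 3 = bicubic). The smooth Gaussian "fluence map" below is an assumption for illustration, not patient data:

```python
import numpy as np
from scipy.ndimage import zoom

# Illustrative smooth "fluence map" on a fine 1 mm grid (101 x 101)
x = np.linspace(-50, 50, 101)
X, Y = np.meshgrid(x, x)
fine = np.exp(-(X**2 + Y**2) / (2 * 15.0**2))

coarse = fine[::5, ::5]                    # simulate a 5 mm detector spacing
factor = fine.shape[0] / coarse.shape[0]   # upsampling factor back to 1 mm

for order, name in [(0, "nearest"), (1, "bilinear"), (3, "bicubic")]:
    rebuilt = zoom(coarse, factor, order=order)     # spline interpolation
    rmse = np.sqrt(np.mean((rebuilt - fine) ** 2))
    print(f"{name:8s} RMSE = {rmse:.5f}")
```

For a smooth map the higher-order splines recover the fine grid far more faithfully than nearest neighbor, consistent with the γ-passing-rate ranking reported in the abstract.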

  14. Analytic and numerical studies of Scyllac equilibrium

    International Nuclear Information System (INIS)

    Barnes, D.C.; Brackbill, J.U.; Dagazian, R.Y.; Freidberg, J.P.; Schneider, W.; Betancourt, O.; Garabedian, P.

    1976-01-01

    The results of both numerical and analytic studies of Scyllac equilibria are presented. Analytic expansions are used to derive equilibrium equations appropriate to noncircular cross sections and to compute the stellarator fields which produce toroidal force balance. Numerical algorithms are used to solve both the equilibrium equations and the full system of dynamical equations in three dimensions. Numerical equilibria are found for both l = 1,0 and l = 1,2 systems. It is found that the stellarator fields which produce equilibria in the l = 1,0 system are larger for diffuse than for sharp-boundary plasma profiles, and that the stability of the equilibria depends strongly on the harmonic content of the stellarator fields

  15. Process control analysis of IMRT QA: implications for clinical trials

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Rice, Roger K; Yoo, Sua; Court, Laurence E; McMillan, Sharon K; Russell, J Donald; Pacyniak, John M; Woo, Milton K; Basran, Parminder S; Boyer, Arthur L; Bonilla, Claribel

    2008-01-01

    The purpose of this study is two-fold: first is to investigate the process of IMRT QA using control charts and second is to compare control chart limits to limits calculated using the standard deviation (σ). Head and neck and prostate IMRT QA cases from seven institutions in both academic and community settings are considered. The percent difference between the point dose measurement in phantom and the corresponding result from the treatment planning system (TPS) is used for analysis. The average of the percent difference calculations defines the accuracy of the process and is called the process target. This represents the degree to which the process meets the clinical goal of 0% difference between the measurements and TPS. IMRT QA process ability defines the ability of the process to meet clinical specifications (e.g. 5% difference between the measurement and TPS). The process ability is defined in two ways: (1) the half-width of the control chart limits, and (2) the half-width of ±3σ limits. Process performance is characterized as being in one of four possible states that describes the stability of the process and its ability to meet clinical specifications. For the head and neck cases, the average process target across institutions was 0.3% (range: -1.5% to 2.9%). The average process ability using control chart limits was 7.2% (range: 5.3% to 9.8%) compared to 6.7% (range: 5.3% to 8.2%) using standard deviation limits. For the prostate cases, the average process target across the institutions was 0.2% (range: -1.8% to 1.4%). The average process ability using control chart limits was 4.4% (range: 1.3% to 9.4%) compared to 5.3% (range: 2.3% to 9.8%) using standard deviation limits. Using the standard deviation to characterize IMRT QA process performance resulted in processes being preferentially placed in one of the four states. This is in contrast to using control charts for process characterization where the IMRT QA processes were spread over three of the
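The two kinds of limits compared in this study can be sketched directly: an individuals (I) control chart uses moving-range-based limits, contrasted with plain ±3σ limits from the sample standard deviation. The QA percent-difference history below is hypothetical:

```python
import numpy as np

def individuals_chart_limits(x):
    """Control limits for an individuals (I) chart.

    Uses the average moving range; 2.66 = 3 / d2 with d2 = 1.128 for n = 2.
    """
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()
    center = x.mean()
    return center - 2.66 * mr_bar, center + 2.66 * mr_bar

def three_sigma_limits(x):
    """Conventional mean ± 3 * sample standard deviation limits."""
    x = np.asarray(x, dtype=float)
    s = x.std(ddof=1)
    return x.mean() - 3 * s, x.mean() + 3 * s

# Hypothetical IMRT QA history: percent difference, measurement vs TPS
qa = [0.4, -0.2, 1.1, 0.3, -0.8, 0.9, 0.1, -0.5, 0.7, 0.2]
print("control chart limits:", individuals_chart_limits(qa))
print("±3σ limits          :", three_sigma_limits(qa))
```

Because the moving range responds only to short-term variation, the two sets of limits differ whenever the process drifts, which is the distinction the study exploits when classifying process states.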

  16. a cross-sectional analytic study 2014

    African Journals Online (AJOL)

    Assessment of HIV/AIDS comprehensive correct knowledge among Sudanese university: a cross-sectional analytic study 2014. ... There are limited studies on this topic in Sudan. In this study we investigated the Comprehensive correct ...

  17. Three-dimensional dose distribution in contrast-enhanced digital mammography using Gafchromic XR-QA2 films: Feasibility study

    International Nuclear Information System (INIS)

    Hwang, Yi-Shuan; Lin, Yu-Ying; Cheung, Yun-Chung; Tsai, Hui-Yu

    2014-01-01

    This study aimed to establish three-dimensional dose distributions for contrast-enhanced digital mammography (CEDM) using self-developing Gafchromic XR-QA2 films. Dose calibration and distribution evaluations were performed on a full-field digital mammography unit with a dual energy (DE) contrast-enhanced option. The strategy for dose calibration of films in the DE mode was based on data obtained from common target/filter/kVp combinations used clinically and a dose response model modified from Rampado's model. Doses derived from films were also verified against measured data from an ionization chamber. The average dose difference was 8.9% over the dose range for clinical use. Three-dimensional dose distributions were estimated using a triangular acrylic phantom on the mammography system. Five film sheets were placed separately between the acrylic slabs to evaluate the dose distribution at different depths. After normalizing the dose in each pixel to the maximum dose at the top-center position of the acrylic, normalized dose distributions for the transverse, coronal and sagittal planes could thus be obtained. The depth dose distribution evaluated in this study may further serve as a reference for evaluating the patient glandular dose at different depths based on entrance exposure information. - Highlights: • CEDM techniques can enhance contrast uptake areas and suppress background tissue. • Dose for the dual-energy acquisition is about 20% higher than in standard mode. • A new method is proposed to estimate the 3D dose distribution in dual-energy CEDM. • Depth at a normalized dose ratio of 0.5 is less than but near 1 cm in the DE mode

  18. Analytical study of doubly excited ridge states

    International Nuclear Information System (INIS)

    Wong, H.Y.

    1988-01-01

    Two different non-separable problems are explored and analyzed. Non-perturbative methods are needed to handle them, as the competing forces involved in these problems are equally strong and do not yield to a perturbative analysis. The first is the study of doubly excited ridge states of atoms, in which two electrons are comparably excited. An analytical wavefunction for such states is introduced and used to solve the two-electron Hamiltonian variationally in the pair coordinates called hyperspherical coordinates. The correlation between the electrons is built analytically into the structure of the wavefunction. Sequences of ridge states out to very high excitation are computed and organized as Rydberg series converging to the double ionization limit. Numerical results for such states in He and H⁻ are compared with other theoretical calculations where available. The second problem is the analysis of the photodetachment of negative ions in an electric field via frame transformation theory. The presence of the electric field requires a transformation from spherical to cylindrical symmetry for the outgoing photoelectron. This gives an oscillatory modulating factor as the effect of the electric field on cross sections. All of this work is derived analytically in a general form applicable to the photodetachment of any negative ion. The expressions are applied to H⁻ and S⁻ for illustration

  19. Path integral analysis of Jarzynski's equality: Analytical results

    Science.gov (United States)

    Minh, David D. L.; Adib, Artur B.

    2009-02-01

    We apply path integrals to study nonequilibrium work theorems in the context of Brownian dynamics, deriving in particular the equations of motion governing the most typical and most dominant trajectories. For the analytically soluble cases of a moving harmonic potential and a harmonic oscillator with a time-dependent natural frequency, we find such trajectories, evaluate the work-weighted propagators, and validate Jarzynski’s equality.
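For the moving harmonic potential the free energy change vanishes, so Jarzynski's equality predicts ⟨e^(-βW)⟩ = 1 even though the average work is positive. A minimal overdamped Brownian dynamics check of that prediction (all parameters are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdamped Langevin particle in a harmonic trap dragged at constant speed v:
#   H(x, lam) = 0.5 * k * (x - lam)^2 with lam = v * t
# Work increment along a trajectory: dW = (dH/dlam) dlam = -k (x - lam) v dt
k, gamma, kT, v = 1.0, 1.0, 1.0, 0.5     # illustrative parameters
dt, nsteps, ntraj = 0.01, 200, 20000     # trap moves v * nsteps * dt = 1.0

x = rng.normal(0.0, np.sqrt(kT / k), ntraj)   # start equilibrated at lam = 0
W = np.zeros(ntraj)
for i in range(nsteps):
    lam = v * i * dt
    W += -k * (x - lam) * v * dt                           # accumulate work
    x += (-k * (x - lam) / gamma) * dt \
         + np.sqrt(2.0 * kT * dt / gamma) * rng.normal(size=ntraj)

print("mean work   :", W.mean())                # positive: dissipation
print("<exp(-W/kT)>:", np.exp(-W / kT).mean())  # ~ exp(-dF/kT) = 1
```

The particle lags behind the trap, so work is done on it on average; the exponential average nevertheless recovers ΔF = 0, which is the content of the equality the paper analyzes via path integrals.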

  20. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Stracuzzi, David John [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Brost, Randolph [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Chen, Maximillian Gene [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Malinas, Rebecca [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Peterson, Matthew Gregor [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Phillips, Cynthia A. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Woodbridge, Diane [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  1. A New Method to Study Analytic Inequalities

    Directory of Open Access Journals (Sweden)

    Xiao-Ming Zhang

    2010-01-01

    We present a new method to study analytic inequalities involving n variables. Regarding its applications, we prove some well-known inequalities and improve Carleman's inequality.

  2. QA at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1988-01-01

    This paper opens with a brief overview of the purpose of Fermilab and a historical synopsis of the development and current status of quality assurance (QA) at the Laboratory. The paper subsequently addresses some of the more important aspects of interpreting the national standard ANSI/ASME NQA-1 in pure research environments like Fermilab. Highlights of this discussion include: (1) what hermeneutics is and why hermeneutical considerations are relevant for QA; (2) a critical analysis of NQA-1 focusing on teleological aspects of the standard; and (3) a description of the hermeneutical approach to NQA-1 used at Fermilab, which attempts to capture the true intent of the document without violating the deeply ingrained traditions of quality standards and peer review that have been foundational to the overall success of the paradigms of high-energy physics.

  3. Experimental analytical study on heat pipes

    International Nuclear Information System (INIS)

    Ismail, K.A.R.; Liu, C.Y.; Murcia, N.

    1981-01-01

    An analytical model is developed for optimizing the thickness distribution of the porous material in heat pipes. The method was used to calculate, design and construct heat pipes with internal geometrical changes. Ordinary pipes were also constructed and tested together with the modified ones. The results showed that the modified tubes are superior in performance and that the analytical model can predict their performance to within 1.5%. (Author)

  4. Analytic results for asymmetric random walk with exponential transition probabilities

    International Nuclear Information System (INIS)

    Gutkowicz-Krusin, D.; Procaccia, I.; Ross, J.

    1978-01-01

    We present here exact analytic results for a random walk on a one-dimensional lattice with asymmetric, exponentially distributed jump probabilities. We derive the generating functions of such a walk for a perfect lattice and for a lattice with absorbing boundaries. We obtain solutions for some interesting moment properties, such as mean first passage time, drift velocity, dispersion, and branching ratio for absorption. The symmetric exponential walk is solved as a special case. The scaling of the mean first passage time with the size of the system for the exponentially distributed walk is determined by the symmetry and is independent of the range

  5. Analytical results for a hole in an antiferromagnet

    International Nuclear Information System (INIS)

    Li, Y.M.; d'Ambrumenil, N.; Su, Z.B.

    1996-04-01

    The Green's function for a hole moving in an antiferromagnet is derived analytically in the long-wavelength limit. We find that the infrared divergence is eliminated in two and higher dimensions, so that the quasiparticle weight is finite. Our results also suggest that the hole motion is polaronic in nature, with a bandwidth proportional to (t^2/J) exp[-c(t/J)^2] (c is a constant) for J/t ≳ 0.5. The connection of the long-wavelength approximation to the first-order approximation in the cumulant expansion is also clarified. (author). 23 refs, 2 figs

  6. Moving from gamma passing rates to patient DVH-based QA metrics in pretreatment dose QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhen, Heming; Nelms, Benjamin E.; Tome, Wolfgang A. [Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 (United States); Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 and Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 and Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2011-10-15

    considerable scatter in the gamma passing rate vs DVH-based metric curves. However, for the same input data, the PDP estimates were in agreement with actual patient DVH results. Conclusions: Gamma passing rate, even if calculated based on patient dose grids, has generally weak correlation to critical patient DVH errors. However, the PDP algorithm was shown to accurately predict the DVH impact using conventional planar QA results. Using patient-DVH-based metrics allows per-patient IMRT dose QA to be based on metrics that are both sensitive and specific. Further studies are now required to analyze new processes and action levels associated with DVH-based metrics to ensure effectiveness and practicality in the clinical setting.

  7. Accounting for human factor in QC and QA inspections

    International Nuclear Information System (INIS)

    Goodman, J.

    1986-01-01

    Two types of human error during QC/QA inspection have been identified, and a method of accounting for the effects of human error in QC/QA inspections was developed. The estimated proportion of discrepant items in the population is significantly affected by the human factor

  8. Application of Statistical Methods to Activation Analytical Results near the Limit of Detection

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Wanscher, B.

    1978-01-01

    Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.
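The Analysis of Precision compares the observed scatter of replicate results with their individual a priori standard deviations through a statistic that is chi-square distributed when the stated uncertainties fully explain the scatter. A sketch with made-up numbers (not the paper's arsenic data):

```python
import numpy as np
from scipy.stats import chi2

def analysis_of_precision(x, sigma):
    """Test whether replicates scatter as their stated uncertainties predict.

    T = sum((x_i - weighted_mean)^2 / sigma_i^2) follows a chi-square
    distribution with n-1 degrees of freedom if the individual standard
    deviations account for all of the observed variability.
    """
    x = np.asarray(x, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    w = 1.0 / sigma**2
    mean_w = np.sum(w * x) / np.sum(w)          # uncertainty-weighted mean
    T = np.sum((x - mean_w) ** 2 / sigma**2)
    p = chi2.sf(T, df=len(x) - 1)
    return mean_w, T, p

# Hypothetical replicate concentrations with counting-statistics sigmas
x = [1.00, 1.20, 0.90]
s = [0.15, 0.15, 0.15]
mean_w, T, p = analysis_of_precision(x, s)
print(f"weighted mean = {mean_w:.3f}, T = {T:.2f}, p = {p:.2f}")
```

A large T (small p) signals unexplained variability beyond the counting statistics; here the scatter is consistent with the stated uncertainties, so the weighted mean is a defensible reported value even near the detection limit.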

  9. Analytic scattering kernels for neutron thermalization studies

    International Nuclear Information System (INIS)

    Sears, V.F.

    1990-01-01

Current plans call for the inclusion of a liquid hydrogen or deuterium cold source in the NRU replacement vessel. This report is part of an ongoing study of neutron thermalization in such a cold source. Here, we develop a simple analytical model for the scattering kernel of monatomic and diatomic liquids. We also present the results of extensive numerical calculations based on this model for liquid hydrogen, liquid deuterium, and mixtures of the two. These calculations demonstrate the dependence of the scattering kernel on the incident and scattered-neutron energies, the behavior near rotational thresholds, the dependence on the centre-of-mass pair correlations, the dependence on the ortho concentration, and the dependence on the deuterium concentration in H₂/D₂ mixtures. The total scattering cross sections are also calculated and compared with available experimental results.

  10. Analytical study of dissipative solitary waves

    Energy Technology Data Exchange (ETDEWEB)

    Dini, Fatemeh [Department of Physics, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Emamzadeh, Mehdi Molaie [Department of Physics, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Khorasani, Sina [School of Electrical Engineering, Sharif University of Technology, PO Box 11365-363, Tehran (Iran, Islamic Republic of); Bobin, Jean Louis [Universite Pierre et Marie Curie, Paris (France); Amrollahi, Reza [Department of Physics, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Sodagar, Majid [School of Electrical Engineering, Sharif University of Technology, PO Box 11365-363, Tehran (Iran, Islamic Republic of); Khoshnegar, Milad [School of Electrical Engineering, Sharif University of Technology, PO Box 11365-363, Tehran (Iran, Islamic Republic of)

    2008-02-15

In this paper, the analytical solution to a new class of nonlinear solitons with cubic nonlinearity is presented, subject to a dissipation term arising from a first-order time derivative, in the weakly nonlinear regime. Exact solutions are found using a combination of the perturbation and Green's function methods up to the third order. We present an example and discuss the asymptotic behavior of the Green's function. The dissipative solitary equation is also studied in phase space in its non-dissipative and dissipative forms. Bounded and unbounded solutions of this equation are characterized, yielding an energy conservation law for non-dissipative waves. Applications of the model include weakly nonlinear solutions of terahertz Josephson plasma waves in layered superconductors and ablative Rayleigh-Taylor instability.

  11. Beam dynamics of mixed high intensity highly charged ion Beams in the Q/A selector

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, X.H., E-mail: zhangxiaohu@impcas.ac.cn [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Yuan, Y.J.; Yin, X.J.; Qian, C.; Sun, L.T. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); Du, H.; Li, Z.S.; Qiao, J.; Wang, K.D. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Zhao, H.W.; Xia, J.W. [Institute of Modern Physics, Chinese Academy of Sciences, Lanzhou 730000 (China)

    2017-06-11

Electron cyclotron resonance (ECR) ion sources are widely used in heavy ion accelerators for their advantages in producing high quality intense beams of highly charged ions. However, challenges exist in designing Q/A selection systems for mixed high-intensity ion beams that reach sufficient Q/A resolution while controlling beam emittance growth. Moreover, since the emittance of beams from ECR ion sources is coupled, the phase-space matching to the post-accelerator should be carefully studied for a wide range of ion beam species and intensities. In this paper, the simulation and experimental results of the Q/A selection system at the LECR4 platform are shown. The formation of a hollow cross-section heavy ion beam at the end of the Q/A selector is revealed. A reasonable interpretation has been proposed, and a modified design of the Q/A selection system has been adopted for the HIRFL-SSC linac injector. The features of the new design, including beam simulations and experimental results, are also presented.

  12. Analytical results of radiochemistry of the JRR-3M

    International Nuclear Information System (INIS)

    Yoshijima, Tetsuo; Tanaka, Sumitoshi

    1997-07-01

The JRR-3 was modified in 1990 to upgrade and enhance its experimental capabilities, becoming the JRR-3M. The JRR-3M is a pool-type research reactor, moderated and cooled by light water, with a maximum thermal power of 20 MWt and a thermal neutron flux of about 2×10¹⁴ n/cm²·s. The core internal structures and fuel cladding tubes are made of aluminum alloy. The cooling systems are composed of the primary cooling system, secondary cooling system, heavy water reflector system and helium gas system. The primary piping system, reactor pool and heavy water reflector system are constructed of type 304 stainless steel. The main objectives of radiochemistry are to check for general corrosion of the structural materials and to detect failed fuel elements for safe operation of the reactor plant. In this report, analytical results of radiochemistry and the evaluation of radionuclides in the cooling systems of the JRR-3M are described. (author)

  13. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Science.gov (United States)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six-megavolt beams with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because the Cpml values are higher than 1.0.
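The control-chart and capability calculations described in this abstract can be sketched as follows. This is a generic individuals-chart construction plus a one-sided, Taguchi-style capability index against a lower specification limit; the exact Cpml definition used in the study may differ, and the baseline values below are invented:

```python
import math
import statistics

def control_limits(baseline):
    """Individuals control chart: centre line and 3-sigma limits, with sigma
    estimated from the average moving range (d2 = 1.128 for subgroups of 2)."""
    mean = statistics.fmean(baseline)
    mr_bar = statistics.fmean([abs(a - b) for a, b in zip(baseline, baseline[1:])])
    sigma = mr_bar / 1.128
    return mean - 3 * sigma, mean, mean + 3 * sigma

def cpml(data, lower_limit, target=100.0):
    """One-sided capability index against a lower spec limit, penalising
    deviation of the mean from the target (here, 100% gamma pass)."""
    mu = statistics.fmean(data)
    tau = math.sqrt(statistics.pvariance(data) + (mu - target) ** 2)
    return (mu - lower_limit) / (3 * tau)

# In the study, the first 50 QA results set the limits; later points are
# then judged against them. Invented example values:
baseline = [96, 97, 95, 96, 98, 97, 96, 95, 97, 96]
lcl, cl, ucl = control_limits(baseline)
```

A later % gamma pass falling below `lcl` would signal a QA process out of statistical control rather than ordinary random variation.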

  14. Radiotherapy QA of the DAHANCA 19 protocol

    DEFF Research Database (Denmark)

    Samsøe, E.; Andersen, E.; Hansen, C. R.

    2015-01-01

    Purpose/Objective: It has been demonstrated that nonadherence to protocol-specified radiotherapy (RT) requirements is associated with reduced survival, local control and potentially increased toxicity [1]. Thus, quality assurance (QA) of RT is important when evaluating the results of clinical...

  15. Analytical performances of food microbiology laboratories - critical analysis of 7 years of proficiency testing results.

    Science.gov (United States)

    Abdel Massih, M; Planchon, V; Polet, M; Dierick, K; Mahillon, J

    2016-02-01

Based on the results of 19 food microbiology proficiency testing (PT) schemes, this study aimed to assess the laboratory performances, to highlight the main sources of unsatisfactory analytical results and to suggest areas of improvement. The 2009-2015 results of REQUASUD and IPH PT, involving a total of 48 laboratories, were analysed. On average, the laboratories failed to detect or enumerate foodborne pathogens in 3·0% of the tests. Thanks to a close collaboration with the PT participants, the causes of outliers could be identified in 74% of the cases. The main causes of erroneous PT results were either pre-analytical (handling of the samples, timing of analysis), analytical (unsuitable methods, confusion of samples, errors in colony counting or confirmation) or post-analytical mistakes (calculation and encoding of results). PT schemes are a privileged observation post to highlight analytical problems, which would otherwise remain unnoticed. In this perspective, this comprehensive study of PT results provides insight into the sources of systematic errors encountered during the analyses. This study draws the attention of the laboratories to the main causes of analytical errors and suggests practical solutions to avoid them, for educational purposes. The observations support the hypothesis that regular participation in PT, when followed by feedback and appropriate corrective actions, can play a key role in quality improvement and provide more confidence in the laboratory testing results. © 2015 The Society for Applied Microbiology.
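Laboratory performance in such PT schemes is commonly scored with a z-score against the assigned value. The sketch below uses the generic ISO 13528-style convention; the scoring rules actually applied by the REQUASUD and IPH schemes may differ, and the example numbers are invented:

```python
def z_score(result, assigned, sigma_pt):
    """Performance score for one PT result (e.g. log10 cfu/g counts)."""
    return (result - assigned) / sigma_pt

def classify(z):
    # Conventional interpretation: |z| <= 2 satisfactory,
    # 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory.
    a = abs(z)
    if a <= 2:
        return "satisfactory"
    return "questionable" if a < 3 else "unsatisfactory"
```

A laboratory reporting 5.9 log cfu/g against an assigned value of 5.3 with a standard deviation for proficiency assessment of 0.25 would score z = 2.4, a questionable result warranting corrective action.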

  16. The assessment report of QA program through the analysis of quality trend in 1994

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yung Se; Hong, Kyung Sik; Park, Sang Pil; Park, Kun Woo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-04-01

The effectiveness and adequacy of the KAERI Quality Assurance Program are assessed through the analysis of quality trends. As a result of the assessment, the Quality Assurance System for each project has reached the stage of stabilization; in particular, significant improvement has been made in conformance to QA procedures, control of QA records and documents, and the inspiration of a quality mindset for the job. However, some problems discovered in this trend analysis, i.e., the efficiency of quality training and the economics of the design verification system, require preventive actions and appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the best quality system suitable for our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author).

  17. The assessment report of QA program through the analysis of quality trend in 1994

    International Nuclear Information System (INIS)

    Kim, Yung Se; Hong, Kyung Sik; Park, Sang Pil; Park, Kun Woo

    1995-04-01

The effectiveness and adequacy of the KAERI Quality Assurance Program are assessed through the analysis of quality trends. As a result of the assessment, the Quality Assurance System for each project has reached the stage of stabilization; in particular, significant improvement has been made in conformance to QA procedures, control of QA records and documents, and the inspiration of a quality mindset for the job. However, some problems discovered in this trend analysis, i.e., the efficiency of quality training and the economics of the design verification system, require preventive actions and appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the best quality system suitable for our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author)

  18. Nuclear analytical techniques for nanotoxicology studies

    International Nuclear Information System (INIS)

    Zhang, Z.Y.; Zhao, Y.L.; Chai, Z.F.

    2011-01-01

With the rapid development of nanotechnology and its applications, a wide variety of nanomaterials are now used in commodities, pharmaceutics, cosmetics, biomedical products, and industries. The potential interactions of nanomaterials with living systems and the environment have attracted increasing attention from the public, as well as from manufacturers of nanomaterial-based products, academic researchers and policymakers. It is important to consider the environmental, health and safety aspects at an early stage of nanomaterial development and application in order to more effectively identify and manage potential human and environmental health impacts from nanomaterial exposure. This will require research in a range of areas, including detection and characterization, environmental fate and transport, ecotoxicology and toxicology. Nuclear analytical techniques (NATs) can play an important role in such studies due to their intrinsic merits such as high sensitivity, good accuracy, high spatial resolution, the ability to distinguish the endogenous or exogenous sources of materials, and the capability for in situ and in vivo analysis. In this paper, the applications of NATs in nanotoxicological and nano-ecotoxicological studies are outlined, and some recent results obtained in our laboratory are reported. (orig.)

  19. Application of QA to R&D support of HLW programs

    International Nuclear Information System (INIS)

    Ryder, D.E.

    1988-01-01

Quality has always been of primary importance in the research and development (R&D) environment. An organization's ability to attract funds for new or continued research is largely dependent on the quality of past performance. However, with the possible exceptions of peer reviews for fund allocation and the referee process prior to publication, past quality assurance (QA) activities were primarily informal good practices. This resulted in standards of acceptable practice that varied from organization to organization. The increasing complexity of R&D projects and the increasing need for project results to be upheld outside the scientific community (i.e., lawsuits and licensing hearings) are encouraging R&D organizations and their clients to adopt more formalized methods for the scientific process and to increase control over support organizations (i.e., suppliers and subcontractors). This has become especially true for R&D organizations that have been involved in high-level waste (HLW) projects for a number of years. PNL began to implement QA program requirements within a few HLW repository preliminary studies in 1978. In 1985, PNL developed a comprehensive QA program for R&D activities in support of two of the proposed repository projects. This QA program was developed by the PNL QA department with significant support, assistance and guidance from PNL upper management, the Basalt Waste Isolation Project (BWIP), and the Salt Repository Program Office (SPRO). The QA program has been revised to add a three-level feature and is currently being implemented on projects sponsored by the Office of Geologic Repositories (DOE/OGR), the Repository Technology Program (DOE-CH), the Nevada Nuclear Waste Storage Investigation (NNWSI) Project, and other HLW projects

  20. AN ANALYTICAL STUDY OF SWITCHING TRACTION MOTORS

    Directory of Open Access Journals (Sweden)

    V. M. Bezruchenko

    2010-03-01

The analytical study of switching in the traction motors of electric locomotives is conducted. It is found that the obtained curves of current variation in the commutated sections correspond to the theory of average rectilinear switching. By means of the proposed method it is possible, at the design stage of traction motors, to forecast the quality of switching and to correct it in good time.

  1. SU-E-J-52: Decreasing Frequency of Performing TG-142 Imaging QA – 5 Year Experience

    Energy Technology Data Exchange (ETDEWEB)

    Lin, T; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)

    2015-06-15

Purpose: This study is an update to check whether the frequency of imaging QA suggested by AAPM Task Group Report 142 (TG142) is necessary, based on our 5-year experience. TG142 presents recommendations for QA criteria of IGRT treatment. ACR has adopted it as the requirement for radiotherapy practices; however, we propose to reduce the frequency of image-quality QA based on this 5-year study. Methods and Materials: This study uses Varian IX2100 and Siemens Artiste Linacs to perform QA on kV, MV, and CBCT modalities. The QA was designed following the recommendations of TG142. This study reports the daily imaging positioning/repositioning and the imaging and treatment coordinate coincidence. QA results on kV, MV and CBCT from 4/7/2010∼3/11/15 are analyzed. kV, MV, and CBCT images are taken with the Varian isocube localized at the isocenter. A digital graticule is used in the software to verify the isocenter position. CBCT images are taken with the cube placed at 1 cm superior, lateral and anterior of the isocenter. In-line fusion software is used to verify the contrived shift. The digital ruler provided in the on-board-imaging software or adaptive-targeting software was used to measure the position differences. The position differences were recorded in the AP, LR and SI directions. Results: 5-year records on kV, MV and CBCT show that the shifts in all three directions are within the tolerance of 1 mm suggested in TG142 for stereotactic radiation treatment (SRS/SRT). There is no occasion where shifts are outside the 1 mm tolerance. Conclusions: The daily imaging QA suggested in TG142 is useful in ensuring the accuracy needed for SRS/SRT in IGRT. The 5-year measurements presented suggest that decreasing the frequency of imaging QA may be acceptable, in particular for institutions reporting no violation of tolerance over periods of a few years.

  2. SU-E-J-52: Decreasing Frequency of Performing TG-142 Imaging QA – 5 Year Experience

    International Nuclear Information System (INIS)

    Lin, T; Ma, C

    2015-01-01

Purpose: This study is an update to check whether the frequency of imaging QA suggested by AAPM Task Group Report 142 (TG142) is necessary, based on our 5-year experience. TG142 presents recommendations for QA criteria of IGRT treatment. ACR has adopted it as the requirement for radiotherapy practices; however, we propose to reduce the frequency of image-quality QA based on this 5-year study. Methods and Materials: This study uses Varian IX2100 and Siemens Artiste Linacs to perform QA on kV, MV, and CBCT modalities. The QA was designed following the recommendations of TG142. This study reports the daily imaging positioning/repositioning and the imaging and treatment coordinate coincidence. QA results on kV, MV and CBCT from 4/7/2010∼3/11/15 are analyzed. kV, MV, and CBCT images are taken with the Varian isocube localized at the isocenter. A digital graticule is used in the software to verify the isocenter position. CBCT images are taken with the cube placed at 1 cm superior, lateral and anterior of the isocenter. In-line fusion software is used to verify the contrived shift. The digital ruler provided in the on-board-imaging software or adaptive-targeting software was used to measure the position differences. The position differences were recorded in the AP, LR and SI directions. Results: 5-year records on kV, MV and CBCT show that the shifts in all three directions are within the tolerance of 1 mm suggested in TG142 for stereotactic radiation treatment (SRS/SRT). There is no occasion where shifts are outside the 1 mm tolerance. Conclusions: The daily imaging QA suggested in TG142 is useful in ensuring the accuracy needed for SRS/SRT in IGRT. The 5-year measurements presented suggest that decreasing the frequency of imaging QA may be acceptable, in particular for institutions reporting no violation of tolerance over periods of a few years.

  3. IMRT QA: Selecting gamma criteria based on error detection sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Steers, Jennifer M. [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California 90048 and Physics and Biology in Medicine IDP, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90095 (United States); Fraass, Benedick A., E-mail: benedick.fraass@cshs.org [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California 90048 (United States)

    2016-04-15

    Purpose: The gamma comparison is widely used to evaluate the agreement between measurements and treatment planning system calculations in patient-specific intensity modulated radiation therapy (IMRT) quality assurance (QA). However, recent publications have raised concerns about the lack of sensitivity when employing commonly used gamma criteria. Understanding the actual sensitivity of a wide range of different gamma criteria may allow the definition of more meaningful gamma criteria and tolerance limits in IMRT QA. We present a method that allows the quantitative determination of gamma criteria sensitivity to induced errors which can be applied to any unique combination of device, delivery technique, and software utilized in a specific clinic. Methods: A total of 21 DMLC IMRT QA measurements (ArcCHECK®, Sun Nuclear) were compared to QA plan calculations with induced errors. Three scenarios were studied: MU errors, multi-leaf collimator (MLC) errors, and the sensitivity of the gamma comparison to changes in penumbra width. Gamma comparisons were performed between measurements and error-induced calculations using a wide range of gamma criteria, resulting in a total of over 20 000 gamma comparisons. Gamma passing rates for each error class and case were graphed against error magnitude to create error curves in order to represent the range of missed errors in routine IMRT QA using 36 different gamma criteria. Results: This study demonstrates that systematic errors and case-specific errors can be detected by the error curve analysis. Depending on the location of the error curve peak (e.g., not centered about zero), 3%/3 mm threshold = 10% at 90% pixels passing may miss errors as large as 15% MU errors and ±1 cm random MLC errors for some cases. As the dose threshold parameter was increased for a given %Diff/distance-to-agreement (DTA) setting, error sensitivity was increased by up to a factor of two for select cases. This increased sensitivity with increasing dose
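The gamma comparison underlying these passing rates can be illustrated with a brute-force implementation of the Low et al. gamma index on two matching 2D dose grids. This is a simplified global-gamma sketch; clinical software interpolates the evaluated distribution more finely and accounts for device geometry:

```python
import math

def gamma_pass_rate(ref, ev, spacing, dd=0.03, dta=3.0, cutoff=0.10):
    """Brute-force global gamma comparison (after Low et al.) of two matching
    2D dose grids. dd is the dose criterion as a fraction of the reference
    maximum, dta the distance-to-agreement in mm, spacing the grid pitch in
    mm, and cutoff the low-dose threshold."""
    ref_max = max(max(row) for row in ref)
    dose_norm = dd * ref_max
    passed = total = 0
    for i, row in enumerate(ref):
        for j, r in enumerate(row):
            if r < cutoff * ref_max:
                continue  # ignore the low-dose region
            # Search every evaluated point for the minimum combined
            # dose-difference / distance penalty.
            best = float("inf")
            for k, erow in enumerate(ev):
                for m, e in enumerate(erow):
                    dist2 = ((i - k) ** 2 + (j - m) ** 2) * spacing ** 2
                    best = min(best, dist2 / dta ** 2 + (e - r) ** 2 / dose_norm ** 2)
            total += 1
            passed += math.sqrt(best) <= 1.0
    return 100.0 * passed / total
```

A uniform 10% dose error, as in the MU-error scenarios above, fails every pixel under 3%/3 mm because no nearby point can compensate for the dose difference.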

  4. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    Energy Technology Data Exchange (ETDEWEB)

    Wiersma, R; Grelewicz, Z; Belcher, A; Liu, X [The University of Chicago, Chicago, IL (United States)

    2015-06-15

Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems, such as: (1) records can be easily lost or destroyed, (2) data is difficult to access — one must physically hunt down records, (3) poor or no means of historical data analysis, and (4) no remote monitoring of machine performance off-site. To address these issues, a cloud based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/javascript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system was used to distribute and manage intensive CPU tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation to independent process threads or different servers such that website performance is not affected. Results: To date the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information was successful in proactively identifying a Linac CBCT scanner’s performance degradation. Conclusion: A fully comprehensive cloud based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would have been otherwise missed by a paper or spreadsheet based QA system.
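The resource-management idea described here, pushing heavy analysis off the request path so the website stays responsive, can be sketched with Python's standard executor pools. The function and field names below are hypothetical placeholders, not the system's actual API:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_phantom(image_id):
    # Placeholder for a CPU-heavy phantom image analysis job (hypothetical).
    return {"image": image_id, "passed": True}

# Submit jobs to a worker pool; a web handler would return immediately and
# poll the futures later. A real deployment would use a process pool or
# separate analysis servers so the web interpreter itself is never blocked.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(analyze_phantom, i) for i in range(3)]
    results = [f.result() for f in futures]
```

The same pattern extends to the report-generation tasks: each LaTeX-to-PDF job is just another callable submitted to the pool.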

  5. SU-E-T-11: A Cloud Based CT and LINAC QA Data Management System

    International Nuclear Information System (INIS)

    Wiersma, R; Grelewicz, Z; Belcher, A; Liu, X

    2015-01-01

Purpose: The current status quo of QA data management consists of a mixture of paper-based forms and spreadsheets for recording the results of daily, monthly, and yearly QA tests for both CT scanners and LINACs. Unfortunately, such systems suffer from a host of problems, such as: (1) records can be easily lost or destroyed, (2) data is difficult to access — one must physically hunt down records, (3) poor or no means of historical data analysis, and (4) no remote monitoring of machine performance off-site. To address these issues, a cloud based QA data management system was developed and implemented. Methods: A responsive tablet interface that optimizes clinic workflow with an easy-to-navigate interface accessible from any web browser was implemented in HTML/javascript/CSS to allow user mobility when entering QA data. Automated image QA was performed using a phantom QA kit developed in Python that is applicable to any phantom and is currently being used with the Gammex ACR, Las Vegas, Leeds, and Catphan phantoms for performing automated CT, MV, kV, and CBCT QAs, respectively. A Python based resource management system was used to distribute and manage intensive CPU tasks such as QA phantom image analysis or LaTeX-to-PDF QA report generation to independent process threads or different servers such that website performance is not affected. Results: To date the cloud QA system has performed approximately 185 QA procedures. Approximately 200 QA parameters are being actively tracked by the system on a monthly basis. Electronic access to historical QA parameter information was successful in proactively identifying a Linac CBCT scanner’s performance degradation. Conclusion: A fully comprehensive cloud based QA data management system was successfully implemented for the first time. Potential machine performance issues were proactively identified that would have been otherwise missed by a paper or spreadsheet based QA system.

  6. SU-E-CAMPUS-T-04: Statistical Process Control for Patient-Specific QA in Proton Beams

    Energy Technology Data Exchange (ETDEWEB)

    LAH, J [Myongji Hospital, Goyangsi, Gyeonggi-do (Korea, Republic of); SHIN, D [National Cancer Center, Goyangsi, Gyeonggi-do (Korea, Republic of); Kim, G [UCSD Medical Center, La Jolla, CA (United States)

    2014-06-15

Purpose: To evaluate and improve the reliability of the proton QA process and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology, with the aim of suggesting suitable guidelines for the patient-specific QA process. Methods: We investigated the constancy of the dose output and range to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to suggest suitable guidelines for patient-specific QA in proton beams by using process capability indices. In this study, patient QA plans were classified into 6 treatment sites: head and neck (41 cases), spinal cord (29 cases), lung (28 cases), liver (30 cases), pancreas (26 cases), and prostate (24 cases). Results: The deviations for the dose output and range of the daily QA process were ±0.84% and ±0.19%, respectively. Our results show that the patient-specific range measurements are capable at a specification limit of ±2% in all treatment sites except spinal cord cases. In spinal cord cases, comparison of process capability indices (Cp, Cpm, Cpk ≥ 1, but Cpmk ≤ 1) indicated that the process is capable but not centered: the process mean deviates from its target value. The UCL (upper control limit), CL (center line) and LCL (lower control limit) for spinal cord cases were 1.37%, −0.27% and −1.89%, respectively. On the other hand, the range differences in prostate cases showed good agreement between calculated and measured values. The UCL, CL and LCL for prostate cases were 0.57%, −0.11% and −0.78%, respectively. Conclusion: The SPC methodology has potential as a useful tool to customize optimal tolerance levels and to suggest suitable guidelines for patient-specific QA in clinical proton beams.
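The four capability indices compared in this abstract have standard textbook definitions, sketched below for range differences in percent with a target of 0 and the ±2% specification limits used in the study. The example data are invented; the pattern Cpmk ≤ min(Cp, Cpk, Cpm) always holds because the Taguchi denominator τ ≥ σ and the one-sided numerator is at most half the specification width:

```python
import math
import statistics

def capability(data, lsl, usl, target):
    """Process capability indices as used to judge patient-specific range QA."""
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    # Taguchi denominator: penalises a process mean that drifts off target.
    tau = math.sqrt(sigma ** 2 + (mu - target) ** 2)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    cpm = (usl - lsl) / (6 * tau)
    cpmk = min(usl - mu, mu - lsl) / (3 * tau)
    return cp, cpk, cpm, cpmk

# Invented range differences (%) for one treatment site, spec limits ±2%.
cp, cpk, cpm, cpmk = capability([-0.5, -0.3, -0.1, -0.4, -0.2, -0.3],
                                lsl=-2.0, usl=2.0, target=0.0)
```

A pattern like the spinal-cord result (Cp, Cpm, Cpk ≥ 1 but Cpmk ≤ 1) arises when the spread is tight but the mean sits well away from the target.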

  7. Application of QA geoscience investigations

    International Nuclear Information System (INIS)

    Henderson, J.T.

    1980-01-01

    This paper discusses the evolution of a classical hardware QA program (as currently embodied in DOE/ALO Manual Chapter 08XA; NRC 10CFR Part 50, Appendix B; and other similar documents) into the present geoscience quality assurance programs that address eventual NRC licensing, if required. In the context of this paper, QA will be restricted to the tasks associated with nuclear repositories, i.e. site identification, selection, characterization, verification, and utilization

  8. Institutional Patient-specific IMRT QA Does Not Predict Unacceptable Plan Delivery

    Energy Technology Data Exchange (ETDEWEB)

    Kry, Stephen F., E-mail: sfkry@mdanderson.org [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Molineu, Andrea [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Kerns, James R.; Faught, Austin M.; Huang, Jessie Y.; Pulliam, Kiley B.; Tonigan, Jackie [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Health Science Center Houston, Graduate School of Biomedical Sciences, Houston, Texas (United States); Alvarez, Paola [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Stingo, Francesco [The University of Texas Health Science Center Houston, Graduate School of Biomedical Sciences, Houston, Texas (United States); Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Followill, David S. [Imaging and Radiation Oncology Core at Houston, Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Health Science Center Houston, Graduate School of Biomedical Sciences, Houston, Texas (United States)

    2014-12-01

Purpose: To determine whether in-house patient-specific intensity modulated radiation therapy quality assurance (IMRT QA) results predict Imaging and Radiation Oncology Core (IROC)-Houston phantom results. Methods and Materials: IROC Houston's IMRT head and neck phantoms have been irradiated by numerous institutions as part of clinical trial credentialing. We retrospectively compared these phantom results with those of in-house IMRT QA (following the institution's clinical process) for 855 irradiations performed between 2003 and 2013. The sensitivity and specificity of IMRT QA to detect unacceptable or acceptable plans were determined relative to the IROC Houston phantom results. Additional analyses evaluated specific IMRT QA dosimeters and analysis methods. Results: IMRT QA universally showed poor sensitivity relative to the head and neck phantom, that is, poor ability to predict a failing IROC Houston phantom result. Depending on how the IMRT QA results were interpreted, overall sensitivity ranged from 2% to 18%. For different IMRT QA methods, sensitivity ranged from 3% to 54%. Although the observed sensitivity was particularly poor at clinical thresholds (e.g., 3% dose difference or 90% of pixels passing gamma), receiver operating characteristic analysis indicated that no threshold showed good sensitivity and specificity for the devices evaluated. Conclusions: IMRT QA is not a reasonable replacement for a credentialing phantom. Moreover, the particularly poor agreement between IMRT QA and the IROC Houston phantoms highlights surprising inconsistency in the QA process.
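Treating the IROC phantom outcome as ground truth, the sensitivity and specificity reported above reduce to a 2x2 confusion matrix over paired pass/fail outcomes. The sketch below uses invented flags purely to illustrate the calculation:

```python
def sensitivity_specificity(qa_pass, phantom_pass):
    """Treat the IROC phantom as ground truth and in-house QA as the test.
    Sensitivity: fraction of phantom failures that in-house QA also flagged.
    Specificity: fraction of phantom passes that in-house QA also passed."""
    pairs = list(zip(qa_pass, phantom_pass))
    tp = sum(1 for q, p in pairs if not q and not p)  # both fail: error caught
    fn = sum(1 for q, p in pairs if q and not p)      # QA passes a bad plan
    tn = sum(1 for q, p in pairs if q and p)
    fp = sum(1 for q, p in pairs if not q and p)
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    return sens, spec

# Invented example: four plans; one phantom failure is missed by in-house QA.
sens, spec = sensitivity_specificity([True, True, False, True],
                                     [True, False, False, True])
```

Sweeping the in-house QA passing threshold and recomputing these two rates at each point is exactly the receiver operating characteristic analysis the study performed.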

  9. Institutional Patient-specific IMRT QA Does Not Predict Unacceptable Plan Delivery

    International Nuclear Information System (INIS)

    Kry, Stephen F.; Molineu, Andrea; Kerns, James R.; Faught, Austin M.; Huang, Jessie Y.; Pulliam, Kiley B.; Tonigan, Jackie; Alvarez, Paola; Stingo, Francesco; Followill, David S.

    2014-01-01

    Purpose: To determine whether in-house patient-specific intensity modulated radiation therapy quality assurance (IMRT QA) results predict Imaging and Radiation Oncology Core (IROC)-Houston phantom results. Methods and Materials: IROC Houston's IMRT head and neck phantoms have been irradiated by numerous institutions as part of clinical trial credentialing. We retrospectively compared these phantom results with those of in-house IMRT QA (following the institution's clinical process) for 855 irradiations performed between 2003 and 2013. The sensitivity and specificity of IMRT QA to detect unacceptable or acceptable plans were determined relative to the IROC Houston phantom results. Additional analyses evaluated specific IMRT QA dosimeters and analysis methods. Results: IMRT QA universally showed poor sensitivity relative to the head and neck phantom, that is, poor ability to predict a failing IROC Houston phantom result. Depending on how the IMRT QA results were interpreted, overall sensitivity ranged from 2% to 18%. For different IMRT QA methods, sensitivity ranged from 3% to 54%. Although the observed sensitivity was particularly poor at clinical thresholds (e.g., 3% dose difference or 90% of pixels passing gamma), receiver operating characteristic analysis indicated that no threshold showed good sensitivity and specificity for the devices evaluated. Conclusions: IMRT QA is not a reasonable replacement for a credentialing phantom. Moreover, the particularly poor agreement between IMRT QA and the IROC Houston phantoms highlights surprising inconsistency in the QA process.

  10. Size Effect of the 2-D Bodies on the Geothermal Gradient and Q-A Plot

    Science.gov (United States)

    Thakur, M.; Blackwell, D. D.

    2009-12-01

    Using numerical models we have investigated some of the criticisms of the Q-A plot related to the effect of the size of the body on the slope and the reduced heat flow. The effects of horizontal conduction depend on the relative difference in radioactivity between the body and the country rock (assuming constant thermal conductivity). Horizontal heat transfer due to different 2-D bodies was studied numerically in order to quantify the resulting temperature differences at the Moho and the errors in the prediction of Qr (reduced heat flow). Using the two end-member distributions of radioactivity, the step model (thickness 10 km) and the exponential model, 2-D models with horizontal scales (widths) ranging from 10 to 500 km were investigated. Increasing the horizontal size of the body moves the observations closer to the 1-D solution. A temperature difference of 50 °C at the Moho is produced (for the step model) between models of width 10 km and 500 km. In other words, the 1-D solution effectively provides large-scale averaging of the heat flow and temperature field in the lithosphere. For bodies ≤ 100 km wide the geotherms at shallower levels are affected, but at depth they converge and are 50 °C lower than the infinite-plate model temperature. For 2-D bodies the surface heat flow is decreased by horizontal transfer of heat, which shifts the Q-A point vertically downward on the Q-A plot. The smaller the body, the greater the deviation from the 1-D solution and the farther the Q-A point moves downward on the Q-A plot. On the Q-A plot, a line fitted to points for bodies of different sizes and different radioactivity contrasts (for the step and exponential models) exactly reproduces the reduced heat flow Qr. Thus the size of the body can affect the slope on a Q-A plot, but Qr is not changed. Therefore, Qr ~ 32 mWm-2 obtained from the global terrain average Q-A plot represents the best estimate of stable continental mantle heat flow.
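The 1-D "infinite plate" reference against which the 2-D bodies are compared is the standard linear heat-flow/heat-production relation Qs = Qr + A·D for the step model, with the geotherm following from steady 1-D conduction. A sketch with illustrative parameter values (the conductivity, heat production, and depths below are assumptions, not the authors' inputs):

```python
# 1-D "infinite plate" reference for the Q-A plot, step model of heat
# production.  All parameter values are illustrative assumptions.

K = 2.5        # thermal conductivity, W/(m K)
QR = 32e-3     # reduced (mantle) heat flow, W/m^2
A = 2.0e-6     # heat production of the upper-crustal layer, W/m^3
D = 10e3       # step-model layer thickness, m (10 km)
Z_MOHO = 35e3  # Moho depth, m

def surface_heat_flow(qr=QR, a=A, d=D):
    """Q-A plot line: Qs = Qr + A*D (slope D, intercept Qr)."""
    return qr + a * d

def temperature(z, ts=0.0, qr=QR, a=A, d=D, k=K):
    """Steady 1-D geotherm: radiogenic layer above depth d, none below."""
    if z <= d:
        qs = surface_heat_flow(qr, a, d)
        return ts + qs * z / k - a * z**2 / (2 * k)
    t_d = temperature(d, ts, qr, a, d, k)
    return t_d + qr * (z - d) / k

print(f"surface heat flow: {surface_heat_flow()*1e3:.0f} mW/m^2")
print(f"Moho temperature:  {temperature(Z_MOHO):.0f} C")
```

A 2-D body of finite width loses heat sideways, so its surface heat flow falls below this line while the intercept Qr of the fitted line is preserved, which is the abstract's central point.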

  11. Adaptive cyclically dominating game on co-evolving networks: numerical and analytic results

    Science.gov (United States)

    Choi, Chi Wun; Xu, Chen; Hui, Pak Ming

    2017-10-01

    A co-evolving and adaptive Rock (R)-Paper (P)-Scissors (S) game (ARPS) in which an agent uses one of three cyclically dominating strategies is proposed and studied numerically and analytically. An agent takes adaptive actions to achieve a neighborhood to his advantage by rewiring a dissatisfying link with a probability p or switching strategy with a probability 1 - p. Numerical results revealed two phases in the steady state: an active phase for p below a critical value pc, and, for p above pc, a phase with three separate clusters of agents using only R, P, and S, respectively, with terminated adaptive actions. A mean-field theory based on the link densities in the co-evolving network is formulated, and the trinomial closure scheme is applied to obtain analytical solutions. The analytic results agree well with simulation results on ARPS. In addition, the different probabilities of winning, losing, and drawing a game among the agents are identified as the origin of the small discrepancy between analytic and simulation results. As a result of the adaptive actions, agents of higher degrees are often those being taken advantage of. Agents with a smaller (larger) degree than the mean degree have a higher (smaller) probability of winning than losing. The results are informative for future attempts on formulating more accurate theories.
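The rewire-or-switch dynamics can be sketched as a small agent-based simulation. The update rule below is a plausible reading of the abstract (loser of a game either rewires the dissatisfying link with probability p or adopts the winner's strategy), not the authors' exact specification; network size and link count are arbitrary:

```python
# Minimal agent-based sketch of an adaptive rock-paper-scissors game on a
# co-evolving network.  The update rules are an illustrative interpretation
# of the abstract, not the paper's exact model.
import random

random.seed(1)
BEATS = {"R": "S", "P": "R", "S": "P"}   # key beats value (cyclic dominance)

N, M, P_REWIRE = 60, 180, 0.4
agents = [random.choice("RPS") for _ in range(N)]
links = set()
while len(links) < M:                    # random graph, no self-loops
    i, j = random.sample(range(N), 2)
    links.add((min(i, j), max(i, j)))

def step():
    i, j = random.choice(tuple(links))
    if BEATS[agents[i]] == agents[j]:    # i beats j -> j is dissatisfied
        winner, loser = i, j
    elif BEATS[agents[j]] == agents[i]:
        winner, loser = j, i
    else:
        return                           # draw: no adaptive action
    if random.random() < P_REWIRE:       # rewire the dissatisfying link
        k = random.choice([a for a in range(N) if a != loser])
        new = (min(loser, k), max(loser, k))
        if new not in links:             # keep the link count fixed
            links.remove((min(i, j), max(i, j)))
            links.add(new)
    else:                                # or adopt the winner's strategy
        agents[loser] = agents[winner]

for _ in range(5000):
    step()
print({s: agents.count(s) for s in "RPS"})
```

Sweeping P_REWIRE in such a toy model is one way to see the qualitative transition between an actively mixing population and frozen single-strategy clusters.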

  12. Summary Report for the Evaluation of Current QA Processes Within the FRMAC FAL and EPA MERL.

    Energy Technology Data Exchange (ETDEWEB)

    Shanks, Sonoya T.; Redding, Ted; Jaussi, Lynn; Allen, Mark B.; Fournier, Sean Donovan; Leonard, Elliott J.

    2017-04-01

    The Federal Radiological Monitoring and Assessment Center (FRMAC) relies on accurate and defensible analytical laboratory data to support its mission. Therefore, FRMAC must ensure that the environmental analytical laboratories providing analytical services maintain an ongoing capability to provide accurate analytical results to DOE. The more Quality Assurance (QA) and Quality Control (QC) measures required of a laboratory, the fewer resources are available for analysis of response samples. Because QA and QC measures constitute a major part of a laboratory’s operations, requirements should be imposed only if they are deemed “value-added” for the FRMAC mission. This report provides observations of areas for improvement and potential interoperability opportunities in the areas of Batch Quality Control Requirements, Written Communications, Data Review Processes, and Data Reporting Processes, along with lessons learned as they apply to the early phase of a response, which will be critical for developing a more efficient, integrated response for future interactions between FRMAC and EPA assets.

  13. SU-G-TeP2-01: Can EPID Based Measurement Replace Traditional Daily Output QA On Megavoltage Linac?

    International Nuclear Information System (INIS)

    Saleh, Z; Tang, X; Song, Y; Obcemea, C; Beeban, N; Chan, M; Li, X; Tang, G; Lim, S; Lovelock, D; LoSasso, T; Mechalakos, J; Both, S

    2016-01-01

    Purpose: To investigate the long-term stability and viability of using EPID-based daily output QA, via in-house and vendor-driven protocols, to replace conventional QA tools and improve QA efficiency. Methods: Two Varian TrueBeam machines (TB1&TB2) equipped with electronic portal imaging devices (EPID) were employed in this study. Both machines were calibrated per TG-51 and used clinically since Oct 2014. Daily output measurements for 6/15 MV beams were obtained using a SunNuclear DailyQA3 device as part of morning QA. In addition, an in-house protocol was implemented for EPID output measurement (10×10 cm fields, 100 MU, 100 cm SID, output defined over an ROI of 2×2 cm around the central axis). Moreover, the Varian Machine Performance Check (MPC) was used on both machines to measure machine output. The EPID- and DailyQA3-based measurements of the relative machine output were compared and cross-correlated with monthly machine output as measured by an A12 Exradin 0.65 cc ion chamber (IC) serving as ground truth. The results were correlated using the Pearson test. Results: The correlations among the DailyQA3, in-house EPID, and Varian MPC output measurements with the IC for 6/15 MV were similar for TB1 (0.83–0.95) and TB2 (0.55–0.67). The machine output for the 6/15 MV beams on both machines showed a similar trend, namely an increase over time as indicated by all measurements, requiring a machine recalibration after 6 months. This drift is due to a known issue with the pressurized monitor chamber, which tends to leak over time. MPC failed occasionally but passed when repeated. Conclusion: The results indicate that the use of EPID for daily output measurements has the potential to become a viable and efficient tool for routine daily LINAC QA, eliminating weather (T, P) and human setup variability and increasing the efficiency of the QA process.
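The cross-correlation step described above is a plain Pearson correlation between daily readings from two devices. A minimal sketch, with made-up relative-output series (not the study's data):

```python
# Pearson correlation between daily-output readings from two QA devices.
# The readings below are fabricated relative outputs for illustration only.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

epid = [1.000, 1.002, 1.004, 1.007, 1.009, 1.012]  # EPID relative output
ic   = [0.999, 1.001, 1.005, 1.006, 1.010, 1.011]  # ion-chamber ground truth
print(f"r = {pearson(epid, ic):.3f}")
```

A slow common drift, like the monitor-chamber leak the abstract describes, inflates such correlations, so agreement on the drift does not by itself prove day-to-day agreement.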

  14. SU-G-TeP2-01: Can EPID Based Measurement Replace Traditional Daily Output QA On Megavoltage Linac?

    Energy Technology Data Exchange (ETDEWEB)

    Saleh, Z; Tang, X; Song, Y; Obcemea, C; Beeban, N; Chan, M; Li, X; Tang, G; Lim, S; Lovelock, D; LoSasso, T; Mechalakos, J; Both, S [Memorial Sloan-Kettering Cancer Center, NY (United States)

    2016-06-15

    Purpose: To investigate the long-term stability and viability of using EPID-based daily output QA, via in-house and vendor-driven protocols, to replace conventional QA tools and improve QA efficiency. Methods: Two Varian TrueBeam machines (TB1&TB2) equipped with electronic portal imaging devices (EPID) were employed in this study. Both machines were calibrated per TG-51 and used clinically since Oct 2014. Daily output measurements for 6/15 MV beams were obtained using a SunNuclear DailyQA3 device as part of morning QA. In addition, an in-house protocol was implemented for EPID output measurement (10×10 cm fields, 100 MU, 100 cm SID, output defined over an ROI of 2×2 cm around the central axis). Moreover, the Varian Machine Performance Check (MPC) was used on both machines to measure machine output. The EPID- and DailyQA3-based measurements of the relative machine output were compared and cross-correlated with monthly machine output as measured by an A12 Exradin 0.65 cc ion chamber (IC) serving as ground truth. The results were correlated using the Pearson test. Results: The correlations among the DailyQA3, in-house EPID, and Varian MPC output measurements with the IC for 6/15 MV were similar for TB1 (0.83–0.95) and TB2 (0.55–0.67). The machine output for the 6/15 MV beams on both machines showed a similar trend, namely an increase over time as indicated by all measurements, requiring a machine recalibration after 6 months. This drift is due to a known issue with the pressurized monitor chamber, which tends to leak over time. MPC failed occasionally but passed when repeated. Conclusion: The results indicate that the use of EPID for daily output measurements has the potential to become a viable and efficient tool for routine daily LINAC QA, eliminating weather (T, P) and human setup variability and increasing the efficiency of the QA process.

  15. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD: Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...
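The monthly-median check described above is simple to operationalize: compute the median of a month's patient results, express its deviation from the long-term target as a percentage, and compare against the allowable-bias specification. A sketch with illustrative numbers (the target value and bias limit are assumptions, not the paper's specifications):

```python
# Monthly median of patient results compared against an allowable-bias
# specification.  Target and bias limit are illustrative assumptions.
from statistics import median

TARGET = 140.0        # long-term analyte median, e.g. sodium in mmol/L
ALLOWABLE_BIAS = 0.9  # desirable bias specification, percent

def monthly_bias(results, target=TARGET):
    """Percent deviation of the monthly patient-result median from target."""
    return 100.0 * (median(results) - target) / target

month = [138, 141, 140, 139, 142, 140, 143, 139, 140, 141]
bias = monthly_bias(month)
verdict = "OK" if abs(bias) <= ALLOWABLE_BIAS else "investigate"
print(f"median bias {bias:+.2f}% -> {verdict}")
```

Because the median is robust to outlying patient values, a persistent shift in the monthly median points to an analytical drift rather than a few unusual patients.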

  16. A virtual dosimetry audit - Towards transferability of gamma index analysis between clinical trial QA groups.

    Science.gov (United States)

    Hussein, Mohammad; Clementel, Enrico; Eaton, David J; Greer, Peter B; Haworth, Annette; Ishikura, Satoshi; Kry, Stephen F; Lehmann, Joerg; Lye, Jessica; Monti, Angelo F; Nakamura, Mitsuhiro; Hurkmans, Coen; Clark, Catharine H

    2017-12-01

    Quality assurance (QA) for clinical trials is important. Lack of compliance can affect trial outcome. Clinical trial QA groups have different methods of dose distribution verification and analysis, all with the ultimate aim of ensuring trial compliance. The aim of this study was to gain a better understanding of the different processes to inform future dosimetry audit reciprocity. Six clinical trial QA groups participated. Intensity modulated treatment plans were generated for three different cases. A range of 17 virtual 'measurements' was generated by introducing a variety of simulated perturbations (such as MLC position deviations, dose differences, gantry rotation errors, Gaussian noise) to the three treatment plan cases. Participants were blinded to the details of the 'measured' data. Each group analysed the datasets using their own gamma index (γ) technique and using standardised parameters for passing criteria, lower dose threshold, γ normalisation and global γ. For the same virtual 'measured' datasets, different results were observed when local techniques were used, and differences in the percentage of points passing remained even with the standardised γ parameters. This virtual audit has been an informative step in understanding differences in the verification of measured dose distributions between different clinical trial QA groups. This work lays the foundations for audit reciprocity between groups, particularly with more clinical trials being open to international recruitment. Copyright © 2017 Elsevier B.V. All rights reserved.
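The gamma index that the groups compare combines a dose-difference criterion and a distance-to-agreement (DTA) criterion into one dimensionless number per point. A deliberately simplified 1-D global-gamma sketch (real audit tools work on 2-D or 3-D dose grids, and the profiles below are made up):

```python
# Simplified 1-D global gamma index (3%/3 mm) for two dose profiles.
# Illustrative only: clinical tools operate on full 2-D/3-D dose grids.
import math

def gamma_1d(ref, meas, spacing_mm, dose_pct=3.0, dta_mm=3.0, threshold=0.1):
    """Return the fraction of reference points with gamma <= 1 (global norm)."""
    dmax = max(ref)
    passed = total = 0
    for i, dr in enumerate(ref):
        if dr < threshold * dmax:            # skip low-dose points
            continue
        total += 1
        g = min(                             # search all measured points
            math.sqrt(((i - j) * spacing_mm / dta_mm) ** 2
                      + ((dm - dr) / (dose_pct / 100 * dmax)) ** 2)
            for j, dm in enumerate(meas)
        )
        if g <= 1.0:
            passed += 1
    return passed / total

ref  = [0.2, 0.5, 1.0, 1.0, 0.5, 0.2]
meas = [0.2, 0.52, 1.02, 0.99, 0.48, 0.2]
print(f"pass rate: {100 * gamma_1d(ref, meas, spacing_mm=2.0):.0f}%")
```

Even with this single formula, choices such as the low-dose threshold, the normalisation (global vs. local), and the search neighbourhood differ between implementations, which is exactly why the standardised-parameter comparison in the audit is informative.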

  17. Archetypes of Supply Chain Analytics Initiatives—An Exploratory Study

    Directory of Open Access Journals (Sweden)

    Tino T. Herden

    2018-05-01

    While Big Data and Analytics are arguably rising stars of competitive advantage, their application is often presented and investigated as an overall approach. A plethora of methods and technologies combined with a variety of objectives creates a barrier for managers to decide how to act, while researchers investigating the impact of Analytics oftentimes neglect this complexity when generalizing their results. Based on a cluster analysis applied to 46 case studies of Supply Chain Analytics (SCA), we propose 6 archetypes of initiatives in SCA to provide orientation for managers as a means to overcome barriers and build competitive advantage. Further, the derived archetypes present a distinction of SCA for researchers seeking to investigate the effects of SCA on organizational performance.
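Deriving archetypes from coded case studies is, at its core, a clustering exercise. A toy k-means sketch of the idea (the feature coding, number of clusters, and data points are all illustrative assumptions; the paper clustered 46 cases into 6 archetypes):

```python
# Toy k-means illustrating how coded case studies can be grouped into
# archetypes.  Features, k, and data are illustrative, not the paper's.
import random

random.seed(0)

def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm on tuples of floats."""
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
            groups[nearest].append(p)
        centers = [
            tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

# Each case study coded as (method intensity, technology intensity)
cases = [(0.1, 0.2), (0.15, 0.25), (0.8, 0.9),
         (0.85, 0.8), (0.5, 0.1), (0.55, 0.15)]
centers, groups = kmeans(cases, k=3)
print([len(g) for g in groups])
```

In practice the cluster count would be chosen with a validity measure rather than fixed in advance, and the resulting centroids are then interpreted qualitatively as the archetypes.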

  18. Analytical quality control in environmental analysis - Recent results and future trends of the IAEA's analytical quality control programme

    Energy Technology Data Exchange (ETDEWEB)

    Suschny, O; Heinonen, J

    1973-12-01

    The significance of analytical results depends critically on the degree of their reliability, an assessment of this reliability is indispensable if the results are to have any meaning at all. Environmental radionuclide analysis is a relatively new analytical field in which new methods are continuously being developed and into which many new laboratories have entered during the last ten to fifteen years. The scarcity of routine methods and the lack of experience of the new laboratories have made the need for the assessment of the reliability of results particularly urgent in this field. The IAEA, since 1962, has provided assistance to its member states by making available to their laboratories analytical quality control services in the form of standard samples, reference materials and the organization of analytical intercomparisons. The scope of this programme has increased over the years and now includes, in addition to environmental radionuclides, non-radioactive environmental contaminants which may be analysed by nuclear methods, materials for forensic neutron activation analysis, bioassay materials and nuclear fuel. The results obtained in recent intercomparisons demonstrate the continued need for these services. (author)

  19. Communicating Qualitative Analytical Results Following Grice's Conversational Maxims

    Science.gov (United States)

    Chenail, Jan S.; Chenail, Ronald J.

    2011-01-01

    Conducting qualitative research can be seen as a developing communication act through which researchers engage in a variety of conversations. Articulating the results of qualitative data analysis results can be an especially challenging part of this scholarly discussion for qualitative researchers. To help guide investigators through this…

  20. Results from transcranial Doppler examination on children and adolescents with sickle cell disease and correlation between the time-averaged maximum mean velocity and hematological characteristics: a cross-sectional analytical study

    Directory of Open Access Journals (Sweden)

    Mary Hokazono

    CONTEXT AND OBJECTIVE: Transcranial Doppler (TCD) detects stroke risk among children with sickle cell anemia (SCA). Our aim was to evaluate TCD findings in patients with different sickle cell disease (SCD) genotypes and correlate the time-averaged maximum mean (TAMM) velocity with hematological characteristics. DESIGN AND SETTING: Cross-sectional analytical study in the Pediatric Hematology sector, Universidade Federal de São Paulo. METHODS: 85 SCD patients of both sexes, aged 2-18 years, were evaluated, divided into: group I (62 patients with SCA/Sβ0 thalassemia) and group II (23 patients with SC hemoglobinopathy/Sβ+ thalassemia). TCD was performed and reviewed by a single investigator using Doppler ultrasonography with a 2 MHz transducer, in accordance with the Stroke Prevention Trial in Sickle Cell Anemia (STOP) protocol. The hematological parameters evaluated were: hematocrit, hemoglobin, reticulocytes, leukocytes, platelets and fetal hemoglobin. Univariate analysis was performed and Pearson's coefficient was calculated for hematological parameters and TAMM velocities (P < 0.05). RESULTS: TAMM velocities were 137 ± 28 and 103 ± 19 cm/s in groups I and II, respectively, and correlated negatively with hematocrit and hemoglobin in group I. There was one abnormal result (1.6%) and five conditional results (8.1%) in group I. All results were normal in group II. The middle cerebral arteries were the only vessels affected. CONCLUSION: There was a low prevalence of abnormal Doppler results in patients with sickle cell disease. The time-averaged maximum mean velocity was significantly different between the genotypes and correlated with hematological characteristics.
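The "normal / conditional / abnormal" labels in the results follow the STOP risk categories for TAMM velocity. A sketch of that classification, using the thresholds commonly quoted for STOP (abnormal at or above 200 cm/s, conditional from 170 up to 200 cm/s); treat these as illustrative and consult the protocol itself for clinical use:

```python
# TAMM-velocity risk categories as commonly quoted from the STOP protocol.
# Thresholds are stated here for illustration; verify against the protocol.

def stop_category(tamm_cm_s):
    """Classify a time-averaged maximum mean velocity (cm/s)."""
    if tamm_cm_s >= 200:
        return "abnormal"
    if tamm_cm_s >= 170:
        return "conditional"
    return "normal"

# Group mean velocities from the abstract, plus two hypothetical patients
for v in (137, 103, 185, 210):
    print(v, "cm/s ->", stop_category(v))
```

The group means (137 and 103 cm/s) both fall in the normal range, consistent with the low prevalence of abnormal results the study reports.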

  1. Microwave magnetoelectric fields: An analytical study of topological characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Joffe, R., E-mail: ioffr1@gmail.com [Microwave Magnetic Laboratory, Department of Electrical and Computer Engineering, Ben Gurion University of the Negev, Beer Sheva (Israel); Department of Electrical and Electronics Engineering, Shamoon College of Engineering, Beer Sheva (Israel); Shavit, R.; Kamenetskii, E.O. [Microwave Magnetic Laboratory, Department of Electrical and Computer Engineering, Ben Gurion University of the Negev, Beer Sheva (Israel)

    2015-10-15

    The near fields originating from a small quasi-two-dimensional ferrite disk with magnetic-dipolar-mode (MDM) oscillations are fields with broken dual (electric-magnetic) symmetry. Numerical studies show that such fields – called the magnetoelectric (ME) fields – are distinguished by power-flow vortices and helicity parameters (E.O. Kamenetskii, R. Joffe, R. Shavit, Phys. Rev. E 87 (2013) 023201). These numerical studies can well explain recent experimental results with MDM ferrite disks. In the present paper, we obtain analytically the topological characteristics of the ME-field modes. For this purpose, we use a method of successive approximations. In the second approximation we take into account the influence of the edge regions of an open ferrite disk, which are excluded in the first-approximation solution of the magnetostatic (MS) spectral problem. Based on the analytical method, we obtain a “pure” structure of the electric and magnetic fields outside the MDM ferrite disk. The analytical studies can display some fundamental features that are non-observable in the numerical results. While in numerical investigations one cannot separate the ME fields from the external electromagnetic (EM) radiation, the present theoretical analysis allows one to clearly distinguish the eigen topological structure of the ME fields. Importantly, this ME-field structure gives evidence for certain phenomena that can be related to the Tellegen and bianisotropic coupling effects. We discuss the question of whether the MDM ferrite disk can exhibit properties of the cross magnetoelectric polarizabilities. - Highlights: • We obtain analytically topological characteristics of the ME-field modes. • We take into account the influence of the edge regions of an open ferrite disk. • We obtain a “pure” structure of the electromagnetic fields outside the ferrite disk. • Analytical studies show features that are non-observable in the numerical results. • ME-field gives evidence for

  2. Microwave magnetoelectric fields: An analytical study of topological characteristics

    International Nuclear Information System (INIS)

    Joffe, R.; Shavit, R.; Kamenetskii, E.O.

    2015-01-01

    The near fields originating from a small quasi-two-dimensional ferrite disk with magnetic-dipolar-mode (MDM) oscillations are fields with broken dual (electric-magnetic) symmetry. Numerical studies show that such fields – called the magnetoelectric (ME) fields – are distinguished by power-flow vortices and helicity parameters (E.O. Kamenetskii, R. Joffe, R. Shavit, Phys. Rev. E 87 (2013) 023201). These numerical studies can well explain recent experimental results with MDM ferrite disks. In the present paper, we obtain analytically the topological characteristics of the ME-field modes. For this purpose, we use a method of successive approximations. In the second approximation we take into account the influence of the edge regions of an open ferrite disk, which are excluded in the first-approximation solution of the magnetostatic (MS) spectral problem. Based on the analytical method, we obtain a “pure” structure of the electric and magnetic fields outside the MDM ferrite disk. The analytical studies can display some fundamental features that are non-observable in the numerical results. While in numerical investigations one cannot separate the ME fields from the external electromagnetic (EM) radiation, the present theoretical analysis allows one to clearly distinguish the eigen topological structure of the ME fields. Importantly, this ME-field structure gives evidence for certain phenomena that can be related to the Tellegen and bianisotropic coupling effects. We discuss the question of whether the MDM ferrite disk can exhibit properties of the cross magnetoelectric polarizabilities. - Highlights: • We obtain analytically topological characteristics of the ME-field modes. • We take into account the influence of the edge regions of an open ferrite disk. • We obtain a “pure” structure of the electromagnetic fields outside the ferrite disk. • Analytical studies show features that are non-observable in the numerical results. • ME-field gives evidence for

  3. Intensity-modulated radiation therapy: dynamic MLC (DMLC) therapy, multisegment therapy and tomotherapy. An example of QA in DMLC therapy

    International Nuclear Information System (INIS)

    Webb, S.

    1998-01-01

    Intensity-modulated radiation therapy will make a quantum leap in tumor control. It is the new radiation therapy for the new millennium. The major methods to achieve IMRT are: 1. dynamic multileaf collimator (DMLC) therapy, 2. multisegment therapy, and 3. tomotherapy. The principles of these 3 techniques are briefly reviewed. Each technique presents unique QA issues, which are outlined. As an example, this paper will present the results of a recent study of an important QA concern in DMLC therapy. (orig.)

  4. Analytical results of Tank 38H core samples -- Fall 1999

    International Nuclear Information System (INIS)

    Swingle, R.F.

    2000-01-01

    Two samples were pulled from Tank 38H in the Fall of 1999: a variable depth sample (VDS) of the supernate was pulled in October and a core sample from the salt layer was pulled in December. Analysis of the rinse from the outside of the core sample indicated no sign of volatile or semivolatile organics. Both supernate and solids from the VDS and the dried core sample solids were analyzed for isotopes which could pose a criticality concern and also for elements which could serve as neutron poisons, as well as other elements. Results of the elemental analyses of these samples show significant quantities of elements present to mitigate the potential for nuclear criticality. However, it should be noted that the results given for the VDS solids elemental analyses may be higher than the actual concentrations in the solids, since the filter paper was dissolved along with the sample solids.

  5. TMI-2 core debris analytical methods and results

    International Nuclear Information System (INIS)

    Akers, D.W.; Cook, B.A.

    1984-01-01

    A series of six grab samples was taken from the debris bed of the TMI-2 core in early September 1983. Five of these samples were sent to the Idaho National Engineering Laboratory for analysis. Presented is the analysis strategy for the samples and some of the data obtained from the early stages of examination of the samples (i.e., particle size analysis, gamma spectrometry results, and fissile/fertile material analysis).

  6. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented in clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with head and neck (H and N) or prostate tumours. Local quality assurance (QA) acceptance criteria based on the 'gamma distribution' for approving IMRT plans were developed and implemented in early 2007. A retrospective analysis of these criteria over 194 clinical cases will be presented. The RHH IMRT criteria were established on the assumption that the gamma distribution obtained through inter-comparison of 2-D dose maps between planned and delivered dose is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2-D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck is well under the normal distribution, particularly for prostate cases. The applied pass/fail criteria are not overly sensitive to identifying 'false fails' but can be further tightened for smaller fields, while for the larger fields found in both H and N and prostate cases the criteria were correctly applied. The non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors in the variation in gamma distribution among clinical cases. These criteria derived from clinical statistics are superior and more accurate than single-valued criteria for IMRT QA acceptance procedures. (author)
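The half-normal assumption above leads to a very compact analysis: the maximum-likelihood scale of a half-normal is the root mean square of the gamma values, and the expected fraction of points with gamma below a limit follows from the error function. A sketch with made-up gamma values (not the RHH data):

```python
# Fitting a half-normal to per-plan gamma values (the assumption described
# in the abstract) and estimating the expected gamma<=1 pass fraction.
# The sample gamma values are fabricated for illustration.
import math

def halfnormal_sigma(gammas):
    """Maximum-likelihood scale of a half-normal: sqrt(mean(g^2))."""
    return math.sqrt(sum(g * g for g in gammas) / len(gammas))

def pass_fraction(sigma, limit=1.0):
    """P(gamma <= limit) for a half-normal with scale sigma: erf(limit/(sigma*sqrt(2)))."""
    return math.erf(limit / (sigma * math.sqrt(2)))

gammas = [0.1, 0.2, 0.2, 0.3, 0.4, 0.4, 0.5, 0.7, 0.9, 1.2]
sigma = halfnormal_sigma(gammas)
print(f"sigma = {sigma:.2f}, expected pass rate = {100 * pass_fraction(sigma):.1f}%")
```

A criterion stated on the fitted scale parameter uses the whole distribution of gamma values, which is the sense in which the abstract argues it is superior to a single-valued pass-rate threshold.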

  7. Analytical results from Tank 38H criticality Sample HTF-093

    International Nuclear Information System (INIS)

    Wilmarth, W.R.

    2000-01-01

    Resumption of processing in the 242-16H Evaporator could cause salt dissolution in the Waste Concentration Receipt Tank (Tank 38H). Therefore, High Level Waste personnel sampled the tank at the salt surface. Results of elemental analysis of the dried sludge solids from this sample (HTF-093) show significant quantities of neutron poisons (i.e., sodium, iron, and manganese) present to mitigate the potential for nuclear criticality. Comparison of this sample with the previous chemical and radiometric analyses of H-Area Evaporator samples show high poison to actinide ratios

  8. On the use of biomathematical models in patient-specific IMRT dose QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhen Heming [UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Tome, Wolfgang A. [Department of Radiation Oncology, Division of Medical Physics, Montefiore Medical Center and Institute of Onco-Physics, Albert Einstein College of Medicine, Bronx, New York 10461 (United States)

    2013-07-15

    Purpose: To investigate the use of biomathematical models such as tumor control probability (TCP) and normal tissue complication probability (NTCP) as new quality assurance (QA) metrics. Methods: Five different types of error (MLC transmission, MLC penumbra, MLC tongue and groove, machine output, and MLC position) were intentionally induced to 40 clinical intensity modulated radiation therapy (IMRT) patient plans (20 H and N cases and 20 prostate cases) to simulate both treatment planning system errors and machine delivery errors in the IMRT QA process. The changes in TCP and NTCP for eight different anatomic structures (H and N: CTV, GTV, both parotids, spinal cord, larynx; prostate: CTV, rectal wall) were calculated as the new QA metrics to quantify the clinical impact on patients. The correlation between the change in TCP/NTCP and the change in selected DVH values was also evaluated. The relation between TCP/NTCP change and the characteristics of the TCP/NTCP curves is discussed. Results: ΔTCP and ΔNTCP were summarized for each type of induced error and each structure. The changes/degradations in TCP and NTCP caused by the errors vary widely depending on dose patterns unique to each plan, and are good indicators of each plan's 'robustness' to that type of error. Conclusions: In this in silico QA study the authors have demonstrated the possibility of using biomathematical models not only as patient-specific QA metrics but also as objective indicators that quantify, pretreatment, a plan's robustness with respect to possible error types.
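TCP and NTCP metrics of the kind proposed above are typically built from standard dose-response models. A sketch using two common forms, a Poisson TCP parameterised by D50 and the normalised slope gamma50, and an LKB-style NTCP for a uniform organ dose (the model forms are standard; the parameter values are illustrative, not the paper's):

```python
# Sketch of biomathematical QA metrics: Poisson TCP and LKB-style NTCP for
# a uniform dose.  Parameter values (D50, gamma50, TD50, m) are illustrative.
import math

def tcp_poisson(dose_gy, d50=60.0, gamma50=2.0):
    """Poisson TCP with TCP(D50)=0.5 and normalised slope gamma50 at D50."""
    return 0.5 ** math.exp(2 * gamma50 * (1 - dose_gy / d50) / math.log(2))

def ntcp_lkb(dose_gy, td50=70.0, m=0.15):
    """LKB NTCP for a uniform organ dose: probit of t=(D-TD50)/(m*TD50)."""
    t = (dose_gy - td50) / (m * td50)
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

for d in (50.0, 60.0, 70.0):
    print(f"D={d:.0f} Gy  TCP={tcp_poisson(d):.2f}  NTCP={ntcp_lkb(d):.2f}")
```

A delivery error is then scored by evaluating these models on the perturbed and unperturbed dose and reporting ΔTCP and ΔNTCP, which weights each dose deviation by where it falls on the steep part of the response curve.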

  9. On the use of biomathematical models in patient-specific IMRT dose QA

    International Nuclear Information System (INIS)

    Zhen Heming; Nelms, Benjamin E.; Tomé, Wolfgang A.

    2013-01-01

    Purpose: To investigate the use of biomathematical models such as tumor control probability (TCP) and normal tissue complication probability (NTCP) as new quality assurance (QA) metrics. Methods: Five different types of error (MLC transmission, MLC penumbra, MLC tongue and groove, machine output, and MLC position) were intentionally induced to 40 clinical intensity modulated radiation therapy (IMRT) patient plans (20 H and N cases and 20 prostate cases) to simulate both treatment planning system errors and machine delivery errors in the IMRT QA process. The changes in TCP and NTCP for eight different anatomic structures (H and N: CTV, GTV, both parotids, spinal cord, larynx; prostate: CTV, rectal wall) were calculated as the new QA metrics to quantify the clinical impact on patients. The correlation between the change in TCP/NTCP and the change in selected DVH values was also evaluated. The relation between TCP/NTCP change and the characteristics of the TCP/NTCP curves is discussed. Results: ΔTCP and ΔNTCP were summarized for each type of induced error and each structure. The changes/degradations in TCP and NTCP caused by the errors vary widely depending on dose patterns unique to each plan, and are good indicators of each plan's “robustness” to that type of error. Conclusions: In this in silico QA study the authors have demonstrated the possibility of using biomathematical models not only as patient-specific QA metrics but also as objective indicators that quantify, pretreatment, a plan's robustness with respect to possible error types.

  10. Physics acceptance and QA procedures for IMRT

    International Nuclear Information System (INIS)

    LoSasso, T.; Ling, C.

    2001-01-01

    Full text: Intensity modulated radiation therapy (IMRT) may improve tumor control without compromising normal tissues by facilitating higher, more conformal tumor doses relative to 3D CRT. Intensity modulation (IM) is now possible with inverse planning and radiation delivery using dynamic multileaf collimation. Compared to 3D CRT, certain components in the IMRT process are more obscure to the user. Thus, special quality assurance procedures are required. Hardware and software are still relatively new to many users, and the potential for error is unknown. The relationship between monitor unit (MU) setting and radiation dose for IM beams is much more complex than for non-IM fields. The leaf sequence computer files, which control the MLC position as a function of MU, are large and do not lend themselves to simple manual verification. The 'verification' port film for each IM treatment field, usually obtained with the MLC set at the extreme leaf positions for that field to outline the entire irradiated area, does not verify the intensity modulation pattern. Finally, in IMRT using DMLC (the so-called sliding window technique), a small error in the window (or gap) width will lead to a significant dose error. In earlier papers, we provided an evaluation of the mechanical and dosimetric aspects in the use of a MLC in the dynamic mode. Mechanical tolerances are significantly tighter for DMLC than for static MLC treatments. Transmission through the leaves and through rounded leaf ends and head scatter were shown to be significant to the accuracy of radiation dose delivery using DMLC. With these considerations, we concluded that the present DMLC hardware and software are effective for routine clinical implementation, provided that a carefully designed routine QA procedure is followed to assure the normality of operation. In our earlier studies, an evaluation of the long-term stability of DMLC operation had not yet been performed. 
This paper describes the current status of our
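
    The gap-width sensitivity noted above follows because the primary fluence delivered through a sliding window scales roughly with the gap width, so a small systematic gap error produces a proportional dose error. A first-order sketch (the leaf-transmission value is an illustrative assumption):

```python
def dmlc_dose_error(nominal_gap_mm, gap_error_mm, transmission=0.017):
    """First-order fractional dose error for a sliding-window (DMLC) delivery:
    primary fluence scales with gap width, so a systematic gap error dg
    changes the dose by roughly dg/g; leaf transmission dilutes this slightly.
    Illustrative model, not the authors' formalism."""
    primary = 1.0 - transmission
    return primary * gap_error_mm / nominal_gap_mm

# A 0.5 mm gap error in a 10 mm sliding window -> roughly a 5% dose error
err = dmlc_dose_error(10.0, 0.5)
```

The same relation shows why highly modulated fields (small average gaps) need tighter mechanical tolerances than static MLC treatments.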

  11. Improvement of QA/QC activities in the construction of nuclear power plant

    International Nuclear Information System (INIS)

    Jinji Tomita; Shigetaka Tomaru

    1987-01-01

    Construction of commercial nuclear power plants in Japan started around 1965. This presentation describes the quality assurance (QA) activities of a plant supplier that is also a manufacturer of the key components. The QA activities to date can be divided into several periods of Japan's construction history. The first period is the 1960s, when QA activities centered on study and implementation through the construction of imported plants. Since then, our own technologies and procedures have been established and improved for the construction of high-reliability plants. Our present QA activities are based on actively reflecting the lessons learned from past experience. (author)

  12. Rigorous results of low-energy models of the analytic S-matrix theory

    International Nuclear Information System (INIS)

    Meshcheryakov, V.A.

    1974-01-01

    Results of analytic S-matrix theory, mainly dealing with the static limit of dispersion relations, are applied to pion-nucleon scattering in the low-energy region. Various approaches to solving equations of the Chew-Low type are discussed. It is concluded that interesting results are obtained by reducing the equations to a system of nonlinear difference equations, the crucial element of this approach being the study of functions on the whole Riemann surface. Boundary and crossing symmetry conditions are studied. (HFdV)

  13. SU-F-T-558: ArcCheck for Patient Specific QA in Stereotactic Ablative Radiotherapy

    International Nuclear Information System (INIS)

    Ramachandran, P; Tajaldeen, A; Esen, N; Geso, M; Taylor, D; Wanigaratne, D; Roozen, K; Kron, T

    2016-01-01

    Purpose: Stereotactic Ablative Radiotherapy (SABR) is one of the most frequently preferred treatment techniques for early-stage lung cancer. The technique has been extended to other treatment sites such as spine, liver, scapula, and sternum, which has increased physics QA time on the machine. In this study, we tested the feasibility of using ArcCheck as an alternative method to replace film dosimetry. Methods: Twelve patients undergoing SABR with varied diagnoses of lung, liver, scapula, sternum, and spine were selected for this study. Pre-treatment QA, which includes ionization chamber and film dosimetry, was performed for all patients. The gamma criterion required for each SABR plan to pass QA and proceed to treatment is 95% (3%/1 mm). In addition to this routine process, the treatment plans were exported onto an ArcCheck phantom. The planned and measured doses from the ArcCheck device were compared using four different gamma criteria: 2%/2 mm, 3%/2 mm, 3%/1 mm, and 3%/3 mm. We also introduced errors in gantry, collimator, and couch angle to assess the sensitivity of the ArcCheck to potential delivery errors. Results: The mean ArcCheck passing rates for all twelve cases were 76.1%±9.7% for the 3%/1 mm gamma criterion, 89.5%±5.3% for 2%/2 mm, 92.6%±4.2% for 3%/2 mm, and 97.6%±2.4% for 3%/3 mm. When the SABR spine cases are excluded, we observe ArcCheck passing rates higher than 95% for all studied cases at 3%/3 mm, with the ArcCheck results in acceptable agreement with the film gamma results. Conclusion: Our ArcCheck results at 3%/3 mm were found to correlate well with our routine non-spine SABR patient-specific QA results (3%/1 mm). We observed a significant reduction in QA time when using ArcCheck for SABR QA. This study shows that ArcCheck could replace film dosimetry for all sites except SABR spine.
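
    The passing rates above come from gamma analysis. A minimal 1-D sketch of a global gamma comparison (brute-force search over measurement points, doses normalized to the reference maximum; the test profiles are made up) is:

```python
import math

def gamma_1d(ref, meas, dx_mm, dose_tol=0.03, dist_mm=3.0):
    """1-D global gamma index per reference point: for each reference point,
    find the measurement point minimizing the combined dose/distance metric."""
    dmax = max(ref)
    gammas = []
    for i, r in enumerate(ref):
        best = float("inf")
        for j, m in enumerate(meas):
            dd = (m - r) / (dose_tol * dmax)          # dose difference term
            dr = (j - i) * dx_mm / dist_mm            # distance-to-agreement term
            best = min(best, math.hypot(dd, dr))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

ref = [0.0, 0.2, 0.8, 1.0, 0.8, 0.2, 0.0]
meas = [0.0, 0.21, 0.82, 0.99, 0.78, 0.2, 0.0]
rate = pass_rate(gamma_1d(ref, meas, dx_mm=1.0))
```

Tightening `dose_tol` or `dist_mm` lowers the passing rate, which is the trend the abstract reports when moving from 3%/3 mm to 3%/1 mm.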

  14. SU-F-T-558: ArcCheck for Patient Specific QA in Stereotactic Ablative Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Ramachandran, P [Peter MacCallum Cancer Centre, Melbourne (Australia); RMIT University, Bundoora (Australia); Tajaldeen, A; Esen, N; Geso, M [RMIT University, Bundoora (Australia); Taylor, D; Wanigaratne, D; Roozen, K; Kron, T [Peter MacCallum Cancer Centre, Melbourne (Australia)

    2016-06-15

    Purpose: Stereotactic Ablative Radiotherapy (SABR) is one of the most frequently preferred treatment techniques for early-stage lung cancer. The technique has been extended to other treatment sites such as spine, liver, scapula, and sternum, which has increased physics QA time on the machine. In this study, we tested the feasibility of using ArcCheck as an alternative method to replace film dosimetry. Methods: Twelve patients undergoing SABR with varied diagnoses of lung, liver, scapula, sternum, and spine were selected for this study. Pre-treatment QA, which includes ionization chamber and film dosimetry, was performed for all patients. The gamma criterion required for each SABR plan to pass QA and proceed to treatment is 95% (3%/1 mm). In addition to this routine process, the treatment plans were exported onto an ArcCheck phantom. The planned and measured doses from the ArcCheck device were compared using four different gamma criteria: 2%/2 mm, 3%/2 mm, 3%/1 mm, and 3%/3 mm. We also introduced errors in gantry, collimator, and couch angle to assess the sensitivity of the ArcCheck to potential delivery errors. Results: The mean ArcCheck passing rates for all twelve cases were 76.1%±9.7% for the 3%/1 mm gamma criterion, 89.5%±5.3% for 2%/2 mm, 92.6%±4.2% for 3%/2 mm, and 97.6%±2.4% for 3%/3 mm. When the SABR spine cases are excluded, we observe ArcCheck passing rates higher than 95% for all studied cases at 3%/3 mm, with the ArcCheck results in acceptable agreement with the film gamma results. Conclusion: Our ArcCheck results at 3%/3 mm were found to correlate well with our routine non-spine SABR patient-specific QA results (3%/1 mm). We observed a significant reduction in QA time when using ArcCheck for SABR QA. This study shows that ArcCheck could replace film dosimetry for all sites except SABR spine.

  15. Widget, widget as you lead, I am performing well indeed! Using results from an exploratory offline study to inform an empirical online study about a learning analytics widget in a collaborative learning environment

    NARCIS (Netherlands)

    Scheffel, Maren; Drachsler, Hendrik; Kreijns, Karel; De Kraker, Joop; Specht, Marcus

    2017-01-01

    The collaborative learning processes of students in online learning environments can be supported by providing learning analytics-based visualisations that foster awareness and reflection about an individual's as well as the team's behaviour and their learning and collaboration processes. For this

  16. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background: Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results: In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions: Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
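
    A common numerical ingredient when comparing multiple clustering results is a pair-counting agreement score. A self-contained sketch of the adjusted Rand index (not necessarily the measure XCluSim itself uses) is:

```python
from collections import Counter
from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Pair-counting ARI between two clusterings of the same items:
    1.0 for identical partitions (up to relabeling), ~0 for chance agreement."""
    n = len(labels_a)
    contingency = Counter(zip(labels_a, labels_b))
    sum_cells = sum(comb(c, 2) for c in contingency.values())
    sum_a = sum(comb(c, 2) for c in Counter(labels_a).values())
    sum_b = sum(comb(c, 2) for c in Counter(labels_b).values())
    expected = sum_a * sum_b / comb(n, 2)
    max_index = (sum_a + sum_b) / 2.0
    return (sum_cells - expected) / (max_index - expected)

# Two hypothetical clusterings that agree exactly, up to label permutation
kmeans_like = [0, 0, 0, 1, 1, 1, 2, 2]
hier_like   = [1, 1, 1, 0, 0, 0, 2, 2]
ari = adjusted_rand_index(kmeans_like, hier_like)
```

Scores like this can rank pairwise agreement between many clustering runs, which is the comparison task the tool visualizes interactively.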

  17. Waste-management QA training and motivation

    International Nuclear Information System (INIS)

    Henderson, J.T.

    1982-01-01

    Early in the development of a QA program for the Waste Management and Geotechnical Projects Directorate, thought was given to establishing a QA training program commensurate with the needs, and appropriate to the motivation, of a staff of more than 130 scientists and project leaders. These individuals, i.e., researchers rather than hardware designers, had no prior experience with QA programs and generally did not believe that such controls had any merit. Therefore, historically proven approaches to QA training had to be quickly modified or totally discarded. For instance, due to the mobility and diversity of backgrounds of personnel at SNL, the QA training program had to accommodate many different levels of QA maturity at any given time. Furthermore, since the application of QA to R&D was continuing to profit from project-specific lessons learned, these improvements in the QA program had to be easily and quickly imparted to the general staff's evolving awareness of QA. A somewhat novel approach to QA training has been developed that draws heavily upon SNL's existing In-Hours Technical Education Courses (INTEC) studio capabilities. This training attempts to accommodate individual staff needs and to ensure the required QA skills and awareness for the diverse types of programs addressed

  18. SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors

    International Nuclear Information System (INIS)

    Nelson, C; Mason, B; Kirsner, S; Ohrt, J

    2015-01-01

    Purpose: Ion chamber and film (ICAF) measurement is a method used to verify patient dose prior to treatment. More recently, log-file-based QA has been shown to be an alternative to measurement-based QA. In this study, we delivered VMAT plans with and without errors to determine whether ICAF and/or log-file-based QA was able to detect the errors. Methods: For two VMAT patients, the original treatment plan plus 7 additional plans with introduced delivery errors were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture did not move throughout the delivery of the field. For each delivery, an ICAF measurement was made and a dose comparison based upon log files was performed. The passing criteria used to evaluate the plans were ion chamber agreement within 5% and 90% of film pixels passing the 3 mm/3% gamma analysis (GA). For the log file analysis, the criteria were 90% of voxels passing the 3 mm/3% 3D GA and beam parameters matching the plan. Results: The two original plans were delivered and passed both ICAF and log-file-based QA. Both ICAF and log file QA met the dosimetry criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). For the log file analysis, all 12 erroneous plans flagged a mismatch between the delivery and the plan. The 8 plans that did not meet the criteria all had MLC errors. Conclusion: Our study demonstrates that log-file-based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and both methods were able to detect larger delivery errors
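
    The log-file parameter comparison described above can be sketched as a simple per-beam check. The field-dictionary layout and the tolerance values here are hypothetical, not the authors' criteria:

```python
def detect_delivery_errors(planned, delivered, mlc_tol_mm=0.5, angle_tol_deg=1.0):
    """Flag mismatches between plan and delivery-log beam parameters.
    Each field is a dict (hypothetical structure): gantry/collimator angles
    in degrees and a flat list of MLC leaf positions in mm."""
    errors = []
    for name in ("gantry", "collimator"):
        if abs(planned[name] - delivered[name]) > angle_tol_deg:
            errors.append(name)
    worst_leaf = max(abs(p - d) for p, d in zip(planned["mlc"], delivered["mlc"]))
    if worst_leaf > mlc_tol_mm:
        errors.append("mlc")
    return errors

# A 4-degree gantry error, as induced in the study, is flagged immediately
plan = {"gantry": 180.0, "collimator": 30.0, "mlc": [10.0, 12.0, 14.0]}
log  = {"gantry": 184.0, "collimator": 30.0, "mlc": [10.0, 12.0, 14.0]}
flagged = detect_delivery_errors(plan, log)
```

A direct parameter check like this is why log-file QA flagged all 12 erroneous plans, including small errors that the measured-dose criteria missed.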

  19. Dispersion of helically corrugated waveguides: Analytical, numerical, and experimental study

    International Nuclear Information System (INIS)

    Burt, G.; Ronald, K.; Young, A.R.; Phelps, A.D.R.; Cross, A.W.; Konoplev, I.V.; He, W.; Thomson, J.; Whyte, C.G.; Samsonov, S.V.; Denisov, G.G.; Bratman, V.L.

    2004-01-01

    Helically corrugated waveguides have recently been studied for use in various applications such as interaction regions in gyrotron traveling-wave tubes and gyrotron backward-wave oscillators and as a dispersive medium for passive microwave pulse compression. The paper presents a summary of various methods that can be used for analysis of the wave dispersion of such waveguides. The results obtained from an analytical approach, simulations with the three-dimensional numerical code MAGIC, and cold microwave measurements are analyzed and compared

  20. Seamless Digital Environment – Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-08-01

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and the design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the requested analysis, and present the result with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases to develop improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in a collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  1. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
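
    The monthly-median check described above can be sketched directly. The analyte, baseline median, and allowable-bias value below are illustrative placeholders, not the paper's specifications:

```python
import statistics

def monthly_median_check(results, baseline_median, allowable_bias_pct):
    """Compare a month's patient-result median against a baseline median,
    using an allowable-bias specification derived from biological variation
    (the threshold passed in here is an illustrative assumption)."""
    med = statistics.median(results)
    bias_pct = 100.0 * (med - baseline_median) / baseline_median
    return med, bias_pct, abs(bias_pct) <= allowable_bias_pct

# Hypothetical month of plasma sodium results (mmol/L) against a 139 baseline
month = [138, 140, 139, 141, 137, 140, 139, 138, 140, 139]
med, bias, stable = monthly_median_check(month, baseline_median=139.0,
                                         allowable_bias_pct=0.3)
```

Because medians are robust to the occasional extreme patient result, a drifting monthly median is a fairly specific signal of analytical (rather than population) change.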

  2. STRengthening analytical thinking for observational studies

    DEFF Research Database (Denmark)

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G.

    2014-01-01

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, ma...

  3. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker, D.D. [Westinghouse Savannah River Company, AIKEN, SC (United States)

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  4. Complete analytic results for radiative-recoil corrections to ground-state muonium hyperfine splitting

    International Nuclear Information System (INIS)

    Karshenboim, S.G.; Shelyuto, V.A.; Eides, M.E.

    1988-01-01

    Analytic expressions are obtained for radiative corrections to the hyperfine splitting related to the muon line. The corresponding contribution amounts to (Z²α)(Zα)(m/M)(9/2 ζ(3) − 3π² ln 2 + 39/8) in units of the Fermi hyperfine splitting energy. A complete analytic result for all radiative-recoil corrections is also presented
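
    For orientation, the numerical size of the bracketed coefficient can be evaluated with straightforward arithmetic. This only evaluates the quoted expression, it is not a re-derivation:

```python
import math

def zeta3(terms=200000):
    """Riemann zeta(3) by direct summation; the truncation error is
    about 1/(2*terms**2), far below the precision needed here."""
    return sum(1.0 / k**3 for k in range(1, terms + 1))

# Numerical value of the bracket (9/2)*zeta(3) - 3*pi^2*ln 2 + 39/8
bracket = 4.5 * zeta3() - 3.0 * math.pi**2 * math.log(2.0) + 39.0 / 8.0
```

The bracket evaluates to roughly −10.24, so the muon-line contribution is a small negative multiple of (Z²α)(Zα)(m/M) times the Fermi energy.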

  5. SU-E-T-760: Tolerance Design for Site-Specific Range in Proton Patient QA Process Using the Six Sigma Model

    International Nuclear Information System (INIS)

    Lah, J; Shin, D; Kim, G

    2015-01-01

    Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the patient QA process when implementing Six Sigma. Methods: In this study, patient QA plans were selected according to 6 treatment-site groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerance may be loosened, then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology is known by its DMAIC phases, which stand for: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, and Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 with a tolerance of only ±1 mm. Our results suggested a tolerance level of ±2–3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customized tolerances between calculated and measured range reduce patient QA plan failures, and almost all sites had failure rates of less than 1%. The average QA time also improved from 2 hr to less than 1 hr, including the planning and conversion process, depth-dose measurement, and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis, detailing the functions needed to implement a Six Sigma-capable design
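
    The quoted process capability is the standard Cp ratio of tolerance width to process spread. A sketch with hypothetical range deviations shows how loosening the tolerance raises Cp:

```python
import statistics

def process_capability(deviations_mm, lower_tol_mm, upper_tol_mm):
    """Cp = (USL - LSL) / (6*sigma) for range deviations (measured - calculated)."""
    sigma = statistics.stdev(deviations_mm)
    return (upper_tol_mm - lower_tol_mm) / (6.0 * sigma)

# Hypothetical range deviations (mm) for one treatment site
devs = [-0.4, 0.2, 0.5, -0.1, 0.3, -0.6, 0.1, 0.4, -0.2, 0.0]
cp_tight = process_capability(devs, -1.0, 1.0)   # +/-1 mm tolerance
cp_loose = process_capability(devs, -3.0, 3.0)   # +/-3 mm tolerance
```

A Cp well below 1 at ±1 mm, rising above 1 when the tolerance is customized per site, mirrors the trade-off the abstract describes.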

  6. SU-E-T-760: Tolerance Design for Site-Specific Range in Proton Patient QA Process Using the Six Sigma Model

    Energy Technology Data Exchange (ETDEWEB)

    Lah, J [Myongji Hospital, Goyang, Gyeonggi-do (Korea, Republic of); Shin, D [National Cancer Center, Goyang-si, Gyeonggi-do (Korea, Republic of); Kim, G [University of California, San Diego, La Jolla, CA (United States)

    2015-06-15

    Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the patient QA process when implementing Six Sigma. Methods: In this study, patient QA plans were selected according to 6 treatment-site groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerance may be loosened, then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology is known by its DMAIC phases, which stand for: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, and Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 with a tolerance of only ±1 mm. Our results suggested a tolerance level of ±2–3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customized tolerances between calculated and measured range reduce patient QA plan failures, and almost all sites had failure rates of less than 1%. The average QA time also improved from 2 hr to less than 1 hr, including the planning and conversion process, depth-dose measurement, and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis, detailing the functions needed to implement a Six Sigma-capable design.

  7. Poster - Thur Eve - 29: Detecting changes in IMRT QA using statistical process control.

    Science.gov (United States)

    Drever, L; Salomons, G

    2012-07-01

    Statistical process control (SPC) methods were used to analyze 239 measurement-based individual IMRT QA events. The selected IMRT QA events were all head and neck (H&N) cases with 70 Gy in 35 fractions, and all prostate cases with 76 Gy in 38 fractions, planned between March 2009 and 2012. The results were used to determine if the tolerance limits currently being used for IMRT QA were able to indicate if the process was under control. The SPC calculations were repeated for IMRT QA of the same type of cases that were planned after the treatment planning system was upgraded from Eclipse version 8.1.18 to version 10.0.39. The initial tolerance limits were found to be acceptable for two of the three metrics tested prior to the upgrade. After the upgrade to the treatment planning system, the SPC analysis found that the a priori limits were no longer capable of indicating control for 2 of the 3 metrics analyzed. The changes in the IMRT QA results were clearly identified using SPC, indicating that it is a useful tool for finding changes in the IMRT QA process. Routine application of SPC to IMRT QA results would help to distinguish unintentional trends and changes from the random variation in the IMRT QA results for individual plans. © 2012 American Association of Physicists in Medicine.
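
    An individuals (X-mR) control chart is one standard SPC construction for per-plan QA data like these. A sketch with hypothetical passing rates:

```python
def individuals_control_limits(values):
    """Shewhart individuals (X-mR) chart limits:
    center line +/- 2.66 * (mean moving range)."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(values) / len(values)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical IMRT QA passing rates (%) from a stable period
rates = [98.1, 97.6, 98.4, 97.9, 98.0, 97.7, 98.3, 97.8]
lcl, cl, ucl = individuals_control_limits(rates)

# A later result of 95.0% falls below the lower control limit -> process change
out_of_control = [r for r in rates + [95.0] if not (lcl <= r <= ucl)]
```

Limits derived from the data themselves, rather than fixed a priori tolerances, are what let SPC flag the shift after the planning-system upgrade.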

  8. An analytical study on the thermal stress of mass concrete

    International Nuclear Information System (INIS)

    Yoshida, H.; Sawada, T.; Yamazaki, M.; Miyashita, T.; Morikawa, H.; Hayami, Y.; Shibata, K.

    1983-01-01

    Thermal stress in mass concrete arises from the heat of hydration of the cement. Excessive stresses sometimes cause cracking or other tensile failure in the concrete, so it is becoming necessary to predict the thermal stress in the design and construction of mass concrete. Thermal stress analysis of mass concrete must take account of the dependence of the elastic modulus on the age of the concrete as well as stress relaxation by creep. Studies of those phenomena and the associated analytical methods have been reported previously. The paper presents the analytical method and discusses its reliability through application of the method to an actual structure in which the temperatures and thermal stresses were measured. The method is a time-dependent thermal stress analysis based on the finite element method, which takes account of creep, the aging of the concrete, and the effect of temperature variation in time. (orig./HP)

  9. Final Report on the Analytical Results for Tank Farm Samples in Support of Salt Dissolution Evaluation

    International Nuclear Information System (INIS)

    Hobbs, D.T.

    1996-01-01

    Recent processing of dilute solutions through the 2H-Evaporator system caused dissolution of salt in Tank 38H, the concentrate receipt tank. This report documents analytical results for samples taken from this evaporator system

  10. New analytical results in the electromagnetic response of composite superconducting wire in parallel fields

    NARCIS (Netherlands)

    Niessen, E.M.J.; Niessen, E.M.J.; Zandbergen, P.J.

    1993-01-01

    Analytical results are presented concerning the electromagnetic response of a composite superconducting wire in fields parallel to the wire axis, using the Maxwell equations supplemented with constitutive equations. The problem is nonlinear due to the nonlinearity in the constitutive equation

  11. Results of an interlaboratory comparison of analytical methods for contaminants of emerging concern in water.

    Science.gov (United States)

    Vanderford, Brett J; Drewes, Jörg E; Eaton, Andrew; Guo, Yingbo C; Haghani, Ali; Hoppe-Jones, Christiane; Schluesener, Michael P; Snyder, Shane A; Ternes, Thomas; Wood, Curtis J

    2014-01-07

    An evaluation of existing analytical methods used to measure contaminants of emerging concern (CECs) was performed through an interlaboratory comparison involving 25 research and commercial laboratories. In total, 52 methods were used in the single-blind study to determine method accuracy and comparability for 22 target compounds, including pharmaceuticals, personal care products, and steroid hormones, all at ng/L levels in surface and drinking water. Method biases varied widely by compound; caffeine, NP, OP, and triclosan had false positive rates >15%. In addition, some methods reported false positives for 17β-estradiol and 17α-ethynylestradiol in unspiked drinking water and deionized water, respectively, at levels higher than published predicted no-effect concentrations for these compounds in the environment. False results were generally attributed to contamination, misinterpretation of background interferences, and/or inappropriate setting of detection/quantification levels for analysis at low ng/L levels. The results of both comparisons were collectively assessed to identify parameters that resulted in the best overall method performance. Liquid chromatography-tandem mass spectrometry coupled with the calibration technique of isotope dilution was able to accurately quantify most compounds with an average bias of <10% for both matrices. These findings suggest that this method of analysis is suitable at environmentally relevant levels for most of the compounds studied. This work underscores the need for robust, standardized analytical methods for CECs to improve data quality, increase comparability between studies, and help reduce false positive and false negative rates.
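
    Two of the metrics evaluated here, method bias against a spiked level and false positive rate on unspiked samples, can be sketched as follows. The data values are hypothetical:

```python
def method_bias_pct(measured_ng_l, spiked_ng_l):
    """Percent bias of a method's mean result relative to the spiked level."""
    mean = sum(measured_ng_l) / len(measured_ng_l)
    return 100.0 * (mean - spiked_ng_l) / spiked_ng_l

def false_positive_rate(blank_results_ng_l, reporting_limit_ng_l):
    """Fraction of unspiked (blank) samples reported above the limit."""
    hits = sum(1 for r in blank_results_ng_l if r > reporting_limit_ng_l)
    return hits / len(blank_results_ng_l)

# Hypothetical results for one compound spiked at 50 ng/L
bias = method_bias_pct([48.0, 51.0, 53.0, 50.0], spiked_ng_l=50.0)
# Hypothetical blanks against a 5 ng/L reporting limit
fp = false_positive_rate([0.0, 2.1, 0.0, 0.0, 6.5], reporting_limit_ng_l=5.0)
```

Scoring every lab's submissions with the same definitions is what makes a 52-method comparison like this one tractable.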

  12. QA manpower requirement for nuclear power plants

    International Nuclear Information System (INIS)

    Link, M.

    1980-01-01

    To ensure the quality of the plant, QA activities are to be performed by the owner, the main contractor, the subcontractors and the Licensing Authority. The responsibilities of the QA personnel of these organizations comprise, as a minimum, control of the quality assurance systems and proof of the quality requirements. The numbers of required QA personnel designated for different tasks, together with recommended educational levels and professional qualifications, are given. (orig./RW)

  13. Comparison between numerical and analytical results on the required rf current for stabilizing neoclassical tearing modes

    Science.gov (United States)

    Wang, Xiaojing; Yu, Qingquan; Zhang, Xiaodong; Zhang, Yang; Zhu, Sizheng; Wang, Xiaoguang; Wu, Bin

    2018-04-01

    Numerical studies on the stabilization of neoclassical tearing modes (NTMs) by electron cyclotron current drive (ECCD) have been carried out based on reduced MHD equations, focusing on the amount of the required driven current for mode stabilization and the comparison with analytical results. The dependence of the minimum driven current required for NTM stabilization on some parameters, including the bootstrap current density, radial width of the driven current, radial deviation of the driven current from the resonant surface, and the island width when applying ECCD, are studied. By fitting the numerical results, simple expressions for these dependences are obtained. Analysis based on the modified Rutherford equation (MRE) has also been carried out, and the corresponding results have the same trend as numerical ones, while a quantitative difference between them exists. This difference becomes smaller when the applied radio frequency (rf) current is smaller.

  14. A Web-Based Geovisual Analytical System for Climate Studies

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2012-12-01

    Full Text Available Climate studies involve petabytes of spatiotemporal datasets that are produced and archived at distributed computing resources. Scientists need an intuitive and convenient tool to explore the distributed spatiotemporal data. Geovisual analytical tools have the potential to provide such an intuitive and convenient method for scientists to access climate data, discover the relationships between various climate parameters, and communicate the results across different research communities. However, implementing a geovisual analytical tool for complex climate data in a distributed environment poses several challenges. This paper reports our research and development of a web-based geovisual analytical system to support the analysis of climate data generated by a climate model. Using the ModelE developed by the NASA Goddard Institute for Space Studies (GISS) as an example, we demonstrate that the system is able to (1) manage large volume datasets over the Internet; (2) visualize 2D/3D/4D spatiotemporal data; (3) broker various spatiotemporal statistical analyses for climate research; and (4) support interactive data analysis and knowledge discovery. This research also provides an example for managing, disseminating, and analyzing Big Data in the 21st century.

  15. Analytical and Experimental Study of Residual Stresses in CFRP

    Directory of Open Access Journals (Sweden)

    Chia-Chin Chiang

    2013-01-01

    Fiber Bragg Grating sensors (FBGs) have been utilized in various engineering and photoelectric fields because of their good environment tolerance. In this research, residual stresses of carbon fiber reinforced polymer (CFRP) composites were studied using both experimental and analytical approaches. The FBGs were embedded in the middle layers of the CFRP to study the formation of residual stress during the curing process. Finite element analysis was performed using ABAQUS software to simulate the CFRP curing process. Both experimental and simulation results showed that the residual stress appeared during the cooling process and that the residual stresses could be released when the CFRP was machined to a different shape.

  16. SU-E-T-468: Implementation of the TG-142 QA Process for Seven Linacs with Enhanced Beam Conformance

    Energy Technology Data Exchange (ETDEWEB)

    Woollard, J; Ayan, A; DiCostanzo, D; Grzetic, S; Hessler, J; Gupta, N [OH State University, Columbus, OH (United States)

    2015-06-15

    Purpose: To develop a TG-142 compliant QA process for 7 Varian TrueBeam linear accelerators (linacs) with enhanced beam conformance and dosimetrically matched beam models. To ensure consistent performance of all 7 linacs, the QA process should include a common set of baseline values for use in routine QA on all linacs. Methods: The TG-142 report provides recommended tests, tolerances and frequencies for quality assurance of medical accelerators. Based on the guidance provided in the report, measurement tests were developed to evaluate each of the applicable parameters listed for daily, monthly and annual QA. These tests were then performed on each of our 7 new linacs as they came on line at our institution. Results: The tolerance values specified in TG-142 for each QA test are either absolute tolerances (i.e. ±2 mm) or require a comparison to a baseline value. The results of our QA tests were first used to ensure that all 7 linacs were operating within the suggested tolerance values provided in TG-142 for those tests with absolute tolerances and that the performance of the linacs was adequately matched. The QA test results were then used to develop a set of common baseline values for those QA tests that require comparison to a baseline value at routine monthly and annual QA. The procedures and baseline values were incorporated into spreadsheets for use in monthly and annual QA. Conclusion: We have developed a set of procedures for daily, monthly and annual QA of our linacs that are consistent with the TG-142 report. A common set of baseline values was developed for routine QA tests. The use of this common set of baseline values for comparison at monthly and annual QA will ensure consistent performance of all 7 linacs.
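
The two kinds of tolerance checks described in this abstract (absolute tolerances versus comparison to a fleet-wide baseline) can be sketched as follows; the test values, tolerances, and function names are illustrative, not taken from TG-142.

```python
# Illustrative sketch of routine linac QA checks: each measured result is
# compared either against an absolute tolerance (e.g. +/- 2 mm) or against a
# baseline value shared by all dosimetrically matched linacs.

def check_absolute(measured, nominal, tol):
    """Pass if the measurement is within +/- tol of the nominal value."""
    return abs(measured - nominal) <= tol

def check_baseline(measured, baseline, tol_percent):
    """Pass if the measurement is within tol_percent of the common baseline."""
    return abs(measured - baseline) / baseline * 100.0 <= tol_percent

# Hypothetical examples: a laser localization offset in mm (absolute) and an
# output-constancy reading relative to the fleet baseline (percent).
print(check_absolute(measured=1.5, nominal=0.0, tol=2.0))            # True
print(check_baseline(measured=1.008, baseline=1.000, tol_percent=2.0))  # True
```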

  17. SU-E-T-468: Implementation of the TG-142 QA Process for Seven Linacs with Enhanced Beam Conformance

    International Nuclear Information System (INIS)

    Woollard, J; Ayan, A; DiCostanzo, D; Grzetic, S; Hessler, J; Gupta, N

    2015-01-01

    Purpose: To develop a TG-142 compliant QA process for 7 Varian TrueBeam linear accelerators (linacs) with enhanced beam conformance and dosimetrically matched beam models. To ensure consistent performance of all 7 linacs, the QA process should include a common set of baseline values for use in routine QA on all linacs. Methods: The TG-142 report provides recommended tests, tolerances and frequencies for quality assurance of medical accelerators. Based on the guidance provided in the report, measurement tests were developed to evaluate each of the applicable parameters listed for daily, monthly and annual QA. These tests were then performed on each of our 7 new linacs as they came on line at our institution. Results: The tolerance values specified in TG-142 for each QA test are either absolute tolerances (i.e. ±2 mm) or require a comparison to a baseline value. The results of our QA tests were first used to ensure that all 7 linacs were operating within the suggested tolerance values provided in TG-142 for those tests with absolute tolerances and that the performance of the linacs was adequately matched. The QA test results were then used to develop a set of common baseline values for those QA tests that require comparison to a baseline value at routine monthly and annual QA. The procedures and baseline values were incorporated into spreadsheets for use in monthly and annual QA. Conclusion: We have developed a set of procedures for daily, monthly and annual QA of our linacs that are consistent with the TG-142 report. A common set of baseline values was developed for routine QA tests. The use of this common set of baseline values for comparison at monthly and annual QA will ensure consistent performance of all 7 linacs.

  18. Tank 241-AW-105, grab samples, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for tank 241-AW-105 grab samples. Twenty grab samples were collected from risers 10A and 15A on August 20 and 21, 1996, of which eight were designated for the K Basin sludge compatibility and mixing studies. This document presents the analytical results for the remaining twelve samples. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO). The results for the previous sampling of this tank were reported in WHC-SD-WM-DP-149, Rev. 0, 60-Day Waste Compatibility Safety Issue and Final Results for Tank 241-AW-105, Grab Samples 5AW-95-1, 5AW-95-2 and 5AW-95-3. Three supernate samples exceeded the TOC notification limit (30,000 µg C/g dry weight). Appropriate notifications were made. No immediate notifications were required for any other analyte. The TSAP requested analyses for polychlorinated biphenyls (PCBs) for all liquid and centrifuged solid subsamples. The PCB analysis of the liquid samples has been delayed and will be presented in a revision to this document.

  19. Holistic versus Analytic Evaluation of EFL Writing: A Case Study

    Science.gov (United States)

    Ghalib, Thikra K.; Al-Hattami, Abdulghani A.

    2015-01-01

    This paper investigates the performance of holistic and analytic scoring rubrics in the context of EFL writing. Specifically, the paper compares EFL students' scores on a writing task using holistic and analytic scoring rubrics. The data for the study was collected from 30 participants attending an English undergraduate program in a Yemeni…

  20. Applications of QA to R&D support of HLW programs

    International Nuclear Information System (INIS)

    Ryder, D.E.

    1988-05-01

    The application of a formal QA program to any discipline or organization can be difficult, and doing so in a research and development organization poses special challenges. This paper describes how a QA program based upon a national consensus standard (developed for application to the design, construction and operation of nuclear facilities) has been successfully applied to some of the research and development activities in support of the High Level Waste Programs. This description includes a discussion of the importance of being creative when interpreting the QA standard, a brief overview of the QA program that was developed, and the results achieved during implementation of the QA program. 4 refs., 4 figs.

  1. Analytical Characterization of Rococo Paintings in Egypt: Preliminary Results from El-Gawhara Palace at Cairo

    Directory of Open Access Journals (Sweden)

    Fatma REFAAT

    2012-12-01

    El-Gawhara palace (1813–1814 AD) is situated south of the Mosque of Muhammad Ali in the Cairo Citadel. This palace is an important example of the best early 19th century rococo decorations in Egypt. The present study reports some of the results obtained from the application of different analytical techniques to characterize some rococo paintings at El-Gawhara palace at Cairo, Egypt. The characterization of the studied paintings was carried out by means of optical microscopy (OM), scanning electron microscopy equipped with an energy dispersive X-ray detector (EDS), and Fourier transform infrared spectroscopy (FT-IR). The obtained results allowed the identification of the chemical composition, structure and the painting technique employed in these paintings. This methodology reveals some useful information on some rococo paintings dating back to the 19th century in Egypt.

  2. Backache amongst soldiers: a prospective analytical study

    International Nuclear Information System (INIS)

    Ilyas, S.; Rehman, A.U.; Janjua, S.H.; Tarrar, A.M.

    2012-01-01

    Objective: To investigate the occupational predispositions of low back pain in soldiers. Study Design: Descriptive study. Place and Duration of Study: Combined Military Hospital, Bahawalpur, from June 2009 to January 2010. Patients and Methods: A questionnaire was developed to investigate the occupation-related issues in soldiers reporting with low backache in the surgical OPD at CMH Bahawalpur. It included personal and occupational factors. The body mass index was also calculated. Of the 107 male soldiers assessed, 90 were enrolled into the study. The statistical analysis was performed by descriptive analysis of the data using SPSS 17.0. Results: Of all the soldiers evaluated (n=90), 32 (35.6%) were clerks/computer operators, 21 (23.1%) were drivers and 14 (15.6%) were signal men. All were males (100%) and the average BMI was 24.8 kg/m2. The 69 (76.7%) patients who had backache had prolonged working hours (average 10.8 hours per day); 68 (75.6%) patients used to sleep on a tape/nawar bed and only 12 (13.3%) had been sleeping on mattresses. The onset of pain was sudden in 58 (64.4%) patients: 27 (30%) had developed acute backache after prolonged sitting and 21 (23.3%) after lifting heavy objects. The pain was exaggerated by doing morning physical training in 82 (91.1%), prolonged sitting in 61 (67.8%) and standing with a rifle in 24 (26.7%). Conclusion: The prevalence of low back pain was higher, 69 (76.7%), in sedentary occupations, i.e. soldiers in sitting jobs. The number of working hours in these occupations was associated with the occurrence as well as the aggravation of low back pain. (author)

  3. QA engineering for the LCP USA magnet manufacturers

    International Nuclear Information System (INIS)

    Childress, C.E.; Batey, J.E.; Burn, P.B.

    1981-01-01

    This paper describes the QA and QC efforts and results used in fabricating the superconducting magnets of competing designs being developed by American Manufacturers for testing in the ORNL Large Coil Test Facility. Control of the design, materials and processes to assure proper functioning of the magnets in the test facility as well as the content of archival data being compiled is discussed

  4. EPA's radon study results

    International Nuclear Information System (INIS)

    Dowd, R.M.

    1988-01-01

    Last winter, in cooperation with agencies in 10 states and two metropolitan area counties, EPA measured the indoor air radon concentrations of 14,000 houses, some chosen statistically at random and some by request of the homeowner. Passive measurement methodologies were used, such as exposing a charcoal canister to the air for a few days and allowing the air to migrate into the charcoal naturally. To reduce dilution of radon by the outside air, the protocol required that the house be closed up; therefore, the study was conducted during winter. The measuring device was placed in the lowest livable area (usually the basement) of each house to maximize the potential concentration. It should be noted that these procedures are generally considered to be screening tests because they result in a worst-case measurement rather than a best value. The results of these measurements are presented.

  5. Circular orbits of corotating binary black holes: Comparison between analytical and numerical results

    International Nuclear Information System (INIS)

    Damour, Thibault; Gourgoulhon, Eric; Grandclement, Philippe

    2002-01-01

    We compare recent numerical results, obtained within a 'helical Killing vector' approach, on circular orbits of corotating binary black holes to the analytical predictions made by the effective one-body (EOB) method (which has been recently extended to the case of spinning bodies). On the scale of the differences between the results obtained by different numerical methods, we find good agreement between numerical data and analytical predictions for several invariant functions describing the dynamical properties of circular orbits. This agreement is robust against the post-Newtonian accuracy used for the analytical estimates, as well as under choices of the resummation method for the EOB 'effective potential', and gets better as one uses a higher post-Newtonian accuracy. These findings open the way to a significant 'merging' of analytical and numerical methods, i.e. to matching an EOB-based analytical description of the (early and late) inspiral, up to the beginning of the plunge, to a numerical description of the plunge and merger. We illustrate also the 'flexibility' of the EOB approach, i.e. the possibility of determining some 'best fit' values for the analytical parameters by comparison with numerical data

  6. Nonlinear heat conduction equations with memory: Physical meaning and analytical results

    Science.gov (United States)

    Artale Harris, Pietro; Garra, Roberto

    2017-06-01

    We study nonlinear heat conduction equations with memory effects within the framework of the fractional calculus approach to the generalized Maxwell-Cattaneo law. Our main aim is to derive the governing equations of heat propagation, considering both the empirical temperature-dependence of the thermal conductivity coefficient (which introduces nonlinearity) and memory effects, according to the general theory of Gurtin and Pipkin of finite velocity thermal propagation with memory. In this framework, we consider in detail two different approaches to the generalized Maxwell-Cattaneo law, based on the application of long-tail Mittag-Leffler memory function and power law relaxation functions, leading to nonlinear time-fractional telegraph and wave-type equations. We also discuss some explicit analytical results to the model equations based on the generalized separating variable method and discuss their meaning in relation to some well-known results of the ordinary case.
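
The generalized Maxwell-Cattaneo law with memory described above can be sketched schematically; the following equations use commonly adopted notation for a Gurtin-Pipkin-type formulation and are illustrative, not transcribed from the paper.

```latex
% Heat flux with memory kernel K and temperature-dependent conductivity kappa(T):
q(x,t) = -\int_0^t K(t-t')\,\kappa\big(T(x,t')\big)\,\nabla T(x,t')\,dt'
% Combined with the energy balance \partial_t T = -\nabla\cdot q, this gives
\frac{\partial T}{\partial t}
  = \int_0^t K(t-t')\,\nabla\cdot\Big(\kappa\big(T(x,t')\big)\,\nabla T(x,t')\Big)\,dt'
% A power-law kernel K(t) \propto t^{\alpha-1} makes the right-hand side a
% Riemann-Liouville fractional integral, leading to nonlinear time-fractional
% telegraph and wave-type equations of the kind discussed in the abstract.
```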

  7. mosaicQA - A General Approach to Facilitate Basic Data Quality Assurance for Epidemiological Research.

    Science.gov (United States)

    Bialke, Martin; Rau, Henriette; Schwaneberg, Thea; Walk, Rene; Bahls, Thomas; Hoffmann, Wolfgang

    2017-05-29

    Epidemiological studies are based on a considerable amount of personal, medical and socio-economic data. To answer research questions with reliable results, epidemiological research projects face the challenge of providing high quality data. Consequently, gathered data has to be reviewed continuously during the data collection period. This article describes the development of the mosaicQA-library for non-statistical experts consisting of a set of reusable R functions to provide support for a basic data quality assurance for a wide range of application scenarios in epidemiological research. To generate valid quality reports for various scenarios and data sets, a general and flexible development approach was needed. As a first step, a set of quality-related questions, targeting quality aspects on a more general level, was identified. The next step included the design of specific R-scripts to produce proper reports for metric and categorical data. For more flexibility, the third development step focussed on the generalization of the developed R-scripts, e.g. extracting characteristics and parameters. As a last step the generic characteristics of the developed R functionalities and generated reports have been evaluated using different metric and categorical datasets. The developed mosaicQA-library generates basic data quality reports for multivariate input data. If needed, more detailed results for single-variable data, including definition of units, variables, descriptions, code lists and categories of qualified missings, can easily be produced. The mosaicQA-library enables researchers to generate reports for various kinds of metric and categorical data without the need for computational or scripting knowledge. At the moment, the library focusses on the data structure quality and supports the assessment of several quality indicators, including frequency, distribution and plausibility of research variables as well as the occurrence of missing and extreme values. To
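
mosaicQA itself is a library of R functions; the kind of basic quality indicators it reports (frequency, missing values, plausibility of research variables) can be sketched in a minimal Python analogue. The variable values, plausibility limits, and function name below are made up for illustration.

```python
# Illustrative analogue of a basic data-quality report for one metric variable:
# counts, missings, range, and values outside user-defined plausibility limits.

def quality_report(values, lo, hi):
    """Summarize a metric variable; None entries count as missing,
    and values outside [lo, hi] are flagged as implausible."""
    present = [v for v in values if v is not None]
    return {
        "n": len(present),
        "missing": len(values) - len(present),
        "min": min(present),
        "max": max(present),
        "outliers": [v for v in present if not (lo <= v <= hi)],
    }

# Hypothetical measurements with one missing entry and one implausible value.
report = quality_report([5.1, 4.9, None, 5.3, 25.0, 5.0], lo=0.0, hi=10.0)
print(report["missing"], report["outliers"])  # 1 [25.0]
```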

  8. AN ANALYTICAL STUDY IN ADHESIVE BOWEL OBSTRUCTION

    Directory of Open Access Journals (Sweden)

    Gerald Anand Raja

    2017-04-01

    BACKGROUND Peritoneal adhesions can be defined as abnormal fibrous bands between organs or tissues or both in the abdominal cavity that are normally separated. Adhesions may be acquired or congenital; however, most are acquired as a result of peritoneal injury, the most common cause of which is abdominopelvic surgery. Less commonly, adhesions may form as the result of inflammatory conditions, intraperitoneal infection or abdominal trauma. The extent of adhesion formation varies from one patient to another and is most dependent on the type and magnitude of surgery performed, as well as whether any postoperative complications develop. Fortunately, most patients with adhesions do not experience any overt clinical symptoms. For others, adhesions may lead to any one of a host of problems and can be the cause of significant morbidity and mortality. MATERIALS AND METHODS This is a retrospective study of 50 patients admitted to Government Royapettah Hospital with adhesive bowel obstruction between September 2008 and September 2010. All patients were admitted and managed either conservatively or surgically. RESULTS 1. Adhesive bowel disease is the most common cause of bowel obstruction, followed by hernias. 2. Increased incidence is noted in females. 3. Increased incidence of adhesions was documented in gynaecological and colorectal surgeries. 4. Incisions below the umbilicus have a higher propensity for adhesion formation. 5. Laparotomies done for infective aetiology carry higher adhesion risks. 6. Most adhesive obstructions can be managed conservatively. 7. Adhesiolysis, preferably laparoscopic, can be done; for gangrenous bowel, resection and anastomosis or ostomy is done. 8. Given the above risk factors, adhesive bowel disease can be prevented to a certain extent. CONCLUSION The formation of peritoneal adhesions continues to plague patients, surgeons and society. Although research in this area is ongoing, there is currently no method that is 100% effective in

  9. Analytical Study of Active Prosthetic Legs

    Science.gov (United States)

    Ono, Kyosuke; Katsumata, Mie

    Walking with a prosthesis has not been well analyzed mathematically, and it seems that the design of powered prostheses has been done empirically so far. This paper presents a dynamic simulation of normal human walking and walking with an active prosthesis. We also studied two control methods for a powered thigh prosthesis based on multi-body simulation of human walking. First, we measured the normal human walking gait; then we showed that a 3-DOF human walking model can walk on level ground by applying tracking control to the measured walking gait within a certain range of tuned walking periods. Next, we applied tracking control and self-excited control to the powered thigh prosthesis and compared the robustness and efficiency of the two control methods by numerical simulation. As a result, we found that self-excited control can significantly decrease the hip joint torque and the specific cost to 1/3 of those of tracking control. Moreover, self-excited control is superior to tracking control because tuning of the walking period is not needed for the active prosthetic leg.

  10. Follow-up utterances in QA dialogue

    NARCIS (Netherlands)

    van Schooten, B.W.; op den Akker, Hendrikus J.A.

    2006-01-01

    The processing of user follow-up utterances by a QA system is a topic which is still in its infant stages, but enjoys growing interest in the QA community. In this paper, we discuss the broader issues related to handling follow-up utterances in a real-life "information kiosk" setting. With help of a

  11. Dispersant testing : a study on analytical test procedures

    International Nuclear Information System (INIS)

    Fingas, M.F.; Fieldhouse, B.; Wang, Z.; Environment Canada, Ottawa, ON

    2004-01-01

    Crude oil is a complex mixture of hydrocarbons, ranging from small, volatile compounds to very large, non-volatile ones. Analysis of the dispersed oil is crucial. This paper describes Environment Canada's ongoing studies on various aspects of dispersants. In particular, it describes small studies related to dispersant effectiveness and methods to improve analytical procedures. The study also re-evaluated the analytical procedure for the Swirling Flask Test, which is now part of the ASTM standard procedure. There are new and improved methods for analyzing oil-in-water using gas chromatography (GC). The methods could be further enhanced by integrating the entire chromatogram rather than just the peaks; this would decrease the maximum variation from 5 per cent to about 2 per cent. For oil-dispersant studies, the surfactant-dispersed oil hydrocarbons consist of two parts: GC-resolved hydrocarbons and GC-unresolved hydrocarbons. This study also tested a second feature of the Swirling Flask Test, the side spout, which was compared with a new vessel with a septum port instead of a side spout. This decreased the variability as well as the energy and mixing in the vessel. Rather than being a variation of the Swirling Flask Test, it was suggested that the spoutless vessel might be considered a completely separate test. 7 refs., 2 tabs., 4 figs.
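
The whole-chromatogram integration idea mentioned above (area under the entire GC signal rather than a sum of resolved peaks) can be sketched with a simple trapezoidal rule; the retention times and signal values below are synthetic.

```python
# Sketch of integrating an entire chromatogram by the trapezoidal rule,
# rather than summing only the resolved peaks. Data are synthetic.

def trapezoid_area(times, signal):
    """Area under the chromatogram trace by the trapezoidal rule."""
    area = 0.0
    for i in range(1, len(times)):
        area += 0.5 * (signal[i] + signal[i - 1]) * (times[i] - times[i - 1])
    return area

times  = [0.0, 1.0, 2.0, 3.0, 4.0]   # retention time, arbitrary units
signal = [0.0, 2.0, 5.0, 2.0, 0.0]   # detector response for one resolved peak
print(trapezoid_area(times, signal))  # 9.0
```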

  12. QA

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    Koeberg's system for quality assurance was discussed with the Quality Assurance Programme Manager for Koeberg Construction. An American style of quality assurance, practised on French technology, is used for Koeberg. The quality assurance practised at Koeberg has also affected other industries in South Africa.

  13. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 06: Patient-specific QA Procedure for Gated VMAT SABR Treatments using 10x Beam in Flattening-Filter Free Mode

    Energy Technology Data Exchange (ETDEWEB)

    Mestrovic, Ante; Chitsazzadeh, Shadi; Wells, Derek M.; Gray, Stephen [University of Calgary, Tom Baker Cancer Centre, Tom Baker Cancer Centre (Canada)

    2016-08-15

    Purpose: To develop a highly sensitive patient-specific QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: A platform was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside the ArcCheck. The Quasar phantom controller uses a patient-specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. With this system the ion chamber is used to QA the correct phase of the gated delivery and the ArcCheck diodes are used to QA the overall dose distribution. This novel approach requires a single plan delivery for a complete QA of a gated plan. The sensitivity of the gating QA procedure was investigated with respect to the following parameters: PTV size, exhale duration, baseline drift, and gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse-calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns is currently under way to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.

  14. Analytic expressions for mode conversion in a plasma with a parabolic density profile: Generalized results

    International Nuclear Information System (INIS)

    Hinkel-Lipsker, D.E.; Fried, B.D.; Morales, G.J.

    1993-01-01

    This study provides an analytic solution to the general problem of mode conversion in an unmagnetized plasma. Specifically, an electromagnetic wave of frequency ω propagating through a plasma with a parabolic density profile of scale length L_p is examined. The mode conversion points are located a distance Δ_0 from the peak of the profile, where the electron plasma frequency ω_p(z) matches the wave frequency ω. The corresponding reflection, transmission, and mode conversion coefficients are expressed analytically in terms of parabolic cylinder functions for all values of Δ_0. The method of solution is based on a source approximation technique that is valid when the electromagnetic and electrostatic scale lengths are well separated. For large Δ_0, i.e., (cL_p/ω)^(1/2) ≪ Δ_0 ≪ L_p, the appropriately scaled result [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 559 (1992)] for a linear density profile is recovered as the parabolic cylinder functions asymptotically become Airy functions. When Δ_0 → 0, the special case of conversion at the peak of the profile [D. E. Hinkel-Lipsker et al., Phys. Fluids B 4, 1772 (1992)] is obtained.

  15. SU-F-T-459: ArcCHECK Machine QA : Highly Efficient Quality Assurance Tool for VMAT, SRS & SBRT Linear Accelerator Delivery

    International Nuclear Information System (INIS)

    Mhatre, V; Patwe, P; Dandekar, P

    2016-01-01

    Purpose: Quality assurance (QA) of complex linear accelerators is critical and highly time consuming. The ArcCHECK Machine QA tool is used to test geometric and delivery aspects of a linear accelerator. In this study we evaluated the performance of this tool. Methods: The Machine QA feature allows the user to perform quality assurance tests using the ArcCHECK phantom. The following tests were performed: 1) Gantry Speed 2) Gantry Rotation 3) Gantry Angle 4) MLC/Collimator QA 5) Beam Profile Flatness & Symmetry. Data was collected on a TrueBeam STx machine for 6 MV over a period of one year. The Gantry QA test allows the user to view errors in gantry angle and rotation & assess how accurately the gantry moves around the isocentre. The MLC/Collimator QA tool is used to analyze & locate the differences between leaf bank & jaw positions of the linac. The Flatness & Symmetry test quantifies beam flatness & symmetry in the IEC-y & x directions. The Gantry & Flatness/Symmetry tests can be performed for static & dynamic delivery. Results: The gantry speed was 3.9 deg/sec with a maximum speed deviation of around 0.3 deg/sec. The gantry isocentre was 0.9 mm for arc delivery & 0.4 mm for static delivery. The maximum positive & negative percent differences were found to be 1.9% & −0.25%, & the maximum positive & negative distance differences were 0.4 mm & −0.3 mm for the MLC/Collimator QA. The Flatness for arc delivery was 1.8%, & the Symmetry for Y was 0.8% & for X was 1.8%. The Flatness for gantry 0°, 270°, 90° & 180° was 1.75, 1.9, 1.8 & 1.6% respectively, & the Symmetry for X & Y was 0.8, 0.6% for 0°, 0.6, 0.7% for 270°, 0.6, 1% for 90° & 0.6, 0.7% for 180°. Conclusion: ArcCHECK Machine QA is a useful tool for QA of modern linear accelerators as it tests both geometric & delivery aspects. This is very important for VMAT, SRS & SBRT treatments.

  16. SU-F-T-459: ArcCHECK Machine QA : Highly Efficient Quality Assurance Tool for VMAT, SRS & SBRT Linear Accelerator Delivery

    Energy Technology Data Exchange (ETDEWEB)

    Mhatre, V; Patwe, P; Dandekar, P [Sir HN RF Hospital, Mumbai, Maharashtra (India)

    2016-06-15

    Purpose: Quality assurance (QA) of complex linear accelerators is critical and highly time consuming. The ArcCHECK Machine QA tool is used to test geometric and delivery aspects of a linear accelerator. In this study we evaluated the performance of this tool. Methods: The Machine QA feature allows the user to perform quality assurance tests using the ArcCHECK phantom. The following tests were performed: 1) Gantry Speed 2) Gantry Rotation 3) Gantry Angle 4) MLC/Collimator QA 5) Beam Profile Flatness & Symmetry. Data was collected on a TrueBeam STx machine for 6 MV over a period of one year. The Gantry QA test allows the user to view errors in gantry angle and rotation & assess how accurately the gantry moves around the isocentre. The MLC/Collimator QA tool is used to analyze & locate the differences between leaf bank & jaw positions of the linac. The Flatness & Symmetry test quantifies beam flatness & symmetry in the IEC-y & x directions. The Gantry & Flatness/Symmetry tests can be performed for static & dynamic delivery. Results: The gantry speed was 3.9 deg/sec with a maximum speed deviation of around 0.3 deg/sec. The gantry isocentre was 0.9 mm for arc delivery & 0.4 mm for static delivery. The maximum positive & negative percent differences were found to be 1.9% & −0.25%, & the maximum positive & negative distance differences were 0.4 mm & −0.3 mm for the MLC/Collimator QA. The Flatness for arc delivery was 1.8%, & the Symmetry for Y was 0.8% & for X was 1.8%. The Flatness for gantry 0°, 270°, 90° & 180° was 1.75, 1.9, 1.8 & 1.6% respectively, & the Symmetry for X & Y was 0.8, 0.6% for 0°, 0.6, 0.7% for 270°, 0.6, 1% for 90° & 0.6, 0.7% for 180°. Conclusion: ArcCHECK Machine QA is a useful tool for QA of modern linear accelerators as it tests both geometric & delivery aspects. This is very important for VMAT, SRS & SBRT treatments.

  17. Applying QA to nuclear-development programs

    International Nuclear Information System (INIS)

    Caplinger, W.H.

    1981-12-01

    The application of quality assurance (QA) principles to developmental programs is usually accomplished by tailoring or selecting appropriate requirements from large QA systems. Developmental work at Westinghouse Hanford Company (WHC) covers the complete range from basic research to in-core reactor tests. Desired requirements are selected from the 18 criteria of ANSI/ASME NQA-1 by the cognizant program engineer in conjunction with the quality engineer. These referenced criteria assure that QA for the program is planned, implemented, and maintained. In addition, the WHC QA Manual provides four categories or levels of QA that are assigned to programs or components within a program. These categories are based on safety, reliability, and the consequences of failure, to provide a cost-effective program.

  18. Spacelab Science Results Study

    Science.gov (United States)

    Naumann, R. J.; Lundquist, C. A.; Tandberg-Hanssen, E.; Horwitz, J. L.; Germany, G. A.; Cruise, J. F.; Lewis, M. L.; Murphy, K. L.

    2009-01-01

    Beginning with OSTA-1 in November 1981 and ending with Neurolab in March 1998, a total of 36 Shuttle missions carried various Spacelab components such as the Spacelab module, pallet, instrument pointing system, or mission peculiar experiment support structure. The experiments carried out during these flights included astrophysics, solar physics, plasma physics, atmospheric science, Earth observations, and a wide range of microgravity experiments in life sciences, biotechnology, materials science, and fluid physics which includes combustion and critical point phenomena. In all, some 764 experiments were conducted by investigators from the U.S., Europe, and Japan. The purpose of this Spacelab Science Results Study is to document the contributions made in each of the major research areas by giving a brief synopsis of the more significant experiments and an extensive list of the publications that were produced. We have also endeavored to show how these results impacted the existing body of knowledge, where they have spawned new fields, and if appropriate, where the knowledge they produced has been applied.

  19. Proposal for a Similar Question Search System on a Q&A Site

    Directory of Open Access Journals (Sweden)

    Katsutoshi Kanamori

    2014-06-01

    There is a service to help Internet users obtain answers to specific questions posted to a Q&A site. A Q&A site is very useful for Internet users, but posted questions are often not answered immediately, because in most cases another site user must answer each question manually. In this study, we propose a system that can present questions similar to the one a user has posted, so that the user can refer to the answers those similar questions have already received. The system measures the similarity of candidate questions based on words and dependency parsing. In an experiment, we examined the effectiveness of the proposed system on questions actually posted to a Q&A site. The results indicate that the system can show the questioner the answer to a similar question, although a number of aspects of the system should still be improved.
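
    The paper's similarity measure combines words and dependency parsing; as a rough stand-in for the word-based component only, a bag-of-words cosine similarity can rank previously answered questions. A minimal sketch (function names and the candidate list are illustrative; the dependency-parsing component is omitted):

```python
from collections import Counter
import math

def cosine_similarity(a, b):
    """Cosine similarity between bag-of-words vectors of two questions."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def most_similar(posted, candidates):
    # Rank previously answered questions by similarity to the new post.
    return max(candidates, key=lambda q: cosine_similarity(posted, q))

candidates = ["how to reset a forgotten password", "best pizza recipe in town"]
print(most_similar("how do i reset my password", candidates))
```

    A production system would replace the whitespace tokenizer with proper morphological analysis and add the dependency-structure comparison the paper describes.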

  20. Review of analytical results from the proposed agent disposal facility site, Aberdeen Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Reed, L.L.; Myers, S.W.; Shepard, L.T.; Sydelko, T.G.

    1997-09-01

    Argonne National Laboratory reviewed the analytical results from 57 composite soil samples collected in the Bush River area of Aberdeen Proving Ground, Maryland. A suite of 16 analytical tests involving 11 different SW-846 methods was used to detect a wide range of organic and inorganic contaminants. One method (BTEX) was considered redundant, and two "single-number" methods (TPH and TOX) were found to lack the specificity required to yield unambiguous results, especially in a preliminary investigation. Volatile analytes detected at the site include 1,1,2,2-tetrachloroethane, trichloroethylene, and tetrachloroethylene, all of which probably represent residual site contamination from past activities. Other volatile analytes detected include toluene, tridecane, methylene chloride, and trichlorofluoromethane. These compounds are probably not associated with site contamination but likely represent cross-contamination or, in the case of tridecane, a naturally occurring material. Semivolatile analytes detected include three different phthalates and low part-per-billion amounts of the pesticide DDT and its degradation product DDE. The pesticide could represent residual site contamination from past activities, and the phthalates are likely due, in part, to cross-contamination during sample handling. A number of high-molecular-weight hydrocarbons and hydrocarbon derivatives were detected; these are probably naturally occurring compounds. 4 refs., 1 fig., 8 tabs.

  1. Electrochemical, surface analytical and quantum chemical studies ...

    Indian Academy of Sciences (India)

    subject of numerous studies due to their high technological value and wide range .... Mulliken population analysis of atoms in triazole derivatives, depending on the ... 2102–0003) with an accelerating voltage of 20 kV, at a scan speed=slow 5 and ... the corrosion rate can also be determined by Tafel extrapolation of either ...

  2. Analytical electron microscope study of eight ataxites

    Science.gov (United States)

    Novotny, P. M.; Goldstein, J. I.; Williams, D. B.

    1982-01-01

    Optical and electron optical (SEM, TEM, AEM) techniques were employed to investigate the fine structure of eight ataxite iron meteorites. Structural studies indicated that the ataxites can be divided into two groups: a Widmanstaetten decomposition group and a martensite decomposition group. The Widmanstaetten decomposition group has a Type I plessite microstructure, and the central taenite regions contain highly dislocated lath martensite. The steep M-shaped Ni gradients in the taenite are consistent with the fast cooling rates, not less than 500 C/My, observed for this group. The martensite decomposition group has a Type III plessite microstructure and contains all the chemical group IVB ataxites. The maximum taenite Ni contents vary from 47.5 to 52.7 wt % and are consistent with slow cooling to low temperatures of not greater than 350 C at cooling rates of not greater than 25 C/My.

  3. Analytical study of anisotropic compact star models

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, B.V. [Bulgarian Academy of Science, Institute for Nuclear Research and Nuclear Energy, Sofia (Bulgaria)

    2017-11-15

    A simple classification is given of the anisotropic relativistic star models, resembling the one of charged isotropic solutions. On the ground of this database, and taking into account the conditions for physically realistic star models, a method is proposed for generating all such solutions. It is based on the energy density and the radial pressure as seeding functions. Numerous relations between the realistic conditions are found and the need for a graphic proof is reduced just to one pair of inequalities. This general formalism is illustrated with an example of a class of solutions with linear equation of state and simple energy density. It is found that the solutions depend on three free constants and concrete examples are given. Some other popular models are studied with the same method. (orig.)

  4. An analytical study of slug impact phenomena

    International Nuclear Information System (INIS)

    Smith, B.L.

    1983-05-01

    Following an HCDA (hypothetical core disruptive accident), the roof of a fast reactor may be threatened by coolant impact. This article aims to develop, at a fundamental level, an understanding of the impact process and to assess the relevance and magnitude of fluid-structure interaction effects. Reference is made to four 1/30th-scale experiments, set up to verify the ideas developed in this work and to provide quality data for code validation purposes. The impact of a one-dimensional liquid slug on a solid slab is investigated using a simplified form of the Rankine-Hugoniot shock equations derived under the joint assumptions of slight compressibility and small Mach number. Initially the roof slab is considered to be freely supported and of finite thickness. A detailed picture of the shock and expansion wave propagations is built up from the basic equations, including the effects of wave reflections at boundaries and wave-wave interactions. Particular attention is paid to the impulse transfer mechanism from the slug, as this controls the roof slab acceleration. Bulk fluid cavitation effects are noted. Roof flexural response is then taken into account, together with the effects of the hold-down constraints. It is seen that even very minor structural responses can result in significant mitigation of the impulse loading. Guidelines for the application of the work to HCDA analysis in pool reactor geometries are presented. (Auth.)
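
    In the slightly compressible, small-Mach-number limit described above, the peak pressure of a 1D liquid slug impacting a rigid wall reduces to the classical acoustic (Joukowsky) water-hammer estimate p = ρcu. A minimal sketch with illustrative liquid-sodium property values (assumed for illustration, not taken from the report):

```python
def water_hammer_pressure(rho, c, u):
    """Acoustic (Joukowsky) estimate of slug impact pressure, p = rho * c * u."""
    return rho * c * u

# Illustrative values for liquid sodium (assumed, not from the report):
rho = 850.0   # density, kg/m^3
c = 2300.0    # sound speed, m/s
u = 10.0      # slug impact velocity, m/s

p = water_hammer_pressure(rho, c, u)
print(p / 1e6)  # peak pressure in MPa -> 19.55
```

    The report's point is that wave reflections, cavitation, and roof flexibility all reduce the impulse actually transferred relative to this rigid-wall upper estimate.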

  5. A mathematical framework for virtual IMRT QA using machine learning.

    Science.gov (United States)

    Valdes, G; Scheuermann, R; Hung, C Y; Olszanski, A; Bellerive, M; Solberg, T D

    2016-07-01

    It is common practice to perform patient-specific pretreatment verifications prior to the clinical delivery of IMRT. This process can be time-consuming and not altogether instructive, due to the myriad sources that may produce a failing result. The purpose of this study was to develop an algorithm capable of predicting IMRT QA passing rates a priori. A total of 498 IMRT plans from all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding-window technique on Clinac iX or TrueBeam Linacs. Passing rates at 3%/3 mm local dose/distance-to-agreement (DTA) were recorded using a commercial 2D diode array. Each plan was characterized by 78 metrics that describe different aspects of plan complexity that could lead to disagreements between the calculated and measured dose. A Poisson regression with Lasso regularization was trained to learn the relation between the plan characteristics and the passing rates. Passing rates at 3%/3 mm local dose/DTA can be predicted with an error smaller than 3% for all plans analyzed. The most important metrics for describing the passing rates were determined to be the MU factor (MU per Gy), small aperture score, irregularity factor, and fraction of the plan delivered at the corners of a 40 × 40 cm field; the higher the value of these metrics, the worse the passing rates. The virtual QA process predicts IMRT passing rates with high accuracy, allows the detection of failures due to setup errors, and is sensitive enough to detect small differences between matched Linacs.
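
    The abstract names the model but not its objective; as a minimal illustration, a Lasso-regularized Poisson regression minimizes the negative Poisson log-likelihood plus an L1 penalty on the weights. A sketch with made-up feature vectors (the 78 plan-complexity metrics and the actual fitting procedure are not reproduced here):

```python
import math

def poisson_lasso_loss(w, X, y, lam):
    """Negative Poisson log-likelihood (up to a y!-constant) plus an L1 penalty.

    w: weight vector; X: rows of plan-complexity features; y: observed counts
    (e.g. failing diode points); lam: Lasso regularization strength.
    """
    loss = 0.0
    for xi, yi in zip(X, y):
        eta = sum(wj * xj for wj, xj in zip(w, xi))  # linear predictor
        loss += math.exp(eta) - yi * eta             # Poisson NLL term
    return loss + lam * sum(abs(wj) for wj in w)

# Toy data: two "plans", one feature each (illustrative only).
X, y = [[1.0], [2.0]], [2, 4]
print(poisson_lasso_loss([0.5], X, y, lam=0.1))
```

    The L1 term is what drives most of the 78 metric weights to zero, leaving the handful of influential metrics the study reports.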

  6. Analytical method and result of radiation exposure for depressurization accident of HTTR

    International Nuclear Information System (INIS)

    Sawa, K.; Shiozawa, S.; Mikami, H.

    1990-01-01

    The Japan Atomic Energy Research Institute (JAERI) is now proceeding with the construction design of the High Temperature Engineering Test Reactor (HTTR). Since the HTTR has some characteristics different from those of LWRs, the analytical methods of radiation exposure in accidents developed for LWRs cannot be applied directly. This paper describes the analytical method of radiation exposure developed by JAERI for the depressurization accident, which is the severest accident with respect to radiation exposure among the design basis accidents of the HTTR. The result is also described in this paper

  7. Tank 241-S-102, Core 232 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    STEEN, F.H.

    1998-11-04

    This document is the analytical laboratory report for tank 241-S-102 push mode core segments collected between March 5, 1998 and April 2, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-S-102 Retained Gas Sampler System Sampling and Analysis Plan (TSAP) (McCain, 1998), Letter of Instruction for Compatibility Analysis of Samples from Tank 241-S-102 (LOI) (Thompson, 1998) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) (Mulkey and Miller, 1998). The analytical results are included in the data summary table (Table 1).

  8. Analytical study in 1D nuclear waste migration

    International Nuclear Information System (INIS)

    Perez Guerrero, Jesus S.; Heilbron Filho, Paulo L.; Romani, Zrinka V.

    1999-01-01

    The simulation of nuclear waste migration phenomena is governed mainly by a diffusive-convective equation that includes the effects of hydrodynamic dispersion (mechanical dispersion and molecular diffusion), radioactive decay, and chemical interaction. For some special problems (depending on the boundary conditions, and when the domain is considered infinite or semi-infinite), an analytical solution may be obtained using classical analytical methods such as the Laplace transform or separation of variables. The hybrid Generalized Integral Transform Technique (GITT) is a powerful tool that can be applied to solve diffusive-convective linear problems to obtain formal analytical solutions. The aim of this work is to illustrate that the GITT may be used to obtain a formal analytical solution for the study of migration of radioactive waste in saturated-flow porous media. A test case considering the 241Am radionuclide is presented. (author)

  9. Tank 241-BY-109, cores 201 and 203, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-BY-109 push mode core segments collected between June 6, 1997 and June 17, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (Bell, 1997) and the Tank Safety Screening Data Quality Objective (Dukelow et al., 1995). The analytical results are included

  10. Interpretation of results for tumor markers on the basis of analytical imprecision and biological variation

    DEFF Research Database (Denmark)

    Sölétormos, G; Schiøler, V; Nielsen, D

    1993-01-01

    Interpretation of results for CA 15.3, carcinoembryonic antigen (CEA), and tissue polypeptide antigen (TPA) during breast cancer monitoring requires data on intra- (CVP) and inter- (CVG) individual biological variation, analytical imprecision (CVA), and indices of individuality. The average CVP...
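
    The indices of individuality mentioned above are conventionally computed as sqrt(CVA² + CVP²)/CVG, and serial tumor-marker results are often compared against a reference change value built from the same components. A minimal sketch (the z-value and example CVs are illustrative, not the study's estimates for CA 15.3, CEA, or TPA):

```python
import math

def index_of_individuality(cv_a, cv_p, cv_g):
    """Index of individuality: sqrt(CVA^2 + CVP^2) / CVG.

    Values well below 1 indicate that population-based reference intervals
    are insensitive for monitoring an individual patient."""
    return math.sqrt(cv_a ** 2 + cv_p ** 2) / cv_g

def reference_change_value(cv_a, cv_p, z=1.96):
    """Smallest % difference between serial results significant at level z."""
    return math.sqrt(2) * z * math.sqrt(cv_a ** 2 + cv_p ** 2)

# Illustrative CVs in %: analytical 3, intra-individual 4, inter-individual 10.
print(index_of_individuality(3.0, 4.0, 10.0))        # -> 0.5
print(round(reference_change_value(3.0, 4.0), 1))    # RCV in %
```
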

  11. Arnol'd tongues for a resonant injection-locked frequency divider: analytical and numerical results

    DEFF Research Database (Denmark)

    Bartuccelli, Michele; Deane, Jonathan H.B.; Gentile, Guido

    2010-01-01

    …Arnol'd tongues in the frequency–amplitude plane. In particular, we provide exact analytical formulae for the widths of the tongues, which correspond to the plateaux of the devil's staircase picture. The results account for numerical and experimental findings presented in the literature for special driving terms...

  12. Resonant particle production during inflation: a full analytical study

    Energy Technology Data Exchange (ETDEWEB)

    Pearce, Lauren; Peloso, Marco [School of Physics and Astronomy, University of Minnesota, 116 Church Street S.E., Minneapolis, MN 55455 (United States); Sorbo, Lorenzo, E-mail: lpearce@physics.umn.edu, E-mail: peloso@physics.umn.edu, E-mail: sorbo@physics.umass.edu [Amherst Center for Fundamental Interactions, Department of Physics, University of Massachusetts, 1126 Lederle Graduate Research Tower (LGRT), Amherst, MA 01003 (United States)

    2017-05-01

    We revisit the study of the phenomenology associated to a burst of particle production of a field whose mass is controlled by the inflaton field and vanishes at one given instance during inflation. This generates a bump in the correlators of the primordial scalar curvature. We provide a unified formalism to compute various effects that have been obtained in the literature and confirm that the dominant effects are due to the rescattering of the produced particles on the inflaton condensate. We improve over existing results (based on numerical fits) by providing exact analytic expressions for the shape and height of the bump, both in the power spectrum and the equilateral bispectrum. We then study the regime of validity of the perturbative computations of this signature. Finally, we extend these computations to the case of a burst of particle production in a sector coupled only gravitationally to the inflaton.

  13. SU-F-T-226: QA Management for a Large Institution with Multiple Campuses for FMEA

    Energy Technology Data Exchange (ETDEWEB)

    Tang, G; Chan, M; Lovelock, D; Lim, S; Febo, R; DeLauter, J; Both, S; Li, X; Ma, R; Saleh, Z; Song, Y; Tang, X; Xiong, W; Hunt, M; LoSasso, T [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2016-06-15

    Purpose: To redesign our radiation therapy QA program with the goal of improving quality, efficiency, and consistency among a growing number of campuses at a large institution. Methods: A QA committee was established with at least one physicist representing each of our six campuses (22 linacs). Weekly meetings were scheduled to advise on and update current procedures, to review end-to-end and other test results, and to prepare composite reports for internal and external audits. QA procedures for treatment and imaging equipment were derived from TG Reports 142 and 66, practice guidelines, and feedback from ACR evaluations. The committee focused on reaching a consensus on a single QA program among all campuses using the same type of equipment and reference data. Since the recommendations for tolerances referenced to baseline data were subject to interpretation in some instances, the committee reviewed the characteristics of all machines and quantified any variations before choosing between treatment planning system data (i.e., treatment planning system commissioning data that are representative of all machines) or machine-specific values (i.e., commissioning data of the individual machines) as baseline data. Results: The configured QA program will be followed strictly by all campuses. An inventory of available equipment has been compiled, and additional equipment acquisitions for the QA program are made as needed. Dosimetric characteristics are evaluated for all machines using the same methods to ensure consistency of beam data where possible. In most cases, baseline data refer to treatment planning system commissioning data, but machine-specific values are used as reference where deemed appropriate. Conclusion: With a uniform QA scheme, variations in QA procedures are kept to a minimum. With a centralized database, data collection and analysis are simplified. This program will facilitate uniformity in patient treatments and the analysis of large amounts of QA data across campuses.

  14. The Teacher's Role in Quality Classroom Interactions: Q&A with Dr. Drew Gitomer. REL Mid-Atlantic Teacher Effectiveness Webinar Series

    Science.gov (United States)

    Regional Educational Laboratory Mid-Atlantic, 2013

    2013-01-01

    In this webinar, Dr. Drew Gitomer, professor at Rutgers University, shared results from recent studies of classroom observations that helped participants understand both general findings about the qualities of classroom interactions and also the challenges to carrying out valid and reliable observations. This Q&A addressed the questions…

  15. Advances in classical and analytical mechanics: A reviews of author’s results

    Directory of Open Access Journals (Sweden)

    Hedrih-Stevanović Katica R.

    2013-01-01

    A review, reflecting the author's own selection, of the author's scientific results in the areas of classical mechanics, analytical mechanics of discrete hereditary systems, analytical mechanics of discrete fractional-order system vibrations, elastodynamics, nonlinear dynamics, and hybrid system dynamics is presented. The main original results are presented through the mathematical methods of mechanics, with examples of applications to solving problems of real mechanical system dynamics abstracted to theoretical models of discrete or continuum mechanical systems, as well as hybrid systems. The paper also presents a series of methods and scientific results authored by professors Mitropolyski, Andjelić, and Rašković, as well as the author's original research results obtained by the methods of her professors. The vector method, based on mass inertia moment vectors and the corresponding deviational vector components for a pole and an oriented axis, defined in 1991 by K. Hedrih, is presented. Results in the construction of the analytical dynamics of hereditary discrete systems, obtained in collaboration with O. A. Gorosho, are presented, as are selected results of the author's postgraduate students and doctoral candidates in the area of nonlinear dynamics. A list of scientific projects headed by the author is given, together with a list of the doctoral dissertations and master of science theses containing scientific research results obtained under the supervision of the author or of her first doctoral candidates. [Projekat Ministarstva nauke Republike Srbije, br. ON174001: Dynamics of hybrid systems with complex structures

  16. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

    As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends consideration of the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends consideration of an alternate analytical method for qualification of boric acid concentrations.

  17. TU-FG-201-01: 18-Month Clinical Experience of a Linac Daily Quality Assurance (QA) Solution Using Only EPID and OBI

    Energy Technology Data Exchange (ETDEWEB)

    Cai, B; Sun, B; Yaddanapudi, S; Goddu, S; Li, H; Caruthers, D; Kavanaugh, J; Mutic, S [Washington University School of Medicine, Saint Louis, MO (United States)

    2016-06-15

    Purpose: To describe the clinical use of a linear accelerator (Linac) daily QA system using only EPID and OBI, to assess its reliability over an 18-month period, and to improve the robustness of the system based on analysis of QA failures. Methods: A daily QA solution utilizing an in-house-designed phantom, combined EPID and OBI image acquisitions, and a web-based data analysis and reporting system was commissioned and used in our clinic to measure the geometric, dosimetric, and imaging performance of a Varian TrueBeam Linac. Over an 18-month period (335 working days), the daily QA results, including output constancy, beam flatness and symmetry, uniformity, TPR20/10, and MV and kV imaging quality, were collected and analyzed. For the output constancy measurement, an independent monthly QA system with an ionization chamber (IC) and annual/incidental TG-51 measurements with an ADCL IC were performed and cross-compared with the daily QA system. Thorough analyses were performed on the recorded QA failures to evaluate machine performance, optimize the data analysis algorithm, adjust the tolerance settings, and improve the training procedure to prevent future failures. Results: A clinical workflow including beam delivery, data analysis, QA report generation, and physics approval was established and optimized to suit daily clinical operation. The output tests over the 335-working-day period cross-correlated with the monthly QA system within 1.3% and with TG-51 results within 1%. QA passed on the first attempt on 236 of the 335 days. Based on the analysis of QA failures, the Gamma criterion was revised from (1%, 1 mm) to (2%, 1 mm), considering both QA accuracy and efficiency, and the data analysis algorithm was improved to handle multiple entries for a repeated test. Conclusion: We described our 18-month clinical experience with a novel daily QA system using only EPID and OBI. The long-term data presented demonstrate that the system is suitable and reliable for Linac daily QA.
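
    The Gamma criteria cited above combine a dose-difference tolerance with a distance-to-agreement (DTA). For a 1D profile, the index can be sketched as follows (a simplified globally normalized version for illustration; real QA systems use 2D/3D dose grids with interpolation):

```python
import math

def gamma_1d(ref, meas, spacing=1.0, dose_tol=0.02, dta_tol=1.0):
    """Simplified 1D global Gamma: for each reference point, minimize the
    combined dose-difference / distance-to-agreement metric over all
    measured points. dose_tol is fractional (0.02 = 2%); dta_tol and
    spacing are in mm."""
    d_max = max(ref)  # global normalization dose
    gammas = []
    for i, dr in enumerate(ref):
        best = min(
            math.sqrt(((dm - dr) / (dose_tol * d_max)) ** 2
                      + ((j - i) * spacing / dta_tol) ** 2)
            for j, dm in enumerate(meas)
        )
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    """Percent of points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

    Loosening the criterion from (1%, 1 mm) to (2%, 1 mm), as described above, simply doubles `dose_tol`, so borderline dose deviations that previously exceeded gamma = 1 now pass.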

  18. Analytical and numerical studies of creation probabilities of hierarchical trees

    Directory of Open Access Journals (Sweden)

    S.S. Borysov

    2011-03-01

    We consider the creation conditions of diverse hierarchical trees both analytically and numerically. A connection between the probabilities of creating hierarchical levels and the probability of associating these levels into a united structure is studied. We argue that a consistent probabilistic picture requires the use of deformed algebra. Our consideration is based on the study of the main types of hierarchical trees, among which both regular and degenerate ones are studied analytically, while the creation probabilities of Fibonacci, scale-free, and arbitrary trees are determined numerically.

  19. Per-beam, planar IMRT QA passing rates do not predict clinically relevant patient dose errors

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E.; Zhen Heming; Tome, Wolfgang A. [Canis Lupus LLC and Department of Human Oncology, University of Wisconsin, Merrimac, Wisconsin 53561 (United States); Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 (United States); Departments of Human Oncology, Medical Physics, and Biomedical Engineering, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2011-02-15

    Purpose: The purpose of this work is to determine the statistical correlation between per-beam, planar IMRT QA passing rates and several clinically relevant, anatomy-based dose errors for per-patient IMRT QA. The intent is to assess the predictive power of a common conventional IMRT QA performance metric, the Gamma passing rate per beam. Methods: Ninety-six unique data sets were created by inducing four types of dose errors in 24 clinical head and neck IMRT plans, each planned with 6 MV Varian 120-leaf MLC linear accelerators using a commercial treatment planning system and step-and-shoot delivery. The error-free beams/plans were used as "simulated measurements" (for generating the IMRT QA dose planes and the anatomy dose metrics) to compare to the corresponding data calculated by the error-induced plans. The degree of the induced errors was tuned to mimic IMRT QA passing rates that are commonly achieved using conventional methods. Results: Analysis of clinical metrics (parotid mean doses, spinal cord max and D1cc, CTV D95, and larynx mean) vs IMRT QA Gamma analysis (3%/3 mm, 2/2, 1/1) showed that in all cases there were only weak to moderate correlations (range of Pearson's r-values: -0.295 to 0.653). Moreover, the moderate correlations actually had positive Pearson's r-values (i.e., clinically relevant metric differences increased with increasing IMRT QA passing rate), indicating that some of the largest anatomy-based dose differences occurred in the cases of high IMRT QA passing rates, which may be called "false negatives." The results also show numerous instances of false positives, or cases where low IMRT QA passing rates do not imply large errors in anatomy dose metrics. In none of the cases was there correlation consistent with high predictive power of planar IMRT passing rates, i.e., in none of the cases did high IMRT QA Gamma passing rates predict low errors in anatomy dose metrics or vice versa.
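
    The reported r-values are ordinary Pearson correlation coefficients between per-beam passing rates and anatomy dose-metric differences; computing r is straightforward. A self-contained sketch (the sample values below are invented, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented example: passing rates (%) vs. CTV D95 error (%). A weak,
# even positive, r is exactly the kind of result the study reports.
rates = [99.2, 98.7, 96.5, 94.1, 99.8]
errors = [1.2, 0.4, 0.9, 0.6, 1.5]
print(round(pearson_r(rates, errors), 3))
```
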


  1. Twist-2 at seven loops in planar N=4 SYM theory: full result and analytic properties

    Energy Technology Data Exchange (ETDEWEB)

    Marboe, Christian [School of Mathematics, Trinity College Dublin,College Green, Dublin 2 (Ireland); Institut für Mathematik und Institut für Physik, Humboldt-Universität zu Berlin,IRIS Adlershof, Zum Großen Windkanal 6, 12489 Berlin (Germany); Velizhanin, Vitaly [Theoretical Physics Division, NRC “Kurchatov Institute”,Petersburg Nuclear Physics Institute, Orlova Roscha,Gatchina, 188300 St. Petersburg (Russian Federation); Institut für Mathematik und Institut für Physik, Humboldt-Universität zu Berlin,IRIS Adlershof, Zum Großen Windkanal 6, 12489 Berlin (Germany)

    2016-11-04

    The anomalous dimension of twist-2 operators of arbitrary spin in planar N=4 SYM theory is found at seven loops by using the quantum spectral curve to compute values at fixed spin, and reconstructing the general result using the LLL-algorithm together with modular arithmetic. The result of the analytic continuation to negative spin is presented, and its relation with the recently computed correction to the BFKL and double-logarithmic equation is discussed.

  2. Improving the trust in results of numerical simulations and scientific data analytics

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States); Constantinescu, Emil [Argonne National Lab. (ANL), Argonne, IL (United States); Hovland, Paul [Argonne National Lab. (ANL), Argonne, IL (United States); Peterka, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Phillips, Carolyn [Argonne National Lab. (ANL), Argonne, IL (United States); Snir, Marc [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, Stefan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-30

    This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results' integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results such as V&V, uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general

  3. Analytical study on model tests of soil-structure interaction

    International Nuclear Information System (INIS)

    Odajima, M.; Suzuki, S.; Akino, K.

    1987-01-01

    Since nuclear power plant (NPP) structures are stiff, heavy and partly embedded, the behavior of those structures during an earthquake depends on the vibrational characteristics of not only the structure but also the soil. Accordingly, seismic response analyses considering the effects of soil-structure interaction (SSI) are extremely important for the seismic design of NPP structures. Many studies have been conducted on analytical techniques concerning SSI, and various analytical models and approaches have been proposed. Based on these studies, SSI analytical codes (computer programs) for NPP structures have been improved at JINS (Japan Institute of Nuclear Safety), one of the departments of NUPEC (Nuclear Power Engineering Test Center) in Japan. These codes are a soil-spring lumped-mass code (SANLUM), a finite element code (SANSSI), and a thin-layered element code (SANSOL). In proceeding with the improvement of the analytical codes, in-situ large-scale forced vibration SSI tests were performed using models simulating light water reactor buildings, and simulation analyses were performed to verify the codes. This paper presents an analytical study to demonstrate the usefulness of the codes

  4. QA CLASSIFICATION ANALYSIS OF GROUND SUPPORT SYSTEMS

    International Nuclear Information System (INIS)

    D. W. Gwyn

    1996-01-01

    The purpose and objective of this analysis is to determine if the permanent function Ground Support Systems (CI: BABEEOOOO) are quality-affecting items and if so, to establish the appropriate Quality Assurance (QA) classification

  5. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    International Nuclear Information System (INIS)

    Wahi-Anwar, M; Lo, P; Kim, H; Brown, M; McNitt-Gray, M

    2016-01-01

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms, across 3 phantom types using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, yielding a similarity metric value for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the best local mean similarity, whose local neighboring slices also meet the threshold requirement, is chosen as the classifier’s matched slice (if one exists). The classifier whose matched slice possesses the best local mean similarity is then chosen as the ensemble’s best matching slice. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matched classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on the 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully-automated CT phantom QA system consistent with manual QA performance. Further work will include parallel
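The slice-matching ensemble described above can be sketched roughly as follows. The similarity metric (normalized cross-correlation), the 3-slice neighbourhood, and all names are illustrative assumptions, since the abstract does not specify them:

```python
import numpy as np

def slice_similarity(slice_a, slice_b):
    """Normalized cross-correlation between two equally sized slices
    (an assumed similarity metric; the abstract does not name one)."""
    a = slice_a - slice_a.mean()
    b = slice_b - slice_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_phantom(volume, classifiers):
    """Return (phantom_name, slice_index) of the best-matching classifier,
    or None if no classifier's threshold is met.

    `classifiers` maps phantom-type name -> (template_slice, threshold).
    Each slice of the input volume is compared to the template; the slice
    with the best local (3-slice) mean similarity that passes the
    pre-trained threshold becomes that classifier's matched slice, and the
    classifier with the best such score wins the ensemble vote.
    """
    best = None  # (local_mean_similarity, name, slice_index)
    for name, (template, threshold) in classifiers.items():
        sims = np.array([slice_similarity(s, template) for s in volume])
        local = np.convolve(sims, np.ones(3) / 3, mode="same")
        idx = int(np.argmax(local))
        if local[idx] >= threshold and (best is None or local[idx] > best[0]):
            best = (local[idx], name, idx)
    return (best[1], best[2]) if best else None
```

A usage sketch: build one classifier per phantom type, pass a scanned volume, and dispatch the matched type's QA measurement routine only when a match is returned.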


  7. An analytical model for nanoparticles concentration resulting from infusion into poroelastic brain tissue.

    Science.gov (United States)

    Pizzichelli, G; Di Michele, F; Sinibaldi, E

    2016-02-01

    We consider the infusion of a diluted suspension of nanoparticles (NPs) into poroelastic brain tissue, in view of relevant biomedical applications such as intratumoral thermotherapy. Indeed, the high impact of the related pathologies motivates the development of advanced therapeutic approaches, whose design also benefits from theoretical models. This study provides an analytical expression for the time-dependent NPs concentration during infusion into poroelastic brain tissue, which also accounts for particle binding onto cells (by recalling relevant results from colloid filtration theory). Our model is computationally inexpensive and, compared to fully numerical approaches, explicitly elucidates the role of the involved physical aspects (tissue poroelasticity, infusion parameters, NPs physico-chemical properties, and the NP-tissue interactions underlying binding). We also present illustrative results based on parameters taken from the literature, considering clinically relevant ranges for the infusion parameters. Moreover, we thoroughly assess the model's working assumptions and discuss its limitations. While laying no claim to generality, our model can be used to support the development of more ambitious numerical approaches, towards the preliminary design of novel therapies based on NPs infusion into brain tissue. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Analytical and Numerical Studies of Sloshing in Tanks

    Energy Technology Data Exchange (ETDEWEB)

    Solaas, F

    1996-12-31

    For oil cargo ship tanks and liquid natural gas carriers, the dimensions of the tanks are often such that the highest resonant sloshing periods and the ship motions are in the same period range, which may cause violent resonant sloshing of the liquid. In this doctoral thesis, linear and non-linear analytical potential theory solutions of the sloshing problem are studied for a two-dimensional rectangular tank and a vertical circular cylindrical tank, using perturbation technique for the non-linear case. The tank is forced to oscillate harmonically with small amplitudes of sway with frequency in the vicinity of the lowest natural frequency of the fluid inside the tank. The method is extended to other tank shapes using a combined analytical and numerical method. A boundary element numerical method is used to determine the eigenfunctions and eigenvalues of the problem. These are used in the non-linear analytical free surface conditions, and the velocity potential and free surface elevation for each boundary value problem in the perturbation scheme are determined by the boundary element method. Both the analytical method and the combined analytical and numerical method are restricted to tanks with vertical walls in the free surface. The suitability of a commercial programme, FLOW-3D, to estimate sloshing is studied. It solves the Navier-Stokes equations by the finite difference method. The free surface as function of time is traced using the fractional volume of fluid method. 59 refs., 54 figs., 37 tabs.
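The linear potential-theory dispersion relation underlying such sloshing analyses can be evaluated directly. The sketch below uses the standard textbook result for a rectangular tank, omega_n^2 = g * k_n * tanh(k_n * h) with k_n = n*pi/L; this relation is quoted for illustration and is not taken from the thesis itself:

```python
import math

def sloshing_frequencies(length_m, depth_m, n_modes=3, g=9.81):
    """Natural sloshing angular frequencies (rad/s) of a rectangular tank
    with free-surface length L and liquid depth h, from linear potential
    theory: omega_n^2 = g * k_n * tanh(k_n * h), k_n = n*pi/L."""
    freqs = []
    for n in range(1, n_modes + 1):
        k = n * math.pi / length_m
        freqs.append(math.sqrt(g * k * math.tanh(k * depth_m)))
    return freqs

# Example tank (hypothetical dimensions): resonance risk arises when the
# lowest sloshing period 2*pi/omega_1 falls in the range of ship motions.
omega = sloshing_frequencies(10.0, 5.0)
lowest_period_s = 2 * math.pi / omega[0]
```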


  10. Discordant Analytical Results Caused by Biotin Interference on Diagnostic Immunoassays in a Pediatric Hospital.

    Science.gov (United States)

    Ali, Mahesheema; Rajapakshe, Deepthi; Cao, Liyun; Devaraj, Sridevi

    2017-09-01

    Recent studies have reported that biotin interferes with certain immunoassays. In this study, we evaluated the analytical interference of biotin on immunoassays that use streptavidin-biotin in our pediatric hospital. We tested the effect of different concentrations of biotin (1.5-200 ng/mL) on the TSH, Prolactin, Ferritin, CK-MB, β-hCG, Troponin I, LH, FSH, Cortisol, and Anti-HAV antibody (IgG and IgM) assays on the Ortho Clinical Diagnostics Vitros 5600 Analyzer. Biotin (up to 200 ng/mL) did not significantly affect the Troponin I and HAV assays. Biotin concentrations >6.25 ng/mL significantly affected the TSH assay (>20% bias). Prolactin was significantly affected even at low biotin levels (1.5 ng/mL). Thus, we recommend educating physicians about biotin interference in common immunoassays and adding an electronic disclaimer. © 2017 by the Association of Clinical Scientists, Inc.

  11. An analytical model for backscattered luminance in fog: comparisons with Monte Carlo computations and experimental results

    International Nuclear Information System (INIS)

    Taillade, Frédéric; Dumont, Eric; Belin, Etienne

    2008-01-01

    We propose an analytical model for backscattered luminance in fog and derive an expression for the visibility signal-to-noise ratio as a function of meteorological visibility distance. The model uses single-scattering processes. It is based on Mie theory and the geometry of the optical device (emitter and receiver). In particular, we present an overlap function and take the phase function of fog into account. The backscattered luminance obtained with our analytical model is compared to simulations made using the Monte Carlo method based on multiple-scattering processes. Excellent agreement is found, in that the discrepancy between the results is smaller than the Monte Carlo standard uncertainties. If the geometry of the optical device is not taken into account, the model-estimated backscattered luminance differs from the simulations by a factor of 20. We also conclude that the signal-to-noise ratio computed with the Monte Carlo method and our analytical model is in good agreement with experimental results, since the mean difference between the calculations and experimental measurements is smaller than the experimental uncertainty

  12. The analytical and numerical study of the fluorination of uranium dioxide particles

    International Nuclear Information System (INIS)

    Sazhin, S.S.

    1997-01-01

    A detailed analytical study of the equations describing the fluorination of UO2 particles is presented for some limiting cases assuming that the mass flowrate of these particles is so small that they do not affect the state of the gas. The analytical solutions obtained can be used for approximate estimates of the effect of fluorination on particle diameter and temperature but their major application, however, is probably in the verification of self-consistent numerical solutions. Computational results are presented and discussed for a self-consistent problem in which both the effects of gas on particles and particles on gas are accounted for. It has been shown that in the limiting cases for which analytical solutions have been obtained, the coincidence between numerical and analytical results is almost exact. This can be considered as a verification of both the analytical and numerical solutions. (orig.)
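The verification strategy described above, checking a numerical solver against an analytical solution in a limiting case, can be illustrated generically. The toy first-order decay equation below stands in for the fluorination equations, which the abstract does not give; the solver and parameters are illustrative assumptions:

```python
import math

def integrate_rk4(f, y0, t_end, steps):
    """Classical 4th-order Runge-Kutta integration of dy/dt = f(t, y)."""
    h = t_end / steps
    t, y = 0.0, y0
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

# Illustrative limiting case: first-order shrinkage dD/dt = -k*D, whose
# exact solution D(t) = D0*exp(-k*t) plays the role of the analytical
# solution used to verify the numerical one.
k, D0, t_end = 0.8, 100e-6, 5.0
numerical = integrate_rk4(lambda t, D: -k * D, D0, t_end, 1000)
analytical = D0 * math.exp(-k * t_end)
```

Close agreement between the two values in the limiting case is what justifies trusting the solver in the fully coupled regime, where no closed-form solution exists.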

  13. Tank 241-AN-104, cores 163 and 164 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1997-01-01

    This document is the analytical laboratory report for tank 241-AN-104 push mode core segments collected between August 8, 1996 and September 12, 1996. The segments were subsampled and analyzed in accordance with the Tank 241-AN-104 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkelman, 1996), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Flammable Gas Data Quality Objective (DQO) (Benar, 1995). The analytical results are included in a data summary table. None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT), Total Organic Carbon (TOC) and Plutonium (239,240Pu) analyses exceeded notification limits as stated in the TSAP. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report

  14. Tank 241-AX-103, cores 212 and 214 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1998-01-01

    This document is the analytical laboratory report for tank 241-AX-103 push mode core segments collected between July 30, 1997 and August 11, 1997. The segments were subsampled and analyzed in accordance with the Tank 241-AX-103 Push Mode Core Sampling and Analysis Plan (TSAP) (Conner, 1997), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner, et al., 1995). The analytical results are included in the data summary table (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT), plutonium 239 (Pu239), and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Conner, 1997). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report

  15. Interactions between $U(1)$ Cosmic Strings: An Analytical Study

    OpenAIRE

    Bettencourt, L. M. A.; Rivers, R. J.

    1994-01-01

    We derive analytic expressions for the interaction energy between two general $U(1)$ cosmic strings as a function of their relative orientation and the ratio of the coupling constants in the model. The results are relevant to the statistical description of strings away from critical coupling and shed some light on the mechanisms involved in string formation and the evolution of string networks.

  16. Analytical results for entanglement in the five-qubit anisotropic Heisenberg model

    International Nuclear Information System (INIS)

    Wang Xiaoguang

    2004-01-01

    We solve the eigenvalue problem of the five-qubit anisotropic Heisenberg model, without use of Bethe's ansatz, and give analytical results for entanglement and mixedness of two nearest-neighbor qubits. The entanglement takes its maximum at Δ=1 (Δ>1) for the case of zero (finite) temperature with Δ being the anisotropic parameter. In contrast, the mixedness takes its minimum at Δ=1 (Δ>1) for the case of zero (finite) temperature
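Although the paper derives its results analytically, the same quantities can be checked numerically. The sketch below builds a five-qubit periodic anisotropic Heisenberg (XXZ) Hamiltonian and computes the Wootters concurrence of a nearest-neighbour pair in the lowest eigenstate; the ring geometry, uniform coupling, and normalization conventions are assumptions made for illustration:

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], complex)
sy = np.array([[0, -1j], [1j, 0]], complex)
sz = np.array([[1, 0], [0, -1]], complex)
I2 = np.eye(2, dtype=complex)

def kron_chain(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

def xxz_hamiltonian(n, delta):
    """Periodic anisotropic Heisenberg (XXZ) chain of n qubits:
    H = sum_i (sx_i sx_{i+1} + sy_i sy_{i+1} + delta * sz_i sz_{i+1})."""
    H = np.zeros((2**n, 2**n), complex)
    for i in range(n):
        j = (i + 1) % n
        for op, w in ((sx, 1.0), (sy, 1.0), (sz, delta)):
            ops = [I2] * n
            ops[i] = op
            ops[j] = op
            H += w * kron_chain(ops)
    return H

def nn_concurrence(n, delta):
    """Wootters concurrence of qubits (0, 1) in the lowest eigenstate."""
    vals, vecs = np.linalg.eigh(xxz_hamiltonian(n, delta))
    psi = vecs[:, 0].reshape(4, 2 ** (n - 2))
    rho = psi @ psi.conj().T          # reduced density matrix of qubits 0, 1
    yy = np.kron(sy, sy)
    R = rho @ yy @ rho.conj() @ yy    # rho * rho-tilde
    lam = np.sqrt(np.abs(np.sort(np.linalg.eigvals(R).real)[::-1]))
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])
```

Sweeping `delta` through values around 1 reproduces the kind of entanglement-versus-anisotropy curve the abstract discusses.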

  17. SU-E-T-422: Correlation Between 2D Passing Rates and 3D Dose Differences for Pretreatment VMAT QA

    International Nuclear Information System (INIS)

    Jin, X; Xie, C

    2014-01-01

    Purpose: Volumetric modulated arc therapy (VMAT) quality assurance (QA) typically uses QA methods and action levels taken from fixed-beam intensity-modulated radiotherapy (IMRT) QA. However, recent studies demonstrated that there is no correlation between the percent gamma passing rate (%GP) and the magnitude of dose discrepancy between the planned dose and the actual delivered dose for IMRT. The purpose of this study is to investigate whether %GP is correlated with clinical dosimetric differences for VMAT. Methods: Twenty nasopharyngeal cancer (NPC) patients treated with dual-arc simultaneous integrated boost VMAT and 20 esophageal cancer patients treated with one-arc VMAT were enrolled in this study. Pretreatment VMAT QA was performed with a 3D diode array (ArcCheck). Acceptance criteria of 2%/2mm, 3%/3mm, and 4%/4mm were applied for the 2D %GP. Dose values below 10% of the pre-measured normalization maximum dose were ignored. Mean DVH values obtained from 3DVH software and the TPS were compared and percentage dose differences were calculated. The statistical correlation between %GP and percent dose difference was studied using Pearson correlation. Results: The %GP for criteria 2%/2mm, 3%/3mm, and 4%/4mm were 82.33±4.45, 93.47±2.31, and 97.13±2.41, respectively. Dose differences calculated from 3DVH and the TPS for the beam isocenter, mean dose of PTV, maximum dose of PTV, D2 of PTV and D98 of PTV were -1.04±3.24, -0.74±1.71, 2.92±3.62, 0.89±3.29, and -1.46±1.97, respectively. No correlations were found between %GP and dose differences. Conclusion: There are only weak correlations between the 2D %GP and the dose differences calculated from 3DVH. The %GP acceptance criterion of 3%/3mm usually applied for pretreatment QA of IMRT and VMAT does not indicate a strong clinical correlation with the 3D dose difference. 3D dose reconstruction on the patient anatomy may be necessary for physicists to predict the accuracy of the delivered dose for VMAT QA
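The statistical test used above (Pearson correlation between 2D %GP and DVH-based dose differences) can be sketched directly. The sample data here are hypothetical stand-ins, not the study's measurements:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm * xm).sum() * (ym * ym).sum()))

# Hypothetical per-plan QA data: gamma passing rates (%, 3%/3mm criterion)
# and the corresponding PTV mean-dose differences (%) from a 3D
# reconstruction. A value of r near zero supports "no correlation".
gp = [93.1, 95.4, 91.8, 96.2, 94.0]
dd = [-0.7, 1.2, -2.1, 0.3, -0.9]
r = pearson_r(gp, dd)
```

In practice one would also report the p-value (e.g. via `scipy.stats.pearsonr`) before concluding that no correlation exists.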

  18. Influence of Hemolysis on Analytic Results of Nuclear Magnetic Resonance-based Metabonomics

    Directory of Open Access Journals (Sweden)

    Qiao LIU

    2015-09-01

    Objective: To explore the changes in small-molecule metabolites and their content in plasma samples due to hemolysis, so as to analyze the influence of hemolysis of plasma samples on metabonomic studies. Methods: Healthy adult males undergoing physical examination, with no recent history of drug administration, were selected, and 10 hemolytic plasma samples and 10 hemolysis-free samples were collected from them. Proton nuclear magnetic resonance (1H-NMR) spectra were collected, the Carr-Purcell-Meiboom-Gill (CPMG) pulse sequence was used to suppress the broad peaks produced by protein and lipid, and SIMCA-P+ 12.0 software was applied for pattern recognition and Pearson correlation analysis. Results: CPMG 1H-NMR plasma metabolism spectra showed that, compared with hemolysis-free samples, hemolytic samples were evidently higher in the contents of acetate, acetone and pyruvic acid, but markedly lower in glucose. In addition, the chemical shift of glycine -CH2 in the hemolysis group moved to the lower field. Orthogonal partial least-squares discriminant analysis (OPLS-DA) was further applied for pattern recognition, and the results demonstrated that the hemolysis group was prominently higher in the contents of metabolites such as leucine, valine, lysine, acetate, proline, acetone, pyruvic acid, creatine, creatinine, glycine, glycerol, serine and lactic acid, but obviously lower in the contents of isoleucine and glucose, than the hemolysis-free group. Pearson correlation analysis indicated that in hemolytic samples the contents of leucine, valine, lysine, proline, N-acetyl-glycoprotein, creatine, creatinine, glycerol and serine were higher, but that of isoleucine was lower. Conclusion: Hemolysis can change the content of multiple metabolites and influence the analytic results of metabonomics, so in practice hemolytic samples should be excluded from study.

  19. Analytical studies related to Indian PHWR containment system performance

    International Nuclear Information System (INIS)

    Haware, S.K.; Markandeya, S.G.; Ghosh, A.K.; Kushwaha, H.S.; Venkat Raj, V.

    1998-01-01

    Build-up of pressure in a multi-compartment containment after a postulated accident, the growth, transportation and removal of aerosols in the containment are complex processes of vital importance in deciding the source term. The release of hydrogen and its combustion increases the overpressure. In order to analyze these complex processes and to enable proper estimation of the source term, well tested analytical tools are necessary. This paper gives a detailed account of the analytical tools developed/adapted for PSA level 2 studies. (author)

  20. DeepQA: Improving the estimation of single protein model quality with deep belief networks

    OpenAIRE

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-01-01

    Background Protein quality assessment (QA) useful for ranking and selecting protein models has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. Results We introduce a novel single-model quality assessment method DeepQA based on deep belie...

  1. Case Study : Visual Analytics in Software Product Assessments

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Lanza, M; Storey, M; Muller, H

    2009-01-01

    We present how a combination of static source code analysis, repository analysis, and visualization techniques has been used to effectively get and communicate insight in the development and project management problems of a large industrial code base. This study is an example of how visual analytics

  2. Practical approach to a procedure for judging the results of analytical verification measurements

    International Nuclear Information System (INIS)

    Beyrich, W.; Spannagel, G.

    1979-01-01

    For practical safeguards a particularly transparent procedure is described to judge analytical differences between declared and verified values based on experimental data relevant to the actual status of the measurement technique concerned. Essentially it consists of two parts: Derivation of distribution curves for the occurrence of interlaboratory differences from the results of analytical intercomparison programmes; and judging of observed differences using criteria established on the basis of these probability curves. By courtesy of the Euratom Safeguards Directorate, Luxembourg, the applicability of this judging procedure has been checked in practical data verification for safeguarding; the experience gained was encouraging and implementation of the method is intended. Its reliability might be improved further by evaluation of additional experimental data. (author)
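The two-part judging procedure described above can be sketched as follows. The function names, the symmetric 95% coverage choice, and the use of empirical quantiles are illustrative assumptions, not the paper's exact prescription:

```python
import numpy as np

def judging_limits(interlab_differences, coverage=0.95):
    """Derive acceptance limits from the relative differences observed in
    an analytical intercomparison programme, covering the central
    `coverage` fraction of the empirical distribution."""
    tail = (1.0 - coverage) / 2.0
    lo, hi = np.quantile(interlab_differences, [tail, 1.0 - tail])
    return float(lo), float(hi)

def judge(declared, verified, limits):
    """Accept the verified value if the relative difference from the
    declared value lies within the empirically derived limits."""
    rel_diff = (verified - declared) / declared
    lo, hi = limits
    return lo <= rel_diff <= hi
```

With limits drawn from intercomparison data, a flagged difference reflects the actual state of the measurement technique rather than an arbitrary tolerance.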

  3. Experimental and analytical study on thermoelectric self cooling of devices

    International Nuclear Information System (INIS)

    Martinez, A.; Astrain, D.; Rodriguez, A.

    2011-01-01

    This paper presents and studies the novel concept of thermoelectric self cooling, which can be described as the cooling and temperature control of a device using thermoelectric technology without electricity consumption. For this study, a device endowed with an internal heat source is designed. Subsequently, a commonly used cooling system is attached to the device and its thermal performance is statistically assessed. Afterwards, a thermoelectric self cooling system appropriate for the device is developed and studied. Experimental and analytical results show that the thermal resistance between the heat source and the environment is reduced by 25-30% when the thermoelectric self cooling system is installed, indicating the promising applicability of this technology to devices that generate large amounts of heat, such as electrical power converters, transformers and control systems. Likewise, it was statistically proved that the thermoelectric self cooling system leads to significant reductions in the temperature difference between the heat source and the environment; what is more, this reduction increases as the heat flow generated by the heat source increases, which makes evident that thermoelectric self cooling systems work as temperature controllers. -- Highlights: → Novel concept of thermoelectric self cooling is presented and studied. → No extra electricity is needed. → Thermal resistance between the heat source and the environment reduces by 25-30%. → Increasing reduction in temperature difference between heat source and environment. → Great applicability to any device that generates heat and must be cooled.

  4. Seabird tissue archival and monitoring project: Egg collections and analytical results 1999-2002

    Science.gov (United States)

    Vander Pol, Stacy S.; Christopher, Steven J.; Roseneau, David G.; Becker, Paul R.; Day, Russel D.; Kucklick, John R.; Pugh, Rebecca S.; Simac, Kristin S.; Weston-York, Geoff

    2003-01-01

    In 1998, the U.S. Geological Survey Biological Resources Division (USGS-BRD), the U.S. Fish and Wildlife Service (USFWS) Alaska Maritime National Wildlife Refuge (AMNWR), and the National Institute of Standards and Technology (NIST) began the Seabird Tissue Archival and Monitoring Project (STAMP) to collect and cryogenically bank tissues from seabirds in Alaska for future retrospective analysis of anthropogenic contaminants. The approach of STAMP was similar to that of the Alaska Marine Mammal Tissue Archival Project (AMMTAP). AMMTAP was started in 1987 by NIST and the National Oceanic and Atmospheric Administration (NOAA) as part of the Outer Continental Shelf Environmental Assessment Program sponsored by the Minerals Management Service. Presently sponsored by the USGS-BRD, AMMTAP continues its work as part of a larger national program, the Marine Mammal Health and Stranding Response Program. AMMTAP developed carefully designed sampling and specimen banking protocols. Since 1987, AMMTAP has collected tissues from marine mammals taken in Alaska Native subsistence hunts and has cryogenically banked these tissues at the NIST National Biomonitoring Specimen Bank (NBSB). Through its own analytical work and working in partnership with other researchers both within and outside Alaska, AMMTAP has helped to develop a substantial database on contaminants in Alaska marine mammals. In contrast, data and information is limited on contaminants in Alaska seabirds, which are similar to marine mammals in that they feed near the top of the food chain and have the potential for accumulating anthropogenic contaminants. During its early planning stages, STAMP managers identified the seabird egg as the first tissue of choice for study by the project. There is a relatively long history of using bird eggs for environmental monitoring and for investigating the health status of bird populations. 
Since 1998, protocols for collecting and processing eggs, and cryogenically banking egg samples

  5. Analytical study of solids-gas two phase flow

    International Nuclear Information System (INIS)

    Hosaka, Minoru

    1977-01-01

    Fundamental studies were made on the hydrodynamics of solids-gas two-phase suspension flow, in which very small solid particles are mixed into a gas flow to enhance the heat transfer characteristics of gas-cooled high temperature reactors. In particular, the pressure drop due to friction and the density distribution of solid particles are theoretically analyzed. The friction pressure drop of two-phase flow was analyzed based on the analytical result for the single-phase friction pressure drop. The calculated values of the solid/gas friction factor as a function of solid/gas mass loading are compared with experimental results. Comparisons are made for various combinations of Reynolds number and particle size. As for the particle density distribution, some factors affecting the non-uniformity of the distribution were considered. The minimum of energy dispersion was obtained with the variational principle. The suspension density of particles was obtained as a function of relative distance from the wall and was compared with experimental results. It is concluded that the distribution is much affected by particle size and that smaller particles tend to gather near the wall. (Aoki, K.)

  6. Analytic results for planar three-loop integrals for massive form factors

    Energy Technology Data Exchange (ETDEWEB)

    Henn, Johannes M. [PRISMA Cluster of Excellence, Johannes Gutenberg Universität Mainz,55099 Mainz (Germany); Kavli Institute for Theoretical Physics, UC Santa Barbara,Santa Barbara (United States); Smirnov, Alexander V. [Research Computing Center, Moscow State University,119992 Moscow (Russian Federation); Smirnov, Vladimir A. [Skobeltsyn Institute of Nuclear Physics of Moscow State University,119992 Moscow (Russian Federation); Institut für Theoretische Teilchenphysik, Karlsruhe Institute of Technology (KIT),76128 Karlsruhe (Germany)

    2016-12-28

We use the method of differential equations to analytically evaluate all planar three-loop Feynman integrals relevant for form factor calculations involving massive particles. Our results for ninety master integrals at general q² are expressed in terms of multiple polylogarithms, and results for fifty-one master integrals at the threshold q² = 4m² are expressed in terms of multiple polylogarithms of argument one, with indices equal to zero or to a sixth root of unity.

  7. Worldwide QA networks for radiotherapy dosimetry

    International Nuclear Information System (INIS)

    Izewska, J.; Svensson, H.; Ibbott, G.

    2002-01-01

A number of national or international organizations have developed various types and levels of external audits for radiotherapy dosimetry. There are three major programmes that make external audits, based on mailed TLD (thermoluminescent dosimetry), available to local radiotherapy centres on a regular basis. These are the IAEA/WHO TLD postal dose audit service operating worldwide, the European Society for Therapeutic Radiology and Oncology (ESTRO) system, EQUAL, in the European Union (EU), and the Radiological Physics Center (RPC) in North America. The IAEA, in collaboration with WHO, was the first organization to initiate TLD audits on an international scale in 1969, using a mailed system, and has a well-established programme for providing dose verification in reference conditions. Over 32 years, the IAEA/WHO TLD audit service has checked the calibration of more than 4300 radiotherapy beams in about 1200 hospitals worldwide. Only 74% of the hospitals that receive TLDs for the first time have results with a deviation between measured and stated dose within the acceptance limits of ±5%, while approximately 88% of the users that have benefited from a previous TLD audit are successful. EQUAL, an audit programme set up in 1998 by ESTRO, involves the verification of output for high energy photon and electron beams, and the audit of beam parameters in non-reference conditions. More than 300 beams are checked each year, mainly in the countries of the EU, covering approximately 500 hospitals. The results show that although 98% of the beam calibrations are within the tolerance level of ±5%, a second check was required in 10% of the participating centres, because a deviation larger than ±5% was observed in at least one of the beam parameters in non-reference conditions. EQUAL has been linked to another European network (EC network) which tested the audit methodology prior to its application. The RPC has been funded continuously since 1968 to monitor radiation therapy dose delivery at
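The ±5% acceptance test described in the record above is a simple relative-deviation check between the TLD-measured dose and the dose stated by the institution. A minimal sketch (the dose values below are illustrative, not audit data):

```python
def percent_deviation(measured_dose, stated_dose):
    """Relative deviation (%) between the TLD-measured and institution-stated dose."""
    return 100.0 * (measured_dose - stated_dose) / stated_dose

def within_tolerance(measured_dose, stated_dose, tolerance_pct=5.0):
    """Audit acceptance test: deviation must lie within +/- tolerance_pct."""
    return abs(percent_deviation(measured_dose, stated_dose)) <= tolerance_pct

# A beam stated at 2.00 Gy but measured at 2.08 Gy deviates by +4%: acceptable.
print(within_tolerance(2.08, 2.00))  # True
# A 7.5% deviation exceeds the +/-5% limit and would trigger a follow-up.
print(within_tolerance(2.15, 2.00))  # False
```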

  8. Tolerance design of patient-specific range QA using the DMAIC framework in proton therapy.

    Science.gov (United States)

    Rah, Jeong-Eun; Shin, Dongho; Manger, Ryan P; Kim, Tae Hyun; Oh, Do Hoon; Kim, Dae Yong; Kim, Gwe-Ya

    2018-02-01

To implement the DMAIC (Define-Measure-Analyze-Improve-Control) framework for customizing patient-specific QA by designing site-specific range tolerances. The DMAIC tools (process flow diagram, cause and effect, Pareto chart, control chart, and capability analysis) were utilized to determine the steps that need focus for improving patient-specific QA. The patient-specific range QA plans were selected according to seven treatment site groups, a total of 1437 cases. The process capability index, Cpm, was used to guide the tolerance design of the patient site-specific range. For the prostate field, our results suggested that the patient range measurements were capable at the current tolerance level of ±1 mm in clinical proton plans. For the other site-specific ranges, our analysis showed that the tolerances tend to be overdesigned, resulting in insufficient process capability as calculated from the patient-specific QA data. Customized tolerances were calculated for the treatment sites. Control charts were constructed to simulate the patient QA time before and after the new tolerances were implemented. It was found that the total simulated QA time decreased by approximately 20% on average after establishing the new site-specific range tolerances. We also simulated the financial impact of this project: a QA failure in the whole proton therapy process would lead to an increase of up to approximately 30% in total cost. The DMAIC framework can be used to provide effective QA by setting customized tolerances. When the tolerance design is customized, quality is reasonably balanced with time and cost demands. © 2017 American Association of Physicists in Medicine.
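The Cpm index used in the record above is the standard Taguchi capability index, Cpm = (USL - LSL) / (6·sqrt(σ² + (μ - T)²)), which penalizes both spread and off-target bias. A minimal sketch with hypothetical range-difference measurements (the sample values and ±1 mm limits are illustrative, not the paper's data):

```python
import math

def c_pm(samples, lsl, usl, target):
    """Taguchi process capability index:
    Cpm = (USL - LSL) / (6 * sqrt(sigma^2 + (mean - target)^2))."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    return (usl - lsl) / (6.0 * math.sqrt(var + (mean - target) ** 2))

# Hypothetical range differences (mm) measured against a +/-1 mm tolerance.
ranges = [0.1, -0.2, 0.05, 0.3, -0.1, 0.15, -0.05, 0.2]
print(c_pm(ranges, lsl=-1.0, usl=1.0, target=0.0))  # well above 1: capable
```

A Cpm above roughly 1.33 is conventionally read as a capable process; values below 1 suggest the tolerance is tighter than the process can reliably meet.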

  9. A MULTIDISCIPLINARY ANALYTICAL FRAMEWORK FOR STUDYING ACTIVE MOBILITY PATTERNS

    Directory of Open Access Journals (Sweden)

    D. Orellana

    2016-06-01

Intermediate cities are urged to change and adapt their mobility systems from a high energy-demanding motorized model to a sustainable low-motorized model. In order to accomplish such a model, city administrations need to better understand active mobility patterns and their links to socio-demographic and cultural aspects of the population. During the last decade, researchers have demonstrated the potential of geo-location technologies and mobile devices to gather massive amounts of data for mobility studies. However, the analysis and interpretation of this data has been carried out by specialized research groups with relatively narrow approaches from different disciplines. Consequently, broader questions remain less explored, mainly those relating to spatial behaviour of individuals and populations with their geographic environment and the motivations and perceptions shaping such behaviour. Understanding sustainable mobility and exploring new research paths require an interdisciplinary approach given the complex nature of mobility systems and their social, economic and environmental impacts. Here, we introduce the elements for a multidisciplinary analytical framework for studying active mobility patterns comprised of three components: a) Methodological, b) Behavioural, and c) Perceptual. We demonstrate the applicability of the framework by analysing mobility patterns of cyclists and pedestrians in an intermediate city integrating a range of techniques, including: GPS tracking, spatial analysis, auto-ethnography, and perceptual mapping. The results demonstrated the existence of non-evident spatial behaviours and how perceptual features affect mobility. This knowledge is useful for developing policies and practices for sustainable mobility planning.

  10. An analytical study of various telecommunication networks using Markov models

    International Nuclear Information System (INIS)

    Ramakrishnan, M; Jayamani, E; Ezhumalai, P

    2015-01-01

The main aim of this paper is to examine issues relating to the performance of various telecommunication networks, applying queuing theory for better design and improved efficiency. First, an analytical study of queues is given, quantifying the phenomenon of waiting lines using representative measures of performance such as the average queue length (the average number of customers in the queue), the average waiting time in the queue, and the average facility utilization (the proportion of time the service facility is in use). Second, using a Matlab simulator, the findings of the investigations are summarized, describing a methodology to a) compare the waiting time and average number of messages in the queue for M/M/1 and M/M/2 queues, b) compare the performance of M/M/1 and M/D/1 queues, and c) study the effect of increasing the number of servers on the blocking probability in the M/M/k/k queue model. (paper)
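The M/M/1 measures and the M/M/k/k blocking probability named in the abstract above have standard closed forms; a minimal sketch (arrival and service rates below are illustrative):

```python
def mm1_metrics(lam, mu):
    """M/M/1 steady-state metrics (requires lam < mu).
    Returns (average number in system L, average time in system W)."""
    rho = lam / mu
    L = rho / (1.0 - rho)   # average number of customers in the system
    W = 1.0 / (mu - lam)    # average time in system; Little's law gives L = lam * W
    return L, W

def erlang_b(k, offered_load):
    """Blocking probability of the M/M/k/k loss model (Erlang B formula),
    computed with the standard numerically stable recursion."""
    b = 1.0
    for i in range(1, k + 1):
        b = offered_load * b / (i + offered_load * b)
    return b

L, W = mm1_metrics(lam=2.0, mu=3.0)      # utilization rho = 2/3
print(L, W)                              # about 2 customers, 1.0 time unit
print(erlang_b(k=2, offered_load=2.0))   # 2 servers: high blocking
print(erlang_b(k=5, offered_load=2.0))   # 5 servers: blocking drops sharply
```

The last two lines illustrate the paper's third question: for a fixed offered load, adding servers to the loss system reduces the blocking probability.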

  11. a Multidisciplinary Analytical Framework for Studying Active Mobility Patterns

    Science.gov (United States)

    Orellana, D.; Hermida, C.; Osorio, P.

    2016-06-01

    Intermediate cities are urged to change and adapt their mobility systems from a high energy-demanding motorized model to a sustainable low-motorized model. In order to accomplish such a model, city administrations need to better understand active mobility patterns and their links to socio-demographic and cultural aspects of the population. During the last decade, researchers have demonstrated the potential of geo-location technologies and mobile devices to gather massive amounts of data for mobility studies. However, the analysis and interpretation of this data has been carried out by specialized research groups with relatively narrow approaches from different disciplines. Consequently, broader questions remain less explored, mainly those relating to spatial behaviour of individuals and populations with their geographic environment and the motivations and perceptions shaping such behaviour. Understanding sustainable mobility and exploring new research paths require an interdisciplinary approach given the complex nature of mobility systems and their social, economic and environmental impacts. Here, we introduce the elements for a multidisciplinary analytical framework for studying active mobility patterns comprised of three components: a) Methodological, b) Behavioural, and c) Perceptual. We demonstrate the applicability of the framework by analysing mobility patterns of cyclists and pedestrians in an intermediate city integrating a range of techniques, including: GPS tracking, spatial analysis, auto-ethnography, and perceptual mapping. The results demonstrated the existence of non-evident spatial behaviours and how perceptual features affect mobility. This knowledge is useful for developing policies and practices for sustainable mobility planning.

  12. Analytical & Experimental Study of Radio Frequency Cavity Beam Profile Monitor

    Energy Technology Data Exchange (ETDEWEB)

    Balcazar, Mario D. [Fermilab; Yonehara, Katsuya [Fermilab

    2017-10-22

The purpose of this analytical and experimental study is multifold: 1) to explore a new, radiation-robust, hadron beam profile monitor for intense neutrino beam applications; 2) to test, demonstrate, and develop a novel gas-filled radio-frequency (RF) cavity for use in this monitoring system. Within this context, the first section of the study analyzes the beam distribution across the hadron monitor as well as the ion-production rate inside the RF cavity. Furthermore, a more efficient pixel configuration across the hadron monitor is proposed to provide higher sensitivity to changes in beam displacement. Finally, the results of a benchtop test of the tunable quality factor RF cavity are presented. The proposed hadron monitor configuration consists of a circular array of RF cavities located at a radial distance of 7 cm (corresponding to the standard deviation of the beam due to scattering) and a gas-filled RF cavity with a quality factor in the range 400-800.

  13. Analytical methods for study of transmission line lightning protection

    International Nuclear Information System (INIS)

    Pettersson, Per.

    1993-04-01

Transmission line lightning performance is studied by analytical methods. The elements of shielding failure flashovers and back-flashovers are analysed as functions of incidence, response and insulation. Closed-form approximate expressions are sought to enhance understanding of the phenomena. Probabilistic and wave propagation aspects are particularly studied. The electrogeometric model of lightning attraction to structures is used in combination with the log-normal probability distribution of lightning to ground currents. The log-normality is found to be retained for the currents collected by mast-type as well as line-type structures, but with a change of scale. For both types, exceedingly simple formulas for the number of hits are derived. Simple closed-form expressions for the line outage rates from back-flashovers and shielding failure flashovers are derived in a uniform way as functions of the critical currents. The expressions involve the standardized normal distribution function. System response is analysed by use of Laplace transforms in combination with text-book transmission-line theory. Inversion into time domain is accomplished by an approximate asymptotic method producing closed-form results. The back-flashover problem is analysed in particular. Approximate, image type expressions are derived for shunt admittance of wires above, on and under ground for analyses of fast transients. The derivation parallels that for series impedance, now well-known. 3 refs, 5 figs
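The "change of scale" noted in the abstract above has a simple form: if the attractive radius in the electrogeometric model scales as I^b, the collected currents remain log-normal with the same log-standard-deviation and a median rescaled by exp(b·σ²). A minimal numerical sketch (the 31 kA median, σ_ln = 0.74 and b = 0.65 below are commonly quoted illustrative values, not Pettersson's parameters):

```python
import math

def lognormal_exceedance(i_crit, median, sigma_ln):
    """P(I > i_crit) for a log-normal peak-current distribution, given its
    median and the standard deviation of ln(I)."""
    z = (math.log(i_crit) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def collected_median(median, sigma_ln, b):
    """If attraction scales as I^b, the collected currents stay log-normal
    but the median is rescaled by exp(b * sigma_ln^2): the change of scale."""
    return median * math.exp(b * sigma_ln ** 2)

# Illustrative values: 31 kA median, sigma_ln = 0.74, exponent b = 0.65.
print(lognormal_exceedance(100.0, 31.0, 0.74))  # tail probability above 100 kA
print(collected_median(31.0, 0.74, 0.65))       # median shifts upward, ~44 kA
```

The upward shift reflects that larger strokes are preferentially collected, so structures "see" a harsher current distribution than the ground-flash population.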

  14. The unusually strong hydrogen bond between the carbonyl of Q(A) and His M219 in the Rhodobacter sphaeroides reaction center is not essential for efficient electron transfer from Q(A)(-) to Q(B).

    Science.gov (United States)

    Breton, Jacques; Lavergne, Jérôme; Wakeham, Marion C; Nabedryk, Eliane; Jones, Michael R

    2007-06-05

In native reaction centers (RCs) from photosynthetic purple bacteria the primary quinone (QA) and the secondary quinone (QB) are interconnected via a specific His-Fe-His bridge. In Rhodobacter sphaeroides RCs the C4=O carbonyl of QA forms a very strong hydrogen bond with the protonated Nπ of His M219, and the Nτ of this residue is in turn coordinated to the non-heme iron atom. The second carbonyl of QA is engaged in a much weaker hydrogen bond with the backbone N-H of Ala M260. In previous work, a Trp side chain was introduced by site-directed mutagenesis at the M260 position in the RC of Rb. sphaeroides, resulting in a complex that is completely devoid of QA and therefore nonfunctional. A photochemically competent derivative of the AM260W mutant was isolated that contains a Cys side chain at the M260 position (denoted AM260(W→C)). In the present work, the interactions between the carbonyl groups of QA and the protein in the AM260(W→C) suppressor mutant have been characterized by light-induced FTIR difference spectroscopy of the photoreduction of QA. The QA⁻/QA difference spectrum demonstrates that the strong interaction between the C4=O carbonyl of QA and His M219 is lost in the mutant, and the coupled CO and CC modes of the QA⁻ semiquinone are also strongly perturbed. In parallel, a band assigned to the perturbation of the C5-Nτ mode of His M219 upon QA⁻ formation in the native RC is lacking in the spectrum of the mutant. Furthermore, a positive band between 2900 and 2400 cm⁻¹ that is related to protons fluctuating within a network of highly polarizable hydrogen bonds in the native RC is reduced in amplitude in the mutant. On the other hand, the QB⁻/QB FTIR difference spectrum is essentially the same as for the native RC. The kinetics of electron transfer from QA⁻ to QB were measured by the flash-induced absorption changes at 780 nm. Compared to native RCs the absorption transients are slowed by a factor of about 2 for both the slow phase (in the

  15. QA experience at the University of Wisconsin accredited dosimetry calibration laboratory

    Energy Technology Data Exchange (ETDEWEB)

    DeWard, L.A.; Micka, J.A. [Univ. of Wisconsin, Madison, WI (United States)

    1993-12-31

    The University of Wisconsin Accredited Dosimetry Calibration Laboratory (UW ADCL) employs procedure manuals as part of its Quality Assurance (QA) program. One of these manuals covers the QA procedures and results for all of the UW ADCL measurement equipment. The QA procedures are divided into two main areas: QA for laboratory equipment and QA for external chambers sent for calibration. All internal laboratory equipment is checked and recalibrated on an annual basis, after establishing its consistency on a 6-month basis. QA for external instruments involves checking past calibration history as well as comparing to a range of calibration values for specific instrument models. Generally, the authors find that a chamber will have a variation of less than 0.5 % from previous Co-60 calibration factors, and falls within two standard deviations of previous calibrations. If x-ray calibrations are also performed, the energy response of the chamber is plotted and compared to previous instruments of the same model. These procedures give the authors confidence in the transfer of calibration values from National Institute of Standards and Technology (NIST).
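The two acceptance criteria described in the record above (under 0.5% drift from the previous Co-60 calibration factor, and within two standard deviations of the chamber's history) can be sketched as a simple check; the calibration-factor values below are hypothetical placeholders, not ADCL data:

```python
def check_calibration(new_factor, history, max_drift_pct=0.5):
    """Flag a Co-60 calibration factor that drifts more than max_drift_pct
    from the most recent value, or falls outside two standard deviations
    of the chamber's calibration history."""
    drift_pct = 100.0 * abs(new_factor - history[-1]) / history[-1]
    n = len(history)
    mean = sum(history) / n
    sd = (sum((x - mean) ** 2 for x in history) / (n - 1)) ** 0.5
    within_2sd = abs(new_factor - mean) <= 2.0 * sd
    return drift_pct <= max_drift_pct and within_2sd

history = [5.321e7, 5.318e7, 5.325e7, 5.320e7]  # hypothetical factors (Gy/C)
print(check_calibration(5.322e7, history))  # ~0.04% drift: passes
print(check_calibration(5.360e7, history))  # ~0.75% drift: flagged for review
```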

  16. QA experience at the University of Wisconsin accredited dosimetry calibration laboratory

    International Nuclear Information System (INIS)

    DeWard, L.A.; Micka, J.A.

    1993-01-01

    The University of Wisconsin Accredited Dosimetry Calibration Laboratory (UW ADCL) employs procedure manuals as part of its Quality Assurance (QA) program. One of these manuals covers the QA procedures and results for all of the UW ADCL measurement equipment. The QA procedures are divided into two main areas: QA for laboratory equipment and QA for external chambers sent for calibration. All internal laboratory equipment is checked and recalibrated on an annual basis, after establishing its consistency on a 6-month basis. QA for external instruments involves checking past calibration history as well as comparing to a range of calibration values for specific instrument models. Generally, the authors find that a chamber will have a variation of less than 0.5 % from previous Co-60 calibration factors, and falls within two standard deviations of previous calibrations. If x-ray calibrations are also performed, the energy response of the chamber is plotted and compared to previous instruments of the same model. These procedures give the authors confidence in the transfer of calibration values from National Institute of Standards and Technology (NIST)

  17. Electron Beam Return-Current Losses in Solar Flares: Initial Comparison of Analytical and Numerical Results

    Science.gov (United States)

    Holman, Gordon

    2010-01-01

Accelerated electrons play an important role in the energetics of solar flares. Understanding the process or processes that accelerate these electrons to high, nonthermal energies also depends on understanding the evolution of these electrons between the acceleration region and the region where they are observed through their hard X-ray or radio emission. Energy losses in the co-spatial electric field that drives the current-neutralizing return current can flatten the electron distribution toward low energies. This in turn flattens the corresponding bremsstrahlung hard X-ray spectrum toward low energies. The lost electron beam energy also enhances heating in the coronal part of the flare loop. Extending earlier work by Knight & Sturrock (1977), Emslie (1980), Diakonov & Somov (1988), and Litvinenko & Somov (1991), I have derived analytical and semi-analytical results for the nonthermal electron distribution function and the self-consistent electric field strength in the presence of a steady-state return current. I review these results, presented previously at the 2009 SPD Meeting in Boulder, CO, and compare them and computed X-ray spectra with numerical results obtained by Zharkova & Gordovskii (2005, 2006). The physical significance of similarities and differences in the results will be emphasized. This work is supported by NASA's Heliophysics Guest Investigator Program and the RHESSI Project.

  18. Tank 241-U-106, cores 147 and 148, analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Steen, F.H.

    1996-09-27

This document is the final report deliverable for tank 241-U-106 push mode core segments collected between May 8, 1996 and May 10, 1996 and received by the 222-S Laboratory between May 14, 1996 and May 16, 1996. The segments were subsampled and analyzed in accordance with the Tank 241-U-106 Push Mode Core Sampling and Analysis Plan (TSAP), the Historical Model Evaluation Data Requirements (Historical DQO), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) and the Safety Screening Data Quality Objective (DQO). The analytical results are included in Table 1.

  19. Solar neutrino masses and mixing from bilinear R-parity broken supersymmetry: Analytical versus numerical results

    Science.gov (United States)

    Díaz, M.; Hirsch, M.; Porod, W.; Romão, J.; Valle, J.

    2003-07-01

    We give an analytical calculation of solar neutrino masses and mixing at one-loop order within bilinear R-parity breaking supersymmetry, and compare our results to the exact numerical calculation. Our method is based on a systematic perturbative expansion of R-parity violating vertices to leading order. We find in general quite good agreement between the approximate and full numerical calculations, but the approximate expressions are much simpler to implement. Our formalism works especially well for the case of the large mixing angle Mikheyev-Smirnov-Wolfenstein solution, now strongly favored by the recent KamLAND reactor neutrino data.

  20. Analytic results for the one loop NMHV H anti qqgg amplitude

    Energy Technology Data Exchange (ETDEWEB)

    Badger, Simon [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Campbell, John M. [Glasgow Univ. (United Kingdom). Dept. of Physics and Astronomy; Ellis, R. Keith [Fermilab, Batavia, IL (United States); Williams, Ciaran [Durham Univ. (United Kingdom). Dept. of Physics

    2009-10-23

We compute the one-loop amplitude for a Higgs boson, a quark-antiquark pair and a pair of gluons of negative helicity, i.e. for the next-to-maximally helicity violating (NMHV) case, A(H, 1⁻_q̄, 2⁺_q, 3⁻_g, 4⁻_g). The calculation is performed using an effective Lagrangian which is valid in the limit of very large top quark mass. As a result of this paper all amplitudes for the transition of a Higgs boson into 4 partons are now known analytically at one-loop order. (orig.)

  1. Analytic results for the one loop NMHV H anti qqgg amplitude

    International Nuclear Information System (INIS)

    Badger, Simon; Campbell, John M.; Williams, Ciaran

    2009-01-01

We compute the one-loop amplitude for a Higgs boson, a quark-antiquark pair and a pair of gluons of negative helicity, i.e. for the next-to-maximally helicity violating (NMHV) case, A(H, 1⁻_q̄, 2⁺_q, 3⁻_g, 4⁻_g). The calculation is performed using an effective Lagrangian which is valid in the limit of very large top quark mass. As a result of this paper all amplitudes for the transition of a Higgs boson into 4 partons are now known analytically at one-loop order. (orig.)

  2. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, S; Kessler, M [The University of Michigan, Ann Arbor, MI (United States); Litzenberg, D [Univ Michigan, Ann Arbor, MI (United States); Lee, C; Irrer, J; Chen, X; Acosta, E; Weyburne, G; Lam, K; Younge, K; Matuszak, M [University of Michigan, Ann Arbor, MI (United States); Keranen, W [Varian Medical Systems, Palo Alto, CA (United States); Covington, E [University of Michigan Hospital and Health System, Ann Arbor, MI (United States); Moran, J [Univ Michigan Medical Center, Ann Arbor, MI (United States)

    2015-06-15

Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual tasks. We investigate the use of an event driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house-developed subscription-publication service, EventNet, was added to the Aria OIS to be a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for plan revision QA and Plan 2nd check agents. The agents pulled the plan data, executed the prescribed QA, stored the results and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from
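The publish-subscribe pattern behind the EventNet design described above can be sketched in a few lines. EventNet itself and the Aria OIS integration are not public APIs, so every name below (the broker class, event types, the stand-in QA rule) is hypothetical, illustrating only the pattern: a sentinel event triggers an agent, which records its result and publishes a follow-up event that filtered subscribers receive.

```python
class EventBroker:
    """Minimal publish-subscribe message broker (illustrative only)."""
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, event_type, handler):
        self._subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers.get(event_type, []):
            handler(payload)

results = []

def plan_second_check_agent(plan):
    """Hypothetical agent: runs a QA check on plan approval, records the
    result, and publishes a done-event that users can filter on pass/fail."""
    passed = plan.get("monitor_units", 0) < 1000  # stand-in QA rule
    results.append((plan["plan_id"], passed))
    broker.publish("qa.second_check.done",
                   {"plan_id": plan["plan_id"], "passed": passed})

broker = EventBroker()
broker.subscribe("plan.approved", plan_second_check_agent)
broker.subscribe("qa.second_check.done",
                 lambda r: print(f"Plan {r['plan_id']}: "
                                 f"{'PASS' if r['passed'] else 'FAIL'}"))

# Plan approval is the sentinel event; the agent reacts with no user action.
broker.publish("plan.approved", {"plan_id": "P-001", "monitor_units": 850})
```

This inversion is the point of the abstract's conclusion: nobody pulls data into QA software; the broker pushes the sentinel event to the agent the moment it occurs.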

  3. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    International Nuclear Information System (INIS)

    Hadley, S; Kessler, M; Litzenberg, D; Lee, C; Irrer, J; Chen, X; Acosta, E; Weyburne, G; Lam, K; Younge, K; Matuszak, M; Keranen, W; Covington, E; Moran, J

    2015-01-01

Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual tasks. We investigate the use of an event driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house-developed subscription-publication service, EventNet, was added to the Aria OIS to be a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for plan revision QA and Plan 2nd check agents. The agents pulled the plan data, executed the prescribed QA, stored the results and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from

  4. An analytical study of double bend achromat lattice

    Energy Technology Data Exchange (ETDEWEB)

    Fakhri, Ali Akbar, E-mail: fakhri@rrcat.gov.in; Kant, Pradeep; Singh, Gurnam; Ghodke, A. D. [Raja Ramanna Centre for Advanced Technology, Indore 452 013 (India)

    2015-03-15

In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low emittance synchrotron radiation sources. In the basic structure of the CG lattice a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented, showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance in the achromatic condition. To satisfy the theoretical minimum beam emittance parameters, achromats having two, three, and four quadrupole structures are presented. In these structures, different arrangements of focussing (QF) and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which a QF-QD-QF configuration in the achromat part has been adopted, is also presented.
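The thin-lens approach used in the record above reduces each element to a 2x2 transfer matrix: a thin quadrupole of focal length f and a field-free drift of length L. A minimal sketch of composing a QF-QD-QF achromat section (the focal lengths and drift lengths below are illustrative, not Indus-2 values):

```python
def thin_lens(f):
    """2x2 transfer matrix of a thin quadrupole with focal length f
    (f > 0 focusing, f < 0 defocusing in the plane considered)."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def drift(length):
    """2x2 transfer matrix of a field-free drift space."""
    return [[1.0, length], [0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def compose(*elements):
    """Total transfer matrix; elements listed in the order the beam meets them."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for e in elements:
        m = matmul(e, m)
    return m

# Illustrative QF-QD-QF section: focal lengths in metres, 1 m drifts between.
m = compose(thin_lens(2.0), drift(1.0), thin_lens(-1.5),
            drift(1.0), thin_lens(2.0))
det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
print(det)  # determinant stays 1: the transport is symplectic
```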

  5. An analytical study of double bend achromat lattice.

    Science.gov (United States)

    Fakhri, Ali Akbar; Kant, Pradeep; Singh, Gurnam; Ghodke, A D

    2015-03-01

In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low emittance synchrotron radiation sources. In the basic structure of the CG lattice a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented, showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance in the achromatic condition. To satisfy the theoretical minimum beam emittance parameters, achromats having two, three, and four quadrupole structures are presented. In these structures, different arrangements of focussing (QF) and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which a QF-QD-QF configuration in the achromat part has been adopted, is also presented.

  6. An analytical study of double bend achromat lattice

    International Nuclear Information System (INIS)

    Fakhri, Ali Akbar; Kant, Pradeep; Singh, Gurnam; Ghodke, A. D.

    2015-01-01

In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low emittance synchrotron radiation sources. In the basic structure of the CG lattice a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented, showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance in the achromatic condition. To satisfy the theoretical minimum beam emittance parameters, achromats having two, three, and four quadrupole structures are presented. In these structures, different arrangements of focussing (QF) and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which a QF-QD-QF configuration in the achromat part has been adopted, is also presented

  7. Complex dynamics of memristive circuits: Analytical results and universal slow relaxation

    Science.gov (United States)

    Caravelli, F.; Traversa, F. L.; Di Ventra, M.

    2017-02-01

    Networks with memristive elements (resistors with memory) are being explored for a variety of applications ranging from unconventional computing to models of the brain. However, analytical results that highlight the role of the graph connectivity on the memory dynamics are still few, thus limiting our understanding of these important dynamical systems. In this paper, we derive an exact matrix equation of motion that takes into account all the network constraints of a purely memristive circuit, and we employ it to derive analytical results regarding its relaxation properties. We are able to describe the memory evolution in terms of orthogonal projection operators onto the subspace of fundamental loop space of the underlying circuit. This orthogonal projection explicitly reveals the coupling between the spatial and temporal sectors of the memristive circuits and compactly describes the circuit topology. For the case of disordered graphs, we are able to explain the emergence of a power-law relaxation as a superposition of exponential relaxation times with a broad range of scales using random matrices. This power law is also universal, namely independent of the topology of the underlying graph but dependent only on the density of loops. In the case of circuits subject to alternating voltage instead, we are able to obtain an approximate solution of the dynamics, which is tested against a specific network topology. These results suggest a much richer dynamics of memristive networks than previously considered.
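The mechanism behind the power-law relaxation described above, a superposition of exponential decays with a broad distribution of rates, can be illustrated generically. The sketch below is not the paper's memristive-network derivation; it only demonstrates that averaging exp(-λt) over rates with density p(λ) ∝ λ^(α-1) on (0, 1] yields a t^(-α) tail:

```python
import math

def relaxation(t, alpha=0.5, n=20000):
    """Average of exp(-lam * t) over rates lam with density
    p(lam) = alpha * lam^(alpha - 1) on (0, 1]. For t >> 1 this tends to
    Gamma(alpha + 1) * t^(-alpha): a power law, not an exponential."""
    total = 0.0
    for i in range(n):
        u = (i + 0.5) / n              # midpoint rule on the unit interval
        lam = u ** (1.0 / alpha)       # maps uniform u to the target density
        total += math.exp(-lam * t)
    return total / n

# Power-law check: doubling t should scale the tail by ~2^(-alpha).
r = relaxation(200.0) / relaxation(100.0)
print(r)  # close to 2**-0.5, i.e. about 0.707, for alpha = 0.5
```

A pure exponential would give a ratio of exp(-100·λ), essentially zero; the near-constant ratio under rescaling of t is the signature of the power law.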

  8. Analytical results of variance reduction characteristics of biased Monte Carlo for deep-penetration problems

    International Nuclear Information System (INIS)

    Murthy, K.P.N.; Indira, R.

    1986-01-01

    An analytical formulation is presented for calculating the mean and variance of transmission for a model deep-penetration problem. With this formulation, the variance reduction characteristics of two biased Monte Carlo schemes are studied. The first is the usual exponential biasing, for which it is shown that the optimal biasing parameter depends sensitively on the scattering properties of the shielding medium. The second is a scheme that couples exponential biasing to the recently proposed scattering-angle biasing. It is demonstrated that the coupled scheme performs better than exponential biasing alone.
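Exponential biasing of the kind analyzed above can be demonstrated on the simplest deep-penetration toy: estimating the transmission probability exp(-sigma*T) through a purely absorbing slab. The model and parameters below are our assumptions for illustration, not the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, T, n = 1.0, 10.0, 200_000          # 10 mean free paths: deep penetration
exact = np.exp(-sigma * T)

# Analog sampling: almost no histories have free flight s > T, so the
# estimator is dominated by rare events and is very noisy.
s = rng.exponential(1.0 / sigma, n)
analog = (s > T).mean()

# Exponential biasing: sample from a stretched distribution with sigma_b < sigma
# and correct with the likelihood-ratio weight w = (sigma/sigma_b) * exp(-(sigma - sigma_b) * s)
sigma_b = 0.1
sb = rng.exponential(1.0 / sigma_b, n)
w = (sigma / sigma_b) * np.exp(-(sigma - sigma_b) * sb)
biased = np.where(sb > T, w, 0.0).mean()

print(exact, analog, biased)
```

Both estimators are unbiased, but the biased one scores many low-weight transmissions instead of a handful of unit-weight ones, which is the variance reduction the abstract quantifies analytically.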

  9. Construction QA/QC systems: comparative analysis

    International Nuclear Information System (INIS)

    Willenbrock, J.H.; Shepard, S.

    1980-01-01

    An analysis is presented which compares the quality assurance/quality control (QA/QC) systems adopted in the highway, nuclear power plant, and U.S. Navy construction areas with the traditional quality control approach used in building construction. Full participation and support by the owner, as well as by the contractor and the AE firm, are required if a QA/QC system is to succeed. Process quality control, acceptance testing and quality assurance responsibilities must be clearly defined in the contract documents, and the owner must audit these responsibilities. A contractor quality control plan, indicating the tasks which will be performed and establishing that QA/QC personnel are independent of project time/cost pressures, should be submitted for approval. The architect must develop realistic specifications which consider the natural variability of materials. Acceptance criteria based on random sampling techniques should be used. 27 refs.

  10. Tank 241-TX-118, core 236 analytical results for the final report

    International Nuclear Information System (INIS)

    ESCH, R.A.

    1998-01-01

    This document is the analytical laboratory report for tank 241-TX-118 push mode core segments collected between April 1, 1998 and April 13, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-TX-118 Push Mode Core Sampling and Analysis Plan (TSAP) (Benar, 1997), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner, et al., 1995) and the Historical Model Evaluation Data Requirements (Historical DQO) (Simpson, et al., 1995). The analytical results are included in the data summary table (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC) and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Benar, 1997). One sample, core 236 segment 1 lower half solids (S98T001524), exceeded the Total Alpha Activity (AT) analysis notification limit of 38.4 µCi/g (based on a bulk density of 1.6). Appropriate notifications were made. Plutonium-239/240 analysis was requested as a secondary analysis. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report.

  11. Tank 241-T-203, core 190 analytical results for the final report

    International Nuclear Information System (INIS)

    Steen, F.H.

    1997-01-01

    This document is the analytical laboratory report for tank 241-T-203 push mode core segments collected on April 17, 1997 and April 18, 1997. The segments were subsampled and analyzed in accordance with the Tank 241-T-203 Push Mode Core Sampling and Analysis Plan (TSAP) (Schreiber, 1997a), the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995) and the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Hall, 1997). The analytical results are included in the data summary report (Table 1). None of the samples submitted for Differential Scanning Calorimetry (DSC), Total Alpha Activity (AT) and Total Organic Carbon (TOC) exceeded notification limits as stated in the TSAP (Schreiber, 1997a). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems (TWRS) Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997b) and are not considered in this report.

  12. Investigations of low qa discharges in the SINP tokamak

    Indian Academy of Sciences (India)

    Low edge safety factor discharges including very low qa (1 qa ... From fluctuation analysis of the external magnetic probe data it has been found that MHD ... To investigate the internal details of these discharges, an internal magnetic probe ...

  13. STABLE CONIC-HELICAL ORBITS OF PLANETS AROUND BINARY STARS: ANALYTICAL RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Oks, E. [Physics Department, 206 Allison Lab., Auburn University, Auburn, AL 36849 (United States)

    2015-05-10

    Studies of planets in binary star systems are especially important because it has been estimated that about half of binary stars are capable of supporting habitable terrestrial planets within stable orbital ranges. One-planet binary star systems (OBSS) have a limited analogy to objects studied in atomic/molecular physics: one-electron Rydberg quasimolecules (ORQ). Specifically, ORQ, consisting of two fully stripped ions of the nuclear charges Z and Z′ plus one highly excited electron, are encountered in various plasmas containing more than one kind of ion. Classical analytical studies of ORQ resulted in the discovery of classical stable electronic orbits with the shape of a helix on the surface of a cone. In the present paper we show that despite several important distinctions between OBSS and ORQ, it is possible for OBSS to have stable planetary orbits in the shape of a helix on a conical surface whose axis of symmetry coincides with the interstellar axis; the stability is not affected by the rotation of the stars. Further, we demonstrate that the eccentricity of the stars’ orbits does not affect the stability of the helical planetary motion if the center of symmetry of the helix is relatively close to the star of the larger mass. We also show that if the center of symmetry of the conic-helical planetary orbit is relatively close to the star of the smaller mass, a sufficiently large eccentricity of the stars’ orbits can switch the planetary motion to the unstable mode and the planet would escape the system. We demonstrate that such planets are transitable for the overwhelming majority of inclinations of the plane of the stars’ orbits (i.e., the projections of the planet and the adjacent star on the plane of the sky coincide once in a while). This means that conic-helical planetary orbits around binary stars can be detected photometrically. We consider, as an example, the Kepler-16 binary stars to provide illustrative numerical data on the possible parameters and the

  14. Quality assurance and quality control of nuclear analytical techniques

    International Nuclear Information System (INIS)

    Cincu, Emanuela

    2001-01-01

    involved in the RER 2/004 Project. The latter 6 groups deal with gamma-ray spectrometry, analysis of the total beta-ray spectra, neutron activation and radiochemical analysis. IFIN-HH follows the IAEA Project plan, consisting of participation in the IAEA training workshops and proficiency tests and elaboration of QA documents 2-3 times/year. In the year 2000 we worked out part of the QA technical operation procedures and QA procedures of general interest, namely: 'Organization, Sample registration', 'Report of results', 'Relations with Clients' and other QA procedures according to the ISO/IEC 17025 Standard. In April-May 2000 all analytical working groups successfully participated in Proficiency Test 1 organized by the IAEA. The 2nd series of Proficiency Test measurements is scheduled for May-June 2001. Progress Report no. 2 (the 3rd in the PR series) was the last document prepared in 2000; by means of the 4th and 5th Progress Reports, to be prepared in April-June 2001, the contents of the typical accreditation dossier of each group taking part in the Project will be established. The QA System and QA Manual (previously prepared according to European Norm 45001) will also be updated according to the ISO/IEC 17025 Standard requirements. The final objective of the activity in 2001 (the last year of the project) is to prepare valid 'accreditation dossiers' with a view to securing certification/accreditation from the national accreditation/regulatory body; we need to operate efficiently and effectively, as soon as possible, on a certified basis, and to ensure high performance in current applications on clients' samples coming from various fields, including medicine, environment, industry, forensics, geology and archaeology. After the previous steps and evaluations by IAEA experts, IFIN-HH achieved quite a good position, ranking 5th among the participating countries.
Also due to its deep involvement in the Project, IFIN-HH proposed to IAEA to organize

  15. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    Science.gov (United States)

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and to establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of the 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established using the latest methodology. Reference intervals of the 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay-system-specific reference intervals in China.
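The parametric derivation described above can be sketched in its plain textbook form. The study uses a modified Box-Cox formula and the LAVE exclusion method; the code below is the ordinary Box-Cox version on synthetic data, not the authors' implementation:

```python
import numpy as np
from scipy import stats, special

# Synthetic right-skewed "analyte" (lognormal), standing in for real results
rng = np.random.default_rng(3)
values = rng.lognormal(mean=np.log(20.0), sigma=0.25, size=3000)

# 1) Estimate the Box-Cox lambda by maximum likelihood and transform to
#    approximate normality; 2) take the central 95% interval in the
#    transformed scale; 3) back-transform the limits to the original scale.
transformed, lam = stats.boxcox(values)
mu, sd = transformed.mean(), transformed.std(ddof=1)
lower = special.inv_boxcox(mu - 1.96 * sd, lam)
upper = special.inv_boxcox(mu + 1.96 * sd, lam)
print(round(lower, 1), round(upper, 1))
```

For this lognormal toy the recovered interval should sit near the true 2.5th/97.5th percentiles (about 12 and 33), illustrating why the transform-then-invert approach handles skewed analytes better than a naive mean +/- 1.96 SD.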

  16. Determination of serum albumin, analytical challenges: a French multicenter study.

    Science.gov (United States)

    Rossary, Adrien; Blondé-Cynober, Françoise; Bastard, Jean-Philippe; Beauvieux, Marie-Christine; Beyne, Pascale; Drai, Jocelyne; Lombard, Christine; Anglard, Ingrid; Aussel, Christian; Claeyssens, Sophie; Vasson, Marie-Paule

    2017-06-01

    Among the biological markers of morbidity and mortality, albumin holds a key place in the range of criteria used by the French High Authority for Health (HAS) for the assessment of malnutrition and for the coding of the information system medicalization program (PMSI). While the principles of the quantification methods have not changed in recent years, the dispersion of external quality evaluation (EEQ) data shows that standardization against certified reference material (CRM) 470 is not optimal. The aim of this multicenter study involving 7 sites, conducted by a working group of the French Society of Clinical Biology (SFBC), was to assess whether albuminemia values depend on the analytical system used. Albumin in plasma (n=30) and serum (n=8) pools was quantified by 5 different methods [bromocresol green (VBC) and bromocresol purple (PBC) colorimetry, immunoturbidimetry (IT), immunonephelometry (IN) and capillary electrophoresis (CE)] using 12 analyzers. Bland and Altman's test evaluated the differences between the results obtained by the different methods. A difference as high as 13 g/L was observed for the same sample between methods; VBC overestimated albumin across the range of values tested compared to PBC, and albumin values differed between the immunoprecipitation methods (IT vs IN). The albumin results are thus related to the technique/analyzer tandem used. This variability is usually not taken into account by the clinician. Thus, clinicians and biologists have to be aware of it and have to check, depending on the method used, the albumin thresholds identified as risk factors for complications related to malnutrition and for PMSI coding.
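The Bland-Altman comparison used in the study is straightforward to reproduce. The sketch below uses invented data (a hypothetical VBC-over-PBC offset, not the SFBC study's values) to show how the bias and 95% limits of agreement are computed:

```python
import numpy as np

def bland_altman(a, b):
    # Bias (mean paired difference) and 95% limits of agreement
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

rng = np.random.default_rng(4)
true = rng.uniform(25.0, 45.0, 38)              # 38 pools, albumin in g/L
vbc = true + 4.0 + rng.normal(0.0, 1.0, 38)     # hypothetical VBC overestimation
pbc = true + rng.normal(0.0, 1.0, 38)

bias, lo, hi = bland_altman(vbc, pbc)
print(round(bias, 1))
```

A nonzero bias with narrow limits of agreement signals a systematic method difference of the kind the abstract reports between VBC and PBC.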

  17. Tank 241-T-204, core 188 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-07-24

    TANK 241-T-204, CORE 188, ANALYTICAL RESULTS FOR THE FINAL REPORT. This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report.

  18. Tank 241-T-105, cores 205 and 207 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-T-105 push mode core segments collected between June 24, 1997 and June 30, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (TSAP) (Field, 1997), the Tank Safety Screening Data Quality Objective (Safety DQO) (Dukelow, et al., 1995) and Tank 241-T-105 Sample Analysis (memo) (Field, 1997a). The analytical results are included in Table 1. None of the subsamples submitted for the differential scanning calorimetry (DSC) analysis or total alpha activity (AT) exceeded the notification limits as stated in the TSAP (Field, 1997). The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems (TWRS) Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report.

  19. A novel approach to EPID-based 3D volumetric dosimetry for IMRT and VMAT QA

    Science.gov (United States)

    Alhazmi, Abdulaziz; Gianoli, Chiara; Neppl, Sebastian; Martins, Juliana; Veloza, Stella; Podesta, Mark; Verhaegen, Frank; Reiner, Michael; Belka, Claus; Parodi, Katia

    2018-06-01

    Intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) are relatively complex treatment delivery techniques and require quality assurance (QA) procedures. Pre-treatment dosimetric verification represents a fundamental QA procedure in the daily clinical routine in radiation therapy. The purpose of this study is to develop an EPID-based approach to reconstruct a 3D dose distribution, as imparted to a virtual cylindrical water phantom, to be used for plan-specific pre-treatment dosimetric verification of IMRT and VMAT plans. For each depth, the planar 2D dose distributions acquired in air were back-projected and convolved with depth-specific scatter and attenuation kernels. The kernels were obtained by using scatter and attenuation models to iteratively estimate the parameters from a set of reference measurements. The derived parameters served as a look-up table for the reconstruction of arbitrary measurements. The summation of the reconstructed 3D dose distributions resulted in the integrated 3D dose distribution of the treatment delivery. The accuracy of the proposed approach was validated on clinical IMRT and VMAT plans by means of gamma evaluation, comparing the reconstructed 3D dose distributions with Octavius measurements. The comparison was carried out using (3%, 3 mm) criteria, scoring 99% and 96% passing rates for IMRT and VMAT, respectively. An accuracy comparable to that of the commercial device for 3D volumetric dosimetry was demonstrated. In addition, five IMRT and five VMAT plans were validated against the 3D dose calculation performed by the TPS in a water phantom using the same passing rate criteria. The median passing rate over the ten treatment plans was 97.3%, and the lowest was 95%. Moreover, the reconstructed 3D distribution is obtained without predictions relying on forward dose calculation and without an external phantom or dosimetric devices. Thus, the approach provides a fully automated, fast and easy QA procedure.
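The (3%, 3 mm) gamma pass-rate metric quoted above can be illustrated in one dimension. Clinical gamma evaluation is 3D and considerably more involved; this is only a simplified global-gamma sketch on a synthetic profile:

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    # Global 1D gamma: for each reference point, search the evaluated profile
    # for the minimum combined dose-difference / distance-to-agreement metric.
    x = np.arange(len(ref)) * spacing_mm
    dmax = ref.max()
    passed = []
    for xi, di in zip(x, ref):
        dd = (ev - di) / (dose_tol * dmax)   # dose term, normalized to 3% of max
        dx = (x - xi) / dist_mm              # distance term, normalized to 3 mm
        gamma = np.sqrt(dd**2 + dx**2).min()
        passed.append(gamma <= 1.0)
    return np.mean(passed)

ref = np.exp(-((np.arange(100) - 50.0) ** 2) / 200.0)  # synthetic Gaussian profile
ev = ref * 1.01                                        # 1% uniform dose error
print(gamma_pass_rate(ref, ev, spacing_mm=1.0))
```

A 1% uniform error passes everywhere under a 3% dose tolerance, while a 20% error fails near the peak, which is the behavior the pass-rate percentages in the abstract summarize.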

  20. Seamless Digital Environment - Plan for Data Analytics Use Case Study

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Bly, Aaron Douglas

    2016-01-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed in order to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses things to consider when building an architecture to support the increasing needs and demands of data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick and reliable manner. A common method is to create a "one stop shop" application that a user can go to for all the data they need. This leads to the need to create a Seamless Digital Environment (SDE) to integrate all the "siloed" data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics, employing information from computer-based-procedure-enabled technologies to develop improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios which could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to

  1. The Physics of Type Ia Supernova Light Curves. I. Analytic Results and Time Dependence

    International Nuclear Information System (INIS)

    Pinto, Philip A.; Eastman, Ronald G.

    2000-01-01

    We develop an analytic solution of the radiation transport problem for Type Ia supernovae (SNe Ia) and show that it reproduces bolometric light curves produced by more detailed calculations under the assumption of a constant extinction coefficient. This model is used to derive the thermal conditions in the interior of SNe Ia and to study the sensitivity of light curves to various properties of the underlying supernova explosions. Although the model is limited by simplifying assumptions, it is adequate for demonstrating that the relationship between SNe Ia maximum-light luminosity and rate of decline is most easily explained if SNe Ia span a range in mass. The analytic model is also used to examine the size of various terms in the transport equation under conditions appropriate to maximum light. For instance, the Eulerian and advective time derivatives are each shown to be of the same order of magnitude as other order-v/c terms in the transport equation. We conclude that a fully time-dependent solution to the transport problem is needed in order to compute SNe Ia light curves and spectra accurately enough to distinguish subtle differences between various explosion models. (c) 2000 The American Astronomical Society
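The energy source behind the bolometric light curves discussed above is the 56Ni -> 56Co -> 56Fe decay chain. The sketch below uses commonly quoted approximate decay times and per-gram heating rates (our assumed values, not numbers from this paper) purely to show why the light curve carries two characteristic decay timescales:

```python
import numpy as np

TAU_NI, TAU_CO = 8.8, 111.3     # approximate e-folding decay times, days
EPS_NI, EPS_CO = 3.9e10, 6.8e9  # approximate heating rates, erg / g / s

def heating(t_days):
    # Instantaneous radioactive energy deposition rate per gram of 56Ni:
    # fast 56Ni decay plus the slower 56Co contribution that builds and decays.
    ni = EPS_NI * np.exp(-t_days / TAU_NI)
    co = EPS_CO * (np.exp(-t_days / TAU_CO) - np.exp(-t_days / TAU_NI))
    return ni + co

for t in (0.0, 20.0, 60.0):
    print(t, f"{heating(t):.2e}")
```

The early light curve tracks the ~9-day 56Ni timescale and the post-maximum decline tracks the ~111-day 56Co timescale, which is why decline rate is a diagnostic of the ejecta.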

  2. DNA breathing dynamics: analytic results for distribution functions of relevant Brownian functionals.

    Science.gov (United States)

    Bandyopadhyay, Malay; Gupta, Shamik; Segal, Dvira

    2011-03-01

    We investigate DNA breathing dynamics by suggesting and examining several Brownian functionals associated with bubble lifetime and reactivity. Bubble dynamics is described as an overdamped random walk in the number of broken base pairs. The walk takes place on the Poland-Scheraga free-energy landscape. We suggest several probability distribution functions that characterize the breathing process, and adopt the recently studied backward Fokker-Planck method and the path decomposition method as elegant and flexible tools for deriving these distributions. In particular, for a bubble of an initial size x₀, we derive analytical expressions for (i) the distribution P(t_f|x₀) of the first-passage time t_f, characterizing the bubble lifetime, (ii) the distribution P(A|x₀) of the area A until the first-passage time, providing information about the effective reactivity of the bubble to processes within the DNA, (iii) the distribution P(M) of the maximum bubble size M attained before the first-passage time, and (iv) the joint probability distribution P(M,t_m) of the maximum bubble size M and the time t_m of its occurrence before the first-passage time. These distributions are analyzed in the limits of small and large bubble sizes. We supplement our analytical predictions with direct numerical simulations of the related Langevin equation, and obtain very good agreement in the appropriate limits. The nontrivial scaling behavior of the various quantities analyzed here can, in principle, be explored experimentally.
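The Langevin simulations mentioned above can be sketched with a toy version of the model: a constant-drift approximation to the free-energy landscape (not the paper's full Poland-Scheraga form), evolving the bubble coordinate until it closes while recording the lifetime t_f and the "reactivity" area A:

```python
import numpy as np

rng = np.random.default_rng(5)
c, D, dt, x0 = 1.0, 0.5, 1e-3, 1.0   # drift toward closure, noise strength

def one_passage():
    # Euler-Maruyama for dx = -c dt + sqrt(2D) dW, absorbed at x = 0
    x, t, area = x0, 0.0, 0.0
    while x > 0.0:
        area += x * dt                # accumulate A = integral of x dt
        x += -c * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
        t += dt
    return t, area

samples = np.array([one_passage() for _ in range(500)])
t_f, A = samples[:, 0], samples[:, 1]
print(round(t_f.mean(), 2))  # for constant drift, the mean lifetime is x0/c
```

Histogramming t_f, A, and the running maximum of x over many realizations gives numerical estimates of the distributions (i)-(iv) that the paper derives analytically.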

  3. Basic concept of QA for advanced technologies

    International Nuclear Information System (INIS)

    Mijnheer, Ben

    2008-01-01

    The lecture was structured as follows: (1) Rationale for accurate dose determination; (2) Existing recommendations and guidance; (3) Challenges within the current QA paradigm; (4) New paradigm adopted by AAPM TG 100; and (5) Application of the new paradigm to IMRT. Attention was paid, among other things, to major accidents in radiotherapy such as Epinal-1. (P.A.)

  4. Recent results on analytical plasma turbulence theory: Realizability, intermittency, submarginal turbulence, and self-organized criticality

    International Nuclear Information System (INIS)

    Krommes, J.A.

    2000-01-01

    Recent results and future challenges in the systematic analytical description of plasma turbulence are described. First, the importance of statistical realizability is stressed, and the development and successes of the Realizable Markovian Closure are briefly reviewed. Next, submarginal turbulence (linearly stable but nonlinearly self-sustained fluctuations) is considered and the relevance of nonlinear instability in neutral-fluid shear flows to submarginal turbulence in magnetized plasmas is discussed. For the Hasegawa-Wakatani equations, a self-consistency loop that leads to steady-state vortex regeneration in the presence of dissipation is demonstrated and a partial unification of recent work of Drake (for plasmas) and of Waleffe (for neutral fluids) is given. Brief remarks are made on the difficulties facing a quantitatively accurate statistical description of submarginal turbulence. Finally, possible connections between intermittency, submarginal turbulence, and self-organized criticality (SOC) are considered and outstanding questions are identified

  5. Analytical results from salt batch 9 routine DSSHT and SEHT monthly samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-01

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 9 have been analyzed for 238Pu, 90Sr, 137Cs, cations (Inductively Coupled Plasma Emission Spectroscopy - ICPES), and anions (Ion Chromatography Anions - IC-A). The analytical results from the current microbatch samples are similar to those from previous macrobatch samples. The Cs removal continues to be acceptable, with decontamination factors (DF) averaging 25700 (107% RSD). The bulk chemistry of the DSSHT and SEHT samples does not show any signs of unusual behavior, other than lacking the anticipated degree of dilution that is calculated to occur during Modular Caustic-Side Solvent Extraction Unit (MCU) processing.
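The two summary figures quoted above (a mean DF with a percent relative standard deviation) are simple to compute; the feed and product activities below are made-up numbers for illustration, not SRNL data:

```python
import numpy as np

# Hypothetical 137Cs activities (arbitrary consistent units) in the feed and
# in the decontaminated product for four microbatches
feed_cs = np.array([1.0e5, 9.5e4, 1.1e5, 1.0e5])
product_cs = np.array([4.0, 1.5, 55.0, 3.0])

df = feed_cs / product_cs                 # decontamination factor per batch
mean_df = df.mean()
rsd_pct = 100.0 * df.std(ddof=1) / mean_df  # batch-to-batch scatter as %RSD
print(round(mean_df), round(rsd_pct))
```

A %RSD near or above 100%, as in the abstract, simply reflects that DF values spread over an order of magnitude between batches even when all of them are individually acceptable.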

  6. Weak-field asymptotic theory of tunneling ionization: benchmark analytical results for two-electron atoms

    International Nuclear Information System (INIS)

    Trinh, Vinh H; Morishita, Toru; Tolstikhin, Oleg I

    2015-01-01

    The recently developed many-electron weak-field asymptotic theory of tunneling ionization of atoms and molecules in an external static electric field (Tolstikhin et al 2014, Phys. Rev. A 89, 013421) is extended to the first-order terms in the asymptotic expansion in field. To highlight the results, here we present a simple analytical formula giving the rate of tunneling ionization of two-electron atoms H − and He. Comparison with fully-correlated ab initio calculations available for these systems shows that the first-order theory works quantitatively in a wide range of fields up to the onset of over-the-barrier ionization and hence is expected to find numerous applications in strong-field physics. (fast track communication)

  7. Methods used by Elsam for monitoring precision and accuracy of analytical results

    Energy Technology Data Exchange (ETDEWEB)

    Hinnerskov Jensen, J [Soenderjyllands Hoejspaendingsvaerk, Faelleskemikerne, Aabenraa (Denmark)

    1996-12-01

    Performing round robins at regular intervals is the primary method used by Elsam for monitoring the precision and accuracy of analytical results. The first round robin was started in 1974, and today 5 round robins are running. These are focused on: boiler water and steam, lubricating oils, coal, ion chromatography and dissolved gases in transformer oils. Besides the power plant laboratories in Elsam, the participants are power plant laboratories from the rest of Denmark, industrial and commercial laboratories in Denmark, and finally foreign laboratories. The calculated standard deviations or reproducibilities are compared with acceptable values. These values originate from ISO, ASTM and the like, or from our own experience. Besides providing the laboratories with a tool to check their momentary performance, the round robins are very suitable for evaluating systematic developments on a long-term basis. By splitting up the uncertainty according to method, sample preparation/analysis, etc., knowledge can be extracted from the round robins for use in many other situations. (au)
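The comparison against acceptable values described above is commonly expressed as a z-score per laboratory. The sketch below uses hypothetical numbers (an invented coal-analysis round robin, not Elsam data), with the acceptable standard deviation playing the role of the ISO/ASTM-derived limit:

```python
import numpy as np

def z_scores(results, assigned, sigma_acceptable):
    # Proficiency-test z-score: deviation from the assigned value in units
    # of the acceptable standard deviation
    return (np.asarray(results) - assigned) / sigma_acceptable

lab_results = [0.52, 0.48, 0.55, 0.61, 0.50]   # e.g. sulfur in coal, wt-%
z = z_scores(lab_results, assigned=0.50, sigma_acceptable=0.03)

# Conventional screening: |z| <= 2 satisfactory, larger values flagged
flags = ["ok" if abs(v) <= 2 else "check" for v in z]
print(list(np.round(z, 2)), flags)
```

Tracking each laboratory's z-scores across successive round robins is what makes the long-term, systematic drifts mentioned in the abstract visible.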

  8. Interacting steps with finite-range interactions: Analytical approximation and numerical results

    Science.gov (United States)

    Jaramillo, Diego Felipe; Téllez, Gabriel; González, Diego Luis; Einstein, T. L.

    2013-05-01

    We calculate an analytical expression for the terrace-width distribution P(s) for an interacting step system with nearest- and next-nearest-neighbor interactions. Our model is derived by mapping the step system onto a statistically equivalent one-dimensional system of classical particles. The validity of the model is tested with several numerical simulations and experimental results. We explore the effect of the range of interactions q on the functional form of the terrace-width distribution and pair correlation functions. For physically plausible interactions, we find modest changes when next-nearest neighbor interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.

  9. Biological and analytical studies of peritoneal dialysis solutions

    Directory of Open Access Journals (Sweden)

    N. Hudz

    2018-04-01

    The purpose of our work was to conduct biological and analytical studies of peritoneal dialysis (PD) solutions containing glucose and sodium lactate, and to establish correlations between the cell viability of the Vero cell line and the values of analytical indexes of the tested solutions. The results of this study confirm the cytotoxicity of the PD solutions even compared with the isotonic solution of sodium chloride, which may be due to the low pH of the solutions, the presence of glucose degradation products (GDPs), the high osmolarity of the solutions, and unphysiological concentrations of glucose and sodium lactate. However, it is not yet known which factors, or what combination of them and to what extent, cause the cytotoxicity of PD solutions. In the neutral red (NR) test, weak, almost middle (r = -0.496 and 0.498, respectively) and unexpected correlations were found between reduced viability of monkey kidney cells and increased pH of the PD solutions, and between increased cell viability and increased absorbance at 228 nm of the tested PD solutions. These two correlations can be explained by a strong correlation (r = -0.948) between a decrease in pH and an increase in the solution absorbance at 228 nm. The opposite effect was observed in the MTT test. Weak but expected correlations (r = 0.32 and -0.202, respectively) were found between increased cell viability and increased pH of the PD solutions, and between decreased cell viability and increased absorbance at 228 nm of the tested PD solutions. Middle and weak correlations (r = 0.56 and 0.29, respectively) were detected between increased cell viability and increased lactate concentration in the NR test and the MTT test, respectively. These correlations can be partially explained by the fact that a correlation with a coefficient r = -0.34 was found between decreased pH in the solutions and increased lactate concentration.
Very weak correlations (0.138 and 0.196, respectively) were found between increased cell

  10. ANALYTICAL APPROACHES TO THE STUDY OF EXPORT TRANSACTIONS

    Directory of Open Access Journals (Sweden)

    Ekaterina Viktorovna Medvedeva

    2015-12-01

    Full Text Available Analytical approaches to the study of export operations depend on the terms of individual foreign trade contracts with foreign buyers and on the form in which the Russian supplier of export goods enters a foreign market. Analytical procedures make it possible to anticipate and predict situations that could adversely affect the financial position of the economic entity. An entity engaged in foreign economic activity must analyze not only its current activity but also its export operations. The article considers analytical approaches to the analysis of export operations, presents an example of analyzing export operations over time, and recommends formulas for evaluating exports in dynamics. For comparative analysis, export volume is evaluated in comparable prices. For commodity groups comprising goods commensurable both quantitatively and qualitatively, an index of quantitative structure is calculated, along with a coefficient of delivery delay relative to other periods. The analysis makes it possible to identify trends in export deliveries over the analyzed period as a basis for management decisions. Purpose: to define the methods and techniques applied in the analysis of export operations. Methodology: economic-mathematical and statistical methods of analysis were used. Results: the most informative parameters illustrating several aspects of the analysis of export operations are obtained. Practical implications: the results can usefully be applied by economic entities engaged in foreign economic activity, of which export operations are one element.
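    The comparable-price evaluation mentioned in this abstract follows the standard index-number approach; a hedged sketch with invented prices and quantities (the article's own formulas and data are not given here):

```python
# Value and physical-volume indexes for export dynamics.
# Prices p and quantities q of one commodity group in two periods
# (illustrative numbers only).
p0 = [100.0, 40.0, 250.0]   # base-period contract prices
q0 = [  5.0, 20.0,   2.0]   # base-period quantities
p1 = [110.0, 38.0, 270.0]   # current-period prices
q1 = [  6.0, 18.0,   3.0]   # current-period quantities

value0 = sum(p * q for p, q in zip(p0, q0))
value1 = sum(p * q for p, q in zip(p1, q1))

# Value index: change in export revenue at actual prices.
i_value = value1 / value0
# Quantity index: current quantities valued at base prices,
# i.e. export volume "in comparable prices".
i_quantity = sum(p * q for p, q in zip(p0, q1)) / value0

print(f"value index    {i_value:.3f}")
print(f"quantity index {i_quantity:.3f}")
```

    Comparing the two indexes separates physical growth of deliveries from price effects, which is the point of evaluating exports in comparable prices.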

  11. Analytical study of zirconium and hafnium α-hydroxy carboxylates

    International Nuclear Information System (INIS)

    Terra, V.R.

    1991-01-01

    The analytical study of zirconium and hafnium α-hydroxy carboxylates was described. For this purpose dl-mandelic, dl-p-bromo mandelic, dl-2-naphthyl glycolic, and benzilic acids were prepared. These were used in conjunction with glycolic, dl-lactic, dl-2-hydroxy isovaleric, dl-2-hydroxy hexanoic, and dl-2-hydroxy dodecanoic acids in order to synthesize the zirconium(IV) and hafnium(IV) tetrakis(α-hydroxy carboxylates). The compounds were characterized by melting point determination, infrared spectroscopy, thermogravimetric analysis, calcination to oxides and X-ray diffractometry by the powder method. (C.G.C)

  12. Meta-Analytical Studies in Transport Economics. Methodology and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Brons, M.R.E.

    2006-05-18

    Vast increases in the external costs of transport in the late twentieth century have caused national and international governmental bodies to worry about the sustainability of their transport systems. In this thesis we use meta-analysis as a research method to study various topics in transport economics that are relevant for sustainable transport policymaking. Meta-analysis is a research methodology based on the quantitative summarisation of a body of previously documented empirical evidence. In several fields of economics, meta-analysis has become a well-accepted research tool. Despite the appeal of the meta-analytical approach, there are methodological difficulties that need to be acknowledged. We study a specific methodological problem which is common in meta-analysis in economics, viz., within-study dependence caused by multiple sampling techniques. By means of Monte Carlo analysis we investigate the effect of such dependence on the performance of various multivariate estimators. In the applied part of the thesis we use and develop meta-analytical techniques to study the empirical variation in indicators of the price sensitivity of demand for aviation transport, the price sensitivity of demand for gasoline, the efficiency of urban public transport and the valuation of the external costs of noise from rail transport. We focus on the estimation of mean values for these indicators and on the identification of the impact of conditioning factors.
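    The within-study dependence problem described above can be illustrated by a small Monte Carlo sketch (all parameters invented, not the thesis's actual design): several estimates drawn from the same study share a study-level error, so treating them as independent understates the sampling variance of the pooled mean.

```python
import numpy as np

rng = np.random.default_rng(0)
n_studies, per_study, reps = 20, 5, 2000
tau, sigma = 1.0, 0.5          # between-study and within-study s.d.

means = []
for _ in range(reps):
    study_effect = rng.normal(0.0, tau, n_studies)          # shared per study
    estimates = study_effect[:, None] + rng.normal(0.0, sigma,
                                                   (n_studies, per_study))
    means.append(estimates.mean())

actual_var = np.var(means)
# Naive variance, pretending all n_studies*per_study estimates independent:
naive_var = (tau**2 + sigma**2) / (n_studies * per_study)
# Correct variance, accounting for the shared study-level term:
true_var = tau**2 / n_studies + sigma**2 / (n_studies * per_study)

print(actual_var, naive_var, true_var)
```

    The simulated variance of the pooled mean tracks the clustered formula and is several times larger than the naive one, which is why dependence-aware estimators matter in meta-analysis.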

  13. Analytical and Numerical Studies of Several Fluid Mechanical Problems

    Science.gov (United States)

    Kong, D. L.

    2014-03-01

    In this thesis, three parts, each with several chapters, are devoted respectively to hydrostatic, viscous, and inertial fluid theories and applications. Topics include planetary and biological fluid systems and high performance computing technology. In the hydrostatics part, the classical Maclaurin spheroids theory is generalized, for the first time, to a more realistic multi-layer model, establishing the geometries of both the outer surface and the interfaces. As one of its astrophysical applications, the theory explicitly predicts the physical shapes of the surface and core-mantle boundary for layered terrestrial planets, which enables studies of gravity problems and direct numerical simulations of dynamo flows in rotating planetary cores. As another application of the figure theory, the zonal flow in the deep atmosphere of Jupiter is investigated for a better understanding of the Jovian gravity field. An upper bound on the gravity field distortions, especially in higher-order zonal gravitational coefficients, induced by deep zonal winds is first estimated. The oblate spheroidal shape of an undistorted Jupiter resulting from its fast solid body rotation is fully taken into account, which marks the most significant improvement over previous approximation-based Jovian wind theories. High viscosity flows, for example Stokes flows, occur in many processes involving low-speed motions in fluids; microorganism swimming is a typical case. A fully three dimensional analytic solution of the incompressible Stokes equation is derived in the exterior domain of an arbitrarily translating and rotating prolate spheroid, which models a large family of microorganisms such as cocci bacteria. The solution is then applied to the magnetotactic bacteria swimming problem, and good consistency has been found between theoretical predictions and laboratory observations of the moving patterns of such bacteria under magnetic fields. In the analysis of dynamics of planetary

  14. Analytical results for 544 water samples collected in the Attean Quartz Monzonite in the vicinity of Jackman, Maine

    Science.gov (United States)

    Ficklin, W.H.; Nowlan, G.A.; Preston, D.J.

    1983-01-01

    Water samples were collected in the vicinity of Jackman, Maine as a part of the study of the relationship of dissolved constituents in water to the sediments subjacent to the water. Each sample was analyzed for specific conductance, alkalinity, acidity, pH, fluoride, chloride, sulfate, phosphate, nitrate, sodium, potassium, calcium, magnesium, and silica. Trace elements determined were copper, zinc, molybdenum, lead, iron, manganese, arsenic, cobalt, nickel, and strontium. The longitude and latitude of each sample location and a sample site map are included in the report as well as a table of the analytical results.

  15. Target normal sheath acceleration analytical modeling, comparative study and developments

    International Nuclear Information System (INIS)

    Perego, C.; Batani, D.; Zani, A.; Passoni, M.

    2012-01-01

    Ultra-intense laser interaction with solid targets appears to be an extremely promising technique to accelerate ions up to several MeV, producing beams that exhibit interesting properties for many foreseen applications. Nowadays, most of the published experimental results can be theoretically explained in the framework of the target normal sheath acceleration (TNSA) mechanism proposed by Wilks et al. [Phys. Plasmas 8(2), 542 (2001)]. As an alternative to numerical simulation, various analytical or semi-analytical TNSA models have been published in recent years, each of them trying to provide predictions for some of the ion beam features, given the initial laser and target parameters. However, the problem of developing a reliable model for the TNSA process is still open, which is why the purpose of this work is to clarify the present state of TNSA modeling and experimental results, by means of a quantitative comparison between measurements and theoretical predictions of the maximum ion energy. Moreover, in the light of such an analysis, some indications for the future development of the model proposed by Passoni and Lontano [Phys. Plasmas 13(4), 042102 (2006)] are presented.

  16. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    International Nuclear Information System (INIS)

    Lowry, N.J.

    1998-01-01

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints

  17. Analytical Study of High Concentration PCB Paint at the Heavy Water Components Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, N.J.

    1998-10-21

    This report provides results of an analytical study of high concentration PCB paint in a shutdown nuclear test reactor located at the US Department of Energy's Savannah River Site (SRS). The study was designed to obtain data relevant for an evaluation of potential hazards associated with the use of and exposure to such paints.

  18. Quality assurance (QA) training at Westinghouse including innovative approaches for achieving an effective QA programme and establishing constructive interaction

    International Nuclear Information System (INIS)

    Chivers, J.H.; Scanga, B.E.

    1982-01-01

    Experience of the Westinghouse Water Reactors Division with indoctrination and training of quality engineers includes training of personnel from Westinghouse divisions in the USA and overseas as well as of customers' personnel. A written plan is prepared for each trainee in order to fit the training to the individual's needs, and to cover the full range of information and activities. The trainee is also given work assignments, working closely with experienced quality engineers. He may prepare inspection plans and audit check lists, assist in the preparation of QA training modules, write procedures, and perform supplier surveillance and data analyses, or make special studies of operating systems. The trainee attends seminars and special courses on work-related technical subjects. Throughout the training period, emphasis is placed on inculcating an attitude of team work in the trainee so that the result of the training is the achievement of both quality and productivity. Certification is extended (given that education/experience/skill requirements are met) to such functions as mechanical equipment quality engineering, electrical equipment quality engineering, and start-up and testing quality engineering. A well-trained quality engineer is equipped to provide technical assistance to other disciplines and, through effective co-operation with others, contributes to the success of the organization's endeavours. (author)

  19. German precursor study: methods and results

    International Nuclear Information System (INIS)

    Hoertner, H.; Frey, W.; von Linden, J.; Reichart, G.

    1985-01-01

    This study has been prepared by the GRS by contract of the Federal Minister of Interior. The purpose of the study is to show how the application of system-analytic tools and especially of probabilistic methods on the Licensee Event Reports (LERs) and on other operating experience can support a deeper understanding of the safety-related importance of the events reported in reactor operation, the identification of possible weak points, and further conclusions to be drawn from the events. Additionally, the study aimed at a comparison of its results for the severe core damage frequency with those of the German Risk Study as far as this is possible and useful. The German Precursor Study is a plant-specific study. The reference plant is Biblis NPP with its very similar Units A and B, whereby the latter was also the reference plant for the German Risk Study

  20. NRC [Nuclear Regulatory Commission] perspective of software QA [quality assurance] in the nuclear history

    International Nuclear Information System (INIS)

    Weiss, S.H.

    1988-01-01

    Computer technology has been a part of the nuclear industry since its inception. However, it is only recently that computers have been integrated into reactor operations. During the early history of commercial nuclear power in the United States, the US Nuclear Regulatory Commission (NRC) discouraged the use of digital computers for real-time control and monitoring of nuclear power plant operation. At the time, this position was justified since software engineering was in its infancy, and horror stories of computer crashes were plentiful. Since the advent of microprocessors and inexpensive computer memories, significant advances have been made in fault-tolerant computer architecture that have resulted in highly reliable, durable computer systems. The NRC's requirement for the safety parameter display system (SPDS) stemmed from the results of studies and investigations conducted on the Three Mile Island Unit 2 (TMI-2) accident. An NRC contractor has prepared a handbook of software QA techniques applicable to the nuclear industry, published as NUREG/CR-4640 in August 1987. Currently, the NRC is considering development of an inspection program covering software QA. Future efforts may address verification and validation as applied to expert systems and artificial intelligence programs

  1. Minimum requirements on a QA program in radiation oncology

    International Nuclear Information System (INIS)

    Almond, P.R.

    1996-01-01

    In April 1994, the American Association of Physicists in Medicine published "Comprehensive QA for radiation oncology", a report of the AAPM Radiation Therapy Committee. This comprehensive QA program is likely to become the standard for such programs in the United States. The program stresses the interdisciplinary nature of QA in radiation oncology, involving the radiation oncologists, radiotherapy technologists (radiographers), dosimetrists, and accelerator engineers, as well as the medical physicists. This paper describes a comprehensive quality assurance program with the main emphasis on quality assurance in radiation therapy using a linear accelerator. The paper deals with QA for a linear accelerator and simulator and QA for treatment planning computers. Next, the treatment planning process and QA for individual patients are described. The main features of this report, which should apply to QA programs in any country, emphasize the responsibilities of the medical physicist. (author). 7 refs, 9 tabs

  2. Minimum requirements on a QA program in radiation oncology

    Energy Technology Data Exchange (ETDEWEB)

    Almond, P R [Louisville Univ., Louisville, KY (United States). J.G. Brown Cancer Center

    1996-08-01

    In April 1994, the American Association of Physicists in Medicine published "Comprehensive QA for radiation oncology", a report of the AAPM Radiation Therapy Committee. This comprehensive QA program is likely to become the standard for such programs in the United States. The program stresses the interdisciplinary nature of QA in radiation oncology, involving the radiation oncologists, radiotherapy technologists (radiographers), dosimetrists, and accelerator engineers, as well as the medical physicists. This paper describes a comprehensive quality assurance program with the main emphasis on quality assurance in radiation therapy using a linear accelerator. The paper deals with QA for a linear accelerator and simulator and QA for treatment planning computers. Next, the treatment planning process and QA for individual patients are described. The main features of this report, which should apply to QA programs in any country, emphasize the responsibilities of the medical physicist. (author). 7 refs, 9 tabs.

  3. Lymphocytes Negatively Regulate NK Cell Activity via Qa-1b following Viral Infection

    Directory of Open Access Journals (Sweden)

    Haifeng C. Xu

    2017-11-01

    Full Text Available NK cells can reduce anti-viral T cell immunity during chronic viral infections, including infection with the lymphocytic choriomeningitis virus (LCMV. However, regulating factors that maintain the equilibrium between productive T cell and NK cell immunity are poorly understood. Here, we show that a large viral load resulted in inhibition of NK cell activation, which correlated with increased expression of Qa-1b, a ligand for inhibitory NK cell receptors. Qa-1b was predominantly upregulated on B cells following LCMV infection, and this upregulation was dependent on type I interferons. Absence of Qa-1b resulted in increased NK cell-mediated regulation of anti-viral T cells following viral infection. Consequently, anti-viral T cell immunity was reduced in Qa-1b- and NKG2A-deficient mice, resulting in increased viral replication and immunopathology. NK cell depletion restored anti-viral immunity and virus control in the absence of Qa-1b. Taken together, our findings indicate that lymphocytes limit NK cell activity during viral infection in order to promote anti-viral T cell immunity.

  4. Tank 241-S-106, cores 183, 184 and 187 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-S-106 push mode core segments collected between February 12, 1997 and March 21, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (TSAP), the Tank Safety Screening Data Quality Objective (Safety DQO), the Historical Model Evaluation Data Requirements (Historical DQO) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO). The analytical results are included in Table 1. Six of the twenty-four subsamples submitted for the differential scanning calorimetry (DSC) analysis exceeded the notification limit of 480 Joules/g stated in the DQO. Appropriate notifications were made. Total Organic Carbon (TOC) analyses were performed on all samples that produced exotherms during the DSC analysis. All results were less than the notification limit of three weight percent TOC. No cyanide analysis was performed, per agreement with the Tank Safety Program. None of the samples submitted for Total Alpha Activity exceeded notification limits as stated in the TSAP. Statistical evaluation of results by calculating the 95% upper confidence limit is not performed by the 222-S Laboratory and is not considered in this report. No core composites were created because there was insufficient solid material from any of the three core sampling events to generate a composite that would be representative of the tank contents

  5. Analytical results for a stochastic model of gene expression with arbitrary partitioning of proteins

    Science.gov (United States)

    Tschirhart, Hugo; Platini, Thierry

    2018-05-01

    In biophysics, the search for analytical solutions of stochastic models of cellular processes is often a challenging task. In recent work on models of gene expression, it was shown that a mapping based on partitioning of Poisson arrivals (PPA-mapping) can lead to exact solutions for previously unsolved problems. While the approach can be used in general when the model involves Poisson processes corresponding to creation or degradation, current applications of the method and new results derived using it have been limited to date. In this paper, we present the exact solution of a variation of the two-stage model of gene expression (with time dependent transition rates) describing the arbitrary partitioning of proteins. The methodology proposed makes full use of the PPA-mapping by transforming the original problem into a new process describing the evolution of three biological switches. Based on a succession of transformations, the method leads to a hierarchy of reduced models. We give an integral expression of the time dependent generating function as well as explicit results for the mean, variance, and correlation function. Finally, we discuss how results for time dependent parameters can be extended to the three-stage model and used to make inferences about models with parameter fluctuations induced by hidden stochastic variables.
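    As background for the two-stage model referred to above, a minimal Gillespie-type stochastic simulation (rates invented for illustration) reproduces the textbook steady-state mean protein level, (km/gm)*(kp/gp); this sketches only the base model, not the paper's partitioning extension or the PPA-mapping itself:

```python
import random

def gillespie_two_stage(km=2.0, gm=1.0, kp=4.0, gp=0.1, t_end=2000.0, seed=1):
    """Simulate mRNA (m) and protein (p) counts; return time-averaged p."""
    rng = random.Random(seed)
    t, m, p = 0.0, 0, 0
    acc, t_prev = 0.0, 0.0
    while t < t_end:
        # Propensities: transcription, mRNA decay, translation, protein decay.
        rates = [km, gm * m, kp * m, gp * p]
        total = sum(rates)
        t += rng.expovariate(total)
        acc += p * (t - t_prev)            # time-weighted protein average
        t_prev = t
        u, cum, idx = rng.random() * total, 0.0, 0
        for idx, rr in enumerate(rates):
            cum += rr
            if u < cum:
                break
        if idx == 0:   m += 1
        elif idx == 1: m -= 1
        elif idx == 2: p += 1
        else:          p -= 1
    return acc / t_prev

mean_p = gillespie_two_stage()
print(mean_p)   # analytic steady-state mean: (km/gm)*(kp/gp) = 2 * 40 = 80
```

    The long-time average should fluctuate around the analytic mean of 80 for these rates; the paper's contribution is obtaining such statistics exactly, with time-dependent rates and arbitrary protein partitioning.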

  6. Analytical study of stress and deformation of HTR fuel blocks

    International Nuclear Information System (INIS)

    Tanaka, M.

    1982-01-01

    A two-dimensional finite element computer code named HANS-GR has been developed to predict the mechanical behavior of graphite fuel blocks with realistic material properties and core environment. When graphite is exposed to high temperature and a high-density fast neutron flux, strains arise due to thermal expansion, irradiation-induced shrinkage, and creep. Stresses and distortions are thus induced in a fuel block in which these strains vary spatially. The analytical method used in the program to predict these induced stresses and distortions by the finite element method is discussed. To illustrate the versatility of the computer code, numerical results of two example analyses of the multi-hole type fuel elements in the VHTR reactor are given: one concerning stresses in fuel blocks with control rod holes, the other concerning distortions of fuel blocks at the periphery of the reactor core. These phenomena should be carefully examined when multi-hole type fuel elements are applied to the VHTR. It is shown that the predicted mechanical behavior of the graphite components depends strongly on the material properties used, so obtaining reliable material property data is essential for reliable analytical predictions.

  7. An Analytical Study of Prostate-Specific Antigen Dynamics.

    Science.gov (United States)

    Esteban, Ernesto P; Deliz, Giovanni; Rivera-Rodriguez, Jaileen; Laureano, Stephanie M

    2016-01-01

    The purpose of this research is to carry out a quantitative study of prostate-specific antigen (PSA) dynamics for patients with prostatic diseases, such as benign prostatic hyperplasia (BPH) and localized prostate cancer (LPC). The proposed PSA mathematical model was implemented using clinical data of 218 Japanese patients with histologically proven BPH and 147 Japanese patients with LPC (stages T2a and T2b). For both prostatic diseases a nonlinear equation was obtained and solved in closed form to predict PSA progression with patients' age. The general solution describes PSA dynamics for patients with either disease; particular solutions allow studying PSA dynamics for patients with BPH or LPC separately. The analytical solutions, obtained in closed form, were used to develop nomograms for a better understanding of PSA dynamics in patients with BPH and LPC. This study may be useful to improve the diagnosis and prognosis of prostatic diseases.

  8. Some analytical results for toroidal magnetic field coils with elongated minor cross-sections

    International Nuclear Information System (INIS)

    Raeder, J.

    1976-09-01

    The problem of determining the shape of a flexible current filament forming part of an ideal toroidal magnetic field coil is solved in a virtually analytical form. Analytical formulae for characteristic coil dimensions, stored magnetic energies, inductances and forces are derived for the so-called D-coils. The analytically calculated inductances of ideal D-coils are compared with numerically calculated ones for the case of finite numbers of D-shaped current filaments. Finally, the magnetic energies stored in ideal rectangular, elliptic and D-coils are compared. (orig.) [de

  9. Tank 241-B-108, cores 172 and 173 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L., Fluor Daniel Hanford

    1997-03-04

    The Data Summary Table (Table 3) included in this report compiles analytical results in compliance with all applicable DQOs. Liquid subsamples that were prepared for analysis by an acid adjustment of the direct subsample are indicated by a 'D' in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a fusion digest are indicated by an 'F' in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a water digest are indicated by a 'W' or an 'I' in the A column of Table 3. Due to poor precision and accuracy in the original analysis of both Lower Half Segment 2 of Core 173 and the core composite of Core 173, fusion and water digests were performed a second time. Precision and accuracy improved with the repreparation of the Core 173 composite. Analyses with the repreparation of Lower Half Segment 2 of Core 173 did not show improvement, suggesting sample heterogeneity. Results from both preparations are included in Table 3.

  10. Tank 241-TX-104, cores 230 and 231 analytical results for the final report

    International Nuclear Information System (INIS)

    Diaz, L.A.

    1998-01-01

    This document is the analytical laboratory report for tank 241-TX-104 push mode core segments collected between February 18, 1998 and February 23, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-TX-104 Push Mode Core Sampling and Analysis Plan (TSAP) (McCain, 1997), the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO) (Turner et al., 1995) and the Safety Screening Data Quality Objective (DQO) (Dukelow et al., 1995). The analytical results are included in the data summary table. None of the samples submitted for Differential Scanning Calorimetry (DSC) and Total Alpha Activity (AT) exceeded notification limits as stated in the TSAP. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group in accordance with the Memorandum of Understanding (Schreiber, 1997) and are not considered in this report. Appearance and sample handling: Attachment 1 is a cross-reference relating the tank farm identification numbers to the 222-S Laboratory LabCore/LIMS sample numbers. The subsamples generated in the laboratory for analyses are identified in these diagrams with their sources shown. Core 230: Three push mode core segments were removed from tank 241-TX-104 riser 9A on February 18, 1998. Segments were received by the 222-S Laboratory on February 19, 1998. Two segments were expected for this core; however, due to poor sample recovery, an additional segment was taken and identified as 2A. Core 231: Four push mode core segments were removed from tank 241-TX-104 riser 13A between February 19, 1998 and February 23, 1998. Segments were received by the 222-S Laboratory on February 24, 1998. Two segments were expected for this core; however, due to poor sample recovery, additional segments were taken and identified as 2A and 2B.
The TSAP states the core samples should be transported to the laboratory within three

  11. Heavy-quark QCD vacuum polarisation function. Analytical results at four loops

    International Nuclear Information System (INIS)

    Kniehl, B.A.; Kotikov, A.V.

    2006-07-01

    The first two moments of the heavy-quark vacuum polarisation function at four loops in quantum chromodynamics are found in fully analytical form by evaluating the missing massive four-loop tadpole master integrals. (orig.)

  12. Analytical Results for Scaling Properties of the Spectrum of the Fibonacci Chain

    Science.gov (United States)

    Piéchon, Frédéric; Benakli, Mourad; Jagannathan, Anuradha

    1995-06-01

    We solve the approximate renormalization group found by Niu and Nori for a quasiperiodic tight-binding Hamiltonian on the Fibonacci chain. This enables us to characterize analytically the spectral properties of this model.
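    The underlying model is the standard tight-binding chain whose hopping amplitudes follow the Fibonacci substitution A → AB, B → A; a brief numerical sketch (hopping values invented) builds the chain and diagonalizes it, illustrating the fragmented spectrum that the paper characterizes analytically:

```python
import numpy as np

def fibonacci_word(n):
    """Iterate the substitution A -> AB, B -> A, starting from 'A'."""
    w = "A"
    for _ in range(n):
        w = "".join("AB" if c == "A" else "A" for c in w)
    return w

# Hopping amplitudes t_A, t_B follow the Fibonacci word (values illustrative).
tA, tB = 1.0, 0.5
word = fibonacci_word(10)                 # length = Fibonacci number F_12 = 144
hop = np.array([tA if c == "A" else tB for c in word])

# Off-diagonal (hopping-modulated) tight-binding Hamiltonian, open chain.
n = len(hop) + 1
H = np.zeros((n, n))
for i, t in enumerate(hop):
    H[i, i + 1] = H[i + 1, i] = t

energies = np.linalg.eigvalsh(H)
# The spectrum is known to fragment into a Cantor-like hierarchy of clusters,
# which is the scaling structure analyzed by the renormalization group.
print(n, energies.min(), energies.max())
```

    The spectrum is symmetric about zero (the chain is bipartite), and plotting the sorted eigenvalues shows the characteristic gaps at every scale.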

  13. Resonant amplification of neutrino transitions in the Sun: exact analytical results

    International Nuclear Information System (INIS)

    Toshev, S.; Petkov, P.

    1988-01-01

    We investigate in detail the Mikheyev-Smirnov-Wolfenstein explanation of the solar neutrino puzzle using analytical expressions for the neutrino transition probabilities in matter with exponentially varying electron number density
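    The two-flavour matter-mixing relation underlying such analyses can be sketched numerically; this is the textbook resonance formula, not the paper's exact results for an exponential density profile, and the vacuum mixing value is illustrative:

```python
import math

sin2_2theta = 0.8                        # vacuum sin^2(2*theta), illustrative
cos_2theta = math.sqrt(1.0 - sin2_2theta)

def sin2_2theta_matter(a_over_dm2):
    """Effective mixing sin^2(2*theta_m) in matter.

    a_over_dm2 is A / dm^2, where A = 2*sqrt(2)*G_F*N_e*E is the matter
    potential in units of the vacuum mass-squared splitting.
    """
    denom = sin2_2theta + (cos_2theta - a_over_dm2) ** 2
    return sin2_2theta / denom

# In vacuum (N_e = 0) the formula reduces to the vacuum mixing; at the MSW
# resonance A/dm^2 = cos(2*theta) the mixing in matter becomes maximal.
print(sin2_2theta_matter(0.0), sin2_2theta_matter(cos_2theta))
```

    The resonant amplification in the title is exactly this effect: as solar neutrinos traverse the falling electron density, they pass through the layer where the resonance condition holds and the effective mixing is maximal.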

  14. Process and results of analytical framework and typology development for POINT

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Lehtonen, Markku; Bauler, Tom

    2009-01-01

    POINT is a project about how indicators are used in practice; to what extent and in what way indicators actually influence, support, or hinder policy and decision making processes, and what could be done to enhance the positive role of indicators in such processes. The project needs an analytical framework for this work; the paper presents a set of core concepts and associated typologies, a series of proposed analytic schemes, and a number of research propositions and questions for the subsequent empirical work in POINT.

  15. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    Science.gov (United States)

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity

  16. Study of trace elements in milk by nuclear analytical techniques

    International Nuclear Information System (INIS)

    Gharib, A.; Rahimi, H.; Pyrovan, H.; Raoffi, N.J.; Taherpoor, H.

    1985-01-01

    This work is part of a project with the IAEA in a coordinated program on 'Trace Elements in Human and Bio-environmental Systems' to evaluate the nutritional requirements and interrelations of trace elements and their role in health, metabolism, etc. Cow's milk is regarded as one of the most important and most nutritious foodstuffs of mankind. Hence, as a first step, an elemental analysis of milk was carried out: a few samples of pasteurized milk and local samples were investigated for essential and toxic trace elements. A secondary aim of the project was the assessment of the various analytical techniques involved; AAS, PIXE and NAA are presented here. The latter was applied both instrumentally and radiochemically. Although the results pertaining to the various methods employed are not in good agreement, there is some justification for clarifying this internal inconsistency. PIXE analysis is very fast and rather routine, but the technique needs certain adaptations and improvements for trace element analysis. (author)

  17. Pressurized thermal shocks: the JRC Ispra experimental test rig and analytical results

    International Nuclear Information System (INIS)

    Jovanovic, A.; Lucia, A.C.

    1990-01-01

    The paper addresses some issues of particular interest for predicting the remanent (remaining) life of pressurized components exposed to pressurized thermal shock (PTS) loads, issues examined in the analytical work performed within the MPA-JRC collaboration on PTS experimental research at JRC Ispra. They concern, in general, the application of damage mechanics, fracture mechanics and artificial intelligence (including the treatment of uncertainties in PTS analysis and experiments), and are essential for further understanding and modelling of crack behaviour and of component response under PTS conditions. In particular, the development of the FRAP preprocessor and the development and implementation of a methodology for analysing local non-stationary heat transfer coefficients during a PTS are explained in more detail. FRAP is used as a front end to the finite element code ABAQUS for the heat transfer, stress and fracture mechanics analyses. The ABAQUS results are then used for the probabilistic fatigue crack growth analysis performed by the COVASTOL code. (author)

  18. SEMI-ANALYTIC GALAXY EVOLUTION (SAGE): MODEL CALIBRATION AND BASIC RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Croton, Darren J.; Stevens, Adam R. H.; Tonini, Chiara; Garel, Thibault; Bernyk, Maksym; Bibiano, Antonio; Hodkinson, Luke; Mutch, Simon J.; Poole, Gregory B.; Shattow, Genevieve M. [Centre for Astrophysics and Supercomputing, Swinburne University of Technology, P.O. Box 218, Hawthorn, Victoria 3122 (Australia)

    2016-02-15

    This paper describes a new publicly available codebase for modeling galaxy formation in a cosmological context, the “Semi-Analytic Galaxy Evolution” model, or sage for short. sage is a significant update to the 2006 model of Croton et al. and has been rebuilt to be modular and customizable. The model will run on any N-body simulation whose trees are organized in a supported format and contain a minimum set of basic halo properties. In this work, we present the baryonic prescriptions implemented in sage to describe the formation and evolution of galaxies, and their calibration for three N-body simulations: Millennium, Bolshoi, and GiggleZ. Updated physics include the following: gas accretion, ejection due to feedback, and reincorporation via the galactic fountain; a new gas cooling–radio mode active galactic nucleus (AGN) heating cycle; AGN feedback in the quasar mode; a new treatment of gas in satellite galaxies; and galaxy mergers, disruption, and the build-up of intra-cluster stars. Throughout, we show the results of a common default parameterization on each simulation, with a focus on the local galaxy population.

  19. Network Traffic Analysis With Query Driven VisualizationSC 2005HPC Analytics Results

    Energy Technology Data Exchange (ETDEWEB)

    Stockinger, Kurt; Wu, Kesheng; Campbell, Scott; Lau, Stephen; Fisk, Mike; Gavrilov, Eugene; Kent, Alex; Davis, Christopher E.; Olinger,Rick; Young, Rob; Prewett, Jim; Weber, Paul; Caudell, Thomas P.; Bethel,E. Wes; Smith, Steve

    2005-09-01

    Our analytics challenge is to identify, characterize, and visualize anomalous subsets of large collections of network connection data. We use a combination of HPC resources, advanced algorithms, and visualization techniques. To effectively and efficiently identify the salient portions of the data, we rely on a multi-stage workflow that includes data acquisition, summarization (feature extraction), novelty detection, and classification. Once these subsets of interest have been identified and automatically characterized, we use a state-of-the-art high-dimensional query system to extract data subsets for interactive visualization. Our approach is equally useful for other large-data analysis problems where it is more practical to identify interesting subsets of the data for visualization than to render all data elements. By reducing the size of the rendering workload, we enable highly interactive and useful visualizations. As a result of this work we were able to analyze six months' worth of data interactively with response times two orders of magnitude shorter than with conventional methods.
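    The multi-stage workflow described in this abstract (feature extraction, then novelty detection, then querying only the flagged subset for visualization) can be illustrated with a deliberately simplified sketch. The per-host connection counts and z-score test below stand in for the paper's far richer features and detectors; all names and thresholds here are hypothetical.

    ```python
    import statistics
    from collections import Counter

    def connection_features(records):
        """Stage 1 (summarization): reduce raw (src, dst, port)
        connection records to a per-source-host connection count."""
        return Counter(src for src, dst, port in records)

    def novel_hosts(counts, threshold=2.0):
        """Stage 2 (novelty detection): flag hosts whose count is a
        z-score outlier -- a crude stand-in for the real detectors."""
        vals = list(counts.values())
        mean = statistics.fmean(vals)
        sd = statistics.pstdev(vals) or 1.0
        return {host for host, c in counts.items() if (c - mean) / sd > threshold}
    ```

    Stage 3 would then query only the flagged hosts' records for interactive visualization, instead of rendering all data elements.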

  20. Analytic result for the one-loop scalar pentagon integral with massless propagators

    International Nuclear Information System (INIS)

    Kniehl, Bernd A.; Tarasov, Oleg V.

    2010-01-01

    The method of dimensional recurrences proposed by one of the authors (O. V. Tarasov, 1996) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F₃ and the Gauss hypergeometric function ₂F₁, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions ₂F₁. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions ₂F₁ are presented in d=2-2ε, 4-2ε, and 6-2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d=4-2ε dimensions is given in terms of the Appell function F₃ and the Gauss hypergeometric function ₂F₁. (orig.)
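    The one-fold integral representation mentioned for the Gauss function is the standard Euler representation (a textbook identity, not a result specific to this paper):

    ```latex
    {}_2F_1(a,b;c;z) \;=\; \frac{\Gamma(c)}{\Gamma(b)\,\Gamma(c-b)}
    \int_0^1 t^{\,b-1}\,(1-t)^{\,c-b-1}\,(1-zt)^{-a}\,dt,
    \qquad \operatorname{Re} c > \operatorname{Re} b > 0 .
    ```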

  1. Analytic result for the one-loop scalar pentagon integral with massless propagators

    Energy Technology Data Exchange (ETDEWEB)

    Kniehl, Bernd A.; Tarasov, Oleg V. [Hamburg Univ. (Germany). II. Inst. fuer Theoretische Physik

    2010-01-15

    The method of dimensional recurrences proposed by one of the authors (O. V. Tarasov, 1996) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F₃ and the Gauss hypergeometric function ₂F₁, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions ₂F₁. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions ₂F₁ are presented in d=2-2ε, 4-2ε, and 6-2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d=4-2ε dimensions is given in terms of the Appell function F₃ and the Gauss hypergeometric function ₂F₁. (orig.)

  2. Dynamics of tachyon fields and inflation - comparison of analytical and numerical results with observation

    Directory of Open Access Journals (Sweden)

    Milošević M.

    2016-01-01

    The role tachyon fields may play in the evolution of the early universe is discussed in this paper. We consider the evolution of a flat and homogeneous universe governed by a tachyon scalar field with the DBI-type action and calculate the slow-roll parameters of inflation, the scalar spectral index (n), and the tensor-to-scalar ratio (r) for the given potentials. We pay special attention to the inverse power potential, first of all to V(x) ~ x⁻⁴, and compare the available results obtained by analytical and numerical methods with those obtained by observation. It is shown that the computed values of the observational parameters and the observed ones are in good agreement for high values of the constant X0. The possibility that the influence of the radion field can extend the range of acceptable values of the constant X0 to the string-theory-motivated sector of its values is briefly considered. [Projekat Ministarstva nauke Republike Srbije, br. 176021, br. 174020 i br. 43011]
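    For orientation, in standard single-field slow-roll inflation the observables quoted above are tied to the potential through the familiar textbook relations below; the tachyon/DBI action modifies these expressions, so they are shown only as the baseline against which such models are compared:

    ```latex
    \epsilon = \frac{M_p^2}{2}\left(\frac{V'}{V}\right)^{\!2}, \qquad
    \eta = M_p^2\,\frac{V''}{V}, \qquad
    n_s \simeq 1 - 6\epsilon + 2\eta, \qquad
    r \simeq 16\,\epsilon .
    ```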

  3. SEMI-ANALYTIC GALAXY EVOLUTION (SAGE): MODEL CALIBRATION AND BASIC RESULTS

    International Nuclear Information System (INIS)

    Croton, Darren J.; Stevens, Adam R. H.; Tonini, Chiara; Garel, Thibault; Bernyk, Maksym; Bibiano, Antonio; Hodkinson, Luke; Mutch, Simon J.; Poole, Gregory B.; Shattow, Genevieve M.

    2016-01-01

    This paper describes a new publicly available codebase for modeling galaxy formation in a cosmological context, the “Semi-Analytic Galaxy Evolution” model, or sage for short. sage is a significant update to the 2006 model of Croton et al. and has been rebuilt to be modular and customizable. The model will run on any N-body simulation whose trees are organized in a supported format and contain a minimum set of basic halo properties. In this work, we present the baryonic prescriptions implemented in sage to describe the formation and evolution of galaxies, and their calibration for three N-body simulations: Millennium, Bolshoi, and GiggleZ. Updated physics include the following: gas accretion, ejection due to feedback, and reincorporation via the galactic fountain; a new gas cooling–radio mode active galactic nucleus (AGN) heating cycle; AGN feedback in the quasar mode; a new treatment of gas in satellite galaxies; and galaxy mergers, disruption, and the build-up of intra-cluster stars. Throughout, we show the results of a common default parameterization on each simulation, with a focus on the local galaxy population.

  4. Analytic result for the one-loop scalar pentagon integral with massless propagators

    International Nuclear Information System (INIS)

    Kniehl, Bernd A.; Tarasov, Oleg V.

    2010-01-01

    The method of dimensional recurrences proposed by Tarasov (1996, 2000) is applied to the evaluation of the pentagon-type scalar integral with on-shell external legs and massless internal lines. For the first time, an analytic result valid for arbitrary space-time dimension d and five arbitrary kinematic variables is presented. An explicit expression in terms of the Appell hypergeometric function F₃ and the Gauss hypergeometric function ₂F₁, both admitting one-fold integral representations, is given. In the case when one kinematic variable vanishes, the integral reduces to a combination of Gauss hypergeometric functions ₂F₁. For the case when one scalar invariant is large compared to the others, the asymptotic values of the integral in terms of Gauss hypergeometric functions ₂F₁ are presented in d=2-2ε, 4-2ε, and 6-2ε dimensions. For multi-Regge kinematics, the asymptotic value of the integral in d=4-2ε dimensions is given in terms of the Appell function F₃ and the Gauss hypergeometric function ₂F₁.

  5. Experimental and analytical study on removal of strontium from cultivated soil

    International Nuclear Information System (INIS)

    Fukutani, Satoshi; Takahashi, Tomoyuki

    2003-01-01

    An experimental and analytical study was performed to estimate the removal of strontium from cultivated soil. Continuous batch tests were carried out, and a hard-to-desorb (immobile) form of strontium was shown to exist. A two-component model, which considers both readily desorbing and poorly desorbing fractions, was constructed and explained the continuous batch test results well. (author)
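    A two-component (fast-pool/slow-pool) desorption model of the kind described can be sketched as follows. The function name, pool fraction, and rate constants below are illustrative assumptions for a first-order picture, not parameters from the study itself:

    ```python
    import math

    def two_component_remaining(t, f_fast, k_fast, k_slow):
        """Fraction of strontium remaining after extraction time t,
        with a readily desorbing pool (fraction f_fast, rate k_fast)
        and a poorly desorbing pool (fraction 1 - f_fast, rate k_slow),
        each released by first-order kinetics."""
        f_slow = 1.0 - f_fast
        return f_fast * math.exp(-k_fast * t) + f_slow * math.exp(-k_slow * t)
    ```

    In practice the two rate constants and the fast fraction would be fitted to the continuous batch test data; the persistent tail governed by k_slow is what a single-component model fails to reproduce.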

  6. Measuring Students' Writing Ability on a Computer-Analytic Developmental Scale: An Exploratory Validity Study

    Science.gov (United States)

    Burdick, Hal; Swartz, Carl W.; Stenner, A. Jackson; Fitzgerald, Jill; Burdick, Don; Hanlon, Sean T.

    2013-01-01

    The purpose of the study was to explore the validity of a novel computer-analytic developmental scale, the Writing Ability Developmental Scale. On the whole, collective results supported the validity of the scale. It was sensitive to writing ability differences across grades and sensitive to within-grade variability as compared to human-rated…

  7. (U) An Analytic Study of Piezoelectric Ejecta Mass Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tregillis, Ian Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-16

    We consider the piezoelectric measurement of the areal mass of an ejecta cloud, for the specific case where ejecta are created by a single shock at the free surface and fly ballistically through vacuum to the sensor. To do so, we define time- and velocity-dependent ejecta “areal mass functions” at the source and sensor in terms of typically unknown distribution functions for the ejecta particles. Next, we derive an equation governing the relationship between the areal mass function at the source (which resides in the rest frame of the free surface) and at the sensor (which resides in the laboratory frame). We also derive expressions for the analytic (“true”) accumulated ejecta mass at the sensor and the measured (“inferred”) value obtained via the standard method for analyzing piezoelectric voltage traces. This approach enables us to derive an exact expression for the error imposed upon a piezoelectric ejecta mass measurement (in a perfect system) by the assumption of instantaneous creation. We verify that when the ejecta are created instantaneously (i.e., when the time dependence is a delta function), the piezoelectric inference method exactly reproduces the correct result. When creation is not instantaneous, the standard piezo analysis will always overestimate the true mass. However, the error is generally quite small (less than several percent) for most reasonable velocity and time dependences. In some cases, errors exceeding 10-15% may require velocity distributions or ejecta production timescales inconsistent with experimental observations. These results are demonstrated rigorously with numerous analytic test problems.

  8. Analytic study of the Migdal-Kadanoff recursion formula

    International Nuclear Information System (INIS)

    Ito, K.R.

    1984-01-01

    After proposing lattice gauge field models in which the Migdal renormalization group recursion formulas are exact, we study the recursion formulas analytically. If D is less than 4, it is shown that the effective actions of D-dimensional U(1) lattice gauge models are uniformly driven to the high temperature region no matter how low the initial temperature is. If the initial temperature is large enough, this holds for any D and gauge group G. The same is true for the recursion formulas of Kadanoff type. It turns out, however, that the string tension for D=3 obtained by these methods is rather large compared with that already obtained by Mack and Goepfert and by the present author. The reason is clarified. (orig.)

  9. Analytical study plan: Shielded Cells batch 1 campaign; Revision 1

    International Nuclear Information System (INIS)

    Bibler, N.E.; Ha, B.C.; Hay, M.S.; Ferrara, D.M.; Andrews, M.K.

    1993-01-01

    Radioactive operations in the Defense Waste Processing Facility (DWPF) will require that the Savannah River Technology Center (SRTC) perform analyses and special studies with actual Savannah River Site (SRS) high-level waste sludge. SRS Tank 42 and Tank 51 will comprise the first batch of sludge to be processed in the DWPF. Approximately 25 liters of sludge from each of these tanks will be characterized and processed in the Shielded Cells of SRTC. During the campaign, processes will include sludge characterization, sludge washing, rheology determination, mixing, hydrogen evolution, feed preparation, and vitrification of the waste. To complete the campaign, the glass will be characterized to determine its durability and crystallinity. This document describes the types of samples that will be produced, the sampling schedule and analyses required, and the methods for sample and analytical control

  10. Analytic study of nonperturbative solutions in open string field theory

    International Nuclear Information System (INIS)

    Bars, I.; Kishimoto, I.; Matsuo, Y.

    2003-01-01

    We propose an analytic framework to study the nonperturbative solutions of Witten's open string field theory. The method is based on the Moyal star formulation where the kinetic term can be split into two parts. The first one describes the spectrum of two identical half strings which are independent from each other. The second one, which we call midpoint correction, shifts the half string spectrum to that of the standard open string. We show that the nonlinear equation of motion of string field theory is exactly solvable at zeroth order in the midpoint correction. An infinite number of solutions are classified in terms of projection operators. Among them, there exists only one stable solution which is identical to the standard butterfly state. We include the effect of the midpoint correction around each exact zeroth order solution as a perturbation expansion which can be formally summed to the complete exact solution

  11. Common QA/QM Criteria for Multinational Vendor Inspection

    International Nuclear Information System (INIS)

    2014-01-01

    This VICWG document provides the 'Common QA/QM Criteria' to be used in multinational vendor inspection. The 'Common QA/QM Criteria' set out the basic considerations when performing vendor inspection. These criteria have been developed in conformity with the international codes and standards, such as those of the IAEA and ISO, that MDEP member countries have adopted. The purpose of the VICWG is to establish areas of co-operation in vendor inspection practices among MDEP member countries, as described in the MDEP issue-specific Terms of Reference (ToR). As part of this, a survey was performed from the beginning to understand and identify areas of commonality and difference between the regulatory practices of member countries in the area of vendor inspection. The VICWG also collaborated by performing Witnessed Inspections and Joint Inspections. Through these activities, it was recognized that member countries commonly apply the IAEA safety standard (GS-R-3) to their vendor inspection criteria, and almost all European member countries apply the ISO standard (ISO 9001). In the US, the NRC regulatory requirement in 10 CFR Part 50, Appendix B is used; South Korea uses the same criteria as the US. From the information obtained, a comparison table between codes and standards (IAEA GS-R-3, ISO 9001:2008, 10 CFR 50 Appendix B and ASME NQA-1) has been developed in order to inform the development of the 'Common QA/QM Criteria'. The result is documented in Table 1, 'MDEP CORE QA/QM Requirement and Comparison between Codes and Standards'. In addition, each country's criteria were compared using the US 10 CFR 50 Appendix B as a template; Table 2 shows the VICWG Survey on Quality Assurance Program Requirements. Through these activities, it was considered that the core requirements should be consistent with both the IAEA safety standard and the ISO standard, with the common requirements in the US 10 CFR 50 Appendix B used in the survey.

  12. Understanding Business Analytics Success and Impact: A Qualitative Study

    Science.gov (United States)

    Parks, Rachida F.; Thambusamy, Ravi

    2017-01-01

    Business analytics is believed to be a huge boon for organizations since it helps offer timely insights over the competition, helps optimize business processes, and helps generate growth and innovation opportunities. As organizations embark on their business analytics initiatives, many strategic questions, such as how to operationalize business…

  13. WE-AB-201-03: TPS Commissioning and QA: Incorporating the Entire Planning Process

    International Nuclear Information System (INIS)

    Mutic, S.

    2015-01-01

    defects in the future. Finally, the Gamma test has become a popular metric for reporting TPS Commissioning and QA results. It simplifies complex testing into a numerical index, but noisy data and casual application can make it misleading. A brief review of the issues around the use of the Gamma test will be presented. TPS commissioning and QA: A process orientation and application of control charts (Michael Sharpe) A framework for commissioning a treatment planning system will be presented, focusing on preparations, practical aspects of configuration, priorities, specifications, and establishing performance. The complexity of the modern TPS makes modular testing of features inadequate, and modern QA tools can provide “too much information” about the performance of techniques like IMRT and VMAT. We have adopted a process orientation and quality tools, like control charts, for ongoing TPS QA and assessment of patient-specific tests. The trending nature of these tools reveals the overall performance of the TPS system, and quantifies the variations that arise from individual plans, discrete calculations, and experimentation based on discrete measurements. Examples demonstrating application of these tools to TPS QA will be presented. TPS commissioning and QA: Incorporating the entire planning process (Sasa Mutic) The TPS and its features do not perform in isolation. Instead, the features and modules are key components in a complex process that begins with CT Simulation and extends to treatment delivery, along with image guidance and verification. Most importantly, the TPS is used by people working in a multi-disciplinary environment. It is very difficult to predict the outcomes of human interactions with software. Therefore, an interdisciplinary approach to training, commissioning and QA will be presented, along with an approach to the physics chart check and end-to-end testing as a tool for TPS QA. The role of standardization and automation in QA will also be discussed.
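    The control-chart approach to trending TPS QA and patient-specific results described above can be sketched minimally as follows. The baseline values, the individuals-chart form, and the 3-sigma rule are illustrative assumptions, not the presenters' actual procedure:

    ```python
    import statistics

    def control_limits(baseline, k=3.0):
        """Individuals-chart control limits derived from a
        commissioning baseline: mean +/- k standard deviations."""
        mean = statistics.fmean(baseline)
        sd = statistics.stdev(baseline)
        return mean - k * sd, mean + k * sd

    def out_of_control(measurements, limits):
        """Indices of subsequent QA measurements (e.g. gamma pass
        rates, in percent) that fall outside the control limits
        and therefore warrant investigation."""
        lo, hi = limits
        return [i for i, x in enumerate(measurements) if x < lo or x > hi]
    ```

    The point of the chart is the trend: a single out-of-limit value flags one plan, while a drifting mean flags the TPS or delivery process itself.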

  14. WE-AB-201-00: Treatment Planning System Commissioning and QA

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-06-15


  15. WE-AB-201-01: Treatment Planning System Commissioning and QA: Challenges and Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Salomons, G. [Cancer Center of Southeastern Ontario (Canada)

    2015-06-15


  16. WE-AB-201-03: TPS Commissioning and QA: Incorporating the Entire Planning Process

    Energy Technology Data Exchange (ETDEWEB)

    Mutic, S. [Washington University School of Medicine (United States)

    2015-06-15


  17. WE-AB-201-00: Treatment Planning System Commissioning and QA

    International Nuclear Information System (INIS)

    2015-01-01


  18. WE-AB-201-01: Treatment Planning System Commissioning and QA: Challenges and Opportunities

    International Nuclear Information System (INIS)

    Salomons, G.

    2015-01-01

    defects in the future. Finally, the Gamma test has become a popular metric for reporting TPS Commissioning and QA results. It simplifies complex testing into a numerical index, but noisy data and casual application can make it misleading. A brief review of the issues around the use of the Gamma test will be presented. TPS commissioning and QA: A process orientation and application of control charts (Michael Sharpe) A framework for commissioning a treatment planning system will be presented, focusing on preparations, practical aspects of configuration, priorities, specifications, and establishing performance. The complexity of the modern TPS makes modular testing of features inadequate, and modern QA tools can provide “too much information” about the performance of techniques like IMRT and VMAT. We have adopted a process orientation and quality tools, such as control charts, for ongoing TPS QA and assessment of patient-specific tests. The trending nature of these tools reveals the overall performance of the TPS, and quantifies the variations that arise from individual plans, discrete calculations, and experimentation based on discrete measurements. Examples demonstrating application of these tools to TPS QA will be presented. TPS commissioning and QA: Incorporating the entire planning process (Sasa Mutic) The TPS and its features do not perform in isolation. Instead, the features and modules are key components in a complex process that begins with CT simulation and extends to treatment delivery, along with image guidance and verification. Most importantly, the TPS is used by people working in a multi-disciplinary environment. It is very difficult to predict the outcomes of human interactions with software. Therefore, an interdisciplinary approach to training, commissioning and QA will be presented, along with an approach to the physics chart check and end-to-end testing as a tool for TPS QA.
The role of standardization and automation in QA will also be discussed
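The Gamma test mentioned in this record combines a dose-difference criterion with a distance-to-agreement (DTA) criterion into a single index, with a point passing when its gamma is at most 1. As a rough illustration only (not any speaker's implementation), a minimal 1D version can be sketched as follows; the profile, criteria values, and the `gamma_1d` helper are all hypothetical:

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Simplified 1D gamma index: for each reference point, take the
    minimum combined dose-difference / distance-to-agreement metric
    over all evaluated points. dd is the fractional dose criterion
    (global normalization), dta the distance criterion in mm."""
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        dist_term = ((eval_pos - rp) / dta) ** 2
        dose_term = ((eval_dose - rd) / (dd * ref_dose.max())) ** 2
        gammas.append(np.sqrt(dist_term + dose_term).min())
    return np.array(gammas)

pos = np.linspace(0.0, 100.0, 101)          # positions in mm
dose = np.exp(-((pos - 50.0) / 20.0) ** 2)  # arbitrary bell-shaped profile
g = gamma_1d(pos, dose, pos, dose)
pass_rate = np.mean(g <= 1.0) * 100
print(f"pass rate: {pass_rate:.1f}%")       # 100.0% for identical profiles
```

The abstract's caution applies directly here: noise in either profile inflates the dose-difference term, so an aggregate pass rate alone can hide where and why points fail.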

  19. Case Study: IBM Watson Analytics Cloud Platform as Analytics-as-a-Service System for Heart Failure Early Detection

    Directory of Open Access Journals (Sweden)

    Gabriele Guidi

    2016-07-01

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of Information Technology services from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. A use case of IBM Watson Analytics, a cloud system for data analytics, is also described, applied to the following research scope: detecting the presence or absence of Heart Failure disease using nothing more than the electrocardiographic signal, in particular through the analysis of Heart Rate Variability. The obtained results are comparable with those from the literature in terms of accuracy and predictive power. Advantages and drawbacks of cloud versus static approaches are discussed in the last sections.

  20. WE-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA II

    International Nuclear Information System (INIS)

    Childress, N; Murray, B

    2014-01-01

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Using DoseLab to Perform TG-142 Imaging QA The goals of this session will be to present a clinical overview of acquiring images for TG-142 Imaging QA, as well as analyzing and evaluating results using DoseLab software. DoseLab supports planar imaging QA analysis using almost any QA phantom provided by numerous vendors. General advantages and disadvantages of selecting each of these phantoms will be briefly summarized. Best practices for selecting image acquisition parameters will be presented. A demonstration of using DoseLab software to perform a series of TG-142 tests will be performed. We will discuss why DoseLab uses its own set of imaging QA formulas, and why imaging QA measurement values of the same nominal properties will vary between TG-142 software packages. Because TG-142 does not specify baseline and tolerance values for imaging QA, the presentation will recommend performing the manufacturer's acceptance test procedure to validate that the equipment is functioning correctly. Afterwards, results can be obtained using the clinic's selected set of phantoms, image acquisition parameters, and TG-142 software to set proper baseline values. This presentation will highlight the reasons why comparing imaging QA results can be trickier than comparing linear accelerator treatment results and what physicists should keep in mind when comparing imaging QA results for different machines. Physicists are often unsure of the next step when there is an issue discovered during Imaging QA. Therefore, a few common examples

  1. WE-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA II

    Energy Technology Data Exchange (ETDEWEB)

    Childress, N [Mobius Medical Management, LLC, Houston, TX (United States); Murray, B [ZapIT Medical, Dublin, OH (United States)

    2014-06-15

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Using DoseLab to Perform TG-142 Imaging QA The goals of this session will be to present a clinical overview of acquiring images for TG-142 Imaging QA, as well as analyzing and evaluating results using DoseLab software. DoseLab supports planar imaging QA analysis using almost any QA phantom provided by numerous vendors. General advantages and disadvantages of selecting each of these phantoms will be briefly summarized. Best practices for selecting image acquisition parameters will be presented. A demonstration of using DoseLab software to perform a series of TG-142 tests will be performed. We will discuss why DoseLab uses its own set of imaging QA formulas, and why imaging QA measurement values of the same nominal properties will vary between TG-142 software packages. Because TG-142 does not specify baseline and tolerance values for imaging QA, the presentation will recommend performing the manufacturer's acceptance test procedure to validate that the equipment is functioning correctly. Afterwards, results can be obtained using the clinic's selected set of phantoms, image acquisition parameters, and TG-142 software to set proper baseline values. This presentation will highlight the reasons why comparing imaging QA results can be trickier than comparing linear accelerator treatment results and what physicists should keep in mind when comparing imaging QA results for different machines. Physicists are often unsure of the next step when there is an issue discovered during Imaging QA. Therefore, a few common examples
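The workflow this session describes, establishing baselines from the manufacturer's acceptance test and then comparing routine imaging QA results against them, can be sketched as a simple constancy check. The metric names, baseline values, and tolerances below are purely illustrative assumptions, not DoseLab's formulas or TG-142 values:

```python
# Hypothetical baselines recorded at acceptance testing, with a
# fractional tolerance per metric chosen by the clinic.
BASELINES = {"spatial_resolution_lp_mm": 0.45, "contrast_to_noise": 28.0}
TOLERANCES = {"spatial_resolution_lp_mm": 0.10, "contrast_to_noise": 0.15}

def check_constancy(measured):
    """Return the metrics whose fractional deviation from baseline
    exceeds the per-metric tolerance."""
    failures = {}
    for metric, value in measured.items():
        deviation = abs(value - BASELINES[metric]) / BASELINES[metric]
        if deviation > TOLERANCES[metric]:
            failures[metric] = deviation
    return failures

today = {"spatial_resolution_lp_mm": 0.44, "contrast_to_noise": 21.0}
print(check_constancy(today))  # only contrast_to_noise exceeds its 15% band
```

Because, as the abstract notes, different TG-142 software packages compute the same nominal properties differently, baselines and measurements must come from the same phantom, acquisition parameters, and analysis software for this comparison to be meaningful.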

  2. Improvement in QA protocol for TLD based personnel monitoring laboratory in last five year

    International Nuclear Information System (INIS)

    Rakesh, R.B.

    2018-01-01

    Quality assurance (QA) in personnel monitoring (PM) is a tool to assess the performance of PM laboratories and the reliability of dose estimation against standards laid down by international agencies such as the IAEA (ISO trumpet curve), IEC, and ANSI. Reliable personal dose estimation is a basic requirement for radiation protection planning and decision making. Continuous improvement is inherent in radiation protection practice, which depends heavily on the accuracy and reliability of monitoring data. Experience-based evolution of quality control (QC) measures and of the QA protocol are two important routes to continuous improvement in the accuracy and reliability of personnel monitoring results. The paper describes improvements in QC measures and the QA protocol initiated during the last five years, which have led to improved quality of PM services.

  3. A Chatbot as a Natural Web Interface to Arabic Web QA

    Directory of Open Access Journals (Sweden)

    Bayan Abu Shawar

    2011-03-01

    In this paper, we describe a way to access an Arabic Web Question Answering (QA) corpus using a chatbot, without the need for sophisticated natural language processing or logical inference. Any natural language (NL) interface to a question answering (QA) system is constrained to reply with the given answers, so there is no need for NL generation to recreate well-formed answers, or for deep analysis or logical inference to map user input questions onto a logical ontology; a simple (but large) set of pattern-template matching rules will suffice. In previous research, this approach worked properly with English and other European languages. In this paper, we examine how the same chatbot behaves with the Arabic Web QA corpus. Initial results show that 93% of answers were correct, but because of many characteristics specific to the Arabic language, changing Arabic questions into other forms may lead to no answers.
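The pattern-template matching idea described in this abstract can be illustrated with a toy sketch. The rules, the corpus entries, and the `answer` helper below are hypothetical stand-ins for rules derived from a real Web QA corpus, not the authors' system:

```python
import re

# Hypothetical pattern-template rules: each maps a question pattern to an
# answer type; answers come verbatim from the corpus (no NL generation).
RULES = [
    (re.compile(r"what is (.+)\?", re.IGNORECASE), "definition"),
    (re.compile(r"who (invented|discovered) (.+)\?", re.IGNORECASE), "person"),
]

# Toy stand-in for answers extracted from a Web QA corpus
CORPUS = {
    ("definition", "photosynthesis"): "Photosynthesis is how plants convert light to energy.",
}

def answer(question):
    for pattern, slot in RULES:
        m = pattern.match(question)
        if m:
            key = (slot, m.group(m.lastindex).lower())
            return CORPUS.get(key, "No answer found in corpus.")
    return "No matching pattern."

print(answer("What is photosynthesis?"))
```

The abstract's finding about Arabic follows directly from this design: surface patterns are brittle, so rephrasing a question into a form no rule anticipates returns no answer even when the corpus contains one.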

  4. Comparison of EPRI safety valve test data with analytically determined hydraulic results

    International Nuclear Information System (INIS)

    Smith, L.C.; Howe, K.S.

    1983-01-01

    NUREG-0737 (November 1980) and all subsequent U.S. NRC generic follow-up letters require that all operating plant licensees and applicants verify the acceptability of plant specific pressurizer safety valve piping systems for valve operation transients by testing. To aid in this verification process, the Electric Power Research Institute (EPRI) conducted an extensive testing program at the Combustion Engineering Test Facility. Pertinent tests simulating dynamic opening of the safety valves for representative upstream environments were carried out. Different models and sizes of safety valves were tested at the simulated operating conditions. Transducers placed at key points in the system monitored a variety of thermal, hydraulic and structural parameters. From this data, a more complete description of the transient can be made. The EPRI test configuration was analytically modeled using a one-dimensional thermal hydraulic computer program that uses the method of characteristics approach to generate key fluid parameters as a function of space and time. The conservation equations are solved by applying both the implicit and explicit characteristic methods. Unbalanced or wave forces were determined for each straight run of pipe bounded on each side by a turn or elbow. Blowdown forces were included, where appropriate. Several parameters were varied to determine the effects on the pressure, hydraulic forces and timings of events. By comparing these quantities with the experimentally obtained data, an approximate picture of the flow dynamics is arrived at. Two cases in particular are presented. These are the hot and cold loop seal discharge tests made with the Crosby 6M6 spring-loaded safety valve. Included in the paper is a description of the hydraulic code, modeling techniques and assumptions, a comparison of the numerical results with experimental data and a qualitative description of the factors which govern pipe support loading. (orig.)

  5. An analytical study of photoacoustic and thermoacoustic generation efficiency towards contrast agent and film design optimization

    Directory of Open Access Journals (Sweden)

    Fei Gao

    2017-09-01

    Photoacoustic (PA) and thermoacoustic (TA) effects have been explored in many applications, such as bio-imaging, laser-induced ultrasound generation, and sensitive electromagnetic (EM) wave film sensors. In this paper, we propose a compact analytical PA/TA generation model that incorporates EM, thermal, and mechanical parameters. From the derived analytical model, both intuitive predictions and quantitative simulations are performed. It shows that, beyond improving EM absorption, many other physical parameters deserve careful consideration when designing contrast agents or film composites; a simulation study follows. Lastly, several sets of experimental results are presented to support the feasibility of the proposed analytical model. Overall, the proposed compact model can serve as clear guidance and prediction for improved PA/TA contrast agents and film generator/sensor designs in this domain.

  6. Notes on analytical study of holographic superconductors with Lifshitz scaling in external magnetic field

    International Nuclear Information System (INIS)

    Zhao, Zixu; Pan, Qiyuan; Jing, Jiliang

    2014-01-01

    We employ the matching method to analytically investigate holographic superconductors with Lifshitz scaling in an external magnetic field. We discuss systematically the restricted conditions for the matching method and find that this analytic method is not always adequate for exploring the effect of an external magnetic field on the holographic superconductors unless the matching point is chosen in an appropriate range and the dynamical exponent z satisfies the relation z=d−1 or z=d−2. From the analytic treatment, we observe that Lifshitz scaling hinders the formation of the condensate, which backs up the numerical results. Moreover, we study the effect of Lifshitz scaling on the upper critical magnetic field and reproduce the well-known relation obtained from Ginzburg–Landau theory

  7. Conventional patient specific IMRT QA and 3DVH verification of dose distribution for helical tomotherapy

    International Nuclear Information System (INIS)

    Sharma, Prabhat Krishna; Joshi, Kishore; Epili, D.; Gavake, Umesh; Paul, Siji; Reena, Ph.; Jamema, S.V.

    2016-01-01

    In recent years, patient-specific IMRT QA has transitioned from point dose measurements with ion chambers, to film, to 2D array measurements. 3DVH software takes this transition a step further by estimating the 3D dose delivered to the patient volume from 2D diode measurements using a planned dose perturbation (PDP) algorithm. A question is whether conventional IMRT QA, though sensitive at detecting errors, has any predictive power for dose errors of clinical significance related to the target volume and organs at risk (OAR). The aim of this study is to compare conventional patient-specific IMRT QA with 3DVH dose-distribution verification for patients treated with helical tomotherapy (HT)

  8. An overview of gamma-hydroxybutyric acid: pharmacodynamics, pharmacokinetics, toxic effects, addiction, analytical methods, and interpretation of results.

    Science.gov (United States)

    Andresen, H; Aydin, B E; Mueller, A; Iwersen-Bergmann, S

    2011-09-01

    Abuse of gamma-hydroxybutyric acid (GHB) has been known since the early 1990s, but is not as widespread as the consumption of other illegal drugs. However, the number of severe intoxications with fatal outcomes is comparatively high, not least because of the consumption of the currently legal precursor substances gamma-butyrolactone (GBL) and 1,4-butanediol (1,4-BD). Contrary to previous assumptions, addiction to GHB or its analogues can occur, with severe symptoms of withdrawal. Moreover, GHB can be used for drug-facilitated sexual assault. Its pharmacological effects are generated mainly by interaction with both GABA(B) and GHB receptors, as well as its influence on other transmitter systems in the human brain. Numerous analytical methods for determining GHB using chromatographic techniques have been published in recent years, and an enzymatic screening method has been established. However, the short window of GHB detection in blood or urine due to its rapid metabolism is a challenge. Furthermore, despite several studies addressing this problem, evaluation of analytical results can be difficult: GHB is a metabolite of GABA (gamma-aminobutyric acid), so a differentiation between endogenous and exogenous concentrations has to be made. Apart from this, in samples with a longer storage interval, and especially in postmortem specimens, higher levels can be measured due to GHB generation during the postmortem interval or storage time. Copyright © 2011 John Wiley & Sons, Ltd.

  9. Many analysts, one dataset: Making transparent how variations in analytical choices affect results

    NARCIS (Netherlands)

    Silberzahn, Raphael; Uhlmann, E.L.; Martin, D.P.; Anselmi, P.; Aust, F.; Awtrey, E.; Bahnik, Š.; Bai, F.; Bannard, C.; Bonnier, E.; Carlsson, R.; Cheung, F.; Christensen, G.; Clay, R.; Craig, M.A.; Dalla Rosa, A.; Dam, Lammertjan; Evans, M.H.; Flores Cervantes, I.; Fong, N.; Gamez-Djokic, M.; Glenz, A.; Gordon-McKeon, S.; Heaton, T.J.; Hederos, K.; Heene, M.; Hofelich Mohr, A.J.; Högden, F.; Hui, K.; Johannesson, M.; Kalodimos, J.; Kaszubowski, E.; Kennedy, D.M.; Lei, R.; Lindsay, T.A.; Liverani, S.; Madan, C.R.; Molden, D.; Molleman, Henricus; Morey, R.D.; Mulder, Laetitia; Nijstad, Bernard; Pope, N.G.; Pope, B.; Prenoveau, J.M.; Rink, Floortje; Robusto, E.; Roderique, H.; Sandberg, A.; Schlüter, E.; Schönbrodt, F.D.; Sherman, M.F.; Sommer, S.A.; Sotak, K.; Spain, S.; Spörlein, C.; Stafford, T.; Stefanutti, L.; Täuber, Susanne; Ullrich, J.; Vianello, M.; Wagenmakers, E.-J.; Witkowiak, M.; Yoon, S.; Nosek, B.A.

    2018-01-01

    Twenty-nine teams involving 61 analysts used the same dataset to address the same research question: whether soccer referees are more likely to give red cards to dark skin toned players than light skin toned players. Analytic approaches varied widely across teams, and estimated effect sizes ranged

  10. D4.1 Learning analytics: theoretical background, methodology and expected results

    NARCIS (Netherlands)

    Tammets, Kairit; Laanpere, Mart; Eradze, Maka; Brouns, Francis; Padrón-Nápoles, Carmen; De Rosa, Rosanna; Ferrari, Chiara

    2014-01-01

    The purpose of the EMMA project is to showcase excellence in innovative teaching methodologies and learning approaches through the large-scale piloting of MOOCs on different subjects. The main objectives related to the implementation of learning analytics in the EMMA project are to: ● develop the

  11. Analytical results for non-Hermitian parity–time-symmetric and ...

    Indian Academy of Sciences (India)

    We investigate both the non-Hermitian parity–time-(PT-)symmetric and Hermitian asymmetric volcano potentials, and present the analytical solution in terms of the confluent Heun function. Under certain special conditions, the confluent Heun function can be terminated as a polynomial, thereby leading to certain ...

  12. Experimental and analytical studies of high heat flux components for fusion experimental reactor

    International Nuclear Information System (INIS)

    Araki, Masanori

    1993-03-01

    In this report, the experimental and analytical results concerning the development of plasma facing components of ITER are described. With respect to developing high heat removal structures for the divertor plates, an externally-finned swirl tube was developed based on the results of critical heat flux (CHF) experiments on various tube structures. As a result, a burnout heat flux, which also indicates the incident CHF, of 41 ± 1 MW/m² was achieved in the externally-finned swirl tube. The applicability of existing CHF correlations based on uniform heating conditions was evaluated by comparing the CHF experimental data for the smooth and the externally-finned tubes under one-sided heating conditions. As a result, the experimentally determined CHF data for the straight tube show good agreement; for the externally-finned tube, no existing correlations are available for prediction of the CHF. With respect to the evaluation of the bonds between carbon-based material and the heat sink metal, results of brazing tests were compared with analytical results from a three-dimensional model with temperature-dependent thermal and mechanical properties. The analysis showed that residual stresses from brazing can be estimated from the three directional stress components rather than the equivalent stress value. In the analytical study on separatrix sweeping for effectively reducing surface heat fluxes on the divertor plate, the thermal response of the divertor plate has been analyzed and tested under ITER-relevant heat flux conditions. As a result, it has been demonstrated that application of the sweeping technique is very effective in improving the power handling capability of the divertor plate and that the divertor mock-up has withstood a large number of additional cyclic heat loads. (J.P.N.) 62 refs

  13. SU-F-T-182: A Stochastic Approach to Daily QA Tolerances On Spot Properties for Proton Pencil Beam Scanning

    International Nuclear Information System (INIS)

    St James, S; Bloch, C; Saini, J

    2016-01-01

    Purpose: Proton pencil beam scanning is used clinically across the United States. There are no current guidelines on tolerances for daily QA specific to pencil beam scanning, in particular for individual spot properties (spot width). Using a stochastic method to determine tolerances has the potential to optimize tolerances on individual spots and decrease the number of false positive failures in daily QA. Individual and global spot tolerances were evaluated. Methods: As part of daily QA for proton pencil beam scanning, a field of 16 spots (corresponding to 8 energies) is measured using an array of ion chambers (Matrixx, IBA). Each individual spot is fit to two Gaussian functions (x, y). The spot widths (σ) in x and y are recorded (32 parameters). Results from the daily QA were retrospectively analyzed for 100 days of data. The deviations of the spot widths were histogrammed and fit to a Gaussian function. The stochastic spot tolerance was taken to be the mean ± 3σ. Using these results, tolerances were developed and tested against known deviations in spot width. Results: The individual spot tolerances derived with the stochastic method decreased in 30/32 instances. Using the previous tolerances (± 20% width), the daily QA would have detected 0/20 days of the deviation. Using a tolerance of any 6 spots failing the stochastic tolerance, 18/20 days of the deviation would have been detected. Conclusion: Using a stochastic method we have been able to decrease daily tolerances for 30/32 spot widths measured. The stochastic tolerances can lead to detection of deviations that previously would have been picked up on monthly QA and missed by daily QA. This method could be easily extended for evaluation of other QA parameters in proton spot scanning.
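The stochastic tolerance described in this abstract (mean ± 3σ of historical per-spot deviations, with a day flagged when several spots fall outside their bands) can be sketched as below. The synthetic 100-day history and the global shift are illustrative assumptions, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical history: 100 days x 32 spot-width deviations (% from baseline)
history = rng.normal(loc=0.0, scale=2.0, size=(100, 32))

# Stochastic per-spot tolerance: mean +/- 3 sigma of the historical deviations
mean = history.mean(axis=0)
sigma = history.std(axis=0, ddof=1)
lower, upper = mean - 3 * sigma, mean + 3 * sigma

def day_fails(deviations, n_spots_allowed=6):
    """Flag a daily QA measurement when at least n_spots_allowed spots
    fall outside their individual stochastic tolerance bands."""
    out = (deviations < lower) | (deviations > upper)
    return bool(out.sum() >= n_spots_allowed)

# A day with a 10% global shift trips nearly every per-spot band
shifted_day = rng.normal(loc=10.0, scale=2.0, size=32)
print(day_fails(shifted_day))  # True: the shift is well beyond 3 sigma
```

Requiring several spots to fail before flagging the day, as in the abstract, trades per-spot sensitivity for a much lower false-positive rate on normal days.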

  14. Analysis of QA audit checklist for equipment suppliers

    International Nuclear Information System (INIS)

    Tian Xuehang

    2012-01-01

    Eleven aspects of equipment manufacturing by suppliers are analyzed in this article: the guidelines and objectives of quality assurance, management department review, document and record control, staffing and training, design control, procurement control, control of items, process control, inspection and testing control, non-conformance control, and internal and external QA audit. Detailed QA audit checklists for these aspects are described, and problems found in real QA audits are listed. (authors)

  15. QA practice for online analyzers in water steam cycles

    International Nuclear Information System (INIS)

    Staub, L.

    2010-01-01

    The liberalization of power markets throughout the world has resulted in more and more power stations being operated in cycling mode, with frequent load changes and multiple daily start-up and shut-down cycles. This more flexible operation also calls for better automation and poses new challenges to water chemistry in water steam cycles, to avoid subsequent damage to vital plant components such as turbines, boilers or condensers. But automation for the most important chemistry control tool, the sampling and online analyzer system, is only possible if chemists can rely on their online analysis equipment. Proof of plausibility, as well as the reliability and availability of online analysis results, becomes a major focus. While SOPs and standard QA procedures for laboratory equipment are well established and daily practice, such measures are widely neglected for online process analyzers. This paper aims to establish a roadmap for the implementation of SOP and QA/QC procedures for online instruments in water steam cycles, leading to reliable chemical information that is trustworthy for process automation and chemistry control in water steam cycles. (author)

  16. QA practice for online analyzers in water steam cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2009-01-01

    The liberalization of power markets throughout the world has resulted in more and more power stations being operated in cycling mode, with frequent load changes and multiple daily start-up and shut-down cycles. This more flexible operation also calls for better automation and poses new challenges to water chemistry in water steam cycles, to avoid subsequent damage to vital plant components such as turbines, boilers or condensers. But automation for the most important chemistry control tool, the sampling and online analyzer system, is only possible if chemists can rely on their online analysis equipment. Proof of plausibility, as well as the reliability and availability of online analysis results, becomes a major focus. While SOPs and standard QA procedures for laboratory equipment are well established and daily practice, such measures are widely neglected for online process analyzers. This paper aims to establish a roadmap for the implementation of SOP and QA/QC procedures for online instruments in water steam cycles, leading to reliable chemical information that is trustworthy for process automation and chemistry control in water steam cycles. (author)

  17. Poster - 10: QA of Ultrasound Images for Prostate Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Szpala, Stanislaw; Kohli, Kirpal S. [BCCA-Fraser Valley Centre (Canada)

    2016-08-15

    Purpose: The current QA protocol for ultrasound systems used in prostate brachytherapy (TG128) addresses geometrical verifications, but its scope for evaluating image quality is limited. We recognized the importance of the latter in routine practice, and designed a protocol for QA of the images. Methods: Images of an ultrasound prostate phantom (CIRS053) were collected with a BK Flex Focus 400. The images were saved as bmp after adjusting the gain to 50% for consistent results. Mean pixel values and signal-to-noise ratio were inspected in representative sections of the phantom, including the mock prostate and the anechoic medium. Constancy of these numbers over a one-year period was examined. Results: The typical intensity in the mock prostate region in the transverse images ranged between 95 and 118 (out of 256), and the signal-to-noise ratio was about 10. The intensity in the urethra region was about 170±40, and in the anechoic medium 2±2. The mean and the signal-to-noise ratio remained almost unchanged after a year, while the signal in the anechoic medium increased to about 7±4. Similar values were obtained in the sagittal images. Conclusions: The image analysis discussed above allows quick evaluation of the constancy of image quality. This may also be useful in troubleshooting image-quality problems during routine exams, which might be due not to deterioration of the US system but to other causes, e.g. variations in tissue properties or air trapped between the probe and the anatomy.
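The image checks this record describes (mean pixel value and signal-to-noise ratio in fixed regions of interest of a phantom image) can be sketched as below. The synthetic phantom image, ROI coordinates, and the `roi_stats` helper are illustrative assumptions, not the authors' protocol:

```python
import numpy as np

def roi_stats(image, y0, y1, x0, x1):
    """Mean pixel value and signal-to-noise ratio (mean/std) for a
    rectangular region of interest in a grayscale image."""
    roi = image[y0:y1, x0:x1].astype(float)
    std = roi.std()
    return roi.mean(), roi.mean() / std if std > 0 else float("inf")

# Hypothetical phantom image: bright mock-prostate block on a dark background
rng = np.random.default_rng(1)
img = rng.normal(2.0, 2.0, size=(200, 200))               # anechoic background
img[60:140, 60:140] = rng.normal(105.0, 10.0, size=(80, 80))  # mock prostate
img = img.clip(0, 255)

mean, snr = roi_stats(img, 60, 140, 60, 140)
print(f"mock prostate: mean={mean:.0f}, SNR={snr:.1f}")
```

Comparing these two numbers per ROI against the values recorded at baseline (with the same gain setting) gives the quick constancy check the abstract describes.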

  18. Laboratory QA/QC improvements for small drinking water systems at Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    Turner, R.D.

    1995-12-01

    The Savannah River Site (SRS), a 310 square mile facility located near Aiken, S.C., is operated by Westinghouse Savannah River Company for the US Department of Energy. SRS has 28 separate drinking water systems with average daily demands ranging from 0.0002 to 0.5 MGD. All systems utilize treated groundwater. Until recently, the water laboratories for each system operated independently. As a result, equipment, reagents, chemicals, procedures, personnel, and quality control practices differed from location to location. Due to this inconsistency, and a lack of extensive laboratory QA/QC practices at some locations, SRS auditors were not confident in the accuracy of daily water quality analysis results. The Site's Water Services Department addressed these concerns by developing and implementing a practical laboratory QA/QC program. Basic changes were made which can be readily adopted by most small drinking water systems. Key features of the program include: standardized and upgraded laboratory instrumentation and equipment; standardized analytical procedures based on vendor manuals and site requirements; periodic accuracy checks for all instrumentation; creation of a centralized laboratory to perform metals digestions and chlorine colorimeter accuracy checks; off-site and on-site operator training; and proper storage, inventory and shelf-life monitoring for reagents and chemicals. This program has enhanced the credibility and accuracy of SRS drinking water system analysis results.

  19. TU-C-BRE-01: KEYNOTE PRESENTATION - Emerging Frontiers in IMRT QA

    Energy Technology Data Exchange (ETDEWEB)

    Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2014-06-15

    As IMRT treatment processes advance and mature, so must the quality assurance processes used to validate their delivery. In some respects, treatment delivery advancements (e.g. VMAT) have out-paced QA advancements. The purpose of this session is to describe new processes that are being implemented to bring IMRT QA up to date with the treatment delivery advances. It would explore emerging IMRT QA paradigms, including requirements-based IMRT QA, which necessitates definition of delivery errors (e.g. patient dose error, leaf positioning error) and development of processes to ensure reliable error detection. Engineering-based QA approaches, including the use of IMRT treatment delivery process trees, fault tree analysis and failure modes and effects analysis, would be described. Approaches to detect errors such as (1) during treatment delivery validation using exit fluence detectors (e.g. EPIDs); (2) analysis of treatment delivery via machine parameter log files; (3) dose recalculation using (3a) the treatment planning system; (3b) record-and-verify; or (3c) entrance and exit fluence measurement parameters would be explained. The relative advantages and disadvantages of each method would be discussed. Schemes for error classification and root cause analysis would be described – steps which are essential for future error prevention. For each QA method, testing procedures and results would be presented indicating the types of errors that can be detected, those that cannot be detected, and the reliability of the error detection method (for example, determined via ROC analysis). For speakers, we are seeking to engage non-commercially biased experts. Those listed below are a sub-sample of possible qualified individuals.

  20. TU-C-BRE-01: KEYNOTE PRESENTATION - Emerging Frontiers in IMRT QA

    International Nuclear Information System (INIS)

    Siebers, J

    2014-01-01

    As IMRT treatment processes advance and mature, so must the quality assurance processes used to validate their delivery. In some respects, treatment delivery advancements (e.g. VMAT) have out-paced QA advancements. The purpose of this session is to describe new processes that are being implemented to bring IMRT QA up-to-date with the treatment delivery advances. It would explore emerging IMRT QA paradigms, including requirements-based IMRT QA, which necessitates definition of delivery errors (e.g. patient dose error, leaf positioning error) and development of processes to ensure reliable error detection. Engineering-based QA approaches, including use of IMRT treatment delivery process trees, fault tree analysis and failure modes and effects analysis, would be described. Approaches to detect errors such as (1) during-treatment delivery validation using exit fluence detectors (e.g. EPIDs); (2) analysis of treatment delivery via machine parameter log files; (3) dose recalculation using (3a) treatment planning system, (3b) record-and-verify, or (3c) entrance and exit fluence measurement parameters would be explained. The relative advantages and disadvantages of each method would be discussed. Schemes for error classification and root cause analysis would be described – steps which are essential for future error prevention. For each QA method, testing procedures and results would be presented indicating the types of errors that can be detected, those that cannot be detected, and the reliability of the error detection method (for example, determined via ROC analysis). For speakers, we are seeking to engage non-commercially biased experts. Those listed below are a sub-sample of possible qualified individuals.

  1. Applications of nuclear analytical techniques to environmental studies

    International Nuclear Information System (INIS)

    Freitas, M.C.; Marques, A.P.; Reis, M.A.; Pacheco, A.M.G.; Barros, L.I.C.

    2001-01-01

    A few examples of the application of nuclear analytical techniques to biological monitors - both native and transplanted - are given herein. Parmelia sulcata Taylor transplants were set up in a heavily industrialized area of Portugal - the Setubal peninsula, about 50 km south of Lisbon - where indigenous lichens are rare. The whole area was 10x15 km around an oil-fired power station, and a 2.5x2.5 km grid was used. In north-western Portugal, native thalli of the same epiphytes (Parmelia spp., mostly Parmelia sulcata Taylor) and bark from olive trees (Olea europaea) were sampled across an area of 50x50 km, using a 10x10 km grid. This area is densely populated and features a blend of rural, urban-industrial and coastal environments, together with the country's second-largest metro area (Porto). All biomonitors were analyzed by INAA and PIXE. Results were put through nonparametric tests and factor analysis for trend significance and emission sources, respectively.

  2. Study of trace elements in milk by nuclear analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Gharib, A.; Rahimi, H.; Peyrovan, H.; Raofei, H.N.J.; Taherpour, H. (Atomic Energy Organization of Iran, Teheran. Nuclear Research Centre)

    This work is part of a project with the IAEA, in a coordinated programme on ''Trace Elements in Human Nutrition and Bio-Environmental Systems'', to evaluate nutritional requirements, interrelations and the role of trace elements in health, metabolism, etc. Cow's milk is regarded as one of the most important and nutritious foodstuffs consumed by people. Hence, as a first step, an elemental analysis of milk was carried out: a few samples of pasteurized milk and local samples were investigated for essential and toxic trace elements. A secondary aim of the project was the assessment of the various analytical techniques involved, which in the present work were AAS, PIXE and NAA, the latter applied both instrumentally and radiochemically. Although the results from the various methods are not in good agreement, there is some justification for this internal inconsistency. The precision of NAA and AAS, respectively, allows a greater degree of acceptance. Although PIXE is very fast and rather routine, its use for trace element analysis needs certain adaptations and developments.

  3. SU-F-T-285: Evaluation of a Patient DVH-Based IMRT QA System

    Energy Technology Data Exchange (ETDEWEB)

    Zhen, H; Redler, G; Chu, J; Turian, J [Rush University Medical Center, Chicago, IL (United States)

    2016-06-15

    Purpose: To evaluate the clinical performance of a patient DVH-based QA system for prostate VMAT QA. Methods: Mobius3D (M3D) is QA software with an independent beam model and dose engine. The MobiusFX (MFX) add-on predicts patient dose using treatment machine log files. We commissioned the Mobius beam model in two steps. First, the stock beam model was customized using machine commissioning data, then verified against the TPS with 12 simple phantom plans and 7 clinical 3D plans. Secondly, the Dosimetric Leaf Gap (DLG) in the Mobius model was fine-tuned for VMAT treatment based on ion chamber measurements for 6 clinical VMAT plans. Upon successful commissioning, we retrospectively performed IMRT QA for 12 VMAT plans with the Mobius system as well as the ArcCHECK-3DVH system. Selected patient DVH values (PTV D95, D50; Bladder D2cc, Dmean; Rectum D2cc) were compared between TPS, M3D, MFX, and 3DVH. Results: During the first commissioning step, TPS and M3D calculated target Dmean for 3D plans agreed within 0.7%±0.7%, with 3D gamma passing rates of 98%±2%. In the second commissioning step, the Mobius DLG was adjusted by 1.2 mm from the stock value, reducing the average difference between the MFX calculation and ion chamber measurement from 3.2% to 0.1%. In retrospective prostate VMAT QA, 5 of 60 MFX-calculated DVH values deviated by more than 5% from the TPS. One large deviation at a high dose level was identified as a potential QA failure. This echoes the 3DVH QA result, which identified 2 instances of large DVH deviation on the same structure. For all DVHs evaluated, M3D and MFX show a high level of agreement (0.1%±0.2%), indicating that the observed deviation likely arises from beam modelling differences rather than delivery errors. Conclusion: The Mobius system provides a viable solution for DVH-based VMAT QA, with the capability of separating TPS and delivery errors.
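    The DVH-comparison step this abstract describes can be sketched as a simple percent-deviation check against an action level; the metric names and dose values below are invented for illustration, not taken from the study.

```python
# Minimal sketch: percent deviation of a secondary-check dose
# (MFX-style, log-file based) from the TPS for selected DVH metrics,
# flagging anything beyond a 5% action level. All numbers are invented.

TOLERANCE = 5.0  # percent

def dvh_deviations(tps, check):
    """Return {metric: percent deviation of the check dose vs. the TPS}."""
    return {m: 100.0 * (check[m] - tps[m]) / tps[m] for m in tps}

tps_dvh = {"PTV_D95": 78.0, "PTV_D50": 80.5, "Bladder_D2cc": 75.0}   # Gy
check_dvh = {"PTV_D95": 77.2, "PTV_D50": 80.9, "Bladder_D2cc": 79.5}

devs = dvh_deviations(tps_dvh, check_dvh)
flagged = [m for m, d in devs.items() if abs(d) > TOLERANCE]
print(flagged)  # → ['Bladder_D2cc']
```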

  4. SU-F-T-285: Evaluation of a Patient DVH-Based IMRT QA System

    International Nuclear Information System (INIS)

    Zhen, H; Redler, G; Chu, J; Turian, J

    2016-01-01

    Purpose: To evaluate the clinical performance of a patient DVH-based QA system for prostate VMAT QA. Methods: Mobius3D (M3D) is QA software with an independent beam model and dose engine. The MobiusFX (MFX) add-on predicts patient dose using treatment machine log files. We commissioned the Mobius beam model in two steps. First, the stock beam model was customized using machine commissioning data, then verified against the TPS with 12 simple phantom plans and 7 clinical 3D plans. Secondly, the Dosimetric Leaf Gap (DLG) in the Mobius model was fine-tuned for VMAT treatment based on ion chamber measurements for 6 clinical VMAT plans. Upon successful commissioning, we retrospectively performed IMRT QA for 12 VMAT plans with the Mobius system as well as the ArcCHECK-3DVH system. Selected patient DVH values (PTV D95, D50; Bladder D2cc, Dmean; Rectum D2cc) were compared between TPS, M3D, MFX, and 3DVH. Results: During the first commissioning step, TPS and M3D calculated target Dmean for 3D plans agreed within 0.7%±0.7%, with 3D gamma passing rates of 98%±2%. In the second commissioning step, the Mobius DLG was adjusted by 1.2 mm from the stock value, reducing the average difference between the MFX calculation and ion chamber measurement from 3.2% to 0.1%. In retrospective prostate VMAT QA, 5 of 60 MFX-calculated DVH values deviated by more than 5% from the TPS. One large deviation at a high dose level was identified as a potential QA failure. This echoes the 3DVH QA result, which identified 2 instances of large DVH deviation on the same structure. For all DVHs evaluated, M3D and MFX show a high level of agreement (0.1%±0.2%), indicating that the observed deviation likely arises from beam modelling differences rather than delivery errors. Conclusion: The Mobius system provides a viable solution for DVH-based VMAT QA, with the capability of separating TPS and delivery errors.

  5. Comparison of Analytical and Measured Performance Results on Network Coding in IEEE 802.11 Ad-Hoc Networks

    DEFF Research Database (Denmark)

    Zhao, Fang; Médard, Muriel; Hundebøll, Martin

    2012-01-01

    CATWOMAN that can run on standard WiFi hardware. We present an analytical model to evaluate the performance of COPE in simple networks, and our results show the excellent predictive quality of this model. By closely examining the performance in two simple topologies, we observe that the coding gain results...

  6. Nature and strength of bonding in a crystal of semiconducting nanotubes: van der Waals density functional calculations and analytical results

    DEFF Research Database (Denmark)

    Kleis, Jesper; Schröder, Elsebeth; Hyldgaard, Per

    2008-01-01

    The dispersive interaction between nanotubes is investigated through ab initio theory calculations and in an analytical approximation. A van der Waals density functional (vdW-DF) [M. Dion et al., Phys. Rev. Lett. 92, 246401 (2004)] is used to determine and compare the binding of a pair of nanotubes ... calculations, the vdW-DF study predicts an intertube vdW bonding with a strength that is consistent with recent observations for the interlayer binding in graphitic systems. It also produces a nanotube wall-to-wall separation which is in very good agreement with experiments. Moreover, we find that the vdW-DF result ... for the nanotube-crystal binding energy can be approximated by a sum of nanotube-pair interactions when these are calculated in vdW-DF. This observation suggests a framework for an efficient implementation of quantum-physical modeling of the carbon nanotube bundling in more general nanotube bundles, including ...

  7. Quasi-normal frequencies: Semi-analytic results for highly damped modes

    International Nuclear Information System (INIS)

    Skakala, Jozef; Visser, Matt

    2011-01-01

    Black hole highly-damped quasi-normal frequencies (QNFs) are very often of the form ω_n = (offset) + i n (gap). We have investigated the genericity of this phenomenon for the Schwarzschild-de Sitter (SdS) black hole by considering a model potential that is piecewise Eckart (piecewise Pöschl-Teller), and developing an analytic 'quantization condition' for the highly-damped quasi-normal frequencies. We find that the ω_n = (offset) + i n (gap) behaviour is common but not universal, with the controlling feature being whether or not the ratio of the surface gravities is a rational number. We furthermore observed that the relation between rational ratios of surface gravities and periodicity of QNFs is very generic, and also occurs within different analytic approaches applied to various types of black hole spacetimes. These observations are of direct relevance to any physical situation where highly-damped quasi-normal modes are important.
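    The pattern quoted in this abstract, ω_n = (offset) + i n (gap), can be illustrated numerically: successive highly damped modes differ by a constant, purely imaginary spacing. The offset and gap values below are arbitrary placeholders, not derived from any black hole.

```python
# Numerical illustration of the QNF pattern omega_n = (offset) + i*n*(gap):
# the spacing between successive modes is the constant i*(gap).

offset, gap = 0.25, 0.5  # arbitrary illustrative values

def qnf(n):
    return complex(offset, n * gap)

modes = [qnf(n) for n in (1, 2, 3)]
spacings = [modes[k + 1] - modes[k] for k in range(2)]
print(spacings)  # → [0.5j, 0.5j]
```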

  8. Distribution of Steps with Finite-Range Interactions: Analytic Approximations and Numerical Results

    Science.gov (United States)

    GonzáLez, Diego Luis; Jaramillo, Diego Felipe; TéLlez, Gabriel; Einstein, T. L.

    2013-03-01

    While most Monte Carlo simulations assume that only nearest-neighbor steps interact elastically, most analytic frameworks (especially the generalized Wigner distribution) posit that each step elastically repels all others. In addition to the elastic repulsions, we allow for possible surface-state-mediated interactions. We investigate analytically and numerically how next-nearest-neighbor (NNN) interactions and, more generally, interactions out to the q'th nearest neighbor alter the form of the terrace-width distribution and of the pair correlation functions (i.e. the sum over n'th-neighbor distribution functions), which we investigated recently [2]. For physically plausible interactions, we find modest changes when NNN interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.

  9. A semi-analytical study on helical springs made of shape memory polymer

    International Nuclear Information System (INIS)

    Baghani, M; Naghdabadi, R; Arghavani, J

    2012-01-01

    In this paper, the responses of shape memory polymer (SMP) helical springs under axial force are studied both analytically and numerically. In the analytical solution, we first derive the response of a cylindrical tube under torsional loadings. This solution can be used for helical springs in which both the curvature and pitch effects are negligible. This is the case for helical springs with large ratios of the mean coil radius to the cross sectional radius (spring index) and also small pitch angles. Making use of this solution simplifies the analysis of the helical springs to that of the torsion of a straight bar with circular cross section. The 3D phenomenological constitutive model recently proposed for SMPs is also reduced to the 1D shear case. Thus, an analytical solution for the torsional response of SMP tubes in a full cycle of stress-free strain recovery is derived. In addition, the curvature effect is added to the formulation and the SMP helical spring is analyzed using the exact solution presented for torsion of curved SMP tubes. In this modified solution, the effect of the direct shear force is also considered. In the numerical analysis, the 3D constitutive equations are implemented in a finite element program and a full cycle of stress-free strain recovery of an SMP (extension or compression) helical spring is simulated. Analytical and numerical results are compared and it is shown that the analytical solution gives accurate stress distributions in the cross section of the helical SMP spring besides the global load–deflection response. Some case studies are presented to show the validity of the presented analytical method. (paper)
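    The reduction described in this abstract, in which a helical spring with a large spring index and small pitch angle is treated as a straight bar in torsion, is the same idealization that yields the textbook linear-elastic spring rate. A quick numerical check, using arbitrary steel-like values rather than SMP properties:

```python
# For a close-coiled spring of wire diameter d, mean coil diameter D,
# n active coils and shear modulus G, treating the loaded spring as a
# straight bar in torsion gives the classical rate k = G*d^4/(8*D^3*n).
# The SMP constitutive model in the paper would replace the constant G;
# the numbers below are generic steel-like values, purely illustrative.

def spring_rate(G, d, D, n):
    return G * d ** 4 / (8.0 * D ** 3 * n)

k = spring_rate(G=80e9, d=0.004, D=0.04, n=10)  # SI units
print(round(k))  # → 4000 (N/m)
```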

  10. Three-neutrino oscillations in matter: Analytical results in the adiabatic approximation

    International Nuclear Information System (INIS)

    Petcov, S.T.; Toshev, S.

    1987-01-01

    Analytical expressions for the probabilities of the transitions between different neutrino flavours in matter in the case of three lepton families and small vacuum mixing angles are obtained in the adiabatic approximation. A brief discussion of the characteristic features of the Mikheyev-Smirnov-Wolfenstein effect in the system of the three neutrino flavours ν_e, ν_μ and ν_τ is also given. (orig.)

  11. DEMT experimental and analytical studies on seismic isolation

    International Nuclear Information System (INIS)

    Gantenbein, F.; Buland, P.

    1989-01-01

    Work on seismic isolation has been performed in France for many years, and the isolation device developed by SPIE-BATIGNOLLES in collaboration with Electricite de France (EDF) has been incorporated in the design of pressurized-water reactor (PWR) nuclear power plants. This paper reviews the experimental and theoretical studies performed at CEA/DEMT related to the overall behavior of isolated structures. The experimental work consists of the seismic shaking-table tests of a concrete cylinder isolated by neoprene sliding pads, and the vibrational tests on the reaction mass of the TAMARIS seismic facility. The analytical work consists of the development of procedures for dynamic calculation methods: for soil-structure interaction where pads are placed between an upper raft and pedestals, for time-history calculations where sliding plates are used, and for fluid-structure interaction where coupled fluid and structure motions and sloshing modes are important. Finally, this paper comments on the consequences of seismic isolation for the analysis of fast breeder reactor (FBR) vessels. The modes can no longer be considered independent (SRSS Method leads to important errors), and the sloshing increases
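    The closing remark of this abstract is the standard caveat about the SRSS combination rule: summing modal peak responses as the square root of the sum of squares assumes the modes are statistically independent. A minimal numerical illustration, with invented modal peaks:

```python
# For two strongly correlated (in-phase) modal peaks a and b, the true
# combined peak approaches |a + b|, which the SRSS rule underestimates.

a, b = 3.0, 4.0           # hypothetical modal peak responses
srss = (a ** 2 + b ** 2) ** 0.5
correlated = abs(a + b)   # limit for fully correlated, in-phase modes
print(srss < correlated)  # → True: SRSS can be unconservative here
```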

  12. A study on improvement of analytical prediction model for spacer grid pressure loss coefficients

    International Nuclear Information System (INIS)

    Lim, Jonh Seon

    2002-02-01

    Nuclear fuel assemblies used in nuclear power plants consist of nuclear fuel rods, control rod guide tubes, an instrument guide tube, spacer grids, a bottom nozzle and a top nozzle. The spacer grid is the most important fuel assembly component for thermal-hydraulic and mechanical design and analyses. The spacer grids, fixed to the guide tubes, support the fuel rods and play a very important role in promoting thermal energy transfer through the coolant mixing caused by turbulent flow and crossflow in the subchannels. In this paper, the analytical spacer grid pressure loss prediction model has been studied and improved by treating the pressure loss in the gap between the test section wall and the spacer grid independently, and by applying an appropriate friction drag coefficient to predict pressure loss more accurately in the low Reynolds number region. The improved analytical model has been verified against hydraulic pressure drop test results for spacer grids of three types with 5x5, 16x16 and 17x17 arrays, respectively. The pressure loss coefficients predicted by the improved analytical model agree with the test results within ±12%. This result shows that the improved analytical model can be used for research and for design changes of nuclear fuel assemblies.
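    The ±12% verification criterion mentioned in this abstract amounts to a relative-tolerance check of predicted against measured loss coefficients; the K values below are invented for illustration.

```python
# Check that model-predicted spacer-grid pressure loss coefficients fall
# within a relative tolerance of the measured values.

def within_tolerance(predicted, measured, tol=0.12):
    return abs(predicted - measured) / measured <= tol

pairs = [(1.05, 1.00), (0.92, 0.85), (1.30, 1.20)]  # (predicted, measured)
print(all(within_tolerance(p, m) for p, m in pairs))  # → True
```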

  13. MO-FG-202-09: Virtual IMRT QA Using Machine Learning: A Multi-Institutional Validation

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, G; Scheuermann, R; Solberg, T [University of Pennsylvania, Philadelphia, PA (United States); Chan, M; Deasy, J [Memorial Sloan-Kettering Cancer Center, New York, NY (United States)

    2016-06-15

    Purpose: To validate a machine learning approach to Virtual IMRT QA for accurately predicting gamma passing rates using different QA devices at different institutions. Methods: A Virtual IMRT QA model was constructed using a machine learning algorithm based on 416 IMRT plans, in which QA measurements were performed using diode-array detectors and a 3% local/3 mm gamma criterion with a 10% threshold. An independent set of 139 IMRT measurements from a different institution, with QA data based on portal dosimetry using the same gamma index and 10% threshold, was used to further test the algorithm. Plans were characterized by 90 different complexity metrics. A weighted Poisson regression with Lasso regularization was trained to predict passing rates using the complexity metrics as input. Results: In addition to predicting passing rates with 3% accuracy for all composite plans using diode-array detectors, passing rates for portal dosimetry on a per-beam basis were predicted with an error <3.5% for 120 IMRT measurements. The remaining measurements (19) had large areas of low CU, where portal dosimetry has larger disagreement with the calculated dose and, as such, large errors were expected. These beams need to be further modeled to correct the under-response in low dose regions. Important features selected by Lasso to predict gamma passing rates were: complete irradiated area outline (CIAO) area, jaw position, fraction of MLC leaves with gaps smaller than 20 mm or 5 mm, fraction of the area receiving less than 50% of the total CU, fraction of the area receiving dose from penumbra, weighted average irregularity factor, and duty cycle, among others. Conclusion: We have demonstrated that Virtual IMRT QA can predict passing rates using different QA devices and across multiple institutions. Prediction of QA passing rates could have profound implications on the current IMRT process.
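    The feature-selection behaviour this abstract attributes to Lasso can be sketched with a toy fit. The study used a weighted Poisson regression with Lasso regularization; the plain linear Lasso below, solved by coordinate descent with soft-thresholding, is only a simplified stand-in, and the two "complexity metrics" and passing rates are synthetic.

```python
# L1-regularized (Lasso) linear fit of gamma passing rate on
# plan-complexity metrics: uninformative metrics get exactly-zero weights.

def lasso(X, y, lam, iters=200):
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # correlation of feature j with the partial residual
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * w[k]
                      for k in range(p) if k != j)) for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # soft-thresholding update
            w[j] = (rho - lam) / z if rho > lam else \
                   (rho + lam) / z if rho < -lam else 0.0
    return w

# Only the first metric actually drives the passing rate here.
X = [[1.0, 0.1], [2.0, -0.2], [3.0, 0.05], [4.0, 0.0]]
y = [97.0, 95.0, 93.0, 91.0]  # per-plan gamma passing rates (%)
w = lasso(X, y, lam=0.5)
print(w[1] == 0.0)  # → True: the irrelevant metric is pruned to zero
```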

  14. MO-FG-202-09: Virtual IMRT QA Using Machine Learning: A Multi-Institutional Validation

    International Nuclear Information System (INIS)

    Valdes, G; Scheuermann, R; Solberg, T; Chan, M; Deasy, J

    2016-01-01

    Purpose: To validate a machine learning approach to Virtual IMRT QA for accurately predicting gamma passing rates using different QA devices at different institutions. Methods: A Virtual IMRT QA model was constructed using a machine learning algorithm based on 416 IMRT plans, in which QA measurements were performed using diode-array detectors and a 3% local/3 mm gamma criterion with a 10% threshold. An independent set of 139 IMRT measurements from a different institution, with QA data based on portal dosimetry using the same gamma index and 10% threshold, was used to further test the algorithm. Plans were characterized by 90 different complexity metrics. A weighted Poisson regression with Lasso regularization was trained to predict passing rates using the complexity metrics as input. Results: In addition to predicting passing rates with 3% accuracy for all composite plans using diode-array detectors, passing rates for portal dosimetry on a per-beam basis were predicted with an error <3.5% for 120 IMRT measurements. The remaining measurements (19) had large areas of low CU, where portal dosimetry has larger disagreement with the calculated dose and, as such, large errors were expected. These beams need to be further modeled to correct the under-response in low dose regions. Important features selected by Lasso to predict gamma passing rates were: complete irradiated area outline (CIAO) area, jaw position, fraction of MLC leaves with gaps smaller than 20 mm or 5 mm, fraction of the area receiving less than 50% of the total CU, fraction of the area receiving dose from penumbra, weighted average irregularity factor, and duty cycle, among others. Conclusion: We have demonstrated that Virtual IMRT QA can predict passing rates using different QA devices and across multiple institutions. Prediction of QA passing rates could have profound implications on the current IMRT process.

  15. PRA studies: results, insights and applications

    International Nuclear Information System (INIS)

    Levine, S.; Stetson, F.T.

    1983-01-01

    This paper deals with Probabilistic Risk Assessment (PRA) studies and their results. PRA is a combination of logic structures and analytical techniques that can be used to estimate the likelihood and consequences of events that have not been observed because of their low frequency of occurrence. At first, attitudes concerning PRA reports were controversial, principally because of their new techniques and complex multidisciplinary nature. However, these attitudes changed following the accident at Three Mile Island in 1979. After this event, many people came to appreciate the risks associated with the operation of nuclear power plants, and since the TMI accident there has been a rapid expansion in the use of PRA in the US and other countries. (NEA)

  16. An analytic study of applying Miller cycle to reduce NOx emission from petrol engine

    International Nuclear Information System (INIS)

    Wang Yaodong; Lin Lin; Roskilly, Anthony P.; Zeng Shengchuo; Huang, Jincheng; He Yunxin; Huang Xiaodong; Huang Huilan; Wei Haiyan; Li Shangping; Yang Jing

    2007-01-01

    An analytic investigation of applying the Miller cycle to reduce nitrogen oxides (NOx) emissions from a petrol engine is carried out. The Miller cycle used in the investigation is a late intake valve closing version. A detailed thermodynamic analysis of the cycle is presented, along with a comparison of the characteristics of the Miller cycle with those of the Otto cycle. From the results of the thermodynamic analyses, it can be seen that the application of the Miller cycle is able to reduce the compression pressure and temperature in the cylinder at the end of the compression stroke. It therefore lowers the combustion temperature and NOx formation in the engine cylinder, resulting in a lower exhaust temperature and less NOx emissions compared with the Otto cycle. The analytic results also show that the Miller cycle ratio is a main factor influencing the combustion temperature, and hence the NOx emissions and the exhaust temperature. The results from the analytic study are used to analyse and compare with previous experimental results. An empirical formula from the previous experimental results, showing the relation of NOx emissions with the exhaust temperature at different engine speeds, is presented. The results from the study show that the application of the Miller cycle may reduce NOx emissions from a petrol engine.
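    The thermodynamic argument in this abstract can be reduced to a back-of-envelope check: late intake valve closing lowers the effective compression ratio, so the ideal-gas end-of-compression temperature T2 = T1 * r**(gamma - 1) drops, suppressing thermal NOx formation. All numbers below are illustrative.

```python
# Ideal-gas isentropic compression: T2 = T1 * r**(gamma - 1).

gamma = 1.4   # ratio of specific heats for air
T1 = 320.0    # K, charge temperature at start of compression

def t_end_of_compression(r):
    return T1 * r ** (gamma - 1.0)

t_otto = t_end_of_compression(10.0)   # geometric compression ratio
t_miller = t_end_of_compression(8.0)  # effective ratio after late IVC
print(t_otto > t_miller)  # → True: Miller cycle compresses to a lower T2
```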

  17. Marshland study brings fruitful results

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    There are approximately 110,000 square kilometers of marshland in China containing peat and other valuable resources. The results of a marshland study have been presented as a chapter in a book titled Physical Geography of China. In 1970, the Kirin Normal University established an experimental plant for the utilization of peat. Four kinds of peat fertilizers produced by this plant have been used with good results at 20 communes in Kirin Province. Four kinds of construction materials (peat board, peat tiles, peat insulation bricks and peat insulation tubes) were also successfully made. A certain type of peat can be used as fuel to heat malt in the process of whiskey making. New applications of peat have been found in medicine, water purification, and in the manufacturing of electrodes for condensers.

  18. Manufacturing and QA of adaptors for LHC

    International Nuclear Information System (INIS)

    Madhu Murthy, V.; Dwivedi, J.; Goswami, S.G.; Soni, H.C.; Mainaud Durand, H.; Quesnel, J.P.; )

    2006-01-01

    The LHC low-beta quadrupoles have very tight alignment tolerances and are located in areas with strong radiation fields. They require remote re-alignment, by motorized jacks, based on the feedback of the alignment sensors of each magnet. Jacks designed to support the arc cryomagnets of the LHC are modified and motorized with the help of adaptors. Two types of adaptors, for the vertical and transverse axes of the jacks, were developed and supplied through a collaboration between RRCAT, DAE, India and CERN, Geneva. This paper describes their functional requirements, manufacture and quality assurance (QA). (author)

  19. Systematic analytical characterization of new psychoactive substances: A case study.

    Science.gov (United States)

    Lobo Vicente, Joana; Chassaigne, Hubert; Holland, Margaret V; Reniero, Fabiano; Kolář, Kamil; Tirendi, Salvatore; Vandecasteele, Ine; Vinckier, Inge; Guillou, Claude

    2016-08-01

    New psychoactive substances (NPS) are synthesized compounds that are not usually covered by European and/or international laws. With a slight alteration in the chemical structure of existing illegal substances registered in the European Union (EU), these NPS circumvent existing controls and are thus referred to as "legal highs". They are becoming increasingly available and can easily be purchased through both the internet and other means (smart shops). Thus, it is essential that the identification of NPS keeps up with this rapidly evolving market. In this case study, the Belgian Customs authorities apprehended a parcel, originating from China, containing two samples, declared as being "white pigments". For routine identification, the Belgian Customs Laboratory first analysed both samples by gas-chromatography mass-spectrometry and Fourier-Transform Infrared spectroscopy. The information obtained by these techniques is essential and can give an indication of the chemical structure of an unknown substance but not the complete identification of its structure. To bridge this gap, scientific and technical support is ensured by the Joint Research Centre (JRC) to the European Commission Directorate General for Taxation and Customs Unions (DG TAXUD) and the Customs Laboratory European Network (CLEN) through an Administrative Arrangement for fast recognition of NPS and identification of unknown chemicals. The samples were sent to the JRC for a complete characterization using advanced techniques and chemoinformatic tools. The aim of this study was also to encourage the development of a science-based policy driven approach on NPS. These samples were fully characterized and identified as 5F-AMB and PX-3 using (1)H and (13)C nuclear magnetic resonance (NMR), high-resolution tandem mass-spectrometry (HR-MS/MS) and Raman spectroscopy. A chemoinformatic platform was used to manage, unify analytical data from multiple techniques and instruments, and combine it with chemical and

  20. Noble gas encapsulation into carbon nanotubes: Predictions from analytical model and DFT studies

    Energy Technology Data Exchange (ETDEWEB)

    Balasubramani, Sree Ganesh; Singh, Devendra; Swathi, R. S., E-mail: swathi@iisertvm.ac.in [School of Chemistry, Indian Institute of Science Education and Research Thiruvananthapuram (IISER-TVM), Kerala 695016 (India)

    2014-11-14

    The energetics of the interaction of noble gas atoms with carbon nanotubes (CNTs) are investigated using an analytical model and density functional theory calculations. Encapsulation of the noble gas atoms He, Ne, Ar, Kr, and Xe into CNTs of various chiralities is studied in detail using an analytical model developed earlier by Hill and co-workers. The constrained motion of the noble gas atoms along the axes of the CNTs as well as the off-axis motion are discussed. Analyses of the forces, interaction energies, and acceptance and suction energies for encapsulation enable us to predict the optimal CNTs that can encapsulate each of the noble gas atoms. We find that CNTs of radii 2.98-4.20 Å (chiral indices (5,4), (6,4), (9,1), (6,6), and (9,3)) can efficiently encapsulate the He, Ne, Ar, Kr, and Xe atoms, respectively. Endohedral adsorption of all the noble gas atoms is preferred over exohedral adsorption on the various CNTs. The results obtained using the analytical model are subsequently compared with calculations performed with dispersion-including density functional theory at the M06-2X level using a triple-zeta basis set, and good qualitative agreement is found. The analytical model is, however, computationally cheap, as its equations can be numerically programmed and the results obtained in comparatively little time.
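    A toy version of the kind of analytical modelling this abstract refers to is a Lennard-Jones 6-12 potential with a numerical search for the equilibrium separation. The epsilon and sigma values below are generic placeholders, not the fitted constants of the Hill-type continuum model used in the paper.

```python
# Lennard-Jones 6-12 potential for a gas atom near a carbon surface,
# with a grid search for the separation that minimizes the energy.

def lj(r, epsilon=1.0, sigma=3.4):
    return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

rs = [2.0 + 0.001 * k for k in range(4001)]  # 2.0 to 6.0 angstrom
r_min = min(rs, key=lj)
print(round(r_min, 2))  # → 3.82, i.e. near the analytic 2**(1/6)*sigma
```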

  1. An analytical study on groundwater flow in drainage basins with horizontal wells

    Science.gov (United States)

    Wang, Jun-Zhi; Jiang, Xiao-Wei; Wan, Li; Wang, Xu-Sheng; Li, Hailong

    2014-06-01

    Analytical studies on release/capture zones are often limited to a uniform background groundwater flow. In fact, for basin-scale problems, the undulating water table would lead to the development of hierarchically nested flow systems, which are more complex than a uniform flow. Under the premise that the water table is a replica of undulating topography and hardly influenced by wells, an analytical solution of hydraulic head is derived for a two-dimensional cross section of a drainage basin with horizontal injection/pumping wells. Based on the analytical solution, distributions of hydraulic head, stagnation points and flow systems (including release/capture zones) are explored. The superposition of injection/pumping wells onto the background flow field leads to the development of new internal stagnation points and new flow systems (including release/capture zones). Generally speaking, the existence of n injection/pumping wells would result in up to n new internal stagnation points and up to 2n new flow systems (including release/capture zones). The analytical study presented, which integrates traditional well hydraulics with the theory of regional groundwater flow, is useful in understanding basin-scale groundwater flow influenced by human activities.
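    The superposition step underlying this abstract follows from the linearity of the steady groundwater flow equation: well solutions can simply be added to a background head field. The sketch below illustrates only that principle with invented geometry, rates, and a unit hydraulic conductivity; it is not the paper's solution for an undulating water table.

```python
# Superposition of Thiem-type logarithmic well terms on a background head.
import math

def well_head(x, y, xw, yw, Q, K=1.0):
    """Head change from one well (2-D steady flow);
    Q > 0 for injection, Q < 0 for pumping."""
    r = math.hypot(x - xw, y - yw)
    return Q / (2.0 * math.pi * K) * math.log(1.0 / r)

def total_head(x, y, wells, background=10.0):
    return background + sum(well_head(x, y, *w) for w in wells)

wells = [(0.0, 0.0, 2.0), (5.0, 0.0, -2.0)]  # one injection, one pumping
h = total_head(2.5, 1.0, wells)  # on the midline the two terms cancel
print(round(h, 6))  # → 10.0
```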

  2. Experimental and analytical study of the sputtering phenomena

    International Nuclear Information System (INIS)

    Howard, P.A.

    1976-03-01

    One form of the sputtering phenomenon, the heat-transfer process that occurs when an initially hot vertical surface is cooled by a falling liquid film, was examined from a new experimental approach. The sputtering front is the lowest wetted position on the vertical surface and is characterized by a short region of intense nucleate boiling. The sputtering front progresses downward at a nearly constant rate, the surface below the sputtering front being dry and almost adiabatic. This heat-transfer process is of interest in the analysis of some performance aspects of emergency core-cooling systems of light-water reactors. An experimental apparatus was constructed to examine the heat-transfer characteristics of a sputtering front. In the present study, a heat source of sufficient intensity was located immediately below the sputtering front, which prevented its downward progress, thus permitting detailed measurements of steady-state surface temperatures throughout a sputtering front. Experimental evidence showed the sputtering front to correspond to a critical heat-flux (CHF) phenomenon. Data were obtained with water flow rates of 350-1600 lbm/hr-ft and subcoolings of 40-140 °F on a 3/8-in. solid copper rod at 1 atm. A two-dimensional analytical model was developed to describe a stationary sputtering front where the wet-dry interface corresponds to a CHF phenomenon and the dry zone is adiabatic. This model is nonlinear because of the temperature dependence of the heat-transfer coefficient in the wetted region and has yielded good agreement with data. A simplified one-dimensional approximation was developed which adequately describes these data. Finally, by means of a coordinate transformation and additional simplifying assumptions, this analysis was extended to analyze moving sputtering fronts, and reasonably good agreement with reported data was shown.

  3. [Pregnancy-Associated Breast Cancer: An analytical observational study].

    Science.gov (United States)

    Baulies, Sonia; Cusidó, Maite; Tresserra, Francisco; Rodríguez, Ignacio; Ubeda, Belén; Ara, Carmen; Fábregas, Rafael

    2014-03-04

    Pregnancy-associated breast cancer is defined as breast cancer diagnosed during pregnancy and up to one year postpartum. A retrospective, analytical, observational study comparing 56 cases of breast cancer and pregnancy (PABC) diagnosed 1976-2008 with 73 patients with breast cancer not associated with pregnancy (non-PABC) was performed. Demographic data, prognostic factors, treatment and survival were reviewed and compared. The prevalence of PABC in our center is 8.3/10,000. The highest frequency (62%) appeared during the postpartum period. Stages were higher in PABC: 31.3% were advanced (stage III and IV) versus 13.3% in non-PABC (P < .05). Regarding prognostic factors, 27.3% of PABC had a tumoral grade 3 versus 15.8% of non-PABC. Among women with PABC, 33.3% had negative estrogen receptors, 48.7% negative progesterone receptors and 34.5% positive Her2Neu, compared with 22.2, 24.1 and 31%, respectively, of non-PABC patients. Finally, positive lymph nodes were found in 52.8% of PABC versus 33.8% of non-PABC (P < .05). Overall and disease-free survival rates at 5 years for PABC were 63.7 and 74.2%, respectively. The poorer survival observed is possibly due to the presence of adverse prognostic features such as lymph node metastases, negative hormone receptors, tumoral grade 3, as well as a delay in diagnosis with a higher rate of advanced stages. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  4. Group-analytic training groups for psychology students: A qualitative study

    DEFF Research Database (Denmark)

    Nathan, Vibeke Torpe; Poulsen, Stig

    2004-01-01

    This article presents results from an interview study of psychology students' experiences from group-analytic groups conducted at the University of Copenhagen. The primary foci are the significance of differences in the motivation and personal aims of individual participants for participation in the group, the impact of the composition of participants on the group process, and the professional learning through the group experience. In general the interviews show a marked satisfaction with the group participation. In particular, learning about the importance of group boundaries...

  5. Investigating ion recombination effects in a liquid-filled ionization chamber array used for IMRT QA measurements

    Energy Technology Data Exchange (ETDEWEB)

    Knill, Cory, E-mail: knillcor@gmail.com; Snyder, Michael; Rakowski, Joseph T.; Burmeister, Jay [Department of Radiation Oncology, Karmanos Cancer Institute, Detroit, Michigan 48201 and Department of Radiation Oncology, Wayne State University School of Medicine, Detroit, Michigan 48201 (United States); Zhuang, Ling [Department of Radiation Oncology, Wayne State University School of Medicine, Detroit, Michigan 48201 (United States); Matuszak, Martha [Department of Radiation Oncology, University of Michigan Health System, Ann Arbor, Michigan 48109 (United States)

    2016-05-15

    Purpose: PTW’s Octavius 1000 SRS array performs IMRT quality assurance (QA) measurements with liquid-filled ionization chambers (LICs) to allow closer detector spacing and higher resolution, compared to air-filled QA devices. However, reduced ion mobility in LICs relative to air leads to increased ion recombination effects and reduced collection efficiencies that are dependent on Linac pulse frequency and pulse dose. These pulse parameters are variable during an IMRT delivery, which affects QA results. In this study, (1) 1000 SRS collection efficiencies were measured as a function of pulse frequency and pulse dose, (2) two methods were developed to correct changes in collection efficiencies during IMRT QA measurements, and the effects of these corrections on QA pass rates were compared. Methods: To obtain collection efficiencies, the OCTAVIUS 1000 SRS was used to measure open fields of varying pulse frequency, pulse dose, and beam energy with results normalized to air-filled chamber measurements. Changes in ratios of 1000 SRS to chamber measured dose were attributed to changing collection efficiencies, which were then correlated to pulse parameters using regression analysis. The usefulness of the derived corrections was then evaluated using 6 MV and 10FFF SBRT RapidArc plans delivered to the OCTAVIUS 4D system using a TrueBeam (Varian Medical Systems) linear accelerator equipped with a high definition multileaf collimator. For the first correction, MATLAB software was developed that calculates pulse frequency and pulse dose for each detector, using measurement and DICOM RT Plan files. Pulse information is converted to collection efficiency, and measurements are corrected by multiplying detector dose by ratios of calibration to measured collection efficiencies. For the second correction the MU/min in the daily 1000 SRS calibration was chosen to match the average MU/min of the volumetric modulated arc therapy plan. Effects of the two corrections on QA results were
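
    The first correction method can be sketched as follows, assuming a simple two-parameter saturation model for the collection efficiency; the coefficients, pulse conditions and doses below are hypothetical stand-ins for the open-field regression results, not PTW's values:

```python
import numpy as np

# Assumed recombination model for a liquid-filled ionization chamber:
# collection efficiency f = 1 / (1 + a*D_pulse + b*f_pulse). The
# coefficients are hypothetical; in practice they come from the
# open-field measurements normalized to an air-filled chamber.
a, b = 0.12, 1.5e-4  # per mGy-per-pulse and per Hz (illustrative)

def efficiency(d_pulse_mGy, f_pulse_Hz):
    return 1.0 / (1.0 + a*d_pulse_mGy + b*f_pulse_Hz)

# Correction: multiply measured dose by f_cal / f_meas, rescaling doses
# measured under arbitrary pulse conditions to the calibration condition.
f_cal = efficiency(0.3, 180.0)           # daily-calibration pulse conditions
d_meas = np.array([1.02, 0.98, 1.10])    # detector doses (Gy), illustrative
f_meas = efficiency(0.5, 50.0)           # per-detector pulse conditions
d_corr = d_meas * (f_cal / f_meas)
```

    The second correction described above sidesteps this arithmetic entirely by choosing calibration pulse conditions that match the plan's average, so that f_cal/f_meas is close to one.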

  6. Experimental and analytical study of high velocity impact on Kevlar/Epoxy composite plates

    Science.gov (United States)

    Sikarwar, Rahul S.; Velmurugan, Raman; Madhu, Velmuri

    2012-12-01

    In the present study, impact behavior of Kevlar/Epoxy composite plates has been carried out experimentally by considering different thicknesses and lay-up sequences and compared with analytical results. The effect of thickness, lay-up sequence on energy absorbing capacity has been studied for high velocity impact. Four lay-up sequences and four thickness values have been considered. Initial velocities and residual velocities are measured experimentally to calculate the energy absorbing capacity of laminates. Residual velocity of projectile and energy absorbed by laminates are calculated analytically. The results obtained from analytical study are found to be in good agreement with experimental results. It is observed from the study that 0/90 lay-up sequence is most effective for impact resistance. Delamination area is maximum on the back side of the plate for all thickness values and lay-up sequences. The delamination area on the back is maximum for 0/90/45/-45 laminates compared to other lay-up sequences.
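
    The analytical residual-velocity calculation rests on an energy balance between initial and residual projectile kinetic energy. A small sketch using the Lambert-Jonas form with p = 2, which reduces to a pure energy balance; the mass, velocities and ballistic limit below are illustrative, not the paper's data:

```python
import numpy as np

def energy_absorbed(m_kg, v_initial, v_residual):
    """Energy absorbed by the laminate (J) from measured projectile
    velocities (m/s), as used to build energy-absorption comparisons."""
    return 0.5 * m_kg * (v_initial**2 - v_residual**2)

def residual_velocity(v_initial, v_bl, p=2.0):
    """Lambert-Jonas fit: Vr = (Vi^p - Vbl^p)^(1/p) above the ballistic
    limit v_bl; p = 2 is a pure energy balance. Values are illustrative."""
    return np.maximum(v_initial**p - v_bl**p, 0.0)**(1.0/p)

vi, vbl, m = 400.0, 250.0, 0.008   # m/s, m/s, kg (illustrative)
vr = residual_velocity(vi, vbl)
e_abs = energy_absorbed(m, vi, vr)  # equals 0.5*m*vbl^2 when p = 2
```
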

  7. From the Phenix irradiation end to the analytical results: PROFIL R target destructive characterization

    International Nuclear Information System (INIS)

    Ferlay, G.; Dancausse, J. Ph.

    2009-01-01

    In the French long-lived radionuclide (LLRN) transmutation program, several irradiation experiments were initiated in the Phenix fast neutron reactor to obtain a better understanding of the transmutation processes. The PROFIL experiments are performed in order to collect accurate information on the total capture integral cross sections of the principal heavy isotopes and some important fission products in the spectral range of fast reactors. One of the final goals is to diminish the uncertainties on the capture cross-section of the fission products involved in reactivity losses in fast reactors. This program includes two parts: PROFIL-R irradiated in a standard fast reactor spectrum and PROFIL-M irradiated in a moderated spectrum. The PROFIL-R and PROFIL-M irradiations were completed in August 2005 and May 2008, respectively. For both irradiations more than a hundred containers with isotopes of pure actinides and other elements in different chemical forms must be characterized. This raises a technical and analytical challenge: how to recover by selective dissolution less than 5 mg of isotope powder from a container with dimensions of only a few millimeters using hot cell facilities, and how to determine analytically both trace and ultra-trace elemental and isotopic compositions with sufficient accuracy to be useful for code calculations. (authors)

  8. Comparison of experimental target currents with analytical model results for plasma immersion ion implantation

    International Nuclear Information System (INIS)

    En, W.G.; Lieberman, M.A.; Cheung, N.W.

    1995-01-01

    Ion implantation is a standard fabrication technique used in semiconductor manufacturing. Implantation has also been used to modify the surface properties of materials to improve their resistance to wear, corrosion and fatigue. However, conventional ion implanters require complex optics to scan a narrow ion beam across the target to achieve implantation uniformity. An alternative implantation technique, called Plasma Immersion Ion Implantation (PIII), immerses the target into a plasma. The ions are extracted from the plasma directly and accelerated by applying negative high-voltage pulses to the target. An analytical model of the voltage and current characteristics of a remote plasma is presented. The model simulates the ion, electron and secondary electron currents induced before, during and after a high voltage negative pulse is applied to a target immersed in a plasma. The model also includes analytical relations that describe the sheath expansion and collapse due to negative high voltage pulses. The sheath collapse is found to be important for high repetition rate pulses. Good correlation is shown between the model and experiment for a wide variety of voltage pulses and plasma conditions
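
    A model of this family can be sketched by matching the ion current at the moving sheath edge to the space-charge-limited Child current, then integrating the resulting sheath-edge equation of motion. This is the standard quasi-static Child-law sheath picture, not necessarily the exact formulation of the paper, and the plasma parameters below are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Quasi-static Child-law sheath model for PIII: the ion current crossing
# the sheath edge, e*n*(ds/dt + u_B), is matched to the space-charge-limited
# Child current at voltage V0 across a sheath of width s.
e, eps0, M = 1.602e-19, 8.854e-12, 6.63e-26  # C, F/m, kg (argon ion)
n, Te, V0 = 1e16, 2.0, 1e4                   # m^-3, eV, V (illustrative)
u_B = np.sqrt(e*Te/M)                        # Bohm speed
K = (4.0/9.0)*eps0*np.sqrt(2*e/M)*V0**1.5    # Child-law prefactor

def dsdt(t, s):
    return K/(e*n*s[0]**2) - u_B             # sheath-edge speed

s0 = np.sqrt(eps0*V0/(e*n))                  # matrix-sheath initial scale
sol = solve_ivp(dsdt, [0.0, 2e-4], [s0], rtol=1e-8)
s_final = sol.y[0, -1]
s_steady = (K/(e*n*u_B))**0.5                # steady width where ds/dt -> 0
```

    The sheath expands rapidly from the matrix-sheath scale and relaxes toward the steady Child-law width; the collapse after the pulse, which the abstract notes matters at high repetition rates, would be modelled by a separate equation once the applied voltage is removed.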

  9. SU-D-213-04: Accounting for Volume Averaging and Material Composition Effects in An Ionization Chamber Array for Patient Specific QA

    International Nuclear Information System (INIS)

    Fugal, M; McDonald, D; Jacqmin, D; Koch, N; Ellis, A; Peng, J; Ashenafi, M; Vanek, K

    2015-01-01

    Purpose: This study explores novel methods to address two significant challenges affecting measurement of patient-specific quality assurance (QA) with IBA’s Matrixx Evolution™ ionization chamber array. First, dose calculation algorithms often struggle to accurately determine dose to the chamber array due to CT artifact and algorithm limitations. Second, finite chamber size and volume averaging effects cause additional deviation from the calculated dose. Methods: QA measurements were taken with the Matrixx positioned on the treatment table in a solid-water Multi-Cube™ phantom. To reduce the effect of CT artifact, the Matrixx CT image set was masked with appropriate materials and densities. Individual ionization chambers were masked as air, while the high-Z electronic backplane and remaining solid-water material were masked as aluminum and water, respectively. Dose calculation was done using Varian’s Acuros XB™ (V11) algorithm, which is capable of predicting dose more accurately in non-biologic materials due to its consideration of each material’s atomic properties. Finally, the exported TPS dose was processed using an in-house algorithm (MATLAB) to assign the volume-averaged TPS dose to each element of a corresponding 2-D matrix. This matrix was used for comparison with the measured dose. Square fields at regularly-spaced gantry angles, as well as selected patient plans, were analyzed. Results: Analyzed plans showed improved agreement, with the average gamma passing rate increasing from 94 to 98%. Correction factors necessary for chamber angular dependence were reduced by 67% compared to factors measured previously, indicating that previously measured factors corrected for dose calculation errors in addition to true chamber angular dependence. Conclusion: By comparing volume-averaged dose, calculated with a capable dose engine, on a phantom masked with correct materials and densities, QA results obtained with the Matrixx Evolution™ can be significantly
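
    The volume-averaging step described above amounts to averaging the fine TPS dose grid over each chamber footprint. A minimal sketch with a hypothetical 1 mm grid and square footprints (not the Matrixx geometry):

```python
import numpy as np

def volume_average(dose, centers, half_width):
    """Average a fine 2-D TPS dose grid over each detector footprint.
    dose: 2-D array on a uniform grid; centers: (row, col) detector
    centres; half_width: half the chamber width in grid cells. The
    square-footprint layout is a hypothetical simplification."""
    out = []
    for r, c in centers:
        patch = dose[r-half_width:r+half_width+1, c-half_width:c+half_width+1]
        out.append(patch.mean())
    return np.array(out)

# For a linear dose gradient, averaging over a symmetric footprint
# reproduces the centre value exactly, so any remaining disagreement
# with measurement isolates true detector effects.
y = np.arange(100.0)
dose = np.tile(y, (100, 1))      # dose increases linearly with column index
centers = [(50, 30), (50, 60)]
avg = volume_average(dose, centers, half_width=2)
```
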

  10. Analytical studies on a modified Nagel-Schreckenberg model with the Fukui-Ishibashi acceleration rule

    International Nuclear Information System (INIS)

    Fu Chuanji; Wang Binghong; Yin Chuanyang; Zhou Tao; Hu Bo; Gao Kun; Hui, P.M.; Hu, C.-K.

    2007-01-01

    We propose and study a one-dimensional traffic flow cellular automaton model of high-speed vehicles with the Fukui-Ishibashi-type (FI) acceleration rule for all cars, and the Nagel-Schreckenberg-type (NS) stochastic delay mechanism. We obtain analytically the fundamental diagrams of the average speed and vehicle flux depending on the vehicle density and stochastic delay probability. Our theoretical results are in excellent agreement with numerical simulations
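
    The model is easy to reproduce numerically: each car first jumps straight to min(vmax, gap) (the FI acceleration rule), then slows by one cell with probability p (the NS stochastic delay). A minimal ring-road simulation with illustrative parameters recovers the free-flow branch of the fundamental diagram, where the mean speed is vmax - p:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(density, L=1000, vmax=5, p=0.3, steps=2000, warmup=500):
    """FI acceleration plus NS random slowdown on a ring of L cells.
    Returns the time-averaged flux (cars x cells per time step per cell)."""
    n = int(density * L)
    pos = np.sort(rng.choice(L, n, replace=False))
    flux = 0.0
    for t in range(steps):
        gap = (np.roll(pos, -1) - pos - 1) % L       # empty cells ahead
        v = np.minimum(vmax, gap)                    # FI acceleration rule
        v = np.maximum(v - (rng.random(n) < p), 0)   # NS stochastic delay
        pos = (pos + v) % L
        pos = pos[np.argsort(pos)]                   # restore order after wrap
        if t >= warmup:
            flux += v.sum() / L
    return flux / (steps - warmup)

q = simulate(0.05)   # free-flow regime: mean speed is close to vmax - p
```

    Because v never exceeds the gap, cars cannot collide or overtake, and in the low-density regime the measured flux approaches the analytic value density*(vmax - p) reported in fundamental-diagram studies of this model.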

  11. Studies on the spectral interference of gadolinium on different analytes in inductively coupled plasma atomic emission spectroscopy

    International Nuclear Information System (INIS)

    Sengupta, Arijit; Thulasidas, S.K.; Natarajan, V.; Airan, Yougant

    2015-01-01

    Due to their multi-electronic nature, rare earth elements are prone to exhibit spectral interference in ICP-AES, which leads to erroneous determination of analytes in the presence of such a matrix. This interference is very significant when the analytes are to be determined at trace level in the presence of emission-rich matrix elements. An attempt was made to understand the spectral interference of Gd on 29 common analytes like Ag, Al, B, Ba, Bi, Ca, Cd, Ce, Co, Cr, Cu, Dy, Fe, Ga, Gd, In, La, Li, Lu, Mg, Mn, Na, Nd, Ni, Pb, Pr, Sr, Tl and Zn using ICP-AES with a Charge Coupled Device (CCD) detector. The present study includes identification of suitable interference-free analytical lines of these analytes, evaluation of a correction factor for each analytical line and determination of the tolerance levels of these analytical lines, along with an ICP-AES based methodology for simultaneous determination of Gd. Based on the spectral interference study, an ICP-AES based method was developed for the determination of these analytes at trace level in the presence of a Gd matrix without chemical separation. Further, the developed methodology was validated using synthetic samples prepared from commercially available reference material solutions of the individual elements; the results were found to be satisfactory. The method was also compared with other existing techniques.
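
    Once correction factors are tabulated, applying them is a one-line subtraction per analytical line: the apparent analyte signal is reduced by the Gd contribution at that wavelength. A sketch with real Mn and Fe line wavelengths but invented correction-factor values, purely for illustration:

```python
# Spectral-interference correction in ICP-AES: the Gd matrix contributes a
# spurious signal at the analyte line, so the true concentration is
# recovered as C_true = C_apparent - CF * C_Gd. The CF values below are
# hypothetical illustrations, not the paper's measured factors.
correction_factor = {            # apparent analyte (ug/mL) per mg/mL of Gd
    "Mn 257.610": 0.004,
    "Fe 238.204": 0.012,
}

def corrected(line, c_apparent_ugml, c_gd_mgml):
    return c_apparent_ugml - correction_factor[line] * c_gd_mgml

# e.g. in a 5 mg/mL Gd matrix, an apparent 0.35 ug/mL Fe reading
c_true = corrected("Fe 238.204", 0.35, 5.0)
```

    The tolerance level mentioned in the abstract is then the Gd concentration beyond which the correction term dominates the analyte signal and the line can no longer be used.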

  13. Application of nuclear analytical methods to heavy metal pollution studies of estuaries

    International Nuclear Information System (INIS)

    Anders, B.; Junge, W.; Knoth, J.; Michaelis, W.; Pepelnik, R.; Schwenke, H.

    1984-01-01

    Important objectives of heavy metal pollution studies of estuaries are the understanding of the transport phenomena in these complex ecosystems and the discovery of the pollution history and the geochemical background. Such studies require high precision and accuracy of the analytical methods. Moreover, pronounced spatial heterogeneities and temporal variabilities that are typical for estuaries necessitate the analysis of a great number of samples if relevant results are to be obtained. Both requirements can economically be fulfilled by a proper combination of analytical methods. Applications of energy-dispersive X-ray fluorescence analysis with total reflection of the exciting beam at the sample support and of neutron activation analysis with both thermal and fast neutrons are reported in the light of pollution studies performed in the Lower Elbe River. Profiles are presented for the total heavy metal content determined from particulate matter and sediment. They include V, Mn, Fe, Ni, Cu, Zn, As, Pb, and Cd. (16 references, 10 figures, 1 table)

  14. Paraxial light distribution in the focal region of a lens: a comparison of several analytical solutions and a numerical result

    Science.gov (United States)

    Wu, Yang; Kelly, Damien P.

    2014-12-01

    The distribution of the complex field in the focal region of a lens is a classical optical diffraction problem. Today, it remains of significant theoretical importance for understanding the properties of imaging systems. In the paraxial regime, it is possible to find analytical solutions in the neighborhood of the focus, when a plane wave is incident on a focusing lens whose finite extent is limited by a circular aperture. For example, in Born and Wolf's treatment of this problem, two different, but mathematically equivalent analytical solutions, are presented that describe the 3D field distribution using infinite sums of ? and ? type Lommel functions. An alternative solution expresses the distribution in terms of Zernike polynomials, and was presented by Nijboer in 1947. More recently, Cao derived an alternative analytical solution by expanding the Fresnel kernel using a Taylor series expansion. In practical calculations, however, only a finite number of terms from these infinite series expansions is actually used to calculate the distribution in the focal region. In this manuscript, we compare and contrast each of these different solutions to a numerically calculated result, paying particular attention to how quickly each solution converges for a range of different spatial locations behind the focusing lens. We also examine the time taken to calculate each of the analytical solutions. The numerical solution is calculated in a polar coordinate system and is semi-analytic. The integration over the angle is solved analytically, while the radial coordinate is sampled with a sampling interval of ? and then numerically integrated. This produces an infinite set of replicas in the diffraction plane, that are located in circular rings centered at the optical axis and each with radii given by ?, where ? is the replica order. These circular replicas are shown to be fundamentally different from the replicas that arise in a Cartesian coordinate system.
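
    The convergence comparison can be reproduced directly: the Lommel-series intensity (2/u)^2 (U1^2 + U2^2) from Born and Wolf's treatment should agree with brute-force numerical evaluation of the paraxial diffraction integral. A sketch in Python rather than the authors' code, with a fixed truncation of 60 series terms:

```python
import numpy as np
from scipy.special import jv
from scipy.integrate import quad

def lommel_U(order, u, v, terms=60):
    """Lommel function U_n(u, v) from its defining series (fixed truncation):
    U_n = sum_s (-1)^s (u/v)^(n+2s) J_(n+2s)(v)."""
    s = np.arange(terms)
    return float(np.sum((-1.0)**s * (u/v)**(order + 2*s) * jv(order + 2*s, v)))

def intensity_series(u, v):
    # Born & Wolf's focal-region result: I(u, v)/I0 = (2/u)^2 (U1^2 + U2^2)
    return (2.0/u)**2 * (lommel_U(1, u, v)**2 + lommel_U(2, u, v)**2)

def intensity_numeric(u, v):
    # brute force: |2 * integral_0^1 J0(v r) exp(i u r^2 / 2) r dr|^2
    re, _ = quad(lambda r: np.cos(u*r**2/2.0) * jv(0, v*r) * r, 0.0, 1.0)
    im, _ = quad(lambda r: np.sin(u*r**2/2.0) * jv(0, v*r) * r, 0.0, 1.0)
    return 4.0*(re**2 + im**2)
```

    At moderate (u, v) a few dozen terms already match the quadrature to near machine accuracy; timing the two routes over a grid of points reproduces the kind of convergence-versus-cost comparison the paper makes.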

  16. A European multicenter study on the analytical performance of the VERIS HBV assay.

    Science.gov (United States)

    Braun, Patrick; Delgado, Rafael; Drago, Monica; Fanti, Diana; Fleury, Hervé; Izopet, Jacques; Lombardi, Alessandra; Mancon, Alessandro; Marcos, Maria Angeles; Sauné, Karine; O Shea, Siobhan; Pérez-Rivilla, Alfredo; Ramble, John; Trimoulet, Pascale; Vila, Jordi; Whittaker, Duncan; Artus, Alain; Rhodes, Daniel

    Hepatitis B viral load monitoring is an essential part of managing patients with chronic Hepatitis B infection. Beckman Coulter has developed the VERIS HBV Assay for use on the fully automated Beckman Coulter DxN VERIS Molecular Diagnostics System. Objectives: to evaluate the analytical performance of the VERIS HBV Assay at multiple European virology laboratories. Precision, analytical sensitivity, negative sample performance, linearity and performance with major HBV genotypes/subtypes of the VERIS HBV Assay were evaluated. Precision showed an SD of 0.15 log10 IU/mL or less for each level tested. Analytical sensitivity determined by probit analysis was between 6.8 and 8.0 IU/mL. Clinical specificity on 90 unique patient samples was 100.0%. Performance with 754 negative samples demonstrated 100.0% not-detected results, and a carryover study showed no cross-contamination. Linearity using clinical samples was shown from 1.23 to 8.23 log10 IU/mL, and the assay detected and showed linearity with the major HBV genotypes/subtypes. The VERIS HBV Assay demonstrated analytical performance comparable to other currently marketed assays for HBV DNA monitoring. Copyright © 2017 Elsevier B.V. All rights reserved.
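
    Probit analysis of the kind used for the sensitivity claim fits the detection rate against log concentration on the probit scale and reads off the concentration detected with 95% probability. A sketch on synthetic hit-rate data (invented counts, not the study's measurements):

```python
import numpy as np
from scipy.stats import norm

# Probit analysis for analytical sensitivity (sketch): replicate panels at
# several low concentrations, detection rate transformed with the inverse
# normal CDF, then a linear fit versus log10(concentration).
conc = np.array([2.0, 4.0, 8.0, 16.0])    # IU/mL, synthetic dilution panel
n_rep = 60
hits = np.array([21, 39, 54, 59])         # detected replicates (synthetic)

rate = hits / n_rep                       # keep rates strictly in (0, 1)
x = np.log10(conc)
z = norm.ppf(rate)                        # probit transform
slope, intercept = np.polyfit(x, z, 1)

# 95% detection level: concentration where the fitted probit reaches
# norm.ppf(0.95)
lod95 = 10 ** ((norm.ppf(0.95) - intercept) / slope)
```

    A full analysis would also report confidence limits on the fitted LOD, typically via the covariance of the regression or a maximum-likelihood probit fit.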

  17. Analytic method study of point-reactor kinetic equation when cold start-up

    International Nuclear Information System (INIS)

    Zhang Fan; Chen Wenzhen; Gui Xuewen

    2008-01-01

    The reactor cold start-up is a process of inserting reactivity by lifting control rods discontinuously. Inserting too much reactivity will cause a short period and may cause an overpressure accident in the primary loop. It is therefore very important to understand the rule of neutron density variation and to find out the relationships among the speed of lifting control rods and the duration and speed of the neutron density response. It is also helpful for the operators to grasp this rule in order to avoid a start-up accident. This paper starts with the one-group delayed neutron point-reactor kinetics equations and provides their analytic solution when reactivity is introduced by lifting control rods discontinuously. The analytic expression is validated by comparison with practical data. It is shown that the analytic solution agrees well with the numerical solution. Using this analytical solution, the relationships among the neutron density response, the speed of lifting control rods and its duration are also studied. By comparing the results with those under the condition of step-inserted reactivity, useful conclusions are drawn.
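
    For a step reactivity insertion below prompt critical, the one-delayed-group solution is a sum of two exponentials whose rates are the roots of the inhour quadratic; the discontinuous rod-lifting case treated in the paper chains such solutions together segment by segment. A sketch checking the step-reactivity building block against direct integration, with illustrative kinetics parameters:

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-group point kinetics with a step reactivity rho < beta (illustrative
# parameters): n(t) is a sum of two exponentials whose rates solve
# Lam*w^2 + (beta - rho + lam*Lam)*w - rho*lam = 0.
beta, lam, Lam, rho = 0.0065, 0.0767, 1e-4, 0.002

b = beta - rho + lam*Lam
w1, w2 = np.roots([Lam, b, -rho*lam]).real

def n_analytic(t):
    # n(0) = 1 and dn/dt(0) = rho/Lam (precursors initially at equilibrium)
    A1 = (rho/Lam - w2) / (w1 - w2)
    A2 = 1.0 - A1
    return A1*np.exp(w1*t) + A2*np.exp(w2*t)

def rhs(t, y):
    n, c = y
    return [((rho - beta)/Lam)*n + lam*c, (beta/Lam)*n - lam*c]

# numerical reference solution, precursors at equilibrium: c0 = beta/(lam*Lam)
sol = solve_ivp(rhs, [0.0, 5.0], [1.0, beta/(lam*Lam)],
                rtol=1e-10, atol=1e-12, dense_output=True)
```

    The fast root gives the prompt jump and the slow root the stable period; the analytic and integrated solutions agree to the solver tolerance.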

  18. RESULTS OF SUPPLEMENTAL MST STUDIES

    International Nuclear Information System (INIS)

    Peters, T.; Hobbs, David; Fink, Samuel

    2006-01-01

    The current design of the Salt Waste Processing Facility (SWPF) includes an auxiliary facility, the Actinide Finishing Facility, which provides a second contact of monosodium titanate (MST) to remove soluble actinides and strontium from waste if needed. This treatment will occur after cesium removal by Caustic-Side Solvent Extraction (CSSX). Although the process changes and safety basis implications have not yet been analyzed, provisions also exist to recover the MST from this operation and return it to the initial actinide removal step in the SWPF for an additional (third) contact with fresh waste. A U.S. Department of Energy (DOE) request identified the need to study the following issues involving this application of MST: Determine the effect of organics from the solvent extraction (CSSX) process on radionuclide sorption by MST; Determine the efficiency of re-using MST for multiple contacts; and Examine fissile loading on MST under conditions using a waste containing significantly elevated concentrations of plutonium, uranium, neptunium, and strontium. This report describes the results of three experimental studies conducted to address these needs: (1) Addition of high concentrations of entrained CSSX solvent had no noticeable effect, over a two week period, on the sorption of the actinides and strontium by MST in a direct comparison experiment. (2) Test results show that MST still retains appreciable capacity after being used once. For instance, reused MST--in the presence of entrained solvent--continued to sorb actinides and strontium. (3) A single batch of MST was used to sequentially contact five volumes of a simulant solution containing elevated concentrations of the radionuclides of interest. After the five contacts, we measured the following solution actinide loadings on the MST: plutonium: 0.884 ± 0.00539 wt % or (1.02 ± 0.0112)E+04 µg/g MST, uranium: 12.1 ± 0.786 wt % or (1.40 ± 0.104)E+05 µg/g MST, and neptunium: 0.426 ± 0.00406 wt % or

  19. One-loop Higgs plus four gluon amplitudes. Full analytic results

    International Nuclear Information System (INIS)

    Badger, Simon; Nigel Glover, E.W.; Williams, Ciaran; Mastrolia, Pierpaolo

    2009-10-01

    We consider one-loop amplitudes of a Higgs boson coupled to gluons in the limit of a large top quark mass. We treat the Higgs as the real part of a complex field φ that couples to the self-dual field strengths and compute the one-loop corrections to the φ-NMHV amplitude, which contains one gluon of positive helicity whilst the remaining three have negative helicity. We use four-dimensional unitarity to construct the cut-containing contributions and a hybrid of Feynman diagram and recursive based techniques to determine the rational piece. Knowledge of the φ-NMHV contribution completes the analytic calculation of the Higgs plus four gluon amplitude. For completeness we also include expressions for the remaining helicity configurations which have been calculated elsewhere. These amplitudes are relevant for Higgs plus jet production via gluon fusion in the limit where the top quark is large compared to all other scales in the problem. (orig.)

  20. Analytic result for the two-loop six-point NMHV amplitude in N=4 super Yang-Mills theory

    CERN Document Server

    Dixon, Lance J.; Henn, Johannes M.

    2012-01-01

    We provide a simple analytic formula for the two-loop six-point ratio function of planar N = 4 super Yang-Mills theory. This result extends the analytic knowledge of multi-loop six-point amplitudes beyond those with maximal helicity violation. We make a natural ansatz for the symbols of the relevant functions appearing in the two-loop amplitude, and impose various consistency conditions, including symmetry, the absence of spurious poles, the correct collinear behaviour, and agreement with the operator product expansion for light-like (super) Wilson loops. This information reduces the ansatz to a small number of relatively simple functions. In order to fix these parameters uniquely, we utilize an explicit representation of the amplitude in terms of loop integrals that can be evaluated analytically in various kinematic limits. The final compact analytic result is expressed in terms of classical polylogarithms, whose arguments are rational functions of the dual conformal cross-ratios, plus precisely two function...

  1. Evaluation of MotionSim XY/4D for patient specific QA of respiratory gated treatment for lung cancer

    International Nuclear Information System (INIS)

    Wen, C.; Ackerly, T.; Lancaster, C.; Bailey, N.

    2011-01-01

    Full text: A commercial system, MotionSim XY/4D™, capable of simulating two-dimensional tumour motion and measuring planar dose with a diode matrix, was evaluated at the Alfred Hospital for establishing a patient-specific QA programme for respiratory-gated treatment of lung cancer. This study presents an investigation of the accuracies, limitations and practical aspects of that system. Planar doses generated in iPlan™ by mapping clinical beams to a scanned-in water phantom were measured by MotionSim XY/4D™ with 5 cm water-equivalent build-up at normal incidence. The gated delivery using ExacTrac™, through tracking infrared markers simulating an external respiration surrogate, was measured simultaneously with GafChromic® RTQA2 film and MapCHECK 2™. Dose maps of both non-gated and gated beams with 30% duty cycle were compared with both film and diode measurements. Differences in dose distribution were analysed with built-in tools in MapCHECK 2™ and the effect of residual motion within the beam-enabled window was then assessed. Preliminary results indicate that the difference between GafChromic film and MapCHECK 2 measurements of the same beam was negligible. Gated dose delivery to a target at 9 mm maximum motion was in good agreement with the planned dose. Complementing the measurements suggested in AAPM Report No. 91, this QA device can detect random errors and assess the magnitude of residual target motion by analysing differences between planned and delivered doses as a gamma function. Although some user-friendliness aspects could be improved, it meets its specification and can be used for routine clinical QA purposes provided calibrations are performed and procedures followed.
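
    The planned-versus-delivered comparison referred to as a gamma function is the standard gamma index. A brute-force sketch with global normalisation and the common 3%/3 mm criteria, which are assumed here rather than taken from the study:

```python
import numpy as np

def gamma_pass_rate(ref, evl, dx_mm=1.0, dose_tol=0.03, dist_tol_mm=3.0):
    """Brute-force global 2-D gamma analysis: for each reference point take
    the minimum combined dose-difference / distance-to-agreement metric over
    all evaluated points; gamma <= 1 counts as a pass. The dose criterion is
    a fraction of the reference maximum (global normalisation)."""
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float) * dx_mm
    d_max = ref.max()
    passed = 0
    for i in range(ny):
        for j in range(nx):
            dist2 = (yy - i*dx_mm)**2 + (xx - j*dx_mm)**2
            dose2 = ((evl - ref[i, j]) / (dose_tol*d_max))**2
            if np.sqrt((dist2/dist_tol_mm**2 + dose2).min()) <= 1.0:
                passed += 1
    return passed / (ny*nx)

# A Gaussian test field against a 2% globally scaled copy passes everywhere
# under 3%/3 mm, since the dose difference alone stays below the criterion.
ref = np.fromfunction(lambda i, j: np.exp(-((i-10.0)**2 + (j-10.0)**2)/50.0),
                      (21, 21))
rate = gamma_pass_rate(ref, 1.02*ref)
```

    Clinical tools interpolate the evaluated grid before the search; this O(N^2) version keeps the definition visible at the cost of speed.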

  2. Review of Factor Analytic Studies Examining Symptoms of Autism Spectrum Disorders

    Science.gov (United States)

    Shuster, Jill; Perry, Adrienne; Bebko, James; Toplak, Maggie E.

    2014-01-01

    Factor analytic studies have been conducted to examine the inter-relationships and degree of overlap among symptoms in Autism Spectrum Disorder (ASD). This paper reviewed 36 factor analytic studies that have examined ASD symptoms, using 13 different instruments. Studies were grouped into three categories: Studies with all DSM-IV symptoms, studies…

  3. Semi Active Control of Civil Structures, Analytical and Numerical Studies

    Science.gov (United States)

    Kerboua, M.; Benguediab, M.; Megnounif, A.; Benrahou, K. H.; Kaoulala, F.

    A numerical example of the parallel R-L piezoelectric vibration shunt control simulated with MATLAB® is presented. An analytical study of the resistor-inductor (R-L) passive piezoelectric vibration shunt control of a cantilever beam was undertaken. The modal and strain analyses were performed by varying the material properties and geometric configurations of the piezoelectric transducer in relation to the structure in order to maximize the mechanical strain produced in the piezoelectric transducer.

  4. Horizontal Parallel Pipe Ground Heat Exchanger : Analytical Conception and Experimental Study

    International Nuclear Information System (INIS)

    Naili, Nabiha; Jemli, Ramzi; Farhat, Abdel Hamid; Ben Nasrallah, Sassi

    2009-01-01

    Due to the limited amount of natural resources exploited for heating, and in order to reduce the environmental impact, people should strive to use renewable energy resources. Ambient low-grade energy may be upgraded by the ground heat exchanger (GHE), which exploits the ground's thermal inertia for building heating and cooling. In this study, analytical performance and experimental analyses of a horizontal ground heat exchanger have been performed. The analytical study, relating to the dimensioning of the heat exchanger, shows that the heat exchanger characteristics are very important for determining the heat extracted from the ground. The experimental results were obtained during the period 30 November to 10 December 2007, in the greenhouse heating season. Measurements show that the ground temperature below a certain depth remains relatively constant. To exploit the heat capacity of the ground effectively, a horizontal heat exchanger system has been constructed and tested at the Center of Research and Technology of Energy, in Tunisia

  5. A multicenter nationwide reference intervals study for common biochemical analytes in Turkey using Abbott analyzers.

    Science.gov (United States)

    Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif

    2014-12-01

    A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed at Uludag University in Bursa using Abbott reagents and analyzers. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs obtained by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences of reference values among the seven regions were not significant for any of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs derived by the non-parametric method were wider than those by the parametric method, especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
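The parametric derivation described above amounts to transforming the reference values toward normality with a Box-Cox formula, taking the Gaussian central 95% limits, and back-transforming. A minimal sketch on synthetic data follows; it uses the standard Box-Cox transform fitted by a likelihood grid search, and does not reproduce the study's modified Box-Cox formula or the latent abnormal values exclusion step:

```python
import numpy as np

def boxcox(x, lam):
    """Standard Box-Cox transform (log transform at lambda = 0)."""
    return np.log(x) if abs(lam) < 1e-9 else (x ** lam - 1.0) / lam

def boxcox_loglik(x, lam):
    """Profile log-likelihood of the Box-Cox model under normality."""
    y = boxcox(x, lam)
    return -0.5 * len(x) * np.log(y.var()) + (lam - 1.0) * np.log(x).sum()

def parametric_ri(values):
    """95% reference interval: Box-Cox transform + Gaussian quantiles."""
    values = np.asarray(values, dtype=float)
    lams = np.linspace(-2, 2, 401)
    lam = max(lams, key=lambda l: boxcox_loglik(values, l))  # grid-search MLE
    y = boxcox(values, lam)
    z = 1.959964  # two-sided 95% normal quantile
    lo, hi = y.mean() - z * y.std(), y.mean() + z * y.std()

    def inv(v):  # inverse Box-Cox, back to the original scale
        return np.exp(v) if abs(lam) < 1e-9 else (lam * v + 1.0) ** (1.0 / lam)

    return inv(lo), inv(hi)

# Synthetic right-skewed "analyte" with 3066 reference subjects
rng = np.random.default_rng(0)
glucose = np.exp(rng.normal(np.log(90), 0.12, size=3066))
ri_low, ri_high = parametric_ri(glucose)
```

On well-behaved data the parametric limits land close to the non-parametric 2.5th/97.5th percentiles but are more stable at the tails, which is why the two are compared in the study.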

  6. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    Science.gov (United States)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The Earth Analytics education program, a core component of Earth Lab at the University of Colorado Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate / degree program. All programs share the goal of preparing a STEM workforce for successful earth analytics driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present evaluation results from both an interdisciplinary undergraduate / graduate level earth analytics course and an undergraduate internship. Early results suggest that a blended approach that combines synchronous in-person teaching and active hands-on classroom learning with asynchronous learning in the form of online materials leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO optimization on online content reach and program visibility.

  7. Information management system study results. Volume 1: IMS study results

    Science.gov (United States)

    1971-01-01

    The information management system (IMS) special emphasis task was performed as an adjunct to the modular space station study, with the objective of providing extended depth of analysis and design in selected key areas of the information management system. Specific objectives included: (1) in-depth studies of IMS requirements and design approaches; (2) design and fabricate breadboard hardware for demonstration and verification of design concepts; (3) provide a technological base to identify potential design problems and influence long range planning; (4) develop hardware and techniques to permit long duration, low cost, manned space operations; (5) support SR&T areas where techniques or equipment are considered inadequate; and (6) permit an overall understanding of the IMS as an integrated component of the space station.

  8. Study designs may influence results

    DEFF Research Database (Denmark)

    Johansen, Christoffer; Schüz, Joachim; Andreasen, Anne-Marie Serena

    2017-01-01

    appeared to show an inverse association, whereas nested case-control and cohort studies showed no association. For allergies, the inverse association was observed irrespective of study design. We recommend that the questionnaire-based case-control design be placed lower in the hierarchy of studies...... for establishing cause-and-effect for diseases such as glioma. We suggest that a state-of-the-art case-control study should, as a minimum, be accompanied by extensive validation of the exposure assessment methods and the representativeness of the study sample with regard to the exposures of interest. Otherwise...

  9. Analytical and biological studies of kanji and extracts of its ingredient, daucus carota L

    International Nuclear Information System (INIS)

    Latif, A.; Hussain, K.; Bukhari, N.; Karim, S.; Hussain, A.; Khurshid, F.

    2013-01-01

    A fermented beverage, Kanji, prepared from roots of Daucus carota L. subsp. sativus (Hoffm.) Arcang. var. vavilovii Mazk. (Apiaceae), despite a long history of use, has not been subjected to analytical studies or tested for biological activities. Therefore, the present study aimed to investigate different types of Kanji samples, and various types of extracts/fractions of the root of the plant, through a number of analytical studies and in vitro antioxidant assays. The Kanji sample with the better analytical and biological profile, Lab-made Kanji, was further investigated in preliminary clinical studies. The analytical studies indicated that Lab-made Kanji had comparatively higher contents of phytochemicals than the commercial Kanji samples and the different types of extracts and fractions (P < 0.05). All the Kanji samples and the aqueous and ethanol extracts of fresh roots exhibited comparable antioxidant activities in the DPPH assay (52.20 - 54.19%), higher than that of the methanol extract (48.78%) of dried roots. The antiradical powers (1/EC50) of Lab-made Kanji and the aqueous extract were found to be higher than those of the ethanol and methanol extracts. In the beta-carotene linoleate assay, the Kanji samples showed higher activity than the methanol extract, comparable to that of vitamin E and butylated hydroxyanisole (BHA) (P < 0.05). A preliminary clinical evaluation indicated that Kanji has no harmful effect on blood components, liver function or serum lipid profile. The results of the present study indicate that Kanji is an effective antioxidant beverage. (author)

  10. Phase II of a Six sigma Initiative to Study DWPF SME Analytical Turnaround Times: SRNL's Evaluation of Carbonate-Based Dissolution Methods

    International Nuclear Information System (INIS)

    Edwards, Thomas

    2005-01-01

    The Analytical Development Section (ADS) and the Statistical Consulting Section (SCS) of the Savannah River National Laboratory (SRNL) are participating in a Six Sigma initiative to improve the Defense Waste Processing Facility (DWPF) Laboratory. The Six Sigma initiative has focused on reducing the analytical turnaround time of samples from the Slurry Mix Evaporator (SME) by developing streamlined sampling and analytical methods [1]. The objective of Phase I was to evaluate the sub-sampling of a larger sample bottle and the performance of a cesium carbonate (Cs2CO3) digestion method. Successful implementation of the Cs2CO3 fusion method in the DWPF would have important time savings and convenience benefits because this single digestion would replace the dual digestion scheme now used. A single digestion scheme would result in more efficient operations in both the DWPF shielded cells and the inductively coupled plasma-atomic emission spectroscopy (ICP-AES) laboratory. By taking a small aliquot of SME slurry from a large sample bottle and dissolving the vitrified SME sample with carbonate fusion methods, an analytical turnaround time reduction from 27 hours to 9 hours could be realized in the DWPF. This analytical scheme has the potential for not only dramatically reducing turnaround times, but also streamlining operations to minimize wear and tear on critical shielded cell components that are prone to fail, including the Hydragard(TM) sampling valves and manipulators. Favorable results from the Phase I tests [2] led to the recommendation for a Phase II effort as outlined in the DWPF Technical Task Request (TTR) [3]. There were three major tasks outlined in the TTR, and SRNL issued a Task Technical and QA Plan [4] with a corresponding set of three major task activities: (1) Compare weight percent (wt%) total solids measurements of large volume samples versus peanut vial samples. (2) Evaluate Cs2CO3 and K2CO3 fusion methods using DWPF simulated

  11. The Viking X ray fluorescence experiment - Analytical methods and early results

    Science.gov (United States)

    Clark, B. C., III; Castro, A. J.; Rowe, C. D.; Baird, A. K.; Rose, H. J., Jr.; Toulmin, P., III; Christian, R. P.; Kelliher, W. C.; Keil, K.; Huss, G. R.

    1977-01-01

    Ten samples of the Martian regolith have been analyzed by the Viking lander X ray fluorescence spectrometers. Because of high-stability electronics, inclusion of calibration targets, and special data encoding within the instruments, the quality of the analyses performed on Mars is closely equivalent to that attainable with the same instruments operated in the laboratory. Determination of absolute elemental concentrations requires gain drift adjustments, subtraction of background components, and use of a mathematical response model with adjustable parameters set by prelaunch measurements on selected rock standards. Bulk fines at both Viking landing sites are quite similar in composition, implying that a chemically and mineralogically homogeneous regolith covers much of the surface of the planet. Important differences between samples include a higher sulfur content in what appear to be duricrust fragments than in fines, and a lower iron content in fines taken from beneath large rocks than in those taken from unprotected surface material. Further extensive reduction of these data will allow more precise and more accurate analytical numbers to be determined and thus a more comprehensive understanding of elemental trends between samples.

  12. Crystal growth of pure substances: Phase-field simulations in comparison with analytical and experimental results

    Science.gov (United States)

    Nestler, B.; Danilov, D.; Galenko, P.

    2005-07-01

    A phase-field model for non-isothermal solidification in multicomponent systems [SIAM J. Appl. Math. 64 (3) (2004) 775-799] consistent with the formalism of classic irreversible thermodynamics is used for numerical simulations of crystal growth in a pure material. The relation of this approach to the phase-field model by Bragard et al. [Interface Science 10 (2-3) (2002) 121-136] is discussed. 2D and 3D simulations of dendritic structures are compared with the analytical predictions of the Brener theory [Journal of Crystal Growth 99 (1990) 165-170] and with recent experimental measurements of solidification in pure nickel [Proceedings of the TMS Annual Meeting, March 14-18, 2004, pp. 277-288; European Physical Journal B, submitted for publication]. 3D morphology transitions are obtained for variations in surface energy and kinetic anisotropies at different undercoolings. In computations, we investigate the convergence behaviour of a standard phase-field model and of its thin interface extension at different undercoolings and at different ratios between the diffuse interface thickness and the atomistic capillary length. The influence of the grid anisotropy is accurately analyzed for a finite difference method and for an adaptive finite element method in comparison.

  13. Why do ultrasoft repulsive particles cluster and crystallize? Analytical results from density-functional theory.

    Science.gov (United States)

    Likos, Christos N; Mladek, Bianca M; Gottwald, Dieter; Kahl, Gerhard

    2007-06-14

    We demonstrate the accuracy of the hypernetted chain closure and of the mean-field approximation for the calculation of the fluid-state properties of systems interacting by means of bounded and positive pair potentials with oscillating Fourier transforms. Subsequently, we prove the validity of a bilinear, random-phase density functional for arbitrary inhomogeneous phases of the same systems. On the basis of this functional, we calculate analytically the freezing parameters of the latter. We demonstrate explicitly that the stable crystals feature a lattice constant that is independent of density and whose value is dictated by the position of the negative minimum of the Fourier transform of the pair potential. This property is equivalent to the existence of clusters, whose population scales proportionally to the density. We establish that regardless of the form of the interaction potential and of the location on the freezing line, all cluster crystals have a universal Lindemann ratio Lf=0.189 at freezing. We further make an explicit link between the aforementioned density functional and the harmonic theory of crystals. This allows us to establish an equivalence between the emergence of clusters and the existence of negative Fourier components of the interaction potential. Finally, we make a connection between the class of models at hand and the system of infinite-dimensional hard spheres, when the limits of interaction steepness and space dimension are both taken to infinity in a prescribed fashion.

  14. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices.
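Quantification by external calibration with an internal standard, as described above, amounts to fitting the analyte/IS peak-area ratio against the standard concentrations and reading the sample concentration off the fitted line. A minimal sketch follows; the peak areas and the use of DBP as the example analyte are invented for illustration:

```python
import numpy as np

def quantify(cal_conc, cal_area, cal_is_area, sample_area, sample_is_area):
    """External calibration with an internal standard (illustrative).

    Fits analyte/IS peak-area ratio vs. concentration, then inverts the
    fitted line to obtain the sample concentration.
    """
    ratios = np.asarray(cal_area) / np.asarray(cal_is_area)
    slope, intercept = np.polyfit(cal_conc, ratios, 1)
    sample_ratio = sample_area / sample_is_area
    return (sample_ratio - intercept) / slope

# Hypothetical DBP calibration over the stated 0.5-5.0 ug/mL linearity range
conc = np.array([0.5, 1.0, 2.5, 5.0])      # standard concentrations, ug/mL
areas = 1200.0 * conc + 30.0               # synthetic analyte peak areas
is_areas = np.full_like(conc, 5000.0)      # constant internal-standard areas
c = quantify(conc, areas, is_areas, sample_area=3030.0, sample_is_area=5000.0)
```

Ratioing to the internal standard corrects for injection-to-injection variability, which is why the calibration is built on area ratios rather than raw analyte areas.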

  15. Experimental and Analytical Studies of Solar System Chemistry

    Science.gov (United States)

    Burnett, Donald S.

    2003-01-01

    The cosmochemistry research funded by this grant resulted in the publications given in the attached Publication List. The research focused on three areas: (1) experimental studies of trace element partitioning; (2) studies of the minor element chemistry and O isotopic compositions of MgAl2O4 spinels from Ca-Al-rich inclusions in carbonaceous chondrite meteorites; and (3) the abundances and chemical fractionations of Th and U in chondritic meteorites.

  16. Student Satisfaction in Higher Education: A Meta-Analytic Study

    Science.gov (United States)

    Santini, Fernando de Oliveira; Ladeira, Wagner Junior; Sampaio, Claudio Hoffmann; da Silva Costa, Gustavo

    2017-01-01

    This paper discusses the results of a meta-analysis performed to identify key antecedent and consequent constructs of satisfaction in higher education. We offer an integrated model to achieve a better understanding of satisfaction in the context of higher education. To accomplish this objective, we identified 83 studies that were valid and…

  17. Analytical study on holographic superfluid in AdS soliton background

    International Nuclear Information System (INIS)

    Lai, Chuyu; Pan, Qiyuan; Jing, Jiliang; Wang, Yongjiu

    2016-01-01

    We analytically study the holographic superfluid phase transition in the AdS soliton background by using the variational method for the Sturm–Liouville eigenvalue problem. By investigating the holographic s-wave and p-wave superfluid models in the probe limit, we observe that the spatial component of the gauge field hinders the phase transition. Moreover, we note that, in contrast to the AdS black hole spacetime, in the AdS soliton background the holographic superfluid phase transition is always second order, and the critical exponent of the system takes the mean-field value in both the s-wave and p-wave models. Our analytical results are found to be in good agreement with the numerical findings.

  18. Parametric study of a turbocompound diesel engine based on an analytical model

    International Nuclear Information System (INIS)

    Zhao, Rongchao; Zhuge, Weilin; Zhang, Yangjun; Yin, Yong; Zhao, Yanting; Chen, Zhen

    2016-01-01

    Turbocompounding is an important technique for recovering waste heat from engine exhaust and reducing CO2 emissions. This paper presents a parametric study of a turbocompound diesel engine based on an analytical model. The analytical model was developed to investigate the influence of system parameters on engine fuel consumption. The model is based on thermodynamics and empirical sub-models, and can consider the impact of each parameter independently. The effects of turbine efficiency, back pressure, exhaust temperature, pressure ratio and engine speed on the recovered energy, pumping loss and engine fuel reduction were studied. Results show that turbine efficiency, exhaust temperature and back pressure have a great influence on the fuel reduction and the optimal power turbine (PT) expansion ratio, whereas engine operating speed has little impact on the fuel savings obtained by turbocompounding. The interaction mechanism between the PT recovery power and the engine pumping loss is presented in the paper. Due to the nonlinear characteristic of turbine power, there is an optimum PT expansion ratio that achieves the largest power gain. Finally, the fuel saving potential of a high-performance turbocompound engine and the requirements for it are presented. - Highlights: • An analytical model for a turbocompound engine is developed and validated. • A parametric study is performed to obtain the lowest BSFC and the optimal expansion ratio. • The influence of each parameter on the fuel saving potential is presented. • The impact mechanisms of each parameter on the energy tradeoff are disclosed. • It provides an effective tool to guide the preliminary design of turbocompounding.

  19. SU-C-BRD-03: Closing the Loop On Virtual IMRT QA

    International Nuclear Information System (INIS)

    Valdes, G; Scheuermann, R; Y, H C.; Olszanski, A; Bellerive, M; Solberg, T

    2015-01-01

    Purpose: To develop an algorithm that predicts IMRT QA passing rates a priori. Methods: 416 IMRT plans from all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding window technique on Clinac iX or TrueBeam linacs (Varian Medical Systems, Palo Alto, CA). The 3%/3mm and 2%/2mm local distance-to-agreement (DTA) passing rates were recorded during clinical operations using a commercial 2D diode array (MapCHECK 2, Sun Nuclear, Melbourne, FL). Each plan was characterized by 37 metrics that describe different failure modes between the calculated and measured dose. Machine-learning algorithms (MLAs) were trained to learn the relation between the plan characteristics and each passing rate. Minimization of the cross-validated error, together with maximum a posteriori (MAP) estimation, was used to choose the model parameters. Results: The 3%/3mm local DTA passing rate can be predicted with an error smaller than 3% for 98% of the plans. For the remaining 2% of plans, the residual error was within 5%. For 2%/2mm local DTA passing rates, 96% of the plans were successfully predicted with an error smaller than 5%. All high-risk plans that failed the 2%/2mm local criteria were correctly identified by the algorithm. The most important metric for describing the passing rates was determined to be the MU per Gray (modulation factor). Conclusions: Log files and independent dose calculations have been suggested as possible substitutes for measurement-based IMRT QA. However, none of these methods answers the fundamental question of whether a plan can be delivered with a clinically acceptable error given the limitations of the linacs and the treatment planning system. Predicting the IMRT QA passing rates a priori closes that loop. For additional robustness, virtual IMRT QA can be combined with linac QA and log file analysis to confirm appropriate delivery.
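The core idea above, regressing QA passing rates on plan-complexity metrics so high-risk plans can be flagged before measurement, can be sketched as follows. This uses closed-form ridge regression on synthetic data as a stand-in for the paper's machine-learning algorithms; the 416 × 37 shape matches the abstract, but all feature values and weights are invented, with column 0 playing the role of the dominant MU-per-Gray metric:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic plan metrics: column 0 mimics MU/Gy (modulation factor); the
# remaining columns are weaker plan-complexity metrics.
n_plans, n_metrics = 416, 37
X = rng.normal(size=(n_plans, n_metrics))
true_w = np.zeros(n_metrics)
true_w[0] = -3.0                       # more modulation -> lower passing rate
true_w[1:5] = rng.normal(0, 0.3, 4)
y = 95.0 + X @ true_w + rng.normal(0, 0.5, n_plans)  # 3%/3mm passing rate (%)

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression (illustrative stand-in for the MLAs)."""
    Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend intercept column
    A = Xb.T @ Xb + alpha * np.eye(Xb.shape[1])
    A[0, 0] -= alpha                            # do not penalise the intercept
    return np.linalg.solve(A, Xb.T @ y)

w = ridge_fit(X, y)
pred = np.hstack([np.ones((n_plans, 1)), X]) @ w
frac_within_3 = np.mean(np.abs(pred - y) <= 3.0)  # plans predicted within 3%
```

In a real deployment the regularisation strength would be chosen by cross-validation, as the abstract describes, rather than fixed.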

  20. SU-F-P-07: Applying Failure Modes and Effects Analysis to Treatment Planning System QA

    International Nuclear Information System (INIS)

    Mathew, D; Alaei, P

    2016-01-01

    Purpose: A small-scale implementation of Failure Modes and Effects Analysis (FMEA) for treatment planning system QA, utilizing the methodology of the AAPM TG-100 report. Methods: FMEA requires numerical values for the severity (S), occurrence (O) and detectability (D) of each mode of failure. The product of these three values gives a risk priority number (RPN). We have implemented FMEA for treatment planning system (TPS) QA at two clinics, which use the Pinnacle and Eclipse TPS. Quantitative monthly QA data dating back 4 years for Pinnacle and 1 year for Eclipse have been used to determine values for severity (deviations from predetermined doses at points or volumes) and the occurrence of such deviations. The TPS QA protocol includes a phantom containing solid water and lung- and bone-equivalent heterogeneities. Photon and electron plans have been evaluated in both systems. The dose values at multiple distinct points of interest (POI) within the solid water, lung- and bone-equivalent slabs, as well as mean doses to several volumes of interest (VOI), have been re-calculated monthly using the available algorithms. Results: The computed doses vary slightly month-over-month. There have been more significant deviations following software upgrades, especially if the upgrade involved re-modeling of the beams. TG-100 guidance and the data presented here suggest an occurrence (O) of 2, depending on the frequency of re-commissioning the beams, a severity (S) of 3, and a detectability (D) of 2, giving an RPN of 12. Conclusion: Computerized treatment planning systems could pose a risk due to dosimetric errors and suboptimal treatment plans. The FMEA analysis presented here suggests that TPS QA should immediately follow software upgrades, but does not need to be performed every month.
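The RPN arithmetic described above is simply S × O × D, and failure modes are then addressed in descending RPN order. A minimal sketch follows; only the TPS re-commissioning numbers (O=2, S=3, D=2, giving RPN=12) come from the abstract, while the other failure modes are invented for illustration:

```python
# FMEA bookkeeping: risk priority number = severity * occurrence * detectability.
failure_modes = [
    {"mode": "dose deviation after beam re-modelling", "S": 3, "O": 2, "D": 2},
    {"mode": "wrong heterogeneity density table",      "S": 7, "O": 1, "D": 4},
    {"mode": "stale beam data after upgrade",          "S": 6, "O": 2, "D": 3},
]

# Compute the RPN for each failure mode
for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Highest-RPN modes are addressed first in a TG-100-style analysis
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
```

Because RPN is multiplicative, a rare but severe and hard-to-detect failure can outrank a frequent but benign one, which is the point of scoring all three axes.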

  1. SU-F-P-07: Applying Failure Modes and Effects Analysis to Treatment Planning System QA

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, D; Alaei, P [University Minnesota, Minneapolis, MN (United States)

    2016-06-15

    Purpose: A small-scale implementation of Failure Modes and Effects Analysis (FMEA) for treatment planning system QA, utilizing the methodology of the AAPM TG-100 report. Methods: FMEA requires numerical values for the severity (S), occurrence (O) and detectability (D) of each mode of failure. The product of these three values gives a risk priority number (RPN). We have implemented FMEA for treatment planning system (TPS) QA at two clinics, which use the Pinnacle and Eclipse TPS. Quantitative monthly QA data dating back 4 years for Pinnacle and 1 year for Eclipse have been used to determine values for severity (deviations from predetermined doses at points or volumes) and the occurrence of such deviations. The TPS QA protocol includes a phantom containing solid water and lung- and bone-equivalent heterogeneities. Photon and electron plans have been evaluated in both systems. The dose values at multiple distinct points of interest (POI) within the solid water, lung- and bone-equivalent slabs, as well as mean doses to several volumes of interest (VOI), have been re-calculated monthly using the available algorithms. Results: The computed doses vary slightly month-over-month. There have been more significant deviations following software upgrades, especially if the upgrade involved re-modeling of the beams. TG-100 guidance and the data presented here suggest an occurrence (O) of 2, depending on the frequency of re-commissioning the beams, a severity (S) of 3, and a detectability (D) of 2, giving an RPN of 12. Conclusion: Computerized treatment planning systems could pose a risk due to dosimetric errors and suboptimal treatment plans. The FMEA analysis presented here suggests that TPS QA should immediately follow software upgrades, but does not need to be performed every month.

  2. SU-E-T-432: A Rapid and Comprehensive Procedure for Daily Proton QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, T; Sun, B; Grantham, K; Knutson, N; Santanam, L; Goddu, S; Klein, E [Washington University, St. Louis, MO (United States)

    2014-06-01

    Purpose: The objective is to develop a rapid and comprehensive daily QA procedure, implemented at the S. Lee Kling Proton Therapy Center at Barnes-Jewish Hospital. Methods: A scribed phantom with embedded fiducials is used for checking laser accuracy, followed by couch isocentricity and X-ray imaging congruence with the isocenter. A Daily QA3 device (Sun Nuclear, FL) was used to check output, range and profiles. Five chambers in the central region possess various build-ups. After converting the thickness of the inherent build-ups into water-equivalent thickness (WET) for protons, the range of any beam can be checked with additional build-up on the Daily QA3 device. In our procedure, 3 beams from 3 bands (large, small and deep) with a nominal range of 20 cm are checked daily. 17 cm of plastic water, with a WET of 16.92 cm, is used as additional build-up so that four chambers sit on the SOBP plateau at various depths and one sits on the distal fall-off. Readings from the five chambers are fitted to an error function that has been parameterized to match the SOBP with the same nominal range. The shift of the error function that maximizes the correlation between the measurements and the error function is taken as the range shift from the nominal value. Results: We have found couch isocentricity maintained over 180 degrees. The imaging system exhibits accuracy with regard to the imaging and mechanical isocenters. Ranges are within 1 mm of measurements in a water tank, and the method is sensitive to sub-millimeter changes. Data acquired since the start of operation show that outputs, profiles and ranges stay within 1% or 1 mm of baselines. The whole procedure takes about 40 minutes. Conclusion: Taking advantage of the design of the Daily QA3 device turns a device originally designed for photons and electrons into a comprehensive and rapid tool for daily proton QA.
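The range check described above, shifting a parameterized error-function model of the SOBP distal edge until its correlation with the five chamber readings is maximal, can be sketched as follows. The model shape, chamber depths, grid resolution and function names are assumptions for illustration, not the commissioned parameterization:

```python
import numpy as np
from math import erf

def sobp_model(depth, r80, sigma=0.5):
    """Idealised SOBP distal edge: ~1.0 on the plateau, rolling off near r80 (cm)."""
    return 0.5 * (1.0 - erf((depth - r80) / sigma))

def range_shift(depths, readings, nominal_r80=20.0, sigma=0.5):
    """Grid-search the shift of the nominal fall-off curve that maximises
    its correlation with the chamber readings (illustrative)."""
    shifts = np.linspace(-1.0, 1.0, 201)   # cm; 0.01 cm (0.1 mm) steps

    def corr(s):
        model = [sobp_model(d, nominal_r80 + s, sigma) for d in depths]
        return np.corrcoef(readings, model)[0, 1]

    return max(shifts, key=corr)

# Five chambers: four on the plateau/shoulder, one on the distal fall-off.
depths = [19.2, 19.6, 19.9, 20.1, 20.5]    # cm water-equivalent depth
true_r80 = 20.15                            # a hypothetical 1.5 mm overshoot
readings = [sobp_model(d, true_r80) for d in depths]
shift = range_shift(depths, readings)       # recovered range shift, cm
```

Using correlation rather than absolute dose makes the fit insensitive to daily output fluctuations, so range and output are checked independently.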

  3. SU-E-T-432: A Rapid and Comprehensive Procedure for Daily Proton QA

    International Nuclear Information System (INIS)

    Zhao, T; Sun, B; Grantham, K; Knutson, N; Santanam, L; Goddu, S; Klein, E

    2014-01-01

Purpose: The objective is to develop a rapid and comprehensive daily QA procedure implemented at the S. Lee Kling Proton Therapy Center at Barnes-Jewish Hospital. Methods: A scribed phantom with embedded fiducials is used to check laser accuracy, couch isocentricity, and X-ray imaging congruence with the isocenter. A Daily QA3 device (Sun Nuclear, FL) was used to check output, range, and profiles. Five chambers in the central region possess various build-ups. After converting the thickness of the inherent build-ups into water-equivalent thickness (WET) for protons, the range of any beam can be checked with additional build-up on the Daily QA3 device. In our procedure, three beams from three bands (large, small, and deep) with a nominal range of 20 cm are checked daily. Plastic water 17 cm thick, with a WET of 16.92 cm, is used as additional build-up so that four chambers sit on the SOBP plateau at various depths and one sits on the distal falloff. Readings from the five chambers are fitted to an error function that has been parameterized to match the SOBP with the same nominal range. The shift of the error function that maximizes the correlation between the measurements and the function is taken as the range shift from the nominal value. Results: Couch isocentricity was maintained over 180 degrees. The imaging system is accurate with respect to the imaging and mechanical isocenters. Ranges agree with water-tank measurements within 1 mm and are sensitive to sub-millimeter changes. Data acquired since the start of operation show that outputs, profiles, and ranges stay within 1% or 1 mm of baselines. The whole procedure takes about 40 minutes. Conclusion: Exploiting the design of the Daily QA3 device turns a device originally designed for photons and electrons into a comprehensive and rapid tool for daily proton QA

  4. SU-C-BRD-03: Closing the Loop On Virtual IMRT QA

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, G; Scheuermann, R; Y, H C.; Olszanski, A; Bellerive, M; Solberg, T [University of Pennsylvania, Philadelphia, PA (United States)

    2015-06-15

Purpose: To develop an algorithm that predicts a priori IMRT QA passing rates. Methods: 416 IMRT plans from all treatment sites were planned in Eclipse version 11 and delivered using a dynamic sliding-window technique on Clinac iX or TrueBeam linacs (Varian Medical Systems, Palo Alto, CA). The 3%/3mm and 2%/2mm local distance-to-agreement (DTA) passing rates were recorded during clinical operations using a commercial 2D diode array (MapCHECK 2, Sun Nuclear, Melbourne, FL). Each plan was characterized by 37 metrics that describe different failure modes between the calculated and measured dose. Machine-learning algorithms (MLAs) were trained to learn the relation between the plan characteristics and each passing rate. Minimization of the cross-validated error, together with maximum a posteriori (MAP) estimation, was used to choose the model parameters. Results: The 3%/3mm local DTA can be predicted with an error smaller than 3% for 98% of the plans. For the remaining 2% of plans, the residual error was within 5%. For 2%/2mm local DTA passing rates, 96% of the plans were successfully predicted with an error smaller than 5%. All high-risk plans that failed the 2%/2mm local criteria were correctly identified by the algorithm. The most important metric for describing the passing rates was determined to be the MU per Gray (modulation factor). Conclusions: Log files and independent dose calculations have been suggested as possible substitutes for measurement-based IMRT QA. However, neither method answers the fundamental question of whether a plan can be delivered with a clinically acceptable error given the limitations of the linacs and the treatment planning system. Predicting IMRT QA passing rates a priori closes that loop. For additional robustness, virtual IMRT QA can be combined with linac QA and log-file analysis to confirm appropriate delivery.
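As a toy illustration of predicting passing rates from plan metrics, the sketch below uses only the single modulation-factor metric highlighted in the abstract, a synthetic data set, and ordinary least squares rather than the authors' 37-metric machine-learning models:

```python
import random

random.seed(0)

# Synthetic training set: one plan metric (MU per Gy, the "modulation factor"
# identified above as most predictive) with noise; higher modulation gives a
# lower passing rate. All numbers are illustrative, not clinical data.
n = 200
mu_per_gy = [random.uniform(100, 600) for _ in range(n)]
passing = [100.0 - 0.02 * (m - 100) + random.gauss(0, 0.5) for m in mu_per_gy]

# Fit passing ~ a + b * mu_per_gy by ordinary least squares (closed form).
mean_x = sum(mu_per_gy) / n
mean_y = sum(passing) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(mu_per_gy, passing)) \
    / sum((x - mean_x) ** 2 for x in mu_per_gy)
a = mean_y - b * mean_x

# Predict a new, heavily modulated plan and flag it against a hypothetical
# 95% action threshold before any measurement is taken.
pred = a + b * 550.0
print(f"predicted passing rate: {pred:.1f}%",
      "-> review" if pred < 95 else "-> ok")
```

The point of "virtual QA" is exactly this ordering: the prediction, and the flag, are available before the plan is ever delivered to a measurement device.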

  5. Analytical studies on the gum exudate from Anogeissus leiocarpus

    International Nuclear Information System (INIS)

    Ahmed, Samia Eltayeb

    1999-04-01

Anogeissus leiocarpus gum samples were collected as natural exudate nodules from three different locations. Physicochemical properties of the gum samples were studied. Results showed significant differences within each location in most parameters studied, except the refractive index, which was constant in all samples. The effect of location on the properties of the gum samples was also studied, and the analysis of variance showed insignificant differences (P≤0.05) in all properties studied except ash content. Inter-nodule variations of gum from two different locations were studied individually. Results showed significant differences for each parameter studied except the refractive index. The average properties of the gum samples were as follows: 9.2% moisture, 3.4% ash, 0.72% nitrogen, 4.74% protein, -35.5 specific rotation, 1.68 relative viscosity, pH 4.2, 1.334 refractive index, 14.3 uronic acid, 0.44% reducing sugar, 1336.0 equivalent weight and 0.68% tannin content. UV absorption spectra of gum samples and gum nodules were determined. The cationic composition of the gum samples was also determined; magnesium (Mg) showed the highest value in all samples studied, followed by Fe, Na, K, Ca and Zn, with trace amounts of Mn, Co, Ni, Cd and Pb. The water-holding capacity was found to be 65.5% and the emulsifying stability 1.008. The component sugars of the gum were examined by different methods, followed by qualitative and quantitative analysis. HPLC analysis of the crude gum hydrolysate showed L-rhamnose (6.82), L-arabinose (48.08), D-galactose (11.26) and two unknown oligosaccharides (0.22 and 32.61). Some physicochemical properties of the gum fractions were also studied. Results showed significant differences in nitrogen and protein contents, specific rotation, relative viscosity, equivalent weight and pH of the fractions, whereas insignificant differences were observed in uronic acid content and refractive index

  6. Environmental influences on fruit and vegetable intake: Results from a path analytic model

    Science.gov (United States)

    Liese, Angela D.; Bell, Bethany A.; Barnes, Timothy L.; Colabianchi, Natalie; Hibbert, James D.; Blake, Christine E.; Freedman, Darcy A.

    2014-01-01

Objective: Fruit and vegetable (F&V) intake is influenced by behavioral and environmental factors, but these have rarely been assessed simultaneously. We aimed to quantify the relative influence of supermarket availability, perceptions of the food environment, and shopping behavior on F&V intake. Design: A cross-sectional study. Setting: Eight counties in South Carolina, USA, with verified locations of all supermarkets. Subjects: A telephone survey of 831 household food shoppers ascertained F&V intake with a 17-item screener, primary food store location, shopping frequency, and perceptions of healthy food availability; GIS-based supermarket availability was calculated. Path analysis was conducted. We report standardized beta coefficients for paths significant at the 0.05 level. Results: Frequency of grocery shopping at the primary food store (β=0.11) was the only factor exerting an independent, statistically significant direct effect on F&V intake. Supermarket availability was significantly associated with distance to the food store (β=-0.24) and shopping frequency (β=0.10). Increased supermarket availability was significantly and positively related to perceived healthy food availability in the neighborhood (β=0.18) and ease of shopping access (β=0.09). Considering all model paths linked to perceived availability of healthy foods collectively, this measure was the only other factor to have a significant total effect on F&V intake. Conclusions: While the majority of the literature to date has suggested an independent and important role of supermarket availability in F&V intake, our study found only indirect effects of supermarket availability, and suggests that food shopping frequency and perceptions of healthy food availability are two integral components of a network of influences on F&V intake. PMID:24192274
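In path analysis, the indirect effect along a chain of paths is the product of the standardized coefficients on that chain. Using the two coefficients reported above:

```python
# Standardized path coefficients reported in the abstract.
b_avail_to_freq = 0.10   # supermarket availability -> shopping frequency
b_freq_to_fv = 0.11      # shopping frequency -> F&V intake

# Indirect effect of availability on intake via shopping frequency:
# the product of the coefficients along the path.
indirect = b_avail_to_freq * b_freq_to_fv
print(f"indirect effect via shopping frequency: {indirect:.3f}")
```

This small product (0.011) is what the authors mean by supermarket availability acting only indirectly: the total effect is the sum of such path products, none of which includes a direct availability-to-intake arrow.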

  7. Refinement of MLC modeling improves commercial QA dosimetry system for SRS and SBRT patient-specific QA.

    Science.gov (United States)

    Hillman, Yair; Kim, Josh; Chetty, Indrin; Wen, Ning

    2018-04-01

    dosimetry, MLC modeling, and inhomogeneity corrections in the beam model for SRS/SBRT QA. The improvements noted in this study, and further collaborations between clinical physicists and the vendor to refine the M3D beam model could enable M3D to become a premier SRS/SBRT QA tool. © 2018 American Association of Physicists in Medicine.

  8. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    International Nuclear Information System (INIS)

    Xie, J; Wang, J; Peng, J; Chen, J; Hu, W

    2016-01-01

Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire-workflow management in a developing country. Methods: The entire-workflow QA process starts from patient registration and extends to the end of the last treatment, covering all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6000 patients, before and after implementing the entire-workflow QA process, were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), documentation of treatment QA, and QA of the treatment history. The error rate derived from chart checks decreased from 1.7% to 0.9% after introduction of the entire-workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent recurrence. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider our QA experience applicable to heavily loaded radiotherapy departments in developing countries.
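Whether a drop from 1.7% to 0.9% is statistically significant can be checked with a standard two-proportion z-test. The sketch below assumes the roughly 6000 charts were split evenly between the before and after periods, which the abstract does not state:

```python
import math

# Assumed chart counts: ~6000 patients in total, split evenly between the
# before and after periods for illustration only.
n1, n2 = 3000, 3000
p1, p2 = 0.017, 0.009          # chart-check error rates before / after
x1, x2 = p1 * n1, p2 * n2      # implied error counts

# Two-proportion z-test under the pooled null hypothesis p1 == p2.
p_pool = (x1 + x2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
print(f"z = {z:.2f}")  # |z| > 1.96 indicates significance at the 0.05 level
```

Under these assumed counts the drop is significant (z ≈ 2.7), though the conclusion depends on how the 6000 charts were actually divided.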

  9. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Xie, J; Wang, J; Peng, J; Chen, J; Hu, W [Fudan University Shanghai Cancer Center, Shanghai, Shanghai (China)

    2016-06-15

Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire-workflow management in a developing country. Methods: The entire-workflow QA process starts from patient registration and extends to the end of the last treatment, covering all steps of the radiotherapy process. The chart-check error rate is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6000 patients, before and after implementing the entire-workflow QA process, were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), documentation of treatment QA, and QA of the treatment history. The error rate derived from chart checks decreased from 1.7% to 0.9% after introduction of the entire-workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent recurrence. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider our QA experience applicable to heavily loaded radiotherapy departments in developing countries.

  10. Phonon dispersion on Ag (100) surface: A modified analytic embedded atom method study

    International Nuclear Information System (INIS)

    Zhang Xiao-Jun; Chen Chang-Le

    2016-01-01

Within the harmonic approximation, an analytic expression for the dynamical matrix is derived based on the modified analytic embedded atom method (MAEAM) and the dynamics theory of the surface lattice. The surface phonon dispersions along three major symmetry directions, among them X-bar-M-bar, are calculated for the clean Ag (100) surface using the derived formulas. We then discuss the polarization and localization of surface modes at the points X-bar and M-bar by plotting the squared polarization vectors as functions of the layer index. The phonon frequencies of the surface modes calculated by MAEAM are compared with the available experimental and other theoretical data. The present results are generally in agreement with the referenced experimental and theoretical results, with a maximum deviation of 10.4%. This agreement shows that the modified analytic embedded atom method is a reasonable many-body potential model for quickly describing surface lattice vibrations, and it lays a foundation for studying surface lattice vibrations in other metals. (paper)

  11. Studies on analytical method and nondestructive measuring method on the sensitization of austenitic stainless steels

    International Nuclear Information System (INIS)

    Onimura, Kichiro; Arioka, Koji; Horai, Manabu; Noguchi, Shigeru.

    1982-03-01

Austenitic stainless steels are widely used as structural materials for the machines and equipment of various kinds of plants, such as thermal power, nuclear power, and chemical plants. Machines and equipment using this kind of material, however, may suffer corrosion damage while in service, and in some cases this damage is considered to be largely due to sensitization of the material. It is therefore necessary to develop an analytical method for characterizing the sensitization of the material in more detail, and a quantitative nondestructive measuring method applicable to various kinds of structures, in order to prevent corrosion damage. From this viewpoint, studies have been made on an analytical method based on the theory of chromium diffusion in austenitic stainless steels, and on the Electrochemical Potentiokinetic Reactivation method (EPR method) as a nondestructive measuring method, using 304 and 316 austenitic stainless steels with different carbon contents in the base metals. This paper introduces the results of EPR tests on the sensitization of austenitic stainless steels and the correlation between analytical and experimental results. (author)

  12. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    Energy Technology Data Exchange (ETDEWEB)

    Sathiaseelan, V [Northwestern Memorial Hospital, Chicago, IL (United States); Thomadsen, B [University of Wisconsin, Madison, WI (United States)

    2014-06-15

Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased, and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near-miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology for quality control quantification to measure the effectiveness of safety barriers. Over 4000 near-miss incidents reported from two academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify QA metrics that help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery process. In this symposium the usefulness of workflows and QA metrics to assure safe and high-quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High-Risk Radiation Oncology Procedures; Strategies and Metrics for Quality Management in the TG-100 Era. Learning Objectives: Provide an overview of, and the need for, QA usability

  13. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    International Nuclear Information System (INIS)

    Sathiaseelan, V; Thomadsen, B

    2014-01-01

Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased, and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near-miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology for quality control quantification to measure the effectiveness of safety barriers. Over 4000 near-miss incidents reported from two academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify QA metrics that help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery process. In this symposium the usefulness of workflows and QA metrics to assure safe and high-quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High-Risk Radiation Oncology Procedures; Strategies and Metrics for Quality Management in the TG-100 Era. Learning Objectives: Provide an overview of, and the need for, QA usability

  14. Piecewise linear emulator of the nonlinear Schroedinger equation and the resulting analytic solutions for Bose-Einstein condensates

    International Nuclear Information System (INIS)

    Theodorakis, Stavros

    2003-01-01

We emulate the cubic term Ψ³ in the nonlinear Schroedinger equation by a piecewise linear term, thus reducing the problem to a set of uncoupled linear inhomogeneous differential equations. The resulting analytic expressions constitute an excellent approximation to the exact solutions, as is explicitly shown in the case of the kink, the vortex, and a δ-function trap. Such a piecewise linear emulation can be used for any differential equation whose only nonlinearity is a Ψ³ term. In particular, it can be used for the nonlinear Schroedinger equation in the presence of harmonic traps, giving analytic Bose-Einstein condensate solutions that reproduce very accurately the numerically calculated ones in one, two, and three dimensions
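The reduction can be sketched in one dimension. The stationary form below, with chemical potential μ, is an illustrative assumption rather than the paper's exact equation:

```latex
\[
\psi''(x) = \psi(x)^3 - \mu\,\psi(x)
\quad\longrightarrow\quad
\psi''(x) = (\alpha_i - \mu)\,\psi(x) + \beta_i
\quad \text{on each interval } I_i,
\]
\[
\text{where } \psi^3 \approx \alpha_i\,\psi + \beta_i \text{ is the piecewise
linear emulation on } I_i .
\]
```

Each interval then admits elementary exponential or trigonometric solutions, depending on the sign of α_i − μ, matched continuously together with ψ′ at the interval boundaries; this is what makes the approximate solutions fully analytic.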

  15. Fields, particles and analyticity: recent results or 30 goldberg (ER) variations on B.A.C.H

    International Nuclear Information System (INIS)

    Bros, J.

    1991-01-01

As is known, Axiomatic Field Theory (A) implies double analyticity of the n-point functions in space-time and energy-momentum Complex Variables (C), with various interconnections by Fourier-Laplace analysis. When the latter is replaced by Harmonic Analysis (H) on spheres and hyperboloids, a new kind of double analyticity results from (A) (i.e. from locality, spectral condition, temperateness and invariance): complex angular momentum is thereby introduced (a missing chapter in (A)). Exploitation of Asymptotic Completeness via Bethe-Salpeter-type equations (B) leads to new developments of the previous theme on (A, C, H) (complex angular momentum) and of other themes on (A, C) (crossing, the Haag-Swieca property, etc.). Various aspects of (A) + (B) have been implemented in Constructive Field Theory (composite spectrum, asymptotic properties, etc.) by a combination of specific techniques and model-independent methods

  16. Fields, particles and analyticity: recent results or 30 goldberg (ER) variations on B.A.C.H

    Energy Technology Data Exchange (ETDEWEB)

    Bros, J

    1992-12-31

As is known, Axiomatic Field Theory (A) implies double analyticity of the n-point functions in space-time and energy-momentum Complex Variables (C), with various interconnections by Fourier-Laplace analysis. When the latter is replaced by Harmonic Analysis (H) on spheres and hyperboloids, a new kind of double analyticity results from (A) (i.e. from locality, spectral condition, temperateness and invariance): complex angular momentum is thereby introduced (a missing chapter in (A)). Exploitation of Asymptotic Completeness via Bethe-Salpeter-type equations (B) leads to new developments of the previous theme on (A, C, H) (complex angular momentum) and of other themes on (A, C) (crossing, the Haag-Swieca property, etc.). Various aspects of (A) + (B) have been implemented in Constructive Field Theory (composite spectrum, asymptotic properties, etc.) by a combination of specific techniques and model-independent methods.

  17. A modified analytical model to study the sensing performance of a flexible capacitive tactile sensor array

    International Nuclear Information System (INIS)

    Liang, Guanhao; Wang, Yancheng; Mei, Deqing; Xi, Kailun; Chen, Zichen

    2015-01-01

This paper presents a modified analytical model to study the sensing performance of a flexible capacitive tactile sensor array that utilizes a solid polydimethylsiloxane (PDMS) film as the dielectric layer. To predict the deformation of a sensing unit and the resulting capacitance changes, each sensing unit is simplified into a three-layer plate structure and divided into central, edge and corner regions. The plate structure and the three regions are studied by the general and modified models, respectively. For experimental validation, a capacitive tactile sensor array with 8 × 8 (= 64) sensing units was fabricated. Experiments were conducted by measuring the capacitance changes versus applied external forces, and the measurements were compared with the general and modified models' predictions. For the developed tactile sensor array, the sensitivity predicted by the modified analytical model is 1.25%/N, a discrepancy of only 0.8% from the experimental measurement. Results demonstrate that the modified analytical model can accurately predict the sensing performance of the sensor array and could be utilized for model-based optimal design of capacitive tactile sensor arrays. (paper)
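The physical principle behind each sensing unit can be sketched with the ideal parallel-plate relation. The geometry, PDMS permittivity, and assumed compression below are illustrative values, and a real unit needs the plate-structure corrections the paper derives:

```python
EPS0 = 8.854e-12   # vacuum permittivity, F/m
EPS_PDMS = 2.7     # typical relative permittivity of PDMS (assumed)

def capacitance(area_m2, gap_m):
    """Ideal parallel-plate capacitance, ignoring fringing fields."""
    return EPS0 * EPS_PDMS * area_m2 / gap_m

# Hypothetical sensing unit: 2 mm x 2 mm electrodes, 50 um PDMS dielectric.
area = 2e-3 * 2e-3
gap0 = 50e-6
c0 = capacitance(area, gap0)

# A normal force compresses the PDMS; a 2% thickness reduction is assumed.
c_loaded = capacitance(area, gap0 * 0.98)
delta = (c_loaded - c0) / c0 * 100
print(f"C0 = {c0 * 1e12:.2f} pF, dC/C = {delta:.2f}%")
```

Because C scales as 1/gap, a small compression yields a nearly proportional relative capacitance change, which is why the array's response can be characterized by a single %/N sensitivity over its working range.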

  18. Skylab experiment results: Hematology studies

    Science.gov (United States)

    Kimzey, S. L.; Ritzmann, S. E.; Mengel, C. E.; Fischer, C. L.

    1975-01-01

Studies were conducted to evaluate specific aspects of man's immunologic and hematologic systems that might be altered by or respond to the space flight environment. Biochemical functions investigated included cytogenetic damage to blood cells, immune resistance to disease, regulation of plasma and red cell volumes, metabolic processes of the red blood cell, and physicochemical aspects of red blood cell function. Measurements of hematocrit value showed significant fluctuations postflight, reflecting observed changes in red cell mass and plasma volume. The capacity of lymphocytes to respond to an in vitro mitogenic challenge was repressed postflight, and appeared to be related to mission duration. Most other deviations from normal Earth-based function in these systems were minor or transient.

  19. Some results of radioisotope studies

    Energy Technology Data Exchange (ETDEWEB)

    Isamov, N.N.

    1974-10-01

The accumulation of radioisotopes by brucellae depends on the consistency of the feed medium on which they are grown. The uptake of P-32 is a factor of 5 to 16 greater, and that of S-35 in the form of sodium sulfate a factor of 30 to 100 greater, when grown on a complex solid agar than in a bouillon solution of the same ingredients. Brucellae are readily tagged with P-32 and S-35 simultaneously. These tagged brucellae were used to study in vitro storage under various temperature regimes. Brucellae actively incorporate iron. The uptake of methionine and cystine tagged with S-35 by brucellae was investigated. Methionine is for the most part absorbed directly by brucellae, while the S-35 in sodium sulfate is primarily transformed to cystine and cysteine. The uptake of various radioisotopes can be used to type various strains of brucellae. Isotopes are also used to trace the course of various diseases in animals. (SJR)

  20. Analytical study of electron flows with a virtual cathode

    International Nuclear Information System (INIS)

    Dubinov, A.E.

    2000-01-01

The dynamics of an electron flow injected into a half-space is considered. Two problems are treated: long-term injection of a monoenergetic electron flow, and instantaneous injection of a flow with an assigned electron energy spectrum. In both cases all flow electrons return to the injection plane. A simple analytical self-consistent model of the initial stage of virtual cathode formation in a plane-parallel equipotential gap is constructed, and from its analysis the duration of the virtual cathode formation process is determined. The validity of this model is not limited by the multivaluedness of the electron velocity in the flow, which makes it possible to extend the model beyond the moment of virtual cathode formation and to consider the virtual cathode dynamics. The frequency of electron oscillations in the cathode-virtual cathode potential well is determined on the basis of this model

  1. Nuclear analytical methods for trace element studies in calcified tissues

    International Nuclear Information System (INIS)

    Chaudhry, M.A.; Chaudhry, M.N.

    2001-01-01

Full text: Various nuclear analytical methods have been developed and applied to determine the elemental composition of calcified tissues (teeth and bones). Fluorine was determined by prompt gamma activation analysis through the 19F(p,αγ)16O reaction. Carbon was measured by activation analysis with He-3 ions, and the technique of proton-induced X-ray emission (PIXE) was applied to simultaneously determine Ca, P, and trace elements in well-documented teeth. Dental hard tissues, enamel, dentine, cement, and their junctions, as well as different parts of the same tissue, were examined separately. Furthermore, using a proton microprobe, we measured the surface distribution of F and other elements on and around carious lesions in the enamel. Depth profiles of F and other elements were also measured, right up to the amelodentinal junction

  2. Feasibility study of a lead(II) iodide-based dosimeter for quality assurance in therapeutic radiology

    Science.gov (United States)

    Heo, Y. J.; Kim, K. T.; Oh, K. M.; Lee, Y. K.; Ahn, K. J.; Cho, H. L.; Kim, J. Y.; Min, B. I.; Mun, C. W.; Park, S. K.

    2017-09-01

    The most widely used form of radiotherapy to treat tumors uses a linear accelerator, and the apparatus requires regular quality assurance (QA). QA for a linear accelerator demands accuracy throughout, from mock treatment and treatment planning, up to treatment itself. Therefore, verifying a radiation dose is essential to ensure that the radiation is being applied as planned. In current clinical practice, ionization chambers and diodes are used for QA. However, using conventional gaseous ionization chambers presents drawbacks such as complex analytical procedures, difficult measurement procedures, and slow response time. In this study, we discuss the potential of a lead(II) iodide (PbI2)-based radiation dosimeter for radiotherapy QA. PbI2 is a semiconductor material suited to measurements of X-rays and gamma rays, because of its excellent response properties to radiation signals. Our results show that the PbI2-based dosimeter offers outstanding linearity and reproducibility, as well as dose-independent characteristics. In addition, percentage depth dose (PDD) measurements indicate that the error at a fixed reference depth Dmax was 0.3%, very similar to the measurement results obtained using ionization chambers. Based on these results, we confirm that the PbI2-based dosimeter has all the properties required for radiotherapy: stable dose detection, dose linearity, and rapid response time. Based on the evidence of this experimental verification, we believe that the PbI2-based dosimeter could be used commercially in various fields for precise measurements of radiation doses in the human body and for measuring the dose required for stereotactic radiosurgery or localized radiosurgery.
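The linearity analysis described here can be sketched with a least-squares fit and an R² figure of merit. The dose points and readings below are synthetic, not measured PbI2 data:

```python
# Synthetic dosimeter readings (arbitrary signal units) at known delivered
# doses; values are illustrative only.
doses = [1.0, 2.0, 4.0, 6.0, 8.0, 10.0]           # Gy
readings = [0.98, 2.03, 3.96, 6.05, 7.99, 10.02]  # detector signal

# Ordinary least-squares fit: readings ~ intercept + slope * dose.
n = len(doses)
mean_x = sum(doses) / n
mean_y = sum(readings) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(doses, readings))
sxx = sum((x - mean_x) ** 2 for x in doses)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Coefficient of determination R^2 as the linearity figure of merit.
ss_res = sum((y - (slope * x + intercept)) ** 2
             for x, y in zip(doses, readings))
ss_tot = sum((y - mean_y) ** 2 for y in readings)
r2 = 1 - ss_res / ss_tot
print(f"slope = {slope:.3f}, R^2 = {r2:.5f}")
```

An R² very close to 1 over the clinical dose range is the quantitative form of the "outstanding linearity" claimed for the detector; reproducibility would be checked separately from repeated deliveries at a fixed dose.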

  3. Feasibility study of a lead(II) iodide-based dosimeter for quality assurance in therapeutic radiology

    International Nuclear Information System (INIS)

    Heo, Y.J.; Kim, K.T.; Mun, C.W.; Oh, K.M.; Lee, Y.K.; Ahn, K.J.; Cho, H.L.; Park, S.K.; Kim, J.Y.; Min, B.I.

    2017-01-01

The most widely used form of radiotherapy to treat tumors uses a linear accelerator, and the apparatus requires regular quality assurance (QA). QA for a linear accelerator demands accuracy throughout, from mock treatment and treatment planning, up to treatment itself. Therefore, verifying a radiation dose is essential to ensure that the radiation is being applied as planned. In current clinical practice, ionization chambers and diodes are used for QA. However, using conventional gaseous ionization chambers presents drawbacks such as complex analytical procedures, difficult measurement procedures, and slow response time. In this study, we discuss the potential of a lead(II) iodide (PbI2)-based radiation dosimeter for radiotherapy QA. PbI2 is a semiconductor material suited to measurements of X-rays and gamma rays, because of its excellent response properties to radiation signals. Our results show that the PbI2-based dosimeter offers outstanding linearity and reproducibility, as well as dose-independent characteristics. In addition, percentage depth dose (PDD) measurements indicate that the error at a fixed reference depth Dmax was 0.3%, very similar to the measurement results obtained using ionization chambers. Based on these results, we confirm that the PbI2-based dosimeter has all the properties required for radiotherapy: stable dose detection, dose linearity, and rapid response time. Based on the evidence of this experimental verification, we believe that the PbI2-based dosimeter could be used commercially in various fields for precise measurements of radiation doses in the human body and for measuring the dose required for stereotactic radiosurgery or localized radiosurgery.

  4. Publication bias in studies of an applied behavior-analytic intervention: an initial analysis.

    Science.gov (United States)

    Sham, Elyssa; Smith, Tristram

    2014-01-01

Publication bias arises when studies with favorable results are more likely to be reported than are studies with null findings. If this bias occurs in studies with single-subject experimental designs (SSEDs) on applied behavior-analytic (ABA) interventions, it could lead to exaggerated estimates of intervention effects. Therefore, we conducted an initial test of bias by comparing effect sizes, measured by percentage of nonoverlapping data (PND), in published SSED studies (n = 21) and unpublished dissertations (n = 10) on 1 well-established intervention for children with autism, pivotal response treatment (PRT). Although published and unpublished studies had similar methodologies, the mean PND in published studies was 22% higher than in unpublished studies, 95% confidence interval (4%, 38%). Even when unpublished studies were included, PRT appeared to be effective (mean PND = 62%). Nevertheless, the disparity between published and unpublished studies suggests a need for further assessment of publication bias in the ABA literature.
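The effect-size metric used in this comparison, percentage of nonoverlapping data (PND), is straightforward to compute. A minimal sketch, using invented session scores for a target behavior expected to increase under treatment:

```python
# Percentage of nonoverlapping data (PND): the share of treatment-phase
# points that exceed the highest baseline point (for a behavior expected
# to increase). Data below are invented for illustration only.

def pnd(baseline, treatment):
    """Return PND as a percentage of treatment points above max(baseline)."""
    ceiling = max(baseline)
    above = sum(1 for x in treatment if x > ceiling)
    return 100.0 * above / len(treatment)

baseline = [2, 4, 3, 5]          # hypothetical baseline session scores
treatment = [6, 7, 5, 8, 9]      # 4 of 5 points exceed the baseline max of 5
print(pnd(baseline, treatment))  # 80.0
```

For behaviors expected to decrease, the comparison is inverted (points below the baseline minimum); published SSED syntheses use the variant matching the intervention goal.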

  5. Comment on 'Analytical results for a Bessel function times Legendre polynomials class integrals'

    International Nuclear Information System (INIS)

    Cregg, P J; Svedlindh, P

    2007-01-01

A result is obtained, stemming from Gegenbauer, where the products of certain Bessel functions and exponentials are expressed in terms of an infinite series of spherical Bessel functions and products of associated Legendre functions. Closed form solutions for integrals involving Bessel functions times associated Legendre functions times exponentials, recently elucidated by Neves et al (J. Phys. A: Math. Gen. 39 L293), are then shown to result directly from the orthogonality properties of the associated Legendre functions. This result offers greater flexibility in the treatment of classical Heisenberg chains and may prove useful in other problems, such as those occurring in electromagnetic diffraction theory. (comment)

  6. Analytic study of SU(3) lattice gauge theory

    International Nuclear Information System (INIS)

    Zheng Xite; Xu Yong

    1989-01-01

The variational-cumulant expansion method has been extended to the case of the lattice SU(3) Wilson model. The plaquette energy, as an order parameter, has been calculated to the 2nd order of the expansion. No 1st order phase transition is found in the d = 4 case, which is in agreement with the Monte Carlo results, and the 1st order phase transition in the d = 5 case is clearly seen. The method can be used in the study of problems in LGT with the SU(3) gauge group

  7. The structure and evolution of galacto-detonation waves - Some analytic results in sequential star formation models of spiral galaxies

    Science.gov (United States)

    Cowie, L. L.; Rybicki, G. B.

    1982-01-01

Waves of star formation in a uniform, differentially rotating disk galaxy are treated analytically as a propagating detonation wave front. It is shown that if single solitary waves could be excited, they would evolve asymptotically to one of two stable spiral forms, each of which rotates with a fixed pattern speed. Simple numerical solutions confirm these results. However, the pattern of waves that develops naturally from an initially localized disturbance is more complex and dies out within a few rotation periods. These results suggest a conclusive observational test for deciding whether sequential star formation is an important determinant of spiral structure in some class of galaxies.

  8. Tank 241-BY-112, cores 174 and 177 analytical results for the final report

    International Nuclear Information System (INIS)

    Nuzum, J.L.

    1997-01-01

Results from bulk density tests ranged from 1.03 g/mL to 1.86 g/mL. The highest bulk density result of 1.86 g/mL was used to calculate the solid total alpha activity notification limit for this tank (33.1 µCi/g) for the Total Alpha (AT) analysis. Attachment 2 contains the Data Verification and Deliverable (DVD) Summary Report for AT analyses. This report summarizes results from AT analyses and provides data qualifiers and total propagated uncertainty (TPU) values for the results. The TPU values are based on the uncertainties inherent in each step of the analysis process. They may be used as an additional reference to determine reasonable RPD values, which may be used to accept valid data that do not meet the TSAP acceptance criteria. A report guide is provided with the report to assist in understanding this summary report.
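The mass-based notification limit follows from the measured density by a simple unit conversion. The 61.5 µCi/mL volumetric criterion used below is an assumption inferred from the arithmetic (33.1 × 1.86 ≈ 61.5); it is not stated in this record:

```python
# Converting a volumetric total-alpha limit to a mass-based notification
# limit using the highest measured bulk density. The 61.5 uCi/mL volumetric
# criterion is an assumption inferred from the arithmetic, not stated in
# this record; the density and the 33.1 uCi/g result are from the report.

volumetric_limit_uCi_per_mL = 61.5   # assumed volumetric safety criterion
max_bulk_density_g_per_mL = 1.86     # highest measured bulk density

mass_limit_uCi_per_g = volumetric_limit_uCi_per_mL / max_bulk_density_g_per_mL
print(round(mass_limit_uCi_per_g, 1))  # 33.1
```

Using the highest measured density is conservative: it yields the lowest (most restrictive) mass-based limit.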

  9. Analytical, Experimental, and Modelling Studies of Lunar and Terrestrial Rocks

    Science.gov (United States)

    Haskin, Larry A.

    1997-01-01

    , mafic, trace-element-rich geochemical province, and a regolith that globally contains trace-element-rich material distributed from this province by the Imbrium basin-forming impact. This contrasts with earlier models of a concentrically zoned Moon with a crust of ferroan anorthosite overlying a layer of urKREEP overlying ultramafic cumulates. From this work, we have learned lessons useful for developing strategies for studying regolith materials that help to maximize the information available about both the evolution of the regolith and the igneous differentiation of the planet. We believe these lessons are useful in developing strategies for on-surface geological, mineralogical, and geochemical studies, as well. The main results of our work are given in the following brief summaries of major tasks. Detailed accounts of these results have been submitted in the annual progress reports.

  10. Analytical study of sodium combustion phenomena under sodium leak accidents

    International Nuclear Information System (INIS)

    Kim, Byung Ho; Jeong, J. Y.; Jeong, K. C.; Kim, T. J.; Choi, J. H.

    2001-12-01

The rise in temperature and pressure and the release of aerosols in the buildings as a result of a sodium fire must be considered in the safety measures for an LMR. Therefore, for the safety of the LMR, it is necessary to understand the characteristics of sodium fires resulting from various types of leakage. ASSCOPS (Analysis of Simultaneous Sodium Combustion in Pool and Spray) is a computer code for the analysis of the thermal consequences of sodium leaks and fires in an LMR, developed by the Japan Nuclear Cycle Development Institute (JNC) in Japan. In this study, a preliminary analysis of sodium leak and fire accidents in the S/G building of KALIMER was made using the ASSCOPS code. The phenomena of interest are spray and pool burning, peak pressure, temperature change, local structure temperature, aerosol behavior, the drain system into the smothering tank, and the ventilation characteristics of each cell with the safety venting system and nitrogen injection system. In this calculation, the dimensions of the S/G building were chosen in accordance with the selected options of the LMR named KALIMER (Korea). This study showed that the subsequent effects of a sodium fire depend on whether the sodium continues to leak from the pipe, whether the ventilation system is running, whether an inert gas injection system is provided, whether the sodium on the floor is drained into the smothering tank, whether the building is sealed, etc. In particular, an excessive rise of pressure in each cell was prevented by installing pressure release plates on the walls of the building

  11. Comparison of analytical models and experimental results for single-event upset in CMOS SRAMs

    International Nuclear Information System (INIS)

    Mnich, T.M.; Diehl, S.E.; Shafer, B.D.

    1983-01-01

In an effort to design fully radiation-hardened memories for satellite and deep-space applications, a 16K and a 2K CMOS static RAM were modeled for single-particle upset during the design stage. The modeling resulted in the addition of a hardening feedback resistor in the 16K, while the 2K remained tentatively unaltered. Subsequent experiments, using the Lawrence Berkeley Laboratory's 88-inch cyclotron to accelerate krypton and oxygen ions, established an upset threshold for the 2K and for the 16K without added resistance, as well as a hardening threshold for the 16K with feedback resistance added. Results for the 16K showed it to be hardenable to a level higher than previously published data for other unhardened 16K RAMs. The data agreed fairly well with the modeling results; however, a close look suggests that modification of the simulation methodology is required to accurately predict the resistance necessary to harden the RAM cell

  12. Role of QA in total quality management environment

    International Nuclear Information System (INIS)

    McCarthy, J.B.; Ayres, R.A.

    1992-01-01

    A successful company in today's highly competitive business environment must emphasize quality in all activities at all times. For most companies, this requires a major cultural change to establish appropriate operating attitudes and priorities. A total quality environment is required where quality becomes a way of life, and this process must be carefully managed. It will not be accomplished in a few short months with a simple management pronouncement. Instead, it evolves over a period of years through continuous incremental improvement. This evolution towards total quality requires a dramatic change in the quality assurance (QA) function of most companies. Traditionally, quality was automatically equated to QA and its attendant procedures and personnel. Now, quality is becoming a global concept, and QA can play a significant role in the process. The QA profession must, however, recognize and accept a new role as consultant, coach, and partner in today's total quality game. The days of the hard-line enforcer of procedural requirements are gone

  13. Current Status of QA For Nuclear Power Plants in Japan

    International Nuclear Information System (INIS)

    Nagoshi, Hitohiko

    1986-01-01

This paper describes the current status of QA and our QA experience with nuclear power plants against the background of the Japanese social and business environment. Accordingly, in 1972, 'The Guidance for Quality Assurance in Construction of Nuclear Power Plants', based on the U.S. 10CFR50 Appendix B, was published by the Japan Electric Association. 'JEAG-4101, The Guide for Quality Assurance of Nuclear Power Plants' has been prepared by referring to the IAEA QA code. The Guide has been accepted by the Japanese nuclear industry and applied to the QA programs of every organization concerned therewith. The Japanese approach to higher quality will naturally be different from that of other countries because of Japan's cultural, social, and economic conditions. Even higher quality is being aimed at through the LWR Improvement and Standardization Program and coordinated quality assurance efforts

  14. Analytical results for post-buckling behaviour of plates in compression and in shear

    Science.gov (United States)

    Stein, M.

    1985-01-01

The postbuckling behavior of long rectangular isotropic and orthotropic plates is determined. By assuming trigonometric functions in one direction, the nonlinear partial differential equations of von Karman large-deflection plate theory are converted into nonlinear ordinary differential equations. The ordinary differential equations are solved numerically using an available boundary-value-problem solver that makes use of Newton's method. Results for longitudinal compression show different postbuckling behavior between isotropic and orthotropic plates. Results for shear show that changes in in-plane edge constraints can cause large changes in postbuckling stiffness.
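The solution strategy described, reducing the problem to nonlinear ordinary differential equations and solving the boundary value problem with Newton's method, can be illustrated on a much simpler model equation. This is a generic shooting-method sketch on an invented test problem, not the von Karman plate equations:

```python
# Shooting method for a nonlinear two-point BVP, with Newton iteration on
# the unknown initial slope (finite-difference derivative). Test problem:
# y'' = 1.5*y**2, y(0) = 4, y(1) = 1, which has the exact solution
# y = 4/(1+x)**2 with initial slope y'(0) = -8.

def f(y, v):
    """Right-hand side of y'' = 1.5*y**2 written as a first-order system."""
    return v, 1.5 * y * y

def rk4_shoot(slope, n=200):
    """Integrate from x = 0 with y(0) = 4, y'(0) = slope; return y(1)."""
    h = 1.0 / n
    y, v = 4.0, slope
    for _ in range(n):
        k1y, k1v = f(y, v)
        k2y, k2v = f(y + 0.5 * h * k1y, v + 0.5 * h * k1v)
        k3y, k3v = f(y + 0.5 * h * k2y, v + 0.5 * h * k2v)
        k4y, k4v = f(y + h * k3y, v + h * k3v)
        y += h * (k1y + 2 * k2y + 2 * k3y + k4y) / 6
        v += h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
    return y

def solve_slope(target=1.0, s=-5.0, tol=1e-10):
    """Newton's method on the shooting residual r(s) = y(1; s) - target."""
    for _ in range(50):
        r = rk4_shoot(s) - target
        if abs(r) < tol:
            break
        ds = 1e-6
        drds = (rk4_shoot(s + ds) - rk4_shoot(s)) / ds  # finite-difference Jacobian
        s -= r / drds
    return s

slope = solve_slope()
print(round(slope, 3))  # initial slope of the computed solution branch
```

Production BVP solvers (such as the one the paper used) apply the same Newton idea to a discretized system rather than a single shooting parameter, which is more robust for stiff postbuckling branches.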

  15. A fully electronic intensity-modulated radiation therapy quality assurance (IMRT QA) process implemented in a network comprised of independent treatment planning, record and verify, and delivery systems

    International Nuclear Information System (INIS)

    Bailey, Daniel W; Kumaraswamy, Lalith; Podgorsak, Matthew B

    2010-01-01

The purpose of this study is to implement an electronic method to perform and analyze intensity-modulated radiation therapy quality assurance (IMRT QA) using an aSi megavoltage electronic portal imaging device in a network comprised of independent treatment planning, record and verify (R&V), and delivery systems. A verification plan was generated in the treatment planning system using the actual treatment plan of a patient. After exporting the treatment fields to the R&V system, the fields were delivered in QA mode with the aSi imager deployed. The resulting dosimetric images are automatically stored in a DICOM-RT format in the delivery system treatment console computer. The relative dose density images are subsequently pushed to the R&V system. The absolute dose images are then transferred electronically from the treatment console computer to the treatment planning system and imported into the verification plan in the dosimetry work space for further analysis. Screen shots of the gamma evaluation and isodose comparison are imported into the R&V system as an electronic file (e.g. PDF) to be reviewed prior to initiation of patient treatment. A relative dose image predicted by the treatment planning system can also be sent to the R&V system to be compared with the relative dose density image measured with the aSi imager. Our department does not have integrated planning, R&V, and delivery systems. In spite of this, we are able to fully implement a paperless and filmless IMRT QA process, allowing subsequent analysis and approval to be more efficient, while the QA document is directly attached to its specific patient chart in the R&V system in electronic form. The calculated and measured relative dose images can be compared electronically within the R&V system to analyze the density differences and ensure proper dose delivery to patients. In the absence of an integrated planning, verifying, and delivery system, we have shown that it is nevertheless possible to develop a
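The gamma evaluation mentioned in this record combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal one-dimensional sketch, with invented profile data and the commonly used 3%/3 mm tolerances (the study itself works with 2-D images):

```python
# A minimal 1-D sketch of the gamma evaluation used to compare a measured
# dose profile against the planned one. Positions, doses, and criteria
# below are invented for illustration; clinical QA uses 2-D dose images.
import math

def gamma_index(ref, meas, dd=0.03, dta=3.0):
    """ref/meas: lists of (position_mm, dose). Returns per-point gamma for meas.

    dd  -- fractional dose-difference tolerance (3% of local reference dose)
    dta -- distance-to-agreement tolerance in mm (3 mm)
    """
    gammas = []
    for xm, dm in meas:
        best = min(
            math.sqrt(((xm - xr) / dta) ** 2 + ((dm - dr) / (dd * dr)) ** 2)
            for xr, dr in ref
        )
        gammas.append(best)
    return gammas

ref = [(0.0, 1.00), (1.0, 1.02), (2.0, 1.05), (3.0, 1.01)]   # planned profile
meas = [(0.0, 1.01), (1.0, 1.03), (2.0, 1.04), (3.0, 1.00)]  # measured profile
g = gamma_index(ref, meas)
pass_rate = 100.0 * sum(1 for x in g if x <= 1.0) / len(g)
print(pass_rate)  # 100.0 -- every point passes under 3%/3 mm here
```

A point passes when its gamma value is at most 1, i.e. when some reference point lies within the combined dose/distance tolerance ellipse.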

  16. Analytical techniques for the study of polyphenol-protein interactions.

    Science.gov (United States)

    Poklar Ulrih, Nataša

    2017-07-03

This mini-review focuses on advances in biophysical techniques to study polyphenol interactions with proteins. Polyphenols have many beneficial pharmacological properties, as a result of which they have been the subject of intensive study. The most conventional techniques described here can be divided into three groups: (i) methods used for screening (in-situ methods); (ii) methods used to gain insight into the mechanisms of polyphenol-protein interactions; and (iii) methods used to study protein aggregation and precipitation. All of these methods are based on modifications to the physicochemical properties of the polyphenols or proteins after binding/complex formation in solution. To date, numerous review articles have been published in the field of polyphenols. This review gives a brief insight into the techniques that have been successfully used to study polyphenol-protein interactions: computational methods, biosensors and cell-based methods; spectroscopic methods, including fluorescence emission, UV-vis absorption, circular dichroism, Fourier transform infrared spectroscopy, and mass spectrometry; nuclear magnetic resonance; X-ray diffraction; light scattering techniques, including small-angle X-ray scattering and small-angle neutron scattering; calorimetric techniques (isothermal titration calorimetry and differential scanning calorimetry); and microscopy. Finally, new methods based on single-molecule detection, with high potential for studying polyphenol-protein interactions, will be presented. The advantages and disadvantages of each technique will be discussed, as well as the thermodynamic, kinetic, or structural parameters that can be obtained. Other relevant biophysical experimental techniques that have proven valuable, such as electrochemical methods, hydrodynamic techniques, and chromatographic techniques, will not be described here.

  17. Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    OpenAIRE

    Guidi, Gabriele; Miniati, Roberto; Mazzola, Matteo; Iadanza, Ernesto

    2016-01-01

In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionality in Information Technology services, from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detect...

  18. Analytic study of 1D diffusive relativistic shock acceleration

    Energy Technology Data Exchange (ETDEWEB)

Keshet, Uri, E-mail: ukeshet@bgu.ac.il [Physics Department, Ben-Gurion University of the Negev, POB 653, Be'er-Sheva 84105 (Israel)

    2017-10-01

Diffusive shock acceleration (DSA) by relativistic shocks is thought to generate the dN/dE ∝ E^(−p) spectra of charged particles in various astronomical relativistic flows. We show that for test particles in one dimension (1D), 1/p = 1 − ln[γ_d(1 + β_d)] / ln[γ_u(1 + β_u)], where β_u (β_d) is the upstream (downstream) normalized velocity, and γ is the respective Lorentz factor. This analytically captures the main properties of relativistic DSA in higher dimensions, with no assumptions on the diffusion mechanism. Unlike in 2D and 3D, here the spectrum is sensitive to the equation of state even in the ultra-relativistic limit, and (for a Jüttner-Synge equation of state) noticeably hardens with increasing 1 < γ_u < 57, before logarithmically converging back to p(γ_u → ∞) = 2. The 1D spectrum is sensitive to drifts, but only in the downstream, and not in the ultra-relativistic limit.
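The closed-form 1D index is easy to evaluate numerically. The sketch below uses illustrative shock velocities, not values taken from the paper:

```python
# Numerical evaluation of the 1-D test-particle spectral index from the
# record's formula: 1/p = 1 - ln[g_d*(1+b_d)] / ln[g_u*(1+b_u)], where b is
# the normalized velocity and g the Lorentz factor. The sample velocities
# below are illustrative, not taken from the paper.
import math

def lorentz(beta):
    """Lorentz factor for normalized velocity beta."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

def spectral_index_1d(beta_u, beta_d):
    """Spectral index p for dN/dE proportional to E^-p in 1-D DSA."""
    num = math.log(lorentz(beta_d) * (1.0 + beta_d))
    den = math.log(lorentz(beta_u) * (1.0 + beta_u))
    return 1.0 / (1.0 - num / den)

# Illustrative upstream/downstream velocities (beta_u = 0.8, beta_d = 1/3)
print(round(spectral_index_1d(0.8, 1.0 / 3.0), 3))  # 1.461
```

The combination γ(1 + β) appearing in both logarithms is the relativistic Doppler-like factor of each flow, which is why the index depends only on these two ratios.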

  19. AN ANALYTICAL STUDY OF DEATHS DUE TO POISONING IN VISAKHAPATNAM

    Directory of Open Access Journals (Sweden)

    V. Chandrasekhar

    2017-11-01

BACKGROUND: The aim of this study was to determine and classify the various types of poisoning deaths as seen at Andhra Medical College Mortuary, Visakhapatnam city. MATERIALS AND METHODS: This is a retrospective study of all deaths due to poisoning seen in the Department of Forensic Medicine & Toxicology, Andhra Medical College, Visakhapatnam city, over a 15-year period (January 2001-December 2015), as recorded in the autopsy registers and postmortem reports of the department. RESULTS: Poisoning is one of the commonest methods of committing suicide, especially in developing countries like India. A total of 22,475 autopsies were done during the period. Two thousand and seventy-four cases, representing 9.23% of all bodies received by the mortuary, were deaths due to poisoning. Organophosphate compounds were the most commonly abused substances (78.98%). The most common motive of poisoning was suicidal (93.43%), with a male-to-female ratio of 6.69:1. Peak incidence was observed in the age group 21-40 years. The type of poison consumed, socioeconomic status, and place of household were also ascertained. CONCLUSION: This study shows the pattern of poisoning deaths in Visakhapatnam, and this preliminary data will provide a baseline for future research and help in formulating policies to prevent deaths due to poisoning.

  20. Quality assurance of 3-D conformal radiation therapy for a cooperative group trial - RTOG 3D QA center initial experience

    International Nuclear Information System (INIS)

    Michalski, Jeff M.; Purdy, James A.; Harms, William B.; Bosch, Walter R.; Oehmke, Frederick; Cox, James D.

    1996-01-01

PURPOSE: 3-D conformal radiation therapy (3DCRT) holds promise in allowing safe escalation of radiation dose to increase the local control of prostate cancer. Prospective evaluation of this new modality requires strict quality assurance (QA). We report the results of QA review on patients receiving 3DCRT for prostate cancer on a cooperative group trial. MATERIALS AND METHODS: In 1993 the NCI awarded the ACR/RTOG and nine institutions an RFA grant to study the use of 3DCRT in the treatment of prostate cancer. A phase I/II trial was developed to: a) test the feasibility of conducting 3DCRT radiation dose escalation in a cooperative group setting; b) establish the maximum tolerated radiation dose that can be delivered to the prostate; and c) quantify the normal tissue toxicity rate when using 3DCRT. In order to assure protocol compliance, each participating institution was required to implement data exchange capabilities with the RTOG 3D QA center. The QA center reviews at a minimum the first five cases from each participating center and spot-checks subsequent submissions. For each case review the following parameters are evaluated: 1) target volume delineation, 2) normal structure delineation, 3) CT data quality, 4) field placement, 5) field shaping, and 6) dose distribution. RESULTS: Since the first patient was registered on August 23, 1994, an additional 170 patients have been accrued. Each of the nine originally approved institutions has participated, and three other centers have recently passed quality assurance benchmarks for study participation. Eighty patients have been treated at the first dose level (68.4 Gy minimum PTV dose) and accrual is currently ongoing at the second dose level (73.8 Gy minimum PTV dose). Of the 124 cases that have undergone complete or partial QA review, 30 cases (24%) have had some problems with data exchange. Five of 67 CT scans were not acquired by protocol standards. Target volume delineation required the submitting institution

  1. Estimation of eye lens dose during brain scans using Gafchromic XR-QA2 film in various multidetector CT scanners

    International Nuclear Information System (INIS)

    Akhilesh, Philomina; Jamhale, Shramika H.; Sharma, S.D.; Kumar, Rajesh; Datta, D.; Kulkarni, Arti R.

    2017-01-01

The purpose of this study was to estimate eye lens dose during brain scans in 16-, 64-, 128- and 256-slice multidetector computed tomography (CT) scanners in helical acquisition mode and to test the feasibility of using radiochromic film as an eye lens dosemeter during CT scanning. Eye lens dose measurements were performed using Gafchromic XR-QA2 film on a polystyrene head phantom designed with outer dimensions equivalent to the head size of a reference Indian man. The response accuracy of the XR-QA2 film was validated using thermoluminescent dosemeters (TLDs). The eye lens dose measured using XR-QA2 film on the head phantom for plain brain scanning in helical mode ranged from 43.8 to 45.8 mGy. The XR-QA2 film dose values agreed with the TLD dose values within a maximum variation of 8.9%. The good correlation between the two data sets confirms the viability of using XR-QA2 film for eye lens dosimetry. (authors)

  2. Analytical study of the conjecture rule for the combination of multipole effects in LHC

    CERN Document Server

    Guignard, Gilbert

    1997-01-01

This paper summarizes the analytical investigation of the conjecture law, found by tracking, for the effect on the dynamic aperture of the combination of two multipoles of various orders. A one-dimensional model leading to an integrable system has been used to find closed formulae for the dynamic aperture associated with a fully distributed multipole. The combination has then been studied and the resulting expression compared with the assumed conjecture law. For integrated multipoles that are small with respect to the focusing strength, the conjecture appears to hold, though with an exponent different from the one expected by crude reasoning.

  3. Tribes and chiefdoms: An analytical study of some Brazilian ceramics

    International Nuclear Information System (INIS)

    Sabino, C.V.S.; Prous, A.P.; Wuest, I.; Guapindaia, V.

    2003-01-01

There is no evidence of urban civilization in Brazilian prehistory; most inhabitants lived in tribal organizations, probably with regional economic integration among several independent tribes. There is little evidence of seasonal migrations between the coastal and inland areas of southern Brazil. Some specialized horticulturists competed among themselves, but other groups lived more in isolation, and probably peacefully, in the upper interfluvial regions. The chiefdom system is supposed to have existed only along the river Amazon. In this region, some pottery makers may have been specialized craftsmen, and the finest ceramics, which could have been exported from one village or region to another, can be found. Outside this region, pottery was generally plain, except the tupiguarani, which was partly decorated. In this study some limited possibilities were tested, in three different cultural and regional contexts, to find out whether the application of chemical analysis to economically and politically 'simple' societies can produce any results of additional archaeological relevance. (author)

  4. Radioluminescence in Al2O3: C - analytical and numerical simulation results

    DEFF Research Database (Denmark)

    Pagonis, V.; Lawless, J.; Chen, R.

    2009-01-01

    The phenomenon of radioluminescence (RL) has been reported in a number of materials including Al2O3 : C, which is one of the main dosimetric materials. In this work, we study RL using a kinetic model involving two trapping states and two kinds of recombination centres. The model has been previous...

  5. Tank 241-SY-102, January 2000 Compatibility Grab Samples Analytical Results for the Final Report

    International Nuclear Information System (INIS)

    BELL, K.E.

    2000-01-01

This document is the format IV, final report for the tank 241-SY-102 (SY-102) grab samples taken in January 2000 to address waste compatibility concerns. Chemical, radiochemical, and physical analyses of the tank SY-102 samples were performed as directed in the Compatibility Grab Sampling and Analysis Plan for Fiscal Year 2000 (Sasaki 1999). No notification limits were exceeded. Preliminary data on samples 2SY-99-5, -6, and -7 were reported in 'Format II Report on Tank 241-SY-102 Waste Compatibility Grab Samples Taken in January 2000' (Lockrem 2000). The data presented here represent the final results

  6. Photocatalytic degradation of rosuvastatin: Analytical studies and toxicity evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Machado, Tiele Caprioli, E-mail: tiele@enq.ufrgs.br [Chemical Engineering Department, Federal University of Rio Grande do Sul, Rua Engenheiro Luiz Englert s/n, CEP: 90040-040 Porto Alegre, RS (Brazil); Pizzolato, Tânia Mara [Chemical Institute, Federal University of Rio Grande do Sul, Avenida Bento Gonçalves, 9500, CEP: 91501-970 Porto Alegre, RS (Brazil); Arenzon, Alexandre [Ecology Center, Federal University of Rio Grande do Sul, Avenida Bento Gonçalves, 9500, CEP: 91501-970 Porto Alegre, RS (Brazil); Segalin, Jeferson [Biotechnology Center, Federal University of Rio Grande do Sul, Avenida Bento Gonçalves, 9500, CEP: 91501-970 Porto Alegre, RS (Brazil); Lansarin, Marla Azário [Chemical Engineering Department, Federal University of Rio Grande do Sul, Rua Engenheiro Luiz Englert s/n, CEP: 90040-040 Porto Alegre, RS (Brazil)

    2015-01-01

Photocatalytic degradation of rosuvastatin, a drug used to reduce blood cholesterol levels, was studied in this work employing ZnO as the catalyst. The experiments were carried out in a temperature-controlled batch reactor irradiated with UV light. The effects of photocatalyst loading, initial pH, and initial rosuvastatin concentration were first evaluated. The experimental results showed that rosuvastatin degradation is primarily a photocatalytic process with pseudo-first-order kinetics. The byproducts generated during the oxidative process were identified using nano-ultra-performance liquid chromatography tandem mass spectrometry (nano-UPLC–MS/MS), and acute toxicity tests using Daphnia magna were done to evaluate the toxicity of the untreated rosuvastatin solution and the reactor effluent. - Highlights: • The photocatalytic degradation of rosuvastatin was studied under UV irradiation. • The commercial catalyst ZnO was used. • Initial rosuvastatin concentration, photocatalyst loading, and pH were evaluated. • The byproducts generated during the oxidative process were detected and identified. • Acute toxicity tests using Daphnia magna were carried out.
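Pseudo-first-order kinetics of the kind reported here are usually fitted as ln(C0/C) = k·t. A minimal sketch of that fit on invented (synthetic) concentration data, not data from the study:

```python
# Pseudo-first-order kinetics: ln(C0/C) = k*t. Least-squares estimate of
# the apparent rate constant k as a slope through the origin, applied to
# synthetic concentration data (not data from the study).
import math

def rate_constant(times, concentrations):
    """Fit ln(C0/C) = k*t by least squares through the origin; return k."""
    c0 = concentrations[0]
    ys = [math.log(c0 / c) for c in concentrations]
    return sum(t * y for t, y in zip(times, ys)) / sum(t * t for t in times)

# Synthetic data generated with k = 0.05 per minute and C0 = 10 (arbitrary units)
times = [0.0, 10.0, 20.0, 30.0, 60.0]
conc = [10.0 * math.exp(-0.05 * t) for t in times]
print(round(rate_constant(times, conc), 3))  # 0.05
```

With real measurements, the linearity of ln(C0/C) versus t is itself the check that pseudo-first-order kinetics describe the degradation.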

  7. Comparative study of the vapor analytes of trinitrotoluene (TNT)

    Science.gov (United States)

    Edge, Cindy C.; Gibb, Julie; Dugan, Regina E.

    1998-12-01

Trinitrotoluene (TNT) is a high explosive used in most antipersonnel and antitank landmines. The Institute for Biological Detection Systems (IBDS) has developed a quantitative vapor delivery system, termed an olfactometer, for conducting canine olfactory research. The research is conducted under dynamic conditions; therefore, it is imperative to evaluate the headspace of TNT to ensure consistency with the dynamic generation of vapor. This study quantified the vapor headspace of military-grade TNT utilizing two different vapor generation methodologies, static and dynamic, reflecting differences between field and laboratory environments. Static vapor collection, which closely mimics conditions found during field detection, is defined as vapor collected in an open-air environment at ambient temperature. Dynamic vapor collection incorporates trapping of gases from the high-flow vapor generation cell used during olfactometer operation. Analysis of samples collected by the two methodologies was performed by gas chromatography/mass spectrometry, and the results provided information on the constituents detected; however, constituent concentrations did vary between the sampling methods. This study provides essential information regarding the vapor constituents associated with TNT sampled using different methods. These differences may be important in determining the detection signature dogs use to recognize TNT.

  8. Experimental and analytical study of natural circulation in square loop

    International Nuclear Information System (INIS)

    Moorthi, A.; Prem Sai, K.; Ravi, K.V.

    2015-01-01

Nuclear safety under station blackout conditions is a major concern in the design of nuclear reactors. In the case of existing reactors, the heat removal capability of cooling systems under natural circulation conditions is to be ascertained by experiments and analysis. This will ensure long-term core cooling and, thereby, the safety of the reactor core. Natural circulation occurs when the heat sink is at a higher elevation than the heat source. When the heat source and the sink are nearly at the same elevation, the difference in the elevations of their thermal centres can provide the elevation head required for natural circulation. An experimental study of natural circulation in the above geometry was carried out. The effects of flow resistance, heat source strength (heater power), and the elevation difference between the source and the sink on heat transfer were studied. The results of the experiments were analysed using RELAP5/MOD 3.2, and a good match between the experimental data and the RELAP5 predictions is observed. (author)

  9. Analytical and Experimental Feasibility Study of Combined OTEC on NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jeongtae; Oh, Kyemin; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Hoon [KEPCO Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

The concept of the Combined Ocean Thermal Energy Conversion (Combined OTEC) needs to be studied. Combined OTEC uses exhausted steam from Nuclear Power Plants (NPPs) as its heat source instead of surface water. Exhausted steam extracted from the condenser evaporates the working fluid of Combined OTEC at a heat exchanger (Hx-W). Essential calculations for the conceptual design of Combined OTEC were already performed and presented before. However, the technical issue of whether sufficient exhausted steam can be extracted from the high vacuum of the condenser to the Hx-W remained unclear, and resolving it is significant for continuing a demonstration program. In this study, therefore, we calculated the rate of extracted steam using the RELAP code to evaluate whether sufficient steam can be extracted. For the implementation of Combined OTEC, confirmation of sufficient flow of exhausted steam into the Hx-W is the starting point of the research. As a result of the RELAP calculation, we confirmed that exhausted steam would flow into the Hx-W. Considering the amount of exhausted steam in an NPP rated at 1000 MWe with 36% efficiency, a 9% flow rate to the Hx-W means that 160 MWt of heat is available as the heat source of Combined OTEC. Using this, it is possible to improve the efficiency of aged NPPs and to compensate for the power loss caused by the increase in circulating water temperature, particularly in summer.
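The quoted 160 MWt figure follows from a simple heat balance on the numbers given in the record (1000 MWe, 36% efficiency, 9% of the condenser exhaust flow):

```python
# Back-of-envelope check of the heat budget quoted in the record: a
# 1000 MWe plant at 36% thermal efficiency rejects ~1778 MWt through the
# condenser, and 9% of that exhaust-steam flow carries about 160 MWt.

electric_mw = 1000.0
efficiency = 0.36

thermal_mw = electric_mw / efficiency   # ~2778 MWt core thermal power
rejected_mw = thermal_mw - electric_mw  # ~1778 MWt rejected via the condenser
to_hx_w_mw = 0.09 * rejected_mw         # 9% of the exhaust flow to Hx-W
print(round(to_hx_w_mw))  # 160
```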

  10. Air Monitoring Network at Tonopah Test Range: Network Description, Capabilities, and Analytical Results

    International Nuclear Information System (INIS)

    Hartwell, William T.; Daniels, Jeffrey; Nikolich, George; Shadel, Craig; Giles, Ken; Karr, Lynn; Kluesner, Tammy

    2012-01-01

    During the period April to June 2008, at the behest of the Department of Energy (DOE), National Nuclear Security Administration, Nevada Site Office (NNSA/NSO), the Desert Research Institute (DRI) constructed and deployed two portable environmental monitoring stations at the Tonopah Test Range (TTR) as part of the Environmental Restoration Project Soils Activity. DRI has operated these stations since that time. A third station was deployed in the period May to September 2011. The TTR is located within the northwest corner of the Nevada Test and Training Range (NTTR) and covers an area of approximately 725.20 km2 (280 mi2). The primary objective of the monitoring stations is to evaluate whether, and under what conditions, there is wind transport of radiological contaminants from Soils Corrective Action Units (CAUs) associated with Operation Roller Coaster on the TTR. Operation Roller Coaster was a series of tests, conducted in 1963, designed to examine the stability and dispersal of plutonium in storage and transportation accidents. These tests did not result in any nuclear explosive yield. However, they did result in the dispersal of plutonium and contamination of surface soils in the surrounding area.

  11. Investigation and analytical results of bituminized products in drums at the filling room

    International Nuclear Information System (INIS)

    Shibata, Atsuhiro; Kato, Yoshiyuki; Sano, Yuichi; Kitajima, Takafumi; Fujita, Hideto

    1999-09-01

    This report describes the results of an investigation of the bituminized products in drums, the liquid waste in receiving tank V21, and the bituminized mixture in the extruder. The investigation of the products in drums showed that most of the unburned products filled after batch 28B had abnormalities, such as hardened surfaces, cavities, and porous brittle material. The particle sizes of the salt fixed in the bituminized products depended neither on batch number nor on feed rate, indicating that fining of the salt particles due to the decreased feed rate did not occur. The measured concentrations of metals and anions in the bituminized products showed no abnormality, and no catalytic content was detected in the products. The infrared absorption spectra obtained from the bituminized products show that the oxidation at the incident occurred without oxygen. There was no organic phase on the surface of the liquid waste in V21, and chemical and thermal analyses of the precipitate in V21 showed no abnormality. The concentration of sodium nitrate/nitrite in the mixture collected from the extruder was lower than in normal products. These results show no chemical activation of the bituminized products. It can be concluded that the chemical characteristics of the products had little abnormality even around the time of the incident. (author)

  12. Experimental and analytical studies on soil-structure interaction behavior of nuclear reactor building

    International Nuclear Information System (INIS)

    Tsushima, Y.

    1978-01-01

    The purpose of this study is to estimate damping effects due to soil-structure interaction, i.e. the dissipation of vibrational energy to the ground through the foundation, in a building with a short fundamental period such as a nuclear reactor building. The author performed experimental and analytical studies on the vibrational characteristics of model steel structures from one to four stories high, erected on a rigid base and located on soil, which simulate the vibrational characteristics of a prototype reactor building: the former study obtains damping effects due to internal friction of the steel frames, and the latter obtains radiation damping effects due to soil-structure interaction. The author also touches upon the results of experiments performed on a BWR-type reactor building in 1974, which showed damping ratios higher than 20% in the fundamental modes. The author then estimates the damping effects of the reactor building by his own method proposed in the report. Through these studies the author concludes that the experimental damping effects from energy dissipation are remarkable in the lower modes, and that the analytical results show a fairly good fit to the experimental ones.

  13. Analytical and Numerical Studies of the Complex Interaction of a Fast Ion Beam Pulse with a Background Plasma

    International Nuclear Information System (INIS)

    Kaganovich, Igor D.; Startsev, Edward A.; Davidson, Ronald C.

    2003-01-01

    Plasma neutralization of an intense ion beam pulse is of interest for many applications, including plasma lenses, heavy ion fusion, and high energy physics. Comprehensive analytical, numerical, and experimental studies are underway to investigate the complex interaction of a fast ion beam with a background plasma. The positively charged ion beam attracts plasma electrons, and as a result the plasma electrons tend to neutralize the beam charge and current. A suite of particle-in-cell codes has been developed to study the propagation of an ion beam pulse through the background plasma. For quasi-steady-state propagation of the ion beam pulse, an analytical theory has been developed using the assumption of long charge bunches and conservation of generalized vorticity. The analytical results agree well with the results of the numerical simulations. Visualization of the data obtained in the numerical simulations shows complex collective phenomena during beam entry into and exit from the plasma.

  14. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    Science.gov (United States)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.

  15. QA role in advanced energy activities: Reductionism, emergence, and functionalism; presuppositions in designing internal QA audits

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1988-06-01

    After a brief overview of the mission of Fermilab, this paper explores some of the problems associated with designing internal QA audits. The paper begins with several examples of how audits should not be designed, then goes on to analyze two types of presuppositions about organizational structure (reductionism and emergence) that can be misleading and can skew the data sample if folded too heavily into the checklist. A third type of presupposition (functionalism) is proposed as a viable way of achieving a more well-rounded measure of the performance of an organization, i.e. its effectiveness, not just its compliance.

  16. WE-AB-201-02: TPS Commissioning and QA: A Process Orientation and Application of Control Charts

    Energy Technology Data Exchange (ETDEWEB)

    Sharpe, M. [The Princess Margaret Cancer Centre - UHN (Canada)

    2015-06-15

    defects in the future. Finally, the Gamma test has become a popular metric for reporting TPS commissioning and QA results. It simplifies complex testing into a numerical index, but noisy data and casual application can make it misleading. A brief review of the issues around the use of the Gamma test will be presented. TPS commissioning and QA: a process orientation and application of control charts (Michael Sharpe). A framework for commissioning a treatment planning system will be presented, focusing on preparations, practical aspects of configuration, priorities, specifications, and establishing performance. The complexity of the modern TPS makes modular testing of features inadequate, and modern QA tools can provide “too much information” about the performance of techniques like IMRT and VMAT. We have adopted a process orientation and quality tools, like control charts, for ongoing TPS QA and assessment of patient-specific tests. The trending nature of these tools reveals the overall performance of the TPS, and quantifies the variations that arise from individual plans, discrete calculations, and experimentation based on discrete measurements. Examples demonstrating application of these tools to TPS QA will be presented. TPS commissioning and QA: incorporating the entire planning process (Sasa Mutic). The TPS and its features do not perform in isolation. Instead, the features and modules are key components in a complex process that begins with CT simulation and extends to treatment delivery, along with image guidance and verification. Most importantly, the TPS is used by people working in a multi-disciplinary environment, and it is very difficult to predict the outcomes of human interactions with software. Therefore, an interdisciplinary approach to training, commissioning, and QA will be presented, along with an approach to the physics chart check and end-to-end testing as a tool for TPS QA. The role of standardization and automation in QA will also be discussed.

  17. WE-AB-201-02: TPS Commissioning and QA: A Process Orientation and Application of Control Charts

    International Nuclear Information System (INIS)

    Sharpe, M.

    2015-01-01

    defects in the future. Finally, the Gamma test has become a popular metric for reporting TPS commissioning and QA results. It simplifies complex testing into a numerical index, but noisy data and casual application can make it misleading. A brief review of the issues around the use of the Gamma test will be presented. TPS commissioning and QA: a process orientation and application of control charts (Michael Sharpe). A framework for commissioning a treatment planning system will be presented, focusing on preparations, practical aspects of configuration, priorities, specifications, and establishing performance. The complexity of the modern TPS makes modular testing of features inadequate, and modern QA tools can provide “too much information” about the performance of techniques like IMRT and VMAT. We have adopted a process orientation and quality tools, like control charts, for ongoing TPS QA and assessment of patient-specific tests. The trending nature of these tools reveals the overall performance of the TPS, and quantifies the variations that arise from individual plans, discrete calculations, and experimentation based on discrete measurements. Examples demonstrating application of these tools to TPS QA will be presented. TPS commissioning and QA: incorporating the entire planning process (Sasa Mutic). The TPS and its features do not perform in isolation. Instead, the features and modules are key components in a complex process that begins with CT simulation and extends to treatment delivery, along with image guidance and verification. Most importantly, the TPS is used by people working in a multi-disciplinary environment, and it is very difficult to predict the outcomes of human interactions with software. Therefore, an interdisciplinary approach to training, commissioning, and QA will be presented, along with an approach to the physics chart check and end-to-end testing as a tool for TPS QA. The role of standardization and automation in QA will also be discussed.

  18. Tank 241-AP-105, cores 208, 209 and 210, analytical results for the final report

    International Nuclear Information System (INIS)

    Nuzum, J.L.

    1997-01-01

    This document is the final laboratory report for Tank 241-AP-105. Push mode core segments were removed from Risers 24 and 28 between July 2, 1997, and July 14, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-AP-105 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997) and the Tank Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT), differential scanning calorimetry (DSC) analysis, or total organic carbon (TOC) analysis exceeded the notification limits stated in the TSAP and DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report. Appearance and sample handling: Two cores, each consisting of four segments, were expected from Tank 241-AP-105. Three cores were sampled, and complete cores were not obtained. The TSAP states core samples should be transported to the laboratory within three calendar days from the time each segment is removed from the tank. This requirement was not met for all cores. Attachment 1 illustrates subsamples generated in the laboratory for analysis and identifies their sources. This reference also relates tank farm identification numbers to their corresponding 222-S Laboratory sample numbers.
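The "95% confidence interval on the mean" cited in these tank reports is a standard small-sample statistic. As a hedged illustration only (not the Technical Basis Group's actual procedure), a two-sided Student's t interval over replicate subsample results can be computed as:

```python
import math
from statistics import mean, stdev

def ci95_mean(samples):
    """Two-sided 95% confidence interval on the mean of replicate
    analytical results, using Student's t for small n (two-sided
    critical values tabulated for 1..9 degrees of freedom)."""
    t95 = {1: 12.706, 2: 4.303, 3: 3.182, 4: 2.776, 5: 2.571,
           6: 2.447, 7: 2.365, 8: 2.306, 9: 2.262}
    n = len(samples)
    m = mean(samples)
    half_width = t95[n - 1] * stdev(samples) / math.sqrt(n)
    return m - half_width, m + half_width

# Four hypothetical replicate results (e.g. TOC in ug C/g):
lo, hi = ci95_mean([10.0, 12.0, 11.0, 13.0])
print(lo, hi)  # interval centered on the mean, 11.5
```
A result would then be compared against the notification limit using the upper bound of this interval.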

  19. Tank 241-T-201, core 192 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-08-07

    This document is the final laboratory report for Tank 241-T-201. Push mode core segments were removed from Riser 3 between April 24, 1997, and April 25, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-T-201 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997), Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), Additional Core Composite Sample from Drainable Liquid Samples for Tank 241-T-201 (ACC) (Hall, 1997), and Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group, and are not considered in this report.

  20. Tank 241-AP-105, cores 208, 209 and 210, analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-10-24

    This document is the final laboratory report for Tank 241-AP-105. Push mode core segments were removed from Risers 24 and 28 between July 2, 1997, and July 14, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-AP-105 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997) and the Tank Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT), differential scanning calorimetry (DSC) analysis, or total organic carbon (TOC) analysis exceeded the notification limits stated in the TSAP and DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group and are not considered in this report. Appearance and sample handling: Two cores, each consisting of four segments, were expected from Tank 241-AP-105. Three cores were sampled, and complete cores were not obtained. The TSAP states core samples should be transported to the laboratory within three calendar days from the time each segment is removed from the tank. This requirement was not met for all cores. Attachment 1 illustrates subsamples generated in the laboratory for analysis and identifies their sources. This reference also relates tank farm identification numbers to their corresponding 222-S Laboratory sample numbers.

  1. Tank 241-T-201, core 192 analytical results for the final report

    International Nuclear Information System (INIS)

    Nuzum, J.L.

    1997-01-01

    This document is the final laboratory report for Tank 241-T-201. Push mode core segments were removed from Riser 3 between April 24, 1997, and April 25, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-T-201 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997), Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), Additional Core Composite Sample from Drainable Liquid Samples for Tank 241-T-201 (ACC) (Hall, 1997), and Safety Screening Data Quality Objective (DQO) (Dukelow, et al., 1995). None of the subsamples submitted for total alpha activity (AT) or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group, and are not considered in this report.

  2. Analytical studies on optimization of containment design pressure

    International Nuclear Information System (INIS)

    Haware, S.K.; Ghosh, A.K.; Kushwaha, H.S.

    2005-01-01

    optimizing on the size of BOP in order to optimize the containment design pressure. The results of the optimization studies are presented and discussed in the paper. (authors)

  3. An analytical and numerical study of solar chimney use for room natural ventilation

    Energy Technology Data Exchange (ETDEWEB)

    Bassiouny, Ramadan; Koura, Nader S.A. [Department of Mechanical Power Engineering and Energy, Minia University, Minia 61111 (Egypt)

    2008-07-01

    The solar chimney concept used for improving room natural ventilation was studied analytically and numerically. The study considered geometrical parameters such as chimney inlet size and width, which are believed to have a significant effect on space ventilation. The numerical analysis was intended to predict the flow pattern in the room as well as in the chimney, which would help optimize the design parameters. The results were compared with available published experimental and theoretical data, and there was an acceptable trend match between the present analytical results and the published data for the room air changes per hour, ACH. Further, it was noticed that the chimney width has a more significant effect on ACH than the chimney inlet size. The results showed that the absorber average temperature could be correlated to the intensity as (T{sub w} = 3.51I{sup 0.461}), within an accepted range of approximation error. In addition, the average air exit velocity was found to vary with the intensity as ({nu}{sub ex} = 0.013I{sup 0.4}). (author)
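The two fitted correlations are easy to evaluate for a given solar intensity. A minimal sketch (function names are mine; I is in W/m2, as is conventional for solar intensity, though the abstract does not state the units):

```python
def absorber_temperature(intensity):
    """Absorber average temperature vs. solar intensity I,
    from the paper's fit T_w = 3.51 * I**0.461."""
    return 3.51 * intensity ** 0.461

def exit_velocity(intensity):
    """Average air exit velocity (m/s) vs. solar intensity I,
    from the paper's fit v_ex = 0.013 * I**0.4."""
    return 0.013 * intensity ** 0.4

# Example: at I = 500, the fits give roughly 62 for T_w and 0.16 m/s for v_ex.
print(absorber_temperature(500.0), exit_velocity(500.0))
```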

  4. Tank 103, 219-S Facility at 222-S Laboratory, analytical results for the final report

    International Nuclear Information System (INIS)

    Fuller, R.K.

    1998-01-01

    This is the final report for the polychlorinated biphenyls (PCB) analysis of Tank 103 (TK-103) in the 219-S Facility at the 222-S Laboratory. Twenty 1-liter bottles (sample numbers S98SO00074 through S98SO00093) were received from TK-103 during two sampling events, on May 5 and May 7, 1998. The samples were centrifuged to separate the solids and liquids, and the centrifuged sludge was analyzed for PCBs as Aroclor mixtures. The results are discussed on page 6. The sample breakdown diagram (page 114) provides a cross-reference of the sample identification of the bulk samples to the laboratory identification numbers for the solids. The request for sample analysis (RSA) form is provided as page 117. The raw data are presented on page 43. Sample description, handling, and preparation: Twenty samples were received in the laboratory in 1-liter bottles. The first 8 samples were received on May 5, 1998. There were insufficient solids to perform the requested PCB analysis, and 12 additional samples were collected and received on May 7, 1998. Breakdown and subsampling were performed on May 8, 1998. Sample number S98SO00084 was lost due to a broken bottle. Nineteen samples were centrifuged and the solids were collected in 8 centrifuge cones. After the last sample was processed, the solids were consolidated into 2 centrifuge cones; the first cone contained 9.7 grams of solids, and 13.0 grams were collected in the second. The wet sludge from the first centrifuge cone was submitted to the laboratory for PCB analysis (sample number S98SO00102). The other sample portion (S98SO00103) was retained for possible additional analyses.

  5. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    Science.gov (United States)

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question remains: when domain experts analyze their data, can they completely trust the outputs and operations on the machine side? Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative visual analytics system for biologists, focusing on the relationship between familiarity with an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. To evaluate the system, we present the results of a controlled user study with 34 biologists in which we compare the variation in the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.

  6. SU-F-T-489: 4-Years Experience of QA in TomoTherapy MVCT: What Do We Look Out For?

    Energy Technology Data Exchange (ETDEWEB)

    Lee, F; Chan, K [Queen Elizabeth Hospital, Hong Kong (Hong Kong)

    2016-06-15

    Purpose: To evaluate the QA results of TomoTherapy MVCT from March 2012 to February 2016, and to identify issues that may affect consistency in HU numbers and reconstructed treatment dose in MVCT. Methods: Monthly QA was performed on our TomoHD system. A phantom with rod inserts of various mass densities was imaged in MVCT and compared to baseline to evaluate HU number consistency. To evaluate treatment dose reconstructed from the delivered sinogram and MVCT, a treatment plan was designed on a humanoid skull phantom. The phantom was imaged with MVCT and the treatment plan was delivered to obtain the sinogram. The dose reconstructed with the Planned Adaptive software was compared to the dose in the original plan. The QA tolerance for HU numbers was ±30 HU, and ±2% for the discrepancy between the original plan dose and the reconstructed dose. Tolerances were referenced to AAPM TG-148. Results: Several technical modifications or maintenance activities to the system were identified which affected QA results: 1) an upgrade in console system software which added a weekly HU calibration procedure; 2) linac or MLC replacement leading to a change in Accelerator Output Machine (AOM) parameters; 3) an upgrade in the planning system algorithm affecting MVCT dose reconstruction. These events caused abrupt changes in QA results, especially for the reconstructed dose. In the past 9 months, when no such modifications were made to the system, the reconstructed dose was consistent, with maximum deviation from baseline less than 0.6%, and the HU numbers deviated less than 5 HU. Conclusion: Routine QA is essential for MVCT, especially if the MVCT is used for daily dose reconstruction to monitor delivered dose to patients. Several technical events which may affect its consistency are software changes and linac or MLC replacement. QA results reflected changes which justify re-calibration or system adjustment. In normal circumstances the system should be relatively stable, and quarterly QA may be sufficient.

  7. [Obstructive sleep apnea syndrome. Analytical study of 63 cases].

    Science.gov (United States)

    Battikh, Mohamed H; Joobeur, Sameh; Ben Sayeh, Mohamed M; Rouetbi, Naceur; Maatallah, Anis; Daami, Monia; el Kamel, Ali

    2004-02-01

    Obstructive sleep apnoea (OSA) is a relatively common disorder in developed countries, with prevalence estimated to lie between 2% and 4% of the adult population. The diagnosis of this syndrome is made on the basis of characteristic clinical features and the results of nocturnal polysomnography. There are no data concerning OSA in developing countries. It is therefore of interest to determine the clinical and polysomnographic profile of this disease and to identify factors correlated with severity in our country. This was achieved by studying a series of 63 OSA patients. The mean age was 53 ± 13 years, with a sex ratio of 1. The means of the Epworth sleepiness scale score, BMI, and apnoea/hypopnoea index (AHI) were respectively 16 ± 4, 38.8 ± 7 kg/m2, and 51.7 ± 28.6. 44% of patients had severe OSA with AHI > 50/h. The arousal index and desaturation index were respectively 36.4 ± 21.7 and 49 ± 26. A trial of continuous positive airway pressure (CPAP) therapy was proposed first to 40 patients; 17 were able to use CPAP.

  8. Water hammer and its effect on ageing: an analytical study

    International Nuclear Information System (INIS)

    Kedia, Suruchi

    2006-01-01

    Water hammer can be disastrous from the point of view of ageing of pipes and piping systems. The design of restraints and protection devices for the various piping systems must consider the severe stresses that may be generated by fluid transients; these transients are termed water hammer when the fluid is water. To maintain adequate margins on the stress loads of a piping system, it is very important to predict the actual magnitudes of the stresses. This paper covers various causes and analyses of the situations under which water hammer waves are generated, and also ways to control the occurrence of such situations. A few case studies are also covered, showing the results and graphs of the stress waves generated by water hammer. Effort has also been made to find a methodology for computing the ageing of the system due to water hammer waves. Further, the paper attempts to show a systematic methodology for the diagnosis of water hammer that can be treated as a foundation stone for the creation of a water hammer diagnosis system. Active measures to minimize water hammer intensity by influencing the fluid dynamic conditions of the system are also suggested. Finally, the paper presents the ageing aspects of the stresses generated by water hammer. (author)

  9. Analytical methods applied to the study of lattice gauge and spin theories

    International Nuclear Information System (INIS)

    Moreo, Adriana.

    1985-01-01

    A study of interactions between quarks and gluons is presented. Certain difficulties of quantum chromodynamics in explaining the behaviour of quarks gave rise to the technique of lattice gauge theories. First the phase diagrams of the discrete space-time theories are studied, by numerical and analytical methods. The following items were investigated: a) a variational technique was proposed to obtain very accurate values for the ground and first excited state energies of the analyzed theory; b) a mean-field-like approximation for lattice spin models in the link formulation, which is a generalization of the mean-plaquette technique, was developed; c) a new method to study lattice gauge theories at finite temperature was proposed, and for the first time a non-abelian model was studied with analytical methods; d) an abelian lattice gauge theory with fermionic matter in the strong coupling limit was analyzed, and interesting results applicable to non-abelian gauge theories were obtained. (M.E.L.) [es

  10. SU-F-T-271: Comparing IMRT QA Pass Rates Before and After MLC Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Mazza, A; Perrin, D; Fontenot, J [Mary Bird Perkins Cancer Center, Baton Rouge, LA (United States)

    2016-06-15

    Purpose: To compare IMRT QA pass rates before and after an in-house MLC leaf calibration procedure. Methods: The MLC leaves and backup jaws on four Elekta linear accelerators with MLCi2 heads were calibrated using the EPID-based RIT Hancock Test as the means of evaluation. The MLCs were considered successfully calibrated when they could pass the Hancock Test with criteria of 1 mm jaw position tolerance and 1 mm leaf position tolerance. IMRT QA results were collected pre- and post-calibration and analyzed using gamma analysis with 3%/3mm DTA criteria. AAPM TG-119 test plans were also compared pre- and post-calibration, at both 2%/2mm and 3%/3mm DTA. Results: A weighted average was computed over the results for all four linear accelerators. The pre-calibration IMRT QA pass rate was 98.3 ± 0.1%, compared with the post-calibration pass rate of 98.5 ± 0.1%. The TG-119 test plan results showed more improvement, particularly at the 2%/2mm criteria. The averaged results were 89.1% pre and 96.1% post for the C-shape plan, 94.8% pre and 97.1% post for the multi-target plan, 98.6% pre and 99.7% post for the prostate plan, and 94.7% pre and 94.8% post for the head/neck plan. Conclusion: The patient QA results did not show statistically significant improvement at the 3%/3mm DTA criteria after the MLC calibration procedure. However, the TG-119 test cases did show significant improvement at the 2%/2mm level.
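For readers unfamiliar with the gamma analysis behind these pass rates, a minimal 1D global-gamma sketch (Python/NumPy; an illustration, not the RIT or any clinical implementation) shows the idea of a 3%/3mm criterion: each measured point is scored by the minimum combined dose-difference/distance metric over the reference profile, and points with gamma <= 1 pass.

```python
import numpy as np

def gamma_pass_rate(ref, meas, positions, dose_frac=0.03, dta_mm=3.0):
    """1D global gamma: dose criterion as a fraction of the reference maximum,
    distance-to-agreement in mm. Returns (gamma values, percent passing)."""
    dose_norm = dose_frac * ref.max()
    gammas = np.empty(len(meas))
    for i in range(len(meas)):
        dose_term = ((meas[i] - ref) / dose_norm) ** 2
        dist_term = ((positions[i] - positions) / dta_mm) ** 2
        gammas[i] = np.sqrt(dose_term + dist_term).min()  # best match over the reference
    return gammas, 100.0 * np.mean(gammas <= 1.0)

# A profile measured 2% hot everywhere still passes at 3%/3mm:
x = np.linspace(-50.0, 50.0, 101)      # detector positions, mm
ref = np.exp(-(x / 30.0) ** 2)         # toy reference dose profile
gam, rate = gamma_pass_rate(ref, 1.02 * ref, x)
print(rate)                            # 100.0
```
Clinical tools apply the same scoring in 2D or 3D with interpolation between measurement points, which is why pass rates rather than raw dose differences are reported.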

  11. MO-D-213-05: Sensitivity of Routine IMRT QA Metrics to Couch and Collimator Rotations

    International Nuclear Information System (INIS)

    Alaei, P

    2015-01-01

    Purpose: To assess the sensitivity of the gamma index and other IMRT QA metrics to couch and collimator rotations. Methods: Two brain IMRT plans with couch and/or collimator rotations in one or more of the fields were evaluated using the IBA MatriXX ion chamber array and its associated software (OmniPro-I’mRT). The plans were subjected to routine QA by 1) creating a composite planar dose in the treatment planning system (TPS) with the couch/collimator rotations, and 2) creating the planar dose after “zeroing” the rotations. Plan deliveries to the MatriXX were performed with all rotations set to zero on a Varian 21ex linear accelerator. This in effect created TPS planar doses with an induced rotation error. Point dose measurements for the delivered plans were also performed in a solid water phantom. Results: The IMRT QA of the plans with couch and collimator rotations showed clear discrepancies in the planar dose and 2D dose profile overlays. The gamma analysis, however, did pass with the criteria of 3%/3mm (for 95% of the points), albeit with a lower percentage pass rate, when one or two of the fields had a rotation. Similar results were obtained with tighter criteria of 2%/2mm. Other QA metrics, such as percentage difference or distance-to-agreement (DTA) histograms, produced similar results. The point dose measurements did not clearly indicate the error, owing to the location of the dose measurement (on the central axis) and the size of the ion chamber used (0.6 cc). Conclusion: Relying on gamma analysis, percentage difference, or DTA to determine the passing of an IMRT QA may miss critical errors in plan delivery due to couch/collimator rotations. A combination of analyses for composite QA plans, or per-beam analysis, would detect these errors.

  12. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    International Nuclear Information System (INIS)

    Folkerts, M; Graves, Y; Tian, Z; Gu, X; Jia, X; Jiang, S

    2014-01-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command line-based GPU applications to perform a MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  13. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    Energy Technology Data Exchange (ETDEWEB)

    Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); University of California, San Diego, La Jolla, CA (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States); Tian, Z; Gu, X; Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command line-based GPU applications to perform a MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  14. Analytical techniques applied to study cultural heritage objects

    International Nuclear Information System (INIS)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N.

    2015-01-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded its studies to other analyses enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process during the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed with visible light, infrared reflectography (IR), ultraviolet fluorescence (UV), tangential light and digital radiography. Further expanding the possibilities of analysis, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy, which can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have yielded new information on objects from different University of Sao Paulo museums. To improve the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external-beam setup.

  15. Analytical techniques applied to study cultural heritage objects

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N., E-mail: rizzutto@if.usp.br [Universidade de Sao Paulo (USP), SP (Brazil). Instituto de Fisica

    2015-07-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded its studies to other analyses enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process during the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed with visible light, infrared reflectography (IR), ultraviolet fluorescence (UV), tangential light and digital radiography. Further expanding the possibilities of analysis, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy, which can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have yielded new information on objects from different University of Sao Paulo museums. To improve the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external-beam setup.

  16. An Analytical Study of Mammalian Bite Wounds Requiring Inpatient Management

    Directory of Open Access Journals (Sweden)

    Young-Geun Lee

    2013-11-01

    Full Text Available Background: Mammalian bite injuries create a public health problem because of their frequency, potential severity, and increasing number. Some researchers have performed fragmentary analyses of bite wounds caused by certain mammalian species. However, little practical information is available concerning serious mammalian bite wounds that require hospitalization and intensive wound management. Therefore, the purpose of this study was to perform a general review of serious mammalian bite wounds. Methods: We performed a retrospective review of the medical charts of 68 patients who were referred to our plastic surgery department for the treatment of bite wounds between January 2003 and October 2012. The cases were analyzed according to the species, patient demographics, environmental factors, injury characteristics, and clinical course. Results: Among the 68 cases of mammalian bite injury, 58 (85%) were caused by dogs, 8 by humans, and 2 by cats. Most of those bitten by a human and both of those bitten by cats were male. Only one-third of all the patients were children or adolescents. The most frequent site of injury was the face, with 40 cases, followed by the hand, with 16 cases. Of the 68 patients, 7 were treated with secondary intention healing. Sixty-one patients underwent delayed procedures, including delayed direct closure, skin graft, composite graft, and local flap. Conclusions: Based on overall findings from our review of the 68 cases of mammalian bites, we suggest practical guidelines for the management of mammalian bite injuries, which could be useful in the treatment of serious mammalian bite wounds.

  17. MO-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA I

    Energy Technology Data Exchange (ETDEWEB)

    Clements, M [RAD Image, Colorado Springs, CO (United States); Wiesmeyer, M [Standard Imaging, Inc., Middleton, WI (United States)

    2014-06-15

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Automated Imaging QA for TG-142 with RIT Presentation Time: 2:45 – 3:15 PM This presentation will discuss software tools for automated imaging QA and phantom analysis for TG-142. All modalities used in radiation oncology will be discussed, including CBCT, planar kV imaging, planar MV imaging, and imaging and treatment coordinate coincidence. Vendor supplied phantoms as well as a variety of third-party phantoms will be shown, along with appropriate analyses, proper phantom setup procedures and scanning settings, and a discussion of image quality metrics. Tools for process automation will be discussed which include: RIT Cognition (machine learning for phantom image identification), RIT Cerberus (automated file system monitoring and searching), and RunQueueC (batch processing of multiple images). In addition to phantom analysis, tools for statistical tracking, trending, and reporting will be discussed. This discussion will include an introduction to statistical process control, a valuable tool in analyzing data and determining appropriate tolerances. An Introduction to TG-142 Imaging QA Using Standard Imaging Products Presentation Time: 3:15 – 3:45 PM Medical Physicists want to understand the logic behind TG-142 Imaging QA. What is often missing is a firm understanding of the connections between the EPID and OBI phantom imaging, the software “algorithms” that calculate the QA metrics, the establishment of baselines, and the analysis and interpretation of the results. The goal of our brief presentation will be to

  18. MO-PIS-Exhibit Hall-01: Tools for TG-142 Linac Imaging QA I

    International Nuclear Information System (INIS)

    Clements, M; Wiesmeyer, M

    2014-01-01

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The therapy topic this year is solutions for TG-142 recommendations for linear accelerator imaging QA. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Automated Imaging QA for TG-142 with RIT Presentation Time: 2:45 – 3:15 PM This presentation will discuss software tools for automated imaging QA and phantom analysis for TG-142. All modalities used in radiation oncology will be discussed, including CBCT, planar kV imaging, planar MV imaging, and imaging and treatment coordinate coincidence. Vendor supplied phantoms as well as a variety of third-party phantoms will be shown, along with appropriate analyses, proper phantom setup procedures and scanning settings, and a discussion of image quality metrics. Tools for process automation will be discussed which include: RIT Cognition (machine learning for phantom image identification), RIT Cerberus (automated file system monitoring and searching), and RunQueueC (batch processing of multiple images). In addition to phantom analysis, tools for statistical tracking, trending, and reporting will be discussed. This discussion will include an introduction to statistical process control, a valuable tool in analyzing data and determining appropriate tolerances. An Introduction to TG-142 Imaging QA Using Standard Imaging Products Presentation Time: 3:15 – 3:45 PM Medical Physicists want to understand the logic behind TG-142 Imaging QA. What is often missing is a firm understanding of the connections between the EPID and OBI phantom imaging, the software “algorithms” that calculate the QA metrics, the establishment of baselines, and the analysis and interpretation of the results. The goal of our brief presentation will be to

  19. Application of X-ray fluorescence analytical techniques in phytoremediation and plant biology studies

    International Nuclear Information System (INIS)

    Necemer, Marijan; Kump, Peter; Scancar, Janez; Jacimovic, Radojko; Simcic, Jurij; Pelicon, Primoz; Budnar, Milos; Jeran, Zvonka; Pongrac, Paula; Regvar, Marjana; Vogel-Mikus, Katarina

    2008-01-01

    Phytoremediation is an emerging technology that employs the use of higher plants for the clean-up of contaminated environments. Progress in the field is however handicapped by limited knowledge of the biological processes involved in plant metal uptake, translocation, tolerance and plant-microbe-soil interactions; therefore a better understanding of the basic biological mechanisms involved in plant/microbe/soil/contaminant interactions would allow further optimization of phytoremediation technologies. In view of the needs of global environmental protection, it is important that in phytoremediation and plant biology studies the analytical procedures for elemental determination in plant tissues and soil should be fast and cheap, with simple sample preparation, and of adequate accuracy and reproducibility. The aim of this study was therefore to present the main characteristics, sample preparation protocols and applications of X-ray fluorescence-based analytical techniques (energy dispersive X-ray fluorescence spectrometry-EDXRF, total reflection X-ray fluorescence spectrometry-TXRF and micro-proton induced X-ray emission-micro-PIXE). Element concentrations in plant leaves from metal polluted and non-polluted sites, as well as standard reference materials, were analyzed by the mentioned techniques, and additionally by instrumental neutron activation analysis (INAA) and atomic absorption spectrometry (AAS). The results were compared and critically evaluated in order to assess the performance and capability of X-ray fluorescence-based techniques in phytoremediation and plant biology studies. EDXRF is recommended as suitable for the analysis of large numbers of samples, because it is multi-elemental, requires only simple preparation of the sample material, and is analytically comparable to the most frequently used instrumental chemical techniques. TXRF is comparable to FAAS in sample preparation, but relative to AAS it is fast, sensitive and

  20. Examining the Use of a Visual Analytics System for Sensemaking Tasks: Case Studies with Domain Experts.

    Science.gov (United States)

    Kang, Youn-Ah; Stasko, J

    2012-12-01

    While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.

  1. On New Families of Integrals in Analytical Studies of Superconductors within the Conformal Transformation Method

    Directory of Open Access Journals (Sweden)

    Ryszard Gonczarek

    2015-01-01

    Full Text Available We show that, by applying the conformal transformation method, strongly correlated superconducting systems can be discussed in terms of the Fermi liquid with a variable density of states function. Within this approach, it is possible to formulate and carry out purely analytical study based on a set of fundamental equations. After presenting the mathematical structure of the s-wave superconducting gap and other quantitative characteristics of superconductors, we evaluate and discuss integrals inherent in fundamental equations describing superconducting systems. The results presented here extend the approach formulated by Abrikosov and Maki, which was restricted to the first-order expansion. A few infinite families of integrals are derived and allow us to express the fundamental equations by means of analytical formulas. They can be then exploited in order to find quantitative characteristics of superconducting systems by the method of successive approximations. We show that the results can be applied in studies of high-Tc superconductors and other superconducting materials of the new generation.

  2. Analytical and Numerical Study of Foam-Filled Corrugated Core Sandwich Panels under Low Velocity Impact

    Directory of Open Access Journals (Sweden)

    Mohammad Nouri Damghani

    2016-05-01

    Full Text Available Analytical and finite element simulations are used to predict the effect of core density on the energy absorption of composite sandwich panels under low-velocity impact. The composite sandwich panel contains two facesheets and a foam-filled corrugated core. The analytical model is defined as a two-degree-of-freedom system based on equivalent masses, springs, and dashpots to predict the local and global deformation response of a simply supported panel. The results show good agreement between analytical and numerical predictions.
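
The two-degree-of-freedom idealization described in this abstract (equivalent masses coupled by springs and dashpots) can be sketched with a simple explicit time-stepping loop. All parameter values below are made up for illustration; in a real study they would be derived from the panel geometry and material properties:

```python
def simulate_2dof(m1, m2, k1, k2, c1, c2, v0, dt=1e-5, t_end=0.01):
    """Explicit-Euler sketch of a two-degree-of-freedom impact model.

    Impactor mass m1 (initial velocity v0) couples through a contact
    spring/dashpot (k1, c1) to panel mass m2, which rests on the
    equivalent panel stiffness/damping (k2, c2).
    Returns the peak contact force over the simulated interval.
    """
    x1 = x2 = 0.0          # displacements
    v1, v2 = v0, 0.0       # velocities
    peak = 0.0
    for _ in range(round(t_end / dt)):
        # linear contact force between impactor and panel
        f_c = k1 * (x1 - x2) + c1 * (v1 - v2)
        a1 = -f_c / m1
        a2 = (f_c - k2 * x2 - c2 * v2) / m2
        v1 += a1 * dt
        v2 += a2 * dt
        x1 += v1 * dt
        x2 += v2 * dt
        peak = max(peak, f_c)
    return peak
```

The returned peak contact force is the kind of quantity compared against finite element predictions; note a linear contact law is used here, whereas a real impact model would release contact once the force becomes tensile.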

  3. ANALYTICAL STUDY OF ESSENTIAL INFANTILE ESOTROPIA AND ITS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Kandasamy Sivakumar

    2017-06-01

    Full Text Available BACKGROUND Essential Infantile Esotropia (EIE) is the most common type of strabismus. About 0.1% of newborns are found to have esotropia [1]. Though present since birth, it becomes manifest and remains constant around six months of age. The features are large-angle constant strabismus, defective Binocular Single Vision (BSV), cross fixation, DVD and latent nystagmus. Most of the patients have mild-to-moderate hyperopia; the amount of deviation is unrelated to the amount and type of refractive error. MATERIALS AND METHODS Fifty cases with EIE were included in this prospective study. A thorough ophthalmic and orthoptic evaluation was done in all the patients. For patients more than three years of age, the angle of deviation was measured with the prism bar cover test, and for patients less than three years of age, the angle of deviation was measured with Hirschberg’s test. Associated features like cross fixation, abduction limitation, Dissociated Vertical Deviation (DVD), nystagmus, amblyopia and Inferior Oblique Overaction (IOOA) were documented. Occlusion therapy was given to amblyopic patients prior to surgery. All these patients underwent surgery and were followed up for a period of six months. RESULTS The prevalence of EIE in our centre was 0.33%. Of the fifty patients, 28 were males and 22 were females. 39 patients (78%) had a deviation of 30-50 Prism Dioptres (PD). The incidence of DVD, inferior oblique overaction and nystagmus was found to be lower when compared to the western population. Amblyopia should be diagnosed early and treated adequately before surgery. The standard surgical option is bimedial recession. Monocular recession-resection surgery in one eye can be opted for in cases of irreversible amblyopia. Three- or four-muscle surgery can be done if the deviation is very large. If marked inferior oblique overaction is present, the same should be weakened in addition to the horizontal muscle surgery. CONCLUSION EIE is the most common type of strabismus

  4. Analytical Study on Thermal and Mechanical Design of Printed Circuit Heat Exchanger

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Su-Jong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung-Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-09-01

    The analytical methodologies for the thermal design, mechanical design and cost estimation of printed circuit heat exchanger are presented in this study. In this study, three flow arrangements of parallel flow, countercurrent flow and crossflow are taken into account. For each flow arrangement, the analytical solution of temperature profile of heat exchanger is introduced. The size and cost of printed circuit heat exchangers for advanced small modular reactors, which employ various coolants such as sodium, molten salts, helium, and water, are also presented.

  5. Influence of centrifugation conditions on the results of 77 routine clinical chemistry analytes using standard vacuum blood collection tubes and the new BD-Barricor tubes.

    Science.gov (United States)

    Cadamuro, Janne; Mrazek, Cornelia; Leichtle, Alexander B; Kipman, Ulrike; Felder, Thomas K; Wiedemann, Helmut; Oberkofler, Hannes; Fiedler, Georg M; Haschke-Becher, Elisabeth

    2018-02-15

    Although centrifugation is performed on almost every blood sample, recommendations on duration and g-force are heterogeneous and mostly based on expert opinions. In order to unify this step in a fully automated laboratory, we aimed to evaluate different centrifugation settings and their influence on the results of routine clinical chemistry analytes. We collected blood from 41 healthy volunteers into BD Vacutainer PST II-heparin-gel- (LiHepGel), BD Vacutainer SST II-serum-, and BD Vacutainer Barricor heparin-tubes with a mechanical separator (LiHepBar). Tubes were centrifuged at 2000xg for 10 minutes and at 3000xg for 7 and 5 minutes, respectively. Subsequently, 60 and 21 clinical chemistry analytes were measured in plasma and serum samples, respectively, using a Roche COBAS instrument. High-sensitivity troponin T, pregnancy-associated plasma protein A, β-human chorionic gonadotropin and rheumatoid factor had to be excluded from statistical evaluation as many of the respective results were below the measuring range. Except for free haemoglobin (fHb) measurements, no analyte result was altered by the use of shorter centrifugation times at higher g-forces. Comparing LiHepBar to LiHepGel tubes at different centrifugation settings, we found higher lactate-dehydrogenase (LD) values in the latter (P = 0.003). We conclude that samples may be centrifuged at higher speed (3000xg) for a shorter time (5 minutes) without alteration of the analytes tested in this study. When using LiHepBar tubes for blood collection, a separate LD reference value might be needed.
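
The tube comparison described above reduces, for each analyte, to a paired comparison of results from the same volunteers under two centrifugation settings. As an illustration of that kind of evaluation (the abstract does not detail the study's actual statistical methodology), a minimal paired t-statistic in pure Python is:

```python
import math

def paired_t(a, b):
    """Paired t-statistic for two result sets measured on the same samples,
    e.g. one analyte after two different centrifugation settings."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)
```

The statistic is referred to a t distribution with n-1 degrees of freedom; a |t| near zero supports the conclusion that a shorter, faster spin leaves the analyte unaltered.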

  6. Use of nuclear and related analytical techniques in environmental research as exemplified by selected air pollution studies

    International Nuclear Information System (INIS)

    Smodis, B.; Jacimovic, R.; Jeran, Z.; Stropnik, B.; Svetina, M.

    2000-01-01

    Among nuclear and nuclear-related analytical techniques, neutron activation analysis and X-ray fluorescence spectrometry have proved particularly useful for environmental studies owing to their nondestructive character and multi-element capability. This paper emphasizes their importance among other multielement analytical methods by discussing the specific role that follows from their physical basis, quite different from that of destructive non-nuclear methods, and by summarizing results obtained in several studies related to air pollution research, including analyses of airborne particulate matter, water samples, lichens and mosses. (author)

  7. The effect of porosity on the mechanical properties of porous titanium scaffolds: comparative study on experimental and analytical values

    Science.gov (United States)

    Khodaei, Mohammad; Fathi, Mohammadhossein; Meratian, Mahmood; Savabi, Omid

    2018-05-01

    Reducing the elastic modulus while improving biological fixation to the bone is possible by using porous scaffolds. In the present study, porous titanium scaffolds with different porosities were fabricated using the space-holder method. Pore distribution, phase composition and mechanical properties of the titanium scaffolds were studied by scanning electron microscopy (SEM), X-ray diffraction (XRD) and cold compression testing. The results of the compression tests were then compared to the Gibson-Ashby model. Both the experimentally measured and the analytically calculated elastic modulus of the porous titanium scaffolds decreased with increasing porosity. The agreement between the experimentally measured and analytically calculated elastic moduli also improved with increasing porosity.
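
The Gibson-Ashby model referenced above relates foam stiffness to relative density, E/E_s = C (rho/rho_s)^n, where the relative density equals 1 - porosity. A hedged sketch follows; C = 1 and n = 2 are the textbook open-cell defaults, and real scaffolds require constants fitted to experiment:

```python
def gibson_ashby_modulus(e_solid_gpa, porosity, c=1.0, n=2.0):
    """Gibson-Ashby scaling law for cellular solids:
        E / E_s = C * (rho / rho_s)**n
    with relative density (rho / rho_s) = 1 - porosity.
    e_solid_gpa: elastic modulus of the fully dense solid (GPa).
    porosity: volume fraction of pores, between 0 and 1.
    """
    rel_density = 1.0 - porosity
    return e_solid_gpa * c * rel_density ** n
```

For solid titanium (E_s of roughly 110 GPa, an assumed round number), 60% porosity gives about 17.6 GPa, illustrating how porosity brings the scaffold modulus toward that of bone.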

  8. A Factor Analytic Study of the Teaching Events Stress Inventory.

    Science.gov (United States)

    Alexander, Livingston; And Others

    The purpose of this study was to determine if definitive factors emerge from the responses of teachers to the Teaching Events Stress Inventory (TESI). In a series of three studies during the years 1980 to 1982, data were collected to assess the levels and sources of stress experienced by 660 teachers in central and western Kentucky. The subjects…

  9. A Factor Analytic Study of the Internet Usage Scale

    Science.gov (United States)

    Monetti, David M.; Whatley, Mark A.; Hinkle, Kerry T.; Cunningham, Kerry T.; Breneiser, Jennifer E.; Kisling, Rhea

    2011-01-01

    This study developed an Internet Usage Scale (IUS) for use with adolescent populations. The IUS is a 26-item scale that measures participants' beliefs about how their Internet usage impacts their behavior. The sample for this study consisted of 947 middle school students. An exploratory factor analysis with varimax rotation was conducted on the…

  10. Minorities in Islamic History: An Analytical Study of Four Documents ...

    African Journals Online (AJOL)

    Journal for Islamic Studies, Vol 20 (2000).

  11. Analytical study on the determination of boron in environmental water samples

    International Nuclear Information System (INIS)

    Lopez, F.J.; Gimenez, E.; Hernandez, F.

    1993-01-01

    An analytical study on the determination of boron in environmental water samples was carried out. The curcumin and carmine standard methods were compared with the most recent Azomethine-H method in order to evaluate their analytical characteristics and feasibility for the analysis of boron in water samples. Analyses of synthetic water, ground water, sea water and waste water samples were carried out and a statistical evaluation of the results was made. The Azomethine-H method was found to be the most sensitive (detection limit 0.02 mg/l) and selective (no interference of commonly occurring ions in water was observed), showing also the best precision (relative standard deviation lower than 4%). Moreover, it gave good results for all types of samples analyzed. The accuracy of this method was tested by the addition of known amounts of standard solutions to different types of water samples. The slopes of standard additions and direct calibration graphs were similar and recoveries of added boron ranged from 99 to 107%. (orig.)
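
The accuracy check described above (standard additions compared against direct calibration) has a compact numerical core: fit the instrument response against the added concentration and read the sample concentration from the x-intercept. A minimal sketch with made-up numbers:

```python
def standard_additions(added, signal):
    """Estimate an analyte concentration by the method of standard additions.

    added: concentrations spiked into equal aliquots of the sample.
    signal: instrument response for each spiked aliquot.
    Fits signal = a + b * added by ordinary least squares; the sample
    concentration is the magnitude of the x-intercept, i.e. a / b.
    """
    n = len(added)
    mx = sum(added) / n
    my = sum(signal) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(added, signal))
         / sum((x - mx) ** 2 for x in added))   # slope
    a = my - b * mx                             # intercept
    return a / b
```

If the standard-additions slope matches the direct-calibration slope, as reported in this study, the sample matrix neither suppresses nor enhances the signal, and recoveries near 100% are expected.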

  12. Green supply chain management strategy selection using analytic network process: case study at PT XYZ

    Science.gov (United States)

    Adelina, W.; Kusumastuti, R. D.

    2017-01-01

    This study is about business strategy selection for green supply chain management (GSCM) at PT XYZ using the Analytic Network Process (ANP). GSCM is adopted as a response to reduce environmental impacts from industrial activities. The purposes of this study are identifying criteria and sub criteria in selecting a GSCM strategy, and analysing a suitable GSCM strategy for PT XYZ. This study proposes an ANP network with 6 criteria and 29 sub criteria, which are obtained from the literature and experts’ judgements. One of the six criteria contains the GSCM strategy options, namely risk-based strategy, efficiency-based strategy, innovation-based strategy, and closed-loop strategy. ANP solves complex GSCM strategy selection by using a more structured process and considering green perspectives from experts. The result indicates that the innovation-based strategy is the most suitable green supply chain management strategy for PT XYZ.
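
ANP (like the AHP it builds on) derives priority weights from pairwise comparison matrices via their principal eigenvector. A minimal power-iteration sketch for a single comparison matrix follows; the full ANP supermatrix with 6 criteria and 29 sub criteria is well beyond a toy example:

```python
def ahp_weights(m, iters=100):
    """Principal-eigenvector priority weights for a pairwise comparison
    matrix m (AHP/ANP style), computed by power iteration.
    m[i][j] expresses how strongly criterion i is preferred over j;
    a perfectly consistent matrix has m[i][j] = w_i / w_j.
    Returns weights normalized to sum to 1.
    """
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w
```

For a perfectly consistent matrix the recovered weights are exact; in practice a consistency-ratio check on the comparison matrix precedes their use.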

  13. Analytical study for Japan's energy system with MARKAL model

    International Nuclear Information System (INIS)

    Koyama, Shigeo; Kashihara, Toshinori; Endo, Eiichi

    1984-01-01

    Taking part in the 1982 collaboration activity of the Energy Technology Systems Analysis Project (ETSAP), which was started in November, 1980 by the International Energy Agency (IEA), the authors have analyzed extensive scenarios, including common scenarios of the Project, using a version of energy system model MARKAL programmed by their group and input data set up with various ideas. Important points to be considered in conducting the analysis and noteworthy results obtained from the analysis of Japan's energy systems are given. (AUTHOR)

  14. Learning and study strategies: a learning analytics approach for feedback

    OpenAIRE

    De Laet, Tinne; Broos, Tom; Pinxten, Maarten; Vanhoudt, Joke; Verbert, Katrien; Van Soom, Carolien; Langie, Greet

    2017-01-01

    Due to the open entrance in the Flemish (Belgium) higher education system (any student with a secondary education diploma can enter the program), a substantial part of the first-year students enters without the right qualifications, resulting in an overall drop-out rate of around 40% in the Faculties of Science, Engineering Science, Engineering Technology, and Bio-engineering at the KU Leuven. Therefore, KU Leuven staff invest heavily in advising students before and througho...

  15. Analytical quality control in studies of environmental exposure to mercury

    International Nuclear Information System (INIS)

    Byrne, A.R.; Prosenc, N.; Smerke, J.; Horvat, M.

    1995-01-01

    The work of the laboratory for quality control in this co-ordinated project for the period from November 1993 to June 1994 is presented. The major effort was devoted to assisting in establishing the homogeneity and total methylmercury levels in two new hair reference materials prepared as control materials for the project, numbered 085 (spiked) and 086 (natural level). Results for some hair materials from participants are also given. (author)

  16. Software for Calculating Position Changes Using the Fuzzy Analytical Hierarchy Process (case study: UIN Sunan Ampel Surabaya)

    Directory of Open Access Journals (Sweden)

    Ilham

    2017-04-01

    Full Text Available This study discusses promotion within an institution, which involves many factors and must be conducted objectively, not subjectively. One method that can provide an objective assessment of every employee across all assessment criteria is the Fuzzy Analytical Hierarchy Process (AHP). The results of the research using the Fuzzy AHP method showed that the candidate Virmansyah obtained the greatest weight, 80.78, and therefore has the best chance of a position change or promotion. For position-change decisions, the method yields a ranked list of candidates as a promotion recommendation. A Decision Support System based on the Fuzzy AHP method can thus help managers plan careers (promotion or transfer), saving time and cost while remaining objective.
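    The core AHP computation behind such rankings — priority weights from the principal eigenvector of a pairwise-comparison matrix, plus Saaty's consistency check — can be sketched as follows (illustrative 3×3 matrix, not the thesis data; the fuzzification step of Fuzzy AHP is omitted):

```python
import numpy as np

# Saaty pairwise-comparison matrix for 3 criteria (illustrative):
# A[i, j] says how much more important criterion i is than j.
A = np.array([
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
])

# Priority weights: normalized principal eigenvector
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CI against Saaty's random index RI(3) = 0.58;
# CR < 0.1 means the judgements are acceptably consistent.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(w, 3))
print("consistency ratio:", round(CR, 3))
```

    The same eigenvector step, applied to fuzzified (e.g., triangular-number) comparison matrices after defuzzification, produces the candidate weights such as the 80.78 score reported above.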

  17. A Study of Online Exams Procrastination Using Data Analytics Techniques

    Science.gov (United States)

    Levy, Yair; Ramim, Michelle M.

    2012-01-01

    Procrastination appears to be an inevitable part of daily life, especially for activities that are bounded by deadlines. It has implications for performance and is known to be linked to poor personal time management. Although research related to procrastination as a general behavior has been well established, studies assessing procrastination in…

  18. Assessing Vocal Performances Using Analytical Assessment: A Case Study

    Science.gov (United States)

    Gynnild, Vidar

    2016-01-01

    This study investigated ways to improve the appraisal of vocal performances within a national academy of music. Since a criterion-based assessment framework had already been adopted, the conceptual foundation of an assessment rubric was used as a guide in an action research project. The group of teachers involved wanted to explore thinking…

  19. An analytical model for the study of a small LFR core dynamics: development and benchmark

    International Nuclear Information System (INIS)

    Bortot, S.; Cammi, A.; Lorenzi, S.; Moisseytsev, A.

    2011-01-01

    An analytical model for the study of a small Lead-cooled Fast Reactor (LFR) control-oriented dynamics has been developed, aimed at providing a useful, very flexible and straightforward, though accurate, tool allowing relatively quick transient design-basis and stability analyses. A simplified lumped-parameter approach has been adopted to couple neutronics and thermal-hydraulics: the point-kinetics approximation has been employed and an average-temperature heat-exchange model has been implemented. The reactor transient responses following postulated accident initiators such as Unprotected Control Rod Withdrawal (UTOP), Loss of Heat Sink (ULOHS) and Loss of Flow (ULOF) have been studied for a MOX and a metal-fuelled core at the Beginning of Cycle (BoC) and End of Cycle (EoC) configurations. A benchmark analysis has then been performed by means of the SAS4A/SASSYS-1 Liquid Metal Reactor Code System, in which a core model based on three representative channels has been built with the purpose of providing verification for the analytical outcomes and indicating how the latter relate to more realistic one-dimensional calculations. As a general result, responses concerning the main core characteristics (namely, power, reactivity, etc.) have turned out to be mutually consistent in terms of both steady-state absolute figures and transient developments, showing discrepancies of only a few percent, thus confirming a very satisfactory agreement. (author)
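    The point-kinetics approximation mentioned above reduces the core neutronics to a few coupled ODEs; a minimal one-delayed-group sketch with generic fast-core parameters (not the LFR study's data) for a small step reactivity insertion is:

```python
# One-group point kinetics:
#   dn/dt = (rho - beta)/Lambda * n + lam * C
#   dC/dt = beta/Lambda * n - lam * C
beta   = 0.0035      # delayed-neutron fraction (generic fast-core value)
Lambda = 4e-7        # prompt-neutron generation time [s]
lam    = 0.08        # one-group precursor decay constant [1/s]
rho    = 0.1 * beta  # +0.1 $ step reactivity insertion

n = 1.0                        # relative power, initially at steady state
C = beta * n / (Lambda * lam)  # equilibrium precursor concentration

# Explicit Euler with a small step; the fast prompt mode (~(beta-rho)/Lambda
# per second) keeps dt = 1e-5 s inside the stability limit. A production
# code would use an implicit/stiff solver instead.
dt, t_end = 1e-5, 0.5
for _ in range(int(t_end / dt)):
    dn = ((rho - beta) / Lambda * n + lam * C) * dt
    dC = (beta / Lambda * n - lam * C) * dt
    n += dn
    C += dC
print(f"relative power after {t_end} s: {n:.3f}")
```

    The power exhibits the expected prompt jump to roughly beta/(beta - rho) of nominal, followed by a slow rise on the delayed-neutron timescale; a full model like the paper's couples these equations to reactivity feedback from the average-temperature thermal-hydraulics.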

  20. Possibilities and limits of multiprofessional attention in the care of psychiatric emergencies: analytical study

    Directory of Open Access Journals (Sweden)

    Fernanda Lima de Paula

    2017-05-01

    Full Text Available Goal: to analyze the possibilities and limits of multiprofessional care in the attention to psychiatric emergencies. Method: an analytical study in the form of an integrative literature review. Searches were conducted in the Latin American and Caribbean Literature (LILACS) and Nursing Database (BDENF) databases and in the SciELO Virtual Library, using the Health Sciences Descriptors (DeCS) “Emergency Services, Psychiatric”, “Forensic Psychiatry” and “Psychiatric Rehabilitation”, for the period from 2007 to 2017. Results: after data analysis, two thematic categories emerged: “Possibilities and limits in multiprofessional care for patients in crisis” and “The continuity of care to the patient in crisis by the multiprofessional team”. The studies point out fragility in the management of multiprofessional care for patients in psychiatric crisis. Therefore, in the services that substitute for the psychiatric hospital, it is necessary to strengthen the care and bonding tools so that treatment continues after these patients' psychiatric emergencies. Conclusion: this research deepened knowledge of the challenges faced by the multiprofessional team in the care of psychiatric emergencies and of the patient in crisis, considering the main multiprofessional actions, how this approach is carried out, and patient follow-up. Descriptors: Emergency Services, Psychiatric. Forensic Psychiatry. Psychiatric Rehabilitation.

  1. EMPLOYEE PROMOTION PLANNING IN ANALYTICAL HIERARCHY PROCESS PERSPECTIVE: STUDY ON NATIONAL PUBLIC PROCUREMENT AGENCY

    Directory of Open Access Journals (Sweden)

    Ayuningtyas A.K.

    2017-10-01

    Full Text Available The promotion process is part of career development for Civil State Apparatus (Pegawai Aparatur Sipil Negara, ASN) employees and should be implemented by applying a merit system. Employee-related strategic decision making, however, has not applied the merit system as mandated by the applicable laws. This occurred because the Public Service Appointment Board (Badan Pertimbangan Jabatan dan Kepangkatan, Baperjakat) did not possess an assessment model and criteria that could support placing employees in the appropriate structural positions based on competence and performance. This study aims to describe and analyze the assessment criteria and subcriteria to be considered in civil-servant promotion planning by applying the Analytical Hierarchy Process (AHP) method in the National Public Procurement Agency (Lembaga Kebijakan Pengadaan Barang/Jasa Pemerintah). The study uses an explanative, quantitative, univariate method; data were collected with a questionnaire and analyzed with AHP. The results show the following assessment model for ASN promotion planning: Employee Performance Assessment, consisting of an Employee Work Performance element with three criteria and an Employee Work Behavior element with twenty-three criteria; and Evaluation of Employee Promotion Implementation, with eleven criteria. Through the AHP method, the promotion-planning model can serve as a tool for Baperjakat to produce objective and effective promotion decisions.

  2. An analytical model and parametric study of electrical contact resistance in proton exchange membrane fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Zhiliang; Wang, Shuxin; Zhang, Lianhong [School of Mechanical Engineering, Tianjin University, Tianjin 300072 (China); Hu, S. Jack [Department of Mechanical Engineering, The University of Michigan, Ann Arbor, MI 48109-2125 (United States)

    2009-04-15

    This paper presents an analytical model of the electrical contact resistance between the carbon paper gas diffusion layers (GDLs) and the graphite bipolar plates (BPPs) in a proton exchange membrane (PEM) fuel cell. The model is developed based on the classical statistical contact theory for a PEM fuel cell, using the same probability distributions of the GDL structure and BPP surface profile as previously described in Wu et al. [Z. Wu, Y. Zhou, G. Lin, S. Wang, S.J. Hu, J. Power Sources 182 (2008) 265-269] and Zhou et al. [Y. Zhou, G. Lin, A.J. Shih, S.J. Hu, J. Power Sources 163 (2007) 777-783]. Results show that estimates of the contact resistance compare favorably with experimental data by Zhou et al. [Y. Zhou, G. Lin, A.J. Shih, S.J. Hu, J. Power Sources 163 (2007) 777-783]. Factors affecting the contact behavior are systematically studied using the analytical model, including the material properties of the two contact bodies and factors arising from the manufacturing processes. The transverse Young's modulus of chopped carbon fibers in the GDL and the surface profile of the BPP are found to be significant to the contact resistance. The factor study also sheds light on the manufacturing requirements of carbon fiber GDLs for a better contact performance in PEM fuel cells. (author)
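    Statistical contact models of this kind treat the GDL/BPP interface as many micro-contact spots acting in parallel, each contributing a Holm constriction resistance R = ρ/(2a) for a spot of radius a; a much-simplified sketch with invented spot statistics (not the paper's fitted GDL structure and BPP surface-profile distributions) is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters only: an effective contact resistivity for the
# carbon-fiber/graphite pair and a made-up distribution of micro-contact
# spot radii under a given clamping load.
rho_eff = 1.0e-4   # effective contact resistivity [ohm*m]
n_spots = 2000     # number of micro-contact spots at the interface
radii = rng.gamma(shape=2.0, scale=2.5e-6, size=n_spots)  # spot radii [m]

# Holm constriction resistance of each spot; spots conduct in parallel
R_spot = rho_eff / (2.0 * radii)
R_contact = 1.0 / np.sum(1.0 / R_spot)
print(f"interface contact resistance ~ {R_contact * 1e3:.2f} mOhm")
```

    In the actual model, the number and size of spots follow from the statistical description of the fiber network and plate roughness as functions of clamping pressure, which is what lets the authors study material and manufacturing factors parametrically.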

  3. Experimental and analytical studies on the vibration serviceability of long-span prestressed concrete floor

    Science.gov (United States)

    Cao, Liang; Liu, Jiepeng; Li, Jiang; Zhang, Ruizhi

    2018-04-01

    An extensive experimental and theoretical study was undertaken to investigate the vibration serviceability of a long-span prestressed concrete floor system to be used in the lounge of a major airport. Specifically, jumping impact tests were carried out to obtain the floor's modal parameters, followed by an analysis of the distribution of peak accelerations. Running tests were also performed to capture the acceleration responses. The prestressed concrete floor was found to have a low fundam