WorldWideScience

Sample records for standard analysis techniques

  1. New quantitative safety standards: different techniques, different results?

    International Nuclear Information System (INIS)

    Rouvroye, J.L.; Brombacher, A.C.

    1999-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many factors can influence the safety of a SIS, such as system layout, diagnostics, testing and repair. Standards like the German DIN demand no quantitative analysis (DIN V 19250 Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, Berlin, 1994; DIN/VDE 0801 Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Berlin, 1990); the analysis according to these standards is based on expert opinion and qualitative analysis techniques. Newer standards like IEC 61508 (IEC 61508 Functional safety of electrical/electronic/programmable electronic safety-related systems, IEC, Geneve, 1997) and ISA-S84.01 (ISA-S84.01.1996 Application of Safety Instrumented Systems for the Process Industries, Instrument Society of America, Research Triangle Park, 1996) require quantitative risk analysis but do not prescribe how to perform it. Earlier publications by the authors (Rouvroye et al., Uncertainty in safety, new techniques for the assessment and optimisation of safety in process industry, D W. Pyatt (ed), SERA-Vol. 4, Safety engineering and risk analysis, ASME, New York 1995; Rouvroye et al., A comparison study of qualitative and quantitative analysis techniques for the assessment of safety in industry, P.C. Cacciabue, I.A. Papazoglou (eds), Proceedings PSAM III conference, Crete, Greece, June 1996) have shown that different analysis techniques cover different aspects of system behaviour. This paper shows, by means of a case study, that different (quantitative) analysis techniques may lead to different results. The consequence is that applying the standards to practical systems will not always yield unambiguous results. The authors therefore propose a technique to overcome this major disadvantage.
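
    To make the record's central point concrete, the following is a minimal illustrative sketch (not the authors' case study): it compares two quantitative techniques, a simplified IEC 61508-style analytic equation and a Monte Carlo simulation, for the average probability of failure on demand (PFDavg) of a hypothetical 1oo1 safety function. All parameter values are assumed.

```python
import random

# Hypothetical SIS parameters (not from the paper): a 1oo1 safety function
# with dangerous undetected failure rate lam (per hour) and proof-test
# interval ti (hours).
lam = 2e-6      # dangerous undetected failures per hour (assumed)
ti = 8760.0     # proof-test interval: one year, in hours

# Technique 1: simplified analytic equation, PFDavg ~ lam * ti / 2,
# valid for lam * ti << 1 (common in IEC 61508-style calculations).
pfd_analytic = lam * ti / 2

# Technique 2: Monte Carlo simulation of the same system. Sample failure
# times; the component is unavailable from failure until the next proof test.
random.seed(1)
trials, downtime, horizon = 200_000, 0.0, ti
for _ in range(trials):
    t_fail = random.expovariate(lam)       # time of the hidden failure
    if t_fail < horizon:                   # failed before the proof test
        downtime += horizon - t_fail       # undetected until test/repair
pfd_mc = downtime / (trials * horizon)

print(f"analytic PFDavg : {pfd_analytic:.3e}")
print(f"Monte Carlo PFD : {pfd_mc:.3e}")
# The two estimates agree here because the model is trivial; once diagnostics,
# common-cause failures and repair are modelled, different techniques start
# to diverge, which is the effect the paper demonstrates on a real case study.
```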

  2. Depth profile analysis of thin TiOxNy films using standard ion beam analysis techniques and HERDA

    International Nuclear Information System (INIS)

    Markwitz, A.; Dytlewski, N.; Cohen, D.

    1999-01-01

    Ion beam assisted deposition is used to fabricate thin titanium oxynitride films (TiOxNy) at Industrial Research (typical film thickness 100 nm). At the Institute of Geological and Nuclear Sciences, the thin films are analysed using non-destructive standard ion beam analysis (IBA) techniques. High-resolution titanium depth profiles are measured with RBS using 1.5 MeV 4He+ ions. Non-resonant nuclear reaction analysis (NRA) is performed to investigate the amounts of O and N in the deposited films using the reactions 16O(d,p)17O at 920 keV and 14N(d,α)12C at 1.4 MeV. Using a combination of these nuclear techniques, the stoichiometry as well as the thickness of the layers is revealed. However, when oxygen and nitrogen depth profiles are required for investigating stoichiometric changes in the films, additional nuclear analysis techniques such as heavy ion elastic recoil detection (HERDA) have to be applied. With HERDA, depth profiles of N, O, and Ti are measured simultaneously. In this paper, comparative IBA measurements of TiOxNy films with different compositions are presented and discussed
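
    A minimal sketch of how stoichiometry and layer thickness follow from IBA-style areal densities; all numbers below are hypothetical, not measurements from the paper.

```python
# Hypothetical areal densities (atoms/cm^2) as RBS/NRA would yield them
# (values are illustrative only).
N_Ti = 4.0e17
N_O  = 3.2e17
N_N  = 2.4e17

# Stoichiometry of TiOxNy: normalise O and N to one Ti atom.
x = N_O / N_Ti
y = N_N / N_Ti
print(f"film stoichiometry: TiO{x:.2f}N{y:.2f}")

# Thickness estimate: total areal density divided by an assumed atomic
# density of the compound film (atoms/cm^3); 8.5e22 is a rough
# order-of-magnitude guess for a dense Ti-O-N layer.
n_atomic = 8.5e22
thickness_nm = (N_Ti + N_O + N_N) / n_atomic * 1e7  # cm -> nm
print(f"estimated thickness: {thickness_nm:.0f} nm")
```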

  3. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for determining the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique, and the data were compared with the already known data of these certified SRMs. The samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charge, at three different times. A proton beam of 2.57 MeV obtained from a 5UDH-II Pelletron accelerator was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  4. Proposed minimum reporting standards for chemical analysis Chemical Analysis Working Group (CAWG) Metabolomics Standards Initiative (MSI)

    Science.gov (United States)

    Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.

    2013-01-01

    There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments, with the Chemical Analysis Working Group (CAWG) forming one part of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments, including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616

  5. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  6. Standard Establishment Through Scenarios (SETS): A new technique for occupational fitness standards.

    Science.gov (United States)

    Blacklock, R E; Reilly, T J; Spivock, M; Newton, P S; Olinek, S M

    2015-01-01

    An objective and scientific task analysis provides the basis for establishing legally defensible Physical Employment Standards (PES) based on common and essential occupational tasks. Infrequent performance of these tasks creates challenges when developing PES based on criterion or content validity. The objective was to develop a systematic approach using Subject Matter Experts (SME) to provide tasks with 1) an occupationally relevant scenario considered common to all personnel; and 2) a minimum performance standard defined by time, distance, load or work. Examples provided here relate to the development of a new PES for the Canadian Armed Forces (CAF). SME with varying levels of experience are selected based on eligibility criteria. SME are required to define a reasonable scenario for each task from personal experience, provide occupational performance requirements of the scenario in sub-groups, and discuss and agree by consensus vote on the final standard based on the definition of 'essential'. A common and essential task for the CAF is detailed as a case example of the process application. Techniques to avoid common SME rating errors are discussed and advantages of the method described. The SETS method was developed as a systematic approach to setting occupational performance standards and qualifying information from SME.

  7. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images.

    Science.gov (United States)

    Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin

    2017-12-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
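
    The core of the SLICE idea, strain as the relative change of segment length between anatomical landmarks across cine phases, can be sketched as follows; the landmark coordinates here are hypothetical, not patient data.

```python
import math

def segment_length(p1, p2):
    """Euclidean distance between two anatomical landmarks (x, y in mm)."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def slice_strain(landmark_a, landmark_b):
    """Lagrangian strain (%) of one segment across all cine phases,
    referenced to the first frame (end-diastole)."""
    lengths = [segment_length(a, b) for a, b in zip(landmark_a, landmark_b)]
    l0 = lengths[0]
    return [100.0 * (l - l0) / l0 for l in lengths]

# Hypothetical landmark tracks over four cine phases (mm):
a = [(0.0, 0.0), (0.2, 0.1), (0.5, 0.2), (0.3, 0.1)]
b = [(20.0, 0.0), (18.9, 0.4), (17.8, 0.6), (18.7, 0.3)]
print([f"{s:.1f}%" for s in slice_strain(a, b)])
# Negative values indicate segmental shortening, as expected for
# circumferential strain during systole.
```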

  8. Phytochemical analysis and standardization of Strychnos nux-vomica extract through HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar Patel

    2012-05-01

    Objective: The objective is to develop a novel qualitative and quantitative method by which different phytoconstituents of Strychnos nux-vomica L. can be determined. Methods: To profile the phytoconstituents of Strychnos nux-vomica, in the present study the hydroalcoholic extract of Strychnos nux-vomica was subjected to preliminary phytochemical analysis, antimicrobial activity tests against certain pathogenic microorganisms, a solubility test, loss on drying and pH measurement. The extract was also subjected to quantitative analysis, including total phenol, flavonoid and heavy metal analysis. Quantitative analysis was performed through HPTLC methods using strychnine and brucine as standard markers. Results: Phytochemical analysis revealed the presence of alkaloids, carbohydrates, tannins, steroids, triterpenoids and glycosides in the extract. Total flavonoid and phenol contents of the Strychnos nux-vomica L. extract were found to be 0.40% and 0.43%. Results showed that the levels of heavy metals (lead, arsenic, mercury and cadmium) comply with the standard limits. Total bacterial count and yeast and mould contents were found to be within limits, whereas E. coli and Salmonella were found to be absent in the extract. Contents of strychnine and brucine were found to be 4.75% and 3.91%. Conclusions: These studies provide valuable information for correct identification and selection of the drug among various adulterants. In future this study will be helpful for the quantitative analysis as well as standardization of Strychnos nux-vomica L.
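
    As an illustration of the HPTLC quantitation step, here is a minimal one-point external-standard calculation; the peak areas and amounts are invented for the example and are not the paper's raw data.

```python
# One-point external-standard quantitation as commonly used in HPTLC work
# (illustrative values only).
area_standard = 5120.0      # peak area of the strychnine standard track
amount_standard = 200.0     # ng of strychnine applied on the standard track

area_sample = 4300.0        # peak area of strychnine in the extract track
extract_applied = 4000.0    # ng of extract applied on the sample track

# Content follows from the area ratio against the known standard amount.
amount_in_sample = area_sample / area_standard * amount_standard
content_percent = 100.0 * amount_in_sample / extract_applied
print(f"strychnine content: {content_percent:.2f}% of the extract")
```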

  9. Radiographic analysis of the temporomandibular joint by the standardized projection technique

    International Nuclear Information System (INIS)

    Choe, Han Up; Park, Tae Won

    1983-01-01

    The purpose of this study was to investigate the radiographic images of the condylar head in clinically normal subjects and in TMJ patients using a standardized projection technique. Forty-five subjects with no clinical evidence of TMJ problems and 96 patients with clinical evidence of TMJ problems were evaluated; patients who had fractures, trauma or tumors in the TMJ area were excluded from this study. For the evaluation of radiographic images, the author observed the condylar head positions in the closed-mouth and 2.54 cm open-mouth positions taken by the standardized transcranial oblique lateral projection technique. The results were as follows: 1. In the closed-mouth position, the crest of the condylar head took a relatively posterior position to the deepest point of the glenoid fossa in 8.9% of the normals and in 26.6% of TMJ patients. 2. In the 2.54 cm open-mouth position, the condylar head took a relatively posterior position to the articular eminence in 2.2% of TMJ patients and 39.6% of the normals. 3. In the open-mouth position, the horizontal distance from the deepest point of the glenoid fossa to the condylar head was 13.96 mm in the normals and 10.68 mm in TMJ patients. 4. The distance of true movement of the condylar head was 13.49 mm in the normals and 10.27 mm in TMJ patients. 5. The deviation of the mandible in TMJ patients was slightly greater than that of the normals.

  10. Cost minimisation analysis of using acellular dermal matrix (Strattice™) for breast reconstruction compared with standard techniques.

    Science.gov (United States)

    Johnson, R K; Wright, C K; Gandhi, A; Charny, M C; Barr, L

    2013-03-01

    We performed a cost analysis (using UK 2011/12 NHS tariffs as a proxy for cost) comparing immediate breast reconstruction using the new one-stage technique of acellular dermal matrix (Strattice™) with implant versus the standard alternative techniques of tissue expander (TE)/implant as a two-stage procedure and latissimus dorsi (LD) flap reconstruction. Clinical report data were collected for operative time, length of stay, outpatient procedures, and number of elective and emergency admissions in our first consecutive 24 patients undergoing one-stage Strattice reconstruction. Total cost to the NHS based on tariff, assuming top-up payments to cover Strattice acquisition costs, was assessed and compared to the two historical control groups matched on key variables. Eleven patients having unilateral Strattice reconstruction were compared to 10 having TE/implant reconstruction and 10 having LD flap and implant reconstruction. Thirteen patients having bilateral Strattice reconstruction were compared to 12 having bilateral TE/implant reconstruction. Total costs were: unilateral Strattice, £3685; unilateral TE, £4985; unilateral LD and implant, £6321; bilateral TE, £5478; and bilateral Strattice, £6771. The cost analysis shows a financial advantage of using acellular dermal matrix (Strattice) in unilateral breast reconstruction versus alternative procedures. The reimbursement system in England (Payment by Results) is based on disease-related groups similar to that of many countries across Europe and tariffs are based on reported hospital costs, making this analysis of relevance in other countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Force coordination in static manipulation tasks performed using standard and non-standard grasping techniques.

    Science.gov (United States)

    de Freitas, Paulo B; Jaric, Slobodan

    2009-04-01

    We evaluated coordination of the hand grip force (GF; normal component of the force acting at the hand-object contact area) and load force (LF; the tangential component) in a variety of grasping techniques and two LF directions. Thirteen participants exerted a continuous sinusoidal LF pattern against externally fixed handles applying both standard (i.e., using either the tips of the digits or the palms; the precision and palm grasps, respectively) and non-standard grasping techniques (using wrists and the dorsal finger areas; the wrist and fist grasp). We hypothesized (1) that the non-standard grasping techniques would provide deteriorated indices of force coordination when compared with the standard ones, and (2) that the nervous system would be able to adjust GF to the differences in friction coefficients of various skin areas used for grasping. However, most of the indices of force coordination remained similar across the tested grasping techniques, while the GF adjustments for the differences in friction coefficients (highest in the palm and the lowest in the fist and wrist grasp) provided inconclusive results. As hypothesized, GF relative to the skin friction was lowest in the precision grasp, but highest in the palm grasp. Therefore, we conclude that (1) the elaborate coordination of GF and LF consistently seen across the standard grasping techniques could be generalized to the non-standard ones, while (2) the ability to adjust GF using the same grasping technique to the differences in friction of various objects cannot be fully generalized to the GF adjustment when different grasps (i.e., hand segments) are used to manipulate the same object. Due to the importance of the studied phenomena for understanding both the functional and neural control aspects of manipulation, future studies should extend the current research to the transient and dynamic tasks, as well as to the general role of friction in our mechanical interactions with the environment.
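
    A hedged sketch of two commonly reported force-coordination indices (mean GF/LF ratio and GF-LF correlation) on synthetic traces; the signals, the gain and the noise level are assumptions, not the study's data.

```python
import math
import random

# Synthetic traces (N) for one trial: LF follows the instructed sinusoid,
# GF tracks it with a proportional gain plus noise (illustrative signals).
random.seed(0)
n = 200
lf = [10 + 6 * math.sin(2 * math.pi * i / n) for i in range(n)]
gf = [1.4 * v + 0.5 + random.gauss(0, 0.4) for v in lf]  # assumed modulation

def mean(xs):
    return sum(xs) / len(xs)

def corr(xs, ys):
    """Pearson correlation between two equally long force traces."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (vx * vy)

gf_lf_ratio = mean([g / l for g, l in zip(gf, lf)])  # force-scaling index
coupling = corr(gf, lf)                              # force-coupling index

print(f"mean GF/LF ratio : {gf_lf_ratio:.2f}")
print(f"GF-LF correlation: {coupling:.3f}")
```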

  12. Standardization of surgical techniques used in facial bone contouring.

    Science.gov (United States)

    Lee, Tae Sung

    2015-12-01

    Since the introduction of facial bone contouring surgery for cosmetic purposes, various surgical methods have been used to improve the aesthetics of facial contours. In general, by standardizing the surgical techniques, it is possible to decrease complication rates and achieve more predictable surgical outcomes, thereby increasing patient satisfaction. The technical strategies used by the author to standardize facial bone contouring procedures are introduced here. The author uses various pre-manufactured surgical tools and hardware for facial bone contouring. During a reduction malarplasty or genioplasty procedure, double-bladed reciprocating saws and pre-bent titanium plates customized for the zygomatic body, arch and chin are used. Various guarded oscillating saws are used for mandibular angleplasty. The use of double-bladed saws and pre-bent plates to perform reduction malarplasty reduces the chances of post-operative asymmetry or under- or overcorrection of the zygoma contours due to technical faults. Inferior alveolar nerve injury and post-operative jawline asymmetry or irregularity can be reduced by using a guarded saw during mandibular angleplasty. For genioplasty, final placement of the chin in accordance with preoperative quantitative analysis can be easily performed with pre-bent plates, and a double-bladed saw allows more procedural accuracy during osteotomies. Efforts by the surgeon to avoid unintentional faults are key to achieving satisfactory results and reducing the incidence of complications. The surgical techniques described in this study in conjunction with various in-house surgical tools and modified hardware can be used to standardize techniques to achieve aesthetically gratifying outcomes. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. ASTM standards for fire debris analysis: a review.

    Science.gov (United States)

    Stauffer, Eric; Lentini, John J

    2003-03-12

    The American Society for Testing and Materials (ASTM) recently updated its standards E 1387 and E 1618 for the analysis of fire debris. The changes in the classification of ignitable liquids are presented in this review. Furthermore, a new standard on extraction of fire debris with solid phase microextraction (SPME) was released. Advantages and drawbacks of this technique are presented and discussed. Also, the standard on cleanup by acid stripping has not been reapproved. Fire debris analysts who use the standards should be aware of these changes.

  14. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    Energy Technology Data Exchange (ETDEWEB)

    Zweerink, Alwin; Allaart, Cornelis P.; Wu, LiNa; Beek, Aernout M.; Rossum, Albert C. van; Nijveldt, Robin [VU University Medical Center, Department of Cardiology, and Institute for Cardiovascular Research (ICaR-VU), Amsterdam (Netherlands); Kuijer, Joost P.A. [VU University Medical Center, Department of Physics and Medical Technology, Amsterdam (Netherlands); Ven, Peter M. van de [VU University Medical Center, Department of Epidemiology and Biostatistics, Amsterdam (Netherlands); Meine, Mathias [University Medical Center, Department of Cardiology, Utrecht (Netherlands); Croisille, Pierre; Clarysse, Patrick [Univ Lyon, UJM-Saint-Etienne, INSA, CNRS UMR 5520, INSERM U1206, CREATIS, Saint-Etienne (France)

    2017-12-15

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. (orig.)

  15. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    International Nuclear Information System (INIS)

    Zweerink, Alwin; Allaart, Cornelis P.; Wu, LiNa; Beek, Aernout M.; Rossum, Albert C. van; Nijveldt, Robin; Kuijer, Joost P.A.; Ven, Peter M. van de; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick

    2017-01-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. (orig.)

  16. Improvement of AC motor reliability from technique standardization

    International Nuclear Information System (INIS)

    Muniz, P.R.; Faria, M.D.R.; Mendes, M.P.; Silva, J.N.; Dos Santos, J.D.

    2005-01-01

    The purpose of this paper is to explain the increase in reliability of motors serviced in the Electrical Maintenance Shop of Companhia Siderurgica de Tubarao, achieved by standardizing the technique based on Brazilian and international standards, manufacturer's recommendations and the experience of the maintenance staff. (author)

  17. Paediatric sutureless circumcision-an alternative to the standard technique.

    LENUS (Irish Health Repository)

    2012-01-31

    INTRODUCTION: Circumcision is one of the most commonly performed surgical procedures in male children. A range of surgical techniques exist for this commonly performed procedure. The aim of this study is to assess the safety, functional outcome and cosmetic appearance of a sutureless circumcision technique. METHODS: Over a 9-year period, 502 consecutive primary sutureless circumcisions were performed by a single surgeon. All 502 cases were entered prospectively into a database including all relevant clinical details and a review was performed. The technique used to perform the sutureless circumcision is a modification of the standard sleeve technique with the use of a bipolar diathermy and the application of 2-octyl cyanoacrylate (2-OCA) to approximate the tissue edges. RESULTS: All boys in this study were pre-pubescent and the ages ranged from 6 months to 12 years (mean age 3.5 years). All patients had this procedure performed as a day case and under general anaesthetic. Complications included: haemorrhage (2.2%), haematoma (1.4%), wound infection (4%), allergic reaction (0.2%) and wound dehiscence (0.8%). Only 9 (1.8%) parents or patients were dissatisfied with the cosmetic appearance. CONCLUSION: The use of 2-OCA as a tissue adhesive for sutureless circumcisions is an alternative to the standard suture technique. The use of this tissue adhesive, 2-OCA, results in comparable complication rates to the standard circumcision technique and results in excellent post-operative cosmetic satisfaction.

  18. Preparation of uranium standard solutions for x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Wong, C.M.; Cate, J.L.; Pickles, W.L.

    1978-03-01

    A method has been developed for gravimetrically preparing uranium nitrate standards with an estimated mean error of 0.1% (1 sigma) and a maximum error of 0.2% (1 sigma) for the total uranium weight. Two source materials, depleted uranium dioxide powder and NBS Standard Reference Material 960 uranium metal, were used to prepare stock solutions. The NBS metal proved to be superior because of the small but inherent uncertainty in the stoichiometry of the uranium oxide. These solutions were used to prepare standards in a freeze-dried configuration suitable for x-ray fluorescence analysis. Both gravimetric and freeze-drying techniques are presented. Volumetric preparation was found to be unsatisfactory for 0.1% precision for the sample size of interest. One of the primary considerations in preparing uranium standards for x-ray fluorescence analysis is the development of a technique for dispensing a 50-μl aliquot of a standard solution with a precision of 0.1% and an accuracy of 0.1%. The method developed corrects for variation in aliquoting and for evaporation loss during weighing. Two sets, each containing 50 standards, have been produced. One set has been retained by LLL and one set by the Savannah River project

  19. Standardization of Berberis aristata extract through conventional and modern HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh K. Patel

    2012-05-01

    Objective: Berberis aristata (Berberidaceae) is an important medicinal plant found in different regions of the world. It has significant medicinal value in the traditional Indian and Chinese systems of medicine. The aim of the present investigation includes qualitative and quantitative analysis of Berberis aristata extract. Methods: The present study includes phytochemical analysis, a solubility test, heavy metal analysis, an antimicrobial study and quantitative analysis by the HPTLC method. Results: Preliminary phytochemical analysis showed the presence of carbohydrates, glycosides, alkaloids, proteins, amino acids, saponins, tannins and flavonoids. Solubility was found to be 81.90% in water and 84.52% in 50% alcohol. Loss on drying was found to be 5.32%. Total phenol and flavonoid contents were found to be 0.11% and 2.8%. Levels of lead, arsenic, mercury and cadmium comply with the standard limits. E. coli and Salmonella were found to be absent, whereas total bacterial count and yeast and mould contents were within limits. The content of berberine was found to be 13.47% through HPTLC techniques. Conclusions: The results obtained from the present studies could be used as a source of valuable information which can play an important role for food scientists, researchers and even consumers with regard to its standards.

  20. Relationship between alveolar bone measured by 125I absorptiometry with analysis of standardized radiographs: 2. Bjorn technique

    International Nuclear Information System (INIS)

    Ortman, L.F.; McHenry, K.; Hausmann, E.

    1982-01-01

    The Bjorn technique is widely used in periodontal studies as a standardized measure of alveolar bone. Recent studies have demonstrated the feasibility of using 125I absorptiometry to measure bone mass. The purpose of this study was to compare 125I absorptiometry with the Bjorn technique in detecting small sequential losses of alveolar bone. Four periodontal-like defects of incrementally increasing size were produced in alveolar bone in the posterior segment of the maxilla of a human skull. An attempt was made to sequentially reduce the amount of bone in 10% increments until no bone remained, a through-and-through defect. The bone remaining at each step was measured using 125I absorptiometry. At each site the 125I absorptiometry measurements were made at the same location by fixing the photon source to a prefabricated precision-made occlusal splint. This site was just beneath the crest and midway between the borders of two adjacent teeth. Bone loss was also determined by the Bjorn technique. Standardized intraoral films were taken using a custom-fitted acrylic clutch, and bone measurements were made from the root apex to the coronal height of the lamina dura. A comparison of the data indicates that: (1) in early bone loss, less than 30%, the Bjorn technique underestimates the amount of loss, and (2) in advanced bone loss, more than 60%, the Bjorn technique overestimates it

  1. International Standardization of Library and Documentation Techniques.

    Science.gov (United States)

    International Federation for Documentation, The Hague (Netherlands).

    This comparative study of the national and international standards, rules and regulations on library and documentation techniques adopted in various countries was conducted as a preliminary step in determining the minimal bases for facilitating national and international cooperation between documentalists and librarians. The study compares and…

  2. Gold-standard for computer-assisted morphological sperm analysis.

    Science.gov (United States)

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is aimed to be used as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, which is achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm
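
    The abstract names Fleiss' kappa as the inter-expert agreement measure; the following is a minimal self-contained implementation on a toy rating matrix (the data are invented, not the SCIAN-MorphoSpermGS labels).

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table where ratings[i][j] is the number of
    experts assigning image i to morphology class j."""
    n_items = len(ratings)
    n_raters = sum(ratings[0])          # raters per item (assumed constant)
    n_cats = len(ratings[0])

    # Observed agreement per item, averaged over items.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ) / n_items

    # Chance agreement from the marginal class proportions.
    totals = [sum(row[j] for row in ratings) for j in range(n_cats)]
    p_j = [t / (n_items * n_raters) for t in totals]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)

# Toy data: 4 images, 5 experts, classes = normal/tapered/pyriform/small/amorphous
ratings = [
    [5, 0, 0, 0, 0],
    [3, 2, 0, 0, 0],
    [0, 1, 3, 1, 0],
    [0, 0, 0, 2, 3],
]
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.3f}")
```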

  3. Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations

    International Nuclear Information System (INIS)

    Martens, Hans-Juergen von

    2010-01-01

    The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry, or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41), is at the Draft International Standard (DIS) stage and may be issued by the end of 2010. The standard methods with refined techniques have proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s²). The relative deviations between the amplitude measurement results of the different interferometer methods, applied simultaneously, differed by less than 1% in all cases.
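
    For sinusoidal motion, the interferometric displacement amplitude and the acceleration amplitude are linked by a = (2πf)²d; the sketch below uses the frequency and acceleration figures quoted in the abstract purely as a plausibility check, not as a reconstruction of the standardized procedure.

```python
import math

def accel_amplitude(displacement_m, freq_hz):
    """Acceleration amplitude of sinusoidal motion: a = (2*pi*f)^2 * d."""
    return (2 * math.pi * freq_hz) ** 2 * displacement_m

def displacement_amplitude(accel_ms2, freq_hz):
    """Inverse relation: the displacement the interferometer must resolve."""
    return accel_ms2 / (2 * math.pi * freq_hz) ** 2

# Figures quoted in the abstract: 347 kHz and 350 km/s^2.
d = displacement_amplitude(350e3, 347e3)
print(f"displacement amplitude at 347 kHz: {d * 1e9:.1f} nm")
# Roughly 74 nm, comfortably within reach of laser interferometry.
```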

  4. Multielement comparison of instrumental neutron activation analysis techniques using reference materials

    International Nuclear Information System (INIS)

    Ratner, R.T.; Vernetson, W.G.

    1995-01-01

    Several instrumental neutron activation analysis techniques (parametric, comparative, and k0-standardization) are evaluated using three reference materials. Each technique is applied to the National Institute of Standards and Technology standard reference materials SRM 1577a (Bovine Liver) and SRM 2704 (Buffalo River Sediment), and to the United States Geological Survey standard BHVO-1 (Hawaiian Basalt Rock). Identical (but not optimum) irradiation, decay, and counting schemes are employed with each technique to provide a basis for comparison and to determine sensitivities in a routine irradiation scheme. Fifty-one elements are used in this comparison; however, several elements are not detected in the reference materials due to rigid analytical conditions (e.g. insufficient irradiation length, or the activity of the radioisotope of interest decaying below the lower limit of detection before the counting interval). Most elements are normally distributed around certified or consensus values with a standard deviation of 10%. For some elements, discrepancies are observed and discussed. The accuracy, precision, and sensitivity of each technique are discussed by comparing the analytical results to consensus values for the Hawaiian Basalt Rock to demonstrate the diversity of multielement applications. (author) 4 refs.; 2 tabs
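
    A minimal sketch of the comparative-method equation underlying one of the evaluated techniques: the unknown concentration follows from the decay-corrected specific count rates of the sample and a co-irradiated standard. All numbers are illustrative, not the paper's results.

```python
import math

def comparative_naa(c_std, count_rate_std, mass_std,
                    count_rate_sam, mass_sam,
                    half_life_s, t_decay_std_s, t_decay_sam_s):
    """Comparative INAA: concentration from the ratio of decay-corrected
    specific count rates of sample and co-irradiated standard. Assumes
    identical flux, geometry and counting efficiency for both."""
    lam = math.log(2) / half_life_s
    a_std = count_rate_std / mass_std * math.exp(lam * t_decay_std_s)
    a_sam = count_rate_sam / mass_sam * math.exp(lam * t_decay_sam_s)
    return c_std * a_sam / a_std

# Illustrative numbers: Na via Na-24 (half-life about 15 h).
c = comparative_naa(c_std=1000.0,            # ug/g Na in the standard
                    count_rate_std=5200.0, mass_std=0.100,
                    count_rate_sam=2470.0, mass_sam=0.150,
                    half_life_s=15.0 * 3600,
                    t_decay_std_s=3600.0, t_decay_sam_s=7200.0)
print(f"Na concentration in sample: {c:.0f} ug/g")
```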

  5. Standard lymphadenectomy technique in the gastric adenocarcinoma

    International Nuclear Information System (INIS)

    Aguirre Fernandez, Roberto Eduardo; Fernandez Vazquez, Pedro Ivan; LLera Dominguez, Gerardo de la

    2012-01-01

    This paper describes the surgical technique used since 1990 at the 'Celia Sanchez Manduley' Clinical Surgical Teaching Provincial Hospital in Manzanillo, Granma province, to carry out gastrectomy together with standard lymphadenectomy in patients with gastric adenocarcinoma, allowing application of the current oncologic and surgical concepts of the Japanese Society for Research of Gastric Cancer, which are essential to obtain a better prognosis in these patients

  6. [Study on standardization of cupping technique: elucidation on the establishment of the National Standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping].

    Science.gov (United States)

    Gao, Shu-zhong; Liu, Bing

    2010-02-01

    From the aspects of basis, technique descriptions, core contents, problems and solutions, and standard thinking in the standard-setting process, this paper relates experiences in the establishment of the national standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping, focusing on the methodologies used in the cupping standard-setting process, the method selection and operating instructions of cupping standardization, and the characteristics of standard TCM. In addition, this paper states the scope of application and precautions for this cupping standardization. This paper also explains tentative ideas on research into the standardized manipulation of acupuncture and moxibustion.

  7. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  8. Preparation and analysis of standardized waste samples for Controlled Ecological Life Support Systems (CELSS)

    Science.gov (United States)

    Carden, J. L.; Browner, R.

    1982-01-01

    The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.

  9. Technique for fabrication of graduated standards of radiographic image blackening density

    International Nuclear Information System (INIS)

    Borovin, I.V.; Kondina, M.A.

    1987-01-01

    The technique of fabricating graduated standards of blackening density for industrial radiography by contact printing from a negative is described. The technique is intended for industrial radiation-defectoscopy laboratories possessing no special-purpose sensitometric equipment

  10. MIMO wireless networks channels, techniques and standards for multi-antenna, multi-user and multi-cell systems

    CERN Document Server

    Clerckx, Bruno

    2013-01-01

    This book is unique in presenting channels, techniques and standards for the next generation of MIMO wireless networks. Through a unified framework, it emphasizes how propagation mechanisms impact the system performance under realistic power constraints. Combining a solid mathematical analysis with a physical and intuitive approach to space-time signal processing, the book progressively derives innovative designs for space-time coding and precoding as well as multi-user and multi-cell techniques, taking into consideration that MIMO channels are often far from ideal. Reflecting developments

  11. New Theoretical Analysis of the LRRM Calibration Technique for Vector Network Analyzers

    OpenAIRE

    Purroy Martín, Francesc; Pradell i Cara, Lluís

    2001-01-01

    In this paper, a new theoretical analysis of the four-standards line-reflect-reflect-match (LRRM) vector network-analyzer (VNA) calibration technique is presented. As a result, it is shown that the reference-impedance (to which the LRRM calibration is referred) cannot generally be defined whenever nonideal standards are used. Based on this consideration, a new algorithm to determine the on-wafer match standard is proposed that improves the LRRM calibration accuracy. Experimental verification ...

  12. Metabolomic analysis using porcine skin: a pilot study of analytical techniques

    OpenAIRE

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-01-01

    Background: Metabolic byproducts serve as indicators of the chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. Objectives: We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. ...

  13. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% system-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, the Value Analysis yields a design that performs the necessary functions at high quality and the lowest overall cost

  14. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1995-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards which are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is not a suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards which contained about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters including activity produced, necessary flux/bombardment time, counting time, and appropriate source-to-detector distance. The result is multi-element standards for CNAA which have optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed which is a low cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and innovative sample handling techniques. (author) 7 refs.; 5 tabs
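
    The activity calculation such a spreadsheet performs can be sketched with the standard activation equation A = NσΦ(1 − e^(−λt)); the parameter values below are illustrative textbook-order numbers, not the paper's.

```python
import math

N_A = 6.02214076e23  # Avogadro's number

def induced_activity(mass_g, molar_mass, abundance, sigma_barn,
                     flux, t_irr_s, half_life_s):
    """Activity (Bq) at end of irradiation:
    A = N * sigma * phi * (1 - exp(-lambda * t_irr))."""
    n_target = mass_g / molar_mass * N_A * abundance
    lam = math.log(2) / half_life_s
    sigma_cm2 = sigma_barn * 1e-24          # barn -> cm^2
    return n_target * sigma_cm2 * flux * (1 - math.exp(-lam * t_irr_s))

# Illustrative case: trace Na in 100 mg of soil (assume 1% Na by weight),
# thermal flux 1e13 n cm^-2 s^-1, 1 h irradiation, Na-24 half-life ~15 h.
a = induced_activity(mass_g=100e-3 * 0.01,
                     molar_mass=22.99, abundance=1.0,
                     sigma_barn=0.53, flux=1e13,
                     t_irr_s=3600.0, half_life_s=15.0 * 3600)
print(f"Na-24 activity at end of irradiation: {a:.3e} Bq")
```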

  15. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1994-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards which are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is not a suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards which contained about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters including activity produced, necessary flux/bombardment time, counting time, and appropriate source-to-detector distance. The result is multi-element standards for CNAA which have optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed which is a low cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and innovative sample handling techniques

  16. ELISA technique standardization for strongyloidiasis diagnosis

    International Nuclear Information System (INIS)

    Huapaya, P.; Espinoza, I.; Huiza, A.; Universidad Nacional Mayor de San Marcos, Lima; Sevilla, C.

    2002-01-01

    To standardize the ELISA technique for diagnosis of human Strongyloides stercoralis infection, a crude antigen was prepared using filariform larvae obtained from positive stool samples cultured with charcoal. Harvested larvae were crushed by sonication and washed by centrifugation in order to obtain protein extracts to be used as antigen. The final protein concentration was 600 μg/mL. Several kinds of ELISA plates were tested, and the antigen concentration, serum dilution, conjugate dilution and cut-off were determined to identify infection. Sera from patients with either hyper-infection syndrome or intestinal infection demonstrated by parasitological examination served as positive controls, and sera from people living in non-endemic areas with no infection demonstrated by parasitological examination served as negative controls. The best values were 5 μg/mL for antigen, 1/64 for sera and 1/1000 for conjugate; optical density values were 1.2746 (1.1065-1.4206, SD = 0.3284) for positive samples and 0.4457 (0.3324-0.5538, SD = 0.2230) for negative samples. Twenty serum samples from positive subjects and one hundred from negative subjects were examined, yielding 90% sensitivity and 88% specificity. The results show this technique could be useful as a strongyloidiasis screening test in population studies
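
    A hedged sketch of the cut-off and sensitivity/specificity arithmetic behind such a standardization; the optical densities below are invented, only loosely shaped like the reported means and SDs, and the mean + 2·SD cut-off rule is an assumption, not necessarily the rule the authors used.

```python
import statistics

def cutoff_from_negatives(neg_od, k=2.0):
    """Common screening rule: cutoff = mean(negatives) + k * SD."""
    return statistics.mean(neg_od) + k * statistics.stdev(neg_od)

def sens_spec(pos_od, neg_od, cutoff):
    """Fraction of positives above and negatives below the cutoff."""
    tp = sum(od >= cutoff for od in pos_od)
    tn = sum(od < cutoff for od in neg_od)
    return tp / len(pos_od), tn / len(neg_od)

# Illustrative optical densities for known-positive and known-negative sera.
pos = [1.10, 1.35, 0.95, 1.42, 1.28, 0.58, 1.52, 1.18, 1.05, 1.30]
neg = [0.41, 0.35, 0.52, 0.44, 0.39, 0.70, 0.33, 0.47, 0.55, 0.42]

cut = cutoff_from_negatives(neg)
se, sp = sens_spec(pos, neg, cut)
print(f"cutoff OD = {cut:.3f}, sensitivity = {se:.0%}, specificity = {sp:.0%}")
```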

  17. Data compression techniques and the ACR-NEMA digital interface communications standard

    International Nuclear Information System (INIS)

    Zielonka, J.S.; Blume, H.; Hill, D.; Horil, S.C.; Lodwick, G.S.; Moore, J.; Murphy, L.L.; Wake, R.; Wallace, G.

    1987-01-01

    Data compression offers the possibility of achieving high, effective information transfer rates between devices and of efficient utilization of digital storage devices in meeting department-wide archiving needs. Accordingly, the ACR-NEMA Digital Imaging and Communications Standards Committee established a Working Group to develop a means to incorporate the optimal use of a wide variety of current compression techniques while remaining compatible with the standard. This proposed method allows the use of public-domain techniques, predetermined methods between devices already aware of the selected algorithm, and the ability of the originating device to specify algorithms and parameters prior to transmitting compressed data. Because of the latter capability, the technique has the potential to support many compression algorithms not yet developed or in common use. Both lossless and lossy methods can be implemented. In addition to a description of the overall structure of this proposal, several examples using current compression algorithms are given

  18. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high-voltage spark sources is presented. The 'point-to-point' technique is employed. The experimental parameters were optimized as a compromise between detection sensitivity and measurement precision. The parameters investigated were the high-voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes, the type of polishing and the diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was also investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, and the counter electrode itself was used as an internal standard. In the case of graphite counter electrodes, the iron lines were employed as the internal standard. Relative errors were the criteria for evaluation of these experiments. National Bureau of Standards certified reference stainless steel standards and Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for constructing the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and imprecision of the proposed method varied from 2% to 15% and from 1% to 9%, respectively. The present technique was compared to other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis. The advantages and disadvantages in each case are discussed. (author) [pt

  19. Network modeling and analysis technique for the evaluation of nuclear safeguards systems effectiveness

    International Nuclear Information System (INIS)

    Grant, F.H. III; Miner, R.J.; Engi, D.

    1978-01-01

    Nuclear safeguards systems are concerned with the physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of safeguards system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The reports provided by the SNAP simulation program enable analysts to evaluate existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  20. Network modeling and analysis technique for the evaluation of nuclear safeguards systems effectiveness

    International Nuclear Information System (INIS)

    Grant, F.H. III; Miner, R.J.; Engi, D.

    1979-02-01

    Nuclear safeguards systems are concerned with the physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of safeguards system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The reports provided by the SNAP simulation program enable analysts to evaluate existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  1. Standardization of P-33 by the TDCR efficiency calculation technique

    CSIR Research Space (South Africa)

    Simpson, BRS

    2004-02-01

    The activity of the pure beta-emitter phosphorus-33 (P-33) has been directly determined by the triple-to-double coincidence ratio (TDCR) efficiency calculation technique, thus extending the number of radionuclides that have been standardized...
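
    A deliberately simplified, mono-energetic version of the TDCR efficiency calculation (the real P-33 standardization integrates over the beta spectrum and includes quenching corrections): from a measured triple-to-double ratio one solves for the light yield and obtains the counting efficiency.

```python
import math

def tdcr_ratio(m):
    """Simplified single-energy TDCR model: with m photoelectrons shared
    equally over three PMTs, each tube fires with q = 1 - exp(-m/3).
    Triples: q^3; logical-sum doubles: 3q^2 - 2q^3."""
    q = 1.0 - math.exp(-m / 3.0)
    return (q ** 3) / (3 * q ** 2 - 2 * q ** 3)

def efficiency_from_tdcr(measured_tdcr):
    """Solve for m by bisection (the ratio is monotone in m), then return
    the double-coincidence efficiency used to convert count rate to activity."""
    lo, hi = 1e-6, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if tdcr_ratio(mid) < measured_tdcr:
            lo = mid
        else:
            hi = mid
    q = 1.0 - math.exp(-lo / 3.0)
    return 3 * q ** 2 - 2 * q ** 3

# Illustrative measured triple-to-double ratio:
eff_d = efficiency_from_tdcr(0.90)
print(f"double-coincidence efficiency: {eff_d:.3f}")
# Activity then follows as A = (double-coincidence count rate) / efficiency.
```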

  2. Neutron activation analysis for certification of standard reference materials

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Perez Zayas, G.; Hernandez Rivero, A.; Ribeiro Guevara, S.

    1996-01-01

    Neutron activation analysis is used extensively as one of the analytical techniques in the certification of standard reference materials. Characteristics of neutron activation analysis which make it valuable in this role are: accuracy, multielemental capability, ability to assess homogeneity, high sensitivity for many elements, and an essentially non-destructive method. This paper reports the concentrations of 30 elements (major, minor and trace elements) in four Cuban samples. The samples were irradiated in a thermal neutron flux of 10^12-10^13 n·cm⁻²·s⁻¹. The gamma-ray spectra were measured by HPGe detectors and were analyzed using the ACTAN program, developed at the Center of Applied Studies for Nuclear Development

  3. Comparison of ankle-brachial index measured by an automated oscillometric apparatus with that by standard Doppler technique in vascular patients

    DEFF Research Database (Denmark)

    Korno, M.; Eldrup, N.; Sillesen, H.

    2009-01-01

    ...was calculated twice using both methods on both legs. MATERIALS AND METHODS: We tested the automated oscillometric blood pressure device, CASMED 740, for measuring ankle and arm blood pressure and compared it with the current gold standard, the hand-held Doppler technique, by Bland-Altman analysis... RESULTS: Using the Doppler-derived ABI as the gold standard, the sensitivity and specificity of the oscillometric method for determining an ABI...

  4. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory]; Barefield, James E [Los Alamos National Laboratory]; Wiens, Roger C [Los Alamos National Laboratory]; Sklute, Elizabeth [MT HOLYOKE COLLEGE]; Dyar, Melinda D [MT HOLYOKE COLLEGE]

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
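
    As a rough illustration of how such MVA models are built in practice, the sketch below fits a PLS calibration and a PCA projection to synthetic stand-in spectra using scikit-learn; the array shapes, component counts and all variable names are assumptions for illustration, not values from the paper.

        # Hypothetical sketch: PLS calibration and PCA projection for LIBS-like spectra
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X_train = rng.random((18, 2048))   # 18 rock spectra, 2048 wavelength channels
        y_train = rng.random((18, 9))      # known compositions (e.g., 9 major oxides)
        X_new = rng.random((3, 2048))      # spectra of "unknown" samples

        # PLS: regress composition on the full spectra
        pls = PLSRegression(n_components=8).fit(X_train, y_train)
        y_pred = pls.predict(X_new)        # predicted compositions of the unknowns

        # PCA: project spectra into a low-dimensional space for class separation
        pca = PCA(n_components=3)
        scores = pca.fit_transform(X_train)
        print(y_pred.shape, pca.explained_variance_ratio_)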

  5. Comparison of least-squares vs. maximum likelihood estimation for standard spectrum technique of β−γ coincidence spectrum analysis

    International Nuclear Information System (INIS)

    Lowrey, Justin D.; Biegalski, Steven R.F.

    2012-01-01

    The spectrum deconvolution analysis tool (SDAT) software code was written and tested at The University of Texas at Austin utilizing the standard spectrum technique to determine activity levels of Xe-131m, Xe-133m, Xe-133, and Xe-135 in β–γ coincidence spectra. SDAT was originally written to utilize the method of least-squares to calculate the activity of each radionuclide component in the spectrum. Recently, maximum likelihood estimation was also incorporated into the SDAT tool. This is a robust statistical technique to determine the parameters that maximize the Poisson distribution likelihood function of the sample data. In this case it is used to parameterize the activity level of each of the radioxenon components in the spectra. A new test dataset was constructed utilizing Xe-131m placed on a Xe-133 background to compare the robustness of the least-squares and maximum likelihood estimation methods for low counting statistics data. The Xe-131m spectra were collected independently from the Xe-133 spectra and added to generate the spectra in the test dataset. The true independent counts of Xe-131m and Xe-133 are known, as they were calculated before the spectra were added together. Spectra with both high and low counting statistics are analyzed. Studies are also performed by analyzing only the 30 keV X-ray region of the β–γ coincidence spectra. Results show that maximum likelihood estimation slightly outperforms least-squares for low counting statistics data.
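
    The two estimators being compared can be illustrated on synthetic data. In the hedged sketch below (not the SDAT code itself), a Poisson-distributed spectrum is unmixed into two known standard spectra once by ordinary least squares and once by maximizing the Poisson log-likelihood; the spectra and activity values are invented.

        # Illustrative comparison (not the SDAT code): least squares vs. Poisson MLE
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        n_bins = 256
        S = np.abs(rng.normal(size=(n_bins, 2)))       # two normalized "standard" spectra
        S /= S.sum(axis=0)
        true_counts = np.array([50.0, 400.0])          # low-statistics scenario
        measured = rng.poisson(S @ true_counts)        # Poisson-distributed spectrum

        # Least-squares estimate of the component activities
        a_ls, *_ = np.linalg.lstsq(S, measured, rcond=None)

        # Poisson MLE: minimize sum(mu - k*log(mu)) with mu = S @ a
        def neg_log_like(a):
            mu = S @ a + 1e-12
            return np.sum(mu - measured * np.log(mu))

        res = minimize(neg_log_like, x0=np.maximum(a_ls, 1.0),
                       bounds=[(0, None), (0, None)])
        print("LS:", a_ls, " MLE:", res.x, " true:", true_counts)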

  6. Multi-element analysis of lubricant oil by WDXRF technique using thin-film sample preparation

    International Nuclear Information System (INIS)

    Scapin, M. A.; Salvador, V. L. R.; Lopes, C. D.; Sato, I. M.

    2006-01-01

    The quantitative analysis of chemical elements in matrices like oils or gels represents a challenge for analytical chemists. Classical methods and instrumental techniques such as atomic absorption spectrometry (AAS) and plasma optical emission spectrometry (ICP-OES) need chemical treatments, mainly sample dissolution and degradation processes. The X-ray fluorescence technique allows a direct, multi-element analysis without previous sample treatment. In this work, a sensitive method for the determination of the elements Mg, Al, Si, P, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Mo, Ag, Sn, Ba and Pb in lubricating oil is presented. The wavelength dispersive X-ray fluorescence (WDXRF) technique was used, with a linear regression method and thin-film sample preparation. The validation of the methodology (repeatability and accuracy) was obtained by the analysis of the standard reference material SRM Alpha AESAR lot 703527D, applying the Chauvenet, Cochrane, ANOVA and Z-score statistical tests. The method presents a relative standard deviation lower than 10% for all the elements, except for Pb determination (RSD for Pb: 15%). The Z-score values for all the elements were in the range -2 < Z < 2, indicating very good accuracy. (Full text)
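
    As a toy illustration of the Z-score acceptance criterion used in that validation, the following sketch checks made-up measured concentrations against made-up certified values, accepting a result when |Z| < 2; all numbers are assumptions for illustration only.

        # Toy Z-score acceptance check; all numbers invented for illustration
        import numpy as np

        measured  = np.array([1.02, 4.87, 0.95])   # measured concentrations (wt%)
        certified = np.array([1.00, 5.00, 1.00])   # certified reference values (wt%)
        sigma     = np.array([0.03, 0.10, 0.04])   # combined standard uncertainties

        z = (measured - certified) / sigma
        print(z, np.all(np.abs(z) < 2))            # accept when every |Z| < 2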

  7. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  8. Performance Analysis of Modified Drain Gating Techniques for Low Power and High Speed Arithmetic Circuits

    Directory of Open Access Journals (Sweden)

    Shikha Panwar

    2014-01-01

    Full Text Available This paper presents several high performance and low power techniques for CMOS circuits. In these design methodologies, the drain gating technique and its variations are modified by adding an additional NMOS sleep transistor at the output node, which helps in faster discharge and thereby provides higher speed. In order to achieve high performance, the proposed design techniques trade power for performance in the delay-critical sections of the circuit. Intensive simulations are performed using Cadence Virtuoso in a 45 nm standard CMOS technology at room temperature with a supply voltage of 1.2 V. Comparative analysis of the present circuits with standard CMOS circuits shows smaller propagation delay and lower power consumption.

  9. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    Science.gov (United States)

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of the chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysis. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7×10⁷) was most effective for cell lysis when compared to mortar-and-pestle (2.6×10⁷), ball mill followed by ultrasonication (1.6×10⁷), mortar-and-pestle followed by ultrasonication (1.4×10⁷), and homogenization (trial 1: 8.4×10⁶; trial 2: 1.6×10⁷). Due to the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysis of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.

  10. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  11. EUS-guided biliary drainage by using a standardized approach for malignant biliary obstruction: rendezvous versus direct transluminal techniques (with videos).

    Science.gov (United States)

    Khashab, Mouen A; Valeshabad, Ali Kord; Modayil, Rani; Widmer, Jessica; Saxena, Payal; Idrees, Mehak; Iqbal, Shahzad; Kalloo, Anthony N; Stavropoulos, Stavros N

    2013-11-01

    EUS-guided biliary drainage (EGBD) can be performed via direct transluminal or rendezvous techniques. It is unknown how both techniques compare in terms of efficacy and adverse events. To describe outcomes of EGBD performed by using a standardized approach and compare outcomes of rendezvous and transluminal techniques. Retrospective analysis of prospectively collected data. Two tertiary-care centers. Consecutive jaundiced patients with distal malignant biliary obstruction who underwent EGBD after failed ERCP between July 2006 and December 2012 were included. EGBD by using a standardized algorithm. Technical success, clinical success, and adverse events. During the study period, 35 patients underwent EGBD (rendezvous n = 13, transluminal n = 20). Technical success was achieved in 33 patients (94%), and clinical success was attained in 32 of 33 patients (97.0%). The mean postprocedure bilirubin level was 1.38 mg/dL in the rendezvous group and 1.33 mg/dL in the transluminal group (P = .88). Similarly, length of hospital stay was not different between groups (P = .23). There was no significant difference in adverse event rate between rendezvous and transluminal groups (15.4% vs 10%; P = .64). Long-term outcomes were comparable between groups, with 1 stent migration in the rendezvous group at 62 days and 1 stent occlusion in the transluminal group at 42 days after EGBD. Retrospective analysis, small number of patients, and selection bias. EGBD is safe and effective when the described standardized approach is used. Stent occlusion is not common during long-term follow-up. Both rendezvous and direct transluminal techniques seem to be equally effective and safe. The latter approach is a reasonable alternative to rendezvous EGBD. Copyright © 2013. Published by Mosby, Inc.

  12. Standardization of MIP technique in three-dimensional CT portography: usefulness in evaluation of portosystemic collaterals in cirrhotic patients

    International Nuclear Information System (INIS)

    Kim, Jong Gi; Kim, Yong; Kim, Chang Won; Lee, Jun Woo; Lee, Suk Hong

    2003-01-01

    To assess the usefulness of three-dimensional CT portography using a standardized maximum intensity projection (MIP) technique for the evaluation of portosystemic collaterals in cirrhotic patients. In 25 cirrhotic patients with portosystemic collaterals, three-phase CT using a multidetector-row helical CT scanner was performed to evaluate liver disease. Late arterial-phase images were transferred to an Advantage Windows 3.1 workstation (General Electric). Axial images were reconstructed by means of three-dimensional CT portography, using both a standardized and a non-standardized MIP technique, and the respective reconstruction times were determined. Three-dimensional CT portography with the standardized technique involved eight planes, namely the spleno-portal confluence axis (coronal, lordotic coronal, lordotic coronal RAO 30°, and lordotic coronal LAO 30°), the left renal vein axis (lordotic coronal), and axial MIP images (lower esophagus level, gastric fundus level and splenic hilum). The eight MIP images obtained in each case were interpreted by two radiologists, who reached a consensus in their evaluation. The portosystemic collaterals evaluated were as follows: left gastric vein dilatation; esophageal, paraesophageal, gastric, and splenic varix; paraumbilical vein dilatation; gastro-renal, spleno-renal, and gastro-spleno-renal shunt; mesenteric, retroperitoneal, and omental collaterals. The average reconstruction time was 11 minutes 23 seconds using the non-standardized MIP technique and 6 minutes 5 seconds with the standardized technique. Three-dimensional CT portography with the standardized technique demonstrated left gastric vein dilatation (n=25), esophageal varix (n=18), paraesophageal varix (n=13), gastric varix (n=4), splenic varix (n=4), paraumbilical vein dilatation (n=4), gastro-renal shunt (n=3), spleno-renal shunt (n=3), and gastro-spleno-renal shunt (n=1). Using three-dimensional CT portography and the non-standardized

  13. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  14. Standardized technique for single port laparoscopic ileostomy and colostomy.

    Science.gov (United States)

    Shah, A; Moftah, M; Hadi Nahar Al-Furaji, H; Cahill, R A

    2014-07-01

    Single site laparoscopic techniques and technology extract maximum usefulness from confined incisions. The formation of an ileostomy or colostomy seems very well suited to this modality, as the stoma occupies the solitary incision, obviating any additional wounds. Here we detail the principles of our approach to defunctioning loop stoma formation using single port laparoscopic access in a stepwise and standardized fashion, along with the salient specifics of five illustrative patients. No specialized instrumentation is required and the single access platform is established table-side using the 'glove port' technique. The approach has the intra-operative advantage of excellent visualization of the correct intestinal segment for exteriorization, along with direct visual control of its extraction to avoid twisting. Postoperatively, abdominal wall trauma has been minimal, allowing convalescence and stoma care education with only one parietal incision. Single incision stoma siting proves a ready, robust and reliable technique for diversion ileostomy and colostomy with a minimum of operative trauma for the patient. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.

  15. Performance of an iterative two-stage bayesian technique for population pharmacokinetic analysis of rich data sets

    NARCIS (Netherlands)

    Proost, Johannes H.; Eleveld, Douglas J.

    2006-01-01

    Purpose. To test the suitability of an Iterative Two-Stage Bayesian (ITSB) technique for population pharmacokinetic analysis of rich data sets, and to compare ITSB with Standard Two-Stage (STS) analysis and nonlinear Mixed Effect Modeling (MEM). Materials and Methods. Data from a clinical study with

  16. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  17. Standardization of the Descemet membrane endothelial keratoplasty technique: Outcomes of the first 450 consecutive cases.

    Science.gov (United States)

    Satué, M; Rodríguez-Calvo-de-Mora, M; Naveiras, M; Cabrerizo, J; Dapena, I; Melles, G R J

    2015-08-01

    To evaluate the clinical outcome of the first 450 consecutive cases after Descemet membrane endothelial keratoplasty (DMEK), as well as the effect of standardization of the technique. Comparison between 3 groups: Group I (cases 1-125), the extended learning curve; Group II (cases 126-250), transition to technique standardization; Group III (cases 251-450), surgery with the standardized technique. Best corrected visual acuity, endothelial cell density, pachymetry and intra- and postoperative complications were evaluated before, and 1, 3 and 6 months after DMEK. At 6 months after surgery, 79% of eyes reached a best corrected visual acuity of ≥0.8 and 43% ≥1.0. Mean endothelial cell density was 2,530±220 cells/mm² preoperatively and 1,613±495 cells/mm² at 6 months after surgery. Mean pachymetry measured 668±92 μm and 526±46 μm pre- and (6 months) postoperatively, respectively. There were no significant differences in best corrected visual acuity, endothelial cell density and pachymetry between the 3 groups (P > .05). Graft detachment presented in 17.3% of the eyes. The detachment rate declined from 24% to 12%, and the rate of secondary surgeries from 9.6% to 3.5%, from group I to III respectively. Visual outcomes and endothelial cell density after DMEK are independent of the technique standardization. However, technique standardization may have contributed to a lower graft detachment rate and a relatively low number of secondary interventions required. As such, DMEK may become the first choice of treatment in corneal endothelial disease. Copyright © 2014 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  18. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  19. New quantitative safety standards : Different techniques, different results?

    NARCIS (Netherlands)

    Rouvroye, J.L.; Brombacher, A.C.; Lydersen, S.; Hansen, G.K.; Sandtor, H.

    1998-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many parameters can influence the safety of a SIS like system layout, diagnostics, testing and repair. In standards like the German DIN [DIN19250, DIN0801] no quantitative analysis was demanded. The

  20. A Comparative Analysis of Uranium Ore using Laser Fluorimetric and gamma Spectrometry Techniques

    International Nuclear Information System (INIS)

    Madbouly, M.; Nassef, M. H.; El-Mongy, S.A.; Diab, A.M.

    2009-01-01

    A developed chemical separation method was used for the analysis of uranium in a standard U-ore (IAEA-RGU-1) by the laser fluorimetric technique. The non-destructive gamma assay technique was also applied to verify and compare the uranium content determined by the laser technique. The results of the uranium analysis obtained by laser fluorimetry were found to be in the range of 360-420 μg/g, with an average value of 390 μg/g. The bias between the measured and the certified value does not exceed 9.9%. For gamma-ray spectrometric analysis, the measured uranium content was found to be in the range of 393.8-399.4 μg/g, with an average value of 396.3 μg/g. The % difference in the case of γ-assay was 1.6%. In general, the methods of analysis used in this study are applicable for a precise determination of uranium. It can be concluded that laser analysis is preferred for the assay of uranium ore due to the small sample weight required, the short sample preparation time and the low cost of analysis.

  1. When Is Hub Gene Selection Better than Standard Meta-Analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S.; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as good as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to
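
    For intuition, here is a schematic, hedged contrast between the two strategies on synthetic data: a Stouffer-combined meta-analysis z-score per gene versus a crude consensus connectivity (the minimum whole-network connectivity across data sets), loosely in the spirit of intramodular hub selection. WGCNA itself is an R package; everything below (shapes, helper names, the connectivity proxy) is an illustrative Python analogue, not the authors' code.

        # Schematic Python analogue (WGCNA itself is an R package); synthetic data
        import numpy as np

        rng = np.random.default_rng(2)
        n_sets, n_samples, n_genes = 3, 50, 200
        data = [rng.normal(size=(n_samples, n_genes)) for _ in range(n_sets)]
        trait = [rng.normal(size=n_samples) for _ in range(n_sets)]

        def corr_with_trait(X, y):
            Xc = (X - X.mean(0)) / X.std(0)
            yc = (y - y.mean()) / y.std()
            return Xc.T @ yc / len(y)              # gene-trait correlations

        # Meta-analysis: combine per-set correlation z-scores (Stouffer's method)
        z_sets = np.array([np.sqrt(n_samples) * corr_with_trait(X, y)
                           for X, y in zip(data, trait)])
        meta_z = z_sets.sum(axis=0) / np.sqrt(n_sets)

        # Network view: connectivity = sum of |gene-gene correlations|; the
        # minimum over data sets serves as a crude consensus measure
        def connectivity(X):
            R = np.corrcoef(X.T)
            return np.abs(R).sum(axis=0) - 1.0

        consensus_k = np.min([connectivity(X) for X in data], axis=0)
        print(np.argsort(-np.abs(meta_z))[:5])     # top genes by meta-analysis
        print(np.argsort(-consensus_k)[:5])        # top genes by consensus hubness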

  2. When is hub gene selection better than standard meta-analysis?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    Full Text Available Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as good as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques

  3. When is hub gene selection better than standard meta-analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as good as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to

  4. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  5. Text localization using standard deviation analysis of structure elements and support vector machines

    Directory of Open Access Journals (Sweden)

    Zagoris Konstantinos

    2011-01-01

    Full Text Available Abstract A text localization technique is required to successfully exploit document images such as technical articles and letters. The proposed method detects and extracts text areas from document images. Initially, a connected components analysis technique detects blocks of foreground objects. Then, a descriptor that consists of a set of suitable document structure elements is extracted from the blocks. This is achieved by incorporating an algorithm called Standard Deviation Analysis of Structure Elements (SDASE), which maximizes the separability between the blocks. Another feature of the SDASE is that its length adapts according to the requirements of the application. Finally, the descriptor of each block is used as input to a trained support vector machine that classifies the block as text or not. The proposed technique is also capable of adjusting to the text structure of the documents. Experimental results on benchmarking databases demonstrate the effectiveness of the proposed method.
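
    A schematic sketch of that pipeline follows: connected-component detection, a per-block descriptor, and an SVM text/non-text classifier. The SDASE descriptor itself is not reproduced; block height, width, aspect ratio and fill ratio stand in as placeholder features, and the training labels are random placeholders.

        # Schematic pipeline sketch with placeholder features and labels
        import numpy as np
        from scipy import ndimage
        from sklearn.svm import SVC

        def block_features(binary_img):
            labels, n = ndimage.label(binary_img)          # connected components
            feats = []
            for sl in ndimage.find_objects(labels):
                h = sl[0].stop - sl[0].start
                w = sl[1].stop - sl[1].start
                feats.append([h, w, w / h, binary_img[sl].mean()])
            return np.array(feats)

        rng = np.random.default_rng(3)
        page = rng.random((200, 300)) > 0.995              # toy "document image"
        X = block_features(page)
        y = rng.integers(0, 2, size=len(X))                # placeholder text/non-text labels

        clf = SVC(kernel="rbf").fit(X, y)                  # train on labelled blocks
        print(clf.predict(X[:5]))                          # classify blocks as text or not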

  6. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.
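
    As a toy illustration of the amplitude-demodulation idea behind CSA, the sketch below envelope-detects a synthetic 60 Hz line current that is amplitude-modulated by a 7 Hz load fluctuation and recovers the modulation frequency. The signal parameters are invented, and this is a generic Hilbert-transform demodulation, not the patented ORNL method.

        # Toy amplitude demodulation of a synthetic motor current (not the ORNL method)
        import numpy as np
        from scipy.signal import hilbert

        fs = 2000.0                                        # sampling rate (Hz)
        t = np.arange(0, 2.0, 1.0 / fs)
        carrier = np.sin(2 * np.pi * 60 * t)               # 60 Hz line current
        current = (1.0 + 0.1 * np.sin(2 * np.pi * 7 * t)) * carrier   # 7 Hz load wobble

        envelope = np.abs(hilbert(current))                # amplitude demodulation
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(len(envelope), 1.0 / fs)
        print(freqs[spectrum.argmax()])                    # ~7 Hz load signature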

  7. Developing standardized connection analysis techniques for slim hole core rod designs

    International Nuclear Information System (INIS)

    Fehr, G.; Bailey, E.I.

    1994-01-01

    Slim hole core rod design remains essentially in the proprietary domain. API standardization provides the ability to perform engineering analyses and dimensional inspections through the use of documents, i.e., Specifications, Bulletins, and Recommended Practices. In order to provide similar engineering capability for non-API slim hole connections, this paper develops the initial phase of what may evolve into an engineering tool to provide at least an indication of relative serviceability between two connection styles for a given application. The starting point for this process will look at bending strength ratios and connection strength calculations. Since empirical data are still needed to verify the approaches proposed in this paper, it is recognized that the alternatives presented here are only a first step toward developing useful rules of thumb which may lead to later standardization

  8. Rapid analysis of molybdenum contents in molybdenum master alloys by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Tongkong, P.

    1985-01-01

    Determination of the molybdenum content in molybdenum master alloys was performed using the energy dispersive X-ray fluorescence (EDX) technique, where analyses were made via standard additions and calibration curves. Comparison of the EDX technique with other analysis techniques, i.e., wavelength dispersive X-ray fluorescence, neutron activation analysis and inductively coupled plasma spectrometry, showed consistency in the results. This technique was found to yield reliable results when the molybdenum content in master alloys was in the range of 13 to 50 percent, using an HPGe detector or proportional counter. When the required error was set at 1%, the minimum analysis time was found to be 30 and 60 seconds for Fe-Mo master alloys with molybdenum contents of 13.54 and 49.09 percent, respectively. For Al-Mo master alloys, the minimum times required were 120 and 300 seconds with molybdenum contents of 15.22 and 47.26 percent, respectively
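
    The standard-additions calibration mentioned above reduces to a linear fit whose x-intercept magnitude gives the unknown concentration. A minimal sketch with invented spike levels and intensities:

        # Standard-additions fit; intensities and spike levels are invented
        import numpy as np

        added = np.array([0.0, 5.0, 10.0, 15.0])      # spiked Mo (arbitrary units)
        signal = np.array([12.1, 18.0, 24.2, 29.9])   # measured XRF intensities

        slope, intercept = np.polyfit(added, signal, 1)
        unknown = intercept / slope                    # |x-intercept| = sample conc.
        print(round(unknown, 2))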

  9. A standards-based method for compositional analysis by energy dispersive X-ray spectrometry using multivariate statistical analysis: application to multicomponent alloys.

    Science.gov (United States)

    Rathi, Monika; Ahrenkiel, S P; Carapella, J J; Wanlass, M W

    2013-02-01

    Given an unknown multicomponent alloy, and a set of standard compounds or alloys of known composition, can one improve upon popular standards-based methods for energy dispersive X-ray (EDX) spectrometry to quantify the elemental composition of the unknown specimen? A method is presented here for determining elemental composition of alloys using transmission electron microscopy-based EDX with appropriate standards. The method begins with a discrete set of related reference standards of known composition, applies multivariate statistical analysis to those spectra, and evaluates the compositions with a linear matrix algebra method to relate the spectra to elemental composition. By using associated standards, only limited assumptions about the physical origins of the EDX spectra are needed. Spectral absorption corrections can be performed by providing an estimate of the foil thickness of one or more reference standards. The technique was applied to III-V multicomponent alloy thin films: composition and foil thickness were determined for various III-V alloys. The results were then validated by comparing with X-ray diffraction and photoluminescence analysis, demonstrating accuracy of approximately 1% in atomic fraction.
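
    A hedged sketch of the linear matrix algebra step described above: the unknown spectrum is expressed as a non-negative combination of reference-standard spectra, and the weights are mapped onto composition. The synthetic spectra, the pure-element standards and all names below are assumptions for illustration, not the authors' implementation.

        # Hedged sketch: non-negative unmixing of an unknown spectrum into standards
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(4)
        R = np.abs(rng.normal(size=(512, 3)))        # 3 reference-standard spectra
        comp = np.eye(3)                             # pure-element standards, for simplicity
        unknown = R @ np.array([0.2, 0.5, 0.3])      # synthetic unknown spectrum

        w, _ = nnls(R, unknown)                      # non-negative least-squares weights
        w /= w.sum()                                 # normalize to fractions
        print(w @ comp)                              # estimated composition ~ [0.2, 0.5, 0.3]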

  10. Modification of the cranial closing wedge ostectomy technique for the treatment of canine cruciate disease. Description and comparison with standard technique.

    Science.gov (United States)

    Wallace, A M; Addison, E S; Smith, B A; Radke, H; Hobbs, S J

    2011-01-01

    To describe a modification of the cranial closing wedge ostectomy (CCWO) technique and to compare its efficacy with the standard technique on cadaveric specimens. The standard and modified CCWO techniques were applied to eight pairs of cadaveric tibiae. The following parameters were compared following the ostectomy: degrees of plateau levelling achieved (degrees), tibial long axis shift (degrees), reduction in tibial length (mm), area of the bone wedge removed (cm²), and area of the proximal fragment (cm²). The size of the removed wedge of bone and the reduction in tibial length were significantly less with the modified CCWO technique. The modified CCWO has two main advantages. Firstly, a smaller wedge is removed, allowing greater preservation of bone stock in the proximal tibia, which is advantageous for implant placement. Secondly, the tibia is shortened to a lesser degree, which might reduce the risk of recurvatum, fibular fracture and patella desmitis. These factors are particularly propitious for the application of this technique to Terrier breeds with excessive tibial plateau angles, where large angular corrections are required. The modified CCWO is equally effective for plateau levelling and results in an equivalent tibial long-axis shift. A disadvantage of the modified technique is that not all of the cross-sectional area of the distal fragment contributes to load sharing at the osteotomy.

  11. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, first, to apply two interesting functional analysis techniques to the design of supervisory systems for complex processes and, second, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques have been applied, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), to an example process, a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of the application of functional analysis for the design of a 'human'-centered supervisory system. Then the basic principles of the two techniques applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed

  12. Multiplex Ligation-Dependent Probe Amplification Technique for Copy Number Analysis on Small Amounts of DNA Material

    DEFF Research Database (Denmark)

    Sørensen, Karina; Andersen, Paal; Larsen, Lars

    2008-01-01

    The multiplex ligation-dependent probe amplification (MLPA) technique is a sensitive technique for relative quantification of up to 50 different nucleic acid sequences in a single reaction, and the technique is routinely used for copy number analysis in various syndromes and diseases. The aim of the study was to exploit the potential of MLPA when the DNA material is limited. The DNA concentration required in standard MLPA analysis is not attainable from dried blood spot samples (DBSS) often used in neonatal screening programs. A novel design of MLPA probes has been developed to permit MLPA analysis on small amounts of DNA. Six patients with congenital adrenal hyperplasia (CAH) were used in this study. DNA was extracted from both whole blood and DBSS and subjected to MLPA analysis using normal and modified probes. Results were analyzed using GeneMarker and manual Excel analysis. A total...

  13. Assessment of Snared-Loop Technique When Standard Retrieval of Inferior Vena Cava Filters Fails

    International Nuclear Information System (INIS)

    Doody, Orla; Noe, Geertje; Given, Mark F.; Foley, Peter T.; Lyon, Stuart M.

    2009-01-01

    Purpose To identify the success and complications related to a variant technique used to retrieve inferior vena cava filters when the simple snare approach has failed. Methods A retrospective review of all Cook Guenther Tulip filters and Cook Celect filters retrieved between July 2006 and February 2008 was performed. During this period, 130 filter retrievals were attempted. In 33 cases, the standard retrieval technique failed. Retrieval was subsequently attempted with our modified retrieval technique. Results The retrieval was successful in 23 cases (mean dwell time, 171.84 days; range, 5-505 days) and unsuccessful in 10 cases (mean dwell time, 162.2 days; range, 94-360 days). Our filter retrievability rate increased from 74.6% with the standard retrieval method to 92.3% when the snared-loop technique was used. Unsuccessful retrieval was due to significant endothelialization (n = 9) and caval penetration by the filter (n = 1). A single complication occurred in the group: a patient developed pulmonary emboli after attempted retrieval. Conclusion The technique we describe increased the retrievability of the two filters studied. Hook endothelialization is the main factor resulting in failed retrieval and continues to be a limitation with these filters.

  14. Certification of standard reference materials employing neutron activation analysis

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Hernandez Rivero, A.; Molina Insfran, J.; Ribeiro Guevara, S.; Santana Encinosa, C.; Perez Zayas, G.

    1997-01-01

    Neutron activation analysis (NAA) is used extensively as one of the analytical techniques in the certification of standard reference materials (SRM). Characteristics of NAA which make it valuable in this role are: accuracy; multielemental capability; ability to assess homogeneity; high sensitivity for many elements, and essentially non-destructive method. This paper reports the concentrations of thirty elements (major, minor and trace elements) in four Cuban SRMs. The samples were irradiated in a thermal neutron flux of 10¹²-10¹³ neutrons·cm⁻²·s⁻¹. The gamma-ray spectra were measured by HPGe detectors and were analysed using the ACTAN program, developed in CEADEN. (author)

  15. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  16. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines

  17. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

    The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in trace element analysis by X-ray emission spectroscopy methods, namely the sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, and the storage and handling of samples. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparation for PIXE analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards, and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparation for radioactive source or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plant and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed

  18. Comparison of anthropometry with photogrammetry based on a standardized clinical photographic technique using a cephalostat and chair.

    Science.gov (United States)

    Han, Kihwan; Kwon, Hyuk Joon; Choi, Tae Hyun; Kim, Jun Hyung; Son, Daegu

    2010-03-01

    The aim of this study was to standardize clinical photogrammetric techniques, and to compare anthropometry with photogrammetry. To standardize clinical photography, we have developed a photographic cephalostat and chair. We investigated the repeatability of the standardized clinical photogrammetric technique. Then, with 40 landmarks, a total of 96 anthropometric measurement items was obtained from 100 Koreans. Ninety six photogrammetric measurements from the same subjects were also obtained from standardized clinical photographs using Adobe Photoshop version 7.0 (Adobe Systems Corporation, San Jose, CA, USA). The photogrammetric and anthropometric measurement data (mm, degree) were then compared. A coefficient was obtained by dividing the anthropometric measurements by the photogrammetric measurements. The repeatability of the standardized photography was statistically significantly high (p=0.463). Among the 96 measurement items, 44 items were reliable; for these items the photogrammetric measurements were not different to the anthropometric measurements. The remaining 52 items must be classified as unreliable. By developing a photographic cephalostat and chair, we have standardized clinical photogrammetric techniques. The reliable set of measurement items can be used as anthropometric measurements. For unreliable measurement items, applying a suitable coefficient to the photogrammetric measurement allows the anthropometric measurement to be obtained indirectly.

  19. Agreement between gastrointestinal panel testing and standard microbiology methods for detecting pathogens in suspected infectious gastroenteritis: Test evaluation and meta-analysis in the absence of a reference standard.

    Science.gov (United States)

    Freeman, Karoline; Tsertsvadze, Alexander; Taylor-Phillips, Sian; McCarthy, Noel; Mistry, Hema; Manuel, Rohini; Mason, James

    2017-01-01

    Multiplex gastrointestinal pathogen panel (GPP) tests simultaneously identify bacterial, viral and parasitic pathogens from the stool samples of patients with suspected infectious gastroenteritis presenting in hospital or the community. We undertook a systematic review to compare the accuracy of GPP tests with standard microbiology techniques. Searches in Medline, Embase, Web of Science and the Cochrane library were undertaken from inception to January 2016. Eligible studies compared GPP tests with standard microbiology techniques in patients with suspected gastroenteritis. Quality assessment of included studies used tailored QUADAS-2. In the absence of a reference standard we analysed test performance taking GPP tests and standard microbiology techniques in turn as the benchmark test, using random effects meta-analysis of proportions. No study provided an adequate reference standard with which to compare the test accuracy of GPP and conventional tests. Ten studies informed a meta-analysis of positive and negative agreement. Positive agreement across all pathogens was 0.93 (95% CI 0.90 to 0.96) when conventional methods were the benchmark and 0.68 (95% CI: 0.58 to 0.77) when GPP provided the benchmark. Negative agreement was high in both instances due to the high proportion of negative cases. GPP testing produced a greater number of pathogen-positive findings than conventional testing. It is unclear whether these additional 'positives' are clinically important. GPP testing has the potential to simplify testing and accelerate reporting when compared to conventional microbiology methods. However the impact of GPP testing upon the management, treatment and outcome of patients is poorly understood and further studies are needed to evaluate the health economic impact of GPP testing compared with standard methods. The review protocol is registered with PROSPERO as CRD42016033320.
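
    As a schematic illustration of the random-effects meta-analysis of proportions described above, the sketch below pools invented per-study positive-agreement counts with the DerSimonian-Laird estimator on the logit scale; the counts are placeholders, not data from the review.

        # DerSimonian-Laird pooling of proportions on the logit scale; invented counts
        import numpy as np

        events = np.array([90, 45, 120, 30])         # positive agreements per study
        totals = np.array([100, 50, 130, 35])        # benchmark-positive cases per study

        p = events / totals
        logit = np.log(p / (1 - p))
        var = 1 / events + 1 / (totals - events)     # variance of each logit

        w = 1 / var                                  # fixed-effect weights
        mu_fe = np.sum(w * logit) / np.sum(w)
        Q = np.sum(w * (logit - mu_fe) ** 2)         # heterogeneity statistic
        tau2 = max(0.0, (Q - (len(p) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))

        w_re = 1 / (var + tau2)                      # random-effects weights
        mu_re = np.sum(w_re * logit) / np.sum(w_re)
        print(1 / (1 + np.exp(-mu_re)))              # pooled positive agreement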

  20. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2012-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as from the other turbine stages. The standard technique for forced response analysis to assess structural integrity is to decompose a CFD-generated flow field into its harmonic components and then to perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. These complications raise the question of whether frequency domain analysis is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis have therefore been performed. The first is of a bladed disk with each blade modeled by simple beam elements. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD used in the J2X engine program. The results showed that the transient analysis results were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists.
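
    The comparison can be miniaturized to a single-degree-of-freedom toy problem: compute the steady-state amplitude in the frequency domain, then time-march the same oscillator under the same harmonic load and compare peaks. This is a hedged sketch with invented mass, damping, stiffness and load values, not the blade model from the paper.

        # Single-DOF toy: frequency-domain amplitude vs. transient peak
        import numpy as np
        from scipy.integrate import solve_ivp

        m, c, k = 1.0, 0.5, 1000.0                   # mass, damping, stiffness (invented)
        F0, w = 1.0, 30.0                            # harmonic force amplitude, frequency

        # Frequency-domain steady-state amplitude
        amp_freq = abs(F0 / (k - m * w ** 2 + 1j * c * w))

        # Transient solution from rest under the same load
        def rhs(t, y):
            x, v = y
            return [v, (F0 * np.sin(w * t) - c * v - k * x) / m]

        sol = solve_ivp(rhs, (0.0, 40.0), [0.0, 0.0], max_step=1e-3)
        amp_trans = np.abs(sol.y[0][sol.t > 30.0]).max()   # peak after transients decay
        print(amp_freq, amp_trans)                         # should agree closely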

  1. A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-05-01

    Background: Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species. Results: We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner.
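
    A small Python sketch of the quantity all four approximations target: the first-order variance-based sensitivity index S_i = Var(E[Y|X_i])/Var(Y), estimated for a toy two-input model both by pick-freeze Monte Carlo and by Gauss-Hermite quadrature for the inner conditional expectation. The model is an invented stand-in, not the MAPK cascade of the paper:

        import numpy as np

        rng = np.random.default_rng(0)
        f = lambda x1, x2: np.exp(0.5 * x1) + 0.3 * x2**2   # invented response

        # (1) Pick-freeze Monte Carlo estimate of S_1
        n = 100_000
        x1, x2, x2b = rng.standard_normal((3, n))
        y, y1 = f(x1, x2), f(x1, x2b)            # the two runs share x1 only
        s1_mc = np.cov(y, y1)[0, 1] / np.var(y)

        # (2) Gauss-Hermite quadrature for E[Y | X1]: for Z ~ N(0,1),
        # E[g(Z)] ~ sum_k w_k g(sqrt(2) t_k) / sqrt(pi)
        t, w = np.polynomial.hermite.hermgauss(20)
        x1g = rng.standard_normal(n)
        inner = (f(x1g[:, None], np.sqrt(2) * t[None, :]) * w).sum(axis=1) / np.sqrt(np.pi)
        s1_ghi = np.var(inner) / np.var(f(x1g, rng.standard_normal(n)))

        print(f"S1 (pick-freeze Monte Carlo) ~ {s1_mc:.3f}")
        print(f"S1 (Gauss-Hermite)           ~ {s1_ghi:.3f}")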

  2. Elemental analysis of the suspended particulate matter in the air of Tehran using INAA and AAS techniques. Appendix 11

    International Nuclear Information System (INIS)

    Sohrabpour, M.; Rostami, S.; Athari, M.

    1995-01-01

    A network of ten sampling stations for monitoring the elemental concentration of the suspended particulate matter (SPM) in the air of Tehran has been established. Instrumental neutron activation analysis (INAA) and atomic absorption spectrometry (AAS) techniques have been used for analysis of the Whatman-41 filters collected during the year 1994. Preliminary analysis of the results obtained with the two techniques yielded concentrations for the following twenty-one elements: Al, Br, Ca, Cd, Ce, Cl, Co, Cr, Cs, Fe, K, Mg, Mn, Na, Ni, Pb, Sb, Sc, Ti, V, Zn. Various standard solutions with known concentrations of elements, together with standard reference materials, have been used for quality assurance of the measured concentrations. (author)

  3. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as the experience effect, modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
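
    A compact Python sketch of the experience-curve fit: regress log price on log cumulative production and convert the exponent to a learning rate (fractional price drop per doubling of production). The shipment and price series are invented; the standards analyses use BLS price indices and industry shipment data:

        import numpy as np

        # Experience curve P = P0 * Q**b with b < 0, fitted in log-log space.
        cum_shipments = np.array([1.0, 2.1, 4.5, 9.8, 20.0, 41.0])   # millions of units
        real_price = np.array([520., 470., 420., 380., 345., 310.])  # deflated dollars

        b, log_p0 = np.polyfit(np.log(cum_shipments), np.log(real_price), 1)
        learning_rate = 1 - 2.0 ** b   # price falls by this fraction per doubling

        print(f"experience exponent b = {b:.3f}")
        print(f"learning rate = {learning_rate:.1%} per doubling of production")
        # A projected price at future cumulative production Q is then
        # np.exp(log_p0) * Q**b, replacing the constant-price assumption.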

  4. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  5. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  6. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
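
    A minimal Python sketch of how the figure of merit is used once the NGS estimation has been run: each method's estimated linear model (slope a, bias b, noise sigma) yields a noise-to-slope ratio, and methods are ranked by it. The per-method estimates below are invented placeholders, not output of the paper's studies:

        # measured ~ a * true + b + noise(sigma); NGS estimates (a, b, sigma)
        # per method without access to the true values.
        methods = {
            "method A": (0.95, 0.10, 0.08),   # hypothetical (a, b, sigma)
            "method B": (1.02, 0.05, 0.06),
            "method C": (0.88, 0.20, 0.11),
        }

        nsr = {name: sigma / a for name, (a, b, sigma) in methods.items()}
        for name in sorted(nsr, key=nsr.get):   # smaller NSR = better precision
            print(f"{name}: NSR = {nsr[name]:.3f}")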

  7. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  8. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first and/or second order terms of the Taylor series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of k-eff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. This technique can also offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response.
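
    A minimal Python sketch of the underlying idea: once the first- and second-derivative terms of the Taylor series have been tallied, the change in k-eff for any perturbation size follows without rerunning the transport calculation. The derivative values here are placeholders, not MCNP output:

        def delta_k(dk_dp: float, d2k_dp2: float, dp: float) -> float:
            """First- plus second-order Taylor-series change in k-eff for a
            fractional perturbation dp of a cross-section parameter."""
            return dk_dp * dp + 0.5 * d2k_dp2 * dp**2

        # One tallied run can serve many perturbations:
        for dp in (0.01, 0.05, 0.20):
            print(f"dp = {dp:5.2f} -> delta_k ~ {delta_k(0.30, -0.8, dp):+.5f}")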

  9. Comparison of QuadrapolarTM radiofrequency lesions produced by standard versus modified technique: an experimental model

    Directory of Open Access Journals (Sweden)

    Safakish R

    2017-06-01

    Lower back pain (LBP) is a global public health issue and is associated with substantial financial costs and loss of quality of life. Estimates of the causes of back pain vary across the literature; the following statistic is the closest estimation for our patient population: sacroiliac (SI) joint pain is responsible for LBP in 18%-30% of individuals with LBP. Quadrapolar™ radiofrequency ablation, which involves ablation of the nerves of the SI joint using heat, is a commonly used treatment for SI joint pain. However, the standard Quadrapolar radiofrequency procedure is not always effective at ablating all the sensory nerves that cause the pain in the SI joint. One of the major limitations of the standard Quadrapolar radiofrequency procedure is that it produces small lesions of ~4 mm in diameter; smaller lesions increase the likelihood of failure to ablate all nociceptive input. In this study, we compare the standard Quadrapolar radiofrequency ablation technique to a modified Quadrapolar ablation technique that has produced improved patient outcomes in our clinic. The methodologies of the two techniques are compared. In addition, we compare results from an experimental model comparing the lesion sizes produced by the two techniques. Taken together, the findings from this study suggest that the modified Quadrapolar technique provides longer lasting relief for back pain caused by SI joint dysfunction. A randomized controlled clinical trial is the next step required to quantify the difference in symptom relief and quality of life produced by the two techniques. Keywords: lower back pain, radiofrequency ablation, sacroiliac joint, Quadrapolar radiofrequency ablation

  10. Standard Test Method for Determining Thermal Neutron Reaction Rates and Thermal Neutron Fluence Rates by Radioactivation Techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 The purpose of this test method is to define a general procedure for determining an unknown thermal-neutron fluence rate by neutron activation techniques. It is not practicable to describe completely a technique applicable to the large number of experimental situations that require the measurement of a thermal-neutron fluence rate. Therefore, this method is presented so that the user may adapt to his particular situation the fundamental procedures of the following techniques. 1.1.1 Radiometric counting technique using pure cobalt, pure gold, pure indium, cobalt-aluminum alloy, gold-aluminum alloy, or indium-aluminum alloy. 1.1.2 Standard comparison technique using pure gold, or gold-aluminum alloy, and 1.1.3 Secondary standard comparison techniques using pure indium, indium-aluminum alloy, pure dysprosium, or dysprosium-aluminum alloy. 1.2 The techniques presented are limited to measurements at room temperatures; special problems arise when making thermal-neutron fluence rate measurements in high-temperature environments.
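
    A small Python sketch of the radiometric counting technique in 1.1.1 for a hypothetical gold foil: the measured activity is corrected to saturation and divided by the number of target atoms times the activation cross section. Values are illustrative, and a real measurement requires the corrections (e.g., cadmium ratio, self-shielding) described in the standard:

        import numpy as np

        def fluence_rate(a_meas, half_life_s, t_irr_s, t_decay_s, n_atoms, sigma_cm2):
            """phi = A_sat / (N * sigma), with saturation activity recovered
            from the measured activity via irradiation and decay times."""
            lam = np.log(2) / half_life_s
            a_sat = a_meas / ((1 - np.exp(-lam * t_irr_s)) * np.exp(-lam * t_decay_s))
            return a_sat / (n_atoms * sigma_cm2)

        # Hypothetical 10 mg gold foil: Au-198 half-life 2.695 d,
        # thermal cross section of Au-197 ~ 98.65 b
        n_au = 6.022e23 * 0.010 / 196.97
        phi = fluence_rate(a_meas=1.2e4,              # Bq at counting time
                           half_life_s=2.695 * 86400,
                           t_irr_s=3600, t_decay_s=7200,
                           n_atoms=n_au, sigma_cm2=98.65e-24)
        print(f"thermal fluence rate ~ {phi:.3e} n/cm^2/s")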

  11. Determination of 25 elements in biological standard reference materials by neutron activation analysis

    International Nuclear Information System (INIS)

    Guzzi, G.; Pietra, R.; Sabbioni, E.

    1974-12-01

    The Standard and Certified Reference Materials programme of the JRC includes the determination of trace elements in complex biological samples delivered by the U.S. National Bureau of Standards: Bovine Liver (NBS SRM 1577), Orchard Leaves (NBS SRM 1571) and Tomato Leaves. The study was performed using neutron activation analysis. Due to the very low concentration of some elements, radiochemical group or elemental separation procedures were necessary. The paper describes the techniques used to analyse 25 elements. Computer-assisted instrumental neutron activation analysis with high resolution Ge(Li) spectrometry was considerably advantageous in the determination of Na, K, Cl, Mn, Fe, Rb and Co, and in some cases of Ca, Zn, Cs, Sc and Cr. For low contents of Ca, Mg, Ni and Si, special chemical separation schemes followed by Cerenkov counting have been developed. Two other separation procedures allowing the determination of As, Cd, Ga, Hg, Mo, Cu, Sr, Se, Ba and P have been set up. The first, simplified one involves the use of high resolution Ge(Li) detectors; the second, more complete one involves a larger number of shorter measurements performed by simpler and more sensitive techniques, such as NaI(Tl) scintillation spectrometry and Cerenkov counting. The results obtained are presented and discussed.

  12. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and gamma radiations, measurements are obtained directly from a large volume of sample (3-30 kg). Gamma-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  13. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)
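
    A minimal Python sketch relating the key terms above for a repairable component, under an assumed constant-failure-rate (exponential) model; the MTBF/MTTR figures are illustrative only:

        import math

        def steady_state_availability(mtbf_h: float, mttr_h: float) -> float:
            """Fraction of time a repairable component can perform its function."""
            return mtbf_h / (mtbf_h + mttr_h)

        def mission_reliability(mtbf_h: float, mission_h: float) -> float:
            """Survival probability over a mission, constant failure rate assumed."""
            return math.exp(-mission_h / mtbf_h)

        print(f"availability       ~ {steady_state_availability(8000, 24):.4f}")
        print(f"1-year reliability ~ {mission_reliability(8000, 8760):.3f}")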

  14. Analysis of ISO 26262 compliant techniques for the automotive domain

    NARCIS (Netherlands)

    S., Manoj Kannan; Dajsuren, Y.; Luo, Y.; Barosan, I.; Antkiewicz, M.; Atlee, J.; Dingel, J.; S, R.

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying with ISO 26262 have been developed. However, it is not clear which parts and (sub-)phases of the standard these techniques address.

  15. Rendezvous technique for recanalization of long-segmental chronic total occlusion above the knee following unsuccessful standard angioplasty.

    Science.gov (United States)

    Cao, Jun; Lu, Hai-Tao; Wei, Li-Ming; Zhao, Jun-Gong; Zhu, Yue-Qi

    2016-04-01

    To assess the technical feasibility and efficacy of the rendezvous technique, a type of subintimal retrograde wiring, for the treatment of long-segmental chronic total occlusions above the knee following unsuccessful standard angioplasty. The rendezvous technique was attempted in eight limbs of eight patients with chronic total occlusions above the knee after standard angioplasty failed. The clinical symptoms and ankle-brachial index were compared before and after the procedure. At follow-up, pain relief, wound healing, limb salvage, and the presence of restenosis of the target vessels were evaluated. The rendezvous technique was performed successfully in seven patients (87.5%) and failed in one patient (12.5%). Foot pain improved in all seven patients who underwent successful treatment, with ankle-brachial indexes improving from 0.23 ± 0.13 before to 0.71 ± 0.09 after the procedure. The rendezvous technique is a feasible and effective treatment for chronic total occlusions above the knee when standard angioplasty fails. © The Author(s) 2015.

  16. Temperature analysis of laser ignited metalized material using spectroscopic technique

    Science.gov (United States)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    Temperature measurement of laser-ignited aluminized nano-energetic mixtures using spectroscopy has considerable scope for analysing material characteristics and combustion behaviour. Spectroscopic analysis permits an in-depth study of the combustion of materials that is difficult to achieve with standard pyrometric methods. Laser ignition was used because it consumes less energy than electric ignition, while the ignited material dissipates the same energy, with the same impact, as under electric ignition. The research presented here is primarily focused on the temperature analysis of an energetic material comprising an explosive mixed with a nano-material and ignited with a laser. Spectroscopy is used to estimate the temperature during the ignition process. The nano-energetic mixture used in the research does not contain any material that is sensitive to high impact.
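
    As an illustration, one standard spectroscopic route to temperature (not necessarily the method of this work) is a Boltzmann plot: for emission lines of one species, ln(I*lambda/(g*A)) falls linearly with upper-level energy, with slope -1/(k_B*T). A Python sketch with invented line data:

        import numpy as np

        k_b = 8.617e-5                                    # Boltzmann constant, eV/K
        e_up = np.array([3.14, 3.36, 3.60, 4.02])         # upper-level energies, eV
        ga = np.array([4.9e8, 7.4e8, 9.8e8, 5.6e8])       # g*A products, 1/s
        lam = np.array([396.2, 394.4, 309.3, 308.2])      # wavelengths, nm

        # synthetic intensities generated at a "true" 5500 K for the demo
        intensity = (ga / lam) * np.exp(-e_up / (k_b * 5500.0))

        y = np.log(intensity * lam / ga)
        slope, _ = np.polyfit(e_up, y, 1)                 # slope = -1/(k_B*T)
        print(f"T ~ {-1 / (k_b * slope):.0f} K")          # recovers ~5500 K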

  17. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques in environmental sciences are presented and reviewed.

  18. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
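
    A short Python sketch of the detection-performance quantification described above: sweeping a threshold over a detection statistic to trace probability of detection (POD) against probability of false alarm (PFA). The healthy/faulty score distributions are synthetic stand-ins for a vibration feature:

        import numpy as np

        rng = np.random.default_rng(1)
        healthy = rng.normal(0.0, 1.0, 500)   # statistic under no-fault runs
        faulty = rng.normal(2.0, 1.2, 500)    # statistic under seeded-fault runs

        thresholds = np.linspace(-4, 8, 200)
        pfa = [(healthy > t).mean() for t in thresholds]
        pod = [(faulty > t).mean() for t in thresholds]

        # e.g., report POD at a fixed 5% false-alarm rate
        t5 = np.quantile(healthy, 0.95)
        print(f"POD at PFA=0.05: {(faulty > t5).mean():.2f}")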

  19. IEEE standard requirements for reliability analysis in the design and operation of safety systems for nuclear power generating stations

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The purpose of this standard is to provide uniform, minimum acceptable requirements for the performance of reliability analyses for safety-related systems found in nuclear-power generating stations, but not to define the need for an analysis. The need for reliability analysis has been identified in other standards which expand the requirements of regulations (e.g., IEEE Std 379-1972 (ANSI N41.2-1972), "Guide for the Application of the Single-Failure Criterion to Nuclear Power Generating Station Protection System," which describes the application of the single-failure criterion). IEEE Std 352-1975, "Guide for General Principles of Reliability Analysis of Nuclear Power Generating Station Protection Systems," provides guidance in the application and use of reliability techniques referred to in this standard

  20. Enhancements and Health-Related Studies of Neutron Activation Analysis Technique

    International Nuclear Information System (INIS)

    Soliman, M.A.M.

    2012-01-01

    The work presented in this thesis covers two major points. The first concerns the establishment of an accurate standardization method with multi-elemental capabilities and low workload, suitable for NAA standardization at ETRR-2. The second deals with constructing and developing an effective nondestructive technique for the analysis of liquid samples based on NAA using (very) short-lived radionuclides. To achieve the first goal, attention has been directed toward implementation of the k0-method for calculation of element concentrations in the samples. The k0-method of NAA standardization has had considerable success as a method for accurate multi-elemental analysis with comparably low workload. The k0-method is based on the fact that the unknown sample is irradiated with only one standard element as comparator. To assess the implementation of this method at ETRR-2, careful and complete characterization of the neutron flux parameters in the irradiation positions, as well as the efficiency calibration of the γ-ray spectrometer, must be carried out. The required neutron flux parameters are the ratio of the thermal to epithermal neutron fluxes (f) and the deviation factor (α) of the epithermal neutron flux from the ideal 1/E law. The work presented in Chapter 4 shows the efficiency calibration curve of the γ-ray spectrometer system, which was obtained using standard radioactive point sources. Moreover, the f and α parameters were determined in selected irradiation sites using sets of Zr-Au neutron flux monitors. Due to their different locations relative to the reactor core, the available neutron fluxes in the selected irradiation positions differ substantially, so that different irradiation demands can be satisfied. The reference materials coal NIST 1632c and IAEA-Soil 7 were analyzed for data validation, and good agreement between the experimental values and the certified values was obtained. The results revealed that the k0-NAA method can be applied reliably at ETRR-2.

  1. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends developing now a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
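
    A brief Python sketch of one of the techniques named above, using SciPy's Kruskal-Wallis test as a nonparametric sensitivity screen: doses are grouped by quartile of a sampled input parameter and tested for differences across groups. The parameter and dose series are synthetic stand-ins for SYVAC output:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        param = rng.uniform(0, 1, 300)                        # sampled input parameter
        dose = np.exp(2.0 * param) + rng.normal(0, 0.5, 300)  # toy model output

        bins = np.quantile(param, [0.25, 0.5, 0.75])          # quartile edges
        groups = [dose[np.digitize(param, bins) == k] for k in range(4)]
        h, p = stats.kruskal(*groups)
        print(f"Kruskal-Wallis H = {h:.1f}, p = {p:.2e}")     # small p -> dose sensitive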

  2. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  3. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost-effective decision making.

  4. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost-effective decision making.

  5. CONVERGENCE OF INTERNATIONAL AUDIT STANDARDS AND AMERICAN AUDIT STANDARDS REGARDING SAMPLING

    Directory of Open Access Journals (Sweden)

    Chis Anca Oana

    2013-07-01

    Sampling is widely used in market research, scientific analysis, market analysis, opinion polls and, not least, in the financial statement audit. What exactly is sampling, and how did it appear? Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Nowadays the technique is indispensable, as economic entities operate with sophisticated computer systems and large amounts of data. Economic globalization and the complexity of capital markets have made possible not only the harmonization of international accounting standards with national ones, but also the convergence of international accounting and auditing standards with the American regulations. International Standard on Auditing 530 and Statement on Auditing Standard 39 are the two main international and American normative references on audit sampling. This article discusses the origin of audit sampling, giving a brief history of the method and different definitions from the literature. The two standards are studied using Jaccard indicators in terms of their degree of similarity and dissimilarity on different issues. The Jaccard coefficient measures the degree of convergence of international auditing standards (ISA 530) and U.S. auditing standards (SAS 39). International auditing standards and American auditing standards both study the sampling problem, and both regulations present common points with regard to accepted sampling techniques, factors influencing the audit sample, treatment of identified misstatements and the circumstances in which sampling is appropriate. The study shows that both standards agree on the application of statistical and non-statistical sampling in auditing and that sampling is appropriate for tests of details and controls, the factors affecting audit sampling being audit risk, audit objectives and the population's characteristics.
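
    A tiny Python illustration of the Jaccard measure used in the study: similarity is the share of common items among all items compared, and dissimilarity its complement. The provision sets are invented placeholders, not the actual coded content of ISA 530 and SAS 39:

        def jaccard(a: set, b: set) -> float:
            """|A intersection B| / |A union B|"""
            return len(a & b) / len(a | b)

        isa_530 = {"statistical sampling", "non-statistical sampling",
                   "tests of details", "tests of controls", "projection of misstatements"}
        sas_39 = {"statistical sampling", "non-statistical sampling",
                  "tests of details", "tests of controls", "sampling risk"}

        print(f"similarity    = {jaccard(isa_530, sas_39):.2f}")
        print(f"dissimilarity = {1 - jaccard(isa_530, sas_39):.2f}")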

  6. Cesarean sections, perfecting the technique and standardizing the practice: an analysis of the book Obstetrícia, by Jorge de Rezende.

    Science.gov (United States)

    Nakano, Andreza Rodrigues; Bonan, Claudia; Teixeira, Luiz Antônio

    2016-01-01

    This article discusses the development of techniques for cesarean sections by doctors in Brazil during the 20th century, by analyzing the section "Operação Cesárea" (Cesarean Section) in three editions of the textbook Obstetrícia, by Jorge de Rezende. His prominence as an author in obstetrics and his particular style of working created the groundwork for the normalization of the practice of cesarean sections. The networks of meaning practiced within this scientific community included a "provision for feeling and for action" (Fleck) which established the C-section as a "normal" delivery: showing standards that exclude unpredictability, chaos, and dangers associated with the physiology of childbirth, and meeting the demand for control, discipline and safety, qualities associated with the practices, techniques and technologies of biomedicine.

  7. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-mass compounds, like polyphenols (including flavonoids), pigments, vitamins and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins and other contaminants, including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  8. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely: segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also given. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified herein according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  9. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  10. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  11. Multielement analysis of biological standards by neutron activation analysis

    International Nuclear Information System (INIS)

    Nadkarni, R.A.

    1977-01-01

    Up to 28 elements were determined in two IAEA standards: Animal Muscle H4 and Fish Soluble A 6/74, and three NBS standards: Spinach: SRM-1570, Tomato Leaves: SRM-1573 and Pine Needles: SRM-1575, by instrumental neutron-activation analysis. Seven noble metals were determined in two NBS standards: Coal: SRM-1632 and Coal Fly Ash: SRM-1633, by a radiochemical procedure, while 11 rare earth elements were determined in the NBS standard Orchard Leaves: SRM-1571 by instrumental neutron-activation analysis. The results are in good agreement with the certified and/or literature data where available. The irradiations were performed at the Cornell TRIGA Mark II nuclear reactor at a thermal neutron flux of 1-3x10^12 n cm^-2 s^-1. The short-lived species were determined after a 2-minute irradiation in the pneumatic rabbit tube, and the longer-lived species after an 8-hour irradiation in the central thimble facility. The standards and samples were counted on a coaxial 56-cm^3 Ge(Li) detector. The system resolution was 1.96 keV (FWHM) with a peak-to-Compton ratio of 37:1 and a counting efficiency of 13%, all relative to the 1.332 MeV photopeak of Co-60. (T.I.)

  12. Post-voiding residual urine and capacity increase in orthotopic urinary diversion: Standard vs modified technique

    Directory of Open Access Journals (Sweden)

    Bančević Vladimir

    2010-01-01

    Background/Aim. Ever since the first orthotopic urinary diversion (pouch) was performed, there has been constant improvement and modification of surgical techniques. The aim has been to create a urinary reservoir similar to the normal bladder, to decrease the incidence of postoperative complications and to provide improved quality of life. The aim of this study was to compare post-voiding residual urine (PVR) and capacity of pouches constructed by the standard or a modified technique. Methods. In this prospective and partially retrospective clinical study we included 79 patients. In the group of 41 patients (group ST) the pouch was constructed using 50-70 cm of the ileum (standard technique). In the group of 38 patients (group MT) the pouch was constructed using 25-35 cm of the ileum (modified technique). Postoperatively, PVR and pouch capacity were measured using ultrasound at 3, 6 and 12 months. Results. Postoperatively, an increase in PVR and pouch capacity was noticed in both groups. Twelve months postoperatively, PVR was significantly smaller in group MT than in group ST [23 (0-90) mL vs 109 (0-570) mL, p < 0.001]. In the same period the pouch capacity was significantly smaller in group MT than in group ST [460 (290-710) mL vs 892 (480-2050) mL, p < 0.001]. Conclusion. Postoperatively, an increase in PVR and pouch capacity was noticed during the 12-month period. A year following the operation the pouch created from the shorter ileal segment reached the capacity of the 'normal' bladder with small PVR. The pouch created by the standard technique developed an unnecessarily large PVR and capacity.

  13. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    Sanchez Paz, L.A.

    1991-01-01

    The objectives of this work were to quantify the lead content of two types of canned chilli from three trademarks, determining whether it falls within the maximum permissible level (2 ppm); to compare two trademarks available in both glass-jar and canned presentations in order to determine the effect of the container on the final lead content; and to make a comparative study of the techniques used in terms of accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and X-ray fluorescence. The samples were pretreated by calcination, followed by dissolution of the ashes in acid medium and dilution to a known volume for analysis by atomic absorption and plasma emission. For the analysis by X-ray fluorescence, after solubilizing the ashes, the lead was precipitated with PCDA (pyrrolidine carbodithioic ammonium acid), then filtered, and the filter paper dried and counted directly. The standards were prepared following the same procedure as the samples, using lead titrisol solution. For each technique the recovery percentage was determined by the addition of a known amount. Calibration curves plotted for each technique showed that all three are linear in the established working range. The recovery percentage in all three cases was above ninety-five percent. A variance analysis determined that the lead content of the samples does not exceed 2 ppm, and that the lead content of canned chillis is higher than that of chillis in glass containers (1.7 and 0.4 ppm, respectively). The X-ray fluorescence results differ from those obtained by the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission. (Author)
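
    A short Python sketch of the external-calibration and spike-recovery arithmetic described above, with invented absorbance readings standing in for the atomic absorption measurements:

        import numpy as np

        # Linear calibration curve from lead standards (titrisol dilutions)
        std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # ppm Pb
        std_abs = np.array([0.002, 0.051, 0.099, 0.205, 0.398])

        slope, intercept = np.polyfit(std_conc, std_abs, 1)
        conc = lambda a: (a - intercept) / slope             # invert the curve

        sample_abs = 0.112
        print(f"sample Pb ~ {conc(sample_abs):.2f} ppm")     # vs the 2 ppm limit

        # Spike recovery: add a known amount and see how much is found
        spiked_abs, spike_added = 0.212, 1.0                 # ppm added
        recovery = (conc(spiked_abs) - conc(sample_abs)) / spike_added
        print(f"recovery ~ {recovery:.0%}")                  # should exceed ~95%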

  14. Analysis of standard substance human hair

    International Nuclear Information System (INIS)

    Zou Shuyun; Zhang Yongbao

    2005-01-01

    Human hair samples used as standard substances were analyzed by neutron activation analysis (NAA) on a miniature neutron source reactor. 19 elements, i.e. Al, As, Ba, Br, Ca, Cl, Cr, Co, Cu, Fe, Hg, I, Mg, Mn, Na, S, Se, V and Zn, were measured. The average content, standard deviation, relative standard deviation and the detection limit under the present research conditions are given for each element, and the results show that the measured values of the samples are in agreement with the recommended values, indicating that NAA can be used to analyze the standard substance human hair with relatively high accuracy. (authors)

  15. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  16. Placement of empty catheters for an HDR-emulating LDR prostate brachytherapy technique: comparison to standard intraoperative planning.

    Science.gov (United States)

    Niedermayr, Thomas R; Nguyen, Paul L; Murciano-Goroff, Yonina R; Kovtun, Konstantin A; Neubauer Sugar, Emily; Cail, Daniel W; O'Farrell, Desmond A; Hansen, Jorgen L; Cormack, Robert A; Buzurovic, Ivan; Wolfsberger, Luciant T; O'Leary, Michael P; Steele, Graeme S; Devlin, Philip M; Orio, Peter F

    2014-01-01

    We sought to determine whether placing empty catheters within the prostate and then inverse planning iodine-125 seed locations within those catheters (High Dose Rate-Emulating Low Dose Rate Prostate Brachytherapy [HELP] technique) would improve concordance between planned and achieved dosimetry compared with a standard intraoperative technique. We examined 30 consecutive low dose rate prostate cases performed by the standard intraoperative technique of planning followed by needle placement/seed deposition and compared them to 30 consecutive low dose rate prostate cases performed by the HELP technique. The primary endpoint was concordance between the planned V100/D90 (percentage of the clinical target volume receiving at least 100% of the prescribed dose; dose covering 90% of the clinical target volume) and the actual V100/D90 achieved at Postoperative Day 1. The HELP technique had superior concordance between the planned target dosimetry and what was actually achieved at Day 1 and Day 30. Specifically, target D90 at Day 1 was on average 33.7 Gy less than planned for the standard intraoperative technique but was only 10.5 Gy less than planned for the HELP technique (p < 0.05). Placing empty needles first and optimizing the plan to the known positions of the needles resulted in improved concordance between the planned and the achieved dosimetry to the target, possibly because of elimination of errors in needle placement. Copyright © 2014 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  17. Comparison of global sensitivity analysis techniques and importance measures in PSA

    International Nuclear Information System (INIS)

    Borgonovo, E.; Apostolakis, G.E.; Tarantola, S.; Saltelli, A.

    2003-01-01

    This paper discusses application and results of global sensitivity analysis techniques to probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell-Vesely importance at the parameter level. Results are discussed for the large LLOCA sequence of the advanced test reactor PSA
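
    A small Python sketch of the basic-event importance measures discussed above, computed by finite differences on a toy two-train risk model (not the ATR PSA of the paper). Birnbaum importance is the partial derivative of risk with respect to a basic-event probability; Fussell-Vesely is the fractional risk reduction when the event is made impossible:

        def risk(p):
            """Top-event probability for (A and B) or C, independent events."""
            pa, pb, pc = p
            return pa * pb + pc - pa * pb * pc

        p0 = [1e-2, 2e-2, 1e-4]
        r0 = risk(p0)

        for i, name in enumerate("ABC"):
            hi = p0.copy(); hi[i] = 1.0     # component failed
            lo = p0.copy(); lo[i] = 0.0     # component perfect
            birnbaum = risk(hi) - risk(lo)
            fussell_vesely = (r0 - risk(lo)) / r0
            print(f"{name}: Birnbaum = {birnbaum:.3e}, FV = {fussell_vesely:.3f}")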

  18. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    The quality and condition of a road surface is of great importance for convenience and safety of driving, so investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  19. Standardization of sample collection, isolation and analysis methods in extracellular vesicle research

    Directory of Open Access Journals (Sweden)

    Kenneth W. Witwer

    2013-05-01

    The emergence of publications on extracellular RNA (exRNA) and extracellular vesicles (EV) has highlighted the potential of these molecules and vehicles as biomarkers of disease and therapeutic targets. These findings have created a paradigm shift, most prominently in the field of oncology, prompting expanded interest in the field and dedication of funds for EV research. At the same time, understanding of EV subtypes, biogenesis, cargo and mechanisms of shuttling remains incomplete. The techniques that can be harnessed to address the many gaps in our current knowledge were the subject of a special workshop of the International Society for Extracellular Vesicles (ISEV) in New York City in October 2012. As part of the "ISEV Research Seminar: Analysis and Function of RNA in Extracellular Vesicles (evRNA)", 6 round-table discussions were held to provide an evidence-based framework for isolation and analysis of EV, purification and analysis of associated RNA molecules, and molecular engineering of EV for therapeutic intervention. This article arises from the discussion of EV isolation and analysis at that meeting. The conclusions of the round table are supplemented with a review of published materials and our experience. Controversies and outstanding questions are identified that may inform future research and funding priorities. While we emphasize the need for standardization of specimen handling, appropriate normative controls, and isolation and analysis techniques to facilitate comparison of results, we also recognize that continual development and evaluation of techniques will be necessary as new knowledge is amassed. On many points, consensus has not yet been achieved and must be built through the reporting of well-controlled experiments.

  20. Emotional Freedom Techniques for Anxiety: A Systematic Review With Meta-analysis.

    Science.gov (United States)

    Clond, Morgan

    2016-05-01

    Emotional Freedom Technique (EFT) combines elements of exposure and cognitive therapies with acupressure for the treatment of psychological distress. Randomized controlled trials retrieved by literature search were assessed for quality using the criteria developed by the American Psychological Association's Division 12 Task Force on Empirically Validated Treatments. As of December 2015, 14 studies (n = 658) met inclusion criteria. Results were analyzed using an inverse variance weighted meta-analysis. The pre-post effect size for the EFT treatment group was 1.23 (95% confidence interval, 0.82-1.64; p < 0.001). Emotional freedom technique treatment demonstrated a significant decrease in anxiety scores, even when accounting for the effect size of control treatment. However, there were too few data available comparing EFT to standard-of-care treatments such as cognitive behavioral therapy, and further research is needed to establish the relative efficacy of EFT to established protocols.
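
    As a concrete illustration of the inverse variance weighted pooling named in the abstract, here is a minimal sketch with invented per-study effect sizes and standard errors (they are not the review's data):

```python
import math

# Minimal sketch of inverse-variance weighted meta-analysis.
# Effect sizes and standard errors are invented for illustration.
effects = [1.10, 1.45, 0.90]     # per-study pre-post effect sizes
ses     = [0.30, 0.25, 0.40]     # their standard errors

weights = [1 / se**2 for se in ses]                  # w_i = 1 / SE_i^2
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))

lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled effect = {pooled:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```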

  1. Reduced Rate of Dehiscence After Implementation of a Standardized Fascial Closure Technique in Patients Undergoing Emergency Laparotomy

    DEFF Research Database (Denmark)

    Tolstrup, Mai-Britt; Watt, Sara Kehlet; Gögenur, Ismail

    2017-01-01

    …and multivariate Cox regression analysis were performed. RESULTS: We included 494 patients from 2014 to 2015 and 1079 patients from our historical cohort for comparison. All patients had a midline laparotomy in an emergency setting. The rate of dehiscence was reduced from 6.6% to 3.8%, P = 0.03, comparing year 2009 to 2013 with 2014 to 2015. Factors associated with dehiscence were male gender [hazard ratio (HR) 2.8, 95% confidence interval (95% CI) (1.8-4.4), P …], … (1.6-4.9), P …, … 4%, P = 0.008. CONCLUSION: The standardized procedure of closing the midline laparotomy by using a "small steps" technique of continuous suturing…

  2. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another, while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches. The book begins with four concrete examples of CPCA.

  3. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of time series analysis techniques to nuclear material accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time, by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach, which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart control chart, the cumulative summation of inventory differences statistic (CUSUM), and the Kalman filter and linear smoother.
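
    A minimal sketch of the CUSUM statistic mentioned above, applied to a short series of standardized inventory differences (the data, allowance k and threshold h are invented for illustration):

```python
# Minimal sketch of a one-sided CUSUM statistic applied to a series of
# standardized inventory differences (ID). The ID values, the
# allowance k and the decision threshold h are illustrative only.
inventory_diffs = [0.1, -0.2, 0.3, 0.4, 0.2, 0.5, 0.6, 0.4]

k, h = 0.5, 4.0   # allowance and alarm threshold, in sigma units
s = 0.0
for t, x in enumerate(inventory_diffs, start=1):
    s = max(0.0, s + x - k)          # accumulate departures above k
    print(f"t={t}  CUSUM={s:.2f}  alarm={s > h}")
```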

  4. Salivary Fluoride level in preschool children after toothbrushing with standard and low fluoride content dentifrice, using the transversal dentifrice application technique: pilot study

    Directory of Open Access Journals (Sweden)

    Fabiana Jandre Melo

    2008-01-01

    Full Text Available Objective: To investigate the salivary fluoride concentration in preschool children after toothbrushing with dentifrices containing standard (1100 ppmF/NaF) and low (500 ppmF/NaF) fluoride concentrations, using the transversal technique of placing the product on the toothbrush. Methods: Eight children of both sexes, ranging in age from 4 years and 9 months to 5 years and 6 months, participated in the study. The experiment was divided into two phases with a one-week interval. In the first stage, the children used the standard concentration dentifrice for one week, and in the second, the low concentration product. Samples were collected at the end of each experimental stage, at the following times: before brushing, immediately afterwards, and after 15, 30 and 45 minutes. The fluoride contents were analyzed by the microdiffusion technique. Statistical analysis was done by analysis of variance (ANOVA) and Student's t-test (p<0.05). Results: The salivary fluoride concentration was significantly higher at all times when the standard concentration product was used. The comparison between the fluoride concentration found before brushing and immediately afterwards showed an increase of 6.8 times for the standard dentifrice (0.19 x 1.29 μgF/ml) and of 20.5 times for the low concentration product (0.02 x 0.41 μgF/ml). Conclusion: Toothbrushing with both products promoted relevant increases in the salivary fluoride concentration; however, longitudinal studies are necessary to verify the clinical significance of this measurement.

  5. METHODOLOGY COMPARATIVE EVALUATION OF PROFESSIONAL STANDARDS AND EDUCATION STANDARDS WITH THE USE OF NON-NUMERIC DATA PROCESSING METHODS

    Directory of Open Access Journals (Sweden)

    Gennady V. Abramov

    2016-01-01

    Full Text Available The article discusses the development of a technique that allows for a comparative assessment of the requirements of professional standards and federal state educational standards. The results can be used by universities to adjust the learning process and to analyse their curricula for better compliance with professional standards.

  6. Photon and proton activation analysis of iron and steel standards using the internal standard method coupled with the standard addition method

    International Nuclear Information System (INIS)

    Masumoto, K.; Hara, M.; Hasegawa, D.; Iino, E.; Yagi, M.

    1997-01-01

    The internal standard method coupled with the standard addition method has been applied to photon activation analysis and proton activation analysis of minor elements and trace impurities in various types of iron and steel samples issued by the Iron and Steel Institute of Japan (ISIJ). Samples and standard addition samples were first dissolved to mix them homogeneously with an internal standard and the elements to be determined, and then solidified as a silica gel to obtain a similar matrix composition and geometry. Cerium and yttrium were used as internal standards in photon and proton activation, respectively. In photon activation, a 20 MeV electron beam was used for bremsstrahlung irradiation to reduce matrix activity and nuclear interference reactions, and the results were compared with those of 30 MeV irradiation. In proton activation, iron was removed by the MIBK extraction method after dissolving the samples, to reduce the radioactivity of 56Co produced from iron via the 56Fe(p,n)56Co reaction. The results of proton and photon activation analysis were in good agreement with the standard values of ISIJ. (author)

  7. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis, without the limitations of sample space and reagent use. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is freely available software that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo-experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
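
    The standard-curve arithmetic behind such tools can be sketched in a few lines. This is a from-first-principles illustration with invented Cq values; it is not the DAG Expression code or its API:

```python
import numpy as np

# Minimal sketch of standard-curve relative quantification.
# Fit Cq = slope * log10(quantity) + intercept on a dilution series,
# then invert the curve for unknowns. All numbers are invented.
dilution_qty = np.array([1.0, 0.2, 0.04, 0.008])   # 5-fold series
cq_standards = np.array([18.1, 20.5, 22.8, 25.2])
slope, intercept = np.polyfit(np.log10(dilution_qty), cq_standards, 1)

def quantity(cq):
    return 10 ** ((cq - intercept) / slope)

# Normalize the target gene by a reference gene measured the same way.
relative_expression = quantity(21.0) / quantity(19.3)
print(f"slope = {slope:.2f}, normalized expression = {relative_expression:.2f}")
```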

  8. Analysis of ISO 26262 Compliant Techniques for the Automotive Domain

    NARCIS (Netherlands)

    M. S. Kannan; Y. Dajsuren (Yanjindulam); Y. Luo; I. Barosan

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying with ISO 26262 have been developed. However, it is not clear which parts and (sub-)phases of the…

  9. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves

    2009-01-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step for the development of HRA techniques in the industry. Because it is a first-generation technique, the THERP quantification tables of human errors are based on a taxonomy that does not take the mechanisms of human error into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule-based level, this technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without processing any error mechanisms. Another deficiency is the fact that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probabilities) standard scenarios, along with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison between THERP, an HRA technique of the first generation, and CREAM as well as ATHEANA, which are HRA techniques of the second generation. (author)

  11. Concrete blocks. Analysis of UNE, ISO and EN standards and comparison with other international standards

    Directory of Open Access Journals (Sweden)

    Álvarez Alonso, Marina

    1990-12-01

    Full Text Available This paper attempts to describe the recently approved UNE standards through a systematic analysis of the main specifications therein contained and the values considered for each of them, as well as the drafts for the ISO and EN concrete block standards. Furthermore, the study tries to place the set of UNE standards in the international environment through a comparative analysis against a representative sample of the standards prevailing in various geographical regions of the globe, to determine the analogies and differences among them. KEY WORDS: masonry, system analysis, concrete blocks, masonry walls, standards


  12. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables, respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation was associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.001] when comparing the ultrasound guided technique with the palpation technique. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.

  13. Compressed air injection technique to standardize block injection pressures.

    Science.gov (United States)

    Tsui, Ban C H; Li, Lisa X Y; Pillay, Jennifer J

    2006-11-01

    Presently, no standardized technique exists to monitor injection pressures during peripheral nerve blocks. Our objective was to determine if a compressed air injection technique, using an in vitro model based on Boyle's law and typical regional anesthesia equipment, could consistently maintain injection pressures below a 1293 mmHg level associated with clinically significant nerve injury. Injection pressures for 20 and 30 mL syringes with various needle sizes (18G, 20G, 21G, 22G, and 24G) were measured in a closed system. A set volume of air was aspirated into a saline-filled syringe and then compressed and maintained at various percentages while pressure was measured. The needle was inserted into the injection port of a pressure sensor, which had attached extension tubing with an injection plug clamped "off". Using linear regression with all data points, the pressure value and 99% confidence interval (CI) at 50% air compression was estimated. The linearity of Boyle's law was demonstrated with a high correlation, r = 0.99, and a slope of 0.984 (99% CI: 0.967-1.001). The net pressure generated at 50% compression was estimated as 744.8 mmHg, with the 99% CI between 729.6 and 760.0 mmHg. The various syringe/needle combinations had similar results. By creating and maintaining syringe air compression at 50% or less, injection pressures will be substantially below the 1293 mmHg threshold considered to be an associated risk factor for clinically significant nerve injury. This technique may allow simple, real-time and objective monitoring during local anesthetic injections while inherently reducing injection speed.
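
    The arithmetic behind the 50% compression figure follows directly from Boyle's law; here is a minimal sketch, assuming standard atmospheric pressure (the numbers are not the authors' calibration data):

```python
# Minimal sketch of the Boyle's-law relation behind the technique:
# compressing an aspirated air bubble to a fraction (1 - c) of its
# initial volume raises absolute pressure by a factor 1 / (1 - c).
P_ATM = 760.0   # mmHg, absolute atmospheric pressure (assumed)

def gauge_pressure(compression):
    """Gauge pressure (mmHg) after compressing the air by `compression`."""
    absolute = P_ATM / (1.0 - compression)   # P1 * V1 = P2 * V2
    return absolute - P_ATM

print(gauge_pressure(0.50))          # 760 mmHg, close to the measured 744.8
print(gauge_pressure(0.50) < 1293)   # stays below the injury-risk threshold
```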

  14. Limited vs extended face-lift techniques: objective analysis of intraoperative results.

    Science.gov (United States)

    Litner, Jason A; Adamson, Peter A

    2006-01-01

    To compare the intraoperative outcomes of superficial musculoaponeurotic system plication, imbrication, and deep-plane rhytidectomy techniques. Thirty-two patients undergoing primary deep-plane rhytidectomy participated. Each hemiface in all patients was submitted sequentially to 3 progressively more extensive lifts, while other variables were standardized. Four major outcome measures were studied, including the extent of skin redundancy and the repositioning of soft tissues along the malar, mandibular, and cervical vectors of lift. The amount of skin excess was measured without tension from the free edge to a point over the intertragal incisure, along a plane overlying the jawline. Using a soft tissue caliper, repositioning was examined by measurement of preintervention and immediate postintervention distances from dependent points to fixed anthropometric reference points. The mean skin excesses were 10.4, 12.8, and 19.4 mm for the plication, imbrication, and deep-plane lifts, respectively. The greatest absolute soft tissue repositioning was noted along the jawline, with the least in the midface. Analysis revealed significant differences from baseline and between lift types for each of the studied techniques in each of the variables tested. These data support the use of the deep-plane rhytidectomy technique to achieve a superior intraoperative lift relative to comparator techniques.

  15. Incorporating experience curves in appliance standards analysis

    International Nuclear Information System (INIS)

    Desroches, Louis-Benoit; Garbesi, Karina; Kantner, Colleen; Van Buskirk, Robert; Yang, Hung-Chia

    2013-01-01

    There exists considerable evidence that manufacturing costs and consumer prices of residential appliances have decreased in real terms over the last several decades. This phenomenon is generally attributable to manufacturing efficiency gained with cumulative experience producing a certain good, and is modeled by an empirical experience curve. The technical analyses conducted in support of U.S. energy conservation standards for residential appliances and commercial equipment have, until recently, assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. This assumption does not reflect real market price dynamics. Using price data from the Bureau of Labor Statistics, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These experience curves were incorporated into recent energy conservation standards analyses for these products. Including experience curves increases the national consumer net present value of potential standard levels. In some cases a potential standard level exhibits a net benefit when considering experience, whereas without experience it exhibits a net cost. These results highlight the importance of modeling more representative market prices. Highlights: Past appliance standards analyses have assumed constant equipment prices. There is considerable evidence of consistent real price declines. We incorporate experience curves for several large appliances into the analysis. The revised analyses demonstrate larger net present values of potential standards. The results imply that past standards analyses may have undervalued benefits.
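
    A minimal sketch of the empirical experience-curve relation described above, with invented parameters (not values from the Bureau of Labor Statistics analysis): price falls by a fixed fraction, the learning rate, for every doubling of cumulative production.

```python
import math

# Minimal sketch of an empirical experience curve:
#   price(X) = P0 * (X / X0) ** (-b),  b = -log2(1 - learning_rate)
P0, X0 = 500.0, 1e6        # base price ($) and base cumulative units
learning_rate = 0.15       # 15% price decline per doubling (assumed)
b = -math.log2(1 - learning_rate)   # experience exponent

def price(cumulative_units):
    return P0 * (cumulative_units / X0) ** (-b)

for x in (1e6, 2e6, 4e6, 8e6):
    print(f"{x:.0e} units -> ${price(x):.0f}")   # 500, 425, 361, 307
```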

  16. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
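
    A minimal sketch of the calibration-curve step described above, with invented attenuation data (the real curve would come from the measured alloys):

```python
import numpy as np

# Minimal sketch of the calibration-curve approach: the mass
# attenuation coefficient measured near the Au K-edge varies with the
# gold fraction, so a fit on alloys of known content lets an unknown
# sample be read off the curve. All numbers are invented.
au_fraction = np.array([0.375, 0.585, 0.750, 0.917])   # known alloys
mu_measured = np.array([2.10, 2.65, 3.08, 3.52])       # cm^2/g

slope, intercept = np.polyfit(au_fraction, mu_measured, 1)

def gold_content(mu):
    """Invert the calibration line for an unknown sample."""
    return (mu - intercept) / slope

print(f"estimated Au fraction: {gold_content(2.90):.3f}")
```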

  17. SDAT. Analysis of 131mXe with 133Xe interference

    International Nuclear Information System (INIS)

    Biegalski, S.R.F.; Foltz Biegalski, K.M.

    2009-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed at The University of Texas at Austin. SDAT utilizes a standard spectrum technique for the analysis of β-γ coincidence spectra. Testing was performed on the software to compare the standard spectrum analysis technique with a region of interest (ROI) analysis technique. Experimentally produced standard spectra and sample data were produced at the Nuclear Engineering Teaching Laboratory (NETL) TRIGA reactor. The results of the testing showed that the standard spectrum technique had lower errors than the ROI analysis technique for samples with low counting statistics. In contrast, the ROI analysis technique outperformed the standard spectrum technique in high counting statistics samples. It was also shown that the standard spectrum technique benefitted from a compression of the number of channels within the spectra. (author)
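
    The core of a standard spectrum technique is a linear unmixing step. Here is a minimal sketch with tiny invented spectra (not NETL data, and much simpler than SDAT itself):

```python
import numpy as np

# Minimal sketch of the standard spectrum technique: model a measured
# beta-gamma coincidence spectrum as a linear combination of standard
# spectra (here 131mXe and 133Xe) and fit the scale factors by least
# squares. The five-channel spectra below are invented.
std_131m = np.array([0.0, 1.0, 4.0, 1.0, 0.0])
std_133  = np.array([2.0, 3.0, 1.0, 0.5, 0.2])
rng = np.random.default_rng(1)
sample = 1.5 * std_131m + 0.8 * std_133 + rng.normal(0, 0.05, 5)

A = np.column_stack([std_131m, std_133])
coeffs, *_ = np.linalg.lstsq(A, sample, rcond=None)
print("fitted scale factors (131mXe, 133Xe):", np.round(coeffs, 2))
```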

  18. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires. Such failures generally do not represent a great risk to personnel. Repairs needed to maintain the reliability of these vessels might require extensive interruption of operation, which in turn considerably impacts the profitability of the unit. Therefore, the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D laser scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspect the equipment to generate maintenance or inspection recommendations, compare with previous results, and establish baseline data. Until recently, coke drum structural analysis was traditionally performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, the new strain analysis technique PSI (Plastic Strain Index) was developed. This method, which is based on the API 579/ASME FFS standard failure limit and represents the state of the art in coke drum bulging severity assessment, has an excellent correlation with failure history. (author)

  19. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The literature emphasizes that differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such…

  20. Chemical Separation Technique of Strontium-90 in Soil Water as the Standard Method for Environmental Radioactivity Analysis

    International Nuclear Information System (INIS)

    Ngasifudin-Hamdani; Suratman; Djoko-Sardjono, Ign; Winduanto-Wahyu SP

    2000-01-01

    Research on a technique for separating strontium-90 from its material matrix using a chemical precipitation method has been carried out. The technique was applied to the detection of the radionuclide strontium-90 contained in soil water at three locations near the P3TM BATAN nuclear reactor facility. The two important parameters used in this technique were the growth time of Y-90 and the stirring time. The results showed that the activity of strontium-90 at pos-01 was between 1.801x10^-19 and 9.616x10^-17 μCi/cm^3, at pos-02 between 8.448x10^-19 and 1.003x10^-16 μCi/cm^3, and at pos-03 between 6.719x10^-19 and 11.644x10^-16 μCi/cm^3. These data show that the activity of Sr-90 in the soil water near the P3TM BATAN nuclear reactor facility was still below the maximum permitted concentration, i.e. 4.0x10^-7 to 3.5x10^-6 μCi/cm^3. A statistical test using two-factorial analysis of variance with a randomized block design showed that the activity of Sr-90 in the soil water was influenced by the interaction between the growth time of Y-90 and the stirring time. (author)

  1. NET European Network on Neutron Techniques Standardization for Structural Integrity

    International Nuclear Information System (INIS)

    Youtsos, A.

    2004-01-01

    Improved performance and safety of European energy production systems is essential for providing safe, clean and inexpensive electricity to the citizens of the enlarged EU. The state of the art in assessing internal stresses, micro-structure and defects in welded nuclear components - as well as their evolution due to complex thermo-mechanical loads and irradiation exposure - needs to be improved before relevant structural integrity assessment code requirements can safely become less conservative. This is valid for both experimental characterization techniques and predictive numerical algorithms. In the course of the last two decades, neutron methods have proven to be excellent means of providing valuable information required in the structural integrity assessment of advanced engineering applications. However, European industry is hampered from broadly using neutron research due to the lack of harmonised and standardized testing methods. 35 major European industrial and research/academic organizations have joined forces, under JRC coordination, to launch the NET European Network on Neutron Techniques Standardization for Structural Integrity in May 2002. The NET collaborative research initiative aims at the further development and harmonisation of neutron scattering methods in support of structural integrity assessment. This is pursued through a number of testing round robin campaigns on neutron diffraction and small angle neutron scattering (SANS), supported by data provided by other, more conventional destructive and non-destructive methods, such as X-ray diffraction and deep and surface hole drilling. NET also strives to develop more reliable and harmonized simulation procedures for the prediction of residual stress and damage in steel welded power plant components. This is pursued through a number of computational round robin campaigns based on advanced FEM techniques and on reliable data obtained by such novel and harmonized experimental methods. The final goal of…

  2. Confirmatory factors analysis of science teacher leadership in the Thailand world-class standard schools

    Science.gov (United States)

    Thawinkarn, Dawruwan

    2018-01-01

    This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating scale questionnaire with a reliability of 0.986. The sample group included 500 science teachers from World-Class Standard Schools who had been selected using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted using Mplus for Windows. The results are as follows: The confirmatory factor analysis of science teacher leadership in the Thailand World-Class Standard Schools revealed that the model significantly correlated with the empirical data. The consistency index values were χ2 = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the 0.01 level. The values of the six factors were between 0.880 and 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, the role model in teaching, transformational leaders, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.

  3. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    Science.gov (United States)

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    The reference electrode standardization technique (REST) has been increasingly acknowledged and applied in recent years as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERPs) community around the world. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. Here, we have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information including publications, comments and documents on REST can also be found on this website. An example of usage is given with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.

  4. Internal standard method for determination of gallium and some trace elements in bauxite by neutron activation analysis

    International Nuclear Information System (INIS)

    Chen, S.G.; Tsai, H.T.

    1983-01-01

    A method is described for the determination of gallium and other trace elements such as Ce, Cr, Hf, Lu and Th in bauxite by the technique of neutron activation analysis, using gold as an internal standard. Isopropyl ether was used as the organic extractant for radioactive gallium from the sample. This method yields very good accuracy, with a relative error of ±3%. (author)

  5. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based sources and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.

  6. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    The proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental applications that have been carried out at JAERI Takasaki. (author)

  7. Standard test method for analysis of uranium and thorium in soils by energy dispersive X-Ray fluorescence spectroscopy

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method covers the energy dispersive X-ray fluorescence (EDXRF) spectrochemical analysis of trace levels of uranium and thorium in soils. Any sample matrix that differs from the general ground soil composition used for calibration (that is, fertilizer or a sample of mostly rock) would have to be calibrated separately to determine the effect of the different matrix composition. 1.2 The analysis is performed after an initial drying and grinding of the sample, and the results are reported on a dry basis. The sample preparation technique used incorporates into the sample any rocks and organic material present in the soil. This test method of sample preparation differs from other techniques that involve tumbling and sieving the sample. 1.3 Linear calibration is performed over a concentration range from 20 to 1000 μg per gram for uranium and thorium. 1.4 The values stated in SI units are to be regarded as the standard. The inch-pound units in parentheses are for information only. 1.5 This standard...

  8. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    NARCIS (Netherlands)

    Zweerink, A.; Allaart, C.P.; Kuijer, J.P.A.; Wu, L.; Beek, A.M.; Ven, P.M. van de; Meine, M.; Croisille, P.; Clarysse, P.; Rossum, A.C. van; Nijveldt, R.

    2017-01-01

    OBJECTIVES: Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive…

  9. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting appropriate digital I and C systems, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis; it then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. By this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity…

  10. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as neutron activation analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series dating and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  11. Study of some environmental problems in Egypt using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    El-Karim, A.H.M.G.

    2003-01-01

    This thesis deals with the investigation of the possibility of using the new (second) Egyptian research reactor (ETRR-2) at Inshas (22 MW) for the neutron activation analysis (NAA) of trace elements, particularly in air dust collected from Cairo and some other cities of Egypt. In this context, Chapter 1 gives an introduction to activation methods in general, describing the various techniques used and comparing the methods with other instrumental methods of analysis. As a main classification, the neutron activation methods involve prompt γ-ray NAA and delayed γ-ray NAA; cyclic NAA (repeated activation) is also outlined. The methodology of NAA involves the absolute method, the relative method, and the mono-standard (single comparator) method, which lies in between the absolute and relative methods.

  12. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to verify a stress analysis technique based on 3D models, making a comparison with the traditional technique, which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity that allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS, working directly from documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Each of the three databases is then used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state of the art achieved in this field.

  13. Development of isotope dilution-liquid chromatography/mass spectrometry combined with standard addition techniques for the accurate determination of tocopherols in infant formula

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joonhee; Jang, Eun-Sil; Kim, Byungjoo, E-mail: byungjoo@kriss.re.kr

    2013-07-17

    Highlights: ID-LC/MS method showed biased results for tocopherols analysis in infant formula. H/D exchange of deuterated tocopherols in sample preparation was the source of bias. Standard addition (SA)-ID-LC/MS was developed as an alternative to ID-LC/MS. Details of calculation and uncertainty evaluation of the SA-IDMS were described. SA-ID-LC/MS showed a higher-order metrological quality as a reference method. Abstract: During the development of isotope dilution-liquid chromatography/mass spectrometry (ID-LC/MS) for tocopherol analysis in infant formula, biased measurement results were observed when deuterium-labeled tocopherols were used as internal standards. It turned out that the biases came from intermolecular H/D exchange and intramolecular H/D scrambling of the internal standards during sample preparation. The degree of H/D exchange and scrambling showed considerable dependence on the sample matrix. Standard addition-isotope dilution mass spectrometry (SA-IDMS) based on LC/MS was developed in this study to overcome the shortcomings of using deuterium-labeled internal standards while retaining the inherent advantage of isotope dilution techniques for accurate recovery correction in sample preparation. Details of the experimental scheme, calculation equations, and uncertainty evaluation scheme are described in this article. The proposed SA-IDMS method was applied to several infant formula samples to test its validity. The method was proven to have a higher-order metrological quality, providing very accurate and precise measurement results.
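
    The standard-addition step at the heart of SA-IDMS reduces to a linear extrapolation. Here is a minimal sketch with invented spike levels and responses; the isotope-dilution recovery correction is assumed to have been applied already:

```python
import numpy as np

# Minimal sketch of the standard-addition calculation: measure the
# (recovery-corrected) response at several spike levels, fit a line,
# and take the magnitude of the x-intercept as the native content.
# All numbers are invented.
added    = np.array([0.0, 5.0, 10.0, 20.0])    # ug/g tocopherol added
response = np.array([1.20, 1.81, 2.39, 3.62])  # corrected responses

slope, intercept = np.polyfit(added, response, 1)
native_content = intercept / slope             # |x-intercept|
print(f"native content: {native_content:.2f} ug/g")
```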

  14. The quantitative analysis of Bowen's kale by PIXE using the internal standard

    International Nuclear Information System (INIS)

    Navarrete, V.R.; Izawa, G.; Shiokawa, T.; Kamiya, M.; Morita, S.

    1978-01-01

    The internal standard method was used for the non-destructive quantitative determination of trace elements by PIXE. A uniform distribution of the internal standard element in the Bowen's kale powder sample was obtained by using a homogenization technique. Eleven elements were determined quantitatively; samples prepared as self-supporting targets had lower relative standard deviations than non-self-supporting targets. (author)

  15. Round robin analyses of hydrogen isotope thin films standards

    Energy Technology Data Exchange (ETDEWEB)

    Banks, J.C. E-mail: jcbanks@sandia.gov; Browning, J.F.; Wampler, W.R.; Doyle, B.L.; LaDuca, C.A.; Tesmer, J.R.; Wetteland, C.J.; Wang, Y.Q

    2004-06-01

    Hydrogen isotope thin film standards have been manufactured at Sandia National Laboratories for use by the materials characterization community. Several considerations were taken into account during the manufacture of the ErHD standards, with accuracy and stability being the most important. The standards were fabricated by e-beam deposition of Er onto a Mo substrate, and the film was stoichiometrically loaded with hydrogen and deuterium. To determine the loading accuracy of the standards, two random samples were measured by thermal desorption mass spectrometry and atomic absorption spectrometry techniques with a stated combined accuracy of ~1.6% (1σ). All the standards were then measured by high energy RBS/ERD and RBS/NRA, with the accuracy of the techniques ~5% (1σ). The standards were then distributed to the IBA materials characterization community for analysis. This paper will discuss the suitability of the standards for use by the IBA community and compare measurement results to highlight the accuracy of the techniques used.

  16. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  17. Reliability evaluation of high-performance, low-power FinFET standard cells based on mixed RBB/FBB technique

    Science.gov (United States)

    Wang, Tian; Cui, Xiaoxin; Ni, Yewen; Liao, Kai; Liao, Nan; Yu, Dunshan; Cui, Xiaole

    2017-04-01

    With shrinking transistor feature sizes, the fin-type field-effect transistor (FinFET) has become the most promising option in low-power circuit design due to its superior capability to suppress leakage. To support the VLSI digital system flow based on logic synthesis, we have designed an optimized high-performance, low-power FinFET standard cell library based on employing the mixed FBB/RBB technique in the existing stacked structure of each cell. This paper presents the reliability evaluation of the optimized cells under process and operating environment variations, based on Monte Carlo analysis. The variations are modelled with a Gaussian distribution of the device parameters, and 10000 sweeps are conducted in the simulation to obtain the statistical properties of the worst-case delay and input-dependent leakage for each cell. For comparison, a set of non-optimal cells that adopt the same topology without employing the mixed biasing technique is also generated. Experimental results show that the optimized cells achieve a standard deviation reduction of at most 39.1% and 30.7% in worst-case delay and input-dependent leakage, respectively, while the shrinkage of the normalized deviation in worst-case delay and input-dependent leakage can be up to 98.37% and 24.13%, respectively, which demonstrates that our optimized cells are less sensitive to variability and exhibit greater reliability. Project supported by the National Natural Science Foundation of China (No. 61306040), the State Key Development Program for Basic Research of China (No. 2015CB057201), the Beijing Natural Science Foundation (No. 4152020), and the Natural Science Foundation of Guangdong Province, China (No. 2015A030313147).

  18. Standard practice for monitoring atmospheric SO2 using the sulfation plate technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This practice covers a weighted average effective SO2 level for a 30-day interval through the use of the sulfation plate method, a technique for estimating the effective SO2 content of the atmosphere, and especially with regard to the atmospheric corrosion of stationary structures or panels. This practice is aimed at determining SO2 levels rather than sulfuric acid aerosol or acid precipitation. 1.2 The results of this practice correlate approximately with volumetric SO2 concentrations, although the presence of dew or condensed moisture tends to enhance the capture of SO2 into the plate. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  19. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have not yet reached a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electron bombardment of atoms or molecules (gas ion source) and thermal ionization (thermionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is 10^5 to 10^6 times greater. This proves that almost the entire sample is not necessary for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique, referred to as "microfluorination", corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  20. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA), and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules.

  1. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  2. 78 FR 45447 - Revisions to Modeling, Data, and Analysis Reliability Standard

    Science.gov (United States)

    2013-07-29

    ...; Order No. 782] Revisions to Modeling, Data, and Analysis Reliability Standard AGENCY: Federal Energy... Analysis (MOD) Reliability Standard MOD- 028-2, submitted to the Commission for approval by the North... Organization. The Commission finds that the proposed Reliability Standard represents an improvement over the...

  3. Suitable pellets standards development for LA-ICPMS analysis of Al2O3 powders

    International Nuclear Information System (INIS)

    Ferraz, Israel Elias; Sousa, Talita Alves de; Silva, Ieda de Souza; Gomide, Ricardo Goncalves; Oliveira, Luis Claudio de

    2013-01-01

    Chemical and physical characterization of aluminium oxides is of special interest for the nuclear industry, despite the arduous chemical digestion process involved. Laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) is therefore an attractive method for analysis. However, due to the lack of suitable matrix-matched certified reference materials (CRM) for such powder and ceramic pellet analysis, LA-ICPMS has not yet been fully applied. Furthermore, establishing calibration curves for trace element quantification using external standards raises a significant problem. In this context, this work aimed at the development of suitable standard pellets for obtaining calibration curves for the chemical determination of impurities in aluminium oxide powders by the LA-ICPMS analytical technique. Two different analytical strategies were developed: (I) boric acid pressed pellets and (II) lithium tetraborate melted pellets, both spiked with high purity oxides of Si, Mg, Ca, Na, Fe, Cr and Ni. Analytical strategy (II), which presented the best analytical parameters, was selected, a certified reference material was analyzed, and the results were compared. The limits of detection, linearity, precision, accuracy and recovery study results are presented and discussed. (author)

  4. Simulation-based estimation of mean and standard deviation for meta-analysis via Approximate Bayesian Computation (ABC).

    Science.gov (United States)

    Kwon, Deukwoo; Reis, Isildinha M

    2015-08-12

    When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions, and the corresponding average relative error (ARE) approaches zero as sample size increases. For data generated from the normal distribution, our ABC method performs well, although the method of Wan et al. is best for estimating the standard deviation in this case. In the estimation of the mean, our ABC method is best regardless of the assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can also be applied using other reported summary statistics, such as the posterior mean and 95% credible interval, when Bayesian analysis has been employed.
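
    To make the approach concrete, here is a minimal sketch of the ABC idea for a single study that reported only the median, minimum, maximum and sample size, assuming a normal data model; the uniform priors, nearest-draws acceptance rule and all numbers are illustrative choices, not those of the paper.

```python
# Minimal ABC sketch: estimate (mean, SD) from a study that reported only
# median, min, max and n, assuming normally distributed data. Priors and
# the acceptance rule are illustrative, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)

def abc_mean_sd(median, minimum, maximum, n, n_draws=50_000, keep=500):
    mu = rng.uniform(minimum, maximum, n_draws)             # candidate means
    sigma = rng.uniform(1e-3, maximum - minimum, n_draws)   # candidate SDs
    dist = np.empty(n_draws)
    for i in range(n_draws):
        x = rng.normal(mu[i], sigma[i], n)                  # simulated study data
        sim = np.array([np.median(x), x.min(), x.max()])
        dist[i] = np.linalg.norm(sim - [median, minimum, maximum])
    idx = np.argsort(dist)[:keep]                           # keep closest draws
    return mu[idx].mean(), sigma[idx].mean()                # posterior-style estimates

mu_hat, sd_hat = abc_mean_sd(median=10.0, minimum=2.0, maximum=21.0, n=50)
print(f"estimated mean {mu_hat:.2f}, estimated SD {sd_hat:.2f}")
```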

  5. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Background: Military hospitals are responsible for preserving, restoring and improving the health not only of the armed forces, but also of other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Results: Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance by identifying areas in need of focus for quality improvement and selecting strategies to improve service quality. PMID:25250364
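
    Both this record and the next describe the AHP step only in outline, so here is a minimal sketch of how priority weights and a consistency check are typically computed from a pairwise comparison matrix; the 3x3 judgments are hypothetical, and the study itself used Expert Choice 11.0 rather than custom code.

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix via
# the principal eigenvector, plus Saaty's consistency ratio. The judgments
# below are hypothetical, not the study's actual comparisons.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])                 # hypothetical pairwise judgments

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random index
print("weights:", w, "CR:", ci / ri)            # CR < 0.1 is conventionally acceptable
```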

  6. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique.

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Military hospitals are responsible for preserving, restoring and improving the health not only of the armed forces, but also of other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance by identifying areas in need of focus for quality improvement and selecting strategies to improve service quality.

  7. The standard deviation of extracellular water/intracellular water is associated with all-cause mortality and technique failure in peritoneal dialysis patients.

    Science.gov (United States)

    Tian, Jun-Ping; Wang, Hong; Du, Feng-He; Wang, Tao

    2016-09-01

    The mortality rate of peritoneal dialysis (PD) patients is still high, and the factors predicting PD patient mortality remain to be determined. This study aimed to explore the relationship between the standard deviation (SD) of the extracellular water/intracellular water ratio (E/I) and all-cause mortality and technique failure in continuous ambulatory PD (CAPD) patients. All 152 patients came from the PD Center between January 1st 2006 and December 31st 2007. Clinical data and E/I ratios, defined by bioelectrical impedance analysis at a minimum of five visits, were collected. The patients were followed up until December 31st 2010. The primary outcomes were death from any cause and technique failure. Kaplan-Meier analysis and Cox proportional hazards models were used to identify risk factors for mortality and technique failure in CAPD patients. All patients were followed up for 59.6 ± 23.0 months. The patients were divided into two groups according to their SD of E/I values: a lower SD of E/I group (≤0.126) and a higher SD of E/I group (>0.126). The patients with a higher SD of E/I showed higher all-cause mortality (log-rank χ² = 10.719, P = 0.001) and technique failure (log-rank χ² = 9.724, P = 0.002) than those with a lower SD of E/I. Cox regression analysis found that the SD of E/I independently predicted all-cause mortality (HR = 3.551, 95% CI 1.442-8.746, P = 0.006) and technique failure (HR = 2.487, 95% CI 1.093-5.659, P = 0.030) in CAPD patients after adjustment for confounders, except when sensitive C-reactive protein was added into the model. The SD of E/I was a strong independent predictor of all-cause mortality and technique failure in CAPD patients.

  8. Techniques for the thermal/hydraulic analysis of LMFBR check valves

    International Nuclear Information System (INIS)

    Cho, S.M.; Kane, R.S.

    1979-01-01

    A thermal/hydraulic analysis of the check valves in liquid sodium service for LMFBR plants is required to provide temperature data for thermal stress analysis of the valves for specified transient conditions. Because of the complex three-dimensional flow pattern within the valve, the heat transfer analysis techniques for less complicated shapes could not be used. This paper discusses the thermal analysis techniques used to assure that the valve stress analysis is conservative. These techniques include a method for evaluating the recirculating flow patterns and for selecting appropriately conservative heat transfer correlations in various regions of the valve

  9. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs, and calculations such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized in order to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided, and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  10. A measurement technique for counting processes

    International Nuclear Information System (INIS)

    Cantoni, V.; Pavia Univ.; De Lotto, I.; Valenziano, F.

    1980-01-01

    A technique for the estimation of first and second order properties of a stationary counting process is presented here which uses standard instruments for analysis of a continuous stationary random signal. (orig.)

  11. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary these small areas are readily accessible for analysis. Kossel produces a wide angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in a SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  12. Specific binding assay technique; standardization of reagent

    International Nuclear Information System (INIS)

    Huggins, K.G.; Roitt, I.M.

    1979-01-01

    The standardization of a labelled constituent, such as anti-IgE, for use in a specific binding assay method is disclosed. A labelled ligand, such as IgE, is standardized against a ligand reference substance, such as the WHO standard IgE, to determine the weight of IgE protein represented by the labelled ligand. Anti-light chain antibodies are contacted with varying concentrations of the labelled ligand. The ligand is then contacted with the labelled constituent, which is then quantitated in relation to the amount of ligand protein present. The preparation of 131I-labelled IgE is described. Also disclosed is an improved specific binding assay test method for determining the potency of an allergen extract in serum from an allergic individual. The improvement involved using a parallel model system of a second complex which consisted of anti-light chain antibodies, labelled ligand and the standardized labelled constituent (anti-IgE). The amount of standardized labelled constituent bound to the ligand in the first complex was determined, as described above, and the weight of ligand inhibited by addition of soluble allergen was then used as a measure of the potency of the allergen extract. (author)

  13. Analysis of obsidians and films of silicon carbide by RBS technique

    International Nuclear Information System (INIS)

    Franco S, F.

    1998-01-01

    Motivated by archaeological interest, this work presents the characterization of obsidian samples from different mineral sites in Mexico, as well as of silicon carbide films, undertaken by ion beam analysis: RBS (Rutherford Backscattering Spectrometry). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in Central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis, and most of the important element concentrations were reported. In the first part of this work, the non-destructive IBA technique RBS is used to analyze obsidian samples. The last part is an analysis of thin films of silicon carbide, as part of a research program of the Universidad Nacional Autonoma de Mexico and ININ. The measurements were carried out at the IF-UNAM, and the analysis was performed at the laboratories of the ININ Nuclear Centre facilities. The samples considered in this work were mounted on a sample holder designed for the purpose of exposing each sample to the alpha particle beam. The RBS analysis was carried out with an ET Tandem accelerator at the IF-UNAM. The spectrometry employed a Si(Li) detector set at 15 degrees relative to the target normal. The mean projectile energy was 2.00 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza (Puebla), Guadalupe Victoria (Puebla) and Oyameles (Puebla). The mean values are accompanied by errors expressed as one standard deviation of the mean for each element.

  14. Rare earths analysis of rock samples by instrumental neutron activation analysis, internal standard method

    International Nuclear Information System (INIS)

    Silachyov, I.

    2016-01-01

    The application of instrumental neutron activation analysis for the determination of long-lived rare earth elements (REE) in rock samples is considered in this work. Two different methods are statistically compared: the well established external standard method carried out using standard reference materials, and the internal standard method (ISM), using Fe, determined through X-ray fluorescence analysis, as an element-comparator. The ISM proved to be the more precise method for a wide range of REE contents and can be recommended for routine practice. (author)

  15. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis.

  16. Standard Test Method for Oxygen Content Using a 14-MeV Neutron Activation and Direct-Counting Technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method covers the measurement of oxygen concentration in almost any matrix by using a 14-MeV neutron activation and direct-counting technique. Essentially the same system may be used to determine oxygen concentrations ranging from over 50% to about 10 μg/g, or less, depending on the sample size and available 14-MeV neutron fluence rates. Note 1 - The range of analysis may be extended by using higher neutron fluence rates, larger samples, and higher counting efficiency detectors. 1.2 This test method may be used on either solid or liquid samples, provided that they can be made to conform in size, shape, and macroscopic density during irradiation and counting to a standard sample of known oxygen content. Several variants of this method have been described in the technical literature. A monograph is available which provides a comprehensive description of the principles of activation analysis using a neutron generator (1). 1.3 The values stated in either SI or inch-pound units are to be regarded...
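
    As a rough illustration of the comparative principle in this test method (the sample is irradiated and counted under the same conditions as a standard of known oxygen content), the following sketch shows the basic ratio arithmetic; the counts and masses are invented, and a real determination also needs decay, fluence and geometry corrections.

```python
# Minimal sketch of comparative-counting arithmetic for oxygen determination.
# All numbers are illustrative; corrections for decay, neutron fluence and
# counting geometry are deliberately omitted.
counts_sample = 18_400       # net counts from the sample
counts_standard = 25_100     # net counts from the oxygen standard
mass_sample = 10.0           # sample mass (g)
mass_standard = 10.0         # standard mass (g)
oxygen_standard = 46.7       # % oxygen by mass in the standard

oxygen_sample = oxygen_standard * (counts_sample / counts_standard) \
                               * (mass_standard / mass_sample)
print(f"oxygen content: {oxygen_sample:.1f} %")
```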

  17. A new technique for the deposition of standard solutions in total reflection X-ray fluorescence spectrometry (TXRF) using pico-droplets generated by inkjet printers and its applicability for aerosol analysis with SR-TXRF

    International Nuclear Information System (INIS)

    Fittschen, U.E.A.; Hauschild, S.; Amberger, M.A.; Lammel, G.; Streli, C.; Foerster, S.; Wobrauschek, P.; Jokubonis, C.; Pepponi, G.; Falkenberg, G.; Broekaert, J.A.C.

    2006-01-01

    A new technique for the deposition of standard solutions on particulate aerosol samples using pico-droplets for elemental determinations with total reflection X-ray fluorescence spectrometry (TXRF) is described. It enables short analysis times without influencing the sample structure and avoids time-consuming scanning of the sample with the exciting beam in SR-TXRF analysis. Droplets of picoliter volume (∼ 5-130 pL) were generated with commercially available and slightly modified inkjet printers operated with popular image processing software. The size of the dried droplets on surfaces of different polarity, namely silicone-coated and untreated quartz reflectors, was determined for five different printer types and ten different cartridge types. The results show that droplets generated by inkjet printers are between 50 and 200 μm in diameter (corresponding to volumes of 5 to 130 pL) depending on the cartridge type, which is smaller than the width of the synchrotron beam used in the experiments (< 1 mm at an energy of 17 keV at beamline L at HASYLAB, Hamburg). The precision of printing a certain amount of a single-element standard solution was found to be comparable to aliquoting with micropipettes in TXRF, where relative standard deviations of 12% are found for 2.5 ng of cobalt. However, it could be shown that the printing of simple patterns is possible, which is important when structured samples have to be analysed.

  18. Kinematic analysis of the fouetté 720° technique in classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The 720° fouetté is one of the most difficult types of fouetté, and its execution depends on a high level of technique during the performer's rotation. Performing this element requires not only good physical condition but also correct technique on the part of the dancer. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of 720° fouettés performed by the best Chinese dancers. For the analysis, the method of stereoscopic imaging was used, together with theoretical analysis.

  19. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography are illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm, with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  20. Standardization: using comparative maintenance costs in an economic analysis

    OpenAIRE

    Clark, Roger Nelson

    1987-01-01

    Approved for public release; distribution is unlimited This thesis investigates the use of comparative maintenance costs of functionally interchangeable equipments in similar U.S. Navy shipboard applications in an economic analysis of standardization. The economics of standardization, life-cycle costing, and the Navy 3-M System are discussed in general. An analysis of 3-M System maintenance costs for a selected equipment, diesel engines, is conducted. The potential use of comparative ma...

  1. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  2. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  3. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  4. Feasibility of CBCT-based dose calculation: Comparative analysis of HU adjustment techniques

    International Nuclear Information System (INIS)

    Fotina, Irina; Hopfgartner, Johannes; Stock, Markus; Steininger, Thomas; Lütgendorf-Caucig, Carola; Georg, Dietmar

    2012-01-01

    Background and purpose: The aim of this work was to compare the accuracy of different HU adjustments for CBCT-based dose calculation. Methods and materials: Dose calculation was performed on CBCT images of 30 patients. In the first two approaches, phantom-based (Pha-CC) and population-based (Pop-CC) conversion curves were used. The third method (WAB) represents an override of the structures with standard densities for water, air and bone. In the ROI mapping approach, all structures were overridden with average HUs from the planning CT. All techniques were benchmarked against the Pop-CC and CT-based plans by DVH comparison and γ-index analysis. Results: For prostate plans, WAB and ROI mapping compared to Pop-CC showed differences in the median PTV dose below 2%. The WAB and Pha-CC methods underestimated the bladder dose in IMRT plans. In lung cases, PTV coverage was underestimated by the Pha-CC method by 2.3% and slightly overestimated by the WAB and ROI techniques. The use of the Pha-CC method for head-neck IMRT plans resulted in differences in PTV coverage of up to 5%. Dose calculation with the WAB and ROI techniques showed better agreement with pCT than the conversion curve-based approaches. Conclusions: Density override techniques provide an accurate alternative to the conversion curve-based methods for dose calculation on CBCT images.
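
    As a concrete illustration of the WAB override compared above, here is a minimal sketch that reassigns CBCT voxels to nominal water, air and bone values before dose calculation; the HU thresholds and the bone value are assumptions for illustration, not the study's settings.

```python
# Minimal sketch of a water/air/bone (WAB) density override on a CBCT volume.
# Thresholds and override values are illustrative assumptions.
import numpy as np

def wab_override(cbct_hu, air_thresh=-400, bone_thresh=250):
    out = np.zeros_like(cbct_hu)          # default: water -> 0 HU
    out[cbct_hu < air_thresh] = -1000     # air
    out[cbct_hu > bone_thresh] = 700      # one standard bone density
    return out

cbct = np.array([[-850, -20, 40],
                 [ 300, 900, -500]])      # toy CBCT HU values
print(wab_override(cbct))                 # overridden volume fed to dose engine
```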

  5. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to the monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
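
    The temperature-estimation step above is commonly a Boltzmann-plot fit over several emission lines of one species; the sketch below shows that calculation with placeholder line data (the wavelengths, gA products and upper-level energies are invented, not real atomic constants, and the LPO peak-location step is omitted).

```python
# Minimal Boltzmann-plot sketch for plasma electronic temperature, assuming
# peak intensities have already been extracted from the spectrum. All line
# constants below are placeholders, not real atomic data.
import numpy as np

wl   = np.array([696.5, 706.7, 738.4, 750.4]) * 1e-9     # wavelengths (m)
I    = np.array([1.00, 0.62, 0.48, 0.31])                # measured intensities
gA   = np.array([3.9e6, 2.4e6, 5.1e6, 1.3e7])            # g_k * A_ki (1/s)
E_up = np.array([13.0, 13.3, 13.5, 14.0]) * 1.602e-19    # upper-level energies (J)

k_B = 1.381e-23                                          # Boltzmann constant (J/K)
y = np.log(I * wl / gA)                # ln(I*lambda/gA) = -E/(k_B*T) + const
slope, intercept = np.polyfit(E_up, y, 1)
T_e = -1.0 / (k_B * slope)             # temperature from the fitted slope
print(f"estimated electronic temperature: {T_e:.0f} K")
```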

  6. Uncertainty determination of analysis of Ti, V, Cl, Ce, Cr, Cs, Sc, Co, Fe and Ca in solid samples by INAA method using standard addition according to ISO - guide 17025

    International Nuclear Information System (INIS)

    Sumining; Agus Taftazani

    2003-01-01

    The uncertainty of the analysis of Ti, V, Cl, Ce, Cr, Cs, Sc, Co, Fe and Ca in solid samples by the INAA (Instrumental Neutron Activation Analysis) method, using the comparative technique and standard addition, has been determined at the INAA laboratory of P3TM BATAN. The calculation for Ti is presented as an example. The uncertainty sources of INAA are sampling, sample and standard preparation, irradiation and counting. The samples came from the IAEA (International Atomic Energy Agency) and were ready for analysis; therefore only the sample and standard preparation, irradiation and counting factors were determined. The analysis was done by the relative technique, in which sample and standard are irradiated together in the same capsule, so that irradiation time, neutron flux, irradiation geometry and isotopic properties cancel out. The uncertainty of the counting factors covers radioactive decay during counting, pulse losses caused by random counting, counting geometry, and counting rate. The relative technique also means that the uncertainty from the counting time, set by the same counting equipment for sample and standard, can be neglected. No uncertainty contribution from counting geometry or from the thickness of uranium was detected, so there is no contribution from fission products. The number of target nuclides did not vary, because no burn-up occurred during irradiation, and the analytical results were not influenced by the chemical state. (author)

  7. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…

  8. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  9. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  10. Accuracy of the One-Stage and Two-Stage Impression Techniques: A Comparative Analysis.

    Science.gov (United States)

    Jamshidy, Ladan; Mozaffari, Hamid Reza; Faraji, Payam; Sharifi, Roohollah

    2016-01-01

    Introduction. One of the main steps of impression-taking is the selection and preparation of an appropriate tray. Hence, the present study aimed to analyze and compare the accuracy of one- and two-stage impression techniques. Materials and Methods. A laboratory-made resin model of the first molar was prepared by a standard method for full crowns, with a processed preparation finish line of 1 mm depth and a convergence angle of 3-4°. Impressions were made 20 times with the one-stage technique and 20 times with the two-stage technique using an appropriate tray. To measure the marginal gap, the distance between the restoration margin and the preparation finish line of the plaster dies was determined vertically in the mid mesial, distal, buccal, and lingual (MDBL) regions by a stereomicroscope using a standard method. Results. The results of the independent t-test showed that the mean value of the marginal gap obtained by the one-stage impression technique was higher than that of the two-stage impression technique. Further, there was no significant difference between the one- and two-stage impression techniques in the mid buccal region, but a significant difference was reported between the two impression techniques in the MDL regions and overall. Conclusion. The findings of the present study indicated higher accuracy for the two-stage impression technique than for the one-stage impression technique.

  11. The preparation of primary standard solutions for each of the noble metals

    International Nuclear Information System (INIS)

    Mallett, R.C.; Wall, G.J.; Jones, E.A.; Royal, S.J.

    1977-01-01

    A revised method for the preparation of primary standard solutions for each of the noble metals is described. It is now recommended that standard noble-metal solutions should be made from the pure metals and not from salts as previously described. Metals should have a certified purity of 99.95 per cent or better, and the purity should be confirmed by analysis using the techniques of emission spectrography or spark-source mass spectrography. After the metals have been dissolved, the solutions are made up to volume and the metal content of the standard solutions is checked. For most instrumental techniques for which the standards are intended, the check analysis should be within 0.3 per cent of the certified value

  12. Principal Components Analysis on the spectral Bidirectional Reflectance Distribution Function of ceramic colour standards.

    Science.gov (United States)

    Ferrero, A; Campos, J; Rabal, A M; Pons, A; Hernanz, M L; Corróns, A

    2011-09-26

    The Bidirectional Reflectance Distribution Function (BRDF) is essential for characterizing an object's reflectance properties. This function depends both on the various illumination-observation geometries and on the wavelength. As a result, the comprehensive interpretation of the data becomes rather complex. In this work we assess the use of the multivariate analysis technique of Principal Components Analysis (PCA) applied to the experimental BRDF data of a ceramic colour standard. It will be shown that the result may be linked to the various reflection processes occurring on the surface, assuming that the incoming spectral distribution is affected by each one of these processes in a specific manner. Moreover, this procedure facilitates the task of interpolating a series of BRDF measurements obtained for a particular sample.
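
    To make the PCA step concrete, the following is a minimal sketch of the decomposition on a matrix whose rows are spectral BRDF measurements at different illumination-observation geometries; the data here are synthetic stand-ins, since the actual ceramic-standard measurements are not reproduced in the record.

```python
# Minimal PCA sketch on a BRDF-style data matrix: each row is the spectral
# BRDF at one illumination/observation geometry. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_geom, n_wl = 60, 31                   # 60 geometries, 31 wavelength bands
base = np.linspace(0.2, 0.8, n_wl)      # a smooth "diffuse" spectral shape
X = np.outer(rng.uniform(0.5, 1.5, n_geom), base) \
    + 0.01 * rng.standard_normal((n_geom, n_wl))

Xc = X - X.mean(axis=0)                 # center across geometries
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)         # variance explained per component
scores = U * s                          # geometry-dependent weights
print("variance explained by first 3 PCs:", explained[:3])
# Vt[0], Vt[1], ... are spectral signatures attributable to distinct
# reflection processes; scores[:, i] shows how each varies with geometry.
```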

  13. Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.

    Science.gov (United States)

    American Society for Testing and Materials, Philadelphia, PA.

    Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…

  14. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  15. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  16. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis

  17. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    Science.gov (United States)

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., the shoulder of the prominent fused silica boson peak (~130 cm⁻¹); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm⁻¹)) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R² = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R² = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms, which gives rise to an RMSEP of ~2.0% (R² = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ, standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
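
    A minimal sketch of the regression step described above follows: PLS maps the intensities of the internal reference bands to laser excitation power and is assessed by cross-validation. The spectra are simulated with an assumed linear power dependence, and scikit-learn's PLSRegression stands in for whatever implementation the authors used.

```python
# Minimal PLS sketch: predict laser excitation power from internal reference
# band intensities. Band sensitivities, noise level and data are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
power = rng.uniform(5, 65, 120)               # mW, known excitation powers
bands = np.array([0.8, 1.0, 0.6, 0.5, 0.3])   # assumed band sensitivities
X = np.outer(power, bands) + rng.normal(0, 0.5, (120, 5))  # band intensities

pls = PLSRegression(n_components=2)
pred = cross_val_predict(pls, X, power, cv=5).ravel()
rmsecv = np.sqrt(np.mean((pred - power) ** 2))
print(f"RMSECV: {rmsecv:.2f} mW")             # analogous to the reported figure of merit
```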

  18. Classification Technique for Ultrasonic Weld Inspection Signals Using a Neural Network Based on 2-Dimensional Fourier Transform and Principal Component Analysis

    International Nuclear Information System (INIS)

    Kim, Jae Joon

    2004-01-01

    Neural network-based signal classification systems are increasingly used in the analysis of large volumes of data obtained in NDE applications. Ultrasonic inspection methods, on the other hand, are commonly used in the nondestructive evaluation of welds to detect flaws. An important characteristic of ultrasonic inspection is the ability to identify the type of discontinuity that gives rise to a peculiar signal. Standard techniques rely on differences in individual A-scans to classify the signals. This paper proposes an ultrasonic signal classification technique based on the information lying in the neighboring signals. The approach is based on a 2-dimensional Fourier transform and principal component analysis to generate a reduced-dimensional feature vector for classification. Results of applying the technique to data obtained from the inspection of actual steel welds are presented
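
    The processing chain named in this record can be sketched directly: a 2-D Fourier transform over a block of neighboring A-scans, then PCA to compress the magnitude spectrum into a feature vector for the classifier. The data below are synthetic, and the block size and feature dimension are arbitrary choices.

```python
# Minimal sketch of the 2-D FFT + PCA feature extraction: blocks of adjacent
# A-scans are transformed, and PCA reduces the magnitude spectra to short
# feature vectors for a downstream neural-network classifier. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
blocks = rng.standard_normal((40, 16, 64))   # 40 blocks of 16 adjacent A-scans

# 2-D FFT magnitude of each block, flattened into one feature row per block.
F = np.abs(np.fft.fft2(blocks)).reshape(40, -1)

# PCA via SVD of the centered feature matrix.
Fc = F - F.mean(axis=0)
U, s, Vt = np.linalg.svd(Fc, full_matrices=False)
features = (U * s)[:, :8]                    # reduced 8-D feature vectors
print(features.shape)                        # (40, 8) -> classifier input
```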

  19. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  20. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  1. The standardisation of trace elements in international biological standard reference materials with neutron activation analysis and atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Pieterse, H.

    1981-12-01

    An investigation was undertaken into the analytical procedures and the identification of problem areas, for the certification of a new biological standard reference material supplied by the International Atomic Energy Agency, namely, a human hair sample designated as HH-I. The analyses comprised the determination of the elements As, Cd, Co, Cr, Cu, Fe, Hg, Mn, Ni, Sb, Se, and Zn in the hair sample by using two analytical techniques, namely, Instrumental Neutron Activation Analysis and Atomic Absorption. Three other certified biological reference materials, namely, Orchard Leaves (ORCH-L), Sea Plant Material (SPM-I) and Copepod (MAA-I) were used as control standards. Determinations were made of the moisture content of the samples, using varying conditions of drying, and the necessary corrections were applied to all analytical results so that the final elemental values related to dry weight of samples. Attention was also given to the possible loss of specific elements during ashing of the samples prior to the actual instrumental analysis. The results obtained for the hair sample by the two techniques were in good agreement for the elements Co, Fe, Mn, and Zn, but did not agree for the elements Cr and Sb. As, Hg and Se could only be determined with Instrumental Neutron Activation Analysis, and Cd, Cu and Ni only with Atomic Absorption. Most of the results obtained for the three control standard reference materials were within the ranges specified for the individual elements in each sample. The analytical procedures used for determining Cd, Cr, Cu, Ni and Sb with Instrumental Neutron Activation Analysis and As, Cr, Sb and Se with Atomic Absorption, need further investigation. The measurement of the moisture content and the ashing of samples also require further investigation with a view to improving accuracy

  2. Evaluation of pressed powders and thin section standards for multi-elemental analysis by conventional and micro-PIXE analysis

    International Nuclear Information System (INIS)

    Homma-Takeda, Shino; Iso, Hiroyuki; Ito, Masaki

    2010-01-01

    For multi-elemental analysis, various standards are used to quantify the elements contained in environmental and biological samples. In this paper, two standards of different configuration, pressed powders and thin section standards, were assessed for their suitability as standards for conventional and micro-PIXE analysis. The homogeneity of manganese, iron, zinc (Zn), copper and yttrium added to the pressed powder standard materials was validated; the relative standard deviation (RSD) of the X-ray intensity of the standards was < 2% over the analyzed area, and the metal concentration was acceptable. (author)

  3. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  4. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  5. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
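
    For reference, a minimal numerical sketch of the double random phase technique follows: one random phase mask in the image plane and one in the Fourier plane, with decryption by the conjugate of the second mask. A wrong key leaves noise, which is exactly the kind of decryption error the key-space analysis maps; all sizes and masks are illustrative.

```python
# Minimal double random phase encoding sketch: encrypt with masks m1 (image
# plane) and m2 (Fourier plane); decrypt with conj(m2). A wrong key yields
# noise. Image size and masks are illustrative.
import numpy as np

rng = np.random.default_rng(4)
img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0     # toy input image

m1 = np.exp(2j * np.pi * rng.random(img.shape))       # image-plane mask
m2 = np.exp(2j * np.pi * rng.random(img.shape))       # Fourier-plane mask (the key)

enc = np.fft.ifft2(np.fft.fft2(img * m1) * m2)        # encrypted complex field

dec = np.abs(np.fft.ifft2(np.fft.fft2(enc) * np.conj(m2)))   # correct key
wrong = np.exp(2j * np.pi * rng.random(img.shape))            # brute-force guess
bad = np.abs(np.fft.ifft2(np.fft.fft2(enc) * np.conj(wrong)))

print(np.mean((dec - img) ** 2), np.mean((bad - img) ** 2))   # ~0 vs. large error
```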

  6. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    Full Text Available The technique of the statistical analysis of the investment appeal of a region with respect to foreign direct investment is presented in this scientific article. A definition of the technique of statistical analysis is given, the stages of the analysis are described, and the mathematical-statistical tools are considered.

  7. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    Full Text Available A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
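
    Since ITA is a simple graphical procedure, a minimal sketch is easy to give: split the series into two equal halves, sort each, and compare the sorted halves against the 1:1 line, reading off trends separately for the low, medium and high values. The SPI series below is synthetic, and the three-way split of the quantile range is one common convention.

```python
# Minimal Innovative Trend Analysis (ITA) sketch on a synthetic SPI series:
# sort the two halves of the series and measure their departure from the
# 1:1 line in the low, medium and high parts of the distribution.
import numpy as np

rng = np.random.default_rng(5)
spi = rng.standard_normal(480) + np.linspace(-0.3, 0.3, 480)  # 40 yr monthly SPI

half = len(spi) // 2
first, second = np.sort(spi[:half]), np.sort(spi[half:])

d = second - first                     # displacement from the 1:1 line
low, med, high = np.array_split(np.arange(half), 3)
for name, idx in [("low", low), ("medium", med), ("high", high)]:
    print(f"{name} values: mean departure {d[idx].mean():+.3f}")
# Positive departures mean those quantiles increased between the two halves.
```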

  8. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and the corresponding noise database for each plant (both Korean and foreign ones) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system to diagnose abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  10. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2011-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. Assessing the blade structural integrity is a complex task requiring an initial characterization of whether resonance is possible and then performing a forced response analysis if that condition is met. The standard technique for forced response analysis in rocket engines is to decompose a CFD-generated flow field into its harmonic components and then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. A substantial effort has been made to account for this denser spatial Fourier content in frequency response analysis (described in another paper by the author), but the question still remains whether the frequency response analysis itself is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis of bladed disks undergoing this complex flow environment have therefore been performed. The first is of a bladed disk with each blade modeled by simple beam elements. Six loading cases were generated by varying a baseline harmonic excitation in different ways based upon cold-flow testing from the Heritage Fuel Air Turbine Test. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD used in the J2X engine program. It was hypothesized that enforcing periodicity in the CFD (inherent in the frequency response technique) would overestimate the

  11. Accuracy of the One-Stage and Two-Stage Impression Techniques: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Ladan Jamshidy

    2016-01-01

    Full Text Available Introduction. One of the main steps of impression making is the selection and preparation of an appropriate tray. Hence, the present study aimed to analyze and compare the accuracy of the one- and two-stage impression techniques. Materials and Methods. A laboratory-made resin model of a first molar was prepared by a standard method for full crowns, with a prepared finish line of 1 mm depth and a convergence angle of 3-4°. Impressions were made 20 times with the one-stage technique and 20 times with the two-stage technique using an appropriate tray. To measure the marginal gap, the distance between the restoration margin and the preparation finish line of the plaster dies was determined vertically in the mid-mesial, distal, buccal, and lingual (MDBL) regions by a stereomicroscope using a standard method. Results. The results of the independent t-test showed that the mean marginal gap obtained with the one-stage impression technique was higher than that of the two-stage technique. Further, there was no significant difference between the one- and two-stage impression techniques in the mid-buccal region, but a significant difference was reported between the two impression techniques in the MDL regions and in general. Conclusion. The findings of the present study indicated higher accuracy for the two-stage impression technique than for the one-stage impression technique.

  12. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    aqueous selenium standards were separated within 1.2 min on a 1.00 mm id x 50 mm reversed-phase column in an ion-pair chromatographic system using a flow rate of 200 μL min(-1). Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection...... the use of short columns. Hence, analysis times could be halved without loss of separation efficiency in this biological sample...

  13. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of the interpolation and Gaussian kernel methods by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and is suitable for large-scale application to paleo-data.
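
    For readers who want to experiment, a minimal sketch of the kernel idea follows: weight all observation pairs by how close their time difference is to the target lag, using a Gaussian kernel. The data, bandwidth, and normalization choices here are illustrative assumptions, not the benchmarked implementation.

```python
import numpy as np

def gaussian_kernel_xcf(tx, x, ty, y, lag, h):
    """Kernel cross-correlation at one lag for irregular sampling: weight all
    observation pairs by how close their time difference is to the lag."""
    xa, ya = x - x.mean(), y - y.mean()
    dt = ty[None, :] - tx[:, None] - lag     # pairwise lag mismatches
    w = np.exp(-0.5 * (dt / h) ** 2)         # Gaussian weights, bandwidth h
    return np.sum(w * np.outer(xa, ya)) / (w.sum() * x.std() * y.std())

rng = np.random.default_rng(2)
t1, t2 = np.sort(rng.uniform(0, 100, 80)), np.sort(rng.uniform(0, 100, 80))
x = np.sin(t1 / 5) + 0.1 * rng.normal(size=80)
y = np.sin(t2 / 5) + 0.1 * rng.normal(size=80)
print(gaussian_kernel_xcf(t1, x, t2, y, lag=0.0, h=2.0))
```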

  14. Platinum stable isotope analysis of geological standard reference materials by double-spike MC-ICPMS.

    Science.gov (United States)

    Creech, J B; Baker, J A; Handler, M R; Bizzarro, M

    2014-01-10

    We report a method for the chemical purification of Pt from geological materials by ion-exchange chromatography for subsequent Pt stable isotope analysis by multiple-collector inductively coupled plasma mass spectrometry (MC-ICPMS) using a 196Pt-198Pt double-spike to correct for instrumental mass bias. Double-spiking of samples was carried out prior to digestion and chemical separation to correct for any mass-dependent fractionation that may occur due to incomplete recovery of Pt. Samples were digested using a NiS fire assay method, which pre-concentrates Pt into a metallic bead that is readily dissolved in acid in preparation for anion-exchange chemistry. Pt was recovered from anion-exchange resin in concentrated HNO3 acid after elution of matrix elements, including the other platinum group elements (PGE), in dilute HCl and HNO3 acids. The separation method has been calibrated using a precious metal standard solution doped with a range of synthetic matrices and results in Pt yields of ≥90% with purity of ≥95%. Using this chemical separation technique, we have separated Pt from 11 international geological standard reference materials comprising PGE ores, mantle rocks, igneous rocks and one sample from the Cretaceous-Paleogene boundary layer. Pt concentrations in these samples range from ca. 5 ng g-1 to 4 μg g-1. This analytical method has been shown to have an external reproducibility on δ198Pt (permil difference in the 198Pt/194Pt ratio from the IRMM-010 standard) of ±0.040 (2 sd) on Pt solution standards (Creech et al., 2013, J. Anal. At. Spectrom. 28, 853-865). The reproducibility in natural samples is evaluated by processing multiple replicates of four standard reference materials, and is conservatively taken to be ca. ±0.088 (2 sd). Pt stable isotope data for the full set of reference materials have a range of δ198Pt values with offsets of up to 0.4‰ from the IRMM-010 standard, which are readily resolved with this technique.

  15. In-cylinder pressure-based direct techniques and time frequency analysis for combustion diagnostics in IC engines

    International Nuclear Information System (INIS)

    D’Ambrosio, S.; Ferrari, A.; Galleani, L.

    2015-01-01

    Highlights: • Direct pressure-based techniques have been applied successfully to spark-ignition engines. • The burned mass fraction from pressure-based techniques has been compared with that of 2- and 3-zone combustion models. • Time-frequency analysis has been employed to characterize complex diesel combustion events. - Abstract: In-cylinder pressure measurement and analysis has historically been a key tool for off-line combustion diagnosis in internal combustion engines, but online applications for real-time condition monitoring and combustion management have recently become popular. The present investigation presents and compares different low computing-cost in-cylinder pressure-based methods for analyzing the main features of combustion, that is, the start of combustion, the end of combustion, and the crank angle at which half of the overall mass has burned. The instantaneous pressure in the combustion chamber was used as the input datum for the described analytical procedures and was measured by means of a standard piezoelectric transducer. Traditional pressure-based techniques have been shown to predict the burned mass fraction time history more accurately in spark-ignition engines than in diesel engines. The most suitable pressure-based techniques for both spark-ignition and compression-ignition engines have been chosen on the basis of the available experimental data. Time-frequency analysis has also been applied to diesel combustion, which is richer in events than spark-ignition combustion. Time-frequency algorithms for the calculation of the mean instantaneous frequency are computationally efficient, allow the main events of diesel combustion to be identified, and provide the greatest benefits in the presence of multiple injection events. These algorithms can be optimized and applied to onboard diagnostics tools designed for real control, but can also be used as an advanced validation tool for
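
    One common way to obtain a mean instantaneous frequency, via the analytic signal, is sketched below; the paper's exact algorithm is not specified here, and the sampling rate and synthetic pressure trace are assumed.

```python
import numpy as np
from scipy.signal import hilbert

fs = 100_000.0                                   # assumed sampling rate, Hz
t = np.arange(0, 0.02, 1 / fs)
# Stand-in for a band-passed in-cylinder pressure trace: a chirp plus noise
p = np.sin(2 * np.pi * (2e3 + 5e4 * t) * t)
p += 0.05 * np.random.default_rng(3).normal(size=t.size)

analytic = hilbert(p - p.mean())                 # analytic signal
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)    # instantaneous frequency, Hz
print(f"mean instantaneous frequency: {inst_freq.mean():.0f} Hz")
```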

  16. Open Partial Nephrectomy in Renal Cancer: A Feasible Gold Standard Technique in All Hospitals

    Directory of Open Access Journals (Sweden)

    J. M. Cozar

    2008-01-01

    Full Text Available Introduction. Partial nephrectomy (PN) is playing an increasingly important role in localized renal cell carcinoma (RCC) as a true alternative to radical nephrectomy. With the greater experience and expertise of surgical teams, it has become an alternative to radical nephrectomy in young patients in almost all hospitals when the tumor diameter is 4 cm or less, since cancer-specific survival outcomes are similar to those obtained with radical nephrectomy. Materials and Methods. The authors comment on their own experience and review the literature, reporting current indications and outcomes, including complications. The surgical technique of open partial nephrectomy is outlined. Conclusions. Nowadays, open PN is the gold standard technique to treat small renal masses, and all nonablative techniques must pass the test of time to be compared to PN. It is not ethical for patients to undergo radical surgery simply because the urologists involved do not have adequate experience with PN. Patients should be involved in the final treatment decision and, when appropriate, referred to specialized centers with experience in open or laparoscopic partial nephrectomy.

  17. Suitable activated carbon-13 tracer techniques

    International Nuclear Information System (INIS)

    Zhang Weicheng; Peng Xiuru; Wang Yuhua

    1995-12-01

    Feasibility and applicability studies of proton-induced gamma-ray emission (PIGE) have been performed. Graphite was first bombarded at various proton energies to determine the gamma-ray yield (and thus sensitivities) for the reaction of interest. The accuracy of the determination of 13C abundance was checked, and the precision with which this value and the 13C/12C ratio may be obtained was established by repetitive analysis of samples. The performance of different standards in this determination was assessed. A mathematical treatment was developed for the determination of 13C abundance in tracer studies, deriving the equations that govern this method of analysis from first principles to arrive at a simple expression by virtue of the observed regularities. The system was calibrated by measuring the gamma-ray yield from the 12C(p,γ)13N and 13C(p,γ)14N reactions as a function of known 13C enrichment. Using this experimentally determined calibration curve, unknown materials can be assayed. This technique is applicable to the analysis of samples with 13C enrichments between 0.1% and 90%. Natural human breath samples were analyzed against graphite and cylinder CO2 standards. Relative standard deviations were small enough that an increase in 13C percent isotopic abundance from the natural 1.11% (average) to only 1.39% may be ascertained. Finally, PIGE is compared with more classical techniques for the analysis of 13C tracer experiments. Ease and speed are important advantages of this technique over mass spectrometry, and its error is compatible with the natural variation of biological results. (9 refs., 11 figs., 9 tabs.)
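
    The calibration-curve step can be illustrated with a least-squares line through (enrichment, yield) pairs; the numbers below are made up for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical calibration pairs (13C enrichment in %, gamma-ray yield ratio);
# all values are made up for illustration.
enrichment = np.array([0.1, 1.1, 5.0, 20.0, 50.0, 90.0])
yield_ratio = np.array([0.002, 0.021, 0.098, 0.395, 0.985, 1.770])

slope, intercept = np.polyfit(enrichment, yield_ratio, 1)  # linear fit
unknown_yield = 0.027                       # measured yield of an unknown
print(f"estimated 13C abundance: {(unknown_yield - intercept) / slope:.2f} %")
```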

  18. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signature is described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, dairy, fruits...

  19. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  20. Analysis of soil and sewage sludge by ICP-OES and the German standard DIN 38414 sample preparation technique (P3)

    International Nuclear Information System (INIS)

    Edlund, M.; Heitland, P.; Visser, H.

    2002-01-01

    Full text: The elemental analysis of soil and sewage sludge has developed into one of the main applications of ICP optical emission spectrometry (ICP-OES) and is described in many official procedures. These methods include different acid mixtures and digestion techniques. Even though the German standard DIN 38414 part 7 and the Dutch NEN 6465 do not guarantee complete recoveries for all elements, they are widely accepted in Europe. This paper describes sample preparation and line selection, and investigates precision, accuracy and limits of detection. The SPECTRO CIROSCCD EOP with axial plasma observation and the SPECTRO CIROSCCD SOP with radial observation were compared and evaluated for the analysis of soil and sewage sludge. Accuracy was investigated using the certified reference materials CRM-141 R, CRM-143 R and GSD 11. Both instruments show excellent performance in terms of speed, precision, accuracy and detection limits for the determination of trace metals in soil and sewage sludge. (author)

  1. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

    To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. The cross-sectional slices were then processed by 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm2), followed by the standard (178.07 mm2 and 185.37 mm2) and short implants (130.70 mm2 and 110.70 mm2). Wide-diameter short implants show a surface area comparable with standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesigns, microdesigns, and surface features.

  2. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in analysis, preservation, and documentation of art works and artifacts are described with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  3. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space, and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological disease were recruited; high-resolution T1-weighted, quantitative T1, and B0 field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T1 datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T1 values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T1 histograms were more compact in real space, with smaller right-sided tails, indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors, biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  4. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received deep attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized shape of illumination. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  5. Extended standard vector analysis for plasma physics

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-02-01

    Standard vector analysis in 3-dimensional space, as found in most tables and textbooks, is complemented by a number of basic formulas that seem to be largely unknown, but are important in themselves and for some plasma physics applications, as is shown by several examples. (orig.)

  6. Comparison of laser fluorimetry, high resolution gamma-ray spectrometry and neutron activation analysis techniques for determination of uranium content in soil samples

    International Nuclear Information System (INIS)

    Ghods, A.; Asgharizadeh, F.; Salimi, B.; Abbasi, A.

    2004-01-01

    Much concern is given nowadays to the exposure of the world population to natural radiation, especially to uranium, since 57% of that exposure is due to radon-222, a member of the uranium decay series. Most of the methods used for uranium determination at low concentration require either tedious separation and preconcentration or access to special instrumentation for detection of uranium at this low level. This study compares three techniques for uranium analysis on soil samples with variable uranium contents. Two of these techniques, neutron activation analysis and high-resolution gamma-ray spectrometry, are non-destructive, while the other, laser fluorimetry, is done via chemical extraction of uranium. Standard materials were also analyzed to control the quality and accuracy of the work. The three methods have quite variable detection limits, and results obtained by high-resolution gamma-ray spectrometry are based on the assumption of secular equilibrium between uranium and its daughters, which causes deviations whenever this condition is not met. For samples with reasonable uranium content, neutron activation analysis would be a rapid and reliable technique, while for low uranium content laser fluorimetry would be the most appropriate and accurate technique

  7. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria are considered in the inspection: physical condition of the building system (PC), effect on asset (EA), effect on occupants (EO) and maintenance cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed on all building systems against each criterion. The defect severity score of each building system was identified, multiplied by the weight of the criteria, and a final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning
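
    The core of such a prioritization reduces to weighted scoring. A minimal sketch with assumed weights and severity scores (not the study's data):

```python
import numpy as np

# Assumed criteria weights (PC, EA, EO, MC) and per-system severity scores
weights = np.array([0.40, 0.25, 0.20, 0.15])
systems = {"electrical": [0.9, 0.7, 0.8, 0.6],
           "roof":       [0.6, 0.8, 0.5, 0.7],
           "ceiling":    [0.3, 0.2, 0.4, 0.3]}

# Risk value = weighted sum of severity scores; rank systems by risk
risk = {name: float(np.dot(weights, s)) for name, s in systems.items()}
for name, r in sorted(risk.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} risk = {r:.3f}")
```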

  8. Structured Analysis - IDEF0

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1999-01-01

    This note introduces the IDEF0 modelling language (semantics and syntax) and associated rules and techniques for developing structured graphical representations of a system or enterprise. Use of this standard for IDEF0 permits the construction of models comprising system functions (activities...... that require a modelling technique for the analysis, development, re-engineering, integration, or acquisition of information systems, and incorporate a systems or enterprise modelling technique into a business process analysis or software engineering methodology. This note is a summary of the Standard for Integration Definition for Function Modelling (IDEF0), i.e., the Draft Federal Information Processing Standards Publication 183, December 21, 1993, announcing the Standard for Integration Definition for Function Modelling (IDEF0).

  9. Analysis of formalin-fixed, paraffin-embedded (FFPE) tissue via proteomic techniques and misconceptions of antigen retrieval.

    Science.gov (United States)

    O'Rourke, Matthew B; Padula, Matthew P

    2016-01-01

    Since emerging in the late 19th century, formaldehyde fixation has become a standard method for the preservation of tissues from clinical samples. The advantage of formaldehyde fixation is that fixed tissues can be stored at room temperature for decades without concern for degradation. This has led to the generation of huge tissue banks containing thousands of clinically significant samples. Here we review techniques for proteomic analysis of formalin-fixed, paraffin-embedded (FFPE) tissue samples, with a specific focus on the methods used to extract proteins and break formaldehyde crosslinks. We also discuss an error of interpretation associated with the technique known as "antigen retrieval." We have discovered that this term has been mistakenly applied to two disparate molecular techniques; therefore, we argue that a terminology change is needed to ensure accurate reporting of experimental results. Finally, we suggest that more investigation is required to fully understand the process of formaldehyde fixation and its subsequent reversal.

  10. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  11. Automated quantitative analysis of in-situ NaI measured spectra in the marine environment using a wavelet-based smoothing technique

    International Nuclear Information System (INIS)

    Tsabaris, Christos; Prospathopoulos, Aristides

    2011-01-01

    An algorithm for automated analysis of in-situ NaI γ-ray spectra in the marine environment is presented. A standard wavelet denoising technique is implemented for obtaining a smoothed spectrum, while the stability of the energy spectrum is achieved by taking advantage of the permanent presence of two energy lines in the marine environment. The automated analysis provides peak detection, net area calculation, energy autocalibration, radionuclide identification and activity calculation. The results of the algorithm performance, presented for two different cases, show that analysis of short-term spectra with poor statistical information is considerably improved and that incorporation of further advancements could allow the use of the algorithm in early-warning marine radioactivity systems. - Highlights: → Algorithm for automated analysis of in-situ NaI γ-ray marine spectra. → Wavelet denoising technique provides smoothed spectra even at parts of the energy spectrum that exhibits strong statistical fluctuations. → Automated analysis provides peak detection, net area calculation, energy autocalibration, radionuclide identification and activity calculation. → Analysis of short-term spectra with poor statistical information is considerably improved.
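
    A standard wavelet-denoising pass of the kind described can be reproduced in a few lines with PyWavelets; the wavelet, decomposition level, universal threshold, and the synthetic spectrum below are illustrative assumptions rather than the authors' exact choices.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(4)
ch = np.arange(1024)
# Synthetic stand-in spectrum: two peaks on a continuum, with Poisson noise
clean = 200 * np.exp(-(ch - 300) ** 2 / 50) \
      + 120 * np.exp(-(ch - 662) ** 2 / 80) + 30
spectrum = rng.poisson(clean).astype(float)

coeffs = pywt.wavedec(spectrum, 'sym8', level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise level estimate
thr = sigma * np.sqrt(2 * np.log(spectrum.size))      # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
smoothed = pywt.waverec(coeffs, 'sym8')[:spectrum.size]
```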

  12. The Comparison Study of Neutron Activation Analysis and Fission Track Technique for Uranium Determination

    International Nuclear Information System (INIS)

    Sirinuntavid, Alice; Rodthongkom, Chouvana

    2007-08-01

    Full text: A comparison between neutron activation analysis (NAA) and the fission track technique for uranium determination in solid samples was studied using standard reference materials, i.e., ore, coal fly ash and soil. For NAA, epithermal neutrons were used for activation irradiation; then the 74.5 keV gamma from U-239 or the 277.7 keV gamma from Np-239 was measured. For samples with high uranium content, the NAA method with 74.5 keV gamma measurement gave more precise results than the 277.7 keV gamma measurement method. The NAA method with 277.7 keV gamma measurement gave higher sensitivity and precision for samples with low uranium content, i.e., those containing less than 10 ppm uranium. Nevertheless, the latter procedure needed a longer time for neutron irradiation and analysis. Comparing the results of uranium analysis between NAA and fission track, no significant difference was found within the 95% confidence level

  13. Stability analysis of nonlinear Roesser-type two-dimensional systems via a homogenous polynomial technique

    Science.gov (United States)

    Zhang, Tie-Yan; Zhao, Yan; Xie, Xiang-Peng

    2012-12-01

    This paper is concerned with the problem of stability analysis of nonlinear Roesser-type two-dimensional (2D) systems. Firstly, the fuzzy modeling method for the usual one-dimensional (1D) systems is extended to the 2D case so that the underlying nonlinear 2D system can be represented by the 2D Takagi-Sugeno (TS) fuzzy model, which is convenient for implementing the stability analysis. Secondly, a new kind of fuzzy Lyapunov function, which is homogeneous polynomially parameter-dependent on the fuzzy membership functions, is developed to conceive less conservative stability conditions for the TS Roesser-type 2D system. In the process of stability analysis, the obtained stability conditions approach exactness in the sense of convergence by applying some novel relaxed techniques. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is also given to demonstrate the effectiveness of the proposed approach.
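
    The final computational step can be illustrated with standard numerical software. The paper's fuzzy, homogeneous polynomially parameter-dependent LMI conditions for 2D Roesser models are considerably more elaborate; as a minimal, runnable stand-in, the sketch below certifies stability of a simple discrete-time system via the Lyapunov equation that is equivalent to the basic LMI "find P > 0 with A'PA - P < 0", using an arbitrary example matrix A.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# The LMI "find P > 0 with A'PA - P < 0" is feasible exactly when the
# Lyapunov equation A'PA - P = -Q (Q > 0) has a positive definite solution.
A = np.array([[0.5, 0.2],
              [-0.1, 0.8]])                 # arbitrary stable example
Q = np.eye(2)

# SciPy solves a X a^H - X + Q = 0; passing A.T yields A'XA - X = -Q.
P = solve_discrete_lyapunov(A.T, Q)
eigs = np.linalg.eigvalsh(P)
print("P eigenvalues:", eigs, "-> stable" if eigs.min() > 0 else "-> not certified")
```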

  15. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  16. Development of suitable plastic standards for X-ray fluorescence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mans, Christian [University of Applied Sciences Muenster, Department of Chemical Engineering, Advanced Analytical Chemistry, Stegerwaldstr. 39, 48565 Steinfurt (Germany)], E-mail: c.mans@fh-muenster.de; Hanning, Stephanie [University of Applied Sciences Muenster, Department of Chemical Engineering, Advanced Analytical Chemistry, Stegerwaldstr. 39, 48565 Steinfurt (Germany)], E-mail: hanning@fh-muenster.de; Simons, Christoph [University of Applied Sciences Muenster, Department of Chemical Engineering, Advanced Analytical Chemistry, Stegerwaldstr. 39, 48565 Steinfurt (Germany)], E-mail: simons@fh-muenster.de; Wegner, Anne [University of Applied Sciences Muenster, Department of Chemical Engineering, Advanced Analytical Chemistry, Stegerwaldstr. 39, 48565 Steinfurt (Germany)], E-mail: awegner@fh-muenster.de; Janssen, Anton [University of Applied Sciences Muenster, Department of Chemical Engineering, Advanced Analytical Chemistry, Stegerwaldstr. 39, 48565 Steinfurt (Germany)], E-mail: janssena@fh-muenster.de; Kreyenschmidt, Martin [University of Applied Sciences Muenster, Department of Chemical Engineering, Advanced Analytical Chemistry, Stegerwaldstr. 39, 48565 Steinfurt (Germany)], E-mail: martin.kreyenschmidt@fh-muenster.de

    2007-02-15

    For the implementation of the EU directives 'Restriction on use of certain Hazardous Substances' and 'Waste Electrical and Electronic Equipment' using X-ray fluorescence analysis, suitable standard materials are required. Plastic standards based on acrylonitrile-butadiene-styrene terpolymer, containing the regulated elements Br, Cd, Cr, Hg and Pb, were developed and produced as granulates and solid bodies. The calibration materials were not generated as dilutions from one master batch; rather, the element concentrations were distributed over nine independent calibration samples. This was necessary to enable inter-elemental corrections and empirical constant mass absorption coefficients. The produced standard materials are characterized by a homogeneous element distribution, which is more than sufficient for X-ray fluorescence analysis. Concentrations of all elements except Br could be determined by Inductively Coupled Plasma Atomic Emission Spectroscopy after microwave-assisted digestion. The concentration of Br was determined by Neutron Activation Analysis at the Hahn-Meitner-Institute in Berlin, Germany. The correlation of the X-ray fluorescence analysis measurements with the values determined using Inductively Coupled Plasma Atomic Emission Spectroscopy and Neutron Activation Analysis showed very good linearity.

  17. Multiscale analysis of replication technique efficiency for 3D roughness characterization of manufactured surfaces

    Science.gov (United States)

    Jolivet, S.; Mezghani, S.; El Mansori, M.

    2016-09-01

    The replication of topography has generally been restricted to optimizing material processing technologies in terms of statistical and single-scale features such as roughness. By contrast, manufactured surface topography is highly complex, irregular, and multiscale. In this work, we demonstrate the use of multiscale analysis on replicates of surface finish to assess the precise control of the finished replica. Five commercial resins used for surface replication were compared. The topography of five standard surfaces representative of common finishing processes was acquired both directly and by a replication technique. The surfaces were then characterized using the ISO 25178 standard and a multiscale decomposition based on a continuous wavelet transform, to compare the roughness transfer quality at different scales. Additionally, the atomic force microscope force modulation mode was used to compare the resins' stiffness properties. The results showed that less stiff resins are able to replicate the surface finish over a larger wavelength band. The method was then tested for non-destructive quality control of automotive gear tooth surfaces.

  18. Provenience studies using neutron activation analysis: the role of standardization

    International Nuclear Information System (INIS)

    Harbottle, G.

    1980-01-01

    This paper covers the historical background of the chemical analysis of archaeological artifacts, which dates back to 1790, through the first application of neutron activation analysis to archaeological ceramics, and goes on to elaborate on the present-day status of neutron activation analysis in provenience studies and the role of standardization. In principle, the concentrations of elements in a neutron-activated specimen can be calculated from an exact knowledge of the neutron flux (its intensity, duration and spectral (energy) distribution) plus an exact gamma-ray count calibrated for efficiency, corrected for branching rates, etc. In practice, however, it is far easier to compare one's unknown to a standard of known or assumed composition. The practice has been for different laboratories to use different standards. With analyses being run in the thousands throughout the world, a great benefit would be derived if analyses could be exchanged among all users and/or generators of data. The emphasis of this paper is on the interlaboratory comparability of ceramic data: how far are we from it, what has been proposed in the past to achieve this goal, and what is being proposed now. All of this may be summarized under the general heading of Analytical Quality Control, i.e., how to achieve precise and accurate analysis. The author proposes that anyone wishing to analyze archaeological ceramics should simply use his own standard, but attempt to calibrate that standard as nearly as possible to absolute (i.e., accurate) concentration values. The relationship of Analytical Quality Control to provenience location is also examined
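
    The comparator principle mentioned here, scaling the standard's known concentration by the ratio of decay-corrected specific count rates, can be written out directly. The function and all numbers below are hypothetical illustrations, assuming identical irradiation and counting geometry for sample and standard.

```python
import math

def naa_concentration(counts_s, counts_std, m_s, m_std, c_std,
                      t_decay_s, t_decay_std, half_life):
    """Comparator NAA: scale the standard's known concentration by the ratio
    of decay-corrected specific count rates (identical geometry assumed)."""
    lam = math.log(2) / half_life
    rate_s = counts_s * math.exp(lam * t_decay_s) / m_s
    rate_std = counts_std * math.exp(lam * t_decay_std) / m_std
    return c_std * rate_s / rate_std

# Hypothetical numbers: a ceramic sample vs. a standard, 24Na-like half-life
print(naa_concentration(counts_s=15200, counts_std=9800, m_s=0.100, m_std=0.050,
                        c_std=12.0, t_decay_s=3600, t_decay_std=5400,
                        half_life=15.0 * 3600))
```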

  19. Elemental analysis of some Egyptian medicinal plants using INAA and FAAS techniques

    International Nuclear Information System (INIS)

    Walley El-Dine, N.; Sroor, A.; Hammed, S.S.; El-Shershaby, A.; Alsamed, M.A

    2009-01-01

    Thirteen Egyptian medicinal plants used for the treatment and cure of various diseases have been elementally analyzed by instrumental neutron activation analysis (INAA) and flame atomic absorption spectrometry (FAAS). The pneumatic rabbit transfer system (PRTS) of the 100 kW Budapest research reactor (BRR) was used for short irradiations (300 s) with a thermal neutron flux of 2.4 × 10^12 n/(cm^2·s). Long irradiations (4 hours) were performed at the second Egyptian research reactor (ET-RR-2) with a thermal neutron flux of 5.6 × 10^13 n/(cm^2·s). Gamma-ray spectra were measured with an HPGe detection system. The concentrations of fifteen elements, namely Sc, Cr, Fe, Co, Zn, Rb, Mo, Sb, La, Ce, Nd, Sm, Yb, Hf and Pa, were determined with the long irradiation, and some of them were also determined by the FAAS technique. Fourteen elements, Na, Mg, Al, Cd, Cl, K, Ca, Ti, V, Mn, Ni, Sr, Pb and Cu, were identified with the short irradiation and the FAAS technique. The precision and accuracy of the method were evaluated using the standard reference material NIST SRM-1571. Comparison of the data obtained shows agreement between the element concentrations determined by the two techniques. The importance of these elements for human health and nutrition has been discussed

  20. Neutron activation analysis of Standard Materials of Comparison IAEA- the corn and soya flour

    International Nuclear Information System (INIS)

    Dadakhanov, J.A.; Sadykov, I.I.; Salimov, M.I.

    2005-01-01

    It is known that quality assurance of results is a key problem in neutron activation analysis (NAA), as in other analytical methods. Above all, the correctness of results must be ensured. The most reliable way to reveal and eliminate systematic errors is to analyze Standard Samples of Comparison (SSC) with the developed techniques and to compare the results obtained with the certified values. The analysis and certification of various SSC is therefore one of the most pressing tasks of modern analytical chemistry. The IAEA, one of the few organizations producing SSC for NAA, organized a project to certify comparison samples of corn and soya flour. The Laboratory of Activation Analysis of Pure Materials of the Institute of Nuclear Physics, Academy of Sciences, Republic of Uzbekistan participated in this project along with many laboratories worldwide. We carried out a series of analyses of corn and soya flour samples, candidates for standard samples of comparison, by instrumental NAA. Samples were prepared according to the technique described in the technical project for these materials and were dried at 80 degree C for 24 h before analysis. They were then cooled, weighed and irradiated in the vertical channel of the VVR-SM nuclear reactor of the Institute of Nuclear Physics, Academy of Sciences, Republic of Uzbekistan for 0.5-1 h (depending on the elements to be determined) at a neutron flux density of 1 × 10^14 neutrons/(cm^2·s). Cooling times ranged from 10 min to 10 days and measurement times from 100 s to 3000 s. Measurements were carried out on a gamma spectrometer consisting of a GC1518 HPGe detector and a DSA-1000 digital multichannel analyzer (Canberra, USA). The spectrometric data were processed with the Genie-2000 software package. As a result of the analyses, we determined the contents of 21 elements in

  1. An Innovative Technique to Assess Spontaneous Baroreflex Sensitivity with Short Data Segments: Multiple Trigonometric Regressive Spectral Analysis.

    Science.gov (United States)

    Li, Kai; Rüdiger, Heinz; Haase, Rocco; Ziemssen, Tjalf

    2018-01-01

    Objective: As the multiple trigonometric regressive spectral (MTRS) analysis is extraordinary in its ability to analyze short local data segments down to 12 s, we wanted to evaluate the impact of the data segment settings by applying the technique of MTRS analysis for baroreflex sensitivity (BRS) estimation using a standardized data pool. Methods: Spectral and baroreflex analyses were performed on the EuroBaVar dataset (42 recordings, including lying and standing positions). For this analysis, the technique of MTRS was used. We used different global and local data segment lengths, and chose the global data segments from different positions. Three global data segments of 1 and 2 min and three local data segments of 12, 20, and 30 s were used in MTRS analysis for BRS. Results: All the BRS-values calculated on the three global data segments were highly correlated, both in the supine and standing positions; the different global data segments provided similar BRS estimations. When using different local data segments, all the BRS-values were also highly correlated. However, in the supine position, using short local data segments of 12 s overestimated BRS compared with those using 20 and 30 s. In the standing position, the BRS estimations using different local data segments were comparable. There was no proportional bias for the comparisons between different BRS estimations. Conclusion: We demonstrate that BRS estimation by the MTRS technique is stable when using different global data segments, and MTRS is extraordinary in its ability to evaluate BRS in even short local data segments (20 and 30 s). Because of the non-stationary character of most biosignals, the MTRS technique would be preferable for BRS analysis especially in conditions when only short stationary data segments are available or when dynamic changes of BRS should be monitored.
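
    The regressive-spectral idea at the heart of MTRS, estimating the amplitude and phase of candidate frequencies on a short segment by least squares, can be sketched as follows. This is not the full MTRS algorithm; the segment length, sampling, and the synthetic RR series are assumed. BRS would then follow as the ratio of the RR amplitude to the systolic pressure amplitude at a common frequency.

```python
import numpy as np

fs = 4.0                         # assumed resampling rate of the RR series, Hz
t = np.arange(0, 20, 1 / fs)     # a 20 s local segment
rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.1 * t + 0.4)   # RR with a 0.1 Hz wave

f = 0.1                          # candidate frequency, Hz
X = np.column_stack([np.sin(2 * np.pi * f * t),
                     np.cos(2 * np.pi * f * t),
                     np.ones_like(t)])
coef, *_ = np.linalg.lstsq(X, rr, rcond=None)          # regressive fit
print(f"fitted amplitude at {f} Hz: {np.hypot(coef[0], coef[1]):.4f} s")
```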

  2. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    Science.gov (United States)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI), require increasingly dense high-power electronics. Enabling these higher power densities while maintaining or even improving hardware reliability requires advances in integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  3. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  4. Setting Standards for Medically-Based Running Analysis

    Science.gov (United States)

    Vincent, Heather K.; Herman, Daniel C.; Lear-Barnes, Leslie; Barnes, Robert; Chen, Cong; Greenberg, Scott; Vincent, Kevin R.

    2015-01-01

    Setting standards for medically based running analyses is necessary to ensure that runners receive a high-quality service from practitioners. Medical and training history, physical and functional tests, and motion analysis of running at self-selected and faster speeds are key features of a comprehensive analysis. Self-reported history and movement symmetry are critical factors that require follow-up therapy or long-term management. Pain or injury is typically the result of a functional deficit above or below the site along the kinematic chain. PMID:25014394

  5. The Delphi Technique in Educational Research

    Directory of Open Access Journals (Sweden)

    Ravonne A. Green

    2014-04-01

    Full Text Available The Delphi Technique has been useful in educational settings in forming guidelines, standards, and in predicting trends. Judd lists these major uses of the Delphi Technique in higher education: (a) cost-effectiveness, (b) cost-benefit analysis, (c) curriculum and campus planning, and (d) university-wide educational goals and objectives. The thorough Delphi researcher seeks to reconcile the Delphi consensus with current literature, institutional research, and the campus environment. This triangle forms a sound base for responsible research practice. This book gives an overview of the Delphi Technique and the primary uses of this technique in research. This article on the Delphi Technique will give the researcher an invaluable resource for learning about the Delphi Technique and for applying this method in educational research projects.

  6. Application of simulated standard spectra in natural radioactivity measurements using gamma spectrometry

    International Nuclear Information System (INIS)

    Narayani, K.; Pant, A.D.; Bhosle, Nitin; Anilkumar, S.; Singh, Rajvir; Pradeepkumar, K.S.

    2014-01-01

    Gamma-ray spectrometry is one of the well-known analytical techniques for environmental radioactivity measurements. Gamma spectrometers based on NaI(Tl) scintillation detectors are very popular since they offer high efficiency, low cost and ease of handling. The poor energy resolution of the NaI(Tl) detector is the major disadvantage, making the analysis of complex gamma-ray spectra difficult. The least-squares method, or full-spectrum analysis method, is widely used for the analysis of complex spectra from scintillation detectors. The main requirement of this method is that individual standard spectra of all nuclides expected in the complex spectrum, in the same measurement geometry, must be available. It is not always possible or feasible to have standards of all nuclides in the desired geometry. A methodology based on the use of simulated standard spectra generated by the Monte Carlo technique was proposed for the analysis of complex spectra of nuclides. In the present work, the same methodology was applied to the analysis of 238U, 232Th and 40K in soil samples, using the simulated standard spectra in a soil matrix. The details of the simulation method and the results of the analysis of 238U, 232Th and 40K in environmental samples are discussed in this paper
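
    The least-squares (full-spectrum) step is easy to sketch once per-nuclide standard spectra are available, whether measured or Monte Carlo simulated. Everything below, including the peak positions used as crude stand-ins for simulated standards, is assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n_ch = 512

def fake_standard(peak):
    """Crude stand-in for a simulated per-nuclide standard spectrum."""
    ch = np.arange(n_ch)
    return np.exp(-(ch - peak) ** 2 / 200) + 0.2 * np.exp(-ch / 300)

S = np.column_stack([fake_standard(p) for p in (186, 239, 352)])
true_act = np.array([3.0, 1.5, 8.0])                 # assumed activities
measured = rng.poisson(S @ true_act * 100) / 100     # noisy composite spectrum

act, *_ = np.linalg.lstsq(S, measured, rcond=None)   # least-squares unmixing
print("fitted activities:", act.round(2))
```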

  7. ANSI/ASHRAE/IESNA Standard 90.1-2007 Final Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Liu, Bing; Richman, Eric E.; Winiarski, David W.

    2011-05-01

    The United States (U.S.) Department of Energy (DOE) conducted a final quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2007 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2004. The final analysis considered each of the 44 addenda to ANSI/ASHRAE/IESNA Standard 90.1-2004 that were included in ANSI/ASHRAE/IESNA Standard 90.1-2007. All 44 addenda processed by ASHRAE in the creation of Standard 90.1-2007 from Standard 90.1-2004 were reviewed by DOE, and their combined impact on a suite of 15 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE's final determination. However, out of the 44 addenda, 9 were preliminarily determined to have measurable and quantifiable impact.

  8. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool that allows his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
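
    Once minimal cut sets are available, the quantitative step of a fault tree evaluation is mechanical. A minimal sketch, with made-up basic events and probabilities, using inclusion-exclusion over the cut sets (exact for independent events):

```python
from itertools import combinations

# Made-up basic events, probabilities, and minimal cut sets
p = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 1e-4}
cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"power"}]

def prob_of(events):
    """Probability that all the given independent basic events occur."""
    result = 1.0
    for e in events:
        result *= p[e]
    return result

# Inclusion-exclusion over the cut sets (exact for independent events)
top = 0.0
for k in range(1, len(cut_sets) + 1):
    for combo in combinations(cut_sets, k):
        top += (-1) ** (k + 1) * prob_of(set().union(*combo))
print(f"top event probability: {top:.3e}")
```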

  9. Solar Cell Calibration and Measurement Techniques

    Science.gov (United States)

    Bailey, Sheila; Brinker, Dave; Curtis, Henry; Jenkins, Phillip; Scheiman, Dave

    2004-01-01

    The increasing complexity of space solar cells and the increasing international markets for both cells and arrays have resulted in workshops jointly sponsored by NASDA, ESA and NASA. These workshops are designed to obtain international agreement on standardized values for the AM0 spectrum and constant, recommend laboratory measurement practices, and establish a set of protocols for international comparison of laboratory measurements. A working draft of an ISO standard, WD15387, "Requirements for Measurement and Calibration Procedures for Space Solar Cells", was discussed with a focus on the scope of the document, a definition of a primary standard cell, and the required error analysis for all measurement techniques. Working groups addressed the issues of the Air Mass Zero (AM0) solar constant and spectrum, laboratory measurement techniques, and the international round robin methodology. A summary is presented of the current state of each area and the formulation of the ISO document.

  10. Nonlinear analysis techniques of block masonry walls in nuclear power plants

    International Nuclear Information System (INIS)

    Hamid, A.A.; Harris, H.G.

    1986-01-01

    Concrete masonry walls have been used extensively in nuclear power plants as non-load bearing partitions serving as pipe supports, fire walls, radiation shielding barriers, and similar heavy construction separations. When subjected to earthquake loads, these walls should maintain their structural integrity. However, some of the walls do not meet design requirements based on working stress allowables. Consequently, utilities have used nonlinear analysis techniques, such as the arching theory and the energy balance technique, to qualify such walls. This paper presents a critical review of the applicability of nonlinear analysis techniques for both unreinforced and reinforced block masonry walls under seismic loading. These techniques are critically assessed in light of the performance of walls from limited available test data. It is concluded that additional test data are needed to justify the use of nonlinear analysis techniques to qualify block walls in nuclear power plants. (orig.)

  11. Portable optical frequency standard based on sealed gas-filled hollow-core fiber using a novel encapsulation technique

    DEFF Research Database (Denmark)

    Triches, Marco; Brusch, Anders; Hald, Jan

    2015-01-01

    A portable stand-alone optical frequency standard based on a gas-filled hollow-core photonic crystal fiber is developed to stabilize a fiber laser to the 13C2H2 P(16) (ν1 + ν3) transition at 1542 nm using saturated absorption. A novel encapsulation technique is developed to permanently seal...

  12. Limitations of transient power loads on DEMO and analysis of mitigation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Maviglia, F., E-mail: francesco.maviglia@euro-fusion.org [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); Federici, G. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Strohmayer, G. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Wenninger, R. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Bachmann, C. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Albanese, R. [Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); Ambrosino, R. [Consorzio CREATE University Napoli Parthenope, Naples (Italy); Li, M. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Loschiavo, V.P. [Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); You, J.H. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Zani, L. [CEA, IRFM, F-13108 St Paul-Lez-Durance (France)

    2016-11-01

    Highlights: • A parametric thermo-hydraulic analysis of the candidate DEMO divertor is presented. • The operational space assessment is presented under static and transient heat loads. • Strike points sweeping is analyzed as a divertor power exhaust mitigation technique. • Results are presented on sweeping installed power required, AC losses and thermal fatigue. - Abstract: The present European standard DEMO divertor target technology is based on a water-cooled tungsten mono-block with a copper alloy heat sink. This paper presents the assessment of the operational space of this technology under static and transient heat loads. A transient thermo-hydraulic analysis was performed using the code RACLETTE, which allowed a broad parametric scan of the target geometry and coolant conditions. The limiting factors considered were the coolant critical heat flux (CHF), and the temperature limits of the materials. The second part of the work is devoted to the study of the plasma strike point sweeping as a mitigation technique for the divertor power exhaust. The RACLETTE code was used to evaluate the impact of a large range of sweeping frequencies and amplitudes. A reduced subset of cases, which complied with the constraints, was benchmarked with a 3D FEM model. A reduction of the heat flux to the coolant, up to a factor ∼4, and lower material temperatures were found for an incident heat flux in the range (15–30) MW/m². Finally, preliminary assessments were performed on the installed power required for the sweeping, the AC losses in the superconductors and thermal fatigue analysis. No evident show-stoppers were found.
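
    To illustrate why sweeping reduces the peak load: moving a peaked heat-flux footprint sinusoidally across the target spreads the deposited power in time. The numpy sketch below uses a Gaussian strike-point profile with invented numbers (peak flux, footprint width, sweep amplitude); it is not the RACLETTE model.

        import numpy as np

        q_peak = 20e6      # hypothetical static peak heat flux, W/m^2
        sigma = 0.01       # hypothetical Gaussian footprint width, m
        amp = 0.05         # hypothetical sweep amplitude, m
        x = np.linspace(-0.15, 0.15, 2001)               # target coordinate, m
        t = np.linspace(0.0, 1.0, 2000, endpoint=False)  # one sweep period

        # Strike point position over one period of sinusoidal sweeping.
        xs = amp * np.sin(2 * np.pi * t)

        # Time-averaged heat flux profile seen by the target.
        q_avg = np.mean(
            [q_peak * np.exp(-((x - s) ** 2) / (2 * sigma**2)) for s in xs],
            axis=0)

        print(f"static peak: {q_peak/1e6:.1f} MW/m^2, "
              f"swept peak: {q_avg.max()/1e6:.1f} MW/m^2 "
              f"(reduction factor {q_peak / q_avg.max():.1f})")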

  13. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  14. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)
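
    The solvability-prediction idea has a direct modern analogue: train a classifier on case attributes (type, location, time) to score the probability that a case is solved. Everything below (feature names, data, model) is a synthetic illustration, not the 1976 study's method.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 2000
        # Synthetic case attributes: crime type (coded), precinct, hour of day.
        X = np.column_stack([
            rng.integers(0, 5, n),    # crime type code
            rng.integers(0, 12, n),   # precinct
            rng.integers(0, 24, n),   # hour
        ])
        # Synthetic "solved" label loosely tied to crime type, for illustration.
        y = (rng.random(n) < 0.2 + 0.1 * X[:, 0] / 4).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))
        print("solvability score of a new case:", clf.predict_proba([[2, 7, 23]])[0, 1])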

  15. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse conditions of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries

  16. A New Paradigm for Tissue Diagnostics: Tools and Techniques to Standardize Tissue Collection, Transport, and Fixation.

    Science.gov (United States)

    Bauer, Daniel R; Otter, Michael; Chafin, David R

    2018-01-01

    Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.

  17. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetism, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advances in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles are reviewed in detail. Characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds and for the immobilization of enzymes are described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis are proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles

    Science.gov (United States)

    Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey

    2013-09-01

    Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the

  19. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range, but they give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea of their concentrations. More often, however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  20. Least squares analysis of fission neutron standard fields

    International Nuclear Information System (INIS)

    Griffin, P.J.; Williams, J.G.

    1997-01-01

    A least squares analysis of fission neutron standard fields has been performed using the latest dosimetry cross sections. Discrepant nuclear data are identified and adjusted spectra for 252 Cf spontaneous fission and 235 U thermal fission fields are presented

  1. Performance analysis of clustering techniques over microarray data: A case study

    Science.gov (United States)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. The grading approach, however, depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also confirmed by the Nemenyi post-hoc hypothesis test.
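
    The two-stage grading idea can be sketched as: run several clustering techniques, score each with multiple validity indices, and rank the techniques by average rank across indices. The sketch below uses k-means and agglomerative nesting with two sklearn validity indices on synthetic data; the index choice and data are assumptions, not the paper's setup.

        import numpy as np
        from sklearn.cluster import AgglomerativeClustering, KMeans
        from sklearn.datasets import make_blobs
        from sklearn.metrics import calinski_harabasz_score, silhouette_score

        X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

        methods = {
            "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
            "AGNES": AgglomerativeClustering(n_clusters=4),
        }
        indices = {"silhouette": silhouette_score,
                   "calinski_harabasz": calinski_harabasz_score}

        # Stage 1: score every technique with every validity index.
        scores = {m: {i: f(X, model.fit_predict(X))
                      for i, f in indices.items()}
                  for m, model in methods.items()}

        # Stage 2: grade techniques by mean rank over the indices (1 = best).
        for idx in indices:
            ranked = sorted(methods, key=lambda m: -scores[m][idx])
            for rank, m in enumerate(ranked, 1):
                scores[m][idx + "_rank"] = rank
        for m in methods:
            ranks = [scores[m][i + "_rank"] for i in indices]
            print(m, scores[m], "mean rank:", np.mean(ranks))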

  2. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  3. [Standardization of Blastocystis hominis diagnosis using different staining techniques].

    Science.gov (United States)

    Eymael, Dayane; Schuh, Graziela Maria; Tavares, Rejane Giacomelli

    2010-01-01

    The present study was carried out from March to May 2008, with the aim of evaluating the effectiveness of different techniques for diagnosing Blastocystis hominis in a sample of the population attended at the Biomedicine Laboratory of Feevale University, Novo Hamburgo, Rio Grande do Sul. One hundred feces samples from children and adults were evaluated. After collection, the samples were subjected to the techniques of spontaneous sedimentation (HPJ), sedimentation in formalin-ether (Ritchie) and staining by means of Gram and May-Grünwald-Giemsa (MGG). The presence of Blastocystis hominis was observed in 40 samples when staining techniques were used (MGG and Gram), while sedimentation techniques were less efficient (32 positive samples using the Ritchie technique and 20 positive samples using the HPJ technique). Our results demonstrate that HPJ was less efficient than the other methods, thus indicating the need to include laboratory techniques that enable parasite identification on a routine basis.

  4. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  5. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting in 1989, a new technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for the analysis of signals. A review of the use of this technique in different fields of elemental analysis is presented
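
    As an illustration of why wavelets suit spectra: a noisy spectrum can be decomposed, its small detail coefficients soft-thresholded, and the signal reconstructed, preserving sharp peaks better than Fourier smoothing. The sketch below uses the PyWavelets package on a synthetic two-peak spectrum; the wavelet family and threshold rule are arbitrary choices for illustration.

        import numpy as np
        import pywt

        rng = np.random.default_rng(2)
        x = np.linspace(0, 1, 1024)
        # Synthetic spectrum: two Gaussian peaks plus white noise.
        clean = (np.exp(-((x - 0.3) / 0.01) ** 2)
                 + 0.5 * np.exp(-((x - 0.7) / 0.02) ** 2))
        noisy = clean + rng.normal(0, 0.05, x.size)

        # Multilevel discrete wavelet decomposition.
        coeffs = pywt.wavedec(noisy, "db4", level=5)
        # Universal threshold estimated from the finest detail coefficients.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(noisy.size))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

        print("rms error noisy   :", np.sqrt(np.mean((noisy - clean) ** 2)))
        print("rms error denoised:", np.sqrt(np.mean((denoised - clean) ** 2)))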

  6. Lightning protection, techniques, applied codes and standards. Vol. 4

    International Nuclear Information System (INIS)

    Mahmoud, M.; Shaaban, H.; Lamey, S.

    1996-01-01

    Lightning is the only natural disaster against which protection is highly effective; therefore, for the safety of critical installations, specifically nuclear, an effective lightning protection system (LPS) is required. The design and installation of LPSs have been addressed by many international codes and standards. In this paper, the various LPSs are discussed and compared, including radioactive air terminals, ionizing air terminals, and terminals equipped with electrical triggering devices. Also, the so-called dissipation array systems are discussed and compared to other systems technically and economically. Moreover, the available international codes and standards related to lightning protection are discussed. Such standards include those published by the National Fire Protection Association (NFPA), the Lightning Protection Institute (LPI), Underwriters Laboratories (UL), and British Standards. Finally, the possibility of developing an Egyptian national standard is discussed

  7. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    Science.gov (United States)

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. © The Author 2016. Published by Oxford University Press.

  8. Closing the gap: accelerating the translational process in nanomedicine by proposing standardized characterization techniques.

    Science.gov (United States)

    Khorasani, Ali A; Weaver, James L; Salvador-Morales, Carolina

    2014-01-01

    On the cusp of widespread permeation of nanomedicine, academia, industry, and government have invested substantial financial resources in developing new ways to better treat diseases. Materials have unique physical and chemical properties at the nanoscale compared with their bulk or small-molecule analogs. These unique properties have been greatly advantageous in providing innovative solutions for medical treatments at the bench level. However, nanomedicine research has not yet fully permeated the clinical setting because of several limitations. Among these limitations are the lack of universal standards for characterizing nanomaterials and the limited knowledge that we possess regarding the interactions between nanomaterials and biological entities such as proteins. In this review, we report on recent developments in the characterization of nanomaterials as well as the newest information about the interactions between nanomaterials and proteins in the human body. We propose a standard set of techniques for universal characterization of nanomaterials. We also address relevant regulatory issues involved in the translational process for the development of drug molecules and drug delivery systems. Adherence and refinement of a universal standard in nanomaterial characterization as well as the acquisition of a deeper understanding of nanomaterials and proteins will likely accelerate the use of nanomedicine in common practice to a great extent.

  9. Toward a standard method for determination of waterborne radon

    International Nuclear Information System (INIS)

    Vitz, E.

    1990-01-01

    When the USEPA specifies the maximum contaminant level (MCL) for any contaminant, a standard method for analysis must be simultaneously stipulated. Promulgation of the proposed MCL and standard method for radon in drinking water is expected by early next year, but a six-month comment period and revision will precede final enactment. The standard method for radon in drinking water will probably specify that either the Lucas cell technique or liquid scintillation spectrometry be used. This paper reports results which support a standard method with the following features: samples should be collected by an explicitly stated technique to control degassing, in glass vials with or without scintillation cocktail, and possibly in duplicate; samples should be measured by liquid scintillation spectroscopy in a specified energy window, in a glass vial with particular types of cocktails; radium standards should be prepared with controlled quench levels and specified levels of carriers, but radium-free controls prepared by a specified method should be used in interlaboratory comparison studies

  10. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    Science.gov (United States)

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high capacity and high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC(50) values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC(50) values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation condition. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). Here we also report on the comparison of IC(50) results for five major CYP isoforms using our method compared to values reported in the literature.
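
    IC(50) values of this kind are conventionally extracted by fitting the percent-remaining-activity data to a four-parameter logistic (Hill) curve. The sketch below is a generic fit with invented concentrations and responses; it is not the authors' processing pipeline.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, ic50, hill):
            """Four-parameter logistic inhibition curve."""
            return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

        # Hypothetical inhibitor concentrations (uM) and % remaining activity.
        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
        activity = np.array([98.0, 90.0, 72.0, 45.0, 20.0, 8.0])

        params, _ = curve_fit(four_pl, conc, activity,
                              p0=[0.0, 100.0, 2.0, 1.0], maxfev=10000)
        bottom, top, ic50, hill = params
        print(f"IC50 = {ic50:.2f} uM, Hill slope = {hill:.2f}")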

  11. Analysis technique for controlling system wavefront error with active/adaptive optics

    Science.gov (United States)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
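
    Given a linear optics model, choosing actuator commands that cancel a measured wavefront error reduces to a least-squares solve against the actuator influence matrix. The sketch below is generic linear algebra with made-up dimensions and data; it is not SigFit's implementation.

        import numpy as np

        rng = np.random.default_rng(3)
        n_nodes, n_act = 500, 12   # hypothetical WFE sample points, actuators

        # Influence matrix: column j is the wavefront change per unit
        # command on actuator j (from the linear optics model).
        A = rng.normal(size=(n_nodes, n_act))
        w = rng.normal(size=n_nodes)          # measured wavefront error

        # Commands minimizing ||w + A @ u||_2 (drive the WFE toward zero).
        u, *_ = np.linalg.lstsq(A, -w, rcond=None)

        residual = w + A @ u
        print(f"RMS WFE before: {np.sqrt(np.mean(w**2)):.3f}, "
              f"after: {np.sqrt(np.mean(residual**2)):.3f}")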

  12. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, and improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, the main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, and pole and zero identification. The preliminary general scheme of a digital MCA is discussed, as well as some other important techniques for its engineering design. All these lay the foundation for developing homemade digital nuclear spectrometers. (authors)
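
    The trapezoidal shaping mentioned above can be realized as a short FIR filter: a kernel of k ones, m zeros and k minus-ones turns a step-like preamplifier pulse into a trapezoid with rise/fall k samples and flat top m samples. The paper simulates this in MATLAB; a Python/numpy sketch on a synthetic pulse (all parameters invented) is given below.

        import numpy as np

        def trapezoidal_filter(v, k, m):
            """Trapezoidal shaping of a step-like pulse: the kernel
            (+1)*k, (0)*m, (-1)*k maps a unit step to a trapezoid with
            rise/fall k samples and flat top m samples."""
            kernel = np.concatenate([np.ones(k), np.zeros(m), -np.ones(k)])
            return np.convolve(v, kernel, mode="full")[: v.size] / k

        # Synthetic preamplifier pulse: step with slow exponential decay
        # plus noise (decay assumed negligible over the shaping time).
        rng = np.random.default_rng(4)
        n, tau, n0 = 2000, 50000.0, 300
        t = np.arange(n)
        v = np.where(t >= n0, np.exp(-(t - n0) / tau), 0.0)
        v = v + rng.normal(0, 0.01, n)

        shaped = trapezoidal_filter(v, k=100, m=50)
        print("pulse height from trapezoid flat top:", shaped.max())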

  13. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty

  14. Extended substitution-diffusion based image cipher using chaotic standard map

    Science.gov (United States)

    Kumar, Anil; Ghose, M. K.

    2011-01-01

    This paper proposes an extended substitution-diffusion based image cipher using the chaotic standard map [1] and a linear feedback shift register to overcome the weakness of the previous technique by adding nonlinearity. The first stage consists of row and column rotation and permutation, controlled by pseudo-random sequences generated by the chaotic standard map and the linear feedback shift register; in the second stage, further diffusion and confusion are obtained in the horizontal and vertical pixels by mixing the properties of the horizontally and vertically adjacent pixels, respectively, with the help of the chaotic standard map. The number of rounds in both stages is controlled by a combination of the pseudo-random sequence and the original image. The performance is evaluated by various types of analysis, such as entropy analysis, difference analysis, statistical analysis, key sensitivity analysis, key space analysis and speed analysis. The experimental results illustrate that the scheme is highly secure and fast.
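
    The chaotic standard map used here is the Chirikov map on the torus; iterating it from a secret initial state yields pseudo-random sequences for the permutation and diffusion stages. A minimal keystream sketch follows, with an arbitrary byte-extraction rule chosen for illustration (not the paper's exact construction).

        import numpy as np

        def standard_map_stream(theta, p, K, n):
            """Iterate the Chirikov standard map
                p'     = (p + K*sin(theta)) mod 2*pi
                theta' = (theta + p') mod 2*pi
            and extract one pseudo-random byte per iteration."""
            two_pi = 2 * np.pi
            out = np.empty(n, dtype=np.uint8)
            for i in range(n):
                p = (p + K * np.sin(theta)) % two_pi
                theta = (theta + p) % two_pi
                # Arbitrary extraction rule: scale the angle to a byte.
                out[i] = int(theta / two_pi * 256) % 256
            return out

        # Secret key = initial condition and kick strength (hypothetical).
        ks = standard_map_stream(theta=1.234567, p=2.345678, K=18.0, n=16)
        print(ks)

        # Example diffusion step: XOR an image row with the keystream.
        row = np.arange(16, dtype=np.uint8)
        print(row ^ ks)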

  15. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
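
    A condensed version of this workflow (standardize the evaluation matrix, cluster the control strategies, inspect principal components) can be sketched as follows; the matrix dimensions and criteria are invented stand-ins, not BSM2 output.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(5)
        # Hypothetical evaluation matrix: 30 control strategies x 8 criteria
        # (e.g. effluent quality, aeration energy, pumping energy, ...).
        E = rng.normal(size=(30, 8))
        Ez = StandardScaler().fit_transform(E)

        # Cluster analysis: hierarchical clustering of strategies (Ward).
        clusters = fcluster(linkage(Ez, method="ward"), t=3,
                            criterion="maxclust")

        # PCA: how much structure do the first components capture?
        pca = PCA(n_components=3).fit(Ez)
        print("cluster labels:", clusters)
        print("explained variance ratios:",
              pca.explained_variance_ratio_.round(2))
        print("criterion loadings on PC1:", pca.components_[0].round(2))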

  16. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    Science.gov (United States)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
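
    For reference, the preconditioned conjugate gradient iteration itself is compact; in the authors' scheme the preconditioner solve would correspond to the orthotropic part of the global stiffness matrix. The sketch below uses a random symmetric positive definite system and a diagonal preconditioner as a stand-in.

        import numpy as np

        def pcg(A, b, M_solve, tol=1e-10, max_iter=500):
            """Preconditioned conjugate gradients for SPD A;
            M_solve(r) applies the inverse of the preconditioner."""
            x = np.zeros_like(b)
            r = b - A @ x
            z = M_solve(r)
            d = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ad = A @ d
                alpha = rz / (d @ Ad)
                x += alpha * d
                r -= alpha * Ad
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    break
                z = M_solve(r)
                rz_new = r @ z
                d = z + (rz_new / rz) * d
                rz = rz_new
            return x

        # Random SPD test system; Jacobi preconditioner as a stand-in.
        rng = np.random.default_rng(6)
        B = rng.normal(size=(50, 50))
        A = B @ B.T + 50 * np.eye(50)
        b = rng.normal(size=50)
        diag = np.diag(A)
        x = pcg(A, b, lambda r: r / diag)
        print("residual norm:", np.linalg.norm(b - A @ x))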

  17. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  18. Development of international standards for surface analysis by ISO technical committee 201 on surface chemical analysis

    International Nuclear Information System (INIS)

    Powell, C.J.

    1999-01-01

    Full text: The International Organization for Standardization (ISO) established Technical Committee 201 on Surface Chemical Analysis in 1991 to develop documentary standards for surface analysis. ISO/TC 201 met first in 1992 and has met annually since. This committee now has eight subcommittees (Terminology, General Procedures, Data Management and Treatment, Depth Profiling, AES, SIMS, XPS, and Glow Discharge Spectroscopy (GDS)) and one working group (Total X-Ray Fluorescence Spectroscopy). Each subcommittee has one or more working groups to develop standards on particular topics. Australia has observer-member status on ISO/TC 201 and on all ISO/TC 201 subcommittees except GDS where it has participator-member status. I will outline the organization of ISO/TC 201 and summarize the standards that have been or are being developed. Copyright (1999) Australian X-ray Analytical Association Inc

  19. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
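
    One simple way to obtain sensitivities directly from the original code's Monte Carlo results, without building a response surface, is to rank-correlate each sampled input with the output. A generic sketch follows; the toy model merely stands in for a code such as SPARC.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(7)
        n = 1000
        # Sampled input distributions (hypothetical uncertain parameters).
        inputs = {
            "a": rng.lognormal(0.0, 0.5, n),
            "b": rng.uniform(0.5, 2.0, n),
            "c": rng.normal(1.0, 0.1, n),
        }

        # Stand-in for running the original computer code on each sample.
        output = inputs["a"] ** 2 + 0.1 * inputs["b"] + rng.normal(0, 0.05, n)

        # Spearman rank correlation as a sensitivity measure per input.
        for name, x in inputs.items():
            rho, _ = spearmanr(x, output)
            print(f"sensitivity of output to {name}: rho = {rho:+.2f}")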

  20. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses the TLD 700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the response of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response were investigated and compared, involving dose values in this interval. These techniques include thermal pre-treatment and different glow curve analysis methods. The results showed the need to develop specific software that performs automatic background subtraction on the glow curve of each dosemeter. This software was developed and has been tested. Preliminary results show that the software increases the response reproducibility. (author)

  1. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  2. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis, based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method, for highly redundant structures having a large number of collapse modes. This approach makes the best use of the merits of the optimization technique, into which the idea of the PNET method is incorporated. The analytical process involves the minimization of the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential performance of a series of NLP (Nonlinear Programming) problems, where the correlation condition of the PNET method pertaining to the representative mode is taken as an additional constraint for the next analysis. Over succeeding iterations, the final analysis is reached when the collapse probability of the subsequent mode is much smaller than the value of the first mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, a conventional Monte Carlo simulation is also devised using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  5. Newly developed standard reference materials for organic contaminant analysis

    Energy Technology Data Exchange (ETDEWEB)

    Poster, D.; Kucklick, J.; Schantz, M.; Porter, B.; Wise, S. [National Inst. of Stand. and Technol., Gaithersburg, MD (USA). Center for Anal. Chem.

    2004-09-15

    The National Institute of Standards and Technology (NIST) has issued a number of Standard Reference Materials (SRMs) for specified analytes. The SRMs comprise biota and biologically related materials as well as sediment- and particle-related materials. The certified compounds for analysis are polychlorinated biphenyls (PCB), polycyclic aromatic hydrocarbons (PAH) and their nitro-analogues, chlorinated pesticides, methylmercury, organic tin compounds, fatty acids, and polybrominated diphenyl ethers (PBDE). The authors report on the origin of the materials and the analytical methods. (uke)

  6. A comparative examination of several techniques for the routine determination of mercury in biological samples by neutron activation analysis

    International Nuclear Information System (INIS)

    Faanhof, A.; Das, H.A.

    1978-01-01

    A comparative examination of the most important techniques for the separation of mercury from irradiated biological material was made. Procedures for routine analysis and results for standard materials are given. Activation was performed at a thermal neutron flux of approximately 5x10^12 n cm^-2 s^-1. The variation of the neutron flux with the irradiation position can be measured by the application of thin iron rings as flux monitors. Losses of mercury due to uptake in the wall of the irradiation containers are negligible. The most powerful destruction technique for large samples is that based on a stainless-steel bomb. (T. I.)

  7. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) of a pressurized water reactor (PWR) plant operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the interval between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition, plant, etc. The existing motion analysis, which applies a single analysis technique to all plant conditions and plants, therefore has an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates the wide variation in CRDM operational data and improves analysis accuracy. (author)
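
    A minimal illustration of the Random Forests idea: extract features from the coil-current waveform and let the forest predict the armature response time. All data, features and coefficients below are synthetic stand-ins, not MHI's implementation.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(8)
        n = 500
        # Synthetic waveform features: peak current, rise slope, plateau.
        X = np.column_stack([
            rng.normal(10.0, 1.0, n),   # peak current (A)
            rng.normal(50.0, 5.0, n),   # rise slope (A/s)
            rng.normal(8.0, 0.5, n),    # plateau level (A)
        ])
        # Synthetic armature closed time, loosely tied to the features (ms).
        y = 40.0 - 0.8 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 0.5, n)

        model = RandomForestRegressor(n_estimators=300, random_state=0)
        print("CV R^2:", cross_val_score(model, X, y, cv=5).mean().round(2))
        model.fit(X, y)
        print("feature importances:", model.feature_importances_.round(2))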

  8. Technique of sample preparation for analysis of gasoline and lubricating oils by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Avila P, P.

    1990-03-01

    The X-ray fluorescence laboratory of the National Institute of Nuclear Research, not having a technique for the analysis of oils, has intended with this work to develop a sample preparation technique for the analysis of the metals Pb, Cr, Ni, V and Mo in gasolines and oils by means of X-ray fluorescence spectrometry. The results obtained will be of great utility to the aforementioned laboratory. (Author)

  9. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result, it requires a huge sample size, which makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined from the variance reduction when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. Some of the results show that sample sizes generated from LHS for the small signal stability application produce the same result as the IDEAL values starting from a sample size of 100. This shows that about 100 samples of the random variable generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability applications. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, compared to the SRS technique, which signifies the robustness of LHS over SRS. A sample size of 100 under LHS produces the same result as the conventional method with a sample size of 50000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
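
    The variance-reduction claim is easy to reproduce for a simple expectation: repeat an estimate many times under simple random sampling and under Latin hypercube sampling and compare the spreads. A sketch using scipy's qmc module follows (a toy integrand, not the eigenvalue study).

        import numpy as np
        from scipy.stats import qmc

        rng = np.random.default_rng(9)
        f = lambda u: np.sin(2 * np.pi * u).ravel() ** 2  # toy integrand

        def estimate(sampler, n):
            return f(sampler(n)).mean()

        srs = lambda n: rng.random((n, 1))
        def lhs(n):
            return qmc.LatinHypercube(d=1, seed=rng.integers(1 << 31)).random(n)

        n, reps = 100, 200
        srs_est = [estimate(srs, n) for _ in range(reps)]
        lhs_est = [estimate(lhs, n) for _ in range(reps)]
        print("SRS std:", np.std(srs_est))
        print("LHS std:", np.std(lhs_est))  # markedly smaller spread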

  10. Factors influencing incidence of acute grade 2 morbidity in conformal and standard radiation treatment of prostate cancer

    International Nuclear Information System (INIS)

    Hanks, Gerald E.; Schultheiss, Timothy E.; Hunt, Margie A.; Epstein, Barry

    1995-01-01

    Purpose: The fundamental hypothesis of conformal radiation therapy is that tumor control can be increased by using conformal treatment techniques that allow a higher tumor dose while maintaining an acceptable level of complications. To test this hypothesis, it is necessary first to estimate the incidence of morbidity for both standard and conformal fields. In this study, we examine factors that influence the incidence of acute grade 2 morbidity in patients treated with conformal and standard radiation treatment for prostate cancer. Methods and Materials: Two hundred and forty-seven consecutive patients treated with the conformal technique are combined with and compared to 162 consecutive patients treated with standard techniques. The conformal technique includes special immobilization by a cast, careful identification of the target volume in three dimensions, localization of the inferior border of the prostate using the retrograde urethrogram, and individually shaped portals that conform to the Planning Target Volume (PTV). Univariate analysis compares differences in the incidence of RTOG-EORTC grade 2 acute morbidity by technique, T stage, age, irradiated volume, and dose. Multivariate logistic regression includes these same variables. Results: In nearly all categories, the conformal treatment group experienced significantly fewer acute grade 2 complications than the standard treatment group. Only volume (prostate ± whole pelvis) and technique (conformal vs. standard) were significantly related to the incidence of morbidity on multivariate analysis. When dose is treated as a continuous variable (rather than being dichotomized into two levels), a trend is observed on multivariate analysis, but it does not reach significance. The incidence of acute grade 2 morbidity in patients 65 years or older is significantly reduced by use of the conformal technique. Conclusion: The conformal technique is associated with fewer grade 2 acute toxicities for all patients. This

  11. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pannek, Kerstin [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, School of Medicine, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); Guzzetta, Andrea [IRCCS Stella Maris, Department of Developmental Neuroscience, Calambrone Pisa (Italy); Colditz, Paul B. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Perinatal Research Centre, Brisbane (Australia); Rose, Stephen E. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); University of Queensland Centre for Clinical Research, Royal Brisbane and Women' s Hospital, Brisbane (Australia)

    2012-10-15

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. (orig.)

  12. Robotic and endoscopic transaxillary thyroidectomies may be cost prohibitive when compared to standard cervical thyroidectomy: a cost analysis.

    Science.gov (United States)

    Cabot, Jennifer C; Lee, Cho Rok; Brunaud, Laurent; Kleiman, David A; Chung, Woong Youn; Fahey, Thomas J; Zarnegar, Rasa

    2012-12-01

    This study presents a cost analysis of the standard cervical, gasless transaxillary endoscopic, and gasless transaxillary robotic thyroidectomy approaches based on medical costs in the United States. A retrospective review of 140 patients who underwent standard cervical, transaxillary endoscopic, or transaxillary robotic thyroidectomy at 2 tertiary centers was conducted. The cost model included operating room charges, anesthesia fee, consumables cost, equipment depreciation, and maintenance cost. Sensitivity analyses assessed individual cost variables. The mean operative times for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were 121 ± 18.9, 185 ± 26.0, and 166 ± 29.4 minutes, respectively. The total costs for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were $9,028 ± $891, $12,505 ± $1,222, and $13,670 ± $1,384, respectively. Transaxillary approaches were significantly more expensive than the standard cervical technique. Sensitivity analysis showed equivalence in cost when transaxillary endoscopic operative time decreased to 111 minutes and transaxillary robotic operative time decreased to 68 minutes. Increasing the case load did not resolve the cost difference. Transaxillary endoscopic and transaxillary robotic thyroidectomies are significantly more expensive than the standard cervical approach. Decreasing operative times reduces this cost difference. The greater expense may be prohibitive in countries with a flat reimbursement schedule. Copyright © 2012 Mosby, Inc. All rights reserved.

  13. Analysis of Angolan human hair samples by the k0-NAA technique on the Dalat research reactor

    International Nuclear Information System (INIS)

    Lemos, P.C.D; Ho Manh Dung; Cao Dong Vu; Nguyen Thi Sy; Nguyen Mong Sinh

    2006-01-01

    Concentrations of trace elements in human hair differ from person to person according to life history factors such as occupation, sex, age, diet, habits and social conditions. Individual deviations of elemental concentrations also reflect the degree of exposure of the human body to environmental pollutants, as well as food intake and metabolism. The k0-standardization method of neutron activation analysis (k0-NAA) on a research reactor has been recommended by the WHO and the IAEA as a main analytical technique owing to its sensitivity, precision, accuracy, multi-element capability and suitability for routine use. This report presents the results of the determination of about 20 elements in 23 human hair samples collected from different places in Angola, using the k0-NAA technique on the Dalat nuclear research reactor. The accuracy of the method was ascertained by analysis of two human hair certified reference materials (CRMs), NIES-5 and GBW-09101, and assessed by the deviation of experimental from certified values, generally within 10%, with U-score values mostly below 2. (author)
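
    For illustration, the U-score check mentioned above reduces to the deviation between experimental and certified values expressed in units of their combined standard uncertainty; the numbers in the sketch below are invented, not data from the report.

        import math

        def u_score(measured, u_measured, certified, u_certified):
            """|difference| in combined-uncertainty units; |U| < 2 is usually
            taken as agreement at roughly the 95% level."""
            return abs(measured - certified) / math.sqrt(u_measured**2 + u_certified**2)

        # hypothetical Zn result for a hair CRM (values illustrative)
        print(u_score(172.0, 9.0, 169.0, 8.0))  # ~0.25 -> consistent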

  14. Determination of trace quantities of uranium in rocks by mass spectrometric isotope dilution technique

    International Nuclear Information System (INIS)

    Kakazu, Mauricio Hiromitu

    1980-01-01

    A detailed experimental investigation of the thermionic emission of uranium deposited on a single flat-type rhenium filament has been carried out. The study was aimed at determining the influence of various forms of deposition on the emission sensitivity and thermal stability of U+, UO+ and UO2+ ions. Based on these investigations, a technique involving the addition of a small quantity of a colloidal suspension of graphite on top of the deposited uranyl nitrate sample was chosen because of its higher emission sensitivity for uranium metal ions. The experimental parameters of the technique were optimised, and the technique was employed in the determination of trace quantities of uranium in rock samples using the mass spectrometric isotope dilution method. For the mass spectrometric isotope dilution analysis, the National Bureau of Standards uranium isotopic standard NBS-U970 was employed as a tracer, whereas the mass discrimination effect in the uranium isotope analysis was corrected using the uranium isotopic standard NBS-U500. Uranium was determined in each of seven granite samples from Wyoming, USA and two USGS standard rocks. The precision of the analysis was found to be ±1%. The uranium values obtained for the rock samples were compared with the analyses of other investigators. The influence of sample splitting on the uranium analysis is discussed in the light of the analytical results obtained. (author)
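
    As a sketch of the general two-isotope dilution balance underlying such determinations (not the thesis's exact procedure), the amount of sample uranium follows from the measured 235U/238U atom ratios of spike, sample and blend; all numerical values below are illustrative.

        # R = n(235U)/n(238U) atom ratio; isotope-balance solution for the sample.
        def n238_sample(n238_spike, r_spike, r_sample, r_mix):
            """Moles of sample 238U from spike amount and three measured ratios."""
            return n238_spike * (r_spike - r_mix) / (r_mix - r_sample)

        # converting to a concentration additionally needs the sample mass and the
        # uranium atomic weight, omitted here for brevity
        print(n238_sample(n238_spike=1.0e-8,  # mol of 238U added with the tracer
                          r_spike=40.0,       # highly enriched tracer (illustrative)
                          r_sample=0.00725,   # natural uranium
                          r_mix=0.5))         # measured ratio of the blend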

  15. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
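
    A minimal sketch of one common normalized-contrast definition and a half-max width estimate is given below; the exact formulations used in the paper may differ, and the arrays are placeholders.

        import numpy as np

        def normalized_contrast(t_defect, t_reference):
            """(defect - reference) / reference on pixel temperature evolutions."""
            t_defect = np.asarray(t_defect, dtype=float)
            t_reference = np.asarray(t_reference, dtype=float)
            return (t_defect - t_reference) / t_reference

        def half_max_width(profile, pixel_pitch_mm=1.0):
            """Width of a spatial contrast profile at half its peak value."""
            profile = np.asarray(profile, dtype=float)
            return (profile >= 0.5 * profile.max()).sum() * pixel_pitch_mm

        # placeholder data: a defect pixel cooling more slowly than sound material
        t = np.linspace(0.1, 5.0, 50)
        c = normalized_contrast(25 + 8 / np.sqrt(t), 25 + 6 / np.sqrt(t))
        print(c.max())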

  16. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  17. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  18. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conferences on Techniques of Nuclear and Conventional Analysis and Applications (TANCA) are registered in the national strategy of opening the universities and national research centres at the local, national and international levels. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  19. Adaptive, Small-Rotation-Based, Corotational Technique for Analysis of 2D Nonlinear Elastic Frames

    Directory of Open Access Journals (Sweden)

    Jaroon Rungamornrat

    2014-01-01

    This paper presents an efficient and accurate numerical technique for the analysis of two-dimensional frames accounting for both geometric nonlinearity and nonlinear elastic material behavior. An adaptive remeshing scheme is utilized to optimally discretize a structure into a set of elements where the total displacement can be decomposed into a rigid body movement and a part possessing small rotations. This therefore allows the force-deformation relationship for the latter part to be established based on small-rotation-based kinematics. The nonlinear elastic material model is integrated into this relation via a prescribed nonlinear moment-curvature relationship. The global force-displacement relation for each element can subsequently be derived using corotational formulations. A final system of nonlinear algebraic equations, along with its associated gradient matrix for the whole structure, is obtained by a standard assembly procedure and then solved numerically by the Newton-Raphson algorithm. A selected set of results is then reported to demonstrate and discuss the computational performance, including the accuracy and convergence, of the proposed technique.
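
    As a hedged illustration of the solution step described above, the sketch below applies Newton-Raphson iteration to a one-degree-of-freedom nonlinear system; the softening-spring residual is invented and stands in for the paper's assembled frame equations.

        # Generic Newton-Raphson iteration of the kind used on the assembled
        # nonlinear algebraic equations; a one-DOF sketch, not the paper's code.
        def newton_raphson(residual, gradient, u0, tol=1e-10, max_iter=50):
            u = u0
            for _ in range(max_iter):
                r = residual(u)
                if abs(r) < tol:
                    return u
                u -= r / gradient(u)  # multi-DOF case: solve K_T * du = -r
            raise RuntimeError("Newton-Raphson did not converge")

        # softening spring R(u) = 10*u - u**3 - P under load P = 5 (illustrative)
        u = newton_raphson(lambda u: 10*u - u**3 - 5.0,
                           lambda u: 10 - 3*u**2, u0=0.0)
        print(u)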

  20. Standards guide for space and earth sciences computer software

    Science.gov (United States)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  1. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also well suited to searches for low energy rare events, such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires lowering the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  2. Preliminary results of standard quantitative analysis by ED-XRF

    International Nuclear Information System (INIS)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A.

    2013-01-01

    A comparison was performed between the elemental concentrations reported by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced from two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points each, according to a fixed orientation. The measurements were performed at the Radiological Laboratory of UTFPR using an Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were 5 μA current at 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared with the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the samples. Finally, a proposal was made to continue the work using an auxiliary calculation, to be developed in the next step

  3. Preliminary results of standard quantitative analysis by ED-XRF

    Energy Technology Data Exchange (ETDEWEB)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A., E-mail: alellara@hotmail.com [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Dept. de Fisica; Denyak, Valeriy, E-mail: denyak@gmail.com [Instituto de Pesquisa Pele Pequeno Principe (IPPP), Curitiba, PR (Brazil)

    2013-07-01

    A comparison was performed between the elemental concentrations reported by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced from two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points each, according to a fixed orientation. The measurements were performed at the Radiological Laboratory of UTFPR using an Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were 5 μA current at 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared with the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the samples. Finally, a proposal was made to continue the work using an auxiliary calculation, to be developed in the next step.
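
    The stoichiometric side of such a comparison reduces to elemental mass fractions computed from atomic masses and compound formulas; a minimal sketch (the compound chosen here, MgCl2, matches one of the constituents named above, the rest is generic):

        # Expected elemental mass fraction from a compound formula.
        ATOMIC_MASS = {"Mg": 24.305, "Cl": 35.453, "Pb": 207.2, "O": 15.999}

        def mass_fraction(element, formula):
            """formula as {element: count}, e.g. MgCl2 -> {'Mg': 1, 'Cl': 2}."""
            total = sum(ATOMIC_MASS[el] * n for el, n in formula.items())
            return ATOMIC_MASS[element] * formula.get(element, 0) / total

        print(mass_fraction("Cl", {"Mg": 1, "Cl": 2}))  # ~0.745 expected Cl in MgCl2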

  4. Wavelength standards in the infrared

    CERN Document Server

    Rao, KN

    2012-01-01

    Wavelength Standards in the Infrared is a compilation of wavelength standards suitable for use with high-resolution infrared spectrographs, including both emission and absorption standards. The book presents atomic line emission standards of argon, krypton, neon, and xenon. These atomic line emission standards are from the deliberations of Commission 14 of the International Astronomical Union, which is the recognized authority for such standards. The text also explains the techniques employed in determining spectral positions in the infrared. One of the techniques used includes the grating con

  5. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
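
    One simple construction, shown purely for illustration and not necessarily the single best approach the abstract refers to, multiplies each raw coefficient by its predictor's standard deviation so that coefficients express the change in the logit per standard deviation of the predictor:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 2)) * [1.0, 10.0]  # predictors on unequal scales
        y = (X[:, 0] + 0.1 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

        model = LogisticRegression(C=1e6).fit(X, y)  # near-unpenalized fit
        b_std = model.coef_[0] * X.std(axis=0)       # per-SD change in the logit
        print(b_std)  # comparable magnitudes despite different raw scales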

  6. Application of a microwave-based desolvation system for multi-elemental analysis of wine by inductively coupled plasma based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Grindlay, Guillermo [Department of Analytical Chemistry, Nutrition and Food Sciences, University of Alicante, P.O. Box 99, 03080 Alicante (Spain)], E-mail: guillermo.grindlay@ua.es; Mora, Juan; Maestre, Salvador; Gras, Luis [Department of Analytical Chemistry, Nutrition and Food Sciences, University of Alicante, P.O. Box 99, 03080 Alicante (Spain)

    2008-11-23

    Elemental wine analysis is often required from a nutritional, toxicological, origin and authenticity point of view. Inductively coupled plasma based techniques are usually employed for this analysis because of their multi-elemental capabilities and good limits of detection. However, the accurate analysis of wine samples strongly depends on their matrix composition (i.e. salts, ethanol, organic acids) since they lead to both spectral and non-spectral interferences. To mitigate ethanol (up to 10% w/w) related matrix effects in inductively coupled plasma atomic emission spectrometry (ICP-AES), a microwave-based desolvation system (MWDS) can be successfully employed. This finding suggests that the MWDS could be employed for elemental wine analysis. The goal of this work is to evaluate the applicability of the MWDS for elemental wine analysis in ICP-AES and inductively coupled plasma mass spectrometry (ICP-MS). For the sake of comparison a conventional sample introduction system (i.e. pneumatic nebulizer attached to a spray chamber) was employed. Matrix effects, precision, accuracy and analysis throughput have been selected as comparison criteria. For ICP-AES measurements, wine samples can be directly analyzed without any sample treatment (i.e. sample dilution or digestion) using pure aqueous standards although internal standardization (IS) (i.e. Sc) is required. The behaviour of the MWDS operating with organic solutions in ICP-MS has been characterized for the first time. In this technique the MWDS has shown its efficiency to mitigate ethanol related matrix effects up to concentrations of 1% (w/w). Therefore, wine samples must be diluted to reduce the ethanol concentration up to this value. The results obtained have shown that the MWDS is a powerful device for the elemental analysis of wine samples in both ICP-AES and ICP-MS. In general, the MWDS has some attractive advantages for elemental wine analysis when compared to a conventional sample introduction system such

  7. Application of a microwave-based desolvation system for multi-elemental analysis of wine by inductively coupled plasma based techniques

    International Nuclear Information System (INIS)

    Grindlay, Guillermo; Mora, Juan; Maestre, Salvador; Gras, Luis

    2008-01-01

    Elemental wine analysis is often required from a nutritional, toxicological, origin and authenticity point of view. Inductively coupled plasma based techniques are usually employed for this analysis because of their multi-elemental capabilities and good limits of detection. However, the accurate analysis of wine samples strongly depends on their matrix composition (i.e. salts, ethanol, organic acids) since they lead to both spectral and non-spectral interferences. To mitigate ethanol (up to 10% w/w) related matrix effects in inductively coupled plasma atomic emission spectrometry (ICP-AES), a microwave-based desolvation system (MWDS) can be successfully employed. This finding suggests that the MWDS could be employed for elemental wine analysis. The goal of this work is to evaluate the applicability of the MWDS for elemental wine analysis in ICP-AES and inductively coupled plasma mass spectrometry (ICP-MS). For the sake of comparison a conventional sample introduction system (i.e. pneumatic nebulizer attached to a spray chamber) was employed. Matrix effects, precision, accuracy and analysis throughput have been selected as comparison criteria. For ICP-AES measurements, wine samples can be directly analyzed without any sample treatment (i.e. sample dilution or digestion) using pure aqueous standards although internal standardization (IS) (i.e. Sc) is required. The behaviour of the MWDS operating with organic solutions in ICP-MS has been characterized for the first time. In this technique the MWDS has shown its efficiency to mitigate ethanol related matrix effects up to concentrations of 1% (w/w). Therefore, wine samples must be diluted to reduce the ethanol concentration up to this value. The results obtained have shown that the MWDS is a powerful device for the elemental analysis of wine samples in both ICP-AES and ICP-MS. In general, the MWDS has some attractive advantages for elemental wine analysis when compared to a conventional sample introduction system such
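
    A sketch of the internal standardization step described in both records above: analyte intensities are ratioed to the Sc internal standard before applying a calibration built from aqueous standards. The intensities and calibration slope below are invented.

        def is_corrected_concentration(i_analyte, i_sc, slope_ratio):
            """slope_ratio: calibration slope of (I_analyte / I_Sc) vs concentration."""
            return (i_analyte / i_sc) / slope_ratio

        print(is_corrected_concentration(i_analyte=15200.0, i_sc=9800.0,
                                         slope_ratio=0.0031))  # ~500 (e.g. ug/L)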

  8. Appendix 1: Analytical Techniques (Online supplementary material ...

    Indian Academy of Sciences (India)

    HP

    Further details of analytical techniques are given in http://www.actlabs.com. Zircon U–Pb dating and trace element analysis. The zircons were separated using standard procedures including crushing (in iron mortar and pestle), sieving (375 to 75 micron), tabling, heavy liquid separation (bromoform and methylene iodide) ...

  9. On the evaluation of micromatter thin standards by RBS

    International Nuclear Information System (INIS)

    Ionescu, M.; Stelcer, E.; Hawas, O.; Siegele, R.; Cohen, D.; Linch, D.; Sarbutt, A.; Garton, D.

    2005-01-01

    Thin film standards are routinely used in PIXE and PIGE techniques for elemental analysis of particulates present in air samples, collected on Teflon filters. A number of parameters such as thickness, homogeneity and the type and amount of impurities present in the standards are crucial in order to perform high accuracy measurements. In this paper we report the use of RBS on the new STAR 2MV accelerator for characterisation of thin film standards obtained commercially. All standards were produced by MicroMatter Co. on polymer substrates, using a room temperature evaporation method. (author). 4 refs., 5 figs., 1 tab

  10. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co{sup III} porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  11. Tribological analysis of nano clay/epoxy/glass fiber by using Taguchi’s technique

    International Nuclear Information System (INIS)

    Senthil Kumar, M.S.; Mohana Sundara Raju, N.; Sampath, P.S.; Vivek, U.

    2015-01-01

    Highlights: • To study the tribological properties of modified epoxy with and without E-glass fiber. • To analyze the tribological properties of the specimens by Taguchi's technique and ANOVA. • To investigate the surface morphology of the test specimens with SEM. - Abstract: In this work, a detailed analysis was performed to study the tribological properties of epoxy loaded with various amounts of nano clay (Cloisite 25A), with and without the inclusion of E-glass fiber, using Taguchi's technique. For this purpose, the test samples were prepared according to the ASTM standard, and the tests were carried out with the assistance of a pin-on-disk machine. An L25 orthogonal array was constructed to evaluate the tribological properties with four control variables, namely filler content, normal load, sliding velocity and sliding distance, at each level. The results indicated that the combination of factors greatly influenced the process of achieving the minimum wear and coefficient of friction. Overall, the experimental results showed the least wear and friction coefficient for the fiber-reinforced laminates, while appreciable wear and friction coefficients were noted for the laminates without fiber. The S/N ratio results exhibited a similar trend. Moreover, the ANOVA analysis revealed that fiber inclusion makes a smaller contribution to the coefficient of friction and wear than in laminates without fiber. Finally, the microstructure of the test samples was investigated with the assistance of a Scanning Electron Microscope (SEM) to analyze the surface morphology
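
    For illustration, the smaller-the-better signal-to-noise ratio conventionally used in such Taguchi analyses is sketched below, with invented wear replicates standing in for a single L25 run.

        import numpy as np

        def sn_smaller_the_better(y):
            """Taguchi S/N ratio for a minimize-the-response characteristic."""
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(y**2))

        # invented wear replicates (mm^3) for one run; higher S/N = less wear
        print(sn_smaller_the_better([0.012, 0.015, 0.011]))  # ~37.9 dB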

  12. Use of Atomic and Nuclear Techniques in Elemental and Isotopic Analysis

    International Nuclear Information System (INIS)

    2008-01-01

    This book is divided into four chapters, presented by six authors who are among the best Arab specialists and who have long used atomic and nuclear techniques, recognizing their importance and capabilities in scientific research. Atomic and nuclear techniques are very successful in the field of analysis because they are the only way to carry out the analysis with the required accuracy, while at the same time being the cheapest. A number of these techniques were collected in this book on the basis of their accuracy and the frequency of their use in the analysis of material components, especially when the elements of interest are present at insignificant percentages, as in toxicology, archaeology, nutrition, medicine and other applications.

  13. Analysis of gold in jewellery articles by energy dispersive XRF

    International Nuclear Information System (INIS)

    Meor Yusoff Meor Sulaiman; Latifah Amin

    2001-01-01

    The value of a precious metal article is closely related to its fineness. For gold assay, the conventional fire assay technique has been used as the standard technique for more than 500 years. Alternative modern techniques such as energy dispersive x-ray fluorescence can also be used in the determination of gold purity. Advantages of this technique compared with the conventional method include non-destructive analysis, avoidance of toxic or hazardous chemicals, automatic computer control and user friendliness; it also requires a minimum number of personnel, offers a shorter analysis time and is able to determine associated elements in the metal. Analysis was performed on gold of different sizes and purities. Comparison of the results obtained using different reference standards shows small differences between the values given by the technique and the certified values. The technique also gives a small standard deviation in its repeatability test. (Author)

  14. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria, to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability

  15. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria, to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.
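
    As background to the THERP basis mentioned in both records above, the sketch below implements the standard THERP dependence adjustment of a conditional human error probability (Swain and Guttmann's dependence equations); it is an illustration of one quantification rule such methods build on, not the Korean standard method itself, and the numeric example is invented.

        # THERP conditional HEP given the dependence level on the preceding task.
        def conditional_hep(hep, dependence):
            rules = {
                "zero":     hep,
                "low":      (1 + 19 * hep) / 20,
                "moderate": (1 + 6 * hep) / 7,
                "high":     (1 + hep) / 2,
                "complete": 1.0,
            }
            return rules[dependence]

        print(conditional_hep(0.003, "low"))  # ~0.053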

  16. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards is a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
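
    A hedged sketch of how the discount rate enters the LCC test: the present value of a stream of annual energy-cost savings is compared against the increase in first cost. All numerical values below are invented for illustration.

        def present_value(annual_saving, discount_rate, lifetime_years):
            """Discounted sum of equal annual savings over the equipment lifetime."""
            return sum(annual_saving / (1 + discount_rate) ** t
                       for t in range(1, lifetime_years + 1))

        extra_first_cost = 500.0  # hypothetical added cost of the efficient unit
        pv = present_value(annual_saving=90.0, discount_rate=0.054, lifetime_years=10)
        print(pv - extra_first_cost)  # positive -> LCC savings at this discount rate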

  17. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  18. Analysis and Comparison of the Antioxidant Component of Portulaca Oleracea Leaves Obtained by Different Solid-Liquid Extraction Techniques

    Science.gov (United States)

    Conte, Esterina

    2017-01-01

    Portulaca oleracea is a wild plant pest of orchards and gardens, but is also an edible vegetable rich in beneficial nutrients. It possesses many antioxidant properties due to its high content of vitamins, minerals, omega-3 essential fatty acids and other healthful compounds; therefore, the intake of purslane and/or its bioactive compounds could help to improve the health and function of the whole human organism. Accordingly, in this work the extractive capacity for the antioxidant component of purslane leaves was analyzed and compared across solid-liquid extraction techniques such as hot maceration, maceration with ultrasound, rapid solid-liquid dynamic extraction using the Naviglio extractor, and a combination of two techniques (mix extraction). The chromatographic analysis by High Performance Liquid Chromatography (HPLC) of the methanolic extract of dried purslane leaves allowed the identification of various polyphenolic compounds by comparison with standards. In addition, the properties of the different extracts were calculated on a dry-matter basis, and the antioxidant properties of the total polyphenol components were analyzed by the DPPH (2,2-diphenyl-1-picrylhydrazyl) assay. The results showed that mix extraction was the most efficient compared with the other techniques. In fact, it yielded a quantity of polyphenols amounting to 237.8 mg Gallic Acid Equivalents (GAE)/100 g of fresh weight, while with the other techniques the range varied from 60 to 160 mg GAE/100 g fresh weight. In addition, a qualitative analysis by Liquid Chromatography-Tandem Mass Spectrometry (LC/MS/MS) of the phenolic compounds present in the purslane leaves examined was carried out. The compounds were identified by comparison of their molecular weight, fragmentation pattern and retention time with those of standards, using the "Multiple Reaction Monitoring" mode (MRM). Therefore, this study allowed the re-evaluation of a little-known plant that possesses as its beneficial properties, a
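
    For illustration, the DPPH assay result mentioned above reduces to a percent-inhibition calculation from control and sample absorbances; the values in the sketch below are invented.

        def dpph_inhibition(a_control, a_sample):
            """Percent inhibition of DPPH absorbance (~517 nm) by the extract."""
            return 100.0 * (a_control - a_sample) / a_control

        print(dpph_inhibition(a_control=0.92, a_sample=0.31))  # ~66% scavenging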

  19. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...
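
    As a sketch of the funnel-asymmetry meta-regression (FAT-PET) commonly associated with such funnels, the code below estimates the regression effect_i = b0 + b1*SE_i by weighted least squares on simulated data; it illustrates the general technique, not the article's own estimates. A b1 far from zero signals publication bias, while b0 estimates the effect corrected for it.

        import numpy as np

        rng = np.random.default_rng(1)
        se = rng.uniform(0.05, 0.5, size=200)
        effect = 0.10 + 0.8 * se + rng.normal(0, se)  # simulated biased literature

        # WLS with weights 1/SE^2: solve (X'WX) beta = X'Wy
        X = np.column_stack([np.ones_like(se), se])
        w = 1.0 / se**2
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * effect))
        print(beta)  # approximately [0.10, 0.8]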

  20. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the more extended techniques in the field of Takagi-Sugeno fuzzy systems, such as the more relevant results about polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models by Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  1. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  2. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  3. Numerical analysis of standard and modified osteosynthesis in long bone fractures treatment.

    Science.gov (United States)

    Sisljagić, Vladimir; Jovanović, Savo; Mrcela, Tomislav; Radić, Radivoje; Selthofer, Robert; Mrcela, Milanka

    2010-03-01

    The fundamental problem in osteoporotic fracture treatment is the significant decrease in bone mass and bone tissue density, resulting in decreased firmness and elasticity of osteoporotic bone. The application of standard implants and standard surgical techniques in osteoporotic bone fracture treatment makes it almost impossible to achieve osteosynthesis stable enough for early mobility, verticalization and load. Taking into account the form and size of the contact surface, as well as the distribution of forces between the osteosynthetic materials and the bone tissue, numerical analysis showed the advantages of modified osteosynthesis with bone cement filling in the screw bed. The applied numerical model consisted of three sub-models: a 3D model of solid elements, a 3D cross-section of the contact between the plate and the bone, and part of a 3D cross-section of the screw head and body. We reached the conclusion that modified osteosynthesis with bone cement resulted in weaker strain in the part of the plate above the fracture fissure, more even strain on the screws, plate and bone, more even strain distribution along all the screws' bodies, significantly greater strain in the part of the screw head opposite the fracture fissure, a firm connection of the screw head and neck with the plate hole and the whole plate, and more even bone strain around the screw.

  4. Standard Guide for Wet Sieve Analysis of Ceramic Whiteware Clays

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This guide covers the wet sieve analysis of ceramic whiteware clays. This guide is intended for use in testing shipments of clay as well as for plant control tests. 1.2 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not considered standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  5. Methodological aspects and development of techniques for neutron activation analysis of microcomponents in materials of geologic origin

    International Nuclear Information System (INIS)

    Cohen, I.M.

    1982-01-01

    Some aspects of activation analysis methodology applied to geological samples activated in nuclear reactors were studied, and techniques were developed for the determination of various elements in different types of matrices, using gamma spectrometry for the measurement of the products. The consideration of the methodological aspects includes the study of the working conditions, the preparation of samples and standards, irradiation, treatment of the irradiated material, radiochemical separation and measurement. Experiments were carried out on reproducibility and errors in relation to the behaviour of the measurement equipment and the methods of peak area calculation (total area, Covell and Wasson), as well as on the effects of geometry variations on the results of the measurements, the RA-3 reactor's flux variations, and the homogeneity of the samples and standards. Also studied were: the selection of the conditions of determination, including the irradiation and decay times; irradiation with thermal and epithermal neutrons; measurement with the use of absorbers; and the resolution of complex peaks. Both non-destructive and radiochemical separation techniques were developed for the analysis of five types of geological materials. These methods were applied to the following determinations: a) In, Cd, Mn, Ga and Co in blende; b) La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb and Lu in fluorites; c) La, Ca, Eu, Tb, Yb, Se and Th in barites and celestites; d) Cu and Zn in soils. The spectral interferences and those due to nuclear reactions were studied and evaluated by mathematical calculation. (M.E.L.) [es

  6. 1998 Annual Study Report. Standards development of chemical analysis and non destructive inspection methods for pure titanium metals; 1998 nendo seika hokokusho. Jun chitan no shiken hyoka hoho no hyojunka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This study was conducted to standardize chemical analysis and non-destructive inspection methods for pure titanium metals of industrial grade. These methods are among those serving as bases for the international standardization of products. The chemical analysis work is aimed at the quantitative analysis of trace impurities present in pure titanium metals of industrial grade, by developing and standardizing inductively coupled plasma atomic emission spectroscopy, known for its low detection limit, together with spark and glow discharge atomic emission spectrometry as improved routine analysis methods. These methods, although used by steel makers and others, have not been standardized because the effects of the titanium-peculiar matrix have not been elucidated. The non-destructive testing work is aimed at standardizing techniques useful for automatic production lines. More concretely, these include optical methods aided by a laser or CCD camera for plate surface defect inspection, ultrasonic methods for plate internal defect inspection, and pressure differential methods for the air-tightness of welded pipes. They have not yet been used on automatic production lines. (NEDO)

  7. Iterative categorization (IC): a systematic technique for analysing qualitative data

    Science.gov (United States)

    2016-01-01

    Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  8. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  9. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  10. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and contain a variety of provenance information and age characteristics. Analysing and researching ancient ceramics with modern analytical methods provides the scientific foundation for studying Chinese porcelain. According to the properties of nuclear analysis techniques, their functions and applications are discussed. (authors)

  11. Application of energy dispersive x-ray techniques for water analysis

    International Nuclear Information System (INIS)

    Funtua, I. I.

    2000-07-01

    Energy dispersive x-ray fluorescence (EDXRF) is a class of emission spectroscopic techniques that depends upon the emission of characteristic x-rays following excitation of the atomic electron energy levels by tube or isotopic source x-rays. The technique has found a wide range of applications, including the determination of chemical elements in water and water pollutants. Three EDXRF systems, the isotopic source, secondary target and total reflection (TXRF) systems, are available at the Centre for Energy Research and Training. These systems have been applied to the analysis of sediments, suspensions, ground water, river water and rainwater. The isotopic source system is based on 55 Fe, 109 Cd and 241 Am excitations, while the secondary target and total reflection systems utilize a Mo x-ray tube. Sample preparation requirements for water analysis range from physical and chemical pre-concentration steps to direct analysis, and elements from Al to U can be determined with these systems. The EDXRF techniques, TXRF in particular with its multielement capability, low detection limit and possibility of direct analysis of water, have a competitive edge over the traditional methods of atomic absorption and flame photometry

  12. Processing data collected from radiometric experiments by multivariate technique

    International Nuclear Information System (INIS)

    Urbanski, P.; Kowalska, E.; Machaj, B.; Jakowiuk, A.

    2005-01-01

    Multivariate techniques applied to processing data collected from radiometric experiments can provide more efficient extraction of the information contained in the spectra. Several techniques are considered: (i) multivariate calibration using Partial Least Squares Regression and Artificial Neural Networks, (ii) standardization of the spectra, (iii) smoothing of collected spectra, where the autocorrelation function and bootstrap were used for the assessment of the processed data, and (iv) image processing using Principal Component Analysis. The application of these techniques is illustrated with examples of some industrial applications. (author)
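
    A minimal sketch of the multivariate-calibration step using Partial Least Squares regression on whole spectra (scikit-learn's PLSRegression); the spectra below are synthetic stand-ins for measured radiometric data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        concentration = rng.uniform(0, 1, size=60)
        channels = np.linspace(0, 1, 256)
        peak = np.exp(-((channels - 0.4) / 0.05) ** 2)  # one synthetic analyte peak
        spectra = concentration[:, None] * peak + 0.01 * rng.normal(size=(60, 256))

        pls = PLSRegression(n_components=3).fit(spectra, concentration)
        print(pls.score(spectra, concentration))  # R^2 close to 1 on this toy data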

  13. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  14. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial at different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators, and an FPGA-based emulator. Conducting NoC experiments with NoC sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design stages in an automated NoC design flow.

  15. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Academic and business researchers have long debated the most appropriate data analysis techniques for empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and among those working in the area of service quality in particular, as well as to the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  16. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An efficient computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The possibilities of the MENT are demonstrated by applying it to doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT and only 0.1% for the Fourier algorithm.

  17. Motor current and leakage flux signature analysis technique for condition monitoring

    International Nuclear Information System (INIS)

    Pillai, M.V.; Moorthy, R.I.K.; Mahajan, S.C.

    1994-01-01

    Until recently, analysis of vibration signals was the only means available to predict the state of health of plant equipment. Motor current and leakage magnetic flux signature analysis is acquiring importance as a technique for detecting incipient damage in electrical machines and as a supplementary technique for diagnostics of driven equipment such as centrifugal and reciprocating pumps. The state of health of the driven equipment is assessed by analysing the time signal, the frequency spectrum and trends. For example, the pump vane frequency, piston stroke frequency, gear frequency and bearing frequencies appear in the current and flux spectra. By maintaining a periodic record of the amplitudes of the various frequency lines in the spectra, it is possible to follow the trend of deterioration of parts and components of the pump. Problems arising from inappropriate mechanical alignment of vertical pumps are easily identified by a combined analysis of current, flux and vibration signals. It is found that the current signature analysis technique is in itself a sufficient method for assessing the state of health of reciprocating pumps and compressors. (author). 10 refs., 4 figs
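
    A minimal sketch of the signature-analysis idea described above: compute the spectrum of a (synthetic) motor current record and read off the amplitude at an assumed vane-passing frequency, the quantity one would trend over time. All frequencies and amplitudes are invented for illustration.

```python
# Hedged sketch of current-signature analysis: take the spectrum of a motor
# current signal and track the amplitude at a known fault frequency (e.g. a
# pump vane-passing frequency). The signal and frequencies are synthetic.
import numpy as np

fs = 2000.0                          # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
line, vane = 50.0, 147.0             # supply and (assumed) vane frequencies

current = (np.sin(2 * np.pi * line * t)
           + 0.02 * np.sin(2 * np.pi * vane * t)   # incipient fault line
           + 0.005 * np.random.default_rng(1).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(current)) / t.size * 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

# Amplitude of the spectral line nearest the vane frequency; trending this
# value over periodic records indicates deterioration of the vanes.
k = np.argmin(np.abs(freqs - vane))
print(f"amplitude at {freqs[k]:.1f} Hz: {spectrum[k]:.4f}")
```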

  18. Establishing working standards of chromosome aberrations analysis for biological dosimetry

    International Nuclear Information System (INIS)

    Bui Thi Kim Luyen; Tran Que; Pham Ngoc Duy; Nguyen Thi Kim Anh; Ha Thi Ngoc Lien

    2015-01-01

    Biological dosimetry is a dose assessment method that uses specific biological markers of radiation. The IAEA (International Atomic Energy Agency) and ISO (International Organization for Standardization) have established that the dicentric chromosome is specific to radiation and is the gold standard for biodosimetry. Drawing on the documents published by the IAEA, WHO, ISO and OECD, our results from studies of radiation-induced chromosome aberrations were organized systematically into nine standards dealing with the chromosome aberration test and the micronucleus test in human peripheral blood lymphocytes in vitro. These standards address: the reference dose-effect curve for dose estimation, the minimum detection levels, cell culture, slide preparation, the scoring procedure for chromosome aberrations used in biodosimetry, the criteria for converting aberration frequency into absorbed dose, and the reporting of results. Following these standards, the automatic analysis devices were calibrated to improve the biological dosimetry method. The standards will be used to acquire and maintain accreditation of the Biological Dosimetry laboratory at the Nuclear Research Institute. (author)
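
    The conversion from aberration frequency to absorbed dose that such standards codify is commonly done with a linear-quadratic calibration curve. The sketch below solves such a curve for dose; the coefficients, aberration counts and cell number are illustrative placeholders, not the laboratory's calibration.

```python
# Hedged sketch of converting a dicentric aberration frequency into an
# absorbed dose using a linear-quadratic dose-effect curve,
# Y = c + alpha*D + beta*D**2. The coefficients below are illustrative
# placeholders; a laboratory would use its own calibration curve.
import math

c, alpha, beta = 0.001, 0.03, 0.06   # assumed calibration coefficients

def dose_from_yield(y):
    """Solve beta*D^2 + alpha*D + (c - y) = 0 for the positive root."""
    disc = alpha ** 2 - 4 * beta * (c - y)
    return (-alpha + math.sqrt(disc)) / (2 * beta)

dicentrics, cells = 25, 500          # scored aberrations and scored cells
y = dicentrics / cells
print(f"estimated absorbed dose: {dose_from_yield(y):.2f} Gy")
```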

  19. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
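
    The age relation that this and the following record rely on can be stated compactly (a standard form of the luminescence age equation; the notation here is ours, not the authors'):

```latex
% Luminescence age: equivalent (accumulated) dose divided by the annual
% dose rate, the denominator being what the nuclear analyses of minor and
% trace elements (U, Th, K) described above determine.
\mathrm{Age} = \frac{D_e}{\dot{D}_{\mathrm{annual}}}
```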

  20. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  1. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps are used to find patterns and trends in an ADCP dataset. Cluster validity algorithms such as visual assessment of cluster tendency and the clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in the analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
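
    A hedged sketch of this unsupervised workflow on synthetic current profiles: PCA for dimension reduction followed by k-means (used here as a simple stand-in for fuzzy c-means, which needs a third-party package). The profile shapes and the scikit-learn dependency are assumptions for illustration.

```python
# Hedged sketch: reduce ADCP-like current profiles with PCA, then cluster
# them. The profiles are synthetic placeholders, not oceanographic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
depth_bins = 30

# Two synthetic flow regimes: a sheared profile vs. a uniform profile
shear = np.linspace(0.6, 0.0, depth_bins)
uniform = np.full(depth_bins, 0.3)
profiles = np.vstack([
    shear + rng.normal(0, 0.05, (100, depth_bins)),
    uniform + rng.normal(0, 0.05, (100, depth_bins)),
])

scores = PCA(n_components=3).fit_transform(profiles)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))
```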

  2. Integrated Data Collection Analysis (IDCA) Program — RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-04

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, analyzed for a third and fourth time in the proficiency test and averaged with the analysis results from the first and second rounds. The results from averaging all four data sets (1, 2, 3 and 4) suggest a material with slightly more impact sensitivity, more BAM friction sensitivity, less ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity, compared with the results from Set 1, which was previously used as the reference values for the RDX standard in IDCA analysis reports.

  3. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. The technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA, cDNA), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared with other commonly used algorithms, the input quantity method proved to be valid. The method is of particular value for clinical studies of whole blood and circulating leukocytes, where cell counts are readily available.
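
    A minimal sketch of what an input-quantity-based, efficiency-corrected normalization can look like, assuming Cq values, an amplification efficiency and cell counts as inputs; the function name and all numbers are hypothetical, and this is not the authors' exact algorithm.

```python
# Hedged sketch of an input-quantity-based qPCR normalization, loosely
# following the method described above (efficiency-corrected, scaled by a
# universal reference sample and by input cell count).
def relative_expression(cq_sample, cq_reference, efficiency, cells):
    """Efficiency-corrected expression per input cell.

    cq_sample:    Cq of the target gene in the sample
    cq_reference: Cq of the same gene in the universal reference cDNA
    efficiency:   amplification efficiency (2.0 = perfect doubling)
    cells:        number of cells in the input sample
    """
    fold_vs_reference = efficiency ** (cq_reference - cq_sample)
    return fold_vs_reference / cells

expr_a = relative_expression(24.1, 26.0, 1.95, 10_000)
expr_b = relative_expression(23.2, 26.0, 1.95, 12_000)
print(f"sample B / sample A: {expr_b / expr_a:.2f}")
```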

  4. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV) infection. Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage, fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and third stages of the methodology, respectively. Because several types of tissue regions exist in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images were employed, obtaining a 1.31% mean absolute CPA error with a 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid the manual threshold-based and region selection processes widely used in similar approaches in the literature, and (ii) minimize CPA calculation time.
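
    As a hedged illustration of the first stage (background-tissue separation by clustering), the sketch below runs a tiny k-means on the pixel intensities of a synthetic "biopsy" image; the image, cluster count and darker-cluster convention are assumptions, not the published pipeline.

```python
# Hedged sketch of background-tissue separation by clustering pixel
# intensities with k-means. Generic illustration, not the authors' code.
import numpy as np

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Tiny k-means on scalar pixel values; returns labels and centers."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# Synthetic grayscale biopsy image: bright background, darker tissue blob
rng = np.random.default_rng(1)
img = rng.normal(0.9, 0.03, (64, 64))
img[16:48, 16:48] = rng.normal(0.4, 0.05, (32, 32))

labels, centers = kmeans_1d(img.ravel(), k=2)
tissue = labels.reshape(img.shape) == np.argmin(centers)  # darker cluster
print(f"tissue fraction: {tissue.mean():.2%}")
```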

  5. Recommendations for a proposed standard for performing systems analysis

    International Nuclear Information System (INIS)

    LaChance, J.; Whitehead, D.; Drouin, M.

    1998-01-01

    In August 1995, the Nuclear Regulatory Commission (NRC) issued a policy statement proposing improved regulatory decisionmaking by increasing the use of PRA [probabilistic risk assessment] in all regulatory matters to the extent supported by the state of the art in PRA methods and data. A key aspect of using PRA in risk-informed regulatory activities is establishing the appropriate scope and attributes of the PRA. In this regard, ASME decided to develop a consensus PRA Standard. The objective is to develop a PRA Standard such that the technical quality of nuclear plant PRAs will be sufficient to support risk-informed regulatory applications. This paper presents example recommendations for the systems analysis element of a PRA for incorporation into the ASME PRA Standard.

  6. Tooth contact analysis of spur gears. Part 1-SAM analysis of standard gears

    Directory of Open Access Journals (Sweden)

    Creţu Spiridon

    2017-01-01

    Involute gears are sensitive to misalignment of their axes, which causes transmission errors and perturbs the pressure distribution along the tooth flank. The concentrated contacts in misaligned gears are no longer of Hertzian type. A semi-analytical method was developed to find the contact area, the pressure distribution and the subsurface stress state. The matrix of initial separations is found analytically for standard and non-standard spur gears. The presence of misalignment, as well as flank crowning and flank end relief, is included in the numerical analysis.

  7. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Ni Bangfa; Tian Weizhi; Nie Nuiling; Wang Pingsheng

    1999-01-01

    Airborne particulate samples were collected at two sites, in an industrial and a residential area of Tianjing city, during February and June, using a PM-10 sampler, and were analyzed by NAA techniques. Air pollution in urban and rural areas of Tianjing city was then compared using neutron activation analysis and other data analysis techniques. (author)

  8. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    An equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time domain analysis and the crosstalk effect in VLSI interconnects have emerged as essential design criteria. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used to estimate crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
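
    A minimal sketch of the FDTD idea for a transmission line: leapfrog updates of the telegrapher's equations on a staggered grid, here for a single lossy RLC line (the bundled, coupled-line case of the paper adds mutual inductance and capacitance terms). All line constants are illustrative, and the far end is simply short-circuited.

```python
# Hedged FDTD sketch for one lossy RLC transmission line, not the paper's
# coupled-line model. Parameter values are illustrative.
import numpy as np

nz, nt = 200, 2000
R, L, C = 10.0, 2.0e-6, 1.0e-10      # per-unit-length R (ohm/m), L (H/m), C (F/m)
dz = 1e-3                            # spatial step, m (line length 0.2 m)
dt = 0.9 * dz * np.sqrt(L * C)       # time step within the Courant limit

V = np.zeros(nz + 1)                 # node voltages
I = np.zeros(nz)                     # branch currents on the staggered grid

for _ in range(nt):
    V[0] = 1.0                       # unit-step source at the near end
    # Telegrapher's equation: dI/dt = -(1/L) * (dV/dz + R*I)
    I += -dt / L * ((V[1:] - V[:-1]) / dz + R * I)
    # dV/dt = -(1/C) * dI/dz on interior nodes; far end held at 0 V (short)
    V[1:-1] += -dt / C * (I[1:] - I[:-1]) / dz

print(f"mid-line voltage after {nt} steps: {V[nz // 2]:.3f} V")
```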

  9. Colour and shape analysis techniques for weed detection in cereal fields

    DEFF Research Database (Denmark)

    Pérez, A.J; López, F; Benlloch, J.V.

    2000-01-01

    Information on weed distribution within the field is necessary to implement spatially variable herbicide application. This paper deals with the development of near-ground image capture and processing techniques for detecting broad-leaved weeds in cereal crops under actual field conditions. The proposed methods use colour information to discriminate between vegetation and background, whilst shape analysis techniques are applied to distinguish between crop and weeds. The determination of crop row position helps to reduce the number of objects to which the shape analysis techniques are applied. The performance of the algorithms was assessed by comparing the results with a human classification, providing an acceptable success rate. The study has shown that, despite the difficulties in accurately determining the number of seedlings (as in visual surveys), it is feasible to use image processing techniques for weed detection.
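
    As a hedged sketch of the colour step, the fragment below separates vegetation from soil with an excess-green index on chromaticity-normalized channels, a common choice in this literature; the threshold and test pixels are illustrative, and this is not the authors' algorithm.

```python
# Hedged sketch: excess-green index (ExG = 2g - r - b on chromaticity-
# normalized channels) to separate vegetation from soil background.
import numpy as np

def vegetation_mask(rgb, threshold=0.1):
    """rgb: float array (H, W, 3) in [0, 1]; returns a boolean mask."""
    total = rgb.sum(axis=2) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2 * g - r - b
    return exg > threshold

# Synthetic patch: a greenish vegetation pixel vs. a brownish soil pixel
patch = np.array([[[0.2, 0.5, 0.1], [0.45, 0.35, 0.25]]])
print(vegetation_mask(patch))   # expected: [[ True False]]
```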

  10. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Arsenic is a non-metallic constituent present naturally in groundwater owing to certain minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may originate from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at low concentrations can be determined in water using sophisticated instruments such as an atomic absorption spectrometer with hydride generation. Because arsenic concentrations down to 1 ppb cannot easily be determined with the simple spectrophotometric technique, the spectrophotometric method using silver diethyldithiocarbamate was modified to achieve better results, down to arsenic concentrations of 1 ppb.

  11. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Hart, Reid; Athalye, Rahul A.; Rosenberg, Michael I.; Richman, Eric E.; Winiarski, David W.

    2014-03-01

    Section 304(b) of the Energy Conservation and Production Act (ECPA), as amended, requires the Secretary of Energy to make a determination each time a revised version of ASHRAE Standard 90.1 is published with respect to whether the revised standard would improve energy efficiency in commercial buildings. When the U.S. Department of Energy (DOE) issues an affirmative determination on Standard 90.1, states are statutorily required to certify within two years that they have reviewed and updated the commercial provisions of their building energy code, with respect to energy efficiency, to meet or exceed the revised standard. This report provides a preliminary qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition).

  12. International cooperative analysis of standard substance, IAEA-0390

    International Nuclear Information System (INIS)

    Kawamoto, Keizo; Takada, Jitsuya; Moriyama, Hirotake; Akaboshi, Mitsuhiko

    1999-01-01

    Three kinds of algae (IAEA-0391, IAEA-0392 and IAEA-0393) were defined as biological standard substances for monitoring environmental pollution by the Analytical Quality Control Service of the IAEA (IAEA-AQCS). In this study, these standard substances were analyzed using ICP-MS and the results compared with those of simultaneously conducted instrumental neutron activation analysis (INAA). The cultures of the three algae were prepared cooperatively by IAEA-AQCS and the Microbial Institute of Czechoslovakia. After drying and sterilization by Co-60 exposure, the samples were sent to KURRI. When the results obtained at KURRI were compared with the values recommended through statistical treatment of the data obtained by the IAEA, the values for five elements (Fe, Cr, Mg, Mn and Na) agreed well for IAEA-0391, IAEA-0392 and IAEA-0393, and the values for As, Ca, Cd, Co, Cu, K and Zn were nearly coincident. For Hg and La, the data from INAA and ICP-MS differed considerably from the IAEA recommended values for all samples. (M.N.)

  13. Selected Bibliography of the Nephrourology standard techniques

    International Nuclear Information System (INIS)

    1999-01-01

    In the framework of the first meeting of project coordinators of ARCAL XXXVI, a selected bibliography on the standardization of nuclear nephrourology techniques is presented. The selection covers: radiopharmaceuticals used, quality control, dosimetry, obstruction, clearance and renal function, paediatric aspects, pyelonephritis, renovascular hypertension and renal transplantation [es]

  14. Nuclear techniques for on-line analysis in the mineral and energy industries

    International Nuclear Information System (INIS)

    Sowerby, B.D.; Watt, J.S.

    1994-01-01

    Nuclear techniques are the basis of many on-line analysis systems which are now widely used in the mineral and energy industries. Some of the systems developed by the CSIRO depend entirely on nuclear techniques; others use a combination of nuclear techniques and microwave, capacitance, or ultrasonic techniques. The continuous analysis and rapid response of these CSIRO systems has led to improved control of mining, processing and blending operations, with increased productivity valued at A$50 million per year to Australia, and $90 million per year worldwide. This paper reviews developments in nuclear on-line analysis systems by the On-Line Analysis Group in CSIRO at Lucas Heights. Commercialised systems based on this work analyse mineral and coal slurries and determine the ash and moisture contents of coal and coke on conveyors. This paper also reviews two on-line nuclear analysis systems recently developed and licensed to industry, firstly for the determination of the mass flow rates of oil/water/gas mixtures in pipelines, and secondly for the determination of the moisture, specific energy, ash and fouling index of low-rank coals. 8 refs., 3 tabs., 4 figs

  15. Quality-assurance techniques used with automated analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Killian, E.W.; Koeppen, L.D.; Femec, D.A.

    1994-01-01

    In the course of developing gamma-ray spectrum analysis algorithms for use by the Radiation Measurements Laboratory at the Idaho National Engineering Laboratory (INEL), several techniques have been developed that enhance and verify the quality of the analytical results. The use of these quality-assurance techniques is critical when gamma-ray analysis results from low-level environmental samples are used in risk assessment or site restoration and cleanup decisions. This paper describes four of the quality-assurance techniques that are in routine use at the laboratory. They are used for all types of samples, from reactor effluents to environmental samples. The techniques include: (1) the use of precision pulsers (with subsequent removal) to validate the correct operation of the spectrometer electronics for each and every spectrum acquired, (2) the use of naturally occurring and cosmically induced radionuclides in samples to help verify that the data acquisition and analysis were performed properly, (3) the use of an ambient background correction technique that involves superimposing ("mapping") sample photopeak fitting parameters onto multiple background spectra for accurate and more consistent quantification of the background activities, (4) the use of interactive, computer-driven graphics to review the automated locating and fitting of photopeaks and to allow for manual fitting of photopeaks

  16. SRAC: JAERI thermal reactor standard code system for reactor design and analysis

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Takano, Hideki; Horikami, Kunihiko; Ishiguro, Yukio; Kaneko, Kunio; Hara, Toshiharu.

    1983-01-01

    The SRAC (Standard Reactor Analysis Code) is a code system for nuclear reactor analysis and design. It is composed of neutron cross section libraries and auxiliary processing codes, neutron spectrum routines, a variety of transport and 1-, 2- and 3-D diffusion routines, and dynamic parameter and cell burn-up routines. By making the best use of the individual code functions in the SRAC system, the user can select either an exact method for an accurate estimate of reactor characteristics or an economical method aimed at shorter computer time, depending on the purpose of the study. The user can select cell or core calculation; fixed source or eigenvalue problem; transport (collision probability or Sn) theory or diffusion theory. Moreover, smearing and collapsing of macroscopic cross sections are done separately at the user's selection, and special attention is paid to double heterogeneity. Various techniques are employed to access the data storage and to optimize internal data transfer. Benchmark calculations using the SRAC system have been made extensively for the Keff values of various types of critical assemblies (light water, heavy water and graphite moderated systems, and fast reactor systems). The calculated results show good prediction of the experimental Keff values. (author)

  17. Piezoelectric Versus Conventional Rotary Techniques for Impacted Third Molar Extraction: A Meta-analysis of Randomized Controlled Trials.

    Science.gov (United States)

    Jiang, Qian; Qiu, Yating; Yang, Chi; Yang, Jingyun; Chen, Minjie; Zhang, Zhiyuan

    2015-10-01

    Impacted third molars are frequently encountered in clinical work. Surgical removal of impacted third molars is often required to prevent clinical symptoms. Traditional rotary cutting instruments are potentially injurious, and piezosurgery, as a new osteotomy technique, has been introduced in oral and maxillofacial surgery. No consistent conclusion has been reached regarding whether this new technique is associated with fewer or less severe postoperative sequelae after third molar extraction. The aim of this study was to compare piezosurgery with rotary osteotomy techniques with regard to surgery time and the severity of postoperative sequelae, including pain, swelling, and trismus. We conducted a systematic literature search in the Cochrane Library, PubMed, Embase, and Google Scholar. The eligibility criteria of this study included the following: the patients were clearly diagnosed as having impacted mandibular third molars; the patients underwent piezosurgery osteotomy, and in the control group rotary osteotomy techniques, for removal of impacted third molars; the outcomes of interest included surgery time, trismus, swelling or pain; and the studies were randomized controlled trials. We used random-effects models to calculate the difference in the outcomes, with the corresponding 95% confidence interval. We calculated the weighted mean difference if the trials used the same measurement, and a standardized mean difference otherwise. A total of seven studies met the eligibility criteria and were included in our analysis. Compared with rotary osteotomy, patients undergoing piezosurgery experienced longer surgery time (mean difference 4.13 minutes, 95% confidence interval 2.75-5.52, P … piezosurgery groups. The number of included randomized controlled trials and the sample size of each trial were relatively small, double blinding was not possible, and cost analysis was unavailable due to a lack of data. Our meta-analysis indicates that although patients undergoing piezosurgery…
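
    For readers unfamiliar with the pooling step, the sketch below implements a DerSimonian-Laird random-effects combination of mean differences, the type of model the abstract describes; the per-study mean differences and standard errors are made-up placeholders, not the seven included trials.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of mean
# differences. All per-study numbers are invented for illustration.
import math

mean_diffs = [3.5, 4.8, 4.0, 4.4]       # per-study mean differences (min)
std_errs = [0.9, 1.1, 0.7, 1.2]         # per-study standard errors

w = [1 / se ** 2 for se in std_errs]    # fixed-effect (inverse-variance) weights
fixed = sum(wi * d for wi, d in zip(w, mean_diffs)) / sum(w)

# Between-study heterogeneity (DerSimonian-Laird tau^2)
q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, mean_diffs))
df = len(mean_diffs) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights fold tau^2 into each study's variance
w_re = [1 / (se ** 2 + tau2) for se in std_errs]
pooled = sum(wi * d for wi, d in zip(w_re, mean_diffs)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
print(f"pooled MD: {pooled:.2f} min, 95% CI: "
      f"{pooled - 1.96 * se_pooled:.2f} to {pooled + 1.96 * se_pooled:.2f}")
```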

  18. Analysis and interpretation of dynamic FDG PET oncological studies using data reduction techniques

    Directory of Open Access Journals (Sweden)

    Santos Andres

    2007-10-01

    Background: Dynamic positron emission tomography studies produce a large amount of image data, from which clinically useful parametric information can be extracted using tracer kinetic methods. Data reduction methods can facilitate the initial interpretation and visual analysis of these large image sequences and at the same time preserve important information and allow for basic feature characterization. Methods: We have applied principal component analysis to provide high-contrast parametric image sets of lower dimension than the original data set, separating structures based on their kinetic characteristics. Our method has the potential to constitute an alternative quantification method, independent of any kinetic model, and is particularly useful when retrieval of the arterial input function is complicated. In independent component analysis images, structures that have different kinetic characteristics are assigned opposite values and are readily discriminated. Furthermore, novel similarity mapping techniques are proposed, which can summarize in a single image the temporal properties of the entire image sequence with respect to a reference region. Results: Using our new cubed sum coefficient similarity measure, we have shown that structures with similar time-activity curves can be identified, thus facilitating the detection of lesions that are not easily discriminated using the conventional method employing standardized uptake values.
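
    A hedged sketch of similarity mapping on a synthetic dynamic sequence: every voxel's time-activity curve is scored against the mean curve of a reference region. Plain correlation is used as the similarity measure here, since the paper's cubed sum coefficient is its own contribution; the data are invented.

```python
# Hedged similarity-mapping sketch: score each voxel's time-activity curve
# (TAC) against a reference-region TAC. Synthetic data, generic measure.
import numpy as np

rng = np.random.default_rng(3)
t_frames, h, w = 20, 32, 32
frames = rng.normal(0, 0.05, (t_frames, h, w))

washout = np.exp(-np.linspace(0, 2, t_frames))       # reference kinetics
frames[:, 8:16, 8:16] += washout[:, None, None]      # region following it

ref_curve = frames[:, 10:14, 10:14].mean(axis=(1, 2))  # reference-region TAC

tacs = frames.reshape(t_frames, -1)
tacs_c = tacs - tacs.mean(axis=0)
ref_c = ref_curve - ref_curve.mean()
corr = (tacs_c * ref_c[:, None]).sum(axis=0) / (
    np.linalg.norm(tacs_c, axis=0) * np.linalg.norm(ref_c) + 1e-12)
similarity_map = corr.reshape(h, w)
print(f"mean similarity inside region: {similarity_map[8:16, 8:16].mean():.2f}")
```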

  19. Analysis of rocks involving the x-ray diffraction, infrared and thermal gravimetric techniques

    International Nuclear Information System (INIS)

    Ikram, M.; Rauf, M.A.; Munir, N.

    1998-01-01

    Chemical analyses of rocks and minerals are usually obtained by a number of analytical techniques. The purpose of the present work is to investigate the chemical composition of the rock samples and to determine how closely the results obtained by different instrumental methods agree. Chemical tests were performed before using the instrumental techniques in order to determine the nature of these rocks. The chemical analysis indicated mainly the presence of carbonate and hence the carbonate nature of these rocks. The X-ray diffraction, infrared spectroscopy and thermal gravimetric analysis techniques were used for the determination of the chemical composition of the samples. The results obtained using these techniques show a great deal of similarity. (author)

  20. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, owing to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well-characterized repository and site
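
    A minimal sketch of the Monte Carlo step described above, assuming a toy three-parameter dose model with lognormal inputs spanning orders of magnitude; it prints the mean, median and 90th percentile of the output, the characteristics the report compares. Model and values are illustrative only.

```python
# Hedged Monte Carlo uncertainty-propagation sketch: wide lognormal inputs
# pushed through a toy dose model; compare output mean, median, percentile.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Inputs spanning orders of magnitude: release rate, dilution, dose factor
release = rng.lognormal(mean=np.log(1e-3), sigma=1.5, size=n)
dilution = rng.lognormal(mean=np.log(1e4), sigma=1.0, size=n)
dose_factor = rng.lognormal(mean=np.log(5e-2), sigma=0.5, size=n)

annual_dose = release / dilution * dose_factor

print(f"mean:            {annual_dose.mean():.3e}")
print(f"median:          {np.median(annual_dose):.3e}")
print(f"90th percentile: {np.percentile(annual_dose, 90):.3e}")
```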

  1. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  2. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  3. ANSI/ASHRAE/IES Standard 90.1-2013 Determination of Energy Savings: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Athalye, Rahul A.; Rosenberg, Michael I.; Xie, YuLong; Wang, Weimin; Hart, Philip R.; Zhang, Jian; Goel, Supriya; Mendon, Vrushali V.

    2014-09-04

    This report provides a final quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in improved energy efficiency in commercial buildings. The final analysis considered each of the 110 addenda to Standard 90.1-2010 that were included in Standard 90.1-2013. PNNL reviewed all addenda included by ASHRAE in creating Standard 90.1-2013 from Standard 90.1-2010, and considered their combined impact on a suite of prototype building models across all U.S. climate zones. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE's final determination. However, of the 110 total addenda, 30 were identified as having a measurable and quantifiable impact.

  4. Composition analysis of Ta-W alloy using NAA and EDXRF techniques

    International Nuclear Information System (INIS)

    Swain, K.K.; Remya Devi, P.S.; Chavan, Trupti A.; Verma, R.; Reddy, A.V.R.

    2015-01-01

    Tantalum-tungsten (Ta-W) alloy is a high-strength alloy used in corrosion-resistant chemical process equipment, including heat exchangers, condensers, heating and cooling coils, and reaction vessels. Ta-W alloy is also used as an ion extraction plate during laser isotope separation of uranium, and hence its composition is critical for optimal application. The composition of the alloy was determined by neutron activation analysis (NAA) and energy dispersive X-ray fluorescence spectrometry (EDXRF). A Ta-W alloy sample was received from the Nuclear Fuel Complex (NFC), Hyderabad. For NAA, samples (50 - 500 mg) were sealed in polyethylene. High-purity Ta foil (30 - 40 mg) and W foil (10 - 20 mg) were packed and used as comparators. Samples and standards were irradiated in the graphite reflector position of the Advanced Heavy Water Reactor Critical Facility (AHWR CF) reactor, BARC, Mumbai, for 4 hours. After a suitable decay period, radioactivity assay was carried out using a 45% relative efficiency high-purity germanium (HPGe) detector coupled to an MCA with 8k conversion gain.
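
    A hedged sketch of the relative (comparator) method underlying such an NAA measurement: the element mass in the sample follows from the ratio of its gamma-peak counts to those of the pure-element comparator foil irradiated alongside it. The helper function and all counts and masses are invented for illustration, and real work would add decay, geometry and efficiency corrections.

```python
# Hedged sketch of comparator-method NAA. Counts and masses are invented.
def mass_by_comparator(counts_sample, counts_std, mass_std, decay_ratio=1.0):
    """Element mass in the sample by the relative method.

    counts_sample: net gamma-peak counts from the sample
    counts_std:    net gamma-peak counts from the comparator standard
    mass_std:      element mass in the comparator (mg)
    decay_ratio:   correction for different decay times (1.0 if counted
                   under identical timing conditions)
    """
    return counts_sample / counts_std * mass_std * decay_ratio

ta_mass = mass_by_comparator(152_000, 48_000, 35.0)   # mg of Ta in sample
w_mass = mass_by_comparator(21_000, 60_000, 15.0)     # mg of W in sample
total = ta_mass + w_mass
print(f"Ta: {100 * ta_mass / total:.1f} wt%, W: {100 * w_mass / total:.1f} wt%")
```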

  5. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies that meet reactor operating requirements, e.g., thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor at both rated and uprated power conditions.

  6. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting recent advances in the field of Big Data analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data analysis and recent techniques and environments for Big Data analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting parallel, grid, and cloud computing environments.

  7. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques have proven useful for software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  8. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Due to the complexity of the motion, the shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. The description is complete in the sense that, in execution, the transition from phase to phase is not noticeable. In the aforementioned and described phases of the O'Brian spinal shot put technique, a large distance, emptiness and disconnection appear between the initial position phase and the phase of overtaking the device, which, in training methods and technique instruction in primary and secondary education, as well as for students and athletes beginning in the shot put, represents a major problem for connecting the phases, for training and for technique advancement. Therefore, this work aims to facilitate the methods of training the shot put technique by extending the analysis from four to six phases, which have been described and cover the complete O'Brian technique.

  9. 3-D portal image analysis in clinical practice: an evaluation of 2-D and 3-D analysis techniques as applied to 30 prostate cancer patients

    International Nuclear Information System (INIS)

    Remeijer, Peter; Geerlof, Erik; Ploeger, Lennert; Gilhuijs, Kenneth; Herk, Marcel van; Lebesque, Joos V.

    2000-01-01

    Purpose: To investigate the clinical importance and feasibility of a 3-D portal image analysis method in comparison with a standard 2-D portal image analysis method for pelvic irradiation techniques. Methods and Materials: In this study, images of 30 patients who were treated for prostate cancer were used. A total of 837 imaged fields were analyzed by a single technologist, using automatic 2-D and 3-D techniques independently. Standard deviations (SDs) of the random, systematic, and overall variations, and the overall mean were calculated for the resulting data sets (2-D and 3-D), in the three principal directions (left-right [L-R], cranial-caudal [C-C], anterior-posterior [A-P]). The 3-D analysis included rotations as well. For the translational differences between the three data sets, the overall SD and overall mean were computed. The influence of out-of-plane rotations on the 2-D registration accuracy was determined by analyzing the difference between the 2-D and 3-D translation data as function of rotations. To assess the reliability of the 2-D and 3-D methods, the number of times the automatic match was manually adjusted was counted. Finally, an estimate of the workload was made. Results: The SDs of the random and systematic components of the rotations around the three orthogonal axes were 1.1 (L-R), 0.6 (C-C), 0.5 (A-P) and 0.9 (L-R), 0.6 (C-C), 0.8 (A-P) degrees, respectively. The overall mean rotation around the L-R axis was 0.7 deg., which deviated significantly from zero. Translational setup errors were comparable for 2-D and 3-D analysis (ranging from 1.4 to 2.2 mm SD and from 1.5 to 2.5 mm SD, respectively). The variation of the difference between the 2-D and 3-D translation data increased from 1.1 mm (SD) for zero rotations to 2.7 mm (SD) for out-of-plane rotations of 3 deg., due to a reduced 2-D registration accuracy for large rotations. The number of times the analysis was not considered acceptable and was manually adjusted was 44% for the 2-D
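
    For context, the random and systematic SDs quoted in such studies are usually obtained by decomposing per-patient setup deviations as sketched below; the synthetic numbers and the 30 x 25 layout are assumptions for illustration, not the study's data.

```python
# Hedged sketch of the usual decomposition of setup errors in portal-image
# series: the systematic SD is the spread of per-patient mean deviations,
# the random SD the pooled spread around each patient's own mean.
import numpy as np

rng = np.random.default_rng(5)
n_patients, n_fractions = 30, 25

patient_means = rng.normal(0.0, 1.8, n_patients)          # mm, systematic part
setups = patient_means[:, None] + rng.normal(0.0, 2.0, (n_patients, n_fractions))

overall_mean = setups.mean()
systematic_sd = setups.mean(axis=1).std(ddof=1)
random_sd = np.sqrt(np.mean(setups.std(axis=1, ddof=1) ** 2))

print(f"overall mean: {overall_mean:.2f} mm, "
      f"systematic SD: {systematic_sd:.2f} mm, random SD: {random_sd:.2f} mm")
```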

  10. A Strategy Modelling Technique for Financial Services

    OpenAIRE

    Heinrich, Bernd; Winter, Robert

    2004-01-01

    Strategy planning processes often suffer from a lack of conceptual models that can be used to represent business strategies in a structured and standardized form. If natural language is replaced by an at least semi-formal model, the completeness, consistency, and clarity of strategy descriptions can be drastically improved. A strategy modelling technique is proposed that is based on an analysis of modelling requirements, a discussion of related work and a critical analysis of generic approach...

  11. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses; determining the requirements for developing a standard HRA method; and developing the process and rules for quantifying human error probabilities. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria, to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  12. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses; determining the requirements for developing a standard HRA method; and developing the process and rules for quantifying human error probabilities. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria, to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  13. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H S; Kim, J H; Lee, J C; Choi, Y R; Moon, S S

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and a few suitable for our robot system were selected: fault tree analysis, failure mode and effect analysis, reliability block diagrams, Markov models, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. A survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  14. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and a few suitable for our robot system were selected: fault tree analysis, failure mode and effect analysis, reliability block diagrams, Markov models, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. A survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  15. Proposal for inclusion of the risk based inspection technique in Regulatory Standard NR 13; Proposta de inclusao da tecnica de inspecao baseada em risco na Norma Regulamentadora NR 13

    Energy Technology Data Exchange (ETDEWEB)

    Esteves, Vinicius Teixeira; Lima, Marco Aurelio Oliveira [Det Norske Veritas Ltda. (DNV), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    In Brazil, Regulatory Standard No. 13 (NR 13) establishes requirements for the inspection of boilers and pressure vessels, with the main objective of preventing accidents with these types of equipment. The Risk-Based Inspection (RBI) technique, in turn, is an effective way to manage the mechanical integrity of various types of static mechanical equipment through inspection planning based on risk. In this study, it is proposed to include the RBI technique in NR 13 for the planning and definition of intervals for the safety inspection of boilers and pressure vessels, in order to increase operational safety in process industries in Brazil. A critical analysis of NR 13 and RBI was carried out, together with a bibliographic survey of international documents that relate the operational safety of pressurized equipment to inspection activity, and of the acceptability of RBI by governments, agencies and organizations around the world. It is considered that the inclusion and formal acceptance of the RBI technique in NR 13 must be accompanied by rigorous control to avoid the 'trivialization' of its use and to ensure rational, efficient and reliable implementation. Finally, basic elements and minimum requirements were developed and suggested for insertion into NR 13, to be met on a mandatory basis by companies that choose to implement and use the RBI technique as a tool for planning the safety inspection of boilers and pressure vessels. It is concluded that the formal acceptance of the RBI technique in NR 13 could add considerable value to this standard with regard to the prevention of accidents involving boilers and pressure vessels, and provide a technological leap for the companies that make use of the RBI technique in Brazil. (author)
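
    As a rough sketch of what an RBI plan computes, the fragment below maps probability-of-failure and consequence-of-failure categories through a risk matrix to an inspection interval; the categories, matrix and intervals are invented for illustration and are not NR 13 or API values.

```python
# Hedged RBI sketch: (probability, consequence) -> risk class -> interval.
RISK_MATRIX = {
    (1, 1): "low",    (1, 2): "low",    (1, 3): "medium",
    (2, 1): "low",    (2, 2): "medium", (2, 3): "high",
    (3, 1): "medium", (3, 2): "high",   (3, 3): "high",
}
INSPECTION_INTERVAL_YEARS = {"low": 10, "medium": 5, "high": 2}

def plan_inspection(tag, pof, cof):
    """Return an inspection plan line for one equipment item."""
    risk = RISK_MATRIX[(pof, cof)]
    years = INSPECTION_INTERVAL_YEARS[risk]
    return f"{tag}: risk {risk}, inspect every {years} years"

print(plan_inspection("V-101 pressure vessel", pof=2, cof=3))
print(plan_inspection("B-201 boiler", pof=1, cof=2))
```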

  16. Radiation Safety Analysis In The NFEC For Assessing Possible Implementation Of The ICRP-60 Standard

    International Nuclear Information System (INIS)

    Yowono, I.

    1998-01-01

    A radiation safety analysis of the three facilities in the Nuclear Fuel Element Center (NFEC), for assessing possible implementation of the ICRP-60 standard, has been carried out. The analysis covered the radiation doses received by workers, dose rates in the working areas, surface contamination levels, air contamination levels and the level of radioactive gas release to the environment. The analysis was based on the BATAN regulation and the ICRP-60 standard. The results showed that the highest radiation dose received was only around 15% of the limit set in the ICRP-60 standard and only 6% of the limit set in the BATAN regulation. Thus ICRP-60 could be implemented as the radiation safety standard without changing the laboratory design.

  17. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  18. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N. [and others]

    1997-08-01

    Three sets of standard {gamma}-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  19. Standard test method for isotopic analysis of uranium hexafluoride by double standard single-collector gas mass spectrometer method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This is a quantitative test method applicable to determining the mass percent of uranium isotopes in uranium hexafluoride (UF6) samples with 235U concentrations between 0.1 and 5.0 mass %. 1.2 This test method may be applicable for the entire range of 235U concentrations for which adequate standards are available. 1.3 This test method is for analysis by a gas magnetic sector mass spectrometer with a single collector using interpolation to determine the isotopic concentration of an unknown sample between two characterized UF6 standards. 1.4 This test method is to replace the existing test method currently published in Test Methods C761 and is used in the nuclear fuel cycle for UF6 isotopic analyses. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appro...
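
    A minimal sketch of the double-standard interpolation this test method is built on: the assay of the unknown is interpolated between two bracketing standards measured under the same conditions. The function and all ratios and assays are illustrative assumptions, not values from the standard.

```python
# Hedged sketch of double-standard interpolation for UF6 isotopic assay.
def interpolate_assay(r_unknown, r_low, r_high, assay_low, assay_high):
    """Linear interpolation of assay between two bracketing standards."""
    frac = (r_unknown - r_low) / (r_high - r_low)
    return assay_low + frac * (assay_high - assay_low)

# Certified 235U mass % of two standards and measured 235U/238U ratios
assay = interpolate_assay(r_unknown=0.0371, r_low=0.0305, r_high=0.0512,
                          assay_low=2.95, assay_high=4.98)
print(f"estimated 235U assay: {assay:.3f} mass %")
```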

  20. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 {mu}m particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  1. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 {mu}m particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  2. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    Directory of Open Access Journals (Sweden)

    Gaetano Luglio

    2015-06-01

    Conclusion: Proper laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short term outcomes, even in a learning curve setting. Key factors for better outcomes and shortening the learning curve seem to be the adoption of a standardized technique and training model along with the strict supervision of an expert colorectal surgeon.

  3. Sleep disordered breathing analysis in a general population using standard pulse oximeter signals.

    Science.gov (United States)

    Barak-Shinar, Deganit; Amos, Yariv; Bogan, Richard K

    2013-09-01

    Obstructive sleep apnea, reported as the apnea-hypopnea index (AHI), is usually measured in sleep laboratories using a high number of electrodes connected to the patient's body. In this study, we examined the use of a standard pulse oximeter system with an automated analysis based on the photoplethysmograph (PPG) signal for the diagnosis of sleep disordered breathing. Using a standard and simple device with high accuracy might provide a convenient diagnostic or screening solution for patient evaluation at home or in other out-of-center testing environments. The study included 140 consecutive patients who were referred routinely to a sleep laboratory [SleepMed Inc.] for the diagnosis of sleep disordered breathing. Each patient underwent an overnight polysomnography (PSG) study according to AASM guidelines in an AASM-accredited sleep laboratory. The automatic analysis is based on the photoplethysmographic and saturation signals only. Those two signals were recorded for the entire night as part of the full overnight PSG sleep study. The AHI calculated from the PPG analysis was compared to the AHI calculated from the manually scored gold standard full PSG. The AHI and total respiratory events measured by the pulse oximeter analysis correlated very well with the corresponding results obtained by the gold standard full PSG. The sensitivity and specificity at the AHI ≥ 5 and AHI ≥ 15 levels are both above 90%. The sensitivity and positive predictive value for the detection of respiratory events are both above 84%. The tested system yielded acceptable agreement with the gold standard PSG in patients with moderate to severe sleep apnea. Accordingly, and given the convenience and simplicity of the standard pulse oximeter device, the new system can be considered suitable for home and ambulatory diagnosis or screening of sleep disordered breathing patients.

  4. Analysis of standard problem six (Semiscale test S-02-6) data

    International Nuclear Information System (INIS)

    Cartmill, C.E.

    1977-08-01

    Test S-02-6 of the Semiscale Mod-1 blowdown heat transfer test series was conducted to supply data for the U.S. Nuclear Regulatory Commission Standard Problem Six. To determine the credibility of the data and thus establish the validity of Standard Problem Six, an analysis of the results of Test S-02-6 was performed and is presented. This analysis consisted of investigations of system hydraulic and core thermal data. The credibility of the system hydraulic data was investigated through comparisons of the data with data and calculations from related sources (Test S-02-4) and, when necessary, through assessment of physical events. The credibility of the core thermal data was based on a thorough analysis of physical events. The results of these investigations substantiate the validity of Test S-02-6 data

  5. Ion backscattering techniques applied in materials science research

    International Nuclear Information System (INIS)

    Sood, D.K.

    1978-01-01

    The applications of the Ion Backscattering Technique (IBT) to materials analysis have expanded rapidly during the last decade. It is now regarded as an analysis tool indispensable for a versatile materials research program. The technique consists of simply shooting a beam of monoenergetic ions (usually 4He+ ions at about 2 MeV) onto a target and measuring their energy distribution after backscattering at a fixed angle. Simple Rutherford scattering analysis of the backscattered ion spectrum yields information on the mass, the absolute amount and the depth profile of elements present up to a few microns below the target surface. The technique is nondestructive, quick, quantitative and the only known method of analysis which gives quantitative results without recourse to calibration standards. Its major limitations are the inability to separate elements of similar mass and a complete absence of chemical-binding information. A typical experimental setup and spectrum analysis have been described. Examples, some of them based on the work at the Bhabha Atomic Research Centre, Bombay, have been given to illustrate the applications of this technique to semiconductor technology, thin film materials science and nuclear energy materials. Limitations of IBT have been illustrated and a few remedies to partly overcome these limitations are presented. (auth.)
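
    As an illustration of the Rutherford kinematics underlying the technique, the sketch below computes the kinematic factor K = E1/E0 that links backscattered ion energy to target mass; the beam energy, detector angle and target elements are illustrative choices, not values taken from this record.

      import math

      def kinematic_factor(m1, m2, theta_deg):
          """Elastic (Rutherford) kinematic factor K = E1/E0 for a projectile
          of mass m1 backscattered by a target atom of mass m2 at theta."""
          t = math.radians(theta_deg)
          root = math.sqrt(m2**2 - (m1 * math.sin(t))**2)
          return ((root + m1 * math.cos(t)) / (m1 + m2)) ** 2

      e0 = 2.0   # MeV beam energy (illustrative)
      for name, m2 in [("C", 12.0), ("Si", 28.1), ("Au", 197.0)]:
          print(name, round(e0 * kinematic_factor(4.0, m2, 170.0), 3), "MeV")
      # Heavier targets return more energy, which is how the backscattered
      # spectrum separates target masses.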

  6. The preparation of synthetic standards for use in instrumental neutron-activation analysis

    International Nuclear Information System (INIS)

    Eddy, B.T.; Watterson, J.I.W.; Erasmus, C.S.

    1979-01-01

    An account is given of the formulation and preparation of synthetic standards suitable for the routine analysis of minerals, ores, and ore concentrates by instrumental neutron activation. Fifteen standards were prepared, each containing from one to seven elements. The standards contain forty-four elements that produce isotopes with half-lives longer than 12 hours. An evaluation of the accuracy and precision of the method of preparation is given

  7. A radioanalytical technique using (n,2n) reaction for the elemental analysis of samples

    International Nuclear Information System (INIS)

    Labor, M.

    1985-11-01

    A technique to determine the elemental composition of samples is reported. The technique employs the internal standard method and involves the resolution of complex annihilation spectra. It has been applied to the determination of the mass of nitrogen, m(N), and that of potassium, m(K), in known masses of potassium nitrate. The percentage difference between the calculated and actual masses in 2 g and 3 g of potassium nitrate is 1.0 and 0.7 respectively for potassium, and 1.0 for nitrogen. The use of more simultaneous equations than necessary in solving for m(N) and m(K) offers one of the advantages of the technique. (author)
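
    Using more simultaneous equations than unknowns amounts to a least-squares solution of an overdetermined linear system; a minimal sketch, with invented coefficients standing in for the spectral response of each measurement:

      import numpy as np

      # Hypothetical setup: each measured peak count c_i is a linear
      # combination of the nitrogen and potassium masses, c = A @ m.
      # Coefficients and counts are invented for illustration only.
      A = np.array([[0.82, 0.31],
                    [0.79, 0.35],
                    [0.85, 0.28],
                    [0.80, 0.33]])          # 4 equations, 2 unknowns
      c = np.array([2.10, 2.14, 2.08, 2.12])

      m, residuals, rank, sv = np.linalg.lstsq(A, c, rcond=None)
      print("estimated m(N), m(K):", m)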

  8. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel
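
    The H-matrix method is, in essence, a Gaussian discriminant: an event is scored by its Mahalanobis distance to a reference sample, using the inverse covariance matrix (the "H-matrix") of that sample. A hedged toy version, with two invented kinematic variables:

      import numpy as np

      def h_matrix_chi2(x, reference):
          """Chi-square of event x against a reference sample; the 'H-matrix'
          is the inverse covariance matrix of the reference events."""
          mu = reference.mean(axis=0)
          H = np.linalg.inv(np.cov(reference, rowvar=False))
          d = x - mu
          return float(d @ H @ d)

      rng = np.random.default_rng(0)
      # Toy reference sample: two invented kinematic variables per event.
      signal_mc = rng.normal([100.0, 80.0], [15.0, 20.0], size=(500, 2))
      event = np.array([95.0, 70.0])
      print("chi2 =", h_matrix_chi2(event, signal_mc))  # small => signal-like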

  9. Design and analysis of control charts for standard deviation with estimated parameters

    NARCIS (Netherlands)

    Schoonhoven, M.; Riaz, M.; Does, R.J.M.M.

    2011-01-01

    This paper concerns the design and analysis of the standard deviation control chart with estimated limits. We consider an extensive range of statistics to estimate the in-control standard deviation (Phase I) and design the control chart for real-time process monitoring (Phase II) by determining the

  10. ANSI/ASHRAE/IES Standard 90.1-2013 Determination of Energy Savings: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Hart, Philip R.; Richman, Eric E.; Athalye, Rahul A.; Winiarski, David W.

    2014-09-04

    This report provides a final qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition). All addenda in creating Standard 90.1-2013 were evaluated for their projected impact on energy efficiency. Each addendum was characterized as having a positive, neutral, or negative impact on overall building energy efficiency.

  11. A new technique for quantitative analysis of hair loss in mice using grayscale analysis.

    Science.gov (United States)

    Ponnapakkam, Tulasi; Katikaneni, Ranjitha; Gulati, Rohan; Gensure, Robert

    2015-03-09

    Alopecia is a common form of hair loss which can occur in many different conditions, including male-pattern hair loss, polycystic ovarian syndrome, and alopecia areata. Alopecia can also occur as a side effect of chemotherapy in cancer patients. In this study, our goal was to develop a consistent and reliable method to quantify hair loss in mice, which will allow investigators to accurately assess and compare new therapeutic approaches for these various forms of alopecia. The method utilizes a standard gel imager to obtain and process images of mice, measuring the light absorption, which occurs in rough proportion to the amount of black (or gray) hair on the mouse. Data quantified in this fashion can then be analyzed using standard statistical techniques (i.e., ANOVA, t-test). This methodology was tested in mouse models of chemotherapy-induced alopecia, alopecia areata and alopecia from waxing. In this report, the detailed protocol is presented for performing these measurements, including validation data from C57BL/6 and C3H/HeJ strains of mice. This new technique offers a number of advantages, including relative simplicity of application, reliance on equipment which is readily available in most research laboratories, and an objective, quantitative assessment which is more robust than subjective evaluations. Improvements in quantification of hair growth in mice will improve the study of alopecia models and facilitate evaluation of promising new therapies in preclinical studies.
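
    A minimal sketch of the grayscale-quantification idea the abstract describes: average the darkness of a region of interest so that darker (hairier) animals score higher. The scoring function and the synthetic check below are illustrative, not the paper's exact pipeline.

      import numpy as np

      def hair_score(gray_roi):
          """Mean darkness (0 = white, 255 = black) of a grayscale region of
          interest; black hair absorbs light, so higher means more hair."""
          return 255.0 - gray_roi.mean()

      # Synthetic check: a half-dark, half-light ROI scores midway.
      roi = np.concatenate([np.full((50, 50), 20.0), np.full((50, 50), 240.0)])
      print(hair_score(roi))   # 125.0
      # Groups of scores can then be compared with ANOVA or t-tests
      # (e.g. scipy.stats.f_oneway, scipy.stats.ttest_ind).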

  12. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of conditions of a major spill by pouring crude oil on the cells from perforated cans and the in-situ bioremediation of the polluted soils using the techniques that consisted in the manipulation of different variables within the soil environment. The analysis of soil characteristics after a ...

  13. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes: the radio-immuno-analysis and the auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters ( 125 I, 57 Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodation, tritium); purification of the labelled compound (dialysis, gel-filtering or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  14. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    Science.gov (United States)

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  15. Next generation initiation techniques

    Science.gov (United States)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The
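
    For reference, the cost function minimized in the variational approaches described above typically takes the standard three-dimensional variational (3D-Var) form (a generic textbook statement, not the formulation of any particular system mentioned here):

      J(x) = \tfrac{1}{2}\,(x - x_b)^{\mathrm{T}} B^{-1} (x - x_b) + \tfrac{1}{2}\,(H(x) - y)^{\mathrm{T}} R^{-1} (H(x) - y)

    where x_b is the background field, y the vector of observations, H the observation operator, and B and R the background- and observation-error covariance matrices; the analysis is the state x that minimizes J.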

  16. Standardization of a Volumetric Displacement Measurement for Two-Body Abrasion Scratch Test Data Analysis

    Science.gov (United States)

    Street, K. W. Jr.; Kobrick, R. L.; Klaus, D. M.

    2011-01-01

    A limitation has been identified in the existing test standards used for making controlled, two-body abrasion scratch measurements based solely on the width of the resultant score on the surface of the material. A new, more robust method is proposed for analyzing a surface scratch that takes into account the full three-dimensional profile of the displaced material. To accomplish this, a set of four volume-displacement metrics was systematically defined by normalizing the overall surface profile to statistically delimit the area of relevance, termed the Zone of Interaction. From this baseline, depth of the trough and height of the plowed material are factored into the overall deformation assessment. Proof-of-concept data were collected and analyzed to demonstrate the performance of this proposed methodology. This technique takes advantage of advanced imaging capabilities that allow resolution of the scratched surface to be quantified in greater detail than was previously achievable. When reviewing existing data analysis techniques for conducting two-body abrasive scratch tests, it was found that ASTM International Standard G 171 specified a generic metric based only on visually determined scratch width as a way to compare abraded materials. A limitation to this method was identified in that the scratch width is based on optical surface measurements, manually defined by approximating the boundaries, but does not consider the three-dimensional volume of material that was displaced. With large, potentially irregular deformations occurring on softer materials, it becomes unclear where to systematically determine the scratch width. Specifically, surface scratches on different samples may look the same from a top view, resulting in an identical scratch width measurement, but may vary in actual penetration depth and/or plowing deformation. Therefore, two different scratch profiles would be measured as having identical abrasion properties, although they differ
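
    A minimal sketch of the volume-displacement idea, assuming a baseline-referenced height map of the scratched surface (the geometry below is invented for illustration, not one of the paper's metrics):

      import numpy as np

      def displacement_volumes(height_map, pixel_area):
          """Split a baseline-referenced height map into trough volume
          (material removed, below 0) and pile-up volume (plowed material,
          above 0); units follow the map height and pixel_area."""
          trough = -height_map[height_map < 0].sum() * pixel_area
          pileup = height_map[height_map > 0].sum() * pixel_area
          return trough, pileup

      # Toy scratch: a 1-unit-deep groove flanked by plowed ridges.
      z = np.zeros((100, 100))
      z[:, 45:55] = -1.0
      z[:, 40:45] = 0.4
      z[:, 55:60] = 0.4
      print(displacement_volumes(z, pixel_area=0.25))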

  17. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Binh T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objectives of this work include (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
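
    As one concrete piece of the toolbox, here is a hedged sketch of the regression step relating thermocouple readings to calculated fuel temperatures, so a reading can be mapped online to an estimated fuel temperature; all numbers are invented, not AGR data.

      import numpy as np

      # Invented pairs of thermocouple readings (deg C) and code-calculated
      # fuel temperatures; a linear fit stands in for the NDMAS regression.
      tc = np.array([620.0, 640.0, 660.0, 680.0, 700.0])
      fuel = np.array([1090.0, 1121.0, 1148.0, 1183.0, 1212.0])

      slope, intercept = np.polyfit(tc, fuel, 1)
      estimate = slope * 650.0 + intercept   # fuel temperature at TC = 650
      print(f"estimated fuel temperature: {estimate:.0f} deg C")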

  18. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objectives of this work include (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.

  19. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129
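
    The paper's software is in Matlab; the outline below is an equivalent Python sketch of the marker-tracking step only (thresholds and toy frames are illustrative assumptions, not the authors' implementation):

      import numpy as np

      def marker_centroid(gray, threshold=200):
          """Centroid (row, col) of the bright marker pixels in one frame."""
          ys, xs = np.nonzero(gray > threshold)
          return ys.mean(), xs.mean()

      def displacement(frame_a, frame_b):
          """Marker displacement vector (pixels) between two frames."""
          y0, x0 = marker_centroid(frame_a)
          y1, x1 = marker_centroid(frame_b)
          return y1 - y0, x1 - x0

      # Toy frames: a 3x3 marker moves 2 pixels down and 4 to the right.
      a = np.zeros((50, 50)); a[20:23, 20:23] = 255
      b = np.zeros((50, 50)); b[22:25, 24:27] = 255
      print(displacement(a, b))   # (2.0, 4.0)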

  20. Advances in U.S. reactor physics standards

    International Nuclear Information System (INIS)

    Cokinos, Dimitrios

    2008-01-01

    The standards for reactor design, widely used in the nuclear industry, provide guidance and criteria for performing and validating a wide range of nuclear reactor calculations and measurements. Advances over the past decades in reactor technology, nuclear data and data-handling infrastructure have led to major improvements in the development and application of reactor physics standards. A wide variety of reactor physics methods and techniques are being used by reactor physicists for the design and analysis of modern reactors. ANSI (American National Standards Institute) reactor physics standards, covering such areas as nuclear data, reactor design, startup testing, decay heat and fast neutron fluence in the pressure vessel, are summarized and discussed. These standards are regularly undergoing review to respond to an evolving nuclear technology and are being successfully used in the U.S. and abroad, contributing to improvements in reactor design, safe operation and quality assurance. An overview of the overall program of reactor physics standards is presented. New standards currently under development are also discussed. (authors)

  1. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
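
    A schematic stand-in for the merging rule (written here as a simple frequentist consistency test, whereas BaTMAn itself is Bayesian): neighbouring elements are combined while their signals agree within errors.

      import numpy as np

      def consistent(s1, e1, s2, e2, nsigma=1.0):
          """True if two measurements agree within their combined errors,
          i.e. are statistically consistent with the same signal."""
          return abs(s1 - s2) <= nsigma * np.hypot(e1, e2)

      def merge(s1, e1, s2, e2):
          """Inverse-variance weighted combination of two merged elements."""
          w1, w2 = 1.0 / e1**2, 1.0 / e2**2
          return (w1 * s1 + w2 * s2) / (w1 + w2), (w1 + w2) ** -0.5

      if consistent(5.1, 0.4, 4.8, 0.5):
          print(merge(5.1, 0.4, 4.8, 0.5))   # combined signal and error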

  2. Application of advanced nuclear and instrumental analytical techniques for characterisation of environmental materials

    International Nuclear Information System (INIS)

    Sudersanan, M.; Pawaskar, P.B.; Kayasth, S.R.; Kumar, S.C.

    2002-01-01

    Full text: Increasing realisation about the toxic effects of metal ions in environmental materials has given an impetus to research on analytical techniques for their characterization. The large number of analytes present at very low levels has necessitated the use of sensitive, selective and element-specific techniques for their characterization. The concern about precision and accuracy in such analyses, which have socio-economic bearing, has emphasized the use of Certified Reference Materials and of a multi-technique approach for the unambiguous characterization of analytes. The recent work carried out at the Analytical Chemistry Division, BARC, on these aspects is presented in this paper. Increasing use of fossil fuels has led to the generation of large quantities of fly ash which pose problems of safe disposal. The utilization of these materials for land filling is an attractive option, but the presence of trace amounts of toxic metals like mercury, arsenic and lead may cause environmental problems. In view of the inhomogeneous nature of the material, efficient sample processing is an important factor, in addition to the validation of the results by the use of proper standards. Analysis was carried out on fly ash samples received as reference materials and also as samples from commercial sources, using a combination of nuclear techniques like INAA and RNAA as well as other techniques like AAS, ICPAES, cold vapour AAS for mercury and the hydride generation technique for arsenic. Similar analysis using nuclear techniques was employed for the characterization of air particulates. Biological materials often serve as sensitive indicator materials for pollution measurements. They are also employed for studies on the uptake of toxic metals like U, Th, Cd, Pb, Hg etc. The presence of large amounts of organic materials in them necessitates an appropriate sample dissolution procedure. In view of the possibility of loss of certain analytes like Cd, Hg, As, by high

  3. Control charts technique - a tool to data analysis for chemical experiments

    International Nuclear Information System (INIS)

    Yadav, M.B.; Venugopal, V.

    1999-01-01

    A procedure using the control charts technique has been developed to analyse data from a chemical experiment conducted to assign a value to the uranium content in Rb2U(SO4)3. A value of (34.164 ± 0.031)% has been assigned against (34.167 ± 0.042)% already assigned by the analysis of variance (ANOVA) technique. These values do not differ significantly. Merits and demerits of the two techniques are discussed. (author)
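
    A minimal sketch of an individuals/moving-range control chart applied to replicate determinations; the assay values below are invented, not the paper's data.

      import numpy as np

      x = np.array([34.15, 34.18, 34.17, 34.13, 34.19, 34.16])  # % U, invented
      mr = np.abs(np.diff(x))            # moving ranges of successive values
      sigma = mr.mean() / 1.128          # d2 constant for subgroups of size 2
      center = x.mean()
      ucl, lcl = center + 3 * sigma, center - 3 * sigma
      print(f"CL = {center:.3f}  UCL = {ucl:.3f}  LCL = {lcl:.3f}")
      print("out of control:", x[(x > ucl) | (x < lcl)])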

  4. Assessment of chromium biostabilization in contaminated soils using standard leaching and sequential extraction techniques

    International Nuclear Information System (INIS)

    Papassiopi, Nymphodora; Kontoyianni, Athina; Vaxevanidou, Katerina; Xenidis, Anthimos

    2009-01-01

    The iron reducing microorganism Desulfuromonas palmitatis was evaluated as potential biostabilization agent for the remediation of chromate contaminated soils. D. palmitatis were used for the treatment of soil samples artificially contaminated with Cr(VI) at two levels, i.e. 200 and 500 mg kg -1 . The efficiency of the treatment was evaluated by applying several standard extraction techniques on the soil samples before and after treatment, such as the EN12457 standard leaching test, the US EPA 3060A alkaline digestion method and the BCR sequential extraction procedure. The water soluble chromium as evaluated with the EN leaching test, was found to decrease after the biostabilization treatment from 13 to less than 0.5 mg kg -1 and from 120 to 5.6 mg kg -1 for the soil samples contaminated with 200 and 500 mg Cr(VI) per kg soil respectively. The BCR sequential extraction scheme, although not providing accurate estimates about the initial chromium speciation in contaminated soils, proved to be a useful tool for monitoring the relative changes in element partitioning, as a consequence of the stabilization treatment. After bioreduction, the percentage of chromium retained in the two least soluble BCR fractions, i.e. the 'oxidizable' and 'residual' fractions, increased from 54 and 73% to more than 96% in both soils

  5. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  6. Stakeholder analysis for adopting a personal health record standard in Korea.

    Science.gov (United States)

    Kang, Min-Jeoung; Jung, Chai Young; Kim, Soyoun; Boo, Yookyung; Lee, Yuri; Kim, Sundo

    Interest in health information exchanges (HIEs) is increasing. Several countries have adopted core health data standards with appropriate strategies. This study was conducted to determine the feasibility of a continuity of care record (CCR) as the standard for an electronic version of the official transfer note and the HIE in Korean healthcare. A technical review of the CCR standard and analysis of stakeholders' views were undertaken. Transfer notes were reviewed and matched with CCR standard categories. The standard for the Korean coding system was selected. Stakeholder analysis included an online survey of members of the Korean Society of Medical Informatics, a public hearing to derive opinions of consumers, doctors, vendors, academic societies and policy makers about the policy process, and a focus group meeting with EMR vendors to determine which HIE objects were technically applicable. Data objects in the official transfer note form matched CCR standards. Korean Classification of Diseases, Korean Standard Terminology of Medicine, Electronic Data Interchange code (EDI code), Logical Observation Identifiers Names and Codes, and Korean drug codes (KD code) were recommended as the Korean coding standard. 'Social history', 'payers', and 'encounters' were mostly marked as optional or unnecessary sections, and 'allergies', 'alerts', 'medication list', 'problems/diagnoses', 'results', and 'procedures' as mandatory. Unlike the US, 'social history' was considered optional and 'advance directives' mandatory. At the public hearing there was some objection from the Korean Medical Association to the HIE on legal grounds in terms of intellectual property and patients' personal information. Other groups showed positive or neutral responses. Focus group members divided CCR data objects into three phases based on predicted adoption time in CCR: (i) immediate adoption; (ii) short-term adoption ('alerts', 'family history'); and (iii) long-term adoption ('results', 'advanced directives

  7. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, predicting grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis are illustrated. Meanwhile, traditional methods are compared with spectral technology. Remote sensing has been applied to monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from the remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can monitor grasshoppers more exactly, has advantages in measuring the degree of damage and classifying damaged areas, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between hyperspectral characteristic parameters and leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near-infrared reflectance spectroscopy has been employed to judge grasshopper species, examine species occurrences and monitor hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. It is concluded that spectral analysis techniques could be used as a quick and exact tool for monitoring and forecasting grasshopper infestations, and will become an important means in such research owing to their advantages in determining spatial orientation and in information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
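
    For reference, the NDVI mentioned above is computed pixel-wise from red and near-infrared reflectance; a minimal sketch with toy reflectance values:

      import numpy as np

      def ndvi(nir, red, eps=1e-9):
          """Normalized Difference Vegetation Index, computed pixel-wise."""
          nir = np.asarray(nir, dtype=float)
          red = np.asarray(red, dtype=float)
          return (nir - red) / (nir + red + eps)

      nir = np.array([[0.45, 0.50], [0.30, 0.60]])
      red = np.array([[0.10, 0.12], [0.20, 0.08]])
      print(ndvi(nir, red))   # values near 1 indicate dense vegetation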

  8. Error modelling of quantum Hall array resistance standards

    Science.gov (United States)

    Marzano, Martina; Oe, Takehiko; Ortolano, Massimo; Callegaro, Luca; Kaneko, Nobu-Hisa

    2018-04-01

    Quantum Hall array resistance standards (QHARSs) are integrated circuits composed of interconnected quantum Hall effect elements that allow the realization of virtually arbitrary resistance values. In recent years, techniques were presented to efficiently design QHARS networks. An open problem is that of the evaluation of the accuracy of a QHARS, which is affected by contact and wire resistances. In this work, we present a general and systematic procedure for the error modelling of QHARSs, which is based on modern circuit analysis techniques and Monte Carlo evaluation of the uncertainty. As a practical example, this method of analysis is applied to the characterization of a 1 MΩ QHARS developed by the National Metrology Institute of Japan. Software tools are provided to apply the procedure to other arrays.
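
    A toy version of the Monte Carlo step, assuming a simple series chain of nu = 2 quantum Hall elements with normally distributed wire resistances; the real QHARS networks (including the 1 MΩ array) and their circuit analysis are considerably more elaborate.

      import numpy as np

      RK = 25812.807          # von Klitzing constant R_K, ohms
      n = 10                  # toy chain of nu = 2 elements in series
      rng = np.random.default_rng(1)

      # Each trial perturbs the wire/contact resistances (here: mOhm-level,
      # normally distributed) and records the realized array resistance.
      wires = rng.normal(0.0, 1e-3, size=(100000, n + 1))
      realized = n * RK / 2 + wires.sum(axis=1)

      nominal = n * RK / 2
      print("mean relative deviation:", (realized.mean() - nominal) / nominal)
      print("relative standard uncertainty:", realized.std() / nominal)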

  9. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For the lead-210 analysis, the procedure of digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested

  10. Research Note: Energy dispersive x-ray fluorescence analysis ...

    African Journals Online (AJOL)

    Energy Dispersive X-Ray fluorescence (EDXRF) technique for the analysis of geological, biological and environmental samples is described. The technique has been applied in the analysis of 10 (geological, biological, environmental) standard reference materials. The accuracy and precision of the technique were attested ...

  11. Fed-state gastric media and drug analysis techniques: Current status and points to consider.

    Science.gov (United States)

    Baxevanis, Fotios; Kuiper, Jesse; Fotaki, Nikoletta

    2016-10-01

    Gastric fed state conditions can have a significant effect on drug dissolution and absorption. In vitro dissolution tests with simple aqueous media cannot usually predict drugs' in vivo response, as several factors, such as meal content, gastric emptying and possible interactions between food and drug formulations, can affect a drug's pharmacokinetics. Good understanding of the effect of the in vivo fed gastric conditions on the drug is essential for the development of biorelevant dissolution media simulating the gastric environment after the administration of the standard high fat meal proposed by the FDA and the EMA in bioavailability/bioequivalence (BA/BE) studies. The analysis of drugs in fed state media can be quite challenging, as most analytical protocols currently employed are time consuming and labour intensive. In this review, an overview of the in vivo gastric conditions and the biorelevant media used for their in vitro simulation is given. Furthermore, an analysis of the physicochemical properties of the drugs and the formulations related to food effect is presented. In terms of drug analysis, the protocols currently used for fed state media sample treatment and analysis are discussed, together with the analytical challenges and the emerging need for more efficient and time-saving techniques for a broad spectrum of compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Standardized Effect Size Measures for Mediation Analysis in Cluster-Randomized Trials

    Science.gov (United States)

    Stapleton, Laura M.; Pituch, Keenan A.; Dion, Eric

    2015-01-01

    This article presents 3 standardized effect size measures to use when sharing results of an analysis of mediation of treatment effects for cluster-randomized trials. The authors discuss 3 examples of mediation analysis (upper-level mediation, cross-level mediation, and cross-level mediation with a contextual effect) with demonstration of the…

  13. Evaluation of the laser-induced breakdown spectroscopy technique for determination of the chemical composition of copper concentrates

    International Nuclear Information System (INIS)

    Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Drzymała, Jan; Abramski, Krzysztof M.

    2014-01-01

    Laser-induced breakdown spectroscopy (LIBS), like many other spectroscopic techniques, is a comparative method. Typically, in qualitative analysis, a synthetic certified standard with a well-known elemental composition is used to calibrate the system. Nevertheless, in all laser-induced techniques, such calibration can affect the accuracy through differences in the overall composition of the chosen standard. There are also some intermediate factors which can cause imprecision in measurements, such as optical absorption, surface structure and thermal conductivity. In this work, the calibration performed for the LIBS technique utilizes pellets made directly from the tested materials (old well-characterized samples). This choice produces a considerable improvement in the accuracy of the method. This technique was adopted for the determination of trace elements in industrial copper concentrates, standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for three elements: silver, cobalt and vanadium. We also proposed a method of post-processing the measurement data to minimize matrix effects and permit reliable analysis. It has been shown that the described technique can be used in qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates. It was noted that the final validation of such methodology is limited mainly by the accuracy of the characterization of the standards. - Highlights: • A laser-induced breakdown spectroscopy technique is introduced for composition monitoring in industrial copper concentrates. • Calibration samples consisted of pellets produced from the tested materials. • The proposed method of post-processing significantly minimizes matrix effects. • The possible uses of this technique are limited mainly by accurate characterization of the standard samples
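
    A minimal sketch of the calibration step: regress emission-line intensity against concentrations assigned by atomic absorption spectroscopy, then invert the line for an unknown sample. The concentrations and intensities below are invented for illustration, not the paper's data, and the simple linear fit stands in for the authors' post-processing.

      import numpy as np

      # Invented pellet standards: AAS-assigned Ag content (ppm) versus the
      # normalized intensity of an Ag emission line in the LIBS spectrum.
      conc = np.array([20.0, 50.0, 100.0, 200.0, 400.0])
      intensity = np.array([0.11, 0.26, 0.49, 1.02, 1.97])

      slope, intercept = np.polyfit(conc, intensity, 1)   # calibration line
      unknown = (1.30 - intercept) / slope                # invert for a sample
      print(f"Ag = {unknown:.0f} ppm (approx.)")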

  14. Evaluation of the laser-induced breakdown spectroscopy technique for determination of the chemical composition of copper concentrates

    Energy Technology Data Exchange (ETDEWEB)

    Łazarek, Łukasz, E-mail: lukasz.lazarek@pwr.wroc.pl [Laser and Fiber Electronics Group, Faculty of Electronics, Wroclaw University of Technology, Wyb. Wyspianskiego 27, 50-370 Wroclaw (Poland); Antończak, Arkadiusz J.; Wójcik, Michał R. [Laser and Fiber Electronics Group, Faculty of Electronics, Wroclaw University of Technology, Wyb. Wyspianskiego 27, 50-370 Wroclaw (Poland); Drzymała, Jan [Faculty of Geoengineering, Mining and Geology, Wroclaw University of Technology, Wyb. Wyspianskiego 27, 50-370 Wroclaw (Poland); Abramski, Krzysztof M. [Laser and Fiber Electronics Group, Faculty of Electronics, Wroclaw University of Technology, Wyb. Wyspianskiego 27, 50-370 Wroclaw (Poland)

    2014-07-01

    Laser-induced breakdown spectroscopy (LIBS), like many other spectroscopic techniques, is a comparative method. Typically, in qualitative analysis, a synthetic certified standard with a well-known elemental composition is used to calibrate the system. Nevertheless, in all laser-induced techniques, such calibration can affect the accuracy through differences in the overall composition of the chosen standard. There are also some intermediate factors which can cause imprecision in measurements, such as optical absorption, surface structure and thermal conductivity. In this work, the calibration performed for the LIBS technique utilizes pellets made directly from the tested materials (old well-characterized samples). This choice produces a considerable improvement in the accuracy of the method. This technique was adopted for the determination of trace elements in industrial copper concentrates, standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for three elements: silver, cobalt and vanadium. We also proposed a method of post-processing the measurement data to minimize matrix effects and permit reliable analysis. It has been shown that the described technique can be used in qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates. It was noted that the final validation of such methodology is limited mainly by the accuracy of the characterization of the standards. - Highlights: • A laser-induced breakdown spectroscopy technique is introduced for composition monitoring in industrial copper concentrates. • Calibration samples consisted of pellets produced from the tested materials. • The proposed method of post-processing significantly minimizes matrix effects. • The possible uses of this technique are limited mainly by accurate characterization of the standard samples.

  15. Standards for holdup measurement

    International Nuclear Information System (INIS)

    Zucker, M.S.

    1982-01-01

    Holdup measurements, needed for material balance, depend heavily on standards and on the interpretation of the calibration procedure. More than in other measurements, the calibration procedure using the standard becomes part of the standard. Standards practical for field use and calibration techniques have been developed. While accuracy in holdup measurements is comparatively poor, avoidance of bias is a necessary goal

  16. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed

  17. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.…
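
    The two classical indices mentioned, item facility and discrimination, can be computed directly from a scored response matrix; a minimal sketch, with toy data and an illustrative (not the study's) selection rule:

      import numpy as np

      def item_stats(responses):
          """responses: examinees x items matrix of 0/1 scores. Returns item
          facility (proportion correct) and point-biserial discrimination
          (correlation of each item with the rest-of-test score)."""
          rest = responses.sum(axis=1, keepdims=True) - responses
          facility = responses.mean(axis=0)
          disc = np.array([np.corrcoef(responses[:, j], rest[:, j])[0, 1]
                           for j in range(responses.shape[1])])
          return facility, disc

      rng = np.random.default_rng(2)
      scores = (rng.random((30, 5)) > 0.4).astype(int)   # toy 30 x 5 data
      fac, disc = item_stats(scores)
      keep = (fac > 0.3) & (fac < 0.7) & (disc > 0.2)    # one possible rule
      print(fac, disc, keep, sep="\n")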

  18. The development of a standard format for accelerator data analysis

    International Nuclear Information System (INIS)

    Cohen, S.

    1989-01-01

    The purpose of specifying a standard file format is to facilitate the analysis of data sampled by accelerator beam diagnostic instrumentation. The format's design needs to be flexible enough to allow storage of information from disparate diagnostic devices placed in the beam line. The goal of this project was to establish a standard file layout and syntax that can be generated and 'understood' by a large set of applications running on the control and data-analysis computers at LAMPF as well as applications on personal computers. Only one file-parsing algorithm is needed for all computing systems. It is a straightforward process to code a parser for both the control computer and pc's once a consensus on the file syntax has been established. This paper describes the file format and the methods used to integrate the format into existing diagnostic and control software

  19. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  20. Sentiment Analysis in Geo Social Streams by using Machine Learning Techniques

    OpenAIRE

    Twanabasu, Bikesh

    2018-01-01

    Master's thesis, Erasmus Mundus Master in Geospatial Technologies (2013 curriculum). Code: SIW013. Academic year 2017-2018. Massive amounts of sentiment-rich data are generated on social media in the form of tweets, status updates, blog posts, reviews, etc. Different people and organizations use this user-generated content for decision making. Symbolic (knowledge-based) approaches and machine learning techniques are the two main techniques used for sentiment analysis...

  1. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
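
    A toy sketch of the Bayesian comparison step: score a grid of candidate burnups by how well their predicted isotope ratios match the measurement, then normalize into a posterior. The ratio model below is invented; NOVA compared against a reactor-physics database rather than a closed-form curve.

      import numpy as np

      # Invented lookup: predicted stable-xenon isotope ratio versus burnup
      # (GWd/tU); NOVA used a database calculated with Monteburns instead.
      burnup = np.linspace(1.0, 40.0, 400)
      predicted = 0.9 + 0.012 * burnup          # toy monotone model

      measured, sigma = 1.20, 0.02              # stack measurement and error
      loglike = -0.5 * ((measured - predicted) / sigma) ** 2
      posterior = np.exp(loglike - loglike.max())
      posterior /= posterior.sum()              # flat prior over the grid

      print(f"inferred burnup: {burnup[posterior.argmax()]:.1f} GWd/tU")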

  2. Nuclear microprobe analysis of the standard reference materials

    International Nuclear Information System (INIS)

    Jaksic, M.; Fazinic, S.; Bogdanovic, I.; Tadic, T.

    2002-01-01

    Most of the presently existing Standard Reference Materials (SRM) for nuclear analytical methods are certified for an analyzed mass of the order of a few hundred mg. The typical mass of a sample analyzed by PIXE or XRF methods is very often below 1 mg. With the development of focused proton or x-ray beams, the masses that can typically be analyzed go down to the μg or even ng level. It is difficult to make biological or environmental SRMs that retain the desired homogeneity at such a small scale. However, the use of fundamental-parameter quantitative evaluation procedures (absolute methods) minimizes the need for SRMs. In the PIXE and micro-PIXE setup at our Institute, the fundamental-parameter approach is used. For exact calibration of the quantitative analysis procedure just one standard sample is needed; in our case, glass standards which showed homogeneity down to the micron scale were used. Of course, it is desirable to use SRMs for quality assurance, and therefore the need for homogeneous materials can be justified even for the micro-PIXE method. In this presentation, a brief overview of the PIXE setup calibration is given, along with some recent results of tests of several SRMs

  3. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
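
    As a concrete illustration of why nonparametric smoothing helps, the sketch below (Python with statsmodels assumed available; the input-output model is a toy stand-in, not the WIPP assessment) contrasts the variance explained by a linear fit with that explained by a LOESS smooth when the true dependence is nonlinear and nonmonotonic:

      # Linear regression vs. LOESS as sensitivity measures on a toy model.
      import numpy as np
      from statsmodels.nonparametric.smoothers_lowess import lowess

      rng = np.random.default_rng(1)
      x = rng.uniform(-3, 3, 500)               # sampled model input
      y = x**2 + rng.normal(0, 0.5, 500)        # nonmonotonic model response

      r2_linear = np.corrcoef(x, y)[0, 1] ** 2  # near zero: dependence missed

      smoothed = lowess(y, x, frac=0.3, return_sorted=False)
      r2_loess = 1 - np.var(y - smoothed) / np.var(y)  # close to one

      print(f"linear R^2 = {r2_linear:.2f}, LOESS R^2 = {r2_loess:.2f}")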

  4. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (proton-induced X-ray emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from the Mexico National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis, and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The measurements were carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees with respect to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for observing most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  5. Gas Chromatographic Determination of Methyl Salicylate in Rubbing Alcohol: An Experiment Employing Standard Addition.

    Science.gov (United States)

    Van Atta, Robert E.; Van Atta, R. Lewis

    1980-01-01

    Provides a gas chromatography experiment that exercises the quantitative technique of standard addition to the analysis for a minor component, methyl salicylate, in a commercial product, "wintergreen rubbing alcohol." (CS)
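
    The arithmetic behind standard addition is short enough to show. In the hedged sketch below (Python; all peak areas and spike amounts are invented for illustration), the detector response is fitted against the amount of analyte added, and the magnitude of the x-intercept gives the amount originally present in the sample:

      # Standard-addition quantitation: fit signal vs. amount added and take
      # the x-intercept magnitude as the analyte originally present.
      import numpy as np

      added = np.array([0.0, 5.0, 10.0, 15.0])     # mg of analyte spiked
      signal = np.array([12.1, 18.0, 24.2, 29.9])  # GC peak areas (arbitrary)

      slope, intercept = np.polyfit(added, signal, 1)
      original = intercept / slope                  # mg in the unspiked sample
      print(f"methyl salicylate present: {original:.1f} mg")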

  6. The composite sequential clustering technique for analysis of multispectral scanner data

    Science.gov (United States)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
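
    The two-part structure translates naturally into code. The sketch below (Python with scikit-learn assumed; the data, threshold and dimensionality are invented, and the one-pass stage is a simplification of the sequential variance analysis) seeds an iterative K-means refinement with clusters proposed by a single sequential pass:

      # Composite clustering sketch: a sequential first pass proposes cluster
      # centers, which generalized K-means then refines iteratively.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      pixels = np.vstack([rng.normal(m, 0.3, (200, 4)) for m in (0.0, 1.5, 3.0)])

      thresh, centers = 1.5, [pixels[0]]       # stage 1: sequential clustering
      for p in pixels[1:]:
          if min(np.linalg.norm(p - c) for c in centers) > thresh:
              centers.append(p)                # start a new cluster

      km = KMeans(n_clusters=len(centers), init=np.array(centers), n_init=1)
      km.fit(pixels)                           # stage 2: iterative refinement
      print("clusters:", len(centers), "inertia:", round(km.inertia_, 2))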

  7. Russian Language Development Assessment as a Standardized Technique for Assessing Communicative Function in Children Aged 3–9 Years

    Directory of Open Access Journals (Sweden)

    Prikhoda N.A.,

    2016-10-01

    The article describes the Russian Language Development Assessment, a standardized individual diagnostic tool for children aged from 3 to 9 that helps to assess the following components of a child's communicative function: passive vocabulary, expressive vocabulary, knowledge of semantic constructs with logical, temporal and spatial relations, passive perception and active use of syntactic and morphological features of words in a sentence, active and passive phonological awareness, and active and passive knowledge of syntactic structures and categories. The article provides descriptions of the content and diagnostic procedures for all 7 subtests included in the assessment (Passive Vocabulary, Active Vocabulary, Linguistic Operators, Sentence Structure, Word Structure, Phonology, Sentence Repetition). Based on data collected in a study that involved 86 first-graders of a Moscow school, the article analyzes the internal consistency and construct validity of each subtest of the technique. It concludes that the Russian Language Development Assessment can be of much use for diagnostic purposes as well as in supporting children with ASD, given the lack of standardized tools for assessing language and speech development in Russian and the importance of such a measure in general.

  8. A comparative study of standard vs. high definition colonoscopy for adenoma and hyperplastic polyp detection with optimized withdrawal technique.

    Science.gov (United States)

    East, J E; Stavrindis, M; Thomas-Gibson, S; Guenther, T; Tekkis, P P; Saunders, B P

    2008-09-15

    Colonoscopy has a known miss rate for polyps and adenomas. High-definition (HD) colonoscopes may allow detection of subtle mucosal change, potentially aiding detection of adenomas and hyperplastic polyps. To compare detection rates between HD and standard-definition (SD) colonoscopy, a prospective cohort study was conducted with an optimized withdrawal technique (withdrawal time >6 min, antispasmodic, position changes, re-examining flexures and folds). One hundred and thirty patients attending for routine colonoscopy were examined with either SD (n = 72) or HD (n = 58) colonoscopes. Groups were well matched. Sixty per cent of patients had at least one adenoma detected with SD vs. 71% with HD, P = 0.20, relative risk (benefit) 1.32 (95% CI 0.85-2.04). Eighty-eight adenomas (mean ± standard deviation 1.2 ± 1.4) were detected using SD vs. 93 (1.6 ± 1.5) with HD, P = 0.12; however, more nonflat, diminutive (<9 mm) hyperplastic polyps were detected with HD, a difference of 7% (0.09 ± 0.36). High definition did not lead to a significant increase in adenoma or hyperplastic polyp detection, but may help where comprehensive lesion detection is paramount. High detection rates appear possible with either SD or HD when using an optimized withdrawal technique.

  9. Soil texture analysis by laser diffraction - standardization needed

    DEFF Research Database (Denmark)

    Callesen, Ingeborg; Palviainen, M.; Kjønaas, O. Janne

    2017-01-01

    Soil texture is a central soil quality property. Laser diffraction (LD) for determination of particle size distribution (PSD) is now widespread due to easy analysis and low cost. However, pretreatment methods and interpretation of the resulting soil PSDs are not standardized. Comparison of LD data with sedimentation and sieving data may cause misinterpretation and confusion. In literature that reports PSDs based on LD, pretreatment methods, operating procedures and data methods are often underreported or not reported, although literature stressing their importance exists (e.g. Konert and Vandenberghe, 1997, and many newer works; ISO 13320:2009). PSD uncertainty caused by pretreatments and PSD bias caused by plate-shaped clay particles still call for more method standardization work. If LD is used more generally, new pedotransfer functions for other soil properties (e.g. water retention) based on sieving...

  10. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented, and a case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis.
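
    A minimal version of such a corridor estimate can be sketched with modern GIS libraries. In the hypothetical fragment below (Python with geopandas assumed; the file names and the 'POP' attribute are placeholders, a metric projected CRS is assumed, and counting whole intersecting blocks is deliberately coarse; finer work would area-weight them), a route is buffered at several half-widths and the population of intersecting blocks is totalled:

      # Corridor population totals at several buffer half-widths (hypothetical).
      import geopandas as gpd

      blocks = gpd.read_file("nevada_blocks.shp")   # TIGER-style block polygons
      route = gpd.read_file("highway_route.shp").to_crs(blocks.crs)

      for half_width_mi in (0.5, 1, 5, 10, 20):
          corridor = route.buffer(half_width_mi * 1609.34).unary_union  # metres
          inside = blocks[blocks.intersects(corridor)]
          print(half_width_mi, "mi:", int(inside["POP"].sum()), "people")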

  11. Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, P.; Logan, J.; Bird, L.; Short, W.

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  12. A Secure Test Technique for Pipelined Advanced Encryption Standard

    Science.gov (United States)

    Shi, Youhua; Togawa, Nozomu; Yanagisawa, Masao; Ohtsuki, Tatsuo

    In this paper, we present a Design-for-Secure-Test (DFST) technique for pipelined AES that guarantees both security and test quality during testing. Unlike previous works, the proposed method can keep all the secrets inside while providing high test quality and fault diagnosis ability. Furthermore, the proposed DFST technique can significantly reduce test application time, test data volume, and test generation effort as additional benefits.

  13. On criteria for examining analysis quality with standard reference material

    International Nuclear Information System (INIS)

    Yang Huating

    1997-01-01

    The advantages, disadvantages and applicability of some criteria for examining analysis quality with standard reference materials are discussed. How the uncertainty of the instrument under examination and that of the reference material are combined should be decided on the basis of the specific situation. When no data on the instrument's uncertainty are available, it is acceptable to substitute the standard deviation multiplied by a suitable factor for the uncertainty. The examination should not result in more error being attributed to routine measurements than is really present, and overly strict examination should also be avoided; one possible formalization is sketched below.
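
    One common way to formalize such a criterion is a zeta-score (or E_n-style) test: the instrument passes if the measured-minus-certified difference lies within k combined standard uncertainties. The Python sketch below (all values invented) is one possible reading of the combination rule discussed above, not the paper's own prescription:

      # Zeta-score check of an instrument against a standard reference material.
      import math

      measured, u_measured = 10.42, 0.15    # instrument result and uncertainty
      certified, u_certified = 10.20, 0.10  # SRM value and its uncertainty
      k = 2.0                               # coverage factor (~95% confidence)

      u_combined = math.hypot(u_measured, u_certified)
      zeta = (measured - certified) / u_combined
      print("zeta =", round(zeta, 2), "->", "pass" if abs(zeta) <= k else "fail")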

  14. Learning mediastinoscopy: the need for education, experience and modern techniques--interdependency of the applied technique and surgeon's training level.

    Science.gov (United States)

    Walles, Thorsten; Friedel, Godehard; Stegherr, Tobias; Steger, Volker

    2013-04-01

    Mediastinoscopy represents the gold standard for invasive mediastinal staging. While learning and teaching the surgical technique are challenging due to the limited accessibility of the operative field, both have benefited from the implementation of video-assisted techniques. However, it has not yet been established whether video-assisted mediastinoscopy improves mediastinal staging in itself. We performed a retrospective single-centre cohort analysis of 657 mediastinoscopies carried out at a specialized tertiary-care thoracic surgery unit from 1994 to 2006. The number of specimens obtained per procedure and per lymph node station (2, 4, 7, 8 for mediastinoscopy and 2-9 for open lymphadenectomy), the number of lymph node stations examined, and the sensitivity and negative predictive value were calculated, with a focus on the technique employed (video-assisted vs standard technique) and the surgeon's experience. Overall sensitivity was 60%, accuracy was 90% and the negative predictive value was 88%. With the conventional technique, experience alone improved sensitivity from 49 to 57%, predominantly in the right paratracheal region (from 62 to 82%). With the video-assisted technique, however, experienced surgeons raised sensitivity from 57 to 79%, whereas inexperienced surgeons lowered it from 49 to 33%. We found significant differences concerning (i) the total number of specimens taken, (ii) the number of lymph node stations examined, (iii) the number of specimens taken per lymph node station and (iv) true positive mediastinoscopies. The video-assisted technique can significantly improve the results of mediastinoscopy. A thorough education in the modern video-assisted technique is mandatory for thoracic surgeons until they can fully exhaust its potential.

  15. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method was post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. It was surprising that one of the simplest and fastest techniques resulted in some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
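
    The Integral Ratio discrimination named above reduces to a few lines once the pulses are digitized: the ratio of the delayed ("tail") integral to the total integral is larger for neutrons, whose recoil-proton events carry more slow scintillation light. The Python sketch below uses invented window positions, decay constants and toy pulses, not the paper's settings:

      # Tail-to-total integral ratio on synthetic BC501A-like pulses (1 GHz).
      import numpy as np

      def integral_ratio(pulse, tail_start=20, window=200):
          """Delayed-to-total charge ratio of one digitized pulse."""
          return pulse[tail_start:window].sum() / pulse[:window].sum()

      t = np.arange(200)                                     # 1 ns samples
      gamma = np.exp(-t / 5.0) + 0.02 * np.exp(-t / 80.0)    # little slow light
      neutron = np.exp(-t / 5.0) + 0.15 * np.exp(-t / 80.0)  # more slow light

      for name, p in (("gamma", gamma), ("neutron", neutron)):
          print(name, round(integral_ratio(p), 3))  # neutron ratio is larger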

  16. Synthetic multielement standards used for instrumental neutron activation analysis as rock imitations

    International Nuclear Information System (INIS)

    Leypunskaya, D.I.; Drynkin, V.I.; Belenky, B.V.; Kolomijtsev, M.A.; Dundera, V.Yu.; Pachulia, N.V.

    1975-01-01

    Complex (multielement) standards representing the microelement composition of standard rocks such as trap ST-1 (USSR), gabbrodiorite SGD-1 (USSR), albitized granite SG-1 (USSR), basalt BCR-1 (USA) and granodiorite GSP-1 (USA) have been synthesized. It has been shown that the concentration of each microelement in the synthetic standards can be set with high precision. A comparative investigation of the synthetic imitations and the above natural standard rocks has been carried out. It has been found that the results of instrumental neutron activation analysis using the synthetic standards are as good as those obtained when natural standard rocks are used. The results obtained also substantiate the versatility of the preparation method, i.e., they demonstrate that it can be used to prepare synthetic standards representing the microelement composition of any natural rocks with various compositions and concentrations of microelements. (T.G.)

  17. Digital hilar tomography. Comparison with conventional technique

    International Nuclear Information System (INIS)

    Schaefer, C.B.; Braunschweig, R.; Teufl, F.; Kaiser, W.; Claussen, C.D.

    1993-01-01

    The aim of the following study was to compare conventional and digital hilar tomography. 20 patients were examined with both conventional and digital hilar tomography using the same tomographic technique and identical exposure dose. All patients underwent computed tomography of the chest as the gold standard. The digital technique, especially the edge-enhanced image version, showed superior image quality. ROC analysis by 4 readers found equal diagnostic performance without any statistically significant difference. Digital hilar tomography shows superior and constant image quality and lowers the rate of re-exposure. Therefore, digital hilar tomography is the preferable method. (orig.) [de]

  18. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.
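
    As a flavor of what such computer-aided analyses involve, the sketch below (Python with scipy assumed; the model is a generic Rosenzweig-MacArthur-type predator-prey system with invented parameters, not one of the paper's models) locates an interior equilibrium numerically and reads off local stability from the Jacobian eigenvalues:

      # Find an equilibrium of a predator-prey model and check its stability.
      import numpy as np
      from scipy.optimize import fsolve

      r, K, a, b, c, d = 1.0, 10.0, 1.0, 0.5, 0.5, 0.2

      def rhs(z):
          n, p = z                        # prey and predator densities
          return [r * n * (1 - n / K) - a * n * p / (b + n),
                  c * a * n * p / (b + n) - d * p]

      eq = fsolve(rhs, [0.5, 1.0])        # interior equilibrium

      eps, J = 1e-6, np.zeros((2, 2))     # finite-difference Jacobian
      for j in range(2):
          dz = np.zeros(2); dz[j] = eps
          J[:, j] = (np.array(rhs(eq + dz)) - np.array(rhs(eq))) / eps

      # Negative real parts imply local stability; positive ones, instability.
      print("equilibrium:", eq.round(3), "eigenvalues:", np.linalg.eigvals(J))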

  19. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress, it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid, or where departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA-based approach for the evaluation of residual stresses and to provide some examples where promising results have been obtained.
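
    For reference, the first-order relationship alluded to here is conventionally written (this is the textbook form, not a result specific to this review):

      \Delta T = -K \, T \, \Delta(\sigma_1 + \sigma_2), \qquad K = \frac{\alpha}{\rho \, C_p}

    where ΔT is the cyclic temperature change, T the absolute temperature, Δ(σ1 + σ2) the change in the sum of the principal stresses, α the coefficient of linear thermal expansion, ρ the density and Cp the specific heat at constant pressure. A static mean (residual) stress does not appear in this first-order form, which is why higher-order effects or manufacturing-induced changes in material properties must be exploited to sense it.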

  1. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer the greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant, i.e. the composition of macro components and the amounts of organic and inorganic impurities; the coolant during and after operation, i.e. determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); control of systems for purifying and regenerating the coolant after use; dissolved pressurization gases; detection of intermediate products of decomposition, which are generally very unstable (free radicals); degree of fouling and film formation, with tests to determine the potential formation of films; corrosion of structural elements and canning materials; and health and safety, i.e. toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary in degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on the one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity).

  2. A microhistological technique for analysis of food habits of mycophagous rodents.

    Science.gov (United States)

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  3. Standardization of the Fricke gel dosimetry method and tridimensional dose evaluation using the magnetic resonance imaging technique

    International Nuclear Information System (INIS)

    Cavinato, Christianne Cobello

    2009-01-01

    This study standardized the method for obtaining the Fricke gel solution developed at IPEN. The results for different gel qualities used in the preparation of the solutions were compared, as was the influence of the gelatin concentration on the response of the dosimetric solutions. Type tests such as dose-response dependence, minimum and maximum detection limits and response reproducibility, among others, were carried out using different radiation types and the optical absorption (OA) spectrophotometry and magnetic resonance (MR) techniques. The useful dose ranges for Co-60 gamma radiation and 6 MeV photons are 0.4 to 30.0 Gy and 0.5 to 100.0 Gy, using the OA and MR techniques, respectively. A study of ferric ion diffusion in the solution was performed to determine the optimum time interval between irradiation and sample evaluation: up to 2.5 hours after irradiation to obtain sharp MR images. A spherical simulator consisting of Fricke gel solution prepared with 5% by weight 270 Bloom gelatine (national quality) was developed for three-dimensional dose assessment using the magnetic resonance imaging (MRI) technique. The Fricke gel solution prepared with 270 Bloom gelatine, which, in addition to its low cost, can be easily acquired on the national market, presents satisfactory results in ease of handling, sensitivity, response reproducibility and consistency. The results confirm its applicability in three-dimensional dosimetry using the MRI technique. (author)

  4. MONJU experimental data analysis and its feasibility evaluation to build up the standard data base for large FBR nuclear core design

    International Nuclear Information System (INIS)

    Sugino, K.; Iwai, T.

    2006-01-01

    MONJU experimental data analysis was performed by using the detailed calculation scheme for fast reactor cores developed in Japan. Subsequently, feasibility of the MONJU integral data was evaluated by the cross-section adjustment technique for the use of FBR nuclear core design. It is concluded that the MONJU integral data is quite valuable for building up the standard data base for large FBR nuclear core design. In addition, it is found that the application of the updated data base has a possibility to considerably improve the prediction accuracy of neutronic parameters for MONJU. (authors)

  5. Initial testing of a neutron activation analysis system by analysing standard reference materials

    International Nuclear Information System (INIS)

    Suhaimi Hamzah; Roslan Idris; Abdul Khalik Haji Wood; Che Seman Mahmood; Abdul Rahim Mohamad Noor.

    1983-01-01

    This paper describes the data acquisition and processing system in our laboratories (ND6600), the methods of activation analysis and the results obtained from our analysis of IAEA and NBS standard reference materials (IAEA SL-1 lake sediment and NBS coal ash 1632a). These standards were analysed in order to check the capability of the system, which was designed in such a way as to enable the user to independently collect and process data from multiple radiation detectors. (author)

  6. Discussion on the Standardization of Shielding Materials — Sensitivity Analysis of Material Compositions

    Directory of Open Access Journals (Sweden)

    Ogata Tomohiro

    2017-01-01

    An overview of standardization activities for shielding materials is given. We propose a basic approach for standardizing the material compositions used in radiation shielding design for nuclear and accelerator facilities. We have collected concrete composition data from actual concrete samples to derive a representative composition and its variance. A sensitivity analysis of the composition variance has then been performed through a simple 1-D dose calculation, of the kind sketched below. Recent findings from the analysis are summarized.
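
    A toy version of such a 1-D sensitivity check fits in a few lines. In the Python sketch below, the slab thickness, reference attenuation coefficient and perturbation sizes are invented, and simple exponential attenuation stands in for a real transport calculation; the point is how strongly a small composition-induced change in the attenuation coefficient propagates through a thick shield:

      # Sensitivity of transmitted dose to perturbations of the attenuation
      # coefficient in exponential attenuation through a 100 cm slab.
      import math

      thickness = 100.0           # slab thickness, cm (illustrative)
      mu_ref = 0.060              # reference attenuation coefficient, 1/cm

      base = math.exp(-mu_ref * thickness)
      for dmu in (-5, 0, 5):      # percent change in mu from composition
          mu = mu_ref * (1 + dmu / 100)
          rel = math.exp(-mu * thickness) / base
          print(f"mu {dmu:+d}% -> relative transmitted dose {rel:.2f}")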

  7. VLBI FOR GRAVITY PROBE B. IV. A NEW ASTROMETRIC ANALYSIS TECHNIQUE AND A COMPARISON WITH RESULTS FROM OTHER TECHNIQUES

    International Nuclear Information System (INIS)

    Lebach, D. E.; Ratner, M. I.; Shapiro, I. I.; Bartel, N.; Bietenholz, M. F.; Lederman, J. I.; Ransom, R. R.; Campbell, R. M.; Gordon, D.; Lestrade, J.-F.

    2012-01-01

    When very long baseline interferometry (VLBI) observations are used to determine the position or motion of a radio source relative to reference sources nearby on the sky, the astrometric information is usually obtained via (1) phase-referenced maps or (2) parametric model fits to measured fringe phases or multiband delays. In this paper, we describe a 'merged' analysis technique which combines some of the most important advantages of these other two approaches. In particular, our merged technique combines the superior model-correction capabilities of parametric model fits with the ability of phase-referenced maps to yield astrometric measurements of sources that are too weak to be used in parametric model fits. We compare the results from this merged technique with the results from phase-referenced maps and from parametric model fits in the analysis of astrometric VLBI observations of the radio-bright star IM Pegasi (HR 8703) and the radio source B2252+172 nearby on the sky. In these studies we use central-core components of radio sources 3C 454.3 and B2250+194 as our positional references. We obtain astrometric results for IM Peg with our merged technique even when the source is too weak to be used in parametric model fits, and we find that our merged technique yields astrometric results superior to the phase-referenced mapping technique. We used our merged technique to estimate the proper motion and other astrometric parameters of IM Peg in support of the NASA/Stanford Gravity Probe B mission.

  8. Evaluation of the laser-induced breakdown spectroscopy technique for determination of the chemical composition of copper concentrates

    Science.gov (United States)

    Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Drzymała, Jan; Abramski, Krzysztof M.

    2014-07-01

    Laser-induced breakdown spectroscopy (LIBS), like many other spectroscopic techniques, is a comparative method. Typically, in quantitative analysis, a synthetic certified standard with a well-known elemental composition is used to calibrate the system. Nevertheless, in all laser-induced techniques, such calibration can affect the accuracy through differences in the overall composition of the chosen standard. There are also some intermediate factors which can cause imprecision in measurements, such as optical absorption, surface structure and thermal conductivity. In this work, the calibration of the LIBS technique was performed using pellets made directly from the tested materials (old, well-characterized samples). This choice produces a considerable improvement in the accuracy of the method. The technique was adopted for the determination of trace elements in industrial copper concentrates, standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for three elements: silver, cobalt and vanadium. We also propose a method of post-processing the measurement data to minimize matrix effects and permit reliable analysis. It has been shown that the described technique can be used in qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates. It was noted that the final validation of such a methodology is limited mainly by the accuracy of the characterization of the standards.

  9. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.

  10. Gas chromatography/ion trap mass spectrometry applied for the analysis of triazine herbicides in environmental waters by an isotope dilution technique

    International Nuclear Information System (INIS)

    Cai Zongwei; Wang Dongli; Ma, W.T.

    2004-01-01

    A gas chromatography/ion trap mass spectrometry method was developed for the analysis of simazine, atrazine and cyanazine, as well as degradation products of atrazine such as deethylatrazine and deisopropylatrazine, in environmental water samples. An isotope dilution technique was applied for the quantitative analysis of atrazine in water at low ng/l levels. One liter of water sample spiked with the stable-isotope internal standard atrazine-d5 was extracted with a C18 solid-phase extraction cartridge. The analysis was performed on an ion trap mass spectrometer operated in MS/MS mode. The extraction recoveries were in the range of 83-94% for the triazine herbicides in water at concentrations of 24, 200, and 1000 ng/l, while poor recoveries were obtained for the degradation products of atrazine. The relative standard deviations (R.S.D.) were within the range of 3.2-16.1%. The detection limits of the method were between 0.75 and 12 ng/l when 1 l of water was analyzed. The method was successfully applied to environmental water samples collected from a reservoir and a river in Hong Kong, with atrazine detected at concentrations between 3.4 and 26 ng/l.
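
    The isotope-dilution arithmetic itself is brief. In the hedged Python sketch below (peak areas, the response factor and the spike amount are invented), the analyte is quantified from its peak-area ratio to the deuterated internal standard spiked into each litre of water:

      # Isotope-dilution quantitation of atrazine against atrazine-d5.
      area_analyte = 15400.0      # MS/MS peak area of atrazine
      area_istd = 21000.0         # peak area of atrazine-d5
      spike_ng = 100.0            # ng of atrazine-d5 spiked into 1 L of water
      rrf = 0.95                  # relative response factor from calibration

      conc_ng_per_l = (area_analyte / area_istd) * spike_ng / rrf
      print(f"atrazine: {conc_ng_per_l:.1f} ng/L")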

  11. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  12. Large scale applicability of a Fully Adaptive Non-Intrusive Spectral Projection technique: Sensitivity and uncertainty analysis of a transient

    International Nuclear Information System (INIS)

    Perkó, Zoltán; Lathouwers, Danny; Kloosterman, Jan Leen; Hagen, Tim van der

    2014-01-01

    ... methods, being superior to standard Monte Carlo sampling both in terms of accuracy and computational cost. This demonstrates that such PC techniques can provide a viable alternative to random sampling even for larger-scale systems, which is especially appealing for the sensitivity and uncertainty (S and U) analysis of problems using the legacy codes common in the nuclear field.
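
    The mechanics of a non-intrusive spectral projection are easy to show in one dimension. The Python sketch below uses a cheap analytic "model" with a single standard-normal input as a stand-in for an expensive legacy code, and the chaos order and node count are arbitrary; it computes Hermite-chaos coefficients by Gauss-Hermite quadrature and recovers the output mean and variance from a dozen model evaluations, which plain Monte Carlo matches only with far more runs:

      # 1-D non-intrusive spectral projection on a probabilists' Hermite basis.
      import numpy as np
      from math import factorial
      from numpy.polynomial.hermite_e import hermegauss, hermeval

      model = lambda x: np.exp(0.3 * x)        # stand-in for an expensive code

      order, nq = 6, 12
      nodes, weights = hermegauss(nq)          # weight function exp(-x^2/2)
      weights = weights / np.sqrt(2 * np.pi)   # normalize to the Gaussian pdf

      coeffs = []
      for k in range(order + 1):
          he_k = hermeval(nodes, [0.0] * k + [1.0])   # He_k at the nodes
          coeffs.append(np.sum(weights * model(nodes) * he_k) / factorial(k))

      mean_pc = coeffs[0]                      # uses E[He_k^2] = k!
      var_pc = sum(c * c * factorial(k) for k, c in enumerate(coeffs) if k)

      mc = model(np.random.default_rng(0).normal(size=20000))  # brute force
      print("PC mean/var:", round(mean_pc, 4), round(var_pc, 4))
      print("MC mean/var:", round(mc.mean(), 4), round(mc.var(), 4))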

  13. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect of guaranteeing the assessment of the student's competence level. The aim was to conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), comprising 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). Participants were 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis considered face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance; these factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients.
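
    For the internal-consistency figure quoted above, the standard computation is Cronbach's alpha over the students-by-items score matrix. The Python sketch below uses random placeholder scores (which is why the resulting alpha is near zero rather than the >0.80 reported) and shows the calculation for 27 items scored 0/1/2:

      # Cronbach's alpha for a students-by-items matrix of 0/1/2 scores.
      import numpy as np

      rng = np.random.default_rng(3)
      scores = rng.integers(0, 3, size=(499, 27)).astype(float)

      k = scores.shape[1]
      item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
      total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
      alpha = k / (k - 1) * (1 - item_var / total_var)
      print(f"Cronbach's alpha = {alpha:.2f}")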

  14. Accuracy of molecular biology techniques for the diagnosis of Strongyloides stercoralis infection-A systematic review and meta-analysis.

    Science.gov (United States)

    Buonfrate, Dora; Requena-Mendez, Ana; Angheben, Andrea; Cinquini, Michela; Cruciani, Mario; Fittipaldo, Andrea; Giorli, Giovanni; Gobbi, Federico; Piubelli, Chiara; Bisoffi, Zeno

    2018-02-01

    Strongyloides stercoralis infection is a neglected tropical disease which can lead to severe symptoms and even death in immunosuppressed people. Unfortunately, its diagnosis is hampered by the lack of a gold standard, as the sensitivity of traditional parasitological tests (including microscopic examination of stool samples and coproculture) is low. Hence, alternative diagnostic methods, such as molecular biology techniques (mostly polymerase chain reaction, PCR), have been implemented. However, there are discrepancies in the reported accuracy of PCR. A systematic review with meta-analysis was conducted in order to evaluate the accuracy of PCR for the diagnosis of S. stercoralis infection. The protocol was registered with the PROSPERO International Prospective Register of Systematic Reviews (record CRD42016054298). Fourteen studies, 12 of which evaluated real-time PCR, were included in the analysis. The specificity of the techniques was high (ranging from 93 to 95%, according to the reference test(s) used). When all molecular techniques were compared to parasitological methods, the sensitivity of PCR was assessed at 71.8% (95% CI 52.2-85.5), which decreased to 61.8% (95% CI 42.0-78.4) when serology was added among the reference tests. Similarly, the sensitivity of real-time PCR was 64.4% (95% CI 46.2-77.7) when compared to parasitological methods only, and 56.5% (95% CI 39.2-72.4) when serology was included. PCR might not be suitable for screening purposes, whereas it might have a role as a confirmatory test.

  15. Application of optimal estimation techniques to FFTF decay heat removal analysis

    International Nuclear Information System (INIS)

    Nutt, W.T.; Additon, S.L.; Parziale, E.A.

    1979-01-01

    The verification and adjustment of plant models for decay heat removal analysis using a mix of engineering judgment and formal techniques from control theory are discussed. The formal techniques facilitate dealing with typical test data which are noisy, redundant and do not measure all of the plant model state variables directly. Two pretest examples are presented. 5 refs

  16. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  17. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, those predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 μm size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 ms residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 ms. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  18. The development of a standard format for accelerator data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, S [Los Alamos National Lab., NM (USA)

    1990-08-01

    The purpose of specifying a standard file format is to facilitate the analysis of data sampled by accelerator beam-diagnostic instrumentation. The format's design needs to be flexible enough to allow storage of information from disparate diagnostic devices placed in the beam line. The goal of this project was to establish a standard file layout and syntax that can be generated and 'understood' by a large set of applications running on the control and data-analysis computers at LAMPF, as well as applications on personal computers. Only one file-parsing algorithm is needed for all computing systems. Once a consensus on the file syntax has been established, it is a straightforward process to code a parser for both the control computer and PCs. This paper describes the file format and the method used to integrate the format into existing diagnostic and control software. (orig.).

  19. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain, and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many active international projects of the research group, particularly in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  20. Techniques to extract physical modes in model-independent analysis of rings

    International Nuclear Information System (INIS)

    Wang, C.-X.

    2004-01-01

    A basic goal of Model-Independent Analysis is to extract the physical modes underlying the beam histories collected at a large number of beam position monitors so that beam dynamics and machine properties can be deduced independent of specific machine models. Here we discuss techniques to achieve this goal, especially the Principal Component Analysis and the Independent Component Analysis.
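
    The PCA step mentioned above is, at bottom, a singular value decomposition of the turns-by-BPMs history matrix. The Python sketch below simulates a single betatron oscillation plus noise in place of real beam histories (the tune and BPM count are invented) and shows how a physical mode surfaces as a degenerate pair of dominant singular values:

      # SVD of simulated BPM histories: one betatron mode -> two leading modes.
      import numpy as np

      turns, nbpm, tune = 1024, 40, 0.31
      rng = np.random.default_rng(7)
      phase = np.linspace(0, 2 * np.pi, nbpm, endpoint=False)  # BPM phases

      t = np.arange(turns)[:, None]
      B = np.cos(2 * np.pi * tune * t + phase)                 # betatron motion
      B += 0.05 * rng.normal(size=(turns, nbpm))               # monitor noise

      U, s, Vt = np.linalg.svd(B - B.mean(axis=0), full_matrices=False)
      # The cos/sin pair of the mode dominates; rows of Vt hold the spatial
      # patterns, columns of U the temporal ones.
      print("leading singular values:", np.round(s[:4], 1))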

  1. Astrophysical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kitchin, C R

    1984-01-01

    The subject is covered in chapters, entitled: detectors (optical and infrared detection; radio and microwave detection; X-ray and gamma-ray detection; cosmic ray detectors; neutrino detectors; gravitational radiation); imaging (photography; electronic imaging; scanning; interferometry; speckle interferometry; occultations; radar); photometry and photometers; spectroscopy and spectroscopes; other techniques (astrometry; polarimetry; solar studies; magnetometry). Appendices: magnitudes and spectral types of bright stars; north polar sequence; standard stars for the UBV photometric system; standard stars for the UVBY photometric system; standard stars for MK spectral types; standard stars for polarimetry; Julian date; catalogues; answers to the exercises.

  2. Environmental protection standards - from the point of view of systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Becker, K

    1978-11-01

    A project of the International Institute for Applied Systems Analysis (IIASA) at Laxenburg castle near Vienna is reviewed, in which standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors.

  4. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

    We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Høegh-Krohn and T. Lindstrøm. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2, and on the (φ⁴)_d-model of interacting quantum fields. (orig.)

  5. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  6. Human factors assessment in PRA using task analysis linked evaluation technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1990-01-01

    Human error is a primary contributor to risk in complex high-reliability systems. A 1985 U.S. Nuclear Regulatory Commission (USNRC) study of licensee event reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Since then, the USNRC has initiated research to fully and properly integrate human errors into the probabilistic risk assessment (PRA) process. The resulting implementation procedure is known as the Task Analysis Linked Evaluation Technique (TALENT). As indicated, TALENT is a broad-based method for integrating human factors expertise into the PRA process. This process achieves results which: (1) provide more realistic estimates of the impact of human performance on nuclear power safety, (2) can be fully audited, (3) provide a firm technical base for equipment-centered and personnel-centered retrofit/redesign of plants enabling them to meet internally and externally imposed safety standards, and (4) yield human and hardware data capable of supporting inquiries into human performance issues that transcend the individual plant. The TALENT procedure is being field-tested to verify its effectiveness and utility. The objectives of the field-test are to examine (1) the operability of the process, (2) its acceptability to the users, and (3) its usefulness for achieving measurable improvements in the credibility of the analysis. The field-test will provide the information needed to enhance the TALENT process

  7. Comparative study of macrotexture analysis using X-ray diffraction and electron backscattered diffraction techniques

    International Nuclear Information System (INIS)

    Serna, Marilene Morelli

    2002-01-01

    Macrotexture is one of the main characteristics of metallic materials whose physical properties depend on crystallographic direction. Until the mid-1980s, macrotexture analysis was accomplished only by the X-ray diffraction and neutron diffraction techniques. The possibility of analyzing macrotexture using the electron backscatter diffraction technique in the scanning electron microscope, which allows the measured orientation to be correlated with its location in the microstructure, was a very welcome tool in the materials engineering field. In this work, the theoretical aspects of the two techniques were studied, and both were used to analyze the macrotexture of 1050 and 3003 aluminum sheets with intensities, measured through the texture index 'J', from 2.00 to 5.00. The results obtained by the two techniques were reasonably similar, bearing in mind that the statistics of the data obtained by the electron backscatter diffraction technique are much inferior to those obtained by X-ray diffraction. (author)

  8. Study of the standard direct costs of various techniques of advanced endoscopy. Comparison with surgical alternatives.

    Science.gov (United States)

    Loras, Carme; Mayor, Vicenç; Fernández-Bañares, Fernando; Esteve, Maria

    2018-03-12

    The complexity of endoscopy has brought an increase in cost that has a direct effect on healthcare systems. However, few studies have analyzed the cost of advanced endoscopic procedures (AEP). Our aim was to calculate the standard direct costs of AEP and to make a financial comparison with their surgical alternatives. The standard direct cost of carrying out each procedure was calculated. An endoscopist detailed the time, personnel, materials, consumables, recovery room time, stents, pathology and medication used. The cost of surgical procedures was the average cost recorded in the hospital. Thirty-eight AEP were analyzed. The technique with the lowest cost was gastroscopy + APC (€116.57), while the most expensive was ERCP with cholangioscopy + stent placement (€5083.65). Some 34.2% of the procedures had average costs of €1000-2000. In 57% of cases, the endoscopic alternative was 2-5 times more cost-efficient than surgery; in 31% of cases it was indistinguishable or up to 1.4 times more costly. The standard direct cost of the majority of AEP is reported using a methodology that enables easy application in other centers. For the most part, endoscopic procedures are more cost-efficient than the corresponding surgical procedures. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  9. Body composition analysis techniques in adult and pediatric patients: how reliable are they? How useful are they clinically?

    Science.gov (United States)

    Woodrow, Graham

    2007-06-01

    Complex abnormalities of body composition occur in peritoneal dialysis (PD). These abnormalities reflect changes in hydration, nutrition, and body fat, and they are of major clinical significance. Clinical assessment of these body compartments is insensitive and inaccurate. Frequently, simultaneous changes of hydration, wasting, and body fat content can occur, confounding clinical assessment of each component. Body composition can be described by models of varying complexity that use one or more measurement techniques. "Gold standard" methods provide accurate and precise data, but are not practical for routine clinical use. Dual energy X-ray absorptiometry allows for measurement of regional as well as whole-body composition, which can provide further information of clinical relevance. Simpler techniques such as anthropometry and bioelectrical impedance analysis are suited to routine use in clinic or at the bedside, but may be less accurate. Body composition methodology sometimes makes assumptions regarding relationships between components, particularly in regard to hydration, which may be invalid in pathologic states. Uncritical application of these methods to the PD patient may result in erroneous interpretation of results. Understanding the foundations and limitations of body composition techniques allows for optimal application in clinical practice.

  10. Elemental analysis of human placenta by neutron irradiation and gamma-ray spectrometry (standard, prompt and fast-neutron)

    International Nuclear Information System (INIS)

    Ward, N.I.

    1987-01-01

    Human placental tissue from 100 hospitalized deliveries was analysed for Ag, Al, As, Au, B, Ba, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, F, Fe, I, Hg, K, La, Mg, Mn, Mo, Na, Ni, Rb, S, Sb, Sc, Se, Sn, Sr, Ti, V, W and Zn using INAA combined with chemical separation of sodium with hydrated antimony pentoxide. Boron and Si values were determined using prompt gamma-ray and fast-neutron techniques, respectively. Analysis of NBS-SRM Bovine Liver 1577 and a 'pooled standard' placental tissue for 33 elements showed good agreement for most elements. Only Cd (negative) and Zn (positive) showed statistically significant correlations with birth weight, gestational age and placental weight. (author) 54 refs.; 3 tables

  11. Performance Analysis of a Utility Helicopter with Standard and Advanced Rotors

    National Research Council Canada - National Science Library

    Yeo, Hyeonsoo; Bousman, William G; Johnson, Wayne

    2002-01-01

    Flight test measurements of the performance of the UH-60 Black Hawk helicopter with both standard and advanced rotors are compared with calculations obtained using the comprehensive helicopter analysis CAMRAD II...

  12. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs
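    A minimal sketch of the face-profiling bookkeeping described above, assuming one thresholded binary image per grinding step; the image stack here is synthetic, whereas real input would be segmented metallographs:

```python
# Minimal sketch of the face-profiling idea: given one binary image per
# successive grinding step (True = corroded pixel), report the corroded
# area fraction as a function of depth. The image stack here is synthetic;
# in practice it would come from thresholded metallographs.
import numpy as np

rng = np.random.default_rng(0)
depth_step_um = 25.0                      # assumed grinding increment
stack = [rng.random((512, 512)) < p       # corrosion shrinking with depth
         for p in (0.20, 0.12, 0.05, 0.01, 0.0)]

for i, img in enumerate(stack):
    frac = img.mean()                     # fraction of corroded pixels
    print(f"depth {i * depth_step_um:6.1f} um: "
          f"corroded area fraction = {frac:.3f}")
```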

  13. Advanced analytical techniques for boiling water reactor chemistry control

    Energy Technology Data Exchange (ETDEWEB)

    Alder, H P; Schenker, E [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-02-01

    The analytical techniques applied can be divided into five classes: OFF-LINE (discontinuous, central laboratory), AT-LINE (discontinuous, analysis near the loop), ON-LINE (continuous, analysis in a bypass), IN-LINE (continuous, flow disturbance) and NON-INVASIVE (continuous, no flow disturbance). In the first three cases the pressure and temperature of the water sample are reduced; in a strict sense, only the in-line and non-invasive techniques are suitable for direct process control, the ultimate goal. An overview of the analytical techniques tested in the pilot loop is given. Apart from process and overall water-quality control, standard for BWR operation, the main emphasis is on water impurity characterization (crud particles, hot filtration, organic carbon), on stress corrosion cracking control for materials (corrosion potential, oxygen concentration) and on the characterization of the oxide layer on austenitic steels (impedance spectroscopy, IR reflection). The above-mentioned examples of advanced analytical techniques have the potential for in-line or non-invasive application. They are at different stages of development and are described in more detail. 28 refs, 1 fig., 5 tabs.

  14. Analysis of Standards Efficiency in Digital Television Via Satellite at Ku and Ka Bands

    Directory of Open Access Journals (Sweden)

    Landeros-Ayala Salvador

    2013-06-01

    In this paper, an analysis of the main technical features of digital television standards for satellite transmission is carried out. Based on simulations and link budget analysis, the standard with the best operational performance is identified, and a comparative efficiency analysis is conducted for the Ku and Ka bands, for both transparent and regenerative transponders, in terms of power, bandwidth, information rate and link margin, under clear sky, uplink rain, downlink rain and rain on both links.
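    As an illustration of the link-budget arithmetic such a comparison rests on, here is a minimal downlink C/N sketch; the EIRP, G/T, rain margins and other figures are invented placeholders, not values from the paper:

```python
# Illustrative downlink budget of the kind used to compare Ku- and Ka-band
# performance: C/N0 = EIRP - FSL - rain loss + G/T - 10*log10(k).
# All numbers are made-up placeholders, not values from the paper.
import math

BOLTZMANN_DBW = -228.6          # 10*log10(Boltzmann constant), dBW/K/Hz

def free_space_loss_db(freq_ghz: float, distance_km: float) -> float:
    """Free-space path loss: 92.45 + 20*log10(f_GHz) + 20*log10(d_km)."""
    return 92.45 + 20 * math.log10(freq_ghz) + 20 * math.log10(distance_km)

def cn_db(eirp_dbw, freq_ghz, distance_km, rain_loss_db, gt_dbk, bandwidth_hz):
    fsl = free_space_loss_db(freq_ghz, distance_km)
    cn0 = eirp_dbw - fsl - rain_loss_db + gt_dbk - BOLTZMANN_DBW
    return cn0 - 10 * math.log10(bandwidth_hz)

# Same link at Ku and Ka band, with a heavier assumed rain margin at Ka.
for band, f, rain in (("Ku", 12.0, 3.0), ("Ka", 20.0, 10.0)):
    print(band, f"C/N = {cn_db(52.0, f, 38000.0, rain, 30.0, 36e6):.1f} dB")
```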

  15. Performance evaluation using bootstrapping DEA techniques: Evidence from industry ratio analysis

    OpenAIRE

    Halkos, George; Tzeremes, Nickolaos

    2010-01-01

    In the Data Envelopment Analysis (DEA) context, financial data/ratios have been used to produce a unified performance measure. However, several scholars have indicated that the inclusion of financial ratios creates biased efficiency estimates, with implications for the evaluation of firms' and industries' performance. There have been several DEA formulations and techniques dealing with this problem, including sensitivity analysis, Prior-Ratio-Analysis and DEA/output-input ratio analysis ...
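    For orientation, a sketch of the plain (non-bootstrapped) input-oriented CCR DEA model that underlies such efficiency studies, solved as one linear program per decision-making unit; the data are invented, and the paper's bootstrap correction is not reproduced here:

```python
# Hedged sketch of the input-oriented CCR DEA envelopment model:
# min theta s.t. sum_j lam_j*x_j <= theta*x_o, sum_j lam_j*y_j >= y_o.
# Inputs/outputs below are toy data, not the study's financial ratios.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 5.0, 4.0],      # inputs: rows = input, cols = DMU
              [3.0, 1.0, 4.0, 2.0]])
Y = np.array([[1.0, 1.0, 2.0, 1.5]])     # outputs: rows = output

n = X.shape[1]
for o in range(n):
    # decision variables: [theta, lambda_1 .. lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[:, [o]], X])                    # lam*x <= theta*x_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # lam*y >= y_o
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] + [(0, None)] * n)
    print(f"DMU {o + 1}: efficiency = {res.x[0]:.3f}")
```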

  16. Standard Test Method for Application and Analysis of Helium Accumulation Fluence Monitors for Reactor Vessel Surveillance, E706 (IIIC)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method describes the concept and use of helium accumulation for neutron fluence dosimetry for reactor vessel surveillance. Although this test method is directed toward applications in vessel surveillance, the concepts and techniques are equally applicable to the general field of neutron dosimetry. The various applications of this test method for reactor vessel surveillance are as follows: 1.1.1 Helium accumulation fluence monitor (HAFM) capsules, 1.1.2 Unencapsulated, or cadmium or gadolinium covered, radiometric monitors (RM) and HAFM wires for helium analysis, 1.1.3 Charpy test block samples for helium accumulation, and 1.1.4 Reactor vessel (RV) wall samples for helium accumulation. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.
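    The principle behind helium accumulation dosimetry reduces, in its simplest form, to dividing the measured helium concentration by an effective (n,alpha) cross section; a hedged sketch with illustrative values only (real practice involves spectrum-dependent corrections this omits):

```python
# Hedged sketch of the HAFM principle: the neutron fluence follows from the
# measured helium concentration divided by an effective (n,alpha) cross
# section, Phi = c_He / sigma_eff. Values below are illustrative only.
BARN_CM2 = 1e-24

def fluence(helium_atoms_per_target_atom: float, sigma_eff_barn: float) -> float:
    """Neutron fluence (n/cm^2) from the helium concentration (atomic fraction)."""
    return helium_atoms_per_target_atom / (sigma_eff_barn * BARN_CM2)

c_he = 5.0e-6          # 5 appm helium measured in a boron-bearing HAFM (assumed)
sigma = 600.0          # assumed spectrum-averaged (n,alpha) cross section, barn
print(f"fluence ~ {fluence(c_he, sigma):.2e} n/cm^2")
```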

  17. Standard practice for leaks using bubble emission techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This practice describes accepted procedures for and factors that influence laboratory immersion corrosion tests, particularly mass loss tests. These factors include specimen preparation, apparatus, test conditions, methods of cleaning specimens, evaluation of results, and calculation and reporting of corrosion rates. This practice also emphasizes the importance of recording all pertinent data and provides a checklist for reporting test data. Other ASTM procedures for laboratory corrosion tests are tabulated in the Appendix. (Warning-In many cases the corrosion product on the reactive metals titanium and zirconium is a hard and tightly bonded oxide that defies removal by chemical or ordinary mechanical means. In many such cases, corrosion rates are established by mass gain rather than mass loss.) 1.2 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only. This standard does not purport to address all of the safety concerns, if any, assoc...
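    Since this abstract concerns mass-loss immersion corrosion tests, a sketch of the corrosion-rate calculation such practices standardize may help; this follows the familiar ASTM G1-style relation CR = K·W/(A·t·rho), with the specimen values invented:

```python
# Hedged sketch of the mass-loss corrosion rate calculation, in the
# ASTM G1 style CR = K*W/(A*t*rho); the constant K sets the output units.
# Specimen values below are illustrative.
K_MM_PER_YEAR = 8.76e4      # gives mm/year for g, cm^2, hours, g/cm^3

def corrosion_rate(mass_loss_g, area_cm2, hours, density_g_cm3):
    return K_MM_PER_YEAR * mass_loss_g / (area_cm2 * hours * density_g_cm3)

# e.g. a steel coupon: 0.050 g lost over 168 h on 20 cm^2 of exposed area
print(f"{corrosion_rate(0.050, 20.0, 168.0, 7.86):.4f} mm/year")
```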

  18. The standard laboratory module approach to automation of the chemical laboratory

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.H.

    1993-01-01

    Automation of the technology and practice of the environmental laboratory has not been as rapid or complete as one might expect. Confined to autosamplers and limited robotic systems, our ability to apply production concepts to environmental analysis is not great. With the impending remediation of hazardous waste sites in the US, only the application of production chemistry techniques will even begin to provide those responsible with the knowledge necessary to accomplish the cleanup expeditiously and safely. Tightening regulatory requirements have already mandated staggering increases in sampling and characterization needs, with the future only guaranteeing greater demands. The Contaminant Analysis Automation Program has been initiated by the US government to address these current and future characterization needs through a new robotic paradigm for analytical chemistry. By using standardized modular instruments, named Standard Laboratory Modules, flexible automation systems can rapidly be configured to apply production techniques to the nation's environmental problems on-site

  19. Application of the neutron activation analysis technique in trace elements analysis

    International Nuclear Information System (INIS)

    Khamis, I.; Sarheel, A.; Al-Somel, N.

    2006-12-01

    The main objective of this study is the implementation of the k0-standardization method (single comparator method) with a gold comparator as a routine method in the neutron activation analysis laboratory of the Nuclear Engineering Department. The cadmium ratio, R_Cd = A_sp/(A_sp)_Cd, and the reactor spectrum parameters (f = φ_th/φ_epi, the subcadmium thermal-to-epithermal neutron flux ratio, and α, describing the φ_e(E) ∼ 1/E^(1+α) epithermal neutron flux distribution) were determined in the inner and outer irradiation sites of the MNSR. The k0-IAEA software provided by the Agency has been installed and applied in our laboratory. Trace elements in many kinds of samples (biological, environmental, alloy, etc.) were determined using the k0-IAEA software. The results obtained for standard reference materials (SRMs) in this work show good agreement with the certified values, with an accuracy close to that of the results obtained by the relative NAA method. (author)
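    The concentration relation behind this single-comparator scheme can be sketched as follows. The Q0(α) correction and the Au constants (Q0 = 15.7, effective resonance energy 5.65 eV) are standard k0 ingredients; the numeric sample inputs are invented, and the sketch deliberately omits further corrections (e.g. Westcott g-factors, coincidence effects):

```python
# Hedged, simplified sketch of the k0-standardization relation with a gold
# comparator: analyte concentration follows from the specific count rates
# of sample and Au monitor, the tabulated k0 factor, the flux parameters
# f and alpha, and the detector efficiencies. Q0(alpha) corrects the
# resonance-integral-to-sigma0 ratio for the 1/E^(1+alpha) spectrum shape.
def q0_alpha(q0: float, e_res_ev: float, alpha: float) -> float:
    """Q0 corrected for a non-ideal epithermal spectrum (standard k0 formula)."""
    E_CD = 0.55  # cadmium cut-off energy, eV
    return (q0 - 0.429) / e_res_ev**alpha + 0.429 / ((2 * alpha + 1) * E_CD**alpha)

def concentration(asp_analyte, asp_au, k0, f, alpha,
                  q0_analyte, er_analyte, q0_au=15.7, er_au=5.65,
                  eff_au=1.0, eff_analyte=1.0):
    """Analyte concentration relative to the Au comparator (arbitrary units)."""
    g = (f + q0_alpha(q0_au, er_au, alpha)) / (f + q0_alpha(q0_analyte, er_analyte, alpha))
    return (asp_analyte / asp_au) / k0 * g * (eff_au / eff_analyte)

# Example with hypothetical specific count rates and channel parameters.
print(concentration(asp_analyte=1.2e3, asp_au=4.0e5, k0=0.85,
                    f=20.0, alpha=0.02, q0_analyte=12.0, er_analyte=100.0))
```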

  20. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. An easier approach was made possible by using the MathCAD program as the programming tool, which is simple yet powerful enough to perform the calculation, plotting and file transfer. (Author)
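    A minimal numerical Abel inversion of the kind used here, recovering a radial profile f(r) from a projected profile F(y) via f(r) = -(1/π) ∫ F'(y)/sqrt(y²-r²) dy; the projected profile below is a synthetic Gaussian test case, not fringe-shift data from the experiment:

```python
# Minimal numerical Abel inversion sketch: recover the radial profile from
# a line-of-sight projection, as needed to turn fringe-shift data into a
# radial refractive-index (hence pressure) profile. Synthetic test data.
import numpy as np

def abel_invert(F, y):
    """Invert projected profile F(y) on a uniform grid y (y[0] = 0)."""
    dy = y[1] - y[0]
    dF = np.gradient(F, dy)
    f = np.zeros_like(F)
    for i, r in enumerate(y[:-1]):
        yy = y[i + 1:]                      # skip the singular point y = r
        f[i] = -np.trapz(dF[i + 1:] / np.sqrt(yy**2 - r**2), yy) / np.pi
    return f

# Gaussian test object: f(r) = exp(-r^2) projects to F(y) = sqrt(pi)*exp(-y^2).
y = np.linspace(0.0, 4.0, 400)
F = np.sqrt(np.pi) * np.exp(-y**2)
f = abel_invert(F, y)
print("max error vs exp(-r^2):", np.max(np.abs(f[:300] - np.exp(-y[:300]**2))))
```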

  1. Functional Group and Structural Characterization of Unmodified and Functionalized Lignin by Titration, Elemental Analysis, 1H NMR and FTIR Techniques

    Directory of Open Access Journals (Sweden)

    Ramin Bairami Habashi

    2017-11-01

    Lignin is the second most abundant polymer in the world after cellulose. Characterization of the structure and functional groups of lignin, in order to assess its potential applications in various technical fields, has therefore become a necessity. One of the major problems related to the characterization of lignin is the lack of well-defined protocols and standards. In this paper, systematic studies have been carried out to characterize the structure and functional groups of lignin quantitatively using different techniques such as elemental analysis, titration, 1H NMR and FTIR. Lignin was obtained as black liquor from Choka Paper Factory and purified before any test. The lignin was reacted with α-bromoisobutyryl bromide to quantify its hydroxyl and methoxyl content. Using 1H NMR spectroscopy on the α-bromoisobutyrylated lignin (BiBL) in the presence of a given amount of N,N-dimethylformamide (DMF) as an internal standard, the numbers of moles of hydroxyl and methoxyl groups per gram of lignin were found to be 6.44 mmol/g and 6.64 mmol/g, respectively. Using aqueous titration, the numbers of moles of phenolic hydroxyl groups and carboxyl groups of the lignin were calculated as 3.13 mmol/g and 2.84 mmol/g, respectively. The findings obtained by 1H NMR and elemental analysis indicated a phenylpropane (C9) structural unit for the lignin of the form C9(H_aliphatic)3.84(H_aromatic)2.19S0.2O0.8(OH)1.38(OCH3)1.42. Due to the poor solubility of the lignin in tetrahydrofuran (THF), acetylated lignin was used in the GPC analysis, by which the number-average molecular weight (Mn) of the lignin was calculated as 992 g/mol.
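    The internal-standard arithmetic behind such 1H NMR quantification is simple: moles of a functional group follow from integral ratios normalized by the number of protons each signal represents. A hedged sketch with invented integrals and masses (not the paper's data):

```python
# Hedged sketch of 1H NMR quantification against a DMF internal standard.
# All integrals and masses below are invented for illustration.
MW_DMF = 73.09  # g/mol

def mmol_per_gram(integral_group, protons_group,
                  integral_dmf, protons_dmf,
                  mass_dmf_mg, mass_lignin_mg):
    n_dmf_mmol = mass_dmf_mg / MW_DMF
    ratio = (integral_group / protons_group) / (integral_dmf / protons_dmf)
    return ratio * n_dmf_mmol / (mass_lignin_mg / 1000.0)

# e.g. the isobutyryl methyl signal (6 H per esterified OH) referenced to
# the DMF formyl proton (1 H); integrals are arbitrary-unit placeholders.
print(mmol_per_gram(integral_group=3.2, protons_group=6,
                    integral_dmf=1.0, protons_dmf=1,
                    mass_dmf_mg=10.0, mass_lignin_mg=20.0))
```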

  2. A dedicated on-line system for the preparation and validation of standard beads in XRF analysis

    International Nuclear Information System (INIS)

    Yamamoto, Yasuyuki; Ogasawara, Noriko; Nakata, Akio; Shoji, Shizuko.

    1995-01-01

    A dedicated on-line system for X-ray fluorescence (XRF) analysis with the glass-bead method was developed, in which the preparation of standard beads is automated, including the proper choice of reagents, assignment of bead compositions and validation of the prepared beads. This system features: (a) the fundamental parameter (FP) method for validation of standard beads; (b) an original database of high-purity reagents for standards; (c) automatic calculation of a suitable composition for each standard bead, given a concentration range for each element and the number of standard beads. (1) The calculation is based on random numbers and makes a random assignment of composition to each bead. (2) The calculation results are automatically stored in a computer as a condition file for quantitative analysis. (3) The amount of a material for a standard mixture is corrected if the valence or chemical compound of an analysis element differs from that of the standard material in the database. To realize these features, many high-purity reagents were examined for purity and other characteristics to test their suitability as standard materials, and software for the on-line processing was developed in-house. (author)
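    The random composition assignment in point (1) can be sketched in a few lines; the element set, ranges and output format are invented, not the system's actual condition-file format:

```python
# Hedged sketch of the standard-bead composition assignment described above:
# given a concentration range per element and a number of beads, draw a
# random composition for each bead. Element names and ranges are invented.
import random

random.seed(7)
ranges = {"SiO2": (40.0, 70.0), "CaO": (5.0, 25.0), "Fe2O3": (0.5, 10.0)}
n_beads = 4

beads = [{el: round(random.uniform(lo, hi), 2) for el, (lo, hi) in ranges.items()}
         for _ in range(n_beads)]
for i, comp in enumerate(beads, 1):
    print(f"standard bead {i}: {comp}")  # would be written to a condition file
```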

  3. Analysis of fresh fallout from Chinese tests by beta counting technique

    International Nuclear Information System (INIS)

    Mishra, U.C.; Lalit, B.Y.; Shukla, V.K.; Ramachandran, T.V.

    1979-01-01

    The paper describes the beta counting techniques used in the analysis of fresh radioactive fallout samples from nuclear weapon tests. Fresh fallout samples were collected by swiping the exposed portion of the engine covers of commercial aircraft arriving in Bombay from New York after the Chinese tests of September 26, 1976 and September 17, 1977. Activities of short-lived radionuclides such as Ag-111, Sr-89, Mo-99, U-237 and Np-239 were determined using these techniques. The results of this analysis are discussed briefly in relation to the kind of fissile material, the extent of thermonuclear reaction in the weapon and the mode of detonation. (orig.)
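    Assaying such short-lived nuclides requires correcting the counted activity back to collection time; a hedged sketch of that decay correction, with the counted activity and delay invented (the half-lives are literature values):

```python
# Hedged sketch of the decay correction needed when assaying short-lived
# fallout nuclides: activity at collection time from the counted activity
# and the delay between collection and counting. Input values are invented.
import math

HALF_LIVES_D = {"Ag-111": 7.45, "Mo-99": 2.75, "Np-239": 2.36}  # days

def activity_at_collection(counted_bq: float, nuclide: str, delay_d: float) -> float:
    lam = math.log(2) / HALF_LIVES_D[nuclide]
    return counted_bq * math.exp(lam * delay_d)   # back-correct for decay

print(f"{activity_at_collection(120.0, 'Np-239', 3.0):.1f} Bq at collection")
```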

  4. Piping benchmark problems for the Westinghouse AP600 Standardized Plant

    International Nuclear Information System (INIS)

    Bezler, P.; DeGrassi, G.; Braverman, J.; Wang, Y.K.

    1997-01-01

    To satisfy the need for verification of the computer programs and modeling techniques that will be used to perform the final piping analyses for the Westinghouse AP600 Standardized Plant, three benchmark problems were developed. The problems are representative piping systems subjected to representative dynamic loads, with solutions developed using the methods proposed for the AP600 standard design analyses. Combined license licensees will be required to demonstrate that their solutions to these problems are in agreement with the benchmark solutions

  5. Techniques for incorporating operator expertise into intelligent decision aids and training

    International Nuclear Information System (INIS)

    Blackman, H.S.; Nelson, W.R.

    1988-01-01

    An experiment is presented that was designed to investigate the use of protocol analysis during task performance as a knowledge engineering technique that provides a direct tie between knowledge and performance. The technique is described, and problem-solving strategies are presented that were found to correlate with optimal performance. The results indicate that protocol analysis adds a dimension to the more standard knowledge engineering approaches by providing a more complete picture of the expert's knowledge and a performance yardstick to identify the optimal problem-solving strategies. Implications for the developers of expert systems and training programs are discussed. (author)

  6. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of polychlorinated biphenyls (PCBs) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four-channel Soxhlet extractor, a high-volume concentrator, column clean-up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed.
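    The 'plug and play' idea can be sketched as a common module interface plus a host controller that drives an ordered chain of modules. The module names below mirror the PCB SAME, but the interface itself is an invented illustration, not the CAA program's actual API:

```python
# Hedged sketch of the SLM "plug and play" concept: every Standard
# Laboratory Module exposes the same behavior interface, and a Standard
# Analysis Method is an ordered chain of modules run by a host controller.
from abc import ABC, abstractmethod

class StandardLabModule(ABC):
    name: str = "SLM"
    @abstractmethod
    def process(self, sample: dict) -> dict: ...

class SoxhletExtractor(StandardLabModule):
    name = "four-channel Soxhlet extractor"
    def process(self, sample):
        return {**sample, "extract": True}

class GasChromatograph(StandardLabModule):
    name = "gas chromatograph"
    def process(self, sample):
        return {**sample, "chromatogram": [0.1, 0.7, 0.2]}  # fake peaks

def run_same(modules, sample):
    """Host controller: orchestrate the module chain on one sample."""
    for m in modules:
        sample = m.process(sample)
        print(f"completed: {m.name}")
    return sample

result = run_same([SoxhletExtractor(), GasChromatograph()], {"id": "soil-001"})
```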

  7. Multiparametric multidetector computed tomography scanning on suspicion of hyperacute ischemic stroke: validating a standardized protocol

    Directory of Open Access Journals (Sweden)

    Felipe Torres Pacheco

    2013-06-01

    Multidetector computed tomography (MDCT) scanning has enabled the early diagnosis of hyperacute brain ischemia. We aimed to validate a standardized protocol for reading and reporting MDCT techniques in a series of adult patients. The inter-observer agreement among trained examiners was tested, and their results were compared with a standard reading. No false positives were observed, and an almost perfect agreement (kappa > 0.81) was documented when the CT angiography (CTA) and cerebral perfusion CT (CPCT) map data were added to the noncontrast CT (NCCT) analysis. The inter-observer agreement was higher for highly trained readers, corroborating the need for specific training to interpret these modern techniques. The authors recommend adding CTA and CPCT to the NCCT analysis in order to clarify the global analysis of structural and hemodynamic brain abnormalities. Our structured report is suitable as a script for the reproducible analysis of the MDCT of patients with suspected ischemic stroke.
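    For reference, the agreement statistic cited above (kappa) reduces to a short computation; a hedged sketch of Cohen's kappa for two readers, with the ratings invented:

```python
# Hedged sketch of the inter-observer agreement statistic referenced above:
# Cohen's kappa for two readers' categorical calls on the same scans.
# The ratings below are invented placeholders.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

a = ["ischemia", "normal", "ischemia", "ischemia", "normal", "ischemia"]
b = ["ischemia", "normal", "ischemia", "normal",   "normal", "ischemia"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```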

  8. Analyses of the phosphate standard of the Egyptian Nuclear Materials Corp. (phosphate-1)

    Energy Technology Data Exchange (ETDEWEB)

    Aly, M M

    1986-01-01

    A new whole-rock phosphorite standard has been prepared at the Nuclear Materials Corporation (NMC), Egypt, as part of the Egyptian NMC program to prepare a series of rock standards. This program began in 1983 to fulfil the continuing demand for such standards in research laboratories. Statistical analysis of silica, phosphorus, calcium, iron and strontium in selected fractions of this rock showed the homogeneity of the sample. Conventional methods, as well as rapid analysis, inductively coupled plasma spectrometry, X-ray fluorescence, activation analysis, IR spectroscopy, atomic absorption and laser-induced fluorescence techniques, have been used to give a complete chemical characterization of the sample.

  9. Colombeau's generalized functions and non-standard analysis

    International Nuclear Information System (INIS)

    Todorov, T.D.

    1987-10-01

    Using some methods of non-standard analysis, we modify one of Colombeau's classes of generalized functions. As a result we define a class ε̂ of so-called meta-functions, which possesses all the good properties of Colombeau's generalized functions, i.e. (i) ε̂ is an associative and commutative algebra over the system of so-called complex meta-numbers Ĉ; (ii) every meta-function has partial derivatives of any order (which are meta-functions again); (iii) every meta-function is integrable on any compact set of R^n and the integral is a number from Ĉ; (iv) ε̂ contains all tempered distributions S', i.e. S' is contained in ε̂ isomorphically with respect to all linear operations (including differentiation). Thus, within the class ε̂ the problem of multiplication of tempered distributions is satisfactorily solved (every two distributions in S' have a well-defined product in ε̂). The crucial point is that Ĉ is a field, in contrast to the system of Colombeau's generalized numbers C̄, which is only a ring (C̄ is the counterpart of Ĉ in Colombeau's theory). In this way we simplify and slightly improve the properties of the integral and the notion of ''values of the meta-functions'', as well as the properties of the whole class ε̂ itself, compared with the original Colombeau theory. And, what is maybe more important, we clarify the connection between non-standard analysis and Colombeau's theory of new generalized functions, in the framework of which the problem of multiplication of distributions was recently solved. (author). 14 refs

  10. Platinum stable isotope analysis of geological standard reference materials by double-spike MC-ICPMS

    DEFF Research Database (Denmark)

    Creech, John Benjamin; Baker, J. A.; Handler, M. R.

    2014-01-01

    . Double-spiking of samples was carried out prior to digestion and chemical separation to correct for any mass-dependent fractionation that may occur due to incomplete recovery of Pt. Samples were digested using a NiS fire assay method, which pre-concentrates Pt into a metallic bead that is readily...... metal standard solution doped with a range of synthetic matrices and results in Pt yields of ≥90% with purity of ≥95%. Using this chemical separation technique, we have separated Pt from 11 international geological standard reference materials comprising of PGE ores, mantle rocks, igneous rocks and one...

  11. Tissue-Based MRI Intensity Standardization: Application to Multicentric Datasets

    Directory of Open Access Journals (Sweden)

    Nicolas Robitaille

    2012-01-01

    Intensity standardization in MRI aims at correcting scanner-dependent intensity variations. Existing simple and robust techniques aim at matching the input image histogram onto a standard, while we think that standardization should aim at matching spatially corresponding tissue intensities. In this study, we present a novel automatic technique, called STI for STandardization of Intensities, which not only shares the simplicity and robustness of histogram-matching techniques, but also incorporates tissue spatial intensity information. STI uses joint intensity histograms to determine intensity correspondence in each tissue between the input and standard images. We compared STI to an existing histogram-matching technique on two multicentric datasets, Pilot E-ADNI and ADNI, by measuring the intensity error with respect to the standard image after performing nonlinear registration. The Pilot E-ADNI dataset consisted of 3 subjects, each scanned at 7 different sites. The ADNI dataset consisted of 795 subjects scanned at more than 50 different sites. STI was superior to the histogram-matching technique, showing significantly better intensity matching for the brain white matter with respect to the standard image.
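    For context, a sketch of the baseline histogram-matching standardization the authors compare against (not the STI method itself): map input intensities so their empirical CDF matches that of a standard image. The images below are synthetic:

```python
# Sketch of classic histogram-matching intensity standardization: remap the
# source image so its empirical CDF matches the template's. Synthetic data.
import numpy as np

def histogram_match(source: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Return source remapped so its histogram matches template's."""
    s_vals, s_idx, s_counts = np.unique(source.ravel(),
                                        return_inverse=True, return_counts=True)
    t_vals, t_counts = np.unique(template.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    t_cdf = np.cumsum(t_counts) / template.size
    matched = np.interp(s_cdf, t_cdf, t_vals)   # CDF-to-CDF lookup
    return matched[s_idx].reshape(source.shape)

rng = np.random.default_rng(1)
scan = rng.normal(100, 15, (64, 64))            # "input" image
standard = rng.normal(140, 25, (64, 64))        # "standard" image
out = histogram_match(scan, standard)
print(f"matched mean/std: {out.mean():.1f} / {out.std():.1f}")
```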

  12. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased-mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to make sure that the fault tree technique is not used beyond its valid range of application. To this end, a critical review of the mathematical foundations of fault tree reliability analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of integrated software (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems
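    As a reminder of the basic quantification step such codes generalize, a hedged sketch of top-event probability propagation through AND/OR gates for independent basic events; the toy tree below is not one of the SALP benchmark systems:

```python
# Hedged sketch of basic fault tree quantification: top-event probability
# from independent basic-event probabilities through AND/OR gates.
def AND(*p):   # all inputs must fail
    out = 1.0
    for q in p:
        out *= q
    return out

def OR(*p):    # any input failing suffices (independence assumed)
    out = 1.0
    for q in p:
        out *= (1.0 - q)
    return 1.0 - out

# Toy system: redundant pumps in parallel, plus single valve and power faults.
pump_a, pump_b, valve, power = 1e-3, 1e-3, 5e-4, 1e-4
top = OR(AND(pump_a, pump_b), valve, power)
print(f"top-event probability ~ {top:.3e}")
```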

  13. Cartographic standards to improve maps produced by the Forest Inventory and Analysis program

    Science.gov (United States)

    Charles H. (Hobie) Perry; Mark D. Nelson

    2009-01-01

    The Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is incorporating an increasing number of cartographic products in reports, publications, and presentations. To create greater quality and consistency within the national FIA program, a Geospatial Standards team developed cartographic design standards for FIA map...

  14. Application of ko-NAA technique on Dalat research reactor for human hair analysis in environmental pollution study

    International Nuclear Information System (INIS)

    Ho Manh Dung; Mai Van Nhon

    2006-01-01

    The k0-standardization method of neutron activation analysis (k0-NAA) has recently been developed on the Dalat research reactor. However, in order to apply the k0-NAA technique to practical research objects, it is necessary to establish a separate experimental procedure for each object. This work aims at establishing such a k0-NAA procedure on the Dalat research reactor for human hair samples in environmental pollution studies. The sample collection and preparation, irradiation, gamma-ray spectrum measurement and data processing, as well as the quality assurance and quality control of the k0-NAA procedure for human hair samples, were assessed by comparing measured elemental concentrations with certified values in terms of the experimental-to-certified ratio and the U-score. The experimental results showed that k0-NAA for multi-element analysis of human hair samples can be applied on the Dalat research reactor with rather good analytical quality. (author)
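    The U-score check mentioned above reduces to a one-line formula comparing the lab-to-certified difference against the combined uncertainties; a hedged sketch with invented inputs:

```python
# Hedged sketch of the U-score (zeta-score) check used for QA/QC against a
# certified reference material: agreement is typically accepted when the
# score stays below a chosen threshold (often ~2). Inputs are illustrative.
import math

def u_score(measured, u_measured, certified, u_certified):
    return abs(measured - certified) / math.sqrt(u_measured**2 + u_certified**2)

# e.g. zinc in a hypothetical hair CRM, values in mg/kg
score = u_score(measured=182.0, u_measured=9.0, certified=174.0, u_certified=6.0)
print(f"U = {score:.2f} -> {'pass' if score < 2.0 else 'inspect'}")
```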

  15. National Green Building Standard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-07-01

    DOE's Building America Program is a research and development program to improve the energy performance of new and existing homes. The ultimate goal of the Building America Program is to achieve examples of cost-effective, energy-efficient solutions for all U.S. climate zones. ANSI requires periodic maintenance of an ANSI standard by review of the entire document and action to revise or reaffirm it on a schedule not to exceed five years. In compliance, a consensus group has once again been formed and the National Green Building Standard is currently being reviewed to comply with the periodic maintenance requirement of an ANSI standard.

  16. Sensitivity analysis techniques applied to a system of hyperbolic conservation laws

    International Nuclear Information System (INIS)

    Weirs, V. Gregory; Kamm, James R.; Swiler, Laura P.; Tarantola, Stefano; Ratto, Marco; Adams, Brian M.; Rider, William J.; Eldred, Michael S.

    2012-01-01

    Sensitivity analysis comprises techniques to quantify the effects of the input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique by which to understand the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol' method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations are available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing sample sizes and under an increasing number of meta-model evaluations. Highlights: sensitivity analysis techniques for a model shock physics problem are compared; the model problem and the sensitivity analysis problem have exact solutions; subtle details of the method for computing sensitivity indices can affect the results.
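    As a concrete illustration of the sampling-based estimators discussed here, a minimal pick-and-freeze Sobol' first-order index sketch, shown on the standard Ishigami test function rather than the paper's Riemann problem, whose solver is not reproduced; sample sizes and seed are arbitrary choices:

```python
# Minimal sampling-based Sobol' first-order index sketch (Saltelli-style
# pick-and-freeze estimator) on the Ishigami test function.
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2
            + b * x[:, 2]**4 * np.sin(x[:, 0]))

rng = np.random.default_rng(42)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # freeze all but coordinate i
    S1 = np.mean(fB * (ishigami(ABi) - fA)) / var
    print(f"S{i + 1} ~ {S1:.3f}")            # analytic: ~0.31, ~0.44, 0
```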

  17. Determination of dynamic fracture toughness using a new experimental technique

    Directory of Open Access Journals (Sweden)

    Cady Carl M.

    2015-01-01

    In other studies, dynamic fracture toughness has been measured using Charpy impact and modified Hopkinson bar techniques. In this paper, results will be shown for the measurement of fracture toughness using a new test geometry. The crack propagation velocities range from ∼0.15 mm/s to 2.5 m/s. Digital image correlation (DIC) will be the technique used to measure both the strain and the crack growth rates. The boundary of the crack is determined using the correlation coefficient generated during image analysis, and with interframe timing the crack growth rate and crack opening can be determined. A comparison of static and dynamic loading experiments will be made for brittle polymeric materials. The analysis technique presented by Sammis et al. [1] is a semi-empirical solution; however, additional linear elastic fracture mechanics analysis of the strain fields generated as part of the DIC analysis allows for the more commonly used method resembling the crack tip opening displacement (CTOD) experiment. It should be noted that this technique was developed because limited amounts of material were available and crack growth rates were too fast for a standard CTOD method.
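    The interframe-timing step described above amounts to a finite difference of crack-tip position over the frame interval; a hedged sketch with invented tip positions (in practice these would come from a correlation-coefficient threshold in the DIC software):

```python
# Hedged sketch of the interframe-timing step: once the crack boundary has
# been located in each DIC frame, the growth rate is the finite difference
# of tip position over the frame interval. Data below are invented.
import numpy as np

frame_dt_s = 1.0 / 50_000                 # assumed high-speed camera rate
tip_x_mm = np.array([0.00, 0.04, 0.11, 0.22, 0.36, 0.52])  # per-frame tip

velocity_mm_s = np.diff(tip_x_mm) / frame_dt_s
print("crack velocity per interval (m/s):", velocity_mm_s / 1000.0)
```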

  18. Determination of dynamic fracture toughness using a new experimental technique

    Science.gov (United States)

    Cady, Carl M.; Liu, Cheng; Lovato, Manuel L.

    2015-09-01

    In other studies, dynamic fracture toughness has been measured using Charpy impact and modified Hopkinson bar techniques. In this paper, results will be shown for the measurement of fracture toughness using a new test geometry. The crack propagation velocities range from ˜0.15 mm/s to 2.5 m/s. Digital image correlation (DIC) will be the technique used to measure both the strain and the crack growth rates. The boundary of the crack is determined using the correlation coefficient generated during image analysis, and with interframe timing the crack growth rate and crack opening can be determined. A comparison of static and dynamic loading experiments will be made for brittle polymeric materials. The analysis technique presented by Sammis et al. [1] is a semi-empirical solution; however, additional linear elastic fracture mechanics analysis of the strain fields generated as part of the DIC analysis allows for the more commonly used method resembling the crack tip opening displacement (CTOD) experiment. It should be noted that this technique was developed because limited amounts of material were available and crack growth rates were too fast for a standard CTOD method.

  19. Generation of gaseous methanol reference standards

    International Nuclear Information System (INIS)

    Geib, R.C.

    1991-01-01

    Methanol has been proposed as an automotive fuel component. Reliable, accurate methanol standards are essential to support widespread monitoring programs, which may include quantification of methanol in tailpipe emissions and evaporative emissions, plus ambient air methanol measurements. This paper will present approaches and results from the author's investigation into developing high-accuracy methanol standards. Results will be reported for the following variables: (1) stability of methanol gas standards, with studies focused on preparation requirements and stability results from 10 to 1,000 ppmv; (2) cylinder-to-instrument delivery system components and purge technique; these studies addressed materials in contact with the sample stream plus static versus flow injection; (3) optimization of the gas chromatographic analytical system; (4) gas chromatography and process analyzer results and their utility for methanol analysis; (5) the accuracy of the methanol standards, qualified using data from multiple studies including (a) gravimetric preparation, (b) linearity studies, and (c) independent standards sources such as low-pressure containers and diffusion tubes. The accuracy will be given as a propagation of error from multiple sources. The methanol target concentrations are 10 to 500 ppmv

  20. Fast charging technique for high power LiFePO4 batteries: A mechanistic analysis of aging

    Science.gov (United States)

    Anseán, D.; Dubarry, M.; Devie, A.; Liaw, B. Y.; García, V. M.; Viera, J. C.; González, M.

    2016-07-01

    One of the major issues hampering the acceptance of electric vehicles (EVs) is the anxiety associated with long charging times. Hence, the ability to fast-charge lithium-ion battery (LIB) systems is gaining notable interest. However, fast charging is not tolerated by all LIB chemistries because it affects battery functionality and accelerates aging processes. Here, we investigate the long-term effects of multistage fast charging on a commercial high-power LiFePO4-based cell and compare it to another cell tested under standard charging. Coupling incremental capacity (IC) and IC peak area analysis with mechanistic model simulations (the 'Alawa' toolbox with harvested half-cell data), we quantify the degradation modes that cause aging of the tested cells. The results show that the proposed fast charging technique caused aging effects similar to standard charging. The degradation is caused by a linear loss of lithium inventory, coupled with a lesser degree of linear loss of active material on the negative electrode. This study validates fast charging as a feasible means of operation for this particular LIB chemistry and cell architecture. It also illustrates the benefits of a mechanistic approach to understanding cell degradation in commercial cells.
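    The IC step of such an analysis is a numerical derivative dQ/dV of a low-rate charge curve, whose peaks and peak areas are then tracked over aging; a hedged sketch with a synthetic LFP-like plateau standing in for real cycling data:

```python
# Hedged sketch of the incremental capacity (IC) computation: dQ/dV from a
# charge curve. The voltage-capacity curve here is synthetic; real data
# would come from low-rate cycling of the LiFePO4 cell.
import numpy as np

q = np.linspace(0.0, 1.0, 500)                        # capacity, Ah
v = 3.30 + 0.05 * np.tanh(8 * (q - 0.5)) + 0.02 * q   # fake LFP-like plateau

dq_dv = np.gradient(q, v)                             # IC curve, Ah/V
i_peak = np.argmax(dq_dv)
print(f"IC peak: {dq_dv[i_peak]:.1f} Ah/V at {v[i_peak]:.3f} V")
```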