WorldWideScience

Sample records for standard analysis techniques

  1. Cost minimisation analysis of using acellular dermal matrix (Strattice™) for breast reconstruction compared with standard techniques.

    Science.gov (United States)

    Johnson, R K; Wright, C K; Gandhi, A; Charny, M C; Barr, L

    2013-03-01

    We performed a cost analysis (using UK 2011/12 NHS tariffs as a proxy for cost) comparing immediate breast reconstruction using the new one-stage technique of acellular dermal matrix (Strattice™) with implant versus the standard alternative techniques of tissue expander (TE)/implant as a two-stage procedure and latissimus dorsi (LD) flap reconstruction. Clinical report data were collected for operative time, length of stay, outpatient procedures, and number of elective and emergency admissions in our first consecutive 24 patients undergoing one-stage Strattice reconstruction. Total cost to the NHS based on tariff, assuming top-up payments to cover Strattice acquisition costs, was assessed and compared to the two historical control groups matched on key variables. Eleven patients having unilateral Strattice reconstruction were compared to 10 having TE/implant reconstruction and 10 having LD flap and implant reconstruction. Thirteen patients having bilateral Strattice reconstruction were compared to 12 having bilateral TE/implant reconstruction. Total costs were: unilateral Strattice, £3685; unilateral TE, £4985; unilateral LD and implant, £6321; bilateral TE, £5478; and bilateral Strattice, £6771. The cost analysis shows a financial advantage of using acellular dermal matrix (Strattice) in unilateral breast reconstruction versus alternative procedures. The reimbursement system in England (Payment by Results) is based on disease-related groups similar to that of many countries across Europe and tariffs are based on reported hospital costs, making this analysis of relevance in other countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for determining the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data of these certified SRMs. These samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charge at three different times. A 2.57 MeV proton beam obtained from a 5UDH-II Pelletron accelerator was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  3. Depth profile analysis of thin TiOxNy films using standard ion beam analysis techniques and HERDA

    International Nuclear Information System (INIS)

    Markwitz, A.; Dytlewski, N.; Cohen, D.

    1999-01-01

    Ion beam assisted deposition is used to fabricate thin titanium oxynitride films (TiOxNy) at Industrial Research (typical film thickness 100 nm). At the Institute of Geological and Nuclear Sciences, the thin films are analysed using non-destructive standard ion beam analysis (IBA) techniques. High-resolution titanium depth profiles are measured with RBS using 1.5 MeV ⁴He⁺ ions. Non-resonant nuclear reaction analysis (NRA) is performed to investigate the amounts of O and N in the deposited films using the reactions ¹⁶O(d,p)¹⁷O at 920 keV and ¹⁴N(d,α)¹²C at 1.4 MeV. Using a combination of these nuclear techniques, the stoichiometry as well as the thickness of the layers is revealed. However, when oxygen and nitrogen depth profiles are required for investigating stoichiometric changes in the films, additional nuclear analysis techniques such as heavy ion elastic recoil detection (HERDA) have to be applied. With HERDA, depth profiles of N, O, and Ti are measured simultaneously. In this paper, comparative IBA measurements of TiOxNy films with different compositions are presented and discussed.

  4. Phytochemical analysis and standardization of Strychnos nux-vomica extract through HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar Patel

    2012-05-01

    Full Text Available Objective: To develop a novel qualitative and quantitative method by which different phytoconstituents of Strychnos nux-vomica L. can be determined. Methods: To profile the phytoconstituents of Strychnos nux-vomica, a hydroalcoholic extract of Strychnos nux-vomica was subjected to preliminary phytochemical analysis, antimicrobial activity testing against certain pathogenic microorganisms, a solubility test, loss on drying and pH determination. The extract was also subjected to quantitative analysis, including total phenol, flavonoid and heavy metal analysis. Quantitative analysis was performed by HPTLC using strychnine and brucine as standard markers. Results: Phytochemical analysis revealed the presence of alkaloids, carbohydrates, tannins, steroids, triterpenoids and glycosides in the extract. Total flavonoid and phenol contents of the Strychnos nux-vomica L extract were found to be 0.40% and 0.43%, respectively. Results showed that the levels of heavy metals (lead, arsenic, mercury and cadmium) complied with standard limits. Total bacterial, yeast and mould counts were found to be within limits, whereas E. coli and Salmonella were found to be absent from the extract. Contents of strychnine and brucine were found to be 4.75% and 3.91%, respectively. Conclusions: These studies provide valuable information for correct identification and selection of the drug among various adulterants. In future this study will be helpful for the quantitative analysis as well as standardization of Strychnos nux-vomica L.

  5. Radiographic analysis of the temporomandibular joint by the standardized projection technique

    International Nuclear Information System (INIS)

    Choe, Han Up; Park, Tae Won

    1983-01-01

    The purpose of this study was to investigate the radiographic images of the condylar head in clinically normal subjects and in TMJ patients using a standardized projection technique. 45 subjects who had no clinical evidence of TMJ problems and 96 patients who had clinical evidence of TMJ problems were evaluated; patients who had fractures, trauma or tumors in the TMJ area were excluded from this study. For the evaluation of radiographic images, the author observed the condylar head positions in the closed mouth and 2.54 cm open mouth positions taken by the standardized transcranial oblique lateral projection technique. The results were as follows: 1. In the closed mouth position, the crest of the condylar head took a relatively posterior position to the deepest point of the glenoid fossa in 8.9% of the normals and in 26.6% of TMJ patients. 2. In the 2.54 cm open mouth position, the condylar head took a relatively posterior position to the articular eminence in 2.2% of TMJ patients and 39.6% of the normals. 3. In the open mouth position, the horizontal distance from the deepest point of the glenoid fossa to the condylar head was 13.96 mm in the normals and 10.68 mm in TMJ patients. 4. The distance of true movement of the condylar head was 13.49 mm in the normals and 10.27 mm in TMJ patients. 5. The deviation of the mandible in TMJ patients was slightly greater than that of the normals.

  6. Developing standardized connection analysis techniques for slim hole core rod designs

    International Nuclear Information System (INIS)

    Fehr, G.; Bailey, E.I.

    1994-01-01

    Slim hole core rod design remains essentially in the proprietary domain. API standardization provides the ability to perform engineering analyses and dimensional inspections through the use of documents, i.e., Specifications, Bulletins, and Recommended Practices. In order to provide similar engineering capability for non-API slim hole connections, this paper develops the initial phase of what may evolve into an engineering tool to provide at least an indication of relative serviceability between two connection styles for a given application. The starting point for this process is bending strength ratios and connection strength calculations. Since empirical data are still needed to verify the approaches proposed in this paper, it is recognized that the alternatives presented here are only a first step toward developing useful rules of thumb which may lead to later standardization.

  7. Relationship between alveolar bone measured by 125I absorptiometry with analysis of standardized radiographs: 2. Bjorn technique

    International Nuclear Information System (INIS)

    Ortman, L.F.; McHenry, K.; Hausmann, E.

    1982-01-01

    The Bjorn technique is widely used in periodontal studies as a standardized measure of alveolar bone. Recent studies have demonstrated the feasibility of using ¹²⁵I absorptiometry to measure bone mass. The purpose of this study was to compare ¹²⁵I absorptiometry with the Bjorn technique in detecting small sequential losses of alveolar bone. Four periodontal-like defects of incrementally increasing size were produced in alveolar bone in the posterior segment of the maxilla of a human skull. An attempt was made to sequentially reduce the amount of bone in 10% increments until no bone remained, a through-and-through defect. The bone remaining at each step was measured using ¹²⁵I absorptiometry. At each site the ¹²⁵I absorptiometry measurements were made at the same location by fixing the photon source to a prefabricated precision-made occlusal splint. This site was just beneath the crest and midway between the borders of two adjacent teeth. Bone loss was also determined by the Bjorn technique. Standardized intraoral films were taken using a custom-fitted acrylic clutch, and bone measurements were made from the root apex to the coronal height of the lamina dura. A comparison of the data indicates that: (1) in early bone loss, less than 30%, the Bjorn technique underestimates the amount of loss, and (2) in advanced bone loss, more than 60%, the Bjorn technique overestimates it.

  8. Chemical Separation Technique of Strontium-90 in the Soil Water as the Standard Method for Environmental Radioactivity Analysis

    International Nuclear Information System (INIS)

    Ngasifudin-Hamdani; Suratman; Djoko-Sardjono, Ign; Winduanto-Wahyu SP

    2000-01-01

    Research on a technique for separating strontium-90 from its material matrix using a chemical precipitation method has been done. The technique was applied to the detection of the radionuclide strontium-90 contained in the soil water near the nuclear reactor facility P3TM BATAN at three locations. The two important parameters used in this technique were the growth time of Y-90 and the stirring time. The results showed that the activity of strontium-90 at pos-01 was between 1.801x10⁻¹⁹ and 9.616x10⁻¹⁷ μCi/cm³, at pos-02 between 8.448x10⁻¹⁹ and 1.003x10⁻¹⁶ μCi/cm³, and at pos-03 between 6.719x10⁻¹⁹ and 11.644x10⁻¹⁶ μCi/cm³. These data show that the activity of Sr-90 in the soil water near the nuclear reactor facility P3TM BATAN was still below the maximum permitted concentration, i.e. 4.0x10⁻⁷ - 3.5x10⁻⁶ μCi/cm³. A statistical test using two-factorial analysis of variance with a randomized block design showed that the activity of Sr-90 in the soil water was influenced by the interaction between the growth time of Y-90 and the stirring time. (author)

  9. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images.

    Science.gov (United States)

    Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin

    2017-12-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
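The core of the SLICE idea described above can be illustrated with a minimal sketch: a segment's Lagrangian strain per cine frame is its length change relative to the reference (end-diastolic) length. The function name and the sample lengths below are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch of frame-to-frame segment-length strain (SLICE-style).
# Strain per frame = 100 * (L - L0) / L0, with L0 the reference length.

def segment_strain(lengths, reference_index=0):
    """Return Lagrangian strain (%) per frame relative to a reference frame."""
    l0 = lengths[reference_index]
    return [100.0 * (l - l0) / l0 for l in lengths]

# Example: a segment shortening from 60 mm at end-diastole to 51 mm at peak
lengths_mm = [60.0, 58.2, 54.0, 51.0, 53.4, 57.0, 60.0]
strains = segment_strain(lengths_mm)
peak_strain = min(strains)  # most negative value = peak shortening
print(round(peak_strain, 1))  # -15.0
```

Peak circumferential shortening is then simply the most negative strain value over the cardiac cycle.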

  10. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    International Nuclear Information System (INIS)

    Zweerink, Alwin; Allaart, Cornelis P.; Wu, LiNa; Beek, Aernout M.; Rossum, Albert C. van; Nijveldt, Robin; Kuijer, Joost P.A.; Ven, Peter M. van de; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick

    2017-01-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. (orig.)

  11. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    Energy Technology Data Exchange (ETDEWEB)

    Zweerink, Alwin; Allaart, Cornelis P.; Wu, LiNa; Beek, Aernout M.; Rossum, Albert C. van; Nijveldt, Robin [VU University Medical Center, Department of Cardiology, and Institute for Cardiovascular Research (ICaR-VU), Amsterdam (Netherlands); Kuijer, Joost P.A. [VU University Medical Center, Department of Physics and Medical Technology, Amsterdam (Netherlands); Ven, Peter M. van de [VU University Medical Center, Department of Epidemiology and Biostatistics, Amsterdam (Netherlands); Meine, Mathias [University Medical Center, Department of Cardiology, Utrecht (Netherlands); Croisille, Pierre; Clarysse, Patrick [Univ Lyon, UJM-Saint-Etienne, INSA, CNRS UMR 5520, INSERM U1206, CREATIS, Saint-Etienne (France)

    2017-12-15

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. (orig.)

  12. Standard lymphadenectomy technique in the gastric adenocarcinoma

    International Nuclear Information System (INIS)

    Aguirre Fernandez, Roberto Eduardo; Fernandez Vazquez, Pedro Ivan; LLera Dominguez, Gerardo de la

    2012-01-01

    This paper presents the surgical technique used since 1990 at the 'Celia Sanchez Manduley' Clinical Surgical Teaching Provincial Hospital in Manzanillo, Granma province, to carry out gastrectomy together with standard lymphadenectomy in patients who are carriers of a gastric adenocarcinoma, allowing application of the current oncologic and surgical concepts of the Japanese Society for Research of Gastric Cancer, essential to obtain a better prognosis in these patients.

  13. International Standardization of Library and Documentation Techniques.

    Science.gov (United States)

    International Federation for Documentation, The Hague (Netherlands).

    This comparative study of the national and international standards, rules and regulations on library and documentation techniques adopted in various countries was conducted as a preliminary step in determining the minimal bases for facilitating national and international cooperation between documentalists and librarians. The study compares and…

  14. New quantitative safety standards: different techniques, different results?

    International Nuclear Information System (INIS)

    Rouvroye, J.L.; Brombacher, A.C.

    1999-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many factors can influence the safety of a SIS, such as system layout, diagnostics, testing and repair. In standards like the German DIN, no quantitative analysis is demanded (DIN V 19250 Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, Berlin, 1994; DIN/VDE 0801 Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Berlin, 1990); the analysis according to these standards is based on expert opinion and qualitative analysis techniques. New standards like IEC 61508 (IEC 61508 Functional safety of electrical/electronic/programmable electronic safety-related systems, IEC, Geneve, 1997) and ISA-S84.01 (ISA-S84.01.1996 Application of Safety Instrumented Systems for the Process Industries, Instrument Society of America, Research Triangle Park, 1996) require quantitative risk analysis but do not prescribe how to perform the analysis. Earlier publications of the authors (Rouvroye et al., Uncertainty in safety, new techniques for the assessment and optimisation of safety in process industry, D.W. Pyatt (ed), SERA-Vol. 4, Safety engineering and risk analysis, ASME, New York, 1995; Rouvroye et al., A comparison study of qualitative and quantitative analysis techniques for the assessment of safety in industry, P.C. Cacciabue, I.A. Papazoglou (eds), Proceedings PSAM III conference, Crete, Greece, June 1996) have shown that different analysis techniques cover different aspects of system behaviour. This paper shows, by means of a case study, that different (quantitative) analysis techniques may lead to different results. The consequence is that the application of the standards to practical systems will not always lead to unambiguous results. The authors therefore propose a technique to overcome this major disadvantage.

  15. Comparison of least-squares vs. maximum likelihood estimation for standard spectrum technique of β−γ coincidence spectrum analysis

    International Nuclear Information System (INIS)

    Lowrey, Justin D.; Biegalski, Steven R.F.

    2012-01-01

    The spectrum deconvolution analysis tool (SDAT) software code was written and tested at The University of Texas at Austin utilizing the standard spectrum technique to determine activity levels of Xe-131m, Xe-133m, Xe-133, and Xe-135 in β–γ coincidence spectra. SDAT was originally written to utilize the method of least-squares to calculate the activity of each radionuclide component in the spectrum. Recently, maximum likelihood estimation was also incorporated into the SDAT tool. This is a robust statistical technique to determine the parameters that maximize the Poisson distribution likelihood function of the sample data. In this case it is used to parameterize the activity level of each of the radioxenon components in the spectra. A new test dataset was constructed utilizing Xe-131m placed on a Xe-133 background to compare the robustness of the least-squares and maximum likelihood estimation methods for low counting statistics data. The Xe-131m spectra were collected independently from the Xe-133 spectra and added to generate the spectra in the test dataset. The true independent counts of Xe-131m and Xe-133 are known, as they were calculated before the spectra were added together. Spectra with both high and low counting statistics are analyzed. Studies are also performed by analyzing only the 30 keV X-ray region of the β–γ coincidence spectra. Results show that maximum likelihood estimation slightly outperforms least-squares for low counting statistics data.
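The standard spectrum technique compared above can be sketched in a few lines: an observed spectrum is modelled as a linear combination of known unit-activity component shapes, and the component activities are estimated either by least squares or by maximizing the Poisson log-likelihood. This is not the SDAT code; the two component shapes and the observed counts are invented, and the MLE is found by a coarse grid search rather than a proper optimizer.

```python
# Toy two-component spectrum unmixing: least squares vs. Poisson MLE.
import math

# Known unit-activity "standard spectra" for two components (4 channels)
s1 = [5.0, 20.0, 5.0, 1.0]
s2 = [1.0, 4.0, 15.0, 10.0]
observed = [12.0, 48.0, 40.0, 22.0]  # synthetic observed counts

def model(a1, a2):
    """Expected counts per channel for activities a1, a2."""
    return [a1 * x + a2 * y for x, y in zip(s1, s2)]

# --- Least squares: minimize sum((obs - model)^2) via 2x2 normal equations ---
s11 = sum(x * x for x in s1)
s22 = sum(y * y for y in s2)
s12 = sum(x * y for x, y in zip(s1, s2))
b1 = sum(o * x for o, x in zip(observed, s1))
b2 = sum(o * y for o, y in zip(observed, s2))
det = s11 * s22 - s12 * s12
a1_ls = (b1 * s22 - b2 * s12) / det
a2_ls = (b2 * s11 - b1 * s12) / det

# --- Poisson MLE: maximize sum(obs*log(mu) - mu) over a coarse activity grid ---
def loglik(a1, a2):
    mu = model(a1, a2)
    if any(m <= 0 for m in mu):
        return float("-inf")
    return sum(o * math.log(m) - m for o, m in zip(observed, mu))

grid = [i / 20 for i in range(1, 101)]  # candidate activities 0.05 .. 5.00
a1_ml, a2_ml = max(((a, b) for a in grid for b in grid),
                   key=lambda p: loglik(*p))
# With noise-free data both estimators recover the true activities (2.0, 2.0);
# the paper's point is that they diverge under low counting statistics.
```

For real low-count data one would add Poisson noise to `observed` and repeat the comparison many times; the MLE's advantage shows up in that regime.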

  16. ELISA technique standardization for strongyloidiasis diagnosis

    International Nuclear Information System (INIS)

    Huapaya, P.; Espinoza, I.; Huiza, A.; Universidad Nacional Mayor de San Marcos, Lima; Sevilla, C.

    2002-01-01

    To standardize an ELISA technique for the diagnosis of human Strongyloides stercoralis infection, a crude antigen was prepared using filariform larvae obtained from positive stool samples cultured with charcoal. Harvested larvae were crushed by sonication and washed by centrifugation in order to obtain protein extracts to be used as antigen. The final protein concentration was 600 μg/mL. Several kinds of ELISA plates were tested, and the antigen concentration, sera dilution, conjugate dilution and cut-off were determined to identify infection. Sera from patients with either hyper-infection syndrome or intestinal infection demonstrated by parasitological examination were positive controls, and sera from people living in non-endemic areas with no infection demonstrated by parasitological examination were negative controls. The best values were 5 μg/mL for antigen, 1/64 for sera and 1/1000 for conjugate; optical density values were 1.2746 (1.1065-1.4206, SD = 0.3284) for positive samples and 0.4457 (0.3324-0.5538, SD = 0.2230) for negative samples. Twenty sera samples from positive subjects and one hundred from negative subjects were examined, yielding 90% sensitivity and 88% specificity. The results show this technique could be useful as a strongyloidiasis screening test in population studies.

  17. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another are primarily dependent not on the differences between fields of activity, but especially on the differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  18. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning.
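The weighted-scoring step described above (severity score per criterion, multiplied by criteria weights and summed into a risk value used for ranking) can be sketched as follows. The weights, the three example systems, and all severity scores are invented for illustration; they are not the paper's data.

```python
# Hypothetical multi-criteria weighted scoring for ranking building systems.
# Criteria: Physical Condition (PC), Effect on Asset (EA),
# Effect on Occupants (EO), Maintenance Cost (MC).

criteria_weights = {"PC": 0.4, "EA": 0.25, "EO": 0.2, "MC": 0.15}

# Defect severity scores (0..1) per system, per criterion (made-up values)
systems = {
    "electrical": {"PC": 0.8, "EA": 0.7, "EO": 0.9, "MC": 0.5},
    "roof":       {"PC": 0.6, "EA": 0.5, "EO": 0.4, "MC": 0.6},
    "ceiling":    {"PC": 0.3, "EA": 0.2, "EO": 0.3, "MC": 0.2},
}

def risk_value(scores):
    """Weighted sum of severity scores over the four criteria."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranking = sorted(systems, key=lambda name: risk_value(systems[name]),
                 reverse=True)
print(ranking)  # ['electrical', 'roof', 'ceiling']
```

In the actual study the weights themselves come from pairwise comparisons (an AHP-style step in Expert Choice); here they are simply assumed.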

  19. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    NARCIS (Netherlands)

    Zweerink, A.; Allaart, C.P.; Kuijer, J.P.A.; Wu, L.; Beek, A.M.; Ven, P.M. van de; Meine, M.; Croisille, P.; Clarysse, P.; Rossum, A.C. van; Nijveldt, R.

    2017-01-01

    OBJECTIVES: Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive

  20. Specific binding assay technique; standardization of reagent

    International Nuclear Information System (INIS)

    Huggins, K.G.; Roitt, I.M.

    1979-01-01

    The standardization of a labelled constituent, such as anti-IgE, for use in a specific binding assay method is disclosed. A labelled ligand, such as IgE, is standardized against a ligand reference substance, such as WHO standard IgE, to determine the weight of IgE protein represented by the labelled ligand. Anti-light chain antibodies are contacted with varying concentrations of the labelled ligand. The ligand is then contacted with the labelled constituent which is then quantitated in relation to the amount of ligand protein present. The preparation of ¹³¹I-labelled IgE is described. Also disclosed is an improved specific binding assay test method for determining the potency of an allergen extract in serum from an allergic individual. The improvement involved using a parallel model system of a second complex which consisted of anti-light chain antibodies, labelled ligand and the standardized labelled constituent (anti-IgE). The amount of standardized labelled constituent bound to the ligand in the first complex was determined, as described above, and the weight of ligand inhibited by addition of soluble allergen was then used as a measure of the potency of the allergen extract. (author)

  1. Standards for backscattering analysis

    International Nuclear Information System (INIS)

    Mitchell, I.V.; Eschbach, H.L.

    1978-01-01

    The need for backscattering standards appears to be twofold and depends on the uses and requirements of the users. The first is as a calibrated reference by which samples of a similar nature to the standard may be absolutely compared. The second is as a means of intercomparing the relative results obtained by different laboratories using, as near as possible, identical samples. This type of comparison is of a relative nature and the absolute values are not necessarily required. In the present work the authors try to satisfy both needs by providing identical samples which have been absolutely calibrated to a high accuracy. Very thin copper and vanadium layers were evaporated onto bismuth implanted silicon crystals and on glass plates under carefully controlled conditions. The mass of the deposits was determined in situ using a sensitive UHV microbalance. In addition, two quartz oscillator monitors were used. The samples have been analysed by Rutherford backscattering and the absolute quantity of bismuth determined by a comparison with the known amounts of deposited material. (Auth.)

  2. Cesarean sections, perfecting the technique and standardizing the practice: an analysis of the book Obstetrícia, by Jorge de Rezende.

    Science.gov (United States)

    Nakano, Andreza Rodrigues; Bonan, Claudia; Teixeira, Luiz Antônio

    2016-01-01

    This article discusses the development of techniques for cesarean sections by doctors in Brazil, during the 20th century, by analyzing the title "Operação Cesárea" (Cesarean Section) of three editions of the textbook Obstetrícia, by Jorge de Rezende. His prominence as an author in obstetrics and his particular style of working created the groundwork for the normalization of the practice of cesarean sections. The networks of meaning practiced within this scientific community included a "provision for feeling and for action" (Fleck) which established the C-section as a "normal" delivery: showing standards that exclude unpredictability, chaos, and dangers associated with the physiology of childbirth, meeting the demand for control, discipline and safety, qualities associated with practices, techniques and technologies of biomedicine.

  3. Selected Bibliography of the Nephrourology standard techniques

    International Nuclear Information System (INIS)

    1999-01-01

    Within the framework of the first meeting of project coordinators of ARCAL XXXVI, a selected bibliography on the standardization of nuclear nephrourology techniques is presented. The selection covers: radiopharmaceuticals used, quality control, dosimetry, obstruction, clearance and renal function, paediatric aspects, pyelonephritis, renovascular hypertension and renal transplantation [es

  4. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, whose values may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
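    The robustness argument for percentiles over the arithmetic mean can be illustrated with a minimal Monte Carlo sketch (illustrative only: the lognormal dose distribution, seed and parameter values below are assumptions, not taken from the report):

```python
import random
import statistics

random.seed(1)

# Hypothetical annual-dose samples (Sv/yr) from a Monte Carlo run in which
# input parameters range over orders of magnitude, producing a heavy-tailed
# (here lognormal) output distribution.
doses = [random.lognormvariate(-20.0, 2.0) for _ in range(10000)]

arithmetic_mean = statistics.fmean(doses)
median = statistics.median(doses)
p90 = sorted(doses)[int(0.9 * len(doses))]  # 90th percentile

# For a heavy-tailed distribution the mean is dominated by a few extreme
# samples, while the median and 90th percentile are far more stable
# summary statistics across repeated runs.
print(f"mean            = {arithmetic_mean:.3e} Sv/yr")
print(f"median          = {median:.3e} Sv/yr")
print(f"90th percentile = {p90:.3e} Sv/yr")
```

    For a lognormal with sigma = 2 the mean sits roughly e^2 times above the median, which is why a single extreme sample can move it so strongly.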

  5. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  6. Analysis of soil and sewage sludge by ICP-OES and the German standard DIN 38414 sample preparation technique (P3)

    International Nuclear Information System (INIS)

    Edlund, M.; Heitland, P.; Visser, H.

    2002-01-01

    Full text: The elemental analysis of soil and sewage sludge has developed into one of the main applications of ICP optical emission spectrometry (ICP-OES) and is described in many official procedures. These methods include different acid mixtures and digestion techniques. Even though the German standard DIN 38414 part 7 and the Dutch NEN 6465 do not guarantee complete recoveries for all elements, they are widely accepted in Europe. This paper describes sample preparation and line selection, and investigates precision, accuracy and limits of detection. The SPECTRO CIROS CCD EOP with axial plasma observation and the SPECTRO CIROS CCD SOP with radial observation were compared and evaluated for the analysis of soil and sewage sludge. Accuracy was investigated using the certified reference materials CRM-141 R, CRM-143 R and GSD 11. Both instruments show excellent performance in terms of speed, precision, accuracy and detection limits for the determination of trace metals in soil and sewage sludge. (author)

  7. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant: composition of macro components and amounts of organic and inorganic impurities; coolant during and after operation: determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); control of systems for purifying and regenerating the coolant after use: dissolved pressurization gases; detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation: tests to determine potential formation of films; corrosion of structural elements and canning materials; health and safety: toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. A further difficulty lies in distinguishing clearly between techniques for determining physical and physico-chemical properties, on the one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity).

  8. Standardization of surgical techniques used in facial bone contouring.

    Science.gov (United States)

    Lee, Tae Sung

    2015-12-01

    Since the introduction of facial bone contouring surgery for cosmetic purposes, various surgical methods have been used to improve the aesthetics of facial contours. In general, by standardizing the surgical techniques, it is possible to decrease complication rates and achieve more predictable surgical outcomes, thereby increasing patient satisfaction. The technical strategies used by the author to standardize facial bone contouring procedures are introduced here. The author uses various pre-manufactured surgical tools and hardware for facial bone contouring. During a reduction malarplasty or genioplasty procedure, double-bladed reciprocating saws and pre-bent titanium plates customized for the zygomatic body, arch and chin are used. Various guarded oscillating saws are used for mandibular angleplasty. The use of double-bladed saws and pre-bent plates to perform reduction malarplasty reduces the chances of post-operative asymmetry or under- or overcorrection of the zygoma contours due to technical faults. Inferior alveolar nerve injury and post-operative jawline asymmetry or irregularity can be reduced by using a guarded saw during mandibular angleplasty. For genioplasty, final placement of the chin in accordance with preoperative quantitative analysis can be easily performed with pre-bent plates, and a double-bladed saw allows more procedural accuracy during osteotomies. Efforts by the surgeon to avoid unintentional faults are key to achieving satisfactory results and reducing the incidence of complications. The surgical techniques described in this study in conjunction with various in-house surgical tools and modified hardware can be used to standardize techniques to achieve aesthetically gratifying outcomes. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  9. Human reliability assessment in a 99Mo/99mTc generator production facility using the standardized plant analysis risk-human (SPAR-H) technique.

    Science.gov (United States)

    Eyvazlou, Meysam; Dadashpour Ahangar, Ali; Rahimi, Azin; Davarpanah, Mohammad Reza; Sayyahi, Seyed Soheil; Mohebali, Mehdi

    2018-02-13

    Reducing human error is an important factor for enhancing safety protocols in various industries. Hence, analysis of the likelihood of human error in nuclear industries such as radiopharmaceutical production facilities has become more essential. This cross-sectional descriptive study was conducted to quantify the probability of human errors in a 99Mo/99mTc generator production facility in Iran. First, through expert interviews, the production process of the 99Mo/99mTc generator was analyzed using hierarchical task analysis (HTA). The standardized plant analysis risk-human (SPAR-H) method was then applied in order to calculate the probability of human error. Twenty tasks were determined using HTA. All eight performance shaping factors (PSFs) were evaluated for each task. The mean probability of human error was 0.320. The highest and the lowest probabilities of human error in the 99Mo/99mTc generator production process, related to the 'loading the generator with the molybdenum solution' task and the 'generator elution' task, were 0.858 and 0.059, respectively. Measures required for reducing the human error probability (HEP) were suggested. These measures were derived from the levels of the PSFs evaluated in this study.
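    The SPAR-H quantification step described above multiplies a nominal HEP by the eight PSF multipliers, with an adjustment when three or more PSFs are negative. A minimal sketch (the nominal HEPs and adjustment rule are generic SPAR-H method values; the task ratings below are hypothetical, not the study's data):

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """SPAR-H human error probability.

    nominal_hep: 0.001 for action tasks, 0.01 for diagnosis tasks.
    psf_multipliers: one multiplier per performance shaping factor (PSF);
    a multiplier > 1 marks a negative (error-increasing) PSF rating.
    When 3 or more PSFs are negative, SPAR-H applies an adjustment factor
    to keep the result a valid probability.
    """
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1)
    if negative >= 3:
        return (nominal_hep * composite) / (nominal_hep * (composite - 1) + 1)
    return min(nominal_hep * composite, 1.0)

# Hypothetical action task rated poorly on stress, complexity and procedures
# (multipliers 2, 2, 5), nominal on the remaining five PSFs:
hep = spar_h_hep(0.001, [2, 2, 5, 1, 1, 1, 1, 1])
print(f"HEP = {hep:.4f}")
```

    With three negative PSFs the adjustment factor engages, giving (0.001 x 20) / (0.001 x 19 + 1), slightly below the unadjusted 0.02.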

  10. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
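    The efficiency/purity trade-off can be shown with a toy one-variable selection (the Gaussian shapes, event yields and cut value below are assumptions chosen for illustration, not from any experiment):

```python
import random

random.seed(7)

# Toy model: one discriminating variable x, with rare signal events drawn
# from N(1.5, 1) and a much more copious background from N(0, 1).
signal = [random.gauss(1.5, 1.0) for _ in range(1000)]
background = [random.gauss(0.0, 1.0) for _ in range(10000)]

# Select events above a simple cut; for two equal-width Gaussians a cut on
# x is monotonically related to the likelihood ratio, i.e. the optimal
# one-dimensional test statistic.
cut = 1.0
sig_pass = sum(1 for x in signal if x > cut)
bkg_pass = sum(1 for x in background if x > cut)

purity_before = len(signal) / (len(signal) + len(background))
purity_after = sig_pass / (sig_pass + bkg_pass)
efficiency = sig_pass / len(signal)

print(f"signal efficiency = {efficiency:.2f}, "
      f"purity: {purity_before:.2f} -> {purity_after:.2f}")
```

    Multivariate methods generalize this idea by combining many such variables into one discriminant, buying purity at a smaller cost in efficiency than any single cut.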

  11. Improvement of AC motor reliability from technique standardization

    International Nuclear Information System (INIS)

    Muniz, P.R.; Faria, M.D.R.; Mendes, M.P.; Silva, J.N.; Dos Santos, J.D.

    2005-01-01

    The purpose of this paper is to explain the increase of reliability of motors serviced in the Electrical Maintenance Shop of Companhia Siderurgica de Tubarao by standardization of the technique based on Brazilian and International Standards, manufacturer's recommendations and the experience of the maintenance staff. (author)

  12. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  13. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they can be applied.

  14. Standard Establishment Through Scenarios (SETS): A new technique for occupational fitness standards.

    Science.gov (United States)

    Blacklock, R E; Reilly, T J; Spivock, M; Newton, P S; Olinek, S M

    2015-01-01

    An objective and scientific task analysis provides the basis for establishing legally defensible Physical Employment Standards (PES), based on common and essential occupational tasks. Infrequent performance of these tasks creates challenges when developing PES based on criterion, or content validity. Develop a systematic approach using Subject Matter Experts (SME) to provide tasks with 1) an occupationally relevant scenario considered common to all personnel; 2) a minimum performance standard defined by time, distance, load or work. Examples provided here relate to the development of a new PES for the Canadian Armed Forces (CAF). SME of various experience are selected based on their eligibility criteria. SME are required to define a reasonable scenario for each task from personal experience, provide occupational performance requirements of the scenario in sub-groups, and discuss and agree by consensus vote on the final standard based on the definition of essential. A common and essential task for the CAF is detailed as a case example of process application. Techniques to avoid common SME rating errors are discussed and advantages to the method described. The SETS method was developed as a systematic approach to setting occupational performance standards and qualifying information from SME.

  15. Paediatric sutureless circumcision-an alternative to the standard technique.

    LENUS (Irish Health Repository)

    2012-01-31

    INTRODUCTION: Circumcision is one of the most commonly performed surgical procedures in male children. A range of surgical techniques exist for this commonly performed procedure. The aim of this study is to assess the safety, functional outcome and cosmetic appearance of a sutureless circumcision technique. METHODS: Over a 9-year period, 502 consecutive primary sutureless circumcisions were performed by a single surgeon. All 502 cases were entered prospectively into a database including all relevant clinical details and a review was performed. The technique used to perform the sutureless circumcision is a modification of the standard sleeve technique with the use of a bipolar diathermy and the application of 2-octyl cyanoacrylate (2-OCA) to approximate the tissue edges. RESULTS: All boys in this study were pre-pubescent and the ages ranged from 6 months to 12 years (mean age 3.5 years). All patients had this procedure performed as a day case and under general anaesthetic. Complications included: haemorrhage (2.2%), haematoma (1.4%), wound infection (4%), allergic reaction (0.2%) and wound dehiscence (0.8%). Only 9 (1.8%) parents or patients were dissatisfied with the cosmetic appearance. CONCLUSION: The use of 2-OCA as a tissue adhesive for sutureless circumcisions is an alternative to the standard suture technique. The use of this tissue adhesive, 2-OCA, results in comparable complication rates to the standard circumcision technique and results in excellent post-operative cosmetic satisfaction.

  16. Standardization of P-33 by the TDCR efficiency calculation technique

    CSIR Research Space (South Africa)

    Simpson, BRS

    2004-02-01

    Full Text Available The activity of the pure beta-emitter phosphorus-33 (P-33) has been directly determined by the triple-to-double coincidence ratio (TDCR) efficiency calculation technique, thus extending the number of radionuclides that have been standardized...

  17. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  18. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content in shredded sugar cane. (U.K.)

  19. Proposed minimum reporting standards for chemical analysis Chemical Analysis Working Group (CAWG) Metabolomics Standards Initiative (MSI)

    Science.gov (United States)

    Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.

    2013-01-01

    There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments, of which the Chemical Analysis Working Group (CAWG) is a member of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or by e-mail to msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616

  20. A new technique for the deposition of standard solutions in total reflection X-ray fluorescence spectrometry (TXRF) using pico-droplets generated by inkjet printers and its applicability for aerosol analysis with SR-TXRF

    International Nuclear Information System (INIS)

    Fittschen, U.E.A.; Hauschild, S.; Amberger, M.A.; Lammel, G.; Streli, C.; Foerster, S.; Wobrauschek, P.; Jokubonis, C.; Pepponi, G.; Falkenberg, G.; Broekaert, J.A.C.

    2006-01-01

    A new technique for the deposition of standard solutions on particulate aerosol samples using pico-droplets for elemental determinations with total reflection X-ray fluorescence spectrometry (TXRF) is described. It enables short analysis times without influencing the sample structure and avoids time-consuming scanning of the sample with the exciting beam in SR-TXRF analysis. Droplets of picoliter volume (∼ 5-130 pL) were generated with commercially available and slightly modified inkjet printers operated with popular image processing software. The size of the dried droplets on surfaces of different polarity, namely silicone coated and untreated quartz reflectors, was determined for five different printer types and ten different cartridge types. The results show that droplets generated by inkjet printers are between 50 and 200 μm in diameter (corresponding to volumes of 5 to 130 pL) depending on the cartridge type, which is smaller than the width of the synchrotron beam used in the experiments (< 1 mm at an energy of 17 keV at the beamline L at HASYLAB, Hamburg). The precision of the printing of a certain amount of a single element standard solution was found to be comparable to aliquoting with micropipettes in TXRF, where for 2.5 ng of cobalt relative standard deviations of 12% are found. However, it could be shown that the printing of simple patterns is possible, which is important when structured samples have to be analysed.

  1. Force coordination in static manipulation tasks performed using standard and non-standard grasping techniques.

    Science.gov (United States)

    de Freitas, Paulo B; Jaric, Slobodan

    2009-04-01

    We evaluated coordination of the hand grip force (GF; normal component of the force acting at the hand-object contact area) and load force (LF; the tangential component) in a variety of grasping techniques and two LF directions. Thirteen participants exerted a continuous sinusoidal LF pattern against externally fixed handles applying both standard (i.e., using either the tips of the digits or the palms; the precision and palm grasps, respectively) and non-standard grasping techniques (using wrists and the dorsal finger areas; the wrist and fist grasp). We hypothesized (1) that the non-standard grasping techniques would provide deteriorated indices of force coordination when compared with the standard ones, and (2) that the nervous system would be able to adjust GF to the differences in friction coefficients of various skin areas used for grasping. However, most of the indices of force coordination remained similar across the tested grasping techniques, while the GF adjustments for the differences in friction coefficients (highest in the palm and the lowest in the fist and wrist grasp) provided inconclusive results. As hypothesized, GF relative to the skin friction was lowest in the precision grasp, but highest in the palm grasp. Therefore, we conclude that (1) the elaborate coordination of GF and LF consistently seen across the standard grasping techniques could be generalized to the non-standard ones, while (2) the ability to adjust GF using the same grasping technique to the differences in friction of various objects cannot be fully generalized to the GF adjustment when different grasps (i.e., hand segments) are used to manipulate the same object. Due to the importance of the studied phenomena for understanding both the functional and neural control aspects of manipulation, future studies should extend the current research to the transient and dynamic tasks, as well as to the general role of friction in our mechanical interactions with the environment.
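    The grip-to-load adjustment studied above follows from simple slip mechanics: the total friction force across the contact surfaces must at least match the load force. A minimal sketch under that standard assumption (the friction coefficients and load below are hypothetical illustrations, not the study's measurements):

```python
def min_grip_force(load_force, mu, contacts=2):
    """Minimum grip (normal) force GF that prevents slip.

    Each of the opposing contact surfaces contributes friction mu * GF,
    so slip is avoided when contacts * mu * GF >= load_force
    (a standard grasp-mechanics relation, not the study's own model).
    """
    return load_force / (contacts * mu)

# Hypothetical friction coefficients: fingertip skin (precision grasp)
# grips better than the dorsal side of the fingers (fist grasp), so the
# same 10 N load force demands more grip force from the fist.
gf_precision = min_grip_force(10.0, 0.8)  # 6.25 N
gf_fist = min_grip_force(10.0, 0.4)       # 12.5 N
print(f"precision grasp: {gf_precision:.2f} N, fist grasp: {gf_fist:.2f} N")
```

    Grip force exerted above this minimum is the safety margin; the study's finding that GF relative to skin friction was lowest in the precision grasp corresponds to the smallest relative margin in this model.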

  2. NET European Network on Neutron Techniques Standardization for Structural Integrity

    International Nuclear Information System (INIS)

    Youtsos, A.

    2004-01-01

    Improved performance and safety of European energy production systems is essential for providing safe, clean and inexpensive electricity to the citizens of the enlarged EU. The state of the art in assessing internal stresses, micro-structure and defects in welded nuclear components -as well as their evolution due to complex thermo-mechanical loads and irradiation exposure -needs to be improved before relevant structural integrity assessment code requirements can safely become less conservative. This is valid for both experimental characterization techniques and predictive numerical algorithms. In the course of the last two decades neutron methods have proven to be excellent means for providing valuable information required in structural integrity assessment of advanced engineering applications. However, the European industry is hampered from broadly using neutron research due to lack of harmonised and standardized testing methods. 35 European major industrial and research/academic organizations have joined forces, under JRC coordination, to launch the NET European Network on Neutron Techniques Standardization for Structural Integrity in May 2002. The NET collaborative research initiative aims at further development and harmonisation of neutron scattering methods, in support of structural integrity assessment. This is pursued through a number of testing round robin campaigns on neutron diffraction and small angle neutron scattering - SANS and supported by data provided by other more conventional destructive and non-destructive methods, such as X-ray diffraction and deep and surface hole drilling. NET also strives to develop more reliable and harmonized simulation procedures for the prediction of residual stress and damage in steel welded power plant components. This is pursued through a number of computational round robin campaigns based on advanced FEM techniques, and on reliable data obtained by such novel and harmonized experimental methods. The final goal of

  3. New quantitative safety standards : Different techniques, different results?

    NARCIS (Netherlands)

    Rouvroye, J.L.; Brombacher, A.C.; Lydersen, S.; Hansen, G.K.; Sandtor, H.

    1998-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many parameters can influence the safety of a SIS, such as system layout, diagnostics, testing and repair. In standards like the German DIN [DIN19250, DIN0801], no quantitative analysis was demanded.
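    The quantitative analysis that newer standards call for can be as simple as an average probability-of-failure-on-demand estimate. A minimal sketch using the common low-demand approximation for a single channel (the failure rate and proof-test interval below are hypothetical, and this 1oo1 formula is a generic textbook approximation, not taken from the paper):

```python
def pfd_avg_1oo1(lambda_du_per_hour, proof_test_interval_hours):
    """Average probability of failure on demand for a single (1oo1)
    channel, using the common low-demand approximation
    PFDavg ~= lambda_DU * TI / 2, where lambda_DU is the dangerous
    undetected failure rate and TI the proof-test interval."""
    return lambda_du_per_hour * proof_test_interval_hours / 2.0

# Hypothetical channel: lambda_DU = 2e-6 per hour, annual proof test.
pfd = pfd_avg_1oo1(2e-6, 8760)
print(f"PFDavg = {pfd:.2e}")
```

    Different techniques (Markov models, fault trees, this approximation) can indeed give different results for the same system, which is the paper's point: parameters such as diagnostics and repair enter each model differently.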

  4. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check the stress analysis technique based on 3D models, making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods will be made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS and working directly from documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  5. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
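    One "well-known mathematical technique" in this spirit is ordinary least squares: fit a simple runtime model to a few measurements and extrapolate to larger problem sizes. A sketch with made-up timings (the data points and the linear model t = a + b*n are illustrative assumptions, not from the report):

```python
# Hypothetical (problem size, wall-clock seconds) measurements for one
# code on one fixed node configuration.
data = [(1000, 1.1), (2000, 2.05), (4000, 3.9), (8000, 8.2)]

# Closed-form ordinary least squares for t = a + b*n.
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # seconds per unit of size
a = (sy - b * sx) / n                          # fixed overhead in seconds

def predict(size):
    """Extrapolated runtime under the fitted linear model."""
    return a + b * size

print(f"t(16000) ~= {predict(16000):.2f} s")
```

    Real performance models add terms for the variables the abstract lists (thread pool size, file system load, MPI configuration), which is exactly where such simple fits start to lose applicability at scale.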

  6. National Green Building Standard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-07-01

    DOE's Building America Program is a research and development program to improve the energy performance of new and existing homes. The ultimate goal of the Building America Program is to achieve examples of cost-effective, energy efficient solutions for all U.S. climate zones. Periodic maintenance of an ANSI standard by review of the entire document and action to revise or reaffirm it on a schedule not to exceed five years is required by ANSI. In compliance, a consensus group has once again been formed and the National Green Building Standard is currently being reviewed to comply with the periodic maintenance requirement of an ANSI standard.

  7. Standardized technique for single port laparoscopic ileostomy and colostomy.

    Science.gov (United States)

    Shah, A; Moftah, M; Hadi Nahar Al-Furaji, H; Cahill, R A

    2014-07-01

    Single site laparoscopic techniques and technology exploit maximum usefulness from confined incisions. The formation of an ileostomy or colostomy seems very applicable for this modality as the stoma occupies the solitary incision obviating any additional wounds. Here we detail the principles of our approach to defunctioning loop stoma formation using single port laparoscopic access in a stepwise and standardized fashion along with the salient specifics of five illustrative patients. No specialized instrumentation is required and the single access platform is established table-side using the 'glove port' technique. The approach has the intra-operative advantage of excellent visualization of the correct intestinal segment for exteriorization along with direct visual control of its extraction to avoid twisting. Postoperatively, abdominal wall trauma has been minimal allowing convalescence and stoma care education with only one parietal incision. Single incision stoma siting proves a ready, robust and reliable technique for diversion ileostomy and colostomy with a minimum of operative trauma for the patient. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.

  8. Compressed air injection technique to standardize block injection pressures.

    Science.gov (United States)

    Tsui, Ban C H; Li, Lisa X Y; Pillay, Jennifer J

    2006-11-01

    Presently, no standardized technique exists to monitor injection pressures during peripheral nerve blocks. Our objective was to determine if a compressed air injection technique, using an in vitro model based on Boyle's law and typical regional anesthesia equipment, could consistently maintain injection pressures below a 1293 mmHg level associated with clinically significant nerve injury. Injection pressures for 20 and 30 mL syringes with various needle sizes (18G, 20G, 21G, 22G, and 24G) were measured in a closed system. A set volume of air was aspirated into a saline-filled syringe and then compressed and maintained at various percentages while pressure was measured. The needle was inserted into the injection port of a pressure sensor, which had attached extension tubing with an injection plug clamped "off". Using linear regression with all data points, the pressure value and 99% confidence interval (CI) at 50% air compression was estimated. The linearity of Boyle's law was demonstrated with a high correlation, r = 0.99, and a slope of 0.984 (99% CI: 0.967-1.001). The net pressure generated at 50% compression was estimated as 744.8 mmHg, with the 99% CI between 729.6 and 760.0 mmHg. The various syringe/needle combinations had similar results. By creating and maintaining syringe air compression at 50% or less, injection pressures will be substantially below the 1293 mmHg threshold considered to be an associated risk factor for clinically significant nerve injury. This technique may allow simple, real-time and objective monitoring during local anesthetic injections while inherently reducing injection speed.
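
    The pressure figures above follow directly from Boyle's law (P1·V1 = P2·V2 at constant temperature). A minimal sketch of the calculation, assuming a 760 mmHg atmospheric baseline (the function name and framing are ours, not the authors'):

```python
def net_injection_pressure(compression_fraction, p_atm=760.0):
    """Gauge pressure (mmHg) generated when an air column initially at
    atmospheric pressure is compressed to (1 - compression_fraction) of
    its original volume, per Boyle's law P1*V1 = P2*V2."""
    remaining_volume_fraction = 1.0 - compression_fraction  # V2 / V1
    p_absolute = p_atm / remaining_volume_fraction          # P2 = P1 * V1 / V2
    return p_absolute - p_atm                               # net (gauge) pressure

# At 50% compression the ideal-gas prediction is 760 mmHg gauge, close to the
# 744.8 mmHg estimated in the study and well below the 1293 mmHg threshold.
print(net_injection_pressure(0.50))  # → 760.0
```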

  9. ASTM standards for fire debris analysis: a review.

    Science.gov (United States)

    Stauffer, Eric; Lentini, John J

    2003-03-12

    The American Society for Testing and Materials (ASTM) recently updated its standards E 1387 and E 1618 for the analysis of fire debris. The changes in the classification of ignitable liquids are presented in this review. Furthermore, a new standard on extraction of fire debris with solid phase microextraction (SPME) was released. Advantages and drawbacks of this technique are presented and discussed. Also, the standard on cleanup by acid stripping has not been reapproved. Fire debris analysts that use the standards should be aware of these changes.

  10. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  11. Rat pancreatic islet size standardization by the "hanging drop" technique.

    Science.gov (United States)

    Cavallari, G; Zuellig, R A; Lehmann, R; Weber, M; Moritz, W

    2007-01-01

    Rejection and hypoxia are the main factors that limit islet engraftment in the recipient liver in the immediate posttransplant period. Recently authors have reported a negative relationship between graft function and islet size, concluding that small islets are superior to large islets. Islets can be dissociated into single cells and reaggregated into so-called "pseudoislets," which are functionally equivalent to intact islets but exhibit reduced immunogenicity. The aim of our study was to develop a technique that enabled one to obtain pseudoislets of defined, preferably small, dimensions. Islets were harvested from Lewis rats by the collagenase digestion procedure. After purification, the isolated islets were dissociated into single cells by trypsin digestion. Fractions with different cell numbers were seeded into single drops onto cell culture dishes, which were inverted and incubated for 5 to 8 days under cell culture conditions. Newly formed pseudoislets were analyzed for dimension, morphology, and cellular composition. The volume of reaggregated pseudoislets strongly correlated with the cell number (r(2) = .995). The average diameter of a 250-cell aggregate was 95 +/- 8 microm (mean +/- SD) compared with 122 +/- 46 microm of freshly isolated islets. Islet cell loss may be minimized by performing reaggregation in the presence of medium glucose (11 mmol/L) and the GLP-1 analogue Exendin-4. Morphology, cellular composition, and architecture of reaggregated islets were comparable to intact islets. The "hanging drop" culture method allowed us to obtain pseudoislets of standardized size and regular shape, which did not differ from intact islets in terms of cellular composition or architecture. Further investigations are required to minimize cell loss and test in vivo function of transplanted pseudoislets.
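
    Because the reported aggregate volume scales linearly with cell number, the diameter of a roughly spherical pseudoislet should scale with the cube root of the cell count. A hedged sketch anchored to the reported 95 microm mean diameter of a 250-cell aggregate (the spherical-geometry assumption and the function are ours, not the authors'):

```python
def pseudoislet_diameter(n_cells, ref_cells=250, ref_diameter_um=95.0):
    """Estimated pseudoislet diameter (microns), assuming aggregate
    volume is proportional to cell number (d ~ N**(1/3)), anchored to
    the reported 95 microm mean diameter of a 250-cell aggregate."""
    return ref_diameter_um * (n_cells / ref_cells) ** (1.0 / 3.0)

# Doubling the cell count increases the diameter by only ~26%:
print(round(pseudoislet_diameter(500), 1))  # → 119.7
```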

  12. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operations of the movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.

  13. Analysis of standard substance human hair

    International Nuclear Information System (INIS)

    Zou Shuyun; Zhang Yongbao

    2005-01-01

    The human hair samples as standard substances were analyzed by the neutron activation analysis (NAA) on the miniature neutron source reactor. 19 elements, i.e. Al, As, Ba, Br, Ca, Cl, Cr, Co, Cu, Fe, Hg, I, Mg, Mn, Na, S, Se, V and Zn, were measured. The average content, standard deviation, relative standard deviation and the detection limit under the present research conditions were given for each element, and the results showed that the measured values of the samples were in agreement with the recommended values, which indicated that NAA can be used to analyze standard substance human hair with a relatively high accuracy. (authors)
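
    The per-element figures reported here (average content, standard deviation, relative standard deviation) are ordinary summary statistics over replicate determinations. A minimal sketch with hypothetical replicate values, not data from the study:

```python
import statistics

def element_stats(measurements):
    """Mean content, standard deviation, and relative standard
    deviation (RSD, %) for replicate determinations of one element."""
    mean = statistics.mean(measurements)
    sd = statistics.stdev(measurements)
    return mean, sd, 100.0 * sd / mean

# e.g. five hypothetical replicate Zn determinations (microgram/g):
mean, sd, rsd = element_stats([198.0, 203.5, 201.2, 196.8, 200.5])
print(f"mean={mean:.1f}  sd={sd:.2f}  RSD={rsd:.2f}%")
```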

  14. [Standardization of Blastocystis hominis diagnosis using different staining techniques].

    Science.gov (United States)

    Eymael, Dayane; Schuh, Graziela Maria; Tavares, Rejane Giacomelli

    2010-01-01

    The present study was carried out from March to May 2008, with the aim of evaluating the effectiveness of different techniques for diagnosing Blastocystis hominis in a sample of the population attended at the Biomedicine Laboratory of Feevale University, Novo Hamburgo, Rio Grande do Sul. One hundred feces samples from children and adults were evaluated. After collection, the samples were subjected to the techniques of spontaneous sedimentation (HPJ), sedimentation in formalin-ether (Ritchie) and staining by means of Gram and May-Grünwald-Giemsa (MGG). The presence of Blastocystis hominis was observed in 40 samples when staining techniques were used (MGG and Gram), while sedimentation techniques were less efficient (32 positive samples using the Ritchie technique and 20 positive samples using the HPJ technique). Our results demonstrate that HPJ was less efficient than the other methods, thus indicating the need to include laboratory techniques that enable parasite identification on a routine basis.

  15. [Study on standardization of cupping technique: elucidation on the establishment of the National Standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping].

    Science.gov (United States)

    Gao, Shu-zhong; Liu, Bing

    2010-02-01

    From the aspects of basis, technique descriptions, core contents, problems and solutions, and standard thinking in the standard-setting process, this paper states experiences in the establishment of the national standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping, focusing on the methodologies used in the cupping standard-setting process, the method selection and operating instructions of cupping standardization, and the characteristics of standard TCM. In addition, this paper states the scope of application and precautions for this cupping standardization. This paper also explains tentative ideas on the research of standardized manipulation of acupuncture and moxibustion.

  16. Lightning protection, techniques, applied codes and standards. Vol. 4

    International Nuclear Information System (INIS)

    Mahmoud, M.; Shaaban, H.; Lamey, S.

    1996-01-01

    Lightning is the only natural disaster against which protection is highly effective. Therefore, for the safety of critical installations, specifically nuclear ones, an effective lightning protection system (LPS) is required. The design and installation of LPSs have been addressed by many international codes and standards. In this paper, the various LPSs are discussed and compared, including radioactive air terminals, ionizing air terminals, and terminals equipped with electrical triggering devices. Also, the so-called dissipation array systems are discussed and compared to other systems technically and economically. Moreover, the available international codes and standards related to lightning protection are discussed. Such standards include those published by the National Fire Protection Association (NFPA), Lightning Protection Institute (LPI), Underwriters Laboratories (UL), and British Standards. Finally, the possibility of developing an Egyptian national standard is discussed

  17. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
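
    An experience curve models price as a power law of cumulative production, P = a·Q^(−b), where the learning rate 1 − 2^(−b) is the fractional price decline per doubling of cumulative shipments. A sketch of the standard log-log least-squares fit on synthetic data (the numbers are illustrative, not BLS data):

```python
import math

def fit_experience_curve(cumulative_shipments, prices):
    """Least-squares fit of P = a * Q**(-b) in log-log space.
    Returns (a, b, learning_rate)."""
    xs = [math.log(q) for q in cumulative_shipments]
    ys = [math.log(p) for p in prices]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    b = -slope                       # experience exponent
    a = math.exp(my + b * mx)        # price at Q = 1
    return a, b, 1.0 - 2.0 ** (-b)  # learning rate per doubling

# Synthetic price series with a built-in ~20% learning rate (b = 0.322):
qs = [1e6, 2e6, 4e6, 8e6]
ps = [100.0 * (q / 1e6) ** -0.322 for q in qs]
a, b, lr = fit_experience_curve(qs, ps)
print(round(b, 3), round(lr, 3))  # → 0.322 0.2
```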

  18. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis, and accelerator mass spectrometry. The applications of these nuclear analysis techniques in the environmental sciences are presented and reviewed

  19. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series, and Rutherford Backscattering for use in the analysis of archaeological specimens and materials are described. Also, some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are referred to. (Author)

  20. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical, and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system.

  1. Chemical analysis by nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical, and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system.

  2. Incorporating experience curves in appliance standards analysis

    International Nuclear Information System (INIS)

    Desroches, Louis-Benoit; Garbesi, Karina; Kantner, Colleen; Van Buskirk, Robert; Yang, Hung-Chia

    2013-01-01

    There exists considerable evidence that manufacturing costs and consumer prices of residential appliances have decreased in real terms over the last several decades. This phenomenon is generally attributable to manufacturing efficiency gained with cumulative experience producing a certain good, and is modeled by an empirical experience curve. The technical analyses conducted in support of U.S. energy conservation standards for residential appliances and commercial equipment have, until recently, assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. This assumption does not reflect real market price dynamics. Using price data from the Bureau of Labor Statistics, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These experience curves were incorporated into recent energy conservation standards analyses for these products. Including experience curves increases the national consumer net present value of potential standard levels. In some cases a potential standard level exhibits a net benefit when considering experience, whereas without experience it exhibits a net cost. These results highlight the importance of modeling more representative market prices. - Highlights: ► Past appliance standards analyses have assumed constant equipment prices. ► There is considerable evidence of consistent real price declines. ► We incorporate experience curves for several large appliances into the analysis. ► The revised analyses demonstrate larger net present values of potential standards. ► The results imply that past standards analyses may have undervalued benefits.

  3. A Secure Test Technique for Pipelined Advanced Encryption Standard

    Science.gov (United States)

    Shi, Youhua; Togawa, Nozomu; Yanagisawa, Masao; Ohtsuki, Tatsuo

    In this paper, we present a Design-for-Secure-Test (DFST) technique for pipelined AES to guarantee both security and test quality during testing. Unlike previous works, the proposed method can keep all the secrets inside and provide high test quality and fault diagnosis ability as well. Furthermore, the proposed DFST technique can significantly reduce test application time, test data volume, and test generation effort as additional benefits.

  4. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  5. Extended standard vector analysis for plasma physics

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-02-01

    Standard vector analysis in 3-dimensional space, as found in most tables and textbooks, is complemented by a number of basic formulas that seem to be largely unknown, but are important in themselves and for some plasma physics applications, as is shown by several examples. (orig.)

  6. Technique for fabrication of gradual standards of radiographic image blackening density

    International Nuclear Information System (INIS)

    Borovin, I.V.; Kondina, M.A.

    1987-01-01

    The technique of fabricating gradual standards of blackening density for industrial radiography by contact printing from a negative is described. The technique is designed for industrial radiation-defectoscopy laboratories possessing no special-purpose sensitometric equipment.

  7. Standard practice for leaks using bubble emission techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This practice describes accepted procedures for and factors that influence laboratory immersion corrosion tests, particularly mass loss tests. These factors include specimen preparation, apparatus, test conditions, methods of cleaning specimens, evaluation of results, and calculation and reporting of corrosion rates. This practice also emphasizes the importance of recording all pertinent data and provides a checklist for reporting test data. Other ASTM procedures for laboratory corrosion tests are tabulated in the Appendix. (Warning-In many cases the corrosion product on the reactive metals titanium and zirconium is a hard and tightly bonded oxide that defies removal by chemical or ordinary mechanical means. In many such cases, corrosion rates are established by mass gain rather than mass loss.) 1.2 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only. This standard does not purport to address all of the safety concerns, if any, assoc...

  8. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  9. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
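
    Quantifying detection performance as probability of detection versus probability of false alarm traces out an ROC curve as the decision threshold is swept. A toy sketch for a simple Gaussian threshold detector (the model and parameter values are illustrative assumptions, not the paper's):

```python
import math

def pd_pfa(threshold, signal_mean, noise_sigma=1.0):
    """Probability of detection and of false alarm for a threshold test:
    noise ~ N(0, sigma), signal-plus-noise ~ N(signal_mean, sigma)."""
    def gaussian_tail(x):
        return 0.5 * math.erfc(x / math.sqrt(2.0))
    pfa = gaussian_tail(threshold / noise_sigma)
    pd = gaussian_tail((threshold - signal_mean) / noise_sigma)
    return pd, pfa

# Sweeping the threshold traces out the ROC curve:
for t in (0.5, 1.0, 1.5, 2.0):
    pd, pfa = pd_pfa(t, signal_mean=2.0)
    print(f"threshold={t:.1f}  Pd={pd:.3f}  Pfa={pfa:.3f}")
```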

  10. A standardized surgical technique for rat superior cervical ganglionectomy

    DEFF Research Database (Denmark)

    Savastano, Luis Emilio; Castro, Analía Elizabeth; Fitt, Marcos René

    2010-01-01

    Superior cervical ganglionectomy (SCGx) is a valuable microsurgical model to study the role of the sympathetic nervous system in a vast array of physiological and pathological processes, including homeostatic regulation, circadian biology and the dynamics of neuronal dysfunction and recovery afte...... expect that the following standardized and optimized protocol will allow researchers to organize knowledge into a cohesive framework in those areas where the SCGx is applied....

  11. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  12. Palliative Spleen Irradiation: Can we Standardize its Technique?

    International Nuclear Information System (INIS)

    NAZMY, M.S.; RADWAN, A.; MOKHTAR, M.

    2008-01-01

    To explore the pattern of practice of palliative splenic irradiation (PSI) at the National Cancer Institute (NCI), Cairo University. Patients and Methods: The medical records of patients referred for PSI during the time period from 1990 to 2005 were retrospectively reviewed. We compared the three most common planning techniques (two parallel opposing fields, single direct field, anterior and lateral fields). Results: Eighteen patients who received PSI were identified. Thirteen patients were diagnosed with CML and 5 with CLL. The mean age of the patients was 44 (±16) years and the majority were men (60%). Spleen enlargement was documented in all cases. The single direct anterior field was the most commonly used technique. The dose per fraction ranged from 25 cGy to 100 cGy. The total dose ranged from 125 cGy to 1200 cGy and the median was 200 cGy (mean 327 cGy). There was no significant difference between CML and CLL patients regarding the dose level. Three out of 5 CLL patients and only one out of 13 CML patients received re-irradiation. All patients showed subjective improvement regarding pain and swelling. There was a significant increase in the hemoglobin level and a significant decrease in the WBC count. The single direct field shows variations in the dose from 56 to 102%; however, it is the simplest and the best regarding the dose to the surrounding normal tissues, especially the kidney and the liver. Conclusion: PSI has a significant palliative benefit. Although the most widely accepted technique is the 2 parallel opposing anterior-posterior fields, a single anterior field is also considered a suitable option. Higher doses are needed for CLL patients compared to CML patients

  13. Standard evaluation techniques for containment and surveillance radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1982-01-01

    Evaluation techniques used at Los Alamos for personnel and vehicle radiation monitors that safeguard nuclear material determine the worst-case sensitivity. An evaluation tests a monitor's lowest-sensitivity regions with sources that have minimum emission rates. The results of our performance tests are analyzed as a binomial experiment. The number of trials required to verify the monitor's probability of detection is determined by a graph derived from the confidence limits for a binomial distribution. Our testing results are reported in a way that characterizes the monitor yet does not compromise security by revealing its routine performance for detecting process materials.
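
    The binomial sizing described above answers: how many consecutive successful trials, with no misses, are needed to verify a claimed detection probability at a given confidence level? A minimal sketch of the standard zero-failure calculation (a textbook formula; the specific numbers are not from this paper):

```python
import math

def trials_required(p_detect, confidence):
    """Smallest n such that a monitor whose true detection probability
    is below p_detect would pass n straight trials with probability
    less than 1 - confidence: solve p_detect**n <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(p_detect))

# Verifying a 95% detection probability at 95% confidence:
print(trials_required(0.95, 0.95))  # → 59 consecutive detections
```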

  14. Basic prediction techniques in modern video coding standards

    CERN Document Server

    Kim, Byung-Gyu

    2016-01-01

    This book discusses in detail the basic algorithms of video compression that are widely used in modern video codecs. The authors dissect complicated specifications and present the material in a way that gets readers quickly up to speed by describing video compression algorithms succinctly, without going into the mathematical details and technical specifications. For accelerated learning, the hybrid codec structure and the inter- and intra-prediction techniques in MPEG-4, H.264/AVC, and HEVC are discussed together. In addition, the latest research in fast encoder design for HEVC and H.264/AVC is also included.

  15. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  16. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    Proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental applications that have been carried out at JAERI Takasaki. (author)

  17. Standards in radiographically guided biopsies - indications, techniques, complications

    International Nuclear Information System (INIS)

    Feuerbach, S.; Schreyer, A.; Schlottmann, K.

    2003-01-01

    First, different needle types are presented, in particular biopsy cannulae applying the ''TruCut'' principle and devices suitable for bone biopsy. Important aids for daily practice, such as the tandem technique and the coaxial technique, are presented. Advantages and disadvantages are discussed, together with the most important guidance modalities: target-directed fluoroscopy, sonography, computed tomography, and CT fluoroscopy. Local anesthesia and analgosedation are presented, and the general and specific caliber- or entrance-dependent contraindications are described. The literature is reviewed for data on severe complications, such as death or tumor cell deposits along the puncture site. For the different targets in the thorax and abdomen, the typical indications, points of entrance, contraindications, complications and special techniques are described, and the value of biopsy for these localizations is presented. Under the heading ''Tips and Tricks'', practical advice useful for the daily routine can be found. (orig.) [de

  18. Preliminary detection of explosive standard components with Laser Raman Technique

    International Nuclear Information System (INIS)

    Botti, S.; Ciardi, R.

    2008-01-01

    Our section currently leads the ISOTREX project (Integrated System for On-line TRace EXplosives detection in solid, liquid and vapour state), funded in the frame of the PASR 2006 action (Preparatory Action on the enhancement of the European industrial potential in the field of Security Research) of the 6th EC Framework Programme. The ISOTREX project will exploit the capabilities of different laser techniques, such as LIBS (Laser Induced Breakdown Spectroscopy), LPA (Laser Photoacoustic spectroscopy) and CRDS (Cavity Ring Down Spectroscopy), to monitor explosive traces. In this frame, we extended our investigation to laser-induced Raman spectroscopy, in order to assess its capabilities and possible future integration. We analysed explosive samples in bulk solid phase, in diluted liquid phase and as films evaporated onto suitable substrates. In the following, we present the main results obtained, outlining preliminary conclusions.

  19. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their applicability to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends developing now a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values be examined. (author)
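
    The kind of rank-based sensitivity screening discussed above can be sketched in a few lines. The sketch below is illustrative only: the three-parameter dose model and its parameter ranges are invented stand-ins for a SYVAC-like calculation, and Spearman rank correlation is used as a simple monotone sensitivity measure.

    ```python
    import math
    import random

    def toy_dose_model(params):
        # Hypothetical stand-in for a SYVAC-like dose calculation:
        # dose rises with leach rate, falls with sorption and dilution.
        leach, sorption, dilution = params
        return leach ** 2 / (1.0 + 5.0 * sorption) / dilution

    def ranks(values):
        # Rank data (1 = smallest); ties are ignored, which is fine
        # for continuous random samples.
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    def spearman(x, y):
        # Spearman rank correlation: Pearson correlation of the ranks.
        rx, ry = ranks(x), ranks(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
        sy = math.sqrt(sum((b - my) ** 2 for b in ry))
        return cov / (sx * sy)

    random.seed(0)
    samples = [(random.uniform(0.1, 1.0),   # leach rate
                random.uniform(0.0, 1.0),   # sorption coefficient
                random.uniform(1.0, 10.0))  # dilution factor
               for _ in range(500)]
    doses = [toy_dose_model(p) for p in samples]

    for name, idx in [("leach", 0), ("sorption", 1), ("dilution", 2)]:
        rho = spearman([p[idx] for p in samples], doses)
        print(f"{name:9s} rho = {rho:+.2f}")
    ```

    The signs and magnitudes of the rank correlations indicate which parameters the dose is most sensitive to, which is exactly the screening role such techniques play before a more expensive derivative-based analysis.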

  20. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation and results obtained with different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification and a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also provided. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizers, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  1. Standardization of Berberis aristata extract through conventional and modern HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh K. Patel

    2012-05-01

    Objective: Berberis aristata (Berberidaceae) is an important medicinal plant found in different regions of the world. It has significant medicinal value in the traditional Indian and Chinese systems of medicine. The aim of the present investigation was the qualitative and quantitative analysis of Berberis aristata extract. Methods: The present study includes phytochemical analysis, solubility tests, heavy metal analysis, antimicrobial study and quantitative analysis by HPTLC. Results: Preliminary phytochemical analysis showed the presence of carbohydrates, glycosides, alkaloids, proteins, amino acids, saponins, tannins and flavonoids. Solubility was found to be 81.90% in water and 84.52% in 50% alcohol. Loss on drying was 5.32%. Total phenol and flavonoid contents were 0.11% and 2.8%, respectively. Levels of lead, arsenic, mercury and cadmium complied with standard limits. E. coli and Salmonella were absent, and total bacterial, yeast and mould counts were within limits. The berberine content was found to be 13.47% by HPTLC. Conclusions: The results obtained from the present studies could serve as a source of valuable information for food scientists, researchers and even consumers regarding the standardization of this extract.

  2. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method, with the velocity components defined over an Eulerian mesh. A system of massless interface markers is defined, where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported, along with some available results. The present technique is capable of predicting the interface profile near the wall, which is important in reactor subchannel analysis.
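
    The kinematic marker update described above — massless markers simply following the local fluid velocity — can be sketched as follows. The velocity field here is an invented solenoidal flow (a single rotating vortex) standing in for the computed Eulerian velocity components, and explicit Euler stepping is assumed.

    ```python
    def velocity(x, y):
        # Hypothetical flow field: rigid counterclockwise rotation,
        # standing in for the interpolated Eulerian mesh velocities.
        return -y, x

    def advect_markers(markers, dt, steps):
        # Kinematic marker update: each massless marker follows the
        # local fluid velocity, dx/dt = u(x), via explicit Euler steps.
        for _ in range(steps):
            updated = []
            for x, y in markers:
                u, v = velocity(x, y)
                updated.append((x + u * dt, y + v * dt))
            markers = updated
        return markers

    # Markers initially on the x-axis trace out circular arcs in this field.
    initial = [(r, 0.0) for r in (0.5, 1.0, 1.5)]
    final = advect_markers(initial, dt=1e-3, steps=1571)  # ~ a quarter turn
    for (x0, _), (x1, y1) in zip(initial, final):
        print(f"r0={x0:.2f} -> ({x1:+.3f}, {y1:+.3f})")
    ```

    In a real subchannel calculation the velocity at a marker would be interpolated from the surrounding mesh cells, and the marker positions in turn define the liquid-vapour interface geometry for the next time step.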

  3. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first- and/or second-order terms of the Taylor series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of K_eff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. The technique can also offer acceptable accuracy, to within a few percent, for changes in a response of up to 20-30%.
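
    The first- and second-order Taylor-series idea behind the differential operator technique can be illustrated with a toy response. This is not MCNP's estimator (which differentiates the Monte Carlo score itself); it is a minimal sketch, with an invented smooth response standing in for a K_eff tally as a function of a density parameter, and central finite differences standing in for the derivative terms.

    ```python
    def response(density):
        # Hypothetical smooth response standing in for a track-length
        # K_eff estimate as a function of a material density parameter.
        return 1.0 + 0.4 * density - 0.05 * density ** 2

    def taylor_perturbation(p0, dp, h=1e-4):
        # First- and second-order Taylor estimates of the change in
        # response for a perturbation dp about p0, with derivatives
        # approximated by central finite differences.
        d1 = (response(p0 + h) - response(p0 - h)) / (2 * h)
        d2 = (response(p0 + h) - 2 * response(p0) + response(p0 - h)) / h ** 2
        first = d1 * dp
        second = first + 0.5 * d2 * dp ** 2
        return first, second

    p0, dp = 2.0, 0.4          # a 20% perturbation of the parameter
    exact = response(p0 + dp) - response(p0)
    first, second = taylor_perturbation(p0, dp)
    print(f"exact={exact:+.5f}  first-order={first:+.5f}  second-order={second:+.5f}")
    ```

    For small perturbations the first-order term alone is adequate; including the second-order term extends the usable range, mirroring the few-percent accuracy quoted above for 20-30% changes.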

  4. Neutron activation analysis for certification of standard reference materials

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Perez Zayas, G.; Hernandez Rivero, A.; Ribeiro Guevara, S.

    1996-01-01

    Neutron activation analysis is used extensively as one of the analytical techniques in the certification of standard reference materials. Characteristics of neutron activation analysis which make it valuable in this role are: accuracy; multielement capability; ability to assess homogeneity; high sensitivity for many elements; and an essentially non-destructive method. This paper reports the concentrations of 30 elements (major, minor and trace) in four Cuban samples. The samples were irradiated in a thermal neutron flux of 10^12-10^13 n·cm^-2·s^-1. The gamma-ray spectra were measured by HPGe detectors and were analyzed using the ACTAN program, developed at the Center of Applied Studies for Nuclear Development.

  5. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the gamma absorption technique has been employed for gold analysis. A series of gold alloys of known gold content was analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficients were obtained using the WinXCom program, and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement.
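
    The calibration logic behind such a gamma absorption measurement can be sketched with the Beer-Lambert law and the mixture rule for mass attenuation coefficients. All numbers below are invented for illustration (a two-component Au-Cu alloy is assumed); real coefficients near the Au K-edge would come from measurement or WinXCom.

    ```python
    import math

    # Hypothetical mass attenuation coefficients (cm^2/g) near the Au K-edge
    # for the two assumed alloy components.
    MU_AU = 4.5
    MU_CU = 0.45

    def transmitted_fraction(w_au, areal_density):
        # Beer-Lambert law with the mixture rule:
        # mu_alloy = w_Au*mu_Au + (1 - w_Au)*mu_Cu,   I/I0 = exp(-mu*rho*x)
        mu = w_au * MU_AU + (1.0 - w_au) * MU_CU
        return math.exp(-mu * areal_density)

    def gold_fraction(i_ratio, areal_density):
        # Invert the calibration: recover w_Au from a measured I/I0.
        mu = -math.log(i_ratio) / areal_density
        return (mu - MU_CU) / (MU_AU - MU_CU)

    rho_x = 0.5  # areal density rho*x of the sample, g/cm^2
    for w in (0.25, 0.585, 0.75):  # known alloys -> calibration points
        print(f"w_Au={w:.3f}  I/I0={transmitted_fraction(w, rho_x):.4f}")

    measured = transmitted_fraction(0.585, rho_x)  # pretend this was measured
    print(f"recovered w_Au = {gold_fraction(measured, rho_x):.3f}")
    ```

    Working just above versus just below the K-edge maximizes the contrast in mu between gold and the base metal, which is why the measurement is made around the absorption edge.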

  6. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  7. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

    We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Hoeegh-Krohn and T. Lindstroeem. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the (:φ²:)_d-model of interacting quantum fields. (orig.)

  8. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in a biological matrix at trace concentration levels. Owing to the reproducibility of data, precision, relatively low cost of the analysis, simplicity of the determination, and the possibility of directly combining these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction, to emphasize the use of newly synthesized sorbents, and to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e., the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition and plant. In the existing motion analysis, applying a single analysis technique to all plant conditions and plants limits the achievable accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates the wide variation in CRDM operational data and improves analysis accuracy. (author)
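
    The timing quantity defined above — the interval between the coil-energizing point and the armature-closed point — can be sketched on a synthetic trace. This is the basic feature extraction that any analysis (rule-based or learned) starts from; the waveform, thresholds, and the armature-motion dip below are all invented for illustration, not MHI's actual method.

    ```python
    import math

    def synthetic_current(n=400, dt=1e-3):
        # Invented coil-current trace: current rises after energizing at
        # t = 50 ms, with a brief dip around t = 120 ms when armature
        # motion changes the coil inductance.
        trace = []
        for i in range(n):
            t = i * dt
            if t < 0.050:
                cur = 0.0
            else:
                cur = 1.0 - math.exp(-(t - 0.050) / 0.020)
                if 0.115 <= t <= 0.125:
                    cur -= 0.3  # armature-motion dip
            trace.append(cur)
        return trace, dt

    def closed_time(trace, dt, on_level=0.05):
        # Energizing point: first sample above a small current threshold.
        # Armature-closed point: the sharp current drop (most negative
        # sample-to-sample difference) caused by the armature seating.
        i_on = next(i for i, c in enumerate(trace) if c > on_level)
        diffs = [trace[i + 1] - trace[i] for i in range(i_on, len(trace) - 1)]
        i_closed = i_on + min(range(len(diffs)), key=diffs.__getitem__) + 1
        return (i_closed - i_on) * dt

    trace, dt = synthetic_current()
    print(f"closed time ~ {closed_time(trace, dt) * 1e3:.0f} ms")
    ```

    In the machine-learning approach, features of this kind (thresholds, dip depths, slopes) extracted from many traces would form the inputs from which a Random Forest learns plant- and condition-specific timing behaviour.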

  10. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are therefore widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of road texture characteristics and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimation of macro-scale road parameters.

  11. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies and organizations with diverse technical backgrounds was formed to analyze the design and recommend improvements. The results were a 38% system-wide savings, if incorporated, and a shipping container that is volumetrically and payload efficient as well as user friendly. Value Analysis is a proven tool widely used in many diverse areas, both in government and in the private sector. It uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? What would that cost? Using logic and a disciplined approach, the result of the Value Analysis is a design that performs the necessary functions at high quality and the lowest overall cost.

  12. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques, such as alternating PSM, chrome-less phase lithography, and double exposure, have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into diffraction domains by comparing the similarity of the confined diffraction spectra. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve a customized illumination shape. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.
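
    Why customizing the illumination helps can be seen from which diffraction orders land inside the projection pupil. The sketch below assumes a simple line/space pattern and scalar geometry (193 nm wavelength, NA = 0.75, a single off-axis source point of normalized tilt sigma); all values are illustrative, not the paper's settings.

    ```python
    def captured_orders(pitch_nm, wavelength_nm=193.0, na=0.75, sigma=0.0):
        # Diffraction order m of a line/space pattern sits at normalized
        # spatial frequency m * lambda / pitch; an off-axis source point
        # shifts the whole spectrum by sigma. Orders inside the pupil
        # (|frequency + sigma| <= NA) contribute to the image.
        max_m = int((na + abs(sigma)) * pitch_nm / wavelength_nm) + 1
        return [m for m in range(-max_m, max_m + 1)
                if abs(m * wavelength_nm / pitch_nm + sigma) <= na]

    # A relaxed pitch images with three beams under on-axis illumination,
    # while a tight pitch passes only the zeroth order on axis (no image
    # contrast) and needs an off-axis source point to bring a first order
    # back into the pupil for two-beam imaging.
    print(captured_orders(260.0))               # on-axis, relaxed pitch
    print(captured_orders(130.0))               # on-axis, tight pitch
    print(captured_orders(130.0, sigma=-0.74))  # tilted (customized) source
    ```

    Grouping source points by which order combinations they capture is essentially the diffraction-domain decomposition of the pupil described in the abstract.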

  13. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems, such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, once verified on adequately complex systems, it could well become routine. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failures, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. It thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.

  14. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features: multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors, with their high fluxes of neutrons from the fission of ²³⁵U, give the most intense irradiation and hence the highest available sensitivities for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted, and a brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.
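
    The quantitative basis of NAA is the activation relation A = N·σ·φ·(1 − e^(−λ t_irr)), linking the induced activity to the number of target atoms, the activation cross-section, and the neutron flux. The sketch below evaluates it for an invented example; all nuclear data values are placeholders for illustration, not recommended constants.

    ```python
    import math

    AVOGADRO = 6.022e23

    def induced_activity(mass_g, molar_mass, abundance, sigma_barn,
                         flux, t_irr, half_life):
        # A = N * sigma * phi * (1 - exp(-lambda * t_irr))
        n_atoms = mass_g / molar_mass * AVOGADRO * abundance
        lam = math.log(2) / half_life
        sigma_cm2 = sigma_barn * 1e-24  # 1 barn = 1e-24 cm^2
        return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr))

    # Illustrative case: 1 microgram of a target nuclide irradiated for
    # one hour in a reactor flux of 1e13 n/cm^2/s.
    a = induced_activity(mass_g=1e-6, molar_mass=55.85, abundance=1.0,
                         sigma_barn=2.56, flux=1e13,
                         t_irr=3600.0, half_life=2.6 * 3600.0)
    print(f"induced activity ~ {a:.3e} Bq")
    ```

    In practice the element mass is obtained by inverting this relation (or by comparison with a co-irradiated standard) from the gamma-ray peak areas measured with an HPGe detector.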

  15. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved through the development of two instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique: different combinations of stationary and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  16. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  17. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.
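
    The core arithmetic of a pathways-analysis guideline is simple: the allowable soil concentration is the dose limit divided by the summed dose per unit concentration over all exposure pathways. The sketch below illustrates only this final step; the dose-conversion numbers and the dose limit are invented placeholders, whereas a code like RESRAD computes such factors from site-specific transport and exposure models.

    ```python
    DOSE_LIMIT = 0.25  # mSv/yr regulatory limit (illustrative value)

    # Hypothetical dose per unit soil concentration, (mSv/yr)/(Bq/g),
    # for each exposure pathway at the site.
    pathways = {
        "external gamma":  4.0e-3,
        "dust inhalation": 2.0e-4,
        "food ingestion":  1.1e-3,
        "water ingestion": 6.0e-4,
    }

    total_dcf = sum(pathways.values())
    guideline = DOSE_LIMIT / total_dcf  # allowable soil concentration, Bq/g

    print(f"total DCF = {total_dcf:.2e} (mSv/yr)/(Bq/g)")
    print(f"soil guideline = {guideline:.1f} Bq/g")
    ```

    Because the pathways sum in the denominator, adding a pathway can only tighten the guideline, which is why a pathway-complete analysis yields more defensible limits than a single-pathway numerical criterion.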

  18. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement, etc.) and apply to a relatively narrow concentration range, but they give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat samples, metal drillings, thin layers on substrates, etc.) that may also contain elements not covered by a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea of their concentrations. More often, however, quantitative values are required. The paper looks into the basics of quantitative standardless analysis and shows results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc
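
    The simplest form of the standardless idea can be sketched as sensitivity-corrected normalization: divide each measured peak intensity by an instrument sensitivity factor for that element and normalize the results to 100 wt%. This is a deliberately minimal sketch with invented numbers; real fundamental-parameter codes additionally model absorption and enhancement effects in the matrix.

    ```python
    # Hypothetical measured net peak intensities (counts) and element
    # sensitivity factors (counts per wt%) — both invented for illustration.
    intensities = {"Fe": 1.20e5, "Ca": 3.1e4, "Si": 8.0e3}
    sensitivity = {"Fe": 2.4e3, "Ca": 1.1e3, "Si": 0.5e3}

    # Apparent concentration of each element before matrix correction.
    raw = {el: intensities[el] / sensitivity[el] for el in intensities}

    # Normalize to 100 wt% — the defining step of standardless analysis.
    total = sum(raw.values())
    composition = {el: 100.0 * v / total for el, v in raw.items()}

    for el, wt in sorted(composition.items()):
        print(f"{el}: {wt:.1f} wt%")
    ```

    The normalization step is also the main caveat of standardless results: unmeasured components (light elements, loss on ignition) are silently distributed over the reported elements, which is why matrix-matched calibrations remain preferable when they exist.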

  19. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) the human view direction is measured at TV frame rate while the subject's head remains freely movable; (2) industrial parts hanging on a moving conveyor are classified prior to spray painting by robot; (3) in automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  20. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  1. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information, and for the 79 cases induced by human error we time-lined the man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.


  3. Female stress urinary incontinence: standard techniques revisited and critical evaluation of innovative techniques

    Science.gov (United States)

    de Riese, Cornelia; de Riese, Werner T. W.

    2003-06-01

    Objectives: The treatment of female urinary incontinence (UI) is a growing health care concern in our aging society, and publications on recent innovations and modifications are creating expectations. This brief review provides some insight and structure regarding indications and expected outcomes for the different approaches. Materials: Data extraction is based on a Medline database search performed for "female stress incontinence" from 1960 until 2000, with an additional literature search covering 2001 and 2002. Outcome data were extracted. Results: (1) INJECTION OF BULKING AGENTS (collagen, synthetic agents): The indication for mucosal coaptation was more clearly defined and, in the majority of articles, limited to ISD. (2) OPEN COLPOSUSPENSION (Burch, MMK): Best long-term results of all operative procedures; to date considered the gold standard. (3) LAPAROSCOPIC COLPOSUSPENSION (different modifications): Long-term success rates appear dependent on operator skills; there are few long-term data. (4) NEEDLE SUSPENSION (Stamey, Pereyra and modifications): Initial results were equal to Burch with less morbidity, but long-term success rates are worse. (5) SLING PROCEDURES (autologous, synthetic and allogenic graft materials; different modes of support and anchoring; free tapes): The suburethral sling has traditionally been considered a procedure for those in whom suspension had failed and for those with severe ISD; the current trend is its use as a primary procedure for SUI. Long-term data beyond 5 years are insufficient. (6) EXTERNAL OCCLUSIVE DEVICES (vaginal sponges and pessaries, urethral inserts): Both vaginal and urethral insert devices can be effective in selected patients. (7) IMPLANTABLE ARTIFICIAL URETHRAL SPHINCTERS: Modifications and improvements of the devices have resulted in improved clinical results regarding durability and efficacy. CONCLUSION: (1) The Burch colposuspension is still considered the gold standard in the treatment of female

  4. Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations

    International Nuclear Information System (INIS)

    Martens, Hans-Juergen von

    2010-01-01

    The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry, or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41), is at the Draft International Standard (DIS) stage and may be issued by the end of 2010. The standard methods with refined techniques have proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s^2). The relative deviations between the amplitude measurement results of the different interferometer methods, applied simultaneously, were less than 1% in all cases.
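
    The principle of an interferometric vibration measurement can be sketched with a quadrature (sine-approximation-style) evaluation: the interferometer phase is recovered with atan2 from two quadrature signals and converted to displacement via x = φ·λ/(4π). The signals below are synthesized for a known amplitude so the evaluation can be checked against itself; all signal parameters are illustrative.

    ```python
    import math

    LAMBDA = 632.8e-9  # He-Ne laser wavelength, m

    def recover_displacement(u1, u2):
        # Interferometer phase phi(t) = atan2(u2, u1), unwrapped sample by
        # sample, then converted to displacement: x(t) = phi * lambda/(4*pi).
        phase, prev, offset = [], 0.0, 0.0
        for a, b in zip(u1, u2):
            p = math.atan2(b, a)
            if p - prev > math.pi:      # wrapped downward
                offset -= 2 * math.pi
            elif prev - p > math.pi:    # wrapped upward
                offset += 2 * math.pi
            prev = p
            phase.append(p + offset)
        return [p * LAMBDA / (4 * math.pi) for p in phase]

    # Synthesize quadrature signals for a 160 Hz vibration, 1 um amplitude,
    # sampled at 2 MHz over exactly one vibration period.
    f, x_amp, fs = 160.0, 1.0e-6, 2.0e6
    n = int(fs / f)
    x_true = [x_amp * math.sin(2 * math.pi * f * i / fs) for i in range(n)]
    u1 = [math.cos(4 * math.pi * x / LAMBDA) for x in x_true]
    u2 = [math.sin(4 * math.pi * x / LAMBDA) for x in x_true]

    x_rec = recover_displacement(u1, u2)
    amp = (max(x_rec) - min(x_rec)) / 2
    print(f"recovered amplitude = {amp * 1e6:.3f} um")
    ```

    The acceleration amplitude then follows as (2πf)²·x̂, which is how displacement-based interferometry yields the acceleration values used in accelerometer and vibrometer calibration.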

  5. Preparation and analysis of standardized waste samples for Controlled Ecological Life Support Systems (CELSS)

    Science.gov (United States)

    Carden, J. L.; Browner, R.

    1982-01-01

    The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.

  6. Certification of standard reference materials employing neutron activation analysis

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Hernandez Rivero, A.; Molina Insfran, J.; Ribeiro Guevara, S.; Santana Encinosa, C.; Perez Zayas, G.

    1997-01-01

    Neutron activation analysis (NAA) is used extensively as one of the analytical techniques in the certification of standard reference materials (SRMs). The characteristics of NAA that make it valuable in this role are its accuracy, multielemental capability, ability to assess homogeneity, high sensitivity for many elements, and essentially non-destructive nature. This paper reports the concentrations of thirty elements (major, minor and trace) in four Cuban SRMs. The samples were irradiated in a thermal neutron flux of 10^12-10^13 neutrons·cm^-2·s^-1. The gamma-ray spectra were measured with HPGe detectors and analysed using the ACTAN program, developed at CEADEN. (author) [es
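
    The relative (comparator) method that underlies this kind of NAA certification work can be sketched as follows. Assuming the sample and a standard of known concentration are irradiated and counted under identical conditions, the specific peak areas are proportional to element concentration (all numbers below are illustrative, not from the paper):

    ```python
    def naa_concentration(peak_sample, mass_sample, peak_std, mass_std, conc_std):
        """Comparator method: concentration scales with the specific peak area
        (counts per unit mass) of sample relative to standard."""
        specific_sample = peak_sample / mass_sample
        specific_std = peak_std / mass_std
        return conc_std * specific_sample / specific_std

    # Illustrative: 250 mg sample vs. 100 mg standard containing 50 ug/g of Fe
    c = naa_concentration(peak_sample=12500, mass_sample=0.250,
                          peak_std=10000, mass_std=0.100, conc_std=50.0)
    print(f"Fe concentration: {c:.1f} ug/g")  # 25.0 ug/g
    ```

    In practice, decay and flux-gradient corrections are applied before taking this ratio; the sketch shows only the core proportionality.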

  7. Standards in radiographically guided biopsies - indications, techniques, complications; Standards radiologisch bildgesteuerter Biopsien - Indikationsstellung, Technik, Komplikationen

    Energy Technology Data Exchange (ETDEWEB)

    Feuerbach, S.; Schreyer, A. [Universitaetsklinikum Regensburg (Germany). Inst. fuer Roentgendiagnostik; Schlottmann, K. [Universitaetsklinikum Regensburg (Germany). Klinik und Poliklinik fuer Innere Medizin I

    2003-09-01

    First, different needle types are presented, in particular biopsy cannulae based on the ''TruCut'' principle and devices suitable for bone biopsy. Important aids for daily practice, such as the tandem and coaxial techniques, are presented. Advantages and disadvantages are discussed, together with the most important fields of application of the guidance methods: fluoroscopy, sonography, computed tomography and CT fluoroscopy. Local anesthesia and analgosedation are presented, and the general and specific contraindications, dependent on needle caliber or access route, are described. The literature is reviewed for data on severe complications, such as death or tumor cell seeding along the puncture track. For the different targets in the thorax and abdomen, the typical indications, points of entrance, contraindications, complications and special techniques are described, and the value of biopsy for these localizations is presented. Under the heading ''Tips and Tricks'', practical advice useful for the daily routine can be found. (orig.)

  8. Gold-standard for computer-assisted morphological sperm analysis.

    Science.gov (United States)

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, so no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is intended for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' kappa coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. 
We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm
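
    The two statistical building blocks named above, majority voting for the gold-standard labels and Fleiss' kappa for inter-expert agreement, can be sketched in a few lines (a generic illustration with made-up votes, not the paper's data or code):

    ```python
    from collections import Counter

    def majority_label(votes):
        """Gold-standard label by majority voting among experts; ties -> None."""
        counts = Counter(votes).most_common()
        if len(counts) > 1 and counts[0][1] == counts[1][1]:
            return None
        return counts[0][0]

    def fleiss_kappa(table):
        """table[i][j] = number of raters assigning subject i to category j."""
        n_subjects = len(table)
        n_raters = sum(table[0])
        p_j = [sum(row[j] for row in table) / (n_subjects * n_raters)
               for j in range(len(table[0]))]
        p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
               for row in table]
        p_bar = sum(p_i) / n_subjects
        p_e = sum(p * p for p in p_j)
        return (p_bar - p_e) / (1.0 - p_e)

    print(majority_label(["normal", "tapered", "normal"]))  # normal
    print(fleiss_kappa([[3, 0], [0, 3]]))                   # 1.0: perfect agreement
    ```

    Kappa corrects the observed agreement for the agreement expected by chance; a value near 0 indicates chance-level agreement and 1 indicates perfect agreement.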

  9. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. Even the most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis, electron bombardment of atoms or molecules (gas ion source) and thermal ionization (thermionic source), are compared, revealing an inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas-source mass spectrometer is 10 to 20 times greater than that for the thermal ionization spectrometer, while the sample consumption is 10^5 to 10^6 times greater. This shows that almost the entire sample is required not for the measurement itself, but because of the introduction system of the gas spectrometer. The new analysis technique, referred to as ''microfluorination'', corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  10. Multielement analysis of biological standards by neutron activation analysis

    International Nuclear Information System (INIS)

    Nadkarni, R.A.

    1977-01-01

    Up to 28 elements were determined in two IAEA standards (Animal Muscle H4 and Fish Soluble A 6/74) and three NBS standards (Spinach: SRM-1570, Tomato Leaves: SRM-1573 and Pine Needles: SRM-1575) by instrumental neutron activation analysis. Seven noble metals were determined in two NBS standards (Coal: SRM-1632 and Coal Fly Ash: SRM-1633) by a radiochemical procedure, while 11 rare earth elements were determined in the NBS standard Orchard Leaves: SRM-1571 by instrumental neutron activation analysis. The results are in good agreement with the certified and/or literature data where available. The irradiations were performed at the Cornell TRIGA Mark II nuclear reactor at a thermal neutron flux of 1-3x10^12 n·cm^-2·s^-1. The short-lived species were determined after a 2-minute irradiation in the pneumatic rabbit tube, and the longer-lived species after an 8-hour irradiation in the central thimble facility. The standards and samples were counted on a coaxial 56-cm^3 Ge(Li) detector. The system resolution was 1.96 keV (FWHM), with a peak-to-Compton ratio of 37:1 and a counting efficiency of 13%, all relative to the 1.332 MeV photopeak of Co-60. (T.I.)

  11. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
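
    The two core computations described above, the normalized contrast evolution and the half-max width measurement, can be sketched as follows (a simplified illustration with synthetic temperatures and an assumed pixel pitch, not the author's actual implementation):

    ```python
    def normalized_contrast(t_anomaly, t_sound):
        """Normalized contrast evolution C(t) = (T_anomaly - T_sound) / T_sound,
        computed frame by frame from the anomaly and sound-region temperatures."""
        return [(ta - ts) / ts for ta, ts in zip(t_anomaly, t_sound)]

    def half_max_width(profile, pixel_pitch):
        """Half-max edge detection: count pixels at or above half the peak
        contrast along a spatial profile and convert to physical width."""
        half = max(profile) / 2.0
        return sum(1 for v in profile if v >= half) * pixel_pitch

    contrast = normalized_contrast([310.0, 312.0, 311.0], [300.0, 300.0, 300.0])
    width = half_max_width([0.1, 0.5, 1.0, 0.9, 0.4, 0.1], pixel_pitch=0.2)
    print(contrast, width)  # [0.0333..., 0.04, 0.0366...] 0.6
    ```

    In the full technique, the measured contrast evolution is then matched against simulated evolutions calibrated on flat-bottom-hole data to estimate anomaly depth and width.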

  12. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  13. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  14. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. An easier approach was made by means of the MathCAD program as the programming tool, which is nevertheless powerful enough for the required calculation, plotting and file transfer. (Author)
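 

    A common discrete form of the Abel inversion used on such line-integrated fringe data is "onion peeling": the axisymmetric field is modelled as concentric shells of uniform value, so the projection becomes a triangular linear system that can be solved exactly. The sketch below (NumPy rather than MathCAD, with an arbitrary synthetic profile) demonstrates the round trip:

    ```python
    import numpy as np

    def onion_peeling_matrix(n, dr):
        """A[i, j] = chord length of the line of sight at radius r_i = i*dr
        through annular shell j, so the projection of shell values f is F = A @ f."""
        A = np.zeros((n, n))
        r = np.arange(n + 1) * dr  # shell boundaries
        for i in range(n):
            for j in range(i, n):
                A[i, j] = 2.0 * (np.sqrt(r[j + 1]**2 - r[i]**2)
                                 - np.sqrt(max(r[j]**2 - r[i]**2, 0.0)))
        return A

    # Round trip: project a known radial profile, then invert the projection.
    n, dr = 50, 0.02
    f_true = np.exp(-np.arange(n) * dr)   # arbitrary radial profile f(r)
    A = onion_peeling_matrix(n, dr)
    F = A @ f_true                         # simulated line-integrated data
    f_rec = np.linalg.solve(A, F)          # Abel inversion (triangular solve)
    print(np.max(np.abs(f_rec - f_true)))  # ~0: exact up to round-off
    ```

    With real fringe-shift data, F would come from the measured phase shifts rather than a forward projection, and some smoothing is usually needed because the inversion amplifies noise.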

  15. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)
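
    The standard computation named above, estimating the top-event probability from minimal cut-sets, can be sketched with the usual rare-event approximation (an illustrative toy tree with made-up probabilities, not the package's actual algorithm):

    ```python
    def top_event_probability(cut_sets, p):
        """Rare-event approximation: P(top) ~= sum over minimal cut-sets of the
        product of the basic-event probabilities in each cut-set."""
        total = 0.0
        for cut in cut_sets:
            prob = 1.0
            for event in cut:
                prob *= p[event]
            total += prob
        return min(total, 1.0)  # the approximation can exceed 1 for large p

    # Toy fault tree: the top event occurs if (A AND B) fail, or C fails.
    p = {"A": 0.1, "B": 0.2, "C": 0.01}
    print(top_event_probability([{"A", "B"}, {"C"}], p))  # 0.03
    ```

    The approximation is accurate when basic-event probabilities are small; exact evaluation would use inclusion-exclusion over the cut-sets.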

  16. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and ASP.NET technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can easily be extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE), which would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  17. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and ASP.NET technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can easily be extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE), which would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  18. Standard Procedure for Grid Interaction Analysis

    International Nuclear Information System (INIS)

    Svensson, Bertil; Lindahl, Sture; Karlsson, Daniel; Joensson, Jonas; Heyman, Fredrik

    2015-01-01

    Grid events that simultaneously affect all safety-related auxiliary systems in a nuclear power plant are critical and must be carefully addressed in the design, upgrading and operational processes. Up to now, the connecting grid has often been treated as either fully available or totally unavailable, and too little attention has been paid to specifying grid performance criteria. This paper deals with standard procedures for grid interaction analysis, deriving tools and criteria to handle grid events that challenge the safety systems of the plant. Critical external power system events are investigated and characterised with respect to severity and rate of occurrence. These critical events are then grouped with respect to their impact on the safety systems when a disturbance propagates into the plant. It is then important to make sure that 1) the impact of the disturbance never reaches any critical system, 2) the impact of the disturbance is eliminated before it harms any critical system, or 3) the critical systems are proven to be designed in such a way that they can withstand the impact of the disturbance, and the associated control and protection systems can withstand the voltage and frequency transients associated with the disturbances. A number of representative disturbance profiles, reflecting connecting grid conditions, are therefore derived, to be used for equipment testing. (authors)

  19. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. 
[Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. 
Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. 
[Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of ^130Te. CUORE is also well suited to searches for low energy rare events, such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires lowering the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain in detail the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  20. Acid Rain Analysis by Standard Addition Titration.

    Science.gov (United States)

    Ophardt, Charles E.

    1985-01-01

    The standard addition titration is a precise and rapid method for the determination of the acidity in rain or snow samples. The method requires use of a standard buret, a pH meter, and Gran's plot to determine the equivalence point. Experimental procedures used and typical results obtained are presented. (JN)
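
    The Gran-plot step of this determination lends itself to a short numeric sketch. For a strong acid titrated with a strong base, the acid-side Gran function G = (V0 + V)·10^(-pH) decreases linearly with titrant volume V and crosses zero at the equivalence volume; the example below uses synthetic data (assumed sample and titrant concentrations), not the paper's measurements:

    ```python
    import numpy as np

    def gran_equivalence_volume(v_titrant_ml, ph, v0_ml):
        """Fit the acid-side Gran function G = (V0 + V) * 10**(-pH) to a line
        in V and return its x-intercept, the equivalence volume."""
        G = (v0_ml + v_titrant_ml) * 10.0 ** (-ph)
        slope, intercept = np.polyfit(v_titrant_ml, G, 1)
        return -intercept / slope

    # Synthetic strong-acid sample: 100 mL of 1.0e-3 M acid, 1.0e-2 M base titrant
    v0, ca, cb = 100.0, 1.0e-3, 1.0e-2
    v = np.linspace(0.0, 8.0, 9)               # mL added, all before equivalence
    h = (ca * v0 - cb * v) / (v0 + v)          # [H+] from the mole balance
    ve = gran_equivalence_volume(v, -np.log10(h), v0)
    print(f"equivalence volume: {ve:.2f} mL")  # 10.00 mL expected
    ```

    Because G is linear before the equivalence point, only a handful of pH readings are needed, which is what makes the method rapid for dilute rain and snow samples.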

  1. MIMO wireless networks channels, techniques and standards for multi-antenna, multi-user and multi-cell systems

    CERN Document Server

    Clerckx, Bruno

    2013-01-01

    This book is unique in presenting channels, techniques and standards for the next generation of MIMO wireless networks. Through a unified framework, it emphasizes how propagation mechanisms impact the system performance under realistic power constraints. Combining a solid mathematical analysis with a physical and intuitive approach to space-time signal processing, the book progressively derives innovative designs for space-time coding and precoding as well as multi-user and multi-cell techniques, taking into consideration that MIMO channels are often far from ideal. Reflecting developments

  2. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  3. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis
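
    The core corridor-population computation can be sketched without GIS software: sum the population of blocks whose centroid falls within half the corridor width of any segment of the route polyline. The block coordinates and populations below are made-up toy values, not TIGER data:

    ```python
    import math

    def dist_point_to_segment(px, py, ax, ay, bx, by):
        """Shortest distance from point P to the segment AB."""
        abx, aby = bx - ax, by - ay
        t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
        t = max(0.0, min(1.0, t))          # clamp projection onto the segment
        cx, cy = ax + t * abx, ay + t * aby
        return math.hypot(px - cx, py - cy)

    def corridor_population(blocks, route, half_width):
        """Sum the population of blocks whose centroid lies within half_width
        of any segment of the route polyline."""
        total = 0
        for x, y, pop in blocks:
            if any(dist_point_to_segment(x, y, *a, *b) <= half_width
                   for a, b in zip(route, route[1:])):
                total += pop
        return total

    blocks = [(0.2, 1.0, 500), (0.2, 5.0, 800), (3.0, 0.1, 1200)]  # (x, y, population)
    route = [(0.0, 0.0), (0.0, 2.0), (4.0, 2.0)]                    # polyline vertices
    print(corridor_population(blocks, route, half_width=0.5))       # 500
    ```

    A production GIS workflow would buffer the route geometry and intersect it with block polygons, apportioning population by overlap area; the centroid test above is the simplest variant.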

  4. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.
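
    Spectrophotometric quantification of total phenolics typically reduces to a linear calibration curve, with results reported as gallic acid equivalents (GAE). The sketch below uses idealized synthetic absorbances (assumed slope and intercept), not data from any assay in the review:

    ```python
    import numpy as np

    def gallic_acid_equivalents(abs_sample, std_conc, std_abs):
        """Fit a linear calibration (absorbance vs. gallic acid concentration)
        and interpolate the sample absorbance onto it."""
        slope, intercept = np.polyfit(std_conc, std_abs, 1)
        return (abs_sample - intercept) / slope

    std_conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])  # mg/L gallic acid standards
    std_abs = 0.004 * std_conc + 0.01                      # idealized linear response
    gae = gallic_acid_equivalents(0.41, std_conc, std_abs)
    print(f"total phenolics: {gae:.0f} mg/L GAE")          # 100 mg/L
    ```

    Real calibration data carry scatter, so the regression also yields an uncertainty on the interpolated concentration; dilution factors must then scale the result back to the original extract.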

  5. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes: radio-immunoassay and autoradiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma-radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and preparation of labelled molecules: gamma emitters (125I, 57Co); beta emitters; preparation of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritiation); purification of the labelled compound (dialysis, gel filtration or molecular exclusion chromatography, high performance liquid chromatography); quality assessment of the labelled compound (labelling efficiency calculation, immunoreactivity conservation, stability and preservation). (J.S.)

  6. Temperature analysis of laser ignited metalized material using spectroscopic technique

    Science.gov (United States)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    Temperature measurement of a laser-ignited aluminized nano-energetic mixture using spectroscopy has great scope for analysing material characteristics and combustion behaviour. Spectroscopic analysis enables in-depth study of material combustion that is difficult with standard pyrometric methods. Laser ignition was used because it consumes less energy than electric ignition, while the ignited material dissipates the same energy, with the same impact, as under electric ignition. The presented research is primarily focused on temperature analysis of an energetic material comprising explosive material mixed with nano-material and ignited by laser. A spectroscopic technique is used here to estimate the temperature during the ignition process. The nano-energetic mixture used in this research contains no material that is sensitive to high impact.

  7. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experiments were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
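The PSD-based flicker analysis mentioned in this record can be illustrated with a minimal sketch: compute a single-sided periodogram of a flame luminosity trace and read off the dominant oscillation frequency. The signal below is synthetic (an assumed 5 Hz flicker plus a weaker 12 Hz mode); real input would be a per-frame mean luminosity extracted from the infrared image sequence.

```python
import math

def psd(signal, fs):
    """Single-sided power spectral density via a direct DFT
    (fine for short records; use an FFT for long ones).
    Returns a list of (frequency_hz, power) pairs."""
    n = len(signal)
    out = []
    for k in range(n // 2 + 1):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        out.append((k * fs / n, (re * re + im * im) / (fs * n)))
    return out

# Hypothetical luminosity trace: mean level plus a 5 Hz flicker and a weak 12 Hz mode.
fs, n = 64.0, 64
signal = [2.0 + 1.0 * math.sin(2 * math.pi * 5 * t / fs)
          + 0.3 * math.sin(2 * math.pi * 12 * t / fs) for t in range(n)]
spectrum = psd(signal, fs)
peak_freq = max(spectrum[1:], key=lambda fp: fp[1])[0]  # skip the DC bin
print(peak_freq)  # dominant flicker frequency in Hz
```

The location of the spectral peak is the kind of scalar feature that could then feed the fuzzy inference system described above.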

  8. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an Ion Beam Analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The measurements were carried out at laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees relative to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for detecting most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  9. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time-consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However, standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  10. Data compression techniques and the ACR-NEMA digital interface communications standard

    International Nuclear Information System (INIS)

    Zielonka, J.S.; Blume, H.; Hill, D.; Horil, S.C.; Lodwick, G.S.; Moore, J.; Murphy, L.L.; Wake, R.; Wallace, G.

    1987-01-01

    Data compression offers the possibility of achieving high, effective information transfer rates between devices and of efficient utilization of digital storage devices in meeting department-wide archiving needs. Accordingly, the ACR-NEMA Digital Imaging and Communications Standards Committee established a Working Group to develop a means to incorporate the optimal use of a wide variety of current compression techniques while remaining compatible with the standard. This proposed method allows the use of public domain techniques, predetermined methods between devices already aware of the selected algorithm, and the ability for the originating device to specify algorithms and parameters prior to transmitting compressed data. Because of the latter capability, the technique has the potential for supporting many compression algorithms not yet developed or in common use. Both lossless and lossy methods can be implemented. In addition to a description of the overall structure of this proposal, several examples using current compression algorithms are given.

  11. Standardization of the Descemet membrane endothelial keratoplasty technique: Outcomes of the first 450 consecutive cases.

    Science.gov (United States)

    Satué, M; Rodríguez-Calvo-de-Mora, M; Naveiras, M; Cabrerizo, J; Dapena, I; Melles, G R J

    2015-08-01

    To evaluate the clinical outcome of the first 450 consecutive cases after Descemet membrane endothelial keratoplasty (DMEK), as well as the effect of standardization of the technique. Comparison between 3 groups: Group I (cases 1-125), the extended learning curve; Group II (cases 126-250), transition to technique standardization; Group III (cases 251-450), surgery with the standardized technique. Best corrected visual acuity, endothelial cell density, pachymetry and intra- and postoperative complications were evaluated before, and 1, 3 and 6 months after, DMEK. At 6 months after surgery, 79% of eyes reached a best corrected visual acuity of ≥0.8 and 43% ≥1.0. Mean endothelial cell density was 2,530±220 cells/mm2 preoperatively and 1,613±495 cells/mm2 at 6 months after surgery. Mean pachymetry measured 668±92 μm and 526±46 μm pre- and (6 months) postoperatively, respectively. There were no significant differences in best corrected visual acuity, endothelial cell density and pachymetry between the 3 groups (P > .05). Graft detachment presented in 17.3% of the eyes. The detachment rate declined from 24% to 12%, and the rate of secondary surgeries from 9.6% to 3.5%, from group I to III respectively. Visual outcomes and endothelial cell density after DMEK are independent of the technique standardization. However, technique standardization may have contributed to a lower graft detachment rate and a relatively low number of secondary interventions required. As such, DMEK may become the first choice of treatment in corneal endothelial disease. Copyright © 2014 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  12. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  13. Analysis of ISO 26262 compliant techniques for the automotive domain

    NARCIS (Netherlands)

    S., Manoj Kannan; Dajsuren, Y.; Luo, Y.; Barosan, I.; Antkiewicz, M.; Atlee, J.; Dingel, J.; S, R.

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying with the ISO 26262 have been developed. However, it is not clear which parts and (sub-) phases of the standard

  14. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the effect of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  15. Analysis of cryptocurrencies as standard financial instruments

    OpenAIRE

    Bartoš, Jakub

    2014-01-01

    This paper analyzes cryptocurrencies as financial instruments. Firstly, we introduce the main features of cryptocurrencies and summarize their brief history. We found that the price of the most famous cryptocurrency, Bitcoin, follows the efficient-market hypothesis and reacts immediately to publicly announced information. Furthermore, Bitcoin can be seen as a standard economic good that is priced by the interaction of supply and demand on the market. These factors can be driven by macro financia...

  16. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find out the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies due to their use of hydrological parameters, which are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models, and possible improvements over these models through 3D modeling, are also discussed. It is found that the HEC-RAS and FLO 2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO 2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found, which can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed, considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings on causes and effects of flooding.

  17. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  18. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
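The noise-to-slope ratio (NSR) figure of merit in the record above lends itself to a short sketch: assuming each method's measurements relate linearly to the true values with slope a and noise standard deviation σ (the quantities the NGS procedure estimates without ground truth), methods are ranked by σ/|a|, lower being more precise. The method names and parameter values below are hypothetical, not taken from the paper.

```python
def noise_to_slope_ratio(slope, noise_sd):
    """NSR figure of merit: noise standard deviation divided by the
    magnitude of the slope of the assumed linear measured-vs-true relation."""
    return noise_sd / abs(slope)

# Hypothetical (slope, noise sd) estimates for three reconstruction methods,
# as an NGS procedure would produce them without any gold standard.
methods = {"OSEM": (0.95, 0.08), "FBP": (0.80, 0.12), "MAP": (1.02, 0.05)}

# Rank methods by precision: smallest NSR first.
ranking = sorted(methods, key=lambda m: noise_to_slope_ratio(*methods[m]))
print(ranking)
```

Note that the ranking uses only the estimated (slope, noise) pairs, which is exactly what makes the approach usable when true activity concentrations are unknown.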

  19. Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.

    Science.gov (United States)

    American Society for Testing and Materials, Philadelphia, PA.

    Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…

  20. Comparison of QuadrapolarTM radiofrequency lesions produced by standard versus modified technique: an experimental model

    Directory of Open Access Journals (Sweden)

    Safakish R

    2017-06-01

    Full Text Available Ramin Safakish Allevio Pain Management Clinic, Toronto, ON, Canada Abstract: Lower back pain (LBP) is a global public health issue and is associated with substantial financial costs and loss of quality of life. Over the years, the literature has provided varying statistics regarding the causes of back pain; the following statistic is the closest estimate for our patient population. Sacroiliac (SI) joint pain is responsible for LBP in 18%–30% of individuals with LBP. Quadrapolar™ radiofrequency ablation, which involves ablation of the nerves of the SI joint using heat, is a commonly used treatment for SI joint pain. However, the standard Quadrapolar radiofrequency procedure is not always effective at ablating all the sensory nerves that cause the pain in the SI joint. One of the major limitations of the standard Quadrapolar radiofrequency procedure is that it produces small lesions of ~4 mm in diameter; smaller lesions increase the likelihood of failure to ablate all nociceptive input. In this study, we compare the standard Quadrapolar radiofrequency ablation technique to a modified Quadrapolar ablation technique that has produced improved patient outcomes in our clinic. The methodologies of the two techniques are compared. In addition, we compare results from an experimental model comparing the lesion sizes produced by the two techniques. Taken together, the findings from this study suggest that the modified Quadrapolar technique provides longer-lasting relief for back pain caused by SI joint dysfunction. A randomized controlled clinical trial is the next step required to quantify the difference in symptom relief and quality of life produced by the two techniques. Keywords: lower back pain, radiofrequency ablation, sacroiliac joint, Quadrapolar radiofrequency ablation

  1. Assessment of Snared-Loop Technique When Standard Retrieval of Inferior Vena Cava Filters Fails

    International Nuclear Information System (INIS)

    Doody, Orla; Noe, Geertje; Given, Mark F.; Foley, Peter T.; Lyon, Stuart M.

    2009-01-01

    Purpose To identify the success and complications related to a variant technique used to retrieve inferior vena cava filters when the simple snare approach has failed. Methods A retrospective review of all Cook Guenther Tulip filters and Cook Celect filters retrieved between July 2006 and February 2008 was performed. During this period, 130 filter retrievals were attempted. In 33 cases, the standard retrieval technique failed. Retrieval was subsequently attempted with our modified retrieval technique. Results The retrieval was successful in 23 cases (mean dwell time, 171.84 days; range, 5-505 days) and unsuccessful in 10 cases (mean dwell time, 162.2 days; range, 94-360 days). Our filter retrievability rate increased from 74.6% with the standard retrieval method to 92.3% when the snared-loop technique was used. Unsuccessful retrieval was due to significant endothelialization (n = 9) and caval penetration by the filter (n = 1). A single complication occurred in the group, in a patient developing pulmonary emboli after attempted retrieval. Conclusion The technique we describe increased the retrievability of the two filters studied. Hook endothelialization is the main factor resulting in failed retrieval and continues to be a limitation with these filters.

  2. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  3. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    Science.gov (United States)

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. © The Author 2016. Published by Oxford University Press.

  4. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    Science.gov (United States)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI) require increasingly dense high power electronics. To enable these higher power densities, while maintaining or even improving hardware reliability, requires advances in integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  5. Post-voiding residual urine and capacity increase in orthotopic urinary diversion: Standard vs modified technique

    Directory of Open Access Journals (Sweden)

    Bančević Vladimir

    2010-01-01

    Full Text Available Background/Aim. Ever since the first orthotopic urinary diversion (pouch) was performed, there has been constant improvement and modification of surgical techniques. The aim has been to create a urinary reservoir similar to the normal bladder, to decrease the incidence of postoperative complications, and to provide improved quality of life. The aim of this study was to compare post-voiding residual urine (PVR) and capacity of the pouch constructed by the standard or the modified technique. Methods. This prospective and partially retrospective clinical study included 79 patients. In the group of 41 patients (group ST), the pouch was constructed using 50-70 cm of the ileum (standard technique). In the group of 38 patients (group MT), the pouch was constructed using 25-35 cm of the ileum (modified technique). Postoperatively, PVR and pouch capacity were measured using ultrasound at 3, 6 and 12 months. Results. Postoperatively, an increase in PVR and pouch capacity was noticed in both groups. Twelve months postoperatively, PVR was significantly smaller in group MT than in group ST [23 (0-90) mL vs 109 (0-570) mL, p < 0.001]. In the same period the pouch capacity was significantly smaller in group MT than in group ST [460 (290-710) mL vs 892 (480-2,050) mL, p < 0.001]. Conclusion. Postoperatively, an increase in PVR and pouch capacity was noticed over a 12-month period. A year after the operation, the pouch created from the shorter ileal segment had reached the capacity of a 'normal' bladder with small PVR, whereas the pouch created by the standard technique developed an unnecessarily large PVR and capacity.

  6. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
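As a minimal illustration of the PCA step described above, the sketch below computes the principal direction of two correlated variables from their 2x2 covariance matrix in closed form; in practice, PCA on full LIBS spectra is done with an SVD over hundreds of wavelength channels. The two "emission line intensity" series are invented for illustration only.

```python
import math

def pca_2d(xs, ys):
    """PCA for two variables: returns (fraction of total variance on PC1,
    first principal direction as a unit vector)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Largest eigenvalue of the covariance matrix [[sxx, sxy], [sxy, syy]].
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    l1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # Corresponding eigenvector (assumes sxy != 0, i.e. correlated channels).
    vx, vy = sxy, l1 - sxx
    norm = math.hypot(vx, vy)
    return l1 / tr, (vx / norm, vy / norm)

# Hypothetical intensities of two emission lines that co-vary with one
# element's concentration across five samples.
xs = [1.0, 2.1, 2.9, 4.2, 5.1]
ys = [0.9, 2.0, 3.1, 3.9, 5.0]
frac, direction = pca_2d(xs, ys)
print(round(frac, 3), direction)
```

Because the two channels move together, nearly all the variance collapses onto a single component, which is exactly why PCA compresses highly redundant spectra so effectively.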

  7. Neutron Activation Analysis with k0 Standardization

    International Nuclear Information System (INIS)

    Pomme, S.

    1998-01-01

    SCK-CEN's programme on Neutron Activation Analysis with k0-standardisation aims to: (1) develop and implement the k0-standardisation method for NAA; (2) exploit the inherent qualities of NAA such as accuracy, traceability, and multi-element capability; (3) acquire technical spin-off for nuclear measurement services. Main achievements in 1997 are reported.

  8. Development, enhancement, and evaluation of aircraft measurement techniques for national ambient air quality standard criteria pollutants

    Science.gov (United States)

    Brent, Lacey Cluff

    The atmospheric contaminants most harmful to human health are designated Criteria Pollutants. To help Maryland attain the national ambient air quality standards (NAAQS) for Criteria Pollutants, and to improve our fundamental understanding of atmospheric chemistry, I conducted aircraft measurements in the Regional Atmospheric Measurement Modeling Prediction Program (RAMMPP). These data are used to evaluate model simulations and satellite observations. I developed techniques for improving airborne observation of two NAAQS pollutants, particulate matter (PM) and nitrogen dioxide (NO2). While the structure and composition of organic aerosol are important for understanding PM formation, the molecular speciation of organic ambient aerosol remains largely unknown. The spatial distribution of reactive nitrogen is likewise poorly constrained. To examine water-soluble organic aerosol (WSOA) during an air pollution episode, I designed and implemented a shrouded aerosol inlet system to collect PM onto quartz fiber filters from a Cessna 402 research aircraft. Inlet evaluation conducted during a side-by-side flight with the NASA P3 demonstrated agreement to within 30%. An ion chromatographic mass spectrometric method, developed using the NIST Standard Reference Material (SRM) 1649b Urban Dust as a surrogate material, resulted in acidic class separation and resolution of at least 34 organic acids; detection limits approach pg/g concentrations. Analysis of aircraft filter samples resulted in detection of 8 inorganic species and 16 organic acids, of which 12 were quantified. Aged, re-circulated metropolitan air showed a greater number of dicarboxylic acids compared to air recently transported from the west. While the NAAQS for NO2 is rarely exceeded, it is a precursor molecule for ozone, America's most recalcitrant pollutant. Using cavity ringdown spectroscopy employing a light-emitting diode (LED), I measured vertical profiles of NO2 (surface to 2.5 km) west (upwind) of the Baltimore

  9. New Theoretical Analysis of the LRRM Calibration Technique for Vector Network Analyzers

    OpenAIRE

    Purroy Martín, Francesc; Pradell i Cara, Lluís

    2001-01-01

    In this paper, a new theoretical analysis of the four-standards line-reflect-reflect-match (LRRM) vector network-analyzer (VNA) calibration technique is presented. As a result, it is shown that the reference-impedance (to which the LRRM calibration is referred) cannot generally be defined whenever nonideal standards are used. Based on this consideration, a new algorithm to determine the on-wafer match standard is proposed that improves the LRRM calibration accuracy. Experimental verification ...

  10. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  11. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  12. A New Paradigm for Tissue Diagnostics: Tools and Techniques to Standardize Tissue Collection, Transport, and Fixation.

    Science.gov (United States)

    Bauer, Daniel R; Otter, Michael; Chafin, David R

    2018-01-01

    Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.

  13. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, firstly, to apply two interesting functional analysis techniques to the design of supervisory systems for complex processes and, secondly, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to a process: an example of a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of applying functional analysis to the design of a 'human-centered' supervisory system. Then the basic principles of the two techniques applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed

  14. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Field (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on a K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is then used. The DIS calculation is done for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about the likely region segmentation for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
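    The initial K-means step of the pipeline described above can be sketched in a few lines. The synthetic image, cluster count, and convergence settings below are illustrative assumptions, not the authors' actual pipeline (which adds the MRF, watershed, and merging stages):

    ```python
    import numpy as np

    def kmeans_segment(img, k=2, iters=20, seed=0):
        """Minimal 1-D K-means on pixel intensities, as used for the
        initial segmentation step (a sketch, not the full MRF pipeline)."""
        rng = np.random.default_rng(seed)
        pix = img.reshape(-1, 1).astype(float)
        centers = rng.choice(pix.ravel(), size=k, replace=False)
        for _ in range(iters):
            labels = np.argmin(np.abs(pix - centers), axis=1)  # nearest center
            for j in range(k):                                 # recompute cluster means
                if np.any(labels == j):
                    centers[j] = pix[labels == j].mean()
        return labels.reshape(img.shape)

    # Two-intensity synthetic image: left half dark, right half bright
    img = np.full((20, 20), 0.1)
    img[:, 10:] = 0.9
    img += np.random.default_rng(1).normal(0, 0.02, img.shape)
    seg = kmeans_segment(img)
    ```

    On this toy image the two intensity populations are well separated, so the clustering recovers the two regions exactly; real images need the subsequent MRF and watershed refinement the abstract describes.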

  15. Analysis of ISO 26262 Compliant Techniques for the Automotive Domain

    NARCIS (Netherlands)

    M. S. Kannan; Y. Dajsuren (Yanjindulam); Y. Luo; I. Barosan

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying with ISO 26262 have been developed. However, it is not clear which parts and (sub-) phases of the

  16. Data Analysis Techniques for Physical Scientists

    Science.gov (United States)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  17. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  18. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and is suitable for large-scale application to paleo-data.
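    The Gaussian-kernel estimator favored in this comparison can be sketched as follows: pairs of observations are weighted by how close their time separation is to the target lag, so no interpolation onto a regular grid is needed. The bandwidth and the synthetic sine-wave series below are illustrative assumptions, not the paper's setup:

    ```python
    import numpy as np

    def kernel_acf(t, x, lag, h=0.2):
        """Gaussian-kernel autocorrelation estimate for an irregularly
        sampled series (t, x): each pair (i, j) is weighted by how close
        t[j] - t[i] is to the target lag; h is the kernel bandwidth."""
        x = (x - x.mean()) / x.std()
        dt = t[None, :] - t[:, None]                # all pairwise time differences
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)    # Gaussian weights around the lag
        return np.sum(w * np.outer(x, x)) / np.sum(w)

    # Irregularly sampled sine wave: its ACF should be ~cos(lag)
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 20 * np.pi, 1500))
    x = np.sin(t)
    ```

    For this test signal the estimate should be near 1 at lag 0, near 0 at a quarter period, and near -1 at a half period, matching cos(lag).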

  19. Standard practice for monitoring atmospheric SO2 using the sulfation plate technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This practice covers a weighted average effective SO2 level for a 30-day interval through the use of the sulfation plate method, a technique for estimating the effective SO2 content of the atmosphere, and especially with regard to the atmospheric corrosion of stationary structures or panels. This practice is aimed at determining SO2 levels rather than sulfuric acid aerosol or acid precipitation. 1.2 The results of this practice correlate approximately with volumetric SO2 concentrations, although the presence of dew or condensed moisture tends to enhance the capture of SO2 into the plate. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  20. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  1. Hybrid chemical and nondestructive-analysis technique

    International Nuclear Information System (INIS)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  2. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...

  3. Least squares analysis of fission neutron standard fields

    International Nuclear Information System (INIS)

    Griffin, P.J.; Williams, J.G.

    1997-01-01

    A least squares analysis of fission neutron standard fields has been performed using the latest dosimetry cross sections. Discrepant nuclear data are identified, and adjusted spectra for the 252Cf spontaneous fission and 235U thermal fission fields are presented
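    In miniature, this kind of adjustment is a weighted least-squares fit of spectrum group fluxes to measured dosimetry reaction rates. The response matrix, rates, and weights below are made-up illustrative numbers, not evaluated nuclear data:

    ```python
    import numpy as np

    # Hypothetical response matrix: 3 dosimetry reactions x 2 spectrum groups
    R = np.array([[1.0, 0.2],
                  [0.3, 1.1],
                  [0.5, 0.5]])
    m = np.array([1.4, 2.5, 1.5])     # measured reaction rates (illustrative)
    w = np.array([1.0, 1.0, 0.5])     # inverse-variance weights (illustrative)

    # Weighted least squares: minimize sum_i w_i * (R @ phi - m)_i ** 2
    # by row-scaling both sides with sqrt(w) and solving the ordinary problem.
    sw = np.sqrt(w)
    phi, *_ = np.linalg.lstsq(R * sw[:, None], m * sw, rcond=None)
    ```

    Here the rates were constructed to be exactly consistent with group fluxes (1, 2), so the fit recovers them; with real, discrepant data the weighted residuals are what flag problematic cross sections.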

  4. Standardization: using comparative maintenance costs in an economic analysis

    OpenAIRE

    Clark, Roger Nelson

    1987-01-01

    Approved for public release; distribution is unlimited This thesis investigates the use of comparative maintenance costs of functionally interchangeable equipments in similar U.S. Navy shipboard applications in an economic analysis of standardization. The economics of standardization, life-cycle costing, and the Navy 3-M System are discussed in general. An analysis of 3-M System maintenance costs for a selected equipment, diesel engines, is conducted. The potential use of comparative ma...

  5. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via Internet is a great security threat, so studying their behavior is important to identify and classify them. Using SSDT hooking we can obtain malware behavior by running it in a controlled environment and capturing interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be used to compare them with other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.

  6. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    domain or in the frequency domain. However their .... computer to speech analysis led to important elaborations ... tool for the estimation of formant trajectory (10), ... prediction Linear prediction In effect determines the filter .... Radio Res. Lab.

  7. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  8. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the use of a network sniffer to capture packets for further analysis, a technical means necessary for understanding network traffic. Network sniffing intercepts packets and reassembles the binary format of the original message content in order to obtain the information it contains. This must be done according to the TCP/IP protocol stack specifications, restoring the packets' protocol format and content at each protocol layer: the actual data transferred as well as the application tier.
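    The per-layer restoration described above can be illustrated by decoding a raw IPv4 header with Python's struct module. This is a generic sketch of the idea, not Snort's actual decoder:

    ```python
    import struct

    def parse_ipv4_header(buf):
        """Decode the fixed 20-byte IPv4 header from raw packet bytes,
        the kind of per-layer restoration a protocol analyzer performs."""
        (ver_ihl, tos, total_len, ident, flags_frag,
         ttl, proto, cksum, src, dst) = struct.unpack('!BBHHHBBH4s4s', buf[:20])
        return {
            'version': ver_ihl >> 4,
            'ihl': ver_ihl & 0x0F,      # header length in 32-bit words
            'total_length': total_len,
            'ttl': ttl,
            'protocol': proto,          # 6 = TCP, 17 = UDP
            'src': '.'.join(map(str, src)),
            'dst': '.'.join(map(str, dst)),
        }

    # A hand-built sample header (no options, TCP payload)
    pkt = struct.pack('!BBHHHBBH4s4s', 0x45, 0, 40, 1, 0, 64, 6, 0,
                      bytes([192, 168, 0, 1]), bytes([10, 0, 0, 2]))
    hdr = parse_ipv4_header(pkt)
    ```

    A full analyzer repeats this at each layer: the IHL and protocol fields decoded here tell it where the TCP or UDP header begins and which format to apply next.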

  9. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
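    The Monte Carlo parameter-variation idea can be sketched generically: perturb each channel voltage within its one-sigma error, push every perturbed set through the unfold, and take the spread of the results as the error bar. The voltages, weights, and 5% error below are invented placeholders, not Dante calibration data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    v = np.array([1.2, 0.8, 0.5, 0.3])   # measured channel voltages (illustrative)
    sigma = 0.05 * v                     # assumed 1-sigma combined error per channel

    def unfold(voltages):
        """Stand-in for the real unfold algorithm: here just a weighted sum."""
        weights = np.array([2.0, 1.5, 1.0, 0.5])
        return weights @ voltages

    # One thousand test voltage sets, as in the abstract
    trials = rng.normal(v, sigma, size=(1000, v.size))
    fluxes = np.array([unfold(ts) for ts in trials])
    flux, err = fluxes.mean(), fluxes.std()
    ```

    For a linear unfold the Monte Carlo spread matches standard error propagation; the value of the method is that it works unchanged when the real unfold algorithm is nonlinear.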

  10. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  11. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  12. Preliminary results of standard quantitative analysis by ED-XRF

    Energy Technology Data Exchange (ETDEWEB)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A., E-mail: alellara@hotmail.com [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Dept. de Fisica; Denyak, Valeriy, E-mail: denyak@gmail.com [Instituto de Pesquisa Pele Pequeno Principe (IPPP), Curitiba, PR (Brazil)

    2013-07-01

    A comparison between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation was performed. For this purpose, five standard samples of known compounds were produced: two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points, according to an orientation. The measurements were performed at the UTFPR Radiological Laboratory using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 05 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. For the data provided by the program, a convergence of results was found, indicating homogeneity of the samples. Compared to the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, a proposal was made to continue the work using an auxiliary calculation, to be developed in the next step.
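    The stoichiometric reference values against which the software output is checked follow directly from atomic masses. The helper below is a generic sketch (chlorine in MgCl2 as an example), not the authors' code:

    ```python
    # Standard atomic masses (g/mol), rounded
    masses = {'Mg': 24.305, 'Cl': 35.453, 'Pb': 207.2, 'O': 15.999, 'I': 126.904}

    def mass_fraction(formula, element):
        """Mass fraction of one element in a compound given as a
        {element: atom count} dict, e.g. MgCl2 = {'Mg': 1, 'Cl': 2}."""
        total = sum(masses[el] * n for el, n in formula.items())
        return masses[element] * formula[element] / total

    cl_in_mgcl2 = mass_fraction({'Mg': 1, 'Cl': 2}, 'Cl')
    ```

    Comparing such calculated fractions with the concentrations reported by the fundamental-parameters software is exactly the check the abstract describes.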

  13. Preliminary results of standard quantitative analysis by ED-XRF

    International Nuclear Information System (INIS)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A.

    2013-01-01

    A comparison between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation was performed. For this purpose, five standard samples of known compounds were produced: two lead oxides, magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points, according to an orientation. The measurements were performed at the UTFPR Radiological Laboratory using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 05 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. For the data provided by the program, a convergence of results was found, indicating homogeneity of the samples. Compared to the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, a proposal was made to continue the work using an auxiliary calculation, to be developed in the next step.

  14. Establishing working standards of chromosome aberrations analysis for biological dosimetry

    International Nuclear Information System (INIS)

    Bui Thi Kim Luyen; Tran Que; Pham Ngoc Duy; Nguyen Thi Kim Anh; Ha Thi Ngoc Lien

    2015-01-01

    Biological dosimetry is a dose assessment method using specific biomarkers of radiation. The IAEA (International Atomic Energy Agency) and ISO (International Organization for Standardization) have established that the dicentric chromosome is specific to radiation; it is a gold standard for biodosimetry. Along with the documents published by the IAEA, WHO, ISO and OECD, our results of studies on chromosome aberrations induced by radiation were organized systematically into nine standards dealing with the chromosome aberration test and the micronucleus test in human peripheral blood lymphocytes in vitro. These standards address: the reference dose-effect relationship for dose estimation, the minimum detection levels, cell culture, slide preparation, the scoring procedure for chromosome aberrations used for biodosimetry, the criteria for converting aberration frequency into absorbed dose, and the reporting of results. Following these standards, the automatic analysis devices were calibrated to improve the biological dosimetry method. These standards will be used to acquire and maintain accreditation of the Biological Dosimetry laboratory at the Nuclear Research Institute. (author)

  15. Comparison of ankle-brachial index measured by an automated oscillometric apparatus with that by standard Doppler technique in vascular patients

    DEFF Research Database (Denmark)

    Korno, M.; Eldrup, N.; Sillesen, H.

    2009-01-01

    was calculated twice using both methods on both legs. MATERIALS AND METHODS: We tested the automated oscillometric blood pressure device, CASMED 740, for measuring ankle and arm blood pressure and compared it with the current gold standard, the hand-held Doppler technique, by Bland-Altman analysis. RESULTS: Using the Doppler-derived ABI as the gold standard, the sensitivity and specificity of the oscillometric method for determining an ABI. Publication date: 2009/11
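    The Bland-Altman comparison used in this study reduces to a mean bias and 95% limits of agreement on the paired differences. The sketch below shows the computation on generic paired data; the 1.96 factor assumes approximately normally distributed differences:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Mean bias and 95% limits of agreement between two paired
        measurement methods (e.g., oscillometric vs. Doppler ABI)."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias = diff.mean()
        sd = diff.std(ddof=1)            # sample standard deviation of differences
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
    ```

    For example, paired readings differing by 0.0, 0.1 and 0.2 give a bias of 0.1 with limits of agreement of 0.1 ± 0.196.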

  16. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  17. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistic based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed

  18. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray fluorescence (XRF) techniques. These cigarettes were found to contain the elements Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements, with concentrations of more than 1% by weight, were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements, with concentrations below 0.1% by weight, were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  19. Study of the standard direct costs of various techniques of advanced endoscopy. Comparison with surgical alternatives.

    Science.gov (United States)

    Loras, Carme; Mayor, Vicenç; Fernández-Bañares, Fernando; Esteve, Maria

    2018-03-12

    The complexity of endoscopy has brought about an increase in cost that has a direct effect on healthcare systems. However, few studies have analyzed the cost of advanced endoscopic procedures (AEP). Our aims were to calculate the standard direct costs of AEP and to make a financial comparison with their surgical alternatives. The standard direct cost of carrying out each procedure was calculated: an endoscopist detailed the time, personnel, materials, consumables, recovery room time, stents, pathology and medication used. The cost of surgical procedures was the average cost recorded in the hospital. Thirty-eight AEP were analyzed. The technique with the lowest cost was gastroscopy + APC (€116.57), while that with the greatest cost was ERCP with cholangioscopy + stent placement (€5083.65). Some 34.2% of the procedures had average costs of €1000-2000. In 57% of cases the endoscopic alternative was 2-5 times more cost-efficient than surgery; in 31% of cases it was indistinguishable or up to 1.4 times more costly. The standard direct cost of the majority of AEP is reported using a methodology that enables easy application in other centers. For the most part, endoscopic procedures are more cost-efficient than the corresponding surgical procedure. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  20. Closing the gap: accelerating the translational process in nanomedicine by proposing standardized characterization techniques.

    Science.gov (United States)

    Khorasani, Ali A; Weaver, James L; Salvador-Morales, Carolina

    2014-01-01

    On the cusp of widespread permeation of nanomedicine, academia, industry, and government have invested substantial financial resources in developing new ways to better treat diseases. Materials have unique physical and chemical properties at the nanoscale compared with their bulk or small-molecule analogs. These unique properties have been greatly advantageous in providing innovative solutions for medical treatments at the bench level. However, nanomedicine research has not yet fully permeated the clinical setting because of several limitations. Among these limitations are the lack of universal standards for characterizing nanomaterials and the limited knowledge that we possess regarding the interactions between nanomaterials and biological entities such as proteins. In this review, we report on recent developments in the characterization of nanomaterials as well as the newest information about the interactions between nanomaterials and proteins in the human body. We propose a standard set of techniques for universal characterization of nanomaterials. We also address relevant regulatory issues involved in the translational process for the development of drug molecules and drug delivery systems. Adherence and refinement of a universal standard in nanomaterial characterization as well as the acquisition of a deeper understanding of nanomaterials and proteins will likely accelerate the use of nanomedicine in common practice to a great extent.

  1. Decentralized control using compositional analysis techniques

    NARCIS (Netherlands)

    Kerber, F.; van der Schaft, A. J.

    2011-01-01

    Decentralized control strategies aim at achieving a global control target by means of distributed local controllers acting on individual subsystems of the overall plant. In this sense, decentralized control is a dual problem to compositional analysis where a global verification task is decomposed

  2. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan

    2016-01-01

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social

  3. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  4. Open Partial Nephrectomy in Renal Cancer: A Feasible Gold Standard Technique in All Hospitals

    Directory of Open Access Journals (Sweden)

    J. M. Cozar

    2008-01-01

    Introduction. Partial nephrectomy (PN) is playing an increasingly important role in localized renal cell carcinoma (RCC) as a true alternative to radical nephrectomy. With the greater experience and expertise of surgical teams, it has become an alternative to radical nephrectomy in almost all hospitals for young patients with a tumor diameter of 4 cm or less, since cancer-specific survival outcomes are similar to those obtained with radical nephrectomy. Materials and Methods. The authors comment on their own experience and review the literature, reporting current indications and outcomes, including complications. The surgical technique of open partial nephrectomy is outlined. Conclusions. Nowadays, open PN is the gold standard technique to treat small renal masses, and all nonablative techniques must pass the test of time to be compared to PN. It is not ethical for patients to undergo radical surgery merely because the urologists involved do not have adequate experience with PN. Patients should be involved in the final treatment decision and, when appropriate, referred to specialized centers with experience in open or laparoscopic partial nephrectomy.

  5. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    Science.gov (United States)

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    The reference electrode standardization technique (REST) has been increasingly acknowledged and applied in the electroencephalography/event-related potentials (EEG/ERP) community in recent years as a re-referencing technique that transforms actual multi-channel recordings to approximately zero-reference ones. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to offer ease of use for novice researchers and flexibility for experienced ones. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, where detailed information including publications, comments and documents on REST can also be found. An example of usage is given with comparative results for REST and the average reference. We hope these user-friendly toolboxes will make the relatively novel REST technique easier to study, especially for applications in various EEG studies.
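    REST proper requires a head model (a lead-field matrix), so a faithful implementation is beyond a short sketch; the average reference that the toolbox compares against, however, is easy to illustrate. A minimal NumPy sketch with synthetic data (not the toolbox's MATLAB code):

    ```python
    import numpy as np

    def average_reference(eeg: np.ndarray) -> np.ndarray:
        """Re-reference an EEG array (channels x samples) to the average
        reference: subtract the instantaneous mean across channels."""
        return eeg - eeg.mean(axis=0, keepdims=True)

    # Toy recording: 4 channels, 5 samples, with a common offset that
    # stands in for the potential of the original reference electrode.
    rng = np.random.default_rng(0)
    eeg = rng.normal(size=(4, 5)) + 10.0
    reref = average_reference(eeg)
    print(np.allclose(reref.mean(axis=0), 0.0))  # -> True
    ```

    After re-referencing, the mean across channels is zero at every sample, which is the defining property of the average reference.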

  6. Concrete blocks. Analysis of UNE, ISO en standards and comparison with other international standards

    Directory of Open Access Journals (Sweden)

    Álvarez Alonso, Marina

    1990-12-01

    This paper describes the recently approved UNE standards through a systematic analysis of the main specifications they contain and the values considered for each of them, as well as the draft ISO and EN concrete block standards. Furthermore, the study places the UNE standards in the international context through a comparative analysis against a representative sample of the standards in force in various geographical regions of the world, to determine the analogies and differences among them. KEY WORDS: masonry, system analysis, concrete blocks, masonry walls, standards


  7. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis and to future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  8. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis and to future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  9. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared side by side using data from two major customer service research projects. A central concern is what, if any, conclusions might differ due solely to the analysis...
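    As a rough illustration of the two protocols being compared (not the authors' implementation): gap scores subtract importance from performance per attribute, while IP analysis places each attribute in one of four quadrants relative to importance and performance cut points. Attribute names and scores below are invented.

    ```python
    def gap_scores(importance: dict, performance: dict) -> dict:
        """Gap = performance - importance; negative gaps flag underperformance."""
        return {k: performance[k] - importance[k] for k in importance}

    def ip_quadrant(imp: float, perf: float, imp_cut: float, perf_cut: float) -> str:
        """Classic importance-performance quadrant labels."""
        if imp >= imp_cut:
            return "concentrate here" if perf < perf_cut else "keep up the good work"
        return "low priority" if perf < perf_cut else "possible overkill"

    importance  = {"cleanliness": 4.6, "signage": 3.1}   # hypothetical survey means
    performance = {"cleanliness": 3.9, "signage": 4.2}
    gaps = gap_scores(importance, performance)
    for attr, g in gaps.items():
        print(attr, round(g, 1), ip_quadrant(importance[attr], performance[attr], 4.0, 4.0))
    ```

    The two protocols can disagree: an attribute with a small negative gap may still land in the "concentrate here" quadrant if its importance is high, which is exactly the kind of divergence the record investigates.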

  10. Analysis of triethylenediamine by theoretical techniques

    International Nuclear Information System (INIS)

    Mancuso, J.; McEachern, R.J.

    1997-03-01

    Triethylenediamine (TEDA) was modeled to illustrate the utility of modern computational methods. The software used was HyperChem 4.5 for Silicon Graphics workstations, and the semiempirical method PM3 (Parametric Method 3) was used for all calculations. The formation and structure of the methyl-TEDA iodide salt and of the electron-donor-acceptor complex between TEDA and iodomethane were examined. The infrared vibrational frequencies of TEDA were determined, and the mean deviation between calculated and experimental frequencies was found to be ±59 cm-1. As a test of the accuracy of PM3, the heats of formation of various compounds, as well as various reaction enthalpies, were calculated. The absolute unsigned error of standard enthalpies for 28 compounds was found to be ±6 kcal/mol, and the average unsigned error of reaction enthalpies was ±31 kcal/mol. Carbon-substituted TEDA analogs were modeled and their reaction enthalpies studied to see whether they improved the efficiency of the reaction between TEDA and iodomethane. The trimethylsilyl- and 2-siliconeopentyl-substituted TEDA analogs were found to have reaction enthalpies significantly lower than TEDA, by 14 kcal/mol and 12 kcal/mol respectively. Such substituted TEDA molecules may well have enhanced performance in trapping methyl iodide. (author)

  11. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  12. Application of activation techniques to biological analysis

    International Nuclear Information System (INIS)

    Bowen, H.J.M.

    1981-01-01

    Applications of activation analysis in the biological sciences are reviewed for the period of 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or x-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use of it has been made in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials

  13. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were characterized using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows trend identification for the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions the trend analysis mainly showed a general reduction in all values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
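    The graphical idea behind ITA (Şen's method) reduces to a numeric check: split the series into two halves, sort each half, and pair them; pairs above the 1:1 line indicate an increasing trend at that value range, pairs below a decreasing one. A minimal sketch, illustrative only and not the authors' code:

    ```python
    def ita_trend_fractions(series):
        """Return (fraction of sorted-half pairs above the 1:1 line,
        fraction below). Above ~ increasing trend, below ~ decreasing."""
        n = len(series) // 2
        first, second = sorted(series[:n]), sorted(series[-n:])
        above = sum(s2 > s1 for s1, s2 in zip(first, second))
        below = sum(s2 < s1 for s1, s2 in zip(first, second))
        return above / n, below / n

    # A series that drifts upward: every sorted pair lies above the 1:1 line.
    values = [1.0, 2.0, 1.5, 2.5, 3.0, 4.0, 3.5, 4.5]
    print(ita_trend_fractions(values))  # -> (1.0, 0.0)
    ```

    Because the halves are sorted before pairing, the comparison can be read off separately for low, medium, and high values, which is what lets ITA flag, e.g., heavier droughts (low SPI values trending down) independently of the wet extremes.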

  14. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs, and calculations such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized, in order to be able to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  15. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  16. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  17. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions

  18. Newly developed standard reference materials for organic contaminant analysis

    Energy Technology Data Exchange (ETDEWEB)

    Poster, D.; Kucklick, J.; Schantz, M.; Porter, B.; Wise, S. [National Inst. of Stand. and Technol., Gaithersburg, MD (USA). Center for Anal. Chem.

    2004-09-15

    The National Institute of Standards and Technology (NIST) has issued a number of Standard Reference Materials (SRMs) certified for specified analytes. The SRMs comprise biota and biologically related materials as well as sediment- and particle-related materials. The certified compounds for analysis are polychlorinated biphenyls (PCB), polycyclic aromatic hydrocarbons (PAH) and their nitro-analogues, chlorinated pesticides, methylmercury, organotin compounds, fatty acids, and polybrominated diphenyl ethers (PBDE). The authors report on the origin of the materials and the analytical methods. (uke)

  19. Metabolomic analysis using porcine skin: a pilot study of analytical techniques

    OpenAIRE

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-01-01

    Background: Metabolic byproducts serve as indicators of chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples would serve as a critical foundation for this field but have not been developed. Objectives: We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. ...

  20. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not sufficient to allow satisfactory analysis; complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase the fluorescence intensities. As a preliminary experiment for the separation of the individual rare earth elements and uranium, the distribution coefficients (%S here) were obtained on Dowex 50 W as a function of HCl concentration by a batch method. These %S data were then used to obtain elution curves. The %S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50 W was examined as a function of HCl concentration and found to be decreasing while the %S of the rare earths was increasing. It is interpreted that Cl- and rare earth ions move into the resin phase separately and that the charge and charge densities of these ions are responsible for the different %S curves. Dehydration appears to play an important role in the upturn of the %S curves at higher HCl concentrations
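    The %S quantity measured in such a batch experiment is simply the percentage of the element removed from solution by the resin at equilibrium. A minimal sketch with invented numbers (the record does not give its raw data):

    ```python
    def percent_sorbed(c_initial: float, c_final: float) -> float:
        """%S: percentage of the element transferred from solution to the
        resin, from solution concentrations (or activities) before and
        after equilibration with the resin."""
        return 100.0 * (c_initial - c_final) / c_initial

    # Hypothetical example: solution activity drops from 1000 to 150 counts
    # after shaking with Dowex 50 W, so 85% of the element is on the resin.
    print(percent_sorbed(1000.0, 150.0))  # -> 85.0
    ```

    Repeating this at a series of HCl concentrations gives the %S-versus-[HCl] curves the record describes, including the minimum near 4 M HCl.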

  1. Assessment of chromium biostabilization in contaminated soils using standard leaching and sequential extraction techniques

    International Nuclear Information System (INIS)

    Papassiopi, Nymphodora; Kontoyianni, Athina; Vaxevanidou, Katerina; Xenidis, Anthimos

    2009-01-01

    The iron reducing microorganism Desulfuromonas palmitatis was evaluated as a potential biostabilization agent for the remediation of chromate contaminated soils. D. palmitatis were used for the treatment of soil samples artificially contaminated with Cr(VI) at two levels, i.e. 200 and 500 mg kg-1. The efficiency of the treatment was evaluated by applying several standard extraction techniques on the soil samples before and after treatment, such as the EN12457 standard leaching test, the US EPA 3060A alkaline digestion method and the BCR sequential extraction procedure. The water soluble chromium, as evaluated with the EN leaching test, was found to decrease after the biostabilization treatment from 13 to less than 0.5 mg kg-1 and from 120 to 5.6 mg kg-1 for the soil samples contaminated with 200 and 500 mg Cr(VI) per kg soil respectively. The BCR sequential extraction scheme, although not providing accurate estimates of the initial chromium speciation in contaminated soils, proved to be a useful tool for monitoring the relative changes in element partitioning as a consequence of the stabilization treatment. After bioreduction, the percentage of chromium retained in the two least soluble BCR fractions, i.e. the 'oxidizable' and 'residual' fractions, increased from 54 and 73% to more than 96% in both soils

  2. Dosimetric comparison of intensity modulated radiotherapy techniques and standard wedged tangents for whole breast radiotherapy

    International Nuclear Information System (INIS)

    Fong, Andrew; Bromley, Regina; Beat, Mardi; Vien, Din; Dineley, Jude; Morgan, Graeme

    2009-01-01

    Full text: Prior to introducing intensity modulated radiotherapy (IMRT) for whole breast radiotherapy (WBRT) into our department, we undertook a comparison of the dose parameters of several IMRT techniques and standard wedged tangents (SWT). Our aim was to improve the dose distribution to the breast and to decrease the dose to organs at risk (OAR): heart, lung and contralateral breast (Contra Br). Treatment plans for 20 women (10 right-sided and 10 left-sided) previously treated with SWT for WBRT were used to compare (a) SWT; (b) electronic compensator IMRT (E-IMRT); (c) tangential beam IMRT (T-IMRT); (d) coplanar multi-field IMRT (CP-IMRT); and (e) non-coplanar multi-field IMRT (NCP-IMRT). Plans for the breast were compared for (i) dose homogeneity (DH); (ii) conformity index (CI); (iii) mean dose; (iv) maximum dose; (v) minimum dose; and doses to OAR were calculated for (vi) heart; (vii) lung; and (viii) Contra Br. Compared with SWT, all plans except CP-IMRT gave improvement in at least two of the seven parameters evaluated. T-IMRT and NCP-IMRT resulted in significant improvement in all parameters except DH, and both gave significant reductions in doses to OAR. As initial evaluation suggests NCP-IMRT is likely to be too time consuming to introduce on a large scale, T-IMRT is the preferred technique for WBRT in our department.

  3. Arthroscopic Latarjet Techniques: Graft and Fixation Positioning Assessed With 2-Dimensional Computed Tomography Is Not Equivalent With Standard Open Technique.

    Science.gov (United States)

    Neyton, Lionel; Barth, Johannes; Nourissat, Geoffroy; Métais, Pierre; Boileau, Pascal; Walch, Gilles; Lafosse, Laurent

    2018-05-19

    To analyze graft and fixation (screw and EndoButton) positioning after the arthroscopic Latarjet technique with 2-dimensional computed tomography (CT) and to compare it with the open technique. We performed a retrospective multicenter study (March 2013 to June 2014). The inclusion criteria included patients with recurrent anterior instability treated with the Latarjet procedure; the exclusion criterion was the absence of a postoperative CT scan. The positions of the hardware, the positions of the grafts in the axial and sagittal planes, and the dispersion of values (variability) were compared. The study included 208 patients (79 treated with the open technique, 87 treated with the arthroscopic Latarjet technique with screw fixation [arthro-screw], and 42 treated with the arthroscopic Latarjet technique with EndoButton fixation [arthro-EndoButton]). The angulation of the screws differed between the open group and the arthro-screw group (superior screws, 10.3° ± 0.7° vs 16.9° ± 1.0°; inferior screws, P = .003). In the axial plane (at the level of the equator), the arthroscopic techniques resulted in more lateral graft positions (arthro-screw, 1.5 ± 0.3 mm lateral) than the open technique (0.9 ± 0.2 mm medial). At the level of 25% of the glenoid height, the arthroscopic techniques again resulted in more lateral positions (arthro-screw, 0.3 ± 0.3 mm lateral) than the open technique (1.0 ± 0.2 mm medial). Higher variability was observed in the arthro-screw group. In the sagittal plane, the arthro-screw technique resulted in higher graft positions (55% ± 3% of the graft below the equator) and the arthro-EndoButton technique in lower positions (82% ± 3%) than the open technique (71% ± 2%); variability was not different. This study shows that, on 2-dimensional CT assessment, the positions of the fixation devices and of the bone graft with the arthroscopic techniques are statistically significantly different from those with the open technique. In the sagittal plane, the arthro-screw technique provides the highest graft position.

  4. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques to improve the stability analysis and control of nonlinear systems. It first reviews the most widespread techniques in the field of Takagi-Sugeno fuzzy systems, as well as the most relevant results on polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models obtained via Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...
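    The sum-of-squares (SOS) machinery mentioned above can be summarized generically; the following is a standard sufficient condition for stability, stated as a sketch rather than the thesis' exact formulation. For a polynomial system \(\dot{x} = f(x)\), one searches via semidefinite programming for a polynomial \(V(x)\) and some \(\epsilon > 0\) such that

    ```latex
    V(x) - \epsilon\, x^{\top} x \ \text{is SOS},
    \qquad
    -\,\nabla V(x)^{\top} f(x) \ \text{is SOS},
    ```

    which certifies \(V\) as a Lyapunov function for the origin: the first condition enforces positive definiteness, the second that \(V\) does not increase along trajectories. Both conditions are convex in the coefficients of \(V\), which is what makes the search tractable as a semidefinite program.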

  5. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    Aqueous selenium standards were separated within 1.2 min on a 1.00 id x 50 mm reversed-phase column in an ion-pair chromatographic system using a flow rate of 200 μL min-1. Hence, analysis times could be reduced to 1/10 of those of ordinary HPLC for aqueous standards. The precision and detection... the use of short columns. Hence, analysis times could be halved without loss of separation efficiency in this biological sample...

  6. Network modeling and analysis technique for the evaluation of nuclear safeguards systems effectiveness

    International Nuclear Information System (INIS)

    Grant, F.H. III; Miner, R.J.; Engi, D.

    1978-01-01

    Nuclear safeguards systems are concerned with the physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of safeguards system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The reports provided by the SNAP simulation program enable analysts to evaluate existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  7. Network modeling and analysis technique for the evaluation of nuclear safeguards systems effectiveness

    International Nuclear Information System (INIS)

    Grant, F.H. III; Miner, R.J.; Engi, D.

    1979-02-01

    Nuclear safeguards systems are concerned with the physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of safeguards system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The reports provided by the SNAP simulation program enable analysts to evaluate existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  8. Reduced Rate of Dehiscence After Implementation of a Standardized Fascial Closure Technique in Patients Undergoing Emergency Laparotomy

    DEFF Research Database (Denmark)

    Tolstrup, Mai-Britt; Watt, Sara Kehlet; Gögenur, Ismail

    2017-01-01

    ...and multivariate Cox regression analysis were performed. RESULTS: We included 494 patients from 2014 to 2015 and 1079 patients from our historical cohort for comparison. All patients had a midline laparotomy in an emergency setting. The rate of dehiscence was reduced from 6.6% to 3.8% (P = 0.03), comparing the years 2009 to 2013 with 2014 to 2015. Factors associated with dehiscence were male gender [hazard ratio (HR) 2.8, 95% confidence interval (95% CI) 1.8-4.4, P ...], ... (95% CI 1.6-4.9), P ..., ... 4%, P = 0.008. CONCLUSION: The standardized procedure of closing the midline laparotomy by using a "small steps" technique of continuous suturing...

  9. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  10. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  11. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in analysis, preservation, and documentation of art works and artifacts are described with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  12. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination is more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of available resources. This research evaluated the software safety analysis techniques in use today, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes reflecting their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages are completeness and complexity.
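Of the SSA techniques listed above, Markov chain modeling is the one that reduces most directly to a formula. A minimal two-state (operational/failed) availability sketch; the failure and repair rates below are illustrative placeholders, not values from the paper:

```python
# Two-state continuous-time Markov model: state 0 = operational, 1 = failed.
# Steady-state availability is A = mu / (lam + mu) for failure rate lam and
# repair rate mu (both per hour; values here are purely illustrative).
lam = 1e-4   # failures per hour
mu = 0.5     # repairs per hour

availability = mu / (lam + mu)
print(round(availability, 6))  # fraction of time spent in the operational state
```

The same balance-equation approach extends to multi-state models (e.g. degraded modes) by solving the full transition-rate matrix instead of this closed form.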

  13. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards are a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).

  14. Standard Guide for Wet Sieve Analysis of Ceramic Whiteware Clays

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This guide covers the wet sieve analysis of ceramic whiteware clays. This guide is intended for use in testing shipments of clay as well as for plant control tests. 1.2 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not considered standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  15. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, and improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, the main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, and pole and zero identification. The preliminary general scheme of a digital MCA is discussed, as well as some other important techniques for its engineering design. All these lay the foundation for developing homemade digital nuclear spectrometers. (authors)
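The trapezoidal shaping mentioned above can be sketched in a few lines. This is a simplified Jordanov-style filter for an ideal step pulse; the pole-zero (exponential-decay) correction that a real spectrometer needs is deliberately omitted, so for an ideal step a single accumulator over a four-term difference suffices:

```python
import numpy as np

def trapezoidal_shaper(x, k, m):
    """Trapezoidal shaping of a step-like pulse (rise k, flat top m, in samples).

    Simplified from the Jordanov recursive filter: with no exponential tail to
    compensate, one accumulator over d[n] = x[n] - x[n-k] - x[n-k-m] + x[n-2k-m]
    yields the trapezoid directly.
    """
    xp = np.concatenate([np.zeros(2 * k + m), x])   # zero history before trace
    d = (xp[2*k+m:] - xp[k+m:-k]                    #  x[n]     - x[n-k]
         - xp[k:-(k+m)] + xp[:-(2*k+m)])            # -x[n-k-m] + x[n-2k-m]
    return np.cumsum(d) / k                         # flat top == step height

# Ideal step of amplitude 100 at sample 50:
pulse = np.zeros(300)
pulse[50:] = 100.0
out = trapezoidal_shaper(pulse, k=20, m=10)
print(out.max())  # -> 100.0, the flat-top height equals the step amplitude
```

Choosing the rise time k trades noise filtering against pulse pile-up, and the flat-top length m absorbs ballistic deficit; both are tuning parameters in a digital MCA.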

  16. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  17. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and is preparing a clean chemistry laboratory with clean rooms. The methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, an Inductively-Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  18. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    1980-09-01

    A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  19. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    1979-12-01

    A standard model for a safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  20. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
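The encryption/decryption scheme under analysis can be modeled numerically with FFTs. A minimal sketch of double random phase encoding plus a single brute-force key guess; the image size, seed, and error metric are arbitrary choices for illustration, not the paper's experimental parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(img, phi1, phi2):
    """Classic 4f double random phase encoding (discrete FFT model)."""
    x = img * np.exp(2j * np.pi * phi1)              # input-plane phase mask
    X = np.fft.fft2(x) * np.exp(2j * np.pi * phi2)   # Fourier-plane phase mask
    return np.fft.ifft2(X)

def drpe_decrypt(enc, phi2):
    """Amplitude recovery needs only the Fourier-plane key phi2."""
    X = np.fft.fft2(enc) * np.exp(-2j * np.pi * phi2)
    return np.abs(np.fft.ifft2(X))

img = rng.random((32, 32))
phi1, phi2 = rng.random((32, 32)), rng.random((32, 32))
enc = drpe_encrypt(img, phi1, phi2)

correct = drpe_decrypt(enc, phi2)
wrong = drpe_decrypt(enc, rng.random((32, 32)))      # one brute-force guess

nmse = lambda a, b: np.sum((a - b) ** 2) / np.sum(b ** 2)
print(nmse(correct, img), nmse(wrong, img))  # ~0 vs. a large decryption error
```

A key-space analysis like the paper's repeats the `wrong`-key step over many candidate keys and plots the resulting error distribution, probing whether keys "close" to the true one leak partial decryptions.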

  1. Preparation of uranium standard solutions for x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Wong, C.M.; Cate, J.L.; Pickles, W.L.

    1978-03-01

    A method has been developed for gravimetrically preparing uranium nitrate standards with an estimated mean error of 0.1% (1 sigma) and a maximum error of 0.2% (1 sigma) for the total uranium weight. Two source materials, depleted uranium dioxide powder and NBS Standard Reference Material 960 uranium metal, were used to prepare stock solutions. The NBS metal proved to be superior because of the small but inherent uncertainty in the stoichiometry of the uranium oxide. These solutions were used to prepare standards in a freeze-dried configuration suitable for x-ray fluorescence analysis. Both the gravimetric and freeze-drying techniques are presented. Volumetric preparation was found to be unsatisfactory for 0.1% precision for the sample size of interest. One of the primary considerations in preparing uranium standards for x-ray fluorescence analysis is the development of a technique for dispensing a 50-μl aliquot of a standard solution with a precision of 0.1% and an accuracy of 0.1%. The method developed corrects for variation in aliquoting and for evaporation loss during weighing. Two sets, each containing 50 standards, have been produced. One set has been retained by LLL and one by the Savannah River project

  2. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm, with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  3. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  4. [Abdominothoracic esophageal resection according to Ivor Lewis with intrathoracic anastomosis : standardized totally minimally invasive technique].

    Science.gov (United States)

    Runkel, N; Walz, M; Ketelhut, M

    2015-05-01

    The clinical and scientific interest in minimally invasive techniques for esophagectomy (MIE) is increasing; however, the intrathoracic esophagogastric anastomosis remains a surgical challenge and lacks standardization. Surgeons either transpose the anastomosis to the cervical region or perform a hybrid thoracotomy for stapler access. This article reports technical details and early experiences with a completely laparoscopic-thoracoscopic approach to Ivor Lewis esophagectomy without additional thoracotomy. The extent of radical dissection follows clinical guidelines. Laparoscopy is performed with the patient in a beach chair position and thoracoscopy in a left lateral decubitus position using single lung ventilation. The anvil of the circular stapler is placed transorally into the esophageal stump. The specimen and gastric conduit are exteriorized through a subcostal rectus muscle split incision. The stapler body is placed into the gastric conduit and both are advanced through the abdominal mini-incision transhiatally into the right thoracic cavity, where the anastomosis is constructed. Data were collected prospectively and analyzed retrospectively. A total of 23 non-selected consecutive patients (mean age 69 years, range 46-80 years) with adenocarcinoma (n = 19) or squamous cell carcinoma (n = 4) were surgically treated between June 2010 and July 2013. Neoadjuvant therapy was performed in 15 patients, resulting in 10 partial and 4 complete remissions. There were no technical complications and no conversions. Mean operative time was 305 min (range 220-441 min). The median lymph node count was 16 (range 4-42). An R0 resection was achieved in 91% of patients; 3 anastomotic leaks occurred, which were successfully managed endoscopically. There were no postoperative deaths. The intrathoracic esophagogastric anastomosis during minimally invasive Ivor Lewis esophagectomy can be constructed in a standardized fashion without an additional thoracotomy.

  5. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1994-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards which are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is not a suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards which contained about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters including activity produced, necessary flux/bombardment time, counting time, and appropriate source-to-detector distance. The result is multi-element standards for CNAA which have optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed which is a low cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and innovative sample handling techniques
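The activity-produced calculation that such a spreadsheet performs presumably follows the standard activation equation A = N·σ·φ·(1 − e^(−λt)). A hedged sketch; the nuclide data and irradiation conditions below are illustrative textbook-style inputs, not values from the paper:

```python
import math

N_A = 6.02214076e23  # Avogadro's number, atoms per mole

def induced_activity(mass_g, molar_mass, abundance,
                     sigma_barns, flux, t_irr, half_life):
    """Induced activity from the activation equation A = N*sigma*phi*(1 - e^-lt).

    Units: sigma in barns (1 b = 1e-24 cm^2), flux in n/cm^2/s,
    times in seconds.  Returns activity in becquerels.
    """
    n_atoms = mass_g / molar_mass * N_A * abundance
    lam = math.log(2) / half_life
    return n_atoms * sigma_barns * 1e-24 * flux * (1 - math.exp(-lam * t_irr))

# Illustrative case: 10 mg of natural sodium (23Na, 100% abundant),
# sigma ~ 0.53 b, thermal flux 1e13 n/cm^2/s, 1 h irradiation,
# product 24Na with a half-life of about 15 h.
a = induced_activity(0.010, 22.99, 1.0, 0.53, 1e13, 3600, 15 * 3600)
print(f"{a:.3e} Bq")
```

Running this formula in reverse (fixing a target activity and solving for flux, time, or source-to-detector distance) is exactly the kind of planning the spreadsheet automates across ~35 elements at once.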

  6. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1995-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards which are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is not a suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards which contained about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters including activity produced, necessary flux/bombardment time, counting time, and appropriate source-to-detector distance. The result is multi-element standards for CNAA which have optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed which is a low cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and innovative sample handling techniques. (author) 7 refs.; 5 tabs

  7. Standardization of MIP technique in three-dimensional CT portography: usefulness in evaluation of portosystemic collaterals in cirrhotic patients

    International Nuclear Information System (INIS)

    Kim, Jong Gi; Kim, Yong; Kim, Chang Won; Lee, Jun Woo; Lee, Suk Hong

    2003-01-01

    To assess the usefulness of three-dimensional CT portography using a standardized maximum intensity projection (MIP) technique for the evaluation of portosystemic collaterals in cirrhotic patients. In 25 cirrhotic patients with portosystemic collaterals, three-phase CT using a multidetector-row helical CT scanner was performed to evaluate liver disease. Late arterial-phase images were transferred to an Advantage Windows 3.1 workstation (General Electric). Axial images were reconstructed by means of three-dimensional CT portography, using both a standardized and a non-standardized MIP technique, and the respective reconstruction times were determined. Three-dimensional CT portography with the standardized technique involved eight planes, namely the spleno-portal confluence axis (coronal, lordotic coronal, lordotic coronal RAO 30°, and lordotic coronal LAO 30°), the left renal vein axis (lordotic coronal), and axial MIP images (lower esophagus level, gastric fundus level and splenic hilum). The eight MIP images obtained in each case were interpreted by two radiologists, who reached a consensus in their evaluation. The portosystemic collaterals evaluated were as follows: left gastric vein dilatation; esophageal, paraesophageal, gastric, and splenic varix; paraumbilical vein dilatation; gastro-renal, spleno-renal, and gastro-spleno-renal shunt; mesenteric, retroperitoneal, and omental collaterals. The average reconstruction time using the non-standardized MIP technique was 11 minutes 23 seconds, and with the standardized technique, the time was 6 minutes 5 seconds. Three-dimensional CT portography with the standardized technique demonstrated left gastric vein dilatation (n=25), esophageal varix (n=18), paraesophageal varix (n=13), gastric varix (n=4), splenic varix (n=4), paraumbilical vein dilatation (n=4), gastro-renal shunt (n=3), spleno-renal shunt (n=3), and gastro-spleno-renal shunt (n=1). 
Using three-dimensional CT portography and the non-standardized
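The MIP operation itself reduces to a per-pixel maximum along the viewing direction. A minimal numpy sketch on a synthetic volume; the array shape, axis names, and voxel values are invented for illustration, not scanner data:

```python
import numpy as np

# Hypothetical CT volume indexed (depth, height, width); here we fake a
# bright vessel-like structure inside a zero background.
vol = np.zeros((64, 64, 64))
vol[20:44, 32, 32] = 1000.0   # synthetic high-attenuation column

# MIP = per-pixel maximum of voxel values along the chosen projection axis.
coronal_mip = vol.max(axis=1)   # collapse the height axis
axial_mip = vol.max(axis=0)     # collapse the depth axis

print(coronal_mip.max(), axial_mip[32, 32])  # the bright structure survives
```

The study's "standardized technique" amounts to fixing a reproducible set of such projection axes (eight planes) in advance, which is why it roughly halved the reconstruction time.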

  8. Standardization of the radioimmunoassay technique for the determination of human gastrin and its clinical application

    International Nuclear Information System (INIS)

    Peig Ginabredra, M.G.

    1989-01-01

    A radioimmunoassay system for the determination of gastrin was developed and standardized, employing synthetic human gastrin for radioiodination and for the preparation of standards, as well as a specific antibody raised in rabbits. The hormone was labeled with 125I by the chloramine-T technique and purified by anion exchange chromatography on QAE-Sephadex A-25, and its specific activity was determined. The tracer thus obtained was submitted to purity analysis by polyacrylamide gel electrophoresis and precipitation of proteins by trichloroacetic acid. Its stability was evaluated as a function of storage time, and its purity and suitability for use in radioimmunoassay were also compared with those of a tracer obtained from a commercial diagnostic kit. The assays were performed by incubating radioiodinated gastrin, standard gastrin prepared in gastrin-free plasma (from zero to 500 pmol/l) or the samples to be assayed with the antiserum for 4 days at 4 °C. The separation of free gastrin from antibody-bound gastrin was carried out by adsorption of the free hormone onto charcoal, whose ideal concentration was previously determined. Gastrin-free plasma was obtained from time-expired blood bank plasma submitted to extraction with charcoal. Quality control showed this radioimmunoassay to be specific, accurate, precise and sensitive, allowing the performance of valid assays. Its validity was further confirmed by clear discrimination of gastrin concentrations not only in subjects with very low levels (gastrectomized) and extremely high levels (Zollinger-Ellison syndrome), but also in subjects with other diseases, such as Chagas disease, pernicious anemia and chronic renal failure. (author) [pt

  9. Development, improvement and calibration of neutronic reaction rates measurements: elaboration of a standard techniques basis

    International Nuclear Information System (INIS)

    Hudelot, J.P.

    1998-06-01

    In order to improve and to validate the neutronics calculation schemes, perfecting integral measurements of neutronics parameters is necessary. This thesis focuses on the conception, the improvement and the development of neutronics reaction rates measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first one deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given. Then, those last ones are applied through the example of doubling time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices of fission chambers are developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high flux neutron generator and based on the discrimination of the energy of the neutrons with a time-of-flight method. This second device will soon allow to measure the mass of fission chambers with a precision of about 1%. Finally, the necessity of those calibrations will be shown through spectral indices measurements in core MISTRAL 1 (UO2) and MISTRAL 2 (MOX) of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, will be validated. Concerning the second one, the goal is to develop a method for measuring the modified conversion ratio of 238U (defined as the ratio of the 238U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1%. The extension of this technique to future modified conversion ratio measurements for 242Pu (on MOX rods) and 232Th (on

  10. Development, improvement and calibration of neutronic reaction rate measurements: elaboration of a base of standard techniques

    International Nuclear Information System (INIS)

    Hudelot, J.P.

    1998-01-01

    In order to improve and to validate the neutronic calculation schemes, perfecting integral measurements of neutronic parameters is necessary. This thesis focuses on the conception, the improvement and the development of neutronic reaction rates measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first one deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given. Then, those last ones are applied through the example of doubling time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices of fission chambers are developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high flux neutron generator and based on the discrimination of the energy of the neutrons with a time-of-flight method. This second device will soon allow to measure the mass of fission chambers with a precision of about 1%. Finally, the necessity of those calibrations will be shown through spectral indices measurements in core MISTRAL 1 (UO2) and MISTRAL 2 (MOX) of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, will be validated. Concerning the second one, the goal is to develop a method for measuring the modified conversion ratio of 238U (defined as the ratio of the 238U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1%. The extension of this technique to future modified conversion ratio measurements for 242Pu (on MOX rods) and 232Th (on Thorium

  11. Standardization of pulmonary ventilation technique using volume-controlled ventilators in rats with congenital diaphragmatic hernia

    Directory of Open Access Journals (Sweden)

    Rodrigo Melo Gallindo

    Full Text Available OBJECTIVE: To standardize a technique for ventilating rat fetuses with Congenital Diaphragmatic Hernia (CDH) using a volume-controlled ventilator. METHODS: Pregnant rats were divided into the following groups: a) control (C); b) exposed to nitrofen with CDH (CDH); and c) exposed to nitrofen without CDH (N-). Fetuses of the three groups were randomly divided into the subgroups ventilated (V) and non-ventilated (N-V). Fetuses were collected on day 21.5 of gestation, weighed and ventilated for 30 minutes using a volume-controlled ventilator. Then the lungs were collected for histological study. We evaluated: body weight (BW), total lung weight (TLW), left lung weight (LLW), the ratios TLW/BW and LLW/BW, the morphological histology of the airways, and causes of failures of ventilation. RESULTS: BW, TLW, LLW, TLW/BW and LLW/BW were higher in C compared with N- (p 0.05). The morphology of the pulmonary airways showed hypoplasia in groups N- and CDH, with no difference between V and N-V (p <0.05). The C and N- groups could be successfully ventilated using a tidal volume of 75 µl, but the failure of ventilation in the CDH group decreased only when ventilated with 50 µl. CONCLUSION: Volume ventilation is possible in rats with CDH for a short period and does not alter fetal or lung morphology.

  12. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...
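The core quantitative step of such a survey, pooling published estimates of the same parameter, can be sketched with fixed-effect inverse-variance weighting. The study values below are hypothetical:

```python
import math

def fixed_effect(estimates, std_errors):
    """Inverse-variance weighted pooled estimate and its standard error."""
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies reporting the same parameter:
b, se = fixed_effect([0.40, 0.55, 0.32], [0.10, 0.20, 0.15])
print(round(b, 3), round(se, 3))  # precise studies dominate the pooled value
```

The funnel plot mentioned in the abstract is the diagnostic companion to this pooling: estimates plotted against their precision should form a symmetric funnel around the pooled value, and asymmetry suggests publication bias.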

  13. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  14. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)]

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co{sup III} porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  15. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and contain abundant provenance information and age characteristics. Analyzing ancient ceramics with modern analytical methods is the scientific foundation of the study of Chinese porcelain. The properties, functions and applications of nuclear analysis techniques in this field are discussed. (authors)

  16. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  17. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  18. On criteria for examining analysis quality with standard reference material

    International Nuclear Information System (INIS)

    Yang Huating

    1997-01-01

    The advantages, disadvantages and applicability of some criteria for examining analysis quality with standard reference materials are discussed. How the uncertainties of the instrument under examination and of the reference material are combined should be determined according to the specific situation. When the instrument's uncertainty is unavailable, it is acceptable to substitute the standard deviation multiplied by a suitable factor for the uncertainty. The examination should not lead to reporting more error in routine measurements than is really present, and overly strict examination should also be avoided
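    The combination of uncertainties described in this abstract can be sketched as follows (a minimal illustration with function names and the coverage factor k chosen by me, not taken from the paper): the instrument and reference-material uncertainties combine in quadrature, and the instrument passes when its deviation from the certified value stays within k times the combined uncertainty.

    ```python
    import math

    def combined_uncertainty(u_instrument, u_reference):
        """Root-sum-square combination of the two uncertainty components."""
        return math.sqrt(u_instrument ** 2 + u_reference ** 2)

    def acceptance_check(measured, certified, u_instrument, u_reference, k=2.0):
        """Accept if |measured - certified| <= k * combined uncertainty."""
        return abs(measured - certified) <= k * combined_uncertainty(u_instrument, u_reference)
    ```

    When the instrument's uncertainty is unknown, the paper's suggestion corresponds to passing the standard deviation multiplied by an appropriate factor as `u_instrument`.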

  19. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletics practice has shown that the more complex an element, the more difficult its technique. The fouetté at 720° is one of the most difficult types of fouetté, and its execution depends on precise technique during the performer's rotation. Performing this element requires not only good physical condition but also correct technique on the part of the dancer. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of the fouetté at 720° performed by the best Chinese dancers, using stereoscopic imaging and theoretical analysis.

  20. Provenience studies using neutron activation analysis: the role of standardization

    International Nuclear Information System (INIS)

    Harbottle, G.

    1980-01-01

    This paper covers the historical background of the chemical analysis of archaeological artifacts, which dates back to 1790, through the first application of neutron activation analysis to archaeological ceramics, and goes on to elaborate on the present-day status of neutron activation analysis in provenience studies and the role of standardization. In principle, the concentrations of elements in a neutron-activated specimen can be calculated from an exact knowledge of the neutron flux (its intensity, duration and spectral (energy) distribution) plus an exact gamma-ray count calibrated for efficiency, corrected for branching ratios, etc. In practice, however, it is far easier to compare one's unknown to a standard of known or assumed composition. The practice has been for different laboratories to use different standards. With analyses being run in the thousands throughout the world, a great benefit would be derived if analyses could be exchanged among all users and/or generators of data. The emphasis of this paper is on the interlaboratory comparability of ceramic data: how far are we from it, what has been proposed in the past to achieve this goal, and what is now being proposed. All of this may be summarized under the general heading of Analytical Quality Control, i.e., how to achieve precise and accurate analysis. The author proposes that anyone wishing to analyze archaeological ceramics should simply use his own standard, but attempt to calibrate that standard as closely as possible to absolute (i.e., accurate) concentration values. The relationship of Analytical Quality Control to provenience location is also examined
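    The comparison-to-a-standard approach described in this abstract reduces to a simple ratio of decay-corrected specific count rates. The following sketch (function and parameter names are my own illustration) assumes the unknown and the standard were co-irradiated with identical geometry and counting efficiency:

    ```python
    def comparator_concentration(c_std, counts_unknown, counts_std,
                                 mass_unknown, mass_std,
                                 decay_unknown=1.0, decay_std=1.0):
        """
        Relative (comparator) method of activation analysis: the unknown's
        concentration follows from the ratio of decay-corrected specific
        count rates, assuming identical irradiation, geometry and counting
        efficiency for both specimens.
        """
        specific_unknown = counts_unknown / (mass_unknown * decay_unknown)
        specific_std = counts_std / (mass_std * decay_std)
        return c_std * specific_unknown / specific_std
    ```

    This is why the absolute flux and spectral distribution drop out: they affect both specific count rates identically and cancel in the ratio.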

  1. Recommendations for a proposed standard for performing systems analysis

    International Nuclear Information System (INIS)

    LaChance, J.; Whitehead, D.; Drouin, M.

    1998-01-01

    In August 1995, the Nuclear Regulatory Commission (NRC) issued a policy statement proposing improved regulatory decisionmaking by increasing the use of PRA [probabilistic risk assessment] in all regulatory matters to the extent supported by the state of the art in PRA methods and data. A key aspect of using PRA in risk-informed regulatory activities is establishing the appropriate scope and attributes of the PRA. In this regard, ASME decided to develop a consensus PRA Standard. The objective is to develop a PRA Standard such that the technical quality of nuclear plant PRAs will be sufficient to support risk-informed regulatory applications. This paper presents example recommendations for the systems analysis element of a PRA for incorporation into the ASME PRA Standard

  2. Multielement comparison of instrumental neutron activation analysis techniques using reference materials

    International Nuclear Information System (INIS)

    Ratner, R.T.; Vernetson, W.G.

    1995-01-01

    Several instrumental neutron activation analysis techniques (parametric, comparative, and k0-standardization) are evaluated using three reference materials. Each technique is applied to the National Institute of Standards and Technology standard reference materials SRM 1577a (Bovine Liver) and SRM 2704 (Buffalo River Sediment), and to the United States Geological Survey standard BHVO-1 (Hawaiian Basalt Rock). Identical (but not optimum) irradiation, decay, and counting schemes are employed with each technique to provide a basis for comparison and to determine sensitivities in a routine irradiation scheme. Fifty-one elements are used in this comparison; however, several elements are not detected in the reference materials because of the rigid analytical conditions (e.g., insufficient irradiation length, or the activity of the radioisotope of interest decaying below the lower limit of detection before the counting interval). Most elements are normally distributed around certified or consensus values with a standard deviation of 10%. For some elements, discrepancies are observed and discussed. The accuracy, precision, and sensitivity of each technique are discussed by comparing the analytical results to consensus values for the Hawaiian Basalt Rock to demonstrate the diversity of multielement applications. (author) 4 refs.; 2 tabs
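    A flagging rule of the kind implied by this abstract (values normally distributed around certified values with a 10% standard deviation) can be sketched in a few lines; the function names and the default tolerance are my own illustration:

    ```python
    def percent_deviation(measured, consensus):
        """Signed deviation of a measured value from the consensus value, in %."""
        return 100.0 * (measured - consensus) / consensus

    def within_tolerance(measured, consensus, tol_percent=10.0):
        """Flag a result whose deviation exceeds the expected scatter."""
        return abs(percent_deviation(measured, consensus)) <= tol_percent
    ```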

  3. Modification of the cranial closing wedge ostectomy technique for the treatment of canine cruciate disease. Description and comparison with standard technique.

    Science.gov (United States)

    Wallace, A M; Addison, E S; Smith, B A; Radke, H; Hobbs, S J

    2011-01-01

    To describe a modification of the cranial closing wedge ostectomy (CCWO) technique and to compare its efficacy to the standard technique on cadaveric specimens. The standard and modified CCWO technique were applied to eight pairs of cadaveric tibiae. The following parameters were compared following the ostectomy: degrees of plateau levelling achieved (degrees), tibial long axis shift (degrees), reduction in tibial length (mm), area of bone wedge removed (cm²), and the area of proximal fragment (cm²). The size of the removed wedge of bone and the reduction in tibial length were significantly less with the modified CCWO technique. The modified CCWO has two main advantages. Firstly a smaller wedge is removed, allowing a greater preservation of bone stock in the proximal tibia, which is advantageous for implant placement. Secondly, the tibia is shortened to a lesser degree, which might reduce the risk of recurvatum, fibular fracture and patella desmitis. These factors are particularly propitious for the application of this technique to Terrier breeds with excessive tibial plateau angle, where large angular corrections are required. The modified CCWO is equally effective for plateau levelling and results in an equivalent tibial long-axis shift. A disadvantage with the modified technique is that not all of the cross sectional area of the distal fragment contributes to load sharing at the osteotomy.

  4. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated against the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing diagnosis error probability are suggested as part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules

  5. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia)]; Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)]

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
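    The age equation stated in the first sentence of this abstract is a simple quotient; the sketch below (function names and the component breakdown are my own illustration) also sums the annual dose rate from its conventional components, which is where the trace-element analyses enter:

    ```python
    def annual_dose_rate(beta_gy_ka, gamma_gy_ka, cosmic_gy_ka, alpha_gy_ka=0.0):
        """Total annual dose rate (Gy/ka) as the sum of the component rates,
        each derived from radioelement concentrations or direct measurement."""
        return alpha_gy_ka + beta_gy_ka + gamma_gy_ka + cosmic_gy_ka

    def luminescence_age_ka(equivalent_dose_gy, dose_rate_gy_ka):
        """Age (ka) = dose accumulated since the dated event / annual dose rate."""
        return equivalent_dose_gy / dose_rate_gy_ka
    ```

    Radioactive disequilibrium, mentioned later in the abstract, breaks the assumption that these component rates were constant over the burial period.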

  6. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia)]; Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)]

    1997-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  7. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  8. The development of a standard format for accelerator data analysis

    International Nuclear Information System (INIS)

    Cohen, S.

    1989-01-01

    The purpose of specifying a standard file format is to facilitate the analysis of data sampled by accelerator beam diagnostic instrumentation. The format's design needs to be flexible enough to allow storage of information from disparate diagnostic devices placed in the beam line. The goal of this project was to establish a standard file layout and syntax that can be generated and 'understood' by a large set of applications running on the control and data-analysis computers at LAMPF as well as applications on personal computers. Only one file-parsing algorithm is needed for all computing systems. It is a straightforward process to code a parser for both the control computer and PCs once a consensus on the file syntax has been established. This paper describes the file format and the methods used to integrate the format into existing diagnostic and control software
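    The abstract does not reproduce the LAMPF syntax itself, so the following is only a hedged sketch of the "one parser for all systems" idea, using a hypothetical tagged record syntax of my own invention (bracketed device headers followed by key = value lines):

    ```python
    def parse_records(text):
        """
        Parse a hypothetical tagged record syntax: each record starts with a
        '[device-name]' header and is followed by 'key = value' lines.
        Blank lines and '#' comments are ignored.
        Returns a dict of {device: {key: value}}.
        """
        records, current = {}, None
        for raw in text.splitlines():
            line = raw.strip()
            if not line or line.startswith('#'):
                continue
            if line.startswith('[') and line.endswith(']'):
                current = line[1:-1]
                records[current] = {}
            elif current is not None and '=' in line:
                key, _, value = line.partition('=')
                records[current][key.strip()] = value.strip()
        return records
    ```

    Because the function depends only on string handling, the identical algorithm runs unchanged on a control computer or a PC, which is the portability property the abstract emphasizes.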

  9. The development of a standard format for accelerator data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, S [Los Alamos National Lab., NM (USA)]

    1990-08-01

    The purpose of specifying a standard file format is to facilitate the analysis of data sampled by accelerator beam-diagnostic instrumentation. The format's design needs to be flexible enough to allow storage of information from disparate diagnostic devices placed in the beam line. The goal of this project was to establish a standard file layout and syntax that can be generated and 'understood' by a large set of applications running on the control and data-analysis computers at LAMPF, as well as applications on personal computers. Only one file-parsing algorithm is needed for all computing systems. Once a consensus on the file syntax has been established, it is a straightforward process to code a parser for both the control computer and PCs. This paper describes the file format and the method used to integrate the format into existing diagnostic and control software. (orig.).

  10. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also covered, as are other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and contaminants including pesticides and antibiotics. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used to control technological processes in the food industry and to identify numerous compounds present in food. Owing to its numerous advantages, CE is successfully used in routine food analysis.

  11. Local properties of analytic functions and non-standard analysis

    International Nuclear Information System (INIS)

    O'Brian, N.R.

    1976-01-01

    This is an expository account which shows how the methods of non-standard analysis can be applied to prove the Nullstellensatz for germs of analytic functions. This method of proof was discovered originally by Abraham Robinson. The necessary concepts from model theory are described in some detail and the Nullstellensatz is proved by investigating the relation between the set of infinitesimal elements in the complex n-plane and the spectrum of the ring of germs of analytic functions. (author)

  12. Standardization of the PCR technique for the detection of delta toxin in Staphylococcus spp.

    Directory of Open Access Journals (Sweden)

    C. Marconi

    2005-06-01

    Full Text Available Coagulase-negative staphylococci (CNS), components of the normal flora of neonates, have emerged as important opportunistic pathogens of nosocomial infections that occur in neonatal intensive care units. Some authors have reported the ability of some CNS strains, particularly Staphylococcus epidermidis, to produce a toxin similar to S. aureus delta toxin. This toxin is an exoprotein that has a detergent action on the membranes of various cell types resulting in rapid cell lysis. The objectives of the present study were to standardize the Polymerase Chain Reaction (PCR) technique for the detection of the gene responsible for the production of delta toxin (hld gene) in staphylococcal species isolated from catheters and blood cultures obtained from neonates, and to compare the results to those obtained with the phenotypic synergistic hemolysis method. Detection of delta toxin by the phenotypic and genotypic method yielded similar results for the S. aureus isolates. However, in S. epidermidis, a higher positivity was observed for PCR (97.4%) compared to the synergistic hemolysis method (86.8%). Among CNS, S. epidermidis was the most frequent isolate and was a delta toxin producer. Staphylococcus simulans and S. warneri tested positive by the phenotypic method, but their positivity was not confirmed by PCR for the hld gene detection. These results indicate that different genes might be responsible for the production of this toxin in different CNS species, requiring highly specific primers for their detection. PCR was found to be a rapid and reliable method for the detection of the hld gene in S. aureus and S. epidermidis.

  13. Rapid analysis of molybdenum contents in molybdenum master alloys by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Tongkong, P.

    1985-01-01

    Determination of molybdenum contents in molybdenum master alloys has been performed using the energy-dispersive x-ray fluorescence (EDX) technique, where analyses were made via standard additions and calibration curves. Comparison of the EDX technique with other analytical techniques, i.e., wavelength-dispersive x-ray fluorescence, neutron activation analysis and inductively coupled plasma spectrometry, showed consistency in the results. The technique was found to yield reliable results when molybdenum contents in master alloys were in the range of 13 to 50 percent, using an HPGe detector or a proportional counter. When the required error was set at 1%, the minimum analyzing time was found to be 30 and 60 seconds for Fe-Mo master alloys with molybdenum contents of 13.54 and 49.09 percent, respectively. For Al-Mo master alloys, the minimum times required were 120 and 300 seconds for molybdenum contents of 15.22 and 47.26 percent, respectively
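    The standard-additions calculation mentioned in this abstract can be sketched generically (a minimal illustration of the method, not the authors' code): the measured signal is fitted against the added analyte concentration, and the unknown concentration is the magnitude of the extrapolated x-intercept, i.e. intercept/slope.

    ```python
    def standard_additions_concentration(added, signal):
        """
        Ordinary least-squares fit of signal vs. added concentration;
        the unknown concentration is intercept/slope (the distance from
        the origin to the x-intercept of the fitted line).
        """
        n = len(added)
        mean_x = sum(added) / n
        mean_y = sum(signal) / n
        sxx = sum((x - mean_x) ** 2 for x in added)
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(added, signal))
        slope = sxy / sxx
        intercept = mean_y - slope * mean_x
        return intercept / slope
    ```

    Standard additions is preferred over an external calibration curve when matrix effects (here, the alloy matrix) change the sensitivity from sample to sample.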

  14. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
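    Two of the domains named in this classification can be illustrated with standard time-series measures (a sketch of my own; the review itself covers more than 70 techniques): statistical measures such as the sample standard deviation and RMSSD, and a geometric measure, the Poincaré plot widths SD1/SD2, which are derived from them.

    ```python
    import math

    def sd(x):
        """Statistical domain: sample standard deviation."""
        m = sum(x) / len(x)
        return math.sqrt(sum((v - m) ** 2 for v in x) / (len(x) - 1))

    def rmssd(x):
        """Statistical domain: root mean square of successive differences."""
        diffs = [b - a for a, b in zip(x, x[1:])]
        return math.sqrt(sum(d * d for d in diffs) / len(diffs))

    def poincare_sd1_sd2(x):
        """Geometric domain: dispersion perpendicular to (SD1) and along (SD2)
        the line of identity in the (x_n, x_{n+1}) scatter plot, via the
        standard identities SD1 = RMSSD/sqrt(2), SD2^2 = 2*SD^2 - SD1^2."""
        sd1 = rmssd(x) / math.sqrt(2)
        sd2_sq = 2 * sd(x) ** 2 - sd1 ** 2
        return sd1, math.sqrt(max(sd2_sq, 0.0))
    ```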

  15. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  16. "Heidelberg standard examination" and "Heidelberg standard procedures" - Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education.

    Science.gov (United States)

    Nikendei, C; Ganschow, P; Groener, J B; Huwendiek, S; Köchel, A; Köhl-Hackert, N; Pjontek, R; Rodrian, J; Scheibe, F; Stadler, A-K; Steiner, T; Stiepak, J; Tabatabai, J; Utz, A; Kadmon, M

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects "Heidelberg standard examination" and "Heidelberg standard procedures", which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties.

  17. Setting Standards for Medically-Based Running Analysis

    Science.gov (United States)

    Vincent, Heather K.; Herman, Daniel C.; Lear-Barnes, Leslie; Barnes, Robert; Chen, Cong; Greenberg, Scott; Vincent, Kevin R.

    2015-01-01

    Setting standards for medically based running analyses is necessary to ensure that runners receive a high-quality service from practitioners. Medical and training history, physical and functional tests, and motion analysis of running at self-selected and faster speeds are key features of a comprehensive analysis. Self-reported history and movement symmetry are critical factors that require follow-up therapy or long-term management. Pain or injury is typically the result of a functional deficit above or below the site along the kinematic chain. PMID:25014394

  18. A Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Logan, Jeffrey [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Short, Walter [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  19. Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, P.; Logan, J.; Bird, L.; Short, W.

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  20. Text localization using standard deviation analysis of structure elements and support vector machines

    Directory of Open Access Journals (Sweden)

    Zagoris Konstantinos

    2011-01-01

    Full Text Available A text localization technique is required to successfully exploit document images such as technical articles and letters. The proposed method detects and extracts text areas from document images. Initially, a connected components analysis technique detects blocks of foreground objects. Then, a descriptor that consists of a set of suitable document structure elements is extracted from the blocks. This is achieved by incorporating an algorithm called Standard Deviation Analysis of Structure Elements (SDASE), which maximizes the separability between the blocks. Another feature of the SDASE is that its length adapts according to the requirements of the application. Finally, the descriptor of each block is used as input to a trained support vector machine that classifies the block as text or not. The proposed technique is also capable of adjusting to the text structure of the documents. Experimental results on benchmarking databases demonstrate the effectiveness of the proposed method.
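    The describe-then-classify pipeline can be sketched in miniature (a toy stand-in of my own, not the SDASE algorithm or the paper's trained SVM): each foreground block yields a small feature vector built from standard deviations of simple structure statistics, which is then passed to a decision function.

    ```python
    import math

    def block_descriptor(block):
        """
        Toy stand-in for a block descriptor: for a binary block (list of rows
        of 0/1), return the fill ratio plus the population standard deviations
        of the per-row and per-column ink counts.
        """
        rows = [sum(r) for r in block]
        cols = [sum(c) for c in zip(*block)]
        area = len(block) * len(block[0])

        def std(v):
            m = sum(v) / len(v)
            return math.sqrt(sum((x - m) ** 2 for x in v) / len(v))

        return [sum(rows) / area, std(rows), std(cols)]

    def is_text(descriptor, weights=(4.0, -0.5, -0.5), bias=-1.0):
        """Linear decision function standing in for the trained SVM;
        the weights here are arbitrary placeholders, not learned values."""
        score = bias + sum(w * f for w, f in zip(weights, descriptor))
        return score > 0
    ```

    In the real method the descriptor is learned to maximize separability between text and non-text blocks, and the classifier is an SVM trained on labeled examples.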

  1. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  2. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
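    One of the unsupervised techniques named here, principal component analysis, can be sketched for profile data via the singular value decomposition (an illustration of my own, not the authors' code): rows are observations (one velocity profile per time step), columns are depth bins.

    ```python
    import numpy as np

    def pca(profiles, n_components=2):
        """
        PCA of current profiles: center the data, take the SVD, and return
        (scores, principal components, explained-variance ratios).
        """
        X = np.asarray(profiles, dtype=float)
        Xc = X - X.mean(axis=0)                      # center each depth bin
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        var = s ** 2 / (len(X) - 1)                  # variance along each component
        ratio = var / var.sum()
        scores = Xc @ Vt[:n_components].T            # project onto leading components
        return scores, Vt[:n_components], ratio[:n_components]
    ```

    The score vectors are what a subsequent clustering step (e.g. fuzzy c-means or a self-organizing map) would operate on, in a lower-dimensional space than the raw profiles.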

  3. Standard Test Method for Oxygen Content Using a 14-MeV Neutron Activation and Direct-Counting Technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method covers the measurement of oxygen concentration in almost any matrix by using a 14-MeV neutron activation and direct-counting technique. Essentially, the same system may be used to determine oxygen concentrations ranging from over 50 % to about 10 µg/g, or less, depending on the sample size and available 14-MeV neutron fluence rates. Note 1 - The range of analysis may be extended by using higher neutron fluence rates, larger samples, and higher counting efficiency detectors. 1.2 This test method may be used on either solid or liquid samples, provided that they can be made to conform in size, shape, and macroscopic density during irradiation and counting to a standard sample of known oxygen content. Several variants of this method have been described in the technical literature. A monograph is available which provides a comprehensive description of the principles of activation analysis using a neutron generator (1). 1.3 The values stated in either SI or inch-pound units are to be regarded...
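    The comparative measurement in 1.2 reduces to a ratio of specific count rates between the sample and the matched standard of known oxygen content (a minimal sketch of my own; function names are not from the standard):

    ```python
    def oxygen_content(counts_sample, counts_standard, oxygen_standard,
                       mass_sample, mass_standard):
        """
        Comparative 14-MeV activation measurement: with sample and standard
        matched in size, shape and macroscopic density, the oxygen mass
        fractions scale with the per-mass direct-counting rates.
        """
        specific_sample = counts_sample / mass_sample
        specific_standard = counts_standard / mass_standard
        return oxygen_standard * specific_sample / specific_standard
    ```

    The matching requirement in 1.2 is what justifies the cancellation of flux, geometry and efficiency factors in this ratio.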

  4. Comparison of anthropometry with photogrammetry based on a standardized clinical photographic technique using a cephalostat and chair.

    Science.gov (United States)

    Han, Kihwan; Kwon, Hyuk Joon; Choi, Tae Hyun; Kim, Jun Hyung; Son, Daegu

    2010-03-01

    The aim of this study was to standardize clinical photogrammetric techniques, and to compare anthropometry with photogrammetry. To standardize clinical photography, we have developed a photographic cephalostat and chair. We investigated the repeatability of the standardized clinical photogrammetric technique. Then, with 40 landmarks, a total of 96 anthropometric measurement items was obtained from 100 Koreans. Ninety six photogrammetric measurements from the same subjects were also obtained from standardized clinical photographs using Adobe Photoshop version 7.0 (Adobe Systems Corporation, San Jose, CA, USA). The photogrammetric and anthropometric measurement data (mm, degree) were then compared. A coefficient was obtained by dividing the anthropometric measurements by the photogrammetric measurements. The repeatability of the standardized photography was statistically significantly high (p=0.463). Among the 96 measurement items, 44 items were reliable; for these items the photogrammetric measurements were not different to the anthropometric measurements. The remaining 52 items must be classified as unreliable. By developing a photographic cephalostat and chair, we have standardized clinical photogrammetric techniques. The reliable set of measurement items can be used as anthropometric measurements. For unreliable measurement items, applying a suitable coefficient to the photogrammetric measurement allows the anthropometric measurement to be obtained indirectly.
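    The coefficient procedure described in this abstract (dividing the anthropometric measurement by the photogrammetric one) can be sketched as follows; the function names and sample values are my own illustration:

    ```python
    def calibration_coefficients(anthropometric, photogrammetric):
        """Per-item coefficient: direct (anthropometric) measurement divided
        by the corresponding photogrammetric measurement."""
        return {item: anthropometric[item] / photogrammetric[item]
                for item in anthropometric}

    def indirect_measurement(photo_value, coefficient):
        """Apply a stored coefficient to a new photogrammetric value to
        estimate the direct anthropometric measurement."""
        return photo_value * coefficient
    ```

    For the 44 reliable items the coefficient is close to 1 and the photogrammetric value can be used directly; for the unreliable items the stored coefficient supplies the correction.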

  5. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  6. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening universities and national research centers at the local, national and international levels. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a program of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and their impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  7. Performance Analysis of Modified Drain Gating Techniques for Low Power and High Speed Arithmetic Circuits

    Directory of Open Access Journals (Sweden)

    Shikha Panwar

    2014-01-01

    Full Text Available This paper presents several high-performance and low-power techniques for CMOS circuits. In these design methodologies, the drain gating technique and its variations are modified by adding an additional NMOS sleep transistor at the output node, which helps in faster discharge and thereby provides higher speed. In order to achieve high performance, the proposed design techniques trade power for performance in the delay-critical sections of the circuit. Intensive simulations are performed using Cadence Virtuoso in a 45 nm standard CMOS technology at room temperature with a supply voltage of 1.2 V. Comparative analysis of the present circuits with standard CMOS circuits shows smaller propagation delay and lower power consumption.

  8. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves

    2009-01-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step in the development of HRA techniques in the industry. Because it is a first-generation technique, THERP's human-error quantification tables are based on a taxonomy that does not take human error mechanisms into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, the technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without treating any error mechanisms. Another deficiency is that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, along with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, a first-generation HRA technique, with CREAM and ATHEANA, which are second-generation HRA techniques. (author)

  10. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: bayout@cnen.gov.br, e-mail: rfonseca@cnen.gov.br

    2009-07-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step in the development of HRA techniques in the industry. Because it is a first-generation technique, THERP's human-error quantification tables are based on a taxonomy that does not take human error mechanisms into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, the technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without treating any error mechanisms. Another deficiency is that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, along with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, a first-generation HRA technique, with CREAM and ATHEANA, which are second-generation HRA techniques. (author)

  11. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  12. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. A comprehensive survey of current methodologies for communication network reliability is necessary. The major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.
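    As a toy illustration of what "quantifying the reliability of communication networks" involves (not an algorithm from the project above), two-terminal reliability of a small hypothetical network can be computed by exhaustive state enumeration, which is tractable only for very few links:

    ```python
    from itertools import product

    # Two-terminal reliability by state enumeration: the probability that a
    # path of working links connects a source node to a target node.
    # Topology and per-link reliabilities below are hypothetical.

    def connected(up_links, links, src, dst, nodes):
        """Depth-first search over links currently 'up' to test src-dst connectivity."""
        adj = {n: [] for n in nodes}
        for (a, b), up in zip(links, up_links):
            if up:
                adj[a].append(b)
                adj[b].append(a)
        seen, stack = {src}, [src]
        while stack:
            for nxt in adj[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return dst in seen

    def two_terminal_reliability(links, p, src, dst):
        nodes = {n for link in links for n in link}
        total = 0.0
        for state in product([0, 1], repeat=len(links)):
            prob = 1.0
            for up, pi in zip(state, p):
                prob *= pi if up else (1.0 - pi)
            if connected(state, links, src, dst, nodes):
                total += prob
        return total

    # A redundant pair of links between two nodes, each 90 % reliable:
    # the pair fails only if both links fail, so reliability is 1 - 0.1 * 0.1.
    r = two_terminal_reliability([("A", "B"), ("A", "B")], [0.9, 0.9], "A", "B")
    ```

    Practical algorithms avoid the exponential enumeration via factoring or bounds, which is presumably what the surveyed methodologies address.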

  13. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors involved. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  14. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors involved. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  15. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  16. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, André de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Hélène H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  17. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique, based on mass determination by X-ray absorption, allows fast analysis through automation and easy data handling, in addition to providing the accuracy required by quality control and research applications.
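    The physics behind gravitational sedimentation sizing can be illustrated with Stokes' law, which relates a sphere's terminal settling velocity to its diameter. The material constants and geometry below are illustrative defaults (quartz-like particles in water), not values from the cited work:

    ```python
    import math

    # Stokes' law: a sphere of diameter d settles at terminal velocity
    #   v = g * d**2 * (rho_p - rho_f) / (18 * mu).
    # Inverting gives the largest particle diameter still in suspension at
    # depth h after settling time t, the basis of sedimentation analysis.

    def stokes_diameter(h_m, t_s, rho_p=2650.0, rho_f=1000.0, mu=1.0e-3, g=9.81):
        """Equivalent spherical diameter (m) of particles that settle h_m in t_s."""
        v = h_m / t_s  # settling velocity (m/s)
        return math.sqrt(18.0 * mu * v / (g * (rho_p - rho_f)))

    # Quartz-density particles in water: depth 0.1 m, after one hour
    # only particles finer than roughly 5-6 micrometres remain in suspension.
    d = stokes_diameter(0.1, 3600.0)
    ```

    An X-ray absorption instrument effectively measures the suspended mass remaining at a known depth over time and maps each time point to a Stokes diameter in this way.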

  18. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of conditions of a major spill by pouring crude oil on the cells from perforated cans and the in-situ bioremediation of the polluted soils using the techniques that consisted in the manipulation of different variables within the soil environment. The analysis of soil characteristics after a ...

  19. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.…
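    The classical item statistics mentioned above, item facility and a discrimination index, can be sketched as follows; the 0/1 item responses and total test scores are illustrative, not data from the study:

    ```python
    # Classical item analysis: item facility (proportion answering correctly)
    # and a discrimination index comparing upper and lower scoring groups.

    def item_facility(responses):
        """Proportion of examinees answering the item correctly (0/1 coded)."""
        return sum(responses) / len(responses)

    def item_discrimination(item_scores, total_scores, fraction=1 / 3):
        """Upper-minus-lower facility difference, using top/bottom score groups."""
        n = len(total_scores)
        k = max(1, int(n * fraction))
        order = sorted(range(n), key=lambda i: total_scores[i])
        lower, upper = order[:k], order[-k:]
        fac = lambda idx: sum(item_scores[i] for i in idx) / len(idx)
        return fac(upper) - fac(lower)

    item = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]               # one cloze blank, 0/1
    totals = [25, 30, 12, 28, 10, 27, 29, 9, 26, 11, 13, 24]  # total test scores

    facility = item_facility(item)
    discrim = item_discrimination(item, totals)
    ```

    Items would then be retained for the "tailored cloze" when both indices fall in an acceptable range, e.g. moderate facility and high positive discrimination.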

  20. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  1. Standard Test Method for Determining Thermal Neutron Reaction Rates and Thermal Neutron Fluence Rates by Radioactivation Techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 The purpose of this test method is to define a general procedure for determining an unknown thermal-neutron fluence rate by neutron activation techniques. It is not practicable to describe completely a technique applicable to the large number of experimental situations that require the measurement of a thermal-neutron fluence rate. Therefore, this method is presented so that the user may adapt to his particular situation the fundamental procedures of the following techniques. 1.1.1 Radiometric counting technique using pure cobalt, pure gold, pure indium, cobalt-aluminum, alloy, gold-aluminum alloy, or indium-aluminum alloy. 1.1.2 Standard comparison technique using pure gold, or gold-aluminum alloy, and 1.1.3 Secondary standard comparison techniques using pure indium, indium-aluminum alloy, pure dysprosium, or dysprosium-aluminum alloy. 1.2 The techniques presented are limited to measurements at room temperatures. However, special problems when making thermal-neutron fluence rate measurements in high-...

  2. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    Directory of Open Access Journals (Sweden)

    Joaquín Huerta

    2012-08-01

    Full Text Available Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
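    The XML-overhead concern discussed above can be made concrete with a size comparison between an uncompressed XML observation batch and its gzip-compressed form. The XML snippet is hand-written here in an O&M-like style for illustration, not actual SWE output:

    ```python
    import gzip

    # Compare payload sizes a smartphone client might transmit: raw XML
    # observations versus the same bytes gzip-compressed.
    xml_obs = (
        '<om:Observation xmlns:om="http://www.opengis.net/om/2.0">'
        '<om:phenomenonTime>2012-08-01T12:00:00Z</om:phenomenonTime>'
        '<om:result uom="Cel">21.4</om:result>'
        '</om:Observation>'
    ) * 100  # a batch of repeated observations

    raw = xml_obs.encode("utf-8")
    compressed = gzip.compress(raw)
    ratio = len(compressed) / len(raw)  # highly repetitive XML compresses well
    ```

    Tag-heavy, repetitive sensor XML typically compresses by an order of magnitude or more, which is why compressed transfer formats can alleviate the bandwidth side of the problem even when parsing cost remains.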

  3. Soil texture analysis by laser diffraction - standardization needed

    DEFF Research Database (Denmark)

    Callesen, Ingeborg; Palviainen, M.; Kjønaas, O. Janne

    2017-01-01

    Soil texture is a central soil quality property. Laser diffraction (LD) for determination of particle size distribution (PSD) is now widespread due to easy analysis and low cost. However, pretreatment methods and interpretation of the resulting soil PSD’s are not standardized. Comparison of LD data...... with sedimentation and sieving data may cause misinterpretation and confusion. In literature that reports PSD’s based on LD, pretreatment methods, operating procedures and data methods are often underreported or not reported, although literature stressing the importance exists (e.g. Konert and Vandenberghe, 1997...... and many newer; ISO 13320:2009). PSD uncertainty caused by pretreatments and PSD bias caused by plate-shaped clay particles still calls for more method standardization work. If LD is used more generally, new pedotransfer functions for other soil properties (e.g water retention) based on sieving...

  4. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    Directory of Open Access Journals (Sweden)

    Gaetano Luglio

    2015-06-01

    Conclusion: Proper laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short term outcomes, even in a learning curve setting. Key factors for better outcomes and shortening the learning curve seem to be the adoption of a standardized technique and training model along with the strict supervision of an expert colorectal surgeon.

  5. Portable optical frequency standard based on sealed gas-filled hollow-core fiber using a novel encapsulation technique

    DEFF Research Database (Denmark)

    Triches, Marco; Brusch, Anders; Hald, Jan

    2015-01-01

    A portable stand-alone optical frequency standard based on a gas-filled hollow-core photonic crystal fiber is developed to stabilize a fiber laser to the 13C2H2 P(16) (ν1 + ν3) transition at 1542 nm using saturated absorption. A novel encapsulation technique is developed to permanently seal...

  6. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

    Marcelo Ard& #243; n; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  7. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real-time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG-welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
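    The sub-pixel peak location step can be illustrated with a three-point parabolic fit, a common lightweight alternative to the LPO and cubic-spline methods the paper actually uses. The spectrum below is synthetic, a Gaussian-like emission peak centred between spectrometer pixels:

    ```python
    import math

    # Estimate the sub-pixel centre of an emission peak by fitting a parabola
    # through the maximum sample and its two neighbours.

    def parabolic_peak(y, i):
        """Sub-pixel position of a local maximum near index i via parabola fit."""
        a, b, c = y[i - 1], y[i], y[i + 1]
        denom = a - 2 * b + c
        return i + 0.5 * (a - c) / denom if denom != 0 else float(i)

    # Synthetic Gaussian-like peak centred at pixel 5.3 (arbitrary intensity units).
    center_true = 5.3
    y = [math.exp(-((k - center_true) ** 2) / 2.0) for k in range(11)]

    i_max = max(range(1, len(y) - 1), key=lambda k: y[k])
    center_est = parabolic_peak(y, i_max)  # within ~0.05 pixel of 5.3
    ```

    Locating peak centres to a fraction of a pixel is what allows each emission line, and hence each atomic species, to be identified automatically before temperatures are derived from line intensities.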

  8. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent, present naturally in groundwater due to certain minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may occur from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water by using high-tech instruments like the Atomic Absorption Spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot be determined easily with a simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to arsenic concentrations of 1 ppb.
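    Whatever the colour reagent, quantitation in such spectrophotometric methods rests on a linear calibration curve (Beer-Lambert behaviour). A minimal sketch with hypothetical absorbance readings for arsenic standards, not data from the paper:

    ```python
    # Fit a straight-line calibration from standards, then read an unknown
    # sample's concentration off the fitted line.

    def linear_fit(x, y):
        """Least-squares slope and intercept for y = m*x + b."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
        return m, my - m * mx

    conc_ppb = [0.0, 2.0, 5.0, 10.0]           # arsenic standards
    absorbance = [0.002, 0.041, 0.100, 0.199]  # hypothetical readings

    m, b = linear_fit(conc_ppb, absorbance)
    unknown = (0.061 - b) / m  # concentration (ppb) of a sample reading 0.061
    ```

    Method modifications of the kind the abstract describes aim to steepen this line (higher sensitivity) or lower the blank so that readings near 1 ppb rise above the noise.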

  9. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses the TLD 700 material in its dosemeter. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the reference of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analyzing the TLD response were investigated and compared, involving dose values in this interval. These techniques include thermal pre-treatment, and different glow curve analysis methods were investigated. The results obtained showed the necessity of developing specific software that permits automatic background subtraction from the glow curves for each dosemeter. This software was developed and is being tested. Preliminary results showed that the software increases the response reproducibility. (author)

  10. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
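    As a generic illustration of sampling-based sensitivity measures computed directly from model runs (not the paper's specific method): sample the inputs, run the model, and rank each input by its correlation with the output. The toy model and coefficients are invented:

    ```python
    import random

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length samples."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x)
        vy = sum((b - my) ** 2 for b in y)
        return cov / (vx * vy) ** 0.5

    def model(x1, x2, x3):
        """Toy deterministic model: strongly driven by x1, weakly by x2 and x3."""
        return 5.0 * x1 + 0.5 * x2 + 0.01 * x3

    random.seed(0)
    samples = [[random.random() for _ in range(3)] for _ in range(500)]
    outputs = [model(*s) for s in samples]

    # Correlation of each input with the output as a sensitivity measure;
    # x1 should dominate given its large coefficient.
    sens = [pearson([s[i] for s in samples], outputs) for i in range(3)]
    ```

    Note this requires no surrogate model at all: the sensitivity measures come straight from the sampled code results, which is the spirit of the approach the abstract advocates.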

  11. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items: meat, dairy products, fruits...

  12. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
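
The per-input smoothing idea above can be sketched in a few lines. The code below is an illustrative assumption, not the authors' implementation: it scores each input by the fraction of output variance explained by a Nadaraya-Watson kernel smooth of the output on that input alone, which is the simplest nonparametric analogue of the regression-based sensitivity measures the record describes.

```python
import numpy as np

def kernel_smooth(x, y, bandwidth):
    """Nadaraya-Watson kernel regression of y on a single input x."""
    # Gaussian kernel weights between every pair of sample points
    d = (x[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * d**2)
    return (w @ y) / w.sum(axis=1)

def sensitivity_index(x, y, bandwidth=0.3):
    """Fraction of output variance explained by a smooth of y on x alone."""
    yhat = kernel_smooth(x, y, bandwidth)
    return 1.0 - np.var(y - yhat) / np.var(y)

rng = np.random.default_rng(0)
n = 400
x1 = rng.uniform(-3, 3, n)   # influential input with a nonlinear effect
x2 = rng.uniform(-3, 3, n)   # nearly inert input
y = np.sin(x1) + 0.05 * x2 + rng.normal(0, 0.1, n)

s1 = sensitivity_index(x1, y)
s2 = sensitivity_index(x2, y)
print(f"S(x1) = {s1:.2f}, S(x2) = {s2:.2f}")  # x1 should dominate
```

A linear-regression R² would miss the sin(x1) effect almost entirely; the smoother recovers it, which is the point the abstract makes about nonlinear input-output relationships.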

  13. Psychovisual masks and intelligent streaming RTP techniques for the MPEG-4 standard

    Science.gov (United States)

    Mecocci, Alessandro; Falconi, Francesco

    2003-06-01

    In our implementation we adopted the Visual Brain theory, i.e. the study of what the "psychic eye" can get from a scene. According to this theory, a Psychomask Image Analysis (PIA) module has been developed to extract the visually homogeneous regions of the background. The PIA module produces two complementary masks: one for the visually low-variance zones and one for the highly variable zones; these zones are compressed with different strategies and encoded into two multiplexed streams. From practical experiments it turned out that the separate coding is advantageous only if the low-variance zones exceed 50% of the whole background area (due to the overhead of transmitting the zone masks). The SLIC module decides the appropriate transmission modality by analyzing the results produced by the PIA module. The main features of this codec are low bitrate, good image quality and coding speed. The current implementation runs in real time on standard PC platforms, the major limitation being the fixed position of the acquisition sensor. This limitation is due to the difficulty of separating moving objects from the background when the acquisition sensor moves. Our current real-time segmentation module does not produce suitable results if the acquisition sensor moves (only slight oscillatory movements are tolerated). In any case, the system is particularly suitable for tele-surveillance applications at low bit-rates, where the camera is usually fixed or alternates among some predetermined positions (our segmentation module is capable of accurately separating moving objects from the static background when the acquisition sensor stops, even if different scenes are seen as a result of the sensor displacements). 
Moreover, the proposed architecture is general, in the sense that when real-time, robust segmentation systems (capable of separating objects in real-time from the background while the sensor itself is moving) will be available, they can be

  14. Accident analysis for aircraft crash into hazardous facilities: DOE standard

    International Nuclear Information System (INIS)

    1996-10-01

    This standard provides the user with sufficient information to evaluate and assess the significance of aircraft crash risk on facility safety without expending excessive effort where it is not required. It establishes an approach for performing a conservative analysis of the risk posed by a release of hazardous radioactive or chemical material resulting from an aircraft crash into a facility containing significant quantities of such material. This can establish whether a facility has a significant potential for an aircraft impact and whether this has the potential for producing significant offsite or onsite consequences. General implementation guidance, screening and evaluation guidelines, and methodologies for the evaluations are included
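
The conservative screening approach can be illustrated with a four-factor impact-frequency form, F = N · P · f(x, y) · A (annual flight operations, crash rate per flight, crash location probability per unit area near the site, and facility effective area). The structure of the formula follows the standard's screening philosophy, but every numeric value below is hypothetical, not taken from the standard.

```python
def crash_impact_frequency(n_ops, p_crash, f_xy, area_km2):
    """Four-factor screening estimate of aircraft impact frequency (per year).
    n_ops: flights per year, p_crash: crashes per flight,
    f_xy: crash location probability per km^2 near the site,
    area_km2: facility effective area."""
    return n_ops * p_crash * f_xy * area_km2

# Illustrative values only; real analyses take these from site-specific data.
F = crash_impact_frequency(n_ops=50_000, p_crash=1e-7, f_xy=2e-2, area_km2=0.005)
screened_out = F < 1e-6   # compare with a commonly cited 1e-6/yr screening guideline
print(F, screened_out)
```

If the estimated frequency falls below the screening guideline, no further consequence analysis is required; otherwise the standard's detailed evaluation methodologies apply.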

  15. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and among those working in the area of service quality in particular, as well as to the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  16. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and into tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results
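
The core arithmetic of such an analysis, combining independent standard uncertainty components in quadrature and expanding with a coverage factor, can be sketched as follows. The component values are hypothetical, not from the presentation.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u**2 for u in components))

# Hypothetical PV power measurement: uncertainty contributions from
# irradiance, temperature correction, and data acquisition, in percent.
u_c = combined_standard_uncertainty([1.5, 0.8, 0.4])
U = 2.0 * u_c   # expanded uncertainty, coverage factor k = 2 (~95 % interval)
print(f"u_c = {u_c:.2f} %, U = {U:.2f} %")
```

The reported result would then be stated as measured value ± U, which is exactly the "interval about a measured value" the abstract describes.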

  17. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, its advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed
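
The quantitative step can be sketched with the comparator method commonly used in NAA (all numbers below are hypothetical): after decay-correcting the measured activities back to the end of irradiation, the unknown concentration scales the standard's concentration by the ratio of activities.

```python
import math

def decay_corrected_activity(count_rate, t_decay, half_life):
    """Correct a measured count rate back to the end of irradiation."""
    lam = math.log(2) / half_life
    return count_rate * math.exp(lam * t_decay)

def concentration_by_comparator(a_sample, a_standard, c_standard):
    """Comparator NAA: concentrations scale with induced activities when
    sample and standard are irradiated and counted identically."""
    return c_standard * a_sample / a_standard

# Hypothetical counts; same decay time and half-life for sample and standard,
# so the decay correction cancels in the ratio.
a_s = decay_corrected_activity(1200.0, t_decay=3600, half_life=15.0 * 3600)
a_r = decay_corrected_activity(3000.0, t_decay=3600, half_life=15.0 * 3600)
c = concentration_by_comparator(a_s, a_r, c_standard=10.0)  # ug/g
print(f"{c:.2f} ug/g")
```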

  18. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The possibilities of the MENT have been demonstrated by applying it to the doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT but only 0.1% for the Fourier algorithm.
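
The record gives no MENT equations, so as a hedged stand-in here is a Richardson-Lucy iterative deconvolution applied to an unresolved doublet. It is a maximum-likelihood rather than a maximum-entropy scheme, but it shares the key properties the abstract highlights: a non-negative iterative solution that is far more tolerant of noise than unregularized Fourier division.

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=500):
    """Iterative deconvolution; the estimate stays non-negative throughout,
    much like entropy-regularized solutions."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1]
    est = np.full_like(observed, observed.mean())
    for _ in range(n_iter):
        conv = np.convolve(est, psf, mode="same")
        ratio = observed / np.maximum(conv, 1e-12)
        est *= np.convolve(ratio, psf_mirror, mode="same")
    return est

x = np.arange(128)
truth = np.zeros(128)
truth[58], truth[70] = 1.0, 0.7                  # doublet line positions
psf = np.exp(-0.5 * (np.arange(-12, 13) / 4.0) ** 2)  # instrumental broadening
observed = np.convolve(truth, psf / psf.sum(), mode="same")

est = richardson_lucy(observed, psf)
print(sorted(np.argsort(est)[-2:]))              # channels of the two strongest lines
```

After iteration the two components re-sharpen and the valley between them deepens, which is what resolving a doublet amounts to.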

  19. Nuclear microprobe analysis of the standard reference materials

    International Nuclear Information System (INIS)

    Jaksic, M.; Fazinic, S.; Bogdanovic, I.; Tadic, T.

    2002-01-01

    Most of the presently existing Standard Reference Materials (SRMs) for nuclear analytical methods are certified for analyzed masses of the order of a few hundred mg. The typical sample mass analyzed by PIXE or XRF methods is often below 1 mg. With the development of focused proton or x-ray beams, the typically analyzed masses go down to the μg or even ng level. It is difficult to make biological or environmental SRMs with the desired homogeneity at such a small scale. However, the use of fundamental parameter quantitative evaluation procedures (an absolute method) minimizes the need for SRMs. In the PIXE and micro-PIXE setups at our Institute, the fundamental parameter approach is used. For exact calibration of the quantitative analysis procedure just one standard sample is needed. In our case glass standards, which showed homogeneity down to the micron scale, were used. Of course, it is desirable to use SRMs for quality assurance, and therefore the need for homogeneous materials can be justified even for the micro-PIXE method. In this presentation, a brief overview of the PIXE setup calibration is given, along with some recent results of tests of several SRMs

  20. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs) also needs to be evaluated by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. 
Conclusion Our approach not only enhances the computational performance, and

  1. Placement of empty catheters for an HDR-emulating LDR prostate brachytherapy technique: comparison to standard intraoperative planning.

    Science.gov (United States)

    Niedermayr, Thomas R; Nguyen, Paul L; Murciano-Goroff, Yonina R; Kovtun, Konstantin A; Neubauer Sugar, Emily; Cail, Daniel W; O'Farrell, Desmond A; Hansen, Jorgen L; Cormack, Robert A; Buzurovic, Ivan; Wolfsberger, Luciant T; O'Leary, Michael P; Steele, Graeme S; Devlin, Philip M; Orio, Peter F

    2014-01-01

    We sought to determine whether placing empty catheters within the prostate and then inverse planning iodine-125 seed locations within those catheters (High Dose Rate-Emulating Low Dose Rate Prostate Brachytherapy [HELP] technique) would improve concordance between planned and achieved dosimetry compared with a standard intraoperative technique. We examined 30 consecutive low dose rate prostate cases performed by the standard intraoperative technique of planning followed by needle placement/seed deposition and compared them to 30 consecutive low dose rate prostate cases performed by the HELP technique. The primary endpoint was concordance between the planned percentage of the clinical target volume that receives at least 100% of the prescribed dose/dose that covers 90% of the volume of the clinical target volume (V100/D90) and the actual V100/D90 achieved at Postoperative Day 1. The HELP technique had superior concordance between the planned target dosimetry and what was actually achieved at Day 1 and Day 30. Specifically, target D90 at Day 1 was on average 33.7 Gy less than planned for the standard intraoperative technique but was only 10.5 Gy less than planned for the HELP technique (p < 0.05). Placing empty needles first and optimizing the plan to the known positions of the needles resulted in improved concordance between the planned and the achieved dosimetry to the target, possibly because of elimination of errors in needle placement. Copyright © 2014 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  2. Standardization of Image Quality Analysis – ISO 19264

    DEFF Research Database (Denmark)

    Wüller, Dietmar; Kejser, Ulla Bøgvad

    2016-01-01

    There are a variety of image quality analysis tools available for the archiving world, which are based on different test charts and analysis algorithms. ISO has formed a working group in 2012 to harmonize these approaches and create a standard way of analyzing the image quality for archiving...... systems. This has resulted in three documents that have been or are going to be published soon. ISO 19262 defines the terms used in the area of image capture to unify the language. ISO 19263 describes the workflow issues and provides detailed information on how the measurements are done. Last...... but not least ISO 19264 describes the measurements in detail and provides aims and tolerance levels for the different aspects. This paper will present the new ISO 19264 technical specification to analyze image quality based on a single capture of a multi-pattern test chart, and discuss the reasoning behind its...

  3. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  4. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on the optimization technique using PNET (Probabilistic Network Evaluation Technique) method for the highly redundant structures having a large number of collapse modes. This approach makes the best use of the merit of the optimization technique in which the idea of PNET method is used. The analytical process involves the minimization of safety index of the representative mode, subjected to satisfaction of the mechanism condition and of the positive external work. The procedure entails the sequential performance of a series of the NLP (Nonlinear Programming) problems, where the correlation condition as the idea of PNET method pertaining to the representative mode is taken as an additional constraint to the next analysis. Upon succeeding iterations, the final analysis is achieved when a collapse probability at the subsequent mode is extremely less than the value at the 1st mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, the conventional Monte Carlo simulation is also revised by using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)
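
The safety index minimized for each representative mode can be illustrated for the simplest case of one normally distributed resistance and one normally distributed load (all values hypothetical); the full method repeats this over many collapse modes under the PNET correlation constraints.

```python
import math

def safety_index(mu_r, sigma_r, mu_s, sigma_s):
    """Cornell safety index for the margin M = R - S with independent
    normal resistance R and load S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

def failure_probability(beta):
    """P(M < 0) = Phi(-beta), computed via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Hypothetical resistance and load statistics for a single collapse mode.
beta = safety_index(mu_r=300.0, sigma_r=30.0, mu_s=180.0, sigma_s=40.0)
pf = failure_probability(beta)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```

The paper's approximate structural collapse probability is then the sum of such mode probabilities over the representative modes classified by correlation.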

  5. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  6. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish the analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (nuclear power plants) were established by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulations can be used to establish a knowledge-based expert system to diagnose abnormal conditions of NPPs. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  7. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
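
The Bayesian comparison of a measured isotopic ratio against a database of calculated ratios can be sketched with a simple grid posterior. The database entries, the measured value, and the Gaussian likelihood below are invented for illustration and are not NOVA's actual model.

```python
import math

# Hypothetical database: calculated stable noble gas isotopic ratio
# as a function of burnup (GWd/t).
database = {10.0: 0.20, 20.0: 0.35, 30.0: 0.48, 40.0: 0.59}

def posterior(measured, sigma, prior=None):
    """Grid Bayesian update: Gaussian likelihood of the measured ratio
    around each database prediction, uniform prior by default."""
    burnups = sorted(database)
    prior = prior or {b: 1.0 / len(burnups) for b in burnups}
    like = {b: math.exp(-0.5 * ((measured - database[b]) / sigma) ** 2)
            for b in burnups}
    z = sum(like[b] * prior[b] for b in burnups)
    return {b: like[b] * prior[b] / z for b in burnups}

post = posterior(measured=0.47, sigma=0.03)
best = max(post, key=post.get)   # burnup most consistent with the measurement
print(best, round(post[best], 3))
```

The inferred burnup (here the posterior mode) is what an inspector would compare against the operator's declaration.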

  8. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  9. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish the analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (nuclear power plants) were established by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulations can be used to establish a knowledge-based expert system to diagnose abnormal conditions of NPPs. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  10. Comparing Sustainable Forest Management Certifications Standards: A Meta-analysis

    Directory of Open Access Journals (Sweden)

    Michael Rawson. Clark

    2011-03-01

    Full Text Available To solve problems caused by conventional forest management, forest certification has emerged as a driver of sustainable forest management. Several sustainable forest management certification systems exist, including the Forest Stewardship Council and those endorsed by the Programme for the Endorsement of Forest Certification, such as the Canadian Standards Association - Sustainable Forestry Management Standard CAN/CSA - Z809 and Sustainable Forestry Initiative. For consumers to use certified products to meet their own sustainability goals, they must have an understanding of the effectiveness of different certification systems. To understand the relative performance of three systems, we determined: (1) the criteria used to compare the Forest Stewardship Council, Canadian Standards Association - Sustainable Forestry Management, and Sustainable Forestry Initiative, (2) if consensus exists regarding their ability to achieve sustainability goals, and (3) what research gaps must be filled to improve our understanding of how forest certification systems affect sustainable forest management. We conducted a qualitative meta-analysis of 26 grey literature references (books, industry and nongovernmental organization publications) and 9 primary literature references (articles in peer-reviewed academic journals) that compared at least two of the aforementioned certification systems. The Forest Stewardship Council was the highest performer for ecological health and social sustainable forest management criteria. The Canadian Standards Association - Sustainable Forestry Management and Sustainable Forestry Initiative performed best under sustainable forest management criteria of forest productivity and economic longevity of a firm. Sixty-two percent of analyses were comparisons of the wording of certification system principles or criteria; 34% were surveys of foresters or consumers. An important caveat to these results is that only one comparison was based on

  11. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  12. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...

  13. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.
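    As a quick check on the wavelengths quoted, photon energy follows from E = hc/λ ≈ 1239.84 eV·nm / λ; a small sketch using the 118 nm line and the 126-106 nm tuning range from the abstract:

```python
# Photon energy E [eV] = h*c / lambda ≈ 1239.84 eV*nm / lambda[nm].
HC_EV_NM = 1239.84

def photon_energy_ev(wavelength_nm):
    """Photon energy in eV for a wavelength given in nm."""
    return HC_EV_NM / wavelength_nm

for wl in (118, 126, 106):
    print(f"{wl} nm -> {photon_energy_ev(wl):.2f} eV")
```

    The 118 nm line thus sits near 10.5 eV, and the quoted tuning range spans roughly 9.8-11.7 eV, which is what makes soft (near-threshold) ionization of many hydrocarbons possible.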

  14. When is hub gene selection better than standard meta-analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2).
The article also reports a comparison of meta-analysis techniques applied to
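    As a concrete instance of the "standard statistical approach" being compared, per-data-set Z statistics can be combined by Stouffer's weighted method; a minimal sketch (all inputs hypothetical):

```python
import math

def stouffer_meta_z(z_scores, weights=None):
    """Combine per-study Z statistics into a single meta-analysis Z
    (Stouffer's method; weights default to equal)."""
    if weights is None:
        weights = [1.0] * len(z_scores)
    num = sum(w * z for w, z in zip(weights, z_scores))
    return num / math.sqrt(sum(w * w for w in weights))

# Hypothetical Z statistics for one gene in three data sets, weighted by
# the square root of each study's sample size.
z = [2.1, 1.4, 2.8]
w = [math.sqrt(n) for n in (120, 80, 200)]
print(f"combined meta-analysis Z = {stouffer_meta_z(z, w):.2f}")
```

    Genes are then ranked by the combined Z (or its p-value), which is the ranking the consensus-module hub criterion is benchmarked against.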

  16. When is hub gene selection better than standard meta-analysis?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    Full Text Available Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques

  17. International cooperative analysis of standard substance, IAEA-0390

    International Nuclear Information System (INIS)

    Kawamoto, Keizo; Takada, Jitsuya; Moriyama, Hirotake; Akaboshi, Mitsuhiko

    1999-01-01

    Three kinds of algae (IAEA-0391, IAEA-0392 and IAEA-0393) were defined as biological standard substances for monitoring environmental pollution by the Analytical Quality Control Service of IAEA (IAEA-AQCS). In this study, these standard substances were analysed by ICP-MS, and the results were compared with those of instrumental neutron activation analysis (INAA) conducted in parallel. The cultures of the three algae were prepared cooperatively by IAEA-AQCS and the Microbial Institute of Czechoslovakia. After drying and sterilization by Co-60 exposure, the samples were sent to KURRI. When the results of the experiments at KURRI were compared with the values recommended through statistical treatment of the data obtained by IAEA, the values for five elements (Fe, Cr, Mg, Mn and Na) agreed well for each of IAEA-0391, IAEA-0392 and IAEA-0393, and the values for As, Ca, Cd, Co, Cu, K and Zn agreed approximately. For Hg and La, the data from both INAA and ICP-MS differed considerably from the recommended IAEA values for all samples. (M.N.)

  18. Performance of an iterative two-stage bayesian technique for population pharmacokinetic analysis of rich data sets

    NARCIS (Netherlands)

    Proost, Johannes H.; Eleveld, Douglas J.

    2006-01-01

    Purpose. To test the suitability of an Iterative Two-Stage Bayesian (ITSB) technique for population pharmacokinetic analysis of rich data sets, and to compare ITSB with Standard Two-Stage (STS) analysis and nonlinear Mixed Effect Modeling (MEM). Materials and Methods. Data from a clinical study with

  19. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, thereby allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM.
An EDS detector has been
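    Strain determination with these diffraction techniques rests on Bragg's law: a small shift in peak angle maps to a change in lattice spacing. A sketch with illustrative numbers (the Cu Kα wavelength is standard; the peak angles are hypothetical):

```python
import math

def lattice_spacing(wavelength_angstrom, two_theta_deg):
    """d-spacing from Bragg's law (n = 1): lambda = 2 d sin(theta)."""
    theta = math.radians(two_theta_deg / 2.0)
    return wavelength_angstrom / (2.0 * math.sin(theta))

lam = 1.5406                          # Cu K-alpha1 wavelength, angstroms
d0 = lattice_spacing(lam, 98.00)      # hypothetical unstrained reference peak
d1 = lattice_spacing(lam, 97.95)      # same peak shifted by residual strain
strain = (d1 - d0) / d0               # positive: lattice stretched (tensile)
print(f"lattice strain = {strain:.2e}")
```

    A shift of only 0.05° in 2θ at this high angle already corresponds to a strain of a few times 10⁻⁴, which is why high-angle peaks are preferred for residual-strain work.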

  20. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2012-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer numbers of upstream and downstream stator vanes as well as from the other turbine stages. The standard technique for forced response analysis to assess structural integrity is to decompose a CFD-generated flow field into its harmonic components and then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. These complications raise the question of whether frequency domain analysis is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis have therefore been performed. The first is of a bladed disk with each blade modeled by simple beam elements. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD used in the J2X engine program. The results showed that the transient analysis results were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists.
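    The harmonic decomposition step described above amounts to a Fourier transform of the unsteady loading per revolution; a small numpy sketch with a synthetic pressure trace (the vane counts of 23 and 37 are hypothetical):

```python
import numpy as np

# Synthetic pressure trace over one rotor revolution: harmonics at the
# (hypothetical) upstream and downstream vane counts plus broadband noise.
rng = np.random.default_rng(0)
n = 4096
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
signal = (1.0 * np.sin(23 * theta)        # 23 upstream vanes -> 23/rev
          + 0.4 * np.sin(37 * theta)      # 37 downstream vanes -> 37/rev
          + 0.05 * rng.standard_normal(n))

# One-sided amplitude spectrum indexed by engine order (cycles/revolution).
amps = np.abs(np.fft.rfft(signal)) * 2.0 / n
dominant = int(np.argmax(amps[1:])) + 1
print("dominant engine order:", dominant)
```

    A frequency-domain forced response analysis keeps only a few such engine-order lines; anything between the lines (the "messy" content) is exactly what a transient analysis retains and this decomposition discards.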

  1. Rendezvous technique for recanalization of long-segmental chronic total occlusion above the knee following unsuccessful standard angioplasty.

    Science.gov (United States)

    Cao, Jun; Lu, Hai-Tao; Wei, Li-Ming; Zhao, Jun-Gong; Zhu, Yue-Qi

    2016-04-01

    To assess the technical feasibility and efficacy of the rendezvous technique, a type of subintimal retrograde wiring, for the treatment of long-segmental chronic total occlusions above the knee following unsuccessful standard angioplasty. The rendezvous technique was attempted in eight limbs of eight patients with chronic total occlusions above the knee after standard angioplasty failed. The clinical symptoms and ankle-brachial index were compared before and after the procedure. At follow-up, pain relief, wound healing, limb salvage, and the presence of restenosis of the target vessels were evaluated. The rendezvous technique was performed successfully in seven patients (87.5%) and failed in one patient (12.5%). Foot pain improved in all seven patients who underwent successful treatment, with ankle-brachial indexes improving from 0.23 ± 0.13 before to 0.71 ± 0.09 after the procedure. The rendezvous technique is a feasible and effective treatment for chronic total occlusions above the knee when standard angioplasty fails. © The Author(s) 2015.

  2. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
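    The combinatorial side of such a hierarchical model can be sketched with a toy gate evaluator; independence of basic events is assumed, and all event names and probabilities below are hypothetical (the Markov half of the hybrid approach is not modeled):

```python
# Toy combinatorial fault-tree evaluation with independent basic events.
def and_gate(probs):
    """All inputs must fail (product of independent failure probabilities)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """At least one input fails: 1 - product of survival probabilities."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical subsystem: the top event occurs if power fails OR the
# controller fails; power fails only if both redundant supplies fail.
p_supply_a, p_supply_b, p_controller = 1e-3, 1e-3, 5e-4
p_power = and_gate([p_supply_a, p_supply_b])
p_top = or_gate([p_power, p_controller])
print(f"top event probability ≈ {p_top:.3e}")
```

    In the modular scheme described above, a subtree with sequence-dependent behavior would be solved as a Markov chain and its result substituted into the combinatorial tree as a single pseudo-event, exactly where `p_power` appears here.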

  3. Histopathological Validation of the Surface-Intermediate-Base Margin Score for Standardized Reporting of Resection Technique during Nephron Sparing Surgery.

    Science.gov (United States)

    Minervini, Andrea; Campi, Riccardo; Kutikov, Alexander; Montagnani, Ilaria; Sessa, Francesco; Serni, Sergio; Raspollini, Maria Rosaria; Carini, Marco

    2015-10-01

    The surface-intermediate-base margin score is a novel standardized reporting system of resection techniques during nephron sparing surgery. We validated the surgeon assessed surface-intermediate-base score with microscopic histopathological assessment of partial nephrectomy specimens. Between June and August 2014 data were prospectively collected from 40 consecutive patients undergoing nephron sparing surgery. The surface-intermediate-base score was assigned to all cases. The score specific areas were color coded with tissue margin ink and sectioned for histological evaluation of healthy renal margin thickness. Maximum, minimum and mean thickness of healthy renal margin for each score specific area grade (surface [S] = 0, S = 1 ; intermediate [I] or base [B] = 0, I or B = 1, I or B = 2) was reported. The Mann-Whitney U and Kruskal-Wallis tests were used to compare the thickness of healthy renal margin in S = 0 vs 1 and I or B = 0 vs 1 vs 2 grades, respectively. Maximum, minimum and mean thickness of healthy renal margin was significantly different among score specific area grades S = 0 vs 1, and I or B = 0 vs 1, 0 vs 2 and 1 vs 2 (p <0.001). The main limitations of the study are the low number of the I or B = 1 and I or B = 2 samples and the assumption that each microscopic slide reflects the entire score specific area for histological analysis. The surface-intermediate-base scoring method can be readily harnessed in real-world clinical practice and accurately mirrors histopathological analysis for quantification and reporting of healthy renal margin thickness removed during tumor excision. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  4. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.
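    The linear TSA relationship referred to is commonly written ΔT = -K·T₀·Δ(σ₁+σ₂), where K is the material's thermoelastic constant. A minimal sketch inverting it for the change in the stress sum (the aluminium K value and the measured ΔT are assumed for illustration, not taken from the paper):

```python
# Linear TSA relation: dT = -K * T0 * d(sigma1 + sigma2).
# K below is a typical literature value for aluminium alloys (assumed),
# and T0 is the absolute surface temperature.
K_AL = 8.8e-12   # thermoelastic constant, 1/Pa
T0 = 293.0       # K

def stress_sum_change(delta_T, K=K_AL, T_abs=T0):
    """Invert the linear TSA relation for the change in the stress sum (Pa)."""
    return -delta_T / (K * T_abs)

# A measured surface cooling of 0.26 K maps to roughly a 100 MPa rise
# in the sum of the principal stresses.
delta_sigma = stress_sum_change(-0.26)
print(f"stress-sum change ≈ {delta_sigma / 1e6:.0f} MPa")
```

    The millikelvin-scale signals implied by this relation are why TSA needs infra-red detectors, and why a static mean (residual) stress, which produces no temperature *change* under this linear model, cannot be measured directly.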

  5. A review of residual stress analysis using thermoelastic techniques

    International Nuclear Information System (INIS)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S; Burguete, R L

    2009-01-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  6. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  7. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies to meet reactor operating requirements, e.g., to meet thermal and reactivity margin constraints while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor at both rated and uprated power conditions

  8. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  9. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  10. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis based on correlation techniques. The use of mobile phones is almost unavoidable these days, and the authors have made a systematic survey, through a well-prepared questionnaire, of how mobile phones are used. The samples span various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.
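    Correlation analysis of such survey data reduces to Pearson's r; a self-contained sketch on hypothetical respondent columns (income group vs. daily minutes of use):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical respondents: income group (coded 1-5) vs. daily minutes of use.
income = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]
minutes = [35, 40, 38, 55, 60, 70, 65, 90, 85, 95]
print(f"r = {pearson_r(income, minutes):.2f}")
```

    An r near +1 would indicate that usage rises almost linearly with income group in the sample; it says nothing, of course, about causation.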

  11. Standard practice for examination of welds using the alternating current field measurement technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This practice describes procedures to be followed during alternating current field measurement examination of welds for baseline and service-induced surface breaking discontinuities. 1.2 This practice is intended for use on welds in any metallic material. 1.3 This practice does not establish weld acceptance criteria. 1.4 The values stated in either inch-pound units or SI units are to be regarded separately as standard. The values stated in each system might not be exact equivalents; therefore, each system shall be used independently of the other. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  12. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
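    The transfer-function technique described here is, in essence, frequency-domain deconvolution: divide the spectrum of the rear-side temperature signal by the system's transfer function. A toy numpy sketch with a synthetic first-order thermal response (the time constant, pulse, and regularization are illustrative assumptions, not the authors' actual procedure):

```python
import numpy as np

n, dt = 1024, 1e-3
t = np.arange(n) * dt

# Hypothetical impulse response of the calorimeter rear face: a first-order
# thermal lag with a 50 ms time constant, normalized to unit area.
tau = 0.05
h = np.exp(-t / tau) * dt / tau

# "True" front-side energy-flux pulse and the smeared rear-side measurement.
flux = np.where((t > 0.1) & (t < 0.3), 1.0, 0.0)
measured = np.convolve(flux, h)[:n]

# Transfer-function deconvolution via FFT; the small eps term regularizes
# the division where the transfer function is close to zero.
H = np.fft.rfft(h)
eps = 1e-3 * np.max(np.abs(H))
recovered = np.fft.irfft(np.fft.rfft(measured) * np.conj(H) / (np.abs(H) ** 2 + eps ** 2), n)

print("peak of recovered flux:", round(float(recovered.max()), 2))
```

    The regularized division is the standard way to handle the noise-amplification issue the abstract alludes to: pure division by H would blow up measurement noise at frequencies where the thermal response has little energy.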

  13. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is used for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
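    The FDTD update at the heart of such an analysis can be illustrated on a single lossless line; a minimal leapfrog sketch of the telegrapher's equations run at the Courant limit (the per-unit-length parameters are illustrative, not SWCNT-bundle values):

```python
import numpy as np

# 1D FDTD (leapfrog) solution of the lossless telegrapher's equations --
# a stripped-down, single-line version of the coupled RLC scheme above.
L, C = 2.5e-7, 1.0e-10            # H/m, F/m  ->  v = 1/sqrt(L*C) = 2e8 m/s
nx, dx = 400, 1e-3                # 400 cells of 1 mm
v = 1.0 / np.sqrt(L * C)
dt = dx / v                       # Courant condition: v*dt/dx <= 1 (run at the limit)

V = np.zeros(nx)                  # node voltages
I = np.zeros(nx - 1)              # branch currents, staggered half a cell

for n in range(300):
    I -= (dt / (L * dx)) * (V[1:] - V[:-1])        # current half-step
    V[1:-1] -= (dt / (C * dx)) * (I[1:] - I[:-1])  # voltage half-step
    V[0] = 1.0                                     # ideal unit-step source at the near end

# At the Courant limit the wavefront advances exactly one cell per step.
front = int(np.argmax(V < 0.5))
print("wavefront after 300 steps is at cell", front)
```

    Adding per-unit-length R and G terms to these two updates, and coupling mutual L and C between neighbouring lines, turns this single-line sketch into the lossy coupled-line solver the abstract describes.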

  14. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty

  15. Rare earths analysis of rock samples by instrumental neutron activation analysis, internal standard method

    International Nuclear Information System (INIS)

    Silachyov, I.

    2016-01-01

    The application of instrumental neutron activation analysis to the determination of long-lived rare earth elements (REE) in rock samples is considered in this work. Two different methods are statistically compared: the well-established external standard method, carried out using standard reference materials, and the internal standard method (ISM), using Fe, determined through X-ray fluorescence analysis, as the element-comparator. The ISM proved to be the more precise method over a wide range of REE contents and can be recommended for routine practice. (author)
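
    As a hedged illustration (element names, count rates, and sensitivity factors below are invented, not taken from the paper), the internal-standard calculation reduces to scaling each analyte's measured activity by that of the Fe comparator, whose concentration is known independently from XRF:

```python
# Hypothetical internal-standard calculation for INAA.
# c_x = c_Fe * (A_x / A_Fe) / k_x, where k_x is a relative sensitivity
# factor of analyte x versus the Fe comparator (all values assumed).
fe_conc_pct = 4.2          # Fe content from XRF, percent (assumed)
a_fe = 1.50e5              # Fe peak count rate (assumed)

# analyte: (peak count rate, relative sensitivity factor vs Fe) -- assumed
analytes = {
    "La": (3.2e4, 2.1e3),
    "Ce": (1.8e4, 9.5e2),
    "Sm": (5.6e4, 1.2e4),
}

def internal_standard_ppm(a_x, k_x):
    """Concentration of an analyte in ppm via the Fe comparator."""
    c_fe_ppm = fe_conc_pct * 1e4          # percent -> ppm
    return c_fe_ppm * (a_x / a_fe) / k_x

results = {el: internal_standard_ppm(a, k) for el, (a, k) in analytes.items()}
```

    The appeal of the method is visible in the formula: detector efficiency and neutron flux cancel in the activity ratio, which is why no external standard needs to be co-irradiated.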

  16. The Effect of Insertion Technique on Temperatures for Standard and Self-Drilling External Fixation Pins.

    Science.gov (United States)

    Manoogian, Sarah; Lee, Adam K; Widmaier, James C

    2017-08-01

    No studies have assessed the effects of the parameters associated with insertion temperature in modern self-drilling external fixation pins. The current study assessed how varying the presence of irrigation, insertion speed, and force impacted the insertion temperatures of 2 types of standard and self-drilling external fixation half pins. Seventy tests were conducted, with 10 trials for each of 4 conditions on self-drilling pins and 3 conditions on standard pins. Each test used a thermocouple inside the pin to measure temperature rise during insertion. Adding irrigation to the standard pin insertion significantly lowered the maximum temperature. Adding irrigation to the self-drilling pin tests dropped the average rise in temperature from 151.3 ± 21.6°C to 124.1 ± 15.3°C (P = 0.005). When the self-drilling pin insertion speed was decreased considerably, from 360 to 60 rpm, the temperature decreased significantly, from 151.3 ± 21.6°C to 109.6 ± 14.0°C; the effect of insertion force on the self-drilling pin temperature increase was not significant. The standard pin had lower peak temperatures than the self-drilling pin for all conditions. Moreover, slowing down the insertion speed and adding irrigation helped mitigate the temperature increase of both pin types during insertion.

  17. Admissions Standards and the Use of Key Marketing Techniques by United States' Colleges and Universities.

    Science.gov (United States)

    Goldgehn, Leslie A.

    1989-01-01

    A survey of admissions deans and directors investigated the use and perceived effectiveness of 15 well-known marketing techniques: advertising, advertising research, a marketing plan, market positioning, market segmentation, marketing audit, marketing research, pricing, program and service accessibility, program development, publicity, target…

  18. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Tourism is a dynamic sector that is continuously evolving worldwide. Compared with the food industry, catering has specific features that must be respected: distinct food-serving procedures, numerous complex recipes and production technologies, staff fluctuation, and old equipment. Effective and lasting implementation of the HACCP concept therefore depends on building a sound foundation; in this case, the foundation is the people handling the food. This paper presents the international ISO standards, the HACCP concept, and the importance of its application in the tourism and hospitality industry. HACCP is a food safety management system based on the analysis and control of biological, chemical and physical hazards across the entire process, from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of applying the HACCP concept in tourism and hotel management as a recognizable international standard.

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
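
    A minimal sketch of the quadrature (root-sum-square) combination at the heart of such an analysis, with an entirely hypothetical uncertainty budget (the sources and values below are assumptions, not from the presentation):

```python
# Hypothetical uncertainty budget for a PV module power measurement.
import math

measured_power_w = 250.0
budget = {                       # source: standard uncertainty in watts (assumed)
    "irradiance sensor": 1.8,
    "temperature correction": 0.9,
    "electronic load / DAQ": 0.6,
    "spectral mismatch": 1.2,
}

# combine independent elemental uncertainties in quadrature (RSS)
u_combined = math.sqrt(sum(u**2 for u in budget.values()))
k = 2.0                          # coverage factor for roughly 95% confidence
expanded = k * u_combined

lo, hi = measured_power_w - expanded, measured_power_w + expanded
print(f"P = {measured_power_w:.1f} W, U(k=2) = {expanded:.2f} W "
      f"-> interval [{lo:.2f}, {hi:.2f}] W")
```

    The printed interval is exactly the "estimate of the interval about a measured value" the abstract describes: the band within which the true value is believed to lie at the stated confidence.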

  1. Elemental analysis of the suspended particulate matter in the air of Tehran using INAA and AAS techniques. Appendix 11

    International Nuclear Information System (INIS)

    Sohrabpour, M.; Rostami, S.; Athari, M.

    1995-01-01

    A network of ten sampling stations for monitoring the elemental concentration of the suspended particulate matter (SPM) in the air of Tehran has been established. Instrumental neutron activation analysis (INAA) and atomic absorption spectrometry (AAS) techniques were used to analyse the Whatman-41 filters collected during the year 1994. Assessment of the preliminary results from the two techniques yielded concentrations for the following twenty-one elements: Al, Br, Ca, Cd, Ce, Cl, Co, Cr, Cs, Fe, K, Mg, Mn, Na, Ni, Pb, Sb, Sc, Ti, V, Zn. Various standard solutions with known elemental concentrations, together with standard reference materials, were used for quality assurance of the measured concentrations. (author)

  2. Quality of Standard Reference Materials for Short Time Activation Analysis

    International Nuclear Information System (INIS)

    Ismail, S.S.; Oberleitner, W.

    2003-01-01

    Some environmental reference materials (CFA-1633b, IAEA-SL-1, SARM-1, BCR-176, Coal-1635, IAEA-SL-3, BCR-146, and SARM-5) were analysed by short-time activation analysis. The results show that these materials can be classified into three groups according to their activities after irradiation. The results obtained were compared in order to create a quality index for the determination of short-lived nuclides at high count rates. It was found that CFA-1633b is not a suitable standard for determining very short-lived nuclides (half-lives < 1 min) because the activity it produces is 15-fold higher than that of SL-3. Biological reference materials, such as SRM-1571, SRM-1573, SRM-1575, SRM-1577, IAEA-392, and IAEA-393, were also investigated using a higher-counting-efficiency system. The quality of this system and of its well-type detector for investigating short-lived nuclides is discussed

  3. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Background: Military hospitals are responsible for preserving, restoring and improving the health not only of armed forces personnel but also of other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional, descriptive study conducted in 2013 on five military hospitals selected using the purposive sampling method. The required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing, and Expert Choice 11.0 was used to analyze the collected data. Results: Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance by identifying areas in need of focus for quality improvement and selecting strategies to improve service quality. PMID:25250364
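
    As a hedged sketch of the AHP step (the 3×3 pairwise-comparison matrix below is invented for illustration, not the study's Expert Choice data), criterion weights can be approximated by the row geometric means of the comparison matrix, followed by a consistency check:

```python
# AHP priority weights via the geometric-mean approximation of the
# principal eigenvector. Matrix entries are hypothetical Saaty-scale
# judgements, not the study's data.
import numpy as np

criteria = ["access/continuity of care",
            "quality improvement & patient safety",
            "leadership & management"]
# A[i, j] = judged importance of criterion i relative to criterion j
A = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])

# geometric mean of each row, normalised, approximates the eigenvector
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()
priorities = dict(zip(criteria, weights))

# consistency ratio CR = ((lam_max - n) / (n - 1)) / RI, RI = 0.58 for n = 3
lam_max = (A @ weights / weights).mean()
ci = (lam_max - len(A)) / (len(A) - 1)
cr = ci / 0.58                     # CR < 0.1 is conventionally acceptable
```

    The same machinery applied to hospital-level comparison matrices yields the overall ranking reported in the abstract.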

  5. Different techniques of multispectral data analysis for vegetation fraction retrieval

    Science.gov (United States)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. For farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the soil-vegetation system's spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from soil-crop pattern reflectance. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus value sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
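
    The first technique listed, linear spectral unmixing, can be sketched as follows (the 4-band endmember spectra and the pixel fractions are invented for the demo, not the paper's measurements):

```python
# Minimal linear spectral unmixing sketch: a measured pixel spectrum is
# modelled as a fractional mixture of soil and vegetation endmembers.
import numpy as np

# endmember reflectance spectra (rows: bands; columns: soil, vegetation)
E = np.array([
    [0.22, 0.05],   # blue
    [0.26, 0.08],   # green
    [0.30, 0.04],   # red
    [0.34, 0.45],   # near-infrared
])
true_f = np.array([0.4, 0.6])     # ground-truth fractions (for the demo)
pixel = E @ true_f                # synthetic mixed-pixel spectrum

# least-squares unmixing; production code would additionally enforce
# f >= 0 and sum(f) == 1 (constrained least squares)
f, *_ = np.linalg.lstsq(E, pixel, rcond=None)
f = f / f.sum()                   # renormalise the recovered fractions
```

    The second component of `f` is exactly the vegetation canopy cover fraction the abstract sets out to retrieve.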

  6. Colombeau's generalized functions and non-standard analysis

    International Nuclear Information System (INIS)

    Todorov, T.D.

    1987-10-01

    Using some methods of Non-Standard Analysis we modify one of Colombeau's classes of generalized functions. As a result we define a class ε-circumflex of so-called meta-functions which possesses all the good properties of Colombeau's generalized functions, i.e. (i) ε-circumflex is an associative and commutative algebra over the system of so-called complex meta-numbers C-circumflex; (ii) every meta-function has partial derivatives of any order (which are again meta-functions); (iii) every meta-function is integrable on any compact set of R^n and the integral is a number from C-circumflex; (iv) ε-circumflex contains all tempered distributions S', i.e. S' is contained in ε-circumflex isomorphically with respect to all linear operations (including differentiation). Thus, within the class ε-circumflex the problem of multiplication of tempered distributions is satisfactorily solved (every two distributions in S' have a well-defined product in ε-circumflex). The crucial point is that C-circumflex is a field, in contrast to the system of Colombeau's generalized numbers C-bar, which is only a ring (C-bar is the counterpart of C-circumflex in Colombeau's theory). In this way we simplify and slightly improve the properties of the integral and the notion of ''values of the meta-functions'', as well as the properties of the whole class ε-circumflex itself, compared with the original Colombeau theory. And, what is perhaps more important, we clarify the connection between Non-Standard Analysis and Colombeau's theory of new generalized functions, in the framework of which the problem of multiplication of distributions was recently solved. (author). 14 refs

  7. HPGe detectors timing using pulse shape analysis techniques

    International Nuclear Information System (INIS)

    Crespi, F.C.L.; Vandone, V.; Brambilla, S.; Camera, F.; Million, B.; Riboldi, S.; Wieland, O.

    2010-01-01

    In this work the Pulse Shape Analysis has been used to improve the time resolution of High Purity Germanium (HPGe) detectors. A set of time aligned signals was acquired in a coincidence measurement using a coaxial HPGe and a cerium-doped lanthanum chloride (LaCl 3 :Ce) scintillation detector. The analysis using a Constant Fraction Discriminator (CFD) time output versus the HPGe signal shape shows that time resolution ranges from 2 to 12 ns depending on the slope in the initial part of the signal. An optimization procedure of the CFD parameters gives the same final time resolution (8 ns) as the one achieved after a correction of the CFD output based on the current pulse maximum position. Finally, an algorithm based on Pulse Shape Analysis was applied to the experimental data and a time resolution between 3 and 4 ns was obtained, corresponding to a 50% improvement as compared with that given by standard CFDs.
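
    For illustration, the baseline constant-fraction discriminator that the pulse-shape correction improves on can be sketched digitally as follows (the pulse and the CFD parameters are synthetic, not the HPGe/LaCl3:Ce data of the paper):

```python
# Digital constant-fraction discriminator (CFD) sketch: the timing mark is
# the zero crossing of s(t - delay) - fraction * s(t).
import numpy as np

def cfd_time(samples, fraction=0.3, delay=5):
    """Return the interpolated zero-crossing index of the CFD signal."""
    shaped = np.zeros_like(samples)
    shaped[delay:] = samples[:-delay]        # delayed copy of the pulse
    shaped = shaped - fraction * samples     # minus attenuated original
    # first crossing from negative to non-negative after pulse onset
    crossings = np.where((shaped[:-1] < 0) & (shaped[1:] >= 0))[0]
    i = crossings[0]
    return i - shaped[i] / (shaped[i + 1] - shaped[i])   # linear interpolation

# synthetic detector pulse: exponential rise starting at sample 50
t = np.arange(200.0)
pulse = np.where(t > 50, 1.0 - np.exp(-(t - 50) / 10.0), 0.0)
t_mark = cfd_time(pulse)
```

    The key property, visible in the algebra, is that the crossing point is independent of the pulse amplitude; the residual dependence on the rise-time shape is exactly what the paper's pulse-shape-based correction targets.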

  8. Standards of compounds labeled with positron nuclides approved as established techniques for medical use (2001 revision)

    International Nuclear Information System (INIS)

    2001-01-01

    The Subcommittee on Medical Application of Cyclotron-Produced Radionuclides, Medical Science and Pharmaceutical Committee, Japan Radioisotope Association, revised the Standards in the title covering manufacturing, quality, the manufacturing work environment, etc. Each facility must establish its own committee, responsible for control and hygiene in manufacturing the nuclides, for quality control, and for medical use. On this basis, the Standard defines pharmaceutical items such as the general rules; gas agents and injection formulations; test methods involving γ-ray measurement, including spectrometry and derived determination, determination with a well-type scintillation counter and an ionization chamber, the method of measuring half-life, and determination of nuclide purity; individual definitions of [18F]2-deoxy-2-fluoro-D-glucose, 15O gas, 15O-carbon monoxide and 15O-carbon dioxide; and guidelines for manufacturing the nuclides and for the manufacturing environment, including monitoring and records. (K.H.)

  9. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers (fatty acids, hydrocarbons and sterols) were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, sufficient quantities of the target individual compounds (>50 μgC) must be separated and recovered. Yields of the target compounds were adequate across the C14 to C40 range, at approximately 80% for higher-molecular-weight compounds beyond the C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, like compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in marine systems. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific; the results showed its promise as a chronology tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where enough planktonic foraminifera for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain owing to dissolution of calcium carbonate. (author)

  10. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129
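
    A toy version of the marker-tracking step can be sketched in Python (the paper's software is Matlab-based; the frames, marker, and displacement below are synthetic, and real work would use sub-pixel template matching on calibrated images):

```python
# Locate a bright circular marker in two synthetic frames by
# intensity-weighted centroid and report its displacement in pixels.
import numpy as np

def marker_centroid(img, thresh=0.5):
    """Intensity-weighted centroid (x, y) of pixels above a threshold."""
    ys, xs = np.nonzero(img > thresh)
    w = img[ys, xs]
    return np.array([(xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()])

def synthetic_frame(cx, cy, size=64, r=3.0):
    """Gaussian blob standing in for an imaged marker."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - cx)**2 + (y - cy)**2) / (2 * r**2))

before = synthetic_frame(20.0, 30.0)
after = synthetic_frame(22.5, 30.0)        # marker moved 2.5 px in x
displacement = marker_centroid(after) - marker_centroid(before)
```

    Tracking each marker's displacement between successive frames, and differencing neighbouring markers, yields the deformation field the article monitors.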

  11. Multigrid techniques with non-standard coarsening and group relaxation methods

    International Nuclear Information System (INIS)

    Danaee, A.

    1989-06-01

    In the usual (standard) multigrid methods, doubling of the grid size between levels, with different smoothing iterations (pointwise or blockwise), has been considered by different authors. Some have indicated that a larger coarsening factor can also be used but is not beneficial (cf. H3, p. 59). In this paper, it is shown that with a suitable blockwise smoothing scheme, some advantages can be achieved even with a coarsening factor of H_{l-1}/h_l = 3. (author). 10 refs, 2 figs, 6 tabs

  12. Measurement of duodenogastric reflux: standardization of a new technique using Rose Bengal 131I

    International Nuclear Information System (INIS)

    Pires, P.W.A.; Camargo, E.E.; Mittelstaedt, W.E.M.; Speranzini, M.B.; Oliveira, M.R. de

    1988-01-01

    A nasogastric tube was introduced under radioscopic control into the stomach of 20 normal persons. After fixing the tube, Rose Bengal 131I was injected intravenously. The individuals were then subdivided into two groups, A and B. Group A: gastric fluid samples were aspirated for two hours in the ten cases in this group. Group B: a standard liquid diet was introduced via the nasogastric tube, and samples were likewise collected for two hours. (M.A.C.) [pt

  13. Internal standard method for determination of gallium and some trace elements in bauxite by neutron activation analysis

    International Nuclear Information System (INIS)

    Chen, S.G.; Tsai, H.T.

    1983-01-01

    A method is described for the determination of gallium and other trace elements, such as Ce, Cr, Hf, Lu and Th, in bauxite by the technique of neutron activation analysis using gold as an internal standard. Isopropyl ether was used as the organic extractant to separate radioactive gallium from the sample. The method yields very good accuracy, with a relative error of ±3%. (author)

  14. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased-mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to ensure that the fault tree technique is not used beyond its valid range. To this end, a critical review of the mathematical foundations of reliability fault tree analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of integrated software (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems

  15. EUS-guided biliary drainage by using a standardized approach for malignant biliary obstruction: rendezvous versus direct transluminal techniques (with videos).

    Science.gov (United States)

    Khashab, Mouen A; Valeshabad, Ali Kord; Modayil, Rani; Widmer, Jessica; Saxena, Payal; Idrees, Mehak; Iqbal, Shahzad; Kalloo, Anthony N; Stavropoulos, Stavros N

    2013-11-01

    EUS-guided biliary drainage (EGBD) can be performed via direct transluminal or rendezvous techniques. It is unknown how both techniques compare in terms of efficacy and adverse events. To describe outcomes of EGBD performed by using a standardized approach and compare outcomes of rendezvous and transluminal techniques. Retrospective analysis of prospectively collected data. Two tertiary-care centers. Consecutive jaundiced patients with distal malignant biliary obstruction who underwent EGBD after failed ERCP between July 2006 and December 2012 were included. EGBD by using a standardized algorithm. Technical success, clinical success, and adverse events. During the study period, 35 patients underwent EGBD (rendezvous n = 13, transluminal n = 20). Technical success was achieved in 33 patients (94%), and clinical success was attained in 32 of 33 patients (97.0%). The mean postprocedure bilirubin level was 1.38 mg/dL in the rendezvous group and 1.33 mg/dL in the transluminal group (P = .88). Similarly, length of hospital stay was not different between groups (P = .23). There was no significant difference in adverse event rate between rendezvous and transluminal groups (15.4% vs 10%; P = .64). Long-term outcomes were comparable between groups, with 1 stent migration in the rendezvous group at 62 days and 1 stent occlusion in the transluminal group at 42 days after EGBD. Retrospective analysis, small number of patients, and selection bias. EGBD is safe and effective when the described standardized approach is used. Stent occlusion is not common during long-term follow-up. Both rendezvous and direct transluminal techniques seem to be equally effective and safe. The latter approach is a reasonable alternative to rendezvous EGBD. Copyright © 2013. Published by Mosby, Inc.

  16. EASYTRAC Project: Work package 6.4 Reversal technique to calibrate gear and thread standards

    DEFF Research Database (Denmark)

    Carmignato, Simone; De Chiffre, Leonardo; Larsen, Erik

    This report was produced as part of the EASYTRAC project, an EU project under the programme Competitive and Sustainable Growth (Contract No. G6RD-CT-2000-00188), coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low-uncertainty calibrations on coordinate measuring machines (CMMs) … (PTB), Germany, and Tampere University of Technology (TUT), Finland. The present report describes the feasibility and experimental results of applying a reversal and substitute-element technique for thread calibration on CMMs.

  17. Multi-element analysis of lubricant oil by WDXRF technique using thin-film sample preparation

    International Nuclear Information System (INIS)

    Scapin, M. A.; Salvador, V. L. R.; Lopes, C. D.; Sato, I. M.

    2006-01-01

    The quantitative analysis of chemical elements in matrices such as oils or gels represents a challenge for analytical chemists. Classical methods and instrumental techniques such as atomic absorption spectrometry (AAS) and inductively coupled plasma optical emission spectrometry (ICP-OES) require chemical treatments, mainly sample dissolution and degradation processes. The X-ray fluorescence technique allows a direct, multi-element analysis without previous sample treatment. In this work, a sensitive method for the determination of the elements Mg, Al, Si, P, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Mo, Ag, Sn, Ba and Pb in lubricating oil is presented. The wavelength-dispersive X-ray fluorescence (WDXRF) technique was used with a linear regression method and thin-film sample preparation. Validation of the methodology (repeatability and accuracy) was obtained by analysis of the standard reference material SRM Alpha AESAR lot 703527D, applying the Chauvenet, Cochrane, ANOVA and Z-score statistical tests. The method presents a relative standard deviation lower than 10% for all elements except Pb (RSD of 15%). The Z-score values for all elements were in the range −2 < Z < 2, indicating very good accuracy. (Full text)
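
    The Z-score acceptance test used in the validation can be sketched as follows (the element values and uncertainties below are invented for illustration, not the SRM data):

```python
# Z-score test for method validation:
# Z = (measured - certified) / sqrt(u_measured^2 + u_certified^2),
# with |Z| <= 2 conventionally taken as satisfactory.
import math

def z_score(measured, certified, u_measured, u_certified):
    return (measured - certified) / math.sqrt(u_measured**2 + u_certified**2)

# element: (measured ppm, certified ppm, u_meas, u_cert) -- illustrative only
checks = {
    "Fe": (1025.0, 1000.0, 20.0, 15.0),
    "Zn": (492.0, 500.0, 8.0, 6.0),
    "Pb": (118.0, 110.0, 12.0, 5.0),
}

scores = {el: z_score(*v) for el, v in checks.items()}
acceptable = all(abs(z) <= 2 for z in scores.values())
```

    A result with |Z| between 2 and 3 would be questionable and above 3 unsatisfactory, which is why the paper quotes the −2 < Z < 2 band as evidence of good accuracy.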

  18. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  19. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction ability depending on their underlying assumptions about the correlation structures in the data. The techniques considered fall into two groups: those that assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those that assume dependence between the variables and thus use an estimate of the within-class covariance matrix which also estimates the correlations between variables. The two groups of methods are compared, and their pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature of the data.
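
    The contrast between the two groups can be sketched on synthetic data (not the paper's simulations): with strongly correlated features, a full-covariance discriminant exploits the correlation that a diagonal (independence-assuming) estimate throws away.

```python
# Diagonal vs full-covariance LDA on two correlated Gaussian classes.
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])        # strongly correlated features
x0 = rng.multivariate_normal([0.0, 0.0], cov, size=200)
x1 = rng.multivariate_normal([1.0, 0.0], cov, size=200)
X = np.vstack([x0, x1])
y = np.repeat([0, 1], 200)

mu = np.array([X[y == k].mean(axis=0) for k in (0, 1)])
centered = np.vstack([X[y == k] - mu[k] for k in (0, 1)])
S = centered.T @ centered / (len(X) - 2)        # pooled within-class covariance

def lda_predict(X, S_est):
    """Gaussian discriminant with shared covariance S_est, equal priors."""
    Sinv = np.linalg.inv(S_est)
    scores = np.stack([X @ (Sinv @ m) - 0.5 * m @ Sinv @ m for m in mu], axis=1)
    return scores.argmax(axis=1)

acc_full = (lda_predict(X, S) == y).mean()
acc_diag = (lda_predict(X, np.diag(np.diag(S))) == y).mean()  # independence
```

    In high dimensions the trade-off reverses when too few samples are available to estimate the correlations reliably, which is the crux of the paper's comparison.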

  20. Some problems of calibration technique in charged particle activation analysis

    International Nuclear Information System (INIS)

    Krasnov, N.N.; Zatolokin, B.V.; Konstantinov, I.O.

    1977-01-01

It is shown that three different approaches to calibration technique, based on the use of the average cross-section, the equivalent target thickness and the thick target yield, are adequate. Using the concept of thick target yield, a convenient charged particle activation equation is obtained. The possibility of simultaneous determination of two impurities, from which the same isotope is formed, is pointed out. The use of the concept of thick target yield facilitates the derivation of a simple formula for absolute and comparative methods of analysis. The methodical error does not exceed 10%. Calibration and determination of the expected sensitivity based on the thick target yield concept are also very convenient, because the experimental determination of thick target yield values is a much simpler procedure than obtaining an activation curve or excitation function. (T.G.)
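The thick-target-yield form of the activation equation mentioned in this record can be sketched in a few lines. Conventions for the yield vary between laboratories; here Y is assumed to be the saturation activity produced per microampere of beam current, and all numbers are illustrative rather than taken from the paper:

```python
import math

# Hedged sketch: thick-target-yield form of the activation equation.
# With Y the assumed saturation activity per uA of beam current:
#     A(t_irr) = Y * i * (1 - exp(-lambda * t_irr))

def induced_activity(Y_sat_per_uA, i_uA, half_life_s, t_irr_s):
    """Activity (Bq) at end of irradiation, from the thick-target yield."""
    lam = math.log(2.0) / half_life_s          # decay constant
    return Y_sat_per_uA * i_uA * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative case: irradiating for one half-life reaches half of saturation.
A = induced_activity(5.0e4, 2.0, 600.0, 600.0)
print(f"A = {A:.0f} Bq")  # 2 uA * 5e4 Bq/uA * 0.5
```

This is the convenience the abstract points to: a single measured yield value replaces the full excitation function in the activation formula.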

  1. A Comparative Analysis of Uranium Ore using Laser Fluorimetric and gamma Spectrometry Techniques

    International Nuclear Information System (INIS)

    Madbouly, M.; Nassef, M. H.; El-Mongy, S.A.; Diab, A.M.

    2009-01-01

A developed chemical separation method was used for the analysis of uranium in a standard U-ore (IAEA-RGU-1) by the laser fluorimetric technique. The non-destructive gamma assay technique was also applied to verify and compare the uranium content analyzed using the laser technique. The results of the uranium analysis obtained by laser fluorimetry were found to be in the range of 360-420 μg/g, with an average value of 390 μg/g. The bias between the measured and the certified value does not exceed 9.9%. For gamma-ray spectrometric analysis, the results of the measured uranium content were found to be in the range of 393.8-399.4 μg/g, with an average value of 396.3 μg/g. The % difference in the case of γ-assay was 1.6%. In general, the methods of analysis used in this study are applicable for a precise determination of uranium. It can be concluded that laser analysis is preferred for the assay of uranium ore due to the small sample weight required, the short sample preparation time and the low cost of analysis.

  2. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take into account certain requirements. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, due to the precious nature of artworks, it is also necessary to use non-destructive methods, respecting the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composites or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of both phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr

  3. Improved analysis techniques for cylindrical and spherical double probes

    Energy Technology Data Exchange (ETDEWEB)

    Beal, Brian; Brown, Daniel; Bromaghim, Daron [Air Force Research Laboratory, 1 Ara Rd., Edwards Air Force Base, California 93524 (United States); Johnson, Lee [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Dr., Pasadena, California 91109 (United States); Blakely, Joseph [ERC Inc., 1 Ara Rd., Edwards Air Force Base, California 93524 (United States)

    2012-07-15

A versatile double Langmuir probe technique has been developed by incorporating analytical fits to Laframboise's numerical results for ion current collection by biased electrodes of various sizes relative to the local electron Debye length. Application of these fits to the double probe circuit has produced a set of coupled equations that express the potential of each electrode relative to the plasma potential, as well as the resulting probe current, as a function of applied probe voltage. These equations can be readily solved via standard numerical techniques in order to determine electron temperature and plasma density from probe current and voltage measurements. Because this method self-consistently accounts for the effects of sheath expansion, it can be readily applied to plasmas with a wide range of densities and low ion temperature (T_i/T_e << 1) without requiring probe dimensions to be asymptotically large or small with respect to the electron Debye length. The presented approach has been successfully applied to experimental measurements obtained in the plume of a low-power Hall thruster, which produced a quasineutral, flowing xenon plasma during operation at 200 W. The measured plasma densities and electron temperatures were in the range of 1 × 10^12 to 1 × 10^17 m^-3 and 0.5-5.0 eV, respectively. The estimated measurement uncertainty is +6%/-34% in density and ±30% in electron temperature.
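In the thin-sheath limit, and without the Laframboise sheath-expansion correction that is the paper's actual contribution, the fit described above reduces to the classic symmetric double-probe relation I(V) = I_sat · tanh(V / 2Te). A minimal NumPy sketch of such a fit on synthetic data (the full method solves the coupled sheath equations instead):

```python
import numpy as np

def double_probe_current(V, I_sat, Te_eV):
    """Ideal symmetric double-probe I-V characteristic (thin-sheath limit)."""
    return I_sat * np.tanh(V / (2.0 * Te_eV))

def fit_te(V, I, Te_grid):
    """Least-squares fit of (I_sat, Te): scan Te, solve I_sat linearly."""
    best = None
    for Te in Te_grid:
        t = np.tanh(V / (2.0 * Te))
        I_sat = np.dot(I, t) / np.dot(t, t)   # linear LSQ for I_sat at fixed Te
        resid = np.sum((I - I_sat * t) ** 2)
        if best is None or resid < best[0]:
            best = (resid, I_sat, Te)
    return best[1], best[2]

# Synthetic measurement: Te = 2.0 eV, I_sat = 1 mA, with a little noise.
rng = np.random.default_rng(0)
V = np.linspace(-30.0, 30.0, 121)
I = double_probe_current(V, 1.0e-3, 2.0) + rng.normal(0.0, 1.0e-5, V.size)

I_sat_fit, Te_fit = fit_te(V, I, np.linspace(0.5, 5.0, 451))
print(f"Te = {Te_fit:.2f} eV, I_sat = {I_sat_fit * 1e3:.2f} mA")
```

The ion density then follows from I_sat via the Bohm current and the probe area, with the sheath-expansion correction applied when the probe is not large compared to the Debye length.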

  4. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium, using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab
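The colorimetric quantification step described in this record amounts to a linear (Beer-Lambert) calibration of detector response against standards, inverted for unknowns. A minimal sketch, with invented U(VI) standard concentrations and peak absorbances standing in for the real 410 nm data:

```python
import numpy as np

# Invented calibration standards (the real work used a Metrohm 662
# photometer at 410 nm; these numbers are illustrative only).
conc = np.array([50.0, 100.0, 200.0, 300.0, 360.0])     # U(VI) standards, mg/ml
absorb = np.array([0.071, 0.143, 0.285, 0.430, 0.514])  # peak absorbance

# Fit the straight calibration line A = slope * c + intercept.
slope, intercept = np.polyfit(conc, absorb, 1)

def u_conc(a):
    """Invert the calibration line for an unknown's peak absorbance."""
    return (a - intercept) / slope

print(f"A = 0.215 -> {u_conc(0.215):.1f} mg/ml U(VI)")
```

The quoted +/- 0.5% precision would come from replicate injections of the same standard, not from the fit itself.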

  5. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, a residence time of 200 milliseconds and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  6. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  7. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  8. Characterization of the neutron sources storage pool of the Neutron Standards Laboratory, using Montecarlo Techniques

    International Nuclear Information System (INIS)

    Campo Blanco, X.

    2015-01-01

The development of irradiation-damage-resistant materials is one of the most important open fields in the design of experimental facilities and conceptual nuclear fusion plants. The Neutron Standards Laboratory aims to contribute to this development by allowing the neutron irradiation of materials in its calibration neutron source storage pool. For this purpose, it is essential to characterize the pool itself in terms of the neutron fluence and spectra due to the calibration neutron sources. In this work, the main features of this facility are presented and the characterization of the storage pool is carried out. Finally, an application of the obtained results to the neutron irradiation of materials is shown.

  9. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high-energy-based techniques.

  10. Development of international standards for surface analysis by ISO technical committee 201 on surface chemical analysis

    International Nuclear Information System (INIS)

    Powell, C.J.

    1999-01-01

    Full text: The International Organization for Standardization (ISO) established Technical Committee 201 on Surface Chemical Analysis in 1991 to develop documentary standards for surface analysis. ISO/TC 201 met first in 1992 and has met annually since. This committee now has eight subcommittees (Terminology, General Procedures, Data Management and Treatment, Depth Profiling, AES, SIMS, XPS, and Glow Discharge Spectroscopy (GDS)) and one working group (Total X-Ray Fluorescence Spectroscopy). Each subcommittee has one or more working groups to develop standards on particular topics. Australia has observer-member status on ISO/TC 201 and on all ISO/TC 201 subcommittees except GDS where it has participator-member status. I will outline the organization of ISO/TC 201 and summarize the standards that have been or are being developed. Copyright (1999) Australian X-ray Analytical Association Inc

  11. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    Science.gov (United States)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations, and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
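A modern analogue of this scheme can be sketched with SymPy in place of MACSYMA: differentiate a trial deflection symbolically, simplify the result, and emit Fortran source for numerical evaluation. The one-term clamped-plate trial function and the stiffness-like integrand below are purely illustrative, not the paper's laminated-plate expansion:

```python
import sympy as sp

x, y, a, b, D = sp.symbols('x y a b D', positive=True)

# One-term Rayleigh-Ritz trial deflection for a clamped elliptic plate
# (illustrative only; the paper uses full laminated-plate expansions).
w = (1 - x**2 / a**2 - y**2 / b**2) ** 2

# A representative stiffness-like integrand: D * (Laplacian of w)^2,
# obtained by symbolic differentiation and simplification.
integrand = sp.simplify(D * (sp.diff(w, x, 2) + sp.diff(w, y, 2)) ** 2)

# Emit Fortran source for numerical evaluation of the coefficient,
# mirroring MACSYMA's FORTRAN code generation step.
fortran_src = sp.fcode(integrand, assign_to='stiff', standard=95)
print(fortran_src)
```

The generated code can then be called repeatedly inside a numerical quadrature loop, which is exactly the division of labour the abstract describes: symbolic algebra once, efficient compiled evaluation many times.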

  12. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  13. The application of radiotracer technique for preconcentration neutron activation analysis

    International Nuclear Information System (INIS)

    Wang Xiaolin; Chen Yinliang; Sun Ying; Fu Yibei

    1995-01-01

The application of the radiotracer technique to preconcentration neutron activation analysis (Pre-NAA) is studied, and a method for the determination of the chemical yield of Pre-NAA is developed. This method has been applied to the determination of gold, iridium and rhenium in steel and rock samples; the noble metal contents are in the range of 1-20 ng·g^-1 (sample). In addition, the difference in accuracy caused by the determination of chemical yield between RNAA and Pre-NAA is also discussed.

  14. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

Nonactivation interaction analytical methods are based on the interaction processes of nuclear and X-ray radiation with a sample, leading to their absorption and backscattering, to the ionization of gases, or to the excitation of fluorescent X-rays, but not to the activation of the determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence with photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles, and backscattering of beta radiation. A significant advantage of these methods is that they are nondestructive. (author)

  15. Complete analysis of a nuclear building to nuclear safety standards

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, T A

    1975-01-01

The nuclear standards impose on the designer the necessity of examining the loads, stresses, and strains in a nuclear building even under extreme loading conditions, due both to plant malfunctions and to environmental accidents. It is then necessary to generate, combine, and examine a tremendous amount of data; indeed, the lack of symmetry, the general complication of the structures and the large number of loading combinations make an automated analysis quite necessary. A largely automated procedure is presented that solves the problem with a series of computer programs linked together. After the seismic analysis has been performed by the SADE code, these data, together with the data coming from thermal specifications, weight, accident descriptions etc., are fed into a finite element computer code (SAP4) for analysis. They are processed and combined by a computer code (COMBIN) according to the loading conditions (the usual list in Italy is given and briefly discussed), so that for each point (or each selected zone) under each loading condition the applied loads are listed. These data are fed to another computer code (DTP), which determines the amount of reinforcing bars necessary to accommodate the most severe of the loading conditions. The ACI 318/71 and Italian regulation procedures are followed; the characteristics of the program are briefly described and discussed. Some particular problems are discussed, e.g. the thermal stresses due to normal and accident conditions; the inelastic behavior of some frame elements (due to concrete cracking) is considered by means of an 'ad hoc' code. Typical examples are presented and the results are discussed, showing a relatively large benefit from considering this inelastic effect.

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately to the ground state with the emission of energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  17. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in trace element analysis by X-ray emission spectroscopy methods, namely the sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, storage of samples and their handling. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparation for PIXE analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparation for radioactive-source or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plant and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed.

  18. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis technique compared with other detection methods. Thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with that of two other analytical techniques, neutron activation analysis and atomic absorption spectrometry. The comparison of methods was performed to cross-check the analysis results and to overcome the limitations of the three methods. Analysis results showed that the Ca contents found in food using EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined. The Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also performed. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
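The method-comparison statistic quoted in this record (a Pearson correlation between paired results from two techniques) is straightforward to compute. A minimal sketch, with invented paired Ca measurements standing in for the real EDXRF/AAS data:

```python
import numpy as np

# Invented paired Ca results (mg/kg) for the same five samples by two
# methods; the real study compared EDXRF against AAS (Ca) and NAA (K).
edxrf = np.array([812.0, 640.0, 905.0, 700.0, 760.0])
aas   = np.array([820.0, 655.0, 890.0, 715.0, 748.0])

# Pearson correlation between the two techniques' paired results.
r = np.corrcoef(edxrf, aas)[0, 1]
print(f"Pearson r = {r:.4f}")
```

The p-values reported in the abstract would come from a paired significance test on the same arrays; a high correlation alone does not rule out a constant bias between methods, which is why both statistics are reported.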

  19. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques of quantifying caloric induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems, the relevance of this being that if there was a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and data was compared. All four systems made similar measurements but our inexperienced assessor failed to recognize responses as sporadic or scant, and we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  20. Slit-scanning technique using standard cell sorter instruments for analyzing and sorting nonacrocentric human chromosomes, including small ones

    NARCIS (Netherlands)

    Rens, W.; van Oven, C. H.; Stap, J.; Jakobs, M. E.; Aten, J. A.

    1994-01-01

    We have investigated the performance of two types of standard flow cell sorter instruments, a System 50 Cytofluorograph and a FACSTar PLUS cell sorter, for the on-line centromeric index (CI) analysis of human chromosomes. To optimize the results, we improved the detection efficiency for centromeres

  1. Status of characterization techniques for carbon nanotubes and suggestions towards standards suitable for toxicological assessment

    International Nuclear Information System (INIS)

    Schweinberger, Florian F; Meyer-Plath, Asmus

    2011-01-01

    Nanotechnologies promise to contribute significantly to major technological challenges of the upcoming century. Despite profound scientific progress in the last decades, only minor advances have been made in the field of nanomaterial toxicology. The International Team in Nanosafety (TITNT) is an international and multidisciplinary group of scientists, which aims at better understanding the risks of nanomaterials. Carbon nanotubes (CNT) account for one of the most promising nanomaterials and have therefore been chosen as representative material for nanoscaled particles. They are currently investigated by the different platforms of TITNT. As a starting point, the present report summarizes a literature-based study on the physico-chemical properties of CNT, as they are closely linked with toxicological properties. A brief introduction to synthesis, purification and material properties is given. Characterization methods for CNT are discussed with respect to their reliability and the information content on chemical properties. Recommendations for a set of standard characterizations mandatory for toxicological assessment are derived.

  2. Status of characterization techniques for carbon nanotubes and suggestions towards standards suitable for toxicological assessment

    Science.gov (United States)

    Schweinberger, Florian F.; Meyer-Plath, Asmus

    2011-07-01

    Nanotechnologies promise to contribute significantly to major technological challenges of the upcoming century. Despite profound scientific progress in the last decades, only minor advances have been made in the field of nanomaterial toxicology. The International Team in Nanosafety (TITNT) is an international and multidisciplinary group of scientists, which aims at better understanding the risks of nanomaterials. Carbon nanotubes (CNT) account for one of the most promising nanomaterials and have therefore been chosen as representative material for nanoscaled particles. They are currently investigated by the different platforms of TITNT. As a starting point, the present report summarizes a literature-based study on the physico-chemical properties of CNT, as they are closely linked with toxicological properties. A brief introduction to synthesis, purification and material properties is given. Characterization methods for CNT are discussed with respect to their reliability and the information content on chemical properties. Recommendations for a set of standard characterizations mandatory for toxicological assessment are derived.

  3. Status of characterization techniques for carbon nanotubes and suggestions towards standards suitable for toxicological assessment

    Energy Technology Data Exchange (ETDEWEB)

    Schweinberger, Florian F, E-mail: florian.schweinberger@tum.de [International Team in Nanosafety (TITNT) and Technische Universitaet Muenchen, Catalysis Research Center, Chair of Physical Chemistry, Lichtenbergstr. 4, 85748 Garching (Germany); Meyer-Plath, Asmus [International Team in Nanosafety (TITNT) and BAM - Federal Institute for Materials Research and Testing, Division VI.5 - Polymer Surfaces, Unter den Eichen 87, 12205 Berlin (Germany)

    2011-07-06

    Nanotechnologies promise to contribute significantly to major technological challenges of the upcoming century. Despite profound scientific progress in the last decades, only minor advances have been made in the field of nanomaterial toxicology. The International Team in Nanosafety (TITNT) is an international and multidisciplinary group of scientists, which aims at better understanding the risks of nanomaterials. Carbon nanotubes (CNT) account for one of the most promising nanomaterials and have therefore been chosen as representative material for nanoscaled particles. They are currently investigated by the different platforms of TITNT. As a starting point, the present report summarizes a literature-based study on the physico-chemical properties of CNT, as they are closely linked with toxicological properties. A brief introduction to synthesis, purification and material properties is given. Characterization methods for CNT are discussed with respect to their reliability and the information content on chemical properties. Recommendations for a set of standard characterizations mandatory for toxicological assessment are derived.

  4. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    Sanchez Paz, L.A.

    1991-01-01

    The objectives of this work were to quantify the lead content in two types of canned chilli from three brands, determining whether it lies within the maximum permissible level (2 ppm); to compare two brands sold in both glass-jar and canned presentations, in order to determine the effect of the container on the final lead content; and to carry out a comparative study of the techniques in terms of accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and X-ray fluorescence. Samples were pretreated by calcination, followed by dissolution of the ashes in acid medium and dilution to a fixed volume for analysis by atomic absorption and plasma emission. For X-ray fluorescence analysis, after the ashes were dissolved, the lead was precipitated with PCDA (ammonium pyrrolidine carbodithioate), filtered, and the dried filter paper was counted directly. Standards were prepared following the same procedure as the samples, using a lead Titrisol solution. For each technique, the percent recovery was determined by spiking with known amounts of lead. Calibration curves plotted for each technique showed that all three are linear over the established working range, and recovery exceeded 95% in all three cases. A variance analysis showed that the lead content of the samples does not exceed 2 ppm, and that the lead content of canned chillis is higher than that of chillis in glass containers (1.7 and 0.4 ppm, respectively). The X-ray fluorescence results differ from those obtained by the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission. (Author)
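
    The calibration-curve and spike-recovery arithmetic underlying such a validation can be sketched as follows. All numbers below are invented for illustration (they are not the study's data); the function names are likewise assumptions.

```python
import numpy as np

def fit_calibration(conc, signal):
    """Least-squares calibration line: signal = slope * conc + intercept."""
    slope, intercept = np.polyfit(conc, signal, 1)
    return slope, intercept

def read_concentration(signal, slope, intercept):
    """Invert the calibration line for an unknown sample."""
    return (signal - intercept) / slope

def recovery_percent(found_spiked, found_unspiked, amount_added):
    """Spike recovery: (found spike / added spike) * 100."""
    return 100.0 * (found_spiked - found_unspiked) / amount_added

# Hypothetical lead standards (ppm) vs. instrument response
conc_std = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
signal_std = np.array([0.01, 0.26, 0.51, 1.01, 2.01])
slope, intercept = fit_calibration(conc_std, signal_std)
sample_ppm = read_concentration(0.86, slope, intercept)
rec = recovery_percent(found_spiked=2.95, found_unspiked=1.0, amount_added=2.0)
```

    Linearity of the fitted line over the working range and a recovery above 95% are the two acceptance checks the abstract describes.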

  5. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized

  6. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization...... could support interaction. We have analyzed service interaction cases in a context of technology-mediated car rental service. With the analysis technique we propose, inspired by Interaction Analysis method, we were able to capture and model the situational service interaction. Our contribution regarding...... technology-mediated service interaction design is twofold: First, with the increased understanding on the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces...

  7. Standard practice for evaluation of hydrogen uptake, permeation, and transport in metals by an electrochemical technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This practice gives a procedure for the evaluation of hydrogen uptake, permeation, and transport in metals using an electrochemical technique which was developed by Devanathan and Stachurski. While this practice is primarily intended for laboratory use, such measurements have been conducted in field or plant applications. Therefore, with proper adaptations, this practice can also be applied to such situations. 1.2 This practice describes calculation of an effective diffusivity of hydrogen atoms in a metal and for distinguishing reversible and irreversible trapping. 1.3 This practice specifies the method for evaluating hydrogen uptake in metals based on the steady-state hydrogen flux. 1.4 This practice gives guidance on preparation of specimens, control and monitoring of the environmental variables, test procedures, and possible analyses of results. 1.5 This practice can be applied in principle to all metals and alloys which have a high solubility for hydrogen, and for which the hydrogen permeation is ...
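
    A common reduction of the Devanathan-Stachurski permeation transient is the time-lag method, which gives an effective diffusivity from the specimen thickness and the transient's time lag, and then uses the steady-state flux to back out the sub-surface hydrogen concentration. The sketch below uses that standard analysis with hypothetical numbers; the practice itself defines the full procedure and units.

```python
def effective_diffusivity(thickness_m, t_lag_s):
    """Time-lag estimate from the permeation transient: D_eff = L^2 / (6 * t_lag)."""
    return thickness_m ** 2 / (6.0 * t_lag_s)

def subsurface_concentration(j_ss, thickness_m, d_eff):
    """Steady-state flux J_ss = D_eff * C0 / L  =>  C0 = J_ss * L / D_eff."""
    return j_ss * thickness_m / d_eff

# Hypothetical membrane: 1 mm thick, 300 s time lag, steady flux 1e-6 mol/(m^2 s)
d_eff = effective_diffusivity(1e-3, 300.0)        # m^2/s
c0 = subsurface_concentration(1e-6, 1e-3, d_eff)  # mol/m^3
```

    Comparing the effective diffusivity from repeated transients against the lattice value is one way to distinguish reversible from irreversible trapping, as the practice discusses.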

  8. Pyrite: A blender plugin for visualizing molecular dynamics simulations using industry-standard rendering techniques.

    Science.gov (United States)

    Rajendiran, Nivedita; Durrant, Jacob D

    2018-05-05

    Molecular dynamics (MD) simulations provide critical insights into many biological mechanisms. Programs such as VMD, Chimera, and PyMOL can produce impressive simulation visualizations, but they lack many advanced rendering algorithms common in the film and video-game industries. In contrast, the modeling program Blender includes such algorithms but cannot import MD-simulation data. MD trajectories often require many gigabytes of memory/disk space, complicating Blender import. We present Pyrite, a Blender plugin that overcomes these limitations. Pyrite allows researchers to visualize MD simulations within Blender, with full access to Blender's cutting-edge rendering techniques. We expect Pyrite-generated images to appeal to students and non-specialists alike. A copy of the plugin is available at http://durrantlab.com/pyrite/, released under the terms of the GNU General Public License Version 3. © 2017 Wiley Periodicals, Inc.

  9. Model-Based Data Integration and Process Standardization Techniques for Fault Management: A Feasibility Study

    Science.gov (United States)

    Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig

    2018-01-01

    This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.

  10. Four-arm single docking full robotic surgery for low rectal cancer: technique standardization

    Directory of Open Access Journals (Sweden)

    José Reinan Ramos

    Full Text Available The authors present the four-arm single docking full robotic surgery to treat low rectal cancer. The eight main operative steps are: 1- patient positioning; 2- trocars set-up and robot docking; 3- sigmoid colon, left colon and splenic flexure mobilization (lateral-to-medial approach); 4- inferior mesenteric artery and vein ligation (medial-to-lateral approach); 5- total mesorectum excision and preservation of hypogastric and pelvic autonomic nerves (sacral dissection, lateral dissection, pelvic dissection); 6- division of the rectum using an endo roticulator stapler for the laparoscopic performance of a double-stapled coloanal anastomosis (type I tumor); 7- intersphincteric resection, extraction of the specimen through the anus and lateral-to-end hand sewn coloanal anastomosis (type II tumor); 8- cylindric abdominoperineal resection, with transabdominal section of the levator muscles (type IV tumor). The techniques employed were safe and have presented low rates of complication and no mortality.

  11. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, how to predict grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can monitor grasshoppers more exactly, has advantages in measuring the degree of damage and classifying the damage areas of grasshoppers, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between hyperspectral characteristic parameters and leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that spectral analysis techniques could be used as quick and exact tools in monitoring and forecasting grasshopper infestations, and will become an important means in such research for their advantages in determining spatial orientation, information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
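
    The NDVI values mentioned above are computed per pixel from near-infrared and red reflectance, NDVI = (NIR - Red) / (NIR + Red). A minimal sketch with toy reflectance rasters (the values and function name are illustrative, not data from any study cited here):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), per pixel; a small epsilon guards
    against division by zero on dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)

# Toy 2x2 reflectance rasters; dense green vegetation gives high NDVI
nir_band = np.array([[0.6, 0.5], [0.4, 0.8]])
red_band = np.array([[0.2, 0.25], [0.3, 0.1]])
v = ndvi(nir_band, red_band)
```

    Thresholding or tracking such an NDVI raster over time is what lets vegetation condition act as a proxy for grasshopper habitat quality.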

  12. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high-voltage spark sources is presented. The 'point-to-point' technique is employed. The experimental parameters were optimized taking into account a compromise between detection sensitivity and measurement precision. The parameters investigated were the high-voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes and the type of polishing and diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was also investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, with the counter electrode itself used as an internal standard; in the case of graphite counter electrodes, the iron lines were employed as the internal standard. Relative errors were the criteria for evaluating these experiments. National Bureau of Standards certified reference stainless steel standards and Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for constructing the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and imprecision of the proposed method varied from 2% to 15% and from 1% to 9%, respectively. The technique was compared with other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis, and the advantages and disadvantages of each were discussed. (author) [pt

  13. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Full Text Available Data Mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information that may be essential to an organization; the extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but very few efforts have been made in the field of criminology, and fewer still have compared the information these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity in society. Criminals can also be predicted based on crime data. The main aim of this work is to survey the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.
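
    As an illustration of the unsupervised side of such a survey, incident coordinates can be grouped into hotspots with a minimal k-means (Lloyd's algorithm). The synthetic data and deterministic initialization below are assumptions made for this sketch, not anything drawn from the surveyed papers:

```python
import numpy as np

def kmeans(points, init_centers, iters=20):
    """Minimal Lloyd's algorithm: assign each point to its nearest centre,
    then move each centre to the mean of its assigned points."""
    centers = np.asarray(init_centers, dtype=float).copy()
    for _ in range(iters):
        d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# Two synthetic incident "hotspots" around (0, 0) and (5, 5)
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal([0.0, 0.0], 0.3, (50, 2)),
                 rng.normal([5.0, 5.0], 0.3, (50, 2))])
labels, centers = kmeans(pts, init_centers=pts[[0, -1]])
```

    A supervised counterpart would instead train a classifier on labelled historical records; the clustering above only groups incidents by spatial proximity.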

  14. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs
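
    The face-profiling procedure (successive grindings with image analysis at each layer) reduces naturally to a stack of binary corrosion masks, one per grinding depth, from which corroded area fraction versus depth and maximum penetration follow. A sketch under that assumption; the mask stack, step size and function name are hypothetical:

```python
import numpy as np

def corrosion_profile(masks, step_um):
    """masks: boolean stack (n_grinds, H, W), one binary corrosion mask per
    grinding depth. Returns the corroded area fraction at each depth and the
    maximum penetration depth in micrometres."""
    masks = np.asarray(masks, dtype=bool)
    fractions = masks.mean(axis=(1, 2))
    hit = np.nonzero(fractions > 0)[0]
    max_depth = (hit.max() + 1) * step_um if hit.size else 0.0
    return fractions, max_depth

# Toy stack: corrosion shrinks with depth and is gone by the third grind
m = np.zeros((3, 4, 4), dtype=bool)
m[0, :2, :2] = True     # 4 of 16 surface pixels corroded
m[1, 0, 0] = True       # 1 of 16 pixels after one 10 um regrind
fracs, depth_um = corrosion_profile(m, step_um=10.0)
```

    The edge-profiling variant would instead measure depths directly on each cross-sectional mask, yielding a distribution of penetration depths rather than an area-versus-depth curve.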

  15. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were not productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average changes in technical, technological, scale and managerial efficiency were 0.989, 1.008, 1.028 and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. Hospital productivity rates generally showed an increasing trend; however, the overall average productivity decreased. Among the components of total productivity, variation in technological efficiency had the greatest impact on the reduction of the overall average productivity.
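
    The Malmquist decomposition itself is more involved, but the per-unit efficiency score it builds on can be sketched as an input-oriented CCR (constant returns to scale) linear program. The three-hospital data and function name below are hypothetical, and this uses scipy rather than the DEAP software of the study:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (n_units, n_inputs) input matrix, Y: (n_units, n_outputs) output matrix.
    Solves: min theta s.t. sum_j lam_j x_j <= theta x_o, sum_j lam_j y_j >= y_o."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                       # variables [theta, lam_1..lam_n]; minimise theta
    A = np.zeros((m + s, 1 + n))
    b = np.zeros(m + s)
    A[:m, 0] = -X[o]                 # input rows: sum_j lam_j x_ij - theta x_io <= 0
    A[:m, 1:] = X.T
    A[m:, 1:] = -Y.T                 # output rows: sum_j lam_j y_rj >= y_ro
    b[m:] = -Y[o]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Three hypothetical hospitals, one input (e.g. beds) and one output (e.g. admissions)
X = np.array([[2.0], [4.0], [4.0]])
Y = np.array([[2.0], [4.0], [2.0]])
eff_worst = ccr_efficiency(X, Y, 2)  # unit 2 produces half as much per unit input
```

    A Malmquist index then compares each unit's efficiency against the frontiers of two adjacent years, which is what separates the technical-change and efficiency-change components reported above.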

  16. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory, so the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that the proposed approach of applying PCA and LDA to normalized data sets improves the performance significantly for face images with large illumination variations.
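
    The DCT-based compensation described above can be sketched as: take the image's logarithm, apply a 2-D DCT, zero a band of low-frequency coefficients (where slow lighting variation concentrates), and invert. The coefficient count and function name below are illustrative assumptions, not the paper's exact settings; variants also reset the DC term to a constant rather than zero.

```python
import numpy as np
from scipy.fft import dctn, idctn

def normalize_illumination(img, n_low=3):
    """Log-domain DCT normalisation: zero the lowest-frequency 2-D DCT
    coefficients (indices with u + v < n_low), then transform back."""
    log_img = np.log1p(img.astype(float))
    C = dctn(log_img, norm='ortho')
    for u in range(n_low):
        for v in range(n_low - u):
            C[u, v] = 0.0            # discard illumination-dominated terms
    return idctn(C, norm='ortho')

face = np.arange(64, dtype=float).reshape(8, 8)   # stand-in for a face image
flat = normalize_illumination(face)
```

    The normalized images would then feed the nearest-neighbor, PCA, or LDA stages compared in the paper.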

  17. The Trends and Prospects of Health Information Standards : Standardization Analysis and Suggestions

    International Nuclear Information System (INIS)

    Kim, Chang Soo

    2008-01-01

    Ubiquitous health care system, which is one of the developing solution technologies of IT, BT and NT, could give us new medical environments in future. Implementing health information systems can be complex, expensive and frustrating. Healthcare professionals seeking to acquire or upgrade systems do not have a convenient, reliable way of specifying a level of adherence to communication standards sufficient to achieve truly efficient interoperability. Great progress has been made in establishing such standards; DICOM, IHE and HL7, notably, are now highly advanced. IHE has defined a common framework to deliver the basic interoperability needed for local and regional health information networks. It has developed a foundational set of standards-based integration profiles for information exchange with three interrelated efforts. HL7 is one of several ANSI-accredited Standards Developing Organizations operating in the healthcare arena. Most SDOs produce standards (protocols) for a particular healthcare domain such as pharmacy, medical devices, imaging or insurance transactions. HL7's domain is clinical and administrative data. HL7 is an international community of healthcare subject matter experts and information scientists collaborating to create standards for the exchange, management and integration of electronic healthcare information. The ASTM specification for Continuity of Care Record was developed by subcommittee E31.28 on electronic health records, which includes clinicians, provider institutions, administrators, patient advocates, vendors, and health industry. In this paper, there are suggestions that provide a test bed, demonstration and specification of how standards such as IHE, HL7 and ASTM can be used to provide an integrated environment.

  18. The Trends and Prospects of Health Information Standards : Standardization Analysis and Suggestions

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Soo [Dept. of Radiological Science, College of Health Science, Catholic University of Pusan, Pusan (Korea, Republic of)

    2008-03-15

    Ubiquitous health care system, which is one of the developing solution technologies of IT, BT and NT, could give us new medical environments in future. Implementing health information systems can be complex, expensive and frustrating. Healthcare professionals seeking to acquire or upgrade systems do not have a convenient, reliable way of specifying a level of adherence to communication standards sufficient to achieve truly efficient interoperability. Great progress has been made in establishing such standards; DICOM, IHE and HL7, notably, are now highly advanced. IHE has defined a common framework to deliver the basic interoperability needed for local and regional health information networks. It has developed a foundational set of standards-based integration profiles for information exchange with three interrelated efforts. HL7 is one of several ANSI-accredited Standards Developing Organizations operating in the healthcare arena. Most SDOs produce standards (protocols) for a particular healthcare domain such as pharmacy, medical devices, imaging or insurance transactions. HL7's domain is clinical and administrative data. HL7 is an international community of healthcare subject matter experts and information scientists collaborating to create standards for the exchange, management and integration of electronic healthcare information. The ASTM specification for Continuity of Care Record was developed by subcommittee E31.28 on electronic health records, which includes clinicians, provider institutions, administrators, patient advocates, vendors, and health industry. In this paper, there are suggestions that provide a test bed, demonstration and specification of how standards such as IHE, HL7 and ASTM can be used to provide an integrated environment.

  19. The quantitative analysis of Bowen's kale by PIXE using the internal standard

    International Nuclear Information System (INIS)

    Navarrete, V.R.; Izawa, G.; Shiokawa, T.; Kamiya, M.; Morita, S.

    1978-01-01

    The internal standard method was used for the non-destructive quantitative determination of trace elements by PIXE. A uniform distribution of the internal standard element in the Bowen's kale powder sample was obtained using a homogenization technique. Eleven elements were determined quantitatively, with samples prepared as self-supporting targets showing lower relative standard deviations than non-self-supporting targets. (author)
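
    The internal-standard quantification itself is simple arithmetic: the analyte concentration follows from the ratio of its X-ray yield to that of the internal standard, scaled by a relative sensitivity factor measured on standards prepared the same way. A generic sketch; the function name and numbers are hypothetical:

```python
def conc_by_internal_standard(i_analyte, i_standard, c_standard, k_rel):
    """C_x = C_is * (I_x / I_is) / k_rel, where k_rel is the analyte's X-ray
    yield per unit concentration relative to the internal standard."""
    return c_standard * (i_analyte / i_standard) / k_rel

# Hypothetical numbers: 100 ug/g internal standard, analyte twice as sensitive
c_x = conc_by_internal_standard(i_analyte=250.0, i_standard=500.0,
                                c_standard=100.0, k_rel=2.0)
```

    Because both yields come from the same spectrum, matrix and geometry effects largely cancel in the ratio, which is the method's main appeal for non-destructive work.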

  20. Determination of 25 elements in biological standard reference materials by neutron activation analysis

    International Nuclear Information System (INIS)

    Guzzi, G.; Pietra, R.; Sabbioni, E.

    1974-12-01

    The Standard and Certified Reference Materials programme of the JRC includes the determination of trace elements in complex biological samples delivered by the U.S. National Bureau of Standards: Bovine Liver (NBS SRM 1577), Orchard Leaves (NBS SRM 1571) and Tomato Leaves. The study was performed using neutron activation analysis. Due to the very low concentration of some elements, radiochemical group or elemental separation procedures were necessary. The paper describes the techniques used to analyse 25 elements. Computer-assisted instrumental neutron activation analysis with high-resolution Ge(Li) spectrometry was considerably advantageous in the determination of Na, K, Cl, Mn, Fe, Rb and Co, and in some cases of Ca, Zn, Cs, Sc and Cr. For low contents of Ca, Mg, Ni and Si, special chemical separation schemes followed by Cerenkov counting were developed. Two other separation procedures allowing the determination of As, Cd, Ga, Hg, Mo, Cu, Sr, Se, Ba and P have been set up: the first, simplified one involves the use of high-resolution Ge(Li) detectors; the second, more complete one involves a larger number of shorter measurements performed by simpler and more sensitive techniques, such as NaI(Tl) scintillation spectrometry and Cerenkov counting. The results obtained are presented and discussed
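
    Determinations of this kind are typically quantified by the relative (comparator) method: sample and standard are irradiated together, counted, and compared after decay correction. A minimal sketch under that assumption, with hypothetical counts, masses and half-life:

```python
import math

def comparator_naa(counts_sample, mass_sample, counts_std, mass_std,
                   conc_std, t_decay_sample, t_decay_std, half_life):
    """Relative (comparator) NAA: concentration from specific count rates of
    sample and standard, decay-corrected back to the end of irradiation."""
    lam = math.log(2) / half_life
    a_sample = counts_sample * math.exp(lam * t_decay_sample) / mass_sample
    a_std = counts_std * math.exp(lam * t_decay_std) / mass_std
    return conc_std * a_sample / a_std

# Hypothetical values; equal decay times make the correction cancel
c = comparator_naa(counts_sample=5000, mass_sample=0.2,
                   counts_std=20000, mass_std=0.1,
                   conc_std=100.0, t_decay_sample=3600.0, t_decay_std=3600.0,
                   half_life=15.0 * 3600.0)
```

    In practice each nuclide's gamma peak is integrated from the Ge(Li) spectrum before this ratio is formed; the separations described above exist to make those peaks measurable at all.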

  1. A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-05-01

    Full Text Available Abstract Background Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we have introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species. Results We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in a slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. 
It turns out that the computational cost of the
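
    The Monte Carlo baseline the four approximations are compared against estimates first-order variance-based (Sobol) indices, for example with a pick-freeze scheme. The toy model below is an illustrative assumption (a linear function with analytic indices 0.8 and 0.2), not the MAPK cascade used in the paper:

```python
import numpy as np

def sobol_first_order(f, d, n=100_000, seed=0):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices for a
    model f with d independent Uniform(0, 1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA, yB = f(A), f(B)
    var = yA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]        # freeze coordinate i at A's values
        S[i] = (np.mean(yA * f(ABi)) - yA.mean() * yB.mean()) / var
    return S

# Toy additive model: analytic first-order indices are [4/5, 1/5]
f = lambda x: 2.0 * x[:, 0] + 1.0 * x[:, 1]
S = sobol_first_order(f, d=2)
```

    The estimator converges at the usual O(1/sqrt(n)) Monte Carlo rate, which is exactly the cost that motivates the analytic approximations studied in the paper.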

  2. Towards a standard protocol for antimony intralesional infiltration technique for cutaneous leishmaniasis treatment

    Directory of Open Access Journals (Sweden)

    Rosiana Estéfane da Silva

    Full Text Available BACKGROUND Despite its recognised toxicity, antimonial therapy continues to be the first-line drug for cutaneous leishmaniasis (CL) treatment. Intralesional administration of meglumine antimoniate (MA) represents an alternative that could reduce the systemic absorption of the drug and its side effects. OBJECTIVES This study aims to validate the standard operational procedure (SOP) for the intralesional infiltration of MA for CL therapy as the first step before the assessment of efficacy and safety related to the procedure. METHODS The SOP was created based on 21 trials retrieved from the literature, direct monitoring of the procedure and consultation with experts. This script was submitted to a formal computer-aided inspection to identify readability, clarity, omission, redundancy and unnecessary information (content validation). For criterion and construct validations, the influence of critical condition changes (compliance with the instructions and professional experience) on outcome conformity (saturation status achievement), tolerability (pain referred) and safety (bleeding) were assessed. FINDINGS The median procedure length was 12 minutes and in 72% of them, patients classified the pain as mild. The bleeding was also classified as mild in 96.6% of the procedures. Full compliance with the SOP was observed in 66% of infiltrations. Despite this, in 100% of the inspected procedures, lesion saturation was observed at the end of infiltration, which means that it tolerates some degree of modification in its execution (robustness) without prejudice to the result. CONCLUSIONS The procedure is reproducible and can be used by professionals without previous training with high success and safety rates.

  3. Towards a standard protocol for antimony intralesional infiltration technique for cutaneous leishmaniasis treatment.

    Science.gov (United States)

    Silva, Rosiana Estéfane da; Carvalho, Janaína de Pina; Ramalho, Dario Brock; Senna, Maria Camilo Ribeiro De; Moreira, Hugo Silva Assis; Rabello, Ana; Cota, Erika; Cota, Gláucia

    2018-02-01

    BACKGROUND Despite its recognised toxicity, antimonial therapy continues to be the first-line drug for cutaneous leishmaniasis (CL) treatment. Intralesional administration of meglumine antimoniate (MA) represents an alternative that could reduce the systemic absorption of the drug and its side effects. OBJECTIVES This study aims to validate the standard operational procedure (SOP) for the intralesional infiltration of MA for CL therapy as the first step before the assessment of efficacy and safety related to the procedure. METHODS The SOP was created based on 21 trials retrieved from the literature, direct monitoring of the procedure and consultation with experts. This script was submitted to a formal computer-aided inspection to identify readability, clarity, omission, redundancy and unnecessary information (content validation). For criterion and construct validations, the influence of critical condition changes (compliance with the instructions and professional experience) on outcome conformity (saturation status achievement), tolerability (pain referred) and safety (bleeding) were assessed. FINDINGS The median procedure length was 12 minutes and in 72% of them, patients classified the pain as mild. The bleeding was also classified as mild in 96.6% of the procedures. Full compliance with the SOP was observed in 66% of infiltrations. Despite this, in 100% of the inspected procedures, lesion saturation was observed at the end of infiltration, which means that it tolerates some degree of modification in its execution (robustness) without prejudice to the result. CONCLUSIONS The procedure is reproducible and can be used by professionals without previous training with high success and safety rates.

  4. Two-loop renormalization in the standard model, part II. Renormalization procedures and computational techniques

    Energy Technology Data Exchange (ETDEWEB)

    Actis, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Passarino, G. [Torino Univ. (Italy). Dipt. di Fisica Teorica; INFN, Sezione di Torino (Italy)

    2006-12-15

    In part I general aspects of the renormalization of a spontaneously broken gauge theory have been introduced. Here, in part II, two-loop renormalization is introduced and discussed within the context of the minimal Standard Model. Therefore, this paper deals with the transition between bare parameters and fields to renormalized ones. The full list of one- and two-loop counterterms is shown and it is proven that, by a suitable extension of the formalism already introduced at the one-loop level, two-point functions suffice in renormalizing the model. The problem of overlapping ultraviolet divergencies is analyzed and it is shown that all counterterms are local and of polynomial nature. The original program of 't Hooft and Veltman is at work. Finite parts are written in a way that allows for a fast and reliable numerical integration with all collinear logarithms extracted analytically. Finite renormalization, the transition between renormalized parameters and physical (pseudo-)observables, are discussed in part III where numerical results, e.g. for the complex poles of the unstable gauge bosons, are shown. An attempt is made to define the running of the electromagnetic coupling constant at the two-loop level. (orig.)

  5. [Pay attention to the standardized application of new techniques in surgical treatment of thyroid disease].

    Science.gov (United States)

    Tian, W; Xi, H Q; Wang, B

    2017-08-01

    The continuous development and application of new technology in thyroid surgery has driven rapid improvement in the field. The application of neuromonitoring technology has brought thyroid surgery into a precision era: intraoperative neuromonitoring and continuous intraoperative neuromonitoring have made recurrent laryngeal nerve protection more secure. Carbon-nanoparticle parathyroid negative-imaging technology can identify the parathyroid glands more precisely; however, when carbon nanoparticles are used, the injection time, position and dosage must be controlled to achieve the best negative-imaging effect. Endoscopic and robotic thyroid surgery can meet cosmetic demands, but "treatment first, beauty second" remains the principle to be strictly followed; indications should not be blindly expanded in pursuit of endoscopic surgery. Updated surgical energy devices have made operations more efficient, though these instruments also have disadvantages, so thyroid surgeons must correctly understand their working principles and use them rationally. By grasping the working principles and application skills of new technologies in clinical work, defining their advantages and disadvantages, adhering to the principle of "reasonable choice, standard application", and learning from pioneers' experience, new thyroid diagnosis and treatment technologies can be applied more reasonably and safely.

  6. Watermarking Techniques Using Least Significant Bit Algorithm for Digital Image Security Standard Solution- Based Android

    Directory of Open Access Journals (Sweden)

    Ari Muzakir

    2017-05-01

    Full Text Available The ease with which digital images can be distributed over the internet has both positive and negative sides, especially for the owner of the original image. On the positive side, owners can quickly publish their image files to sites around the world. The downside is that, without a copyright mark protecting the image, its ownership can easily be claimed by other parties. Watermarking is one solution for protecting copyright and establishing the provenance of a digital image. With digital image watermarking, the copyright of the resulting image is protected by embedding additional information, such as owner details, that also attests to the image's authenticity. The least significant bit (LSB) algorithm is simple and easy to understand. Simulations carried out on an Android smartphone show that an LSB watermark cannot be seen by the naked human eye: there is no perceptible difference between the original image and the image carrying the watermark. The resulting image has dimensions of 640x480 with a bit depth of 32 bits. In addition, black box testing was used to assess the ability of the device (smartphone) to process images with this application.
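    As a minimal sketch of the LSB idea described in the abstract (operating on a flat list of 8-bit pixel values; a real image would first be decoded to raw channel bytes, which this sketch assumes has already happened):

```python
def embed_lsb(pixels, bits):
    """Embed a watermark bit sequence into the least significant bits of
    8-bit pixel values (one bit per byte)."""
    if len(bits) > len(pixels):
        raise ValueError("watermark longer than cover image")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit   # clear the LSB, then set it
    return stego

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits embedded in the pixel LSBs."""
    return [p & 1 for p in pixels[:n_bits]]
```

    Because only the lowest bit of each byte can change, the per-pixel error is at most 1 out of 255, which is why the watermark is invisible to the naked eye.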

  7. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e., identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
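    The merging criterion ("merge while statistically consistent within the errors") can be illustrated with a one-dimensional toy version; this is a hypothetical sketch of the idea, not BaTMAn's actual algorithm:

```python
def merge_consistent(values, errors, n_sigma=2.0):
    """Greedily merge adjacent elements whose signals agree within errors,
    keeping the inverse-variance weighted mean of each merged region."""
    regions = list(zip(values, errors))
    merged = True
    while merged:
        merged = False
        out = [regions[0]]
        for v, e in regions[1:]:
            v0, e0 = out[-1]
            if abs(v - v0) <= n_sigma * (e ** 2 + e0 ** 2) ** 0.5:
                w0, w = 1.0 / e0 ** 2, 1.0 / e ** 2    # inverse-variance weights
                out[-1] = ((w0 * v0 + w * v) / (w0 + w), (w0 + w) ** -0.5)
                merged = True
            else:
                out.append((v, e))
        regions = out
    return regions
```

    Merging reduces the error of each region, mirroring the improved signal quality in low signal-to-noise areas, while statistically distinct neighbours stay separate.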

  8. Salivary Fluoride level in preschool children after toothbrushing with standard and low fluoride content dentifrice, using the transversal dentifrice application technique: pilot study

    Directory of Open Access Journals (Sweden)

    Fabiana Jandre Melo

    2008-01-01

    Full Text Available Objective: To investigate the salivary fluoride concentration in preschool children after toothbrushing with dentifrices containing standard (1100 ppmF/NaF) and low (500 ppmF/NaF) fluoride concentrations, using the transversal technique of placing the product on the toothbrush. Methods: Eight children of both sexes, aged between 4 years and 9 months and 5 years and 6 months, participated in the study. The experiment was divided into two phases with a one-week interval. In the first stage, the children used the standard-concentration dentifrice for one week, and in the second, the low-concentration product. Samples were collected at the end of each experimental stage, at the following times: before brushing, immediately afterwards, and after 15, 30 and 45 minutes. The fluoride contents were analyzed by the microdiffusion technique. Statistical analysis was done by analysis of variance (ANOVA) and Student's t-test (p < 0.05). Results: The salivary fluoride concentration was significantly higher at all times when the standard-concentration product was used. Comparison of the fluoride concentration found before brushing and immediately afterwards showed a 6.8-fold increase with the standard dentifrice (0.19 x 1.29 μgF/ml) and a 20.5-fold increase with the low-concentration product (0.02 x 0.41 μgF/ml). Conclusion: Toothbrushing with both products promoted relevant increases in the salivary fluoride concentration; however, longitudinal studies are necessary to verify the clinical result of this measurement.
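    The reported fold increases follow directly from the quoted before/after values; a trivial arithmetic check:

```python
# Salivary fluoride (ugF/ml) before vs. immediately after brushing, as reported
standard = (0.19, 1.29)   # 1100 ppm F dentifrice
low      = (0.02, 0.41)   #  500 ppm F dentifrice

def fold_increase(before_after):
    """Ratio of the post-brushing to the pre-brushing concentration."""
    before, after = before_after
    return after / before
```

    The lower baseline of the 500 ppm F group explains why its relative increase (20.5-fold) exceeds that of the standard dentifrice (6.8-fold) even though its absolute concentrations are lower.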

  9. Characterization of the storage pool of the Neutron Standards Laboratory of CIEMAT, using Monte Carlo techniques

    Energy Technology Data Exchange (ETDEWEB)

    Campo B, X.; Mendez V, R.; Embid S, M. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas, Av. Complutense 40, 28040 Madrid (Spain); Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98060 Zacatecas (Mexico); Sanz G, J., E-mail: xandra.campo@ciemat.es [Universidad Nacional de Educacion a Distancia, Escuela Tecnica Superior de Ingenieros Industriales, C. Juan del Rosal 12, 28040 Madrid (Spain)

    2014-08-15

    The Neutron Standards Laboratory of CIEMAT in Spain is a brand new irradiation facility, with {sup 241}Am-Be (185 GBq) and {sup 252}Cf (5 GBq) calibrated neutron sources which are stored in a water pool with a concrete cover. From this storage place an automated system takes the selected source and places it in the irradiation position, 4 m above ground level and at the geometrical center of the irradiation room, which measures 9 m (length) x 7.5 m (width) x 8 m (height). For calibration or irradiation purposes, detectors or materials can be placed on a bench, but it is also possible to use the pool (1.0 m x 1.5 m and more than 1.0 m deep) for long irradiations in thermal neutron fields. For this reason it is essential to characterize the pool itself in terms of neutron spectrum. In this document, the main features of this facility are presented and the characterization of the storage pool in terms of neutron fluence rate and neutron spectrum has been carried out using simulations with the MCNPX-2.7.e code. The MCNPX-2.7.e model has been validated using experimental measurements outside the pool (Berthold LB6411). Inside the pool, the fluence rate decreases and the spectrum is thermalized with distance from the {sup 252}Cf source. This source predominates, and the effect of the {sup 241}Am-Be source on these quantities only becomes apparent at positions closer than 20 cm to it. (author)
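    As a toy illustration of the Monte Carlo approach (not the MCNPX model itself), one can estimate the fraction of source neutrons that reach a given radius before their first collision, using an assumed mean free path, and compare it with the analytic answer exp(-r/λ):

```python
import math
import random

def uncollided_fraction_mc(r, mfp, n=100_000, seed=1):
    """Monte Carlo estimate of the fraction of source neutrons that travel
    farther than r before their first collision (free paths ~ Exp(1/mfp));
    the analytic answer is exp(-r/mfp)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.expovariate(1.0 / mfp) > r)
    return hits / n

# 10 cm into a medium with an assumed 5 cm mean free path
est = uncollided_fraction_mc(0.10, 0.05)
```

    The rapid exponential loss of uncollided neutrons, followed by moderation of the scattered ones, is the mechanism behind the thermalized spectra observed with distance inside the pool.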

  10. Characterization of the storage pool of the Neutron Standards Laboratory of CIEMAT, using Monte Carlo techniques

    International Nuclear Information System (INIS)

    Campo B, X.; Mendez V, R.; Embid S, M.; Vega C, H. R.; Sanz G, J.

    2014-08-01

    The Neutron Standards Laboratory of CIEMAT in Spain is a brand new irradiation facility, with 241 Am-Be (185 GBq) and 252 Cf (5 GBq) calibrated neutron sources which are stored in a water pool with a concrete cover. From this storage place an automated system takes the selected source and places it in the irradiation position, 4 m above ground level and at the geometrical center of the irradiation room, which measures 9 m (length) x 7.5 m (width) x 8 m (height). For calibration or irradiation purposes, detectors or materials can be placed on a bench, but it is also possible to use the pool (1.0 m x 1.5 m and more than 1.0 m deep) for long irradiations in thermal neutron fields. For this reason it is essential to characterize the pool itself in terms of neutron spectrum. In this document, the main features of this facility are presented and the characterization of the storage pool in terms of neutron fluence rate and neutron spectrum has been carried out using simulations with the MCNPX-2.7.e code. The MCNPX-2.7.e model has been validated using experimental measurements outside the pool (Berthold LB6411). Inside the pool, the fluence rate decreases and the spectrum is thermalized with distance from the 252 Cf source. This source predominates, and the effect of the 241 Am-Be source on these quantities only becomes apparent at positions closer than 20 cm to it. (author)

  11. High Classification Rates for Continuous Cow Activity Recognition using Low-cost GPS Positioning Sensors and Standard Machine Learning Techniques

    DEFF Research Database (Denmark)

    Godsk, Torben; Kjærgaard, Mikkel Baun

    2011-01-01

    activities. By preprocessing the raw cow position data, we obtain high classification rates using standard machine learning techniques to recognize cow activities. Our objectives were to (i) determine to what degree it is possible to robustly recognize cow activities from GPS positioning data, using low...... and their activities manually logged to serve as ground truth. For our dataset we managed to obtain an average classification success rate of 86.2% of the four activities: eating/seeking (90.0%), walking (100%), lying (76.5%), and standing (75.8%) by optimizing both the preprocessing of the raw GPS data...
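    A hypothetical sketch of the "standard machine learning" step on preprocessed GPS features. The two-feature representation below (mean speed, heading variance) and the nearest-centroid classifier are assumptions for illustration; this record does not specify the paper's actual features or classifier:

```python
def train_centroids(samples):
    """samples: iterable of (feature_vector, label); returns per-class mean vectors."""
    sums, counts = {}, {}
    for x, y in samples:
        acc = sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        for i, v in enumerate(x):
            acc[i] += v
    return {y: [s / counts[y] for s in acc] for y, acc in sums.items()}

def classify(centroids, x):
    """Assign x to the class with the nearest centroid (squared distance)."""
    return min(centroids,
               key=lambda y: sum((a - b) ** 2 for a, b in zip(centroids[y], x)))

# invented training windows: (mean speed m/s, heading variance) -> activity
training = [
    ((0.05, 0.02), "lying"), ((0.08, 0.03), "lying"),
    ((1.20, 0.20), "walking"), ((1.00, 0.25), "walking"),
    ((0.30, 0.80), "eating/seeking"), ((0.40, 0.70), "eating/seeking"),
]
centroids = train_centroids(training)
```

    In practice the manually logged activities serve as the training labels (ground truth), and classification success is reported per activity, as in the abstract.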

  12. Photon and proton activation analysis of iron and steel standards using the internal standard method coupled with the standard addition method

    International Nuclear Information System (INIS)

    Masumoto, K.; Hara, M.; Hasegawa, D.; Iino, E.; Yagi, M.

    1997-01-01

    The internal standard method coupled with the standard addition method has been applied to photon activation analysis and proton activation analysis of minor elements and trace impurities in various types of iron and steel samples issued by the Iron and Steel Institute of Japan (ISIJ). Samples and standard-addition samples were dissolved so as to mix an internal standard and the elements to be determined homogeneously, then solidified as a silica gel to obtain a similar matrix composition and geometry. Cerium and yttrium were used as internal standards in photon and proton activation, respectively. In photon activation, a 20 MeV electron beam was used for bremsstrahlung irradiation to reduce matrix activity and nuclear interference reactions, and the results were compared with those of 30 MeV irradiation. In proton activation, iron was removed by MIBK extraction after dissolving the samples, to reduce the radioactivity of 56 Co produced from iron via the 56 Fe(p, n) 56 Co reaction. The results of proton and photon activation analysis were in good agreement with the standard values of ISIJ. (author)
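    The standard addition method itself can be sketched generically: the signal is measured at several known spike levels, a straight line is fitted, and the unknown concentration is read off as the magnitude of the x-intercept. A minimal, hypothetical implementation (not the authors' exact evaluation procedure):

```python
def standard_addition(added, signal):
    """Fit signal = a + b * added by least squares; the unknown concentration
    equals the magnitude of the x-intercept, a / b."""
    n = len(added)
    mx, my = sum(added) / n, sum(signal) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(added, signal))
         / sum((x - mx) ** 2 for x in added))
    a = my - b * mx
    return a / b
```

    Coupling this with an internal standard normalizes every measurement to the same reference line, compensating for differences in irradiation and counting geometry between samples.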

  13. Novel technique for MR elastography of the prostate using a modified standard endorectal coil as actuator.

    Science.gov (United States)

    Thörmer, Gregor; Reiss-Zimmermann, Martin; Otto, Josephin; Hoffmann, Karl-Titus; Moche, Michael; Garnov, Nikita; Kahn, Thomas; Busse, Harald

    2013-06-01

    To present a novel method for MR elastography (MRE) of the prostate at 3 Tesla using a modified endorectal imaging coil. A commercial endorectal coil was modified to dynamically generate mechanical stress (contraction and dilation) in a prostate phantom with embedded phantom "lesions" (6 mm diameter) and in a porcine model. Resulting tissue displacements were measured with a motion-sensitive EPI sequence at actuation frequencies of 50-200 Hz. Maps of shear modulus G were calculated from the measured phase-difference shear-wave patterns. In the G maps of the phantom, "lesions" were easily discernible against the background. The average G values of regions of interest placed in the "lesion" (8.2 ± 1.9 kPa) were much higher than those in the background (3.6 ± 1.4 kPa) but systematically lower than values reported by the vendor (13.0 ± 1.0 and 6.7 ± 0.7 kPa, respectively). In the porcine model, shear waves could be generated and measured shear moduli were substantially different for muscle (7.1 ± 2.0 kPa), prostate (3.0 ± 1.4 kPa), and bulbourethral gland (5.6 ± 1.9 kPa). An endorectal MRE concept is technically feasible. The presented technique will allow for simultaneous MRE and MRI acquisitions using a commercial base device with minor, MR-conditional modifications. The diagnostic value needs to be determined in further trials. Copyright © 2012 Wiley Periodicals, Inc.
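    Shear moduli such as those quoted above are conventionally obtained from the measured shear-wave patterns via G = ρv² with wave speed v = fλ (assuming a linearly elastic, isotropic medium); a minimal sketch with illustrative numbers, not values from the study:

```python
def shear_modulus(density, frequency, wavelength):
    """G = rho * v**2 with shear-wave speed v = frequency * wavelength
    (linearly elastic, isotropic medium assumed)."""
    v = frequency * wavelength
    return density * v * v

# tissue-like density 1000 kg/m^3, 100 Hz actuation, 2 cm wavelength -> ~4 kPa
g_pa = shear_modulus(1000.0, 100.0, 0.02)
```

    At a fixed actuation frequency, stiffer tissue supports longer shear wavelengths, which is what the phase-difference wave images actually encode.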

  14. Standardization of the Fricke gel dosimetry method and tridimensional dose evaluation using the magnetic resonance imaging technique

    International Nuclear Information System (INIS)

    Cavinato, Christianne Cobello

    2009-01-01

    This study standardized the method for obtaining the Fricke gel solution developed at IPEN. The results for different gel qualities used in the preparation of the solutions were compared, as was the influence of the gelatin concentration on the response of the dosimetric solutions. Type tests such as dose-response dependence, minimum and maximum detection limits, and response reproducibility, among others, were carried out using different radiation types and the optical absorption (OA) spectrophotometry and magnetic resonance (MR) techniques. The useful dose ranges for Co-60 gamma radiation and 6 MeV photons are 0.4 to 30.0 Gy and 0.5 to 100.0 Gy, using the OA and MR techniques, respectively. A study of ferric ion diffusion in the solution was performed to determine the optimum time interval between irradiation and sample evaluation: up to 2.5 hours after irradiation to obtain sharp MR images. A spherical phantom consisting of Fricke gel solution prepared with 5% by weight 270 Bloom gelatin (national quality) was developed for three-dimensional dose assessment using the magnetic resonance imaging (MRI) technique. The Fricke gel solution prepared with 270 Bloom gelatin, which, in addition to its low cost, can be easily acquired on the national market, presents satisfactory results in terms of ease of handling, sensitivity, response reproducibility and consistency. The results confirm its applicability in three-dimensional dosimetry using the MRI technique. (author)
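    Dosimetric evaluation of such a system typically rests on a linear calibration of the reading (e.g. optical absorbance or relaxation rate) against delivered dose; a generic sketch with made-up calibration data, not the author's actual response values:

```python
def fit_calibration(doses, readings):
    """Least-squares line reading = a + b * dose; returns (a, b)."""
    n = len(doses)
    mx, my = sum(doses) / n, sum(readings) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(doses, readings))
         / sum((x - mx) ** 2 for x in doses))
    return my - b * mx, b

def dose_from_reading(a, b, reading):
    """Invert the calibration to evaluate an unknown sample."""
    return (reading - a) / b

# invented calibration points: dose (Gy) vs. optical absorbance
a, b = fit_calibration([0.0, 5.0, 10.0, 20.0], [0.05, 0.15, 0.25, 0.45])
```

    The quoted useful dose ranges are the interval over which such a calibration stays linear for each readout technique.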

  15. TU-EF-BRD-02: Indicators and Technique Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlone, M. [Princess Margaret Hospital (Canada)

    2015-06-15

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created ever increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  16. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created ever increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  17. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments
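    A common core of such decision-analytic prioritization is an additive multi-attribute utility model; the option names, attributes and weights below are purely illustrative, not the actual HIPP criteria:

```python
def utility_score(attributes, weights):
    """Additive multi-attribute utility: sum of weight * single-attribute
    utility, with attribute utilities already scaled to [0, 1]."""
    return sum(weights[k] * attributes[k] for k in weights)

def prioritize(options, weights):
    """options: list of (name, attribute dict); highest expected utility first."""
    return sorted(options, key=lambda item: utility_score(item[1], weights),
                  reverse=True)

weights = {"risk_reduction": 0.5, "cost_saving": 0.3, "public_acceptance": 0.2}
options = [
    ("tank-waste retrieval R&D",
     {"risk_reduction": 0.9, "cost_saving": 0.4, "public_acceptance": 0.6}),
    ("groundwater monitoring",
     {"risk_reduction": 0.5, "cost_saving": 0.9, "public_acceptance": 0.8}),
]
ranked = prioritize(options, weights)
```

    In a full analysis, the attribute utilities and weights come from structured expert and stakeholder elicitations, and uncertainty is handled by attaching probabilities to the consequences rather than point values.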

  18. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Directory of Open Access Journals (Sweden)

    Richard E.A. van Emmerik

    2016-03-01

    Full Text Available Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
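    The maximal Lyapunov exponent mentioned above can be illustrated on a textbook system: for the logistic map x → rx(1−x) at r = 4 the exponent is ln 2 ≈ 0.693, and averaging ln|f′(x)| along a trajectory recovers it. This is a toy sketch, not a gait-data estimator such as Rosenstein's method:

```python
import math

def lyapunov_logistic(r, x0=0.2, n=100_000, transient=1_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the trajectory average of ln|f'(x)| = ln|r*(1-2x)|."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        total += math.log(max(abs(r * (1.0 - 2.0 * x)), 1e-300))
    return total / n
```

    A positive exponent indicates exponential divergence of nearby trajectories (local instability); a negative one indicates convergence to a stable pattern, which is the basic interpretation carried over to movement dynamics.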

  19. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.

  20. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical-model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent microstructural changes which often occur during the operation of structural components at increasingly high temperatures over long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development, taking into account the various requisites stated above. (Auth.)
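    The time-independent elastic-plastic update at the heart of such finite element programs is often written as a return-mapping (radial-return) step; a one-dimensional sketch with linear isotropic hardening and illustrative material constants (not from the paper):

```python
def return_map(strain, eps_p, alpha, E=200e3, H=10e3, sy=250.0):
    """Radial-return update for 1-D rate-independent plasticity with linear
    isotropic hardening (stress units MPa); returns (stress, eps_p, alpha)."""
    s_tr = E * (strain - eps_p)              # elastic trial stress
    f = abs(s_tr) - (sy + H * alpha)         # trial yield function
    if f <= 0.0:
        return s_tr, eps_p, alpha            # step stays elastic
    dgamma = f / (E + H)                     # consistency condition f_new = 0
    sign = 1.0 if s_tr >= 0.0 else -1.0
    return (s_tr - E * dgamma * sign,        # stress returned to yield surface
            eps_p + dgamma * sign,           # updated plastic strain
            alpha + dgamma)                  # updated hardening variable
```

    Creep and other time-dependent effects are layered on top of this update by integrating an additional inelastic strain rate over the time step, which is where the mechanical-model and equation-of-state approaches differ.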

  1. Electrodeposition as a sample preparation technique for TXRF analysis

    International Nuclear Information System (INIS)

    Griesel, S.; Reus, U.; Prange, A.

    2000-01-01

    TXRF analysis of trace elements at concentrations in the μg/L range and below in high-salt matrices normally requires a number of sample preparation steps that include separation of the salt matrix and preconcentration of the trace elements. A neat approach which allows samples to be prepared straightforwardly in a single step involves the application of electrochemical deposition using the TXRF sample support itself as an electrode. For this work a common three-electrode arrangement (Radiometer Analytical) with a rotating disc electrode as the working electrode, as is frequently employed in voltammetric analysis, has been used. A special electrode tip has been constructed as a holder for the sample carrier, which consists of polished glassy carbon. This material has proven suitable for both its electrical and chemical properties. Measurements of the trace elements were performed using the ATOMIKA 8030C TXRF spectrometer, with the option of variable incident angles. In first experiments an artificial sea-water matrix containing various trace elements in the μg/L range has been used. Elements such as Cr, Mn, Fe, Co, Ni, Cu, Zn, Ag, Cd, Hg, and Pb were deposited on glassy carbon carriers. The deposition can be optimized by controlling the potential of the working electrode with respect to the reference electrode. Metal ions with a suitable standard potential are reduced to the metallic state and plated onto the electrode surface. When deposition is finished, the sample carrier is demounted, rinsed with ultra-pure water and measured directly. Deposition yields for the elements under investigation are quite similar, and with an appropriate choice of the reference element, quantification can be achieved directly by internal standardization. The influence of parameters such as time, pH value, and trace element concentration on the deposition yield has been examined, and the results will be presented along with reproducibility studies. (author)
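    The final quantification by internal standardization reduces to a ratio of fluorescence intensities scaled by relative sensitivity; a minimal sketch in which the symbols and the sensitivity factor are generic assumptions, not instrument-specific values:

```python
def conc_internal_standard(i_analyte, i_istd, c_istd, rel_sensitivity=1.0):
    """C_x = (I_x / I_is) * C_is / S, where S is the analyte's fluorescence
    sensitivity relative to the internal standard (assumed known from
    calibration of the spectrometer)."""
    return (i_analyte / i_istd) * c_istd / rel_sensitivity
```

    Because the analyte and the reference element are deposited with similar yields, the intensity ratio is insensitive to the absolute deposition efficiency, which is what makes direct internal standardization possible here.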

  2. Principal Components Analysis on the spectral Bidirectional Reflectance Distribution Function of ceramic colour standards.

    Science.gov (United States)

    Ferrero, A; Campos, J; Rabal, A M; Pons, A; Hernanz, M L; Corróns, A

    2011-09-26

    The Bidirectional Reflectance Distribution Function (BRDF) is essential to characterize an object's reflectance properties. This function depends both on the various illumination-observation geometries as well as on the wavelength. As a result, the comprehensive interpretation of the data becomes rather complex. In this work we assess the use of the multivariate analysis technique of Principal Components Analysis (PCA) applied to the experimental BRDF data of a ceramic colour standard. It will be shown that the result may be linked to the various reflection processes occurring on the surface, assuming that the incoming spectral distribution is affected by each one of these processes in a specific manner. Moreover, this procedure facilitates the task of interpolating a series of BRDF measurements obtained for a particular sample. © 2011 Optical Society of America
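    The heart of PCA is extracting the leading eigenvectors of the data covariance matrix; a dependency-free sketch of the first component via power iteration (illustrative, not the authors' implementation):

```python
def first_principal_component(X, iters=200):
    """Leading principal component of row-data X, found by power iteration
    on the sample covariance matrix (never formed explicitly)."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]  # center data
    v = [1.0] * d
    for _ in range(iters):
        proj = [sum(x[j] * v[j] for j in range(d)) for x in Xc]   # Xc v
        w = [sum(p * x[j] for p, x in zip(proj, Xc)) / (n - 1)    # Xc^T(Xc v)/(n-1)
             for j in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v
```

    Applied to spectral BRDF data, each row would be one illumination-observation geometry and each column a wavelength, so the leading components capture the dominant spectral reflection processes.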

  3. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.
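    The clustering step mentioned above can be illustrated with plain Lloyd's k-means on a scalar morphological feature (e.g. acini count per TDLU; the data below are invented and the paper's actual clustering method is not specified in this record):

```python
def kmeans_1d(values, k=2, iters=50):
    """Plain Lloyd's k-means on scalar features: assign each value to the
    nearest center, then recompute each center as its cluster mean."""
    centers = sorted(values)[::max(1, len(values) // k)][:k]   # spread-out init
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters
```

    On involution data, such clusters would separate TDLUs with few acini (involuted) from those with many (non-involuted), which is the kind of structure the machine learning analysis looks for.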

  4. Techniques of production and analysis of polarized synchrotron radiation

    International Nuclear Information System (INIS)

    Mills, D.M.

    1992-01-01

    The use of the unique polarization properties of synchrotron radiation in the hard x-ray spectral region (E > 3 keV) is becoming increasingly important to many synchrotron radiation researchers. The radiation emitted from bending magnets and conventional (planar) insertion devices (IDs) is highly linearly polarized in the plane of the particle's orbit. Elliptically polarized x-rays can also be obtained by going off axis on a bending magnet source, albeit with considerable loss of flux. The polarization properties of synchrotron radiation can be further tailored to the researcher's specific needs through the use of specialized insertion devices such as helical and crossed undulators and asymmetrical wigglers. Even with the possibility of producing a specific polarization, there is still the need to develop x-ray optical components which can manipulate the polarization for both analysis and further modification of the polarization state. A survey of techniques for producing and analyzing both linear and circular polarized x-rays will be presented with emphasis on those techniques which rely on single crystal optical components.

  5. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture, even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general, above that size, uncertainties in rate constant and thermodynamic data do not allow products from mixed hydrocarbon pyrolyses to be predicted a priori using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  6. A Cross-State Analysis of Renewable Portfolio Standard Development

    Science.gov (United States)

    Marchand, Mariel

    As of December 2016, thirty-seven states have a renewable portfolio standard (RPS). An RPS requires that utilities provide a certain percentage of their electricity from renewable sources by a certain date. This thesis builds on the diffusion-of-innovation literature to understand how factors within a state, such as its political climate and the strength of interest groups, appear to influence the adoption process and structure of the RPS in five states--Connecticut, New Jersey, Michigan, Colorado, and Washington. Each of these states has a strong RPS as measured by its renewable energy goal relative to its current renewable energy production, the time frame in which this goal must be met, and the percentage of the electric load that is included in the regulation. This thesis uses both within-case and cross-case analysis to understand which combinations of internal state factors potentially lead to the adoption of a strong RPS. It finds that a number of combinations of factors appear to contribute to a strong RPS, depending on the internal circumstances of each state. More important, however, is that without the opportunity to tailor the policy to the needs of the state, states with unfavorable internal factors may not adopt an RPS at all, let alone a strong one. While the innovation factors identified in the RPS diffusion research often contribute to states adopting a strong RPS, this thesis finds that the influence of these factors depends on how the internal state factors combine with the RPS adoption process in shaping the structure of the RPS.

  7. Standardization of sample collection, isolation and analysis methods in extracellular vesicle research

    Directory of Open Access Journals (Sweden)

    Kenneth W. Witwer

    2013-05-01

    Full Text Available The emergence of publications on extracellular RNA (exRNA and extracellular vesicles (EV has highlighted the potential of these molecules and vehicles as biomarkers of disease and therapeutic targets. These findings have created a paradigm shift, most prominently in the field of oncology, prompting expanded interest in the field and dedication of funds for EV research. At the same time, understanding of EV subtypes, biogenesis, cargo and mechanisms of shuttling remains incomplete. The techniques that can be harnessed to address the many gaps in our current knowledge were the subject of a special workshop of the International Society for Extracellular Vesicles (ISEV in New York City in October 2012. As part of the “ISEV Research Seminar: Analysis and Function of RNA in Extracellular Vesicles (evRNA”, 6 round-table discussions were held to provide an evidence-based framework for isolation and analysis of EV, purification and analysis of associated RNA molecules, and molecular engineering of EV for therapeutic intervention. This article arises from the discussion of EV isolation and analysis at that meeting. The conclusions of the round table are supplemented with a review of published materials and our experience. Controversies and outstanding questions are identified that may inform future research and funding priorities. While we emphasize the need for standardization of specimen handling, appropriate normative controls, and isolation and analysis techniques to facilitate comparison of results, we also recognize that continual development and evaluation of techniques will be necessary as new knowledge is amassed. On many points, consensus has not yet been achieved and must be built through the reporting of well-controlled experiments.

  8. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
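
    One pulse shape analysis cut commonly used with point-contact germanium detectors compares the maximum current amplitude to the event energy (an "A/E-style" cut) to reject multi-site gamma backgrounds. The sketch below uses synthetic toy pulses and thresholds, purely as an illustrative assumption, not the DEMONSTRATOR's actual analysis chain:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_pulse(n_sites):
    """Toy charge pulse: a sum of n_sites sigmoidal drift arrivals (synthetic)."""
    t = np.linspace(0, 1, 200)
    pulse = np.zeros_like(t)
    for _ in range(n_sites):
        t0 = rng.uniform(0.3, 0.7)
        pulse += (1.0 / n_sites) / (1 + np.exp(-(t - t0) / 0.01))
    return pulse

def a_over_e(pulse):
    """Max current amplitude (charge derivative) over total energy (final charge)."""
    current = np.diff(pulse)
    return current.max() / pulse[-1]

# Single-site events look signal-like; multi-site events mimic gamma background.
single = [a_over_e(make_pulse(1)) for _ in range(200)]
multi = [a_over_e(make_pulse(3)) for _ in range(200)]

cut = np.percentile(single, 5)  # keep 95% of signal-like events
survival = np.mean(np.array(multi) >= cut)
print(f"multi-site survival after A/E-style cut: {survival:.2f}")
```

    Because each of the three charge deposits contributes only a fraction of the total energy, the multi-site current peaks are lower and most such events fall below the cut.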

  9. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity on gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, judging by the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been filed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes in electrophoretic gel supports and its variations. The new advances of this method are focused mainly on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they largely coincide in the study of matrix metalloproteases. The field is expected to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  10. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)
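
    The conventional flux-weighting approximation discussed in this record preserves region-wise reaction rates by construction. A minimal numerical sketch, with made-up cross sections, fluxes and volumes for a two-region assembly:

```python
import numpy as np

# Hypothetical two-region assembly (illustrative values only).
sigma = np.array([0.10, 0.30])   # macroscopic cross sections per region (1/cm)
phi = np.array([1.4, 0.6])       # region-averaged fluxes (arbitrary units)
vol = np.array([2.0, 1.0])       # region volumes (cm^3)

# Conventional flux-volume weighting:
#   sigma_hom = sum(sigma_i * phi_i * V_i) / sum(phi_i * V_i)
sigma_hom = np.sum(sigma * phi * vol) / np.sum(phi * vol)
print(f"homogenized cross section: {sigma_hom:.4f} 1/cm")

# Check: the total reaction rate is preserved by construction,
# which is the sense in which the approximation is "consistent".
rate_fine = np.sum(sigma * phi * vol)
rate_hom = sigma_hom * np.sum(phi * vol)
print(f"reaction rate (fine vs homogenized): {rate_fine:.4f} vs {rate_hom:.4f}")
```

    The homogenization error discussed in the review arises because the flux used for weighting is itself computed from an approximate (e.g. single-assembly) calculation, not because this weighting formula fails to conserve reaction rates.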

  11. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  12. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage. The Android platform allows developers to freely access and modify source code, but this openness also increases security risks: a user may download and install malicious applications written by attackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques to mitigate known vulnerabilities. This detailed analysis helps identify existing loopholes and gives strategic direction for making the Android operating system more secure.

  13. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. The movement is continuous, so that in execution the transition from phase to phase is not noticeable. In the previously described four phases of the O'Brian shot put technique, a large gap and disconnection appear between the initial-position phase and the phase of overtaking the device. In training methods and technique instruction in primary and secondary education, as well as for students and beginning shot put athletes, this represents a major problem for linking the phases, for training and for technique advancement. Therefore, this work aims to facilitate the teaching of shot put technique by extending the analysis from four to six phases, which are described and cover the complete O'Brian technique.

  14. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  15. Multiplex Ligation-Dependent Probe Amplification Technique for Copy Number Analysis on Small Amounts of DNA Material

    DEFF Research Database (Denmark)

    Sørensen, Karina; Andersen, Paal; Larsen, Lars

    2008-01-01

    The multiplex ligation-dependent probe amplification (MLPA) technique is a sensitive technique for relative quantification of up to 50 different nucleic acid sequences in a single reaction, and the technique is routinely used for copy number analysis in various syndromes and diseases. The aim of the study was to exploit the potential of MLPA when the DNA material is limited. The DNA concentration required in standard MLPA analysis is not attainable from dried blood spot samples (DBSS) often used in neonatal screening programs. A novel design of MLPA probes has been developed to permit MLPA analysis on small amounts of DNA. Six patients with congenital adrenal hyperplasia (CAH) were used in this study. DNA was extracted from both whole blood and DBSS and subjected to MLPA analysis using normal and modified probes. Results were analyzed using GeneMarker and manual Excel analysis. A total...
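
    The relative quantification underlying MLPA can be sketched as a dosage-quotient calculation: target-probe peak heights are normalized to reference probes within each sample, then compared between patient and control. The probe names and peak heights below are hypothetical illustrations (CYP21A2 is the gene commonly deleted in CAH), not data from the study:

```python
# Hypothetical MLPA-style peak heights (arbitrary fluorescence units).
patient = {"CYP21A2_ex3": 480.0, "REF1": 1000.0, "REF2": 1040.0}
control = {"CYP21A2_ex3": 950.0, "REF1": 980.0, "REF2": 1020.0}

def normalized(sample, probe, refs=("REF1", "REF2")):
    """Peak height of a target probe relative to the mean of reference probes."""
    ref_mean = sum(sample[r] for r in refs) / len(refs)
    return sample[probe] / ref_mean

# Dosage quotient: ~1.0 = two copies, ~0.5 = heterozygous deletion,
# ~1.5 = duplication (common interpretation thresholds).
ratio = normalized(patient, "CYP21A2_ex3") / normalized(control, "CYP21A2_ex3")
print(f"dosage quotient: {ratio:.2f}")
```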

  16. Russian Language Development Assessment as a Standardized Technique for Assessing Communicative Function in Children Aged 3–9 Years

    Directory of Open Access Journals (Sweden)

    Prikhoda N.A.,

    2016-10-01

    Full Text Available The article describes the Russian Language Development Assessment, a standardized individual diagnostic tool for children aged from 3 to 9 that helps to assess the following components of a child’s communicative function: passive vocabulary, expressive vocabulary, knowledge of semantic constructs with logical, temporal and spatial relations, passive perception and active use of syntactic and morphological features of words in a sentence, active and passive phonological awareness, and active and passive knowledge of syntactic structures and categories. The article provides descriptions of content and diagnostic procedures for all 7 subtests included in the assessment (Passive Vocabulary, Active Vocabulary, Linguistic Operators, Sentence Structure, Word Structure, Phonology, Sentence Repetition). Based on the data collected in a study that involved 86 first-graders of a Moscow school, the article analyzes the internal consistency and construct validity of each subtest of the technique. It concludes that the Russian Language Development Assessment can be of much use both for diagnostic purposes and for supporting children with ASD, given the lack of standardized tools for assessing language and speech development in Russian and the importance of such a measure in general.

  17. A comparative study of standard vs. high definition colonoscopy for adenoma and hyperplastic polyp detection with optimized withdrawal technique.

    Science.gov (United States)

    East, J E; Stavrindis, M; Thomas-Gibson, S; Guenther, T; Tekkis, P P; Saunders, B P

    2008-09-15

    Colonoscopy has a known miss rate for polyps and adenomas. High definition (HD) colonoscopes may allow detection of subtle mucosal change, potentially aiding detection of adenomas and hyperplastic polyps. To compare detection rates between HD and standard definition (SD) colonoscopy. Prospective, cohort study with optimized withdrawal technique (withdrawal time >6 min, antispasmodic, position changes, re-examining flexures and folds). One hundred and thirty patients attending for routine colonoscopy were examined with either SD (n = 72) or HD (n = 58) colonoscopes. Groups were well matched. Sixty per cent of patients had at least one adenoma detected with SD vs. 71% with HD, P = 0.20, relative risk (benefit) 1.32 (95% CI 0.85-2.04). Eighty-eight adenomas (mean +/- standard deviation 1.2 +/- 1.4) were detected using SD vs. 93 (1.6 +/- 1.5) with HD, P = 0.12; however, more nonflat, diminutive (<9 mm) lesions were detected with HD; the difference in hyperplastic polyp detection was 7% (0.09 +/- 0.36). High definition did not lead to a significant increase in adenoma or hyperplastic polyp detection, but may help where comprehensive lesion detection is paramount. High detection rates appear possible with either SD or HD when using an optimized withdrawal technique.

  18. Comparison of global sensitivity analysis techniques and importance measures in PSA

    International Nuclear Information System (INIS)

    Borgonovo, E.; Apostolakis, G.E.; Tarantola, S.; Saltelli, A.

    2003-01-01

    This paper discusses application and results of global sensitivity analysis techniques to probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell-Vesely importance at the parameter level. Results are discussed for the large LLOCA sequence of the advanced test reactor PSA
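
    Variance-based global sensitivity analysis of the kind compared here can be illustrated with first-order Sobol indices estimated by brute-force double-loop sampling. The two-parameter model below is an illustrative stand-in for a PSA top-event expression, not the paper's reactor model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: Y = X1 + 0.2 * X2 with X1, X2 ~ Uniform(0, 1),
# so X1 should dominate the output variance.
def model(x1, x2):
    return x1 + 0.2 * x2

n_outer, n_inner = 200, 200

def first_order_index(which):
    """S_i = Var(E[Y | X_i]) / Var(Y), by double-loop Monte Carlo."""
    cond_means = []
    for _ in range(n_outer):
        xi = rng.random()                       # fix X_i at a sampled value
        if which == 1:
            ys = model(xi, rng.random(n_inner))  # average over X2
        else:
            ys = model(rng.random(n_inner), xi)  # average over X1
        cond_means.append(ys.mean())
    x1 = rng.random(100_000)
    x2 = rng.random(100_000)
    return np.var(cond_means) / np.var(model(x1, x2))

s1, s2 = first_order_index(1), first_order_index(2)
print(f"S1 = {s1:.2f}, S2 = {s2:.2f}")
```

    Analytically S1 = (1/12)/(1.04/12) ≈ 0.96 and S2 ≈ 0.04, so the estimates should rank X1 far above X2, mirroring how a global index ranks contributors to model uncertainty.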

  19. Development of isotope dilution-liquid chromatography/mass spectrometry combined with standard addition techniques for the accurate determination of tocopherols in infant formula

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joonhee; Jang, Eun-Sil; Kim, Byungjoo, E-mail: byungjoo@kriss.re.kr

    2013-07-17

    Highlights: •ID-LC/MS method showed biased results for tocopherols analysis in infant formula. •H/D exchange of deuterated tocopherols in sample preparation was the source of bias. •Standard addition (SA)-ID-LC/MS was developed as an alternative to ID-LC/MS. •Details of calculation and uncertainty evaluation of the SA-IDMS are described. •SA-ID-LC/MS showed a higher-order metrological quality as a reference method. Abstract: During the development of isotope dilution-liquid chromatography/mass spectrometry (ID-LC/MS) for tocopherol analysis in infant formula, biased measurement results were observed when deuterium-labeled tocopherols were used as internal standards. It turned out that the biases came from intermolecular H/D exchange and intramolecular H/D scrambling of the internal standards during sample preparation. The degree of H/D exchange and scrambling showed considerable dependence on the sample matrix. Standard addition-isotope dilution mass spectrometry (SA-IDMS) based on LC/MS was developed in this study to overcome the shortcomings of using deuterium-labeled internal standards while retaining the inherent advantage of isotope dilution techniques for accurate recovery correction in sample preparation. Details of the experimental scheme, calculation equations, and uncertainty evaluation scheme are described in this article. The proposed SA-IDMS method was applied to several infant formula samples to test its validity. The method was proven to have a higher-order metrological quality, providing very accurate and precise measurement results.
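
    The standard-addition principle at the core of the SA-IDMS method can be sketched as a linear extrapolation: known amounts of standard are spiked into aliquots of the same sample, and the unspiked analyte content is the x-axis intercept of the fitted line. The spike amounts and responses below are invented for illustration (the actual method additionally applies isotope-dilution recovery correction):

```python
import numpy as np

# Hypothetical standard-addition data for a tocopherol (illustrative only).
added = np.array([0.0, 5.0, 10.0, 15.0])       # ug of standard added
response = np.array([12.1, 18.0, 24.2, 30.1])  # instrument signal (a.u.)

# Fit response = slope * added + intercept; the unknown amount in the
# aliquot is the extrapolated x-intercept magnitude: c0 = intercept / slope.
slope, intercept = np.polyfit(added, response, 1)
c0 = intercept / slope
print(f"estimated analyte in aliquot: {c0:.2f} ug")
```

    Because the calibration is performed in the sample's own matrix, matrix effects on the response factor largely cancel, which is why standard addition is attractive when matrix dependence (as with the H/D exchange found here) biases other calibration schemes.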

  20. 78 FR 45447 - Revisions to Modeling, Data, and Analysis Reliability Standard

    Science.gov (United States)

    2013-07-29

    ...; Order No. 782] Revisions to Modeling, Data, and Analysis Reliability Standard AGENCY: Federal Energy... Analysis (MOD) Reliability Standard MOD- 028-2, submitted to the Commission for approval by the North... Organization. The Commission finds that the proposed Reliability Standard represents an improvement over the...

  1. Evaluation of pressed powders and thin section standards for multi-elemental analysis by conventional and micro-PIXE analysis

    International Nuclear Information System (INIS)

    Homma-Takeda, Shino; Iso, Hiroyuki; Ito, Masaki

    2010-01-01

    For multi-elemental analysis, various standards are used to quantify the elements present in environmental and biological samples. In this paper, standards of two different configurations, pressed powders and thin sections, were assessed for their suitability as standards in conventional and micro-PIXE analysis. The homogeneity of manganese, iron, zinc (Zn), copper and yttrium added to the pressed powder standard materials was validated; the relative standard deviation (RSD) of the X-ray intensity of the standards was <2% across the analyzed area, and the metal concentrations were acceptable. (author)

  2. DPASV analytical technique for ppb level uranium analysis

    Science.gov (United States)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    Determining uranium at the ppb level is crucial for the reuse of water originating in the nuclear industry during decontamination of plant effluents generated in uranium (fuel) production, fuel rod fabrication and reactor operation, as well as of the comparatively small volumes of effluent from laboratory research and development work. Uranium at percentage levels can be analyzed by gravimetry, titration, etc., whereas inductively coupled plasma-atomic emission spectroscopy (ICP-AES) and fluorimetry are well suited to the ppm level. For the ppb level, inductively coupled plasma-mass spectrometry (ICP-MS) or differential pulse anodic stripping voltammetry (DPASV) serve the purpose. High precision, accuracy and sensitivity are crucial for uranium analysis at the trace (ppb) level, and both ICP-MS and stripping voltammetry satisfy these requirements. The voltammeter is less expensive, requires little maintenance and is convenient for measuring uranium in the presence of a large number of other ions in waste effluent. In this paper, the necessity of quantifying uranium concentration for recovery as well as for safe disposal of plant effluent, the working mechanism of the voltammeter for ppb-level uranium analysis with its standard deviation, and a comparison of the data with ICP-MS are presented.

  3. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2011-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. Assessing the blade structural integrity is a complex task requiring an initial characterization of whether resonance is possible and then performing a forced response analysis if that condition is met. The standard technique for forced response analysis in rocket engines is to decompose a CFD-generated flow field into its harmonic components, and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicates that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. A substantial effort has been made to account for this denser spatial Fourier content in frequency response analysis (described in another paper by the author), but the question still remains whether the frequency response analysis itself is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis, therefore, of bladed-disks undergoing this complex flow environment have been performed. The first is of a bladed disk with each blade modeled by simple beam elements. Six loading cases were generated by varying a baseline harmonic excitation in different ways based upon cold-flow testing from Heritage Fuel Air Turbine Test. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed-disk excited by the same CFD used in the J2X engine program. It was hypothesized that enforcing periodicity in the CFD (inherent in the frequency response technique) would overestimate the

  4. Emotional Freedom Techniques for Anxiety: A Systematic Review With Meta-analysis.

    Science.gov (United States)

    Clond, Morgan

    2016-05-01

    Emotional Freedom Technique (EFT) combines elements of exposure and cognitive therapies with acupressure for the treatment of psychological distress. Randomized controlled trials retrieved by literature search were assessed for quality using the criteria developed by the American Psychological Association's Division 12 Task Force on Empirically Validated Treatments. As of December 2015, 14 studies (n = 658) met inclusion criteria. Results were analyzed using an inverse variance weighted meta-analysis. The pre-post effect size for the EFT treatment group was 1.23 (95% confidence interval, 0.82-1.64; p < 0.001). EFT treatment demonstrated a significant decrease in anxiety scores, even when accounting for the effect size of control treatment. However, there were too few data available comparing EFT to standard-of-care treatments such as cognitive behavioral therapy, and further research is needed to establish the relative efficacy of EFT compared with established protocols.
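
    Inverse variance weighted pooling of the kind used in this review can be sketched as follows; the per-study effect sizes and variances below are invented placeholders, not the 14 included trials:

```python
import math

# Hypothetical per-study standardized effect sizes and their variances.
effects = [1.10, 0.85, 1.40, 1.62, 0.95]
variances = [0.10, 0.08, 0.15, 0.20, 0.12]

# Fixed-effect inverse-variance pooling: each study is weighted by 1/variance,
# so more precise studies contribute more to the pooled estimate.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled effect: {pooled:.2f}  95% CI: ({ci[0]:.2f}, {ci[1]:.2f})")
```

    A random-effects model (e.g. DerSimonian-Laird) would additionally inflate the variances by a between-study heterogeneity term before weighting.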

  5. Vehicle Codes and Standards: Overview and Gap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, C.; Buttner, W.; Rivkin, C.

    2010-02-01

    This report identifies gaps in vehicle codes and standards and recommends ways to fill the gaps, focusing on six alternative fuels: biodiesel, natural gas, electricity, ethanol, hydrogen, and propane.

  6. Tooth contact analysis of spur gears. Part 1-SAM analysis of standard gears

    Directory of Open Access Journals (Sweden)

    Creţu Spiridon

    2017-01-01

    Full Text Available Involute gears are sensitive to misalignment of their axes, which causes transmission errors and perturbs the pressure distribution along the tooth flank. The concentrated contacts in misaligned gears are no longer of Hertz type. A semi-analytical method was developed to find the contact area, pressure distribution and subsurface stress state. The matrix of initial separations is found analytically for standard and non-standard spur gears. The presence of misalignment, as well as flank crowning and flank end relief, is included in the numerical analysis process.

  7. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, cokriging, which is an extension of kriging, is proposed for calculating structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
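
    A minimal kriging (Gaussian-process interpolation) sketch illustrates the surrogate-modeling idea that cokriging extends; the 1-D response function and design points are arbitrary assumptions standing in for an implicit limit state function, and the gradient information that cokriging would add is omitted:

```python
import numpy as np

# Hypothetical 1-D response function sampled at a few design points.
def f(x):
    return np.sin(3 * x) + 0.5 * x

x_train = np.array([0.0, 0.4, 0.9, 1.5, 2.0])
y_train = f(x_train)

def gauss_cov(a, b, length=0.3):
    """Squared-exponential covariance between two sets of 1-D points."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length**2))

# Simple (zero-mean) kriging predictor; a tiny nugget aids numerical stability.
K = gauss_cov(x_train, x_train) + 1e-10 * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

def predict(x_new):
    """Kriging mean prediction at new points."""
    return gauss_cov(x_train, np.atleast_1d(x_new)).T @ alpha

print(f"prediction at 0.7: {predict(0.7)[0]:.3f} (true {f(0.7):.3f})")
```

    The predictor interpolates the training data exactly; cokriging augments the covariance system with gradient observations so that fewer expensive samples are needed to reach a given surrogate accuracy.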

  8. IEEE standard requirements for reliability analysis in the design and operation of safety systems for nuclear power generating stations

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The purpose of this standard is to provide uniform, minimum acceptable requirements for the performance of reliability analyses for safety-related systems found in nuclear-power generating stations, but not to define the need for an analysis. The need for reliability analysis has been identified in other standards which expand the requirements of regulations (e.g., IEEE Std 379-1972 (ANSI N41.2-1972), ''Guide for the Application of the Single-Failure Criterion to Nuclear Power Generating Station Protection System,'' which describes the application of the single-failure criterion). IEEE Std 352-1975, ''Guide for General Principles of Reliability Analysis of Nuclear Power Generating Station Protection Systems,'' provides guidance in the application and use of the reliability techniques referred to in this standard.

  9. Differences of standard values of Supersonic shear imaging and ARFI technique - in vivo study of testicular tissue.

    Science.gov (United States)

    Trottmann, M; Rübenthaler, J; Marcon, J; Stief, C G; Reiser, M F; Clevert, D A

    2016-01-01

    To investigate the difference in standard values of Supersonic shear imaging (SSI) and the Acoustic Radiation Force Impulse (ARFI) technique in the evaluation of testicular tissue stiffness in vivo. 58 healthy male testes were examined using B-mode sonography, ARFI and SSI. B-mode sonography was performed in order to scan the testis for pathologies, followed by real-time elastography in three predefined areas (upper pole, central portion and lower pole) using the SuperSonic® Aixplorer ultrasound device (SuperSonic Imagine, Aix-en-Provence, France). Afterwards, a second assessment of the same testicular regions by elastography followed, using the ARFI technique of the Siemens Acuson 2000™ ultrasound device (Siemens Health Care, Germany). Values of shear wave velocity were reported in m/s. Parameters of the two elastography techniques were compared using a paired-sample t-test. The values of SSI were significantly higher in all measured areas compared to ARFI (p < 0.001 to p = 0.015). Quantitatively, the mean SSI shear wave velocity was 1.1 m/s, compared to 0.8 m/s measured by ARFI. SSI values are significantly higher than ARFI values when measuring the stiffness of testicular tissue and should only be compared with caution.
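
The paired-sample t-test used for the SSI vs. ARFI comparison can be sketched as follows. The velocity values are synthetic stand-ins whose means mimic the reported 1.1 m/s and 0.8 m/s; the spread is an assumption, not the study's data.

```python
import math
import random

random.seed(1)
n = 58
# Synthetic shear-wave velocities (m/s); means mimic the reported
# 1.1 m/s (SSI) vs 0.8 m/s (ARFI), the standard deviation is assumed
ssi  = [random.gauss(1.1, 0.15) for _ in range(n)]
arfi = [random.gauss(0.8, 0.15) for _ in range(n)]

# Paired design: test whether the per-testis differences average to zero
diffs = [s - a for s, a in zip(ssi, arfi)]
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
t = mean_d / (sd_d / math.sqrt(n))   # paired-sample t statistic, df = n - 1
print(round(t, 2))
```

With 58 pairs and a 0.3 m/s mean difference, the t statistic lands far beyond any conventional critical value, consistent with the reported p < 0.001 in most regions.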

  10. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks process each hemodynamic parameter individually and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used for data collection from 50 subjects; (2) the acquired position and amplitude of the onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluate the trained models, using a majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels, retrieved from the multi-parametric approach, and positive AIx values was verified. This approach allows for the design of new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving the understanding of the arterial pulse, especially when compared to traditional single-parameter analysis, where failure in measuring one parameter component, such as Pi, can jeopardize the whole evaluation.
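
The majority-voting step over the four trained classifiers (step 5) can be sketched independently of any ML library. The per-classifier outputs and the 'low'/'high' risk labels below are hypothetical, purely to show the combination rule.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the labels emitted by several classifiers for one
    sample by simple majority vote (ties go to the first-seen label)."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical per-sample outputs from four trained classifiers
# (Random Forest, BayesNet, J48, RIPPER), in that order
per_sample = [
    ['high', 'high', 'low', 'high'],
    ['low', 'low', 'low', 'high'],
]
labels = [majority_vote(p) for p in per_sample]
print(labels)  # ['high', 'low']
```

The ensemble label is then what gets compared against the sign of the calculated Augmentation Index during validation.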

  11. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. 
The results showed that

  12. Confirmatory factors analysis of science teacher leadership in the Thailand world-class standard schools

    Science.gov (United States)

    Thawinkarn, Dawruwan

    2018-01-01

    This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating scale questionnaire with reliability 0.986. The sample group included 500 science teachers from World-Class Standard Schools who had been selected by using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted by using Mplus for Windows. The results are as follows: The results of confirmatory factor analysis on science teacher leadership in the Thailand World-Class Standard Schools revealed that the model significantly correlated with the empirical data. The fit index values were χ2 = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the 0.01 level. The loadings of the six factors ranged from 0.871 to 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, the role model in teaching, transformational leaders, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.

  13. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach was described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., changing the distribution type of the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of assumed probability distributions. It was indicated that Monte Carlo simulation by Latin Hypercube Sampling and subsequent sensitivity analyses were useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. It was shown from the change of distribution types of the input parameters that the values calculated by the deterministic method fell around the 40th-50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of inputs, however, the values calculated by the deterministic method fell around the 85th percentile of the output distribution function. The results of the sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed strongly to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
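
The general mechanics described, three-estimate inputs mapped to probability distributions, propagated by Latin Hypercube Sampling, with the deterministic value located as a percentile of the output, can be sketched as below. The component names and low/mode/high dollar figures are illustrative assumptions, not the paper's data, and triangular distributions stand in for whichever distributions the study assumed.

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n, d):
    """n stratified samples in d dimensions on (0, 1): one sample per
    stratum in each dimension, strata paired by independent shuffles."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Hypothetical low/mode/high (three-estimate) cost inputs, $/kgU
components = {
    "U purchase":  (30.0, 50.0, 90.0),
    "conversion":  (4.0, 6.0, 9.0),
    "enrichment":  (80.0, 100.0, 130.0),
    "fabrication": (200.0, 250.0, 320.0),
}

n = 10_000
u = latin_hypercube(n, len(components))
total = np.zeros(n)
for j, (lo, mode, hi) in enumerate(components.values()):
    f = (mode - lo) / (hi - lo)          # triangular CDF value at the mode
    uj = u[:, j]
    total += np.where(
        uj < f,
        lo + np.sqrt(uj * (hi - lo) * (mode - lo)),          # left branch
        hi - np.sqrt((1.0 - uj) * (hi - lo) * (hi - mode)),  # right branch
    )

deterministic = sum(m for _, m, _ in components.values())    # point estimate
pct = float(np.mean(total <= deterministic) * 100.0)
print(round(pct, 1))   # percentile of the deterministic value
```

Because right-skewed inputs pull the simulated mean above the sum of the modes, the deterministic point estimate lands below the 50th percentile of the output, the same qualitative effect the abstract reports when switching distribution types.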

  14. Platinum stable isotope analysis of geological standard reference materials by double-spike MC-ICPMS.

    Science.gov (United States)

    Creech, J B; Baker, J A; Handler, M R; Bizzarro, M

    2014-01-10

    We report a method for the chemical purification of Pt from geological materials by ion-exchange chromatography for subsequent Pt stable isotope analysis by multiple-collector inductively coupled plasma mass spectrometry (MC-ICPMS), using a 196Pt-198Pt double-spike to correct for instrumental mass bias. Double-spiking of samples was carried out prior to digestion and chemical separation to correct for any mass-dependent fractionation that may occur due to incomplete recovery of Pt. Samples were digested using a NiS fire assay method, which pre-concentrates Pt into a metallic bead that is readily dissolved in acid in preparation for anion-exchange chemistry. Pt was recovered from the anion-exchange resin in concentrated HNO3 acid after elution of matrix elements, including the other platinum group elements (PGE), in dilute HCl and HNO3 acids. The separation method has been calibrated using a precious metal standard solution doped with a range of synthetic matrices and results in Pt yields of ≥90% with purity of ≥95%. Using this chemical separation technique, we have separated Pt from 11 international geological standard reference materials comprising PGE ores, mantle rocks, igneous rocks and one sample from the Cretaceous-Paleogene boundary layer. Pt concentrations in these samples range from ca. 5 ng g-1 to 4 μg g-1. This analytical method has been shown to have an external reproducibility on δ198Pt (the permil difference in the 198Pt/194Pt ratio from the IRMM-010 standard) of ±0.040 (2 sd) on Pt solution standards (Creech et al., 2013, J. Anal. At. Spectrom. 28, 853-865). The reproducibility in natural samples is evaluated by processing multiple replicates of four standard reference materials, and is conservatively taken to be ca. ±0.088 (2 sd). Pt stable isotope data for the full set of reference materials have a range of δ198Pt values with offsets of up to 0.4‰ from the IRMM-010 standard, which are readily resolved with this technique. These

  15. Standardizing economic analysis in prevention will require substantial effort.

    Science.gov (United States)

    Guyll, Max

    2014-12-01

    It is exceedingly difficult to compare results of economic analyses across studies due to variations in assumptions, methodology, and outcome measures, a fact which surely decreases the impact and usefulness of prevention-related economic research. Therefore, Crowley et al. (Prevention Science, 2013) are precisely correct in their call for increased standardization and have usefully highlighted the issues that must be addressed. However, having made the need clear, the questions become what form the solution should take and how it should be implemented. The present discussion outlines the rudiments of a comprehensive framework for promoting standardized methodology in the estimation of economic outcomes, as encouraged by Crowley et al. In short, a single, standard, reference-case approach should be clearly articulated, all economic research should be encouraged to apply that standard approach, and results from compliant analyses should be reported in a central archive. Properly done, the process would increase the ability of those without specialized training to contribute to the body of economic research pertaining to prevention, and the most difficult tasks of predicting and monetizing distal outcomes would be readily completed through predetermined models. These recommendations might be viewed as somewhat forceful, inasmuch as they advocate prescribing the details of a standard methodology and establishing a means of verifying compliance. However, it is unclear that the best practices proposed by Crowley et al. will be widely adopted in the absence of a strong and determined approach.

  16. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    Science.gov (United States)

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of the underlying chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples would serve as a critical foundation for this field but have not been developed. We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysis. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7×10^7) was most effective for cell lysis when compared to mortar-and-pestle (2.6×10^7), ball mill followed by ultrasonication (1.6×10^7), mortar-and-pestle followed by ultrasonication (1.4×10^7), and homogenization (trial 1: 8.4×10^6; trial 2: 1.6×10^7). Due to the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysis of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.

  17. Oncoplastic round block technique has comparable operative parameters as standard wide local excision: a matched case-control study.

    Science.gov (United States)

    Lim, Geok-Hoon; Allen, John Carson; Ng, Ruey Pyng

    2017-08-01

    Although oncoplastic breast surgery is used to resect larger tumors with lower re-excision rates compared to standard wide local excision (sWLE), criticisms of oncoplastic surgery include a longer, albeit well-concealed, scar, longer operating time and hospital stay, and an increased risk of complications. The round block technique has been reported to be very suitable for patients with relatively smaller breasts and minimal ptosis. We aim to determine whether the round block technique results in operative parameters comparable with sWLE. Breast cancer patients who underwent a round block procedure from 1st May 2014 to 31st January 2016 were included in the study. These patients were then matched for the type of axillary procedure, on a one-to-one basis, with breast cancer patients who had undergone sWLE from 1st August 2011 to 31st January 2016. The operative parameters of the 2 groups were compared. 22 patients were included in the study. Patient demographics and histologic parameters were similar in the 2 groups. No complications were reported in either group. The mean operating time was 122 and 114 minutes in the round block and sWLE groups, respectively (P=0.64). Length of stay was similar in the 2 groups (P=0.11). Round block patients had better cosmesis and lower re-excision rates. A higher rate of recurrence was observed in the sWLE group. The round block technique has operative parameters comparable to sWLE with no evidence of increased complications. The lower re-excision rate and better cosmesis observed in the round block patients suggest that the round block technique is not only comparable in general, but may have advantages over sWLE in selected cases.

  18. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  19. Standard test method for isotopic analysis of uranium hexafluoride by double standard single-collector gas mass spectrometer method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This is a quantitative test method applicable to determining the mass percent of uranium isotopes in uranium hexafluoride (UF6) samples with 235U concentrations between 0.1 and 5.0 mass %. 1.2 This test method may be applicable for the entire range of 235U concentrations for which adequate standards are available. 1.3 This test method is for analysis by a gas magnetic sector mass spectrometer with a single collector using interpolation to determine the isotopic concentration of an unknown sample between two characterized UF6 standards. 1.4 This test method is to replace the existing test method currently published in Test Methods C761 and is used in the nuclear fuel cycle for UF6 isotopic analyses. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appro...
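
The two-standard bracketing described in 1.3 amounts to linear interpolation of the measured isotope-ratio signal between two characterized standards. The sketch below shows that arithmetic only; the ratio and assay values are illustrative, not calibration data from the test method.

```python
def interpolate_assay(r_unknown, r_low, assay_low, r_high, assay_high):
    """Two-standard bracketing: linearly interpolate the unknown's
    measured 235U/238U ion-current ratio between two characterized
    UF6 standards. All numbers used here are hypothetical."""
    frac = (r_unknown - r_low) / (r_high - r_low)
    return assay_low + frac * (assay_high - assay_low)

# Hypothetical measured ratios for standards of 2.0 and 4.0 mass % 235U
assay = interpolate_assay(0.0305, 0.0204, 2.0, 0.0417, 4.0)
print(round(assay, 3))  # 2.948
```

In practice the standards must bracket the unknown (per 1.1, between 0.1 and 5.0 mass % 235U) so that the interpolation never extrapolates outside the characterized range.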

  20. Gasoline taxes or efficiency standards? A heterogeneous household demand analysis

    International Nuclear Information System (INIS)

    Liu, Weiwei

    2015-01-01

    Using detailed consumer expenditure survey data and a flexible semiparametric dynamic demand model, this paper estimates the price elasticity and fuel efficiency elasticity of gasoline demand at the household level. The goal is to assess the effectiveness of gasoline taxes and vehicle fuel efficiency standards on fuel consumption. The results reveal substantial interaction between vehicle fuel efficiency and the price elasticity of gasoline demand: the improvement of vehicle fuel efficiency leads to lower price elasticity and weakens consumers’ sensitivity to gasoline price changes. The offsetting effect also differs across households due to demographic heterogeneity. These findings imply that when gasoline taxes are in place, tightening efficiency standards will partially offset the strength of taxes on reducing fuel consumption. - Highlights: • Model household gasoline demand using a semiparametric approach. • Estimate heterogeneous price elasticity and fuel efficiency elasticity. • Assess the effectiveness of gasoline taxes and efficiency standards. • Efficiency standards offset the impact of gasoline taxes on fuel consumption. • The offsetting effect differs by household demographics

  1. Feasibility of CBCT-based dose calculation: Comparative analysis of HU adjustment techniques

    International Nuclear Information System (INIS)

    Fotina, Irina; Hopfgartner, Johannes; Stock, Markus; Steininger, Thomas; Lütgendorf-Caucig, Carola; Georg, Dietmar

    2012-01-01

    Background and purpose: The aim of this work was to compare the accuracy of different HU adjustments for CBCT-based dose calculation. Methods and materials: Dose calculation was performed on CBCT images of 30 patients. In the first two approaches, phantom-based (Pha-CC) and population-based (Pop-CC) conversion curves were used. The third method (WAB) represents an override of the structures with standard densities for water, air and bone. In the ROI mapping approach, all structures were overridden with average HUs from the planning CT. All techniques were benchmarked against the Pop-CC and CT-based plans by DVH comparison and γ-index analysis. Results: For prostate plans, WAB and ROI mapping compared to Pop-CC showed differences in PTV Dmedian below 2%. The WAB and Pha-CC methods underestimated the bladder dose in IMRT plans. In lung cases, PTV coverage was underestimated by the Pha-CC method by 2.3% and slightly overestimated by the WAB and ROI techniques. The use of the Pha-CC method for head-neck IMRT plans resulted in differences in PTV coverage of up to 5%. Dose calculation with the WAB and ROI techniques showed better agreement with pCT than the conversion curve-based approaches. Conclusions: Density override techniques provide an accurate alternative to the conversion curve-based methods for dose calculation on CBCT images.
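
The two families of HU adjustment compared above can be sketched as follows: a piecewise-linear HU-to-density conversion curve (standing in for Pha-CC or Pop-CC) versus a WAB-style bulk override that assigns only air, water and bone densities. All curve points and thresholds below are assumptions for illustration, not the paper's calibration.

```python
import numpy as np

# Hypothetical piecewise-linear CBCT conversion curve (HU -> g/cm^3)
hu_pts      = np.array([-1000.0, 0.0, 1000.0, 3000.0])
density_pts = np.array([0.001, 1.0, 1.6, 2.8])

def hu_to_density(hu):
    # Conversion-curve approach: interpolate density from the HU value
    return np.interp(hu, hu_pts, density_pts)

def wab_override(hu):
    """WAB-style override: collapse every voxel to one of three bulk
    densities (air / water / bone), using assumed HU thresholds."""
    return np.where(hu < -200, 0.001, np.where(hu < 200, 1.0, 1.6))

hu = np.array([-700.0, 40.0, 900.0])   # lung-like, soft-tissue, bone voxels
print(hu_to_density(hu), wab_override(hu))
```

The contrast is visible already on these three voxels: the curve yields graded densities, while the override snaps each voxel to a bulk value, trading fine density detail for robustness against CBCT HU inaccuracy.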

  2. Limited vs extended face-lift techniques: objective analysis of intraoperative results.

    Science.gov (United States)

    Litner, Jason A; Adamson, Peter A

    2006-01-01

    To compare the intraoperative outcomes of superficial musculoaponeurotic system plication, imbrication, and deep-plane rhytidectomy techniques. Thirty-two patients undergoing primary deep-plane rhytidectomy participated. Each hemiface in all patients was submitted sequentially to 3 progressively more extensive lifts, while other variables were standardized. Four major outcome measures were studied, including the extent of skin redundancy and the repositioning of soft tissues along the malar, mandibular, and cervical vectors of lift. The amount of skin excess was measured without tension from the free edge to a point over the intertragal incisure, along a plane overlying the jawline. Using a soft tissue caliper, repositioning was examined by measurement of preintervention and immediate postintervention distances from dependent points to fixed anthropometric reference points. The mean skin excesses were 10.4, 12.8, and 19.4 mm for the plication, imbrication, and deep-plane lifts, respectively. The greatest absolute soft tissue repositioning was noted along the jawline, with the least in the midface. Analysis revealed significant differences from baseline and between lift types for each of the studied techniques in each of the variables tested. These data support the use of the deep-plane rhytidectomy technique to achieve a superior intraoperative lift relative to comparator techniques.

  3. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and of the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes, where the uniform sampling approach cannot be applied. (authors)
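
The alias method at the core of the approach can be sketched in a few lines (this is Vose's construction, not the authors' MCNP subroutine): an O(n) table build, after which each voxel draw costs O(1). The three-voxel source distribution is a made-up example.

```python
import random

def build_alias(probs):
    """Build Vose alias tables for O(1) sampling of a discrete
    probability distribution (probs must sum to 1)."""
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l     # column s: keep s or fall to l
        scaled[l] -= 1.0 - scaled[s]         # donate mass to fill column s
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                  # leftovers are exactly full
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    i = rng.randrange(len(prob))             # pick a column uniformly
    return i if rng.random() < prob[i] else alias[i]   # O(1) per draw

# e.g. three voxel source strengths, normalized to probabilities
prob, alias = build_alias([0.5, 0.3, 0.2])
random.seed(0)
counts = [0, 0, 0]
for _ in range(100_000):
    counts[sample(prob, alias)] += 1
print([c / 100_000 for c in counts])
```

Source biasing slots in naturally: sample from biased probabilities instead, and carry the ratio of true to biased probability along as a particle weight.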

  4. Trends in grazing emission x-ray analysis techniques

    International Nuclear Information System (INIS)

    Grieken, R. van; Tsuji, K.; Injuk, J.

    2000-01-01

    then, the detection limits imposed by the semiconductor industry roadmap can probably not be obtained by tube-excited GEXRF. The perspectives for tube-excited GEXRF are thus rather poor. Future developments imply the combination of GEXRF with synchrotron radiation excitation. Grazing-emission particle-induced X-ray emission (GE-PIXE) suffers from similar quantification problems for material deposited on a carrier, but it makes PIXE a surface-sensitive technique, while normally the protons penetrate some tens of μm into the sample. Similarly, grazing-emission electron probe micro-analysis (GE-EPMA) allows the selective analysis of particles on a flat carrier, offers surface sensitivity in the nm rather than the μm range, and yields, in principle, a spatial resolution for chemical analysis similar to the size of the impinging electron beam, rather than of the electron-excited volume. Both GE-PIXE and GE-EPMA need to be explored more fully in the near future. (author)

  5. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Therese, Laurent; Guillot, Philippe; Muja, Cristina

    2011-01-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined.
The comparison between the two large earrings

  6. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. Combining archaeological and analytical information can provide significant knowledge on the origin of the constituent materials, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of funeral artifacts in 18 graves out of a total of 127 excavated. Even though the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the 12th century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 {mu}m. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  7. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10¹¹ n/cm²/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10¹² n/cm²/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab
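The relative (comparator) standardization used in records like this one can be sketched as follows. This is a minimal illustration of the decay-corrected comparator equation only; the function name, arguments, and all numerical values are invented, not taken from the report.

```python
import math

def naa_mass_fraction(peak_sample, peak_std, mass_std_element_g, mass_sample_g,
                      decay_sample_s, decay_std_s, half_life_s):
    """Relative-method NAA: mass fraction of an element in the sample from
    gamma-peak areas of the sample and a co-irradiated standard, with each
    count rate decay-corrected back to the end of irradiation.
    Assumes equal irradiation and counting conditions for both."""
    lam = math.log(2) / half_life_s
    a_sample = peak_sample * math.exp(lam * decay_sample_s)  # decay correction
    a_std = peak_std * math.exp(lam * decay_std_s)
    return (a_sample / a_std) * (mass_std_element_g / mass_sample_g)

# Illustrative numbers only: sample peak twice the standard's, equal decay times,
# 1 microgram of the element in the standard, 0.5 g sample mass.
w = naa_mass_fraction(2000, 1000, 1e-6, 0.5, 60, 60, 3600)
```

With equal decay times the correction factors cancel and the result is simply the peak ratio scaled by the mass ratio.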

  8. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Fernandez, R.F.; Zhang, W.; Robertson, J.D.; Majidi, V.

    1995-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 μg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 μg/g; and for Hg²⁺ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag⁺, Ba²⁺, Cd²⁺, Cu²⁺, and Pb²⁺ share common binding sites, with binding efficiencies varying in the sequence Pb²⁺ > Cu²⁺ > Ag⁺ > Cd²⁺ > Ba²⁺. The binding of Hg²⁺ involved a different binding site, with an increase in binding efficiency in the presence of Ag⁺. (orig.)
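A linear response of the kind reported here (X-ray yield vs. concentration over the linear range) is typically exploited through an ordinary least-squares calibration; the following sketch shows the idea, with all data points invented for illustration.

```python
def fit_line(concs, yields):
    """Ordinary least-squares fit of measured X-ray yield vs. concentration."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(yields) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, yields))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(signal, slope, intercept):
    """Invert the calibration to estimate an unknown concentration (ng/g)."""
    return (signal - intercept) / slope

# Invented calibration points spanning a 10 ng/g - 1 ug/g linear range.
concs = [10, 50, 100, 500, 1000]      # ng/g
yields = [22, 102, 202, 1002, 2002]   # counts (perfectly linear here)
slope, intercept = fit_line(concs, yields)
```

An unknown sample's signal is then converted back to a concentration via `quantify`, valid only inside the calibrated linear range.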

  9. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Robertson, J.D.; Majidi, V.

    1994-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. 5 mg of dried algae powder were mixed with 5 mL of single- and multi-metal solutions. The algae cells were then collected by filtration on 0.6 μm polycarbonate membranes and analyzed by PIXE using a dual-energy irradiation. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 μg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 μg/g; and for Hg²⁺ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium is also replaced

  10. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed

  11. Analysis of Program Obfuscation Schemes with Variable Encoding Technique

    Science.gov (United States)

    Fukushima, Kazuhide; Kiyomoto, Shinsaku; Tanaka, Toshiaki; Sakurai, Kouichi

    Program analysis techniques have improved steadily over the past several decades, and software obfuscation schemes have come to be used in many commercial programs. A software obfuscation scheme transforms an original program or a binary file into an obfuscated program that is more complicated and difficult to analyze, while preserving its functionality. However, the security of obfuscation schemes has not been properly evaluated. In this paper, we analyze obfuscation schemes in order to clarify the advantages of our scheme, the XOR-encoding scheme. First, we more clearly define five types of attack models that we defined previously, and define quantitative resistance to these attacks. Then, we compare the security, functionality and efficiency of three obfuscation schemes with encoding variables: (1) Sato et al.'s scheme with linear transformation, (2) our previous scheme with affine transformation, and (3) the XOR-encoding scheme. We show that the XOR-encoding scheme is superior with regard to the following two points: (1) the XOR-encoding scheme is more secure against a data-dependency attack and a brute force attack than our previous scheme, and is as secure against an information-collecting attack and an inverse transformation attack as our previous scheme, (2) the XOR-encoding scheme does not restrict the calculable ranges of programs and the loss of efficiency is less than in our previous scheme.
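The idea of encoding program variables, as in the XOR-encoding scheme compared in this abstract, can be illustrated with a toy sketch. The key value and helper names below are invented, and a real obfuscator would inline these transformations into opaque expressions rather than expose `enc`/`dec` as functions.

```python
KEY = 0x5A  # invented mask; a real scheme would derive and conceal this

def enc(v: int) -> int:
    """Store a variable XOR-masked so its plaintext value never rests in memory."""
    return v ^ KEY

def dec(v: int) -> int:
    """XOR is its own inverse, so decoding reapplies the same mask."""
    return v ^ KEY

def obf_add(ea: int, eb: int) -> int:
    """Addition rewritten to operate on encoded operands: decode, add, re-encode.
    (An actual scheme folds these steps into a single transformed expression.)"""
    return enc(dec(ea) + dec(eb))

# original code: c = a + b
a, b = 7, 35
c = dec(obf_add(enc(a), enc(b)))
```

The functional result is unchanged, which is the preservation property the abstract requires; the security and efficiency trade-offs it analyzes concern how well such encodings resist the five attack models.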

  12. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    Directory of Open Access Journals (Sweden)

    Ricardo E. Izzo

    2011-06-01

    Full Text Available Basketball is a complex sport, which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. Therefore, it is important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the type of pass used, the most frequent is the two-handed chest pass, with a frequency of 39.9%. This is followed, in terms of frequency, by one-handed passes – the baseball, with 20.9% – and by the two-handed over-the-head pass, with 18.2%, and finally, one- or two-handed indirect passes (bounces), with 11.2% and 9.8%. Considering the most used pass in basketball, from the biomechanical point of view, the muscles involved in the correct movement include all the muscles of the upper extremity, along with the shoulder muscles as well as the body fixators (abdominals, hip flexors, knee extensors, and dorsal flexors of the foot). The technical and conditional analysis considers the throwing speed, the throw height and the air resistance. In conclusion, the aim of this study is to give some guidelines to improve the mechanical execution of the movements in training, without neglecting the importance of the harmony of the movements themselves.

  13. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results
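The stratified idea can be sketched as follows; this is an invented illustration of the sampling step only, not Argonne's actual implementation. Instead of resampling the next generation's source from the pooled fission bank, where multinomial noise can randomly starve a weakly coupled unit, a fixed quota is drawn from each unit's own sites.

```python
import random

def stratified_source_sample(bank, quota, rng=random.Random(0)):
    """bank: list of (unit_id, site) fission sites from the previous generation.
    Returns a new source bank holding exactly `quota` sites per unit, so that
    no loosely coupled unit is starved by pooled-sampling noise."""
    by_unit = {}
    for unit, site in bank:
        by_unit.setdefault(unit, []).append(site)
    new_bank = []
    for unit, sites in sorted(by_unit.items()):
        # draw the per-stratum quota with replacement from this unit's sites
        new_bank.extend((unit, rng.choice(sites)) for _ in range(quota))
    return new_bank

# Unit B is under-represented in the pooled bank, yet keeps its full quota.
bank = [("A", 0.1), ("A", 0.2), ("A", 0.3), ("B", 0.9)]
new_bank = stratified_source_sample(bank, quota=100)
```

Conventional sampling would give unit B an expected 25% of the new bank with large fluctuations; the stratified version pins each unit's share exactly.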

  14. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt); El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams Universty, Cairo (Egypt)

    1997-12-31

    Two brazing alloy samples (C P{sup 2} and C P{sup 3}) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10{sup 11} n/cm{sup 2}/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10{sup 12} n/cm{sup 2}/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  15. Suitable pellets standards development for LA-ICPMS analysis of Al2O3 powders

    International Nuclear Information System (INIS)

    Ferraz, Israel Elias; Sousa, Talita Alves de; Silva, Ieda de Souza; Gomide, Ricardo Goncalves; Oliveira, Luis Claudio de

    2013-01-01

    Chemical and physical characterization of aluminium oxides is of special interest to the nuclear industry, despite the arduous chemical digestion process involved. Laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) is therefore an attractive method for analysis. However, due to the lack of suitable matrix-matched certified reference materials (CRM) for the analysis of such powders and ceramic pellets, LA-ICPMS has not yet been fully applied. Furthermore, establishing calibration curves for trace element quantification using external standards raises a significant problem. In this context, this work aimed at the development of suitable standard pellets for calibration curves for the chemical determination of impurities in aluminium oxide powders by the LA-ICPMS analytical technique. Two different analytical strategies were developed: (I) boric acid pressed pellets and (II) lithium tetraborate melted pellets, both spiked with high-purity oxides of Si, Mg, Ca, Na, Fe, Cr and Ni. Analytical strategy (II), which presented the best analytical parameters, was selected; a certified reference material was analyzed and the results compared. The limits of detection, linearity, precision, accuracy and recovery study results are presented and discussed. (author)

  16. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    Science.gov (United States)

    Luglio, Gaetano; De Palma, Giovanni Domenico; Tarquini, Rachele; Giglio, Mariano Cesare; Sollazzo, Viviana; Esposito, Emanuela; Spadarella, Emanuela; Peltrini, Roberto; Liccardo, Filomena; Bucci, Luigi

    2015-01-01

    Background Despite the proven benefits, laparoscopic colorectal surgery is still underutilized among surgeons. A steep learning curve is one of the causes of its limited adoption. The aim of the study is to determine the feasibility and morbidity rate after laparoscopic colorectal surgery in a single-institution, “learning curve” experience, implementing a well standardized operative technique and recovery protocol. Methods The first 50 patients treated laparoscopically were included. All the procedures were performed by a trainee surgeon, supervised by a consultant surgeon, according to the principle of complete mesocolic excision with central vascular ligation or TME. Patients underwent a fast-track recovery programme. Recovery parameters, short-term outcomes, morbidity and mortality have been assessed. Results Types of resection: 20 left-side resections, 8 right-side resections, 14 low anterior resection/TME, 5 total colectomy and IRA, 3 total panproctocolectomy and pouch. Mean operative time: 227 min; mean number of lymph nodes: 18.7. Conversion rate: 8%. Mean time to flatus: 1.3 days; mean time to solid stool: 2.3 days. Mean length of hospital stay: 7.2 days. Overall morbidity: 24%; major morbidity (Dindo–Clavien III): 4%. No anastomotic leak, no mortality, no 30-day readmission. Conclusion Proper laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short-term outcomes, even in a learning curve setting. Key factors for better outcomes and for shortening the learning curve seem to be the adoption of a standardized technique and training model along with the strict supervision of an expert colorectal surgeon. PMID:25859386

  17. Study of some environmental problem in egypt using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    El-Karim, A.H.M.G.

    2003-01-01

    This thesis deals with the investigation of the possibility of using the new (second) Egyptian research reactor (ETRR-2) at Inshas (22 MW) for the neutron activation analysis (NAA) of trace elements, particularly in air dust collected from Cairo and some other cities of Egypt. In this context, Chapter 1 gives an introduction to activation methods in general, describing the various techniques used and comparing the methods with other instrumental methods of analysis. As a main classification, the neutron activation methods involve prompt γ-ray NAA and delayed γ-ray NAA; cyclic NAA (repeated activation) is also outlined. The methodology of NAA involves the absolute method, the relative method and the mono-standard (single comparator) method, which lies in between the absolute and relative methods

  18. The standard deviation of extracellular water/intracellular water is associated with all-cause mortality and technique failure in peritoneal dialysis patients.

    Science.gov (United States)

    Tian, Jun-Ping; Wang, Hong; Du, Feng-He; Wang, Tao

    2016-09-01

    The mortality rate of peritoneal dialysis (PD) patients is still high, and the predicting factors for PD patient mortality remain to be determined. This study aimed to explore the relationship between the standard deviation (SD) of extracellular water/intracellular water (E/I) and all-cause mortality and technique failure in continuous ambulatory PD (CAPD) patients. All 152 patients came from the PD Center between January 1st 2006 and December 31st 2007. Clinical data and at least five visit E/I ratios defined by bioelectrical impedance analysis were collected. The patients were followed up till December 31st 2010. The primary outcomes were death from any cause and technique failure. Kaplan-Meier analysis and Cox proportional hazards models were used to identify risk factors for mortality and technique failure in CAPD patients. All patients were followed up for 59.6 ± 23.0 months. The patients were divided into two groups according to their SD of E/I values: the lower SD of E/I group (≤0.126) and the higher SD of E/I group (>0.126). The patients with higher SD of E/I showed higher all-cause mortality (log-rank χ² = 10.719, P = 0.001) and technique failure (log-rank χ² = 9.724, P = 0.002) than those with lower SD of E/I. Cox regression analysis found that SD of E/I independently predicted all-cause mortality (HR 3.551, 95% CI 1.442-8.746, P = 0.006) and technique failure (HR 2.487, 95% CI 1.093-5.659, P = 0.030) in CAPD patients after adjustment for confounders, except when sensitive C-reactive protein was added into the model. The SD of E/I was a strong independent predictor of all-cause mortality and technique failure in CAPD patients.
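The exposure variable itself is straightforward to compute; the following sketch uses the 0.126 cutoff and five-visit minimum from the abstract, with the patient data and function name invented for illustration (the survival modelling itself would require a Cox regression, not shown here).

```python
from statistics import stdev

CUTOFF = 0.126  # SD of E/I cutoff reported in the study

def ei_sd_group(ei_visits):
    """Per-patient SD of repeated extracellular/intracellular water ratios,
    dichotomized at the study's cutoff. The study collected at least five
    visit measurements per patient before computing the SD."""
    if len(ei_visits) < 5:
        raise ValueError("need at least five E/I measurements")
    return "high" if stdev(ei_visits) > CUTOFF else "low"

# Invented patients: one with stable fluid status, one fluctuating.
stable = [0.90, 1.10, 1.00, 0.95, 1.05]       # SD ~ 0.08 -> lower-risk group
fluctuating = [0.70, 1.30, 0.90, 1.20, 0.60]  # SD ~ 0.30 -> higher-risk group
```

The point of the study is that this visit-to-visit variability, not any single E/I value, carries the prognostic signal.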

  19. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  20. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines

  1. Late effects of craniospinal irradiation for standard risk medulloblastoma in paediatric patients: A comparison of treatment techniques

    International Nuclear Information System (INIS)

    Leman, J.

    2016-01-01

    Background: Survival rates for standard risk medulloblastoma are favourable, but the craniospinal irradiation (CSI) necessary to eradicate microscopic spread causes life-limiting late effects. Aims: The aim of this paper is to compare CSI techniques in terms of toxicity and quality of life for survivors. Methods and materials: A literature search was conducted using synonyms of ‘medulloblastoma’, ‘craniospinal’, ‘radiotherapy’ and ‘side effects’ to highlight 29 papers that would facilitate this discussion. Results and discussion: Intensity modulated radiotherapy (IMRT), tomotherapy and protons all provide CSI which can reduce dose to normal tissue; however, photon methods cannot eliminate exit dose as well as protons can. Research for each technique requires longer-term follow-up in order to prove that survival rates remain high whilst reducing late effects. Findings/conclusion: Proton therapy is the superior method of CSI in terms of late effects, but more research is needed to evidence this. Until proton therapy is available in the UK, IMRT should be utilised. - Highlights: • Craniospinal irradiation is vital in the treatment of medulloblastoma. • Survivors often suffer long-term side effects which reduce quality of life. • Tomotherapy, IMRT and proton therapy reduce late effects by sparing normal tissue. • Proton therapy offers superior dose distribution but further research is necessary. • IMRT should be employed for photon radiotherapy.

  2. Enhancements and Health-Related Studies of Neutron Activation Analysis Technique

    International Nuclear Information System (INIS)

    Soliman, M.A.M.

    2012-01-01

    The work presented in this thesis covers two major points. The first concerns the establishment of an accurate standardization method with multi-elemental capabilities and a low workload, suitable for NAA standardization at ETRR-2. The second deals with constructing and developing an effective nondestructive technique for the analysis of liquid samples based on NAA using (very) short-lived radionuclides. To achieve the first goal, attention has been directed toward implementation of the k0-method for calculation of the element concentrations in the samples. The k0-method of NAA standardization has had considerable success as a method for accurate multi-elemental analysis with a comparably low workload. The k0-method is based on the fact that the unknown sample is irradiated with only one standard element as comparator. To assess the implementation of this method at ETRR-2, careful and complete characterization of the neutron flux parameters in the irradiation positions, as well as the efficiency calibration of the γ-ray spectrometer, must be carried out. The required neutron flux parameters are the ratio of the thermal to epithermal neutron fluxes (f) and the deviation factor (α) of the epithermal neutron flux from the ideal 1/E law. The work presented in Chapter 4 shows the efficiency calibration curve of the γ-ray spectrometer system, which was obtained using standard radioactive point sources. Moreover, the f and α parameters were determined in some selected irradiation sites using sets of Zr-Au neutron flux monitors. Due to their different locations relative to the reactor core, the available neutron fluxes in the selected irradiation positions differ substantially, so that different irradiation demands can be satisfied. The reference materials coal NIST 1632c and IAEA-Soil 7 were analyzed for data validation, and good agreement between the experimental values and the certified values was obtained. The obtained results have revealed that the k0-NAA
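The single-comparator concentration equation underlying this standardization can be sketched as follows. For brevity this sketch omits the α-correction to the Q0 values (i.e., it assumes an ideal 1/E epithermal flux), and all argument names and test values are invented for illustration.

```python
def k0_mass_fraction(asp_a, asp_au, k0_a, f, q0_a, q0_au, eff_a, eff_au):
    """k0-NAA with a single Au comparator (alpha-correction omitted):
      asp_a, asp_au  specific count rates of analyte line and Au monitor
      k0_a           tabulated k0 factor of the analyte gamma line
      f              thermal-to-epithermal neutron flux ratio
      q0_a, q0_au    resonance-integral-to-thermal cross-section ratios
      eff_a, eff_au  full-energy peak detection efficiencies
    Returns the analyte mass fraction relative to the comparator."""
    return (asp_a / asp_au) / k0_a * (f + q0_au) / (f + q0_a) * (eff_au / eff_a)
```

With identical count rates, Q0 values and efficiencies, and k0 = 1, the expression collapses to unity, which is a quick sanity check on the formula's structure.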

  3. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  4. Four-chamber view and 'swing technique' (FAST) echo: a novel and simple algorithm to visualize standard fetal echocardiographic planes.

    Science.gov (United States)

    Yeo, L; Romero, R; Jodicke, C; Oggè, G; Lee, W; Kusanovic, J P; Vaisbuch, E; Hassan, S

    2011-04-01

    To describe a novel and simple algorithm (four-chamber view and 'swing technique' (FAST) echo) for visualization of standard diagnostic planes of fetal echocardiography from dataset volumes obtained with spatiotemporal image correlation (STIC) and applying a new display technology (OmniView). We developed an algorithm to image standard fetal echocardiographic planes by drawing four dissecting lines through the longitudinal view of the ductal arch contained in a STIC volume dataset. Three of the lines are locked to provide simultaneous visualization of targeted planes, and the fourth line (unlocked) 'swings' through the ductal arch image (swing technique), providing an infinite number of cardiac planes in sequence. Each line generates the following plane(s): (a) Line 1: three-vessels and trachea view; (b) Line 2: five-chamber view and long-axis view of the aorta (obtained by rotation of the five-chamber view on the y-axis); (c) Line 3: four-chamber view; and (d) 'swing line': three-vessels and trachea view, five-chamber view and/or long-axis view of the aorta, four-chamber view and stomach. The algorithm was then tested in 50 normal hearts in fetuses at 15.3-40 weeks' gestation and visualization rates for cardiac diagnostic planes were calculated. To determine whether the algorithm could identify planes that departed from the normal images, we tested the algorithm in five cases with proven congenital heart defects. In normal cases, the FAST echo algorithm (three locked lines and rotation of the five-chamber view on the y-axis) was able to generate the intended planes (longitudinal view of the ductal arch, pulmonary artery, three-vessels and trachea view, five-chamber view, long-axis view of the aorta, four-chamber view) individually in 100% of cases (except for the three-vessels and trachea view, which was seen in 98% (49/50)) and simultaneously in 98% (49/50). 
The swing technique was able to generate the three-vessels and trachea view, five-chamber view and/or long

  5. Comparative Analysis of the Dark Ground Buffy Coat Technique (DG ...

    African Journals Online (AJOL)

    The prevalence of trypanosome infection in 65 cattle reared under an extensive system of management was determined using the dark ground buffy coat (DG) technique and the enzyme-linked immunosorbent assay (ELISA). The DG technique showed that there were 18 positive cases (27.69%) of the total number of animals, made ...

  6. A novel preconcentration technique for the PIXE analysis of water

    Energy Technology Data Exchange (ETDEWEB)

    Savage, J.M. [Element Analysis Corp., Lexington, KY (United States); Fernandez, R.F. [Element Analysis Corp., Lexington, KY (United States); Zhang, W. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Robertson, J.D. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Majidi, V. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States)

    1995-05-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag{sup +}, Ba{sup 2+}, and Cd{sup 2+} in the concentration range from 10 ng/g to 1 {mu}g/g; for Cu{sup 2+} and Pb{sup 2+} from 10 ng/g to 5 {mu}g/g; and for Hg{sup 2+} from 10 ng/g to 10 {mu}g/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 {mu}g/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag{sup +}, Ba{sup 2+}, Cd{sup 2+}, Cu{sup 2+}, and Pb{sup 2+} share common binding sites, with binding efficiencies varying in the sequence Pb{sup 2+}>Cu{sup 2+}>Ag{sup +}>Cd{sup 2+}>Ba{sup 2+}. The binding of Hg{sup 2+} involved a different binding site, with an increase in binding efficiency in the presence of Ag{sup +}. (orig.).

  7. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires; such failures generally do not represent a great risk to personnel. Repairs needed to maintain the reliability of these vessels might require extensive interruption to operation, which in turn considerably impacts the profitability of the unit. Therefore, the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D Laser Scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspecting the equipment to generate maintenance or inspection recommendations, and comparing results with previous inspections and baseline data. Until recently, coke drum structural analysis was traditionally performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, the new strain analysis technique PSI (Plastic Strain Index) was developed. This method, which is based on the API 579/ASME FFS standard failure limit, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  8. Cybersecurity Vulnerability Analysis of the PLC PRIME Standard

    Directory of Open Access Journals (Sweden)

    Miguel Seijo Simó

    2017-01-01

    Full Text Available Security in critical infrastructures such as the power grid is of vital importance. The Smart Grid puts the power grid's classical security approach on the ropes, since it introduces cyber-physical systems in which devices, communications, and information systems must all be protected. PoweRline Intelligent Metering Evolution (PRIME) is a Narrowband Power-Line Communications (NB-PLC) protocol widely used in the last mile of Advanced Metering Infrastructure (AMI) deployments, playing a key role in the Smart Grid. This work therefore aims to unveil the cybersecurity vulnerabilities present in the PRIME standard, proposing solutions and validating and discussing the results obtained.

  9. Standard reference materials analysis for MINT Radiocarbon Laboratory

    International Nuclear Information System (INIS)

    Noraishah Othman; Kamisah Alias; Nasasni Nasrul

    2004-01-01

    As a follow-up to the setting up of the MINT Radiocarbon Dating facility, an exercise on the IAEA standard reference materials was carried out. Radiocarbon laboratories frequently use these eight natural samples to verify their systems. The materials were either pretreated or analysed directly to determine the 14 C activity of five of the samples, expressed in percent Modern Carbon (pMC), and to make recommendations on further use of these materials. We present the results for the five materials and discuss the analyses that were undertaken. (Author)
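Results in pMC terms follow a standard pair of formulas: the sample 14C activity relative to the modern standard, and the conventional age via the Libby mean life of 8033 years. A sketch with illustrative count rates (the numbers are assumptions, not MINT's results):

```python
import math

def percent_modern_carbon(sample_activity, standard_activity):
    """pMC: sample 14C activity relative to the modern standard, in percent.

    Both activities are assumed to be already background- and
    fractionation-corrected, as is done before results are reported.
    """
    return 100.0 * sample_activity / standard_activity

def conventional_age_bp(pmc):
    """Conventional radiocarbon age (years BP) via the Libby mean life."""
    return -8033.0 * math.log(pmc / 100.0)

pmc = percent_modern_carbon(7.05, 13.56)   # illustrative count rates (cpm/g C)
age = conventional_age_bp(pmc)
```

A sample at exactly 100 pMC gives an age of 0 BP by construction, which is one quick sanity check when commissioning a dating system.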

  10. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.05] ... comparing the ultrasound guided technique with the palpation technique. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.
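Pooled odds ratios of the kind quoted above are typically obtained by inverse-variance weighting of per-study log odds ratios. A minimal fixed-effect sketch over hypothetical 2x2 tables (the counts are invented for illustration; the review's actual study data are not reproduced here):

```python
import math

# Hypothetical per-study 2x2 counts: (events_us, total_us, events_palp, total_palp)
# for first-attempt success, loosely in the spirit of the pooled comparison.
studies = [
    (40, 50, 28, 50),
    (60, 80, 45, 80),
    (33, 45, 25, 46),
]

weights, log_ors = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c                  # failures in each arm
    log_or = math.log((a * d) / (b * c))   # per-study log odds ratio
    var = 1 / a + 1 / b + 1 / c + 1 / d    # its approximate variance
    weights.append(1 / var)
    log_ors.append(log_or)

# Fixed-effect (inverse-variance) pooling on the log scale.
pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
or_pooled = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
```

A 95% CI whose lower bound exceeds 1 (as with the first-attempt result above) indicates a statistically significant advantage for the ultrasound arm.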

  11. A hybrid electron and photon IMRT planning technique that lowers normal tissue integral patient dose using standard hardware.

    Science.gov (United States)

    Rosca, Florin

    2012-06-01

    To present a mixed electron and photon IMRT planning technique using electron beams with an energy range of 6-22 MeV and standard hardware that minimizes integral dose to patients for targets as deep as 7.5 cm. Ten brain cases, two lung cases, a thyroid, an abdominal, and a parotid case were planned using two planning techniques: a photon-only IMRT plan (IMRT) versus a mixed modality treatment (E+IMRT) that includes an en face electron beam and a photon IMRT portion that ensures uniform target coverage. The electron beam is delivered using a regular cutout placed in an electron cone. The electron energy was chosen to provide a good trade-off between minimizing integral dose and generating a uniform, deliverable plan. The authors chose electron energies that cover the deepest part of the PTV with the 65%-70% isodose line. The normal tissue integral dose, the dose for ring structures around the PTV, and the volumes of the 75%, 50%, and 25% isosurfaces were used to compare the dose distributions generated by the two planning techniques. The normal tissue integral dose was lowered by about 20% by the E+IMRT plans compared to the photon-only IMRT ones for most studied cases. With the exception of the lungs, the dose reduction associated with the E+IMRT plans was more pronounced further away from the target. The average dose ratios delivered to the 0-2 cm and the 2-4 cm ring structures for brain patients for the two planning techniques were 89.6% and 70.8%, respectively. The enhanced dose sparing away from the target for the brain patients can also be observed in the ratio of the 75%, 50%, and 25% isodose line volumes for the two techniques, which decreases from 85.5% to 72.6% and further to 65.1%, respectively. For the lungs, the lateral electron beams used in the E+IMRT plans were perpendicular to the mostly anterior/posterior photon beams, generating much more conformal plans. The authors proved that even using the existing electron delivery hardware, a mixed electron/photon planning
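Choosing the electron energy so that the 65%-70% isodose covers the deepest part of the PTV can be sketched with a simple depth-energy rule of thumb; the R70 ≈ E/2.5 cm divisor and the energy list below are assumptions for illustration, not values from the paper:

```python
# Available electron energies on a typical linac (MeV); the 2.5 divisor for
# the ~70% isodose depth (R70 ≈ E/2.5 cm, E in MeV) is an assumed rule of thumb.
AVAILABLE_MEV = [6, 9, 12, 16, 20, 22]

def pick_electron_energy(ptv_depth_cm, r70_divisor=2.5):
    """Lowest available energy whose ~70% isodose reaches the PTV depth."""
    for e in AVAILABLE_MEV:
        if e / r70_divisor >= ptv_depth_cm:
            return e
    return None  # target too deep for an en face electron beam alone

pick_electron_energy(7.5)   # deepest target depth considered in the study
```

Picking the lowest qualifying energy is what keeps the exit dose, and hence the integral dose, as small as the coverage constraint allows.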

  12. Development, improvement and calibration of neutronic reaction rate measurements: elaboration of a base of standard techniques; Developpement, amelioration et calibration des mesures de taux de reaction neutroniques: elaboration d'une base de techniques standards

    Energy Technology Data Exchange (ETDEWEB)

    Hudelot, J.P

    1998-06-19

    In order to improve and to validate neutronic calculation schemes, perfecting integral measurements of neutronic parameters is necessary. This thesis focuses on the conception, improvement and development of neutronic reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given; these are then applied through the example of doubling time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers were developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high flux neutron generator and based on discriminating the energy of the neutrons with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1%. Finally, the necessity of these calibrations is shown through spectral index measurements in the MISTRAL 1 (UO{sub 2}) and MISTRAL 2 (MOX) cores of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, are validated. Concerning the second subject, the goal is to develop a method for measuring the modified conversion ratio of {sup 238}U (defined as the ratio of the {sup 238}U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1%. The extension of this technique to future modified conversion ratio measurements for {sup 242}Pu (on MOX rods) and
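The modified conversion ratio measurement described above reduces, schematically, to a ratio of efficiency- and branching-corrected gamma count rates. A heavily simplified sketch (the specific gamma lines, and the omission of decay and saturation corrections, are my assumptions, not the thesis's procedure):

```python
def modified_conversion_ratio(c_np239, c_fp, eff_np, eff_fp,
                              branch_np, branch_fp):
    """Modified conversion ratio: 238U capture rate over total fission rate.

    Captures on 238U are counted through a gamma line of the 239Np daughter,
    fissions through a chosen fission-product line; each count rate is
    divided by its detection efficiency and gamma emission probability.
    Decay/saturation corrections are omitted here for brevity.
    """
    capture_rate = c_np239 / (eff_np * branch_np)
    fission_rate = c_fp / (eff_fp * branch_fp)
    return capture_rate / fission_rate
```

The sub-1% uncertainty quoted above then hinges on how well the two efficiencies are known, which is exactly what the spectrometer calibration addresses.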

  13. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse conditions of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries.

  14. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained in this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed.

  15. Refined analysis of piping systems according to nuclear standard regulations

    International Nuclear Information System (INIS)

    Bisconti, N.; Lazzeri, L.; Strona, P.P.

    1975-01-01

    A number of programs have been selected to perform particular analyses, partly coming from available libraries, such as SAP 4 for static and dynamic analysis, and partly directly written, such as TRATE (for thermal analysis), VASTA and VASTB (to perform the analyses required by ASME III for Class A and Class B piping), and CFRS (for the calculation of floor response spectra, etc.). All the programs are automatically linked and directed by a general program (SCATCA for Class A and SCATCB for Class B piping). The starting point is a list of the fabrication, thermal, geometrical and seismic data. The geometrical data are plotted (to check for possible errors) and fed to SAP for static and dynamic analysis, together with the seismic data and the thermal data (average temperatures) re-elaborated by the TRATE 2 code. The raw data from SAP (weight, thermal, fixed-point displacements, seismic, other dynamic) are converted and reordered and fed to the COMBIN 2 program, together with the other data from the thermal analysis (from TRATE 2). From the COMBIN 2 program all the data are listed; each load set to be considered is provided, for each point, with the necessary data (thermal moments, pressure, average temperatures, thermal gradients), and all the data from the seismic, weight and other dynamic analyses are also provided. All of these data are stored in a file and examined by the VASTA code (for Class A) or VASTB (for Classes B and C) in order to make a decision about the acceptability of the design. Each subprogram may have an independent output in order to check partial results. Details about each program are provided and an example is given, together with a discussion of some particular problems (thermohydraulic set definition, fatigue analysis, etc.)
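The fatigue analysis mentioned among the particular problems is, in ASME III practice, a cumulative usage factor check (Miner's rule). A minimal sketch with invented cycle counts; in reality the allowed cycles for each load set would be read from the code's design fatigue curves at that set's alternating stress:

```python
def cumulative_usage_factor(load_sets):
    """Miner's-rule cumulative usage factor for an ASME III fatigue check.

    load_sets: iterable of (expected_cycles, allowed_cycles) pairs, where
    allowed_cycles comes from the design fatigue curve for each load set.
    The design is acceptable when the sum does not exceed 1.0.
    """
    return sum(n / n_allowed for n, n_allowed in load_sets)

# Illustrative load sets: (expected cycles, allowed cycles from fatigue curve).
u = cumulative_usage_factor([(1000, 1e5), (200, 1e4), (50, 2e3)])
acceptable = u <= 1.0
```

This is the kind of per-point acceptability decision a code like VASTA would automate across all load sets.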

  16. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space, and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological disease were recruited; high-resolution T(1)-weighted, quantitative T(1), and B(0) field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to the quantitative T(1) datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T(1) values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T(1) histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors, biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.
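Extracting regional relaxation values and histograms, as described above, is a masked reduction over the quantitative map. A sketch on a tiny synthetic volume (the array, mask, and T1 values below are stand-ins for real data; in a real-space analysis the template ROI would first be warped into the subject's native space so the T1 map itself is never resampled):

```python
import numpy as np

def regional_t1_stats(t1_map, roi_mask):
    """Mean and histogram of quantitative T1 values inside a binary ROI mask.

    t1_map: 3-D array of T1 values (ms); roi_mask: same-shape boolean array.
    """
    values = t1_map[roi_mask]
    hist, edges = np.histogram(values, bins=50)
    return values.mean(), hist, edges

# Tiny synthetic example: "gray matter" voxels at ~1200 ms inside the mask.
t1 = np.full((4, 4, 4), 800.0)        # background "white matter"
mask = np.zeros_like(t1, dtype=bool)
mask[1:3, 1:3, 1:3] = True
t1[mask] = 1200.0
mean_t1, hist, edges = regional_t1_stats(t1, mask)
```

With this framing, partial volume errors show up as mask voxels straddling tissue boundaries, which inflates the right-sided tail of the regional histogram.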

  17. Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques

    Science.gov (United States)

    Ramers, Douglas L.

    2005-01-01

    pressurizing the bottle on a test stand, and running sweeps of excitation frequencies for each of the piezo sensors and recording the resulting impedance. The sweeps were limited to 401 points by the available analyzer, and it was decided to perform individual sweeps at five different excitation frequency ranges. The frequency ranges used for the PZTs differed in two of the five ranges from those used for the SCP. The bottles were pressurized to empty (no water), 0 psig, 77 psig, 155 psig, and 227 psig, in nearly uniform increments of about 77 psi. One of each of the two types of piezo sensors was fastened onto the bottle surface at two locations: about midway between the ends on the cylindrical portion of the bottle, and at the very edge of one of the end domes. The data was collected in files by sensor type (2 cases), by location (2 cases), by frequency range (5 cases), and by pressure (5 cases) to produce 100 data sets of 401 impedances. After familiarization with the piezo sensing technology and obtaining the data, the team developed a set of questions to try to answer regarding the data and made assignments of responsibilities. The next section lists the questions, and the remainder of the report describes the data analysis work performed by Dr. Ramers. This includes a discussion of the data, the approach to answering the questions using statistical techniques, the use of an emergent system to investigate the data where statistical techniques were not usable, conclusions regarding the data, and recommendations.
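A common first reduction of such impedance sweeps is a scalar deviation metric between a baseline sweep and a later sweep on the same 401-point frequency grid; the RMSD metric and the synthetic sweeps below are illustrative assumptions, not the report's actual analysis:

```python
import numpy as np

def impedance_rmsd(baseline, measured):
    """Root-mean-square deviation between two impedance sweeps.

    A simple scalar state metric for piezo impedance monitoring; both
    sweeps must share the same 401-point frequency grid.
    """
    baseline = np.asarray(baseline, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(np.sqrt(np.mean((measured - baseline) ** 2)))

# Synthetic 401-point sweeps: pressurization shifts the spectrum slightly.
freq = np.linspace(50e3, 90e3, 401)
baseline = 100 + 10 * np.sin(freq / 5e3)
pressurized = 100 + 10 * np.sin((freq - 200.0) / 5e3)
score = impedance_rmsd(baseline, pressurized)
```

Tracking such a scalar per sensor, location, frequency range, and pressure is one way to compress the 100 data sets of 401 impedances into something statistically tractable.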

  18. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all of the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, efficiency and suitability.
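The side-by-side comparison the paper describes reduces to training several techniques on the same data and comparing hold-out accuracy. A self-contained sketch with synthetic "customer" data and two deliberately simple classifiers (both invented for illustration; the paper's actual techniques and data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "customer" data: two numeric features, binary purchase label.
X0 = rng.normal([0, 0], 1.0, size=(100, 2))   # non-buyers
X1 = rng.normal([2, 2], 1.0, size=(100, 2))   # buyers
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

def nearest_centroid(train_X, train_y, test_X):
    """Minimal nearest-centroid classifier."""
    c0 = train_X[train_y == 0].mean(axis=0)
    c1 = train_X[train_y == 1].mean(axis=0)
    d0 = np.linalg.norm(test_X - c0, axis=1)
    d1 = np.linalg.norm(test_X - c1, axis=1)
    return (d1 < d0).astype(int)

def majority_class(train_y, test_X):
    """Baseline: always predict the most frequent training label."""
    return np.full(len(test_X), np.bincount(train_y).argmax())

# Hold-out split and side-by-side accuracy, the core of any comparison study.
idx = rng.permutation(200)
train, test = idx[:150], idx[150:]
acc_nc = (nearest_centroid(X[train], y[train], X[test]) == y[test]).mean()
acc_mj = (majority_class(y[train], X[test]) == y[test]).mean()
```

The point of such a harness is less the absolute numbers than holding the data and split fixed so the techniques' accuracy differences are attributable to the techniques themselves.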

  19. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Full Text Available Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting the proper specific judo exercises for a target motor ability, it is necessary first to study the structure of the specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a starting point for selecting a particular complex of specific exercises to produce the highest effects. In addition to developing particular muscle groups, the means of specific preparation will affect the development of those motor abilities judged indispensable for developing the particular qualities characteristic of judo. This paper analyses the relationship between judo techniques and specific motor abilities.

  20. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Abstract Credit Scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Amidst the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficient and accurate nature and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed, as they are able to perform computation...